Science.gov

Sample records for addition quantitative results

  1. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro-computed tomography (μCT) and histomorphometry were combined, integrating the best features from both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, histomorphometric qualitative and quantitative bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.
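
    The coarse-to-fine registration idea described above can be illustrated with a deliberately minimal, translation-only sketch in Python: downsample the data, find the best-matching CT slice by normalized cross-correlation, then refine at full resolution. This is a generic illustration under simplifying assumptions (rigid, axis-aligned slices; hypothetical array names), not the pipeline used in the paper.

    ```python
    import numpy as np
    from scipy import ndimage

    def ncc(a, b):
        """Normalized cross-correlation of two equally sized arrays."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())

    def best_slice(volume, section, factor=4):
        """Coarse-to-fine search for the μCT slice best matching a histology section.

        volume  : 3D array (z, y, x), the μCT data (hypothetical input)
        section : 2D array (y, x), the histology section already resampled to the CT grid
        factor  : downsampling factor used for the coarse pass
        """
        # Coarse pass: downsample both datasets and score every slice.
        vol_c = ndimage.zoom(volume, 1.0 / factor, order=1)
        sec_c = ndimage.zoom(section, 1.0 / factor, order=1)
        z_coarse = int(np.argmax([ncc(vol_c[z], sec_c) for z in range(vol_c.shape[0])])) * factor

        # Fine pass: rescore only the neighbourhood of the coarse estimate at full resolution.
        lo, hi = max(0, z_coarse - factor), min(volume.shape[0], z_coarse + factor + 1)
        return max((ncc(volume[z], section), z) for z in range(lo, hi))[1]
    ```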

  2. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    NASA Astrophysics Data System (ADS)

    Lemou, Mohammed

    2016-09-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results for steady states of evolution systems that essentially preserve the rearrangements and some suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, this inequality yields a quantitative stability result for a large class of steady-state solutions to the Vlasov-Poisson system; more precisely, we derive a quantitative control of the L1 norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general nonlinear stability result was obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context; however, the proof relied in a crucial way on compactness arguments, which by construction provide no quantitative control of the perturbation. Our functional inequality is also applied to the context of 2D Euler systems, providing quantitative stability results for a large class of steady states of this system in a natural energy space.

  3. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determination of bone fractures in a forensic context (e.g., in cases of child abuse), improved knowledge of the time course of the healing process and the use of non-invasive modern imaging technology are of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined in this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11, ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or the collarbone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of the relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of the quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area within defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7, ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  5. Quantitative Analysis of Polymer Additives with MALDI-TOF MS Using an Internal Standard Approach

    NASA Astrophysics Data System (ADS)

    Schwarzinger, Clemens; Gabriel, Stefan; Beißmann, Susanne; Buchberger, Wolfgang

    2012-06-01

    MALDI-TOF MS is used for the qualitative analysis of seven different polymer additives directly from the polymer without tedious sample pretreatment. Additionally, by using a solid sample preparation technique, which avoids the concentration gradient problems known to occur with dried droplets and by adding tetraphenylporphyrine as an internal standard to the matrix, it is possible to perform quantitative analysis of additives directly from the polymer sample. Calibration curves for Tinuvin 770, Tinuvin 622, Irganox 1024, Irganox 1010, Irgafos 168, and Chimassorb 944 are presented, showing coefficients of determination between 0.911 and 0.990.

  6. Quantitative analysis of polymer additives with MALDI-TOF MS using an internal standard approach.

    PubMed

    Schwarzinger, Clemens; Gabriel, Stefan; Beißmann, Susanne; Buchberger, Wolfgang

    2012-06-01

    MALDI-TOF MS is used for the qualitative analysis of seven different polymer additives directly from the polymer without tedious sample pretreatment. Additionally, by using a solid sample preparation technique, which avoids the concentration gradient problems known to occur with dried droplets and by adding tetraphenylporphyrine as an internal standard to the matrix, it is possible to perform quantitative analysis of additives directly from the polymer sample. Calibration curves for Tinuvin 770, Tinuvin 622, Irganox 1024, Irganox 1010, Irgafos 168, and Chimassorb 944 are presented, showing coefficients of determination between 0.911 and 0.990.
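
    The internal-standard approach above amounts to regressing the analyte-to-internal-standard signal ratio against known additive concentrations and reading unknowns off that line. A minimal sketch with hypothetical intensity ratios (not data from the paper) follows.

    ```python
    import numpy as np

    # Hypothetical calibration series: additive concentration (wt%) and the measured
    # MALDI signal ratio of additive to internal standard (tetraphenylporphyrine).
    conc = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
    ratio = np.array([0.11, 0.21, 0.43, 0.82, 1.65])

    slope, intercept = np.polyfit(conc, ratio, 1)

    # Coefficient of determination, the figure of merit quoted for the calibration curves.
    pred = slope * conc + intercept
    r2 = 1 - np.sum((ratio - pred) ** 2) / np.sum((ratio - ratio.mean()) ** 2)

    # Concentration of an unknown sample from its measured ratio.
    unknown_conc = (0.60 - intercept) / slope
    print(f"R^2 = {r2:.3f}, unknown additive concentration ≈ {unknown_conc:.2f} wt%")
    ```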

  7. The quantitative surface analysis of an antioxidant additive in a lubricant oil matrix by desorption electrospray ionization mass spectrometry

    PubMed Central

    Da Costa, Caitlyn; Reynolds, James C; Whitmarsh, Samuel; Lynch, Tom; Creaser, Colin S

    2013-01-01

    RATIONALE: Chemical additives are incorporated into commercial lubricant oils to modify the physical and chemical properties of the lubricant. The quantitative analysis of additives in oil-based lubricants deposited on a surface without extraction of the sample from the surface presents a challenge. The potential of desorption electrospray ionization mass spectrometry (DESI-MS) for the quantitative surface analysis of an oil additive in a complex oil lubricant matrix without sample extraction has been evaluated. METHODS: The quantitative surface analysis of the antioxidant additive octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix was carried out by DESI-MS in the presence of 2-(pentyloxy)ethyl 3-(3,5-di-tert-butyl-4-hydroxyphenyl)propionate as an internal standard. A quadrupole/time-of-flight mass spectrometer fitted with an in-house modified ion source enabling non-proximal DESI-MS was used for the analyses. RESULTS: An eight-point calibration curve ranging from 1 to 80 µg/spot of octyl (4-hydroxy-3,5-di-tert-butylphenyl)propionate in an oil lubricant matrix and in the presence of the internal standard was used to determine the quantitative response of the DESI-MS method. The sensitivity and repeatability of the technique were assessed by conducting replicate analyses at each concentration. The limit of detection was determined to be 11 ng/mm² additive on spot with relative standard deviations in the range 3–14%. CONCLUSIONS: The application of DESI-MS to the direct, quantitative surface analysis of a commercial lubricant additive in a native oil lubricant matrix is demonstrated. © 2013 The Authors. Rapid Communications in Mass Spectrometry published by John Wiley & Sons, Ltd. PMID:24097398

  8. Integrated microfluidic device for serum biomarker quantitation using either standard addition or a calibration curve.

    PubMed

    Yang, Weichun; Sun, Xiuhua; Wang, Hsiang-Yu; Woolley, Adam T

    2009-10-01

    Detection and accurate quantitation of biomarkers such as alpha-fetoprotein (AFP) can be a key aspect of early stage cancer diagnosis. Microfluidic devices provide attractive analysis capabilities, including low sample and reagent consumption, as well as short assay times. However, to date microfluidic analyzers have relied almost exclusively on calibration curves for sample quantitation, which can be problematic for complex mixtures such as human serum. We have fabricated integrated polymer microfluidic systems that can quantitatively determine fluorescently labeled AFP in human serum using either the method of standard addition or a calibration curve. Our microdevices couple an immunoaffinity purification step with rapid microchip electrophoresis separation in a laser-induced fluorescence detection system, all under automated voltage control in a miniaturized polymer microchip. In conjunction with laser-induced fluorescence detection, these systems can quantify AFP at approximately 1 ng/mL levels in approximately 10 microL of human serum in a few tens of minutes. Our polymer microdevices have been applied in determining AFP in spiked serum samples. These integrated microsystems offer excellent potential for rapid, simple, and accurate biomarker quantitation in a point-of-care setting.
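
    The standard-addition option mentioned above spikes known amounts of analyte into aliquots of the serum itself and extrapolates the regression line back to the x-axis; the magnitude of the x-intercept is the endogenous concentration. A generic worked example with hypothetical AFP numbers (not data from the device) is sketched below.

    ```python
    import numpy as np

    # Hypothetical standard-addition series: AFP spiked into serum aliquots (ng/mL)
    # and the resulting fluorescence peak areas (arbitrary units).
    added = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
    signal = np.array([0.95, 1.88, 2.81, 4.70, 8.52])

    slope, intercept = np.polyfit(added, signal, 1)

    # The regression crosses zero signal at x = -intercept/slope; its magnitude is the
    # concentration already present in the unspiked sample.
    endogenous = abs(-intercept / slope)
    print(f"Endogenous AFP ≈ {endogenous:.2f} ng/mL")
    ```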

  9. Mars-GRAM 2010: Additions and Resulting Improvements

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Burns, K. Lee

    2013-01-01

    factors. The adjustment factors generated by this process had to satisfy the gas law as well as the hydrostatic relation and are expressed as a function of height (z), Latitude (Lat) and areocentric solar longitude (Ls). The greatest adjustments are made at large optical depths such as tau greater than 1. The addition of the adjustment factors has led to better correspondence to TES Limb data from 0-60 km altitude as well as better agreement with MGS, ODY and MRO data at approximately 90-130 km altitude. Improved Mars-GRAM atmospheric simulations for various locations, times and dust conditions on Mars will be presented at the workshop session. The latest results validating Mars-GRAM 2010 versus Mars Climate Sounder data will also be presented. Mars-GRAM 2010 updates have resulted in improved atmospheric simulations which will be very important when beginning systems design, performance analysis, and operations planning for future aerocapture, aerobraking or landed missions to Mars.

  10. Additional Results of Ice-Accretion Scaling at SLD Conditions

    NASA Technical Reports Server (NTRS)

    Bond, Thomas H. (Technical Monitor); Anderson, David N.; Tsao, Jen-Ching

    2005-01-01

    To determine scale velocity, an additional similarity parameter is needed to supplement the Ruff scaling method. A Weber number based on the water droplet MVD has been included in several studies because the effect of droplet splashing on ice accretion was believed to be important, particularly for SLD conditions. In the present study, ice shapes recorded at Appendix-C conditions and recent results at SLD conditions are reviewed to show that droplet diameter cannot be important to the main ice shape, and that at low airspeeds splashing does not appear to affect SLD ice shapes. Evidence is presented to show that while a supplementary similarity parameter probably has the form of a Weber number, it must be based on a length proportional to model size rather than MVD. Scaling comparisons were made between SLD reference conditions and Appendix-C scale conditions using this Weber number. Scale-to-reference model size ratios were 1:1.7 and 1:3.4. The reference tests used a 91-cm-chord NACA 0012 model with a velocity of approximately 50 m/s and an MVD of 160 μm. Freezing fractions of 0.3, 0.4, and 0.5 were included in the study.
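
    The supplementary similarity parameter argued for above is a Weber number written with a model-scale length instead of the MVD. Assuming its conventional form (water density ρ_w, airspeed V, a length c proportional to model size, surface tension σ; the exact reference quantities of the study are not restated here), matching it between scale and reference fixes the scale velocity:

    ```latex
    We_c = \frac{\rho_w V^2 c}{\sigma}, \qquad
    We_{c,\mathrm{scale}} = We_{c,\mathrm{ref}}
    \;\Longrightarrow\;
    V_{\mathrm{scale}} = V_{\mathrm{ref}} \sqrt{\frac{c_{\mathrm{ref}}}{c_{\mathrm{scale}}}}
    \quad \text{(fluid properties held fixed)}.
    ```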

  11. Multiple Linkage Disequilibrium Mapping Methods to Validate Additive Quantitative Trait Loci in Korean Native Cattle (Hanwoo).

    PubMed

    Li, Yi; Kim, Jong-Joo

    2015-07-01

    The efficiency of genome-wide association analysis (GWAS) depends on the power of detection for quantitative trait loci (QTL) and the precision of QTL mapping. In this study, three different strategies for GWAS were applied to detect QTL for carcass quality traits in the Korean cattle, Hanwoo: a linkage disequilibrium single locus regression method (LDRM), a combined linkage and linkage disequilibrium analysis (LDLA) and a BayesCπ approach. The phenotypes of 486 steers were collected for weaning weight (WWT), yearling weight (YWT), carcass weight (CWT), backfat thickness (BFT), longissimus dorsi muscle area, and marbling score (Marb). The genotype data for the steers and their sires were scored with the Illumina bovine 50K single nucleotide polymorphism (SNP) chips. For the two former GWAS methods, threshold values were set at a false discovery rate <0.01 on a chromosome-wide level, while a cut-off threshold value was set in the latter model such that the top five windows, each comprising 10 adjacent SNPs, were chosen with significant variation for the phenotype. Four major additive QTL from these three methods showed high concordance, located at 64.1 to 64.9 Mb on Bos taurus autosome (BTA) 7 for WWT, 24.3 to 25.4 Mb on BTA14 for CWT, 0.5 to 1.5 Mb on BTA6 for BFT and 26.3 to 33.4 Mb on BTA29 for BFT. Several candidate genes (i.e. glutamate receptor, ionotropic, ampa 1 [GRIA1], family with sequence similarity 110, member B [FAM110B], and thymocyte selection-associated high mobility group box [TOX]) may be identified close to these QTL. Our results suggest that the use of different linkage disequilibrium mapping approaches can provide more reliable chromosome regions to further pinpoint DNA markers or causative genes in these regions.
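
    The chromosome-wise false discovery rate threshold used for the first two methods is commonly implemented with the Benjamini-Hochberg step-up rule; a generic sketch (not the authors' code, hypothetical p-values) is given below.

    ```python
    import numpy as np

    def benjamini_hochberg(pvalues, fdr=0.01):
        """Boolean mask of tests declared significant at the given false discovery rate."""
        p = np.asarray(pvalues)
        m = len(p)
        order = np.argsort(p)
        # Largest rank k with p_(k) <= (k/m) * fdr; everything up to that rank is kept.
        passed = p[order] <= fdr * np.arange(1, m + 1) / m
        keep = np.zeros(m, dtype=bool)
        if passed.any():
            k = np.max(np.nonzero(passed)[0])
            keep[order[: k + 1]] = True
        return keep

    # Example: p-values from single-SNP association tests on one chromosome (hypothetical).
    pvals = [1e-6, 2e-4, 0.003, 0.04, 0.20, 0.51]
    print(benjamini_hochberg(pvals, fdr=0.01))   # [ True  True  True False False False]
    ```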

  13. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  14. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  15. [Quantitative determination of morphine in opium powder by addition and correlation method using capillary electrophoresis].

    PubMed

    Sun, Guo-xiang; Miao, Ju-ru; Wang, Yu; Sun, Yu-qing

    2002-01-01

    Morphine in opium powder was quantitatively determined by the addition and correlation method (ACM) using capillary zone electrophoresis; the average recovery was 100.6%. The relative standard deviation (RSD) of the migration time was not more than 2.4%, the RSD of the relative migration time was not more than 1.1%, and the RSD of the relative area was not more than 0.51%. For comparison, the same determination was also carried out by the calibration curve method with an internal standard. The content of morphine in opium powder determined by ACM was the same as that obtained by the calibration curve method with an internal standard. The study shows that ACM is simple, quick and accurate.

  16. Microwave coupling into a slotted cavity. Additional results

    NASA Astrophysics Data System (ADS)

    Baeckstroem, M.; Loren, J.

    1994-12-01

    Further evaluation of simple formulas for shielding effectiveness and for the absorption cross section of a wire inside a shielded structure has been made. The results give further support to the expressions derived earlier in FOA report C 30712-8.3,3.2 (PB94-123742). The main objective of the work has been to find and evaluate simple expressions for microwave coupling into electronic compartments. The expressions are intended to be used for bounding calculations in design and analysis of system hardness against intense microwave radiation, e.g. HPM (High Power Microwaves). It is shown that introduction of microwave absorbing material into the cavity gives an expected increase in shielding effectiveness. It is also shown that shielding effectiveness depends only to a small extent on the position and length of the wire. The total transmission area for multiple apertures can be expressed as the sum of the areas of the individual apertures. The absorption cross section for a wire inside the cavity is shown to depend only slightly on wire position and length, even when the wire is located very close to a wall. The results lead to further improvement of the methodology for analysis of system hardness against HPM radiation. It also lays a foundation for a more scientific approach to the design of shielded structures. Such an approach would result in increased reliability and also in a reduction of costs due to a reduced need for (large) safety margins and fewer late design modifications. The report also proposes a new method for measuring the shielding effectiveness of apertures.
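
    Two of the quantities discussed above have compact standard definitions (written here in generic notation; the report's own symbols may differ): shielding effectiveness as a power ratio in dB, the total transmission cross section of several apertures as the sum over the individual ones, and the power delivered to the wire as its absorption cross section times the incident power density.

    ```latex
    SE = 10 \log_{10}\!\left(\frac{P_{\mathrm{without\ shield}}}{P_{\mathrm{with\ shield}}}\right)\ \mathrm{dB},
    \qquad
    \sigma_{\mathrm{trans,total}} = \sum_{k} \sigma_{\mathrm{trans},k},
    \qquad
    P_{\mathrm{wire}} = \sigma_{\mathrm{abs}}\, S_{\mathrm{inc}}.
    ```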

  17. Large-Scale Spray Releases: Additional Aerosol Test Results

    SciTech Connect

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  18. Small-Scale Spray Releases: Additional Aerosol Test Results

    SciTech Connect

    Schonewill, Philip P.; Gauglitz, Phillip A.; Kimura, Marcia L.; Brown, G. N.; Mahoney, Lenna A.; Tran, Diana N.; Burns, Carolyn A.; Kurath, Dean E.

    2013-08-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are largely absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale. The small-scale testing and resultant data are described in Mahoney et al. (2012b) and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used to mimic the

  19. Additional Results of Glaze Icing Scaling in SLD Conditions

    NASA Technical Reports Server (NTRS)

    Tsao, Jen-Ching

    2016-01-01

    New guidance on acceptable means of compliance with super-cooled large drop (SLD) conditions was issued by the U.S. Department of Transportation's Federal Aviation Administration (FAA) in Advisory Circular AC 25-28 in November 2014. Part 25, Appendix O was developed to define a representative icing environment for super-cooled large drops. Super-cooled large drops, which include freezing drizzle and freezing rain conditions, are not included in Appendix C. This paper reports results from recent glaze icing scaling tests conducted in the NASA Glenn Icing Research Tunnel (IRT) to evaluate how well the scaling methods recommended for Appendix C conditions might apply to SLD conditions. The models were straight NACA 0012 wing sections. The reference model had a chord of 72 in. and the scale model had a chord of 21 in. Reference tests were run with airspeeds of 100 and 130.3 kn and with MVDs of 85 and 170 μm. Two scaling methods were considered. One was based on the modified Ruff method, with the scale velocity found by matching the Weber number We_L. The other was proposed and developed by Feo specifically for strong glaze icing conditions, in which the scale liquid water content and velocity were found by matching reference and scale values of the nondimensional water-film thickness expression and the film Weber number We_f. All tests were conducted at 0 deg AOA. Results will be presented for stagnation freezing fractions of 0.2 and 0.3. For nondimensional reference and scale ice shape comparisons, a new post-scanning ice shape digitization procedure was developed for extracting 2-D ice shape profiles at any selected span-wise location from the high-fidelity 3-D scanned ice shapes obtained in the IRT.

  20. Correlation of qualitative and quantitative results from testing respirator fit

    SciTech Connect

    Hardis, K.E.

    1983-02-01

    Three qualitative respirator fit tests were evaluated for their ability to measure respiratory protection adequately. The methods were the negative pressure test, the isoamyl acetate test, and the irritant smoke test. Each test was performed concurrently with a single quantitative fit test, the dioctylphthalate (DOP) test, during 274 half-mask and 274 full-facepiece wearings. Most (95%) of the study subjects had adequately fitting respirators as determined by quantitative testing. Of these subjects, 96-100% passed the qualitative fit tests. Of the 5% of the study subjects with inadequately fitting half-mask respirators, 93-100% of the inadequate fits were detected by qualitative methods. Twenty-three to 46% of the poorly fitting full face masks were detected by qualitative methods. The probability of passing or failing a qualitative test with an inadequately fitting respirator can be estimated; however, the uncertainty associated with each estimate is largely due to the small number of study subjects with poorly fitting respirators.

  1. Origins of stereoselectivity in the Diels-Alder addition of chiral hydroxyalkyl vinyl ketones to cyclopentadiene: a quantitative computational study.

    PubMed

    Bakalova, Snezhana M; Kaneti, Jose

    2008-12-18

    Modest basis set level MP2/6-31G(d,p) calculations on the Diels-Alder addition of S-1-alkyl-1-hydroxy-but-3-en-2-ones (1-hydroxy-1-alkyl methyl vinyl ketones) to cyclopentadiene correctly reproduce the trends in known experimental endo/exo and diastereoface selectivity. B3LYP theoretical results at the same or significantly higher basis set level, on the other hand, do not satisfactorily model observed endo/exo selectivities and are thus unsuitable for quantitative studies. The same also holds for subtle effects originating from, for example, conformational distributions of reactants. The latter shortcomings are not alleviated by the fact that observed diastereoface selectivities are well reproduced by DFT calculations. Quantitative computational studies of large cycloaddition systems would require higher basis sets and a better account of electron correlation than MP2, such as, for example, CCSD. Presently, however, with 30 or more non-hydrogen atoms, these computations are hardly feasible. We present quantitatively correct stereochemical predictions using a hybrid layered ONIOM computational approach, including the chiral carbon atom and the intramolecular hydrogen bond in a higher-level, MP2/6-311G(d,p) or CCSD/6-311G(d,p), layer. Significant computational economy is achieved by taking account of the surrounding bulky (alkyl) residues at 6-31G(d) in a low-level HF layer. We conclude that theoretical calculations based on explicit correlated MO treatment of the reaction site are sufficiently reliable for the prediction of both endo/exo and diastereoface selectivity of Diels-Alder addition reactions. This is in line with the understanding of endo/exo selectivity originating from dynamic electron correlation effects of interacting π fragments and diastereofacial selectivity originating from steric interactions of fragments outside of the Diels-Alder reaction site. PMID:18637663

  2. Additives

    NASA Technical Reports Server (NTRS)

    Smalheer, C. V.

    1973-01-01

    The chemistry of lubricant additives is discussed to show what the additives are chemically and what functions they perform in the lubrication of various kinds of equipment. Current theories regarding the mode of action of lubricant additives are presented. The additive groups discussed include the following: (1) detergents and dispersants, (2) corrosion inhibitors, (3) antioxidants, (4) viscosity index improvers, (5) pour point depressants, and (6) antifouling agents.

  3. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  4. 75 FR 4323 - Additional Quantitative Fit-testing Protocols for the Respiratory Protection Standard

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-27

    ... performed particle counts on samples collected during the Study. Table 1 provides the exercise and sampling... revised PortaCount quantitative fit-testing protocols are not sufficiently accurate or reliable to include...) to Appendix A of its Respiratory Protection Standard (see 69 FR 46986). OSHA also published...

  5. Validation and Estimation of Additive Genetic Variation Associated with DNA Tests for Quantitative Beef Cattle Traits

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The U.S. National Beef Cattle Evaluation Consortium (NBCEC) has been involved in the validation of commercial DNA tests for quantitative beef quality traits since their first appearance on the U.S. market in the early 2000s. The NBCEC Advisory Council initially requested that the NBCEC set up a syst...

  6. Additive effects of pollinators and herbivores result in both conflicting and reinforcing selection on floral traits.

    PubMed

    Sletvold, Nina; Moritz, Kim K; Agren, Jon

    2015-01-01

    Mutualists and antagonists are known to respond to similar floral cues, and may thus cause opposing selection on floral traits. However, we lack a quantitative understanding of their independent and interactive effects. In a population of the orchid Gymnadenia conopsea, we manipulated the intensity of pollination and herbivory in a factorial design to examine whether both interactions influence selection on flowering phenology, floral display, and morphology. Supplemental hand-pollination increased female fitness by 31% and one-quarter of all plants were damaged by herbivores. Both interactions contributed to selection. Pollinators mediated selection for later flowering and herbivores for earlier flowering, while both selected for longer spurs. The strength of selection was similar for both agents, and their effects were additive. As a consequence, there was no net selection on phenology, whereas selection on spur length was strong. The experimental results demonstrate that both pollinators and herbivores can markedly influence the strength of selection on flowering phenology and floral morphology, and cause both conflicting and reinforcing selection. They also indicate that the direction of selection on phenology will vary with the relative intensity of the mutualistic and antagonistic interaction, potentially resulting in both temporal and among-population variation in optimal flowering time.

  7. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps, with particular emphasis placed on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors appeared different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful for exposing spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for
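
    The quantitative validation step described above, fitting a classifier on landslide presence/absence data and scoring it by the area under the ROC curve on held-out observations, can be sketched generically as follows (scikit-learn, synthetic stand-in data; the study's predictors, inventories and spatial cross-validation partitioning are not reproduced).

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in data: rows are terrain units, columns are predictors
    # (e.g. slope, lithology, land cover); y marks mapped landslide presence.
    X = rng.normal(size=(1000, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=1000) > 0).astype(int)

    # Single holdout split; spatial cross-validation would instead partition by location.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print(f"Holdout AUROC = {roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]):.2f}")
    ```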

  8. Experimental demonstration of quantitation errors in MR spectroscopy resulting from saturation corrections under changing conditions

    NASA Astrophysics Data System (ADS)

    Galbán, Craig J.; Ellis, Scott J.; Spencer, Richard G. S.

    2003-04-01

    Metabolite concentration measurements in in vivo NMR are generally performed under partially saturated conditions, with correction for partial saturation performed after data collection using a measured saturation factor. Here, we present an experimental test of the hypothesis that quantitation errors can occur due to application of such saturation factor corrections in changing systems. Thus, this extends our previous theoretical work on quantitation errors due to varying saturation factors. We obtained results for two systems frequently studied by 31P NMR, the ischemic rat heart and the electrically stimulated rat gastrocnemius muscle. The results are interpreted in light of previous theoretical work which defined the degree of saturation occurring in a one-pulse experiment for a system with given spin-lattice relaxation times, T1s, equilibrium magnetizations, M0s, and reaction rates. We found that (i) the assumption of constancy of saturation factors leads to quantitation errors on the order of 40% in inorganic phosphate; (ii) the dominant contributor to the quantitation errors in inorganic phosphate is most likely changes in T1; (iii) T1 and M0 changes between control and intervention periods, and chemical exchange contribute to different extents to quantitation errors in phosphocreatine and γ-ATP; (iv) relatively small increases in interpulse delay substantially decreased quantitation errors for metabolites in ischemic rat hearts; (v) random error due to finite SNR led to approximately 4% error in quantitation, and hence was a substantially smaller contributor than were changes in saturation factors.
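
    The mechanism behind such errors can be made concrete with the standard one-pulse saturation expression for 90° excitation and repetition time TR (mono-exponential T1; the chemical-exchange effects discussed in the paper are ignored in this toy calculation, and the numbers are illustrative only).

    ```python
    import numpy as np

    def saturation_factor(TR, T1):
        """Fraction of the fully relaxed signal seen with a 90-degree pulse every TR seconds."""
        return 1.0 - np.exp(-TR / T1)

    TR = 2.0               # s, interpulse delay (illustrative)
    T1_control = 4.0       # s, T1 during the control period
    T1_intervention = 6.0  # s, T1 after the intervention

    f_assumed = saturation_factor(TR, T1_control)     # correction measured at control and reused
    f_true = saturation_factor(TR, T1_intervention)   # saturation that actually applies later

    # Measured signal is M0 * f_true; dividing by the stale factor gives M0 * f_true / f_assumed.
    error = f_true / f_assumed - 1.0
    print(f"apparent metabolite level is off by {100 * error:+.0f}%")
    ```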

  9. A Method for Quantitative Evaluation of the Results of Postural Tests.

    PubMed

    Alifirova, V M; Brazovskii, K S; Zhukova, I A; Pekker, Ya S; Tolmachev, I V; Fokin, V A

    2016-07-01

    A method for quantitative evaluation of the results of postural tests is proposed. The method is based on contact-free measurements of 3D coordinates of body point movements. The result can serve as an integral test based on the Mahalanobis distance. PMID:27492397
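
    As an illustration of such an integral score, the Mahalanobis distance of a subject's feature vector from a reference (e.g. healthy-control) distribution can be computed directly; the features and numbers below are hypothetical.

    ```python
    import numpy as np

    # Reference group: rows are control subjects, columns are postural features
    # extracted from the contact-free 3D coordinates (e.g. sway amplitude, velocity).
    reference = np.random.default_rng(1).normal(size=(40, 3))

    mu = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))

    def mahalanobis(x):
        d = x - mu
        return float(np.sqrt(d @ cov_inv @ d))

    # Integral score for one test subject's feature vector.
    print(mahalanobis(np.array([1.8, -0.4, 2.1])))
    ```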

  10. Does contraceptive treatment in wildlife result in side effects? A review of quantitative and anecdotal evidence.

    PubMed

    Gray, Meeghan E; Cameron, Elissa Z

    2010-01-01

    The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, these formulations should minimally impact the behavior of individuals and populations before a contraceptive is used for population manipulation, but these effects have received less attention. Potential side effects have been identified theoretically and we reviewed published studies that have investigated side effects on behavior and physiology of individuals or population-level effects, which provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects, but were usually based on qualitative data or anecdotes. A meta-analysis on quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient to investigate secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about secondary effects of contraceptive treatment and experiments are fundamental to conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex and fertility on the behavior of wildlife species.

  11. An empirical approach to the bond additivity model in quantitative interpretation of sum frequency generation vibrational spectra

    NASA Astrophysics Data System (ADS)

    Wu, Hui; Zhang, Wen-kai; Gan, Wei; Cui, Zhi-feng; Wang, Hong-fei

    2006-10-01

    Knowledge of the ratios between the different polarizability tensor elements β_{i'j'k'} of a chemical group in a molecule is crucial for quantitative interpretation and polarization analysis of its sum frequency generation vibrational spectroscopy (SFG-VS) spectrum at an interface. The bond additivity model (BAM), or hyperpolarizability derivative model, along with experimentally obtained Raman depolarization ratios, has been widely used to obtain such tensor ratios for the CH3, CH2, and CH groups. Such treatment can successfully and quantitatively reproduce the polarization dependence of the intensities in SFG-VS spectra for the symmetric (SS) and asymmetric (AS) stretching modes of the CH3 and CH2 groups. However, the relative intensities between the SS and AS modes usually do not agree with this model even for some of the simplest molecular systems, such as the air/methanol interface. This has cast uncertainty on the effectiveness of, and conclusions based on, the BAM. One such example is that the AS mode of the CH3 group has never been observed in SFG-VS spectra from the air/methanol interface, while this AS mode is usually very strong in SFG-VS spectra from the air/ethanol interface, other short-chain alcohols, and long-chain surfactants. In order to answer these questions, an empirical approach based on known Raman and IR spectra is used to make corrections to the BAM. With the corrected ratios between the β_{i'j'k'} tensor elements of the SS and AS modes, all features in the SFG-VS spectra of the air/methanol and air/ethanol interfaces can be quantitatively interpreted. This empirical approach not only provides new understanding of the effectiveness and limitations of the bond additivity model but also provides a practical way to apply it in SFG-VS studies of molecular interfaces.

  12. Meta-analysis of results from quantitative trait loci mapping studies on pig chromosome 4.

    PubMed

    Silva, K M; Bastiaansen, J W M; Knol, E F; Merks, J W M; Lopes, P S; Guimarães, S E F; van Arendonk, J A M

    2011-06-01

    Meta-analysis of results from multiple studies could lead to more precise quantitative trait loci (QTL) position estimates compared to the individual experiments. As the raw data from many different studies are not readily available, the use of results from published articles may be helpful. In this study, we performed a meta-analysis of QTL on chromosome 4 in pig, using data from 25 separate experiments. First, a meta-analysis was performed for individual traits: average daily gain and backfat thickness. Second, a meta-analysis was performed for the QTL of three traits affecting loin yield: loin eye area, carcass length and loin meat weight. Third, 78 QTL were selected from 20 traits that could be assigned to one of three broad categories: carcass, fatness or growth traits. For each analysis, the number of identified meta-QTL was smaller than the number of initial QTL. The reduction in the number of QTL ranged from 71% to 86% compared to the total number before the meta-analysis. In addition, the meta-analysis reduced the QTL confidence intervals by as much as 85% compared to individual QTL estimates. The reduction in the confidence interval was greater when a large number of independent QTL was included in the meta-analysis. Meta-QTL related to growth and fatness were found in the same region as the FAT1 region. Results indicate that the meta-analysis is an efficient strategy to estimate the number and refine the positions of QTL when QTL estimates are available from multiple populations and experiments. This strategy can be used to better target further studies such as the selection of candidate genes related to trait variation.

  13. A Longitudinal Study of Man: A Course of Study. Volume II: Quantitative Results. Final Report.

    ERIC Educational Resources Information Center

    Cort, H. Russell, Jr.; Peskowitz, Nancy

    This second volume of the summative evaluation of "Man: A Course of Study" (MACOS) presents results of quantitative analyses of what MACOS students seemed to learn, what they retained one year later, and how what they learned was different from what students in other social studies courses learned. The first part of the document compares MACOS and…

  14. Analysis of 129I in Groundwater Samples: Direct and Quantitative Results below the Drinking Water Standard

    SciTech Connect

    Brown, Christopher F.; Geiszler, Keith N.; Lindberg, Michael J.

    2007-03-03

    Due to its long half-life (15.7 million years) and relatively unencumbered migration in subsurface environments, 129I has been recognized as a contaminant of concern at numerous federal, private, and international facilities. In order to understand the long-term risk associated with 129I at these locations, quantitative analysis of groundwater samples must be performed. However, the ability to quantitatively assess the 129I content in groundwater samples requires specialized extraction and sophisticated analytical techniques, which are complicated and not always available to the general scientific community. This paper highlights an analytical method capable of directly quantifying 129I in groundwater samples at concentrations below the maximum contaminant level (MCL) without the need for sample pre-concentration. Samples were analyzed on a Perkin Elmer ELAN DRC II ICP-MS after minimal dilution using O2 as the reaction gas. Analysis of continuing calibration verification standards indicated that the DRC mode could be used for quantitative analysis of 129I in samples below the drinking water standard (0.0057 ng/ml or 1 pCi/L). The low analytical detection limit of 129I analysis in the DRC mode coupled with minimal sample dilution (1.02x) resulted in a final sample limit of quantification of 0.0051 ng/ml. Subsequent analysis of three groundwater samples containing 129I resulted in fully quantitative results in the DRC mode, and spike recovery analyses performed on all three samples confirmed that the groundwater matrix did not adversely impact the analysis of 129I in the DRC mode. This analytical approach has been proven to be a cost-effective, high-throughput technique for the direct, quantitative analysis of 129I in groundwater samples at concentrations below the current MCL.
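
    The equivalence quoted above between 1 pCi/L and roughly 0.0057 ng/mL follows directly from the 15.7-million-year half-life; a quick consistency check using only standard physical constants:

    ```python
    import numpy as np

    half_life_s = 15.7e6 * 3.156e7   # 15.7 million years in seconds
    molar_mass = 129.0               # g/mol for 129I
    avogadro = 6.022e23

    # Specific activity A = ln(2) * N_A / (T_half * M), in Bq per gram.
    specific_activity = np.log(2) * avogadro / (half_life_s * molar_mass)

    activity_Bq = 1e-12 * 3.7e10     # 1 pCi expressed in Bq
    mass_g_per_L = activity_Bq / specific_activity
    print(f"1 pCi/L of 129I ≈ {mass_g_per_L * 1e6:.4f} ng/mL")   # ≈ 0.0057 ng/mL
    ```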

  15. Bridging the gap between qualitative and quantitative colocalization results in fluorescence microscopy studies

    PubMed Central

    Zinchuk, Vadim; Wu, Yong; Grossenbacher-Zinchuk, Olga

    2013-01-01

    Quantitative colocalization studies suffer from the lack of a unified approach to interpreting the obtained results. We developed a tool to characterize the results of colocalization experiments in a way that makes them understandable and comparable both qualitatively and quantitatively. Employing a fuzzy system model and computer simulation, we produced a set of just five linguistic variables tied to the values of popular colocalization coefficients: “Very Weak”, “Weak”, “Moderate”, “Strong”, and “Very Strong”. The use of the variables ensures that the results of colocalization studies are properly reported, easily shared, and universally understood by all researchers working in the field. When new coefficients are introduced, their values can be readily fitted into the set. PMID:23455567

  16. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
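
    One generic way to turn consecutive bone-image frames into local velocity vectors is dense optical flow; the OpenCV sketch below, with illustrative parameters and synthetic frames, is not the processing chain used in the study (which is only outlined above).

    ```python
    import cv2
    import numpy as np

    def velocity_map(frame_prev, frame_next, dt, px_mm):
        """Dense velocity field (mm/s) between two 8-bit grayscale bone-image frames.

        dt    : time between frames in seconds
        px_mm : pixel size in millimetres (hypothetical calibration value)
        """
        # Farneback dense optical flow; positional arguments are pyr_scale, levels,
        # winsize, iterations, poly_n, poly_sigma, flags (illustrative settings).
        flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # flow[..., 0] is the x displacement in pixels, flow[..., 1] the y displacement.
        return flow * px_mm / dt

    # Synthetic example; real use would load two frames of the dynamic FPD sequence.
    a = np.zeros((64, 64), np.uint8)
    a[20:40, 20:40] = 255
    b = np.roll(a, 2, axis=0)                      # rigid 2-pixel downward shift
    v = velocity_map(a, b, dt=0.067, px_mm=0.4)
    print(v.shape, float(np.abs(v[..., 1]).max()))
    ```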

  17. The Modern U.S. High School Astronomy Course, Its Status and Makeup II: Additional Results

    ERIC Educational Resources Information Center

    Krumenaker, Larry

    2009-01-01

    A postal survey of high school astronomy teachers strongly confirms many results of an earlier electronic survey. Additional and new results include a measure of the level of inquiry (more structured inquiry and teacher-led) in the classroom as well as data showing that more emphasis is given to traditional topics than to contemporary astronomy…

  18. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Pilot-Scale Test Results

    SciTech Connect

    Gary M. Blythe

    2006-03-01

    Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High Sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. This topical report presents the results from the Task 2 and Task 4 pilot-scale additive tests. The Task 3 and Task 5 full-scale additive tests will be conducted later in calendar year 2006.

  19. A quantitative method for evaluating results of treating Legg-Perthes syndrome.

    PubMed

    Harry, J D; Gross, R H

    1987-01-01

    A new quantitative method of analyzing hip joint architecture in Legg-Perthes syndrome is presented. Outlines of the bony femoral head and acetabular configuration as seen on the anteroposterior (AP) view were traced on a digitizer. Computer analysis provided measures of joint congruity, containment, and femoral head shape. The method's ability to distinguish pathologic from normal hips and to trace the course of the bony deformity of the hip joint was demonstrated in a group of 14 patients. Interobserver reliability was established. The method provides an objective quantification of treatment results and a reliable means for comparison of data between groups of patients.

  20. Surface porosity of stone casts resulting from immersion of addition silicone rubber impressions in disinfectant solutions.

    PubMed

    Hiraguchi, Hisako; Kaketani, Masahiro; Hirose, Hideharu; Kikuchi, Hisaji; Yoneyama, Takayuki

    2014-01-01

    This study investigated the effects of immersion of addition silicone rubber impressions in disinfectant solutions on the surface porosity of the resulting stone casts. Five brands of type 2 and 3 addition silicone rubber impression materials and one brand of type 4 dental stone were used. Impressions of a master die designed to simulate an abutment tooth were immersed in disinfectant for 30 minutes. The disinfectants used were 2% glutaraldehyde solution and 0.55% ortho-phthalaldehyde solution. The surface porosities of stone casts obtained from two brands of impression materials immersed in disinfectant for 30 minutes were determined. Results suggest that impression materials immersed in disinfectant solutions need sufficient time before pouring into dental stone.

  1. Critical appraisal of quantitative PCR results in colorectal cancer research: can we rely on published qPCR results?

    PubMed

    Dijkstra, J R; van Kempen, L C; Nagtegaal, I D; Bustin, S A

    2014-06-01

    The use of real-time quantitative polymerase chain reaction (qPCR) in cancer research has become ubiquitous. The relative simplicity of qPCR experiments, which deliver fast and cost-effective results, means that each year an increasing number of papers utilizing this technique are being published. But how reliable are the published results? Since the validity of gene expression data is greatly dependent on appropriate normalisation to compensate for sample-to-sample and run-to-run variation, we have evaluated the adequacy of normalisation procedures in qPCR-based experiments. Consequently, we assessed all colorectal cancer publications that made use of qPCR from 2006 until August 2013 for the number of reference genes used and whether they had been validated. Using even these minimal evaluation criteria, the validity of only three percent (6/179) of the publications can be adequately assessed. We describe common errors, and conclude that the current state of reporting on qPCR in colorectal cancer research is disquieting. Extrapolated to the study of cancer in general, it is clear that the majority of studies using qPCR cannot be reliably assessed and that at best, the results of these studies may or may not be valid and at worst, pervasive incorrect normalisation is resulting in the wholesale publication of incorrect conclusions. This survey demonstrates that the existence of guidelines, such as MIQE, is necessary but not sufficient to address this problem and suggests that the scientific community should examine its responsibility and be aware of the implications of these findings for current and future research.
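
    The normalisation that the survey checks for typically means referencing each target Cq to (the mean of) several validated reference genes before computing fold changes; a minimal ΔΔCq-style sketch with hypothetical Cq values is shown below (equal amplification efficiencies are assumed for simplicity).

    ```python
    import numpy as np

    def fold_change(cq_target, cq_refs, cq_target_ctrl, cq_refs_ctrl, efficiency=2.0):
        """Target-gene fold change normalised to several reference genes (ΔΔCq style).

        cq_refs / cq_refs_ctrl hold one Cq value per reference gene; averaging on the
        Cq scale corresponds to a geometric mean on the expression scale.
        """
        d_sample = cq_target - np.mean(cq_refs)
        d_control = cq_target_ctrl - np.mean(cq_refs_ctrl)
        return efficiency ** (-(d_sample - d_control))   # 2^-ΔΔCq when efficiency = 2

    # Hypothetical tumour vs. normal mucosa measurement with two reference genes.
    print(fold_change(24.1, [18.0, 19.2], 26.3, [18.1, 19.0]))
    ```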

  2. [Rapid Quantitative Analysis of Content of the Additive in Gasoline for Motor Vehicles by Near-Infrared Spectroscopy].

    PubMed

    Rong, Hai-teng; Song, Chun-feng; Yuan, Hong-fu; Li, Xiao-yu; Hu, Ai-qin; Xie, Jin-chun; Yan, De-lin

    2015-10-01

    This paper presents a new rapid quantitative method, combining near-infrared spectroscopy with oblique projection, for determining oxygenates and compounds not included in the national standard in motor-vehicle gasoline. Four types of gasoline were studied: blended gasoline, FCC refined gasoline, reformed gasoline, and desulfurized gasoline. Series of gasoline samples containing different concentrations and types of compounds were prepared, and their transmission spectra were measured with an FTIR spectrometer. The oblique projection method separates the spectral signal of the component to be quantified from the mixed spectrum, and the separated signal is then analyzed by projection to obtain the content of the measured component. The deviation from the true content is low: the absolute error is less than 0.8 and the relative error is less than 8%. For actual gasoline samples, compared with gas chromatography, the absolute errors are less than 0.85 and the relative errors are less than 6.85%. The method addresses a shortcoming of general multivariate calibration methods and is significant for the development of rapid, on-site NIR detection technology and for improving gasoline quality.
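
    The separation step described above is, in general form, the standard oblique-projection (interference-rejecting least-squares) estimator. The following is a minimal numpy sketch of that estimator on synthetic spectra; the reference band shapes, noise level, and concentration are illustrative assumptions, not the calibration set or spectra used in the paper.

```python
import numpy as np

def oblique_estimate(y, S, H):
    """Estimate the contribution of the target component(s) S in the mixed
    spectrum y while rejecting the interference subspace spanned by H.

    y : (n_wavelengths,)   mixed spectrum
    S : (n_wavelengths, k) reference spectra of the target component(s)
    H : (n_wavelengths, m) reference spectra of the interfering components
    """
    # Orthogonal projector onto the complement of the interference subspace
    P_H_perp = np.eye(H.shape[0]) - H @ np.linalg.pinv(H)
    # Interference-rejecting least-squares estimate of the target coefficients
    return np.linalg.pinv(P_H_perp @ S) @ (P_H_perp @ y)

# Synthetic demonstration: two overlapping Gaussian bands as pure-component spectra
wl = np.linspace(0.0, 1.0, 200)
s_target = np.exp(-((wl - 0.3) / 0.05) ** 2)          # target additive band
s_backgr = np.exp(-((wl - 0.6) / 0.10) ** 2)          # interfering background band
y = 0.8 * s_target + 1.5 * s_backgr + 0.01 * np.random.randn(wl.size)

coef = oblique_estimate(y, s_target[:, None], s_backgr[:, None])
print(coef)   # should recover roughly 0.8 for the target component
```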

  3. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2015-09-22

    Social changes have rapidly displaced arranged marriages, and it seems this change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three marriage patterns in Babol city, Iran. There seems to be a convergence in childbearing across the different marriage patterns, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning.

  4. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly displaced arranged marriages, and it seems this change in marriage pattern has played a role in childbearing. On the other hand, there is a great reduction in population in many countries, which requires a comprehensive policy to manage the considerable drop in population. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three marriage patterns in Babol city, Iran. There seems to be a convergence in childbearing across the different marriage patterns, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  5. Quantitative assessment of breast lesion viscoelasticity: initial clinical results using supersonic shear imaging.

    PubMed

    Tanter, Mickael; Bercoff, Jeremy; Athanasiou, Alexandra; Deffieux, Thomas; Gennisson, Jean-Luc; Montaldo, Gabriel; Muller, Marie; Tardivon, Anne; Fink, Mathias

    2008-09-01

    This paper presents an initial clinical evaluation of in vivo elastography for breast lesion imaging using the concept of supersonic shear imaging. This technique is based on the combination of a radiation force induced in tissue by an ultrasonic beam and an ultrafast imaging sequence capable of catching in real time the propagation of the resulting shear waves. The local shear wave velocity is recovered using a time-of-flight technique and enables the 2-D mapping of shear elasticity. This imaging modality is implemented on a conventional linear probe driven by a dedicated ultrafast echographic device. Consequently, it can be performed during a standard echographic examination. The clinical investigation was performed on 15 patients, corresponding to 15 lesions (4 cases BI-RADS 3, 7 cases BI-RADS 4 and 4 cases BI-RADS 5). The ability of the supersonic shear imaging technique to provide a quantitative and local estimation of the shear modulus of abnormalities with a millimetric resolution is illustrated on several malignant (invasive ductal and lobular carcinoma) and benign cases (fibrocystic changes and viscous cysts). In the investigated cases, malignant lesions were found to be significantly different from benign solid lesions with respect to their elasticity values. Cystic lesions showed no shear wave propagation at all within the lesion (because shear waves do not propagate in liquids). These preliminary clinical results directly demonstrate the clinical feasibility of this new elastography technique in providing quantitative assessment of the relative stiffness of breast tissues. This technique of evaluating tissue elasticity gives valuable information that is complementary to the B-mode morphologic information. More extensive studies are necessary to validate the assumption that this new mode potentially helps the physician in rejecting both false positives and false negatives.
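
    The quantitative output of the technique is a map of local shear-wave speed, which is converted to an elastic modulus. Below is a minimal sketch of that textbook conversion (shear modulus mu = rho * c_s^2, and the E ~ 3*mu approximation commonly used for nearly incompressible soft tissue) with hypothetical per-pixel speeds; it is not the SSI reconstruction pipeline itself.

```python
import numpy as np

RHO_TISSUE = 1000.0  # kg/m^3, typical assumption for soft tissue density

def shear_modulus_kpa(shear_wave_speed_ms):
    """Shear modulus mu = rho * c_s^2, returned in kPa."""
    c = np.asarray(shear_wave_speed_ms, dtype=float)
    return RHO_TISSUE * c ** 2 / 1e3

def youngs_modulus_kpa(shear_wave_speed_ms):
    """For nearly incompressible soft tissue, E is approximately 3 * mu."""
    return 3.0 * shear_modulus_kpa(shear_wave_speed_ms)

# Hypothetical per-pixel shear-wave speeds (m/s) from a time-of-flight map
speeds = np.array([[1.5, 2.0],
                   [4.0, 6.0]])
print(shear_modulus_kpa(speeds))   # stiffer (malignant-like) regions give larger mu
print(youngs_modulus_kpa(speeds))
```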

  6. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  7. The post-embryonic development of Remipedia (Crustacea)--additional results and new insights.

    PubMed

    Koenemann, Stefan; Olesen, Jørgen; Alwes, Frederike; Iliffe, Thomas M; Hoenemann, Mario; Ungerer, Petra; Wolff, Carsten; Scholtz, Gerhard

    2009-03-01

    The post-embryonic development of a species of the enigmatic crustacean group Remipedia is described in detail for the first time from several perspectives. Applying a molecular approach, we clearly establish the identity of the larvae as belonging to Pleomothra apletocheles. We document the cellular level of several larval stages and the differentiation of segments, limbs, and the general body morphology using confocal laser scanning microscopy and scanning electron microscopy. In addition, we document the swimming behavior and the peculiar movements of the naupliar appendages. A comparison of our results with published data on other Crustacea and their larval development tentatively supports ideas about the phylogenetic affinities of the Remipedia to the Malacostraca.

  8. Sensitive and cost-effective LC-MS/MS method for quantitation of CVT-6883 in human urine using sodium dodecylbenzenesulfonate additive to eliminate adsorptive losses.

    PubMed

    Chen, Chungwen; Bajpai, Lakshmikant; Mollova, Nevena; Leung, Kwan

    2009-04-01

    CVT-6883, a novel selective A(2B) adenosine receptor antagonist currently under clinical development, is highly lipophilic and exhibits high affinity for non-specific binding to container surfaces, resulting in very low recovery in urine assays. Our study showed that the use of sodium dodecylbenzenesulfonate (SDBS), a low-cost additive, eliminated non-specific binding problems in the analysis of CVT-6883 in human urine without compromising sensitivity. A new sensitive and selective LC-MS/MS method for quantitation of CVT-6883 in the range of 0.200-80.0 ng/mL using the SDBS additive was therefore developed and validated for the analysis of human urine samples. The recoveries during sample collection, handling and extraction for the analyte and internal standard (d5-CVT-6883) were higher than 87%. CVT-6883 was found stable under the following conditions: in extract, at ambient temperature for 3 days and under refrigeration (5 °C) for 6 days; in human urine (containing 4 mM SDBS), after three freeze/thaw cycles, at ambient temperature for 26 h, under refrigeration (5 °C) for 94 h, and in a freezer set to -20 °C for at least 2 months. The results demonstrated that the validated method is sufficiently sensitive, specific, and cost-effective for the analysis of CVT-6883 in human urine and will provide a powerful tool to support the clinical programs for CVT-6883.

  9. Further results on delay-range-dependent stability with additive time-varying delay systems.

    PubMed

    Liu, Pin-Lin

    2014-03-01

    In this paper, new conditions for the delay-range-dependent stability analysis of time-varying delay systems are proposed in a Lyapunov-Krasovskii framework. The time delay is considered to be time-varying with lower and upper bounds. A new method is first presented for a system with two time delays, in which an integral inequality approach (IIA) is used to express relationships among the terms of the Leibniz-Newton formula. By constructing a novel Lyapunov-Krasovskii functional that includes information on the given delay range, a new delay-range-dependent criterion is established in terms of a linear matrix inequality (LMI). The advantage of this criterion lies in its simplicity and reduced conservatism. The paper also presents a new stability result for continuous systems with two additive time-varying delay components, representing a general class of delays with a strong application background in network-based control systems. The resulting criteria are expressed in terms of convex optimization with LMI constraints, allowing the use of efficient solvers. Finally, three numerical examples show that these methods reduce conservatism and improve the maximal allowable delay.

  10. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging

    PubMed Central

    Li, Chien Feng; Chen, Tai Yuan; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL with emphasis on the quantitative ADC analysis in the tumor necrosis, the most strongly-enhanced tumor area, and the peritumoral edema. This retrospective cohort study collected 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurement including the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe) were done. ROC analysis with optimal cut-off values and area-under-the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly-enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) as compared with PCL. Excellent AUC (0.94) with optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL was obtained by combination of ADC in the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe). Besides, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626

  11. Differentiation between Glioblastoma Multiforme and Primary Cerebral Lymphoma: Additional Benefits of Quantitative Diffusion-Weighted MR Imaging.

    PubMed

    Ko, Ching Chung; Tai, Ming Hong; Li, Chien Feng; Chen, Tai Yuan; Chen, Jeon Hor; Shu, Ginger; Kuo, Yu Ting; Lee, Yu Chang

    2016-01-01

    The differentiation between glioblastoma multiforme (GBM) and primary cerebral lymphoma (PCL) is important because the treatments are substantially different. The purpose of this article is to describe the MR imaging characteristics of GBM and PCL with emphasis on the quantitative ADC analysis in the tumor necrosis, the most strongly-enhanced tumor area, and the peritumoral edema. This retrospective cohort study collected 104 GBM (WHO grade IV) patients and 22 immune-competent PCL (diffuse large B cell lymphoma) patients. All these patients had pretreatment brain MR DWI and ADC imaging. Analysis of conventional MR imaging and quantitative ADC measurement including the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe) were done. ROC analysis with optimal cut-off values and area-under-the ROC curve (AUC) was performed. For conventional MR imaging, there are statistical differences in tumor size, tumor location, tumor margin, and the presence of tumor necrosis between GBM and PCL. Quantitative ADC analysis shows that GBM tended to have significantly (P<0.05) higher ADC in the most strongly-enhanced area (ADCt) and lower ADC in the peritumoral edema (ADCe) as compared with PCL. Excellent AUC (0.94) with optimal sensitivity of 90% and specificity of 86% for differentiating between GBM and PCL was obtained by combination of ADC in the tumor necrosis (ADCn), the most strongly-enhanced tumor area (ADCt), and the peritumoral edema (ADCe). Besides, there are positive ADC gradients in the peritumoral edema in a subset of GBMs but not in the PCLs. Quantitative ADC analysis in these three areas can thus be implemented to improve diagnostic accuracy for these two brain tumor types. The histological correlation of the ADC difference deserves further investigation. PMID:27631626
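
    The combined three-region criterion above amounts to merging several quantitative ADC features into a single score and measuring its discrimination with an ROC AUC. Below is a minimal sketch of one common way to do this (logistic regression plus scikit-learn's roc_auc_score) on synthetic data; the feature distributions, group sizes, and the model itself are illustrative assumptions, not the analysis reported in the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic [ADCn, ADCt, ADCe] values (x1e-6 mm^2/s), one row per patient
gbm = rng.normal([2200, 1100, 1500], [300, 150, 200], size=(104, 3))
pcl = rng.normal([1800,  850, 1700], [300, 150, 200], size=(22, 3))

X = StandardScaler().fit_transform(np.vstack([gbm, pcl]))
y = np.r_[np.ones(len(gbm)), np.zeros(len(pcl))]   # 1 = GBM, 0 = PCL

# Combine the three ADC measurements into a single probability score
score = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
print("in-sample AUC:", round(roc_auc_score(y, score), 2))
```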

  12. Plant interspecific differences in arbuscular mycorrhizal colonization as a result of soil carbon addition.

    PubMed

    Eschen, René; Müller-Schärer, Heinz; Schaffner, Urs

    2013-01-01

    Soil nutrient availability and colonization by arbuscular mycorrhizal fungi are important and potentially interacting factors shaping vegetation composition and succession. We investigated the effect of carbon (C) addition, aimed at reducing soil nutrient availability, on arbuscular mycorrhizal colonization. Seedlings of 27 plant species with different sets of life-history traits (functional group affiliation, life history strategy and nitrophilic status) were grown in pots filled with soil from a nutrient-rich set-aside field and amended with different amounts of C. Mycorrhizal colonization was progressively reduced along the gradient of increasing C addition in 17 out of 27 species, but not in the remaining species. Grasses had lower colonization levels than forbs and legumes and the decline in AM fungal colonization was more pronounced in legumes than in other forbs and grasses. Mycorrhizal colonization did not differ between annual and perennial species, but decreased more rapidly along the gradient of increasing C addition in plants with high Ellenberg N values than in plants with low Ellenberg N values. Soil C addition not only limits plant growth through a reduction in available nutrients, but also reduces mycorrhizal colonization of plant roots. The effect of C addition on mycorrhizal colonization varies among plant functional groups, with legumes experiencing an overproportional reduction in AM fungal colonization along the gradient of increasing C addition. We therefore propose that for a better understanding of vegetation succession on set-aside fields one may consider the interrelationship between plant growth, soil nutrient availability and mycorrhizal colonization of plant roots.

  13. TANK 40 FINAL SB5 CHEMICAL CHARACTERIZATION RESULTS PRIOR TO NP ADDITION

    SciTech Connect

    Bannochie, C.; Click, D.

    2010-01-06

    A sample of Sludge Batch 5 (SB5) was pulled from Tank 40 in order to obtain radionuclide inventory analyses necessary for compliance with the Waste Acceptance Product Specifications (WAPS). This sample was also analyzed for chemical composition including noble metals. Prior to radionuclide inventory analyses, a final sample of the H-canyon Np stream will be added to bound the Np addition anticipated for Tank 40. These analyses along with the WAPS radionuclide analyses will help define the composition of the sludge in Tank 40 that is currently being fed to DWPF as SB5. At the Savannah River National Laboratory (SRNL) the 3-L Tank 40 SB5 sample was transferred from the shipping container into a 4-L high density polyethylene vessel and solids allowed to settle overnight. Supernate was then siphoned off and circulated through the shipping container to complete the transfer of the sample. Following thorough mixing of the 3-L sample, a 239 g sub-sample was removed. This sub-sample was then utilized for all subsequent analytical samples. Eight separate aliquots of the slurry were digested, four with HNO3/HCl (aqua regia) in sealed Teflon® vessels and four in Na2O2 (alkali or peroxide fusion) using Zr crucibles. Due to the use of Zr crucibles and Na in the peroxide fusions, Na and Zr cannot be determined from this preparation. Additionally, other alkali metals, such as Li and K, that may be contaminants in the Na2O2 are not determined from this preparation. Three Analytical Reference Glass-1 (ARG-1) standards were digested along with a blank for each preparation. The ARG-1 glass allows for an assessment of the completeness of each digestion. Each aqua regia digestion and blank was diluted 1:100 with deionized water and submitted to Analytical Development (AD) for inductively coupled plasma-atomic emission spectroscopy (ICP-AES) analysis, inductively coupled plasma-mass spectrometry (ICP-MS) analysis of masses 81-209 and 230

  14. TANK 40 FINAL SB5 CHEMICAL CHARACTERIZATION RESULTS PRIOR TO NP ADDITION

    SciTech Connect

    Bannochie, C.; Click, D.

    2009-02-26

    A sample of Sludge Batch 5 (SB5) was pulled from Tank 40 in order to obtain radionuclide inventory analyses necessary for compliance with the Waste Acceptance Product Specifications (WAPS). This sample was also analyzed for chemical composition including noble metals. Prior to radionuclide inventory analyses, a final sample of the H-canyon Np stream will be added to bound the Np addition anticipated for Tank 40. These analyses along with the WAPS radionuclide analyses will help define the composition of the sludge in Tank 40 that is currently being fed to DWPF as SB5. At the Savannah River National Laboratory (SRNL) the 3-L Tank 40 SB5 sample was transferred from the shipping container into a 4-L high density polyethylene vessel and solids allowed to settle overnight. Supernate was then siphoned off and circulated through the shipping container to complete the transfer of the sample. Following thorough mixing of the 3-L sample, a 239 g sub-sample was removed. This sub-sample was then utilized for all subsequent analytical samples. Eight separate aliquots of the slurry were digested, four with HNO3/HCl (aqua regia) in sealed Teflon® vessels and four in Na2O2 (alkali or peroxide fusion) using Zr crucibles. Due to the use of Zr crucibles and Na in the peroxide fusions, Na and Zr cannot be determined from this preparation. Additionally, other alkali metals, such as Li and K, that may be contaminants in the Na2O2 are not determined from this preparation. Three Analytical Reference Glass-1 (ARG-1) standards were digested along with a blank for each preparation. The ARG-1 glass allows for an assessment of the completeness of each digestion. Each aqua regia digestion and blank was diluted 1:100 with deionized water and submitted to Analytical Development (AD) for inductively coupled plasma-atomic emission spectroscopy (ICP-AES) analysis, inductively coupled plasma-mass spectrometry (ICP-MS) analysis of masses 81-209 and 230

  15. A Pyrosequencing Assay for the Quantitative Methylation Analysis of GALR1 in Endometrial Samples: Preliminary Results

    PubMed Central

    Kottaridi, Christine; Koureas, Nikolaos; Margari, Niki; Terzakis, Emmanouil; Bilirakis, Evripidis; Pappas, Asimakis; Chrelias, Charalampos; Spathis, Aris; Aga, Evangelia; Pouliakis, Abraham; Panayiotides, Ioannis; Karakitsos, Petros

    2015-01-01

    Endometrial cancer is the most common malignancy of the female genital tract, while aberrant DNA methylation seems to play a critical role in endometrial carcinogenesis. Galanin expression has been implicated in many cancers. We developed a new pyrosequencing assay that quantifies DNA methylation of the galanin receptor 1 gene (GALR1). In this study, the preliminary results indicate that pyrosequencing methylation analysis of the GALR1 promoter can be a useful ancillary marker to cytology, as it can successfully predict the histological status. This marker has the potential to lead towards better management of women with endometrial lesions and eventually reduce unnecessary interventions. In addition, it can provide an early warning for women with a negative cytological result. PMID:26504828

  16. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Indirect food additives resulting from packaging..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  17. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Indirect food additives resulting from packaging..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  18. 21 CFR 570.13 - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Indirect food additives resulting from packaging materials prior sanctioned for animal feed and pet food. 570.13 Section 570.13 Food and Drugs FOOD AND DRUG... FOOD ADDITIVES General Provisions § 570.13 Indirect food additives resulting from packaging...

  19. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Indirect food additives resulting from packaging..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  20. 21 CFR 570.13 - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 6 2012-04-01 2012-04-01 false Indirect food additives resulting from packaging materials prior sanctioned for animal feed and pet food. 570.13 Section 570.13 Food and Drugs FOOD AND DRUG... FOOD ADDITIVES General Provisions § 570.13 Indirect food additives resulting from packaging...

  1. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 6 2011-04-01 2011-04-01 false Indirect food additives resulting from packaging..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD ADDITIVES General Provisions § 570.14 Indirect food additives resulting from packaging materials for animal feed...

  2. 21 CFR 570.13 - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 6 2014-04-01 2014-04-01 false Indirect food additives resulting from packaging materials prior sanctioned for animal feed and pet food. 570.13 Section 570.13 Food and Drugs FOOD AND DRUG... FOOD ADDITIVES General Provisions § 570.13 Indirect food additives resulting from packaging...

  3. 21 CFR 570.13 - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 6 2013-04-01 2013-04-01 false Indirect food additives resulting from packaging materials prior sanctioned for animal feed and pet food. 570.13 Section 570.13 Food and Drugs FOOD AND DRUG... FOOD ADDITIVES General Provisions § 570.13 Indirect food additives resulting from packaging...

  4. Speech Perception Results for Children Using Cochlear Implants Who Have Additional Special Needs

    ERIC Educational Resources Information Center

    Dettman, Shani J.; Fiket, Hayley; Dowell, Richard C.; Charlton, Margaret; Williams, Sarah S.; Tomov, Alexandra M.; Barker, Elizabeth J.

    2004-01-01

    Speech perception outcomes in young children with cochlear implants are affected by a number of variables including the age of implantation, duration of implantation, mode of communication, and the presence of a developmental delay or additional disability. The aim of this study is to examine the association between degree of developmental delay…

  5. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    PubMed

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

    Quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages manufactured either with added nitrate, nitrite and L-ascorbic acid (NS) or without (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with a higher intensity of hardness (P<0.05) and tended to be perceived with higher intensities of flavor (P<0.10), pepper (P<0.20), and oiliness (P<0.20), while being perceived as lower in chewiness (P<0.20). TDS showed that in all the sausages hardness was the first dominant attribute; then, in NNS products flavor remained dominant until the end of tasting, whereas in NS products oiliness prevailed. In conclusion, TDS showed that the perception of some textural parameters, such as oiliness, during mastication was more dominant in NS products, whereas with conventional QDA this attribute appeared higher in sausages manufactured without preservatives. Therefore, TDS provided additional information for the description and differentiation of Lucanian sausages. PMID:27486959

  6. Spinel dissolution via addition of glass forming chemicals. Results of preliminary experiments

    SciTech Connect

    Fox, K. M.; Johnson, F. C.

    2015-11-01

    Increased loading of high level waste in glass can lead to crystallization within the glass. Some crystalline species, such as spinel, have no practical impact on the chemical durability of the glass, and therefore may be acceptable from both a processing and a product performance standpoint. In order to operate a melter with a controlled amount of crystallization, options must be developed for remediating an unacceptable accumulation of crystals. This report describes preliminary experiments designed to evaluate the ability to dissolve spinel crystals in simulated waste glass melts via the addition of glass forming chemicals (GFCs).

  7. Comparison of the multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture.

    PubMed

    Converse, Reagan R; Wymer, Larry J; Dufour, Alfred P; Wade, Timothy J

    2012-10-01

    Few studies have addressed the efficacy of composite sampling for measuring indicator bacteria by quantitative PCR (qPCR). We compared results from composited samples with multiple-sample means for culture- and qPCR-based water quality monitoring. Results from composited samples for both methods were similarly correlated to multiple-sample means and predicted criteria exceedances equally.

  8. The effect of hydraulic loading on bioclogging in porous media: Quantitative results from tomographic imaging

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Davit, Y.; Connolly, J. M.; Gerlach, R.; Wood, B. D.; Wildenschild, D.

    2013-12-01

    Biofilm growth in porous media is generally surface-attached and pore-filling. A direct result of biofilm formation is the clogging of the pore space available for fluid transport; this effect has come to be termed bioclogging. In physical experiments, bioclogging manifests as an increase in differential pressure across experimental specimens, and traditional investigations of bioclogging in 3D porous media have included measurements of bulk differential pressure changes in order to evaluate changes in permeability or hydraulic conductivity. Due to the opaque nature of most types of porous media, visualization of bioclogging has been limited to the use of 2D or pseudo-3D micromodels. As a result, bioclogging models have relied on parameters derived from 2D visualization experiments. Results from these studies have shown that even small changes in pore morphology associated with biofilm growth can significantly alter fluid hydrodynamics. Recent advances in biofilm imaging facilitate the investigation of biofilm growth and bioclogging in porous media through the implementation of x-ray computed microtomography (CMT) and a functional contrast agent. We used barium sulfate as the contrast agent, a particle suspension that fills all pore space available to fluid flow. Utilization of x-ray CMT with a barium sulfate contrast agent facilitates the examination of biofilm growth at the micron scale throughout experimental porous media growth reactors. This method has been applied to investigate changes in macropore morphology associated with biofilm growth. Applied fluid flow rates correspond to initial Reynolds numbers ranging from 0.1 to 100. Results include a direct comparison of measured changes in porosity and hydraulic conductivity as calculated using differential pressure measurements vs. images. In addition, parameters such as biofilm thickness, reactive surface area, and attachment surface area will be presented in order to help characterize
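
    The two bulk quantities compared above can be illustrated with a short sketch: porosity as the void fraction of a segmented (binary) CMT volume, and hydraulic conductivity from Darcy's law applied to a constant-rate flow experiment. The array, flow rate, and column dimensions below are illustrative assumptions, not values from the study.

```python
import numpy as np

def image_porosity(binary_volume):
    """Porosity = pore voxels / total voxels in a segmented CMT image
    (True/1 = pore space, False/0 = solid, biofilm, or contrast-excluded)."""
    return np.asarray(binary_volume, dtype=bool).mean()

def hydraulic_conductivity(Q, L, A, dh):
    """Darcy's law: K = Q * L / (A * dh), with Q in m^3/s and lengths in m."""
    return Q * L / (A * dh)

# Illustrative segmented volume and column-reactor numbers
pore_space = np.random.default_rng(1).random((100, 100, 100)) > 0.65
print("porosity:", round(image_porosity(pore_space), 3))   # ~0.35 for this example

Q = 1.0e-8    # imposed flow rate, m^3/s
L = 0.05      # column length, m
A = 1.0e-4    # cross-sectional area, m^2
dh = 0.02     # head loss across the column, m (from the differential pressure)
print("K (m/s):", hydraulic_conductivity(Q, L, A, dh))
```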

  9. QUANTITATIVE EVALUATION OF ASR DETERIORATION LEVEL BASED ON SURVEY RESULT OF EXISTING STRUCTURE

    NASA Astrophysics Data System (ADS)

    Kawashima, Yasushi; Kosa, Kenji; Matsumoto, Shigeru; Miura, Masatsugu

    The relationship between crack density and the compressive strength of core cylinders drilled from an actual structure damaged by ASR was investigated. The results showed that even when the crack density increased by about 1.0 m/m2, the compressive strength decreased by only 2 N/mm2. A new method is then proposed for estimating future compressive strength from the currently accumulated crack density. In addition, the decline in compressive strength was initially proportional to the ASR expansion, and the reason why this curve becomes gentler afterwards was examined. To do so, cylindrical test specimens were cut in the longitudinal direction, and the ASR cracks that arose during the loading test were observed in detail on the cut plane. The results showed that the proportion of the rupture line overlapping the ASR cracks was low, and that the load is resisted by interlocking between the coarse aggregate and the concrete across the crack plane.
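
    As a purely illustrative sketch of the kind of relation quoted above (a loss of roughly 2 N/mm2 per 1.0 m/m2 of accumulated crack density), the following linear estimate shows how a future strength could be projected from crack density; the baseline strength and the function itself are assumptions, not the estimation method actually proposed in the paper.

```python
def estimated_strength(f0, crack_density, loss_per_unit=2.0):
    """Illustrative linear estimate of compressive strength (N/mm^2)
    from accumulated ASR crack density (m/m^2).

    f0            : strength of the undamaged concrete, N/mm^2 (assumed)
    crack_density : accumulated crack density, m/m^2
    loss_per_unit : strength loss per 1.0 m/m^2 of crack density
                    (about 2 N/mm^2 according to the cores described above)
    """
    return f0 - loss_per_unit * crack_density

# Example: a 30 N/mm^2 concrete with 1.5 m/m^2 of accumulated cracking
print(estimated_strength(30.0, 1.5))   # -> 27.0 N/mm^2
```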

  10. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  11. A Quantitative Study of the Resultant Differences between Additive Practices and Reductive Practices in Data Requirements Gathering

    ERIC Educational Resources Information Center

    Johnson, Gerald

    2016-01-01

    With the increase in technology in all facets of our lives and work, there is an ever increasing set of expectations that people have regarding information availability, response time, and dependability. While expectations are affected by gender, age, experience, industry, and other factors, people have expectations of technology, and from…

  12. Additional results on 'Reducing geometric dilution of precision using ridge regression'

    NASA Astrophysics Data System (ADS)

    Kelly, Robert J.

    1990-07-01

    Kelly (1990) presented preliminary results on the feasibility of using ridge regression (RR) to reduce the effects of geometric dilution of precision (GDOP) error inflation in position-fix navigation systems. Recent results indicate that RR will not reduce GDOP bias inflation when biaslike measurement errors last much longer than the aircraft guidance-loop response time. This conclusion precludes the use of RR on navigation systems whose dominant error sources are biaslike; e.g., the GPS selective-availability error source. The simulation results given by Kelly are, however, valid for the conditions defined. Although RR has not yielded a satisfactory solution to the general GDOP problem, it has illuminated the role that multicollinearity plays in navigation signal processors such as the Kalman filter. Bias inflation, initial position guess errors, ridge-parameter selection methodology, and the recursive ridge filter are discussed.
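
    For reference, the ridge-regression fix evaluated here replaces the ordinary least-squares solution of the linearized position-fix equations, dx = (A^T A)^-1 A^T b, with the biased estimate dx = (A^T A + kI)^-1 A^T b. A minimal numpy sketch with a hypothetical, nearly collinear geometry matrix (i.e., high GDOP) is shown below; the geometry and residual values are illustrative only.

```python
import numpy as np

def ridge_fix(A, b, k=0.0):
    """Ridge estimate dx = (A^T A + k I)^-1 A^T b of the linearized
    position-fix equations A dx ~= b; k = 0 gives ordinary least squares."""
    AtA = A.T @ A
    return np.linalg.solve(AtA + k * np.eye(AtA.shape[0]), A.T @ b)

# Hypothetical, nearly collinear line-of-sight geometry (poor GDOP)
A = np.array([[1.00, 0.95],
              [1.00, 1.00],
              [1.00, 1.05]])
b = np.array([0.10, 0.12, 0.09])   # measurement residuals

print("GDOP:", round(float(np.sqrt(np.trace(np.linalg.inv(A.T @ A)))), 1))
for k in (0.0, 0.01, 0.1):
    print("k =", k, "->", ridge_fix(A, b, k))   # larger k shrinks the noisy estimate
```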

  13. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
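
    The performance figures reported above all follow from a 2x2 table of joint-level results. A minimal sketch of that standard calculation is given below; the counts are hypothetical (chosen only to sum to 460 joints), not the study's actual table.

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Standard 2x2-table metrics for a binary diagnostic test."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv":         tp / (tp + fp),
        "npv":         tn / (tn + fn),
        "accuracy":    (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical joint-level counts for 460 TMJs (painful vs. non-painful)
metrics = diagnostic_performance(tp=150, fp=90, fn=90, tn=130)
print({k: round(v, 3) for k, v in metrics.items()})
```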

  14. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  15. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.
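
    The quantity estimated in the two records above is a lower confidence limit on the probability of failure-free operation derived from ground-test statistics. One standard way to obtain such a limit for pass/fail test data is the one-sided Clopper-Pearson (binomial) bound, sketched below; the test counts are hypothetical, and the TOPAZ-2 assessment used its own algorithm and censoring rules, which are not reproduced here.

```python
from scipy.stats import beta

def reliability_lower_bound(n_tests, n_failures, confidence=0.9):
    """One-sided Clopper-Pearson lower confidence limit on the probability
    of failure-free operation, from n_tests trials with n_failures failures."""
    successes = n_tests - n_failures
    if successes == 0:
        return 0.0
    # Lower limit of the binomial success probability at the given confidence
    return beta.ppf(1.0 - confidence, successes, n_failures + 1)

# Example: 30 ground tests with 1 failure, at 90% confidence
print(round(reliability_lower_bound(30, 1, 0.9), 3))
# The zero-failure case reduces to the familiar (1 - C)**(1/n) formula
print(round(reliability_lower_bound(30, 0, 0.9), 3), round(0.1 ** (1 / 30), 3))
```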

  16. XLF deficiency results in reduced N-nucleotide addition during V(D)J recombination

    PubMed Central

    IJspeert, Hanna; Rozmus, Jacob; Schwarz, Klaus; Warren, René L.; van Zessen, David; Holt, Robert A.; Pico-Knijnenburg, Ingrid; Simons, Erik; Jerchel, Isabel; Wawer, Angela; Lorenz, Myriam; Patıroğlu, Turkan; Akar, Himmet Haluk; Leite, Ricardo; Verkaik, Nicole S.; Stubbs, Andrew P.; van Gent, Dik C.; van Dongen, Jacques J. M.

    2016-01-01

    Repair of DNA double-strand breaks (DSBs) by the nonhomologous end-joining pathway (NHEJ) is important not only for repair of spontaneous breaks but also for breaks induced in developing lymphocytes during V(D)J (variable [V], diversity [D], and joining [J] genes) recombination of their antigen receptor loci to create a diverse repertoire. Mutations in the NHEJ factor XLF result in extreme sensitivity for ionizing radiation, microcephaly, and growth retardation comparable to mutations in LIG4 and XRCC4, which together form the NHEJ ligation complex. However, the effect on the immune system is variable (mild to severe immunodeficiency) and less prominent than that seen in deficiencies of NHEJ factors ARTEMIS and DNA-dependent protein kinase catalytic subunit, with defects in the hairpin opening step, which is crucial and unique for V(D)J recombination. Therefore, we aimed to study the role of XLF during V(D)J recombination. We obtained clinical data from 9 XLF-deficient patients and performed immune phenotyping and antigen receptor repertoire analysis of immunoglobulin (Ig) and T-cell receptor (TR) rearrangements, using next-generation sequencing in 6 patients. The results were compared with XRCC4 and LIG4 deficiency. Both Ig and TR rearrangements showed a significant decrease in the number of nontemplated (N) nucleotides inserted by terminal deoxynucleotidyl transferase, which resulted in a decrease of 2 to 3 amino acids in the CDR3. Such a reduction in the number of N-nucleotides has a great effect on the junctional diversity, and thereby on the total diversity of the Ig and TR repertoire. This shows that XLF has an important role during V(D)J recombination in creating diversity of the repertoire by stimulating N-nucleotide insertion. PMID:27281794

  17. XLF deficiency results in reduced N-nucleotide addition during V(D)J recombination.

    PubMed

    IJspeert, Hanna; Rozmus, Jacob; Schwarz, Klaus; Warren, René L; van Zessen, David; Holt, Robert A; Pico-Knijnenburg, Ingrid; Simons, Erik; Jerchel, Isabel; Wawer, Angela; Lorenz, Myriam; Patıroğlu, Turkan; Akar, Himmet Haluk; Leite, Ricardo; Verkaik, Nicole S; Stubbs, Andrew P; van Gent, Dik C; van Dongen, Jacques J M; van der Burg, Mirjam

    2016-08-01

    Repair of DNA double-strand breaks (DSBs) by the nonhomologous end-joining pathway (NHEJ) is important not only for repair of spontaneous breaks but also for breaks induced in developing lymphocytes during V(D)J (variable [V], diversity [D], and joining [J] genes) recombination of their antigen receptor loci to create a diverse repertoire. Mutations in the NHEJ factor XLF result in extreme sensitivity for ionizing radiation, microcephaly, and growth retardation comparable to mutations in LIG4 and XRCC4, which together form the NHEJ ligation complex. However, the effect on the immune system is variable (mild to severe immunodeficiency) and less prominent than that seen in deficiencies of NHEJ factors ARTEMIS and DNA-dependent protein kinase catalytic subunit, with defects in the hairpin opening step, which is crucial and unique for V(D)J recombination. Therefore, we aimed to study the role of XLF during V(D)J recombination. We obtained clinical data from 9 XLF-deficient patients and performed immune phenotyping and antigen receptor repertoire analysis of immunoglobulin (Ig) and T-cell receptor (TR) rearrangements, using next-generation sequencing in 6 patients. The results were compared with XRCC4 and LIG4 deficiency. Both Ig and TR rearrangements showed a significant decrease in the number of nontemplated (N) nucleotides inserted by terminal deoxynucleotidyl transferase, which resulted in a decrease of 2 to 3 amino acids in the CDR3. Such a reduction in the number of N-nucleotides has a great effect on the junctional diversity, and thereby on the total diversity of the Ig and TR repertoire. This shows that XLF has an important role during V(D)J recombination in creating diversity of the repertoire by stimulating N-nucleotide insertion.

  18. Aircraft-Produced Ice Particles (APIPs): Additional Results and Further Insights.

    NASA Astrophysics Data System (ADS)

    Woodley, William L.; Gordon, Glenn; Henderson, Thomas J.; Vonnegut, Bernard; Rosenfeld, Daniel; Detwiler, Andrew

    2003-05-01

    This paper presents new results from studies of aircraft-produced ice particles (APIPs) in supercooled fog and clouds. Nine aircraft, including a Beech King Air 200T cloud physics aircraft, a Piper Aztec, a Cessna 421-C, two North American T-28s, an Aero Commander, a Piper Navajo, a Beech Turbo Baron, and a second four-bladed King Air were involved in the tests. The instrumented King Air served as the monitoring aircraft for trails of ice particles created, or not created, when the other aircraft were flown through clouds at various temperatures and served as both the test and monitoring aircraft when it itself was tested. In some cases sulfur hexafluoride (SF6) gas was released by the test aircraft during its test run and was detected by the King Air during its monitoring passes to confirm the location of the test aircraft wake. Ambient temperatures for the tests ranged between -5° and -12°C. The results confirm earlier published results and provide further insights into the APIPs phenomenon. The King Air at ambient temperatures less than -8°C can produce APIPs readily. The Piper Aztec and the Aero Commander also produced APIPs under the test conditions in which they were flown. The Cessna 421, Piper Navajo, and Beech Turbo Baron did not. The APIPs production potential of a T-28 is still indeterminate because a limited range of conditions was tested. Homogeneous nucleation in the adiabatically cooled regions where air is expanding around the rapidly rotating propeller tips is the cause of APIPs. An equation involving the propeller efficiency, engine thrust, and true airspeed of the aircraft is used along with the published thrust characteristics of the propellers to predict when the aircraft will produce APIPs. In most cases the predictions agree well with the field tests. Of all of the aircraft tested, the Piper Aztec, despite its small size and low horsepower, was predicted to be the most prolific producer of APIPs, and this was confirmed in field tests. The
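
    The mechanism named above is adiabatic expansion around the propeller tips cooling air to the homogeneous nucleation threshold (roughly -40 °C). As a back-of-the-envelope illustration only, the sketch below uses the textbook dry isentropic relation to estimate the pressure ratio needed to cool ambient air at -8 °C to that threshold; the paper's actual prediction equation (propeller efficiency, engine thrust, true airspeed) is not reproduced here.

```python
GAMMA = 1.4  # ratio of specific heats for dry air

def temperature_after_expansion(T_ambient_K, pressure_ratio):
    """Dry isentropic expansion: T2 = T1 * (p2/p1)**((gamma-1)/gamma)."""
    return T_ambient_K * pressure_ratio ** ((GAMMA - 1.0) / GAMMA)

def pressure_ratio_for_cooling(T_ambient_C, T_target_C=-40.0):
    """Pressure ratio p2/p1 required to cool air adiabatically from the
    ambient temperature to the homogeneous nucleation threshold."""
    T1 = T_ambient_C + 273.15
    T2 = T_target_C + 273.15
    return (T2 / T1) ** (GAMMA / (GAMMA - 1.0))

ratio = pressure_ratio_for_cooling(-8.0)
print(round(ratio, 2))                                                # ~0.64 of ambient pressure
print(round(temperature_after_expansion(273.15 - 8.0, ratio) - 273.15, 1))  # back to ~ -40.0 C
```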

  19. Continuously growing rodent molars result from a predictable quantitative evolutionary change over 50 million years

    PubMed Central

    Mushegyan, Vagan; Eronen, Jussi T.; Lawing, A. Michelle; Sharir, Amnon; Janis, Christine; Jernvall, Jukka; Klein, Ophir D.

    2015-01-01

    The fossil record is widely informative about evolution, but fossils are not systematically used to study the evolution of stem cell-driven renewal. Here, we examined evolution of the continuous growth (hypselodonty) of rodent molar teeth, which is fuelled by the presence of dental stem cells. We studied occurrences of 3500 North American rodent fossils, ranging from 50 million years ago (mya) to 2 mya. We examined changes in molar height to determine if evolution of hypselodonty shows distinct patterns in the fossil record, and we found that hypselodont taxa emerged through intermediate forms of increasing crown height. Next, we designed a Markov simulation model, which replicated molar height increases throughout the Cenozoic, and, moreover, evolution of hypselodonty. Thus, by extension, the retention of the adult stem-cell niche appears to be a predictable quantitative rather than a stochastic qualitative process. Our analyses predict that hypselodonty will eventually become the dominant phenotype. PMID:25921530
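
    To illustrate the kind of model meant by a "Markov simulation model" of crown-height evolution, here is a toy discrete-state Markov chain over crown-height classes. The transition probabilities, state definitions, and lineage counts are entirely made up for illustration; the paper's fitted transition rates are not reproduced.

```python
import numpy as np

# Crown-height classes, from low-crowned to ever-growing (hypselodont)
states = ["brachydont", "mesodont", "hypsodont", "hypselodont"]

# Hypothetical per-time-step transition probabilities (rows sum to 1);
# height increases mostly through intermediate forms, as in the fossil data
P = np.array([
    [0.97, 0.03, 0.00, 0.00],
    [0.01, 0.96, 0.03, 0.00],
    [0.00, 0.01, 0.96, 0.03],
    [0.00, 0.00, 0.00, 1.00],   # hypselodonty treated as absorbing here
])

def simulate(steps=300, n_lineages=200, seed=0):
    """Distribution of lineages across crown-height classes after `steps` steps."""
    rng = np.random.default_rng(seed)
    idx = np.zeros(n_lineages, dtype=int)          # all lineages start brachydont
    for _ in range(steps):
        idx = np.array([rng.choice(len(states), p=P[i]) for i in idx])
    return {s: int((idx == k).sum()) for k, s in enumerate(states)}

print(simulate())   # the absorbing hypselodont class accumulates lineages over time
```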

  20. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although they may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted.

  1. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although they may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  2. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although they may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
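
    The quantification choices compared above revolve around the ratio of tracer uptake in target regions to a reference region (the standardized uptake value ratio, SUVR). A minimal sketch of that ratio for a few hypothetical regional PiB means is given below, with the three candidate reference regions named in the abstract; partial-volume correction and kinetic modeling are not reproduced here.

```python
def suvr(regional_uptake, reference_region, target_regions):
    """Standardized uptake value ratio: target-region uptake divided by the
    uptake in the chosen reference region."""
    ref = regional_uptake[reference_region]
    return {r: regional_uptake[r] / ref for r in target_regions}

# Hypothetical regional PiB means for one scan (arbitrary units)
uptake = {
    "precuneus": 2.4, "prefrontal": 2.1, "striatum": 1.9,
    "cerebellar_cortex": 1.2, "brainstem": 1.5, "white_matter": 1.8,
}
targets = ["precuneus", "prefrontal", "striatum"]

# The choice of reference region rescales every target SUVR
for ref in ("cerebellar_cortex", "brainstem", "white_matter"):
    print(ref, {k: round(v, 2) for k, v in suvr(uptake, ref, targets).items()})
```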

  3. MC2AQ: Preliminary Results With the Addition of a Bulk Model of Particulate Matter

    NASA Astrophysics Data System (ADS)

    Neary, L.; Kaminski, J.; Yang, R.; Michelangeli, D. V.; McConnell, J.

    2001-12-01

    MC2 (Mesoscale Compressible Community model) is a mesoscale model developed by collaborators at the University of Quebec at Montreal and the Meteorological Service of Canada. MC2AQ is an on-line air quality version of MC2 that was developed at York University. The AQ part of the model includes complex oxidant gas-phase chemistry, deposition, and anthropogenic and on-line biogenic emissions. MC2AQ has been used successfully to calculate ozone concentrations in Eastern Canada and the United States, as well as over Europe. The model can be run down to urban scales of a kilometer or less. The long-term goal of this project is to modify MC2AQ to include aerosol and aqueous chemistry, and the detailed microphysics of the formation and evolution of size-distributed particles, in an on-line fashion. As a first step, the model has recently been updated to include a new Canadian emissions inventory that includes bulk primary sources of PM2.5 and PM10. Secondary sulphate and nitrate chemical production mechanisms have also been included. In this first phase of the work, bulk aerosols were included in MC2AQ along with dry deposition and rainout of aerosols. Results of this first phase, showing ozone and PM concentrations and 24-hour accumulated depositions of total PM, will be presented and compared with field observations in Southern Ontario.

  4. Flue gas conditioning for improved particle collection in electrostatic precipitators. First topical report, Results of laboratory screening of additives

    SciTech Connect

    Durham, M.D.

    1993-04-16

    Several tasks have been completed in a program to evaluate additives to improve fine particle collection in electrostatic precipitators. Screening tests and laboratory evaluations of additives are summarized in this report. Over 20 additives were evaluated; four were found to improve flyash precipitation rates. The Insitec particle analyzer was also evaluated; test results show that the analyzer will provide accurate sizing and counting information for particles in the size range of ≤10 μm in diameter.

  5. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  6. Detection of multivessel disease in patients with sustained myocardial infarction by thallium 201 myocardial scintigraphy: No additional value of quantitative analysis

    SciTech Connect

    Niemeyer, M.G.; Pauwels, E.K.; van der Wall, E.E.; Cramer, M.J.; Verzijlbergen, J.F.; Zwinderman, A.H.; Ascoop, C.A. )

    1989-01-01

    This study was performed to determine the value of visual and quantitative thallium 201 scintigraphy for the detection of multivessel disease in 67 patients with a sustained transmural myocardial infarction. The viability of the myocardial regions corresponding to pathologic Q-waves was also evaluated. Of the 67 patients, 51 (76%) had multivessel coronary artery disease. The sensitivity of the exercise test was 53%; that of thallium scintigraphy was 69% when interpreted visually and 67% when analysed quantitatively. The specificity of these methods was 69%, 56%, and 50%, respectively. Sixty-two infarct-related flow regions were detected by visual analysis of the thallium scans; total redistribution was observed in 11/62 (18%) of these regions, partial redistribution in 26/62 (42%), and no redistribution in 25/62 (40%). The infarct-related areas with total redistribution on the thallium scintigrams were more likely to be associated with normal or hypokinetic wall motion (7/11: 64%) than the areas with a persistent defect (7/25: 28%) (P = 0.05), which were more often associated with akinetic or dyskinetic wall motion. Based on our results, it is concluded that (1) both visual and quantitative analysis of thallium exercise scintigraphy have limited value for predicting the presence or absence of multivessel coronary artery disease in patients with sustained myocardial infarction, and (2) exercise-induced thallium redistribution may occur within the infarct zone, suggesting the presence of viable but jeopardized myocardium in presumed fibrotic myocardial areas.
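
    The sensitivity and specificity figures quoted above follow from a 2x2 table of test outcome against angiographically confirmed multivessel disease. A minimal sketch of the arithmetic is shown below; the counts are illustrative values chosen to be roughly consistent with the reported 69%/56% for visual analysis, not figures taken from the study.

      def sensitivity_specificity(tp, fn, tn, fp):
          """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
          return tp / (tp + fn), tn / (tn + fp)

      # Hypothetical counts for 67 patients (51 with multivessel disease, 16 without),
      # chosen only to illustrate the arithmetic, not taken from the study tables.
      sens, spec = sensitivity_specificity(tp=35, fn=16, tn=9, fp=7)
      print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")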

  7. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

    Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available at the CCMC real-time page (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and at the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focused on the geospace modules and using the BATS-R-US magnetohydrodynamic model and the Ridley Ionosphere Model, one with and one without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value, to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
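
    The probability of event detection and Heidke Skill Score cited above are standard contingency-table metrics. The following short sketch shows how they are computed from hits, misses, false alarms, and correct negatives; the daily counts used here are hypothetical, not the actual CCMC tallies.

      def pod_hss(hits, misses, false_alarms, correct_negatives):
          """Probability of detection and Heidke Skill Score for a 2x2
          event/no-event contingency table (e.g. days with min Dst < -50 nT)."""
          a, b, c, d = hits, false_alarms, misses, correct_negatives
          pod = a / (a + c)
          hss = 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
          return pod, hss

      # Hypothetical daily counts over an examined epoch.
      pod, hss = pod_hss(hits=18, misses=3, false_alarms=20, correct_negatives=160)
      print(f"POD = {pod:.2f}, HSS = {hss:.2f}")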

  8. Design and Performance Considerations for the Quantitative Measurement of HEU Residues Resulting from 99Mo Production

    SciTech Connect

    McElroy, Robert Dennis; Chapman, Jeffrey Allen; Bogard, James S; Belian, Anthony P

    2011-01-01

    Molybdenum-99 is produced by the irradiation of high-enriched uranium (HEU) resulting in the accumulation of large quantities of HEU residues. In general, these residues are not recycled but are either disposed of or stored in containers with surface exposure rates as high as 100 R/h. The 235U content of these waste containers must be quantified for both accountability and waste disposal purposes. The challenges of quantifying such difficult-to-assay materials are discussed, along with performance estimates for each of several potential assay options. In particular, the design and performance of a High Activity Active Well Coincidence Counting (HA-AWCC) system designed and built specifically for these irradiated HEU waste materials are presented.

  9. Quantitative analysis of toxic and essential elements in human hair. Clinical validity of results.

    PubMed

    Kosanovic, Melita; Jokanovic, Milan

    2011-03-01

    Over the last three decades, there has been an increasing awareness of environmental and occupational exposures to toxic or potentially toxic trace elements. The evolution of biological monitoring includes knowledge of the kinetics of toxic and/or essential elements and the adverse health effects related to their exposure. The debate over whether hair is a valid sample for biomonitoring still attracts the attention of analysts, health care professionals, and environmentalists. Although researchers have found many correlations between essential elements and diseases, metabolic disorders, environmental exposures, and nutritional status, opponents of the concept of hair analysis object that hair samples are unreliable due to the influence of external factors. This review discusses the validity of hair as a sample for biomonitoring of essential and toxic elements, with emphasis on the pre-analytical, analytical, and post-analytical factors influencing results.

  10. Perspectives of Speech-Language Pathologists on the Use of Telepractice in Schools: Quantitative Survey Results

    PubMed Central

    Tucker, Janice K.

    2012-01-01

    This research surveyed 170 school-based speech-language pathologists (SLPs) in one northeastern state, with only 1.8% reporting telepractice use in school settings. These results were consistent with two ASHA surveys (2002; 2011) that reported limited use of telepractice for school-based speech-language pathology. In the present study, willingness to use telepractice was inversely related to age, perhaps because younger members of the profession are more accustomed to using technology. Overall, respondents were concerned about the validity of assessments administered via telepractice; whether clinicians can adequately establish rapport with clients via telepractice; and whether therapy conducted via telepractice can be as effective as in-person speech-language therapy. Most respondents indicated the need to establish procedures and guidelines for school-based telepractice programs. PMID:25945204

  11. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-12-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  12. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  13. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-11-26

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay.

  14. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 3 Full-scale Test Results

    SciTech Connect

    Gary Blythe

    2007-05-01

    The field-testing tasks include: Task 2 - Pilot Additive Testing in Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High-sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Plant Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. The pilot-scale tests were completed in 2005 and have been previously reported. This topical report presents the results from the Task 3 full-scale additive tests, conducted at IPL's Petersburg Station Unit 2. The Task 5 full-scale additive tests will be conducted later in calendar year 2007.

  15. Quantitative Results from Shockless Compression Experiments on Solids to Multi-Megabar Pressure

    NASA Astrophysics Data System (ADS)

    Davis, Jean-Paul; Brown, Justin; Knudson, Marcus; Lemke, Raymond

    2015-03-01

    Quasi-isentropic, shockless ramp-wave experiments promise accurate equation-of-state (EOS) data in the solid phase at relatively low temperatures and multi-megabar pressures. In this range of pressure, isothermal diamond-anvil techniques have limited pressure accuracy due to reliance on theoretical EOS of calibration standards, thus accurate quasi-isentropic compression data would help immensely in constraining EOS models. Multi-megabar shockless compression experiments using the Z Machine at Sandia as a magnetic drive with stripline targets continue to be performed on a number of solids. New developments will be presented in the design and analysis of these experiments, including topics such as 2-D and magneto-hydrodynamic (MHD) effects and the use of LiF windows. Results will be presented for tantalum and/or gold metals, with comparisons to independently developed EOS. * Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.

  16. Perceived future career prospects in general practice: quantitative results from questionnaire surveys of UK doctors

    PubMed Central

    Lambert, Trevor W; Smith, Fay; Goldacre, Michael J

    2016-01-01

    Background There are more studies of current job satisfaction among GPs than of their views about their future career prospects, although both are relevant to commitment to careers in general practice. Aim To report on the views of GPs compared with clinicians in other specialties about their future career prospects. Design and setting Questionnaire surveys were sent to UK medical doctors who graduated in selected years between 1974 and 2008. Method Questionnaires were sent to the doctors at different times after graduation, ranging from 3 to 24 years. Results Based on the latest survey of each graduation year of the 20 940 responders, 66.2% of GPs and 74.2% of hospital doctors were positive about their prospects and 9.7% and 8.3%, respectively, were negative. However, with increasing time since graduation and increasing levels of seniority, GPs became less positive about their prospects; by contrast, over time, surgeons became more positive. Three to 5 years after graduation, 86.3% of those training in general practice were positive about their prospects compared with 52.9% of surgical trainees: in surveys conducted 12–24 years after graduation, 60.2% of GPs and 76.6% of surgeons were positive about their prospects. Conclusion GPs held broadly positive views of their career prospects, as did other doctors. However, there was an increase in negativity with increasing time since graduation that was not seen in hospital doctors. Research into the causes of this negativity and policy measures to ameliorate it would contribute to the continued commitment of GPs and may help to reduce attrition. PMID:27578813

  17. 21 CFR 570.13 - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 570.13 (2010-04-01 edition) - Indirect food additives resulting from packaging materials prior sanctioned for animal feed and pet food. Section 570.13, Title 21 Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), ANIMAL DRUGS, FEEDS, AND RELATED...

  18. 21 CFR 570.14 - Indirect food additives resulting from packaging materials for animal feed and pet food.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    21 CFR 570.14 (2010-04-01 edition) - Indirect food additives resulting from packaging materials for animal feed and pet food. Section 570.14, Title 21 Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS FOOD...

  19. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  20. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation

  1. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation
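
    The agreement statistics named in this record (a weighted Cohen kappa for categorical density grades and Bland-Altman limits of agreement for continuous percent density) can be sketched as follows; the data are simulated, and the quadratic weighting is an assumption standing in for the quartile weighting described in the abstract.

      import numpy as np
      from sklearn.metrics import cohen_kappa_score

      rng = np.random.default_rng(1)

      # Hypothetical BI-RADS-like density categories (1-4) from two dose protocols.
      standard_dose = rng.integers(1, 5, size=200)
      low_dose = np.clip(standard_dose + rng.integers(-1, 2, size=200), 1, 4)
      kappa = cohen_kappa_score(standard_dose, low_dose, weights="quadratic")

      # Hypothetical continuous percent-density estimates from the two protocols.
      pd_standard = rng.uniform(5, 60, size=200)
      pd_low = pd_standard + rng.normal(0, 2, size=200)
      diff = pd_low - pd_standard
      loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))

      print(f"weighted kappa = {kappa:.2f}; "
            f"Bland-Altman limits of agreement = ({loa[0]:.2f}, {loa[1]:.2f})")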

  2. Sofosbuvir Inhibits Hepatitis E Virus Replication In Vitro and Results in an Additive Effect When Combined With Ribavirin.

    PubMed

    Dao Thi, Viet Loan; Debing, Yannick; Wu, Xianfang; Rice, Charles M; Neyts, Johan; Moradpour, Darius; Gouttenoire, Jérôme

    2016-01-01

    Infection with hepatitis E virus genotype 3 may result in chronic hepatitis in immunocompromised patients. Reduction of immunosuppression or treatment with ribavirin or pegylated interferon-α can result in viral clearance. However, safer and more effective treatment options are needed. Here, we show that sofosbuvir inhibits the replication of hepatitis E virus genotype 3 both in subgenomic replicon systems and in a full-length infectious clone. Moreover, the combination of sofosbuvir and ribavirin results in an additive antiviral effect. Sofosbuvir may be considered as an add-on therapy to ribavirin for the treatment of chronic hepatitis E in immunocompromised patients.

  3. High-performance liquid chromatography of governing liquid to detect illegal bovine milk's addition in water buffalo Mozzarella: comparison with results from raw milk and cheese matrix.

    PubMed

    Enne, Giuseppe; Elez, Danijela; Fondrini, Fabio; Bonizzi, Ivan; Feligini, Maria; Aleandri, Riccardo

    2005-11-11

    A method to detect fraudulent addition of bovine milk to water buffalo Mozzarella cheese by gradient reversed-phase high-performance liquid chromatography (RP-HPLC), relying on the measurement of quantity ratios within the beta-lactoglobulin protein family, is described. Analyses were performed on raw milk, cheese matrix and cheese governing liquid using a C4 column and UV detection. This work demonstrated that bovine milk addition during cheesemaking can be detected in the governing liquid of Mozzarella down to the EU legal limit of 1%, as well as in raw milk and cheese matrix. A significant lowering of peak areas and heights was observed in cheese matrix and governing liquid samples in comparison with the corresponding milk samples, possibly due to protein degradation during the cheesemaking process. The results show that, unlike in previously reported work, the use of a matrix-specific calibration curve is essential in order to achieve proper quantitation of the beta-lactoglobulin proteins, thus allowing a reliable estimate of bovine milk addition.
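
    A matrix-specific calibration curve of the kind the record calls essential can be sketched as a simple least-squares fit of bovine beta-lactoglobulin peak area against known percent bovine milk, applied here to hypothetical peak-area values rather than the published data.

      import numpy as np

      # Hypothetical calibration standards: governing-liquid samples spiked with
      # known percentages of bovine milk, and the measured bovine
      # beta-lactoglobulin peak area (arbitrary units) for each.
      bovine_percent = np.array([0.0, 0.5, 1.0, 2.0, 5.0])
      peak_area = np.array([0.02, 0.24, 0.47, 0.93, 2.31])

      # Fit a straight line (area = slope * percent + intercept) for this matrix only.
      slope, intercept = np.polyfit(bovine_percent, peak_area, 1)

      # Estimate the bovine milk content of an unknown governing-liquid sample.
      unknown_area = 0.51
      estimated_percent = (unknown_area - intercept) / slope
      print(f"estimated bovine milk addition: {estimated_percent:.2f}%")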

  4. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    The analysis of screening results (n = 3208; sexually active citizens aged 18 to 59 years) was carried out to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. High-oncogenic-type human papilloma virus was detected in 65% and 68.4% of females and in 48.6% and 53% of males, respectively. Among the 12 types of human papilloma virus, the most frequently diagnosed was human papilloma virus 16, independently of the gender of those examined and the technique of analysis. In females, the rate of human papilloma virus 16 was 18.3% (n = 280) with the qualitative tests and 14.9% (n = 126; p ≤ 0.05) with the quantitative tests. In males, the rate of human papilloma virus 16 was 8.3% (n = 60) with the qualitative tests and 12.2% (n = 14; p ≥ 0.05) with the quantitative tests. With the qualitative tests, the detection rates of the remaining oncogenic types of human papilloma virus varied from 3.4% to 8.4% in females and from 1.8% to 5.9% in males. With the quantitative tests in females, the rate of human papilloma virus with high viral load was 68.4%, with medium viral load 2.85% (n = 24), and with low viral load 0.24% (n = 2). With the quantitative tests in males, the detection rate of human papilloma virus types was 53%, and in all of these cases a high viral load was established. In females, most oncogenic types of human papilloma virus (except for types 31, 39, and 59) were detected significantly more often than in males.
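
    Comparisons of detection rates between the qualitative and quantitative assays, such as the 18.3% versus 14.9% figure above, are typically tested on a 2x2 table. A minimal sketch follows; the counts are chosen only to be consistent with the reported percentages and are not necessarily the study's actual denominators.

      import numpy as np
      from scipy.stats import chi2_contingency

      # Hypothetical 2x2 table: HPV-16 positive/negative under the qualitative
      # and quantitative assays (illustrative counts).
      table = np.array([[280, 1250],    # qualitative assay: positive, negative
                        [126,  717]])   # quantitative assay: positive, negative
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi-square = {chi2:.2f}, p = {p:.3f}")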

  5. Additional road markings as an indication of speed limits: results of a field experiment and a driving simulator study.

    PubMed

    Daniels, Stijn; Vanrie, Jan; Dreesen, An; Brijs, Tom

    2010-05-01

    Although speed limits are indicated by road signs, road users are not always aware, while driving, of the actual speed limit on a given road segment. The Roads and Traffic Agency developed additional road markings in order to support driver decisions on speed on 70 km/h roads in Flanders, Belgium. This paper presents the results of two evaluation studies, a field study and a driving simulator study, on the effects of the additional road markings on speed behaviour. The field study showed no substantial effect of the markings on speed behaviour; neither did the simulator study, which used slightly different stimuli. Nevertheless, an effect on lateral position was noticed in the simulator study, showing at least some effect of the markings. The role of the conspicuity of design elements and of expectations towards traffic environments is discussed. Both studies illustrate well some strengths and weaknesses of observational field studies compared with experimental simulator studies.

  6. Field Testing of a Wet FGD Additive for Enhanced Mercury Control - Task 5 Full-Scale Test Results

    SciTech Connect

    Gary Blythe; MariJon Owens

    2007-12-01

    and reporting. The other four tasks involve field testing on FGD systems, either at pilot or full scale. The four tasks include: Task 2 - Pilot Additive Testing in Texas Lignite Flue Gas; Task 3 - Full-scale FGD Additive Testing in High-sulfur Eastern Bituminous Flue Gas; Task 4 - Pilot Wet Scrubber Additive Tests at Plant Yates; and Task 5 - Full-scale Additive Tests at Plant Yates. The pilot-scale tests and the full-scale test using high-sulfur coal were completed in 2005 and 2006 and have been previously reported. This topical report presents the results from the Task 5 full-scale additive tests, conducted at Southern Company's Plant Yates Unit 1. Both additives were tested there.

  7. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop image processing that is widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies in rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.

  8. Quantitative assessment of trabecular bone micro-architecture of the wrist via 7 Tesla MRI: preliminary results

    PubMed Central

    Wang, Ligong; Liang, Guoyuan; Babb, James S.; Wiggins, Graham C.; Saha, Punam K.; Regatte, Ravinder R.

    2013-01-01

    Object The goal of this study was to determine the feasibility of performing quantitative 7T magnetic resonance imaging (MRI) assessment of trabecular bone micro-architecture of the wrist, a common fracture site. Materials and methods The wrists of 4 healthy subjects (1 woman, 3 men, 28±8.9 years) were scanned on a 7T whole body MR scanner using a 3D fast low-angle shot (FLASH) sequence (TR/TE = 20/4.5ms, 0.169 × 0.169 × 0.5mm). Trabecular bone was segmented and divided into 4 or 8 angular subregions. Total bone volume (TBV), bone volume fraction (BVF), surface-curve ratio (SC), and erosion index (EI) were computed. Subjects were scanned twice to assess measurement reproducibility. Results Group mean subregional values for TBV, BVF, SC, and EI (8 subregion analysis) were as follows: 8489 ± 3686, 0.27 ± 0.045, 9.61 ± 6.52; and 1.43 ± 1.25. Within each individual, there was subregional variation in TBV, SC, and EI (>5%), but not BVF (<5%). Intersubject variation (≥12%) existed for all parameters. Within-subject coefficients of variation were ≤10%. Conclusion This is the first study to perform quantitative 7T MRI assessment of trabecular bone micro-architecture of the wrist. This method could be utilized to study perturbations in bone structure in subjects with osteoporosis or other bone disorders. PMID:21544680

  9. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contra-lateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10⁶·⁵ to 10⁶·⁷ LD₅₀ g⁻¹) and five gave 'high titres' (10⁸·¹ to ≥ 10⁸·⁷ LD₅₀ g⁻¹) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower PrP(d)/PrP(res) results by IHC/BCTs. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).

  10. Screening for antibodies against Aleutian disease virus (ADV) in mink. Elucidation of dubious results by additive counterimmunoelectrophoresis.

    PubMed

    Uttenthal, A

    1992-01-01

    In order to distinguish true positive results in counterimmunoelectrophoresis from false positive ones, an additive counterimmunoelectrophoresis was developed. The method was tested on selected mink serum samples as part of routine testing of 3 million blood samples for antibodies towards Aleutian disease virus. The procedure is that a known positive serum sample is mixed with the patient serum to be tested. The result for a false positive sample will be one precipitin line towards virus and one nonspecific line. If the serum sample is a true positive one, the antibodies originating from the patient serum will be added to the antibodies in the standard positive serum, giving only one precipitin line. The system is further extended by testing the serum samples against an antigen preparation containing all the cellular components but free from virus. PMID:1335756

  11. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  12. Lanthanum Tricyanide-Catalyzed Acyl Silane-Ketone Benzoin Additions and Kinetic Resolution of Resultant α-Silyloxyketones

    PubMed Central

    Tarr, James C.

    2010-01-01

    We report the full account of our efforts on the lanthanum tricyanide-catalyzed acyl silane-ketone benzoin reaction. The reaction exhibits a wide scope in both acyl silane (aryl, alkyl) and ketone (aryl-alkyl, alkyl-alkyl, aryl-aryl, alkenyl-alkyl, alkynyl-alkyl) coupling partners. The diastereoselectivity of the reaction has been examined in both cyclic and acyclic systems. Cyclohexanones give products arising from equatorial attack by the acyl silane. The diastereoselectivity of acyl silane addition to acyclic α-hydroxy ketones can be controlled by varying the protecting group to obtain either Felkin-Anh or chelation control. The resultant α-silyloxyketone products can be resolved with selectivity factors from 10 to 15 by subjecting racemic ketone benzoin products to CBS reduction. PMID:20392127

  13. Divergent targets of glycolysis and oxidative phosphorylation result in additive effects of metformin and starvation in colon and breast cancer.

    PubMed

    Marini, Cecilia; Bianchi, Giovanna; Buschiazzo, Ambra; Ravera, Silvia; Martella, Roberto; Bottoni, Gianluca; Petretto, Andrea; Emionite, Laura; Monteverde, Elena; Capitanio, Selene; Inglese, Elvira; Fabbi, Marina; Bongioanni, Francesca; Garaboldi, Lucia; Bruzzi, Paolo; Orengo, Anna Maria; Raffaghello, Lizzia; Sambuceti, Gianmario

    2016-01-01

    Emerging evidence demonstrates that targeting energy metabolism is a promising strategy to fight cancer. Here we show that combining metformin and short-term starvation markedly impairs metabolism and growth of colon and breast cancer. The impairment in glycolytic flux caused by starvation is enhanced by metformin through its interference with hexokinase II activity, as documented by measurement of 18F-fluorodeoxyglucose uptake. Oxidative phosphorylation is additively compromised by combined treatment: metformin virtually abolishes Complex I function; starvation determines an uncoupled status of OXPHOS and amplifies the activity of respiratory Complexes II and IV, thus combining massive ATP depletion with a significant increase in reactive oxygen species. More importantly, the combined treatment profoundly impairs cancer glucose metabolism and virtually abolishes lesion growth in experimental models of breast and colon carcinoma. Our results strongly suggest that energy metabolism is a promising target to reduce cancer progression. PMID:26794854

  14. Automatic segmentation of cell nuclei in Feulgen-stained histological sections of prostate cancer and quantitative evaluation of segmentation results.

    PubMed

    Nielsen, Birgitte; Albregtsen, Fritz; Danielsen, Håvard E

    2012-07-01

    Digital image analysis of cell nuclei is useful to obtain quantitative information for the diagnosis and prognosis of cancer. However, the lack of a reliable automatic nuclear segmentation is a limiting factor for high-throughput nuclear image analysis. We have developed a method for automatic segmentation of nuclei in Feulgen-stained histological sections of prostate cancer. A local adaptive thresholding with an object perimeter gradient verification step detected the nuclei and was combined with an active contour model that featured an optimized initialization and worked within a restricted region to improve convergence of the segmentation of each nucleus. The method was tested on 30 randomly selected image frames from three cases, comparing the results from the automatic algorithm to a manual delineation of 924 nuclei. The automatic method segmented a few more nuclei compared to the manual method, and about 73% of the manually segmented nuclei were also segmented by the automatic method. For each nucleus segmented both manually and automatically, the accuracy (i.e., agreement with manual delineation) was estimated. The mean segmentation sensitivity/specificity were 95%/96%. The results from the automatic method were not significantly different from the ground truth provided by manual segmentation. This opens the possibility for large-scale nuclear analysis based on automatic segmentation of nuclei in Feulgen-stained histological sections.
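
    Per-pixel sensitivity and specificity against a manual delineation, as reported above, can be computed by overlaying the automatic and manual binary masks. A minimal sketch with toy masks (not the Feulgen-stained data) is shown below.

      import numpy as np

      def mask_agreement(auto_mask, manual_mask):
          """Pixel-wise sensitivity and specificity of an automatic segmentation
          against a manual ground-truth mask (both boolean arrays)."""
          tp = np.logical_and(auto_mask, manual_mask).sum()
          fn = np.logical_and(~auto_mask, manual_mask).sum()
          tn = np.logical_and(~auto_mask, ~manual_mask).sum()
          fp = np.logical_and(auto_mask, ~manual_mask).sum()
          return tp / (tp + fn), tn / (tn + fp)

      # Toy example: a manually outlined nucleus and a slightly shifted automatic result.
      manual = np.zeros((100, 100), dtype=bool); manual[30:60, 30:60] = True
      auto = np.zeros((100, 100), dtype=bool); auto[31:61, 31:61] = True
      sens, spec = mask_agreement(auto, manual)
      print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")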

  15. Model assessment of additional contamination of water bodies as a result of wildfires in the Chernobyl exclusion zone.

    PubMed

    Bondar, Yu I; Navumau, A D; Nikitin, A N; Brown, J; Dowdall, M

    2014-12-01

    Forest fires and wildfires are recognized as a possible cause of resuspension and redistribution of radioactive substances when they occur on lands contaminated with such materials, and as such they are a matter of concern within the regions of Belarus and Ukraine that were contaminated by the Chernobyl accident in 1986. Modelling the effects of such fires on radioactive contaminants is a complex matter, given the number of variables involved. In this paper, a probabilistic model was developed using empirical data drawn from the Polessie State Radiation-Ecological Reserve (PSRER), Belarus, and the Maximum Entropy Method. Using the model, it was possible to derive estimates of the contribution of fire events to the overall variability in the levels of (137)Cs and (239,240)Pu in ground-level air, as well as estimates of the deposition of these radionuclides to specific water bodies within the contaminated areas of Belarus. Results indicate that fire events are potentially significant redistributors of radioactive contaminants within the study area and may result in additional contamination being introduced to water bodies.

  16. Human resource challenges facing Zambia's mental health care system and possible solutions: results from a combined quantitative and qualitative study.

    PubMed

    Sikwese, Alice; Mwape, Lonia; Mwanza, Jason; Kapungwe, Augustus; Kakuma, Ritsuko; Imasiku, Mwiya; Lund, Crick; Cooper, Sara; The Mhapp Research Programme Consortium

    2010-01-01

    Human resources for mental health care in low- and middle-income countries are inadequate to meet the growing public health burden of neuropsychiatric disorders. Information on actual numbers is scarce, however. The aim of this study was to analyse the key human resource constraints and challenges facing Zambia's mental health care system, and the possible solutions. This study used both qualitative and quantitative methodologies. The WHO-AIMS Version 2.2 was utilized to ascertain actual figures on human resource availability. Semi-structured interviews and focus group discussions were conducted to assess key stakeholders' perceptions regarding the human resource constraints and challenges. The results revealed an extreme scarcity of human resources dedicated to mental health in Zambia. Respondents highlighted many human resource constraints, including shortages, lack of post-graduate and in-service training, and staff mismanagement. A number of reasons for and consequences of these problems were highlighted. Dedicating more resources to mental health, increasing the output of qualified mental health care professionals, stepping up in-service training, and increasing political will from government were amongst the key solutions highlighted by the respondents. There is an urgent need to scale up human and financial resources for mental health in Zambia. PMID:21226643

  17. Three-dimensional quantitative analysis of adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes

    PubMed Central

    2014-01-01

    Aims To present a new method for direct, quantitative analysis of the enamel surface and to measure adhesive remnants and enamel loss resulting from debonding orthodontic molar tubes. Material and methods Buccal surfaces of fifteen extracted human molars were directly scanned with an optical blue-light 3D scanner to the nearest 2 μm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. The 3D scanning was then repeated. Superimposition and comparison were performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights as well as enamel loss depths were obtained for the entire buccal surfaces. Residual adhesive volume and enamel loss volume were calculated for every tooth. Results The maximum height of adhesive remaining on the enamel surface was 0.76 mm, and the volume on particular teeth ranged from 0.047 mm3 to 4.16 mm3. The median adhesive remnant volume was 0.988 mm3. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm. The highest maximum depth of enamel loss was 0.207 mm. The median volume of enamel loss was 0.104 mm3 and the maximum volume was 1.484 mm3. Conclusions Blue-light 3D scanning is able to provide direct, precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnant on the enamel; however, the interface fracture pattern varies for particular teeth, and areas of enamel loss are present as well. PMID:25208969
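
    Once the pre-bonding and post-debonding scans are superimposed, adhesive remnant and enamel loss volumes follow from the signed surface difference: positive deviations integrate to remnant volume and negative deviations to loss volume. A minimal sketch is given below under the simplifying assumption that the registered surfaces are resampled onto a common grid; all dimensions are hypothetical.

      import numpy as np

      def remnant_and_loss_volumes(before, after, pixel_area_mm2):
          """Given two registered height maps (mm) sampled on the same grid,
          integrate positive differences (adhesive remnants) and negative
          differences (enamel loss) into volumes (mm^3)."""
          diff = after - before
          remnant_volume = diff[diff > 0].sum() * pixel_area_mm2
          loss_volume = -diff[diff < 0].sum() * pixel_area_mm2
          return remnant_volume, loss_volume

      # Toy example on a 0.01 mm grid: a small patch of adhesive left behind and a
      # shallow region of enamel loss.
      before = np.zeros((500, 500))
      after = before.copy()
      after[100:150, 100:150] += 0.30   # adhesive remnant, 0.30 mm high
      after[300:340, 300:340] -= 0.02   # enamel loss, 0.02 mm deep
      print(remnant_and_loss_volumes(before, after, pixel_area_mm2=0.01 ** 2))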

  18. Accuracy and Precision in the Southern Hemisphere Additional Ozonesondes (SHADOZ) Dataset in Light of the JOSIE-2000 Results

    NASA Technical Reports Server (NTRS)

    Witte, Jacquelyn C.; Thompson, Anne M.; Schmidlin, F. J.; Oltmans, S. J.; Smit, H. G. J.

    2004-01-01

    Since 1998 the Southern Hemisphere ADditional OZonesondes (SHADOZ) project has provided over 2000 ozone profiles over eleven southern hemisphere tropical and subtropical stations. Balloon-borne electrochemical concentration cell (ECC) ozonesondes are used to measure ozone. The data are archived at <http://croc.gsfc.nasa.gov/shadoz>. In an analysis of ozonesonde imprecision within the SHADOZ dataset [Thompson et al., JGR, 108, 8238, 2003], we pointed out that variations in ozonesonde technique (sensor solution strength, instrument manufacturer, data processing) could lead to station-to-station biases within the SHADOZ dataset. Imprecisions and accuracy in the SHADOZ dataset are examined here in light of new data. First, SHADOZ total ozone column amounts are compared to version 8 TOMS (2004 release). As for TOMS version 7, satellite total ozone is usually higher than the integrated column amount from the sounding. Discrepancies between the sonde and satellite datasets decline by two percentage points on average, compared to the version 7 TOMS offsets. Second, the SHADOZ station data are compared to results of chamber simulations (JOSIE-2000, Juelich Ozonesonde Intercomparison Experiment) in which the various SHADOZ techniques were evaluated. The range of JOSIE column deviations from a standard instrument (-10%) in the chamber resembles that of the SHADOZ station data. It appears that some systematic variations in the SHADOZ ozone record are accounted for by differences in solution strength, data processing and instrument type (manufacturer).

  19. A gene-free formulation of classical quantitative genetics used to examine results and interpretations under three standard assumptions.

    PubMed

    Taylor, Peter J

    2012-12-01

    Quantitative genetics (QG) analyses variation in traits of humans, other animals, or plants in ways that take account of the genealogical relatedness of the individuals whose traits are observed. "Classical" QG, where the analysis of variation does not involve data on measurable genetic or environmental entities or factors, is reformulated in this article using models that are free of hypothetical, idealized versions of such factors, while still allowing for defined degrees of relatedness among kinds of individuals or "varieties." The gene-free formulation encompasses situations encountered in human QG as well as in agricultural QG. This formulation is used to describe three standard assumptions involved in classical QG and provide plausible alternatives. Several concerns about the partitioning of trait variation into components and its interpretation, most of which have a long history of debate, are discussed in light of the gene-free formulation and alternative assumptions. That discussion is at a theoretical level, not dependent on empirical data in any particular situation. Additional lines of work to put the gene-free formulation and alternative assumptions into practice and to assess their empirical consequences are noted, but lie beyond the scope of this article. The three standard QG assumptions examined are: (1) partitioning of trait variation into components requires models of hypothetical, idealized genes with simple Mendelian inheritance and direct contributions to the trait; (2) all other things being equal, similarity in traits for relatives is proportional to the fraction shared by the relatives of all the genes that vary in the population (e.g., fraternal or dizygotic twins share half of the variable genes that identical or monozygotic twins share); (3) in analyses of human data, genotype-environment interaction variance (in the classical QG sense) can be discounted. The concerns about the partitioning of trait variation discussed include: the
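
    Standard assumption (2) underlies the classical twin estimate of heritability: if monozygotic pairs share all and dizygotic pairs share half of the variable genes, then, under the other standard assumptions, h² ≈ 2(r_MZ − r_DZ). The short sketch below works through that standard calculation with hypothetical correlations; it illustrates the assumption being examined, not the gene-free reformulation proposed in the article.

      # Falconer-style estimate under the standard classical-QG assumptions
      # (illustrative numbers only; not from the article).
      r_mz = 0.74   # hypothetical trait correlation for monozygotic twin pairs
      r_dz = 0.45   # hypothetical trait correlation for dizygotic twin pairs

      heritability = 2 * (r_mz - r_dz)   # "genetic" variance fraction
      shared_env = 2 * r_dz - r_mz       # shared-environment fraction
      nonshared_env = 1 - r_mz           # non-shared environment plus error
      print(heritability, shared_env, nonshared_env)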

  20. In vivo discrimination of hip fracture with quantitative computed tomography: results from the prospective European Femur Fracture Study (EFFECT).

    PubMed

    Bousson, Valérie Danielle; Adams, Judith; Engelke, Klaus; Aout, Mounir; Cohen-Solal, Martine; Bergot, Catherine; Haguenauer, Didier; Goldberg, Daniele; Champion, Karine; Aksouh, Redha; Vicaut, Eric; Laredo, Jean-Denis

    2011-04-01

    In assessing osteoporotic fractures of the proximal femur, the main objective of this in vivo case-control study was to evaluate the performance of quantitative computed tomography (QCT) and a dedicated 3D image analysis tool [Medical Image Analysis Framework--Femur option (MIAF-Femur)] in differentiating hip fracture and non-hip fracture subjects. One-hundred and seven women were recruited in the study, 47 women (mean age 81.6 years) with low-energy hip fractures and 60 female non-hip fracture control subjects (mean age 73.4 years). Bone mineral density (BMD) and geometric variables of cortical and trabecular bone in the femoral head and neck, trochanteric, and intertrochanteric regions and proximal shaft were assessed using QCT and MIAF-Femur. Areal BMD (aBMD) was assessed using dual-energy X-ray absorptiometry (DXA) in 96 (37 hip fracture and 59 non-hip fracture subjects) of the 107 patients. Logistic regressions were computed to extract the best discriminants of hip fracture, and the area under the receiver operating characteristic curve (AUC) was calculated. Three logistic models that discriminated the occurrence of hip fracture with QCT variables were obtained (AUC = 0.84). All three models combined one densitometric variable--a trabecular BMD (measured in the femoral head or in the trochanteric region)--and one geometric variable--a cortical thickness value (measured in the femoral neck or proximal shaft). The best discriminant using DXA variables was obtained with total femur aBMD (AUC = 0.80, p = .003). Results highlight a synergistic contribution of trabecular and cortical components to hip fracture risk and the utility of assessing QCT BMD of the femoral head for improved understanding and possible insights into prevention of hip fractures.
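
    A minimal sketch of the kind of model described (one trabecular BMD variable plus one cortical thickness variable entered into a logistic regression, summarized by the AUC) is given below using simulated data, not the EFFECT cohort.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(2)
      n = 107  # same order as the study cohort; the data below are simulated

      fracture = rng.integers(0, 2, size=n)                        # 0 = control, 1 = hip fracture
      trab_bmd = 120 - 30 * fracture + rng.normal(0, 25, size=n)   # mg/cm^3, lower in fracture cases
      cort_thick = 2.0 - 0.4 * fracture + rng.normal(0, 0.4, n)    # mm, thinner in fracture cases
      X = np.column_stack([trab_bmd, cort_thick])

      model = LogisticRegression(max_iter=1000).fit(X, fracture)
      auc = roc_auc_score(fracture, model.predict_proba(X)[:, 1])
      print(f"in-sample AUC = {auc:.2f}")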

  1. Acquisition and Retention of Quantitative Communication Skills in an Undergraduate Biology Curriculum: Long-Term Retention Results

    ERIC Educational Resources Information Center

    Chevalier, Cary D.; Ashley, David C.; Rushin, John W.

    2010-01-01

    The purpose of this study was to assess some of the effects of a nontraditional, experimental learning approach designed to improve rapid acquisition and long-term retention of quantitative communication skills (QCS) such as descriptive and inferential statistics, hypothesis formulation, experimental design, data characteristics, and data…

  2. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  3. 49 CFR 1155.23 - Additional requirements when filing after an unsatisfactory result from a State, local, or...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... required in 49 CFR 1155.23(b). The petition shall be filed simultaneously with the land-use-exemption... Procedures Governing Applications for a Land-Use-Exemption Permit § 1155.23 Additional requirements when... siting of the facility, the applicant may petition the Board to accept an application for a...

  4. Effects of Additional Elements on the Evolution of Second Phases in 9-12% Cr Steel and Resulting Mechanical Properties

    NASA Astrophysics Data System (ADS)

    Dong, Jiling; Yu, Hui; Yoo, Dae-Hwang; Huynh, Quocbao; Shin, Keesam; Kim, Minsoo; Kang, Sungtae

    This study investigates precipitate evolution, with and without additions of W, Co, and B, in two kinds of 9-12% Cr steels (designated A and B) used for power plants, after various aging times and temperatures, using OM, SEM, and TEM. Three kinds of precipitates (Cr-rich M23C6, Nb-rich and V-rich MX, and W-rich and Mo-rich Laves phase) were observed and investigated in the two alloys. Upon aging, the area fraction of M23C6 increased, whereas that of the Laves phases decreased despite an increase in particle size. The area fraction of the W-rich Laves phase was much higher than that of the Mo-rich Laves phase, indicating that W addition is more effective than Mo addition in promoting Laves phase precipitation (specimen A). The martensitic microstructure of specimen B was more stable than that of specimen A due to the addition of cobalt and boron. Tensile and impact properties were measured and studied in relation to the long-term aging effect.

  5. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as nearly identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets that could be interpreted in terms of storage-related amplitude variations in the depth range of the reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, approximately 7% less than had been injected by that time. For the second repeat survey, the mass estimate was approximately 10-15% less than the injected amount. The deviations may be explained by several factors
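
    Turning a gridded CO2 saturation model into a mass estimate, as done when comparing imaged and injected amounts, is commonly a porosity-saturation-density summation; the sketch below is a generic illustration with assumed grid and reservoir values, not the Ketzin workflow itself.

```python
# Minimal sketch of a generic reservoir mass estimate from a gridded CO2
# saturation model; grid size, porosity, saturation and CO2 density are assumed
# illustrative values, and this is not the Ketzin processing workflow.
import numpy as np

cell_volume = 25.0 * 25.0 * 2.0           # m^3 per cell (assumed 25 m x 25 m x 2 m grid)
porosity = np.full((100, 100, 10), 0.22)  # fraction, illustrative
saturation = np.zeros_like(porosity)
saturation[45:55, 45:55, 0:4] = 0.5       # illustrative CO2 plume region
rho_co2 = 600.0                           # kg/m^3 at reservoir conditions, assumed

mass_kg = np.sum(porosity * saturation * rho_co2 * cell_volume)
print(f"Estimated CO2 mass: {mass_kg / 1e6:.1f} kt")
```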

  6. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in guaranteeing its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in evaluating the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis seems to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a non-trivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds, fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry. The AM technique instead produces scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. Here, an SR μ-CT image analysis approach is reported that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.
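
    A pore-size distribution of the kind compared above can be extracted from a binarized μ-CT volume by labelling connected pore regions and converting their volumes to equivalent diameters; the sketch below is a generic illustration on a placeholder array, not the exact protocol used in the study.

```python
# Minimal sketch of a pore-size distribution from a binarized micro-CT volume;
# the binary array and voxel size below are placeholders, not the study's data.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
pores = rng.random((64, 64, 64)) > 0.7  # placeholder binary pore space (True = pore)
voxel_size_um = 9.0                     # assumed voxel edge length in micrometres

labels, n_pores = ndimage.label(pores)          # connected pore regions
voxel_counts = np.bincount(labels.ravel())[1:]  # voxels per region, background excluded
pore_volumes_um3 = voxel_counts * voxel_size_um**3
eq_diameters_um = (6.0 * pore_volumes_um3 / np.pi) ** (1.0 / 3.0)  # equivalent spherical diameter
print(f"{n_pores} pore regions, median equivalent diameter {np.median(eq_diameters_um):.1f} um")
```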

  7. Additional results on palaeomagnetic stratigraphy of the Koobi Fora Formation, east of Lake Turkana (Lake Rudolf), Kenya

    USGS Publications Warehouse

    Hillhouse, J.W.; Ndombi, J.W.M.; Cox, A.; Brock, A.

    1977-01-01

    The magnetostratigraphy of the hominid-bearing sediments exposed east of Lake Turkana has been strengthened by new palaeomagnetic results. Ages obtained from several tuffs by the 40Ar/39Ar method suggest an approximate match between the observed magnetozones and the geomagnetic polarity time scale; however, the palaeomagnetic results are also compatible with a younger chronology suggested by conventional K-Ar dating of the KBS Tuff. © 1977 Nature Publishing Group.

  8. Estimation of daily aluminum intake in Japan based on food consumption inspection results: impact of food additives

    PubMed Central

    Sato, Kyoko; Suzuki, Ippei; Kubota, Hiroki; Furusho, Noriko; Inoue, Tomoyuki; Yasukouchi, Yoshikazu; Akiyama, Hiroshi

    2014-01-01

    Dietary aluminum (Al) intake by young children, children, youths, and adults in Japan was estimated using the market basket method. The Al content of food category (I–VII) samples for each age group was determined by inductively coupled plasma-atomic emission spectrometry (ICP-AES). The Al content in processed foods and unprocessed foods ranged from 0.40 to 21.7 mg/kg and from 0.32 to 0.54 mg/kg, respectively. For processed foods in all age groups, the Al content in food category VI samples, sugar and confections/savories, was the highest, followed by those in category II, cereals. The daily dietary Al intake from processed foods was much larger than that from unprocessed foods. The mean weekly percentages of the provisional tolerable weekly intake (PTWI, established by the joint FAO/WHO Expert Committee on Food Additives in 2011) from processed foods for all age groups are 43.1, 22.4, 17.6 and 15.1%, respectively. Only the highest consumer Al exposure value (>P95) of the young children group exceeded the PTWI. PMID:25473496
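
    The percentage-of-PTWI figures quoted above follow from a simple ratio of weekly intake per kilogram of body weight to the PTWI (2 mg/kg bw/week, JECFA 2011); the sketch below illustrates that arithmetic with hypothetical intake and body-weight values, not the survey's measured data.

```python
# Minimal sketch of expressing a daily aluminum intake as a percentage of the
# PTWI; the intake and body weight below are illustrative, not the survey data.
PTWI_MG_PER_KG_BW_PER_WEEK = 2.0  # JECFA (2011) provisional tolerable weekly intake for Al

def percent_of_ptwi(daily_intake_mg: float, body_weight_kg: float) -> float:
    """Weekly intake per kg body weight relative to the PTWI, in percent."""
    weekly_intake_per_kg = daily_intake_mg * 7.0 / body_weight_kg
    return 100.0 * weekly_intake_per_kg / PTWI_MG_PER_KG_BW_PER_WEEK

# Example: a hypothetical young child (16 kg) ingesting 2.0 mg Al per day.
print(f"{percent_of_ptwi(2.0, 16.0):.1f}% of the PTWI")
```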

  9. Estimation of daily aluminum intake in Japan based on food consumption inspection results: impact of food additives.

    PubMed

    Sato, Kyoko; Suzuki, Ippei; Kubota, Hiroki; Furusho, Noriko; Inoue, Tomoyuki; Yasukouchi, Yoshikazu; Akiyama, Hiroshi

    2014-07-01

    Dietary aluminum (Al) intake by young children, children, youths, and adults in Japan was estimated using the market basket method. The Al content of food category (I-VII) samples for each age group was determined by inductively coupled plasma-atomic emission spectrometry (ICP-AES). The Al content in processed foods and unprocessed foods ranged from 0.40 to 21.7 mg/kg and from 0.32 to 0.54 mg/kg, respectively. For processed foods in all age groups, the Al content in food category VI samples, sugar and confections/savories, was the highest, followed by those in category II, cereals. The daily dietary Al intake from processed foods was much larger than that from unprocessed foods. The mean weekly percentages of the provisional tolerable weekly intake (PTWI, established by the joint FAO/WHO Expert Committee on Food Additives in 2011) from processed foods for all age groups are 43.1, 22.4, 17.6 and 15.1%, respectively. Only the highest consumer Al exposure value (>P 95) of the young children group exceeded the PTWI.

  10. A re-examination of paleomagnetic results from NA Jurassic sedimentary rocks: Additional evidence for proposed Jurassic MUTO?

    NASA Astrophysics Data System (ADS)

    Housen, B. A.

    2015-12-01

    Kent and Irving (2010) and Kent et al. (2015) propose a monster shift in the position of Jurassic (160 to 145 Ma) paleopoles for North America, defined by results from igneous rocks. This monster shift is likely an unrecognized true polar wander occurrence. Although subject to inclination error, results from sedimentary rocks from North America, if corrected for these effects, can be used to supplement the available data for this time period. Steiner (2003) reported results from 48 stratigraphic horizons sampled from the Callovian Summerville Fm, from NE New Mexico. A recalculated mean of these results yields a mean direction of D = 332, I = 39, n = 48, k = 15, α95 = 5.4°. These data were analyzed for possible inclination error; although the dataset is small, the E-I results yielded a corrected I = 53. This yields a corrected paleopole for NA at ~165 Ma located at 67° N and 168° E. Paleomagnetic results from the Black Hills, from Kilanowski (2002) for the Callovian Hulett Mbr of the Sundance Fm and from Gregiore (2001) for the Oxfordian-Tithonian Morrison Fm, have previously been interpreted to represent Eocene-aged remagnetizations, due to the nearly exact coincidence between the in-situ pole positions of these Jurassic units and the Eocene pole for NA. Both of the tilt-corrected results for these units have high-latitude poles (Sundance Fm: 79° N, 146° E; Morrison Fm: 89° N, 165° E). An E-I analysis of these data will be presented; using a provisional inclination error of 10°, the corrected paleopoles are: Sundance Fm: 76° N, 220° E; Morrison Fm: 77° N, 266° E. The Black Hills 165 Ma (Sundance Fm) and 145 Ma (Morrison Fm) poles, provisionally corrected for 10° inclination error, occur fairly close to the NA APWP proposed by Kent et al. (2015) using an updated set of results from kimberlites; the agreement between the Sundance Fm and the Triple-B (158 Ma) pole would be nearly exact with a slightly lesser inclination error. The Summerville Fm- which is
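
    Inclination-shallowing corrections of the kind quoted above are commonly expressed through the flattening relation tan(I_obs) = f · tan(I_true); the sketch below applies that relation with an illustrative flattening factor and is not the abstract's actual E/I analysis.

```python
# Minimal sketch of the sediment inclination-shallowing relation
# tan(I_obs) = f * tan(I_true); the flattening factor below is illustrative only.
import math

def correct_inclination(i_obs_deg: float, f: float) -> float:
    """Return the unflattened inclination for an observed inclination and factor f."""
    return math.degrees(math.atan(math.tan(math.radians(i_obs_deg)) / f))

# Example: an observed inclination of 39 deg with an assumed f of 0.61 steepens
# to roughly 53 deg, the order of correction discussed in the abstract.
print(f"corrected I = {correct_inclination(39.0, 0.61):.1f} deg")
```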

  11. Failure to Report Effect Sizes: The Handling of Quantitative Results in Published Health Education and Behavior Research.

    PubMed

    Barry, Adam E; Szucs, Leigh E; Reyes, Jovanni V; Ji, Qian; Wilson, Kelly L; Thompson, Bruce

    2016-10-01

    Given the American Psychological Association's strong recommendation to always report effect sizes in research, scholars have a responsibility to provide complete information regarding their findings. The purposes of this study were to (a) determine the frequencies with which different effect sizes were reported in published, peer-reviewed articles in health education, promotion, and behavior journals and (b) discuss implications for reporting effect size in social science research. Across a 4-year time period (2010-2013), 1,950 peer-reviewed published articles were examined from the following six health education and behavior journals: American Journal of Health Behavior, American Journal of Health Promotion, Health Education & Behavior, Health Education Research, Journal of American College Health, and Journal of School Health. Quantitative features from eligible manuscripts were documented using Qualtrics online survey software. Of the 1,245 articles in the final sample that reported quantitative data analyses, approximately 47.9% (n = 597) of the articles reported an effect size. While 16 unique types of effect size were reported across all included journals, many of the effect sizes were reported with little frequency across most journals. Overall, odds ratio/adjusted odds ratio (n = 340, 50.1%), Pearson r/r² (n = 162, 23.8%), and eta squared/partial eta squared (n = 46, 7.2%) accounted for the most frequently used effect sizes. Quality research practice requires both testing statistical significance and reporting effect size. However, our study shows that a substantial portion of published literature in health education and behavior lacks consistent reporting of effect size.
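
    Two of the effect sizes named above, the odds ratio and eta squared, reduce to short calculations; the sketch below computes both from hypothetical data and is not a reanalysis of the reviewed articles.

```python
# Minimal sketch of two of the effect sizes named above, computed from
# hypothetical data; not a reanalysis of the reviewed articles.
import numpy as np

# Odds ratio from a 2x2 table: rows = exposed/unexposed, columns = outcome yes/no.
table = np.array([[30.0, 70.0],
                  [15.0, 85.0]])
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])

# Eta squared from a one-way layout: between-group sum of squares over total.
groups = [np.array([3.1, 2.9, 3.4, 3.0]), np.array([3.8, 4.1, 3.9, 4.2])]
grand_mean = np.concatenate(groups).mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((np.concatenate(groups) - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"odds ratio = {odds_ratio:.2f}, eta squared = {eta_squared:.2f}")
```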

  12. Failure to Report Effect Sizes: The Handling of Quantitative Results in Published Health Education and Behavior Research.

    PubMed

    Barry, Adam E; Szucs, Leigh E; Reyes, Jovanni V; Ji, Qian; Wilson, Kelly L; Thompson, Bruce

    2016-10-01

    Given the American Psychological Association's strong recommendation to always report effect sizes in research, scholars have a responsibility to provide complete information regarding their findings. The purposes of this study were to (a) determine the frequencies with which different effect sizes were reported in published, peer-reviewed articles in health education, promotion, and behavior journals and (b) discuss implications for reporting effect size in social science research. Across a 4-year time period (2010-2013), 1,950 peer-reviewed published articles were examined from the following six health education and behavior journals: American Journal of Health Behavior, American Journal of Health Promotion, Health Education & Behavior, Health Education Research, Journal of American College Health, and Journal of School Health. Quantitative features from eligible manuscripts were documented using Qualtrics online survey software. Of the 1,245 articles in the final sample that reported quantitative data analyses, approximately 47.9% (n = 597) of the articles reported an effect size. While 16 unique types of effect size were reported across all included journals, many of the effect sizes were reported with little frequency across most journals. Overall, odds ratio/adjusted odds ratio (n = 340, 50.1%), Pearson r/r² (n = 162, 23.8%), and eta squared/partial eta squared (n = 46, 7.2%) accounted for the most frequently used effect sizes. Quality research practice requires both testing statistical significance and reporting effect size. However, our study shows that a substantial portion of published literature in health education and behavior lacks consistent reporting of effect size. PMID:27624442

  13. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin–DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody

    PubMed Central

    Dou, Shuping; Virostko, John; Greiner, Dale L.; Powers, Alvin C.; Liu, Guozheng

    2016-01-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ~95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in

  14. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin-DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody.

    PubMed

    Dou, Shuping; Virostko, John; Greiner, Dale L; Powers, Alvin C; Liu, Guozheng

    2015-08-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ∼95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in

  15. Additive reductions in zebrafish PRPS1 activity result in a spectrum of deficiencies modeling several human PRPS1-associated diseases

    PubMed Central

    Pei, Wuhong; Xu, Lisha; Varshney, Gaurav K.; Carrington, Blake; Bishop, Kevin; Jones, MaryPat; Huang, Sunny C.; Idol, Jennifer; Pretorius, Pamela R.; Beirl, Alisha; Schimmenti, Lisa A.; Kindt, Katie S.; Sood, Raman; Burgess, Shawn M.

    2016-01-01

    Phosphoribosyl pyrophosphate synthetase-1 (PRPS1) is a key enzyme in nucleotide biosynthesis, and mutations in PRPS1 are found in several human diseases including nonsyndromic sensorineural deafness, Charcot-Marie-Tooth disease-5, and Arts Syndrome. We utilized zebrafish as a model to confirm that mutations in PRPS1 result in phenotypic deficiencies in zebrafish similar to those in the associated human diseases. We found two paralogs in zebrafish, prps1a and prps1b and characterized each paralogous mutant individually as well as the double mutant fish. Zebrafish prps1a mutants and prps1a;prps1b double mutants showed similar morphological phenotypes with increasingly severe phenotypes as the number of mutant alleles increased. Phenotypes included smaller eyes and reduced hair cell numbers, consistent with the optic atrophy and hearing impairment observed in human patients. The double mutant also showed abnormal development of primary motor neurons, hair cell innervation, and reduced leukocytes, consistent with the neuropathy and recurrent infection of the human patients possessing the most severe reductions of PRPS1 activity. Further analyses indicated the phenotypes were associated with a prolonged cell cycle likely resulting from reduced nucleotide synthesis and energy production in the mutant embryos. We further demonstrated the phenotypes were caused by delays in the tissues most highly expressing the prps1 genes. PMID:27425195

  16. Lattice strain measurements of deuteride (hydride) formation in epitaxial Nb: Additional results and further insights into past measurements

    SciTech Connect

    Allain, Monica M.C.; Heuser, Brent J.

    2005-08-01

    The evolution of lattice strain during in situ gas-phase deuterium loading of epitaxial (110) Nb films on (1120) sapphire was measured with x-ray diffraction. Two samples with film thicknesses of 208 and 1102 Å were driven through the miscibility gap. Strains in three orthogonal directions were recorded, permitting the complete set of unit cell parameters to be determined for both the solid solution and deuteride phases. The overall film thickness was simultaneously measured by recording the glancing angle reflectivity response. The behavior of the two films was markedly different, with the thicker film exhibiting a much more compliant behavior and concomitant irreversible plastic deformation. The correlation between out-of-plane lattice expansion and film expansion for both films is also consistent with this observation. These results help explain past inconsistencies observed by others.
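
    Out-of-plane lattice strains in such diffraction measurements are typically derived from the shift of a Bragg peak; the sketch below illustrates that arithmetic with an assumed Cu K-alpha wavelength and illustrative peak positions, not the values measured in the study.

```python
# Minimal sketch of an out-of-plane strain from a Bragg peak shift; the
# wavelength, peak positions and implied d-spacings are illustrative values.
import math

WAVELENGTH_A = 1.5406  # Cu K-alpha wavelength in Angstrom (assumed lab source)

def d_spacing(two_theta_deg: float) -> float:
    """Bragg's law: lambda = 2 d sin(theta)."""
    return WAVELENGTH_A / (2.0 * math.sin(math.radians(two_theta_deg / 2.0)))

d_reference = d_spacing(38.50)  # unloaded Nb (110) peak position, illustrative
d_loaded = d_spacing(38.10)     # peak position after deuterium loading, illustrative
strain = (d_loaded - d_reference) / d_reference
print(f"out-of-plane strain = {strain:.3%}")
```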

  17. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  18. Establishment of quantitative PCR (qPCR) and culture laboratory facilities in a field hospital in Benin: 1-year results.

    PubMed

    Marion, Estelle; Ganlonon, Line; Claco, Eric; Blanchard, Simon; Kempf, Marie; Adeye, Ambroise; Chauty, Annick

    2014-12-01

    No simple diagnostic tool is available to confirm Mycobacterium ulcerans infection, which is an emerging disease reported in many rural areas of Africa. Here, we report the 1-year results of a hospital laboratory that was created in an area of endemicity of Benin to facilitate the diagnosis of M. ulcerans infection. PMID:25320228

  19. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely operator’s (acquisition) impact on the results obtained from image analysis and processing, has been shown on a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200'000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient’s back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects – error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18

  20. Flue gas conditioning for improved particle collection in electrostatic precipitators. Second topical report, Results of bench-scale screening of additives

    SciTech Connect

    Durham, M.D.

    1993-08-13

    ADA Technologies, Inc. (ADA) has completed the bench-scale testing phase of a program to evaluate additives that will improve the collection of fine particles in electrostatic precipitators (ESPs). A bench-scale ESP was installed at the Consolidation Coal Company (CONSOL) combustion research and development facility in Library, PA in order to conduct the evaluation. During a two-week test, four candidate additives were injected into the flue gas ahead of a 100 acfm ESP to determine the effect on fly ash collectability. Two additives were found to reduce the emissions from the ESP. Additives "C" and "D" performed better than initially anticipated, reducing emissions initially by 17%. Emissions were reduced by 27% after the ESP was modified by the installation of baffles to minimize sneakage. In addition to the measured improvements in performance, no detrimental effects (i.e., electrode fouling) were observed in the operation of the ESP during the testing. The measures of success identified for the bench-scale phase of the program have been surpassed. Since the additives will affect only non-rapping reentrainment particle losses, it is expected that an even greater improvement in particle collection will be observed in larger-scale ESPs. Therefore, positive results are anticipated during the pilot-scale phase of the program and during a future full-scale demonstration test. A preliminary economic analysis was performed to evaluate the cost of the additive process and to compare its costs against alternative means for reducing emissions from ESPs. The results show that conditioning with additive C at a rate of 0.05% (wt. additive to wt. fly ash) is much less expensive than adding new ESP capacity, and more cost competitive than existing chemical conditioning processes. Preliminary chemical analysis of conditioned fly ash shows that it passes the Toxicity Characteristic Leaching Procedure criteria.

  1. Quantitative comparison between theoretical predictions and experimental results for Bragg spectroscopy of a strongly interacting Fermi superfluid

    SciTech Connect

    Zou Peng; Kuhnle, Eva D.; Vale, Chris J.; Hu Hui

    2010-12-15

    Theoretical predictions for the dynamic structure factor of a harmonically trapped Fermi superfluid near the Bose-Einstein condensate-Bardeen-Cooper-Schrieffer (BEC-BCS) crossover are compared with recent Bragg spectroscopy measurements at large transferred momenta. The calculations are based on a random-phase (or time-dependent Hartree-Fock-Gorkov) approximation generalized to the strongly interacting regime. Excellent agreement with experimental spectra at low temperatures is obtained, with no free parameters. Theoretical predictions for zero-temperature static structure factor are also found to agree well with the experimental results and independent theoretical calculations based on the exact Tan relations. The temperature dependence of the structure factors at unitarity is predicted.

  2. [The evaluation of the results after coronary angioplasty by intracoronary Doppler and quantitative angiography. The correlation of both methods].

    PubMed

    Goicolea Ruigómez, F J; Iñíguez Romo, A; Macaya, C; Alfonso, F; Hernández Antolín, R; Casado, J; Zamorano, J; Zarco, P

    1992-03-01

    To study the importance of measuring coronary flow reserve immediately after coronary angioplasty, we have analysed the results obtained after 28 angioplasties performed in 21 patients. Coronary flow reserve was measured with a 3F intracoronary catheter selectively placed in the dilated artery. The corresponding coronary angiography was analysed with an automatic edge detection program (ARTREK) and by visual estimation. Coronary flow reserve increased in 26/27 cases after angioplasty, from 2.4 ± 1.3 to 4.1 ± 2.7 (p < 0.001). A correlation was found between minimal luminal area and minimal luminal diameter after coronary angioplasty, and coronary flow reserve (r = 0.46; p < 0.05 and r = 0.47; p < 0.05, respectively). The finding of a normal coronary flow reserve (≥3.5) had a 100% specificity but only 56% sensitivity to detect angiographic success (residual stenosis <50%). However, 47% of patients with angiographic success did not reach normal values of coronary flow reserve. Visual estimation of the stenosis had a good correlation with the automatic evaluation, but significant scattering was observed at visual levels ≤25%. Visual assessment underestimated residual stenosis in all but one of the procedures. We conclude that coronary flow reserve is a potentially useful index for assessing the results after angioplasty that may complement coronary angiography. Nonetheless, substantial differences between the two methods exist in a significant number of cases. The relative merits of both methods, as well as the particular circumstances in which coronary flow reserve should be used, require further studies.

  3. Messages that increase women’s intentions to abstain from alcohol during pregnancy: results from quantitative testing of advertising concepts

    PubMed Central

    2014-01-01

    Background Public awareness-raising campaigns targeting alcohol use during pregnancy are an important part of preventing prenatal alcohol exposure and Fetal Alcohol Spectrum Disorder. Despite this, there is little evidence on what specific elements contribute to campaign message effectiveness. This research evaluated three different advertising concepts addressing alcohol and pregnancy: a threat appeal, a positive appeal promoting a self-efficacy message, and a concept that combined the two appeals. The primary aim was to determine the effectiveness of these concepts in increasing women’s intentions to abstain from alcohol during pregnancy. Methods Women of childbearing age and pregnant women residing in Perth, Western Australia participated in a computer-based questionnaire where they viewed either a control or one of the three experimental concepts. Following exposure, participants’ intentions to abstain from and reduce alcohol intake during pregnancy were measured. Other measures assessed included perceived main message, message diagnostics, and potential to promote defensive responses or unintended consequences. Results The concepts containing a threat appeal were significantly more effective at increasing women’s intentions to abstain from alcohol during pregnancy than the self-efficacy message and the control. The concept that combined threat and self-efficacy is recommended for development as part of a mass-media campaign as it has good persuasive potential, provides a balance of positive and negative emotional responses, and is unlikely to result in defensive or unintended consequences. Conclusions This study provides important insights into the components that enhance the persuasiveness and effectiveness of messages aimed at preventing prenatal alcohol exposure. The recommended concept has good potential for use in a future campaign aimed at promoting women’s intentions to abstain from alcohol during pregnancy. PMID:24410764

  4. Longitudinal, intermodality registration of quantitative breast PET and MRI data acquired before and during neoadjuvant chemotherapy: Preliminary results

    SciTech Connect

    Atuegwu, Nkiruka C.; Williams, Jason M.; Li, Xia; Arlinghaus, Lori R.; Abramson, Richard G.; Chakravarthy, A. Bapsi; Abramson, Vandana G.; Yankeelov, Thomas E.

    2014-05-15

    Purpose: The authors propose a method whereby serially acquired DCE-MRI, DW-MRI, and FDG-PET breast data sets can be spatially and temporally coregistered to enable the comparison of changes in parameter maps at the voxel level. Methods: First, the authors aligned the PET and MR images at each time point rigidly and nonrigidly. To register the MR images longitudinally, the authors extended a nonrigid registration algorithm by including a tumor volume-preserving constraint in the cost function. After the PET images were aligned to the MR images at each time point, the authors then used the transformation obtained from the longitudinal registration of the MRI volumes to register the PET images longitudinally. The authors tested this approach on ten breast cancer patients by calculating a modified Dice similarity of tumor size between the PET and MR images as well as the bending energy and changes in the tumor volume after the application of the registration algorithm. Results: The median of the modified Dice in the registered PET and DCE-MRI data was 0.92. For the longitudinal registration, the median tumor volume change was −0.03% for the constrained algorithm, compared to −32.16% for the unconstrained registration algorithms (p = 8 × 10⁻⁶). The medians of the bending energy were 0.0092 and 0.0001 for the unconstrained and constrained algorithms, respectively (p = 2.84 × 10⁻⁷). Conclusions: The results indicate that the proposed method can accurately spatially align DCE-MRI, DW-MRI, and FDG-PET breast images acquired at different time points during therapy while preventing the tumor from being substantially distorted or compressed.
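
    The Dice similarity used above to judge PET/MR alignment is a simple overlap ratio; the sketch below computes the standard (unmodified) Dice coefficient on placeholder masks and is not the authors' registration pipeline.

```python
# Minimal sketch of a Dice similarity coefficient between two binary tumor masks;
# this is the standard (unmodified) Dice on placeholder spheres, not the paper's
# modified Dice of tumor size.
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice coefficient: 2 * |A intersect B| / (|A| + |B|)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

z, y, x = np.ogrid[:64, :64, :64]
mask_pet = (z - 32) ** 2 + (y - 32) ** 2 + (x - 32) ** 2 < 10 ** 2  # placeholder PET mask
mask_mri = (z - 34) ** 2 + (y - 32) ** 2 + (x - 30) ** 2 < 10 ** 2  # placeholder MRI mask
print(f"Dice = {dice(mask_pet, mask_mri):.2f}")
```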

  5. SU-C-210-06: Quantitative Evaluation of Dosimetric Effects Resulting From Positional Variations of Pancreatic Tumor Volumes

    SciTech Connect

    Yu, S; Sehgal, V; Wei, R; Lawrenson, L; Kuo, J; Hanna, N; Ramsinghani, N; Daroui, P; Al-Ghazi, M

    2015-06-15

    Purpose: The aim of this study is to quantify dosimetric effects resulting from variation in pancreatic tumor position assessed by bony anatomy and implanted fiducial markers. Methods: Twelve pancreatic cancer patients were retrospectively analyzed for this study. All patients received volumetric modulated arc therapy (VMAT) treatment using fiducial-based Image Guided Radiation Therapy (IGRT) to the intact pancreas. Using daily orthogonal kV and/or cone beam CT images, the shifts needed to co-register the daily pre-treatment images to the reference CT from fiducial to bone (Fid-Bone) were recorded as Left-Right (LR), Anterior-Posterior (AP) and Superior-Inferior (SI). The original VMAT plan iso-center was shifted based on kV bone matching positions at 5 evenly spaced fractions. Dose coverage of the planning target volumes (PTVs) (V100%) and mean dose to liver, kidney and stomach/duodenum were assessed in the modified plans. Results: A total of 306 fractions were analyzed. The absolute fiducial-bone positional shifts were greatest in the SI direction (AP = 2.7 ± 3.0, LR = 2.8 ± 2.8, and SI = 6.3 ± 7.9 mm, mean ± SD). The V100% was significantly reduced, by 13.5% (Fid-Bone = 95.3 ± 2.0 vs. 82.3 ± 11.8%, p = 0.02). This varied widely among patients (Fid-Bone V100% range = 2–60%), and 33% of patients had a reduction in V100% of more than 10%. The impact on OARs was greatest for the liver (Fid-Bone = 14.6 vs. 16.1 Gy, 10%) and the stomach (Fid-Bone = 23.9 vs. 25.5 Gy, 7%), but was not statistically significant (p = 0.10 for both). Conclusion: Compared to matching by fiducial markers, matching by bony anatomy would have reduced the PTV coverage substantially, by 13.5%. This reinforces the importance of online position verification based on fiducial markers. Hence, implantation of fiducial markers is strongly recommended for pancreatic cancer patients undergoing intensity modulated radiation therapy treatments.

  6. Comparison of ENDF/B-VII.1 and ENDF/B-VII.0 Results for the Expanded Criticality Validation Suite for MCNP and for Selected Additional Criticality Benchmarks

    NASA Astrophysics Data System (ADS)

    Mosteller, R.

    2014-04-01

    Results obtained with the MCNP5 Monte Carlo code and the ENDF/B-VII.1 and ENDF/B-VII.0 nuclear data libraries have been compared for the 119 benchmarks in the expanded criticality validation suite for MCNP and for 23 additional benchmarks. ENDF/B-VII.1 was found to produce improvements relative to ENDF/B-VII.0 for benchmarks that contain significant amounts of tungsten, zirconium, cadmium, or beryllium, although the results for the benchmarks with beryllium suggest that further improvement still may be needed. In addition, a number of deficiencies previously identified for ENDF/B-VII.0 still remain in ENDF/B-VII.1.

  7. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  8. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions
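
    The Standardized Uptake Value normalization mentioned above reduces to a ratio of tissue activity concentration to injected dose per body weight; the sketch below illustrates body-weight SUV with assumed values rather than fields read from an actual DICOM header.

```python
# Minimal sketch of body-weight SUV normalization; the activity concentration,
# injected dose and body weight are illustrative, and decay correction to a
# common time point is assumed to have been applied already.
def suv_bw(activity_bq_per_ml: float, injected_dose_bq: float, body_weight_g: float) -> float:
    """SUV = tissue activity concentration / (injected dose / body weight)."""
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# Example: 25 kBq/mL uptake, 370 MBq injected, 80 kg patient.
print(f"SUV = {suv_bw(25_000.0, 370e6, 80_000.0):.2f}")
```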

  9. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  10. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  11. Binary neutron-star mergers with Whisky and SACRA: First quantitative comparison of results from independent general-relativistic hydrodynamics codes

    NASA Astrophysics Data System (ADS)

    Baiotti, Luca; Shibata, Masaru; Yamamoto, Tetsuro

    2010-09-01

    We present the first quantitative comparison of two independent general-relativistic hydrodynamics codes, the whisky code and the sacra code. We compare the output of simulations starting from the same initial data and carried out with the configuration (numerical methods, grid setup, resolution, gauges) which for each code has been found to give consistent and sufficiently accurate results, in particular, in terms of cleanness of gravitational waveforms. We focus on the quantities that should be conserved during the evolution (rest mass, total mass energy, and total angular momentum) and on the gravitational-wave amplitude and frequency. We find that the results produced by the two codes agree at a reasonable level, with variations in the different quantities but always at better than about 10%.
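
    Cross-code comparisons of this kind typically track the relative difference of conserved quantities or waveform amplitudes sampled at common times; the sketch below illustrates that bookkeeping on placeholder arrays and is not the Whisky/SACRA comparison itself.

```python
# Minimal sketch of the kind of cross-code check described above: the relative
# difference of a conserved quantity sampled at common times by two codes; the
# arrays below are placeholders, not Whisky or SACRA output.
import numpy as np

t = np.linspace(0.0, 10.0, 200)
rest_mass_code_a = 1.62 * np.ones_like(t)                  # placeholder output of code A
rest_mass_code_b = 1.62 * (1.0 + 0.002 * np.sin(0.5 * t))  # placeholder output of code B

rel_diff = np.abs(rest_mass_code_a - rest_mass_code_b) / np.abs(rest_mass_code_a)
print(f"maximum relative difference = {rel_diff.max():.2%}")
```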

  12. An adapted mindfulness-based stress reduction program for elders in a continuing care retirement community: quantitative and qualitative results from a pilot randomized controlled trial.

    PubMed

    Moss, Aleezé S; Reibel, Diane K; Greeson, Jeffrey M; Thapar, Anjali; Bubb, Rebecca; Salmon, Jacqueline; Newberg, Andrew B

    2015-06-01

    The purpose of this study was to test the feasibility and effectiveness of an adapted 8-week Mindfulness-Based Stress Reduction (MBSR) program for elders in a continuing care community. This mixed-methods study used both quantitative and qualitative measures. A randomized waitlist control design was used for the quantitative aspect of the study. Thirty-nine elderly were randomized to MBSR (n = 20) or a waitlist control group (n = 19), mean age was 82 years. Both groups completed pre-post measures of health-related quality of life, acceptance and psychological flexibility, facets of mindfulness, self-compassion, and psychological distress. A subset of MBSR participants completed qualitative interviews. MBSR participants showed significantly greater improvement in acceptance and psychological flexibility and in role limitations due to physical health. In the qualitative interviews, MBSR participants reported increased awareness, less judgment, and greater self-compassion. Study results demonstrate the feasibility and potential effectiveness of an adapted MBSR program in promoting mind-body health for elders.

  13. Quantitatively Verifying the Results' Rationality for Farmland Quality Evaluation with Crop Yield, a Case Study in the Northwest Henan Province, China.

    PubMed

    Zhang, Yali; Huang, Junchang; Yu, Lin; Wang, Song

    2016-01-01

    Evaluating the rationality of assessment results for farmland quality (FQ) is usually qualitative and based on farmers' and experts' perceptions of soil quality and crop yield. Its quantitative checking remains difficult and is often ignored. In this paper, FQ in Xiuwu County, Northwest Henan Province, China was evaluated by the gray relational analysis (GRA) method and the traditional analytic hierarchy process (AHP) method. The consistency rate of the two results was analysed. The research focused on proposing a method of testing the rationality of FQ evaluation results based on crop yield: first, generating a grade map of crop yield and overlaying it with the FQ evaluation maps; then analysing their consistency rate for each grade at the same spatial position; and finally examining the consistency effects and allowing a decision on adopting the results. The results showed that the area consistency rate and the matching evaluation unit numbers between the two methods were 84.68% and 87.29%, respectively, and the spatial distribution was approximately equal. The area consistency rates between crop yield level and the FQ evaluation levels by GRA and AHP were 78.15% and 74.29%, respectively. Therefore, the verifying effects of GRA and AHP were close, good and acceptable, and the FQ results from both could reflect the crop yield levels. The evaluation results by GRA, as a whole, were slightly more rational than those by AHP. PMID:27490247
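
    The area consistency rate described above is the fraction of spatial units whose crop-yield grade matches their FQ grade; the sketch below illustrates that overlay calculation on placeholder grade maps, not the Xiuwu County data.

```python
# Minimal sketch of an area consistency rate between a crop-yield grade map and
# a farmland-quality grade map on the same grid; the grade maps are placeholders.
import numpy as np

rng = np.random.default_rng(3)
yield_grades = rng.integers(1, 5, size=(200, 200))  # grades 1-4, illustrative
fq_grades = yield_grades.copy()
disagree = rng.random(fq_grades.shape) < 0.2        # force disagreement on ~20% of cells
fq_grades[disagree] = rng.integers(1, 5, size=disagree.sum())

consistency_rate = np.mean(yield_grades == fq_grades)
print(f"area consistency rate = {consistency_rate:.1%}")
```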

  14. Using qualitative research to facilitate the interpretation of quantitative results from a discrete choice experiment: insights from a survey in elderly ophthalmologic patients

    PubMed Central

    Vennedey, Vera; Danner, Marion; Evers, Silvia MAA; Fauser, Sascha; Stock, Stephanie; Dirksen, Carmen D; Hiligsmann, Mickaël

    2016-01-01

    Background Age-related macular degeneration (AMD) is the leading cause of visual impairment and blindness in industrialized countries. Currently, mainly three treatment options are available, which are all intravitreal injections, but differ with regard to the frequency of injections needed, their approval status, and cost. This study aims to estimate patients’ preferences for characteristics of treatment options for neovascular AMD. Methods An interviewer-assisted discrete choice experiment was conducted among patients suffering from AMD treated with intravitreal injections. A Bayesian efficient design was used for the development of 12 choice tasks. In each task patients indicated their preference for one out of two treatment scenarios described by the attributes: side effects, approval status, effect on visual function, injection and monitoring frequency. While answering the choice tasks, patients were asked to think aloud and explain the reasons for choosing or rejecting specific characteristics. Quantitative data were analyzed with a mixed multinomial logit model. Results Eighty-six patients completed the questionnaire. Patients significantly preferred treatments that improve visual function, are approved, are administered in a pro re nata regimen (as needed), and are accompanied by bimonthly monitoring. Patients significantly disliked less frequent monitoring visits (every 4 months) and explained this was due to fear of deterioration being left unnoticed, and in turn experiencing disease deterioration. Significant preference heterogeneity was found for all levels except for bimonthly monitoring visits and severe, rare eye-related side effects. Patients gave clear explanations of their individual preferences during the interviews. Conclusion Significant preference trends were discernible for the overall sample, despite the preference heterogeneity for most treatment characteristics. Patients like to be monitored and treated regularly, but not too frequently

  15. Quantitatively Verifying the Results' Rationality for Farmland Quality Evaluation with Crop Yield, a Case Study in the Northwest Henan Province, China

    PubMed Central

    Huang, Junchang; Wang, Song

    2016-01-01

    Evaluating the rationality of assessment results for farmland quality (FQ) is usually qualitative and based on farmers' and experts' perceptions of soil quality and crop yield. Its quantitative checking remains difficult and is often ignored. In this paper, FQ in Xiuwu County, Northwest Henan Province, China was evaluated by the gray relational analysis (GRA) method and the traditional analytic hierarchy process (AHP) method. The consistency rate of the two results was analysed. The research focused on proposing a method of testing the rationality of FQ evaluation results based on crop yield: first, generating a grade map of crop yield and overlaying it with the FQ evaluation maps; then analysing their consistency rate for each grade at the same spatial position; and finally examining the consistency effects and allowing a decision on adopting the results. The results showed that the area consistency rate and the matching evaluation unit numbers between the two methods were 84.68% and 87.29%, respectively, and the spatial distribution was approximately equal. The area consistency rates between crop yield level and the FQ evaluation levels by GRA and AHP were 78.15% and 74.29%, respectively. Therefore, the verifying effects of GRA and AHP were close, good and acceptable, and the FQ results from both could reflect the crop yield levels. The evaluation results by GRA, as a whole, were slightly more rational than those by AHP. PMID:27490247

  16. Quantitative High-Efficiency Cadmium-Zinc-Telluride SPECT with Dedicated Parallel-Hole Collimation System in Obese Patients: Results of a Multi-Center Study

    PubMed Central

    Nakazato, Ryo; Slomka, Piotr J.; Fish, Mathews; Schwartz, Ronald G.; Hayes, Sean W.; Thomson, Louise E.J.; Friedman, John D.; Lemley, Mark; Mackin, Maria L.; Peterson, Benjamin; Schwartz, Arielle M.; Doran, Jesse A.; Germano, Guido; Berman, Daniel S.

    2014-01-01

    Background Obesity is a common source of artifact on conventional SPECT myocardial perfusion imaging (MPI). We evaluated image quality and diagnostic performance of high-efficiency (HE) cadmium-zinc-telluride (CZT) parallel-hole SPECT-MPI for coronary artery disease (CAD) in obese patients. Methods and Results 118 consecutive obese patients at 3 centers (BMI 43.6±8.9 kg/m2, range 35–79.7 kg/m2) had upright/supine HE-SPECT and ICA >6 months (n=67) or low-likelihood of CAD (n=51). Stress quantitative total perfusion deficit (TPD) for upright (U-TPD), supine (S-TPD) and combined acquisitions (C-TPD) was assessed. Image quality (IQ; 5=excellent; <3 nondiagnostic) was compared among BMI 35–39.9 (n=58), 40–44.9 (n=24) and ≥45 (n=36) groups. ROC-curve area for CAD detection (≥50% stenosis) for U-TPD, S-TPD, and C-TPD were 0.80, 0.80, and 0.87, respectively. Sensitivity/specificity was 82%/57% for U-TPD, 74%/71% for S-TPD, and 80%/82% for C-TPD. C-TPD had highest specificity (P=.02). C-TPD normalcy rate was higher than U-TPD (88% vs. 75%, P=.02). Mean IQ was similar among BMI 35–39.9, 40–44.9 and ≥45 groups [4.6 vs. 4.4 vs. 4.5, respectively (P=.6)]. No patient had a non-diagnostic stress scan. Conclusions In obese patients, HE-SPECT MPI with dedicated parallel-hole collimation demonstrated high image quality, normalcy rate, and diagnostic accuracy for CAD by quantitative analysis of combined upright/supine acquisitions. PMID:25388380

  17. The impact of extracellular matrix on the chemoresistance of solid tumors--experimental and clinical results of hyaluronidase as additive to cytostatic chemotherapy.

    PubMed

    Baumgartner, G; Gomar-Höss, C; Sakr, L; Ulsperger, E; Wogritsch, C

    1998-09-11

    Chemoresistance is a major reason for the limited results of chemotherapy in solid tumors. Chemoresistance of multicellular tumor tissues is more pronounced than that of single cells, in vivo and in vitro. The enzyme hyaluronidase is able to loosen cell-cell contacts and the interstitial connective tissue and, as such, was shown in a number of preclinical and clinical trials to enhance the efficacy of cytostatic agents. Although proven to be very effective as an additive to local chemotherapy, its systemic efficacy is not as well documented. We present a randomized trial in high-grade astrocytomas of combined chemotherapy and radiation therapy with and without hyaluronidase. After very promising pilot results with systemic hyaluronidase in various tumor entities, including astrocytomas, this randomized study failed to show synergy with chemotherapy and radiation therapy in high-grade astrocytomas with respect to survival. The promising preclinical data and the rather well documented activity of hyaluronidase as an additive to local chemotherapy are adequate motives to further elucidate the complex manner in which it acts in the interstitial tumor matrix and to obtain more information concerning the optimal route of application, the optimal dosage, and the spectrum of tumor entities in which it is synergistic with cytostatic chemotherapy and perhaps even radiation therapy.

  18. Potlining Additives

    SciTech Connect

    Rudolf Keller

    2004-08-10

    In this project, a concept to improve the performance of aluminum production cells by introducing potlining additives was examined and tested. Boron oxide was added to cathode blocks, and titanium was dissolved in the metal pool; this resulted in the formation of titanium diboride and caused the molten aluminum to wet the carbonaceous cathode surface. Such wetting reportedly leads to operational improvements and extended cell life. In addition, boron oxide suppresses cyanide formation. This final report presents and discusses the results of this project. Substantial economic benefits for the practical implementation of the technology are projected, especially for modern cells with graphitized blocks. For example, with an energy savings of about 5% and an increase in pot life from 1500 to 2500 days, a cost savings of $ 0.023 per pound of aluminum produced is projected for a 200 kA pot.

  19. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  20. Pulsed addition of HMF and furfural to batch-grown xylose-utilizing Saccharomyces cerevisiae results in different physiological responses in glucose and xylose consumption phase

    PubMed Central

    2013-01-01

    Background Pretreatment of lignocellulosic biomass generates a number of undesired degradation products that can inhibit microbial metabolism. Two of these compounds, the furan aldehydes 5-hydroxymethylfurfural (HMF) and 2-furaldehyde (furfural), have been shown to be an impediment for viable ethanol production. In the present study, HMF and furfural were pulse-added during either the glucose or the xylose consumption phase in order to dissect the effects of these inhibitors on energy state, redox metabolism, and gene expression of xylose-consuming Saccharomyces cerevisiae. Results Pulsed addition of 3.9 g L-1 HMF and 1.2 g L-1 furfural during either the glucose or the xylose consumption phase resulted in distinct physiological responses. Addition of furan aldehydes in the glucose consumption phase was followed by a decrease in the specific growth rate and the glycerol yield, whereas the acetate yield increased 7.3-fold, suggesting that NAD(P)H for furan aldehyde conversion was generated by acetate synthesis. No change in the intracellular levels of NAD(P)H was observed 1 hour after pulsing, whereas the intracellular concentration of ATP increased by 58%. An investigation of the response at transcriptional level revealed changes known to be correlated with perturbations in the specific growth rate, such as protein and nucleotide biosynthesis. Addition of furan aldehydes during the xylose consumption phase brought about an increase in the glycerol and acetate yields, whereas the xylitol yield was severely reduced. The intracellular concentrations of NADH and NADPH decreased by 58 and 85%, respectively, hence suggesting that HMF and furfural drained the cells of reducing power. The intracellular concentration of ATP was reduced by 42% 1 hour after pulsing of inhibitors, suggesting that energy-requiring repair or maintenance processes were activated. Transcriptome profiling showed that NADPH-requiring processes such as amino acid biosynthesis and sulfate and

  1. Starch plus sunflower oil addition to the diet of dry dairy cows results in a trans-11 to trans-10 shift of biohydrogenation.

    PubMed

    Zened, A; Enjalbert, F; Nicot, M C; Troegeler-Meynadier, A

    2013-01-01

    Trans fatty acids (FA) exhibit different biological properties. Among them, cis-9,trans-11 conjugated linoleic acid has some interesting putative health properties, whereas trans-10,cis-12 conjugated linoleic acid has negative effects on cow milk fat production and may negatively affect human health. In high-yielding dairy cows, a shift from the trans-11 to the trans-10 pathway of biohydrogenation (BH) can occur in the rumen of cows receiving high-concentrate diets, especially when the diet is supplemented with unsaturated fat sources. To study this shift, 4 rumen-fistulated nonlactating Holstein cows were assigned to a 4×4 Latin square design with 4 different diets during 4 periods. Cows received 12 kg of dry matter per day of 4 diets based on corn silage during 4 successive periods: a control diet (22% starch, <3% crude fat on a DM basis), a high-starch diet supplemented with wheat plus barley (35% starch, <3% crude fat), a sunflower oil diet supplemented with 5% of sunflower oil (20% starch, 7.6% crude fat), and a high-starch plus sunflower oil diet (33% starch, 7.3% crude fat). Five hours after feeding, proportions of trans-11 BH isomers greatly increased in the rumen content with the addition of sunflower oil, without change in ruminal pH compared with the control diet. Addition of starch to the control diet had no effect on BH pathways but decreased ruminal pH. The addition of a large amount of starch in association with sunflower oil increased trans-10 FA at the expense of trans-11 FA in the rumen content, revealing a trans-11 to trans-10 shift. Interestingly, with this latter diet, ruminal pH did not change compared with a single addition of starch. This trans-11 to trans-10 shift occurred progressively, after a decrease in the proportion of trans-11 FA in the rumen, suggesting that this shift could result from a dysbiosis in the rumen in favor of trans-10-producing bacteria at the expense of those producing trans-11, or a modification of bacterial activities.

  2. High SO{sub 2} removal efficiency testing: Results of DBA and sodium formate additive tests at Southwestern Electric Power Company's Pirkey Station

    SciTech Connect

    1996-05-30

    Tests were conducted at Southwestern Electric Power Company's (SWEPCo) Henry W. Pirkey Station wet limestone flue gas desulfurization (FGD) system to evaluate options for achieving high sulfur dioxide removal efficiency. The Pirkey FGD system includes four absorber modules, each with dual slurry recirculation loops and with a perforated plate tray in the upper loop. The options tested involved the use of dibasic acid (DBA) or sodium formate as a performance additive. The effectiveness of other potential options was simulated with the Electric Power Research Institute's (EPRI) FGD PRocess Integration and Simulation Model (FGDPRISM) after it was calibrated to the system. An economic analysis was done to determine the cost effectiveness of the high-efficiency options. Results are summarized below.

  3. Amplitudes of Pain-Related Evoked Potentials Are Useful to Detect Small Fiber Involvement in Painful Mixed Fiber Neuropathies in Addition to Quantitative Sensory Testing – An Electrophysiological Study

    PubMed Central

    Hansen, Niels; Kahn, Ann-Kathrin; Zeller, Daniel; Katsarava, Zaza; Sommer, Claudia; Üçeyler, Nurcan

    2015-01-01

    We investigated the usefulness of pain-related evoked potentials (PREP) elicited by electrical stimulation for the identification of small fiber involvement in patients with mixed fiber neuropathy (MFN). Eleven MFN patients with clinical signs of large fiber impairment and neuropathic pain and ten healthy controls underwent clinical and electrophysiological evaluation. Small fiber function, electrical conductivity and morphology were examined by quantitative sensory testing (QST), PREP, and skin punch biopsy. MFN was diagnosed following clinical and electrophysiological examination (chronic inflammatory demyelinating neuropathy: n = 6; vasculitic neuropathy: n = 3; chronic axonal neuropathy: n = 2). The majority of patients with MFN characterized their pain by descriptors that mainly represent C-fiber-mediated pain. In QST, patients displayed elevated cold, warm, mechanical, and vibration detection thresholds and cold pain thresholds indicative of MFN. PREP amplitudes in patients correlated with cold (p < 0.05) and warm detection thresholds (p < 0.05). Burning pain and the presence of par-/dysesthesias correlated negatively with PREP amplitudes (p < 0.05). PREP amplitudes correlating with cold and warm detection thresholds, burning pain, and par-/dysesthesias support employing PREP amplitudes as an additional tool in conjunction with QST for detecting small fiber impairment in patients with MFN. PMID:26696950

  4. Change in cardio-protective medication and health-related quality of life after diagnosis of screen-detected diabetes: Results from the ADDITION-Cambridge cohort

    PubMed Central

    Black, J.A.; Long, G.H.; Sharp, S.J.; Kuznetsov, L.; Boothby, C.E.; Griffin, S.J.; Simmons, R.K.

    2015-01-01

    Aims Establishing a balance between the benefits and harms of treatment is important among individuals with screen-detected diabetes, for whom the burden of treatment might be higher than the burden of the disease. We described the association between cardio-protective medication and health-related quality of life (HRQoL) among individuals with screen-detected diabetes. Methods 867 participants with screen-detected diabetes underwent clinical measurements at diagnosis, one and five years. General HRQoL (EQ-5D) was measured at baseline, one and five years, and diabetes-specific HRQoL (ADDQoL-AWI) and health status (SF-36) at one and five years. Multivariable linear regression was used to quantify the association between change in HRQoL and change in cardio-protective medication. Results The median (IQR) number of prescribed cardio-protective agents was 2 (1 to 3) at diagnosis, 3 (2 to 4) at one year and 4 (3 to 5) at five years. Change in cardio-protective medication was not associated with change in HRQoL from diagnosis to one year. From one year to five years, change in cardio-protective agents was not associated with change in the SF-36 mental health score. One additional agent was associated with an increase in the SF-36 physical health score (2.1; 95%CI 0.4, 3.8) and an increase in the EQ-5D (0.05; 95%CI 0.02, 0.08). Conversely, one additional agent was associated with a decrease in the ADDQoL-AWI (−0.32; 95%CI −0.51, −0.13), compared to no change. Conclusions We found little evidence that increases in the number of cardio-protective medications impacted negatively on HRQoL among individuals with screen-detected diabetes over five years. PMID:25937542

  5. SU-E-J-06: Additional Imaging Guidance Dose to Patient Organs Resulting From X-Ray Tubes Used in CyberKnife Image Guidance System

    SciTech Connect

    Sullivan, A; Ding, G

    2015-06-15

    Purpose: The use of image-guided radiation therapy (IGRT) has become increasingly common, but the additional radiation exposure resulting from repeated image guidance procedures raises concerns. Although there are many studies reporting imaging dose from different image guidance devices, imaging dose for the CyberKnife Robotic Radiosurgery System is not available. This study provides estimated organ doses resulting from image guidance procedures on the CyberKnife system. Methods: Commercially available Monte Carlo software, PCXMC, was used to calculate average organ doses resulting from x-ray tubes used in the CyberKnife system. There are seven imaging protocols with kVp ranging from 60 to 120 kV and 15 mAs for treatment sites in the cranium, head and neck, thorax, and abdomen. The output of each image protocol was measured at treatment isocenter. For each site and protocol, adult body sizes ranging from anorexic to extremely obese were simulated since organ dose depends on patient size. Doses for all organs within the imaging field-of-view of each site were calculated for a single image acquisition from both of the orthogonal x-ray tubes. Results: Average organ doses were <1.0 mGy for every treatment site and imaging protocol. For a given organ, dose increases as kV increases or body size decreases. Higher doses are typically reported for skeletal components, such as the skull, ribs, or clavicles, than for soft-tissue organs. Typical organ doses due to a single exposure are estimated as 0.23 mGy to the brain, 0.29 mGy to the heart, 0.08 mGy to the kidneys, etc., depending on the imaging protocol and site. Conclusion: The organ doses vary with treatment site, imaging protocol and patient size. Although the organ dose from a single image acquisition resulting from two orthogonal beams is generally insignificant, the sum of repeated image acquisitions (>100) could reach 10–20 cGy for a typical treatment fraction.
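
    To make the closing arithmetic explicit, the short sketch below shows how sub-mGy per-acquisition organ doses (each acquisition already comprising both orthogonal tubes, per the abstract) accumulate over a fraction. The 150-acquisition count and the per-acquisition dose values are illustrative assumptions, not results from the study.

    ```python
    # Illustrative accumulation of imaging dose over repeated CyberKnife acquisitions.
    def cumulative_dose_cgy(dose_per_acquisition_mgy, n_acquisitions):
        """Total imaging dose in cGy (1 cGy = 10 mGy)."""
        return dose_per_acquisition_mgy * n_acquisitions / 10.0

    for per_acq_mgy in (0.23, 0.5, 1.0):            # hypothetical organ doses per acquisition
        total = cumulative_dose_cgy(per_acq_mgy, n_acquisitions=150)
        print(f"{per_acq_mgy} mGy x 150 acquisitions = {total:.1f} cGy")
    ```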

  6. Additive effects of LPL, APOA5 and APOE variant combinations on triglyceride levels and hypertriglyceridemia: results of the ICARIA genetic sub-study

    PubMed Central

    2010-01-01

    Background Hypertriglyceridemia (HTG) is a well-established independent risk factor for cardiovascular disease and the influence of several genetic variants in genes related with triglyceride (TG) metabolism has been described, including LPL, APOA5 and APOE. The combined analysis of these polymorphisms could produce clinically meaningful complementary information. Methods A subgroup of the ICARIA study comprising 1825 Spanish subjects (80% men, mean age 36 years) was genotyped for the LPL-HindIII (rs320), S447X (rs328), D9N (rs1801177) and N291S (rs268) polymorphisms, the APOA5-S19W (rs3135506) and -1131T/C (rs662799) variants, and the APOE polymorphism (rs429358; rs7412) using PCR and restriction analysis and TaqMan assays. We used regression analyses to examine their combined effects on TG levels (with the log-transformed variable) and the association of variant combinations with TG levels and hypertriglyceridemia (TG ≥ 1.69 mmol/L), including the covariates: gender, age, waist circumference, blood glucose, blood pressure, smoking and alcohol consumption. Results We found a significant lowering effect of the LPL-HindIII and S447X polymorphisms (p < 0.0001). In addition, the D9N, N291S, S19W and -1131T/C variants and the APOE-ε4 allele were significantly associated with an independent additive TG-raising effect (p < 0.05, p < 0.01, p < 0.001, p < 0.0001 and p < 0.001, respectively). Grouping individuals according to the presence of TG-lowering or TG-raising polymorphisms showed significant differences in TG levels (p < 0.0001), with the lowest levels exhibited by carriers of two lowering variants (10.2% reduction in TG geometric mean with respect to individuals who were homozygous for the frequent alleles of all the variants), and the highest levels in carriers of raising combinations (25.1% mean TG increase). Thus, carrying two lowering variants was protective against HTG (OR = 0.62; 95% CI, 0.39-0.98; p = 0.042) and having one single raising polymorphism (OR
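
    As a reminder of how odds ratios such as those quoted above are computed and bounded, the sketch below derives an OR and its Wald 95% confidence interval from a 2x2 table of carrier status versus hypertriglyceridemia. The counts and helper name are hypothetical; the study itself used multivariable regression with covariate adjustment, which this simple table does not reproduce.

    ```python
    import math

    def odds_ratio_ci(a, b, c, d):
        """OR and Wald 95% CI from a 2x2 table:
           a = carriers with HTG,     b = carriers without HTG,
           c = non-carriers with HTG, d = non-carriers without HTG."""
        or_ = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lower = math.exp(math.log(or_) - 1.96 * se_log_or)
        upper = math.exp(math.log(or_) + 1.96 * se_log_or)
        return or_, lower, upper

    # Hypothetical counts for carriers of two TG-lowering variants vs. others:
    print(odds_ratio_ci(a=30, b=270, c=180, d=1080))
    ```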

  7. High SO{sub 2} removal efficiency testing. Topical report - results of sodium formate additive tests at New York State Electric & Gas Corporation's Kintigh Station

    SciTech Connect

    Murphy, J.

    1997-02-14

    Tests were conducted at New York State Electric & Gas Corporation's (NYSEG's) Kintigh Station to evaluate options for achieving high sulfur dioxide (SO{sub 2}) removal efficiency in the wet limestone flue gas desulfurization (FGD) system. This test program was one of six conducted by the U.S. Department of Energy to evaluate low-capital-cost upgrades to existing FGD systems as a means for utilities to comply with the requirements of the 1990 Clean Air Act Amendments. The upgrade option tested at Kintigh was sodium formate additive. Results from the tests were used to calibrate the Electric Power Research Institute's (EPRI) FGD PRocess Integration and Simulation Model (FGDPRISM) to the Kintigh scrubber configuration. FGDPRISM was then used to predict system performance for evaluating conditions other than those tested. An economic evaluation was then done to determine the cost effectiveness of various high-efficiency upgrade options. These costs can be compared with the estimated market value of SO{sub 2} allowances or the expected costs of allowances generated by other means, such as fuel switching or new scrubbers, to arrive at the most cost-effective strategy for Clean Air Act compliance.

  8. Prevalence of sexual desire and satisfaction among patients with screen-detected diabetes and impact of intensive multifactorial treatment: Results from the ADDITION-Denmark study

    PubMed Central

    Giraldi, Annamaria; Kristensen, Ellids; Lauritzen, Torsten; Sandbæk, Annelli; Charles, Morten

    2015-01-01

    Abstract Objective. Sexual problems are common in people with diabetes. It is unknown whether early detection of diabetes and subsequent intensive multifactorial treatment (IT) are associated with sexual health. We report the prevalence of low sexual desire and low sexual satisfaction among people with screen-detected diabetes and compare the impact of intensive multifactorial treatment with the impact of routine care (RC) on these measures. Design. A cross-sectional analysis of the ADDITION-Denmark trial cohort six years post-diagnosis. Setting. 190 general practices around Denmark. Subjects. A total of 968 patients with screen-detected type 2 diabetes. Main outcome measures. Low sexual desire and low sexual satisfaction. Results. Mean (standard deviation, SD) age was 64.9 (6.9) years. The prevalence of low sexual desire was 53% (RC) and 54% (IT) among women, and 24% (RC) and 25% (IT) among men. The prevalence of low sexual satisfaction was 23% (RC) and 18% (IT) among women, and 27% (RC) and 37% (IT) among men. Among men, the prevalence of low sexual satisfaction was significantly higher in the IT group than in the RC group, p = 0.01. Conclusion. Low sexual desire and low satisfaction are frequent among men and women with screen-detected diabetes, and IT may negatively impact men's sexual satisfaction. PMID:25659194

  9. Accuracy and Precision in the Southern Hemisphere Additional Ozonesondes (SHADOZ) Dataset 1998-2000 in Light of the JOSIE-2000 Results

    NASA Technical Reports Server (NTRS)

    Witte, J. C.; Thompson, A. M.; Schmidlin, F. J.; Oltmans, S. J.; McPeters, R. D.; Smit, H. G. J.

    2003-01-01

    A network of 12 southern hemisphere tropical and subtropical stations in the Southern Hemisphere ADditional OZonesondes (SHADOZ) project has provided over 2000 profiles of stratospheric and tropospheric ozone since 1998. Balloon-borne electrochemical concentration cell (ECC) ozonesondes are used with standard radiosondes for pressure, temperature and relative humidity measurements. The archived data are available at http://croc.gsfc.nasa.gov/shadoz. In Thompson et al., accuracies and imprecisions in the SHADOZ 1998-2000 dataset were examined using ground-based instruments and the TOMS total ozone measurement (version 7) as references. Small variations in ozonesonde technique introduced possible biases from station to station. SHADOZ total ozone column amounts are now compared to version 8 TOMS; discrepancies between the two datasets are reduced by 2% on average. An evaluation of ozone variations among the stations is made using the results of a series of chamber simulations of ozone launches (JOSIE-2000, Juelich Ozonesonde Intercomparison Experiment) in which a standard reference ozone instrument was employed with the various sonde techniques used in SHADOZ. A number of variations in SHADOZ ozone data are explained when differences in solution strength, data processing and instrument type (manufacturer) are taken into account.

  10. High frequency transcutaneous electrical nerve stimulation with diphenidol administration results in an additive antiallodynic effect in rats following chronic constriction injury.

    PubMed

    Lin, Heng-Teng; Chiu, Chong-Chi; Wang, Jhi-Joung; Hung, Ching-Hsia; Chen, Yu-Wen

    2015-03-01

    The impact of coadministration of transcutaneous electrical nerve stimulation (TENS) and diphenidol is not well established. Here we estimated the effects of diphenidol in combination with TENS on mechanical allodynia and tumor necrosis factor-α (TNF-α) expression. Using an animal chronic constriction injury (CCI) model, each rat was assessed for mechanical sensitivity via von Frey hair stimulation and for TNF-α expression in the sciatic nerve using an ELISA assay. High frequency (100 Hz) TENS or intraperitoneal injection of diphenidol (2.0 μmol/kg) was applied daily, starting on postoperative day 1 (POD1) and lasting for the next 13 days. We demonstrated that both the high frequency TENS and diphenidol groups had an increase in mechanical withdrawal thresholds of 60%. Coadministration of high frequency TENS and diphenidol produced higher paw withdrawal thresholds than high frequency TENS alone or diphenidol alone. Both the diphenidol and the combined high frequency TENS with diphenidol groups showed a significant reduction of the TNF-α level in the sciatic nerve on POD7 compared with the CCI or HFS group (P<0.05), whereas the CCI or high frequency TENS group exhibited a higher TNF-α level than the sham group (P<0.05). Our data revealed that diphenidol alone, high frequency TENS alone, and the combination produced a reduction of neuropathic allodynia. Both diphenidol and the combination of diphenidol with high frequency TENS inhibited TNF-α expression. A moderately effective dose of diphenidol appeared to have an additive effect with high frequency TENS. Therefore, multidisciplinary treatments could be considered for this kind of mechanical allodynia. PMID:25596445

  11. Three-dimensional parametric mapping in quantitative micro-CT imaging of post-surgery femoral head-neck samples: preliminary results

    PubMed Central

    Giannotti, Stefano; Bottai, Vanna; Panetta, Daniele; De Paola, Gaia; Tripodi, Maria; Citarelli, Carmine; Dell’Osso, Giacomo; Lazzerini, Ilaria; Salvadori, Piero Antonio; Guido, Giulio

    2015-01-01

    Summary Osteoporosis and the pathologically increased occurrence of fractures are an important public health problem. They may affect patients’ quality of life and even increase mortality among osteoporotic patients, and consequently represent a heavy economic burden for national healthcare systems. The adoption of simple and inexpensive methods for mass screening of the population at risk may be the key to effective prevention. The current clinical standards of diagnosing osteoporosis and assessing the risk of an osteoporotic bone fracture include dual-energy X-ray absorptiometry (DXA) and quantitative computed tomography (QCT) for the measurement of bone mineral density (BMD). Micro-computed tomography (micro-CT) is a tomographic imaging technique with very high resolution allowing direct quantification of cancellous bone microarchitecture. The Authors performed micro-CT analysis of the femoral heads harvested from 8 patients who had undergone surgery for hip replacement for primary and secondary degenerative disease to identify possible new morphometric parameters based on the analysis of the distribution of intra-subject microarchitectural parameters through the creation of parametric images. Our results show that the micro-architectural metrics commonly used may not be sufficient for a realistic assessment of bone microarchitecture of the femoral head in patients with hip osteoarthritis. The innovative micro-CT approach considers the entire femoral head in its physiological shape with all its components, such as cartilage, the cortical layer and the trabecular region. The future use of these methods for a more detailed study of the reaction of trabecular bone to internal fixation or prostheses would be desirable. PMID:26811703

  12. Three-dimensional parametric mapping in quantitative micro-CT imaging of post-surgery femoral head-neck samples: preliminary results.

    PubMed

    Giannotti, Stefano; Bottai, Vanna; Panetta, Daniele; De Paola, Gaia; Tripodi, Maria; Citarelli, Carmine; Dell'Osso, Giacomo; Lazzerini, Ilaria; Salvadori, Piero Antonio; Guido, Giulio

    2015-01-01

    Osteoporosis and the pathologically increased occurrence of fractures are an important public health problem. They may affect patients' quality of life and even increase mortality among osteoporotic patients, and consequently represent a heavy economic burden for national healthcare systems. The adoption of simple and inexpensive methods for mass screening of the population at risk may be the key to effective prevention. The current clinical standards of diagnosing osteoporosis and assessing the risk of an osteoporotic bone fracture include dual-energy X-ray absorptiometry (DXA) and quantitative computed tomography (QCT) for the measurement of bone mineral density (BMD). Micro-computed tomography (micro-CT) is a tomographic imaging technique with very high resolution allowing direct quantification of cancellous bone microarchitecture. The Authors performed micro-CT analysis of the femoral heads harvested from 8 patients who had undergone surgery for hip replacement for primary and secondary degenerative disease to identify possible new morphometric parameters based on the analysis of the distribution of intra-subject microarchitectural parameters through the creation of parametric images. Our results show that the micro-architectural metrics commonly used may not be sufficient for a realistic assessment of bone microarchitecture of the femoral head in patients with hip osteoarthritis. The innovative micro-CT approach considers the entire femoral head in its physiological shape with all its components, such as cartilage, the cortical layer and the trabecular region. The future use of these methods for a more detailed study of the reaction of trabecular bone to internal fixation or prostheses would be desirable. PMID:26811703

  13. Computer quantitation of Q-T and terminal T wave (aT-eT) intervals during exercise: methodology and results in normal men.

    PubMed

    O'Donnell, J; Knoebel, S B; Lovelace, D E; McHenry, P L

    1981-05-01

    Computer-quantitated measurements of the Q-T intervals, the Q-T/Q-Tc ratio (Q-T/corrected Q-T) and the terminal T wave (apex to end of T [aT-eT] interval) were evaluated in resting and exercise electrocardiograms of 130 normal men with a mean age of 40 years. Pseudo-orthogonal, bipolar X, Y and Z axis leads were recorded during treadmill exercise testing, and 25 consecutive QRS-T complexes from standing rest and three exercise stages were computer-averaged. The Q-T intervals, Q-T/Q-Tc ratio and aT-eT interval measurements were then computed in the X and Z axis leads only, because the Y lead proved to be too noisy for accurate interpretation. A correlation coefficient of 0.9830 resulted between measurements made manually from the plotted, composite QRS-T complexes and those made by computer. No significant differences, in the paired sense, were found between any of the measurements made on the X axis lead and those made on the Z axis lead; however, the differences in the measurements remained constant across all stages of exercise. A Q-T/Q-Tc ratio of greater than 1.08, previously reported to be a reliable indicator of coronary disease, was observed in the majority of our normal subjects during exercise. Although the Q-T interval is substantially influenced by many factors, the aT-eT interval proved not to be age- or heart rate-dependent. It appears that the aT-eT interval can be measured with a high degree of reliability during exercise and it may prove to be a relatively specific indicator of repolarization alterations that occur with myocardial ischemia.

  14. The results with the addition of metronomic cyclophosphamide to palliative radiotherapy for the treatment of non-small cell lung carcinoma

    PubMed Central

    Joshi, Subhash Chandra; Pandey, Kailash Chandra; Rastogi, Madhup; Sharma, Mukesh; Gupta, Manoj

    2015-01-01

    Background A considerable proportion of non-small cell lung carcinoma (NSCLC) patients are ineligible for radical therapies. Many are too frail to tolerate intravenous palliative chemotherapy. These patients often receive palliative radiotherapy (RT) or supportive care alone. We aimed to compare outcomes with palliative RT alone versus palliative RT plus oral low-dose metronomic cyclophosphamide. Methods Data were mined from 139 eligible NSCLC patient records. Comparisons were made between 65 patients treated from January 2011 to March 2013 with palliative RT (20-30 Gray in 5-10 fractions) alone, and 74 patients treated from April 2013 to December 2014 with palliative RT plus oral metronomic cyclophosphamide (50 mg once daily from the day of initiation of RT until at least the day of disease progression). Response was assessed 1 month post-RT by computed tomography. Patients with complete or partial response were recorded as responders. For the determination of progression-free survival (PFS), progression was declared in case of an increase in size of lesions, development of new lesions, or development of effusions. The proportions of responders were compared with the Fisher exact test, and the PFS curves were compared with the log-rank test. Results Differences in response rates were statistically insignificant. The PFS was significantly higher when metronomic chemotherapy was added to RT in comparison to treatment with RT alone (mean PFS 3.1 vs. 2.55 months; P=0.0501). Further histological sub-group analysis revealed that the enhanced outcomes with the addition of metronomic cyclophosphamide to RT were limited to patients with adenocarcinoma histology (3.5 vs. 2.4 months; P=0.0053), while there was no benefit for those with squamous cell histology (2.6 vs. 2.6 months; P=1). At the dose of oral cyclophosphamide used, there was no recorded instance of any measurable hematological toxicity. Conclusions For pulmonary adenocarcinoma patients, the treatment

  15. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  16. Changes in physical activity and modelled cardiovascular risk following diagnosis of diabetes: 1-year results from the ADDITION-Cambridge trial cohort

    PubMed Central

    Barakat, A; Williams, K M; Prevost, A T; Kinmonth, A-L; Wareham, N J; Griffin, S J; Simmons, R K

    2013-01-01

    Aims To describe change in physical activity over 1 year and associations with change in cardiovascular disease risk factors in a population with screen-detected Type 2 diabetes. Methods Eight hundred and sixty-seven individuals with screen-detected diabetes underwent measurement of self-reported physical activity, cardiovascular disease risk factors and modelled cardiovascular disease risk at baseline and 1 year (n = 736) in the ADDITION-Cambridge trial. Multiple linear regression was used to quantify the association between change in different physical activity domains and cardiovascular disease risk factors at 1 year. Results There was no change in self-reported physical activity over 12 months. Even relatively large changes in physical activity were associated with relatively small changes in cardiovascular disease risk factors after allowing for changes in self-reported medication and diet. For every 30 metabolic equivalent-h increase in recreational activity (equivalent to 10 h of brisk walking per week), there was an average reduction of 0.1% in HbA1c in men (95% CI −0.15 to −0.01, P = 0.021) and an average reduction of 2 mmHg in systolic blood pressure in women (95% CI −4.0 to −0.05, P = 0.045). Conclusions Few associations were observed between change in different physical activity domains and cardiovascular disease risk factors in this trial cohort. Cardiovascular disease risk reduction appeared to be driven largely by factors other than changes in self-reported physical activity in the first year following diagnosis. PMID:22913463

  17. Changes in diet, cardiovascular risk factors and modelled cardiovascular risk following diagnosis of diabetes: 1-year results from the ADDITION-Cambridge trial cohort

    PubMed Central

    Savory, L A; Griffin, S J; Williams, K M; Prevost, A T; Kinmonth, A-L; Wareham, N J; Simmons, R K

    2014-01-01

    Aims To describe change in self-reported diet and plasma vitamin C, and to examine associations between change in diet and cardiovascular disease risk factors and modelled 10-year cardiovascular disease risk in the year following diagnosis of Type 2 diabetes. Methods Eight hundred and sixty-seven individuals with screen-detected diabetes underwent assessment of self-reported diet, plasma vitamin C, cardiovascular disease risk factors and modelled cardiovascular disease risk at baseline and 1 year (n = 736) in the ADDITION-Cambridge trial. Multivariable linear regression was used to quantify the association between change in diet and cardiovascular disease risk at 1 year, adjusting for change in physical activity and cardio-protective medication. Results Participants reported significant reductions in energy, fat and sodium intake, and increases in fruit, vegetable and fibre intake over 1 year. The reduction in energy was equivalent to an average-sized chocolate bar; the increase in fruit was equal to one plum per day. There was a small increase in plasma vitamin C levels. Increases in fruit intake and plasma vitamin C were associated with small reductions in anthropometric and metabolic risk factors. Increased vegetable intake was associated with an increase in BMI and waist circumference. Reductions in fat, energy and sodium intake were associated with reduction in HbA1c, waist circumference and total cholesterol/modelled cardiovascular disease risk, respectively. Conclusions Improvements in dietary behaviour in this screen-detected population were associated with small reductions in cardiovascular disease risk, independently of change in cardio-protective medication and physical activity. Dietary change may have a role to play in the reduction of cardiovascular disease risk following diagnosis of diabetes. PMID:24102972

  18. Quantitative analysis

    PubMed Central

    Nevin, John A.

    1984-01-01

    Quantitative analysis permits the isolation of invariant relations in the study of behavior. The parameters of these relations can serve as higher-order dependent variables in more extensive analyses. These points are illustrated by reference to quantitative descriptions of performance maintained by concurrent schedules, multiple schedules, and signal-detection procedures. Such quantitative descriptions of empirical data may be derived from mathematical theories, which in turn can lead to novel empirical analyses so long as their terms refer to behavioral and environmental events. Thus, quantitative analysis is an integral aspect of the experimental analysis of behavior. PMID:16812400
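
    As one concrete instance of the quantitative descriptions of concurrent-schedule performance mentioned above, the generalized matching law (a classic relation in this literature, though not named in the abstract) is shown below; the symbols follow the usual convention.

    ```latex
    % Generalized matching law (illustrative example; not named in the abstract):
    \[
      \log\!\left(\frac{B_1}{B_2}\right)
        \;=\; a \,\log\!\left(\frac{r_1}{r_2}\right) \;+\; \log b ,
    \]
    % where B_1, B_2 are response rates on the two alternatives, r_1, r_2 the
    % obtained reinforcement rates, a the sensitivity parameter, and b the bias.
    ```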

  19. Paleomagnetic intensity of Aso pyroclastic flows: Additional results with LTD-DHT Shaw method, Thellier method with pTRM-tail check

    NASA Astrophysics Data System (ADS)

    Maruuchi, T.; Shibuya, H.

    2009-12-01

    To calibrate the absolute value of the ’relative paleointensity variation curve’ drawn from sediment cores, Takai et al. (2002) proposed using pyroclastic flows associated with widespread tephras. The pyroclastic flows provide volcanic rocks carrying TRM, which allow absolute paleointensity to be determined, and the tephras provide the correlation with sediment stratigraphy. While 4 out of 6 pyroclastic flows are consistent with the Sint-800 paleointensity variation curve, two flows, Aso-2 and Aso-4, are respectively weaker and stronger than Sint-800 beyond the error. We revisited the paleointensity study of Aso pyroclastic flows, adding the LTD-DHT Shaw method, the pTRM-tail check in the Thellier experiment, and the LTD-DHT Shaw method applied to volcanic glasses. We prepared 11 specimens from 3 sites of the Aso-1 welded tuff for LTD-DHT Shaw method experiments, and obtained 6 paleointensities satisfying a set of strict criteria. They yield an average paleointensity of 21.3±5.8 µT, which is smaller than the 31.0±3.4 µT provided by Takai et al. (2002). For the Aso-2 welded tuff, 11 samples from 3 sites were submitted to Thellier experiments, and 6 passed a set of stringent criteria including the pTRM-tail check, which was not performed by Takai et al. (2002). They give an average paleointensity of 20.2±1.5 µT, which is virtually identical to the 20.2±1.0 µT (27 samples) given by Takai et al. (2002). Although the success rate was not good in the LTD-DHT Shaw method, 2 out of 12 specimens passed the criteria and gave 25.8±3.4 µT, which is consistent with Takai et al. (2002). In addition, we obtained a reliable paleointensity of 23.6 µT from a volcanic glass by the LTD-DHT Shaw method, which is also consistent with Takai et al. (2002). For the Aso-3 welded tuff, we have so far performed only the LTD-DHT Shaw method on one specimen from one site. It gives a paleointensity of 43.0 µT, which is higher than the 31.8±3.6 µT given by Takai et al. (2002). Eight sites were set for Aso-4 welded tuff

  20. Food additives

    PubMed Central

    Spencer, Michael

    1974-01-01

    Food additives are discussed from the food technology point of view. The reasons for their use are summarized: (1) to protect food from chemical and microbiological attack; (2) to even out seasonal supplies; (3) to improve their eating quality; (4) to improve their nutritional value. The various types of food additives are considered, e.g. colours, flavours, emulsifiers, bread and flour additives, preservatives, and nutritional additives. The paper concludes with consideration of those circumstances in which the use of additives is (a) justified and (b) unjustified. PMID:4467857

  1. Modification of the active layer/PEDOT:PSS interface by solvent additives resulting in improvement of the performance of organic solar cells.

    PubMed

    Synooka, Olesia; Kretschmer, Florian; Hager, Martin D; Himmerlich, Marcel; Krischok, Stefan; Gehrig, Dominik; Laquai, Frédéric; Schubert, Ulrich S; Gobsch, Gerhard; Hoppe, Harald

    2014-07-23

    The influence of various polar solvent additives with different dipole moments has been investigated since the performance of a photovoltaic device comprising a donor-acceptor copolymer (benzothiadiazole-fluorene-diketopyrrolopyrrole (BTD-F-DKPP)) and phenyl-C60-butyric acid methyl ester (PCBM) was notably increased. A common approach for controlling bulk heterojunction morphology and thereby improving the solar cell performance involves the use of solvent additives exhibiting boiling points higher than that of the surrounding solvent in order to allow the fullerene to aggregate during the host solvent evaporation and film solidification. In contrast to that, we report the application of polar solvent additives with widely varied dipole moments, where intentionally no dependence on their boiling points was applied. We found that an appropriate amount of the additive can improve all solar cell parameters. This beneficial effect could be largely attributed to a modification of the poly(3,4-ethylenedioxythiophene):poly(styrenesulfonate) (PEDOT:PSS)-active layer interface within the device layer stack, which was successfully reproduced for polymer solar cells based on the commonly used PCDTBT (poly[N-9'-heptadecanyl-2,7-carbazole-alt-5,5-(4',7'-di-2-thienyl-2',1',3'-benzothiadiazole)]) copolymer. PMID:24979240

  2. Need for a gender-sensitive human security framework: results of a quantitative study of human security and sexual violence in Djohong District, Cameroon

    PubMed Central

    2014-01-01

    Background Human security shifts traditional concepts of security from interstate conflict and the absence of war to the security of the individual. Broad definitions of human security include livelihoods and food security, health, psychosocial well-being, enjoyment of civil and political rights and freedom from oppression, and personal safety, in addition to absence of conflict. Methods In March 2010, we undertook a population-based health and livelihood study of female refugees from conflict-affected Central African Republic living in Djohong District, Cameroon and their female counterparts within the Cameroonian host community. Embedded within the survey instrument were indicators of human security derived from the Leaning-Arie model that defined three domains of psychosocial stability suggesting individuals and communities are most stable when their core attachments to home, community and the future are intact. Results While the female refugee human security outcomes describe a population successfully assimilated and thriving in their new environments based on these three domains, the ability of human security indicators to predict the presence or absence of lifetime and six-month sexual violence was inadequate. Using receiver operating characteristic (ROC) analysis, the study demonstrates that common human security indicators do not uncover either lifetime or recent prevalence of sexual violence. Conclusions These data suggest that current gender-blind approaches of describing human security are missing serious threats to the safety of one half of the population and that efforts to develop robust human security indicators should include those that specifically measure violence against women. PMID:24829613

  3. Evaluating the effectiveness of pasteurization for reducing human illnesses from Salmonella spp. in egg products: results of a quantitative risk assessment.

    PubMed

    Latimer, Heejeong K; Marks, Harry M; Coleman, Margaret E; Schlosser, Wayne D; Golden, Neal J; Ebel, Eric D; Kause, Janell; Schroeder, Carl M

    2008-02-01

    As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human illnesses from Salmonella from each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log(10) reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest increased pasteurization can make them safer.

  4. From provocative narrative scenarios to quantitative biophysical model results: Simulating plausible futures to 2070 in an urbanizing agricultural watershed in Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Booth, E.; Chen, X.; Motew, M.; Qiu, J.; Zipper, S. C.; Carpenter, S. R.; Kucharik, C. J.; Steven, L. I.

    2015-12-01

    Scenario analysis is a powerful tool for envisioning future social-ecological change and its consequences on human well-being. Scenarios that integrate qualitative storylines and quantitative biophysical models can create a vivid picture of these potential futures but the integration process is not straightforward. We present - using the Yahara Watershed in southern Wisconsin (USA) as a case study - a method for developing quantitative inputs (climate, land use/cover, and land management) to drive a biophysical modeling suite based on four provocative and contrasting narrative scenarios that describe plausible futures of the watershed to 2070. The modeling suite consists of an agroecosystem model (AgroIBIS-VSF), hydrologic routing model (THMB), and empirical lake water quality model and estimates several biophysical indicators to evaluate the watershed system under each scenario. These indicators include water supply, lake flooding, agricultural production, and lake water quality. Climate (daily precipitation and air temperature) for each scenario was determined using statistics from 210 different downscaled future climate projections for two 20-year time periods (2046-2065 and 2081-2100) and modified using a stochastic weather generator to allow flexibility for matching specific climate events within the scenario narratives. Land use/cover for each scenario was determined first by quantifying changes in areal extent every decade for 15 categories at the watershed scale to be consistent with the storyline events and theme. Next, these changes were spatially distributed using a rule-based framework based on land suitability metrics that determine transition probabilities. Finally, agricultural inputs including manure and fertilizer application rates were determined for each scenario based on the prevalence of livestock, water quality regulations, and technological innovations. Each scenario is compared using model inputs (maps and time-series of land use/cover and

  5. Additional correction for energy transfer efficiency calculation in filter-based Förster resonance energy transfer microscopy for more accurate results

    NASA Astrophysics Data System (ADS)

    Sun, Yuansheng; Periasamy, Ammasi

    2010-03-01

    Förster resonance energy transfer (FRET) microscopy is commonly used to monitor protein interactions with filter-based imaging systems, which require spectral bleedthrough (or cross talk) correction to accurately measure energy transfer efficiency (E). The double-label (donor + acceptor) specimen is excited at the donor wavelength; the acceptor emission provides the uncorrected FRET signal, and the donor emission (the donor channel) represents the quenched donor (qD), the basis for the E calculation. Our results indicate this is not the most accurate determination of the quenched donor signal, as it fails to consider the donor spectral bleedthrough (DSBT) signals in the qD for the E calculation; our new model addresses this, leading to a more accurate E result. This refinement improves E comparisons made with lifetime and spectral FRET imaging microscopy as shown here using several genetic (FRET standard) constructs, where cerulean and venus fluorescent proteins are tethered by different amino acid linkers.
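
    For orientation, the relation underlying the E calculation discussed above is the standard quenched-donor expression shown below; how the DSBT contribution is estimated and removed from qD is the subject of the paper's model and is not reproduced here.

    ```latex
    % Standard quenched-donor form of the FRET efficiency (the paper's specific
    % DSBT-corrected estimate of qD is not reproduced here):
    \[
      E \;=\; 1 \;-\; \frac{I_{\mathrm{DA}}}{I_{\mathrm{D}}}
        \;\equiv\; 1 \;-\; \frac{qD}{D_{\text{unquenched}}} ,
    \]
    % where I_DA (= qD) is the donor-channel intensity of the double-label
    % specimen and I_D the corresponding intensity of the unquenched donor.
    ```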

  6. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy results in a significant improvement in overall survival in patients with newly diagnosed mantle cell lymphoma: results of a randomized UK National Cancer Research Institute trial

    PubMed Central

    Rule, Simon; Smith, Paul; Johnson, Peter W.M.; Bolam, Simon; Follows, George; Gambell, Joanne; Hillmen, Peter; Jack, Andrew; Johnson, Stephen; Kirkwood, Amy A; Kruger, Anton; Pocock, Christopher; Seymour, John F.; Toncheva, Milena; Walewski, Jan; Linch, David

    2016-01-01

    Mantle cell lymphoma is an incurable and generally aggressive lymphoma that is more common in elderly patients. Whilst a number of different chemotherapeutic regimens are active in this disease, there is no established gold standard therapy. Rituximab has been used widely to good effect in B-cell malignancies but there is no evidence that it improves outcomes when added to chemotherapy in this disease. We performed a randomized, open-label, multicenter study looking at the addition of rituximab to the standard chemotherapy regimen of fludarabine and cyclophosphamide in patients with newly diagnosed mantle cell lymphoma. A total of 370 patients were randomized. With a median follow up of six years, rituximab improved the median progression-free survival from 14.9 to 29.8 months (P<0.001) and overall survival from 37.0 to 44.5 months (P=0.005). This equates to absolute differences of 9.0% and 22.1% for overall and progression-free survival, respectively, at two years. Overall response rates were similar, but complete response rates were significantly higher in the rituximab arm: 52.7% vs. 39.9% (P=0.014). There was no clinically significant additional toxicity observed with the addition of rituximab. Overall, approximately 18% of patients died of non-lymphomatous causes, most commonly infections. The addition of rituximab to fludarabine and cyclophosphamide chemotherapy significantly improves outcomes in patients with mantle cell lymphoma. However, these regimens have significant late toxicity and should be used with caution. This trial has been registered (ISRCTN81133184 and clinicaltrials.gov:00641095) and is supported by the UK National Cancer Research Network. PMID:26611473

  7. An Economic Evaluation of TENS in Addition to Usual Primary Care Management for the Treatment of Tennis Elbow: Results from the TATE Randomized Controlled Trial

    PubMed Central

    Lewis, Martyn; Chesterton, Linda S.; Sim, Julius; Mallen, Christian D.; Hay, Elaine M.; van der Windt, Daniëlle A.

    2015-01-01

    Background The TATE trial was a multicentre pragmatic randomized controlled trial of supplementing primary care management (PCM), consisting of a GP consultation followed by information and advice on exercises, with transcutaneous electrical nerve stimulation (TENS), to reduce pain intensity in patients with tennis elbow. This paper reports the health economic evaluation. Methods and Findings Adults with a new diagnosis of tennis elbow were recruited from 38 general practices in the UK, and randomly allocated to PCM (n = 120) or PCM plus TENS (n = 121). Outcomes included reduction in pain intensity and quality-adjusted life-years (QALYs) based on the EQ-5D and SF-6D. Two economic perspectives were evaluated: (i) healthcare, inclusive of NHS and private health costs for the tennis elbow; (ii) societal, healthcare costs plus productivity losses through work absenteeism. Mean outcome and cost differences between the groups were evaluated using a multiply imputed dataset as the base case evaluation, with uncertainty represented in cost-effectiveness planes and through probabilistic cost-effectiveness acceptability curves. Incremental healthcare cost was £33 (95%CI -40, 106) and societal cost -£65 (95%CI -307, 176) for PCM plus TENS. Mean differences in outcome were: 0.11 (95%CI -0.13, 0.35) for change in pain (0–10 pain scale); -0.015 (95%CI -0.058, 0.029) for QALY (EQ-5D); 0.007 (95%CI -0.022, 0.035) for QALY (SF-6D) (higher score differences denote greater benefit for PCM plus TENS). The ICER (incremental cost-effectiveness ratio) for the main evaluation of mean difference in societal cost (£) relative to mean difference in pain outcome was -582 (95%CI -8666, 8113). However, incremental ICERs show differences in cost-effectiveness of additional TENS, according to the outcome being evaluated. Conclusion Our findings do not provide evidence for or against the cost-effectiveness of TENS as an adjunct to primary care management of tennis elbow. PMID:26317528
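
    The ICER reported above is a simple ratio of mean differences; the sketch below shows the generic arithmetic with clearly hypothetical numbers (it does not reproduce the trial's imputation-based point estimates or sign conventions).

    ```python
    # Generic incremental cost-effectiveness ratio (ICER) arithmetic.
    def icer(delta_cost, delta_effect):
        """Incremental cost per unit of incremental benefit
        (e.g., GBP per 1-point pain reduction, or GBP per QALY gained)."""
        return delta_cost / delta_effect

    # Hypothetical values: GBP 50 extra cost for a 0.10-point extra pain reduction.
    print(f"ICER = GBP {icer(delta_cost=50.0, delta_effect=0.10):,.0f} per pain point")
    ```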

  8. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  9. Structure of transition-metal cluster compounds: Use of an additional orbital resulting from the f, g character of spd bond orbitals*

    PubMed Central

    Pauling, Linus

    1977-01-01

    A general theory of the structure of complexes of the transition metals is developed on the basis of the enneacovalence of the metals and the requirements of the electroneutrality principle. An extra orbital may be provided through the small but not negligible amount of f and g character of spd bond orbitals, and an extra electron or electron pair may be accepted in this orbital for a single metal or a cluster to neutralize the positive electric charge resulting from the partial ionic character of the bonds with ligands, such as the carbonyl group. Examples of cluster compounds of cobalt, ruthenium, rhodium, osmium, and gold are discussed. PMID:16592470

  10. The Laminar Flow Tube Reactor as a Quantitative Tool for Nucleation Studies: Experimental Results and Theoretical Analysis of Homogeneous Nucleation of Dibutylphthalate

    SciTech Connect

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    1999-12-01

    A Laminar Flow Tube Reactor has been designed and constructed in order to provide an accurate, quantitative measurement of a nucleation rate as a function of supersaturation and temperature. Measurements of nucleation of a supersaturated vapor of dibutylphthalate have been made for the temperature range from -30.3 °C to +19.1 °C. A thorough analysis of the possible sources of experimental uncertainties (such as defining the correct value of the initial vapor concentration, temperature boundary conditions on the reactor walls, accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and particle concentration measurement) has been provided. Both the isothermal and the isobaric nucleation rates have been measured. The experimental data obtained have been compared with measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. Theoretical analysis based on the first and the second nucleation theorems has been made. The critical cluster size and the excess of internal energy of the critical cluster have been obtained.

  11. The laminar flow tube reactor as a quantitative tool for nucleation studies: Experimental results and theoretical analysis of homogeneous nucleation of dibutylphthalate

    SciTech Connect

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    2000-09-01

    A laminar flow tube reactor was designed and constructed to provide an accurate, quantitative measurement of a nucleation rate as a function of supersaturation and temperature. Measurements of nucleation of a supersaturated vapor of dibutylphthalate have been made for the temperature range from -30.3 to +19.1 °C. A thorough analysis of the possible sources of experimental uncertainties (such as defining the correct value of the initial vapor concentration, temperature boundary conditions on the reactor walls, accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and particle concentration measurement) is given. Both the isothermal and the isobaric nucleation rates were measured. The experimental data obtained were compared with the measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. Theoretical analysis, based on the first and the second nucleation theorems, is also presented. The critical cluster size and the excess of internal energy of the critical cluster are obtained. (c) 2000 American Institute of Physics.
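
    For readers unfamiliar with how the critical cluster size is extracted from such rate measurements, a commonly quoted form of the first nucleation theorem is given below; the exact working expressions used by the authors (including the second theorem, which links the temperature dependence of J to the excess internal energy of the critical cluster) are not reproduced here.

    ```latex
    % First nucleation theorem (commonly quoted form): the slope of ln J versus
    % ln S at fixed temperature yields the critical cluster size n*.
    \[
      \left(\frac{\partial \ln J}{\partial \ln S}\right)_{T} \;\approx\; n^{*} + 1 .
    \]
    ```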

  12. Phosphazene additives

    DOEpatents

    Harrup, Mason K; Rollins, Harry W

    2013-11-26

    An additive comprising a phosphazene compound that has at least two reactive functional groups and at least one capping functional group bonded to phosphorus atoms of the phosphazene compound. One of the at least two reactive functional groups is configured to react with cellulose and the other of the at least two reactive functional groups is configured to react with a resin, such as an amine resin or a polycarboxylic acid resin. The at least one capping functional group is selected from the group consisting of a short chain ether group, an alkoxy group, or an aryloxy group. Also disclosed are an additive-resin admixture, a method of treating a wood product, and a wood product.

  13. Effect of Using Local Intrawound Vancomycin Powder in Addition to Intravenous Antibiotics in Posterior Lumbar Surgery: Midterm Result in a Single-Center Study

    PubMed Central

    Lee, Gun-Ill; Chun, Hyoung-Joon; Choi, Kyu-Sun

    2016-01-01

    Objective We conducted this study to report the efficacy of local application of vancomycin powder against surgical site infection (SSI) in posterior lumbar surgical procedures and to identify risk factors for SSI. Methods From February 2013 to December 2013, SSI rates were assessed for 275 posterior lumbar surgeries in which intrawound vancomycin powder was used in combination with intravenous antibiotics (Vanco group). These were compared with 296 posterior lumbar procedures performed with intravenous antibiotics only from February 2012 to December 2012 (non-Vanco group). Univariate and multivariate analyses were performed to identify risk factors for infection within the Vanco group. Results A statistically significant reduction in SSI was confirmed in the Vanco group (5.5%) compared with the non-Vanco group (10.5%) (p=0.028). Mean follow-up period was 8 months. The rate of acute staphylococcal SSI was also significantly lower, at 4% compared with 7.4% in the non-Vanco group (p=0.041). Deep staphylococcal infections decreased from 8 to 2, and deep methicillin-resistant Staphylococcus aureus infections decreased from 5 to 1. No systemic complication was observed. Statistically significant risk factors associated with SSI were diabetes mellitus, history of cardiovascular disease, length of hospital stay, number of instrumented levels and history of previous surgery. Conclusion In this series of 571 patients, intrawound vancomycin powder usage resulted in a significant decrease in SSI rates in our posterior lumbar surgical procedures. Patients at high risk of infection are strongly recommended as candidates for this technique. PMID:27437012

  14. Does early intensive multifactorial therapy reduce modelled cardiovascular risk in individuals with screen-detected diabetes? Results from the ADDITION-Europe cluster randomized trial

    PubMed Central

    Black, J A; Sharp, S J; Wareham, N J; Sandbæk, A; Rutten, G E H M; Lauritzen, T; Khunti, K; Davies, M J; Borch-Johnsen, K; Griffin, S J; Simmons, R K

    2014-01-01

    Aims Little is known about the long-term effects of intensive multifactorial treatment early in the diabetes disease trajectory. In the absence of long-term data on hard outcomes, we described change in 10-year modelled cardiovascular risk in the 5 years following diagnosis, and quantified the impact of intensive treatment on 10-year modelled cardiovascular risk at 5 years. Methods In a pragmatic, cluster-randomized, parallel-group trial in Denmark, the Netherlands and the UK, 3057 people with screen-detected Type 2 diabetes were randomized by general practice to receive (1) routine care of diabetes according to national guidelines (1379 patients) or (2) intensive multifactorial target-driven management (1678 patients). Ten-year modelled cardiovascular disease risk was calculated at baseline and 5 years using the UK Prospective Diabetes Study Risk Engine (version 3β). Results Among 2101 individuals with complete data at follow up (73.4%), 10-year modelled cardiovascular disease risk was 27.3% (sd 13.9) at baseline and 21.3% (sd 13.8) at 5-year follow-up (intensive treatment group difference –6.9, sd 9.0; routine care group difference –5.0, sd 12.2). Modelled 10-year cardiovascular disease risk was lower in the intensive treatment group compared with the routine care group at 5 years, after adjustment for baseline cardiovascular disease risk and clustering (–2.0; 95% CI –3.1 to –0.9). Conclusions Despite increasing age and diabetes duration, there was a decline in modelled cardiovascular disease risk in the 5 years following diagnosis. Compared with routine care, 10-year modelled cardiovascular disease risk was lower in the intensive treatment group at 5 years. Our results suggest that patients benefit from intensive treatment early in the diabetes disease trajectory, where the rate of cardiovascular disease risk progression may be slowed. PMID:24533664

  15. Association Between Single-Nucleotide Polymorphisms in Hormone Metabolism and DNA Repair Genes and Epithelial Ovarian Cancer: Results from Two Australian Studies and an Additional Validation Set

    PubMed Central

    Beesley, Jonathan; Jordan, Susan J.; Spurdle, Amanda B.; Song, Honglin; Ramus, Susan J.; Kjaer, Suzanne Kruger; Hogdall, Estrid; DiCioccio, Richard A.; McGuire, Valerie; Whittemore, Alice S.; Gayther, Simon A.; Pharoah, Paul D.P.; Webb, Penelope M.; Chenevix-Trench, Georgia

    2009-01-01

    Although some high-risk ovarian cancer genes have been identified, it is likely that common low-penetrance alleles exist that confer some increase in ovarian cancer risk. We have genotyped nine putative functional single-nucleotide polymorphisms (SNPs) in genes involved in steroid hormone synthesis (SRD5A2, CYP19A1, HSD17B1, and HSD17B4) and DNA repair (XRCC2, XRCC3, BRCA2, and RAD52) using two Australian ovarian cancer case-control studies, comprising a total of 1,466 cases and 1,821 controls of Caucasian origin. Genotype frequencies in cases and controls were compared using logistic regression. The only SNP we found to be associated with ovarian cancer risk in both of these two studies was SRD5A2 V89L (rs523349), which showed a significant trend of increasing risk per rare allele (P = 0.00002). We then genotyped another SNP in this gene (rs632148; r2 = 0.945 with V89L) in an attempt to validate this finding in an independent set of 1,479 cases and 2,452 controls from United Kingdom, United States, and Denmark. There was no association between rs632148 and ovarian cancer risk in the validation samples, and overall, there was no significant heterogeneity between the results of the five studies. Further analyses of SNPs in this gene are therefore warranted to determine whether SRD5A2 plays a role in ovarian cancer predisposition. PMID:18086758

  16. Quantitative Thinking.

    ERIC Educational Resources Information Center

    DuBridge, Lee A.

    An appeal for more research to determine how to educate children as effectively as possible is made. Mathematics teachers can readily examine the educational problems of today in their classrooms since learning progress in mathematics can easily be measured and evaluated. Since mathematics teachers have learned to think in quantitative terms and…

  17. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  18. Soil Greenhouse Gas Fluxes in a Pacific Northwest Douglas-Fir Forest: Results from a Soil Fertilization and Biochar Addition Experiment

    NASA Astrophysics Data System (ADS)

    Hawthorne, I.; Johnson, M. S.; Jassal, R. S.; Black, T. A.

    2013-12-01

    evacuated 12-mL vials and analyzed by gas chromatography. Chamber headspace GHG mixing ratios vs. time data were fit to linear and exponential models in R (Version 2.14.0) and fluxes were calculated. Results showed high variability in GHG fluxes over time in all treatments. Higher CO2 emissions were observed during early summer (119 μg CO2 m-2 s-1 in the control plots), decreasing with drought (19 μg CO2 m-2 s-1 in the control plots). CH4 uptake by soil increased during summer months from -0.004 μg CH4 m-2 s-1 to -0.089 μg CH4 m-2 s-1 in the control plots, in response to drying conditions in the upper soil profile. N2O was both consumed and emitted in all treatments, with fluxes ranging from -0.0009 to 0.0019 μg N2O m-2 s-1 in the control plots. Analysis of variance indicated that there were significant differences in GHG fluxes between treatments over time. We also investigated the potential effects of large volume headspace removal, and H2O vapour saturation leading to a dilution effect by using a closed-path infra-red gas analyzer with an inline humidity sensor.
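    As an illustration of the flux calculation described above, the sketch below fits a straight line to chamber headspace mixing-ratio data and converts the slope to a mass flux. It is a minimal example under assumed chamber dimensions, temperature and pressure; it is not the authors' R code, and the exponential-fit alternative mentioned in the abstract is omitted.

```python
# Minimal sketch (assumptions noted in the text): linear chamber-flux calculation.
import numpy as np

def linear_chamber_flux(t_s, c_umol_mol, vol_m3, area_m2, temp_k=293.15, pressure_pa=101325.0):
    """Flux in ug of gas m-2 s-1 from a linear fit of mixing ratio (umol/mol) vs time (s)."""
    M_CO2 = 44.01                                   # g mol-1, molar mass of CO2
    R = 8.314                                       # J mol-1 K-1
    slope = np.polyfit(t_s, c_umol_mol, 1)[0]       # umol mol-1 s-1
    n_air = pressure_pa * vol_m3 / (R * temp_k)     # mol of air in the chamber headspace
    return slope * n_air * M_CO2 / area_m2          # umol/s * g/mol = ug/s, divided by area

# Made-up readings taken every 30 s from a hypothetical 0.02 m3 chamber over 0.08 m2 of soil
t = np.array([0, 30, 60, 90, 120])
c = np.array([400.0, 412.5, 424.8, 437.6, 450.1])
print(round(linear_chamber_flux(t, c, vol_m3=0.02, area_m2=0.08), 1), "ug CO2 m-2 s-1")
```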

  19. Quantitative optical phase microscopy.

    PubMed

    Barty, A; Nugent, K A; Paganin, D; Roberts, A

    1998-06-01

    We present a new method for the extraction of quantitative phase data from microscopic phase samples by use of partially coherent illumination and an ordinary transmission microscope. The technique produces quantitative images of the phase profile of the sample without phase unwrapping. The technique is able to recover phase even in the presence of amplitude modulation, making it significantly more powerful than existing methods of phase microscopy. We demonstrate the technique by providing quantitatively correct phase images of well-characterized test samples and show that the results obtained for more-complex samples correlate with structures observed with Nomarski differential interference contrast techniques.
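    The phase recovery described here exploits propagation-induced intensity changes; in the widely used transport-of-intensity formulation (stated for context as an assumption about the specific algorithm, not quoted from the paper), the phase satisfies

    $$
    \nabla_{\perp}\cdot\bigl(I(\mathbf{r})\,\nabla_{\perp}\phi(\mathbf{r})\bigr) \;=\; -k\,\frac{\partial I(\mathbf{r})}{\partial z},
    \qquad k=\frac{2\pi}{\lambda},
    $$

    so measuring the intensity I in closely spaced defocused planes gives the longitudinal derivative on the right-hand side, and solving this elliptic equation returns the phase directly rather than modulo 2π, which is why no unwrapping step is needed.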

  20. Quantitative photoacoustic tomography

    PubMed Central

    Yuan, Zhen; Jiang, Huabei

    2009-01-01

    In this paper, several algorithms that allow for quantitative photoacoustic reconstruction of tissue optical, acoustic and physiological properties are described in a finite-element method based framework. These quantitative reconstruction algorithms are compared, and the merits and limitations associated with these methods are discussed. In addition, a multispectral approach is presented for concurrent reconstructions of multiple parameters including deoxyhaemoglobin, oxyhaemoglobin and water concentrations as well as acoustic speed. Simulation and in vivo experiments are used to demonstrate the effectiveness of the reconstruction algorithms presented. PMID:19581254
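    For context, the quantity available from the acoustic reconstruction is the initial pressure distribution, which in the standard model (a textbook relation, not quoted from the paper) couples the optical properties as

    $$
    p_{0}(\mathbf{r}) \;=\; \Gamma(\mathbf{r})\,\mu_{a}(\mathbf{r})\,\Phi\bigl(\mathbf{r};\mu_{a},\mu_{s}'\bigr),
    $$

    where Γ is the Grüneisen parameter, μ_a the optical absorption coefficient and Φ the local light fluence. Quantitative photoacoustic tomography amounts to inverting this coupled dependence, here with the fluence modelled by a finite-element light-transport solver and, in the multispectral case, with μ_a expanded over chromophore spectra such as oxy- and deoxyhaemoglobin and water.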

  1. Quantitative exfoliative cytology of abnormal oral mucosal smears.

    PubMed Central

    Cowpe, J G; Longmore, R B; Green, M W

    1988-01-01

    In this study quantitative techniques have been applied to smears collected from the buccal mucosa and floor of the mouth. The results display an encouraging success rate for identifying premalignant and malignant lesions. 'Intrapatient' normal smears provide a satisfactory control for comparison with pathological smears. Early results indicate that quantitative cytology could be of great value for monitoring and follow-up of suspicious lesions and provide an excellent additional diagnostic test for detecting early oral malignancy. PMID:3184106

  2. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  3. Incorporating patient preferences into drug development and regulatory decision making: Results from a quantitative pilot study with cancer patients, carers, and regulators.

    PubMed

    Postmus, D; Mavris, M; Hillege, H L; Salmonson, T; Ryll, B; Plate, A; Moulon, I; Eichler, H-G; Bere, N; Pignatti, F

    2016-05-01

    Currently, patient preference studies are not required to be included in marketing authorization applications to regulatory authorities, and the role and methodology for such studies have not been agreed upon. The European Medicines Agency (EMA) conducted a pilot study to gain experience on how the collection of individual preferences can inform the regulatory review. Using a short online questionnaire, ordinal statements regarding the desirability of different outcomes in the treatment of advanced cancer were elicited from 139 participants (98 regulators, 29 patients or carers, and 12 healthcare professionals). This was followed by face-to-face meetings to gather feedback and validate the individual responses. In this article we summarize the EMA pilot study and discuss the role of patient preference studies within the regulatory review. Based on the results, we conclude that our preference elicitation instrument was easy to implement and sufficiently precise to learn about the distribution of the participants' individual preferences. PMID:26715217

  4. Drugs, Women and Violence in the Americas: U.S. Quantitative Results of a Multi-Centric Pilot Project (Phase 2)

    PubMed Central

    González-Guarda, Rosa María; Peragallo, Nilda; Lynch, Ami; Nemes, Susanna

    2011-01-01

    Objectives To explore the collective and individual experiences that Latin American females in the U.S. have with substance abuse, violence and risky sexual behaviors. Methods This study was conducted in two phases from July 2006 to June 2007 in south Florida. This paper covers Phase 2. In Phase 2, questionnaires were provided to women to test whether there is a relationship between demographics, acculturation, depression, self-esteem and substance use/abuse; whether there is a relationship between demographics, acculturation, depression, self-esteem and violence exposure and victimization; whether there is a relationship between demographics, acculturation, depression, self-esteem, HIV knowledge and STD and HIV/AIDS risks among respondents; and whether there is a relationship between substance abuse, violence victimization and HIV/AIDS risks among respondents. Results Participants reported high rates of alcohol and drug abuse among their current or most recent partners. This is a major concern because partner alcohol use and drug use was related to partner physical, sexual and psychological abuse. Only two factors were associated with lifetime drug use: income and acculturation. Over half of the participants reported being victims of at least one form of abuse during childhood and adulthood. A substantial component of abuse reported during adulthood was perpetrated by a currently or recent intimate partner. Conclusions The results from this study suggest that substance abuse, violence and HIV should be addressed in an integrative and comprehensive manner. Recommendations for the development of policies, programs and services addressing substance abuse, violence and risk for HIV among Latinos are provided. PMID:22504304

  5. The effects of measurement error on previously reported mathematical relationships between indicator organism density and swimming-associated illness: a quantitative estimate of the resulting bias.

    PubMed

    Fleisher, J M

    1990-12-01

    Several recent epidemiological studies seeking associations between swimming in recreational waters contaminated with domestic sewage and increased illness among such swimmers have reported mathematical relationships relating indicator organism densities to illness among swimmers. Common to the design of all of these studies is the failure to adequately control for large amounts of measurement error contained in estimates of exposure, i.e. estimated indicator organism densities. The limited precision of current methods of indicator organism enumeration, coupled with temporal and spatial variation in indicator organism densities at the locations studied, is responsible for a substantial portion of this measurement error. The failure to control adequately for these sources of measurement error has resulted in a significant amount of bias being present in the mathematical relationships reported by these previously published epidemiological studies. In order to explore the magnitude and direction of this bias, computer simulations were conducted using data in which estimation of indicator organism density was obtained by the two most widely used techniques of enumeration: the Multiple Tube Fermentation Technique and the Membrane Filtration Technique. The results of the computer simulations show that the bias caused by this measurement error is non-differential, causing the mathematical relationships between indicator organism density and swimming-associated illness reported in previous epidemiological studies to underestimate true risk by a minimum of approximately 14%, and that this underestimate could range as high as approximately 30 to 57%. This study also demonstrates that substantial reduction of this bias can be easily accomplished by incorporating a formal water quality sampling strategy, based on statistical principles of experimental design and analysis, into the design of future epidemiological studies seeking mathematical relationships between indicator
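    To make the direction of the bias concrete, the following minimal Monte Carlo sketch (not the study's simulation; all numbers are arbitrary assumptions) shows how non-differential error in the exposure estimate attenuates a fitted slope toward zero, i.e. causes true risk to be underestimated.

```python
# Minimal sketch: regression dilution from noisy exposure measurements (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
true_beta, n, n_sims = 0.5, 500, 1000
fitted = []
for _ in range(n_sims):
    x_true = rng.normal(2.0, 0.5, n)            # "true" log indicator density
    x_obs = x_true + rng.normal(0.0, 0.5, n)    # enumeration + temporal/spatial sampling error
    y = 1.0 + true_beta * x_true + rng.normal(0.0, 0.3, n)   # illness outcome (toy model)
    fitted.append(np.polyfit(x_obs, y, 1)[0])   # slope estimated against the noisy exposure
print(f"true slope {true_beta}, mean fitted slope {np.mean(fitted):.2f}")
# With error variance equal to the true exposure variance, the fitted slope is roughly halved.
```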

  6. The CheMin XRD on the Mars Science Laboratory Rover Curiosity: Construction, Operation, and Quantitative Mineralogical Results from the Surface of Mars

    NASA Technical Reports Server (NTRS)

    Blake, David F.

    2015-01-01

    The Mars Science Laboratory mission was launched from Cape Canaveral, Florida on Nov. 26, 2011 and landed in Gale crater, Mars on Aug. 6, 2012. MSL's mission is to identify and characterize ancient "habitable" environments on Mars. MSL's precision landing system placed the Curiosity rover within 2 km of the center of its 20 x 6 km landing ellipse, next to Gale's central mound, a 5,000-meter-high pile of laminated sediment which may contain 1 billion years of Mars history. Curiosity carries with it a full suite of analytical instruments, including the CheMin X-ray diffractometer, the first XRD flown in space. CheMin is essentially a transmission X-ray pinhole camera. A fine-focus Co source and collimator transmit a 50 µm beam through a powdered sample held between X-ray transparent plastic windows. The sample holder is shaken by a piezoelectric actuator such that the powder flows like a liquid, each grain passing in random orientation through the beam over time. Forward-diffracted and fluoresced X-ray photons from the sample are detected by an X-ray sensitive Charge Coupled Device (CCD) operated in single photon counting mode. When operated in this way, both the x,y position and the energy of each photon are detected. The resulting energy-selected Co Kα Debye-Scherrer pattern is used to determine the identities and amounts of minerals present via Rietveld refinement, and a histogram of all X-ray events constitutes an X-ray fluorescence analysis of the sample. The key role that definitive mineralogy plays in understanding the Martian surface is a consequence of the fact that minerals are thermodynamic phases, having known and specific ranges of temperature, pressure and composition within which they are stable. More than simple compositional analysis, definitive mineralogical analysis can provide information about pressure/temperature conditions of formation, past climate, water activity and the like. Definitive mineralogical analyses are necessary to establish
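    As a reminder of the underlying geometry (standard relations, not quoted from the abstract), each Debye-Scherrer ring recorded on the CCD corresponds to a lattice spacing d through Bragg's law,

    $$
    n\lambda \;=\; 2d\sin\theta, \qquad \lambda_{\mathrm{Co\,K\alpha}} \approx 1.79\ \mathrm{\AA},
    $$

    so the radial position of a ring fixes 2θ and hence d, and the complete set of d-spacings and intensities is what the Rietveld refinement fits in order to identify and quantify the mineral phases.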

  7. Consumption of Antimicrobials in Pigs, Veal Calves, and Broilers in The Netherlands: Quantitative Results of Nationwide Collection of Data in 2011

    PubMed Central

    Bos, Marian E. H.; Taverne, Femke J.; van Geijlswijk, Ingeborg M.; Mouton, Johan W.; Mevius, Dik J.; Heederik, Dick J. J.

    2013-01-01

    In 2011, Dutch animal production sectors started recording veterinary antimicrobial consumption. These data are used by the Netherlands Veterinary Medicines Authority to create transparency in and define benchmark indicators for veterinary consumption of antimicrobials. This paper presents the results of sector wide consumption of antimicrobials, in the form of prescriptions or deliveries, for all pig, veal calf, and broiler farms. Data were used to calculate animal defined daily dosages per year (ADDD/Y) per pig or veal calf farm. For broiler farms, number of animal treatment days per year was calculated. Furthermore, data were used to calculate the consumption of specific antimicrobial classes per administration route per pig or veal calf farm. The distribution of antimicrobial consumption per farm varied greatly within and between farm categories. All categories, except for rosé starter farms, showed a highly right skewed distribution with a long tail. Median ADDD/Y values varied from 1.2 ADDD/Y for rosé finisher farms to 83.2 ADDD/Y for rosé starter farms, with 28.6 ADDD/Y for white veal calf farms. Median consumption in pig farms was 9.3 ADDD/Y for production pig farms and 3.0 ADDD/Y for slaughter pig farms. Median consumption in broiler farms was 20.9 ATD/Y. Regarding specific antimicrobial classes, fluoroquinolones were mainly used on veal calf farms, but in low quantities: P75 range was 0 – 0.99 ADDD/Y, and 0 – 0.04 ADDD/Y in pig farms. The P75 range for 3rd/4th-generation cephalosporins was 0 – 0.07 ADDD/Y for veal calf farms, and 0 – 0.1 ADDD/Y for pig farms. The insights obtained from these results, and the full transparency obtained by monitoring antimicrobial consumption per farm, will help reduce antimicrobial consumption and endorse antimicrobial stewardship. The wide and skewed distribution in consumption has important practical and methodological implications for benchmarking, surveillance and future analysis of trends. PMID:24204857
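    The sketch below illustrates the general idea behind an ADDD/Y-type indicator: how many days per year the average animal present on a farm could be treated with the quantities delivered. It is a simplified, assumption-laden example (hypothetical dose standards and farm data), not the Netherlands Veterinary Medicines Authority's actual algorithm.

```python
# Minimal sketch of an ADDD/Y-style calculation (illustrative assumptions only).
def addd_per_year(deliveries, avg_kg_animal_present):
    """deliveries: list of (mg of active substance delivered in the year,
    defined daily dose in mg per kg animal per day)."""
    treated_kg_days = sum(mg / ddd for mg, ddd in deliveries)   # kg of animal treatable for one day
    return treated_kg_days / avg_kg_animal_present              # treatment days per average animal per year

# e.g. two deliveries of an oral product to a hypothetical farm with ~1000 pigs of 70 kg each
deliveries = [(5_000_000, 10.0), (2_500_000, 10.0)]             # (mg delivered, mg/kg/day standard)
print(round(addd_per_year(deliveries, avg_kg_animal_present=1000 * 70), 1), "ADDD/Y")
```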

  8. Consumption of antimicrobials in pigs, veal calves, and broilers in the Netherlands: quantitative results of nationwide collection of data in 2011.

    PubMed

    Bos, Marian E H; Taverne, Femke J; van Geijlswijk, Ingeborg M; Mouton, Johan W; Mevius, Dik J; Heederik, Dick J J

    2013-01-01

    In 2011, Dutch animal production sectors started recording veterinary antimicrobial consumption. These data are used by the Netherlands Veterinary Medicines Authority to create transparency in and define benchmark indicators for veterinary consumption of antimicrobials. This paper presents the results of sector wide consumption of antimicrobials, in the form of prescriptions or deliveries, for all pig, veal calf, and broiler farms. Data were used to calculate animal defined daily dosages per year (ADDD/Y) per pig or veal calf farm. For broiler farms, number of animal treatment days per year was calculated. Furthermore, data were used to calculate the consumption of specific antimicrobial classes per administration route per pig or veal calf farm. The distribution of antimicrobial consumption per farm varied greatly within and between farm categories. All categories, except for rosé starter farms, showed a highly right skewed distribution with a long tail. Median ADDD/Y values varied from 1.2 ADDD/Y for rosé finisher farms to 83.2 ADDD/Y for rosé starter farms, with 28.6 ADDD/Y for white veal calf farms. Median consumption in pig farms was 9.3 ADDD/Y for production pig farms and 3.0 ADDD/Y for slaughter pig farms. Median consumption in broiler farms was 20.9 ATD/Y. Regarding specific antimicrobial classes, fluoroquinolones were mainly used on veal calf farms, but in low quantities: P75 range was 0 - 0.99 ADDD/Y, and 0 - 0.04 ADDD/Y in pig farms. The P75 range for 3rd/4th-generation cephalosporins was 0 - 0.07 ADDD/Y for veal calf farms, and 0 - 0.1 ADDD/Y for pig farms. The insights obtained from these results, and the full transparency obtained by monitoring antimicrobial consumption per farm, will help reduce antimicrobial consumption and endorse antimicrobial stewardship. The wide and skewed distribution in consumption has important practical and methodological implications for benchmarking, surveillance and future analysis of trends. PMID:24204857

  9. Altered levels of the Taraxacum kok-saghyz (Russian dandelion) small rubber particle protein, TkSRPP3, result in qualitative and quantitative changes in rubber metabolism.

    PubMed

    Collins-Silva, Jillian; Nural, Aise Taban; Skaggs, Amanda; Scott, Deborah; Hathwaik, Upul; Woolsey, Rebekah; Schegg, Kathleen; McMahan, Colleen; Whalen, Maureen; Cornish, Katrina; Shintani, David

    2012-07-01

    Several proteins have been identified and implicated in natural rubber biosynthesis, one of which, the small rubber particle protein (SRPP), was originally identified in Hevea brasiliensis as an abundant protein associated with cytosolic vesicles known as rubber particles. While previous in vitro studies suggest that SRPP plays a role in rubber biosynthesis, in vivo evidence is lacking to support this hypothesis. To address this issue, a transgene approach was taken in Taraxacum kok-saghyz (Russian dandelion or Tk) to determine if altered SRPP levels would influence rubber biosynthesis. Three dandelion SRPPs were found to be highly abundant on dandelion rubber particles. The most abundant particle associated SRPP, TkSRPP3, showed temporal and spatial patterns of expression consistent with patterns of natural rubber accumulation in dandelion. To confirm its role in rubber biosynthesis, TkSRPP3 expression was altered in Russian dandelion using over-expression and RNAi methods. While TkSRPP3 over-expressing lines had slightly higher levels of rubber in their roots, relative to the control, TkSRPP3 RNAi lines showed significant decreases in root rubber content and produced dramatically lower molecular weight rubber than the control line. Not only do results here provide in vivo evidence of TkSRPP proteins affecting the amount of rubber in dandelion root, but they also suggest a function in regulating the molecular weight of the cis-1, 4-polyisoprene polymer.

  10. Hand-to-mouth contacts result in greater ingestion of feces than dietary water consumption in Tanzania: a quantitative fecal exposure assessment model.

    PubMed

    Mattioli, Mia Catharine M; Davis, Jennifer; Boehm, Alexandria B

    2015-02-01

    Diarrheal diseases kill 1800 children under the age of five each day, and nearly half of these deaths occur in sub-Saharan Africa. Contaminated drinking water and hands are two important environmental transmission routes of diarrhea-causing pathogens to young children in low-income countries. The objective of this research is to evaluate the relative contribution of these two major exposure pathways in a low-income country setting. A Monte Carlo simulation was used to model the amount of human feces ingested by children under five years old from exposure via hand-to-mouth contacts and stored drinking water ingestion in Bagamoyo, Tanzania. Child-specific exposure data were obtained from the USEPA 2011 Exposure Factors Handbook, and fecal contamination was estimated using hand rinse and stored water fecal indicator bacteria concentrations from over 1200 Tanzanian households. The model outcome is a distribution of a child's daily dose of feces via each exposure route. The model results show that Tanzanian children ingest a significantly greater amount of feces each day from hand-to-mouth contacts than from drinking water, which may help elucidate why interventions focused on water without also addressing hygiene often see little to no effect on reported incidence of diarrhea.
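    The following is a minimal sketch of the general modelling approach described above; it is not the authors' model, and every distribution, parameter and the indicator-to-faeces conversion factor is an illustrative assumption.

```python
# Minimal Monte Carlo sketch: daily grams of faeces ingested via two routes (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
EC_PER_G_FAECES = 1e7                                     # assumed E. coli CFU per gram of faeces

# Hand-to-mouth route
hand_conc = rng.lognormal(mean=2.0, sigma=1.5, size=N)    # CFU per hand rinse (assumed)
frac_per_contact = rng.uniform(0.05, 0.20, size=N)        # fraction of hand loading ingested per contact
contacts_per_day = rng.poisson(20, size=N)
hand_dose_g = hand_conc * frac_per_contact * contacts_per_day / EC_PER_G_FAECES

# Stored drinking-water route
water_conc = rng.lognormal(mean=1.0, sigma=1.8, size=N)   # CFU per 100 mL stored water (assumed)
water_intake_ml = rng.normal(500, 150, size=N).clip(min=0)
water_dose_g = water_conc * (water_intake_ml / 100.0) / EC_PER_G_FAECES

print(f"median hand-to-mouth dose : {np.median(hand_dose_g):.2e} g/day")
print(f"median drinking-water dose: {np.median(water_dose_g):.2e} g/day")
```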

  11. POPULATION DYNAMICS OF COTTON RATS ACROSS A LANDSCAPE MANIPULATED BY NITROGEN ADDITIONS AND ENCLOSURE FENCING

    EPA Science Inventory

    Nitrogen additions in grasslands have produced qualitative and quantitative changes in vegetation resulting in an increase in biomass and decrease in plant species diversity. As with plants, we theorize that animal communities will decrease in species richness and become dominat...

  12. Quantitative aspects of septicemia.

    PubMed Central

    Yagupsky, P; Nolte, F S

    1990-01-01

    For years, quantitative blood cultures found only limited use as aids in the diagnosis and management of septic patients because the available methods were cumbersome, labor intensive, and practical only for relatively small volumes of blood. The development and subsequent commercial availability of lysis-centrifugation direct plating methods for blood cultures have addressed many of the shortcomings of the older methods. The lysis-centrifugation method has demonstrated good performance relative to broth-based blood culture methods. As a result, quantitative blood cultures have found widespread use in clinical microbiology laboratories. Most episodes of clinically significant bacteremia in adults are characterized by low numbers of bacteria per milliliter of blood. In children, the magnitude of bacteremia is generally much higher, with the highest numbers of bacteria found in the blood of septic neonates. The magnitude of bacteremia correlates with the severity of disease in children and with mortality rates in adults, but other factors play more important roles in determining the patient's outcome. Serial quantitative blood cultures have been used to monitor the in vivo efficacy of antibiotic therapy in patients with slowly resolving sepsis, such as disseminated Mycobacterium avium-M. intracellulare complex infections. Quantitative blood culture methods were used in early studies of bacterial endocarditis, and the results significantly contributed to our understanding of the pathophysiology of this disease. Comparison of paired quantitative blood cultures obtained from a peripheral vein and the central venous catheter has been used to help identify patients with catheter-related sepsis and is the only method that does not require removal of the catheter to establish the diagnosis. Quantitation of bacteria in the blood can also help distinguish contaminated from truly positive blood cultures; however, no quantitative criteria can invariably differentiate

  13. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  14. Berkeley Quantitative Genome Browser

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered, or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.

  15. Berkeley Quantitative Genome Browser

    SciTech Connect

    Hechmer, Aaron

    2008-02-29

    The Berkeley Quantitative Genome Browser provides graphical browsing functionality for genomic data organized, at a minimum, by sequence and position. While supporting the annotation browsing features typical of many other genomic browsers, additional emphasis is placed on viewing and utilizing quantitative data. Data may be read from GFF, SGR, FASTA or any column delimited format. Once the data has been read into the browser's buffer, it may be searched, filtered, or subjected to mathematical transformation. The browser also supplies some graphical design manipulation functionality geared towards preparing figures for presentations or publication. A plug-in mechanism enables development outside the core functionality that adds more advanced or esoteric analysis capabilities. BBrowse's development and distribution is open-source and has been built to run on Linux, OSX and MS Windows operating systems.
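    As a concrete illustration of the column-delimited inputs mentioned above, the sketch below parses one GFF-style record into the fields a browser of this kind would index (sequence, position, score). It follows the generic GFF column layout and is not taken from BBrowse's own code.

```python
# Minimal sketch: parse one tab-delimited GFF-style line (generic layout, illustrative only).
def parse_gff_line(line: str) -> dict:
    cols = line.rstrip("\n").split("\t")
    return {
        "seqid": cols[0], "source": cols[1], "feature": cols[2],
        "start": int(cols[3]), "end": int(cols[4]),
        "score": None if cols[5] == "." else float(cols[5]),   # quantitative track value
        "strand": cols[6], "frame": cols[7],
        "attributes": cols[8] if len(cols) > 8 else "",
    }

record = parse_gff_line("chr1\texample\tgene\t1300\t9000\t25.0\t+\t.\tID=gene00001")
print(record["seqid"], record["start"], record["end"], record["score"])
```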

  16. Unraveling Additive from Nonadditive Effects Using Genomic Relationship Matrices

    PubMed Central

    Muñoz, Patricio R.; Resende, Marcio F. R.; Gezan, Salvador A.; Resende, Marcos Deon Vilela; de los Campos, Gustavo; Kirst, Matias; Huber, Dudley; Peter, Gary F.

    2014-01-01

    The application of quantitative genetics in plant and animal breeding has largely focused on additive models, which may also capture dominance and epistatic effects. Partitioning genetic variance into its additive and nonadditive components using pedigree-based models (pedigree-based best linear unbiased prediction, P-BLUP) is difficult with most commonly available family structures. However, the availability of dense panels of molecular markers makes possible the use of additive- and dominance-realized genomic relationships for the estimation of variance components and the prediction of genetic values (G-BLUP). We evaluated height data from a multifamily population of the tree species Pinus taeda with a systematic series of models accounting for additive, dominance, and first-order epistatic interactions (additive by additive, dominance by dominance, and additive by dominance), using either pedigree- or marker-based information. We show that, compared with the pedigree, use of realized genomic relationships in marker-based models yields a substantially more precise separation of additive and nonadditive components of genetic variance. We conclude that the marker-based relationship matrices in a model including additive and nonadditive effects performed better, improving breeding value prediction. Moreover, our results suggest that, for tree height in this population, the additive and nonadditive components of genetic variance are similar in magnitude. This novel result improves our current understanding of the genetic control and architecture of a quantitative trait and should be considered when developing breeding strategies. PMID:25324160
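    For readers unfamiliar with realized genomic relationships, the sketch below builds an additive relationship matrix in the spirit of VanRaden's first method, the kind of matrix commonly plugged into G-BLUP. It is a generic illustration with toy genotype data, not the study's own pipeline, and the dominance and epistatic matrices used in the paper are not shown.

```python
# Minimal sketch: additive genomic relationship matrix (VanRaden-style scaling), toy data only.
import numpy as np

def additive_G(geno: np.ndarray) -> np.ndarray:
    """geno: (individuals x markers) matrix coded as 0/1/2 copies of the counted allele."""
    p = geno.mean(axis=0) / 2.0                 # observed allele frequencies per marker
    Z = geno - 2.0 * p                          # centre each marker column
    denom = 2.0 * np.sum(p * (1.0 - p))         # VanRaden scaling factor
    return Z @ Z.T / denom

rng = np.random.default_rng(42)
geno = rng.integers(0, 3, size=(5, 200)).astype(float)   # 5 individuals, 200 SNPs (toy)
G = additive_G(geno)
print(np.round(np.diag(G), 2))                  # diagonal of G (relates to genomic inbreeding for real data)
```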

  17. Human Brain Atlas-based Multimodal MRI Analysis of Volumetry, Diffusimetry, Relaxometry and Lesion Distribution in Multiple Sclerosis Patients and Healthy Adult Controls: Implications for understanding the Pathogenesis of Multiple Sclerosis and Consolidation of Quantitative MRI Results in MS

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Datta, Sushmita; Wolinsky, Jerry S.; Narayana, Ponnada A.

    2011-01-01

    Multiple sclerosis (MS) is the most common immune-mediated disabling neurological disease of the central nervous system. The pathogenesis of MS is not fully understood. Histopathology implicates both demyelination and axonal degeneration as the major contributors to the accumulation of disability. The application of several in vivo quantitative magnetic resonance imaging (MRI) methods to both lesioned and normal-appearing brain tissue has not yet provided a solid conclusive support of the hypothesis that MS might be a diffuse disease. In this work, we adopted FreeSurfer to provide standardized macrostructure or volumetry of lesion free normal-appearing brain tissue in combination with multiple quantitative MRI metrics (T2 relaxation time, diffusion tensor anisotropy and diffusivities) that characterize tissue microstructural integrity. By incorporating a large number of healthy controls, we have attempted to separate the natural age-related change from the disease-induced effects. Our work shows elevation in diffusivity and relaxation times and reduction in volume in a number of normal-appearing white matter and gray matter structures in relapsing-remitting multiple sclerosis patients. These changes were related in part with the spatial distribution of lesions. The whole brain lesion load and age-adjusted expanded disability status score showed strongest correlations in regions such as corpus callosum with qMRI metrics that are believed to be specific markers of axonal dysfunction, consistent with histologic data of others indicating axonal loss that is independent of focal lesions. Our results support that MS at least in part has a neurodegenerative component. PMID:21978603

  18. The NIST Quantitative Infrared Database

    PubMed Central

    Chu, P. M.; Guenther, F. R.; Rhoderick, G. C.; Lafferty, W. J.

    1999-01-01

    With the recent developments in Fourier transform infrared (FTIR) spectrometers it is becoming more feasible to place these instruments in field environments. As a result, there has been an enormous increase in the use of FTIR techniques for a variety of qualitative and quantitative chemical measurements. These methods offer the possibility of fully automated real-time quantitation of many analytes; therefore FTIR has great potential as an analytical tool. Recently, the U.S. Environmental Protection Agency (U.S.EPA) has developed protocol methods for emissions monitoring using both extractive and open-path FTIR measurements. Depending upon the analyte, the experimental conditions and the analyte matrix, approximately 100 of the hazardous air pollutants (HAPs) listed in the 1990 U.S.EPA Clean Air Act amendment (CAAA) can be measured. The National Institute of Standards and Technology (NIST) has initiated a program to provide quality-assured infrared absorption coefficient data based on NIST prepared primary gas standards. Currently, absorption coefficient data have been acquired for approximately 20 of the HAPs. For each compound, the absorption coefficient spectrum was calculated using nine transmittance spectra at 0.12 cm⁻¹ resolution and the Beer's law relationship. The uncertainties in the absorption coefficient data were estimated from the linear regressions of the transmittance data and considerations of other error sources such as the nonlinear detector response. For absorption coefficient values greater than 1 × 10⁻⁴ (μmol/mol)⁻¹ m⁻¹ the average relative expanded uncertainty is 2.2 %. This quantitative infrared database is currently an ongoing project at NIST. Additional spectra will be added to the database as they are acquired. Our current plans include continued data acquisition of the compounds listed in the CAAA, as well as the compounds that contribute to global warming and ozone depletion.
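    For context, the absorption coefficient reported by the database follows from the Beer's law relationship mentioned above; assuming the decadic (base-10) convention consistent with the quoted units,

    $$
    \alpha(\tilde{\nu}) \;=\; \frac{-\log_{10} T(\tilde{\nu})}{C\,L},
    $$

    where T is the measured transmittance, C the analyte amount-of-substance fraction in μmol/mol, and L the optical path length in metres, so that α carries the units (μmol/mol)⁻¹ m⁻¹. Regressing absorbance against the product C·L across the nine transmittance spectra yields both the coefficient and the regression component of its uncertainty.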

  19. Quantitative image analysis of synovial tissue.

    PubMed

    van der Hall, Pascal O; Kraan, Maarten C; Tak, Paul Peter

    2007-01-01

    Quantitative image analysis is a form of imaging that includes microscopic histological quantification, video microscopy, image analysis, and image processing. Hallmarks are the generation of reliable, reproducible, and efficient measurements via strict calibration and step-by-step control of the acquisition, storage and evaluation of images with dedicated hardware and software. Major advantages of quantitative image analysis over traditional techniques include sophisticated calibration systems, interaction, speed, and control of inter- and intraobserver variation. This results in a well-controlled environment, which is essential for quality control and reproducibility, and helps to optimize sensitivity and specificity. To achieve this, an optimal quantitative image analysis system combines solid software engineering with easy interactivity with the operator. Moreover, the system also needs to be as transparent as possible in generating the data because a "black box design" will deliver uncontrollable results. In addition to these more general aspects, specifically for the analysis of synovial tissue the necessity of interactivity is highlighted by the added value of identification and quantification of information as present in areas such as the intimal lining layer, blood vessels, and lymphocyte aggregates. Speed is another important aspect of digital cytometry. Currently, rapidly increasing numbers of samples, together with the accumulation of a variety of markers and detection techniques, have made the use of traditional analysis techniques such as manual quantification and semi-quantitative analysis impractical. It can be anticipated that the development of even more powerful computer systems with sophisticated software will further facilitate reliable analysis at high speed.

  20. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.

  1. Acid-base titrations by stepwise addition of equal volumes of titrant with special reference to automatic titrations-III Presentation of a fully automatic titration apparatus and of results supporting the theories given in the preceding parts.

    PubMed

    Pehrsson, L; Ingman, F

    1977-02-01

    This paper forms Part III of a series in which the first two parts describe methods for evaluating titrations performed by stepwise addition of equal volumes of titrant. The great advantage of these methods is that they do not require an accurate calibration of the electrode system. This property makes the methods very suitable for routine work, e.g. in automatic analysis. An apparatus for performing such titrations automatically is presented. Further, results of titrations of monoprotic acids, a diprotic acid, an ampholyte, a mixture of an acid with its conjugate base, and mixtures of two acids with a small difference between the stability constants are given. Most of these titrations cannot be evaluated by the Gran or Hofstee methods but yield results having errors of the order of 0.1% if the methods proposed in Parts I and II of this series are employed. The advantages of the method of stepwise addition of equal volumes of titrant combined with the proposed evaluation methods, in comparison with common methods such as titration to a preset pH, are that all the data are used in the evaluation, permitting a statistical treatment and giving better possibilities for tracing systematic errors.

  2. 21 CFR 570.18 - Tolerances for related food additives.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... available methods that permit quantitative determination of the amount of each food additive present or... present in or on a food and there are available methods that permit quantitative determination of...

  3. Smog control fuel additives

    SciTech Connect

    Lundby, W.

    1993-06-29

    A method is described of controlling, reducing, or eliminating ozone and related smog resulting from photochemical reactions between ozone and automotive or industrial gases comprising the addition of iodine or compounds of iodine to hydrocarbon-base fuels prior to or during combustion in an amount of about 1 part iodine per 240 to 10,000,000 parts fuel, by weight, to be accomplished by: (a) the addition of these inhibitors during or after the refining or manufacturing process of liquid fuels; (b) the production of these inhibitors for addition into fuel tanks, such as automotive or industrial tanks; or (c) the addition of these inhibitors into combustion chambers of equipment utilizing solid fuels for the purpose of reducing ozone.

  4. [A novel quantitative PCR with fluorogenic probe].

    PubMed

    Isono, K

    1997-03-01

    The polymerase chain reaction (PCR) is a powerful tool to amplify small amounts of DNA or RNA for various molecular analyses. However, in these analyses, PCR only provides qualitative results. The availability of quantitative PCR provides valuable additional information in various applications. It is difficult to establish absolute quantitation, because PCR amplification is a complicated reaction process of exponential growth. To trace the amplification process, the initial amount of template and the efficiency of amplification in each cycle have to be determined. Conventional methods have not achieved absolute quantitative analysis. The ABI PRISM 7700 Sequence Detection System has solved these problems with real-time monitoring of the PCR process. The real-time detection system provides essential information to quantify the initial target copy number, because it can draw an amplification curve. Using the 5' nuclease assay, a specific fluorescent signal is generated and measured at every cycle during a run. This system can perform a variety of applications including quantitation, allele discrimination, PCR optimization and viral screening. Using the ABI PRISM 7700 Sequence Detection System, the rice genome has been quantitatively analyzed. To monitor maturation of the chloroplast genome from the proplastid during germ development, 5' nuclease assays were set up for the Cab and rbcL genes, which are located in the nuclear genome and the chloroplast genome, respectively. Cab was used as an internal standard for normalization of cell numbers. The maturation process of the chloroplast was estimated using the ratio of gene dosage, [rbcL]/[Cab]. After development of the cotyledon, a significant increase in copy numbers of the chloroplast was observed. These results indicate that a light-induced chloroplast maturation process is coupled with an increase in chloroplast genome copy numbers.
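    As a small illustration of how a gene-dosage ratio such as [rbcL]/[Cab] can be derived from real-time PCR data, the sketch below uses the standard assumption of roughly 100% amplification efficiency, so that each cycle doubles the product; the Ct values are made up, and the paper's own calibration procedure may differ.

```python
# Minimal sketch: relative copy-number ratio from threshold cycles (illustrative Ct values).
def dosage_ratio(ct_target: float, ct_reference: float, efficiency: float = 2.0) -> float:
    """Target copies relative to reference copies: efficiency ** (Ct_ref - Ct_target)."""
    return efficiency ** (ct_reference - ct_target)

# Hypothetical run: rbcL (chloroplast) crosses the threshold much earlier than Cab (nuclear)
print(round(dosage_ratio(ct_target=18.2, ct_reference=24.9), 1))   # ~104-fold more rbcL than Cab
```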

  5. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
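    For context, the density conversion referred to above uses the Gladstone-Dale relation (standard form; the constant quoted is an approximate value for air at visible wavelengths),

    $$
    n - 1 \;=\; K\rho, \qquad K_{\mathrm{air}} \approx 2.3\times10^{-4}\ \mathrm{m^{3}\,kg^{-1}},
    $$

    so integrating the measured refractive-index gradient across the image gives n, the Gladstone-Dale relation gives the density ρ, and an equation of state such as the ideal-gas law then yields the temperature field.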

  6. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone. PMID:26187058

  7. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  8. Comparing the effects of tofacitinib, methotrexate and the combination, on bone marrow oedema, synovitis and bone erosion in methotrexate-naive, early active rheumatoid arthritis: results of an exploratory randomised MRI study incorporating semiquantitative and quantitative techniques

    PubMed Central

    Conaghan, Philip G; Østergaard, Mikkel; Bowes, Michael A; Wu, Chunying; Fuerst, Thomas; Irazoque-Palazuelos, Fedra; Soto-Raices, Oscar; Hrycaj, Pawel; Xie, Zhiyong; Zhang, Richard; Wyman, Bradley T; Bradley, John D; Soma, Koshika; Wilkinson, Bethanie

    2016-01-01

    Objectives To explore the effects of tofacitinib—an oral Janus kinase inhibitor for the treatment of rheumatoid arthritis (RA)—with or without methotrexate (MTX), on MRI endpoints in MTX-naive adult patients with early active RA and synovitis in an index wrist or hand. Methods In this exploratory, phase 2, randomised, double-blind, parallel-group study, patients received tofacitinib 10 mg twice daily + MTX, tofacitinib 10 mg twice daily + placebo (tofacitinib monotherapy), or MTX + placebo (MTX monotherapy), for 1 year. MRI endpoints (Outcome Measures in Rheumatology Clinical Trials RA MRI score (RAMRIS), quantitative RAMRIS (RAMRIQ) and dynamic contrast-enhanced (DCE) MRI) were assessed using a mixed-effect model for repeated measures. Treatment differences with p<0.05 (vs MTX monotherapy) were considered significant. Results In total, 109 patients were randomised and treated. Treatment differences in RAMRIS bone marrow oedema (BME) at month 6 were −1.55 (90% CI −2.52 to −0.58) for tofacitinib + MTX and −1.74 (−2.72 to −0.76) for tofacitinib monotherapy (both p<0.01 vs MTX monotherapy). Numerical improvements in RAMRIS synovitis at month 3 were −0.63 (−1.58 to 0.31) for tofacitinib + MTX and −0.52 (−1.46 to 0.41) for tofacitinib monotherapy (both p>0.05 vs MTX monotherapy). Treatment differences in RAMRIQ synovitis were statistically significant at month 3, consistent with DCE MRI findings. Less deterioration of RAMRIS and RAMRIQ erosive damage was seen at months 6 and 12 in both tofacitinib groups versus MTX monotherapy. Conclusions These results provide consistent evidence using three different MRI technologies that tofacitinib treatment leads to early reduction of inflammation and inhibits progression of structural damage. Trial registration number NCT01164579. PMID:27002108

  9. Quantitative vs qualitative research methods.

    PubMed

    Lakshman, M; Sinha, L; Biswas, M; Charles, M; Arora, N K

    2000-05-01

    Quantitative methods have been widely used because things that can be measured or counted gain scientific credibility over the unmeasurable. But the extent of biological abnormality, severity, consequences and the impact of illness cannot be satisfactorily captured and answered by quantitative research alone. In such situations qualitative methods take a holistic perspective, preserving the complexities of human behavior by addressing the "why" and "how" questions. In this paper an attempt has been made to highlight the strengths and weaknesses of both methods and to show that a balanced mix of qualitative and quantitative methods yields the most valid and reliable results.

  10. Quantitative velocity modulation spectroscopy

    NASA Astrophysics Data System (ADS)

    Hodges, James N.; McCall, Benjamin J.

    2016-05-01

    Velocity Modulation Spectroscopy (VMS) is arguably the most important development in the 20th century for spectroscopic study of molecular ions. For decades, interpretation of VMS lineshapes has presented challenges due to the intrinsic covariance of fit parameters including velocity modulation amplitude, linewidth, and intensity. This limitation has stifled the growth of this technique into the quantitative realm. In this work, we show that subtle changes in the lineshape can be used to help address this complexity. This allows for determination of the linewidth, intensity relative to other transitions, velocity modulation amplitude, and electric field strength in the positive column of a glow discharge. Additionally, we explain the large homogeneous component of the linewidth that has been previously described. Using this component, the ion mobility can be determined.

  11. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described. PMID:24136541

  12. Quantitative proteomic analysis of intact plastids.

    PubMed

    Shiraya, Takeshi; Kaneko, Kentaro; Mitsui, Toshiaki

    2014-01-01

    Plastids are specialized cell organelles in plant cells that are differentiated into various forms including chloroplasts, chromoplasts, and amyloplasts, and fulfill important functions in maintaining the overall cell metabolism and sensing environmental factors such as sunlight. It is therefore important to grasp the mechanisms of differentiation and functional changes of plastids in order to enhance the understanding of vegetality. In this chapter, details of a method for the extraction of intact plastids that makes analysis possible while maintaining the plastid functions are provided; in addition, a quantitative shotgun method for analyzing the composition and changes in the content of proteins in plastids as a result of environmental impacts is described.

  13. Phenylethynyl Containing Reactive Additives

    NASA Technical Reports Server (NTRS)

    Connell, John W. (Inventor); Smith, Joseph G., Jr. (Inventor); Hergenrother, Paul M. (Inventor)

    2002-01-01

    Phenylethynyl-containing reactive additives were prepared from an aromatic diamine containing phenylethynyl groups and various ratios of phthalic anhydride and 4-phenylethynylphthalic anhydride, either in glacial acetic acid to form the imide in one step or in N-methyl-2-pyrrolidinone to form the amide acid intermediate. The reactive additives were mixed in various amounts (10% to 90%) with oligomers containing either terminal or pendent phenylethynyl groups (or both) to reduce the melt viscosity and thereby enhance processability. Upon thermal cure, the additives react and become chemically incorporated into the matrix, increasing the crosslink density relative to that of the host resin. This increase in crosslink density has advantageous consequences for the cured resin properties, such as a higher glass transition temperature and a higher modulus compared with the host resin.

  14. Health effects models for nuclear power plant accident consequence analysis. Modification of models resulting from addition of effects of exposure to alpha-emitting radionuclides: Revision 1, Part 2, Scientific bases for health effects models, Addendum 2

    SciTech Connect

    Abrahamson, S.; Bender, M.A.; Boecker, B.B.; Scott, B.R.; Gilbert, E.S.

    1993-05-01

    The Nuclear Regulatory Commission (NRC) has sponsored several studies to identify and quantify, through the use of models, the potential health effects of accidental releases of radionuclides from nuclear power plants. The Reactor Safety Study provided the basis for most of the earlier estimates related to these health effects. Subsequent efforts by NRC-supported groups resulted in improved health effects models that were published in the report entitled "Health Effects Models for Nuclear Power Plant Consequence Analysis", NUREG/CR-4214, 1985 and revised further in the 1989 report NUREG/CR-4214, Rev. 1, Part 2. The health effects models presented in the 1989 NUREG/CR-4214 report were developed for exposure to low-linear energy transfer (LET) (beta and gamma) radiation based on the best scientific information available at that time. Since the 1989 report was published, two addenda to that report have been prepared to (1) incorporate other scientific information related to low-LET health effects models and (2) extend the models to consider the possible health consequences of the addition of alpha-emitting radionuclides to the exposure source term. The first addendum report, entitled "Health Effects Models for Nuclear Power Plant Accident Consequence Analysis, Modifications of Models Resulting from Recent Reports on Health Effects of Ionizing Radiation, Low LET Radiation, Part 2: Scientific Bases for Health Effects Models," was published in 1991 as NUREG/CR-4214, Rev. 1, Part 2, Addendum 1. This second addendum addresses the possibility that some fraction of the accident source term from an operating nuclear power plant comprises alpha-emitting radionuclides. Consideration of chronic high-LET exposure from alpha radiation as well as acute and chronic exposure to low-LET beta and gamma radiations is a reasonable extension of the health effects model.

  15. Electrophilic addition of astatine

    SciTech Connect

    Norseev, Yu.V.; Vasaros, L.; Nhan, D.D.; Huan, N.K.

    1988-03-01

    It has been shown for the first time that astatine is capable of undergoing addition reactions to unsaturated hydrocarbons. A new compound of astatine, viz., ethylene astatohydrin, has been obtained, and its retention numbers on squalane, Apiezon, and tricresyl phosphate have been determined. The influence of various factors on the formation of ethylene astatohydrin has been studied. It has been concluded, on the basis of the results obtained, that the univalent cation of astatine in an acidic medium is protonated hypoastatous acid.

  16. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills/). In addition to the teaching activity collection (85 activities), this site contains a variety of resources to assist faculty with the methods they use to teach quantitative skills at both the introductory and advanced levels; information about broader efforts in quantitative literacy involving other science disciplines; and a special section of resources for students who are struggling with their quantitative skills. The site is part of the Digital Library for Earth Science Education and has been developed by geoscience faculty in collaboration with mathematicians and mathematics educators, with funding from the National Science Foundation.

  17. Overview of differences between microbial feed additives and probiotics for food regarding regulation, growth promotion effects and health properties and consequences for extrapolation of farm animal results to humans.

    PubMed

    Bernardeau, M; Vernoux, J-P

    2013-04-01

    For many years, microbial adjuncts have been used to supplement the diets of farm animals and humans. They have evolved since the 1990s to become known as probiotics, i.e. functional food with health benefits. After the discovery of a possible link between manipulation of the gut microflora and obesity in mice, the use of these beneficial microbes acting on the gut microflora in animal farming was examined and compared with the use of probiotics for food. Beneficial microbes added to feed are classified at a regulatory level as zootechnical additives, in the category of gut flora stabilizers for healthy animals, and are regulated up to strain level in Europe. The intended effects are improvements in performance characteristics, which are strain dependent; growth enhancement is not a prerequisite. In fact, an increase in body weight is not commonly reported, and its frequency is around 25% of the published data examined here. However, when a Body Weight Gain (BWG) was found in the literature, it was generally moderate (lower than or close to 10%), and this over a reduced period of the animals' short industrial life. When it was higher than 10%, it could be explained as an indirect consequence of the alleviation of weight losses linked to stressful intensive rearing conditions or health deficiency. However, regulations on feed do not consider health effects, because the animals are assumed to be healthy, so there is no requirement for reporting health effects in the standard European dossier. The regulations governing the addition of beneficial microorganisms to food are less stringent than those for feed, and no dossier is required if a species has a Qualified Presumption of Safety status. The microbial strain marketed is not submitted to any regulation and its properties (including BWG) do not need to be studied. Only claims for functional or health properties are regulated, and again a growth effect is not included. However, recent studies on probiotic effects showed that BWG

  18. Multilayer approach to the quantitative analysis of x-ray photoelectron spectroscopy results: Applications to ultrathin SiO2 on Si and to self-assembled monolayers on gold

    SciTech Connect

    Marel, C. van der; Yildirim, M.; Stapert, H.R.

    2005-09-15

    X-ray photoelectron spectroscopy (XPS) is widely applied for the chemical characterization of surfaces and multilayers of thin films. In order to obtain quantitative results, XPS peak areas generally are divided by sensitivity factors and normalized to 100 at. % to obtain so-called raw concentrations. For homogeneous materials, i.e., materials with randomly distributed atoms within the analyzed surface layer, these concentrations may be a useful quantity. Yet, for a material consisting of a substrate on top of which a number of chemically different layers are present, the raw concentrations depend on measuring details like the takeoff angle during the XPS analyses and clearly are not a satisfactory way to describe the sample. The main purpose of this article is to present a calculation method that converts raw concentrations into more meaningful quantities. The method is applicable to a restricted but technologically relevant class of samples: substrates on top of which one or more homogeneous layers are present. Examples are: gate dielectrics on Si or GaAs, self-assembled monolayers on a metallic substrate, thin oxide films on metals with an organic contamination on top. The method is based upon standard exponential attenuation of the photoelectron intensity as a function of traveled distance. For each element or chemical state in the system it has to be known to which layer(s) it belongs. Sensitivity factors are corrected for matrix effects and for intrinsic excitations. Starting from the raw concentrations, the method calculates in a self-consistent way the composition of all layers in the system and the thickness of each layer. Only one measurement at one measuring angle is required to obtain these results. To obtain insight into the accuracy of the calculation method, XPS results obtained on ultrathin SiO2 layers on Si that were slightly contaminated with hydrocarbons have been analyzed with the method. The obtained thicknesses were in good agreement with
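    For the simplest case in the class of samples described above, a single uniform overlayer on a substrate, the exponential-attenuation model reduces to the familiar overlayer-thickness relation. The sketch below applies it to SiO2 on Si; it is not the article's full self-consistent multilayer calculation, and the attenuation length, takeoff angle and pure-material intensity ratio are placeholder values.

      # Minimal single-overlayer XPS thickness estimate (SiO2 on Si) assuming simple
      # exponential attenuation: d = lambda_ox * cos(theta) * ln(1 + (I_ox/I_Si) / R_inf).
      # All numbers are placeholders, not values from the article.
      import math

      def overlayer_thickness_nm(i_ox, i_si, lambda_ox_nm, theta_deg, r_inf):
          """theta is the emission angle measured from the surface normal;
          r_inf is the oxide-to-silicon intensity ratio for infinitely thick layers."""
          theta = math.radians(theta_deg)
          return lambda_ox_nm * math.cos(theta) * math.log(1.0 + (i_ox / i_si) / r_inf)

      print(overlayer_thickness_nm(i_ox=0.5, i_si=1.0, lambda_ox_nm=3.4,
                                   theta_deg=45.0, r_inf=0.93))   # ~1 nm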

  19. Mastitomics, the integrated omics of bovine milk in an experimental model of Streptococcus uberis mastitis: 2. Label-free relative quantitative proteomics

    PubMed Central

    Mudaliar, Manikhandan; Tassi, Riccardo; Thomas, Funmilola C.; McNeilly, Tom N.; Weidt, Stefan K.; McLaughlin, Mark; Wilson, David; Burchmore, Richard; Herzyk, Pawel; Eckersall, P. David

    2016-01-01

    Mastitis, inflammation of the mammary gland, is the most common and costly disease of dairy cattle in the western world. It is primarily caused by bacteria, with Streptococcus uberis as one of the most prevalent causative agents. To characterize the proteome during Streptococcus uberis mastitis, an experimentally induced model of intramammary infection was used. Milk whey samples obtained from 6 cows at 6 time points were processed using label-free relative quantitative proteomics. This proteomic analysis complements clinical, bacteriological and immunological studies as well as peptidomic and metabolomic analysis of the same challenge model. A total of 2552 non-redundant bovine peptides were identified, and from these, 570 bovine proteins were quantified. Hierarchical cluster analysis and principal component analysis showed clear clustering of results by stage of infection, with similarities between pre-infection and resolution stages (0 and 312 h post challenge), early infection stages (36 and 42 h post challenge) and late infection stages (57 and 81 h post challenge). Ingenuity pathway analysis identified upregulation of acute phase protein pathways over the course of infection, with dominance of different acute phase proteins at different time points based on differential expression analysis. Antimicrobial peptides, notably cathelicidins and peptidoglycan recognition protein, were upregulated at all time points post challenge and peaked at 57 h, which coincided with 10 000-fold decrease in average bacterial counts. The integration of clinical, bacteriological, immunological and quantitative proteomics and other-omic data provides a more detailed systems level view of the host response to mastitis than has been achieved previously. PMID:27412694

  20. Quantitative Proteomic Analysis of the Human Nucleolus.

    PubMed

    Bensaddek, Dalila; Nicolas, Armel; Lamond, Angus I

    2016-01-01

    Recent years have witnessed spectacular progress in the field of mass spectrometry (MS)-based quantitative proteomics, including advances in instrumentation, chromatography, sample preparation methods, and experimental design for multidimensional analyses. It is now possible not only to identify most of the protein components of a cell proteome in a single experiment, but also to describe additional proteome dimensions, such as protein turnover rates, posttranslational modifications, and subcellular localization. Furthermore, by comparing the proteome at different time points, it is possible to create a "time-lapse" view of proteome dynamics. By combining high-throughput quantitative proteomics with detailed subcellular fractionation protocols and data analysis techniques it is also now possible to characterize in detail the proteomes of specific subcellular organelles, providing important insights into cell regulatory mechanisms and physiological responses. In this chapter we present a reliable workflow and protocol for MS-based analysis and quantitation of the proteome of nucleoli isolated from human cells. The protocol presented is based on a SILAC analysis of human MCF10A-Src-ER cells with analysis performed on a Q-Exactive Plus Orbitrap MS instrument (Thermo Fisher Scientific). The subsequent chapter describes how to process the resulting raw MS files from this experiment using MaxQuant software and data analysis procedures to evaluate the nucleolar proteome using customized R scripts. PMID:27576725

  1. Vinyl capped addition polyimides

    NASA Technical Reports Server (NTRS)

    Vannucci, Raymond D. (Inventor); Malarik, Diane C. (Inventor); Delvigs, Peter (Inventor)

    1991-01-01

    Polyimide resins (PMR) are generally useful where high strength and temperature capabilities are required (at temperatures up to about 700 F). Polyimide resins are particularly useful in applications such as jet engine compressor components, for example, blades, vanes, air seals, air splitters, and engine casing parts. Aromatic vinyl capped addition polyimides are obtained by reacting a diamine, an ester of tetracarboxylic acid, and an aromatic vinyl compound. Low void materials with improved oxidative stability when exposed to 700 F air may be fabricated as fiber reinforced high molecular weight capped polyimide composites. The aromatic vinyl capped polyimides are provided with a more aromatic nature and are more thermally stable than highly aliphatic, norbornenyl-type end-capped polyimides employed in PMR resins. The substitution of aromatic vinyl end-caps for norbornenyl end-caps in addition polyimides results in polymers with improved oxidative stability.

  2. Claisen-type addition of glycine to pyridoxal in water.

    PubMed

    Toth, Krisztina; Amyes, Tina L; Richard, John P; Malthouse, J Paul G; NíBeilliú, Máire E

    2004-09-01

    The reaction between 5'-deoxypyridoxal and glycine in D2O buffered at pD 7.0 does not result in significant formation of the expected products of pyridoxal-catalyzed transamination or deuterium exchange of the alpha-amino protons of glycine, but rather gives a quantitative yield of the two diastereomeric products of the formal Claisen-type addition of glycine to 5'-deoxypyridoxal. The unexpected extensive formation of these products reflects the extraordinary selectivity of the 5'-deoxypyridoxal-stabilized glycine enolate toward addition to the carbonyl group of 5'-deoxypyridoxal in the protic solvent water.

  3. Addition of docetaxel, zoledronic acid, or both to first-line long-term hormone therapy in prostate cancer (STAMPEDE): survival results from an adaptive, multiarm, multistage, platform randomised controlled trial

    PubMed Central

    James, Nicholas D; Sydes, Matthew R; Clarke, Noel W; Mason, Malcolm D; Dearnaley, David P; Spears, Melissa R; Ritchie, Alastair W S; Parker, Christopher C; Russell, J Martin; Attard, Gerhardt; de Bono, Johann; Cross, William; Jones, Rob J; Thalmann, George; Amos, Claire; Matheson, David; Millman, Robin; Alzouebi, Mymoona; Beesley, Sharon; Birtle, Alison J; Brock, Susannah; Cathomas, Richard; Chakraborti, Prabir; Chowdhury, Simon; Cook, Audrey; Elliott, Tony; Gale, Joanna; Gibbs, Stephanie; Graham, John D; Hetherington, John; Hughes, Robert; Laing, Robert; McKinna, Fiona; McLaren, Duncan B; O'Sullivan, Joe M; Parikh, Omi; Peedell, Clive; Protheroe, Andrew; Robinson, Angus J; Srihari, Narayanan; Srinivasan, Rajaguru; Staffurth, John; Sundar, Santhanam; Tolan, Shaun; Tsang, David; Wagstaff, John; Parmar, Mahesh K B

    2016-01-01

    Summary Background Long-term hormone therapy has been the standard of care for advanced prostate cancer since the 1940s. STAMPEDE is a randomised controlled trial using a multiarm, multistage platform design. It recruits men with high-risk, locally advanced, metastatic or recurrent prostate cancer who are starting first-line long-term hormone therapy. We report primary survival results for three research comparisons testing the addition of zoledronic acid, docetaxel, or their combination to standard of care versus standard of care alone. Methods Standard of care was hormone therapy for at least 2 years; radiotherapy was encouraged for men with N0M0 disease to November, 2011, then mandated; radiotherapy was optional for men with node-positive non-metastatic (N+M0) disease. Stratified randomisation (via minimisation) allocated men 2:1:1:1 to standard of care only (SOC-only; control), standard of care plus zoledronic acid (SOC + ZA), standard of care plus docetaxel (SOC + Doc), or standard of care with both zoledronic acid and docetaxel (SOC + ZA + Doc). Zoledronic acid (4 mg) was given for six 3-weekly cycles, then 4-weekly until 2 years, and docetaxel (75 mg/m2) for six 3-weekly cycles with prednisolone 10 mg daily. There was no blinding to treatment allocation. The primary outcome measure was overall survival. Pairwise comparisons of research versus control had 90% power at 2·5% one-sided α for hazard ratio (HR) 0·75, requiring roughly 400 control arm deaths. Statistical analyses were undertaken with standard log-rank-type methods for time-to-event data, with hazard ratios (HRs) and 95% CIs derived from adjusted Cox models. This trial is registered at ClinicalTrials.gov (NCT00268476) and ControlledTrials.com (ISRCTN78818544). Findings 2962 men were randomly assigned to four groups between Oct 5, 2005, and March 31, 2013. Median age was 65 years (IQR 60–71). 1817 (61%) men had M+ disease, 448 (15%) had N+/X M0, and 697 (24%) had N0M0. 165 (6
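    As a rough consistency check on the power calculation quoted above (not the trial's actual statistical analysis plan), the Schoenfeld approximation for a log-rank comparison with the trial's 2:1 control-to-research allocation gives on the order of 400 control-arm deaths for 90% power at one-sided α = 0.025 and a target HR of 0.75:

      # Back-of-the-envelope Schoenfeld event count for one pairwise comparison.
      # Allocation within the pair is 2:1 control:research; everything else standard.
      import math
      from scipy.stats import norm

      alpha_one_sided, power, hr = 0.025, 0.90, 0.75
      p_control, p_research = 2 / 3, 1 / 3

      z = norm.ppf(1 - alpha_one_sided) + norm.ppf(power)
      total_deaths = z ** 2 / (p_control * p_research * math.log(hr) ** 2)
      print(round(total_deaths))              # ~570 deaths across the two arms
      print(round(total_deaths * p_control))  # ~380 in the control arm, i.e. "roughly 400"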

  4. Quantitative environmental risk analysis

    SciTech Connect

    Klovning, J.; Nilsen, E.F.

    1995-12-31

    According to regulations relating to the implementation and use of risk analysis in petroleum activities issued by the Norwegian Petroleum Directorate, it is mandatory for an operator on the Norwegian Continental Shelf to establish acceptance criteria for environmental risk in its activities and to carry out environmental risk analysis. This paper presents a "new" method for environmental risk analysis developed by the company. The objective has been to assist the company to meet rules and regulations and to assess and describe the environmental risk in a systematic manner. In the environmental risk analysis the most sensitive biological resource in the affected area is used to assess the environmental damage. The analytical method is based on the methodology for quantitative risk analysis related to loss of life. In addition it incorporates the effect of seasonal fluctuations in the environmental risk evaluations. The paper describes the function of the main analytical sequences, exemplified through an analysis of environmental risk related to exploration drilling in an environmentally sensitive area on the Norwegian Continental Shelf.

  5. Quantitation of naturalistic behaviors.

    PubMed

    Evans, H L

    1988-10-01

    Naturalistic behaviors are behaviors that organisms exhibit 'in nature'. Eating, sleeping and sexual behaviors are examples. Since naturalistic behaviors are observed to occur without any apparent training or learning, some people mistakenly believe that all naturalistic behaviors are unlearned, and are thus different from laboratory behaviors. We maintain that naturalistic behaviors can be studied profitably in the toxicological laboratory, using quantitative techniques from behavioral neuroscience. Understanding of toxicity and underlying mechanisms is enhanced when naturalistic behaviors are thought of as responses to stimuli. Stimuli that influence naturalistic behaviors may arise inside the organisms (e.g., physiological signals of hunger) or outside the organisms (e.g., the smell of food or the start of the nocturnal lighting cycle). A practical, noninvasive, automated system can be used to improve upon the cage-side observation currently used to evaluate naturalistic behaviors in toxicity screening. Effects of alkyltins and other neurotoxicants upon eating, drinking, rearing, and the daily cycle of rest-activity will be shown. The rodent's pattern of nocturnal activity has proven to be particularly sensitive to neurotoxicants, and thus deserves additional attention in developing neurobehavioral toxicology.

  6. A Bayesian framework for comparative quantitative genetics

    PubMed Central

    Ovaskainen, Otso; Cano, José Manuel; Merilä, Juha

    2008-01-01

    Bayesian approaches have been extensively used in animal breeding sciences, but similar approaches in the context of evolutionary quantitative genetics have been rare. We compared the performance of Bayesian and frequentist approaches in estimation of quantitative genetic parameters (viz. matrices of additive and dominance variances) in datasets typical of evolutionary studies and traits differing in their genetic architecture. Our results illustrate that it is difficult to disentangle the relative roles of different genetic components from small datasets, and that ignoring, e.g. dominance is likely to lead to biased estimates of additive variance. We suggest that a natural summary statistic for G-matrix comparisons can be obtained by examining how different the underlying multinormal probability distributions are, and illustrate our approach with data on the common frog (Rana temporaria). Furthermore, we derive a simple Monte Carlo method for computation of fraternity coefficients needed for the estimation of dominance variance, and use the pedigree of a natural Siberian jay (Perisoreus infaustus) population to illustrate that the commonly used approximate values can be substantially biased. PMID:18211881
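    One standard way to estimate fraternity coefficients by Monte Carlo is gene dropping through the pedigree; the generic sketch below (not necessarily the authors' implementation) estimates the probability that two individuals carry genotypes that are identical by descent.

      # Monte Carlo (gene-dropping) estimate of the fraternity coefficient.
      # Generic illustration only, not the authors' code.
      import random

      def drop_genes(pedigree):
          """pedigree: dict id -> (sire, dam); founders have (None, None) and must be
          listed before their offspring. Returns id -> (paternal, maternal) allele,
          with founder alleles uniquely labelled so 'equal' means identical by descent."""
          alleles, counter = {}, 0
          for ind, (sire, dam) in pedigree.items():
              if sire is None:
                  alleles[ind] = (counter, counter + 1)
                  counter += 2
              else:
                  alleles[ind] = (random.choice(alleles[sire]), random.choice(alleles[dam]))
          return alleles

      def fraternity(pedigree, a, b, n_rep=100_000):
          hits = 0
          for _ in range(n_rep):
              g = drop_genes(pedigree)
              (a1, a2), (b1, b2) = g[a], g[b]
              hits += (a1 == b1 and a2 == b2) or (a1 == b2 and a2 == b1)
          return hits / n_rep

      # Full sibs with unrelated parents: expected fraternity coefficient = 0.25.
      ped = {"sire": (None, None), "dam": (None, None),
             "sib1": ("sire", "dam"), "sib2": ("sire", "dam")}
      print(fraternity(ped, "sib1", "sib2"))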

  7. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications, associated with high morbidity and mortality. Calciphylaxis is characterized by systemic medial calcification of the arteries, yielding necrotic skin ulcerations. In this paper, we aim at supporting the installation of multi-center registries for calciphylaxis, which include a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and its color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with regard to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of measurements is significantly reduced. The coefficients of variation are 5-20% and 2-10% for geometry and color, respectively. Hence, quantitative assessment of calciphylaxis becomes practicable and will contribute to a better understanding of this rare but fatal disease.
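    A minimal sketch of the least-squares affine color-correction step described above, assuming the mean RGB of each of the 24 segmented patches and the corresponding printed reference values have already been extracted (all array contents below are placeholders):

      # Fit an affine color transform (3x3 matrix plus offset) that maps the measured
      # patch colors onto their printed reference values, then apply it pixel-wise.
      # Patch segmentation is assumed to have been done already; data are placeholders.
      import numpy as np

      measured = np.random.rand(24, 3)    # mean RGB of each segmented color-pad patch
      reference = np.random.rand(24, 3)   # known RGB of the printed reference patches

      A = np.hstack([measured, np.ones((24, 1))])             # column of ones = offset term
      coeffs, *_ = np.linalg.lstsq(A, reference, rcond=None)  # shape (4, 3)

      def correct(image_rgb):
          """Apply the fitted transform to an (H, W, 3) float image in [0, 1]."""
          flat = image_rgb.reshape(-1, 3)
          out = np.hstack([flat, np.ones((flat.shape[0], 1))]) @ coeffs
          return np.clip(out, 0.0, 1.0).reshape(image_rgb.shape)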

  8. Investigating Children's Abilities to Count and Make Quantitative Comparisons

    ERIC Educational Resources Information Center

    Lee, Joohi; Md-Yunus, Sham'ah

    2016-01-01

    This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…

  9. Quantitative Research in Chemical Education.

    ERIC Educational Resources Information Center

    Nurrenbern, Susan C.; Robinson, William R.

    1994-01-01

    Provides an overview of the area of quantitative research in chemical education, which involves the same components that comprise chemical research: (1) a question or hypothesis; (2) research design; (3) data collection and analysis; and (4) interpretation of results. Includes questions of interest to chemical educators; areas of quantitative…

  10. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)
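    A minimal sketch of the standard-curve step, fitting absorbance against known percent methanol and inverting the line for an unknown sample (all numbers are made up, not data from the experiment):

      # Standard curve: linear fit of IR absorbance vs. volume percent methanol,
      # then inverse prediction for an unknown. Values are illustrative only.
      import numpy as np

      percent_meoh = np.array([0.5, 1.0, 2.0, 4.0, 8.0])       # standards (vol %)
      absorbance   = np.array([0.06, 0.11, 0.23, 0.44, 0.90])  # peak absorbance

      slope, intercept = np.polyfit(percent_meoh, absorbance, 1)
      unknown_absorbance = 0.31
      print((unknown_absorbance - intercept) / slope)          # estimated vol % methanol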

  11. [Autoimmune processes after long-term low-level exposure to electromagnetic fields (the results of an experiment). Part 1. Mobile communications and changes in electromagnetic conditions for the population. Needs for additional substantiation of the existing hygienic standards].

    PubMed

    Grigor'ev, Iu G; Grigor'ev, O A; Ivanov, A A; Liaginskaia, A M; Merkulov, A V; Stepanov, V S; Shagina, N B

    2010-01-01

    Mobile communications provides a new source of electromagnetic exposure for almost the whole population of the Russian Federation. For the first time in the history of civilization the brain of mobile phone users was exposed to localized radiofrequency (RF) electromagnetic fields (EMF). Population exposure from the base stations is also considered to be specific. However, existing standards for limiting the exposure do not account for this special EMF source and may not ensure the absence of health effects. There was a need for reliable information that would extend databases used for development of new standards. As recommended by the World Health Organization an additional experiment was performed under the supervision of foreign experts, which showed changes in autoimmune status in rats after long-term low-level RF EMF exposure with an incident power density of 500 microW/cm2.

  12. Intensification of antiretroviral therapy through addition of enfuvirtide in naive HIV-1-infected patients with severe immunosuppression does not improve immunological response: results of a randomized multicenter trial (ANRS 130 Apollo).

    PubMed

    Joly, Véronique; Fagard, Catherine; Grondin, Carine; Descamps, Diane; Yazdanpanah, Yazdan; Charpentier, Charlotte; Colin de Verdiere, Nathalie; Tabuteau, Sophie; Raffi, François; Cabie, André; Chene, Geneviève; Yeni, Patrick

    2013-02-01

    We studied whether addition of enfuvirtide (ENF) to a background combination antiretroviral therapy (cART) would improve the CD4 cell count response at week 24 in naive patients with advanced HIV disease. ANRS 130 Apollo is a randomized study, conducted in naive HIV-1-infected patients, either asymptomatic with CD4 counts of <100/mm(3) or stage B/C disease with CD4 counts of <200/mm(3). Patients received tenofovir-emtricitabine with lopinavir-ritonavir (LPV/r) or efavirenz and were randomized to receive ENF for 24 weeks (ENF arm) or not (control arm). The primary endpoint was the proportion of patients with CD4 counts of ≥ 200/mm(3) at week 24. A total of 195 patients were randomized: 73% had stage C disease, 78% were male, the mean age was 44 years, the median CD4 count was 30/mm(3), and the median HIV-1 RNA load was 5.4 log(10) copies/ml. Eighty-one percent of patients received LPV/r. One patient was lost to follow-up, and eight discontinued the study (four in each arm). The proportions of patients with CD4 counts of ≥ 200/mm(3) at week 24 were 34% and 38% in the ENF and control arms, respectively (P = 0.53). The proportions of patients with HIV-1 RNA loads of <50 copies/ml were 74% and 58% at week 24 in the ENF and control arms, respectively (P < 0.02), and the proportion reached 79% in both arms at week 48. Twenty (20%) and 12 patients (13%) in the ENF and control arms, respectively, experienced at least one AIDS event during follow-up (P = 0.17). Although inducing a more rapid virological response, addition of ENF to a standard cART does not improve the immunological outcome in naive HIV-infected patients with severe immunosuppression. PMID:23165467

  14. Quantitative phase imaging of arthropods

    PubMed Central

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  15. Quantitative phase imaging of arthropods.

    PubMed

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-01-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy. PMID:26334858

  16. Quantitative phase imaging of arthropods

    NASA Astrophysics Data System (ADS)

    Sridharan, Shamira; Katz, Aron; Soto-Adames, Felipe; Popescu, Gabriel

    2015-11-01

    Classification of arthropods is performed by characterization of fine features such as setae and cuticles. An unstained whole arthropod specimen mounted on a slide can be preserved for many decades, but is difficult to study since current methods require sample manipulation or tedious image processing. Spatial light interference microscopy (SLIM) is a quantitative phase imaging (QPI) technique that is an add-on module to a commercial phase contrast microscope. We use SLIM to image a whole organism springtail Ceratophysella denticulata mounted on a slide. This is the first time, to our knowledge, that an entire organism has been imaged using QPI. We also demonstrate the ability of SLIM to image fine structures in addition to providing quantitative data that cannot be obtained by traditional bright field microscopy.

  18. Quantitative Spectroscopy of Distant Type Ia Supernovae

    NASA Astrophysics Data System (ADS)

    Bronder, T. J.; Hook, I.; Howell, D. A.; Sullivan, M.; Perrett, K.; Conley, A.; Astier, P.; Basa, S.; Carlberg, R. G.; Guy, J.; Pain, R.; Pritchet, C. J.; Neill, James D.

    2007-08-01

    Quantitative analysis of 24 high-z (zmed = 0.81) Type Ia supernovae (SNe Ia) spectra observed at the Gemini Telescopes for the Supernova Legacy Survey (SNLS) is presented. This analysis includes equivalent width measurements of SNe Ia-specific absorption features with methods tailored to the reduced signal-to-noise and host galaxy contamination present in these distant spectra. The results from this analysis are compared to corresponding measurements of a large set of low-z SNe Ia from the literature. This comparison showed no significant difference (less than 2σ) between the spectroscopic features of the distant and nearby SNe, a result that supports the assumption that SNe Ia are not evolving with redshift. Additionally, a new correlation between SiII absorption (observed near peak luminosity) and SNe Ia peak magnitudes is presented.
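    For reference, the equivalent width being measured here is EW = ∫(1 − F/F_c) dλ over the feature, with F_c a pseudo-continuum estimated from the regions on either side of it; a minimal numerical sketch (window limits and arrays are illustrative, not the SNLS pipeline) is:

      # Equivalent width of an absorption feature relative to a straight-line
      # pseudo-continuum fitted to side bands. Wavelengths in Angstroms; the
      # definition is standard, but the window choices here are purely illustrative.
      import numpy as np

      def equivalent_width(wave, flux, feature, blue_side, red_side):
          """feature, blue_side, red_side: (min, max) wavelength windows."""
          side = ((wave >= blue_side[0]) & (wave <= blue_side[1])) | \
                 ((wave >= red_side[0]) & (wave <= red_side[1]))
          slope, intercept = np.polyfit(wave[side], flux[side], 1)
          inside = (wave >= feature[0]) & (wave <= feature[1])
          continuum = slope * wave[inside] + intercept
          return np.trapz(1.0 - flux[inside] / continuum, wave[inside])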

  19. [Food additives and healthiness].

    PubMed

    Heinonen, Marina

    2014-01-01

    Additives are used for improving food structure or preventing its spoilage, for example. Many substances used as additives are also naturally present in food. The safety of additives is evaluated according to commonly agreed principles. If high concentrations of an additive cause adverse health effects for humans, a limit of acceptable daily intake (ADI) is set for it. An additive is a risk only when ADI is exceeded. The healthiness of food is measured on the basis of nutrient density and scientifically proven effects.

  20. Electric utility use of fireside additives. Final report

    SciTech Connect

    Locklin, D.W.; Krause, H.H.; Anson, D.; Reid, W.

    1980-01-01

    Fireside additives have been used or proposed for use in fossil-fired utility boilers to combat a number of problems related to boiler performance and reliability. These problems include corrosion, fouling, superheat control, and acidic emissions. Fuel additives and other fireside additives have been used mainly with oil firing; however, there is growing experience with additives in coal-firing, especially for flyash conditioning to improve the performance of electrostatic precipitators. In decisions regarding the selection and use of additives, utilities have had to rely extensively on empiricism, due partly to an incomplete understanding of processes involved and partly to the limited amount of quantitative data. The study reported here was sponsored by the Electric Power Research Institute to assemble and analyze pertinent operating experience and to recommend guidelines for utility decisions on the use of additives. The combined results of the state-of-the-art review of technical literature and a special survey of utility experience are reported. A total of 38 utilities participated in the survey, providing information on trials conducted on 104 units in 93 different plants. Altogether, 445 separate trials were reported, each representing a unit/additive/fuel combination. Additives used in these trials included 90 different additive formulations, both pure compounds and proprietary products. These formulations were categorized into 37 generic classes according to their chemical constituents, and the results of the survey are presented by these generic classes. The findings are organized according to the operating problems for which fireside additives are used. Guidelines are presented for utility use in additive selection and in planning additive trials.

  1. [Quantitative method of representative contaminants in groundwater pollution risk assessment].

    PubMed

    Wang, Jun-Jie; He, Jiang-Tao; Lu, Yan; Liu, Li-Ya; Zhang, Xiao-Liang

    2012-03-01

    To address the problem that stress vulnerability assessment in groundwater pollution risk assessment lacks an effective quantitative system, a new system based on representative contaminants and their corresponding emission quantities was proposed through analysis of groundwater pollution sources. A quantitative method for the representative contaminants in this system was established by analyzing three properties of the representative contaminants and determining the research emphasis using the analytic hierarchy process. The method was applied to the assessment of groundwater pollution risk in Beijing. The results demonstrated that the hazards posed by the representative contaminants depended strongly on the research emphasis chosen, and the ranking of the three representative contaminants' hazards differed from the rankings of their individual properties. This suggests that the subjective choice of research emphasis has a decisive impact on the calculation results. In addition, using rank order to normalize the three properties and to unify the quantified results can amplify or attenuate the relative characteristics of the different representative contaminants.
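    A minimal sketch of the analytic hierarchy process step mentioned above: weights for the three contaminant properties are taken from the principal eigenvector of a pairwise-comparison matrix (the comparison judgments below are illustrative, not those used for Beijing):

      # Analytic hierarchy process: property weights from the principal eigenvector
      # of a pairwise-comparison matrix, plus a consistency index. Judgments are
      # illustrative placeholders.
      import numpy as np

      # a[i, j]: how much more important property i is than property j (Saaty 1-9 scale)
      a = np.array([[1.0, 3.0, 5.0],
                    [1/3, 1.0, 2.0],
                    [1/5, 1/2, 1.0]])

      eigvals, eigvecs = np.linalg.eig(a)
      k = np.argmax(eigvals.real)
      weights = np.abs(eigvecs[:, k].real)
      weights /= weights.sum()

      n = a.shape[0]
      consistency_index = (eigvals[k].real - n) / (n - 1)   # near 0 => consistent judgments
      print(weights, consistency_index)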

  2. A Quantitative Gas Chromatographic Ethanol Determination.

    ERIC Educational Resources Information Center

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water-ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  3. Recapturing Quantitative Biology.

    ERIC Educational Resources Information Center

    Pernezny, Ken; And Others

    1996-01-01

    Presents a classroom activity on estimating animal populations. Uses shoe boxes and candies to emphasize the importance of mathematics in biology while introducing the methods of quantitative ecology. (JRH)

  4. On Quantitative Rorschach Scales.

    ERIC Educational Resources Information Center

    Haggard, Ernest A.

    1978-01-01

    Two types of quantitative Rorschach scales are discussed: first, those based on the response categories of content, location, and the determinants, and second, global scales based on the subject's responses to all ten stimulus cards. (Author/JKS)

  5. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    Quantitative receptor autoradiography addresses the topic of technical and scientific advances in the sphere of quantitative autoradiography. The volume opens with an overview of the field from a historical and critical perspective. This is followed by a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  6. Visual constraints for the perception of quantitative depth from temporal interocular unmatched features.

    PubMed

    Ni, Rui; Chen, Lin; Andersen, George J

    2010-07-21

    Previous research (Brooks & Gillam, 2006) has found that temporal interocular unmatched (IOUM) features generate a perception of subjective contours and can result in a perception of quantitative depth. In the present study we examine in detail the factors important for quantitative depth perception from IOUM features. In Experiments 1 and 2 observers were shown temporal IOUM features based on three dots that disappeared behind an implicit surface. Subjects reported a perception of a subjective surface and were able to perceive qualitative depth. In Experiments 3 and 4 metrical depth was perceived when binocular disparity features were added to the display. These results suggest that quantitative depth from IOUM information is perceived when binocular matched information is present in regions adjacent to the surface. In addition, the perceived depth of the subjective surface decreased with an increase in the width of the subjective surface suggesting a limitation in the propagation of quantitative depth to surface regions where qualitative depth information is available.

  7. Is herpes zoster an additional complication in old age alongside comorbidity and multiple medications? Results of the post hoc analysis of the 12-month longitudinal prospective observational ARIZONA cohort study

    PubMed Central

    Pickering, Gisèle; Gavazzi, Gaëtan; Gaillat, Jacques; Paccalin, Marc; Bloch, Karine; Bouhassira, Didier

    2016-01-01

    Objectives: To examine the burden of comorbidity, polypharmacy and herpes zoster (HZ), an infectious disease, and its main complication post-herpetic neuralgia (PHN) in young (50–70 years of age: 70−) and old (≥70 years of age: 70+) patients. Design: Post hoc analysis of the results of the 12-month longitudinal prospective multicentre observational ARIZONA cohort study. Settings and participants: The study took place in primary care in France from 20 November 2006 to 12 September 2008. Overall, 644 general practitioners (GPs) collected data from 1358 patients aged 50 years or more with acute eruptive HZ. Outcome measures: Presence of HZ-related pain or PHN (pain persisting >3 months) was documented at day 0 and at months 3, 6, and 12. To investigate HZ and PHN burden, pain, quality of life (QoL) and mood were self-assessed using validated questionnaires (Zoster Brief Pain Inventory, 12-item Short-Form health survey and Hospital Anxiety and Depression Scale, respectively). Results: As compared with younger patients, older patients more frequently presented with comorbidities, more frequently took analgesics and had poorer response on all questionnaires, indicating greater burden, at inclusion. Analgesics were more frequently prescribed to relieve acute pain or PHN in 70+ than 70− patients. Despite higher levels of medication prescription, poorer pain relief and poorer response to all questionnaires were reported in 70+ than 70− patients. Conclusions: Occurrence of HZ and progression to PHN adds extra burden on top of pharmacological treatment and impaired quality of life, especially in older patients who already have health problems to cope with in everyday life. PMID:26892790

  8. Quantitative DNA Fiber Mapping

    SciTech Connect

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
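    Given the stretching factor quoted above, converting a measured distance between hybridization signals into kilobase pairs is a one-line calculation (the distance below is made up):

      # Convert an inter-probe distance measured in the fluorescence image into kb,
      # using the ~2.3 kb/um stretching factor quoted above. Example value is made up.
      KB_PER_MICRON = 2.3
      distance_um = 14.8
      print(distance_um * KB_PER_MICRON)   # ~34 kb between the probe binding sites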

  9. Microbial phytase addition resulted in a greater increase in phosphorus digestibility in dry-fed compared with liquid-fed non-heat-treated wheat-barley-maize diets for pigs.

    PubMed

    Blaabjerg, K; Thomassen, A-M; Poulsen, H D

    2015-02-01

    The objective was to evaluate the effect of microbial phytase (1250 FTU/kg diet with 88% dry matter (DM)) on apparent total tract digestibility (ATTD) of phosphorus (P) in pigs fed a dry or soaked diet. Twenty-four pigs (65±3 kg) from six litters were used. Pigs were housed in metabolism crates and fed one of four diets for 12 days: 5 days for adaptation and 7 days for total, but separate, collection of feces and urine. The basal diet was composed of wheat, barley, maize, soybean meal and no mineral phosphate. Dietary treatments were: basal dry-fed diet (BDD), BDD with microbial phytase (BDD+phy), BDD soaked for 24 h at 20°C before feeding (BDS) and BDS with microbial phytase (BDS+phy). Supplementation of microbial phytase increased ATTD of DM and crude protein (N×6.25) by 2 and 3 percentage units (P<0.0001; P<0.001), respectively. The ATTD of P was affected by the interaction between microbial phytase and soaking (P=0.02). This was due to a greater increase in ATTD of P by soaking of the diet containing solely plant phytase compared with the diet supplemented with microbial phytase: 35%, 65%, 44% and 68% for BDD, BDD+phy, BDS and BDS+phy, respectively. As such, supplementation of microbial phytase increased ATTD of P in the dry-fed diet, but not in the soaked diet. The higher ATTD of P for BDS compared with BDD resulted from the degradation of 54% of the phytate in BDS by wheat and barley phytases during soaking. On the other hand, soaking of BDS+phy did not increase ATTD of P significantly compared with BDD+phy, even though 76% of the phytate in BDS+phy was degraded before feeding. In conclusion, soaking of BDS containing solely plant phytase provided a great potential for increasing ATTD of P. However, this potential was not present when microbial phytase (1250 FTU/kg diet) was supplemented, most likely because soaking of BDS+phy for 24 h at 20°C did not result in a complete degradation of phytate before feeding.

  10. Quantitative phase imaging with programmable illumination

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Edwards, Chris; Goddard, Lynford L.; Popescu, Gabriel

    2015-03-01

    Even with the recent rapid advances in the field of microscopy, non-laser light sources used for light microscopy have seen little development. Most current optical microscopy systems use halogen bulbs as their light sources to provide white-light illumination. Due to the confined shapes and finite filament size of the bulbs, little room is available for modification of the light source, which prevents further advances in microscopy. By contrast, commercial projectors provide a high power output that is comparable to halogen lamps while allowing great flexibility in patterning the illumination. In addition to their high brightness, the illumination can be patterned to have arbitrary spatial and spectral distributions. Therefore, a commercial projector can be adopted as a flexible light source for an optical microscope by careful alignment to the existing optical path. In this study, we coupled a commercial projector as the light source to a quantitative phase imaging system called spatial light interference microscopy (SLIM), which is an add-on module for an existing phase contrast (PC) microscope. By replacing the ring illumination of PC with a ring-shaped pattern projected onto the condenser plane, we were able to recover the same result as the original SLIM. Furthermore, the ring illumination was replaced with multiple dots aligned along the same ring to minimize the overlap between the scattered and unscattered fields. This new method minimizes the halo artifact of the imaging system, which allows for a halo-free, high-resolution quantitative phase microscopy system.

  11. Quantitative Luminescence Imaging System

    SciTech Connect

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  12. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Fletcher, James C. (Inventor); Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1992-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  13. Polyimide processing additives

    NASA Technical Reports Server (NTRS)

    Pratt, J. Richard (Inventor); St.clair, Terry L. (Inventor); Stoakley, Diane M. (Inventor); Burks, Harold D. (Inventor)

    1993-01-01

    A process for preparing polyimides having enhanced melt flow properties is described. The process consists of heating a mixture of a high molecular weight poly-(amic acid) or polyimide with a low molecular weight amic acid or imide additive in the range of 0.05 to 15 percent by weight of the additive. The polyimide powders so obtained show improved processability, as evidenced by lower melt viscosity by capillary rheometry. Likewise, films prepared from mixtures of polymers with additives show improved processability with earlier onset of stretching by TMA.

  14. Quantitative 3D analysis of huge nanoparticle assemblies

    PubMed Central

    Zanaga, Daniele; Bleichrodt, Folkert; Altantzis, Thomas; Winckelmans, Naomi; Palenstijn, Willem Jan; Sijbers, Jan; de Nijs, Bart; van Huis, Marijn A.; Sánchez-Iglesias, Ana; Liz-Marzán, Luis M.; van Blaaderen, Alfons; Joost Batenburg, K.; Van Tendeloo, Gustaaf

    2016-01-01

    Nanoparticle assemblies can be investigated in 3 dimensions using electron tomography. However, it is not straightforward to obtain quantitative information such as the number of particles or their relative position. This becomes particularly difficult when the number of particles increases. We propose a novel approach in which prior information on the shape of the individual particles is exploited. It improves the quality of the reconstruction of these complex assemblies significantly. Moreover, this quantitative Sparse Sphere Reconstruction approach yields directly the number of particles and their position as an output of the reconstruction technique, enabling a detailed 3D analysis of assemblies with as many as 10 000 particles. The approach can also be used to reconstruct objects based on a very limited number of projections, which opens up possibilities to investigate beam sensitive assemblies where previous reconstructions with the available electron tomography techniques failed. PMID:26607629

  15. Quantitative spectroscopy of hot stars

    NASA Technical Reports Server (NTRS)

    Kudritzki, R. P.; Hummer, D. G.

    1990-01-01

    A review on the quantitative spectroscopy (QS) of hot stars is presented, with particular attention given to the study of photospheres, optically thin winds, unified model atmospheres, and stars with optically thick winds. It is concluded that the results presented here demonstrate the reliability of QS as a unique source of accurate values of the global parameters (effective temperature, surface gravity, and elemental abundances) of hot stars.

  16. Quantitative approaches to computational vaccinology.

    PubMed

    Doytchinova, Irini A; Flower, Darren R

    2002-06-01

    This article reviews the newly released JenPep database and two new powerful techniques for T-cell epitope prediction: (i) the additive method; and (ii) a 3D-Quantitative Structure Activity Relationships (3D-QSAR) method, based on Comparative Molecular Similarity Indices Analysis (CoMSIA). The JenPep database is a family of relational databases supporting the growing need of immunoinformaticians for quantitative data on peptide binding to major histocompatibility complexes and to the Transporters associated with Antigen Processing (TAP). It also contains an annotated list of T-cell epitopes. The database is available free via the Internet (http://www.jenner.ac.uk/JenPep). The additive prediction method is based on the assumption that the binding affinity of a peptide depends on the contributions from each amino acid as well as on the interactions between the adjacent and every second side-chain. In the 3D-QSAR approach, the influence of five physicochemical properties (steric bulk, electrostatic potential, local hydrophobicity, hydrogen-bond donor and hydrogen-bond acceptor abilities) on the affinity of peptides binding to MHC molecules were considered. Both methods were exemplified through their application to the well-studied problem of peptides binding to the human class I MHC molecule HLA-A*0201. PMID:12067414
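
    The additive assumption described above lends itself to a simple linear scoring scheme. The Python sketch below is only an illustration of that idea, not the authors' fitted model; the coefficient tables (BASELINE, POSITION_TERMS, ADJACENT_TERMS) are hypothetical placeholders, and the "every second" (1-3) interaction terms would be handled analogously to the adjacent ones.

      # Hedged sketch of an additive peptide-binding model: predicted affinity =
      # constant + per-position amino-acid contributions + adjacent-pair terms.
      # All coefficient values are illustrative placeholders, not fitted data.

      BASELINE = 5.0  # hypothetical constant term (e.g., mean log affinity)

      # position -> {amino acid: contribution}; only a few entries shown
      POSITION_TERMS = {
          1: {"L": 0.30, "I": 0.25},
          2: {"L": 0.80, "M": 0.70},   # a primary anchor position for HLA-A*0201
          9: {"V": 0.90, "L": 0.60},   # C-terminal anchor position
      }

      # (position, residue, next residue) -> contribution of the adjacent pair
      ADJACENT_TERMS = {
          (1, "L", "L"): 0.05,
      }

      def predict_affinity(peptide: str) -> float:
          """Predicted binding score of a 9-mer under the additive model."""
          score = BASELINE
          for pos, aa in enumerate(peptide, start=1):
              score += POSITION_TERMS.get(pos, {}).get(aa, 0.0)
          for pos in range(1, len(peptide)):
              score += ADJACENT_TERMS.get((pos, peptide[pos - 1], peptide[pos]), 0.0)
          return score

      print(predict_affinity("LLMGTHTMV"))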

  17. Additional Types of Neuropathy

    MedlinePlus

    ... Charcot's Joint, also called neuropathic arthropathy, ... can stop bone destruction and aid healing. Cranial neuropathy affects the 12 pairs of nerves ...

  18. Food Additives and Hyperkinesis

    ERIC Educational Resources Information Center

    Wender, Ester H.

    1977-01-01

    The hypothesis that food additives are causally associated with hyperkinesis and learning disabilities in children is reviewed, and available data are summarized. Available from: American Medical Association 535 North Dearborn Street Chicago, Illinois 60610. (JG)

  19. Surface acoustic wave stabilized oscillators: Additional aging results

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Updated aging data are presented for SAW oscillators with aluminum transducers on ST-cut quartz, gold transducers on ST-cut quartz, and aluminum transducers on SiO2/LiTaO3. Devices with gold transducers age differently (in both the magnitude and the shape of the aging curve) from those with aluminum transducers, indicating that the transducer metallization can represent an important aging mechanism.

  20. Quantitative MALDI tandem mass spectrometric imaging of cocaine from brain tissue with a deuterated internal standard.

    PubMed

    Pirman, David A; Reich, Richard F; Kiss, András; Heeren, Ron M A; Yost, Richard A

    2013-01-15

    Mass spectrometric imaging (MSI) is an analytical technique used to determine the distribution of individual analytes within a given sample. A wide array of analytes and samples can be investigated by MSI, including drug distribution in rats, lipid analysis from brain tissue, protein differentiation in tumors, and plant metabolite distributions. Matrix-assisted laser desorption/ionization (MALDI) is a soft ionization technique capable of desorbing and ionizing a large range of compounds, and it is the most common ionization source used in MSI. MALDI mass spectrometry (MS) is generally considered to be a qualitative analytical technique because of significant ion-signal variability. Consequently, MSI is also thought to be a qualitative technique because of the quantitative limitations of MALDI coupled with the heterogeneity of tissue sections inherent in an MSI experiment. Thus, conclusions based on MS images are often limited by the inability to correlate ion signal increases with actual concentration increases. Here, we report a quantitative MSI method for the analysis of cocaine (COC) from brain tissue using a deuterated internal standard (COC-d(3)) combined with wide-isolation MS/MS for analysis of the tissue extracts with scan-by-scan COC-to-COC-d(3) normalization. This resulted in significant improvements in signal reproducibility and calibration curve linearity. Quantitative results from the MSI experiments were compared with quantitative results from liquid chromatography (LC)-MS/MS analysis of brain tissue extracts. Two different quantitative MSI techniques (standard addition and external calibration) produced quantitative results comparable to LC-MS/MS data. Tissue extracts were also analyzed by MALDI wide-isolation MS/MS, and quantitative results were nearly identical to those from LC-MS/MS. These results clearly demonstrate the necessity for an internal standard for quantitative MSI experiments. PMID:23214490
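
    The scan-by-scan internal-standard normalization and external-calibration quantitation described above can be illustrated with a short Python sketch. The intensity arrays and calibration values below are hypothetical stand-ins for the COC/COC-d3 MS/MS responses; this is not the authors' processing pipeline.

      import numpy as np

      # Hypothetical per-scan MS/MS intensities for the analyte (COC) and the
      # co-deposited deuterated internal standard (COC-d3) over an imaged region.
      coc    = np.array([1200.0, 900.0, 1500.0, 1100.0])
      coc_d3 = np.array([4000.0, 3100.0, 5200.0, 3600.0])

      # Scan-by-scan normalization: analyte response relative to the standard.
      mean_ratio = np.mean(coc / coc_d3)

      # External calibration: mean COC/COC-d3 ratios from tissue spotted with
      # known COC amounts (illustrative values, ng of COC per mg of tissue).
      cal_conc  = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
      cal_ratio = np.array([0.01, 0.16, 0.31, 0.60, 1.21])

      slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)
      print(f"estimated COC: {(mean_ratio - intercept) / slope:.2f} ng/mg")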

  1. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  2. Electric utility use of fireside additives. Final report

    SciTech Connect

    Locklin, D.W.; Krause, H.H.; Anson, D.; Reid, W.

    1980-01-01

    Fireside additives have been used or proposed for use in fossil-fired utility boilers to combat a number of problems related to boiler performance and reliability. These problems include corrosion, fouling, superheat control, and acidic emissions. Fuel additives and other fireside additives have been used mainly with oil firing; however, there is growing experience with additives in coal-firing, especially for flyash conditioning to improve the performance of electrostatic precipitators. In decisions regarding the selection and use of additives, utilities have had to rely extensively on empiricism, due partly to our incomplete understanding of the processes involved and partly to the limited amount of quantitative data. The study reported here was sponsored by the Electric Power Research Institute to assemble and analyze pertinent operating experience and to recommend guidelines for utility decisions on the use of additives. This report describes the combined results of the state-of-the-art review of technical literature and a special survey of utility experience. A total of 38 utilities participated in the survey, providing information on trials conducted on 104 units in 93 different plants. Altogether, 445 separate trials were reported, each representing a unit/additive/fuel combination. 90 different additive formulations, both pure compounds and proprietary products, were categorized into 37 generic classes according to their chemical constituents, and the results of the survey are presented by these generic classes. This report is organized according to the operating problems for which fireside additives are used. Guidelines are presented for utility use in additive selection and in planning additive trials.

  3. The contribution of quantitative trait loci and neutral marker loci to the genetic variances and covariances among quantitative traits in random mating populations

    SciTech Connect

    Ruiz, A.; Barbadilla, A.

    1995-01-01

    Using Cockerham's approach of orthogonal scales, we develop genetic models for the effect of an arbitrary number of multiallelic quantitative trait loci (QTLs) or neutral marker loci (NMLs) upon any number of quantitative traits. These models allow the unbiased estimation of the contributions of a set of marker loci to the additive and dominance variances and covariances among traits in a random mating population. The method has been applied to an analysis of allozyme and quantitative data from the European oyster. The contribution of a set of marker loci may either be real, when the markers are actually QTLs, or apparent, when they are NMLs that are in linkage disequilibrium with hidden QTLs. Our results show that the additive and dominance variances contributed by a set of NMLs are always minimum estimates of the corresponding variances contributed by the associated QTLs. In contrast, the apparent contribution of the NMLs to the additive and dominance covariances between two traits may be larger than, equal to or lower than the actual contributions of the QTLs. We also derive an expression for the expected variance explained by the correlation between a quantitative trait and multilocus heterozygosity. This correlation explains only a part of the genetic variance contributed by the markers, i.e., in general, a combination of additive and dominance variances and, thus, provides only very limited information relative to the method supplied here. 94 refs., 2 figs., 5 tabs.

  4. Additive Manufacturing Infrared Inspection

    NASA Technical Reports Server (NTRS)

    Gaddy, Darrell

    2014-01-01

    Additive manufacturing is a rapid prototyping technology that allows parts to be built in a series of thin layers from plastic, ceramics, and metallics. Metallic additive manufacturing is an emerging form of rapid prototyping that allows complex structures to be built using various metallic powders. Significant time and cost savings have also been observed using metallic additive manufacturing compared with traditional techniques. Development of the metallic additive manufacturing technology has advanced significantly over the last decade, although many of the techniques to inspect parts made from these processes have not advanced significantly or have limitations. Several external geometry inspection techniques exist, such as Coordinate Measurement Machines (CMM), Laser Scanners, Structured Light Scanning Systems, or even traditional calipers and gages. All of the aforementioned techniques are limited to external geometry and contours or must use a contact probe to inspect limited internal dimensions. This presentation documents the development of a real-time dimensional inspection technique and digital quality record for the additive manufacturing process using infrared camera imaging and processing techniques.

  5. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? or the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to the output of current state-of-the-art general circulation models. Second, a critical evaluation of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  6. A quantitative study of nanoparticle skin penetration with interactive segmentation.

    PubMed

    Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook

    2016-10-01

    In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to the human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Since statistical significance was achieved after 2 days in the negative charge group and after 4 days in the positive charge group, there is a periodic difference. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result is identical to results obtained by qualitative assessment, it is meaningful in that it was proven by statistical analysis with quantitation by using image processing. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications. PMID:26589318
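
    The image parameters listed above (mean, integrated density, skewness, kurtosis, area fraction) are standard intensity descriptors. A minimal Python sketch of how they might be computed from a fluorescence image and a segmentation mask is given below; the inputs are assumed, and this is not the interactive-segmentation software used in the study.

      import numpy as np
      from scipy import stats

      def fluorescence_descriptors(image: np.ndarray, mask: np.ndarray) -> dict:
          """Intensity descriptors of the segmented fluorescent regions.

          image: 2D array of pixel intensities.
          mask:  boolean array, True where fluorescent silica was segmented.
          """
          pixels = image[mask].astype(float)
          return {
              "mean": pixels.mean(),
              "integrated_density": pixels.sum(),       # total intensity in region
              "skewness": stats.skew(pixels),
              "kurtosis": stats.kurtosis(pixels),
              "area_fraction": mask.sum() / mask.size,  # segmented area / image area
          }

      # Toy usage with a synthetic image and a threshold-based mask.
      rng = np.random.default_rng(0)
      img = rng.normal(100.0, 20.0, size=(256, 256))
      print(fluorescence_descriptors(img, img > 130.0))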

  7. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  8. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  9. Quantitative Simulation Games

    NASA Astrophysics Data System (ADS)

    Černý, Pavol; Henzinger, Thomas A.; Radhakrishna, Arjun

    While a boolean notion of correctness is given by a preorder on systems and properties, a quantitative notion of correctness is defined by a distance function on systems and properties, where the distance between a system and a property provides a measure of "fit" or "desirability." In this article, we explore several ways in which the simulation preorder can be generalized to a distance function. This is done by equipping the classical simulation game between a system and a property with quantitative objectives. In particular, for systems that satisfy a property, a quantitative simulation game can measure the "robustness" of the satisfaction, that is, how much the system can deviate from its nominal behavior while still satisfying the property. For systems that violate a property, a quantitative simulation game can measure the "seriousness" of the violation, that is, how much the property has to be modified so that it is satisfied by the system. These distances can be computed in polynomial time, since the computation reduces to the value problem in limit average games with constant weights. Finally, we demonstrate how the robustness distance can be used to measure how many transmission errors are tolerated by error correcting codes.

  10. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  11. The Changing Context of Critical Quantitative Inquiry

    ERIC Educational Resources Information Center

    Rios-Aguilar, Cecilia

    2014-01-01

    The author provides a framework to help scholars in the field of higher education to be critical. Additionally, the author reflects and comments on the chapters included in this special volume. Finally, this chapter ends with a discussion of the opportunities and challenges of critical quantitative inquiry.

  12. Multifunctional fuel additives

    SciTech Connect

    Baillargeon, D.J.; Cardis, A.B.; Heck, D.B.

    1991-03-26

    This paper discusses a composition comprising a major amount of a liquid hydrocarbyl fuel and a minor low-temperature flow properties improving amount of an additive product of the reaction of a suitable diol and product of a benzophenone tetracarboxylic dianhydride and a long-chain hydrocarbyl aminoalcohol.

  13. Biobased lubricant additives

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fully biobased lubricants are those formulated using all biobased ingredients, i.e. biobased base oils and biobased additives. Such formulations provide the maximum environmental, safety, and economic benefits expected from a biobased product. Currently, there are a number of biobased base oils that...

  14. Nuclear medicine and imaging research: Quantitative studies in radiopharmaceutical science

    SciTech Connect

    Copper, M.; Beck, R.N.

    1991-06-01

    During the past three years the program has undergone a substantial revitalization. There has been no significant change in the scientific direction of this grant, in which emphasis continues to be placed on developing new or improved methods of obtaining quantitative data from radiotracer imaging studies. However, considerable scientific progress has been made in the three areas of interest: Radiochemistry, Quantitative Methodologies, and Experimental Methods and Feasibility Studies, resulting in a sharper focus of perspective and improved integration of the overall scientific effort. Changes in Faculty and staff, including development of new collaborations, have contributed to this, as has acquisition of additional and new equipment and renovations and expansion of the core facilities. 121 refs., 30 figs., 2 tabs.

  15. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells. PMID:26039484
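
    One common way to quantify fibre orientation relative to a reference axis is a structure-tensor analysis of the image; the Python sketch below is offered only as a generic illustration of that kind of measurement, not the authors' original image-processing technique.

      import numpy as np
      from scipy import ndimage

      def orientation_relative_to_axis(image, growth_axis_deg, sigma=2.0):
          """Per-pixel fibre orientation relative to a reference axis, in degrees.

          A Gaussian-smoothed structure tensor gives the direction of maximum
          intensity variation; fibres run perpendicular to it. Angles are folded
          into [0, 90], where 0 means parallel to the growth axis.
          """
          gy, gx = np.gradient(image.astype(float))
          jxx = ndimage.gaussian_filter(gx * gx, sigma)
          jyy = ndimage.gaussian_filter(gy * gy, sigma)
          jxy = ndimage.gaussian_filter(gx * gy, sigma)
          grad_dir = 0.5 * np.degrees(np.arctan2(2.0 * jxy, jxx - jyy))
          fibre_dir = grad_dir + 90.0                     # fibres are edge-aligned
          rel = np.abs(fibre_dir - growth_axis_deg) % 180.0
          return np.minimum(rel, 180.0 - rel)

      # Toy usage: horizontal stripes are roughly parallel to a 0-degree axis.
      img = np.tile(np.sin(np.linspace(0, 20 * np.pi, 256)), (256, 1)).T
      print(np.median(orientation_relative_to_axis(img, growth_axis_deg=0.0)))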

  16. Non-interferometric quantitative phase imaging of yeast cells

    NASA Astrophysics Data System (ADS)

    Poola, Praveen K.; Pandiyan, Vimal Prabhu; John, Renu

    2015-12-01

    Real-time imaging of live cells is quite difficult without the addition of external contrast agents. Various methods for quantitative phase imaging of living cells have been proposed, such as digital holographic microscopy and diffraction phase microscopy. In this paper, we report theoretical and experimental results of quantitative phase imaging of live yeast cells with nanometric precision using transport of intensity equations (TIE). We demonstrate nanometric depth sensitivity in imaging live yeast cells using this technique. Being non-interferometric, this technique does not need coherent light sources, and images can be captured through a regular bright-field microscope. This real-time imaging technique would deliver the depth or 3-D volume information of cells and is highly promising in real-time digital pathology applications, screening of pathogens, and staging of diseases such as malaria, as it does not require any preprocessing of samples.
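
    Under a uniform-intensity assumption the transport of intensity equation reduces to a Poisson equation for the phase, which can be solved with FFTs. The Python sketch below is a generic illustration of that standard approach (two defocused intensity images as assumed inputs), not the authors' instrument software.

      import numpy as np

      def tie_phase(i_minus, i_plus, dz, wavelength, pixel_size):
          """Recover phase from two defocused images via the TIE.

          Assumes a nearly uniform in-focus intensity i0, so that
          laplacian(phi) = -(k / i0) * dI/dz, solved in Fourier space.
          """
          k = 2.0 * np.pi / wavelength
          didz = (i_plus - i_minus) / (2.0 * dz)          # axial intensity derivative
          i0 = 0.5 * (i_plus + i_minus).mean()
          rhs = -(k / i0) * didz

          ny, nx = rhs.shape
          ky, kx = np.meshgrid(2 * np.pi * np.fft.fftfreq(ny, d=pixel_size),
                               2 * np.pi * np.fft.fftfreq(nx, d=pixel_size),
                               indexing="ij")
          k2 = kx**2 + ky**2
          k2[0, 0] = 1.0                                   # avoid division by zero
          phi_hat = -np.fft.fft2(rhs) / k2                 # inverse Laplacian
          phi_hat[0, 0] = 0.0                              # zero-mean phase
          return np.real(np.fft.ifft2(phi_hat))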

  17. Dual function microscope for quantitative DIC and birefringence imaging

    NASA Astrophysics Data System (ADS)

    Li, Chengshuai; Zhu, Yizheng

    2016-03-01

    A spectral multiplexing interferometry (SXI) method is presented for integrated birefringence and phase gradient measurement on label-free biological specimens. With SXI, the retardation and orientation of sample birefringence are simultaneously encoded onto two separate spectral carrier waves, generated by a crystal retarder oriented at a specific angle. Thus sufficient information for birefringence determination can be obtained from a single interference spectrum, eliminating the need for multiple acquisitions with mechanical rotation or electrical modulation. In addition, with the insertion of a Nomarski prism, the setup can then acquire quantitative differential interference contrast images. Red blood cells infected by malaria parasites are imaged for birefringence retardation as well as phase gradient. The results demonstrate that the SXI approach can achieve both quantitative phase imaging and birefringence imaging with a single, high-sensitivity system.

  18. Quantitative analysis of microtubule orientation in interdigitated leaf pavement cells.

    PubMed

    Akita, Kae; Higaki, Takumi; Kutsuna, Natsumaro; Hasezawa, Seiichiro

    2015-01-01

    Leaf pavement cells are shaped like a jigsaw puzzle in most dicotyledon species. Molecular genetic studies have identified several genes required for pavement cell morphogenesis and proposed that microtubules play crucial roles in the interdigitation of pavement cells. In this study, we performed quantitative analysis of cortical microtubule orientation in leaf pavement cells in Arabidopsis thaliana. We captured confocal images of cortical microtubules in cotyledon leaf epidermis expressing GFP-tubulinβ and quantitatively evaluated the microtubule orientations relative to the pavement cell growth axis using original image processing techniques. Our results showed that microtubules kept orientations parallel to the growth axis during pavement cell growth. In addition, we showed that immersion treatment of seed cotyledons in solutions containing tubulin polymerization and depolymerization inhibitors decreased pavement cell complexity. Treatment with oryzalin and colchicine inhibited the symmetric division of guard mother cells.

  19. A quantitative method for evaluating alternatives. [aid to decision making

    NASA Technical Reports Server (NTRS)

    Forthofer, M. J.

    1981-01-01

    When faced with choosing between alternatives, people tend to use a number of criteria (often subjective, rather than objective) to decide which is the best alternative for them given their unique situation. The subjectivity inherent in the decision-making process can be reduced by the definition and use of a quantitative method for evaluating alternatives. This type of method can help decision makers achieve a degree of uniformity and completeness in the evaluation process, as well as an increased sensitivity to the factors involved. Additional side-effects are better documentation and visibility of the rationale behind the resulting decisions. General guidelines for defining a quantitative method are presented, and a particular method (called 'hierarchical weighted average') is defined and applied to the evaluation of design alternatives for a hypothetical computer system capability.
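
    As a rough illustration of a hierarchical weighted average (the specific method defined in the report is not reproduced here), the following Python sketch aggregates leaf-criterion scores upward through weighted groups; the criteria tree and numbers are hypothetical.

      def hierarchical_weighted_average(node):
          """Aggregate score of a criteria tree for one alternative."""
          if "score" in node:                            # leaf criterion
              return node["score"]
          total = sum(child["weight"] for child in node["children"])
          return sum(child["weight"] * hierarchical_weighted_average(child)
                     for child in node["children"]) / total

      # Hypothetical evaluation of one design alternative for a computer system.
      alternative = {
          "children": [
              {"weight": 0.5, "children": [              # performance group
                  {"weight": 0.7, "score": 8.0},         # throughput
                  {"weight": 0.3, "score": 6.0},         # response time
              ]},
              {"weight": 0.3, "score": 7.0},             # cost
              {"weight": 0.2, "score": 9.0},             # maintainability
          ]
      }
      print(hierarchical_weighted_average(alternative))  # weighted aggregate score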

  20. Estimation of Variance Components of Quantitative Traits in Inbred Populations

    PubMed Central

    Abney, Mark; McPeek, Mary Sara; Ober, Carole

    2000-01-01

    Summary Use of variance-component estimation for mapping of quantitative-trait loci in humans is a subject of great current interest. When only trait values, not genotypic information, are considered, variance-component estimation can also be used to estimate heritability of a quantitative trait. Inbred pedigrees present special challenges for variance-component estimation. First, there are more variance components to be estimated in the inbred case, even for a relatively simple model including additive, dominance, and environmental effects. Second, more identity coefficients need to be calculated from an inbred pedigree in order to perform the estimation, and these are computationally more difficult to obtain in the inbred than in the outbred case. As a result, inbreeding effects have generally been ignored in practice. We describe here the calculation of identity coefficients and estimation of variance components of quantitative traits in large inbred pedigrees, using the example of HDL in the Hutterites. We use a multivariate normal model for the genetic effects, extending the central-limit theorem of Lange to allow for both inbreeding and dominance under the assumptions of our variance-component model. We use simulated examples to give an indication of under what conditions one has the power to detect the additional variance components and to examine their impact on variance-component estimation. We discuss the implications for mapping and heritability estimation by use of variance components in inbred populations. PMID:10677322

  1. Theoretical study of the nuclear spin-molecular rotation coupling for relativistic electrons and non-relativistic nuclei. II. Quantitative results in HX (X = H,F,Cl,Br,I) compounds.

    PubMed

    Aucar, I Agustín; Gómez, Sergio S; Melo, Juan I; Giribet, Claudia C; Ruiz de Azúa, Martín C

    2013-04-01

    In the present work, numerical results of the nuclear spin-rotation (SR) tensor in the series of compounds HX (X = H,F,Cl,Br,I) within relativistic 4-component expressions obtained by Aucar et al. [J. Chem. Phys. 136, 204119 (2012)] are presented. The SR tensors of both the H and X nuclei are discussed. Calculations were carried out within the relativistic Linear Response formalism at the Random Phase Approximation with the DIRAC program. For the halogen nucleus X, correlation effects on the non-relativistic values are shown to be of similar magnitude and opposite sign to relativistic effects. For the light H nucleus, by means of the linear response within the elimination of the small component approach it is shown that the whole relativistic effect is given by the spin-orbit operator combined with the Fermi contact operator. Comparison of "best estimate" calculated values with experimental results yields differences smaller than 2%-3% in all cases. The validity of "Flygare's relation" linking the SR tensor and the NMR nuclear magnetic shielding tensor in the present series of compounds is analyzed.

  2. Quantitative MRI Assessment of Leukoencephalopathy

    PubMed Central

    Reddick, Wilburn E.; Glass, John O.; Langston, James W.; Helton, Kathleen J.

    2008-01-01

    Quantitative MRI assessment of leukoencephalopathy is difficult because the MRI properties of leukoencephalopathy significantly overlap those of normal tissue. This report describes the use of an automated procedure for longitudinal measurement of tissue volume and relaxation times to quantify leukoencephalopathy. Images derived by using this procedure in patients undergoing therapy for acute lymphoblastic leukemia (ALL) are presented. Five examinations from each of five volunteers (25 examinations) were used to test the reproducibility of quantitated baseline and subsequent, normal-appearing images; the coefficients of variation were less than 2% for gray and white matter. Regions of leukoencephalopathy in patients were assessed by comparison with manual segmentation. Two radiologists manually segmented images from 15 randomly chosen MRI examinations that exhibited leukoencephalopathy. Kappa analyses showed that the two radiologists’ interpretations were concordant (κ = 0.70) and that each radiologist’s interpretations agreed with the results of the automated procedure (κ = 0.57 and 0.55). The clinical application of this method was illustrated by analysis of images from sequential MR examinations of two patients who developed leukoencephalopathy during treatment for ALL. The ultimate goal is to use these quantitative MR imaging measures to better understand therapy-induced neurotoxicity, which can be limited or even reversed with some combination of therapy adjustments and pharmacological and neurobehavioral interventions. PMID:11979570
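
    The kappa statistics quoted above measure inter-rater agreement beyond chance. A generic Cohen's kappa computation (not the study's analysis code; the toy labels are hypothetical) looks like this in Python:

      from collections import Counter

      def cohens_kappa(ratings_a, ratings_b):
          """Cohen's kappa for two raters' categorical labels of the same items."""
          n = len(ratings_a)
          observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
          freq_a, freq_b = Counter(ratings_a), Counter(ratings_b)
          expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
          return (observed - expected) / (1.0 - expected)

      # Toy example: region labels ("L" = leukoencephalopathy, "N" = normal).
      rater1 = ["L", "L", "N", "N", "L", "N", "N", "L"]
      rater2 = ["L", "N", "N", "N", "L", "N", "L", "L"]
      print(cohens_kappa(rater1, rater2))                # -> 0.5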

  3. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse shape and size of sand sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for the case of lithified samples are limited to cases of high definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments from the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
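
    Several of the shape parameters mentioned above have standard closed-form definitions. The Python sketch below (independent of the Mathematica package described, with a grain outline assumed to be given as polygon vertices) computes area, perimeter and circularity; roundness, angularity and fractal dimension require more elaborate boundary analysis and are not shown.

      import numpy as np

      def grain_shape_parameters(x, y):
          """Area, perimeter and circularity of a grain outline given as the
          vertices (x, y) of a closed polygon. Circularity = 4*pi*A / P^2 is 1
          for a circle and decreases as the outline becomes more irregular."""
          x, y = np.asarray(x, float), np.asarray(y, float)
          # Shoelace formula for the polygon area.
          area = 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))
          # Perimeter as the sum of edge lengths, wrapping back to the first vertex.
          perimeter = np.sum(np.hypot(np.diff(np.r_[x, x[0]]), np.diff(np.r_[y, y[0]])))
          return {"area": area,
                  "perimeter": perimeter,
                  "circularity": 4.0 * np.pi * area / perimeter**2}

      # Toy usage: a 64-gon approximating a circle has circularity close to 1.
      t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
      print(grain_shape_parameters(np.cos(t), np.sin(t))["circularity"])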

  4. Tackifier for addition polyimides

    NASA Technical Reports Server (NTRS)

    Butler, J. M.; St.clair, T. L.

    1980-01-01

    A modification to the addition polyimide, LaRC-160, was prepared to improve tack and drape and increase prepeg out-time. The essentially solventless, high viscosity laminating resin is synthesized from low cost liquid monomers. The modified version takes advantage of a reactive, liquid plasticizer which is used in place of solvent and helps solve a major problem of maintaining good prepeg tack and drape, or the ability of the prepeg to adhere to adjacent plies and conform to a desired shape during the lay up process. This alternate solventless approach allows both longer life of the polymer prepeg and the processing of low void laminates. This approach appears to be applicable to all addition polyimide systems.

  5. Functional Generalized Additive Models.

    PubMed

    McLean, Mathew W; Hooker, Giles; Staicu, Ana-Maria; Scheipl, Fabian; Ruppert, David

    2014-01-01

    We introduce the functional generalized additive model (FGAM), a novel regression model for association studies between a scalar response and a functional predictor. We model the link-transformed mean response as the integral with respect to t of F{X(t), t} where F(·,·) is an unknown regression function and X(t) is a functional covariate. Rather than having an additive model in a finite number of principal components as in Müller and Yao (2008), our model incorporates the functional predictor directly and thus our model can be viewed as the natural functional extension of generalized additive models. We estimate F(·,·) using tensor-product B-splines with roughness penalties. A pointwise quantile transformation of the functional predictor is also considered to ensure each tensor-product B-spline has observed data on its support. The methods are evaluated using simulated data and their predictive performance is compared with other competing scalar-on-function regression alternatives. We illustrate the usefulness of our approach through an application to brain tractography, where X(t) is a signal from diffusion tensor imaging at position, t, along a tract in the brain. In one example, the response is disease-status (case or control) and in a second example, it is the score on a cognitive test. R code for performing the simulations and fitting the FGAM can be found in supplemental materials available online.
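
    In the notation used by the abstract, the FGAM can be written compactly as follows (a sketch of the stated model form together with the tensor-product B-spline expansion it mentions; the basis sizes K_x and K_t are generic placeholders):

      g\{ E(Y_i \mid X_i) \} = \theta_0 + \int_{\mathcal{T}} F\{ X_i(t), t \} \, dt,
      \qquad
      F(x, t) \approx \sum_{j=1}^{K_x} \sum_{k=1}^{K_t} \theta_{jk} \, B_j^{X}(x) \, B_k^{T}(t),

    with roughness penalties applied to the coefficients \theta_{jk} during estimation.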

  6. Quantitative measurements in capsule endoscopy.

    PubMed

    Keuchel, M; Kurniawan, N; Baltes, P; Bandorski, D; Koulaouzidis, A

    2015-10-01

    This review summarizes several approaches for quantitative measurement in capsule endoscopy. Video capsule endoscopy (VCE) typically provides wireless imaging of the small bowel. Currently, a variety of quantitative measurements are implemented in commercially available hardware/software. The majority are proprietary and hence undisclosed algorithms. Measurement of the amount of luminal contamination allows scores to be calculated from whole VCE studies. Other scores express the severity of small bowel lesions in Crohn's disease or the degree of villous atrophy in celiac disease. Image processing with numerous algorithms for textural and color feature extraction is a further research focus for automated image analysis. These tools aim to select single images with relevant lesions such as blood, ulcers, polyps and tumors or to omit images showing only luminal contamination. Analysis of motility pattern, size measurement and determination of capsule localization are additional topics. Non-visual wireless capsules transmitting data acquired with specific sensors from the gastrointestinal (GI) tract are available for clinical routine. This includes pH measurement in the esophagus for the diagnosis of acid gastro-esophageal reflux. A wireless motility capsule provides GI motility analysis on the basis of pH, pressure, and temperature measurement. Electromagnetic tracking of another motility capsule allows visualization of motility. However, measurement of substances by GI capsules is of great interest but still at an early stage of development. PMID:26299419

  7. Quantitative Susceptibility Mapping in Parkinson's Disease

    PubMed Central

    Seiler, Stephan; Deistung, Andreas; Schweser, Ferdinand; Franthal, Sebastian; Homayoon, Nina; Katschnig-Winter, Petra; Koegl-Wallner, Mariella; Pendl, Tamara; Stoegerer, Eva Maria; Wenzel, Karoline; Fazekas, Franz; Ropele, Stefan; Reichenbach, Jürgen Rainer; Schmidt, Reinhold; Schwingenschuh, Petra

    2016-01-01

    Background Quantitative susceptibility mapping (QSM) and R2* relaxation rate mapping have demonstrated increased iron deposition in the substantia nigra of patients with idiopathic Parkinson’s disease (PD). However, the findings in other subcortical deep gray matter nuclei are converse and the sensitivity of QSM and R2* for morphological changes and their relation to clinical measures of disease severity has so far been investigated only sparsely. Methods The local ethics committee approved this study and all subjects gave written informed consent. 66 patients with idiopathic Parkinson’s disease and 58 control subjects underwent quantitative MRI at 3T. Susceptibility and R2* maps were reconstructed from a spoiled multi-echo 3D gradient echo sequence. Mean susceptibilities and R2* rates were measured in subcortical deep gray matter nuclei and compared between patients with PD and controls as well as related to clinical variables. Results Compared to control subjects, patients with PD had increased R2* values in the substantia nigra. QSM also showed higher susceptibilities in patients with PD in substantia nigra, in the nucleus ruber, thalamus, and globus pallidus. Magnetic susceptibility of several of these structures was correlated with the levodopa-equivalent daily dose (LEDD) and clinical markers of motor and non-motor disease severity (total MDS-UPDRS, MDS-UPDRS-I and II). Disease severity as assessed by the Hoehn & Yahr scale was correlated with magnetic susceptibility in the substantia nigra. Conclusion The established finding of higher R2* rates in the substantia nigra was extended by QSM showing superior sensitivity for PD-related tissue changes in nigrostriatal dopaminergic pathways. QSM additionally reflected the levodopa-dosage and disease severity. These results suggest a more widespread pathologic involvement and QSM as a novel means for its investigation, more sensitive than current MRI techniques. PMID:27598250

  8. Optical Coherence Tomography for nanoparticles quantitative characterization

    NASA Astrophysics Data System (ADS)

    Trojanowski, Michał; Kraszewski, Maciej; Strąkowski, Marcin R.; Pluciński, Jerzy

    2015-08-01

    The unique features of nanocomposite materials depend on the type and size of the nanoparticles, as well as their placement in the composite matrix. The nanocomposite manufacturing process therefore requires inline control over nanoparticle parameters such as dispersion and concentration. Keeping track of nanoparticle parameters inside a matrix is currently difficult due to the lack of a fast, reliable and cost-effective measurement method that can be applied to large-volume samples. For this purpose, Optical Coherence Tomography (OCT) has been used. OCT is a non-destructive and non-invasive optical measurement method. It is capable of creating tomographic images of the inner structure of a sample by gathering depth-resolved backscattered signal from scattering particles. In addition, it can analyse, in a single shot, an area of the centimetre range with resolution down to single micrometres. To further increase the measurement capabilities of OCT, we use additional system extensions such as Spectroscopic OCT (SOCT). With this addition, we are able to measure depth-related parameters such as the scattering spectrum and the intensity of the backscattered signal, which allow us to quantitatively estimate the volume concentration of nanoparticles. In addition, we analyse metallic oxide nanoparticles; to fully characterize them it is necessary to distinguish single particles from agglomerated ones. In this contribution we present our research results on using LCI-based measurement techniques for the evaluation of materials with nanoparticles. The laboratory system and signal processing algorithms are presented in order to demonstrate the usefulness of this method for inline continuous monitoring of nanocomposite material fabrication.

  9. Non-linear effects in quantitative 2D NMR of polysaccharides: pitfalls and how to avoid them.

    PubMed

    Martineau, Estelle; El Khantache, Kamel; Pupier, Marion; Sepulcri, Patricia; Akoka, Serge; Giraudeau, Patrick

    2015-04-10

    Quantitative 2D NMR is a powerful analytical tool which is widely used to determine the concentration of small molecules in complex samples. Due to the site-specific response of the 2D NMR signal, the determination of absolute concentrations requires the use of a calibration or standard addition approach, where the analyte acts as its own reference. Standard addition methods, where the targeted sample is gradually spiked with known amounts of the targeted analyte, are particularly well-suited for quantitative 2D NMR of small molecules. This paper explores the potential of such quantitative 2D NMR approaches for the quantitative analysis of a high molecular weight polysaccharide. The results highlight that the standard addition method leads to a strong under-estimation of the target concentration, whatever the 2D NMR pulse sequence. Diffusion measurements show that a change in the macromolecular organization of the studied polysaccharide is the most probable hypothesis to explain the non-linear evolution of the 2D NMR signal with concentration. In spite of this non-linearity--the detailed explanation of which is out of the scope of this paper--we demonstrate that accurate quantitative results can still be obtained provided that an external calibration is performed with a wide range of concentrations surrounding the target value. This study opens the way to a number of studies where 2D NMR is needed for the quantitative analysis of macromolecules.
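
    For reference, both the standard-addition logic tested above and the external calibration the authors recommend reduce to simple linear fits. The Python sketch below uses hypothetical peak volumes and concentrations purely to illustrate the two estimators; it is not the paper's data or processing.

      import numpy as np

      # Standard addition: spike the sample with known analyte amounts and
      # extrapolate the fitted line to zero signal; the unknown concentration is
      # intercept/slope. This is only valid while the response stays linear,
      # which the paper shows can break down for macromolecules.
      added  = np.array([0.0, 1.0, 2.0, 4.0])      # added analyte (hypothetical units)
      signal = np.array([2.1, 3.0, 3.9, 5.8])      # 2D NMR peak volumes (hypothetical)
      slope, intercept = np.polyfit(added, signal, 1)
      print("standard-addition estimate:", intercept / slope)

      # External calibration: separate standards spanning the expected value,
      # then the sample signal is interpolated on the calibration line.
      cal_conc   = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
      cal_signal = np.array([1.0, 2.1, 2.9, 4.1, 5.0])
      s, i = np.polyfit(cal_conc, cal_signal, 1)
      sample_signal = 2.5
      print("external-calibration estimate:", (sample_signal - i) / s)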

  10. Quantitative social science

    NASA Astrophysics Data System (ADS)

    Weidlich, W.

    1987-03-01

    General concepts for the quantitative description of the dynamics of social processes are introduced. They allow for embedding social science into the conceptual framework of synergetics. Equations of motion for the socioconfiguration are derived on the stochastic and quasideterministic level. As an application the migration of interacting human populations is treated. The solutions of the nonlinear migratory equations include limit cycles and strange attractors. The empiric evaluation of interregional migratory dynamics is exemplified in the case of Germany.

  11. Lipid Informed Quantitation and Identification

    SciTech Connect

    Kevin Crowell, PNNL

    2014-07-21

    LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (i.e. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets

  12. Computational vaccinology: quantitative approaches.

    PubMed

    Flower, Darren R; McSparron, Helen; Blythe, Martin J; Zygouri, Christianna; Taylor, Debra; Guan, Pingping; Wan, Shouzhan; Coveney, Peter V; Walshe, Valerie; Borrow, Persephone; Doytchinova, Irini A

    2003-01-01

    The immune system is hierarchical and has many levels, exhibiting much emergent behaviour. However, at its heart are molecular recognition events that are indistinguishable from other types of biomacromolecular interaction. These can be addressed well by quantitative experimental and theoretical biophysical techniques, and particularly by methods from drug design. We review here our approach to computational immunovaccinology. In particular, we describe the JenPep database and two new techniques for T cell epitope prediction. One is based on quantitative structure-activity relationships (a 3D-QSAR method based on CoMSIA and another 2D method based on the Free-Wilson approach) and the other on atomistic molecular dynamics simulations using high performance computing. JenPep (http://www.jenner.ar.uk/JenPep) is a relational database system supporting quantitative data on peptide binding to major histocompatibility complexes, TAP transporters, TCR-pMHC complexes, and an annotated list of B cell and T cell epitopes. Our 2D-QSAR method factors the contribution to peptide binding from individual amino acids as well as 1-2 and 1-3 residue interactions. In the 3D-QSAR approach, the influence of five physicochemical properties (volume, electrostatic potential, hydrophobicity, hydrogen-bond donor and acceptor abilities) on peptide affinity was considered. Both methods are exemplified through their application to the well-studied problem of peptide binding to the human class I MHC molecule HLA-A*0201. PMID:14712934

  13. Study on the performance evaluation of quantitative precipitation estimation and quantitative precipitation forecast

    NASA Astrophysics Data System (ADS)

    Yang, H.; Chang, K.; Suk, M.; cha, J.; Choi, Y.

    2011-12-01

    Rainfall estimation and short-term (several hours) quantitative precipitation forecasting based on meteorological radar data are intensely studied topics. The Korean Peninsula has a narrow land area and complex, mountainous topography, so rainfall systems there change frequently. Quantitative precipitation estimation (QPE) and quantitative precipitation forecasts (QPF) are crucial information for severe weather and water management. We have conducted a performance evaluation of the QPE/QPF products of the Korea Meteorological Administration (KMA), which is the first step toward optimizing the QPE/QPF system in South Korea. The real-time adjusted RAR (Radar-AWS-Rainrate) system gives better agreement with the observed rain rate than the fixed Z-R relation, and the additional bias correction of RAR yields slightly better results. A correlation coefficient of R2 = 0.84 is obtained between the daily accumulated observed and RAR-estimated rainfall. The RAR product will be suitable for hydrological applications such as the water budget. The VSRF (Very Short Range Forecast) shows better performance than MAPLE (McGill Algorithm for Precipitation Nowcasting by Lagrangian Extrapolation) within 40 minutes, but MAPLE performs better than the VSRF after 40 minutes. For hourly forecasts, MAPLE shows better performance than the VSRF. QPE and QPF are thought to be most meaningful for nowcasting (1-2 hours), as distinct from model forecasts; model forecasts longer than 3 hours are especially meaningful for applications such as water management.

  14. Optimized Pulse Parameters for Reducing Quantitation Errors Due to Saturation Factor Changes in Magnetic Resonance Spectroscopy

    NASA Astrophysics Data System (ADS)

    Galbán, Craig J.; Spencer, Richard G. S.

    2002-06-01

    We present an analysis of the effects of chemical exchange and changes in T1 on metabolite quantitation for heart, skeletal muscle, and brain using the one-pulse experiment for a sample which is subject to temporal variation. We use an optimization algorithm to calculate interpulse delay times, TRs, and flip angles, θ, resulting in maximal root-mean-squared signal-to-noise per unit time (S/N) for all exchanging species under 5 and 10% constraints on quantitation errors. The optimization yields TR and θ pairs giving signal-to-noise per unit time close to or superior to typical literature values. Additional simulations were performed to demonstrate explicitly the dependence of the quantitation errors on pulse parameters and variations in the properties of the sample, such as may occur after an intervention. We find that (i) correction for partial saturation in accordance with the usual analysis neglecting variations in metabolite concentrations and rate constants may readily result in quantitation errors of 15% or more; the exact degree of error depends upon the details of the system under consideration; (ii) if T1's vary as well, significantly larger quantitation errors may occur; and (iii) optimal values of pulse parameters may minimize errors in quantitation with minimal S/N loss.
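
    The partial-saturation correction discussed above rests on the steady-state signal of the one-pulse experiment. The Python sketch below states that generic relationship and shows how a change in T1 between measurements biases the corrected value; it is not the authors' optimization code, and the parameter values are hypothetical.

      import numpy as np

      def steady_state_signal(m0, t1, tr, theta_deg):
          """Steady-state one-pulse signal for repetition time tr and flip angle theta."""
          e1 = np.exp(-tr / t1)
          theta = np.radians(theta_deg)
          return m0 * np.sin(theta) * (1.0 - e1) / (1.0 - np.cos(theta) * e1)

      def saturation_factor(t1, tr, theta_deg):
          """Ratio of partially saturated to fully relaxed signal; dividing the
          measured signal by this factor gives the saturation-corrected signal."""
          return steady_state_signal(1.0, t1, tr, theta_deg) / np.sin(np.radians(theta_deg))

      # If T1 changes (e.g., after an intervention) but the correction still uses
      # the old T1, a systematic quantitation error results:
      measured = steady_state_signal(1.0, t1=1.8, tr=2.0, theta_deg=60.0)  # true T1
      factor   = saturation_factor(t1=1.5, tr=2.0, theta_deg=60.0)         # assumed T1
      corrected = measured / factor
      print("relative quantitation error:", corrected / np.sin(np.radians(60.0)) - 1.0)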

  15. Quantitative proteomics analysis of adsorbed plasma proteins classifies nanoparticles with different surface properties and size

    SciTech Connect

    Zhang, Haizhen; Burnum, Kristin E.; Luna, Maria L.; Petritis, Brianne O.; Kim, Jong Seo; Qian, Weijun; Moore, Ronald J.; Heredia-Langner, Alejandro; Webb-Robertson, Bobbie-Jo M.; Thrall, Brian D.; Camp, David G.; Smith, Richard D.; Pounds, Joel G.; Liu, Tao

    2011-12-01

    In biofluids (e.g., blood plasma) nanoparticles are readily embedded in layers of proteins that can affect their biological activity and biocompatibility. Herein, we report a study on the interactions between human plasma proteins and nanoparticles with a controlled systematic variation of properties using stable isotope labeling and liquid chromatography-mass spectrometry (LC-MS) based quantitative proteomics. A novel protocol was developed to simplify the isolation of nanoparticle-bound proteins and improve reproducibility. Plasma proteins associated with polystyrene nanoparticles with three different surface chemistries and two sizes, as well as four different exposure times (for a total of 24 different samples), were identified and quantified by LC-MS analysis. Quantitative comparison of relative protein abundances was achieved by spiking an 18O-labeled 'universal reference' into each individually processed unlabeled sample as an internal standard, enabling simultaneous application of both label-free and isotopic labeling quantitation across the sample set. Clustering analysis of the quantitative proteomics data resulted in a distinctive pattern that classifies the nanoparticles based on their surface properties and size. In addition, data from the temporal study indicated that the stable protein 'corona' that was isolated for the quantitative analysis appeared to be formed in less than 5 minutes. The comprehensive results obtained herein using quantitative proteomics have potential implications towards predicting nanoparticle biocompatibility.

  16. The quantitative potential for breast tomosynthesis imaging

    SciTech Connect

    Shafer, Christina M.; Samei, Ehsan; Lo, Joseph Y.

    2010-03-15

    Purpose: Due to its limited angular scan range, breast tomosynthesis has lower resolution in the depth direction, which may limit its accuracy in quantifying tissue density. This study assesses the quantitative potential of breast tomosynthesis using relatively simple reconstruction and image processing algorithms. This quantitation could allow improved characterization of lesions as well as image processing to present tomosynthesis images with the familiar appearance of mammography by preserving more low-frequency information. Methods: All studies were based on a Siemens prototype MAMMOMAT Novation TOMO breast tomo system with a 45 deg. total angular span. This investigation was performed using both simulations and empirical measurements. Monte Carlo simulations were conducted using the breast tomosynthesis geometry and tissue-equivalent, uniform, voxelized phantoms with cuboid lesions of varying density embedded within. Empirical studies were then performed using tissue-equivalent plastic phantoms which were imaged on the actual prototype system. The material surrounding the lesions was set to either fat-equivalent or glandular-equivalent plastic. From the simulation experiments, the effects of scatter, lesion depth, and background material density were studied. The empirical experiments studied the effects of lesion depth, background material density, x-ray tube energy, and exposure level. Additionally, the proposed analysis methods were independently evaluated using a commercially available QA breast phantom (CIRS Model 11A). All image reconstruction was performed with a filtered backprojection algorithm. Reconstructed voxel values within each slice were corrected to reduce background nonuniformities. Results: The resulting lesion voxel values varied linearly with known glandular fraction (correlation coefficient R² > 0.90) under all simulated and empirical conditions, including for the independent tests with the QA phantom. Analysis of variance performed

  17. Performance Boosting Additive

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Mainstream Engineering Corporation was awarded Phase I and Phase II contracts from Goddard Space Flight Center's Small Business Innovation Research (SBIR) program in early 1990. With support from the SBIR program, Mainstream Engineering Corporation has developed a unique low cost additive, QwikBoost (TM), that increases the performance of air conditioners, heat pumps, refrigerators, and freezers. Because of the energy and environmental benefits of QwikBoost, Mainstream received the Tibbetts Award at a White House Ceremony on October 16, 1997. QwikBoost was introduced at the 1998 International Air Conditioning, Heating, and Refrigeration Exposition. QwikBoost is packaged in a handy 3-ounce can (pressurized with R-134a) and will be available for automotive air conditioning systems in summer 1998.

  18. Sewage sludge additive

    NASA Technical Reports Server (NTRS)

    Kalvinskas, J. J.; Mueller, W. A.; Ingham, J. D. (Inventor)

    1980-01-01

    The additive is for a raw sewage treatment process of the type where settling tanks are used for the purpose of permitting the suspended matter in the raw sewage to be settled as well as to permit adsorption of the dissolved contaminants in the water of the sewage. The sludge, which settles down to the bottom of the settling tank is extracted, pyrolyzed and activated to form activated carbon and ash which is mixed with the sewage prior to its introduction into the settling tank. The sludge does not provide all of the activated carbon and ash required for adequate treatment of the raw sewage. It is necessary to add carbon to the process and instead of expensive commercial carbon, coal is used to provide the carbon supplement.

  19. Perspectives on Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Bourell, David L.

    2016-07-01

    Additive manufacturing (AM) has skyrocketed in visibility commercially and in the public sector. This article describes the development of this field from early layered manufacturing approaches of photosculpture, topography, and material deposition. Certain precursors to modern AM processes are also briefly described. The growth of the field over the last 30 years is presented. Included is the standard delineation of AM technologies into seven broad categories. The economics of AM part generation is considered, and the impacts of the economics on application sectors are described. On the basis of current trends, the future outlook will include a convergence of AM fabricators, mass-produced AM fabricators, enabling of topology optimization designs, and specialization in the AM legal arena. Long-term developments with huge impact are organ printing and volume-based printing.

  20. Sarks as additional fermions

    NASA Astrophysics Data System (ADS)

    Agrawal, Jyoti; Frampton, Paul H.; Jack Ng, Y.; Nishino, Hitoshi; Yasuda, Osamu

    1991-03-01

    An extension of the standard model is proposed. The gauge group is SU(2)_X ⊗ SU(3)_C ⊗ SU(2)_S ⊗ U(1)_Q, where all gauge symmetries are unbroken. The colour and electric charge are combined with SU(2)_S, which becomes strongly coupled at approximately 500 GeV and binds preons to form fermionic and vector bound states. The usual quarks and leptons are singlets under SU(2)_X, but additional fermions, called sarks, transform under it and the electroweak group. The present model explains why no more than three light quark-lepton families can exist. Neutral sark baryons, called narks, are candidates for the cosmological dark matter, having the characteristics designed for WIMPs. Further phenomenological implications of sarks are analyzed, including electron-positron annihilation, Z0 decay, flavor-changing neutral currents, baryon-number non-conservation, sarkonium, and the neutron electric dipole moment.

  1. A quantitative assessment of results with the Angelchik prosthesis.

    PubMed Central

    Wyllie, J. H.; Edwards, D. A.

    1985-01-01

    The Angelchik antireflux prosthesis was assessed in 15 unpromising patients, 12 of whom had peptic strictures of the oesophagus. Radiological techniques were used to show the effect of the device on gastro-oesophageal reflux, and on the bore and length of strictures. Twelve months later (range 6-24), most patients were well satisfied with the operation, and all considered it had been worthwhile; there was radiological evidence of reduction in reflux and remission of strictures. The device never surrounded the oesophageal sphincter; in all but 1 case it encircled a tube of stomach. PMID:4037629

  2. Evolutionary Quantitative Genomics of Populus trichocarpa

    PubMed Central

    McKown, Athena D.; La Mantia, Jonathan; Guy, Robert D.; Ingvarsson, Pär K.; Hamelin, Richard; Mansfield, Shawn D.; Ehlting, Jürgen; Douglas, Carl J.; El-Kassaby, Yousry A.

    2015-01-01

    Forest trees generally show high levels of local adaptation and efforts focusing on understanding adaptation to climate will be crucial for species survival and management. Here, we address fundamental questions regarding the molecular basis of adaptation in undomesticated forest tree populations to past climatic environments by employing an integrative quantitative genetics and landscape genomics approach. Using this comprehensive approach, we studied the molecular basis of climate adaptation in 433 Populus trichocarpa (black cottonwood) genotypes originating across western North America. Variation in 74 field-assessed traits (growth, ecophysiology, phenology, leaf stomata, wood, and disease resistance) was investigated for signatures of selection (comparing QST -FST) using clustering of individuals by climate of origin (temperature and precipitation). 29,354 SNPs were investigated employing three different outlier detection methods and marker-inferred relatedness was estimated to obtain the narrow-sense estimate of population differentiation in wild populations. In addition, we compared our results with previously assessed selection of candidate SNPs using the 25 topographical units (drainages) across the P. trichocarpa sampling range as population groupings. Narrow-sense QST for 53% of distinct field traits was significantly divergent from expectations of neutrality (indicating adaptive trait variation); 2,855 SNPs showed signals of diversifying selection and of these, 118 SNPs (within 81 genes) were associated with adaptive traits (based on significant QST). Many SNPs were putatively pleiotropic for functionally uncorrelated adaptive traits, such as autumn phenology, height, and disease resistance. Evolutionary quantitative genomics in P. trichocarpa provides an enhanced understanding regarding the molecular basis of climate-driven selection in forest trees and we highlight that important loci underlying adaptive trait variation also show relationship to

  3. Quantitative nondestructive characterization of visco-elastic materials at high pressure

    SciTech Connect

    Aizawa, Tatsuhiko; Kihara, Junji; Ohno, Jun

    1995-11-01

    A new anvil apparatus was developed to realize a high-pressure atmosphere suitable for investigating the viscoelastic behavior of soft materials such as polymers, lubricants, and proteins. In addition, an ultrasonic spectroscopy system was newly constructed to make quantitative nondestructive evaluations of the elasticity and viscosity of soft materials at high pressure. To demonstrate the validity and effectiveness of the developed system and methodology for quantitative nondestructive visco-elastic characterization, various silicone oils were employed, and the measured spectra were compared with the theoretical results calculated from the three-element linear viscoelastic model.

  4. Quantitative SPECT techniques.

    PubMed

    Watson, D D

    1999-07-01

    Quantitative imaging involves first, a set of measurements that characterize an image. There are several variations of technique, but the basic measurements that are used for single photon emission computed tomography (SPECT) perfusion images are reasonably standardized. Quantification currently provides only relative tracer activity within the myocardial regions defined by an individual SPECT acquisition. Absolute quantification is still a work in progress. Quantitative comparison of absolute changes in tracer uptake comparing a stress and rest study or preintervention and postintervention study would be useful and could be done, but most commercial systems do not maintain the data normalization that is necessary for this. Measurements of regional and global function are now possible with electrocardiography (ECG) gating, and this provides clinically useful adjunctive data. Techniques for measuring ventricular function are evolving and promise to provide clinically useful accuracy. The computer can classify images as normal or abnormal by comparison with a normal database. The criteria for this classification involve more than just checking the normal limits. The images should be analyzed to measure how far they deviate from normal, and this information can be used in conjunction with pretest likelihood to indicate the level of statistical certainty that an individual patient has a true positive or true negative test. The interface between the computer and the clinician interpreter is an important part of the process. Especially when both perfusion and function are being determined, the ability of the interpreter to correctly assimilate the data is essential to the use of the quantitative process. As we become more facile with performing and recording objective measurements, the significance of the measurements in terms of risk evaluation, viability assessment, and outcome should be continually enhanced. PMID:10433336

  5. Does Preoperative Measurement of Cerebral Blood Flow with Acetazolamide Challenge in Addition to Preoperative Measurement of Cerebral Blood Flow at the Resting State Increase the Predictive Accuracy of Development of Cerebral Hyperperfusion after Carotid Endarterectomy? Results from 500 Cases with Brain Perfusion Single-photon Emission Computed Tomography Study

    PubMed Central

    OSHIDA, Sotaro; OGASAWARA, Kuniaki; SAURA, Hiroaki; YOSHIDA, Koji; FUJIWARA, Shunro; KOJIMA, Daigo; KOBAYASHI, Masakazu; YOSHIDA, Kenji; KUBO, Yoshitaka; OGAWA, Akira

    2015-01-01

    The purpose of the present study was to determine whether preoperative measurement of cerebral blood flow (CBF) with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of development of cerebral hyperperfusion after carotid endarterectomy (CEA). CBF at the resting state and cerebrovascular reactivity (CVR) to acetazolamide were quantitatively assessed using N-isopropyl-p-[123I]-iodoamphetamine (IMP)-autoradiography method with single-photon emission computed tomography (SPECT) before CEA in 500 patients with ipsilateral internal carotid artery stenosis (≥ 70%). CBF measurement using 123I-IMP SPECT was also performed immediately and 3 days after CEA. A region of interest (ROI) was automatically placed in the middle cerebral artery territory in the affected cerebral hemisphere using a three-dimensional stereotactic ROI template. Preoperative decreases in CBF at the resting state [95% confidence intervals (CIs), 0.855 to 0.967; P = 0.0023] and preoperative decreases in CVR to acetazolamide (95% CIs, 0.844 to 0.912; P < 0.0001) were significant independent predictors of post-CEA hyperperfusion. The area under the receiver operating characteristic curve for prediction of the development of post-CEA hyperperfusion was significantly greater for CVR to acetazolamide than for CBF at the resting state (difference between areas, 0.173; P < 0.0001). Sensitivity, specificity, and positive- and negative-predictive values for the prediction of the development of post-CEA hyperperfusion were significantly greater for CVR to acetazolamide than for CBF at the resting state (P < 0.05, respectively). The present study demonstrated that preoperative measurement of CBF with acetazolamide in addition to preoperative measurement of CBF at the resting state increases the predictive accuracy of the development of post-CEA hyperperfusion. PMID:25746308

  6. Additive lattice kirigami

    PubMed Central

    Castle, Toen; Sussman, Daniel M.; Tanis, Michael; Kamien, Randall D.

    2016-01-01

    Kirigami uses bending, folding, cutting, and pasting to create complex three-dimensional (3D) structures from a flat sheet. In the case of lattice kirigami, this cutting and rejoining introduces defects into an underlying 2D lattice in the form of points of nonzero Gaussian curvature. A set of simple rules was previously used to generate a wide variety of stepped structures; we now pare back these rules to their minimum. This allows us to describe a set of techniques that unify a wide variety of cut-and-paste actions under the rubric of lattice kirigami, including adding new material and rejoining material across arbitrary cuts in the sheet. We also explore the use of more complex lattices and the different structures that consequently arise. Regardless of the choice of lattice, creating complex structures may require multiple overlapping kirigami cuts, where subsequent cuts are not performed on a locally flat lattice. Our additive kirigami method describes such cuts, providing a simple methodology and a set of techniques to build a huge variety of complex 3D shapes. PMID:27679822

  7. Additive lattice kirigami

    PubMed Central

    Castle, Toen; Sussman, Daniel M.; Tanis, Michael; Kamien, Randall D.

    2016-01-01

    Kirigami uses bending, folding, cutting, and pasting to create complex three-dimensional (3D) structures from a flat sheet. In the case of lattice kirigami, this cutting and rejoining introduces defects into an underlying 2D lattice in the form of points of nonzero Gaussian curvature. A set of simple rules was previously used to generate a wide variety of stepped structures; we now pare back these rules to their minimum. This allows us to describe a set of techniques that unify a wide variety of cut-and-paste actions under the rubric of lattice kirigami, including adding new material and rejoining material across arbitrary cuts in the sheet. We also explore the use of more complex lattices and the different structures that consequently arise. Regardless of the choice of lattice, creating complex structures may require multiple overlapping kirigami cuts, where subsequent cuts are not performed on a locally flat lattice. Our additive kirigami method describes such cuts, providing a simple methodology and a set of techniques to build a huge variety of complex 3D shapes.

  8. High-throughput automated image analysis of neuroinflammation and neurodegeneration enables quantitative assessment of virus neurovirulence

    PubMed Central

    Maximova, Olga A.; Murphy, Brian R.; Pletnev, Alexander G.

    2010-01-01

    Historically, the safety of live attenuated vaccine candidates against neurotropic viruses was assessed by semi-quantitative analysis of virus-induced histopathology in the central nervous system of monkeys. We have developed a high-throughput automated image analysis (AIA) for the quantitative assessment of virus-induced neuroinflammation and neurodegeneration. Evaluation of the results generated by AIA showed that quantitative estimates of lymphocytic infiltration, microglial activation, and neurodegeneration strongly and significantly correlated with results of traditional histopathological scoring. In addition, we show that AIA is a targeted, objective, accurate, and time-efficient approach that provides reliable differentiation of virus neurovirulence. As such, it may become a useful tool in establishing consistent analytical standards across research and development laboratories and regulatory agencies, and may improve the safety evaluation of live virus vaccines. The implementation of this high-throughput AIA will markedly advance many fields of research including virology, neuroinflammation, neuroscience, and vaccinology. PMID:20688036

  9. High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.

    ERIC Educational Resources Information Center

    Bohman, Ove; And Others

    1982-01-01

    Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
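
    As a hedged illustration of the standard-addition approach mentioned above (not the protocol from this record itself), the following sketch spikes equal aliquots with increasing amounts of a retinol standard, fits the responses to a line, and reads the unspiked concentration from the x-intercept; all numbers are hypothetical.

```python
# Standard-addition sketch: the unknown concentration is |intercept / slope| of
# the response-vs-added-concentration line. Numbers are hypothetical.
import numpy as np

added_conc = np.array([0.0, 1.0, 2.0, 3.0, 4.0])          # ug/mL of retinol standard added
peak_area = np.array([52.0, 98.0, 147.0, 193.0, 241.0])   # hypothetical detector response

slope, intercept = np.polyfit(added_conc, peak_area, 1)
sample_conc = intercept / slope   # ug/mL in the prepared aliquot, before any dilution correction
print(f"estimated retinol in the aliquot: {sample_conc:.2f} ug/mL")
```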

  10. Cancer detection by quantitative fluorescence image analysis.

    PubMed

    Parry, W L; Hemstreet, G P

    1988-02-01

    Quantitative fluorescence image analysis is a rapidly evolving biophysical cytochemical technology with the potential for multiple clinical and basic research applications. We report the application of this technique for bladder cancer detection and discuss its potential usefulness as an adjunct to methods used currently by urologists for the diagnosis and management of bladder cancer. Quantitative fluorescence image analysis is a cytological method that incorporates 2 diagnostic techniques, quantitation of nuclear deoxyribonucleic acid and morphometric analysis, in a single semiautomated system to facilitate the identification of rare events, that is individual cancer cells. When compared to routine cytopathology for detection of bladder cancer in symptomatic patients, quantitative fluorescence image analysis demonstrated greater sensitivity (76 versus 33 per cent) for the detection of low grade transitional cell carcinoma. The specificity of quantitative fluorescence image analysis in a small control group was 94 per cent and with the manual method for quantitation of absolute nuclear fluorescence intensity in the screening of high risk asymptomatic subjects the specificity was 96.7 per cent. The more familiar flow cytometry is another fluorescence technique for measurement of nuclear deoxyribonucleic acid. However, rather than identifying individual cancer cells, flow cytometry identifies cellular pattern distributions, that is the ratio of normal to abnormal cells. Numerous studies by others have shown that flow cytometry is a sensitive method to monitor patients with diagnosed urological disease. Based upon results in separate quantitative fluorescence image analysis and flow cytometry studies, it appears that these 2 fluorescence techniques may be complementary tools for urological screening, diagnosis and management, and that they also may be useful separately or in combination to elucidate the oncogenic process, determine the biological potential of tumors

  11. Simplified quantitative glycomics using the stable isotope label Girard's reagent p by electrospray ionization mass spectrometry.

    PubMed

    Wang, Chengjian; Wu, Zhiyu; Yuan, Jiangbei; Wang, Bo; Zhang, Ping; Zhang, Ying; Wang, Zhongfu; Huang, Linjuan

    2014-02-01

    Fast, sensitive, and simple methods for quantitative analysis of disparities in glycan expression between different biological samples are essential for studies of protein glycosylation patterns (glycomics) and the search for disease glycan biomarkers. Relative quantitation of glycans based on stable isotope labeling combined with mass spectrometric detection represents an emerging and promising technique. However, this technique is undermined by the complexity of mass spectra of isotope-labeled glycans caused by the presence of multiple metal ion adduct signals, which result in a decrease of detection sensitivity and an increase of difficulties in data interpretation. Herein we report a simplified quantitative glycomics strategy, which features nonreductive isotopic labeling of reducing glycans with either nondeuterated (d0-) or deuterated (d5-) Girard's reagent P (GP) without salts introduced and simplified mass spectrometric profiles of d0- and d5-GP derivatives of neutral glycans as molecular ions without complex metal ion adducts, allowing rapid and sensitive quantitative comparison between different glycan samples. We have obtained optimized GP-labeling conditions and good quantitation linearity, reproducibility, and accuracy of data by the method. Its excellent applicability was validated by comparatively quantitative analysis of the neutral N-glycans released from bovine and porcine immunoglobulin G as well as of those from mouse and rat sera. Additionally, we have revealed the potential of this strategy for the high-sensitivity analysis of sialylated glycans as GP derivatives, which involves neutralization of the carboxyl group of sialic acid by chemical derivatization.

  12. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back part of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in quantitatively analyzing the shapes of the parts of the face; in articles previously published or to be published in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively, and the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which are bald.

  13. Quantitative survey on the shape of the back of men's head as viewed from the side.

    PubMed

    Tamir, Abraham

    2013-05-01

    This article quantitatively classifies the back part of men's heads, viewed from the side, into 4 shapes, which are demonstrated in some of the figures in this article. For self-evident reasons, the shapes were blurred. The survey is based on the analysis of 2220 shapes obtained by photographing mainly bald men and by finding pictures on the Internet. To the best of the author's knowledge, this quantitative approach has never been implemented before. The results obtained are as follows: the percentage of 376 "flat heads" is 17%; the percentage of 755 "little round heads," 34%; the percentage of 1017 "round heads," 45.8%; and the percentage of 72 "very round heads," 3.2%. This quantitative survey is an additional step in quantitatively analyzing the shapes of the parts of the face; in articles previously published or to be published in this journal, the shapes of the nose, ear conch, and human eye were analyzed quantitatively, and the shapes of the leg toes were also analyzed. Finally, it should be noted that, for obvious reasons, the survey is based on men's heads, most of which are bald. PMID:23714907

  14. Quantitative Hyperspectral Reflectance Imaging

    PubMed Central

    Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.

    2008-01-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms.

  15. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and output format. We then built a qualitative model for the biophysical process of the osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling the dynamic biological systems is discussed.

  16. Quantitative volumetric breast density estimation using phase contrast mammography

    NASA Astrophysics Data System (ADS)

    Wang, Zhentian; Hauser, Nik; Kubik-Huch, Rahel A.; D'Isidoro, Fabio; Stampanoni, Marco

    2015-05-01

    Phase contrast mammography using a grating interferometer is an emerging technology for breast imaging. It provides complementary information to the conventional absorption-based methods. Additional diagnostic values could be further obtained by retrieving quantitative information from the three physical signals (absorption, differential phase and small-angle scattering) yielded simultaneously. We report a non-parametric quantitative volumetric breast density estimation method by exploiting the ratio (dubbed the R value) of the absorption signal to the small-angle scattering signal. The R value is used to determine breast composition and the volumetric breast density (VBD) of the whole breast is obtained analytically by deducing the relationship between the R value and the pixel-wise breast density. The proposed method is tested by a phantom study and a group of 27 mastectomy samples. In the clinical evaluation, the estimated VBD values from both cranio-caudal (CC) and anterior-posterior (AP) views are compared with the ACR scores given by radiologists to the pre-surgical mammograms. The results show that the estimated VBD results using the proposed method are consistent with the pre-surgical ACR scores, indicating the effectiveness of this method in breast density estimation. A positive correlation is found between the estimated VBD and the diagnostic ACR score for both the CC view (p=0.033 ) and AP view (p=0.001 ). A linear regression between the results of the CC view and AP view showed a correlation coefficient γ = 0.77, which indicates the robustness of the proposed method and the quantitative character of the additional information obtained with our approach.

  17. Quantitative volumetric breast density estimation using phase contrast mammography.

    PubMed

    Wang, Zhentian; Hauser, Nik; Kubik-Huch, Rahel A; D'Isidoro, Fabio; Stampanoni, Marco

    2015-05-21

    Phase contrast mammography using a grating interferometer is an emerging technology for breast imaging. It provides complementary information to the conventional absorption-based methods. Additional diagnostic values could be further obtained by retrieving quantitative information from the three physical signals (absorption, differential phase and small-angle scattering) yielded simultaneously. We report a non-parametric quantitative volumetric breast density estimation method by exploiting the ratio (dubbed the R value) of the absorption signal to the small-angle scattering signal. The R value is used to determine breast composition and the volumetric breast density (VBD) of the whole breast is obtained analytically by deducing the relationship between the R value and the pixel-wise breast density. The proposed method is tested by a phantom study and a group of 27 mastectomy samples. In the clinical evaluation, the estimated VBD values from both cranio-caudal (CC) and anterior-posterior (AP) views are compared with the ACR scores given by radiologists to the pre-surgical mammograms. The results show that the estimated VBD results using the proposed method are consistent with the pre-surgical ACR scores, indicating the effectiveness of this method in breast density estimation. A positive correlation is found between the estimated VBD and the diagnostic ACR score for both the CC view (p = 0.033) and AP view (p = 0.001). A linear regression between the results of the CC view and AP view showed a correlation coefficient γ = 0.77, which indicates the robustness of the proposed method and the quantitative character of the additional information obtained with our approach.

  18. Bayes' theorem and quantitative risk assessment

    SciTech Connect

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, the paper argues that one should strive to make QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
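
    A minimal sketch of the kind of Bayesian updating advocated here: a discrete prior over a component failure frequency is combined with observed evidence via Bayes' theorem. The candidate frequencies, prior weights, and failure counts are illustrative assumptions only.

```python
# Toy Bayes' theorem update for a QRA-style parameter: P(theta | evidence) is
# proportional to prior(theta) * likelihood(evidence | theta). All numbers are
# illustrative.
import numpy as np

freqs = np.array([1e-4, 1e-3, 1e-2])   # candidate failure probabilities per demand
prior = np.array([0.3, 0.5, 0.2])      # analyst's prior state of knowledge

failures, demands = 2, 1000            # observed evidence
likelihood = freqs**failures * (1.0 - freqs)**(demands - failures)  # binomial kernel

posterior = prior * likelihood
posterior /= posterior.sum()           # normalize

for f, p in zip(freqs, posterior):
    print(f"P(failure frequency = {f:g} | evidence) = {p:.3f}")
```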

  19. Quantitative comparisons of in vitro assays for estrogenic activities.

    PubMed Central

    Fang, H; Tong, W; Perkins, R; Soto, A M; Prechtl, N V; Sheehan, D M

    2000-01-01

    Substances that may act as estrogens show a broad chemical structural diversity. To thoroughly address the question of possible adverse estrogenic effects, reliable methods are needed to detect and identify the chemicals of these diverse structural classes. We compared three assays--in vitro estrogen receptor competitive binding assays (ER binding assays), yeast-based reporter gene assays (yeast assays), and the MCF-7 cell proliferation assay (E-SCREEN assay)--to determine their quantitative agreement in identifying structurally diverse estrogens. We examined assay performance for relative sensitivity, detection of active/inactive chemicals, and estrogen/antiestrogen activities. In this examination, we combined individual data sets in a specific, quantitative data mining exercise. Data sets for at least 29 chemicals from five laboratories were analyzed pair-wise by X-Y plots. The ER binding assay was a good predictor for the other two assay results when the antiestrogens were excluded (r² = 0.78 for the yeast assays and 0.85 for the E-SCREEN assays). Additionally, the examination strongly suggests that biologic information that is not apparent from any of the individual assays can be discovered by quantitative pair-wise comparisons among assays. Antiestrogens are identified as outliers in the ER binding/yeast assay, while complete antagonists are identified in the ER binding and E-SCREEN assays. Furthermore, the presence of outliers may be explained by different mechanisms that induce an endocrine response, different impurities in different batches of chemicals, different species sensitivity, or limitations of the assay techniques. Although these assays involve different levels of biologic complexity, the major conclusion is that they generally provided consistent information in quantitatively determining estrogenic activity for the five data sets examined. The results should provide guidance for expanded data mining examinations and the selection of appropriate

  20. Quantitation of latex allergens.

    PubMed

    Palosuo, Timo; Alenius, Harri; Turjanmaa, Kristiina

    2002-05-01

    Minimizing allergen concentration in latex goods to prevent sensitization to natural rubber latex (NRL) and thereby the development of clinical allergy is acknowledged as of mutual interest for rubber manufacturers and regulatory health authorities. However, measuring total protein, the principal currently available method, cannot be deemed a satisfactory regulatory measure to control allergen content. Specific methods based on human IgE-containing reagents, such as radioallergosorbent test (RAST) inhibition, have been available in certain laboratories for demonstrating NRL allergens in rubber products but the methods lack standardization. Currently, one commercial test has become available for measuring individual NRL allergens by capture ELISA-based assays using monoclonal antibodies and purified or recombinant allergens. Such methods are specific, they can be properly standardized, and they are of sufficient sensitivity and reproducibility. Results from medical gloves collected in two national market surveys in Finland in 1995 and 1999, respectively, show that Hev b 6.02 and Hev b 5, the two major allergens for NRL-allergic adults, are the most abundant allergens regularly detectable in high- and moderate-allergen gloves. In addition, Hev b 3 and Hev b 1, the two major allergens for children with spina bifida, are also commonly found. In general, when the sum of the four allergens exceeded 1 microg/g, most NRL-allergic patients showed positive skin prick test reactions against them. Using these new methods assessment of threshold levels that could in due course become guidelines for the rubber industry and regulatory health authorities is becoming possible. Eventually, this progress is expected to lead to a declining incidence of latex allergy. PMID:12079417

  1. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
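
    A minimal sketch of the finite-difference estimate the abstract describes, assuming a one-dimensional array of antennas at known positions; the positions and voltages below are hypothetical, and the sign convention of the field is ignored.

```python
# Quantitative field estimate per antenna pair: E ~ (voltage difference) / (known separation).
# Positions and voltages are hypothetical.
antenna_positions = [0.00, 0.05, 0.10, 0.15]      # metres along one dimension
antenna_voltages = [0.000, 0.012, 0.023, 0.035]   # volts measured at each antenna

field_estimates = []
for i in range(len(antenna_positions) - 1):
    dv = antenna_voltages[i + 1] - antenna_voltages[i]
    dd = antenna_positions[i + 1] - antenna_positions[i]
    field_estimates.append(dv / dd)               # V/m between adjacent antennas

print(field_estimates)   # quantitative description of the field over the region
```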

  2. Geographical Variation in a Quantitative Character

    PubMed Central

    Nagylaki, T.

    1994-01-01

    A model for the evolution of the local averages of a quantitative character under migration, selection, and random genetic drift in a subdivided population is formulated and investigated. Generations are discrete and nonoverlapping; the monoecious, diploid population mates at random in each deme. All three evolutionary forces are weak, but the migration pattern and the local population numbers are otherwise arbitrary. The character is determined by purely additive gene action and a stochastically independent environment; its distribution is Gaussian with a constant variance; and it is under Gaussian stabilizing selection with the same parameters in every deme. Linkage disequilibrium is neglected. Most of the results concern the covariances of the local averages. For a finite number of demes, explicit formulas are derived for (i) the asymptotic rate and pattern of convergence to equilibrium, (ii) the variance of a suitably weighted average of the local averages, and (iii) the equilibrium covariances when selection and random drift are much weaker than migration. Essentially complete analyses of equilibrium and convergence are presented for random outbreeding and site homing, the Levene and island models, the circular habitat and the unbounded linear stepping-stone model in the diffusion approximation, and the exact unbounded stepping-stone model in one and two dimensions. PMID:8138171

  3. Extracting Quantitative Data from Lunar Soil Spectra

    NASA Technical Reports Server (NTRS)

    Noble, S. K.; Pieters, C. M.; Hiroi, T.

    2005-01-01

    Using the modified Gaussian model (MGM) developed by Sunshine et al. [1] we compared the spectral properties of the Lunar Soil Characterization Consortium (LSCC) suite of lunar soils [2,3] with their petrologic and chemical compositions to obtain quantitative data. Our initial work on Apollo 17 soils [4] suggested that useful compositional data could be elicited from high quality soil spectra. We are now able to expand upon those results with the full suite of LSCC soils that allows us to explore a much wider range of compositions and maturity states. The model is shown to be sensitive to pyroxene abundance and can evaluate the relative portion of high-Ca and low-Ca pyroxenes in the soils. In addition, the dataset has provided unexpected insights into the nature and causes of absorption bands in lunar soils. For example, it was found that two distinct absorption bands are required in the 1.2 µm region of the spectrum. Neither of these bands can be attributed to plagioclase or agglutinates, but both appear to be largely due to pyroxene.

  4. Quantitative cell biology: the essential role of theory.

    PubMed

    Howard, Jonathon

    2014-11-01

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms, when the theory works, or to new discoveries, when it does not.

  5. Reproducibility and quantitation of amplicon sequencing-based detection.

    PubMed

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-08-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
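
    The technical-replicate overlap quoted above can be illustrated with a simple set computation. The OTU identifiers below are invented, and the overlap definition (shared OTUs divided by all OTUs detected in either replicate) is an assumption rather than a restatement of the paper's exact formula.

```python
# Hypothetical OTU overlap between two technical replicates of the same sample.
rep1 = {"OTU_1", "OTU_2", "OTU_3", "OTU_4", "OTU_7"}
rep2 = {"OTU_2", "OTU_3", "OTU_5", "OTU_8"}

shared = rep1 & rep2
detected_in_either = rep1 | rep2
overlap_pct = 100.0 * len(shared) / len(detected_in_either)
print(f"OTU overlap between replicates: {overlap_pct:.1f}%")
```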

  6. Improved Synthesis of and Nucleophilic Addition to 2-Formyl-2-Cyclohexenone

    PubMed Central

    Adary, Elan M.; Chang, Chih-wei; D'Auria, Damian T.; Nguyen, Phuc M.; Polewacz, Klaudyna; Reinicke, Justin A.; Seo, Hannah; Berger, Gideon O.

    2014-01-01

    A preparation of 2-formyl-2-cyclohexenone in nearly quantitative yield and purity of approximately 95% is described. It is scalable and has been extended to the synthesis of the 5- and 7-membered ring homologs with comparable yields. Conditions have also been developed for the successful conjugate addition of dimethylmalonate to 2-formyl-2-cyclohexenone, in good and scalable yield (60%). This result has been extended to 5 other nucleophile classes, and the dimethylmalonate conjugate addition has been demonstrated with 2-formyl-2-cyclopentenone and 2-formyl-2-cycloheptenone. PMID:25593375

  7. A MALDI-MS-based quantitative analytical method for endogenous estrone in human breast cancer cells

    PubMed Central

    Kim, Kyoung-Jin; Kim, Hee-Jin; Park, Han-Gyu; Hwang, Cheol-Hwan; Sung, Changmin; Jang, Kyoung-Soon; Park, Sung-Hee; Kim, Byung-Gee; Lee, Yoo-Kyung; Yang, Yung-Hun; Jeong, Jae Hyun; Kim, Yun-Gon

    2016-01-01

    The level of endogenous estrone, one of the three major naturally occurring estrogens, has a significant correlation with the incidence of post-menopausal breast cancer. However, it is challenging to quantitatively monitor it owing to its low abundance. Here, we develop a robust and highly sensitive matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS)-based quantitative platform to identify the absolute quantities of endogenous estrones in a variety of clinical specimens. The one-step modification of endogenous estrone provided good linearity (R² > 0.99) and significantly increased the sensitivity of the platform (limit of quantitation: 11 fmol). In addition, we could identify the absolute amount of endogenous estrones in cells of the breast cancer cell line MCF-7 (34 fmol/10⁶ cells) by using a deuterated estrone as an internal standard. Finally, by applying the MALDI-MS-based quantitative method to endogenous estrones, we successfully monitored changes in the metabolic expression level of estrones (17.7 fmol/10⁶ letrozole-treated cells) in MCF-7 cells resulting from treatment with an aromatase inhibitor. Taken together, these results suggest that this MALDI-MS-based quantitative approach may be a general method for the targeted metabolomics of ketone-containing metabolites, which can reflect clinical conditions and pathogenic mechanisms. PMID:27091422
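
    The internal-standard calculation behind such absolute quantitation can be sketched as below, assuming equal detector response for the deuterated and unlabeled forms (an assumption made for illustration, not a detail taken from the paper); the intensities and spike amount are hypothetical.

```python
# Isotope-dilution style estimate: analyte amount = spiked standard amount
# scaled by the intensity ratio of analyte to internal standard.
intensity_estrone = 3.4e4      # hypothetical peak intensity of endogenous (d0) estrone
intensity_standard = 1.0e5     # hypothetical peak intensity of the deuterated standard
spiked_amount_fmol = 100.0     # known amount of internal standard added

estrone_fmol = spiked_amount_fmol * intensity_estrone / intensity_standard
print(f"estimated endogenous estrone: {estrone_fmol:.1f} fmol")
```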

  8. Quantitative photoacoustic elastography in humans

    NASA Astrophysics Data System (ADS)

    Hai, Pengfei; Zhou, Yong; Gong, Lei; Wang, Lihong V.

    2016-06-01

    We report quantitative photoacoustic elastography (QPAE) capable of measuring Young's modulus of biological tissue in vivo in humans. By combining conventional PAE with a stress sensor having known stress-strain behavior, QPAE can simultaneously measure strain and stress, from which Young's modulus is calculated. We first demonstrate the feasibility of QPAE in agar phantoms with different concentrations. The measured Young's modulus values fit well with both the empirical expectation based on the agar concentrations and those measured in an independent standard compression test. Next, QPAE was applied to quantify the Young's modulus of skeletal muscle in vivo in humans, showing a linear relationship between muscle stiffness and loading. The results demonstrated the capability of QPAE to assess the absolute elasticity of biological tissue noninvasively in vivo in humans, indicating its potential for tissue biomechanics studies and clinical applications.
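
    Since QPAE reports Young's modulus as the ratio of the sensed stress to the photoacoustically measured strain, the core arithmetic is a one-line calculation; the stress and strain values below are illustrative, not results from the paper.

```python
# Young's modulus from simultaneously measured stress and strain.
stress_pa = 5.0e3        # stress inferred from the calibrated stress sensor, Pa
strain = 0.02            # dimensionless strain measured photoacoustically

youngs_modulus_pa = stress_pa / strain
print(f"E = {youngs_modulus_pa / 1e3:.0f} kPa")   # 250 kPa for these illustrative inputs
```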

  9. Quantitative patterns in drone wars

    NASA Astrophysics Data System (ADS)

    Garcia-Bernardo, Javier; Dodds, Peter Sheridan; Johnson, Neil F.

    2016-02-01

    Attacks by drones (i.e., unmanned combat air vehicles) continue to generate heated political and ethical debates. Here we examine the quantitative nature of drone attacks, focusing on how their intensity and frequency compare with that of other forms of human conflict. Instead of the power-law distribution found recently for insurgent and terrorist attacks, the severity of attacks is more akin to lognormal and exponential distributions, suggesting that the dynamics underlying drone attacks lie beyond these other forms of human conflict. We find that the pattern in the timing of attacks is consistent with one side having almost complete control, an important if expected result. We show that these novel features can be reproduced and understood using a generative mathematical model in which resource allocation to the dominant side is regulated through a feedback loop.

  10. Quantitative evaluation of dermatological antiseptics.

    PubMed

    Leitch, C S; Leitch, A E; Tidman, M J

    2015-12-01

    Topical antiseptics are frequently used in dermatological management, yet evidence for the efficacy of traditional generic formulations is often largely anecdotal. We tested the in vitro bactericidal activity of four commonly used topical antiseptics against Staphylococcus aureus, using a modified version of the European Standard EN 1276, a quantitative suspension test for evaluation of the bactericidal activity of chemical disinfectants and antiseptics. To meet the standard for antiseptic effectiveness of EN 1276, at least a 5 log10 reduction in bacterial count within 5 minutes of exposure is required. While 1% benzalkonium chloride and 6% hydrogen peroxide both achieved a 5 log10 reduction in S. aureus count, neither 2% aqueous eosin nor 1 : 10 000 potassium permanganate showed significant bactericidal activity compared with control at exposure periods of up to 1 h. Aqueous eosin and potassium permanganate may have desirable astringent properties, but these results suggest they lack effective antiseptic activity, at least against S. aureus.
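
    The EN 1276 pass/fail criterion used above reduces to a log10 ratio of viable counts before and after exposure; the counts in the sketch are hypothetical.

```python
# Log-reduction check against the >= 5 log10 criterion of EN 1276.
import math

count_before = 2.0e7   # CFU/mL before antiseptic exposure (hypothetical)
count_after = 1.5e2    # CFU/mL after 5 minutes of exposure (hypothetical)

log_reduction = math.log10(count_before / count_after)
print(f"log10 reduction = {log_reduction:.2f}, meets criterion: {log_reduction >= 5.0}")
```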

  11. Quantitative Electron Nanodiffraction.

    SciTech Connect

    Spence, John

    2015-01-30

    This final report summarizes progress under this award for the final reporting period 2002-2013 in our development and application of quantitative electron nanodiffraction to materials problems, especially devoted to atomistic processes in semiconductors and electronic oxides such as the new artificial oxide multilayers, where our microdiffraction is complemented with energy-loss spectroscopy (ELNES) and aberration-corrected STEM imaging (9). The method has also been used to map out the chemical bonds in the important GaN semiconductor (1) used for solid state lighting, and to understand the effects of stacking sequence variations and interfaces in digital oxide superlattices (8). Other projects include the development of a laser-beam Zernike phase plate for cryo-electron microscopy (5) (based on the Kapitza-Dirac effect), work on reconstruction of molecular images using the scattering from many identical molecules lying in random orientations (4), a review article on space-group determination for the International Tables on Crystallography (10), the observation of energy-loss spectra with millivolt energy resolution and sub-nanometer spatial resolution from individual point defects in an alkali halide, a review article for the Centenary of X-ray Diffraction (17), and the development of a new method of electron-beam lithography (12). We briefly summarize here the work on GaN, on oxide superlattice ELNES, and on lithography by STEM.

  12. Additive manufacturing of hybrid circuits

    DOE PAGES

    Bell, Nelson S.; Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David M.; Hirschfeld, Deidre; Hall, Aaron Christopher

    2016-03-26

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. As a result, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  13. Tamm-Horsfall mucoproteins promote calcium oxalate crystal formation in urine: quantitative studies

    SciTech Connect

    Rose, G.A.; Sulaiman, S.

    1982-01-01

    The technique of rapid evaporation of whole urine to standard osmolality has been studied further, and quantitative measurements were made of the resulting calcium oxalate crystals, first by a microscope method and second by an isotope method using 14C-oxalate. It is confirmed that ultrafiltration of urine prior to evaporation leads to a large reduction in calcium oxalate crystal formation and that this is largely restored by the addition of human urinary Tamm-Horsfall protein (uromucoid). Albumin does not have this effect.

  14. Quantitative Vibrational Dynamics of Iron in Carbonyl Porphyrins

    PubMed Central

    Leu, Bogdan M.; Silvernail, Nathan J.; Zgierski, Marek Z.; Wyllie, Graeme R. A.; Ellison, Mary K.; Scheidt, W. Robert; Zhao, Jiyong; Sturhahn, Wolfgang; Alp, E. Ercan; Sage, J. Timothy

    2007-01-01

    We use nuclear resonance vibrational spectroscopy and computational predictions based on density functional theory (DFT) to explore the vibrational dynamics of 57Fe in porphyrins that mimic the active sites of histidine-ligated heme proteins complexed with carbon monoxide. Nuclear resonance vibrational spectroscopy yields the complete vibrational spectrum of a Mössbauer isotope, and provides a valuable probe that is not only selective for protein active sites but quantifies the mean-squared amplitude and direction of the motion of the probe nucleus, in addition to vibrational frequencies. Quantitative comparison of the experimental results with DFT calculations provides a detailed, rigorous test of the vibrational predictions, which in turn provide a reliable description of the observed vibrational features. In addition to the well-studied stretching vibration of the Fe-CO bond, vibrations involving the Fe-imidazole bond, and the Fe-Npyr bonds to the pyrrole nitrogens of the porphyrin contribute prominently to the observed experimental signal. All of these frequencies show structural sensitivity to the corresponding bond lengths, but previous studies have failed to identify the latter vibrations, presumably because the coupling to the electronic excitation is too small in resonance Raman measurements. We also observe the FeCO bending vibrations, which are not Raman active for these unhindered model compounds. The observed Fe amplitude is strongly inconsistent with three-body oscillator descriptions of the FeCO fragment, but agrees quantitatively with DFT predictions. Over the past decade, quantum chemical calculations have suggested revised estimates of the importance of steric distortion of the bound CO in preventing poisoning of heme proteins by carbon monoxide. Quantitative agreement with the predicted frequency, amplitude, and direction of Fe motion for the FeCO bending vibrations provides direct experimental support for the quantum chemical description of the

  15. Microbiological Quantitative Risk Assessment

    NASA Astrophysics Data System (ADS)

    Dominguez, Silvia; Schaffner, Donald W.

    The meat and poultry industry faces ongoing challenges due to the natural association of pathogens of concern (e.g., Salmonella, Campylobacter jejuni, Escherichia coli O157:H7) with a variety of domesticated food animals. In addition, pathogens such as Listeria monocytogenes pose a significant cross-contamination risk during further meat and poultry processing, distribution, and storage. Furthermore, the meat and poultry industries are constantly changing with the addition of new products, use of new raw materials, and targeting of new consumer populations, each of which may give rise to potential new risks. National and international regulations are increasingly using a “risk-based” approach to food safety (where the regulatory focus is driven by the magnitude of the risk), so risk assessment is becoming a valuable tool to systematically organize and evaluate the potential public health risk posed by food processing operations.

  16. Quantitative Literacy: Geosciences and Beyond

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.; McCallum, W. G.

    2002-12-01

    Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.

  17. Quantitative Spectroscopy of Deneb

    NASA Astrophysics Data System (ADS)

    Schiller, Florian; Przybilla, N.

    We use the visually brightest A-type supergiant Deneb (A2 Ia) as a benchmark for testing a spectroscopic analysis technique developed for quantitative studies of BA-type supergiants. Our NLTE spectrum synthesis technique allows us to derive stellar parameters and elemental abundances with unprecedented accuracy. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2 m telescope. Practically all inconsistencies reported in earlier studies are resolved. A self-consistent view of Deneb is thus obtained, allowing us to discuss its evolutionary state in detail by comparison with the most recent generation of evolution models for massive stars. The basic atmospheric parameters Teff = 8525 ± 75 K and log g = 1.10 ± 0.05 dex (cgs) and the distance imply the following fundamental parameters for Deneb: Mspec = 17 ± 3 M⊙, L = 1.77 ± 0.29 × 10^5 L⊙ and R = 192 ± 16 R⊙. The derived He and CNO abundances indicate mixing with nuclear-processed matter. The high N/C ratio of 4.64 ± 1.39 and a N/O ratio of 0.88 ± 0.07 (mass fractions) could in principle be explained by evolutionary models with initially very rapid rotation. A mass of ~22 M⊙ is implied for the progenitor on the zero-age main sequence, i.e. it was a late O-type star. Significant mass-loss has occurred, probably enhanced by pronounced centrifugal forces. The observational constraints favour a scenario for the evolution of Deneb where the effects of rotational mixing may be amplified by an interaction with a magnetic field. Analogous analyses of such highly luminous BA-type supergiants will allow for precision studies of different galaxies in the Local Group and beyond.
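
    The quoted fundamental parameters are internally consistent with the Stefan-Boltzmann law, as the short check below shows (the solar effective temperature used for scaling is a standard constant, not a value from the study).

```python
# Consistency check: L/L_sun = (R/R_sun)^2 * (Teff/Teff_sun)^4 with the quoted
# Teff = 8525 K and R = 192 R_sun should be close to the quoted 1.77e5 L_sun.
TEFF_SUN = 5772.0                      # K, standard solar effective temperature
teff, radius_rsun = 8525.0, 192.0      # values quoted above

luminosity_lsun = radius_rsun**2 * (teff / TEFF_SUN)**4
print(f"L ~ {luminosity_lsun:.3g} L_sun")   # ~1.75e5, consistent within the quoted uncertainty
```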

  18. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are gaining their credibility after a period of being misinterpreted as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of occurring events using quantitative methods; however, qualitative methods provide a broader understanding and a more thorough reasoning behind the event. Hence, it is considered to be of special importance in psychiatry. Besides hypothesis generation in earlier phases of the research, qualitative methods can be employed in questionnaire design, diagnostic criteria establishment, feasibility studies, as well as studies of attitude and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, since qualitative results can reflect the researcher's own view, they need to be statistically confirmed using quantitative methods. The tendency to combine both qualitative and quantitative methods as complementary methods has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  19. Modern quantitative acid-base chemistry.

    PubMed

    Stewart, P A

    1983-12-01

    Quantitative analysis of ionic solutions in terms of physical and chemical principles has been effectively prohibited in the past by the overwhelming amount of calculation it required, but computers have suddenly eliminated that prohibition. The result is an approach to acid-base which revolutionizes our ability to understand, predict, and control what happens to hydrogen ions in living systems. This review outlines that approach and suggests some of its most useful implications. Quantitative understanding requires distinctions between independent variables (in body fluids: pCO2, net strong ion charge, and total weak acid, usually protein), and dependent variables ([HCO3-], [HA], [A-], [CO3(2-)], [OH-], and [H+], or pH). Dependent variables are determined by independent variables, and can be calculated from the defining equations for the specific system. Hydrogen ion movements between solutions can not affect hydrogen ion concentration; only changes in independent variables can. Many current models for ion movements through membranes will require modification on the basis of this quantitative analysis. Whole body acid-base balance can be understood quantitatively in terms of the three independent variables and their physiological regulation by the lungs, kidneys, gut, and liver. Quantitative analysis also shows that body fluids interact mainly by strong ion movements through the membranes separating them.
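
    A minimal numerical sketch of the approach described above: fix the three independent variables (pCO2, strong ion difference SID, and total weak acid Atot) and solve the electroneutrality condition for [H+]. The equilibrium constants, variable values and helper names (charge_balance, hydrogen_ion) are illustrative assumptions of this sketch, not values or code from the review.

```python
import math
from scipy.optimize import brentq

# Illustrative plasma equilibrium constants (37 degC); not quoted from the paper.
KW = 4.4e-14   # water dissociation, (Eq/L)^2
KA = 3.0e-7    # weak-acid (protein) dissociation, Eq/L
KC = 2.45e-11  # CO2 solubility x first carbonic dissociation, (Eq/L)^2 per mmHg
K3 = 6.0e-11   # second carbonic dissociation, Eq/L

def charge_balance(h, pco2, sid, atot):
    oh = KW / h
    hco3 = KC * pco2 / h
    co3 = K3 * hco3 / h
    a_minus = atot * KA / (KA + h)
    # Electroneutrality: SID + [H+] - [HCO3-] - [A-] - [CO3--] - [OH-] = 0
    return sid + h - hco3 - a_minus - co3 - oh

def hydrogen_ion(pco2_mmHg, sid_eq_per_L, atot_eq_per_L):
    # [H+] is the unique root of the charge balance in the physiologic range.
    return brentq(charge_balance, 1e-9, 1e-6,
                  args=(pco2_mmHg, sid_eq_per_L, atot_eq_per_L))

h = hydrogen_ion(40.0, 0.042, 0.020)   # roughly normal plasma values
print(f"[H+] = {h:.2e} Eq/L, pH = {-math.log10(h):.2f}")
```

    With these roughly normal inputs the root lies near pH 7.4, and raising pCO2 or lowering SID shifts it in the expected acidotic direction, which is the sense in which only the independent variables move [H+].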

  20. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms

    PubMed Central

    2014-01-01

    Background Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Methods Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. Results A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. Conclusion The semi-quantitative
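
    The accuracy figures above reduce to simple ratios over a 2x2 table of catheter-culture results against confirmed CR-BSI status; the sketch below shows the arithmetic with purely hypothetical counts, since the study's own 2x2 tables are not reproduced in the abstract.

```python
# Diagnostic accuracy from a 2x2 table; the counts are hypothetical examples.
def sensitivity(tp, fn):
    return tp / (tp + fn)   # true positives among all confirmed CR-BSI cases

def specificity(tn, fp):
    return tn / (tn + fp)   # true negatives among catheters without CR-BSI

tp, fn = 21, 8     # hypothetical: culture-positive / culture-negative CR-BSI cases
tn, fp = 531, 24   # hypothetical: catheters without CR-BSI
print(f"sensitivity = {sensitivity(tp, fn):.1%}, specificity = {specificity(tn, fp):.1%}")
```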

  1. An Additive Manufacturing Test Artifact.

    PubMed

    Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan

    2014-01-01

    A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039

  2. An Additive Manufacturing Test Artifact

    PubMed Central

    Moylan, Shawn; Slotwinski, John; Cooke, April; Jurrens, Kevin; Donmez, M Alkan

    2014-01-01

    A test artifact, intended for standardization, is proposed for the purpose of evaluating the performance of additive manufacturing (AM) systems. A thorough analysis of previously proposed AM test artifacts as well as experience with machining test artifacts have inspired the design of the proposed test artifact. This new artifact is designed to provide a characterization of the capabilities and limitations of an AM system, as well as to allow system improvement by linking specific errors measured in the test artifact to specific sources in the AM system. The proposed test artifact has been built in multiple materials using multiple AM technologies. The results of several of the builds are discussed, demonstrating how the measurement results can be used to characterize and improve a specific AM system. PMID:26601039

  4. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  5. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  6. A comparison of three quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael J.; Settles, Gary S.

    2012-01-01

    We compare the results of three quantitative schlieren techniques applied to the measurement and visualization of a two-dimensional laminar free-convection boundary layer. The techniques applied are Schardin's "calibrated" schlieren technique, in which a weak lens in the field-of-view provides a calibration of light deflection angle to facilitate quantitative measurements, "rainbow schlieren", in which the magnitude of schlieren deflection is coded by hue in the image, and "background-oriented schlieren" (BOS), in which quantitative schlieren-like results are obtained by measuring the distortion of a background pattern using digital-image-correlation software. In each case computers and software are applied to process the data, thus streamlining and modernizing the quantitative application of schlieren optics. (BOS, in particular, is only possible with digital-image-correlation software.) Very good results are obtained with the lens-calibrated standard schlieren method in the flow tested here. BOS likewise produces good results and requires less expensive apparatus than the other methods, but lacks the simplification of parallel light that they feature. Rainbow schlieren suffers some unique drawbacks, including the production of the required rainbow cutoff filter, and provides little significant benefit over the calibrated schlieren technique.
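
    For reference, the relations that make Schardin's calibrated technique quantitative, stated here from the general schlieren literature rather than from the paper: a weak calibration lens of focal length f_lens deflects a ray passing at distance r from its centre by a known angle, which maps image grey level to deflection angle, and the deflection in the test section integrates the refractive-index gradient along the light path (Gladstone-Dale relation n = 1 + k rho).

```latex
\varepsilon_\mathrm{cal}(r) = \frac{r}{f_\mathrm{lens}}, \qquad
\varepsilon_y = \frac{1}{n_0}\int \frac{\partial n}{\partial y}\,\mathrm{d}z
\;\approx\; \frac{L}{n_0}\,\frac{\partial n}{\partial y}, \qquad
n = 1 + k\rho
```

    Here L is the span of the nominally two-dimensional schlieren object along the optical axis, so a calibrated grey-level-to-deflection map yields the density gradient, which can be integrated to give the density (and hence temperature) field of the boundary layer.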

  7. Seniors' Online Communities: A Quantitative Content Analysis

    ERIC Educational Resources Information Center

    Nimrod, Galit

    2010-01-01

    Purpose: To examine the contents and characteristics of seniors' online communities and to explore their potential benefits to older adults. Design and Methods: Quantitative content analysis of a full year's data from 14 leading online communities using a novel computerized system. The overall database included 686,283 messages. Results: There was…

  8. On the Additive and Dominant Variance and Covariance of Individuals Within the Genomic Selection Scope

    PubMed Central

    Vitezica, Zulma G.; Varona, Luis; Legarra, Andres

    2013-01-01

    Genomic evaluation models can fit additive and dominant SNP effects. Under quantitative genetics theory, additive or “breeding” values of individuals are generated by substitution effects, which involve both “biological” additive and dominant effects of the markers. Dominance deviations include only a portion of the biological dominant effects of the markers. Additive variance includes variation due to the additive and dominant effects of the markers. We describe a matrix of dominant genomic relationships across individuals, D, which is similar to the G matrix used in genomic best linear unbiased prediction. This matrix can be used in a mixed-model context for genomic evaluations or to estimate dominant and additive variances in the population. From the “genotypic” value of individuals, an alternative parameterization defines additive and dominance as the parts attributable to the additive and dominant effect of the markers. This approach underestimates the additive genetic variance and overestimates the dominance variance. Transforming the variances from one model into the other is trivial if the distribution of allelic frequencies is known. We illustrate these results with mouse data (four traits, 1884 mice, and 10,946 markers) and simulated data (2100 individuals and 10,000 markers). Variance components were estimated correctly in the model, considering breeding values and dominance deviations. For the model considering genotypic values, the inclusion of dominant effects biased the estimate of additive variance. Genomic models were more accurate for the estimation of variance components than their pedigree-based counterparts. PMID:24121775
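
    A minimal numpy sketch of the two relationship matrices discussed above, using one common parameterization (a VanRaden-style G for breeding values and a dominance-deviation coding for D); the coding choice and the toy genotypes are assumptions of this sketch, not the authors' own code.

```python
import numpy as np

def genomic_relationships(M):
    """M: (n_individuals, n_snps) genotypes coded as 0/1/2 copies of one allele."""
    p = M.mean(axis=0) / 2.0                 # allele frequencies per SNP
    q = 1.0 - p
    # Additive (breeding-value) relationships: center by 2p, scale by sum of 2pq.
    Z = M - 2.0 * p
    G = Z @ Z.T / np.sum(2.0 * p * q)
    # Dominance deviations: -2q^2 (code 2), 2pq (code 1), -2p^2 (code 0).
    W = np.where(M == 2, -2.0 * q**2,
                 np.where(M == 1, 2.0 * p * q, -2.0 * p**2))
    D = W @ W.T / np.sum((2.0 * p * q) ** 2)
    return G, D

M = np.random.randint(0, 3, size=(10, 500))  # toy genotype matrix
G, D = genomic_relationships(M)
print(G.shape, D.shape, np.diag(G).mean())
```

    In a mixed-model evaluation, G and D would then parameterize the covariances of the additive and dominance effects, respectively.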

  9. Quantitative equivalence between polymer nanocomposites and thin polymer films.

    PubMed

    Bansal, Amitabh; Yang, Hoichang; Li, Chunzhao; Cho, Kilwon; Benicewicz, Brian C; Kumar, Sanat K; Schadler, Linda S

    2005-09-01

    The thermomechanical responses of polymers, which provide limitations to their practical use, are favourably altered by the addition of trace amounts of a nanofiller. However, the resulting changes in polymer properties are poorly understood, primarily due to the non-uniform spatial distribution of nanoparticles. Here we show that the thermomechanical properties of 'polymer nanocomposites' are quantitatively equivalent to the well-documented case of planar polymer films. We quantify this equivalence by drawing a direct analogy between film thickness and an appropriate experimental interparticle spacing. We show that the changes in glass-transition temperature with decreasing interparticle spacing for two filler surface treatments are quantitatively equivalent to the corresponding thin-film data with a non-wetting and a wetting polymer-particle interface. Our results offer new insights into the role of confinement on the glass transition, and we conclude that the mere presence of regions of modified mobility in the vicinity of the particle surfaces, that is, a simple two-layer model, is insufficient to explain our results. Rather, we conjecture that the glass-transition process requires that the interphase regions surrounding different particles interact. PMID:16086021

  10. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR required pretreatment steps. Meanwhile, singleplex detection could not meet the demand of the absolute quantitation of GMO events, which is based on the ratio of foreign fragments and reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Moreover, we tested 7 GMO events in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, limit of quantitation (LOQ), limit of detection (LOD) and specificity were validated. The results showed that the LOQ of our method for different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD compared with singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events. Moreover, the LOQ and LOD indicated that this method is suitable for the daily detection and quantitation of GMO events. PMID:27016439
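
    A hedged sketch of the arithmetic chamber-based digital PCR relies on: a Poisson correction converts the fraction of positive partitions into mean copies per partition, and the event/reference copy ratio gives the GMO content. The function names and toy partition counts are assumptions of this sketch, not the authors' software.

```python
import math

def copies_per_partition(n_positive, n_total):
    """Mean copies per partition from the fraction of positive partitions."""
    p = n_positive / n_total
    return -math.log(1.0 - p)      # Poisson correction: p = 1 - exp(-lambda)

def gmo_content(pos_event, pos_ref, n_total):
    """Copy-number ratio of the event-specific target to the reference gene."""
    lam_event = copies_per_partition(pos_event, n_total)
    lam_ref = copies_per_partition(pos_ref, n_total)
    return lam_event / lam_ref

# Toy duplex run: both targets are read in the same 20,000 partitions.
print(f"GMO content ~ {gmo_content(150, 14000, 20000):.2%}")
```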

  12. Workshop on quantitative dynamic stratigraphy

    SciTech Connect

    Cross, T.A.

    1988-04-01

    This document discusses the development of quantitative simulation models for the investigation of geologic systems. The selection of variables, model verification, evaluation, and future directions in quantitative dynamic stratigraphy (QDS) models are detailed. Interdisciplinary applications, integration, implementation, and transfer of QDS are also discussed. (FI)

  13. Rapid qualitative and quantitative analyses of proanthocyanidin oligomers and polymers by UPLC-MS/MS.

    PubMed

    Engström, Marica T; Pälijärvi, Maija; Fryganas, Christos; Grabber, John H; Mueller-Harvey, Irene; Salminen, Juha-Pekka

    2014-04-16

    This paper presents the development of a rapid method with ultraperformance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) for the qualitative and quantitative analyses of plant proanthocyanidins directly from crude plant extracts. The method utilizes a range of cone voltages to achieve the depolymerization step in the ion source of both smaller oligomers and larger polymers. The formed depolymerization products are further fragmented in the collision cell to enable their selective detection. This UPLC-MS/MS method is able to separately quantitate the terminal and extension units of the most common proanthocyanidin subclasses, that is, procyanidins and prodelphinidins. The resulting data enable (1) quantitation of the total proanthocyanidin content, (2) quantitation of total procyanidins and prodelphinidins including the procyanidin/prodelphinidin ratio, (3) estimation of the mean degree of polymerization for the oligomers and polymers, and (4) estimation of how the different procyanidin and prodelphinidin types are distributed along the chromatographic hump typically produced by large proanthocyanidins. All of this is achieved within the 10 min period of analysis, which makes the presented method a significant addition to the chemistry tools currently available for the qualitative and quantitative analyses of complex proanthocyanidin mixtures from plant extracts.
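
    For orientation, the mean degree of polymerization (mDP) and the procyanidin/prodelphinidin (PC/PD) ratio mentioned above follow from the separately quantified terminal and extension units in the usual depolymerization bookkeeping (stated here from general practice, not quoted from the paper), with all quantities as molar amounts:

```latex
\mathrm{mDP} = \frac{n_\mathrm{terminal} + n_\mathrm{extension}}{n_\mathrm{terminal}},
\qquad
\frac{\mathrm{PC}}{\mathrm{PD}} =
\frac{\sum (\text{PC terminal} + \text{PC extension units})}
     {\sum (\text{PD terminal} + \text{PD extension units})}
```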

  14. Quantitation and detection of vanadium in biologic and pollution materials

    NASA Technical Reports Server (NTRS)

    Gordon, W. A.

    1974-01-01

    A review is presented of special considerations and methodology for determining vanadium in biological and air pollution materials. In addition to descriptions of specific analysis procedures, general sections are included on quantitation of analysis procedures, sample preparation, blanks, and methods of detection of vanadium. Most of the information presented is applicable to the determination of other trace elements in addition to vanadium.

  15. Quantitative structure-chromatographic retention relationships

    SciTech Connect

    Kaliszan, R.

    1987-01-01

    This book provides a wide-ranging overview of quantitative structure-retention relationships (QSRR). It brings together a great deal of information that previously was scattered in various parts of the literature. Although the book covers a lot of material, it provides the reader with sufficient background to read the related literature. In addition to QSRR, the book covers some topics related to quantitative structure-activity relationships (QSAR), where activity refers to biological activity. Overall, the book is well written and easy to understand. It would have been helpful to the reader if the chapter numbers had been included in the running heads. The book is divided by subject into 12 chapters, each with references. Works published through 1985 are included; hence, some recent literature is not covered. However, the book is heavily referenced, and each reference has the full title of the work as well as source and author information.

  16. Quantitative genetic analysis of injury liability in infants and toddlers

    SciTech Connect

    Phillips, K.; Matheny, A.P. Jr.

    1995-02-27

    A threshold model of latent liability was applied to infant and toddler twin data on total count of injuries sustained during the interval from birth to 36 months of age. A quantitative genetic analysis of estimated twin correlations in injury liability indicated strong genetic dominance effects, but no additive genetic variance was detected. Because interpretations involving overdominance have little research support, the results may be due to low order epistasis or other interaction effects. Boys had more injuries than girls, but this effect was found only for groups whose parents were prompted and questioned in detail about their children's injuries. Activity and impulsivity are two behavioral predictors of childhood injury, and the results are discussed in relation to animal research on infant and adult activity levels, and impulsivity in adult humans. Genetic epidemiological approaches to childhood injury should aid in targeting higher risk children for preventive intervention. 30 refs., 4 figs., 3 tabs.
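
    For readers unfamiliar with how dominance is inferred from twin correlations, the classical ADE decomposition (Falconer-style, assuming no shared-environment component) is sketched below; the study itself fitted a threshold liability model, so these raw formulas are for orientation only.

```latex
r_\mathrm{MZ} = a^{2} + d^{2}, \qquad
r_\mathrm{DZ} = \tfrac{1}{2}a^{2} + \tfrac{1}{4}d^{2}
\;\Longrightarrow\;
a^{2} = 4r_\mathrm{DZ} - r_\mathrm{MZ}, \qquad
d^{2} = 2r_\mathrm{MZ} - 4r_\mathrm{DZ}
```

    A pattern of r_MZ close to four times r_DZ is therefore what signals strong dominance with little additive variance, the pattern reported above.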

  17. Quantitative Phase Analysis by the Rietveld Method for Forensic Science.

    PubMed

    Deng, Fei; Lin, Xiaodong; He, Yonghong; Li, Shu; Zi, Run; Lai, Shijun

    2015-07-01

    Quantitative phase analysis (QPA) is helpful for determining the type of an object because it reveals the content of its constituents. QPA by the Rietveld method requires neither measurement of calibration data nor the use of an internal standard; however, the approximate crystal structure of each phase in a mixture is necessary. In this study, 8 synthetic mixtures composed of potassium nitrate and sulfur were analyzed by the Rietveld QPA method. The Rietveld refinement was accomplished with a material analysis using diffraction program and evaluated by three agreement indices. Results showed that Rietveld QPA yielded precise results, with errors generally less than 2.0% absolute. In addition, a criminal case that was solved with the help of the Rietveld QPA method is also introduced. This method will allow forensic investigators to acquire detailed information about the material evidence, which can point the way for case detection and court proceedings.
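
    The weight fraction of each phase follows from the refined scale factors via the commonly used Hill-Howard relation, quoted here from the general Rietveld literature rather than from this paper:

```latex
w_{p} = \frac{S_{p}\,(ZMV)_{p}}{\sum_{i} S_{i}\,(ZMV)_{i}}
```

    where S is the refined Rietveld scale factor, Z the number of formula units per unit cell, M the formula mass, and V the unit-cell volume of each phase.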

  18. Quantitative microbial ecology through stable isotope probing.

    PubMed

    Hungate, Bruce A; Mau, Rebecca L; Schwartz, Egbert; Caporaso, J Gregory; Dijkstra, Paul; van Gestel, Natasja; Koch, Benjamin J; Liu, Cindy M; McHugh, Theresa A; Marks, Jane C; Morrissey, Ember M; Price, Lance B

    2015-11-01

    Bacteria grow and transform elements at different rates, and as yet, quantifying this variation in the environment is difficult. Determining isotope enrichment with fine taxonomic resolution after exposure to isotope tracers could help, but there are few suitable techniques. We propose a modification to stable isotope probing (SIP) that enables the isotopic composition of DNA from individual bacterial taxa after exposure to isotope tracers to be determined. In our modification, after isopycnic centrifugation, DNA is collected in multiple density fractions, and each fraction is sequenced separately. Taxon-specific density curves are produced for labeled and nonlabeled treatments, from which the shift in density for each individual taxon in response to isotope labeling is calculated. Expressing each taxon's density shift relative to that taxon's density measured without isotope enrichment accounts for the influence of nucleic acid composition on density and isolates the influence of isotope tracer assimilation. The shift in density translates quantitatively to isotopic enrichment. Because this revision to SIP allows quantitative measurements of isotope enrichment, we propose to call it quantitative stable isotope probing (qSIP). We demonstrated qSIP using soil incubations, in which soil bacteria exhibited strong taxonomic variations in (18)O and (13)C composition after exposure to [(18)O]water or [(13)C]glucose. The addition of glucose increased the assimilation of (18)O into DNA from [(18)O]water. However, the increase in (18)O assimilation was greater than expected based on utilization of glucose-derived carbon alone, because the addition of glucose indirectly stimulated bacteria to utilize other substrates for growth. This example illustrates the benefit of a quantitative approach to stable isotope probing.
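
    A minimal sketch of the density bookkeeping the abstract describes: compute each taxon's weighted-average buoyant density in the labeled and unlabeled treatments, then its relative density shift. The final conversion of that shift into atom-fraction isotope enrichment depends on the isotope and on the taxon's GC content and is not reproduced here; the toy numbers and function name are assumptions of this sketch.

```python
import numpy as np

def weighted_mean_density(densities, taxon_abundance, fraction_dna):
    """densities: buoyant density of each fraction (g/mL);
    taxon_abundance: the taxon's relative abundance in each fraction;
    fraction_dna: total DNA recovered per fraction (weights the fractions)."""
    w = np.asarray(taxon_abundance) * np.asarray(fraction_dna)
    return float(np.sum(np.asarray(densities) * w) / np.sum(w))

densities = [1.795, 1.755, 1.715, 1.675]   # toy fraction densities (heavy to light)
unlabeled = weighted_mean_density(densities, [0.02, 0.20, 0.60, 0.18], [40, 90, 120, 60])
labeled   = weighted_mean_density(densities, [0.15, 0.45, 0.30, 0.10], [35, 95, 110, 55])

shift = (labeled - unlabeled) / unlabeled   # relative shift, per the abstract
print(f"relative density shift = {shift:.4%}")
```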

  19. Genome Size, Quantitative Genetics and the Genomic Basis for Flower Size Evolution in Silene latifolia

    PubMed Central

    MEAGHER, THOMAS R.; GILLIES, AMANDA C. M.; COSTICH, DENISE E.

    2005-01-01

    • Background and Aims The overall goal of this paper is to construct an overview of the genetic basis for flower size evolution in Silene latifolia. It aims to examine the relationship between the molecular bases for flower size and the underlying assumption of quantitative genetics theory that quantitative variation is ultimately due to the impact of a number of structural genes. • Scope Previous work is reviewed on the quantitative genetics and potential for response to selection on flower size, and the relationship between flower size and nuclear DNA content in S. latifolia. These earlier findings provide a framework within which to consider more recent analyses of a joint quantitative trait loci (QTL) analysis of flower size and DNA content in this species. • Key Results Flower size is a character that fits the classical quantitative genetics model of inheritance very nicely. However, an earlier finding that flower size is correlated with nuclear DNA content suggested that quantitative aspects of genome composition rather than allelic substitution at structural loci might play a major role in the evolution of flower size. The present results reported here show that QTL for flower size are correlated with QTL for DNA content, further corroborating an earlier result and providing additional support for the conclusion that localized variations in DNA content underlie evolutionary changes in flower size. • Conclusions The search image for QTL should be broadened to include overall aspects of genome regulation. As we prepare to enter the much-heralded post-genomic era, we also need to revisit our overall models of the relationship between genotype and phenotype to encompass aspects of genome structure and composition beyond structural genes. PMID:15596472

  20. NSCLC tumor shrinkage prediction using quantitative image features.

    PubMed

    Hunter, Luke A; Chen, Yi Pei; Zhang, Lifei; Matney, Jason E; Choi, Haesun; Kry, Stephen F; Martel, Mary K; Stingo, Francesco; Liao, Zhongxing; Gomez, Daniel; Yang, Jinzhong; Court, Laurence E

    2016-04-01

    The objective of this study was to develop a quantitative image feature model to predict non-small cell lung cancer (NSCLC) volume shrinkage from pre-treatment CT images. 64 stage II-IIIB NSCLC patients with similar treatments were all imaged using the same CT scanner and protocol. For each patient, the planning gross tumor volume (GTV) was deformed onto the week 6 treatment image, and tumor shrinkage was quantified as the deformed GTV volume divided by the planning GTV volume. Geometric, intensity histogram, absolute gradient image, co-occurrence matrix, and run-length matrix image features were extracted from each planning GTV. Prediction models were generated using principal component regression with simulated annealing subset selection. Performance was quantified using the mean squared error (MSE) between the predicted and observed tumor shrinkages. Permutation tests were used to validate the results. The optimal prediction model gave a strong correlation between the observed and predicted tumor shrinkages with r = 0.81 and MSE = 8.60 × 10^-3. Compared to predictions based on the mean population shrinkage, this resulted in a 2.92-fold reduction in MSE. In conclusion, this study indicated that quantitative image features extracted from existing pre-treatment CT images can successfully predict tumor shrinkage and provide additional information for clinical decisions regarding patient risk stratification, treatment, and prognosis. PMID:26878137
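
    A hedged sketch of the modeling approach named above, principal component regression, using scikit-learn; the simulated-annealing feature subset selection from the paper is omitted and the arrays are synthetic, so this only illustrates the general pipeline and the MSE/correlation evaluation.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 40))                       # 64 patients x 40 image features (synthetic)
y = 0.6 + X[:, :3] @ np.array([0.05, -0.04, 0.03]) + rng.normal(scale=0.05, size=64)

# Principal component regression: standardize, project, then fit a linear model.
model = make_pipeline(StandardScaler(), PCA(n_components=5), LinearRegression())
y_hat = cross_val_predict(model, X, y, cv=8)        # cross-validated predictions

print(f"MSE = {mean_squared_error(y, y_hat):.2e}, "
      f"r = {np.corrcoef(y, y_hat)[0, 1]:.2f}")
```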

  1. Additive and nonadditive genetic variation in avian personality traits.

    PubMed

    van Oers, K; Drent, P J; de Jong, G; van Noordwijk, A J

    2004-11-01

    Individuals of all vertebrate species differ consistently in their reactions to mildly stressful challenges. These typical reactions, described as personalities or coping strategies, have a clear genetic basis, but the structure of their inheritance in natural populations is almost unknown. We carried out a quantitative genetic analysis of two personality traits (exploration and boldness) and the combination of these two traits (early exploratory behaviour). This study was carried out on the lines resulting from a two-directional artificial selection experiment on early exploratory behaviour (EEB) of great tits (Parus major) originating from a wild population. In analyses using the original lines, reciprocal F(1) and reciprocal first backcross generations, additive, dominance, maternal effects and sex-dependent expression of exploration, boldness and EEB were estimated. Both additive and dominant genetic effects were important determinants of phenotypic variation in exploratory behaviour and boldness. However, no sex-dependent expression was observed in either of these personality traits. These results are discussed with respect to the maintenance of genetic variation in personality traits, and the expected genetic structure of other behavioural and life history traits in general.

  2. Simulation evaluation of quantitative myocardial perfusion assessment from cardiac CT

    NASA Astrophysics Data System (ADS)

    Bindschadler, Michael; Modgil, Dimple; Branch, Kelley R.; La Riviere, Patrick J.; Alessio, Adam M.

    2014-03-01

    Contrast enhancement on cardiac CT provides valuable information about myocardial perfusion, and methods have been proposed to assess perfusion with static and dynamic acquisitions. There is a lack of knowledge and consensus on the appropriate approach to ensure 1) sufficient diagnostic accuracy for clinical decisions and 2) low radiation doses for patient safety. This work developed a thorough dynamic CT simulation and several accepted blood flow estimation techniques to evaluate the performance of perfusion assessment across a range of acquisition and estimation scenarios. Cardiac CT acquisitions were simulated for a range of flow states (Flow = 0.5, 1, 2, 3 ml/g/min, cardiac output = 3, 5, 8 L/min). CT acquisitions were simulated with a validated CT simulator incorporating polyenergetic data acquisition and realistic x-ray flux levels for dynamic acquisitions with a range of scenarios including 1, 2, 3 sec sampling for 30 sec with 25, 70, 140 mAs. Images were generated using conventional image reconstruction with additional image-based beam hardening correction to account for iodine content. Time attenuation curves were extracted for multiple regions around the myocardium and used to estimate flow. In total, 2,700 independent realizations of dynamic sequences were generated and multiple MBF estimation methods were applied to each of these. Evaluation of quantitative kinetic modeling yielded blood flow estimates with a root mean square error (RMSE) of ~0.6 ml/g/min averaged across multiple scenarios. Semi-quantitative modeling and qualitative static imaging resulted in significantly more error (RMSE = ~1.2 and ~1.2 ml/min/g, respectively). For quantitative methods, dose reduction through reduced temporal sampling or reduced tube current had comparable impact on the MBF estimate fidelity. On average, half-dose acquisitions increased the RMSE of estimates by only 18%, suggesting that substantial dose reductions can be employed in the context of quantitative myocardial

  3. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  4. Quantitative analysis of saccadic search strategy

    NASA Astrophysics Data System (ADS)

    Over, E. A. B.

    2007-06-01

    This thesis deals with the quantitative analysis of saccadic search strategy. The goal of the research presented was twofold: 1) to quantify overall characteristics of fixation location and saccade direction, and 2) to identify search strategies, with the use of a quantitative description of eye movement parameters. Chapter 2 provides a method to quantify a general property of fixation locations. We proposed a quantitative measure based on Voronoi diagrams for the characterization of the uniformity of fixation density. This measure may be thought of as indicating the clustering of fixations. We showed that during a visual search task, a structured (natural) background leads to higher clustering of fixations compared to a homogeneous background. In addition, in natural stimuli, a search task leads to higher clustering of fixations than the instruction to freely view the stimuli. Chapter 3 provides a method to identify the overall field of saccade directions in the viewing area. We extended the Voronoi method of chapter 2 so that it became possible to create vector maps. These maps indicate the preferred saccade direction for each position in the viewing area. Several measures of these vector maps were used to quantify the influence of observer-dependent and stimulus-dependent factors on saccade direction in a search task with natural scenes. The results showed that the influence of stimulus-dependent factors appeared to be larger than the influence of observer-dependent factors. In chapter 4 we showed that the border of the search area played a role in the search strategy. In a search experiment in differently shaped areas we measured that search performance was poorer near the search area luminance edges. Fixation density, however, was higher in the edge region, and saccade direction was mainly along the edges of the search areas. In a target visibility experiment we established that the visibility of targets near a luminance edge is less than the visibility of
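
    A minimal sketch of a Voronoi-based uniformity measure for fixation locations of the kind described above: more clustered fixations give more unequal cell areas. The specific statistic used here (coefficient of variation of bounded-cell areas) and the toy point sets are assumptions of this sketch and may differ from the thesis' exact measure.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

def voronoi_area_cv(points):
    """Coefficient of variation of bounded Voronoi cell areas (higher = more clustered)."""
    vor = Voronoi(points)
    areas = []
    for region_index in vor.point_region:
        region = vor.regions[region_index]
        if len(region) == 0 or -1 in region:     # skip unbounded cells at the border
            continue
        # Voronoi cells are convex, so the hull of their vertices gives the area.
        areas.append(ConvexHull(vor.vertices[region]).volume)  # 2-D "volume" = area
    areas = np.array(areas)
    return areas.std() / areas.mean()

rng = np.random.default_rng(1)
uniform_fixations = rng.uniform(0.0, 1.0, size=(200, 2))
clustered_fixations = rng.normal(0.5, 0.08, size=(200, 2))
print(voronoi_area_cv(uniform_fixations), voronoi_area_cv(clustered_fixations))
```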

  5. Incorporation of additives into polymers

    DOEpatents

    McCleskey, T. Mark; Yates, Matthew Z.

    2003-07-29

    There has been invented a method for incorporating additives into polymers comprising: (a) forming an aqueous or alcohol-based colloidal system of the polymer; (b) emulsifying the colloidal system with a compressed fluid; and (c) contacting the colloidal polymer with the additive in the presence of the compressed fluid. The colloidal polymer can be contacted with the additive by having the additive in the compressed fluid used for emulsification or by adding the additive to the colloidal system before or after emulsification with the compressed fluid. The invention process can be carried out either as a batch process or as a continuous on-line process.

  6. [Patch-testing methods: additional specialised or additional series].

    PubMed

    Cleenewerck, M-B

    2009-01-01

    The tests in the European standard battery must occasionally be supplemented by specialised or additional batteries, particularly where the contact allergy is thought to be of occupational origin. These additional batteries cover all allergens associated with various professional activities (hairdressing, baking, dentistry, printing, etc.) and with different classes of materials and chemical products (glue, plastic, rubber...). These additional tests may also include personal items used by patients on a daily basis such as cosmetics, shoes, plants, textiles and so on.

  7. Additive manufacturing of optical components

    NASA Astrophysics Data System (ADS)

    Heinrich, Andreas; Rank, Manuel; Maillard, Philippe; Suckow, Anne; Bauckhage, Yannick; Rößler, Patrick; Lang, Johannes; Shariff, Fatin; Pekrul, Sven

    2016-08-01

    The development of additive manufacturing methods has enlarged rapidly in recent years. Thereby, the work mainly focuses on the realization of mechanical components, but the additive manufacturing technology offers a high potential in the field of optics as well. Owing to new design possibilities, completely new solutions are possible. This article briefly reviews and compares the most important additive manufacturing methods for polymer optics. Additionally, it points out the characteristics of additive manufactured polymer optics. Thereby, surface quality is of crucial importance. In order to improve it, appropriate post-processing steps are necessary (e.g. robot polishing or coating), which will be discussed. An essential part of this paper deals with various additive manufactured optical components and their use, especially in optical systems for shape metrology (e.g. borehole sensor, tilt sensor, freeform surface sensor, fisheye lens). The examples should demonstrate the potentials and limitations of optical components produced by additive manufacturing.

  8. Quantitative phase-amplitude microscopy. III. The effects of noise.

    PubMed

    Paganin, D; Barty, A; McMahon, P J; Nugent, K A

    2004-04-01

    We explore the effect of noise on images obtained using quantitative phase-amplitude microscopy - a new microscopy technique based on the determination of phase from the intensity evolution of propagating radiation. We compare the predictions with experimental results and also propose an approach that allows good-quality quantitative phase retrieval to be obtained even for very noisy data.
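
    The phase retrieval underlying this family of techniques is usually formulated through the transport-of-intensity equation, stated here from the general literature rather than from the paper itself: for paraxial propagation along z with wavenumber k = 2π/λ,

```latex
\nabla_{\!\perp}\cdot\bigl[I(\mathbf{r}_{\!\perp},z)\,\nabla_{\!\perp}\varphi(\mathbf{r}_{\!\perp},z)\bigr]
 = -\,k\,\frac{\partial I(\mathbf{r}_{\!\perp},z)}{\partial z}
```

    Intensities measured at nearby defocus planes estimate the right-hand side, and inverting the equation recovers the phase; noise in those intensity measurements propagates directly into the inversion, which is the effect examined above.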

  9. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills that are required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  10. Straightness error evaluation of additional constraints

    NASA Astrophysics Data System (ADS)

    Pei, Ling; Wang, Shenghuai; Liu, Yong

    2011-05-01

    A new generation of the Dimensional and Geometrical Product Specifications (GPS) and Verification standard system is based on both mathematical structure and metrology, so determining the eligibility of a product should be adapted to modern digital measuring instruments. When a geometric tolerance specification carries an additional constraint, for example straightness with an additional form requirement, the feature must also satisfy that form requirement within the tolerance zone before the measurement can be mathematized. Knowing how closely the geometrical specification matches the functional specification determines the correctness of the measurement result. A methodology is adopted that evaluates the various forms, including ideal features, extracted features and their combinations, under an additional form constraint on straightness within the tolerance zone, and it yields correct acceptance decisions for products. The results show that different combinations of these forms affect the acceptance decision on product qualification, and that appropriate matching of the forms can satisfy the additional form requirements for product features.

  11. Straightness error evaluation of additional constraints

    NASA Astrophysics Data System (ADS)

    Pei, Ling; Wang, Shenghuai; Liu, Yong

    2010-12-01

    A new generation of the Dimensional and Geometrical Product Specifications (GPS) and Verification standard system is based on both mathematical structure and metrology, so determining the eligibility of a product should be adapted to modern digital measuring instruments. When a geometric tolerance specification carries an additional constraint, for example straightness with an additional form requirement, the feature must also satisfy that form requirement within the tolerance zone before the measurement can be mathematized. Knowing how closely the geometrical specification matches the functional specification determines the correctness of the measurement result. A methodology is adopted that evaluates the various forms, including ideal features, extracted features and their combinations, under an additional form constraint on straightness within the tolerance zone, and it yields correct acceptance decisions for products. The results show that different combinations of these forms affect the acceptance decision on product qualification, and that appropriate matching of the forms can satisfy the additional form requirements for product features.

  12. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted by the LQPM based on the chromatographic fingerprints when the aim is to quantify all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  13. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was shown to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted by the LQPM based on the chromatographic fingerprints when the aim is to quantify all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425

  14. A quantitative philology of introspection

    PubMed Central

    Diuk, Carlos G.; Slezak, D. Fernandez; Raskovsky, I.; Sigman, M.; Cecchi, G. A.

    2012-01-01

    The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the “Axial Age,” saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy—which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single “arrow of time” in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the twentieth century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus. PMID:23015783

  15. QUANTITATIVE INVESTIGATIONS OF IDIOTYPIC ANTIBODIES

    PubMed Central

    Kuettner, Mirta Goffan; Wang, Ai-Lan; Nisonoff, Alfred

    1972-01-01

    Antisera were prepared in rabbits against anti-p-azobenzoate antibodies of an A/J and a BALB/c mouse and anti-p-azophenylarsonate antibodies of an A/J mouse. After appropriate absorption the antisera reacted with the anti-hapten antibody of the donor mouse but, by sensitive quantitative tests, not at all with other components of the hyperimmune serum or with preimmune serum of the donor mouse. The absorbed antiserum therefore appeared to be specific for idiotypic determinants. Nearly all idiotypic specificities identified in the serum of the donor were also present in the serum of other mice of the same strain, immunized against the same hapten group, but not in mice immunized with a different hapten. In each case the antibodies of the donor mouse reacted most effectively on a weight basis with antiidiotypic antiserum. Cross-reactions were observed among different strains of mice but homologous antibodies reacted most effectively with antiidiotypic antisera. C57/BL and DBA antisera contained very low concentrations of specificities present in the A/J and BALB/c antibody populations; antibodies of A/J and BALB/c antisera are more closely related to one another. The results indicate that idiotypic specificity may provide a genetic marker for the variable regions of immunoglobulin polypeptide chains. PMID:4110016

  16. Enhancing the Quantitative Representation of Socioeconomic Conditions in the Shared Socio-economic Pathways (SSPs) using the International Futures Model

    NASA Astrophysics Data System (ADS)

    Rothman, D. S.; Siraj, A.; Hughes, B.

    2013-12-01

    The international research community is currently in the process of developing new scenarios for climate change research. One component of these scenarios are the Shared Socio-economic Pathways (SSPs), which describe a set of possible future socioeconomic conditions. These are presented in narrative storylines with associated quantitative drivers. The core quantitative drivers include total population, average GDP per capita, educational attainment, and urbanization at the global, regional, and national levels. At the same time there have been calls, particularly by the IAV community, for the SSPs to include additional quantitative information on other key social factors, such as income inequality, governance, health, and access to key infrastructures, which are discussed in the narratives. The International Futures system (IFs), based at the Pardee Center at the University of Denver, is able to provide forecasts of many of these indicators. IFs cannot use the SSP drivers as exogenous inputs, but we are able to create development pathways that closely reproduce the core quantitative drivers defined by the different SSPs, as well as incorporating assumptions on other key driving factors described in the qualitative narratives. In this paper, we present forecasts for additional quantitative indicators based upon the implementation of the SSP development pathways in IFs. These results will be of value to many researchers.

  17. 75 FR 51444 - Procurement List Additions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-20

    ... . SUPPLEMENTARY INFORMATION: Additions On 6/4/2010 (75 FR 31768-31769); 6/11/2010 (75 FR 33270-33271); 6/ 18/2010 (75 FR 34701-34702); and 6/25/2010 (75 FR 36363-36371), the Committee for Purchase From People Who Are... factors considered for this certification were: 1. The action will not result in any additional...

  18. The Additive Coloration of Alkali Halides

    ERIC Educational Resources Information Center

    Jirgal, G. H.; and others

    1969-01-01

    Describes the construction and use of an inexpensive, vacuum furnace designed to produce F-centers in alkali halide crystals by additive coloration. The method described avoids corrosion or contamination during the coloration process. Examination of the resultant crystals is discussed and several experiments using additively colored crystals are…

  19. Developing Multiplicative Thinking from Additive Reasoning

    ERIC Educational Resources Information Center

    Tobias, Jennifer M.; Andreasen, Janet B.

    2013-01-01

    As students progress through elementary school, they encounter mathematics concepts that shift from additive to multiplicative situations (NCTM 2000). When they encounter fraction problems that require multiplicative thinking, they tend to incorrectly extend additive properties from whole numbers (Post et al. 1985). As a result, topics such as …

  20. Quantitative blood flow velocity imaging using laser speckle flowmetry

    PubMed Central

    Nadort, Annemarie; Kalkman, Koen; van Leeuwen, Ton G.; Faber, Dirk J.

    2016-01-01

    Laser speckle flowmetry suffers from a debated quantification of the inverse relation between decorrelation time (τc) and blood flow velocity (V), i.e. 1/τc = αV. Using a modified microcirculation imager (integrated sidestream dark field - laser speckle contrast imaging [SDF-LSCI]), we experimentally investigate the influence of the optical properties of scatterers on α in vitro and in vivo. We found good agreement with theoretical predictions within certain limits for scatterer size and multiple scattering. We present a practical model-based scaling factor to correct for multiple scattering in microcirculatory vessels. Our results show that SDF-LSCI offers a quantitative measure of flow velocity in addition to vessel morphology, enabling the quantification of the clinically relevant blood flow, velocity and tissue perfusion. PMID:27126250
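
    For context, the link between measured speckle contrast K, camera exposure time T and the decorrelation time τc is commonly modeled as below, assuming a negative-exponential field correlation g1(τ) = exp(−τ/τc) and a coherence factor β; this is the standard model from the speckle literature, not necessarily the exact calibration used for SDF-LSCI.

```latex
K^{2}(T) = \beta\,\frac{e^{-2x} + 2x - 1}{2x^{2}},
\qquad x = \frac{T}{\tau_c},
\qquad \frac{1}{\tau_c} = \alpha V
```

    Inverting the first relation for τc from a measured contrast is what makes the debated proportionality constant α the key to quantitative velocity estimates.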

  1. Quantitative blood flow velocity imaging using laser speckle flowmetry

    NASA Astrophysics Data System (ADS)

    Nadort, Annemarie; Kalkman, Koen; van Leeuwen, Ton G.; Faber, Dirk J.

    2016-04-01

    Laser speckle flowmetry suffers from a debated quantification of the inverse relation between decorrelation time (τc) and blood flow velocity (V), i.e. 1/τc = αV. Using a modified microcirculation imager (integrated sidestream dark field - laser speckle contrast imaging [SDF-LSCI]), we experimentally investigate the influence of the optical properties of scatterers on α in vitro and in vivo. We found good agreement with theoretical predictions within certain limits for scatterer size and multiple scattering. We present a practical model-based scaling factor to correct for multiple scattering in microcirculatory vessels. Our results show that SDF-LSCI offers a quantitative measure of flow velocity in addition to vessel morphology, enabling the quantification of the clinically relevant blood flow, velocity and tissue perfusion.

  2. Quantitative confocal microscopy: beyond a pretty picture.

    PubMed

    Jonkman, James; Brown, Claire M; Cole, Richard W

    2014-01-01

    Quantitative optical microscopy has become the norm, with the confocal laser-scanning microscope being the workhorse of many imaging laboratories. Generating quantitative data requires a greater emphasis on the accurate operation of the microscope itself, along with proper experimental design and adequate controls. The microscope, which is more accurately an imaging system, cannot be treated as a "black box" with the collected data viewed as infallible. There needs to be regularly scheduled performance testing that will ensure that quality data are being generated. This regular testing also allows for the tracking of metrics that can point to issues before they result in instrument malfunction and downtime. In turn, images must be collected in a manner that is quantitative with maximal signal to noise (which can be difficult depending on the application) without data clipping. Images must then be processed to correct for background intensities, fluorophore cross talk, and uneven field illumination. With advanced techniques such as spectral imaging, Förster resonance energy transfer, and fluorescence-lifetime imaging microscopy, experimental design needs to be carefully planned out and include all appropriate controls. Quantitative confocal imaging in all of these contexts and more will be explored within the chapter. PMID:24974025

  3. Performance of calibration standards for antigen quantitation with flow cytometry.

    PubMed

    Lenkei, R; Gratama, J W; Rothe, G; Schmitz, G; D'hautcourt, J L; Arekrans, A; Mandy, F; Marti, G

    1998-10-01

    Within the framework of the activities initiated by the Task Force for Antigen Quantitation of the European Working Group on Clinical Cell Analysis (EWGCCA), an experiment was conducted to evaluate microbead standards used for quantitative flow cytometry (QFCM). A unified window of analysis (UWA) was established on three different instruments (EPICS XL [Coulter Corporation, Miami, FL], FACScan and FACS Calibur [Becton Dickinson, San Jose, CA]) with QC3 microbeads (FCSC, PR). Using this defined fluorescence intensity scale, the performance of several monoclonal antibodies directed to CD3, CD4, and CD8 (conjugated and unconjugated) from three manufacturers (BDIS, Coulter [Immunotech], and DAKO) was tested. In addition, the QIFI system (DAKO), QuantiBRITE (BDIS), and a method of relative fluorescence intensity (RFI, the method of Giorgi) were compared. mAbs reacting with three more antigens, CD16, CD19, and CD38, were tested on the FACScan instrument. Quantitation was carried out using a single batch of cryopreserved peripheral blood leukocytes, and all tests were performed as single-color analyses. Significant correlations were observed between the antibody-binding capacity (ABC) values of the same CD antigen measured with various calibrators and with antibodies differing in vendor, labeling, and possibly epitope recognition. Despite the significant correlations, the ABC values of most monoclonal antibodies differed by 20-40% when determined with different fluorochrome conjugates and different calibrators. The results of this study indicate that, at the present stage of QFCM, consistent ABC values may be attained between laboratories provided that a specific calibration system is used, including specific calibrators, reagents, and protocols.
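
    As an illustration of the general calibration step underlying such quantitation, the Python sketch below fits a log-log calibration line to microbead standards of known antibody-binding capacity (ABC) and converts a measured median fluorescence intensity (MFI) into an ABC estimate. The bead values, the log-log linear form, and the sample MFI are assumptions chosen for the example, not data from the study.

      import numpy as np

      # Hypothetical bead calibrators: known ABC values and their measured MFI.
      bead_abc = np.array([2.5e3, 1.0e4, 4.0e4, 1.6e5])
      bead_mfi = np.array([35.0, 140.0, 560.0, 2200.0])

      # Assume a log-log linear response: log10(ABC) = a * log10(MFI) + b.
      a, b = np.polyfit(np.log10(bead_mfi), np.log10(bead_abc), 1)

      def mfi_to_abc(mfi):
          """Convert a background-corrected MFI into an ABC estimate."""
          return 10 ** (a * np.log10(mfi) + b)

      print(round(mfi_to_abc(800.0)))  # ABC estimate for a hypothetical stained sample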

  4. DREAM: a method for semi-quantitative dermal exposure assessment.

    PubMed

    Van-Wendel-de-Joode, Berna; Brouwer, Derk H; Vermeulen, Roel; Van Hemmen, Joop J; Heederik, Dick; Kromhout, Hans

    2003-01-01

    This paper describes a new method (DREAM) for structured, semi-quantitative dermal exposure assessment for chemical or biological agents that can be used in occupational hygiene or epidemiology. It is anticipated that DREAM could serve, among other purposes, as an initial assessment of dermal exposure, resulting in a ranking of tasks and, subsequently, jobs. DREAM consists of an inventory part and an evaluation part. Two examples of dermal exposure among workers of a car-construction company show that DREAM characterizes tasks and gives insight into exposure mechanisms, forming a basis for systematic exposure reduction. DREAM supplies estimates for exposure levels on the outside clothing layer as well as on the skin, and provides insight into the distribution of dermal exposure over the body. Together with the ranking of tasks and people, this provides information for measurement strategies and helps to determine who, where, and what to measure. In addition to dermal exposure assessment, the systematic description of dermal exposure pathways helps to prioritize and select the most adequate measurement strategies and methods. DREAM could be a promising approach for structured, semi-quantitative dermal exposure assessment. PMID:12505908

  5. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs the corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanisms underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels that generated eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. Using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, chemotaxis analysis of third-stage larvae revealed that their gustatory response differed from that of adults. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and for screening chemosensation-related drugs.

  6. Quantitative analysis of Caenorhabditis elegans chemotaxis using a microfluidic device.

    PubMed

    Hu, Liang; Ye, Jinjuan; Tan, Haowei; Ge, Anle; Tang, Lichun; Feng, Xiaojun; Du, Wei; Liu, Bi-Feng

    2015-08-01

    Caenorhabditis elegans, one of the most widely studied model organisms, senses external chemical cues and performs the corresponding chemotaxis behaviors through its simple chemosensory neuronal system. To study the mechanisms underlying chemosensory behavior, a rapid and reliable method for quantitatively analyzing the worms' behaviors is essential. In this work, we demonstrated a microfluidic approach for investigating chemotaxis responses of worms to chemical gradients. The flow-based microfluidic chip consisted of circular tree-like microchannels that generated eight flow streams containing stepwise chemical concentrations without differences in flow velocity. Worms' upstream swimming into microchannels with various concentrations was monitored for quantitative analysis of the chemotaxis behavior. Using this microfluidic chip, the attractive and repellent responses of C. elegans to NaCl were successfully quantified within several minutes. The results demonstrated wild type-like repellent responses and severely impaired attractive responses in grk-2 mutant animals with defects in calcium influx. In addition, chemotaxis analysis of third-stage larvae revealed that their gustatory response differed from that of adults. Thus, our microfluidic method provided a useful platform for studying the chemosensory behaviors of C. elegans and for screening chemosensation-related drugs. PMID:26320797

  7. Composition of fingermark residue: a qualitative and quantitative review.

    PubMed

    Girod, Aline; Ramotowski, Robert; Weyermann, Céline

    2012-11-30

    This article describes the composition of fingermark residue as being a complex system with numerous compounds coming from different sources and evolving over time from the initial composition (corresponding to the composition right after deposition) to the aged composition (corresponding to the evolution of the initial composition over time). This complex system will additionally vary due to effects of numerous influence factors grouped in five different classes: the donor characteristics, the deposition conditions, the substrate nature, the environmental conditions and the applied enhancement techniques. The initial and aged compositions as well as the influence factors are thus considered in this article to provide a qualitative and quantitative review of all compounds identified in fingermark residue up to now. The analytical techniques used to obtain these data are also enumerated. This review highlights the fact that despite the numerous analytical processes that have already been proposed and tested to elucidate fingermark composition, advanced knowledge is still missing. Thus, there is a real need to conduct future research on the composition of fingermark residue, focusing particularly on quantitative measurements, aging kinetics and effects of influence factors. The results of future research are particularly important for advances in fingermark enhancement and dating technique developments. PMID:22727572

  8. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular. PMID:23650936

  9. Genetic Architectures of Quantitative Variation in RNA Editing Pathways.

    PubMed

    Gu, Tongjun; Gatti, Daniel M; Srivastava, Anuj; Snyder, Elizabeth M; Raghupathy, Narayanan; Simecek, Petr; Svenson, Karen L; Dotu, Ivan; Chuang, Jeffrey H; Keller, Mark P; Attie, Alan D; Braun, Robert E; Churchill, Gary A

    2016-02-01

    RNA editing refers to post-transcriptional processes that alter the base sequence of RNA. Recently, hundreds of new RNA editing targets have been reported. However, the mechanisms that determine the specificity and degree of editing are not well understood. We examined quantitative variation of site-specific editing in a genetically diverse multiparent population, Diversity Outbred mice, and mapped polymorphic loci that alter editing ratios globally for C-to-U editing and at specific sites for A-to-I editing. An allelic series in the C-to-U editing enzyme Apobec1 influences the editing efficiency of Apob and 58 additional C-to-U editing targets. We identified 49 A-to-I editing sites with polymorphisms in the edited transcript that alter editing efficiency. In contrast to the shared genetic control of C-to-U editing, most of the variable A-to-I editing sites were determined by local nucleotide polymorphisms in proximity to the editing site in the RNA secondary structure. Our results indicate that RNA editing is a quantitative trait subject to genetic variation and that evolutionary constraints have given rise to distinct genetic architectures in the two canonical types of RNA editing.

  10. Indirect scaling methods for testing quantitative emotion theories.

    PubMed

    Junge, Martin; Reisenzein, Rainer

    2013-01-01

    Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.

  11. Quantitatively Probing the Means of Controlling Nanoparticle Assembly on Surfaces

    SciTech Connect

    Patete, J.m.; Wong, S.; Peng, X.; Serafin, J.M.

    2011-05-17

    As a means of developing a simple, cost-effective, and reliable method for probing nanoparticle behavior, we have used atomic force microscopy to gain a quantitative 3D visual representation of the deposition patterns of citrate-capped Au nanoparticles on a substrate as a function of (a) sample preparation, (b) the choice of substrate, (c) the dispersion solvent, and (d) the number of loading steps. Specifically, we have found that all four parameters can be independently controlled and manipulated in order to alter the resulting pattern and quantity of as-deposited nanoparticles. From these data, the sample preparation technique appears to influence deposition patterns most broadly, and the dispersion solvent is the most convenient parameter for tuning the quantity of nanoparticles deposited onto the surface under spin-coating conditions. Indeed, we have quantitatively measured the surface coverage for both mica and silicon substrates prepared by (i) evaporation under ambient air, (ii) heat treatment, and (iii) spin-coating. In addition, we have observed a decrease in nanoparticle adhesion to the substrate when the ethylene glycol content of the colloidal dispersion solvent is increased, which has the effect of decreasing interparticle-substrate interactions. Finally, we have shown that substrates prepared by these diverse techniques have potential applicability in surface-enhanced Raman spectroscopy.

  12. Enantioselective Michael Addition of Water

    PubMed Central

    Chen, Bi-Shuang; Resch, Verena; Otten, Linda G; Hanefeld, Ulf

    2015-01-01

    The enantioselective Michael addition using water as both nucleophile and solvent has to date proved beyond the ability of synthetic chemists. Herein, the direct, enantioselective Michael addition of water in water to prepare important β-hydroxy carbonyl compounds using whole cells of Rhodococcus strains is described. Good yields and excellent enantioselectivities were achieved with this method. Deuterium labeling studies demonstrate that a Michael hydratase catalyzes the water addition exclusively with anti-stereochemistry. PMID:25529526

  13. Enantioselective Michael addition of water.

    PubMed

    Chen, Bi-Shuang; Resch, Verena; Otten, Linda G; Hanefeld, Ulf

    2015-02-01

    The enantioselective Michael addition using water as both nucleophile and solvent has to date proved beyond the ability of synthetic chemists. Herein, the direct, enantioselective Michael addition of water in water to prepare important β-hydroxy carbonyl compounds using whole cells of Rhodococcus strains is described. Good yields and excellent enantioselectivities were achieved with this method. Deuterium labeling studies demonstrate that a Michael hydratase catalyzes the water addition exclusively with anti-stereochemistry.

  14. [Clinical evaluation of a novel HBsAg quantitative assay].

    PubMed

    Takagi, Kazumi; Tanaka, Yasuhito; Naganuma, Hatsue; Hiramatsu, Kumiko; Iida, Takayasu; Takasaka, Yoshimitsu; Mizokami, Masashi

    2007-07-01

    The clinical implications of hepatitis B surface antigen (HBsAg) concentrations in HBV-infected individuals remain unclear. The aim of this study was to evaluate a novel fully automated chemiluminescence enzyme immunoassay (Sysmex HBsAg quantitative assay) by comparative measurements of reference serum samples against two independent commercial assays (Lumipulse f or Architect HBsAg QT). Furthermore, its clinical usefulness was assessed for monitoring serum HBsAg levels during antiviral therapy. A dilution test using 5 reference serum samples showed a linear correlation curve over the range 0.03 to 2,360 IU/ml. HBsAg was measured in a total of 400 serum samples, and 99.8% gave consistent results between Sysmex and Lumipulse f. Additionally, a positive linear correlation was observed between Sysmex and Architect. To compare Architect and Sysmex, both methods were applied to quantify HBsAg in serum samples of different HBV genotypes/subgenotypes, as well as in sera containing HBV vaccine escape mutants (126S, 145R). Correlation between the methods was observed for the escape mutants and for the genotypes (A, B, C) common in Japan. During lamivudine therapy, an increase in HBsAg and HBV DNA concentrations preceded the aminotransferase (ALT) elevation associated with the emergence of drug-resistant HBV variants (breakthrough hepatitis). In conclusion, the reliability of the Sysmex HBsAg quantitative assay was confirmed for all HBV genetic variants common in Japan. Monitoring of serum HBsAg concentrations, in addition to HBV DNA quantification, is helpful in evaluating the response to lamivudine treatment and in diagnosing breakthrough hepatitis.

  15. Quantitative rainbow schlieren deflectometry

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Klimek, Robert B.; Buchele, Donald R.

    1995-01-01

    In the rainbow schlieren apparatus, a continuously graded rainbow filter is placed in the back focal plane of the decollimating lens. Refractive-index gradients in the test section thus appear as gradations in hue rather than irradiance. A simple system is described wherein a conventional color CCD array and video digitizer are used to quantify accurately the color attributes of the resulting image, and hence the associated ray deflections. The present system provides a sensitivity comparable with that of conventional interferometry, while being simpler to implement and less sensitive to mechanical misalignment.
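
    A minimal sketch of the hue-to-deflection conversion described above is given below in Python. It assumes a linear calibration between hue and position across the rainbow filter and uses the small-angle relation deflection = displacement / focal length; the filter width, focal length, and hue calibration endpoints are hypothetical values, not those of the instrument in the paper.

      import colorsys

      # Hypothetical optical parameters, for illustration only.
      FILTER_WIDTH_M = 3.0e-3      # usable width of the rainbow filter
      FOCAL_LENGTH_M = 1.0         # focal length of the decollimating lens
      HUE_MIN, HUE_MAX = 0.0, 0.8  # hue range assumed to span the filter linearly

      def deflection_from_rgb(r, g, b):
          """Map an RGB pixel (0-1 floats) to a ray-deflection angle in radians."""
          hue, _, _ = colorsys.rgb_to_hsv(r, g, b)
          # Linear hue -> position calibration across the filter plane.
          frac = (hue - HUE_MIN) / (HUE_MAX - HUE_MIN)
          displacement = (frac - 0.5) * FILTER_WIDTH_M   # offset from filter centre
          return displacement / FOCAL_LENGTH_M           # small-angle deflection

      print(deflection_from_rgb(0.2, 0.9, 0.3))  # deflection for a greenish pixel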

  16. Gasoline additives, emissions, and performance

    SciTech Connect

    1995-12-31

    The papers included in this publication deal with the influence of fuel, additive, and hardware changes on a variety of vehicle performance characteristics. Advanced techniques for measuring these performance parameters are also described. Contents include: Fleet test evaluation of gasoline additives for intake valve and combustion chamber deposit clean up; A technique for evaluating octane requirement additives in modern engines on dynamometer test stands; A fleet test of two additive technologies comparing their effects on tailpipe emissions; Investigation into the vehicle exhaust emissions of high percentage ethanol blends; Variability in hydrocarbon speciation measurements at low emission (ULEV) levels; and more.

  17. Additively manufactured porous tantalum implants.

    PubMed

    Wauthle, Ruben; van der Stok, Johan; Amin Yavari, Saber; Van Humbeeck, Jan; Kruth, Jean-Pierre; Zadpoor, Amir Abbas; Weinans, Harrie; Mulier, Michiel; Schrooten, Jan

    2015-03-01

    The medical device industry's interest in open porous, metallic biomaterials has increased in response to additive manufacturing techniques enabling the production of complex shapes that cannot be produced with conventional techniques. Tantalum is an important metal for medical devices because of its good biocompatibility. In this study selective laser melting technology was used for the first time to manufacture highly porous pure tantalum implants with fully interconnected open pores. The architecture of the porous structure in combination with the material properties of tantalum result in mechanical properties close to those of human bone and allow for bone ingrowth. The bone regeneration performance of the porous tantalum was evaluated in vivo using an orthotopic load-bearing bone defect model in the rat femur. After 12 weeks, substantial bone ingrowth, good quality of the regenerated bone and a strong, functional implant-bone interface connection were observed. Compared to identical porous Ti-6Al-4V structures, laser-melted tantalum shows excellent osteoconductive properties, has a higher normalized fatigue strength and allows for more plastic deformation due to its high ductility. It is therefore concluded that this is a first step towards a new generation of open porous tantalum implants manufactured using selective laser melting.

  18. Additively manufactured porous tantalum implants.

    PubMed

    Wauthle, Ruben; van der Stok, Johan; Amin Yavari, Saber; Van Humbeeck, Jan; Kruth, Jean-Pierre; Zadpoor, Amir Abbas; Weinans, Harrie; Mulier, Michiel; Schrooten, Jan

    2015-03-01

    The medical device industry's interest in open porous, metallic biomaterials has increased in response to additive manufacturing techniques enabling the production of complex shapes that cannot be produced with conventional techniques. Tantalum is an important metal for medical devices because of its good biocompatibility. In this study selective laser melting technology was used for the first time to manufacture highly porous pure tantalum implants with fully interconnected open pores. The architecture of the porous structure in combination with the material properties of tantalum result in mechanical properties close to those of human bone and allow for bone ingrowth. The bone regeneration performance of the porous tantalum was evaluated in vivo using an orthotopic load-bearing bone defect model in the rat femur. After 12 weeks, substantial bone ingrowth, good quality of the regenerated bone and a strong, functional implant-bone interface connection were observed. Compared to identical porous Ti-6Al-4V structures, laser-melted tantalum shows excellent osteoconductive properties, has a higher normalized fatigue strength and allows for more plastic deformation due to its high ductility. It is therefore concluded that this is a first step towards a new generation of open porous tantalum implants manufactured using selective laser melting. PMID:25500631

  19. Color Addition and Subtraction Apps

    ERIC Educational Resources Information Center

    Ruiz, Frances; Ruiz, Michael J.

    2015-01-01

    Color addition and subtraction apps in HTML5 have been developed for students as an online hands-on experience so that they can more easily master principles introduced through traditional classroom demonstrations. The evolution of the additive RGB color model is traced through the early IBM color adapters so that students can proceed step by step…

  20. Quantitative mass spectrometry: an overview.

    PubMed

    Urban, Pawel L

    2016-10-28

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements.This article is part of the themed issue 'Quantitative mass spectrometry'. PMID:27644965

  1. Quantitative mass spectrometry: an overview

    PubMed Central

    2016-01-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry—especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644965

  2. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  3. Quantitation of Microorganisms in Sputum

    PubMed Central

    Monroe, P. W.; Muchmore, H. G.; Felton, F. G.; Pirtle, J. K.

    1969-01-01

    A method of quantitating microbial cultures of homogenized sputum has been devised. Possible application of this method to the problem of determining the etiologic agent of lower-respiratory-tract infections has been studied to determine its usefulness as a guide in the management of these infections. Specimens were liquefied by using an equal volume of 2% N-acetyl-L-cysteine. The liquefied sputum suspension was serially diluted to 10-1, 10-3, 10-5, and 10-7. These dilutions were plated on appropriate media by using an 0.01-ml calibrated loop; they were incubated and examined by standard diagnostic methods. Quantitation of fresh sputum from patients with pneumonia prior to antimicrobial therapy revealed that probable pathogens were present in populations of 107 organisms/ml or greater. Normal oropharyngeal flora did not occur in these numbers before therapy. Comparison of microbial counts on fresh and aged sputum showed that it is necessary to use only fresh specimens, since multiplication or death alters both quantitative and qualitative findings. Proper collection and quantitative culturing of homogenized sputum provided information more directly applicable to patient management than did qualitative routine methods. Not only was the recognition of the probable pathogenic organism in pneumonia patients improved, but serial quantitative cultures were particularly useful in recognizing the emergence of superinfections and in evaluating the efficacy of antimicrobial therapy. PMID:4390055
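
    The colony counts described above translate into organisms per millilitre by simple arithmetic: divide the colony count by the plated volume and by the dilution factor. A short Python sketch with a hypothetical count is given below; the 0.01-ml loop volume and the dilution series follow the abstract.

      LOOP_VOLUME_ML = 0.01  # calibrated loop used for plating

      def cfu_per_ml(colony_count, dilution):
          """Back-calculate organisms/ml of the original homogenized sputum.

          dilution is the fraction of the original suspension plated,
          e.g. 1e-5 for the 10^-5 dilution.
          """
          return colony_count / (LOOP_VOLUME_ML * dilution)

      # Hypothetical count: 25 colonies grown from the 10^-5 dilution.
      print(f"{cfu_per_ml(25, 1e-5):.1e} CFU/ml")  # 2.5e+08 CFU/ml, i.e. >= 10^7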

  4. Quantitative mass spectrometry: an overview.

    PubMed

    Urban, Pawel L

    2016-10-28

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements.This article is part of the themed issue 'Quantitative mass spectrometry'.

  5. Determining cleanup levels in bioremediation: Quantitative structure activity relationship techniques

    SciTech Connect

    Arulgnanendran, V.R.J.; Nirmalakhandan, N.

    1995-12-31

    An important feature in the process of planning and initiating bioremediation is the quantification of the toxicity of either an individual chemical or, when multiple chemicals are involved, a group of chemicals. A laboratory protocol was developed to test the toxicity of single chemicals and mixtures of organic chemicals in a soil medium. A portion of these chemicals is used as a training set to develop Quantitative Structure Activity Relationship (QSAR) models. These predictive models are then tested using the chemicals in the testing set, i.e., the remaining chemicals. Moreover, mixtures of 10 contaminants each are tested experimentally to determine the joint toxicity of chemical mixtures. Using the concepts of Toxic Units, the Additivity Index, and the Mixture Toxicity Index, the laboratory results are tested for additive, synergistic, or antagonistic effects of the contaminants. These concepts are further validated on mixtures containing eight chemicals that are tested in the laboratory. In addition to their use in evaluating cleanup levels for hazardous waste locations, the predictive models are useful for predicting the microbial toxicity in soils of new chemicals from a congeneric group acting by the same mode of toxicity. These models are applicable when the contaminants act singly or jointly in a mixture.
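
    The abstract does not spell out the formulas, but the Toxic Unit and Additivity Index concepts it names are commonly computed as sketched below in Python. The sketch follows Marking's widely used additive index, which is an assumption here rather than a statement of the authors' exact procedure: each component contributes a toxic unit equal to its concentration divided by its individual effect concentration, and the sign of the index flags greater-than-additive or less-than-additive joint toxicity.

      def toxic_units(concentrations, ec50s):
          """Toxic unit of each mixture component: C_i / EC50_i."""
          return [c / e for c, e in zip(concentrations, ec50s)]

      def additivity_index(concentrations, ec50s):
          """Marking-style additive index from the sum of toxic units S.

          AI = 1/S - 1 when S <= 1, otherwise AI = 1 - S.
          AI ~ 0: additive; AI > 0: greater than additive; AI < 0: less than additive.
          """
          s = sum(toxic_units(concentrations, ec50s))
          return (1.0 / s - 1.0) if s <= 1.0 else (1.0 - s)

      # Hypothetical three-component mixture (mg/l) and single-chemical EC50s (mg/l).
      conc = [2.0, 5.0, 1.0]
      ec50 = [10.0, 20.0, 8.0]
      print(sum(toxic_units(conc, ec50)), additivity_index(conc, ec50))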

  6. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate the effects new therapies might have. The combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here was developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features, including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable, and accurate while allowing the pathologist to control the measurement process.
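
    As a minimal illustration of the kind of geometric descriptors mentioned above, the Python sketch below estimates the area and perimeter of a segmented lumen from a binary mask. It is a generic pixel-counting approximation under an assumed pixel size, not the feature-extraction pipeline described in the paper.

      import numpy as np

      def area_and_perimeter(mask, pixel_size_um):
          """Crude area/perimeter estimate for a binary mask (True = lumen)."""
          mask = mask.astype(bool)
          area = mask.sum() * pixel_size_um ** 2
          # Count exposed pixel edges (4-connectivity) as a perimeter approximation.
          padded = np.pad(mask, 1)
          edges = (padded[1:-1, 1:-1] & ~padded[:-2, 1:-1]).sum() \
                + (padded[1:-1, 1:-1] & ~padded[2:, 1:-1]).sum() \
                + (padded[1:-1, 1:-1] & ~padded[1:-1, :-2]).sum() \
                + (padded[1:-1, 1:-1] & ~padded[1:-1, 2:]).sum()
          return area, edges * pixel_size_um

      # Hypothetical 5x5 mask with a 3x3 lumen and 2-um pixels.
      demo = np.zeros((5, 5), dtype=bool)
      demo[1:4, 1:4] = True
      print(area_and_perimeter(demo, 2.0))  # (36.0 um^2, 24.0 um)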

  7. Quantitative proteome analysis in cardiovascular physiology and pathology. I. Data processing.

    PubMed

    Grussenmeyer, Thomas; Meili-Butz, Silvia; Dieterle, Thomas; Traunecker, Emmanuel; Carrel, Thierry P; Lefkovits, Ivan

    2008-12-01

    A methodological evaluation of the proteomic analysis of cardiovascular tissue material was performed, with special emphasis on establishing procedures that allow reliable quantitative analysis of silver-stained readouts. Reliability, reproducibility, robustness, and linearity were addressed and clarified. In addition, several types of normalization procedures were evaluated and new approaches are proposed. It was found that the silver-stained readout offers a convenient approach for quantitation if a linear range for gel loading is defined. A broad, 10-fold input range (loading 20-200 microg per gel) fulfills the linearity criteria, although at the lowest input (20 microg) a portion of protein species remains undetected. The method is reliable and reproducible within a range of 65-200 microg input. The normalization procedure using the sum of all spot intensities from a silver-stained 2D pattern was shown to be less reliable than other approaches, namely normalization through the median or through the interquartile range. A refinement of the normalization through virtual segmentation of the pattern, with calculation of a normalization factor for each stratum, provides highly satisfactory results. The presented results not only provide evidence for the usefulness of silver-stained gels for quantitative evaluation but are also directly applicable to the research endeavor of monitoring alterations in cardiovascular pathophysiology.
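
    For readers who want to experiment with the normalization variants compared above, the Python sketch below applies two generic robust schemes to a vector of spot intensities from one gel: scaling by the gel median, and a median/interquartile-range transformation. These are standard formulations chosen for illustration; the stratified "virtual segmentation" procedure of the paper is not reproduced here.

      import numpy as np

      def normalize_by_median(spot_intensities):
          """Scale all spots of a gel so that the gel median becomes 1."""
          spots = np.asarray(spot_intensities, dtype=float)
          return spots / np.median(spots)

      def normalize_median_iqr(spot_intensities):
          """Centre on the median and scale by the interquartile range."""
          spots = np.asarray(spot_intensities, dtype=float)
          q1, q3 = np.percentile(spots, [25, 75])
          return (spots - np.median(spots)) / (q3 - q1)

      # Hypothetical spot intensities from a silver-stained gel.
      demo = [120.0, 80.0, 450.0, 95.0, 310.0, 60.0]
      print(normalize_by_median(demo))
      print(normalize_median_iqr(demo))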

  8. Evaluation of additive element to improve PZT piezoelectricity by using first-principles calculation

    NASA Astrophysics Data System (ADS)

    Yasoda, Yutaka; Uetsuji, Yasutomo; Tsuchiya, Kazuyoshi

    2015-12-01

    Piezoelectric materials are important functional materials for Bio-MEMS (Biological Micro Electro Mechanical Systems) actuators and sensors. For Bio-MEMS implementation, piezoelectric thin films are typically fabricated by sputtering to enable miniaturization. Among piezoelectric materials, perovskite-type materials with the ABO3 structure exhibit high piezoelectricity, and PZT (lead zirconate titanate), a perovskite-type piezoelectric, is widely used because it is easy to produce and highly piezoelectric. In PZT, zirconium or titanium occupies the B site of the ABO3 structure, and the physical properties change markedly with the zirconium-to-titanium ratio at the B site. Because the B site strongly influences the physical properties, performance improvement by additive elements at this site has been widely attempted; however, the prevailing experimental approaches are costly and insufficiently quantitative, so their results are difficult to apply, and a new method for evaluating B-site additive elements for sputter fabrication is needed. Accordingly, in this study additive elements were screened quantitatively and at low cost from an energetic viewpoint using first-principles calculations. First, additive elements capable of substituting at the B site of PZT were identified. Next, the change in piezoelectricity was evaluated from the change in crystal structure of PZT systems containing these B-site substituents. As a result, additive elements for the PZT B site capable of improving the piezoelectricity were determined.

  9. Color Addition and Subtraction Apps

    NASA Astrophysics Data System (ADS)

    Ruiz, Frances; Ruiz, Michael J.

    2015-10-01

    Color addition and subtraction apps in HTML5 have been developed for students as an online hands-on experience so that they can more easily master principles introduced through traditional classroom demonstrations. The evolution of the additive RGB color model is traced through the early IBM color adapters so that students can proceed step by step in understanding mathematical representations of RGB color. Finally, color addition and subtraction are presented for the X11 colors from web design to illustrate yet another real-life application of color mixing.
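
    The additive and subtractive mixing behind such apps can be illustrated in a few lines of Python. The sketch below uses the usual simplified models: additive mixing sums RGB components (clipped at 255), while subtractive mixing of ideal filters multiplies the transmitted fractions. These are textbook approximations, not the code of the apps described above.

      def add_colors(c1, c2):
          """Additive RGB mixing (e.g. overlapping light beams), clipped to 255."""
          return tuple(min(a + b, 255) for a, b in zip(c1, c2))

      def subtract_colors(c1, c2):
          """Subtractive mixing of two ideal filters: multiply transmitted fractions."""
          return tuple(round(a * b / 255) for a, b in zip(c1, c2))

      red, green = (255, 0, 0), (0, 255, 0)
      yellow, cyan = (255, 255, 0), (0, 255, 255)

      print(add_colors(red, green))        # (255, 255, 0) -> yellow light
      print(subtract_colors(yellow, cyan)) # (0, 255, 0)   -> green filter overlap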

  10. Qualitative and quantitative analysis of steroidal saponins in crude extract and bark powder of Yucca schidigera Roezl.

    PubMed

    Kowalczyk, Mariusz; Pecio, Łukasz; Stochmal, Anna; Oleszek, Wiesław

    2011-08-10

    Steroidal saponins in a commercial stem syrup and in an extract of the bark of Yucca schidigera were identified by high-performance liquid chromatography ion-trap mass spectrometry and quantitated using ultraperformance liquid chromatography with quadrupole mass spectrometric detection. Fragmentation patterns of the yucca saponins were generated using collision-induced dissociation and compared with the fragmentation of authentic standards as well as with published spectrometric information. In addition to the detection of twelve saponins known to occur in Y. schidigera, the collected fragmentation data led to tentative identifications of seven new saponins. A quantitation method for all 19 detected compounds was developed and validated. Samples derived from the syrup and the bark of yucca were quantitatively measured and compared. The results indicate that yucca bark accumulates polar, bidesmosidic saponins, whereas steroidal glycosides with medium- and short-length saccharide chains predominate in the stem. The newly developed method provides an opportunity to evaluate the composition of yucca products available on the market. PMID:21721553

  11. Quantitative genetic models for describing simultaneous and recursive relationships between phenotypes.

    PubMed Central

    Gianola, Daniel; Sorensen, Daniel

    2004-01-01

    Multivariate models are of great importance in theoretical and applied quantitative genetics. We extend quantitative genetic theory to accommodate situations in which there is linear feedback or recursiveness between the phenotypes involved in a multivariate system, assuming an infinitesimal, additive, model of inheritance. It is shown that structural parameters defining a simultaneous or recursive system have a bearing on the interpretation of quantitative genetic parameter estimates (e.g., heritability, offspring-parent regression, genetic correlation) when such features are ignored. Matrix representations are given for treating a plethora of feedback-recursive situations. The likelihood function is derived, assuming multivariate normality, and results from econometric theory for parameter identification are adapted to a quantitative genetic setting. A Bayesian treatment with a Markov chain Monte Carlo implementation is suggested for inference and developed. When the system is fully recursive, all conditional posterior distributions are in closed form, so Gibbs sampling is straightforward. If there is feedback, a Metropolis step may be embedded for sampling the structural parameters, since their conditional distributions are unknown. Extensions of the model to discrete random variables and to nonlinear relationships between phenotypes are discussed. PMID:15280252

  12. Teebi hypertelorism syndrome: additional cases.

    PubMed

    Machado-Paula, Ligiane Alves; Guion-Almeida, Maria Leine

    2003-03-01

    We report on two unrelated Brazilian boys who have craniofacial and digital anomalies resembling those reported with Teebi hypertelorism syndrome. Additional features such as cleft lip and palate, large uvula, atypical chin and abnormal scapulae were observed.

  13. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field.

  14. Polyolefins as additives in plastics

    SciTech Connect

    Deanin, R.D.

    1993-12-31

    Polyolefins are not only major commodity plastics - they are also very useful as additives, both in other polyolefins and also in other types of plastics. This review covers ethylene, propylene, butylene and isobutylene polymers, in blends with each other, and as additives to natural rubber, styrene/butadiene rubber, polystyrene, polyvinyl chloride, polymethyl methacrylate, polyphenylene oxide, polycarbonate, thermoplastic polyesters, polyurethanes, polyamides, and mixed automotive plastics recycling.

  15. Quantitative measurement of nanomechanical properties in composite materials

    NASA Astrophysics Data System (ADS)

    Zhao, Wei

    results significantly, and new, power-law body of revolution models of the probe tip geometry have been applied. Due to the low yield strength of polymers compared with other engineering materials, elastic-plastic contact is considered to better represent the epoxy surface response and was used to acquire more accurate quantitative measurements. Visco-elastic contact response was introduced in the boundary condition of the AFAM cantilever vibration model, due to the creep nature of epoxy, to determine time-dependent effects. These methods have a direct impact on the quantitative measurement capabilities of near-filler interphase regions in polymers and composites and on the long-term influence of environmental conditions on composites. In addition, quantitative AFAM scans were made on distal surfaces of human bicuspids and molars to determine the microstructural and spatial variation in nanomechanical properties of the enamel biocomposite. Single-point AFAM measurements were performed on individual enamel prism and sheath locations to determine the spatial elastic modulus. The mechanical property variation of enamel is associated with differences in the mineral-to-organic content and the apatite crystal orientations within the enamel microstructure. Also, variation in the elastic modulus of the enamel ultrastructure was observed in measurements at the outer enamel versus near the dentine-enamel junction (DEJ).

  16. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. A quantitative evaluation of the efficiency and performance of the CTSA program provides an important reference for decision making in global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program cover both clinical and basic medical research fields. The academic benefits of the CTSA program include helping its members build a robust academic home for clinical and translational science and attract additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results.

  17. Food additives and preschool children.

    PubMed

    Martyn, Danika M; McNulty, Breige A; Nugent, Anne P; Gibney, Michael J

    2013-02-01

    Food additives have been used throughout history to perform specific functions in foods. A comprehensive framework of legislation is in place within Europe to control the use of additives in the food supply and to ensure they pose no risk to human health. Further to this, exposure assessments are regularly carried out to monitor population intakes and verify that intakes are not above acceptable levels (acceptable daily intakes). Young children may have a higher dietary exposure to chemicals than adults owing to a combination of rapid growth rates and distinct food intake patterns. For this reason, exposure assessments are particularly important in this age group. The paper reviews the use of additives and exposure assessment methods and examines factors that affect dietary exposure in young children. One of the most widely investigated unfavourable health effects associated with food additive intake in preschool-aged children is the suggested adverse effect on behaviour. Research that has examined this relationship has reported a variety of responses, with many studies noting an increase in hyperactivity as reported by parents but not when assessed by objective examiners. This review examines the experimental approaches used in such studies and suggests that efforts are needed to standardise objective methods of measuring behaviour in preschool children. Further to this, a more holistic approach to examining food additive intakes by preschool children is advisable, where overall exposure is considered rather than focusing solely on behavioural effects, possibly examining intakes of food additives other than food colours.

  18. Quantitative resilience analysis through control design.

    SciTech Connect

    Sunderland, Daniel; Vugrin, Eric D.; Camphouse, Russell Chris

    2009-09-01

    Critical infrastructure resilience has become a national priority for the U. S. Department of Homeland Security. System resilience has been studied for several decades in many different disciplines, but no standards or unifying methods exist for critical infrastructure resilience analysis. Few quantitative resilience methods exist, and those existing approaches tend to be rather simplistic and, hence, not capable of sufficiently assessing all aspects of critical infrastructure resilience. This report documents the results of a late-start Laboratory Directed Research and Development (LDRD) project that investigated the development of quantitative resilience through application of control design methods. Specifically, we conducted a survey of infrastructure models to assess what types of control design might be applicable for critical infrastructure resilience assessment. As a result of this survey, we developed a decision process that directs the resilience analyst to the control method that is most likely applicable to the system under consideration. Furthermore, we developed optimal control strategies for two sets of representative infrastructure systems to demonstrate how control methods could be used to assess the resilience of the systems to catastrophic disruptions. We present recommendations for future work to continue the development of quantitative resilience analysis methods.

  19. Nanostructured surfaces investigated by quantitative morphological studies

    NASA Astrophysics Data System (ADS)

    Perani, Martina; Carapezzi, Stefania; Rani Mutta, Geeta; Cavalcoli, Daniela

    2016-05-01

    The morphology of different surfaces has been investigated by atomic force microscopy and analyzed quantitatively in this paper. Two different tools have been employed for this purpose: the analysis of the height-height correlation function and the determination of the mean grain size, which have been combined to obtain a complete characterization of the surfaces. Different materials have been analyzed: SiOxNy, InGaN/GaN quantum wells and Si nanowires, grown with different techniques. Notwithstanding the presence of grain-like structures on all the samples analyzed, they present very diverse surface designs, underlining that this procedure is of general use. Our results show that the quantitative analysis of nanostructured surfaces allows us to obtain interesting information, such as grain clustering, from the comparison of the lateral correlation length and the grain size.
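
    A minimal Python sketch of the height-height correlation analysis mentioned above is given below: it computes the one-dimensional correlation function along the fast-scan direction of a height map, from which the saturation value (roughness) and the lateral correlation length can then be read off or fitted. The synthetic input and the restriction to row-wise lags are simplifications for illustration.

      import numpy as np

      def hhcf_rows(height_map, max_lag):
          """H(r) = <[h(x + r) - h(x)]^2>, averaged over the rows of the image."""
          h = np.asarray(height_map, dtype=float)
          return np.array([np.mean((h[:, lag:] - h[:, :-lag]) ** 2)
                           for lag in range(1, max_lag + 1)])

      # Synthetic, smoothed random surface purely for demonstration.
      rng = np.random.default_rng(0)
      surface = rng.normal(size=(128, 128))
      kernel = np.ones(8) / 8.0
      surface = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"),
                                    1, surface)

      H = hhcf_rows(surface, max_lag=40)
      # At small lags H grows roughly as r^(2*alpha); at large lags it saturates
      # near 2*w^2, and the crossover lag estimates the lateral correlation length.
      print(H[:5], H[-1])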

  20. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  1. Quantitative aspects of the Galperin L parameter

    NASA Astrophysics Data System (ADS)

    Kosik, J. C.

    2007-12-01

    A new geomagnetic parameter was suggested twenty years ago by Y. Galperin, the Galperin L parameter, and it was introduced into the CNES Maglib for French-Russian projects in the exploration of the distant magnetosphere. The definition and the advantages of the Galperin L parameter are recalled in this brief paper. Unforeseen possibilities in the use of this parameter for mathematical models of the magnetosphere are stressed using past results obtained with the Mead model. The Galperin L parameter is shown to add, in the synchronous region, a quantitative capability to the qualitative description (labelling) of the magnetosphere. More work will be necessary to adapt past mathematical models to present numerical models and extend the domain of the quantitative applications of the Galperin L parameter.

  2. Quantitative assessment of scientific quality

    NASA Astrophysics Data System (ADS)

    Heinzl, Harald; Bloching, Philipp

    2012-09-01

    Scientific publications, authors, and journals are commonly evaluated with quantitative bibliometric measures. Frequently-used measures will be reviewed and their strengths and weaknesses will be highlighted. Reflections about conditions for a new, research paper-specific measure will be presented.

  3. Quantitative Research in Written Composition.

    ERIC Educational Resources Information Center

    Gebhard, Ann O.

    Offered as an introductory guide to teachers interested in approaching written English as a "second dialect" that students must master, this review covers quantitative investigations of written language. The first section deals with developmental studies, describing how a variety of researchers have related written structure to writer maturity.…

  4. Equilibria in Quantitative Reachability Games

    NASA Astrophysics Data System (ADS)

    Brihaye, Thomas; Bruyère, Véronique; de Pril, Julie

    In this paper, we study turn-based quantitative multiplayer non zero-sum games played on finite graphs with reachability objectives. In this framework each player aims at reaching his own goal as soon as possible. We prove existence of finite-memory Nash (resp. secure) equilibria in multiplayer (resp. two-player) games.

  5. Extension of the standard addition method by blank addition.

    PubMed

    Steliopoulos, Panagiotis

    2015-01-01

    Standard addition involves adding varying amounts of the analyte to sample portions of fixed mass or fixed volume and submitting those portions to the sample preparation procedure. After measuring the final extract solutions, the observed signals are linearly regressed on the spiked amounts. The original unknown amount is estimated by the opposite of the abscissa intercept of the fitted straight line [1]. A limitation of this method is that only data points with abscissa values equal to and greater than zero are available, so that there is no information on whether linearity holds below the spiking level zero. An approach to overcome this limitation is introduced.
    • Standard addition is combined with blank addition.
    • Blank addition means that defined mixtures of blank matrix and sample material are subjected to sample preparation to give final extract solutions.
    • Equations are presented to estimate the original unknown amount and to calculate the 1-2α confidence interval about this estimate using the combined data set.
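
    The basic standard-addition estimate described above reduces to a one-line regression, sketched below in Python with hypothetical spike levels and signals: the fitted line's abscissa intercept is -intercept/slope, so the original amount is estimated as intercept/slope. The blank-addition extension of the paper is not reproduced here.

      import numpy as np

      # Hypothetical spiked amounts (e.g. ng added) and measured signals.
      spiked = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
      signal = np.array([2.1, 3.2, 4.0, 6.3, 10.2])

      slope, intercept = np.polyfit(spiked, signal, 1)

      # The abscissa intercept of the fitted line is -intercept/slope;
      # the original unknown amount is its opposite.
      x0_estimate = intercept / slope
      print(round(x0_estimate, 2))  # estimated amount originally present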

  6. ADDITIVITY ASSESSMENT OF TRIHALOMETHANE MIXTURES BY PROPORTIONAL RESPONSE ADDITION

    EPA Science Inventory

    If additivity is known or assumed, the toxicity of a chemical mixture may be predicted from the dose response curves of the individual chemicals comprising the mixture. As single chemical data are abundant and mixture data sparse, mixture risk methods that utilize single chemical...

  8. The addition of rituximab to a combination of fludarabine, cyclophosphamide, mitoxantrone (FCM) significantly increases the response rate and prolongs survival as compared with FCM alone in patients with relapsed and refractory follicular and mantle cell lymphomas: results of a prospective randomized study of the German Low-Grade Lymphoma Study Group.

    PubMed

    Forstpointner, Roswitha; Dreyling, Martin; Repp, Roland; Hermann, Sandra; Hänel, Annette; Metzner, Bernd; Pott, Christiane; Hartmann, Frank; Rothmann, Frank; Rohrberg, Robert; Böck, Hans-Peter; Wandt, Hannes; Unterhalt, Michael; Hiddemann, Wolfgang

    2004-11-15

    In follicular lymphoma (FL) and mantle cell lymphoma (MCL) the monoclonal antibody rituximab may improve the prognosis when combined with chemotherapy. This was investigated in a prospective randomized study in patients with relapsed disease. A total of 147 patients were randomized to receive 4 courses of chemotherapy with 25 mg/m(2) fludarabine on days 1 to 3, 200 mg/m(2) cyclophosphamide on days 1 to 3, and 8 mg/m(2) mitoxantrone on day 1 (FCM), alone or combined with rituximab (375 mg/m(2); R-FCM). Of 128 evaluable patients, 62 were randomized for FCM and 66 for R-FCM. R-FCM revealed an overall response rate of 79% (33% complete remission [CR], 45% partial remission [PR]) as compared with 58% for FCM alone (13% CR, 45% PR; P = .01), with similar results in a subgroup analysis of FL (94% vs 70%) and MCL (58% vs 46%). In the total group, the R-FCM arm was significantly superior concerning progression-free survival (PFS; P = .0381) and overall survival (OS; P = .0030). In FL PFS was significantly longer in the R-FCM arm (P = .0139) whereas in MCL a significantly longer OS was observed (P = .0042). There were no differences in clinically relevant side effects in both study arms. Hence, the addition of rituximab to FCM chemotherapy significantly improves the outcome of relapsed or refractory FL and MCL.

  9. Resolving the Quantitative-Qualitative Dilemma: A Critical Realist Approach

    ERIC Educational Resources Information Center

    Scott, David

    2007-01-01

    The philosophical issues underpinning the quantitative-qualitative divide in educational research are examined. Three types of argument which support a resolution are considered: pragmatism, false duality and warranty through triangulation. In addition a number of proposed strategies--alignment, sequencing, translation and triangulation--are…

  10. [INVITED] Lasers in additive manufacturing

    NASA Astrophysics Data System (ADS)

    Pinkerton, Andrew J.

    2016-04-01

    Additive manufacturing is a topic of considerable ongoing interest, with forecasts predicting it to have major impact on industry in the future. This paper focusses on the current status and potential future development of the technology, with particular reference to the role of lasers within it. It begins by making clear the types and roles of lasers in the different categories of additive manufacturing. This is followed by concise reviews of the economic benefits and disadvantages of the technology, current state of the market and use of additive manufacturing in different industries. Details of these fields are referenced rather than expanded in detail. The paper continues, focusing on current indicators to the future of additive manufacturing. Barriers to its development, trends and opportunities in major industrial sectors, and wider opportunities for its development are covered. Evidence indicates that additive manufacturing may not become the dominant manufacturing technology in all industries, but represents an excellent opportunity for lasers to increase their influence in manufacturing as a whole.

  11. Evaluation of certain food additives.

    PubMed

    2015-01-01

    This report represents the conclusions of a Joint FAO/WHO Expert Committee convened to evaluate the safety of various food additives, including flavouring agents, and to prepare specifications for identity and purity. The first part of the report contains a general discussion of the principles governing the toxicological evaluation of and assessment of dietary exposure to food additives, including flavouring agents. A summary follows of the Committee's evaluations of technical, toxicological and dietary exposure data for eight food additives (Benzoe tonkinensis; carrageenan; citric and fatty acid esters of glycerol; gardenia yellow; lutein esters from Tagetes erecta; octenyl succinic acid-modified gum arabic; octenyl succinic acid-modified starch; paprika extract; and pectin) and eight groups of flavouring agents (aliphatic and alicyclic hydrocarbons; aliphatic and aromatic ethers; ionones and structurally related substances; miscellaneous nitrogen-containing substances; monocyclic and bicyclic secondary alcohols, ketones and related esters; phenol and phenol derivatives; phenyl-substituted aliphatic alcohols and related aldehydes and esters; and sulfur-containing heterocyclic compounds). Specifications for the following food additives were revised: citric acid; gellan gum; polyoxyethylene (20) sorbitan monostearate; potassium aluminium silicate; and Quillaia extract (Type 2). Annexed to the report are tables summarizing the Committee's recommendations for dietary exposures to and toxicological evaluations of all of the food additives and flavouring agents considered at this meeting.

  12. Clinical effects of sulphite additives.

    PubMed

    Vally, H; Misso, N L A; Madan, V

    2009-11-01

    Sulphites are widely used as preservative and antioxidant additives in the food and pharmaceutical industries. Topical, oral or parenteral exposure to sulphites has been reported to induce a range of adverse clinical effects in sensitive individuals, ranging from dermatitis, urticaria, flushing, hypotension, abdominal pain and diarrhoea to life-threatening anaphylactic and asthmatic reactions. Exposure to the sulphites arises mainly from the consumption of foods and drinks that contain these additives; however, exposure may also occur through the use of pharmaceutical products, as well as in occupational settings. While contact sensitivity to sulphite additives in topical medications is increasingly being recognized, skin reactions also occur after ingestion of or parenteral exposure to sulphites. Most studies report a 3-10% prevalence of sulphite sensitivity among asthmatic subjects following ingestion of these additives. However, the severity of these reactions varies, and steroid-dependent asthmatics, those with marked airway hyperresponsiveness, and children with chronic asthma, appear to be at greater risk. In addition to episodic and acute symptoms, sulphites may also contribute to chronic skin and respiratory symptoms. To date, the mechanisms underlying sulphite sensitivity remain unclear, although a number of potential mechanisms have been proposed. Physicians should be aware of the range of clinical manifestations of sulphite sensitivity, as well as the potential sources of exposure. Minor modifications to diet or behaviour lead to excellent clinical outcomes for sulphite-sensitive individuals.

  13. Four-Point Bending as a Method for Quantitatively Evaluating Spinal Arthrodesis in a Rat Model

    PubMed Central

    Robinson, Samuel T; Svet, Mark T; Kanim, Linda A; Metzger, Melodie F

    2015-01-01

    The most common method of evaluating the success (or failure) of rat spinal fusion procedures is manual palpation testing. Whereas manual palpation provides only a subjective binary answer (fused or not fused) regarding the success of a fusion surgery, mechanical testing can provide more quantitative data by assessing variations in strength among treatment groups. We here describe a mechanical testing method to quantitatively assess single-level spinal fusion in a rat model, to improve on the binary and subjective nature of manual palpation as an end point for fusion-related studies. We tested explanted lumbar segments from Sprague–Dawley rat spines after single-level posterolateral fusion procedures at L4–L5. Segments were classified as ‘not fused,’ ‘restricted motion,’ or ‘fused’ by using manual palpation testing. After thorough dissection and potting of the spine, 4-point bending in flexion then was applied to the L4–L5 motion segment, and stiffness was measured as the slope of the moment–displacement curve. Results demonstrated statistically significant differences in stiffness among all groups, which were consistent with preliminary grading according to manual palpation. In addition, the 4-point bending results provided quantitative information regarding the quality of the bony union formed and therefore enabled the comparison of fused specimens. Our results demonstrate that 4-point bending is a simple, reliable, and effective way to describe and compare results among rat spines after fusion surgery. PMID:25730756
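    For illustration, the stiffness measure described above (the slope of the moment-displacement curve) can be computed as in the sketch below; the displacement and moment values are invented, and this is not the authors' analysis code.

        import numpy as np

        # Hypothetical 4-point bending data for one explanted L4-L5 segment:
        # crosshead displacement (mm) and applied bending moment (N*mm).
        displacement = np.array([0.00, 0.05, 0.10, 0.15, 0.20, 0.25])
        moment = np.array([0.0, 1.2, 2.3, 3.5, 4.6, 5.8])

        # Flexion stiffness is taken as the slope of the (approximately linear)
        # moment-displacement curve.
        stiffness, _ = np.polyfit(displacement, moment, 1)
        print(f"Bending stiffness: {stiffness:.1f} N*mm per mm")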

  14. Quantitative wake analysis of a freely swimming fish using 3D synthetic aperture PIV

    NASA Astrophysics Data System (ADS)

    Mendelson, Leah; Techet, Alexandra H.

    2015-07-01

    Synthetic aperture PIV (SAPIV) is used to quantitatively analyze the wake behind a giant danio (Danio aequipinnatus) swimming freely in a seeded quiescent tank. The experiment is designed with minimal constraints on animal behavior to ensure that natural swimming occurs. The fish exhibits forward swimming and turning behaviors at speeds between 0.9 and 1.5 body lengths/second. Results show clearly isolated and linked vortex rings in the wake structure, as well as the thrust jet coming off of a visual hull reconstruction of the fish body. As a benchmark for quantitative analysis of volumetric PIV data, the vortex circulation and impulse are computed using methods consistent with those applied to planar PIV data. Volumetric momentum analysis frameworks are discussed for linked and asymmetric vortex structures, laying a foundation for further volumetric studies of swimming hydrodynamics with SAPIV. Additionally, a novel weighted refocusing method is presented as an improvement to SAPIV reconstruction.
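    The circulation computation mentioned above can be illustrated, under simplifying assumptions, by integrating the out-of-plane vorticity over a slice through the vortex core, as is commonly done for planar PIV data; the vorticity field below is synthetic, not measured.

        import numpy as np

        # Synthetic out-of-plane vorticity on one slice of the measurement volume:
        # a Gaussian vortex core with known circulation, standing in for data
        # derived from an SAPIV velocity field.
        dx = dy = 1.0e-3                              # grid spacing (m)
        x = np.arange(-0.02, 0.02, dx)
        y = np.arange(-0.02, 0.02, dy)
        X, Y = np.meshgrid(x, y)
        gamma_true, r_core = 1.0e-3, 4.0e-3           # circulation (m^2/s), core radius (m)
        omega_z = gamma_true / (np.pi * r_core**2) * np.exp(-(X**2 + Y**2) / r_core**2)

        # Circulation as the area integral of vorticity: Gamma = sum(omega_z * dA).
        circulation = omega_z.sum() * dx * dy
        print(f"Recovered circulation: {circulation:.2e} m^2/s (true value {gamma_true:.1e})")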

  15. Quantitative three-dimensional photoacoustic tomography of the finger joints: an in vivo study

    NASA Astrophysics Data System (ADS)

    Sun, Yao; Sobel, Eric; Jiang, Huabei

    2009-11-01

    We present for the first time in vivo full three-dimensional (3-D) photoacoustic tomography (PAT) of the distal interphalangeal joint in a human subject. Both absorbed energy density and absorption coefficient images of the joint are quantitatively obtained using our finite-element-based photoacoustic image reconstruction algorithm coupled with the photon diffusion equation. The results show that major anatomical features in the joint along with the side arteries can be imaged with a 1-MHz transducer in a spherical scanning geometry. In addition, the cartilages associated with the joint can be quantitatively differentiated from the phalanx. This in vivo study suggests that the 3-D PAT method described has the potential to be used for early diagnosis of joint diseases such as osteoarthritis and rheumatoid arthritis.

  16. The evolution and extinction of the ichthyosaurs from the perspective of quantitative ecospace modelling.

    PubMed

    Dick, Daniel G; Maxwell, Erin E

    2015-07-01

    The role of niche specialization and narrowing in the evolution and extinction of the ichthyosaurs has been widely discussed in the literature. However, previous studies have concentrated on a qualitative discussion of these variables only. Here, we use the recently developed approach of quantitative ecospace modelling to provide a high-resolution quantitative examination of the changes in dietary and ecological niche experienced by the ichthyosaurs throughout their evolution in the Mesozoic. In particular, we demonstrate that despite recent discoveries increasing our understanding of taxonomic diversity among the ichthyosaurs in the Cretaceous, when viewed from the perspective of ecospace modelling, a clear trend of ecological contraction is visible as early as the Middle Jurassic. We suggest that this ecospace redundancy, if carried through to the Late Cretaceous, could have contributed to the extinction of the ichthyosaurs. Additionally, our results suggest a novel model to explain ecospace change, termed the 'migration model'. PMID:26156130

  17. NASA Intellectual Property Negotiation Practices and their Relationship to Quantitative Measures of Technology Transfer

    NASA Technical Reports Server (NTRS)

    Bush, Lance B.

    1997-01-01

    In the current political climate, NASA must be able to show reliable measures demonstrating successful technology transfer. The currently available quantitative data on intellectual property technology transfer efforts portray a less than successful performance. In this paper, the use of only quantitative values for measurement of technology transfer is shown to undervalue the effort. In addition, NASA's current policy in negotiating intellectual property rights results in undervalued royalty rates. NASA has maintained that its position of providing a public good precludes it from negotiating fair market value for its technology and has instead negotiated for reasonable cost in order to recover processing fees. This measurement issue is examined and recommendations are made, including a new policy regarding intellectual property rights negotiation and two measures to supplement the intellectual property measures.

  19. Quantitative analysis of anions in glycosaminoglycans and application in heparin stability studies.

    PubMed

    Liu, Li; Linhardt, Robert J; Zhang, Zhenqing

    2014-06-15

    The sulfo groups of glycosaminoglycans contribute to their high charge densities, and are critical for the role they play in various physiological and pathophysiological processes. Unfortunately, the sulfo groups can be hydrolyzed to inorganic sulfate. Thus, it is important to monitor the presence of these sulfo groups. In addition, free anions, including chloride, sulfate and acetate, are often present in glycosaminoglycans as a result of multiple purification steps, and their presence also needs to be monitored. In this report, ion chromatography with conductivity detection is used to analyze the anions present in glycosaminoglycans, including heparin, heparan sulfate, chondroitin sulfate and dermatan sulfate. This method allows quantitation over a wide range of concentrations, affording a limit of quantitation of 0.1 ppm and a limit of detection of 0.05 ppm for most anions of interest. The stability of heparin was also studied, providing data on the formation of both sulfate and acetate anions.

  20. Gas Chromatographic Determination of Methyl Salicylate in Rubbing Alcohol: An Experiment Employing Standard Addition.

    ERIC Educational Resources Information Center

    Van Atta, Robert E.; Van Atta, R. Lewis

    1980-01-01

    Provides a gas chromatography experiment that applies the quantitative technique of standard addition to the analysis of a minor component, methyl salicylate, in a commercial product, "wintergreen rubbing alcohol." (CS)

  1. Formation Of Cometary Hydrocarbons By Hydrogen Addition Reactions On Cold Grains

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hitomi; Watanabe, N.; Kawakita, H.; Fukushima, T.

    2012-10-01

    Hydrogen addition reactions on cold grains are considered to play an important role in forming many kinds of volatiles under low-temperature conditions such as those in molecular clouds or the early solar nebula. The physical conditions of the early solar nebula (e.g., temperature and gas density) can be investigated via the chemical properties of pristine bodies such as comets. Hydrocarbons like C2H2 and C2H6 have been studied so far, and C2H6 might be a product of successive hydrogen addition to C2H2 on cold grains. To evaluate quantitatively the efficiency of the hydrogen addition reactions from C2H2 to C2H6, we conducted laboratory measurements of those reactions under multiple sample conditions (on H2O ice) at different temperatures (10, 20 and 30 K) with the LASSIE apparatus at Hokkaido University. Our results provide more detailed information about these reactions than previous quantitative studies. We discuss the reaction rates obtained with the different samples and conditions.
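    As a purely illustrative toy model of successive hydrogen addition (lumping the radical intermediates so that C2H2 converts to C2H4 and then to C2H6), the sketch below integrates two first-order steps with assumed effective rate constants; it is not based on the rates measured in this work.

        # Toy sequential first-order kinetics: C2H2 -> C2H4 -> C2H6 under a steady
        # atomic-hydrogen flux. k1 and k2 are assumed effective rates (1/s).
        k1, k2 = 1.0e-3, 5.0e-4
        dt, t_end = 10.0, 2.0e4                   # time step and total time (s)
        n_c2h2, n_c2h4, n_c2h6 = 1.0, 0.0, 0.0    # initial fractional abundances

        t = 0.0
        while t < t_end:
            converted_12 = k1 * n_c2h2 * dt       # C2H2 hydrogenated this step
            converted_23 = k2 * n_c2h4 * dt       # C2H4 hydrogenated this step
            n_c2h2 -= converted_12
            n_c2h4 += converted_12 - converted_23
            n_c2h6 += converted_23
            t += dt

        print(f"Final fractions: C2H2={n_c2h2:.3f}, C2H4={n_c2h4:.3f}, C2H6={n_c2h6:.3f}")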

  2. Additive Manufacturing of Hybrid Circuits

    NASA Astrophysics Data System (ADS)

    Sarobol, Pylin; Cook, Adam; Clem, Paul G.; Keicher, David; Hirschfeld, Deidre; Hall, Aaron C.; Bell, Nelson S.

    2016-07-01

    There is a rising interest in developing functional electronics using additively manufactured components. Considerations in materials selection and pathways to forming hybrid circuits and devices must demonstrate useful electronic function; must enable integration; and must complement the complex shape, low cost, high volume, and high functionality of structural but generally electronically passive additively manufactured components. This article reviews several emerging technologies being used in industry and research/development to provide integration advantages of fabricating multilayer hybrid circuits or devices. First, we review a maskless, noncontact, direct write (DW) technology that excels in the deposition of metallic colloid inks for electrical interconnects. Second, we review a complementary technology, aerosol deposition (AD), which excels in the deposition of metallic and ceramic powder as consolidated, thick conformal coatings and is additionally patternable through masking. Finally, we show examples of hybrid circuits/devices integrated beyond 2-D planes, using combinations of DW or AD processes and conventional, established processes.

  3. Postmarketing surveillance of food additives.

    PubMed

    Butchko, H H; Tschanz, C; Kotsonis, F N

    1994-08-01

    Postmarketing surveillance of consumption and of anecdotal reports of adverse health effects has been recognized by a number of regulatory authorities as a potentially useful method to provide further assurance of the safety of new food additives. Surveillance of consumption is used to estimate more reliably actual consumption levels relative to the acceptable daily intake of a food additive. Surveillance of anecdotal reports of adverse health effects is used to determine the presence of infrequent idiosyncratic responses that may not be predictable from premarket evaluations. The high-intensity sweetener, aspartame, is a food additive that has been the subject of extensive evaluation during the postmarketing period and is thus used as an example to discuss postmarketing surveillance.

  4. Dynamic quantitative photothermal monitoring of cell death of individual human red blood cells upon glucose depletion

    NASA Astrophysics Data System (ADS)

    Vasudevan, Srivathsan; Chen, George Chung Kit; Andika, Marta; Agarwal, Shuchi; Chen, Peng; Olivo, Malini

    2010-09-01

    Red blood cells (RBCs) have been found to undergo "programmed cell death," or eryptosis, and understanding this process can provide more information about apoptosis of nucleated cells. Photothermal (PT) response, a label-free, noninvasive photothermal technique, is proposed as a tool to monitor the cell death process of living human RBCs upon glucose depletion. Since the physiological status of the dying cells is highly sensitive to photothermal parameters (e.g., thermal diffusivity, absorption, etc.), we applied the linear PT response to continuously monitor the death mechanism of RBCs when depleted of glucose. The kinetics of the assay, where the cell's PT response transforms from the linear to the nonlinear regime, are reported. In addition, quantitative monitoring was performed by extracting the relevant photothermal parameters from the PT response. A twofold increase in thermal diffusivity and a size reduction were found in the linear PT response during cell death. Our results reveal that photothermal parameters change earlier than phosphatidylserine externalization (used for fluorescence studies), allowing us to detect the initial stage of eryptosis in a quantitative manner. Hence, the proposed tool, in addition to detecting eryptosis earlier than fluorescence, could also reveal the physiological status of the cells through quantitative photothermal parameter extraction.

  5. Tougher Addition Polyimides Containing Siloxane

    NASA Technical Reports Server (NTRS)

    St. Clair, T. L.; Maudgal, S.

    1986-01-01

    Laminates show increased impact resistances and other desirable mechanical properties. Bismaleamic acid extended by reaction of diaminosiloxane with maleic anhydride in 1:1 molar ratio, followed by reaction with half this molar ratio of aromatic dianhydride. Bismaleamic acid also extended by reaction of diaminosiloxane with maleic anhydride in 1:2 molar ratio, followed by reaction with half this molar ratio of aromatic diamine (Michael-addition reaction). Impact resistances improved over those of unmodified bismaleimide, showing significant increase in toughness. Aromatic addition polyimides developed as both matrix and adhesive resins for applications on future aircraft and spacecraft.

  6. An international collaborative family-based whole genome quantitative trait linkage scan for myopic refractive error

    PubMed Central

    Abbott, Diana; Li, Yi-Ju; Guggenheim, Jeremy A.; Metlapally, Ravikanth; Malecaze, Francois; Calvas, Patrick; Rosenberg, Thomas; Paget, Sandrine; Zayats, Tetyana; Mackey, David A.; Feng, Sheng

    2012-01-01

    Purpose: To investigate quantitative trait loci linked to refractive error, we performed a genome-wide quantitative trait linkage analysis using single nucleotide polymorphism markers and family data from five international sites. Methods: Genomic DNA samples from 254 families were genotyped by the Center for Inherited Disease Research using the Illumina Linkage Panel IVb. Quantitative trait linkage analysis was performed on 225 Caucasian families and 4,656 markers after accounting for linkage disequilibrium and quality control exclusions. Two refractive quantitative phenotypes, sphere (SPH) and spherical equivalent (SE), were analyzed. The SOLAR program was used to estimate identity by descent probabilities and to conduct two-point and multipoint quantitative trait linkage analyses. Results: We found 29 markers and 11 linkage regions reaching peak two-point and multipoint logarithm of the odds (LOD) scores >1.5. Four linkage regions revealed at least one LOD score greater than 2: chromosome 6q13–6q16.1 (LOD=1.96 for SPH, 2.18 for SE), chromosome 5q35.1–35.2 (LOD=2.05 for SPH, 1.80 for SE), chromosome 7q11.23–7q21.2 (LOD=1.19 for SPH, 2.03 for SE), and chromosome 3q29 (LOD=1.07 for SPH, 2.05 for SE). Among these, the chromosome 6 and chromosome 5 regions showed the most consistent results between SPH and SE. Four linkage regions with multipoint scores above 1.5 are near or within the known myopia (MYP) loci of MYP3, MYP12, MYP14, and MYP16. Overall, we observed consistent linkage signals across the SPH and SE phenotypes, although scores were generally higher for the SE phenotype. Conclusions: Our quantitative trait linkage analyses of a large myopia family cohort provided additional evidence for several known MYP loci, and identified two additional potential loci at chromosome 6q13–16.1 and chromosome 5q35.1–35.2 for myopia. These results will benefit the efforts toward determining genes for myopic refractive error. PMID:22509102

  7. Quantitative measurement of feline colonic transit

    SciTech Connect

    Krevsky, B.; Somers, M.B.; Maurer, A.H.; Malmud, L.S.; Knight, L.C.; Fisher, R.S.

    1988-10-01

    Colonic transit scintigraphy, a method for quantitatively evaluating the movement of the fecal stream in vivo, was employed to evaluate colonic transit in the cat. Scintigraphy was performed in duplicate in five cats and repeated four times in one cat. After instillation of an 111In marker into the cecum through a surgically implanted silicone cecostomy tube, colonic movement of the instillate was quantitated for 24 h using gamma scintigraphy. Antegrade and retrograde motion of radionuclide was observed. The cecum and ascending colon emptied rapidly, with a half-emptying time of 1.68 ± 0.56 h (mean ± SE). After 24 h, 25.1 ± 5.2% of the activity remained in the transverse colon. The progression of the geometric center was initially rapid, followed later by a delayed phase. Geometric center reproducibility was found to be high when analyzed using simple linear regression (slope = 0.92; r = 0.73; P < 0.01). Atropine (0.1 mg/kg im) was found to delay cecum and ascending colon emptying and delay progression of the geometric center. These results demonstrate both 1) the ability of colonic transit scintigraphy to detect changes in transit induced by pharmacological manipulation and 2) the fact that muscarinic blockade inhibits antegrade transit of the fecal stream. We conclude that feline colonic transit may be studied in a quantitative and reproducible manner with colonic transit scintigraphy.
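    For illustration, the geometric center used above is conventionally the counts-weighted mean of numbered colonic regions; the sketch below uses invented count fractions and an assumed region-numbering scheme, and is not the authors' analysis code.

        # Hypothetical fraction of 111In counts in each region at one time point.
        # Region numbering (assumed here): 1 = cecum/ascending colon, 2 = transverse,
        # 3 = descending, 4 = rectosigmoid, 5 = excreted stool.
        count_fractions = {1: 0.10, 2: 0.35, 3: 0.30, 4: 0.20, 5: 0.05}

        # Geometric center = sum over regions of (region number x fractional counts);
        # it increases from 1 toward 5 as activity moves distally.
        geometric_center = sum(region * frac for region, frac in count_fractions.items())
        print(f"Geometric center: {geometric_center:.2f}")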

  8. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts.
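    As one concrete example of the sample normalization discussed in this review, the sketch below applies total-sum normalization to an invented intensity table (rows = samples, columns = metabolite features); other approaches, such as probabilistic quotient normalization, follow the same pattern of rescaling each sample by a per-sample factor.

        import numpy as np

        # Invented metabolite intensity matrix: rows = samples, columns = features.
        intensities = np.array([
            [120.0,  80.0, 300.0,  50.0],
            [240.0, 150.0, 640.0, 110.0],   # roughly twice as much total material
            [ 90.0,  70.0, 250.0,  40.0],
        ])

        # Total-sum normalization: divide each sample by its total signal so that
        # differences in overall sample amount do not masquerade as changes in
        # individual metabolite concentrations.
        row_totals = intensities.sum(axis=1, keepdims=True)
        normalized = intensities / row_totals
        print(normalized.round(3))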

  9. Lubricating additive for drilling muds

    SciTech Connect

    Gutierrez, A.; Brois, S. J.; Brownawell, D. W.; Walker, T. O.

    1985-01-01

    Aqueous drilling fluids containing a minor amount of an additive composition featuring oxazolines of C1-C30 alkylthioglycolic acid. Such fluids are especially useful where reduced torque drilling fluids are needed. Another embodiment of this invention relates to a method of drilling utilizing the above-described fluids.

  10. Tetrasulfide extreme pressure lubricant additives

    SciTech Connect

    Gast, L.E.; Kenney, H.E.; Schwab, A.W.

    1980-08-19

    A novel class of compounds has been prepared comprising the tetrasulfides of C18 hydrocarbons, C18 fatty acids, and C18 fatty and alkyl and triglyceride esters. These tetrasulfides are useful as extreme pressure lubricant additives and show potential as replacements for sulfurized sperm whale oil.

  11. Promoting Additive Acculturation in Schools.

    ERIC Educational Resources Information Center

    Gibson, Margaret A.

    1995-01-01

    A study focusing on 113 ninth graders of Mexican descent indicates that most students and their parents adhere to a strategy of additive acculturation (incorporating skills of the new culture and language), but that the school curriculum and general school climate devalue Mexican culture. (SLD)

  12. Individualized Additional Instruction for Calculus

    ERIC Educational Resources Information Center

    Takata, Ken

    2010-01-01

    College students enrolling in the calculus sequence have a wide variance in their preparation and abilities, yet they are usually taught from the same lecture. We describe another pedagogical model of Individualized Additional Instruction (IAI) that assesses each student frequently and prescribes further instruction and homework based on the…

  13. Out of bounds additive manufacturing

    DOE PAGES

    Holshouser, Chris; Newell, Clint; Palas, Sid; Love, Lonnie J.; Kunc, Vlastimil; Lind, Randall F.; Lloyd, Peter D.; Rowe, John C.; Blue, Craig A.; Duty, Chad E.; et al

    2013-03-01

    Lockheed Martin and Oak Ridge National Laboratory are working on an additive manufacturing system capable of manufacturing components measured not in terms of inches or feet, but multiple yards in all dimensions with the potential to manufacture parts that are completely unbounded in size.

  14. Tinkertoy Color-Addition Device.

    ERIC Educational Resources Information Center

    Ferguson, Joe L.

    1995-01-01

    Describes construction and use of a simple home-built device, using an overhead projector, for use in demonstrations of the addition of various combinations of red, green, and blue light. Useful in connection with discussions of color, color vision, or color television. (JRH)

  15. Additional Financial Resources for Education.

    ERIC Educational Resources Information Center

    Hubbard, Ben C.

    This paper discusses the continuing need for additional educational funds and suggests that the only way to gain these funds is through concerted and persistent political efforts by supporters of education at both the federal and state levels. The author first points out that for many reasons declining enrollment may not decrease operating costs…

  16. Does finger sense predict addition performance?

    PubMed

    Newman, Sharlene D

    2016-05-01

    The impact of fingers on numerical and mathematical cognition has received a great deal of attention recently. However, the precise role that fingers play in numerical cognition is unknown. The current study explores the relationship between finger sense, arithmetic and general cognitive ability. Seventy-six children between the ages of 5 and 12 participated in the study. The results of stepwise multiple regression analyses demonstrated that while general cognitive ability including language processing was a predictor of addition performance, finger sense was not. The impact of age on the relationship between finger sense and addition was further examined. The participants were separated into two groups based on age. The results showed that finger gnosia score impacted addition performance in the older group but not in the younger group. These results appear to support the hypothesis that fingers provide a scaffold for calculation and that if that scaffold is not properly built, it has continued differential consequences to mathematical cognition. PMID:26993292

  17. Evaluation of certain food additives.

    PubMed

    2012-01-01

    This report represents the conclusions of a Joint FAO/WHO Expert Committee convened to evaluate the safety of various food additives, including flavouring agents, with a view to concluding as to safety concerns and to preparing specifications for identity and purity. The first part of the report contains a general discussion of the principles governing the toxicological evaluation of and assessment of dietary exposure to food additives, including flavouring agents. A summary follows of the Committee's evaluations of technical, toxicological and dietary exposure data for five food additives (magnesium dihydrogen diphosphate; mineral oil (medium and low viscosity) classes II and III; 3-phytase from Aspergillus niger expressed in Aspergillus niger; serine protease (chymotrypsin) from Nocardiopsis prasina expressed in Bacillus licheniformis; and serine protease (trypsin) from Fusarium oxysporum expressed in Fusarium venenatum) and 16 groups of flavouring agents (aliphatic and aromatic amines and amides; aliphatic and aromatic ethers; aliphatic hydrocarbons, alcohols, aldehydes, ketones, carboxylic acids and related esters, sulfides, disulfides and ethers containing furan substitution; aliphatic linear alpha,beta-unsaturated aldehydes, acids and related alcohols, acetals and esters; amino acids and related substances; epoxides; furfuryl alcohol and related substances; linear and branched-chain aliphatic, unsaturated, unconjugated alcohols, aldehydes, acids and related esters; miscellaneous nitrogen-containing substances; phenol and phenol derivatives; pyrazine derivatives; pyridine, pyrrole and quinoline derivatives; saturated aliphatic acyclic branched-chain primary alcohols, aldehydes and acids; simple aliphatic and aromatic sulfides and thiols; sulfur-containing heterocyclic compounds; and sulfur-substituted furan derivatives). Specifications for the following food additives were revised: ethyl cellulose, mineral oil (medium viscosity), modified starches and titanium

  19. Quantitation of flaviviruses by fluorescent focus assay.

    PubMed

    Payne, Anne F; Binduga-Gajewska, Iwona; Kauffman, Elizabeth B; Kramer, Laura D

    2006-06-01

    An indirect immunofluorescence assay for quantitation of flaviviruses was developed as an alternative to the standard plaque assay. The assay was validated with West Nile virus (WNV), St. Louis encephalitis virus (SLEV), and Dengue virus (DENV) types 1-4. Vero cells were plated in 8-well chamber slides, and infected with 10-fold serial dilutions of virus. About 1-3 days after infection, cells were fixed, incubated with specific monoclonal antibody, and stained with a secondary antibody labeled with a fluorescent tag. Fluorescent foci of infection were observed and counted using a fluorescence microscope, and viral titers were calculated as fluorescent focus units (FFU) per ml. The optimal time for performing the fluorescent focus assay (FFA) on Vero cells was 24 h for WNV, and 48 h for SLEV and the four DENV serotypes. In contrast, the time required to complete a standard Vero cell plaque assay for these viruses ranges from 3 days for WNV to 11 days for DENV-1. Thus, the FFA method of virus titration is useful for viruses whose plaques develop slowly. In addition, these viruses can be quantitated by FFA on a mosquito cell line (C6/36), which does not support plaque formation. The FFA for flaviviruses was validated for accuracy, precision, specificity, and robustness of the assay.
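    The titre calculation implied above (fluorescent focus units per millilitre of stock) can be sketched as follows; the focus counts, dilution and inoculum volume are invented.

        # Hypothetical focus counts from replicate wells at one countable dilution.
        focus_counts = [42, 38, 45]
        dilution = 1e-4            # dilution factor of the stock in those wells
        inoculum_volume_ml = 0.1   # volume of diluted virus added per well (ml)

        mean_foci = sum(focus_counts) / len(focus_counts)
        # Titer of the original stock in fluorescent focus units per ml.
        titer_ffu_per_ml = mean_foci / (dilution * inoculum_volume_ml)
        print(f"Titer: {titer_ffu_per_ml:.2e} FFU/ml")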

  20. Quantitative Stratification of Diffuse Parenchymal Lung Diseases

    PubMed Central

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Maldonado, Fabien; Peikert, Tobias; Moua, Teng; Ryu, Jay H.; Bartholmai, Brian J.; Robb, Richard A.

    2014-01-01

    Diffuse parenchymal lung diseases (DPLDs) are characterized by widespread pathological changes within the pulmonary tissue that impair the elasticity and gas exchange properties of the lungs. Clinical-radiological diagnosis of these diseases remains challenging and their clinical course is characterized by variable disease progression. These challenges have hindered the introduction of robust objective biomarkers for patient-specific prediction based on specific phenotypes in clinical practice for patients with DPLD. Therefore, strategies facilitating individualized clinical management, staging and identification of specific phenotypes linked to clinical disease outcomes or therapeutic responses are urgently needed. A classification schema consistently reflecting the radiological, clinical (lung function and clinical outcomes) and pathological features of a disease represents a critical need in modern pulmonary medicine. Herein, we report a quantitative stratification paradigm to identify subsets of DPLD patients with characteristic radiologic patterns in an unsupervised manner and demonstrate significant correlation of these self-organized disease groups with clinically accepted surrogate endpoints. The proposed consistent and reproducible technique could potentially transform diagnostic staging, clinical management and prognostication of DPLD patients as well as facilitate patient selection for clinical trials beyond the ability of current radiological tools. In addition, the sequential quantitative stratification of the type and extent of parenchymal process may allow standardized and objective monitoring of disease, early assessment of treatment response and mortality prediction for DPLD patients. PMID:24676019

  1. Quantitative Species Measurements In Microgravity Combustion Flames

    NASA Technical Reports Server (NTRS)

    Chen, Shin-Juh; Pilgrim, Jeffrey S.; Silver, Joel A.; Piltch, Nancy D.

    2003-01-01

    The capability of models and theories to accurately predict and describe the behavior of low gravity flames can only be verified by quantitative measurements. Although video imaging, simple temperature measurements, and velocimetry methods have provided useful information in many cases, there is still a need for quantitative species measurements. Over the past decade, we have been developing high sensitivity optical absorption techniques to permit in situ, non-intrusive, absolute concentration measurements for both major and minor flame species using diode lasers. This work has helped to establish wavelength modulation spectroscopy (WMS) as an important method for species detection within the restrictions of microgravity-based measurements. More recently, in collaboration with Prof. Dahm at the University of Michigan, a new methodology combining computed flame libraries with a single experimental measurement has allowed us to determine the concentration profiles for all species in a flame. This method, termed ITAC (Iterative Temperature with Assumed Chemistry), was demonstrated for a simple laminar nonpremixed methane-air flame at both 1-g and 0-g in a vortex ring flame. In this paper, we report additional normal and microgravity experiments which further confirm the usefulness of this approach. We also present the development of a new type of laser. This is an external cavity diode laser (ECDL) which has the unique capability of high frequency modulation as well as a very wide tuning range. This will permit the detection of multiple species with one laser while using WMS detection.

  2. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-01

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application. PMID:26321463

  4. Natural bacterial communities serve as quantitative geochemical biosensors

    SciTech Connect

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; Olesen, Scott W.; Paradis, Charles; Wu, Liyou; Campbell, James H.; Fortney, Julian L.; Mehlhorn, Tonia L.; Lowe, Kenneth A.; Earles, Jennifer E.; Phillips, Jana; Techtmann, Steve M.; Joyner, Dominique C.; Elias, Dwayne A.; Bailey, Kathryn L.; Hurt, Richard A.; Preheim, Sarah P.; Sanders, Matthew C.; Yang, Joy; Mueller, Marcella A.; Brooks, Scott; Watson, David B.; Zhang, Ping; He, Zhili; Dubinsky, Eric A.; Adams, Paul D.; Arkin, Adam P.; Fields, Matthew W.; Zhou, Jizhong; Alm, Eric J.; Hazen, Terry C.

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.
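    The statistical idea above, predicting a geochemical variable from 16S-derived taxon abundances, can be sketched with a generic cross-validated regression; the random-forest model, synthetic abundance table and target values below are illustrative assumptions and not the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)

        # Invented data: 93 wells x 200 taxa (relative abundances) plus one
        # geochemical target per well (e.g., nitrate), driven by two taxa.
        n_wells, n_taxa = 93, 200
        abundances = rng.dirichlet(np.ones(n_taxa), size=n_wells)
        nitrate = 5.0 * abundances[:, 0] + 2.0 * abundances[:, 1] + rng.normal(0.0, 0.005, n_wells)

        # Cross-validated regression of geochemistry on community composition.
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        r2_scores = cross_val_score(model, abundances, nitrate, cv=5, scoring="r2")
        print(f"Mean cross-validated R^2: {r2_scores.mean():.2f}")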

  6. Active mineral additives of sapropel ashes

    NASA Astrophysics Data System (ADS)

    Khomich, V. A.; Danilina, E. V.; Krivonos, O. I.; Plaksin, G. V.

    2015-01-01

    The goal of the presented research is to establish a scientific rationale for the use of sapropel ashes as an active mineral additive. The research included the study of producing active mineral additives from sapropels by thermal treatment at 850–900 °C and subsequent powdering, the investigation of the properties of a paste matrix with an ash additive, and the study of the influence of the ash on the cement bonding agent. Thermogravimetric analysis and X-ray investigations allowed us to establish that during burning, organic substances are removed, and clay minerals are dehydrated and their structure is broken. The chemical composition of the sapropel ashes was determined. The amorphous ash constituent is mainly formed from silica of the mineral part of the sapropel and from alumosilica gels resulting from the decomposition of clay minerals. Properties of PC 400 and PC 500A0 sapropel ash additives were studied. Adding ashes containing Glenium plasticizer to the cement increases paste matrix strength and considerably reduces its water absorption. X-ray phase analysis data show changes in the phase composition of the paste matrix with an ash additive. Ash additives produce a pozzolanic effect on the cement bonding agent. In addition, owing to its alumosilica gel content, an ash additive causes the transformation of unstable calcium aluminate forms into stable ones.

  7. Evidence for dose-additive effects of a type II pyrethroid mixture. In vitro assessment.

    PubMed

    Romero, A; Ares, I; Ramos, E; Castellano, V; Martínez, M; Martínez-Larrañaga, M R; Anadón, A; Martínez, M A

    2015-04-01

    Despite the widespread use of pyrethroid insecticides, which has led to common exposure in the population, few studies have been conducted to quantitatively assess dose-additive effects of pyrethroids using a functional measure involved in their common toxic mode of action. The aim of this study was to evaluate the potency and efficacy of 6 Type II pyrethroids (α-cypermethrin, cyfluthrin, λ-cyhalothrin, deltamethrin, cyphenothrin and esfenvalerate) in inducing both nitric oxide and lipid peroxide levels, measured as malondialdehyde, in three in vitro models (SH-SY5Y, HepG2 and Caco-2 human cells), as well as to test the hypothesis of dose additivity for mixtures of these same 6 pyrethroids. Concentration-response relationships for the 6 pyrethroids were determined, as well as the response to mixtures of all 6 pyrethroids. Additivity was tested assuming a dose-additive model. The human neuroblastoma SH-SY5Y cell line was the most sensitive in vitro model. The rank order of potency in the SH-SY5Y cell viability (MTT) assay was deltamethrin > cyphenothrin > λ-cyhalothrin > cyfluthrin > esfenvalerate > α-cypermethrin. When the 6 pyrethroids were present in the mixture at an equitoxic mixing ratio, the action on nitric oxide (NO) and lipid peroxide production, measured as malondialdehyde (MDA), was consistent with a dose-additive model. The results of the present study are consistent with previous reports of additivity of pyrethroids in vivo and in vitro.
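    A minimal sketch of the dose-additive (concentration addition) null model used in studies of this kind: given assumed single-compound EC50s and Hill slopes, the mixture concentration producing a chosen effect level satisfies sum_i(c_i / ECx_i) = 1. All parameter values below are invented.

        import numpy as np

        # Assumed single-compound parameters for six pyrethroids: EC50 (uM) and
        # Hill slope, with the mixture prepared at an equitoxic mixing ratio.
        ec50 = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
        hill = np.array([1.5, 1.2, 1.8, 1.0, 1.3, 1.6])
        mix_fraction = (1.0 / ec50) / (1.0 / ec50).sum()

        def single_ec(effect):
            # Concentration of each compound alone giving fractional effect
            # `effect` under a Hill model E = c^h / (c^h + EC50^h).
            return ec50 * (effect / (1.0 - effect)) ** (1.0 / hill)

        def mixture_ec(effect):
            # Concentration addition: solve sum_i (p_i * C) / ECx_i = 1 for the
            # total mixture concentration C.
            return 1.0 / np.sum(mix_fraction / single_ec(effect))

        print(f"Dose-addition prediction of the mixture EC50: {mixture_ec(0.5):.2f} uM")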

  8. Quantitative detection of protein arrays.

    PubMed

    Levit-Binnun, Nava; Lindner, Ariel B; Zik, Ory; Eshhar, Zelig; Moses, Elisha

    2003-03-15

    We introduce a quantitative method that utilizes scanning electron microscopy for the analysis of protein chips (SEMPC). SEMPC is based upon counting target-coated gold particles interacting specifically with ligands or proteins arrayed on a derivative microscope glass slide by utilizing backscattering electron detection. As model systems, we quantified the interactions of biotin and streptavidin and of an antibody with its cognate hapten. Our method gives quantitative molecule-counting capabilities with an excellent signal-to-noise ratio and demonstrates a broad dynamic range while retaining easy sample preparation and realistic automation capability. Increased sensitivity and dynamic range are achieved in comparison to currently used array detection methods such as fluorescence, with no signal bleaching, affording high reproducibility and compatibility with miniaturization. Thus, our approach facilitates the determination of the absolute number of molecules bound to the chip rather than their relative amounts, as well as the use of smaller samples.

  9. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination: a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitations of conventional US, CT, and MR imaging for the diagnosis of NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed, with an emphasis on multi-parametric quantitative MRI. PMID:26848588

  10. Studies on the quantitation of immunoglobulin in human intestinal secretions

    PubMed Central

    Samson, R. R.; McClelland, D. B. L.; Shearman, D. J. C.

    1973-01-01

    There is increasing evidence for the importance of the secretory immune system in the gut. In studies of local antibody production it is important to have satisfactory methods for measuring immunoglobulin concentrations and to be aware of the errors which may occur. Studies on immunoglobulin measurement in intestinal secretion by the radial immunodiffusion method are reported, showing the effects of proteolytic digestion, IgA molecular size, and sampling and storage conditions. Because of the presence of monomeric IgA in addition to secretory IgA, there is no satisfactory standard for IgA in gastrointestinal secretions, and only semi-quantitative results can be given. With radial immunodiffusion, IgG and IgM when subjected to tryptic digestion, and IgA when subjected to peptic digestion, may be overestimated because of the presence of fragments of immunoglobulins. In addition, pepsin rapidly destroys IgM and IgG. Both IgM and IgG are unstable in storage. The findings suggest that immunoglobulin concentration measurements in small intestinal aspirates should be interpreted with caution. These problems are also relevant to the detection of specific antibodies in gastrointestinal secretions. PMID:4582728

  11. The use of selection experiments for detecting quantitative trait loci.

    PubMed

    Ollivier, L; Messer, L A; Rothschild, M F; Legault, C

    1997-06-01

    Gene frequency changes following selection may reveal the existence of gene effects on the trait selected. Loci for the selected quantitative trait (SQTL) may thus be detected. Additionally, one can estimate the average effect (alpha) of a marker allele associated with an SQTL from the allele frequency change (delta q) due to selection of given intensity (i). In a sample of unrelated individuals, it is optimal to select the upper and lower 27% for generating delta q in order to estimate alpha. For a given number of individuals genotyped, this estimator is 0.25i² times more efficient than the classical estimator of alpha, based on the regression of the trait on the genotype at the marker locus. The method is extended to selection criteria using information from relatives, showing that combined selection considerably increases the efficiency of estimation for traits of low heritability. The method has been applied to the detection of SQTL in a selection experiment in which the trait selected was pig litter size averaged over the first four parities, with i = 3. Results for four genes are provided, one of which yielded a highly significant effect. The conditions required for valid application of the method are discussed, including selection experiments over several generations. Additional advantages of the method can be anticipated from determining gene frequencies on pooled samples of blood or DNA.
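
    The estimation step can be illustrated with the textbook single-generation result for truncation selection on a normally distributed trait, delta q ≈ i·p·q·alpha/sigma_P, which the sketch below simply inverts. This is an illustration under that stated assumption rather than the paper's exact estimator, and the example numbers are invented.

      # Illustrative sketch (not the authors' estimator): back out the average effect
      # alpha of a marker allele from its observed frequency change after one round of
      # truncation selection, assuming delta_q ≈ i * p * q * alpha / sigma_p.
      def estimate_alpha(delta_q, q, i, sigma_p):
          """delta_q : observed change in the marker allele frequency after selection
          q         : marker allele frequency before selection (p = 1 - q)
          i         : selection intensity (standardized selection differential)
          sigma_p   : phenotypic standard deviation of the selected trait"""
          p = 1.0 - q
          return delta_q * sigma_p / (i * p * q)

      # Example with invented numbers: i = 3 as in the pig litter-size experiment
      print(estimate_alpha(delta_q=0.05, q=0.5, i=3.0, sigma_p=3.0))  # -> 0.2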

  12. Quantitative wave-particle duality

    NASA Astrophysics Data System (ADS)

    Qureshi, Tabish

    2016-07-01

    The complementary wave and particle character of quantum objects (or quantons) was pointed out by Niels Bohr. This wave-particle duality, in the context of the two-slit experiment, is here described not just as two extreme cases of wave and particle characteristics, but in terms of quantitative measures of these characteristics, known to follow a duality relation. A very simple and intuitive derivation of a closely related duality relation is presented, which should be understandable to the introductory student.
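
    The duality relation is not written out in the abstract; the standard quantitative statement, presumably the one referenced, is the Englert-Greenberger-Yasin inequality between which-path distinguishability D (particle-like behaviour) and fringe visibility V (wave-like behaviour):

      % Standard Englert-Greenberger-Yasin duality relation (assumed form)
      \[
        \mathcal{D}^{2} + \mathcal{V}^{2} \;\le\; 1 ,
      \]
      % with equality when the quanton-plus-path-detector state is pure.

    The extreme cases D = 1, V = 0 (complete which-path knowledge, no interference) and D = 0, V = 1 (full interference, no which-path knowledge) recover the pure particle and pure wave pictures mentioned above.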

  13. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate identification of peptides, proteins and their modifications, but also allow for sensitive relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). The biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies are beginning to shed light on normal and diseased brain function.

  14. Quantitative gallbladder imaging following cholecystokinin

    SciTech Connect

    Topper, T.E.; Ryerson, T.W.; Nora, P.F.

    1980-07-01

    Quantitative gallbladder imaging with Tc-99m paraisopropylimidodiacetic acid (PIPIDA) was performed and time-activity curves over the gallbladder were obtained following i.v. injection of cholecystokinin (CCK). The gallbladders that failed to contract after CCK were found to be abnormal at surgery. This test appears to be helpful in evaluating patients who have normal oral cholecystograms but have persistent symptoms of gallbladder disease.
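
    Time-activity curves of this kind are commonly reduced to a single contraction index: the fractional emptying of the baseline gallbladder counts after CCK. The sketch below is a generic illustration of that calculation, not the authors' analysis, and any threshold for calling a gallbladder "non-contracting" would be study-specific.

      # Generic sketch: maximal fractional emptying (percent) of a background-corrected
      # gallbladder time-activity curve, with the pre-CCK value taken as baseline.
      def ejection_fraction(counts, baseline_index=0):
          baseline = counts[baseline_index]
          return 100.0 * (baseline - min(counts)) / baseline

      # Example: ROI counts fall from 12,000 to 7,800 after CCK -> 35% emptying
      print(ejection_fraction([12000, 11000, 9500, 8300, 7800, 7900]))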

  15. Quantitative bioluminescence imaging of mouse tumor models.

    PubMed

    Tseng, Jen-Chieh; Kung, Andrew L

    2015-01-05

    Bioluminescence imaging (BLI) has become an essential technique for preclinical evaluation of anticancer therapeutics and provides sensitive and quantitative measurements of tumor burden in experimental cancer models. For light generation, a vector encoding firefly luciferase is introduced into human cancer cells that are grown as tumor xenografts in immunocompromised hosts, and the enzyme substrate luciferin is injected into the host. Alternatively, the reporter gene can be expressed in genetically engineered mouse models to determine the onset and progression of disease. In addition to expression of an ectopic luciferase enzyme, bioluminescence requires oxygen and ATP, thus only viable luciferase-expressing cells or tissues are capable of producing bioluminescence signals. Here, we summarize a BLI protocol that takes advantage of advances in hardware, especially the cooled charge-coupled device camera, to enable detection of bioluminescence in living animals with high sensitivity and a large dynamic range.
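
    In practice, tumor burden in BLI studies is reported as the background-subtracted total flux (photons per second) within a region of interest drawn on the calibrated image. The sketch below illustrates only that bookkeeping step; it is not tied to any particular instrument's software, and the function and variable names are hypothetical.

      # Illustrative sketch: background-subtracted total flux in a tumor ROI, given an
      # image already calibrated by the instrument to photons/s per pixel.
      import numpy as np

      def roi_total_flux(flux_image, roi_mask, background_mask):
          """flux_image : 2-D array in photons/s per pixel; masks are boolean arrays."""
          bg_per_pixel = flux_image[background_mask].mean()
          net = np.clip(flux_image[roi_mask] - bg_per_pixel, 0, None)
          return float(net.sum())

      # Toy example: a bright 10 x 10 "tumor" region on a dim background
      img = np.full((100, 100), 1.0e3)
      img[45:55, 45:55] = 5.0e5
      roi = np.zeros_like(img, dtype=bool); roi[40:60, 40:60] = True
      bkg = np.zeros_like(img, dtype=bool); bkg[:10, :10] = True
      print(f"{roi_total_flux(img, roi, bkg):.3e} photons/s")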

  16. Imaging Performance of Quantitative Transmission Ultrasound

    PubMed Central

    Lenox, Mark W.; Wiskin, James; Lewis, Matthew A.; Darrouzet, Stephen; Borup, David; Hsieh, Scott

    2015-01-01

    Quantitative Transmission Ultrasound (QTUS) is a tomographic transmission ultrasound modality that is capable of generating 3D speed-of-sound maps of objects in the field of view. It performs this measurement by propagating a plane wave through the medium from a transmitter on one side of a water tank to a high resolution receiver on the opposite side. This information is then used via inverse scattering to compute a speed map. In addition, the presence of reflection transducers allows the creation of a high resolution, spatially compounded reflection map that is natively coregistered to the speed map. A prototype QTUS system was evaluated for measurement and geometric accuracy as well as for the ability to correctly determine speed of sound. PMID:26604918

  17. Following Optogenetic Dimerizers and Quantitative Prospects.

    PubMed

    Niu, Jacqueline; Ben Johny, Manu; Dick, Ivy E; Inoue, Takanari

    2016-09-20

    Optogenetics describes the use of genetically encoded photosensitive proteins to direct intended biological processes with light in recombinant and native systems. While most of these light-responsive proteins were originally discovered in photosynthetic organisms, the past few decades have been punctuated by experiments that not only commandeer but also engineer and enhance these natural tools to explore a wide variety of physiological questions. In addition, tuning the dynamic range and kinetic rates of optogenetic actuators remains a challenging problem, one that is heavily explored with computational methods devised to facilitate optimization of these systems. Here, we explain the basic mechanisms of a few popular photodimerizing optogenetic systems, discuss applications, compare optogenetic tools against more traditional chemical methods, and propose a simple quantitative understanding of how actuators exert their influence on targeted processes. PMID:27542508

  18. Quantitative measures for redox signaling.

    PubMed

    Pillay, Ché S; Eagling, Beatrice D; Driscoll, Scott R E; Rohwer, Johann M

    2016-07-01

    Redox signaling is now recognized as an important regulatory mechanism for a number of cellular processes including the antioxidant response, phosphokinase signal transduction and redox metabolism. While there has been considerable progress in identifying the cellular machinery involved in redox signaling, quantitative measures of redox signals have been lacking, limiting efforts aimed at understanding and comparing redox signaling under normoxic and pathogenic conditions. Here we have outlined some of the accepted principles for redox signaling, including the description of hydrogen peroxide as a signaling molecule and the role of kinetics in conferring specificity to these signaling events. Based on these principles, we then develop a working definition for redox signaling and review a number of quantitative methods that have been employed to describe signaling in other systems. Using computational modeling and published data, we show how time- and concentration-dependent analyses, in particular, could be used to quantitatively describe redox signaling and therefore provide important insights into the functional organization of redox networks. Finally, we consider some of the key challenges with implementing these methods. PMID:27151506
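
    One way to see how kinetics confer specificity, as argued above, is a toy competition model in which a bolus of H2O2 is consumed by a fast sensor thiol and a slower bulk thiol sink; whichever pathway dominates the flux "receives" the signal. The sketch below is purely illustrative, with invented rate constants and concentrations, and does not reproduce the models or data used in the article.

      # Toy kinetic sketch (hypothetical parameters): competing second-order oxidation
      # of a fast sensor thiol and a slow bulk sink by an H2O2 bolus, integrated in time.
      from scipy.integrate import solve_ivp

      K_SENSOR = 1.0e7   # M^-1 s^-1, peroxiredoxin-like sensor (assumed)
      K_SINK = 1.0e2     # M^-1 s^-1, non-specific thiol pool (assumed)

      def rhs(t, y):
          h2o2, sensor_red, sink_red = y
          v_sensor = K_SENSOR * h2o2 * sensor_red
          v_sink = K_SINK * h2o2 * sink_red
          return [-v_sensor - v_sink, -v_sensor, -v_sink]

      y0 = [1e-6, 20e-6, 100e-6]   # 1 uM H2O2 bolus, 20 uM sensor, 100 uM sink (assumed)
      sol = solve_ivp(rhs, (0.0, 1.0), y0, rtol=1e-8, atol=1e-12)
      sensor_ox = y0[1] - sol.y[1, -1]
      sink_ox = y0[2] - sol.y[2, -1]
      print(f"fraction of the H2O2 signal captured by the sensor: "
            f"{sensor_ox / (sensor_ox + sink_ox):.3f}")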

  19. QUANTITATIVE INVESTIGATIONS OF IDIOTYPIC ANTIBODIES

    PubMed Central

    Spring, Susan B.; Schroeder, Kenneth W.; Nisonoff, Alfred

    1971-01-01

    The effect of challenge by antigen on persistence of clones of antibody-producing cells and on the induction of new clones was investigated through quantitative measurements of idiotypic specificities. In each of nine rabbits idiotypic specificities present in the earliest bleedings were completely replaced after a few months; subsequent changes occurred much more slowly. On a quantitative basis the population of molecules used as immunogen always reacted most effectively with the homologous anti-idiotypic antiserum. Little effect of increased antigen dose on the rate of change of idiotype was observed. Even large amounts of antigen administered every 2 wk caused only gradual changes in idiotypic specificities. This was attributed either to more effective capture of antigen by memory cells, as compared to precursor cells, or to the induction of tolerance in those clones that were not expressed. In two of three rabbits on a monthly injection schedule, the idiotypic specificities identified underwent very slow changes over a period as long as 17 months. Changes occurred more rapidly when antigen was administered every 2 wk. In each of four rabbits investigated, all idiotypic specificities identified before a 5 month rest period were still present afterwards, indicating the survival of essentially all clones of antibody-producing cells during that interval. Quantitative inhibition data indicated that some new clones of cells were initiated. PMID:15776574

  20. Decontamination formulation with sorbent additive

    DOEpatents

    Tucker, Mark D.; Comstock, Robert H.

    2007-10-16

    A decontamination formulation and method of making that neutralizes the adverse health effects of both chemical and biological compounds, especially chemical warfare (CW) and biological warfare (BW) agents, and toxic industrial chemicals. The formulation provides solubilizing compounds that serve to effectively render the chemical and biological compounds, particularly CW and BW compounds, susceptible to attack, and at least one reactive compound that serves to attack (and detoxify or kill) the compound. The formulation includes at least one solubilizing agent, a reactive compound, a bleaching activator, a sorbent additive, and water. The highly adsorbent, water-soluble sorbent additive (e.g., sorbitol or mannitol) is used to "dry out" one or more liquid ingredients, such as the liquid bleaching activator (e.g., propylene glycol diacetate or glycerol diacetate) and convert the activator into a dry, free-flowing powder that has an extended shelf life, and is more convenient to handle and mix in the field.