Science.gov

Sample records for accurate quantitative results

  1. Optimization of sample preparation for accurate results in quantitative NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamazaki, Taichi; Nakamura, Satoe; Saito, Takeshi

    2017-04-01

Quantitative nuclear magnetic resonance (qNMR) spectroscopy is highly regarded as a measurement tool because it does not require a reference standard identical to the analyte. Measurement parameters have been discussed in detail, and high-resolution balances have been used for sample preparation. However, high-resolution balances such as the ultra-microbalance are not general-purpose analytical tools, and many analysts may find them difficult to use, which hinders accurate sample preparation for qNMR measurement. In this study, we examined the relationship between the resolution of the balance and the amount of sample weighed during sample preparation. We confirmed the accuracy of the assay results for samples weighed on a high-resolution balance such as the ultra-microbalance. Furthermore, when an appropriate tare and amount of sample were weighed, accurate assay results could also be obtained on a general-purpose balance of lower resolution. Although this is a fundamental result, it offers important evidence that enhances the versatility of the qNMR method.
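A minimal sketch of the trade-off the study examines (the function name and the numbers are illustrative, not taken from the paper): a balance's readability contributes a relative weighing error of roughly readability divided by mass, so a target error fixes a minimum sample mass for a given balance.

```python
# Illustrative, assumed-linear error model (not the paper's analysis):
# relative weighing error ~ balance readability / sample mass.
def min_sample_mass_mg(readability_mg: float, target_rel_error: float) -> float:
    """Smallest sample mass (mg) keeping readability/mass below the target."""
    return readability_mg / target_rel_error

# A semi-micro balance (0.01 mg readability) with a 0.1 % target weighing error:
print(min_sample_mass_mg(0.01, 0.001))  # about 10 mg
```

Under this toy model, a coarser balance simply demands a proportionally larger tare-and-sample mass to hold the same relative error, which is consistent with the abstract's conclusion that lower-resolution balances can still yield accurate assays.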

  2. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contra-lateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10⁶·⁵ to 10⁶·⁷ LD₅₀ g⁻¹) and five gave 'high titres' (10⁸·¹ to ≥ 10⁸·⁷ LD₅₀ g⁻¹) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower PrP(d)/PrP(res) results by IHC/BCTs. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).

  3. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  4. Groundtruth approach to accurate quantitation of fluorescence microarrays

    SciTech Connect

    Mascio-Kegelmeyer, L; Tomascik-Cheeseman, L; Burnett, M S; van Hummelen, P; Wyrobek, A J

    2000-12-01

    To more accurately measure fluorescent signals from microarrays, we calibrated our acquisition and analysis systems by using groundtruth samples comprised of known quantities of red and green gene-specific DNA probes hybridized to cDNA targets. We imaged the slides with a full-field, white light CCD imager and analyzed them with our custom analysis software. Here we compare, for multiple genes, results obtained with and without preprocessing (alignment, color crosstalk compensation, dark field subtraction, and integration time). We also evaluate the accuracy of various image processing and analysis techniques (background subtraction, segmentation, quantitation and normalization). This methodology calibrates and validates our system for accurate quantitative measurement of microarrays. Specifically, we show that preprocessing the images produces results significantly closer to the known ground-truth for these samples.

  5. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

Current methodologies for protein quantitation include two-dimensional gel electrophoresis techniques, metabolic labeling, and stable-isotope labeling methods, to name only a few. The current literature illustrates both pros and cons of each of these methodologies. In keeping with the teachings of William of Ockham, "with all things being equal, the simplest solution tends to be correct", a simple LC/MS-based methodology is presented that allows relative changes in the abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separation system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, a quantitative comparison of tens of thousands of ions emanating from identically prepared control and experimental samples can be made. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI mass spectrometry, one typically obtains better than 5 ppm mass precision and quantitative variations between 10 and 15%. This paper principally demonstrates the quantitative aspects of the methodology and continues with a discussion of the associated, complementary qualitative capabilities.
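The core matching step described above can be sketched as follows (a toy illustration under assumed tolerances, not the authors' implementation): features from two runs are paired when their masses agree within a ppm tolerance and their retention times within a time window.

```python
def ppm_diff(m1: float, m2: float) -> float:
    """Mass difference in parts per million, relative to m2."""
    return abs(m1 - m2) / m2 * 1e6

def match_features(control, experiment, ppm_tol=5.0, rt_tol=0.25):
    """Pair (m/z, RT-minutes) features between two runs when masses agree
    within ppm_tol and retention times within rt_tol; returns index pairs."""
    pairs = []
    for i, (mz_c, rt_c) in enumerate(control):
        for j, (mz_e, rt_e) in enumerate(experiment):
            if ppm_diff(mz_c, mz_e) <= ppm_tol and abs(rt_c - rt_e) <= rt_tol:
                pairs.append((i, j))
    return pairs

# Hypothetical feature lists (m/z, retention time in minutes):
control = [(785.8421, 22.10), (512.2675, 30.55)]
experiment = [(785.8424, 22.18), (640.3302, 12.40)]
print(match_features(control, experiment))  # -> [(0, 0)]
```

Once features are paired this way across replicates, their intensity ratios give the relative abundance changes the abstract describes.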

  6. Accurate stress resultants equations for laminated composite deep thick shells

    SciTech Connect

    Qatu, M.S.

    1995-11-01

This paper derives accurate equations for the normal and shear force resultants as well as the bending and twisting moment resultants for laminated composite deep, thick shells. The stress resultant equations for laminated composite thick shells are shown to differ from those of plates. This is because the stresses over the thickness of the shell have to be integrated over a trapezoidal-like shell element to obtain the stress resultants. Numerical results are obtained and show that accurate stress resultants are needed for laminated composite deep, thick shells, especially if the curvature is not spherical.
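A sketch of why the trapezoidal element matters (notation assumed here, not quoted from the paper): for a shell wall curved in the y-direction with radius R_y, the width of the element varies through the thickness, so the resultants on the x-face pick up a (1 + z/R_y) factor,

```latex
N_x = \int_{-h/2}^{h/2} \sigma_x \left(1 + \frac{z}{R_y}\right)\, dz, \qquad
M_x = \int_{-h/2}^{h/2} \sigma_x \left(1 + \frac{z}{R_y}\right) z \, dz .
```

As R_y tends to infinity the factor reduces to 1 and the familiar plate resultants are recovered; dropping the z/R terms for deep, thick shells is precisely the approximation the paper shows to be inaccurate.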

  7. A fluorescence-based quantitative real-time PCR assay for accurate Pocillopora damicornis species identification

    NASA Astrophysics Data System (ADS)

    Thomas, Luke; Stat, Michael; Evans, Richard D.; Kennington, W. Jason

    2016-09-01

    Pocillopora damicornis is one of the most extensively studied coral species globally, but high levels of phenotypic plasticity within the genus make species identification based on morphology alone unreliable. As a result, there is a compelling need to develop cheap and time-effective molecular techniques capable of accurately distinguishing P. damicornis from other congeneric species. Here, we develop a fluorescence-based quantitative real-time PCR (qPCR) assay to genotype a single nucleotide polymorphism that accurately distinguishes P. damicornis from other morphologically similar Pocillopora species. We trial the assay across colonies representing multiple Pocillopora species and then apply the assay to screen samples of Pocillopora spp. collected at regional scales along the coastline of Western Australia. This assay offers a cheap and time-effective alternative to Sanger sequencing and has broad applications including studies on gene flow, dispersal, recruitment and physiological thresholds of P. damicornis.

  8. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

Accurate quantitation of intracellular pH (pHi) is of great importance for revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological autofluorescence background under UV-Vis excitation and severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as the energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response signal and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can introduce relatively large uncertainty into the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, with a response of 3.56 per pH unit over the pHi range 3.0-7.0 and a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
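The self-ratiometric readout can be sketched as below. The calibration constants here are hypothetical (only the ~3.56-per-pH-unit sensitivity is motivated by the abstract; the intercept and the assumed-linear response are ours), so this is an illustration of the principle, not the authors' calibration.

```python
# Hypothetical linear calibration: ratio = SLOPE * pH + INTERCEPT.
# SLOPE echoes the reported sensitivity; INTERCEPT is assumed for illustration.
SLOPE, INTERCEPT = 3.56, -8.0

def ph_from_ratio(i_475: float, i_645: float) -> float:
    """Estimate pHi from the self-ratiometric signal I(475 nm) / I(645 nm):
    the 475 nm band responds to pH, the 645 nm band is the internal reference."""
    ratio = i_475 / i_645
    return (ratio - INTERCEPT) / SLOPE

print(round(ph_from_ratio(9.8, 1.0), 2))  # ratio 9.8 -> pH 5.0
```

Because both bands come from the same probe under the same excitation, taking their ratio cancels probe concentration and excitation fluctuations, which is what makes the readout "self-ratiometric".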

  9. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    PubMed Central

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-01-01

Accurate quantitation of intracellular pH (pHi) is of great importance for revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, biological autofluorescence background under UV-Vis excitation and severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as the energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response signal and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which can introduce relatively large uncertainty into the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, with a response of 3.56 per pH unit over the pHi range 3.0-7.0 and a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems. PMID:27934889

  10. FANSe: an accurate algorithm for quantitative mapping of large scale sequencing reads

    PubMed Central

    Zhang, Gong; Fedyunin, Ivan; Kirchner, Sebastian; Xiao, Chuanle; Valleriani, Angelo; Ignatova, Zoya

    2012-01-01

The most crucial step in data processing for high-throughput sequencing applications is the accurate and sensitive alignment of the sequencing reads to reference genomes or transcriptomes. Accurate detection of insertions and deletions (indels) and of errors introduced by the sequencing platform or by misreading of modified nucleotides is essential for the quantitative processing of RNA-based sequencing (RNA-Seq) datasets and for the identification of genetic variations and modification patterns. We developed a new, fast and accurate algorithm for nucleic acid sequence analysis, FANSe, with adjustable mismatch-allowance settings and the ability to handle indels, to accurately and quantitatively map millions of reads to small or large reference genomes. It is a seed-based algorithm that uses the whole read information for mapping; high sensitivity and low ambiguity are achieved by using short, non-overlapping seeds. Furthermore, FANSe uses a hotspot score to prioritize the processing of highly probable matches and implements a modified Smith-Waterman refinement with a reduced scoring matrix to accelerate the calculation without compromising sensitivity. The FANSe algorithm stably processes datasets from various sequencing platforms, masked or unmasked, and from small or large genomes. It shows remarkable coverage of low-abundance mRNAs, which is important for quantitative processing of RNA-Seq datasets. PMID:22379138
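The seeding idea behind such mappers can be sketched as follows (a toy illustration of seed-based candidate generation in general, not FANSe's actual implementation): the read is cut into consecutive non-overlapping k-mers, each k-mer is looked up in a reference index, and every hit is shifted back by the seed's offset so that each candidate marks where the full read would start.

```python
def non_overlapping_seeds(read: str, k: int):
    """Split a read into consecutive, non-overlapping k-mers (seeds)."""
    return [read[i:i + k] for i in range(0, len(read) - k + 1, k)]

def candidate_positions(read: str, k: int, index: dict):
    """Union of reference start positions suggested by any seed; seed i
    sits at read offset i*k, so each hit is shifted back by i*k."""
    cands = set()
    for i, seed in enumerate(non_overlapping_seeds(read, k)):
        for pos in index.get(seed, ()):
            cands.add(pos - i * k)
    return cands

# Toy reference index: seed -> reference positions where it occurs.
index = {"ACGT": [0, 40], "TTGG": [4]}
print(sorted(candidate_positions("ACGTTTGG", 4, index)))  # -> [0, 40]
```

Each candidate position would then be verified against the whole read (e.g. by the banded Smith-Waterman refinement the abstract mentions), which is where mismatches and indels are scored.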

  11. FANSe: an accurate algorithm for quantitative mapping of large scale sequencing reads.

    PubMed

    Zhang, Gong; Fedyunin, Ivan; Kirchner, Sebastian; Xiao, Chuanle; Valleriani, Angelo; Ignatova, Zoya

    2012-06-01

The most crucial step in data processing for high-throughput sequencing applications is the accurate and sensitive alignment of the sequencing reads to reference genomes or transcriptomes. Accurate detection of insertions and deletions (indels) and of errors introduced by the sequencing platform or by misreading of modified nucleotides is essential for the quantitative processing of RNA-based sequencing (RNA-Seq) datasets and for the identification of genetic variations and modification patterns. We developed a new, fast and accurate algorithm for nucleic acid sequence analysis, FANSe, with adjustable mismatch-allowance settings and the ability to handle indels, to accurately and quantitatively map millions of reads to small or large reference genomes. It is a seed-based algorithm that uses the whole read information for mapping; high sensitivity and low ambiguity are achieved by using short, non-overlapping seeds. Furthermore, FANSe uses a hotspot score to prioritize the processing of highly probable matches and implements a modified Smith-Waterman refinement with a reduced scoring matrix to accelerate the calculation without compromising sensitivity. The FANSe algorithm stably processes datasets from various sequencing platforms, masked or unmasked, and from small or large genomes. It shows remarkable coverage of low-abundance mRNAs, which is important for quantitative processing of RNA-Seq datasets.

  12. Accurate detection and quantitation of heteroplasmic mitochondrial point mutations by pyrosequencing.

    PubMed

    White, Helen E; Durston, Victoria J; Seller, Anneke; Fratter, Carl; Harvey, John F; Cross, Nicholas C P

    2005-01-01

Disease-causing mutations in mitochondrial DNA (mtDNA) are typically heteroplasmic, and therefore interpretation of genetic tests for mitochondrial disorders can be problematic. Detection of low-level heteroplasmy is technically demanding, and it is often difficult to discriminate between the absence of a mutation and the failure of a technique to detect the mutation in a particular tissue. The reliable measurement of heteroplasmy in different tissues may help identify individuals who are at risk of developing specific complications and allow improved prognostic advice for patients and family members. We have evaluated Pyrosequencing technology for the detection and estimation of heteroplasmy for six mitochondrial point mutations associated with the following diseases: Leber's hereditary optic neuropathy (LHON), G3460A, G11778A, and T14484C; mitochondrial encephalopathy with lactic acidosis and stroke-like episodes (MELAS), A3243G; myoclonus epilepsy with ragged red fibers (MERRF), A8344G; and neurogenic muscle weakness, ataxia, and retinitis pigmentosa (NARP)/Leigh syndrome, T8993G/C. Results obtained from the Pyrosequencing assays for 50 patients with presumptive mitochondrial disease were compared to those obtained using the commonly used diagnostic technique of polymerase chain reaction (PCR) and restriction enzyme digestion. The Pyrosequencing assays provided accurate genotyping and quantitative determination of mutational load with a sensitivity and specificity of 100%. The MELAS A3243G mutation was detected reliably at a level of 1% heteroplasmy. We conclude that Pyrosequencing is a rapid and robust method for detecting heteroplasmic mitochondrial point mutations.
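The quantitation step reduces to a simple proportion (a generic sketch of how allele-quantification from pyrogram peak signals works, not the assay's validated calculation): the mutational load is the mutant allele's share of the total signal at the assayed position.

```python
def heteroplasmy_percent(mutant_signal: float, wildtype_signal: float) -> float:
    """Mutational load (%) as the mutant fraction of the total allele signal,
    e.g. from the two allele peak heights in a pyrogram."""
    return 100.0 * mutant_signal / (mutant_signal + wildtype_signal)

# Hypothetical peak heights for mutant and wild-type alleles:
print(heteroplasmy_percent(3.0, 97.0))  # -> 3.0 (% heteroplasmy)
```

Detecting 1% heteroplasmy, as reported for A3243G, therefore requires the mutant peak to be resolved reliably when it is one-hundredth of the total signal.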

  13. Quantitative proteomics using the high resolution accurate mass capabilities of the quadrupole-orbitrap mass spectrometer.

    PubMed

    Gallien, Sebastien; Domon, Bruno

    2014-08-01

High-resolution/accurate-mass hybrid mass spectrometers have considerably advanced shotgun proteomics, and the recent introduction of fast sequencing capabilities has expanded their use for targeted approaches. More specifically, the quadrupole-orbitrap instrument has a unique configuration, and its new features enable a wide range of experiments. An overview of the analytical capabilities of this instrument is presented, with a focus on its application to quantitative analyses. The high resolution, the trapping capability, and the versatility of the instrument have allowed quantitative proteomic workflows to be redefined and new data-acquisition schemes to be developed. The initial proteomic applications have shown an improvement in analytical performance. However, because quantification relies on trapped ions rather than on a continuous ion beam, further refinement of the technique can be expected.

  14. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

Polarization-sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied to ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images introduces a noise-bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation of the nonlinear dependency of phase retardation and birefringence on SNR was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a posteriori (MAP) estimator, and quantitative birefringence imaging was demonstrated [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator that takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed before the measurement by a Monte Carlo (MC) simulation based on the mathematical model of JM-OCT. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  15. Highly sensitive capillary electrophoresis-mass spectrometry for rapid screening and accurate quantitation of drugs of abuse in urine.

    PubMed

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-05-30

The combination of capillary electrophoresis (CE) and mass spectrometry (MS) is particularly well adapted to bioanalysis due to its high separation efficiency, selectivity, and sensitivity; its short analytical time; and its low solvent and sample consumption. For clinical and forensic toxicology, a two-step analysis is usually performed: first, a screening step for compound identification, and second, confirmation and/or accurate quantitation in cases of presumed positive results. In this study, a fast and sensitive CE-MS workflow was developed for the screening and quantitation of drugs of abuse in urine samples. A CE with a time-of-flight MS (CE-TOF/MS) screening method was developed using a simple urine dilution and on-line sample preconcentration with pH-mediated stacking. The sample stacking allowed for a high loading capacity (20.5% of the capillary length), leading to limits of detection as low as 2 ng mL⁻¹ for drugs of abuse. Compound quantitation of positive samples was performed by CE-MS/MS with a triple quadrupole MS equipped with an adapted triple-tube sprayer and an electrospray ionization (ESI) source. The CE-ESI-MS/MS method was validated for two model compounds, cocaine (COC) and methadone (MTD), according to the Guidance of the Food and Drug Administration. The quantitative performance was evaluated for selectivity, response function, the lower limit of quantitation, trueness, precision, and accuracy. COC and MTD detection in urine samples was determined to be accurate over the range of 10-1000 ng mL⁻¹ and 21-1000 ng mL⁻¹, respectively.

  16. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using the feature values of colors, shapes, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  17. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations

    NASA Astrophysics Data System (ADS)

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-01

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
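For context, such analyses typically fit the generalized Planck (Würfel) law, a standard result rather than a formula quoted from this abstract. The emitted photon flux at energy E depends on the absorptivity a(E), the carrier temperature T, and the chemical potential Δμ of the electron-hole ensemble:

```latex
I_{\mathrm{PL}}(E) \;\propto\; a(E)\,
\frac{E^{2}}{\exp\!\left(\frac{E - \Delta\mu}{k_{B} T}\right) - 1}.
```

The high-energy slope of the spectrum yields T and the overall intensity yields Δμ, which is why an inaccurate or excitation-dependent a(E), as the abstract stresses, directly biases both extracted quantities.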

  18. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.

  19. SILAC-Based Quantitative Strategies for Accurate Histone Posttranslational Modification Profiling Across Multiple Biological Samples.

    PubMed

    Cuomo, Alessandro; Soldi, Monica; Bonaldi, Tiziana

    2017-01-01

    Histone posttranslational modifications (hPTMs) play a key role in regulating chromatin dynamics and fine-tuning DNA-based processes. Mass spectrometry (MS) has emerged as a versatile technology for the analysis of histones, contributing to the dissection of hPTMs, with special strength in the identification of novel marks and in the assessment of modification cross talks. Stable isotope labeling by amino acid in cell culture (SILAC), when adapted to histones, permits the accurate quantification of PTM changes among distinct functional states; however, its application has been mainly confined to actively dividing cell lines. A spike-in strategy based on SILAC can be used to overcome this limitation and profile hPTMs across multiple samples. We describe here the adaptation of SILAC to the analysis of histones, in both standard and spike-in setups. We also illustrate its coupling to an implemented "shotgun" workflow, by which heavy arginine-labeled histone peptides, produced upon Arg-C digestion, are qualitatively and quantitatively analyzed in an LC-MS/MS system that combines ultrahigh-pressure liquid chromatography (UHPLC) with new-generation Orbitrap high-resolution instrument.

  20. How accurate is the Kubelka-Munk theory of diffuse reflection? A quantitative answer

    NASA Astrophysics Data System (ADS)

    Joseph, Richard I.; Thomas, Michael E.

    2012-10-01

    The (heuristic) Kubelka-Munk theory of diffuse reflectance and transmittance of a film on a substrate, which is widely used because it gives simple analytic results, is compared to the rigorous radiative transfer model of Chandrasekhar. The rigorous model has to be numerically solved, thus is less intuitive. The Kubelka-Munk theory uses an absorption coefficient and scatter coefficient as inputs, similar to the rigorous model of Chandrasekhar. The relationship between these two sets of coefficients is addressed. It is shown that the Kubelka-Munk theory is remarkably accurate if one uses the proper albedo parameter.
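The Kubelka-Munk relation at the heart of the comparison is compact enough to state directly: for an optically thick layer, the remission function of the diffuse reflectance R equals the ratio of the absorption coefficient K to the scattering coefficient S. This is the standard K-M result, sketched here for illustration.

```python
def kubelka_munk_remission(r_inf: float) -> float:
    """Kubelka-Munk remission function F(R) = (1 - R)^2 / (2R) = K/S
    for an optically thick (infinite) layer with diffuse reflectance R."""
    return (1.0 - r_inf) ** 2 / (2.0 * r_inf)

# A layer reflecting half of the diffuse illumination:
print(kubelka_munk_remission(0.5))  # -> 0.25, i.e. K/S = 0.25
```

The paper's question is how these heuristic K and S relate to the absorption and scattering coefficients of rigorous radiative transfer; the formula itself is exact only within the two-flux K-M model.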

  21. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling.

    PubMed

    Boers, Stefan A; Hays, John P; Jansen, Ruud

    2017-04-05

In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or chemicals/reagents used. Using equimolar synthetic microbial community samples and low biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses a much higher precision and a lower limit of detection compared with traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison.
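The internal-calibrator correction can be sketched as a simple rescaling (an illustration of the principle with hypothetical OTU names and spike-in numbers, not the authors' pipeline): the known number of calibrator copies spiked in, divided by the calibrator reads recovered, gives a copies-per-read factor that converts every OTU's read count into an absolute copy estimate.

```python
def absolute_abundance(otu_reads: dict, calibrator_reads: int,
                       calibrator_copies_added: float) -> dict:
    """Scale each OTU's read count by the per-read copy yield of a spiked-in
    internal calibrator to estimate 16S rRNA gene copies in the sample."""
    copies_per_read = calibrator_copies_added / calibrator_reads
    return {otu: reads * copies_per_read for otu, reads in otu_reads.items()}

# Hypothetical run: 10,000 calibrator copies spiked in, 500 calibrator reads out.
print(absolute_abundance({"OTU_1": 2000, "OTU_2": 50}, 500, 10000))
# -> {'OTU_1': 40000.0, 'OTU_2': 1000.0}
```

Because each sample carries its own calibrator, run-to-run efficiency differences cancel out, and copy estimates from negative extraction controls can be subtracted on the same absolute scale.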

  22. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling

    PubMed Central

    Boers, Stefan A.; Hays, John P.; Jansen, Ruud

    2017-01-01

In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or chemicals/reagents used. Using equimolar synthetic microbial community samples and low biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses a much higher precision and a lower limit of detection compared with traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison. PMID:28378789

  3. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. This distortion is shown to be nearly independent of the chemical species.

  4. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  5. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  6. Renal Cortical Lactate Dehydrogenase: A Useful, Accurate, Quantitative Marker of In Vivo Tubular Injury and Acute Renal Failure

    PubMed Central

    Zager, Richard A.; Johnson, Ali C. M.; Becker, Kirsten

    2013-01-01

    Studies of experimental acute kidney injury (AKI) are critically dependent on having precise methods for assessing the extent of tubular cell death. However, the most widely used techniques either provide indirect assessments (e.g., BUN, creatinine), suffer from the need for semi-quantitative grading (renal histology), or reflect the status of residual viable, not the number of lost, renal tubular cells (e.g., NGAL content). Lactate dehydrogenase (LDH) release is a highly reliable test for assessing degrees of in vitro cell death. However, its utility as an in vivo AKI marker has not been defined. Towards this end, CD-1 mice were subjected to graded renal ischemia (0, 15, 22, 30, 40, or 60 min) or to nephrotoxic (glycerol; maleate) AKI. Sham operated mice, or mice with AKI in the absence of acute tubular necrosis (ureteral obstruction; endotoxemia), served as negative controls. Renal cortical LDH or NGAL levels were assayed 2 or 24 hrs later. Ischemic, glycerol, and maleate-induced AKI were each associated with striking, steep, inverse correlations (r, −0.89) between renal injury severity and renal LDH content. With severe AKI, >65% LDH declines were observed. Corresponding prompt plasma and urinary LDH increases were observed. These observations, coupled with the maintenance of normal cortical LDH mRNA levels, indicated the renal LDH efflux, not decreased LDH synthesis, caused the falling cortical LDH levels. Renal LDH content was well maintained with sham surgery, ureteral obstruction or endotoxemic AKI. In contrast to LDH, renal cortical NGAL levels did not correlate with AKI severity. In sum, the above results indicate that renal cortical LDH assay is a highly accurate quantitative technique for gauging the extent of experimental acute ischemic and toxic renal injury. That it avoids the limitations of more traditional AKI markers implies great potential utility in experimental studies that require precise quantitation of tubule cell death. PMID:23825563
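
    The inverse severity-LDH relationship reported above is a plain Pearson correlation. The sketch below implements that statistic on invented data points (the paper reports r near -0.89 on real ischemia-time versus cortical-LDH measurements).

```python
# Generic Pearson correlation sketch; ischemia times mirror the graded
# protocol in the abstract, but the LDH values are invented placeholders.
import math

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of paired observations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ischemia_min = [0, 15, 22, 30, 40, 60]        # graded renal ischemia durations
ldh_content = [100, 88, 80, 65, 50, 33]       # invented cortical LDH (% of sham)
r = pearson_r(ischemia_min, ldh_content)      # strongly negative
```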

  7. Quantitative results for square gradient models of fluids

    NASA Astrophysics Data System (ADS)

    Kong, Ling-Ti; Vriesinga, Dan; Denniston, Colin

    2011-03-01

    Square gradient models for fluids are extensively used because they are believed to provide a good qualitative understanding of the essential physics. However, unlike elasticity theory for solids, there are few quantitative results for specific (as opposed to generic) fluids. Indeed, the only numerical values of the square gradient coefficients for specific fluids have been inferred from attempts to match macroscopic properties such as surface tensions rather than from direct measurement. We employ all-atom molecular dynamics, using the TIP3P and OPLS force fields, to directly measure the coefficients of the density gradient expansion for several real fluids. For all liquids measured, including water, we find that the square gradient coefficient is negative, suggesting the need for some regularization of a model including only the square gradient, but only at wavelengths comparable to the intermolecular separation. The implications for liquid-gas interfaces are also examined. Remarkably, the square gradient model is found to give a reasonably accurate description of density fluctuations in the liquid state down to wavelengths close to atomic size.

  8. Quantitatively accurate activity measurements with a dedicated cardiac SPECT camera: Physical phantom experiments

    SciTech Connect

    Pourmoghaddas, Amir; Wells, R. Glenn

    2016-01-15

    Healthcare), followed by a CT scan for attenuation correction (AC). For each experiment, separate images were created including reconstruction with no corrections (NC), with AC, with attenuation and dual-energy window (DEW) scatter correction (ACSC), with attenuation and partial volume correction (PVC) applied (ACPVC), and with attenuation, scatter, and PVC applied (ACSCPVC). The DEW SC method used was modified to account for the presence of the low-energy tail. Results: T-tests showed that the mean error in absolute activity measurement was reduced significantly for AC and ACSC compared to NC for both (hot and cold) datasets (p < 0.001) and that ACSC, ACPVC, and ACSCPVC show significant reductions in mean differences compared to AC (p ≤ 0.001) without increasing the uncertainty (p > 0.4). The effect of SC and PVC was significant in reducing errors over AC in both datasets (p < 0.001 and p < 0.01, respectively), resulting in a mean error of 5% ± 4%. Conclusions: Quantitative measurements of cardiac 99mTc activity are achievable using attenuation and scatter corrections, with the authors’ dedicated cardiac SPECT camera. Partial volume corrections offer improvements in measurement accuracy in AC images and ACSC images with elevated background activity; however, these improvements are not significant in ACSC images with low background activity.
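
    The classic dual-energy-window scatter estimate underlying the DEW correction can be sketched briefly. The study uses a modified DEW that also models the low-energy tail; that modification is not reproduced here, and all counts and window widths below are invented.

```python
# Classic dual-energy-window (DEW) scatter estimate, shown for illustration
# only; the paper's modified DEW (low-energy tail handling) is not reproduced.

def dew_scatter_corrected(photopeak_counts, scatter_counts,
                          photopeak_width_kev, scatter_width_kev, k=0.5):
    """Return primary (scatter-corrected) photopeak counts.

    Scatter inside the photopeak window is estimated from counts in a lower
    energy window, scaled by the window-width ratio and an empirical factor k.
    """
    scatter_estimate = k * scatter_counts * (photopeak_width_kev / scatter_width_kev)
    return max(photopeak_counts - scatter_estimate, 0.0)

# Invented counts: 12000 photopeak counts, 4000 counts in a half-width scatter window
primary = dew_scatter_corrected(12000, 4000, photopeak_width_kev=28, scatter_width_kev=14)
```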

  9. Extended Rearrangement Inequalities and Applications to Some Quantitative Stability Results

    NASA Astrophysics Data System (ADS)

    Lemou, Mohammed

    2016-12-01

    In this paper, we prove a new functional inequality of Hardy-Littlewood type for generalized rearrangements of functions. We then show how this inequality provides quantitative stability results of steady states to evolution systems that essentially preserve the rearrangements and some suitable energy functional, under minimal regularity assumptions on the perturbations. In particular, this inequality yields a quantitative stability result of a large class of steady state solutions to the Vlasov-Poisson systems, and more precisely we derive a quantitative control of the L1 norm of the perturbation by the relative Hamiltonian (the energy functional) and rearrangements. A general nonlinear stability result has been obtained by Lemou et al. (Invent Math 187:145-194, 2012) in the gravitational context; however, the proof relied in a crucial way on compactness arguments which, by construction, provide no quantitative control of the perturbation. Our functional inequality is also applied to the context of 2D-Euler systems and also provides quantitative stability results of a large class of steady-states to this system in a natural energy space.

  10. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis

    PubMed Central

    Smith, William L.; Chadwick, Sean G.; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E.; Aguin, Tina J.; Sobel, Jack D.

    2016-01-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV. PMID:26818677
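
    Scoring a specimen against a logistic regression model of the kind described above reduces to a weighted sum of log quantities passed through a sigmoid. The intercept and weights below are made up for illustration; the published model's actual coefficients are not given in the abstract.

```python
# Hypothetical sketch of logistic-regression BV scoring from log10 qPCR
# quantities. INTERCEPT and WEIGHTS are invented, not the fitted model.
import math

INTERCEPT = -8.0
WEIGHTS = {"G_vaginalis": 0.6, "A_vaginae": 0.7, "Megasphaera_1_2": 0.9}

def bv_probability(log10_quantities):
    """Sigmoid of the linear predictor over log10 organism quantities."""
    z = INTERCEPT + sum(WEIGHTS[org] * q for org, q in log10_quantities.items())
    return 1.0 / (1.0 + math.exp(-z))

def diagnose(log10_quantities, threshold=0.5):
    """Call symptomatic BV when the model probability exceeds the threshold."""
    return bv_probability(log10_quantities) >= threshold

positive = diagnose({"G_vaginalis": 8.0, "A_vaginae": 7.0, "Megasphaera_1_2": 6.0})
negative = diagnose({"G_vaginalis": 3.0, "A_vaginae": 0.0, "Megasphaera_1_2": 0.0})
```

    The reported 92% sensitivity and 95% specificity would be estimated by running such a classifier over all 385 specimens against the Amsel/Nugent reference standard.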

  11. Quantitative spectroscopy of hot stars: accurate atomic data applied on a large scale as driver of recent breakthroughs

    NASA Astrophysics Data System (ADS)

    Przybilla, N.; Schaffenroth, V.; Nieva, M. F.; Butler, K.

    2016-10-01

    OB-type stars present hotbeds for non-LTE physics because of their strong radiation fields that drive the atmospheric plasma out of local thermodynamic equilibrium. We report on recent breakthroughs in the quantitative analysis of the optical and UV-spectra of OB-type stars that were facilitated by application of accurate and precise atomic data on a large scale. An astrophysicist's dream has come true, by bringing observed and model spectra into close match over wide parts of the observed wavelength ranges. This allows tight observational constraints to be derived from OB-type stars for a wide range of applications in astrophysics. However, despite the progress made, many details of the modelling may be improved further. We discuss atomic data needs in terms of laboratory measurements and also ab-initio calculations. Particular emphasis is given to quantitative spectroscopy in the near-IR, which will be the focus in the era of the upcoming extremely large telescopes.

  12. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  13. There's plenty of gloom at the bottom: the many challenges of accurate quantitation in size-based oligomeric separations.

    PubMed

    Striegel, André M

    2013-11-01

    There is a variety of small-molecule species (e.g., tackifiers, plasticizers, oligosaccharides) the size-based characterization of which is of considerable scientific and industrial importance. Likewise, quantitation of the amount of oligomers in a polymer sample is crucial for the import and export of substances into the USA and European Union (EU). While the characterization of ultra-high molar mass macromolecules by size-based separation techniques is generally considered a challenge, it is this author's contention that a greater challenge is encountered when trying to perform, for quantitation purposes, separations in and of the oligomeric region. The latter thesis is expounded herein, by detailing the various obstacles encountered en route to accurate, quantitative oligomeric separations by entropically dominated techniques such as size-exclusion chromatography, hydrodynamic chromatography, and asymmetric flow field-flow fractionation, as well as by methods which are, principally, enthalpically driven such as liquid adsorption and temperature gradient interaction chromatography. These obstacles include, among others, the diminished sensitivity of static light scattering (SLS) detection at low molar masses, the non-constancy of the response of SLS and of commonly employed concentration-sensitive detectors across the oligomeric region, and the loss of oligomers through the accumulation wall membrane in asymmetric flow field-flow fractionation. The battle is not lost, however, because, with some care and given a sufficient supply of sample, the quantitation of both individual oligomeric species and of the total oligomeric region is often possible.

  14. Quantitation of Insulin-Like Growth Factor 1 in Serum by Liquid Chromatography High Resolution Accurate-Mass Mass Spectrometry.

    PubMed

    Ketha, Hemamalini; Singh, Ravinder J

    2016-01-01

    Insulin-like growth factor 1 (IGF-1) is a 70 amino acid peptide hormone which acts as the principal mediator of the effects of growth hormone (GH). Due to a wide variability in circulating concentration of GH, IGF-1 quantitation is the first step in the diagnosis of GH excess or deficiency. The majority (>95%) of IGF-1 circulates as a ternary complex along with its principal binding protein insulin-like growth factor 1 binding protein 3 (IGFBP-3) and acid labile subunit. The assay design approach for IGF-1 quantitation has to include a step to dissociate IGF-1 from its ternary complex. Several commercial assays employ a buffer containing acidified ethanol to achieve this. Despite several modifications, commercially available immunoassays have been shown to have challenges with interference from IGFBP-3. Additionally, inter-method comparison between IGF-1 immunoassays has been shown to be suboptimal. Mass spectrometry has been utilized for quantitation of IGF-1. In this chapter a liquid chromatography high resolution accurate-mass mass spectrometry (LC-HRAMS) based method for IGF-1 quantitation is described.

  15. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial glucose assay strips is presented. (MVL)
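
    The quantitative half of such an exercise is a linear (Beer-Lambert) standard curve: absorbances of known glucose standards are fit by least squares, then an unknown's absorbance at 630 nm is converted to a concentration. The standards and readings below are invented for illustration.

```python
# Sketch of absorbance-to-concentration conversion via a linear standard
# curve; all (concentration, absorbance) pairs are invented.

def fit_standard_curve(standards):
    """Least-squares slope/intercept for (concentration, absorbance) pairs."""
    n = len(standards)
    sx = sum(c for c, _ in standards)
    sy = sum(a for _, a in standards)
    sxx = sum(c * c for c, _ in standards)
    sxy = sum(c * a for c, a in standards)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    return slope, intercept

def concentration(absorbance, slope, intercept):
    """Invert the calibration line to recover the unknown's concentration."""
    return (absorbance - intercept) / slope

slope, intercept = fit_standard_curve([(0.0, 0.00), (2.5, 0.25), (5.0, 0.50)])
unknown = concentration(0.30, slope, intercept)  # in the standards' units
```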

  16. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis.

    PubMed

    Hilbert, David W; Smith, William L; Chadwick, Sean G; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E; Aguin, Tina J; Sobel, Jack D; Gygax, Scott E

    2016-04-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We analyzed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV.

  17. An Accurate Heading Solution using MEMS-based Gyroscope and Magnetometer Integrated System (Preliminary Results)

    NASA Astrophysics Data System (ADS)

    El-Diasty, M.

    2014-11-01

    An accurate heading solution is required for many applications, and it can be achieved by high-grade (high-cost) gyroscopes (gyros), which may not be suitable for such applications. Micro-Electro-Mechanical Systems (MEMS) is an emerging technology which has the potential of providing a heading solution using a low-cost MEMS-based gyro. However, a MEMS-gyro-based heading solution drifts significantly over time. The heading solution can also be estimated using a MEMS-based magnetometer by measuring the horizontal components of the Earth's magnetic field. The MEMS-magnetometer-based heading solution does not drift over time, but it is contaminated by a high level of noise and may be disturbed by the presence of magnetic field sources such as metal objects. This paper proposes an accurate heading estimation procedure based on the integration of MEMS-based gyro and magnetometer measurements, in which gyro angular rates are first estimated from the magnetometer measurements and then combined with the measured gyro angular rates through a robust filter to estimate the heading. The proposed integration solution is implemented using two data sets: one collected in static mode without magnetic disturbances and the second collected in kinematic mode with magnetic disturbances. The results showed that the proposed integrated heading solution provides an accurate, smoothed and undisturbed solution when compared with the magnetometer-based and gyro-based heading solutions.
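
    The drift-versus-noise trade-off at the heart of gyro/magnetometer integration can be illustrated with a minimal complementary filter. The paper's robust filter and its magnetometer-derived angular rates are not reproduced here; the blend factor and sensor values below are assumptions.

```python
# Minimal complementary-filter sketch of gyro/magnetometer heading fusion.
# alpha, sensor readings and the update rate are illustrative assumptions.
import math

def magnetometer_heading(mx, my):
    """Heading (rad) from the horizontal magnetic field components."""
    return math.atan2(-my, mx)

def fuse_heading(prev_heading, gyro_rate, dt, mag_heading, alpha=0.98):
    """Blend the integrated gyro heading (drifts) with the magnetometer heading (noisy)."""
    gyro_heading = prev_heading + gyro_rate * dt  # dead-reckoned heading
    return alpha * gyro_heading + (1.0 - alpha) * mag_heading

heading = 0.0
for _ in range(100):  # stationary case: zero gyro rate, magnetometer reads 0.1 rad
    heading = fuse_heading(heading, gyro_rate=0.0, dt=0.01, mag_heading=0.1)
```

    The fused estimate converges toward the magnetometer's absolute reference while short-term motion is tracked by the gyro term, which is the qualitative behavior the integrated solution exploits.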

  18. Adsorption of lactate dehydrogenase enzyme on carbon nanotubes: how to get accurate results for the cytotoxicity of these nanomaterials.

    PubMed

    Forest, Valérie; Figarol, Agathe; Boudard, Delphine; Cottier, Michèle; Grosseau, Philippe; Pourchez, Jérémie

    2015-03-31

    Carbon nanotube (CNT) cytotoxicity is frequently investigated using in vitro classical toxicology assays. However, these cellular tests, usually based on the use of colorimetric or fluorimetric dyes, were designed for chemicals and may not be suitable for nanosized materials. Indeed, because of their unique physicochemical properties CNT can interfere with the assays and bias the results. To get accurate data and draw reliable conclusions, these artifacts should be carefully taken into account. The aim of this study was to evaluate qualitatively and quantitatively the interferences occurring between CNT and the commonly used lactate dehydrogenase (LDH) assay. Experiments under cell-free conditions were performed, and it was clearly demonstrated that artifacts occurred. They were due to the intrinsic absorbance of CNT on one hand and the adsorption of LDH at the CNT surface on the other hand. The adsorption of LDH on CNT was modeled and was found to fit the Langmuir model. The K(ads) and n(eq) constants were defined, allowing the correction of results obtained from cellular experiments to get more accurate data and lead to proper conclusions on the cytotoxicity of CNT.
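
    The Langmuir-model correction described above amounts to adding back the LDH sequestered on the nanotube surface. The K_ads and n_eq values below are placeholders, not the constants fitted in the paper.

```python
# Sketch of a Langmuir-type correction for LDH adsorbed onto CNTs.
# K_ADS and N_EQ are illustrative placeholders, not the fitted constants.

K_ADS = 2.0   # Langmuir affinity constant (illustrative units)
N_EQ = 50.0   # maximum LDH bound per mg of CNT (illustrative)

def adsorbed_ldh(free_ldh, cnt_mg, k_ads=K_ADS, n_eq=N_EQ):
    """Langmuir isotherm: bound = n_eq * K*C / (1 + K*C), per mg of CNT."""
    return cnt_mg * n_eq * (k_ads * free_ldh) / (1.0 + k_ads * free_ldh)

def corrected_ldh(measured_free_ldh, cnt_mg):
    """Total released LDH = measured free LDH + the amount sequestered by CNTs."""
    return measured_free_ldh + adsorbed_ldh(measured_free_ldh, cnt_mg)

total = corrected_ldh(measured_free_ldh=2.0, cnt_mg=0.1)
```

    Without such a correction, LDH adsorption makes cells exposed to CNTs appear healthier than they are, which is exactly the artifact the study quantifies.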

  19. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples.

    PubMed

    Mackie, David M; Jahnke, Justin P; Benyamin, Marcus S; Sumner, James J

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • Algorithm is straightforward and intuitive, yet it is also fast, accurate, and robust.
    • Relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • Tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells.
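
    The core of any component-spectra approach is a least-squares decomposition of the mixture spectrum onto known component spectra. The sketch below solves the two-component normal equations directly; the toy spectra are invented, and the paper's error minimization and adaptive mesh refinement are not reproduced.

```python
# Two-component least-squares mixture decomposition, min ||m - x*a - y*b||^2,
# solved via the 2x2 normal equations. Spectra are toy vectors, not FTIR data.

def fit_two_components(mixture, comp_a, comp_b):
    """Return (x, y) minimizing the squared residual of x*comp_a + y*comp_b."""
    aa = sum(a * a for a in comp_a)
    bb = sum(b * b for b in comp_b)
    ab = sum(a * b for a, b in zip(comp_a, comp_b))
    am = sum(a * m for a, m in zip(comp_a, mixture))
    bm = sum(b * m for b, m in zip(comp_b, mixture))
    det = aa * bb - ab * ab
    return (am * bb - bm * ab) / det, (bm * aa - am * ab) / det

comp_1 = [1.0, 0.2, 0.0, 0.5]                     # toy "component spectrum" 1
comp_2 = [0.0, 0.8, 1.0, 0.1]                     # toy "component spectrum" 2
mixture = [0.5 * a + 0.3 * b for a, b in zip(comp_1, comp_2)]
x, y = fit_two_components(mixture, comp_1, comp_2)  # recovers the 0.5 / 0.3 weights
```

    Real mixtures of up to nine components require the general n-component normal equations (or a library solver), but the structure of the problem is the same.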

  20. The APEX Quantitative Proteomics Tool: Generating protein quantitation estimates from LC-MS/MS proteomics results

    PubMed Central

    Braisted, John C; Kuntumalla, Srilatha; Vogel, Christine; Marcotte, Edward M; Rodrigues, Alan R; Wang, Rong; Huang, Shih-Ting; Ferlanti, Erik S; Saeed, Alexander I; Fleischmann, Robert D; Peterson, Scott N; Pieper, Rembert

    2008-01-01

    Background Mass spectrometry (MS) based label-free protein quantitation has mainly focused on analysis of ion peak heights and peptide spectral counts. Most analyses of tandem mass spectrometry (MS/MS) data begin with an enzymatic digestion of a complex protein mixture to generate smaller peptides that can be separated and identified by an MS/MS instrument. Peptide spectral counting techniques attempt to quantify protein abundance by counting the number of detected tryptic peptides and their corresponding MS spectra. However, spectral counting is confounded by the fact that peptide physicochemical properties severely affect MS detection resulting in each peptide having a different detection probability. Lu et al. (2007) described a modified spectral counting technique, Absolute Protein Expression (APEX), which improves on basic spectral counting methods by including a correction factor for each protein (called Oi value) that accounts for variable peptide detection by MS techniques. The technique uses machine learning classification to derive peptide detection probabilities that are used to predict the number of tryptic peptides expected to be detected for one molecule of a particular protein (Oi). This predicted spectral count is compared to the protein's observed MS total spectral count during APEX computation of protein abundances. Results The APEX Quantitative Proteomics Tool, introduced here, is a free open source Java application that supports the APEX protein quantitation technique. The APEX tool uses data from standard tandem mass spectrometry proteomics experiments and provides computational support for APEX protein abundance quantitation through a set of graphical user interfaces that partition the parameter controls for the various processing tasks. The tool also provides a Z-score analysis for identification of significant differential protein expression, a utility to assess APEX classifier performance via cross validation, and a utility to merge multiple
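
    The Oi correction at the heart of APEX can be sketched compactly: each protein's observed spectral count is divided by its predicted detectability, then the results are normalized. Counts, Oi values, and the molecule total below are invented for illustration.

```python
# Sketch of the core APEX normalization: abundance_i proportional to
# count_i / Oi_i. All inputs are illustrative, not real proteomics data.

def apex_abundances(spectral_counts, oi, total_molecules=1_000_000):
    """Detectability-corrected relative abundances, scaled to total_molecules."""
    weighted = {p: spectral_counts[p] / oi[p] for p in spectral_counts}
    norm = sum(weighted.values())
    return {p: total_molecules * w / norm for p, w in weighted.items()}

# Equal raw counts, but protB's peptides are far harder to detect (lower Oi),
# so its corrected abundance comes out higher.
abund = apex_abundances({"protA": 100, "protB": 100}, {"protA": 10.0, "protB": 2.0})
```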

  1. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.
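
    A qPCR deletion call of this kind typically rests on the standard 2^-ΔΔCt relative-quantification calculation: a heterozygous deletion halves the target's dosage relative to a two-copy control. The Ct values and call window below are illustrative, not those of the published assay.

```python
# Sketch of deletion calling from qPCR threshold cycles via the 2^-ddCt
# method. Ct values and the 0.3-0.7 call window are illustrative assumptions.

def relative_copy_number(ct_target, ct_ref, ct_target_ctrl, ct_ref_ctrl):
    """2^-ddCt dosage of the target gene relative to a normal two-copy control."""
    ddct = (ct_target - ct_ref) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

def is_deleted(ratio, lo=0.3, hi=0.7):
    """A heterozygous deletion halves gene dosage, so the ratio falls near 0.5."""
    return lo <= ratio <= hi

# Target (e.g. a 1p36 gene such as PRKCZ) amplifies one cycle later in the
# patient than the reference gene predicts: dosage ratio ~0.5.
ratio = relative_copy_number(ct_target=26.0, ct_ref=24.0,
                             ct_target_ctrl=25.0, ct_ref_ctrl=24.0)
```

    Requiring both target genes (PRKCZ and SKI) to show a halved ratio is what gives the assay its reported 100% sensitivity and specificity on the validation cohort.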

  2. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  3. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transference of volatile compounds from the sample to the ITEX trap. To achieve this goal, most methodological aspects and parameters were carefully examined. The vial and sample sizes and the trapping materials were found to be critical because of the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee complete trapping of sample vapors. Complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are then desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. Validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 in all cases and good repeatability was also obtained (better than 7% in all cases). Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below their normal ranges of occurrence. Recoveries were not significantly different from 100%, except in the case of acetaldehyde, for which the method could not break some of the adducts that this compound forms with sulfites; this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrices.
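The need for on the order of 100 pumping strokes can be rationalized with a simple depletion model, assuming each stroke transfers a fixed fraction of the analyte remaining in the vial headspace. The per-stroke efficiency below is a hypothetical value chosen for illustration, not a figure from the paper:

```python
def fraction_extracted(n_strokes, per_stroke_efficiency):
    """Cumulative fraction of analyte transferred to the trap, assuming each
    pump stroke removes a fixed fraction of the analyte remaining in the vial."""
    return 1.0 - (1.0 - per_stroke_efficiency) ** n_strokes

# With ~4.5% of the remaining analyte trapped per stroke, 100 strokes move
# about 99% of the analyte, in line with the near-quantitative 85-99%
# transference reported.
print(round(fraction_extracted(100, 0.045), 2))  # -> 0.99
```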

  4. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2016-10-01

Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to underlie their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of a hydroxyl radical triggering MS3 fragmentation, which is observed only in positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy using deuterated d3-DNPH was implemented to assess variations in the levels of carbonyl compounds before and after exposure. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits in the high-attomole range. To the best of our knowledge, this is the first report of a method using HRAM neutral loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
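The diagnostic neutral-loss trigger can be sketched as a ppm-window check between the precursor and each MS2 fragment mass. This is an illustrative reconstruction, not the instrument method; the example m/z values are hypothetical:

```python
# Monoisotopic mass of a hydroxyl radical (O + H), the diagnostic neutral loss.
OH_LOSS = 17.00274

def triggers_ms3(precursor_mz, fragment_mzs, tol_ppm=10.0):
    """Return singly charged fragments whose mass difference from the precursor
    matches the hydroxyl-radical neutral loss within a ppm tolerance."""
    hits = []
    for mz in fragment_mzs:
        delta = precursor_mz - mz
        err_ppm = abs(delta - OH_LOSS) / precursor_mz * 1e6
        if err_ppm <= tol_ppm:
            hits.append(mz)
    return hits

# Hypothetical spectrum for a DNPH-derivatized carbonyl [M+H]+ at m/z 211.0462:
print(triggers_ms3(211.0462, [194.0435, 183.0, 163.05]))  # -> [194.0435]
```

Only the fragment corresponding to loss of 17.0027 Da would trigger the data-dependent MS3 scan.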

  5. Quantitative polymerase chain reaction analysis of DNA from noninvasive samples for accurate microsatellite genotyping of wild chimpanzees (Pan troglodytes verus).

    PubMed

    Morin, P A; Chambers, K E; Boesch, C; Vigilant, L

    2001-07-01

    Noninvasive samples are useful for molecular genetic analyses of wild animal populations. However, the low DNA content of such samples makes DNA amplification difficult, and there is the potential for erroneous results when one of two alleles at heterozygous microsatellite loci fails to be amplified. In this study we describe an assay designed to measure the amount of amplifiable nuclear DNA in low DNA concentration extracts from noninvasive samples. We describe the range of DNA amounts obtained from chimpanzee faeces and shed hair samples and formulate a new efficient approach for accurate microsatellite genotyping. Prescreening of extracts for DNA quantity is recommended for sorting of samples for likely success and reliability. Repetition of results remains extensive for analysis of microsatellite amplifications beginning from low starting amounts of DNA, but is reduced for those with higher DNA content.
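The risk that prescreening guards against, allelic dropout at low template amounts, can be illustrated with a simple model in which the number of amplifiable copies of each allele entering the PCR is Poisson distributed. This model is an assumption for illustration, not the authors' analysis:

```python
import math

def allelic_dropout_prob(mean_copies_per_allele):
    """Probability that a heterozygote is mistyped as a homozygote because all
    copies of exactly one allele are absent from the reaction, assuming the
    copy number of each allele is independently Poisson distributed."""
    p_zero = math.exp(-mean_copies_per_allele)  # P(no copies of a given allele)
    return 2 * p_zero * (1 - p_zero)            # either allele missing, not both

# Dropout risk falls sharply as template quantity rises, which is why
# low-DNA extracts require repeated genotyping and high-DNA extracts do not.
for copies in (0.5, 2, 10, 50):
    print(copies, round(allelic_dropout_prob(copies), 4))
```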

  6. Quantitative analysis of rib kinematics based on dynamic chest bone images: preliminary results.

    PubMed

    Tanaka, Rie; Sanada, Shigeru; Sakuta, Keita; Kawashima, Hiroki

    2015-04-01

    An image-processing technique for separating bones from soft tissue in static chest radiographs has been developed. The present study was performed to evaluate the usefulness of dynamic bone images in quantitative analysis of rib movement. Dynamic chest radiographs of 16 patients were obtained using a dynamic flat-panel detector and processed to create bone images by using commercial software (Clear Read BS, Riverain Technologies). Velocity vectors were measured in local areas on the dynamic images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as a reduced rib velocity field, resulting in an asymmetrical distribution of rib movement. Vector maps in all normal cases exhibited left/right symmetric distributions of the velocity field, whereas those in abnormal cases showed asymmetric distributions because of locally limited rib movements. Dynamic bone images were useful for accurate quantitative analysis of rib movements. The present method has a potential for an additional functional examination in chest radiography.
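Local velocity vectors of the kind mapped here are commonly obtained by block matching between consecutive frames; the commercial software's actual algorithm is not described, so the following is a generic sketch on a tiny synthetic image pair:

```python
def block_displacement(frame1, frame2, y, x, size=3, search=2):
    """Estimate local displacement of a block between two frames by minimizing
    the sum of squared differences (SSD) within a small search window."""
    def block(f, yy, xx):
        return [f[yy + i][xx + j] for i in range(size) for j in range(size)]
    ref = block(frame1, y, x)
    best, best_ssd = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            cand = block(frame2, y + dy, x + dx)
            ssd = sum((a - b) ** 2 for a, b in zip(ref, cand))
            if ssd < best_ssd:
                best_ssd, best = ssd, (dy, dx)
    return best  # (dy, dx) in pixels; divide by frame interval for velocity

# Tiny synthetic example: a short bright segment (a "rib edge") moves down.
f1 = [[0] * 8 for _ in range(8)]
f2 = [[0] * 8 for _ in range(8)]
for x in range(2, 5):
    f1[3][x] = 100  # segment in frame 1
    f2[4][x] = 100  # same segment one pixel lower in frame 2
print(block_displacement(f1, f2, 2, 2))  # -> (1, 0): one pixel downward
```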

  7. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  8. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  9. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse), improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  10. Accurate, quantitative assays for the hydrolysis of soluble type I, II, and III ³H-acetylated collagens by bacterial and tissue collagenases

    SciTech Connect

    Mallya, S.K.; Mookhtiar, K.A.; Van Wart, H.E.

    1986-11-01

Accurate and quantitative assays for the hydrolysis of soluble ³H-acetylated rat tendon type I, bovine cartilage type II, and human amnion type III collagens by both bacterial and tissue collagenases have been developed. The assays are carried out at any temperature in the 1-30 °C range in a single reaction tube and the progress of the reaction is monitored by withdrawing aliquots as a function of time, quenching with 1,10-phenanthroline, and quantitation of the concentration of hydrolysis fragments. The latter is achieved by selective denaturation of these fragments by incubation under conditions described in the previous paper of this issue. The assays give percentages of hydrolysis of all three collagen types by neutrophil collagenase that agree well with the results of gel electrophoresis experiments. The initial rates of hydrolysis of all three collagens are proportional to the concentration of both neutrophil or Clostridial collagenases over a 10-fold range of enzyme concentrations. All three assays can be carried out at collagen concentrations that range from 0.06 to 2 mg/ml and give linear double reciprocal plots for both tissue and bacterial collagenases that can be used to evaluate the kinetic parameters Km and kcat or Vmax. The assay developed for the hydrolysis of rat type I collagen by neutrophil collagenase is shown to be more sensitive by at least one order of magnitude than comparable assays that use rat type I collagen fibrils or gels as substrate.
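The double-reciprocal (Lineweaver-Burk) evaluation of Km and Vmax mentioned above can be sketched as follows. The substrate range mirrors the reported 0.06-2 mg/ml, but the rate data are synthetic, generated from assumed kinetic parameters:

```python
def lineweaver_burk_fit(substrate, rates):
    """Fit 1/v = (Km/Vmax)(1/[S]) + 1/Vmax by ordinary least squares
    and return (Km, Vmax)."""
    xs = [1.0 / s for s in substrate]
    ys = [1.0 / v for v in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vmax = 1.0 / intercept
    km = slope * vmax
    return km, vmax

# Synthetic Michaelis-Menten data with Km = 0.5 mg/ml and Vmax = 10:
S = [0.06, 0.125, 0.25, 0.5, 1.0, 2.0]
v = [10 * s / (0.5 + s) for s in S]
km, vmax = lineweaver_burk_fit(S, v)
print(round(km, 3), round(vmax, 3))  # -> 0.5 10.0
```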

  11. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    SciTech Connect

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; Parks, Jerry M.

    2016-07-08

Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of

  12. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    DOE PAGES

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; ...

    2016-07-08

Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding, leading to substantial errors in predicted

  13. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid-Base and Ligand Binding Equilibria of Aquacobalamin.

    PubMed

    Johnston, Ryne C; Zhou, Jing; Smith, Jeremy C; Parks, Jerry M

    2016-08-04

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co-ligand binding equilibrium constants (Kon/off), pKas, and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. 
Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co-axial ligand binding, leading to substantial errors in predicted pKas and
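The quantities benchmarked above (reduction potentials, pKas) follow from computed free energies via standard thermochemical relations. A sketch of those conversions at 298.15 K; the free-energy inputs are hypothetical numbers for illustration, not values from the paper:

```python
import math

# Physical constants (SI) and temperature.
F = 96485.33   # Faraday constant, C/mol
R = 8.31446    # gas constant, J/(mol K)
T = 298.15     # K

def reduction_potential(delta_g_kj_per_mol, n_electrons=1):
    """E = -dG/(nF), with dG the free energy of the reduction half-reaction."""
    return -delta_g_kj_per_mol * 1000.0 / (n_electrons * F)

def pka(delta_g_deprot_kj_per_mol):
    """pKa = dG_deprotonation / (RT ln 10)."""
    return delta_g_deprot_kj_per_mol * 1000.0 / (R * T * math.log(10))

# A 1-electron reduction with dG = -38.6 kJ/mol gives E ~ +0.40 V, and a
# deprotonation free energy of 28.5 kJ/mol gives pKa ~ 5.0.
print(round(reduction_potential(-38.6), 2), round(pka(28.5), 1))
```

Note that the 80 mV and 2.0 log-unit RMS errors quoted above correspond to free-energy errors of only a few kJ/mol through these same relations.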

  14. Can a quantitative simulation of an Otto engine be accurately rendered by a simple Novikov model with heat leak?

    NASA Astrophysics Data System (ADS)

    Fischer, A.; Hoffmann, K.-H.

    2004-03-01

In this case study a complex Otto engine simulation provides data including, but not limited to, effects from losses due to heat conduction, exhaust losses and frictional losses. These data are used as a benchmark to test whether the Novikov engine with heat leak, a simple endoreversible model, can reproduce the complex engine behavior quantitatively through an appropriate choice of model parameters. The reproduction obtained proves to be of high quality.
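A Novikov engine with heat leak can be written down in a few lines: the endoreversible part operates at the familiar maximum-power (Curzon-Ahlborn/Novikov) efficiency, while the leak carries heat directly from the hot to the cold reservoir. The temperatures and heat flows below are illustrative, not the simulation's fitted parameters:

```python
import math

def novikov_efficiency(t_hot, t_cold):
    """Efficiency of a Novikov engine at maximum power: 1 - sqrt(Tc/Th)."""
    return 1.0 - math.sqrt(t_cold / t_hot)

def overall_efficiency(t_hot, t_cold, q_in, q_leak):
    """Effective efficiency when a heat leak q_leak bypasses the engine:
    only (q_in - q_leak) is converted at the Novikov efficiency."""
    work = novikov_efficiency(t_hot, t_cold) * (q_in - q_leak)
    return work / q_in

# Hypothetical numbers: gas at 2000 K, ambient at 300 K, 20% of the input
# heat lost through the cylinder walls.
print(round(novikov_efficiency(2000, 300), 3))           # -> 0.613
print(round(overall_efficiency(2000, 300, 100, 20), 3))  # -> 0.49
```

The heat-leak term is what lets this one-parameter-richer model mimic the lower effective efficiencies of a realistic engine simulation.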

  15. Calibration of qualitative HBsAg assay results for quantitative HBsAg monitoring.

    PubMed

    Gunning, Hans; Adachi, Dena; Tang, Julian W

    2014-10-01

Evidence is accumulating that quantitative hepatitis B surface antigen (HBsAg) monitoring may be useful in managing patients with chronic HBV infection on certain treatment regimens. Based on results obtained with the Abbott Architect qualitative and quantitative HBsAg assays, it appears feasible to convert qualitative to quantitative HBsAg values for this purpose.
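One plausible way to implement such a conversion is a log-log calibration fitted to paired qualitative signal-to-cutoff (S/CO) and quantitative IU/ml results. This is a generic sketch with invented paired data, not the calibration actually derived in the study:

```python
import math

def fit_loglog(x, y):
    """Least-squares fit of log10(y) = a + b*log10(x); returns (a, b)."""
    lx = [math.log10(v) for v in x]
    ly = [math.log10(v) for v in y]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((p - mx) * (q - my) for p, q in zip(lx, ly))
         / sum((p - mx) ** 2 for p in lx))
    return my - b * mx, b

def predict(a, b, sco):
    """Convert a qualitative S/CO value to an estimated quantitative IU/ml."""
    return 10 ** (a + b * math.log10(sco))

# Hypothetical paired results: qualitative S/CO vs quantitative IU/ml.
sco = [50, 500, 2000, 6000]
iuml = [10, 100, 400, 1200]
a, b = fit_loglog(sco, iuml)
print(round(predict(a, b, 1000), 1))  # -> 200.0
```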

  16. African Primary Care Research: quantitative analysis and presentation of results.

    PubMed

    Mash, Bob; Ogunbanjo, Gboyega A

    2014-06-06

    This article is part of a series on Primary Care Research Methods. The article describes types of continuous and categorical data, how to capture data in a spreadsheet, how to use descriptive and inferential statistics and, finally, gives advice on how to present the results in text, figures and tables. The article intends to help Master's level students with writing the data analysis section of their research proposal and presenting their results in their final research report.

  17. Quantitative results of stellar evolution and pulsation theories.

    NASA Technical Reports Server (NTRS)

    Fricke, K.; Stobie, R. S.; Strittmatter, P. A.

    1971-01-01

    The discrepancy between the masses of Cepheid variables deduced from evolution theory and pulsation theory is examined. The effect of input physics on evolutionary tracks is first discussed; in particular, changes in the opacity are considered. The sensitivity of pulsation masses to opacity changes and to the ascribed values of luminosity and effective temperature are then analyzed. The Cepheid mass discrepancy is discussed in the light of the results already obtained. Other astronomical evidence, including the mass-luminosity relation for main sequence stars, the solar neutrino flux, and cluster ages are also considered in an attempt to determine the most likely source of error in the event that substantial mass loss has not occurred.

  18. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

Background Normalization to reference genes, or housekeeping genes, can yield more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants, and qPCR studies on important crops such as cotton have therefore been hampered by the lack of suitable reference genes. Results Using two distinct algorithms, implemented in geNorm and NormFinder, we assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development and flower verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene
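The geNorm stability measure M used in such evaluations has a compact definition: for each candidate gene it averages, over all other candidates, the standard deviation of the pairwise log2 expression ratios across samples; lower M means more stable. A toy illustration (gene names taken from the study, expression values invented):

```python
import math

def genorm_m(expression):
    """geNorm stability measure M for each candidate reference gene.
    `expression` maps gene name -> list of relative quantities per sample."""
    genes = list(expression)
    m = {}
    for g in genes:
        sds = []
        for h in genes:
            if h == g:
                continue
            ratios = [math.log2(a / b)
                      for a, b in zip(expression[g], expression[h])]
            mean = sum(ratios) / len(ratios)
            var = sum((r - mean) ** 2 for r in ratios) / (len(ratios) - 1)
            sds.append(math.sqrt(var))
        m[g] = sum(sds) / len(sds)  # average SD of pairwise log2 ratios
    return m

# Invented data for three candidates over four samples: GhUBQ14 and GhPP2A1
# covary (constant ratio), while the third candidate fluctuates.
expr = {
    "GhUBQ14": [1.0, 2.0, 1.5, 1.2],
    "GhPP2A1": [1.1, 2.2, 1.65, 1.32],
    "GhACT4":  [1.0, 0.5, 3.0, 1.0],
}
m = genorm_m(expr)
print({g: round(v, 3) for g, v in m.items()})  # stable pair gets the lowest M
```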

  19. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait

    PubMed Central

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y.; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A.; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the ability to detect QTLs. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, the Maillard-type reaction between proline and reducing carbohydrates during thermal processing produces a roasted, popcorn-like aroma. Hence, for the first time, we included the amino acid proline, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation in rice scent. Consequently, two QTLs were traced on chromosomes 4 and 8. They explained from 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped in the QTL on the 8th chromosome in the interval RM223-SCU015RM (1.63 cM). These loci supported previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous studies have simultaneously assessed the relationship among 2AP, proline and fragrance QTLs. Therefore, our findings can help further our understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689
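The benefit of including a covariate such as proline can be illustrated with ordinary least squares: adjusting the phenotype for the covariate before testing a marker removes covariate-driven residual variance, so the locus explains a larger share of what remains. All numbers below are synthetic, constructed so the phenotype depends on both a genotype code and a covariate:

```python
def simple_ols(x, y):
    """Intercept and slope of y = a + b*x by least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((p - mx) * (q - my) for p, q in zip(x, y))
         / sum((p - mx) ** 2 for p in x))
    return my - b * mx, b

def residuals(x, y):
    a, b = simple_ols(x, y)
    return [q - (a + b * p) for p, q in zip(x, y)]

def r2(x, y):
    a, b = simple_ols(x, y)
    my = sum(y) / len(y)
    ss_tot = sum((q - my) ** 2 for q in y)
    ss_res = sum((q - (a + b * p)) ** 2 for p, q in zip(x, y))
    return 1 - ss_res / ss_tot

# Synthetic data: phenotype = marker effect + covariate effect.
marker = [0, 0, 1, 1, 2, 2, 0, 1, 2, 1]                          # genotype codes
covariate = [3.0, 5.0, 2.0, 6.0, 4.0, 1.0, 4.5, 3.5, 2.5, 5.5]   # e.g. proline
phenotype = [2.0 * g + 1.5 * c for g, c in zip(marker, covariate)]

print(round(r2(marker, phenotype), 3))                        # marker alone
print(round(r2(marker, residuals(covariate, phenotype)), 3))  # after adjustment
```

The second R² is far higher: the marker's signal was there all along, but the covariate's contribution masked it.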

  20. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of subsequent landslide susceptibility maps, with particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors differed in appearance, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful to expose spatially varying inconsistencies of the modelling results while additionally providing evidence for slightly overfitted machine learning-based models.
However, the highest predictive performances were obtained for
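The AUROC statistic used in the quantitative validation has a simple rank interpretation: it is the probability that a randomly chosen landslide location receives a higher susceptibility score than a randomly chosen stable location. A minimal sketch of that identity with invented scores:

```python
def auroc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    P(random positive scores above random negative), ties counting half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Invented susceptibility scores: landslide (positive) vs stable (negative) cells.
pos = [0.9, 0.8, 0.75, 0.6, 0.4]
neg = [0.7, 0.5, 0.45, 0.3, 0.2, 0.1]
print(round(auroc(pos, neg), 3))  # -> 0.867
```

Because AUROC only measures ranking, two maps with very different spatial patterns can score identically, which is exactly the discrepancy the study explores.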

  1. Recent Results on the Accurate Measurements of the Dielectric Constant of Seawater at 1.413 GHz

    NASA Technical Reports Server (NTRS)

    Lang, R.H.; Tarkocin, Y.; Utku, C.; Le Vine, D.M.

    2008-01-01

Measurements of the complex dielectric constant of seawater at 30.00 psu, 35.00 psu and 38.27 psu over the temperature range from 5 °C to 35 °C at 1.413 GHz are given and compared with the Klein-Swift results. A resonant cavity technique is used. The calibration constant used in the cavity perturbation formulas is determined experimentally using methanol and ethanediol (ethylene glycol) as reference liquids. Analysis of the data shows that the measurements are accurate to better than 1.0% in almost all cases studied.

  2. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  3. Can MRI accurately detect pilon articular malreduction? A quantitative comparison between CT and 3T MRI bone models

    PubMed Central

    Radzi, Shairah; Dlaska, Constantin Edmond; Cowin, Gary; Robinson, Mark; Pratap, Jit; Schuetz, Michael Andreas; Mishra, Sanjay

    2016-01-01

Background Pilon fracture reduction is a challenging surgery. Radiographs are commonly used to assess the quality of reduction, but are limited in revealing the remaining bone incongruities. The study aimed to develop a method for quantifying articular malreductions using 3D computed tomography (CT) and magnetic resonance imaging (MRI) models. Methods CT and MRI data were acquired using three pairs of human cadaveric ankle specimens. Common tibial pilon fractures were simulated by performing osteotomies to the ankle specimens. Five of the created fractures [three AO type-B (43-B1), and two AO type-C (43-C1) fractures] were then reduced and stabilised using titanium implants, then rescanned. All datasets were reconstructed into CT and MRI models, and were analysed in regards to intra-articular steps and gaps, surface deviations, malrotations and maltranslations of the bone fragments. Results Initial results reveal that type B fracture CT and MRI models differed by ~0.2 (step), ~0.18 (surface deviations), ~0.56° (rotation) and ~0.4 mm (translation). Type C fracture MRI models showed metal artefacts extending to the articular surface, making them unsuitable for analysis. Type C fracture CT models differed from their CT and MRI contralateral models by ~0.15 (surface deviation), ~1.63° (rotation) and ~0.4 mm (translation). Conclusions Type B fracture MRI models were comparable to CT and may potentially be used for the postoperative assessment of articular reduction on a case-to-case basis. PMID:28090442
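The surface-deviation comparison between CT and MRI bone models can be sketched as a nearest-neighbour distance between point clouds; this is a simplified one-sided variant for illustration, since the study's actual analysis software is not specified:

```python
import math

def mean_surface_deviation(points_a, points_b):
    """Mean distance from each point in cloud A to its nearest neighbour in
    cloud B (a simple one-sided surface-deviation measure)."""
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    total = 0.0
    for p in points_a:
        total += min(dist(p, q) for q in points_b)
    return total / len(points_a)

# Toy example: cloud B is cloud A shifted by 0.2 (e.g. mm) along x.
a = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
b = [(x + 0.2, y, z) for x, y, z in a]
print(round(mean_surface_deviation(a, b), 3))  # -> 0.2
```

Real bone models contain many thousands of vertices, so production code would use a spatial index (e.g. a k-d tree) rather than this brute-force minimum.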

  4. Experimental demonstration of quantitation errors in MR spectroscopy resulting from saturation corrections under changing conditions

    NASA Astrophysics Data System (ADS)

    Galbán, Craig J.; Ellis, Scott J.; Spencer, Richard G. S.

    2003-04-01

    Metabolite concentration measurements in in vivo NMR are generally performed under partially saturated conditions, with correction for partial saturation performed after data collection using a measured saturation factor. Here, we present an experimental test of the hypothesis that quantitation errors can occur due to the application of such saturation factor corrections in changing systems, extending our previous theoretical work on quantitation errors due to varying saturation factors. We obtained results for two systems frequently studied by 31P NMR: the ischemic rat heart and the electrically stimulated rat gastrocnemius muscle. The results are interpreted in light of previous theoretical work which defined the degree of saturation occurring in a one-pulse experiment for a system with given spin-lattice relaxation times (T1s), equilibrium magnetizations (M0s), and reaction rates. We found that (i) the assumption of constancy of saturation factors leads to quantitation errors on the order of 40% in inorganic phosphate; (ii) the dominant contributor to the quantitation errors in inorganic phosphate is most likely changes in T1; (iii) T1 and M0 changes between control and intervention periods, and chemical exchange, contribute to different extents to quantitation errors in phosphocreatine and γ-ATP; (iv) relatively small increases in interpulse delay substantially decreased quantitation errors for metabolites in ischemic rat hearts; and (v) random error due to finite SNR led to approximately 4% error in quantitation, and hence was a substantially smaller contributor than changes in saturation factors.
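The mechanism behind finding (ii) can be illustrated with the standard one-pulse saturation factor, 1 − exp(−TR/T1), for a fully dephased 90° acquisition. The T1 and TR values below are illustrative, not the paper's measured values:

```python
import math

def saturation_factor(tr: float, t1: float) -> float:
    """Fraction of equilibrium magnetization recovered between 90-degree
    pulses with repetition time tr (simple one-pulse model)."""
    return 1.0 - math.exp(-tr / t1)

# Control period: T1 = 4.0 s, TR = 2.0 s; true M0 = 100 (arbitrary units)
tr, m0_true = 2.0, 100.0
obs_control = m0_true * saturation_factor(tr, t1=4.0)

# Intervention: T1 shortens to 2.0 s, but the control-period saturation
# factor is (wrongly) reused to correct the observed signal.
obs_interv = m0_true * saturation_factor(tr, t1=2.0)
m0_est = obs_interv / saturation_factor(tr, t1=4.0)   # stale correction

error_pct = 100.0 * (m0_est - m0_true) / m0_true
print(f"quantitation error: {error_pct:.1f}%")  # quantitation error: 60.7%
```

A change in T1 with an unchanged correction factor thus propagates directly into the estimated concentration, exactly the failure mode the experiment probes.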

  5. Visual Mapping of Sedimentary Facies Can Yield Accurate And Geomorphically Meaningful Results at Morphological Unit to River Segment Scales

    NASA Astrophysics Data System (ADS)

    Pasternack, G. B.; Wyrick, J. R.; Jackson, J. R.

    2014-12-01

    Long practiced in fisheries, visual substrate mapping of coarse-bedded rivers is eschewed by geomorphologists for its inaccuracy and limited sizing data. Geomorphologists instead perform time-consuming measurements of surficial grains at a few locations, precluding spatially explicit mapping and analysis of sediment facies. Remote sensing works for bare land, but not for vegetated or subaqueous sediments. Because visual systems apply the log2 Wentworth scale, designed for sieving, they suffer from the human inability to readily discern those classes. We hypothesized that size classes centered on the PDF of the anticipated sediment size distribution would enable field crews to accurately (i) identify the presence/absence of each class in a facies patch and (ii) estimate the relative amount of each class to within 10%. We first tested 6 people using 14 measured samples with different mixtures. Next, we carried out facies mapping for ~37 km of the lower Yuba River in California. Finally, we tested the resulting data to see if it produced statistically significant hydraulic-sedimentary-geomorphic results. Presence/absence performance error was 0-4% for four people, 13% for one person, and 33% for one person. The last person was excluded from further effort. For abundance estimation, performance error was 1% for one person, 7-12% for three people, and 33% for one person. This last person was further trained and re-tested. We found that the samples easiest to visually quantify were unimodal and bimodal, while the most difficult had nearly equal amounts of each size. This confirms psychological studies showing that humans have a more difficult time quantifying abundances of subgroups when confronted with well-mixed groups. In the Yuba, mean grain size decreased downstream, as is typical for an alluvial river. When averaged by reach, mean grain size and bed slope were correlated with an r2 of 0.95. 
At the morphological unit (MU) scale, eight in-channel bed MU types had an r2 of 0.90 between mean

  6. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    SciTech Connect

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-08-19

    Highlights: • Mitochondrial dysfunction is central to many diseases of oxidative stress. • 95% of the mitochondrial genome is duplicated in the nuclear genome. • Dilution of untreated genomic DNA leads to dilution bias. • Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference between the mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
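The Mt/N ratio itself is conventionally computed from qPCR threshold cycles via the ΔCt relation. This is a generic sketch assuming 100% amplification efficiency (base 2) and hypothetical Ct values, not the authors' exact protocol:

```python
def mt_to_n_ratio(ct_mt: float, ct_nuc: float,
                  eff_mt: float = 2.0, eff_nuc: float = 2.0) -> float:
    """Mitochondrial-to-nuclear DNA ratio from qPCR threshold cycles.
    Template copy number is proportional to efficiency ** (-Ct)."""
    return (eff_mt ** -ct_mt) / (eff_nuc ** -ct_nuc)

# Hypothetical run: the mitochondrial target crosses threshold 10 cycles
# earlier than the single-copy nuclear target -> 2**10 copies per nuclear copy.
print(mt_to_n_ratio(ct_mt=15.0, ct_nuc=25.0))  # 1024.0
```

The paper's point is that this arithmetic is only meaningful when the primers are pseudogene-free and the template has been pretreated to remove dilution bias.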

  7. Can Community Health Workers Report Accurately on Births and Deaths? Results of Field Assessments in Ethiopia, Malawi and Mali

    PubMed Central

    Silva, Romesh; Amouzou, Agbessi; Munos, Melinda; Marsh, Andrew; Hazel, Elizabeth; Victora, Cesar; Black, Robert; Bryce, Jennifer

    2016-01-01

    Introduction: Most low-income countries lack complete and accurate vital registration systems. As a result, measures of under-five mortality rates rely mostly on household surveys. In collaboration with partners in Ethiopia, Ghana, Malawi, and Mali, we assessed the completeness and accuracy of reporting of births and deaths by community-based health workers, and the accuracy of annualized under-five mortality rate estimates derived from these data. Here we report on results from Ethiopia, Malawi and Mali. Methods: In all three countries, community health workers (CHWs) were trained, equipped and supported to report pregnancies, births and deaths within defined geographic areas over a period of at least fifteen months. In-country institutions collected these data every month. At each study site, we administered a full birth history (FBH) or full pregnancy history (FPH) to women of reproductive age, via a census of households in Mali and via household surveys in Ethiopia and Malawi. Using these FBHs/FPHs as a validation data source, we assessed the completeness of the counts of births and deaths and the accuracy of under-five, infant, and neonatal mortality rates from the community-based method against the retrospective FBH/FPH for rolling twelve-month periods. For each method we calculated total cost, average annual cost per 1,000 population, and average cost per vital event reported. Results: On average, CHWs submitted monthly vital event reports for over 95 percent of catchment areas in Ethiopia and Malawi, and for 100 percent of catchment areas in Mali. The completeness of vital events reporting by CHWs varied: we estimated that 30%-90% of annualized expected births (i.e. the number of births estimated using a FPH) were documented by CHWs, and 22%-91% of annualized expected under-five deaths were documented by CHWs. Resulting annualized under-five mortality rates based on the CHW vital events reporting were, on average, under-estimated by 28% in Ethiopia, 32% in

  8. Detection and quantitation of trace phenolphthalein (in pharmaceutical preparations and in forensic exhibits) by liquid chromatography-tandem mass spectrometry, a sensitive and accurate method.

    PubMed

    Sharma, Kakali; Sharma, Shiba P; Lahiri, Sujit C

    2013-01-01

    Phenolphthalein, an acid-base indicator and laxative, is important as a constituent of widely used weight-reducing multicomponent food formulations. Phenolphthalein is a useful reagent in forensic science for the identification of blood stains of suspected victims and for apprehending erring officials accepting bribes in graft or trap cases. The pink-colored alkaline hand washes originating from phenolphthalein-smeared notes can easily be determined spectrophotometrically. But in many cases the colored solution turns colorless with time, which renders the genuineness of bribe cases doubtful to the judiciary. Until now, no method was known for the detection and identification of phenolphthalein in colorless forensic exhibits with positive proof. Liquid chromatography-tandem mass spectrometry was found to be a highly sensitive and accurate method capable of the detection and quantitation of trace phenolphthalein in commercial formulations and colorless forensic exhibits with positive proof. The detection limit of phenolphthalein was found to be 1.66 pg/μL (equivalently, ng/mL), and the calibration curve shows good linearity (r(2) = 0.9974).
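The reported calibration linearity (r² = 0.9974) corresponds to an ordinary least-squares fit of detector response against concentration. A minimal sketch with made-up calibration points follows; the ICH-style detection-limit formula 3.3·σ/slope is included as a common convention, not necessarily the procedure used in the paper:

```python
def linear_fit(x, y):
    """Ordinary least-squares line: returns (slope, intercept, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx, (sxy * sxy) / (sxx * syy)

def lod_ich(sigma_blank, slope):
    """ICH-style limit of detection: 3.3 * (SD of blank response) / slope."""
    return 3.3 * sigma_blank / slope

# Made-up calibration: concentration (pg/uL) vs. integrated peak area
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
area = [51.0, 99.0, 203.0, 398.0, 801.0]
slope, intercept, r2 = linear_fit(conc, area)
```

An r² this close to 1 over the working range is what justifies single-curve quantitation of unknowns.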

  9. Allele Specific Locked Nucleic Acid Quantitative PCR (ASLNAqPCR): An Accurate and Cost-Effective Assay to Diagnose and Quantify KRAS and BRAF Mutation

    PubMed Central

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot spot mutations of the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild-type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele specific primers and LNA-modified beacon probes to increase sensitivity and specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases, all of which had a mutated/wild-type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific, with greater accuracy and positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective and can easily be adapted to detect hot spot mutations in other oncogenes. PMID:22558339
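The 0.1% analytical sensitivity quoted above can be related to qPCR cycle differences through the usual ΔCt relation. This is a generic sketch assuming 100% amplification efficiency, not the assay's published quantification model:

```python
def mutant_fraction(ct_mutant: float, ct_total: float, eff: float = 2.0) -> float:
    """Fraction of mutant template relative to total DNA, from the Ct of the
    allele-specific reaction and a reference reaction on total DNA."""
    return eff ** (ct_total - ct_mutant)

# A mutant allele crossing threshold ~10 cycles after the total-DNA
# reference corresponds to roughly a 0.1% mutant fraction (2**-10).
frac = mutant_fraction(ct_mutant=35.0, ct_total=25.0)
```

In this idealized view, each extra cycle of delay halves the inferred mutant fraction, which is why allele-specific qPCR can quantify clones far below the ~20% sensitivity floor of Sanger sequencing.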

  10. Quantitative fuel motion determination with the CABRI fast neutron hodoscope; Evaluation methods and results

    SciTech Connect

    Baumung, K. ); Augier, G. )

    1991-12-01

    The fast neutron hodoscope installed at the CABRI reactor in Cadarache, France, is employed to provide quantitative fuel motion data during experiments in which single liquid-metal fast breeder reactor test pins are subjected to simulated accident conditions. Instrument design and performance are reviewed, the methods for the quantitative evaluation are presented, and error sources are discussed. The most important findings are the axial expansion as a function of time, phenomena related to pin failure (such as time, location, pin failure mode, and fuel mass ejected after failure), and linear fuel mass distributions with a 2-cm axial resolution. In this paper the hodoscope results of the CABRI-1 program are summarized.

  11. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  12. Duplicate portion sampling combined with spectrophotometric analysis affords the most accurate results when assessing daily dietary phosphorus intake.

    PubMed

    Navarro-Alarcon, Miguel; Zambrano, Esmeralda; Moreno-Montoro, Miriam; Agil, Ahmad; Olalla, Manuel

    2012-08-01

    The assessment of daily dietary phosphorus (P) intake is a major concern in human nutrition because of its relationship with Ca and Mg metabolism and osteoporosis. Within this context, we hypothesized that several of the methods available for the assessment of daily dietary P intake are equally accurate and reliable, although few studies have been conducted to confirm this. The aim of this study was therefore to evaluate daily dietary P intake by 3 methods: duplicate portion sampling of 108 hospital meals, combined either with spectrophotometric analysis or with the use of food composition tables, and 24-hour dietary recall for 3 consecutive days plus the use of food composition tables. The mean daily dietary P intakes found were 1106 ± 221, 1480 ± 221, and 1515 ± 223 mg/d, respectively. Daily dietary P intake determined by spectrophotometric analysis was significantly lower (P < .001) and closer to dietary reference intakes for adolescents aged 14 to 18 years (88.5%) and adult subjects (158.1%) compared with the other 2 methods. Duplicate portion sampling with P analysis takes into account the influence of technological and cooking processes on the P content of foods and meals and therefore afforded the most accurate and reliable daily dietary P intakes. The use of food composition tables overestimated daily dietary P intake. No adverse effects in relation to P nutrition (deficiencies or toxic effects) were encountered.

  13. A rapid and accurate method for the quantitative estimation of natural polysaccharides and their fractions using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detector.

    PubMed

    Cheong, Kit-Leong; Wu, Ding-Tao; Zhao, Jing; Li, Shao-Ping

    2015-06-26

    In this study, a rapid and accurate method for the quantitative analysis of natural polysaccharides and their different fractions was developed. First, high performance size exclusion chromatography (HPSEC) was utilized to separate the natural polysaccharides. The molecular masses of their fractions were then determined by multi-angle laser light scattering (MALLS). Finally, quantification of the polysaccharides or their fractions was performed based on their response to a refractive index detector (RID) and their universal refractive index increment (dn/dc). The accuracy of the developed method for the quantification of individual and mixed polysaccharide standards, including konjac glucomannan, CM-arabinan, xyloglucan, larch arabinogalactan, oat β-glucan, dextran (410, 270, and 25 kDa), mixed xyloglucan and CM-arabinan, and mixed dextran 270 K and CM-arabinan, was determined, and their average recoveries were between 90.6% and 98.3%. The limits of detection (LOD) and quantification (LOQ) ranged from 10.68 to 20.25 μg/mL and from 42.70 to 68.85 μg/mL, respectively. Compared with the conventional phenol-sulfuric acid assay and HPSEC coupled with evaporative light scattering detection (HPSEC-ELSD), the developed HPSEC-MALLS-RID method based on a universal dn/dc for the quantification of polysaccharides and their fractions is simpler, more rapid, and more accurate, with no need for individual polysaccharide standards or calibration curves. The developed method was also successfully utilized for quantitative analysis of polysaccharides and their different fractions from three medicinal plants of the Panax genus: Panax ginseng, Panax notoginseng and Panax quinquefolius. The results suggested that the HPSEC-MALLS-RID method based on a universal dn/dc could be used as a routine technique for the quantification of polysaccharides and their fractions in natural resources.
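The RID quantification step rests on the flow-cell concentration being ΔRI/(dn/dc), so integrating a baseline-corrected peak gives eluted mass without a per-analyte standard. The sketch below uses a hypothetical detector calibration constant `k_rid` and illustrative units; it is an illustration of the principle, not the authors' software:

```python
def peak_mass_mg(ri_signal, dn_dc, k_rid, flow_ml_min, dt_s):
    """Integrate a baseline-corrected RID peak to eluted mass (mg).

    ri_signal    -- detector readings sampled every dt_s seconds
    dn_dc        -- refractive index increment (here taken in mL/mg)
    k_rid        -- hypothetical calibration: delta-RI per detector unit
    flow_ml_min  -- pump flow rate (mL/min)
    """
    mass = 0.0
    for v in ri_signal:
        conc_mg_per_ml = (k_rid * v) / dn_dc   # c = delta-RI / (dn/dc)
        mass += conc_mg_per_ml * flow_ml_min * dt_s / 60.0
    return mass
```

Because dn/dc is nearly universal across polysaccharides, one constant serves every fraction, which is the basis for the method's calibration-free claim.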

  14. Accrual Patterns for Clinical Studies Involving Quantitative Imaging: Results of an NCI Quantitative Imaging Network (QIN) Survey

    PubMed Central

    Kurland, Brenda F.; Aggarwal, Sameer; Yankeelov, Thomas E.; Gerstner, Elizabeth R.; Mountz, James M.; Linden, Hannah M.; Jones, Ella F.; Bodeker, Kellie L.; Buatti, John M.

    2017-01-01

    Patient accrual is essential for the success of oncology clinical trials. Recruitment for trials involving the development of quantitative imaging biomarkers may face different challenges than treatment trials. This study surveyed investigators and study personnel for evaluating accrual performance and perceived barriers to accrual and for soliciting solutions to these accrual challenges that are specific to quantitative imaging-based trials. Responses for 25 prospective studies were received from 12 sites. The median percent annual accrual attained was 94.5% (range, 3%–350%). The most commonly selected barrier to recruitment (n = 11/25, 44%) was that “patients decline participation,” followed by “too few eligible patients” (n = 10/25, 40%). In a forced choice for the single greatest recruitment challenge, “too few eligible patients” was the most common response (n = 8/25, 32%). Quantitative analysis and qualitative responses suggested that interactions among institutional, physician, and patient factors contributed to accrual success and challenges. Multidisciplinary collaboration in trial design and execution is essential to accrual success, with attention paid to ensuring and communicating potential trial benefits to enrolled and future patients. PMID:28127586

  15. DSS1/DSS2 astrometry for 1101 First Byurakan Survey blue stellar objects: Accurate positions and other results

    NASA Astrophysics Data System (ADS)

    Mickaelian, A. M.

    2004-10-01

    Accurate measurements of the positions of 1101 First Byurakan Survey (FBS) blue stellar objects (the Second part of the FBS) have been carried out on the DSS1 and DSS2 (red and blue images). To establish the accuracy of the DSS1 and DSS2, measurements have been made for 153 AGN for which absolute VLBI coordinates have been published. The rms errors are: 0.45 arcsec for DSS1, 0.33 arcsec for DSS2 red, and 0.59 arcsec for DSS2 blue in each coordinate, the corresponding total positional errors being 0.64 arcsec, 0.46 arcsec, and 0.83 arcsec, respectively. The highest accuracy (0.42 arcsec) is obtained by weighted averaging of the DSS1 and DSS2 red positions. It is shown that by using all three DSS images accidental errors can be significantly reduced. The comparison of DSS2 and DSS1 images made it possible to reveal positional differences and proper motions for 78 objects (for 62 of these for the first time), including new high-probability candidate white dwarfs, and to find objects showing strong variability, i.e. high-probability candidate cataclysmic variables. Table 1 is only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/426/367
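The "weighted averaging" of DSS1 and DSS2 red positions can be sketched with the textbook inverse-variance formula. Note that with the quoted per-coordinate errors (0.45 and 0.33 arcsec) the idealized independent-error result comes out tighter than the paper's empirical 0.42 arcsec, since real plate-measurement errors are partially correlated; the coordinate values below are made up:

```python
def weighted_mean(values, sigmas):
    """Inverse-variance weighted mean and its formal 1-sigma error,
    assuming independent Gaussian errors."""
    weights = [1.0 / s ** 2 for s in sigmas]
    total = sum(weights)
    mean = sum(v * w for v, w in zip(values, weights)) / total
    return mean, (1.0 / total) ** 0.5

# Combine one coordinate measured on DSS1 (0.45") and DSS2 red (0.33")
pos, sigma = weighted_mean([10.00, 10.10], [0.45, 0.33])
```

The more precise DSS2 red measurement dominates the combined position, and adding the third (DSS2 blue) plate shrinks the formal error further, consistent with the paper's finding that using all three DSS images reduces accidental errors.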

  16. A novel, integrated PET-guided MRS technique resulting in more accurate initial diagnosis of high-grade glioma.

    PubMed

    Kim, Ellen S; Satter, Martin; Reed, Marilyn; Fadell, Ronald; Kardan, Arash

    2016-06-01

    Glioblastoma multiforme (GBM) is the most common and lethal malignant glioma in adults. Currently, the modality of choice for diagnosing brain tumors is high-resolution magnetic resonance imaging (MRI) with contrast, which provides anatomic detail and localization. Studies have demonstrated, however, that MRI may have limited utility in delineating the full tumor extent precisely. Studies also suggest that MR spectroscopy (MRS) can be used to distinguish high-grade from low-grade gliomas. However, due to operator-dependent variables and the heterogeneous nature of gliomas, the potential for error in diagnostic accuracy with MRS is a concern. Positron emission tomography (PET) imaging with (11)C-methionine (MET) and (18)F-fluorodeoxyglucose (FDG) has been shown to add information with respect to tumor grade, extent, and prognosis, based on the premise that biochemical changes precede anatomic changes. Combined PET/MRS is a technique that integrates information from PET to guide the placement of the MRS voxel for the most accurate metabolic characterization of a lesion. We describe a case of glioblastoma multiforme in which MRS was initially non-diagnostic for malignancy but, when repeated with PET guidance, demonstrated an elevated choline/N-acetylaspartate (Cho/NAA) ratio in the right parietal mass consistent with a high-grade malignancy. Stereotactic biopsy, followed by PET image-guided resection, confirmed the diagnosis of grade IV GBM. To our knowledge, this is the first reported case of an integrated PET/MRS technique for the voxel placement of MRS. Our findings suggest that integrated PET/MRS may potentially improve diagnostic accuracy in high-grade gliomas.

  17. Validation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    PubMed

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria × ananassa Duch.) fruits is associated mainly with their sensorial characteristics and their content of antioxidant compounds. Nevertheless, strawberry production has been hampered by the crop's sensitivity to abiotic stresses. Understanding the molecular mechanisms underlying the stress response is therefore of great importance for enabling genetic engineering approaches aimed at improving strawberry tolerance. However, the study of gene expression in strawberry requires the use of suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and under two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper and the comparative delta-Ct method. The results indicate that expression stability is dependent on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was considered the most suitable for normalizing expression data in samples of strawberry cultivars and under drought stress, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic and salt stresses. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were the most unstable genes under all conditions. The expression of the phenylalanine ammonia lyase (PAL) and 9-cis-epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that the use of an inappropriate reference gene may induce erroneous results. This study is the first survey of the stability of reference genes in strawberry cultivars under osmotic stresses and provides guidelines for obtaining more accurate RT-qPCR results for future breeding efforts.
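Of the four algorithms RefFinder integrates, the comparative delta-Ct method is the simplest to sketch: a gene is ranked by the mean standard deviation of its ΔCt against every other candidate across samples, lower meaning more stable. The function and the toy Ct values below are illustrative, not data from the study:

```python
from itertools import combinations
from statistics import mean, stdev

def delta_ct_stability(ct):
    """Comparative delta-Ct stability score per gene: the mean SD of its
    Ct difference with every other candidate across samples."""
    sds = {g: [] for g in ct}
    for a, b in combinations(ct, 2):
        s = stdev([x - y for x, y in zip(ct[a], ct[b])])
        sds[a].append(s)
        sds[b].append(s)
    return {g: mean(v) for g, v in sds.items()}

# Toy Ct values over three samples: 'REF1' tracks 'REF2' closely,
# while 'VAR' drifts, so 'VAR' should rank least stable.
scores = delta_ct_stability({
    "REF1": [20.0, 21.0, 22.0],
    "REF2": [18.0, 19.0, 20.0],
    "VAR":  [20.0, 23.0, 26.0],
})
```

A gene whose ΔCt against the others is constant across conditions (SD near zero) is a safe normalizer; one whose ΔCt drifts, as GAPDH and 18S did here, is not.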

  18. Report: EPA Needs Accurate Data on Results of Pollution Prevention Grants to Maintain Program Integrity and Measure Effectiveness of Grants

    EPA Pesticide Factsheets

    Report #15-P-0276, September 4, 2015. Inaccurate reporting of results misrepresents the impacts of pollution prevention activities provided to the public, and misinforms EPA management on the effectiveness of its investment in the program.

  19. Critical appraisal of quantitative PCR results in colorectal cancer research: can we rely on published qPCR results?

    PubMed

    Dijkstra, J R; van Kempen, L C; Nagtegaal, I D; Bustin, S A

    2014-06-01

    The use of real-time quantitative polymerase chain reaction (qPCR) in cancer research has become ubiquitous. The relative simplicity of qPCR experiments, which deliver fast and cost-effective results, means that each year an increasing number of papers utilizing this technique are being published. But how reliable are the published results? Since the validity of gene expression data is greatly dependent on appropriate normalisation to compensate for sample-to-sample and run-to-run variation, we have evaluated the adequacy of normalisation procedures in qPCR-based experiments. Consequently, we assessed all colorectal cancer publications that made use of qPCR from 2006 until August 2013 for the number of reference genes used and whether they had been validated. Using even these minimal evaluation criteria, the validity of only three percent (6/179) of the publications can be adequately assessed. We describe common errors, and conclude that the current state of reporting on qPCR in colorectal cancer research is disquieting. Extrapolated to the study of cancer in general, it is clear that the majority of studies using qPCR cannot be reliably assessed and that at best, the results of these studies may or may not be valid and at worst, pervasive incorrect normalisation is resulting in the wholesale publication of incorrect conclusions. This survey demonstrates that the existence of guidelines, such as MIQE, is necessary but not sufficient to address this problem and suggests that the scientific community should examine its responsibility and be aware of the implications of these findings for current and future research.

  20. Towards more accurate isoscapes encouraging results from wine, water and marijuana data/model and model/model comparisons.

    NASA Astrophysics Data System (ADS)

    West, J. B.; Ehleringer, J. R.; Cerling, T.

    2006-12-01

    Understanding how the biosphere responds to change is at the heart of biogeochemistry, ecology, and other Earth sciences. The dramatic increase in human population and technological capacity over the past 200 years or so has resulted in numerous, simultaneous changes to biosphere structure and function. This has led to increased urgency in the scientific community to try to understand how systems have already responded to these changes, and how they might do so in the future. Since all biospheric processes exhibit some patchiness or patterning over space as well as time, we believe that understanding the dynamic interactions between natural systems and human technological manipulations can be improved if these systems are studied in an explicitly spatial context. We present here results of some of our efforts to model the spatial variation in the stable isotope ratios (δ2H and δ18O) of plants over large spatial extents, and how these spatial model predictions compare to spatially explicit data. Stable isotopes trace and record ecological processes and as such, if modeled correctly over Earth's surface, allow us insights into changes in biosphere states and processes across spatial scales. The data-model comparisons show good agreement, in spite of the remaining uncertainties (e.g., plant source water isotopic composition). For example, inter-annual changes in climate are recorded in wine stable isotope ratios. Also, a much simpler model of leaf water enrichment, driven with spatially continuous global rasters of precipitation and climate normals, largely agrees with complex GCM modeling that includes leaf water δ18O. Our results suggest that modeling plant stable isotope ratios across large spatial extents may be done with reasonable accuracy, including over time. These spatial maps, or isoscapes, can now be utilized to help understand spatially distributed data, as well as to help guide future studies designed to understand ecological change across
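Simple leaf water enrichment models of the kind mentioned above are commonly based on the steady-state Craig-Gordon formulation; a minimal sketch of that textbook equation is below, with typical illustrative 18O parameter values. This is the standard simplified form, not necessarily the exact model used in the study:

```python
def leaf_water_enrichment(eps_equil, eps_kinetic, delta_vapor, ea_over_ei):
    """Steady-state Craig-Gordon enrichment of leaf water above source
    water (per mil):  D_e = e+ + ek + (D_v - ek) * (ea/ei)."""
    return eps_equil + eps_kinetic + (delta_vapor - eps_kinetic) * ea_over_ei

# Illustrative 18O values: equilibrium e+ ~ 9.8 per mil (25 C), kinetic
# ek ~ 28.5 per mil, vapor 15 per mil depleted relative to source water,
# and an ambient-to-intercellular vapor pressure ratio of 0.6.
de = leaf_water_enrichment(9.8, 28.5, -15.0, 0.6)
```

Driving this equation with gridded climate rasters (temperature, humidity, precipitation isotopes) is what turns a point model into a continuous isoscape.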

  1. The route to MBxNyCz molecular wheels: II. Results using accurate functionals and basis sets

    NASA Astrophysics Data System (ADS)

    Güthler, A.; Mukhopadhyay, S.; Pandey, R.; Boustani, I.

    2014-04-01

    Applying ab initio quantum chemical methods, molecular wheels composed of metal and light atoms were investigated. High-quality basis sets (6-31G*, TZVP, and cc-pVTZ) as well as the exchange and non-local correlation functionals B3LYP, BP86 and B3P86 were used. The ground-state energies and structures of cyclic planar and pyramidal clusters TiBn (for n = 3-10) were computed. In addition, the relative stability and electronic structures of the molecular wheels TiBxNyCz (for x, y, z = 0-10) and MBnC10-n (for n = 2 to 5 and M = Sc to Zn) were determined. This paper is a follow-up to the previous study of Boustani and Pandey [Solid State Sci. 14 (2012) 1591], in which the calculations were carried out at the HF-SCF/STO3G/6-31G level of theory to determine the initial stability and properties. The results show that there is a competition between the 2D planar and the 3D pyramidal TiBn clusters (for n = 3-8). Different isomers of TiB10 clusters were also studied, and a structural transition of the 3D isomer into the 2D wheel is presented. Substituting boron in TiB10 by carbon and/or nitrogen atoms enhances the stability and leads toward the most stable wheel, TiB3C7. Furthermore, the computations show that Sc, Ti and V at the center of the molecular wheels are energetically favored over the other first-row transition metal atoms.

  2. Nucleic Acids Research Group (NRG): The Importance of DNA Extraction in Metagenomics: The Gatekeeper to Accurate Results!

    PubMed Central

    Carmical, R.; Nadella, V.; Herbert, Z.; Beckloff, N.; Chittur, S.; Rosato, C.; Perera, A.; Auer, H.; Robinson, M.; Tighe, S.; Holbrook, Jennifer

    2013-01-01

    It is well recognized that the field of metagenomics is becoming a critical tool for studying previously unobtainable population dynamics, both at the level of species identification and at a functional or transcriptional level. Because the power to resolve microbial information is so important for identifying the components in a mixed sample, metagenomics can be used to study nearly any possible environment or system, including clinical, environmental, and industrial, to name a few. Clinically, it may be used to determine sub-populations colonizing regions of the body or to identify a rare infection to assist in treatment strategies. Environmentally, it may be used to identify microbial populations within a soil, water or air sample, or within a bioreactor to characterize a population-based functional process. The possibilities are endless. However, the accuracy of a metagenomics dataset relies on three important "gatekeepers": 1) the ability to effectively extract all DNA or RNA from every cell within a sample, 2) the reliability of the methods used for deep or high-throughput sequencing, and 3) the software used to analyze the data. Since DNA extraction is the first step in the technical process of metagenomics, the Nucleic Acid Research Group (NARG) conducted a study to evaluate extraction methods using a synthetic microbial sample. The synthetic microbial sample was prepared from 10 known bacteria at specific concentrations and ranging in diversity. Samples were extracted in duplicate using various popular kit-based methods as well as several homebrew protocols, and then analyzed by NextGen sequencing on an Illumina HiSeq. Results of the study include the percent recovery of those organisms, determined by comparison with the known quantities in the original synthetic mix.
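The final recovery comparison reduces to normalizing observed read counts and dividing by the known mix fractions. A minimal sketch with hypothetical species names and counts follows; it is not the study's analysis pipeline:

```python
def percent_recovery(observed_reads, expected_fraction):
    """Percent recovery per organism: observed relative abundance as a
    percentage of its known fraction in the synthetic mix (100 = perfect)."""
    total = sum(observed_reads.values())
    return {org: 100.0 * (observed_reads[org] / total) / expected_fraction[org]
            for org in expected_fraction}

# Hypothetical 3-species mock community mixed in equal 1/3 fractions
rec = percent_recovery(
    {"E. coli": 400, "B. subtilis": 300, "S. aureus": 300},
    {"E. coli": 1 / 3, "B. subtilis": 1 / 3, "S. aureus": 1 / 3},
)
```

Systematic deviations from 100% for particular taxa (e.g. hard-to-lyse Gram-positives) are exactly the extraction bias such a mock-community study is designed to expose.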

  3. Evaluation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in Pyrus pyrifolia using different tissue samples and seasonal conditions.

    PubMed

    Imai, Tsuyoshi; Ubi, Benjamin E; Saito, Takanori; Moriguchi, Takaya

    2014-01-01

    We have evaluated suitable reference genes for real time (RT)-quantitative PCR (qPCR) analysis in Japanese pear (Pyrus pyrifolia). We tested the genes most frequently used in the literature, such as β-Tubulin, Histone H3, Actin, Elongation factor-1α and Glyceraldehyde-3-phosphate dehydrogenase, together with the newly added genes Annexin, SAND and TIP41. A total of 17 primer combinations for these eight genes were evaluated using cDNAs synthesized from 16 tissue samples from four groups, namely: flower bud, flower organ, fruit flesh and fruit skin. Gene expression stabilities were analyzed using the geNorm and NormFinder software packages or by the ΔCt method. geNorm analysis indicated that the three best-performing genes were sufficient for reliable normalization of RT-qPCR data. Suitable reference genes differed among sample groups, underscoring the importance of validating the expression stability of reference genes in the samples of interest. Stability rankings were basically similar between geNorm and NormFinder, supporting the usefulness of these programs, which are based on different algorithms. The ΔCt method suggested somewhat different results in some groups, such as flower organ or fruit skin, though the overall results correlated well with geNorm and NormFinder. Expression of two cold-inducible genes, PpCBF2 and PpCBF4, was quantified using the three most and the three least stable reference genes suggested by geNorm. Although the normalized quantities differed between them, the relative quantities within a group of samples were similar even when the least stable reference genes were used. Our data suggest that using the geometric mean of three reference genes for normalization is quite a reliable approach to evaluating gene expression by RT-qPCR. We propose that an initial evaluation of gene expression stability by the ΔCt method, followed by evaluation with geNorm or NormFinder for a limited number of superior gene candidates, will be a practical way of finding out
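The normalization strategy the abstract endorses — dividing a target gene's relative quantity by the geometric mean of three reference genes, as in geNorm — can be sketched as below. The Ct values are invented for illustration, and 100% PCR efficiency (E = 2) is assumed.

```python
import math

# Minimal sketch of geometric-mean normalization (geNorm-style): a target
# gene's relative quantity is divided by the geometric mean of the relative
# quantities of three reference genes. Ct values are invented; 100% PCR
# efficiency (E = 2) is assumed.

def rel_quantity(ct, ct_calibrator, efficiency=2.0):
    """Relative quantity vs. a calibrator sample (E ** (Ct_cal - Ct))."""
    return efficiency ** (ct_calibrator - ct)

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

def normalized_expression(target_q, reference_qs):
    """Target quantity over the geometric mean of the reference quantities."""
    return target_q / geometric_mean(reference_qs)

# One sample vs. a calibrator: the target amplifies 2 cycles earlier (4-fold).
target = rel_quantity(ct=24.0, ct_calibrator=26.0)
# Three stable reference genes, wobbling slightly around the calibrator.
refs = [rel_quantity(22.0, 22.5), rel_quantity(20.0, 20.0), rel_quantity(25.0, 24.5)]
print(normalized_expression(target, refs))
```

Because the reference genes' wobbles cancel in the geometric mean, the normalized value recovers the true 4-fold change.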

  4. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran

    PubMed Central

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2016-01-01

    Social changes have rapidly displaced arranged marriages, and this change in marriage patterns appears to have played a role in childbearing. On the other hand, many countries are experiencing a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three patterns of marriage in Bobol city of Iran. There appears to be a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning. PMID:26493414

  5. Marriage Patterns and Childbearing: Results From a Quantitative Study in North of Iran.

    PubMed

    Taghizadeh, Ziba; Behmanesh, Fereshteh; Ebadi, Abbas

    2015-09-22

    Social changes have rapidly displaced arranged marriages, and this change in marriage patterns appears to have played a role in childbearing. On the other hand, many countries are experiencing a great reduction in population, which requires a comprehensive policy to manage the considerable drop. To achieve this goal, the factors affecting fertility must first be precisely identified. This study aims to examine the role of marriage patterns in childbearing. In this cross-sectional quantitative study, 880 married women aged 15-49 years living in the north of Iran were studied using a cluster sampling strategy. The results showed no significant differences in the reproductive behaviors of the three patterns of marriage in Bobol city of Iran. There appears to be a convergence in childbearing across the different patterns of marriage, and policymakers should pay attention to other determinants of reproductive behaviors in demographic planning.

  6. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  7. Smaller, Scale-Free Gene Networks Increase Quantitative Trait Heritability and Result in Faster Population Recovery

    PubMed Central

    Malcom, Jacob W.

    2011-01-01

    One of the goals of biology is to bridge levels of organization. Recent technological advances are enabling us to span from genetic sequence to traits, and then from traits to ecological dynamics. The quantitative genetics parameter heritability describes how quickly a trait can evolve, and in turn describes how quickly a population can recover from an environmental change. Here I propose that we can link the details of the genetic architecture of a quantitative trait—i.e., the number of underlying genes and their relationships in a network—to population recovery rates by way of heritability. I test this hypothesis using a set of agent-based models in which individuals possess one of two network topologies or a linear genotype-phenotype map, 16–256 genes underlying the trait, and a variety of mutation and recombination rates and degrees of environmental change. I find that the network architectures introduce extensive directional epistasis that systematically hides and reveals additive genetic variance and affects heritability: network size, topology, and recombination explain 81% of the variance in average heritability in a stable environment. Network size and topology, the width of the fitness function, pre-change additive variance, and certain interactions account for ∼75% of the variance in population recovery times after a sudden environmental change. These results suggest that not only the amount of additive variance, but importantly the number of loci across which it is distributed, is important in regulating the rate at which a trait can evolve and populations can recover. Taken in conjunction with previous research focused on differences in degree of network connectivity, these results provide a set of theoretical expectations and testable hypotheses for biologists working to span levels of organization from the genotype to the phenotype, and from the phenotype to the environment. PMID:21347400
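The quantitative-genetics link the abstract relies on — higher heritability means faster evolution toward a new optimum, hence faster population recovery — can be made concrete with the standard breeder's equation, R = h² × S. The numbers below are invented, not taken from the agent-based models.

```python
import math

# Illustration of the breeder's equation, R = h2 * S: the per-generation
# response R of a trait equals its narrow-sense heritability h2 times the
# selection differential S. Values are invented for illustration.

def generations_to_recover(h2, optimum_shift, selection_differential):
    """Generations for the cumulative response to span a shifted optimum,
    assuming a constant selection differential each generation."""
    response_per_generation = h2 * selection_differential
    return math.ceil(optimum_shift / response_per_generation)

# Higher heritability -> faster recovery after a 2-unit environmental shift.
for h2 in (0.2, 0.4, 0.8):
    print(h2, generations_to_recover(h2, optimum_shift=2.0,
                                     selection_differential=0.5))
```

This is of course a caricature of the paper's simulations — the point of the abstract is precisely that network architecture makes h² itself dynamic — but it shows why heritability sets the recovery timescale.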

  8. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in formalin-fixed, paraffin-embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods, which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  9. Differential label-free quantitative proteomic analysis of Shewanella oneidensis cultured under aerobic and suboxic conditions by accurate mass and time tag approach.

    PubMed

    Fang, Ruihua; Elias, Dwayne A; Monroe, Matthew E; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D; Callister, Stephen J; Moore, Ronald J; Gorby, Yuri A; Adkins, Joshua N; Fredrickson, Jim K; Lipton, Mary S; Smith, Richard D

    2006-04-01

    We describe the application of LC-MS without the use of stable isotope labeling for differential quantitative proteomic analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and suboxic conditions. LC-MS/MS was used to initially identify peptide sequences, and LC-FTICR was used to confirm these identifications as well as measure relative peptide abundances. 2343 peptides covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as statistical analysis of microarrays, whereas another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to suboxic conditions.

  10. Differential Label-free Quantitative Proteomic Analysis of Shewanella oneidensis Cultured under Aerobic and Suboxic Conditions by Accurate Mass and Time Tag Approach

    SciTech Connect

    Fang, Ruihua; Elias, Dwayne A.; Monroe, Matthew E.; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D.; Callister, Stephen J.; Moore, Ronald J.; Gorby, Yuri A.; Adkins, Joshua N.; Fredrickson, Jim K.; Lipton, Mary S.; Smith, Richard D.

    2006-04-01

    We describe the application of liquid chromatography coupled to mass spectrometry (LC/MS) without the use of stable isotope labeling for differential quantitative proteomics analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and sub-oxic conditions. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) was used to initially identify peptide sequences, and LC coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR) was used to confirm these identifications, as well as measure relative peptide abundances. 2343 peptides, covering 668 proteins, were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as SAM, while another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to sub-oxic conditions.

  11. Quantitative Differentiation of Bloodstain Patterns Resulting from Gunshot and Blunt Force Impacts.

    PubMed

    Siu, Sonya; Pender, Jennifer; Springer, Faye; Tulleners, Frederic; Ristenpart, William

    2017-02-10

    Bloodstain pattern analysis (BPA) provides significant evidentiary value in crime scene interpretation and reconstruction. In this work, we develop a quantitative methodology using digital image analysis techniques to differentiate impact bloodstain patterns. The bloodstain patterns were digitally imaged and analyzed using image analysis algorithms. Our analysis of 72 unique bloodstain patterns, comprising more than 490,000 individual droplet stains, indicates that the mean drop stain size in a gunshot spatter pattern is at most 30% smaller than the mean drop stain size in blunt instrument patterns. In contrast, we demonstrate that the spatial distribution of the droplet stains (their density as a function of position in the pattern) differs significantly between gunshot and blunt instrument patterns, with densities as much as 400% larger for gunshot impacts. Thus, quantitative metrics involving the spatial distribution of droplet stains within a bloodstain pattern can be useful for objective differentiation between blunt instrument and gunshot bloodstain patterns.
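The two pattern metrics discussed above — mean droplet stain size and stain density as a function of position — can be sketched as below. Each stain is represented as an (x, y, area) tuple; the data are invented for illustration, not measured stains.

```python
from collections import defaultdict

# Sketch of the two metrics above: mean droplet stain area, and stain
# density as a function of position. Each stain is an (x_mm, y_mm, area_mm2)
# tuple; the data are invented for illustration.

def mean_stain_area(stains):
    return sum(area for _, _, area in stains) / len(stains)

def density_by_band(stains, band_width_mm=100.0):
    """Stain counts per horizontal band: a 1-D spatial distribution."""
    counts = defaultdict(int)
    for x, _, _ in stains:
        counts[int(x // band_width_mm)] += 1
    return dict(counts)

stains = [(30, 10, 0.8), (120, 40, 0.5), (140, 55, 0.6),
          (150, 60, 0.4), (260, 20, 1.1)]
print(mean_stain_area(stains))   # mean area, mm^2
print(density_by_band(stains))   # {0: 1, 1: 3, 2: 1}
```

The abstract's finding is that a statistic like `density_by_band` separates gunshot from blunt-force patterns far better than `mean_stain_area` does.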

  12. Relative Quantification of Costal Cordillera (Ecuador) Uplift : Preliminary Results from Quantitative Geomorphology

    NASA Astrophysics Data System (ADS)

    Reyes, Pedro; Dauteuil, Olivier; Michaud, François

    2010-05-01

    The coastal cordillera of Ecuador (culminating at around 800 m) includes uplifted marine terraces (the highest known at 360 m) on its littoral margins. The coastal cordillera constitutes an important drainage barrier: over nearly 600 km, drainage from the Andes is diverted towards the Río Guayas in the south and the Río Esmeraldas in the north. How is the coastal cordillera uplifting? For how long has it constituted a drainage barrier? Is the uplift of the coastal cordillera linked to the uplift of the littoral margin? Has the cordillera risen in a homogeneous or a segmented way? What geodynamic process drives the uplift of the cordillera? Can this uplift be related to the subduction of the Carnegie ridge? The first objective of this work is to analyze the morphology of the coastal cordillera with the help of quantitative geomorphology, using digital techniques such as a DEM (produced at a resolution of 30 m by Marc Souris, IRD), to specify the evolution of the coastal cordillera uplift. This study combined analysis of the morphology, maps derived from the slopes, and anomalies in the drainage of the hydrographic network. Secondly, three methods were applied to the DEM data using the ArcGIS software: 1) digitalization and interpolation of the basal surface of the last marine formation of regional distribution (the Borbón formation on the geological map of Ecuador) to determine a paleo-horizontal and to see its deformation; 2) extraction of 109 river profiles, which allow us to calculate for each river the vertical, horizontal, and total deviation from the theoretical river profile and the associated SL index; 3) measurement of the relief incision (depth + half width of the valley, 7500 measurements in all) following the method of Bonnet et al. (1998). We adapted this method to represent the state of incision at any point, correcting for the influence of the lithology and
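The SL (stream length-gradient) index mentioned in method 2 has a standard definition, SL = (ΔH/ΔL) × L, where ΔH/ΔL is the local channel gradient and L is the distance from the river's source. A minimal sketch with an invented river reach:

```python
# Minimal sketch of the standard SL (stream length-gradient) index,
# SL = (dH/dL) * L, computed for one river reach. The reach below is
# invented: a 25 m drop over a 1 km reach, 10 km downstream of the source.

def sl_index(elev_upstream_m, elev_downstream_m, reach_length_m,
             dist_from_source_m):
    gradient = (elev_upstream_m - elev_downstream_m) / reach_length_m
    return gradient * dist_from_source_m

print(sl_index(425.0, 400.0, 1000.0, 10000.0))
```

Anomalously high SL values along a profile flag reaches steepened by active uplift (or by resistant lithology, which is why the authors correct for lithology).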

  13. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch, assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, which environmental variable(s), in the absence of bedrock, influence the spectral properties of the desert alluvial surface.

  14. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess future non-scientists’ attitudes towards science, as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students’ basic science knowledge, attitudes towards science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students’ basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students’ information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  15. Development of a strand-specific real-time qRT-PCR for the accurate detection and quantitation of West Nile virus RNA.

    PubMed

    Lim, Stephanie M; Koraka, Penelope; Osterhaus, Albert D M E; Martina, Byron E E

    2013-12-01

    Studying the tropism and replication kinetics of West Nile virus (WNV) in different cell types in vitro and in tissues in animal models is important for understanding its pathogenesis. As detection of the negative strand viral RNA is a more reliable indicator of active replication for single-stranded positive-sense RNA viruses, the specificity of qRT-PCR assays currently used for the detection of WNV positive and negative strand RNA was reassessed. It was shown that self- and falsely-primed cDNA was generated during the reverse transcription step in an assay employing unmodified primers and several reverse transcriptases. As a result, a qRT-PCR assay using the thermostable rTth in combination with tagged primers was developed, which greatly improved strand specificity by circumventing the events of self- and false-priming. The reliability of the assay was then addressed in vitro using BV-2 microglia cells as well as in C57/BL6 mice. It was possible to follow the kinetics of positive and negative-strand RNA synthesis both in vitro and in vivo; however, the sensitivity of the assay will need to be optimized in order to detect and quantify negative-strand RNA synthesis in the very early stages of infection. Overall, the strand-specific qRT-PCR assay developed in this study is an effective tool to quantify WNV RNA, reassess viral replication, and study tropism of WNV in the context of WNV pathogenesis.

  16. Accurate near-field lithography modeling and quantitative mapping of the near-field distribution of a plasmonic nanoaperture in a metal.

    PubMed

    Kim, Yongwoo; Jung, Howon; Kim, Seok; Jang, Jinhee; Lee, Jae Yong; Hahn, Jae W

    2011-09-26

    In nanolithography using optical near-field sources to push the critical dimension below the diffraction limit, optimization of process parameters is of utmost importance. Herein we present a simple analytic model to predict photoresist profiles under a localized evanescent exposure that decays exponentially in a photoresist of finite contrast. We introduce the concept of nominal developing thickness (NDT) to determine the proper developing process that yields the topography of the exposure profile best fitting the isointensity contour. Based on this model, we experimentally investigated the NDT and obtained exposure profiles produced by the near-field distribution of a bowtie-shaped nanoaperture. The profiles fit well to results calculated by the finite-difference time-domain method. Using the threshold exposure dose of the photoresist, we can determine the absolute intensity of the near-field distribution and analyze the difference in decay rates between the near-field distributions obtained via experiment and calculation. For a maximum depth of 41 nm, we estimate the uncertainties in the measurements of profile and intensity to be less than 6% and about 1%, respectively. We expect this method will be useful for determining the absolute value of the near-field distribution produced by nano-scale devices.
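The exposure model above — a dose decaying exponentially into the resist, developed down to a threshold dose — implies a simple closed form for the profile depth: z = d·ln(D0/Dth), with decay length d, surface dose D0, and threshold dose Dth. A hedged sketch; the numbers are illustrative, chosen only to land near the 41 nm maximum depth quoted above.

```python
import math

# Sketch of the profile-depth relation implied by the exponential model:
# with dose D(z) = D0 * exp(-z / d) and threshold dose Dth, the resist
# develops down to z = d * ln(D0 / Dth). Numbers are illustrative.

def profile_depth_nm(surface_dose, threshold_dose, decay_length_nm):
    """Depth at which the decaying exposure falls to the threshold dose."""
    return decay_length_nm * math.log(surface_dose / threshold_dose)

# A 4x overdose with a 30 nm decay length develops to about 41.6 nm.
print(round(profile_depth_nm(4.0, 1.0, 30.0), 1))
```

Inverting the same relation is what lets the authors recover the absolute near-field intensity from a measured depth and the known threshold dose.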

  17. Comparison of Diagnostic Performance Between Visual and Quantitative Assessment of Bone Scintigraphy Results in Patients With Painful Temporomandibular Disorder

    PubMed Central

    Choi, Bong-Hoi; Yoon, Seok-Ho; Song, Seung-Il; Yoon, Joon-Kee; Lee, Su Jin; An, Young-Sil

    2016-01-01

    This retrospective clinical study was performed to evaluate whether a visual or quantitative method is more valuable for assessing painful temporomandibular disorder (TMD) using bone scintigraphy results. In total, 230 patients (172 women and 58 men) with TMD were enrolled. All patients were questioned about their temporomandibular joint (TMJ) pain. Bone scintigraphic data were acquired in all patients, and images were analyzed by visual and quantitative methods using the TMJ-to-skull uptake ratio. The diagnostic performances of both bone scintigraphic assessment methods for painful TMD were compared. In total, 241 of 460 TMJs (52.4%) were finally diagnosed with painful TMD. The sensitivity, specificity, positive predictive value, negative predictive value, and accuracy of the visual analysis for diagnosing painful TMD were 62.8%, 59.6%, 58.6%, 63.8%, and 61.1%, respectively. The quantitative assessment showed the ability to diagnose painful TMD with a sensitivity of 58.8% and specificity of 69.3%. The diagnostic ability of the visual analysis for diagnosing painful TMD was not significantly different from that of the quantitative analysis. Visual bone scintigraphic analysis showed a diagnostic utility similar to that of quantitative assessment for the diagnosis of painful TMD. PMID:26765456
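The diagnostic statistics reported above all derive from a 2×2 table of test results against the final diagnosis. A minimal sketch of their computation; the counts below are illustrative, not the study's data.

```python
# Sketch of standard diagnostic metrics from a 2x2 table of test result
# (positive/negative) vs. final diagnosis. Counts are illustrative.

def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

metrics = diagnostic_metrics(tp=150, fp=105, fn=90, tn=115)
for name, value in metrics.items():
    print(f"{name}: {value:.1%}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of the condition in the cohort (here 52.4% of joints), so they do not transfer directly to populations with different prevalence.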

  18. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Zakharov, S.M.

    1997-01-01

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation. © 1997 American Institute of Physics.

  19. A quantitative assessment of reliability of the TOPAZ-2 space NPS reactor unit based on ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Zakharov, Sergei M.

    1997-01-10

    The paper discusses life-limiting factors (parameters) and statistics of random sudden failures, revealed in the course of ground development, for 4 given subsystems of the TOPAZ-2 space NPS reactor unit. Results are presented of a quantitative assessment of the lower confidence limits of the probability of failure-free operation.

  20. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation exists in the field in the quantitative methods used to measure brain amyloid burden. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although they may add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing the brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959
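The core quantification step in such studies is a standardized uptake value ratio (SUVR): mean tracer retention in a target region divided by mean retention in the chosen reference region (cerebellar cortex, brainstem, or white matter above). A minimal sketch with invented voxel values:

```python
# Minimal sketch of a standardized uptake value ratio (SUVR): mean tracer
# uptake in a target region divided by mean uptake in a reference region.
# Voxel values are invented, in arbitrary units.

def suvr(target_uptake, reference_uptake):
    """Mean target-region uptake relative to the reference region."""
    mean_target = sum(target_uptake) / len(target_uptake)
    mean_reference = sum(reference_uptake) / len(reference_uptake)
    return mean_target / mean_reference

cortex_voxels = [1.9, 2.1, 2.0, 2.2]     # amyloid-laden cortex
brainstem_voxels = [1.0, 1.1, 0.9, 1.0]  # candidate reference region
print(round(suvr(cortex_voxels, brainstem_voxels), 2))
```

The abstract's central point is that everything downstream of this ratio — reference-region choice, partial volume correction, kinetic modeling — changes the measured value, so the pipeline must be fixed before comparing groups or time points.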

  1. Accurate quantitation for in vitro refolding of single domain antibody fragments expressed as inclusion bodies by referring the concomitant expression of a soluble form in the periplasms of Escherichia coli.

    PubMed

    Noguchi, Tomoaki; Nishida, Yuichi; Takizawa, Keiji; Cui, Yue; Tsutsumi, Koki; Hamada, Takashi; Nishi, Yoshisuke

    2017-03-01

    Single domain antibody fragments from two species, a camel VHH (PM1) and a shark VNAR (A6), were derived from inclusion bodies of E. coli and refolded in vitro following three refolding recipes to compare refolding efficiencies: three-step cold dialysis refolding (TCDR), one-step hot dialysis refolding (OHDR), and one-step cold dialysis refolding (OCDR). These fragments were expressed as 'a soluble form' either in the cytoplasm or the periplasm, but the amounts were much smaller than those expressed as 'an insoluble form (inclusion body)' in the cytoplasm and periplasm. To verify the refolding efficiencies from inclusion bodies correctly, proteins purified from periplasmic soluble fractions were used as reference samples. These samples showed far-UV spectra of a typical β-sheet-dominant structure in circular dichroism (CD) spectroscopy, as did the refolded samples. As the maximal magnitude of ellipticity in millidegrees (θmax) observed at a given wavelength was proportional to the concentration of the respective reference sample, we could draw linear regression lines of magnitude vs. sample concentration. Using these lines, we measured the concentrations of the refolded PM1 and A6 samples purified from solubilized cytoplasmic insoluble fractions. The refolding efficiency of PM1 was almost 50% following TCDR, and 40% and 30% following OHDR and OCDR, respectively, whereas that of A6 was around 30% following TCDR and out of bounds for quantitation following the other two recipes. The ELISA curves derived from the refolded samples coincided better with those obtained from the reference samples after converting the values from the protein concentrations at recovery to those of the refolded proteins using recovery ratios, indicating that such a correction gives a more accurate measure of the ELISA curves than no correction. Our method requires constructing a dual expression system, expressed both in
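The quantitation logic described above — a linear calibration of CD ellipticity against reference-sample concentration, then reading a refolded sample's concentration off that line — can be sketched as below. All numbers are invented, chosen only to land near the ~50% TCDR efficiency quoted above.

```python
# Sketch of the calibration logic above: reference samples give a linear
# relation between CD ellipticity and concentration; a refolded sample's
# concentration is read off that line, and refolding efficiency is the
# recovered concentration over the input concentration. Numbers invented.

def fit_slope(concentrations, ellipticities):
    """Least-squares slope of a line through the origin (theta = k * c)."""
    numerator = sum(c * e for c, e in zip(concentrations, ellipticities))
    denominator = sum(c * c for c in concentrations)
    return numerator / denominator

ref_conc = [0.1, 0.2, 0.4]          # mg/mL, reference samples
ref_theta = [-5.1, -10.2, -20.0]    # millidegrees at a fixed wavelength
k = fit_slope(ref_conc, ref_theta)

theta_refolded = -10.0              # refolded sample, same wavelength
conc_refolded = theta_refolded / k  # mg/mL read off the calibration line
efficiency = conc_refolded / 0.4    # 0.4 mg/mL of protein went into refolding
print(f"refolding efficiency: {efficiency:.0%}")
```

The calibration only works because the reference (periplasmic) and refolded samples share the same β-sheet-dominant CD signature, which is the point of the spectral comparison in the abstract.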

  2. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, Nikolai N.; Nechaev, Yuri A.; Khazanovich, Igor M.; Samodelov, Victor N.; Pavlov, Konstantin A.

    1997-01-10

    The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing.

  3. Stereotactic hypofractionated accurate radiotherapy of the prostate (SHARP), 33.5 Gy in five fractions for localized disease: First clinical trial results

    SciTech Connect

    Madsen, Berit L. E-mail: ronblm@vmmc.org; Hsi, R. Alex; Pham, Huong T.; Fowler, Jack F.; Esagui, Laura C.; Corman, John

    2007-03-15

    Purpose: To evaluate the feasibility and toxicity of stereotactic hypofractionated accurate radiotherapy (SHARP) for localized prostate cancer. Methods and Materials: A Phase I/II trial of SHARP was performed for localized prostate cancer using 33.5 Gy in 5 fractions, calculated to be biologically equivalent to 78 Gy in 2 Gy fractions (α/β ratio of 1.5 Gy). Noncoplanar conformal fields and daily stereotactic localization of implanted fiducials were used for treatment. Genitourinary (GU) and gastrointestinal (GI) toxicity were evaluated by American Urologic Association (AUA) score and Common Toxicity Criteria (CTC). Prostate-specific antigen (PSA) values and self-reported sexual function were recorded at specified follow-up intervals. Results: The study includes 40 patients. The median follow-up is 41 months (range, 21-60 months). Acute Grade 1-2 toxicity was 48.5% (GU) and 39% (GI); there was 1 acute Grade 3 GU toxicity. Late Grade 1-2 toxicity was 45% (GU) and 37% (GI). No late Grade 3 or higher toxicity was reported. Twenty-six patients reported potency before therapy; 6 (23%) have developed impotence. Median time to PSA nadir was 18 months, with the majority of nadirs less than 1.0 ng/mL. The actuarial 48-month biochemical freedom from relapse is 70% by the American Society for Therapeutic Radiology and Oncology definition and 90% by the alternative nadir + 2 ng/mL failure definition. Conclusions: SHARP for localized prostate cancer is feasible with minimal acute or late toxicity. Dose escalation should be possible.
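The equivalence claimed above (33.5 Gy in 5 fractions ≈ 78 Gy in 2-Gy fractions at α/β = 1.5 Gy) follows from the standard linear-quadratic EQD2 formula, EQD2 = D·(d + α/β)/(2 + α/β). A quick check:

```python
# Check of the stated equivalence using the linear-quadratic EQD2 formula:
# EQD2 = D * (d + a/b) / (2 + a/b), with total dose D, dose per fraction d,
# and a/b the alpha/beta ratio in Gy.

def eqd2(total_dose_gy, n_fractions, alpha_beta_gy):
    """Equivalent total dose if delivered in conventional 2-Gy fractions."""
    d = total_dose_gy / n_fractions
    return total_dose_gy * (d + alpha_beta_gy) / (2.0 + alpha_beta_gy)

# 33.5 Gy in 5 fractions at alpha/beta = 1.5 Gy -> about 78.5 Gy in 2-Gy
# fractions, matching the trial's stated equivalence of ~78 Gy.
print(round(eqd2(33.5, 5, alpha_beta_gy=1.5), 1))  # 78.5
# Sanity check: a schedule already in 2-Gy fractions maps to itself.
print(eqd2(78.0, 39, alpha_beta_gy=1.5))  # 78.0
```

The low α/β assumed for prostate tumor tissue is exactly why hypofractionation is attractive here: large fractions gain more biological effect in the tumor than in higher-α/β normal tissue.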

  4. Quantitative Assessment of the Impact of Blood Pulsation on Intraocular Pressure Measurement Results in Healthy Subjects

    PubMed Central

    2017-01-01

Background. Blood pulsation affects the results obtained using various medical devices in many different ways. Method. The paper demonstrates the effect of blood pulsation on intraocular pressure measurements. Six measurements for each of the 10 healthy subjects were performed in various phases of blood pulsation. A total of 8400 corneal deformation images were recorded. The results of intraocular pressure measurements were related to the results of heartbeat phases measured with a pulse oximeter placed on the index finger of the subject's left hand. Results. The correlation between the heartbeat phase measured with a pulse oximeter and intraocular pressure is 0.69 ± 0.26 (p < 0.05). The phase shift calculated for the maximum correlation is equal to 60 ± 40° (p < 0.05). When the moment of measuring intraocular pressure with an air-puff tonometer is not synchronized, the changes in IOP for the analysed group of subjects can vary in the range of ±2.31 mmHg (p < 0.3). Conclusions. Blood pulsation has a statistically significant effect on the results of intraocular pressure measurement. For this reason, in modern ophthalmic devices, the measurement should be synchronized with the heartbeat phases. The paper proposes an additional method for synchronizing the time of pressure measurement with the blood pulsation phase. PMID:28250983
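Synchronizing the tonometer with the heartbeat requires estimating the phase shift between the pulse-oximeter waveform and the IOP signal. A minimal sketch using complex demodulation on synthetic sinusoids; the 1.2 Hz rate, sampling step, and 60° lag are illustrative assumptions, not the study's data.

```python
import cmath, math

def phase_deg(samples, dt, freq):
    """Phase (degrees) of a sinusoid at `freq` Hz via complex demodulation:
    average of s(t) * exp(-i*2*pi*f*t) over a whole number of cycles."""
    z = sum(s * cmath.exp(-2j * math.pi * freq * k * dt)
            for k, s in enumerate(samples)) / len(samples)
    return math.degrees(cmath.phase(z))

f = 1.2                              # ~72 bpm heartbeat, as a stand-in
dt = 0.001
t = [k * dt for k in range(5000)]    # 5 s = exactly 6 cycles at 1.2 Hz
pulse = [math.sin(2 * math.pi * f * x) for x in t]
iop   = [math.sin(2 * math.pi * f * x - math.radians(60)) for x in t]

shift = (phase_deg(pulse, dt, f) - phase_deg(iop, dt, f)) % 360
print(shift)  # recovers the 60 degree lag built into the synthetic IOP trace
```

Averaging over a whole number of cycles makes the demodulation exact up to floating-point error; real pulse and IOP traces would first need band-pass filtering around the heart-rate frequency.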

  5. Quantitative Assessment of the CCMC's Experimental Real-time SWMF-Geospace Results

    NASA Astrophysics Data System (ADS)

    Liemohn, Michael; Ganushkina, Natalia; De Zeeuw, Darren; Welling, Daniel; Toth, Gabor; Ilie, Raluca; Gombosi, Tamas; van der Holst, Bart; Kuznetsova, Maria; Maddox, Marlo; Rastaetter, Lutz

    2016-04-01

Experimental real-time simulations of the Space Weather Modeling Framework (SWMF) are conducted at the Community Coordinated Modeling Center (CCMC), with results available there (http://ccmc.gsfc.nasa.gov/realtime.php), through the CCMC Integrated Space Weather Analysis (iSWA) site (http://iswa.ccmc.gsfc.nasa.gov/IswaSystemWebApp/), and the Michigan SWMF site (http://csem.engin.umich.edu/realtime). Presently, two configurations of the SWMF are running in real time at CCMC, both focusing on the geospace modules and using the BATS-R-US magnetohydrodynamic model and the Ridley Ionosphere Model, one with and one without the Rice Convection Model for inner magnetospheric drift physics. While both have been running for several years, nearly continuous results are available since July 2015. Dst from the model output is compared against the Kyoto real-time Dst, in particular the daily minimum value, to quantify the ability of the model to capture storms. Contingency tables are presented, showing that the run with the inner magnetosphere model is much better at reproducing storm-time values. For disturbances with a minimum Dst lower than -50 nT, this version yields a probability of event detection of 0.86 and a Heidke Skill Score of 0.60. In the other version of the SWMF, without the inner magnetospheric module included, the modeled Dst never dropped below -50 nT during the examined epoch.
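The probability of detection and Heidke Skill Score quoted above are standard functions of a 2×2 contingency table. The counts in this sketch are hypothetical, since the abstract reports only the resulting scores, not the underlying table.

```python
# Forecast verification scores from a 2x2 contingency table:
# a = hits, b = false alarms, c = misses, d = correct negatives.

def pod(hits, misses):
    """Probability of detection: fraction of observed events forecast."""
    return hits / (hits + misses)

def heidke(hits, false_alarms, misses, correct_negatives):
    """Heidke Skill Score: accuracy relative to random chance."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    num = 2 * (a * d - b * c)
    den = (a + c) * (c + d) + (a + b) * (b + d)
    return num / den

# hypothetical event counts for storms with minimum Dst < -50 nT
a, b, c, d = 18, 10, 3, 200
print(round(pod(a, c), 2), round(heidke(a, b, c, d), 2))
```

HSS = 1 would be a perfect forecast and HSS = 0 no better than chance, so the reported 0.60 indicates substantial skill for the run with the inner magnetosphere model.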

  6. Evaluating the Economic Impact of Smart Care Platforms: Qualitative and Quantitative Results of a Case Study

    PubMed Central

    Van der Auwermeulen, Thomas; Van Ooteghem, Jan; Jacobs, An; Verbrugge, Sofie; Colle, Didier

    2016-01-01

Background In response to the increasing societal pressure of a graying society, a wave of new Information and Communication Technology (ICT) supported care services (eCare) has emerged. Their common goal is to increase the quality of care while decreasing its costs. Smart Care Platforms (SCPs), installed in the homes of care-dependent people, foster the interoperability of these services and offer a set of eCare services that are complementary on one platform. These eCare services could not only result in more quality care for care receivers, but they also offer opportunities to care providers to optimize their processes. Objective The objective of the study was to identify and describe the expected added values and impacts of integrating SCPs in current home care delivery processes for all actors. In addition, the potential economic impact of SCP deployment is quantified from the perspective of home care organizations. Methods Semistructured and informal interviews, focus groups, and cocreation workshops with service providers, managers of home care organizations, and formal and informal care providers led to the identification of added values of SCP integration. In a second step, process breakdown analyses of home care provisioning allowed defining the operational impact for home care organizations. Impacts on 2 different process steps of providing home care were quantified. After modeling the investment, an economic evaluation compared the business as usual (BAU) scenario versus the integrated SCP scenario. Results The added value of SCP integration for all actors involved in home care was identified. Most impacts were qualitative such as increase in peace of mind, better quality of care, strengthened involvement in care provisioning, and more transparent care communication. For home care organizations, integrating SCPs could lead to a decrease of 38% of the current annual expenses for two administrative process steps namely

  7. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test.

    PubMed

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G

    2015-11-26

In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the "gold standard" for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay.

  8. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  9. Parents' decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results.

    PubMed

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9-10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents' general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions.

  10. Parents’ decision-making about the human papillomavirus vaccine for their daughters: I. Quantitative results

    PubMed Central

    Krawczyk, Andrea; Knäuper, Bärbel; Gilca, Vladimir; Dubé, Eve; Perez, Samara; Joyal-Desmarais, Keven; Rosberger, Zeev

    2015-01-01

Vaccination against the human papillomavirus (HPV) is an effective primary prevention measure for HPV-related diseases. For children and young adolescents, the uptake of the vaccine is contingent on parental consent. This study sought to identify key differences between parents who obtain (acceptors) and parents who refuse (non-acceptors) the HPV vaccine for their daughters. In the context of a free, universal, school-based HPV vaccination program in Québec, 774 parents of 9–10 year-old girls completed and returned a questionnaire by mail. The questionnaire was based on the theoretical constructs of the Health Belief Model (HBM), along with constructs from other theoretical frameworks. Of the 774 parents, 88.2% reported their daughter having received the HPV vaccine. Perceived susceptibility of daughters to HPV infection, perceived benefits of the vaccine, perceived barriers (including safety of the vaccine), and cues to action significantly distinguished between parents whose daughters had received the HPV vaccine and those whose daughters had not. Other significant factors associated with daughter vaccine uptake were parents’ general vaccination attitudes, anticipated regret, adherence to other routinely recommended vaccines, social norms, and positive media influence. The results of this study identify a number of important correlates related to parents' decisions to accept or refuse the HPV vaccine for their daughters. Future work may benefit from targeting such factors and incorporating other health behavior theories in the design of effective HPV vaccine uptake interventions. PMID:25692455

  11. Semi-quantitative characterisation of ambient ultrafine aerosols resulting from emissions of coal fired power stations.

    PubMed

    Hinkley, J T; Bridgman, H A; Buhre, B J P; Gupta, R P; Nelson, P F; Wall, T F

    2008-02-25

Emissions from coal fired power stations are known to be a significant anthropogenic source of fine atmospheric particles, both through direct primary emissions and secondary formation of sulfate and nitrate from emissions of gaseous precursors. However, there is relatively little information available in the literature regarding the contribution emissions make to the ambient aerosol, particularly in the ultrafine size range. In this study, the contribution of emissions to particles smaller than 0.3 μm in the ambient aerosol was examined at a sampling site 7 km from two large Australian coal fired power stations equipped with fabric filters. A novel approach was employed using conditional sampling based on sulfur dioxide (SO₂) as an indicator species, and a relatively new sampler, the TSI Nanometer Aerosol Sampler. Samples were collected on transmission electron microscope (TEM) grids and examined using a combination of TEM imaging and energy dispersive X-ray (EDX) analysis for qualitative chemical analysis. The ultrafine aerosol in low SO₂ conditions was dominated by diesel soot from vehicle emissions, while significant quantities of particles, which were unstable under the electron beam, were observed in the high SO₂ samples. The behaviour of these particles was consistent with literature accounts of sulfate and nitrate species, believed to have been derived from precursor emissions from the power stations. A significant carbon peak was noted in the residues from the evaporated particles, suggesting that some secondary organic aerosol formation may also have been catalysed by these acid seed particles. No primary particulate material was observed in the sub-0.3 μm fraction. The results of this study indicate that the contribution of species more commonly associated with gas-to-particle conversion may be more significant than expected, even close to source.

  12. Problems of a thermionic space NPS reactor unit quantitative reliability assessment on the basis of ground development results

    SciTech Connect

    Ponomarev-Stepnoi, N.N.; Nechaev, Y.A.; Khazanovich, I.M.; Samodelov, V.N.; Pavlov, K.A.

    1997-01-01

The paper sets forth major problems that arose in the course of a quantitative assessment of reliability of a TOPAZ-2 space NPS reactor unit performed on the basis of ground development results. Proposals are made on the possible ways to solve those problems through development and introduction of individual standards especially for the ground development stage, which would specify the assessment algorithm and censoring rules, and exclude a number of existing uncertainties when making a decision on going to flight testing. © 1997 American Institute of Physics.

  13. Accurate measurements of vadose zone fluxes using automated equilibrium tension plate lysimeters: A synopsis of results from the Spydia research facility, New Zealand.

    NASA Astrophysics Data System (ADS)

    Wöhling, Thomas; Barkle, Greg; Stenger, Roland; Moorhead, Brian; Wall, Aaron; Clague, Juliet

    2014-05-01

Automated equilibrium tension plate lysimeters (AETLs) are arguably the most accurate method to measure unsaturated water and contaminant fluxes below the root zone at the scale of up to 1 m². The AETL technique utilizes a porous sintered stainless-steel plate to provide a comparatively large sampling area with a continuously controlled vacuum that is in "equilibrium" with the surrounding vadose zone matric pressure, to ensure measured fluxes represent those under undisturbed conditions. This novel lysimeter technique was used at an intensive research site for investigations of contaminant pathways from the land surface to the groundwater on a sheep and beef farm under pastoral land use in the Tutaeuaua subcatchment, New Zealand. The Spydia research facility was constructed in 2005 and was fully operational between 2006 and 2011. Extending from a central access caisson, 15 separately controlled AETLs with 0.2 m² surface area were installed at five depths between 0.4 m and 5.1 m into the undisturbed volcanic vadose zone materials. The unique setup of the facility ensured minimum interference of the experimental equipment and external factors with the measurements. Over a period of more than five years, a comprehensive data set was collected at each of the 15 AETL locations, comprising time series of soil water flux, pressure head, volumetric water content, and soil temperature. The soil water was regularly analysed for EC, pH, dissolved carbon, various nitrogen compounds (including nitrate, ammonia, and organic N), phosphorus, bromide, chloride, sulphate, silica, and a range of other major ions, as well as for various metals. Climate data was measured directly at the site (rainfall) and at a climate station at 500 m distance. The shallow groundwater was sampled at three different depths directly from the Spydia caisson and at various observation wells surrounding the facility. Two tracer experiments were conducted at the site in 2009 and 2010. In the 2009

  14. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

The analysis of results of screening (n = 3208; sexually active citizens aged from 18 to 59 years) was carried out to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. Human papilloma virus of high oncogenic type was detected in 65% and 68.4% of females and in 48.6% and 53% of males, correspondingly. Among 12 types of human papilloma virus, the most frequently diagnosed was human papilloma virus 16, independently of the gender of those examined and the technique of analysis. In females, under application of qualitative tests the rate of human papilloma virus 16 made up 18.3% (n = 280), and under application of quantitative tests the rate made up 14.9% (n = 126; p ≤ 0.05). Under examination of males using qualitative tests, the rate of human papilloma virus 16 made up 8.3% (n = 60), and under application of quantitative tests it made up 12.2% (n = 14; p ≥ 0.05). Under application of qualitative tests, the rate of detection of the rest of the oncogenic types of human papilloma virus varied in females from 3.4% to 8.4% and in males from 1.8% to 5.9%. Under application of quantitative tests in females, the rate of human papilloma virus with high viral load made up 68.4%, with medium viral load 2.85% (n = 24), and with low viral load 0.24% (n = 2). Under application of quantitative tests in males, the rate of detection of types of human papilloma virus made up 53%, and in all cases a high viral load was established. In females, most oncogenic types of human papilloma virus (except for 31, 39, and 59) are detected significantly more often than in males.

  15. An approach for relating the results of quantitative nondestructive evaluation to intrinsic properties of high-performance materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

One of the most difficult problems the manufacturing community has faced during recent years has been to accurately assess the physical state of anisotropic high-performance materials by nondestructive means. In order to advance the design of ultrasonic nondestructive testing systems, a more fundamental understanding of how ultrasonic waves travel and interact within the anisotropic material is needed. The relationship between the ultrasonic and engineering parameters needs to be explored to understand their mutual dependence. One common denominator is provided by the elastic constants. The preparation of specific graphite/epoxy samples to be used in the experimental investigation of the anisotropic properties (through the measurement of the elastic stiffness constants) is discussed. Accurate measurements of these constants will depend upon knowledge of refraction effects as well as the direction of group velocity propagation. The continuing effort for the development of improved visualization techniques for physical parameters is discussed. Group velocity images are presented and discussed. In order to fully understand the relationship between the ultrasonic and the common engineering parameters, the physical interpretation of the linear elastic coefficients (the quantities that relate applied stresses to resulting strains) is discussed. This discussion builds a more intuitive understanding of how the ultrasonic parameters are related to the traditional engineering parameters.
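The link between ultrasonic and engineering parameters is simplest for a bulk wave propagating along a principal material axis, where a single stiffness constant follows from the density and the measured phase velocity via C = ρv². A minimal sketch; the density and velocity values are hypothetical, not taken from the report.

```python
def stiffness_from_velocity(density, velocity):
    """C = rho * v**2: on-axis stiffness (Pa) from the measured bulk-wave
    phase velocity (m/s) and the material density (kg/m^3)."""
    return density * velocity ** 2

# hypothetical graphite/epoxy values, for illustration only
rho = 1600.0     # kg/m^3
v_long = 3000.0  # m/s, longitudinal velocity normal to the fibres
c11 = stiffness_from_velocity(rho, v_long)
print(c11 / 1e9)  # stiffness in GPa
```

Off-axis directions couple several stiffness constants and require the refraction and group-velocity corrections the abstract mentions, which is precisely why the on-axis case is the convenient starting point.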

  16. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods, they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order and high-resolution algorithms can produce accurate results after O(10⁶) periods of propagation with eight grid points per wavelength.
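The accuracy gain from higher-order stencils can be illustrated with the standard second- and fourth-order central differences for a first derivative (generic textbook stencils, not the paper's specific algorithms).

```python
import math

def d1_2nd(f, x, h):
    # second-order central difference, truncation error O(h^2)
    return (f(x + h) - f(x - h)) / (2 * h)

def d1_4th(f, x, h):
    # fourth-order central difference, truncation error O(h^4)
    return (-f(x + 2 * h) + 8 * f(x + h) - 8 * f(x - h) + f(x - 2 * h)) / (12 * h)

x, h = 1.0, 0.01
exact = math.cos(x)  # d/dx sin(x)
err2 = abs(d1_2nd(math.sin, x, h) - exact)
err4 = abs(d1_4th(math.sin, x, h) - exact)
print(err2, err4)  # the higher-order stencil is markedly more accurate
```

At the same grid spacing the fourth-order error is orders of magnitude smaller, which is the property that makes long propagation distances (many wave periods on few points per wavelength) feasible for high-order schemes.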

  17. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  18. Quantitative assessment of age-related macular degeneration using parametric modeling of the leakage transfer function: preliminary results.

    PubMed

    Eldeeb, Safaa M; Abdelmoula, Walid M; Shah, Syed M; Fahmy, Ahmed S

    2012-01-01

Age-related macular degeneration (AMD) is a major cause of blindness and visual impairment in older adults. The wet form of the disease is characterized by abnormal blood vessels that form a choroidal neovascular membrane (CNV), resulting in destruction of the normal architecture of the retina. Current evaluation and follow-up of wet AMD include subjective evaluation of Fluorescein Angiograms (FA) to determine the activity of the lesion and monitor the progression or regression of the disease. However, this subjective evaluation prevents accurate monitoring of disease progression or regression in response to a pharmacologic agent. In this work, we present a method that allows objective assessment of the activity of a CNV lesion, which can be statistically compared across different patients and time points. The method is based on the hypothesis that the discrepancy in the time-intensity signals between the diseased and normal retinal areas is due to an implicit transfer function whose parameters can be used to characterize the retina. The method begins with parametric modeling of the temporal variation of the lesion and background intensities. Then, the values of the model parameters are used to evaluate the change in the activity of the disease. Preliminary results on five datasets show that the calculated parameters are highly correlated with the Visual Acuity (VA) of the patients.

  19. Effectiveness of quantitative MAA SPECT/CT for the definition of vascularized hepatic volume and dosimetric approach: phantom validation and clinical preliminary results in patients with complex hepatic vascularization treated with yttrium-90-labeled microspheres.

    PubMed

    Garin, Etienne; Lenoir, Laurence; Rolland, Yan; Laffont, Sophie; Pracht, Marc; Mesbah, Habiba; Porée, Philippe; Ardisson, Valérie; Bourguet, Patrick; Clement, Bruno; Boucher, Eveline

    2011-12-01

The goal of this study was to assess the use of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) analysis for vascularized volume measurements in the use of yttrium-90-radiolabeled microspheres (TheraSphere). A phantom study was conducted for the validation of SPECT/CT volume measurement. SPECT/CT quantitative analysis was used for the measurement of the volume of distribution of the albumin macroaggregates (MAA; i.e., the vascularized volume) in the liver and the tumor, and of the total activity contained in the liver and the tumor, in four consecutive patients presenting with complex liver vascularization referred for treatment with TheraSphere. SPECT/CT volume measurement proved to be accurate (mean error <7%) and reproducible (interobserver concordance 0.99). For eight treatments, in cases of complex hepatic vascularization, the hepatic volumes based on angiography and CT led to a relative overestimation or underestimation of the vascularized hepatic volume by 43.2 ± 32.7% (5-87%) compared with SPECT/CT analyses. Taking into account the vascularized liver volume calculated from SPECT/CT data, instead of angiography and CT data, resulted in modification of the injected activity in three of eight treatments. Moreover, quantitative analysis of SPECT/CT allows us to calculate the absorbed dose in the tumor and in the healthy liver, leading to doubling of the injected activity in one of eight treatments. MAA SPECT/CT is accurate for volume measurements. It provides a valuable contribution to the therapeutic planning of patients presenting with complex hepatic vascularization, in particular for calculating the vascularized liver volume, the activity to be injected, and the absorbed doses. Studies should be conducted to assess the role of quantitative MAA SPECT/CT in therapeutic planning.
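A commonly used MIRD-style approximation for yttrium-90 microsphere dosimetry is D [Gy] ≈ 49.67 · A [GBq] / M [kg], assuming all decay energy is absorbed locally in the perfused tissue. The sketch below applies it to a hypothetical tumor/liver activity partition; the activity split and masses are illustrative, not from the study.

```python
def y90_absorbed_dose(activity_gbq, mass_kg, retained_fraction=1.0):
    """MIRD-style estimate for Y-90 microspheres, assuming all decay
    energy is absorbed locally: D [Gy] ~= 49.67 * A [GBq] / M [kg]."""
    return 49.67 * activity_gbq * retained_fraction / mass_kg

# hypothetical partition: 2 GBq injected, a 0.25 kg tumour taking up 60%
# of the activity, the remaining 40% going to 1.2 kg of perfused liver
tumour_dose = y90_absorbed_dose(2.0 * 0.60, 0.25)
liver_dose = y90_absorbed_dose(2.0 * 0.40, 1.20)
print(round(tumour_dose), round(liver_dose))  # Gy to tumour and healthy liver
```

This illustrates why accurate vascularized volumes matter: both doses scale inversely with the mass estimate, so a volume error propagates directly into the planned activity.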

  20. Qualitative and quantitative results of interferon-γ release assays for monitoring the response to anti-tuberculosis treatment

    PubMed Central

    Park, I-Nae; Shim, Tae Sun

    2017-01-01

Background/Aims The usefulness of interferon-γ release assays (IGRAs) in monitoring responses to anti-tuberculosis (TB) treatment is controversial. We compared the results of two IGRAs before and after anti-TB treatment in the same patients with active TB. Methods From a retrospective review, we selected patients with active TB who underwent repeated QuantiFERON-TB Gold (QFN-Gold, Cellestis Limited) and T-SPOT.TB (Oxford Immunotec) assays before and after anti-TB treatment with first-line drugs. Both tests were performed prior to the start of anti-TB treatment or within 1 week after the start of anti-TB treatment, and after completion of treatment. Results A total of 33 active TB patients were included in the study. On the QFN-Gold test, at baseline, 23 cases (70%) were early secreted antigenic target 6-kDa protein 6 (ESAT-6) or culture filtrate protein 10 (CFP-10) positive. On the T-SPOT.TB test, at baseline, 31 cases (94%) were ESAT-6 or CFP-10 positive. Most patients remained positive on both tests after anti-TB treatment. Although changes in interferon-γ release responses over time were highly variable in both tests, there was a mean decline of 27 and 24 spot-forming counts for ESAT-6 and CFP-10, respectively, on the T-SPOT.TB test (p < 0.05 for all). Conclusions Although limited by the small number of patients and a short-term follow-up, there was a significant decline in the quantitative result of the T-SPOT.TB test with treatment. However, both commercial IGRAs may not provide evidence regarding the cure of disease in Korea, a country where the prevalence of TB is within the intermediate range. PMID:27951621

  1. The allele distribution in next-generation sequencing data sets is accurately described as the result of a stochastic branching process.

    PubMed

    Heinrich, Verena; Stange, Jens; Dickhaus, Thorsten; Imkeller, Peter; Krüger, Ulrike; Bauer, Sebastian; Mundlos, Stefan; Robinson, Peter N; Hecht, Jochen; Krawitz, Peter M

    2012-03-01

With the availability of next-generation sequencing (NGS) technology, it is expected that sequence variants may be called on a genomic scale. Here, we demonstrate that a deeper understanding of the distribution of the variant call frequencies at heterozygous loci in NGS data sets is a prerequisite for sensitive variant detection. We model the crucial steps in an NGS protocol as a stochastic branching process and derive a mathematical framework for the expected distribution of alleles at heterozygous loci prior to measurement, that is, prior to sequencing. We confirm our theoretical results by analyzing technical replicates of human exome data and demonstrate that the variance of allele frequencies at heterozygous loci is higher than expected under a simple binomial distribution. Due to this high variance, mutation callers relying on binomially distributed priors are less sensitive to heterozygous variants that deviate strongly from the expected mean frequency. Our results also indicate that error rates can be reduced to a greater degree by technical replicates than by increasing sequencing depth.
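The branching-process argument can be illustrated with a toy Galton-Watson simulation of amplification before read sampling: stochastic duplication of the two alleles inflates the variance of the observed allele fraction beyond the plain binomial prediction. All parameters here (starting copies, cycles, duplication probability, read depth) are illustrative, not the paper's model.

```python
import random

def amplify(n0, cycles, p_dup, rng):
    """Galton-Watson sketch of amplification: each molecule duplicates
    with probability p_dup in every cycle."""
    n = n0
    for _ in range(cycles):
        n += sum(1 for _ in range(n) if rng.random() < p_dup)
    return n

def read_fraction(rng, n0=5, cycles=10, p_dup=0.6, depth=100):
    """Fraction of `depth` reads carrying the alternate allele at a
    heterozygous site, after independent amplification of both alleles."""
    alt = amplify(n0, cycles, p_dup, rng)
    ref = amplify(n0, cycles, p_dup, rng)
    k = sum(1 for _ in range(depth) if rng.random() < alt / (alt + ref))
    return k / depth

rng = random.Random(1)
fracs = [read_fraction(rng) for _ in range(400)]
mean = sum(fracs) / len(fracs)
var = sum((f - mean) ** 2 for f in fracs) / (len(fracs) - 1)
binom_var = 0.5 * 0.5 / 100  # what a plain binomial model would predict
print(mean, var, binom_var)  # amplification noise inflates the variance
```

The overdispersion is driven by the first few cycles, when copy numbers are small; a caller that assumes binomial variance will under-weight the legitimate tails of this wider distribution, matching the sensitivity loss described in the abstract.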

  2. Longitudinal, intermodality registration of quantitative breast PET and MRI data acquired before and during neoadjuvant chemotherapy: Preliminary results

    SciTech Connect

    Atuegwu, Nkiruka C.; Williams, Jason M.; Li, Xia; Arlinghaus, Lori R.; Abramson, Richard G.; Chakravarthy, A. Bapsi; Abramson, Vandana G.; Yankeelov, Thomas E.

    2014-05-15

Purpose: The authors propose a method whereby serially acquired DCE-MRI, DW-MRI, and FDG-PET breast data sets can be spatially and temporally coregistered to enable the comparison of changes in parameter maps at the voxel level. Methods: First, the authors aligned the PET and MR images at each time point rigidly and nonrigidly. To register the MR images longitudinally, the authors extended a nonrigid registration algorithm by including a tumor volume-preserving constraint in the cost function. After the PET images were aligned to the MR images at each time point, the authors then used the transformation obtained from the longitudinal registration of the MRI volumes to register the PET images longitudinally. The authors tested this approach on ten breast cancer patients by calculating a modified Dice similarity of tumor size between the PET and MR images as well as the bending energy and changes in the tumor volume after the application of the registration algorithm. Results: The median of the modified Dice in the registered PET and DCE-MRI data was 0.92. For the longitudinal registration, the median tumor volume change was −0.03% for the constrained algorithm, compared to −32.16% for the unconstrained registration algorithms (p = 8 × 10⁻⁶). The medians of the bending energy were 0.0092 and 0.0001 for the unconstrained and constrained algorithms, respectively (p = 2.84 × 10⁻⁷). Conclusions: The results indicate that the proposed method can accurately spatially align DCE-MRI, DW-MRI, and FDG-PET breast images acquired at different time points during therapy while preventing the tumor from being substantially distorted or compressed.
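The overlap metric underlying the comparison is the Dice similarity coefficient, 2|A ∩ B| / (|A| + |B|); the paper uses a modified variant, so the toy sketch below shows only the standard form on voxel sets.

```python
def dice(mask_a, mask_b):
    """Standard Dice similarity: 2|A ∩ B| / (|A| + |B|) for voxel sets."""
    inter = len(mask_a & mask_b)
    return 2 * inter / (len(mask_a) + len(mask_b))

# toy 2-D "tumor masks" as sets of voxel coordinates
pet = {(0, 0), (0, 1), (1, 0), (1, 1)}
mri = {(0, 1), (1, 0), (1, 1), (2, 1)}
print(dice(pet, mri))  # 0.75
```

Dice ranges from 0 (no overlap) to 1 (identical masks), so the reported median of 0.92 indicates close PET-MRI tumor agreement after registration.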

  3. Mast Cells Are Abundant in Primary Cutaneous T-Cell Lymphomas: Results from a Computer-Aided Quantitative Immunohistological Study

    PubMed Central

    Eder, Johanna; Rogojanu, Radu; Jerney, Waltraud; Erhart, Friedrich; Dohnal, Alexander; Kitzwögerer, Melitta; Steiner, Georg; Moser, Julia; Trautinger, Franz

    2016-01-01

    Background Mast cells (MC) are bone marrow derived haematopoietic cells playing a crucial role not only in immune response but also in the tumor microenvironment, with protumorigenic and antitumorigenic functions. The role of MC in primary cutaneous T-cell lymphomas (CTCL), a heterogeneous group of non-Hodgkin lymphomas with initial presentation in the skin, is largely unknown. Objective To gain more accurate information about the presence, number, distribution and state of activation (degranulated vs. non-degranulated) of MC in CTCL variants and clinical stages. Materials and Methods We established a novel computer-aided tissue analysis method on digitized skin sections. Immunohistochemistry with an anti-MC tryptase antibody was performed on 34 biopsies of different CTCL subtypes and on control skin samples. An algorithm for the automatic detection of the epidermis and of cell-density-based CTCL areas was developed. Cells were stratified as being within the CTCL infiltrate, in P1 (a surrounding area 0–30 μm away from CTCL), or in P2 (30–60 μm away from CTCL). Results We found high MC counts within CTCL infiltrates and P1 and a decreased MC number in the surrounding dermis P2. Higher MC numbers were found in MF compared to all other CTCL subgroups. Regarding different stages of MF, we found significantly higher mast cell counts in stages IA and IB than in stages IIA and IIB. Regarding MC densities, we found a higher density of MC in MF compared to all other CTCL subgroups. More MC were non-degranulated than degranulated. Conclusion Here, for the first time, an automated method for MC analysis on tissue sections and its use in CTCL are described. Eliminating error from investigator bias, the method allows for precise cell identification and counting. Our results provide new insights on MC distribution in CTCL, reappraising their role in the pathophysiology of CTCL. PMID:27893746

  4. A recurrent neural network approach to quantitatively studying solar wind effects on TEC derived from GPS; preliminary results

    NASA Astrophysics Data System (ADS)

    Habarulema, J. B.; McKinnell, L.-A.; Opperman, B. D. L.

    2009-05-01

    This paper attempts to describe the search for the parameter(s) to represent solar wind effects in Global Positioning System total electron content (GPS TEC) modelling using the technique of neural networks (NNs). A study is carried out by including solar wind velocity (Vsw), proton number density (Np) and the Bz component of the interplanetary magnetic field (IMF Bz) obtained from the Advanced Composition Explorer (ACE) satellite as separate inputs to the NN each along with day number of the year (DN), hour (HR), a 4-month running mean of the daily sunspot number (R4) and the running mean of the previous eight 3-hourly magnetic A index values (A8). Hourly GPS TEC values derived from a dual frequency receiver located at Sutherland (32.38° S, 20.81° E), South Africa for 8 years (2000-2007) have been used to train the Elman neural network (ENN) and the result has been used to predict TEC variations for a GPS station located at Cape Town (33.95° S, 18.47° E). Quantitative results indicate that each of the parameters considered may have some degree of influence on GPS TEC at certain periods although a decrease in prediction accuracy is also observed for some parameters for different days and seasons. It is also evident that there is still a difficulty in predicting TEC values during disturbed conditions. The improvements and degradation in prediction accuracies are both close to the benchmark values which lends weight to the belief that diurnal, seasonal, solar and magnetic variabilities may be the major determinants of TEC variability.

  5. ICGA-PSO-ELM approach for accurate multiclass cancer classification resulting in reduced gene sets in which genes encoding secreted proteins are highly represented.

    PubMed

    Saraswathi, Saras; Sundaram, Suresh; Sundararajan, Narasimhan; Zimmermann, Michael; Nilsen-Hamilton, Marit

    2011-01-01

    A combination of Integer-Coded Genetic Algorithm (ICGA) and Particle Swarm Optimization (PSO), coupled with the neural-network-based Extreme Learning Machine (ELM), is used for gene selection and cancer classification. ICGA is used with PSO-ELM to select an optimal set of genes, which is then used to build a classifier to develop an algorithm (ICGA_PSO_ELM) that can handle sparse data and sample imbalance. We evaluate the performance of ICGA-PSO-ELM and compare our results with existing methods in the literature. An investigation into the functions of the selected genes, using a systems biology approach, revealed that many of the identified genes are involved in cell signaling and proliferation. An analysis of these gene sets shows a larger representation of genes that encode secreted proteins than found in randomly selected gene sets. Secreted proteins constitute a major means by which cells interact with their surroundings. Mounting biological evidence has identified the tumor microenvironment as a critical factor that determines tumor survival and growth. Thus, the genes identified by this study that encode secreted proteins might provide important insights to the nature of the critical biological features in the microenvironment of each tumor type that allow these cells to thrive and proliferate.

  6. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  7. 4D Seismic Monitoring at the Ketzin Pilot Site during five years of storage - Results and Quantitative Assessment

    NASA Astrophysics Data System (ADS)

    Lüth, Stefan; Ivanova, Alexandra; Ivandic, Monika; Götz, Julia

    2015-04-01

    The Ketzin pilot site for geological CO2 storage was operative between June 2008 and August 2013. In this period, 67 kt of CO2 were injected (Martens et al., this conference). Repeated 3D seismic monitoring surveys were performed before and during CO2 injection. A third repeat survey, providing data from the post-injection phase, is currently being prepared for the autumn of 2015. The large-scale 3D surface seismic measurements have been complemented by other geophysical and geochemical monitoring methods, among which are high-resolution seismic surface-downhole observations. These observations have concentrated on the reservoir area in the vicinity of the injection well and provide high-resolution images as well as data for petrophysical quantification of the CO2 distribution in the reservoir. The Ketzin pilot site is a saline aquifer site in an onshore environment, which poses specific challenges for reliable monitoring of the injected CO2. Although much effort was made to keep acquisition conditions as nearly identical as possible, a high degree of repeatability noise was observed, mainly due to varying weather conditions and to variations in the acquisition geometries for logistical reasons. Nevertheless, time-lapse processing succeeded in generating 3D time-lapse data sets which could be interpreted in terms of CO2-storage-related amplitude variations in the depth range of the storage reservoir. The time-lapse seismic data, pulsed-neutron-gamma logging results (saturation), and petrophysical core measurements were interpreted together in order to estimate the amount of injected carbon dioxide imaged by the seismic repeat data. For the first repeat survey, the mass estimate summed to 20.5 kt, approximately 7% less than what had been injected by that time. For the second repeat survey, the mass estimate summed to approximately 10-15% less than what had been injected. The deviations may be explained by several factors.

  8. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  9. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson's extrapolation results in a simple, highly accurate numerical method for solving a Schrodinger's equation. Important results are that error estimates are provided, and that one can extrapolate expectation values rather than the wavefunctions to obtain highly accurate expectation values. We discuss the eigenvalues, the error growth in repeated Richardson's extrapolation, and show that the expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
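The extrapolation idea above can be sketched on a textbook case. Below, a second-order finite-difference discretization of the harmonic-oscillator Schrödinger equation (ground-state energy exactly 0.5 in units with ħ = m = ω = 1) is solved on a coarse mesh and a half-step mesh, and one Richardson step cancels the leading O(h²) eigenvalue error. This is a minimal illustration under those stated assumptions, not the authors' full scheme (which also extrapolates expectation values):

```python
import numpy as np

def ground_state_energy(n_interior):
    """Lowest eigenvalue of -(1/2) psi'' + (1/2) x^2 psi = E psi on [-8, 8],
    discretized with the second-order central-difference Laplacian."""
    x, h = np.linspace(-8.0, 8.0, n_interior + 2, retstep=True)
    xi = x[1:-1]                                  # interior grid points
    main = 1.0 / h**2 + 0.5 * xi**2               # diagonal: kinetic + potential
    off = -0.5 / h**2 * np.ones(n_interior - 1)   # off-diagonal kinetic coupling
    H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    return np.linalg.eigvalsh(H)[0]

E_h = ground_state_energy(200)      # 201 intervals, spacing h
E_h2 = ground_state_energy(401)     # 402 intervals, spacing exactly h/2
E_rich = (4.0 * E_h2 - E_h) / 3.0   # Richardson step cancels the O(h^2) term
```

The extrapolated value is several orders of magnitude closer to the exact 0.5 than either raw eigenvalue, which is the effect the abstract exploits.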

  10. Performance Observations of Scanner Qualification of NCI-Designated Cancer Centers: Results From the Centers of Quantitative Imaging Excellence (CQIE) Program

    PubMed Central

    Rosen, Mark; Kinahan, Paul E.; Gimpel, James F.; Opanowski, Adam; Siegel, Barry A.; Hill, G. Craig; Weiss, Linda; Shankar, Lalitha

    2016-01-01

    We present an overview of the Centers for Quantitative Imaging Excellence (CQIE) program, which was initiated in 2010 to establish a resource of clinical trial-ready sites within the National Cancer Institute (NCI)-designated Cancer Centers (NCI-CCs) network. The intent was to enable imaging centers in the NCI-CCs network capable of conducting treatment trials with advanced quantitative imaging end points. We describe the motivations for establishing the CQIE, the process used to initiate the network, the methods of site qualification for positron emission tomography, computed tomography, and magnetic resonance imaging, and the results of the evaluations over the subsequent 3 years. PMID:28395794

  11. Quantitative Assessment of Motor and Sensory/Motor Acquisition in Handicapped and Nonhandicapped Infants and Young Children. Volume II: Interobserver Reliability Results for the Procedures.

    ERIC Educational Resources Information Center

    Guess, Doug; And Others

    The second of a three volume report on a University of Kansas approach to developing quantitative measures of motor and perceptual motor functioning in nonhandicapped and severely/multiply handicapped infants and young children presents interobserver reliability results from the measures described in volume 1. Some studies also include a limited…

  12. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  13. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
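The grouping-and-merging step can be illustrated with a minimal sketch: two segments are merged when their angular difference and endpoint gap fall below thresholds, and the merged segment spans the farthest-apart endpoints. The fixed `max_angle` and `max_gap` values here are placeholders for the paper's adaptive mergeability criteria:

```python
import math

def segment_angle(seg):
    """Undirected line angle in [0, pi)."""
    (x1, y1), (x2, y2) = seg
    return math.atan2(y2 - y1, x2 - x1) % math.pi

def endpoint_gap(a, b):
    """Smallest distance between any endpoint of a and any endpoint of b."""
    return min(math.dist(p, q) for p in a for q in b)

def try_merge(a, b, max_angle=math.radians(5), max_gap=10.0):
    """Merge two segments if nearly parallel and spatially close; the
    fixed thresholds stand in for the paper's adaptive criteria."""
    d_angle = abs(segment_angle(a) - segment_angle(b))
    d_angle = min(d_angle, math.pi - d_angle)  # handle wrap-around at pi
    if d_angle > max_angle or endpoint_gap(a, b) > max_gap:
        return None
    # merged segment: the farthest-apart pair among all four endpoints
    pts = list(a) + list(b)
    return max(((p, q) for p in pts for q in pts), key=lambda pq: math.dist(*pq))

broken = [((0, 0), (40, 1)), ((45, 1), (90, 2))]  # one line broken into two pieces
merged = try_merge(*broken)
```

Running `try_merge` repeatedly over all close pairs until no merge succeeds mirrors the iterative process described in the abstract.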

  14. Correlation of Serum and Dried Blood Spot Results for Quantitation of Schistosoma Circulating Anodic Antigen: a Proof of Principle

    PubMed Central

    Downs, Jennifer A.; Corstjens, Paul L.A.M.; Mngara, Julius; Lutonja, Peter; Isingo, Raphael; Urassa, Mark; Kornelis, Dieuwke; van Dam, Govert J.

    2015-01-01

    Circulating Anodic Antigen (CAA) testing is a powerful, increasingly used tool for diagnosis of active schistosome infection. We sought to determine the feasibility and reliability of measuring CAA in blood spots collected on Whatman 903 Protein Saver cards, which are the predominant filter papers used worldwide for dried blood spot (DBS) research and clinical care. CAA was eluted from blood spots collected from 19 individuals onto Whatman 903 cards in Mwanza, Tanzania, and the assay was optimized to achieve CAA ratios comparable to those obtained from the spots’ corresponding serum samples. The optimized assay was then used to determine the correlation of serum samples (n=16) with DBS from cards that had been stored for 8 years at ambient temperature. Using a DBS volume equivalent to approximately four times the quantity of serum, CAA testing in DBS had a sensitivity of 76% and a specificity of 79% compared to CAA testing in serum. CAA testing was reliable in samples eluted from Whatman 903 cards that had been stored for 8 years at ambient temperature. The overall kappa coefficient was 0.53 (standard error 0.17, p<0.001). We conclude that CAA can be reliably and accurately measured in DBS collected onto the filter paper that is most commonly used for clinical care and research, and that can be stored for prolonged periods of time. This finding opens new avenues for future work among more than 700 million individuals living in areas worldwide in which schistosomes are endemic. PMID:26149541

  15. Modelling Study at Kutlular Copper Field with SP: Evaluation Steps of Copper Mine Field SP Data Are Shown to Reach More Accurate Results for the SP Inversion Method.

    NASA Astrophysics Data System (ADS)

    Sahin, O. K.; Asci, M.

    2014-12-01

    In this study, the determination of theoretical parameters for the inversion of the Trabzon-Sürmene-Kutlular ore bed anomalies was examined. Deciding which model equation to use for the inversion is the most important first step, as the right choice gives the best chance of accurate results. Sections were therefore first evaluated with a sphere-cylinder nomogram. The same sections were then analyzed with a cylinder-dike nomogram to determine the theoretical parameters for the inversion process for each model equation. Comparing the results, we found that only one of them was close to the parameters of the nomogram evaluations; the other inversion result parameters differed from their nomogram parameters.

  16. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

    Surface color measurement is important in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and thus distinguish millions of colors. This 0.5 unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but if we also want exact color coordinate values, accuracy problems arise. The values from two instruments can be astonishingly different. The accuracy of an instrument used in color measurement may depend on various errors, such as photometric non-linearity, wavelength error, integrating sphere dark level error, and integrating sphere error in both specular included and specular excluded modes. Correction formulas should therefore be used to get more accurate results. Another question is how many channels, i.e., wavelengths, we use to measure a spectrum. Obviously the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise between measuring time, conditions and cost. Sometimes we have to use a portable system, or the shape and size of the samples make it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers are compared. In the paper we explain the typical error sources of spectral color measurements and show what accuracy demands a good colorimeter should meet.

  17. Body image in adolescence: cross-cultural research--results of the preliminary phase of a quantitative survey.

    PubMed

    Ferron, C

    1997-01-01

    This preliminary phase of a quantitative research project had two main objectives: to identify the emotional and relational components of body image in adolescents, and to determine whether the experience of body changes depends on the individual's context. Two samples of adolescents, both 13 to 17 years of age, who were healthy, middle- or upper-middle-class, and randomly chosen, participated in the study. Subjects were 80 French adolescents (40 boys and 40 girls) from a center for preventive medicine, and 60 American adolescents (30 boys and 30 girls) from a suburban high school. Thorough individual interviews were conducted with these adolescents on the basis of a precise interview guide in order to determine their perceptions, attitudes, and beliefs about body image. A thematic analysis of the content of these recorded interviews revealed the differences between adolescents from the two countries. It was found that the main cultural differences were based on the belief that the real body and the ideal body coincide, and on the way physical appearance is included in the diversity of relational experiences. Gender differences were shown to be centered more on the level of control of body changes and on self-assessment modes; the signs of a failing or troubled body image may find their origin on an individual level, in the particularities of the family and parental language about the body, and on a collective level in the social representation of the body. The consequences of these symbolic representations on the adolescents' body image and attitudes toward their own health are presented and discussed.

  18. A method to accurately quantitate intensities of (32)P-DNA bands when multiple bands appear in a single lane of a gel is used to study dNTP insertion opposite a benzo[a]pyrene-dG adduct by Sulfolobus DNA polymerases Dpo4 and Dbh.

    PubMed

    Sholder, Gabriel; Loechler, Edward L

    2015-01-01

    Quantitating relative (32)P-band intensity in gels is desired, e.g., to study primer-extension kinetics of DNA polymerases (DNAPs). Following imaging, multiple (32)P-bands are often present in lanes. Though individual bands appear by eye to be simple and well-resolved, scanning reveals they are actually skewed-Gaussian in shape and neighboring bands are overlapping, which complicates quantitation, because slower migrating bands often have considerable contributions from the trailing edges of faster migrating bands. A method is described to accurately quantitate adjacent (32)P-bands, which relies on having a standard: a simple skewed-Gaussian curve from an analogous pure, single-component band (e.g., primer alone). This single-component scan/curve is superimposed on its corresponding band in an experimentally determined scan/curve containing multiple bands (e.g., generated in a primer-extension reaction); intensity exceeding the single-component scan/curve is attributed to other components (e.g., insertion products). Relative areas/intensities are determined via pixel analysis, from which relative molarity of components is computed. Common software is used. Commonly used alternative methods (e.g., drawing boxes around bands) are shown to be less accurate. Our method was used to study kinetics of dNTP primer-extension opposite a benzo[a]pyrene-N(2)-dG-adduct with four DNAPs, including Sulfolobus solfataricus Dpo4 and Sulfolobus acidocaldarius Dbh. Vmax/Km is similar for correct dCTP insertion with Dpo4 and Dbh. Compared to Dpo4, Dbh misinsertion is slower for dATP (∼20-fold), dGTP (∼110-fold) and dTTP (∼6-fold), due to decreases in Vmax. These findings provide support that Dbh is in the same Y-Family DNAP class as eukaryotic DNAP κ and bacterial DNAP IV, which accurately bypass N(2)-dG adducts, as well as establish the scan-method described herein as an accurate method to quantitate relative intensity of overlapping bands in a single lane, whether generated
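The core of the scan method described above, superimposing a pure single-component profile on a multi-band lane scan and attributing the excess intensity to other components, can be sketched with synthetic data. The skewed-Gaussian profiles, the fit window, and the 60/40 primer/product split below are all made up for illustration:

```python
import numpy as np

x = np.linspace(0.0, 100.0, 1001)  # position along the lane scan

def band(center, amp, width=3.0, skew=0.15):
    """Toy skewed-Gaussian band profile standing in for a real (32)P band."""
    g = amp * np.exp(-0.5 * ((x - center) / width) ** 2)
    return g * (1.0 + skew * np.tanh((x - center) / width))

standard = band(40.0, 1.0)                   # pure primer-alone lane scan
mixture = band(40.0, 0.6) + band(48.0, 0.4)  # primer band overlapped by product

# Least-squares scale of the standard against the uncontaminated leading
# edge of the primer band; intensity beyond the scaled standard is then
# attributed to the extension product.
lead = x < 36.0
scale = (mixture[lead] * standard[lead]).sum() / (standard[lead] ** 2).sum()
primer_area = scale * standard.sum()
product_area = mixture.sum() - primer_area
fraction_extended = product_area / (primer_area + product_area)
```

On these synthetic bands the recovered extended fraction comes out close to the 0.4 used to build the mixture, even though the two bands overlap, which is the situation box-drawing methods handle poorly.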

  19. Comparison of the Multiple-sample means with composite sample results for fecal indicator bacteria by quantitative PCR and culture

    EPA Science Inventory

    ABSTRACT: Few studies have addressed the efficacy of composite sampling for measurement of indicator bacteria by QPCR. In this study, composite results were compared to single sample results for culture- and QPCR-based water quality monitoring. Composite results for both methods ...

  20. Beam hardening artifacts in micro-computed tomography scanning can be reduced by X-ray beam filtration and the resulting images can be used to accurately measure BMD.

    PubMed

    Meganck, Jeffrey A; Kozloff, Kenneth M; Thornton, Michael M; Broski, Stephen M; Goldstein, Steven A

    2009-12-01

    Bone mineral density (BMD) measurements are critical in many research studies investigating skeletal integrity. For pre-clinical research, micro-computed tomography (microCT) has become an essential tool in these studies. However, the ability to measure the BMD directly from microCT images can be biased by artifacts, such as beam hardening, in the image. This three-part study was designed to understand how the image acquisition process can affect the resulting BMD measurements and to verify that the BMD measurements are accurate. In the first part of this study, the effect of beam hardening-induced cupping artifacts on BMD measurements was examined. In the second part of this study, the number of bones in the X-ray path and the sampling process during scanning was examined. In the third part of this study, microCT-based BMD measurements were compared with ash weights to verify the accuracy of the measurements. The results indicate that beam hardening artifacts of up to 32.6% can occur in sample sizes of interest in studies investigating mineralized tissue and affect mineral density measurements. Beam filtration can be used to minimize these artifacts. The results also indicate that, for murine femora, the scan setup can impact densitometry measurements for both cortical and trabecular bone and morphologic measurements of trabecular bone. Last, when a scan setup that minimized all of these artifacts was used, the microCT-based measurements correlated well with ash weight measurements (R(2)=0.983 when air was excluded), indicating that microCT can be an accurate tool for murine bone densitometry.

  1. The laminar flow tube reactor as a quantitative tool for nucleation studies: Experimental results and theoretical analysis of homogeneous nucleation of dibutylphthalate

    SciTech Connect

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    2000-09-01

    A laminar flow tube reactor was designed and constructed to provide an accurate, quantitative measurement of a nucleation rate as a function of supersaturation and temperature. Measurements of nucleation of a supersaturated vapor of dibutylphthalate have been made for the temperature range from -30.3 to +19.1 °C. A thorough analysis of the possible sources of experimental uncertainties (such as defining the correct value of the initial vapor concentration, temperature boundary conditions on the reactor walls, accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and particle concentration measurement) is given. Both the isothermal and the isobaric nucleation rates were measured. The experimental data obtained were compared with the measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. Theoretical analysis, based on the first and the second nucleation theorems, is also presented. The critical cluster size and the excess of internal energy of the critical cluster are obtained. (c) 2000 American Institute of Physics.
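For reference, the first nucleation theorem invoked above connects the measured dependence of the nucleation rate J on the supersaturation S to the critical cluster size n* (a standard statement; the exact form of the small additive correction varies between treatments):

```latex
\left( \frac{\partial \ln J}{\partial \ln S} \right)_T \approx n^{*} + 1
```

This is how the critical cluster size can be extracted directly from isothermal rate measurements without assuming a specific nucleation model; the second theorem plays the analogous role for the temperature dependence and the excess internal energy.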

  2. The Laminar Flow Tube Reactor as a Quantitative Tool for Nucleation Studies: Experimental Results and Theoretical Analysis of Homogeneous Nucleation of Dibutylphthalate

    SciTech Connect

    Mikheev, Vladimir B.; Laulainen, Nels S.; Barlow, Stephan E.; Knott, Michael; Ford, Ian J.

    1999-12-01

    A Laminar Flow Tube Reactor has been designed and constructed in order to provide an accurate, quantitative measurement of a nucleation rate as a function of supersaturation and temperature. Measurements of nucleation of a supersaturated vapor of dibutylphthalate have been made for the temperature range from -30.3 C to +19.1 C. A thorough analysis of the possible sources of experimental uncertainties (such as defining the correct value of the initial vapor concentration, temperature boundary conditions on the reactor walls, accuracy of the calculations of the thermodynamic parameters of the nucleation zone, and particle concentration measurement) has been provided. Both the isothermal and the isobaric nucleation rates have been measured. The experimental data obtained have been compared with measurements of other experimental groups and with theoretical predictions made on the basis of the self-consistency correction nucleation theory. Theoretical analysis based on the first and the second nucleation theorems has been made. The critical cluster size and the excess of internal energy of the critical cluster have been obtained.

  3. Results.

    ERIC Educational Resources Information Center

    Zemsky, Robert; Shaman, Susan; Shapiro, Daniel B.

    2001-01-01

    Describes the Collegiate Results Instrument (CRI), which measures a range of collegiate outcomes for alumni 6 years after graduation. The CRI was designed to target alumni from institutions across market segments and assess their values, abilities, work skills, occupations, and pursuit of lifelong learning. (EV)

  4. Quantitative assessment of the impact of biomedical image acquisition on the results obtained from image analysis and processing

    PubMed Central

    2014-01-01

    Introduction Dedicated, automatic algorithms for image analysis and processing are becoming more and more common in medical diagnosis. When creating dedicated algorithms, many factors must be taken into consideration. They are associated with selecting the appropriate algorithm parameters and taking into account the impact of data acquisition on the results obtained. An important feature of algorithms is the possibility of their use in other medical units by other operators. This problem, namely the operator’s (acquisition) impact on the results obtained from image analysis and processing, is illustrated with a few examples. Material and method The analysed images were obtained from a variety of medical devices such as thermal imaging, tomography devices and those working in visible light. The objects of imaging were cellular elements, the anterior segment and fundus of the eye, postural defects and others. In total, almost 200,000 images coming from 8 different medical units were analysed. All image analysis algorithms were implemented in C and Matlab. Results For various algorithms and methods of medical imaging, the impact of image acquisition on the results obtained is different. There are different levels of algorithm sensitivity to changes in the parameters, for example: (1) for microscope settings and the brightness assessment of cellular elements there is a difference of 8%; (2) for the thyroid ultrasound images there is a difference in marking the thyroid lobe area which results in a brightness assessment difference of 2%. The method of image acquisition in image analysis and processing also affects: (3) the accuracy of determining the temperature in the characteristic areas on the patient’s back for the thermal method - error of 31%; (4) the accuracy of finding characteristic points in photogrammetric images when evaluating postural defects – error of 11%; (5) the accuracy of performing ablative and non-ablative treatments in cosmetology - error of 18

  5. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema where accuracy degenerates to second-order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain of accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
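
The trade-off described above can be illustrated with a minimal sketch of a standard monotone piecewise cubic Hermite interpolant. This uses a simple harmonic-mean slope limiter in the Fritsch-Carlson style, not Huynh's median-based relaxation; it shows the conventional construction (and its flattening at extrema) that the cited algorithms improve upon, and all names are illustrative.

```python
def monotone_cubic(x, y):
    """Monotone cubic Hermite interpolant on sorted knots x with values y."""
    n = len(x)
    h = [x[i + 1] - x[i] for i in range(n - 1)]
    d = [(y[i + 1] - y[i]) / h[i] for i in range(n - 1)]  # secant slopes
    m = [0.0] * n
    m[0], m[-1] = d[0], d[-1]
    for i in range(1, n - 1):
        if d[i - 1] * d[i] <= 0:
            m[i] = 0.0  # sign change: flatten the derivative to stay monotone
        else:
            m[i] = 2 * d[i - 1] * d[i] / (d[i - 1] + d[i])  # harmonic-mean limiter

    def f(t):
        i = 0
        while i < n - 2 and t > x[i + 1]:
            i += 1  # locate the interval containing t
        s = (t - x[i]) / h[i]
        # cubic Hermite basis functions
        h00 = (1 + 2 * s) * (1 - s) ** 2
        h10 = s * (1 - s) ** 2
        h01 = s * s * (3 - 2 * s)
        h11 = s * s * (s - 1)
        return (h00 * y[i] + h10 * h[i] * m[i]
                + h01 * y[i + 1] + h11 * h[i] * m[i + 1])

    return f
```

Note how the limiter zeroes the derivative wherever the secant slopes change sign; that is precisely the behavior that costs an order of accuracy near strict local extrema and that the median-based relaxation is designed to avoid.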

  6. Quantitative estimates of the risk of new outbreaks of foot-and-mouth disease as a result of burning pyres.

    PubMed

    Jones, R; Kelly, L; French, N; England, T; Livesey, C; Wooldridge, M

    2004-02-07

    The risk of dispersing foot-and-mouth disease virus into the atmosphere, and spreading it to susceptible holdings as a result of burning large numbers of carcases together on open pyres, has been estimated for six selected pyres burned during the 2001 outbreak in the UK. The probability of an animal or holding becoming infected was dependent on the estimated level of exposure to the virus predicted from the concentrations of virus calculated by the Met Office, Bracknell. In general, the probability of infection per animal and per holding decreased as their distance from the pyre increased. In the case of two of the pyres, a holding under the pyre plumes became infected on a date consistent with when the pyre was lit. However, by calculating their estimated probability of infection from the pyres it was concluded that it was unlikely that in either case the pyre was the source of infection.

  7. Quantitative comparison between theoretical predictions and experimental results for Bragg spectroscopy of a strongly interacting Fermi superfluid

    NASA Astrophysics Data System (ADS)

    Zou, Peng; Kuhnle, Eva D.; Vale, Chris J.; Hu, Hui

    2010-12-01

    Theoretical predictions for the dynamic structure factor of a harmonically trapped Fermi superfluid near the Bose-Einstein condensate-Bardeen-Cooper-Schrieffer (BEC-BCS) crossover are compared with recent Bragg spectroscopy measurements at large transferred momenta. The calculations are based on a random-phase (or time-dependent Hartree-Fock-Gorkov) approximation generalized to the strongly interacting regime. Excellent agreement with experimental spectra at low temperatures is obtained, with no free parameters. Theoretical predictions for zero-temperature static structure factor are also found to agree well with the experimental results and independent theoretical calculations based on the exact Tan relations. The temperature dependence of the structure factors at unitarity is predicted.

  8. Quantitative comparison of transient elastography (TE), shear wave elastography (SWE) and liver biopsy results of patients with chronic liver disease.

    PubMed

    Kim, Hyun-Jin; Lee, Hae-Kag; Cho, Jae-Hwan; Yang, Han-Jun

    2015-08-01

    [Purpose] The purpose of this study was to carry out a comparative analysis of the hepatic fibrosis results for the liver hardness of patients with chronic liver disease as measured by transient elastography (TE), shear wave elastography (SWE), and liver biopsy. [Subjects and Methods] This study was a retrospective analysis of 304 patients who underwent SWE and TE before and after liver biopsy, taken from among patients who had been checked for liver fibrosis by liver biopsy between August 2013 and August 2014. We used receiver operating characteristic (ROC) curves to establish the diagnostic significance of liver stiffness, and then analyzed the sensitivity, specificity, accuracy, positive predictive value, and negative predictive value of SWE and TE, as well as the kappa index, through cross-analysis of SWE, TE, and liver biopsy. [Results] For liver hardness, the sensitivity of SWE was 84.39%, the specificity of SWE was 97.92%, the accuracy of SWE was 87.33%, the positive predictive value of SWE was 99.32%, and the negative predictive value of SWE was 63.51%. The sensitivity of TE was 94.80%, the specificity of TE was 77.08%, the accuracy of TE was 90.95%, the positive predictive value of TE was 93.97%, and the negative predictive value of TE was 80.43%. [Conclusion] It is our opinion that SWE and TE are non-invasive methods that are more effective than the invasive methods used for diagnosing liver hardness. Invasive methods sample only a section of liver tissue, and are more likely to cause side effects during biopsy.
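
For readers unfamiliar with the metrics quoted above, here is a short sketch of how they are derived from a 2x2 confusion matrix against the biopsy reference standard. The counts used in the example are hypothetical and not taken from the study.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-test metrics from confusion-matrix counts:
    tp/fp/fn/tn = true/false positives and negatives vs. the reference test."""
    sensitivity = tp / (tp + fn)          # fraction of diseased correctly flagged
    specificity = tn / (tn + fp)          # fraction of healthy correctly cleared
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    ppv = tp / (tp + fp)                  # positive predictive value
    npv = tn / (tn + fn)                  # negative predictive value
    return sensitivity, specificity, accuracy, ppv, npv

# Hypothetical counts, for illustration only:
metrics = diagnostic_metrics(tp=80, fp=10, fn=20, tn=90)
```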

  9. Messages that increase women’s intentions to abstain from alcohol during pregnancy: results from quantitative testing of advertising concepts

    PubMed Central

    2014-01-01

    Background Public awareness-raising campaigns targeting alcohol use during pregnancy are an important part of preventing prenatal alcohol exposure and Fetal Alcohol Spectrum Disorder. Despite this, there is little evidence on what specific elements contribute to campaign message effectiveness. This research evaluated three different advertising concepts addressing alcohol and pregnancy: a threat appeal, a positive appeal promoting a self-efficacy message, and a concept that combined the two appeals. The primary aim was to determine the effectiveness of these concepts in increasing women’s intentions to abstain from alcohol during pregnancy. Methods Women of childbearing age and pregnant women residing in Perth, Western Australia participated in a computer-based questionnaire where they viewed either a control or one of the three experimental concepts. Following exposure, participants’ intentions to abstain from and reduce alcohol intake during pregnancy were measured. Other measures assessed included perceived main message, message diagnostics, and potential to promote defensive responses or unintended consequences. Results The concepts containing a threat appeal were significantly more effective at increasing women’s intentions to abstain from alcohol during pregnancy than the self-efficacy message and the control. The concept that combined threat and self-efficacy is recommended for development as part of a mass-media campaign as it has good persuasive potential, provides a balance of positive and negative emotional responses, and is unlikely to result in defensive or unintended consequences. Conclusions This study provides important insights into the components that enhance the persuasiveness and effectiveness of messages aimed at preventing prenatal alcohol exposure. The recommended concept has good potential for use in a future campaign aimed at promoting women’s intentions to abstain from alcohol during pregnancy. PMID:24410764

  10. TU-EF-204-12: Quantitative Evaluation of Spectral Detector CT Using Virtual Monochromatic Images: Initial Results

    SciTech Connect

    Duan, X; Guild, J; Arbique, G; Anderson, J; Dhanantwari, A; Yagil, Y

    2015-06-15

    Purpose To evaluate the image quality and spectral information of a spectral detector CT (SDCT) scanner using virtual monochromatic (VM) energy images. Methods The SDCT scanner (Philips Healthcare) was equipped with a dual-layer detector and spectral iterative reconstruction (IR), which generates conventional 80–140 kV polychromatic energy (PE) CT images using both detector layers, PE images from the low-energy (upper) and high-energy (lower) detector layers, and VM images. A solid water phantom with iodine (2.0–20.0 mg I/ml) and calcium (50.0–600.0 mg Ca/ml) rod inserts was used to evaluate the effective energy estimate (EEE) and iodine contrast-to-noise ratio (CNR). The EEE corresponding to an insert CT number in a PE image was calculated from a CT number fit to the VM image set. Since the PE image is prone to beam-hardening artifacts, the EEE may underestimate the actual energy separation between the two detector layers. A 30-cm-diameter water phantom was used to evaluate the noise power spectrum (NPS). The phantoms were scanned at 120 and 140 kV with the same CTDIvol. Results The CT number difference for contrast inserts in VM images (50–150 keV) was 1.3±6% between the 120 and 140 kV scans. The difference in EEE calculated from the low- and high-energy detector images was 11.5 and 16.7 keV for the 120 and 140 kV scans, respectively. The corresponding differences calculated from conventional PE images were 12.8 keV for the 140 and 100 kV pair and 20.1 keV for the 140 and 80 kV pair. The iodine CNR increased monotonically with decreasing keV. Compared to conventional PE images, the peaks of the NPS curves from VM images were shifted to lower frequency. Conclusion The EEE results indicate that SDCT at 120 and 140 kV may have energy separation comparable to 100/140 kV and 80/140 kV dual-kV imaging. The effects of IR on CNR and NPS require further investigation for SDCT. Author YY and AD are Philips Healthcare employees.
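
The iodine CNR metric reported above is conventionally computed as the contrast between an insert region of interest (ROI) and the background, divided by the background noise. The sketch below illustrates that definition; the pixel values are invented, and this is not claimed to be the exact ROI protocol used in the study.

```python
import statistics

def cnr(roi_pixels, background_pixels):
    """Contrast-to-noise ratio: |mean(ROI) - mean(background)| / stdev(background)."""
    contrast = abs(statistics.mean(roi_pixels) - statistics.mean(background_pixels))
    noise = statistics.stdev(background_pixels)  # sample standard deviation
    return contrast / noise
```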

  11. SU-C-210-06: Quantitative Evaluation of Dosimetric Effects Resulting From Positional Variations of Pancreatic Tumor Volumes

    SciTech Connect

    Yu, S; Sehgal, V; Wei, R; Lawrenson, L; Kuo, J; Hanna, N; Ramsinghani, N; Daroui, P; Al-Ghazi, M

    2015-06-15

    Purpose: The aim of this study is to quantify dosimetric effects resulting from variation in pancreatic tumor position assessed by bony anatomy and implanted fiducial markers. Methods: Twelve pancreatic cancer patients were retrospectively analyzed for this study. All patients received volumetric modulated arc therapy (VMAT) treatment using fiducial-based Image Guided Radiation Therapy (IGRT) to the intact pancreas. Using daily orthogonal kV and/or cone-beam CT images, the shifts needed to co-register the daily pre-treatment images to the reference CT from fiducial to bone (Fid-Bone) were recorded as Left-Right (LR), Anterior-Posterior (AP) and Superior-Inferior (SI). The original VMAT plan iso-center was shifted based on kV bone-matching positions at 5 evenly spaced fractions. Dose coverage of the planning target volumes (PTVs) (V100%) and mean dose to liver, kidney and stomach/duodenum were assessed in the modified plans. Results: A total of 306 fractions were analyzed. The absolute fiducial-bone positional shifts were greatest in the SI direction (AP = 2.7 ± 3.0, LR = 2.8 ± 2.8, and SI = 6.3 ± 7.9 mm, mean ± SD). The V100% was significantly reduced, by 13.5% (Fid-Bone = 95.3 ± 2.0 vs. 82.3 ± 11.8%, p=0.02). This varied widely among patients (Fid-Bone V100% range = 2–60%), and 33% of patients had a reduction in V100% of more than 10%. The impact on OARs was greatest for the liver (Fid-Bone = 14.6 vs. 16.1 Gy, 10%) and stomach (Fid-Bone = 23.9 vs. 25.5 Gy, 7%), but was not statistically significant (p=0.10 for both). Conclusion: Compared to matching by fiducial markers, matching by bony anatomy would have substantially reduced the PTV coverage, by 13.5%. This reinforces the importance of online position verification based on fiducial markers. Hence, implantation of fiducial markers is strongly recommended for pancreatic cancer patients undergoing intensity modulated radiation therapy treatments.

  12. Does atomoxetine improve executive function, inhibitory control, and hyperactivity? Results from a placebo-controlled trial using quantitative measurement technology.

    PubMed

    Wehmeier, Peter M; Schacht, Alexander; Ulberstad, Fredrik; Lehmann, Martin; Schneider-Fresenius, Christian; Lehmkuhl, Gerd; Dittmann, Ralf W; Banaschewski, Tobias

    2012-10-01

    .001), CGI-S-ADHD (ES = 1.11, P < 0.001). The results of this study show that ATX for 8 weeks significantly reduced ADHD-related symptoms as measured by the cb-CPT/MT.

  13. Concepts and Results of New Method for Accurate Ground and In-Flight Calibration of the Particle Spectrometers of the Fast Plasma Investigation on NASA's Magnetospheric MultiScale Mission

    NASA Astrophysics Data System (ADS)

    Gliese, U.; Gershman, D. J.; Dorelli, J.; Avanov, L. A.; Barrie, A. C.; Clark, G. B.; Kujawski, J. T.; Mariano, A. J.; Coffey, V. N.; Tucker, C. J.; Chornay, D. J.; Cao, N. T.; Zeuch, M. A.; Dickson, C.; Smith, D. L.; Salo, C.; MacDonald, E.; Kreisler, S.; Jacques, A. D.; Giles, B. L.; Pollock, C. J.

    2015-12-01

    The Fast Plasma Investigation (FPI) on NASA's Magnetospheric MultiScale (MMS) mission employs 16 Dual Electron Spectrometers and 16 Dual Ion Spectrometers with 4 of each type on each of 4 spacecraft to enable fast (30 ms for electrons; 150 ms for ions) and spatially differentiated measurements of the full 3D particle velocity distributions. This approach presents a new and challenging aspect to the calibration and operation of these instruments on ground and in flight. The response uniformity, the reliability of their calibration and the approach to handling any temporal evolution of these calibrated characteristics all assume enhanced importance in this application, where we attempt to understand the meaning of particle distributions within the ion and electron diffusion regions of magnetically reconnecting plasmas. We have developed a detailed model of the spectrometer detection system, its behavior and its signal, crosstalk and noise sources. Based on this, we have devised a new calibration method that enables accurate and repeatable measurement of micro-channel plate (MCP) gain, signal loss due to variation in MCP gain and crosstalk effects in one single measurement. The foundational concepts of this new calibration method, named threshold scan, are presented. It is shown how this method has been successfully applied both on ground and in-flight to achieve highly accurate and precise calibration of all 64 spectrometers. Calibration parameters that will evolve in flight are determined daily providing a robust characterization of sensor suite performance, as a basis for both in-situ hardware adjustment and data processing to scientific units, throughout mission lifetime. This is shown to be very desirable as the instruments will produce higher quality raw science data that will require smaller post-acquisition data-corrections using results from in-flight derived pitch angle distribution measurements and ground calibration measurements. 
The practical application

  14. Preliminary results of a quantitative comparison of the spectral signatures of Landsat Thematic Mapper (TM) and Modular Optoelectronic Multispectral Scanner (MOMS).

    NASA Technical Reports Server (NTRS)

    Bodechtel, J.; Zilger, J.; Salomonson, V. V.

    1985-01-01

    Operationally acquired Thematic Mapper and experimental MOMS-01 data are evaluated quantitatively with respect to the systems' spectral response and performance for geoscientific applications. Results show the two instruments to be similar in the spectral bands compared. Although the MOMS scanner has a smaller IFOV, it has lower modulation transfer function performance for small, low-contrast features compared with the Thematic Mapper. This deficiency occurs not only when MOMS is switched to the low-gain mode; it is due to the CCD arrays used (ITEK CCPD 1728).

  15. A quantitative comparison between the flow factor approach model and the molecular dynamics simulation results for the flow of a confined molecularly thin fluid film

    NASA Astrophysics Data System (ADS)

    Zhang, Yongbin

    2015-06-01

    Quantitative comparisons were made between the flow factor approach model and molecular dynamics simulation (MDS) results, both of which describe the flow of a molecularly thin fluid film confined between two solid walls. Although the two approaches calculate the flow of a confined molecularly thin fluid film in different ways, very good agreement was found between them when the Couette and Poiseuille flows calculated from each were compared. This strongly indicates the validity of the flow factor approach model in modeling the flow of a confined molecularly thin fluid film.

  16. Risk assessment of false-positive quantitative real-time PCR results in food, due to detection of DNA originating from dead cells.

    PubMed

    Wolffs, Petra; Norling, Börje; Rådström, Peter

    2005-03-01

    Real-time PCR technology is increasingly used for detection and quantification of pathogens in food samples. A main disadvantage of nucleic acid detection is the inability to distinguish between signals originating from viable cells and DNA released from dead cells. In order to gain knowledge concerning risks of false-positive results due to detection of DNA originating from dead cells, quantitative PCR (qPCR) was used to investigate the degradation kinetics of free DNA in four types of meat samples. Results showed that the fastest degradation rate (1 log unit per 0.5 h) was observed in chicken homogenate, whereas the slowest rate (1 log unit per 120.5 h) was observed in pork rinse. Overall results indicated that degradation occurred faster in chicken samples than in pork samples and faster at higher temperatures. Based on these results, it was concluded that, especially in pork samples, there is a risk of false-positive PCR results. This was confirmed in a quantitative study on cell death and signal persistence over a period of 28 days, employing three different methods, i.e. viable counts, direct qPCR, and finally floatation, a recently developed discontinuous density centrifugation method, followed by qPCR. Results showed that direct qPCR resulted in an up to 10-fold overestimation of the number of cells in the samples compared to viable counts, due to detection of DNA from dead cells. However, after using floatation prior to qPCR, results resembled the viable count data. This indicates that by using floatation as a sample treatment step prior to qPCR, the risk of false-positive PCR results due to detection of dead cells can be minimized.
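
The degradation rates quoted above ("1 log unit per t hours") correspond to a tenfold decrease in free DNA every t hours, i.e. first-order decay with rate constant k = ln(10)/t. The sketch below converts the two per-log figures from the abstract into rate constants and persistence times; everything beyond those two figures is an illustrative assumption.

```python
import math

def decay_rate(hours_per_log10):
    """First-order decay constant (per hour) for '1 log10 unit per X h'."""
    return math.log(10) / hours_per_log10

def time_to_lose(log10_units, hours_per_log10):
    """Hours for the free-DNA signal to drop by the given number of log units."""
    return log10_units * hours_per_log10

k_chicken = decay_rate(0.5)    # fastest rate observed (chicken homogenate)
k_pork = decay_rate(120.5)     # slowest rate observed (pork rinse)
```

Under this model a 3-log drop takes 1.5 h in chicken homogenate but roughly two weeks in pork rinse, which is why the abstract flags pork samples as the main false-positive risk.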

  17. Quantitative prediction of in vivo profiles of CYP3A4 induction in humans from in vitro results with a reporter gene assay.

    PubMed

    Kozawa, Masanari; Honma, Masashi; Suzuki, Hiroshi

    2009-06-01

    Although primary human hepatocytes are commonly used for induction studies, the evaluation method is associated with several problems. More recently, a reporter gene assay has been suggested to be an alternative, although the contribution of only transfected nuclear receptors can be evaluated. The aim of the present study was to establish a method by which the extent of in vivo CYP3A4 induction in humans can be quantitatively predicted based on in vitro results with a reporter gene assay. From previous reports, we calculated in vivo induction ratios (R(in vivo)) caused by prototypical inducers based on the alterations in the hepatic intrinsic clearance of probe drugs. Next, we derived equations by which these R(in vivo) values can be predicted from the results of a reporter gene assay. To use the data obtained from a reporter gene assay, rifampicin was used as a reference drug. The correction coefficient (CC), which is used to quantitatively correlate the activity of inducers between in vitro and in vivo situations, was calculated by comparing the predicted data with the observed R(in vivo) values for rifampicin. With the calculated CC value, good correlations were found between the predicted and observed R(in vivo) values for other inducers such as phenobarbital, phenytoin, and omeprazole. Taken together, with the equations derived in the present study, we have been able to predict the extent of in vivo induction of human CYP3A4 by inducers in a time-dependent and quantitative manner from in vitro data.
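
The scaling idea described above can be sketched as follows: a correction coefficient (CC) is fixed using rifampicin as the reference drug, then applied to reporter-assay induction ratios of other inducers to predict their in vivo ratios. The simple proportional form and all numbers below are illustrative assumptions, not the authors' actual equations.

```python
def correction_coefficient(r_in_vivo_rif, assay_ratio_rif):
    """CC relating reporter-assay induction to in vivo induction,
    anchored on the reference drug (rifampicin)."""
    return r_in_vivo_rif / assay_ratio_rif

def predict_r_in_vivo(assay_ratio, cc):
    """Predicted in vivo induction ratio for another inducer."""
    return cc * assay_ratio
```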

  18. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research

    PubMed Central

    Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R.

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM®) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  19. DICOM for quantitative imaging biomarker development: a standards based approach to sharing clinical data and structured PET/CT analysis results in head and neck cancer research.

    PubMed

    Fedorov, Andriy; Clunie, David; Ulrich, Ethan; Bauer, Christian; Wahle, Andreas; Brown, Bartley; Onken, Michael; Riesmeier, Jörg; Pieper, Steve; Kikinis, Ron; Buatti, John; Beichel, Reinhard R

    2016-01-01

    Background. Imaging biomarkers hold tremendous promise for precision medicine clinical applications. Development of such biomarkers relies heavily on image post-processing tools for automated image quantitation. Their deployment in the context of clinical research necessitates interoperability with the clinical systems. Comparison with the established outcomes and evaluation tasks motivate integration of the clinical and imaging data, and the use of standardized approaches to support annotation and sharing of the analysis results and semantics. We developed the methodology and tools to support these tasks in Positron Emission Tomography and Computed Tomography (PET/CT) quantitative imaging (QI) biomarker development applied to head and neck cancer (HNC) treatment response assessment, using the Digital Imaging and Communications in Medicine (DICOM(®)) international standard and free open-source software. Methods. Quantitative analysis of PET/CT imaging data collected on patients undergoing treatment for HNC was conducted. Processing steps included Standardized Uptake Value (SUV) normalization of the images, segmentation of the tumor using manual and semi-automatic approaches, automatic segmentation of the reference regions, and extraction of the volumetric segmentation-based measurements. Suitable components of the DICOM standard were identified to model the various types of data produced by the analysis. A developer toolkit of conversion routines and an Application Programming Interface (API) were contributed and applied to create a standards-based representation of the data. Results. DICOM Real World Value Mapping, Segmentation and Structured Reporting objects were utilized for standards-compliant representation of the PET/CT QI analysis results and relevant clinical data. A number of correction proposals to the standard were developed. The open-source DICOM toolkit (DCMTK) was improved to simplify the task of DICOM encoding by introducing new API abstractions

  20. An adapted mindfulness-based stress reduction program for elders in a continuing care retirement community: quantitative and qualitative results from a pilot randomized controlled trial.

    PubMed

    Moss, Aleezé S; Reibel, Diane K; Greeson, Jeffrey M; Thapar, Anjali; Bubb, Rebecca; Salmon, Jacqueline; Newberg, Andrew B

    2015-06-01

    The purpose of this study was to test the feasibility and effectiveness of an adapted 8-week Mindfulness-Based Stress Reduction (MBSR) program for elders in a continuing care community. This mixed-methods study used both quantitative and qualitative measures. A randomized waitlist control design was used for the quantitative aspect of the study. Thirty-nine elders were randomized to MBSR (n = 20) or a waitlist control group (n = 19); mean age was 82 years. Both groups completed pre-post measures of health-related quality of life, acceptance and psychological flexibility, facets of mindfulness, self-compassion, and psychological distress. A subset of MBSR participants completed qualitative interviews. MBSR participants showed significantly greater improvement in acceptance and psychological flexibility and in role limitations due to physical health. In the qualitative interviews, MBSR participants reported increased awareness, less judgment, and greater self-compassion. Study results demonstrate the feasibility and potential effectiveness of an adapted MBSR program in promoting mind-body health for elders.

  1. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33%-63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. They were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test” and the “Waterfowl Physiologically Based Extraction Test.” All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Ad
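
The regression step described above (bioaccessibility from each in vitro test regressed on measured bioavailability, judged by slope and coefficient of determination) can be sketched with ordinary least squares. The data points below are invented for illustration.

```python
def least_squares(xs, ys):
    """Simple linear regression: returns (slope, intercept, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance sum
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r_squared = (sxy * sxy) / (sxx * syy)  # coefficient of determination
    return slope, intercept, r_squared
```

A test with a positive slope and high r-squared would, by the abstract's criteria, be a good in vitro surrogate for bioavailability.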

  2. Quantitative imaging methods in osteoporosis

    PubMed Central

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M. Carola

    2016-01-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research. PMID:28090446

  3. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  4. Using qualitative research to facilitate the interpretation of quantitative results from a discrete choice experiment: insights from a survey in elderly ophthalmologic patients

    PubMed Central

    Vennedey, Vera; Danner, Marion; Evers, Silvia MAA; Fauser, Sascha; Stock, Stephanie; Dirksen, Carmen D; Hiligsmann, Mickaël

    2016-01-01

    Background Age-related macular degeneration (AMD) is the leading cause of visual impairment and blindness in industrialized countries. Currently, mainly three treatment options are available, which are all intravitreal injections, but differ with regard to the frequency of injections needed, their approval status, and cost. This study aims to estimate patients’ preferences for characteristics of treatment options for neovascular AMD. Methods An interviewer-assisted discrete choice experiment was conducted among patients suffering from AMD treated with intravitreal injections. A Bayesian efficient design was used for the development of 12 choice tasks. In each task patients indicated their preference for one out of two treatment scenarios described by the attributes: side effects, approval status, effect on visual function, injection and monitoring frequency. While answering the choice tasks, patients were asked to think aloud and explain the reasons for choosing or rejecting specific characteristics. Quantitative data were analyzed with a mixed multinomial logit model. Results Eighty-six patients completed the questionnaire. Patients significantly preferred treatments that improve visual function, are approved, are administered in a pro re nata regimen (as needed), and are accompanied by bimonthly monitoring. Patients significantly disliked less frequent monitoring visits (every 4 months), explaining that this was due to fear that deterioration would go unnoticed and, in turn, that the disease would worsen. Significant preference heterogeneity was found for all levels except for bimonthly monitoring visits and severe, rare eye-related side effects. Patients gave clear explanations of their individual preferences during the interviews. Conclusion Significant preference trends were discernible for the overall sample, despite the preference heterogeneity for most treatment characteristics. Patients like to be monitored and treated regularly, but not too frequently

  5. Quantitatively Verifying the Results' Rationality for Farmland Quality Evaluation with Crop Yield, a Case Study in the Northwest Henan Province, China

    PubMed Central

    Huang, Junchang; Wang, Song

    2016-01-01

    Evaluating the rationality of farmland quality (FQ) assessment results is usually qualitative, relying on farmers’ and experts’ perceptions of soil quality and crop yield; quantitative checking remains difficult and is often ignored. In this paper, FQ in Xiuwu County, northwest Henan Province, China was evaluated by the gray relational analysis (GRA) method and the traditional analytic hierarchy process (AHP) method, and the consistency rate of the two results was analysed. The research proposes a method of testing the rationality of FQ evaluation results based on crop yield: first, generate a grade map of crop yield and overlay it with the FQ evaluation maps; then analyse the consistency rate for each grade in the same spatial position; finally, examine the consistency effects and decide whether to adopt the results. The results showed that the area consistency rate and the rate of matching evaluation units between the two methods were 84.68% and 87.29%, respectively, and the spatial distributions were approximately equal. The area consistency rates between crop yield level and the FQ evaluation levels by GRA and AHP were 78.15% and 74.29%, respectively. The verification results for GRA and AHP were therefore similar, good, and acceptable, and the FQ results from both could reflect crop yield levels. Overall, the evaluation results by GRA were slightly more rational than those by AHP. PMID:27490247
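The core consistency-rate statistic can be illustrated without the GIS overlay step. The sketch below uses invented grades and areas, not the Xiuwu County data: after matching yield grades and FQ grades unit by unit, the consistency rate is the area share whose grades agree.

```python
import numpy as np

# Minimal sketch (invented values): area-weighted consistency between a
# crop-yield grade map and an FQ evaluation map, compared per spatial unit.
yield_grade = np.array([1, 2, 2, 3, 1, 3])       # yield grade per unit
fq_grade    = np.array([1, 2, 3, 3, 1, 2])       # FQ grade for the same units
area_ha     = np.array([10., 20., 5., 15., 30., 20.])

match = yield_grade == fq_grade
consistency_rate = area_ha[match].sum() / area_ha.sum()
print(f"{consistency_rate:.2%}")                 # area rate with matching grades
```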

  6. Quantitative myocardial perfusion SPECT.

    PubMed

    Tsui, B M; Frey, E C; LaCroix, K J; Lalush, D S; McCartney, W H; King, M A; Gullberg, G T

    1998-01-01

    In recent years, there has been much interest in the clinical application of attenuation compensation to myocardial perfusion single photon emission computed tomography (SPECT) with the promise that accurate quantitative images can be obtained to improve clinical diagnoses. The different attenuation compensation methods that are available create confusion and some misconceptions. Also, attenuation-compensated images reveal other image-degrading effects including collimator-detector blurring and scatter that are not apparent in uncompensated images. This article presents basic concepts of the major factors that degrade the quality and quantitative accuracy of myocardial perfusion SPECT images, and includes a discussion of the various image reconstruction and compensation methods and misconceptions and pitfalls in implementation. The differences between the various compensation methods and their performance are demonstrated. Particular emphasis is directed to an approach that promises to provide quantitative myocardial perfusion SPECT images by accurately compensating for the 3-dimensional (3-D) attenuation, collimator-detector response, and scatter effects. With advances in the computer hardware and optimized implementation techniques, quantitatively accurate and high-quality myocardial perfusion SPECT images can be obtained in clinically acceptable processing time. Examples from simulation, phantom, and patient studies are used to demonstrate the various aspects of the investigation. We conclude that quantitative myocardial perfusion SPECT, which holds great promise to improve clinical diagnosis, is an achievable goal in the near future.

  7. Quantitative PCR for determining the infectivity of bacteriophage MS2 upon inactivation by heat, UV-B radiation, and singlet oxygen: advantages and limitations of an enzymatic treatment to reduce false-positive results.

    PubMed

    Pecson, Brian M; Martin, Luisa Valério; Kohn, Tamar

    2009-09-01

    Health risks posed by waterborne viruses are difficult to assess because it is tedious or impossible to determine the infectivity of many viruses. Recent studies hypothesized that quantitative PCR (qPCR) could selectively quantify infective viruses if preceded by an enzymatic treatment (ET) to reduce confounding false-positive signals. The goal of this study was to determine if ET with qPCR (ET-qPCR) can be used to accurately quantify the infectivity of the human viral surrogate bacteriophage MS2 upon partial inactivation by three treatments (heating at 72 degrees C, singlet oxygen, and UV radiation). Viruses were inactivated in buffered solutions and a lake water sample and assayed with culturing, qPCR, and ET-qPCR. To ensure that inactivating genome damage was fully captured, primer sets that covered the entire coding region were used. The susceptibility of different genome regions and the maximum genomic damage after each inactivating treatment were compared. We found that (i) qPCR alone caused false-positive results for all treatments, (ii) ET-qPCR significantly reduced (up to >5.2 log units) but did not eliminate the false-positive signals, and (iii) the elimination of false-positive signals differed between inactivating treatments. By assaying the whole coding region, we demonstrated that genome damage only partially accounts for virus inactivation. The possibility of achieving complete accordance between culture- and PCR-based assays is therefore called into doubt. Despite these differences, we postulate that ET-qPCR can track infectivity, given that decreases in infectivity were always accompanied by dose-dependent decreases in ET-qPCR signal. By decreasing false-positive signals, ET-qPCR improved the detection of infectivity loss relative to qPCR.

  8. On numerically accurate finite element solutions in the fully plastic range

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

    A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily and simply be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems, namely pure bending of a beam, a thick-walled tube under pressure, and a deep double edge cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are also discussed.

  9. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, together with the need for high resolution and large image size, makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here were developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia, and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high-throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable, and accurate while allowing the pathologist to control the measurement process.

  10. Accurate vessel segmentation with constrained B-snake.

    PubMed

    Yuanzhi Cheng; Xin Hu; Ji Wang; Yadong Wang; Tamura, Shinichi

    2015-08-01

    We describe an active contour framework with accurate shape and size constraints on the vessel cross-sectional planes to produce the vessel segmentation. It starts with a multiscale vessel axis tracing in 3D computed tomography (CT) data, followed by vessel boundary delineation on the cross-sectional planes derived from the extracted axis. The vessel boundary surface is deformed under constrained movements on the cross sections and is voxelized to produce the final vascular segmentation. The novelty of this paper lies in the accurate contour point detection of thin vessels based on the CT scanning model, in the efficient handling of missing contour points in the problematic regions, and in the active contour model with accurate shape and size constraints. The main advantage of our framework is that it avoids disconnected and incomplete segmentation of the vessels in the problematic regions that contain touching vessels (vessels in close proximity to each other), diseased portions (pathologic structure attached to a vessel), and thin vessels. It is particularly suitable for accurate segmentation of thin and low-contrast vessels. Our method is evaluated and demonstrated on CT data sets from our partner site, and its results are compared with three related methods. Our method is also tested on two publicly available databases and its results are compared with a recently published method. The applicability of the proposed method to some challenging clinical problems, the segmentation of the vessels in the problematic regions, is demonstrated with good results in both quantitative and qualitative experiments; our segmentation algorithm can delineate vessel boundaries with a level of variability similar to that obtained manually.

  11. Correlating Quantitative Fecal Immunochemical Test Results with Neoplastic Findings on Colonoscopy in a Population-Based Colorectal Cancer Screening Program: A Prospective Study

    PubMed Central

    McGahan, Colleen E.

    2016-01-01

    Background and Aims. The Canadian Partnership Against Cancer (CPAC) recommends a fecal immunochemical test- (FIT-) positive predictive value (PPV) for all adenomas of ≥50%. We sought to assess FIT performance among average-risk participants of the British Columbia Colon Screening Program (BCCSP). Methods. From Nov-2013 to Dec-2014 consecutive participants of the BCCSP were assessed. Data was obtained from a prospectively collected database. A single quantitative FIT (NS-Plus, Alfresa Pharma Corporation, Japan) with a cut-off of ≥10 μg/g (≥50 ng/mL) was used. Results. 20,322 FIT-positive participants underwent colonoscopy (CSPY). At a FIT cut-off of ≥10 μg/g (≥50 ng/mL) the PPV for all adenomas was 52.0%. Increasing the FIT cut-off to ≥20 μg/g (≥100 ng/mL) would increase the PPV for colorectal cancer (CRC) by 1.5% and for high-risk adenomas (HRAs) by 6.5%, at a cost of missing 13.6% of CRCs and 32.4% of HRAs. Conclusions. As the NS-Plus FIT cut-off rises, the PPV for CRC and HRAs increases, but at the cost of missed lesions. A cut-off of ≥10 μg/g (≥50 ng/mL) produces a PPV for all adenomas exceeding national recommendations. Health authorities need to take endoscopic resources into consideration when selecting a FIT positivity threshold. PMID:28116286
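The PPV-versus-missed-lesions trade-off as the cut-off rises can be made concrete with a toy calculation. The data below are invented and do not reproduce the BCCSP results.

```python
# Toy illustration (invented data): raising the FIT positivity cut-off
# raises the PPV among FIT-positives but misses more true lesions.
# Each tuple: (FIT result in ug/g, adenoma found at colonoscopy).
participants = [(9, True), (12, True), (15, False), (18, True),
                (25, True), (40, True), (55, False), (80, True)]

def screen(cutoff):
    positives = [adenoma for fit, adenoma in participants if fit >= cutoff]
    lesions = [fit for fit, adenoma in participants if adenoma]
    ppv = sum(positives) / len(positives)              # PPV among FIT-positives
    missed = sum(fit < cutoff for fit in lesions) / len(lesions)
    return ppv, missed

ppv10, miss10 = screen(10)
ppv20, miss20 = screen(20)
print(ppv10, miss10)   # lower cut-off: lower PPV, fewer missed lesions
print(ppv20, miss20)   # higher cut-off: higher PPV, more missed lesions
```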

  12. Near-infrared reflectance spectroscopy (NIRS) enables the fast and accurate prediction of essential amino acid contents. 2. Results for wheat, barley, corn, triticale, wheat bran/middlings, rice bran, and sorghum.

    PubMed

    Fontaine, Johannes; Schirmer, Barbara; Hörr, Jutta

    2002-07-03

    Further NIRS calibrations were developed for the accurate and fast prediction of the total contents of methionine, cystine, lysine, threonine, tryptophan, and other essential amino acids, protein, and moisture in the most important cereals and brans or middlings for animal feed production. More than 1100 samples of global origin collected over five years were analyzed for amino acids following the Official Methods of the United States and European Union. Detailed data and graphics are given to characterize the obtained calibration equations. NIRS was validated with 98 independent samples for wheat and 78 samples for corn, and compared to amino acid predictions using linear crude protein regression equations. With a few exceptions, validation showed that 70-98% of the amino acid variance in the samples could be explained using NIRS. Especially for lysine and methionine, the most limiting amino acids for farm animals, NIRS can predict contents in cereals much better than crude protein regressions can. Through its low cost and high speed of analysis, NIRS enables the amino acid analysis of many samples in order to improve the accuracy of feed formulation and obtain better quality and lower production costs.
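The "70-98% of the amino acid variance ... explained" statistic is the coefficient of determination (R^2) between NIRS predictions and reference wet-chemistry assays on independent validation samples. A minimal sketch with invented lysine values:

```python
import numpy as np

# Hedged sketch (invented values): R^2 of NIRS predictions against
# reference assay results on an independent validation set.
reference = np.array([0.28, 0.31, 0.25, 0.35, 0.30, 0.27])  # % lysine, assay
predicted = np.array([0.29, 0.30, 0.26, 0.34, 0.31, 0.27])  # % lysine, NIRS

ss_res = ((reference - predicted) ** 2).sum()       # residual sum of squares
ss_tot = ((reference - reference.mean()) ** 2).sum()  # total sum of squares
r2 = 1.0 - ss_res / ss_tot                          # fraction of variance explained
print(round(r2, 3))
```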

  13. Comparison of Enterococcus quantitative polymerase chain reaction analysis results from midwest U.S. river samples using EPA Method 1611 and Method 1609 PCR reagents

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...

  14. SiNG-PCRseq: Accurate inter-sequence quantification achieved by spiking-in a neighbor genome for competitive PCR amplicon sequencing.

    PubMed

    Oh, Soo A; Yang, Inchul; Hahn, Yoonsoo; Kang, Yong-Kook; Chung, Sun-Ku; Jeong, Sangkyun

    2015-07-06

    Despite the recent technological advances in DNA quantitation by sequencing, accurate delineation of the quantitative relationship among different DNA sequences is yet to be elaborated due to difficulties in correcting the sequence-specific quantitation biases. We here developed a novel DNA quantitation method via spiking-in a neighbor genome for competitive PCR amplicon sequencing (SiNG-PCRseq). This method utilizes genome-wide chemically equivalent but easily discriminable homologous sequences with a known copy arrangement in the neighbor genome. By comparing the amounts of selected human DNA sequences simultaneously to those of matched sequences in the orangutan genome, we could accurately draw the quantitative relationships for those sequences in the human genome (root-mean-square deviations <0.05). Technical replications of cDNA quantitation performed using different reagents at different time points also resulted in excellent correlations (R(2) > 0.95). The cDNA quantitation using SiNG-PCRseq was highly concordant with the RNA-seq-derived version in inter-sample comparisons (R(2) = 0.88), but relatively discordant in inter-sequence quantitation (R(2) < 0.44), indicating considerable level of sequence-dependent quantitative biases in RNA-seq. Considering the measurement structure explicitly relating the amount of different sequences within a sample, SiNG-PCRseq will facilitate sharing and comparing the quantitation data generated under different spatio-temporal settings.

  15. On an efficient and accurate method to integrate restricted three-body orbits

    NASA Technical Reports Server (NTRS)

    Murison, Marc A.

    1989-01-01

    This work is a quantitative analysis of the advantages of the Bulirsch-Stoer (1966) method, demonstrating that this method is certainly worth considering when working with small N dynamical systems. The results, qualitatively suspected by many users, are quantitatively confirmed as follows: (1) the Bulirsch-Stoer extrapolation method is very fast and moderately accurate; (2) regularization of the equations of motion stabilizes the error behavior of the method and is, of course, essential during close approaches; and (3) when applicable, a manifold-correction algorithm reduces numerical errors to the limits of machine accuracy. In addition, for the specific case of the restricted three-body problem, even a small eccentricity for the orbit of the primaries drastically affects the accuracy of integrations, whether regularized or not; the circular restricted problem integrates much more accurately.
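The core idea behind the Bulirsch-Stoer method is modified-midpoint integration at several substep counts followed by extrapolation to zero step size. The sketch below shows one Richardson step on a simple harmonic oscillator; it is not the full algorithm (no polynomial/rational extrapolation tableau, no step control) and not the restricted three-body problem.

```python
import numpy as np

def deriv(y):
    # y = [x, vx]; simple harmonic oscillator x'' = -x (illustrative system)
    return np.array([y[1], -y[0]])

def modified_midpoint(y0, H, n):
    # Advance y0 over one big step H using n midpoint substeps; this
    # method's error expansion contains only even powers of h = H/n.
    h = H / n
    z0 = y0
    z1 = z0 + h * deriv(z0)
    for _ in range(n - 1):
        z0, z1 = z1, z0 + 2.0 * h * deriv(z1)
    return 0.5 * (z0 + z1 + h * deriv(z1))

y0 = np.array([1.0, 0.0])
H = 0.5
T1 = modified_midpoint(y0, H, 4)        # coarse pass
T2 = modified_midpoint(y0, H, 8)        # pass with halved substep
y_extrap = T2 + (T2 - T1) / 3.0         # Richardson step for an h^2 method

exact = np.array([np.cos(H), -np.sin(H)])
err_mid = float(np.abs(T2 - exact).max())
err_ext = float(np.abs(y_extrap - exact).max())
print(err_mid, err_ext)                 # extrapolation sharply cuts the error
```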

  16. EEG patterns in persons exposed to ionizing radiation as a result of the chernobyl accident. Part 2: quantitative EEG analysis in patients who had acute radiation sickness.

    PubMed

    Loganovsky, Konstantin N; Yuryev, Konstantin L

    2004-01-01

    A cross-sectional quantitative electroencephalogram (qEEG) study (1996-2001) among Chernobyl accident survivors who had confirmed acute radiation sickness and were irradiated at doses of 1-5 Gy revealed neurophysiological markers of ionizing radiation. These markers were: reduction of the dominant frequency in the left fronto-temporal region; lateralization of absolute delta-power to the left (dominant) hemisphere; relative delta-power increase in the fronto-temporal areas; absolute theta-power decrease in the left temporal region; and diffuse decrease of absolute and relative alpha-power, which may reflect cortico-limbic dysfunction lateralized to the left, dominant hemisphere, with fronto-temporal cortical and hippocampal damage. Quantitative EEG is proposed for differentiating radiation from non-radiation brain damage and as a new biological dosimetry method. The high radiosensitivity of the brain and neocortex, and the higher radiosensitivity of the dominant hemisphere, are discussed.

  17. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data, as well as estimation of geophysical parameters from SAR data, have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations, radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and the position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.

  18. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
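Of the calibration approaches the review discusses, the simplest is external calibration with pure standards: fit a straight line of signal versus concentration and invert it for an unknown. The values below are invented for illustration.

```python
import numpy as np

# Minimal sketch of external calibration (invented data, not from the
# review): least-squares line through standards, inverted for an unknown.
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0])              # standards, ng/mL
signal = np.array([120., 10150., 25300., 50100., 100400.])   # counts/s

slope, intercept = np.polyfit(conc, signal, 1)  # linear calibration curve

unknown_signal = 40000.0
unknown_conc = (unknown_signal - intercept) / slope
print(round(unknown_conc, 1))                   # estimated ng/mL in the unknown
```

Matrix-matched standards follow the same arithmetic; the difference is that the standards are prepared in a matrix mimicking the sample so that the slope absorbs matrix effects.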

  19. Quantitative aspects of inductively coupled plasma mass spectrometry.

    PubMed

    Bulska, Ewa; Wagner, Barbara

    2016-10-28

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.

  20. Role of secondary level laboratories in strengthening quality at primary level health facilities' laboratories: an innovative approach to ensure accurate HIV, tuberculosis, and malaria test results in resource-limited settings.

    PubMed

    Manyazewal, Tsegahun; Paterniti, Antonio D; Redfield, Robert R; Marinucci, Francesco

    2013-01-01

    Providing regular external quality assessment of primary level laboratories and timely feedback is crucial to ensure the reliability of the testing capacity of the whole laboratory network. This study aimed to assess the diagnostic performance of primary level laboratories in Southwest Showa Zone in Ethiopia. An external quality assessment protocol was devised whereby, from among all the samples collected on-site at 4 health centers (HCs), each HC sent to a district hospital (DH) on a weekly basis 2 TB slides (1 Ziehl-Neelsen stained and another unstained), 2 malaria slides (1 Giemsa stained and another unstained), and 2 blood samples for HIV testing (1 whole blood and another plasma) for a comparative analysis. Similarly, the DH preserved the same amount and type of specimens to send to each HC for retesting. From October to November 2011, 192 single-blinded specimens were rechecked: 64 TB slides, 64 malaria slides, and 64 blood specimens for HIV testing. The analyses demonstrated an overall agreement of 95.3% (183/192) between the test and the retest, and 98.4% (63/64), 92.2% (59/64), and 95.3% (61/64) for TB microscopy, malaria microscopy, and HIV rapid testing, respectively. Of the total TB slides tested positive, 20/23 (87%) were quantified similarly in both laboratories. The agreement on HIV rapid testing was 100% (32/32) when plasma samples were tested either at HCs (16/16) or at the DH (16/16), while when whole blood specimens were tested, the agreement was 87.5% (14/16) and 93.8% (15/16) for samples prepared by HCs and the DH, respectively. Results of this new approach proved that secondary level laboratories could play a vital role in assuring laboratory quality at primary level HCs, without depending on remotely located national and regional laboratories to provide this support.

  1. Evaluating the effectiveness of pasteurization for reducing human illnesses from Salmonella spp. in egg products: results of a quantitative risk assessment.

    PubMed

    Latimer, Heejeong K; Marks, Harry M; Coleman, Margaret E; Schlosser, Wayne D; Golden, Neal J; Ebel, Eric D; Kause, Janell; Schroeder, Carl M

    2008-02-01

    As part of the process for developing risk-based performance standards for egg product processing, the United States Department of Agriculture (USDA) Food Safety and Inspection Service (FSIS) undertook a quantitative microbial risk assessment for Salmonella spp. in pasteurized egg products. The assessment was designed to assist risk managers in evaluating egg handling and pasteurization performance standards for reducing the likelihood of Salmonella in pasteurized egg products and the subsequent risk to human health. The following seven pasteurized liquid egg product formulations were included in the risk assessment model, with the value in parentheses indicating the estimated annual number of human illnesses from Salmonella from each: egg white (2636), whole egg (1763), egg yolk (708), whole egg with 10% salt (407), whole egg with 10% sugar (0), egg yolk with 10% salt (11), and egg yolk with 10% sugar (0). Increased levels of pasteurization were predicted to be highly effective mitigations for reducing the number of illnesses. For example, if all egg white products were pasteurized for a 6-log(10) reduction of Salmonella, the estimated annual number of illnesses from these products would be reduced from 2636 to 270. The risk assessment identified several data gaps and research needs, including a quantitative study of cross-contamination during egg product processing and characterization of egg storage times and temperatures (i) on farms and in homes, (ii) for eggs produced off-line, and (iii) for egg products at retail. Pasteurized egg products are a relatively safe food; however, findings from this study suggest increased pasteurization can make them safer.
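The "6-log(10) reduction" language maps to survivors in the obvious way: each additional log10 of reduction cuts surviving organisms tenfold. A back-of-envelope sketch; the initial load below is hypothetical, not an input of the FSIS model.

```python
# Back-of-envelope sketch (hypothetical initial load, not an FSIS model
# input): surviving Salmonella after a given log10 pasteurization reduction.
initial_cfu = 1.0e7   # organisms per lot before pasteurization (invented)

def survivors(log10_reduction):
    return initial_cfu * 10.0 ** (-log10_reduction)

s5, s6 = survivors(5.0), survivors(6.0)
print(s5, s6)   # one extra log10 of reduction leaves ten times fewer survivors
```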

  2. From provocative narrative scenarios to quantitative biophysical model results: Simulating plausible futures to 2070 in an urbanizing agricultural watershed in Wisconsin, USA

    NASA Astrophysics Data System (ADS)

    Booth, E.; Chen, X.; Motew, M.; Qiu, J.; Zipper, S. C.; Carpenter, S. R.; Kucharik, C. J.; Steven, L. I.

    2015-12-01

    Scenario analysis is a powerful tool for envisioning future social-ecological change and its consequences on human well-being. Scenarios that integrate qualitative storylines and quantitative biophysical models can create a vivid picture of these potential futures but the integration process is not straightforward. We present - using the Yahara Watershed in southern Wisconsin (USA) as a case study - a method for developing quantitative inputs (climate, land use/cover, and land management) to drive a biophysical modeling suite based on four provocative and contrasting narrative scenarios that describe plausible futures of the watershed to 2070. The modeling suite consists of an agroecosystem model (AgroIBIS-VSF), hydrologic routing model (THMB), and empirical lake water quality model and estimates several biophysical indicators to evaluate the watershed system under each scenario. These indicators include water supply, lake flooding, agricultural production, and lake water quality. Climate (daily precipitation and air temperature) for each scenario was determined using statistics from 210 different downscaled future climate projections for two 20-year time periods (2046-2065 and 2081-2100) and modified using a stochastic weather generator to allow flexibility for matching specific climate events within the scenario narratives. Land use/cover for each scenario was determined first by quantifying changes in areal extent every decade for 15 categories at the watershed scale to be consistent with the storyline events and theme. Next, these changes were spatially distributed using a rule-based framework based on land suitability metrics that determine transition probabilities. Finally, agricultural inputs including manure and fertilizer application rates were determined for each scenario based on the prevalence of livestock, water quality regulations, and technological innovations. 
Each scenario is compared using model inputs (maps and time-series of land use/cover and

  3. Quantitative laser-induced breakdown spectroscopy data using peak area step-wise regression analysis: an alternative method for interpretation of Mars science laboratory results

    SciTech Connect

    Clegg, Samuel M; Barefield, James E; Wiens, Roger C; Dyar, Melinda D; Schafer, Martha W; Tucker, Jonathan M

    2008-01-01

    The ChemCam instrument on the Mars Science Laboratory (MSL) will include a laser-induced breakdown spectrometer (LIBS) to quantify major and minor elemental compositions. The traditional analytical chemistry approach to calibration curves for these data regresses a single diagnostic peak area against concentration for each element. This approach contrasts with a new multivariate method in which elemental concentrations are predicted by step-wise multiple regression analysis based on areas of a specific set of diagnostic peaks for each element. The method is tested on LIBS data from igneous and metamorphosed rocks. Between 4 and 13 partial regression coefficients are needed to describe each elemental abundance accurately (i.e., with a regression line of R^2 > 0.9995 for the relationship between predicted and measured elemental concentration) for all major and minor elements studied. Validation plots suggest that the method is limited at present by the small data set, and will work best for prediction of concentration when a wide variety of compositions and rock types has been analyzed.
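Forward step-wise selection of diagnostic peaks, in the spirit of this approach, can be sketched as follows. The spectra and concentrations below are synthetic (the true model depends on peaks 0 and 3); this is not the paper's regression pipeline.

```python
import numpy as np

# Hedged sketch of forward step-wise regression: iteratively add the
# candidate peak area that most reduces the residual sum of squares when
# predicting an element's concentration. All data are synthetic.
rng = np.random.default_rng(1)
n_samples, n_peaks = 40, 8
areas = rng.random((n_samples, n_peaks))            # peak areas per spectrum
conc = (5.0 * areas[:, 0] + 2.0 * areas[:, 3]       # true dependence: peaks 0, 3
        + 0.01 * rng.standard_normal(n_samples))    # small measurement noise

def rss(cols):
    # residual sum of squares of an intercept + selected-peaks fit
    X = np.column_stack([np.ones(n_samples)] + [areas[:, c] for c in cols])
    beta, *_ = np.linalg.lstsq(X, conc, rcond=None)
    return ((conc - X @ beta) ** 2).sum()

selected = []
for _ in range(2):   # add two terms step-wise
    best = min((c for c in range(n_peaks) if c not in selected),
               key=lambda c: rss(selected + [c]))
    selected.append(best)

print(selected)      # peaks chosen as diagnostic for this element
```

A production version would also use a stopping criterion (e.g. an F-test or cross-validated error) instead of a fixed number of steps.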

  4. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26%, and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
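The TEW scatter correction compared in this study is commonly formulated as trapezoidal interpolation between two narrow windows flanking the photopeak. A minimal sketch with illustrative counts and window widths (not values from the paper):

```python
# Sketch of triple-energy-window (TEW) scatter correction as commonly
# formulated: scatter inside the photopeak window is estimated from two
# narrow flanking windows by trapezoidal interpolation. Values invented.
c_peak, w_peak = 10_000.0, 60.0   # photopeak counts and window width (keV)
c_low,  w_low  = 900.0,   6.0     # lower scatter window
c_high, w_high = 300.0,   6.0     # upper scatter window

scatter_est = (c_low / w_low + c_high / w_high) * w_peak / 2.0
primary = max(c_peak - scatter_est, 0.0)   # scatter-corrected photopeak counts
print(primary)
```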

  5. Quantitative MRD monitoring identifies distinct GVL response patterns after allogeneic stem cell transplantation for chronic lymphocytic leukemia: results from the GCLLSG CLL3X trial.

    PubMed

    Ritgen, M; Böttcher, S; Stilgenbauer, S; Bunjes, D; Schubert, J; Cohen, S; Humpe, A; Hallek, M; Kneba, M; Schmitz, N; Döhner, H; Dreger, P

    2008-07-01

    The purpose of this study was to prospectively analyze minimal residual disease (MRD) kinetics after reduced-intensity allogeneic stem cell transplantation (allo-SCT) in high-risk chronic lymphocytic leukemia (CLL). Subjects were the first 30 consecutive patients from a prospective clinical trial, and seven pilot patients treated identically. Using real-time quantitative PCR (RQ-PCR) and/or flow-based MRD monitoring (sensitivity ≥10^-4), five distinct patterns of MRD kinetics could be identified: patients who promptly achieved durable MRD negativity without direct evidence of graft-versus-leukemia (GVL) effects (Group 1) (n=4; no clinical relapse); patients with complete and sustained MRD response after GVL induced by immunosuppression tapering (Group 2) or donor lymphocyte infusions (Group 3) (n=18; one relapse); patients without MRD response due to lack of GVL (Group 4) (n=2; two relapses); and patients with incomplete and transient MRD response to GVL (Group 5) (n=4; three relapses). In summary, this study provides a comprehensive map of possible MRD courses and their prognostic implications after T-replete allo-SCT in high-risk CLL, indicating that effective GVL activity is induced in virtually all patients who develop chronic GVHD. However, in a significant proportion of cases, this does not translate into sustained disease control due to development of secondary GVL resistance.

  6. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  7. Accurate phase measurements for thick spherical objects using optical quadrature microscopy

    NASA Astrophysics Data System (ADS)

    Warger, William C., II; DiMarzio, Charles A.

    2009-02-01

    In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide the means for a more accurate method to score embryo viability.

  8. Protein Quantitation of the Developing Cochlea Using Mass Spectrometry.

    PubMed

    Darville, Lancia N F; Sokolowski, Bernd H A

    2016-01-01

    Mass spectrometry-based proteomics allows for the measurement of hundreds to thousands of proteins in a biological system. Additionally, mass spectrometry can also be used to quantify proteins and peptides. However, observing quantitative differences between biological systems using mass spectrometry-based proteomics can be challenging because it is critical to have a method that is fast, reproducible, and accurate. Therefore, to study differential protein expression in biological samples, labeling or label-free quantitative methods can be used. Labeling methods have been widely used in quantitative proteomics; however, label-free methods have become equally popular and are often preferred because they produce faster, cleaner, and simpler results. Here, we describe the methods by which proteins are isolated and identified from cochlear sensory epithelia tissues at different ages and quantitatively differentiated using label-free mass spectrometry.

  9. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin–DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody

    PubMed Central

    Dou, Shuping; Virostko, John; Greiner, Dale L.; Powers, Alvin C.; Liu, Guozheng

    2016-01-01

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ~95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in

  10. Quantitative Correlation of in Vivo Properties with in Vitro Assay Results: The in Vitro Binding of a Biotin-DNA Analogue Modifier with Streptavidin Predicts the in Vivo Avidin-Induced Clearability of the Analogue-Modified Antibody.

    PubMed

    Dou, Shuping; Virostko, John; Greiner, Dale L; Powers, Alvin C; Liu, Guozheng

    2015-08-03

    Quantitative prediction of in vivo behavior using an in vitro assay would dramatically accelerate pharmaceutical development. However, studies quantitatively correlating in vivo properties with in vitro assay results are rare because of the difficulty in quantitatively understanding the in vivo behavior of an agent. We now demonstrate such a correlation as a case study based on our quantitative understanding of the in vivo chemistry. In an ongoing pretargeting project, we designed a trifunctional antibody (Ab) that concomitantly carried a biotin and a DNA analogue (hereafter termed MORF). The biotin and the MORF were fused into one structure prior to conjugation to the Ab for the concomitant attachment. Because it was known that avidin-bound Ab molecules leave the circulation rapidly, this design would theoretically allow complete clearance by avidin. The clearability of the trifunctional Ab was determined by calculating the blood MORF concentration ratio of avidin-treated Ab to non-avidin-treated Ab using mice injected with these compounds. In theory, any compromised clearability should be due to the presence of impurities. In vitro, we measured the biotinylated percentage of the Ab-reacting (MORF-biotin)⊃-NH2 modifier, by addition of streptavidin to the radiolabeled (MORF-biotin)⊃-NH2 samples and subsequent high-performance liquid chromatography (HPLC) analysis. On the basis of our previous quantitative understanding, we predicted that the clearability of the Ab would be equal to the biotinylation percentage measured via HPLC. We validated this prediction within a 3% difference. In addition to the high avidin-induced clearability of the trifunctional Ab (up to ∼95%) achieved by the design, we were able to predict the required quality of the (MORF-biotin)⊃-NH2 modifier for any given in vivo clearability. This approach may greatly reduce the steps and time currently required in pharmaceutical development in the process of synthesis, chemical analysis, in

  11. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  12. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  13. An Accurate, Simplified Model of Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

    Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS) we derive an accurate, greatly simplified model of IBS, valid for high energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where η²_{x,y}/β_{x,y} has been replaced by 𝓗_{x,y}) asymptotically approaches the result of Bjorken-Mtingwa.
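
    For reference, the lattice function 𝓗_{x,y} that replaces η²_{x,y}/β_{x,y} in the modified Piwinski formulation is the standard dispersion invariant (a textbook definition, not spelled out in the abstract):

```latex
\mathcal{H}_{x,y} \;=\; \frac{1}{\beta_{x,y}}
\left[\eta_{x,y}^{2}
  + \left(\beta_{x,y}\,\eta'_{x,y} - \tfrac{1}{2}\beta'_{x,y}\,\eta_{x,y}\right)^{2}\right]
```

    In regions where the derivative terms vanish, 𝓗 reduces to η²/β, which is why the two formulations agree asymptotically.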

  14. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  15. Precision of the reportable result. Simultaneous optimisation of number of preparations and injections for sample and reference standard in quantitative liquid chromatography.

    PubMed

    Ermer, J; Agut, C

    2014-08-01

    In pharmaceutical analysis, the precision of the reportable result, i.e. the result which is compared to the specification limit, is relevant for evaluating the suitability of the analytical procedure. The precision of the result is also important for other applications, and its optimisation is often of interest. However, increasing the number of determinations (e.g. injections or preparations) will reduce only the variability (or standard error) of the corresponding precision level. Therefore, knowledge of the individual variance contributions, obtained from reliable precision studies, is important in order to determine on a scientific basis which format of the (reportable) result, i.e. the number of injections and sample preparations (or even series), should be used. In the case of relative analytical procedures such as LC, the calibration model and format, i.e. the number of determinations of the reference standard, are among the factors (besides instrument, operator, reagents, etc.) affecting the between-series variance contribution at the intermediate precision/reproducibility level. Consequently, the precision of the reportable result is only valid for the calibration format used to obtain intermediate precision/reproducibility. Instead of repeating the whole precision study to optimise the calibration format, the present paper describes a statistical approach using variability results from the original precision study.
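
    The dependence of the reportable result's precision on the replication format can be sketched with the usual nested variance-component model (a generic illustration; the symbols and numbers are hypothetical, not taken from the paper):

```python
def reportable_result_sd(s_series, s_prep, s_inj, k_preps, n_inj):
    """Predicted standard deviation of a reportable result averaged over
    k_preps sample preparations with n_inj injections each.

    Assumes the common nested random-effects model: only the within-series
    components (preparation, injection) shrink with replication, while the
    between-series component s_series is irreducible by format changes.
    """
    variance = (s_series ** 2
                + s_prep ** 2 / k_preps
                + s_inj ** 2 / (k_preps * n_inj))
    return variance ** 0.5
```

    With, say, s_series = 0.5%, s_prep = 0.8% and s_inj = 0.4%, averaging two preparations of two injections each predicts an SD of about 0.78%, while even a 10 x 10 format cannot go below the 0.5% between-series floor.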

  16. Quantitative glycomics.

    PubMed

    Orlando, Ron

    2010-01-01

    The ability to quantitatively determine changes is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished. These include label-free approaches and strategies where an isotopic label is incorporated into the glycans prior to analysis. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  17. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  18. Accurate guitar tuning by cochlear implant musicians.

    PubMed

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼ 30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  19. Normalization with Corresponding Naïve Tissue Minimizes Bias Caused by Commercial Reverse Transcription Kits on Quantitative Real-Time PCR Results

    PubMed Central

    Garcia-Bardon, Andreas

    2016-01-01

    Real-time reverse transcription polymerase chain reaction (PCR) is the gold standard for expression analysis. Designed to improve reproducibility and sensitivity, commercial kits are commonly used for the critical step of cDNA synthesis. The present study was designed to determine the impact of these kits. mRNA from mouse brains was pooled to create serial dilutions ranging from 0.0625 μg to 2 μg, which were transcribed into cDNA using four different commercial reverse-transcription kits. Next, we transcribed mRNA from the brain tissue of mice after acute brain injury and of naïve mice into cDNA for qPCR. Depending on the genes tested, some kits failed to show linear results in the dilution series and revealed strong variations in cDNA yield. Absolute expression data in naïve and trauma settings varied substantially between these kits. Normalization with a housekeeping gene failed to reduce kit-dependent variations, whereas normalization eliminated differences when naïve samples from the same region were used. The study shows strong evidence that the choice of commercial cDNA synthesis kit has a major impact on PCR results and, consequently, on comparability between studies. Additionally, it provides a solution to overcome this limitation by normalization with data from naïve samples. This simple step helps to compare mRNA expression data between different studies and groups. PMID:27898720
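
    Normalizing qPCR data against a reference gene and a calibrator sample (here, the naïve tissue the study recommends) is conventionally done with the 2^-ΔΔCt method; a minimal sketch with hypothetical Ct values:

```python
def ddct_fold_change(ct_target_sample, ct_ref_sample,
                     ct_target_calibrator, ct_ref_calibrator):
    """Relative expression by the 2^-ΔΔCt method (Livak & Schmittgen).

    ΔCt normalizes the target gene to a reference (housekeeping) gene
    within each sample; ΔΔCt then normalizes the treated sample to a
    calibrator sample (e.g. naive tissue processed with the same kit).
    Assumes ~100% amplification efficiency for both assays.
    """
    dct_sample = ct_target_sample - ct_ref_sample
    dct_calibrator = ct_target_calibrator - ct_ref_calibrator
    return 2.0 ** -(dct_sample - dct_calibrator)
```

    For example, target Ct 25 vs reference Ct 20 in the injured sample, against target Ct 28 vs reference Ct 20 in naïve tissue, gives ΔΔCt = -3, i.e. an 8-fold up-regulation.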

  20. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey

    PubMed Central

    Borba, Christina P.C.; Ng, Lauren C.; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L.; Parnarouskis, Lindsey; Gray, Deborah A.; Carney, Julia R.; Domínguez, Silvia; Wang, Edward K.S.; Boxill, Ryan; Song, Suzan J.; Henderson, David C.

    2016-01-01

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5–22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy. PMID:26807147

  1. A mental health needs assessment of children and adolescents in post-conflict Liberia: results from a quantitative key-informant survey.

    PubMed

    Borba, Christina P C; Ng, Lauren C; Stevenson, Anne; Vesga-Lopez, Oriana; Harris, Benjamin L; Parnarouskis, Lindsey; Gray, Deborah A; Carney, Julia R; Domínguez, Silvia; Wang, Edward K S; Boxill, Ryan; Song, Suzan J; Henderson, David C

    2016-01-02

    Between 1989 and 2004, Liberia experienced a devastating civil war that resulted in widespread trauma with almost no mental health infrastructure to help citizens cope. In 2009, the Liberian Ministry of Health and Social Welfare collaborated with researchers from Massachusetts General Hospital to conduct a rapid needs assessment survey in Liberia with local key informants (n = 171) to examine the impact of war and post-war events on emotional and behavioral problems of, functional limitations of, and appropriate treatment settings for Liberian youth aged 5-22. War exposure and post-conflict sexual violence, poverty, infectious disease and parental death negatively impacted youth mental health. Key informants perceived that youth displayed internalizing and externalizing symptoms and mental health-related functional impairment at home, school, work and in relationships. Medical clinics were identified as the most appropriate setting for mental health services. Youth in Liberia continue to endure the harsh social, economic and material conditions of everyday life in a protracted post-conflict state, and have significant mental health needs. Their observed functional impairment due to mental health issues further limited their access to protective factors such as education, employment and positive social relationships. Results from this study informed Liberia's first post-conflict mental health policy.

  2. Accurate protein crystallography at ultra-high resolution: Valence electron distribution in crambin

    PubMed Central

    Jelsch, Christian; Teeter, Martha M.; Lamzin, Victor; Pichon-Pesme, Virginie; Blessing, Robert H.; Lecomte, Claude

    2000-01-01

    The charge density distribution of a protein has been refined experimentally. Diffraction data for a crambin crystal were measured to ultra-high resolution (0.54 Å) at low temperature by using short-wavelength synchrotron radiation. The crystal structure was refined with a model for charged, nonspherical, multipolar atoms to accurately describe the molecular electron density distribution. The refined parameters agree within 25% with our transferable electron density library derived from accurate single crystal diffraction analyses of several amino acids and small peptides. The resulting electron density maps of redistributed valence electrons (deformation maps) compare quantitatively well with a high-level quantum mechanical calculation performed on a monopeptide. This study provides validation for experimentally derived parameters and a window into charge density analysis of biological macromolecules. PMID:10737790

  3. Accurate protein crystallography at ultra-high resolution: valence electron distribution in crambin.

    PubMed

    Jelsch, C; Teeter, M M; Lamzin, V; Pichon-Pesme, V; Blessing, R H; Lecomte, C

    2000-03-28

    The charge density distribution of a protein has been refined experimentally. Diffraction data for a crambin crystal were measured to ultra-high resolution (0.54 Å) at low temperature by using short-wavelength synchrotron radiation. The crystal structure was refined with a model for charged, nonspherical, multipolar atoms to accurately describe the molecular electron density distribution. The refined parameters agree within 25% with our transferable electron density library derived from accurate single crystal diffraction analyses of several amino acids and small peptides. The resulting electron density maps of redistributed valence electrons (deformation maps) compare quantitatively well with a high-level quantum mechanical calculation performed on a monopeptide. This study provides validation for experimentally derived parameters and a window into charge density analysis of biological macromolecules.

  4. Drugs, Women and Violence in the Americas: U.S. Quantitative Results of a Multi-Centric Pilot Project (Phase 2)

    PubMed Central

    González-Guarda, Rosa María; Peragallo, Nilda; Lynch, Ami; Nemes, Susanna

    2011-01-01

    Objectives To explore the collective and individual experiences that Latin American females in the U.S. have with substance abuse, violence and risky sexual behaviors. Methods This study was conducted in two phases from July 2006 to June 2007 in south Florida. This paper covers Phase 2. In Phase 2, questionnaires were provided to women to test whether there is a relationship between demographics, acculturation, depression, self-esteem and substance use/abuse; whether there is a relationship between demographics, acculturation, depression, self-esteem and violence exposure and victimization; whether there is a relationship between demographics, acculturation, depression, self-esteem, HIV knowledge and STD and HIV/AIDS risks among respondents; and whether there is a relationship between substance abuse, violence victimization and HIV/AIDS risks among respondents. Results Participants reported high rates of alcohol and drug abuse among their current or most recent partners. This is a major concern because partner alcohol use and drug use was related to partner physical, sexual and psychological abuse. Only two factors were associated with lifetime drug use: income and acculturation. Over half of the participants reported being victims of at least one form of abuse during childhood and adulthood. A substantial component of abuse reported during adulthood was perpetrated by a currently or recent intimate partner. Conclusions The results from this study suggest that substance abuse, violence and HIV should be addressed in an integrative and comprehensive manner. Recommendations for the development of policies, programs and services addressing substance abuse, violence and risk for HIV among Latinos are provided. PMID:22504304

  5. Efficient design, accurate fabrication and effective characterization of plasmonic quasicrystalline arrays of nano-spherical particles

    PubMed Central

    Namin, Farhad A.; Yuwen, Yu A.; Liu, Liu; Panaretos, Anastasios H.; Werner, Douglas H.; Mayer, Theresa S.

    2016-01-01

    In this paper, the scattering properties of two-dimensional quasicrystalline plasmonic lattices are investigated. We combine a newly developed synthesis technique, which allows for accurate fabrication of spherical nanoparticles, with a recently published variation of generalized multiparticle Mie theory to develop the first quantitative model for plasmonic nano-spherical arrays based on quasicrystalline morphologies. In particular, we study the scattering properties of Penrose and Ammann-Beenker gold spherical nanoparticle array lattices. We demonstrate that by using quasicrystalline lattices, one can obtain multi-band or broadband plasmonic resonances which are not possible in periodic structures. Unlike previously published works, our technique provides quantitative results which show excellent agreement with experimental measurements. PMID:26911709

  6. Accurate multiple network alignment through context-sensitive random walk

    PubMed Central

    2015-01-01

    Background Comparative network analysis can provide an effective means of analyzing large-scale biological networks and gaining novel insights into their structure and organization. Global network alignment aims to predict the best overall mapping between a given set of biological networks, thereby identifying important similarities as well as differences among the networks. It has been shown that network alignment methods can be used to detect pathways or network modules that are conserved across different networks. Until now, a number of network alignment algorithms have been proposed based on different formulations and approaches, many of them focusing on pairwise alignment. Results In this work, we propose a novel multiple network alignment algorithm based on a context-sensitive random walk model. The random walker employed in the proposed algorithm switches between two different modes, namely, an individual walk on a single network and a simultaneous walk on two networks. The switching decision is made in a context-sensitive manner by examining the current neighborhood, which is effective for quantitatively estimating the degree of correspondence between nodes that belong to different networks, in a manner that sensibly integrates node similarity and topological similarity. The resulting node correspondence scores are then used to predict the maximum expected accuracy (MEA) alignment of the given networks. Conclusions Performance evaluation based on synthetic networks as well as real protein-protein interaction networks shows that the proposed algorithm can construct more accurate multiple network alignments compared to other leading methods. PMID:25707987
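
    The core idea of scoring node correspondences by combining node similarity with neighbourhood (topological) similarity can be illustrated with a generic similarity-propagation sketch. This is not the authors' context-sensitive random walk, only a minimal example of the family of methods it refines; graphs and seed scores below are hypothetical:

```python
import itertools

def propagate_similarity(adj_a, adj_b, seed_sim, alpha=0.8, iters=20):
    """Iteratively refine cross-network node-pair scores: a pair (a, b)
    scores highly when its neighbour pairs also match (topological term),
    blended with prior node similarity (seed term).

    adj_a, adj_b: dict node -> set of neighbours for each network.
    seed_sim:     dict (node_a, node_b) -> prior similarity in [0, 1].
    """
    sim = dict(seed_sim)
    pairs = list(itertools.product(adj_a, adj_b))
    for _ in range(iters):
        new = {}
        for a, b in pairs:
            neigh = [sim.get((u, v), 0.0)
                     for u in adj_a[a] for v in adj_b[b]]
            denom = len(adj_a[a]) * len(adj_b[b]) or 1
            topo = sum(neigh) / denom
            new[(a, b)] = alpha * topo + (1 - alpha) * seed_sim.get((a, b), 0.0)
        sim = new
    return sim
```

    On two toy two-node networks with seed matches (0, 'x') and (1, 'y'), the propagated scores keep the seeded pairing well above the crossed one, which is the behaviour an aligner then feeds into its maximum-expected-accuracy step.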

  7. Prognostic value of quantitative cytometry in a series of 415 T1T2/N0N1/M0 breast cancer patients--preliminary results.

    PubMed

    Bolla, M; Seigneurin, D; Winckel, P; Marron-Charrière, J; Panh, M H; Pasquier, D; Chédin, M; Payan, R; Merlin, F; Colonna, M

    1996-09-01

    Identifying prognostic markers in local regional breast carcinomas remains an important challenge today. DNA content, obtained by flow cytometry, has been found to be of prognostic value; results with other methods remain less clear. This report describes DNA image cytometry patterns which are assessed with respect to disease-free survival. From June 1982 to December 1992, 415 patients under 75 years of age, without any previous or synchronous carcinoma, suffering from an invasive breast cancer classified as T1 (52.8%), T2 (47.2%), N0 (65.1%), N1 (34.9%), M0 according to clinical TNM staging, were enrolled in this study. The median age was 53 (28-75) and 58.8% of the patients were premenopausal; 85.3% underwent a breast-conservative procedure and 14.7% a modified radical mastectomy followed by postoperative irradiation. Histological axillary lymph node status, Scarff-Bloom grade and/or cytological grade, and oestrogen receptor content were used in decision-making for adjuvant treatment: hormonotherapy (48%) or chemotherapy (18.8%). Imprints were taken from the macroscopically visible lesion at the time of surgery, and Feulgen staining was carried out on air-dried smears to be analysed using the Samba 200 cell image processor (Alcatel TITN, France). Five parameters were systematically assessed: proliferation index, DNA histogram, integrated optical density, DNA malignancy grade and ploidy balance. With a median follow-up of 36 months (0-105), proliferation index (P = 0.0008), DNA histogram (P = 0.0017), integrated optical density (IOD) (P = 0.018) and DNA malignancy grade (P = 0.017) had a significant prognostic value on disease-free survival estimated by the Kaplan-Meier method. When these parameters were included in a Cox proportional hazards regression model, PR (P = 0.01), Scarff-Bloom histological grading (P = 0.02) and axillary clearance (P = 0.04) were significant; however, in the same model, taking into account the axillary lymph node histological status, IOD was

  8. The CheMin XRD on the Mars Science Laboratory Rover Curiosity: Construction, Operation, and Quantitative Mineralogical Results from the Surface of Mars

    NASA Technical Reports Server (NTRS)

    Blake, David F.

    2015-01-01

    The Mars Science Laboratory mission was launched from Cape Canaveral, Florida on Nov. 26, 2011 and landed in Gale crater, Mars on Aug. 6, 2012. MSL's mission is to identify and characterize ancient "habitable" environments on Mars. MSL's precision landing system placed the Curiosity rover within 2 km of the center of its 20 × 6 km landing ellipse, next to Gale's central mound, a 5,000-meter-high pile of laminated sediment which may contain 1 billion years of Mars history. Curiosity carries with it a full suite of analytical instruments, including the CheMin X-ray diffractometer, the first XRD flown in space. CheMin is essentially a transmission X-ray pinhole camera. A fine-focus Co source and collimator transmit a 50 µm beam through a powdered sample held between X-ray transparent plastic windows. The sample holder is shaken by a piezoelectric actuator such that the powder flows like a liquid, each grain passing in random orientation through the beam over time. Forward-diffracted and fluoresced X-ray photons from the sample are detected by an X-ray sensitive Charge Coupled Device (CCD) operated in single photon counting mode. When operated in this way, both the x,y position and the energy of each photon are detected. The resulting energy-selected Co Kα Debye-Scherrer pattern is used to determine the identities and amounts of minerals present via Rietveld refinement, and a histogram of all X-ray events constitutes an X-ray fluorescence analysis of the sample. The key role that definitive mineralogy plays in understanding the Martian surface is a consequence of the fact that minerals are thermodynamic phases, having known and specific ranges of temperature, pressure and composition within which they are stable. More than simple compositional analysis, definitive mineralogical analysis can provide information about pressure/temperature conditions of formation, past climate, water activity and the like. Definitive mineralogical analyses are necessary to establish
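    A Debye-Scherrer ring position (2θ) converts to a lattice d-spacing through Bragg's law, which is the first step feeding a Rietveld refinement. A minimal sketch, assuming the approximate Co Kα wavelength of 1.789 Å and a hypothetical ring position:

```python
import math

CO_KALPHA = 1.789  # Co Kalpha wavelength in angstroms (approximate)

def d_spacing(two_theta_deg, wavelength=CO_KALPHA, n=1):
    """Bragg's law: n * lambda = 2 * d * sin(theta)."""
    theta = math.radians(two_theta_deg / 2)
    return n * wavelength / (2 * math.sin(theta))

# e.g. a diffraction ring observed at 2-theta = 31.2 degrees
print(round(d_spacing(31.2), 3))
```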

  9. Altered levels of the Taraxacum kok-saghyz (Russian dandelion) small rubber particle protein, TkSRPP3, result in qualitative and quantitative changes in rubber metabolism.

    PubMed

    Collins-Silva, Jillian; Nural, Aise Taban; Skaggs, Amanda; Scott, Deborah; Hathwaik, Upul; Woolsey, Rebekah; Schegg, Kathleen; McMahan, Colleen; Whalen, Maureen; Cornish, Katrina; Shintani, David

    2012-07-01

    Several proteins have been identified and implicated in natural rubber biosynthesis, one of which, the small rubber particle protein (SRPP), was originally identified in Hevea brasiliensis as an abundant protein associated with cytosolic vesicles known as rubber particles. While previous in vitro studies suggest that SRPP plays a role in rubber biosynthesis, in vivo evidence is lacking to support this hypothesis. To address this issue, a transgene approach was taken in Taraxacum kok-saghyz (Russian dandelion or Tk) to determine if altered SRPP levels would influence rubber biosynthesis. Three dandelion SRPPs were found to be highly abundant on dandelion rubber particles. The most abundant particle-associated SRPP, TkSRPP3, showed temporal and spatial patterns of expression consistent with patterns of natural rubber accumulation in dandelion. To confirm its role in rubber biosynthesis, TkSRPP3 expression was altered in Russian dandelion using over-expression and RNAi methods. While TkSRPP3 over-expressing lines had slightly higher levels of rubber in their roots, relative to the control, TkSRPP3 RNAi lines showed significant decreases in root rubber content and produced dramatically lower molecular weight rubber than the control line. Not only do the results here provide in vivo evidence that TkSRPP proteins affect the amount of rubber in dandelion root, but they also suggest a function in regulating the molecular weight of the cis-1,4-polyisoprene polymer.

  10. Need for a gender-sensitive human security framework: results of a quantitative study of human security and sexual violence in Djohong District, Cameroon

    PubMed Central

    2014-01-01

    Background Human security shifts traditional concepts of security from interstate conflict and the absence of war to the security of the individual. Broad definitions of human security include livelihoods and food security, health, psychosocial well-being, enjoyment of civil and political rights and freedom from oppression, and personal safety, in addition to absence of conflict. Methods In March 2010, we undertook a population-based health and livelihood study of female refugees from conflict-affected Central African Republic living in Djohong District, Cameroon and their female counterparts within the Cameroonian host community. Embedded within the survey instrument were indicators of human security derived from the Leaning-Arie model that defined three domains of psychosocial stability suggesting individuals and communities are most stable when their core attachments to home, community and the future are intact. Results While the female refugee human security outcomes describe a population successfully assimilated and thriving in their new environments based on these three domains, the ability of human security indicators to predict the presence or absence of lifetime and six-month sexual violence was inadequate. Using receiver operating characteristic (ROC) analysis, the study demonstrates that common human security indicators do not uncover either lifetime or recent prevalence of sexual violence. Conclusions These data suggest that current gender-blind approaches of describing human security are missing serious threats to the safety of one half of the population and that efforts to develop robust human security indicators should include those that specifically measure violence against women. PMID:24829613
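    The ROC analysis referred to above scores how well an indicator separates individuals with and without an outcome; an AUC near 0.5 means no discrimination. A minimal sketch using the rank (Mann-Whitney) formulation, with hypothetical indicator scores:

```python
def roc_auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the probability that a
    randomly chosen positive case outranks a randomly chosen negative."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical security-indicator scores against a violence outcome
labels = [1, 1, 1, 0, 0, 0, 0]
scores = [0.9, 0.4, 0.7, 0.3, 0.6, 0.2, 0.1]
print(roc_auc(labels, scores))
```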

  11. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput. 2011, 7, 2740].

  12. Quantitative and Qualitative Antibody Responses to Immunization With the Pneumococcal Polysaccharide Vaccine in HIV-Infected Patients After Initiation of Antiretroviral Treatment: Results From a Randomized Clinical Trial

    PubMed Central

    Rodriguez-Barradas, Maria C.; Serpa, Jose A.; Munjal, Iona; Mendoza, Daniel; Rueda, Adriana M.; Mushtaq, Mahwish; Pirofski, Liise-anne

    2015-01-01

    Background. Pneumococcal vaccination is recommended for human immunodeficiency virus-infected (HIV+) persons; the best timing for immunization with respect to initiation of antiretroviral therapy (ART) is unknown. Methods. Double-blind, placebo-controlled trial in HIV+ subjects with CD4+ T cells/µL (CD4) ≥ 200, randomized to receive the 23-valent pneumococcal polysaccharide vaccine (PPV23) or placebo at enrollment, followed by placebo or PPV23, respectively, 9–12 months later (after ≥6 months of ART). Capsular polysaccharide-specific immunoglobulin (Ig) G and IgM levels to serotypes 1, 3, 4, 6B, and 23F, and opsonophagocytic killing activity (OPA) to serotypes 6B and 23F were evaluated 1 month postvaccination. Results. One hundred seven subjects were enrolled; 72 (67.3%) were evaluable (36/group). Both groups had significant increases in pre- to 1-month postvaccination IgG levels, but negligible increases in IgM, and significant increases in OPA titers to serotype 6B but not to 23F. There were no significant differences between groups in serotype-specific IgM or IgG levels or OPA titers. For the combined groups, there was a significant correlation between serotype-specific IgG and OPA titers to 23F but not to 6B. There was no correlation between CD4 count or viral load and IgG responses. Conclusions. In HIV+ subjects with CD4 ≥ 200, delaying PPV23 until ≥6 months of ART does not improve responses and may lead to missed opportunities for immunization. PMID:25538270

  13. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The Von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.
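    Von Neumann stability analysis of the kind mentioned above inserts a Fourier mode into the linearized difference equation and checks that the amplification factor g satisfies |g| ≤ 1 for every wavenumber. A sketch for the first-order upwind scheme for linear advection (an illustrative scheme, not the paper's partial-implicitization method):

```python
import cmath

def max_amplification(c, samples=1000):
    """Largest |g| over wavenumbers for first-order upwind applied to
    u_t + a u_x = 0, where g(theta) = 1 - c*(1 - exp(-i*theta)) and
    c = a*dt/dx is the CFL number; stable iff the maximum is <= 1."""
    thetas = [2 * cmath.pi * k / samples for k in range(samples)]
    return max(abs(1 - c * (1 - cmath.exp(-1j * th))) for th in thetas)

print(max_amplification(0.5), max_amplification(1.5))
```

    The scheme is stable for 0 ≤ c ≤ 1 and unstable beyond, which the numerical sweep reproduces.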

  14. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information can bring negative effects, especially when it is delayed: travelers prefer the route reported to be in the best condition, yet delayed information reflects past rather than current traffic conditions, so travelers make wrong routing decisions, decreasing capacity, increasing oscillations and driving the system away from equilibrium. To avoid this negative effect, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality helps to improve efficiency in terms of capacity, oscillation and the gap from system equilibrium.
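    The boundedly rational choice rule described above can be sketched directly; the travel times, threshold and trial count below are hypothetical:

```python
import random

def choose_route(t1, t2, br):
    """Boundedly rational route choice: when the reported travel times
    differ by less than the threshold BR, the two routes are treated as
    equivalent and picked at random; otherwise take the faster one."""
    if abs(t1 - t2) < br:
        return random.choice([0, 1])
    return 0 if t1 < t2 else 1

random.seed(1)
# with a gap larger than BR everyone picks route 0;
# within BR the split settles near 50/50
picks = [choose_route(10.0, 10.4, br=1.0) for _ in range(10000)]
print(sum(picks) / len(picks))
```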

  15. Can Patient Safety Incident Reports Be Used to Compare Hospital Safety? Results from a Quantitative Analysis of the English National Reporting and Learning System Data

    PubMed Central

    2015-01-01

    claims per bed were significantly negatively associated with incident reports. Patient satisfaction and mortality outcomes were not significantly associated with reporting rates. Staff survey responses revealed that keeping reports confidential, keeping staff informed about incidents and giving feedback on safety initiatives increased reporting rates [r = 0.26 (p<0.01), r = 0.17 (p = 0.04), r = 0.23 (p = 0.01), r = 0.20 (p = 0.02)]. Conclusion The NRLS is the largest patient safety reporting system in the world. This study did not demonstrate that many hospital characteristics significantly influence overall reporting rate. There was no association between hospital size, number of staff, mortality outcomes or patient satisfaction outcomes and incident reporting rate. The study did show that hospitals where staff reported more incidents had reduced litigation claims, and that when clinician staffing is increased, fewer incidents involving patient harm are reported, whilst near misses remain the same. Certain specialties report more near misses than others, and doctors report more harm incidents than near misses. Staff survey results showed that open environments and reduced fear of punitive response increase incident reporting. We suggest that reporting rates should not be used to assess hospital safety. Different healthcare professionals focus on different types of safety incidents, and focusing on these areas whilst creating a responsive, confidential learning environment will increase staff engagement with error disclosure. PMID:26650823

  16. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins.

    PubMed

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-03-07

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg⁻¹, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg⁻¹, respectively. The quantitative results were obtained using a hand-held strip scan reader, with calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg⁻¹, respectively. The analytical results for spiked samples were in accordance with the actual content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.
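    A calculated limit of detection of the kind quoted above is commonly taken as the concentration whose expected signal equals the blank mean plus three standard deviations, read back through a linear calibration. A minimal sketch with hypothetical reader signals and calibration parameters:

```python
import statistics

def lod_from_blanks(blank_signals, slope, intercept):
    """Concentration whose expected signal equals mean(blank) + 3*SD(blank),
    given a hypothetical linear calibration signal = slope*conc + intercept."""
    cutoff = statistics.mean(blank_signals) + 3 * statistics.stdev(blank_signals)
    return (cutoff - intercept) / slope

# hypothetical blank reader signals and calibration line
lod = lod_from_blanks([102, 98, 100, 101, 99], slope=50.0, intercept=100.0)
print(round(lod, 4))
```

    Competitive-format strips have signal decreasing with concentration, so in practice the cutoff direction depends on the assay format; the sketch assumes a signal that rises with concentration.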

  17. Quantitative magnetospheric models: results and perspectives.

    NASA Astrophysics Data System (ADS)

    Kuznetsova, M.; Hesse, M.; Gombosi, T.; Csem Team

    Global magnetospheric models are an indispensable tool that allows multi-point measurements to be put into global context. Significant progress has been achieved in global MHD modeling of magnetosphere structure and dynamics. Medium-resolution simulations confirm the general topological picture suggested by Dungey. State-of-the-art global models with adaptive grids allow simulations with a highly resolved magnetopause and magnetotail current sheet. Advanced high-resolution models are capable of reproducing transient phenomena, such as FTEs associated with the formation of flux ropes or plasma bubbles embedded in the magnetopause, and demonstrate the generation of vortices at the magnetospheric flanks. On the other hand, there is still controversy about the global state of the magnetosphere predicted by MHD models, to the point of questioning the length of the magnetotail and the location of the reconnection sites within it. For example, for steady southward IMF driving conditions, resistive MHD simulations produce a steady configuration with an almost stationary near-Earth neutral line, while there is plenty of observational evidence of a periodic loading-unloading cycle during long periods of southward IMF. Successes and challenges in global modeling of magnetospheric dynamics will be addressed. One of the major challenges is to quantify the interaction between large-scale global magnetospheric dynamics and microphysical processes in diffusion regions near reconnection sites. Possible solutions to these controversies will be discussed.

  18. Raman Spectroscopy as an Accurate Probe of Defects in Graphene

    NASA Astrophysics Data System (ADS)

    Rodriguez-Nieva, Joaquin; Barros, Eduardo; Saito, Riichiro; Dresselhaus, Mildred

    2014-03-01

    Raman spectroscopy has proved to be an invaluable non-destructive technique that allows us to obtain intrinsic information about graphene. Furthermore, defect-induced Raman features, namely the D and D' bands, have previously been used to assess the purity of graphitic samples. However, a quantitative understanding of the signatures of the different types of defects in the Raman spectra is still an open problem. Experimental results already suggest that the Raman intensity ratio ID/ID' may allow us to identify the nature of the defects. We study from a theoretical point of view the power and limitations of Raman spectroscopy in the study of defects in graphene. We derive an analytic model that describes the double-resonance Raman process in disordered graphene samples, and which explicitly shows the roles played by both the defect-dependent parameters and the experimentally controlled variables. We compare our model with previous Raman experiments, and use it to guide new ways in which defects in graphene can be accurately probed with Raman spectroscopy. We acknowledge support from NSF grant DMR1004147.

  19. Personalized Orthodontic Accurate Tooth Arrangement System with Complete Teeth Model.

    PubMed

    Cheng, Cheng; Cheng, Xiaosheng; Dai, Ning; Liu, Yi; Fan, Qilei; Hou, Yulin; Jiang, Xiaotong

    2015-09-01

    Accuracy, validity and the lack of positional information relating dental root to jaw are key problems in tooth arrangement technology. This paper describes a newly developed virtual, personalized and accurate tooth arrangement system based on complete information about the dental root and skull. Firstly, a feature constraint database of a 3D teeth model is established. Secondly, for computed simulation of tooth movement, the reference planes and lines are defined by the anatomical reference points. The mathematical model for matching tooth patterns and the principles of rigid-body pose transformation are fully utilized. The positional relation between dental root and alveolar bone is considered during the design process. Finally, the relative pose relationships among the various teeth are optimized using the object mover, and a personalized therapeutic schedule is formulated. Experimental results show that the virtual tooth arrangement system arranges abnormal teeth very well and is sufficiently flexible. The positional relation between root and jaw is favorable. This newly developed system is characterized by high-speed processing and quantitative evaluation of the amount of 3D movement of an individual tooth.

  20. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. Initially the thermometers are at ambient temperature; they are then immediately immersed in saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter of 15 mm. One of them is a K-type industrial thermometer that is widely available commercially. The temperature indicated by this thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was proposed and also used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheathed thermocouple located at its center. The fluid temperature was determined from measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results than measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model. By comparing the results, it was demonstrated that the new thermometer yields the fluid temperature much faster and with higher accuracy than the industrial thermometer. Accurate measurements of fast-changing fluid temperature are possible thanks to the low-inertia thermometer and the fast space marching method applied to solve the inverse heat conduction problem.
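    For a first-order inertia thermometer, tau * dT_m/dt + T_m = T_f, so the fluid temperature can be recovered from the reading as T_f = T_m + tau * dT_m/dt. A sketch of this correction on a synthetic step response (the time constant and temperatures are hypothetical, not the paper's sensors):

```python
import math

def correct_first_order(times, readings, tau):
    """Recover fluid temperature from a first-order thermometer model:
    T_fluid = T_meas + tau * dT_meas/dt, using central differences on
    the interior samples."""
    corrected = []
    for i in range(1, len(times) - 1):
        dTdt = (readings[i + 1] - readings[i - 1]) / (times[i + 1] - times[i - 1])
        corrected.append(readings[i] + tau * dTdt)
    return corrected

# synthetic step from 20 to 100 deg C read by a tau = 4 s sensor
tau = 4.0
ts = [0.1 * k for k in range(200)]
meas = [100 - 80 * math.exp(-t / tau) for t in ts]
fluid = correct_first_order(ts, meas, tau)
print(round(fluid[50], 2))
```

    On this noise-free signal the correction recovers the 100 deg C step almost exactly while the raw reading is still tens of degrees low; with real data the derivative term amplifies noise, which motivates the paper's inverse-method alternative.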

  1. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
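    Technical variability figures such as the 6.5-19.2% above are coefficients of variation across replicate measurements. A minimal sketch with hypothetical peak areas:

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: 100 * SD / mean, the usual way technical
    variability across replicate runs is reported."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# hypothetical peak areas from three replicate injections of one peptide
print(round(cv_percent([1.02e6, 1.10e6, 0.97e6]), 1))
```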

  2. Absolute quantitation of protein posttranslational modification isoform.

    PubMed

    Yang, Zhu; Li, Ning

    2015-01-01

    Mass spectrometry has been widely applied in the characterization and quantification of proteins from complex biological samples. Because absolute protein amounts are needed to construct mathematical models of the molecular systems underlying various biological phenotypes and phenomena, a number of quantitative proteomic methods have been adopted to measure absolute quantities of proteins by mass spectrometry. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) coupled with internal peptide standards, i.e., stable isotope-coded peptide dilution series, which originated in the field of analytical chemistry, has become a widely applied method in absolute quantitative proteomics research. This approach provides a growing body of absolute protein quantitation results of high confidence. As quantitative study of posttranslational modification (PTM), which modulates the biological activity of proteins, is crucial for biological science, and each isoform may contribute a unique biological function, degradation rate, and/or subcellular location, the absolute quantitation of protein PTM isoforms has become more relevant to their biological significance. In order to obtain the absolute cellular amount of a PTM isoform of a protein accurately, the impacts of protein fractionation, protein enrichment, and proteolytic digestion yield should be taken into consideration, and those effects arising before differentially stable isotope-coded PTM peptide standards are spiked into sample peptides have to be corrected. Assisted by stable isotope-labeled peptide standards, the absolute quantitation of isoforms of posttranslationally modified protein (AQUIP) method takes all these factors into account and determines the absolute amount of a protein PTM isoform from the absolute amount of the protein of interest and the PTM occupancy at the site of the protein. The absolute amount of the protein of interest is inferred by quantifying both the absolute amounts of a few PTM
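    The AQUIP arithmetic described above reduces to multiplying the absolute amount of the protein by the fractional PTM occupancy at the site. A trivial worked sketch with hypothetical numbers:

```python
def ptm_isoform_amount(total_protein_fmol, occupancy):
    """AQUIP-style arithmetic: the absolute amount of a PTM isoform is
    the absolute amount of the protein times the fractional occupancy
    of the modification site (the numbers below are hypothetical)."""
    return total_protein_fmol * occupancy

# 500 fmol of protein with 18% phosphorylation at the site
print(ptm_isoform_amount(500.0, 0.18))
```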

  3. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    Title 38, Pensions, Bonuses, and Veterans' Relief (2013 edition): Schedule for Rating Disabilities, Disability Ratings, The Musculoskeletal System, § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  8. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect in smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one-intermediate-state model is employed. A modification of this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical in smooth regions, and yield high resolution at discontinuities.
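    The median function mentioned above gives a compact way to write slope-limiting constraints in a reconstruction step. An illustrative MUSCL-style limited slope (a common formulation, not necessarily the paper's exact constraint):

```python
def median3(a, b, c):
    """Median of three numbers, used to write slope limiters compactly."""
    return max(min(a, b), min(max(a, b), c))

def limited_slope(um, u0, up):
    """A MUSCL-style limited slope for neighboring cell averages
    (um, u0, up): the median of the central difference and the two
    one-sided differences. At a local extremum the one-sided
    differences change sign and the median returns zero, which is
    what enforces monotonicity."""
    central = 0.5 * (up - um)
    return median3(up - u0, central, u0 - um)

print(limited_slope(0.0, 1.0, 2.0), limited_slope(0.0, 1.0, 0.0))
```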

  9. Simple Mathematical Models Do Not Accurately Predict Early SIV Dynamics

    PubMed Central

    Noecker, Cecilia; Schaefer, Krista; Zaccheo, Kelly; Yang, Yiding; Day, Judy; Ganusov, Vitaly V.

    2015-01-01

    Upon infection of a new host, human immunodeficiency virus (HIV) replicates in the mucosal tissues and is generally undetectable in circulation for 1–2 weeks post-infection. Several interventions against HIV including vaccines and antiretroviral prophylaxis target virus replication at this earliest stage of infection. Mathematical models have been used to understand how HIV spreads from mucosal tissues systemically and what impact vaccination and/or antiretroviral prophylaxis has on viral eradication. Because predictions of such models have been rarely compared to experimental data, it remains unclear which processes included in these models are critical for predicting early HIV dynamics. Here we modified the “standard” mathematical model of HIV infection to include two populations of infected cells: cells that are actively producing the virus and cells that are transitioning into virus production mode. We evaluated the effects of several poorly known parameters on infection outcomes in this model and compared model predictions to experimental data on infection of non-human primates with variable doses of simian immunodeficiency virus (SIV). First, we found that the mode of virus production by infected cells (budding vs. bursting) has a minimal impact on the early virus dynamics for a wide range of model parameters, as long as the parameters are constrained to provide the observed rate of SIV load increase in the blood of infected animals. Interestingly and in contrast with previous results, we found that the bursting mode of virus production generally results in a higher probability of viral extinction than the budding mode of virus production. Second, this mathematical model was not able to accurately describe the change in experimentally determined probability of host infection with increasing viral doses. Third and finally, the model was also unable to accurately explain the decline in the time to virus detection with increasing viral dose. These results
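    The modified standard model described above splits infected cells into a pre-productive (eclipse) compartment and a productive one. A forward-Euler sketch of such a system with hypothetical parameter values (not those fitted in the study):

```python
def simulate(days=10.0, dt=0.001):
    """Forward-Euler integration of a target-cell-limited model with an
    eclipse compartment (all rates per day; parameters hypothetical):
      T' = -beta*T*V                 target cells
      E' =  beta*T*V - k*E           infected, not yet producing
      I' =  k*E - delta*I            productively infected
      V' =  p*I - c*V                free virus
    """
    beta, k, delta, p, c = 1e-7, 1.0, 0.5, 1000.0, 10.0
    T, E, I, V = 1e6, 0.0, 0.0, 1e-3
    for _ in range(int(days / dt)):
        dT = -beta * T * V
        dE = beta * T * V - k * E
        dI = k * E - delta * I
        dV = p * I - c * V
        T += dT * dt; E += dE * dt; I += dI * dt; V += dV * dt
    return T, E, I, V

T, E, I, V = simulate()
print(T, V)
```

    With these parameters the basic reproduction number beta*T0*p/(c*delta) is well above 1, so the viral load grows after an initial clearance-dominated dip, mirroring the eclipse-phase delay discussed in the abstract.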

  10. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  11. Fully automated quantitative cephalometry using convolutional neural networks.

    PubMed

    Arık, Sercan Ö; Ibragimov, Bulat; Xing, Lei

    2017-01-01

    Quantitative cephalometry plays an essential role in clinical diagnosis, treatment, and surgery. Development of fully automated techniques for these procedures is important to enable consistently accurate computerized analyses. We study the application of deep convolutional neural networks (CNNs) for fully automated quantitative cephalometry for the first time. The proposed framework utilizes CNNs for detection of landmarks that describe the anatomy of the depicted patient and yield quantitative estimation of pathologies in the jaws and skull base regions. We use a publicly available cephalometric x-ray image dataset to train CNNs for recognition of landmark appearance patterns. CNNs are trained to output probabilistic estimations of different landmark locations, which are combined using a shape-based model. We evaluate the overall framework on the test set and compare with other proposed techniques. We use the estimated landmark locations to assess anatomically relevant measurements and classify them into different anatomical types. Overall, our results demonstrate high anatomical landmark detection accuracy ([Formula: see text] to 2% higher success detection rate for a 2-mm range compared with the top benchmarks in the literature) and high anatomical type classification accuracy ([Formula: see text] average classification accuracy for test set). We demonstrate that CNNs, which merely input raw image patches, are promising for accurate quantitative cephalometry.
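
Turning a per-landmark probabilistic heatmap into a coordinate is commonly done with an expectation ("soft argmax") step. The sketch below, run on a synthetic Gaussian heatmap, illustrates only this generic idea; it is not the authors' shape-based combination model.

```python
import numpy as np

def heatmap_to_landmark(heatmap):
    """Expected (row, col) position under the normalized heatmap:
    a 'soft argmax' over the probabilistic landmark estimate."""
    p = heatmap / heatmap.sum()
    rows, cols = np.indices(p.shape)
    return float((rows * p).sum()), float((cols * p).sum())

# Synthetic Gaussian heatmap centred at row 12, col 30 (sigma = 3 px):
y, x = np.mgrid[0:64, 0:64]
hm = np.exp(-((y - 12) ** 2 + (x - 30) ** 2) / (2 * 3.0 ** 2))
r, c = heatmap_to_landmark(hm)
```

The expectation is sub-pixel accurate for a well-peaked heatmap, which is what makes the 2-mm success-detection-rate benchmark reachable from discrete CNN outputs.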

  12. Quantitative analysis of the heterogeneous population of endocytic vesicles.

    PubMed

    Kozlov, Konstantin; Kosheverova, Vera; Kamentseva, Rimma; Kharchenko, Marianna; Sokolkova, Alena; Kornilova, Elena; Samsonova, Maria

    2017-03-07

    The quantitative characterization of endocytic vesicles in images acquired with a microscope is critically important for deciphering endocytosis mechanisms. Image segmentation is the most important step of quantitative image analysis. Despite the availability of many segmentation methods, accurate segmentation is challenging when the images are heterogeneous with respect to object shapes and signal intensities, as is typical for images of endocytic vesicles. We present a Morphological reconstruction and Contrast mapping segmentation method (MrComas) for the segmentation of the endocytic vesicle population that copes with the heterogeneity in their shape and intensity. The method uses morphological opening and closing by reconstruction in the vicinity of local minima and maxima, respectively, thus creating a strong contrast between their basins of attraction. As a consequence, the intensity is flattened within the objects and their edges are enhanced. The method accurately recovered quantitative characteristics of synthetic images that preserve characteristic features of the endocytic vesicle population. In benchmarks and quantitative comparisons with two other popular segmentation methods, namely manual thresholding and the Squash plugin, MrComas showed the best segmentation results on real biological images of EGFR (Epidermal Growth Factor Receptor) endocytosis. As a proof of feasibility, the method was applied to quantify the dynamical behavior of Early Endosomal Autoantigen 1 (EEA1)-positive endosome subpopulations during EGF-stimulated endocytosis.
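
Opening by reconstruction, one half of the operation MrComas builds on, can be sketched in a few lines of NumPy. This toy version uses a fixed 3x3 structuring element and is only an illustration of the general technique (erosion removes small bright features; reconstruction regrows surviving structures to full height), not the published implementation.

```python
import numpy as np

def _neigh(a):
    """Stack of the 9 shifted copies of `a` covering a 3x3 neighborhood."""
    p = np.pad(a, 1, mode='edge')
    return np.stack([p[i:i + a.shape[0], j:j + a.shape[1]]
                     for i in range(3) for j in range(3)])

def dilate3x3(a):
    return _neigh(a).max(axis=0)

def erode3x3(a):
    return _neigh(a).min(axis=0)

def reconstruct_by_dilation(marker, mask):
    """Grayscale reconstruction: grow the marker by repeated dilation,
    clipped under the mask, until a fixed point is reached."""
    prev = marker
    while True:
        cur = np.minimum(dilate3x3(prev), mask)
        if np.array_equal(cur, prev):
            return cur
        prev = cur

def opening_by_reconstruction(img):
    """Erode (removing features smaller than the structuring element),
    then rebuild surviving objects to their original shape and height."""
    return reconstruct_by_dilation(erode3x3(img), img)
```

On a test image with a 10x10 bright square plus one isolated bright pixel, the single pixel is removed while the square is restored exactly, which is the edge-preserving flattening behavior the abstract describes.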

  13. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  14. Accurate Cross Sections for Microanalysis

    PubMed Central

    Rez, Peter

    2002-01-01

    To calculate the intensity of x-ray emission in electron beam microanalysis requires knowledge of the energy distribution of the electrons in the solid, the energy variation of the ionization cross section of the relevant subshell, the fraction of ionization events producing x rays of interest, and the absorption coefficient of the x rays on the path to the detector. The theoretical predictions and experimental data available for ionization cross sections are limited mainly to K shells of a few elements. Results of systematic plane wave Born approximation calculations with exchange for K, L, and M shell ionization cross sections over the range of electron energies used in microanalysis are presented. Comparisons are made with experimental measurements for selected K shells and it is shown that the plane wave theory is not appropriate for overvoltages less than 2.5. PMID:27446747
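
A common closed-form stand-in for such cross sections is the nonrelativistic Bethe parameterization, sigma proportional to ln(c·U)/(U·Ec^2) with U the overvoltage. The sketch below is illustrative only: the empirical constants b and c vary between compilations, and the prefactor is an assumption, not taken from this paper.

```python
import math

def bethe_k_cross_section(overvoltage, ec_kev, nk=2, b=0.9, c=0.77):
    """Nonrelativistic Bethe-style K-shell ionization cross section (cm^2):
    sigma ~ nk * b * ln(c*U) / (U * Ec^2), with U = E/Ec the overvoltage
    and Ec the K-shell ionization energy in keV. Constants b, c and the
    prefactor are illustrative; published compilations differ."""
    u = overvoltage
    return 6.51e-20 * nk * b * math.log(c * u) / (u * ec_kev ** 2)
```

Note that this form goes to zero (and then negative) near threshold, echoing the abstract's caution that plane-wave-based theory is unreliable at low overvoltage.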

  15. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomena of Fresnel diffraction to micron accurate measurement. This report discusses past research on the phenomena and the basis of the use of Fresnel diffraction for distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and equipment requirements for extending the effective range of Fresnel diffraction systems are also described.

  16. An analytic model for accurate spring constant calibration of rectangular atomic force microscope cantilevers.

    PubMed

    Li, Rui; Ye, Hongfei; Zhang, Weisheng; Ma, Guojun; Su, Yewang

    2015-10-29

    Spring constant calibration of the atomic force microscope (AFM) cantilever is of fundamental importance for quantifying the force between the AFM cantilever tip and the sample. The calibration within the framework of thin plate theory undoubtedly has a higher accuracy and broader scope than that within the well-established beam theory. However, thin plate theory-based accurate analytic determination of the constant has been perceived as an extremely difficult issue. In this paper, we implement the thin plate theory-based analytic modeling for the static behavior of rectangular AFM cantilevers, which reveals that the three-dimensional effect and Poisson effect play important roles in accurate determination of the spring constants. A quantitative scaling law is found: the normalized spring constant depends only on the Poisson's ratio, the normalized dimensions, and the normalized load coordinate. Both the literature and our refined finite element model validate the present results. The developed model is expected to serve as the benchmark for accurate calibration of rectangular AFM cantilevers.
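
For comparison, the well-established beam-theory estimate that the plate-theory model refines is a one-line formula, k = E·w·t^3 / (4·L^3) for an end-loaded rectangular cantilever. A sketch with illustrative lever dimensions (not values from the paper):

```python
def beam_spring_constant(E, w, t, L):
    """Euler-Bernoulli beam estimate of the normal spring constant of a
    rectangular cantilever under an end load: k = E*w*t^3 / (4*L^3).
    Ignores the three-dimensional and Poisson effects that the
    plate-theory treatment accounts for."""
    return E * w * t ** 3 / (4.0 * L ** 3)

# Typical silicon contact-mode lever (illustrative dimensions):
# E = 169 GPa, 30 um wide, 2 um thick, 450 um long
k = beam_spring_constant(169e9, 30e-6, 2e-6, 450e-6)  # ~0.11 N/m
```

The plate-theory corrections discussed in the abstract enter as multiplicative factors on this baseline that depend on Poisson's ratio and the normalized geometry.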

  17. A fast, accurate, and reliable reconstruction method of the lumbar spine vertebrae using positional MRI.

    PubMed

    Simons, Craig J; Cobb, Loren; Davidson, Bradley S

    2014-04-01

    In vivo measurement of lumbar spine configuration is useful for constructing quantitative biomechanical models. Positional magnetic resonance imaging (MRI) accommodates a larger range of movement in most joints than conventional MRI and does not require a supine position. However, this is achieved at the expense of image resolution and contrast. As a result, quantitative research using positional MRI has required long reconstruction times and is sensitive to incorrectly identifying the vertebral boundary due to low contrast between bone and surrounding tissue in the images. We present a semi-automated method used to obtain digitized reconstructions of lumbar vertebrae in any posture of interest. This method combines a high-resolution reference scan with a low-resolution postural scan to provide a detailed and accurate representation of the vertebrae in the posture of interest. Compared to a criterion standard, translational reconstruction error ranged from 0.7 to 1.6 mm and rotational reconstruction error ranged from 0.3 to 2.6°. Intraclass correlation coefficients indicated high interrater reliability for measurements within the imaging plane (ICC 0.97-0.99). Computational efficiency indicates that this method may be used to compile data sets large enough to account for population variance, and potentially expand the use of positional MRI as a quantitative biomechanics research tool.
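
Matching a high-resolution reference vertebra to its low-resolution postural scan amounts to fitting a rigid transform between corresponding points. The sketch below is the standard least-squares (Kabsch/SVD) solution under that generic formulation, not the authors' specific pipeline.

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares rigid transform (Kabsch): find R, t such that
    R @ P_i + t best matches Q_i, for (n, 3) arrays of corresponding
    points (e.g. vertebral surface landmarks)."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    H = (P - Pc).T @ (Q - Qc)               # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Qc - R @ Pc
    return R, t
```

On noiseless correspondences the recovered rotation and translation are exact to machine precision; the sub-millimetre and sub-degree errors reported in the abstract come from image resolution and boundary identification, not from this fitting step.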

  18. Problems in publishing accurate color in IEEE journals.

    PubMed

    Vrhel, Michael J; Trussell, H J

    2002-01-01

    To demonstrate the performance of color image processing algorithms, it is desirable to be able to accurately display color images in archival publications. In poster presentations, the authors have substantial control of the printing process, although little control of the illumination. For journal publication, the authors must rely on professional intermediaries (printers) to accurately reproduce their results. Our previous work describes requirements for accurately rendering images using one's own equipment. This paper discusses the problems of dealing with intermediaries and offers suggestions for improved communication and rendering.

  19. Quantitative Techniques in Volumetric Analysis

    NASA Astrophysics Data System (ADS)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are:
    - Quantitative transfer of a solid with a weighing spoon
    - Quantitative transfer of a solid with a finger-held weighing bottle
    - Quantitative transfer of a solid with a paper-strap-held bottle
    - Quantitative transfer of a solid with a spatula
    - Examples of common quantitative weighing errors
    - Quantitative transfer of a solid from dish to beaker to volumetric flask
    - Quantitative transfer of a solid from dish to volumetric flask
    - Volumetric transfer pipet
    - A complete acid-base titration
    - Hand technique variations
    The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the
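
The acid-base titration technique covered by the tape rests on simple endpoint stoichiometry: moles of titrant delivered determine moles of analyte. A minimal sketch with illustrative values (the function name and numbers are not from the tape):

```python
def titration_concentration(c_titrant, v_titrant, v_analyte, ratio=1.0):
    """Analyte molarity at the titration endpoint, assuming a known
    stoichiometric ratio (moles of analyte per mole of titrant).
    Volumes may be in any common unit as long as they match."""
    return ratio * c_titrant * v_titrant / v_analyte

# 25.00 mL of HCl neutralized by 31.25 mL of 0.1000 M NaOH (1:1):
c_hcl = titration_concentration(0.1000, 31.25, 25.00)  # 0.125 M
```

The classical-technique errors the tape demonstrates (incomplete transfer, dilution past the mark) propagate directly into the volumes and concentrations in this calculation.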

  20. A Qualitative and Quantitative Evaluation of 8 Clear Sky Models.

    PubMed

    Bruneton, Eric

    2016-10-27

    We provide a qualitative and quantitative evaluation of 8 clear sky models used in Computer Graphics. We compare the models with each other as well as with measurements and with a reference model from the physics community. After a short summary of the physics of the problem, we present the measurements and the reference model, and how we "invert" it to get the model parameters. We then give an overview of each CG model, and detail its scope, its algorithmic complexity, and its results using the same parameters as in the reference model. We also compare the models with a perceptual study. Our quantitative results confirm that the fewer simplifications and approximations are used to solve the physical equations, the more accurate the results are. We conclude with a discussion of the advantages and drawbacks of each model, and how to further improve their accuracy.

  1. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930’s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here? Or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And third, a general approach for evaluating the relevance of quantitative climate model output

  2. Target identification with quantitative activity based protein profiling (ABPP).

    PubMed

    Chen, Xiao; Wong, Yin Kwan; Wang, Jigang; Zhang, Jianbin; Lee, Yew-Mun; Shen, Han-Ming; Lin, Qingsong; Hua, Zi-Chun

    2017-02-01

    As many small bioactive molecules fulfill their functions through interacting with protein targets, the identification of such targets is crucial in understanding their mechanisms of action (MOA) and side effects. With technological advancements in target identification, it has become possible to accurately and comprehensively study the MOA and side effects of small molecules. While small molecules with therapeutic potential were derived solely from nature in the past, the remodeling and synthesis of such molecules have now been made possible. Presently, while some small molecules have seen successful application as drugs, the majority remain undeveloped, requiring further understanding of their MOA and side effects to fully tap into their potential. Given the typical promiscuity of many small molecules and the complexity of the cellular proteome, a high-flux and high-accuracy method is necessary. While affinity chromatography approaches combined with MS have had successes in target identification, limitations associated with nonspecific results remain. To overcome these complications, quantitative chemical proteomics approaches have been developed including metabolic labeling, chemical labeling, and label-free methods. These new approaches are adopted in conjunction with activity-based protein profiling (ABPP), allowing for a rapid process and accurate results. This review will briefly introduce the principles involved in ABPP, then summarize current advances in quantitative chemical proteomics approaches as well as illustrate with examples how ABPP coupled with quantitative chemical proteomics has been used to detect the targets of drugs and other bioactive small molecules including natural products.

  3. Approaches for the accurate definition of geological time boundaries

    NASA Astrophysics Data System (ADS)

    Schaltegger, Urs; Baresel, Björn; Ovtcharova, Maria; Goudemand, Nicolas; Bucher, Hugo

    2015-04-01

    Which strategies lead to the most precise and accurate date of a given geological boundary? Geological units are usually defined by the occurrence of characteristic taxa, and hence boundaries between these geological units correspond to dramatic faunal and/or floral turnovers; they are primarily defined using first or last occurrences of index species, or ideally by the separation interval between two consecutive, characteristic associations of fossil taxa. These boundaries need to be defined in a way that enables their worldwide recognition and correlation across different stratigraphic successions, using tools as different as bio-, magneto-, and chemo-stratigraphy, and astrochronology. Sedimentary sequences can be dated in numerical terms by applying high-precision chemical-abrasion, isotope-dilution, thermal-ionization mass spectrometry (CA-ID-TIMS) U-Pb age determination to zircon (ZrSiO4) in intercalated volcanic ashes. But, though volcanic activity is common in geological history, ashes are not necessarily close to the boundary we would like to date precisely and accurately. In addition, U-Pb zircon data sets may be very complex and difficult to interpret in terms of the age of ash deposition. To overcome these difficulties we applied a multi-proxy approach to the precise and accurate dating of the Permo-Triassic and Early-Middle Triassic boundaries in South China. a) Dense sampling of ashes across the critical time interval and a sufficiently large number of analysed zircons per ash sample can guarantee the recognition of all system complexities. Geochronological datasets from U-Pb dating of volcanic zircon may indeed combine effects of i) post-crystallization Pb loss from percolation of hydrothermal fluids (even using chemical abrasion), with ii) age dispersion from prolonged residence of earlier crystallized zircon in the magmatic system. 
As a result, U-Pb dates of individual zircons are both apparently younger and older than the depositional age

  4. Quantitative measurement and modeling of sensitization development in stainless steel

    SciTech Connect

    Bruemmer, S.M.; Atteridge, D.G.

    1992-09-01

    The state-of-the-art to quantitatively measure and model sensitization development in austenitic stainless steels is assessed and critically analyzed. A modeling capability is evolved and validated using a diverse experimental data base. Quantitative predictions are demonstrated for simple and complex thermal and thermomechanical treatments. Commercial stainless steel heats ranging from high-carbon Type 304 and 316 to low-carbon Type 304L and 316L have been examined including many heats which correspond to extra-low-carbon, nuclear-grade compositions. Within certain limits the electrochemical potentiokinetic reactivation (EPR) test was found to give accurate and reproducible measurements of the degree of sensitization (DOS) in Type 304 and 316 stainless steels. EPR test results are used to develop the quantitative data base and evolve/validate the quantitative modeling capability. This thesis represents a first step to evolve methods for the quantitative assessment of structural reliability in stainless steel components and weldments. Assessments will be based on component-specific information concerning material characteristics, fabrication history and service exposure. Methods will enable fabrication (e.g., welding and repair welding) procedures and material aging effects to be evaluated and ensure adequate cracking resistance during the service lifetime of reactor components. This work is being conducted by the Oregon Graduate Institute with interactive input from personnel at Pacific Northwest Laboratory.

  5. Accurate measurement of streamwise vortices using dual-plane PIV

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Breuer, Kenneth S.

    2012-11-01

    Low Reynolds number aerodynamic experiments with flapping animals (such as bats and small birds) are of particular interest due to their application to micro air vehicles which operate in a similar parameter space. Previous PIV wake measurements described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions based on said measurements. The highly three-dimensional and unsteady nature of the flows associated with flapping flight are major challenges for accurate measurements. The challenge of animal flight measurements is finding small flow features in a large field of view at high speed with limited laser energy and camera resolution. Cross-stream measurement is further complicated by the predominately out-of-plane flow that requires thick laser sheets and short inter-frame times, which increase noise and measurement uncertainty. Choosing appropriate experimental parameters requires compromise between the spatial and temporal resolution and the dynamic range of the measurement. To explore these challenges, we do a case study on the wake of a fixed wing. The fixed model simplifies the experiment and allows direct measurements of the aerodynamic forces via load cell. We present a detailed analysis of the wake measurements, discuss the criteria for making accurate measurements, and present a solution for making quantitative aerodynamic load measurements behind free-flyers.

  6. Ultra-sensitive and absolute quantitative detection of Cu(2+) based on DNAzyme and digital PCR in water and drink samples.

    PubMed

    Zhu, Pengyu; Shang, Ying; Tian, Wenying; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2017-04-15

    Here, we developed an ultra-sensitive and absolute quantitative detection method for Cu(2+) based on DNAzyme and digital PCR. The binding model between DNAzyme and Cu(2+) and the influence caused by the additional primer sequence were revealed to ensure quantitation independent of standard curves. The binding model of DNAzyme and Cu(2+) showed that one molecule of DNAzyme could bind one Cu(2+) in the biosensor step. Thus, the final quantitative results, evaluated in three parallels, showed that the limit of quantitation (LOQ) was as low as 0.5 pmol, while the sensitivity was evaluated as 50 fmol. The specificity evaluation of our methodology shows that only an extremely low cross-signal exists for non-specific ions. Moreover, the results of practical detection showed that the quantitative results were stable and accurate among different food substrates. In conclusion, a flexible quantitative detection method with ultra-sensitivity was developed to detect trace amounts of Cu(2+) within different substrates.
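
The reason digital PCR yields absolute quantitation without standard curves is Poisson statistics on the partition counts: the mean copies per partition is lambda = -ln(1 - p), where p is the fraction of positive partitions. A sketch with illustrative partition numbers (not data from the study):

```python
import math

def dpcr_copies(positive, total, partition_volume_ul):
    """Absolute target quantitation in digital PCR. Because template
    molecules distribute over partitions randomly, the mean copies per
    partition follows Poisson statistics: lambda = -ln(1 - p)."""
    p = positive / total
    lam = -math.log(1.0 - p)
    total_copies = lam * total              # copies across all partitions
    conc_per_ul = lam / partition_volume_ul # copies per microlitre
    return total_copies, conc_per_ul

# Illustrative run: 5000 of 20000 partitions positive, 0.85 nL partitions
copies, conc = dpcr_copies(5000, 20000, 0.85e-3)
```

The Poisson correction matters because a positive partition may contain more than one copy; simply counting positives would undercount by about 15% in this example.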

  7. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.

  8. Functionalized Magnetic Nanoparticles for the Detection and Quantitative Analysis of Cell Surface Antigen

    PubMed Central

    Shahbazi-Gahrouei, Daryoush; Abdolahi, Mohammad; Zarkesh-Esfahani, Sayyed Hamid; Laurent, Sophie; Sermeus, Corine; Gruettner, Cordula

    2013-01-01

    Cell surface antigens as biomarkers offer tremendous potential for early diagnosis, prognosis, and therapeutic response in a variety of diseases such as cancers. In this research, a simple, rapid, accurate, inexpensive, and easily available in vitro assay based on magnetic nanoparticles and magnetic cell separation principle was applied to identify and quantitatively analyze the cell surface antigen expression in the case of prostate cancer cells. Comparing the capability of the assay with flow cytometry as a gold standard method showed similar results. The results showed that the antigen-specific magnetic cell separation with antibody-coated magnetic nanoparticles has high potential for quantitative cell surface antigen detection and analysis. PMID:23484112

  9. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real-time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides rationale for future innovations.

  10. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  11. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade, methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed -the pseudo-Thellier protocol- which shows great potential in both accuracy and efficiency, but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old; the actual field strength at the time of cooling is therefore reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method, but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units - an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.
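
At their core, Thellier-style estimates reduce to the slope of the Arai plot: natural remanent magnetization (NRM) remaining versus partial thermoremanent magnetization (pTRM) gained in a known laboratory field. A minimal sketch on idealized linear data; the function name and numbers are illustrative, and none of the selection criteria mentioned above are applied here.

```python
import numpy as np

def thellier_paleointensity(nrm_remaining, ptrm_gained, b_lab):
    """Thellier-style estimate: B_ancient = -slope * B_lab, where the
    slope comes from a linear fit of NRM remaining against pTRM gained
    (the Arai plot). Assumes ideal single-domain behavior."""
    slope, _ = np.polyfit(ptrm_gained, nrm_remaining, 1)
    return -slope * b_lab

# Idealized demagnetization: ancient field 45 uT, lab field 30 uT
f = np.linspace(0.0, 1.0, 11)        # unblocked fraction at each step
nrm = 45.0 * (1.0 - f)               # NRM remaining (arbitrary units)
ptrm = 30.0 * f                      # pTRM gained in the 30 uT lab field
b_anc = thellier_paleointensity(nrm, ptrm, 30.0)
```

Real lavas deviate from this ideal line (alteration, multidomain grains), which is exactly why the multi-method consistency checks described in the abstract are so valuable.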

  12. The use of heavy nitrogen in quantitative proteomics experiments in plants.

    PubMed

    Arsova, Borjana; Kierszniowska, Sylwia; Schulze, Waltraud X

    2012-02-01

    In the growing field of plant systems biology, there is an undisputed need for methods allowing accurate quantitation of proteins and metabolites. As autotrophic organisms, plants can easily metabolize different nitrogen isotopes, resulting in proteins and metabolites with distinct molecular mass that can be separated on a mass spectrometer. In comparative quantitative experiments, treated and untreated samples are differentially labeled by nitrogen isotopes and jointly processed, thereby minimizing sample-to-sample variation. In recent years, heavy nitrogen labeling has become a widely used strategy in quantitative proteomics and novel approaches have been developed for metabolite identification. Here, we present an overview of currently used experimental strategies in heavy nitrogen labeling in plants and provide background on the history and function of this quantitation technique.
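
The mass separation exploited in heavy nitrogen labeling is simply one increment of roughly 0.997 Da (the 15N-14N mass difference) per nitrogen atom in the peptide. A sketch, assuming full labeling; the function name is illustrative, and the residue nitrogen counts cover the 20 standard amino acids with the backbone nitrogen included.

```python
# Nitrogen atoms per residue (backbone N included in each count)
N_PER_RESIDUE = {
    'A': 1, 'C': 1, 'D': 1, 'E': 1, 'F': 1, 'G': 1, 'H': 3, 'I': 1,
    'K': 2, 'L': 1, 'M': 1, 'N': 2, 'P': 1, 'Q': 2, 'R': 4, 'S': 1,
    'T': 1, 'V': 1, 'W': 2, 'Y': 1,
}
MASS_SHIFT_PER_N = 0.997035  # Da, mass of 15N minus mass of 14N

def n15_mass_shift(peptide):
    """Predicted mass increment of a fully 15N-labelled peptide relative
    to its 14N form: one ~0.997 Da shift per nitrogen atom."""
    return MASS_SHIFT_PER_N * sum(N_PER_RESIDUE[aa] for aa in peptide)
```

Because the shift depends on the nitrogen count rather than being fixed per peptide, pairing light and heavy peaks in a spectrum requires knowing (or inferring) the peptide's composition.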

  13. Comparing the effects of tofacitinib, methotrexate and the combination, on bone marrow oedema, synovitis and bone erosion in methotrexate-naive, early active rheumatoid arthritis: results of an exploratory randomised MRI study incorporating semiquantitative and quantitative techniques

    PubMed Central

    Conaghan, Philip G; Østergaard, Mikkel; Bowes, Michael A; Wu, Chunying; Fuerst, Thomas; Irazoque-Palazuelos, Fedra; Soto-Raices, Oscar; Hrycaj, Pawel; Xie, Zhiyong; Zhang, Richard; Wyman, Bradley T; Bradley, John D; Soma, Koshika; Wilkinson, Bethanie

    2016-01-01

    Objectives To explore the effects of tofacitinib—an oral Janus kinase inhibitor for the treatment of rheumatoid arthritis (RA)—with or without methotrexate (MTX), on MRI endpoints in MTX-naive adult patients with early active RA and synovitis in an index wrist or hand. Methods In this exploratory, phase 2, randomised, double-blind, parallel-group study, patients received tofacitinib 10 mg twice daily + MTX, tofacitinib 10 mg twice daily + placebo (tofacitinib monotherapy), or MTX + placebo (MTX monotherapy), for 1 year. MRI endpoints (Outcome Measures in Rheumatology Clinical Trials RA MRI score (RAMRIS), quantitative RAMRIS (RAMRIQ) and dynamic contrast-enhanced (DCE) MRI) were assessed using a mixed-effect model for repeated measures. Treatment differences with p<0.05 (vs MTX monotherapy) were considered significant. Results In total, 109 patients were randomised and treated. Treatment differences in RAMRIS bone marrow oedema (BME) at month 6 were −1.55 (90% CI −2.52 to −0.58) for tofacitinib + MTX and −1.74 (−2.72 to −0.76) for tofacitinib monotherapy (both p<0.01 vs MTX monotherapy). Numerical improvements in RAMRIS synovitis at month 3 were −0.63 (−1.58 to 0.31) for tofacitinib + MTX and −0.52 (−1.46 to 0.41) for tofacitinib monotherapy (both p>0.05 vs MTX monotherapy). Treatment differences in RAMRIQ synovitis were statistically significant at month 3, consistent with DCE MRI findings. Less deterioration of RAMRIS and RAMRIQ erosive damage was seen at months 6 and 12 in both tofacitinib groups versus MTX monotherapy. Conclusions These results provide consistent evidence using three different MRI technologies that tofacitinib treatment leads to early reduction of inflammation and inhibits progression of structural damage. Trial registration number NCT01164579. PMID:27002108

  14. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including the radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry used contains iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heat of formation values with better than chemical accuracy, perturbative quadruple excitations and scalar relativistic corrections were indispensable. Their contributions to the heats of formation steadily increase with the number of fluorine atoms in the radical, reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental values and ours, it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied here, this study delivers the best available heat of formation and entropy data.

  15. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.

  16. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards of accuracy when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect depth and RGB information to search for the joint center location that satisfies constraints on both body segment length and orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results report a 72% reduction in body segment length variance and a 2° improvement in range of motion (ROM) angle, enabling more accurate measurement of upper limb exercises.

  17. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  18. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are also presented.

  19. Increasing the quantitative bandwidth of NMR measurements.

    PubMed

    Power, J E; Foroozandeh, M; Adams, R W; Nilsson, M; Coombes, S R; Phillips, A R; Morris, G A

    2016-02-18

    The frequency range of quantitative NMR is increased from tens to hundreds of kHz by a new pulse sequence, CHORUS. It uses chirp pulses to excite uniformly over very large bandwidths, yielding accurate integrals even for nuclei such as (19)F that have very wide spectra.

  20. A new HPLC method for azithromycin quantitation.

    PubMed

    Zubata, Patricia; Ceresole, Rita; Rosasco, Maria Ana; Pizzorno, Maria Teresa

    2002-02-01

    A simple liquid chromatographic method was developed for the estimation of azithromycin as a raw material and in pharmaceutical forms. The sample was chromatographed on a reverse-phase C18 column and eluants were monitored at a wavelength of 215 nm. The method was accurate, precise and sufficiently selective, and is applicable to quantitation, stability and dissolution testing.

  1. Accurate Automated Apnea Analysis in Preterm Infants

    PubMed Central

    Vergales, Brooke D.; Paget-Brown, Alix O.; Lee, Hoshik; Guin, Lauren E.; Smoot, Terri J.; Rusin, Craig G.; Clark, Matthew T.; Delos, John B.; Fairchild, Karen D.; Lake, Douglas E.; Moorman, Randall; Kattwinkel, John

    2017-01-01

    Objective In 2006 the apnea of prematurity (AOP) consensus group identified inaccurate counting of apnea episodes as a major barrier to progress in AOP research. We compare nursing records of AOP to events detected by a clinically validated computer algorithm that detects apnea from standard bedside monitors. Study Design Waveform, vital sign, and alarm data were collected continuously from all very low-birth-weight infants admitted over a 25-month period, analyzed for central apnea, bradycardia, and desaturation (ABD) events, and compared with nursing documentation collected from charts. Our algorithm defined apnea as a respiratory pause > 10 seconds accompanied by bradycardia and desaturation. Results Of the 3,019 nurse-recorded events, only 68% had any algorithm-detected ABD event. Of the 5,275 algorithm-detected prolonged apnea events > 30 seconds, only 26% had nurse-recorded documentation within 1 hour. Monitor alarms sounded in only 74% of algorithm-detected prolonged apnea events > 10 seconds. There were 8,190,418 monitor alarms of any description throughout the neonatal intensive care unit during the 747 days analyzed, or one alarm every 2 to 3 minutes per nurse. Conclusion An automated computer algorithm for continuous ABD quantitation is a far more reliable tool than the medical record to address the important research questions identified by the 2006 AOP consensus group. PMID:23592319
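
    The event rule described above (an apnea counts only when the pause exceeds 10 seconds and coincides with both bradycardia and desaturation) can be sketched as follows. The heart-rate and SpO2 thresholds here are illustrative assumptions, not the validated values from the study:

```python
# Sketch of an ABD (apnea-bradycardia-desaturation) event detector.
# All inputs are per-second sequences of equal length; the 100 bpm and
# 80% SpO2 thresholds are hypothetical placeholders.

def detect_abd_events(apnea_flags, heart_rate, spo2,
                      min_apnea_s=10, brady_bpm=100, desat_pct=80):
    """Return (start, end) second indices of qualifying ABD events."""
    events = []
    start = None
    # Append a sentinel so a trailing apnea run is closed out.
    for t, is_apnea in enumerate(list(apnea_flags) + [False]):
        if is_apnea and start is None:
            start = t                      # apnea run begins
        elif not is_apnea and start is not None:
            end = t                        # apnea run ends at t
            duration = end - start
            if (duration > min_apnea_s
                    and min(heart_rate[start:end]) < brady_bpm
                    and min(spo2[start:end]) < desat_pct):
                events.append((start, end))
            start = None
    return events
```

    A 12-second pause with concurrent heart-rate and saturation dips qualifies; a 5-second pause does not, regardless of vital signs.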

  2. [The development of multifunction intravenous infusion quantitative packaging device].

    PubMed

    Zhao, Shufang; Li, Ruihua; Shen, Lianhong

    2012-11-01

    To tackle the compatibility issues arising from drug interactions in the intravenous infusion tube, we developed a simple, practical, multi-function intravenous infusion tube for the special purpose of rescuing critically ill patients, the elderly, children, etc. Each drug in a transfusion process can be filtered and delivered in discrete, quantitatively metered packets, so that the drugs in the infusion tube are prevented from coming into contact with one another. No overlap or particle contamination occurs, and stable performance and accurate dosage are maintained. As a result, safety is ensured during drug delivery.

  3. Quantitative assessment of growth plate activity

    SciTech Connect

    Harcke, H.T.; Macy, N.J.; Mandell, G.A.; MacEwen, G.D.

    1984-01-01

    In the immature skeleton the physis or growth plate is the area of bone least able to withstand external forces and is therefore prone to trauma. Such trauma often leads to premature closure of the plate and results in limb shortening and/or angular deformity (varus or valgus). Active localization of bone-seeking tracers in the physis makes bone scintigraphy an excellent method for assessing growth plate physiology. To be most effective, however, physeal activity should be quantified so that serial evaluations are accurate and comparable. The authors have developed a quantitative method for assessing physeal activity and have applied it to the hip and knee. Using computer-acquired pinhole images of the abnormal and contralateral normal joints, ten regions of interest are placed at key locations around each joint and comparative ratios are generated to form a growth plate profile. The ratios compare segmental physeal activity to total growth plate activity on both ipsilateral and contralateral sides and to adjacent bone. In 25 patients, ages 2 to 15 years, with angular deformities of the legs secondary to trauma, Blount's disease, and Perthes disease, this technique is able to differentiate abnormal segmental physeal activity. This is important since plate closure does not usually occur uniformly across the physis. The technique may permit the use of scintigraphy in the prediction of early closure through the quantitative analysis of serial studies.
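
    The ratio profile described above can be sketched as a simple computation over region-of-interest counts. The ROI layout and inputs here are hypothetical stand-ins for the authors' ten-region scheme:

```python
def growth_plate_profile(segment_counts, contralateral_counts,
                         adjacent_bone_count):
    """Comparative ratio profile from ROI counts (illustrative sketch).

    Each segmental count is compared with total plate activity on the
    same side, with the matching contralateral segment, and with
    adjacent bone, mirroring the three comparisons in the abstract."""
    total = sum(segment_counts)
    profile = []
    for seg, contra in zip(segment_counts, contralateral_counts):
        profile.append({
            "segment_over_total": seg / total,
            "segment_over_contralateral": seg / contra,
            "segment_over_adjacent_bone": seg / adjacent_bone_count,
        })
    return profile
```

    A segment whose contralateral ratio deviates markedly from 1.0 would flag asymmetric (possibly closing) physeal activity.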

  4. Fast and accurate registration techniques for affine and nonrigid alignment of MR brain images.

    PubMed

    Liu, Jia-Xiu; Chen, Yong-Sheng; Chen, Li-Fen

    2010-01-01

    Registration of magnetic resonance brain images is a geometric operation that determines point-wise correspondences between two brains. It remains a difficult task due to the highly convoluted structure of the brain. This paper presents novel methods, Brain Image Registration Tools (BIRT), that can rapidly and accurately register brain images by utilizing brain structure information estimated from image derivatives. Source and target image spaces are related by an affine transformation and a non-rigid deformation. The deformation field is modeled by a set of Wendland's radial basis functions hierarchically deployed near the salient brain structures. In general, nonlinear optimization is heavily engaged in the parameter estimation for the affine/non-rigid transformation, and good initial estimates are thus essential to registration performance. In this work, the affine registration is initialized by a rigid transformation, which can robustly estimate the orientation and position differences of brain images. The parameters of the affine/non-rigid transformation are then hierarchically estimated in a coarse-to-fine manner by maximizing an image similarity measure, the correlation ratio, between the involved images. T1-weighted brain magnetic resonance images were utilized for performance evaluation. Our experimental results using four 3-D image sets demonstrated that BIRT can efficiently align images with high accuracy compared to several other algorithms, and is thus well suited to applications that use registration intensively. Moreover, a voxel-based morphometric study quantitatively indicated that accurate registration can improve both the sensitivity and specificity of statistical inference results.

  5. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis from which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  6. Facile and quantitative electrochemical detection of yeast cell apoptosis

    NASA Astrophysics Data System (ADS)

    Yue, Qiulin; Xiong, Shiquan; Cai, Dongqing; Wu, Zhengyan; Zhang, Xin

    2014-03-01

    An electrochemical method based on square wave anodic stripping voltammetry (SWASV) was developed to detect the apoptosis of yeast cells conveniently and quantitatively through the high affinity between Cu2+ and phosphatidylserine (PS) translocated from the inner to the outer plasma membrane of the apoptotic cells. The combination of negatively charged PS and Cu2+ could decrease the electrochemical response of Cu2+ on the electrode. The results showed that the apoptotic rates of cells could be detected quantitatively through the variations of peak currents of Cu2+ by SWASV, and agreed well with those obtained through traditional flow cytometry detection. This work thus may provide a novel, simple, immediate and accurate detection method for cell apoptosis.

  7. Quantitative analysis of PET studies.

    PubMed

    Weber, Wolfgang A

    2010-09-01

    Quantitative analysis can be included relatively easily in clinical PET-imaging protocols, but in order to obtain meaningful quantitative results one needs to follow a standardized protocol for image acquisition and data analysis. Important factors to consider are the calibration of the PET scanner, the radiotracer uptake time and the approach for definition of regions of interests. Using such standardized acquisition protocols quantitative parameters of tumor metabolism or receptor status can be derived from tracer kinetic analysis and simplified approaches such as calculation of standardized uptake values (SUVs).
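
    The simplified SUV approach mentioned above reduces to a one-line normalization of tissue activity concentration by injected dose per unit body weight. A minimal sketch of the standard body-weight-normalized form, assuming the injected dose has already been decay-corrected to scan time:

```python
def suv(tissue_activity_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    """Body-weight-normalized standardized uptake value.

    SUV = tissue concentration / (injected dose / body weight),
    taking 1 g of tissue as approximately 1 mL."""
    dose_kbq = injected_dose_mbq * 1000.0   # MBq -> kBq
    weight_g = body_weight_kg * 1000.0      # kg -> g
    return tissue_activity_kbq_per_ml / (dose_kbq / weight_g)
```

    For example, a tissue concentration of 10 kBq/mL after a 350 MBq injection in a 70 kg patient gives an SUV of 2.0.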

  8. Quantifying Methane Fluxes Simply and Accurately: The Tracer Dilution Method

    NASA Astrophysics Data System (ADS)

    Rella, Christopher; Crosson, Eric; Green, Roger; Hater, Gary; Dayton, Dave; Lafleur, Rick; Merrill, Ray; Tan, Sze; Thoma, Eben

    2010-05-01

    Methane is an important atmospheric constituent with a wide variety of sources, both natural and anthropogenic, including wetlands and other water bodies, permafrost, farms, landfills, and areas where significant petrochemical exploration, drilling, transport, processing, or refining occurs. Despite its importance to the carbon cycle, its significant impact as a greenhouse gas, and its ubiquity in modern life as a source of energy, its sources and sinks in marine and terrestrial ecosystems are only poorly understood. This is largely because high quality, quantitative measurements of methane fluxes in these different environments have not been available, due both to the lack of robust field-deployable instrumentation and to the fact that most significant sources of methane extend over large areas (from 10's to 1,000,000's of square meters) and are heterogeneous emitters - i.e., the methane is not emitted evenly over the area in question. Quantifying the total methane emissions from such sources becomes a tremendous challenge, compounded by the fact that atmospheric transport from emission point to detection point can be highly variable. In this presentation we describe a robust, accurate, and easy-to-deploy technique called the tracer dilution method, in which a known gas (such as acetylene, nitrous oxide, or sulfur hexafluoride) is released in the same vicinity as the methane emissions. Measurements of methane and the tracer gas are then made downwind of the release point, in the so-called far field, where the area of methane emissions cannot be distinguished from a point source (i.e., the two gas plumes are well mixed). In this regime, the methane emission rate is given by the ratio of the two measured concentrations, multiplied by the known tracer emission rate. The challenges associated with atmospheric variability and heterogeneous methane emissions are handled automatically by the transport and dispersion of the tracer.
We present detailed methane flux
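
    The far-field ratio at the heart of the tracer dilution method can be sketched in a few lines. This assumes both concentration enhancements are expressed in comparable (mass-equivalent) units, so no molar-mass correction is shown:

```python
def methane_flux(tracer_release_rate, ch4_downwind, ch4_background,
                 tracer_downwind, tracer_background):
    """Far-field tracer dilution estimate.

    The methane emission rate equals the known tracer release rate
    scaled by the ratio of the two above-background concentration
    enhancements measured downwind."""
    ch4_excess = ch4_downwind - ch4_background
    tracer_excess = tracer_downwind - tracer_background
    return tracer_release_rate * ch4_excess / tracer_excess
```

    For instance, a 2 kg/h tracer release with a methane enhancement three times the tracer enhancement implies a 6 kg/h methane emission rate.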

  9. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    ERIC Educational Resources Information Center

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has increased exponentially. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students to identify the accurate related-to statement of the nursing diagnosis for the patient in the case…

  10. Accurate orbit propagation with planetary close encounters

    NASA Astrophysics Data System (ADS)

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

    We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature is lacking on this topic and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging, both from the point of view of the dynamical stability of the formulation and of the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, formulation and initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and of the propagator represented by a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  11. Accurate Quantification of microRNA via Single Strand Displacement Reaction on DNA Origami Motif

    PubMed Central

    Lou, Jingyu; Li, Weidong; Li, Sheng; Zhu, Hongxin; Yang, Lun; Zhang, Aiping; He, Lin; Li, Can

    2013-01-01

    DNA origami is an emerging technology that assembles hundreds of staple strands and one single-strand DNA into a defined nanopattern. It has been widely used in various fields including the detection of biological molecules such as DNA, RNA and proteins. MicroRNAs (miRNAs) play important roles in post-transcriptional gene repression as well as many other biological processes such as cell growth and differentiation. Alterations of miRNA expression contribute to many human diseases. However, it is still a challenge to quantitatively detect miRNAs by origami technology. In this study, we developed a novel approach based on a streptavidin-quantum dot binding complex (STV-QDs) labeled single strand displacement reaction on DNA origami to quantitatively detect the concentration of miRNAs. We illustrated a linear relationship between the concentration of an exemplary miRNA, miRNA-133, and the STV-QDs hybridization efficiency; the results demonstrated that it is an accurate nano-scale miRNA quantifier motif. In addition, both a symmetrical rectangular motif and an asymmetrical China-map motif were tested. With significant linearity in both motifs, our experiments suggest that DNA origami motifs with arbitrary shapes can be utilized in this method. Since this DNA origami-based method offers the advantages of being simple, time- and material-saving, potentially capable of testing multiple targets in one motif, and relatively accurate for samples with certain impurities, as targets are counted directly by atomic force microscopy rather than detected by fluorescence signal, it may be widely used in the quantification of miRNAs. PMID:23990889

  12. Quantitative rainbow schlieren deflectometry

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Klimek, Robert B.; Buchele, Donald R.

    1995-01-01

    In the rainbow schlieren apparatus, a continuously graded rainbow filter is placed in the back focal plane of the decollimating lens. Refractive-index gradients in the test section thus appear as gradations in hue rather than irradiance. A simple system is described wherein a conventional color CCD array and video digitizer are used to quantify accurately the color attributes of the resulting image, and hence the associated ray deflections. The present system provides a sensitivity comparable with that of conventional interferometry, while being simpler to implement and less sensitive to mechanical misalignment.

  13. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  14. Nonexposure Accurate Location K-Anonymity Algorithm in LBS

    PubMed Central

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations at all times, and can generate smaller ASRs. PMID:24605060

  15. Nonexposure accurate location K-anonymity algorithm in LBS.

    PubMed

    Jia, Jinying; Zhang, Fengli

    2014-01-01

    This paper tackles location privacy protection in current location-based services (LBS), where mobile users have to report their exact location information to an LBS provider in order to obtain their desired services. Location cloaking has been proposed and well studied to protect user privacy. It blurs the user's accurate coordinate and replaces it with a well-shaped cloaked region. However, to obtain such an anonymous spatial region (ASR), nearly all existing cloaking algorithms require knowing the accurate locations of all users. Therefore, location cloaking without exposing the user's accurate location to any party is urgently needed. In this paper, we present two such nonexposure accurate location cloaking algorithms. They are designed for K-anonymity, and cloaking is performed based on the identifications (IDs) of the grid areas reported by all the users, instead of directly on their accurate coordinates. Experimental results show that our algorithms are more secure than existing cloaking algorithms, do not require all users to report their locations at all times, and can generate smaller ASRs.
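
    The grid-ID idea can be illustrated with a simplified sketch (not the authors' exact algorithm): the anonymizer grows a square block of grid cells around the requester's reported cell until the block covers at least K reported users, so no party ever needs exact coordinates. The grid layout and growth strategy here are assumptions for exposition:

```python
from collections import Counter

def cloak(user_cells, requester_cell, k, grid_width):
    """Grow a square block of grid cells around the requester's cell
    until it covers at least k reported users; return the cell IDs.

    Cells are integer IDs laid out row-major on a square grid of side
    grid_width. Assumes at least k users have reported cells."""
    counts = Counter(user_cells)
    rx, ry = requester_cell % grid_width, requester_cell // grid_width
    radius = 0
    while True:
        block = [x + y * grid_width
                 for y in range(ry - radius, ry + radius + 1)
                 for x in range(rx - radius, rx + radius + 1)
                 if 0 <= x < grid_width and 0 <= y < grid_width]
        if sum(counts[c] for c in block) >= k:
            return block        # smallest block satisfying K-anonymity
        radius += 1
```

    Because only cell IDs are exchanged, the requester's exact coordinate within the returned block stays hidden from every party.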

  16. Accurate torque-speed performance prediction for brushless dc motors

    NASA Astrophysics Data System (ADS)

    Gipper, Patrick D.

    Desirable characteristics of the brushless dc motor (BLDCM) have resulted in its application in electrohydrostatic (EH) and electromechanical (EM) actuation systems. Effectively applying the BLDCM, however, requires accurate prediction of its performance. The minimum necessary performance characteristics are motor torque versus speed, peak and average supply current, and efficiency. BLDCM nonlinear simulation software specifically adapted for torque-speed prediction is presented. The capability of the software to quickly and accurately predict performance has been verified on fractional to integral horsepower motor sizes and is presented. Additionally, the capability of torque-speed prediction with commutation angle advance is demonstrated.

  17. Accurate upwind-monotone (nonoscillatory) methods for conservation laws

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1992-01-01

    The well known MUSCL scheme of Van Leer is constructed using a piecewise linear approximation. The MUSCL scheme is second order accurate at the smooth part of the solution except at extrema where the accuracy degenerates to first order due to the monotonicity constraint. To construct accurate schemes which are free from oscillations, the author introduces the concept of upwind monotonicity. Several classes of schemes, which are upwind monotone and of uniform second or third order accuracy are then presented. Results for advection with constant speed are shown. It is also shown that the new scheme compares favorably with state of the art methods.
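
    The accuracy degeneration at extrema described above is easy to see in the standard minmod-limited slope computation used in MUSCL-type schemes; a minimal sketch for a 1-D profile:

```python
def minmod(a, b):
    """Minmod limiter: the smaller-magnitude slope when the two
    one-sided differences agree in sign, zero otherwise."""
    if a * b <= 0.0:
        return 0.0
    return a if abs(a) < abs(b) else b

def muscl_slopes(u):
    """Limited cell slopes for a 1-D profile (zero at the boundaries).

    At a local extremum the forward and backward differences have
    opposite signs, so minmod returns zero and the reconstruction
    drops to first order there, which is exactly the accuracy
    degeneration the monotonicity constraint causes."""
    s = [0.0] * len(u)
    for i in range(1, len(u) - 1):
        s[i] = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
    return s
```

    On a monotone stretch the limiter passes a nonzero slope through, while the cell holding the peak gets slope zero.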

  18. Time-Accurate Numerical Simulations of Synthetic Jet in Quiescent Air

    NASA Technical Reports Server (NTRS)

    Rupesh, K-A. B.; Ravi, B. R.; Mittal, R.; Raju, R.; Gallas, Q.; Cattafesta, L.

    2007-01-01

    The unsteady evolution of three-dimensional synthetic jet into quiescent air is studied by time-accurate numerical simulations using a second-order accurate mixed explicit-implicit fractional step scheme on Cartesian grids. Both two-dimensional and three-dimensional calculations of synthetic jet are carried out at a Reynolds number (based on average velocity during the discharge phase of the cycle V(sub j), and jet width d) of 750 and Stokes number of 17.02. The results obtained are assessed against PIV and hotwire measurements provided for the NASA LaRC workshop on CFD validation of synthetic jets.

  19. Differential equation based method for accurate approximations in optimization

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn I.; Adelman, Howard M.

    1990-01-01

    A method to efficiently and accurately approximate the effect of design changes on structural response is described. The key to this method is to interpret sensitivity equations as differential equations that may be solved explicitly for closed form approximations, hence, the method is denoted the Differential Equation Based (DEB) method. Approximations were developed for vibration frequencies, mode shapes and static displacements. The DEB approximation method was applied to a cantilever beam and results compared with the commonly-used linear Taylor series approximations and exact solutions. The test calculations involved perturbing the height, width, cross-sectional area, tip mass, and bending inertia of the beam. The DEB method proved to be very accurate, and in most cases, was more accurate than the linear Taylor series approximation. The method is applicable to simultaneous perturbation of several design variables. Also, the approximations may be used to calculate other system response quantities. For example, the approximations for displacements are used to approximate bending stresses.
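
    The contrast with a linear Taylor series can be made concrete with a simple illustrative case (an assumed sensitivity form for exposition, not an equation from the paper). If the sensitivity of a response $f$ to a design variable $v$ locally satisfies a separable differential equation, integrating it from the baseline $(v_0, f_0)$ yields a closed-form approximation:

```latex
\frac{df}{dv} = n\,\frac{f}{v}
\quad\Longrightarrow\quad
f(v) = f_0\left(\frac{v}{v_0}\right)^{n},
\qquad\text{vs.}\qquad
f_{\text{Taylor}}(v) = f_0\left[1 + n\,\frac{v - v_0}{v_0}\right].
```

    Integrating the sensitivity equation retains curvature that the first-order Taylor expansion discards, which is why such approximations can remain accurate for larger design perturbations.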

  20. Method for depth-resolved quantitation of optical properties in layered media using spatially modulated quantitative spectroscopy.

    PubMed

    Saager, Rolf B; Truong, Alex; Cuccia, David J; Durkin, Anthony J

    2011-07-01

    We have demonstrated that spatially modulated quantitative spectroscopy (SMoQS) is capable of extracting absolute optical properties from homogeneous tissue simulating phantoms that span both the visible and near-infrared wavelength regimes. However, biological tissue, such as skin, is highly structured, presenting challenges to quantitative spectroscopic techniques based on homogeneous models. In order to more accurately address the challenges associated with skin, we present a method for depth-resolved optical property quantitation based on a two layer model. Layered Monte Carlo simulations and layered tissue simulating phantoms are used to determine the efficacy and accuracy of SMoQS to quantify layer specific optical properties of layered media. Initial results from both the simulation and experiment show that this empirical method is capable of determining top layer thickness within tens of microns across a physiological range for skin. Layer specific chromophore concentration can be determined to within ±10% of the actual values, on average, whereas bulk quantitation in either the visible or near-infrared spectroscopic regime significantly underestimates the layer specific chromophore concentration and can be confounded by top layer thickness.

  1. A Simple and Accurate Method for Measuring Enzyme Activity.

    ERIC Educational Resources Information Center

    Yip, Din-Yan

    1997-01-01

    Presents methods commonly used for investigating enzyme activity using catalase and presents a new method for measuring catalase activity that is more reliable and accurate. Provides results that are readily reproduced and quantified. Can also be used for investigations of enzyme properties such as the effects of temperature, pH, inhibitors,…

  2. Accurate 3D quantification of the bronchial parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchi contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically-modeled phantom of a bronchus-vessel pair which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error of less than 5.1%.

  3. 5D model for accurate representation and visualization of dynamic cardiac structures

    NASA Astrophysics Data System (ADS)

    Lin, Wei-te; Robb, Richard A.

    2000-05-01

    Accurate cardiac modeling is challenging due to the intricate structure and complex contraction patterns of myocardial tissues. Fast imaging techniques can provide 4D structural information acquired as a sequence of 3D images throughout the cardiac cycle. To model the beating heart, we created a physics-based surface model that deforms between successive time points in the cardiac cycle. 3D images of canine hearts were acquired during one complete cardiac cycle using the DSR and the EBCT. The left ventricle of the first time point is reconstructed as a triangular mesh. A mass-spring physics-based deformable model, which can expand and shrink with local contraction and stretching forces distributed in an anatomically accurate simulation of cardiac motion, is applied to the initial mesh and allows the initial mesh to deform to fit the left ventricle in successive time increments of the sequence. The resulting 4D model can be interactively transformed and displayed with associated regional electrical activity mapped onto anatomic surfaces, producing a 5D model, which faithfully exhibits regional cardiac contraction and relaxation patterns over the entire heart. The model faithfully represents structural changes throughout the cardiac cycle. Such models provide the framework for minimizing the number of time points required to usefully depict regional motion of myocardium and allow quantitative assessment of regional myocardial motion. The electrical activation mapping provides spatial and temporal correlation within the cardiac cycle. In procedures such as intra-cardiac catheter ablation, visualization of the dynamic model can be used to accurately localize the foci of myocardial arrhythmias and guide positioning of catheters for optimal ablation.

  4. Monte Carlo modeling provides accurate calibration factors for radionuclide activity meters.

    PubMed

    Zagni, F; Cicoria, G; Lucconi, G; Infantino, A; Lodi, F; Marengo, M

    2014-12-01

    Accurate determination of calibration factors for radionuclide activity meters is crucial for quantitative studies and in the optimization step of radiation protection, as these detectors are widespread in radiopharmacy and nuclear medicine facilities. In this work we developed the Monte Carlo model of a widely used activity meter, using the Geant4 simulation toolkit. More precisely, the "PENELOPE" EM physics models were employed. The model was validated by means of several certified sources, traceable to primary activity standards, and other sources locally standardized with spectrometry measurements, plus other experimental tests. Great care was taken in order to accurately reproduce the geometrical details of the gas chamber and the activity sources, each of which is different in shape and enclosed in a unique container. Both relative calibration factors and ionization current obtained with simulations were compared against experimental measurements; further tests were carried out, such as the comparison of the relative response of the chamber for a source placed at different positions. The results showed a satisfactory level of accuracy in the energy range of interest, with discrepancies lower than 4% for all the tested parameters. This shows that an accurate Monte Carlo modeling of this type of detector is feasible using the low-energy physics models embedded in Geant4. The obtained Monte Carlo model establishes a powerful tool for first-instance determination of new calibration factors for non-standard radionuclides and custom containers when a reference source is not available. Moreover, the model provides an experimental setup for further research and optimization with regards to materials and geometrical details of the measuring setup, such as the ionization chamber itself or the container configuration.

  5. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well.

  6. MR Fingerprinting for Rapid Quantitative Abdominal Imaging

    PubMed Central

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D.; Wright, Katherine L.; Seiberlich, Nicole; Griswold, Mark A.

    2016-01-01

    Purpose To develop a magnetic resonance (MR) “fingerprinting” technique for quantitative abdominal imaging. Materials and Methods This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Results Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). Conclusion A rapid technique for quantitative abdominal imaging was developed that

  7. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  8. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E.

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  9. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within +/-0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.
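
The first/second-order temperature equations mentioned above amount to a simple polynomial fit of a spectral quantity against junction temperature. The calibration numbers below are invented for illustration (real AlInGaP devices shift on the order of 0.05-0.1 nm/K), and `numpy.polyfit` stands in for whatever fitting routine a real controller would use:

```python
import numpy as np

# Hypothetical calibration data: dominant wavelength (nm) of a red AlInGaP
# LED measured at several junction temperatures (deg C). Invented values.
temps = np.array([25.0, 40.0, 55.0, 70.0, 85.0])
wavelength = np.array([625.0, 626.1, 627.3, 628.6, 630.0])

# Second-order model lambda(T) = c2*T**2 + c1*T + c0, in the spirit of the
# paper's first- and second-order temperature equations.
coeffs = np.polyfit(temps, wavelength, 2)
predict = np.poly1d(coeffs)

# A feedback controller can now estimate the spectral shift from a junction
# temperature reading and re-balance the RGB drive currents accordingly.
print(round(float(predict(60.0)), 2))
```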

  10. Liver steatosis in pre-transplant liver biopsies can be quantified rapidly and accurately by nuclear magnetic resonance analysis.

    PubMed

    Bertram, Stefanie; Myland, Cathrin; Swoboda, Sandra; Gallinat, Anja; Minor, Thomas; Lehmann, Nils; Thie, Michael; Kälsch, Julia; Pott, Leona; Canbay, Ali; Bajanowski, Thomas; Reis, Henning; Paul, Andreas; Baba, Hideo A

    2017-02-01

    Donor livers marginally acceptable or acceptable according to extended criteria are more frequently transplanted due to the growing discrepancy between demand and availability of donor organs. One type of marginally acceptable graft is a steatotic donor liver, because it is more sensitive to ischemia-reperfusion injury. Thus, quantitative assessment of steatosis is crucial prior to liver transplantation. The extent of steatosis in 49 pre-reperfusion liver biopsies from patients who received orthotopic liver transplantation was assessed by three techniques: semi-quantitative histological evaluation, computerized histomorphometry, and NMR-based estimation of fat content. The findings were correlated to clinical data and to histological examination of corresponding post-reperfusion biopsies for quantification of ischemia-reperfusion injury. We found that values obtained through all three assessment methods were positively correlated. None of the values obtained by the three applied methods correlated with clinical outcome or extent of ischemia-reperfusion injury. Quantitative evaluation of steatosis by NMR yields results comparable to histological and morphometrical assessment. This technique is rapid (<5 min), accurately quantifies fat in donor livers, and provides results that can be used when evaluation by a pathologist is not available.

  11. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.

  12. Extracting Time-Accurate Acceleration Vectors From Nontrivial Accelerometer Arrangements.

    PubMed

    Franck, Jennifer A; Blume, Janet; Crisco, Joseph J; Franck, Christian

    2015-09-01

    Sports-related concussions are of significant concern in many impact sports, and their detection relies on accurate measurements of the head kinematics during impact. Among the most prevalent recording technologies are videography, and more recently, the use of single-axis accelerometers mounted in a helmet, such as the HIT system. Successful extraction of the linear and angular impact accelerations depends on an accurate analysis methodology governed by the equations of motion. Current algorithms are able to estimate the magnitude of acceleration and hit location, but make assumptions about the hit orientation and are often limited in the position and/or orientation of the accelerometers. The newly formulated algorithm presented in this manuscript accurately extracts the full linear and rotational acceleration vectors from a broad arrangement of six single-axis accelerometers directly from the governing set of kinematic equations. The new formulation linearizes the nonlinear centripetal acceleration term with a finite-difference approximation and provides a fast and accurate solution for all six components of acceleration over long time periods (>250 ms). The approximation of the nonlinear centripetal acceleration term provides an accurate computation of the rotational velocity as a function of time and allows for reconstruction of a multiple-impact signal. Furthermore, the algorithm determines the impact location and orientation and can distinguish between glancing, high rotational velocity impacts, or direct impacts through the center of mass. Results are shown for ten simulated impact locations on a headform geometry computed with three different accelerometer configurations in varying degrees of signal noise. 
Since the algorithm does not require simplifications of the actual impacted geometry, the impact vector, or a specific arrangement of accelerometer orientations, it can be easily applied to many impact investigations in which accurate kinematics need to
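
The core linear-algebra step can be sketched under the stated model: each single-axis reading is the projection of the rigid-body acceleration onto the sensor axis, and moving the centripetal term (built from the previous step's angular velocity) to the right-hand side makes the system linear in the six unknowns. This is an illustration of the governing equations, not the authors' code; all names and values are assumed:

```python
import numpy as np

def recover_accelerations(positions, directions, readings, omega):
    """Least-squares recovery of linear acceleration a and angular
    acceleration alpha from single-axis accelerometer readings.

    Each reading is m_i = d_i . (a + alpha x r_i + omega x (omega x r_i)),
    which is linear in (a, alpha) once the centripetal term (built from the
    angular velocity omega of the previous step) moves to the right side.
    """
    rows, rhs = [], []
    for r, d, m in zip(positions, directions, readings):
        centripetal = np.cross(omega, np.cross(omega, r))
        # d.(alpha x r) = alpha.(r x d), so each row is [d, r x d].
        rows.append(np.concatenate([d, np.cross(r, d)]))
        rhs.append(m - d @ centripetal)
    sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return sol[:3], sol[3:]  # linear acceleration, angular acceleration

# Synthetic check: six randomly placed/oriented sensors, known kinematics.
rng = np.random.default_rng(0)
positions = rng.normal(size=(6, 3))
directions = rng.normal(size=(6, 3))
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
a_true = np.array([1.0, -2.0, 0.5])
alpha_true = np.array([3.0, 0.0, -1.0])
omega = np.array([0.2, 0.1, -0.3])
readings = [d @ (a_true + np.cross(alpha_true, r)
                 + np.cross(omega, np.cross(omega, r)))
            for r, d in zip(positions, directions)]
a_est, alpha_est = recover_accelerations(positions, directions, readings, omega)
```

With six sensors the system is square; more sensors simply over-determine it and the same least-squares solve applies.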

  13. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing
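
The reported operating point can be reproduced from a standard 2x2 confusion table; the counts below are hypothetical whole-patient values chosen to be consistent with the rounded percentages in the abstract:

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and overall accuracy from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# 66 patients and 66 controls; hypothetical counts that reproduce the
# reported 88% sensitivity, 82% specificity and 85% accuracy after rounding.
sens, spec, acc = diagnostic_metrics(tp=58, fn=8, tn=54, fp=12)
print(f"{sens:.0%} {spec:.0%} {acc:.0%}")  # prints "88% 82% 85%"
```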

  14. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders including schizophrenia and drug addiction, and neurodegenerative diseases including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  15. Fast and Reliable Quantitative Peptidomics with labelpepmatch.

    PubMed

    Verdonck, Rik; De Haes, Wouter; Cardoen, Dries; Menschaert, Gerben; Huhn, Thomas; Landuyt, Bart; Baggerman, Geert; Boonen, Kurt; Wenseleers, Tom; Schoofs, Liliane

    2016-03-04

    The use of stable isotope tags in quantitative peptidomics offers many advantages, but the laborious identification of matching sets of labeled peptide peaks is still a major bottleneck. Here we present labelpepmatch, an R-package for fast and straightforward analysis of LC-MS spectra of labeled peptides. This open-source tool offers fast and accurate identification of peak pairs alongside an appropriate framework for statistical inference on quantitative peptidomics data, based on techniques from other -omics disciplines. A relevant case study on the desert locust Schistocerca gregaria proves our pipeline to be a reliable tool for quick but thorough explorative analyses.
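
labelpepmatch itself is an R package; the core peak-pair search it automates can be sketched in a few lines. The mass difference, tolerance, and peak list below are invented for illustration, and singly charged ions are assumed:

```python
def find_label_pairs(peaks, mass_delta, n_labels=1, tol=0.01):
    """Return (light, heavy) peak pairs whose m/z values differ by the
    light/heavy label mass difference.

    `peaks` is a list of (mz, intensity) tuples, `mass_delta` the mass
    difference of one label group, and `n_labels` the number of label
    groups per peptide. Singly charged ions are assumed for simplicity.
    """
    peaks = sorted(peaks)
    pairs = []
    for i, (mz_light, int_light) in enumerate(peaks):
        target = mz_light + n_labels * mass_delta
        for mz_heavy, int_heavy in peaks[i + 1:]:
            if abs(mz_heavy - target) <= tol:
                pairs.append(((mz_light, int_light), (mz_heavy, int_heavy)))
            elif mz_heavy > target + tol:
                break  # peaks are sorted; no later peak can match
    return pairs

# Toy spectrum: one labeled pair (500.0 / 504.0) and an unmatched peak.
pairs = find_label_pairs([(500.0, 10.0), (601.3, 5.0), (504.0, 9.0)], 4.0)
```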

  16. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Abstract Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function with a thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent of thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas %thyroid uptake, SUVmean and SUVmax from the SPECT/CT were associated with the functional status of thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
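
The two quantitative measures compared in the study follow standard definitions: percent uptake as the decay-corrected fraction of injected activity found in the thyroid volume, and SUV as tissue concentration normalized by injected activity per gram of body weight. A minimal sketch; the numeric values below are illustrative, not from the paper:

```python
def percent_thyroid_uptake(thyroid_mbq, injected_mbq):
    """Percent of the injected 99mTcO4 activity found in the thyroid VOI
    (both activities assumed decay-corrected to the same time point)."""
    return 100.0 * thyroid_mbq / injected_mbq

def suv(conc_kbq_per_ml, injected_mbq, body_weight_g):
    """Standardized uptake value: tissue concentration divided by injected
    activity per gram of body weight (tissue density taken as 1 g/mL)."""
    return conc_kbq_per_ml / (injected_mbq * 1000.0 / body_weight_g)

# Illustrative numbers for the 185 MBq (5 mCi) dose used in the study.
uptake = percent_thyroid_uptake(thyroid_mbq=3.7, injected_mbq=185.0)  # 2.0
value = suv(conc_kbq_per_ml=50.0, injected_mbq=185.0, body_weight_g=60000.0)
```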

  17. First Principles Quantitative Modeling of Molecular Devices

    NASA Astrophysics Data System (ADS)

    Ning, Zhanyu

    In this thesis, we report theoretical investigations of nonlinear and nonequilibrium quantum electronic transport properties of molecular transport junctions from atomistic first principles. The aim is to seek not only qualitative but also quantitative understanding of the corresponding experimental data. At present, the challenges to quantitative theoretical work in molecular electronics include two key questions: (i) what is the proper atomic model for the experimental devices? (ii) how to accurately determine quantum transport properties without any phenomenological parameters? Our research is centered on these questions. We have systematically calculated atomic structures of the molecular transport junctions by performing total energy structural relaxation using density functional theory (DFT). Our quantum transport calculations were carried out by implementing DFT within the framework of Keldysh non-equilibrium Green's functions (NEGF). The calculated data are directly compared with the corresponding experimental measurements. Our general conclusion is that quantitative comparison with experimental data can be made if the device contacts are correctly determined. We calculated properties of nonequilibrium spin injection from Ni contacts to octane-thiolate films which form a molecular spintronic system. The first principles results allow us to establish a clear physical picture of how spins are injected from the Ni contacts through the Ni-molecule linkage to the molecule, why tunnel magnetoresistance is rapidly reduced by the applied bias in an asymmetric manner, and to what extent ab initio transport theory can make quantitative comparisons to the corresponding experimental data. We found that extremely careful sampling of the two-dimensional Brillouin zone of the Ni surface is crucial for accurate results in such a spintronic system.
We investigated the role of contact formation and its resulting structures to quantum transport in several molecular

  18. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers interesting insight into how medieval scholars viewed the subjects that we study. Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  19. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually simple, yet practically complicated, task. Presently, distances to nearby galaxies are only known to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it, because of its morphology, its non-uniform reddening and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies, and subsequently help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, with ~100 RR Lyrae found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique on the images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields.
I finally present photometry for the Wolf-Rayet binary WR 20a

  20. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  1. Quantitative analysis in megageomorphology

    NASA Technical Reports Server (NTRS)

    Mayer, L.

    1985-01-01

    Megageomorphology is the study of regional topographic features and their relations to independent geomorphic variables that operate at the regional scale. These independent variables can be classified as either tectonic or climatic in nature. Quantitative megageomorphology stresses the causal relations between plate tectonic factors and landscape features or correlations between climatic factors and geomorphic processes. In addition, the cumulative effects of tectonics and climate on landscape evolution, which operate simultaneously in a complex system of energy transfer, are of interest. Regional topographic differentiation, say between continents and ocean floors, is largely the result of the different densities and density contrasts within the oceanic and continental lithosphere and their isostatic consequences. Regional tectonic processes that alter these lithospheric characteristics include rifting, collision, subduction, transpression and transtension.

  2. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy-dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor, in a reference taxonomy, of all the sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that, given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
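The precision/recall idea can be sketched on a toy taxonomy (hypothetical node names; a brute-force illustration, not the authors' suffix-array implementation): score every candidate node by the F-measure of precision and recall of its leaf set against the read's matching sequences, and assign the read to the best-scoring node.

```python
# Toy taxonomy: internal nodes map to their children; leaves are sequences.
children = {
    "root": ["A", "B"],
    "A": ["s1", "s2", "s3"],
    "B": ["s4"],
}

def leaves(node):
    """Set of leaf sequences below a node (a leaf is its own leaf set)."""
    if node not in children:
        return {node}
    out = set()
    for c in children[node]:
        out |= leaves(c)
    return out

def best_assignment(matches):
    """Node whose leaf set best balances precision and recall (F-measure)
    against the set of sequences matching the read."""
    best, best_f = None, -1.0
    for node in ["root", "A", "B", "s1", "s2", "s3", "s4"]:
        lv = leaves(node)
        hit = len(lv & matches)
        if hit == 0:
            continue
        precision = hit / len(lv)   # fraction of the clade that matches
        recall = hit / len(matches) # fraction of matches inside the clade
        f = 2 * precision * recall / (precision + recall)
        if f > best_f:
            best, best_f = node, f
    return best

# a read matching s1 and s2: node "A" (2 of 3 leaves hit, full recall)
# beats "root" (2 of 4 leaves hit) and either single leaf (half recall)
print(best_assignment({"s1", "s2"}))  # → A
```

The LCA of {s1, s2} is also "A" here; the two strategies differ when the matches are a small subset of a large clade, where the F-measure prefers a more specific node.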

  3. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work in this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of the galaxy and the PSF. The remaining major source of error is source Poisson noise, due to the finite number of source photons. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images with short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias caused by source Poisson noise. Our noise treatment can be generalized to images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent accuracy even for images with signal-to-noise ratios less than 5 in general, making it the most promising technique for cosmic shear measurement in ongoing and upcoming large-scale galaxy surveys.

  4. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  5. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

    We investigate the usage of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then we enforce sparsity by using MAP with a sparsity-inducing prior. By coupling these two methods, we can produce sparse and accurate high resolution images that are conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where the size of the problems is quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. Via the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobes of the FFT. However, with our approach, clear edges, boundaries, and textures of the vehicles are obtained.
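A minimal 1D sketch of the IAA iteration (spectral-estimation form with synthetic data, not the authors' SAR imaging code): each candidate frequency's complex amplitude is refit by weighted least squares against the covariance implied by the current power estimates, which is what suppresses the side lobes relative to the plain FFT/periodogram.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 32, 128                       # time samples, frequency grid points
freqs = np.arange(K) / K
A = np.exp(2j * np.pi * np.outer(np.arange(N), freqs))  # steering matrix

# synthetic signal: two on-grid sinusoids plus a little complex noise
y = (A[:, 20] + 0.5 * A[:, 70]
     + 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N)))

p = np.abs(A.conj().T @ y) ** 2 / N ** 2   # periodogram initialization
for _ in range(10):                        # IAA iterations
    R = (A * p) @ A.conj().T               # covariance A diag(p) A^H
    Rinv = np.linalg.inv(R + 1e-9 * np.eye(N))  # small ridge for safety
    for k in range(K):
        a = A[:, k]
        amp = (a.conj() @ Rinv @ y) / (a.conj() @ Rinv @ a)
        p[k] = np.abs(amp) ** 2            # refined power estimate

peaks = sorted(int(i) for i in np.argsort(p)[-2:])  # strongest grid points
```

After a few iterations the two strongest grid points sit at the true frequencies, with the weaker component no longer buried in the leakage of the stronger one.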

  6. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  7. Accurate genome relative abundance estimation based on shotgun metagenomic reads.

    PubMed

    Xia, Li C; Cram, Jacob A; Chen, Ting; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    Accurate estimation of microbial community composition based on metagenomic sequencing data is fundamental for subsequent metagenomics analysis. Prevalent estimation methods are mainly based on directly summarizing alignment results or their variants, and often result in biased and/or unstable estimates. We have developed a unified probabilistic framework (named GRAMMy) that explicitly models read assignment ambiguities, genome size biases and read distributions along the genomes. A maximum likelihood method is employed to compute the Genome Relative Abundance of microbial communities using mixture model theory (GRAMMy). GRAMMy has been demonstrated to give estimates that are accurate and robust across both simulated and real read benchmark datasets. We applied GRAMMy to a collection of 34 metagenomic read sets from four metagenomics projects and identified 99 frequent species (minimally 0.5% abundant in at least 50% of the datasets) in the human gut samples. Our results show substantial improvements over previous studies, such as adjusting the over-estimated abundance of Bacteroides species in human gut samples, by providing a new reference-based strategy for metagenomic sample comparisons. GRAMMy can be used flexibly with many read assignment tools (mapping, alignment or composition-based), even with low-sensitivity mapping results from huge short-read datasets. It will be increasingly useful as an accurate and robust tool for abundance estimation with the growing size of read sets and the expanding database of reference genomes.
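The mixture-model idea can be sketched with a toy EM (hypothetical reads and genome lengths; not the GRAMMy implementation, which also models read distributions along genomes): estimate the mixing weights of genomes over ambiguously mapped reads, then correct by genome length to obtain relative abundances.

```python
import numpy as np

# match[i, j] = 1 if read i maps to genome j (toy 5-read, 3-genome data)
match = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 0],
    [1, 0, 0],
], dtype=float)
lengths = np.array([5e6, 2e6, 4e6])   # hypothetical genome lengths (bp)

pi = np.full(3, 1.0 / 3.0)            # mixing weights in read space
for _ in range(200):                  # EM iterations
    z = match * pi                    # E-step: responsibilities of genomes
    z /= z.sum(axis=1, keepdims=True) # ...for each read, normalized
    pi = z.mean(axis=0)               # M-step: update mixing weights

# genome-size correction: at equal abundance a genome twice as long
# yields twice the reads, so divide by length before normalizing
a = pi / lengths
a /= a.sum()
```

In this toy example the length correction flips the ranking: genome 0 attracts the most reads, but genome 1 is much shorter, so its estimated relative abundance comes out highest.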

  8. Will Quantitative Proteomics Redefine Some of the Key Concepts in Skeletal Muscle Physiology?

    PubMed Central

    Gizak, Agnieszka; Rakus, Dariusz

    2016-01-01

    Molecular and cellular biology methodology is traditionally based on the reasoning called “the mechanistic explanation”. In practice, this means identifying and selecting correlations between biological processes which result from our manipulation of a biological system. In theory, a successful application of this approach requires precise knowledge about all parameters of a studied system. However, in practice, due to the systems’ complexity, this requirement is rarely, if ever, accomplished. Typically, it is limited to quantitative or semi-quantitative measurements of selected parameters (e.g., concentrations of some metabolites), and a qualitative or semi-quantitative description of expression/post-translational modification changes within selected proteins. A quantitative proteomics approach offers the possibility of quantitatively characterizing the entire proteome of a biological system, in terms of both protein titers and post-translational modifications. This enables not only more accurate testing of novel hypotheses but also provides tools that can be used to verify some of the most fundamental dogmas of modern biology. In this short review, we discuss some of the consequences of using quantitative proteomics to verify several key concepts in skeletal muscle physiology.

  9. Multimodal spatial calibration for accurately registering EEG sensor positions.

    PubMed

    Zhang, Jianhua; Chen, Jian; Chen, Shengyong; Xiao, Gang; Li, Xiaoli

    2014-01-01

    This paper proposes a fast and accurate calibration method for multiple multimodal sensors, using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on the human head and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple-view calibration process is implemented to obtain the transformations between views. We first develop an efficient local repair algorithm to improve the depth map, and then design a special calibration body. Based on these, accurate and robust calibration results can be achieved. We evaluate the proposed method using the corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method achieves good performance, and it can be further applied to EEG source localization on the human brain.
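One standard building block for obtaining the transformation between two views is least-squares rigid alignment of corresponding 3D points (for example, detected chessboard corners); the Kabsch/SVD solution is sketched below as an illustration of the general technique, not the paper's exact procedure.

```python
import numpy as np

def rigid_transform(P, Q):
    """Least-squares rotation R and translation t with Q ≈ R @ P + t
    (Kabsch algorithm). P and Q are 3xN arrays of corresponding points."""
    cp = P.mean(axis=1, keepdims=True)
    cq = Q.mean(axis=1, keepdims=True)
    H = (P - cp) @ (Q - cq).T                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# synthetic check: recover a known rotation about z plus a translation
rng = np.random.default_rng(1)
P = rng.standard_normal((3, 10))              # 10 hypothetical 3D corners
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
Q = Rz @ P + np.array([[1.0], [2.0], [3.0]])  # second view of the corners
R, t = rigid_transform(P, Q)
```

For noiseless correspondences the recovery is exact; with measurement noise the same formula gives the least-squares optimum, which is why it is a common core of multi-view calibration pipelines.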

  10. Uniformly high order accurate essentially non-oscillatory schemes 3

    NASA Technical Reports Server (NTRS)

    Harten, A.; Engquist, B.; Osher, S.; Chakravarthy, S. R.

    1986-01-01

    In this paper (the third in a series) the construction and the analysis of essentially non-oscillatory shock-capturing methods for the approximation of hyperbolic conservation laws are presented. Also presented is a hierarchy of high order accurate schemes which generalizes Godunov's scheme and its second order accurate MUSCL extension to arbitrary order of accuracy. The design involves an essentially non-oscillatory piecewise polynomial reconstruction of the solution from its cell averages, time evolution through an approximate solution of the resulting initial value problem, and averaging of this approximate solution over each cell. The reconstruction algorithm is derived from a new interpolation technique that, when applied to piecewise smooth data, gives high-order accuracy wherever the function is smooth but avoids a Gibbs phenomenon at discontinuities. Unlike standard finite difference methods, this procedure uses an adaptive stencil of grid points, and consequently the resulting schemes are highly nonlinear.
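The adaptive-stencil idea can be illustrated with a minimal second-order 1D reconstruction (a sketch of the general principle, not the full scheme of the paper): each cell picks the one-sided slope whose divided difference is smaller in magnitude, so the chosen stencil never crosses a discontinuity and no Gibbs overshoot is produced.

```python
import numpy as np

def eno2_right_face(u):
    """Second-order ENO reconstruction of the value at each cell's right
    face from cell averages u: pick the smoother of the two candidate
    one-sided slopes (smaller first divided difference)."""
    n = len(u)
    v = np.empty(n)
    for i in range(1, n - 1):
        dl = u[i] - u[i - 1]           # left-biased difference
        dr = u[i + 1] - u[i]           # right-biased difference
        slope = dl if abs(dl) <= abs(dr) else dr
        v[i] = u[i] + 0.5 * slope      # extrapolate to the right face
    v[0], v[-1] = u[0], u[-1]          # first-order at the boundaries
    return v

# a step profile: near the jump the smoother (flat) stencil is chosen,
# so the reconstruction stays within the data range [0, 1]
u = np.array([0.0, 0.0, 0.0, 1.0, 1.0, 1.0])
v = eno2_right_face(u)
```

A fixed central-difference slope would instead produce face values of -0.25 and 1.25 around the jump, the beginning of the Gibbs oscillation the adaptive stencil avoids.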

  11. Computational Time-Accurate Body Movement: Methodology, Validation, and Application

    DTIC Science & Technology

    1995-10-01

    This record contains only fragments of the report. They describe a wing with a leading-edge sweep angle of 45 deg and a NACA 64A010 symmetrical airfoil section, with a symmetrical pylon cross section; listed figure titles include "Information Flow for the Time-Accurate Store Trajectory Prediction Process" and "Pitch Rates for NACA-0012 Airfoil"; and the validation section compares the computational results to data for a NACA-0012 airfoil following a predefined pitching motion.

  12. Quantitative coherent-scatter-computed tomography

    NASA Astrophysics Data System (ADS)

    Batchelar, Deidre L.; Westmore, Michael S.; Lai, Hao; Cunningham, Ian A.

    1998-07-01

    Conventional means of diagnosing and assessing the progression of osteoporosis, including radiographic absorptiometry and quantitative CT, are directly or indirectly dependent upon bone density. This is, however, not always a reliable indicator of fracture risk. Changes in the trabecular structure and bone mineral content (BMC) are thought to provide a better indication of the chance of spontaneous fractures occurring. Coherent-scatter CT (CSCT) is a technique which produces images based on the low-angle (0-10 degrees) x-ray diffraction properties of tissue. Diffraction patterns from an object are acquired using first-generation CT geometry with a diagnostic x-ray image-intensifier-based system. These patterns are used to reconstruct a series of maps of the angle-dependent coherent scatter cross section in a tomographic slice, which are dependent upon the molecular structure of the scatterer. Hydroxyapatite has a very different cross section from that of soft tissue, and the CSCT method may therefore form the basis for a more direct measure of BMC. Our original CSCT images suffered from a 'cupping' artifact, resulting in increased intensities for pixels at the periphery of the object. This artifact, which is due to self-attenuation of scattered x rays, caused a systematic error of up to 20% in cross sections measured from a CT image. This effect has been removed by monitoring the transmitted intensity using a photodiode mounted on the primary beam stop, and normalizing the scatter intensity to that of the transmitted beam for each projection. Images reconstructed from data normalized in this way do not exhibit observable attenuation artifacts. Elimination of this artifact enables the determination of accurate quantitative measures of BMC at each pixel in a tomograph.

  13. High Resolution Quantitative Angle-Scanning Widefield Surface Plasmon Microscopy

    PubMed Central

    Tan, Han-Min; Pechprasarn, Suejit; Zhang, Jing; Pitter, Mark C.; Somekh, Michael G.

    2016-01-01

    We describe the construction of a prismless widefield surface plasmon microscope; this has been applied to imaging of the interactions of protein and antibodies in aqueous media. The illumination angle of spatially incoherent diffuse laser illumination was controlled with an amplitude spatial light modulator placed in a conjugate back focal plane to allow dynamic control of the illumination angle. Quantitative surface plasmon microscopy images with high spatial resolution were acquired by post-processing a series of images obtained as a function of illumination angle. Experimental results are presented showing spatially and temporally resolved binding of a protein to a ligand. We also show theoretical results calculated by vector diffraction theory that accurately predict the response of the microscope on a spatially varying sample thus allowing proper quantification and interpretation of the experimental results. PMID:26830146

  14. INVITED TALK: Celestial Mechanics as quantitative observational science

    NASA Astrophysics Data System (ADS)

    Milani, Andrea

    2011-04-01

    When the theories of motion for celestial bodies are constrained by observations, they can provide the most quantitative and most accurate physical models. The amount of information contained in such theories is large for the following reasons: 1) very large data sets (millions of observations); 2) accurate observations; 3) large population samples; 4) accurate dynamical models, with subtle perturbations (e.g., non-gravitational, relativistic); 5) long term stability and computability; 6) extreme accuracy requirements (as in impact monitoring). This way of thinking shall be illustrated by a number of examples taken from my own experience, that is, either from results obtained in the past by my coworkers and myself, or from work I have not completed and may not be able to complete, which will thus be left as a bequest of problems to be solved. The examples will include the internal structure of asteroid families, the orbit determination of large populations (asteroids, debris), the next generation radioscience experiments, the prediction of chaotic orbits, and the need to decide on the deflection of a threatening asteroid.

  15. Rapid and Highly Accurate Prediction of Poor Loop Diuretic Natriuretic Response in Patients With Heart Failure

    PubMed Central

    Testani, Jeffrey M.; Hanberg, Jennifer S.; Cheng, Susan; Rao, Veena; Onyebeke, Chukwuma; Laur, Olga; Kula, Alexander; Chen, Michael; Wilson, F. Perry; Darlington, Andrew; Bellumkonda, Lavanya; Jacoby, Daniel; Tang, W. H. Wilson; Parikh, Chirag R.

    2015-01-01

    Background Removal of excess sodium and fluid is a primary therapeutic objective in acute decompensated heart failure (ADHF) and is commonly monitored with fluid balance and weight loss. However, these parameters are frequently inaccurate or not collected and require a delay of several hours after diuretic administration before they are available. Accessible tools for rapid and accurate prediction of diuretic response are needed. Methods and Results Based on well-established renal physiologic principles, an equation was derived to predict net sodium output using a spot urine sample obtained one or two hours following loop diuretic administration. This equation was then prospectively validated in 50 ADHF patients using meticulously obtained timed 6-hour urine collections to quantitate loop-diuretic-induced cumulative sodium output. Poor natriuretic response was defined as a cumulative sodium output of <50 mmol, a threshold that would result in a positive sodium balance with twice-daily diuretic dosing. Following a median dose of 3 mg (2–4 mg) of intravenous bumetanide, 40% of the population had a poor natriuretic response. The correlation between measured and predicted sodium output was excellent (r=0.91, p<0.0001). Poor natriuretic response could be accurately predicted with the sodium prediction equation (AUC=0.95, 95% CI 0.89–1.0, p<0.0001). Clinically recorded net fluid output had a weaker correlation (r=0.66, p<0.001) and a lesser ability to predict poor natriuretic response (AUC=0.76, 95% CI 0.63–0.89, p=0.002). Conclusions In patients being treated for ADHF, poor natriuretic response can be predicted soon after diuretic administration with excellent accuracy using a spot urine sample. PMID:26721915

  16. ACCURATE SIMULATIONS OF BINARY BLACK HOLE MERGERS IN FORCE-FREE ELECTRODYNAMICS

    SciTech Connect

    Alic, Daniela; Moesta, Philipp; Rezzolla, Luciano; Jaramillo, Jose Luis; Zanotti, Olindo

    2012-07-20

    We provide additional information on our recent study of the electromagnetic emission produced during the inspiral and merger of supermassive black holes when these are immersed in a force-free plasma threaded by a uniform magnetic field. As anticipated in a recent letter, our results show that although a dual-jet structure is present, the associated luminosity is ~100 times smaller than the total one, which is predominantly quadrupolar. Here we discuss the details of our implementation of the equations, in which the force-free condition is not imposed at a discrete level but rather obtained via a damping scheme which drives the solution to satisfy the correct condition. We show that this is important for a correct and accurate description of the current sheets that can develop in the course of the simulation. We also study in greater detail the three-dimensional charge distribution produced as a consequence of the inspiral and show that during the inspiral it possesses a complex but ordered structure which traces the motion of the two black holes. Finally, we provide quantitative estimates of the scaling of the electromagnetic emission with frequency: the diffused part has the same dependence as the gravitational-wave emission and scales as L_EM^(non-coll) ≈ Ω^(10/3-8/3), while the collimated one scales as L_EM^(coll) ≈ Ω^(5/3-6/3), thus with a steeper dependence than previously estimated. We discuss the impact of these results on the potential detectability of dual jets from supermassive black holes and the steps necessary for more accurate estimates.

  17. Highly accurate moving object detection in variable bit rate video-based traffic monitoring systems.

    PubMed

    Huang, Shih-Chia; Chen, Bo-Hao

    2013-12-01

    Automated motion detection, which segments moving objects from video streams, is the key technology of intelligent transportation systems for traffic management. Traffic surveillance systems use video communication over real-world networks with limited bandwidth, which frequently suffers from either network congestion or unstable bandwidth. Evidence supporting these problems abounds in publications about wireless video communication. Thus, to effectively perform the arduous task of motion detection over a network with unstable bandwidth, a process by which bit rate is allocated to match the available network bandwidth is needed. This process is accomplished by the rate control scheme. This paper presents a new motion detection approach that is based on the cerebellar-model-articulation-controller (CMAC) through artificial neural networks to completely and accurately detect moving objects in both high and low bit-rate video streams. The proposed approach consists of a probabilistic background generation (PBG) module and a moving object detection (MOD) module. To ensure that the properties of variable bit-rate video streams are accommodated, the proposed PBG module effectively produces a probabilistic background model through an unsupervised learning process over variable bit-rate video streams. Next, the MOD module, which is based on the CMAC network, completely and accurately detects moving objects in both low and high bit-rate video streams by implementing two procedures: 1) a block selection procedure and 2) an object detection procedure. The detection results show that our proposed approach is capable of performing with higher efficacy when compared with the results produced by other state-of-the-art approaches in variable bit-rate video streams over real-world limited bandwidth networks. Both qualitative and quantitative evaluations support this claim; for instance, the proposed approach achieves Similarity and F1 accuracy rates that are 76

  18. Identification and Evaluation of Reference Genes for Accurate Transcription Normalization in Safflower under Different Experimental Conditions

    PubMed Central

    Li, Dandan; Hu, Bo; Wang, Qing; Liu, Hongchang; Pan, Feng; Wu, Wei

    2015-01-01

    Safflower (Carthamus tinctorius L.) has received a significant amount of attention as a medicinal plant and oilseed crop. Gene expression studies provide a theoretical molecular biology foundation for improving new traits and developing new cultivars. Real-time quantitative PCR (RT-qPCR) has become a crucial approach for gene expression analysis. In addition, appropriate reference genes (RGs) are essential for accurate and rapid relative quantification analysis of gene expression. In this study, fifteen candidate RGs involved in multiple metabolic pathways of plants were selected and validated under different experimental treatments, at different seed development stages, and in different cultivars and tissues for real-time PCR experiments. These genes were ABCS, 60SRPL10, RANBP1, UBCL, MFC, UBCE2, EIF5A, COA, EF1-β, EF1, GAPDH, ATPS, MBF1, GTPB and GST. The suitability evaluation was performed with the geNorm and NormFinder programs. Overall, EF1, UBCE2, EIF5A, ATPS and 60SRPL10 were the most stable genes, and MBF1, as well as MFC, were the most unstable genes by geNorm and NormFinder software across all experimental samples. To verify the RGs selected by the two programs, expression analysis of 7 CtFAD2 genes in safflower seeds at different developmental stages under cold stress was performed using different RGs for normalization in RT-qPCR experiments. The results showed similar expression patterns when the most stable RGs selected by geNorm or NormFinder software were used, whereas differences were detected when the most unstable reference genes were used. The most stable combination of genes selected in this study will help to achieve more accurate and reliable results in a wide variety of samples in safflower. PMID:26457898
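The geNorm stability measure used by such programs is simple to state: for each candidate gene, M is the average, over all other candidates, of the standard deviation across samples of the pairwise log2 expression ratio; lower M means more stable. A sketch with hypothetical relative expression values (rows are samples, columns are genes; not data from this study):

```python
import numpy as np

# hypothetical relative expression: genes 0 and 1 co-vary tightly,
# gene 2 fluctuates erratically across samples
expr = np.array([
    [1.00, 2.0, 0.50],
    [1.10, 2.1, 1.50],
    [0.95, 1.9, 0.30],
    [1.05, 2.2, 2.00],
])

def genorm_m(expr):
    """geNorm stability M per gene: mean over partner genes of the
    standard deviation of the pairwise log2 expression ratios."""
    logs = np.log2(expr)
    n = expr.shape[1]
    m = np.empty(n)
    for j in range(n):
        sds = [np.std(logs[:, j] - logs[:, k])
               for k in range(n) if k != j]
        m[j] = np.mean(sds)
    return m

m = genorm_m(expr)   # gene 2's erratic behavior inflates its M value
```

geNorm then iteratively discards the gene with the highest M, leaving the most stable pair as the recommended normalization set.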

  19. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)
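The standard-curve procedure described is ordinary linear calibration: fit absorbance against known methanol concentrations, then invert the fitted line to read the unknown off the curve. A sketch with hypothetical numbers (not the data of the experiment):

```python
import numpy as np

# calibration standards: percent methanol vs. measured IR absorbance
# (hypothetical values illustrating a Beer's-law linear response)
pct_methanol = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
absorbance = np.array([0.11, 0.20, 0.41, 0.59, 0.80])

# least-squares line through the standards: A = slope * c + intercept
slope, intercept = np.polyfit(pct_methanol, absorbance, 1)

# read an unknown sample off the curve by inverting the line
unknown_abs = 0.50
pct_unknown = (unknown_abs - intercept) / slope   # ≈ 5% methanol
```

In practice one also checks the linearity of the standards (correlation coefficient, residuals) before trusting the inverted value, since Beer's law can break down at high concentrations.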

  20. Quantitative shadowgraphy and proton radiography for large intensity modulations

    NASA Astrophysics Data System (ADS)

    Kasim, Muhammad Firmansyah; Ceurvorst, Luke; Ratan, Naren; Sadler, James; Chen, Nicholas; Sävert, Alexander; Trines, Raoul; Bingham, Robert; Burrows, Philip N.; Kaluza, Malte C.; Norreys, Peter

    2017-02-01

    Shadowgraphy is a technique widely used to diagnose objects or systems in various fields in physics and engineering. In shadowgraphy, an optical beam is deflected by the object and then the intensity modulation is captured on a screen placed some distance away. However, retrieving quantitative information from the shadowgrams themselves is a challenging task because of the nonlinear nature of the process. Here, we present a method to retrieve quantitative information from shadowgrams, based on computational geometry. This process can also be applied to proton radiography for electric and magnetic field diagnosis in high-energy-density plasmas and has been benchmarked using a toroidal magnetic field as the object, among others. It is shown that the method can accurately retrieve quantitative parameters with error bars less than 10%, even when caustics are present. The method is also shown to be robust enough to process real experimental results with simple pre- and postprocessing techniques. This adds a powerful tool for research in various fields in engineering and physics for both techniques.

  1. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  2. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube, and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  3. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.

  4. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  5. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  6. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.

  7. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  8. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale conversion protocols (average, weighted average/luminosity, and software specific) were compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, demonstrating that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics.
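As a rough sketch of the grayscale conversion step, the weighted-average (luminosity) protocol mentioned above, followed by an ODR computed as the darkness of the test spot relative to the background, might look like this (the ODR formula here is an assumption for illustration; the paper defines the exact measure):

```python
def luminosity_gray(r, g, b):
    """Weighted-average (luminosity) grayscale conversion on a 0-255 scale,
    using the standard Rec. 601 luma weights."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def optical_darkness_ratio(spot_rgb, background_rgb):
    """Hypothetical ODR: darkness of the test spot normalized by the
    background intensity (0 = no signal, 1 = a completely dark spot)."""
    bg = luminosity_gray(*background_rgb)
    spot = luminosity_gray(*spot_rgb)
    return (bg - spot) / bg

# Dark gray spot on a near-white background
odr = optical_darkness_ratio((60, 60, 60), (240, 240, 240))
```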

  9. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  10. Quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Cramer, Rainer

    2011-02-01

    Quantitation is an inherent requirement in comparative proteomics and there is no exception to this for plant proteomics. Quantitative proteomics has high demands on the experimental workflow, requiring a thorough design and often a complex multi-step structure. It has to include sufficient numbers of biological and technical replicates and methods that are able to facilitate a quantitative signal read-out. Quantitative plant proteomics in particular poses many additional challenges but because of the nature of plants it also offers some potential advantages. In general, analysis of plants has been less prominent in proteomics. Low protein concentration, difficulties in protein extraction, genome multiploidy, high Rubisco abundance in green tissue, and an absence of well-annotated and completed genome sequences are some of the main challenges in plant proteomics. However, the latter is now changing with several genomes emerging for model plants and crops such as potato, tomato, soybean, rice, maize and barley. This review discusses the current status in quantitative plant proteomics (MS-based and non-MS-based) and its challenges and potentials. Both relative and absolute quantitation methods in plant proteomics from DIGE to MS-based analysis after isotope labeling and label-free quantitation are described and illustrated by published studies. In particular, we describe plant-specific quantitative methods such as metabolic labeling methods that can take full advantage of plant metabolism and culture practices, and discuss other potential advantages and challenges that may arise from the unique properties of plants.

  11. Rapid and accurate evaluation of the quality of commercial organic fertilizers using near infrared spectroscopy.

    PubMed

    Wang, Chang; Huang, Chichao; Qian, Jian; Xiao, Jian; Li, Huan; Wen, Yongli; He, Xinhua; Ran, Wei; Shen, Qirong; Yu, Guanghui

    2014-01-01

    The composting industry has been growing rapidly in China because of a boom in the animal industry. Therefore, a rapid and accurate assessment of the quality of commercial organic fertilizers is of the utmost importance. In this study, a novel technique that combines near infrared (NIR) spectroscopy with partial least squares (PLS) analysis is developed for rapidly and accurately assessing commercial organic fertilizer quality. A total of 104 commercial organic fertilizers were collected from full-scale compost factories in Jiangsu Province, east China. In general, the NIR-PLS technique gave accurate predictions of the total organic matter, water soluble organic nitrogen, pH, and germination index; less accurate results for the moisture, total nitrogen, and electrical conductivity; and the least accurate results for water soluble organic carbon. Our results suggest the combined NIR-PLS technique could be applied as a valuable tool to rapidly and accurately assess the quality of commercial organic fertilizers.
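As an illustration of the PLS step, a minimal one-component PLS1 regression (NIPALS-style) on toy "spectra" might look like the following. Real NIR-PLS calibrations use hundreds of wavelengths and multiple latent components; this sketch is not the authors' model, only the core computation:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def pls1_one_component(X, y):
    """Fit a single-component PLS1 regression (one NIPALS step) on
    mean-centered data. Returns a predict(x_row) function."""
    n, p = len(X), len(X[0])
    x_mean = [sum(row[j] for row in X) / n for j in range(p)]
    y_mean = sum(y) / n
    Xc = [[row[j] - x_mean[j] for j in range(p)] for row in X]
    yc = [v - y_mean for v in y]
    # Weight vector w proportional to X^T y, normalized to unit length
    w = [dot([row[j] for row in Xc], yc) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [dot(row, w) for row in Xc]       # latent scores
    q = dot(yc, t) / dot(t, t)            # y-loading
    def predict(x_row):
        t_new = dot([x_row[j] - x_mean[j] for j in range(p)], w)
        return y_mean + q * t_new
    return predict

# Toy "spectra": two perfectly collinear wavelength channels
X = [[-2.0, -4.0], [-1.0, -2.0], [0.0, 0.0], [1.0, 2.0], [2.0, 4.0]]
y = [-6.0, -3.0, 0.0, 3.0, 6.0]
predict = pls1_one_component(X, y)
```

With collinear channels and a linear response, a single latent component reproduces the training targets exactly; noisy real spectra need several components chosen by cross-validation.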

  12. An Accurate Link Correlation Estimator for Improving Wireless Protocol Performance

    PubMed Central

    Zhao, Zhiwei; Xu, Xianghua; Dong, Wei; Bu, Jiajun

    2015-01-01

    Wireless link correlation has shown significant impact on the performance of various sensor network protocols. Many works have been devoted to exploiting link correlation for protocol improvements. However, the effectiveness of these designs heavily relies on the accuracy of link correlation measurement. In this paper, we investigate state-of-the-art link correlation measurement and analyze the limitations of existing works. We then propose a novel lightweight and accurate link correlation estimation (LACE) approach based on the reasoning of link correlation formation. LACE combines both long-term and short-term link behaviors for link correlation estimation. We implement LACE as a stand-alone interface in TinyOS and incorporate it into both routing and flooding protocols. Simulation and testbed results show that LACE: (1) achieves more accurate and lightweight link correlation measurements than the state-of-the-art work; and (2) greatly improves the performance of protocols exploiting link correlation. PMID:25686314
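The abstract does not give LACE's formulas, but the general idea of measuring link correlation from packet reception traces, and of blending long-term and short-term behavior, can be sketched as follows (the conditional-reception definition and the blending weight are assumptions for illustration, not the paper's estimator):

```python
def reception_correlation(trace_a, trace_b):
    """Conditional packet reception probability P(B received | A received),
    a common notion of link correlation between two receivers overhearing
    the same broadcasts (1 = received, 0 = lost)."""
    both = sum(1 for a, b in zip(trace_a, trace_b) if a and b)
    a_ok = sum(trace_a)
    return both / a_ok if a_ok else 0.0

def blended_estimate(long_term, short_term, alpha=0.3):
    """Blend a long-term average with a short-term estimate, in the spirit
    of combining both behaviors (the weight alpha is an assumption)."""
    return (1 - alpha) * long_term + alpha * short_term

a = [1, 1, 0, 1, 1, 0, 1, 1]
b = [1, 0, 0, 1, 1, 0, 1, 0]
corr = reception_correlation(a, b)
```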

  13. Accurate determination of the sedimentation flux of concentrated suspensions

    NASA Astrophysics Data System (ADS)

    Martin, J.; Rakotomalala, N.; Salin, D.

    1995-10-01

    Flow rate jumps are used to generate propagating concentration variations in a counterflow stabilized suspension (a liquid fluidized bed). An acoustic technique is used to accurately measure the resulting concentration profiles through the bed. Depending on the experimental conditions, we have observed self-sharpening and/or self-spreading concentration fronts. Our data are analyzed in the framework of Kynch's theory, providing an accurate determination of the sedimentation flux [CU(C), where U(C) is the hindered sedimentation velocity of the suspension] and its derivatives in the concentration range 30%-60%. In the vicinity of the packing concentration, controlling the flow rate has allowed us to increase the maximum packing up to 60%.
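In Kynch's framework the flux is f(C) = C·U(C), and the speed of a concentration front between two states follows from a Rankine-Hugoniot jump condition on the flux. A minimal sketch using the Richardson-Zaki hindrance law U(C) = U0(1 - C)^n as a stand-in for the measured flux (U0 and n are illustrative values, not the paper's data):

```python
def hindered_velocity(c, u0=1.0, n=4.65):
    """Richardson-Zaki hindered settling velocity U(C) = U0 (1 - C)^n."""
    return u0 * (1.0 - c) ** n

def flux(c, u0=1.0, n=4.65):
    """Kynch sedimentation flux f(C) = C * U(C)."""
    return c * hindered_velocity(c, u0, n)

def shock_speed(c1, c2, u0=1.0, n=4.65):
    """Rankine-Hugoniot speed of a concentration shock between C1 and C2:
    s = [f(C2) - f(C1)] / (C2 - C1)."""
    return (flux(c2, u0, n) - flux(c1, u0, n)) / (c2 - c1)

# Front between 30% and 60% solids, the range probed in the experiment
s = shock_speed(0.30, 0.60)
```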

  14. Selecting MODFLOW cell sizes for accurate flow fields.

    PubMed

    Haitjema, H; Kelson, V; de Lange, W

    2001-01-01

    Contaminant transport models often use a velocity field derived from a MODFLOW flow field. Consequently, the accuracy of MODFLOW in representing a ground water flow field determines in part the accuracy of the transport predictions, particularly when advective transport is dominant. We compared MODFLOW ground water flow rates and MODPATH particle traces (advective transport) for a variety of conceptual models and different grid spacings to exact or approximate analytic solutions. All of our numerical experiments concerned flow in a single confined or semiconfined aquifer. While MODFLOW appeared robust in terms of both local and global water balance, we found that ground water flow rates, particle traces, and associated ground water travel times are accurate only when sufficiently small cells are used. For instance, a minimum of four or five cells is required to accurately model total ground water inflow in tributaries or other narrow surface water bodies that end inside the model domain. Also, about 50 cells are needed to represent zones of differing transmissivities; otherwise, an incorrect flow field and (locally) inaccurate ground water travel times may result. Finally, to adequately represent leakage through aquitards or through the bottom of surface water bodies, it was found that the maximum allowable cell dimension should not exceed a characteristic leakage length lambda, which is defined as the square root of the aquifer transmissivity times the resistance of the aquitard or stream bottom. In some cases a cell size of one-tenth of lambda is necessary to obtain accurate results.
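The leakage-length rule above translates directly into a cell-size check. A minimal sketch (the safety factor of 10 follows the article's "one-tenth of lambda" case; the example values of T and c are hypothetical):

```python
import math

def leakage_length(transmissivity, resistance):
    """Characteristic leakage length lambda = sqrt(T * c), where T is the
    aquifer transmissivity [m^2/d] and c the hydraulic resistance of the
    aquitard or streambed [d]."""
    return math.sqrt(transmissivity * resistance)

def max_cell_size(transmissivity, resistance, safety=10.0):
    """Upper bound on MODFLOW cell size: lambda at most, and lambda/10
    in the more demanding cases described in the article."""
    return leakage_length(transmissivity, resistance) / safety

# Example: T = 500 m^2/d, aquitard resistance c = 200 d
lam = leakage_length(500.0, 200.0)
cell = max_cell_size(500.0, 200.0)
```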

  15. Accurate thermoelastic tensor and acoustic velocities of NaCl

    NASA Astrophysics Data System (ADS)

    Marcondes, Michel L.; Shukla, Gaurav; da Silveira, Pedro; Wentzcovitch, Renata M.

    2015-12-01

    Despite the importance of the thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  16. Accurate thermoelastic tensor and acoustic velocities of NaCl

    SciTech Connect

    Marcondes, Michel L.; Shukla, Gaurav; Silveira, Pedro da; Wentzcovitch, Renata M.

    2015-12-15

    Despite the importance of the thermoelastic properties of minerals in geology and geophysics, their measurement at high pressures and temperatures is still challenging. Thus, ab initio calculations are an essential tool for predicting these properties at extreme conditions. Owing to the approximate description of the exchange-correlation energy, approximations used in calculations of vibrational effects, and numerical/methodological approximations, these methods produce systematic deviations. Hybrid schemes combining experimental data and theoretical results have emerged as a way to reconcile available information and offer more reliable predictions at experimentally inaccessible thermodynamic conditions. Here we introduce a method to improve the calculated thermoelastic tensor by using a highly accurate thermal equation of state (EoS). The corrective scheme is general, applicable to crystalline solids with any symmetry, and can produce accurate results at conditions where experimental data may not exist. We apply it to rock-salt-type NaCl, a material whose structural properties have been challenging to describe accurately by standard ab initio methods and whose acoustic/seismic properties are important for the gas and oil industry.

  17. DNA barcode data accurately assign higher spider taxa.

    PubMed

    Coddington, Jonathan A; Agnarsson, Ingi; Cheng, Ren-Chung; Čandek, Klemen; Driskell, Amy; Frick, Holger; Gregorič, Matjaž; Kostanjšek, Rok; Kropf, Christian; Kweskin, Matthew; Lokovšek, Tjaša; Pipan, Miha; Vidergar, Nina; Kuntner, Matjaž

    2016-01-01

    underlying database impacts accuracy of results; many outliers in our dataset could be attributed to taxonomic and/or sequencing errors in BOLD and GenBank. It seems that an accurate and complete reference library of families and genera of life could provide accurate higher level taxonomic identifications cheaply and accessibly, within years rather than decades.

  18. DNA barcode data accurately assign higher spider taxa

    PubMed Central

    Coddington, Jonathan A.; Agnarsson, Ingi; Cheng, Ren-Chung; Čandek, Klemen; Driskell, Amy; Frick, Holger; Gregorič, Matjaž; Kostanjšek, Rok; Kropf, Christian; Kweskin, Matthew; Lokovšek, Tjaša; Pipan, Miha; Vidergar, Nina

    2016-01-01

    the underlying database impacts accuracy of results; many outliers in our dataset could be attributed to taxonomic and/or sequencing errors in BOLD and GenBank. It seems that an accurate and complete reference library of families and genera of life could provide accurate higher level taxonomic identifications cheaply and accessibly, within years rather than decades. PMID:27547527

  19. Diagnosis of breast cancer biopsies using quantitative phase imaging

    NASA Astrophysics Data System (ADS)

    Majeed, Hassaan; Kandel, Mikhail E.; Han, Kevin; Luo, Zelun; Macias, Virgilia; Tangella, Krishnarao; Balla, Andre; Popescu, Gabriel

    2015-03-01

    The standard practice in the histopathology of breast cancers is to examine a hematoxylin and eosin (H&E) stained tissue biopsy under a microscope. The pathologist looks at certain morphological features, visible under the stain, to diagnose whether a tumor is benign or malignant. This determination is made based on qualitative inspection, making it subject to investigator bias. Furthermore, since this method requires a microscopic examination by the pathologist, it suffers from low throughput. A quantitative, label-free and high throughput method for detection of these morphological features from images of tissue biopsies is, hence, highly desirable, as it would assist the pathologist in making a quicker and more accurate diagnosis of cancers. We present here preliminary results showing the potential of using quantitative phase imaging for breast cancer screening and for assisting with differential diagnosis. We generated optical path length maps of unstained breast tissue biopsies using Spatial Light Interference Microscopy (SLIM). As a first step towards diagnosis based on quantitative phase imaging, we carried out a qualitative evaluation of the imaging resolution and contrast of our label-free phase images. These images were shown to two pathologists, who marked the tumors present in tissue as either benign or malignant. This diagnosis was then compared against the diagnosis of the two pathologists on H&E stained tissue images, and the number of agreements was counted. In our experiment, the agreement between SLIM- and H&E-based diagnosis was measured to be 88%. Our preliminary results demonstrate the potential and promise of SLIM for a push in the future towards quantitative, label-free and high throughput diagnosis.
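The agreement figure quoted above is simply the fraction of matching benign/malignant calls between the two readings. A minimal sketch (the toy data below is illustrative, not the study's):

```python
def percent_agreement(diagnoses_a, diagnoses_b):
    """Fraction of cases on which two sets of readings agree, e.g.
    SLIM-based vs. H&E-based benign/malignant calls."""
    matches = sum(1 for a, b in zip(diagnoses_a, diagnoses_b) if a == b)
    return matches / len(diagnoses_a)

slim = ["malignant", "benign", "malignant", "benign"]
he = ["malignant", "benign", "benign", "benign"]
agreement = percent_agreement(slim, he)
```

A chance-corrected statistic such as Cohen's kappa would be a natural complement when class prevalence is skewed.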

  20. Accurate measurement of streamwise vortices in low speed aerodynamic flows

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Kudo, Jun; Breuer, Kenneth S.

    2010-11-01

    Low Reynolds number experiments with flapping animals (such as bats and small birds) are of current interest in understanding biological flight mechanics, and due to their application to Micro Air Vehicles (MAVs) which operate in a similar parameter space. Previous PIV wake measurements have described the structures left by bats and birds, and provided insight to the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions due to significant experimental challenges associated with the highly three-dimensional and unsteady nature of the flows, and the low wake velocities associated with lifting bodies that only weigh a few grams. This requires the high-speed resolution of small flow features in a large field of view using limited laser energy and finite camera resolution. Cross-stream measurements are further complicated by the high out-of-plane flow which requires thick laser sheets and short interframe times. To quantify and address these challenges we present data from a model study on the wake behind a fixed wing at conditions comparable to those found in biological flight. We present a detailed analysis of the PIV wake measurements, discuss the criteria necessary for accurate measurements, and present a new dual-plane PIV configuration to resolve these issues.

  1. Rapid Accurate Identification of Bacterial and Viral Pathogens

    SciTech Connect

    Dunn, John

    2007-03-09

    The goals of this program were to develop two assays for rapid, accurate identification of pathogenic organisms at the strain level. The first assay, "Quantitative Genome Profiling" (QGP), is a real-time PCR assay with a restriction enzyme-based component. Its underlying concept is that certain enzymes should cleave genomic DNA at many sites and that in some cases these cuts will interrupt the connection on the genomic DNA between flanking PCR primer pairs, thereby eliminating selected PCR amplifications. When this occurs, the appearance of the real-time PCR threshold (Ct) signal during DNA amplification is totally eliminated or, if cutting is incomplete, greatly delayed compared to an uncut control. This temporal difference in appearance of the Ct signal relative to undigested control DNA provides a rapid, high-throughput approach for DNA-based identification of different but closely related pathogens, depending upon the nucleotide sequence of the target region. The second assay we developed uses the nucleotide sequence of pairs of short identifier tags (~21 bp) to identify DNA molecules. Subtle differences in linked tag pair combinations can also be used to distinguish between closely related isolates.

  2. Optimal target VOI size for accurate 4D coregistration of DCE-MRI

    NASA Astrophysics Data System (ADS)

    Park, Brian; Mikheev, Artem; Zaim Wadghiri, Youssef; Bertrand, Anne; Novikov, Dmitry; Chandarana, Hersh; Rusinek, Henry

    2016-03-01

    Dynamic contrast enhanced (DCE) MRI has emerged as a reliable and diagnostically useful functional imaging technique. A DCE protocol typically lasts 3-15 minutes and results in a time series of N volumes. For automated analysis, it is important that volumes acquired at different times be spatially coregistered. We have recently introduced a novel 4D (volume time series) coregistration tool based on a user-specified target volume of interest (VOI). However, the relationship between coregistration accuracy and target VOI size has not been investigated. In this study, coregistration accuracy was quantitatively measured using various sized target VOIs. Coregistration of 10 DCE-MRI mouse head image sets was performed with various sized VOIs targeting the mouse brain. Accuracy was quantified by measures based on the union and standard deviation of the coregistered volume time series. Coregistration accuracy was found to improve rapidly as the size of the VOI increased and approached the approximate volume of the target (mouse brain). Further inflation of the VOI beyond the volume of the target only marginally improved coregistration accuracy. The CPU time needed to accomplish coregistration is a linear function of N and varied gradually with VOI size. From the results of this study, we recommend that the VOI be slightly overinclusive of the target, by approximately 5 voxels, for computationally efficient and accurate coregistration.

  3. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask worn over the mouth, used to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Measurements of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound as functions of subglottal pressure were performed, with and without the pneumotach in place, and differences were noted. The authors acknowledge support of NIH Grant 2R01DC005642-10A1.

  4. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(NsNv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using the VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs under comparable accuracy, especially for cases where many incident excitations are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometry configuration of the array, beam direction, and light wavelength.

  5. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives highly accurate predictions, with errors below 0.0012 K, for light sources with CCTs ranging from 500 K to 10^6 K.
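The Newton iteration described above needs the objective's first and second derivatives. A minimal sketch on a toy quadratic objective (a stand-in only; the actual CCT objective is the distance to the Planckian locus and requires CIE colorimetric data):

```python
def newton_minimize(f1, f2, x0, tol=1e-10, max_iter=50):
    """Minimize a smooth 1-D objective by Newton's method, given its
    first (f1) and second (f2) derivatives: x <- x - f'(x) / f''(x)."""
    x = x0
    for _ in range(max_iter):
        step = f1(x) / f2(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Toy objective g(T) = (T - 6500)^2 with its minimum at T = 6500 "K";
# a quadratic converges in a single Newton step.
g1 = lambda t: 2.0 * (t - 6500.0)
g2 = lambda t: 2.0
t_min = newton_minimize(g1, g2, 5000.0)
```

In practice a good starting value matters, which is why the authors seed the iteration with Robertson's method.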

  6. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance of an accurate and reliable prototype earth sensor head (ARPESH) are described. The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization that should lead to high-accuracy horizon sensing minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes in the 500 km range. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and then the performance of the sensor is reported under laboratory conditions in which the sensor is installed in a simulator that permits it to scan over a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  7. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways that are designed to both retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. Therefore, the EFP method provides a way to further reduce the computational effort while retaining accuracy by treating the far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.

  8. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate value for the excited-state (6P1/2) hyperfine splitting in Cs and reveal a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085

  9. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.

  10. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and ultimately misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  11. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies of 1 - 30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  12. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    PubMed Central

    Shortis, Mark

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems. PMID:26690172

  13. Calibration Techniques for Accurate Measurements by Underwater Camera Systems.

    PubMed

    Shortis, Mark

    2015-12-07

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation and stability of underwater camera system calibration are also discussed. Samples of results from published reports are provided to demonstrate the range of possible accuracies for the measurements produced by underwater camera systems.
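    The dominant refractive effect the two records above describe can be illustrated with Snell's law at a flat housing port: a ray that appears to travel at one angle inside the air-filled housing actually travels at a smaller angle in water, so uncorrected measurements distort object geometry. This is a minimal flat-port sketch, not any specific calibration model from the review.

```python
import math

def in_water_angle(apparent_angle_rad, n_water=1.333, n_air=1.0):
    """Flat-port Snell correction:
    n_air * sin(theta_air) = n_water * sin(theta_water).
    Returns the true ray angle in water for an angle measured
    as if the whole path were in air."""
    return math.asin(n_air * math.sin(apparent_angle_rad) / n_water)

# A ray that appears to leave the port at 30 degrees travels at a
# smaller angle in the water, which is one reason uncorrected
# underwater image measurements misestimate dimensions.
theta_air = math.radians(30.0)
theta_water = in_water_angle(theta_air)
```

    Real calibrations must also model dome ports, refraction at the glass itself, and the camera's intrinsic parameters, which is why the review distinguishes implicit from explicit refraction models.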

  14. Accurate simulation of optical properties in dyes.

    PubMed

    Jacquemin, Denis; Perpète, Eric A; Ciofini, Ilaria; Adamo, Carlo

    2009-02-17

    Since Antiquity, humans have produced and commercialized dyes. To this day, extraction of natural dyes often requires lengthy and costly procedures. In the 19th century, global markets and new industrial products drove a significant effort to synthesize artificial dyes, characterized by low production costs, huge quantities, and new optical properties (colors). Dyes that encompass classes of molecules absorbing in the UV-visible part of the electromagnetic spectrum now have a wider range of applications, including coloring (textiles, food, paintings), energy production (photovoltaic cells, OLEDs), or pharmaceuticals (diagnostics, drugs). Parallel to the growth in dye applications, researchers have increased their efforts to design and synthesize new dyes to customize absorption and emission properties. In particular, dyes containing one or more metallic centers allow for the construction of fairly sophisticated systems capable of selectively reacting to light of a given wavelength and behaving as molecular devices (photochemical molecular devices, PMDs).Theoretical tools able to predict and interpret the excited-state properties of organic and inorganic dyes allow for an efficient screening of photochemical centers. In this Account, we report recent developments defining a quantitative ab initio protocol (based on time-dependent density functional theory) for modeling dye spectral properties. In particular, we discuss the importance of several parameters, such as the methods used for electronic structure calculations, solvent effects, and statistical treatments. In addition, we illustrate the performance of such simulation tools through case studies. We also comment on current weak points of these methods and ways to improve them.

  15. Discrete sensors distribution for accurate plantar pressure analyses.

    PubMed

    Claverie, Laetitia; Ille, Anne; Moretto, Pierre

    2016-12-01

    The aim of this study was to determine the distribution of discrete sensors under the footprint for accurate plantar pressure analyses. For this purpose, two different sensor layouts were tested and compared, to determine which was the most accurate for monitoring plantar pressure with wireless devices in research and/or clinical practice. Ten healthy volunteers participated in the study (age range: 23-58 years). The barycenter of pressures (BoP) determined from the plantar pressure system (W-inshoe®) was compared to the center of pressures (CoP) determined from a force platform (AMTI) in the medial-lateral (ML) and anterior-posterior (AP) directions. Then, the vertical ground reaction force (vGRF) obtained from both W-inshoe® and the force platform was compared for both layouts for each subject. The BoP and vGRF determined from the plantar pressure system data showed good correlation (SCC) with those determined from the force platform data, notably for the second sensor layout (ML SCC = 0.95; AP SCC = 0.99; vGRF SCC = 0.91). The study demonstrates that an adjusted placement of removable sensors is key to accurate plantar pressure analyses. These results are promising for plantar pressure recording outside clinical or laboratory settings, for long-term monitoring, real-time feedback, or for any activity requiring a low-cost system.
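    The barycenter of pressures is the pressure-weighted centroid of the discrete sensor positions. A minimal sketch, with made-up sensor coordinates and readings (the paper's actual sensor layouts and units are not reproduced here):

```python
def barycentre_of_pressure(sensors):
    """Pressure-weighted centroid of discrete sensor readings.
    `sensors` is a list of (x, y, pressure) tuples; coordinates
    and pressures are in arbitrary consistent units."""
    total = sum(p for _, _, p in sensors)
    bx = sum(x * p for x, _, p in sensors) / total
    by = sum(y * p for _, y, p in sensors) / total
    return bx, by

# Four sensors; heavier loading on the forefoot pair shifts the
# barycenter forward along y.
readings = [(0.00, 0.00, 10.0), (0.05, 0.00, 10.0),
            (0.00, 0.20, 30.0), (0.05, 0.20, 30.0)]
bx, by = barycentre_of_pressure(readings)
```

    Comparing this quantity against a force platform's center of pressures, as the study does, directly tests whether the chosen sensor placement captures the load distribution.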

  16. Interacting with image hierarchies for fast and accurate object segmentation

    NASA Astrophysics Data System (ADS)

    Beard, David V.; Eberly, David H.; Hemminger, Bradley M.; Pizer, Stephen M.; Faith, R. E.; Kurak, Charles; Livingston, Mark

    1994-05-01

    Object definition is an increasingly important area of medical image research. Accurate and fairly rapid object definition is essential for measuring the size and, perhaps more importantly, the change in size of anatomical objects such as kidneys and tumors. Rapid and fairly accurate object definition is essential for 3D real-time visualization, including both surgery planning and radiation oncology treatment planning. One approach to object definition involves the use of 3D image hierarchies, such as Eberly's Ridge Flow. However, the image hierarchy segmentation approach requires user interaction in selecting regions and subtrees. Further, visualizing and comprehending the anatomy and the selected portions of the hierarchy can be problematic. In this paper we describe the Magic Crayon tool, which allows a user to rapidly and accurately define various anatomical objects by interacting with image hierarchies such as those generated with Eberly's Ridge Flow algorithm as well as other 3D image hierarchies. Preliminary results suggest that fairly complex anatomical objects can be segmented in under a minute with sufficient accuracy for 3D surgery planning, 3D radiation oncology treatment planning, and similar applications. Potential modifications to the approach for improved accuracy are summarized.

  17. Differential equation based method for accurate approximations in optimization

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn I.; Adelman, Howard M.

    1990-01-01

    This paper describes a method to efficiently and accurately approximate the effect of design changes on structural response. The key to this new method is to interpret sensitivity equations as differential equations that may be solved explicitly for closed form approximations; hence, the method is denoted the Differential Equation Based (DEB) method. Approximations were developed for vibration frequencies, mode shapes and static displacements. The DEB approximation method was applied to a cantilever beam and results compared with the commonly used linear Taylor series approximations and exact solutions. The test calculations involved perturbing the height, width, cross-sectional area, tip mass, and bending inertia of the beam. The DEB method proved to be very accurate, and in most cases was more accurate than the linear Taylor series approximation. The method is applicable to simultaneous perturbation of several design variables. Also, the approximations may be used to calculate other system response quantities. For example, the approximations for displacement are used to approximate bending stresses.
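    The DEB idea can be illustrated on a hypothetical power-law response (many beam responses vary as powers of a sizing variable): interpreting the sensitivity at the baseline as the ODE df/dx = (n/x) f gives the closed form f0*(x/x0)^n, which stays accurate far beyond the linear Taylor approximation. This is an illustrative caricature under that power-law assumption, not the paper's derivation for frequencies or mode shapes.

```python
def deb_approx(f0, dfdx0, x0, x):
    """DEB-style closed-form approximation: read the baseline
    sensitivity as the ODE df/dx = (n/x) f with n = x0*dfdx0/f0,
    whose solution is f ~ f0 * (x/x0)**n."""
    n = x0 * dfdx0 / f0
    return f0 * (x / x0) ** n

def taylor_approx(f0, dfdx0, x0, x):
    """Commonly used first-order Taylor approximation about x0."""
    return f0 + dfdx0 * (x - x0)

# Hypothetical response varying as x**1.5 (e.g. a quantity growing
# with beam height).  The DEB form reproduces it exactly, while the
# linear Taylor approximation degrades as the perturbation grows.
exact = lambda x: x ** 1.5
x0 = 1.0
f0, dfdx0 = exact(x0), 1.5 * x0 ** 0.5
x = 1.4                       # a 40% design change
```

    Both approximations need only the baseline value and sensitivity, so the DEB form costs no extra analysis yet captures the curvature the linear series misses.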

  18. Dynamical correction of control laws for marine ships' accurate steering

    NASA Astrophysics Data System (ADS)

    Veremey, Evgeny I.

    2014-06-01

    The objective of this work is the analytical synthesis problem for the design of marine vehicle autopilots. Despite numerous known methods of solution, the problem remains complicated due to the extensive set of dynamical conditions, requirements and restrictions that must be satisfied by the appropriate choice of a steering control law. The aim of this paper is to simplify the synthesis procedure while providing accurate steering with desirable dynamics of the control system. The approach proposed here is based on a special unified multipurpose control law structure that allows a synthesis to be decoupled into simpler particular optimization problems. In particular, this structure includes a dynamical corrector to support the desirable features of the vehicle's motion under the action of sea wave disturbances. As a result, a new specialized method for the corrector design is proposed to provide accurate steering or a trade-off between accurate and economical steering of the ship. This method guarantees a certain flexibility of the control law with respect to the actual sailing environment; the corresponding tuning can be realized in real time onboard.

  19. Accurate modelling of unsteady flows in collapsible tubes.

    PubMed

    Marchandise, Emilie; Flaud, Patrice

    2010-01-01

    The context of this paper is the development of a general and efficient numerical haemodynamic tool to help clinicians and researchers in understanding physiological flow phenomena. We propose an accurate one-dimensional Runge-Kutta discontinuous Galerkin (RK-DG) method coupled with lumped parameter models for the boundary conditions. The suggested model has already been successfully applied to haemodynamics in arteries and is now extended to flow in collapsible tubes such as veins. The main difference from cardiovascular simulations is that the flow may become supercritical and elastic jumps may appear, with the numerical consequence that the scheme may not remain monotone if no limiting procedure is introduced. We show that our second-order RK-DG method, equipped with an approximate Roe's Riemann solver and a slope-limiting procedure, allows us to capture elastic jumps accurately. Moreover, this paper demonstrates that the complex physics associated with such flows is more accurately modelled than with traditional methods such as finite difference or finite volume methods. We present various benchmark problems that show the flexibility and applicability of the numerical method. Our solutions are compared with analytical solutions when they are available and with solutions obtained using other numerical methods. Finally, to illustrate the clinical interest, we study the emptying process in a calf vein squeezed by contracting skeletal muscle in a normal and a pathological subject. We compare our results with experimental simulations and discuss the sensitivity of our model to its parameters.

  20. A robust and accurate formulation of molecular and colloidal electrostatics.

    PubMed

    Sun, Qiang; Klaseboer, Evert; Chan, Derek Y C

    2016-08-07

    This paper presents a re-formulation of the boundary integral method for the Debye-Hückel model of molecular and colloidal electrostatics that removes the mathematical singularities that have to date been accepted as an intrinsic part of the conventional boundary integral equation method. The essence of the present boundary regularized integral equation formulation consists of subtracting a known solution from the conventional boundary integral method in such a way as to cancel out the singularities associated with the Green's function. This approach better reflects the non-singular physical behavior of the systems on boundaries with the benefits of the following: (i) the surface integrals can be evaluated accurately using quadrature without any need to devise special numerical integration procedures, (ii) being able to use quadratic or spline function surface elements to represent the surface more accurately and the variation of the functions within each element is represented to a consistent level of precision by appropriate interpolation functions, (iii) being able to calculate electric fields, even at boundaries, accurately and directly from the potential without having to solve hypersingular integral equations and this imparts high precision in calculating the Maxwell stress tensor and consequently, intermolecular or colloidal forces, (iv) a reliable way to handle geometric configurations in which different parts of the boundary can be very close together without being affected by numerical instabilities, therefore potentials, fields, and forces between surfaces can be found accurately at surface separations down to near contact, and (v) having the simplicity of a formulation that does not require complex algorithms to handle singularities will result in significant savings in coding effort and in the reduction of opportunities for coding errors. These advantages are illustrated using examples drawn from molecular and colloidal electrostatics.

  1. A robust and accurate formulation of molecular and colloidal electrostatics

    NASA Astrophysics Data System (ADS)

    Sun, Qiang; Klaseboer, Evert; Chan, Derek Y. C.

    2016-08-01

    This paper presents a re-formulation of the boundary integral method for the Debye-Hückel model of molecular and colloidal electrostatics that removes the mathematical singularities that have to date been accepted as an intrinsic part of the conventional boundary integral equation method. The essence of the present boundary regularized integral equation formulation consists of subtracting a known solution from the conventional boundary integral method in such a way as to cancel out the singularities associated with the Green's function. This approach better reflects the non-singular physical behavior of the systems on boundaries with the benefits of the following: (i) the surface integrals can be evaluated accurately using quadrature without any need to devise special numerical integration procedures, (ii) being able to use quadratic or spline function surface elements to represent the surface more accurately and the variation of the functions within each element is represented to a consistent level of precision by appropriate interpolation functions, (iii) being able to calculate electric fields, even at boundaries, accurately and directly from the potential without having to solve hypersingular integral equations and this imparts high precision in calculating the Maxwell stress tensor and consequently, intermolecular or colloidal forces, (iv) a reliable way to handle geometric configurations in which different parts of the boundary can be very close together without being affected by numerical instabilities, therefore potentials, fields, and forces between surfaces can be found accurately at surface separations down to near contact, and (v) having the simplicity of a formulation that does not require complex algorithms to handle singularities will result in significant savings in coding effort and in the reduction of opportunities for coding errors. These advantages are illustrated using examples drawn from molecular and colloidal electrostatics.
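    The subtraction trick at the heart of the boundary-regularized formulation can be shown in one dimension: subtract a known singular solution so that ordinary quadrature handles what remains. The example below regularizes the model integral of f(t)/sqrt(t) on (0, 1]; it is a caricature of the idea, not the paper's boundary integral equations.

```python
import math

def naive_midpoint(f, n=200):
    """Midpoint rule applied directly to the singular integrand
    f(t)/sqrt(t) on (0, 1]; converges, but slowly near t = 0."""
    h = 1.0 / n
    return sum(f((i + 0.5) * h) / math.sqrt((i + 0.5) * h) * h
               for i in range(n))

def regularized_midpoint(f, n=200):
    """Subtract the known singular part f(0)/sqrt(t), whose integral
    over [0, 1] is exactly 2, and integrate the remaining bounded
    function with the same plain quadrature."""
    h = 1.0 / n
    smooth = 0.0
    for i in range(n):
        t = (i + 0.5) * h
        smooth += (f(t) - f(0.0)) / math.sqrt(t) * h
    return 2.0 * f(0.0) + smooth

f = lambda t: 1.0 + t        # exact value of the integral: 2 + 2/3 = 8/3
```

    The regularized version reaches much higher accuracy with the same number of quadrature points, mirroring benefit (i) claimed for the boundary-regularized integral equation formulation.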

  2. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification analysis, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and neurodegenerative diseases, including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including both label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (label-free, MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to

  3. Recapturing Quantitative Biology.

    ERIC Educational Resources Information Center

    Pernezny, Ken; And Others

    1996-01-01

    Presents a classroom activity on estimating animal populations. Uses shoe boxes and candies to emphasize the importance of mathematics in biology while introducing the methods of quantitative ecology. (JRH)
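    A shoe-box-and-candies population estimate is typically a mark-recapture exercise; assuming the classic Lincoln-Petersen estimator is the intended method (the record does not say), the arithmetic looks like this:

```python
def lincoln_petersen(marked, second_sample, recaptured):
    """Mark-recapture population estimate: N ~ M * C / R, where M
    'animals' were marked and released, C were drawn in a second
    sample, and R of those carried marks.  In the classroom version
    the animals are candies in a shoe box."""
    if recaptured == 0:
        raise ValueError("no recaptures: estimate is undefined")
    return marked * second_sample / recaptured

# Mark 20 candies, mix, draw 30, find 6 marked -> estimated population 100
estimate = lincoln_petersen(20, 30, 6)
```

    The estimator rests on the marked fraction of the second sample matching the marked fraction of the whole population, which is exactly the proportional-reasoning step that ties the biology to the mathematics.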

  4. A Quantitative Gas Chromatographic Ethanol Determination.

    ERIC Educational Resources Information Center

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)
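    Quantitative GC determinations of this kind are commonly worked up with an internal-standard response factor; the sketch below assumes that scheme (the record does not specify the exercise's exact calculation) and uses made-up peak areas and concentrations.

```python
def response_factor(area_analyte, area_istd, conc_analyte, conc_istd):
    """Relative response factor from a calibration standard:
    RF = (A_analyte / A_istd) / (C_analyte / C_istd)."""
    return (area_analyte / area_istd) / (conc_analyte / conc_istd)

def ethanol_percent(area_analyte, area_istd, conc_istd, rf):
    """Volume-percent ethanol in an unknown, from its peak-area
    ratio and the calibrated response factor."""
    return (area_analyte / area_istd) / rf * conc_istd

# Calibration: 5.0 vol% ethanol with 2.0 vol% internal standard
rf = response_factor(1500.0, 800.0, 5.0, 2.0)
# Unknown run against the same internal standard concentration
unknown = ethanol_percent(1200.0, 820.0, 2.0, rf)
```

    Ratioing against an internal standard cancels injection-volume variability, which is how an experiment like this reaches the quoted ~2% accuracy and precision.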

  5. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    Quantitative receptor autoradiography addresses the topic of technical and scientific advances in the sphere of quantitative autoradiography. The volume opens with an overview of the field from a historical and critical perspective. Following is a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  6. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
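    A minimal instance of multivariate quantitation is solving for two component proportions by least squares across several measurement channels. The sketch below uses made-up pure-component responses and a simple normal-equations solve; the actual technique in the record may use a more elaborate calibration model.

```python
def two_component_fit(responses, pure_a, pure_b):
    """Least-squares proportions (a, b) such that a*pure_a + b*pure_b
    best matches the measured responses across channels, via the
    normal equations for a two-column design matrix."""
    saa = sum(x * x for x in pure_a)
    sbb = sum(x * x for x in pure_b)
    sab = sum(x * y for x, y in zip(pure_a, pure_b))
    sra = sum(r * x for r, x in zip(responses, pure_a))
    srb = sum(r * y for r, y in zip(responses, pure_b))
    det = saa * sbb - sab * sab
    return ((sra * sbb - srb * sab) / det,
            (srb * saa - sra * sab) / det)

# Pure-component responses at four channels (illustrative numbers)
A = [1.0, 0.5, 0.2, 0.0]
B = [0.1, 0.4, 0.9, 1.0]
mix = [0.7 * a + 0.3 * b for a, b in zip(A, B)]   # a 70/30 blend
a, b = two_component_fit(mix, A, B)
```

    Using more channels than components is what makes the fit overdetermined and robust to noise, the property that also makes such methods suitable for process monitoring.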

  7. Accurate glucose detection in a small etalon

    NASA Astrophysics Data System (ADS)

    Martini, Joerg; Kuebler, Sebastian; Recht, Michael; Torres, Francisco; Roe, Jeffrey; Kiesel, Peter; Bruce, Richard

    2010-02-01

    We are developing a continuous glucose monitor for subcutaneous long-term implantation. This detector contains a double-chamber Fabry-Perot etalon that measures the differential refractive index (RI) between a reference and a measurement chamber at 850 nm. The etalon chambers have wavelength-dependent transmission maxima whose spectral positions depend linearly on the RI of their contents. An RI difference of Δn = 1.5×10⁻⁶ changes the spectral position of a transmission maximum by 1 pm in our measurement. By sweeping the wavelength of a single-mode Vertical-Cavity Surface-Emitting Laser (VCSEL) linearly in time and detecting the maximum transmission peaks of the etalon we are able to measure the RI of a liquid. We have demonstrated an accuracy of Δn = ±3.5×10⁻⁶ over a Δn range of 0 to 1.75×10⁻⁴ and an accuracy of 2% over a Δn range of 1.75×10⁻⁴ to 9.8×10⁻⁴. The accuracy is primarily limited by the reference measurement. The RI difference between the etalon chambers is made specific to glucose by the competitive, reversible release of Concanavalin A (ConA) from an immobilized dextran matrix. The matrix, and the ConA bound to it, are positioned outside the optical detection path. ConA is released from the matrix by reacting with glucose and diffuses into the optical path to change the RI in the etalon. Factors such as temperature affect the RI in the measurement and detection chambers equally and therefore do not affect the differential measurement. A typical standard deviation in RI is ±1.4×10⁻⁶ over the range 32 °C to 42 °C. The detector enables an accurate glucose-specific concentration measurement.
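    The record's linear sensitivity (a Δn of 1.5×10⁻⁶ per 1 pm of peak shift) makes the readout a single multiplication; a minimal sketch of that conversion, with an illustrative measured shift:

```python
def refractive_index_change(peak_shift_pm, dn_per_pm=1.5e-6):
    """Convert a measured shift of an etalon transmission maximum
    (in picometres) into a differential refractive index, using the
    linear sensitivity Δn = 1.5e-6 per 1 pm of peak shift."""
    return peak_shift_pm * dn_per_pm

# An illustrative 20 pm shift between the reference and measurement
# chambers corresponds to a differential RI of 3e-5.
dn = refractive_index_change(20.0)
```

    Mapping the resulting Δn onto glucose concentration additionally requires the ConA/dextran competitive-binding calibration, which the differential optics alone does not provide.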

  8. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    and IKONOS imagery and the 3-D volume estimates. The combination of these then allow for a rapid and hopefully very accurate estimation of biomass.

  9. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering was presented as the origin of the LSE, this paper presents an alternative cause. To this end, the LSE of two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) with Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range of 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% deviation for pixels at the extreme lateral position. Light polarization due to film and the scanner’s optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We concluded that any Gafchromic EBT-type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry requires correction of the LSE, and therefore determination of the LSE per color channel and per dose delivered to the film.

  10. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines and many large capability resources including ASCI Red and RedStorm were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  11. Quantitative photoacoustic tomography based on the radiative transfer equation.

    PubMed

    Yao, Lei; Sun, Yao; Jiang, Huabei

    2009-06-15

    We describe a method for quantitative photoacoustic tomography (PAT) based on the radiative transfer equation (RTE) coupled with the Helmholtz photoacoustic wave equation. This RTE-based quantitative PAT allows for accurate recovery of absolute absorption coefficient images of heterogeneous media and provides significantly improved image reconstruction for the cases where the photon diffusion approximation may fail. The method and associated finite element reconstruction algorithm are validated using a series of tissuelike phantom experiments.

  12. Quantitative DNA Fiber Mapping

    SciTech Connect

    Lu, Chun-Mei; Wang, Mei; Greulich-Bode, Karin M.; Weier, Jingly F.; Weier, Heinz-Ulli G.

    2008-01-28

    Several hybridization-based methods used to delineate single copy or repeated DNA sequences in larger genomic intervals take advantage of the increased resolution and sensitivity of free chromatin, i.e., chromatin released from interphase cell nuclei. Quantitative DNA fiber mapping (QDFM) differs from the majority of these methods in that it applies FISH to purified, clonal DNA molecules which have been bound with at least one end to a solid substrate. The DNA molecules are then stretched by the action of a receding meniscus at the water-air interface, resulting in DNA molecules stretched homogeneously to about 2.3 kb/µm. When non-isotopically, multicolor-labeled probes are hybridized to these stretched DNA fibers, their respective binding sites are visualized in the fluorescence microscope, their relative distance can be measured and converted into kilobase pairs (kb). The QDFM technique has found useful applications ranging from the detection and delineation of deletions or overlap between linked clones to the construction of high-resolution physical maps to studies of stalled DNA replication and transcription.
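    With the homogeneous stretching factor of about 2.3 kb/µm quoted in the record, converting a measured inter-signal distance into sequence length is a single multiplication; a minimal sketch with an illustrative measurement:

```python
def fiber_distance_kb(microns, stretch_kb_per_um=2.3):
    """Convert a distance measured between two hybridization signals
    on a stretched DNA fiber (in µm) into kilobase pairs, using the
    homogeneous stretching factor of ~2.3 kb/µm."""
    return microns * stretch_kb_per_um

# Signals measured 15 µm apart correspond to roughly 34.5 kb
gap_kb = fiber_distance_kb(15.0)
```

    The accuracy of such a conversion rests on the stretching being homogeneous, which is exactly what the receding-meniscus preparation is designed to achieve.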

  13. Rapid and Accurate T2 Mapping from Multi Spin Echo Data Using Bloch-Simulation-Based Reconstruction

    PubMed Central

    Ben-Eliezer, Noam; Sodickson, Daniel K; Block, Tobias Kai

    2014-01-01

    Purpose Quantitative T2-relaxation-based contrast has the potential to provide valuable clinical information. Practical T2-mapping, however, is impaired either by prohibitively long acquisition times or by contamination of fast multi-echo protocols by stimulated and indirect echoes. This work presents a novel post-processing approach aiming to overcome the common penalties associated with multi-echo protocols, and enabling rapid and accurate mapping of T2 relaxation values. Methods Bloch simulations are used to estimate the actual echo modulation curve (EMC) in a multi spin-echo experiment. Simulations are repeated for a range of T2 values and transmit field scales, yielding a database of simulated EMCs, which is then used to identify the T2 value whose EMC most closely matches the experimentally measured data at each voxel. Results T2 maps of both phantom and in vivo scans were successfully reconstructed, closely matching maps produced from single spin-echo data. Results were consistent over the physiological range of T2 values and across different experimental settings. Conclusion The proposed technique allows accurate T2 mapping in clinically feasible scan times, free of user- and scanner-dependent variations, while providing a comprehensive framework that can be extended to model other parameters (e.g., T1, B1+, B0, diffusion) and support arbitrary acquisition schemes. PMID:24648387
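    The dictionary-matching step described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: a mono-exponential decay stands in for the full Bloch-simulated EMC, and the T2 grid and echo times below are invented for the example.

```python
import numpy as np

def simulate_emc(t2, echo_times):
    """Stand-in for a Bloch-simulated echo modulation curve (EMC)."""
    return np.exp(-echo_times / t2)

def match_t2(signal, t2_grid, echo_times):
    """Return the grid T2 whose normalized EMC best fits the measured signal."""
    signal = signal / np.linalg.norm(signal)
    best_t2, best_err = None, np.inf
    for t2 in t2_grid:
        emc = simulate_emc(t2, echo_times)
        emc = emc / np.linalg.norm(emc)          # scale-invariant comparison
        err = np.sum((signal - emc) ** 2)
        if err < best_err:
            best_t2, best_err = t2, err
    return best_t2

echo_times = np.arange(10, 330, 10.0)            # echo times in ms (invented)
t2_grid = np.arange(20, 301, 1.0)                # candidate T2 values in ms
measured = 0.8 * np.exp(-echo_times / 85.0)      # synthetic voxel, T2 = 85 ms
print(match_t2(measured, t2_grid, echo_times))   # -> 85.0
```

    The real method replaces the mono-exponential with simulated curves that also vary over transmit-field scales, which is what lets it absorb stimulated- and indirect-echo contamination.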

  14. Comparative Application of PLS and PCR Methods to Simultaneous Quantitative Estimation and Simultaneous Dissolution Test of Zidovudine - Lamivudine Tablets.

    PubMed

    Üstündağ, Özgür; Dinç, Erdal; Özdemir, Nurten; Tilkan, M Günseli

    2015-01-01

    In the development of new and generic drug products, the simultaneous in-vitro dissolution behavior of oral dosage formulations is the most important indicator of the efficiency and biopharmaceutical characteristics of drug substances. This drives scientists in the field to develop more powerful analytical methods that deliver reliable, precise and accurate results in the quantitative analysis and dissolution testing of drug formulations. In this context, two chemometric tools, partial least squares (PLS) and principal component regression (PCR), were developed for the simultaneous quantitative estimation and dissolution testing of zidovudine (ZID) and lamivudine (LAM) in a tablet dosage form. The results obtained in this study strongly support the use of these methods for the quality control, routine analysis and dissolution testing of marketed tablets containing the ZID and LAM drugs.

  15. Accurate eQTL prioritization with an ensemble-based framework.

    PubMed

    Zeng, Haoyang; Edwards, Matthew D; Guo, Yuchun; Gifford, David K

    2017-02-21

    We present a novel ensemble-based computational framework, EnsembleExpr, that achieved the best performance in the Fourth Critical Assessment of Genome Interpretation (CAGI4) "eQTL-causal SNPs" challenge for identifying eQTLs and prioritizing their gene expression effects. Expression quantitative trait loci (eQTLs) are genome sequence variants that result in gene expression changes and thus are prime suspects in the search for contributions to the causality of complex traits. When EnsembleExpr is trained on data from massively parallel reporter assays (MPRA) it accurately predicts reporter expression levels from unseen regulatory sequences and identifies sequence variants that exhibit significant changes in reporter expression. Compared with other state-of-the-art methods, EnsembleExpr achieved competitive performance when applied on eQTL datasets determined by other protocols. We envision EnsembleExpr to be a resource to help interpret non-coding regulatory variants and prioritize disease-associated mutations for downstream validation.

  16. Accurate registration of coronary arteries for volumetric CT digital subtraction angiography

    NASA Astrophysics Data System (ADS)

    Razeto, Marco; Matthews, James; Masood, Saad; Steel, Jill; Arakita, Kazumasa

    2013-03-01

    In the diagnosis of coronary artery disease with coronary computed tomography angiography, accurate evaluation remains challenging in the presence of calcifications or stents. Volumetric CT Digital Subtraction Angiography is a novel method that may become a powerful tool to overcome these limitations. However, precise registration of structures is essential, as even small misalignments can produce striking and disruptive bright and dark artefacts. Moreover, for clinical acceptance, the tool should require minimal user interaction and fast turnaround, thereby raising several challenges. In this paper we address the problem with a registration method based on a global non-rigid step, followed by local rigid refinement. In our quantitative analysis based on 10 datasets, each consisting of a pair of pre- and post-contrast scans of the same patient, we achieve an average Target Registration Error of 0.45 mm. Runtimes are less than 90 seconds for the global step, while each local refinement takes less than 15 seconds to run. Initial clinical evaluation shows good results in cases of moderate calcification, and indicates that around 50% of severely calcified and previously non-interpretable cases have been made interpretable by application of our method.
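    The Target Registration Error quoted above is the mean Euclidean distance between corresponding landmark points after registration. A minimal sketch with invented 3-D landmark coordinates:

```python
import numpy as np

def tre(fixed_landmarks, registered_landmarks):
    """Mean Euclidean distance between corresponding landmarks (same units as input)."""
    return float(np.mean(np.linalg.norm(fixed_landmarks - registered_landmarks, axis=1)))

# Two synthetic landmark pairs, in mm: residual offsets of 0.3 and 0.4 mm.
fixed = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
registered = np.array([[0.3, 0.0, 0.0], [10.0, 0.4, 0.0]])
print(round(tre(fixed, registered), 2))  # -> 0.35
```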

  17. Hydration free energies of cyanide and hydroxide ions from molecular dynamics simulations with accurate force fields

    USGS Publications Warehouse

    Lee, M.W.; Meuwly, M.

    2013-01-01

    The evaluation of hydration free energies is a sensitive test to assess force fields used in atomistic simulations. We showed recently that the vibrational relaxation times, 1D- and 2D-infrared spectroscopies for CN(-) in water can be quantitatively described from molecular dynamics (MD) simulations with multipolar force fields and slightly enlarged van der Waals radii for the C- and N-atoms. To validate such an approach, the present work investigates the solvation free energy of cyanide in water using MD simulations with accurate multipolar electrostatics. It is found that larger van der Waals radii are indeed necessary to obtain results close to the experimental values when a multipolar force field is used. For CN(-), the van der Waals radii refined in our previous work yield a hydration free energy between -72.0 and -77.2 kcal mol(-1), in excellent agreement with the experimental data. In addition to the cyanide ion, we also study the hydroxide ion to show that the method used here is readily applicable to similar systems. Hydration free energies are found to depend sensitively on the intermolecular interactions, while bonded interactions are, as expected, less important. We also investigate the possibility of applying the multipolar force field to score trajectories generated using computationally inexpensive methods, which should be useful in broader parametrization studies with reduced computational resources, as scoring is much faster than the generation of the trajectories.
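    The abstract does not state how the free energies were computed; as background, one common route is thermodynamic integration, where the ensemble average of dU/dλ is sampled at several coupling values λ and integrated. A sketch with a synthetic, linear dU/dλ curve (not CN(-) simulation data):

```python
import numpy as np

lambdas = np.linspace(0.0, 1.0, 11)         # coupling parameter, 0 = decoupled
du_dlambda = -150.0 * lambdas               # synthetic <dU/dlambda>, kcal/mol

# Trapezoidal integration of <dU/dlambda> over lambda gives delta G.
widths = np.diff(lambdas)
delta_g = float(np.sum(widths * (du_dlambda[1:] + du_dlambda[:-1]) / 2.0))
print(round(delta_g, 1))                    # -> -75.0  (within the paper's reported range)
```

    In practice each <dU/dλ> point is itself an average over an equilibrated MD trajectory, which is where the force-field quality enters.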

  18. Sensitive and accurate quantification of human malaria parasites using droplet digital PCR (ddPCR)

    PubMed Central

    Koepfli, Cristian; Nguitragool, Wang; Hofmann, Natalie E.; Robinson, Leanne J.; Ome-Kaius, Maria; Sattabongkot, Jetsumon; Felger, Ingrid; Mueller, Ivo

    2016-01-01

    Accurate quantification of parasite density in the human host is essential for understanding the biology and pathology of malaria. Semi-quantitative molecular methods are widely applied, but the need for an external standard curve makes it difficult to compare parasite density estimates across studies. Droplet digital PCR (ddPCR) allows direct quantification without the need for a standard curve. ddPCR was used to diagnose and quantify P. falciparum and P. vivax in clinical patients as well as in asymptomatic samples. ddPCR yielded highly reproducible measurements across the range of parasite densities observed in humans, and showed higher sensitivity than qPCR to diagnose P. falciparum, and equal sensitivity for P. vivax. Correspondence in quantification was very high (>0.95) between qPCR and ddPCR. Quantification between technical replicates by ddPCR differed 1.5–1.7-fold, compared to 2.4–6.2-fold by qPCR. ddPCR facilitates parasite quantification for studies where absolute densities are required, and will increase comparability of results reported from different laboratories. PMID:27982132
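    The reason ddPCR needs no standard curve is that the target concentration follows analytically from the fraction of negative droplets via Poisson statistics. A minimal sketch; the droplet volume is a typical value for common ddPCR systems, not a number from this paper:

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Copies per microliter from droplet counts via Poisson correction."""
    neg_fraction = (total - positive) / total
    copies_per_droplet = -math.log(neg_fraction)   # Poisson mean (lambda)
    return copies_per_droplet / droplet_volume_ul

# 4000 positive droplets out of 20000, ~0.85 nL droplets (assumed volume):
print(round(ddpcr_concentration(4000, 20000), 1))  # -> 262.5 copies/uL
```

    The Poisson correction accounts for droplets that received more than one template copy, which simple positive-droplet counting would miss.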

  19. In vivo validation of quantitative frequency domain fluorescence tomography

    NASA Astrophysics Data System (ADS)

    Lin, Yuting; Ghijsen, Michael; Nalcioglu, Orhan; Gulsen, Gultekin

    2012-12-01

    We have developed a hybrid frequency domain fluorescence tomography and magnetic resonance imaging system (MRI) for small animal imaging. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration and lifetime images using a multi-modality approach. In vivo experiments are undertaken to evaluate the system. We compare the recovered fluorescence parameters with and without MRI structural a priori information. In addition, we compare two optical background heterogeneity correction methods: Born normalization and utilizing diffuse optical tomography (DOT) functional a priori information. The results show that the concentration and lifetime of a 4.2-mm diameter indocyanine green inclusion located 15 mm deep inside a rat can be recovered with less than a 5% error when functional a priori information from DOT and structural a priori information from MRI are utilized.

  20. Mapping methods for computationally efficient and accurate structural reliability

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    Mapping methods are developed to improve the accuracy and efficiency of probabilistic structural analyses with coarse finite element meshes. The mapping methods consist of: (1) deterministic structural analyses with fine (convergent) finite element meshes, (2) probabilistic structural analyses with coarse finite element meshes, (3) the relationship between the probabilistic structural responses from the coarse and fine finite element meshes, and (4) a probabilistic mapping. The results show that the scatter of the probabilistic structural responses and structural reliability can be accurately predicted using a coarse finite element model with proper mapping methods. Therefore, large structures can be analyzed probabilistically using finite element methods.

  1. Beam Profile Monitor With Accurate Horizontal And Vertical Beam Profiles

    DOEpatents

    Havener, Charles C [Knoxville, TN; Al-Rejoub, Riad [Oak Ridge, TN

    2005-12-26

    A widely used scanner device that rotates a single helically shaped wire probe in and out of a particle beam at different beamline positions to give a pair of mutually perpendicular beam profiles is modified by the addition of a second wire probe. As a result, a pair of mutually perpendicular beam profiles is obtained at a first beamline position, and a second pair of mutually perpendicular beam profiles is obtained at a second beamline position. The simple modification not only provides more accurate beam profiles, but also provides a measurement of the beam divergence and quality in a single compact device.

  2. Quantitative Luminescence Imaging System

    SciTech Connect

    Batishko, C.R.; Stahl, K.A.; Fecht, B.A.

    1992-12-31

    The goal of the MEASUREMENT OF CHEMILUMINESCENCE project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  3. Quantitative luminescence imaging system

    NASA Astrophysics Data System (ADS)

    Batishko, C. R.; Stahl, K. A.; Fecht, B. A.

    The goal of the Measurement of Chemiluminescence project is to develop and deliver a suite of imaging radiometric instruments for measuring spatial distributions of chemiluminescence. Envisioned deliverables include instruments working at the microscopic, macroscopic, and life-sized scales. Both laboratory and field portable instruments are envisioned. The project also includes development of phantoms as enclosures for the diazoluminomelanin (DALM) chemiluminescent chemistry. A suite of either phantoms in a variety of typical poses, or phantoms that could be adjusted to a variety of poses, is envisioned. These are to include small mammals (rats), mid-sized mammals (monkeys), and human body parts. A complete human phantom that can be posed is a long-term goal of the development. Taken together, the chemistry and instrumentation provide a means for imaging rf dosimetry based on chemiluminescence induced by the heat resulting from rf energy absorption. The first delivered instrument, the Quantitative Luminescence Imaging System (QLIS), resulted in a patent, and an R&D Magazine 1991 R&D 100 award, recognizing it as one of the 100 most significant technological developments of 1991. The current status of the project is that three systems have been delivered, several related studies have been conducted, two preliminary human hand phantoms have been delivered, system upgrades have been implemented, and calibrations have been maintained. Current development includes sensitivity improvements to the microscope-based system; extension of the large-scale (potentially life-sized targets) system to field portable applications; extension of the 2-D large-scale system to 3-D measurement; imminent delivery of a more refined human hand phantom and a rat phantom; rf, thermal and imaging subsystem integration; and continued calibration and upgrade support.

  4. Accurate quantification of TiO2 nanoparticles collected on air filters using a microwave-assisted acid digestion method

    PubMed Central

    Mudunkotuwa, Imali A.; Anthony, T. Renée; Grassian, Vicki H.; Peters, Thomas M.

    2016-01-01

    Titanium dioxide (TiO2) particles, including nanoparticles with diameters smaller than 100 nm, are used extensively in consumer products. In a 2011 current intelligence bulletin, the National Institute of Occupational Safety and Health (NIOSH) recommended methods to assess worker exposures to fine and ultrafine TiO2 particles and associated occupational exposure limits for these particles. However, there are several challenges and problems encountered with these recommended exposure assessment methods involving the accurate quantitation of titanium dioxide collected on air filters using acid digestion followed by inductively coupled plasma optical emission spectroscopy (ICP-OES). Specifically, recommended digestion methods include the use of chemicals, such as perchloric acid, which are typically unavailable in most accredited industrial hygiene laboratories due to highly corrosive and oxidizing properties. Other alternative methods that are used typically involve the use of nitric acid or combination of nitric acid and sulfuric acid, which yield very poor recoveries for titanium dioxide. Therefore, given the current state of the science, it is clear that a new method is needed for exposure assessment. In this current study, a microwave-assisted acid digestion method has been specifically designed to improve the recovery of titanium in TiO2 nanoparticles for quantitative analysis using ICP-OES. The optimum digestion conditions were determined by changing several variables including the acids used, digestion time, and temperature. Consequently, the optimized digestion temperature of 210°C with concentrated sulfuric and nitric acid (2:1 v/v) resulted in a recovery of >90% for TiO2. The method is expected to provide for a more accurate quantification of airborne TiO2 particles in the workplace environment. PMID:26181824

  5. Accurate quantification of TiO2 nanoparticles collected on air filters using a microwave-assisted acid digestion method.

    PubMed

    Mudunkotuwa, Imali A; Anthony, T Renée; Grassian, Vicki H; Peters, Thomas M

    2016-01-01

    Titanium dioxide (TiO(2)) particles, including nanoparticles with diameters smaller than 100 nm, are used extensively in consumer products. In a 2011 current intelligence bulletin, the National Institute of Occupational Safety and Health (NIOSH) recommended methods to assess worker exposures to fine and ultrafine TiO(2) particles and associated occupational exposure limits for these particles. However, there are several challenges and problems encountered with these recommended exposure assessment methods involving the accurate quantitation of titanium dioxide collected on air filters using acid digestion followed by inductively coupled plasma optical emission spectroscopy (ICP-OES). Specifically, recommended digestion methods include the use of chemicals, such as perchloric acid, which are typically unavailable in most accredited industrial hygiene laboratories due to highly corrosive and oxidizing properties. Other alternative methods that are used typically involve the use of nitric acid or combination of nitric acid and sulfuric acid, which yield very poor recoveries for titanium dioxide. Therefore, given the current state of the science, it is clear that a new method is needed for exposure assessment. In this current study, a microwave-assisted acid digestion method has been specifically designed to improve the recovery of titanium in TiO(2) nanoparticles for quantitative analysis using ICP-OES. The optimum digestion conditions were determined by changing several variables including the acids used, digestion time, and temperature. Consequently, the optimized digestion temperature of 210°C with concentrated sulfuric and nitric acid (2:1 v/v) resulted in a recovery of >90% for TiO(2). The method is expected to provide for a more accurate quantification of airborne TiO(2) particles in the workplace environment.

  6. Non-linear effects in quantitative 2D NMR of polysaccharides: pitfalls and how to avoid them.

    PubMed

    Martineau, Estelle; El Khantache, Kamel; Pupier, Marion; Sepulcri, Patricia; Akoka, Serge; Giraudeau, Patrick

    2015-04-10

    Quantitative 2D NMR is a powerful analytical tool which is widely used to determine the concentration of small molecules in complex samples. Due to the site-specific response of the 2D NMR signal, the determination of absolute concentrations requires the use of a calibration or standard addition approach, where the analyte acts as its own reference. Standard addition methods, where the targeted sample is gradually spiked with known amounts of the targeted analyte, are particularly well-suited for quantitative 2D NMR of small molecules. This paper explores the potential of such quantitative 2D NMR approaches for the quantitative analysis of a high-molecular-weight polysaccharide. The results highlight that the standard addition method leads to a strong under-estimation of the target concentration, regardless of the 2D NMR pulse sequence used. Diffusion measurements show that a change in the macromolecular organization of the studied polysaccharide is the most probable hypothesis to explain the non-linear evolution of the 2D NMR signal with concentration. In spite of this non-linearity (a detailed explanation of which is beyond the scope of this paper), we demonstrate that accurate quantitative results can still be obtained, provided that an external calibration is performed with a wide range of concentrations surrounding the target value. This study opens the way to a number of studies where 2D NMR is needed for the quantitative analysis of macromolecules.

  7. Machine Learning of Parameters for Accurate Semiempirical Quantum Chemical Calculations

    PubMed Central

    2015-01-01

    We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules. PMID:26146493

  8. Machine learning of parameters for accurate semiempirical quantum chemical calculations

    DOE PAGES

    Dral, Pavlo O.; von Lilienfeld, O. Anatole; Thiel, Walter

    2015-04-14

    We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules.

  9. Machine learning of parameters for accurate semiempirical quantum chemical calculations

    SciTech Connect

    Dral, Pavlo O.; von Lilienfeld, O. Anatole; Thiel, Walter

    2015-04-14

    We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules.

  10. Machine Learning of Parameters for Accurate Semiempirical Quantum Chemical Calculations.

    PubMed

    Dral, Pavlo O; von Lilienfeld, O Anatole; Thiel, Walter

    2015-05-12

    We investigate possible improvements in the accuracy of semiempirical quantum chemistry (SQC) methods through the use of machine learning (ML) models for the parameters. For a given class of compounds, ML techniques require sufficiently large training sets to develop ML models that can be used for adapting SQC parameters to reflect changes in molecular composition and geometry. The ML-SQC approach allows the automatic tuning of SQC parameters for individual molecules, thereby improving the accuracy without deteriorating transferability to molecules with molecular descriptors very different from those in the training set. The performance of this approach is demonstrated for the semiempirical OM2 method using a set of 6095 constitutional isomers C7H10O2, for which accurate ab initio atomization enthalpies are available. The ML-OM2 results show improved average accuracy and a much reduced error range compared with those of standard OM2 results, with mean absolute errors in atomization enthalpies dropping from 6.3 to 1.7 kcal/mol. They are also found to be superior to the results from specific OM2 reparameterizations (rOM2) for the same set of isomers. The ML-SQC approach thus holds promise for fast and reasonably accurate high-throughput screening of materials and molecules.
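    One simple way to picture the ML-SQC idea is delta learning: a regression model is trained on the error of a cheap baseline method against reference values, and its prediction corrects new baseline outputs. Everything below is synthetic; the OM2 method and the ab initio reference set are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(1)
descriptors = rng.uniform(0, 1, size=(200, 5))     # toy molecular descriptors
true_vals = descriptors @ np.array([3.0, -1.0, 2.0, 0.5, 1.0])   # "ab initio" reference
# Baseline "semiempirical" method with a systematic, descriptor-dependent error:
baseline = true_vals + descriptors @ np.array([0.4, 0.1, -0.3, 0.2, 0.0])

# Ridge regression on the baseline error (the "delta"):
X, y = descriptors, true_vals - baseline
w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(5), X.T @ y)

corrected = baseline + X @ w
print(np.mean(np.abs(true_vals - corrected))
      < np.mean(np.abs(true_vals - baseline)))     # -> True (MAE drops)
```

    The actual paper goes further: the ML model outputs per-molecule SQC parameters rather than an additive correction, but the training signal (reference-minus-baseline accuracy) is analogous.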

  11. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-05

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications. LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS/MS requires preparing standard curves in the same matrix as the study samples so that the matrix effect and the extraction efficiency are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: standard addition, background subtraction, the surrogate matrix, and the surrogate analyte. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines that ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. In the surrogate matrix approach, by contrast, artificial, stripped, or neat matrices serve as surrogates for the actual matrix of the study samples. For the surrogate analyte approach, similarity in matrix effect and recovery must be demonstrated between the surrogate and authentic endogenous analytes; likewise, for the surrogate matrix approach, similar matrix effect and extraction recovery must be demonstrated in both the surrogate and original matrices. All of these methods quantify endogenous compounds indirectly, and regardless of which approach is followed, it must be shown that none of the validation criteria are compromised by the indirect analysis.
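    The standard-addition approach mentioned above can be sketched numerically: aliquots of the sample are spiked with known analyte amounts, the instrument response is regressed on the spike level, and the endogenous concentration is the magnitude of the x-intercept. The numbers are invented for illustration:

```python
import numpy as np

spikes = np.array([0.0, 1.0, 2.0, 3.0])     # added analyte (concentration units)
response = np.array([2.0, 3.0, 4.0, 5.0])   # measured signal, linear in total analyte

slope, intercept = np.polyfit(spikes, response, 1)  # fit response = slope*spike + intercept
endogenous = intercept / slope               # |x-intercept| = endogenous concentration
print(round(endogenous, 2))                  # -> 2.0
```

    Because the analyte is quantified against itself in its own matrix, matrix effects cancel; the trade-off is that each sample needs its own spiked series.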

  12. Natural bacterial communities serve as quantitative geochemical biosensors

    DOE PAGES

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; ...

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.

  13. Natural bacterial communities serve as quantitative geochemical biosensors

    SciTech Connect

    Smith, Mark B.; Rocha, Andrea M.; Smillie, Chris S.; Olesen, Scott W.; Paradis, Charles; Wu, Liyou; Campbell, James H.; Fortney, Julian L.; Mehlhorn, Tonia L.; Lowe, Kenneth A.; Earles, Jennifer E.; Phillips, Jana; Techtmann, Steve M.; Joyner, Dominique C.; Elias, Dwayne A.; Bailey, Kathryn L.; Hurt, Richard A.; Preheim, Sarah P.; Sanders, Matthew C.; Yang, Joy; Mueller, Marcella A.; Brooks, Scott; Watson, David B.; Zhang, Ping; He, Zhili; Dubinsky, Eric A.; Adams, Paul D.; Arkin, Adam P.; Fields, Matthew W.; Zhou, Jizhong; Alm, Eric J.; Hazen, Terry C.

    2015-05-12

    Biological sensors can be engineered to measure a wide range of environmental conditions. Here we show that statistical analysis of DNA from natural microbial communities can be used to accurately identify environmental contaminants, including uranium and nitrate at a nuclear waste site. In addition to contamination, sequence data from the 16S rRNA gene alone can quantitatively predict a rich catalogue of 26 geochemical features collected from 93 wells with highly differing geochemistry characteristics. We extend this approach to identify sites contaminated with hydrocarbons from the Deepwater Horizon oil spill, finding that altered bacterial communities encode a memory of prior contamination, even after the contaminants themselves have been fully degraded. We show that the bacterial strains that are most useful for detecting oil and uranium are known to interact with these substrates, indicating that this statistical approach uncovers ecologically meaningful interactions consistent with previous experimental observations. Future efforts should focus on evaluating the geographical generalizability of these associations. Taken as a whole, these results indicate that ubiquitous, natural bacterial communities can be used as in situ environmental sensors that respond to and capture perturbations caused by human impacts. These in situ biosensors rely on environmental selection rather than directed engineering, and so this approach could be rapidly deployed and scaled as sequencing technology continues to become faster, simpler, and less expensive. Here we show that DNA from natural bacterial communities can be used as a quantitative biosensor to accurately distinguish unpolluted sites from those contaminated with uranium, nitrate, or oil. These results indicate that bacterial communities can be used as environmental sensors that respond to and capture perturbations caused by human impacts.
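The statistical idea in this record, inferring contamination status from community composition, can be illustrated with a deliberately minimal nearest-centroid classifier over toy 16S relative-abundance vectors. The taxa, abundances, and class labels below are hypothetical placeholders, not the study's data or its actual statistical model:

```python
import math

def centroid(samples):
    # Element-wise mean of a list of equal-length abundance vectors.
    n = len(samples)
    return [sum(s[i] for s in samples) / n for i in range(len(samples[0]))]

def classify(sample, centroids):
    # Assign the sample to the class with the nearest centroid (Euclidean).
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Toy 16S relative-abundance vectors (4 hypothetical taxa) for training wells.
clean = [[0.5, 0.3, 0.1, 0.1], [0.6, 0.2, 0.1, 0.1]]
contaminated = [[0.1, 0.1, 0.4, 0.4], [0.2, 0.1, 0.3, 0.4]]
cents = {"clean": centroid(clean), "contaminated": centroid(contaminated)}

# A community profile resembling the contaminated training wells:
print(classify([0.15, 0.1, 0.35, 0.4], cents))  # contaminated
```

The real study uses far richer models over thousands of taxa, but the structure is the same: community composition in, environmental label out.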

  14. Quantitative cone beam X-ray luminescence tomography/X-ray computed tomography imaging

    SciTech Connect

    Chen, Dongmei; Zhu, Shouping; Chen, Xueli; Chao, Tiantian; Cao, Xu; Zhao, Fengjun; Huang, Liyu; Liang, Jimin

    2014-11-10

    X-ray luminescence tomography (XLT) is an imaging technology based on X-ray-excitable materials. The main purpose of this paper is to obtain quantitative luminescence concentrations using the structural information from X-ray computed tomography (XCT) in a hybrid cone beam XLT/XCT system. A multi-wavelength luminescence cone beam XLT method with structural a priori information is presented to relieve the severe ill-posedness of the cone beam XLT problem. Nanophosphor and phantom experiments were undertaken to assess the linearity of the system response. Then, an in vivo mouse experiment was conducted. The in vivo results show that a recovered concentration error as low as 6.67%, with a location error of 0.85 mm, can be achieved. The results demonstrate that the proposed method can accurately recover the nanophosphor inclusion and realize quantitative imaging.

  15. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system was demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, function and two reference genes of RRS were amplified simultaneously by multiplex PCR. The multiplex PCR products were then labeled with acridinium ester at the 5'-terminal through an amino modification and analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and simulated samples. The quantitative performance of the new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye; the results showed good coherence between the two methods. This approach demonstrates the feasibility of accurate qualitative and quantitative detection of GM plants in a single run.

  16. BTeam, a Novel BRET-based Biosensor for the Accurate Quantification of ATP Concentration within Living Cells

    PubMed Central

    Yoshida, Tomoki; Kakizuka, Akira; Imamura, Hiromi

    2016-01-01

    ATP levels may represent fundamental health conditions of cells. However, precise measurement of intracellular ATP levels in living cells is hindered by the lack of suitable methodologies. Here, we developed a novel ATP biosensor termed “BTeam”. BTeam comprises a yellow fluorescent protein (YFP), the ATP binding domain of the ε subunit of the bacterial ATP synthase, and an ATP-nonconsuming luciferase (NLuc). To attain emission, BTeam simply requires the NLuc substrate. BTeam showed elevated bioluminescence resonance energy transfer efficiency upon ATP binding, resulting in emission spectra changes that correlate with ATP concentration. By using the YFP/NLuc emission ratio to represent ATP levels, BTeam achieved steady signal outputs even when absolute emission intensities were altered. With this biosensor, we succeeded in the accurate quantification of intracellular ATP concentrations in populations of living cells, as demonstrated by detecting the slight difference between concentrations in the cytosol (3.7–4.1 mM) and the mitochondrial matrix (2.4–2.7 mM) within some cultured cell lines. Furthermore, BTeam allowed continuous tracing of cytosolic ATP levels of the same cells, as well as bioluminescent imaging of cytosolic ATP dynamics within individual cells. This simple and accurate technique should be an effective method for quantitative measurement of intracellular ATP concentrations. PMID:28000761
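The ratiometric readout this record describes (a YFP/NLuc emission ratio that tracks ATP) can be sketched with a generic Hill-type calibration curve and its inverse. The parameter values below are illustrative assumptions, not BTeam's measured constants:

```python
def hill_ratio(atp_mM, r_min=0.5, r_max=2.0, k=1.5, n=2.0):
    # Forward model: BRET ratio as a Hill function of ATP concentration (mM).
    return r_min + (r_max - r_min) * atp_mM ** n / (k ** n + atp_mM ** n)

def atp_from_ratio(r, r_min=0.5, r_max=2.0, k=1.5, n=2.0):
    # Invert the Hill curve to estimate ATP from a measured YFP/NLuc ratio.
    return k * ((r - r_min) / (r_max - r)) ** (1.0 / n)

r = hill_ratio(3.9)                  # cytosolic-range ATP, ~3.9 mM
print(round(atp_from_ratio(r), 2))   # 3.9 (exact round trip by construction)
```

The point of the ratio is visible in the forward model: scaling both emission channels by the same factor leaves r, and hence the inferred ATP, unchanged.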

  17. 77 FR 3800 - Accurate NDE & Inspection, LLC; Confirmatory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... COMMISSION Accurate NDE & Inspection, LLC; Confirmatory Order In the Matter of Accurate NDE & Docket: 150... request ADR with the NRC in an attempt to resolve issues associated with this matter. In response, on August 9, 2011, Accurate NDE requested ADR to resolve this matter with the NRC. On September 28,...

  18. Accurate spectral modeling for infrared radiation

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gupta, S. K.

    1977-01-01

    Direct line-by-line integration and quasi-random band model techniques are employed to calculate the spectral transmittance and total band absorptance of 4.7 micron CO, 4.3 micron CO2, 15 micron CO2, and 5.35 micron NO bands. Results are obtained for different pressures, temperatures, and path lengths. These are compared with available theoretical and experimental investigations. For each gas, extensive tabulations of results are presented for comparative purposes. In almost all cases, line-by-line results are found to be in excellent agreement with the experimental values. The range of validity of the other models and correlations is discussed.
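The total band absorptance computed in work like this is A = ∫ [1 − exp(−κ(ν)u)] dν over the band. A minimal numerical sketch uses a single toy Lorentzian line and the trapezoidal rule; the line strength, half-width, and band limits below are illustrative, not the paper's spectroscopic data:

```python
import math

def absorptance(kappa, u, nu_lo, nu_hi, steps=2000):
    # Total band absorptance A = ∫ (1 - exp(-kappa(nu) * u)) d(nu),
    # evaluated by the trapezoidal rule over [nu_lo, nu_hi] (cm^-1).
    h = (nu_hi - nu_lo) / steps
    total = 0.0
    for i in range(steps + 1):
        nu = nu_lo + i * h
        f = 1.0 - math.exp(-kappa(nu) * u)
        total += f * (0.5 if i in (0, steps) else 1.0)
    return total * h

# Toy Lorentzian absorption line near 2143 cm^-1 (CO fundamental region).
def lorentz(nu, s=10.0, nu0=2143.0, gamma=0.5):
    return s * gamma / (math.pi * ((nu - nu0) ** 2 + gamma ** 2))

a_thin = absorptance(lorentz, 0.01, 2133.0, 2153.0)   # optically thin path
a_thick = absorptance(lorentz, 10.0, 2133.0, 2153.0)  # optically thick path
print(a_thin < a_thick)  # True: absorptance grows with path length
```

In the thin limit A approaches u∫κ dν (linear growth), while thick paths saturate at the line core and grow only through the wings, which is the behaviour band models approximate.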

  19. Accurate response surface approximations for weight equations based on structural optimization

    NASA Astrophysics Data System (ADS)

    Papila, Melih

    Accurate weight prediction methods are vitally important for aircraft design optimization. Therefore, designers seek weight prediction techniques with low computational cost and high accuracy, and usually require a compromise between the two. The compromise can be achieved by combining stress analysis and response surface (RS) methodology. While stress analysis provides accurate weight information, RS techniques help to transmit effectively this information to the optimization procedure. The focus of this dissertation is structural weight equations in the form of RS approximations and their accuracy when fitted to results of structural optimizations that are based on finite element analyses. Use of RS methodology filters out the numerical noise in structural optimization results and provides a smooth weight function that can easily be used in gradient-based configuration optimization. In engineering applications RS approximations of low order polynomials are widely used, but the weight may not be modeled well by low-order polynomials, leading to bias errors. In addition, some structural optimization results may have high-amplitude errors (outliers) that may severely affect the accuracy of the weight equation. Statistical techniques associated with RS methodology are sought in order to deal with these two difficulties: (1) high-amplitude numerical noise (outliers) and (2) approximation model inadequacy. The investigation starts with reducing approximation error by identifying and repairing outliers. A potential reason for outliers in optimization results is premature convergence, and outliers of such nature may be corrected by employing different convergence settings. It is demonstrated that outlier repair can lead to accuracy improvements over the more standard approach of removing outliers. 
The adequacy of approximation is then studied by a modified lack-of-fit approach, and RS errors due to the approximation model are reduced by using higher order polynomials. In
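The outlier-identification step described above can be sketched with a simple residual-based filter around a least-squares fit. The data are toy numbers, and this sketch discards the flagged point, whereas the dissertation's repair strategy re-runs the structural optimization with different convergence settings:

```python
from statistics import mean

def fit_line(xs, ys):
    # Ordinary least-squares fit y = a + b*x.
    mx, my = mean(xs), mean(ys)
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def flag_outliers(xs, ys, k=1.5):
    # Flag points whose residual exceeds k times the residual RMS
    # (the threshold k is an illustrative choice).
    a, b = fit_line(xs, ys)
    res = [y - (a + b * x) for x, y in zip(xs, ys)]
    rms = (sum(r * r for r in res) / len(res)) ** 0.5
    return [abs(r) > k * rms for r in res]

# Toy "structural optimization" weights on a line, with one
# premature-convergence outlier at index 4.
xs = [1, 2, 3, 4, 5, 6]
ys = [2.1, 4.0, 6.2, 8.1, 30.0, 12.1]   # true trend y ≈ 2x
mask = flag_outliers(xs, ys)
kept = [(x, y) for x, y, bad in zip(xs, ys, mask) if not bad]
a, b = fit_line([x for x, _ in kept], [y for _, y in kept])
print(mask.index(True), round(b, 1))  # outlier at index 4; refit slope ≈ 2.0
```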

  20. Quantitative photoacoustic tomography

    PubMed Central

    Yuan, Zhen; Jiang, Huabei

    2009-01-01

    In this paper, several algorithms that allow for quantitative photoacoustic reconstruction of tissue optical, acoustic and physiological properties are described in a finite-element method based framework. These quantitative reconstruction algorithms are compared, and the merits and limitations associated with these methods are discussed. In addition, a multispectral approach is presented for concurrent reconstructions of multiple parameters including deoxyhaemoglobin, oxyhaemoglobin and water concentrations as well as acoustic speed. Simulation and in vivo experiments are used to demonstrate the effectiveness of the reconstruction algorithms presented. PMID:19581254
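At its core, the multispectral step described here is linear spectral unmixing: absorption at each wavelength is a weighted sum of chromophore concentrations, so two wavelengths suffice to separate two chromophores. The extinction coefficients below are made-up numbers for illustration, not physiological values:

```python
def solve2(a11, a12, a21, a22, b1, b2):
    # Solve a 2x2 linear system by Cramer's rule.
    det = a11 * a22 - a12 * a21
    return (b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det

# Hypothetical extinction coefficients at two wavelengths for
# deoxy- (Hb) and oxy-haemoglobin (HbO2): {wavelength: (eps_Hb, eps_HbO2)}.
eps = {750: (1.4, 0.6), 850: (0.8, 1.1)}

c_hb_true, c_hbo2_true = 30.0, 70.0
# Absorption "reconstructed" at each wavelength from the true concentrations:
mua = {w: eps[w][0] * c_hb_true + eps[w][1] * c_hbo2_true for w in eps}

c_hb, c_hbo2 = solve2(eps[750][0], eps[750][1],
                      eps[850][0], eps[850][1],
                      mua[750], mua[850])
print(round(c_hb, 6), round(c_hbo2, 6))  # recovers 30.0 and 70.0
```

The hard part of quantitative photoacoustics, recovering the absorption values themselves from pressure data, is what the paper's finite-element algorithms address; the unmixing above is only the final linear step.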

  1. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge of collaborative filtering (CF) information filtering is how to obtain reliable and accurate results with the help of peers' recommendations. Since similarities from small-degree users to large-degree users tend to be larger than those in the opposite direction, large-degree users' selections are recommended excessively by traditional second-order CF algorithms. By considering the direction of user similarity and using second-order correlations to suppress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm, which specifically addresses the accuracy and diversity challenges of CF. Numerical results for two benchmark data sets, MovieLens and Netflix, show that the new algorithm outperforms state-of-the-art CF algorithms in accuracy. Compared with the CF algorithm based on random walks proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an improvement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, diversity, precision and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms can yield accurate and diverse recommendations. This work suggests that the direction of user similarity is an important factor in improving personalized recommendation performance.
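The degree asymmetry that motivates the directed similarity can be illustrated with a minimal set-overlap measure normalised by the source user's degree only. This is a sketch of the general idea, not the HDCF algorithm itself:

```python
def directed_similarity(u_items, v_items):
    # Asymmetric similarity from user u to user v: overlap normalised by
    # u's degree only, so sim(u -> v) != sim(v -> u) when degrees differ.
    overlap = len(u_items & v_items)
    return overlap / len(u_items) if u_items else 0.0

small = {"a", "b"}                       # small-degree user
large = {"a", "b", "c", "d", "e", "f"}   # large-degree (mainstream) user

print(directed_similarity(small, large))  # 1.0: small user fully covered
print(directed_similarity(large, small))  # 0.33...: weaker reverse direction
```

Symmetric measures such as cosine similarity collapse these two directions into one value, which is exactly how mainstream (large-degree) tastes end up over-recommended; keeping the direction separate lets an algorithm down-weight them.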

  2. An Accurate and Efficient Method of Computing Differential Seismograms

    NASA Astrophysics Data System (ADS)

    Hu, S.; Zhu, L.

    2013-12-01

    Inversion of seismic waveforms for Earth structure usually requires computing partial derivatives of seismograms with respect to velocity model parameters. We developed an accurate and efficient method to calculate differential seismograms for multi-layered elastic media, based on the Thompson-Haskell propagator matrix technique. We first derived the partial derivatives of the Haskell matrix and its compound matrix with respect to the layer parameters (P wave velocity, shear wave velocity and density). We then derived the partial derivatives of surface displacement kernels in the frequency-wavenumber domain. The differential seismograms are obtained by using the frequency-wavenumber double integration method. The implementation is computationally efficient and the total computing time is proportional to the time of computing the seismogram itself, i.e., independent of the number of layers in the model. We verified the correctness of results by comparing with differential seismograms computed using the finite differences method. Our results are more accurate because of the analytical nature of the derived partial derivatives.
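The verification step described, comparing analytic partial derivatives against finite differences, can be sketched for any differentiable response. The toy function below stands in for a seismogram kernel; it is not the paper's propagator-matrix derivation:

```python
import math

def central_diff(f, x, h=1e-6):
    # Central finite-difference approximation of df/dx, accurate to O(h^2).
    return (f(x + h) - f(x - h)) / (2.0 * h)

def response(beta):
    # Toy "seismogram" response as a function of a layer's shear velocity.
    return math.sin(2.0 / beta) * beta ** 0.5

def d_response(beta):
    # Analytic derivative of the toy response (chain + product rule).
    return (-2.0 / beta ** 1.5) * math.cos(2.0 / beta) \
        + 0.5 * beta ** -0.5 * math.sin(2.0 / beta)

beta = 3.2
print(abs(central_diff(response, beta) - d_response(beta)) < 1e-6)  # True
```

As in the paper, the analytic form is both cheaper (one evaluation instead of two per parameter) and free of the step-size truncation and round-off trade-off that limits finite differences.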

  3. Accurate stone analysis: the impact on disease diagnosis and treatment.

    PubMed

    Mandel, Neil S; Mandel, Ian C; Kolbach-Mandel, Ann M

    2017-02-01

    This manuscript reviews the requirements for acceptable compositional analysis of kidney stones using various biophysical methods. High-resolution X-ray powder diffraction crystallography and Fourier transform infrared spectroscopy (FTIR) are the only acceptable methods in our labs for kidney stone analysis. The use of well-constructed spectral reference libraries is the basis for accurate and complete stone analysis. The literature reviewed in this manuscript identifies errors in most commercial laboratories and in some academic centers. We provide personal comments on why such errors occur at such high rates; although the workload is rather large, it is very worthwhile to provide accurate stone compositions. We also present the results of our almost 90,000 stone analyses and a breakdown of the number of components we have observed in the various stones. Finally, we offer advice on determining the method used by the various FTIR equipment manufacturers who also provide a stone analysis library, so that FTIR users can be confident in the accuracy of their reported results. Such an analysis of the accuracy of the individual reference libraries could positively influence the reduction of their respective error rates.
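Library-based spectral identification of the kind discussed here can be sketched as a best-correlation search against reference spectra. The six-point "spectra" below are invented placeholders, not real FTIR data, and real matching operates on full-resolution spectra with baseline correction:

```python
import math

def cosine(a, b):
    # Cosine similarity between two spectra sampled on the same grid.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def best_match(spectrum, library):
    # Return the library entry whose reference spectrum is most similar.
    return max(library, key=lambda name: cosine(spectrum, library[name]))

# Toy reference "spectra" (hypothetical absorbance bins).
library = {
    "calcium oxalate monohydrate": [0.9, 0.1, 0.7, 0.2, 0.1, 0.4],
    "uric acid":                   [0.2, 0.8, 0.1, 0.9, 0.3, 0.1],
}
unknown = [0.85, 0.15, 0.65, 0.25, 0.1, 0.35]
print(best_match(unknown, library))  # calcium oxalate monohydrate
```

The manuscript's central warning translates directly: a top-scoring match is only as trustworthy as the library it was scored against, which is why poorly constructed reference libraries produce confidently wrong identifications.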

  4. A fast and accurate decoder for underwater acoustic telemetry.

    PubMed

    Ingraham, J M; Deng, Z D; Li, X; Fu, T; McMichael, G A; Trumbo, B A

    2014-07-01

    The Juvenile Salmon Acoustic Telemetry System, developed by the U.S. Army Corps of Engineers, Portland District, has been used to monitor the survival of juvenile salmonids passing through hydroelectric facilities in the Federal Columbia River Power System. Cabled hydrophone arrays deployed at dams receive coded transmissions sent from acoustic transmitters implanted in fish. The signals' time of arrival on different hydrophones is used to track fish in 3D. In this article, a new algorithm that decodes the received transmissions is described and the results are compared to results for the previous decoding algorithm. In a laboratory environment, the new decoder was able to decode signals with lower signal strength than the previous decoder, effectively increasing decoding efficiency and range. In field testing, the new algorithm decoded significantly more signals than the previous decoder and three-dimensional tracking experiments showed that the new decoder's time-of-arrival estimates were accurate. At multiple distances from hydrophones, the new algorithm tracked more points more accurately than the previous decoder. The new algorithm was also more than 10 times faster, which is critical for real-time applications on an embedded system.

  5. Interactive Isogeometric Volume Visualization with Pixel-Accurate Geometry.

    PubMed

    Fuchs, Franz G; Hjelmervik, Jon M

    2016-02-01

    A recent development, called isogeometric analysis, provides a unified approach for design, analysis and optimization of functional products in industry. Traditional volume rendering methods for inspecting the results from the numerical simulations cannot be applied directly to isogeometric models. We present a novel approach for interactive visualization of isogeometric analysis results, ensuring correct, i.e., pixel-accurate geometry of the volume including its bounding surfaces. The entire OpenGL pipeline is used in a multi-stage algorithm leveraging techniques from surface rendering, order-independent transparency, as well as theory and numerical methods for ordinary differential equations. We showcase the efficiency of our approach on different models relevant to industry, ranging from quality inspection of the parametrization of the geometry, to stress analysis in linear elasticity, to visualization of computational fluid dynamics results.

  6. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out by liquid–liquid extraction with n-hexane:ethyl acetate, with subsequent detection by high-performance liquid chromatography coupled to a diode array detector. The method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. A linear curve for each drug was obtained within the range of 30–3000 ng/mL, with coefficients of correlation higher than 0.99. Results: The limits of detection and quantitation were 30 and 100 ng/mL, respectively, for all four drugs. The method showed appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepine recoveries were estimated to be over 80%. The method showed high selectivity; no additional peaks due to interfering substances were observed. Conclusion: The present method is selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in the forensic toxicology laboratory. PMID:27635251
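The calibration workflow implied here, a linear response curve plus detection and quantitation limits, can be sketched as follows. The peak areas are hypothetical, and the 3.3σ/slope and 10σ/slope formulas follow the common ICH-style convention, which may differ from how the authors actually derived their limits:

```python
from statistics import mean

def linfit(x, y):
    # Ordinary least-squares calibration line: response = a + b * concentration.
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# Toy calibration: peak area vs concentration (ng/mL) over the reported range.
conc = [30, 100, 300, 1000, 3000]
area = [12.2, 40.5, 121.0, 402.0, 1203.0]   # hypothetical detector response
a, b = linfit(conc, area)

# Residual standard deviation of the regression (n - 2 degrees of freedom).
resid = [yi - (a + b * xi) for xi, yi in zip(conc, area)]
sigma = (sum(r * r for r in resid) / (len(resid) - 2)) ** 0.5

lod = 3.3 * sigma / b    # ICH-style limit of detection
loq = 10.0 * sigma / b   # ICH-style limit of quantitation
print(lod < loq)         # True by construction
```

An unknown's concentration is then read back off the line as (area − a) / b, and values between the LOD and LOQ are reported as detected but not quantified.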

  7. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE PAGES

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; ...

    2016-12-15

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
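The sub-model idea can be illustrated on a toy piecewise-linear "calibration" problem: models trained on restricted concentration ranges outperform a single full-range model near the ends of the range. Plain least-squares lines stand in for the paper's PLS sub-models, and the simple range-routing below is a stand-in for its blending scheme:

```python
from statistics import mean

def linfit(x, y):
    # Least-squares line y = a + b*x.
    mx, my = mean(x), mean(y)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

def predict(model, x):
    a, b = model
    return a + b * x

# Toy nonlinear "LIBS response": matrix effects make the signal-composition
# relation differ between low and high concentration ranges.
xs = [1, 2, 3, 4, 10, 12, 14, 16]
ys = [2.0, 4.1, 6.0, 8.1, 15.0, 16.1, 16.9, 18.0]  # slope ~2 low, ~0.5 high

low = linfit(xs[:4], ys[:4])    # sub-model for low-concentration targets
high = linfit(xs[4:], ys[4:])   # sub-model for high-concentration targets
full = linfit(xs, ys)           # single model over the full range

def blended(x, cut=7.0):
    # Route the prediction by the range a first-pass full-model estimate
    # falls in (a crude stand-in for the paper's sub-model blending).
    return predict(low, x) if predict(full, x) < predict(full, cut) \
        else predict(high, x)

# Sub-models beat the single full-range model near the ends of the range:
err_full = abs(predict(full, 2) - 4.1)
err_sub = abs(blended(2) - 4.1)
print(err_sub < err_full)  # True
```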

  8. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.

  9. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    SciTech Connect

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott; Morris, Richard V.; Ehlmann, Bethany; Dyar, M. Darby

    2016-12-15

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “submodel” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.

  10. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    NASA Astrophysics Data System (ADS)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott; Morris, Richard V.; Ehlmann, Bethany; Dyar, M. Darby

    2017-03-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the laser-induced breakdown spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibrations methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares (PLS) regression, is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.

  11. Can Selforganizing Maps Accurately Predict Photometric Redshifts?

    NASA Technical Reports Server (NTRS)

    Way, Michael J.; Klose, Christian

    2012-01-01

    We present an unsupervised machine-learning approach that can be employed for estimating photometric redshifts. The proposed method is based on a vector quantization called the self-organizing-map (SOM) approach. A variety of photometrically derived input values were utilized from the Sloan Digital Sky Survey's main galaxy sample, luminous red galaxy, and quasar samples, along with the PHAT0 data set from the Photo-z Accuracy Testing project. Regression results obtained with this new approach were evaluated in terms of root-mean-square error (RMSE) to estimate the accuracy of the photometric redshift estimates. The results demonstrate competitive RMSE and outlier percentages when compared with several other popular approaches, such as artificial neural networks and Gaussian process regression. SOM RMSE results (using delta(z) = z(sub phot) - z(sub spec)) are 0.023 for the main galaxy sample, 0.027 for the luminous red galaxy sample, 0.418 for quasars, and 0.022 for PHAT0 synthetic data. The results demonstrate that there are nonunique solutions for estimating SOM RMSEs. Further research is needed in order to find more robust estimation techniques using SOMs, but the results herein are a positive indication of their capabilities when compared with other well-known methods.
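A minimal 1-D self-organizing map for this kind of regression task can be sketched as follows. The colour-redshift relation, map size, neighbourhood function, and training schedule are all illustrative choices, not the paper's setup:

```python
import random

def train_som(data, n_units=8, epochs=200, lr0=0.5, seed=1):
    # 1-D self-organizing map over scalar "photometric colour" inputs:
    # each step pulls the best-matching unit (and its neighbours) toward
    # a randomly drawn sample, with a decaying learning rate.
    rng = random.Random(seed)
    units = [rng.uniform(min(data), max(data)) for _ in range(n_units)]
    for t in range(epochs):
        lr = lr0 * (1.0 - t / epochs)
        x = rng.choice(data)
        bmu = min(range(n_units), key=lambda i: abs(units[i] - x))
        for i in range(n_units):
            h = 1.0 if i == bmu else (0.5 if abs(i - bmu) == 1 else 0.0)
            units[i] += lr * h * (x - units[i])
    return units

def predict(units, train_x, train_z, x):
    # Photometric-redshift estimate: mean spectroscopic z of the training
    # objects that share the query's best-matching unit.
    bmu = lambda v: min(range(len(units)), key=lambda i: abs(units[i] - v))
    cell = bmu(x)
    zs = [z for xi, z in zip(train_x, train_z) if bmu(xi) == cell]
    return sum(zs) / len(zs) if zs else sum(train_z) / len(train_z)

# Toy monotone colour-redshift relation z = colour / 4.
colours = [0.2, 0.5, 0.9, 1.3, 1.8, 2.4, 3.0, 3.6, 4.0, 4.4]
redshifts = [c / 4.0 for c in colours]
units = train_som(colours)
print(round(predict(units, colours, redshifts, 1.0), 3))  # near z = 0.25
```

The "nonunique solutions" the abstract notes are visible even here: a different seed, map size, or schedule yields a different quantization and hence a different RMSE.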

  12. Report on Solar Water Heating Quantitative Survey

    SciTech Connect

    Focus Marketing Services

    1999-05-06

    This report details the results of a quantitative research study undertaken to better understand the marketplace for solar water-heating systems from the perspective of home builders, architects, and home buyers.

  13. A new, multiplex, quantitative real-time polymerase chain reaction system for nucleic acid detection and quantification.

    PubMed

    Liang, Fang; Arora, Neetika; Zhang, Kang Liang; Yeh, David Che Cheng; Lai, Richard; Pearson, Darnley; Barnett, Graeme; Whiley, David; Sloots, Theo; Corrie, Simon R; Barnard, Ross T

    2013-01-01

    Quantitative real-time polymerase chain reaction (qPCR) has emerged as a powerful investigative and diagnostic tool with potential to generate accurate and reproducible results. qPCR can be designed to fulfil the four key aspects required for the detection of nucleic acids: simplicity, speed, sensitivity, and specificity. This chapter reports the development of a novel real-time multiplex quantitative PCR technology, dubbed PrimRglo™, with a potential for high-degree multiplexing. It combines the capacity to simultaneously detect many viruses, bacteria, or nucleic acids, in a single reaction tube, with the ability to quantitate viral or bacterial load. The system utilizes oligonucleotide-tagged PCR primers, along with complementary fluorophore-labelled and quencher-labelled oligonucleotides. The analytic sensitivity of PrimRglo technology was compared with the widely used TaqMan® and SYBR Green detection systems.

  14. Uncertainty Quantification for Quantitative Imaging Holdup Measurements

    SciTech Connect

    Bevill, Aaron M; Bledsoe, Keith C

    2016-01-01

    In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
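The chi-squared confidence-interval idea can be sketched with a one-parameter forward model (predicted counts = mass × per-pixel sensitivity) and a Δχ² ≤ 1 scan. The detector sensitivities and data below are invented, and the actual method fits a full calibrated imager forward model rather than this scalar one:

```python
def chi2(mass, data, sens, sigma):
    # Goodness of fit between measured counts and the forward-model
    # prediction mass * sensitivity for each detector pixel.
    return sum(((d - mass * s) / e) ** 2 for d, s, e in zip(data, sens, sigma))

def confidence_interval(data, sens, sigma, lo=0.0, hi=20.0, steps=2000):
    # Scan candidate masses; keep those within delta-chi2 <= 1 of the best
    # fit (approximately a 68% interval for one free parameter).
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    vals = [chi2(m, data, sens, sigma) for m in grid]
    best = min(vals)
    inside = [m for m, v in zip(grid, vals) if v <= best + 1.0]
    return min(inside), max(inside)

# Toy imager: 4 pixels, true holdup mass 5.0, noise baked into the data.
sens = [2.0, 1.5, 1.0, 0.5]
data = [10.3, 7.4, 4.9, 2.6]
sigma = [0.5, 0.5, 0.5, 0.5]
lo_m, hi_m = confidence_interval(data, sens, sigma)
print(lo_m < 5.0 < hi_m)  # True: the interval brackets the true mass
```

Scanning the parameter rather than assuming a Gaussian posterior is what lets this approach drop the geometry approximations that earlier uncertainty estimates required.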

  15. Quantitative SPECT of uptake of monoclonal antibodies

    SciTech Connect

    DeNardo, G.L.; Macey, D.J.; DeNardo, S.J.; Zhang, C.G.; Custer, T.R.

    1989-01-01

    Absolute quantitation of the distribution of radiolabeled antibodies is important to the efficient conduct of research with these agents and their ultimate use for imaging and treatment, but is formidable because of the unrestricted nature of their distribution within the patient. Planar imaging methods have been developed and provide an adequate approximation of the distribution of radionuclide for many purposes, particularly when there is considerable specificity of targeting. This is not currently the case for antibodies and is unlikely in the future. Single photon emission computed tomography (SPECT) provides potential for greater accuracy because it reduces problems caused by superimposition of tissues and non-target contributions to target counts. SPECT measurement of radionuclide content requires: (1) accurate determination of camera sensitivity; (2) accurate determination of the number of counts in a defined region of interest; (3) correction for attenuation; (4) correction for scatter and septal penetration; (5) accurate measurement of the administered dose; (6) adequate statistics; and (7) accurate definition of tissue mass or volume. The major impediment to each of these requirements is scatter of many types. The magnitude of this problem can be diminished by improvements in tomographic camera design, computer algorithms, and methodological approaches. 34 references.

  16. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments

    PubMed Central

    Eter, Wael A.; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-01-01

Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT-optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo SPECT and cross-evaluated by 3D quantitative OPT imaging as well as with histology within healthy and alloxan-treated Brown Norway rat pancreata. The SPECT signal was in excellent linear correlation with the OPT data, as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin-positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers. PMID:27080529

  18. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture-based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method require highly...

  19. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    SciTech Connect

    Dunn, Nicholas J. H.; Noid, W. G.

    2015-12-28

The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1-, 2-, and 3-site CG models for heptane, as well as 1- and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  20. Fair & Accurate Grading for Exceptional Learners

    ERIC Educational Resources Information Center

    Jung, Lee Ann; Guskey, Thomas R.

    2011-01-01

    Despite the many changes in education over the past century, grading and reporting practices have essentially remained the same. In part, this is because few teacher preparation programs offer any guidance on sound grading practices. As a result, most current grading practices are grounded in tradition, rather than research on best practice. In an…

  1. The Quantitative Imaging Network in Precision Medicine

    PubMed Central

    Nordstrom, Robert J.

    2017-01-01

    Precision medicine is a healthcare model that seeks to incorporate a wealth of patient information to identify and classify disease progression and to provide tailored therapeutic solutions for individual patients. Interventions are based on knowledge of molecular and mechanistic causes, pathogenesis and pathology of disease. Individual characteristics of the patients are then used to select appropriate healthcare options. Imaging is playing an increasingly important role in identifying relevant characteristics that help to stratify patients for different interventions. However, lack of standards, limitations in image-processing interoperability, and errors in data collection can limit the applicability of imaging in clinical decision support. Quantitative imaging is the attempt to extract reliable, numerical information from images to eliminate qualitative judgments and errors for providing accurate measures of tumor response to therapy or for predicting future response. This issue of Tomography reports quantitative imaging developments made by several members of the National Cancer Institute Quantitative Imaging Network, a program dedicated to the promotion of quantitative imaging methods for clinical decision support. PMID:28083563

  2. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    SciTech Connect

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ≈ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.

  3. Quantitative Simulation Games

    NASA Astrophysics Data System (ADS)

    Černý, Pavol; Henzinger, Thomas A.; Radhakrishna, Arjun

While a boolean notion of correctness is given by a preorder on systems and properties, a quantitative notion of correctness is defined by a distance function on systems and properties, where the distance between a system and a property provides a measure of "fit" or "desirability." In this article, we explore several ways in which the simulation preorder can be generalized to a distance function. This is done by equipping the classical simulation game between a system and a property with quantitative objectives. In particular, for systems that satisfy a property, a quantitative simulation game can measure the "robustness" of the satisfaction, that is, how much the system can deviate from its nominal behavior while still satisfying the property. For systems that violate a property, a quantitative simulation game can measure the "seriousness" of the violation, that is, how much the property has to be modified so that it is satisfied by the system. These distances can be computed in polynomial time, since the computation reduces to the value problem in limit average games with constant weights. Finally, we demonstrate how the robustness distance can be used to measure how many transmission errors are tolerated by error correcting codes.

  4. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  5. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted with the LQPM based on the chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, the LQPM can employ a classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
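The qualitative/quantitative similarity pair can be illustrated with a toy computation. This is a sketch only: the exact LQPM formulas are defined in the paper, and the cosine and content-ratio forms below are common generic fingerprint-similarity choices, not necessarily the authors' definitions:

```python
import numpy as np

def qualitative_similarity(sample, reference):
    """Cosine similarity of fingerprints: sensitive to the peak pattern,
    insensitive to the overall content level."""
    return float(np.dot(sample, reference) /
                 (np.linalg.norm(sample) * np.linalg.norm(reference)))

def quantitative_similarity(sample, reference):
    """Pattern similarity scaled by the total-content ratio, so samples with
    the right pattern but the wrong amounts are penalized."""
    ratio = sample.sum() / reference.sum()
    return qualitative_similarity(sample, reference) * min(ratio, 1.0 / ratio)

ref = np.array([10.0, 5.0, 2.0, 1.0])   # reference fingerprint peak areas
half = 0.5 * ref                        # right pattern, half the content
assert abs(qualitative_similarity(half, ref) - 1.0) < 1e-9
assert abs(quantitative_similarity(half, ref) - 0.5) < 1e-9
```

The toy example shows the discrimination gap the abstract describes: a diluted sample scores a perfect qualitative similarity but only 0.5 on the quantitative measure.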

  6. Quantitative DFT modeling of the enantiomeric excess for dioxirane-catalyzed epoxidations

    PubMed Central

    Schneebeli, Severin T.; Hall, Michelle Lynn

    2009-01-01

    Herein we report the first fully quantum mechanical study of enantioselectivity for a large dataset. We show that transition state modeling at the UB3LYP-DFT/6-31G* level of theory can accurately model enantioselectivity for various dioxirane-catalyzed asymmetric epoxidations. All the synthetically useful high selectivities are successfully “predicted” by this method. Our results hint at the utility of this method to further model other asymmetric reactions and facilitate the discovery process for the experimental organic chemist. Our work suggests the possibility of using computational methods not simply to explain organic phenomena, but also to predict them quantitatively. PMID:19243187

  7. MBTH: A novel approach to rapid, spectrophotometric quantitation of total algal carbohydrates

    SciTech Connect

    Van Wychen, Stefanie; Long, William; Black, Stuart K.; Laurens, Lieve M. L.

    2016-11-24

A high-throughput and robust application of the 3-methyl-2-benzothiazolinone hydrazone (MBTH) method was developed for carbohydrate determination in microalgae. The traditional phenol-sulfuric acid method to quantify carbohydrates is strongly affected by algal biochemical components and exhibits a highly variable response to microalgal monosaccharides. We present a novel use of the MBTH method to accurately quantify carbohydrates in hydrolyzate after acid hydrolysis of algal biomass, without a need for neutralization. As a result, the MBTH method demonstrated consistent and sensitive quantitation of algae-specific monosaccharides down to 5 µg mL⁻¹ without interference from other components of the acidic algal hydrolyzate.

  9. Accurate Methods for Large Molecular Systems (Preprint)

    DTIC Science & Technology

    2009-01-06

Gaussian functions. These basis sets can be used in a systematic way to obtain results approaching the complete basis set (CBS) limit. However...convergence to the CBS limit. The high accuracy of these basis sets still comes at a significant computational cost, only feasible on relatively small...J. Chem. Phys. 2006, 124, 114103. (b) ccCA: DeYonker, N. J.; Grimes, T.; Yockel, S.; Dinescu, A.; Mintz, B.; Cundari, T. R.; Wilson, A. K. J. Chem

  10. Towards more accurate vegetation mortality predictions

    DOE PAGES

    Sevanto, Sanna Annika; Xu, Chonggang

    2016-09-26

Predicting the fate of vegetation under changing climate is one of the major challenges of the climate modeling community. Terrestrial vegetation dominates the carbon and water cycles over land areas, and dramatic changes in vegetation cover resulting from stressful environmental conditions such as drought feed back directly to local and regional climate, potentially leading to a vicious cycle in which vegetation recovery after a disturbance is delayed or impossible.

  11. Comparison of a newly developed automated and quantitative hepatitis C virus (HCV) core antigen test with the HCV RNA assay for clinical usefulness in confirming anti-HCV results.

    PubMed

    Kesli, Recep; Polat, Hakki; Terzi, Yuksel; Kurtoglu, Muhammet Guzel; Uyar, Yavuz

    2011-12-01

Hepatitis C virus (HCV) is a global health care problem. Diagnosis of HCV infection is mainly based on the detection of anti-HCV antibodies in serum samples as a screening test. Recombinant immunoblot assays serve as supplemental tests, while the final detection and quantification of HCV RNA serves as the confirmatory test. In this study, we compared the HCV core antigen test with the HCV RNA assay for confirming anti-HCV results, to determine whether the core antigen test may be used as an alternative confirmatory test, and we assessed the diagnostic value of the total HCV core antigen test by determining its diagnostic specificity and sensitivity against the HCV RNA test. Sera from a total of 212 treatment-naive patients were analyzed for anti-HCV and HCV core antigen with the Abbott Architect test and with a molecular HCV RNA assay based on reverse transcription-PCR as the confirmatory test. The diagnostic sensitivity, specificity, and positive and negative predictive values of the HCV core antigen assay compared to the HCV RNA test were 96.3%, 100%, 100%, and 89.7%, respectively. The levels of HCV core antigen showed a good correlation with those from the HCV RNA quantification (r = 0.907). In conclusion, the Architect HCV antigen assay is highly specific, sensitive, reliable, easy to perform, reproducible, cost-effective, and applicable as a screening, supplemental, and preconfirmatory test for anti-HCV assays used in laboratory procedures for the diagnosis of hepatitis C virus infection.
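The reported diagnostic values follow from a standard 2x2 confusion matrix against the confirmatory test; a minimal sketch of the arithmetic (the counts below are hypothetical, chosen only for illustration, not the study's data):

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 confusion matrix,
    treating the confirmatory (HCV RNA) result as ground truth."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Hypothetical counts for illustration only
m = diagnostic_metrics(tp=130, fp=0, tn=70, fn=5)
assert m["specificity"] == 1.0 and m["ppv"] == 1.0
assert round(m["sensitivity"], 3) == 0.963
```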

  12. Obtaining Accurate Change Detection Results from High-Resolution Satellite Sensors

    NASA Technical Reports Server (NTRS)

    Bryant, N.; Bunch, W.; Fretz, R.; Kim, P.; Logan, T.; Smyth, M.; Zobrist, A.

    2012-01-01

    Multi-date acquisitions of high-resolution imaging satellites (e.g. GeoEye and WorldView), can display local changes of current economic interest. However, their large data volume precludes effective manual analysis, requiring image co-registration followed by image-to-image change detection, preferably with minimal analyst attention. We have recently developed an automatic change detection procedure that minimizes false-positives. The processing steps include: (a) Conversion of both the pre- and post- images to reflectance values (this step is of critical importance when different sensors are involved); reflectance values can be either top-of-atmosphere units or have full aerosol optical depth calibration applied using bi-directional reflectance knowledge. (b) Panchromatic band image-to-image co-registration, using an orthorectified base reference image (e.g. Digital Orthophoto Quadrangle) and a digital elevation model; this step can be improved if a stereo-pair of images have been acquired on one of the image dates. (c) Pan-sharpening of the multispectral data to assure recognition of change objects at the highest resolution. (d) Characterization of multispectral data in the post-image ( i.e. the background) using unsupervised cluster analysis. (e) Band ratio selection in the post-image to separate surface materials of interest from the background. (f) Preparing a pre-to-post change image. (g) Identifying locations where change has occurred involving materials of interest.
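Steps (e)-(g) above reduce to simple raster algebra once the images are co-registered and in reflectance units; a minimal NumPy sketch (the threshold and the toy pixel values are illustrative assumptions, not parameters from the actual pipeline):

```python
import numpy as np

def change_mask(pre, post, ratio_thresh=0.2):
    """Flag pixels whose reflectance changed by more than `ratio_thresh`
    relative to the pre-image (both images already co-registered and
    converted to reflectance, per steps (a)-(b))."""
    eps = 1e-6
    ratio = (post - pre) / (pre + eps)   # step (f): pre-to-post change image
    return np.abs(ratio) > ratio_thresh  # step (g): locations of change

pre = np.array([[0.30, 0.30], [0.10, 0.50]])
post = np.array([[0.31, 0.30], [0.25, 0.10]])
mask = change_mask(pre, post)
assert mask.tolist() == [[False, False], [True, True]]
```

A relative (ratio) change image rather than a simple difference is one common way to suppress false positives from uniform illumination shifts, which is the stated goal of the procedure.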

  13. Quality metric for accurate overlay control in <20nm nodes

    NASA Astrophysics Data System (ADS)

    Klein, Dana; Amit, Eran; Cohen, Guy; Amir, Nuriel; Har-Zvi, Michael; Huang, Chin-Chou Kevin; Karur-Shanmugam, Ramkumar; Pierson, Bill; Kato, Cindy; Kurita, Hiroyuki

    2013-04-01

The semiconductor industry is moving toward 20nm nodes and below. As the Overlay (OVL) budget gets tighter at these advanced nodes, accuracy in each nanometer of OVL error becomes critical. When process owners select OVL targets and methods for their process, they must do so wisely; otherwise the reported OVL could be inaccurate, resulting in yield loss. The same problem can occur when the target sampling map is chosen incorrectly, consisting of asymmetric targets that will cause biased correctable terms and a corrupted wafer. Total measurement uncertainty (TMU) is the main parameter that process owners use when choosing an OVL target per layer. Going towards the 20nm nodes and below, TMU will not be enough for accurate OVL control. KLA-Tencor has introduced a quality score named 'Qmerit' for its imaging-based OVL (IBO) targets, which is obtained on the fly for each OVL measurement point in X & Y. This Qmerit score will enable process owners to select compatible targets that provide accurate OVL values for their process and thereby improve their yield. Together with K-T Analyzer's ability to detect the symmetric targets across the wafer and within the field, the Archer tools will continue to provide an independent, reliable measurement of OVL error into the next advanced nodes, enabling fabs to manufacture devices that meet their tight OVL error budgets.

  14. Accurate Anharmonic IR Spectra from Integrated Cc/dft Approach

    NASA Astrophysics Data System (ADS)

    Barone, Vincenzo; Biczysko, Malgorzata; Bloino, Julien; Carnimeo, Ivan; Puzzarini, Cristina

    2014-06-01

The recent implementation of the computation of infrared (IR) intensities beyond the double harmonic approximation [1] paved the route to routine calculations of infrared spectra for a wide set of molecular systems. Contrary to common beliefs, second-order perturbation theory is able to deliver results of high accuracy provided that anharmonic resonances are properly managed [1,2]. It has already been shown for several small closed- and open-shell molecular systems that the differences between coupled cluster (CC) and DFT anharmonic wavenumbers are mainly due to the harmonic terms, paving the route to effective yet accurate hybrid CC/DFT schemes [2]. In this work we show that hybrid CC/DFT models can be applied also to the IR intensities, leading to the simulation of highly accurate, fully anharmonic IR spectra for medium-size molecules, including ones of atmospheric interest, showing in all cases good agreement with experiment even in the spectral ranges where non-fundamental transitions are predominant [3]. [1] J. Bloino and V. Barone, J. Chem. Phys. 136, 124108 (2012) [2] V. Barone, M. Biczysko, J. Bloino, Phys. Chem. Chem. Phys., 16, 1759-1787 (2014) [3] I. Carnimeo, C. Puzzarini, N. Tasinato, P. Stoppa, A. P. Charmet, M. Biczysko, C. Cappelli and V. Barone, J. Chem. Phys., 139, 074310 (2013)

  15. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.

  16. Accurate colon residue detection algorithm with partial volume segmentation

    NASA Astrophysics Data System (ADS)

    Li, Xiang; Liang, Zhengrong; Zhang, PengPeng; Kutcher, Gerald J.

    2004-05-01

Colon cancer is the second leading cause of cancer-related death in the United States. Earlier detection and removal of polyps can dramatically reduce the chance of developing malignant tumors. Because of some limitations of optical colonoscopy used in the clinic, many researchers have developed virtual colonoscopy as an alternative technique, in which accurate colon segmentation is crucial. However, the partial volume effect and the existence of residue make this very challenging. The electronic colon cleaning technique proposed by Chen et al. is a very attractive method, but it is a hard segmentation method. As mentioned in their paper, some artifacts were produced that might affect accurate colon reconstruction. In this paper, instead of labeling each voxel with a unique label or tissue type, the percentage of different tissues within each voxel, which we call a mixture, was considered in establishing a maximum a posteriori probability (MAP) image-segmentation framework. A Markov random field (MRF) model was developed to reflect the spatial information for the tissue mixtures. The spatial information based on hard segmentation was used to determine which tissue types are in a specific voxel. Parameters of each tissue class were estimated by the expectation-maximization (EM) algorithm during the MAP tissue-mixture segmentation. Experimental results on real CT data demonstrated that the partial volume effects among the four tissue types were precisely detected. Meanwhile, the residue was electronically removed, and a very smooth and clean interface along the colon wall was obtained.
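The EM parameter-update loop at the core of such mixture segmentation can be sketched for a one-dimensional two-class Gaussian mixture. This illustrates only the EM principle (responsibilities as soft "tissue fractions"); it omits the MRF spatial prior and the MAP framework used in the paper:

```python
import numpy as np

def em_two_gaussians(x, iters=50):
    """EM for a two-class Gaussian mixture: the E-step computes per-sample
    class responsibilities (soft mixture fractions), the M-step re-estimates
    the class means, variances, and weights."""
    mu = np.array([x.min(), x.max()], dtype=float)
    var = np.array([x.var(), x.var()]) + 1e-6
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        # E-step: responsibility of each class for each sample
        p = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = p / p.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted parameter updates
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
    return mu, var, w

# Two well-separated synthetic "tissue" intensity populations
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 500), rng.normal(8, 1, 500)])
mu, var, w = em_two_gaussians(x)
assert abs(mu[0] - 0) < 0.5 and abs(mu[1] - 8) < 0.5
```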

  17. Strategy for accurate liver intervention by an optical tracking system

    PubMed Central

    Lin, Qinyong; Yang, Rongqian; Cai, Ken; Guan, Peifeng; Xiao, Weihu; Wu, Xiaoming

    2015-01-01

Image-guided navigation for radiofrequency ablation of liver tumors requires the accurate guidance of needle insertion into a tumor target. The main challenge of image-guided navigation for radiofrequency ablation of liver tumors is the occurrence of liver deformations caused by respiratory motion. This study reports a strategy of real-time automatic registration that tracks custom fiducial markers glued onto the surface of a patient's abdomen to find the respiratory phase in which the static preoperative CT was performed. Custom fiducial markers were designed for this purpose. The real-time automatic registration method consists of the automatic localization of the custom fiducial markers in the patient and image spaces. The fiducial registration error is calculated in real time and indicates whether the current respiratory phase corresponds to the phase of the static preoperative CT. To demonstrate the feasibility of the proposed strategy, a liver simulator was constructed and two volunteers were involved in the preliminary experiments. An ex-vivo porcine liver model was employed to further verify the strategy for liver intervention. Experimental results demonstrate that the real-time automatic registration method is rapid, accurate, and feasible for capturing the respiratory phase from which the static preoperative CT anatomical model is generated by tracking the movement of the skin-adhered custom fiducial markers. PMID:26417501
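The fiducial registration error being tracked can be computed with a standard rigid point-set alignment (the Kabsch algorithm); a minimal sketch assuming paired marker coordinates in the two spaces, not the authors' implementation:

```python
import numpy as np

def fiducial_registration_error(src, dst):
    """Rigid-register `src` markers to `dst` (Kabsch algorithm) and return
    the RMS residual, i.e. the fiducial registration error (FRE)."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    R = (U @ Vt).T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = (U @ Vt).T
    fitted = (src - sc) @ R.T + dc
    return float(np.sqrt(np.mean(np.sum((fitted - dst) ** 2, axis=1))))

# Markers on a phantom, rotated 90 degrees about z and translated: FRE ~ 0
src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta), np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
dst = src @ Rz.T + np.array([5.0, -2.0, 3.0])
assert fiducial_registration_error(src, dst) < 1e-9
```

A near-zero FRE means the current marker configuration matches the preoperative one rigidly, which is how the method recognizes the CT's respiratory phase; a large FRE signals deformation, i.e. a different phase.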

  18. Accurate interlaminar stress recovery from finite element analysis

    NASA Technical Reports Server (NTRS)

    Tessler, Alexander; Riggs, H. Ronald

    1994-01-01

The accuracy and robustness of a two-dimensional smoothing methodology is examined for the problem of recovering accurate interlaminar shear stress distributions in laminated composite and sandwich plates. The smoothing methodology is based on a variational formulation which combines discrete least-squares and penalty-constraint functionals in a single variational form. The smoothing analysis utilizes optimal strains computed at discrete locations in a finite element analysis. These discrete strain data are smoothed with a smoothing element discretization, producing strains and first strain gradients of superior accuracy. The approach enables the resulting smooth strain field to be practically C1-continuous throughout the domain of smoothing, exhibiting superconvergent properties of the smoothed quantity. The continuous strain gradients are also obtained directly from the solution. The recovered strain gradients are subsequently employed in the integration of equilibrium equations to obtain accurate interlaminar shear stresses. The test problem is a simply supported rectangular plate under a doubly sinusoidal load, which has an exact analytic solution that serves as a measure of the goodness of the recovered interlaminar shear stresses. The method has the versatility of being applicable to the analysis of rather general and complex structures built of distinct components and materials, such as those found in aircraft design. For these types of structures, the smoothing is achieved with 'patches', each patch covering the domain in which the smoothed quantity is physically continuous.

  19. Accurate three-dimensional documentation of distinct sites

    NASA Astrophysics Data System (ADS)

    Singh, Mahesh K.; Dutta, Ashish; Subramanian, Venkatesh K.

    2017-01-01

    One of the most critical aspects of documenting distinct sites is acquiring detailed and accurate range information. Several three-dimensional (3-D) acquisition techniques are available, but each has its own limitations. This paper presents a range data fusion method with the aim to enhance the descriptive contents of the entire 3-D reconstructed model. A kernel function is introduced for supervised classification of the range data using a kernelized support vector machine. The classification method is based on the local saliency features of the acquired range data. The range data acquired from heterogeneous range sensors are transformed into a defined common reference frame. Based on the segmentation criterion, the fusion of range data is performed by integrating finer regions of range data acquired from a laser range scanner with the coarser region of Kinect's range data. After fusion, the Delaunay triangulation algorithm is applied to generate the highly accurate, realistic 3-D model of the scene. Finally, experimental results show the robustness of the proposed approach.

  20. An Accurate and Dynamic Computer Graphics Muscle Model

    NASA Technical Reports Server (NTRS)

    Levine, David Asher

    1997-01-01

    A computer based musculo-skeletal model was developed at the University in the departments of Mechanical and Biomedical Engineering. This model accurately represents human shoulder kinematics. The result of this model is the graphical display of bones moving through an appropriate range of motion based on inputs of EMGs and external forces. The need existed to incorporate a geometric muscle model in the larger musculo-skeletal model. Previous muscle models did not accurately represent muscle geometries, nor did they account for the kinematics of tendons. This thesis covers the creation of a new muscle model for use in the above musculo-skeletal model. This muscle model was based on anatomical data from the Visible Human Project (VHP) cadaver study. Two-dimensional digital images from the VHP were analyzed and reconstructed to recreate the three-dimensional muscle geometries. The recreated geometries were smoothed, reduced, and sliced to form data files defining the surfaces of each muscle. The muscle modeling function opened these files during run-time and recreated the muscle surface. The modeling function applied constant volume limitations to the muscle and constant geometry limitations to the tendons.

  1. Accurate oscillator strengths for interstellar ultraviolet lines of Cl I

    NASA Technical Reports Server (NTRS)

    Schectman, R. M.; Federman, S. R.; Beideck, D. J.; Ellis, D. J.

    1993-01-01

    Analyses on the abundance of interstellar chlorine rely on accurate oscillator strengths for ultraviolet transitions. Beam-foil spectroscopy was used to obtain f-values for the astrophysically important lines of Cl I at 1088, 1097, and 1347 A. In addition, the line at 1363 A was studied. Our f-values for 1088, 1097 A represent the first laboratory measurements for these lines; the values are f(1088)=0.081 +/- 0.007 (1 sigma) and f(1097) = 0.0088 +/- 0.0013 (1 sigma). These results resolve the issue regarding the relative strengths for 1088, 1097 A in favor of those suggested by astronomical measurements. For the other lines, our results of f(1347) = 0.153 +/- 0.011 (1 sigma) and f(1363) = 0.055 +/- 0.004 (1 sigma) are the most precisely measured values available. The f-values are somewhat greater than previous experimental and theoretical determinations.

  2. Towards quantitative atmospheric water vapor profiling with differential absorption lidar.

    PubMed

    Dinovitser, Alex; Gunn, Lachlan J; Abbott, Derek

    2015-08-24

    Differential Absorption Lidar (DIAL) is a powerful laser-based technique for trace gas profiling of the atmosphere. However, this technique is still under active development, requiring precise and accurate wavelength stabilization as well as accurate spectroscopic parameters of the specific resonance line and the effective absorption cross-section of the system. In this paper we describe a novel master laser system that extends our previous work on robust stabilization to virtually any number of side-line laser wavelengths for future probing to greater altitudes. We also highlight the significance of laser spectral purity for DIAL accuracy, and illustrate a simple re-arrangement of the system for measuring the effective absorption cross-section. We present a calibration technique where the laser light is guided to an absorption cell with a 33 m path length, and a quantitative number density measurement is then used to obtain the effective absorption cross-section. The same absorption cell is then used for on-line laser stabilization, while microwave beat-frequencies are used to stabilize any number of off-line lasers. We present preliminary results using ∼300 nJ, 1 μs pulses at 3 kHz, with the seed laser operating as a nanojoule transmitter at 822.922 nm, and a receiver consisting of a photomultiplier tube (PMT) coupled to a 356 mm mirror.
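
    The absorption-cell calibration described above amounts to inverting the Beer-Lambert law. A minimal sketch, with purely illustrative transmission, number density, and path-length values (only the 33 m cell length comes from the abstract):

```python
import math

def effective_cross_section(P0, P, number_density, path_length_m):
    """Beer-Lambert inversion for an absorption-cell calibration:
    P = P0 * exp(-sigma * N * L)  =>  sigma = ln(P0/P) / (N * L).
    All numeric inputs below are hypothetical, not the paper's data."""
    return math.log(P0 / P) / (number_density * path_length_m)

# Example: 40% transmission through the 33 m cell at an assumed
# water-vapour number density of 2.0e23 m^-3.
sigma = effective_cross_section(1.0, 0.4, 2.0e23, 33.0)
print(f"{sigma:.3e} m^2")
```

Given a measured number density, this yields the system's effective cross-section directly; the same relation is then applied in reverse during profiling.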

  3. Quantitative fluorescence spectroscopy in turbid media: a practical solution to the problem of scattering and absorption.

    PubMed

    Chen, Yao; Chen, Zeng-Ping; Yang, Jing; Jin, Jing-Wen; Zhang, Juan; Yu, Ru-Qin

    2013-02-19

    The presence of practically unavoidable scatterers and background absorbers in turbid media such as biological tissue or cell suspensions can significantly distort the shape and intensity of fluorescence spectra of fluorophores and, hence, greatly hinder the in situ quantitative determination of fluorophores in turbid media. In this contribution, a quantitative fluorescence model (QFM) was proposed to explicitly model the effects of the scattering and absorption on fluorescence measurements. On the basis of the proposed model, a calibration strategy was developed to remove the detrimental effects of scattering and absorption and, hence, realize accurate quantitative analysis of fluorophores in turbid media. A proof-of-concept model system, the determination of free Ca(2+) in turbid media using Fura-2, was utilized to evaluate the performance of the proposed method. Experimental results showed that QFM can provide quite precise concentration predictions for free Ca(2+) in turbid media with an average relative error of about 7%, probably the best results ever achieved for turbid media without the use of advanced optical technologies. QFM has not only good performance but also simplicity of implementation. It does not require characterization of the light scattering properties of turbid media, provided that the light scattering and absorption properties of the test samples are reasonably close to those of the calibration samples. QFM can be developed and extended in many application areas such as ratiometric fluorescent sensors for quantitative live cell imaging.

  4. Method for quantitative proteomics research by using metal element chelated tags coupled with mass spectrometry.

    PubMed

    Liu, Huiling; Zhang, Yangjun; Wang, Jinglan; Wang, Dong; Zhou, Chunxi; Cai, Yun; Qian, Xiaohong

    2006-09-15

    The mass spectrometry-based methods with a stable isotope as the internal standard in quantitative proteomics have been developed quickly in recent years. But the use of some stable isotope reagents is limited by the relatively high price and synthetic difficulties. We have developed a new method for quantitative proteomics research by using metal element chelated tags (MECT) coupled with mass spectrometry. The bicyclic anhydride of diethylenetriamine-N,N,N',N'',N''-pentaacetic acid (DTPA) is covalently coupled to primary amines of peptides, and the ligand is then chelated to the rare earth metals Y and Tb. The tagged peptides are mixed and analyzed by LC-ESI-MS/MS. Peptides are quantified by measuring the relative signal intensities for the Y and Tb tag pairs in MS, which permits the quantitation of the original proteins generating the corresponding peptides. The protein is then identified by the corresponding peptide sequence from its MS/MS spectrum. The MECT method was evaluated by using standard proteins as a model sample. The experimental results showed that metal chelate-tagged peptides chromatographically coeluted successfully during the reversed-phase LC analysis. The relative quantitation results were accurate for proteins using MECT. DTPA modification of the N-terminus of peptides promoted cleaner fragmentation (only y-series ions) in mass spectrometry and improved the confidence level of protein identification. The MECT strategy provides a simple, rapid, and economical alternative to current mass tagging technologies available.

  5. Quantitative optical techniques for dense sprays investigation: A survey

    NASA Astrophysics Data System (ADS)

    Coghe, A.; Cossali, G. E.

    2012-01-01

    The experimental study of dense sprays by optical techniques poses many challenges and no methods have proven to be completely reliable when accurate quantitative data are required, for example to validate breakup models and CFD simulations. The present survey is aimed at a critical analysis of optical techniques capable of providing quantitative and reliable data in dense sprays and at pointing out the conditions necessary to safely obtain such measurements. A single parameter, the optical depth, is proposed to quantify the concept of dense spray and to indicate when multiple scattering becomes predominant and could make the experimental results questionable. The available optical techniques are divided into two categories: the "classical" ones, like PDA, LDV, PIV, etc., that work well in dilute sprays but show many limitations in dense sprays, and the "emerging" ones more suitable for dense sprays. Among the latter, those considered most promising are discussed in detail. A number of significant applications are also presented and discussed to better clarify the nature of such a complex problem and the feasibility of the newly proposed approaches.
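
    The optical depth used above as the dense-spray criterion follows from a line-of-sight transmission measurement. A minimal sketch (the transmission value and the threshold separating the multiple-scattering regime are illustrative assumptions, not figures from the survey):

```python
import math

def optical_depth(incident, transmitted):
    """Optical depth tau = -ln(I/I0) from a Beer-Lambert transmission
    measurement along the probe beam (illustrative sketch)."""
    return -math.log(transmitted / incident)

# A spray transmitting 5% of the probe beam:
tau = optical_depth(1.0, 0.05)
# Treating tau greater than ~2 as multiple-scattering dominated is an
# assumption for this sketch; the survey discusses the actual criterion.
dense = tau > 2.0
print(round(tau, 2), dense)
```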

  6. Quantitative nuclear magnetic resonance imaging: characterisation of experimental cerebral oedema.

    PubMed Central

    Barnes, D; McDonald, W I; Johnson, G; Tofts, P S; Landon, D N

    1987-01-01

    Magnetic resonance imaging (MRI) has been used quantitatively to define the characteristics of two different models of experimental cerebral oedema in cats: vasogenic oedema produced by cortical freezing and cytotoxic oedema induced by triethyl tin. The MRI results have been correlated with the ultrastructural changes. The images accurately delineated the anatomical extent of the oedema in the two lesions, but did not otherwise discriminate between them. The patterns of measured increase in T1' and T2' were, however, characteristic for each type of oedema, and reflected the protein content. The magnetisation decay characteristics of both normal and oedematous white matter were monoexponential for T1 but biexponential for T2 decay. The relative sizes of the two component exponentials of the latter corresponded with the physical sizes of the major tissue water compartments. Quantitative MRI data can provide reliable information about the physico-chemical environment of tissue water in normal and oedematous cerebral tissue, and are useful for distinguishing between acute and chronic lesions in multiple sclerosis. PMID:3572428
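
    The biexponential T2 decay described above can be illustrated with a small curve-fitting sketch. The echo times, compartment fractions, and T2 values below are synthetic, and SciPy's `curve_fit` stands in for whatever fitting procedure the authors used:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, t2a, a2, t2b):
    """Two-compartment T2 decay: S(t) = a1*exp(-t/t2a) + a2*exp(-t/t2b)."""
    return a1 * np.exp(-t / t2a) + a2 * np.exp(-t / t2b)

# Synthetic echo train (ms): 70% "slow" water at T2 = 90 ms and
# 30% "fast" water at T2 = 20 ms (hypothetical compartment values).
t = np.arange(10, 330, 10, dtype=float)
signal = biexp(t, 0.7, 90.0, 0.3, 20.0)

popt, _ = curve_fit(biexp, t, signal, p0=[0.5, 100.0, 0.5, 10.0])
a1, t2a, a2, t2b = popt
print(round(t2a), round(t2b))  # recovered compartment T2 values
```

The fitted amplitudes a1 and a2 play the role of the "relative sizes of the two component exponentials" that the study maps onto tissue water compartments.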

  7. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.

  8. Quantitative imaging features: extension of the oncology medical image database

    NASA Astrophysics Data System (ADS)

    Patel, M. N.; Looney, P. T.; Young, K. C.; Halling-Brown, M. D.

    2015-03-01

    Radiological imaging is fundamental within the healthcare industry and has become routinely adopted for diagnosis, disease monitoring and treatment planning. With the advent of digital imaging modalities and the rapid growth in both diagnostic and therapeutic imaging, the ability to harness this large influx of data is of paramount importance. The Oncology Medical Image Database (OMI-DB) was created to provide a centralized, fully annotated dataset for research. The database contains both processed and unprocessed images, associated data, annotations and, where applicable, expert-determined ground truths describing features of interest. Medical imaging provides the ability to detect and localize many changes that are important to determine whether a disease is present or a therapy is effective by depicting alterations in anatomic, physiologic, biochemical or molecular processes. Quantitative imaging features are sensitive, specific, accurate and reproducible imaging measures of these changes. Here, we describe an extension to the OMI-DB whereby a range of imaging features and descriptors are pre-calculated using a high throughput approach. The ability to calculate multiple imaging features and data from the acquired images is valuable and facilitates further research applications investigating detection, prognosis, and classification. The resultant data store contains more than 10 million quantitative features as well as features derived from CAD predictions. These data can be used to build predictive models to aid image classification and treatment response assessment, as well as to identify prognostic imaging biomarkers.

  9. Quantitative analysis of tumor burden in mouse lung via MRI.

    PubMed

    Tidwell, Vanessa K; Garbow, Joel R; Krupnick, Alexander S; Engelbach, John A; Nehorai, Arye

    2012-02-01

    Lung cancer is the leading cause of cancer death in the United States. Despite recent advances in screening protocols, the majority of patients still present with advanced or disseminated disease. Preclinical rodent models provide a unique opportunity to test novel therapeutic drugs for targeting lung cancer. Respiratory-gated MRI is a key tool for quantitatively measuring lung-tumor burden and monitoring the time-course progression of individual tumors in mouse models of primary and metastatic lung cancer. However, quantitative analysis of lung-tumor burden in mice by MRI presents significant challenges. Herein, a method for measuring tumor burden based upon average lung-image intensity is described and validated. The method requires accurate lung segmentation; its efficiency and throughput would be greatly aided by the ability to automatically segment the lungs. A technique for automated lung segmentation in the presence of varying tumor burden levels is presented. The method includes development of a new, two-dimensional parametric model of the mouse lungs and a multi-faceted cost function to optimally fit the model parameters to each image. Results demonstrate a strong correlation (0.93), comparable with that of fully manual expert segmentation, between the automated method's tumor-burden metric and the tumor burden measured by lung weight.
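
    The tumor-burden metric above is based on average lung-image intensity within the segmented lungs. A toy sketch of that idea (the image, mask, and intensity values are synthetic; this is not the paper's implementation):

```python
import numpy as np

def tumor_burden_metric(image, lung_mask):
    """Tumor-burden surrogate: mean image intensity inside the segmented
    lungs. Healthy lung is mostly air (low MRI signal), so tumor tissue
    raises the mean. Illustrative sketch, not the authors' code."""
    return float(image[lung_mask].mean())

# Toy example: a low-signal 'lung' region containing a bright 'tumor'.
img = np.zeros((64, 64))
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True          # segmented lung field
img[mask] = 0.1                    # air-dominated parenchyma signal
img[28:36, 28:36] = 0.9            # tumor nodule inside the lung field
healthy = np.full_like(img, 0.1)   # tumor-free reference

print(tumor_burden_metric(img, mask) > tumor_burden_metric(healthy, mask))
```

In the paper this per-image metric is what correlates (0.93) with tumor burden measured by lung weight; the hard part, which the authors automate, is producing `lung_mask` reliably as tumor burden varies.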

  10. Linkage disequilibrium interval mapping of quantitative trait loci

    PubMed Central

    Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte

    2006-01-01

    Background For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Results Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Conclusion Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates. PMID:16542433

  11. Quantitative analysis of glycated proteins.

    PubMed

    Priego-Capote, Feliciano; Ramírez-Boo, María; Finamore, Francesco; Gluck, Florent; Sanchez, Jean-Charles

    2014-02-07

    The proposed protocol presents a comprehensive approach for large-scale qualitative and quantitative analysis of glycated proteins (GP) in complex biological samples including biological fluids and cell lysates such as plasma and red blood cells. The method, named glycation isotopic labeling (GIL), is based on the differential labeling of proteins with isotopic [(13)C6]-glucose, which supports quantitation of the resulting glycated peptides after enzymatic digestion with endoproteinase Glu-C. The key principle of the GIL approach is the detection of doublet signals for each glycated peptide in MS precursor scanning (glycated peptide with in vivo [(12)C6]- and in vitro [(13)C6]-glucose). The mass shift of the doublet signals is +6, +3 or +2 Da depending on the peptide charge state and the number of glycation sites. The intensity ratio between doublet signals generates quantitative information of glycated proteins that can be related to the glycemic state of the studied samples. Tandem mass spectrometry with high-energy collisional dissociation (HCD-MS2) and data-dependent methods with collision-induced dissociation (CID-MS3 neutral loss scan) are used for qualitative analysis.
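
    The doublet mass shifts quoted above (+6, +3 or +2 Da) follow directly from the six 13C atoms per glucose label divided by the peptide charge state. A small arithmetic sketch:

```python
def doublet_spacing(n_glycation_sites, charge):
    """Nominal m/z spacing between the light ([12C6]) and heavy ([13C6])
    members of a GIL doublet: 6 Da per glycation site, divided by the
    charge. (Nominal masses; the exact 13C-12C difference is about
    1.00336 Da per atom, i.e. ~6.02 Da per label.)"""
    return 6 * n_glycation_sites / charge

# The +6, +3 and +2 Da spacings cited in the abstract correspond to a
# singly glycated peptide at charge states 1, 2 and 3:
print([doublet_spacing(1, z) for z in (1, 2, 3)])  # [6.0, 3.0, 2.0]
```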

  12. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments from the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
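
    Fourier descriptors of a traced grain boundary, as mentioned above, can be sketched as follows. This is a generic descriptor construction, not the authors' new parameter or their Mathematica code; the circle/ellipse test shapes are illustrative:

```python
import numpy as np

def fourier_descriptors(x, y, n_harmonics=8):
    """Translation-, scale- and rotation-invariant Fourier descriptors of a
    closed boundary sampled at points (x[i], y[i])."""
    z = np.asarray(x, float) + 1j * np.asarray(y, float)
    F = np.fft.fft(z)
    mag = np.abs(F)
    # Drop F[0] (translation), divide by |F[1]| (scale), and keep the
    # magnitudes of the remaining positive and negative harmonics
    # (rotation / starting-point invariance).
    desc = np.concatenate([mag[2:2 + n_harmonics], mag[-n_harmonics:]])
    return desc / mag[1]

# A perfect circle has vanishing higher harmonics; an ellipse does not,
# so the descriptors separate rounder grains from elongated ones.
t = np.linspace(0, 2 * np.pi, 256, endpoint=False)
circle = fourier_descriptors(np.cos(t), np.sin(t))
ellipse = fourier_descriptors(2 * np.cos(t), np.sin(t))
print(circle.max(), ellipse.max())
```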

  13. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome

    PubMed Central

    2016-01-01

    Objective To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Methods Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Results Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). Conclusion These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US. PMID:28119835

  14. Quantitative assay of photoinduced DNA strand breaks by real-time PCR.

    PubMed

    Wiczk, Justyna; Westphal, Kinga; Rak, Janusz

    2016-09-05

    Real-time PCR (qPCR), a modern methodology primarily used for studying gene expression, has been employed for the quantitative assay of an important class of DNA damage: single strand breaks (SSBs). These DNA lesions, which may lead to highly cytotoxic double strand breaks, were quantified in a model system where double-stranded DNA was sensitized to UV photons by labeling with 5-bromo-2'-deoxyuridine. The amount of breaks formed due to irradiation with several doses of 320 nm photons was assayed by two independent methods: LC-MS and qPCR. A very good agreement between the relative damage measured by the two completely different analytical tools proves the applicability of qPCR for the quantitative analysis of SSBs. Our results suggest that the popularity of the hitherto underestimated though accurate and site-specific technique of real-time PCR may increase in future DNA damage studies.
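
    A standard way that qPCR damage assays convert amplification into a lesion count (given here as an illustrative sketch, not necessarily the authors' exact analysis) assumes Poisson-distributed polymerase-blocking lesions, so the undamaged fraction of templates is exp(-lambda):

```python
import math

def lesions_per_fragment(rel_amplification):
    """Poisson zero-class estimate: if every lesion blocks the polymerase,
    the fraction of fully amplifiable templates is exp(-lambda), hence
    lambda = -ln(A_damaged / A_control). The 60% figure below is a
    hypothetical example, not a result from the paper."""
    return -math.log(rel_amplification)

# 60% relative amplification after UV exposure:
print(round(lesions_per_fragment(0.6), 3))
```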

  15. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    NASA Astrophysics Data System (ADS)

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-08-01

    Local surface charge density of lipid membranes influences membrane-protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high-quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values.

  16. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    PubMed Central

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-01-01

    Local surface charge density of lipid membranes influences membrane–protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high-quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values. PMID:27561322

  17. Development of a quantitative fluorescence-based ligand-binding assay

    PubMed Central

    Breen, Conor J.; Raverdeau, Mathilde; Voorheis, H. Paul

    2016-01-01

    A major goal of biology is to develop a quantitative ligand-binding assay that does not involve the use of radioactivity. Existing fluorescence-based assays have a serious drawback due to fluorescence quenching that accompanies the binding of fluorescently-labeled ligands to their receptors. This limitation of existing fluorescence-based assays prevents the number of cellular receptors under investigation from being accurately measured. We have developed a method where FITC-labeled proteins bound to a cell surface are proteolyzed extensively to eliminate fluorescence quenching and then the fluorescence of the resulting sample is compared to that of a known concentration of the proteolyzed FITC-protein employed. This step enables the number of cellular receptors to be measured quantitatively. We expect that this method will provide researchers with a viable alternative to the use of radioactivity in ligand binding assays. PMID:27161290

  18. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR is its accurate function with external calibration, the lack of any requirement for identical reference materials, a high precision and accuracy when properly validated, and an ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experimental evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product and other areas of research. PMID:22482996

  19. An efficient approach to the quantitative analysis of humic acid in water.

    PubMed

    Wang, Xue; Li, Bao Qiong; Zhai, Hong Lin; Xiong, Meng Yi; Liu, Ying

    2016-01-01

    Rayleigh and Raman scatterings inevitably appear in fluorescence measurements, which makes quantitative analysis more difficult, especially when target signals and scattering signals overlap. Based on the grayscale images of three-dimensional fluorescence spectra, a linear model with two selected Zernike moments was established for the determination of humic acid, and applied to the quantitative analysis of a real sample taken from the Yellow River. The correlation coefficient (R(2)) and leave-one-out cross-validation correlation coefficient (R(2)cv) were up to 0.9994 and 0.9987, respectively. The average recovery reached 96.28%. Compared with the N-way partial least squares and alternating trilinear decomposition methods, our approach was immune to the scattering and noise signals owing to its powerful multi-resolution characteristic, and the obtained results were more reliable and accurate; the approach could also be applied in food analyses.
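
    The calibration step above is a linear model on two image-moment features. A minimal sketch with synthetic feature values standing in for the two selected Zernike moments (computing actual Zernike moments from spectral images is outside this sketch):

```python
import numpy as np

# Synthetic two-feature calibration set (stand-ins for Zernike moments).
moments = np.array([[0.12, 0.31],
                    [0.25, 0.64],
                    [0.37, 0.93],
                    [0.51, 1.28]])
conc = np.array([1.0, 2.0, 3.0, 4.0])   # humic acid, mg/L (synthetic)

# Least-squares fit of conc ~ b0 + b1*m1 + b2*m2.
X = np.column_stack([np.ones(len(conc)), moments])
coef, *_ = np.linalg.lstsq(X, conc, rcond=None)
pred = X @ coef
r2 = 1 - ((conc - pred) ** 2).sum() / ((conc - conc.mean()) ** 2).sum()
print(round(r2, 4))
```

The design choice mirrors the abstract: once scattering-robust features are extracted from the grayscale spectra, the quantitation itself reduces to ordinary linear calibration.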

  20. Quantitative two-dimensional measurement of oil-film thickness by laser-induced fluorescence in a piston-ring model experiment.

    PubMed

    Wigger, Stefan; Füßer, Hans-Jürgen; Fuhrmann, Daniel; Schulz, Christof; Kaiser, Sebastian A

    2016-01-10

    This paper describes advances in using laser-induced fluorescence of dyes for imaging the thickness of oil films in a rotating ring tribometer with optical access, an experiment representing a sliding piston ring in an internal combustion engine. A method for quantitative imaging of the oil-film thickness is developed that overcomes the main challenge, the accurate calibration of the detected fluorescence signal for film thicknesses in the micrometer range. The influence of the background material and its surface roughness is examined, and a method for flat-field correction is introduced. Experiments in the tribometer show that the method yields quantitative, physically plausible results, visualizing features with submicrometer thickness.

  1. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g) and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed, namely (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative and semi-quantitative analyses and nondetects of Listeria monocytogenes in smoked fish samples. The first of these case studies is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way for meta-analysis of the many relevant yet diverse data sets that are available in the literature and in (inter)national reports of surveillance or baseline surveys; it therefore increases the information input of a risk assessment and, by consequence, the correctness of its outcome.
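
    The mixed-data likelihood described above combines densities for quantitative results with cumulative probabilities for censored ones. A minimal sketch, assuming (as is common, though not stated here) a normal variability distribution on the log10 scale, with entirely synthetic data and a detection limit chosen for illustration:

```python
import numpy as np
from scipy import stats, optimize

# Synthetic mixed data on the log10 CFU/g scale:
quantitative = np.array([1.8, 2.3, 2.9, 3.1])  # enumeration results
n_absent, lod = 6, 1.0                          # qualitative: below 1 log10
intervals = [(1.0, 2.0)] * 3                    # semi-quantitative brackets

def neg_loglik(theta):
    """Each data type contributes its own likelihood term:
    density f(x) for counts, F(LOD) for absences, F(b)-F(a) for
    semi-quantitative intervals."""
    mu, sigma = theta
    if sigma <= 0:
        return np.inf
    ll = stats.norm.logpdf(quantitative, mu, sigma).sum()
    ll += n_absent * stats.norm.logcdf(lod, mu, sigma)
    ll += sum(np.log(stats.norm.cdf(b, mu, sigma)
                     - stats.norm.cdf(a, mu, sigma)) for a, b in intervals)
    return -ll

res = optimize.minimize(neg_loglik, x0=[2.0, 1.0], method="Nelder-Mead")
mu_hat, sigma_hat = res.x
print(round(mu_hat, 2), round(sigma_hat, 2))
```

The fitted (mu, sigma) pair is what would then feed the bootstrap and the second-order Monte Carlo simulation mentioned in the abstract.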

  2. A unique, accurate LWIR optics measurement system

    NASA Astrophysics Data System (ADS)

    Fantone, Stephen D.; Orband, Daniel G.

    2011-05-01

    A compact low-cost LWIR test station has been developed that provides real-time MTF testing of IR optical systems and EO imaging systems. The test station is intended to be operated by a technician and can be used to measure the focal length, blur spot size, distortion, and other metrics of system performance. The challenges and tradeoffs incorporated into this instrumentation will be presented. The test station measures an IR lens or optical system's first-order quantities (focal length, back focal length) as well as on- and off-axis imaging performance (e.g., MTF, resolution, spot size) under actual test conditions, enabling simulation of actual use. Also described is the method of attaining the accuracies needed so that derived calculations like focal length (EFL = image shift/tan(theta)) can be performed to the requisite precision. The station incorporates a patented video capture technology and measures MTF and blur characteristics using newly available low-cost LWIR cameras. This allows real-time determination of optical system performance, enabling faster measurements, higher throughput and lower-cost results than scanning systems. Multiple spectral filters are also accommodated within the test station, facilitating performance evaluation under various spectral conditions.
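
    The derived focal-length calculation cited above (EFL = image shift / tan(theta)) is a one-liner; the shift and field-angle values in this sketch are illustrative, not measurements from the station:

```python
import math

def focal_length_mm(image_shift_mm, field_angle_deg):
    """EFL = image shift / tan(theta): the image of a collimated source
    moves by EFL*tan(theta) when the field angle changes by theta."""
    return image_shift_mm / math.tan(math.radians(field_angle_deg))

# Hypothetical example: a 4.37 mm image shift for a 5 degree field angle.
print(round(focal_length_mm(4.37, 5.0), 1))
```

The abstract's point about "attaining the needed accuracies" is visible here: errors in either the measured shift or the angle propagate directly into the derived EFL.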

  3. Accurate Measurement of Bone Density with QCT

    NASA Technical Reports Server (NTRS)

    Cleek, Tammy M.; Beaupre, Gary S.; Matsubara, Miki; Whalen, Robert T.; Dalton, Bonnie P. (Technical Monitor)

    2002-01-01

    The objective of this study was to determine the accuracy of bone density measurement with a new QCT technology. A phantom was fabricated using two materials, a water-equivalent compound and hydroxyapatite (HA), combined in precise proportions (QRM GmbH, Germany). The phantom was designed to have the approximate physical size and range in bone density of a human calcaneus, with regions of 0, 50, 100, 200, 400, and 800 mg/cc HA. The phantom was scanned at 80, 120, and 140 kVp with a GE CT/i HiSpeed Advantage scanner. A ring of highly attenuating material (polyvinyl chloride or Teflon) was slipped over the phantom to alter the image by introducing non-axisymmetric beam hardening. Images were corrected with the new QCT technology using an estimate of the effective X-ray beam spectrum to eliminate beam hardening artifacts. The algorithm computes the volume fraction of HA and water-equivalent matrix in each voxel. We found excellent agreement between expected and computed HA volume fractions. Results were insensitive to beam hardening ring material, HA concentration, and scan voltage settings. Data from all three voltages with a best-fit linear regression are displayed.
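    In its simplest linear form, the two-material decomposition mentioned above reduces to one mixture equation per voxel. The sketch below is a generic illustration of that idea with hypothetical calibration values, not the study's spectrum-based algorithm:

    ```python
    def ha_volume_fraction(mu_voxel: float, mu_matrix: float, mu_ha: float) -> float:
        """Volume fraction of hydroxyapatite (HA) in a voxel, assuming the
        voxel's attenuation is a linear mix of the water-equivalent matrix
        and pure HA: mu_voxel = vf * mu_ha + (1 - vf) * mu_matrix."""
        return (mu_voxel - mu_matrix) / (mu_ha - mu_matrix)

    # A voxel halfway between the two calibration values is half HA by volume;
    # multiplying vf by the density of pure HA then gives mg/cc HA.
    ```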

  4. A new approach to compute accurate velocity of meteors

    NASA Astrophysics Data System (ADS)

    Egal, Auriane; Gural, Peter; Vaubaillon, Jeremie; Colas, Francois; Thuillot, William

    2016-10-01

    The CABERNET project was designed to push the limits of meteoroid orbit measurements by improving the determination of the meteors' velocities. Indeed, despite the development of camera networks dedicated to the observation of meteors, there is still an important discrepancy between the measured orbits of meteoroids and the theoretical results. The gap between the observed and theoretical semi-major axes of the orbits is especially significant; an accurate determination of the orbits of meteoroids therefore largely depends on the computation of the pre-atmospheric velocities. It is thus imperative to determine how to increase the precision of the velocity measurements. In this work, we perform an analysis of different methods currently used to compute the velocities and trajectories of meteors. They are based on the intersecting planes method developed by Ceplecha (1987), the least squares method of Borovicka (1990), and the multi-parameter fitting (MPF) method published by Gural (2012). In order to objectively compare the performances of these techniques, we have simulated realistic meteors ('fakeors') reproducing the measurement errors of many camera networks. Some fakeors are built following the propagation models studied by Gural (2012), and others are created by numerical integration using the Borovicka et al. (2007) model. Different optimization techniques have also been investigated in order to pick the most suitable one for solving the MPF, and the influence of the geometry of the trajectory on the result is also presented. We will present here the results of an improved implementation of the multi-parameter fitting that allows an accurate orbit computation of meteors with CABERNET. The comparison of the different velocity computations seems to show that while the MPF is by far the best method for solving the trajectory and the velocity of a meteor, the ill-conditioning of the cost functions used can lead to large estimation errors for noisy

  5. Energy & Climate: Getting Quantitative

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.

  6. Conformation of a flexible polymer in explicit solvent: Accurate solvation potentials for Lennard-Jones chains.

    PubMed

    Taylor, Mark P; Ye, Yuting; Adhikari, Shishir R

    2015-11-28

    The conformation of a polymer chain in solution is coupled to the local structure of the surrounding solvent and can undergo large changes in response to variations in solvent density and temperature. The many-body effects of solvent on the structure of an n-mer polymer chain can be formally mapped to an exact n-body solvation potential. Here, we use a pair decomposition of this n-body potential to construct a set of two-body potentials for a Lennard-Jones (LJ) polymer chain in explicit LJ solvent. The solvation potentials are built from numerically exact results for 5-mer chains in solvent combined with an approximate asymptotic expression for the solvation potential between sites that are distant along the chain backbone. These potentials map the many-body chain-in-solvent problem to a few-body single-chain problem and can be used to study a chain of arbitrary length, thereby dramatically reducing the computational complexity of the polymer chain-in-solvent problem. We have constructed solvation potentials at a large number of state points across the LJ solvent phase diagram including the vapor, liquid, and super-critical regions. We use these solvation potentials in single-chain Monte Carlo (MC) simulations with n ≤ 800 to determine the size, intramolecular structure, and scaling behavior of chains in solvent. To assess our results, we have carried out full chain-in-solvent MC simulations (with n ≤ 100) and find that our solvation potential approach is quantitatively accurate for a wide range of solvent conditions for these chain lengths.

  7. Accurate measurement of transgene copy number in crop plants using droplet digital PCR.

    PubMed

    Collier, Ray; Dasgupta, Kasturi; Xing, Yan-Ping; Hernandez, Bryan Tarape; Shao, Min; Rohozinski, Dominica; Kovak, Emma; Lin, Jeanie; de Oliveira, Maria Luiza P; Stover, Ed; McCue, Kent F; Harmon, Frank G; Blechl, Ann; Thomson, James G; Thilmony, Roger

    2017-02-23

    Genetic transformation is a powerful means for the improvement of crop plants, but requires labor and resource intensive methods. An efficient method for identifying single copy transgene insertion events from a population of independent transgenic lines is desirable. Currently transgene copy number is estimated by either Southern blot hybridization analyses or quantitative polymerase chain reaction (qPCR) experiments. Southern hybridization is a convincing and reliable method, but it also is expensive, time-consuming and often requires a large amount of genomic DNA and radioactively labeled probes. Alternatively, qPCR requires less DNA and is potentially simpler to perform, but its results can lack the accuracy and precision needed to confidently distinguish between one and two copy events in transgenic plants with large genomes. To address this need, we developed a droplet digital PCR (dPCR)-based method for transgene copy number measurement in an array of crops: rice, citrus, potato, maize, tomato, and wheat. The method utilizes specific primers to amplify target transgenes, and endogenous reference genes in a single duplexed reaction containing thousands of droplets. Endpoint amplicon production in the droplets is detected and quantified using sequence-specific fluorescently labeled probes. The results demonstrate that this approach can generate confident copy number measurements in independent transgenic lines in these crop species. This method and the compendium of probes and primers will be a useful resource for the plant research community, enabling the simple and accurate determination of transgene copy number in these six important crop species. This article is protected by copyright. All rights reserved.
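    The droplet counting behind such an assay follows standard digital-PCR Poisson statistics. The sketch below shows that generic arithmetic only; the function names and droplet counts are invented, and this is not the authors' analysis pipeline:

    ```python
    import math

    def copies_per_droplet(positive: int, total: int) -> float:
        # Poisson correction: some droplets hold more than one molecule,
        # so the mean occupancy is -ln(fraction of negative droplets).
        return -math.log(1.0 - positive / total)

    def transgene_copy_number(pos_target: int, pos_ref: int, total: int,
                              ref_copies_per_genome: int = 2) -> float:
        # Ratio of target to reference concentration, scaled by the number
        # of reference-gene copies per genome (2 for a diploid genome with
        # a single-copy endogenous reference).
        lam_t = copies_per_droplet(pos_target, total)
        lam_r = copies_per_droplet(pos_ref, total)
        return ref_copies_per_genome * lam_t / lam_r
    ```

    With this convention, equal positive-droplet counts for target and reference imply two transgene copies per genome, and a hemizygous single-copy insertion shows roughly half the reference occupancy.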

  8. Application of the accurate mass and time tag approach in studies of the human blood lipidome

    PubMed Central

    Ding, Jie; Sorensen, Christina M.; Jaitly, Navdeep; Jiang, Hongliang; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Metz, Thomas O.

    2008-01-01

    We report a preliminary demonstration of the accurate mass and time (AMT) tag approach for lipidomics. Initial data-dependent LC-MS/MS analyses of human plasma, erythrocyte, and lymphocyte lipids were performed in order to identify lipid molecular species in conjunction with complementary accurate mass and isotopic distribution information. Identified lipids were used to populate initial lipid AMT tag databases containing 250 and 45 entries for those species detected in positive and negative electrospray ionization (ESI) modes, respectively. The positive ESI database was then utilized to identify human plasma, erythrocyte, and lymphocyte lipids in high-throughput LC-MS analyses based on the AMT tag approach. We were able to define the lipid profiles of human plasma, erythrocytes, and lymphocytes based on qualitative and quantitative differences in lipid abundance. PMID:18502191

  9. Robust Accurate Non-Invasive Analyte Monitor

    SciTech Connect

    Robinson, Mark R.

    1998-11-03

    An improved method and apparatus for determining noninvasively and in vivo one or more unknown values of a known characteristic, particularly the concentration of an analyte in human tissue. The method includes: (1) irradiating the tissue with infrared energy (400 nm-2400 nm) having at least several wavelengths in a given range of wavelengths so that there is differential absorption of at least some of the wavelengths by the tissue as a function of the wavelengths and the known characteristic, the differential absorption causing intensity variations of the wavelengths incident from the tissue; (2) providing a first path through the tissue; (3) optimizing the first path for a first sub-region of the range of wavelengths to maximize the differential absorption by at least some of the wavelengths in the first sub-region; (4) providing a second path through the tissue; and (5) optimizing the second path for a second sub-region of the range, to maximize the differential absorption by at least some of the wavelengths in the second sub-region. In the preferred embodiment a third path through the tissue is provided for, which path is optimized for a third sub-region of the range. With this arrangement, spectral variations which are the result of tissue differences (e.g., melanin and temperature) can be reduced. At least one of the paths represents a partial transmission path through the tissue. This partial transmission path may pass through the nail of a finger once and, preferably, twice. Also included are apparatus for: (1) reducing the arterial pulsations within the tissue; and (2) maximizing the blood content in the tissue.

  10. Toward Accurate Adsorption Energetics on Clay Surfaces

    PubMed Central

    2016-01-01

    Clay minerals are ubiquitous in nature, and the manner in which they interact with their surroundings has important industrial and environmental implications. Consequently, a molecular-level understanding of the adsorption of molecules on clay surfaces is crucial. In this regard computer simulations play an important role, yet the accuracy of widely used empirical force fields (FF) and density functional theory (DFT) exchange-correlation functionals is often unclear in adsorption systems dominated by weak interactions. Herein we present results from quantum Monte Carlo (QMC) for water and methanol adsorption on the prototypical clay kaolinite. To the best of our knowledge, this is the first time QMC has been used to investigate adsorption at a complex, natural surface such as a clay. As well as being valuable in their own right, the QMC benchmarks obtained provide reference data against which the performance of cheaper DFT methods can be tested. Indeed using various DFT exchange-correlation functionals yields a very broad range of adsorption energies, and it is unclear a priori which evaluation is better. QMC reveals that in the systems considered here it is essential to account for van der Waals (vdW) dispersion forces since this alters both the absolute and relative adsorption energies of water and methanol. We show, via FF simulations, that incorrect relative energies can lead to significant changes in the interfacial densities of water and methanol solutions at the kaolinite interface. Despite the clear improvements offered by the vdW-corrected and the vdW-inclusive functionals, absolute adsorption energies are often overestimated, suggesting that the treatment of vdW forces in DFT is not yet a solved problem. PMID:27917256

  11. Diagnostic limitations to accurate diagnosis of cholera.

    PubMed

    Alam, Munirul; Hasan, Nur A; Sultana, Marzia; Nair, G Balakrish; Sadique, A; Faruque, A S G; Endtz, Hubert P; Sack, R B; Huq, A; Colwell, R R; Izumiya, Hidemasa; Morita, Masatomo; Watanabe, Haruo; Cravioto, Alejandro

    2010-11-01

    The treatment regimen for diarrhea depends greatly on correct diagnosis of its etiology. Recent diarrhea outbreaks in Bangladesh showed Vibrio cholerae to be the predominant cause, although more than 40% of the suspected cases failed to show cholera etiology by conventional culture methods (CMs). In the present study, suspected cholera stools collected from every 50th patient during an acute diarrheal outbreak were analyzed extensively using different microbiological and molecular tools to determine their etiology. Of 135 stools tested, 86 (64%) produced V. cholerae O1 by CMs, while 119 (88%) tested positive for V. cholerae O1 by rapid cholera dipstick (DS) assay; all but three samples positive for V. cholerae O1 by CMs were also positive for V. cholerae O1 by DS assay. Of 49 stools that lacked CM-based cholera etiology despite most being positive for V. cholerae O1 by DS assay, 25 (51%) had coccoid V. cholerae O1 cells as confirmed by direct fluorescent antibody (DFA) assay, 36 (73%) amplified primers for the genes wbe O1 and ctxA by multiplex-PCR (M-PCR), and 31 (63%) showed El Tor-specific lytic phage on plaque assay (PA). Each of these methods allowed the cholera etiology to be confirmed for 97% of the stool samples. The results suggest that suspected cholera stools that fail to show etiology by CMs during acute diarrhea outbreaks may be due to the inactivation of V. cholerae by in vivo vibriolytic action of the phage and/or nonculturability induced as a host response.

  12. A Quantitative Tool for Producing DNA-Based Diagnostic Arrays

    SciTech Connect

    Tom J. Whitaker

    2008-07-11

    The purpose of this project was to develop a precise, quantitative method to analyze oligodeoxynucleotides (ODNs) on an array to enable a systematic approach to quality control issues affecting DNA microarrays. Two types of ODNs were tested: ODNs formed by photolithography and ODNs printed onto microarrays. Initial work in Phase I, performed in conjunction with Affymetrix, Inc., which holds a patent on a photolithographic in situ technique for creating DNA arrays, was very promising but did seem to indicate that the atomization process was not complete. Soon after Phase II work was under way, Affymetrix had further developed fluorescent methods and indicated they were no longer interested in our resonance ionization technique. This was communicated to the program manager and it was decided that the project would continue and be focused on printed ODNs. The method being tested is called SIRIS, Sputter-Initiated Resonance Ionization Spectroscopy. SIRIS has been shown to be a highly sensitive, selective, and quantitative tool for atomic species. This project was aimed at determining whether an ODN could be labeled in such a way that SIRIS could be used to measure the label and thus provide quantitative measurements of the ODN on an array. One of the largest problems in this study has been developing a method that allows us to know the amount of an ODN on a surface independent of the SIRIS measurement. Even though we could accurately determine the amount of ODN deposited on a surface, the amount that actually attached to the surface is very difficult to measure (hence the need for a quantitative tool). A double-labeling procedure was developed in which 33P and Pt were both used to label ODNs. The radioactive 33P could be measured by a proportional counter that maps the counts in one dimension. This gave a good measurement of the amount of ODN remaining on a surface after immobilization and washing. A second label, Pt, was attached to guanine nucleotides in the ODN. Studies

  13. Quantitative blood group typing using surface plasmon resonance.

    PubMed

    Then, Whui Lyn; Aguilar, Marie-Isabel; Garnier, Gil

    2015-11-15

    The accurate and reliable typing of blood groups is essential prior to blood transfusion. While current blood typing methods are well established, results are subjective and heavily reliant on analysis by trained personnel. Techniques for quantifying blood group antibody-antigen interactions are also very limited. Many biosensing systems rely on surface plasmon resonance (SPR) detection to quantify biomolecular interactions. While SPR has been widely used for characterizing antibody-antigen interactions, measuring antibody interactions with whole cells is significantly less common. Previous studies utilized SPR for blood group antigen detection but showed poor regeneration, causing loss of functionality after a single use. In this study, a fully regenerable, multi-functional platform for quantitative blood group typing via SPR detection is achieved by immobilizing anti-human IgG antibody to the sensor surface, which binds to the Fc region of human IgG antibodies. The surface becomes an interchangeable platform capable of quantifying the blood group interactions between red blood cells (RBCs) and IgG antibodies. As with indirect antiglobulin tests (IAT), which use IgG antibodies for detection, IgG antibodies are initially incubated with RBCs. This facilitates binding to the immobilized monolayer and allows for quantitative blood group detection. Using the D-antigen as an example, a clear distinction between positive (>500 RU) and negative (<100 RU) RBCs is achieved using anti-D IgG. Complete regeneration of the anti-human IgG surface is also successful, showing negligible degradation of the surface after more than 100 regenerations. This novel approach is validated with human-sourced whole blood samples to demonstrate an interesting alternative for quantitative blood grouping using SPR analysis.

  14. Quantitation of signal transduction.

    PubMed

    Krauss, S; Brand, M D

    2000-12-01

    Conventional qualitative approaches to signal transduction provide powerful ways to explore the architecture and function of signaling pathways. However, at the level of the complete system, they do not fully depict the interactions between signaling and metabolic pathways and fail to give a manageable overview of the complexity that is often a feature of cellular signal transduction. Here, we introduce a quantitative experimental approach to signal transduction that helps to overcome these difficulties. We present a quantitative analysis of signal transduction during early mitogen stimulation of lymphocytes, with steady-state respiration rate as a convenient marker of metabolic stimulation. First, by inhibiting various key signaling pathways, we measure their relative importance in regulating respiration. About 80% of the input signal is conveyed via identifiable routes: 50% through pathways sensitive to inhibitors of protein kinase C and MAP kinase and 30% through pathways sensitive to an inhibitor of calcineurin. Second, we quantify how each of these pathways differentially stimulates functional units of reactions that produce and consume a key intermediate in respiration: the mitochondrial membrane potential. Both the PKC and calcineurin routes stimulate consumption more strongly than production, whereas the unidentified signaling routes stimulate production more than consumption, leading to no change in membrane potential despite increased respiration rate. The approach allows a quantitative description of the relative importance of signal transduction pathways and the routes by which they activate a specific cellular process. It should be widely applicable.

  15. Improved image quality in pinhole SPECT by accurate modeling of the point spread function in low magnification systems

    SciTech Connect

    Pino, Francisco; Roé, Nuria; Aguiar, Pablo; Falcon, Carles; Ros, Domènec; Pavía, Javier

    2015-02-15

    Purpose: Single photon emission computed tomography (SPECT) has become an important noninvasive imaging technique in small-animal research. Due to the high resolution required in small-animal SPECT systems, the spatially variant system response needs to be included in the reconstruction algorithm. Accurate modeling of the system response should result in a major improvement in the quality of reconstructed images. The aim of this study was to quantitatively assess the impact that an accurate modeling of spatially variant collimator/detector response has on image-quality parameters, using a low magnification SPECT system equipped with a pinhole collimator and a small gamma camera. Methods: Three methods were used to model the point spread function (PSF). For the first, only the geometrical pinhole aperture was included in the PSF. For the second, the septal penetration through the pinhole collimator was added. In the third method, the measured intrinsic detector response was incorporated. Tomographic spatial resolution was evaluated and contrast, recovery coefficients, contrast-to-noise ratio, and noise were quantified using a custom-built NEMA NU 4–2008 image-quality phantom. Results: A high correlation was found between the experimental data corresponding to intrinsic detector response and the fitted values obtained by means of an asymmetric Gaussian distribution. For all PSF models, resolution improved as the distance from the point source to the center of the field of view increased and when the acquisition radius diminished. An improvement of resolution was observed after a minimum of five iterations when the PSF modeling included more corrections. Contrast, recovery coefficients, and contrast-to-noise ratio were better for the same level of noise in the image when more accurate models were included. Ring-type artifacts were observed when the number of iterations exceeded 12. 
Conclusions: Accurate modeling of the PSF improves resolution, contrast, and recovery

  16. Evaluating Multiplexed Quantitative Phosphopeptide Analysis on a Hybrid Quadrupole Mass Filter/Linear Ion Trap/Orbitrap Mass Spectrometer

    PubMed Central

    2015-01-01

    As a driver for many biological processes, phosphorylation remains an area of intense research interest. Advances in multiplexed quantitation utilizing isobaric tags (e.g., TMT and iTRAQ) have the potential to create a new paradigm in quantitative proteomics. New instrumentation and software are propelling these multiplexed workflows forward, which results in more accurate, sensitive, and reproducible quantitation across tens of thousands of phosphopeptides. This study assesses the performance of multiplexed quantitative phosphoproteomics on the Orbitrap Fusion mass spectrometer. Utilizing a two-phosphoproteome model of precursor ion interference, we assessed the accuracy of phosphopeptide quantitation across a variety of experimental approaches. These methods included the use of synchronous precursor selection (SPS) to enhance TMT reporter ion intensity and accuracy. We found that (i) ratio distortion remained a problem for phosphopeptide analysis in multiplexed quantitative workflows, (ii) ratio distortion can be overcome by the use of an SPS-MS3 scan, (iii) interfering ions generally possessed a different charge state than the target precursor, and (iv) selecting only the phosphate neutral loss peak (single notch) for the MS3 scan still provided accurate ratio measurements. Remarkably, these data suggest that the underlying cause of interference may not be due to coeluting and cofragmented peptides but instead from consistent, low level background fragmentation. Finally, as a proof-of-concept 10-plex experiment, we compared phosphopeptide levels from five murine brains to five livers. In total, the SPS-MS3 method quantified 38 247 phosphopeptides, corresponding to 11 000 phosphorylation sites. With 10 measurements recorded for each phosphopeptide, this equates to more than 628 000 binary comparisons collected in less than 48 h. PMID:25521595

  17. Efficient and Accurate Indoor Localization Using Landmark Graphs

    NASA Astrophysics Data System (ADS)

    Gu, F.; Kealy, A.; Khoshelham, K.; Shang, J.

    2016-06-01

    Indoor localization is important for a variety of applications such as location-based services, mobile social networks, and emergency response. Fusing spatial information is an effective way to achieve accurate indoor localization with little or no need for extra hardware. However, existing indoor localization methods that make use of spatial information are either too computationally expensive or too sensitive to the completeness of landmark detection. In this paper, we address this problem using the proposed landmark graph. The landmark graph is a directed graph where nodes are landmarks (e.g., doors, staircases, and turns) and edges are accessible paths with heading information. We compared the proposed method with two common Dead Reckoning (DR)-based methods (namely, Compass + Accelerometer + Landmarks and Gyroscope + Accelerometer + Landmarks) by a series of experiments. Experimental results show that the proposed method can achieve 73% accuracy with a positioning error less than 2.5 meters, which outperforms the other two DR-based methods.

  18. Accurate reactions open up the way for more cooperative societies

    NASA Astrophysics Data System (ADS)

    Vukov, Jeromos

    2014-09-01

    We consider a prisoner's dilemma model where the interaction neighborhood is defined by a square lattice. Players are equipped with basic cognitive abilities such as being able to distinguish their partners, remember their actions, and react to their strategy. By means of their short-term memory, they can remember not only the last action of their partner but the way they reacted to it themselves. This additional accuracy in the memory enables the handling of different interaction patterns in a more appropriate way and this results in a cooperative community with a strikingly high cooperation level for any temptation value. However, the more developed cognitive abilities can only be effective if the copying process of the strategies is accurate enough. The excessive extent of faulty decisions can deal a fatal blow to the possibility of stable cooperative relations.

  19. Accurate finite difference methods for time-harmonic wave propagation

    NASA Technical Reports Server (NTRS)

    Harari, Isaac; Turkel, Eli

    1994-01-01

    Finite difference methods for solving problems of time-harmonic acoustics are developed and analyzed. Multidimensional inhomogeneous problems with variable, possibly discontinuous, coefficients are considered, accounting for the effects of employing nonuniform grids. A weighted-average representation is less sensitive to transition in wave resolution (due to variable wave numbers or nonuniform grids) than the standard pointwise representation. Further enhancement in method performance is obtained by basing the stencils on generalizations of Padé approximation, or generalized definitions of the derivative, reducing spurious dispersion, anisotropy and reflection, and by improving the representation of source terms. The resulting schemes have fourth-order accurate local truncation error on uniform grids and third order in the nonuniform case. Guidelines for discretization pertaining to grid orientation and resolution are presented.
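    As a baseline for the spurious dispersion discussed above, the standard second-order pointwise stencil for the 1-D Helmholtz equation u'' + k^2 u = 0 admits a closed-form numerical wavenumber. This sketch is textbook finite-difference analysis, not the paper's fourth-order weighted-average scheme:

    ```python
    import math

    # Substituting u_j = exp(i * kt * j * h) into the stencil
    #   (u_{j-1} - 2 u_j + u_{j+1}) / h**2 + k**2 u_j = 0
    # gives cos(kt * h) = 1 - (k * h)**2 / 2, so the numerical wavenumber is:
    def numerical_wavenumber(k: float, h: float) -> float:
        return math.acos(1.0 - (k * h) ** 2 / 2.0) / h

    # For small k*h, kt ~ k * (1 + (k*h)**2 / 24): the scheme slightly
    # overestimates the wavenumber, and the mismatch accumulates as phase
    # (dispersion) error over propagation distance.
    ```

    Higher-order stencils such as those developed in the paper shrink this kt - k gap, which is what "reducing spurious dispersion" means concretely.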

  20. Second-Order Accurate Projective Integrators for Multiscale Problems

    SciTech Connect

    Lee, S L; Gear, C W

    2005-05-27

    We introduce new projective versions of second-order accurate Runge-Kutta and Adams-Bashforth methods, and demonstrate their use as outer integrators in solving stiff differential systems. An important outcome is that the new outer integrators, when combined with an inner telescopic projective integrator, can result in fully explicit methods with adaptive outer step size selection and solution accuracy comparable to those obtained by implicit integrators. If the stiff differential equations are not directly available, our formulations and stability analysis are general enough to allow the combined outer-inner projective integrators to be applied to black-box legacy codes or perform a coarse-grained time integration of microscopic systems to evolve macroscopic behavior, for example.
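    The inner-outer structure can be illustrated with the basic first-order projective forward Euler variant (the paper's methods are second-order and more elaborate; the test problem and step counts below are illustrative only): a few small inner steps damp the fast modes, then a large outer step extrapolates along the chord of the last inner step.

    ```python
    def projective_euler(f, y0, t_end, h, k=5, big=20):
        """k inner forward Euler steps of size h, then a projective jump of
        up to big*h along the chord through the last two inner iterates.
        A minimal sketch of the projective-integration idea."""
        y, t = float(y0), 0.0
        while t < t_end - 1e-12:
            for _ in range(k):              # inner integrator damps stiff modes
                y_old = y
                y = y + h * f(y)
                t += h
            slope = (y - y_old) / h         # chord slope from last inner step
            jump = min(big * h, t_end - t)  # do not overshoot the end time
            y = y + jump * slope            # projective (outer) step
            t += jump
        return y
    ```

    On the smooth test problem y' = -y this recovers exp(-t) to first-order accuracy while taking most of the interval in large outer steps; for genuinely stiff systems the inner steps must damp the fast modes enough for the outer jump to be stable.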

  1. Accurate reactions open up the way for more cooperative societies.

    PubMed

    Vukov, Jeromos

    2014-09-01

    We consider a prisoner's dilemma model where the interaction neighborhood is defined by a square lattice. Players are equipped with basic cognitive abilities such as being able to distinguish their partners, remember their actions, and react to their strategy. By means of their short-term memory, they can remember not only the last action of their partner but the way they reacted to it themselves. This additional accuracy in the memory enables the handling of different interaction patterns in a more appropriate way and this results in a cooperative community with a strikingly high cooperation level for any temptation value. However, the more developed cognitive abilities can only be effective if the copying process of the strategies is accurate enough. The excessive extent of faulty decisions can deal a fatal blow to the possibility of stable cooperative relations.

  2. An Inexpensive and Accurate Tensiometer Using an Electronic Balance

    NASA Astrophysics Data System (ADS)

    Dolz, Manuel; Delegido, Jesús; Hernández, María-Jesús; Pellicer, Julio

    2001-09-01

    A method for measuring surface tension of liquid-air interfaces that consists of a modification of the du Noüy tensiometer is proposed. An electronic balance is used to determine the detachment force with high resolution and the relative displacement ring/plate-liquid surface is carried out by the descent of the liquid-free surface. The procedure familiarizes undergraduate students in applied science and technology with the experimental study of surface tension by means of a simple and accurate method that offers the advantages of sophisticated devices at considerably less cost. The operational aspects that must be taken into account are analyzed: the measuring system and determination of its effective length, measurement of the detachment force, and the relative system-liquid interface displacement rate. To check the accuracy of the proposed tensiometer, measurements of the surface tension of different known liquids have been performed, and good agreement with results reported in the literature was obtained.
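    For the ring geometry described, the balance reading converts to surface tension through the standard du Noüy relation F = 4*pi*R*gamma (two liquid lamellae, one at each ring circumference). The sketch below ignores the usual Harkins-Jordan correction factor, and the example values are illustrative, not data from the article:

    ```python
    import math

    def surface_tension(mass_g: float, ring_radius_mm: float) -> float:
        """du Noüy ring estimate of surface tension in mN/m from the
        detachment force registered by the balance, without the
        Harkins-Jordan correction."""
        force_n = mass_g * 1e-3 * 9.81                  # balance reading -> N
        return force_n / (4 * math.pi * ring_radius_mm * 1e-3) * 1e3
    ```

    For instance, a detachment reading of about 0.88 g with a 9.55 mm radius ring corresponds to roughly 72 mN/m, the textbook value for a clean water-air interface at room temperature.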

  3. Quantitative phase microscopy with asynchronous digital holography.

    PubMed

    Chalut, Kevin J; Brown, William J; Wax, Adam

    2007-03-19

    We demonstrate a new method of measuring quantitative phase in imaging of biological materials. This method, asynchronous digital holography, employs knowledge of a moving fringe created by acousto-optic modulators to execute phase-shifting interferometry using two near-simultaneous interferograms. The method can be used to obtain quantitative phase images of dynamic biological samples on millisecond time scales. We present results on a standard sample, and on live cell samples.

  4. Evaluation of various real-time reverse transcription quantitative PCR assays for norovirus detection.

    PubMed

    Yoo, Ju Eun; Lee, Cheonghoon; Park, SungJun; Ko, GwangPyo

    2017-02-01

    Human noroviruses are widespread and contagious viruses causing nonbacterial gastroenteritis. Real-time reverse transcription quantitative PCR (real-time RT-qPCR) is currently the gold standard for sensitive and accurate detection for these pathogens and serves as a critical tool in outbreak prevention and control. Dif