Science.gov

Sample records for accurate quantitative analyses

  1. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  2. Discrete sensors distribution for accurate plantar pressure analyses.

    PubMed

    Claverie, Laetitia; Ille, Anne; Moretto, Pierre

    2016-12-01

    The aim of this study was to determine the distribution of discrete sensors under the footprint for accurate plantar pressure analyses. For this purpose, two different sensor layouts were tested and compared to determine which was the more accurate for monitoring plantar pressure with wireless devices in research and/or clinical practice. Ten healthy volunteers participated in the study (age range: 23-58 years). The barycenter of pressures (BoP) determined from the plantar pressure system (W-inshoe®) was compared to the center of pressures (CoP) determined from a force platform (AMTI) in the medial-lateral (ML) and anterior-posterior (AP) directions. Then, the vertical ground reaction force (vGRF) obtained from the W-inshoe® and from the force platform was compared for both layouts for each subject. The BoP and vGRF determined from the plantar pressure system data showed good correlation (Spearman correlation coefficient, SCC) with those determined from the force platform data, notably for the second sensor layout (ML SCC = 0.95; AP SCC = 0.99; vGRF SCC = 0.91). The study demonstrates that an adjusted placement of removable sensors is key to accurate plantar pressure analyses. These results are promising for plantar pressure recording outside clinical or laboratory settings, for long-term monitoring, real-time feedback, or any activity requiring a low-cost system.
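
    As an illustration of the agreement analysis described in this record, the minimal sketch below computes Spearman correlation coefficients between insole-derived and force-platform trajectories; the function name and the synthetic signals are assumptions for illustration only.

        # Sketch: Spearman correlation between the insole barycenter of
        # pressure (BoP) and the force-platform center of pressure (CoP).
        import numpy as np
        from scipy.stats import spearmanr

        def trajectory_agreement(bop_ml, cop_ml, bop_ap, cop_ap):
            """Return Spearman correlation coefficients for ML and AP directions."""
            scc_ml, _ = spearmanr(bop_ml, cop_ml)
            scc_ap, _ = spearmanr(bop_ap, cop_ap)
            return scc_ml, scc_ap

        # Synthetic example: noisy copies of the same gait-like trajectories.
        rng = np.random.default_rng(0)
        t = np.linspace(0.0, 1.0, 500)
        cop_ml = np.sin(2 * np.pi * 1.5 * t)
        cop_ap = np.cumsum(rng.normal(0.01, 0.02, t.size))
        bop_ml = cop_ml + rng.normal(0.0, 0.05, t.size)
        bop_ap = cop_ap + rng.normal(0.0, 0.05, t.size)
        print(trajectory_agreement(bop_ml, cop_ml, bop_ap, cop_ap))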

  3. Groundtruth approach to accurate quantitation of fluorescence microarrays

    SciTech Connect

    Mascio-Kegelmeyer, L; Tomascik-Cheeseman, L; Burnett, M S; van Hummelen, P; Wyrobek, A J

    2000-12-01

    To measure fluorescent signals from microarrays more accurately, we calibrated our acquisition and analysis systems using ground-truth samples composed of known quantities of red and green gene-specific DNA probes hybridized to cDNA targets. We imaged the slides with a full-field, white-light CCD imager and analyzed them with our custom analysis software. Here we compare, for multiple genes, results obtained with and without preprocessing (alignment, color crosstalk compensation, dark-field subtraction, and integration time). We also evaluate the accuracy of various image processing and analysis techniques (background subtraction, segmentation, quantitation and normalization). This methodology calibrates and validates our system for accurate quantitative measurement of microarrays. Specifically, we show that preprocessing the images produces results significantly closer to the known ground truth for these samples.

  4. Quantitative proteomic analysis by accurate mass retention time pairs.

    PubMed

    Silva, Jeffrey C; Denny, Richard; Dorschel, Craig A; Gorenstein, Marc; Kass, Ignatius J; Li, Guo-Zhong; McKenna, Therese; Nold, Michael J; Richardson, Keith; Young, Phillip; Geromanos, Scott

    2005-04-01

    Current methodologies for protein quantitation include two-dimensional gel electrophoresis, metabolic labeling, and stable isotope labeling methods, to name only a few. The current literature illustrates both pros and cons for each of these methodologies. In keeping with the teachings of William of Ockham ("with all things being equal, the simplest solution tends to be correct"), a simple LC/MS-based methodology is presented that allows relative changes in the abundance of proteins in highly complex mixtures to be determined. Utilizing a reproducible chromatographic separation system along with the high mass resolution and mass accuracy of an orthogonal time-of-flight mass spectrometer, a quantitative comparison can be made of the tens of thousands of ions emanating from identically prepared control and experimental samples. Using this configuration, we can determine the change in relative abundance of a small number of ions between the two conditions solely by accurate mass and retention time. Employing standard operating procedures for both sample preparation and ESI-mass spectrometry, one typically obtains under 5 ppm mass precision and quantitative variation between 10 and 15%. The principal focus of this paper is to demonstrate the quantitative aspects of the methodology, followed by a discussion of its associated, complementary qualitative capabilities.
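
    As a hedged illustration of the accurate mass and retention time pairing described above, the sketch below matches features between two runs using a 5 ppm mass tolerance and a 0.5 min retention-time window; the tolerances, tuple layout, and example values are assumptions, not the authors' implementation.

        # Sketch: pair (m/z, retention time, intensity) features between a
        # control and an experimental LC/MS run by accurate mass and RT.
        def match_features(control, experimental, ppm_tol=5.0, rt_tol=0.5):
            pairs = []
            for mz_c, rt_c, int_c in control:
                for mz_e, rt_e, int_e in experimental:
                    ppm_err = abs(mz_e - mz_c) / mz_c * 1e6
                    if ppm_err <= ppm_tol and abs(rt_e - rt_c) <= rt_tol:
                        pairs.append((mz_c, rt_c, int_e / int_c))  # fold change
                        break
            return pairs

        control = [(785.8421, 22.4, 1.0e5), (1021.4503, 35.1, 4.2e4)]
        experimental = [(785.8424, 22.5, 2.1e5), (1021.4499, 35.0, 4.0e4)]
        for mz, rt, fold in match_features(control, experimental):
            print(f"m/z {mz:.4f} @ {rt:.1f} min: {fold:.2f}x change")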

  5. New ventures require accurate risk analyses and adjustments.

    PubMed

    Eastaugh, S R

    2000-01-01

    For new business ventures to succeed, healthcare executives need to conduct robust risk analyses and develop new approaches to balance risk and return. Risk analysis involves examination of objective risks and harder-to-quantify subjective risks. Mathematical principles applied to investment portfolios also can be applied to a portfolio of departments or strategic business units within an organization. The ideal business investment would have a high expected return and a low standard deviation. Nonetheless, both conservative and speculative strategies should be considered in determining an organization's optimal service line and helping the organization manage risk.

  6. Quantitative Analyse und Visualisierung der Herzfunktionen

    NASA Astrophysics Data System (ADS)

    Sauer, Anne; Schwarz, Tobias; Engel, Nicole; Seitel, Mathias; Kenngott, Hannes; Mohrhardt, Carsten; Loßnitzer, Dirk; Giannitsis, Evangelos; Katus, Hugo A.; Meinzer, Hans-Peter

    Computer-assisted, image-based analysis of cardiac function is now standard in cardiology. The available products usually require a high degree of user interaction and therefore considerable time. This work presents an approach that provides the cardiologist with a largely automatic analysis of cardiac function from MRI image data, thereby saving time. All relevant cardiac-physiological parameters are computed and visualized by means of diagrams and graphs. These computations are evaluated by comparing the calculated values with manually measured ones. The resulting mean error of 2.85 mm for wall thickness and 1.61 mm for wall thickening still lies within the size of one pixel of the images used.

  7. Quantitative DNA Analyses for Airborne Birch Pollen.

    PubMed

    Müller-Germann, Isabell; Vogel, Bernhard; Vogel, Heike; Pauling, Andreas; Fröhlich-Nowoisky, Janine; Pöschl, Ulrich; Després, Viviane R

    2015-01-01

    Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies, and thus pollen grains, in air filter samples. During the birch pollination season of 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene proved better suited to reliable qPCR results, and the qPCR results obtained for coarse particulate matter correlated well with the birch pollen forecasts of the regional air quality model COSMO-ART. As expected from the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction: for the ITS region the factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even at low concentrations, as these particles are known to be highly allergenic, to reach deep into the airways, and often to cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future.
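
    For readers unfamiliar with absolute qPCR quantitation, the sketch below shows the usual standard-curve arithmetic that converts a measured Ct value into a copy number; the dilution series and Ct values are invented for illustration and are not data from this study.

        # Sketch: Ct is linear in log10(copies); a dilution series of known
        # standards fixes the slope and intercept, and unknowns are
        # interpolated from their measured Ct values.
        import numpy as np

        std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])        # known copies
        std_ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])       # measured Ct

        slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
        efficiency = 10.0 ** (-1.0 / slope) - 1.0               # ~1.0 = 100%

        def copies_from_ct(ct):
            return 10.0 ** ((ct - intercept) / slope)

        print(f"E = {efficiency:.2f}; unknown: {copies_from_ct(27.5):.0f} copies")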

  8. Quantitative DNA Analyses for Airborne Birch Pollen

    PubMed Central

    Müller-Germann, Isabell; Vogel, Bernhard; Vogel, Heike; Pauling, Andreas; Fröhlich-Nowoisky, Janine; Pöschl, Ulrich; Després, Viviane R.

    2015-01-01

    Birch trees produce large amounts of highly allergenic pollen grains that are distributed by wind and impact human health by causing seasonal hay fever, pollen-related asthma, and other allergic diseases. Traditionally, pollen forecasts are based on conventional microscopic counting techniques that are labor-intensive and limited in the reliable identification of species. Molecular biological techniques provide an alternative approach that is less labor-intensive and enables identification of any species by its genetic fingerprint. A particularly promising method is quantitative real-time polymerase chain reaction (qPCR), which can be used to determine the number of DNA copies, and thus pollen grains, in air filter samples. During the birch pollination season of 2010 in Mainz, Germany, we collected air filter samples of fine (<3 μm) and coarse air particulate matter. These were analyzed by qPCR using two different primer pairs: one for a single-copy gene (BP8) and the other for a multi-copy gene (ITS). The BP8 gene proved better suited to reliable qPCR results, and the qPCR results obtained for coarse particulate matter correlated well with the birch pollen forecasts of the regional air quality model COSMO-ART. As expected from the size of birch pollen grains (~23 μm), the concentration of DNA in fine particulate matter was lower than in the coarse particle fraction: for the ITS region the factor was 64, while for the single-copy gene BP8 it was only 51. The possible presence of so-called sub-pollen particles in the fine particle fraction is, however, interesting even at low concentrations, as these particles are known to be highly allergenic, to reach deep into the airways, and often to cause severe health problems. In conclusion, the results of this exploratory study open up the possibility of predicting and quantifying the pollen concentration in the atmosphere more precisely in the future. PMID:26492534

  9. Quantitative proteomics using the high resolution accurate mass capabilities of the quadrupole-orbitrap mass spectrometer.

    PubMed

    Gallien, Sebastien; Domon, Bruno

    2014-08-01

    High-resolution/accurate-mass hybrid mass spectrometers have considerably advanced shotgun proteomics, and the recent introduction of fast sequencing capabilities has expanded their use for targeted approaches. More specifically, the quadrupole-orbitrap instrument has a unique configuration and its new features enable a wide range of experiments. An overview of the analytical capabilities of this instrument is presented, with a focus on its application to quantitative analyses. The high resolution, the trapping capability, and the versatility of the instrument have allowed quantitative proteomic workflows to be redefined and new data acquisition schemes to be developed. The initial proteomic applications have shown an improvement in analytical performance. However, as quantification relies on ion trapping rather than on a continuous ion beam, further refinement of the technique can be expected.

  10. FANSe: an accurate algorithm for quantitative mapping of large scale sequencing reads

    PubMed Central

    Zhang, Gong; Fedyunin, Ivan; Kirchner, Sebastian; Xiao, Chuanle; Valleriani, Angelo; Ignatova, Zoya

    2012-01-01

    The most crucial step in data processing for high-throughput sequencing applications is the accurate and sensitive alignment of the sequencing reads to reference genomes or transcriptomes. The accurate detection of insertions and deletions (indels) and of errors introduced by the sequencing platform or by misreading of modified nucleotides is essential for the quantitative processing of RNA-based sequencing (RNA-Seq) datasets and for the identification of genetic variations and modification patterns. We developed a new, fast and accurate algorithm for nucleic acid sequence analysis, FANSe, with adjustable mismatch allowance settings and the ability to handle indels, to accurately and quantitatively map millions of reads to small or large reference genomes. It is a seed-based algorithm that uses the whole read information for mapping; high sensitivity and low ambiguity are achieved by using short, non-overlapping seeds. Furthermore, FANSe uses a hotspot score to prioritize the processing of the most probable matches and implements a modified Smith-Waterman refinement with a reduced scoring matrix to accelerate the calculation without compromising sensitivity. The FANSe algorithm stably processes datasets from various sequencing platforms, with masked or unmasked, small or large genomes. It shows remarkable coverage of low-abundance mRNAs, which is important for the quantitative processing of RNA-Seq datasets. PMID:22379138
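
    The seed-based strategy described above can be sketched in a few lines: index the reference by k-mers, collect candidate positions from non-overlapping seeds, and verify each candidate against a mismatch allowance. This toy version (all names and parameters are assumptions) illustrates the idea only and omits the hotspot scoring and Smith-Waterman refinement of the real algorithm.

        # Toy seed-and-verify read mapper.
        from collections import defaultdict

        def build_index(ref, k):
            index = defaultdict(list)
            for i in range(len(ref) - k + 1):
                index[ref[i:i + k]].append(i)
            return index

        def map_read(read, ref, index, k, max_mismatch=2):
            hits = set()
            for s in range(0, len(read) - k + 1, k):   # non-overlapping seeds
                for pos in index.get(read[s:s + k], ()):
                    start = pos - s
                    if 0 <= start <= len(ref) - len(read):
                        mm = sum(a != b for a, b in
                                 zip(read, ref[start:start + len(read)]))
                        if mm <= max_mismatch:
                            hits.add((start, mm))
            return sorted(hits)

        ref = "ACGTACGGTCAGTTACGATCGGATCCGTAGCT"
        idx = build_index(ref, k=4)
        print(map_read("CAGTTACGATCG", ref, idx, k=4))  # [(9, 0)]: exact hit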

  11. FANSe: an accurate algorithm for quantitative mapping of large scale sequencing reads.

    PubMed

    Zhang, Gong; Fedyunin, Ivan; Kirchner, Sebastian; Xiao, Chuanle; Valleriani, Angelo; Ignatova, Zoya

    2012-06-01

    The most crucial step in data processing for high-throughput sequencing applications is the accurate and sensitive alignment of the sequencing reads to reference genomes or transcriptomes. The accurate detection of insertions and deletions (indels) and of errors introduced by the sequencing platform or by misreading of modified nucleotides is essential for the quantitative processing of RNA-based sequencing (RNA-Seq) datasets and for the identification of genetic variations and modification patterns. We developed a new, fast and accurate algorithm for nucleic acid sequence analysis, FANSe, with adjustable mismatch allowance settings and the ability to handle indels, to accurately and quantitatively map millions of reads to small or large reference genomes. It is a seed-based algorithm that uses the whole read information for mapping; high sensitivity and low ambiguity are achieved by using short, non-overlapping seeds. Furthermore, FANSe uses a hotspot score to prioritize the processing of the most probable matches and implements a modified Smith-Waterman refinement with a reduced scoring matrix to accelerate the calculation without compromising sensitivity. The FANSe algorithm stably processes datasets from various sequencing platforms, with masked or unmasked, small or large genomes. It shows remarkable coverage of low-abundance mRNAs, which is important for the quantitative processing of RNA-Seq datasets.

  12. Identification of Atrial Fibrillation by Quantitative Analyses of Fingertip Photoplethysmogram

    PubMed Central

    Tang, Sung-Chun; Huang, Pei-Wen; Hung, Chi-Sheng; Shan, Shih-Ming; Lin, Yen-Hung; Shieh, Jiann-Shing; Lai, Dar-Ming; Wu, An-Yeu; Jeng, Jiann-Shing

    2017-01-01

    Atrial fibrillation (AF) detection is crucial for stroke prevention. We investigated the potential of quantitative analyses of photoplethysmogram (PPG) waveforms to identify AF. Continuous electrocardiogram (EKG) and fingertip PPG were recorded simultaneously in acute stroke patients (n = 666) admitted to an intensive care unit. Each EKG was visually labeled as AF (n = 150, 22.5%) or non-AF. Linear and nonlinear features of the pulse interval (PIN) and peak amplitude (AMP) of the PPG waveforms were extracted from the first 1, 2, and 10 min of data. Logistic regression analysis revealed six independent PPG features that feasibly identify AF rhythm: three PIN-related features (mean, mean of standard deviation, and sample entropy) and three AMP-related features (mean of the root mean square of successive differences, sample entropy, and turning point ratio) (all p < 0.01). The performance of the PPG analytic program comprising all six features extracted from the 2-min data was better than that from the 1-min data (area under the receiver operating characteristic curve 0.972, 95% confidence interval 0.951–0.989, vs. 0.949, 0.929–0.970; p < 0.001) and was comparable to that from the 10-min data (0.973, 0.953–0.993) for AF identification. In summary, our study established an optimal PPG analytic program for reliably identifying AF rhythm. PMID:28367965
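
    Two of the named features are easy to make concrete. The sketch below, a minimal illustration using synthetic pulse-interval series, computes the root mean square of successive differences and the turning point ratio, both of which rise for irregular, AF-like rhythms.

        import numpy as np

        def rmssd(x):
            """Root mean square of successive differences of a series."""
            d = np.diff(np.asarray(x, float))
            return np.sqrt(np.mean(d ** 2))

        def turning_point_ratio(x):
            """Fraction of interior points that are local maxima or minima."""
            x = np.asarray(x, float)
            tp = np.sum((x[1:-1] - x[:-2]) * (x[1:-1] - x[2:]) > 0)
            return tp / (len(x) - 2)   # ~2/3 expected for an i.i.d. series

        rng = np.random.default_rng(0)
        regular = 0.8 + 0.02 * rng.standard_normal(120)   # sinus-like intervals
        irregular = rng.uniform(0.4, 1.2, 120)            # AF-like intervals
        for name, x in [("regular", regular), ("irregular", irregular)]:
            print(name, round(rmssd(x), 3), round(turning_point_ratio(x), 2))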

  13. Quantitative analyses for elucidating mechanisms of cell fate commitment in the mouse blastocyst

    NASA Astrophysics Data System (ADS)

    Saiz, Néstor; Kang, Minjung; Puliafito, Alberto; Schrode, Nadine; Xenopoulos, Panagiotis; Lou, Xinghua; Di Talia, Stefano; Hadjantonakis, Anna-Katerina

    2015-03-01

    In recent years we have witnessed a shift from qualitative image analysis towards higher resolution, quantitative analyses of imaging data in developmental biology. This shift has been fueled by technological advances in both imaging and analysis software. We have recently developed a tool for accurate, semi-automated nuclear segmentation of imaging data from early mouse embryos and embryonic stem cells. We have applied this software to the study of the first lineage decisions that take place during mouse development and established analysis pipelines for both static and time-lapse imaging experiments. In this paper we summarize the conclusions from these studies to illustrate how quantitative, single-cell level analysis of imaging data can unveil biological processes that cannot be revealed by traditional qualitative studies.

  14. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance for revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. To date, the biological autofluorescence background under UV-Vis excitation and the severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Here, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response signal and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may introduce relatively large uncertainty into the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, featuring a response of 3.56 per unit change in pHi over the range 3.0–7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
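
    The self-ratiometric readout reduces to simple arithmetic: divide the pH-sensitive 475 nm band by the pH-insensitive 645 nm reference band and invert a calibration curve. The sketch below assumes an approximately linear calibration with hypothetical constants.

        import numpy as np

        cal_ph = np.array([3.0, 4.0, 5.0, 6.0, 7.0])          # buffer pH
        cal_ratio = np.array([0.95, 0.80, 0.62, 0.45, 0.31])  # I475 / I645

        coeff = np.polyfit(cal_ratio, cal_ph, 1)  # linear fit of pH vs. ratio

        def ph_from_bands(i_475, i_645):
            """Invert the calibration: band ratio -> intracellular pH."""
            return float(np.polyval(coeff, i_475 / i_645))

        print(f"estimated pHi = {ph_from_bands(1800.0, 3500.0):.2f}")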

  15. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    PubMed Central

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-01-01

    Accurate quantitation of intracellular pH (pHi) is of great importance for revealing cellular activities and providing early warning of disease. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. To date, the biological autofluorescence background under UV-Vis excitation and the severe photobleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Here, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) serve as energy acceptor and donor, respectively. Under 980 nm excitation, the upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response signal and the self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may introduce relatively large uncertainty into the results. Owing to the efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing was achieved, featuring a response of 3.56 per unit change in pHi over the range 3.0–7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems. PMID:27934889

  16. A fluorescence-based quantitative real-time PCR assay for accurate Pocillopora damicornis species identification

    NASA Astrophysics Data System (ADS)

    Thomas, Luke; Stat, Michael; Evans, Richard D.; Kennington, W. Jason

    2016-09-01

    Pocillopora damicornis is one of the most extensively studied coral species globally, but high levels of phenotypic plasticity within the genus make species identification based on morphology alone unreliable. As a result, there is a compelling need to develop cheap and time-effective molecular techniques capable of accurately distinguishing P. damicornis from other congeneric species. Here, we develop a fluorescence-based quantitative real-time PCR (qPCR) assay to genotype a single nucleotide polymorphism that accurately distinguishes P. damicornis from other morphologically similar Pocillopora species. We trial the assay across colonies representing multiple Pocillopora species and then apply the assay to screen samples of Pocillopora spp. collected at regional scales along the coastline of Western Australia. This assay offers a cheap and time-effective alternative to Sanger sequencing and has broad applications including studies on gene flow, dispersal, recruitment and physiological thresholds of P. damicornis.

  17. Who Let the CAT Out of the Bag? Accurately Dealing with Substitutional Heterogeneity in Phylogenomic Analyses.

    PubMed

    Whelan, Nathan V; Halanych, Kenneth M

    2016-09-14

    As phylogenetic datasets have increased in size, site-heterogeneous substitution models such as CAT-F81 and CAT-GTR have been advocated in favor of other models because they purportedly suppress long-branch attraction (LBA). These models are two of the most commonly used models in phylogenomics, and they have been applied to a variety of taxa, ranging from Drosophila to land plants. However, many arguments in favor of CAT models have been based on tenuous assumptions about the true phylogeny, rather than rigorous testing with known trees via simulation. Moreover, CAT models have not been compared to other approaches for handling substitutional heterogeneity such as data partitioning with site-homogeneous substitution models. We simulated amino acid sequence datasets with substitutional heterogeneity on a variety of tree shapes including those susceptible to LBA. Data were analyzed with both CAT models and partitioning to explore model performance; in total over 670,000 CPU hours were used, of which over 97% was spent running analyses with CAT models. In many cases, all models recovered branching patterns that were identical to the known tree. However, CAT-F81 consistently performed worse than other models in inferring the correct branching patterns, and both CAT models often overestimated substitutional heterogeneity. Additionally, reanalysis of two empirical metazoan datasets supports the notion that CAT-F81 tends to recover less accurate trees than data partitioning and CAT-GTR. Given these results, we conclude that partitioning and CAT-GTR perform similarly in recovering accurate branching patterns. However, computation time can be orders of magnitude less for data partitioning, with commonly used implementations of CAT-GTR often failing to reach completion in a reasonable time frame (i.e., for Bayesian analyses to converge). Practices such as removing constant sites and parsimony uninformative characters, or using CAT-F81 when CAT-GTR is deemed too

  18. The importance of accurate muscle modelling for biomechanical analyses: a case study with a lizard skull

    PubMed Central

    Gröning, Flora; Jones, Marc E. H.; Curtis, Neil; Herrel, Anthony; O'Higgins, Paul; Evans, Susan E.; Fagan, Michael J.

    2013-01-01

    Computer-based simulation techniques such as multi-body dynamics analysis are becoming increasingly popular in the field of skull mechanics. Multi-body models can be used for studying the relationships between skull architecture, muscle morphology and feeding performance. However, to be confident in the modelling results, models need to be validated against experimental data, and the effects of uncertainties or inaccuracies in the chosen model attributes need to be assessed with sensitivity analyses. Here, we compare the bite forces predicted by a multi-body model of a lizard (Tupinambis merianae) with in vivo measurements, using anatomical data collected from the same specimen. This subject-specific model predicts bite forces that are very close to the in vivo measurements and also shows a consistent increase in bite force as the bite position is moved posteriorly on the jaw. However, the model is very sensitive to changes in muscle attributes such as fibre length, intrinsic muscle strength and force orientation, with bite force predictions varying considerably when these three variables are altered. We conclude that accurate muscle measurements are crucial to building realistic multi-body models and that subject-specific data should be used whenever possible. PMID:23614944

  19. Accurate and quantitative polarization-sensitive OCT by unbiased birefringence estimator with noise-stochastic correction

    NASA Astrophysics Data System (ADS)

    Kasaragod, Deepa; Sugiyama, Satoshi; Ikuno, Yasushi; Alonso-Caneiro, David; Yamanari, Masahiro; Fukuda, Shinichi; Oshika, Tetsuro; Hong, Young-Joo; Li, En; Makita, Shuichi; Miura, Masahiro; Yasuno, Yoshiaki

    2016-03-01

    Polarization sensitive optical coherence tomography (PS-OCT) is a functional extension of OCT that contrasts the polarization properties of tissues. It has been applied in ophthalmology, cardiology, and other fields. Proper quantitative imaging is required for widespread clinical utility. However, the conventional method of averaging, used to improve the signal-to-noise ratio (SNR) and the contrast of phase retardation (or birefringence) images, introduces a noise bias offset from the true value. This bias reduces the effectiveness of birefringence contrast for quantitative studies. Although coherent averaging of Jones matrix tomography has been widely utilized and has improved image quality, the fundamental limitation, the nonlinear dependence of phase retardation and birefringence on SNR, was not overcome, so the birefringence obtained by PS-OCT was still not accurate enough for quantitative imaging. The nonlinear effect of SNR on phase retardation and birefringence measurement was previously formulated in detail for Jones matrix OCT (JM-OCT) [1]. Based on this, we developed a maximum a posteriori (MAP) estimator and demonstrated quantitative birefringence imaging [2]. However, this first version of the estimator had a theoretical shortcoming: it did not take into account the stochastic nature of the SNR of the OCT signal. In this paper, we present an improved version of the MAP estimator that takes the stochastic property of SNR into account. This estimator uses a probability distribution function (PDF) of the true local retardation, which is proportional to birefringence, under a specific set of measurements of birefringence and SNR. The PDF was pre-computed before the measurement by a Monte-Carlo (MC) simulation based on the mathematical model of JM-OCT. A comparison between this new MAP estimator, our previous MAP estimator [2], and the standard mean estimator is presented. The comparisons are performed both by numerical simulation and in vivo measurements of anterior and

  20. Optimization of sample preparation for accurate results in quantitative NMR spectroscopy

    NASA Astrophysics Data System (ADS)

    Yamazaki, Taichi; Nakamura, Satoe; Saito, Takeshi

    2017-04-01

    Quantitative nuclear magnetic resonance (qNMR) spectroscopy is highly regarded as a measurement tool because it does not require a reference standard identical to the analyte. Measurement parameters have been discussed in detail, and high-resolution balances have been used for sample preparation. However, high-resolution balances such as the ultra-microbalance are not general-purpose analytical tools, and many analysts may find them difficult to use, hindering accurate sample preparation for qNMR measurement. In this study, we examined the relationship between the resolution of the balance and the amount of sample weighed during sample preparation. We were able to confirm the accuracy of the assay results for samples weighed on a high-resolution balance such as the ultra-microbalance. Furthermore, when an appropriate tare and amount of sample were weighed on a given balance, accurate assay results were also obtained with another high-resolution balance. Although this is a fundamental result, it offers important evidence that should enhance the versatility of the qNMR method.
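
    The abstract's concern with weighing accuracy follows directly from the standard qNMR working equation, in which the computed purity scales with the ratio of the weighed masses of internal standard and analyte. A minimal sketch with invented example values:

        # P_analyte = (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (m_s/m_a) * P_s
        def qnmr_purity(i_a, i_s, n_a, n_s, m_a, m_s, w_a, w_s, p_s):
            """i_*: signal areas; n_*: nuclei per signal; m_*: molar masses
            (g/mol); w_*: weighed masses (g); p_s: standard purity."""
            return (i_a / i_s) * (n_s / n_a) * (m_a / m_s) * (w_s / w_a) * p_s

        # A fixed weighing error of 1 ug matters little on a ~10 mg sample
        # but shifts the result by ~1% on a 0.1 mg sample.
        print(qnmr_purity(i_a=0.520, i_s=1.000, n_a=2, n_s=6,
                          m_a=150.00, m_s=94.13, w_a=0.0125, w_s=0.0050,
                          p_s=0.999))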

  21. Cultivation and quantitative proteomic analyses of acidophilic microbial communities

    SciTech Connect

    Belnap, Christopher P.; Pan, Chongle; Verberkmoes, Nathan C; Power, Mary E.; Samatova, Nagiza F; Carver, Rudolf L.; Hettich, Robert L.; Banfield, Jillian F.

    2010-01-01

    Acid mine drainage (AMD), an extreme environment characterized by low pH and high metal concentrations, can support dense acidophilic microbial biofilm communities that rely on chemoautotrophic production based on iron oxidation. Field-determined production rates indicate that, despite the extreme conditions, these communities are sufficiently well adapted to their habitats to achieve primary production rates comparable to those of microbial communities occurring in some non-extreme environments. To enable laboratory studies of the growth, production and ecology of AMD microbial communities, a culturing system was designed to reproduce natural biofilms, including organisms recalcitrant to cultivation. A comprehensive metabolic labeling-based quantitative proteomic analysis was used to verify that natural and laboratory communities were comparable at the functional level. Results confirmed that the composition and core metabolic activities of laboratory-grown communities were similar to those of a natural community, including the presence of active, low-abundance bacteria and archaea that have not yet been isolated. However, laboratory growth rates were slow compared with natural communities, and this correlated with an increased abundance of stress response proteins for the dominant bacteria in laboratory communities. Modification of cultivation conditions reduced the abundance of stress response proteins and increased laboratory community growth rates. The research presented here is the first described application of metabolic labeling-based quantitative proteomic analysis at the community level, and it resulted in a model microbial community system ideal for testing physiological and ecological hypotheses.

  22. SILAC-Based Quantitative Strategies for Accurate Histone Posttranslational Modification Profiling Across Multiple Biological Samples.

    PubMed

    Cuomo, Alessandro; Soldi, Monica; Bonaldi, Tiziana

    2017-01-01

    Histone posttranslational modifications (hPTMs) play a key role in regulating chromatin dynamics and fine-tuning DNA-based processes. Mass spectrometry (MS) has emerged as a versatile technology for the analysis of histones, contributing to the dissection of hPTMs, with special strength in the identification of novel marks and in the assessment of modification cross-talk. Stable isotope labeling by amino acids in cell culture (SILAC), when adapted to histones, permits the accurate quantification of PTM changes among distinct functional states; however, its application has been confined mainly to actively dividing cell lines. A SILAC-based spike-in strategy can be used to overcome this limitation and profile hPTMs across multiple samples. We describe here the adaptation of SILAC to the analysis of histones, in both the standard and spike-in setups. We also illustrate its coupling to an implemented "shotgun" workflow, in which heavy arginine-labeled histone peptides, produced by Arg-C digestion, are qualitatively and quantitatively analyzed in an LC-MS/MS system that combines ultrahigh-pressure liquid chromatography (UHPLC) with a new-generation high-resolution Orbitrap instrument.

  23. Accurate detection and quantitation of heteroplasmic mitochondrial point mutations by pyrosequencing.

    PubMed

    White, Helen E; Durston, Victoria J; Seller, Anneke; Fratter, Carl; Harvey, John F; Cross, Nicholas C P

    2005-01-01

    Disease-causing mutations in mitochondrial DNA (mtDNA) are typically heteroplasmic, and interpretation of genetic tests for mitochondrial disorders can therefore be problematic. Detection of low-level heteroplasmy is technically demanding, and it is often difficult to discriminate between the absence of a mutation and the failure of a technique to detect the mutation in a particular tissue. Reliable measurement of heteroplasmy in different tissues may help identify individuals at risk of developing specific complications and allow improved prognostic advice for patients and family members. We have evaluated Pyrosequencing technology for the detection and estimation of heteroplasmy for six mitochondrial point mutations associated with the following diseases: Leber's hereditary optic neuropathy (LHON), G3460A, G11778A, and T14484C; mitochondrial encephalopathy with lactic acidosis and stroke-like episodes (MELAS), A3243G; myoclonus epilepsy with ragged red fibers (MERRF), A8344G; and neurogenic muscle weakness, ataxia, and retinitis pigmentosa (NARP)/Leigh syndrome, T8993G/C. Results obtained from the Pyrosequencing assays for 50 patients with presumptive mitochondrial disease were compared with those obtained using the commonly used diagnostic technique of polymerase chain reaction (PCR) and restriction enzyme digestion. The Pyrosequencing assays provided accurate genotyping and quantitative determination of mutational load with a sensitivity and specificity of 100%. The MELAS A3243G mutation was detected reliably at a level of 1% heteroplasmy. We conclude that Pyrosequencing is a rapid and robust method for detecting heteroplasmic mitochondrial point mutations.
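
    Quantitation of mutational load from such an assay is a simple peak-ratio calculation, sketched below with invented peak values.

        def heteroplasmy(mutant_peak, wildtype_peak):
            """Percent mutant load from allele peak intensities."""
            return 100.0 * mutant_peak / (mutant_peak + wildtype_peak)

        # e.g. MELAS m.3243A>G: a small G peak against a dominant A peak
        print(f"{heteroplasmy(1.2, 58.5):.1f}% mutant load")  # ~2%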

  24. Quantitative Analyses of Certain Enteric Bacteria and Bacterial Extracts

    PubMed Central

    Glenn, William G.; Ralston, James R.; Russell, Warren J.

    1967-01-01

    Standardized individual preparations of five population levels of eight enteric organisms [Escherichia coli (O4:H3), E. coli (O111:B4:H12), Salmonella enteritidis, S. paratyphi B, S. typhimurium, Shigella boydii, S. dysenteriae, and S. sonnei] were prepared. Dry weights, calculated mean cell weight, and nitrogen content of bacterial suspensions before, and of supernatant fluids after, ultrasonic disruption are tabulated. Percentages of disruption, estimated from nitrogen concentration ratios of the suspensions and supernatant fluids, are given. These data are presented as guidelines for the preparation of bacterial extracts prior to precipitin analyses. PMID:16349752

  25. A new method to synthesize competitor RNAs for accurate analyses by competitive RT-PCR.

    PubMed

    Ishibashi, O

    1997-12-03

    A method to synthesize competitor RNAs as internal standards for competitive RT-PCR is improved by using the long accurate PCR (LA-PCR) technique. Competitor templates synthesized by the new method are almost the same in length, and possibly in secondary structure, as target mRNAs to be quantified except that they include the short deletion within the segments to be amplified. This allows the reverse transcription to be achieved with almost the same efficiency from both target mRNAs and competitor RNAs. Therefore, more accurate quantification can be accomplished by using such competitor RNAs.
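
    For context, competitive RT-PCR quantitation reduces to locating the equivalence point of a competitor dilution series: the log of the target/competitor product ratio falls roughly linearly with the log of competitor input, and the target amount is the input at which the ratio equals 1. All values below are illustrative.

        import numpy as np

        competitor_in = np.array([1e3, 1e4, 1e5, 1e6, 1e7])   # copies added
        band_ratio = np.array([85.0, 9.2, 1.1, 0.10, 0.011])  # target/competitor

        slope, intercept = np.polyfit(np.log10(competitor_in),
                                      np.log10(band_ratio), 1)
        target_copies = 10.0 ** (-intercept / slope)  # log-ratio = 0 crossing
        print(f"estimated target = {target_copies:.2e} copies")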

  26. Quantitative analyses of maxillary sinus using computed tomography.

    PubMed

    Perella, Andréia; Rocha, Sara Dos Santos; Cavalcanti, Marcelo de Gusmão Paraiso

    2003-09-01

    The aim of this study was to evaluate the precision and accuracy of linear measurements of the maxillary sinus made on tomographic films, by comparison with 3D reconstructed images. Linear measurements of both maxillary sinuses were made on computed tomography (CT) images of 17 patients, with or without lesions, by two calibrated examiners independently, on two occasions, using a single manual caliper. A third examiner made the same measurements electronically on 3D-CT reconstructions. The statistical analysis was performed using ANOVA (analysis of variance). The intra-observer percentage error was small in both cases, with and without lesions, ranging from 1.14% to 1.82%. The inter-observer error was slightly higher, reaching 2.08%. The percentage accuracy error was higher still, and was greater in samples with lesions than in those without. CT provided adequate precision and accuracy for maxillary sinus analyses. The precision in cases with lesions was inferior to that in cases without, but this does not compromise the efficacy of the method.

  27. Understanding silicate hydration from quantitative analyses of hydrating tricalcium silicates

    NASA Astrophysics Data System (ADS)

    Pustovgar, Elizaveta; Sangodkar, Rahul P.; Andreev, Andrey S.; Palacios, Marta; Chmelka, Bradley F.; Flatt, Robert J.; D'Espinose de Lacaillerie, Jean-Baptiste

    2016-03-01

    Silicate hydration is prevalent in natural and technological processes, such as mineral weathering, glass alteration, zeolite syntheses and cement hydration. Tricalcium silicate (Ca3SiO5), the main constituent of Portland cement, is amongst the most reactive silicates in water. Despite its widespread industrial use, the reaction of Ca3SiO5 with water to form calcium-silicate-hydrates (C-S-H) still hosts many open questions. Here, we show that solid-state nuclear magnetic resonance measurements of 29Si-enriched triclinic Ca3SiO5 enable quantitative monitoring of the hydration process in terms of transient local molecular composition and the extent of silicate hydration and polymerization. This provides insights into the relative influence of surface hydroxylation and hydrate precipitation on the hydration rate. When the rate drops, the amount of hydroxylated Ca3SiO5 decreases, demonstrating partial passivation of the surface during the deceleration stage. Moreover, the relative quantities of monomers, dimers, pentamers and octamers in the C-S-H structure are measured.

  28. Understanding silicate hydration from quantitative analyses of hydrating tricalcium silicates

    PubMed Central

    Pustovgar, Elizaveta; Sangodkar, Rahul P.; Andreev, Andrey S.; Palacios, Marta; Chmelka, Bradley F.; Flatt, Robert J.; d'Espinose de Lacaillerie, Jean-Baptiste

    2016-01-01

    Silicate hydration is prevalent in natural and technological processes, such as mineral weathering, glass alteration, zeolite syntheses and cement hydration. Tricalcium silicate (Ca3SiO5), the main constituent of Portland cement, is amongst the most reactive silicates in water. Despite its widespread industrial use, the reaction of Ca3SiO5 with water to form calcium-silicate-hydrates (C-S-H) still hosts many open questions. Here, we show that solid-state nuclear magnetic resonance measurements of 29Si-enriched triclinic Ca3SiO5 enable quantitative monitoring of the hydration process in terms of transient local molecular composition and the extent of silicate hydration and polymerization. This provides insights into the relative influence of surface hydroxylation and hydrate precipitation on the hydration rate. When the rate drops, the amount of hydroxylated Ca3SiO5 decreases, demonstrating partial passivation of the surface during the deceleration stage. Moreover, the relative quantities of monomers, dimers, pentamers and octamers in the C-S-H structure are measured. PMID:27009966

  29. Analyses on Regional Cultivated Land Change Based on Quantitative Method

    NASA Astrophysics Data System (ADS)

    Cao, Yingui; Yuan, Chun; Zhou, Wei; Wang, Jing

    The Three Gorges Project is one of the largest construction projects in the world and has accelerated economic development in its reservoir area. In the course of this development, cultivated land has become a critical resource: a great deal of cultivated land has been occupied and converted to construction land, and at the same time a great deal has been flooded by the rising water level. This paper uses the cultivated land areas and social-economic indicators of the Three Gorges reservoir area for 1990-2004, applying statistical analyses and a case study in order to characterize the process of cultivated land change, identify its driving forces, and find new methods to simulate and forecast future cultivated land areas, in support of cultivated land protection and sustainable development in the reservoir area. The results are as follows. First, over the past 15 years the cultivated land area decreased by 200142 hm2, a loss of 13343 hm2 per year. The reservoir area divides into three sub-areas (the upper reaches, the belly area and the lower reaches), and the trends of cultivated land change in each are similar to that of the whole reservoir area. Second, the curve of cultivated land area against per capita GDP takes an inverted-U shape, and in some years the rate of cultivated land change diverges from the rate of GDP change, indicating that cultivated land change and GDP change are decoupled; cultivated land change is also strongly connected with urbanization and with the policy of returning farmland to forest. Finally, since multiple regression simulated the cultivated land areas less precisely than a BP neural network, the BP neural network was used to forecast the cultivated land areas for 2005, 2010 and 2015, and the forecast results are reasonable.
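
    As a hedged sketch of the forecasting step, the code below fits a small backpropagation (BP) neural network to placeholder driving-force indicators and extrapolates the cultivated land area; the features, units, and numbers are stand-ins, not the study's data.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        years = np.arange(1990, 2005)
        # Placeholder drivers (e.g. per capita GDP index, urbanization rate)
        X = np.column_stack([years, np.linspace(0.2, 1.0, years.size),
                             np.linspace(0.3, 0.9, years.size)])
        # Cultivated land in 10^4 hm^2: linear decline plus noise
        y = 130.0 - 1.3343 * (years - 1990) + rng.normal(0, 0.3, years.size)

        scaler = StandardScaler().fit(X)
        model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=20000,
                             tol=1e-6, random_state=0)
        model.fit(scaler.transform(X), y)

        X_future = np.array([[2005, 1.05, 0.95], [2010, 1.20, 1.00],
                             [2015, 1.35, 1.05]])
        print(model.predict(scaler.transform(X_future)))  # 10^4 hm^2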

  30. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    The steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and of the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using features of color, shape, and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies which characteristics contribute effectively to accurate detection.

  31. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations

    NASA Astrophysics Data System (ADS)

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-01

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.
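
    The underlying analysis can be sketched from the generalized Planck law: in the Boltzmann limit the emitted intensity is I(E) ∝ A(E) E^2 exp(-(E - Δμ)/kT), so on a high-energy tail where the absorptivity A is roughly constant, ln(I/E^2) is linear in E with slope -1/kT. The synthetic example below recovers T and Δμ under exactly the constant-absorptivity assumption that this work cautions against.

        import numpy as np

        k_B = 8.617e-5                        # Boltzmann constant, eV/K
        T_true, dmu_true = 450.0, 1.05        # illustrative values: K, eV
        E = np.linspace(1.20, 1.45, 200)      # photon energies on the tail, eV
        I = E ** 2 * np.exp(-(E - dmu_true) / (k_B * T_true))

        # ln(I/E^2) = (dmu - E) / kT: the slope gives T, the intercept dmu
        slope, intercept = np.polyfit(E, np.log(I / E ** 2), 1)
        T_fit = -1.0 / (k_B * slope)
        dmu_fit = intercept * k_B * T_fit
        print(f"T = {T_fit:.0f} K, dmu = {dmu_fit:.3f} eV")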

  32. Accurate radiation temperature and chemical potential from quantitative photoluminescence analysis of hot carrier populations.

    PubMed

    Gibelli, François; Lombez, Laurent; Guillemoles, Jean-François

    2017-02-15

    In order to characterize hot carrier populations in semiconductors, photoluminescence measurement is a convenient tool, enabling us to probe the carrier thermodynamical properties in a contactless way. However, the analysis of the photoluminescence spectra is based on some assumptions which will be discussed in this work. We especially emphasize the importance of the variation of the material absorptivity that should be considered to access accurate thermodynamical properties of the carriers, especially by varying the excitation power. The proposed method enables us to obtain more accurate results of thermodynamical properties by taking into account a rigorous physical description and finds direct application in investigating hot carrier solar cells, which are an adequate concept for achieving high conversion efficiencies with a relatively simple device architecture.

  33. How accurate is the Kubelka-Munk theory of diffuse reflection? A quantitative answer

    NASA Astrophysics Data System (ADS)

    Joseph, Richard I.; Thomas, Michael E.

    2012-10-01

    The (heuristic) Kubelka-Munk theory of the diffuse reflectance and transmittance of a film on a substrate, which is widely used because it gives simple analytic results, is compared to the rigorous radiative transfer model of Chandrasekhar. The rigorous model must be solved numerically and is thus less intuitive. The Kubelka-Munk theory uses an absorption coefficient and a scattering coefficient as inputs, similar to the rigorous model of Chandrasekhar. The relationship between these two sets of coefficients is addressed. It is shown that the Kubelka-Munk theory is remarkably accurate if one uses the proper albedo parameter.
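
    For reference, the Kubelka-Munk results being tested are closed-form. A minimal sketch of the standard two-flux expressions for a layer of thickness d over a substrate, and for an optically thick layer:

        import numpy as np

        def km_reflectance(K, S, d, R_g=0.0):
            """Diffuse reflectance of a Kubelka-Munk layer on a substrate of
            reflectance R_g; K and S are per unit length."""
            a = 1.0 + K / S
            b = np.sqrt(a * a - 1.0)
            coth = 1.0 / np.tanh(b * S * d)
            return (1.0 - R_g * (a - b * coth)) / (a - R_g + b * coth)

        def km_r_infinity(K, S):
            """R_inf = 1 + K/S - sqrt((K/S)^2 + 2 K/S)."""
            x = K / S
            return 1.0 + x - np.sqrt(x * x + 2.0 * x)

        print(km_r_infinity(K=0.1, S=1.0))           # thick-layer limit
        print(km_reflectance(K=0.1, S=1.0, d=50.0))  # approaches R_inf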

  34. Highly sensitive capillary electrophoresis-mass spectrometry for rapid screening and accurate quantitation of drugs of abuse in urine.

    PubMed

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-05-30

    The combination of capillary electrophoresis (CE) and mass spectrometry (MS) is particularly well adapted to bioanalysis due to its high separation efficiency, selectivity, and sensitivity; its short analytical time; and its low solvent and sample consumption. For clinical and forensic toxicology, a two-step analysis is usually performed: first, a screening step for compound identification, and second, confirmation and/or accurate quantitation in cases of presumed positive results. In this study, a fast and sensitive CE-MS workflow was developed for the screening and quantitation of drugs of abuse in urine samples. A CE with a time-of-flight MS (CE-TOF/MS) screening method was developed using a simple urine dilution and on-line sample preconcentration with pH-mediated stacking. The sample stacking allowed for a high loading capacity (20.5% of the capillary length), leading to limits of detection as low as 2 ng/mL for drugs of abuse. Compound quantitation of positive samples was performed by CE-MS/MS with a triple quadrupole MS equipped with an adapted triple-tube sprayer and an electrospray ionization (ESI) source. The CE-ESI-MS/MS method was validated for two model compounds, cocaine (COC) and methadone (MTD), according to the Guidance of the Food and Drug Administration. The quantitative performance was evaluated for selectivity, response function, the lower limit of quantitation, trueness, precision, and accuracy. COC and MTD detection in urine samples was determined to be accurate over the ranges of 10-1000 ng/mL and 21-1000 ng/mL, respectively.

  35. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling.

    PubMed

    Boers, Stefan A; Hays, John P; Jansen, Ruud

    2017-04-05

    In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or from the chemicals/reagents used. Using equimolar synthetic microbial community samples and low-biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses much higher precision and a lower limit of detection than traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison.
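
    The calibrator correction amounts to a per-sample rescaling, sketched below with illustrative counts: each OTU's reads are converted to absolute 16S copies via the spiked calibrator, and copies measured in the negative extraction control are subtracted.

        def absolute_abundance(otu_reads, cal_reads, cal_copies,
                               blank_copies=None):
            """Convert OTU read counts to absolute 16S copies."""
            copies = {otu: reads * cal_copies / cal_reads
                      for otu, reads in otu_reads.items()}
            if blank_copies:  # subtract contamination seen in the control
                copies = {otu: max(c - blank_copies.get(otu, 0.0), 0.0)
                          for otu, c in copies.items()}
            return copies

        sample = {"OTU_1": 48000, "OTU_2": 1500, "OTU_3": 90}
        blank = {"OTU_3": 120.0}  # copies attributed to reagent contamination
        print(absolute_abundance(sample, cal_reads=6000, cal_copies=1.0e4,
                                 blank_copies=blank))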

  36. Novel micelle PCR-based method for accurate, sensitive and quantitative microbiota profiling

    PubMed Central

    Boers, Stefan A.; Hays, John P.; Jansen, Ruud

    2017-01-01

    In the last decade, many researchers have embraced 16S rRNA gene sequencing techniques, which has led to a wealth of publications and documented differences in the composition of microbial communities derived from many different ecosystems. However, comparison between different microbiota studies is currently very difficult due to the lack of a standardized 16S rRNA gene sequencing protocol. Here we report on a novel approach employing micelle PCR (micPCR) in combination with an internal calibrator that allows for standardization of microbiota profiles via their absolute abundances. The addition of an internal calibrator allows the researcher to express the resulting operational taxonomic units (OTUs) as a measure of 16S rRNA gene copies by correcting the number of sequences of each individual OTU in a sample for efficiency differences in the NGS process. Additionally, accurate quantification of OTUs obtained from negative extraction control samples allows for the subtraction of contaminating bacterial DNA derived from the laboratory environment or from the chemicals/reagents used. Using equimolar synthetic microbial community samples and low-biomass clinical samples, we demonstrate that the calibrated micPCR/NGS methodology possesses much higher precision and a lower limit of detection than traditional PCR/NGS, resulting in more accurate microbiota profiles suitable for multi-study comparison. PMID:28378789

  37. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis

    PubMed Central

    Smith, William L.; Chadwick, Sean G.; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E.; Aguin, Tina J.; Sobel, Jack D.

    2016-01-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes on a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. A highly accurate molecular assay for the diagnosis of BV would therefore be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We performed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms whose quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV. PMID:26818677
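
    A hedged sketch of such a diagnostic model: logistic regression on log-transformed qPCR quantities of the four informative organisms. The arrays below are synthetic stand-ins for training data, not the study's measurements.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 200
        # log10 copies per swab: G. vaginalis, A. vaginae, Megasphaera 1 and 2
        X_neg = rng.normal([3.0, 2.0, 1.0, 0.5], 1.0, (n, 4))
        X_pos = rng.normal([7.5, 6.0, 5.0, 4.0], 1.0, (n, 4))
        X = np.vstack([X_neg, X_pos])
        y = np.array([0] * n + [1] * n)   # 1 = BV by Amsel plus Nugent

        clf = LogisticRegression(max_iter=1000).fit(X, y)
        new_specimen = np.array([[7.0, 5.5, 4.2, 3.8]])
        print(f"P(BV) = {clf.predict_proba(new_specimen)[0, 1]:.2f}")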

  38. Challenges in Higher Education Research: The Use of Quantitative Tools in Comparative Analyses

    ERIC Educational Resources Information Center

    Reale, Emanuela

    2014-01-01

    Although the value of the comparative perspective for the study of higher education is widely recognised, there is little consensus about specific methodological approaches. Quantitative tools are relevant for addressing comparative analyses since they are supposed to reduce complexity, identifying and grading similarities…

  39. Recycling and Ambivalence: Quantitative and Qualitative Analyses of Household Recycling among Young Adults

    ERIC Educational Resources Information Center

    Ojala, Maria

    2008-01-01

    Theories about ambivalence, as well as quantitative and qualitative empirical approaches, are applied to obtain an understanding of recycling among young adults. A questionnaire was mailed to 422 Swedish young people. Regression analyses showed that a mix of negative emotions (worry) and positive emotions (hope and joy) about the environmental…

  20. Quantitative spectroscopy of hot stars: accurate atomic data applied on a large scale as driver of recent breakthroughs

    NASA Astrophysics Data System (ADS)

    Przybilla, N.; Schaffenroth, V.; Nieva, M. F.; Butler, K.

    2016-10-01

    OB-type stars are hotbeds for non-LTE physics because of their strong radiation fields, which drive the atmospheric plasma out of local thermodynamic equilibrium. We report on recent breakthroughs in the quantitative analysis of the optical and UV spectra of OB-type stars that were facilitated by the application of accurate and precise atomic data on a large scale. An astrophysicist's dream has come true: observed and model spectra can be brought into close match over wide parts of the observed wavelength ranges. This allows tight observational constraints to be derived from OB-type stars for a wide range of applications in astrophysics. However, despite the progress made, many details of the modelling may be improved further. We discuss atomic data needs in terms of laboratory measurements and also ab initio calculations. Particular emphasis is given to quantitative spectroscopy in the near-IR, which will be the focus in the era of the upcoming extremely large telescopes.

  1. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  2. There's plenty of gloom at the bottom: the many challenges of accurate quantitation in size-based oligomeric separations.

    PubMed

    Striegel, André M

    2013-11-01

    There is a variety of small-molecule species (e.g., tackifiers, plasticizers, oligosaccharides) the size-based characterization of which is of considerable scientific and industrial importance. Likewise, quantitation of the amount of oligomers in a polymer sample is crucial for the import and export of substances into the USA and European Union (EU). While the characterization of ultra-high molar mass macromolecules by size-based separation techniques is generally considered a challenge, it is this author's contention that a greater challenge is encountered when trying to perform, for quantitation purposes, separations in and of the oligomeric region. The latter thesis is expounded herein, by detailing the various obstacles encountered en route to accurate, quantitative oligomeric separations by entropically dominated techniques such as size-exclusion chromatography, hydrodynamic chromatography, and asymmetric flow field-flow fractionation, as well as by methods which are, principally, enthalpically driven such as liquid adsorption and temperature gradient interaction chromatography. These obstacles include, among others, the diminished sensitivity of static light scattering (SLS) detection at low molar masses, the non-constancy of the response of SLS and of commonly employed concentration-sensitive detectors across the oligomeric region, and the loss of oligomers through the accumulation wall membrane in asymmetric flow field-flow fractionation. The battle is not lost, however, because, with some care and given a sufficient supply of sample, the quantitation of both individual oligomeric species and of the total oligomeric region is often possible.

  3. Quantitation of Insulin-Like Growth Factor 1 in Serum by Liquid Chromatography High Resolution Accurate-Mass Mass Spectrometry.

    PubMed

    Ketha, Hemamalini; Singh, Ravinder J

    2016-01-01

    Insulin-like growth factor 1 (IGF-1) is a 70-amino-acid peptide hormone which acts as the principal mediator of the effects of growth hormone (GH). Due to the wide variability in the circulating concentration of GH, IGF-1 quantitation is the first step in the diagnosis of GH excess or deficiency. The majority (>95%) of IGF-1 circulates as a ternary complex along with its principal binding protein, insulin-like growth factor binding protein 3 (IGFBP-3), and the acid labile subunit. The assay design for IGF-1 quantitation therefore has to include a step to dissociate IGF-1 from its ternary complex. Several commercial assays employ a buffer containing acidified ethanol to achieve this. Despite several modifications, commercially available immunoassays have been shown to suffer from interference by IGFBP-3. Additionally, inter-method comparison between IGF-1 immunoassays has been shown to be suboptimal. Mass spectrometry has been utilized for quantitation of IGF-1. In this chapter, a liquid chromatography high resolution accurate-mass mass spectrometry (LC-HRAMS) based method for IGF-1 quantitation is described.

  4. Quantitatively accurate activity measurements with a dedicated cardiac SPECT camera: Physical phantom experiments

    SciTech Connect

    Pourmoghaddas, Amir; Wells, R. Glenn

    2016-01-15

    Purpose: Recently, there has been increased interest in dedicated cardiac single photon emission computed tomography (SPECT) scanners with pinhole collimation and improved detector technology due to their improved count sensitivity and resolution over traditional parallel-hole cameras. With traditional cameras, energy-based approaches are often used in the clinic for scatter compensation because they are fast and easily implemented. Some of the cardiac cameras use cadmium-zinc-telluride (CZT) detectors which can complicate the use of energy-based scatter correction (SC) due to the low-energy tail (an increased number of unscattered photons detected with reduced energy). Modified energy-based scatter correction methods can be implemented, but their level of accuracy is unclear. In this study, the authors validated by physical phantom experiments the quantitative accuracy and reproducibility of easily implemented correction techniques applied to (99m)Tc myocardial imaging with a CZT-detector-based gamma camera with multiple heads, each with a single-pinhole collimator. Methods: Activity in the cardiac compartment of an Anthropomorphic Torso phantom (Data Spectrum Corporation) was measured through 15 (99m)Tc-SPECT acquisitions. The ratio of activity concentrations in organ compartments resembled a clinical (99m)Tc-sestamibi scan and was kept consistent across all experiments (1.2:1 heart to liver and 1.5:1 heart to lung). Two background activity levels were considered: no activity (cold) and an activity concentration 1/10th of the heart (hot). A plastic “lesion” was placed inside of the septal wall of the myocardial insert to simulate the presence of a region without tracer uptake and contrast in this lesion was calculated for all images. The true net activity in each compartment was measured with a dose calibrator (CRC-25R, Capintec, Inc.). A 10 min SPECT image was acquired using a dedicated cardiac camera with CZT detectors (Discovery NM530c, GE

  5. Development and Validation of a Highly Accurate Quantitative Real-Time PCR Assay for Diagnosis of Bacterial Vaginosis.

    PubMed

    Hilbert, David W; Smith, William L; Chadwick, Sean G; Toner, Geoffrey; Mordechai, Eli; Adelson, Martin E; Aguin, Tina J; Sobel, Jack D; Gygax, Scott E

    2016-04-01

    Bacterial vaginosis (BV) is the most common gynecological infection in the United States. Diagnosis based on Amsel's criteria can be challenging and can be aided by laboratory-based testing. A standard method for diagnosis in research studies is enumeration of bacterial morphotypes of a Gram-stained vaginal smear (i.e., Nugent scoring). However, this technique is subjective, requires specialized training, and is not widely available. Therefore, a highly accurate molecular assay for the diagnosis of BV would be of great utility. We analyzed 385 vaginal specimens collected prospectively from subjects who were evaluated for BV by clinical signs and Nugent scoring. We performed quantitative real-time PCR (qPCR) assays on DNA extracted from these specimens to quantify nine organisms associated with vaginal health or disease: Gardnerella vaginalis, Atopobium vaginae, BV-associated bacteria 2 (BVAB2, an uncultured member of the order Clostridiales), Megasphaera phylotype 1 or 2, Lactobacillus iners, Lactobacillus crispatus, Lactobacillus gasseri, and Lactobacillus jensenii. We generated a logistic regression model that identified G. vaginalis, A. vaginae, and Megasphaera phylotypes 1 and 2 as the organisms for which quantification provided the most accurate diagnosis of symptomatic BV, as defined by Amsel's criteria and Nugent scoring, with 92% sensitivity, 95% specificity, 94% positive predictive value, and 94% negative predictive value. The inclusion of Lactobacillus spp. did not contribute sufficiently to the quantitative model for symptomatic BV detection. This molecular assay is a highly accurate laboratory tool to assist in the diagnosis of symptomatic BV.

  6. A Global Approach to Accurate and Automatic Quantitative Analysis of NMR Spectra by Complex Least-Squares Curve Fitting

    NASA Astrophysics Data System (ADS)

    Martin, Y. L.

    The performance of quantitative analysis of 1D NMR spectra depends greatly on the choice of the NMR signal model. Complex least-squares analysis is well suited for optimizing the quantitative determination of spectra containing a limited number of signals (<30) obtained under satisfactory conditions of signal-to-noise ratio (>20). From a general point of view it is concluded, on the basis of mathematical considerations and numerical simulations, that, in the absence of truncation of the free-induction decay, complex least-squares curve fitting either in the time or in the frequency domain and linear-prediction methods are in fact nearly equivalent and give identical results. However, in the situation considered, complex least-squares analysis in the frequency domain is more flexible since it enables the quality of convergence to be appraised at every resonance position. An efficient data-processing strategy has been developed which makes use of an approximate conjugate-gradient algorithm. All spectral parameters (frequency, damping factors, amplitudes, phases, initial delay associated with intensity, and phase parameters of a baseline correction) are simultaneously managed in an integrated approach which is fully automatable. The behavior of the error as a function of the signal-to-noise ratio is theoretically estimated, and the influence of apodization is discussed. The least-squares curve fitting is theoretically proved to be the most accurate approach for quantitative analysis of 1D NMR data acquired with reasonable signal-to-noise ratio. The method enables complex spectral residuals to be sorted out. These residuals, which can be cumulated thanks to the possibility of correcting for frequency shifts and phase errors, extract systematic components, such as isotopic satellite lines, and characterize the shape and the intensity of the spectral distortion with respect to the Lorentzian model. This distortion is shown to be nearly independent of the chemical species
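
    The core of the approach, complex least-squares fitting of a frequency-domain signal model, can be sketched compactly. The following is an illustrative toy, assuming a single Lorentzian line and using a generic trust-region solver rather than the approximate conjugate-gradient algorithm the paper develops:

```python
# Sketch of complex least-squares fitting of one NMR resonance in the
# frequency domain. Model, noise level, and starting values are illustrative.
import numpy as np
from scipy.optimize import least_squares

def lorentzian(params, f):
    """Complex Lorentzian: amplitude a, phase phi, damping lam, frequency f0."""
    a, phi, lam, f0 = params
    return a * np.exp(1j * phi) / (lam + 2j * np.pi * (f - f0))

def residuals(params, f, data):
    diff = lorentzian(params, f) - data
    return np.concatenate([diff.real, diff.imag])  # stack real and imaginary parts

f = np.linspace(-50.0, 50.0, 1024)          # frequency axis in Hz
truth = np.array([10.0, 0.3, 8.0, 5.0])
rng = np.random.default_rng(0)
data = lorentzian(truth, f) + 0.01 * (rng.normal(size=f.size)
                                      + 1j * rng.normal(size=f.size))

fit = least_squares(residuals, x0=[5.0, 0.0, 5.0, 0.0], args=(f, data))
print(fit.x)  # recovered amplitude, phase, damping, frequency
```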

  7. Simple, fast, and accurate methodology for quantitative analysis using Fourier transform infrared spectroscopy, with bio-hybrid fuel cell examples.

    PubMed

    Mackie, David M; Jahnke, Justin P; Benyamin, Marcus S; Sumner, James J

    2016-01-01

    The standard methodologies for quantitative analysis (QA) of mixtures using Fourier transform infrared (FTIR) instruments have evolved until they are now more complicated than necessary for many users' purposes. We present a simpler methodology, suitable for widespread adoption of FTIR QA as a standard laboratory technique across disciplines by occasional users.
    • The algorithm is straightforward and intuitive, yet it is also fast, accurate, and robust.
    • It relies on component spectra, minimization of errors, and local adaptive mesh refinement.
    • It was tested successfully on real mixtures of up to nine components.
    We show that our methodology is robust to challenging experimental conditions such as similar substances, component percentages differing by three orders of magnitude, and imperfect (noisy) spectra. As examples, we analyze biological, chemical, and physical aspects of bio-hybrid fuel cells.
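
    The heart of any such FTIR QA scheme is expressing the mixture spectrum as a non-negative linear combination of stored component spectra. A minimal sketch under that assumption, with synthetic Gaussian bands standing in for real component spectra (the paper's local adaptive mesh refinement is not reproduced here):

```python
# Sketch: a mixture spectrum modelled as a non-negative linear combination
# of known component spectra. All spectra here are synthetic.
import numpy as np
from scipy.optimize import nnls

wavenumbers = np.linspace(800, 1800, 500)

def band(center, width):
    # Gaussian band as a stand-in for a measured pure-component spectrum
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

components = np.column_stack([band(1050, 30), band(1250, 40), band(1650, 25)])

true_fractions = np.array([0.70, 0.05, 0.25])
noise = 0.002 * np.random.default_rng(1).normal(size=wavenumbers.size)
mixture = components @ true_fractions + noise

coeffs, _ = nnls(components, mixture)   # non-negative least squares
print(coeffs / coeffs.sum())            # estimated component fractions
```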

  8. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  9. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  10. Renal Cortical Lactate Dehydrogenase: A Useful, Accurate, Quantitative Marker of In Vivo Tubular Injury and Acute Renal Failure

    PubMed Central

    Zager, Richard A.; Johnson, Ali C. M.; Becker, Kirsten

    2013-01-01

    Studies of experimental acute kidney injury (AKI) are critically dependent on having precise methods for assessing the extent of tubular cell death. However, the most widely used techniques either provide indirect assessments (e.g., BUN, creatinine), suffer from the need for semi-quantitative grading (renal histology), or reflect the status of residual viable, not the number of lost, renal tubular cells (e.g., NGAL content). Lactate dehydrogenase (LDH) release is a highly reliable test for assessing degrees of in vitro cell death. However, its utility as an in vivo AKI marker has not been defined. Towards this end, CD-1 mice were subjected to graded renal ischemia (0, 15, 22, 30, 40, or 60 min) or to nephrotoxic (glycerol; maleate) AKI. Sham-operated mice, or mice with AKI in the absence of acute tubular necrosis (ureteral obstruction; endotoxemia), served as negative controls. Renal cortical LDH or NGAL levels were assayed 2 or 24 hrs later. Ischemic, glycerol-, and maleate-induced AKI were each associated with striking, steep, inverse correlations (r = −0.89) between renal injury severity and renal LDH content. With severe AKI, >65% LDH declines were observed. Corresponding prompt plasma and urinary LDH increases were observed. These observations, coupled with the maintenance of normal cortical LDH mRNA levels, indicated that renal LDH efflux, not decreased LDH synthesis, caused the falling cortical LDH levels. Renal LDH content was well maintained with sham surgery, ureteral obstruction, or endotoxemic AKI. In contrast to LDH, renal cortical NGAL levels did not correlate with AKI severity. In sum, the above results indicate that renal cortical LDH assay is a highly accurate quantitative technique for gauging the extent of experimental acute ischemic and toxic renal injury. That it avoids the limitations of more traditional AKI markers implies great potential utility in experimental studies that require precise quantitation of tubule cell death. PMID:23825563

  11. Genome-wide association analyses of quantitative traits: the GAW16 experience.

    PubMed

    Ghosh, Saurabh

    2009-01-01

    The group that formed on the theme of genome-wide association analyses of quantitative traits (Group 2) in the Genetic Analysis Workshop 16 comprised eight sets of investigators. Three data sets were available: one on autoantibodies related to rheumatoid arthritis provided by the North American Rheumatoid Arthritis Consortium; the second on anthropometric, lipid, and biochemical measures provided by the Framingham Heart Study (FHS); and the third a simulated data set modeled after FHS. The different investigators in the group addressed a large set of statistical challenges and applied a wide spectrum of association methods in analyzing quantitative traits at the genome-wide level. While some previously reported genes were validated, some novel chromosomal regions provided significant evidence of association in multiple contributions in the group. In this report, we discuss the different strategies explored by the different investigators with the common goal of improving the power to detect association.

  12. Quantitative polymerase chain reaction analysis of DNA from noninvasive samples for accurate microsatellite genotyping of wild chimpanzees (Pan troglodytes verus).

    PubMed

    Morin, P A; Chambers, K E; Boesch, C; Vigilant, L

    2001-07-01

    Noninvasive samples are useful for molecular genetic analyses of wild animal populations. However, the low DNA content of such samples makes DNA amplification difficult, and there is the potential for erroneous results when one of two alleles at heterozygous microsatellite loci fails to be amplified. In this study we describe an assay designed to measure the amount of amplifiable nuclear DNA in low-concentration DNA extracts from noninvasive samples. We describe the range of DNA amounts obtained from chimpanzee faeces and shed hair samples and formulate a new, efficient approach for accurate microsatellite genotyping. Prescreening of extracts for DNA quantity is recommended for sorting samples by likely success and reliability. Extensive repetition of results remains necessary for microsatellite amplifications starting from low amounts of DNA, but can be reduced for extracts with higher DNA content.
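
    The prescreening logic suggested above (sort extracts by measured amplifiable DNA and set the number of replicate amplifications accordingly) can be sketched in a few lines. The cutoff and replicate counts below are placeholders, not the thresholds derived in the study:

```python
# Sketch of DNA-quantity prescreening before microsatellite genotyping.
# Cutoff and replicate counts are illustrative placeholders.
def genotyping_plan(dna_pg_per_ul, cutoff_pg=25.0, low_reps=7, high_reps=3):
    """Return (proceed, replicates) for one extract."""
    if dna_pg_per_ul <= 0:
        return (False, 0)           # no amplifiable DNA: exclude the sample
    if dna_pg_per_ul < cutoff_pg:
        return (True, low_reps)     # low DNA: many repetitions needed
    return (True, high_reps)        # sufficient DNA: fewer repetitions

for extract in [0.0, 12.0, 80.0]:
    print(extract, genotyping_plan(extract))
```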

  13. Quantitative methods for analysing cumulative effects on fish migration success: a review.

    PubMed

    Johnson, J E; Patterson, D A; Martins, E G; Cooke, S J; Hinch, S G

    2012-07-01

    It is often recognized, but seldom addressed, that a quantitative assessment of the cumulative effects, both additive and non-additive, of multiple stressors on fish survival would provide a more realistic representation of the factors that influence fish migration. This review presents a compilation of analytical methods applied to a well-studied fish migration, a more general review of quantitative multivariable methods, and a synthesis on how to apply new analytical techniques in fish migration studies. A compilation of adult migration papers from Fraser River sockeye salmon Oncorhynchus nerka revealed a limited number of multivariable methods being applied and the sub-optimal reliance on univariable methods for multivariable problems. The literature review of fisheries science, general biology and medicine identified a large number of alternative methods for dealing with cumulative effects, with a limited number of techniques being used in fish migration studies. An evaluation of the different methods revealed that certain classes of multivariable analyses will probably prove useful in future assessments of cumulative effects on fish migration. This overview and evaluation of quantitative methods gathered from the disparate fields should serve as a primer for anyone seeking to quantify cumulative effects on fish migration survival.

  14. Quantitative proteomic analyses of crop seedlings subjected to stress conditions; a commentary.

    PubMed

    Nanjo, Yohei; Nouri, Mohammad-Zaman; Komatsu, Setsuko

    2011-07-01

    Quantitative proteomics is one of the analytical approaches used to clarify crop responses to stress conditions. Recent remarkable advances in proteomics technologies allow for the identification of a wider range of proteins than was previously possible. Current proteomic methods fall roughly into two categories: gel-based quantification methods, including conventional two-dimensional gel electrophoresis and two-dimensional fluorescence difference gel electrophoresis, and MS-based quantification methods, which consist of label-based and label-free protein quantification approaches. Although MS-based quantification methods have become mainstream in recent years, gel-based quantification methods are still useful for proteomic analyses. Previous studies examining crop responses to stress conditions reveal that each method has both advantages and disadvantages in regard to protein quantification in comparative proteomic analyses. Furthermore, one proteomics approach cannot be fully substituted by another technique. In this review, we discuss and highlight the basis and applications of quantitative proteomic analysis approaches in crop seedlings in response to flooding and osmotic stress as two environmental stresses.

  15. Application of an Effective Statistical Technique for an Accurate and Powerful Mining of Quantitative Trait Loci for Rice Aroma Trait

    PubMed Central

    Golestan Hashemi, Farahnaz Sadat; Rafii, Mohd Y.; Ismail, Mohd Razi; Mohamed, Mahmud Tengku Muda; Rahim, Harun A.; Latif, Mohammad Abdul; Aslani, Farzad

    2015-01-01

    When a phenotype of interest is associated with an external/internal covariate, covariate inclusion in quantitative trait loci (QTL) analyses can diminish residual variation and subsequently enhance the power of QTL detection. In the in vitro synthesis of 2-acetyl-1-pyrroline (2AP), the main fragrance compound in rice, thermal processing during the Maillard-type reaction between proline and reducing carbohydrates produces a roasted, popcorn-like aroma. Hence, for the first time, we included the proline amino acid, an important precursor of 2AP, as a covariate in our QTL mapping analyses to precisely explore the genetic factors affecting natural variation for rice scent. Consequently, two QTLs were traced on chromosomes 4 and 8. They explained from 20% to 49% of the total aroma phenotypic variance. Additionally, by saturating the interval harboring the major QTL using gene-based primers, a putative allele of fgr (the major genetic determinant of fragrance) was mapped in the QTL on the 8th chromosome in the interval RM223-SCU015RM (1.63 cM). These loci supported previous studies of different accessions. Such QTLs can be widely used by breeders in crop improvement programs and for further fine mapping. Moreover, no previous studies were found on the simultaneous assessment of the relationship among 2AP, proline, and fragrance QTLs. Therefore, our findings can help further our understanding of the metabolomic and genetic basis of 2AP biosynthesis in aromatic rice. PMID:26061689
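
    The statistical point (including a covariate correlated with the phenotype shrinks residual variance and thereby sharpens QTL detection) can be illustrated with simulated data. This sketch uses ordinary least squares on a single marker; it is not the mapping software used in the study:

```python
# Sketch: regressing a phenotype on marker genotype plus a covariate leaves
# less residual variance than the marker-only model. Data are simulated.
import numpy as np

rng = np.random.default_rng(2)
n = 200
genotype = rng.integers(0, 3, n).astype(float)  # 0/1/2 allele dosage
proline = rng.normal(size=n)                    # covariate (2AP precursor)
aroma = 0.5 * genotype + 0.8 * proline + rng.normal(scale=0.5, size=n)

def residual_var(X, y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.var(y - X @ beta)

ones = np.ones(n)
without_cov = residual_var(np.column_stack([ones, genotype]), aroma)
with_cov = residual_var(np.column_stack([ones, genotype, proline]), aroma)
print(without_cov, with_cov)  # covariate model leaves less unexplained variance
```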

  16. Accurate, fast and cost-effective diagnostic test for monosomy 1p36 using real-time quantitative PCR.

    PubMed

    Cunha, Pricila da Silva; Pena, Heloisa B; D'Angelo, Carla Sustek; Koiffmann, Celia P; Rosenfeld, Jill A; Shaffer, Lisa G; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5-0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs.
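
    The copy-number call behind such an assay is commonly made by relative quantification against a reference locus and a normal control. A minimal sketch of a 2^-ddCt style computation, assuming near-100% PCR efficiency; the Ct values and the deletion window below are illustrative, not the study's validated parameters:

```python
# Sketch of calling a heterozygous deletion from qPCR Ct values by the
# 2^-ddCt method, assuming ~100% PCR efficiency. Values are illustrative.
def relative_copy_ratio(ct_target, ct_reference, ct_target_ctrl, ct_ref_ctrl):
    ddct = (ct_target - ct_reference) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)    # ~1.0 for two copies, ~0.5 for one copy

def deleted(ratio, lo=0.35, hi=0.65):
    return lo <= ratio <= hi  # heterozygous-deletion window

for gene, ct in [("PRKCZ", 26.1), ("SKI", 25.8)]:
    r = relative_copy_ratio(ct, 24.0, 25.0, 24.0)
    print(gene, round(r, 2), "deleted" if deleted(r) else "normal")
```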

  17. Accurate, Fast and Cost-Effective Diagnostic Test for Monosomy 1p36 Using Real-Time Quantitative PCR

    PubMed Central

    Cunha, Pricila da Silva; Pena, Heloisa B.; D'Angelo, Carla Sustek; Koiffmann, Celia P.; Rosenfeld, Jill A.; Shaffer, Lisa G.; Stofanko, Martin; Gonçalves-Dornelas, Higgor; Pena, Sérgio Danilo Junho

    2014-01-01

    Monosomy 1p36 is considered the most common subtelomeric deletion syndrome in humans and it accounts for 0.5–0.7% of all the cases of idiopathic intellectual disability. The molecular diagnosis is often made by microarray-based comparative genomic hybridization (aCGH), which has the drawback of being a high-cost technique. However, patients with classic monosomy 1p36 share some typical clinical characteristics that, together with its common prevalence, justify the development of a less expensive, targeted diagnostic method. In this study, we developed a simple, rapid, and inexpensive real-time quantitative PCR (qPCR) assay for targeted diagnosis of monosomy 1p36, easily accessible for low-budget laboratories in developing countries. For this, we have chosen two target genes which are deleted in the majority of patients with monosomy 1p36: PRKCZ and SKI. In total, 39 patients previously diagnosed with monosomy 1p36 by aCGH, fluorescent in situ hybridization (FISH), and/or multiplex ligation-dependent probe amplification (MLPA) all tested positive on our qPCR assay. By simultaneously using these two genes we have been able to detect 1p36 deletions with 100% sensitivity and 100% specificity. We conclude that qPCR of PRKCZ and SKI is a fast and accurate diagnostic test for monosomy 1p36, costing less than 10 US dollars in reagent costs. PMID:24839341

  18. Quantitative analyses and modelling to support achievement of the 2020 goals for nine neglected tropical diseases.

    PubMed

    Hollingsworth, T Déirdre; Adams, Emily R; Anderson, Roy M; Atkins, Katherine; Bartsch, Sarah; Basáñez, María-Gloria; Behrend, Matthew; Blok, David J; Chapman, Lloyd A C; Coffeng, Luc; Courtenay, Orin; Crump, Ron E; de Vlas, Sake J; Dobson, Andy; Dyson, Louise; Farkas, Hajnal; Galvani, Alison P; Gambhir, Manoj; Gurarie, David; Irvine, Michael A; Jervis, Sarah; Keeling, Matt J; Kelly-Hope, Louise; King, Charles; Lee, Bruce Y; Le Rutte, Epke A; Lietman, Thomas M; Ndeffo-Mbah, Martial; Medley, Graham F; Michael, Edwin; Pandey, Abhishek; Peterson, Jennifer K; Pinsent, Amy; Porco, Travis C; Richardus, Jan Hendrik; Reimer, Lisa; Rock, Kat S; Singh, Brajendra K; Stolk, Wilma; Swaminathan, Subramanian; Torr, Steve J; Townsend, Jeffrey; Truscott, James; Walker, Martin; Zoueva, Alexandra

    2015-12-09

    Quantitative analysis and mathematical models are useful tools in informing strategies to control or eliminate disease. Currently, there is an urgent need to develop these tools to inform policy to achieve the 2020 goals for neglected tropical diseases (NTDs). In this paper we give an overview of a collection of novel model-based analyses which aim to address key questions on the dynamics of transmission and control of nine NTDs: Chagas disease, visceral leishmaniasis, human African trypanosomiasis, leprosy, soil-transmitted helminths, schistosomiasis, lymphatic filariasis, onchocerciasis and trachoma. Several common themes resonate throughout these analyses, including: the importance of epidemiological setting on the success of interventions; targeting groups who are at highest risk of infection or re-infection; and reaching populations who are not accessing interventions and may act as a reservoir for infection. The results also highlight the challenge of maintaining elimination 'as a public health problem' when true elimination is not reached. The models elucidate the factors that may be contributing most to persistence of disease and discuss the requirements for eventually achieving true elimination, if that is possible. Overall this collection presents new analyses to inform current control initiatives. These papers form a base from which further development of the models and more rigorous validation against a variety of datasets can help to give more detailed advice. At the moment, the models' predictions are being considered as the world prepares for a final push towards control or elimination of neglected tropical diseases by 2020.

  19. Quantitative analyses of carbon in anhydrous and hydrated interplanetary dust particles

    NASA Technical Reports Server (NTRS)

    Thomas, Kathie L.; Keller, Lindsay P.; Blanford, George E.; Mckay, David S.

    1994-01-01

    Carbon is an important and significant component of most anhydrous and hydrated IDPs. We have analyzed approx. 40 anhydrous and hydrated chondritic IDPs for major and minor elements, including C and O. Quantitative analyses of light elements in small particles are difficult and require careful procedures in order to obtain reliable results. In our work, we have completed extensive analytical checks to verify the accuracy and precision of C abundances in IDPs. In our present work, additional methods are used to verify C abundances in IDPs, including analysis of IDP thin sections embedded in S, and direct observation of carbonaceous material in thin sections. Our work shows conclusively that C is strongly enriched in IDPs relative to CI abundances.

  20. Use of quantitative microbiological analyses to trace origin of contamination of parenteral nutrition solutions.

    PubMed

    Bhakdi, Sucharit; Krämer, Irene; Siegel, Ekkehard; Jansen, Bernd; Exner, Martin

    2012-05-01

    In the summer of 2010, parenteral nutrition (PN) admixtures administered to neonates in the Pediatric Department of the University Medical Center Mainz provoked severe clinical sequelae. Contamination of a dummy infusion with Enterobacter cloacae and Escherichia hermannii was detected on the day of the incident, and the same isolates were subsequently grown from all PN admixtures as well as from the parent amino acid solution from which the admixtures had been prepared. Quantitative microbiological analyses paired with the determination of endotoxin concentrations enabled the conclusion that the amino acid solution had been the primary source of contamination, which must have occurred in the distant past and may have derived from passage of the bacteria through a crack in the glass container. The findings have large implications, and the approaches employed should prove valuable if similar incidents occur in the future.

  1. Quantitative analyses of glass via laser-induced breakdown spectroscopy in argon

    NASA Astrophysics Data System (ADS)

    Gerhard, C.; Hermann, J.; Mercadier, L.; Loewenthal, L.; Axente, E.; Luculescu, C. R.; Sarnet, T.; Sentis, M.; Viöl, W.

    2014-11-01

    We demonstrate that elemental analysis of glass with a measurement precision of about 10% can be performed via calibration-free laser-induced breakdown spectroscopy. To this end, plasma emission spectra recorded during ultraviolet laser ablation of different glasses are compared to the spectral radiance computed for a plasma in local thermodynamic equilibrium. Using an iterative calculation algorithm, we deduce the relative elemental fractions and the plasma properties from the best agreement between measured and computed spectra. The measurement method is validated in two ways. First, the LIBS measurements are performed on fused silica composed of more than 99.9% SiO2. Second, the oxygen fractions measured for heavy flint and barite crown glasses are compared to the values expected from the oxides composing the glasses. The measured compositions are furthermore compared with those obtained by X-ray photoelectron spectroscopy and energy-dispersive X-ray spectroscopy. It is shown that accurate LIBS analyses require recording spectra with sufficiently short delays between the laser pulse and the detector gate when the electron density is larger than 10^17 cm^-3. The results show that laser-induced breakdown spectroscopy based on accurate plasma modeling is suitable for elemental analysis of complex materials such as glasses, with an analytical performance comparable to or even better than that obtained with standard techniques.
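
    The calibration-free strategy (iterating composition and plasma parameters until the computed spectrum best matches the measured one) can be caricatured with a toy forward model. The line positions, strengths, and temperature dependence below are stand-ins; a real implementation computes the LTE spectral radiance from atomic data:

```python
# Toy illustration of the calibration-free best-fit idea. The "forward
# model" below is a stand-in, not a real LTE radiance calculation.
import numpy as np
from itertools import product

wl = np.linspace(200, 600, 400)  # wavelength axis in nm

def fake_line(center, strength, temp):
    # stand-in line profile whose intensity depends on plasma temperature
    return strength * np.exp(-1e4 / temp) * np.exp(-0.5 * ((wl - center) / 2.0) ** 2)

def computed_spectrum(frac_si, temp):
    # two emission lines weighted by the (hypothetical) Si fraction
    return frac_si * fake_line(288.2, 5.0, temp) + (1 - frac_si) * fake_line(394.4, 3.0, temp)

measured = computed_spectrum(0.7, 11000.0)  # pretend this came from the detector

best = min(product(np.linspace(0.1, 0.9, 17), np.linspace(8000, 14000, 13)),
           key=lambda p: np.sum((computed_spectrum(*p) - measured) ** 2))
print(best)  # recovered (Si fraction, temperature)
```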

  2. Quantitative proteomic analyses of the response of acidophilic microbial communities to different pH conditions.

    PubMed

    Belnap, Christopher P; Pan, Chongle; Denef, Vincent J; Samatova, Nagiza F; Hettich, Robert L; Banfield, Jillian F

    2011-07-01

    Extensive genomic characterization of multi-species acid mine drainage microbial consortia combined with laboratory cultivation has enabled the application of quantitative proteomic analyses at the community level. In this study, quantitative proteomic comparisons were used to functionally characterize laboratory-cultivated acidophilic communities sustained in pH 1.45 or 0.85 conditions. The distributions of all proteins identified for individual organisms indicated biases for either high or low pH, and suggest pH-specific niche partitioning for low-abundance bacteria and archaea. Although the proteome of the dominant bacterium, Leptospirillum group II, was largely unaffected by pH treatments, analysis of functional categories indicated proteins involved in amino acid and nucleotide metabolism, as well as cell membrane/envelope biogenesis, were overrepresented at high pH. Comparison of specific protein abundances indicates higher pH conditions favor Leptospirillum group III, whereas low pH conditions promote the growth of certain archaea. Thus, quantitative proteomic comparisons revealed distinct differences in community composition and metabolic function of individual organisms during different pH treatments. Proteomic analysis revealed other aspects of community function. Different numbers of phage proteins were identified across biological replicates, indicating stochastic spatial heterogeneity of phage outbreaks. Additionally, proteomic data were used to identify a previously unknown genotypic variant of Leptospirillum group II, an indication of selection for a specific Leptospirillum group II population in laboratory communities. Our results confirm the importance of pH and related geochemical factors in fine-tuning acidophilic microbial community structure and function at the species and strain level, and demonstrate the broad utility of proteomics in laboratory community studies.

  3. Quantitative proteomic analyses of the response of acidophilic microbial communities to different pH conditions

    SciTech Connect

    Belnap, Christopher P.; Pan, Chongle; Denef, Vincent; Samatova, Nagiza F.; Hettich, Robert L.; Banfield, Jillian F.

    2011-01-01

    Extensive genomic characterization of multi-species acid mine drainage microbial consortia combined with laboratory cultivation has enabled the application of quantitative proteomic analyses at the community level. In this study, quantitative proteomic comparisons were used to functionally characterize laboratory-cultivated acidophilic communities sustained in pH 1.45 or 0.85 conditions. The distributions of all proteins identified for individual organisms indicated biases for either high or low pH, and suggest pH-specific niche partitioning for low-abundance bacteria and archaea. Although the proteome of the dominant bacterium, Leptospirillum group II, was largely unaffected by pH treatments, analysis of functional categories indicated proteins involved in amino acid and nucleotide metabolism, as well as cell membrane/envelope biogenesis, were overrepresented at high pH. Comparison of specific protein abundances indicates higher pH conditions favor Leptospirillum group III, whereas low pH conditions promote the growth of certain archaea. Thus, quantitative proteomic comparisons revealed distinct differences in community composition and metabolic function of individual organisms during different pH treatments. Proteomic analysis revealed other aspects of community function. Different numbers of phage proteins were identified across biological replicates, indicating stochastic spatial heterogeneity of phage outbreaks. Additionally, proteomic data were used to identify a previously unknown genotypic variant of Leptospirillum group II, an indication of selection for a specific Leptospirillum group II population in laboratory communities. Our results confirm the importance of pH and related geochemical factors in fine-tuning acidophilic microbial community structure and function at the species and strain level, and demonstrate the broad utility of proteomics in laboratory community studies.

  4. Quantitative analyses of the 3D nuclear landscape recorded with super-resolved fluorescence microscopy.

    PubMed

    Schmid, Volker J; Cremer, Marion; Cremer, Thomas

    2017-03-18

    Recent advances in super-resolved fluorescence microscopy have revolutionized microscopic studies of cells, including the exceedingly complex structural organization of cell nuclei in space and time. In this paper we describe and discuss tools for (semi-)automated, quantitative 3D analyses of spatial nuclear organization. These tools allow the quantitative assessment of different, highly resolved chromatin compaction levels in individual cell nuclei, which reflect functionally different regions or sub-compartments of the 3D nuclear landscape, and measurements of absolute distances between sites of different chromatin compaction. In addition, these tools allow 3D mapping of specific DNA/RNA sequences and nuclear proteins relative to the 3D chromatin compaction maps, and comparisons of multiple cell nuclei. The tools are available in the free and open source R packages nucim and bioimagetools. We discuss the use of masks for the segmentation of nuclei and the use of DNA stains, such as DAPI, as a proxy for local differences in chromatin compaction. We further discuss the limitations of 3D maps of the nuclear landscape as well as problems of the biological interpretation of such data.

  5. Automated and quantitative headspace in-tube extraction for the accurate determination of highly volatile compounds from wines and beers.

    PubMed

    Zapata, Julián; Mateo-Vivaracho, Laura; Lopez, Ricardo; Ferreira, Vicente

    2012-03-23

    An automatic headspace in-tube extraction (ITEX) method for the accurate determination of acetaldehyde, ethyl acetate, diacetyl and other volatile compounds from wine and beer has been developed and validated. Method accuracy is based on the nearly quantitative transfer of volatile compounds from the sample to the ITEX trap. To achieve that goal, most methodological aspects and parameters have been carefully examined. The vial and sample sizes and the trapping materials were found to be critical due to the pernicious saturation effects of ethanol. Small 2 mL vials containing very small amounts of sample (20 μL of 1:10 diluted sample) and a trap filled with 22 mg of Bond Elut ENV resins could guarantee a complete trapping of sample vapors. The complete extraction requires 100 × 0.5 mL pumping strokes at 60 °C and takes 24 min. Analytes are further desorbed at 240 °C into the GC injector under a 1:5 split ratio. The proportion of analytes finally transferred to the trap ranged from 85 to 99%. The validation of the method showed satisfactory figures of merit. Determination coefficients were better than 0.995 in all cases and good repeatability was also obtained (better than 7% in all cases). Reproducibility was better than 8.3% except for acetaldehyde (13.1%). Detection limits were below the odor detection thresholds of these target compounds in wine and beer and well below the normal ranges of occurrence. Recoveries were not significantly different from 100%, except in the case of acetaldehyde, for which it could be determined that the method is not able to break some of the adducts that this compound forms with sulfites. However, this problem was avoided by incubating the sample with glyoxal. The method can constitute a general and reliable alternative for the analysis of very volatile compounds in other difficult matrixes.
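
    Two of the validation figures quoted above, recovery and repeatability, reduce to simple statistics over replicate determinations. A minimal sketch with hypothetical replicate concentrations:

```python
# Sketch: percent recovery against a spiked reference and relative standard
# deviation (RSD) of replicates. The replicate values are hypothetical.
import statistics

def recovery_percent(measured_mean, spiked_true):
    return 100.0 * measured_mean / spiked_true

def rsd_percent(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

diacetyl_replicates = [98.2, 101.5, 96.8, 103.0, 99.4]  # hypothetical ug/L
print(recovery_percent(statistics.mean(diacetyl_replicates), 100.0))
print(rsd_percent(diacetyl_replicates))
```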

  6. Insight in genome-wide association of metabolite quantitative traits by exome sequence analyses.

    PubMed

    Demirkan, Ayşe; Henneman, Peter; Verhoeven, Aswin; Dharuri, Harish; Amin, Najaf; van Klinken, Jan Bert; Karssen, Lennart C; de Vries, Boukje; Meissner, Axel; Göraler, Sibel; van den Maagdenberg, Arn M J M; Deelder, André M; C 't Hoen, Peter A; van Duijn, Cornelia M; van Dijk, Ko Willems

    2015-01-01

    Metabolite quantitative traits carry great promise for epidemiological studies, and their genetic background has been addressed using Genome-Wide Association Studies (GWAS). Thus far, the role of less common variants has not been exhaustively studied. Here, we set out a GWAS for metabolite quantitative traits in serum, followed by exome sequence analysis to zoom in on putative causal variants in the associated genes. 1H Nuclear Magnetic Resonance (1H-NMR) spectroscopy experiments yielded successful quantification of 42 unique metabolites in 2,482 individuals from The Erasmus Rucphen Family (ERF) study. Heritabilities of the metabolites were estimated with SOLAR. GWAS was performed by linear mixed models, using HapMap imputations. Based on physical vicinity and pathway analyses, candidate genes were screened for coding region variation using exome sequence data. Heritability estimates for metabolites ranged between 10% and 52%. GWAS replicated three known loci at metabolome-wide significance: CPS1 with glycine (P-value = 1.27×10^−32), PRODH with proline (P-value = 1.11×10^−19), and SLC16A9 with carnitine level (P-value = 4.81×10^−14), and uncovered a novel association between DMGDH and dimethyl-glycine level (P-value = 1.65×10^−19). In addition, we found three novel, suggestively significant loci: TNP1 with pyruvate (P-value = 1.26×10^−8), KCNJ16 with 3-hydroxybutyrate (P-value = 1.65×10^−8) and the 2p12 locus with valine (P-value = 3.49×10^−8). Exome sequence analysis identified potentially causal coding and regulatory variants located in the genes CPS1, KCNJ2 and PRODH, and revealed allelic heterogeneity for CPS1 and PRODH. Combined GWAS and exome analyses of metabolites detected by high-resolution 1H-NMR is a robust approach to uncover metabolite quantitative trait loci (mQTL), and the likely causative variants in these loci. It is anticipated that insight in the genetics of intermediate phenotypes will provide additional insight into the

  7. Insight in Genome-Wide Association of Metabolite Quantitative Traits by Exome Sequence Analyses

    PubMed Central

    Verhoeven, Aswin; Dharuri, Harish; Amin, Najaf; van Klinken, Jan Bert; Karssen, Lennart C.; de Vries, Boukje; Meissner, Axel; Göraler, Sibel; van den Maagdenberg, Arn M. J. M.; Deelder, André M.; C ’t Hoen, Peter A.; van Duijn, Cornelia M.; van Dijk, Ko Willems

    2015-01-01

    Metabolite quantitative traits carry great promise for epidemiological studies, and their genetic background has been addressed using Genome-Wide Association Studies (GWAS). Thus far, the role of less common variants has not been exhaustively studied. Here, we set out a GWAS for metabolite quantitative traits in serum, followed by exome sequence analysis to zoom in on putative causal variants in the associated genes. 1H Nuclear Magnetic Resonance (1H-NMR) spectroscopy experiments yielded successful quantification of 42 unique metabolites in 2,482 individuals from The Erasmus Rucphen Family (ERF) study. Heritabilities of the metabolites were estimated with SOLAR. GWAS was performed by linear mixed models, using HapMap imputations. Based on physical vicinity and pathway analyses, candidate genes were screened for coding region variation using exome sequence data. Heritability estimates for metabolites ranged between 10% and 52%. GWAS replicated three known loci at metabolome-wide significance: CPS1 with glycine (P-value = 1.27×10−32), PRODH with proline (P-value = 1.11×10−19), and SLC16A9 with carnitine level (P-value = 4.81×10−14), and uncovered a novel association between DMGDH and dimethyl-glycine level (P-value = 1.65×10−19). In addition, we found three novel, suggestively significant loci: TNP1 with pyruvate (P-value = 1.26×10−8), KCNJ16 with 3-hydroxybutyrate (P-value = 1.65×10−8) and the 2p12 locus with valine (P-value = 3.49×10−8). Exome sequence analysis identified potentially causal coding and regulatory variants located in the genes CPS1, KCNJ2 and PRODH, and revealed allelic heterogeneity for CPS1 and PRODH. Combined GWAS and exome analyses of metabolites detected by high-resolution 1H-NMR is a robust approach to uncover metabolite quantitative trait loci (mQTL), and the likely causative variants in these loci. It is anticipated that insight in the genetics of intermediate phenotypes will provide additional

  8. Quantitative stability analyses of multiwall carbon nanotube nanofluids following water/ice phase change cycling

    NASA Astrophysics Data System (ADS)

    Ivall, Jason; Langlois-Rahme, Gabriel; Coulombe, Sylvain; Servio, Phillip

    2017-02-01

    Multiwall carbon nanotube nanofluids are regularly investigated for phase change enhancement between liquid and solid states owing to their improved heat transfer properties. The potential applications are numerous, the most notable being latent heat thermal energy storage, but the success of all nanofluid-assisted technologies hinges greatly on the ability of nanoparticles to remain stably dispersed after repeated phase change cycles. In this report, the stability of aqueous nanofluids made from oxygen-functionalized multiwall carbon nanotubes (f-MWCNTs) was profiled over the course of 20 freeze/thaw cycles. Sonication was used after each cycle to re-disperse clusters formed from the crystallization process. This study offers a quantitative evaluation of f-MWCNT-nanofluid stability as a result of phase change through optical characterization of concentration and particle size. It also provides insight into the integrity of the surface functionalities through zeta potential and XPS analyses. Concentration and particle size measurements showed moderate and consistent recoverability of f-MWCNT dispersion following ultrasonication. XPS measurements of solid-state MWCNTs exposed to freeze/thaw cycling in water, and zeta potential analyses of the nanofluids indicate that the surface oxygen content is preserved throughout phase change and over repeated cycles. These results suggest a resilience of oxygen-functionalized MWCNTs to the freezing and thawing of water, which is ideal for their utilization as phase change enhancers.

  9. Quantitative stability analyses of multiwall carbon nanotube nanofluids following water/ice phase change cycling.

    PubMed

    Ivall, Jason; Langlois-Rahme, Gabriel; Coulombe, Sylvain; Servio, Phillip

    2017-02-03

    Multiwall carbon nanotube nanofluids are regularly investigated for phase change enhancement between liquid and solid states owing to their improved heat transfer properties. The potential applications are numerous, the most notable being latent heat thermal energy storage, but the success of all nanofluid-assisted technologies hinges greatly on the ability of nanoparticles to remain stably dispersed after repeated phase change cycles. In this report, the stability of aqueous nanofluids made from oxygen-functionalized multiwall carbon nanotubes (f-MWCNTs) was profiled over the course of 20 freeze/thaw cycles. Sonication was used after each cycle to re-disperse clusters formed from the crystallization process. This study offers a quantitative evaluation of f-MWCNT-nanofluid stability as a result of phase change through optical characterization of concentration and particle size. It also provides insight into the integrity of the surface functionalities through zeta potential and XPS analyses. Concentration and particle size measurements showed moderate and consistent recoverability of f-MWCNT dispersion following ultrasonication. XPS measurements of solid-state MWCNTs exposed to freeze/thaw cycling in water, and zeta potential analyses of the nanofluids indicate that the surface oxygen content is preserved throughout phase change and over repeated cycles. These results suggest a resilience of oxygen-functionalized MWCNTs to the freezing and thawing of water, which is ideal for their utilization as phase change enhancers.

  10. Genome-wide Linkage Analyses of Quantitative and Categorical Autism Subphenotypes

    PubMed Central

    Liu, Xiao-Qing; Paterson, Andrew D.; Szatmari, Peter

    2008-01-01

    Background The search for susceptibility genes in autism and autism spectrum disorders (ASD) has been hindered by the possible small effects of individual genes and by genetic (locus) heterogeneity. To overcome these obstacles, one method is to use autism-related subphenotypes instead of the categorical diagnosis of autism since they may be more directly related to the underlying susceptibility loci. Another strategy is to analyze subsets of families that meet certain clinical criteria to reduce genetic heterogeneity. Methods In this study, using 976 multiplex families from the Autism Genome Project consortium, we performed genome-wide linkage analyses on two quantitative subphenotypes, the total scores of the reciprocal social interaction domain and the restricted, repetitive, and stereotyped patterns of behavior domain from the Autism Diagnostic Interview-Revised. We also selected subsets of ASD families based on four binary subphenotypes, delayed onset of first words, delayed onset of first phrases, verbal status, and IQ ≥ 70. Results When the ASD families with IQ ≥ 70 were used, a logarithm of odds (LOD) score of 4.01 was obtained on chromosome 15q13.3-q14, which was previously linked to schizophrenia. We also obtained a LOD score of 3.40 on chromosome 11p15.4-p15.3 using the ASD families with delayed onset of first phrases. No significant evidence for linkage was obtained for the two quantitative traits. Conclusions This study demonstrates that selection of informative subphenotypes to define a homogeneous set of ASD families could be very important in detecting the susceptibility loci in autism. PMID:18632090
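
    For orientation, the LOD score reported above is the base-10 logarithm of a likelihood ratio against free recombination. A toy sketch for phase-known meioses, where L(theta) = theta^R * (1-theta)^(N-R); real multipoint linkage computations are far more involved:

```python
# Sketch of a LOD score for phase-known meioses: with R recombinants out of
# N informative meioses, the LOD compares the likelihood at the estimated
# recombination fraction to that at theta = 0.5. Counts are hypothetical.
import math

def lod(recombinants, meioses):
    theta_hat = max(recombinants / meioses, 1e-9)  # MLE, guarded against zero
    def loglik(theta):
        return (recombinants * math.log10(theta)
                + (meioses - recombinants) * math.log10(1.0 - theta))
    return loglik(theta_hat) - loglik(0.5)

print(round(lod(2, 20), 2))  # e.g. 2 recombinants in 20 meioses -> ~3.2
```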

  11. Quantitative XRD HW-IR plot for clay mineral domain size and lattice strain analyses

    NASA Astrophysics Data System (ADS)

    Wang, H. J.; Chen, D. Z.; Zhou, J.; Chen, T.; Wang, H.; Zhang, Z. Q.

    2003-04-01

    Based on the integral-breadth method, one of the three basic XRD methods (Klug & Alexander, 1974), the authors (2000) proposed a qualitative half width (HW)-intensity ratio (IR) plot for clay mineral domain size and lattice strain analyses. In this study, the quantitative HW-IR plot is further developed on the basis of i) the curve relation between the Voigt function and the Pearson VII function; and ii) the relationship between the Kübler index and the Weaver index. By numerical simulation, a curve relation is derived between the shape indexes k of the Voigt function and u of the Pearson VII function. With this curve relation, k and u can be converted into each other with an accuracy of one ten-thousandth, and therefore the domain size and lattice strain contributions can be precisely separated from an XRD peak according to Langford's (1978) formula. For micaceous minerals, the HW-IR plot requires only a pair of values of the Kübler index and the Weaver index from the 1 nm reflection. For other clay minerals, the plot needs a pair of values of the (00l) peak's half width and intensity ratio IR, where IR is the ratio of the peak maximum to the intensity at the position of the maximum minus 0.422° Δ2Θ in CuKα radiation. This quantitative plot yields the mean dimension of clay particles perpendicular to the reflection plane (00l) and an approximate upper-limit strain normal to d001. The accuracy for domain size analysis reaches one tenth of a nanometre and that for lattice strain analysis is one ten-thousandth. This plot method can be used with any digital X-ray diffractometer whose XRD data can be converted into text format. Excel 5.0 or later versions in both English and Chinese can well support the HW-IR plot. This study was supported by NNSFC (Grant No 40272022).
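
    The size/strain separation step referred to above (Langford, 1978) attributes the Cauchy component of the integral breadth to domain size and the Gaussian component to lattice strain. A minimal sketch of that arithmetic; the breadth values and the Scherrer constant below are illustrative:

```python
# Sketch of a Langford-style single-line size/strain separation: domain size
# from the Cauchy breadth component, strain from the Gaussian component.
import math

WAVELENGTH_NM = 0.15406  # Cu K-alpha wavelength

def size_and_strain(beta_cauchy_rad, beta_gauss_rad, two_theta_deg, k=1.0):
    theta = math.radians(two_theta_deg / 2.0)
    size_nm = k * WAVELENGTH_NM / (beta_cauchy_rad * math.cos(theta))
    strain = beta_gauss_rad / (4.0 * math.tan(theta))
    return size_nm, strain

# hypothetical breadth components of an illite 1 nm (001) reflection
print(size_and_strain(beta_cauchy_rad=0.006, beta_gauss_rad=0.002,
                      two_theta_deg=8.8))
```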

  12. A new material mapping procedure for quantitative computed tomography-based, continuum finite element analyses of the vertebra.

    PubMed

    Unnikrishnan, Ginu U; Morgan, Elise F

    2011-07-01

    Inaccuracies in the estimation of material properties and errors in the assignment of these properties into finite element models limit the reliability, accuracy, and precision of quantitative computed tomography (QCT)-based finite element analyses of the vertebra. In this work, a new mesh-independent, material mapping procedure was developed to improve the quality of predictions of vertebral mechanical behavior from QCT-based finite element models. In this procedure, an intermediate step, called the material block model, was introduced to determine the distribution of material properties based on bone mineral density, and these properties were then mapped onto the finite element mesh. A sensitivity study was first conducted on a calibration phantom to understand the influence of the size of the material blocks on the computed bone mineral density. It was observed that varying the material block size produced only marginal changes in the predictions of mineral density. Finite element (FE) analyses were then conducted on a square column-shaped region of the vertebra and also on the entire vertebra in order to study the effect of material block size on the FE-derived outcomes. The predicted values of stiffness for the column and the vertebra decreased with decreasing block size. When these results were compared to those of a mesh convergence analysis, it was found that the influence of element size on vertebral stiffness was less than that of the material block size. This mapping procedure allows the material properties in a finite element study to be determined based on the block size required for an accurate representation of the material field, while the size of the finite elements can be selected independently and based on the required numerical accuracy of the finite element solution. The mesh-independent, material mapping procedure developed in this study could be particularly helpful in improving the accuracy of finite element analyses of vertebroplasty and
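
    The density-to-property step of such a mapping procedure is typically a power law applied to block-averaged mineral density. A minimal sketch under that assumption; the constants a and b below are placeholders, not the calibration used in the paper:

```python
# Sketch: average BMD over a material block, convert it to an elastic
# modulus with a power law, and assign that modulus to the covered elements.
import numpy as np

def block_modulus(bmd_voxels, a=8.0, b=1.5):
    """Power-law E = a * rho^b (GPa), rho in g/cm^3, averaged per block."""
    rho = float(np.mean(bmd_voxels))
    return a * max(rho, 0.0) ** b

block = np.array([0.21, 0.25, 0.19, 0.23])  # hypothetical BMD values (g/cm^3)
print(block_modulus(block))                  # modulus mapped onto the mesh
```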

  13. Improving Short Term Instability for Quantitative Analyses with Portable Electronic Noses

    PubMed Central

    Macías, Miguel Macías; Agudo, J. Enrique; Manso, Antonio García; Orellana, Carlos Javier García; Velasco, Horacio Manuel González; Caballero, Ramón Gallardo

    2014-01-01

    One of the main problems when working with electronic noses is the lack of reproducibility or repeatability of the sensor response; if this problem is not properly addressed, electronic noses can be useless, especially for quantitative analyses. Irreproducibility is further increased with portable, low-cost electronic noses, where laboratory equipment such as zero-gas generators cannot be used. In this work, we study the reproducibility of two portable electronic noses, the PEN3 (commercial) and the CAPINose (a proprietary design), using synthetic wine samples. We show that in both cases the short-term instability of the sensors' response to the same sample under the same conditions represents a major problem, and we propose an internal normalization technique that, in both cases, reduces the variability of the sensors' response. Finally, we show that the proposed normalization appears to be more effective in the CAPINose case, reducing, for example, the variability associated with the TGS2602 sensor from 12.19% to 2.2%. PMID:24932869
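
    The abstract does not spell out the internal normalization, so the sketch below shows only one plausible form of it: each sensor reading is expressed relative to the summed response of the whole array for that measurement, cancelling multiplicative drift shared across the array.

```python
import numpy as np

def internally_normalize(responses):
    """responses: (n_measurements, n_sensors) steady-state sensor values.

    Dividing by the array total removes gain drift common to all sensors;
    this is an illustrative normalization, not necessarily the paper's.
    """
    responses = np.asarray(responses, dtype=float)
    return responses / responses.sum(axis=1, keepdims=True)

def repeatability_cv(responses):
    """Coefficient of variation (%) per sensor over repeated measurements."""
    responses = np.asarray(responses, dtype=float)
    return 100.0 * responses.std(axis=0, ddof=1) / responses.mean(axis=0)
```

    Comparing repeatability_cv(raw) with repeatability_cv(internally_normalize(raw)) over repeated measurements of the same sample mirrors the before/after variability comparison reported above.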

  14. Quantitative analyses of CD133 expression facilitate researches on tumor stem cells.

    PubMed

    Liao, Yongqiang; Hu, Xiaotong; Huang, Xuefeng; He, Chao

    2010-01-01

    CD133 is regarded as a marker of tumor-initiating cells in many tumors, including colorectal cancer. O'Brien et al. and Ricci-Vitiani et al. showed that primary colorectal tumors contain colorectal tumor stem (initiating) cells marked by the CD133 antigen. Using a genetic knock-in lacZ reporter mouse model, Shmelkov et al. challenged this increasingly influential viewpoint and drew two important conclusions. First, CD133 is distributed throughout the full range of tumor epithelial cells in the colon rather than being limited to a few cells. Second, CD133-negative colon tumor cells are also tumorigenic and are more inclined to metastasize. Based on these observations, we hypothesized that CD133 expression differs among tumor cells and that quantitative, not qualitative, analyses of CD133 abundance are necessary to determine the relationship between CD133 expression and tumor stem cell characteristics. To verify this hypothesis, the colorectal cancer cell line SW620 was cultured and sorted into CD133(Hi), CD133(Mid) and CD133(Low) subgroups using magnetic microbeads, and their xenograft biological characteristics were compared. The results showed that the CD133(Hi) subgroup of SW620 is closer to tumor-initiating cells in its biological characteristics than the CD133(Mid) and CD133(Low) subgroups, but the CD133(Low) subgroup remains tumorigenic. This supports the view that tumor-initiating capacity correlates with the abundance of CD133.

  15. Simultaneous Extraction from Bacterioplankton of Total RNA and DNA Suitable for Quantitative Structure and Function Analyses

    PubMed Central

    Weinbauer, Markus G.; Fritz, Ingo; Wenderoth, Dirk F.; Höfle, Manfred G.

    2002-01-01

    The aim of this study was to develop a protocol for the simultaneous extraction from bacterioplankton of RNA and DNA suitable for quantitative molecular analysis. By using a combined mechanical and chemical extraction method, the highest RNA and DNA yield was obtained with sodium lauryl sarcosinate-phenol or DivoLab-phenol as the extraction mix. The efficiency of extraction of nucleic acids was comparatively high and varied only moderately in gram-negative bacterial isolates and bacterioplankton (RNA, 52 to 66%; DNA, 43 to 61%); significant amounts of nucleic acids were also obtained for a gram-positive bacterial isolate (RNA, 20 to 30%; DNA, 20 to 25%). Reverse transcription-PCR and PCR amplification products of fragments of 16S rRNA and its genes were obtained from all isolates and communities, indicating that the extracted nucleic acids were intact and pure enough for community structure analyses. By using single-strand conformation polymorphism of fragments of 16S rRNA and its gene, community fingerprints were obtained from pond bacterioplankton. mRNA transcripts encoding fragments of the enzyme nitrite reductase gene (nir gene) could be detected in a pond water sample, indicating that the extraction method is also suitable for studying gene expression. The extraction method presented yields nucleic acids that can be used to perform structural and functional studies of bacterioplankton communities from a single sample. PMID:11872453

  16. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    PubMed Central

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-01-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates. PMID:25844042
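
    As an illustration of how TAFs are applied, the sketch below apportions an annual average daily traffic (AADT) estimate into an hourly volume by chaining month, day-type and hour factors. All factor values and names are placeholders; the study derives the real factors from four years of continuous count data and keeps separate sets for total and commercial vehicles.

```python
# Placeholder TAFs (mean of each set is 1 by construction in a real fit).
month_taf = {1: 0.92, 7: 1.06}                      # month of year
daytype_taf = {"weekday": 1.08, "saturday": 0.92,
               "sunday": 0.81, "holiday": 0.78}     # day type
hour_taf = {3: 0.15, 8: 1.90, 17: 2.05}             # hour of day

def hourly_volume(aadt, month, daytype, hour):
    """Expected vehicles in one hour = (AADT / 24) scaled by the three TAFs."""
    return (aadt / 24.0) * month_taf.get(month, 1.0) \
           * daytype_taf.get(daytype, 1.0) * hour_taf.get(hour, 1.0)

print(hourly_volume(40_000, month=7, daytype="weekday", hour=8))  # AM peak
```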

  17. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses.

    PubMed

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-04-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates.

  18. Temporal variation of traffic on highways and the development of accurate temporal allocation factors for air pollution analyses

    NASA Astrophysics Data System (ADS)

    Batterman, Stuart; Cook, Richard; Justin, Thomas

    2015-04-01

    Traffic activity encompasses the number, mix, speed and acceleration of vehicles on roadways. The temporal pattern and variation of traffic activity reflects vehicle use, congestion and safety issues, and it represents a major influence on emissions and concentrations of traffic-related air pollutants. Accurate characterization of vehicle flows is critical in analyzing and modeling urban and local-scale pollutants, especially in near-road environments and traffic corridors. This study describes methods to improve the characterization of temporal variation of traffic activity. Annual, monthly, daily and hourly temporal allocation factors (TAFs), which describe the expected temporal variation in traffic activity, were developed using four years of hourly traffic activity data recorded at 14 continuous counting stations across the Detroit, Michigan, U.S. region. Five sites also provided vehicle classification. TAF-based models provide a simple means to apportion annual average estimates of traffic volume to hourly estimates. The analysis shows the need to separate TAFs for total and commercial vehicles, and weekdays, Saturdays, Sundays and observed holidays. Using either site-specific or urban-wide TAFs, nearly all of the variation in historical traffic activity at the street scale could be explained; unexplained variation was attributed to adverse weather, traffic accidents and construction. The methods and results presented in this paper can improve air quality dispersion modeling of mobile sources, and can be used to evaluate and model temporal variation in ambient air quality monitoring data and exposure estimates.

  19. Study Quality in SLA: An Assessment of Designs, Analyses, and Reporting Practices in Quantitative L2 Research

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2013-01-01

    This study assesses research and reporting practices in quantitative second language (L2) research. A sample of 606 primary studies, published from 1990 to 2010 in "Language Learning" and "Studies in Second Language Acquisition," was collected and coded for designs, statistical analyses, reporting practices, and outcomes (i.e., effect…

  20. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses

    PubMed Central

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrates the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and linear quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating, owing to the predominance of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; the linear quantitative similarity (LQTS), however, was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was supported by quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be replaced by the LQPM based on chromatographic fingerprints when the purpose is to quantify all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation and the composition similarities have been calculated, LQPM can employ this classical mathematical model to effectively quantify the multiple components of ASF samples without any chemical standard. PMID:27529425
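
    For illustration only, the sketch below implements one plausible reading of the two similarity measures: a qualitative similarity as the Pearson correlation between sample and reference fingerprints, and a quantitative similarity as the ratio of summed responses. The paper's exact formulas may differ in detail.

```python
import numpy as np

def linear_qualitative_similarity(sample, reference):
    """Shape agreement between fingerprints, insensitive to overall content."""
    s, r = np.asarray(sample, float), np.asarray(reference, float)
    return np.corrcoef(s, r)[0, 1]

def linear_quantitative_similarity(sample, reference):
    """Total content of the sample relative to the reference, in percent."""
    s, r = np.asarray(sample, float), np.asarray(reference, float)
    return 100.0 * s.sum() / r.sum()
```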

  1. Quantitative analyses of cross-sectional shape of the distal radius in three species of macaques.

    PubMed

    Kikuchi, Yasuhiro

    2004-04-01

    I conducted quantitative analyses of the cross-sectional shape of the distal radial shaft in three species of macaques, which differ in locomotor behavior: semi-terrestrial Japanese macaques (Macaca fuscata), arboreal long-tailed macaques (M. fascicularis), and relatively terrestrial rhesus macaques (M. mulatta). I took CT scans of the distal radial shafts of a total of 180 specimens at the level of the inferior radio-ulnar articulation. From each CT image, the periosteal outline of the radius was traced automatically by a digital imaging technique. I determined five points (landmarks) on the outline by developing a standardized morphometric technique. Bone surface lengths were measured using these landmarks, and their soft tissue correlates were investigated. The results of this study were as follows: (1) Semi-terrestrial M. fuscata has features that are approximately intermediate between those of the other two species. M. fuscata has a relatively small groove for M. abductor pollicis longus and a large groove for Mm. extensor carpi radialis longus et brevis. These characters resemble those of M. fascicularis. On the other hand, the ulnar notch of M. fuscata is relatively large, a character which is similar to that of M. mulatta. Moreover, compared to the other two macaques, the surface of the flexor muscles of M. fuscata is intermediate in size. (2) The more terrestrial M. mulatta has a relatively large groove for M. abductor pollicis longus and a small groove for Mm. extensor carpi radialis longus et brevis. Moreover, M. mulatta has a relatively large ulnar notch and a small surface for the flexor muscles. (3) The arboreal M. fascicularis has similar features to those of M. fuscata for the first and second relative size index. However, in the ulnar notch, M. fascicularis has a peculiar character, and the surface for the flexor muscles is relatively large compared to those of the other two species. These results can be interpreted in terms of positional

  2. Can a quantitative simulation of an Otto engine be accurately rendered by a simple Novikov model with heat leak?

    NASA Astrophysics Data System (ADS)

    Fischer, A.; Hoffmann, K.-H.

    2004-03-01

    In this case study, a complex Otto engine simulation provides data that include, among other effects, losses due to heat conduction, exhaust losses and frictional losses. These data are used as a benchmark to test whether the Novikov engine with heat leak, a simple endoreversible model, can reproduce the complex engine behavior quantitatively through an appropriate choice of model parameters. The reproduction obtained proves to be of high quality.
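
    A minimal sketch of the Novikov model with heat leak follows: heat flows from the hot bath through a finite conductance into a reversible converter operating at an intermediate temperature, while a parallel leak conductance bypasses the engine entirely. All parameter names and values are illustrative, not fitted to the paper's engine data.

```python
import numpy as np

def novikov_power_and_efficiency(T_h, T_c, T_i, K=1.0, K_leak=0.1):
    """Endoreversible Novikov engine with a heat leak (illustrative units)."""
    q_in = K * (T_h - T_i)              # heat through the working branch
    q_leak = K_leak * (T_h - T_c)       # heat lost directly to the cold bath
    power = q_in * (1.0 - T_c / T_i)    # reversible conversion at T_i
    efficiency = power / (q_in + q_leak)
    return power, efficiency

# Maximum-power operation of the leak-free Novikov engine recovers the
# well-known Novikov/Curzon-Ahlborn efficiency 1 - sqrt(T_c / T_h):
T_h, T_c = 1200.0, 300.0
T_i_star = np.sqrt(T_h * T_c)
print(novikov_power_and_efficiency(T_h, T_c, T_i_star, K_leak=0.0))
```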

  3. Robust Algorithm for Alignment of Liquid Chromatography-Mass Spectrometry Analyses in an Accurate Mass and Time Tag Data Analysis Pipeline

    SciTech Connect

    Jaitly, Navdeep; Monroe, Matthew E.; Petyuk, Vladislav A.; Clauss, Therese RW; Adkins, Joshua N.; Smith, Richard D.

    2006-11-01

    Liquid chromatography coupled to mass spectrometry (LC-MS) and tandem mass spectrometry (LC-MS/MS) has become a standard technique for analyzing complex peptide mixtures to determine composition and relative quantity. Several high-throughput proteomics techniques attempt to combine complementary results from multiple LC-MS and LC-MS/MS analyses to provide more comprehensive and accurate results. To effectively collate results from these techniques, variations in mass and elution time measurements between related analyses are corrected by using algorithms designed to align the various types of results: LC-MS/MS vs. LC-MS/MS, LC-MS vs. LC-MS/MS, and LC-MS vs. LC-MS. Described herein are new algorithms referred to collectively as Liquid Chromatography based Mass Spectrometric Warping and Alignment of Retention times of Peptides (LCMSWARP) which use a dynamic elution time warping approach similar to traditional algorithms that correct variation in elution time using piecewise linear functions. LCMSWARP is compared to a linear alignment algorithm that assumes a linear transformation of elution time between analyses. LCMSWARP also corrects for drift in mass measurement accuracies that are often seen in an LC-MS analysis due to factors such as analyzer drift. We also describe the alignment of LC-MS results and provide examples of alignment of analyses from different chromatographic systems to demonstrate more complex transformation functions.
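
    An illustrative piecewise-linear retention-time alignment in the spirit of the approach described above (not the LCMSWARP implementation itself): matched feature pairs between two runs define anchor points, and every other elution time is corrected by linear interpolation between anchors.

```python
import numpy as np

def fit_piecewise_warp(times_run_a, times_run_b, n_segments=10):
    """Anchor points mapping run A elution times onto run B's time scale."""
    order = np.argsort(times_run_a)
    ta = np.asarray(times_run_a, float)[order]
    tb = np.asarray(times_run_b, float)[order]
    edges = np.quantile(ta, np.linspace(0.0, 1.0, n_segments + 1))
    anchors_a, anchors_b = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (ta >= lo) & (ta <= hi)
        if mask.any():  # one anchor per segment: median matched time
            anchors_a.append(np.median(ta[mask]))
            anchors_b.append(np.median(tb[mask]))
    return np.array(anchors_a), np.array(anchors_b)

def warp(times, anchors_a, anchors_b):
    """Piecewise-linear correction of elution times between the anchors."""
    return np.interp(times, anchors_a, anchors_b)
```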

  4. Quantitative and Qualitative Analyses of Exogenous and Endogenous Children in Some Reading Processes.

    ERIC Educational Resources Information Center

    Capobianco, Rudolph J.; Miller, Donald Y.

    The purpose of the present study was to investigate these aspects of the reading process: (1) silent and oral reading achievement, (2) pattern of reading errors, and (3) auditory and visual perception techniques. The fact that comparisons between the exogenous and endogenous groups on the quantitative and most of the qualitative aspects of test…

  5. Evaluation of the National Science Foundation's Local Course Improvement Program, Volume II: Quantitative Analyses.

    ERIC Educational Resources Information Center

    Kulik, James A.; And Others

    This report is the second of three volumes describing the results of the evaluation of the National Science Foundation (NSF) Local Course Improvement (LOCI) program. This volume describes the quantitative results of the program. Evaluation of the LOCI program involved answering questions in the areas of the need for science course improvement as…

  6. Quantitative Research Methods in Chaos and Complexity: From Probability to Post Hoc Regression Analyses

    ERIC Educational Resources Information Center

    Gilstrap, Donald L.

    2013-01-01

    In addition to qualitative methods presented in chaos and complexity theories in educational research, this article addresses quantitative methods that may show potential for future research studies. Although much in the social and behavioral sciences literature has focused on computer simulations, this article explores current chaos and…

  7. Infectious titres of sheep scrapie and bovine spongiform encephalopathy agents cannot be accurately predicted from quantitative laboratory test results.

    PubMed

    González, Lorenzo; Thorne, Leigh; Jeffrey, Martin; Martin, Stuart; Spiropoulos, John; Beck, Katy E; Lockey, Richard W; Vickery, Christopher M; Holder, Thomas; Terry, Linda

    2012-11-01

    It is widely accepted that abnormal forms of the prion protein (PrP) are the best surrogate marker for the infectious agent of prion diseases and, in practice, the detection of such disease-associated (PrP(d)) and/or protease-resistant (PrP(res)) forms of PrP is the cornerstone of diagnosis and surveillance of the transmissible spongiform encephalopathies (TSEs). Nevertheless, some studies question the consistent association between infectivity and abnormal PrP detection. To address this discrepancy, 11 brain samples of sheep affected with natural scrapie or experimental bovine spongiform encephalopathy were selected on the basis of the magnitude and predominant types of PrP(d) accumulation, as shown by immunohistochemical (IHC) examination; contralateral hemi-brain samples were inoculated at three different dilutions into transgenic mice overexpressing ovine PrP and were also subjected to quantitative analysis by three biochemical tests (BCTs). Six samples gave 'low' infectious titres (10⁶·⁵ to 10⁶·⁷ LD₅₀ g⁻¹) and five gave 'high' titres (10⁸·¹ to ≥ 10⁸·⁷ LD₅₀ g⁻¹) and, with the exception of the Western blot analysis, those two groups tended to correspond with samples with lower PrP(d)/PrP(res) results by IHC/BCTs. However, no statistical association could be confirmed due to high individual sample variability. It is concluded that although detection of abnormal forms of PrP by laboratory methods remains useful to confirm TSE infection, infectivity titres cannot be predicted from quantitative test results, at least for the TSE sources and host PRNP genotypes used in this study. Furthermore, the near-inverse correlation between infectious titres and Western blot results (high protease pre-treatment) argues for a dissociation between infectivity and PrP(res).

  8. DAPAR & ProStaR: software to perform statistical analyses in quantitative discovery proteomics.

    PubMed

    Wieczorek, Samuel; Combes, Florence; Lazar, Cosmin; Giai Gianetto, Quentin; Gatto, Laurent; Dorffer, Alexia; Hesse, Anne-Marie; Couté, Yohann; Ferro, Myriam; Bruley, Christophe; Burger, Thomas

    2017-01-01

    DAPAR and ProStaR are software tools for the statistical analysis of label-free XIC-based quantitative discovery proteomics experiments. DAPAR contains procedures to filter, normalize, impute missing values, aggregate peptide intensities, perform null hypothesis significance tests and select the most likely differentially abundant proteins at a given false discovery rate. ProStaR is a graphical user interface that provides user-friendly access to the DAPAR functionalities through a web browser.
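
    DAPAR and ProStaR are R tools, so the following is only a language-agnostic sketch (rendered here in Python) of the pipeline stages the abstract lists — normalize, impute, test, and control the false discovery rate — and does not reflect their actual API.

```python
import numpy as np
from scipy import stats

def analyze(intensities_a, intensities_b):
    """intensities_*: (n_proteins, n_replicates) log-scale intensities with NaNs."""
    a = np.asarray(intensities_a, float).copy()
    b = np.asarray(intensities_b, float).copy()
    # median normalization per replicate (column)
    a -= np.nanmedian(a, axis=0)
    b -= np.nanmedian(b, axis=0)
    # naive imputation: replace missing values by the protein's observed mean
    for m in (a, b):
        rows, cols = np.where(np.isnan(m))
        m[rows, cols] = np.take(np.nanmean(m, axis=1), rows)
    # per-protein significance test between conditions
    _, p = stats.ttest_ind(a, b, axis=1)
    # Benjamini-Hochberg false discovery rate
    order = np.argsort(p)
    ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
    fdr = np.minimum.accumulate(ranked[::-1])[::-1]
    out = np.empty_like(fdr)
    out[order] = np.minimum(fdr, 1.0)
    return p, out
```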

  9. Quantitative and confirmatory analyses of malachite green and leucomalachite green residues in fish and shrimp.

    PubMed

    Andersen, Wendy C; Turnipseed, Sherri B; Roybal, José E

    2006-06-28

    Liquid chromatographic methods are presented for the quantitative and confirmatory determination of malachite green (MG) and leucomalachite green (LMG) in channel catfish, rainbow trout, tilapia, basa, Atlantic salmon, and tiger shrimp. Residues were extracted from tissues with ammonium acetate buffer and acetonitrile and isolated by partitioning into dichloromethane. LMG was quantitatively oxidized to the chromophoric MG with 2,3-dichloro-5,6-dicyano-1,4-benzoquinone. Extracts were analyzed for total MG by liquid chromatography with both visible detection (LC-VIS) at 618 nm for routine screening and ion trap mass spectrometry (LC-MSn) with no-discharge atmospheric pressure chemical ionization for residue confirmation. The method was validated in each species fortified with LMG at 1, 2, 4, and 10 ng/g (ppb); average recoveries ranged from 85.9 to 93.9%. Quantitative data were consistent between the two detection methods, with measured method detection limits of 1.0 ng/g for LC-VIS and 0.25 ng/g for LC-MSn. Incurred tissues from catfish, trout, tilapia, and salmon that had been treated with MG were also extracted and analyzed as part of this study.

  10. Multi-functionality of computer-aided quantitative vertebral fracture morphometry analyses

    PubMed Central

    Oei, Ling; Ly, Felisia; El Saddy, Salih; Makurthou, Ater A.; Hofman, Albert; van Rooij, Frank J. A.; Uitterlinden, André G.; Zillikens, M. Carola; Rivadeneira, Fernando

    2013-01-01

    Osteoporotic vertebral fractures are an increasingly active area of research. Assessments are often performed by software-assisted quantitative morphometry. Here, we discuss the multi-functionality of these data for research purposes. A team of trained research assistants processed lateral spine radiographs from the population-based Rotterdam Study with SpineAnalyzer® software (Optasia Medical Ltd, Cheadle, UK). Next, the raw coordinate data of the two upper corners of Th5 and the two lower corners of Th12 were extracted to calculate the Cobb's kyphosis angle. In addition, two readers performed independent manual measurements of the Cobb's kyphosis angle between Th5 and Th12 for a sample (n=99). The mean kyphosis angle and its standard deviation were 53° and 10° for the SpineAnalyzer® software measurements and 54° and 12° for the manual measurements, respectively. The Pearson's correlation coefficient was 0.65 [95% confidence interval (CI): 0.53-0.75; P=2×10⁻¹³]. There was a substantial intraclass correlation, with a coefficient of 0.64 (95% CI: 0.51-0.74). The mean difference between methods was 1° (95% CI: −2° to 4°), with 95% limits of agreement of −20° to 17°, and there were no systematic biases. In conclusion, vertebral fracture morphometry data can be used to derive the Cobb's kyphosis angle. Even more quantitative measures could be derived from the raw data, such as vertebral wedging, intervertebral disc space, spondylolisthesis and the lordosis angle. These measures may be of interest for research into musculoskeletal disorders such as osteoporosis, degenerative disease or Scheuermann's disease. Large-scale studies may benefit from efficient capture of multiple quantitative measures in the spine. PMID:24273742
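
    As a worked example of the derivation described above, the Cobb angle is the angle between the line through the two upper corners of Th5 and the line through the two lower corners of Th12; the corner coordinates below are invented.

```python
import numpy as np

def cobb_angle(th5_upper, th12_lower):
    """Angle (degrees) between the Th5 upper and Th12 lower endplate lines."""
    v1 = np.subtract(th5_upper[1], th5_upper[0])    # Th5 upper endplate direction
    v2 = np.subtract(th12_lower[1], th12_lower[0])  # Th12 lower endplate direction
    cosang = abs(np.dot(v1, v2)) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# hypothetical (x, y) image coordinates of the four extracted corners
print(cobb_angle([(0.0, 0.0), (30.0, -18.0)],
                 [(5.0, -120.0), (35.0, -110.0)]))  # ~49 degrees
```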

  11. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    SciTech Connect

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; Parks, Jerry M.

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of

  12. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid–Base and Ligand Binding Equilibria of Aquacobalamin

    DOE PAGES

    Johnston, Ryne C.; Zhou, Jing; Smith, Jeremy C.; ...

    2016-07-08

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co ligand binding equilibrium constants (Kon/off), pKas and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for CoIII, CoII, and CoI species, respectively, and the second model features saturation of each vacant axial coordination site on CoII and CoI species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co axial ligand binding, leading to substantial errors in predicted

  13. Toward Quantitatively Accurate Calculation of the Redox-Associated Acid-Base and Ligand Binding Equilibria of Aquacobalamin.

    PubMed

    Johnston, Ryne C; Zhou, Jing; Smith, Jeremy C; Parks, Jerry M

    2016-08-04

    Redox processes in complex transition metal-containing species are often intimately associated with changes in ligand protonation states and metal coordination number. A major challenge is therefore to develop consistent computational approaches for computing pH-dependent redox and ligand dissociation properties of organometallic species. Reduction of the Co center in the vitamin B12 derivative aquacobalamin can be accompanied by ligand dissociation, protonation, or both, making these properties difficult to compute accurately. We examine this challenge here by using density functional theory and continuum solvation to compute Co-ligand binding equilibrium constants (Kon/off), pKas, and reduction potentials for models of aquacobalamin in aqueous solution. We consider two models for cobalamin ligand coordination: the first follows the hexa, penta, tetra coordination scheme for Co(III), Co(II), and Co(I) species, respectively, and the second model features saturation of each vacant axial coordination site on Co(II) and Co(I) species with a single, explicit water molecule to maintain six directly interacting ligands or water molecules in each oxidation state. Comparing these two coordination schemes in combination with five dispersion-corrected density functionals, we find that the accuracy of the computed properties is largely independent of the scheme used, but including only a continuum representation of the solvent yields marginally better results than saturating the first solvation shell around Co throughout. PBE performs best, displaying balanced accuracy and superior performance overall, with RMS errors of 80 mV for seven reduction potentials, 2.0 log units for five pKas and 2.3 log units for two log Kon/off values for the aquacobalamin system. Furthermore, we find that the BP86 functional commonly used in corrinoid studies suffers from erratic behavior and inaccurate descriptions of Co-axial ligand binding, leading to substantial errors in predicted pKas and
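
    The quantities targeted in these three records follow from standard thermodynamic relations, sketched below: a reduction potential from the free energy of reduction, and a pKa from the deprotonation free energy. The numerical inputs are placeholders, and the absolute standard hydrogen electrode potential is one common literature choice, not necessarily the value used in the paper.

```python
import numpy as np

F = 96485.33    # Faraday constant, C/mol
R = 8.31446     # gas constant, J/(mol K)
T = 298.15      # temperature, K
SHE = 4.28      # absolute SHE potential, V (one common choice)

def reduction_potential(dG_red_kJ, n_electrons=1):
    """E vs. SHE from the aqueous free energy of reduction (kJ/mol)."""
    return -dG_red_kJ * 1000.0 / (n_electrons * F) - SHE

def pKa(dG_deprot_kJ):
    """pKa from the aqueous deprotonation free energy (kJ/mol)."""
    return dG_deprot_kJ * 1000.0 / (R * T * np.log(10.0))

print(reduction_potential(-400.0), pKa(40.0))  # placeholder inputs
```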

  14. Can MRI accurately detect pilon articular malreduction? A quantitative comparison between CT and 3T MRI bone models

    PubMed Central

    Radzi, Shairah; Dlaska, Constantin Edmond; Cowin, Gary; Robinson, Mark; Pratap, Jit; Schuetz, Michael Andreas; Mishra, Sanjay

    2016-01-01

    Background: Pilon fracture reduction is a challenging surgery. Radiographs are commonly used to assess the quality of reduction, but are limited in revealing the remaining bone incongruities. The study aimed to develop a method for quantifying articular malreductions using 3D computed tomography (CT) and magnetic resonance imaging (MRI) models. Methods: CT and MRI data were acquired using three pairs of human cadaveric ankle specimens. Common tibial pilon fractures were simulated by performing osteotomies on the ankle specimens. Five of the created fractures [three AO type-B (43-B1) and two AO type-C (43-C1) fractures] were then reduced and stabilised using titanium implants, then rescanned. All datasets were reconstructed into CT and MRI models and analysed with respect to intra-articular steps and gaps, surface deviations, malrotations and maltranslations of the bone fragments. Results: Initial results reveal that type B fracture CT and MRI models differed by ~0.2 mm (step), ~0.18 mm (surface deviation), ~0.56° (rotation) and ~0.4 mm (translation). Type C fracture MRI models showed metal artefacts extending to the articular surface and were thus unsuitable for analysis. Type C fracture CT models differed from their CT and MRI contralateral models by ~0.15 mm (surface deviation), ~1.63° (rotation) and ~0.4 mm (translation). Conclusions: Type B fracture MRI models were comparable to CT and may potentially be used for the postoperative assessment of articular reduction on a case-by-case basis. PMID:28090442

  15. Quantitative analyses of spectral measurement error based on Monte-Carlo simulation

    NASA Astrophysics Data System (ADS)

    Jiang, Jingying; Ma, Congcong; Zhang, Qi; Lu, Junsheng; Xu, Kexin

    2015-03-01

    The spectral measurement error is governed by the resolution and sensitivity of the spectroscopic instrument and by the instability of the measurement environment. In this talk, the spectral measurement error is analyzed quantitatively using Monte Carlo (MC) simulation. Taking the floating reference point measurement as an example, there is unavoidably a deviation between the measuring position and the theoretical position owing to various influencing factors. In order to determine the error caused by the positioning accuracy of the measuring device, an MC simulation was carried out at a wavelength of 1310 nm for a 2% Intralipid solution, with 10¹⁰ photons and a ring sampling interval of 1 μm. The simulated data are analyzed on the basis of the thinning and calculating method (TCM) proposed in this talk. The results indicate that TCM can be used to quantitatively analyze the spectral measurement error introduced by positioning inaccuracy.

  16. Quantitative histological image analyses of reticulin fibers in a myelofibrotic mouse

    PubMed Central

    Lucero, Hector A.; Patterson, Shenia; Matsuura, Shinobu; Ravid, Katya

    2016-01-01

    Bone marrow (BM) reticulin fibrosis (RF), revealed by silver staining of tissue sections, is associated with myeloproliferative neoplasms, yet tools for quantitative assessment of reticulin deposition throughout a femur BM are still needed. Here, we present such a method, which, via analysis of hundreds of composite images, identified a patchy distribution of RF throughout the BM during disease progression in a mouse model of myelofibrosis. Initial conversion of silver-stained BM color images into binary images exposed two limitations: variable color, owing to polychromatic staining of reticulin fibers, which limits application of the color deconvolution method, and variable background in different sections of the same batch, which limits the use of a constant threshold. By blind-coding image identities to allow for threshold input (still within a narrow range) and using shape filtering to further eliminate background, we were able to quantitate RF in myelofibrotic Gata-1(low) (experimental) and wild-type (control) mice as a function of animal age. Color images spanning the whole femur BM were batch-analyzed using ImageJ software, aided by our two newly added macros. The results show heterogeneous RF density in different areas of the marrow of Gata-1(low) mice, with the degree of heterogeneity decreasing upon aging. This method can be applied uniformly across laboratories in studies assessing RF remodeling induced by aging or other conditions in animal models. PMID:28008415
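
    A rough scikit-image analogue of the two fixes described above (per-image thresholds entered for blind-coded images, plus shape filtering of the resulting objects) is sketched below; the elongation cutoff is our assumption, and the study itself worked with ImageJ macros.

```python
import numpy as np
from skimage import color, measure

def segment_fibers(rgb_image, threshold, min_elongation=3.0):
    """Binarize a silver-stained section and keep only fiber-like objects."""
    gray = color.rgb2gray(rgb_image)
    binary = gray < threshold                 # silver-stained fibers are dark
    labels = measure.label(binary)
    keep = np.zeros_like(binary)
    for region in measure.regionprops(labels):
        if region.minor_axis_length > 0:
            elongation = region.major_axis_length / region.minor_axis_length
            if elongation >= min_elongation:  # fiber-like, not rounded debris
                keep[labels == region.label] = True
    return keep  # RF area fraction for the field: keep.mean()
```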

  17. Quantitative analyses of RAG-RSS interactions and conformations revealed by atomic force microscopy.

    PubMed

    Pavlicek, Jeffrey W; Lyubchenko, Yuri L; Chang, Yung

    2008-10-28

    During V(D)J recombination, site-specific DNA excision is dictated by the binding of RAG1/2 proteins to the conserved recombination signal sequence (RSS) within the genome. The interaction between RAG1/2 and RSS is thought to involve a large DNA distortion that is permissive for DNA cleavage. In this study, using atomic force microscopy (AFM) imaging, we analyzed individual RAG-RSS complexes, in which the bending angle of RAG-associated RSS substrates could be visualized and quantified. We provide quantitative measurements of the conformations of specific RAG-12RSS complexes. Previous data indicating the necessity of RAG2 for recombination imply a structural role in the RAG-RSS complex. Surprisingly, however, no significant difference in conformational bending was observed with AFM between RAG1-12RSS and RAG1/2-12RSS. RAG1 was found sufficient to induce DNA bending, and the addition of RAG2 did not change the bending profile. In addition, a prenicked 12RSS bound by RAG1/2 proteins displayed a conformation similar to the one observed with the intact 12RSS, implying that no greater DNA bending occurs after the nicking step in the signal complex. Taken together, the quantitative AFM results on the components of the recombinase point to a tightly held complex with a bend angle near 60 degrees, which may be a prerequisite for the site-specific nicking by the V(D)J recombinase.
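
    As a toy illustration of how a bend angle is quantified from an AFM image, one can digitize a point on each DNA arm and the bend apex and measure the deviation from a straight molecule; the coordinates below are hypothetical pixel positions.

```python
import numpy as np

def bend_angle(arm1_end, apex, arm2_end):
    """DNA bend angle: 180 deg minus the angle between the two arm vectors."""
    v1 = np.subtract(arm1_end, apex)
    v2 = np.subtract(arm2_end, apex)
    inter = np.degrees(np.arccos(np.clip(
        np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2)), -1.0, 1.0)))
    return 180.0 - inter

print(bend_angle((0, 0), (50, 10), (100, -20)))  # ~42 degrees for this geometry
```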

  18. Quantitative analyses of tartaric acid based on terahertz time domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Cao, Binghua; Fan, Mengbao

    2010-10-01

    Terahertz radiation occupies the region of the electromagnetic spectrum between microwaves and the infrared. Quantitative analysis based on terahertz spectroscopy is very important for the application of terahertz techniques, but how to realize it is still under study. L-tartaric acid is widely used as an acidulant in beverages and other foods, such as soft drinks, wine, candy, bread and some colloidal sweetmeats. In this paper, terahertz time-domain spectroscopy is applied to quantify tartaric acid. Two methods are employed to process the terahertz spectra of samples with different tartaric acid contents. The first is linear regression combined with correlation analysis; the second is partial least squares (PLS), in which the absorption spectra in the 0.8-1.4 THz region are used to quantify the tartaric acid. To compare the performance of the two methods, their relative errors are analyzed. For this experiment, the first method performs better than the second. However, the first method is suitable for the quantitative analysis of materials that have obvious terahertz absorption peaks, while the second is more appropriate for materials without obvious terahertz absorption peaks.
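
    The second quantitation route lends itself to a short sketch: PLS regression of tartaric acid content on the 0.8-1.4 THz absorption spectra, evaluated by cross-validated relative error. The arrays below are random stand-ins for measured spectra and reference concentrations.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 120))          # 30 samples x 120 spectral points
y = rng.uniform(0.0, 10.0, size=30)     # tartaric acid content, arbitrary units

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rel_err = np.abs(y_cv - y) / np.maximum(y, 1e-9)
print(f"mean relative error: {rel_err.mean():.2%}")
```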

  19. Genome-Wide Pathway Association Studies of Multiple Correlated Quantitative Phenotypes Using Principal Component Analyses

    PubMed Central

    Zhang, Feng; Guo, Xiong; Wu, Shixun; Han, Jing; Liu, Yongjun; Shen, Hui; Deng, Hong-Wen

    2012-01-01

    Genome-wide pathway association studies provide novel insight into the biological mechanisms underlying complex diseases. Current pathway association studies primarily focus on a single important disease phenotype, which is sometimes insufficient to characterize the clinical manifestations of complex diseases. We present a multi-phenotype pathway association study (MPPAS) approach using principal component analysis (PCA). In our approach, PCA is first applied to multiple correlated quantitative phenotypes to extract a set of orthogonal phenotypic components. The extracted phenotypic components are then used for pathway association analysis instead of the original quantitative phenotypes. Four statistics were proposed for PCA-based MPPAS in this study. Simulations using real data from the HapMap project were conducted to evaluate the power and type I error rates of PCA-based MPPAS under various scenarios considering sample sizes as well as additive and interactive genetic effects. A real genome-wide association study data set of bone mineral density (BMD) at the hip and spine was also analyzed by PCA-based MPPAS. Simulation studies illustrated the performance of PCA-based MPPAS for identifying the causal pathways underlying complex diseases. Genome-wide MPPAS of BMD detected associations between BMD and the KENNY_CTNNB1_TARGETS_UP and LONGEVITYPATHWAY pathways in this study. We aim to provide an applicable MPPAS approach, which may help in gaining a deeper understanding of the potential biological mechanisms underlying association results for complex diseases. PMID:23285279
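
    A minimal sketch of the PCA step described above: correlated phenotypes are replaced by orthogonal components, which are then tested for genetic association. The per-SNP linear trend test stands in for the paper's pathway-level statistics, and all data are simulated.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy import stats

rng = np.random.default_rng(1)
phenotypes = rng.normal(size=(500, 4))          # 500 subjects x 4 correlated traits
genotypes = rng.integers(0, 3, size=(500, 10))  # 0/1/2 allele counts for 10 SNPs

# orthogonal phenotypic components replace the raw correlated phenotypes
components = PCA(n_components=2).fit_transform(phenotypes)

for j in range(genotypes.shape[1]):
    slope, _, r, p, _ = stats.linregress(genotypes[:, j], components[:, 0])
    if p < 0.05:
        print(f"SNP {j}: slope={slope:.3f}, p={p:.3g}")
```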

  20. Interfacial undercooling in solidification of colloidal suspensions: analyses with quantitative measurements

    PubMed Central

    You, Jiaxue; Wang, Lilin; Wang, Zhijun; Li, Junjie; Wang, Jincheng; Lin, Xin; Huang, Weidong

    2016-01-01

    Interfacial undercooling in the complex solidification of colloidal suspensions is of significance and remains a puzzling problem. Two types of interfacial undercooling are thought to be involved in the freezing of colloidal suspensions, i.e., solute constitutional supercooling (SCS) caused by additives in the solvent and particulate constitutional supercooling (PCS) caused by the particles. However, quantitative identification of the interfacial undercooling in the solidification of colloidal suspensions is still absent; thus, the question of which type of undercooling is dominant in this complex system remains unanswered. Here, we quantitatively measured the static and dynamic interface undercoolings of SCS and PCS in ideal and practical colloidal systems. We show that the interfacial undercooling comes primarily from SCS caused by the additives in the solvent, while PCS is minor. This finding implies that the thermodynamic effect of particles through PCS is not the fundamental physical mechanism for the pattern formation of cellular growth and lamellar structure in the solidification of colloidal suspensions, a general case of the ice-templating method. Instead, the patterns in the ice-templating method can be controlled effectively by adjusting the additives. PMID:27329394

  1. Quantitative Proteomic Analyses of Molecular Mechanisms Associated with Cytoplasmic Incompatibility in Drosophila melanogaster Induced by Wolbachia.

    PubMed

    Yuan, Lin-Ling; Chen, Xiulan; Zong, Qiong; Zhao, Ting; Wang, Jia-Lin; Zheng, Ya; Zhang, Ming; Wang, Zailong; Brownlie, Jeremy C; Yang, Fuquan; Wang, Yu-Feng

    2015-09-04

    To investigate the molecular mechanisms of cytoplasmic incompatibility (CI) induced by Wolbachia bacteria in Drosophila melanogaster, we applied an isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomic assay to identify differentially expressed proteins extracted from spermathecae and seminal receptacles (SSR) of uninfected females mated with either 1-day-old Wolbachia-uninfected (1T) or infected males (1W) or 5-day-old infected males (5W). In total, 1317 proteins were quantified; 83 proteins were identified as having at least a 1.5-fold change in expression when 1W was compared with 1T. Differentially expressed proteins were related to metabolism, immunity, and reproduction. Wolbachia changed the expression of seminal fluid proteins (Sfps). Wolbachia may disrupt the abundance of proteins in SSR by affecting ubiquitin-proteasome-mediated proteolysis. Knocking down two Sfp genes (CG9334 and CG2668) in Wolbachia-free males resulted in significantly lower embryonic hatch rates with a phenotype of chromatin bridges. Wolbachia-infected females may rescue the hatch rates. This suggests that the changed expression of some Sfps may be one of the mechanisms of CI induced by Wolbachia. This study provides a panel of candidate proteins that may be involved in the interaction between Wolbachia and their insect hosts and, through future functional studies, may help to elucidate the underlying mechanisms of Wolbachia-induced CI.

  2. Identification and validation of reference genes for accurate normalization of real-time quantitative PCR data in kiwifruit.

    PubMed

    Ferradás, Yolanda; Rey, Laura; Martínez, Óscar; Rey, Manuel; González, Ma Victoria

    2016-05-01

    Identification and validation of reference genes are required for the normalization of qPCR data. We studied the expression stability produced by eight primer pairs amplifying four common genes used as references for normalization. Samples representing different tissues, organs and developmental stages in kiwifruit (Actinidia chinensis var. deliciosa (A. Chev.) A. Chev.) were used. A total of 117 kiwifruit samples were divided into five sample sets (mature leaves, axillary buds, stigmatic arms, fruit flesh and seeds). All samples were also analysed as a single set. The expression stability of the candidate primer pairs was tested using three algorithms (geNorm, NormFinder and BestKeeper). The minimum number of reference genes necessary for normalization was also determined. A unique primer pair was selected for amplifying the 18S rRNA gene. The primer pair selected for amplifying the ACTIN gene differed depending on the sample set. 18S 2 and ACT 2 were the candidate primer pairs selected for normalization in three sample sets (mature leaves, fruit flesh and stigmatic arms), while 18S 2 and ACT 3 were selected for normalization in axillary buds. No primer pair could be selected for use as the reference for the seed sample set, and the analysis of all samples as a single set did not yield any stably expressed primer pair. Considering data previously reported in the literature, we validated the selected primer pairs by normalizing expression of the FLOWERING LOCUS T gene in kiwifruit.
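
    Of the three algorithms named above, geNorm is the most easily sketched: its stability measure M for a candidate gene is the mean standard deviation of the log-ratios of that gene against every other candidate across samples (Vandesompele et al., 2002); lower M indicates more stable expression. A minimal implementation:

```python
import numpy as np

def genorm_m(expression):
    """expression: (n_samples, n_genes) relative quantities on a linear scale."""
    expr = np.asarray(expression, float)
    n_genes = expr.shape[1]
    m_values = []
    for j in range(n_genes):
        # standard deviation of the log2 ratio against each other candidate
        sds = [np.std(np.log2(expr[:, j] / expr[:, k]), ddof=1)
               for k in range(n_genes) if k != j]
        m_values.append(np.mean(sds))
    return np.array(m_values)  # rank candidates: lowest M = most stable
```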

  3. Quantitative characterization of metastatic disease in the spine. Part II. Histogram-based analyses

    SciTech Connect

    Whyne, Cari; Hardisty, Michael; Wu, Florence; Skrinskas, Tomas; Clemons, Mark; Gordon, Lyle; Basran, Parminder S.

    2007-08-15

    Radiological imaging is essential to the appropriate management of patients with bone metastasis; however, there have been no widely accepted guidelines as to the optimal method for quantifying the potential impact of skeletal lesions or for evaluating response to treatment. The current inability to rapidly quantify the response of bone metastases excludes patients with cancer and bone disease from participating in clinical trials of many new treatments, as these studies frequently require patients with so-called measurable disease. Computed tomography (CT) can provide excellent skeletal detail with a sensitivity for the diagnosis of bone metastases. The purpose of this study was to establish an objective method to quantitatively characterize disease in the bony spine using CT-based segmentations. It was hypothesized that histogram analysis of CT vertebral density distributions would enable standardized segmentation of tumor tissue and consequently allow quantification of disease in the metastatic spine. Thirty-two healthy vertebral CT scans were first studied to establish a baseline characterization. The histograms of the trabecular centrums were found to be Gaussian distributions (average root-mean-square difference = 30 voxel counts), as expected for a uniform material. Intrapatient vertebral-level similarity was also observed, as the means were not significantly different (p>0.8). Thus, a patient-specific healthy vertebral body histogram is able to characterize healthy trabecular bone throughout that individual's thoracolumbar spine. Eleven metastatically involved vertebrae were analyzed to determine the characteristics of the lytic and blastic bone voxels relative to the healthy bone. Lytic and blastic tumors were segmented as connected areas with voxel intensities between specified thresholds. The tested thresholds were μ−1.0σ, μ−1.5σ, and μ−2.0σ for lytic and μ+2.0σ, μ+3.0σ, and μ+3.5σ for blastic tissue, where
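
    The histogram-based segmentation reduces to a few lines: fit μ and σ to the patient's healthy trabecular-bone intensity histogram, then label voxels beyond chosen multiples of σ. The threshold multipliers below follow the tested ranges in the abstract; array names are ours.

```python
import numpy as np

def segment_lesions(vertebra_hu, healthy_hu, lytic_k=1.5, blastic_k=3.0):
    """Label lytic/blastic voxels relative to the healthy Gaussian baseline."""
    mu = np.mean(healthy_hu)
    sigma = np.std(healthy_hu, ddof=1)
    lytic = vertebra_hu < mu - lytic_k * sigma      # abnormally low density
    blastic = vertebra_hu > mu + blastic_k * sigma  # abnormally high density
    return lytic, blastic
```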

  4. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: I, theoretical foundations.

    PubMed

    Pezzotti, Giuseppe; Zhu, Wenliang; Boffelli, Marco; Adachi, Tetsuya; Ichioka, Hiroaki; Yamamoto, Toshiro; Marunaka, Yoshinori; Kanamura, Narisato

    2015-05-01

    The Raman spectroscopic method has been quantitatively applied to the analysis of local crystallographic orientation in both single-crystal hydroxyapatite and human teeth. Raman selection rules for all the vibrational modes of the hexagonal structure were expanded into explicit functions of the Euler angles in space and six Raman tensor elements (RTE). A theoretical treatment has also been put forward according to the orientation distribution function (ODF) formalism, which allows one to resolve the statistical orientation patterns of the nm-sized hydroxyapatite crystallites comprised in the Raman microprobe. Closed-form solutions could be obtained for the Euler angles and their statistical distributions resolved with respect to the direction of the average texture axis. Polarized Raman spectra from single-crystalline hydroxyapatite and textured polycrystalline (tooth enamel) samples were compared, and a validation of the proposed Raman method could be obtained by confirming the agreement between RTE values obtained from different samples.

  5. In-field measurements of PCDF emissions from coal combustion and their quantitative analyses

    SciTech Connect

    Pehlivan, M.; Beduk, D.; Pehlivan, E.

    2008-07-01

    In this study, a series of polychlorinated dibenzofurans (PCDFs) emitted to the surrounding soil as a result of the combustion of coal and wood in industrial steam boilers and household stoves have been identified. Levels of PCDFs in soil samples were measured at different sites in proximity to the municipal solid waste incinerator (MSWI) to determine baseline contamination and the contributory role of incinerator emissions. PCDF contaminants were concentrated from soil samples and isolated from other materials by chromatographic methods. PCDF isomers were identified separately by column chromatography using columns packed with materials such as Kieselgel/44 vol% H₂SO₄, Macro Alumina B Super 1, mixed column, Bio-Beads S-X3 gel chromatography, and Min Alumina B Super 1 + Kieselgel/AgNO₃, and their quantitative determinations were performed by GC/MS (gas chromatography/mass spectrometry). The PCDF levels were subsequently compared with established values from previous studies.

  6. Vasoactive intestinal polypeptide immunoreactivity in the human cerebellum: qualitative and quantitative analyses

    PubMed Central

    Benagiano, Vincenzo; Flace, Paolo; Lorusso, Loredana; Rizzi, Anna; Bosco, Lorenzo; Cagiano, Raffaele; Ambrosi, Glauco

    2009-01-01

    Although autoradiographic, reverse transcription-polymerase chain reaction and immunohistochemical studies have demonstrated receptors for vasoactive intestinal polypeptide (VIP) in the cerebellum of various species, immunohistochemistry has never shown immunoreactivity for VIP within cerebellar neuronal bodies and processes. The present study aimed to ascertain whether VIP immunoreactivity really does exist in the human cerebellum by making a systematic analysis of samples removed post-mortem from all of the cerebellar lobes. The study was carried out using light microscopy immunohistochemical techniques based on a set of four different antibodies (three polyclonal and one monoclonal) against VIP, carefully selected on the basis of control tests performed on human colon. All of the antibodies used showed VIP-immunoreactive neuronal bodies and processes distributed in the cerebellar cortex and subjacent white matter of all of the cerebellum lobes, having similar qualitative patterns of distribution. Immunoreactive neurons included subpopulations of the main neuron types of the cortex. Statistical analysis of the quantitative data on the VIP immunoreactivity revealed by the different antibodies in the different cerebellar lobes did not demonstrate any significant differences. In conclusion, using four different anti-VIP antibodies, the first evidence of VIP immunoreactivity is herein supplied in the human post-mortem cerebellum, with similar qualitative/quantitative patterns of distribution among the different cerebellum lobes. Owing to the function performed by VIP as a neurotransmitter/neuromodulator, it is a candidate for a role in intrinsic and extrinsic (projective) circuits of the cerebellum, in agreement with previous demonstrations of receptors for VIP in the cerebellar cortex and nuclei. As VIP signalling pathways are implicated in the regulation of cognitive and psychic functions, cerebral blood flow and metabolism, processes of histomorphogenesis

  7. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2016-10-01

    Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to underlie their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy relies on the diagnostic neutral loss of the hydroxyl radical triggering MS3 fragmentation, which is observed only in positive ionization mode for DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy was implemented to assess variations in the levels of carbonyl compounds before and after exposure using deuterated d₃-DNPH. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at high-attomole levels. To the best of our knowledge, this is the first report of a method using HRAM neutral-loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
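
    The diagnostic neutral-loss check is easy to sketch: an MS2 product ion qualifies for MS3 triggering when the precursor-to-fragment mass difference matches the hydroxyl radical within a tight tolerance. The example masses correspond to a DNPH-derivatized acetaldehyde ion, and the ppm window is an assumption.

```python
OH_RADICAL = 15.9949146 + 1.0078250  # monoisotopic O + H = 17.0027 Da

def triggers_ms3(precursor_mz, fragment_mzs, tol_ppm=10.0):
    """Return fragments whose neutral loss from the precursor matches *OH."""
    tol = precursor_mz * tol_ppm / 1e6
    return [mz for mz in fragment_mzs
            if abs((precursor_mz - mz) - OH_RADICAL) <= tol]

# DNPH-acetaldehyde [M+H]+ at m/z 225.0618 losing *OH gives m/z 208.0591
print(triggers_ms3(225.0618, [208.0591, 179.0562]))  # -> [208.0591]
```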

  8. Qualitative and quantitative analyses of alkaloids in Uncaria species by UPLC-ESI-Q-TOF/MS.

    PubMed

    Wang, Hai-Bo; Qi, Wen; Zhang, Lin; Yuan, Dan

    2014-01-01

    An ultra-performance liquid chromatography (UPLC) coupled with quadrupole time-of-flight mass spectrometry (Q-TOF/MS) method was optimized and established for the rapid analysis of the alkaloids in 22 samples originating from five Uncaria (U.) species. The accurate mass measurement of all the protonated molecules and subsequent fragment ions offers high-quality structural information for the interpretation of the fragmentation pathways of the various groups of alkaloids. A total of 19 oxindole alkaloids, 16 indole alkaloids and 1 flavone were identified by co-chromatography of the sample extract with authentic standards, by comparison of retention times, characteristic molecular ions and fragment ions, or were tentatively identified by MS/MS determination. Moreover, the method was validated for the simultaneous quantification of the 24 components within 10.5 min. Potential chemical markers were identified for classification of the U. species samples by principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA). The results demonstrate the similarities and differences in alkaloids among the five U. species, which is helpful for the standardization and quality control of the medicinal material Uncariae Ramulus Cum Uncis (URCU). Furthermore, with multivariate statistical analysis, the determined markers are more definite and useful for the chemotaxonomy of the U. genus.

  9. Quantitative analysis of x-ray images with a television image analyser.

    PubMed

    Schleicher, A; Tillmann, B; Zilles, K

    1980-07-01

    A method for the quantitative evaluation of X-ray images is described. The image is decomposed into individual image points by a mechanical scanning procedure, and at each image point the area fraction of a measuring field not covered by silver grains is determined with an image analyzer. This parameter is interpreted as a value corresponding to a specific degree of film blackness. The relationship between the measured value and the X-ray absorption is described by standard curves. With the aid of an aluminum scale, the measured value can be expressed directly as the thickness of an aluminum equivalent with a corresponding X-ray absorption. Details are given about the adjustment of the image analyzer for detecting the silver grains, the resolution of different degrees of X-ray absorption, and the computer-controlled scanning procedure. An example demonstrates the method's applicability to the analysis of the density distribution of bony tissue around the human humero-ulnar joint. The procedure is not limited to the evaluation of X-rays, but is applicable whenever silver grains can be detected in a film layer by an image analyzer.

  10. Quantitative structure-activity relationship models of chemical transformations from matched pairs analyses.

    PubMed

    Beck, Jeremy M; Springer, Clayton

    2014-04-28

    The concepts of activity cliffs and matched molecular pairs (MMP) are recent paradigms for analysis of data sets to identify structural changes that may be used to modify the potency of lead molecules in drug discovery projects. Analysis of MMPs was recently demonstrated as a feasible technique for quantitative structure-activity relationship (QSAR) modeling of prospective compounds. However, within a small data set, the lack of matched pairs and the lack of knowledge about specific chemical transformations limit prospective applications. Here we present an alternative technique that determines pairwise descriptors for each matched pair and then uses a QSAR model to estimate the activity change associated with a chemical transformation. The descriptors effectively group similar transformations and incorporate information about the transformation and its local environment. Use of a transformation QSAR model allows one to estimate the activity change for novel transformations and therefore returns predictions for a larger fraction of test-set compounds. Application of the proposed methodology to four public data sets results in increased model performance over a benchmark random forest and direct application of chemical transformations using QSAR-by-matched molecular pairs analysis (QSAR-by-MMPA).
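
    The core idea of the transformation QSAR can be sketched as follows, under the assumption that per-compound descriptor vectors are available: each matched pair is encoded by the difference of the two compounds' descriptors, and a model is trained to predict the activity change. The descriptors, data, and model settings here are illustrative stand-ins, not the paper's.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(1)
        desc_a = rng.normal(size=(500, 128))     # descriptors of compound A in each pair
        desc_b = rng.normal(size=(500, 128))     # descriptors of compound B
        delta_activity = rng.normal(size=500)    # pIC50(B) - pIC50(A) for each pair

        pair_descriptors = desc_b - desc_a       # transformation + local-environment signal
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(pair_descriptors, delta_activity)

        # Predict the activity change of a transformation applied to a new compound
        print(model.predict(desc_b[:1] - desc_a[:1]))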

  11. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    USGS Publications Warehouse

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are, under what conditions, and in which regions; and thus which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts. We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  12. High-throughput, quantitative analyses of genetic interactions in E. coli.

    PubMed

    Typas, Athanasios; Nichols, Robert J; Siegele, Deborah A; Shales, Michael; Collins, Sean R; Lim, Bentley; Braberg, Hannes; Yamamoto, Natsuko; Takeuchi, Rikiya; Wanner, Barry L; Mori, Hirotada; Weissman, Jonathan S; Krogan, Nevan J; Gross, Carol A

    2008-09-01

    Large-scale genetic interaction studies provide the basis for defining gene function and pathway architecture. Recent advances in the ability to generate double mutants en masse in Saccharomyces cerevisiae have dramatically accelerated the acquisition of genetic interaction information and the biological inferences that follow. Here we describe a method based on F factor-driven conjugation, which allows for high-throughput generation of double mutants in Escherichia coli. This method, termed genetic interaction analysis technology for E. coli (GIANT-coli), permits us to systematically generate and array double-mutant cells on solid media in high-density arrays. We show that colony size provides a robust and quantitative output of cellular fitness and that GIANT-coli can recapitulate known synthetic interactions and identify previously unidentified negative (synthetic sickness or lethality) and positive (suppressive or epistatic) relationships. Finally, we describe a complementary strategy for genome-wide suppressor-mutant identification. Together, these methods permit rapid, large-scale genetic interaction studies in E. coli.
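
    As a sketch of how colony-size-derived fitness values translate into negative and positive interaction calls, the snippet below uses the common multiplicative-fitness score; this is a generic formulation, not necessarily the exact scoring pipeline used in GIANT-coli.

        def interaction_score(w_ab, w_a, w_b):
            """epsilon = observed double-mutant fitness minus the product of singles."""
            return w_ab - w_a * w_b

        # Colony sizes normalized to wild type (hypothetical numbers):
        print(interaction_score(w_ab=0.30, w_a=0.80, w_b=0.70))  # -0.26 -> negative (synthetic sick)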

  13. Laboratory Assay of Brood Care for Quantitative Analyses of Individual Differences in Honey Bee (Apis mellifera) Affiliative Behavior

    PubMed Central

    Shpigler, Hagai Y.; Robinson, Gene E.

    2015-01-01

    Care of offspring is a form of affiliative behavior that is fundamental to studies of animal social behavior. Insects do not figure prominently in this topic because Drosophila melanogaster and other traditional models show little if any paternal or maternal care. However, the eusocial honey bee exhibits cooperative brood care, with larvae receiving intense and continuous care from their adult sisters; this behavior has not been well studied because a robust quantitative assay does not exist. We present a new laboratory assay that enables quantification of group or individual honey bee brood “nursing behavior” toward a queen larva. In addition to validating the assay, we used it to examine the influence of the age of the larva and the genetic background of the adult bees on nursing performance. This new assay also can be used in the future for mechanistic analyses of eusociality and comparative analyses of affiliative behavior with other animals. PMID:26569402

  14. Comparative Genomics Analyses Reveal Extensive Chromosome Colinearity and Novel Quantitative Trait Loci in Eucalyptus.

    PubMed

    Li, Fagen; Zhou, Changpin; Weng, Qijie; Li, Mei; Yu, Xiaoli; Guo, Yong; Wang, Yu; Zhang, Xiaohong; Gan, Siming

    2015-01-01

    Dense genetic maps, along with quantitative trait loci (QTLs) detected on such maps, are powerful tools for genomics and molecular breeding studies. In the important woody genus Eucalyptus, the recent release of the E. grandis genome sequence allows for sequence-based genomic comparison and searching for positional candidate genes within QTL regions. Here, dense genetic maps were constructed for E. urophylla and E. tereticornis using genomic simple sequence repeat (SSR), expressed sequence tag (EST)-derived SSR, EST-derived cleaved amplified polymorphic sequence (EST-CAPS), and diversity arrays technology (DArT) markers. The E. urophylla and E. tereticornis maps comprised 700 and 585 markers across 11 linkage groups, totaling 1,208.2 and 1,241.4 cM in length, respectively. Extensive synteny and colinearity were observed in comparisons with three earlier DArT-based eucalypt maps (two maps of E. grandis × E. urophylla and one map of E. globulus) and with the E. grandis genome sequence. Fifty-three QTLs for growth (10-56 months of age) and wood density (56 months) were identified in 22 discrete regions on both maps, among which only one colocalization was found between growth and wood density. Novel QTLs were revealed as compared with those previously detected on DArT-based maps for similar ages in Eucalyptus. Eleven to 585 positional candidate genes were obtained for a 56-month-old QTL by aligning the QTL confidence interval with the E. grandis genome. These results will assist in comparative genomics studies, targeted gene characterization, and marker-assisted selection in Eucalyptus and related taxa.

  15. The superior analyses of igneous rocks from Roth's Tabellen, 1869 to 1884, arranged according to the quantitative system of classification

    USGS Publications Warehouse

    Washington, H.S.

    1904-01-01

    In Professional Paper No. 14 there were collected the chemical analyses of igneous rocks published from 1884 to 1900, inclusive, arranged according to the quantitative system of classification recently proposed by Cross, Iddings, Pirsson, and Washington. In order to supplement this work it has appeared advisable to select the more reliable and complete of the earlier analyses collected by Justus Roth and arrange them also in the same manner for publication. Petrographers would thus have available for use according to the new system almost the entire body of chemical work of real value on igneous rocks, the exceptions being a few analyses published prior to 1900 which may have been overlooked by both Roth and myself. The two collections would form a foundation as broad as possible for future research and discussion. I must express my sense of obligation to the United States Geological Survey for publishing the present collection of analyses, and my thanks to my colleagues in the new system of classification for their friendly advice and assistance. 

  16. Spatially quantitative models for vulnerability analyses and resilience measures in flood risk management: Case study Rafina, Greece

    NASA Astrophysics Data System (ADS)

    Karagiorgos, Konstantinos; Chiari, Michael; Hübl, Johannes; Maris, Fotis; Thaler, Thomas; Fuchs, Sven

    2013-04-01

    We will address spatially quantitative models for vulnerability analyses in flood risk management in the catchment of Rafina, 25 km east of Athens, Greece, and potential measures to reduce damage costs. The evaluation of flood damage losses is relatively advanced. Nevertheless, major problems arise since no market prices are available for the evaluation process. Moreover, there is a particular gap in quantifying the damages and the expenditures necessary for implementing mitigation measures with respect to flash floods. The key issue is to develop prototypes for assessing flood losses and the impact of mitigation measures on flood resilience by adjusting a vulnerability model, and to further develop the method in a Mediterranean region influenced by both mountain and coastal characteristics of land development. The objective of this study is to create a spatial and temporal analysis of the vulnerability factors based on a method combining spatially explicit loss data, data on the value of exposed elements at risk, and data on flood intensities. In this contribution, a methodology for the development of a flood damage assessment as a function of the process intensity and the degree of loss is presented. It is shown that (1) such relationships for defined object categories are dependent on site-specific and process-specific characteristics, but there is a correlation between process types that have similar characteristics; (2) existing semi-quantitative approaches of vulnerability assessment for elements at risk can be improved based on the proposed quantitative method; and (3) the concept of risk can be enhanced with respect to a standardised and comprehensive implementation by applying the vulnerability functions to be developed within the proposed research. Therefore, loss data were collected from the responsible administrative bodies and analysed on an object level. The used model is based on a basin-scale approach as well as data on the elements at risk exposed

  17. Quantitation and Identification of Intact Major Milk Proteins for High-Throughput LC-ESI-Q-TOF MS Analyses

    PubMed Central

    Vincent, Delphine; Elkins, Aaron; Condina, Mark R.; Ezernieks, Vilnis; Rochfort, Simone

    2016-01-01

    Cow’s milk is an important source of proteins in human nutrition. On average, cow’s milk contains 3.5% protein. The most abundant proteins in bovine milk are the caseins and some of the whey proteins, namely beta-lactoglobulin, alpha-lactalbumin, and serum albumin. A number of allelic variants and post-translationally modified forms of these proteins have been identified. Their occurrence varies with breed, individuality, stage of lactation, and the health and nutritional status of the animal. It is therefore essential to have reliable methods for the detection and quantitation of these proteins. Traditionally, major milk proteins are quantified using liquid chromatography (LC) with ultraviolet detection. However, as these protein variants co-elute to some degree, another dimension of separation is beneficial for accurately measuring their amounts. Mass spectrometry (MS) offers such a tool. In this study, we tested several RP-HPLC and MS parameters to optimise the analysis of intact bovine proteins from milk. From our tests, we developed an optimum method that includes a 20-28-40% phase B gradient with 0.02% TFA in both mobile phases, at a 0.2 mL/min flow rate, using 75°C for the C8 column temperature and scanning every 3 s over a 600–3000 m/z window. The optimisations were performed using commercially purchased external standards, for which ionisation efficiency, linearity of calibration, LOD, LOQ, sensitivity, selectivity, precision, reproducibility, and mass accuracy were demonstrated. From the MS analysis, we can use extracted ion chromatograms (EICs) of specific ion series of known proteins and integrate peaks within a defined retention time (RT) window for quantitation. This optimum quantitative method was successfully applied to two bulk milk samples from different breeds, Holstein-Friesian and Jersey, to assess differences in protein variant levels. PMID:27749892
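
    A minimal sketch of the EIC-based quantitation step described above, assuming centroided scan data are already in memory; the function, its arguments, and the synthetic example are hypothetical stand-ins.

        import numpy as np

        def eic_area(scans, target_mzs, mz_tol, rt_window):
            """scans: list of (rt, mz_array, intensity_array). Integrate the EIC over rt_window."""
            rt_lo, rt_hi = rt_window
            rts, heights = [], []
            targets = np.asarray(target_mzs)
            for rt, mzs, ints in scans:
                if rt_lo <= rt <= rt_hi:
                    mask = np.min(np.abs(mzs[:, None] - targets), axis=1) <= mz_tol
                    rts.append(rt)
                    heights.append(ints[mask].sum())
            return np.trapz(heights, rts)  # peak area used for relative quantitation

        # Tiny synthetic example: one ion-series member at m/z 1237.4 eluting near 12 min
        scans = [(rt, np.array([1237.4, 900.0]),
                  np.array([100.0 * np.exp(-(rt - 12.0) ** 2), 5.0]))
                 for rt in np.linspace(11, 13, 21)]
        print(round(eic_area(scans, target_mzs=[1237.4], mz_tol=0.05, rt_window=(11.5, 12.5)), 1))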

  18. Canagliflozin use in patients with renal impairment-Utility of quantitative clinical pharmacology analyses in dose optimization.

    PubMed

    Khurana, Manoj; Vaidyanathan, Jayabharathi; Marathe, Anshu; Mehrotra, Nitin; Sahajwalla, Chandrahas G; Zineh, Issam; Jain, Lokesh

    2015-06-01

    Canagliflozin (INVOKANA™) is approved as an adjunct to diet and exercise to improve glycemic control in adults with type 2 diabetes mellitus (T2DM). Canagliflozin inhibits renal sodium-glucose co-transporter 2 (SGLT2), thereby reducing reabsorption of filtered glucose and increasing urinary glucose excretion. Given the mechanism of action of SGLT2 inhibitors, we assessed the interplay between renal function, efficacy (HbA1c reduction), and safety (renal adverse reactions). The focus of this article is to highlight the FDA's quantitative clinical pharmacology analyses that were conducted to support the regulatory decision on dosing in patients with renal impairment (RI). The metric for assessing efficacy of T2DM drugs is standard; however, there is no standard method for evaluating the renal effects of diabetes drugs. Therefore, several analyses were conducted to assess the impact of canagliflozin on renal function (as measured by eGFR) based on the available data. These analyses provided support for approval of canagliflozin in T2DM patients with baseline eGFR ≥ 45 mL/min/1.73 m², highlighting a data-driven approach to dose optimization. The availability of a relatively rich safety dataset (i.e., frequent and early measurements of laboratory markers) in the canagliflozin clinical development program enabled adequate assessment of the benefit-risk balance in various patient subgroups based on renal function.

  19. How do trees grow? Response from the graphical and quantitative analyses of computed tomography scanning data collected on stem sections.

    PubMed

    Dutilleul, Pierre; Han, Li Wen; Beaulieu, Jean

    2014-06-01

    Tree growth, as measured via the width of annual rings, is used for environmental impact assessment and climate back-forecasting. This fascinating natural process has been studied at various scales in the stem (from the cell and fiber within a growth ring, to the ring and the entire stem) in one, two, and three dimensions. A new approach is presented to study tree growth in 3D from stem sections, at a scale sufficiently small to allow the delineation of reliable limits for annual rings and large enough to capture directional variation in growth rates. The technology applied is computed tomography scanning, which provides, for one stem section, millions of data points (indirect measures of wood density) that can be mapped, together with a companion measure of dispersion and growth ring limits in filigree. Graphical and quantitative analyses are reported for white spruce trees with circular vs non-circular growth. Implications for dendroclimatological research are discussed.

  20. Quantitative assessment of macular thickness in normal subjects and patients with diabetic retinopathy by scanning retinal thickness analyser

    PubMed Central

    Oshima, Y.; Emi, K.; Yamanishi, S.; Motokura, M.

    1999-01-01

    AIMS—To evaluate the scanning retinal thickness analyser (RTA), a novel non-invasive imaging instrument, in diagnosing and quantitatively characterising diabetic macular oedema, and to investigate the relation between central macula thickness measured by RTA and other clinical examinations.
METHODS—Central macular thickness was measured using the RTA in 40 normal subjects and 60 patients with diabetic retinopathy. The reproducibility of the retinal thickness measurements was evaluated by calculating the mean of the inter- and intrasession variations. Central macular thickness was correlated with the results of visual acuity measurements, biomicroscopy, and fluorescein angiography.
RESULTS—Intra- and intersession reproducibility of the RTA in normal subjects was ±5.2% (16 µm) and ±6.1% (19 µm), respectively. The mean central macular thickness was 182 (SD 16) µm in normal subjects, 283 (116) µm in diabetic eyes without clinically significant macular oedema (CSMO), and 564 (168) µm in diabetic eyes with CSMO. Central macular thickness was significantly greater (p<0.001) in eyes with diabetic retinopathy than in normal subjects, even when macular thickening did not meet the standard for CSMO (p=0.019) measured by biomicroscopy. Although greater fluorescein leakage at the macula results in greater central macular thickness, only eyes with diffuse leakage had statistically significant macular thickening compared with normal subjects (p=0.022). Central macular thickness measured with the RTA was significantly correlated with the logarithmically converted visual acuity (r² = 0.76) in diabetic eyes.
CONCLUSION—Scanning RTA, which has good reproducibility, might be useful to quantitatively detect and monitor macular thickening in diabetic retinopathy. Central macular thickness was highly correlated with logarithmically converted visual acuity in diabetic macular oedema.

 Keywords: scanning retinal thickness analyser; macular

  1. Qualitative and quantitative analyses of Compound Danshen extract based on ¹H NMR method and its application for quality control.

    PubMed

    Yan, Kai-Jing; Chu, Yang; Huang, Jian-Hua; Jiang, Miao-Miao; Li, Wei; Wang, Yue-Fei; Huang, Hui-Yong; Qin, Yu-Hui; Ma, Xiao-Hui; Zhou, Shui-Ping; Sun, Henry; Wang, Wei

    2016-11-30

    In this study, a new approach using ¹H NMR spectroscopy combined with chemometrics was developed for qualitative and quantitative analyses of extracts of Compound Danshen Dripping Pills (CDDP). For the qualitative analysis, metabolites present in Compound Danshen extract (CDE, the extraction intermediate of CDDP) were detected by the proposed ¹H NMR method, including phenolic acids, saponins, saccharides, organic acids and amino acids, and the metabolite profiles were further analyzed by selected chemometrics algorithms to define threshold values for product quality evaluation. Moreover, three main phenolic acids (danshensu, salvianolic acid B, and protocatechuic aldehyde) in CDE were determined simultaneously, and method validation in terms of linearity, precision, repeatability, accuracy, and stability of the dissolved target compounds in solution was performed. The average recoveries varied between 84.20% and 110.75%, while the RSDs were below 6.34% for the three phenolic acids. This ¹H NMR method offers an integral view of the extract composition, allows the qualitative and quantitative analysis of CDDP, and has the potential to be a supplementary tool to UPLC/HPLC for quality assessment of Chinese herbal medicines.

  2. Rapid Quantitative Analyses of Elements on Herb Medicine and Food Powder Using TEA CO2 Laser-Induced Plasma

    NASA Astrophysics Data System (ADS)

    Khumaeni, Ali; Ramli, Muliadi; Idris, Nasrullah; Lee, Yong Inn; Kurniawan, Koo Hendrik; Lie, Tjung Jie; Deguchi, Yoji; Niki, Hideaki; Kagawa, Kiichiro

    2009-03-01

    A novel technique for rapid quantitative analyses of elements in herb medicine and food powder has been successfully developed. In this technique, the powder samples were plugged into a small hole (2 mm in diameter and 3 mm in depth) and covered by a metal mesh. A transversely excited atmospheric (TEA) CO2 laser (1500 mJ, 200 ns) was focused through the metal mesh onto the powder sample surfaces at atmospheric pressure in nitrogen surrounding gas. It is hypothesized that the small hole functions to confine the powder particles and suppress blow-off, while the metal mesh works as the source of electrons to initiate the strong gas-breakdown plasma. The confined powder particles are subsequently ablated by the laser irradiation, and the ablated particles move into the strong gas-breakdown plasma region to be atomized and excited. Using this method, quantitative analysis of milk powder samples containing different concentrations of Ca was successfully demonstrated, resulting in a good linear calibration curve with high precision.

  3. Advances in Quantitative Analyses and Reference Materials Related to Laser Ablation ICP-MS: A Look at Methods and New Directions

    NASA Astrophysics Data System (ADS)

    Koenig, A. E.; Ridley, W. I.

    2009-12-01

    al. 2002), the MACS-1 and MACS-3 Ca carbonate RMs and a prototype Ca phosphate RM. Other work in-house currently includes testing of additional sulfide materials (Fe and Ni sulfides) and a gypsum material. Data for several matrices and RMs will be presented using multiple laser wavelengths. Regarding new methods for quantitative analyses, we have developed several new methods for quantitative trace-element mapping in a variety of mineral, biomineral and materials applications. Rapid trace-element mapping in bones (Koenig et al. 2009) is not only quantitative for trace elements but provides data that would be difficult to obtain as quickly or accurately by EPMA or other techniques. A method has been developed for rapid mapping of trace elements in building materials and other complex rock materials using a modification of the sum-to-100% method presented by others (e.g. Leach and Hieftje, 2001). This paper will outline new methods of integrating imaging and analytical data from EPMA, SEM, Raman and other techniques that improve the utility, accuracy and overall science of the subsequent LA-ICP-MS. Additional new directions for quantitative analyses of fluid inclusions, tissues, minerals and biological samples will be discussed.

  4. Mitochondrial DNA as a non-invasive biomarker: Accurate quantification using real time quantitative PCR without co-amplification of pseudogenes and dilution bias

    SciTech Connect

    Malik, Afshan N.; Shahni, Rojeen; Rodriguez-de-Ledesma, Ana; Laftah, Abas; Cunningham, Phil

    2011-08-19

    Highlights: • Mitochondrial dysfunction is central to many diseases of oxidative stress. • 95% of the mitochondrial genome is duplicated in the nuclear genome. • Dilution of untreated genomic DNA leads to dilution bias. • Unique primers and template pretreatment are needed to accurately measure mitochondrial DNA content. -- Abstract: Circulating mitochondrial DNA (MtDNA) is a potential non-invasive biomarker of cellular mitochondrial dysfunction, the latter known to be central to a wide range of human diseases. Changes in MtDNA are usually determined by quantification of MtDNA relative to nuclear DNA (Mt/N) using real-time quantitative PCR. We propose that the methodology for measuring Mt/N needs to be improved, and we have identified that current methods have at least one of the following three problems: (1) as much of the mitochondrial genome is duplicated in the nuclear genome, many commonly used MtDNA primers co-amplify homologous pseudogenes found in the nuclear genome; (2) use of regions from genes such as β-actin and 18S rRNA, which are repetitive and/or highly variable, for qPCR of the nuclear genome leads to errors; and (3) the size difference of mitochondrial and nuclear genomes causes a 'dilution bias' when template DNA is diluted. We describe a PCR-based method using unique regions in the human mitochondrial genome not duplicated in the nuclear genome, a unique single-copy region in the nuclear genome, and template treatment to remove dilution bias, to accurately quantify MtDNA from human samples.
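
    The relative-quantification arithmetic behind Mt/N can be sketched as below, assuming equal (100%) amplification efficiencies for both assays and a diploid nuclear target; the paper's contribution (unique primers and template pretreatment) sits upstream of this calculation.

        def mt_per_nuclear_genome(ct_nuclear, ct_mito):
            # Factor 2 accounts for the two copies of the single-copy nuclear target per cell.
            return 2 * 2 ** (ct_nuclear - ct_mito)

        print(mt_per_nuclear_genome(ct_nuclear=28.0, ct_mito=18.0))  # 2048 copies per cell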

  5. Accurate, quantitative assays for the hydrolysis of soluble type I, II, and III ³H-acetylated collagens by bacterial and tissue collagenases

    SciTech Connect

    Mallya, S.K.; Mookhtiar, K.A.; Van Wart, H.E.

    1986-11-01

    Accurate and quantitative assays for the hydrolysis of soluble ³H-acetylated rat tendon type I, bovine cartilage type II, and human amnion type III collagens by both bacterial and tissue collagenases have been developed. The assays are carried out at any temperature in the 1-30°C range in a single reaction tube, and the progress of the reaction is monitored by withdrawing aliquots as a function of time, quenching with 1,10-phenanthroline, and quantitating the concentration of hydrolysis fragments. The latter is achieved by selective denaturation of these fragments by incubation under conditions described in the previous paper of this issue. The assays give percentages of hydrolysis of all three collagen types by neutrophil collagenase that agree well with the results of gel electrophoresis experiments. The initial rates of hydrolysis of all three collagens are proportional to the concentration of either neutrophil or Clostridial collagenase over a 10-fold range of enzyme concentrations. All three assays can be carried out at collagen concentrations ranging from 0.06 to 2 mg/mL and give linear double-reciprocal plots for both tissue and bacterial collagenases that can be used to evaluate the kinetic parameters Km and kcat or Vmax. The assay developed for the hydrolysis of rat type I collagen by neutrophil collagenase is shown to be more sensitive by at least one order of magnitude than comparable assays that use rat type I collagen fibrils or gels as substrate.
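
    A minimal sketch of the double-reciprocal analysis the assay enables, with hypothetical rates: fitting 1/v against 1/[S] recovers Km and Vmax from the slope and intercept.

        import numpy as np

        S = np.array([0.06, 0.125, 0.25, 0.5, 1.0, 2.0])   # collagen, mg/mL
        v = np.array([0.8, 1.5, 2.6, 4.0, 5.4, 6.4])       # hydrolysis rate (arbitrary units)

        slope, intercept = np.polyfit(1.0 / S, 1.0 / v, 1) # 1/v = (Km/Vmax)(1/[S]) + 1/Vmax
        Vmax = 1.0 / intercept
        Km = slope * Vmax
        print(f"Km = {Km:.2f} mg/mL, Vmax = {Vmax:.2f}")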

  6. Gene set analyses of genome-wide association studies on 49 quantitative traits measured in a single genetic epidemiology dataset.

    PubMed

    Kim, Jihye; Kwon, Ji-Sun; Kim, Sangsoo

    2013-09-01

    Gene set analysis is a powerful tool for interpreting the results of genome-wide association studies and is rapidly gaining popularity. Comparison of the gene sets obtained for a variety of traits measured from a single genetic epidemiology dataset may give insights into the biological mechanisms underlying these traits. Based on the previously published single nucleotide polymorphism (SNP) genotype data on 8,842 individuals enrolled in the Korea Association Resource project, we performed a series of systematic genome-wide association analyses for 49 quantitative traits of basic epidemiological, anthropometric, or blood chemistry parameters. Each analysis result was subjected to subsequent gene set analyses based on Gene Ontology (GO) terms using the gene set analysis software GSA-SNP, identifying a set of GO terms significantly associated with each trait (corrected p < 0.05). Pairwise comparison of the traits in terms of the semantic similarity of their GO sets revealed surprising cases where phenotypically uncorrelated traits showed high similarity in terms of biological pathways. For example, the pH level was related to 7 other traits that showed low phenotypic correlations with it. A literature survey implies that these traits may be regulated partly by common pathways that involve neuronal or nervous systems.

  7. Comparing the accuracy of quantitative versus qualitative analyses of interim PET to prognosticate Hodgkin lymphoma: a systematic review protocol of diagnostic test accuracy

    PubMed Central

    Procházka, Vít; Klugar, Miloslav; Bachanova, Veronika; Klugarová, Jitka; Tučková, Dagmar; Papajík, Tomáš

    2016-01-01

    Introduction Hodgkin lymphoma is an effectively treated malignancy, yet 20% of patients relapse or are refractory to front-line treatments, with potentially fatal outcomes. Early detection of poor treatment responders is crucial for appropriate application of tailored treatment strategies. Tumour metabolic imaging of Hodgkin lymphoma using visual (qualitative) 18-fluorodeoxyglucose positron emission tomography (FDG-PET) is a gold standard for staging and final outcome assessment, but results gathered during the interim period are less accurate. Analysis of continuous metabolic-morphological data from quantitative FDG-PET may enhance the robustness of interim disease monitoring and help to improve treatment decision-making processes. The objective of this review is to compare the diagnostic test accuracy of quantitative versus qualitative interim FDG-PET in the prognostication of patients with Hodgkin lymphoma. Methods The literature on this topic will be reviewed in a 3-step strategy that follows methods described by the Joanna Briggs Institute (JBI). First, MEDLINE and EMBASE databases will be searched. Second, listed databases for published literature (MEDLINE, Tripdatabase, Pedro, EMBASE, the Cochrane Central Register of Controlled Trials and WoS) and unpublished literature (Open Grey, Current Controlled Trials, MedNar, ClinicalTrials.gov, COS Conference Papers Index and the International Clinical Trials Registry Platform of the WHO) will be queried. Third, 2 independent reviewers will analyse titles, abstracts and full texts, perform a hand search of relevant studies, and then perform critical appraisal and data extraction from selected studies using the DATARI tool (JBI). If possible, a statistical meta-analysis will be performed on pooled sensitivity and specificity data gathered from the selected studies. Statistical heterogeneity will be assessed. Funnel plots, Begg's rank correlations and Egger's regression tests will be used to detect and/or correct publication bias.

  8. Detection and quantitation of trace phenolphthalein (in pharmaceutical preparations and in forensic exhibits) by liquid chromatography-tandem mass spectrometry, a sensitive and accurate method.

    PubMed

    Sharma, Kakali; Sharma, Shiba P; Lahiri, Sujit C

    2013-01-01

    Phenolphthalein, an acid-base indicator and laxative, is important as a constituent of widely used weight-reducing multicomponent food formulations. Phenolphthalein is a useful reagent in forensic science for the identification of blood stains of suspected victims and for apprehending erring officials accepting bribes in graft or trap cases. The pink-colored alkaline hand washes originating from phenolphthalein-smeared notes can easily be determined spectrophotometrically. But in many cases, the colored solution turns colorless with time, which renders the genuineness of bribe cases doubtful to the judiciary. Until now, no method has been known for the detection and identification of phenolphthalein in colorless forensic exhibits with positive proof. Liquid chromatography-tandem mass spectrometry was found to be a highly sensitive and accurate method capable of detecting and quantitating trace phenolphthalein in commercial formulations and colorless forensic exhibits with positive proof. The detection limit of phenolphthalein was found to be 1.66 pg/µL (i.e., ng/mL), and the calibration curve shows good linearity (r² = 0.9974).

  9. Allele Specific Locked Nucleic Acid Quantitative PCR (ASLNAqPCR): An Accurate and Cost-Effective Assay to Diagnose and Quantify KRAS and BRAF Mutation

    PubMed Central

    Morandi, Luca; de Biase, Dario; Visani, Michela; Cesari, Valentina; De Maglio, Giovanna; Pizzolitto, Stefano; Pession, Annalisa; Tallini, Giovanni

    2012-01-01

    The use of tyrosine kinase inhibitors (TKIs) requires testing for hot-spot mutations of the molecular effectors downstream of the membrane-bound tyrosine kinases, since their wild-type status is expected for response to TKI therapy. We report a novel assay that we have called Allele Specific Locked Nucleic Acid quantitative PCR (ASLNAqPCR). The assay uses LNA-modified allele-specific primers and LNA-modified beacon probes to increase sensitivity and specificity and to accurately quantify mutations. We designed primers specific for codon 12/13 KRAS mutations and BRAF V600E, and validated the assay with 300 routine samples from a variety of sources, including cytology specimens. All were analyzed by ASLNAqPCR and Sanger sequencing. Discordant cases were pyrosequenced. ASLNAqPCR correctly identified BRAF and KRAS mutations in all discordant cases, and all had a mutated/wild-type DNA ratio below the analytical sensitivity of the Sanger method. ASLNAqPCR was 100% specific, with greater accuracy and positive and negative predictive values compared with Sanger sequencing. The analytical sensitivity of ASLNAqPCR is 0.1%, allowing quantification of mutated DNA in small neoplastic cell clones. ASLNAqPCR can be performed in any laboratory with real-time PCR equipment, is very cost-effective, and can easily be adapted to detect hot-spot mutations in other oncogenes. PMID:22558339

  10. Joint analyses of open comments and quantitative data: Added value in a job satisfaction survey of hospital professionals

    PubMed Central

    Gilles, Ingrid; Mayer, Mauro; Courvoisier, Nelly; Peytremann-Bridevaux, Isabelle

    2017-01-01

    Objective To obtain a comprehensive understanding of the job opinions of hospital professionals by conducting qualitative analyses of the open comments included in a job satisfaction survey and combining these results with the quantitative results. Design A cross-sectional survey targeting all Lausanne University Hospital professionals was performed in the fall of 2013. Material and methods The survey considered ten job satisfaction dimensions (e.g. self-fulfilment, workload, management, work-related burnout, organisational commitment, intent to stay) and included an open comment section. Computer-assisted qualitative analyses were conducted on these comments. Satisfaction rates on the included dimensions and professional groups were entered as predictive variables in the qualitative analyses. Participants Of 10 838 hospital professionals, 4978 participated in the survey and 1067 provided open comments. Data from 1045 respondents with usable comments constituted the analytic sample (133 physicians, 393 nurses, 135 laboratory technicians, 247 administrative staff, including researchers, 67 logistic staff, 44 psycho-social workers, and 26 unspecified). Results Almost a third of the comments addressed scheduling issues, mostly related to problems and exhaustion linked to shifts, work-life balance, and difficulties with colleagues’ absences and the consequences for quality of care and patient safety. The other two-thirds related to classic themes included in job satisfaction surveys. Although some comments were provided equally by all professional groups, others were group specific: work and hierarchy pressures for physicians, healthcare quality and patient safety for nurses, skill recognition for administrative staff. Overall, respondents’ comments were consistent with their job satisfaction ratings. Conclusion Open comment analysis provides a comprehensive understanding of hospital professionals’ job experiences, allowing better consideration of quality

  11. Development and validation of a liquid chromatography-tandem mass spectrometric assay for quantitative analyses of triptans in hair.

    PubMed

    Vandelli, Daniele; Palazzoli, Federica; Verri, Patrizia; Rustichelli, Cecilia; Marchesi, Filippo; Ferrari, Anna; Baraldi, Carlo; Giuliani, Enrico; Licata, Manuela; Silingardi, Enrico

    2016-04-01

    Triptans, selective 5-HT1B/1D receptor agonists, are specific drugs widely used for the acute treatment of migraine. Proper intake of triptans is very important for effective treatment; nevertheless, patients often underuse, misuse, overuse, or take triptans inconsistently, i.e., without following the prescribed therapy. Drug analysis in hair can represent a powerful tool for monitoring patient compliance with therapy, since it greatly increases the detection time window compared with analyses in biological fluids such as plasma or urine. In the present study, a liquid chromatography-tandem mass spectrometric (LC-MS/MS) method has been developed and validated for the quantitative analysis in human hair of five triptans commonly prescribed in Italy: almotriptan (AL), eletriptan (EP), rizatriptan (RIZ), sumatriptan (SUM) and zolmitriptan (ZP). Hair samples were decontaminated and incubated overnight in diluted hydrochloric acid; the extracts were purified by mixed-mode SPE cartridges and analyzed by LC-MS/MS under gradient elution in positive multiple reaction monitoring (MRM) mode. The procedure was fully validated in terms of selectivity, linearity, limit of detection (LOD) and lower limit of quantitation (LLOQ), accuracy, precision, carry-over, recovery, matrix effect and dilution integrity. The method was linear in the range 10-1000 pg/mg hair, with R² values of at least 0.990; the validated LLOQ values were in the range 5-7 pg/mg hair. The method offered satisfactory precision (RSD <10%), accuracy (90-110%) and recovery (>85%) values. The validated procedure was applied to 147 authentic hair samples from subjects being treated in the Headache Centre of Modena University Hospital, in order to verify the possibility of monitoring hair levels of the triptans taken.
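
    As an illustration of the linearity criterion reported above (R² of at least 0.990 over 10-1000 pg/mg), the sketch below fits a calibration line and computes R²; the concentrations and responses are made up.

        import numpy as np

        conc = np.array([10, 25, 50, 100, 250, 500, 1000])                   # pg/mg hair
        resp = np.array([0.021, 0.050, 0.104, 0.199, 0.508, 0.995, 2.02])    # analyte/IS area ratio

        slope, intercept = np.polyfit(conc, resp, 1)
        pred = slope * conc + intercept
        r2 = 1 - ((resp - pred) ** 2).sum() / ((resp - resp.mean()) ** 2).sum()
        print(f"R^2 = {r2:.4f}")   # acceptance criterion in the paper: >= 0.990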

  12. A second-generation device for automated training and quantitative behavior analyses of molecularly-tractable model organisms.

    PubMed

    Blackiston, Douglas; Shomrat, Tal; Nicolas, Cindy L; Granata, Christopher; Levin, Michael

    2010-12-17

    A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time, which is necessary for operant conditioning assays). The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and aid other laboratories that do not have the facilities to undergo complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science.

  13. A Second-Generation Device for Automated Training and Quantitative Behavior Analyses of Molecularly-Tractable Model Organisms

    PubMed Central

    Blackiston, Douglas; Shomrat, Tal; Nicolas, Cindy L.; Granata, Christopher; Levin, Michael

    2010-01-01

    A deep understanding of cognitive processes requires functional, quantitative analyses of the steps leading from genetics and the development of nervous system structure to behavior. Molecularly-tractable model systems such as Xenopus laevis and planaria offer an unprecedented opportunity to dissect the mechanisms determining the complex structure of the brain and CNS. A standardized platform that facilitated quantitative analysis of behavior would make a significant impact on evolutionary ethology, neuropharmacology, and cognitive science. While some animal tracking systems exist, the available systems do not allow automated training (feedback to individual subjects in real time, which is necessary for operant conditioning assays). The lack of standardization in the field, and the numerous technical challenges that face the development of a versatile system with the necessary capabilities, comprise a significant barrier keeping molecular developmental biology labs from integrating behavior analysis endpoints into their pharmacological and genetic perturbations. Here we report the development of a second-generation system that is a highly flexible, powerful machine vision and environmental control platform. In order to enable multidisciplinary studies aimed at understanding the roles of genes in brain function and behavior, and aid other laboratories that do not have the facilities to undergo complex engineering development, we describe the device and the problems that it overcomes. We also present sample data using frog tadpoles and flatworms to illustrate its use. Having solved significant engineering challenges in its construction, the resulting design is a relatively inexpensive instrument of wide relevance for several fields, and will accelerate interdisciplinary discovery in pharmacology, neurobiology, regenerative medicine, and cognitive science. PMID:21179424

  14. Specific catalysis of asparaginyl deamidation by carboxylic acids: kinetic, thermodynamic, and quantitative structure-property relationship analyses.

    PubMed

    Connolly, Brian D; Tran, Benjamin; Moore, Jamie M R; Sharma, Vikas K; Kosky, Andrew

    2014-04-07

    Asparaginyl (Asn) deamidation could lead to altered potency, safety, and/or pharmacokinetics of therapeutic protein drugs. In this study, we investigated the effects of several different carboxylic acids on Asn deamidation rates using an IgG1 monoclonal antibody (mAb1*) and a model hexapeptide (peptide1) with the sequence YGKNGG. Thermodynamic analyses of the kinetics data revealed that higher deamidation rates are associated with predominantly more negative ΔS and, to a lesser extent, more positive ΔH. The observed differences in deamidation rates were attributed to the unique ability of each type of carboxylic acid to stabilize the energetically unfavorable transition-state conformations required for imide formation. Quantitative structure property relationship (QSPR) analysis using kinetic data demonstrated that molecular descriptors encoding for the geometric spatial distribution of atomic properties on various carboxylic acids are effective determinants for the deamidation reaction. Specifically, the number of O-O and O-H atom pairs on carboxyl and hydroxyl groups with interatomic distances of 4-5 Å on a carboxylic acid buffer appears to determine the rate of deamidation. Collectively, the results from structural and thermodynamic analyses indicate that carboxylic acids presumably form multiple hydrogen bonds and charge-charge interactions with the relevant deamidation site and provide alignment between the reactive atoms on the side chain and backbone. We propose that carboxylic acids catalyze deamidation by stabilizing a specific, energetically unfavorable transition-state conformation of l-asparaginyl intermediate II that readily facilitates bond formation between the γ-carbonyl carbon and the deprotonated backbone nitrogen for cyclic imide formation.

  15. Interfacing microwells with nanoliter compartments: a sampler generating high-resolution concentration gradients for quantitative biochemical analyses in droplets.

    PubMed

    Gielen, Fabrice; Buryska, Tomas; Van Vliet, Liisa; Butz, Maren; Damborsky, Jiri; Prokop, Zbynek; Hollfelder, Florian

    2015-01-06

    Analysis of concentration dependencies is key to the quantitative understanding of biological and chemical systems. In experimental tests involving concentration gradients such as inhibitor library screening, the number of data points and the ratio between the stock volume and the volume required in each test determine the quality and efficiency of the information gained. Titerplate assays are currently the most widely used format, even though they require microlitre volumes. Compartmentalization of reactions in pico- to nanoliter water-in-oil droplets in microfluidic devices provides a solution for massive volume reduction. This work addresses the challenge of producing microfluidic-based concentration gradients in a way that every droplet represents one unique reagent combination. We present a simple microcapillary technique able to generate such series of monodisperse water-in-oil droplets (with a frequency of up to 10 Hz) from a sample presented in an open well (e.g., a titerplate). Time-dependent variation of the well content results in microdroplets that represent time capsules of the composition of the source well. By preserving the spatial encoding of the droplets in tubing, each reactor is assigned an accurate concentration value. We used this approach to record kinetic time courses of the haloalkane dehalogenase DbjA and analyzed 150 combinations of enzyme/substrate/inhibitor in less than 5 min, resulting in conclusive Michaelis-Menten and inhibition curves. Avoiding chips and merely requiring two pumps, a magnetic plate with a stirrer, tubing, and a pipet tip, this easy-to-use device rivals the output of much more expensive liquid handling systems using a fraction (∼100-fold less) of the reagents consumed in microwell format.
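
    A sketch of the downstream kinetic analysis, assuming initial rates have already been extracted from the droplet time courses; the substrate levels and rates are hypothetical, and scipy's curve_fit stands in for whatever fitting routine the authors used.

        import numpy as np
        from scipy.optimize import curve_fit

        def michaelis_menten(s, vmax, km):
            return vmax * s / (km + s)

        s = np.array([5, 10, 25, 50, 100, 250, 500.0])        # substrate, uM (hypothetical)
        v0 = np.array([0.9, 1.6, 3.0, 4.2, 5.3, 6.2, 6.6])    # initial rates from droplet series

        (vmax, km), _ = curve_fit(michaelis_menten, s, v0, p0=(7.0, 50.0))
        print(f"Vmax = {vmax:.2f}, Km = {km:.1f} uM")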

  16. Classical genetic and quantitative trait loci analyses of heterosis in a maize hybrid between two elite inbred lines.

    PubMed

    Frascaroli, Elisabetta; Canè, Maria Angela; Landi, Pierangelo; Pea, Giorgio; Gianfranceschi, Luca; Villa, Marzio; Morgante, Michele; Pè, Mario Enrico

    2007-05-01

    The exploitation of heterosis is one of the most outstanding advancements in plant breeding, although its genetic basis is not well understood yet. This research was conducted on materials arising from the maize single cross B73 × H99 to study heterosis by procedures of classical genetic and quantitative trait loci (QTL) analyses. The materials were the basic generations, the derived 142 recombinant inbred lines (RILs), and the three testcross populations obtained by crossing the 142 RILs to each parent and their F1. For seedling weight (SW), number of kernels per plant (NK), and grain yield (GY), heterosis was >100% and the average degree of dominance was >1. Epistasis was significant for SW and NK but not for GY. Several QTL were identified, and in most cases they were in the additive-dominance range for traits with low heterosis and mostly in the dominance-overdominance range for plant height (PH), SW, NK, and GY. Only a few QTL with digenic epistasis were identified. The importance of dominance effects was confirmed by highly significant correlations between heterozygosity level and phenotypic performance, especially for GY. Some chromosome regions presented overlaps of overdominant QTL for SW, PH, NK, and GY, suggesting pleiotropic effects on overall plant vigor.

  17. Quantitative solid-state 13C nuclear magnetic resonance spectrometric analyses of wood xylem: effect of increasing carbohydrate content

    USGS Publications Warehouse

    Bates, A.L.; Hatcher, P.G.

    1992-01-01

    Isolated lignin with a low carbohydrate content was spiked with increasing amounts of alpha-cellulose and then analysed by solid-state 13C nuclear magnetic resonance (NMR) using cross-polarization with magic angle spinning (CPMAS) and dipolar dephasing methods, in order to assess the quantitative reliability of CPMAS measurement of carbohydrate content and to determine how increasingly intense resonances for carbohydrate carbons affect calculations of the degree of substitution of lignin's aromatic rings and of methoxyl carbon content. The carbohydrate content calculated by NMR was compared with carbohydrate concentrations obtained by phenol-sulfuric acid assay and by calculation from the known amounts of cellulose added. The NMR methods used in this study yield overestimates for carbohydrate carbons due to resonance area overlap from the aliphatic side-chain carbons of lignin. When corrections are made for these overlapping resonance areas, the NMR results agree very well with results obtained by the other methods. Neither the calculated methoxyl carbon content nor the degree of aromatic ring substitution in lignin, both calculated from dipolar dephasing spectra, changes with cellulose content. Likewise, lignin methoxyl content does not correlate with cellulose abundance when measured by integration of CPMAS spectra. © 1992.

  18. Empirical Bayes factor analyses of quantitative trait loci for gestation length in Iberian × Meishan F2 sows.

    PubMed

    Casellas, J; Varona, L; Muñoz, G; Ramírez, O; Barragán, C; Tomás, A; Martínez-Giner, M; Ovilo, C; Sánchez, A; Noguera, J L; Rodríguez, M C

    2008-02-01

    The aim of this study was to investigate chromosomal regions affecting gestation length in sows. An experimental F2 cross between the Iberian and Meishan pig breeds was used for this purpose, and we genotyped 119 markers covering the 18 porcine autosomal chromosomes. Within this context, we have developed a new empirical Bayes factor (BF) approach to compare nested models, with and without the quantitative trait locus (QTL) effect, including the location of the QTL as an unknown parameter in the model. This empirical BF can be easily calculated from the output of Markov chain Monte Carlo sampling by averaging conditional densities at the null QTL effect. Linkage analyses were performed on each chromosome using an animal model to account for infinitesimal genetic effects. Initially, three QTL were detected on chromosomes 6, 8 and 11 although, after correcting for multiple testing, only the additive QTL located at 110 cM on chromosome 8 remained. For this QTL, the allelic effect of substituting the Iberian allele increased gestation length by 0.521 days, with a 95% highest posterior density region ranging from 0.121 to 0.972 days. Although future studies are necessary to confirm whether the detected QTL is relevant and segregating in commercial pig populations, a hot spot for the genetic regulation of gestation length in pigs seems to be located on chromosome 8.
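
    One way to read the computation described above — averaging conditional densities at the null QTL effect over the MCMC output — is as a Savage-Dickey-style density ratio, sketched below; the prior and the posterior draws are illustrative only, not the paper's model.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        prior_sd = 1.0
        prior_at_zero = norm.pdf(0.0, loc=0.0, scale=prior_sd)

        # Hypothetical MCMC draws of the conditional posterior mean/sd of the QTL effect
        post_means = rng.normal(0.5, 0.1, size=5000)
        post_sds = np.full(5000, 0.15)
        avg_post_at_zero = norm.pdf(0.0, loc=post_means, scale=post_sds).mean()

        print(f"BF(QTL vs no QTL) = {prior_at_zero / avg_post_at_zero:.1f}")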

  19. Genetic relationship between lodging and lodging components in barley (Hordeum vulgare) based on unconditional and conditional quantitative trait locus analyses.

    PubMed

    Chen, W Y; Liu, Z M; Deng, G B; Pan, Z F; Liang, J J; Zeng, X Q; Tashi, N M; Long, H; Yu, M Q

    2014-03-17

    Lodging (LD) is a major constraint limiting the yield and forage quality of barley. Detailed analyses of LD component (LDC) traits were conducted using 246 F2 plants generated from a cross between cultivars ZQ320 and 1277. Genetic relationships between LD and LDC were evaluated by unconditional and conditional quantitative trait locus (QTL) mapping with 117 simple sequence repeat markers. Ultimately, 53 unconditional QTL related to LD were identified on seven barley chromosomes. Up to 15 QTL accounted for over 10% of the phenotypic variation, and up to 20 QTL for culm strength were detected. Six QTL with pleiotropic effects showing significant negative correlations with LD were found between markers Bmag353 and GBM1482 on chromosome 4H. These alleles, and alleles of QTL for wall thickness, culm strength, plant height, and plant weight, originated from ZQ320. Conditional mapping identified 96 additional QTL for LD. Conditional QTL analysis demonstrated that plant height, plant height center of gravity, and length of the sixth internode had the greatest contribution to LD, whereas culm strength and length of the fourth internode, together with culm strength of the second internode, were the key factors for LD resistance. Therefore, lodging resistance in barley can be improved through selection of alleles affecting culm strength, wall thickness, plant height, and plant weight. The conditional QTL mapping method can be used to evaluate possible genetic relationships between LD and LDC while efficiently and precisely determining counteracting QTL, which will help in understanding the genetic basis of LD in barley.

  20. Quantitative three-dimensional microtextural analyses of tooth wear as a tool for dietary discrimination in fishes

    PubMed Central

    Purnell, Mark; Seehausen, Ole; Galis, Frietson

    2012-01-01

    Resource polymorphisms and competition for resources are significant factors in speciation. Many examples come from fishes, and cichlids are of particular importance because of their role as model organisms at the interface of ecology, development, genetics and evolution. However, analysis of trophic resource use in fishes can be difficult and time-consuming, and for fossil fish species it is particularly problematic. Here, we present evidence from cichlids that analysis of tooth microwear based on high-resolution (sub-micrometre scale) three-dimensional data and new ISO standards for quantification of surface textures provides a powerful tool for dietary discrimination and investigation of trophic resource exploitation. Our results suggest that three-dimensional approaches to analysis offer significant advantages over two-dimensional operator-scored methods of microwear analysis, including applicability to rough tooth surfaces that lack distinct scratches and pits. Tooth microwear textures develop over a longer period of time than is represented by stomach contents, and analyses based on textures are less prone to biases introduced by opportunistic feeding. They are more sensitive to subtle dietary differences than isotopic analysis. Quantitative textural analysis of tooth microwear has a useful role to play, complementing existing approaches, in trophic analysis of fishes—both extant and extinct. PMID:22491979

  1. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    PubMed Central

    2010-01-01

    Background Normalizing to reference genes, or housekeeping genes, can produce more accurate and reliable results from reverse-transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants. Therefore, qPCR studies on important crops such as cotton have been hampered by the lack of suitable reference genes. Results Using two distinct algorithms, implemented in geNorm and NormFinder, we assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development, and the floral verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal controls for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references for normalization of gene expression.
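
    For readers unfamiliar with geNorm, its stability measure M — the quantity used to rank candidates such as those above — can be sketched as follows: for each gene, M is the mean standard deviation of its log2 expression ratios against every other candidate (lower M means more stable). The input matrix here is random stand-in data, and NormFinder's model-based measure is not reproduced.

        import numpy as np

        def genorm_m(rel_q):
            """rel_q: samples x genes matrix of relative expression quantities."""
            logq = np.log2(rel_q)
            n_genes = logq.shape[1]
            m = np.empty(n_genes)
            for j in range(n_genes):
                ratios = logq[:, [j]] - logq          # pairwise log-ratios vs every gene
                sds = ratios.std(axis=0, ddof=1)
                m[j] = np.delete(sds, j).mean()       # exclude the gene against itself
            return m

        rel_q = np.random.default_rng(3).lognormal(size=(23, 9))  # 23 samples x 9 candidates
        print(genorm_m(rel_q).round(2))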

  2. Impact of vaccine herd-protection effects in cost-effectiveness analyses of childhood vaccinations. A quantitative comparative analysis

    PubMed Central

    Maldonado, Yvonne; Ioannidis, John P. A.; Contopoulos-Ioannidis, Despina

    2017-01-01

    Background Inclusion of vaccine herd-protection effects in cost-effectiveness analyses (CEAs) can impact the CEAs-conclusions. However, empirical epidemiologic data on the size of herd-protection effects from original studies are limited. Methods We performed a quantitative comparative analysis of the impact of herd-protection effects in CEAs for four childhood vaccinations (pneumococcal, meningococcal, rotavirus and influenza). We considered CEAs reporting incremental-cost-effectiveness-ratios (ICERs) (per quality-adjusted-life-years [QALY] gained; per life-years [LY] gained or per disability-adjusted-life-years [DALY] avoided), both with and without herd protection, while keeping all other model parameters stable. We calculated the size of the ICER-differences without vs with-herd-protection and estimated how often inclusion of herd-protection led to crossing of the cost-effectiveness threshold (of an assumed societal-willingness-to-pay) of $50,000 for more-developed countries or 3×GDP/capita (the WHO threshold) for less-developed countries. Results We identified 35 CEA studies (20 pneumococcal, 4 meningococcal, 8 rotavirus and 3 influenza vaccines) with 99 ICER-analyses (55 per-QALY, 27 per-LY and 17 per-DALY). The median ICER-absolute differences per QALY, LY and DALY (without minus with herd-protection) were $15,620 (IQR: $877 to $48,376); $54,871 (IQR: $787 to $115,026) and $49 (IQR: $15 to $1,636) respectively. When the target-vaccination strategy was not cost-saving without herd-protection, inclusion of herd-protection always resulted in more favorable results. In CEAs that had ICERs above the cost-effectiveness threshold without herd-protection, inclusion of herd-protection led to crossing of that threshold in 45% of the cases. This impacted only CEAs for more developed countries, as all but one CEA for less developed countries had ICERs below the WHO cost-effectiveness threshold even without herd-protection. In several analyses, recommendation for the
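
    The threshold-crossing comparison described above is easy to state concretely. A sketch with hypothetical numbers (not drawn from any of the 35 reviewed CEAs): herd protection adds health benefit at unchanged incremental cost, lowering the ICER and possibly moving it below the willingness-to-pay threshold.

      def icer(delta_cost, delta_qaly):
          # Incremental cost-effectiveness ratio: extra cost per QALY gained.
          return delta_cost / delta_qaly

      # Hypothetical vaccination strategy vs no vaccination.
      without_herd = icer(delta_cost=12_000_000, delta_qaly=180)   # ~ $66,667/QALY
      with_herd    = icer(delta_cost=12_000_000, delta_qaly=300)   # ~ $40,000/QALY

      THRESHOLD = 50_000  # assumed societal willingness-to-pay, more-developed countries

      crossed = without_herd > THRESHOLD >= with_herd
      print(f"ICER without herd protection: ${without_herd:,.0f}/QALY")
      print(f"ICER with herd protection:    ${with_herd:,.0f}/QALY")
      print("Crosses threshold:", crossed)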

  3. Receptor site topographies for phencyclidine-like and sigma drugs: predictions from quantitative conformational, electrostatic potential, and radioreceptor analyses.

    PubMed

    Manallack, D T; Wong, M G; Costa, M; Andrews, P R; Beart, P M

    1988-12-01

    Computer-assisted molecular modelling techniques and electrostatic analyses of a wide range of phencyclidine (PCP) and sigma ligands, in conjunction with radioreceptor studies, were used to determine the topographies of the PCP and sigma receptors. The PCP receptor model was defined using key molecules from the arylcyclohexylamine, benzomorphan, bridged benz[f]isoquinoline, and dibenzocycloalkenimine drug classes. Hypothetical receptor points (R1, R2) were constructed onto the aromatic ring of each compound to represent hydrophobic interactions with the receptor, along with an additional receptor point (R3) representing a hydrogen bond between the nitrogen atom and the receptor. The superimposition of these key molecules gave the coordinates of the receptor points and nitrogen defining the primary PCP pharmacophore as follows: R1 (0.00, 3.50, 0.00), R2 (0.00, -3.50, 0.00), R3 (6.66, -1.13, 0.00), and N (3.90, -1.46, -0.32). Additional analyses were used to describe secondary binding sites for an additional hydrogen bonding site and two lipophilic clefts. Similarly, the sigma receptor model was constructed from ligands of the benzomorphan, octahydrobenzo[f]quinoline, phenylpiperidine, and diphenylguanidine drug classes. Coordinates for the primary sigma pharmacophore are as follows: R1 (0.00, 3.50, 0.00), R2 (0.00, -3.50, 0.00), R3 (6.09, 2.09, 0.00), and N (4.9, -0.12, -1.25). Secondary binding sites for sigma ligands were proposed for the interaction of aromatic ring substituents and large N-substituted lipophilic groups with the receptor. The sigma receptor model differs from the PCP model in the position of the nitrogen atom, the direction of the nitrogen lone pair vector, and the secondary sigma binding sites. This study has thus demonstrated that the differing quantitative structure-activity relationships of PCP and sigma ligands allow the definition of discrete receptors. These models may be used in conjunction with rational drug design techniques to design novel PCP
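
    Since both pharmacophores are reported as explicit coordinates, the stated geometric differences can be checked directly; a minimal sketch using the coordinates quoted above (units as given in the abstract):

      import numpy as np

      # Primary pharmacophore points quoted in the abstract.
      pcp = {"R1": (0.00, 3.50, 0.00), "R2": (0.00, -3.50, 0.00),
             "R3": (6.66, -1.13, 0.00), "N": (3.90, -1.46, -0.32)}
      sigma = {"R1": (0.00, 3.50, 0.00), "R2": (0.00, -3.50, 0.00),
               "R3": (6.09, 2.09, 0.00), "N": (4.90, -0.12, -1.25)}

      # R1 and R2 coincide in the two models; R3 and N carry the difference.
      for label in ("R3", "N"):
          d = np.linalg.norm(np.subtract(pcp[label], sigma[label]))
          print(f"{label}: PCP vs sigma displacement = {d:.2f}")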

  4. Genome-Wide Identification and Validation of Reference Genes in Infected Tomato Leaves for Quantitative RT-PCR Analyses

    PubMed Central

    Müller, Oliver A.; Grau, Jan; Thieme, Sabine; Prochaska, Heike; Adlung, Norman; Sorgatz, Anika; Bonas, Ulla

    2015-01-01

    The Gram-negative bacterium Xanthomonas campestris pv. vesicatoria (Xcv) causes bacterial spot disease of pepper and tomato by direct translocation of type III effector proteins into the plant cell cytosol. Once in the plant cell the effectors interfere with host cell processes and manipulate the plant transcriptome. Quantitative RT-PCR (qRT-PCR) is usually the method of choice to analyze transcriptional changes of selected plant genes. Reliable results depend, however, on measuring stably expressed reference genes that serve as internal normalization controls. We identified the most stably expressed tomato genes based on microarray analyses of Xcv-infected tomato leaves and evaluated the reliability of 11 genes for qRT-PCR studies in comparison to four traditionally employed reference genes. Three different statistical algorithms, geNorm, NormFinder and BestKeeper, concordantly determined the superiority of the newly identified reference genes. The most suitable reference genes encode proteins with homology to PHD finger family proteins and the U6 snRNA-associated protein LSm7. In addition, we identified pepper orthologs and validated several genes as reliable normalization controls for qRT-PCR analysis of Xcv-infected pepper plants. The newly identified reference genes will be beneficial for future qRT-PCR studies of the Xcv-tomato and Xcv-pepper pathosystems, as well as for the identification of suitable normalization controls for qRT-PCR studies of other plant-pathogen interactions, especially if related plant species are used in combination with bacterial pathogens. PMID:26313760

  5. Molecular docking and 3D-quantitative structure activity relationship analyses of peptidyl vinyl sulfones: Plasmodium falciparum cysteine proteases inhibitors

    NASA Astrophysics Data System (ADS)

    Teixeira, Cátia; Gomes, José R. B.; Couesnon, Thierry; Gomes, Paula

    2011-08-01

    Comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA) based on three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were conducted on a series (39 molecules) of peptidyl vinyl sulfone derivatives as potential Plasmodium falciparum cysteine protease inhibitors. Two different methods of alignment were employed: (i) a receptor-docked alignment derived from the structure-based docking algorithm GOLD and (ii) a ligand-based alignment using the structure of one of the ligands derived from a crystal structure in the PDB. The best predictions were obtained for the receptor-docked alignment with a CoMFA standard model (q² = 0.696 and r² = 0.980) and with CoMSIA combined electrostatic and hydrophobic fields (q² = 0.711 and r² = 0.992). Both models were validated by a test set of nine compounds and gave satisfactory predictive r² (r²pred) values of 0.76 and 0.74, respectively. CoMFA and CoMSIA contour maps were used to identify critical regions where any change in the steric, electrostatic, and hydrophobic fields may affect the inhibitory activity, and to highlight the key structural features required for biological activity. Moreover, the results obtained from the 3D-QSAR analyses were superimposed on the Plasmodium falciparum cysteine protease active site and the main interactions were studied. The present work provides extremely useful guidelines for future structural modifications of this class of compounds towards the development of superior antimalarials.
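
    The q² statistic quoted for these models is the cross-validated analogue of r², conventionally defined as q² = 1 - PRESS/SS over leave-one-out predictions. A sketch of that computation with invented activity values (purely illustrative, not data from this study):

      import numpy as np

      def q2(y_obs, y_pred_loo):
          # Cross-validated q2 = 1 - PRESS/SS, with PRESS summed over
          # leave-one-out predictions and SS the total sum of squares.
          y_obs, y_pred_loo = np.asarray(y_obs), np.asarray(y_pred_loo)
          press = np.sum((y_obs - y_pred_loo) ** 2)
          ss = np.sum((y_obs - y_obs.mean()) ** 2)
          return 1.0 - press / ss

      # Hypothetical pIC50 values and their leave-one-out predictions.
      y = [5.1, 6.3, 7.0, 5.8, 6.6]
      y_loo = [5.3, 6.0, 6.8, 6.0, 6.5]
      print(f"q2 = {q2(y, y_loo):.3f}")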

  6. Genome-Wide Identification and Validation of Reference Genes in Infected Tomato Leaves for Quantitative RT-PCR Analyses.

    PubMed

    Müller, Oliver A; Grau, Jan; Thieme, Sabine; Prochaska, Heike; Adlung, Norman; Sorgatz, Anika; Bonas, Ulla

    2015-01-01

    The Gram-negative bacterium Xanthomonas campestris pv. vesicatoria (Xcv) causes bacterial spot disease of pepper and tomato by direct translocation of type III effector proteins into the plant cell cytosol. Once in the plant cell the effectors interfere with host cell processes and manipulate the plant transcriptome. Quantitative RT-PCR (qRT-PCR) is usually the method of choice to analyze transcriptional changes of selected plant genes. Reliable results depend, however, on measuring stably expressed reference genes that serve as internal normalization controls. We identified the most stably expressed tomato genes based on microarray analyses of Xcv-infected tomato leaves and evaluated the reliability of 11 genes for qRT-PCR studies in comparison to four traditionally employed reference genes. Three different statistical algorithms, geNorm, NormFinder and BestKeeper, concordantly determined the superiority of the newly identified reference genes. The most suitable reference genes encode proteins with homology to PHD finger family proteins and the U6 snRNA-associated protein LSm7. In addition, we identified pepper orthologs and validated several genes as reliable normalization controls for qRT-PCR analysis of Xcv-infected pepper plants. The newly identified reference genes will be beneficial for future qRT-PCR studies of the Xcv-tomato and Xcv-pepper pathosystems, as well as for the identification of suitable normalization controls for qRT-PCR studies of other plant-pathogen interactions, especially if related plant species are used in combination with bacterial pathogens.

  7. A comparative study of quantitative microsegregation analyses performed during the solidification of the Ni-base superalloy CMSX-10

    SciTech Connect

    Seo, Seong-Moon; Jeong, Hi-Won; Ahn, Young-Keun; Yun, Dae Won; Lee, Je-Hyun; Yoo, Young-Soo

    2014-03-01

    Quantitative microsegregation analyses were systematically carried out during the solidification of the Ni-base superalloy CMSX-10 to clarify the methodological effect on the quantification of microsegregation and to fully understand the solidification microstructure. Three experimental techniques, namely, mushy zone quenching (MZQ), planar directional solidification followed by quenching (PDSQ), and random sampling (RS), were implemented for the analysis of microsegregation tendency and the magnitude of solute elements by electron probe microanalysis. The microprobe data and the calculation results of the diffusion field ahead of the solid/liquid (S/L) interface of PDSQ samples revealed that the liquid composition at the S/L interface is significantly influenced by quenching. By applying the PDSQ technique, it was also found that the partition coefficients of all solute elements do not change appreciably during the solidification of primary γ. All three techniques could reasonably predict the segregation behavior of most solute elements. Nevertheless, the RS approach has a tendency to overestimate the magnitude of segregation for most solute elements when compared to the MZQ and PDSQ techniques. Moreover, the segregation direction of Cr and Mo predicted by the RS approach was found to be opposite from the results obtained by the MZQ and PDSQ techniques. This conflicting segregation behavior of Cr and Mo was discussed intensively. It was shown that the formation of Cr-rich areas near the γ/γ′ eutectic in various Ni-base superalloys, including the CMSX-10 alloy, could be successfully explained by the results of microprobe analysis performed on a sample quenched during the planar directional solidification of γ/γ′ eutectic. - Highlights: • Methodological effect on the quantification of microsegregation was clarified. • The liquid composition at the S/L interface was influenced by quenching. • The segregation direction of Cr varied depending on the
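
    The Scheil relation is not named in the abstract, but it is the standard model against which microsegregation measurements of this kind are usually interpreted; a sketch under that assumption, with a hypothetical nominal composition and partition coefficient (not values for CMSX-10):

      def scheil_cs(fs, c0, k):
          # Scheil solid composition at fraction solid fs:
          # Cs = k * C0 * (1 - fs)**(k - 1).
          return k * c0 * (1.0 - fs) ** (k - 1.0)

      c0, k = 6.0, 0.8   # hypothetical nominal wt.% solute and partition coefficient
      for fs in (0.0, 0.5, 0.9):
          print(f"fs = {fs:.1f}: Cs = {scheil_cs(fs, c0, k):.2f} wt.%")
      # k < 1 means the first solid is solute-lean (Cs = k*C0 at fs = 0) and the
      # residual liquid enriches; this is the tendency the MZQ, PDSQ and RS
      # techniques quantify in different ways.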

  8. Diachronous fault array growth within continental rift basins: Quantitative analyses from the East Shetland Basin, northern North Sea

    NASA Astrophysics Data System (ADS)

    Claringbould, Johan; Bell, Rebecca; Jackson, Christopher; Gawthorpe, Robert; Odinsen, Tore

    2016-04-01

    The evolution of rift basins has been the subject of many studies, however, these studies have been mainly restricted to investigating the geometry of rift-related fault arrays. The relative timing of development of individual faults that make up the fault array is not yet well constrained. First-order tectono-stratigraphic models for rifts predict that normal faults develop broadly synchronously throughout the basin during a temporally distinct 'syn-rift' episode. However, largely due to the mechanical interaction between adjacent structures, distinctly diachronous activity is known to occur on the scale of individual fault segments and systems. Our limited understanding of how individual segments and systems contribute to array-scale strain largely reflects the limited dimension and resolution of the data available and methods applied. Here we utilize a regional extensive subsurface dataset comprising multiple 3D seismic MegaSurveys (10,000 km²), long (>75 km) 2D seismic profiles, and exploration wells, to investigate the evolution of the fault array in the East Shetland Basin, North Viking Graben, northern North Sea. Previous studies propose this basin formed in response to multiphase rifting during two temporally distinct extensional phases in the Permian-Triassic and Middle-to-Late Jurassic, separated by a period of tectonic quiescence and thermal subsidence in the Early Jurassic. We document the timing of growth of individual structures within the rift-related fault array across the East Shetland Basin, constraining the progressive migration of strain from pre-Triassic-to-Late Jurassic. The methods used include (i) qualitative isochron map analysis, (ii) quantitative syn-kinematic deposit thickness difference across fault & expansion index calculations, and (iii) along fault throw-depth & backstripped displacement-length analyses. In contrast to established models, we demonstrate that the initiation, growth, and cessation of individual fault segments and
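
    Of the quantitative methods listed, the expansion index is the simplest to state: for each syn-kinematic unit it is the ratio of hanging-wall to footwall thickness across a fault, and values above 1 record growth of the fault during deposition. A minimal sketch with hypothetical thicknesses (the unit names and numbers below are illustrative, not measurements from the East Shetland Basin):

      # Hypothetical thicknesses (m) of the same stratal unit on either side of a fault.
      units = {
          "Upper Jurassic": {"hanging_wall": 420.0, "footwall": 180.0},
          "Lower Jurassic": {"hanging_wall": 210.0, "footwall": 205.0},
          "Upper Triassic": {"hanging_wall": 350.0, "footwall": 190.0},
      }

      for name, t in units.items():
          ei = t["hanging_wall"] / t["footwall"]           # expansion index
          active = "fault active" if ei > 1.1 else "little/no growth"
          print(f"{name}: expansion index = {ei:.2f} ({active})")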

  9. Spectral simulation methods for enhancing qualitative and quantitative analyses based on infrared spectroscopy and quantitative calibration methods for passive infrared remote sensing of volatile organic compounds

    NASA Astrophysics Data System (ADS)

    Sulub, Yusuf Ismail

    Infrared spectroscopy (IR) has over the years found a myriad of applications including passive environmental remote sensing of toxic pollutants and the development of a blood glucose sensor. In this dissertation, capabilities of both these applications are further enhanced with data analysis strategies employing digital signal processing and novel simulation approaches. Both quantitative and qualitative determinations of volatile organic compounds are investigated in the passive IR remote sensing research described in this dissertation. In the quantitative work, partial least-squares (PLS) regression analysis is used to generate multivariate calibration models for passive Fourier transform IR remote sensing measurements of open-air generated vapors of ethanol in the presence of methanol as an interfering species. A step-wise co-addition scheme coupled with a digital filtering approach is used to attenuate the effects of variation in optical path length or plume width. For the qualitative study, an IR imaging line scanner is used to acquire remote sensing data in both spatial and spectral domains. This technology is capable of not only identifying but also specifying the location of the sample under investigation. Successful implementation of this methodology is hampered by the huge costs incurred to conduct these experiments and the impracticality of acquiring large amounts of representative training data. To address this problem, a novel simulation approach is developed that generates training data based on synthetic analyte-active and measured analyte-inactive data. Subsequently, automated pattern classifiers are generated using piecewise linear discriminant analysis to predict the presence of the analyte signature in measured imaging data acquired in remote sensing applications. Near infrared glucose determinations based on the region of 5000–4000 cm⁻¹ are the focus of the research in the latter part of this dissertation. A six-component aqueous matrix of glucose
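
    A minimal sketch of the multivariate calibration step described above, using scikit-learn's PLS regression on synthetic two-component spectra (the band positions, noise level and component count are assumptions for illustration, not values from the dissertation):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      wavenumbers = np.linspace(800, 1200, 200)                        # arbitrary spectral axis
      ethanol_band = np.exp(-0.5 * ((wavenumbers - 1050) / 15) ** 2)   # analyte band
      methanol_band = np.exp(-0.5 * ((wavenumbers - 1030) / 15) ** 2)  # interferent band

      conc = rng.uniform(0, 1, size=(300, 2))             # [ethanol, methanol] per spectrum
      X = conc @ np.vstack([ethanol_band, methanol_band]) # mixture spectra
      X += rng.normal(0, 0.01, X.shape)                   # measurement noise
      y = conc[:, 0]                                      # calibrate for ethanol only

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
      pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
      print(f"R^2 on held-out spectra: {pls.score(X_te, y_te):.3f}")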

  10. An ultra-clean technique for accurately analysing Pb isotopes and heavy metals at high spatial resolution in ice cores with sub-pg g⁻¹ Pb concentrations.

    PubMed

    Burn, Laurie J; Rosman, Kevin J R; Candelone, Jean-Pierre; Vallelonga, Paul; Burton, Graeme R; Smith, Andrew M; Morgan, Vin I; Barbante, Carlo; Hong, Sungmin; Boutron, Claude F

    2009-02-23

    Measurements of Pb isotope ratios in ice containing sub-pg g⁻¹ concentrations are easily compromised by contamination, particularly where limited sample is available. Improved techniques are essential if Antarctic ice cores are to be analysed with sufficient spatial resolution to reveal seasonal variations due to climate. This was achieved here by using stainless steel chisels and saws and strict protocols in an ultra-clean cold room to decontaminate and section ice cores. Artificial ice cores, prepared from high-purity water, were used to develop and refine the procedures and quantify blanks. Ba and In, two other important elements present at pg g⁻¹ and fg g⁻¹ concentrations in polar ice, were also measured. The final blank amounted to 0.2 ± 0.2 pg of Pb with ²⁰⁶Pb/²⁰⁷Pb and ²⁰⁸Pb/²⁰⁷Pb ratios of 1.16 ± 0.12 and 2.35 ± 0.16, respectively, 1.5 ± 0.4 pg of Ba and 0.6 ± 2.0 fg of In, most of which probably originates from abrasion of the steel saws by the ice. The procedure was demonstrated on a Holocene Antarctic ice core section and was shown to contribute blanks of only ~5%, ~14% and ~0.8% to monthly resolved samples with respective Pb, Ba and In concentrations of 0.12 pg g⁻¹, 0.3 pg g⁻¹ and 2.3 fg g⁻¹. Uncertainties in the Pb isotopic ratio measurements were degraded by only ~0.2%.
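
    The quoted blank percentages follow directly from the blank masses and sample concentrations once a sample mass is assumed; a monthly sample of roughly 33 g (not stated in the abstract, inferred here for illustration) reproduces the reported ~5%, ~14% and ~0.8% figures:

      blank = {"Pb": 0.2, "Ba": 1.5, "In": 0.6e-3}    # pg (In converted from fg)
      conc  = {"Pb": 0.12, "Ba": 0.3, "In": 2.3e-3}   # pg per g of ice
      sample_mass = 33.0                              # g, assumed monthly sample

      for element in blank:
          frac = blank[element] / (conc[element] * sample_mass) * 100
          print(f"{element}: blank contributes ~{frac:.1f}% of the total analysed")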

  11. Reduced Number of Pigmented Neurons in the Substantia Nigra of Dystonia Patients? Findings from Extensive Neuropathologic, Immunohistochemistry, and Quantitative Analyses

    PubMed Central

    Iacono, Diego; Geraci-Erck, Maria; Peng, Hui; Rabin, Marcie L.; Kurlan, Roger

    2015-01-01

    Background Dystonias (Dys) represent the third most common movement disorder after essential tremor (ET) and Parkinson's disease (PD). While some pathogenetic mechanisms and genetic causes of Dys have been identified, little is known about their neuropathologic features. Previous neuropathologic studies have reported generically defined neuronal loss in various cerebral regions of Dys brains, mostly in the basal ganglia (BG), and specifically in the substantia nigra (SN). Enlarged pigmented neurons in the SN of Dys patients with and without specific genetic mutations (e.g., GAG deletions in DYT1 dystonia) have also been described. Whether or not Dys brains are associated with decreased numbers or other morphometric changes of specific neuronal types is unknown and has never been addressed with quantitative methodologies. Methods Quantitative immunohistochemistry protocols were used to estimate neuronal counts and volumes of nigral pigmented neurons in 13 SN of Dys patients and 13 SN of age-matched control subjects (C). Results We observed a significant reduction (∼20%) of pigmented neurons in the SN of Dys compared to C (p<0.01). Neither significant volumetric changes nor evident neurodegenerative signs were observed in the remaining pool of nigral pigmented neurons in Dys brains. These novel quantitative findings were confirmed after exclusion of possible co-occurring SN pathologies including Lewy pathology, tau-neurofibrillary tangles, β-amyloid deposits, ubiquitin (ubiq), and phosphorylated-TAR DNA-binding protein 43 (pTDP43)-positive inclusions. Discussion A reduced number of nigral pigmented neurons in the absence of evident neurodegenerative signs in Dys brains could indicate previously unconsidered pathogenetic mechanisms of Dys such as neurodevelopmental defects in the SN. PMID:26069855

  12. Characterization of a Highly Conserved Histone Related Protein, Ydl156w, and Its Functional Associations Using Quantitative Proteomic Analyses*

    PubMed Central

    Gilmore, Joshua M.; Sardiu, Mihaela E.; Venkatesh, Swaminathan; Stutzman, Brent; Peak, Allison; Seidel, Chris W.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.

    2012-01-01

    A significant challenge in biology is to functionally annotate novel and uncharacterized proteins. Several approaches are available for deducing the function of proteins in silico based upon sequence homology and physical or genetic interaction, yet this approach is limited to proteins with well-characterized domains, paralogs and/or orthologs in other species, as well as on the availability of suitable large-scale data sets. Here, we present a quantitative proteomics approach extending the protein network of core histones H2A, H2B, H3, and H4 in Saccharomyces cerevisiae, among which a novel associated protein, the previously uncharacterized Ydl156w, was identified. In order to predict the role of Ydl156w, we designed and applied integrative bioinformatics, quantitative proteomics and biochemistry approaches aiming to infer its function. Reciprocal analysis of Ydl156w protein interactions demonstrated a strong association with all four histones and also to proteins strongly associated with histones including Rim1, Rfa2 and 3, Yku70, and Yku80. Through a subsequent combination of the focused quantitative proteomics experiments with available large-scale genetic interaction data and Gene Ontology functional associations, we provided sufficient evidence to associate Ydl156w with multiple processes including chromatin remodeling, transcription and DNA repair/replication. To gain deeper insights into the role of Ydl156w in histone biology we investigated the effect of the genetic deletion of ydl156w on H4 associated proteins, which lead to a dramatic decrease in the association of H4 with RNA polymerase III proteins. The implication of a role for Ydl156w in RNA Polymerase III mediated transcription was consequently verified by RNA-Seq experiments. Finally, using these approaches we generated a refined network of Ydl156w-associated proteins. PMID:22199229

  13. Quantitative and qualitative analyses of under-balcony acoustics with real and simulated arrays of multiple sources

    NASA Astrophysics Data System (ADS)

    Kwon, Youngmin

    The objective of this study was to quantitatively and qualitatively identify the acoustics of the under-balcony areas in music performance halls under realistic conditions approximating an orchestral performance, taking into account multiple instrumental sources and their diverse sound propagation patterns. The study executed monaural and binaural impulse response measurements with an array of sixteen directional sources (loudspeakers) for acoustical assessments. Actual measurements in a performance hall as well as computer simulations were conducted for the quantitative assessments. Psycho-acoustical listening tests were conducted for the qualitative assessments using the music signals binaurally recorded in the hall with the same source array. The results obtained from the multiple directional source tests were analyzed by comparing them to those obtained from the tests performed with a single omni-directional source. These two sets of results obtained in the under-balcony area were also compared to those obtained in the main orchestra area. The quantitative results showed that the use of a single source conforming to conventional measurement protocol appears adequate for measurements of room acoustical parameters such as EDTmid, RTmid, C80 (averaged over 500 Hz–2 kHz), IACCE3 and IACCL3. These quantitative measures, however, did not always agree with the results of the qualitative assessments. The primary reason is that, in many other acoustical analysis respects, the acoustical phenomena observed in the multiple source measurements were not similar to those observed in the single source measurements. Remarkable differences were observed in time-domain impulse responses, frequency content, spectral distribution, directional distribution of the early reflections, and in sound energy density over time. Therefore, the room acoustical parameters alone should not be the acoustical representative characterizing a performance hall or a specific area such as the under
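
    Several of the parameters named above are simple energy ratios over the measured impulse response; for example the clarity index C80 compares the energy arriving within 80 ms of the direct sound to the energy arriving later. A sketch on a synthetic exponentially decaying response (the sampling rate and decay constant are arbitrary):

      import numpy as np

      def clarity_c80(ir, fs):
          # C80 = 10*log10(early/late energy), split 80 ms after the direct sound.
          onset = np.argmax(np.abs(ir))        # crude direct-sound detection
          split = onset + int(0.080 * fs)
          early = np.sum(ir[onset:split] ** 2)
          late = np.sum(ir[split:] ** 2)
          return 10.0 * np.log10(early / late)

      fs = 48_000
      t = np.arange(0, 1.5, 1 / fs)
      ir = np.random.default_rng(2).normal(size=t.size) * np.exp(-t / 0.4)  # synthetic decay
      print(f"C80 = {clarity_c80(ir, fs):.1f} dB")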

  14. Quantitative global proteome and lysine succinylome analyses provide insights into metabolic regulation and lymph node metastasis in gastric cancer.

    PubMed

    Song, Yongxi; Wang, Jun; Cheng, Zhongyi; Gao, Peng; Sun, Jingxu; Chen, Xiaowan; Chen, Chen; Wang, Yunlong; Wang, Zhenning

    2017-02-06

    With the rapid development of high-throughput quantitative proteomic and transcriptomic approaches, the molecular mechanisms of cancers have been comprehensively explored. However, cancer is a multi-dimensional disease with sophisticated regulations, and few studies focus on the crosstalk among multiomics. In order to explore the molecular mechanisms of gastric cancer (GC), particularly in the process of lymph node metastasis (LNM), we investigated dynamic profiling changes as well as crosstalk between long non-coding RNAs (lncRNAs), the proteome, and the lysine succinylome. Our study reports the first qualitative and quantitative profile of lysine succinylation in GC. We identified a novel mechanism through which the TCA cycle and pentose phosphate pathway might be regulated through lysine succinylation in their core enzymes. We then examined the potential of using lysine succinylation as a biomarker for GC and successfully developed a succinylation-dependent antibody for the K569 site in Caldesmon as a putative biomarker. Finally, we investigated the relationship between the lysine succinylome and lncRNAs, identifying potential crosstalk between two lncRNAs and one succinylation site. These results expand our understanding of the mechanisms of tumorigenesis and provide new information for the diagnosis and prognosis of GC.

  15. Quantitative global proteome and lysine succinylome analyses provide insights into metabolic regulation and lymph node metastasis in gastric cancer

    PubMed Central

    Song, Yongxi; Wang, Jun; Cheng, Zhongyi; Gao, Peng; Sun, Jingxu; Chen, Xiaowan; Chen, Chen; Wang, Yunlong; Wang, Zhenning

    2017-01-01

    With the rapid development of high-throughput quantitative proteomic and transcriptomic approaches, the molecular mechanisms of cancers have been comprehensively explored. However, cancer is a multi-dimensional disease with sophisticated regulations, and few studies focus on the crosstalk among multiomics. In order to explore the molecular mechanisms of gastric cancer (GC), particularly in the process of lymph node metastasis (LNM), we investigated dynamic profiling changes as well as crosstalk between long non-coding RNAs (lncRNAs), the proteome, and the lysine succinylome. Our study reports the first qualitative and quantitative profile of lysine succinylation in GC. We identified a novel mechanism through which the TCA cycle and pentose phosphate pathway might be regulated through lysine succinylation in their core enzymes. We then examined the potential of using lysine succinylation as a biomarker for GC and successfully developed a succinylation-dependent antibody for the K569 site in Caldesmon as a putative biomarker. Finally, we investigated the relationship between the lysine succinylome and lncRNAs, identifying potential crosstalk between two lncRNAs and one succinylation site. These results expand our understanding of the mechanisms of tumorigenesis and provide new information for the diagnosis and prognosis of GC. PMID:28165029

  16. Semi-quantitative ion microprobe mass analyses of mineral-rich particles from the upper freeport coal

    USGS Publications Warehouse

    Finkelman, R.B.; Simons, D.S.; Dulong, F.T.; Steel, E.B.

    1984-01-01

    An ion microprobe mass analyzer (IMMA) has been used to analyze semi-quantitatively mineral-rich coal particles from two separate facies of the Upper Freeport coal bed. Accuracy is estimated to be ±20% for those elements making up more than 0.1 wt.% of the particles and ±50% for elements making up less than 0.1 wt.%. Using IMMA data, we found statistically significant differences between the two samples for five (Fe, Ca, Mn, Li, Ce) of the 25 elements detected. For Li and Mn the differences between the mineral-rich particles within samples were similar to differences found between samples on a whole-coal basis. For Ca and Fe, the differences are attributed to different modes of occurrence, and for Ce, the differences are probably due to an irregular distribution of an inorganic phase. We conclude that the IMMA can be used to obtain semi-quantitative data that may provide insight into the distribution and mode of occurrence of some of the elements in coal. © 1984.

  17. AquaLite, a bioluminescent label for immunoassay and nucleic acid detection: quantitative analyses at the attomol level

    NASA Astrophysics Data System (ADS)

    Smith, David F.; Stults, Nancy L.

    1996-04-01

    AquaLite® is a direct, bioluminescent label capable of detecting attomol levels of analyte in clinical immunoassays and assays for the quantitative measurement of nucleic acids. Bioluminescent immunoassays (BIAs) require no radioisotopes and avoid complex fluorescence measurements and many of the variables of indirect enzyme immunoassays (EIAs). AquaLite, a recombinant form of the photoprotein aequorin from a bioluminescent jellyfish, is coupled directly to antibodies to prepare bioluminescent conjugates for assay development. When the AquaLite-antibody complex is exposed to a solution containing calcium ions, a flash of blue light (λmax = 469 nm) is generated. The light signal is measured in commercially available luminometers that simultaneously inject a calcium solution and detect subattomol photoprotein levels in either test tubes or microtiter plates. Immunometric or 'sandwich' type assays are available for the quantitative measurement of human endocrine hormones and nucleic acids. The AquaLite TSH assay can detect 1 attomol of thyroid stimulating hormone (TSH) in 0.2 mL of human serum and is a useful clinical tool for diagnosing hyperthyroid patients. AquaLite-based nucleic acid detection permits quantifying attomol levels of specific nucleic acid markers and represents a possible solution to the difficult problem of quantifying the targets of nucleic acid amplification methods.

  18. Differential label-free quantitative proteomic analysis of Shewanella oneidensis cultured under aerobic and suboxic conditions by accurate mass and time tag approach.

    PubMed

    Fang, Ruihua; Elias, Dwayne A; Monroe, Matthew E; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D; Callister, Stephen J; Moore, Ronald J; Gorby, Yuri A; Adkins, Joshua N; Fredrickson, Jim K; Lipton, Mary S; Smith, Richard D

    2006-04-01

    We describe the application of LC-MS without the use of stable isotope labeling for differential quantitative proteomic analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and suboxic conditions. LC-MS/MS was used to initially identify peptide sequences, and LC-FTICR was used to confirm these identifications as well as measure relative peptide abundances. 2343 peptides covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as statistical analysis of microarrays, whereas another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis was transitioned from aerobic to suboxic conditions.

  19. Differential Label-free Quantitative Proteomic Analysis of Shewanella oneidensis Cultured under Aerobic and Suboxic Conditions by Accurate Mass and Time Tag Approach

    SciTech Connect

    Fang, Ruihua; Elias, Dwayne A.; Monroe, Matthew E.; Shen, Yufeng; McIntosh, Martin; Wang, Pei; Goddard, Carrie D.; Callister, Stephen J.; Moore, Ronald J.; Gorby, Yuri A.; Adkins, Joshua N.; Fredrickson, Jim K.; Lipton, Mary S.; Smith, Richard D.

    2006-04-01

    We describe the application of liquid chromatography coupled to mass spectrometry (LC/MS) without the use of stable isotope labeling for differential quantitative proteomics analysis of whole cell lysates of Shewanella oneidensis MR-1 cultured under aerobic and sub-oxic conditions. Liquid chromatography coupled to tandem mass spectrometry (LC-MS/MS) was used to initially identify peptide sequences, and LC coupled to Fourier transform ion cyclotron resonance mass spectrometry (LC-FTICR) was used to confirm these identifications, as well as measure relative peptide abundances. 2343 peptides, covering 668 proteins were identified with high confidence and quantified. Among these proteins, a subset of 56 changed significantly using statistical approaches such as SAM, while another subset of 56 that were annotated as performing housekeeping functions remained essentially unchanged in relative abundance. Numerous proteins involved in anaerobic energy metabolism exhibited up to a 10-fold increase in relative abundance when S. oneidensis is transitioned from aerobic to sub-oxic conditions.

  20. Validation of real-time PCR analyses for line-specific quantitation of genetically modified maize and soybean using new reference molecules.

    PubMed

    Shindo, Yoichiro; Kuribara, Hideo; Matsuoka, Takeshi; Futo, Satoshi; Sawada, Chihiro; Shono, Jinji; Akiyama, Hiroshi; Goda, Yukihiro; Toyoda, Masatake; Hino, Akihiro

    2002-01-01

    Novel analytical methods based on real-time quantitative polymerase chain reactions by use of new reference molecules were validated in interlaboratory studies for the quantitation of genetically modified (GM) maize and soy. More than 13 laboratories from Japan, Korea, and the United States participated in the studies. The interlaboratory studies included 2 separate stages: (1) measurement tests of coefficient values, i.e., the ratio of recombinant DNA (r-DNA) sequence to endogenous DNA sequence in the seeds of GM maize and GM soy; and (2) blind tests with 6 pairs of maize and soy samples, including different levels of GM maize or GM soy. Test results showed that the methods are applicable to the specific quantitation of the 5 lines of GM maize and one line of GM soy. After statistical treatment to remove outliers, the repeatability and reproducibility of these methods at a level of 5.0% were <13.7% and <15.9%, respectively. The quantitation limits of the methods were 0.50% for Bt11, T25, and MON810, and 0.10% for GA21, Event176, and Roundup Ready soy. The results of blind tests showed that the numerical information obtained from these methods will contribute to practical analyses for labeling systems of GM crops.
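
    The quantitation scheme described above expresses GM content as the r-DNA/endogenous copy ratio of a sample divided by the coefficient value, i.e. the same ratio measured in 100% GM seed. A minimal sketch with hypothetical copy numbers (not values from the interlaboratory study):

      def gm_percent(rdna_copies, endo_copies, coefficient):
          # GM content (%) from qPCR copy numbers; coefficient is the
          # r-DNA/endogenous ratio measured in 100% GM reference seed.
          return (rdna_copies / endo_copies) / coefficient * 100.0

      # Hypothetical copy numbers from standard-curve quantitation of one extract.
      print(f"GM maize: {gm_percent(1.2e3, 2.5e4, 0.98):.2f}%")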

  1. Oxalic Acid from Lentinula edodes Culture Filtrate: Antimicrobial Activity on Phytopathogenic Bacteria and Qualitative and Quantitative Analyses

    PubMed Central

    Kwak, A-Min; Lee, In-Kyoung; Lee, Sang-Yeop

    2016-01-01

    The culture filtrate of Lentinula edodes shows potent antimicrobial activity against the plant-pathogenic bacterium Ralstonia solanacearum. Bioassay-guided fractionation was conducted using Diaion HP-20 column chromatography, and the insoluble active compound was not adsorbed on the resin. Further fractionation by high-performance liquid chromatography (HPLC) suggested that the active compounds were organic acids. Nine organic acids were detected in the culture filtrate of L. edodes; oxalic acid was the major component and exhibited antibacterial activity against nine different phytopathogenic bacteria. Quantitative analysis by HPLC revealed that the content of oxalic acid was higher in the water extract from spent mushroom substrate than in liquid culture. This suggests that the water extract of spent L. edodes substrate is an eco-friendly control agent for plant diseases. PMID:28154495

  2. The influence of the design matrix on treatment effect estimates in the quantitative analyses of single-subject experimental design research.

    PubMed

    Moeyaert, Mariola; Ugille, Maaike; Ferron, John M; Beretvas, S Natasha; Van den Noortgate, Wim

    2014-09-01

    The quantitative methods for analyzing single-subject experimental data have expanded during the last decade, including the use of regression models to statistically analyze the data, but many questions remain. One question is how to specify predictors in a regression model so as to account for the specifics of the design and to estimate the effect size of interest. These quantitative effect sizes are used in retrospective analyses and allow synthesis of single-subject experimental study results, which is informative for evidence-based decision making, research and theory building, and policy discussions. We discuss different design matrices that can be used for the most common single-subject experimental designs (SSEDs), namely, multiple-baseline designs, reversal designs, and alternating treatment designs, and provide empirical illustrations. The purpose of this article is to guide single-subject experimental data analysts interested in analyzing and meta-analyzing SSED data.
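
    As an illustration of the design-matrix choices at issue, one common specification for a single case codes an immediate level change and a change in slope as separate predictors; a sketch with simulated data (the centring convention used here is one of several debated options, and all numbers are invented):

      import numpy as np

      n_base, n_treat = 5, 10
      t = np.arange(n_base + n_treat)                    # session number
      phase = (t >= n_base).astype(float)                # 0 = baseline, 1 = treatment
      t_in_treat = np.where(phase == 1, t - n_base, 0)   # time since treatment start

      # Columns: intercept, baseline trend, level change, slope change.
      X = np.column_stack([np.ones_like(t), t, phase, t_in_treat])

      rng = np.random.default_rng(3)
      y = 2.0 + 0.1 * t + 3.0 * phase + 0.5 * t_in_treat + rng.normal(0, 0.5, t.size)
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)
      print("estimated [intercept, trend, level change, slope change]:", beta.round(2))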

  3. PCR Bias in Ecological Analysis: a Case Study for Quantitative Taq Nuclease Assays in Analyses of Microbial Communities

    PubMed Central

    Becker, Sven; Böger, Peter; Oehlmann, Ralfh; Ernst, Anneliese

    2000-01-01

    Succession of ecotypes, physiologically diverse strains with negligible rRNA sequence divergence, may explain the dominance of small, red-pigmented (phycoerythrin-rich) cyanobacteria in the autotrophic picoplankton of deep lakes (C. Postius and A. Ernst, Arch. Microbiol. 172:69–75, 1999). In order to test this hypothesis, it is necessary to determine the abundance of specific ecotypes or genotypes in a mixed background of phylogenetically similar organisms. In this study, we examined the performance of Taq nuclease assays (TNAs), PCR-based assays in which the amount of an amplicon is monitored by hydrolysis of a labeled oligonucleotide (TaqMan probe) when hybridized to the amplicon. High accuracy and a detection range spanning 7 orders of magnitude made the real-time TNA superior to the corresponding end point technique. However, in samples containing mixtures of homologous target sequences, quantification can be biased due to limited specificity of PCR primers and probe oligonucleotides and due to accumulation of amplicons that are not detected by the TaqMan probe. A decrease in reaction efficiency, which can be recognized by direct monitoring of amplification, provides experimental evidence for the presence of such a problem and emphasizes the need for real-time technology in quantitative PCR. Use of specific primers and probes and control of amplification efficiency allow correct quantification of target DNA in the presence of an up to 10⁴-fold excess of phylogenetically similar DNA and of an up to 10⁷-fold excess of dissimilar DNA. PMID:11055948
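
    Quantification in such real-time assays conventionally runs through a standard curve in which the threshold cycle Ct is linear in log10 of the starting copy number; a minimal sketch with hypothetical calibration parameters (a slope of -3.32 corresponds to 100% amplification efficiency):

      def copies_from_ct(ct, slope, intercept):
          # Standard-curve quantitation: Ct = slope * log10(copies) + intercept.
          return 10 ** ((ct - intercept) / slope)

      slope, intercept = -3.32, 38.0   # hypothetical standard curve
      # Amplification efficiency implied by the slope: E = 10**(-1/slope) - 1.
      for ct in (20.0, 27.0, 34.0):
          print(f"Ct {ct}: {copies_from_ct(ct, slope, intercept):,.0f} copies")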

  4. LobeFinder: A Convex Hull-Based Method for Quantitative Boundary Analyses of Lobed Plant Cells

    PubMed Central

    Wu, Tzu-Ching; Belteton, Samuel A.; Szymanski, Daniel B.; Umulis, David M.

    2016-01-01

    Dicot leaves are composed of a heterogeneous mosaic of jigsaw puzzle piece-shaped pavement cells that vary greatly in size and the complexity of their shape. Given the importance of the epidermis and this particular cell type for leaf expansion, there is a strong need to understand how pavement cells morph from a simple polyhedral shape into highly lobed and interdigitated cells. At present, it is still unclear how and when the patterns of lobing are initiated in pavement cells, and one major technological bottleneck to addressing the problem is the lack of a robust and objective methodology to identify and track lobing events during the transition from simple cell geometry to lobed cells. We developed a convex hull-based algorithm termed LobeFinder to identify lobes, quantify geometric properties, and create a useful graphical output of cell coordinates for further analysis. The algorithm was validated against manually curated images of pavement cells of widely varying sizes and shapes. The ability to objectively count and detect new lobe initiation events provides an improved quantitative framework to analyze mutant phenotypes, detect symmetry-breaking events in time-lapse image data, and quantify the time-dependent correlation between cell shape change and intracellular factors that may play a role in the morphogenesis process. PMID:27288363
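
    A minimal sketch of the convex-hull idea behind LobeFinder (not the published algorithm itself): the hull of the cell outline spans the lobe tips, so the depth of outline points inside the hull flags the indented regions between lobes. The synthetic three-lobed outline and the depth threshold are assumptions for illustration:

      import numpy as np
      from scipy.spatial import ConvexHull

      # Hypothetical wavy cell outline: lobes alternate with indentations.
      theta = np.linspace(0, 2 * np.pi, 720, endpoint=False)
      r = 10 + 2.5 * np.sin(3 * theta)
      pts = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

      hull = ConvexHull(pts)
      # Depth of each outline point inside the hull (0 for points on the hull);
      # hull.equations holds outward facet normals and offsets (A, b) with A.x + b <= 0 inside.
      depth = -np.max(pts @ hull.equations[:, :2].T + hull.equations[:, 2], axis=1)

      indented = depth > 0.5                                  # threshold, outline units
      n_regions = np.sum(indented & ~np.roll(indented, 1))    # contiguous indented runs
      print(f"indented regions between lobes: {n_regions}")   # ~3 for this shape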

  5. Filamin C, a dysregulated protein in cancer revealed by label-free quantitative proteomic analyses of human gastric cancer cells.

    PubMed

    Qiao, Jie; Cui, Shu-Jian; Xu, Lei-Lei; Chen, Si-Jie; Yao, Jun; Jiang, Ying-Hua; Peng, Gang; Fang, Cai-Yun; Yang, Peng-Yuan; Liu, Feng

    2015-01-20

    Gastric cancer (GC) is the fourth and fifth most common cancer in men and women, respectively. We identified 2,750 proteins at false discovery rates of 1.3% (protein) and 0.03% (spectrum) by comparing the proteomic profiles of three GC cell lines and a normal gastric cell line. Nine proteins were significantly dysregulated in all three GC cell lines, including filamin C, a muscle-specific filamin and a large actin-cross-linking protein. Downregulation of filamin C in GC cell lines and tissues was verified using quantitative PCR and immunohistochemistry. Data-mining using public microarray datasets showed that filamin C was significantly reduced in many human primary and metastatic cancers. Transient expression or silencing of filamin C affected the proliferation and colony formation of cancer cells. Silencing of endogenous filamin C enhanced cancer cell migration and invasion, whereas ectopic expression of filamin C had opposing effects. Silencing of filamin C increased the expression of matrix metallopeptidase 2 and enhanced the metastasis of prostate cancer in a zebrafish model. High filamin C expression was associated with better prognosis in prostate cancer, leukemia and breast cancer patients. These findings establish a functional role of filamin C in human cancers and these data will be valuable for further study of its mechanisms.

  6. Comprehensive and quantitative proteomic analyses of zebrafish plasma reveals conserved protein profiles between genders and between zebrafish and human.

    PubMed

    Li, Caixia; Tan, Xing Fei; Lim, Teck Kwang; Lin, Qingsong; Gong, Zhiyuan

    2016-04-13

    Omic approaches have been increasingly used in the zebrafish model for a holistic understanding of molecular events and mechanisms of tissue functions. However, plasma is rarely used for omic profiling because of the technical challenges in collecting sufficient blood. In this study, we employed two mass spectrometric (MS) approaches for a comprehensive characterization of the zebrafish plasma proteome, i.e. conventional shotgun liquid chromatography-tandem mass spectrometry (LC-MS/MS) for an overview study and quantitative SWATH (Sequential Window Acquisition of all THeoretical fragment-ion spectra) for comparison between genders. 959 proteins were identified in the shotgun profiling, with estimated concentrations spanning almost five orders of magnitude. Other than the presence of a few highly abundant female egg yolk precursor proteins (vitellogenins), the proteomic profiles of male and female plasmas were very similar in both number and abundance, and there were basically no other highly gender-biased proteins. The types of plasma proteins based on IPA (Ingenuity Pathway Analysis) classification and their tissue sources of production were also very similar. Furthermore, the zebrafish plasma proteome shares significant similarities with the human plasma proteome, in particular in top abundant proteins including apolipoproteins and complements. Thus, the current study provided a valuable dataset for future evaluation of plasma proteins in zebrafish.

  7. Quantitative analyses of ammonia-oxidizing archaea (AOA) and ammonia-oxidizing bacteria (AOB) in fields with different soil types.

    PubMed

    Morimoto, Sho; Hayatsu, Masahito; Takada Hoshino, Yuko; Nagaoka, Kazunari; Yamazaki, Masatsugu; Karasawa, Toshihiko; Takenaka, Makoto; Akiyama, Hiroko

    2011-01-01

    Soil type is one of the key factors affecting soil microbial communities. With regard to ammonia-oxidizing archaea (AOA) and ammonia-oxidizing bacteria (AOB), however, it has not been determined how soil type affects their community size and soil nitrification activity. Here we quantitatively analyzed the ammonia monooxygenase genes (amoA) of these ammonia oxidizers in fields with three different soil types (Low-humic Andosol [LHA], Gray Lowland Soil [GLS], and Yellow Soil [YS]) under common cropping conditions, and assessed the relationships between soil nitrification activity and the abundance of each amoA. Nitrification activity of LHA was highest, followed by that of GLS and YS; this order was consistent with that for the abundance of AOB amoA. Abundance of AOB amoA showed temporal variation, which was similar to that observed in nitrification activity, and a strong relationship (adjusted R² = 0.742) was observed between the abundance of AOB amoA and nitrification activity. Abundance of AOA amoA also exhibited a significant relationship (adjusted R² = 0.228) with nitrification activity, although this relationship was much weaker. Our results indicate that soil type affects the community size of AOA and AOB and the resulting nitrification activity, and that AOB are major contributors to nitrification in soils, while AOA are partially responsible.
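
    The adjusted R² values quoted above penalize the ordinary R² for the number of model parameters; a sketch of the computation on a hypothetical regression of nitrification activity against log10 amoA copy number (the data below are invented, not from this study):

      import numpy as np

      def adjusted_r2(y, y_hat, n_params):
          # R^2 penalized for model size: 1 - (1 - R^2)*(n - 1)/(n - p - 1).
          y, y_hat = np.asarray(y), np.asarray(y_hat)
          n = y.size
          r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
          return 1 - (1 - r2) * (n - 1) / (n - n_params - 1)

      log_amoa = np.array([6.2, 6.8, 7.1, 7.5, 7.9, 8.3])   # hypothetical log10 copies/g soil
      activity = np.array([0.8, 1.6, 2.1, 2.6, 3.4, 3.9])   # hypothetical nitrification rate
      slope, intercept = np.polyfit(log_amoa, activity, 1)
      pred = slope * log_amoa + intercept
      print(f"adjusted R^2 = {adjusted_r2(activity, pred, n_params=1):.3f}")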

  8. COTS-Based Fault Tolerance in Deep Space: Qualitative and Quantitative Analyses of a Bus Network Architecture

    NASA Technical Reports Server (NTRS)

    Tai, Ann T.; Chau, Savio N.; Alkalai, Leon

    2000-01-01

    Using COTS products, standards and intellectual properties (IPs) for all the system and component interfaces is a crucial step toward significant reduction of both system cost and development cost, as the COTS interfaces enable other COTS products and IPs to be readily accommodated by the target system architecture. With respect to the long-term survivable systems for deep-space missions, the major challenge for us is, under stringent power and mass constraints, to achieve ultra-high reliability of the system comprising COTS products and standards that are not developed for mission-critical applications. The spirit of our solution is to exploit the pertinent standard features of a COTS product to circumvent its shortcomings, though these standard features may not have been originally designed for highly reliable systems. In this paper, we discuss our experiences and findings on the design of an IEEE 1394 compliant fault-tolerant COTS-based bus architecture. We first derive and qualitatively analyze a 'stack-tree topology' that not only complies with IEEE 1394 but also enables the implementation of a fault-tolerant bus architecture without node redundancy. We then present a quantitative evaluation that demonstrates significant reliability improvement from the COTS-based fault tolerance.
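
    This is not the reliability model used in the paper, but the flavor of such a quantitative evaluation can be suggested with a simple k-of-n calculation: a topology that tolerates the loss of one of n identical links is markedly more reliable than one that requires all n (the per-link reliability and link count below are hypothetical):

      from math import comb

      def k_of_n_reliability(k, n, r):
          # Probability that at least k of n identical, independent components survive.
          return sum(comb(n, i) * r**i * (1 - r)**(n - i) for i in range(k, n + 1))

      r_link = 0.98   # hypothetical per-link reliability over the mission life
      n = 16          # hypothetical number of bus links
      print(f"no redundancy     : {r_link**n:.4f}")
      print(f"tolerates 1 loss  : {k_of_n_reliability(n - 1, n, r_link):.4f}")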

  9. Vibrational algorithms for quantitative crystallographic analyses of hydroxyapatite-based biomaterials: II, application to decayed human teeth.

    PubMed

    Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato

    2015-05-01

    A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to assess the efficacy of vibrational assessments in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopy method. After solving basic physical challenges in a companion paper, we apply those solutions here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopy features could be directly translated into a rigorous and quantitative classification of the crystallographic and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease) and in the body of the lesion of cavitated caries.

  10. LobeFinder: A Convex Hull-Based Method for Quantitative Boundary Analyses of Lobed Plant Cells.

    PubMed

    Wu, Tzu-Ching; Belteton, Samuel A; Pack, Jessica; Szymanski, Daniel B; Umulis, David M

    2016-08-01

    Dicot leaves are composed of a heterogeneous mosaic of jigsaw puzzle piece-shaped pavement cells that vary greatly in size and the complexity of their shape. Given the importance of the epidermis and this particular cell type for leaf expansion, there is a strong need to understand how pavement cells morph from a simple polyhedral shape into highly lobed and interdigitated cells. At present, it is still unclear how and when the patterns of lobing are initiated in pavement cells, and one major technological bottleneck to addressing the problem is the lack of a robust and objective methodology to identify and track lobing events during the transition from simple cell geometry to lobed cells. We developed a convex hull-based algorithm termed LobeFinder to identify lobes, quantify geometric properties, and create a useful graphical output of cell coordinates for further analysis. The algorithm was validated against manually curated images of pavement cells of widely varying sizes and shapes. The ability to objectively count and detect new lobe initiation events provides an improved quantitative framework to analyze mutant phenotypes, detect symmetry-breaking events in time-lapse image data, and quantify the time-dependent correlation between cell shape change and intracellular factors that may play a role in the morphogenesis process.

  11. Quantitative analysis of coral communities of the Sanganeb-Atoll (central Red Sea). I. The community structure of outer and inner reefs exposed to different hydrodynamic conditions

    NASA Astrophysics Data System (ADS)

    Mergner, H.; Schuhmacher, H.

    1985-12-01

    The Sanganeb-Atoll off Port Sudan is an elongate annular reef which rests on a probably raised block in the fracture zone along the Red Sea graben. Its gross morphology was most likely formed by subaerial erosion during low sea-level conditions. Features of its topography and hydrography are described. The prevailing wind waves are from NE; hence, the outer and inner reef slopes are exposed to different hydrodynamic conditions. The sessile benthos was analysed using the quadrat method. Four test quadrats (5×5 m each) were selected on the outer and inner slopes at a depth of 10 m along a SSW-NNE transect across the atoll. Cnidaria were by far the most dominant group; coralline algae, Porifera, Bryozoa and Ascidia together accounted for just under 3% of the living cover. Light and temperature intensities did not differ significantly at the sites studied; water movement, however, decreased in the following order: TQ IV (outer NE side of the reef ring) was exposed to strong swell and surf; TQ II (inner side of the SW ring) was met by a strong long-reef current; TQ I was situated on the outer lee of the SW atoll ring and TQ III in the inner lee of the NE side. This hydrodynamic gradient correlates with the composition of the coral communities: from predominantly branching Scleractinia (staghorn-like and other Acropora species and Pocillopora) in TQ IV, through a Lobophyllia-, Porites- and Xenia-dominated community in TQ II, and a mixed community with an increasing percentage of xeniid and alcyoniid soft corals in TQ I, to a community in TQ III which is dominated by the soft corals Sinularia and Dendronephthya. The cnidarian cover ranged between 42.4% and 56.6%, whereby the two exposed test quadrats had a higher living coverage than the protected ones. In total, 2649 colonies comprising 124 species of stony, soft and hydrocorals were recorded by an elaborate method of accurate in-situ mapping. The 90 scleractinian species found include 3 species new to the Red Sea and 11 hitherto

  12. Quantitative and qualitative validations of a sonication-based DNA extraction approach for PCR-based molecular biological analyses.

    PubMed

    Dai, Xiaohu; Chen, Sisi; Li, Ning; Yan, Han

    2016-05-15

    The aim of this study was to comprehensively validate a sonication-based DNA extraction method as a potential replacement for the so-called 'standard DNA extraction method', the commercial kit method. Microbial cells from digested sludge, a sample type containing relatively high amounts of PCR-inhibitory substances such as humic acid and protein, were used as the test material. A procedure involving solid/liquid separation of the sludge sample and dilution of both DNA templates and inhibitors was established, the minimum template amount for PCR-based analyses was determined, and bias analysis by pyrosequencing provided an in-depth confirmation of the suitability of the sonication-based DNA extraction method.

  13. Quantitative and molecular analyses of mutation in a pSV2gpt transformed CHO cell line

    SciTech Connect

    Stankowski, L.F. Jr.; Tindall, K.R.; Hsie, A.W.

    1983-01-01

    Following DNA-mediated gene transfer we have isolated a cell line useful for studying gene mutation at the molecular level. This line, AS52, derived from a hypoxanthine-guanine phosphoribosyl transferase (HGPRT)-deficient Chinese hamster ovary (CHO) cell line, carries a single copy of the E. coli xanthine-guanine phosphoribosyl transferase (XGPRT) gene (gpt) and exhibits a spontaneous mutant frequency of 20 TGʳ mutants/10⁶ clonable cells. As with HGPRT⁻ mutants, XGPRT⁻ mutants can be selected in 6-thioguanine. AS52 (XGPRT⁺) and wild-type CHO (HGPRT⁺) cells exhibit almost identical cytotoxic responses to various agents. We observed significant differences in mutation induction by UV light and ethyl methanesulfonate (EMS). Ratios of XGPRT⁻ to HGPRT⁻ mutants induced per unit dose (J/m² for UV light and µg/ml for EMS) are 1.4 and 0.70, respectively. Preliminary Southern blot hybridization analyses have been performed on 30 XGPRT⁻ AS52 mutants. A majority of spontaneous mutants have deletions ranging in size from 1 to 4 kilobases (9/19) to complete loss of gpt sequences (4/19); the remainder have no detectable (5/19) or only minor (1/19) alterations. 5/5 UV-induced and 5/6 EMS-induced mutants do not show a detectable change. Similar analyses are underway for mutations induced by X-irradiation and ICR-191 treatment.

  14. Validation of reference genes for accurate normalization of gene expression for real-time quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    PubMed

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria×ananassa Duch) fruits is associated mainly with their sensorial characteristics and the content of antioxidant compounds. Nevertheless, strawberry production has been hampered by its sensitivity to abiotic stresses. Therefore, understanding the molecular mechanisms underlying the stress response is of great importance for enabling genetic engineering approaches aimed at improving strawberry tolerance. However, the study of gene expression in strawberry requires the use of suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper and the comparative delta-Ct method. The results indicate that expression stability is dependent on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was considered the most suitable to normalize expression data in samples of strawberry cultivars and under drought stress conditions, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic stresses and salt stress. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were considered the most unstable genes in all conditions. The expression of the phenylalanine ammonia lyase (PAL) and 9-cis epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that the use of an inappropriate reference gene may induce erroneous results. This study is the first survey of the stability of reference genes in strawberry cultivars and under osmotic stresses, and it provides guidelines to obtain more accurate RT-qPCR results for future breeding efforts.

  15. Performing statistical analyses on quantitative data in Taverna workflows: An example using R and maxdBrowse to identify differentially-expressed genes from microarray data

    PubMed Central

    Li, Peter; Castrillo, Juan I; Velarde, Giles; Wassink, Ingo; Soiland-Reyes, Stian; Owen, Stuart; Withers, David; Oinn, Tom; Pocock, Matthew R; Goble, Carole A; Oliver, Stephen G; Kell, Douglas B

    2008-01-01

    Background There has been a dramatic increase in the amount of quantitative data derived from the measurement of changes at different levels of biological complexity during the post-genomic era. However, there are a number of issues associated with the use of computational tools employed for the analysis of such data. For example, computational tools such as R and MATLAB require prior knowledge of their programming languages in order to implement statistical analyses on data. Combining two or more tools in an analysis may also be problematic since data may have to be manually copied and pasted between separate user interfaces for each tool. Furthermore, this transfer of data may require a reconciliation step in order for there to be interoperability between computational tools. Results Developments in the Taverna workflow system have enabled pipelines to be constructed and enacted for generic and ad hoc analyses of quantitative data. Here, we present an example of such a workflow involving the statistical identification of differentially-expressed genes from microarray data followed by the annotation of their relationships to cellular processes. This workflow makes use of customised maxdBrowse web services, a system that allows Taverna to query and retrieve gene expression data from the maxdLoad2 microarray database. These data are then analysed by R to identify differentially-expressed genes using the Taverna RShell processor which has been developed for invoking this tool when it has been deployed as a service using the RServe library. In addition, the workflow uses Beanshell scripts to reconcile mismatches of data between services as well as to implement a form of user interaction for selecting subsets of microarray data for analysis as part of the workflow execution. A new plugin system in the Taverna software architecture is demonstrated by the use of renderers for displaying PDF files and CSV formatted data within the Taverna workbench. Conclusion Taverna can
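
    The statistical core of such a workflow is conceptually simple; the hedged Python sketch below stands in for the R step (the actual pipeline invokes R through Taverna's RShell processor and an RServe deployment) and flags differentially-expressed genes with a per-gene t-test. Gene names and intensities are illustrative, and a real analysis would also apply a multiple-testing correction.

```python
# Minimal stand-in for the workflow's statistical step: flag
# differentially-expressed genes between two sample groups.
# Data are illustrative placeholders.
from scipy.stats import ttest_ind

expression = {  # gene -> ([control replicates], [treatment replicates])
    "GENE_A": ([5.1, 5.3, 5.0], [8.9, 9.2, 8.7]),
    "GENE_B": ([6.0, 6.2, 5.9], [6.1, 6.0, 6.2]),
}

for gene, (control, treated) in expression.items():
    t_stat, p_value = ttest_ind(control, treated)
    status = "differentially expressed" if p_value < 0.05 else "unchanged"
    print(f"{gene}: t = {t_stat:.2f}, p = {p_value:.4f} -> {status}")
```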

  16. Quantitative analyses of ammonia-oxidizing Archaea and bacteria in the sediments of four nitrogen-rich wetlands in China.

    PubMed

    Wang, Shanyun; Wang, Yu; Feng, Xiaojuan; Zhai, Liming; Zhu, Guibing

    2011-04-01

    With the rapid development of the ammonia-synthesizing industry, ammonia-nitrogen pollution in wetlands, which act as sinks for point and diffuse pollution, has increased dramatically. Most ammonia-nitrogen is oxidized at least once by ammonia-oxidizing prokaryotes to complete the nitrogen cycle. Current research has expanded the known ammonia-oxidizing prokaryotes from the domain Bacteria to the Archaea. However, in the complex wetland environment, it remains unclear whether ammonia oxidation is exclusively or predominantly linked to Archaea or Bacteria, as implied by specific high abundance. In this research, the abundance and composition of ammonia-oxidizing Archaea (AOA) and Bacteria (AOB) in the sediments of four kinds of wetlands with different nitrogen concentrations were investigated using quantitative real-time polymerase chain reaction, cloning, and sequencing approaches based on amoA genes. The results indicated that AOA were widely distributed in wetland sediments, and the phylogenetic tree revealed that archaeal amoA functional gene sequences from the wetland sediments cluster into two major evolutionary branches: soil/sediment and sediment/water. Bacteria functionally dominated microbial ammonia oxidation in the different wetland sediments on the basis of molecular analysis, potential nitrification rates, and soil chemistry. Moreover, the factors influencing AOA and AOB abundances were analyzed against environmental indicators; the copy numbers of the archaeal and bacterial amoA functional genes correlated most strongly with pH and ammonia concentration. pH had a relatively strong negative impact on the abundance of both AOA and AOB, while ammonia concentration showed a positive impact on AOB abundance only. These findings are fundamental to improving understanding of the importance of AOB and AOA in nitrogen and other nutrient cycles in wetland ecosystems.
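
    Gene abundances in such qPCR studies are typically read off a standard curve relating Ct to log10 gene copies. A minimal sketch, with an assumed (illustrative) slope and intercept rather than the study's calibration:

```python
# Hedged sketch of absolute qPCR quantification from a standard curve,
# as used to estimate amoA gene copy numbers. Slope and intercept are
# illustrative, not the study's calibration.
slope, intercept = -3.32, 38.0   # standard curve: Ct = slope * log10(copies) + intercept

def copies_from_ct(ct_value):
    """Invert the standard curve to get gene copies per reaction."""
    return 10 ** ((ct_value - intercept) / slope)

# Amplification efficiency implied by the slope (~1.00 means 100%).
efficiency = 10 ** (-1 / slope) - 1

for ct_value in (24.5, 28.1, 31.9):
    print(f"Ct {ct_value}: ~{copies_from_ct(ct_value):.2e} amoA copies "
          f"(assay efficiency {efficiency:.0%})")
```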

  17. Quantitative Proteomic Analyses Identify ABA-Related Proteins and Signal Pathways in Maize Leaves under Drought Conditions.

    PubMed

    Zhao, Yulong; Wang, Yankai; Yang, Hao; Wang, Wei; Wu, Jianyu; Hu, Xiuli

    2016-01-01

    Drought stress is one of the major factors causing maize yield loss. The roles of abscisic acid (ABA) in the crop response to drought stress have been widely studied. However, more attention is needed to identify key ABA-related proteins and to gain deeper molecular insight into drought stress in maize. Based on this need, the physiology and proteomics of the ABA-deficient maize mutant vp5 and its wild type Vp5 under drought stress were examined and analyzed. Malondialdehyde content increased and the quantum efficiency of photosystem II decreased under drought stress in both genotypes; however, the magnitude of the increase or decrease was significantly greater in vp5 than in Vp5. A total of 7051 proteins with overlapping expression patterns among three replicates in the two genotypes were identified by multiplex-run iTRAQ-based quantitative proteomics and liquid chromatography-tandem mass spectrometry, of which only 150 proteins (130 in Vp5, 27 in vp5) showed expression changes of at least 1.5-fold under drought stress. Among the 150 proteins, 67 and 60 proteins were up-regulated and down-regulated by drought stress in an ABA-dependent way, respectively. ABA was found to play active roles in regulating signaling pathways related to photosynthesis, oxidative phosphorylation (mainly related to ATP synthesis), and glutathione metabolism (involved in the antioxidative reaction) in the maize response to drought stress. Our results provide an extensive dataset of ABA-dependent, drought-regulated proteins in maize, which may help to elucidate the underlying mechanisms of ABA-enhanced drought tolerance in maize.

  18. Quantitative Signaling and Structure-Activity Analyses Demonstrate Functional Selectivity at the Nociceptin/Orphanin FQ Opioid Receptor

    PubMed Central

    Chang, Steven D.; Mascarella, S. Wayne; Spangler, Skylar M.; Gurevich, Vsevolod V.; Navarro, Hernan A.; Carroll, F. Ivy

    2015-01-01

    Comprehensive studies that consolidate selective ligands, quantitative comparisons of G protein versus arrestin-2/3 coupling, and structure-activity relationship models for G protein-coupled receptor (GPCR) systems remain uncommon. Here we examine biased signaling at the nociceptin/orphanin FQ opioid receptor (NOPR), the most recently identified member of the opioid receptor family. Using real-time, live-cell assays, we identified the signaling profiles of several NOPR-selective ligands in upstream GPCR signaling (G protein and arrestin pathways) to determine their relative transduction coefficients and signaling bias. Complementing this analysis, we designed novel ligands on the basis of the NOPR antagonist J-113,397 [(±)-1-[(3R*,4R*)-1-(cyclooctylmethyl)-3-(hydroxymethyl)-4-piperidinyl]-3-ethyl-1,3-dihydro-2H-benzimidazol-2-one] to explore structure-activity relationships. Our study shows that NOPR is capable of biased signaling and, further, that the NOPR-selective ligands MCOPPB [1-[1-(1-methylcyclooctyl)-4-piperidinyl]-2-(3R)-3-piperidinyl-1H-benzimidazole trihydrochloride] and NNC 63-0532 [8-(1-naphthalenylmethyl)-4-oxo-1-phenyl-1,3,8-triazaspiro[4.5]decane-3-acetic acid, methyl ester] are G protein-biased agonists. Additionally, minor structural modification of J-113,397 can dramatically shift signaling from antagonist to partial agonist activity. We explore these findings with in silico modeling of binding poses. This work is the first to demonstrate functional selectivity and to identify biased ligands at the nociceptin opioid receptor. PMID:26134494
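
    For readers unfamiliar with transduction coefficients: bias analyses of this kind commonly use the operational-model formulation sketched below. This is a hedged outline of the standard framework, not necessarily the exact expressions used in this paper.

```latex
% Hedged sketch: transduction coefficients and bias factor in the
% operational-model formulation commonly used for such analyses.
\[
  \log\!\left(\frac{\tau}{K_A}\right)
  \quad\text{(transduction coefficient, per ligand and per pathway)}
\]
\[
  \Delta\log(\tau/K_A)
  = \log(\tau/K_A)_{\text{ligand}} - \log(\tau/K_A)_{\text{reference}}
\]
\[
  \text{bias factor}
  = 10^{\,\Delta\Delta\log(\tau/K_A)}
  = 10^{\,\Delta\log(\tau/K_A)_{\text{G protein}}
        \;-\; \Delta\log(\tau/K_A)_{\text{arrestin}}}
\]
```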

  19. Evaluation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in Pyrus pyrifolia using different tissue samples and seasonal conditions.

    PubMed

    Imai, Tsuyoshi; Ubi, Benjamin E; Saito, Takanori; Moriguchi, Takaya

    2014-01-01

    We have evaluated suitable reference genes for real-time quantitative PCR (RT-qPCR) analysis in Japanese pear (Pyrus pyrifolia). We tested the genes most frequently used in the literature, such as β-Tubulin, Histone H3, Actin, Elongation factor-1α and Glyceraldehyde-3-phosphate dehydrogenase, together with the newly added genes Annexin, SAND and TIP41. A total of 17 primer combinations for these eight genes were evaluated using cDNAs synthesized from 16 tissue samples from four groups, namely flower bud, flower organ, fruit flesh and fruit skin. Gene expression stabilities were analyzed using the geNorm and NormFinder software packages or by the ΔCt method. geNorm analysis indicated that the three best-performing genes were sufficient for reliable normalization of RT-qPCR data. Suitable reference genes differed among sample groups, underscoring the importance of validating the expression stability of reference genes in the samples of interest. Stability rankings were basically similar between geNorm and NormFinder, suggesting that both programs, though based on different algorithms, are useful. The ΔCt method gave somewhat different results in some groups, such as flower organ and fruit skin, though the overall results correlated well with geNorm and NormFinder. The expression of two cold-inducible genes, PpCBF2 and PpCBF4, was quantified using the three most and the three least stable reference genes suggested by geNorm. Although the normalized quantities differed between them, the relative quantities within a group of samples were similar even when the least stable reference genes were used. Our data suggest that normalizing to the geometric mean of three reference genes is a reliable approach to evaluating gene expression by RT-qPCR. We propose that initial evaluation of gene expression stability by the ΔCt method, and subsequent evaluation by geNorm or NormFinder for a limited number of superior gene candidates, will be a practical way of finding out
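
    A minimal sketch of the geometric-mean normalization the authors recommend, assuming three validated reference genes and ideal amplification efficiency; gene names and Ct values are illustrative only:

```python
# Hedged sketch: normalize a target gene's expression by the geometric
# mean of three reference genes. Ct values are illustrative placeholders.
import math

def rel_quantity(ct_value, efficiency=2.0):
    """Relative quantity from a Ct, assuming perfect doubling per cycle."""
    return efficiency ** (-ct_value)

target_ct = 26.4
reference_cts = {"Actin": 20.1, "EF-1a": 18.9, "TIP41": 23.5}

# Normalization factor = geometric mean of the reference quantities.
ref_q = [rel_quantity(ct) for ct in reference_cts.values()]
normalization_factor = math.prod(ref_q) ** (1 / len(ref_q))

normalized = rel_quantity(target_ct) / normalization_factor
print(f"normalized target expression = {normalized:.3g}")
```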

  1. Quantitative determination of the oxidation state of iron in biotite using x-ray photoelectron spectroscopy: II. In situ analyses

    SciTech Connect

    Raeburn, S.P. |; Ilton, E.S.; Veblen, D.R.

    1997-11-01

    X-ray photoelectron spectroscopy (XPS) was used to determine Fe(III)/ΣFe in individual biotite crystals in thin sections of ten metapelites and one syenite. The in situ XPS analyses of Fe(III)/ΣFe in biotite crystals in the metapelites were compared with published Fe(III)/ΣFe values determined by Moessbauer spectroscopy (MS) for mineral separates from the same hand samples. The difference between Fe(III)/ΣFe by the two techniques was greatest for samples with the lowest Fe(III)/ΣFe (by MS). For eight metamorphic biotites with Fe(III)/ΣFe = 9-27%, comparison of the two techniques yielded a linear correlation of r = 0.94 and a statistically acceptable fit of [Fe(III)/ΣFe]_XPS = [Fe(III)/ΣFe]_MS. The difference between Fe(III)/ΣFe by the two techniques was greater for two samples with Fe(III)/ΣFe ≤ 6% (by MS). For biotite in the syenite sample, Fe(III)/ΣFe values determined by in situ XPS and by bulk wet chemistry/electron probe microanalysis were similar. This contribution demonstrates that XPS can be used to analyze bulk Fe(III)/ΣFe in minerals in thin sections when appropriate precautions are taken to avoid oxidation of the near-surface region during sample preparation. 25 refs., 3 figs., 4 tabs.

  2. Narrative and quantitative analyses of workers' compensation-covered injuries in short-haul vs. long-haul trucking.

    PubMed

    Chandler, Mark D; Bunn, Terry L; Slavova, Svetla

    2017-03-01

    Trucking remains one of the most dangerous industries in the U.S. The study aims were to (1) identify differences in worker injury types; (2) describe typical injury scenarios; and (3) recommend injury control measures in short-haul vs. long-haul trucking. Narrative text analyses of Kentucky short-haul and long-haul trucking workers' compensation first reports of injury were performed. A higher percentage of lifting and cranking injuries was identified in short-haul trucking, whereas long-haul trucking had a higher percentage of securing/opening/closing/adjusting injuries involving tarping, trailer door handling, and cab slippage. In contrast, a higher proportion of short-haul trucking injury scenarios involved roadway departures and rear-end collisions. Study findings can be used to inform intrastate vs. interstate trucking injury prevention strategies, such as enhanced driver safety training and safe freight handling in short-haul trucking, and tarping, trailer safety, and cab safety in long-haul trucking.

  3. The neutralization of interferons by antibody. I. Quantitative and theoretical analyses of the neutralization reaction in different bioassay systems.

    PubMed

    Grossberg, S E; Kawade, Y; Kohase, M; Yokoyama, H; Finter, N

    2001-09-01

    The highly specific ability of antibodies to inhibit the biologic activity of cytokines or other therapeutic proteins is widely used in research and is a subject of increasing clinical importance. A standardized approach to reporting neutralizing antibody potency is needed, soundly based on theoretical and practical considerations and tested against experimental data. Pursuant to the original studies of Kawade on the theoretical and functional aspects of the neutralization of interferons (IFN), experimental data were obtained by different laboratories employing varied methodology to address two hypotheses concerning the nature of IFN neutralization reactions, based on a derived formula that allows expression of neutralizing power as the reduction of 10 laboratory units (LU)/ml to 1 LU/ml, the end point of most bioassays. Two hypotheses are posed: (1) antibody acts to neutralize a fixed amount of biologically active IFN molecules, or (2) antibody reduces IFN activity in a set ratio of added to residual biologically active IFN. The first, or fixed amount, hypothesis reflects the reactivity of high-affinity antibodies neutralizing equimolar amounts of antigen, whereas the second, or constant proportion, hypothesis postulates a reduction in the ratio of total added IFN to residual active IFN molecules, such as a low-affinity antibody might exhibit. Analyses of data on the neutralization of IFN-alpha and IFN-beta are presented, employing human polyclonal antibodies and murine monoclonal antibodies (mAb). The theoretical constructs of Kawade are extended in the Appendix and correlated with new experimental data in the text. The data clearly indicate that the low-affinity, constant proportion hypothesis, rather than the high-affinity, fixed amount hypothesis, is applicable if the bioassay is sensitive to IFN. The findings presented here and in the following paper (pp. 743-755, this issue) together provide the basis for a standardized method of
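
    The two hypotheses can be written compactly. A hedged formalization (notation ours, not necessarily Kawade's), with A the added IFN and R the residual active IFN, both in LU/ml:

```latex
% Hedged formalization of the two competing hypotheses.
\[
  \text{fixed amount:}\qquad R = A - k
  \qquad\text{(antibody removes a fixed quantity } k \text{ of active IFN)}
\]
\[
  \text{constant proportion:}\qquad R = \frac{A}{r}
  \qquad\text{(antibody reduces activity by a fixed ratio } r\text{)}
\]
% The bioassay end point corresponds to reducing A = 10 LU/ml to
% R = 1 LU/ml, i.e. k = 9 under the first hypothesis or r = 10 under
% the second.
```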

  4. Pathway, in silico and tissue-specific expression quantitative analyses of oesophageal squamous cell carcinoma genome-wide association studies data

    PubMed Central

    Hyland, Paula L; Zhang, Han; Yang, Qi; Yang, Howard H; Hu, Nan; Lin, Shih-Wen; Su, Hua; Wang, Lemin; Wang, Chaoyu; Ding, Ti; Fan, Jin-Hu; Qiao, You-Lin; Sung, Hyuna; Wheeler, William; Giffen, Carol; Burdett, Laurie; Wang, Zhaoming; Lee, Maxwell P; Chanock, Stephen J; Dawsey, Sanford M; Freedman, Neal D; Abnet, Christian C; Goldstein, Alisa M; Yu, Kai; Taylor, Philip R

    2016-01-01

    Background: Oesophageal cancer is the fourth leading cause of cancer death in China, where essentially all cases are histologically oesophageal squamous cell carcinoma (ESCC). Agnostic pathway-based analyses of genome-wide association study (GWAS) data, combined with tissue-specific expression quantitative trait loci (eQTL) analysis and publicly available functional data, can identify biological pathways and/or genes enriched with functionally relevant disease-associated variants. Method: We used the adaptive multilocus joint test to analyse 1827 pathways containing 6060 genes using GWAS data from 1942 ESCC cases and 2111 controls of Chinese ancestry. We examined the function of risk alleles using in silico and eQTL analyses in oesophageal tissues. Results: Associations with ESCC risk were observed for 36 pathways, predominantly involved in apoptosis, cell cycle regulation and DNA repair and containing known GWAS-associated genes. After excluding genes with previous GWAS signals, candidate pathways (and genes) for ESCC risk included taste transduction (KEGG_hsa04742; TAS2R13, TAS2R42, TAS2R14, TAS2R46, TAS2R50), long-patch base excision repair (Reactome_pid; POLD2) and metabolic pathways (KEGG_hsa01100; MTAP, GAPDH, DCTD, POLD2, AMDHD1). We identified and validated CASP8 rs13016963 and IDH2 rs11630814 as eQTLs, and CASP8 rs3769823 and IDH2 rs4561444 as the potential functional variants in high linkage disequilibrium with these single nucleotide polymorphisms (SNPs), respectively. Further, IDH2 mRNA levels were down-regulated in ESCC (tumour:normal fold change = 0.69, P = 6.75E-14). Conclusion: Agnostic pathway-based analyses and integration of multiple types of functional data provide new evidence for the contribution of genes in taste transduction and metabolism to ESCC susceptibility, and for the functionality of both established and new ESCC risk-related SNPs. PMID:26635288

  5. Till formation under a soft-bedded palaeo-ice stream of the Scandinavian Ice Sheet, constrained using qualitative and quantitative microstructural analyses

    NASA Astrophysics Data System (ADS)

    Narloch, Włodzimierz; Piotrowski, Jan A.; Wysota, Wojciech; Tylmann, Karol

    2015-08-01

    This study combines micro- and macroscale observations, laboratory experiments and quantitative analyses to decipher the processes of till formation under a palaeo-ice stream and the nature of subglacial sediment deformation. Till micromorphology (grain lineations, grain stacks, turbate structures, crushed grains, intraclasts and domains), grain-size and till fabric data are used to investigate a basal till generated by the Vistula Ice Stream of the Scandinavian Ice Sheet during the last glaciation in north-central Poland. A comparison of microstructures from the in situ basal till and laboratory-sheared tills shows statistical relationships between the numbers of grain lineations and grain stacks, and between the numbers of grain lineations and turbate structures. Microstructures in the in situ till document both brittle and ductile styles of deformation, possibly due to fluctuating basal water pressures beneath the ice stream. No systematic vertical or lateral trends are detected in the parameters investigated in the in situ till, which suggests a subglacial mosaic of relatively stable and unstable areas. This situation can be explained by an unscaled space-transgressive model of subglacial till formation whereby, at any given point in time, different processes operated in different places under the ice sheet, possibly related to the distance from the ice margin and the water pressure at the ice-bed interface. A new quantitative measure reflecting the relationship between the numbers of grain lineations and grain stacks may be helpful in discriminating between pervasive and non-pervasive deformation and in constraining the degree of stress heterogeneity within a deformed bed. Independent strain-magnitude estimates from a quantitative analysis of micro- and macro-particle data show low cumulative strain in the ice-stream till, on the order of 10-10².

  6. A rapid and accurate method for the quantitative estimation of natural polysaccharides and their fractions using high performance size exclusion chromatography coupled with multi-angle laser light scattering and refractive index detector.

    PubMed

    Cheong, Kit-Leong; Wu, Ding-Tao; Zhao, Jing; Li, Shao-Ping

    2015-06-26

    In this study, a rapid and accurate method for the quantitative analysis of natural polysaccharides and their different fractions was developed. First, high performance size exclusion chromatography (HPSEC) was used to separate the natural polysaccharides. The molecular masses of their fractions were then determined by multi-angle laser light scattering (MALLS). Finally, quantification of the polysaccharides or their fractions was performed based on their response on a refractive index detector (RID) and their universal refractive index increment (dn/dc). The accuracy of the developed method was determined for the quantification of individual and mixed polysaccharide standards, including konjac glucomannan, CM-arabinan, xyloglucan, larch arabinogalactan, oat β-glucan, dextran (410, 270, and 25 kDa), mixed xyloglucan and CM-arabinan, and mixed dextran 270K and CM-arabinan; average recoveries were between 90.6% and 98.3%. The limits of detection (LOD) and quantification (LOQ) ranged from 10.68 to 20.25 μg/mL and from 42.70 to 68.85 μg/mL, respectively. Compared with the conventional phenol-sulfuric acid assay and HPSEC coupled with evaporative light scattering detection (HPSEC-ELSD), the developed HPSEC-MALLS-RID method based on a universal dn/dc is simpler, more rapid, and more accurate, requires no individual polysaccharide standard, and is free of calibration curves. The method was also successfully applied to the quantitative analysis of polysaccharides and their different fractions from three medicinal plants of the Panax genus: Panax ginseng, Panax notoginseng and Panax quinquefolius. The results suggest that the HPSEC-MALLS-RID method based on a universal dn/dc could be used as a routine technique for the quantification of polysaccharides and their fractions in natural resources.
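
    The reason no per-analyte calibration curve is needed: an RI detector's peak area is proportional to (dn/dc) times the analyte mass, so a single instrument constant plus the universal dn/dc suffices. A hedged sketch with illustrative numbers (the constant K and the signal units are hypothetical):

```python
# Hedged sketch of RID-based quantification via a universal dn/dc.
# K and the peak-area value are hypothetical, for illustration only.
K = 2.5e-5        # hypothetical RID instrument constant (g per area unit, per dn/dc unit)
dn_dc = 0.146     # mL/g, a commonly assumed universal value for polysaccharides

def injected_mass(peak_area, k=K, dndc=dn_dc):
    """Mass (g) of a polysaccharide fraction from its RI peak area."""
    # RI peak area ~ (dn/dc) * mass, so mass = area * K / (dn/dc).
    return peak_area * k / dndc

print(f"fraction mass ~ {injected_mass(3.2e3):.2e} g")
```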

  7. Analysis of glycosidically bound aroma precursors in tea leaves. 1. Qualitative and quantitative analyses of glycosides with aglycons as aroma compounds.

    PubMed

    Wang, D; Yoshimura, T; Kubota, K; Kobayashi, A

    2000-11-01

    Twenty-six synthetic glycosides whose aglycons are the main tea aroma compounds ((Z)-3-hexenol, benzyl alcohol, 2-phenylethanol, methyl salicylate, geraniol, linalool, and four isomers of linalool oxide) were synthesized in our laboratory as authentic compounds. These compounds were used for the direct qualitative and quantitative determination of glycosides as aroma precursors in different tea cultivars by capillary gas chromatography-mass spectrometry (GC-MS) after trifluoroacetyl conversion of the tea glycosidic fractions. Eleven beta-D-glucopyranosides, 10 beta-primeverosides (6-O-beta-D-xylopyranosyl-beta-D-glucopyranosides) with the above alcohols as aglycons, and geranyl beta-vicianoside (6-O-alpha-L-arabinopyranosyl-beta-D-glucopyranoside) were identified (tentatively, in the case of methyl salicylate beta-primeveroside) in fresh tea leaves and quantified on the basis of calibration curves established using the synthetic compounds. Primeverosides were more abundant than glucosides in each cultivar we investigated for making green tea, oolong tea, and black tea. Separation of the diastereoisomers of linalool and the four isomers of linalool oxide by GC analysis is also discussed.

  8. Quantitative x-ray diffraction analyses of samples used for sorption studies by the Isotope and Nuclear Chemistry Division, Los Alamos National Laboratory

    SciTech Connect

    Chipera, S.J.; Bish, D.L.

    1989-09-01

    Yucca Mountain, Nevada, is currently being investigated to determine its suitability to host our nation's first geologic high-level nuclear waste repository. As part of an effort to determine how radionuclides will interact with rocks at Yucca Mountain, the Isotope and Nuclear Chemistry (INC) Division of Los Alamos National Laboratory has conducted numerous batch sorption experiments using core samples from Yucca Mountain. In order to understand better the interaction between the rocks and radionuclides, we have analyzed the samples used by INC with quantitative x-ray diffraction methods. Our analytical methods accurately determine the presence or absence of major phases, but we have not identified phases present below ~1 wt %. These results should aid in understanding and predicting the potential interactions between radionuclides and the rocks at Yucca Mountain, although the mineralogic complexity of the samples and the lack of information on trace phases suggest that pure mineral studies may be necessary for a more complete understanding. 12 refs., 1 fig., 1 tab.

  9. Accurate, Direct, and High-Throughput Analyses of a Broad Spectrum of Endogenously Generated DNA Base Modifications with Isotope-Dilution Two-Dimensional Ultraperformance Liquid Chromatography with Tandem Mass Spectrometry: Possible Clinical Implication.

    PubMed

    Gackowski, Daniel; Starczak, Marta; Zarakowska, Ewelina; Modrzejewska, Martyna; Szpila, Anna; Banaszkiewicz, Zbigniew; Olinski, Ryszard

    2016-12-20

    The methodology presented here is suitable for reliable assessment of the most common unavoidable DNA modifications, which arise as products of fundamental metabolic processes. 8-Oxoguanine, one of the oxidatively modified DNA bases, is a typical biomarker of oxidative stress. A noncanonical base, uracil, may also be present in small quantities in DNA. A set of ten-eleven translocation (TET) proteins is involved in the oxidation of 5-methylcytosine to 5-hydroxymethylcytosine, which can be further oxidized to 5-formylcytosine and 5-carboxycytosine. 5-Hydroxymethyluracil may be formed by deamination of 5-hydroxymethylcytosine or can also be generated by TET enzymes. All of the aforementioned modifications seem to play regulatory roles. We applied isotope-dilution automated online two-dimensional ultraperformance liquid chromatography with tandem mass spectrometry (2D-UPLC-MS/MS) for direct measurement of 5-methyl-2'-deoxycytidine, 5-(hydroxymethyl)-2'-deoxycytidine, 5-formyl-2'-deoxycytidine, 5-carboxy-2'-deoxycytidine, 5-(hydroxymethyl)-2'-deoxyuridine, 2'-deoxyuridine, and 8-oxo-2'-deoxyguanosine. Analyses of DNA extracted from matched human samples showed that the 5-(hydroxymethyl)-2'-deoxycytidine level was 5-fold lower in colorectal carcinoma tumors than in normal tissue from the tumor margin; 5-formyl-2'-deoxycytidine and 5-carboxy-2'-deoxycytidine were also lower in colorectal carcinoma tissue (ca. 2.5- and 3.5-fold, respectively). No such differences were found for 2'-deoxyuridine and 5-(hydroxymethyl)-2'-deoxyuridine. The presented methodology is suitable for fast, accurate, and complex evaluation of an array of endogenously generated DNA deoxynucleoside modifications. This novel technique could be used for monitoring of cancer and other diseases related to oxidative stress, aberrant metabolism, and environmental exposure. Furthermore, the fully automated two-dimensional separation is extremely useful for analysis of material

  10. Lichens biomonitoring as feasible methodology to assess air pollution in natural ecosystems: combined study of quantitative PAHs analyses and lichen biodiversity in the Pyrenees Mountains.

    PubMed

    Blasco, María; Domeño, Celia; Nerín, Cristina

    2008-06-01

    The air quality in the Aragón valley, in the central Pyrenees, was assessed by evaluating lichen biodiversity and mapped using the Index of Air Purity (IAP), based on observations of the presence and abundance of eight kinds of lichen with different sensitivities to air pollution. The IAP values obtained were compared with quantitative analytical measurements of 16 PAHs in the lichen Evernia prunastri, because this species is found across a wide range of traffic exposure and levels of urbanization. Analyses of PAHs were carried out by the DSASE method followed by an SPE clean-up step and GC-MS analysis. The total PAH concentration in lichen samples from the Aragón valley ranged from 692 to 6420 ng g⁻¹, and the PAH profile showed a predominance of compounds with three aromatic rings. The influence of road traffic in the area was evident, because values above the median PAH concentration (>1092 ng g⁻¹), percentage of combustion PAHs (>50%), and equivalent toxicity (>169) were found in lichens collected at places exposed to traffic. The combination of both methods suggests IAP as a general method for evaluating air pollution referenced to PAHs, because it can be correlated with the content of combustion PAHs, and poor lichen biodiversity can be partly explained by the air pollution caused by specific PAHs.

  11. A High-Density Genetic Map with Array-Based Markers Facilitates Structural and Quantitative Trait Locus Analyses of the Common Wheat Genome

    PubMed Central

    Iehisa, Julio Cesar Masaru; Ohno, Ryoko; Kimura, Tatsuro; Enoki, Hiroyuki; Nishimura, Satoru; Okamoto, Yuki; Nasuda, Shuhei; Takumi, Shigeo

    2014-01-01

    The large genome and allohexaploidy of common wheat have complicated the construction of a high-density genetic map. Although improvements in the throughput of next-generation sequencing (NGS) technologies have made it possible to obtain genotyping data for an entire mapping population by direct sequencing, even in hexaploid wheat, a significant number of missing data points often result from the low coverage of sequencing. In the present study, a microarray-based polymorphism detection system was developed using NGS data obtained from complexity-reduced genomic DNA of two common wheat cultivars, Chinese Spring (CS) and Mironovskaya 808. After design and selection of polymorphic probes, 13,056 new markers were added to the linkage map of a recombinant inbred mapping population between CS and Mironovskaya 808. On average, 2.49 missing data points per marker were observed in the 201 recombinant inbred lines, with a maximum of 42. Around 40% of the new markers were derived from genic regions and 11% from repetitive regions. The low number of retroelements indicated that the new polymorphic markers were mainly derived from the less repetitive regions of the wheat genome. Around 25% of the mapped sequences were useful for alignment with the physical map of barley. Quantitative trait locus (QTL) analyses of 14 agronomically important traits related to flowering, spikes, and seeds demonstrated that the new high-density map offered improved QTL detection, resolution, and accuracy over the original simple sequence repeat map. PMID:24972598

  12. Docking and three-dimensional quantitative structure-activity relationship analyses of imidazole and thiazolidine derivatives as Aurora A kinase inhibitors.

    PubMed

    Im, Chaeuk

    2016-12-01

    Aurora A kinase is involved in the inactivation of apoptosis, leading to ovarian, breast, colon, and pancreatic cancers. Inhibitors of Aurora A kinase promote aberrant mitosis, resulting in arrest at a pseudo-G1 state to induce mitotic catastrophe and, ultimately, apoptosis. In this study, ligand-based and docking-based three-dimensional quantitative structure-activity relationship (3D-QSAR) analyses of imidazole and thiazolidine derivatives as potential Aurora A kinase inhibitors were performed. The results provided highly reliable and predictive 3D-QSAR comparative molecular similarity index analysis (CoMSIA) models, with a cross-validated q² value of 0.768, a non-cross-validated r² value of 0.983, and a predictive coefficient (r²pred) of 0.978. CoMSIA contour maps suggested that the NH and benzyl hydroxy groups in R9, the CO group in the thiazolidine ring, and the pyridine ring are important components for biological activity. The maps also suggest that introducing hydroxy groups at C2 of the imino-phenyl ring or C5 of the pyridine ring, or substituting the imino-phenyl ring with the imino-2-pyridine ring, could enhance biological activity.
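
    For reference, the cross-validated q² cited above is conventionally defined from the predictive residual sum of squares; this is the standard definition, assuming leave-one-out cross-validation as is typical for CoMSIA:

```latex
% Standard cross-validated q^2 (leave-one-out assumed).
\[
  q^2 \;=\; 1 -
  \frac{\sum_i \bigl(y_i - \hat{y}_{i,\mathrm{LOO}}\bigr)^2}
       {\sum_i \bigl(y_i - \bar{y}\bigr)^2}
  \;=\; 1 - \frac{\mathrm{PRESS}}{\mathrm{SS}_{\mathrm{total}}}
\]
```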

  13. Value of quantitative and qualitative analyses of circulating cell-free DNA as diagnostic tools for hepatocellular carcinoma: a meta-analysis.

    PubMed

    Liao, Wenjun; Mao, Yilei; Ge, Penglei; Yang, Huayu; Xu, Haifeng; Lu, Xin; Sang, Xinting; Zhong, Shouxian

    2015-04-01

    Qualitative and quantitative analyses of circulating cell-free DNA (cfDNA) are potential methods for the detection of hepatocellular carcinoma (HCC). Many studies have evaluated these approaches, but the results have been variable. This meta-analysis is the first to synthesize these published results and evaluate the use of circulating cfDNA values for HCC diagnosis. All articles that met our inclusion criteria were assessed using QUADAS guidelines after the literature search. We also investigated 3 subgroups in this meta-analysis: quantitative analysis of abnormal concentrations of circulating cfDNA; qualitative analysis of single-gene methylation alterations; and multiple analyses combined with alpha-fetoprotein (AFP). Statistical analyses were performed using Stata 12.0. We synthesized the published results and calculated accuracy measures (pooled sensitivity and specificity, positive/negative likelihood ratios [PLRs/NLRs], diagnostic odds ratios [DORs], and corresponding 95% confidence intervals [95% CIs]). Data were pooled using a bivariate generalized linear mixed model. Furthermore, summary receiver operating characteristic curves and the area under the curve (AUC) were used to summarize overall test performance. Heterogeneity and publication bias were also examined. A total of 2424 subjects, including 1280 HCC patients, from 22 studies were recruited in this meta-analysis. The pooled sensitivity, specificity, PLR, NLR, DOR, and AUC of quantitative analysis were 0.741 (95% CI: 0.610-0.840), 0.851 (95% CI: 0.718-0.927), 4.970 (95% CI: 2.694-9.169), 0.304 (95% CI: 0.205-0.451), 16.347 (95% CI: 8.250-32.388), and 0.86 (95% CI: 0.83-0.89), respectively. For qualitative analysis, the values were 0.538 (95% CI: 0.401-0.669), 0.944 (95% CI: 0.889-0.972), 9.545 (95% CI: 5.298-17.196), 0.490 (95% CI: 0.372-0.646), 19.491 (95% CI: 10.458-36.329), and 0.87 (95% CI: 0.84-0.90), respectively. After combining with the AFP assay, the values were 0.818 (95% CI: 0
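
    For orientation, the accuracy measures pooled above derive from each study's 2x2 table. A hedged sketch with purely illustrative counts:

```python
# Per-study diagnostic accuracy measures from a 2x2 table.
# Counts are illustrative, not taken from any included study.
tp, fp, fn, tn = 85, 12, 30, 95   # true/false positives, false/true negatives

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
plr = sensitivity / (1 - specificity)      # positive likelihood ratio
nlr = (1 - sensitivity) / specificity      # negative likelihood ratio
dor = plr / nlr                            # diagnostic odds ratio = (tp*tn)/(fp*fn)

print(f"Se={sensitivity:.3f} Sp={specificity:.3f} "
      f"PLR={plr:.2f} NLR={nlr:.2f} DOR={dor:.1f}")
```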

  14. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  15. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

    Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture-based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method require highly...

  16. Dimensionality Analyses of the "GRE"® revised General Test Verbal and Quantitative Measures. ETS GRE® Board Research Report. ETS GRE®-16-02. ETS Research Report. RR-16-20

    ERIC Educational Resources Information Center

    Robin, Frédéric; Bejar, Isaac; Liang, Longjuan; Rijmen, Frank

    2016-01-01

    Exploratory and confirmatory factor analyses of domestic data from the "GRE"® revised General Test, introduced in 2011, were conducted separately for the verbal (VBL) and quantitative (QNT) reasoning measures to evaluate the unidimensionality and local independence assumptions required by item response theory (IRT). Results based on data…

  17. Assessment and validation of a suite of reverse transcription-quantitative PCR reference genes for analyses of density-dependent behavioural plasticity in the Australian plague locust

    PubMed Central

    2011-01-01

    Background The Australian plague locust, Chortoicetes terminifera, is among the most promising species to unravel the suites of genes underlying the density-dependent shift from shy and cryptic solitarious behaviour to the highly active and aggregating gregarious behaviour that is characteristic of locusts. This is because it lacks many of the major phenotypic changes in colour and morphology that accompany phase change in other locust species. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the most sensitive method available for determining changes in gene expression. However, to accurately monitor the expression of target genes, it is essential to select an appropriate normalization strategy to control for non-specific variation between samples. Here we identify eight potential reference genes and examine their expression stability at different rearing density treatments in neural tissue of the Australian plague locust. Results Taking advantage of the new orthologous DNA sequences available in locusts, we developed primers for genes encoding 18S rRNA, ribosomal protein L32 (RpL32), armadillo (Arm), actin 5C (Actin), succinate dehydrogenase (SDHa), glyceraldehyde-3P-dehydrogenase (GAPDH), elongation factor 1 alpha (EF1a) and annexin IX (AnnIX). The relative transcription levels of these eight genes were then analyzed in three treatment groups differing in rearing density (isolated, short- and long-term crowded), each made up of five pools of four neural tissue samples from 5th instar nymphs. SDHa and GAPDH, which are both involved in metabolic pathways, were identified as the least stable in expression levels, challenging their usefulness in normalization. Based on calculations performed with the geNorm and NormFinder programs, the best combination of two genes for normalization of gene expression data following crowding in the Australian plague locust was EF1a and Arm. We applied their use to studying a target gene that encodes a Ca2

  18. Accurate quantitation for in vitro refolding of single domain antibody fragments expressed as inclusion bodies by referring the concomitant expression of a soluble form in the periplasms of Escherichia coli.

    PubMed

    Noguchi, Tomoaki; Nishida, Yuichi; Takizawa, Keiji; Cui, Yue; Tsutsumi, Koki; Hamada, Takashi; Nishi, Yoshisuke

    2017-03-01

    Single domain antibody fragments from two species, a camel VHH (PM1) and a shark VNAR (A6), were derived from inclusion bodies of E. coli and refolded in vitro following three refolding recipes, in order to compare refolding efficiencies: three-step cold dialysis refolding (TCDR), one-step hot dialysis refolding (OHDR), and one-step cold dialysis refolding (OCDR). Although these fragments were also expressed as a soluble form, either in the cytoplasm or the periplasm, the amounts were much less than those expressed as an insoluble form (inclusion bodies) in the cytoplasm and periplasm. In order to verify the refolding efficiencies from inclusion bodies correctly, proteins purified from periplasmic soluble fractions were used as reference samples. These samples showed far-UV circular dichroism (CD) spectra typical of a β-sheet-dominant structure, as did the refolded samples. As the maximal magnitude of ellipticity in millidegrees (θmax) observed at a given wavelength was proportional to the concentration of the respective reference sample, we could draw linear regression lines for the magnitudes vs. sample concentrations. Using these lines, we measured the concentrations of the refolded PM1 and A6 samples purified from solubilized cytoplasmic insoluble fractions. The refolding efficiency of PM1 was almost 50% following TCDR, and 40% and 30% following OHDR and OCDR, respectively, whereas that of A6 was around 30% following TCDR and out of the quantifiable range following the other two recipes. The ELISA curves derived from the refolded samples coincided better with those obtained from the reference samples after converting the values from protein concentrations at recovery to those of refolded proteins using the recovery ratios, indicating that such a correction gives a more accurate measure of the ELISA curves than no correction. Our method requires constructing a dual expression system, expressed both in
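
    The quantitation step reduces to a linear calibration. A minimal sketch, assuming illustrative θmax readings and concentrations (not the paper's values), of reading a refolded sample's concentration off the reference line and converting it to a refolding efficiency:

```python
# Hedged sketch: calibrate CD ellipticity (theta_max, millidegrees)
# against reference-sample concentration, then quantify a refolded
# sample. All numbers are illustrative placeholders.
import numpy as np

ref_conc = np.array([0.05, 0.10, 0.20, 0.40])    # mg/mL, periplasmic reference samples
ref_theta = np.array([-2.1, -4.0, -8.3, -16.5])  # millidegrees at a fixed wavelength

# Linear regression line: theta = slope * conc + intercept.
slope, intercept = np.polyfit(ref_conc, ref_theta, 1)

def conc_from_theta(theta):
    """Read a concentration off the calibration line."""
    return (theta - intercept) / slope

refolded_theta = -6.2
recovered = conc_from_theta(refolded_theta)
loaded = 0.30                                    # mg/mL solubilized protein input
print(f"refolded ~ {recovered:.3f} mg/mL; efficiency ~ {recovered / loaded:.0%}")
```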

  19. Quantitative Analyses about Market- and Prevalence-Based Needs for Adapted Physical Education Teachers in the Public Schools in the United States

    ERIC Educational Resources Information Center

    Zhang, Jiabei

    2011-01-01

    The purpose of this study was to analyze quantitative needs for more adapted physical education (APE) teachers based on both market- and prevalence-based models. The market-based need for more APE teachers was examined based on APE teacher positions funded, while the prevalence-based need for additional APE teachers was analyzed based on students…

  20. Evaluation of a real-time quantitative PCR method with propidium monoazide treatment for analyses of viable fecal indicator bacteria in wastewater samples

    EPA Science Inventory

    The U.S. EPA is currently evaluating rapid, real-time quantitative PCR (qPCR) methods for determining recreational water quality based on measurements of fecal indicator bacteria DNA sequences. In order to potentially use qPCR for other Clean Water Act needs, such as updating cri...

  1. High resolution rare-earth elements analyses of natural apatite and its application in geo-sciences: Combined micro-PIXE, quantitative CL spectroscopy and electron spin resonance analyses

    NASA Astrophysics Data System (ADS)

    Habermann, D.; Götte, T.; Meijer, J.; Stephan, A.; Richter, D. K.; Niklas, J. R.

    2000-03-01

    The rare-earth element (REE) distribution in natural apatite was analysed by micro-PIXE, cathodoluminescence (CL) microscopy and spectroscopy, and electron spin resonance (ESR) spectroscopy. The micro-PIXE analyses of an apatite crystal from Cerro de Mercado (Mexico) and the summary of 20 analyses of six francolite (conodonts of Triassic age) samples indicate that most of the REEs are enriched in apatite and francolite relative to the average shale standard (NASC). The analyses of fossil francolite reveal that the REE distribution is not in balance with the REE distribution of seawater and fish bone debris. CL mapping shows a strongly inhomogeneous lateral REE distribution in fossil conodont material, which is most probably not a vital effect. Therefore, the resulting REE signal from fossil francolite is the sum of vital and post-mortem incorporation. The charge compensation necessary for the substitution of divalent Ca by trivalent REE is accomplished by different kinds of electron defects and defect ions.

  2. Semi-quantitative and simulation analyses of effects of γ rays on determination of calibration factors of PET scanners with point-like (22)Na sources.

    PubMed

    Hasegawa, Tomoyuki; Sato, Yasushi; Oda, Keiichi; Wada, Yasuhiro; Murayama, Hideo; Yamada, Takahiro

    2011-09-21

    The uncertainty of radioactivity concentrations measured with positron emission tomography (PET) scanners ultimately depends on the uncertainty of the calibration factors. A new practical calibration scheme using point-like (22)Na radioactive sources has been developed. The purpose of this study is to theoretically investigate the effects of the associated 1.275 MeV γ rays on the calibration factors. The physical processes affecting the coincidence data were categorized in order to derive approximate semi-quantitative formulae. Assuming the design parameters of some typical commercial PET scanners, the effects of the γ rays as relative deviations in the calibration factors were evaluated by semi-quantitative formulae and a Monte Carlo simulation. The relative deviations in the calibration factors were less than 4%, depending on the details of the PET scanners. The event losses due to rejecting multiple coincidence events of scattered γ rays had the strongest effect. The results from the semi-quantitative formulae and the Monte Carlo simulation were consistent and were useful in understanding the underlying mechanisms. The deviations are considered small enough to correct on the basis of precise Monte Carlo simulation. This study thus offers an important theoretical basis for the validity of the calibration method using point-like (22)Na radioactive sources.

  3. Quantitative Analysis of Intra-chromosomal Contacts: The 3C-qPCR Method.

    PubMed

    Ea, Vuthy; Court, Franck; Forné, Thierry

    2017-01-01

    The chromosome conformation capture (3C) technique is fundamental to many population-based methods investigating chromatin dynamics and organization in eukaryotes. Here, we provide a modified quantitative 3C (3C-qPCR) protocol for improved quantitative analyses of intra-chromosomal contacts. We also describe an algorithm for data normalization which allows more accurate comparisons between contact profiles.
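
    A minimal sketch of the kind of normalization such a protocol performs (hedged; the protocol's actual algorithm may differ in detail): each anchor-fragment qPCR quantity is corrected with a random-ligation control template, which cancels primer-pair efficiency differences, and with a loading control:

```python
# Hedged sketch of 3C-qPCR-style normalization; fragment names and
# quantities are illustrative, not the protocol's actual data.
raw_signal = {"anchor:frag1": 0.82, "anchor:frag2": 0.31}        # 3C sample quantities
control_template = {"anchor:frag1": 1.10, "anchor:frag2": 0.64}  # random-ligation control
loading_control = 0.95                                           # e.g., an internal control region

for pair, signal in raw_signal.items():
    # Primer-efficiency correction, then loading correction.
    relative = (signal / control_template[pair]) / loading_control
    print(f"{pair}: relative crosslinking frequency = {relative:.2f}")
```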

  4. Quantitative analyses of the effect of silk fibroin/nano-hydroxyapatite composites on osteogenic differentiation of MG-63 human osteosarcoma cells.

    PubMed

    Lin, Linxue; Hao, Runsong; Xiong, Wei; Zhong, Jian

    2015-05-01

    Silk fibroin (SF)/nano-hydroxyapatite (n-HA) composites are potential biomaterials for bone defect repair. To date, biological evaluation studies of SF/n-HA composites have concentrated primarily on their biocompatibility at the cell level, such as cell viability and proliferation, and at the tissue level, such as material absorption and new bone formation. In this work, SF/n-HA composites were fabricated using a simplified coprecipitation method and deposited onto Ti alloy substrates. The cell adhesion ability of the SF/n-HA composites was observed by SEM, and their cell proliferation ability was determined by MTT assay. The ALP activity, BGP content, and Col I content of MG-63 human osteosarcoma cells on the SF/n-HA composites were quantitatively analyzed, with HA nanocrystals used as controls. These experiments showed that the SF/n-HA composites had better cell adhesion and osteogenic differentiation abilities than the n-HA materials. This work provides quantitative data on the effect of SF/n-HA composites on cell osteogenic differentiation.

  5. Combined orcein and martius scarlet blue (OMSB) staining for qualitative and quantitative analyses of atherosclerotic plaques in brachiocephalic arteries in apoE/LDLR(-/-) mice.

    PubMed

    Gajda, Mariusz; Jasztal, Agnieszka; Banasik, Tomasz; Jasek-Gajda, Ewa; Chlopicki, Stefan

    2017-02-06

    Numerous cellular and extracellular components must be analyzed in sections of atherosclerotic plaques to assess atherosclerosis progression and vulnerability. Here, we combined orcein (O) staining for elastic fibers with the martius scarlet blue (MSB) polychrome to visualize the various morphological contents of plaques in the brachiocephalic arteries (BCA) of apoE/LDLR(-/-) mice. Elastic fibers (including broken elastic laminae and 'buried' fibrous caps) were stained purple and could be easily distinguished from collagen fibers (blue). Orcein allowed clear identification of even the finest elastic fibers. Erythrocytes were stained yellow and could easily be discerned from mature fibrin (red); old fibrin tends to acquire a blue color. The OMSB staining method is simple, takes less than 1 h to perform and can be adapted to automatic stainers. Most importantly, the color separation is good enough to allow automatic digital segmentation of specific components in tissue sections and quantitative analysis of the plaque constituents. OMSB was used to compare atherosclerotic plaques in the proximal and distal regions of the BCA in apoE/LDLR(-/-) mice. In conclusion, OMSB is a novel staining method that could be used routinely for qualitative and quantitative microscopic assessment of formaldehyde-fixed, paraffin-embedded sections of arteries with atherosclerotic lesions.

  6. Detailed Chemical Composition of Condensed Tannins via Quantitative (31)P NMR and HSQC Analyses: Acacia catechu, Schinopsis balansae, and Acacia mearnsii.

    PubMed

    Crestini, Claudia; Lange, Heiko; Bianchetti, Giulia

    2016-09-23

    The chemical composition of Acacia catechu, Schinopsis balansae, and Acacia mearnsii proanthocyanidins has been determined using a novel analytical approach that rests on the concerted use of quantitative (31)P NMR and two-dimensional heteronuclear NMR spectroscopy. This approach has offered significant detailed information regarding the structure and purity of these complex and often elusive proanthocyanidins. More specifically, rings A, B, and C of their flavan-3-ol units show well-defined and resolved absorbance regions in both the quantitative (31)P NMR and HSQC spectra. By integrating each of these regions in the (31)P NMR spectra, it is possible to identify the oxygenation patterns of the flavan-3-ol units. At the same time it is possible to acquire a fingerprint of the proanthocyanidin sample and evaluate its purity via the HSQC information. This analytical approach is suitable for both the purified natural product proanthocyanidins and their commercial analogues. Overall, this effort demonstrates the power of the concerted use of these two NMR techniques for the structural elucidation of natural products containing labile hydroxy protons and a carbon framework that can be traced out via HSQC.

  7. Origin of mobility enhancement by chemical treatment of gate-dielectric surface in organic thin-film transistors: Quantitative analyses of various limiting factors in pentacene thin films

    NASA Astrophysics Data System (ADS)

    Matsubara, R.; Sakai, Y.; Nomura, T.; Sakai, M.; Kudo, K.; Majima, Y.; Knipp, D.; Nakamura, M.

    2015-11-01

    Gate-insulator surface treatments are often applied to improve the performance of organic thin-film transistors (TFTs). However, the origin of the resulting mobility increase has not been well understood, because the mobility-limiting factors have not been compared quantitatively. In this work, we clarify the influence of gate-insulator surface treatments in pentacene thin-film transistors on the limiting factors of mobility, i.e., the size of crystal-growth domains, the crystallite size, the HOMO-band-edge fluctuation, and the carrier transport barrier at domain boundaries. We quantitatively investigated these factors for pentacene TFTs with bare, hexamethyldisilazane-treated, and polyimide-coated SiO2 layers as gate dielectrics. With these surface treatments, the size of the crystal-growth domains increases, but both the crystallite size and the HOMO-band-edge fluctuation remain unchanged. Analyzing the experimental results, we also show that the barrier height at the boundaries between crystal-growth domains is not sensitive to the treatments. The results imply that the essential increase in mobility brought about by these surface treatments is due only to the increase in the size of the crystal-growth domains, or equivalently the decrease in the number of energy barriers at domain boundaries in the TFT channel.

  8. Applications of Quaternary stratigraphic, soil-geomorphic, and quantitative geomorphic analyses to the evaluation of tectonic activity and landscape evolution in the Upper Coastal Plain, South Carolina

    SciTech Connect

    Hanson, K.L.; Bullard, T.F.; de Wit, M.W.; Stieve, A.L.

    1993-07-01

    Geomorphic analyses combined with mapping of fluvial terraces and upland geomorphic surfaces provide new approaches and data for evaluating the Quaternary activity of post-Cretaceous faults that are recognized in subsurface data at the Savannah River Site in the Upper Coastal Plain of southwestern South Carolina. Analyses of longitudinal stream and terrace profiles, regional slope maps, and drainage basin morphometry indicate long-term uplift and southeast tilt of the site region. Preliminary results of drainage basin characterization suggest an apparent rejuvenation of drainages along the trace of the Pen Branch fault (a Tertiary reactivated reverse fault that initiated as a basin-margin normal fault along the northern boundary of the Triassic Dunbarton Basin). This apparent rejuvenation of drainages may be the result of nontectonic geomorphic processes or of local tectonic uplift and tilting within a framework of regional uplift.

  9. 3D numerical analyses for the quantitative risk assessment of subsidence and water flood due to the partial collapse of an abandoned gypsum mine.

    NASA Astrophysics Data System (ADS)

    Castellanza, R.; Orlandi, G. M.; di Prisco, C.; Frigerio, G.; Flessati, L.; Fernandez Merodo, J. A.; Agliardi, F.; Grisi, S.; Crosta, G. B.

    2015-09-01

    Since its abandonment in the 1970s, the room-and-pillar mining system located in S. Lazzaro di Savena (BO, Italy), developed on three levels, has been progressively affected by degradation processes due to water infiltration. The mine lies underneath a residential area, causing significant concern to the local municipality. On the basis of in situ surveys and laboratory and in situ geomechanical tests, critical scenarios were adopted in the analyses to simulate the progressive collapse of pillars and roofs in the most critical sectors of the mine. A first set of numerical analyses using 3D geotechnical FEM codes was performed to predict the extension of the subsidence area and its interaction with buildings. Second, 3D CFD analyses were used to evaluate the amount of water that could be ejected from the mine and potentially flood the downstream village. The predicted extension of the subsidence area, together with the predicted amount of ejected water, was used to design possible remedial measures.

  10. Evaluation of Drosophila metabolic labeling strategies for in vivo quantitative proteomic analyses with applications to early pupa formation and amino acid starvation.

    PubMed

    Chang, Ying-Che; Tang, Hong-Wen; Liang, Suh-Yuen; Pu, Tsung-Hsien; Meng, Tzu-Ching; Khoo, Kay-Hooi; Chen, Guang-Chao

    2013-05-03

    Although stable isotope labeling by amino acids in cell culture (SILAC)-based quantitative proteomics was first developed as a cell-culture-based technique, stable isotope-labeled amino acids have since been successfully introduced in vivo into select multicellular model organisms by manipulating their feeding diets. An earlier study by others demonstrated that heavy-lysine-labeled Drosophila melanogaster can be derived by feeding with an exclusively heavy-lysine-labeled yeast diet. In this work, we have further evaluated the use of heavy lysine and/or arginine for metabolic labeling of fruit flies, with the aim of determining the respective quantification accuracy and versatility of each strategy. In vivo conversion of heavy lysine and/or heavy arginine to several nonessential amino acids was observed in labeled flies, leading to distorted isotope patterns and underestimated heavy-to-light ratios. These quantification defects can nonetheless be rectified at the protein level using the normalization function. The only caveat is that such a normalization strategy may not be suitable for every biological application, particularly when modified peptides need to be individually quantified at the peptide level. In such cases, we showed that peptide ratios calculated from the summed intensities of all isotope peaks are less affected by the heavy amino acid conversion and are therefore less sequence-dependent and more reliable. Applying either the single Lys8 or the double Lys6/Arg10 metabolic labeling strategy to flies, we quantitatively mapped the proteomic changes during the onset of metamorphosis and upon amino acid deprivation. The expression of a number of proteins regulated by the steroid hormone 20-hydroxyecdysone changed significantly during the larval-pupal transition, while several subunits of the V-ATPase complex and components regulating actomyosin were up-regulated under starvation-induced autophagy conditions.
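    To make the summed-envelope ratio calculation concrete, here is a minimal sketch in Python; the peak intensities are hypothetical placeholders, not data from the study:

    ```python
    # Minimal sketch: heavy/light ratio from summed isotope-envelope intensities.
    # Peak lists are hypothetical; in practice they come from LC-MS feature detection.
    light_isotope_peaks = [4.1e6, 3.2e6, 1.5e6, 0.5e6]  # intensities of M, M+1, M+2, M+3
    heavy_isotope_peaks = [2.0e6, 1.7e6, 0.9e6, 0.4e6]  # heavy envelope, incl. conversion satellites

    # Summing the whole envelope makes the ratio more robust to distortions of
    # individual isotope peaks caused by heavy-amino-acid conversion.
    ratio = sum(heavy_isotope_peaks) / sum(light_isotope_peaks)
    print(f"heavy/light = {ratio:.2f}")
    ```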

  11. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  12. On the Metropolis-Hastings acceptance probability to add or drop a quantitative trait locus in Markov chain Monte Carlo-based Bayesian analyses.

    PubMed Central

    Jannink, Jean-Luc; Fernando, Rohan L

    2004-01-01

    The Metropolis-Hastings algorithm used in analyses that estimate the number of QTL segregating in a mapping population requires the calculation of an acceptance probability to add or drop a QTL from the model. Expressions for this acceptance probability need to recognize that sets of QTL are unordered such that the number of equivalent sets increases with the factorial of the QTL number. Here, we show how accounting for this fact affects the acceptance probability and review expressions found in the literature. PMID:15020452
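    The combinatorial issue can be illustrated with a hedged sketch of a birth-move acceptance probability. The likelihood, prior, and proposal ratios below are generic placeholders (the paper derives the exact expressions), and the placement of the ordering factor depends on how the target density is written over ordered states:

    ```python
    import math

    def birth_acceptance(loglik_new, loglik_old, prior_ratio, proposal_ratio, k):
        """Sketch of a Metropolis-Hastings acceptance probability for adding a
        QTL to a model that currently contains k QTL.

        If QTL are stored as an ordered list, a model with k+1 QTL has
        (k+1)!/k! = (k+1) equivalent labelings relative to the k-QTL model, so
        a factor of (k+1) (or its inverse, depending on how the target density
        is written over ordered states) must enter the acceptance ratio.
        """
        ordering_factor = k + 1  # (k+1)!/k!
        log_ratio = (loglik_new - loglik_old
                     + math.log(prior_ratio)
                     + math.log(proposal_ratio)
                     + math.log(ordering_factor))
        return min(1.0, math.exp(log_ratio))

    # Hypothetical numbers: a proposed 3rd QTL improves the log-likelihood by 3.6.
    print(birth_acceptance(-1520.3, -1523.9, prior_ratio=0.5, proposal_ratio=1.0, k=2))
    ```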

  13. Quantitative in vivo fluorescence cross-correlation analyses highlight the importance of competitive effects in the regulation of protein-protein interactions.

    PubMed

    Sadaie, Wakako; Harada, Yoshie; Matsuda, Michiyuki; Aoki, Kazuhiro

    2014-09-01

    Computer-assisted simulation is a promising approach for clarifying complicated signaling networks. However, this approach is currently limited by a deficiency of kinetic parameters determined in living cells. To overcome this problem, we applied fluorescence cross-correlation spectroscopy (FCCS) to measure dissociation constant (Kd) values of signaling molecule complexes in living cells (in vivo Kd). Among the pairs of fluorescent molecules tested, that of monomerized enhanced green fluorescent protein (mEGFP) and HaloTag-tetramethylrhodamine was the most suitable for the measurement of in vivo Kd by FCCS. Using this pair, we determined 22 in vivo Kd values of signaling molecule complexes comprising the epidermal growth factor receptor (EGFR)-Ras-extracellular signal-regulated kinase (ERK) mitogen-activated protein (MAP) kinase pathway. With these parameters, we developed a kinetic simulation model of the EGFR-Ras-ERK MAP kinase pathway and uncovered a potential role played by stoichiometry in Shc binding to EGFR during the peak activations of Ras, MEK, and ERK. Intriguingly, most of the in vivo Kd values determined in this study were higher than the in vitro Kd values reported previously, suggesting the significance of competitive binding inside cells. These in vivo Kd values will provide a sound basis for the quantitative understanding of signal transduction.
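    As a reminder of the quantity FCCS estimates, a dissociation constant follows from total and complex concentrations via Kd = [A][B]/[AB]; the numbers below are illustrative, not values from the study:

    ```python
    def in_vivo_kd(total_a, total_b, complex_ab):
        """Dissociation constant Kd = [A][B]/[AB] for A + B <-> AB, given total
        concentrations and the measured complex concentration (same units, e.g. nM)."""
        free_a = total_a - complex_ab
        free_b = total_b - complex_ab
        return free_a * free_b / complex_ab

    # Illustrative numbers only: 500 nM of each partner, 100 nM in complex.
    print(in_vivo_kd(500.0, 500.0, 100.0), "nM")  # -> 1600.0 nM
    ```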

  14. Quantitative fluorescence spectroscopy and flow cytometry analyses of cell-penetrating peptides internalization pathways: optimization, pitfalls, comparison with mass spectrometry quantification

    NASA Astrophysics Data System (ADS)

    Illien, Françoise; Rodriguez, Nicolas; Amoura, Mehdi; Joliot, Alain; Pallerla, Manjula; Cribier, Sophie; Burlina, Fabienne; Sagan, Sandrine

    2016-11-01

    The mechanism of cell-penetrating peptide entry into cells is unclear, preventing the development of more efficient vectors for biotechnological or therapeutic purposes. Here, we developed a protocol relying on fluorometry to distinguish endocytosis from direct membrane translocation, using Penetratin, TAT and R9. The quantities of internalized CPPs measured by fluorometry in cell lysates converge with those obtained by our previously reported mass spectrometry quantification method. By contrast, flow cytometry quantification faces several limitations due to fluorescence quenching processes that depend on the cell line and occur at peptide/cell ratios >6 × 10⁸ for CF-Penetratin. The analysis of the cellular internalization of a doubly labeled fluorescent and biotinylated Penetratin analogue by the two independent techniques, fluorometry and mass spectrometry, gave consistent results at the quantitative and qualitative levels. Both techniques revealed the use of two alternative pathways, translocation and endocytosis, whose relative efficacy depends on cell-surface sugars and peptide concentration. We confirmed that Penetratin translocates at low concentrations and uses endocytosis at high μM concentrations. We further demonstrate that the hydrophobic/hydrophilic nature of the N-terminal extremity impacts the internalization efficiency of CPPs. We expect these results and the associated protocols to help unravel the translocation pathway to the cytosol of cells.

  15. Quantitative fluorescence spectroscopy and flow cytometry analyses of cell-penetrating peptides internalization pathways: optimization, pitfalls, comparison with mass spectrometry quantification.

    PubMed

    Illien, Françoise; Rodriguez, Nicolas; Amoura, Mehdi; Joliot, Alain; Pallerla, Manjula; Cribier, Sophie; Burlina, Fabienne; Sagan, Sandrine

    2016-11-14

    The mechanism of cell-penetrating peptide entry into cells is unclear, preventing the development of more efficient vectors for biotechnological or therapeutic purposes. Here, we developed a protocol relying on fluorometry to distinguish endocytosis from direct membrane translocation, using Penetratin, TAT and R9. The quantities of internalized CPPs measured by fluorometry in cell lysates converge with those obtained by our previously reported mass spectrometry quantification method. By contrast, flow cytometry quantification faces several limitations due to fluorescence quenching processes that depend on the cell line and occur at peptide/cell ratios >6 × 10⁸ for CF-Penetratin. The analysis of the cellular internalization of a doubly labeled fluorescent and biotinylated Penetratin analogue by the two independent techniques, fluorometry and mass spectrometry, gave consistent results at the quantitative and qualitative levels. Both techniques revealed the use of two alternative pathways, translocation and endocytosis, whose relative efficacy depends on cell-surface sugars and peptide concentration. We confirmed that Penetratin translocates at low concentrations and uses endocytosis at high μM concentrations. We further demonstrate that the hydrophobic/hydrophilic nature of the N-terminal extremity impacts the internalization efficiency of CPPs. We expect these results and the associated protocols to help unravel the translocation pathway to the cytosol of cells.

  16. Correlation and quantitative trait loci analyses of total chlorophyll content and photosynthetic rate of rice (Oryza sativa) under water stress and well-watered conditions.

    PubMed

    Hu, Song-Ping; Zhou, Ying; Zhang, Lin; Zhu, Xiu-Dong; Li, Lin; Luo, Li-Jun; Liu, Guo-Lan; Zhou, Qing-Ming

    2009-09-01

    In order to explore the molecular genetic mechanisms underlying photosynthetic rate (PR) and chlorophyll content (CC) in rice (Oryza sativa L.), we conducted a series of related experiments using a population of recombinant inbred lines (Zhenshan97B × IRAT109). We found a significant correlation between CC and PR (r = 0.19**) under well-watered conditions, but no significant correlation under water stress (r = 0.08). We detected 13 main quantitative trait loci (QTLs) associated with CC, located on chromosomes 1, 2, 3, 4, 5, 6, and 10: six QTLs on chromosomes 1, 2, 3, 4, and 5 under water stress, and seven QTLs on chromosomes 2, 3, 4, 6, and 10 under well-watered conditions. These QTLs explained 47.39% of the phenotypic variation under water stress and 56.19% under well-watered conditions. We detected four main QTLs associated with PR; three of them (qPR2, qPR10, qPR11) were located on chromosomes 2, 10, and 11 under water stress, and one (qPR10) was located on chromosome 10 under well-watered conditions. These QTLs explained 34.37% and 18.41% of the phenotypic variation under water stress and well-watered conditions, respectively. Overall, CC was largely controlled by main-effect QTLs, whereas PR was mainly controlled by epistatic QTL pairs.

  17. Quantitative fluorescence spectroscopy and flow cytometry analyses of cell-penetrating peptides internalization pathways: optimization, pitfalls, comparison with mass spectrometry quantification

    PubMed Central

    Illien, Françoise; Rodriguez, Nicolas; Amoura, Mehdi; Joliot, Alain; Pallerla, Manjula; Cribier, Sophie; Burlina, Fabienne; Sagan, Sandrine

    2016-01-01

    The mechanism of cell-penetrating peptide entry into cells is unclear, preventing the development of more efficient vectors for biotechnological or therapeutic purposes. Here, we developed a protocol relying on fluorometry to distinguish endocytosis from direct membrane translocation, using Penetratin, TAT and R9. The quantities of internalized CPPs measured by fluorometry in cell lysates converge with those obtained by our previously reported mass spectrometry quantification method. By contrast, flow cytometry quantification faces several limitations due to fluorescence quenching processes that depend on the cell line and occur at peptide/cell ratios >6 × 10⁸ for CF-Penetratin. The analysis of the cellular internalization of a doubly labeled fluorescent and biotinylated Penetratin analogue by the two independent techniques, fluorometry and mass spectrometry, gave consistent results at the quantitative and qualitative levels. Both techniques revealed the use of two alternative pathways, translocation and endocytosis, whose relative efficacy depends on cell-surface sugars and peptide concentration. We confirmed that Penetratin translocates at low concentrations and uses endocytosis at high μM concentrations. We further demonstrate that the hydrophobic/hydrophilic nature of the N-terminal extremity impacts the internalization efficiency of CPPs. We expect these results and the associated protocols to help unravel the translocation pathway to the cytosol of cells. PMID:27841303

  18. Fast and Reliable Quantitative Peptidomics with labelpepmatch.

    PubMed

    Verdonck, Rik; De Haes, Wouter; Cardoen, Dries; Menschaert, Gerben; Huhn, Thomas; Landuyt, Bart; Baggerman, Geert; Boonen, Kurt; Wenseleers, Tom; Schoofs, Liliane

    2016-03-04

    The use of stable isotope tags in quantitative peptidomics offers many advantages, but the laborious identification of matching sets of labeled peptide peaks is still a major bottleneck. Here we present labelpepmatch, an R package for fast and straightforward analysis of LC-MS spectra of labeled peptides. This open-source tool offers fast and accurate identification of peak pairs alongside an appropriate framework for statistical inference on quantitative peptidomics data, based on techniques from other -omics disciplines. A relevant case study on the desert locust Schistocerca gregaria proves our pipeline to be a reliable tool for quick but thorough exploratory analyses.
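    The core matching step, finding peptide peaks separated by the label mass shift, can be sketched as follows in Python (labelpepmatch itself is an R package); the mass shift and tolerance are illustrative assumptions, not labelpepmatch defaults:

    ```python
    # Sketch of label peak-pair matching: find peaks separated by the label
    # mass shift. Numbers are illustrative, not labelpepmatch defaults.
    LABEL_SHIFT_DA = 4.0251  # hypothetical light/heavy label mass difference
    TOL_DA = 0.01            # matching tolerance in Daltons

    def find_label_pairs(masses, shift=LABEL_SHIFT_DA, tol=TOL_DA):
        """Return index pairs (i, j) where masses[j] - masses[i] matches the shift."""
        order = sorted(range(len(masses)), key=lambda i: masses[i])
        pairs = []
        for a, i in enumerate(order):
            for j in order[a + 1:]:
                diff = masses[j] - masses[i]
                if diff > shift + tol:
                    break  # masses are sorted, so no later peak can match
                if abs(diff - shift) <= tol:
                    pairs.append((i, j))
        return pairs

    peaks = [1200.603, 1204.628, 1350.710, 1359.777]  # hypothetical deconvoluted masses
    print(find_label_pairs(peaks))  # -> [(0, 1)]
    ```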

  19. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    NASA Astrophysics Data System (ADS)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained with technologies such as LiDAR, SoNAR, SAR, SfM, and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We will present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to (x,y,z) data to calculate a suite of (spatial and spectral domain) statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate the quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification on very large point clouds with great computational efficiency. Using such tools, it may be possible to detect geomorphic change in surfaces that have undergone minimal elevation difference, for example deflation surfaces that have coarsened but undergone no net elevation change, or surfaces that have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with a multibeam echosounder, and topographic data collected with LiDAR.
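    The moving-window idea is easy to sketch: bin (x,y,z) points into windows, detrend each window with a plane fit, and grid the residual standard deviation as a roughness statistic. The window size and synthetic data below are assumptions for illustration, not TopCAT or PySESA defaults:

    ```python
    import numpy as np

    # Sketch of windowed point-cloud statistics: slide a square window over
    # (x, y, z) points and grid a roughness statistic (here, the standard
    # deviation of plane-detrended z). Window size is an assumption.
    def windowed_roughness(xyz, window=1.0):
        x, y, z = xyz.T
        xi = np.floor((x - x.min()) / window).astype(int)
        yi = np.floor((y - y.min()) / window).astype(int)
        grid = np.full((xi.max() + 1, yi.max() + 1), np.nan)
        for ix in range(grid.shape[0]):
            for iy in range(grid.shape[1]):
                sel = (xi == ix) & (yi == iy)
                if sel.sum() < 4:
                    continue  # too few points to detrend reliably
                # Least-squares plane fit, then std of the residuals.
                A = np.c_[x[sel], y[sel], np.ones(sel.sum())]
                coef, *_ = np.linalg.lstsq(A, z[sel], rcond=None)
                grid[ix, iy] = np.std(z[sel] - A @ coef)
        return grid

    rng = np.random.default_rng(0)
    pts = rng.random((5000, 3)) * [10, 10, 0.1]  # synthetic 10 m x 10 m patch
    print(windowed_roughness(pts).shape)  # -> (10, 10)
    ```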

  20. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  1. Qualitative and quantitative analyses of Epstein-Barr virus early antigen diffuse component by western blotting enzyme-linked immunosorbent assay with a monoclonal antibody.

    PubMed Central

    Lin, J C; Choi, E I; Pagano, J S

    1985-01-01

    We report the use of a monoclonal antibody against the early antigen diffuse component (anti-EA-D) of Epstein-Barr virus (EBV) to analyze, both qualitatively and quantitatively, the expression of EA-D in various human lymphoblastoid cell lines activated by chemical inducers. The kinetics of synthesis of EA-D in P3HR-1, B95-8, and Ramos/AW cells were similar in that they all reached the peak of synthesis on day 5 after induction. Surprisingly, no expression of EA-D was found in induced BJAB/GC, an EBV-genome-containing cell line. The EBV-negative cell lines BJAB and Ramos were negative for EA-D. Raji cells had no detectable EA-D but responded rapidly to induction, reaching a peak on day 3. Superinfection of Raji cells also resulted in marked induction of EA-D, which reached a plateau between 8 and 12 h postinfection. Western blotting coupled with the enzyme-linked immunosorbent assay was employed to identify polypeptides representing EA-D. A family of four polypeptides with molecular weights of 46,000 (46K protein), 49,000, 52,000, and 55,000 was identified as reactive with monoclonal anti-EA-D antiserum. The pattern of EA-D polypeptides expressed in each cell line was different. Of particular interest was the expression of a large quantity of the 46K protein in both induced Raji and P3HR-1 cells, but not in superinfected Raji cells. A 49K doublet was expressed in activated P3HR-1, B95-8, and Ramos/AW cells and in superinfected Raji cells. In addition, two distinct 52K and 55K polypeptides were expressed in induced Ramos/AW and superinfected Raji cells. However, none of these EA-D polypeptides was detectable in BJAB/GC, BJAB, Ramos, or mock-infected Raji cells. To approximate relative concentrations of EA-D in cell extracts, we employed the enzyme-linked immunosorbent assay and immunoblot dot methods, using one of the purified EA-D components to construct a standard curve. Depending upon the cell lines, it was estimated that ca. 1 to 3% (determined by the enzyme

  2. Integration of CO2 flux and remotely-sensed data for primary production and ecosystem respiration analyses in the Northern Great Plains: potential for quantitative spatial extrapolation

    USGS Publications Warehouse

    Gilmanov, Tagir G.; Tieszen, Larry L.; Wylie, Bruce K.; Flanagan, Larry B.; Frank, Albert B.; Haferkamp, Marshall R.; Meyers, Tilden P.; Morgan, Jack A.

    2005-01-01

    Aim: Extrapolation of tower CO2 fluxes will be greatly facilitated if robust relationships between flux components and remotely sensed factors are established. Long-term measurements at five Northern Great Plains locations were used to obtain relationships between CO2 fluxes and photosynthetically active radiation (Q), other on-site factors, and the Normalized Difference Vegetation Index (NDVI) from the SPOT VEGETATION data set. Location: CO2 flux data from the following stations and years were analysed: Lethbridge, Alberta 1998–2001; Fort Peck, MT 2000, 2002; Miles City, MT 2000–01; Mandan, ND 1999–2001; and Cheyenne, WY 1997–98. Results: Analyses based on light-response functions allowed partitioning net CO2 flux (F) into gross primary productivity (Pg) and ecosystem respiration (Re). Weekly averages of daytime respiration, γday, estimated from light responses were closely correlated with weekly averages of measured night-time respiration, γnight (R2 = 0.64 to 0.95). Daytime respiration tended to be higher than night-time respiration, and regressions of γday on γnight for all sites were different from 1:1 relationships. Over 13 site-years, gross primary production varied from 459 to 2491 g CO2 m−2 year−1, ecosystem respiration from 996 to 1881 g CO2 m−2 year−1, and net ecosystem exchange from −537 (source) to +610 g CO2 m−2 year−1 (sink). Maximum daily ecological light-use efficiencies, εd,max = Pg/Q, were in the range 0.014 to 0.032 mol CO2 (mol incident quanta)−1. Main conclusions: Ten-day average Pg was significantly more highly correlated with NDVI than ten-day average daytime flux, Pd (R2 = 0.46 to 0.77 for Pg-NDVI and 0.05 to 0.58 for Pd-NDVI relationships). Ten-day average Re was also positively correlated with NDVI, with R2 values from 0.57 to 0.77. Patterns of the relationships of Pg and Re with NDVI and other factors indicate possibilities for establishing multivariate
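    Partitioning by light-response functions, as described above, can be sketched by fitting a rectangular-hyperbola model to daytime flux data. The functional form, sign convention, and synthetic numbers are illustrative assumptions; the study's actual fitting protocol may differ:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Rectangular-hyperbola light response: F(Q) = alpha*Q*Pmax/(alpha*Q + Pmax) - Re,
    # where F is net CO2 uptake, Q is PAR, Pmax the asymptotic gross photosynthesis,
    # alpha the initial slope, and Re the (assumed constant) ecosystem respiration.
    def light_response(Q, alpha, Pmax, Re):
        return alpha * Q * Pmax / (alpha * Q + Pmax) - Re

    rng = np.random.default_rng(1)
    Q = rng.uniform(0, 2000, 200)                       # synthetic PAR values
    F = light_response(Q, 0.02, 30.0, 5.0) + rng.normal(0, 1.0, Q.size)

    (alpha, Pmax, Re), _ = curve_fit(light_response, Q, F, p0=(0.01, 20.0, 2.0))
    Pg = light_response(Q, alpha, Pmax, 0.0)            # gross uptake component
    print(f"alpha={alpha:.3f}, Pmax={Pmax:.1f}, Re={Re:.1f}")
    ```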

  3. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.

  4. Quantitative aspects of inductively coupled plasma mass spectrometry.

    PubMed

    Bulska, Ewa; Wagner, Barbara

    2016-10-28

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.
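    As a minimal sketch of the simplest of these calibration approaches, external calibration with pure standards fits signal against concentration and inverts the line for the unknown; all numbers below are illustrative:

    ```python
    import numpy as np

    # External calibration sketch: linear fit of ICP-MS signal vs standard
    # concentration, then inversion for an unknown sample. Numbers are illustrative.
    conc_std = np.array([0.0, 1.0, 5.0, 10.0, 50.0])      # ng/mL standards
    signal_std = np.array([120, 980, 4890, 9760, 48800])  # counts/s

    slope, intercept = np.polyfit(conc_std, signal_std, 1)

    signal_unknown = 23400.0
    conc_unknown = (signal_unknown - intercept) / slope
    print(f"unknown = {conc_unknown:.1f} ng/mL")
    ```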

  5. A method to accurately quantitate intensities of (32)P-DNA bands when multiple bands appear in a single lane of a gel is used to study dNTP insertion opposite a benzo[a]pyrene-dG adduct by Sulfolobus DNA polymerases Dpo4 and Dbh.

    PubMed

    Sholder, Gabriel; Loechler, Edward L

    2015-01-01

    Quantitating relative (32)P-band intensity in gels is desired, e.g., to study primer-extension kinetics of DNA polymerases (DNAPs). Following imaging, multiple (32)P-bands are often present in lanes. Though individual bands appear by eye to be simple and well-resolved, scanning reveals they are actually skewed-Gaussian in shape and neighboring bands are overlapping, which complicates quantitation, because slower migrating bands often have considerable contributions from the trailing edges of faster migrating bands. A method is described to accurately quantitate adjacent (32)P-bands, which relies on having a standard: a simple skewed-Gaussian curve from an analogous pure, single-component band (e.g., primer alone). This single-component scan/curve is superimposed on its corresponding band in an experimentally determined scan/curve containing multiple bands (e.g., generated in a primer-extension reaction); intensity exceeding the single-component scan/curve is attributed to other components (e.g., insertion products). Relative areas/intensities are determined via pixel analysis, from which relative molarity of components is computed. Common software is used. Commonly used alternative methods (e.g., drawing boxes around bands) are shown to be less accurate. Our method was used to study kinetics of dNTP primer-extension opposite a benzo[a]pyrene-N(2)-dG-adduct with four DNAPs, including Sulfolobus solfataricus Dpo4 and Sulfolobus acidocaldarius Dbh. Vmax/Km is similar for correct dCTP insertion with Dpo4 and Dbh. Compared to Dpo4, Dbh misinsertion is slower for dATP (∼20-fold), dGTP (∼110-fold) and dTTP (∼6-fold), due to decreases in Vmax. These findings provide support that Dbh is in the same Y-Family DNAP class as eukaryotic DNAP κ and bacterial DNAP IV, which accurately bypass N(2)-dG adducts, as well as establish the scan-method described herein as an accurate method to quantitate relative intensity of overlapping bands in a single lane, whether generated
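    The standard-subtraction strategy can be sketched as follows. The skewed-Gaussian parameterization (SciPy's skew-normal) and the synthetic lane scans are assumptions for illustration; the authors' exact curve form and software may differ:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import skewnorm

    # Sketch of the single-component-standard method: fit a skewed Gaussian to a
    # pure primer-only lane scan, superimpose it on a multi-band scan, and
    # attribute the excess intensity to slower-migrating extension products.
    def skewed_gaussian(x, amp, loc, scale, alpha):
        return amp * skewnorm.pdf(x, alpha, loc=loc, scale=scale)

    x = np.arange(200.0)  # pixel positions along the lane

    # Synthetic "scans" (illustrative): a primer-only standard, and a reaction
    # lane containing the primer band plus an overlapping product band.
    primer_scan = skewed_gaussian(x, 900, 80, 8, 4)
    mixed_scan = primer_scan + skewed_gaussian(x, 400, 95, 8, 4)

    # Fit the standard (in this synthetic case the primer band in the mixed lane
    # has the same amplitude, so no rescaling is needed; real data would require
    # scaling the fit to the primer band)...
    params, _ = curve_fit(skewed_gaussian, x, primer_scan, p0=(850, 78, 9, 3))
    primer_fit = skewed_gaussian(x, *params)

    # ...and integrate the excess as the extension-product contribution.
    product = np.clip(mixed_scan - primer_fit, 0, None)
    frac_extended = product.sum() / mixed_scan.sum()
    print(f"fraction extended ~= {frac_extended:.2f}")  # ~0.31 for these numbers
    ```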

  6. Some quantitative analyses of Net Import Reliance

    SciTech Connect

    Harker, R.I.

    1985-01-01

    The question of security of supply of mineral commodities as applied to the technically advanced nations (TANs) is of growing concern in view of recent geopolitical problems and long-term demand trends. Net Import Reliance (N.I.R.) is not a satisfactory measure of vulnerability, or conversely of security, since it takes no account of domestic reserves, the number of foreign sources, the relative importance of each foreign source, or their reliability. A proposed Distribution Security Index (D.S.I.) takes all but the last of these quantifiable factors into account. It is a step towards differentiating the security of individual TANs with respect to specific commodities for which the N.I.R. approaches 100%. 25 references, 2 figures, 4 tables.

  7. Accurate Finite Difference Algorithms

    NASA Technical Reports Server (NTRS)

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order, high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.

  8. Accurate monotone cubic interpolation

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1991-01-01

    Monotone piecewise cubic interpolants are simple and effective. They are generally third-order accurate, except near strict local extrema, where accuracy degenerates to second order due to the monotonicity constraint. Algorithms for piecewise cubic interpolants that preserve monotonicity as well as uniform third- and fourth-order accuracy are presented. The gain in accuracy is obtained by relaxing the monotonicity constraint in a geometric framework in which the median function plays a crucial role.
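    For a concrete baseline, SciPy's PCHIP interpolator implements a standard monotone piecewise-cubic scheme of the classical Fritsch-Carlson type, i.e., the kind of method whose accuracy near extrema the paper improves upon:

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    # Monotone data: a plain cubic spline would overshoot near the plateau,
    # while PCHIP preserves monotonicity (at some cost in accuracy near extrema,
    # which is the limitation the paper's algorithms address).
    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.0, 0.1, 0.9, 1.0, 1.0])

    pchip = PchipInterpolator(x, y)
    xs = np.linspace(0, 4, 9)
    print(np.round(pchip(xs), 3))  # values stay within [0, 1]: no overshoot
    ```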

  9. Application of the accurate mass and time tag approach in studies of the human blood lipidome

    PubMed Central

    Ding, Jie; Sorensen, Christina M.; Jaitly, Navdeep; Jiang, Hongliang; Orton, Daniel J.; Monroe, Matthew E.; Moore, Ronald J.; Smith, Richard D.; Metz, Thomas O.

    2008-01-01

    We report a preliminary demonstration of the accurate mass and time (AMT) tag approach for lipidomics. Initial data-dependent LC-MS/MS analyses of human plasma, erythrocyte, and lymphocyte lipids were performed in order to identify lipid molecular species in conjunction with complementary accurate mass and isotopic distribution information. Identified lipids were used to populate initial lipid AMT tag databases containing 250 and 45 entries for those species detected in positive and negative electrospray ionization (ESI) modes, respectively. The positive ESI database was then utilized to identify human plasma, erythrocyte, and lymphocyte lipids in high-throughput LC-MS analyses based on the AMT tag approach. We were able to define the lipid profiles of human plasma, erythrocytes, and lymphocytes based on qualitative and quantitative differences in lipid abundance. PMID:18502191
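    At its core, AMT tag identification is a tolerance search in (accurate mass, normalized elution time); the database entries and tolerances below are illustrative assumptions, not values from the study:

    ```python
    # Sketch of AMT-tag matching: identify a detected LC-MS feature by finding
    # database entries within a mass tolerance (ppm) and an elution-time window.
    # Database entries and tolerances are illustrative, not values from the study.
    amt_db = [
        {"lipid": "PC(34:1)", "mass": 759.5778, "net": 0.62},
        {"lipid": "PE(36:2)", "mass": 743.5465, "net": 0.58},
    ]

    def match_feature(mass, net, db=amt_db, ppm_tol=5.0, net_tol=0.02):
        hits = []
        for entry in db:
            ppm_err = 1e6 * abs(mass - entry["mass"]) / entry["mass"]
            if ppm_err <= ppm_tol and abs(net - entry["net"]) <= net_tol:
                hits.append((entry["lipid"], round(ppm_err, 2)))
        return hits

    print(match_feature(759.5781, 0.615))  # -> [('PC(34:1)', 0.39)]
    ```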

  10. Accurate quantum chemical calculations

    NASA Technical Reports Server (NTRS)

    Bauschlicher, Charles W., Jr.; Langhoff, Stephen R.; Taylor, Peter R.

    1989-01-01

    An important goal of quantum chemical calculations is to provide an understanding of chemical bonding and molecular electronic structure. A second goal, the prediction of energy differences to chemical accuracy, has been much harder to attain. First, the computational resources required to achieve such accuracy are very large, and second, it is not straightforward to demonstrate that an apparently accurate result, in terms of agreement with experiment, does not result from a cancellation of errors. Recent advances in electronic structure methodology, coupled with the power of vector supercomputers, have made it possible to solve a number of electronic structure problems exactly using the full configuration interaction (FCI) method within a subspace of the complete Hilbert space. These exact results can be used to benchmark approximate techniques that are applicable to a wider range of chemical and physical problems. The methodology of many-electron quantum chemistry is reviewed. Methods are considered in detail for performing FCI calculations. The application of FCI methods to several three-electron problems in molecular physics are discussed. A number of benchmark applications of FCI wave functions are described. Atomic basis sets and the development of improved methods for handling very large basis sets are discussed: these are then applied to a number of chemical and spectroscopic problems; to transition metals; and to problems involving potential energy surfaces. Although the experiences described give considerable grounds for optimism about the general ability to perform accurate calculations, there are several problems that have proved less tractable, at least with current computer resources, and these and possible solutions are discussed.

  11. BIOACCESSIBILITY TESTS ACCURATELY ESTIMATE ...

    EPA Pesticide Factsheets

    Hazards of soil-borne Pb to wild birds may be more accurately quantified if the bioavailability of that Pb is known. To better understand the bioavailability of Pb to birds, we measured blood Pb concentrations in Japanese quail (Coturnix japonica) fed diets containing Pb-contaminated soils. Relative bioavailabilities were expressed by comparison with blood Pb concentrations in quail fed a Pb acetate reference diet. Diets containing soil from five Pb-contaminated Superfund sites had relative bioavailabilities from 33% to 63%, with a mean of about 50%. Treatment of two of the soils with P significantly reduced the bioavailability of Pb. The bioaccessibility of the Pb in the test soils was then measured in six in vitro tests and regressed on bioavailability. The tests were: the “Relative Bioavailability Leaching Procedure” (RBALP) at pH 1.5, the same test conducted at pH 2.5, the “Ohio State University In vitro Gastrointestinal” method (OSU IVG), the “Urban Soil Bioaccessible Lead Test”, the modified “Physiologically Based Extraction Test”, and the “Waterfowl Physiologically Based Extraction Test”. All regressions had positive slopes. Based on criteria of slope and coefficient of determination, the RBALP pH 2.5 and OSU IVG tests performed very well. Speciation by X-ray absorption spectroscopy demonstrated that, on average, most of the Pb in the sampled soils was sorbed to minerals (30%), bound to organic matter (24%), or present as Pb sulfate (18%). Ad

  12. Accurate spectral color measurements

    NASA Astrophysics Data System (ADS)

    Hiltunen, Jouni; Jaeaeskelaeinen, Timo; Parkkinen, Jussi P. S.

    1999-08-01

    Surface color measurement is of importance in a very wide range of industrial applications including paint, paper, printing, photography, textiles, plastics and so on. For demanding color measurements, a spectral approach is often needed. One can measure a color spectrum with a spectrophotometer using calibrated standard samples as a reference. Because it is impossible to define absolute color values of a sample, we always work with approximations. The human eye can perceive color differences as small as 0.5 CIELAB units and can thus distinguish millions of colors. This 0.5-unit difference should be the goal for precise color measurements. This limit is not a problem if we only want to measure the color difference between two samples, but if we also want to know exact color-coordinate values, accuracy problems arise. The values from two instruments can be astonishingly different. The accuracy of the instrument used in color measurement may depend on various errors, such as photometric non-linearity, wavelength error, integrating sphere dark-level error, and integrating sphere error in both specular-included and specular-excluded modes. Thus, correction formulas should be used to get more accurate results. Another question is how many channels, i.e., wavelengths, we use to measure a spectrum. It is obvious that the sampling interval should be short to get more precise results. Furthermore, the result we get is always a compromise of measuring time, conditions, and cost. Sometimes we have to use a portable system, or the shape and size of the samples make it impossible to use sensitive equipment. In this study a small set of calibrated color tiles measured with the Perkin Elmer Lambda 18 and the Minolta CM-2002 spectrophotometers are compared. In the paper we explain the typical error sources of spectral color measurements, and show which accuracy demands a good colorimeter should meet.
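    The 0.5-unit perceptibility threshold refers to the CIE76 color difference, which is simply the Euclidean distance in CIELAB space:

    ```python
    import math

    # CIE76 color difference: Euclidean distance in CIELAB space.
    # A delta E around 0.5 is near the threshold of human perception.
    def delta_e_cie76(lab1, lab2):
        return math.dist(lab1, lab2)

    tile_reference = (52.1, 42.2, 20.9)   # illustrative L*, a*, b* values
    tile_measured = (52.3, 42.5, 20.7)
    print(round(delta_e_cie76(tile_reference, tile_measured), 3))  # -> 0.412
    ```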

  13. Quantitative myocardial perfusion SPECT.

    PubMed

    Tsui, B M; Frey, E C; LaCroix, K J; Lalush, D S; McCartney, W H; King, M A; Gullberg, G T

    1998-01-01

    In recent years, there has been much interest in the clinical application of attenuation compensation to myocardial perfusion single photon emission computed tomography (SPECT), with the promise that accurate quantitative images can be obtained to improve clinical diagnoses. The different attenuation compensation methods that are available create confusion and some misconceptions. Also, attenuation-compensated images reveal other image-degrading effects, including collimator-detector blurring and scatter, that are not apparent in uncompensated images. This article presents basic concepts of the major factors that degrade the quality and quantitative accuracy of myocardial perfusion SPECT images, and includes a discussion of the various image reconstruction and compensation methods, as well as misconceptions and pitfalls in their implementation. The differences between the various compensation methods and their performance are demonstrated. Particular emphasis is directed to an approach that promises to provide quantitative myocardial perfusion SPECT images by accurately compensating for the 3-dimensional (3-D) attenuation, collimator-detector response, and scatter effects. With advances in computer hardware and optimized implementation techniques, quantitatively accurate and high-quality myocardial perfusion SPECT images can be obtained in clinically acceptable processing time. Examples from simulation, phantom, and patient studies are used to demonstrate the various aspects of the investigation. We conclude that quantitative myocardial perfusion SPECT, which holds great promise to improve clinical diagnosis, is an achievable goal in the near future.

  14. Sociopolitical Analyses.

    ERIC Educational Resources Information Center

    Van Galen, Jane, Ed.; And Others

    1992-01-01

    This theme issue of the serial "Educational Foundations" contains four articles devoted to the topic of "Sociopolitical Analyses." In "An Interview with Peter L. McLaren," Mary Leach presented the views of Peter L. McLaren on topics of local and national discourses, values, and the politics of difference. Landon E.…

  15. Accurate protein crystallography at ultra-high resolution: Valence electron distribution in crambin

    PubMed Central

    Jelsch, Christian; Teeter, Martha M.; Lamzin, Victor; Pichon-Pesme, Virginie; Blessing, Robert H.; Lecomte, Claude

    2000-01-01

    The charge density distribution of a protein has been refined experimentally. Diffraction data for a crambin crystal were measured to ultra-high resolution (0.54 Å) at low temperature by using short-wavelength synchrotron radiation. The crystal structure was refined with a model for charged, nonspherical, multipolar atoms to accurately describe the molecular electron density distribution. The refined parameters agree within 25% with our transferable electron density library derived from accurate single crystal diffraction analyses of several amino acids and small peptides. The resulting electron density maps of redistributed valence electrons (deformation maps) compare quantitatively well with a high-level quantum mechanical calculation performed on a monopeptide. This study provides validation for experimentally derived parameters and a window into charge density analysis of biological macromolecules. PMID:10737790

  16. Accurate protein crystallography at ultra-high resolution: valence electron distribution in crambin.

    PubMed

    Jelsch, C; Teeter, M M; Lamzin, V; Pichon-Pesme, V; Blessing, R H; Lecomte, C

    2000-03-28

    The charge density distribution of a protein has been refined experimentally. Diffraction data for a crambin crystal were measured to ultra-high resolution (0.54 Å) at low temperature by using short-wavelength synchrotron radiation. The crystal structure was refined with a model for charged, nonspherical, multipolar atoms to accurately describe the molecular electron density distribution. The refined parameters agree within 25% with our transferable electron density library derived from accurate single crystal diffraction analyses of several amino acids and small peptides. The resulting electron density maps of redistributed valence electrons (deformation maps) compare quantitatively well with a high-level quantum mechanical calculation performed on a monopeptide. This study provides validation for experimentally derived parameters and a window into charge density analysis of biological macromolecules.

  17. Multivariate Quantitative Chemical Analysis

    NASA Technical Reports Server (NTRS)

    Kinchen, David G.; Capezza, Mary

    1995-01-01

    Technique of multivariate quantitative chemical analysis devised for use in determining relative proportions of two components mixed and sprayed together onto object to form thermally insulating foam. Potentially adaptable to other materials, especially in process-monitoring applications in which necessary to know and control critical properties of products via quantitative chemical analyses of products. In addition to chemical composition, also used to determine such physical properties as densities and strengths.
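    One standard way to recover component proportions from a mixture measurement of this kind is classical least squares: express the mixture spectrum as a linear combination of pure-component spectra and solve for the coefficients. The spectra below are synthetic placeholders; the NASA technique may differ in detail:

    ```python
    import numpy as np

    # Classical least squares (CLS) sketch: mixture spectrum = S @ c, where the
    # columns of S are pure-component spectra and c holds the proportions.
    wavelengths = np.linspace(0, 1, 50)
    comp_a = np.exp(-((wavelengths - 0.3) / 0.05) ** 2)  # pure component A
    comp_b = np.exp(-((wavelengths - 0.7) / 0.08) ** 2)  # pure component B
    S = np.c_[comp_a, comp_b]

    true_c = np.array([0.65, 0.35])
    mixture = S @ true_c + np.random.default_rng(2).normal(0, 0.005, 50)

    c, *_ = np.linalg.lstsq(S, mixture, rcond=None)
    print(np.round(c, 3))  # -> approximately [0.65, 0.35]
    ```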

  18. Quantitative analysis of endogenous compounds.

    PubMed

    Thakare, Rhishikesh; Chhonker, Yashpal S; Gautam, Nagsen; Alamoudi, Jawaher Abdullah; Alnouti, Yazen

    2016-09-05

    Accurate quantitative analysis of endogenous analytes is essential for several clinical and non-clinical applications, and LC-MS/MS is the technique of choice for quantitative analyses. Absolute quantification by LC-MS requires preparing standard curves in the same matrix as the study samples, so that the matrix effect and the extraction efficiency for analytes are the same in both the standard and study samples. However, by definition, analyte-free biological matrices do not exist for endogenous compounds. To address the lack of blank matrices for the quantification of endogenous compounds by LC-MS/MS, four approaches are used: the standard addition, the background subtraction, the surrogate matrix, and the surrogate analyte methods. This review article presents an overview of these approaches, cites and summarizes their applications, and compares their advantages and disadvantages. In addition, we discuss in detail the validation requirements and compatibility with FDA guidelines needed to ensure method reliability in quantifying endogenous compounds. The standard addition, background subtraction, and surrogate analyte approaches allow the use of the same matrix for the calibration curve as the one to be analyzed in the test samples. In the surrogate matrix approach, by contrast, various matrices such as artificial, stripped, and neat matrices are used as surrogates for the actual matrix of the study samples. For the surrogate analyte approach, it is required to demonstrate similarity in matrix effect and recovery between the surrogate and authentic endogenous analytes. Similarly, for the surrogate matrix approach, it is required to demonstrate similar matrix effect and extraction recovery in both the surrogate and original matrices. All these methods represent indirect approaches to quantifying endogenous compounds, and regardless of which approach is followed, it must be shown that none of the validation criteria has been compromised by the indirect analysis.
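    Of the four approaches, standard addition is the most self-contained to illustrate: spike known amounts of analyte into aliquots of the sample, fit response against added concentration, and read the endogenous concentration from the magnitude of the x-intercept. The numbers are illustrative:

    ```python
    import numpy as np

    # Standard-addition sketch: the endogenous concentration is the magnitude of
    # the x-intercept of the response-vs-added-concentration line.
    added = np.array([0.0, 5.0, 10.0, 20.0])             # spiked concentration, ng/mL
    response = np.array([412.0, 598.0, 790.0, 1165.0])   # LC-MS/MS peak areas

    slope, intercept = np.polyfit(added, response, 1)
    endogenous = intercept / slope                       # x-intercept magnitude
    print(f"endogenous ~= {endogenous:.1f} ng/mL")       # -> ~10.9 for these numbers
    ```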

  19. Spotsizer: High-throughput quantitative analysis of microbial growth

    PubMed Central

    Jeffares, Daniel C.; Arzhaeva, Yulia; Bähler, Jürg

    2017-01-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license. PMID:27712582

  20. Spotsizer: High-throughput quantitative analysis of microbial growth.

    PubMed

    Bischof, Leanne; Převorovský, Martin; Rallis, Charalampos; Jeffares, Daniel C; Arzhaeva, Yulia; Bähler, Jürg

    2016-10-01

    Microbial colony growth can serve as a useful readout in assays for studying complex genetic interactions or the effects of chemical compounds. Although computational tools for acquiring quantitative measurements of microbial colonies have been developed, their utility can be compromised by inflexible input image requirements, non-trivial installation procedures, or complicated operation. Here, we present the Spotsizer software tool for automated colony size measurements in images of robotically arrayed microbial colonies. Spotsizer features a convenient graphical user interface (GUI), has both single-image and batch-processing capabilities, and works with multiple input image formats and different colony grid types. We demonstrate how Spotsizer can be used for high-throughput quantitative analysis of fission yeast growth. The user-friendly Spotsizer tool provides rapid, accurate, and robust quantitative analyses of microbial growth in a high-throughput format. Spotsizer is freely available at https://data.csiro.au/dap/landingpage?pid=csiro:15330 under a proprietary CSIRO license.

  1. Whole proteomes as internal standards in quantitative proteomics.

    PubMed

    Ong, Shao-En

    2010-07-30

    As mass-spectrometry-based quantitative proteomics approaches become increasingly powerful, researchers are taking advantage of well established methodologies and improving instrumentation to pioneer new protein expression profiling methods. For example, pooling several proteomes labeled using the stable isotope labeling by amino acids in cell culture (SILAC) method yields a whole-proteome stable isotope-labeled internal standard that can be mixed with a tissue-derived proteome for quantification. By increasing quantitative accuracy in the analysis of tissue proteomes, such methods should improve integration of protein expression profiling data with transcriptomic data and enhance downstream bioinformatic analyses. An accurate and scalable quantitative method to analyze tumor proteomes at the depth of several thousand proteins provides a powerful tool for global protein quantification of tissue samples and promises to redefine our understanding of tumor biology.

  2. Accurate Evaluation of Quantum Integrals

    NASA Technical Reports Server (NTRS)

    Galant, D. C.; Goorvitch, D.; Witteborn, Fred C. (Technical Monitor)

    1995-01-01

    Combining an appropriate finite difference method with Richardson extrapolation results in a simple, highly accurate numerical method for solving the Schrödinger equation. The important results are that error estimates are provided, and that one can extrapolate expectation values, rather than the wavefunctions, to obtain highly accurate expectation values. We discuss the eigenvalues and the error growth in repeated Richardson extrapolation, and show that expectation values calculated on a crude mesh can be extrapolated to obtain expectation values of high accuracy.
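    Richardson extrapolation itself is compact enough to show directly: combine results at step sizes h and h/2 to cancel the leading error term. The sketch applies it to a second-order central difference as an illustration of the principle, not as the paper's Schrödinger solver:

    ```python
    import math

    # Richardson extrapolation sketch: a second-order central difference has
    # error ~ c*h^2, so combining D(h/2) with D(h) cancels the leading term:
    # D_extrap = (4*D(h/2) - D(h)) / 3, which is fourth-order accurate.
    def central_diff(f, x, h):
        return (f(x + h) - f(x - h)) / (2 * h)

    f, x, h = math.sin, 1.0, 0.1
    d1 = central_diff(f, x, h)
    d2 = central_diff(f, x, h / 2)
    d_extrap = (4 * d2 - d1) / 3

    exact = math.cos(1.0)
    print(f"h:      {abs(d1 - exact):.2e}")        # ~9e-04 error
    print(f"h/2:    {abs(d2 - exact):.2e}")        # ~2e-04 error
    print(f"extrap: {abs(d_extrap - exact):.2e}")  # ~1e-07 error
    ```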

  3. Analyser-based phase contrast image reconstruction using geometrical optics.

    PubMed

    Kitchen, M J; Pavlov, K M; Siu, K K W; Menk, R H; Tromba, G; Lewis, R A

    2007-07-21

    Analyser-based phase contrast imaging can provide radiographs of exceptional contrast at high resolution (<100 μm), whilst quantitative phase and attenuation information can be extracted using just two images when the approximations of geometrical optics are satisfied. Analytical phase retrieval can be performed by fitting the analyser rocking curve with a symmetric Pearson type VII function. The Pearson VII function provided at least a 10% better fit to experimentally measured rocking curves than linear or Gaussian functions. A test phantom, a hollow nylon cylinder, was imaged at 20 keV using a Si(1 1 1) analyser at the ELETTRA synchrotron radiation facility. Our phase retrieval method yielded a more accurate object reconstruction than methods based on a linear fit to the rocking curve. Where reconstructions failed to map expected values, calculations of the Takagi number permitted distinction between the violation of the geometrical optics conditions and the failure of curve fitting procedures. The need for synchronized object/detector translation stages was removed by using a large, divergent beam and imaging the object in segments. Our image acquisition and reconstruction procedure enables quantitative phase retrieval for systems with a divergent source and accounts for imperfections in the analyser.
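    The Pearson VII profile used for the rocking-curve fit has a simple closed form and can be fitted with standard tools; the parameter values below are synthetic, not the ELETTRA data:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Symmetric Pearson VII profile: amp / (1 + ((x - x0)/w)^2 / m)^m.
    # m = 1 gives a Lorentzian and m -> infinity approaches a Gaussian, which is
    # why this form can track rocking-curve tails better than either alone.
    def pearson_vii(x, amp, x0, w, m):
        return amp / (1.0 + ((x - x0) / w) ** 2 / m) ** m

    theta = np.linspace(-20, 20, 401)  # analyser angle (synthetic units)
    measured = pearson_vii(theta, 1.0, 0.3, 4.0, 1.8)
    measured += np.random.default_rng(3).normal(0, 0.005, theta.size)

    popt, _ = curve_fit(pearson_vii, theta, measured, p0=(1.0, 0.0, 5.0, 2.0))
    print(np.round(popt, 2))  # -> approximately [1.0, 0.3, 4.0, 1.8]
    ```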

  4. LSM: perceptually accurate line segment merging

    NASA Astrophysics Data System (ADS)

    Hamid, Naila; Khan, Nazar

    2016-11-01

    Existing line segment detectors tend to break up perceptually distinct line segments into multiple segments. We propose an algorithm for merging such broken segments to recover the original perceptually accurate line segments. The algorithm proceeds by grouping line segments on the basis of angular and spatial proximity. Then those line segment pairs within each group that satisfy unique, adaptive mergeability criteria are successively merged to form a single line segment. This process is repeated until no more line segments can be merged. We also propose a method for quantitative comparison of line segment detection algorithms. Results on the York Urban dataset show that our merged line segments are closer to human-marked ground-truth line segments compared to state-of-the-art line segment detection algorithms.
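    The grouping criteria can be made concrete with a small sketch: two segments are merge candidates when their orientations, lateral offset, and endpoint gap are all small. The fixed thresholds are illustrative stand-ins for the paper's adaptive criteria:

    ```python
    import math

    # Sketch of pairwise mergeability: angular proximity, lateral offset, and
    # endpoint gap. Fixed thresholds stand in for the paper's adaptive criteria.
    def mergeable(seg1, seg2, max_angle=5.0, max_offset=2.0, max_gap=10.0):
        (x1, y1), (x2, y2) = seg1
        (x3, y3), (x4, y4) = seg2
        a1 = math.atan2(y2 - y1, x2 - x1)
        a2 = math.atan2(y4 - y3, x4 - x3)
        dang = abs(math.degrees(a1 - a2)) % 180
        dang = min(dang, 180 - dang)                 # undirected angle difference
        # Perpendicular distance from seg2's midpoint to seg1's supporting line.
        mx, my = (x3 + x4) / 2, (y3 + y4) / 2
        nx, ny = -(y2 - y1), (x2 - x1)
        offset = abs((mx - x1) * nx + (my - y1) * ny) / math.hypot(nx, ny)
        gap = min(math.dist(p, q) for p in seg1 for q in seg2)
        return dang <= max_angle and offset <= max_offset and gap <= max_gap

    s1 = ((0.0, 0.0), (50.0, 0.5))
    s2 = ((55.0, 0.8), (100.0, 1.2))
    print(mergeable(s1, s2))  # -> True: nearly collinear with a small gap
    ```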

  5. Fully automated quantitative cephalometry using convolutional neural networks.

    PubMed

    Arık, Sercan Ö; Ibragimov, Bulat; Xing, Lei

    2017-01-01

    Quantitative cephalometry plays an essential role in clinical diagnosis, treatment, and surgery. Development of fully automated techniques for these procedures is important to enable consistently accurate computerized analyses. We study the application of deep convolutional neural networks (CNNs) for fully automated quantitative cephalometry for the first time. The proposed framework utilizes CNNs for detection of landmarks that describe the anatomy of the depicted patient and yield quantitative estimation of pathologies in the jaws and skull base regions. We use a publicly available cephalometric x-ray image dataset to train CNNs for recognition of landmark appearance patterns. CNNs are trained to output probabilistic estimations of different landmark locations, which are combined using a shape-based model. We evaluate the overall framework on the test set and compare with other proposed techniques. We use the estimated landmark locations to assess anatomically relevant measurements and classify them into different anatomical types. Overall, our results demonstrate high anatomical landmark detection accuracy ([Formula: see text] to 2% higher success detection rate for a 2-mm range compared with the top benchmarks in the literature) and high anatomical type classification accuracy ([Formula: see text] average classification accuracy for test set). We demonstrate that CNNs, which merely input raw image patches, are promising for accurate quantitative cephalometry.

  6. On study design in neuroimaging heritability analyses

    NASA Astrophysics Data System (ADS)

    Koran, Mary Ellen; Li, Bo; Jahanshad, Neda; Thornton-Wells, Tricia A.; Glahn, David C.; Thompson, Paul M.; Blangero, John; Nichols, Thomas E.; Kochunov, Peter; Landman, Bennett A.

    2014-03-01

    Imaging genetics is an emerging methodology that combines genetic information with imaging-derived metrics to understand how genetic factors impact observable structural, functional, and quantitative phenotypes. Many of the most well-known genetic studies are based on Genome-Wide Association Studies (GWAS), which use large populations of related or unrelated individuals to associate traits and disorders with individual genetic factors. Merging imaging and genetics may potentially lead to improved power of association in GWAS because imaging traits may be more sensitive phenotypes, being closer to underlying genetic mechanisms, and their quantitative nature inherently increases power. We are developing SOLAR-ECLIPSE (SE) imaging genetics software which is capable of performing genetic analyses with both large-scale quantitative trait data and family structures of variable complexity. This program can estimate the contribution of genetic commonality among related subjects to a given phenotype, and essentially answer the question of whether or not the phenotype is heritable. This central factor of interest, heritability, offers bounds on the direct genetic influence over observed phenotypes. In order for a trait to be a good phenotype for GWAS, it must be heritable: at least some proportion of its variance must be due to genetic influences. A variety of family structures are commonly used for estimating heritability, yet the variability and biases for each as a function of the sample size are unknown. Herein, we investigate the ability of SOLAR to accurately estimate heritability models based on imaging data simulated using Monte Carlo methods implemented in R. We characterize the bias and the variability of heritability estimates from SOLAR as a function of sample size and pedigree structure (including twins, nuclear families, and nuclear families with grandparents).
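    The kind of Monte Carlo experiment described can be sketched with a deliberately simplified estimator: Falconer's twin formula h² = 2*(r_MZ - r_DZ) stands in for SOLAR's variance-components machinery, and the simulation settings are assumptions:

    ```python
    import numpy as np

    # Monte Carlo sketch of heritability-estimate variability vs sample size,
    # using Falconer's twin estimate as a simple stand-in for variance components.
    rng = np.random.default_rng(4)

    def simulate_twins(n_pairs, h2, r_genetic):
        """Phenotypes for n_pairs twin pairs sharing r_genetic of additive variance."""
        shared_g = rng.normal(size=(n_pairs, 1)) * np.sqrt(h2 * r_genetic)
        own_g = rng.normal(size=(n_pairs, 2)) * np.sqrt(h2 * (1 - r_genetic))
        env = rng.normal(size=(n_pairs, 2)) * np.sqrt(1 - h2)
        return shared_g + own_g + env

    def falconer_h2(n_pairs, h2=0.6):
        mz = simulate_twins(n_pairs, h2, 1.0)   # MZ twins share all additive genes
        dz = simulate_twins(n_pairs, h2, 0.5)   # DZ twins share half on average
        r_mz = np.corrcoef(mz[:, 0], mz[:, 1])[0, 1]
        r_dz = np.corrcoef(dz[:, 0], dz[:, 1])[0, 1]
        return 2 * (r_mz - r_dz)

    # Mean stays near the true 0.6 while the spread shrinks with sample size.
    for n in (50, 200, 1000):
        estimates = [falconer_h2(n) for _ in range(500)]
        print(n, round(float(np.mean(estimates)), 3), round(float(np.std(estimates)), 3))
    ```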

  7. A simplified and accurate detection of the genetically modified wheat MON71800 with one calibrator plasmid.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2015-06-01

    With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 using a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study thus supplies a powerful, very simple, and accurate detection strategy for the unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid.
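    Quantification against a calibrator plasmid rests on the standard-curve relation Ct = slope*log10(copies) + intercept; the sketch below derives amplification efficiency and an unknown copy number from an illustrative dilution series:

    ```python
    import numpy as np

    # qPCR standard-curve sketch with a plasmid dilution series. Ct values are
    # illustrative; a slope near -3.32 corresponds to ~100% PCR efficiency.
    copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
    ct = np.array([17.1, 20.5, 23.8, 27.2, 30.6])

    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    efficiency = 10 ** (-1 / slope) - 1
    print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")

    ct_unknown = 25.4
    copies_unknown = 10 ** ((ct_unknown - intercept) / slope)
    print(f"unknown ~= {copies_unknown:.0f} copies")
    ```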

  8. Software for quantitative trait analysis.

    PubMed

    Almasy, Laura; Warren, Diane M

    2005-09-01

    This paper provides a brief overview of software currently available for the genetic analysis of quantitative traits in humans. Programs that implement variance components, Markov Chain Monte Carlo (MCMC), Haseman-Elston (H-E) and penetrance model-based linkage analyses are discussed, as are programs for measured genotype association analyses and quantitative trait transmission disequilibrium tests. The software compared includes LINKAGE, FASTLINK, PAP, SOLAR, SEGPATH, ACT, Mx, MERLIN, GENEHUNTER, Loki, Mendel, SAGE, QTDT and FBAT. Where possible, the paper provides URLs for acquiring these programs through the internet, details of the platforms for which the software is available and the types of analyses performed.

  9. Possible EEG sequelae of very long duration marihuana use: pilot findings from topographic quantitative EEG analyses of subjects with 15 to 24 years of cumulative daily exposure to THC.

    PubMed

    Struve, F A; Patrick, G; Straumanis, J J; Fitz-Gerald, M J; Manno, J

    1998-01-01

    In previous work we demonstrated and replicated a significant association between increased absolute and relative power and interhemispheric coherence of EEG alpha activity over the bilateral frontal-central cortex ("alpha hyperfrontality") in daily marihuana users as contrasted with nonusers. In this report we focused our analyses on subjects who reported smoking marihuana on a daily basis for 15 to 24 consecutive years. Compared to nonuser controls and subjects who had used marihuana on a daily basis for shorter periods of time, subjects with excessively long cumulative exposures to THC were found to have significantly elevated absolute power of theta activity over bilateral frontal-central cortex, as well as significantly increased interhemispheric coherence of theta activity across central and posterior regions. Concurrent reaction time studies conducted in our laboratory suggest that very long duration cumulative marihuana exposure might be associated with slowed cognitive processing.

  10. QUANTITATIVE MORPHOLOGY

    EPA Science Inventory

    Abstract: In toxicology, the role of quantitative assessment of brain morphology can be understood in the context of two types of treatment-related alterations. One type of alteration is specifically associated with treatment and is not observed in control animals. Measurement ...

  11. Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses

    EPA Science Inventory

    Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...

  12. On numerically accurate finite element

    NASA Technical Reports Server (NTRS)

    Nagtegaal, J. C.; Parks, D. M.; Rice, J. R.

    1974-01-01

    A general criterion for testing a mesh with topologically similar repeat units is given, and the analysis shows that only a few conventional element types and arrangements are, or can be made, suitable for computations in the fully plastic range. Further, a new variational principle, which can easily and simply be incorporated into an existing finite element program, is presented. This allows accurate computations to be made even for element designs that would not normally be suitable. Numerical results are given for three plane strain problems, namely pure bending of a beam, a thick-walled tube under pressure, and a deep double-edge-cracked tensile specimen. The effects of various element designs and of the new variational procedure are illustrated. Elastic-plastic computations at finite strain are also discussed.

  13. Quantitative glycomics.

    PubMed

    Orlando, Ron

    2010-01-01

    The ability to quantitatively determine changes is an essential component of comparative glycomics. Multiple strategies are available by which this can be accomplished. These include label-free approaches and strategies where an isotopic label is incorporated into the glycans prior to analysis. The focus of this chapter is to describe each of these approaches while providing insight into their strengths and weaknesses, so that glycomic investigators can make an educated choice of the strategy that is best suited for their particular application.

  14. A highly sensitive and accurate gene expression analysis by sequencing ("bead-seq") for a single cell.

    PubMed

    Matsunaga, Hiroko; Goto, Mari; Arikawa, Koji; Shirai, Masataka; Tsunoda, Hiroyuki; Huang, Huan; Kambara, Hideki

    2015-02-15

    Analyses of gene expression in single cells are important for understanding detailed biological phenomena. Here, a highly sensitive and accurate sequencing-based method (called "bead-seq") for obtaining the whole gene expression profile of a single cell is proposed. A key feature of the method is the use of a complementary DNA (cDNA) library on magnetic beads, which makes it possible to add washing steps that remove residual reagents during sample preparation. With these washing steps, each subsequent step can be carried out under optimal conditions without loss of cDNAs. A careful evaluation of error sources showed that the first several steps of the protocol are the critical ones. It is demonstrated that bead-seq is superior to conventional methods for single-cell gene expression analyses in terms of reproducibility, quantitative accuracy, and the biases introduced during sample preparation and sequencing.

  15. Mapping methods for computationally efficient and accurate structural reliability

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.; Chamis, Christos C.

    1992-01-01

    Mapping methods are developed to improve the accuracy and efficiency of probabilistic structural analyses with coarse finite element meshes. The mapping methods consist of: (1) deterministic structural analyses with fine (convergent) finite element meshes, (2) probabilistic structural analyses with coarse finite element meshes, (3) the relationship between the probabilistic structural responses from the coarse and fine finite element meshes, and (4) a probabilistic mapping. The results show that the scatter of the probabilistic structural responses and structural reliability can be accurately predicted using a coarse finite element model with proper mapping methods. Therefore, large structures can be analyzed probabilistically using finite element methods.
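
    The four-step procedure lends itself to a compact illustration. The Python sketch below uses stand-in numbers in place of actual finite element runs: it pairs a few deterministic coarse- and fine-mesh responses, fits the coarse-to-fine relationship, and maps cheap coarse-mesh Monte Carlo samples onto the fine-mesh scale:

      import numpy as np

      rng = np.random.default_rng(0)

      # (1) deterministic analyses with fine (convergent) meshes, paired with
      #     the corresponding coarse-mesh responses (illustrative values)
      coarse_det = np.array([0.92, 1.10, 1.31, 1.55])
      fine_det = np.array([1.00, 1.21, 1.45, 1.73])

      # (3) relationship between coarse and fine responses (linear here)
      a, b = np.polyfit(coarse_det, fine_det, 1)

      # (2) probabilistic analysis on the cheap coarse mesh
      coarse_samples = rng.normal(1.2, 0.1, 10_000)

      # (4) probabilistic mapping onto the fine-mesh scale
      fine_samples = a * coarse_samples + b
      print(f"mapped mean = {fine_samples.mean():.3f}, std = {fine_samples.std():.3f}")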

  16. Accurate ab Initio Spin Densities.

    PubMed

    Boguslawski, Katharina; Marti, Konrad H; Legeza, Ors; Reiher, Markus

    2012-06-12

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as a basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys.2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CASCI-type wave function provides insight into chemically interesting features of the molecule under study such as the distribution of α and β electrons in terms of Slater determinants, CI coefficients, and natural orbitals. The methodology is applied to an iron nitrosyl complex which we have identified as a challenging system for standard approaches [J. Chem. Theory Comput.2011, 7, 2740].

  17. Compilation of Sandia coal char combustion data and kinetic analyses

    SciTech Connect

    Mitchell, R.E.; Hurt, R.H.; Baxter, L.L.; Hardesty, D.R.

    1992-06-01

    An experimental project was undertaken to characterize the physical and chemical processes that govern the combustion of pulverized coal chars. The experimental endeavor establishes a database on the reactivities of coal chars as a function of coal type, particle size, particle temperature, gas temperature, and gas composition. The project also provides a better understanding of the mechanism of char oxidation, and yields quantitative information on the release rates of nitrogen- and sulfur-containing species during char combustion. An accurate predictive engineering model of the overall char combustion process under technologically relevant conditions is a primary product of this experimental effort. This document summarizes the experimental effort, the approach used to analyze the data, and individual compilations of data and kinetic analyses for each of the parent coals investigated.

  18. Network Class Superposition Analyses

    PubMed Central

    Pearson, Carl A. B.; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., for the yeast cell cycle process [1]), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from Boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology Boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses. PMID:23565141

  19. Network class superposition analyses.

    PubMed

    Pearson, Carl A B; Zeng, Chen; Simha, Rahul

    2013-01-01

    Networks are often used to understand a whole system by modeling the interactions among its pieces. Examples include biomolecules in a cell interacting to provide some primary function, or species in an environment forming a stable community. However, these interactions are often unknown; instead, the pieces' dynamic states are known, and network structure must be inferred. Because observed function may be explained by many different networks (e.g., ≈ 10(30) for the yeast cell cycle process), considering dynamics beyond this primary function means picking a single network or suitable sample: measuring over all networks exhibiting the primary function is computationally infeasible. We circumvent that obstacle by calculating the network class ensemble. We represent the ensemble by a stochastic matrix T, which is a transition-by-transition superposition of the system dynamics for each member of the class. We present concrete results for T derived from boolean time series dynamics on networks obeying the Strong Inhibition rule, by applying T to several traditional questions about network dynamics. We show that the distribution of the number of point attractors can be accurately estimated with T. We show how to generate Derrida plots based on T. We show that T-based Shannon entropy outperforms other methods at selecting experiments to further narrow the network structure. We also outline an experimental test of predictions based on T. We motivate all of these results in terms of a popular molecular biology boolean network model for the yeast cell cycle, but the methods and analyses we introduce are general. We conclude with open questions for T, for example, application to other models, computational considerations when scaling up to larger systems, and other potential analyses.
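
    A toy Python construction of T clarifies the superposition idea: each member of a network class contributes a deterministic state-transition matrix over all 2^n states, T is their average, and, since each fixed point adds one to a member's trace, trace(T) gives the class-mean number of point attractors. The two update rules below are stand-ins, not Strong Inhibition networks:

      import itertools
      import numpy as np

      n = 3
      states = list(itertools.product([0, 1], repeat=n))
      index = {s: i for i, s in enumerate(states)}

      def transition_matrix(update):
          """0/1 matrix M with M[i, j] = 1 iff state i maps to state j."""
          M = np.zeros((2 ** n, 2 ** n))
          for s in states:
              M[index[s], index[update(s)]] = 1.0
          return M

      rules = [
          lambda s: (s[1], s[2], s[0]),                # cyclic shift
          lambda s: (s[0], s[0] & s[1], s[1] | s[2]),  # partly frozen rule
      ]

      T = sum(transition_matrix(r) for r in rules) / len(rules)
      print(f"mean number of point attractors = {np.trace(T):.2f}")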

  20. MASIC: a software program for fast quantitation and flexible visualization of chromatographic profiles from detected LC-MS(/MS) features

    SciTech Connect

    Monroe, Matthew E.; Shaw, Jason L.; Daly, Don S.; Adkins, Joshua N.; Smith, Richard D.

    2008-06-01

    Quantitative analysis of liquid chromatography (LC)- mass spectrometry (MS) and tandem mass spectrometry (MS/MS) data is essential to many proteomics studies. We have developed MASIC to accurately measure peptide abundances and LC elution times in low-resolution LC-MS/MS analyses. This software program uses an efficient processing algorithm to quickly generate mass specific selected ion chromatograms from a dataset and provides an interactive browser that allows users to examine individual chromatograms in a variety of fashions. The improved elution time estimates afforded by MASIC increase the utility of LC-MS/MS data in the accurate mass and time (AMT) tag approach to proteomics.
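
    The core of a selected-ion-chromatogram extraction is simple to sketch (an illustration of the general technique, not MASIC's actual implementation): sum intensity within an m/z tolerance scan by scan, then take the apex as the elution-time estimate.

      import numpy as np

      def selected_ion_chromatogram(scans, target_mz, tol_mz=0.5):
          """scans: list of (time, mz_array, intensity_array) tuples."""
          times, sic = [], []
          for time, mz, inten in scans:
              mask = np.abs(mz - target_mz) <= tol_mz
              times.append(time)
              sic.append(inten[mask].sum())
          return np.array(times), np.array(sic)

      # toy data: a single Gaussian elution peak at ~12.0 min for m/z 1024.5
      scans = [(t, np.array([1024.5]),
                np.array([np.exp(-0.5 * ((t - 12.0) / 0.2) ** 2)]))
               for t in np.linspace(10.0, 14.0, 200)]

      times, sic = selected_ion_chromatogram(scans, 1024.5)
      print(f"apex elution time = {times[np.argmax(sic)]:.2f} min")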

  1. Accurate phosphoregulation of kinetochore–microtubule affinity requires unconstrained molecular interactions

    PubMed Central

    Zaytsev, Anatoly V.; Sundin, Lynsie J.R.; DeLuca, Keith F.

    2014-01-01

    Accurate chromosome segregation relies on dynamic interactions between microtubules (MTs) and the NDC80 complex, a major kinetochore MT-binding component. Phosphorylation at multiple residues of its Hec1 subunit may tune kinetochore–MT binding affinity for diverse mitotic functions, but molecular details of such phosphoregulation remain elusive. Using quantitative analyses of mitotic progression in mammalian cells, we show that Hec1 phosphorylation provides graded control of kinetochore–MT affinity. In contrast, modeling the kinetochore interface with repetitive MT binding sites predicts a switchlike response. To reconcile these findings, we hypothesize that interactions between NDC80 complexes and MTs are not constrained, i.e., the NDC80 complexes can alternate their binding between adjacent kinetochore MTs. Experiments using cells with phosphomimetic Hec1 mutants corroborate predictions of such a model but not of the repetitive sites model. We propose that accurate regulation of kinetochore–MT affinity is driven by incremental phosphorylation of an NDC80 molecular “lawn,” in which the NDC80–MT bonds reorganize dynamically in response to the number and stability of MT attachments. PMID:24982430

  2. Approaches for the accurate definition of geological time boundaries

    NASA Astrophysics Data System (ADS)

    Schaltegger, Urs; Baresel, Björn; Ovtcharova, Maria; Goudemand, Nicolas; Bucher, Hugo

    2015-04-01

    Which strategies lead to the most precise and accurate date of a given geological boundary? Geological units are usually defined by the occurrence of characteristic taxa, and hence boundaries between these geological units correspond to dramatic faunal and/or floral turnovers and are primarily defined using first or last occurrences of index species, or ideally by the separation interval between two consecutive, characteristic associations of fossil taxa. These boundaries need to be defined in a way that enables their worldwide recognition and correlation across different stratigraphic successions, using tools as different as bio-, magneto-, and chemo-stratigraphy, and astrochronology. Sedimentary sequences can be dated in numerical terms by applying high-precision chemical-abrasion, isotope-dilution, thermal-ionization mass spectrometry (CA-ID-TIMS) U-Pb age determination to zircon (ZrSiO4) in intercalated volcanic ashes. But, though volcanic activity is common in geological history, ashes are not necessarily close to the boundary we would like to date precisely and accurately. In addition, U-Pb zircon data sets may be very complex and difficult to interpret in terms of the age of ash deposition. To overcome these difficulties we use a multi-proxy approach, which we applied to the precise and accurate dating of the Permo-Triassic and Early-Middle Triassic boundaries in South China. a) Dense sampling of ashes across the critical time interval and a sufficiently large number of analysed zircons per ash sample can guarantee the recognition of all system complexities. Geochronological datasets from U-Pb dating of volcanic zircon may indeed combine effects of i) post-crystallization Pb loss from percolation of hydrothermal fluids (even using chemical abrasion), with ii) age dispersion from prolonged residence of earlier crystallized zircon in the magmatic system. As a result, U-Pb dates of individual zircons are both apparently younger and older than the depositional age

  3. On the accurate molecular dynamics analysis of biological molecules

    NASA Astrophysics Data System (ADS)

    Yamashita, Takefumi

    2016-12-01

    As advances in computational technology now enable long molecular dynamics (MD) simulations, the evaluation of many physical properties shows improved convergence. Therefore, we can examine the detailed conditions of MD simulations and perform quantitative MD analyses. In this study, we address the quantitative accuracy of MD simulations using two example systems. First, it is found that several conditions of the MD simulations influence the area per lipid of the lipid bilayer. Second, we successfully detect small but important differences in antibody motion between the antigen-bound and unbound states.

  4. Knowledge Discovery in Textual Documentation: Qualitative and Quantitative Analyses.

    ERIC Educational Resources Information Center

    Loh, Stanley; De Oliveira, Jose Palazzo M.; Gastal, Fabio Leite

    2001-01-01

    Presents an application of knowledge discovery in texts (KDT) concerning medical records of a psychiatric hospital. The approach helps physicians to extract knowledge about patients and diseases that may be used for epidemiological studies, for training professionals, and to support physicians to diagnose and evaluate diseases. (Author/AEF)

  5. Object-oriented fault tree evaluation program for quantitative analyses

    NASA Technical Reports Server (NTRS)

    Patterson-Hine, F. A.; Koen, B. V.

    1988-01-01

    Object-oriented programming can be combined with fault tree techniques to give a significantly improved environment for evaluating the safety and reliability of large complex systems for space missions. Deep knowledge about system components and interactions, available from reliability studies and other sources, can be described using objects that make up a knowledge base. This knowledge base can be interrogated throughout the design process, during system testing, and during operation, and can be easily modified to reflect design changes in order to maintain a consistent information source. An object-oriented environment for reliability assessment has been developed on a Texas Instruments (TI) Explorer LISP workstation. The program, which directly evaluates system fault trees, utilizes the object-oriented extension to LISP called Flavors that is available on the Explorer. The object representation of a fault tree facilitates the storage and retrieval of information associated with each event in the tree, including tree structural information and intermediate results obtained during the tree reduction process. Reliability data associated with each basic event are stored in the fault tree objects. The object-oriented environment on the Explorer also includes a graphical tree editor which was modified to display and edit the fault trees.
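
    The design maps naturally onto modern object-oriented languages as well. The Python sketch below is an assumed modern analogue, not the original Flavors code: events as objects that carry reliability data and cache intermediate results during direct tree evaluation.

      from dataclasses import dataclass, field

      @dataclass
      class Event:
          name: str
          gate: str = "basic"           # "basic", "and", or "or"
          prob: float = 0.0             # failure probability for basic events
          children: list = field(default_factory=list)
          _cached: float | None = None  # intermediate result kept on the object

          def probability(self) -> float:
              if self._cached is None:
                  if self.gate == "basic":
                      self._cached = self.prob
                  elif self.gate == "and":
                      p = 1.0
                      for c in self.children:
                          p *= c.probability()
                      self._cached = p
                  else:  # "or", assuming independent events
                      q = 1.0
                      for c in self.children:
                          q *= 1.0 - c.probability()
                      self._cached = 1.0 - q
              return self._cached

      pump = Event("pump fails", prob=1e-3)
      valve = Event("valve fails", prob=5e-4)
      top = Event("loss of coolant", gate="or", children=[pump, valve])
      print(f"P(top event) = {top.probability():.2e}")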

  6. Noninvasive hemoglobin monitoring: how accurate is enough?

    PubMed

    Rice, Mark J; Gravenstein, Nikolaus; Morey, Timothy E

    2013-10-01

    Evaluating the accuracy of medical devices has traditionally been a blend of statistical analyses, at times without contextualizing the clinical application. There have been a number of recent publications on the accuracy of a continuous noninvasive hemoglobin measurement device, the Masimo Radical-7 Pulse Co-oximeter, focusing on the traditional statistical metrics of bias and precision. In this review, which contains material presented at the Innovations and Applications of Monitoring Perfusion, Oxygenation, and Ventilation (IAMPOV) Symposium at Yale University in 2012, we critically investigated these metrics as applied to the new technology, exploring what is required of a noninvasive hemoglobin monitor and whether the conventional statistics adequately answer our questions about clinical accuracy. We discuss the glucose error grid, well known in the glucose monitoring literature, and describe an analogous version for hemoglobin monitoring. This hemoglobin error grid can be used to evaluate the required clinical accuracy (±g/dL) of a hemoglobin measurement device to provide more conclusive evidence on whether to transfuse an individual patient. The important decision to transfuse a patient usually requires both an accurate hemoglobin measurement and a physiologic reason to elect transfusion. It is our opinion that the published accuracy data of the Masimo Radical-7 is not good enough to make the transfusion decision.

  7. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2013-07-01 2013-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  8. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2012-07-01 2012-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  9. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2010-07-01 2010-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  10. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2014-07-01 2014-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  11. 38 CFR 4.46 - Accurate measurement.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 38 Pensions, Bonuses, and Veterans' Relief 1 2011-07-01 2011-07-01 false Accurate measurement. 4... RATING DISABILITIES Disability Ratings The Musculoskeletal System § 4.46 Accurate measurement. Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  12. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  13. Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.

    PubMed

    Hamlet, Stephen M

    2010-01-01

    The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibody with the sensitivity of simple enzyme assays and the polymerase chain reaction (PCR), has been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time" allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10(2) (using rtPCR) to 10(4) (using ELISA) periodontopathogens in dental plaque samples.
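
    On the ELISA side, quantitation typically runs through a four-parameter logistic (4PL) calibration curve fitted to standards and then inverted for unknowns. The Python sketch below illustrates that generic workflow with made-up standards, not data from the chapter:

      import numpy as np
      from scipy.optimize import curve_fit

      def four_pl(x, a, b, c, d):
          """4PL: a = min signal, d = max signal, c = inflection, b = slope."""
          return d + (a - d) / (1.0 + (x / c) ** b)

      conc = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7])      # standards
      od = np.array([0.05, 0.11, 0.35, 0.95, 1.60, 1.90])  # absorbance

      popt, _ = curve_fit(four_pl, conc, od, p0=[0.02, 1.0, 1e4, 2.0],
                          maxfev=10000)

      def conc_from_od(y):
          """Invert the fitted 4PL curve for an unknown sample."""
          a, b, c, d = popt
          return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

      print(f"unknown with OD 0.70 = {conc_from_od(0.70):.2e} organisms")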

  14. Landslide inventories: The essential part of seismic landslide hazard analyses

    USGS Publications Warehouse

    Harp, E.L.; Keefer, D.K.; Sato, H.P.; Yagi, H.

    2011-01-01

    A detailed and accurate landslide inventory is an essential part of seismic landslide hazard analysis. An ideal inventory would cover the entire area affected by an earthquake and include all of the landslides that are possible to detect down to sizes of 1-5 m in length. The landslides must also be located accurately and mapped as polygons depicting their true shapes. Such mapped landslide distributions can then be used to perform seismic landslide hazard analysis and other quantitative analyses. Detailed inventory mapping of landslides triggered by earthquakes began in the early 1960s with the use of aerial photography. In recent years, advances in technology have resulted in the accessibility of satellite imagery with sufficiently high resolution to identify and map all but the smallest of landslides triggered by a seismic event. With this ability to view any area of the globe, we can acquire imagery for any earthquake that triggers significant numbers of landslides. However, a common problem of incomplete coverage of the full distributions of landslides has emerged along with the advent of high resolution satellite imagery.

  15. Retarding field energy analyser ion current calibration and transmission

    NASA Astrophysics Data System (ADS)

    Denieffe, K.; Mahony, C. M. O.; Maguire, P. D.; Gahan, D.; Hopkins, M. B.

    2011-02-01

    Accurate measurement of ion current density and ion energy distributions (IEDs) is often critical for plasma processes in both industrial and research settings. Retarding field energy analysers (RFEAs) have been used to measure IEDs because they are considered accurate, relatively simple and cost effective. However, their usage for critical measurement of ion current density is less common due to difficulties in estimating the proportion of incident ion current reaching the current collector through the RFEA retarding grids. In this paper an RFEA has been calibrated to measure ion current density from an ion beam at pressures ranging from 0.5 to 50.0 mTorr. A unique method is presented where the currents generated at each of the retarding grids and the RFEA upper face are measured separately, allowing the reduction in ion current to be monitored and accounted for at each stage of ion transit to the collector. From these I-V measurements a physical model is developed. Subsequently, a mathematical description is extracted which includes parameters to account for grid transmissions, upper face secondary electron emission and collisionality. Pressure-dependent calibration factors can be calculated from least mean square best fits of the collector current to the model, allowing quantitative measurement of ion current density.
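
    A compressed version of the fitting step can be sketched as follows, under an assumed model form chosen for illustration (combined grid transmission times an exponential collisional loss; the paper's actual model separates the grids and includes secondary electron emission):

      import numpy as np
      from scipy.optimize import curve_fit

      i_in = 8.0  # incident ion current (uA), measured at the upper face

      def collector_current(pressure_mtorr, t_grids, k_coll):
          """t_grids: combined grid transmission; k_coll: loss per mTorr."""
          return i_in * t_grids * np.exp(-k_coll * pressure_mtorr)

      pressure = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
      i_coll = np.array([4.1, 4.0, 3.8, 3.3, 2.7, 1.8, 0.6])  # illustrative

      (t_grids, k_coll), _ = curve_fit(collector_current, pressure, i_coll,
                                       p0=[0.5, 0.05])
      calibration = 1.0 / (t_grids * np.exp(-k_coll * pressure))
      print(f"grid transmission = {t_grids:.2f}; factors: {calibration.round(2)}")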

  16. Multidimensional Genome-wide Analyses Show Accurate FVIII Integration by ZFN in Primary Human Cells

    PubMed Central

    Sivalingam, Jaichandran; Kenanov, Dimitar; Han, Hao; Nirmal, Ajit Johnson; Ng, Wai Har; Lee, Sze Sing; Masilamani, Jeyakumar; Phan, Toan Thang; Maurer-Stroh, Sebastian; Kon, Oi Lian

    2016-01-01

    Costly coagulation factor VIII (FVIII) replacement therapy is a barrier to optimal clinical management of hemophilia A. Therapy using FVIII-secreting autologous primary cells is potentially efficacious and more affordable. Zinc finger nucleases (ZFN) mediate transgene integration into the AAVS1 locus but comprehensive evaluation of off-target genome effects is currently lacking. In light of serious adverse effects in clinical trials which employed genome-integrating viral vectors, this study evaluated potential genotoxicity of ZFN-mediated transgenesis using different techniques. We employed deep sequencing of predicted off-target sites, copy number analysis, whole-genome sequencing, and RNA-seq in primary human umbilical cord-lining epithelial cells (CLECs) with AAVS1 ZFN-mediated FVIII transgene integration. We combined molecular features to enhance the accuracy and activity of ZFN-mediated transgenesis. Our data showed a low frequency of ZFN-associated indels, no detectable off-target transgene integrations or chromosomal rearrangements. ZFN-modified CLECs had very few dysregulated transcripts and no evidence of activated oncogenic pathways. We also showed AAVS1 ZFN activity and durable FVIII transgene secretion in primary human dermal fibroblasts, bone marrow- and adipose tissue-derived stromal cells. Our study suggests that, with close attention to the molecular design of genome-modifying constructs, AAVS1 ZFN-mediated FVIII integration in several primary human cell types may be safe and efficacious. PMID:26689265

  17. Assessing the reproducibility of discriminant function analyses

    PubMed Central

    Andrew, Rose L.; Albert, Arianne Y.K.; Renaut, Sebastien; Rennison, Diana J.; Bock, Dan G.

    2015-01-01

    Data are the foundation of empirical research, yet all too often the datasets underlying published papers are unavailable, incorrect, or poorly curated. This is a serious issue, because future researchers are then unable to validate published results or reuse data to explore new ideas and hypotheses. Even if data files are securely stored and accessible, they must also be accompanied by accurate labels and identifiers. To assess how often problems with metadata or data curation affect the reproducibility of published results, we attempted to reproduce Discriminant Function Analyses (DFAs) from the field of organismal biology. DFA is a commonly used statistical analysis that has changed little since its inception almost eight decades ago, and therefore provides an opportunity to test reproducibility among datasets of varying ages. Out of 100 papers we initially surveyed, fourteen were excluded because they did not present the common types of quantitative result from their DFA or gave insufficient details of their DFA. Of the remaining 86 datasets, there were 15 cases for which we were unable to confidently relate the dataset we received to the one used in the published analysis. The reasons ranged from incomprehensible or absent variable labels, the DFA being performed on an unspecified subset of the data, or the dataset we received being incomplete. We focused on reproducing three common summary statistics from DFAs: the percent variance explained, the percentage correctly assigned and the largest discriminant function coefficient. The reproducibility of the first two was fairly high (20 of 26, and 44 of 60 datasets, respectively), whereas our success rate with the discriminant function coefficients was lower (15 of 26 datasets). When considering all three summary statistics, we were able to completely reproduce 46 (65%) of 71 datasets. While our results show that a majority of studies are reproducible, they highlight the fact that many studies still are not the
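
    The three summary statistics are straightforward to recompute once a dataset is in hand, as the following Python sketch shows (scikit-learn's LDA as a stand-in for the various DFA implementations used in the surveyed papers, and a bundled dataset in place of an author-supplied one):

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      X, y = load_iris(return_X_y=True)  # stand-in for a published dataset
      lda = LinearDiscriminantAnalysis().fit(X, y)

      pct_variance_df1 = lda.explained_variance_ratio_[0] * 100
      pct_correct = lda.score(X, y) * 100
      largest_coef = np.abs(lda.scalings_).max()

      print(f"% variance (DF1): {pct_variance_df1:.1f}")
      print(f"% correctly assigned: {pct_correct:.1f}")
      print(f"largest |discriminant coefficient|: {largest_coef:.2f}")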

  18. Quantitative imaging methods in osteoporosis

    PubMed Central

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M. Carola

    2016-01-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research. PMID:28090446

  19. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  20. Quantitative characterisation of sedimentary grains

    NASA Astrophysics Data System (ADS)

    Tunwal, Mohit; Mulchrone, Kieran F.; Meere, Patrick A.

    2016-04-01

    Analysis of sedimentary texture helps in determining the formation, transportation and deposition processes of sedimentary rocks. Grain size analysis is traditionally quantitative, whereas grain shape analysis is largely qualitative. A semi-automated approach to quantitatively analyse the shape and size of sand-sized sedimentary grains is presented. Grain boundaries are manually traced from thin section microphotographs in the case of lithified samples and are automatically identified in the case of loose sediments. Shape and size parameters can then be estimated using a software package written on the Mathematica platform. While automated methodology already exists for loose sediment analysis, the available techniques for lithified samples are limited to high-definition thin section microphotographs showing clear contrast between framework grains and matrix. Along with grain size, shape parameters such as roundness, angularity, circularity, irregularity and fractal dimension are measured. A new grain shape parameter based on Fourier descriptors has also been developed. To test this new approach, theoretical examples were analysed and produced high-quality results supporting the accuracy of the algorithm. Furthermore, sandstone samples from known aeolian and fluvial environments in the Dingle Basin, County Kerry, Ireland were collected and analysed. Modern loose sediments from glacial till from County Cork, Ireland and aeolian sediments from Rajasthan, India have also been collected and analysed. A graphical summary of the data is presented and allows for quantitative distinction between samples extracted from different sedimentary environments.
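
    Two of the listed measures are easy to make concrete in a short Python sketch; the Fourier-descriptor roughness below is a generic construction for illustration, not the paper's own parameter:

      import numpy as np

      def circularity(area, perimeter):
          """4*pi*A / P**2; equals 1 for a perfect circle."""
          return 4.0 * np.pi * area / perimeter ** 2

      def fourier_roughness(boundary_xy, n_harmonics=10):
          """Higher-order Fourier descriptor magnitudes of the outline,
          normalised by the first harmonic."""
          z = boundary_xy[:, 0] + 1j * boundary_xy[:, 1]
          mags = np.abs(np.fft.fft(z - z.mean()))
          return mags[2:n_harmonics + 1].sum() / mags[1]

      theta = np.linspace(0, 2 * np.pi, 256, endpoint=False)
      r = 1.0 + 0.05 * np.cos(7 * theta)  # slightly bumpy grain outline
      grain = np.column_stack([r * np.cos(theta), r * np.sin(theta)])

      print(f"roughness = {fourier_roughness(grain):.3f}")
      print(f"circle circularity = {circularity(np.pi, 2 * np.pi):.2f}")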

  1. Quantitative Decision Support Requires Quantitative User Guidance

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2009-12-01

    Is it conceivable that models run on 2007 computer hardware could provide robust and credible probabilistic information for decision support and user guidance at the ZIP code level for sub-daily meteorological events in 2060? In 2090? Retrospectively, how informative would output from today’s models have proven in 2003? Or in the 1930s? Consultancies in the United Kingdom, including the Met Office, are offering services to “future-proof” their customers from climate change. How is a US or European based user or policy maker to determine the extent to which exciting new Bayesian methods are relevant here, or when a commercial supplier is vastly overselling the insights of today’s climate science? How are policy makers and academic economists to make the closely related decisions facing them? How can we communicate deep uncertainty in the future at small length-scales without undermining the firm foundation established by climate science regarding global trends? Three distinct aspects of the communication of the uses of climate model output targeting users and policy makers, as well as other specialist adaptation scientists, are discussed. First, a brief scientific evaluation of the length and time scales at which climate model output is likely to become uninformative is provided, including a note on the applicability of the latest Bayesian methodology to current state-of-the-art general circulation model output. Second, a critical evaluation is given of the language often employed in communication of climate model output, a language which accurately states that models are “better”, have “improved” and now “include” and “simulate” relevant meteorological processes, without clearly identifying where the current information is thought to be uninformative or misleading, both for the current climate and as a function of the state of each climate simulation. And thirdly, a general approach for evaluating the relevance of quantitative climate model output

  2. Accurate measurement of streamwise vortices using dual-plane PIV

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Breuer, Kenneth S.

    2012-11-01

    Low Reynolds number aerodynamic experiments with flapping animals (such as bats and small birds) are of particular interest due to their application to micro air vehicles which operate in a similar parameter space. Previous PIV wake measurements described the structures left by bats and birds and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions based on said measurements. The highly three-dimensional and unsteady nature of the flows associated with flapping flight are major challenges for accurate measurements. The challenge of animal flight measurements is finding small flow features in a large field of view at high speed with limited laser energy and camera resolution. Cross-stream measurement is further complicated by the predominately out-of-plane flow that requires thick laser sheets and short inter-frame times, which increase noise and measurement uncertainty. Choosing appropriate experimental parameters requires compromise between the spatial and temporal resolution and the dynamic range of the measurement. To explore these challenges, we do a case study on the wake of a fixed wing. The fixed model simplifies the experiment and allows direct measurements of the aerodynamic forces via load cell. We present a detailed analysis of the wake measurements, discuss the criteria for making accurate measurements, and present a solution for making quantitative aerodynamic load measurements behind free-flyers.

  3. Bioimaging for quantitative phenotype analysis.

    PubMed

    Chen, Weiyang; Xia, Xian; Huang, Yi; Chen, Xingwei; Han, Jing-Dong J

    2016-06-01

    With the development of bio-imaging techniques, an increasing number of studies apply these techniques to generate a myriad of image data. Its applications range from quantification of cellular, tissue, organismal and behavioral phenotypes of model organisms, to human facial phenotypes. The bio-imaging approaches to automatically detect, quantify, and profile phenotypic changes related to specific biological questions open new doors to studying phenotype-genotype associations and to precisely evaluating molecular changes associated with quantitative phenotypes. Here, we review major applications of bioimage-based quantitative phenotype analysis. Specifically, we describe the biological questions and experimental needs addressable by these analyses, computational techniques and tools that are available in these contexts, and the new perspectives on phenotype-genotype association uncovered by such analyses.

  4. Mill profiler machines soft materials accurately

    NASA Technical Reports Server (NTRS)

    Rauschl, J. A.

    1966-01-01

    Mill profiler machines bevels, slots, and grooves in soft materials, such as styrofoam phenolic-filled cores, to any desired thickness. A single operator can accurately control cutting depths in contour or straight line work.

  5. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat frequency method of identification, are presented.

  6. Increasing the quantitative bandwidth of NMR measurements.

    PubMed

    Power, J E; Foroozandeh, M; Adams, R W; Nilsson, M; Coombes, S R; Phillips, A R; Morris, G A

    2016-02-18

    The frequency range of quantitative NMR is increased from tens to hundreds of kHz by a new pulse sequence, CHORUS. It uses chirp pulses to excite uniformly over very large bandwidths, yielding accurate integrals even for nuclei such as (19)F that have very wide spectra.
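
    The basic ingredient is easy to visualise: a chirp's instantaneous frequency sweeps linearly across the band instead of sitting at one carrier. A minimal baseband sketch follows (an illustration of a linear chirp, not the CHORUS sequence itself):

      import numpy as np

      def linear_chirp(T=1e-3, B=300e3, n=4096):
          """Baseband chirp sweeping from -B/2 to +B/2 over duration T."""
          t = np.linspace(0.0, T, n)
          phase = 2 * np.pi * (-0.5 * B * t + B * t ** 2 / (2 * T))
          return t, np.cos(phase)

      t, pulse = linear_chirp()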

  7. A new HPLC method for azithromycin quantitation.

    PubMed

    Zubata, Patricia; Ceresole, Rita; Rosasco, Maria Ana; Pizzorno, Maria Teresa

    2002-02-01

    A simple liquid chromatographic method was developed for the estimation of azithromycin as a raw material and in pharmaceutical forms. The sample was chromatographed on a reverse-phase C18 column and eluants were monitored at a wavelength of 215 nm. The method was accurate, precise and sufficiently selective, and is applicable to quantitation, stability and dissolution tests.

  8. Accurate and fast multiple-testing correction in eQTL studies.

    PubMed

    Sul, Jae Hoon; Raj, Towfique; de Jong, Simone; de Bakker, Paul I W; Raychaudhuri, Soumya; Ophoff, Roel A; Stranger, Barbara E; Eskin, Eleazar; Han, Buhm

    2015-06-04

    In studies of expression quantitative trait loci (eQTLs), it is of increasing interest to identify eGenes, the genes whose expression levels are associated with variation at a particular genetic variant. Detecting eGenes is important for follow-up analyses and prioritization because genes are the main entities in biological processes. To detect eGenes, one typically focuses on the genetic variant with the minimum p value among all variants in cis with a gene and corrects for multiple testing to obtain a gene-level p value. For performing multiple-testing correction, a permutation test is widely used. Because of the growing sample sizes of eQTL studies, however, the permutation test has become a computational bottleneck. In this paper, we propose an efficient approach for correcting for multiple testing and assessing eGene p values by utilizing a multivariate normal distribution. Our approach properly takes into account the linkage-disequilibrium structure among variants, and its time complexity is independent of sample size. By applying our small-sample correction techniques, our method achieves high accuracy in both small and large studies. We have shown that our method consistently produces extremely accurate p values (accuracy > 98%) for three human eQTL datasets with different sample sizes and SNP densities: the Genotype-Tissue Expression pilot dataset, the multi-region brain dataset, and the HapMap 3 dataset.
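
    The heart of the idea can be sketched with a Monte Carlo stand-in for the paper's analytic machinery: under the null, the per-variant z-scores for a gene are multivariate normal with the LD matrix as covariance, so the gene-level p value of an observed minimum p follows by sampling (note the cost depends on the number of variants, not the sample size):

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(0)

      def egene_pvalue(min_p_observed, ld_corr, n_draws=100_000):
          """P(min p over cis variants <= observed) under a MVN(0, LD) null."""
          z = rng.multivariate_normal(np.zeros(len(ld_corr)), ld_corr, n_draws)
          min_p_null = 2 * norm.sf(np.abs(z).max(axis=1))
          return (min_p_null <= min_p_observed).mean()

      # toy LD: 5 variants with exchangeable correlation 0.6
      ld = 0.6 * np.ones((5, 5)) + 0.4 * np.eye(5)
      print(f"gene-level p = {egene_pvalue(1e-3, ld):.4f}")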

  9. NOAA's National Snow Analyses

    NASA Astrophysics Data System (ADS)

    Carroll, T. R.; Cline, D. W.; Olheiser, C. M.; Rost, A. A.; Nilsson, A. O.; Fall, G. M.; Li, L.; Bovitz, C. T.

    2005-12-01

    NOAA's National Operational Hydrologic Remote Sensing Center (NOHRSC) routinely ingests all of the electronically available, real-time, ground-based snow data; airborne snow water equivalent data; satellite areal extent of snow cover information; and numerical weather prediction (NWP) model forcings for the coterminous U.S. The NWP model forcings are physically downscaled from their native 13 km2 spatial resolution to a 1 km2 resolution for the CONUS. The downscaled NWP forcings drive an energy-and-mass-balance snow accumulation and ablation model at a 1 km2 spatial resolution and a 1 hour temporal resolution for the country. The ground-based, airborne, and satellite snow observations are assimilated into the snow model's simulated state variables using a Newtonian nudging technique. The principal advantages of the assimilation technique are: (1) approximate balance is maintained in the snow model, (2) physical processes are easily accommodated in the model, and (3) asynoptic data are incorporated at the appropriate times. The snow model is reinitialized with the assimilated snow observations to generate a variety of snow products that combine to form NOAA's NOHRSC National Snow Analyses (NSA). The NOHRSC NSA incorporate all of the information necessary to produce a "best estimate" of real-time snow cover conditions at 1 km2 spatial resolution and 1 hour temporal resolution for the country. The NOHRSC NSA consist of a variety of daily, operational products that characterize real-time snowpack conditions including: snow water equivalent, snow depth, surface and internal snowpack temperatures, surface and blowing snow sublimation, and snowmelt for the CONUS. The products are generated and distributed in a variety of formats including: interactive maps, time-series, alphanumeric products (e.g., mean areal snow water equivalent on a hydrologic basin-by-basin basis), text and map discussions, map animations, and quantitative gridded products.
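
    Newtonian nudging itself reduces to a simple relaxation of the modelled state toward each observation; a minimal sketch, assuming a scalar state and an illustrative gain and dynamics:

      import numpy as np

      def nudge(model_swe, obs_swe, gain=0.5):
          """Relax modelled snow water equivalent toward an observation."""
          return model_swe + gain * (obs_swe - model_swe)

      swe = 120.0  # modelled SWE, mm
      for obs in [np.nan, 135.0, np.nan, 128.0]:  # hourly obs; NaN = no report
          swe *= 0.995  # stand-in for the model's own ablation dynamics
          if not np.isnan(obs):
              swe = nudge(swe, obs)
      print(f"analysed SWE = {swe:.1f} mm")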

  10. Accurate vessel segmentation with constrained B-snake.

    PubMed

    Yuanzhi Cheng; Xin Hu; Ji Wang; Yadong Wang; Tamura, Shinichi

    2015-08-01

    We describe an active contour framework with accurate shape and size constraints on the vessel cross-sectional planes to produce the vessel segmentation. It starts with a multiscale vessel axis tracing in 3D computed tomography (CT) data, followed by vessel boundary delineation on the cross-sectional planes derived from the extracted axis. The vessel boundary surface is deformed under constrained movements on the cross sections and is voxelized to produce the final vascular segmentation. The novelty of this paper lies in the accurate contour point detection of thin vessels based on the CT scanning model, in the efficient implementation of missing contour points in the problematic regions, and in the active contour model with accurate shape and size constraints. The main advantage of our framework is that it avoids disconnected and incomplete segmentation of the vessels in the problematic regions that contain touching vessels (vessels in close proximity to each other), diseased portions (pathologic structure attached to a vessel), and thin vessels. It is particularly suitable for accurate segmentation of thin and low-contrast vessels. Our method is evaluated and demonstrated on CT data sets from our partner site, and its results are compared with three related methods. Our method is also tested on two publicly available databases and its results are compared with the recently published method. The applicability of the proposed method to some challenging clinical problems, such as the segmentation of vessels in the problematic regions, is demonstrated with good results in both quantitative and qualitative experiments; our segmentation algorithm can delineate vessel boundaries with a level of variability similar to that obtained manually.

  11. Estimating distributions out of qualitative and (semi)quantitative microbiological contamination data for use in risk assessment.

    PubMed

    Busschaert, P; Geeraerd, A H; Uyttendaele, M; Van Impe, J F

    2010-04-15

    A framework using maximum likelihood estimation (MLE) is used to fit a probability distribution to a set of qualitative (e.g., absence in 25 g), semi-quantitative (e.g., presence in 25 g and absence in 1 g) and/or quantitative test results (e.g., 10 CFU/g). Uncertainty about the parameters of the variability distribution is characterized through a non-parametric bootstrapping method. The resulting distribution function can be used as an input for a second-order Monte Carlo simulation in quantitative risk assessment. As an illustration, the method is applied to two sets of in silico generated data. It is demonstrated that correct interpretation of the data results in an accurate representation of the contamination level distribution. Subsequently, two case studies are analyzed, namely (i) quantitative analyses of Campylobacter spp. in food samples with nondetects, and (ii) combined quantitative, qualitative and semi-quantitative analyses and nondetects of Listeria monocytogenes in smoked fish samples. The first of these case studies is also used to illustrate the influence of the limit of quantification, measurement error, and the number of samples included in the data set. Application of these techniques offers a way for meta-analysis of the many relevant yet diverse data sets that are available in the literature and in (inter)national reports of surveillance or baseline surveys; it therefore increases the information input of a risk assessment and, by consequence, the correctness of its outcome.
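
    The likelihood construction is the essential trick: each sample contributes the probability mass consistent with what was observed. A minimal Python sketch under an assumed lognormal (base-10) contamination distribution, with illustrative data:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # data on log10(CFU/g)
      exact = np.log10([12.0, 35.0, 8.0])       # quantitative counts
      left_cens = [np.log10(1.0 / 25.0)]        # absence in 25 g: < 0.04 CFU/g
      interval = [(np.log10(1.0 / 25.0), 0.0)]  # presence in 25 g, absence in 1 g

      def neg_loglik(theta):
          mu, sigma = theta[0], np.exp(theta[1])  # sigma kept positive
          ll = norm.logpdf(exact, mu, sigma).sum()
          ll += sum(norm.logcdf(x, mu, sigma) for x in left_cens)
          ll += sum(np.log(norm.cdf(b, mu, sigma) - norm.cdf(a, mu, sigma))
                    for a, b in interval)
          return -ll

      fit = minimize(neg_loglik, x0=[0.0, 0.0], method="Nelder-Mead")
      mu, sigma = fit.x[0], np.exp(fit.x[1])
      print(f"mu = {mu:.2f}, sigma = {sigma:.2f} (log10 CFU/g)")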

  12. An efficient approach to the quantitative analysis of humic acid in water.

    PubMed

    Wang, Xue; Li, Bao Qiong; Zhai, Hong Lin; Xiong, Meng Yi; Liu, Ying

    2016-01-01

    Rayleigh and Raman scattering inevitably appears in fluorescence measurements, making quantitative analysis more difficult, especially when target signals and scattering signals overlap. Based on grayscale images of three-dimensional fluorescence spectra, a linear model with two selected Zernike moments was established for the determination of humic acid, and applied to the quantitative analysis of a real sample taken from the Yellow River. The correlation coefficient (R(2)) and leave-one-out cross-validation correlation coefficient (R(2)cv) were up to 0.9994 and 0.9987, respectively. The average recovery reached 96.28%. Compared with N-way partial least squares and alternating trilinear decomposition methods, our approach was immune to the scattering and noise signals owing to its powerful multi-resolution characteristic, the obtained results were more reliable and accurate, and the approach could also be applied in food analyses.
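
    As a sketch of the modelling step (with synthetic images and arbitrarily chosen moment indices standing in for the paper's selected moments; the mahotas library is used here for the Zernike computation):

      import numpy as np
      import mahotas

      rng = np.random.default_rng(0)
      conc = np.linspace(0.2, 1.0, 5)                      # known concentrations
      images = [rng.random((101, 101)) * c for c in conc]  # synthetic stand-ins

      def two_moments(image, radius=50):
          """Return two Zernike moments of a grayscale image."""
          z = mahotas.features.zernike_moments(image, radius)
          return z[[3, 7]]  # indices illustrative only

      X = np.array([two_moments(im) for im in images])
      A = np.column_stack([X, np.ones(len(X))])  # linear model with intercept
      coef, *_ = np.linalg.lstsq(A, conc, rcond=None)
      print(f"fitted coefficients: {coef.round(4)}")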

  13. Theory and practice in quantitative genetics.

    PubMed

    Posthuma, Daniëlle; Beem, A Leo; de Geus, Eco J C; van Baal, G Caroline M; von Hjelmborg, Jacob B; Iachine, Ivan; Boomsma, Dorret I

    2003-10-01

    With the rapid advances in molecular biology, the near completion of the human genome, the development of appropriate statistical genetic methods and the availability of the necessary computing power, the identification of quantitative trait loci has now become a realistic prospect for quantitative geneticists. We briefly describe the theoretical biometrical foundations underlying quantitative genetics. These theoretical underpinnings are translated into mathematical equations that allow the assessment of the contribution of observed (using DNA samples) and unobserved (using known genetic relationships) genetic variation to population variance in quantitative traits. Several statistical models for quantitative genetic analyses are described, such as models for the classical twin design, multivariate and longitudinal genetic analyses, extended twin analyses, and linkage and association analyses. For each, we show how the theoretical biometrical model can be translated into algebraic equations that may be used to generate scripts for statistical genetic software packages, such as Mx, Lisrel, SOLAR, or MERLIN. For using the former program a web-library (available from http://www.psy.vu.nl/mxbib) has been developed of freely available scripts that can be used to conduct all genetic analyses described in this paper.
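
    The biometrical logic can be previewed in one formula: Falconer's classical approximation decomposes trait variance from MZ and DZ twin correlations (the full structural models fitted in Mx or SOLAR generalise this via maximum likelihood). A minimal sketch:

      def falconer_ace(r_mz, r_dz):
          """Additive genetic (A), shared (C), unique (E) variance shares."""
          a2 = 2.0 * (r_mz - r_dz)
          c2 = 2.0 * r_dz - r_mz
          e2 = 1.0 - r_mz
          return a2, c2, e2

      a2, c2, e2 = falconer_ace(r_mz=0.8, r_dz=0.5)
      print(f"A = {a2:.2f}, C = {c2:.2f}, E = {e2:.2f}")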

  14. Measurement of lentiviral vector titre and copy number by cross-species duplex quantitative PCR.

    PubMed

    Christodoulou, I; Patsali, P; Stephanou, C; Antoniou, M; Kleanthous, M; Lederer, C W

    2016-01-01

    Lentiviruses are the vectors of choice for many preclinical studies and clinical applications of gene therapy. Accurate measurement of biological vector titre before treatment is a prerequisite for vector dosing, and the calculation of vector integration sites per cell after treatment is as critical to the characterisation of modified cell products as it is to long-term follow-up and the assessment of risk and therapeutic efficiency in patients. These analyses are typically based on quantitative real-time PCR (qPCR), but as yet compromise accuracy and comparability between laboratories and experimental systems, the former by using separate simplex reactions for the detection of endogene and lentiviral sequences and the latter by designing different PCR assays for analyses in human cells and animal disease models. In this study, we validate in human and murine cells a qPCR system for the single-tube assessment of lentiviral vector copy numbers that is suitable for analyses in at least 33 different mammalian species, including human and other primates, mouse, pig, cat and domestic ruminants. The established assay combines the accuracy of single-tube quantitation by duplex qPCR with the convenience of one-off assay optimisation for cross-species analyses and with the direct comparability of lentiviral transduction efficiencies in different species.
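
    The copy-number arithmetic behind such duplex assays is compact; a hedged sketch of the standard comparative-Ct form, assuming comparable amplification efficiencies and a diploid endogenous reference (not the paper's exact calibration):

      def vector_copy_number(ct_vector, ct_endogene, efficiency=2.0):
          """Integrated vector copies per cell from one duplex reaction."""
          ratio = efficiency ** (ct_endogene - ct_vector)
          return 2.0 * ratio  # x2 for the diploid endogenous reference

      print(f"VCN = {vector_copy_number(ct_vector=25.1, ct_endogene=24.3):.2f}")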

  15. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    ERIC Educational Resources Information Center

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has increased exponentially. This quantitative research project used a pretest/posttest design and reviewed how an educational electronic documentation system helped nursing students accurately identify the "related to" statement of the nursing diagnosis for the patient in the case…

  16. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  17. Linkage disequilibrium interval mapping of quantitative trait loci

    PubMed Central

    Boitard, Simon; Abdallah, Jihad; de Rochambeau, Hubert; Cierco-Ayrolles, Christine; Mangin, Brigitte

    2006-01-01

    Background For many years gene mapping studies have been performed through linkage analyses based on pedigree data. Recently, linkage disequilibrium methods based on unrelated individuals have been advocated as powerful tools to refine estimates of gene location. Many strategies have been proposed to deal with simply inherited disease traits. However, locating quantitative trait loci is statistically more challenging and considerable research is needed to provide robust and computationally efficient methods. Results Under a three-locus Wright-Fisher model, we derived approximate expressions for the expected haplotype frequencies in a population. We considered haplotypes comprising one trait locus and two flanking markers. Using these theoretical expressions, we built a likelihood-maximization method, called HAPim, for estimating the location of a quantitative trait locus. For each postulated position, the method only requires information from the two flanking markers. Over a wide range of simulation scenarios it was found to be more accurate than a two-marker composite likelihood method. It also performed as well as identity by descent methods, whilst being valuable in a wider range of populations. Conclusion Our method makes efficient use of marker information, and can be valuable for fine mapping purposes. Its performance is increased if multiallelic markers are available. Several improvements can be developed to account for more complex evolution scenarios or provide robust confidence intervals for the location estimates. PMID:16542433

  18. Laser Ablation/Ionisation Mass Spectrometry: Sensitive and Quantitative Chemical Depth Profiling of Solid Materials.

    PubMed

    Riedo, Andreas; Grimaudo, Valentine; Moreno-García, Pavel; Neuland, Maike B; Tulej, Marek; Broekmann, Peter; Wurz, Peter

    2016-01-01

    Direct, quantitative and sensitive chemical analysis of solid materials with high spatial resolution, in both the lateral and vertical directions, is of high importance in various fields of analytical research, ranging from in situ space research to the semiconductor industry. Accurate knowledge of the chemical composition of solid materials allows a better understanding of the physical and chemical processes that formed or altered the material, and allows, for example, further improvement of these processes. So far, state-of-the-art techniques such as SIMS, LA-ICP-MS or GD-MS have been applied for chemical analyses in these fields of research. In this report we review the current measurement capability and the applicability of our Laser Ablation/Ionisation Mass Spectrometer (instrument name LMS) for the chemical analysis of solids with high spatial resolution. The most recent chemical analyses conducted on various solid materials, including alloys, fossils and meteorites, are discussed.

  19. Accurate estimation of sigma(exp 0) using AIRSAR data

    NASA Technical Reports Server (NTRS)

    Holecz, Francesco; Rignot, Eric

    1995-01-01

    During recent years signature analysis, classification, and modeling of Synthetic Aperture Radar (SAR) data as well as estimation of geophysical parameters from SAR data have received a great deal of interest. An important requirement for the quantitative use of SAR data is the accurate estimation of the backscattering coefficient sigma(exp 0). In terrain with relief variations radar signals are distorted due to the projection of the scene topography into the slant range-Doppler plane. The effect of these variations is to change the physical size of the scattering area, leading to errors in the radar backscatter values and incidence angle. For this reason the local incidence angle, derived from sensor position and Digital Elevation Model (DEM) data, must always be considered. Especially in the airborne case, the antenna gain pattern can be an additional source of radiometric error, because the radar look angle is not known precisely as a result of the aircraft motions and the local surface topography. Consequently, radiometric distortions due to the antenna gain pattern must also be corrected for each resolution cell, by taking into account aircraft displacements (position and attitude) and the position of the backscatter element, defined by the DEM data. In this paper, a method to derive an accurate estimation of the backscattering coefficient using NASA/JPL AIRSAR data is presented. The results are evaluated in terms of geometric accuracy, radiometric variations of sigma(exp 0), and precision of the estimated forest biomass.
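
    To make the two radiometric corrections concrete, here is a simplified per-cell sketch: it removes the two-way antenna gain toward the cell and projects the slant-range ("beta nought") backscatter to the ground using the DEM-derived local incidence angle. It deliberately ignores the motion-compensation and geometry details of the actual AIRSAR processing chain, and the function name is ours:

        import numpy as np

        def corrected_sigma0(power_db, theta_local_deg, antenna_gain_db):
            """Toy radiometric correction for one resolution cell (values in dB).

            - removes the two-way antenna gain toward the cell
            - converts slant-range backscatter to ground-referenced sigma nought
              via the DEM-derived local incidence angle
            """
            beta0_db = power_db - 2.0 * antenna_gain_db          # remove two-way gain
            theta = np.radians(theta_local_deg)
            return beta0_db + 10.0 * np.log10(np.sin(theta))     # project scattering area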

  20. Accurate 3D quantification of the bronchial parameters in MDCT

    NASA Astrophysics Data System (ADS)

    Saragaglia, A.; Fetita, C.; Preteux, F.; Brillet, P. Y.; Grenier, P. A.

    2005-08-01

    The assessment of bronchial reactivity and wall remodeling in asthma plays a crucial role in better understanding such a disease and evaluating therapeutic responses. Today, multi-detector computed tomography (MDCT) makes it possible to perform an accurate estimation of bronchial parameters (lumen and wall areas) by allowing a quantitative analysis in a cross-section plane orthogonal to the bronchus axis. This paper provides the tools for such an analysis by developing a 3D investigation method which relies on 3D reconstruction of the bronchial lumen and central axis computation. Cross-section images at bronchial locations interactively selected along the central axis are generated at appropriate spatial resolution. An automated approach is then developed for accurately segmenting the inner and outer bronchi contours on the cross-section images. It combines mathematical morphology operators, such as "connection cost", and energy-controlled propagation in order to overcome the difficulties raised by vessel adjacencies and wall irregularities. The segmentation accuracy was validated with respect to a 3D mathematically modeled phantom of a bronchus-vessel pair which mimics the characteristics of real data in terms of gray-level distribution, caliber and orientation. When applying the developed quantification approach to such a model with calibers ranging from 3 to 10 mm diameter, the lumen area relative errors varied from 3.7% to 0.15%, while the bronchus area was estimated with a relative error of less than 5.1%.

  1. D-BRAIN: Anatomically Accurate Simulated Diffusion MRI Brain Data.

    PubMed

    Perrone, Daniele; Jeurissen, Ben; Aelterman, Jan; Roine, Timo; Sijbers, Jan; Pizurica, Aleksandra; Leemans, Alexander; Philips, Wilfried

    2016-01-01

    Diffusion Weighted (DW) MRI allows for the non-invasive study of water diffusion inside living tissues. As such, it is useful for the investigation of human brain white matter (WM) connectivity in vivo through fiber tractography (FT) algorithms. Many DW-MRI tailored restoration techniques and FT algorithms have been developed. However, it is not clear how accurately these methods reproduce the WM bundle characteristics in real-world conditions, such as in the presence of noise, partial volume effect, and a limited spatial and angular resolution. The difficulty lies in the lack of a realistic brain phantom on the one hand, and a sufficiently accurate way of modeling the acquisition-related degradation on the other. This paper proposes a software phantom that approximates a human brain to a high degree of realism and that can incorporate complex brain-like structural features. We refer to it as a Diffusion BRAIN (D-BRAIN) phantom. Also, we propose an accurate model of a (DW) MRI acquisition protocol to allow for validation of methods in realistic conditions with data imperfections. The phantom model simulates anatomical and diffusion properties for multiple brain tissue components, and can serve as a ground-truth to evaluate FT algorithms, among others. The simulation of the acquisition process allows one to include noise, partial volume effects, and limited spatial and angular resolution in the images. In this way, the effect of image artifacts on, for instance, fiber tractography can be investigated with great detail. The proposed framework enables reliable and quantitative evaluation of DW-MR image processing and FT algorithms at the level of large-scale WM structures. The effect of noise levels and other data characteristics on cortico-cortical connectivity and tractography-based grey matter parcellation can be investigated as well.

  2. Feedback about More Accurate versus Less Accurate Trials: Differential Effects on Self-Confidence and Activation

    ERIC Educational Resources Information Center

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-01-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On Day 1, participants performed a golf putting task under one of…

  3. Serum Insulin-like Growth Factor I Quantitation by Mass Spectrometry: Insights for Protein Quantitation with this Technology.

    PubMed

    Kam, Richard Kin Ting; Ho, Chung Shun; Chan, Michael Ho Ming

    2016-12-01

    Liquid chromatography mass spectrometry (LC-MS) is a widely used technique in the clinical laboratory, especially for small molecule quantitation in biological specimens, for example, steroid hormones and therapeutic drugs. Analysis of circulating macromolecules, including proteins and peptides, is largely dominated by traditional enzymatic, spectrophotometric, or immunological assays in clinical laboratories. However, these methodologies are known to be subject to interference, for example from heterophilic antibodies, as well as to non-specificity issues. In recent years, there has been growing interest in using LC-MS platforms for protein analysis in the clinical setting, due to their superior specificity compared to immunoassays and the possibility of simultaneous quantitation of multiple proteins. Different analytical approaches are possible using LC-MS-based methodology, including accurate mass measurement of intact molecules, protein digestion followed by detection of proteolytic peptides, and combination with immunoaffinity purification. Proteins with different complexity, isoforms, variants, or chemical alterations can be simultaneously analysed by LC-MS, by either targeted or non-targeted approaches. While the LC-MS platform offers a more specific determination of proteins, there remain issues of LC-MS assay harmonization, correlation with existing platforms, and the potential impact on clinical decision-making. In this review, the clinical utility, historical aspects, and challenges of using LC-MS for protein analysis in the clinical setting are discussed, using insulin-like growth factor (IGF) as an example.

  5. Spacelab Charcoal Analyses

    NASA Technical Reports Server (NTRS)

    Slivon, L. E.; Hernon-Kenny, L. A.; Katona, V. R.; Dejarme, L. E.

    1995-01-01

    This report describes analytical methods and results obtained from chemical analysis of 31 charcoal samples in five sets. Each set was obtained from a single scrubber used to filter ambient air on board a Spacelab mission. Analysis of the charcoal samples was conducted by thermal desorption followed by gas chromatography/mass spectrometry (GC/MS). All samples were analyzed using identical methods. The method used for these analyses was able to detect compounds independent of their polarity or volatility. In addition to the charcoal samples, analyses of three Environmental Control and Life Support System (ECLSS) water samples were conducted specifically for trimethylamine.

  6. Accurate Guitar Tuning by Cochlear Implant Musicians

    PubMed Central

    Lu, Thomas; Huang, Juan; Zeng, Fan-Gang

    2014-01-01

    Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task. PMID:24651081

  7. New model accurately predicts reformate composition

    SciTech Connect

    Ancheyta-Juarez, J.; Aguilar-Rodriguez, E. )

    1994-01-31

    Although naphtha reforming is a well-known process, the evolution of catalyst formulation, as well as new trends in gasoline specifications, have led to rapid evolution of the process, including: reactor design, regeneration mode, and operating conditions. Mathematical modeling of the reforming process is an increasingly important tool. It is fundamental to the proper design of new reactors and revamp of existing ones. Modeling can be used to optimize operating conditions, analyze the effects of process variables, and enhance unit performance. Instituto Mexicano del Petroleo has developed a model of the catalytic reforming process that accurately predicts reformate composition at the higher-severity conditions at which new reformers are being designed. The new AA model is more accurate than previous proposals because it takes into account the effects of temperature and pressure on the rate constants of each chemical reaction.

  8. Accurate colorimetric feedback for RGB LED clusters

    NASA Astrophysics Data System (ADS)

    Man, Kwong; Ashdown, Ian

    2006-08-01

    We present an empirical model of LED emission spectra that is applicable to both InGaN and AlInGaP high-flux LEDs, and which accurately predicts their relative spectral power distributions over a wide range of LED junction temperatures. We further demonstrate with laboratory measurements that changes in LED spectral power distribution with temperature can be accurately predicted with first- or second-order equations. This provides the basis for a real-time colorimetric feedback system for RGB LED clusters that can maintain the chromaticity of white light at constant intensity to within +/-0.003 Δuv over a range of 45 degrees Celsius, and to within 0.01 Δuv when dimmed over an intensity range of 10:1.
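
    A minimal sketch of the temperature model described above: a second-order polynomial in junction temperature fitted independently at each wavelength sample. The calibration numbers below are hypothetical placeholders, not measurements from the paper:

        import numpy as np

        # Hypothetical calibration: relative spectral power of one LED channel
        # sampled at five wavelengths, measured at four junction temperatures.
        temps = np.array([25.0, 45.0, 65.0, 85.0])                 # deg C
        spd = np.array([[0.95, 1.00, 0.98, 0.90, 0.80],
                        [0.93, 0.99, 0.97, 0.88, 0.78],
                        [0.90, 0.97, 0.95, 0.86, 0.75],
                        [0.86, 0.94, 0.93, 0.83, 0.72]])           # rows follow temps

        # Fit a second-order polynomial in temperature at every wavelength
        # sample; coeffs has shape (3, n_wavelengths).
        coeffs = np.polyfit(temps, spd, deg=2)

        def predict_spd(t_junction):
            """Predict the relative spectral power distribution at a temperature."""
            a, b, c = coeffs
            return a * t_junction**2 + b * t_junction + c

        print(predict_spd(55.0))   # interpolated SPD between calibration points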

  10. An Accurate, Simplified Model of Intrabeam Scattering

    SciTech Connect

    Bane, Karl LF

    2002-05-23

    Beginning with the general Bjorken-Mtingwa solution for intrabeam scattering (IBS), we derive an accurate, greatly simplified model of IBS, valid for high energy beams in normal storage ring lattices. In addition, we show that, under the same conditions, a modified version of Piwinski's IBS formulation (where $\eta_{x,y}^{2}/\beta_{x,y}$ has been replaced by $\mathcal{H}_{x,y}$) asymptotically approaches the result of Bjorken-Mtingwa.

  11. An accurate registration technique for distorted images

    NASA Technical Reports Server (NTRS)

    Delapena, Michele; Shaw, Richard A.; Linde, Peter; Dravins, Dainis

    1990-01-01

    Accurate registration of International Ultraviolet Explorer (IUE) images is crucial because the variability of the geometrical distortions that are introduced by the SEC-Vidicon cameras ensures that raw science images are never perfectly aligned with the Intensity Transfer Functions (ITFs) (i.e., graded floodlamp exposures that are used to linearize and normalize the camera response). A technique for precisely registering IUE images which uses a cross correlation of the fixed pattern that exists in all raw IUE images is described.

  12. On accurate determination of contact angle

    NASA Technical Reports Server (NTRS)

    Concus, P.; Finn, R.

    1992-01-01

    Methods are proposed that exploit a microgravity environment to obtain highly accurate measurement of contact angle. These methods, which are based on our earlier mathematical results, do not require detailed measurement of a liquid free-surface, as they incorporate discontinuous or nearly-discontinuous behavior of the liquid bulk in certain container geometries. Physical testing is planned in the forthcoming IML-2 space flight and in related preparatory ground-based experiments.

  13. Restricted versus Unrestricted Learning: Synthesis of Recent Meta-Analyses

    ERIC Educational Resources Information Center

    Johnson, Genevieve

    2007-01-01

    Meta-analysis is a method of quantitatively summarizing the results of experimental research. This article summarizes four meta-analyses published since 2003 that compare the effect of distance education (DE) and traditional education (TE) on student learning. Despite limitations, synthesis of these meta-analyses establishes, at the very least, equivalent learning…

  14. Wavelet Analyses and Applications

    ERIC Educational Resources Information Center

    Bordeianu, Cristian C.; Landau, Rubin H.; Paez, Manuel J.

    2009-01-01

    It is shown how a modern extension of Fourier analysis known as wavelet analysis is applied to signals containing multiscale information. First, a continuous wavelet transform is used to analyse the spectrum of a nonstationary signal (one whose form changes in time). The spectral analysis of such a signal gives the strength of the signal in each…
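
    A continuous wavelet transform of this kind can be sketched in a few lines. The following Morlet-based implementation via direct convolution is illustrative rather than optimized (production codes use FFTs), and all names are ours; applied to a chirp, the power migrates across scales as the frequency rises:

        import numpy as np

        def morlet_cwt(signal, scales, dt=1.0, w0=6.0):
            """Continuous wavelet transform with a Morlet mother wavelet.

            Returns an array of shape (len(scales), len(signal)); its squared
            magnitude shows how the spectral content evolves in time. Amplitude
            normalization is left rough, which is fine for relative power.
            """
            n = len(signal)
            out = np.empty((len(scales), n), dtype=complex)
            t = (np.arange(n) - n // 2) * dt
            for i, s in enumerate(scales):
                wavelet = (np.pi ** -0.25) * np.exp(1j * w0 * t / s) \
                          * np.exp(-0.5 * (t / s) ** 2) / np.sqrt(s)
                # correlate the signal with the scaled wavelet
                out[i] = np.convolve(signal, np.conj(wavelet[::-1]), mode="same")
            return out

        # A chirp: frequency rises with time, so power drifts across scales.
        t = np.linspace(0, 10, 1000)
        chirp = np.sin(2 * np.pi * (1 + 0.5 * t) * t)
        power = np.abs(morlet_cwt(chirp, scales=np.geomspace(0.05, 2, 40),
                                  dt=t[1] - t[0])) ** 2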

  15. Apollo 14 microbial analyses

    NASA Technical Reports Server (NTRS)

    Taylor, G. R.

    1972-01-01

    Extensive microbiological analyses that were performed on the Apollo 14 prime and backup crewmembers and ancillary personnel are discussed. The crewmembers were subjected to four separate and quite different environments during the 137-day monitoring period. The relation between each of these environments and observed changes in the microflora of each astronaut are presented.

  16. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    NASA Technical Reports Server (NTRS)

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 plus or minus 6.1%, mean plus or minus SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operator curve (ROC) of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P less than 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 plus or minus 11.5 vs. 41.5 plus or minus 13.6 mV, respectively, P less than 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of greater than or equal to 40 points and greater than or equal to 445 ms, respectively. In conclusion 12-lead HF QRS ECG employing
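
    The first step of such an analysis, isolating the 150-250 Hz band of a lead and measuring its RMS voltage, can be sketched as below; the RAZ morphology scoring itself belongs to the cited software and is not reproduced here. Function and parameter names are ours, and the sampling rate must comfortably exceed 500 Hz for this band:

        import numpy as np
        from scipy.signal import butter, filtfilt

        def hf_qrs_rms(ecg, fs, band=(150.0, 250.0)):
            """Band-pass one ECG lead to the high-frequency QRS band and return
            the root-mean-square voltage of the filtered signal (one ingredient
            of the HF QRS analysis described above)."""
            nyq = fs / 2.0
            b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
            hf = filtfilt(b, a, ecg)           # zero-phase filtering
            return np.sqrt(np.mean(hf ** 2))

        # e.g. hf_qrs_rms(ecg_lead, fs=1000.0) for a 1 kHz recording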

  17. Recent advances in quantitative neuroproteomics.

    PubMed

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2013-06-15

    The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification, and the identification of modifications, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to various neuropsychiatric disorders, including schizophrenia and drug addiction, and to neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (label-free, MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). While the biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies is beginning to shed

  18. Fully automated software solution for protein quantitation by global metabolic labeling with stable isotopes.

    PubMed

    Bindschedler, L V; Cramer, R

    2011-06-15

    Metabolic stable isotope labeling is increasingly employed for accurate protein (and metabolite) quantitation using mass spectrometry (MS). It provides sample-specific isotopologues that can be used to facilitate comparative analysis of two or more samples. Stable Isotope Labeling by Amino acids in Cell culture (SILAC) has been used for almost a decade in proteomic research and analytical software solutions have been established that provide an easy and integrated workflow for elucidating sample abundance ratios for most MS data formats. While SILAC is a discrete labeling method using specific amino acids, global metabolic stable isotope labeling using isotopes such as (15)N labels the entire element content of the sample, i.e. for (15)N the entire peptide backbone in addition to all nitrogen-containing side chains. Although global metabolic labeling can deliver advantages with regard to isotope incorporation and costs, the requirements for data analysis are more demanding because, for instance for polypeptides, the mass difference introduced by the label depends on the amino acid composition. Consequently, there has been less progress on the automation of the data processing and mining steps for this type of protein quantitation. Here, we present a new integrated software solution for the quantitative analysis of protein expression in differential samples and show the benefits of high-resolution MS data in quantitative proteomic analyses.
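
    The composition dependence mentioned above is simple to compute: the mass shift of a fully (15)N-labeled peptide scales with its total nitrogen count, unlike SILAC's fixed per-residue offsets. A small sketch using standard residue nitrogen counts; the function name is our own:

        # Nitrogen atoms per amino-acid residue (one backbone N each, plus
        # side-chain nitrogens for W, N, Q, K, H and R).
        N_ATOMS = {aa: 1 for aa in "GASPVTCLIMFYDEW"}
        N_ATOMS.update({"W": 2, "N": 2, "Q": 2, "K": 2, "H": 3, "R": 4})

        MASS_SHIFT_15N = 15.0001089 - 14.0030740   # ~0.997 Da per labeled nitrogen

        def n15_mass_shift(peptide: str) -> float:
            """Mass difference between a fully 15N-labeled peptide and its
            light form; depends on the peptide's amino-acid composition."""
            n_atoms = sum(N_ATOMS[aa] for aa in peptide.upper())
            return n_atoms * MASS_SHIFT_15N

        # S, A, M, P, L, E contribute one N each and R contributes four,
        # so "SAMPLER" carries 10 nitrogens -> ~9.97 Da shift.
        print(round(n15_mass_shift("SAMPLER"), 3))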

  19. Sources of Technical Variability in Quantitative LC-MS Proteomics: Human Brain Tissue Sample Analysis.

    SciTech Connect

    Piehowski, Paul D.; Petyuk, Vladislav A.; Orton, Daniel J.; Xie, Fang; Moore, Ronald J.; Ramirez Restrepo, Manuel; Engel, Anzhelika; Lieberman, Andrew P.; Albin, Roger L.; Camp, David G.; Smith, Richard D.; Myers, Amanda J.

    2013-05-03

    To design a robust quantitative proteomics study, an understanding of both the inherent heterogeneity of the biological samples being studied as well as the technical variability of the proteomics methods and platform is needed. Additionally, accurately identifying the technical steps associated with the largest variability would provide valuable information for the improvement and design of future processing pipelines. We present an experimental strategy that allows for a detailed examination of the variability of the quantitative LC-MS proteomics measurements. By replicating analyses at different stages of processing, various technical components can be estimated and their individual contribution to technical variability can be dissected. This design can be easily adapted to other quantitative proteomics pipelines. Herein, we applied this methodology to our label-free workflow for the processing of human brain tissue. For this application, the pipeline was divided into four critical components: Tissue dissection and homogenization (extraction), protein denaturation followed by trypsin digestion and SPE clean-up (digestion), short-term run-to-run instrumental response fluctuation (instrumental variance), and long-term drift of the quantitative response of the LC-MS/MS platform over the 2 week period of continuous analysis (instrumental stability). From this analysis, we found the following contributions to variability: extraction (72%) >> instrumental variance (16%) > instrumental stability (8.4%) > digestion (3.1%). Furthermore, the stability of the platform and its suitability for discovery proteomics studies are demonstrated.
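
    Once each component's variance has been estimated from its replicates, expressing the contributions is straightforward. A toy sketch with hypothetical variance values chosen only to echo the reported percentages, not the study's data:

        # Hypothetical per-stage variance estimates obtained from replicate
        # analyses at each processing stage, as in the design above.
        components = {
            "extraction": 0.045,
            "digestion": 0.002,
            "instrumental variance": 0.010,
            "instrumental stability": 0.005,
        }
        total = sum(components.values())
        for stage, var in sorted(components.items(), key=lambda kv: -kv[1]):
            print(f"{stage:>22s}: {100 * var / total:5.1f}% of technical variance")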

  20. Evaluation of an automatic HPLC analyser for thalassemia and haemoglobin variants screening

    PubMed Central

    Barella, S.; Gasperini, D.; Perseu, L.; Paglietti, E.; Sollaino, C.; Paderi, L.; Pirroni, M. G.; Maccioni, L.; Mosca, A.

    1995-01-01

    In this paper the authors report the evaluation of a new automatic HPLC analyser for screening haemoglobinopathies. HbA2 and F determinations are accurate and reproducible. The analysis time is short (6.5 min) and there is a good separation of the HbA2 values of β-thalassemia carriers from those of normal subjects and α-thalassemia carriers, with no overlap between these groups. In addition, the system is also able to detect and quantitate most of the haemoglobin variants, particularly those (HbS, HbC, HbE and Hb Lepore) able to interact with β-thalassemia, which could make haemoglobin electrophoresis unnecessary for all samples. The ease of operation and the limited technical work make this system especially suitable for laboratories with a high workload and allow the cost of screening to be reduced. PMID:18925016

  1. Micromechanical Failure Analyses for Finite Element Polymer Modeling

    SciTech Connect

    CHAMBERS,ROBERT S.; REEDY JR.,EARL DAVID; LO,CHI S.; ADOLF,DOUGLAS B.; GUESS,TOMMY R.

    2000-11-01

    Polymer stresses around sharp corners and in constrained geometries of encapsulated components can generate cracks leading to system failures. Often, analysts use maximum stresses as a qualitative indicator for evaluating the strength of encapsulated component designs. Although this approach has been useful for making relative comparisons when screening prospective design changes, it has not been tied quantitatively to failure. Accurate failure models are needed for analyses to predict whether encapsulated components meet life cycle requirements. With Sandia's recently developed nonlinear viscoelastic polymer models, it has been possible to examine more accurately the local stress-strain distributions in zones of likely failure initiation, looking for physically based failure mechanisms and continuum metrics that correlate with the cohesive failure event. This study has identified significant differences between rubbery and glassy failure mechanisms that suggest reasonable alternatives for cohesive failure criteria and metrics. Rubbery failure seems best characterized by the mechanism of finite extensibility and appears to correlate with maximum strain predictions. Glassy failure, however, seems driven by cavitation and correlates with the maximum hydrostatic tension. Using these metrics, two three-point bending geometries were tested and analyzed under variable loading rates, different temperatures and comparable mesh resolution (i.e., accuracy) to make quantitative failure predictions. The resulting predictions and observations agreed well, suggesting the need for additional research. In a separate, additional study, the asymptotically singular stress state found at the tip of a rigid, square inclusion embedded within a thin, linear elastic disk was determined for uniform cooling. The singular stress field is characterized by a single stress intensity factor $K_a$, and the applicable $K_a$ calibration relationship has been determined for both fully bonded and
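
    The glassy-failure metric named above is cheap to evaluate from a stress tensor; a minimal sketch (function name ours; the rubbery finite-extensibility metric would be computed analogously from the strain tensor):

        import numpy as np

        def failure_metrics(stress):
            """Candidate continuum failure metrics for a symmetric 3x3 Cauchy
            stress tensor: the hydrostatic tension (linked to glassy cavitation)
            is the mean of the normal stresses, and the maximum principal
            stress is the largest eigenvalue."""
            stress = np.asarray(stress, dtype=float)
            hydrostatic = np.trace(stress) / 3.0
            principal = np.linalg.eigvalsh(stress).max()
            return hydrostatic, principal

        # Uniaxial tension of 10 MPa: the hydrostatic part is 10/3 MPa.
        print(failure_metrics(np.diag([10.0, 0.0, 0.0])))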

  2. Reference gene selection for quantitative real-time PCR normalization in Quercus suber.

    PubMed

    Marum, Liliana; Miguel, Andreia; Ricardo, Cândido P; Miguel, Célia

    2012-01-01

    The use of reverse transcription quantitative PCR technology to assess gene expression levels requires an accurate normalization of data in order to avoid misinterpretation of experimental results and erroneous analyses. Despite being the focus of several transcriptomics projects, oaks, and particularly cork oak (Quercus suber), have not been investigated regarding the identification of reference genes suitable for the normalization of real-time quantitative PCR data. In this study, ten candidate reference genes (Act, CACs, EF-1α, GAPDH, His3, PsaH, Sand, PP2A, β-Tub and Ubq) were evaluated to determine the most stable internal reference for quantitative PCR normalization in cork oak. The transcript abundance of these genes was analysed in several tissues of cork oak, including leaves, reproduction cork, and periderm from branches at different developmental stages (1-, 2-, and 3-year old) or collected in different dates (active growth period versus dormancy). The three statistical methods (geNorm, NormFinder, and CV method) used in the evaluation of the most suitable combination of reference genes identified Act and CACs as the most stable candidates when all the samples were analysed together, while β-Tub and PsaH showed the lowest expression stability. However, when different tissues, developmental stages, and collection dates were analysed separately, the reference genes exhibited some variation in their expression levels. In this study, and for the first time, we have identified and validated reference genes in cork oak that can be used for quantification of target gene expression in different tissues and experimental conditions and will be useful as a starting point for gene expression studies in other oaks.
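
    For readers implementing such a screen, the geNorm-style stability measure M reduces to a few lines: for each candidate gene, average the standard deviation of its pairwise log2 expression ratios against the other candidates across samples; lower M means a more stable reference. A sketch assuming a samples-by-genes matrix of relative expression levels (function name ours):

        import numpy as np

        def genorm_m_values(expr, gene_names):
            """geNorm-style stability measure M for candidate reference genes.

            expr: array of shape (n_samples, n_genes), relative expression levels.
            Returns {gene: M}; lower M = more stable reference gene.
            """
            log_expr = np.log2(expr)
            n_genes = expr.shape[1]
            m = []
            for j in range(n_genes):
                ratios = log_expr[:, [j]] - log_expr       # log2(gene_j / gene_k)
                sds = ratios.std(axis=0, ddof=1)           # variation per pairing
                m.append(np.delete(sds, j).mean())         # skip the self-ratio
            return dict(zip(gene_names, m))

        # e.g. genorm_m_values(levels, ["Act", "CACs", "EF-1a"])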

  3. Accurate upwind methods for the Euler equations

    NASA Technical Reports Server (NTRS)

    Huynh, Hung T.

    1993-01-01

    A new class of piecewise linear methods for the numerical solution of the one-dimensional Euler equations of gas dynamics is presented. These methods are uniformly second-order accurate, and can be considered as extensions of Godunov's scheme. With an appropriate definition of monotonicity preservation for the case of linear convection, it can be shown that they preserve monotonicity. Similar to Van Leer's MUSCL scheme, they consist of two key steps: a reconstruction step followed by an upwind step. For the reconstruction step, a monotonicity constraint that preserves uniform second-order accuracy is introduced. Computational efficiency is enhanced by devising a criterion that detects the 'smooth' part of the data where the constraint is redundant. The concept and coding of the constraint are simplified by the use of the median function. A slope steepening technique, which has no effect at smooth regions and can resolve a contact discontinuity in four cells, is described. As for the upwind step, existing and new methods are applied in a manner slightly different from those in the literature. These methods are derived by approximating the Euler equations via linearization and diagonalization. At a 'smooth' interface, Harten, Lax, and Van Leer's one intermediate state model is employed. A modification for this model that can resolve contact discontinuities is presented. Near a discontinuity, either this modified model or a more accurate one, namely, Roe's flux-difference splitting, is used. The current presentation of Roe's method, via the conceptually simple flux-vector splitting, not only establishes a connection between the two splittings, but also leads to an admissibility correction with no conditional statement, and an efficient approximation to Osher's approximate Riemann solver. These reconstruction and upwind steps result in schemes that are uniformly second-order accurate and economical at smooth regions, and yield high resolution at discontinuities.
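
    The median function mentioned above makes a monotonicity constraint almost one line: a reconstructed interface value is pulled into the interval spanned by its two neighbouring cell averages. A minimal sketch of that idea only; the paper's full constraint, which preserves uniform second-order accuracy near smooth extrema, involves additional logic not shown here:

        import numpy as np

        def median3(a, b, c):
            """Median of three arrays, written with min/max (no sorting)."""
            return np.maximum(np.minimum(a, b), np.minimum(np.maximum(a, b), c))

        def limit_interface_values(u, u_face):
            """Clip each reconstructed interface value u_face[i] (between cells
            i and i+1) into the interval spanned by the neighbouring cell
            averages; smooth reconstructions already lie inside and pass
            through unchanged."""
            return median3(u[:-1], u_face, u[1:])

        # Cell averages with a jump; the unlimited reconstruction overshoots
        # at both ends and the median constraint clips -0.2 -> 0.0, 1.3 -> 1.0.
        u = np.array([0.0, 0.0, 1.0, 1.0])
        u_face = np.array([-0.2, 0.7, 1.3])        # hypothetical unlimited values
        print(limit_interface_values(u, u_face))   # [0.0, 0.7, 1.0]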

  4. Accurate measurement of unsteady state fluid temperature

    NASA Astrophysics Data System (ADS)

    Jaremkiewicz, Magdalena

    2017-03-01

    In this paper, two accurate methods for determining the transient fluid temperature are presented. Measurements were conducted for boiling water, since its temperature is known. At the beginning the thermometers are at ambient temperature, and then they are immediately immersed into saturated water. The measurements were carried out with two thermometers of different construction but with the same housing outer diameter of 15 mm. One of them is a K-type industrial thermometer widely available commercially. The temperature indicated by the thermometer was corrected by treating the thermometer as a first- or second-order inertia device. A new thermometer design was also proposed and used to measure the temperature of boiling water. Its characteristic feature is a cylinder-shaped housing with the sheath thermocouple located at its center. The temperature of the fluid was determined from measurements taken in the axis of the solid cylindrical element (housing) using the inverse space marching method. Measurements of the transient temperature of air flowing through a wind tunnel using the same thermometers were also carried out. The proposed measurement technique provides more accurate results than measurements using industrial thermometers in conjunction with a simple temperature correction based on a first- or second-order inertia model. By comparing the results, it was demonstrated that the new thermometer yields the fluid temperature much faster and with higher accuracy than the industrial thermometer. Accurate measurements of fast-changing fluid temperatures are possible thanks to the low-inertia thermometer and the fast space marching method applied to solve the inverse heat conduction problem.
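
    For the simpler of the corrections discussed, a first-order inertia model, the fluid temperature follows from the indicated temperature and its time derivative: T_fluid = T_indicated + tau * dT_indicated/dt. A sketch with a synthetic step response (the inverse space-marching method for the new thermometer is beyond a few lines and is not shown):

        import numpy as np

        def correct_first_order(t, indicated, tau):
            """Recover the fluid temperature from a slow sensor modeled as a
            first-order inertia device with time constant tau (seconds)."""
            return indicated + tau * np.gradient(indicated, t)

        # Synthetic check: a step to 100 C seen through a sensor with tau = 3 s;
        # the corrected series is ~100 C from the onset, as it should be.
        t = np.linspace(0.0, 20.0, 2001)
        indicated = 100.0 * (1.0 - np.exp(-t / 3.0))
        print(correct_first_order(t, indicated, tau=3.0)[::500])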

  5. NEUTRONICS ANALYSES FOR SNS TARGETS DEPOSITIONS

    SciTech Connect

    Popova, Irina I; Remec, Igor; Gallmeier, Franz X

    2016-01-01

    In order to deposit spent Spallation Neutron Source (SNS) facility components replaced due to end-of-life radiation-induced material damage or burn-up, or because of mechanical failure or design improvements, waste classification analyses are being performed. These analyses include an accurate estimate of the radionuclide inventory, on the basis of which components are classified and an appropriate container for transport and storage is determined. After the choice of container is made, transport calculations are performed for the facility component placed inside the container, ensuring compliance with waste management regulations. When necessary, additional shielding is added. Most of the effort is concentrated on target deposition, which normally takes place once or twice per year. Additionally, the second target station (STS) is in the process of being designed, and waste management analyses for the STS target are being developed to support a deposition plan.

  6. The first accurate description of an aurora

    NASA Astrophysics Data System (ADS)

    Schröder, Wilfried

    2006-12-01

    As technology has advanced, the scientific study of auroral phenomena has increased by leaps and bounds. A look back at the earliest descriptions of aurorae offers an interesting look into how medieval scholars viewed the subjects that we study.Although there are earlier fragmentary references in the literature, the first accurate description of the aurora borealis appears to be that published by the German Catholic scholar Konrad von Megenberg (1309-1374) in his book Das Buch der Natur (The Book of Nature). The book was written between 1349 and 1350.

  7. Determining accurate distances to nearby galaxies

    NASA Astrophysics Data System (ADS)

    Bonanos, Alceste Zoe

    2005-11-01

    Determining accurate distances to nearby or distant galaxies is a conceptually very simple, yet practically complicated, task. Presently, distances to nearby galaxies are only known to an accuracy of 10-15%. The current anchor galaxy of the extragalactic distance scale is the Large Magellanic Cloud, which has large (10-15%) systematic uncertainties associated with it, because of its morphology, its non-uniform reddening and the unknown metallicity dependence of the Cepheid period-luminosity relation. This work aims to determine accurate distances to some nearby galaxies, and subsequently to help reduce the error in the extragalactic distance scale and the Hubble constant H0. In particular, this work presents the first distance determination of the DIRECT Project to M33 with detached eclipsing binaries. DIRECT aims to obtain a new anchor galaxy for the extragalactic distance scale by measuring direct, accurate (to 5%) distances to two Local Group galaxies, M31 and M33, with detached eclipsing binaries. It involves a massive variability survey of these galaxies and subsequent photometric and spectroscopic follow-up of the detached binaries discovered. In this work, I also present a catalog of variable stars discovered in one of the DIRECT fields, M31Y, which includes 41 eclipsing binaries. Additionally, we derive the distance to the Draco Dwarf Spheroidal galaxy, using the ~100 RR Lyrae stars found in our first CCD variability study of this galaxy. A "hybrid" method of discovering Cepheids with ground-based telescopes is described next. It involves applying the image subtraction technique to the images obtained from ground-based telescopes and then following them up with the Hubble Space Telescope to derive Cepheid period-luminosity distances. By re-analyzing ESO Very Large Telescope data on M83 (NGC 5236), we demonstrate that this method is much more powerful for detecting variability, especially in crowded fields. I finally present photometry for the Wolf-Rayet binary WR 20a

  8. New law requires 'medically accurate' lesson plans.

    PubMed

    1999-09-17

    The California Legislature has passed a bill requiring all textbooks and materials used to teach about AIDS be medically accurate and objective. Statements made within the curriculum must be supported by research conducted in compliance with scientific methods, and published in peer-reviewed journals. Some of the current lesson plans were found to contain scientifically unsupported and biased information. In addition, the bill requires material to be "free of racial, ethnic, or gender biases." The legislation is supported by a wide range of interests, but opposed by the California Right to Life Education Fund, because they believe it discredits abstinence-only material.

  9. Atmospheric tether mission analyses

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA is considering the use of tethered satellites to explore regions of the atmosphere inaccessible to spacecraft or high altitude research balloons. This report summarizes the Lockheed Martin Astronautics (LMA) effort for the engineering study team assessment of an Orbiter-based atmospheric tether mission. Lockheed Martin responsibilities included design recommendations for the deployer and tether, as well as tether dynamic analyses for the mission. Three tether configurations were studied including single line, multistrand (Hoytether) and tape designs.

  10. Quantitative Spectroscopy of Deneb

    NASA Astrophysics Data System (ADS)

    Schiller, Florian; Przybilla, N.

    We use the visually brightest A-type supergiant Deneb (A2 Ia) as a benchmark for testing a spectroscopic analysis technique developed for quantitative studies of BA-type supergiants. Our NLTE spectrum synthesis technique allows us to derive stellar parameters and elemental abundances with unprecedented accuracy. The study is based on a high-resolution and high-S/N spectrum obtained with the Echelle spectrograph FOCES on the Calar Alto 2.2 m telescope. Practically all inconsistencies reported in earlier studies are resolved. A self-consistent view of Deneb is thus obtained, allowing us to discuss its evolutionary state in detail by comparison with the most recent generation of evolution models for massive stars. The basic atmospheric parameters Teff = 8525 ± 75 K and log g = 1.10 ± 0.05 dex (cgs) and the distance imply the following fundamental parameters for Deneb: M_spec = 17 ± 3 M⊙, L = (1.77 ± 0.29) × 10^5 L⊙ and R = 192 ± 16 R⊙. The derived He and CNO abundances indicate mixing with nuclear-processed matter. The high N/C ratio of 4.64 ± 1.39 and N/O ratio of 0.88 ± 0.07 (mass fractions) could in principle be explained by evolutionary models with initially very rapid rotation. A mass of ~22 M⊙ is implied for the progenitor on the zero-age main sequence, i.e. it was a late O-type star. Significant mass loss has occurred, probably enhanced by pronounced centrifugal forces. The observational constraints favour a scenario for the evolution of Deneb where the effects of rotational mixing may be amplified by an interaction with a magnetic field. Analogous analyses of such highly luminous BA-type supergiants will allow for precision studies of different galaxies in the Local Group and beyond.

  11. Accurate oscillator strengths for interstellar ultraviolet lines of Cl I

    NASA Technical Reports Server (NTRS)

    Schectman, R. M.; Federman, S. R.; Beideck, D. J.; Ellis, D. J.

    1993-01-01

    Analyses on the abundance of interstellar chlorine rely on accurate oscillator strengths for ultraviolet transitions. Beam-foil spectroscopy was used to obtain f-values for the astrophysically important lines of Cl I at 1088, 1097, and 1347 A. In addition, the line at 1363 A was studied. Our f-values for 1088, 1097 A represent the first laboratory measurements for these lines; the values are f(1088)=0.081 +/- 0.007 (1 sigma) and f(1097) = 0.0088 +/- 0.0013 (1 sigma). These results resolve the issue regarding the relative strengths for 1088, 1097 A in favor of those suggested by astronomical measurements. For the other lines, our results of f(1347) = 0.153 +/- 0.011 (1 sigma) and f(1363) = 0.055 +/- 0.004 (1 sigma) are the most precisely measured values available. The f-values are somewhat greater than previous experimental and theoretical determinations.

  12. A technique for evaluating bone ingrowth into 3D printed, porous Ti6Al4V implants accurately using X-ray micro-computed tomography and histomorphometry.

    PubMed

    Palmquist, Anders; Shah, Furqan A; Emanuelsson, Lena; Omar, Omar; Suska, Felicia

    2017-03-01

    This paper investigates the application of X-ray micro-computed tomography (micro-CT) to accurately evaluate bone formation within 3D printed, porous Ti6Al4V implants manufactured using Electron Beam Melting (EBM), retrieved after six months of healing in sheep femur and tibia. All samples were scanned twice (i.e., before and after resin embedding), using fast, low-resolution scans (Skyscan 1172; Bruker micro-CT, Kontich, Belgium), and were analysed by 2D and 3D morphometry. The main questions posed were: (i) Can low resolution, fast scans provide morphometric data of bone formed inside (and around) metal implants with a complex, open-pore architecture?, (ii) Can micro-CT be used to accurately quantify both the bone area (BA) and bone-implant contact (BIC)?, (iii) What degree of error is introduced in the quantitative data by varying the threshold values?, and (iv) Does resin embedding influence the accuracy of the analysis? To validate the accuracy of micro-CT measurements, each data set was correlated with a corresponding centrally cut histological section. The results show that quantitative histomorphometry corresponds strongly with 3D measurements made by micro-CT, where a high correlation exists between the two techniques for bone area/volume measurements around and inside the porous network. On the contrary, the direct bone-implant contact is challenging to estimate accurately or reproducibly. Large errors may be introduced in micro-CT measurements when segmentation is performed without calibrating the data set against a corresponding histological section. Generally, the bone area measurement is strongly influenced by the lower threshold limit, while the upper threshold limit has little or no effect. Resin embedding does not compromise the accuracy of micro-CT measurements, although there is a change in the contrast distributions and optimisation of the threshold ranges is required.

  13. Accurate taxonomic assignment of short pyrosequencing reads.

    PubMed

    Clemente, José C; Jansson, Jesper; Valiente, Gabriel

    2010-01-01

    Ambiguities in the taxonomy dependent assignment of pyrosequencing reads are usually resolved by mapping each read to the lowest common ancestor in a reference taxonomy of all those sequences that match the read. This conservative approach has the drawback of mapping a read to a possibly large clade that may also contain many sequences not matching the read. A more accurate taxonomic assignment of short reads can be made by mapping each read to the node in the reference taxonomy that provides the best precision and recall. We show that given a suffix array for the sequences in the reference taxonomy, a short read can be mapped to the node of the reference taxonomy with the best combined value of precision and recall in time linear in the size of the taxonomy subtree rooted at the lowest common ancestor of the matching sequences. An accurate taxonomic assignment of short reads can thus be made with about the same efficiency as when mapping each read to the lowest common ancestor of all matching sequences in a reference taxonomy. We demonstrate the effectiveness of our approach on several metagenomic datasets of marine and gut microbiota.
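
    The selection criterion is easy to state in code. The toy sketch below scores every taxonomy node by the F-measure of precision and recall against the set of reference sequences matching a read, ignoring the suffix-array machinery that makes the real method run in linear time; all names and the toy taxonomy are ours:

        def best_assignment(tree, matches):
            """Assign a read to the node with the best F-measure, rather than
            to the lowest common ancestor of all matching reference sequences.

            tree: dict mapping node -> set of reference sequence ids in its clade
            matches: set of reference sequence ids that match the read
            """
            def f_measure(node):
                clade = tree[node]
                hits = len(clade & matches)
                if hits == 0:
                    return 0.0
                precision = hits / len(clade)
                recall = hits / len(matches)
                return 2 * precision * recall / (precision + recall)

            return max(tree, key=f_measure)

        # Toy taxonomy: the LCA ("root") contains many non-matching sequences,
        # so the smaller clade covering the matches scores better.
        tree = {"root": {"s1", "s2", "s3", "s4", "s5", "s6"},
                "cladeA": {"s1", "s2", "s3"},
                "cladeB": {"s4", "s5", "s6"}}
        print(best_assignment(tree, matches={"s1", "s2"}))   # -> "cladeA"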

  14. Accurate shear measurement with faint sources

    SciTech Connect

    Zhang, Jun; Foucaud, Sebastien; Luo, Wentao E-mail: walt@shao.ac.cn

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  15. Accurate pose estimation for forensic identification

    NASA Astrophysics Data System (ADS)

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we will therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We will illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  16. Sparse and accurate high resolution SAR imaging

    NASA Astrophysics Data System (ADS)

    Vu, Duc; Zhao, Kexin; Rowe, William; Li, Jian

    2012-05-01

    We investigate the usage of an adaptive method, the Iterative Adaptive Approach (IAA), in combination with a maximum a posteriori (MAP) estimate to reconstruct high resolution SAR images that are both sparse and accurate. IAA is a nonparametric weighted least squares algorithm that is robust and user parameter-free. IAA has been shown to reconstruct SAR images with excellent side lobe suppression and high resolution enhancement. We first reconstruct the SAR images using IAA, and then we enforce sparsity by using MAP with a sparsity inducing prior. By coupling these two methods, we can produce sparse and accurate high resolution images that are conducive to feature extraction and target classification applications. In addition, we show how IAA can be made computationally efficient without sacrificing accuracy, a desirable property for SAR applications where the size of the problems is quite large. We demonstrate the success of our approach using the Air Force Research Lab's "Gotcha Volumetric SAR Data Set Version 1.0" challenge dataset. Via the widely used FFT, individual vehicles contained in the scene are barely recognizable due to the poor resolution and high side lobe nature of FFT. However, with our approach, clear edges, boundaries, and textures of the vehicles are obtained.

  17. Accurate basis set truncation for wavefunction embedding

    NASA Astrophysics Data System (ADS)

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  18. Accurate perception of negative emotions predicts functional capacity in schizophrenia.

    PubMed

    Abram, Samantha V; Karpouzian, Tatiana M; Reilly, James L; Derntl, Birgit; Habel, Ute; Smith, Matthew J

    2014-04-30

    Several studies suggest facial affect perception (FAP) deficits in schizophrenia are linked to poorer social functioning. However, whether reduced functioning is associated with inaccurate perception of specific emotional valence or a global FAP impairment remains unclear. The present study examined whether impairment in the perception of specific emotional valences (positive, negative) and neutrality were uniquely associated with social functioning, using a multimodal social functioning battery. A sample of 59 individuals with schizophrenia and 41 controls completed a computerized FAP task, and measures of functional capacity, social competence, and social attainment. Participants also underwent neuropsychological testing and symptom assessment. Regression analyses revealed that only accurately perceiving negative emotions explained significant variance (7.9%) in functional capacity after accounting for neurocognitive function and symptoms. Partial correlations indicated that accurately perceiving anger, in particular, was positively correlated with functional capacity. FAP for positive, negative, or neutral emotions were not related to social competence or social attainment. Our findings were consistent with prior literature suggesting negative emotions are related to functional capacity in schizophrenia. Furthermore, the observed relationship between perceiving anger and performance of everyday living skills is novel and warrants further exploration.

  19. Accurate stone analysis: the impact on disease diagnosis and treatment.

    PubMed

    Mandel, Neil S; Mandel, Ian C; Kolbach-Mandel, Ann M

    2017-02-01

    This manuscript reviews the requirements for acceptable compositional analysis of kidney stones using various biophysical methods. High-resolution X-ray powder diffraction crystallography and Fourier transform infrared spectroscopy (FTIR) are the only acceptable methods in our labs for kidney stone analysis. The use of well-constructed spectral reference libraries is the basis for accurate and complete stone analysis. The literature included in this manuscript identifies errors in most commercial laboratories and in some academic centers. We provide personal comments on why such errors are occurring at such high rates, and although the workload is rather large, it is very worthwhile for providing accurate stone compositions. We also provide the results of our almost 90,000 stone analyses and a breakdown of the number of components we have observed in the various stones. We also offer advice on determining the method used by the various FTIR equipment manufacturers who also provide a stone analysis library, so that FTIR users can feel comfortable with the accuracy of their reported results. Such an analysis of the accuracy of the individual reference libraries could positively influence the reduction in their respective error rates.

  20. Accurate phylogenetic classification of variable-length DNA fragments.

    PubMed

    McHardy, Alice Carolyn; Martín, Héctor García; Tsirigos, Aristotelis; Hugenholtz, Philip; Rigoutsos, Isidore

    2007-01-01

    Metagenome studies have retrieved vast amounts of sequence data from a variety of environments leading to new discoveries and insights into the uncultured microbial world. Except for very simple communities, the encountered diversity has made fragment assembly and the subsequent analysis a challenging problem. A taxonomic characterization of metagenomic fragments is required for a deeper understanding of shotgun-sequenced microbial communities, but success has mostly been limited to sequences containing phylogenetic marker genes. Here we present PhyloPythia, a composition-based classifier that combines higher-level generic clades from a set of 340 completed genomes with sample-derived population models. Extensive analyses on synthetic and real metagenome data sets showed that PhyloPythia allows the accurate classification of most sequence fragments across all considered taxonomic ranks, even for unknown organisms. The method requires no more than 100 kb of training sequence for the creation of accurate models of sample-specific populations and can assign fragments ≥1 kb with high specificity.
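
    Composition-based classifiers of this kind operate on k-mer frequency vectors computed from each fragment. A minimal sketch of that feature-extraction step (our illustration; PhyloPythia's actual models and training procedure are more involved):

      from itertools import product

      def kmer_profile(seq, k=4):
          """Normalized k-mer frequency vector for one DNA fragment."""
          seq = seq.upper()
          counts = dict.fromkeys(("".join(p) for p in product("ACGT", repeat=k)), 0)
          for i in range(len(seq) - k + 1):
              window = seq[i:i + k]
              if window in counts:                 # skips windows containing N, etc.
                  counts[window] += 1
          total = sum(counts.values()) or 1
          return {km: c / total for km, c in counts.items()}

      # The 256-dimensional profile would feed a classifier such as an SVM.
      profile = kmer_profile("ACGTACGTTGCAACGTGGGTACCA")
      print(sorted(profile.items(), key=lambda kv: -kv[1])[:3])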

  1. Quantitative DNA Methylation Profiling in Cancer.

    PubMed

    Ammerpohl, Ole; Haake, Andrea; Kolarova, Julia; Siebert, Reiner

    2016-01-01

    Epigenetic mechanisms including DNA methylation are fundamental for the regulation of gene expression. Epigenetic alterations can lead to the development and the evolution of malignant tumors as well as the emergence of phenotypically different cancer cells or metastasis from a single tumor cell. Here we describe bisulfite pyrosequencing, a technology to perform quantitative DNA methylation analyses, to detect aberrant DNA methylation in malignant tumors.
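
    The quantitative readout reduces to a per-CpG ratio: bisulfite converts unmethylated cytosines so they read as T, while methylated cytosines remain C, and methylation is estimated from the C/(C+T) peak-height ratio. A toy sketch with illustrative numbers, not data from the chapter:

      # Peak heights from a hypothetical pyrogram at two CpG sites.
      cpg_sites = [
          {"site": "CpG1", "C_signal": 820.0, "T_signal": 180.0},
          {"site": "CpG2", "C_signal": 150.0, "T_signal": 850.0},
      ]

      for s in cpg_sites:
          methylation = s["C_signal"] / (s["C_signal"] + s["T_signal"])
          print(f'{s["site"]}: {100 * methylation:.1f}% methylated')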

  2. Accurate measurement of transgene copy number in crop plants using droplet digital PCR.

    PubMed

    Collier, Ray; Dasgupta, Kasturi; Xing, Yan-Ping; Hernandez, Bryan Tarape; Shao, Min; Rohozinski, Dominica; Kovak, Emma; Lin, Jeanie; de Oliveira, Maria Luiza P; Stover, Ed; McCue, Kent F; Harmon, Frank G; Blechl, Ann; Thomson, James G; Thilmony, Roger

    2017-02-23

    Genetic transformation is a powerful means for the improvement of crop plants, but requires labor and resource intensive methods. An efficient method for identifying single copy transgene insertion events from a population of independent transgenic lines is desirable. Currently transgene copy number is estimated by either Southern blot hybridization analyses or quantitative polymerase chain reaction (qPCR) experiments. Southern hybridization is a convincing and reliable method, but it also is expensive, time-consuming and often requires a large amount of genomic DNA and radioactively labeled probes. Alternatively, qPCR requires less DNA and is potentially simpler to perform, but its results can lack the accuracy and precision needed to confidently distinguish between one and two copy events in transgenic plants with large genomes. To address this need, we developed a droplet digital PCR (ddPCR)-based method for transgene copy number measurement in an array of crops: rice, citrus, potato, maize, tomato, and wheat. The method utilizes specific primers to amplify target transgenes and endogenous reference genes in a single duplexed reaction containing thousands of droplets. Endpoint amplicon production in the droplets is detected and quantified using sequence-specific fluorescently labeled probes. The results demonstrate that this approach can generate confident copy number measurements in independent transgenic lines in these crop species. This method and the compendium of probes and primers will be a useful resource for the plant research community, enabling the simple and accurate determination of transgene copy number in these six important crop species.
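
    The arithmetic behind a ddPCR copy-number call is compact: positive-droplet counts are Poisson-corrected to mean molecules per droplet, and copy number is the target-to-reference concentration ratio scaled by the two alleles of a diploid reference gene. A sketch with illustrative droplet counts (not data from the study):

      import math

      def ddpcr_lambda(positive, total):
          """Mean target molecules per droplet via Poisson correction."""
          return -math.log(1.0 - positive / total)

      lam_target = ddpcr_lambda(positive=9000, total=18000)      # transgene
      lam_reference = ddpcr_lambda(positive=15000, total=18000)  # endogenous gene

      # The reference gene contributes two copies per diploid genome.
      copies_per_genome = 2.0 * lam_target / lam_reference
      print(f"estimated transgene copies per genome: {copies_per_genome:.2f}")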

  3. Quantitative analysis of blood vessel geometry

    NASA Astrophysics Data System (ADS)

    Fuhrman, Michael G.; Abdul-Karim, Othman; Shah, Sujal; Gilbert, Steven G.; Van Bibber, Richard

    2001-07-01

    Re-narrowing or restenosis of a human coronary artery occurs within six months in one third of balloon angioplasty procedures. Accurate and repeatable quantitative analysis of vessel shape is important to characterize the progression and type of restenosis, and to evaluate effects new therapies might have. A combination of complicated geometry and image variability, and the need for high resolution and large image size makes visual/manual analysis slow, difficult, and prone to error. The image processing and analysis described here were developed to automate feature extraction of the lumen, internal elastic lamina, neointima, external elastic lamina, and tunica adventitia and to enable an objective, quantitative definition of blood vessel geometry. The quantitative geometrical analysis enables the measurement of several features including perimeter, area, and other metrics of vessel damage. Automation of feature extraction creates a high throughput capability that enables analysis of serial sections for more accurate measurement of restenosis dimensions. Measurement results are input into a relational database where they can be statistically analyzed and compared across studies. As part of the integrated process, results are also imprinted on the images themselves to facilitate auditing of the results. The analysis is fast, repeatable and accurate while allowing the pathologist to control the measurement process.
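
    Once a boundary such as the lumen has been extracted as an ordered contour of pixel coordinates, the reported metrics follow from elementary geometry. A sketch (the circular contour and pixel size below are stand-ins for a real segmentation and calibration):

      import numpy as np

      theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
      contour = np.column_stack([120.0 * np.cos(theta),
                                 120.0 * np.sin(theta)])   # pixel coordinates

      def polygon_area(pts):
          """Shoelace formula for the area enclosed by an ordered contour."""
          x, y = pts[:, 0], pts[:, 1]
          return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

      def polygon_perimeter(pts):
          return np.linalg.norm(np.diff(pts, axis=0, append=pts[:1]), axis=1).sum()

      px = 0.005   # assumed pixel size in mm (calibration-dependent)
      print(f"lumen area: {polygon_area(contour) * px**2:.4f} mm^2")
      print(f"lumen perimeter: {polygon_perimeter(contour) * px:.3f} mm")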

  4. LDEF Satellite Radiation Analyses

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Model calculations and analyses have been carried out to compare with several sets of data (dose, induced radioactivity in various experiment samples and spacecraft components, fission foil measurements, and LET spectra) from passive radiation dosimetry on the Long Duration Exposure Facility (LDEF) satellite, which was recovered after almost six years in space. The calculations and data comparisons are used to estimate the accuracy of current models and methods for predicting the ionizing radiation environment in low earth orbit. The emphasis is on checking the accuracy of trapped proton flux and anisotropy models.

  5. Targeted Quantitation of Proteins by Mass Spectrometry

    PubMed Central

    2013-01-01

    Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement. PMID:23517332
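
    A common standardization scheme mentioned above is stable-isotope dilution: a known amount of an isotope-labeled internal standard is spiked into the sample, and the analyte amount follows from the analyte-to-standard peak-area ratio. A sketch with illustrative values:

      # Illustrative integrated MRM peak areas (arbitrary units).
      area_analyte = 4.2e5    # endogenous peptide transition
      area_standard = 2.1e5   # isotope-labeled internal standard
      spiked_fmol = 50.0      # known amount of standard added

      analyte_fmol = spiked_fmol * (area_analyte / area_standard)
      print(f"analyte amount: {analyte_fmol:.1f} fmol")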

  6. Accurate response surface approximations for weight equations based on structural optimization

    NASA Astrophysics Data System (ADS)

    Papila, Melih

    Accurate weight prediction methods are vitally important for aircraft design optimization. Therefore, designers seek weight prediction techniques with low computational cost and high accuracy, and usually require a compromise between the two. The compromise can be achieved by combining stress analysis and response surface (RS) methodology. While stress analysis provides accurate weight information, RS techniques help to transmit this information effectively to the optimization procedure. The focus of this dissertation is structural weight equations in the form of RS approximations and their accuracy when fitted to results of structural optimizations that are based on finite element analyses. Use of RS methodology filters out the numerical noise in structural optimization results and provides a smooth weight function that can easily be used in gradient-based configuration optimization. In engineering applications RS approximations of low order polynomials are widely used, but the weight may not be modeled well by low-order polynomials, leading to bias errors. In addition, some structural optimization results may have high-amplitude errors (outliers) that may severely affect the accuracy of the weight equation. Statistical techniques associated with RS methodology are sought in order to deal with these two difficulties: (1) high-amplitude numerical noise (outliers) and (2) approximation model inadequacy. The investigation starts with reducing approximation error by identifying and repairing outliers. A potential reason for outliers in optimization results is premature convergence, and outliers of such nature may be corrected by employing different convergence settings. It is demonstrated that outlier repair can lead to accuracy improvements over the more standard approach of removing outliers. The adequacy of approximation is then studied by a modified lack-of-fit approach, and RS errors due to the approximation model are reduced by using higher order polynomials.
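
    At its core, such a response surface is a low-order polynomial fitted by least squares to optimization results, with outlier screening on the residuals. A synthetic sketch of a quadratic fit and a standardized-residual outlier flag (a simple stand-in for the detection step; the repair strategy described above re-runs the optimizations with different convergence settings):

      import numpy as np

      rng = np.random.default_rng(2)
      x = rng.uniform(-1.0, 1.0, size=(40, 2))                 # design variables
      w = 100 + 20*x[:, 0] + 35*x[:, 1] + 10*x[:, 0]*x[:, 1]   # "true" weight
      w += rng.normal(0.0, 1.0, 40)                            # numerical noise
      w[5] += 25.0                                             # one injected outlier

      # Quadratic RS design matrix: 1, x1, x2, x1^2, x1*x2, x2^2.
      X = np.column_stack([np.ones(40), x[:, 0], x[:, 1],
                           x[:, 0]**2, x[:, 0]*x[:, 1], x[:, 1]**2])
      beta, *_ = np.linalg.lstsq(X, w, rcond=None)

      resid = w - X @ beta
      standardized = resid / resid.std()
      print("suspected outliers:", np.flatnonzero(np.abs(standardized) > 3.0))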

  7. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, D.D.

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  8. Apparatus for accurately measuring high temperatures

    DOEpatents

    Smith, Douglas D.

    1985-01-01

    The present invention is a thermometer used for measuring furnace temperatures in the range of about 1800° to 2700°C. The thermometer comprises a broadband multicolor thermal radiation sensor positioned to be in optical alignment with the end of a blackbody sight tube extending into the furnace. A valve-shutter arrangement is positioned between the radiation sensor and the sight tube and a chamber for containing a charge of high pressure gas is positioned between the valve-shutter arrangement and the radiation sensor. A momentary opening of the valve-shutter arrangement allows a pulse of the high pressure gas to purge the sight tube of air-borne thermal radiation contaminants, which permits the radiation sensor to accurately measure the thermal radiation emanating from the end of the sight tube.

  9. Highly accurate articulated coordinate measuring machine

    DOEpatents

    Bieg, Lothar F.; Jokiel, Jr., Bernhard; Ensz, Mark T.; Watson, Robert D.

    2003-12-30

    Disclosed is a highly accurate articulated coordinate measuring machine, comprising a revolute joint, comprising a circular encoder wheel, having an axis of rotation; a plurality of marks disposed around at least a portion of the circumference of the encoder wheel; bearing means for supporting the encoder wheel, while permitting free rotation of the encoder wheel about the wheel's axis of rotation; and a sensor, rigidly attached to the bearing means, for detecting the motion of at least some of the marks as the encoder wheel rotates; a probe arm, having a proximal end rigidly attached to the encoder wheel, and having a distal end with a probe tip attached thereto; and coordinate processing means, operatively connected to the sensor, for converting the output of the sensor into a set of cylindrical coordinates representing the position of the probe tip relative to a reference cylindrical coordinate system.

  10. Practical aspects of spatially high accurate methods

    NASA Technical Reports Server (NTRS)

    Godfrey, Andrew G.; Mitchell, Curtis R.; Walters, Robert W.

    1992-01-01

    The computational qualities of high order spatially accurate methods for the finite volume solution of the Euler equations are presented. Two dimensional essentially non-oscillatory (ENO), k-exact, and 'dimension by dimension' ENO reconstruction operators are discussed and compared in terms of reconstruction and solution accuracy, computational cost and oscillatory behavior in supersonic flows with shocks. Inherent steady state convergence difficulties are demonstrated for adaptive stencil algorithms. An exact solution to the heat equation is used to determine reconstruction error, and the computational intensity is reflected in operation counts. Standard MUSCL differencing is included for comparison. Numerical experiments presented include the Ringleb flow for numerical accuracy and a shock reflection problem. A vortex-shock interaction demonstrates the ability of the ENO scheme to excel in simulating unsteady high-frequency flow physics.

  11. Obtaining accurate translations from expressed sequence tags.

    PubMed

    Wasmuth, James; Blaxter, Mark

    2009-01-01

    The genomes of an increasing number of species are being investigated through the generation of expressed sequence tags (ESTs). However, ESTs are prone to sequencing errors and typically define incomplete transcripts, making downstream annotation difficult. Annotation would be greatly improved with robust polypeptide translations. Many current solutions for EST translation require a large number of full-length gene sequences for training purposes, a resource that is not available for the majority of EST projects. As part of our ongoing EST programs investigating these "neglected" genomes, we have developed a polypeptide prediction pipeline, prot4EST. It incorporates freely available software to produce final translations that are more accurate than those derived from any single method. We describe how this integrated approach goes a long way to overcoming the deficit in training data.

  12. Micron Accurate Absolute Ranging System: Range Extension

    NASA Technical Reports Server (NTRS)

    Smalley, Larry L.; Smith, Kely L.

    1999-01-01

    The purpose of this research is to investigate Fresnel diffraction as a means of obtaining absolute distance measurements with micron or greater accuracy. It is believed that such a system would prove useful to the Next Generation Space Telescope (NGST) as a non-intrusive, non-contact measuring system for use with secondary concentrator station-keeping systems. The present research attempts to validate past experiments and develop ways to apply the phenomenon of Fresnel diffraction to micron accurate measurement. This report discusses past research on the phenomenon and the basis for using Fresnel diffraction in distance metrology. The apparatus used in the recent investigations, the experimental procedures, and preliminary results are discussed in detail. Continued research and the equipment requirements for extending the effective range of Fresnel diffraction systems are also described.

  13. Accurate radio positions with the Tidbinbilla interferometer

    NASA Technical Reports Server (NTRS)

    Batty, M. J.; Gulkis, S.; Jauncey, D. L.; Rayner, P. T.

    1979-01-01

    The Tidbinbilla interferometer (Batty et al., 1977) is designed specifically to provide accurate radio position measurements of compact radio sources in the Southern Hemisphere with high sensitivity. The interferometer uses the 26-m and 64-m antennas of the Deep Space Network at Tidbinbilla, near Canberra. The two antennas are separated by 200 m on a north-south baseline. By utilizing the existing antennas and the low-noise traveling-wave masers at 2.29 GHz, it has been possible to produce a high-sensitivity instrument with a minimum of capital expenditure. The north-south baseline ensures that a good range of UV coverage is obtained, so that sources lying in the declination range between about -80 and +30 deg may be observed with nearly orthogonal projected baselines of no less than about 1000 lambda. The instrument also provides high-accuracy flux density measurements for compact radio sources.

  14. Magnetic ranging tool accurately guides replacement well

    SciTech Connect

    Lane, J.B.; Wesson, J.P.

    1992-12-21

    This paper reports on magnetic ranging surveys and directional drilling technology which accurately guided a replacement well bore to intersect a leaking gas storage well with casing damage. The second well bore was then used to pump cement into the original leaking casing shoe. The repair well bore kicked off from the surface hole, bypassed casing damage in the middle of the well, and intersected the damaged well near the casing shoe. The repair well was subsequently completed in the gas storage zone near the original well bore, salvaging the valuable bottom hole location in the reservoir. This method would prevent the loss of storage gas, and it would prevent a potential underground blowout that could permanently damage the integrity of the storage field.

  15. Mobile app-based quantitative scanometric analysis.

    PubMed

    Wong, Jessica X H; Liu, Frank S F; Yu, Hua-Zhong

    2014-12-16

    The feasibility of using smartphones and other mobile devices as the detection platform for quantitative scanometric assays is demonstrated. The different scanning modes (color, grayscale, black/white) and grayscale converting protocols (average, weighted average/luminosity, and software specific) have been compared in determining the optical darkness ratio (ODR) values, a conventional quantitation measure for scanometric assays. A mobile app was developed to image and analyze scanometric assays, as demonstrated by paper-printed tests and a biotin-streptavidin assay on a plastic substrate. Primarily for ODR analysis, the app has been shown to perform as well as a traditional desktop scanner, demonstrating that smartphones (and other mobile devices) promise to be a practical platform for accurate, quantitative chemical analysis and medical diagnostics.
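
    A sketch of the two steps such an app performs: luminosity-weighted grayscale conversion and an ODR computed from spot versus background intensities. The ODR convention used here (spot darkness relative to background) and the synthetic image are our assumptions for illustration; the paper compares several conversion protocols.

      import numpy as np

      rng = np.random.default_rng(3)
      rgb = rng.integers(180, 230, size=(64, 64, 3)).astype(float)  # pale background
      rgb[24:40, 24:40] *= 0.4                                      # darker assay spot

      # Luminosity (weighted-average) grayscale conversion.
      gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

      spot = gray[24:40, 24:40].mean()        # mean intensity inside the spot
      background = gray[:8, :8].mean()        # mean intensity of a blank region

      odr = (background - spot) / background  # assumed ODR convention
      print(f"ODR = {odr:.3f}")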

  16. The high cost of accurate knowledge.

    PubMed

    Sutcliffe, Kathleen M; Weber, Klaus

    2003-05-01

    Many business thinkers believe it's the role of senior managers to scan the external environment to monitor contingencies and constraints, and to use that precise knowledge to modify the company's strategy and design. As these thinkers see it, managers need accurate and abundant information to carry out that role. According to that logic, it makes sense to invest heavily in systems for collecting and organizing competitive information. Another school of pundits contends that, since today's complex information often isn't precise anyway, it's not worth going overboard with such investments. In other words, it's not the accuracy and abundance of information that should matter most to top executives--rather, it's how that information is interpreted. After all, the role of senior managers isn't just to make decisions; it's to set direction and motivate others in the face of ambiguities and conflicting demands. Top executives must interpret information and communicate those interpretations--they must manage meaning more than they must manage information. So which of these competing views is the right one? Research conducted by academics Sutcliffe and Weber found that how accurate senior executives are about their competitive environments is indeed less important for strategy and corresponding organizational changes than the way in which they interpret information about their environments. Investments in shaping those interpretations, therefore, may create a more durable competitive advantage than investments in obtaining and organizing more information. And what kinds of interpretations are most closely linked with high performance? Their research suggests that high performers respond positively to opportunities, yet they aren't overconfident in their abilities to take advantage of those opportunities.

  17. Quantitative plant proteomics.

    PubMed

    Bindschedler, Laurence V; Cramer, Rainer

    2011-02-01

    Quantitation is an inherent requirement in comparative proteomics and there is no exception to this for plant proteomics. Quantitative proteomics has high demands on the experimental workflow, requiring a thorough design and often a complex multi-step structure. It has to include sufficient numbers of biological and technical replicates and methods that are able to facilitate a quantitative signal read-out. Quantitative plant proteomics in particular poses many additional challenges but because of the nature of plants it also offers some potential advantages. In general, analysis of plants has been less prominent in proteomics. Low protein concentration, difficulties in protein extraction, genome multiploidy, high Rubisco abundance in green tissue, and an absence of well-annotated and completed genome sequences are some of the main challenges in plant proteomics. However, the latter is now changing with several genomes emerging for model plants and crops such as potato, tomato, soybean, rice, maize and barley. This review discusses the current status in quantitative plant proteomics (MS-based and non-MS-based) and its challenges and potentials. Both relative and absolute quantitation methods in plant proteomics from DIGE to MS-based analysis after isotope labeling and label-free quantitation are described and illustrated by published studies. In particular, we describe plant-specific quantitative methods such as metabolic labeling methods that can take full advantage of plant metabolism and culture practices, and discuss other potential advantages and challenges that may arise from the unique properties of plants.

  18. On an efficient and accurate method to integrate restricted three-body orbits

    NASA Technical Reports Server (NTRS)

    Murison, Marc A.

    1989-01-01

    This work is a quantitative analysis of the advantages of the Bulirsch-Stoer (1966) method, demonstrating that this method is certainly worth considering when working with small N dynamical systems. The results, qualitatively suspected by many users, are quantitatively confirmed as follows: (1) the Bulirsch-Stoer extrapolation method is very fast and moderately accurate; (2) regularization of the equations of motion stabilizes the error behavior of the method and is, of course, essential during close approaches; and (3) when applicable, a manifold-correction algorithm reduces numerical errors to the limits of machine accuracy. In addition, for the specific case of the restricted three-body problem, even a small eccentricity for the orbit of the primaries drastically affects the accuracy of integrations, whether regularized or not; the circular restricted problem integrates much more accurately.
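
    For concreteness, here is a sketch of integrating the planar circular restricted three-body problem in the rotating frame, with scipy's high-order DOP853 integrator standing in for Bulirsch-Stoer extrapolation and drift in the Jacobi constant serving as the kind of conserved-quantity check a manifold-correction algorithm enforces. The mass parameter and initial state are illustrative.

      import numpy as np
      from scipy.integrate import solve_ivp

      mu = 0.01215   # approximate Earth-Moon mass parameter

      def crtbp(t, s):
          x, y, vx, vy = s
          r1 = np.hypot(x + mu, y)            # distance to the primary
          r2 = np.hypot(x - 1.0 + mu, y)      # distance to the secondary
          ax = 2.0*vy + x - (1.0 - mu)*(x + mu)/r1**3 - mu*(x - 1.0 + mu)/r2**3
          ay = -2.0*vx + y - (1.0 - mu)*y/r1**3 - mu*y/r2**3
          return [vx, vy, ax, ay]

      def jacobi(s):
          x, y, vx, vy = s
          r1, r2 = np.hypot(x + mu, y), np.hypot(x - 1.0 + mu, y)
          return x*x + y*y + 2.0*(1.0 - mu)/r1 + 2.0*mu/r2 - (vx*vx + vy*vy)

      s0 = [0.5, 0.0, 0.0, 0.8]               # illustrative initial state
      sol = solve_ivp(crtbp, (0.0, 20.0), s0, method="DOP853",
                      rtol=1e-12, atol=1e-12)

      drift = abs(jacobi(sol.y[:, -1]) - jacobi(np.array(s0)))
      print(f"Jacobi-constant drift: {drift:.3e}")  # small drift = accurate orbit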

  19. Partial differential equation techniques for analysing animal movement: A comparison of different methods.

    PubMed

    Wang, Yi-Shan; Potts, Jonathan R

    2017-03-07

    Recent advances in animal tracking have allowed us to uncover the drivers of movement in unprecedented detail. This has enabled modellers to construct ever more realistic models of animal movement, which aid in uncovering detailed patterns of space use in animal populations. Partial differential equations (PDEs) provide a popular tool for mathematically analysing such models. However, their construction often relies on simplifying assumptions which may greatly affect the model outcomes. Here, we analyse the effect of various PDE approximations on the analysis of some simple movement models, including a biased random walk, central-place foraging processes and movement in heterogeneous landscapes. Perhaps the most commonly-used PDE method dates back to a seminal paper of Patlak from 1953. However, our results show that this can be a very poor approximation in even quite simple models. On the other hand, more recent methods, based on transport equation formalisms, can provide more accurate results, as long as the kernel describing the animal's movement is sufficiently smooth. When the movement kernel is not smooth, we show that both the older and newer methods can lead to quantitatively misleading results. Our detailed analysis will aid future researchers in the appropriate choice of PDE approximation for analysing models of animal movement.
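
    For reference, Patlak-type approximations yield, in the macroscopic limit, an advection-diffusion equation of the schematic form

      \frac{\partial u}{\partial t} = \nabla \cdot \big( D(\mathbf{x}) \, \nabla u \big) - \nabla \cdot \big( \mathbf{c}(\mathbf{x}) \, u \big),

    where the diffusion tensor D and drift c are built from low-order moments of the movement kernel; the smoothness requirement discussed above enters precisely through these moment-based closures. (Stated here in general form for orientation, not as the paper's specific derivation.)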

  20. Application of Mixed-Methods Approaches to Higher Education and Intersectional Analyses

    ERIC Educational Resources Information Center

    Griffin, Kimberly A.; Museus, Samuel D.

    2011-01-01

    In this article, the authors discuss the utility of combining quantitative and qualitative methods in conducting intersectional analyses. First, they discuss some of the paradigmatic underpinnings of qualitative and quantitative research, and how these methods can be used in intersectional analyses. They then consider how paradigmatic pragmatism…

  1. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    NASA Astrophysics Data System (ADS)

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information brings negative effects, especially when the information is delayed. With accurate information, travelers prefer the route reported to be in the best condition, but delayed information reflects past rather than current traffic conditions; travelers therefore make wrong routing decisions, decreasing capacity, increasing oscillations, and driving the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR. When the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality improves efficiency in terms of capacity, oscillation, and the deviation from system equilibrium.
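
    The decision rule itself fits in a few lines. A toy sketch of the boundedly rational choice between two routes (our illustration; the paper's simulations embed this rule in a full two-route traffic model):

      import numpy as np

      rng = np.random.default_rng(4)
      BR = 2.0   # boundedly rational threshold (same units as travel time)

      def choose_route(reported_t1, reported_t2):
          """Return 0 or 1; indifferent below the BR threshold."""
          if abs(reported_t1 - reported_t2) < BR:
              return int(rng.integers(2))          # coin flip when indifferent
          return 0 if reported_t1 < reported_t2 else 1

      reports = rng.uniform(10.0, 20.0, size=(1000, 2))  # toy feedback values
      choices = np.array([choose_route(t1, t2) for t1, t2 in reports])
      print("fraction choosing route 0:", (choices == 0).mean())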

  2. Improving the Linkages between Air Pollution Epidemiology and Quantitative Risk Assessment

    PubMed Central

    Bell, Michelle L.; Walker, Katy; Hubbell, Bryan

    2011-01-01

    Background: Air pollution epidemiology plays an integral role in both identifying the hazards of air pollution as well as supplying the risk coefficients that are used in quantitative risk assessments. Evidence from both epidemiology and risk assessments has historically supported critical environmental policy decisions. The extent to which risk assessors can properly specify a quantitative risk assessment and characterize key sources of uncertainty depends in part on the availability, and clarity, of data and assumptions in the epidemiological studies. Objectives: We discuss the interests shared by air pollution epidemiology and risk assessment communities in ensuring that the findings of epidemiological studies are appropriately characterized and applied correctly in risk assessments. We highlight the key input parameters for risk assessments and consider how modest changes in the characterization of these data might enable more accurate risk assessments that better represent the findings of epidemiological studies. Discussion: We argue that more complete information regarding the methodological choices and input data used in epidemiological studies would support more accurate risk assessments—to the benefit of both disciplines. In particular, we suggest including additional details regarding air quality, demographic, and health data, as well as certain types of data-rich graphics. Conclusions: Relatively modest changes to the data reported in epidemiological studies will improve the quality of risk assessments and help prevent the misinterpretation and mischaracterization of the results of epidemiological studies. Such changes may also benefit epidemiologists undertaking meta-analyses. We suggest workshops as a way to improve the dialogue between the two communities. PMID:21816702

  3. Higher order accurate partial implicitization: An unconditionally stable fourth-order-accurate explicit numerical technique

    NASA Technical Reports Server (NTRS)

    Graves, R. A., Jr.

    1975-01-01

    The previously obtained second-order-accurate partial implicitization numerical technique used in the solution of fluid dynamic problems was modified with little complication to achieve fourth-order accuracy. The Von Neumann stability analysis demonstrated the unconditional linear stability of the technique. The order of the truncation error was deduced from the Taylor series expansions of the linearized difference equations and was verified by numerical solutions to Burgers' equation. For comparison, results were also obtained for Burgers' equation using a second-order-accurate partial-implicitization scheme, as well as the fourth-order scheme of Kreiss.

  4. EEG analyses with SOBI.

    SciTech Connect

    Glickman, Matthew R.; Tang, Akaysha

    2009-02-01

    The motivating vision behind Sandia's MENTOR/PAL LDRD project has been that of systems which use real-time psychophysiological data to support and enhance human performance, both individually and of groups. Relevant and significant psychophysiological data being a necessary prerequisite to such systems, this LDRD has focused on identifying and refining such signals. The project has focused in particular on EEG (electroencephalogram) data as a promising candidate signal because it (potentially) provides a broad window on brain activity with relatively low cost and logistical constraints. We report here on two analyses performed on EEG data collected in this project using the SOBI (Second Order Blind Identification) algorithm to identify two independent sources of brain activity: one in the frontal lobe and one in the occipital lobe. The first study looks at directional influences between the two components, while the second study looks at inferring gender based upon the frontal component.

  5. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful for elucidating protein function.

  6. Accurate measurement of streamwise vortices in low speed aerodynamic flows

    NASA Astrophysics Data System (ADS)

    Waldman, Rye M.; Kudo, Jun; Breuer, Kenneth S.

    2010-11-01

    Low Reynolds number experiments with flapping animals (such as bats and small birds) are of current interest in understanding biological flight mechanics, and due to their application to Micro Air Vehicles (MAVs) which operate in a similar parameter space. Previous PIV wake measurements have described the structures left by bats and birds, and provided insight into the time history of their aerodynamic force generation; however, these studies have faced difficulty drawing quantitative conclusions due to significant experimental challenges associated with the highly three-dimensional and unsteady nature of the flows, and the low wake velocities associated with lifting bodies that only weigh a few grams. This requires the high-speed resolution of small flow features in a large field of view using limited laser energy and finite camera resolution. Cross-stream measurements are further complicated by the high out-of-plane flow which requires thick laser sheets and short interframe times. To quantify and address these challenges, we present data from a model study on the wake behind a fixed wing at conditions comparable to those found in biological flight. We present a detailed analysis of the PIV wake measurements, discuss the criteria necessary for accurate measurements, and present a new dual-plane PIV configuration to resolve these issues.

  7. Accurate multiple network alignment through context-sensitive random walk

    PubMed Central

    2015-01-01

    Background Comparative network analysis can provide an effective means of analyzing large-scale biological networks and gaining novel insights into their structure and organization. Global network alignment aims to predict the best overall mapping between a given set of biological networks, thereby identifying important similarities as well as differences among the networks. It has been shown that network alignment methods can be used to detect pathways or network modules that are conserved across different networks. Until now, a number of network alignment algorithms have been proposed based on different formulations and approaches, many of them focusing on pairwise alignment. Results In this work, we propose a novel multiple network alignment algorithm based on a context-sensitive random walk model. The random walker employed in the proposed algorithm switches between two different modes, namely, an individual walk on a single network and a simultaneous walk on two networks. The switching decision is made in a context-sensitive manner by examining the current neighborhood, which is effective for quantitatively estimating the degree of correspondence between nodes that belong to different networks, in a manner that sensibly integrates node similarity and topological similarity. The resulting node correspondence scores are then used to predict the maximum expected accuracy (MEA) alignment of the given networks. Conclusions Performance evaluation based on synthetic networks as well as real protein-protein interaction networks shows that the proposed algorithm can construct more accurate multiple network alignments compared to other leading methods. PMID:25707987

  8. Raman Spectroscopy as an Accurate Probe of Defects in Graphene

    NASA Astrophysics Data System (ADS)

    Rodriguez-Nieva, Joaquin; Barros, Eduardo; Saito, Riichiro; Dresselhaus, Mildred

    2014-03-01

    Raman Spectroscopy has proved to be an invaluable non-destructive technique that allows us to obtain intrinsic information about graphene. Furthermore, defect-induced Raman features, namely the D and D' bands, have previously been used to assess the purity of graphitic samples. However, quantitative studies of the signatures of the different types of defects on the Raman spectra is still an open problem. Experimental results already suggest that the Raman intensity ratio ID /ID' may allow us to identify the nature of the defects. We study from a theoretical point of view the power and limitations of Raman spectroscopy in the study of defects in graphene. We derive an analytic model that describes the Double Resonance Raman process of disordered graphene samples, and which explicitly shows the role played by both the defect-dependent parameters as well as the experimentally-controlled variables. We compare our model with previous Raman experiments, and use it to guide new ways in which defects in graphene can be accurately probed with Raman spectroscopy. We acknowledge support from NSF grant DMR1004147.

  9. Rapid Accurate Identification of Bacterial and Viral Pathogens

    SciTech Connect

    Dunn, John

    2007-03-09

    The goals of this program were to develop two assays for rapid, accurate identification of pathogenic organisms at the strain level. The first assay, "Quantitative Genome Profiling" or QGP, is a real-time PCR assay with a restriction enzyme-based component. Its underlying concept is that certain enzymes should cleave genomic DNA at many sites and that in some cases these cuts will interrupt the connection on the genomic DNA between flanking PCR primer pairs, thereby eliminating selected PCR amplifications. When this occurs the appearance of the real-time PCR threshold (Ct) signal during DNA amplification is totally eliminated or, if cutting is incomplete, greatly delayed compared to an uncut control. This temporal difference in appearance of the Ct signal relative to undigested control DNA provides a rapid, high-throughput approach for DNA-based identification of different but closely related pathogens depending upon the nucleotide sequence of the target region. The second assay we developed uses the nucleotide sequence of pairs of short identifier tags (~21 bp) to identify DNA molecules. Subtle differences in linked tag pair combinations can also be used to distinguish between closely related isolates.

  10. Personalized Orthodontic Accurate Tooth Arrangement System with Complete Teeth Model.

    PubMed

    Cheng, Cheng; Cheng, Xiaosheng; Dai, Ning; Liu, Yi; Fan, Qilei; Hou, Yulin; Jiang, Xiaotong

    2015-09-01

    The accuracy and validity of tooth arrangement, and the lack of positional information relating dental root to jaw, are key problems in tooth arrangement technology. This paper aims to describe a newly developed virtual, personalized and accurate tooth arrangement system based on complete information about dental root and skull. Firstly, a feature constraint database of a 3D teeth model is established. Secondly, for computed simulation of tooth movement, the reference planes and lines are defined by the anatomical reference points. The matching mathematical model of teeth pattern and the principle of the specific pose transformation of rigid body are fully utilized. The relation of position between dental root and alveolar bone is considered during the design process. Finally, the relative pose relationships among various teeth are optimized using the object mover, and a personalized therapeutic schedule is formulated. Experimental results show that the virtual tooth arrangement system can arrange abnormal teeth very well and is sufficiently flexible. The relation of position between root and jaw is favorable. This newly developed system is characterized by high-speed processing and quantitative evaluation of the amount of 3D movement of an individual tooth.

  11. Does a pneumotach accurately characterize voice function?

    NASA Astrophysics Data System (ADS)

    Walters, Gage; Krane, Michael

    2016-11-01

    A study is presented which addresses how a pneumotach might adversely affect clinical measurements of voice function. A pneumotach is a device, typically a mask, worn over the mouth, in order to measure time-varying glottal volume flow. By measuring the time-varying difference in pressure across a known aerodynamic resistance element in the mask, the glottal volume flow waveform is estimated. Because it adds aerodynamic resistance to the vocal system, there is some concern that using a pneumotach may not accurately portray the behavior of the voice. To test this hypothesis, experiments were performed in a simplified airway model with the principal dimensions of an adult human upper airway. A compliant constriction, fabricated from silicone rubber, modeled the vocal folds. Variations of transglottal pressure, time-averaged volume flow, model vocal fold vibration amplitude, and radiated sound with subglottal pressure were performed, with and without the pneumotach in place, and differences noted. We acknowledge support of NIH Grant 2R01DC005642-10A1.

  12. Accurate thermoplasmonic simulation of metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Yu, Da-Miao; Liu, Yan-Nan; Tian, Fa-Lin; Pan, Xiao-Min; Sheng, Xin-Qing

    2017-01-01

    Thermoplasmonics leads to enhanced heat generation due to the localized surface plasmon resonances. The measurement of heat generation is fundamentally a complicated task, which necessitates the development of theoretical simulation techniques. In this paper, an efficient and accurate numerical scheme is proposed for applications with complex metallic nanostructures. Light absorption and temperature increase are, respectively, obtained by solving the volume integral equation (VIE) and the steady-state heat diffusion equation through the method of moments (MoM). Previously, methods based on surface integral equations (SIEs) were utilized to obtain light absorption. However, computing light absorption from the equivalent current is as expensive as O(NsNv), where Ns and Nv, respectively, denote the number of surface and volumetric unknowns. Our approach reduces the cost to O(Nv) by using VIE. The accuracy, efficiency and capability of the proposed scheme are validated by multiple simulations. The simulations show that our proposed method is more efficient than the approach based on SIEs under comparable accuracy, especially for cases where many incident waves are of interest. The simulations also indicate that the temperature profile can be tuned by several factors, such as the geometry configuration of the array, beam direction, and light wavelength.

  13. Accurate method for computing correlated color temperature.

    PubMed

    Li, Changjun; Cui, Guihua; Melgosa, Manuel; Ruan, Xiukai; Zhang, Yaoju; Ma, Long; Xiao, Kaida; Luo, M Ronnier

    2016-06-27

    For the correlated color temperature (CCT) of a light source to be estimated, a nonlinear optimization problem must be solved. In all previous methods available to compute CCT, the objective function has only been approximated, and their predictions have achieved limited accuracy. For example, different unacceptable CCT values have been predicted for light sources located on the same isotemperature line. In this paper, we propose to compute CCT using the Newton method, which requires the first and second derivatives of the objective function. Following the current recommendation by the International Commission on Illumination (CIE) for the computation of tristimulus values (summations at 1 nm steps from 360 nm to 830 nm), the objective function and its first and second derivatives are explicitly given and used in our computations. Comprehensive tests demonstrate that the proposed method, together with an initial estimation of CCT using Robertson's method [J. Opt. Soc. Am. 58, 1528-1535 (1968)], gives predictions accurate to within 0.0012 K for light sources with CCTs ranging from 500 K to 10^6 K.
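
    As a sketch of the idea (our notation, not the paper's): if f(T) is the objective, e.g. the squared distance in the CIE 1960 (u, v) plane between the test source and the Planckian-locus point at temperature T, the Newton update is

      T_{k+1} = T_k - \frac{f'(T_k)}{f''(T_k)},

    which is exactly why the paper supplies explicit first and second derivatives of the objective rather than an approximation to it.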

  14. Accurate Theoretical Thermochemistry for Fluoroethyl Radicals.

    PubMed

    Ganyecz, Ádám; Kállay, Mihály; Csontos, József

    2017-02-09

    An accurate coupled-cluster (CC) based model chemistry was applied to calculate reliable thermochemical quantities for hydrofluorocarbon derivatives including radicals 1-fluoroethyl (CH3-CHF), 1,1-difluoroethyl (CH3-CF2), 2-fluoroethyl (CH2F-CH2), 1,2-difluoroethyl (CH2F-CHF), 2,2-difluoroethyl (CHF2-CH2), 2,2,2-trifluoroethyl (CF3-CH2), 1,2,2,2-tetrafluoroethyl (CF3-CHF), and pentafluoroethyl (CF3-CF2). The model chemistry used contains iterative triple and perturbative quadruple excitations in CC theory, as well as scalar relativistic and diagonal Born-Oppenheimer corrections. To obtain heat of formation values with better than chemical accuracy, perturbative quadruple excitations and scalar relativistic corrections were indispensable. Their contributions to the heats of formation steadily increase with the number of fluorine atoms in the radical, reaching 10 kJ/mol for CF3-CF2. When discrepancies were found between the experimental and our values, it was always possible to resolve the issue by recalculating the experimental result with currently recommended auxiliary data. For each radical studied here, this study delivers the best available heat of formation as well as entropy data.

  15. Accurate, reliable prototype earth horizon sensor head

    NASA Technical Reports Server (NTRS)

    Schwarz, F.; Cohen, H.

    1973-01-01

    The design and performance of an accurate and reliable prototype earth sensor head (ARPESH) are described. The ARPESH employs a detection logic 'locator' concept and horizon sensor mechanization which should lead to high accuracy horizon sensing that is minimally degraded by spatial or temporal variations in sensing attitude from a satellite in orbit around the earth at altitudes of about 500 km. An accuracy of horizon location to within 0.7 km has been predicted, independent of meteorological conditions; this corresponds to an error of 0.015 deg at 500 km altitude. Laboratory evaluation of the sensor indicates that this accuracy is achieved. First, the basic operating principles of ARPESH are described; next, detailed design and construction data are presented; and finally, the performance of the sensor is reported under laboratory conditions in which the sensor is installed in a simulator that permits it to scan a blackbody source against a background representing the earth-space interface for various equivalent planet temperatures.

  16. Accurate methods for large molecular systems.

    PubMed

    Gordon, Mark S; Mullin, Jonathan M; Pruitt, Spencer R; Roskop, Luke B; Slipchenko, Lyudmila V; Boatz, Jerry A

    2009-07-23

    Three exciting new methods that address the accurate prediction of processes and properties of large molecular systems are discussed. The systematic fragmentation method (SFM) and the fragment molecular orbital (FMO) method both decompose a large molecular system (e.g., protein, liquid, zeolite) into small subunits (fragments) in very different ways that are designed to both retain the high accuracy of the chosen quantum mechanical level of theory while greatly reducing the demands on computational time and resources. Each of these methods is inherently scalable and is therefore eminently capable of taking advantage of massively parallel computer hardware while retaining the accuracy of the corresponding electronic structure method from which it is derived. The effective fragment potential (EFP) method is a sophisticated approach for the prediction of nonbonded and intermolecular interactions. Therefore, the EFP method provides a way to further reduce the computational effort while retaining accuracy by treating the far-field interactions in place of the full electronic structure method. The performance of the methods is demonstrated using applications to several systems, including benzene dimer, small organic species, pieces of the alpha helix, water, and ionic liquids.
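
    The flavor shared by these fragmentation schemes is the many-body expansion: total energy as a sum of fragment energies plus pairwise corrections. In the sketch below a toy pairwise Lennard-Jones function stands in for the quantum-chemical energy calls, so the two-body expansion is exact by construction; for real electronic-structure energies, the higher-order terms are what methods such as SFM and FMO are designed to capture or bound.

      import numpy as np
      from itertools import combinations

      def energy(coords):
          """Toy pairwise Lennard-Jones energy, standing in for a QM call."""
          e = 0.0
          for i, j in combinations(range(len(coords)), 2):
              r = np.linalg.norm(coords[i] - coords[j])
              e += 4.0 * ((1.0 / r) ** 12 - (1.0 / r) ** 6)
          return e

      fragments = [np.array([[0.0, 0.0, 0.0], [1.1, 0.0, 0.0]]),
                   np.array([[3.0, 0.0, 0.0], [4.1, 0.0, 0.0]]),
                   np.array([[0.0, 3.0, 0.0], [1.1, 3.0, 0.0]])]

      e_mono = [energy(f) for f in fragments]
      e_total = sum(e_mono)
      for i, j in combinations(range(len(fragments)), 2):
          e_dimer = energy(np.vstack([fragments[i], fragments[j]]))
          e_total += e_dimer - e_mono[i] - e_mono[j]     # two-body correction

      print(f"two-body fragmentation estimate: {e_total:.6f}")
      print(f"full-system reference:           {energy(np.vstack(fragments)):.6f}")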

  17. Accurate equilibrium structures for piperidine and cyclohexane.

    PubMed

    Demaison, Jean; Craig, Norman C; Groner, Peter; Écija, Patricia; Cocinero, Emilio J; Lesarri, Alberto; Rudolph, Heinz Dieter

    2015-03-05

    Extended and improved microwave (MW) measurements are reported for the isotopologues of piperidine. New ground state (GS) rotational constants are fitted to MW transitions with quartic centrifugal distortion constants taken from ab initio calculations. Predicate values for the geometric parameters of piperidine and cyclohexane are found from a high level of ab initio theory including adjustments for basis set dependence and for correlation of the core electrons. Equilibrium rotational constants are obtained from GS rotational constants corrected for vibration-rotation interactions and electronic contributions. Equilibrium structures for piperidine and cyclohexane are fitted by the mixed estimation method. In this method, structural parameters are fitted concurrently to predicate parameters (with appropriate uncertainties) and moments of inertia (with uncertainties). The new structures are regarded as being accurate to 0.001 Å and 0.2°. Comparisons are made between bond parameters in equatorial piperidine and cyclohexane. Another interesting result of this study is that a structure determination is an effective way to check the accuracy of the ground state experimental rotational constants.

  18. Accurate lineshape spectroscopy and the Boltzmann constant

    PubMed Central

    Truong, G.-W.; Anstie, J. D.; May, E. F.; Stace, T. M.; Luiten, A. N.

    2015-01-01

    Spectroscopy has an illustrious history delivering serendipitous discoveries and providing a stringent testbed for new physical predictions, including applications from trace materials detection, to understanding the atmospheres of stars and planets, and even constraining cosmological models. Reaching fundamental-noise limits permits optimal extraction of spectroscopic information from an absorption measurement. Here, we demonstrate a quantum-limited spectrometer that delivers high-precision measurements of the absorption lineshape. These measurements yield a very accurate measurement of the excited-state (6P1/2) hyperfine splitting in Cs, and reveals a breakdown in the well-known Voigt spectral profile. We develop a theoretical model that accounts for this breakdown, explaining the observations to within the shot-noise limit. Our model enables us to infer the thermal velocity dispersion of the Cs vapour with an uncertainty of 35 p.p.m. within an hour. This allows us to determine a value for Boltzmann's constant with a precision of 6 p.p.m., and an uncertainty of 71 p.p.m. PMID:26465085
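
    The link between the lineshape and Boltzmann's constant is the Doppler width: for a thermal vapour the Gaussian FWHM obeys dnu = nu0 * sqrt(8 kB T ln2 / (m c^2)), so kB follows from the fitted width. The sketch below inverts that relation; the width used is illustrative, not the paper's measured value.

      import math

      c = 299_792_458.0                       # speed of light, m/s
      m_cs = 132.905451961 * 1.660539e-27     # mass of Cs-133, kg
      T = 296.0                               # assumed cell temperature, K
      nu0 = 335.116e12                        # Cs D1 transition frequency, Hz

      dnu = 358.0e6                           # assumed measured Doppler FWHM, Hz

      kB = m_cs * c**2 * (dnu / nu0) ** 2 / (8.0 * math.log(2.0) * T)
      print(f"inferred kB = {kB:.4e} J/K")    # compare CODATA 1.380649e-23 J/K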

  19. Accurate upper body rehabilitation system using kinect.

    PubMed

    Sinha, Sanjana; Bhowmick, Brojeshwar; Chakravarty, Kingshuk; Sinha, Aniruddha; Das, Abhijit

    2016-08-01

    The growing importance of Kinect as a tool for clinical assessment and rehabilitation is due to its portability, low cost and markerless system for human motion capture. However, the accuracy of Kinect in measuring three-dimensional body joint center locations often fails to meet clinical standards of accuracy when compared to marker-based motion capture systems such as Vicon. The length of the body segment connecting any two joints, measured as the distance between three-dimensional Kinect skeleton joint coordinates, has been observed to vary with time. The orientation of the line connecting adjoining Kinect skeletal coordinates has also been seen to differ from the actual orientation of the physical body segment. Hence we have proposed an optimization method that utilizes Kinect Depth and RGB information to search for the joint center location that satisfies constraints on body segment length as well as orientation. An experimental study has been carried out on ten healthy participants performing upper body range of motion exercises. The results report a 72% reduction in body segment length variance and a 2° improvement in Range of Motion (ROM) angle, enabling more accurate measurements for upper limb exercises.

  20. Accurate, reproducible measurement of blood pressure.

    PubMed Central

    Campbell, N R; Chockalingam, A; Fodor, J G; McKay, D W

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine consumption, smoking and physical exertion within half an hour before measurement. The use of standardized techniques to measure blood pressure will help to avoid large systematic errors. Poor technique can account for differences in readings of more than 15 mm Hg and ultimately misdiagnosis. Most of the recommended procedures are simple and, when routinely incorporated into clinical practice, require little additional time. The equipment must be appropriate and in good condition. Physicians should have a suitable selection of cuff sizes readily available; the use of the correct cuff size is essential to minimize systematic errors in blood pressure measurement. Semiannual calibration of aneroid sphygmomanometers and annual inspection of mercury sphygmomanometers and blood pressure cuffs are recommended. We review the methods recommended for measuring blood pressure and discuss the factors known to produce large differences in blood pressure readings. PMID:2192791

  1. Fast and accurate exhaled breath ammonia measurement.

    PubMed

    Solga, Steven F; Mudalel, Matthew L; Spacek, Lisa A; Risby, Terence H

    2014-06-11

    This exhaled breath ammonia method uses a fast and highly sensitive spectroscopic method known as quartz enhanced photoacoustic spectroscopy (QEPAS) that uses a quantum cascade based laser. The monitor is coupled to a sampler that measures mouth pressure and carbon dioxide. The system is temperature controlled and specifically designed to address the reactivity of this compound. The sampler provides immediate feedback to the subject and the technician on the quality of the breath effort. Together with the quick response time of the monitor, this system is capable of accurately measuring exhaled breath ammonia representative of deep lung systemic levels. Because the system is easy to use and produces real time results, it has enabled experiments to identify factors that influence measurements. For example, mouth rinse and oral pH reproducibly and significantly affect results and therefore must be controlled. Temperature and mode of breathing are other examples. As our understanding of these factors evolves, error is reduced, and clinical studies become more meaningful. This system is very reliable and individual measurements are inexpensive. The sampler is relatively inexpensive and quite portable, but the monitor is neither. This limits options for some clinical studies and provides a rationale for future innovations.

  2. Accurate Fission Data for Nuclear Safety

    NASA Astrophysics Data System (ADS)

    Solders, A.; Gorelov, D.; Jokinen, A.; Kolhinen, V. S.; Lantz, M.; Mattera, A.; Penttilä, H.; Pomp, S.; Rakopoulos, V.; Rinta-Antila, S.

    2014-05-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high current light ion cyclotron at the University of Jyväskylä. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron induced fission yields. Thanks to a series of mass separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies 1 - 30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons for benchmarking and to study the energy dependence of fission yields. The scientific program is extensive and is planned to start in 2013 with a measurement of isomeric yield ratios of proton induced fission in uranium. This will be followed by studies of independent yields of thermal and fast neutron induced fission of various actinides.

  3. Accurate simulation of optical properties in dyes.

    PubMed

    Jacquemin, Denis; Perpète, Eric A; Ciofini, Ilaria; Adamo, Carlo

    2009-02-17

    Since Antiquity, humans have produced and commercialized dyes. To this day, extraction of natural dyes often requires lengthy and costly procedures. In the 19th century, global markets and new industrial products drove a significant effort to synthesize artificial dyes, characterized by low production costs, huge quantities, and new optical properties (colors). Dyes that encompass classes of molecules absorbing in the UV-visible part of the electromagnetic spectrum now have a wider range of applications, including coloring (textiles, food, paintings), energy production (photovoltaic cells, OLEDs), or pharmaceuticals (diagnostics, drugs). Parallel to the growth in dye applications, researchers have increased their efforts to design and synthesize new dyes to customize absorption and emission properties. In particular, dyes containing one or more metallic centers allow for the construction of fairly sophisticated systems capable of selectively reacting to light of a given wavelength and behaving as molecular devices (photochemical molecular devices, PMDs). Theoretical tools able to predict and interpret the excited-state properties of organic and inorganic dyes allow for an efficient screening of photochemical centers. In this Account, we report recent developments defining a quantitative ab initio protocol (based on time-dependent density functional theory) for modeling dye spectral properties. In particular, we discuss the importance of several parameters, such as the methods used for electronic structure calculations, solvent effects, and statistical treatments. In addition, we illustrate the performance of such simulation tools through case studies. We also comment on current weak points of these methods and ways to improve them.

  4. Accurate Automated Apnea Analysis in Preterm Infants

    PubMed Central

    Vergales, Brooke D.; Paget-Brown, Alix O.; Lee, Hoshik; Guin, Lauren E.; Smoot, Terri J.; Rusin, Craig G.; Clark, Matthew T.; Delos, John B.; Fairchild, Karen D.; Lake, Douglas E.; Moorman, Randall; Kattwinkel, John

    2017-01-01

    Objective In 2006 the apnea of prematurity (AOP) consensus group identified inaccurate counting of apnea episodes as a major barrier to progress in AOP research. We compare nursing records of AOP to events detected by a clinically validated computer algorithm that detects apnea from standard bedside monitors. Study Design Waveform, vital sign, and alarm data were collected continuously from all very low-birth-weight infants admitted over a 25-month period, analyzed for central apnea, bradycardia, and desaturation (ABD) events, and compared with nursing documentation collected from charts. Our algorithm defined apnea as a respiratory pause > 10 seconds accompanied by bradycardia and desaturation. Results Of the 3,019 nurse-recorded events, only 68% had any algorithm-detected ABD event. Of the 5,275 algorithm-detected prolonged apnea events > 30 seconds, only 26% had nurse-recorded documentation within 1 hour. Monitor alarms sounded in only 74% of algorithm-detected prolonged apnea events > 10 seconds. There were 8,190,418 monitor alarms of any description throughout the neonatal intensive care unit during the 747 days analyzed, or one alarm every 2 to 3 minutes per nurse. Conclusion An automated computer algorithm for continuous ABD quantitation is a far more reliable tool than the medical record to address the important research questions identified by the 2006 AOP consensus group. PMID:23592319
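
    As a toy illustration of that event definition, the Python sketch below flags samples of synchronized vital-sign streams where a long respiratory pause coincides with bradycardia and desaturation; the threshold values and stream names are assumptions, not the validated algorithm's parameters.

        # Hedged sketch: flag samples where a respiratory pause > 10 s coincides
        # with bradycardia and desaturation (all thresholds are assumptions).
        import numpy as np

        def flag_abd_events(pause_s, heart_rate, spo2,
                            min_pause=10.0, brady_bpm=100, desat_pct=80):
            """Indices where a long pause, bradycardia and desaturation co-occur."""
            return np.flatnonzero((pause_s > min_pause)
                                  & (heart_rate < brady_bpm)
                                  & (spo2 < desat_pct))

        # toy 1 Hz data: one qualifying event at index 3
        print(flag_abd_events(np.array([0.0, 4.0, 8.0, 12.0, 12.0]),
                              np.array([150, 140, 120, 90, 95]),
                              np.array([98, 96, 88, 75, 82])))  # -> [3]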

  5. Indian Ocean analyses

    NASA Technical Reports Server (NTRS)

    Meyers, Gary

    1992-01-01

    The background and goals of Indian Ocean thermal sampling are discussed from the perspective of a national project which has research goals relevant to variation of climate in Australia. The critical areas of SST variation are identified. The first goal of thermal sampling at this stage is to develop a climatology of thermal structure in these areas and a description of the annual variation of major currents. The sampling strategy is reviewed. Dense XBT sampling is required to achieve accurate monthly maps of isotherm depth because of the high level of noise in the measurements caused by aliasing of small scale variation. In the Indian Ocean ship routes dictate where adequate sampling can be achieved. An efficient sampling rate on available routes is determined based on objective analysis. The statistical structure required for objective analysis is described and compared at 95 locations in the tropical Pacific and 107 in the tropical Indian Ocean. XBT data management and quality control methods at CSIRO are reviewed. Results on the mean and annual variation of temperature and baroclinic structure in the South Equatorial Current and Pacific/Indian Ocean Throughflow are presented for the region between northwest Australia and Java-Timor. The mean relative geostrophic transport (0/400 db) of the Throughflow is approximately 5 × 10^6 m^3/s. A nearly equal volume transport is associated with the reference velocity at 400 db. The Throughflow feeds the South Equatorial Current, which has maximum westward flow in August/September, at the end of the southeasterly Monsoon season. A strong semiannual oscillation in the South Java Current is documented. The results are in good agreement with the Semtner and Chervin (1988) ocean general circulation model. The talk concludes with comments on data inadequacies (insufficient coverage, timeliness) particular to the Indian Ocean and suggestions on the future role that can be played by Data Centers, particularly with regard to quality

  6. Using Microsoft Office Excel 2007 to conduct generalized matching analyses.

    PubMed

    Reed, Derek D

    2009-01-01

    The generalized matching equation is a robust and empirically supported means of analyzing relations between reinforcement and behavior. Unfortunately, no simple task analysis is available to behavior analysts interested in using the matching equation to evaluate data in clinical or applied settings. This technical article presents a task analysis for the use of Microsoft Excel to analyze and plot the generalized matching equation. Using a data-based case example and a step-by-step guide for completing the analysis, these instructions are intended to promote the use of quantitative methods by researchers with little or no experience in quantitative analysis or the matching law.
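
    The underlying fit is a straight line in log-log coordinates, log(B1/B2) = a·log(R1/R2) + log b, with slope a (sensitivity) and intercept log b (bias). A minimal Python equivalent of the spreadsheet analysis, with invented data, might look like:

        # Fit the generalized matching equation log(B1/B2) = a*log(R1/R2) + log(b)
        # by ordinary least squares. Data values are invented for illustration.
        import numpy as np

        B_ratio = np.array([0.5, 0.9, 1.8, 3.2, 6.0])   # response ratios B1/B2
        R_ratio = np.array([0.4, 0.8, 2.0, 4.0, 8.0])   # reinforcement ratios R1/R2

        a, log_b = np.polyfit(np.log10(R_ratio), np.log10(B_ratio), 1)
        print(f"sensitivity a = {a:.2f}, bias b = {10**log_b:.2f}")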

  7. RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS

    PubMed Central

    Craft, George E; Chen, Anshu; Nairn, Angus C

    2014-01-01

    The field of proteomics is undergoing rapid development in a number of different areas including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and/or improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to the area of neuroproteomics, but the central nervous system poses many specific challenges in terms of quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights the recent advances that have been made in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function, to neuropsychiatric disorders including schizophrenia and drug addiction, and to neurodegenerative diseases including Parkinson’s disease and Alzheimer’s disease. While older methods such as two-dimensional polyacrylamide electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (MRM, SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). The biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, it appears that the quality and depth of the more recent quantitative proteomics studies is beginning to

  8. Recapturing Quantitative Biology.

    ERIC Educational Resources Information Center

    Pernezny, Ken; And Others

    1996-01-01

    Presents a classroom activity on estimating animal populations. Uses shoe boxes and candies to emphasize the importance of mathematics in biology while introducing the methods of quantitative ecology. (JRH)

  9. Quantitative receptor autoradiography

    SciTech Connect

    Boast, C.A.; Snowhill, E.W.; Altar, C.A.

    1986-01-01

    Quantitative receptor autoradiography addresses technical and scientific advances in the field of quantitative autoradiography. The volume opens with an overview of the field from a historical and critical perspective. Following is a detailed discussion of in vitro data obtained from a variety of neurotransmitter systems. The next section explores applications of autoradiography, and the final two chapters consider experimental models. Methodological considerations are emphasized, including the use of computers for image analysis.

  10. Accurate orbit propagation with planetary close encounters

    NASA Astrophysics Data System (ADS)

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

    We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature is lacking on this topic and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging, both in terms of the dynamical stability of the formulation and the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, formulation and initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and of the propagator represented by a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  11. Important Nearby Galaxies without Accurate Distances

    NASA Astrophysics Data System (ADS)

    McQuinn, Kristen

    2014-10-01

    The Spitzer Infrared Nearby Galaxies Survey (SINGS) and its offspring programs (e.g., THINGS, HERACLES, KINGFISH) have resulted in a fundamental change in our view of star formation and the ISM in galaxies, and together they represent the most complete multi-wavelength data set yet assembled for a large sample of nearby galaxies. These great investments of observing time have been dedicated to the goal of understanding the interstellar medium, the star formation process, and, more generally, galactic evolution at the present epoch. Nearby galaxies provide the basis on which we interpret the distant universe, and the SINGS sample represents the best studied nearby galaxies. Accurate distances are fundamental to interpreting observations of galaxies. Surprisingly, many of the SINGS spiral galaxies have numerous distance estimates, resulting in confusion. We can rectify this situation for 8 of the SINGS spiral galaxies within 10 Mpc at a very low cost through measurements of the tip of the red giant branch. The proposed observations will provide an accuracy of better than 0.1 in distance modulus. Our sample includes such well known galaxies as M51 (the Whirlpool), M63 (the Sunflower), M104 (the Sombrero), and M74 (the archetypal grand design spiral). We are also proposing coordinated parallel WFC3 UV observations of the central regions of the galaxies, rich with high-mass UV-bright stars. As a secondary science goal we will compare the resolved UV stellar populations with integrated UV emission measurements used in calibrating star formation rates. Our observations will complement the growing HST UV atlas of high resolution images of nearby galaxies.

  12. Accurate paleointensities - the multi-method approach

    NASA Astrophysics Data System (ADS)

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic times) have seen significant improvements and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated, although the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, so the actual field strength at the time of cooling is reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method, but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units, an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  13. Accurate glucose detection in a small etalon

    NASA Astrophysics Data System (ADS)

    Martini, Joerg; Kuebler, Sebastian; Recht, Michael; Torres, Francisco; Roe, Jeffrey; Kiesel, Peter; Bruce, Richard

    2010-02-01

    We are developing a continuous glucose monitor for subcutaneous long-term implantation. This detector contains a double chamber Fabry-Perot etalon that measures the differential refractive index (RI) between a reference and a measurement chamber at 850 nm. The etalon chambers have wavelength dependent transmission maxima which depend linearly on the RI of their contents. An RI difference of Δn = 1.5×10^-6 changes the spectral position of a transmission maximum by 1 pm in our measurement. By sweeping the wavelength of a single-mode Vertical-Cavity Surface-Emitting Laser (VCSEL) linearly in time and detecting the maximum transmission peaks of the etalon we are able to measure the RI of a liquid. We have demonstrated an accuracy of Δn = ±3.5×10^-6 over a Δn range of 0 to 1.75×10^-4 and an accuracy of 2% over a Δn range of 1.75×10^-4 to 9.8×10^-4. The accuracy is primarily limited by the reference measurement. The RI difference between the etalon chambers is made specific to glucose by the competitive, reversible release of Concanavalin A (ConA) from an immobilized dextran matrix. The matrix, and the ConA bound to it, is positioned outside the optical detection path. ConA is released from the matrix by reacting with glucose and diffuses into the optical path to change the RI in the etalon. Factors such as temperature affect the RI in the measurement and reference chambers equally and so do not affect the differential measurement. A typical standard deviation in RI is ±1.4×10^-6 over the range 32°C to 42°C. The detector enables an accurate glucose specific concentration measurement.
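
    The quoted sensitivity (Δn = 1.5×10^-6 per picometre of peak shift) makes the conversion from a measured transmission-peak shift to a differential RI a one-line calculation, as the Python fragment below restates.

        # Convert a measured transmission-peak shift (pm) into a differential
        # refractive index, using the sensitivity quoted in the abstract.
        SENSITIVITY_PER_PM = 1.5e-6   # RI units per picometre of peak shift

        def delta_n(peak_shift_pm):
            return peak_shift_pm * SENSITIVITY_PER_PM

        print(delta_n(10.0))  # a 10 pm shift corresponds to delta-n = 1.5e-05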

  14. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Astrophysics Data System (ADS)

    Wheeler, K.; Knuth, K.; Castle, P.

    2005-12-01

    and IKONOS imagery and the 3-D volume estimates. The combination of these then allows for a rapid and hopefully very accurate estimation of biomass.

  15. How flatbed scanners upset accurate film dosimetry

    NASA Astrophysics Data System (ADS)

    van Battum, L. J.; Huizenga, H.; Verdaasdonk, R. M.; Heukelom, S.

    2016-01-01

    Film is an excellent dosimeter for verification of dose distributions due to its high spatial resolution. Irradiated film can be digitized with low-cost, transmission, flatbed scanners. However, a disadvantage is their lateral scan effect (LSE): a scanner readout change over its lateral scan axis. Although anisotropic light scattering has been presented as the origin of the LSE, this paper presents an alternative cause. To this end, the LSE of two flatbed scanners (Epson 1680 Expression Pro and Epson 10000XL) with Gafchromic film (EBT, EBT2, EBT3) was investigated, focusing on three effects: cross talk, optical path length and polarization. Cross talk was examined using triangular sheets of various optical densities. The optical path length effect was studied using absorptive and reflective neutral density filters with well-defined optical characteristics (OD range 0.2-2.0). Linear polarizer sheets were used to investigate light polarization on the CCD signal in the absence and presence of (un)irradiated Gafchromic film. Film dose values ranged from 0.2 to 9 Gy, i.e. an optical density range of 0.25 to 1.1. Measurements were performed in the scanner’s transmission mode, with red-green-blue channels. The LSE was found to depend on scanner construction and film type. Its magnitude depends on dose: for 9 Gy it increases up to 14% at the maximum lateral position. Cross talk was only significant in high contrast regions, up to 2% for very small fields. The optical path length effect introduced by film on the scanner causes a 3% change for pixels at the extreme lateral position. Light polarization due to the film and the scanner’s optical mirror system is the main contributor, different in magnitude for the red, green and blue channels. We conclude that any Gafchromic EBT type film scanned with a flatbed scanner will face these optical effects. Accurate dosimetry therefore requires correction of the LSE, and hence determination of the LSE per color channel and per dose delivered to the film.
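
    One plausible realization of that correction is a per-channel lookup table of relative readout error versus lateral position and optical density, built from calibration strips and interpolated at scan time. The Python sketch below is such an assumed scheme with an invented error surface, not the authors' procedure.

        # Hedged sketch: correct one color channel for the LSE with an
        # interpolated calibration table. The error surface is invented;
        # a real table would come from calibration scans of known ODs.
        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        lateral = np.linspace(-15, 15, 7)            # lateral position (cm)
        od = np.array([0.25, 0.5, 0.8, 1.1])         # optical density grid
        rel_error = np.outer((lateral / 15.0) ** 2, od / 1.1) * 0.14  # up to ~14%

        gain = RegularGridInterpolator((lateral, od), 1.0 / (1.0 + rel_error))

        def corrected_od(raw_od, x_cm):
            """Rescale a raw OD reading taken at lateral position x_cm."""
            return raw_od * gain([[x_cm, raw_od]])[0]

        print(corrected_od(1.1, 15.0))               # extreme lateral position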

  16. Towards Accurate Application Characterization for Exascale (APEX)

    SciTech Connect

    Hammond, Simon David

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia’s production users/developers.

  17. Quantitative photoacoustic tomography based on the radiative transfer equation.

    PubMed

    Yao, Lei; Sun, Yao; Jiang, Huabei

    2009-06-15

    We describe a method for quantitative photoacoustic tomography (PAT) based on the radiative transfer equation (RTE) coupled with the Helmholtz photoacoustic wave equation. This RTE-based quantitative PAT allows for accurate recovery of absolute absorption coefficient images of heterogeneous media and provides significantly improved image reconstruction for the cases where the photon diffusion approximation may fail. The method and associated finite element reconstruction algorithm are validated using a series of tissuelike phantom experiments.

  18. Motor equivalence during multi-finger accurate force production

    PubMed Central

    Mattos, Daniela; Schöner, Gregor; Zatsiorsky, Vladimir M.; Latash, Mark L.

    2014-01-01

    We explored the stability of multi-finger cyclical accurate force production by analyzing responses to small perturbations applied to one of the fingers and by inter-cycle analysis of variance. Healthy subjects performed two versions of the cyclical task, with and without an explicit target. The “inverse piano” apparatus was used to lift/lower a finger by 1 cm over 0.5 s; the subjects were always instructed to perform the task as accurately as they could at all times. Deviations in the spaces of finger forces and modes (hypothetical commands to individual fingers) were quantified in directions that did not change total force (motor equivalent) and in directions that changed the total force (non-motor equivalent). Motor equivalent deviations started immediately with the perturbation and increased progressively with time. After a sequence of lifting-lowering perturbations leading back to the initial conditions, motor equivalent deviations were dominating. These phenomena were less pronounced for the analysis performed with respect to the total moment of force about an axis parallel to the forearm/hand. Analysis of inter-cycle variance showed consistently higher variance in the subspace that did not change the total force as compared to the variance that affected total force. We interpret the results as reflections of task-specific stability of the redundant multi-finger system. Large motor equivalent deviations suggest that reactions of the neuromotor system to a perturbation involve large changes of neural commands that do not affect salient performance variables, even during actions with the purpose to correct those salient variables. Consistency between the analyses of motor equivalence and the variance analysis provides additional support for the idea of task-specific stability ensured at a neural level. PMID:25344311
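
    That decomposition has a compact linear-algebra form: with total force F = Jf and J = [1, 1, 1, 1], a deviation of the four finger forces splits into a component in the nullspace of J (motor equivalent) and an orthogonal component that changes F. A minimal numpy sketch with an invented deviation vector:

        # Split a finger-force deviation into motor-equivalent (total force
        # unchanged) and non-motor-equivalent components.
        import numpy as np

        J = np.ones((1, 4))                    # total force = J @ f
        d = np.array([0.6, -0.2, 0.1, -0.1])   # example deviation vector

        d_nme = J.T @ np.linalg.pinv(J.T) @ d  # component that changes total force
        d_me = d - d_nme                       # motor-equivalent component
        print(d_me, J @ d_me)                  # J @ d_me is ~0 by construction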

  19. Braking of fast and accurate elbow flexions in the monkey.

    PubMed Central

    Flament, D; Hore, J; Vilis, T

    1984-01-01

    The processes responsible for braking fast and accurate elbow movements were studied in the monkey. The movements studied were made over different amplitudes and against different inertias. All were made to the same end position. Only fast movements that showed the typical biphasic or triphasic pattern of activity in agonists and antagonists were analysed in detail. For movements made over different amplitudes and at different velocities there was symmetry between the acceleration and deceleration phases of the movements. For movements of the same amplitude performed at different velocities there was a direct linear relation between peak velocity and both the peak acceleration (and integrated agonist burst) and peak deceleration (and integrated antagonist burst). The slopes of these relations and their intercept with the peak velocity axis were a function of movement amplitude. This was such that for large and small movements of the same peak velocity and the same end position (i) peak acceleration and phasic agonist activity were larger for the small movements and (ii) peak deceleration and phasic antagonist activity were larger for the small movements. The slope of these relations and the symmetry between acceleration and deceleration were not affected by the addition of an inertial load to the handle held by the monkey. The results indicate that fast and accurate elbow movements in the monkey are braked by antagonist activity that is centrally programmed. As all movements were made to the same end position, the larger antagonist burst in small movements, made at the same peak velocity as large movements, cannot be due to differences in the viscoelastic contribution to braking (cf. Marsden, Obeso & Rothwell, 1983). (ABSTRACT TRUNCATED AT 250 WORDS) PMID:6737291

  20. Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good are the Datas?

    PubMed Central

    Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.

    2005-01-01

    Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide qualitative and quantitative food composition data necessary for high quality epidemiological and clinical research. Measurements for flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still being popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, estimations of polyphenol content as well as antioxidant activity are also reported with values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. Qualitative information regarding proanthocyanidin structure has been determined by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques at present. The lack of appropriate standards is the single most important factor that limits the aforementioned analyses. However, with ever expanding research in the arena of flavanols, proanthocyanidins, and health and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for selected flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597

  1. ICSH recommendations for assessing automated high-performance liquid chromatography and capillary electrophoresis equipment for the quantitation of HbA2.

    PubMed

    Stephens, A D; Colah, R; Fucharoen, S; Hoyer, J; Keren, D; McFarlane, A; Perrett, D; Wild, B J

    2015-10-01

    Automated high performance liquid chromatography and capillary electrophoresis are used to quantitate the proportion of hemoglobin A2 (HbA2) in blood samples in order to enable screening and diagnosis of carriers of β-thalassemia. Since there is only a very small difference in HbA2 levels between people who are carriers and people who are not, such analyses need to be both precise and accurate. This paper examines the different parameters of such equipment and discusses how they should be assessed.

  2. Quantitative analysis of retinal OCT.

    PubMed

    Sonka, Milan; Abràmoff, Michael D

    2016-10-01

    Clinical acceptance of 3-D OCT retinal imaging brought rapid development of quantitative 3-D analysis of retinal layers, vasculature and retinal lesions, and facilitated new research in retinal diseases. One of the cornerstones of many such analyses is segmentation and thickness quantification of retinal layers and the choroid, with an inherently 3-D simultaneous multi-layer LOGISMOS (Layered Optimal Graph Image Segmentation for Multiple Objects and Surfaces) segmentation approach being extremely well suited for the task. Once retinal layers are segmented, regional thickness, brightness, or texture-based indices of individual layers can be easily determined and thus contribute to our understanding of retinal or optic nerve head (ONH) disease processes, and can be employed for determination of disease status, treatment responses, visual function, etc. Out of many applications, the examples provided in this paper focus on image-guided therapy and outcome prediction in age-related macular degeneration and on assessing visual function from retinal layer structure in glaucoma.

  3. 2D map projections for visualization and quantitative analysis of 3D fluorescence micrographs

    PubMed Central

    Sendra, G. Hernán; Hoerth, Christian H.; Wunder, Christian; Lorenz, Holger

    2015-01-01

    We introduce Map3-2D, a freely available software to accurately project up to five-dimensional (5D) fluorescence microscopy image data onto full-content 2D maps. Similar to the Earth’s projection onto cartographic maps, Map3-2D unfolds surface information from a stack of images onto a single, structurally connected map. We demonstrate its applicability for visualization and quantitative analyses of spherical and uneven surfaces in fixed and dynamic live samples by using mammalian and yeast cells, and giant unilamellar vesicles. Map3-2D software is available at http://www.zmbh.uni-heidelberg.de//Central_Services/Imaging_Facility/Map3-2D.html. PMID:26208256

  4. How accurate are Scottish cancer registration data?

    PubMed Central

    Brewster, D.; Crichton, J.; Muir, C.

    1994-01-01

    In order to assess the accuracy of Scottish cancer registration data, a random sample of 2,200 registrations, attributed to the year 1990, was generated. Relevant medical records were available for review in 2,021 (92%) cases. Registration details were reabstracted from available records and compared with data in the registry. Discrepancies in identifying items of data (surname, forename, sex and date of birth) were found in 3.5% of cases. Most were trivial and would not disturb record linkage. Discrepancy rates of 7.1% in post code of residence at the time of diagnosis (excluding differences arising through boundary changes), 11.0% in anniversary date (excluding differences of 6 weeks or less), 7.7% in histological verification status, 5.4% in ICD-9 site codes (the first three digits) and 14.5% in ICD-O morphology codes (excluding 'inferred' morphology codes) were recorded. Overall, serious discrepancies were judged to have occurred in 2.8% of cases. In many respects, therefore, Scottish cancer registration data show a high level of accuracy that compares favourably to the reported accuracy of the few other cancer registries undertaking such analyses. PMID:7947104

  5. Accurate detection of differential RNA processing

    PubMed Central

    Drewe, Philipp; Stegle, Oliver; Hartmann, Lisa; Kahles, André; Bohnert, Regina; Wachter, Andreas; Borgwardt, Karsten; Rätsch, Gunnar

    2013-01-01

    Deep transcriptome sequencing (RNA-Seq) has become a vital tool for studying the state of cells in the context of varying environments, genotypes and other factors. RNA-Seq profiling data enable identification of novel isoforms, quantification of known isoforms and detection of changes in transcriptional or RNA-processing activity. Existing approaches to detect differential isoform abundance between samples either require a complete isoform annotation or fall short in providing statistically robust and calibrated significance estimates. Here, we propose a suite of statistical tests to address these open needs: a parametric test that uses known isoform annotations to detect changes in relative isoform abundance and a non-parametric test that detects differential read coverages and can be applied when isoform annotations are not available. Both methods account for the discrete nature of read counts and the inherent biological variability. We demonstrate that these tests compare favorably to previous methods, both in terms of accuracy and statistical calibrations. We use these techniques to analyze RNA-Seq libraries from Arabidopsis thaliana and Drosophila melanogaster. The identified differential RNA processing events were consistent with RT–qPCR measurements and previous studies. The proposed toolkit is available from http://bioweb.me/rdiff and enables in-depth analyses of transcriptomes, with or without available isoform annotation. PMID:23585274

  6. Towards quantitative analysis of retinal features in optical coherence tomography.

    PubMed

    Baroni, Maurizio; Fortunato, Pina; La Torre, Agostino

    2007-05-01

    The purpose of this paper was to propose a new computer method for quantitative evaluation of representative features of the retina using optical coherence tomography (OCT). A multi-step approach was devised and positively tested for segmentation of the three main retinal layers: the vitreo-retinal interface and the inner and outer retina. Following a preprocessing step, three regions of interest were delimited. Significant peaks corresponding to high and low intensity strips were located along the OCT A-scan lines and accurate boundaries between different layers were obtained by maximizing an edge likelihood function. For a quantitative description, thickness measurement, densitometry, texture and curvature analyses were performed. As a first application, the effect of intravitreal injection of triamcinolone acetonide (IVTA) for the treatment of vitreo-retinal interface syndrome was evaluated. Almost all the parameters, measured on a set of 16 pathologic OCT images, were statistically different before and after IVTA injection (p<0.05). Shape analysis of the internal limiting membrane confirmed the reduction of the pathological traction state. Other significant parameters, such as reflectivity and texture contrast, exhibited relevant changes both at the vitreo-retinal interface and in the inner retinal layers. Texture parameters in the inner and outer retinal layers significantly correlated with the visual acuity restoration. According to these findings an IVTA injection might be considered a possible alternative to surgery for selected patients. In conclusion, the proposed approach appeared to be a promising tool for the investigation of tissue changes produced by pathology and/or therapy.
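
    As an illustrative sketch of the per-A-scan step (in Python), the fragment below locates intensity peaks along a synthetic A-scan and takes the point of maximum gradient magnitude as the strongest boundary candidate; the simple gradient stands in for the paper's edge-likelihood function, which is not reproduced here.

        # Locate candidate layers (intensity peaks) and the strongest edge
        # along one synthetic OCT A-scan.
        import numpy as np
        from scipy.signal import find_peaks

        z = np.linspace(0, 1, 200)
        ascan = (np.exp(-((z - 0.3) / 0.02) ** 2)
                 + 0.5 * np.exp(-((z - 0.7) / 0.05) ** 2))

        peaks, _ = find_peaks(ascan, prominence=0.05)      # high-intensity strips
        edge = int(np.argmax(np.abs(np.gradient(ascan))))  # boundary candidate
        print(peaks, edge)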

  7. Simple Mathematical Models Do Not Accurately Predict Early SIV Dynamics

    PubMed Central

    Noecker, Cecilia; Schaefer, Krista; Zaccheo, Kelly; Yang, Yiding; Day, Judy; Ganusov, Vitaly V.

    2015-01-01

    Upon infection of a new host, human immunodeficiency virus (HIV) replicates in the mucosal tissues and is generally undetectable in circulation for 1–2 weeks post-infection. Several interventions against HIV including vaccines and antiretroviral prophylaxis target virus replication at this earliest stage of infection. Mathematical models have been used to understand how HIV spreads from mucosal tissues systemically and what impact vaccination and/or antiretroviral prophylaxis has on viral eradication. Because predictions of such models have been rarely compared to experimental data, it remains unclear which processes included in these models are critical for predicting early HIV dynamics. Here we modified the “standard” mathematical model of HIV infection to include two populations of infected cells: cells that are actively producing the virus and cells that are transitioning into virus production mode. We evaluated the effects of several poorly known parameters on infection outcomes in this model and compared model predictions to experimental data on infection of non-human primates with variable doses of simian immunodeficiency virus (SIV). First, we found that the mode of virus production by infected cells (budding vs. bursting) has a minimal impact on the early virus dynamics for a wide range of model parameters, as long as the parameters are constrained to provide the observed rate of SIV load increase in the blood of infected animals. Interestingly and in contrast with previous results, we found that the bursting mode of virus production generally results in a higher probability of viral extinction than the budding mode of virus production. Second, this mathematical model was not able to accurately describe the change in experimentally determined probability of host infection with increasing viral doses. Third and finally, the model was also unable to accurately explain the decline in the time to virus detection with increasing viral dose. These results
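
    A minimal version of the modified model reads as four ODEs: target cells T, eclipse-phase infected cells I1, virus-producing cells I2 and free virus V. The Python sketch below uses illustrative parameter values, not the study's fits.

        # dT/dt  = -beta*T*V
        # dI1/dt =  beta*T*V - k*I1    (cells entering the eclipse phase)
        # dI2/dt =  k*I1 - delta*I2    (transition to virus production)
        # dV/dt  =  p*I2 - c*V
        from scipy.integrate import solve_ivp

        beta, k, delta, p, c = 1e-7, 1.0, 0.5, 1e3, 10.0  # illustrative values

        def rhs(t, y):
            T, I1, I2, V = y
            return [-beta*T*V, beta*T*V - k*I1, k*I1 - delta*I2, p*I2 - c*V]

        sol = solve_ivp(rhs, (0, 30), [1e7, 0, 0, 1e-3])
        print(sol.y[3, -1])   # free virus after 30 days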

  8. Bayesian B-spline mapping for dynamic quantitative traits.

    PubMed

    Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong

    2012-04-01

    Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expressions in the RR framework, B-splines have proven successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms interval mapping based on the maximum likelihood; (2) for a simulated dataset with a complicated growth curve simulated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered; and (3) for a simulated dataset using Legendre polynomials, the Bayesian B-spline mapping can find the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-splines in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.
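
    The B-spline ingredient amounts to a design matrix of basis functions evaluated at the measurement times, so that a time-varying QTL effect is the matrix times a coefficient vector. A scipy sketch with an arbitrary knot choice:

        # Cubic B-spline design matrix over measurement times; knot placement
        # is an illustrative choice, not the paper's.
        import numpy as np
        from scipy.interpolate import BSpline

        times = np.linspace(0, 10, 25)          # measurement ages
        deg = 3
        knots = np.concatenate(([0]*deg, np.linspace(0, 10, 6), [10]*deg))
        n_basis = len(knots) - deg - 1          # 8 basis functions here

        Phi = np.column_stack([BSpline(knots, np.eye(n_basis)[j], deg)(times)
                               for j in range(n_basis)])
        print(Phi.shape)                        # (25, 8)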

  9. Towards Quantitative Spatial Models of Seabed Sediment Composition

    PubMed Central

    Stephens, David; Diesing, Markus

    2015-01-01

    There is a need for fit-for-purpose maps for accurately depicting the types of seabed substrate and habitat and the properties of the seabed for the benefits of research, resource management, conservation and spatial planning. The aim of this study is to determine whether it is possible to predict substrate composition across a large area of seabed using legacy grain-size data and environmental predictors. The study area includes the North Sea up to approximately 58.44°N and the United Kingdom’s parts of the English Channel and the Celtic Seas. The analysis combines outputs from hydrodynamic models as well as optical remote sensing data from satellite platforms and bathymetric variables, which are mainly derived from acoustic remote sensing. We build a statistical regression model to make quantitative predictions of sediment composition (fractions of mud, sand and gravel) using the random forest algorithm. The compositional data is analysed on the additive log-ratio scale. An independent test set indicates that approximately 66% and 71% of the variability of the two log-ratio variables are explained by the predictive models. A EUNIS substrate model, derived from the predicted sediment composition, achieved an overall accuracy of 83% and a kappa coefficient of 0.60. We demonstrate that it is feasible to spatially predict the seabed sediment composition across a large area of continental shelf in a repeatable and validated way. We also highlight the potential for further improvements to the method. PMID:26600040
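
    The modelling chain lends itself to a compact sketch: transform the mud/sand/gravel fractions to additive log-ratios, fit a random forest per log-ratio on the environmental predictors, and back-transform predictions onto the simplex. The Python fragment below uses invented data and placeholder predictors purely for illustration.

        # Hedged sketch of ALR + random forest regression for sediment fractions.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))               # stand-ins for bathymetry etc.
        frac = rng.dirichlet([2, 5, 1], size=200)   # mud, sand, gravel fractions

        alr = np.log(frac[:, :2] / frac[:, 2:3])    # gravel as reference part

        models = [RandomForestRegressor(n_estimators=200, random_state=0)
                  .fit(X, alr[:, j]) for j in range(2)]

        # back-transform predictions so fractions are positive and sum to 1
        a = np.column_stack([m.predict(X[:5]) for m in models])
        expa = np.exp(np.column_stack([a, np.zeros(len(a))]))
        print((expa / expa.sum(axis=1, keepdims=True)).round(3))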

  10. Waste Stream Analyses for Nuclear Fuel Cycles

    SciTech Connect

    N. R. Soelberg

    2010-08-01

    A high-level study was performed in Fiscal Year 2009 for the U.S. Department of Energy (DOE) Office of Nuclear Energy (NE) Advanced Fuel Cycle Initiative (AFCI) to provide information for a range of nuclear fuel cycle options (Wigeland 2009). At that time, some fuel cycle options could not be adequately evaluated since they were not well defined and lacked sufficient information. As a result, five families of these fuel cycle options are being studied during Fiscal Year 2010 by the Systems Analysis Campaign for the DOE NE Fuel Cycle Research and Development (FCRD) program. The quality and completeness of data available to date for the fuel cycle options is insufficient to perform quantitative radioactive waste analyses using recommended metrics. This study has been limited thus far to qualitative analyses of waste streams from the candidate fuel cycle options, because quantitative data for wastes from the front end, fuel fabrication, reactor core structure, and used fuel for these options is generally not yet available.

  11. Accurate First-Principles Spectra Predictions for Planetological and Astrophysical Applications at Various T-Conditions

    NASA Astrophysics Data System (ADS)

    Rey, M.; Nikitin, A. V.; Tyuterev, V.

    2014-06-01

    Knowledge of near infrared intensities of rovibrational transitions of polyatomic molecules is essential for the modeling of various planetary atmospheres, brown dwarfs and for other astrophysical applications [1,2,3]. For example, to analyze exoplanets, atmospheric models have been developed, creating the need for accurate spectroscopic data. Consequently, the spectral characterization of such planetary objects relies on the necessity of having adequate and reliable molecular data in extreme conditions (temperature, optical path length, pressure). On the other hand, in the modeling of astrophysical opacities, millions of lines are generally involved and line-by-line extraction is clearly not feasible in laboratory measurements. It is thus suggested that this large amount of data could be interpreted only by reliable theoretical predictions. There exist essentially two theoretical approaches for the computation and prediction of spectra. The first one is based on empirically-fitted effective spectroscopic models. Another way of computing energies, line positions and intensities is based on global variational calculations using ab initio surfaces. They do not yet reach spectroscopic accuracy stricto sensu but implicitly account for all intramolecular interactions including resonance couplings in a wide spectral range. The final aim of this work is to provide reliable predictions which could be quantitatively accurate with respect to the precision of available observations and as complete as possible. All this thus requires extensive first-principles quantum mechanical calculations essentially based on three necessary ingredients which are (i) accurate intramolecular potential energy surface and dipole moment surface components well-defined in a large range of vibrational displacements and (ii) efficient computational methods combined with suitable choices of coordinates to account for molecular symmetry properties and to achieve a good numerical

  12. Quantitative lipopolysaccharide analysis using HPLC/MS/MS and its combination with the limulus amebocyte lysate assay

    PubMed Central

    Pais de Barros, Jean-Paul; Gautier, Thomas; Sali, Wahib; Adrie, Christophe; Choubley, Hélène; Charron, Emilie; Lalande, Caroline; Le Guern, Naig; Deckert, Valérie; Monchi, Mehran; Quenot, Jean-Pierre; Lagrost, Laurent

    2015-01-01

    Quantitation of plasma lipopolysaccharides (LPSs) might be used to document Gram-negative bacterial infection. In the present work, LPS-derived 3-hydroxymyristate was extracted from plasma samples with an organic solvent, separated by reversed phase HPLC, and quantitated by MS/MS. This mass assay was combined with the limulus amebocyte lysate (LAL) bioassay to monitor neutralization of LPS activity in biological samples. The described HPLC/MS/MS method is a reliable, practical, accurate, and sensitive tool to quantitate LPS. The combination of the LAL and HPLC/MS/MS analyses provided new evidence for the intrinsic capacity of plasma lipoproteins and phospholipid transfer protein to neutralize the activity of LPS. In a subset of patients with systemic inflammatory response syndrome, with documented infection but with a negative plasma LAL test, significant amounts of LPS were measured by the HPLC/MS/MS method. Patients with the highest plasma LPS concentration were more severely ill. HPLC/MS/MS is a relevant method to quantitate endotoxin in a sample, to assess the efficacy of LPS neutralization, and to evaluate the proinflammatory potential of LPS in vivo. PMID:26023073

  13. Estimating quantitative genetic parameters in wild populations: a comparison of pedigree and genomic approaches.

    PubMed

    Bérénos, Camillo; Ellis, Philip A; Pilkington, Jill G; Pemberton, Josephine M

    2014-07-01

    The estimation of quantitative genetic parameters in wild populations is generally limited by the accuracy and completeness of the available pedigree information. Using relatedness at genomewide markers can potentially remove this limitation and lead to less biased and more precise estimates. We estimated heritability, maternal genetic effects and genetic correlations for body size traits in an unmanaged long-term study population of Soay sheep on St Kilda using three increasingly complete and accurate estimates of relatedness: (i) Pedigree 1, using observation-derived maternal links and microsatellite-derived paternal links; (ii) Pedigree 2, using SNP-derived assignment of both maternity and paternity; and (iii) whole-genome relatedness at 37 037 autosomal SNPs. In initial analyses, heritability estimates were strikingly similar for all three methods, while standard errors were systematically lower in analyses based on Pedigree 2 and genomic relatedness. Genetic correlations were generally strong, differed little between the three estimates of relatedness and the standard errors declined only very slightly with improved relatedness information. When partitioning maternal effects into separate genetic and environmental components, maternal genetic effects found in juvenile traits increased substantially across the three relatedness estimates. Heritability declined compared to parallel models where only a maternal environment effect was fitted, suggesting that maternal genetic effects are confounded with direct genetic effects and that more accurate estimates of relatedness were better able to separate maternal genetic effects from direct genetic effects. We found that the heritability captured by SNP markers asymptoted at about half the SNPs available, suggesting that denser marker panels are not necessarily required for precise and unbiased heritability estimates. Finally, we present guidelines for the use of genomic relatedness in future quantitative genetics
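
    Genomic relatedness of the kind used in estimate (iii) is commonly computed with VanRaden's first genomic relatedness matrix (GRM) estimator; whether the authors used exactly this estimator is not stated here, so the numpy sketch below, with simulated genotypes, is only indicative.

        # GRM = Z Z' / (2 * sum p_j (1 - p_j)), with Z the genotype matrix
        # centred by twice the allele frequency (VanRaden 2008). Genotypes are
        # simulated; a real analysis would use the quality-filtered SNP panel.
        import numpy as np

        rng = np.random.default_rng(1)
        p = rng.uniform(0.05, 0.5, size=1000)                  # allele frequencies
        G = rng.binomial(2, p, size=(50, 1000)).astype(float)  # 50 individuals

        Z = G - 2*p
        grm = Z @ Z.T / (2*np.sum(p*(1 - p)))
        print(np.diag(grm).mean())   # close to 1 under Hardy-Weinberg equilibrium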

  14. Gas-phase purification enables accurate, large-scale, multiplexed proteome quantification with isobaric tagging

    PubMed Central

    Wenger, Craig D; Lee, M Violet; Hebert, Alexander S; McAlister, Graeme C; Phanstiel, Douglas H; Westphall, Michael S; Coon, Joshua J

    2011-01-01

    We describe a mass spectrometry method, QuantMode, which improves the accuracy of isobaric tag–based quantification by alleviating the pervasive problem of precursor interference—co-isolation of impurities—through gas-phase purification. QuantMode analysis of a yeast sample ‘contaminated’ with interfering human peptides showed substantially improved quantitative accuracy compared to a standard scan, with a small loss of spectral identifications. This technique will allow large-scale, multiplexed quantitative proteomics analyses using isobaric tagging. PMID:21963608

  15. Accuracy of quantitative visual soil assessment

    NASA Astrophysics Data System (ADS)

    van Leeuwen, Maricke; Heuvelink, Gerard; Stoorvogel, Jetse; Wallinga, Jakob; de Boer, Imke; van Dam, Jos; van Essen, Everhard; Moolenaar, Simon; Verhoeven, Frank; Stoof, Cathelijne

    2016-04-01

    farmers carried out quantitative visual observations, all independently of each other. All observers assessed five sites, having a sand, peat or clay soil. For almost all quantitative visual observations the spread of observed values was low (coefficient of variation < 1.0), except for the number of biopores and gley mottles. Furthermore, farmers' observed mean values were significantly higher than soil scientists' mean values for soil structure, amount of gley mottles and compaction. This study showed that VSA could be a valuable tool to assess soil quality. Subjectivity, due to the background of the observer, might influence the outcome of visual assessment of some soil properties. In countries where soil analyses can easily be carried out, VSA might be a good complement to available soil chemical analyses, and in countries where it is not feasible to carry out soil analyses, VSA might be a good starting point for assessing soil quality.

  16. Extremely accurate sequential verification of RELAP5-3D

    SciTech Connect

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
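
    In spirit, the sequential comparison reduces to checking that a new code version reproduces the previous version's calculations to within round-off. The Python sketch below assumes results exported as CSV; the actual RELAP5-3D tooling and file formats are not described here.

        # Compare results from consecutive code versions; any difference beyond
        # round-off flags an unintended change. Tolerances are assumptions.
        import numpy as np

        def sequentially_verified(prev_csv, curr_csv, rtol=1e-12, atol=1e-15):
            """True if the new version reproduces the previous results."""
            prev = np.loadtxt(prev_csv, delimiter=",")
            curr = np.loadtxt(curr_csv, delimiter=",")
            return prev.shape == curr.shape and np.allclose(prev, curr, rtol, atol)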

  17. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also provides tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.

  18. Enzymic determination of plasma cholesterol on discrete automatic analysers.

    PubMed

    Nobbs, B T; Smith, J M; Walker, A W

    1977-09-01

    Enzymic procedures for the determination of plasma cholesterol, using cholesterol esterase and cholesterol oxidase, have been adapted to the Vickers D-300, Vickers M-300, and Vitatron AKES discrete analysers. The results obtained by these methods have been compared to those obtained by manual and continuous flow Liebermann-Burchard methods. The enzymic methods were found to be accurate, precise and of adequate sensitivity.

  19. 77 FR 3800 - Accurate NDE & Inspection, LLC; Confirmatory Order

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ... COMMISSION Accurate NDE & Inspection, LLC; Confirmatory Order In the Matter of Accurate NDE & Docket: 150... request ADR with the NRC in an attempt to resolve issues associated with this matter. In response, on August 9, 2011, Accurate NDE requested ADR to resolve this matter with the NRC. On September 28,...

  20. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  1. Quantitative photoacoustic tomography

    PubMed Central

    Yuan, Zhen; Jiang, Huabei

    2009-01-01

    In this paper, several algorithms that allow for quantitative photoacoustic reconstruction of tissue optical, acoustic and physiological properties are described in a finite-element method based framework. These quantitative reconstruction algorithms are compared, and the merits and limitations associated with these methods are discussed. In addition, a multispectral approach is presented for concurrent reconstructions of multiple parameters including deoxyhaemoglobin, oxyhaemoglobin and water concentrations as well as acoustic speed. Simulation and in vivo experiments are used to demonstrate the effectiveness of the reconstruction algorithms presented. PMID:19581254

  2. DEMOGRAPHY AND VIABILITY ANALYSES OF A DIAMONDBACK TERRAPIN POPULATION

    EPA Science Inventory

    The diamondback terrapin Malaclemys terrapin is a long-lived species with special management requirements, but quantitative analyses to support management are lacking. I analyzed mark-recapture data and constructed an age-classified matrix population model to determine the status...

  3. Note on the chromatographic analyses of marine polyunsaturated fatty acids

    USGS Publications Warehouse

    Schultz, D.M.; Quinn, J.G.

    1977-01-01

    Gas-liquid chromatography was used to study the effects of saponification/methylation and thin-layer chromatographic isolation on the analyses of polyunsaturated fatty acids. Using selected procedures, the qualitative and quantitative distribution of these acids in marine organisms can be determined with a high degree of accuracy. © 1977 Springer-Verlag.

  4. Guidelines for Meta-Analyses of Counseling Psychology Research

    ERIC Educational Resources Information Center

    Quintana, Stephen M.; Minami, Takuya

    2006-01-01

    This article conceptually describes the steps in conducting quantitative meta-analyses of counseling psychology research with minimal reliance on statistical formulas. The authors identify sources that describe necessary statistical formula for various meta-analytic calculations and describe recent developments in meta-analytic techniques. The…

  5. Deficiencies of Reporting in Meta-Analyses and Some Remedies

    ERIC Educational Resources Information Center

    Harwell, Michael; Maeda, Yukiko

    2008-01-01

    There is general agreement that meta-analysis is an important tool for synthesizing study results in quantitative educational research. Yet, a shared feature of many meta-analyses is a failure to report sufficient information for readers to fully judge the reported findings, such as the populations to which generalizations are to be made,…

  6. A quantitative description for efficient financial markets

    NASA Astrophysics Data System (ADS)

    Immonen, Eero

    2015-09-01

    In this article we develop a control system model for describing efficient financial markets. We define the efficiency of a financial market in quantitative terms by robust asymptotic price-value equality in this model. By invoking the Internal Model Principle of robust output regulation theory we then show that under No Bubble Conditions, in the proposed model, the market is efficient if and only if the following conditions hold true: (1) the traders, as a group, can identify any mispricing in asset value (even if no one single trader can do it accurately), and (2) the traders, as a group, incorporate an internal model of the value process (again, even if no one single trader knows it). This main result of the article, which deliberately avoids the requirement for investor rationality, demonstrates, in quantitative terms, that the more transparent the markets are, the more efficient they are. An extensive example is provided to illustrate the theoretical development.

  7. Uncertainty Quantification for Quantitative Imaging Holdup Measurements

    SciTech Connect

    Bevill, Aaron M; Bledsoe, Keith C

    2016-01-01

    In nuclear fuel cycle safeguards, special nuclear material "held up" in pipes, ducts, and glove boxes causes significant uncertainty in material-unaccounted-for estimates. Quantitative imaging is a proposed non-destructive assay technique with potential to estimate the holdup mass more accurately and reliably than current techniques. However, uncertainty analysis for quantitative imaging remains a significant challenge. In this work we demonstrate an analysis approach for data acquired with a fast-neutron coded aperture imager. The work includes a calibrated forward model of the imager. Cross-validation indicates that the forward model predicts the imager data typically within 23%; further improvements are forthcoming. A new algorithm based on the chi-squared goodness-of-fit metric then uses the forward model to calculate a holdup confidence interval. The new algorithm removes geometry approximations that previous methods require, making it a more reliable uncertainty estimator.
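
    As a sketch of how a chi-squared goodness-of-fit metric can yield a holdup confidence interval (the function names and degrees-of-freedom choice below are illustrative assumptions, not the authors' implementation), one can scan candidate masses and accept those whose misfit to the imager data stays below a critical value:

        import numpy as np
        from scipy.stats import chi2

        def holdup_confidence_interval(masses, data, sigma, forward_model,
                                       alpha=0.05):
            """Interval of candidate holdup masses whose chi-squared misfit
            between measured imager data and the forward-model prediction
            stays below the critical value.

            forward_model(m) -> predicted counts, same shape as data
            (a hypothetical stand-in for the calibrated imager model).
            """
            crit = chi2.ppf(1.0 - alpha, df=data.size)  # df simplified here
            accepted = [m for m in masses
                        if np.sum(((data - forward_model(m)) / sigma) ** 2)
                        <= crit]
            return (min(accepted), max(accepted)) if accepted else None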

  8. A Fast, Accurate and Sensitive GC-FID Method for the Analyses of Glycols in Water and Urine

    NASA Technical Reports Server (NTRS)

    Kuo, C. Mike; Alverson, James T.; Gazda, Daniel B.

    2017-01-01

    Glycols, specifically ethylene glycol and 1,2-propanediol, are some of the major organic compounds found in the humidity condensate samples collected on the International Space Station. The current analytical method for glycols is a GC/MS method with direct sample injection. This method is simple and fast, but it is not very sensitive. Reporting limits for ethylene glycol and 1,2-propanediol are only 1 ppm. A much more sensitive GC/FID method was developed, in which glycols were derivatized with benzoyl chloride for 10 minutes before being extracted with hexane. Using 1,3-propanediol as an internal standard, the detection limit for the GC/FID method was determined to be 50 ppb, and the analysis takes only 7 minutes. Data from the GC/MS and the new GC/FID methods show excellent agreement. Factors affecting the sensitivity, including sample volume, NaOH concentration and volume, volume of benzoyl chloride, and reaction time and temperature, were investigated. Interferences during derivatization and possible methods to reduce them were also investigated.
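
    The internal-standard quantitation underlying such a method follows a standard pattern: a calibration run fixes the analyte's response factor relative to the internal standard, and unknowns are then computed from the analyte/internal-standard peak-area ratio. A minimal sketch (all numbers illustrative, not taken from the paper):

        def response_factor(area_analyte, conc_analyte, area_is, conc_is):
            """Relative response factor from a calibration run:
            RF = (A_analyte / A_IS) / (C_analyte / C_IS)."""
            return (area_analyte / area_is) / (conc_analyte / conc_is)

        def quantify(area_analyte, area_is, conc_is, rf):
            """Unknown concentration from the peak-area ratio and the RF."""
            return (area_analyte / area_is) * conc_is / rf

        # Calibrate with 500 ppb ethylene glycol against 500 ppb
        # 1,3-propanediol internal standard, then quantify an unknown.
        rf = response_factor(1.20e5, 500.0, 1.00e5, 500.0)
        print(quantify(3.0e4, 1.05e5, 500.0, rf))  # ~119 ppb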

  9. Quantitative SPECT of uptake of monoclonal antibodies

    SciTech Connect

    DeNardo, G.L.; Macey, D.J.; DeNardo, S.J.; Zhang, C.G.; Custer, T.R.

    1989-01-01

    Absolute quantitation of the distribution of radiolabeled antibodies is important to the efficient conduct of research with these agents and their ultimate use for imaging and treatment, but is formidable because of the unrestricted nature of their distribution within the patient. Planar imaging methods have been developed and provide an adequate approximation of the distribution of radionuclide for many purposes, particularly when there is considerable specificity of targeting. This is not currently the case for antibodies and is unlikely in the future. Single photon emission computed tomography (SPECT) provides potential for greater accuracy because it reduces problems caused by superimposition of tissues and non-target contributions to target counts. SPECT measurement of radionuclide content requires: (1) accurate determination of camera sensitivity; (2) accurate determination of the number of counts in a defined region of interest; (3) correction for attenuation; (4) correction for scatter and septal penetration; (5) accurate measurement of the administered dose; (6) adequate statistics; and (7) accurate definition of tissue mass or volume. The major impediment to each of these requirements is scatter of many types. The magnitude of this problem can be diminished by improvements in tomographic camera design, computer algorithms, and methodological approaches. 34 references.
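
    For requirement (3), the simplest illustration is a first-order narrow-beam correction, in which counts measured through a known depth of tissue are scaled back up by the attenuation factor. This sketch is purely illustrative (clinical SPECT uses e.g. Chang or CT-derived attenuation maps, and the coefficient below is an assumed value):

        import math

        def attenuation_corrected_counts(measured_counts, mu_per_cm, depth_cm):
            """Narrow-beam attenuation correction: scale measured counts by
            exp(mu * d) to recover the counts that would be seen with no
            attenuating tissue in the path."""
            return measured_counts * math.exp(mu_per_cm * depth_cm)

        # 1.0e4 counts observed at 8 cm depth with mu = 0.15 /cm (~140 keV
        # photons in soft tissue) -> roughly 3.3e4 corrected counts.
        print(attenuation_corrected_counts(1.0e4, 0.15, 8.0))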

  10. Accurate phase measurements for thick spherical objects using optical quadrature microscopy

    NASA Astrophysics Data System (ADS)

    Warger, William C., II; DiMarzio, Charles A.

    2009-02-01

    In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide a more accurate means of scoring embryo viability.

  11. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments

    PubMed Central

    Eter, Wael A.; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-01-01

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo-SPECT and cross evaluated by 3D quantitative OPT imaging as well as with histology within healthy and alloxan-treated Brown Norway rat pancreata. SPECT signal was in excellent linear correlation with OPT data as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers. PMID:27080529

  12. SPECT-OPT multimodal imaging enables accurate evaluation of radiotracers for β-cell mass assessments.

    PubMed

    Eter, Wael A; Parween, Saba; Joosten, Lieke; Frielink, Cathelijne; Eriksson, Maria; Brom, Maarten; Ahlgren, Ulf; Gotthardt, Martin

    2016-04-15

    Single Photon Emission Computed Tomography (SPECT) has become a promising experimental approach to monitor changes in β-cell mass (BCM) during diabetes progression. SPECT imaging of pancreatic islets is most commonly cross-validated by stereological analysis of histological pancreatic sections after insulin staining. Typically, stereological methods do not accurately determine the total β-cell volume, which is inconvenient when correlating total pancreatic tracer uptake with BCM. Alternative methods are therefore warranted to cross-validate β-cell imaging using radiotracers. In this study, we introduce multimodal SPECT - optical projection tomography (OPT) imaging as an accurate approach to cross-validate radionuclide-based imaging of β-cells. Uptake of a promising radiotracer for β-cell imaging by SPECT, 111In-exendin-3, was measured by ex vivo-SPECT and cross evaluated by 3D quantitative OPT imaging as well as with histology within healthy and alloxan-treated Brown Norway rat pancreata. SPECT signal was in excellent linear correlation with OPT data as compared to histology. While histological determination of islet spatial distribution was challenging, SPECT and OPT revealed similar distribution patterns of 111In-exendin-3 and insulin positive β-cell volumes between different pancreatic lobes, both visually and quantitatively. We propose ex vivo SPECT-OPT multimodal imaging as a highly accurate strategy for validating the performance of β-cell radiotracers.

  13. The Quantitative Imaging Network in Precision Medicine

    PubMed Central

    Nordstrom, Robert J.

    2017-01-01

    Precision medicine is a healthcare model that seeks to incorporate a wealth of patient information to identify and classify disease progression and to provide tailored therapeutic solutions for individual patients. Interventions are based on knowledge of molecular and mechanistic causes, pathogenesis and pathology of disease. Individual characteristics of the patients are then used to select appropriate healthcare options. Imaging is playing an increasingly important role in identifying relevant characteristics that help to stratify patients for different interventions. However, lack of standards, limitations in image-processing interoperability, and errors in data collection can limit the applicability of imaging in clinical decision support. Quantitative imaging is the attempt to extract reliable, numerical information from images to eliminate qualitative judgments and errors for providing accurate measures of tumor response to therapy or for predicting future response. This issue of Tomography reports quantitative imaging developments made by several members of the National Cancer Institute Quantitative Imaging Network, a program dedicated to the promotion of quantitative imaging methods for clinical decision support. PMID:28083563

  14. DNA DAMAGE QUANTITATION BY ALKALINE GEL ELECTROPHORESIS.

    SciTech Connect

    SUTHERLAND,B.M.; BENNETT,P.V.; SUTHERLAND, J.C.

    2004-03-24

    Physical and chemical agents in the environment, those used in clinical applications, or encountered during recreational exposures to sunlight, induce damages in DNA. Understanding the biological impact of these agents requires quantitation of the levels of such damages in laboratory test systems as well as in field or clinical samples. Alkaline gel electrophoresis provides a sensitive (down to ~ a few lesions/5 Mb), rapid method of direct quantitation of a wide variety of DNA damages in nanogram quantities of non-radioactive DNAs from laboratory, field, or clinical specimens, including higher plants and animals. This method stems from velocity sedimentation studies of DNA populations, and from the simple methods of agarose gel electrophoresis. Our laboratories have developed quantitative agarose gel methods, analytical descriptions of DNA migration during electrophoresis on agarose gels (1-6), and electronic imaging for accurate determinations of DNA mass (7-9). Although all these components improve sensitivity and throughput of large numbers of samples (7,8,10), a simple version using only standard molecular biology equipment allows routine analysis of DNA damages at moderate frequencies. We present here a description of the methods, as well as a brief description of the underlying principles, required for a simplified approach to quantitation of DNA damages by alkaline gel electrophoresis.
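
    For randomly (Poisson) distributed lesions, quantitation from such gels rests on the number-average fragment length: the induced lesion frequency is the difference of reciprocal number-average lengths of treated and control samples. A minimal sketch under that standard assumption (not necessarily the authors' exact implementation):

        def lesions_per_kb(ln_treated_kb, ln_control_kb):
            """Lesion frequency from number-average fragment lengths (kb),
            assuming randomly distributed strand breaks:
                phi = 1/Ln(treated) - 1/Ln(control)   [lesions per kb]
            """
            return 1.0 / ln_treated_kb - 1.0 / ln_control_kb

        # Example: Ln drops from 48.5 kb to 20 kb after treatment,
        # giving ~0.029 lesions/kb, i.e. about one lesion per 34 kb.
        print(lesions_per_kb(20.0, 48.5))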

  15. Quantitative Simulation Games

    NASA Astrophysics Data System (ADS)

    Černý, Pavol; Henzinger, Thomas A.; Radhakrishna, Arjun

    While a boolean notion of correctness is given by a preorder on systems and properties, a quantitative notion of correctness is defined by a distance function on systems and properties, where the distance between a system and a property provides a measure of "fit" or "desirability." In this article, we explore several ways how the simulation preorder can be generalized to a distance function. This is done by equipping the classical simulation game between a system and a property with quantitative objectives. In particular, for systems that satisfy a property, a quantitative simulation game can measure the "robustness" of the satisfaction, that is, how much the system can deviate from its nominal behavior while still satisfying the property. For systems that violate a property, a quantitative simulation game can measure the "seriousness" of the violation, that is, how much the property has to be modified so that it is satisfied by the system. These distances can be computed in polynomial time, since the computation reduces to the value problem in limit average games with constant weights. Finally, we demonstrate how the robustness distance can be used to measure how many transmission errors are tolerated by error correcting codes.

  16. Emerging high throughput analyses of cyanobacterial toxins and toxic cyanobacteria.

    PubMed

    Sivonen, Kaarina

    2008-01-01

    The common occurrence of toxic cyanobacteria causes problems for the health of animals and human beings. More research and good monitoring systems are needed to protect water users. It is important to have rapid, reliable and accurate analysis, i.e. high throughput methods, to identify the toxins as well as toxin producers in the environment. Excellent methods, such as ELISA, already exist to analyse cyanobacterial hepatotoxins and saxitoxins, and PPIA for microcystins and nodularins. The LC/MS method can be fast in identifying the toxicants in the samples. Further development of this area should resolve the problems with sampling and sample preparation, which still are the bottlenecks of rapid analyses. In addition, the availability of reliable reference materials and standards should be resolved. Molecular detection methods are now routine in clinical and criminal laboratories and may also become important in environmental diagnostics. One prerequisite for the development of molecular analysis is that pure cultures of the producer organisms are available for identification of the biosynthetic genes responsible for toxin production and for proper testing of the diagnostic methods. Good methods are already available for the microcystin and nodularin-producing cyanobacteria such as conventional PCR, quantitative real-time PCR and microarrays/DNA chips. The DNA-chip technology offers an attractive monitoring system for toxic and non-toxic cyanobacteria. Only with these new technologies (PCR + DNA-chips) will we be able to study toxic cyanobacteria populations in situ and the effects of environmental factors on the occurrence and proliferation of especially toxic cyanobacteria. This is likely to yield important information for mitigation purposes. Further development of these methods should include all cyanobacterial biodiversity, including all toxin producers and primers/probes to detect producers of neurotoxins, cylindrospermopsins etc. (genes are unknown).

  17. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  18. Phylogenetic ANOVA: The Expression Variance and Evolution Model for Quantitative Trait Evolution.

    PubMed

    Rohlfs, Rori V; Nielsen, Rasmus

    2015-09-01

    A number of methods have been developed for modeling the evolution of a quantitative trait on a phylogeny. These methods have received renewed interest in the context of genome-wide studies of gene expression, in which the expression levels of many genes can be modeled as quantitative traits. We here develop a new method for joint analyses of quantitative traits within- and between species, the Expression Variance and Evolution (EVE) model. The model parameterizes the ratio of population to evolutionary expression variance, facilitating a wide variety of analyses, including a test for lineage-specific shifts in expression level, and a phylogenetic ANOVA that can detect genes with increased or decreased ratios of expression divergence to diversity, analogous to the famous Hudson Kreitman Aguadé (HKA) test used to detect selection at the DNA level. We use simulations to explore the properties of these tests under a variety of circumstances and show that the phylogenetic ANOVA is more accurate than the standard ANOVA (no accounting for phylogeny) sometimes used in transcriptomics. We then apply the EVE model to a mammalian phylogeny of 15 species typed for expression levels in liver tissue. We identify genes with high expression divergence between species as candidates for expression level adaptation, and genes with high expression diversity within species as candidates for expression level conservation and/or plasticity. Using the test for lineage-specific expression shifts, we identify several candidate genes for expression level adaptation on the catarrhine and human lineages, including genes putatively related to dietary changes in humans. We compare these results to those reported previously using a model which ignores expression variance within species, uncovering important differences in performance. We demonstrate the necessity for a phylogenetic model in comparative expression studies and show the utility of the EVE model to detect expression divergence

  19. Evaluating the efficacy of continuous quantitative characters for reconstructing the phylogeny of a morphologically homogeneous spider taxon (Araneae, Mygalomorphae, Antrodiaetidae, Antrodiaetus).

    PubMed

    Hendrixson, Brent E; Bond, Jason E

    2009-10-01

    The use of continuous quantitative characters for phylogenetic analyses has long been contentious in the systematics literature. Recent studies argue for and against their use, but there have been relatively few attempts to evaluate whether these characters provide an accurate estimate of phylogeny, despite the fact that a number of methods have been developed to analyze these types of data for phylogenetic inference. A tree topology will be produced for a given methodology and set of characters, but little can be concluded with regard to the accuracy of phylogenetic signal without an independent evaluation of those characters. We assess the performance of continuous quantitative characters for the mygalomorph spider genus Antrodiaetus, a group that is morphologically homogeneous and one for which few discrete (morphological) characters have been observed. Phylogenetic signal contained in continuous quantitative characters is compared to an independently derived phylogeny inferred on the basis of multiple nuclear and mitochondrial gene loci. Tree topology randomizations, regression techniques, and topological tests all demonstrate that continuous quantitative characters in Antrodiaetus conflict with the phylogenetic signal contained in the gene trees. Our results show that the use of continuous quantitative characters for phylogenetic reconstruction may be inappropriate for reconstructing Antrodiaetus phylogeny and indicate that due caution should be exercised before employing this character type in the absence of other independently derived sources of characters.

  20. SiNG-PCRseq: Accurate inter-sequence quantification achieved by spiking-in a neighbor genome for competitive PCR amplicon sequencing.

    PubMed

    Oh, Soo A; Yang, Inchul; Hahn, Yoonsoo; Kang, Yong-Kook; Chung, Sun-Ku; Jeong, Sangkyun

    2015-07-06

    Despite the recent technological advances in DNA quantitation by sequencing, accurate delineation of the quantitative relationship among different DNA sequences is yet to be elaborated due to difficulties in correcting the sequence-specific quantitation biases. We here developed a novel DNA quantitation method via spiking-in a neighbor genome for competitive PCR amplicon sequencing (SiNG-PCRseq). This method utilizes genome-wide chemically equivalent but easily discriminable homologous sequences with a known copy arrangement in the neighbor genome. By comparing the amounts of selected human DNA sequences simultaneously to those of matched sequences in the orangutan genome, we could accurately draw the quantitative relationships for those sequences in the human genome (root-mean-square deviations <0.05). Technical replications of cDNA quantitation performed using different reagents at different time points also resulted in excellent correlations (R² > 0.95). The cDNA quantitation using SiNG-PCRseq was highly concordant with the RNA-seq-derived version in inter-sample comparisons (R² = 0.88), but relatively discordant in inter-sequence quantitation (R² < 0.44), indicating a considerable level of sequence-dependent quantitative biases in RNA-seq. Considering the measurement structure explicitly relating the amounts of different sequences within a sample, SiNG-PCRseq will facilitate sharing and comparing the quantitation data generated under different spatio-temporal settings.
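
    Schematically, competitive quantitation against a spiked-in neighbor genome reduces to a ratio: the target's abundance is the read-count ratio of the human sequence to its matched, co-amplified neighbor sequence, scaled by the neighbor's known copy number. The sketch below is a deliberately simplified illustration (the published method additionally corrects amplification and sequencing biases):

        def target_copies(human_reads, neighbor_reads, neighbor_copies):
            """Abundance of a human target sequence inferred from its read
            count relative to the matched neighbor-genome sequence whose
            spiked-in copy number is known."""
            return neighbor_copies * human_reads / neighbor_reads

        # 12,000 human reads vs 10,000 matched orangutan reads, neighbor
        # spiked at 2 copies -> 2.4 inferred copies of the human target.
        print(target_copies(12000, 10000, 2))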

  1. Fluorochromes for DNA Staining and Quantitation.

    PubMed

    Mazzini, Giuliano; Danova, Marco

    2017-01-01

    In these last few decades the great explosion of the molecular approaches has cast a little shadow on the DNA quantitative analysis. Nevertheless DNA cytochemistry represented a long piece of history in cell biology since the advent of the Feulgen reaction. This discovery was really the milestone of the emerging quantitative cytochemistry, and scientists from all over the world produced a very large literature on this subject. This first era of quantitation (histochemistry followed by cytochemistry) started by means of absorption measurements (histophotometry and cytophotometry). The successive introduction of fluorescence microscopy gave a great boost to quantitation, making easier and faster the determination of cell components by means of cytofluorometry. The development of flow cytometry further contributed to the importance of quantitative cytochemistry. At its beginning, the mission of flow cytometry was still DNA quantitation. For a decade the Feulgen reaction had been the reference methodology for both conventional and flow cytofluorometry; the advent of Schiff-type reagents contributed to expand the variety of possible fluorochromes excitable in the entire visible spectrum as well as in the ultraviolet region. The fluorescence scenario was progressively enriched by new probes among which are the intercalating dyes which made DNA quantitation simple and fast, thus spreading it worldwide. The final explosion of cytofluorometry was made possible by the availability of a large variety of probes directly binding DNA structure. In addition, immunofluorescence allowed correlation of the cell cycle-related DNA content with other cell markers. In the clinical application of flow cytometry, this promoted the introduction of multiparametric analyses aimed at describing the cytokinetic characteristics of a given cell subpopulation defined by a specific immunophenotype setting.

  2. Accurate fundamental parameters for 23 bright solar-type stars

    NASA Astrophysics Data System (ADS)

    Bruntt, H.; Bedding, T. R.; Quirion, P.-O.; Lo Curto, G.; Carrier, F.; Smalley, B.; Dall, T. H.; Arentoft, T.; Bazot, M.; Butler, R. P.

    2010-07-01

    We combine results from interferometry, asteroseismology and spectroscopy to determine accurate fundamental parameters of 23 bright solar-type stars, from spectral type F5 to K2 and luminosity classes III-V. For some stars we can use direct techniques to determine the mass, radius, luminosity and effective temperature, and we compare with indirect methods that rely on photometric calibrations or spectroscopic analyses. We use the asteroseismic information available in the literature to infer an indirect mass with an accuracy of 4-15 per cent. From indirect methods we determine luminosity and radius to 3 per cent. We find evidence that the luminosity from the indirect method is slightly overestimated (~ 5 per cent) for the coolest stars, indicating that their bolometric corrections (BCs) are too negative. For Teff we find a slight offset of -40 +/- 20 K between the spectroscopic method and the direct method, meaning the spectroscopic temperatures are too high. From the spectroscopic analysis we determine the detailed chemical composition for 13 elements, including Li, C and O. The metallicity ranges from [Fe/H] = -1.7 to +0.4, and there is clear evidence for α-element enhancement in the metal-poor stars. We find no significant offset between the spectroscopic surface gravity and the value from combining asteroseismology with radius estimates. From the spectroscopy we also determine v sin i and we present a new calibration of macroturbulence and microturbulence. From the comparison between the results from the direct and spectroscopic methods we claim that we can determine Teff, log g and [Fe/H] with absolute accuracies of 80 K, 0.08 and 0.07 dex. Photometric calibrations of Strömgren indices provide accurate results for Teff and [Fe/H] but will be more uncertain for distant stars when interstellar reddening becomes important. The indirect methods are important to obtain reliable estimates of the fundamental parameters of relatively faint stars when interferometry

  3. Leadership and Culture-Building in Schools: Quantitative and Qualitative Understandings.

    ERIC Educational Resources Information Center

    Sashkin, Marshall; Sashkin, Molly G.

    Understanding effective school leadership as a function of culture building through quantitative and qualitative analyses is the purpose of this paper. The two-part quantitative phase of the research focused on statistical measures of culture and leadership behavior directed toward culture building in the school. The first quantitative part…

  4. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons however render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR
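
    The TEW correction named above has a standard per-pixel form: scatter under the photopeak window is estimated by trapezoidal interpolation between two narrow windows flanking it. A minimal sketch (window widths and counts are illustrative, not the acquisition settings of this study):

        import numpy as np

        def tew_primary_counts(peak, lower, upper, w_peak, w_lower, w_upper):
            """Triple-energy-window scatter correction, per projection pixel:
            scatter = (C_lower/w_lower + C_upper/w_upper) * w_peak / 2."""
            scatter = (lower / w_lower + upper / w_upper) * w_peak / 2.0
            return np.clip(peak - scatter, 0.0, None)  # clip negative values

        # A 60 keV-wide photopeak window flanked by two 6 keV windows:
        print(tew_primary_counts(peak=np.array([900.0]),
                                 lower=np.array([30.0]),
                                 upper=np.array([18.0]),
                                 w_peak=60.0, w_lower=6.0, w_upper=6.0))
        # -> [660.], i.e. 240 of the 900 photopeak counts judged as scatter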

  5. PCaAnalyser: A 2D-Image Analysis Based Module for Effective Determination of Prostate Cancer Progression in 3D Culture

    PubMed Central

    Lovitt, Carrie J.; Avery, Vicky M.

    2013-01-01

    Three-dimensional (3D) in vitro cell based assays for Prostate Cancer (PCa) research are rapidly becoming the preferred alternative to that of conventional 2D monolayer cultures. 3D assays more precisely mimic the microenvironment found in vivo, and thus are ideally suited to evaluate compounds and their suitability for progression in the drug discovery pipeline. To achieve the desired high throughput needed for most screening programs, automated quantification of 3D cultures is required. Towards this end, this paper reports on the development of a prototype analysis module for an automated high-content-analysis (HCA) system, which allows for accurate and fast investigation of in vitro 3D cell culture models for PCa. The Java-based program, which we have named PCaAnalyser, uses novel algorithms that allow accurate and rapid quantitation of protein expression in 3D cell culture. As currently configured, the PCaAnalyser can quantify a range of biological parameters including: nuclei-count, nuclei-spheroid membership prediction, various function-based classification of peripheral and non-peripheral areas to measure expression of biomarkers and protein constituents known to be associated with PCa progression, as well as effectively delineating separate cellular objects over a range of signal-to-noise ratios. In addition, the PCaAnalyser architecture is highly flexible, operating as a single independent analysis as well as in batch mode, which is essential for high-throughput screening (HTS). Utilising PCaAnalyser, accurate and rapid analysis is provided in an automated, high-throughput manner, and reproducible analysis of the distribution and intensity of well-established markers associated with PCa progression is demonstrated in a range of metastatic PCa cell lines (DU145 and PC3) in a 3D model. PMID:24278197

  6. Energy & Climate: Getting Quantitative

    NASA Astrophysics Data System (ADS)

    Wolfson, Richard

    2011-11-01

    A noted environmentalist claims that buying an SUV instead of a regular car is energetically equivalent to leaving your refrigerator door open for seven years. A fossil-fuel apologist argues that solar energy is a pie-in-the-sky dream promulgated by naïve environmentalists, because there's nowhere near enough solar energy to meet humankind's energy demand. A group advocating shutdown of the Vermont Yankee nuclear plant claims that 70% of its electrical energy is lost in transmission lines. Around the world, thousands agitate for climate action, under the numerical banner "350." Neither the environmentalist, the fossil-fuel apologist, the antinuclear activists, nor most of those marching under the "350" banner can back up their assertions with quantitative arguments. Yet questions about energy and its environmental impacts almost always require quantitative answers. Physics can help! This poster gives some cogent examples, based on the newly published 2nd edition of the author's textbook Energy, Environment, and Climate.

  7. Spectroscopically Accurate Line Lists for Application in Sulphur Chemistry

    NASA Astrophysics Data System (ADS)

    Underwood, D. S.; Azzam, A. A. A.; Yurchenko, S. N.; Tennyson, J.

    2013-09-01

    Monitoring sulphur chemistry is thought to be of great importance for exoplanets. Doing this requires detailed knowledge of the spectroscopic properties of sulphur containing molecules such as hydrogen sulphide (H2S) [1], sulphur dioxide (SO2), and sulphur trioxide (SO3). Each of these molecules can be found in terrestrial environments, produced in volcano emissions on Earth, and analysis of their spectroscopic data can prove useful to the characterisation of exoplanets, as well as the study of planets in our own solar system, with both having a possible presence on Venus. A complete, high-temperature list of line positions and intensities for H2(32)S is presented. The DVR3D program suite is used to calculate the bound ro-vibration energy levels, wavefunctions, and dipole transition intensities using Radau coordinates. The calculations are based on a newly determined, spectroscopically refined potential energy surface (PES) and a new, high accuracy, ab initio dipole moment surface (DMS). Tests show that the PES enables us to calculate the line positions accurately and the DMS gives satisfactory results for line intensities. Comparisons with experiment as well as with previous theoretical spectra will be presented. The results of this study will form an important addition to the databases which are considered as sources of information for space applications; especially, in analysing the spectra of extrasolar planets, and remote sensing studies for Venus and Earth, as well as laboratory investigations and pollution studies. An ab initio line list for SO3 was previously computed using the variational nuclear motion program TROVE [2], and was suitable for modelling room temperature SO3 spectra. The calculations covered transitions in the region of 0-4000 cm-1 with rotational states up to J = 85; the list includes 174,674,257 transitions. A list of 10,878 experimental transitions had relative intensities placed on an absolute scale, and were provided in a form suitable

  8. Efficient design, accurate fabrication and effective characterization of plasmonic quasicrystalline arrays of nano-spherical particles

    PubMed Central

    Namin, Farhad A.; Yuwen, Yu A.; Liu, Liu; Panaretos, Anastasios H.; Werner, Douglas H.; Mayer, Theresa S.

    2016-01-01

    In this paper, the scattering properties of two-dimensional quasicrystalline plasmonic lattices are investigated. We combine a newly developed synthesis technique, which allows for accurate fabrication of spherical nanoparticles, with a recently published variation of generalized multiparticle Mie theory to develop the first quantitative model for plasmonic nano-spherical arrays based on quasicrystalline morphologies. In particular, we study the scattering properties of Penrose and Ammann-Beenker gold spherical nanoparticle array lattices. We demonstrate that by using quasicrystalline lattices, one can obtain multi-band or broadband plasmonic resonances which are not possible in periodic structures. Unlike previously published works, our technique provides quantitative results which show excellent agreement with experimental measurements. PMID:26911709

  9. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.

  10. Absolute quantitation of protein posttranslational modification isoform.

    PubMed

    Yang, Zhu; Li, Ning

    2015-01-01

    Mass spectrometry has been widely applied in the characterization and quantification of proteins from complex biological samples. Because absolute protein amounts are needed to construct mathematical models of the molecular systems underlying various biological phenotypes and phenomena, a number of quantitative proteomic methods have been adopted to measure absolute quantities of proteins using mass spectrometry. Liquid chromatography-tandem mass spectrometry (LC-MS/MS) coupled with internal peptide standards, i.e., stable isotope-coded peptide dilution series, a technique that originated in analytical chemistry, has become a widely applied method in absolute quantitative proteomics research. This approach provides more and more absolute protein quantitation results of high confidence. As the quantitative study of posttranslational modification (PTM), which modulates the biological activity of proteins, is crucial for biological science, and each isoform may contribute a unique biological function, degradation, and/or subcellular location, the absolute quantitation of protein PTM isoforms has become more relevant to its biological significance. To obtain the absolute cellular amount of a PTM isoform of a protein accurately, the impacts of protein fractionation, protein enrichment, and proteolytic digestion yield should be taken into consideration, and those effects arising before differentially stable isotope-coded PTM peptide standards are spiked into the sample peptides have to be corrected. Assisted with stable isotope-labeled peptide standards, the absolute quantitation of isoforms of posttranslationally modified protein (AQUIP) method takes all these factors into account and determines the absolute amount of a protein PTM isoform from the absolute amount of the protein of interest and the PTM occupancy at the site of the protein. The absolute amount of the protein of interest is inferred by quantifying both the absolute amounts of a few PTM

  11. Quantitation of signal transduction.

    PubMed

    Krauss, S; Brand, M D

    2000-12-01

    Conventional qualitative approaches to signal transduction provide powerful ways to explore the architecture and function of signaling pathways. However, at the level of the complete system, they do not fully depict the interactions between signaling and metabolic pathways and fail to give a manageable overview of the complexity that is often a feature of cellular signal transduction. Here, we introduce a quantitative experimental approach to signal transduction that helps to overcome these difficulties. We present a quantitative analysis of signal transduction during early mitogen stimulation of lymphocytes, with steady-state respiration rate as a convenient marker of metabolic stimulation. First, by inhibiting various key signaling pathways, we measure their relative importance in regulating respiration. About 80% of the input signal is conveyed via identifiable routes: 50% through pathways sensitive to inhibitors of protein kinase C and MAP kinase and 30% through pathways sensitive to an inhibitor of calcineurin. Second, we quantify how each of these pathways differentially stimulates functional units of reactions that produce and consume a key intermediate in respiration: the mitochondrial membrane potential. Both the PKC and calcineurin routes stimulate consumption more strongly than production, whereas the unidentified signaling routes stimulate production more than consumption, leading to no change in membrane potential despite increased respiration rate. The approach allows a quantitative description of the relative importance of signal transduction pathways and the routes by which they activate a specific cellular process. It should be widely applicable.

  12. Towards accurate and precise estimates of lion density.

    PubMed

    Elliot, Nicholas B; Gopalaswamy, Arjun M

    2016-12-13

    Reliable estimates of animal density are fundamental to our understanding of ecological processes and population dynamics. Furthermore, their accuracy is vital to conservation biology since wildlife authorities rely on these figures to make decisions. However, it is notoriously difficult to accurately estimate density for wide-ranging species such as carnivores that occur at low densities. In recent years, significant progress has been made in density estimation of Asian carnivores, but the methods have not been widely adapted to African carnivores. African lions (Panthera leo) provide an excellent example as although abundance indices have been shown to produce poor inferences, they continue to be used to estimate lion density and inform management and policy. In this study we adapt a Bayesian spatially explicit capture-recapture model to estimate lion density in the Maasai Mara National Reserve (MMNR) and surrounding conservancies in Kenya. We utilize sightings data from a three-month survey period to produce statistically rigorous spatial density estimates. Overall posterior mean lion density was estimated to be 16.85 (posterior standard deviation = 1.30) lions over one year of age per 100 km² with a sex ratio of 2.2♀:1♂. We argue that such methods should be developed, improved and favored over less reliable methods such as track and call-up surveys. We caution against trend analyses based on surveys of differing reliability and call for a unified framework to assess lion numbers across their range in order for better informed management and policy decisions to be made. This article is protected by copyright. All rights reserved.

  13. Project analysis and integration economic analyses summary

    NASA Technical Reports Server (NTRS)

    Macomber, H. L.

    1986-01-01

    An economic-analysis summary was presented for the manufacture of crystalline-silicon modules involving silicon ingot/sheet growth, slicing, cell manufacture, and module assembly. Economic analyses provided: useful quantitative aspects for complex decision-making to the Flat-plate Solar Array (FSA) Project; yardsticks for design and performance to industry; and demonstration of how to evaluate and understand the worth of research and development both to JPL and other government agencies and programs. It was concluded that future research and development funds for photovoltaics must be provided by the Federal Government because the solar industry today does not reap enough profits from its present-day sales of photovoltaic equipment.

  14. EEO Implications of Job Analyses.

    ERIC Educational Resources Information Center

    Lacy, D. Patrick, Jr.

    1979-01-01

    Discusses job analyses as they relate to the requirements of Title VII of the Civil Rights Act of 1964, the Equal Pay Act of 1963, and the Rehabilitation Act of 1973. Argues that job analyses can establish the job-relatedness of entrance requirements and aid in defenses against charges of discrimination. Journal availability: see EA 511 615.

  15. Foucault test: a quantitative evaluation method.

    PubMed

    Rodríguez, Gustavo; Villa, Jesús; Ivanov, Rumen; González, Efrén; Martínez, Geminiano

    2016-08-01

    Reliable and accurate testing methods are essential to guiding the polishing process during the figuring of optical telescope mirrors. With the natural advancement of technology, the procedures and instruments used to carry out this delicate task have consistently increased in sensitivity, but also in complexity and cost. Fortunately, throughout history, the Foucault knife-edge test has shown the potential to measure transverse aberrations in the order of the wavelength, mainly when described in terms of physical theory, which allows a quantitative interpretation of its characteristic shadowmaps. Our previous publication on this topic derived a closed mathematical formulation that directly relates the knife-edge position with the observed irradiance pattern. The present work addresses the largely unexplored problem of the wavefront's gradient estimation from experimental captures of the test, which is achieved by means of an optimization algorithm featuring a proposed ad hoc cost function. The partial derivatives thereby calculated are then integrated by means of a Fourier-based algorithm to retrieve the mirror's actual surface profile. To date and to the best of our knowledge, this is the very first time that a complete mathematically grounded treatment of this optical phenomenon is presented, complemented by an image-processing algorithm which allows a quantitative calculation of the corresponding slope at any given point of the mirror's surface, so that it becomes possible to accurately estimate the aberrations present in the analyzed concave device just through its associated foucaultgrams.
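
    The final integration step, recovering a surface from its estimated partial derivatives, is commonly done with the Frankot-Chellappa least-squares method in the Fourier domain; whether the authors use exactly this scheme is an assumption here, but it illustrates the class of Fourier-based algorithm named in the abstract:

        import numpy as np

        def integrate_gradient(gx, gy):
            """Least-squares integration of a gradient field via the FFT:
            recovers a surface z(x, y) whose gradients best match (gx, gy),
            up to an additive constant."""
            rows, cols = gx.shape
            u = 2.0 * np.pi * np.fft.fftfreq(cols)  # angular frequency, x
            v = 2.0 * np.pi * np.fft.fftfreq(rows)  # angular frequency, y
            U, V = np.meshgrid(u, v)
            GX, GY = np.fft.fft2(gx), np.fft.fft2(gy)
            denom = U ** 2 + V ** 2
            denom[0, 0] = 1.0                # avoid division by zero at DC
            Z = (-1j * U * GX - 1j * V * GY) / denom
            Z[0, 0] = 0.0                    # free constant of integration
            return np.real(np.fft.ifft2(Z))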

  16. Quantifying Methane Fluxes Simply and Accurately: The Tracer Dilution Method

    NASA Astrophysics Data System (ADS)

    Rella, Christopher; Crosson, Eric; Green, Roger; Hater, Gary; Dayton, Dave; Lafleur, Rick; Merrill, Ray; Tan, Sze; Thoma, Eben

    2010-05-01

    Methane is an important atmospheric constituent with a wide variety of sources, both natural and anthropogenic, including wetlands and other water bodies, permafrost, farms, landfills, and areas where significant petrochemical exploration, drilling, transport, processing, or refining occurs. Despite its importance to the carbon cycle, its significant impact as a greenhouse gas, and its ubiquity in modern life as a source of energy, its sources and sinks in marine and terrestrial ecosystems are only poorly understood. This is largely because high quality, quantitative measurements of methane fluxes in these different environments have not been available, due both to the lack of robust field-deployable instrumentation and to the fact that most significant sources of methane extend over large areas (from 10's to 1,000,000's of square meters) and are heterogeneous emitters - i.e., the methane is not emitted evenly over the area in question. Quantifying the total methane emissions from such sources becomes a tremendous challenge, compounded by the fact that atmospheric transport from emission point to detection point can be highly variable. In this presentation we describe a robust, accurate, and easy-to-deploy technique called the tracer dilution method, in which a known gas (such as acetylene, nitrous oxide, or sulfur hexafluoride) is released in the same vicinity of the methane emissions. Measurements of methane and the tracer gas are then made downwind of the release point, in the so-called far-field, where the area of methane emissions cannot be distinguished from a point source (i.e., the two gas plumes are well-mixed). In this regime, the methane emissions are given by the ratio of the two measured concentrations, multiplied by the known tracer emission rate. The challenges associated with atmospheric variability and heterogeneous methane emissions are handled automatically by the transport and dispersion of the tracer. We present detailed methane flux
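
    The far-field relation described above is a one-line computation once background-subtracted concentrations are available; the function name and numbers below are illustrative only:

        def methane_emission_rate(c_methane, c_tracer, q_tracer,
                                  bg_methane=0.0, bg_tracer=0.0):
            """Tracer dilution: in the well-mixed far field, the methane
            emission rate equals the tracer release rate times the ratio of
            the background-subtracted plume concentrations (molar units)."""
            return q_tracer * ((c_methane - bg_methane) /
                               (c_tracer - bg_tracer))

        # 2.5 ppm CH4 against a 1.9 ppm background, 0.06 ppm tracer above a
        # zero background, tracer released at 1.0 mol/min -> 10 mol/min CH4.
        print(methane_emission_rate(2.5, 0.06, 1.0, bg_methane=1.9))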

  17. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  18. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis? We hope that our guide to good practices for conducting and presenting bias analyses will encourage

  19. Accuracy of finite element analyses of CT scans in predictions of vertebral failure patterns under axial compression and anterior flexion.

    PubMed

    Jackman, Timothy M; DelMonaco, Alex M; Morgan, Elise F

    2016-01-25

    Finite element (FE) models built from quantitative computed tomography (QCT) scans can provide patient-specific estimates of bone strength and fracture risk in the spine. While prior studies demonstrate accurate QCT-based FE predictions of vertebral stiffness and strength, the accuracy of the predicted failure patterns, i.e., the locations where failure occurs within the vertebra and the way in which the vertebra deforms as failure progresses, is less clear. This study used digital volume correlation (DVC) analyses of time-lapse micro-computed tomography (μCT) images acquired during mechanical testing (compression and anterior flexion) of thoracic spine segments (T7-T9, n=28) to measure displacements occurring throughout the T8 vertebral body at the ultimate point. These displacements were compared to those simulated by QCT-based FE analyses of T8. We hypothesized that the FE predictions would be more accurate when the boundary conditions are based on measurements of pressure distributions within intervertebral discs of similar level of disc degeneration vs. boundary conditions representing rigid platens. The FE simulations captured some of the general, qualitative features of the failure patterns; however, displacement errors ranged 12-279%. Contrary to our hypothesis, no differences in displacement errors were found when using boundary conditions representing measurements of disc pressure vs. rigid platens. The smallest displacement errors were obtained using boundary conditions that were measured directly by DVC at the T8 endplates. These findings indicate that further work is needed to develop methods of identifying physiological loading conditions for the vertebral body, for the purpose of achieving robust, patient-specific FE analyses of failure mechanisms.

  20. Accuracy of finite element analyses of CT scans in predictions of vertebral failure patterns under axial compression and anterior flexion

    PubMed Central

    Jackman, Timothy M.; DelMonaco, Alex M.; Morgan, Elise F.

    2016-01-01

    Finite element (FE) models built from quantitative computed tomography (QCT) scans can provide patient-specific estimates of bone strength and fracture risk in the spine. While prior studies demonstrate accurate QCT-based FE predictions of vertebral stiffness and strength, the accuracy of the predicted failure patterns, i.e., the locations where failure occurs within the vertebra and the way in which the vertebra deforms as failure progresses, is less clear. This study used digital volume correlation (DVC) analyses of time-lapse micro-computed tomography (µCT) images acquired during mechanical testing (compression and anterior flexion) of thoracic spine segments (T7–T9, n = 28) to measure displacements occurring throughout the T8 vertebral body at the ultimate point. These displacements were compared to those simulated by QCT-based FE analyses of T8. We hypothesized that the FE predictions would be more accurate when the boundary conditions are based on measurements of pressure distributions within intervertebral discs of similar level of disc degeneration vs. boundary conditions representing rigid platens. The FE simulations captured some of the general, qualitative features of the failure patterns; however, displacement errors ranged 12–279%. Contrary to our hypothesis, no differences in displacement errors were found when using boundary conditions representing measurements of disc pressure vs. rigid platens. The smallest displacement errors were obtained using boundary conditions that were measured directly by DVC at the T8 endplates. These findings indicate that further work is needed to develop methods of identifying physiological loading conditions for the vertebral body, for the purpose of achieving robust, patient-specific FE analyses of failure mechanisms. PMID:26792288

  1. Quantitative non-destructive testing

    NASA Technical Reports Server (NTRS)

    Welch, C. S.

    1985-01-01

    The work undertaken during this period included two primary efforts. The first was a continuation of the previous year's theoretical development of models and data analyses for NDE using the Optical Thermal Infra-Red Measurement System (OPTITHIRMS), which involves heat injection with a laser and observation of the resulting thermal pattern with an infrared imaging system. The second was an investigation into the use of the thermoelastic effect as an effective tool for NDE. As in the past, the effort was aimed at NDE techniques applicable to composite materials in structural applications. The theoretical development produced several models of temperature patterns over several geometries and material types, and agreement between model data and temperature observations was obtained. A model study with one of these models investigated some fundamental difficulties with the proposed method (the primitive equation method) for obtaining diffusivity values in plates of finite thickness and supplied guidelines for avoiding these difficulties. A wide range of computing speeds was found among the various models, with a one-dimensional model based on Laplace's integral solution being both very fast and very accurate.
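
    For context, the surface response in this kind of laser-heating NDE is often modeled with the textbook one-dimensional solution for an instantaneous pulse on a semi-infinite solid. The sketch below evaluates that standard relation with generic composite-like property values; it is not the report's own model, and all numbers are assumptions.

    ```python
    # Sketch: 1-D surface temperature decay after an instantaneous laser pulse
    # on a semi-infinite solid: dT(t) = Q / (rho * c * sqrt(pi * alpha * t)).
    import numpy as np

    def surface_temperature(t, Q=1.0e4, rho=1600.0, c=1200.0, alpha=4e-7):
        """Temperature rise (K) at the heated surface; Q in J/m^2."""
        return Q / (rho * c * np.sqrt(np.pi * alpha * t))

    for t in (0.1, 0.5, 1.0, 2.0):  # seconds after the pulse
        print(f"t = {t:4.1f} s  ->  dT = {surface_temperature(t):6.2f} K")
    ```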

  2. Quantitative rainbow schlieren deflectometry

    NASA Technical Reports Server (NTRS)

    Greenberg, Paul S.; Klimek, Robert B.; Buchele, Donald R.

    1995-01-01

    In the rainbow schlieren apparatus, a continuously graded rainbow filter is placed in the back focal plane of the decollimating lens. Refractive-index gradients in the test section thus appear as gradations in hue rather than irradiance. A simple system is described wherein a conventional color CCD array and video digitizer are used to quantify accurately the color attributes of the resulting image, and hence the associated ray deflections. The present system provides a sensitivity comparable with that of conventional interferometry, while being simpler to implement and less sensitive to mechanical misalignment.
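
    A sketch of the central step, assuming a linear calibration between filter hue and ray deflection; the hue bounds and deflection range are hypothetical placeholders for an actual filter calibration.

    ```python
    # Sketch: mapping a pixel's hue to a ray deflection angle for a linearly
    # graded rainbow filter (calibration constants are made up).
    import colorsys

    def hue_to_deflection(r, g, b, hue_min=0.05, hue_max=0.85,
                          full_scale_mrad=2.0):
        """Deflection (milliradians) from 8-bit RGB via the filter's hue axis."""
        h, _, _ = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        frac = (h - hue_min) / (hue_max - hue_min)  # position along the filter
        return frac * full_scale_mrad

    print(f"{hue_to_deflection(200, 120, 40):.3f} mrad")  # one example pixel
    ```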

  3. Quantitative computed tomography assessment of lung structure and function in pulmonary emphysema.

    PubMed

    Madani, A; Keyzer, C; Gevenois, P A

    2001-10-01

    Accurate diagnosis and quantification of pulmonary emphysema during life is important to understand the natural history of the disease, to assess the extent of the disease, and to evaluate and follow up therapeutic interventions. Since pulmonary emphysema is defined through pathological criteria, new methods of diagnosis and quantification should be validated by comparison against histological references. Recent studies have addressed the capability of computed tomography (CT) to quantify pulmonary emphysema accurately. The studies reviewed in this article are based on CT scans obtained after deep inspiration or expiration, on subjective visual grading, and on objective measurements of attenuation values. Dedicated software was used for this purpose, providing numerical data from both two- and three-dimensional approaches and enabling comparison of CT data with pulmonary function tests. More recently, fractal and textural analyses have been applied to CT scans to assess the presence, extent, and types of emphysema. Quantitative CT has already been used in patient selection for the surgical treatment of pulmonary emphysema and in pharmacotherapeutic trials. However, despite numerous and extensive studies, the technique has not yet been standardized, and important questions about how best to use CT for the quantification of pulmonary emphysema remain unresolved.
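
    One widely used objective measurement of the kind reviewed here is the "density mask": the percentage of lung voxels below an attenuation threshold (-950 HU is a common inspiratory cutoff). The sketch below applies it to a synthetic volume; the threshold choice and data are illustrative.

    ```python
    # Sketch: density-mask emphysema index on a (synthetic) segmented lung volume.
    import numpy as np

    def emphysema_index(lung_hu, threshold_hu=-950.0):
        """Percent of lung voxels with attenuation below the threshold (HU)."""
        return 100.0 * np.mean(lung_hu < threshold_hu)

    rng = np.random.default_rng(1)
    lung = rng.normal(loc=-860.0, scale=50.0, size=100_000)  # mock HU values
    print(f"emphysema index: {emphysema_index(lung):.1f}%")
    ```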

  4. Multivariate Analysis and Quantitation of (17)O-NMR in Primary Alcohol Mixtures

    SciTech Connect

    Alam, M.Kathleen; Alam, Todd M.

    1999-07-01

    Multivariate techniques were used to address the quantification of {sup 17}O-NMR (nuclear magnetic resonance) spectra for a series of primary alcohol mixtures. Due to highly overlapping resonances, quantitative spectral evaluation using standard integration and deconvolution techniques proved difficult. Multivariate evaluation of the {sup 17}O-NMR spectral data obtained for 26 mixtures of five primary alcohols demonstrated that obtaining information about spectral overlap and interferences allowed the development of more accurate models. Initial partial least squares (PLS) models developed for the {sup 17}O-NMR data collected from the primary alcohol mixtures resulted in very poor precision, with signal overlap between the different chemical species suspected of being the primary contributor to the error. To directly evaluate the question of spectral overlap in these alcohol mixtures, net analyte signal (NAS) analyses were performed. The NAS results indicate that alcohols with similar chain lengths produced severely overlapping {sup 17}O-NMR resonances. Grouping the alcohols based on chain length allowed more accurate and robust calibration models to be developed.
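
    A sketch of the PLS calibration idea on synthetic overlapped spectra, with scikit-learn standing in for the original chemometrics software; the spectra, mixture design, and component count are fabricated for illustration.

    ```python
    # Sketch: PLS calibration of five component concentrations from strongly
    # overlapped synthetic spectra (26 mixtures, as in the study's design).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(2)
    n_mix, n_points, n_comp = 26, 400, 5
    C = rng.uniform(0, 1, size=(n_mix, n_comp))            # known concentrations
    centers = np.linspace(80, 320, n_comp)                 # overlapping peak centers
    peaks = np.exp(-0.5 * ((np.arange(n_points)[:, None] - centers) / 25.0) ** 2)
    X = C @ peaks.T + rng.normal(scale=0.01, size=(n_mix, n_points))

    pls = PLSRegression(n_components=5).fit(X, C)
    print("calibration R^2:", round(pls.score(X, C), 3))
    ```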

  5. Quantitative imaging with a mobile phone microscope.

    PubMed

    Skandarajah, Arunan; Reber, Clay D; Switz, Neil A; Fletcher, Daniel A

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone-based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications.
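
    As one example of the corrections discussed, the sketch below undoes a camera's nonlinear tone response under a simple gamma assumption before intensities are compared quantitatively. Real phone pipelines may apply proprietary curves, so the exponent is an assumption to be calibrated per device.

    ```python
    # Sketch: approximate linearization of 8-bit pixel values (gamma model).
    import numpy as np

    def linearize(img_8bit, gamma=2.2):
        """Map 8-bit values to approximately linear intensity in [0, 1]."""
        return (np.asarray(img_8bit, dtype=np.float64) / 255.0) ** gamma

    patch = np.array([[64, 128], [192, 255]], dtype=np.uint8)
    print(linearize(patch))  # linear-light estimates suitable for ratios
    ```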

  7. Accurate Quantification of Lipid Species by Electrospray Ionization Mass Spectrometry — Meets a Key Challenge in Lipidomics

    PubMed Central

    Yang, Kui; Han, Xianlin

    2011-01-01

    Electrospray ionization mass spectrometry (ESI-MS) has become one of the most popular and powerful technologies for identifying and quantifying individual lipid species in lipidomics. At the same time, accurate quantification of lipid species by ESI-MS remains one of the major challenges of the field. Herein, we discuss the principles, advantages, and possible limitations of different mass spectrometry-based methodologies for lipid quantification, as well as a few practical issues important for accurate quantification of individual lipid species. With these considerations, accurate quantification of individual lipid species, one of the key challenges in lipidomics, can be practically achieved. PMID:22905337
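
    A sketch of the single-point internal-standard calculation that underlies much ESI-MS lipid quantification, with hypothetical intensities and concentration; any response-factor correction would be determined experimentally.

    ```python
    # Sketch: quantitation from the analyte/internal-standard intensity ratio.
    def analyte_conc(i_analyte, i_istd, istd_conc_um, response_factor=1.0):
        """Analyte concentration (uM), assuming comparable ionization response."""
        return response_factor * (i_analyte / i_istd) * istd_conc_um

    print(f"{analyte_conc(4.2e5, 2.1e5, istd_conc_um=10.0):.1f} uM")
    ```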

  8. Validating quantitative precipitation forecast for the Flood Meteorological Office, Patna region during 2011-2014

    NASA Astrophysics Data System (ADS)

    Giri, R. K.; Panda, Jagabandhu; Rath, Sudhansu S.; Kumar, Ravindra

    2016-06-01

    In order to issue accurate flood warnings, better or more appropriate quantitative forecasting of precipitation is required. In view of this, the present study validates the quantitative precipitation forecast (QPF) issued during the southwest monsoon season for six river catchments (basins) under the Flood Meteorological Office, Patna region. The forecast is analysed statistically by computing various skill scores for six different precipitation ranges during the years 2011-2014. The analysis of the QPF validation indicates that multi-model ensemble (MME) based forecasting is more reliable in the precipitation ranges of 1-10 and 11-25 mm. However, the reliability decreases for higher ranges of rainfall and also for the lowest range, i.e., below 1 mm. To test the synoptic analogue method based MME forecasting of QPF during an extreme weather event, a case study of tropical cyclone Phailin was performed. It was found that for extreme events such as cyclonic storms, MME forecasting is qualitatively useful for issuing flood warnings, even though it may not be reliable for the QPF itself. However, the QPF may be improved using satellite and radar products.
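
    A sketch of common contingency-table skill scores used in QPF verification; the hit, miss, and false-alarm counts are invented, and the study's exact score set is not reproduced here.

    ```python
    # Sketch: probability of detection, false-alarm ratio and critical success
    # index from a 2x2 forecast contingency table.
    def skill_scores(hits, misses, false_alarms):
        pod = hits / (hits + misses)                 # probability of detection
        far = false_alarms / (hits + false_alarms)   # false-alarm ratio
        csi = hits / (hits + misses + false_alarms)  # critical success index
        return pod, far, csi

    pod, far, csi = skill_scores(hits=42, misses=18, false_alarms=11)
    print(f"POD={pod:.2f}  FAR={far:.2f}  CSI={csi:.2f}")
    ```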

  9. Quantitation of fixative-induced morphologic and antigenic variation in mouse and human breast cancers.

    PubMed

    Cardiff, Robert D; Hubbard, Neil E; Engelberg, Jesse A; Munn, Robert J; Miller, Claramae H; Walls, Judith E; Chen, Jane Q; Velásquez-García, Héctor A; Galvez, Jose J; Bell, Katie J; Beckett, Laurel A; Li, Yue-Ju; Borowsky, Alexander D

    2013-04-01

    Quantitative Image Analysis (QIA) of digitized whole slide images for morphometric parameters and immunohistochemistry of breast cancer antigens was used to evaluate the technical reproducibility, biological variability, and intratumoral heterogeneity in three transplantable mouse mammary tumor models of human breast cancer. The relative preservation of structure and immunogenicity of the three mouse models and three human breast cancers was also compared when fixed with representatives of four distinct classes of fixatives. The three mouse mammary tumor cell models were an ER+/PR+ model (SSM2), a Her2+ model (NDL), and a triple negative model (MET1). The four breast cancer antigens were ER, PR, Her2, and Ki67. The fixatives included examples of (1) strong cross-linkers, (2) weak cross-linkers, (3) coagulants, and (4) combination fixatives. Each parameter was quantitatively analyzed using modified Aperio Technologies ImageScope algorithms. Careful pre-analytical adjustments to the algorithms were required to provide accurate results. The QIA permitted rigorous statistical analysis of results and grading by rank order. The analyses suggested excellent technical reproducibility and confirmed biological heterogeneity within each tumor. The strong cross-linker fixatives, such as formalin, consistently ranked higher than weak cross-linker, coagulant and combination fixatives in both the morphometric and immunohistochemical parameters.

  10. Evaluation of the quantitative performances of supercritical fluid chromatography: from method development to validation.

    PubMed

    Dispas, Amandine; Lebrun, Pierre; Ziemons, Eric; Marini, Roland; Rozet, Eric; Hubert, Philippe

    2014-08-01

    Recently, the number of papers on SFC has increased drastically, but relatively little of this work has focused on the quantitative performance of the technique. To demonstrate the potential of UHPSFC, the present work discusses the different steps of the analytical life cycle of a method, from development to validation and application. Moreover, the quantitative performance of UHPSFC was evaluated in comparison with UHPLC, which is the main technique used for quality control in the pharmaceutical industry and can therefore be considered a reference. The methods were developed using a Design Space strategy, leading to the optimization of robust methods. In this context, when the Design Space optimization guarantees quality, no further robustness study is required prior to validation. The methods were then geometrically transferred in order to reduce the analysis time. The UHPSFC and UHPLC methods were validated based on the total error approach using accuracy profiles. Although UHPLC showed better precision and sensitivity, the UHPSFC method gives accurate results over a dosing range larger than the 80-120% range required by the European Medicines Agency. Consequently, UHPSFC results are valid and can be used for the control of the active substance in a finished pharmaceutical product. Finally, the validated UHPSFC method was used to analyse real samples and gave results similar to those of the reference method (UHPLC).
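
    The two quantities at the heart of an accuracy profile are relative bias and precision at each concentration level. A minimal sketch with synthetic replicates follows; the total-error acceptance limits used in validation are not reproduced here.

    ```python
    # Sketch: relative bias and RSD at one concentration level of a validation
    # series (replicate values are synthetic).
    import numpy as np

    nominal = 100.0  # percent of target concentration
    measured = np.array([98.7, 101.2, 99.5, 100.8, 99.1, 100.3])

    bias_pct = 100.0 * (measured.mean() - nominal) / nominal
    rsd_pct = 100.0 * measured.std(ddof=1) / measured.mean()
    print(f"relative bias = {bias_pct:+.2f}%   RSD = {rsd_pct:.2f}%")
    ```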

  11. A Quantitative System-Scale Characterization of the Metabolism of Clostridium acetobutylicum

    PubMed Central

    Yoo, Minyeong; Bestel-Corre, Gwenaelle; Croux, Christian; Riviere, Antoine; Meynial-Salles, Isabelle

    2015-01-01

    Engineering industrial microorganisms for ambitious applications, for example, the production of second-generation biofuels such as butanol, is impeded by a lack of knowledge of primary metabolism and its regulation. A quantitative system-scale analysis was applied to the biofuel-producing bacterium Clostridium acetobutylicum, a microorganism used for the industrial production of solvent. An improved genome-scale model, iCac967, was first developed based on thorough biochemical characterizations of 15 key metabolic enzymes and on extensive literature analysis to acquire accurate fluxomic data. In parallel, quantitative transcriptomic and proteomic analyses were performed to assess the number of mRNA molecules per cell for all genes under acidogenic, solventogenic, and alcohologenic steady-state conditions as well as the number of cytosolic protein molecules per cell for approximately 700 genes under at least one of the three steady-state conditions. A complete fluxomic, transcriptomic, and proteomic analysis applied to different metabolic states allowed us to better understand the regulation of primary metabolism. Moreover, this analysis enabled the functional characterization of numerous enzymes involved in primary metabolism, including (i) the enzymes involved in the two different butanol pathways and their cofactor specificities, (ii) the primary hydrogenase and its redox partner, (iii) the major butyryl coenzyme A (butyryl-CoA) dehydrogenase, and (iv) the major glyceraldehyde-3-phosphate dehydrogenase. This study provides important information for further metabolic engineering of C. acetobutylicum to develop a commercial process for the production of n-butanol. PMID:26604256

  12. Protein Quantitation of the Developing Cochlea Using Mass Spectrometry.

    PubMed

    Darville, Lancia N F; Sokolowski, Bernd H A

    2016-01-01

    Mass spectrometry-based proteomics allows for the measurement of hundreds to thousands of proteins in a biological system, and it can also be used to quantify proteins and peptides. However, observing quantitative differences between biological systems with this approach can be challenging, because it is critical to have a method that is fast, reproducible, and accurate. To study differential protein expression in biological samples, either labeling or label-free quantitative methods can be used. Labeling methods have been widely used in quantitative proteomics, but label-free methods have become equally popular and are often preferred because they produce faster, cleaner, and simpler results. Here, we describe the methods by which proteins are isolated and identified from cochlear sensory epithelia tissues at different ages and quantitatively differentiated using label-free mass spectrometry.

  13. A predictable and accurate technique with elastomeric impression materials.

    PubMed

    Barghi, N; Ontiveros, J C

    1999-08-01

    A method for obtaining more predictable and accurate final impressions with polyvinylsiloxane impression materials in conjunction with stock trays is proposed and tested. Heavy impression material is used in advance for construction of a modified custom tray, while extra-light material is used for obtaining a more accurate final impression.

  14. Tube dimpling tool assures accurate dip-brazed joints

    NASA Technical Reports Server (NTRS)

    Beuyukian, C. S.; Heisman, R. M.

    1968-01-01

    Portable, hand-held dimpling tool assures accurate brazed joints between tubes of different diameters. Prior to brazing, the tool performs precise dimpling and nipple forming and also provides control and accurate measuring of the height of nipples and depth of dimples so formed.

  15. Considerations when quantitating protein abundance by immunoblot.

    PubMed

    McDonough, Alicia A; Veiras, Luciana C; Minas, Jacqueline N; Ralph, Donna Lee

    2015-03-15

    The development of the immunoblot to detect and characterize a protein with an antisera, even in a crude mixture, was a breakthrough with wide-ranging and unpredictable applications across physiology and medicine. Initially, this technique was viewed as a tool for qualitative, not quantitative, analyses of proteins because of the high number of variables between sample preparation and detection with antibodies. Nonetheless, as the immunoblot method was streamlined and improved, investigators pushed it to quantitate protein abundance in unpurified samples as a function of treatment, genotype, or pathology. This short review, geared at investigators, reviewers, and critical readers, presents a set of issues that are of critical importance for quantitative analysis of protein abundance: 1) Consider whether tissue samples are of equivalent integrity and assess how handling between collection and assay influences the apparent relative abundance. 2) Establish the specificity of the antiserum for the protein of interest by providing clear images, molecular weight markers, positive and negative controls, and vendor details. 3) Provide convincing evidence for linearity of the detection system by assessing signal density as a function of sample loaded. 4) Recognize that loading control proteins are rarely in the same linear range of detection as the protein of interest; consider protein staining of the gel or blot. In summary, with careful attention to sample integrity, antibody specificity, linearity of the detection system, and acceptable loading controls, investigators can implement quantitative immunoblots to convincingly assess protein abundance in their samples.
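
    Point 3 above lends itself to a short worked example: regress signal density on protein loaded and inspect the fit. The dilution series below is invented; a fall-off in linearity at high loads is the signature of detector saturation.

    ```python
    # Sketch: linearity check for an immunoblot dilution series.
    import numpy as np

    load_ug = np.array([2.5, 5.0, 10.0, 20.0, 40.0])        # protein per lane
    signal = np.array([1.1e4, 2.3e4, 4.4e4, 8.1e4, 1.2e5])  # densitometry units

    slope, intercept = np.polyfit(load_ug, signal, 1)
    pred = slope * load_ug + intercept
    r2 = 1 - ((signal - pred) ** 2).sum() / ((signal - signal.mean()) ** 2).sum()
    print(f"R^2 = {r2:.3f}")  # saturation at high loads depresses R^2
    ```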

  16. A Proteomic Study of the HUPO Plasma Proteome Project's Pilot Samples using an Accurate Mass and Time Tag Strategy

    SciTech Connect

    Adkins, Joshua N.; Monroe, Matthew E.; Auberry, Kenneth J.; Shen, Yufeng; Jacobs, Jon M.; Camp, David G.; Vitzthum, Frank; Rodland, Karin D.; Zangar, Richard C.; Smith, Richard D.; Pounds, Joel G.

    2005-08-01

    Characterization of the human blood plasma proteome is critical to the discovery of routinely useful clinical biomarkers. We used an Accurate Mass and Time (AMT) tag strategy with high-resolution, high-mass-accuracy capillary liquid chromatography Fourier-transform ion cyclotron resonance mass spectrometry (cLC-FTICR MS) to perform a global proteomic analysis of pilot study samples as part of the HUPO Plasma Proteome Project. HUPO reference serum and citrated plasma samples from African Americans, Asian Americans, and Caucasian Americans were analyzed, in addition to a Pacific Northwest National Laboratory reference serum and plasma. The AMT tag strategy allowed us to leverage two previously published “shotgun” proteomics experiments to perform global analyses on these samples in triplicate in less than 4 days of total analysis time. A total of 722 (22% with multiple peptide identifications) International Protein Index (IPI) redundant proteins, or 377 protein families by ProteinProphet, were identified across the 6 individual HUPO serum and plasma samples. The plasma samples yielded a similar number of identified redundant proteins (average 446 +/- 23) as the serum samples (average 440 +/- 20). These proteins were identified by an average of 956 +/- 35 unique peptides in plasma and 930 +/- 11 unique peptides in serum. In addition to this high-throughput analysis, the AMT tag approach was used with a Z-score normalization to compare relative protein abundances. This analysis highlighted both known differences between serum and citrated plasma, such as fibrinogens, and reproducible differences in peptide abundances from proteins such as soluble activin receptor-like kinase 7b and glycoprotein m6b. The AMT tag strategy not only improved our sample throughput but also provided a basis for estimated quantitation.
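
    One plausible form of the Z-score normalization mentioned above: log-transform peptide abundances, then standardize each peptide across analyses so that relative differences are comparable. The matrix is synthetic, and the paper's exact normalization details are not reproduced.

    ```python
    # Sketch: per-peptide Z-scores of log10 abundances across replicate analyses.
    import numpy as np

    abund = np.array([[5.2e6, 4.8e6, 5.5e6],   # rows: peptides
                      [1.1e5, 0.9e5, 1.3e5],   # cols: analyses
                      [7.7e7, 8.1e7, 7.4e7]])
    log_a = np.log10(abund)
    z = (log_a - log_a.mean(axis=1, keepdims=True)) \
        / log_a.std(axis=1, ddof=1, keepdims=True)
    print(z.round(2))
    ```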

  17. Using GPS To Teach More Than Accurate Positions.

    ERIC Educational Resources Information Center

    Johnson, Marie C.; Guth, Peter L.

    2002-01-01

    Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…

  18. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, one empirically based and one atmospheric-model based, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.
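
    A sketch of linear spectral unmixing as named above: solve for non-negative endmember fractions that best reproduce a pixel spectrum. The 224-band endmember spectra here are random placeholders rather than real mineral libraries.

    ```python
    # Sketch: per-pixel abundance estimation by non-negative least squares.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(5)
    n_bands, n_end = 224, 4
    E = rng.uniform(0.1, 0.9, size=(n_bands, n_end))  # endmember spectra (mock)
    true_f = np.array([0.5, 0.3, 0.2, 0.0])           # true abundances
    pixel = E @ true_f + rng.normal(scale=0.005, size=n_bands)

    fractions, _ = nnls(E, pixel)                     # constrained unmixing
    print("estimated abundances:", fractions.round(3))
    ```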

  19. Quantitative Genetics in the Genomics Era

    PubMed Central

    Hill, William G.

    2012-01-01

    The genetic analysis of quantitative or complex traits has been based mainly on statistical quantities such as genetic variances and heritability. These analyses continue to be developed, for example in studies of natural populations. Genomic methods are having an impact on progress and prospects. Actual relationships of individuals can be estimated enabling novel quantitative analyses. Increasing precision of linkage mapping is feasible with dense marker panels and designed stocks allowing multiple generations of recombination, and large SNP panels enable the use of genome wide association analysis utilising historical recombination. Whilst such analyses are identifying many loci for disease genes and traits such as height, typically each individually contributes a small amount of the variation. Only by fitting all SNPs without regard to significance can a high proportion be accounted for, so a classical polygenic model with near infinitesimally small effects remains a useful one. Theory indicates that a high proportion of variants will have low minor allele frequency, making detection difficult. Genomic selection, based on simultaneously fitting very dense markers and incorporating these with phenotypic data in breeding value prediction is revolutionising breeding programmes in agriculture and has a major potential role in human disease prediction. PMID:23115521
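
    The "fit all SNPs without regard to significance" idea is commonly implemented as ridge regression on markers (SNP-BLUP). A toy sketch with simulated genotypes and many small effects follows; the shrinkage parameter and data are arbitrary.

    ```python
    # Sketch: genomic prediction by ridge regression over all markers at once.
    import numpy as np

    rng = np.random.default_rng(3)
    n, p = 200, 1000
    X = rng.binomial(2, 0.3, size=(n, p)).astype(float)  # SNP genotypes 0/1/2
    beta = rng.normal(scale=0.05, size=p)                # many tiny true effects
    y = X @ beta + rng.normal(scale=1.0, size=n)         # simulated phenotypes

    lam = 50.0                                           # shrinkage strength
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
    gebv = X @ beta_hat                                  # genomic breeding values
    print("corr(GEBV, phenotype):", round(np.corrcoef(gebv, y)[0, 1], 3))
    ```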

  20. Analysing the ventricular fibrillation waveform.

    PubMed

    Reed, Matthew J; Clegg, Gareth R; Robertson, Colin E

    2003-04-01

    The surface electrocardiogram associated with ventricular fibrillation has been of interest to researchers for some time. Over the last few decades, techniques have been developed to analyse this signal in an attempt to obtain more information about the state of the myocardium and the chances of successful defibrillation. This review looks at the implications of analysing the VF waveform and discusses the various techniques that have been used, including fast Fourier transform analysis, wavelet transform analysis and mathematical techniques such as chaos theory.
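
    One of the simplest measures derived from the fast Fourier transform analysis mentioned above is the dominant frequency of the waveform. The sketch below estimates it for a synthetic VF-like signal; the band limits and sampling rate are assumptions.

    ```python
    # Sketch: dominant-frequency estimate of a VF-like trace via the FFT.
    import numpy as np

    fs = 250.0                                 # sampling rate, Hz
    t = np.arange(0, 8.0, 1.0 / fs)
    rng = np.random.default_rng(6)
    ecg = np.sin(2 * np.pi * 5.3 * t) + 0.3 * rng.normal(size=t.size)

    power = np.abs(np.fft.rfft(ecg * np.hanning(t.size))) ** 2
    freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)
    band = (freqs >= 2.0) & (freqs <= 10.0)    # plausible VF band
    print(f"dominant frequency: {freqs[band][power[band].argmax()]:.2f} Hz")
    ```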

  1. Nonspectroscopic imaging for quantitative chlorophyll sensing

    NASA Astrophysics Data System (ADS)

    Kim, Taehoon; Kim, Jeong-Im; Visbal-Onufrak, Michelle A.; Chapple, Clint; Kim, Young L.

    2016-01-01

    Nondestructive imaging of physiological changes in plants has been intensively used as an invaluable tool for visualizing heterogeneous responses to various types of abiotic and biotic stress. However, conventional approaches often have intrinsic limitations for quantitative analyses, requiring bulky and expensive optical instruments for capturing full spectral information. We report a spectrometerless (or spectrometer-free) reflectance imaging method that allows for nondestructive and quantitative chlorophyll imaging in individual leaves in situ in a handheld device format. The combination of a handheld-type imaging system and a hyperspectral reconstruction algorithm from an RGB camera offers simple instrumentation and operation while avoiding the use of an imaging spectrograph or tunable color filter. This platform could potentially be integrated into a compact, inexpensive, and portable system, while being of great value in high-throughput phenotyping facilities and laboratory settings.
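
    A much-simplified stand-in for the hyperspectral reconstruction step: learn a linear RGB-to-spectrum mapping from training pairs by least squares. Real reconstruction algorithms are more sophisticated; the camera response and spectra below are random placeholders.

    ```python
    # Sketch: linear least-squares reconstruction of spectra from RGB triplets.
    import numpy as np

    rng = np.random.default_rng(4)
    n_train, n_bands = 500, 31
    S = rng.uniform(size=(n_train, n_bands))     # training spectra (mock)
    M = rng.uniform(size=(n_bands, 3))           # camera response: bands -> RGB
    RGB = S @ M                                  # simulated camera measurements

    W, *_ = np.linalg.lstsq(RGB, S, rcond=None)  # 3 x n_bands mapping
    rmse = float(np.sqrt(((RGB @ W - S) ** 2).mean()))
    print("reconstruction RMSE:", round(rmse, 4))
    ```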

  2. The accurate assessment of small-angle X-ray scattering data

    SciTech Connect

    Grant, Thomas D.; Luft, Joseph R.; Carter, Lester G.; Matsui, Tsutomu; Weiss, Thomas M.; Martel, Anne; Snell, Edward H.

    2015-01-01

    A set of quantitative techniques is suggested for assessing SAXS data quality. These are applied in the form of a script, SAXStats, to a test set of 27 proteins, showing that these techniques are more sensitive than manual assessment of data quality. Small-angle X-ray scattering (SAXS) has grown in popularity in recent times with the advent of bright synchrotron X-ray sources, powerful computational resources and algorithms enabling the calculation of increasingly complex models. However, the lack of standardized data-quality metrics presents difficulties for the growing user community in accurately assessing the quality of experimental SAXS data. Here, a series of metrics for describing SAXS data quantitatively and objectively using statistical evaluations is defined. These metrics are applied to identify the effects of radiation damage, concentration dependence and interparticle interactions on SAXS data from a set of 27 previously described targets for which high-resolution structures have been determined via X-ray crystallography or nuclear magnetic resonance (NMR) spectroscopy. The studies show that these metrics are sufficient to characterize SAXS data quality on a small sample set with statistical rigor and sensitivity similar to or better than manual analysis. The development of data-quality analysis strategies such as these initial efforts is needed to enable the accurate and unbiased assessment of SAXS data quality.

  3. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.
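
    The technical variability quoted above is a coefficient of variation across replicate injections; a sketch with synthetic peptide intensities follows.

    ```python
    # Sketch: per-peptide technical CV across three replicate injections.
    import numpy as np

    reps = np.array([[2.1e5, 2.3e5, 2.0e5],   # rows: peptides
                     [8.4e4, 7.9e4, 9.1e4],   # cols: injections
                     [5.5e6, 5.2e6, 5.8e6]])
    cv_pct = 100.0 * reps.std(axis=1, ddof=1) / reps.mean(axis=1)
    print("technical CV (%):", cv_pct.round(1))
    ```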

  4. Nonlinear shell analyses of the space shuttle solid rocket boosters

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, Ronnie E.; Nemeth, Michael P.

    1989-01-01

    A variety of structural analyses have been performed on the Solid Rocket Boosters (SRB's) to provide information that would contribute to the understanding of the failure which destroyed the Space Shuttle Challenger. This paper describes nonlinear shell analyses that were performed to characterize the behavior of an overall SRB structure and a segment of the SRB in the vicinity of the External Tank Attachment (ETA) ring. Shell finite element models were used that would accurately reflect the global load transfer in an SRB in a manner such that nonlinear shell collapse and ovalization could be assessed. The purpose of these analyses was to calculate the overall deflection and stress distributions for these SRB models when subjected to mechanical loads corresponding to critical times during the launch sequence. Static analyses of these SRB models were performed using a snapshot picture of the loads. Analytical results obtained using these models show no evidence of nonlinear shell collapse for the pre-liftoff loading cases considered.