On Quantitative Comparative Research in Communication and Language Evolution
Oller, D. Kimbrough; Griebel, Ulrike
2014-01-01
Quantitative comparison of human language and natural animal communication requires improved conceptualizations. We argue that an infrastructural approach to development and evolution incorporating an extended interpretation of the distinctions among illocution, perlocution, and meaning (Austin 1962; Oller and Griebel 2008) can help place the issues relevant to quantitative comparison in perspective. The approach can illuminate the controversy revolving around the notion of functional referentiality as applied to alarm calls, for example in the vervet monkey. We argue that referentiality offers a poor point of quantitative comparison across language and animal communication in the wild. Evidence shows that even newborn human cry could be deemed to show functional referentiality according to the criteria typically invoked by advocates of referentiality in animal communication. Exploring the essence of the idea of illocution, we illustrate an important realm of commonality among animal communication systems and human language, a commonality that opens the door to more productive, quantifiable comparisons. Finally, we delineate two examples of infrastructural communicative capabilities that should be particularly amenable to direct quantitative comparison across humans and our closest relatives. PMID:25285057
The Local Geometry of Multiattribute Tradeoff Preferences
McGeachie, Michael; Doyle, Jon
2011-01-01
Existing representations for multiattribute ceteris paribus preference statements have provided useful treatments and clear semantics for qualitative comparisons, but have not provided similarly clear representations or semantics for comparisons involving quantitative tradeoffs. We use directional derivatives and other concepts from elementary differential geometry to interpret conditional multiattribute ceteris paribus preference comparisons that state bounds on quantitative tradeoff ratios. This semantics extends the familiar economic notion of marginal rate of substitution to multiple continuous or discrete attributes. The same geometric concepts also provide means for interpreting statements about the relative importance of different attributes. PMID:21528018
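The tradeoff semantics described above can be sketched in standard economic notation (a simplified rendering, not the paper's own formalism, which uses directional derivatives of a value function; the symbols u and r here are illustrative):

```latex
% Marginal rate of substitution between attributes x_i and x_j
% for a differentiable value function u:
\mathrm{MRS}_{ij}(x) \;=\; \frac{\partial u/\partial x_i}{\partial u/\partial x_j}
% A quantitative tradeoff statement "one unit of x_i is worth at
% least r units of x_j" then becomes a bound on this ratio:
\mathrm{MRS}_{ij}(x) \;\ge\; r
```

The geometric reading is that such a bound constrains the direction of the preference gradient at x, which is what lets the same machinery handle relative-importance statements.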
Comparisons between wave directional spectra from SAR and pressure sensor arrays
NASA Technical Reports Server (NTRS)
Pawka, S. S.; Inman, D. L.; Hsiao, S. V.; Shemdin, O. H.
1980-01-01
Simultaneous directional wave measurements were made at Torrey Pines Beach, California, by a synthetic aperture radar (SAR) and a linear array of pressure sensors. The measurements were conducted during the West Coast Experiment in March 1977. Quantitative comparisons of the normalized directional spectra from the two systems were made for wave periods of 6.9-17.0 s. The comparison results were variable but generally showed good agreement of the primary mode of the normalized directional energy. An attempt was made to quantify the physical criteria for good wave imaging in the SAR. A frequency band analysis of wave parameters such as band energy, slope, and orbital velocity did not show good correlation with the directional comparisons. It is noted that absolute values of the wave height spectrum cannot be derived from the SAR images yet and, consequently, no comparisons of absolute energy levels with corresponding array measurements were intended.
Indirect scaling methods for testing quantitative emotion theories.
Junge, Martin; Reisenzein, Rainer
2013-01-01
Two studies investigated the utility of indirect scaling methods, based on graded pair comparisons, for the testing of quantitative emotion theories. In Study 1, we measured the intensity of relief and disappointment caused by lottery outcomes, and in Study 2, the intensity of disgust evoked by pictures, using both direct intensity ratings and graded pair comparisons. The stimuli were systematically constructed to reflect variables expected to influence the intensity of the emotions according to theoretical models of relief/disappointment and disgust, respectively. Two probabilistic scaling methods were used to estimate scale values from the pair comparison judgements: Additive functional measurement (AFM) and maximum likelihood difference scaling (MLDS). The emotion models were fitted to the direct and indirect intensity measurements using nonlinear regression (Study 1) and analysis of variance (Study 2). Both studies found substantially improved fits of the emotion models for the indirectly determined emotion intensities, with their advantage being evident particularly at the level of individual participants. The results suggest that indirect scaling methods yield more precise measurements of emotion intensity than rating scales and thereby provide stronger tests of emotion theories in general and quantitative emotion theories in particular.
Testing process predictions of models of risky choice: a quantitative model comparison approach
Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard
2013-01-01
This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472
Yu, Clinton; Huszagh, Alexander; Viner, Rosa; Novitsky, Eric J; Rychnovsky, Scott D; Huang, Lan
2016-10-18
Cross-linking mass spectrometry (XL-MS) represents a recently popularized hybrid methodology for defining protein-protein interactions (PPIs) and analyzing structures of large protein assemblies. In particular, XL-MS strategies have been demonstrated to be effective in elucidating molecular details of PPIs at the peptide resolution, providing a complementary set of structural data that can be utilized to refine existing complex structures or direct de novo modeling of unknown protein structures. To study structural and interaction dynamics of protein complexes, quantitative cross-linking mass spectrometry (QXL-MS) strategies based on isotope-labeled cross-linkers have been developed. Although successful, these approaches are mostly limited to pairwise comparisons. In order to establish a robust workflow enabling comparative analysis of multiple cross-linked samples simultaneously, we have developed a multiplexed QXL-MS strategy, namely, QMIX (Quantitation of Multiplexed, Isobaric-labeled cross (X)-linked peptides) by integrating MS-cleavable cross-linkers with isobaric labeling reagents. This study has established a new analytical platform for quantitative analysis of cross-linked peptides, which can be directly applied for multiplexed comparisons of the conformational dynamics of protein complexes and PPIs at the proteome scale in future studies.
Identification of common coexpression modules based on quantitative network comparison.
Jo, Yousang; Kim, Sanghyeon; Lee, Doheon
2018-06-13
Finding common molecular interactions across different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, no quantitative comparison method exists for coexpression networks, and previous methods developed for other network types cannot be applied to them. Therefore, we aimed to propose quantitative comparison methods for coexpression networks and to find common biological mechanisms between Huntington's disease and brain aging using the new methods. We proposed two similarity measures for quantitative comparison of coexpression networks. Then, we performed experiments using known coexpression networks. We showed the validity of the two measures and evaluated threshold values for similar coexpression network pairs from the experiments. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. We identified similar Huntington's disease-aging coexpression module pairs and found that these modules are related to brain development, cell death, and immune response. This suggests that up-regulated, cell-signalling-related cell death and immune/inflammation responses may be common molecular mechanisms in the pathophysiology of HD and normal brain aging in the frontal cortex.
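The abstract does not spell out the two similarity measures, so the following is only an illustrative sketch: one simple way to compare two coexpression networks is the Jaccard similarity of their thresholded edge sets. The function names, the 0.8 correlation cutoff, and the toy data are all assumptions, not the paper's method.

```python
import numpy as np

def coexpression_edges(expr, threshold=0.8):
    """Edge set of a coexpression network: gene pairs whose absolute
    Pearson correlation across samples exceeds `threshold`."""
    corr = np.corrcoef(expr)            # genes x genes correlation matrix
    n = corr.shape[0]
    return {(i, j) for i in range(n) for j in range(i + 1, n)
            if abs(corr[i, j]) > threshold}

def jaccard_similarity(edges_a, edges_b):
    """Shared edges over all edges -- one simple network similarity."""
    union = edges_a | edges_b
    return len(edges_a & edges_b) / len(union) if union else 1.0

# Toy data: 4 genes x 6 samples; genes 0 and 1 co-vary strongly.
rng = np.random.default_rng(0)
base = rng.normal(size=6)
expr1 = np.vstack([base, base + 0.01 * rng.normal(size=6),
                   rng.normal(size=6), rng.normal(size=6)])
expr2 = expr1 + 0.01 * rng.normal(size=(4, 6))
sim = jaccard_similarity(coexpression_edges(expr1), coexpression_edges(expr2))
```

A real analysis would also need a null model to decide when a similarity value is high enough to call two modules "similar," which is what the paper's evaluated thresholds provide.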
Lipiäinen, Tiina; Fraser-Miller, Sara J; Gordon, Keith C; Strachan, Clare J
2018-02-05
This study considers the potential of low-frequency (terahertz) Raman spectroscopy in the quantitative analysis of ternary mixtures of solid-state forms. Direct comparison between low-frequency and mid-frequency spectral regions for quantitative analysis of crystal form mixtures, without confounding sampling and instrumental variations, is reported for the first time. Piroxicam was used as a model drug, and the low-frequency spectra of piroxicam forms β, α2 and monohydrate are presented for the first time. These forms show clear spectral differences in both the low- and mid-frequency regions. Both spectral regions provided quantitative models suitable for predicting the mixture compositions using partial least squares regression (PLSR), but the low-frequency data gave better models, based on lower errors of prediction (2.7, 3.1 and 3.2% root-mean-square errors of prediction [RMSEP] values for the β, α2 and monohydrate forms, respectively) than the mid-frequency data (6.3, 5.4 and 4.8%, for the β, α2 and monohydrate forms, respectively). The better performance of low-frequency Raman analysis was attributed to larger spectral differences between the solid-state forms, combined with a higher signal-to-noise ratio.
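As a rough sketch of the calibration workflow above: mixture spectra are modeled as linear combinations of pure-form spectra, a regression model maps spectra to composition, and RMSEP is computed the standard way. Ordinary least squares stands in here for the PLSR the authors actually used, and the spectra, noise level, and dimensions are invented for illustration.

```python
import numpy as np

def rmsep(y_true, y_pred):
    """Root-mean-square error of prediction (here in fraction units;
    the abstract reports it as % composition)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Toy calibration set: mixture spectra as linear combinations of three
# pure-form spectra plus noise. OLS replaces PLSR for brevity.
rng = np.random.default_rng(1)
pure = rng.random((3, 10))                  # 3 solid-state forms x 10 bands
C_train = rng.dirichlet(np.ones(3), 40)     # 40 mixtures, fractions sum to 1
X_train = C_train @ pure + 0.01 * rng.normal(size=(40, 10))
B, *_ = np.linalg.lstsq(X_train, C_train, rcond=None)  # spectra -> fractions

C_test = rng.dirichlet(np.ones(3), 5)
X_test = C_test @ pure + 0.01 * rng.normal(size=(5, 10))
errors = [rmsep(C_test[:, k], (X_test @ B)[:, k]) for k in range(3)]
```

PLSR would replace the `lstsq` step with latent-variable regression, which matters when the spectral bands are many and collinear, as in real Raman data.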
Comparative Mammalian Cell Toxicity of N-DBPs and C-DBPs
In order to generate a quantitative, direct comparison amongst classes of drinking water disinfection by-products (DBPs), we developed and calibrated in vitro mammalian cell cytotoxicity and genotoxicity assays to integrate the analytical biology with the analytical chemistry of ...
Quantitative imaging biomarkers: a review of statistical methods for computer algorithm comparisons.
Obuchowski, Nancy A; Reeves, Anthony P; Huang, Erich P; Wang, Xiao-Feng; Buckler, Andrew J; Kim, Hyun J Grace; Barnhart, Huiman X; Jackson, Edward F; Giger, Maryellen L; Pennello, Gene; Toledano, Alicia Y; Kalpathy-Cramer, Jayashree; Apanasovich, Tatiyana V; Kinahan, Paul E; Myers, Kyle J; Goldgof, Dmitry B; Barboriak, Daniel P; Gillies, Robert J; Schwartz, Lawrence H; Sullivan, Daniel C
2015-02-01
Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
A comparison study of image features between FFDM and film mammogram images
Jing, Hao; Yang, Yongyi; Wernick, Miles N.; Yarusso, Laura M.; Nishikawa, Robert M.
2012-01-01
Purpose: This work is to provide a direct, quantitative comparison of image features measured by film and full-field digital mammography (FFDM). The purpose is to investigate whether there is any systematic difference between film and FFDM in terms of quantitative image features and their influence on the performance of a computer-aided diagnosis (CAD) system. Methods: The authors make use of a set of matched film-FFDM image pairs acquired from cadaver breast specimens with simulated microcalcifications consisting of bone and teeth fragments using both a GE digital mammography system and a screen-film system. To quantify the image features, the authors consider a set of 12 textural features of lesion regions and six image features of individual microcalcifications (MCs). The authors first conduct a direct comparison on these quantitative features extracted from film and FFDM images. The authors then study the performance of a CAD classifier for discriminating between MCs and false positives (FPs) when the classifier is trained on images of different types (film, FFDM, or both). Results: For all the features considered, the quantitative results show a high degree of correlation between features extracted from film and FFDM, with the correlation coefficients ranging from 0.7326 to 0.9602 for the different features. Based on a Fisher sign rank test, there was no significant difference observed between the features extracted from film and those from FFDM. For both MC detection and discrimination of FPs from MCs, FFDM had a slight but statistically significant advantage in performance; however, when the classifiers were trained on different types of images (acquired with FFDM or SFM) for discriminating MCs from FPs, there was little difference. Conclusions: The results indicate good agreement between film and FFDM in quantitative image features. 
While FFDM images provide better detection performance for MCs, FFDM and film images may be interchangeable for the purposes of training CAD algorithms, and a single CAD algorithm may be applied to either type of image. PMID:22830771
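The paired film-vs-FFDM feature comparison above rests on a sign test, which is small enough to sketch from the standard library. The feature values below are toy numbers, not the study's data.

```python
from math import comb

def sign_test_p(diffs):
    """Two-sided exact sign test: under H0 the paired differences are
    equally likely to be positive or negative (zeros are discarded)."""
    pos = sum(d > 0 for d in diffs)
    neg = sum(d < 0 for d in diffs)
    n, k = pos + neg, min(pos, neg)
    p = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * p)

# Paired film-vs-FFDM values of one feature for the same lesions (toy data).
film = [0.82, 0.75, 0.91, 0.66, 0.70, 0.88]
ffdm = [0.80, 0.77, 0.90, 0.68, 0.69, 0.89]
p = sign_test_p([a - b for a, b in zip(film, ffdm)])
```

A large p-value here is consistent with the study's conclusion of no systematic difference between the two modalities on a given feature.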
Database for LDV Signal Processor Performance Analysis
NASA Technical Reports Server (NTRS)
Baker, Glenn D.; Murphy, R. Jay; Meyers, James F.
1989-01-01
A comparative and quantitative analysis of various laser velocimeter signal processors is difficult because standards for characterizing signal bursts have not been established. This leaves the researcher to select a signal processor based only on manufacturers' claims without the benefit of direct comparison. The present paper proposes the use of a database of digitized signal bursts obtained from a laser velocimeter under various configurations as a method for directly comparing signal processors.
Sproston, E L; Carrillo, C D; Boulter-Bitzer, J
2014-12-01
Harmonisation of methods between Canadian government agencies is essential to accurately assess and compare the prevalence and concentrations of Campylobacter present on retail poultry intended for human consumption. The standard qualitative procedure used by Health Canada differs from that used by the USDA for both quantitative and qualitative methods. A comparison of three methods was performed on raw poultry samples obtained from an abattoir to determine if one method is superior to the others in isolating Campylobacter from chicken carcass rinses. The average percent of positive samples was 34.72% (95% CI, 29.2-40.2), 39.24% (95% CI, 33.6-44.9), and 39.93% (95% CI, 34.3-45.6) for the direct plating US method, the US enrichment method, and the Health Canada enrichment method, respectively. Overall there were significant differences when comparing either of the enrichment methods to the direct plating method using McNemar's chi-squared test. On comparison of weekly data (Fisher's exact test), direct plating was inferior to the enrichment methods on only a single occasion. Direct plating is important for enumeration and establishing the concentration of Campylobacter present on raw poultry. However, enrichment methods are also vital to identify positive samples where concentrations are below the detection limit for direct plating.
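The McNemar comparison mentioned above is easy to sketch with the standard library. The discordant-pair counts below are invented, not the study's data; the continuity-corrected chi-squared form is used.

```python
from math import erfc, sqrt

def mcnemar_p(b, c):
    """McNemar's chi-squared test (with continuity correction) for paired
    yes/no outcomes: b and c are the two discordant-pair counts, e.g.
    samples positive by enrichment but negative by direct plating, and
    vice versa. Returns the two-sided p-value (chi-squared, 1 df)."""
    if b + c == 0:
        return 1.0
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    return erfc(sqrt(chi2 / 2))  # survival function of chi2 with 1 df

p = mcnemar_p(b=21, c=8)  # toy discordant counts, not the paper's data
```

Because concordant pairs cancel out, this test isolates exactly the samples on which the two isolation methods disagree, which is why it suits paired method comparisons on the same carcass rinses.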
Bremner, J D; Baldwin, R; Horti, A; Staib, L H; Ng, C K; Tan, P Z; Zea-Ponce, Y; Zoghbi, S; Seibyl, J P; Soufer, R; Charney, D S; Innis, R B
1999-08-31
Although positron emission tomography (PET) and single photon emission computed tomography (SPECT) are increasingly used for quantitation of neuroreceptor binding, almost no studies to date have involved a direct comparison of the two. One study found a high level of agreement between the two techniques, although there was a systematic 30% increase in measures of benzodiazepine receptor binding in SPECT compared with PET. The purpose of the current study was to directly compare quantitation of benzodiazepine receptor binding in the same human subjects using PET and SPECT with high specific activity [11C]iomazenil and [123I]iomazenil, respectively. All subjects were administered a single bolus of high specific activity iomazenil labeled with 11C or 123I followed by dynamic PET or SPECT imaging of the brain. Arterial blood samples were obtained for measurement of metabolite-corrected radioligand in plasma. Compartmental modeling was used to fit values for kinetic rate constants of transfer of radioligand between plasma and brain compartments. These values were used for calculation of binding potential (BP = Bmax/Kd) and the product of BP and the fraction of free non-protein-bound parent compound (V3'). Mean values for V3' in PET and SPECT were as follows: temporal cortex 23+/-5 and 22+/-3 ml/g, frontal cortex 23+/-6 and 22+/-3 ml/g, occipital cortex 28+/-3 and 31+/-5 ml/g, and striatum 4+/-4 and 7+/-4 ml/g. These preliminary findings indicate that PET and SPECT provide comparable results in quantitation of neuroreceptor binding in the human brain.
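The two outcome measures quoted above can be written out explicitly, following the abstract's own definitions (standard receptor-quantitation notation):

```latex
BP = \frac{B_{\max}}{K_d}, \qquad V_3' = f_1 \cdot BP
```

where f1 denotes the free, non-protein-bound fraction of the parent compound in plasma, Bmax the receptor density, and Kd the equilibrium dissociation constant.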
Comparison of a direct and indirect ELISA for quantitating antisperm antibody in semen.
Lynch, D M; Howe, S E
1987-01-01
A direct and an indirect quantitative ELISA for antisperm antibody were compared using the spermatozoa and cell-free seminal fluid of 66 infertile males. The normal concentration of sperm binding immunoglobulin was less than or equal to 1.5 fg Ig per spermatozoon for the indirect seminal plasma assay and less than or equal to 1.5 fg Ig per spermatozoon by the direct assay. Of the 66 infertile males, 21% (14/66) had elevated levels of antisperm antibody in their seminal plasma and 26% (17/66) had elevated levels bound directly to their spermatozoa. The direct correlation between the results of these assays was 94%. A simple linear regression analysis between the indirect and direct measurements of antisperm antibody resulted in a correlation coefficient of r = 0.907. There was no statistically significant difference between results from the direct and indirect methods of the patients as a group. However, there was evidence of autospecificity in a small percentage of males who had elevated levels of antisperm antibody by the direct assay that was not detected by the indirect assay using pooled donor spermatozoa.
The Choice of Initial Web Search Strategies: A Comparison between Finnish and American Searchers.
ERIC Educational Resources Information Center
Iivonen, Mirja; White, Marilyn Domas
2001-01-01
Describes a study that used qualitative and quantitative methodologies to analyze differences between Finnish and American Web searchers in their choice of initial search strategies (direct address, subject directory, and search engines) and their reasoning underlying their choices. Considers implications for considering cultural differences in…
Direct and Quantitative Photothermal Absorption Spectroscopy of Individual Particulates
2013-01-01
1(a). By taking the ratio of the spectral absorption efficiency of the microwire to the corresponding volumetrically equivalent thin film, an...of D = 983 nm. For further comparison, the theoretical spectral absorption efficiency for a volumetrically equivalent (t = 983π/4 nm) thin film, Qabs
Nadort, Annemarie; Woolthuis, Rutger G.; van Leeuwen, Ton G.; Faber, Dirk J.
2013-01-01
We present integrated Laser Speckle Contrast Imaging (LSCI) and Sidestream Dark Field (SDF) flowmetry to provide real-time, non-invasive and quantitative measurements of speckle decorrelation times related to microcirculatory flow. Using a multi exposure acquisition scheme, precise speckle decorrelation times were obtained. Applying SDF-LSCI in vitro and in vivo allows direct comparison between speckle contrast decorrelation and flow velocities, while imaging the phantom and microcirculation architecture. This resulted in a novel analysis approach that distinguishes decorrelation due to flow from other additive decorrelation sources. PMID:24298399
Pupil movements to light and accommodative stimulation - A comparative study.
NASA Technical Reports Server (NTRS)
Semmlow, J.; Stark, L.
1973-01-01
Isolation and definition of specific response components in pupil reflexes through comparison of the dynamic features of light-induced and accommodation-induced pupil movements. A quantitative analysis of the behavior of the complex nonlinear pupil responses reveals the presence of two independent nonlinear characteristics: a range-dependent gain and a direction dependence or movement asymmetry. These nonlinear properties are attributed to motor processes because they are observable in pupil responses to both light and accommodation stimuli. The possible mechanisms and consequences of these pupil response characteristics are quantitatively defined and discussed.
Quantitative proteomics in biological research.
Wilm, Matthias
2009-10-01
Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.
A Taylor weak-statement algorithm for hyperbolic conservation laws
NASA Technical Reports Server (NTRS)
Baker, A. J.; Kim, J. W.
1987-01-01
Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.
Quantitative prediction of drug side effects based on drug-related features.
Niu, Yanqing; Zhang, Wen
2017-09-01
Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects, and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
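The quantitative score described above, side effects summed with weights, can be sketched directly. The profiles, weights, and dimensions below are invented for illustration; the paper likewise draws its weights randomly in simulation.

```python
import numpy as np

def quantitative_score(profile, weights):
    """Weighted sum over a drug's binary side-effect profile:
    score = sum_i w_i * s_i. Higher scores suggest riskier drugs."""
    return float(np.dot(profile, weights))

rng = np.random.default_rng(2)
n_effects = 10
weights = rng.random(n_effects)            # empirical weights (randomized)
drug_a = rng.integers(0, 2, n_effects)     # 0/1 side-effect profiles
drug_b = rng.integers(0, 2, n_effects)
score_a = quantitative_score(drug_a, weights)
score_b = quantitative_score(drug_b, weights)
riskier = "A" if score_a > score_b else "B"   # compare risk of two drugs
```

Predicting this single scalar per drug, rather than each side effect separately, is what turns the task into a regression problem over drug-related features.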
A cost-effectiveness comparison of existing and Landsat-aided snow water content estimation systems
NASA Technical Reports Server (NTRS)
Sharp, J. M.; Thomas, R. W.
1975-01-01
This study describes how Landsat imagery can be cost-effectively employed to augment an operational hydrologic model. Attention is directed toward the estimation of snow water content, a major predictor variable in the volumetric runoff forecasting model presently used by the California Department of Water Resources. A stratified double sampling scheme is supplemented with qualitative and quantitative analyses of existing operations to develop a comparison between the existing and satellite-aided approaches to snow water content estimation. Results show a decided advantage for the Landsat-aided approach.
Fiber tractography using machine learning.
Neher, Peter F; Côté, Marc-Alexandre; Houde, Jean-Christophe; Descoteaux, Maxime; Maier-Hein, Klaus H
2017-09-01
We present a fiber tractography approach based on a random forest classification and voting process, guiding each step of the streamline progression by directly processing raw diffusion-weighted signal intensities. For comparison to the state-of-the-art, i.e. tractography pipelines that rely on mathematical modeling, we performed a quantitative and qualitative evaluation with multiple phantom and in vivo experiments, including a comparison to the 96 submissions of the ISMRM tractography challenge 2015. The results demonstrate the vast potential of machine learning for fiber tractography.
Method comparison for forest soil carbon and nitrogen estimates in the Delaware River basin
B. Xu; Yude Pan; A.H. Johnson; A.F. Plante
2016-01-01
The accuracy of forest soil C and N estimates is hampered by forest soils that are rocky, inaccessible, and spatially heterogeneous. A composite coring technique is the standard method used in Forest Inventory and Analysis, but its accuracy has been questioned. Quantitative soil pits provide direct measurement of rock content and soil mass from a larger, more...
ERIC Educational Resources Information Center
Cizdziel, James V.
2011-01-01
In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…
ERIC Educational Resources Information Center
Sun, Chyng; Bridges, Ana; Wosnitzer, Robert; Scharrer, Erica; Liberman, Rachael
2008-01-01
Pornography is a lucrative business. Increasingly, women have participated in its production, direction, and consumption. This study investigated how the content of popular pornographic videos created by female directors differs from that of their male counterparts. We conducted a quantitative analysis of 122 randomly selected scenes from 44…
Development of a Contact Permeation Test Fixture and Method
2013-04-01
…direct contact with the skin, indicates the need for a quantitative contact test method. Comparison tests were conducted with VX on a standardized… Guide for the Care and Use of Laboratory Animals (8th ed.; National Research Council: Washington, DC, 2011). This test was also performed in…
Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G
2017-10-01
A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Methodological aspects of multicenter studies with quantitative PET.
Boellaard, Ronald
2011-01-01
Quantification of whole-body FDG PET studies is affected by many physiological and physical factors. Much of the variability in reported standardized uptake value (SUV) data seen in the literature results from the variability in methodology applied among these studies, i.e., due to the use of different scanners, acquisition and reconstruction settings, region of interest strategies, SUV normalization, and/or corrections methods. To date, the variability in applied methodology prohibits a proper comparison and exchange of quantitative FDG PET data. Consequently, the promising role of quantitative PET has been demonstrated in several monocentric studies, but these published results cannot be used directly as a guideline for clinical (multicenter) trials performed elsewhere. In this chapter, the main causes affecting whole-body FDG PET quantification and strategies to minimize its inter-institute variability are addressed.
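The standardized uptake value (SUV) whose inter-institute variability this chapter addresses is, in its common body-weight-normalized form, a simple ratio. A sketch with illustrative numbers (not from the chapter), assuming the usual 1 g/mL tissue density so that units cancel:

```python
# Body-weight-normalized SUV: tissue activity concentration divided by
# injected dose per unit body weight. Illustrative values only.
def suv_bw(tissue_conc_kbq_per_ml, injected_dose_mbq, body_weight_kg):
    injected_dose_kbq = injected_dose_mbq * 1000.0  # MBq -> kBq
    body_weight_g = body_weight_kg * 1000.0         # kg -> g (assume 1 g/mL)
    return tissue_conc_kbq_per_ml * body_weight_g / injected_dose_kbq

# A tissue region at 5.0 kBq/mL, 370 MBq injected, 74 kg patient:
suv = suv_bw(tissue_conc_kbq_per_ml=5.0, injected_dose_mbq=370.0, body_weight_kg=74.0)
# 5.0 * 74000 / 370000 = 1.0
```

The formula itself is trivial; as the chapter stresses, the variability comes from everything feeding into the measured concentration (scanner, reconstruction, region-of-interest strategy, normalization choice).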
NASA Astrophysics Data System (ADS)
Konstantinov, M. S.; Orlov, A. A.
2014-12-01
The paper analyzes the possibility of using a gravity-assist maneuver for a flight to Jupiter. The advantage of an Earth gravity-assist maneuver over direct transfer, in terms of the reduction in the energy required for the transfer, is considered. Quantitative and qualitative evaluations of the two transfer profiles are given.
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently calls into question the value of histopathologic methods in the evaluation of biological changes. To date, the available evaluation tools are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of regenerated tissues. For example, the metrics include the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating the structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy of the collagen (a maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell-specific) or Picrosirius Red F3BA (collagen-specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that, quantitatively, the healing tissue would be significantly delayed and of poor quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach.
When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
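The headline metric from the abstract above, TIR% = CIR% + TCC%, can be expressed directly. The numbers below are illustrative, not values from the study:

```python
# Tissue ingrowth rate (TIR) as the sum of the cell ingrowth rate (CIR)
# and total collagen content (TCC), all expressed in percent, per the
# equation TIR% = CIR% + TCC% given in the abstract.
def tissue_ingrowth_rate(cir_pct, tcc_pct):
    return cir_pct + tcc_pct

# Illustrative section measurements (hypothetical):
tir = tissue_ingrowth_rate(cir_pct=18.0, tcc_pct=27.5)  # 45.5 %
```

Splitting TIR into its two addends is what lets the method distinguish cell-driven ingrowth from collagen deposition when comparing healthy and diabetic animals.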
Podlipská, Jana; Guermazi, Ali; Lehenkari, Petri; Niinimäki, Jaakko; Roemer, Frank W; Arokoski, Jari P; Kaukinen, Päivi; Liukkonen, Esa; Lammentausta, Eveliina; Nieminen, Miika T; Tervonen, Osmo; Koski, Juhani M; Saarakkala, Simo
2016-03-01
Osteoarthritis (OA) is a common degenerative musculoskeletal disease highly prevalent in aging societies worldwide. Traditionally, knee OA is diagnosed using conventional radiography. However, structural changes of articular cartilage or menisci cannot be directly evaluated using this method. On the other hand, ultrasound is a promising tool able to provide direct information on soft tissue degeneration. The aim of our study was to systematically determine the site-specific diagnostic performance of semi-quantitative ultrasound grading of knee femoral articular cartilage, osteophytes and meniscal extrusion, and of radiographic assessment of joint space narrowing and osteophytes, using MRI as a reference standard. Eighty asymptomatic and 79 symptomatic subjects with mean age of 57.7 years were included in the study. Ultrasound performed best in the assessment of femoral medial and lateral osteophytes, and medial meniscal extrusion. In comparison to radiography, ultrasound performed better or at least equally well in identification of tibio-femoral osteophytes, medial meniscal extrusion and medial femoral cartilage morphological degeneration. Ultrasound provides relevant additional diagnostic information on tissue-specific morphological changes not depicted by conventional radiography. Consequently, the use of ultrasound as a complementary imaging tool along with radiography may enable more accurate and cost-effective diagnostics of knee osteoarthritis at the primary healthcare level.
Direction-dependent stability of skyrmion lattice in helimagnets induced by exchange anisotropy
NASA Astrophysics Data System (ADS)
Hu, Yangfan
2018-06-01
Exchange anisotropy provides a direction dependent mechanism for the stability of the skyrmion lattice phase in noncentrosymmetric bulk chiral magnets. Based on the Fourier representation of the skyrmion lattice, we explain the direction dependence of the temperature-magnetic field phase diagram for bulk MnSi through a phenomenological mean-field model incorporating exchange anisotropy. Through quantitative comparison with experimental results, we clarify that the stability of the skyrmion lattice phase in bulk MnSi is determined by a combined effect of negative exchange anisotropy and thermal fluctuation. The effect of exchange anisotropy and the order of Fourier representation on the equilibrium properties of the skyrmion lattice is discussed in detail.
NASA Technical Reports Server (NTRS)
Bathel, Brett F.; Danehy, Paul M.; Johansen, Craig T.; Ashcraft, Scott W.; Novak, Luke A.
2013-01-01
Numerical predictions of the Mars Science Laboratory reaction control system jets interacting with a Mach 10 hypersonic flow are compared to experimental nitric oxide planar laser-induced fluorescence data. The steady Reynolds Averaged Navier Stokes equations using the Baldwin-Barth one-equation turbulence model were solved using the OVERFLOW code. The experimental fluorescence data used for comparison consists of qualitative two-dimensional visualization images, qualitative reconstructed three-dimensional flow structures, and quantitative two-dimensional distributions of streamwise velocity. Through modeling of the fluorescence signal equation, computational flow images were produced and directly compared to the qualitative fluorescence data.
Quantitative SIMS Imaging of Agar-Based Microbial Communities.
Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V
2018-05-01
After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
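Step (1) of the quantitation workflow above, normalizing ion intensity to an external quadratic regression curve, amounts to inverting a fitted calibration I = a·d² + b·d + c to recover surface density d from measured intensity I. A sketch with assumed calibration coefficients (not values from the paper):

```python
import math

# Invert a quadratic calibration curve I = a*d**2 + b*d + c (intensity as a
# function of surface density) to estimate density from a measured intensity.
def intensity_to_density(intensity, a, b, c):
    # Solve a*d**2 + b*d + (c - intensity) = 0, keeping the physical root.
    disc = b * b - 4.0 * a * (c - intensity)
    if disc < 0:
        raise ValueError("intensity outside calibrated range")
    return (-b + math.sqrt(disc)) / (2.0 * a)

# Assumed coefficients from an external standard curve, and one measurement:
a, b, c = 0.02, 1.5, 0.0
density = intensity_to_density(intensity=80.0, a=a, b=b, c=c)
```

The later steps in the paper (isomeric-interference correction, noise and quantitation-limit filtering) are then applied per pixel to build the two-dimensional surface density image.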
DOE Office of Scientific and Technical Information (OSTI.GOV)
Couturier, Laurent, E-mail: laurent.couturier55@ho
The fine microstructure obtained by unmixing of a solid solution, either by classical precipitation or spinodal decomposition, is often characterized either by small angle scattering or atom probe tomography. This article shows that a common data analysis framework can be used to analyze data obtained from these two techniques. An example of the application of this common analysis is given for the characterization of the unmixing of the Fe-Cr matrix of a 15-5 PH stainless steel during long-term ageing at 350 °C and 400 °C. A direct comparison of the Cr composition fluctuation amplitudes and characteristic lengths obtained with both techniques is made, showing quantitative agreement for the fluctuation amplitudes. The origin of the remaining discrepancy for the characteristic lengths is discussed. - Highlights: •Common analysis framework for atom probe tomography and small angle scattering •Comparison of the same microstructural characteristics obtained using both techniques •Good correlation of Cr composition fluctuation amplitudes from both techniques •Good correlation of Cr composition fluctuation amplitudes with the classic V parameter.
An anthropomorphic phantom for quantitative evaluation of breast MRI.
Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo
2011-02-01
In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. 
This phantom provides a platform for the optimization and standardization of breast MRI protocols for lesion detection and characterization.
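The structural comparison described above rests on spatial covariance of image intensities along each direction. A minimal one-dimensional sketch on synthetic data (not the study's images), computing autocovariance at a few pixel lags, analogous to the anterior-posterior vs. right-left comparison:

```python
# Spatial autocovariance of a 1-D intensity profile at a given pixel lag;
# comparing such curves between phantom and patient rows/columns is a simple
# proxy for the covariance-matrix comparison described in the abstract.
def autocovariance(signal, lag):
    n = len(signal) - lag
    mean = sum(signal) / len(signal)
    return sum((signal[i] - mean) * (signal[i + lag] - mean) for i in range(n)) / n

# Synthetic row of pixel intensities with a repeating adipose/glandular-like pattern:
phantom_row = [1.0, 2.0, 3.0, 2.0, 1.0, 2.0, 3.0, 2.0]
cov0 = autocovariance(phantom_row, 0)   # lag 0 = variance of the row
cov1 = autocovariance(phantom_row, 1)   # correlation with the adjacent pixel
```

Matching autocovariance curves between phantom and patient images, within error bars, is what justifies the claim of "similar image structure."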
Johnston, Patrick A; Brown, Robert C
2014-08-13
A rapid method for the quantitation of total sugars in pyrolysis liquids using high-performance liquid chromatography (HPLC) was developed. The method avoids the tedious and time-consuming sample preparation required by current analytical methods. It is possible to directly analyze hydrolyzed pyrolysis liquids, bypassing the neutralization step usually required in determination of total sugars. A comparison with traditional methods was used to determine the validity of the results. The calibration curve coefficient of determination on all standard compounds was >0.999 using a refractive index detector. The relative standard deviation for the new method was 1.13%. The spiked sugar recoveries on the pyrolysis liquid samples were between 104 and 105%. The research demonstrates that it is possible to obtain excellent accuracy and efficiency using HPLC to quantitate glucose after acid hydrolysis of polymeric and oligomeric sugars found in fast pyrolysis bio-oils without neutralization.
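The validation figures quoted above (spike recovery of 104-105%, RSD of 1.13%) come from standard arithmetic that can be sketched directly; the numbers below are illustrative, not the paper's data:

```python
import statistics

# Percent spike recovery: how much of a known added amount is measured back.
def spike_recovery_pct(spiked_result, unspiked_result, spike_added):
    return 100.0 * (spiked_result - unspiked_result) / spike_added

# Relative standard deviation (RSD) of replicate injections, in percent.
def rsd_pct(replicates):
    return 100.0 * statistics.stdev(replicates) / statistics.mean(replicates)

recovery = spike_recovery_pct(spiked_result=15.2, unspiked_result=10.0, spike_added=5.0)  # 104.0 %
rsd = rsd_pct([9.9, 10.0, 10.1])  # about 1.0 %
```

Recoveries near 100% and a low RSD are the two figures of merit that let the authors claim the direct HPLC route matches the traditional neutralization-based workflow.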
Cinelli, Giorgia; Tositti, Laura; Mostacci, Domiziano; Baré, Jonathan
2016-05-01
In view of assessing natural radioactivity with on-site quantitative gamma spectrometry, the efficiency calibration of NaI(Tl) detectors is investigated. A calibration based on Monte Carlo simulation of the detector response is proposed, to render reliable quantitative analysis practicable in field campaigns. The method is developed with reference to contact geometry, in which measurements are taken by placing the NaI(Tl) probe directly against the solid source to be analyzed. The Monte Carlo code used for the simulations was MCNP. Experimental verification of the calibration's accuracy is obtained by comparison with appropriate standards, as reported. On-site measurements yield a quick quantitative assessment of the natural radioactivity levels present ((40)K, (238)U and (232)Th). On-site gamma spectrometry can prove particularly useful insofar as it provides information on materials from which samples cannot be taken. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
A quantitative comparison of corrective and perfective maintenance
NASA Technical Reports Server (NTRS)
Henry, Joel; Cain, James
1994-01-01
This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.
NASA Astrophysics Data System (ADS)
Rogers, K.; Cooper, W. T.; Hodgkins, S. B.; Verbeke, B. A.; Chanton, J.
2017-12-01
Solid state direct polarization 13C NMR spectroscopy (DP-NMR) is generally considered the most quantitatively reliable method for soil organic matter (SOM) characterization, including determination of the relative abundances of carbon functional groups. These functional abundances can then be used to calculate important soil parameters, such as the degree of humification and extent of aromaticity, that reveal differences in reactivity or compositional changes along gradients (e.g., a thaw chronosequence in permafrost). Unfortunately, the DP-NMR experiment is time-consuming, with a single sample often requiring over 24 hours of instrument time. Alternatively, solid state cross polarization 13C NMR (CP-NMR) can circumvent this problem, reducing analysis times to 4-6 hours but with some loss of quantitative reliability. Attenuated Total Reflectance Fourier Transform Infrared spectroscopy (ATR-FTIR) is a quick and relatively inexpensive method for characterizing solid materials, and has been suggested as an alternative to NMR for analysis of soil organic matter and determination of humification (HI) and aromatization (AI) indices. However, the quantitative reliability of ATR-FTIR for SOM analyses has never been verified, nor have any ATR-FTIR data been compared to similar measurements by NMR. In this work we focused on FTIR vibrational bands that correspond to the three functional groups used to calculate HI and AI values: carbohydrates (1030 cm-1), aromatics (1510, 1630 cm-1), and aliphatics (2850, 2920 cm-1). Data from ATR-FTIR measurements were compared to analogous quantitation by DP- and CP-NMR using peat samples from Sweden, Minnesota, and North Carolina. DP- and CP-NMR correlate very strongly, although the correlations are not always 1:1. Direct comparison of the relative abundances of the three functional groups determined by NMR and ATR-FTIR yielded satisfactory results for carbohydrates (r2 = 0.78) and aliphatics (r2 = 0.58), but less so for aromatics (r2 = 0.395).
ATR-FTIR has to this point been used primarily for relative abundance analyses (e.g. calculating HI and AI values), but these results suggest FTIR can provide quantitative reliability that approaches that of NMR.
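The HI and AI values discussed above are band-intensity ratios built from the listed FTIR bands. A hedged sketch, assuming (the abstract does not give the exact formulation, so check the source) that each index divides an aromatic-band absorbance by the carbohydrate band at 1030 cm⁻¹:

```python
# Hypothetical band-ratio index: aromatic band absorbance over carbohydrate
# band absorbance. The exact bands used for HI and AI are an assumption here.
def band_ratio(numerator_abs, denominator_abs):
    return numerator_abs / denominator_abs

# Illustrative absorbances at 1630 cm^-1 (aromatic) and 1030 cm^-1 (carbohydrate):
hi = band_ratio(numerator_abs=0.42, denominator_abs=0.84)  # 0.5
```

Because both indices are ratios of peak intensities, the relative-abundance correlations reported against NMR (strong for carbohydrates and aliphatics, weaker for aromatics) bound how far such FTIR-derived indices can be trusted quantitatively.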
Mahan, Ellen D.; Morrow, Kathleen M.; Hayes, John E.
2015-01-01
Background: Increasing prevalence of HIV infection among women worldwide has motivated the development of female-initiated prevention methods, including gel-based microbicides. User acceptability is vital for microbicide success; however, varying cultural vaginal practices indicate multiple formulations must be developed to appeal to different populations. Perceptual attributes of microbicides have been identified as primary drivers of acceptability; however, previous studies do not allow for direct comparison of these qualities between multiple formulations. Study Design: Six vaginal products were analyzed ex vivo using descriptive analysis. Perceptual attributes of samples were identified by trained participants (n=10) and rated quantitatively using scales based on a panel-developed lexicon. Data were analyzed using two-way ANOVAs for each attribute; product differences were assessed via Tukey's honestly significant difference test. Results: Significant differences were found between products for multiple attributes. Patterns were also seen for attributes across intended product usage (i.e., contraceptive, moisturizer or lubricant). For example, Options© Gynol II® (Caldwell Consumer Health, LLC) was significantly stickier and grainier than other products. Conclusions: Descriptive analysis, a quantitative approach that is based on consensus lexicon usage among participants, successfully quantified perceptual differences among vaginal products. Since perceptual attributes of products can be directly compared quantitatively, this study represents a novel approach that could be used to inform rational design of microbicides. PMID:21757061
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.
Characterising dark matter searches at colliders and direct detection experiments: Vector mediators
Buchmueller, Oliver; Dolan, Matthew J.; Malik, Sarah A.; ...
2015-01-09
We introduce a Minimal Simplified Dark Matter (MSDM) framework to quantitatively characterise dark matter (DM) searches at the LHC. We study two MSDM models where the DM is a Dirac fermion which interacts with a vector and axial-vector mediator. The models are characterised by four parameters: m_DM, M_med, g_DM and g_q, the DM and mediator masses, and the mediator couplings to DM and quarks respectively. The MSDM models accurately capture the full event kinematics, and the dependence on all masses and couplings can be systematically studied. The interpretation of mono-jet searches in this framework can be used to establish an equal-footing comparison with direct detection experiments. For theories with a vector mediator, LHC mono-jet searches possess better sensitivity than direct detection searches for light DM masses (≲5 GeV). For axial-vector mediators, LHC and direct detection searches generally probe orthogonal directions in the parameter space. We explore the projected limits of these searches from the ultimate reach of the LHC and multi-ton xenon direct detection experiments, and find that the complementarity of the searches remains. In conclusion, we provide a comparison of limits in the MSDM and effective field theory (EFT) frameworks to highlight the deficiencies of the EFT framework, particularly when exploring the complementarity of mono-jet and direct detection searches.
Surface areas of fractally rough particles studied by scattering
NASA Astrophysics Data System (ADS)
Hurd, Alan J.; Schaefer, Dale W.; Smith, Douglas M.; Ross, Steven B.; Le Méhauté, Alain; Spooner, Steven
1989-05-01
The small-angle scattering from fractally rough surfaces has the potential to give information on the surface area at a given resolution. By use of quantitative neutron and x-ray scattering, a direct comparison of surface areas of fractally rough powders was made between scattering and adsorption techniques. This study supports a recently proposed correction to the theory for scattering from fractal surfaces. In addition, the scattering data provide an independent calibration of molecular adsorbate areas.
Flow Modulation and Force Control in Insect Fast Maneuver
NASA Astrophysics Data System (ADS)
Li, Chengyu; Dong, Haibo; Zhang, Wen; Gai, Kuo
2012-11-01
In this work, an integrated study combining high-speed photogrammetry and direct numerical simulation (DNS) is used to study freely flying insects in fast maneuvers. Quantitative measurement has shown significant differences between quad-winged flyers, such as the dragonfly and damselfly, and two-winged flyers, such as the cicada. Comparisons of unsteady 3D vortex formation and the associated aerodynamic force production reveal the different mechanisms used by insects in fast turns. This work is supported by NSF CBET-1055949.
Streaks and vortices in near-wall turbulence.
Chernyshenko, S I; Baig, M F
2005-05-15
This paper presents evidence that organization of wall-normal motions plays almost no role in the creation of streaks. This evidence consists of the theory of streak generation not requiring the existence of organized vortices, extensive quantitative comparisons between the theory and direct numerical simulations, including examples of large variation in average spacing of the streaks of different scalars simultaneously present in the flow, and an example of the scalar streaks in an artificially created purely random flow.
Improving the geological interpretation of magnetic and gravity satellite anomalies
NASA Technical Reports Server (NTRS)
Hinze, W. J.; Braile, L. W. (Principal Investigator); Vonfrese, R. R. B.
1985-01-01
Current limitations in the quantitative interpretation of satellite-elevation geopotential field data and magnetic anomaly data were investigated along with techniques to overcome them. A major result was the preparation of an improved scalar magnetic anomaly map of South America and adjacent marine areas directly from the original MAGSAT data. In addition, comparisons of South American and Euro-African data show a strong correlation of anomalies along the Atlantic rifted margins of the continents.
Telerobotics - Display, control, and communication problems
NASA Technical Reports Server (NTRS)
Stark, Lawrence; Kim, Won-Soo; Tendick, Frank; Hannaford, Blake; Ellis, Stephen
1987-01-01
An experimental telerobotics simulation suitable for studying human operator (HO) performance is described. Simple manipulator pick-and-place and tracking tasks allowed quantitative comparison of a number of calligraphic display viewing conditions. An enhanced perspective display was effective with a reference line from target to base, with or without a complex three-dimensional grid framing the view, especially if geometrical display parameters such as azimuth and elevation were arranged to be near optimal. Quantitative comparisons were made possible using control performance measures such as root mean square error. There was a distinct preference for controlling the manipulator in end-effector Cartesian space for the primitive pick-and-place task, rather than controlling joint angles and then, via direct kinematics, the end-effector position. An introduced communication delay was found to produce a decrease in performance. In considerable part, this difficulty could be compensated for by preview control information. The fact that neurological control of normal human movement contains a sampled-data period of 0.2 s may relate to this robustness of HO control to delay.
Computerized EEG analysis for studying the effect of drugs on the central nervous system.
Rosadini, G; Cavazza, B; Rodriguez, G; Sannita, W G; Siccardi, A
1977-11-01
Samples of our experience in quantitative pharmaco-EEG are reviewed to discuss and define its applicability and limits. Simple processing systems, such as the computation of Hjorth's descriptors, are useful for on-line monitoring of drug-induced EEG modifications that are also evident on visual analysis. Power spectral analysis is suitable for identifying and quantifying EEG effects not evident on visual inspection. It demonstrated how the EEG effects of compounds in a long-acting formulation vary according to the sampling time and the explored cerebral area. EEG modifications not detected by power spectral analysis can be defined by statistically comparing (F test) the spectral values of the EEG from a single lead at the different samples (longitudinal comparison), or the spectral values from different leads at any sample (intrahemispheric comparison). The presently available procedures of quantitative pharmaco-EEG are effective when applied to the study of multilead EEG recordings in a statistically significant sample of the population. They do not seem reliable for monitoring or directing neuropsychiatric therapies in single patients, due to the individual variability of drug effects.
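Hjorth's descriptors, the simple on-line parameters mentioned above, are the time-domain triplet activity, mobility and complexity, computed from a signal's variance and the variances of its first and second differences. A sketch on a synthetic signal (illustrative only):

```python
import math

def variance(x):
    m = sum(x) / len(x)
    return sum((v - m) ** 2 for v in x) / len(x)

# Hjorth's descriptors: activity = var(x); mobility = sqrt(var(x')/var(x));
# complexity = mobility of the first difference divided by mobility of x.
def hjorth(signal):
    d1 = [signal[i + 1] - signal[i] for i in range(len(signal) - 1)]
    d2 = [d1[i + 1] - d1[i] for i in range(len(d1) - 1)]
    activity = variance(signal)
    mobility = math.sqrt(variance(d1) / activity)
    complexity = math.sqrt(variance(d2) / variance(d1)) / mobility
    return activity, mobility, complexity

# Synthetic slow oscillation standing in for an EEG trace:
sig = [math.sin(0.1 * i) for i in range(200)]
activity, mobility, complexity = hjorth(sig)
```

Their appeal for drug monitoring is exactly what the abstract implies: three numbers per epoch, cheap enough to track continuously, at the cost of the finer frequency detail that power spectral analysis provides.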
Arab, Lenore; Khan, Faraz; Lam, Helen
2013-01-01
A systematic literature review of human studies relating caffeine or caffeine-rich beverages to cognitive decline reveals only 6 studies that have collected and analyzed cognition data in a prospective fashion that enables study of decline across the spectrum of cognition. These 6 studies, in general, evaluate cognitive function using the Mini Mental State Exam and base their beverage data on FFQs. Studies included in our review differed in their source populations, duration of study, and most dramatically in how their analyses were done, disallowing direct quantitative comparisons of their effect estimates. Only one of the studies reported on all 3 exposures, coffee, tea, and caffeine, making comparisons of findings across studies more difficult. However, in general, it can be stated that for all studies of tea and most studies of coffee and caffeine, the estimates of cognitive decline were lower among consumers, although there is a lack of a distinct dose response. Only a few measures reached statistical significance and, interestingly, studies indicate a stronger effect among women than men. PMID:23319129
Yoshimitsu, Kengo; Shinagawa, Yoshinobu; Mitsufuji, Toshimichi; Mutoh, Emi; Urakawa, Hiroshi; Sakamoto, Keiko; Fujimitsu, Ritsuko; Takano, Koichi
2017-01-10
To elucidate whether any differences are present, both qualitatively and quantitatively, between the stiffness maps obtained with a multiscale direct inversion algorithm (MSDI) and those obtained with a multimodel direct inversion algorithm (MMDI). The MR elastography (MRE) data of 37 consecutive patients who underwent liver MRE between September and October 2014 were retrospectively analyzed using both MSDI and MMDI. Two radiologists qualitatively assessed the stiffness maps for image quality in consensus, and the measured liver stiffness and measurable areas were quantitatively compared between MSDI and MMDI. MMDI provided a stiffness map of better image quality, with comparable or slightly fewer artifacts. The measurable area by MMDI (43.7 ± 17.8 cm²) was larger than that by MSDI (37.5 ± 14.7 cm²) (P < 0.05). Liver stiffness measured by MMDI (4.51 ± 2.32 kPa) was slightly (7%) but significantly lower than that by MSDI (4.86 ± 2.44 kPa) (P < 0.05). MMDI can provide a stiffness map of better image quality, and slightly lower stiffness values, than MSDI at 3-T MRE, of which radiologists should be aware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Young-Min; Pennycook, Stephen J.; Borisevich, Albina Y.
2017-04-29
Octahedral tilt behavior is increasingly recognized as an important contributing factor to the physical behavior of perovskite oxide materials and especially their interfaces, necessitating the development of high-resolution methods of tilt mapping. There are currently two major approaches for quantitative imaging of tilts in scanning transmission electron microscopy (STEM): bright field (BF) and annular bright field (ABF). In this study, we show that BF STEM can be reliably used for measurements of oxygen octahedral tilts. While optimal conditions for BF imaging are more restricted with respect to sample thickness and defocus, we find that BF imaging with an aberration-corrected microscope at an accelerating voltage of 300 kV gives the most accurate quantitative measurement of the oxygen column positions. Using the tilted perovskite structure of BiFeO3 (BFO) as our test sample, we simulate BF and ABF images in a wide range of conditions, identifying the optimal imaging conditions for each mode. Finally, we show that unlike ABF imaging, BF imaging remains directly quantitatively interpretable for a wide range of specimen mistilt, suggesting that it should be preferable to ABF STEM imaging for quantitative structure determination.
Obuchowski, Nancy A; Barnhart, Huiman X; Buckler, Andrew J; Pennello, Gene; Wang, Xiao-Feng; Kalpathy-Cramer, Jayashree; Kim, Hyun J Grace; Reeves, Anthony P
2015-02-01
Quantitative imaging biomarkers are being used increasingly in medicine to diagnose and monitor patients' disease. The computer algorithms that measure quantitative imaging biomarkers have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms' bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms' performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for quantitative imaging biomarker studies.
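Two of the performance metrics typically used in such studies, bias against phantom ground truth and the repeatability coefficient RC = 2.77 × within-subject SD, can be sketched as follows (an illustration under standard definitions, not the authors' code):

```python
import math

def bias(measurements, truth):
    """Mean systematic error of an algorithm against a known
    phantom ground-truth value."""
    return sum(m - truth for m in measurements) / len(measurements)

def repeatability_coefficient(replicates):
    """RC = 2.77 * within-subject SD: the difference below which two
    repeat measurements on the same subject are expected to fall with
    ~95% probability. `replicates` is a list of per-subject lists of
    repeat measurements."""
    ss, dof = 0.0, 0
    for reps in replicates:
        m = sum(reps) / len(reps)
        ss += sum((r - m) ** 2 for r in reps)
        dof += len(reps) - 1
    wsd = math.sqrt(ss / dof)  # pooled within-subject SD
    return 2.77 * wsd
```

The pooled within-subject SD generalizes to unequal numbers of replicates per subject, which is why the degrees of freedom are accumulated per subject.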
Jastrzembski, Jillian A; Bee, Madeleine Y; Sacks, Gavin L
2017-10-25
Ambient ionization mass spectrometric (AI-MS) techniques like direct analysis in real time (DART) offer the potential for rapid quantitative analyses of trace volatiles in food matrices, but performance is generally limited by the lack of preconcentration and extraction steps. The sensitivity and selectivity of AI-MS approaches can be improved through solid-phase microextraction (SPME) with appropriate thin-film geometries, for example, solid-phase mesh-enhanced sorption from headspace (SPMESH). This work improves the SPMESH-DART-MS approach for use in food analyses and validates the approach for trace volatile analysis for two compounds in real samples (grape macerates). SPMESH units prepared with different sorbent coatings were evaluated for their ability to extract a range of odor-active volatiles, with poly(dimethylsiloxane)/divinylbenzene giving the most satisfactory results. In combination with high-resolution mass spectrometry (HRMS), detection limits for SPMESH-DART-MS under 4 ng/L in less than 30 s acquisition times could be achieved for some volatiles [3-isobutyl-2-methoxypyrazine (IBMP) and β-damascenone]. A comparison of SPMESH-DART-MS and SPME-GC-MS quantitation of linalool and IBMP demonstrates excellent agreement between the two methods for real grape samples (r² ≥ 0.90), although linalool measurements appeared to also include isobaric interference.
Junginger, Andrej; Garcia-Muller, Pablo L; Borondo, F; Benito, R M; Hernandez, Rigoberto
2016-01-14
The reaction rate rises and falls with increasing density or friction when a molecule is activated by collisions with the solvent particles. This so-called Kramers turnover has recently been observed in the isomerization reaction of LiCN in an argon bath. In this paper, we demonstrate by direct comparison with those results that a reduced-dimensional (generalized) Langevin description gives rise to similar reaction dynamics as the corresponding (computationally expensive) full molecular dynamics calculations. We show that the density distributions within the Langevin description are in direct agreement with the full molecular dynamics results and that the turnover in the reaction rates is reproduced qualitatively and quantitatively at different temperatures.
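A reduced-dimensional Langevin description of the kind compared above can be sketched with a simple Euler-Maruyama integrator (an illustrative surrogate only; the paper's generalized Langevin model, with its memory kernel, and its LiCN potential are not reproduced here, and all names are ours):

```python
import math
import random

def langevin_trajectory(force, x0, v0, gamma, kT, mass=1.0, dt=1e-3,
                        steps=10000, seed=0):
    """Euler-Maruyama integration of the memoryless Langevin equation
        m dv = [force(x) - gamma*v] dt + sqrt(2*gamma*kT) dW,
    returning the final (position, velocity). The friction gamma plays
    the role of the solvent collision rate in the Kramers picture."""
    rng = random.Random(seed)
    x, v = x0, v0
    sigma = math.sqrt(2.0 * gamma * kT * dt) / mass  # noise amplitude per step
    for _ in range(steps):
        a = (force(x) - gamma * v) / mass
        v += a * dt + sigma * rng.gauss(0.0, 1.0)
        x += v * dt  # semi-implicit update with the new velocity
    return x, v
```

With the noise switched off (kT = 0) and a harmonic restoring force, the trajectory simply relaxes to the well minimum, which serves as a deterministic check of the integrator; rate estimates such as the Kramers turnover would be built on ensembles of such trajectories over a barrier.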
Wulf, J S; Rühmann, S; Rego, I; Puhl, I; Treutter, D; Zude, M
2008-05-14
Laser-induced fluorescence spectroscopy (LIFS) was nondestructively applied to strawberries (EX = 337 nm, EM = 400-820 nm) to test the feasibility of quantitatively determining their native phenolic compounds. Eighteen phenolic compounds were identified in fruit skin by UV and MS spectroscopy and quantitatively determined by use of rp-HPLC for separation and diode-array or chemical reaction detection. Partial least-squares calibration models were built for single phenolic compounds by means of nondestructively recorded fluorescence spectra in the blue-green wavelength range using different data preprocessing methods. The direct orthogonal signal correction resulted in r² = 0.99 and RMSEP < 8% for p-coumaroyl-glucose, and r² = 0.99 and RMSEP < 24% for cinnamoyl-glucose. In comparison, the correction of the fluorescence spectral data with simultaneously recorded reflectance spectra did not further improve the calibration models. Results show the potential of LIFS for a rapid and nondestructive assessment of contents of p-coumaroyl-glucose and cinnamoyl-glucose in strawberry fruits.
Solid-Phase Radioimmunoassay of Total and Influenza-Specific Immunoglobulin G
Daugharty, Harry; Warfield, Donna T.; Davis, Marianne L.
1972-01-01
An antigen-antibody system of polystyrene tubes coated with immunoglobulin antibody was used for quantitating immunoglobulins. A similar radioimmunoassay method was adapted for a viral antigen-antibody system. The viral system can be used for quantitating viruses and for measuring virus-specific antibodies by reacting with ¹²⁵I-labeled anti-immunoglobulin G (IgG). Optimal conditions for coating the solid phase, specificity of the immune reaction, and other kinetics and sensitivities of the assay method were investigated. Comparison of direct and indirect methods of assaying for immunoglobulins or viral antibody indicates that the indirect method is more sensitive and can quantitate a minimum of 0.037 μg of IgG per ml. Results of solid-phase radioimmunoassay for influenza antibody correlate well with hemagglutinin antibody titers but not with complement-fixing antibody titers. Radioimmunoassay results for influenza antibody by solid phase are likewise in agreement with results by the carrier precipitate radioimmunoassay method. The simplicity, reproducibility, and versatility of the solid-phase procedure make it diagnostically useful. PMID:5062884
3D quantitative analysis of early decomposition changes of the human face.
Caplova, Zuzana; Gibelli, Daniele Maria; Poppa, Pasquale; Cummaudo, Marco; Obertova, Zuzana; Sforza, Chiarella; Cattaneo, Cristina
2018-03-01
Decomposition of the human body and human face is influenced, among other things, by environmental conditions. The early decomposition changes that modify the appearance of the face may hamper the recognition and identification of the deceased. Quantitative assessment of those changes may provide important information for forensic identification. This report presents a pilot 3D quantitative approach of tracking early decomposition changes of a single cadaver in controlled environmental conditions by summarizing the change with weekly morphological descriptions. The root mean square (RMS) value was used to evaluate the changes of the face after death. The results showed a high correlation (r = 0.863) between the measured RMS and the time since death. RMS values of each scan are presented, as well as the average weekly RMS values. The quantification of decomposition changes could improve the accuracy of antemortem facial approximation and potentially could allow the direct comparisons of antemortem and postmortem 3D scans.
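The two quantities reported above, the RMS deviation between aligned 3D scans and its Pearson correlation with time since death, can be sketched as follows (function names ours; the scans are assumed to be registered with vertex-to-vertex correspondence):

```python
import math

def scan_rms(scan_a, scan_b):
    """RMS point-to-point distance between two aligned 3D scans,
    each given as a list of (x, y, z) vertices in correspondence."""
    d2 = [sum((p - q) ** 2 for p, q in zip(a, b))
          for a, b in zip(scan_a, scan_b)]
    return math.sqrt(sum(d2) / len(d2))

def pearson_r(xs, ys):
    """Pearson correlation, e.g. between weekly RMS values and the
    time since death."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```

In practice the scans would first be registered (e.g. by ICP) before the per-vertex distances are meaningful; that step is omitted here.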
NASA Astrophysics Data System (ADS)
Zhang, Jialin; Chen, Qian; Li, Jiaji; Zuo, Chao
2017-02-01
The transport of intensity equation (TIE) is a powerful tool for direct quantitative phase retrieval in microscopy imaging. However, handling the boundary conditions of the TIE can be problematic. Previous work introduced a hard-edged aperture at the camera port of a traditional bright field microscope to generate the boundary signal for the TIE solver. Under this Neumann boundary condition, the quantitative phase can be obtained without any assumption or prior knowledge about the test object and the setup. In this paper, we demonstrate the effectiveness of this method in practical experiments. A microlens array is used to compare the results of the two TIE solvers, with and without the introduced aperture, and this accurate quantitative phase imaging technique allows measurement of cell dry mass, which is used in biology to follow the cell cycle, to investigate cell metabolism, or to address the effects of drugs.
Gómez-Ríos, Germán Augusto; Liu, Chang; Tascon, Marcos; Reyes-Garcés, Nathaly; Arnold, Don W; Covey, Thomas R; Pawliszyn, Janusz
2017-04-04
In recent years, the direct coupling of solid phase microextraction (SPME) and mass spectrometry (MS) has shown great potential to improve limits of quantitation, accelerate analysis throughput, and diminish potential matrix effects when compared to direct injection to MS. In this study, we introduce the open port probe (OPP) as a robust interface to couple biocompatible SPME (Bio-SPME) fibers to MS systems for direct electrospray ionization. The presented design consisted of minimal alterations to the front-end of the instrument and provided better sensitivity, simplicity, speed, wider compound coverage, and higher throughput in comparison to the LC-MS based approach. Quantitative determination of clenbuterol, fentanyl, and buprenorphine was successfully achieved in human urine. Despite the use of short extraction/desorption times (5 min/5 s), limits of quantitation below the minimum required performance levels (MRPL) set by the World Anti-Doping Agency (WADA) were obtained with good accuracy (≥90%) and linearity (R² > 0.99) over the range evaluated for all analytes using sample volumes of 300 μL. In-line technologies such as multiple reaction monitoring with multistage fragmentation (MRM³) and differential mobility spectrometry (DMS) were used to enhance the selectivity of the method without compromising analysis speed. On the basis of calculations, once coupled to high-throughput automation in the 96-well plate format, this method can potentially yield preparation times as low as 15 s per sample. Our results demonstrated that Bio-SPME-OPP-MS efficiently integrates sampling/sample cleanup and atmospheric pressure ionization, making it an advantageous configuration for several bioanalytical applications, including doping control in sports, in vivo tissue sampling, and therapeutic drug monitoring.
Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin
NASA Astrophysics Data System (ADS)
Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.
2013-12-01
Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
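The mean field bias adjustment mentioned above is commonly computed as the ratio of gauge to radar rainfall sums over collocated pairs, then applied multiplicatively to the radar field; a minimal sketch (the positivity threshold, minimum-pair rule, and fallback value are our assumptions, not the MPE implementation):

```python
def mean_field_bias(gauge, radar, min_pairs=5):
    """Mean field bias B = sum(G) / sum(R) over collocated gauge-radar
    pairs where both report rain; multiplying the radar field by B
    removes the mean bias relative to the gauges."""
    pairs = [(g, r) for g, r in zip(gauge, radar) if g > 0 and r > 0]
    if len(pairs) < min_pairs:
        return 1.0  # too few pairs: leave the radar field unadjusted
    gsum = sum(g for g, _ in pairs)
    rsum = sum(r for _, r in pairs)
    return gsum / rsum

def apply_bias(radar_field, b):
    """Bias-adjusted radar rainfall field."""
    return [b * r for r in radar_field]
```

A single multiplicative factor corrects the domain-mean error but not the spatially varying errors, which is why the study also examines kriging-based gauge analyses.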
NASA Technical Reports Server (NTRS)
Cooper, Clayton S.; Laurendeau, Normand M.; Hicks, Yolanda R. (Technical Monitor)
2000-01-01
Lean direct-injection (LDI) spray flames offer the possibility of reducing NO(sub x) emissions from gas turbines by rapid mixing of the liquid fuel and air so as to drive the flame structure toward partially-premixed conditions. We consider the technical approaches required to utilize laser-induced fluorescence methods for quantitatively measuring NO concentrations in high-pressure LDI spray flames. In the progression from atmospheric to high-pressure measurements, the LIF method requires a shift from the saturated to the linear regime of fluorescence measurements. As such, we discuss quantitative, spatially resolved laser-saturated fluorescence (LSF), linear laser-induced fluorescence (LIF), and planar laser-induced fluorescence (PLIF) measurements of NO concentration in LDI spray flames. Spatially-resolved LIF measurements of NO concentration (ppm) are reported for preheated, LDI spray flames at pressures of two to five atmospheres. The spray is produced by a hollow-cone, pressure-atomized nozzle supplied with liquid heptane. NO is excited via the Q(sub 2)(26.5) transition of the gamma(0,0) band. Detection is performed in a two nanometer region centered on the gamma(0,1) band. A complete scheme is developed by which quantitative NO concentrations in high-pressure LDI spray flames can be measured by applying linear LIF. NO is doped into the reactants and convected through the flame with no apparent destruction, thus allowing a NO fluorescence calibration to be taken inside the flame environment. The in-situ calibration scheme is validated by comparisons to a reference flame. Quantitative NO profiles are presented and analyzed so as to better understand the operation of lean-direct injectors for gas turbine combustors. Moreover, parametric studies are provided for variations in pressure, air-preheat temperature, and equivalence ratio. Similar parametric studies are performed for lean, premixed-prevaporized flames to permit comparisons to those for LDI flames. 
Finally, PLIF is expanded to high pressure in an effort to quantify the detected fluorescence image for LDI flames. Success is achieved by correcting the PLIF calibration via a single-point LIF measurement. This procedure removes the influence of any preferential background that occurs in the PLIF detection window. In general, both the LIF and PLIF measurements verify that the LDI strategy could be used to reduce NO(sub x) emissions in future gas turbine combustors.
Study of Interesting Solidification Phenomena on the Ground and in Space (MEPHISTO)
NASA Technical Reports Server (NTRS)
Alexander, J. Iwan D.; Favier, J.-J.; Garandet, J.-P.
1999-01-01
Real-time Seebeck voltage variations in a Sn-Bi melt during directional solidification in the MEPHISTO spaceflight experiment flown on the USMP-3 mission, have been correlated with well-characterized thruster firings and an Orbiter Main System (OMS) burn. The Seebeck voltage measurement is related to the response of the instantaneous average melt composition at the melt-crystal interface. This allowed us to make a direct comparison of numerical simulations with the experimentally obtained Seebeck signals. Based on the results of preflight and real-time computations, several well-defined thruster firing events were programmed to occur at specific times during the experiment. In particular, we simulated the effects of the thruster firings on melt and crystal composition in a directionally solidifying Sn-Bi alloy. The relative accelerations produced by the firings were simulated by impulsive accelerations of the same magnitude, duration and orientation as the requested firings. A comparison of the simulation results with the Seebeck signal indicates that there is a good agreement between the two. This unique opportunity allows us to make the first quantitative characterization of actual g-jitter effects on an actual crystal growth experiment and to calibrate our models of g-jitter effects on crystal growth.
Study of Interesting Solidification Phenomena on the Ground and in Space (MEPHISTO)
NASA Technical Reports Server (NTRS)
Favier, J.-J.; Iwan, J.; Alexander, D.; Garandet, J.-P.
1998-01-01
Real-time Seebeck voltage variations in a Sn-Bi melt during directional solidification in the MEPHISTO spaceflight experiment flown on the USMP-3 mission, can be correlated with well characterized thruster firings and an Orbiter Main System (OMS) burn. The Seebeck voltage measurement is related to the response of the instantaneous average melt composition at the melt-crystal interface. This allowed us to make a direct comparison of numerical simulations with the experimentally obtained Seebeck signals. Based on the results of preflight and real-time computations, several well-defined thruster firing events were programmed to occur at specific times during the experiment. In particular, we simulated the effects of the thruster firings on melt and crystal composition in a directionally solidifying Sn-Bi alloy. The relative accelerations produced by the firings were simulated by impulsive accelerations of the same magnitude, duration and orientation as the requested firings. A comparison of the simulation results with the Seebeck signal indicates that there is a good agreement between the two. This unique opportunity allows us, for the first time, to quantitatively characterize actual g-jitter effects on an actual crystal growth experiment and to properly calibrate our models of g-jitter effects on crystal growth.
Faraji, Amir H; Abhinav, Kumar; Jarbo, Kevin; Yeh, Fang-Cheng; Shin, Samuel S; Pathak, Sudhir; Hirsch, Barry E; Schneider, Walter; Fernandez-Miranda, Juan C; Friedlander, Robert M
2015-11-01
Brainstem cavernous malformations (CMs) are challenging due to a higher symptomatic hemorrhage rate and potential morbidity associated with their resection. The authors aimed to preoperatively define the relationship of CMs to the perilesional corticospinal tracts (CSTs) by obtaining qualitative and quantitative data using high-definition fiber tractography. These data were examined postoperatively by using longitudinal scans and in relation to patients' symptomatology. The extent of involvement of the CST was further evaluated longitudinally using the automated "diffusion connectometry" analysis. Fiber tractography was performed with DSI Studio using a quantitative anisotropy (QA)-based generalized deterministic tracking algorithm. Qualitatively, CST was classified as being "disrupted" and/or "displaced." Quantitative analysis involved obtaining mean QA values for the CST and its perilesional and nonperilesional segments. The contralateral CST was used for comparison. Diffusion connectometry analysis included comparison of patients' data with a template from 90 normal subjects. Three patients (mean age 22 years) with symptomatic pontomesencephalic hemorrhagic CMs and varying degrees of hemiparesis were identified. The mean follow-up period was 37.3 months. Qualitatively, CST was partially disrupted and displaced in all. Direction of the displacement was different in each case and progressively improved corresponding with the patient's neurological status. No patient experienced neurological decline related to the resection. The perilesional mean QA percentage decreases supported tract disruption and decreased further over the follow-up period (Case 1, 26%-49%; Case 2, 35%-66%; and Case 3, 63%-78%). Diffusion connectometry demonstrated rostrocaudal involvement of the CST consistent with the quantitative data. Hemorrhagic brainstem CMs can disrupt and displace perilesional white matter tracts with the latter occurring in unpredictable directions. 
This requires the use of tractography to accurately define their orientation to optimize surgical entry point, minimize morbidity, and enhance neurological outcomes. Observed anisotropy decreases in the perilesional segments are consistent with neural injury following hemorrhagic insults. A model using these values in different CST segments can be used to longitudinally monitor its craniocaudal integrity. Diffusion connectometry is a complementary approach providing longitudinal information on the rostrocaudal involvement of the CST.
Bimetallic Effect of Single Nanocatalysts Visualized by Super-Resolution Catalysis Imaging
Chen, Guanqun; Zou, Ningmu; Chen, Bo; ...
2017-11-01
Compared with their monometallic counterparts, bimetallic nanoparticles often show enhanced catalytic activity associated with the bimetallic interface. Direct quantitation of catalytic activity at the bimetallic interface is important for understanding the enhancement mechanism, but challenging experimentally. Here, using single-molecule super-resolution catalysis imaging in correlation with electron microscopy, we report the first quantitative visualization of enhanced bimetallic activity within single bimetallic nanoparticles. We focus on heteronuclear bimetallic PdAu nanoparticles that present a well-defined Pd–Au bimetallic interface in catalyzing a photodriven fluorogenic disproportionation reaction. Our approach also enables a direct comparison between the bimetallic and monometallic regions within the same nanoparticle. Theoretical calculations further provide insights into the electronic nature of N–O bond activation of the reactant (resazurin) adsorbed on bimetallic sites. Subparticle activity correlation between bimetallic enhancement and monometallic activity suggests that the favorable locations to construct bimetallic sites are those monometallic sites with higher activity, leading to a strategy for making effective bimetallic nanocatalysts. Furthermore, the results highlight the power of super-resolution catalysis imaging in gaining insights that could help improve nanocatalysts.
Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus
2008-09-01
Quantitative structure activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set and do not require an explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.
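The general shape of kernel-based QSAR prediction, learning weights over training compounds from a precomputed similarity (Gram) matrix and predicting a new compound from its kernel values against the training set, can be illustrated as follows. Note the hedge: we substitute plain kernel ridge regression for the P-SVM, and the molecule kernel itself is treated as a given matrix, so this is a structural sketch only:

```python
def kernel_ridge_fit(K, y, lam=1e-3):
    """Solve (K + lam*I) alpha = y by Gaussian elimination with partial
    pivoting; K is a precomputed kernel (Gram) matrix over the training
    compounds and y their activities."""
    n = len(y)
    A = [[K[i][j] + (lam if i == j else 0.0) for j in range(n)] + [y[i]]
         for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    alpha = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = A[r][n] - sum(A[r][c] * alpha[c] for c in range(r + 1, n))
        alpha[r] = s / A[r][r]
    return alpha

def kernel_predict(k_test, alpha):
    """Predicted activity: sum_i alpha_i * k(test, train_i)."""
    return sum(a * k for a, k in zip(alpha, k_test))
```

The point of the sketch is that prediction needs only kernel evaluations against training compounds, never an explicit descriptor vector, which is the property the abstract emphasizes.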
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lauer, Mark A.; Poirier, David R.; Erdmann, Robert G.
2014-09-01
This report covers the modeling of seven directionally solidified samples, five under normal gravitational conditions and two in microgravity. A model is presented to predict macrosegregation during the melting phases of samples solidified under microgravitational conditions. The results of this model are compared against two samples processed in microgravity and good agreement is found. A second model is presented that captures thermosolutal convection during directional solidification. Results for this model are compared across several experiments, and quantitative comparisons are made between the model and the experimentally obtained radial macrosegregation profiles, with good agreement being found. Changes in cross section were present in some samples, and micrographs of these are qualitatively compared with the results of the simulations. It is found that macrosegregation patterns can be affected by changing the mold material.
Development of image processing techniques for applications in flow visualization and analysis
NASA Technical Reports Server (NTRS)
Disimile, Peter J.; Shoe, Bridget; Toy, Norman; Savory, Eric; Tahouri, Bahman
1991-01-01
A comparison between two flow visualization studies of an axisymmetric circular jet issuing into still fluid, using two different experimental techniques, is described. In the first case laser-induced fluorescence is used to visualize the flow structure, whilst smoke is utilized in the second. Quantitative information was obtained from these visualized flow regimes using two different digital imaging systems. Results are presented for the rate at which the jet expands in the downstream direction, and these compare favorably with more established data.
Computation of Nonlinear Backscattering Using a High-Order Numerical Method
NASA Technical Reports Server (NTRS)
Fibich, G.; Ilan, B.; Tsynkov, S.
2001-01-01
The nonlinear Schrödinger equation (NLS) is the standard model for propagation of intense laser beams in Kerr media. The NLS is derived from the nonlinear Helmholtz equation (NLH) by employing the paraxial approximation and neglecting the backscattered waves. In this study we use a fourth-order finite-difference method supplemented by special two-way artificial boundary conditions (ABCs) to solve the NLH as a boundary value problem. Our numerical methodology allows for a direct comparison of the NLH and NLS models and for an accurate quantitative assessment of the backscattered signal.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimminau, G; Nagler, B; Higginbotham, A
2008-06-19
Calculations of the x-ray diffraction patterns from shocked crystals derived from the results of Non-Equilibrium-Molecular-Dynamics (NEMD) simulations are presented. The atomic coordinates predicted by the NEMD simulations combined with atomic form factors are used to generate a discrete distribution of electron density. A Fast-Fourier-Transform (FFT) of this distribution provides an image of the crystal in reciprocal space, which can be further processed to produce quantitative simulated data for direct comparison with experiments that employ picosecond x-ray diffraction from laser-irradiated crystalline targets.
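The pipeline the abstract describes (atomic coordinates to discrete electron density, FFT to reciprocal space, then intensities) can be illustrated in one dimension: a perfect lattice of point atoms yields Bragg peaks at multiples of the reciprocal lattice spacing. All parameters below are invented, not taken from the NEMD simulations.

```python
import numpy as np

# Illustrative 1-D analogue: 16 point atoms on a perfect lattice, one atom
# every 4 cells of a 64-cell density grid (parameters invented).
n_atoms, spacing, n_grid = 16, 4, 64
rho = np.zeros(n_grid)
rho[np.arange(n_atoms) * spacing] = 1.0   # unit form factor per atom

F = np.fft.fft(rho)        # reciprocal-space amplitude
I = np.abs(F) ** 2         # diffracted intensity

# Bragg peaks sit at multiples of n_grid / spacing = 16.
peaks = np.flatnonzero(I > 1e-6)
```

In the real 3-D calculation the unit density spikes are replaced by form-factor-weighted atomic positions from the MD snapshot, and a shocked (compressed or defected) lattice shifts and broadens these peaks.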
Electric Fields at the Active Site of an Enzyme: Direct Comparison of Experiment with Theory
NASA Astrophysics Data System (ADS)
Suydam, Ian T.; Snow, Christopher D.; Pande, Vijay S.; Boxer, Steven G.
2006-07-01
The electric fields produced in folded proteins influence nearly every aspect of protein function. We present a vibrational spectroscopy technique that measures changes in electric field at a specific site of a protein as shifts in frequency (Stark shifts) of a calibrated nitrile vibration. A nitrile-containing inhibitor is used to deliver a unique probe vibration to the active site of human aldose reductase, and the response of the nitrile stretch frequency is measured for a series of mutations in the enzyme active site. These shifts yield quantitative information on electric fields that can be directly compared with electrostatics calculations. We show that extensive molecular dynamics simulations and ensemble averaging are required to reproduce the observed changes in field.
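The conversion underlying such measurements is linear: the vibrational Stark relation Δν = -Δμ·ΔF projects the field change onto the probe's difference dipole (the Stark tuning rate). A minimal sketch with an assumed tuning rate and shift; the numbers are illustrative, not from the paper.

```python
# Vibrational Stark relation: delta_nu = -|dmu| * dF (field change projected
# on the nitrile axis). Tuning rate and shift are assumed values.
tuning_rate = 0.7             # cm^-1 per (MV/cm), assumed probe calibration
delta_nu = -1.4               # observed nitrile red shift (cm^-1) for a mutant

dF = -delta_nu / tuning_rate  # field change along the probe axis, MV/cm
```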
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kaur, Amandeep; Deepshikha; Vinayak, Karan Singh
2016-07-15
We performed a theoretical investigation of different mass-asymmetric reactions to assess the direct impact of the density-dependent part of the symmetry energy on multifragmentation. The simulations are performed for a specific set of reactions having the same system mass and N/Z content, using the isospin-dependent quantum molecular dynamics model to estimate the quantitative dependence of fragment production on the mass-asymmetry factor (τ) for various symmetry energy forms. The dynamics associated with different mass-asymmetric reactions is explored and the direct role of the symmetry energy is checked. A comparison with the experimental data (asymmetric reaction) is also presented for different equations of state (symmetry energy forms).
Kafle, Amol; Klaene, Joshua; Hall, Adam B; Glick, James; Coy, Stephen L; Vouros, Paul
2013-07-15
There is continued interest in exploring new analytical technologies for the detection and quantitation of DNA adducts, biomarkers which provide direct evidence of exposure and genetic damage in cells. With the goal of reducing clean-up steps and improving sample throughput, a Differential Mobility Spectrometry/Mass Spectrometry (DMS/MS) platform has been introduced for adduct analysis. A DMS/MS platform has been utilized for the analysis of dG-ABP, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl (4-ABP). After optimization of the DMS parameters, each sample was analyzed in just 30 s following a simple protein precipitation step of the digested DNA. A detection limit of one modification in 10^6 nucleosides has been achieved using only 2 µg of DNA. A brief comparison (quantitative and qualitative) with liquid chromatography/mass spectrometry is also presented highlighting the advantages of using the DMS/MS method as a high-throughput platform. The data presented demonstrate the successful application of a DMS/MS/MS platform for the rapid quantitation of DNA adducts using, as a model analyte, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl. Copyright © 2013 John Wiley & Sons, Ltd.
A systematic review of quantitative burn wound microbiology in the management of burns patients.
Halstead, Fenella D; Lee, Kwang Chear; Kwei, Johnny; Dretzke, Janine; Oppenheim, Beryl A; Moiemen, Naiem S
2018-02-01
The early diagnosis of infection or sepsis in burns is important for patient care. Globally, a large number of burn centres advocate quantitative cultures of wound biopsies for patient management, since there is assumed to be a direct link between the bioburden of a burn wound and the risk of microbial invasion. Given the conflicting study findings in this area, a systematic review was warranted. Bibliographic databases were searched with no language restrictions to August 2015. Study selection, data extraction and risk of bias assessment were performed in duplicate using pre-defined criteria. Substantial heterogeneity precluded quantitative synthesis, and findings were described narratively, sub-grouped by clinical question. Twenty-six laboratory and/or clinical studies were included. Substantial heterogeneity hampered comparisons across studies and interpretation of findings. Limited evidence suggests that (i) more than one quantitative microbiology sample is required to obtain reliable estimates of bacterial load; (ii) biopsies are more sensitive than swabs in diagnosing or predicting sepsis; (iii) high bacterial loads may predict worse clinical outcomes, and (iv) both quantitative and semi-quantitative culture reports need to be interpreted with caution and in the context of other clinical risk factors. The evidence base for the utility and reliability of quantitative microbiology for diagnosing or predicting clinical outcomes in burns patients is limited and often poorly reported. Consequently future research is warranted. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.
van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie
2017-09-01
This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Hargrove, W. W.; Norman, S. P.; Kumar, J.; Hoffman, F. M.
2017-12-01
National-scale polar analysis of MODIS NDVI allows quantification of the degree of seasonality expressed by local vegetation, and also selects the optimal start/end of a local "phenological year" that is empirically customized for the vegetation growing at each location. Interannual differences in timing of phenology make direct comparisons of vegetation health and performance between years difficult, whether at the same or different locations. By "sliding" the two phenologies in time using a Procrustean linear time shift, any particular phenological event or "completion milestone" can be synchronized, allowing direct comparison of differences in timing of other remaining milestones. Going beyond a simple linear translation, time can be "rubber-sheeted," compressed or dilated. Considering one phenology curve to be a reference, the second phenology can be "rubber-sheeted" to fit that baseline as well as possible by stretching or shrinking time to match multiple control points, which can be any recognizable phenological events. Similar to "rubber sheeting" to georectify a map inside a GIS, rubber sheeting a phenology curve also yields a warping signature that shows at every time and every location how many days the adjusted phenology is ahead or behind the phenological development of the reference vegetation. Using such temporal methods to "adjust" phenologies may help to quantify vegetation impacts from frost, drought, wildfire, insects and diseases by permitting the most commensurate quantitative comparisons with unaffected vegetation.
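The "rubber sheeting" step amounts to a piecewise-linear time warp through matched milestone control points. A minimal sketch with invented day-of-year milestones:

```python
import numpy as np

# Control points pair day-of-year milestones in the target phenology (t_obs)
# with the same milestones in the reference (t_ref); values are invented.
t_obs = np.array([90.0, 150.0, 270.0])   # target: green-up, peak, senescence
t_ref = np.array([100.0, 160.0, 260.0])  # reference milestones

def warp(day):
    """Piecewise-linear 'rubber sheet': map a target day onto the reference
    timeline (np.interp requires t_obs to be increasing)."""
    return np.interp(day, t_obs, t_ref)

# Warping signature: days the target runs ahead (+) or behind (-) the
# reference at each point in the season.
days = np.arange(90, 271)
signature = warp(days) - days
```

Here the target greens up ten days early and senesces ten days late relative to the reference, and the signature records that offset day by day.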
Punishment in human choice: direct or competitive suppression?
Critchfield, Thomas S; Paletz, Elliott M; MacAleese, Kenneth R; Newland, M Christopher
2003-01-01
This investigation compared the predictions of two models describing the integration of reinforcement and punishment effects in operant choice. Deluty's (1976) competitive-suppression model (conceptually related to two-factor punishment theories) and de Villiers' (1980) direct-suppression model (conceptually related to one-factor punishment theories) have been tested previously in nonhumans but not at the individual level in humans. Mouse clicking by college students was maintained in a two-alternative concurrent schedule of variable-interval money reinforcement. Punishment consisted of variable-interval money losses. Experiment 1 verified that money loss was an effective punisher in this context. Experiment 2 consisted of qualitative model comparisons similar to those used in previous studies involving nonhumans. Following a no-punishment baseline, punishment was superimposed upon both response alternatives. Under schedule values for which the direct-suppression model, but not the competitive-suppression model, predicted distinct shifts from baseline performance, or vice versa, 12 of 14 individual-subject functions, generated by 7 subjects, supported the direct-suppression model. When the punishment models were converted to the form of the generalized matching law, least-squares linear regression fits for a direct-suppression model were superior to those of a competitive-suppression model for 6 of 7 subjects. In Experiment 3, a more thorough quantitative test of the modified models, fits for a direct-suppression model were superior in 11 of 13 cases. These results correspond well to those of investigations conducted with nonhumans and provide the first individual-subject evidence that a direct-suppression model, evaluated both qualitatively and quantitatively, describes human punishment better than a competitive-suppression model. We discuss implications for developing better punishment models and future investigations of punishment in human choice. PMID:13677606
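In generalized-matching form, the two models differ only in how punishment enters the log reinforcement ratio. A sketch with synthetic rates, generated to satisfy the direct-suppression model exactly so that its fit recovers sensitivity a = 1 and zero bias (all schedule values invented):

```python
import numpy as np

# Reinforcement (R) and punishment (P) rates for alternatives 1 and 2 on
# three hypothetical schedule pairs (all rates invented).
R1, R2 = np.array([60.0, 40.0, 30.0]), np.array([20.0, 40.0, 60.0])
P1, P2 = np.array([10.0, 5.0, 5.0]), np.array([5.0, 5.0, 10.0])

# Synthetic behavior ratios generated by the direct-suppression model:
#   direct:      log(B1/B2) = a*log((R1-P1)/(R2-P2)) + log_c
#   competitive: log(B1/B2) = a*log((R1+P2)/(R2+P1)) + log_c
log_b = np.log((R1 - P1) / (R2 - P2))

def fit(x, y):
    """Least-squares sensitivity (slope) and log bias (intercept)."""
    A = np.vstack([x, np.ones_like(x)]).T
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, b

a_direct, b_direct = fit(np.log((R1 - P1) / (R2 - P2)), log_b)
a_comp, b_comp = fit(np.log((R1 + P2) / (R2 + P1)), log_b)
```

On real data the comparison runs the other way: fit both forms to observed log response ratios and compare goodness of fit, as the study does for each subject.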
Quantification of Ice Accretions for Icing Scaling Evaluations
NASA Technical Reports Server (NTRS)
Ruff, Gary A.; Anderson, David N.
2003-01-01
The comparison of ice accretion characteristics is an integral part of aircraft icing research. It is often necessary to compare an ice accretion obtained from a flight test or numerical simulation to one produced in an icing wind tunnel, or to validate an icing scaling method. Traditionally, this has been accomplished by overlaying two-dimensional tracings of ice accretion shapes. This paper addresses the basic question of how to compare ice accretions using more quantitative methods. For simplicity, geometric characteristics of the ice accretions are used for the comparison. One method evaluated is a direct comparison of the percent differences of the geometric measurements. The second method inputs these measurements into a fuzzy inference system to obtain a single measure of the goodness of the comparison. The procedures are demonstrated by comparing ice shapes obtained in the Icing Research Tunnel at NASA Glenn Research Center during recent icing scaling tests. The results demonstrate that this type of analysis is useful in quantifying the similarity of ice accretion shapes and that the procedures should be further developed by expanding the analysis to additional icing data sets.
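The first method (direct percent differences of geometric measurements) reduces to a few lines. The measurement names and values below are hypothetical stand-ins for the paper's geometric characteristics:

```python
# Hypothetical geometric characteristics of a reference ice shape and a
# scaled-test shape (names and values invented for illustration).
reference = {"horn_length_cm": 5.0, "horn_angle_deg": 40.0, "icing_limit_cm": 12.0}
scaled = {"horn_length_cm": 4.6, "horn_angle_deg": 44.0, "icing_limit_cm": 11.4}

# Direct comparison: percent difference of each measurement.
pct_diff = {k: 100.0 * abs(scaled[k] - reference[k]) / abs(reference[k])
            for k in reference}
worst = max(pct_diff, key=pct_diff.get)   # least similar characteristic
```

The fuzzy-inference alternative would map these percent differences through membership functions into a single similarity score instead of leaving them as separate numbers.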
Studying the fundamental limit of optical fiber links to the 10^-21 level.
Xu, Dan; Lee, Won-Kyu; Stefani, Fabio; Lopez, Olivier; Amy-Klein, Anne; Pottie, Paul-Eric
2018-04-16
We present a hybrid fiber link combining effective optical frequency transfer and evaluation of performance with a self-synchronized two-way comparison. It enables us to detect the round-trip fiber noise and each of the forward and backward one-way fiber noises simultaneously. The various signals acquired with this setup allow us to study quantitatively several properties of optical fiber links. We check the reciprocity of the accumulated noise forth and back over a bi-directional fiber to the level of 3.1(±3.9) × 10^-20 based on 160,000 s of continuous data. We also analyze the noise correlation between two adjacent fibers and show the first experimental evidence of interferometric noise at very low Fourier frequency. We estimate redundantly and consistently the stability and accuracy of the transferred optical frequency over 43 km at the 4 × 10^-21 level after 16 days of integration and demonstrate that a frequency comparison with instability as low as 8 × 10^-18 would be achievable with uni-directional fibers in an urban area.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
Control volume based hydrocephalus research; analysis of human data
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer
2010-11-01
Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms: qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
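The core of the control-volume approach is the integral mass balance dV/dt = ΣQ_in - ΣQ_out, with each flow rate obtained by integrating measured velocity over a cross-section. A minimal sketch with invented, non-clinical numbers:

```python
# One control surface: a 10 x 10 patch of MR velocity pixels (cm/s) with
# pixel size dx (cm). All numbers are invented, not clinical values.
dx = 0.05
velocity = [[2.0] * 10 for _ in range(10)]

# Volumetric flow through the surface: integral of v over the area.
q_in = sum(sum(row) for row in velocity) * dx * dx   # cm^3/s

q_out = 0.45                 # assumed outflow through a second surface
dv_dt = q_in - q_out         # net rate of volume change inside the CV
```

The momentum equation is handled analogously, which is what lets pressure terms be inferred from velocity data alone.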
NASA Astrophysics Data System (ADS)
Simon, E.; Nowicki, S.; Neumann, T.; Tyahla, L.; Saba, J. L.; Guerber, J. R.; Bonin, J. A.; DiMarzio, J. P.
2017-12-01
The Cryosphere model Comparison tool (CmCt) is a web-based ice sheet model validation tool that is being developed by NASA to facilitate direct comparison between observational data and various ice sheet models. The CmCt allows the user to take advantage of several decades worth of observations from Greenland and Antarctica. Currently, the CmCt can be used to compare ice sheet models provided by the user with remotely sensed satellite data from ICESat (Ice, Cloud, and land Elevation Satellite) laser altimetry, the GRACE (Gravity Recovery and Climate Experiment) satellite, and radar altimetry (ERS-1, ERS-2, and Envisat). One or more models can be uploaded through the CmCt website and compared with observational data, or compared to each other or other models. The CmCt calculates statistics on the differences between the model and observations, and other quantitative and qualitative metrics, which can be used to evaluate the different model simulations against the observations. The qualitative metrics consist of a range of visual outputs and the quantitative metrics consist of several whole-ice-sheet scalar values that can be used to assign an overall score to a particular simulation. The comparison results from CmCt are useful in quantifying improvements within a specific model (or within a class of models) as a result of differences in model dynamics (e.g., shallow vs. higher-order dynamics approximations), model physics (e.g., representations of ice sheet rheological or basal processes), or model resolution (mesh resolution and/or changes in the spatial resolution of input datasets). The framework and metrics could also be used as a model-to-model intercomparison tool, simply by substituting another model's outputs for the observational datasets. Future versions of the tool will include comparisons with other datasets that are of interest to the modeling community, such as ice velocity, ice thickness, and surface mass balance.
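The whole-ice-sheet scalar metrics such a tool computes typically include bias, RMSE, and correlation between modeled and observed fields on a common grid; a sketch on synthetic arrays (not actual CmCt data or output):

```python
import numpy as np

# Synthetic modeled and observed elevations on a common 2 x 2 grid
# (stand-ins for regridded ice-sheet fields).
obs = np.array([[10.0, 12.0], [14.0, 16.0]])
mod = np.array([[11.0, 12.0], [13.0, 17.0]])

diff = mod - obs
bias = diff.mean()                      # mean model-minus-obs error
rmse = np.sqrt((diff ** 2).mean())      # root-mean-square error
corr = np.corrcoef(mod.ravel(), obs.ravel())[0, 1]
```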
Li, Qiuping; Xu, Yinghua; Zhou, Huiya; Loke, Alice Yuen
2015-12-01
The purpose of this study was to test the previously proposed Preliminary Live with Love Conceptual Framework (P-LLCF) that focuses on spousal caregiver-patient couples in their journey of coping with cancer as dyads. A mixed-methods study that included qualitative and quantitative approaches was conducted. Methods of concept and theory analysis, and structural equation modeling (SEM), were applied in testing the P-LLCF. In the qualitative approach to testing the concepts included in the P-LLCF, a comparison was made between the P-LLCF and a preliminary conceptual framework derived from focus group interviews among Chinese couples coping with cancer. The comparison showed that the concepts identified in the P-LLCF are relevant to the phenomenon under scrutiny, and attributes of the concepts are consistent with those identified among Chinese cancer couple dyads. In the quantitative study, 117 cancer couples were recruited. The findings showed that inter-relationships exist among the components included in the P-LLCF (event situation, dyadic mediators, dyadic appraisal, dyadic coping, and dyadic outcomes): the event situation impacts the dyadic outcomes directly, or indirectly through the dyadic mediators. The dyadic mediators, dyadic appraisal, and dyadic coping are interrelated and work together to benefit the dyadic outcomes. This study provides evidence that supports the interlinked components and the relationships included in the P-LLCF. The findings of this study are important in that they provide healthcare professionals with guidance and direction, according to the P-LLCF, on how to plan supportive programs for couples coping with cancer. Copyright © 2015 Elsevier Ltd. All rights reserved.
Impaired cognitive plasticity and goal-directed control in adolescent obsessive-compulsive disorder.
Gottwald, Julia; de Wit, Sanne; Apergis-Schoute, Annemieke M; Morein-Zamir, Sharon; Kaser, Muzaffer; Cormack, Francesca; Sule, Akeem; Limmer, Winifred; Morris, Anna Conway; Robbins, Trevor W; Sahakian, Barbara J
2018-01-22
Youths with obsessive-compulsive disorder (OCD) experience severe distress and impaired functioning at school and at home. Critical cognitive domains for daily functioning and academic success are learning, memory, cognitive flexibility and goal-directed behavioural control. Performance in these important domains among teenagers with OCD was therefore investigated in this study. A total of 36 youths with OCD and 36 healthy comparison subjects completed two memory tasks: Pattern Recognition Memory (PRM) and Paired Associates Learning (PAL); as well as the Intra-Extra Dimensional Set Shift (IED) task to quantitatively gauge learning as well as cognitive flexibility. A subset of 30 participants of each group also completed a Differential-Outcome Effect (DOE) task followed by a Slips-of-Action Task, designed to assess the balance of goal-directed and habitual behavioural control. Adolescent OCD patients showed a significant learning and memory impairment. Compared with healthy comparison subjects, they made more errors on PRM and PAL and in the first stages of IED involving discrimination and reversal learning. Patients were also slower to learn about contingencies in the DOE task and were less sensitive to outcome devaluation, suggesting an impairment in goal-directed control. This study advances the characterization of juvenile OCD. Patients demonstrated impairments in all learning and memory tasks. We also provide the first experimental evidence of impaired goal-directed control and lack of cognitive plasticity early in the development of OCD. The extent to which the impairments in these cognitive domains impact academic performance and symptom development warrants further investigation.
NASA Astrophysics Data System (ADS)
Morin, Efrat; Marra, Francesco; Peleg, Nadav; Mei, Yiwen; Anagnostou, Emmanouil N.
2017-04-01
Rainfall frequency analysis is used to quantify the probability of occurrence of extreme rainfall and is traditionally based on rain gauge records. The limited spatial coverage of rain gauges is insufficient to sample the spatiotemporal variability of extreme rainfall and to provide the areal information required by management and design applications. Conversely, remote sensing instruments, even if quantitatively uncertain, offer coverage and spatiotemporal detail that allow overcoming these issues. In recent years, remote sensing datasets began to be used for frequency analyses, taking advantage of increased record lengths and quantitative adjustments of the data. However, the studies so far made use of concepts and techniques developed for rain gauge (i.e. point or multiple-point) data and have been validated by comparison with gauge-derived analyses. These procedures add further sources of uncertainty, prevent separating data uncertainties from methodological ones, and prevent fully exploiting the available information. In this study, we step out of the gauge-centered concept, presenting a direct comparison between at-site Intensity-Duration-Frequency (IDF) curves derived from different remote sensing datasets on corresponding spatial scales, temporal resolutions and records. We analyzed 16 years of homogeneously corrected and gauge-adjusted C-Band weather radar estimates, high-resolution CMORPH and gauge-adjusted high-resolution CMORPH over the Eastern Mediterranean. Results of this study include: (a) good spatial correlation between radar and satellite IDFs (≈0.7 for 2-5 year return periods); (b) consistent correlation and dispersion in the raw and gauge-adjusted CMORPH; (c) bias is almost uniform with return period for 12-24 h durations; (d) radar identifies thicker-tailed distributions than CMORPH, and the tail of the distributions depends on the spatial and temporal scales.
These results demonstrate the potential of remote sensing datasets for rainfall frequency analysis in management (e.g. warning and early-warning systems) and design (e.g. sewer design, large-scale drainage planning) applications.
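An at-site IDF point for one duration can be sketched by fitting a Gumbel distribution to annual maximum intensities via the method of moments (a common choice, though not necessarily the one used in the study); the 16 annual maxima below are synthetic:

```python
import numpy as np

# Synthetic annual maximum rain intensities (mm/h) for one duration at one
# pixel; 16 values to mirror the record length analyzed in the study.
ams = np.array([22.0, 31.0, 18.0, 27.0, 40.0, 25.0, 29.0, 33.0,
                21.0, 26.0, 35.0, 24.0, 30.0, 28.0, 19.0, 37.0])

# Gumbel fit by the method of moments.
beta = np.std(ams, ddof=1) * np.sqrt(6.0) / np.pi   # scale
mu = ams.mean() - 0.5772 * beta                     # location

def intensity(T):
    """Intensity exceeded on average once every T years (T > 1)."""
    return mu - beta * np.log(-np.log(1.0 - 1.0 / T))

i2, i10, i100 = intensity(2.0), intensity(10.0), intensity(100.0)
```

Repeating this per duration and per pixel yields the IDF surfaces whose spatial correlation and tail behavior the study compares across radar and CMORPH.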
NASA Technical Reports Server (NTRS)
Kuhlow, W. W.; Chatters, G. C.
1977-01-01
An earth-edge methodology has been developed to account for the relative attitude changes between successive ATS-6 images, which allows reasonably high-quality wind sets to be produced. The method consists of measuring the displacements of the right and left infrared earth edges between successive ATS-6 images as a function of scan line; from these measurements the attitude changes can be deduced and used to correct the apparent cloud displacement measurements. The wind data sets generated from ATS-6 using the earth-edge methodology were compared with those derived from the SMS-1 images (and model) covering the same time period. Quantitative comparisons for low-level trade cumuli were made at interpolated uniformly spaced grid points and for selected individual comparison clouds. For the selected individual comparison clouds, the root-mean-square differences for the U and V components were 1.0 and 1.2 meters per second, with a maximum wind direction difference of 15 deg.
Xu, Xiaoli; Peng, Cheng; Wang, Xiaofu; Chen, Xiaoyun; Wang, Qiang; Xu, Junfeng
2016-12-01
This study evaluated the applicability of droplet digital PCR (ddPCR) as a tool for maize zygosity determination using quantitative real-time PCR (qPCR) as a reference technology. Quantitative real-time PCR is commonly used to determine transgene copy number or GMO zygosity characterization. However, its effectiveness is based on identical reaction efficiencies for the transgene and the endogenous reference gene. Additionally, a calibrator sample should be utilized for accuracy. Droplet digital PCR is a DNA molecule counting technique that directly counts the absolute number of target and reference DNA molecules in a sample, independent of assay efficiency or external calibrators. The zygosity of the transgene can be easily determined using the ratio of the quantity of the target gene to the reference single copy endogenous gene. In this study, both the qPCR and ddPCR methods were used to determine insect-resistant transgenic maize IE034 zygosity. Both methods performed well, but the ddPCR method was more convenient because of its absolute quantification property.
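The zygosity call itself is just the target-to-reference copy ratio: about 1.0 for a homozygous diploid (two transgene copies against two reference copies) and about 0.5 for a hemizygote. A sketch with invented droplet-derived copy counts and an assumed tolerance:

```python
# Copy counts per reaction, as would come from Poisson correction of ddPCR
# droplet counts (numbers invented). A diploid plant carries two copies of
# the single-copy endogenous reference gene.
def zygosity(target_copies, reference_copies, tol=0.15):
    """Call zygosity from the target/reference copy ratio."""
    ratio = target_copies / reference_copies
    if abs(ratio - 1.0) <= tol:
        return "homozygous"    # 2 transgene copies : 2 reference copies
    if abs(ratio - 0.5) <= tol:
        return "hemizygous"    # 1 transgene copy : 2 reference copies
    return "undetermined"

call = zygosity(target_copies=1480.0, reference_copies=2930.0)  # ratio ~0.51
```

This is where ddPCR's advantage shows: the counts are absolute, so no calibrator sample or efficiency correction enters the ratio.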
Ewert, Alice; Granvogl, Michael; Schieberle, Peter
2011-04-27
Two stable isotope dilution assays were developed for the quantitation of acrolein in fats and oils using [(13)C(3)]-acrolein as the internal standard. First, a direct GC-MS headspace method, followed by an indirect GC-MS method using derivatization with pentafluorophenyl hydrazine, was established. Analysis of six different types of oils varying in their pattern of fatty acids showed significant differences in the amounts of acrolein formed after heating at various temperatures and for various times. For example, after 24 h at 140 °C, coconut oil contained 6.7 mg/kg, whereas linseed oil was highest with 242.3 mg/kg. A comparison of the results showed that the extent of acrolein formation seemed to be correlated with the amount of linolenic acid in the oils. Although the acrolein concentrations were lowered in all six oils after frying of potato crisps, linseed and rapeseed oil still contained the highest amounts of acrolein after frying. By applying both methods on different thermally treated fats and oils, nearly identical quantitative data were obtained.
Phenotypic selection in natural populations: what limits directional selection?
Kingsolver, Joel G; Diamond, Sarah E
2011-03-01
Studies of phenotypic selection document directional selection in many natural populations. What factors reduce total directional selection and the cumulative evolutionary responses to selection? We combine two data sets for phenotypic selection, representing more than 4,600 distinct estimates of selection from 143 studies, to evaluate the potential roles of fitness trade-offs, indirect (correlated) selection, temporally varying selection, and stabilizing selection for reducing net directional selection and cumulative responses to selection. We detected little evidence that trade-offs among different fitness components reduced total directional selection in most study systems. Comparisons of selection gradients and selection differentials suggest that correlated selection frequently reduced total selection on size but not on other types of traits. The direction of selection on a trait often changes over time in many temporally replicated studies, but these fluctuations have limited impact in reducing cumulative directional selection in most study systems. Analyses of quadratic selection gradients indicated stabilizing selection on body size in at least some studies but provided little evidence that stabilizing selection is more common than disruptive selection for most traits or study systems. Our analyses provide little evidence that fitness trade-offs, correlated selection, or stabilizing selection strongly constrains the directional selection reported for most quantitative traits.
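The contrast drawn above between selection differentials (total selection on a trait) and selection gradients (direct selection, holding correlated traits constant) follows the standard Lande-Arnold regression framework. The sketch below is a generic implementation of that framework, not the authors' analysis pipeline.

```python
import numpy as np

def selection_differential(trait, fitness):
    # Total directional selection: sample covariance of the trait
    # with relative fitness.
    w_rel = np.asarray(fitness, float) / np.mean(fitness)
    return np.cov(trait, w_rel, ddof=1)[0, 1]

def selection_gradients(traits, fitness):
    # Direct selection: partial regression of relative fitness on all
    # traits jointly, which removes indirect (correlated) selection.
    w_rel = np.asarray(fitness, float) / np.mean(fitness)
    Z = np.asarray(traits, float)
    X = np.column_stack([np.ones(len(w_rel)), Z - Z.mean(axis=0)])
    beta, *_ = np.linalg.lstsq(X, w_rel, rcond=None)
    return beta[1:]
```

When gradients are consistently smaller than differentials for a trait, correlated selection is absorbing part of the total selection, which is the comparison the abstract reports for body size.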
Directed differential connectivity graph of interictal epileptiform discharges
Amini, Ladan; Jutten, Christian; Achard, Sophie; David, Olivier; Soltanian-Zadeh, Hamid; Hossein-Zadeh, Gh. Ali; Kahane, Philippe; Minotti, Lorella; Vercueil, Laurent
2011-01-01
In this paper, we study temporal couplings between interictal events of spatially remote regions in order to localize the leading epileptic regions from intracerebral electroencephalogram (iEEG). We aim to assess whether quantitative epileptic graph analysis during the interictal period may be helpful to predict the seizure onset zone of ictal iEEG. Using the wavelet transform, cross-correlation coefficients, and a multiple hypothesis test, we propose a differential connectivity graph (DCG) to represent the connections that change significantly between epileptic and non-epileptic states as defined by the interictal events. Post-processing steps based on mutual information and multi-objective optimization are proposed to localize the leading epileptic regions through the DCG. The suggested approach is applied to iEEG recordings of five patients suffering from focal epilepsy. Quantitative comparisons of the proposed epileptic regions with ictal onset zones detected by visual inspection and using electrically stimulated seizures reveal good performance of the present method. PMID:21156385
Quantitative description of ion transport via plasma membrane of yeast and small cells.
Volkov, Vadim
2015-01-01
Modeling of ion transport via the plasma membrane requires identification and quantitative understanding of the processes involved. Brief characterization of the main ion transport systems of a yeast cell (Pma1, Ena1, TOK1, Nha1, Trk1, Trk2, non-selective cation conductance) and determination of the exact number of molecules of each transporter per typical cell allow us to predict the corresponding ion flows. In this review, a comparison of ion transport in the small yeast cell and several animal cell types is provided. The importance of the cell volume to surface ratio is emphasized. The role of the cell wall and lipid rafts is discussed with respect to the required increase in spatial and temporal resolution of measurements. Conclusions are formulated to describe specific features of ion transport in a yeast cell, and potential directions of future research are outlined based on these assumptions.
PMID:26113853
Darwish, Hany W; Bakheit, Ahmed H; Abdelhameed, Ali S
2016-03-01
Simultaneous spectrophotometric analysis of a multi-component dosage form of olmesartan, amlodipine and hydrochlorothiazide used for the treatment of hypertension has been carried out using various chemometric methods. Multivariate calibration methods include classical least squares (CLS) executed by net analyte processing (NAP-CLS), orthogonal signal correction (OSC-CLS) and direct orthogonal signal correction (DOSC-CLS) in addition to multivariate curve resolution-alternating least squares (MCR-ALS). Results demonstrated the efficiency of the proposed methods as quantitative tools of analysis as well as their qualitative capability. The three analytes were determined precisely using the aforementioned methods in an external data set and in a dosage form after optimization of experimental conditions. Finally, the efficiency of the models was validated via comparison with the partial least squares (PLS) method in terms of accuracy and precision.
Valero, E; Sanz, J; Martínez-Castro, I
2001-06-01
Direct thermal desorption (DTD) has been used as a technique for extracting volatile components of cheese as a preliminary step to their gas chromatographic (GC) analysis. In this study, it is applied to different cheese varieties: Camembert, blue, Chaumes, and La Serena. Volatiles are also extracted using other techniques such as simultaneous distillation-extraction and dynamic headspace. Separation and identification of the cheese components are carried out by GC-mass spectrometry. Approximately 100 compounds are detected in the examined cheeses. The described results show that DTD is fast, simple, and easy to automate; requires only a small amount of sample (approximately 50 mg); and affords quantitative information about the main groups of compounds present in cheeses.
Phonon Lifetime Observation in Epitaxial ScN Film with Inelastic X-Ray Scattering Spectroscopy.
Uchiyama, H; Oshima, Y; Patterson, R; Iwamoto, S; Shiomi, J; Shimamura, K
2018-06-08
Phonon-phonon scattering dominates the thermal properties in nonmetallic materials, and it directly influences device performance in applications. The understanding of the scattering has been progressing using computational approaches, and the direct and systematic observation of phonon modes that include momentum dependences is desirable. We report experimental data on the phonon dispersion curves and lifetimes in an epitaxially grown ScN film using inelastic x-ray scattering measurements. The momentum dependence of the optical phonon lifetimes is estimated from the spectral width, and the highest-energy phonon mode around the zone center is found to possess a short lifetime of 0.21 ps. A comparison with first-principles calculations shows that our observed phonon lifetimes are quantitatively explained by three-body phonon-phonon interactions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masada, Youhei; Sano, Takayoshi, E-mail: ymasada@harbor.kobe-u.ac.jp, E-mail: sano@ile.osaka-u.ac.jp
2014-10-10
The mechanism of large-scale dynamos in rigidly rotating stratified convection is explored by direct numerical simulations (DNS) in Cartesian geometry. A mean-field dynamo model is also constructed using turbulent velocity profiles consistently extracted from the corresponding DNS results. By quantitative comparison between the DNS and our mean-field model, it is demonstrated that the oscillatory α² dynamo wave, excited and sustained in the convection zone, is responsible for large-scale magnetic activities such as cyclic polarity reversal and spatiotemporal migration. The results provide strong evidence that a nonuniformity of the α-effect, which is a natural outcome of rotating stratified convection, can be an important prerequisite for large-scale stellar dynamos, even without the Ω-effect.
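In standard mean-field notation, an α² dynamo of the kind referred to above solves the mean-field induction equation, with growth set by the α effect working against turbulent diffusion. The textbook form (not the authors' specific formulation) is:

```latex
\frac{\partial \overline{\mathbf{B}}}{\partial t}
  = \nabla \times \left( \alpha \, \overline{\mathbf{B}} \right)
  + \eta_t \nabla^2 \overline{\mathbf{B}},
\qquad
\gamma(k) = |\alpha|\, k - \eta_t k^2
```

Here γ(k) is the growth rate of a mean-field mode with wavenumber k, positive when |α| > η_t k. Notably, no Ω-effect (differential rotation) term appears, consistent with the abstract's conclusion that a nonuniform α alone can sustain the large-scale dynamo.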
NASA Astrophysics Data System (ADS)
Reineker, P.; Kenkre, V. M.; Kühne, R.
1981-08-01
A quantitative comparison is given between a simple theoretical prediction for the drift mobility of photo-electrons in organic molecular crystals, calculated within the model of coupled band-like and hopping motion, and the experiments on naphthalene of Schein et al. and Karl et al.
Tobita, Kenji; Matsumoto, Takuya; Ohashi, Satoru; Bessho, Masahiko; Kaneko, Masako; Ohnishi, Isao
2012-07-01
It has been previously demonstrated that low-intensity pulsed ultrasound stimulation (LIPUS) enhances formation of the medullary canal and cortex in a gap-healing model of the tibia in rabbits, shortens the time required for remodeling, and enhances mineralization of the callus. In the current study, the mechanical integrity of these models was confirmed. To do this, the cross-sectional moment of inertia (CSMI) obtained from quantitative micro-computed tomography scans was calculated and compared with a four-point bending test. This parameter can be analyzed in any direction, and three directions were selected to define an XYZ coordinate system (X and Y for bending; Z for torsion). The present results demonstrated that LIPUS promoted earlier restoration of bending stiffness at the healing site. In addition, LIPUS was effective not only in the ultrasound-irradiated plane but also in the other two planes. CSMI may provide the structural as well as compositional determinants needed to assess fracture healing and would be very useful as a replacement for mechanical testing.
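The cross-sectional moment of inertia from a segmented micro-CT slice reduces to a second-moment sum over bone pixels. The sketch below assumes a binary mask and a uniform pixel size; it is a generic implementation, not the authors' software.

```python
import numpy as np

def cross_sectional_moments(mask, pixel_size):
    # Second moments of area of a segmented CT slice about the
    # centroidal X and Y axes (bending), plus the polar moment
    # J = Ix + Iy (torsion), summed over pixel centers.
    ys, xs = np.nonzero(mask)
    x = (xs + 0.5) * pixel_size   # pixel-center coordinates
    y = (ys + 0.5) * pixel_size
    area = pixel_size ** 2
    Ix = float(np.sum((y - y.mean()) ** 2) * area)
    Iy = float(np.sum((x - x.mean()) ** 2) * area)
    return Ix, Iy, Ix + Iy
```

Rotating the coordinate frame before summing gives the moment about any chosen direction, which is how CSMI can be evaluated in the irradiated and non-irradiated planes separately.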
Yu, Kebing; Salomon, Arthur R
2009-12-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through MS/MS. Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to various experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our high throughput autonomous proteomic pipeline used in the automated acquisition and post-acquisition analysis of proteomic data.
Nadin-Davis, Susan; Knowles, Margaret K; Burke, Teresa; Böse, Reinhard; Devenish, John
2015-07-01
A quantitative real-time polymerase chain reaction method (qPCR) was developed and tested for the detection of Taylorella equigenitalis. It was shown to have an analytical sensitivity of 5 colony-forming units (CFU) of T. equigenitalis when applied to the testing of culture swabs that mimicked field samples, and a high analytical specificity in not reacting to 8 other commensal bacterial species associated with horses. As designed, it could also differentiate specifically between T. equigenitalis and T. asinigenitalis. The qPCR was compared to standard culture in a study that included 45 swab samples from 6 horses (1 stallion, 5 mares) naturally infected with T. equigenitalis in Canada, 39 swab samples from 5 naturally infected stallions in Germany, and 311 swab samples from 87 culture negative horses in Canada. When the comparison was conducted on an individual sample swab basis, the qPCR had a statistical sensitivity and specificity of 100% and 96.4%, respectively, and 100% and 99.1% when the comparison was conducted on a sample set basis. A comparison was also made on 203 sample swabs from the 5 German stallions taken over a span of 4 to 9 mo following antibiotic treatment. The qPCR was found to be highly sensitive and at least as good as culture in detecting the presence of T. equigenitalis in post-treatment samples. The work demonstrates that the qPCR assay described here can potentially be used to detect the presence of T. equigenitalis directly from submitted sample swabs taken from infected horses and also for determining T. equigenitalis freedom following treatment.
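The reported statistical sensitivity and specificity follow from a standard 2x2 confusion table against the culture reference. The helper below is generic; the per-sample counts behind the 100% and 96.4% figures are not given in the abstract, so the test values here are illustrative only.

```python
def sensitivity_specificity(tp, fp, tn, fn):
    # Standard 2x2 confusion-table summary against a reference method:
    # sensitivity = TP/(TP+FN), specificity = TN/(TN+FP).
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity
```

Computing the pair once per sample swab and once per sample set reproduces the two levels of comparison described in the abstract.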
PMID:26130847
Burch, Matthew J.; Fancher, Chris M.; Patala, Srikanth; ...
2016-11-18
A novel technique, which directly and nondestructively maps polar domains using electron backscatter diffraction (EBSD) is described and demonstrated. Through dynamical diffraction simulations and quantitative comparison to experimental EBSD patterns, the absolute orientation of a non-centrosymmetric crystal can be determined. With this information, the polar domains of a material can be mapped. The technique is demonstrated by mapping the non-ferroelastic, or 180°, ferroelectric domains in periodically poled LiNbO 3 single crystals. Furthermore, the authors demonstrate the possibility of mapping polarity using this technique in other polar materials system.
Liu, D. R.; Mangelinck-Noël, N.; Gandin, Ch-A.; Zimmermann, G.; Sturz, L.; Nguyen Thi, H.; Billia, B.
2016-03-01
A two-dimensional multi-scale cellular automaton - finite element (CAFE) model is used to simulate grain structure evolution and microsegregation formation during solidification of refined Al-7wt%Si alloys under microgravity. The CAFE simulations are first qualitatively compared with the benchmark experimental data under microgravity. Qualitative agreement is obtained for the position of columnar to equiaxed transition (CET) and the CET transition mode (sharp or progressive). Further comparisons of the distributions of grain elongation factor and equivalent diameter are conducted and reveal a fair quantitative agreement.
Comparison of GEANT4 very low energy cross section models with experimental data in water.
Incerti, S; Ivanchenko, A; Karamitros, M; Mantero, A; Moretto, P; Tran, H N; Mascialino, B; Champion, C; Ivanchenko, V N; Bernal, M A; Francis, Z; Villagrasa, C; Baldacchin, G; Guèye, P; Capra, R; Nieminen, P; Zacharatou, C
2010-09-01
The GEANT4 general-purpose Monte Carlo simulation toolkit is able to simulate physical interaction processes of electrons, hydrogen and helium atoms with charge states (H0, H+) and (He0, He+, He2+), respectively, in liquid water, the main component of biological systems, down to the electron volt regime and the submicrometer scale, providing GEANT4 users with the so-called "GEANT4-DNA" physics models suitable for microdosimetry simulation applications. The corresponding software has been recently re-engineered in order to provide GEANT4 users with a coherent and unique approach to the simulation of electromagnetic interactions within the GEANT4 toolkit framework (since GEANT4 version 9.3 beta). This work presents a quantitative comparison of these physics models with a collection of experimental data in water collected from the literature. An evaluation of the closeness between the total and differential cross section models available in the GEANT4 toolkit for microdosimetry and experimental reference data is performed using a dedicated statistical toolkit that includes the Kolmogorov-Smirnov statistical test. The authors used experimental data acquired in water vapor as direct measurements in the liquid phase are not yet available in the literature. Comparisons with several recommendations are also presented. The authors have assessed the compatibility of experimental data with GEANT4 microdosimetry models by means of quantitative methods. The results show that microdosimetric measurements in liquid water are necessary to assess quantitatively the validity of the software implementation for the liquid water phase. Nevertheless, a comparison with existing experimental data in water vapor provides a qualitative appreciation of the plausibility of the simulation models. The existing reference data themselves should undergo a critical interpretation and selection, as some of the series exhibit significant deviations from each other. 
The GEANT4-DNA physics models available in the GEANT4 toolkit have been compared in this article to available experimental data in the water vapor phase as well as to several published recommendations on the mass stopping power. These models represent a first step in the extension of the GEANT4 Monte Carlo toolkit to the simulation of biological effects of ionizing radiation.
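The Kolmogorov-Smirnov comparison mentioned above reduces to the maximum distance between two empirical cumulative distribution functions. This is a self-contained sketch of the two-sample statistic, not the dedicated statistical toolkit used in the study.

```python
import numpy as np

def ks_statistic(sample_a, sample_b):
    # Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    # distance between the two empirical CDFs, evaluated at every
    # observed data point.
    a = np.sort(np.asarray(sample_a, float))
    b = np.sort(np.asarray(sample_b, float))
    grid = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, grid, side="right") / len(a)
    cdf_b = np.searchsorted(b, grid, side="right") / len(b)
    return float(np.max(np.abs(cdf_a - cdf_b)))
```

A small statistic (relative to the critical value for the two sample sizes) indicates that the simulated and measured cross-section distributions are compatible.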
Multi-scale Modeling of Plasticity in Tantalum.
Lim, Hojun; Battaile, Corbett Chandler.; Carroll, Jay
In this report, we present a multi-scale computational model to simulate plastic deformation of tantalum, together with validating experiments. At the atomistic/dislocation level, dislocation kink-pair theory is used to formulate temperature- and strain-rate-dependent constitutive equations. The kink-pair theory is calibrated to available data from single-crystal experiments to produce accurate and convenient constitutive laws. The model is then implemented into a BCC crystal plasticity finite element method (CP-FEM) model to predict temperature- and strain-rate-dependent yield stresses of single and polycrystalline tantalum, which are compared with existing experimental data from the literature. Furthermore, classical continuum constitutive models describing temperature- and strain-rate-dependent flow behaviors are fit to the yield stresses obtained from the CP-FEM polycrystal predictions. The model is then used to conduct hydrodynamic simulations of the Taylor cylinder impact test and compared with experiments. In order to validate the proposed tantalum CP-FEM model with experiments, we introduce a method for quantitative comparison of CP-FEM models with various experimental techniques. To mitigate the effects of unknown subsurface microstructure, tantalum tensile specimens with a pseudo-two-dimensional grain structure and grain sizes on the order of millimeters are used. A technique combining electron backscatter diffraction (EBSD) and high-resolution digital image correlation (HR-DIC) is used to measure the texture and sub-grain strain fields upon uniaxial tensile loading at various applied strains. Deformed specimens are also analyzed with optical profilometry measurements to obtain out-of-plane strain fields. These high-resolution measurements are directly compared with large-scale CP-FEM predictions. This computational method directly links fundamental dislocation physics to plastic deformation at the grain scale and to engineering-scale applications. Furthermore, direct and quantitative comparisons between experimental measurements and simulations show that the proposed model accurately captures plasticity in the deformation of polycrystalline tantalum.
Seo, K H; Valentin-Bon, I E; Brackett, R E
2006-03-01
Salmonellosis caused by Salmonella Enteritidis (SE) is a significant cause of foodborne illness in the United States. Consumption of undercooked eggs and egg-containing products has been the primary risk factor for the disease. Accurate bacterial enumeration techniques have become increasingly important for quantitative risk analysis of SE in shell eggs. Traditional enumeration methods mainly depend on slow and tedious most-probable-number (MPN) methods. Therefore, specific, sensitive, and rapid methods for SE quantitation are needed to collect sufficient data for risk assessment and food safety policy development. We previously developed a real-time quantitative PCR assay for the direct detection and enumeration of SE and, in this study, applied it to naturally contaminated ice cream samples with and without enrichment. The detection limit of the real-time PCR assay was determined with artificially inoculated ice cream. When applied to the direct detection and quantification of SE in ice cream, the real-time PCR assay was as sensitive as the conventional plate count method in frequency of detection. However, populations of SE derived from real-time quantitative PCR were approximately 1 log higher than the MPN and CFU values obtained by conventional culture methods. The detection and enumeration of SE in naturally contaminated ice cream can be completed in 3 h by this real-time PCR method, whereas the cultural enrichment method requires 5 to 7 days. A commercial immunoassay for the specific detection of SE was also included in the study. The real-time PCR assay proved to be a valuable tool that may be useful to the food industry in monitoring its processes to improve product quality and safety.
Quantitative risk analysis of oil storage facilities in seismic areas.
Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto
2005-08-31
Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined both for building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities had been quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e., the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
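Crossing a fragility curve with a seismic hazard curve, as described, amounts to a weighted sum over ground-motion levels. The lognormal fragility form below is a common generic choice, and all parameter values are illustrative assumptions, not values from the study.

```python
import math

def fragility(pga, median, beta):
    # Lognormal fragility curve: probability of component failure given
    # peak ground acceleration, with median capacity `median` and
    # log-standard-deviation `beta` (generic form, assumed parameters).
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

def annual_failure_probability(hazard, median, beta):
    # hazard: list of (pga, annual rate of that shaking level) pairs
    # from a discretized PSHA curve; convolve with the fragility.
    return sum(rate * fragility(pga, median, beta) for pga, rate in hazard)
```

The resulting failure probabilities would then feed the consequence analysis that produces the local risk contour plots.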
Veltman, Karin; Huijbregts, Mark A J; Hendriks, A Jan
2010-07-01
Both biotic ligand models (BLM) and bioaccumulation models aim to quantify metal exposure based on mechanistic knowledge, but key factors included in the description of metal uptake differ between the two approaches. Here, we present a quantitative comparison of both approaches and show that BLM and bioaccumulation kinetics can be merged into a common mechanistic framework for metal uptake in aquatic organisms. Our results show that metal-specific absorption efficiencies calculated from BLM parameters for freshwater fish are highly comparable, i.e. within a factor of 2.4 for silver, cadmium, copper, and zinc, to bioaccumulation absorption efficiencies for predominantly marine fish. Conditional affinity constants are significantly related to the metal-specific covalent index. Additionally, the affinity constants of calcium, cadmium, copper, sodium, and zinc are significantly comparable across aquatic species, including molluscs, daphnids, and fish. This suggests that affinity constants can be estimated from the covalent index, and that constants can be extrapolated across species. A new model is proposed that integrates the combined effects of metal chemodynamics, such as speciation, competition, and ligand affinity, and species characteristics, such as size, on metal uptake by aquatic organisms. An important direction for further research is the quantitative comparison of the proposed model with acute toxicity values for organisms belonging to different size classes.
Multilayer Markov Random Field models for change detection in optical remote sensing images
Benedek, Csaba; Shadaydeh, Maha; Kato, Zoltan; Szirányi, Tamás; Zerubia, Josiane
2015-09-01
In this paper, we give a comparative study of three Multilayer Markov Random Field (MRF) based solutions proposed for change detection in optical remote sensing images, called the Multicue MRF, the Conditional Mixed Markov model, and the Fusion MRF. Our purposes are twofold. On one hand, we highlight the significance of the focused model family and set it against various state-of-the-art approaches through a thematic analysis and quantitative tests. We discuss the advantages and drawbacks of class-comparison vs. direct approaches, the usage of training data, various targeted application fields, and different ways of Ground Truth generation, meanwhile informing the reader of the roles in which the Multilayer MRFs can be efficiently applied. On the other hand, we also emphasize the differences between the three focused models at various levels, considering the model structures, feature extraction, layer interpretation, change concept definition, parameter tuning, and performance. We provide qualitative and quantitative comparison results using principally a publicly available change detection database which contains aerial image pairs and Ground Truth change masks. We conclude that the discussed models are competitive against alternative state-of-the-art solutions if one uses them as pre-processing filters in multitemporal optical image analysis. In addition, together they cover a large range of applications, considering the different usage options of the three approaches.
Perry, G M L; Audet, C; Bernatchez, L
2005-09-01
The importance of directional selection relative to neutral evolution may be determined by comparing quantitative genetic variation in phenotype (Q(ST)) to variation at neutral molecular markers (F(ST)). Quantitative divergence between salmonid life history types is often considerable, but ontogenetic changes in the significance of major sources of genetic variance during post-hatch development suggest that selective differentiation varies by developmental stage. In this study, we tested the hypothesis that maternal genetic differentiation between anadromous and resident brook charr (Salvelinus fontinalis Mitchill) populations for early quantitative traits (embryonic size/growth, survival, egg number and developmental time) would be greater than neutral genetic differentiation, but that the maternal genetic basis for differentiation would be higher for pre-resorption traits than post-resorption traits. Quantitative genetic divergence between anadromous (seawater migratory) and resident Laval River (Québec) brook charr based on maternal genetic variance was high (Q(ST) > 0.4) for embryonic length, yolk sac volume, embryonic growth rate and time to first response to feeding relative to neutral genetic differentiation [F(ST) = 0.153 (0.071-0.214)], with anadromous females having positive genetic coefficients for all of the above characters. However, Q(ST) was essentially zero for all traits post-resorption of the yolk sac. Our results indicate that the observed divergence between resident and anadromous brook charr has been driven by directional selection, and may therefore be adaptive. Moreover, they provide among the first evidence that the relative importance of selective differentiation may be highly context-specific, and varies by genetic contributions to phenotype by parental sex at specific points in offspring ontogeny. 
This in turn suggests that interpretations of Q(ST)-F(ST) comparisons may be improved by considering the structure of quantitative genetic architecture by age category and the sex of the parent used in estimation.
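The Q(ST)-F(ST) comparison above reduces to a variance ratio: Q(ST) expresses between-population additive genetic variance relative to the total, and selection is inferred when it exceeds neutral F(ST). A minimal sketch, using the standard formulation; the variance components below are hypothetical, and only the F(ST) value is taken from the abstract:

```python
def qst(var_between: float, var_within: float) -> float:
    """Q_ST for a quantitative trait: between-population additive genetic
    variance relative to total additive variance (standard formulation)."""
    return var_between / (var_between + 2.0 * var_within)

# Hypothetical variance components, for illustration only
q = qst(var_between=0.6, var_within=0.4)
f_st = 0.153  # neutral-marker differentiation reported in the abstract
print(q > f_st)  # Q_ST exceeding F_ST is consistent with directional selection
```

A Q(ST) comfortably above the neutral F(ST) interval (as for the pre-resorption traits here, Q(ST) > 0.4) is the signature the abstract attributes to directional selection.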
NASA Astrophysics Data System (ADS)
Singh, Manpreet; Alabanza, Anginelle; Gonzalez, Lorelis E.; Wang, Weiwei; Reeves, W. Brian; Hahm, Jong-In
2016-02-01
Determining ultratrace amounts of protein biomarkers in patient samples in a straightforward and quantitative manner is extremely important for early disease diagnosis and treatment. Here, we successfully demonstrate the novel use of zinc oxide nanorods (ZnO NRs) in the ultrasensitive and quantitative detection of two acute kidney injury (AKI)-related protein biomarkers, tumor necrosis factor (TNF)-α and interleukin (IL)-8, directly from patient samples. We first validate the ZnO NRs-based IL-8 results via comparison with those obtained from using a conventional enzyme-linked immunosorbent method in samples from 38 individuals. We further assess the full detection capability of the ZnO NRs-based technique by quantifying TNF-α, whose levels in human urine are often below the detection limits of conventional methods. Using the ZnO NR platforms, we determine the TNF-α concentrations of all 46 patient samples tested, down to the fg per mL level. Subsequently, we screen for TNF-α levels in approximately 50 additional samples collected from different patient groups in order to demonstrate a potential use of the ZnO NRs-based assay in assessing cytokine levels useful for further clinical monitoring. Our research efforts demonstrate that ZnO NRs can be straightforwardly employed in the rapid, ultrasensitive, quantitative, and simultaneous detection of multiple AKI-related biomarkers directly in patient urine samples, providing an unparalleled detection capability beyond those of conventional analysis methods. Additional key advantages of the ZnO NRs-based approach include a fast detection speed, low-volume assay condition, multiplexing ability, and easy automation/integration capability to existing fluorescence instrumentation. 
Therefore, we anticipate that our ZnO NRs-based detection method will be highly beneficial for overcoming the frequent challenges in early biomarker development and treatment assessment, pertaining to the facile and ultrasensitive quantification of hard-to-trace biomolecules. Electronic supplementary information (ESI) available: Typical SEM images of the ZnO NRs used in the biomarker assays are provided in Fig. S1. See DOI: 10.1039/c5nr08706f
Cilia, M.; Fish, T.; Yang, X.; Mclaughlin, M.; Thannhauser, T. W.
2009-01-01
Protein extraction methods can vary widely in reproducibility and in representation of the total proteome, yet there are limited data comparing protein isolation methods. The methodical comparison of protein isolation methods is the first critical step for proteomic studies. To address this, we compared three methods for isolation, purification, and solubilization of insect proteins. The aphid Schizaphis graminum, an agricultural pest, was the source of insect tissue. Proteins were extracted using TCA in acetone (TCA-acetone), phenol, or multi-detergents in a chaotrope solution. Extracted proteins were solubilized in a multiple chaotrope solution and examined using 1-D and 2-D electrophoresis and compared directly using 2-D Difference Gel Electrophoresis (2-D DIGE). Mass spectrometry was used to identify proteins from each extraction type. We were unable to ascribe the differences in the proteins extracted to particular physical characteristics, cell location, or biological function. The TCA-acetone extraction yielded the greatest amount of protein from aphid tissues. Each extraction method isolated a unique subset of the aphid proteome. The TCA-acetone method was explored further for its quantitative reliability using 2-D DIGE. Principal component analysis showed that little of the variation in the data was a result of technical issues, thus demonstrating that the TCA-acetone extraction is a reliable method for preparing aphid proteins for a quantitative proteomics experiment. These data suggest that although the TCA-acetone method is a suitable method for quantitative aphid proteomics, a combination of extraction approaches is recommended for increasing proteome coverage when using gel-based separation techniques. PMID:19721822
Sabike, Islam I; Uemura, Ryoko; Kirino, Yumi; Mekata, Hirohisa; Sekiguchi, Satoshi; Okabayashi, Tamaki; Goto, Yoshitaka; Yamazaki, Wataru
2016-01-01
Rapid identification of Campylobacter-positive flocks before slaughter, followed by freezing and heat treatment of Campylobacter-positive carcasses at the slaughterhouses, is an effective control strategy against foodborne campylobacteriosis. We evaluated a loop-mediated isothermal amplification (LAMP) assay for the direct screening of naturally contaminated chicken cloacal swabs for C. jejuni/C. coli and compared this assay with conventional quantitative culture methods. In a comparison study of 165 broilers, the LAMP assay showed 82.8% (48/58 by conventional culture) sensitivity, 100% (107/107) specificity, 100% (48/48) positive predictive value (PPV), and 91.5% (107/117) negative predictive value (NPV). In a comparison of 55 flocks, LAMP showed 90.5% (19/21) sensitivity, 100% (34/34) specificity, 100% (19/19) PPV, and 94.4% (34/36) NPV. In the cumulative total of 28 farm-level comparisons, LAMP showed 100% (12/12) sensitivity, 100% (16/16) specificity, 100% (12/12) PPV, and 100% (16/16) NPV. The LAMP assay required less than 90 min from the arrival of the fecal samples in the laboratory to the final results. This suggests that the LAMP assay will facilitate the identification of C. jejuni/C. coli-positive broiler flocks at the farm level or in slaughterhouses before slaughtering, which would make it an effective tool in preventing the spread of Campylobacter contamination.
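The screening metrics quoted above all follow from a 2x2 confusion table of assay result against culture result. A minimal sketch, with counts taken from the broiler-level comparison in the abstract (48 true positives, 10 false negatives, 107 true negatives, 0 false positives):

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard screening-test metrics from confusion-table counts."""
    return {
        "sensitivity": tp / (tp + fn),  # detected fraction of true positives
        "specificity": tn / (tn + fp),  # correctly cleared fraction of negatives
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Broiler-level counts from the abstract
m = diagnostic_metrics(tp=48, fp=0, tn=107, fn=10)
print(round(m["sensitivity"] * 100, 1))  # 82.8
print(round(m["npv"] * 100, 1))          # 91.5
```

The same function reproduces the flock-level figures (tp=19, fp=0, tn=34, fn=2) and the farm-level figures (tp=12, fp=0, tn=16, fn=0).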
Delgado Reyes, Lourdes M; Bohache, Kevin; Wijeakumar, Sobanawartiny; Spencer, John P
2018-04-01
Motion artifacts are often a significant component of the measured signal in functional near-infrared spectroscopy (fNIRS) experiments. A variety of methods have been proposed to address this issue, including principal components analysis (PCA), correlation-based signal improvement (CBSI), wavelet filtering, and spline interpolation. The efficacy of these techniques has been compared using simulated data; however, our understanding of how these techniques fare when dealing with task-based cognitive data is limited. Brigadoi et al. compared motion correction techniques in a sample of adult data measured during a simple cognitive task; wavelet filtering showed the most promise as an optimal technique for motion correction. Given that fNIRS is often used with infants and young children, it is critical to evaluate the effectiveness of motion correction techniques directly with data from these age groups. This study addresses that problem by evaluating motion correction algorithms implemented in HomER2. The efficacy of each technique was compared quantitatively using objective metrics related to the physiological properties of the hemodynamic response. Results showed that targeted PCA (tPCA), spline, and CBSI retained a higher number of trials. These techniques also performed well in direct head-to-head comparisons with the other approaches using quantitative metrics. The CBSI method corrected many of the artifacts present in our data but sometimes produced unstable HRFs. The targeted PCA and spline methods proved to be the most robust, performing well across all comparison metrics. When compared head to head, tPCA consistently outperformed spline. We conclude, therefore, that tPCA is an effective technique for correcting motion artifacts in fNIRS data from young children.
Investigating Children's Abilities to Count and Make Quantitative Comparisons
ERIC Educational Resources Information Center
Lee, Joohi; Md-Yunus, Sham'ah
2016-01-01
This study was designed to investigate children's abilities to count and make quantitative comparisons. In addition, this study utilized reasoning questions (i.e., how did you know?). Thirty-four preschoolers, mean age 4.5 years old, participated in the study. According to the results, 89% of the children (n = 30) were able to do rote counting and…
Quantitative comparison of 3D third harmonic generation and fluorescence microscopy images.
Zhang, Zhiqing; Kuzmin, Nikolay V; Groot, Marie Louise; de Munck, Jan C
2018-01-01
Third harmonic generation (THG) microscopy is a label-free imaging technique that shows great potential for rapid pathology of brain tissue during brain tumor surgery. However, the interpretation of THG brain images should be quantitatively linked to images from more standard imaging techniques, which so far has been done only qualitatively. We establish here such a quantitative link between THG images of mouse brain tissue and all-nuclei-highlighted fluorescence images, acquired simultaneously from the same tissue area. For quantitative comparison of such image pairs, we present a segmentation workflow that is applicable to both THG and fluorescence images, achieving a precision of 91.3% and 95.8%, respectively. We find that the correspondence between the main features of the two imaging modalities amounts to 88.9%, providing quantitative evidence for the interpretation of dark holes as brain cells. Moreover, 80% of the bright objects in THG images overlap with nuclei highlighted in the fluorescence images, and they are 2 times smaller than the dark holes, showing that cells of different morphologies can be recognized in THG images. We expect that the described quantitative comparison is applicable to other types of brain tissue and, with more specific staining experiments, to cell type identification. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
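The precision figures quoted for the segmentation workflow are of the standard form true positives over all predicted positives. A minimal pixel-level sketch on toy binary masks (the 4x4 arrays below are illustrative only; the paper's 91.3%/95.8% figures come from its own workflow, not from this computation):

```python
import numpy as np

def precision(segmented: np.ndarray, truth: np.ndarray) -> float:
    """Pixel-level precision of a binary segmentation mask against ground
    truth: true-positive pixels over all pixels predicted positive."""
    tp = np.logical_and(segmented, truth).sum()
    return tp / segmented.sum()

# Toy masks: the segmentation predicts one spurious pixel at (2, 2)
seg = np.array([[1, 1, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]], bool)
gt  = np.array([[1, 1, 0, 0], [0, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]], bool)
print(precision(seg, gt))  # 0.75
```

An analogous overlap fraction between matched objects in the two modalities would yield the 88.9% correspondence figure.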
Comparison of Activity Determination of Radium 226 in FUSRAP Soil using Various Energy Lines - 12299
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, Brian; Donakowski, Jough; Hays, David
2012-07-01
Gamma spectroscopy is used at the Formerly Utilized Sites Remedial Action Program (FUSRAP) Maywood Superfund Site as the primary radioanalytical tool for quantitation of the activities of the radionuclides of concern in site soil. When selecting energy lines in gamma spectroscopy, a number of factors are considered, including assumptions concerning secondary equilibrium, interferences, and the strength of the lines. The case of the Maywood radionuclide of concern radium-226 (Ra-226) is considered in this paper. At the FUSRAP Maywood Superfund Site, one of the daughters produced from radioactive decay of Ra-226, lead-214 (Pb-214), is used to quantitate Ra-226. Another Ra-226 daughter, bismuth-214 (Bi-214), also may be used to quantitate Ra-226. In this paper, a comparison of Ra-226 to Pb-214 activities and Ra-226 to Bi-214 activities, obtained using gamma spectrometry for a large number of soil samples, was performed. The Pb-214, Bi-214, and Ra-226 activities were quantitated using the 352 kilo electron volt (keV), 609 keV, and 186 keV lines, respectively. The comparisons were made after correcting the Ra-226 activities by a factor of 0.571, both ignoring and accounting for the contribution of a U-235 interfering line to the Ra-226 line. For the Pb-214 and Bi-214 activities, a mean in-growth factor was employed. The gamma spectrometer was calibrated for efficiency and energy using a mixed gamma standard over an energy range of 59 keV to 1830 keV. The authors expect other sites with Ra-226 contamination in soil may benefit from the discussion and points in this paper. Proper use of correction factors and comparison of the data from three different gamma-emitting radionuclides revealed agreement with expectations and provided confidence that using such correction factors generates quality data. The results indicate that if contamination is low level and due to NORM, the Ra-226 can be measured directly if corrected to subtract the contribution from U-235.
If there is any indication that technologically enhanced uranium may be present, the preferred measurement approach for quantitation of Ra-226 activity is detection of one of the Ra-226 daughters, Pb-214 or Bi-214, using a correction factor obtained from an in-growth curve. The results also show that the adjusted Ra-226 results compare very well with both the Pb-214 and Bi-214 results obtained using an in-growth curve correction factor.
Direct detection of density of gap states in C60 single crystals by photoemission spectroscopy
NASA Astrophysics Data System (ADS)
Bussolotti, Fabio; Yang, Janpeng; Hiramoto, Masahiro; Kaji, Toshihiko; Kera, Satoshi; Ueno, Nobuo
2015-09-01
We report on the direct and quantitative evaluation of the density of gap states (DOGS) in large-size C60 single crystals by using ultralow-background, high-sensitivity ultraviolet photoemission spectroscopy. The charging of the crystals during photoionization was overcome using photoconduction induced by simultaneous laser irradiation. By comparison with the spectra of as-deposited and gas-exposed C60 thin films, the following results were found: (i) the DOGS near the highest occupied molecular orbital edge in the C60 single crystals (10^19-10^21 states eV^-1 cm^-3) mainly originates from exposure to inert and ambient gas atmosphere during sample preparation, storage, and transfer; (ii) the contribution of other sources of gap states, such as structural imperfections at grain boundaries, is negligible (<10^18 states eV^-1 cm^-3).
Albu, Silvia A; Al-Karmi, Salma A; Vito, Alyssa; Dzandzi, James P K; Zlitni, Aimen; Beckford-Vera, Denis; Blacker, Megan; Janzen, Nancy; Patel, Ramesh M; Capretta, Alfredo; Valliant, John F
2016-01-20
A convenient method to prepare radioiodinated tetrazines was developed, such that a bioorthogonal inverse electron demand Diels-Alder reaction can be used to label biomolecules with iodine-125 for in vitro screening and in vivo biodistribution studies. The tetrazine was prepared by employing a high-yielding oxidative halo destannylation reaction that concomitantly oxidized the dihydrotetrazine precursor. The product reacts quickly and efficiently with trans-cyclooctene derivatives. Utility was demonstrated through antibody and hormone labeling experiments and by evaluating products using standard analytical methods, in vitro assays, and quantitative biodistribution studies where the latter was performed in direct comparison to Bolton-Hunter and direct iodination methods. The approach described provides a convenient and advantageous alternative to conventional protein iodination methods that can expedite preclinical development and evaluation of biotherapeutics.
NASA Astrophysics Data System (ADS)
Yogeshwar, P.; Tezkan, B.; Israil, M.; Candansayar, M. E.
2012-01-01
The impacts of sewage irrigation and groundwater contamination were investigated near Roorkee in north India using the Direct Current Resistivity (DCR) method and the Radiomagnetotelluric (RMT) method. Intensive field measurements were carried out in the vicinity of a waste disposal site which was extensively irrigated with sewage water. For comparison, a profile was investigated at a reference site where no contamination was expected. In addition to conventional 1D and 2D inversion, the measured data sets were interpreted using a 2D joint inversion algorithm. The inversion results from the data obtained at the sewage-irrigated site indicate a decrease in resistivity of up to 75% in comparison with the reference site. The depth range from 5 to 15 m is identified as a shallow unconfined aquifer, and the decreased resistivities are ascribed to the influence of contamination. Furthermore, a systematic increase in the resistivities of the shallow unconfined aquifer is detected with increasing distance from the waste disposal site. The advantages of both the DCR and RMT methods are quantitatively integrated by the 2D joint inversion of the two data sets, leading to a joint model that explains both data sets.
A multisite assessment of the quantitative capabilities of the Xpert MTB/RIF assay.
Blakemore, Robert; Nabeta, Pamela; Davidow, Amy L; Vadwai, Viral; Tahirli, Rasim; Munsamy, Vanisha; Nicol, Mark; Jones, Martin; Persing, David H; Hillemann, Doris; Ruesch-Gerdes, Sabine; Leisegang, Felicity; Zamudio, Carlos; Rodrigues, Camilla; Boehme, Catharina C; Perkins, Mark D; Alland, David
2011-11-01
The Xpert MTB/RIF is an automated molecular test for Mycobacterium tuberculosis that estimates bacterial burden by measuring the threshold cycle (Ct) of its M. tuberculosis-specific real-time polymerase chain reaction. Bacterial burden is an important biomarker for disease severity, infection control risk, and response to therapy. We evaluated bacterial load quantitation by Xpert MTB/RIF compared with conventional quantitative methods. Xpert MTB/RIF results were compared with smear microscopy, semiquantitative solid culture, and time-to-detection in liquid culture for 741 patients and 2,008 samples tested in a multisite clinical trial. An internal control real-time polymerase chain reaction was evaluated for its ability to identify inaccurate quantitative Xpert MTB/RIF results. Assays with an internal control Ct greater than 34 were likely to be inaccurately quantitated; these represented 15% of M. tuberculosis-positive tests. Excluding these, decreasing M. tuberculosis Ct was associated with increasing smear microscopy grade for smears of concentrated sputum pellets (r(s) = -0.77) and directly from sputum (r(s) = -0.71). A Ct cutoff of approximately 27.7 best predicted smear-positive status. The association between M. tuberculosis Ct and time-to-detection in liquid culture (r(s) = 0.68) and semiquantitative colony counts (r(s) = -0.56) was weaker than that with smear. Tests of paired same-patient sputum showed that high-viscosity sputum samples contained ×32 more M. tuberculosis than nonviscous samples. Comparisons between the grade of the acid-fast bacilli smear and Xpert MTB/RIF quantitative data across study sites enabled us to identify a site outlier in microscopy. Xpert MTB/RIF quantitation offers a new, standardized approach to measuring bacterial burden in the sputum of patients with tuberculosis.
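Ct values map to relative bacterial load through the usual exponential amplification model: under perfect efficiency each cycle doubles the target, so the ×32 difference reported between viscous and nonviscous sputum corresponds to a 5-cycle Ct shift. A hedged sketch of that arithmetic (assuming ideal efficiency; this is not the trial's own calculation):

```python
def fold_difference(delta_ct: float, efficiency: float = 1.0) -> float:
    """Relative target abundance implied by a threshold-cycle difference.
    A sample crossing threshold delta_ct cycles earlier contains
    (1 + efficiency)**delta_ct times more target; efficiency=1.0 means
    perfect doubling per cycle."""
    return (1.0 + efficiency) ** delta_ct

print(fold_difference(5))  # 32.0 -- a 5-cycle shift at 100% efficiency
```

The same relation underlies reading a Ct cutoff (here approximately 27.7) as a bacterial-burden threshold separating smear-positive from smear-negative samples.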
Hoppe, Elisabeth; Körzdörfer, Gregor; Würfl, Tobias; Wetzl, Jens; Lugauer, Felix; Pfeuffer, Josef; Maier, Andreas
2017-01-01
The purpose of this work is to evaluate methods from deep learning for application to Magnetic Resonance Fingerprinting (MRF). MRF is a recently proposed measurement technique for generating quantitative parameter maps. In MRF a non-steady state signal is generated by a pseudo-random excitation pattern. A comparison of the measured signal in each voxel with the physical model yields quantitative parameter maps. Currently, the comparison is done by matching a dictionary of simulated signals to the acquired signals. To accelerate the computation of quantitative maps we train a Convolutional Neural Network (CNN) on simulated dictionary data. As a proof of principle we show that the neural network implicitly encodes the dictionary and can replace the matching process.
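The dictionary-matching baseline that the CNN replaces is, at its core, a normalized inner-product search: each voxel's measured signal evolution is compared against every simulated fingerprint and the best match's parameters are returned. A minimal sketch; the dictionary contents and the (T1, T2) labels below are illustrative toy values, not MRF simulation outputs:

```python
import numpy as np

def dictionary_match(signal, dictionary, params):
    """Baseline MRF reconstruction: return the parameters of the simulated
    fingerprint with the highest normalized inner product with the signal."""
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signal / np.linalg.norm(signal)
    return params[int(np.argmax(np.abs(d @ s)))]

# Toy dictionary of 3 "fingerprints" of length 100
rng = np.random.default_rng(0)
dictionary = rng.standard_normal((3, 100))
params = [(800, 60), (1200, 90), (1500, 120)]  # hypothetical (T1, T2) in ms
noisy = dictionary[1] + 0.1 * rng.standard_normal(100)
print(dictionary_match(noisy, dictionary, params))  # (1200, 90)
```

The cost of this search grows with dictionary size, which is exactly the bottleneck the abstract's CNN avoids by learning the signal-to-parameter mapping directly.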
Morrison, Geoffrey Stewart
2014-05-01
In this paper it is argued that one should not attempt to directly assess whether a forensic analysis technique is scientifically acceptable. Rather one should first specify what one considers to be appropriate principles governing acceptable practice, then consider any particular approach in light of those principles. This paper focuses on one principle: the validity and reliability of an approach should be empirically tested under conditions reflecting those of the case under investigation using test data drawn from the relevant population. Versions of this principle have been key elements in several reports on forensic science, including forensic voice comparison, published over the last four-and-a-half decades. The aural-spectrographic approach to forensic voice comparison (also known as "voiceprint" or "voicegram" examination) and the currently widely practiced auditory-acoustic-phonetic approach are considered in light of this principle (these two approaches do not appear to be mutually exclusive). Approaches based on data, quantitative measurements, and statistical models are also considered in light of this principle. © 2013.
Reducing the Matrix Effect in Organic Cluster SIMS Using Dynamic Reactive Ionization
NASA Astrophysics Data System (ADS)
Tian, Hua; Wucher, Andreas; Winograd, Nicholas
2016-12-01
Dynamic reactive ionization (DRI) utilizes a reactive molecule, HCl, which is doped into an Ar cluster projectile and activated to produce protons at the bombardment site on the cold sample surface in the presence of water. The methodology has been shown to enhance the ionization of protonated molecular ions and to reduce salt suppression in complex biomatrices. In this study, we further examine the possibility of obtaining improved quantitation with DRI during depth profiling of thin films. Using a trehalose film as a model system, we are able to define optimal DRI conditions for depth profiling. Next, the strategy is applied to a multilayer system consisting of the polymer antioxidants Irganox 1098 and 1010. These binary mixtures have demonstrated large matrix effects, making quantitative SIMS measurement infeasible. Systematic comparison of depth profiles of this multilayer film acquired with the bare GCIB and under DRI conditions shows that the latter enhances protonated ions for both components by 4- to 15-fold, resulting in uniform depth profiling in positive ion mode and almost no matrix effect in negative ion mode. The methodology offers a new strategy to tackle the matrix effect and should lead to improved quantitative measurement using SIMS.
Guan, Wenna; Zhao, Hui; Lu, Xuefeng; Wang, Cong; Yang, Menglong; Bai, Fali
2011-11-11
Simple and rapid quantitative determination of fatty-acid-based biofuels is of great importance for the study of genetic engineering progress toward biofuel production by microalgae. Ideal biofuels produced from biological systems should be chemically similar to petroleum, i.e., fatty-acid-based molecules including free fatty acids, fatty acid methyl esters, fatty acid ethyl esters, fatty alcohols and fatty alkanes. This study established a gas chromatography-mass spectrometry (GC-MS) method for simultaneous quantification of seven free fatty acids, nine fatty acid methyl esters, five fatty acid ethyl esters, five fatty alcohols and three fatty alkanes produced by wild-type Synechocystis PCC 6803 and its genetically engineered strain. Data obtained from GC-MS analyses were quantified using internal standard peak area comparisons. The linearity, limit of detection (LOD) and precision (RSD) of the method were evaluated. The results demonstrated that fatty-acid-based biofuels can be directly determined by GC-MS without derivatization. Therefore, rapid and reliable quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria can be achieved using the GC-MS method established in this work. Copyright © 2011 Elsevier B.V. All rights reserved.
Sun, Shihao; Wang, Hui; Xie, Jianping; Su, Yue
2016-01-01
Jujube extract is commonly used as a food additive and flavoring. The sensory properties of the extract, especially sweetness, are a critical factor determining product quality and therefore consumer acceptability. Small-molecule carbohydrates make a major contribution to the sweetness of jujube extract, and their types and contents in the extract have a direct influence on product quality. Thus, an appropriate qualitative and quantitative method for determination of these carbohydrates is vitally important for quality control of the product. High performance liquid chromatography-evaporative light scattering detection (HPLC-ELSD), liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS), and gas chromatography-mass spectrometry (GC-MS) methods were developed and applied to the determination of small-molecule carbohydrates in jujube extract. Eight sugars and alditols were identified from the extract: rhamnose, xylitol, arabitol, fructose, glucose, inositol, sucrose, and maltose. Comparisons were carried out to investigate the performance of the methods. Although all the methods were found to perform satisfactorily, only three sugars (fructose, glucose and inositol) could be detected by all three methods, and similar quantitative results for these three sugars were obtained across the methods. The HPLC-ELSD and LC-ESI-MS/MS methods, with good precision and accuracy, were suitable for quantitative analysis of carbohydrates in jujube extract; although the performance of the GC-MS method for quantitative analysis was inferior to the other methods, it has a wider scope in qualitative analysis.
A multi-technique approach should be adopted in order to obtain a complete profile of the carbohydrates in jujube extract, and the methods should be employed according to the purpose of the analysis.
NASA Astrophysics Data System (ADS)
Fu, Chenghua; Hu, Zhanning
2018-03-01
In this paper, we investigate the characteristics of the nuclear spin entanglement generated by a mediator with an optically excited triplet. Significantly, the interaction between the two nuclear spins takes the form of a direct XY coupling in each of the effective subspace Hamiltonians, which are obtained by applying a transformation to the natural Hamiltonian. The quantum concurrence and negativity are discussed to quantitatively describe the quantum entanglement, and a comparison between them reveals the nature of their relationship. A general equation describing the relationship between the concurrence and the negativity is explicitly obtained.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...
Mander, Luke; Li, Mao; Mio, Washington; Fowlkes, Charless C; Punyasena, Surangi W
2013-11-07
Taxonomic identification of pollen and spores uses inherently qualitative descriptions of morphology. Consequently, identifications are restricted to categories that can be reliably classified by multiple analysts, resulting in the coarse taxonomic resolution of the pollen and spore record. Grass pollen represents an archetypal example; it is not routinely identified below family level. To address this issue, we developed quantitative morphometric methods to characterize surface ornamentation and classify grass pollen grains. This produces a means of quantifying morphological features that are traditionally described qualitatively. We used scanning electron microscopy to image 240 specimens of pollen from 12 species within the grass family (Poaceae). We classified these species by developing algorithmic features that quantify the size and density of sculptural elements on the pollen surface, and measure the complexity of the ornamentation they form. These features yielded a classification accuracy of 77.5%. In comparison, a texture descriptor based on modelling the statistical distribution of brightness values in image patches yielded a classification accuracy of 85.8%, and seven human subjects achieved accuracies between 68.33 and 81.67%. The algorithmic features we developed directly relate to biologically meaningful features of grass pollen morphology, and could facilitate direct interpretation of unsupervised classification results from fossil material.
A comparison of in vitro cytotoxicity assays in medical device regulatory studies.
Liu, Xuemei; Rodeheaver, Denise P; White, Jeffrey C; Wright, Ann M; Walker, Lisa M; Zhang, Fan; Shannon, Stephen
2018-06-06
Medical device biocompatibility testing is used to evaluate the risk of adverse effects on tissues from exposure to leachates/extracts. A battery of tests is typically recommended in accordance with regulatory standards to determine if the device is biocompatible. In vitro cytotoxicity, a key element of the standards, is a required endpoint for all types of medical devices. Each validated cytotoxicity method has different methodology and acceptance criteria that could influence the selection of a specific test. In addition, some guidances are more specific than others as to the recommended test methods. For example, the International Organization for Standardization (ISO) cites a preference for quantitative methods (e.g., tetrazolium (MTT/XTT), neutral red (NR), or colony formation assays (CFA)) over qualitative methods (e.g., elution, agar overlay/diffusion, or direct), while a recent ISO standard for contact lens/lens care solutions specifically requires a qualitative direct test. Qualitative methods are described in the United States Pharmacopeia (USP), while quantitative CFAs are listed in Japanese guidance. The aim of this review is to compare the methodologies, such as test article preparation, test conditions, and criteria, for six cytotoxicity methods recommended in regulatory standards in order to inform decisions on which method(s) to select during medical device safety evaluation. Copyright © 2018. Published by Elsevier Inc.
Mrakic-Sposta, Simona; Gussoni, Maristella; Montorsi, Michela; Porcelli, Simone; Vezzoli, Alessandra
2014-01-01
The growing interest in the role of Reactive Oxygen Species (ROS) and in the assessment of oxidative stress in health and disease clashes with the lack of consensus on reliable, noninvasive quantitative methods. The study aimed at demonstrating that a recently developed microinvasive Electron Paramagnetic Resonance (EPR) method provides direct evidence of the "instantaneous" presence of ROS, returning absolute concentration levels that correlate with "a posteriori" assays of ROS-induced damage by means of biomarkers. The reliability of measuring the ROS production rate in human capillary blood rather than in plasma was tested (step I). A significant (P < 0.01) linear relationship was found between EPR data collected on capillary blood versus venous blood (R² = 0.95), plasma (R² = 0.82), and erythrocytes (R² = 0.73). Then (step II), changes in ROS production across subject categories, young versus old and healthy versus pathological at rest, were found to be significantly different (P levels ranging from 0.0001 to 0.05). The comparison of the results with antioxidant capacity and oxidative damage biomarker concentrations showed that all changes indicating increased oxidative stress are directly related to increased ROS production. Therefore, the adopted method may serve as an automated technique for routine use in clinical trials. PMID:25374651
Yu, Kebing; Salomon, Arthur R.
2010-01-01
Recently, dramatic progress has been achieved in expanding the sensitivity, resolution, mass accuracy, and scan rate of mass spectrometers able to fragment and identify peptides through tandem mass spectrometry (MS/MS). Unfortunately, this enhanced ability to acquire proteomic data has not been accompanied by a concomitant increase in the availability of flexible tools allowing users to rapidly assimilate, explore, and analyze this data and adapt to a variety of experimental workflows with minimal user intervention. Here we fill this critical gap by providing a flexible relational database called PeptideDepot for organization of expansive proteomic data sets, collation of proteomic data with available protein information resources, and visual comparison of multiple quantitative proteomic experiments. Our software design, built upon the synergistic combination of a MySQL database for safe warehousing of proteomic data with a FileMaker-driven graphical user interface for flexible adaptation to diverse workflows, enables proteomic end-users to directly tailor the presentation of proteomic data to the unique analysis requirements of the individual proteomics lab. PeptideDepot may be deployed as an independent software tool or integrated directly with our High Throughput Autonomous Proteomic Pipeline (HTAPP) used in the automated acquisition and post-acquisition analysis of proteomic data. PMID:19834895
Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A
2014-12-01
Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
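The abstract does not give the IDT formula, but a minimal sketch of one plausible calculation, assuming a dilution series where time-to-positivity falls linearly with log10 input copies (analogous to a qPCR standard curve), is:

```python
import math

def isothermal_doubling_time(log10_copies, time_to_positive):
    """Estimate an isothermal doubling time (IDT) from a dilution series.

    Assumption of this sketch: time-to-positivity is linear in log10 input
    copies. One template doubling spans log10(2) of that scale, so
    IDT = |slope| * log10(2), with slope in minutes per log10.
    """
    n = len(log10_copies)
    mx = sum(log10_copies) / n
    my = sum(time_to_positive) / n
    slope = (sum((x - mx) * (y - my)
                 for x, y in zip(log10_copies, time_to_positive))
             / sum((x - mx) ** 2 for x in log10_copies))
    return abs(slope) * math.log10(2)

# Tenfold dilutions detected 5 min apart -> slope of -5 min/log10,
# i.e. roughly one doubling every 1.5 min.
idt = isothermal_doubling_time([6, 5, 4, 3], [10, 15, 20, 25])
```

The exact estimator used by the study may differ; the point is that a single slope-derived number lets qLAMP assays be compared the way amplification efficiency lets qPCR assays be compared.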
Katsoulidou, Antigoni; Petrodaskalaki, Maria; Sypsa, Vana; Papachristou, Eleni; Anastassopoulou, Cleo G; Gargalianos, Panagiotis; Karafoulidou, Anastasia; Lazanas, Marios; Kordossis, Theodoros; Andoniadou, Anastasia; Hatzakis, Angelos
2006-02-01
The COBAS TaqMan HIV-1 test (Roche Diagnostics) was compared with the LCx HIV RNA quantitative assay (Abbott Laboratories), the Versant HIV-1 RNA 3.0 (bDNA) assay (Bayer) and the COBAS Amplicor HIV-1 Monitor v1.5 test (Roche Diagnostics), using plasma samples of various viral load levels from HIV-1-infected individuals. In the comparison of TaqMan with LCx, TaqMan identified as positive 77.5% of the 240 samples versus 72.1% identified by LCx assay, while their overall agreement was 94.6% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.91). Similarly, in the comparison of TaqMan with bDNA 3.0, both methods identified 76.3% of the 177 samples as positive, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.95). Finally, in the comparison of TaqMan with Monitor v1.5, TaqMan identified 79.5% of the 156 samples as positive versus 80.1% identified by Monitor v1.5, while their overall agreement was 95.5% and the quantitative results of samples that were positive by both methods were strongly correlated (r=0.96). In conclusion, the new COBAS TaqMan HIV-1 test showed excellent agreement with other widely used commercially available tests for the quantitation of HIV-1 viral load.
Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.
In this pilot study, quantitative...
Phillips, Brett T; Fourman, Mitchell S; Rivara, Andrew; Dagum, Alexander B; Huston, Tara L; Ganz, Jason C; Bui, Duc T; Khan, Sami U
2014-01-01
Several devices exist today to assist the intraoperative determination of skin flap perfusion. Laser-Assisted Indocyanine Green Dye Angiography (LAICGA) has been shown to accurately predict mastectomy skin flap necrosis using quantitative perfusion values. The laser properties of the latest LAICGA device (SPY Elite) differ significantly from its predecessor system (SPY 2001), preventing direct translation of previously published data. The purpose of this study was to establish a mathematical relationship between the perfusion values of these 2 devices. Breast reconstruction patients were prospectively enrolled into a clinical trial where skin flap evaluation and excision was based on quantitative SPY Q values previously established in the literature. Initial study patients underwent mastectomy skin flap evaluation using both SPY systems simultaneously. Absolute perfusion unit (APU) values at identical locations on the breast were then compared graphically. 210 data points were identified on the same patients (n = 4) using both SPY systems. A linear relationship (y = 2.9883x + 12.726) was identified with a high level of correlation (R² = 0.744). Previously published values using the SPY 2001 (APU 3.7) corresponded to a value of 23.8 APU on the SPY Elite. In addition, postoperative necrosis in these patients correlated with regions of skin identified by the SPY Elite with APU less than 23.8. Intraoperative comparison of LAICGA systems has provided direct correlation of perfusion values predictive of necrosis that were previously established in the literature. An APU value of 3.7 from the SPY 2001 correlates to a SPY Elite APU value of 23.8.
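The reported fit is a one-line conversion; applying it to the published necrosis threshold reproduces the stated cross-device value:

```python
def spy2001_to_spy_elite(apu_2001):
    """Map a SPY 2001 absolute perfusion unit (APU) reading onto the
    SPY Elite scale using the linear fit reported in the study
    (y = 2.9883x + 12.726, R² = 0.744)."""
    return 2.9883 * apu_2001 + 12.726

# The published SPY 2001 necrosis threshold of 3.7 APU maps to ~23.8 APU
# on the SPY Elite.
elite_threshold = spy2001_to_spy_elite(3.7)
```

Note that with R² = 0.744 the fit carries substantial scatter, so the converted threshold is a guide rather than an exact equivalence.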
Mroczek, Tomasz
2016-09-10
The recently launched thin-layer chromatography-mass spectrometry (TLC-MS) interface, which enables extraction of compounds directly from TLC plates into the MS ion source, was extended into a two-dimensional thin-layer chromatography/high-performance liquid chromatography (2D TLC/HPLC) system by direct connection to a rapid-resolution 50×2.1 mm I.D. C18 column compartment, followed by diode array detection (DAD) and electrospray ionisation time-of-flight mass spectrometry (ESI-TOF-MS). In this way, even unseparated bands of complicated mixtures of natural compounds could be analysed structurally within 1-2 min of development of the TLC plates. In comparison to the typically applied TLC-MS interface, no ion suppression was observed for acidic mobile phases, and substantial increases in ESI-TOF-MS sensitivity and spectral quality were noticed. The system has been utilised in combination with TLC-based bioautographic assays for acetylcholinesterase (AChE) inhibitors, but it can also be applied in other bioactivity-related procedures (e.g. the 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical-scavenging screen). It has also been used to determine the half-maximal inhibitory concentration (IC50) of the active inhibitor galanthamine, as an example. Moreover, the AChE inhibitory potencies of several purified plant extracts, never studied before, were quantitatively measured. This is the first report of the use of such a 2D TLC/HPLC/MS system for both qualitative and quantitative evaluation of cholinesterase inhibitors in biological matrices.
Primary production in the Delta: Then and now
Cloern, James E.; Robinson, April; Richey, Amy; Grenier, Letitia; Grossinger, Robin; Boyer, Katharyn E.; Burau, Jon; Canuel, Elizabeth A.; DeGeorge, John F.; Drexler, Judith Z.; Enright, Chris; Howe, Emily R.; Kneib, Ronald; Mueller-Solger, Anke; Naiman, Robert J.; Pinckney, James L.; Safran, Samuel M.; Schoellhamer, David H.; Simenstad, Charles A.
2016-01-01
To evaluate the role of restoration in the recovery of the Delta ecosystem, we need to have clear targets and performance measures that directly assess ecosystem function. Primary production is a crucial ecosystem process, which directly limits the quality and quantity of food available for secondary consumers such as invertebrates and fish. The Delta has a low rate of primary production, but it is unclear whether this was always the case. Recent analyses from the Historical Ecology Team and Delta Landscapes Project provide quantitative comparisons of the areal extent of 14 habitat types in the modern Delta versus the historical Delta (pre-1850). Here we describe an approach for using these metrics of land use change to: (1) produce the first quantitative estimates of how Delta primary production and the relative contributions from five different producer groups have been altered by large-scale drainage and conversion to agriculture; (2) convert these production estimates into a common currency so the contributions of each producer group reflect their food quality and efficiency of transfer to consumers; and (3) use simple models to discover how tidal exchange between marshes and open water influences primary production and its consumption. Application of this approach could inform Delta management in two ways. First, it would provide a quantitative estimate of how large-scale conversion to agriculture has altered the Delta's capacity to produce food for native biota. Second, it would provide restoration practitioners with a new approach—based on ecosystem function—to evaluate the success of restoration projects and gauge the trajectory of ecological recovery in the Delta region.
A whole-cell bioreporter assay for quantitative genotoxicity evaluation of environmental samples.
Jiang, Bo; Li, Guanghe; Xing, Yi; Zhang, Dayi; Jia, Jianli; Cui, Zhisong; Luan, Xiao; Tang, Hui
2017-10-01
Whole-cell bioreporters have emerged as promising tools for genotoxicity evaluation, due to their rapidity, cost-effectiveness, sensitivity and selectivity. In this study, a method for detecting genotoxicity in environmental samples was developed using the bioluminescent whole-cell bioreporter Escherichia coli recA::luxCDABE. To further test its performance in a real-world scenario, the E. coli bioreporter was applied in two cases: i) soil samples collected from chromium(VI)-contaminated sites; ii) crude oil-contaminated seawater collected after the Jiaozhou Bay oil spill which occurred in 2013. The chromium(VI)-contaminated soils were pretreated by water extraction, and directly exposed to the bioreporter in two phases: aqueous soil extract (water phase) and soil supernatant (solid phase). The results indicated that both extractable and soil particle-fixed chromium(VI) were bioavailable to the bioreporter, and the solid-phase contact bioreporter assay provided a more precise evaluation of soil genotoxicity. For crude oil-contaminated seawater, the response of the bioreporter clearly illustrated the spatial and temporal changes in genotoxicity surrounding the spill site, suggesting that the crude oil degradation process decreased the genotoxic risk to the ecosystem. In addition, the performance of the bioreporter was simulated by a modified cross-regulation gene expression model, which quantitatively described the DNA damage response of the E. coli bioreporter. Accordingly, the bioluminescent response of the bioreporter was calculated as a mitomycin C equivalent, enabling quantitative comparison of genotoxicities between different environmental samples. This bioreporter assay provides a rapid and sensitive screening tool for direct genotoxicity assessment of environmental samples.
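The mitomycin C (MMC) equivalent can be illustrated with a minimal sketch: interpolate a sample's induction ratio onto an MMC dose-response calibration curve. The linear-interpolation form and all numbers below are assumptions of this sketch; the study itself fits a cross-regulation gene-expression model rather than interpolating.

```python
def mitomycin_c_equivalent(sample_induction, calib_conc, calib_induction):
    """Express a sample's genotoxicity as a mitomycin C (MMC) equivalent
    by linearly interpolating its bioluminescence induction ratio onto an
    MMC calibration curve (assumed monotonic over the working range)."""
    for (c0, c1), (r0, r1) in zip(zip(calib_conc, calib_conc[1:]),
                                  zip(calib_induction, calib_induction[1:])):
        if r0 <= sample_induction <= r1:
            return c0 + (c1 - c0) * (sample_induction - r0) / (r1 - r0)
    raise ValueError("induction ratio outside calibration range")

# Hypothetical calibration: induction ratio rises from 1.0 (blank)
# to 5.0 at 200 ng/mL MMC.
conc = [0.0, 50.0, 100.0, 200.0]
ratio = [1.0, 2.0, 3.5, 5.0]
mmc_eq = mitomycin_c_equivalent(2.75, conc, ratio)
```

Expressing every sample on the same MMC scale is what makes genotoxicities of chemically different samples directly comparable.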
Automated measurement of stent strut coverage in intravascular optical coherence tomography
NASA Astrophysics Data System (ADS)
Ahn, Chi Young; Kim, Byeong-Keuk; Hong, Myeong-Ki; Jang, Yangsoo; Heo, Jung; Joo, Chulmin; Seo, Jin Keun
2015-02-01
Optical coherence tomography (OCT) is a non-invasive, cross-sectional imaging modality that has become a prominent imaging method in percutaneous intracoronary intervention. We present an automated detection algorithm for stent strut coordinates and coverage in OCT images. The algorithm for stent strut detection is composed of a coordinate transformation from the polar to the Cartesian domains and application of second derivative operators in the radial and the circumferential directions. Local region-based active contouring was employed to detect lumen boundaries. We applied the method to OCT pullback images acquired from human patients in vivo to quantitatively measure stent strut coverage. The validation studies against manual expert assessments demonstrated high Pearson's coefficients (R = 0.99) for the stent strut coordinates, with no significant bias. An averaged Hausdorff distance of < 120 μm was obtained for vessel border detection. Quantitative comparison of stent strut to vessel wall distance found a bias of < 12.3 μm and a 95% confidence interval of < 110 μm.
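The border-error metric can be sketched directly. The version below is one common variant (symmetric mean nearest-neighbour distance); the paper's exact averaging convention is not stated in the abstract, so this is an assumption:

```python
import math

def averaged_hausdorff(a, b):
    """Symmetric averaged Hausdorff distance between two point sets:
    the mean nearest-neighbour distance in each direction, averaged.
    A common metric for comparing detected vs. expert-drawn contours."""
    def mean_nn(src, dst):
        return sum(min(math.dist(p, q) for q in dst) for p in src) / len(src)
    return 0.5 * (mean_nn(a, b) + mean_nn(b, a))

# Two contours offset by 0.1 units everywhere -> distance 0.1.
a = [(x, 0.0) for x in range(10)]
b = [(x, 0.1) for x in range(10)]
d = averaged_hausdorff(a, b)
```

This brute-force form is O(n²); for dense OCT contours a KD-tree nearest-neighbour query would be the practical choice.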
Quantitative Analysis of Venus Radar Backscatter Data in ArcGIS
NASA Technical Reports Server (NTRS)
Long, S. M.; Grosfils, E. B.
2005-01-01
Ongoing mapping of the Ganiki Planitia (V14) quadrangle of Venus and definition of material units have involved an integrated but qualitative analysis of Magellan radar backscatter images and topography using standard geomorphological mapping techniques. However, such analyses do not take full advantage of the quantitative information contained within the images. Analysis of the backscatter coefficient allows a much more rigorous statistical comparison between mapped units, permitting first-order self-similarity tests of geographically separated materials assigned identical geomorphological labels. Such analyses cannot be performed directly on pixel (DN) values from Magellan backscatter images, because the pixels are scaled to the Muhleman law for radar echoes on Venus and are not corrected for latitudinal variations in incidence angle. Therefore, DN values must be converted, based on pixel latitude, back to their backscatter coefficient values before accurate statistical analysis can occur. Here we present a method for performing the conversions and analysis of Magellan backscatter data using commonly available ArcGIS software and illustrate the advantages of the process for geological mapping.
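The conversion can be sketched in two steps: evaluate the Muhleman scattering law at the pixel's incidence angle, then undo the DN encoding of the departure from that law. The Muhleman-law form below is the commonly cited one for Venus; the DN offset and scale (`dn_zero`, `db_per_dn`) are placeholder assumptions, not the actual Magellan product constants, which should be taken from the product documentation.

```python
import math

def muhleman_sigma0(theta_deg, k1=0.0118, k2=0.111):
    """Muhleman-law backscatter coefficient for Venus at incidence angle
    theta: sigma0 = k1*cos(t) / (sin(t) + k2*cos(t))**3."""
    t = math.radians(theta_deg)
    return k1 * math.cos(t) / (math.sin(t) + k2 * math.cos(t)) ** 3

def dn_to_sigma0(dn, theta_deg, dn_zero=101.0, db_per_dn=0.2):
    """Recover an absolute backscatter coefficient from an image DN at a
    given incidence angle. The DN encoding (offset dn_zero, scale
    db_per_dn in dB per count) is a hypothetical placeholder."""
    offset_db = (dn - dn_zero) * db_per_dn   # departure from Muhleman law
    return muhleman_sigma0(theta_deg) * 10.0 ** (offset_db / 10.0)

# A pixel at the zero-offset DN sits exactly on the Muhleman law
# by construction.
sigma = dn_to_sigma0(101, 35.0)
```

In ArcGIS this per-pixel computation would typically be expressed as a raster-calculator expression over the DN and incidence-angle (latitude-derived) rasters.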
Investigating fold structures of 2D materials by quantitative transmission electron microscopy.
Wang, Zhiwei; Zhang, Zengming; Liu, Wei; Wang, Zhong Lin
2017-04-01
We report an approach developed for deriving 3D structural information of 2D membrane folds based on recently-established quantitative transmission electron microscopy (TEM) in combination with density functional theory (DFT) calculations. Systematic multislice simulations reveal that membrane folding leads to sufficiently strong electron scattering to enable a precise determination of the bending radius. The image contrast also depends on the folding angles of 2D materials due to the variation of projection potentials, which however exerts a much smaller effect than the bending radii. DFT calculations show that folded edges are typically characteristic of (fractional) nanotubes, with the same curvature retained after energy optimization. Owing to the exclusion of the Stobbs-factor issue, numerical simulations were directly compared with the experimental measurements on an absolute contrast scale, resulting in a successful determination of the bending radius of folded monolayer MoS2 films. The method should be applicable to characterizing all 2D membranes with 3D folding features.
2014-01-01
Quantitative imaging biomarkers (QIBs) are being used increasingly in medicine to diagnose and monitor patients’ disease. The computer algorithms that measure QIBs have different technical performance characteristics. In this paper we illustrate the appropriate statistical methods for assessing and comparing the bias, precision, and agreement of computer algorithms. We use data from three studies of pulmonary nodules. The first study is a small phantom study used to illustrate metrics for assessing repeatability. The second study is a large phantom study allowing assessment of four algorithms’ bias and reproducibility for measuring tumor volume and the change in tumor volume. The third study is a small clinical study of patients whose tumors were measured on two occasions. This study allows a direct assessment of six algorithms’ performance for measuring tumor change. With these three examples we compare and contrast study designs and performance metrics, and we illustrate the advantages and limitations of various common statistical methods for QIB studies. PMID:24919828
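As a hedged sketch of the repeatability and bias metrics such QIB studies report (the names and the 2.77 factor follow common test-retest metrology conventions, not necessarily this paper's exact estimators):

```python
import math

def repeatability_metrics(measurement_pairs):
    """Test-retest repeatability from paired replicate measurements:
    within-subject variance estimated as half the mean squared
    difference, and the repeatability coefficient RC = 2.77 * wSD,
    below which ~95% of replicate differences are expected to fall."""
    n = len(measurement_pairs)
    wvar = sum((a - b) ** 2 for a, b in measurement_pairs) / (2 * n)
    wsd = math.sqrt(wvar)
    return wsd, 2.77 * wsd

def bias(measured, truth):
    """Mean error of an algorithm against reference (e.g. phantom) values."""
    return sum(m - t for m, t in zip(measured, truth)) / len(measured)

# Hypothetical nodule-volume replicates (mm^3) from a phantom scan-rescan.
pairs = [(100.0, 104.0), (250.0, 246.0), (80.0, 84.0)]
wsd, rc = repeatability_metrics(pairs)
```

Phantom studies with known truth support the bias estimate; scan-rescan designs without truth support only wSD/RC, which is why the paper needs all three study designs.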
Orellano, Elsa M.; Mountain, Gail; Varas, Nelson; Labault, Nirzka
2014-01-01
In this pilot study we explore the difference in the use of occupational competence strategies for daily participation between highly active and low active Hispanic older women. Twenty-nine women living alone, aged ≥ 70 years, participated in this study. We employed a mixed-method design through which the principal investigator administered a tool to measure participation restrictions during the quantitative phase and conducted in-depth interviews with a subsample of the quantitative-phase participants. Active women predominantly used transportation resources, emotional social support, and spirituality to support participation in life activities. Less active women used more practical social support, assistive technology, and environmental modifications. Personal facilitators seemed to directly modify these strategies. These results suggest that older women with different activity levels use distinct internal and external resources to maintain or enhance daily participation. Future studies should explore whether these resources remain consistent across gender, living status, and ethnicities. PMID:24669397
Surface temperature/heat transfer measurement using a quantitative phosphor thermography system
NASA Technical Reports Server (NTRS)
Buck, G. M.
1991-01-01
A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
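The appendix's exact heat-transfer calculation is not reproduced in the abstract, but one standard reduction of a surface temperature-time history to heat flux for a semi-infinite solid is the Cook-Felderman discretization, sketched below; the material constant and the constant-flux check are illustrative only.

```python
import math

def cook_felderman_heat_flux(times, temps, rho_c_k):
    """Surface heat flux from a surface temperature-time history, assuming
    1-D conduction into a semi-infinite solid (Cook-Felderman scheme).
    rho_c_k is the product density * specific heat * conductivity of the
    model material (here a nondimensional placeholder)."""
    n = len(times) - 1
    coef = 2.0 * math.sqrt(rho_c_k / math.pi)
    s = 0.0
    for i in range(1, n + 1):
        s += (temps[i] - temps[i - 1]) / (
            math.sqrt(times[n] - times[i - 1]) + math.sqrt(times[n] - times[i]))
    return coef * s

# Sanity check: for constant flux, surface temperature rises as sqrt(t);
# with T = 2*sqrt(t) and rho_c_k = 1 the exact flux is sqrt(pi).
times = [0.001 * i for i in range(201)]
temps = [2.0 * math.sqrt(t) for t in times]
q = cook_felderman_heat_flux(times, temps, rho_c_k=1.0)
```

This is why the appendix's analysis of the model material's thermal properties matters: the recovered flux scales with the square root of rho*c*k, so uncertainty in the castable-ceramic properties maps directly into heating uncertainty.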
A new software for dimensional measurements in 3D endodontic root canal instrumentation.
Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella
2012-01-01
The main issue to be faced in obtaining size estimates of 3D modifications of the dental canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted to the analysis of the root canal after endodontic instrumentation. The software analytically calculates the best superposition between the pre- and post-treatment structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user-dependent and time-consuming. Once the co-registration has been achieved, dimensional measurements are performed by simultaneous evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes in volume, surface and symmetry axes in 3D occurring after the instrumentation. The calculation is based on direct comparison of the canal and canal branches selected by the user on the pre-treatment image stack.
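The inertia-tensor step can be sketched as follows: compute the tensor of the segmented voxel cloud about its centroid, then take its eigendecomposition to get the principal axes. Aligning the two stacks by centroid translation plus rotation onto these axes gives an analytic registration with no iterative minimization, which is the strategy the abstract describes; the code below is a minimal sketch, not the actual software.

```python
import numpy as np

def principal_axes(points):
    """Inertia tensor of a point (voxel) cloud about its centroid, plus
    its eigendecomposition. The eigenvectors are the principal axes used
    to align the pre- and post-treatment stacks analytically."""
    p = np.asarray(points, dtype=float)
    p = p - p.mean(axis=0)
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    inertia = np.array([
        [np.sum(y**2 + z**2), -np.sum(x * y),       -np.sum(x * z)],
        [-np.sum(x * y),       np.sum(x**2 + z**2), -np.sum(y * z)],
        [-np.sum(x * z),      -np.sum(y * z),        np.sum(x**2 + y**2)],
    ])
    eigvals, eigvecs = np.linalg.eigh(inertia)   # ascending eigenvalues
    return inertia, eigvals, eigvecs

# A rod of voxels along x: the smallest moment is about the x axis.
rod = [(i, 0.0, 0.0) for i in range(-5, 6)]
inertia, eigvals, eigvecs = principal_axes(rod)
```

Because the tensor is computed in closed form from the voxel coordinates, the registration is deterministic and independent of the operator, unlike iterative least-squares alignment.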
Applying Knowledge of Quantitative Design and Analysis
ERIC Educational Resources Information Center
Baskas, Richard S.
2011-01-01
This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…
Liu, Peng; Reichl, John H; Rao, Eshaan R; McNellis, Brittany M; Huang, Eric S; Hemmy, Laura S; Forster, Colleen L; Kuskowski, Michael A; Borchelt, David R; Vassar, Robert; Ashe, Karen H; Zahs, Kathleen R
2017-01-01
There exist several dozen lines of transgenic mice that express human amyloid-β protein precursor (AβPP) with Alzheimer's disease (AD)-linked mutations. AβPP transgenic mouse lines differ in the types and amounts of Aβ that they generate and in their spatiotemporal patterns of expression of Aβ assemblies, providing a toolkit to study Aβ amyloidosis and the influence of Aβ aggregation on brain function. More complete quantitative descriptions of the types of Aβ assemblies present in transgenic mice and in humans during disease progression should add to our understanding of how Aβ toxicity in mice relates to the pathogenesis of AD. Here, we provide a direct quantitative comparison of amyloid plaque burdens and plaque sizes in four lines of AβPP transgenic mice. We measured the fraction of cortex and hippocampus occupied by dense-core plaques, visualized by staining with Thioflavin S, in mice from young adulthood through advanced age. We found that the plaque burdens among the transgenic lines varied by an order of magnitude: at 15 months of age, the oldest age studied, the median cortical plaque burden in 5XFAD mice was already ∼4.5 times that of 21-month-old Tg2576 mice and ∼15 times that of 21-24-month-old rTg9191 mice. Plaque-size distributions changed across the lifespan in a line- and region-dependent manner. We also compared the dense-core plaque burdens in the mice to those measured in a set of pathologically-confirmed AD cases from the Nun Study. Cortical plaque burdens in Tg2576, APPSwePS1ΔE9, and 5XFAD mice eventually far exceeded those measured in the human cohort.
KEY COMPARISON: CCQM-K28: Tributyltin in sediment
NASA Astrophysics Data System (ADS)
Wolff Briche, Céline S. J.; Wahlen, Raimund; Sturgeon, Ralph E.
2006-01-01
Key comparison CCQM-K28 was undertaken to assess the measurement capabilities for quantitation of (C4H9)3Sn+ (TBT) in a prepared marine sediment by National Metrology Institutes (NMIs) that are members of the Comité Consultatif pour la Quantité de Matière (CCQM). It follows a previous pilot study, CCQM-P18 [1, 2]. This exercise was sanctioned by the 8th CCQM meeting, 18-19 April 2002, as an activity of the Inorganic Analysis Working Group and was jointly coordinated by the Institute for National Measurement Standards of the National Research Council of Canada (NRC) and LGC, UK. Eight NMIs initially indicated their interest, with seven ultimately submitting results. All NMIs relied on isotope dilution mass spectrometry using a species-specific 117Sn-enriched TBT standard, which was supplied by LGC. No analytical methodology was prescribed for this study. As a result, a variety of extraction approaches was adopted by the participants, including mechanical shaking, sonication, accelerated solvent extraction, microwave-assisted extraction and heating, in combination with ethylation and direct sampling. Detection techniques included ICP-MS (coupled to GC or HPLC for the separation of Sn species) and GC-MS. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the Mutual Recognition Arrangement (MRA).
Half-unit weighted bilinear algorithm for image contrast enhancement in capsule endoscopy
NASA Astrophysics Data System (ADS)
Rukundo, Olivier
2018-04-01
This paper proposes a novel enhancement method based exclusively on the bilinear interpolation algorithm for capsule endoscopy images. The proposed method does not convert the original RGB image components to HSV or any other color space or model; instead, it processes the RGB components directly. In each component, a group of four adjacent pixels and a half-unit weight in the bilinear weighting function are used to calculate the average pixel value, identical for each pixel in that particular group. After these calculations, groups of identical pixels are overlapped successively in the horizontal and vertical directions to achieve a preliminary enhanced image. The final enhanced image is achieved by halving the sum of the original and preliminary enhanced image pixels. Quantitative and qualitative experiments were conducted focusing on pairwise comparisons between original and enhanced images. The final enhanced images generally have the best diagnostic quality and give more details about the visibility of vessels and structures in capsule endoscopy images.
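A simplified sketch of the pipeline, assuming the half-unit bilinear weights reduce each 2x2 group to its plain average and that edges are handled by clamping (an assumption of this sketch, not stated in the abstract):

```python
import numpy as np

def half_unit_bilinear_enhance(rgb):
    """Sketch of the half-unit weighted bilinear enhancement. Each RGB
    channel is processed directly (no HSV conversion): every 2x2 group
    of adjacent pixels is replaced by its mean (half-unit bilinear
    weights make all four interpolated values equal), and the final
    image halves the sum of the original and this preliminary image."""
    img = np.asarray(rgb, dtype=float)
    right = np.concatenate([img[:, 1:], img[:, -1:]], axis=1)   # x-shift
    down = np.concatenate([img[1:], img[-1:]], axis=0)          # y-shift
    diag = np.concatenate([down[:, 1:], down[:, -1:]], axis=1)  # x+y shift
    preliminary = (img + right + down + diag) / 4.0
    return (img + preliminary) / 2.0

# A uniform image is left unchanged, as expected of an averaging filter.
flat = np.full((4, 4, 3), 100.0)
out = half_unit_bilinear_enhance(flat)
```

Blending the smoothed preliminary image back with the original is what keeps fine vessel detail while suppressing the blockiness that pure group-averaging would introduce.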
Forward ultrasonic model validation using wavefield imaging methods
NASA Astrophysics Data System (ADS)
Blackshire, James L.
2018-04-01
The validation of forward ultrasonic wave propagation models in a complex titanium polycrystalline material system is accomplished using wavefield imaging methods. An innovative measurement approach is described that permits the visualization and quantitative evaluation of bulk elastic wave propagation and scattering behaviors in the titanium material for a typical focused immersion ultrasound measurement process. Results are provided for the determination and direct comparison of the ultrasonic beam's focal properties, mode-converted shear wave position and angle, and scattering and reflection from millimeter-sized microtexture regions (MTRs) within the titanium material. The approach and results are important with respect to understanding the root-cause backscatter signal responses generated in aerospace engine materials, where model-assisted methods are being used to understand the probabilistic nature of the backscatter signal content. Wavefield imaging methods are shown to be an effective means for corroborating and validating important forward model predictions in a direct manner using time- and spatially-resolved displacement field amplitude measurements.
NASA Astrophysics Data System (ADS)
Tourret, D.; Karma, A.; Clarke, A. J.; Gibbs, P. J.; Imhoff, S. D.
2015-06-01
We present a three-dimensional (3D) extension of a previously proposed multi-scale Dendritic Needle Network (DNN) approach for the growth of complex dendritic microstructures. Using a new formulation of the DNN dynamics equations for dendritic paraboloid-branches of a given thickness, one can directly extend the DNN approach to 3D modeling. We validate this new formulation against known scaling laws and analytical solutions that describe the early transient and steady-state growth regimes, respectively. Finally, we compare the predictions of the model to in situ X-ray imaging of Al-Cu alloy solidification experiments. The comparison shows a very good quantitative agreement between 3D simulations and thin sample experiments. It also highlights the importance of full 3D modeling to accurately predict the primary dendrite arm spacing that is significantly over-estimated by 2D simulations.
Tourret, D.; Karma, A.; Clarke, A. J.; ...
2015-06-11
We present a three-dimensional (3D) extension of a previously proposed multi-scale Dendritic Needle Network (DNN) approach for the growth of complex dendritic microstructures. Using a new formulation of the DNN dynamics equations for dendritic paraboloid-branches of a given thickness, one can directly extend the DNN approach to 3D modeling. We validate this new formulation against known scaling laws and analytical solutions that describe the early transient and steady-state growth regimes, respectively. Finally, we compare the predictions of the model to in situ X-ray imaging of Al-Cu alloy solidification experiments. The comparison shows a very good quantitative agreement between 3D simulationsmore » and thin sample experiments. It also highlights the importance of full 3D modeling to accurately predict the primary dendrite arm spacing that is significantly over-estimated by 2D simulations.« less
Bryce A. Richardson; Gerald E. Rehfeldt; Mee-Sook Kim
2009-01-01
Analyses of molecular and quantitative genetic data demonstrate the existence of congruent climate-related patterns in western white pine (Pinus monticola). Two independent studies allowed comparisons of amplified fragment length polymorphism (AFLP) markers with quantitative variation in adaptive traits. Principal component analyses...
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
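The benchmark comparison described here (experimental SRM 1950 results against consensus mean concentrations) can be sketched as a simple z-score check. The consensus values, the data structures, and the two-sigma acceptance rule below are illustrative assumptions, not the actual LipidQC algorithm or NIST data.

```python
# Sketch: compare experimental lipid concentrations against consensus
# benchmark means, as in an SRM 1950 interlaboratory check.
# Reference values and the |z| <= 2 rule are illustrative assumptions,
# not the LipidQC implementation.

def benchmark_z_scores(measured, consensus):
    """Return {lipid: z} for lipids present in both dictionaries.

    measured:  {lipid_name: concentration}
    consensus: {lipid_name: (consensus_mean, standard_uncertainty)}
    """
    scores = {}
    for lipid, conc in measured.items():
        if lipid in consensus:
            mean, u = consensus[lipid]
            scores[lipid] = (conc - mean) / u
    return scores

# Hypothetical example values (nmol/mL), not NIST data:
consensus = {"PC 16:0/18:1": (210.0, 15.0), "SM d18:1/16:0": (95.0, 8.0)}
measured = {"PC 16:0/18:1": 225.0, "SM d18:1/16:0": 70.0}

for lipid, z in sorted(benchmark_z_scores(measured, consensus).items()):
    flag = "ok" if abs(z) <= 2 else "check"
    print(f"{lipid}: z = {z:+.1f} ({flag})")
```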
Determining absolute protein numbers by quantitative fluorescence microscopy.
Verdaasdonk, Jolien Suzanne; Lawrimore, Josh; Bloom, Kerry
2014-01-01
Biological questions are increasingly being addressed using a wide range of quantitative analytical tools to examine protein complex composition. Knowledge of the absolute number of proteins present provides insights into organization, function, and maintenance and is used in mathematical modeling of complex cellular dynamics. In this chapter, we outline and describe three microscopy-based methods for determining absolute protein numbers: fluorescence correlation spectroscopy, stepwise photobleaching, and ratiometric comparison of fluorescence intensity to known standards. In addition, we discuss the various fluorescently labeled proteins that have been used as standards for both stepwise photobleaching and ratiometric comparison analysis. A detailed procedure for determining absolute protein number by ratiometric comparison is outlined in the second half of this chapter. Counting proteins by quantitative microscopy is a relatively simple yet very powerful analytical tool that will increase our understanding of protein complex composition.
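The ratiometric comparison the abstract outlines reduces, in essence, to scaling a standard of known copy number by an intensity ratio. A minimal sketch under simplifying assumptions (same fluorophore, identical acquisition settings, and a common background for both measurements; the numbers and the 306-copy standard are hypothetical):

```python
def copies_by_ratio(i_unknown, i_standard, n_standard, background=0.0):
    """Estimate absolute protein copy number by ratiometric comparison.

    i_unknown, i_standard: integrated fluorescence intensities of the
        unknown complex and of a standard of known copy number, imaged
        under identical conditions with the same fluorophore.
    n_standard: known number of tagged proteins in the standard.
    background: camera/cellular background subtracted from both signals.
    """
    ratio = (i_unknown - background) / (i_standard - background)
    return n_standard * ratio

# Hypothetical example: a standard cluster assumed to carry 306 GFP copies.
est = copies_by_ratio(i_unknown=5.1e4, i_standard=1.7e4,
                      n_standard=306, background=2.0e3)
print(round(est))  # -> 1000
```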
75 FR 68468 - List of Fisheries for 2011
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
...-existent; therefore, quantitative data on the frequency of incidental mortality and serious injury is... currently available for most of these marine mammals on the high seas, and quantitative comparison of...
2016-01-01
Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978
The Role of Spatially Controlled Cell Proliferation in Limb Bud Morphogenesis
Boehm, Bernd; Westerberg, Henrik; Lesnicar-Pucko, Gaja; Raja, Sahdia; Rautschka, Michael; Cotterell, James; Swoger, Jim; Sharpe, James
2010-01-01
Although the vertebrate limb bud has been studied for decades as a model system for spatial pattern formation and cell specification, the cellular basis of its distally oriented elongation has been a relatively neglected topic by comparison. The conventional view is that a gradient of isotropic proliferation exists along the limb, with high proliferation rates at the distal tip and lower rates towards the body, and that this gradient is the driving force behind outgrowth. Here we test this hypothesis by combining quantitative empirical data sets with computer modelling to assess the potential role of spatially controlled proliferation rates in the process of directional limb bud outgrowth. In particular, we generate two new empirical data sets for the mouse hind limb—a numerical description of shape change and a quantitative 3D map of cell cycle times—and combine these with a new 3D finite element model of tissue growth. By developing a parameter optimization approach (which explores spatial patterns of tissue growth) our computer simulations reveal that the observed distribution of proliferation rates plays no significant role in controlling the distally extending limb shape, and suggests that directional cell activities are likely to be the driving force behind limb bud outgrowth. This theoretical prediction prompted us to search for evidence of directional cell orientations in the limb bud mesenchyme, and we thus discovered a striking highly branched and extended cell shape composed of dynamically extending and retracting filopodia, a distally oriented bias in Golgi position, and also a bias in the orientation of cell division. We therefore provide both theoretical and empirical evidence that limb bud elongation is achieved by directional cell activities, rather than a PD gradient of proliferation rates. PMID:20644711
Recent trends in high spin sensitivity magnetic resonance
NASA Astrophysics Data System (ADS)
Blank, Aharon; Twig, Ygal; Ishay, Yakir
2017-07-01
Magnetic resonance is a very powerful methodology that has been employed successfully in many applications for about 70 years now, resulting in a wealth of scientific, technological, and diagnostic data. Despite its many advantages, one major drawback of magnetic resonance is its relatively poor sensitivity and, as a consequence, its bad spatial resolution when examining heterogeneous samples. Contemporary science and technology often make use of very small amounts of material and examine heterogeneity on a very small length scale, both of which are well beyond the current capabilities of conventional magnetic resonance. It is therefore very important to significantly improve both the sensitivity and the spatial resolution of magnetic resonance techniques. The quest for higher sensitivity led in recent years to the development of many alternative detection techniques that seem to rival and challenge the conventional "old-fashioned" induction-detection approach. The aim of this manuscript is to briefly review recent advances in the field, and to provide a quantitative as well as qualitative comparison between various detection methods with an eye to future potential advances and developments. We first offer a common definition of sensitivity in magnetic resonance to enable proper quantitative comparisons between various detection methods. Following that, up-to-date information about the sensitivity capabilities of the leading recently-developed detection approaches in magnetic resonance is provided, accompanied by a critical comparison between them and induction detection. Our conclusion from this comparison is that induction detection is still indispensable, and as such, it is very important to look for ways to significantly improve it. To do so, we provide expressions for the sensitivity of induction-detection, derived from both classical and quantum mechanics, that identify its main limiting factors.
Examples from current literature, as well as a description of new ideas, show how these limiting factors can be mitigated to significantly improve the sensitivity of induction detection. Finally, we outline some directions for the possible applications of high-sensitivity induction detection in the field of electron spin resonance.
A comparison of visual and quantitative methods to identify interstitial lung abnormalities.
Kliment, Corrine R; Araki, Tetsuro; Doyle, Tracy J; Gao, Wei; Dupuis, Josée; Latourelle, Jeanne C; Zazueta, Oscar E; Fernandez, Isis E; Nishino, Mizuki; Okajima, Yuka; Ross, James C; Estépar, Raúl San José; Diaz, Alejandro A; Lederer, David J; Schwartz, David A; Silverman, Edwin K; Rosas, Ivan O; Washko, George R; O'Connor, George T; Hatabu, Hiroto; Hunninghake, Gary M
2015-10-29
Evidence suggests that individuals with interstitial lung abnormalities (ILA) on a chest computed tomogram (CT) may have an increased risk of developing a clinically significant interstitial lung disease (ILD). Although methods used to identify individuals with ILA on chest CT have included both automated quantitative and qualitative visual inspection methods, there has been no direct comparison between these two methods. To investigate this relationship, we created lung density metrics and compared these to visual assessments of ILA. To provide a comparison with ILA detection based on visual assessment, we generated measures of high attenuation areas (HAAs, defined by attenuation values between -600 and -250 Hounsfield Units) in >4500 participants from both the COPDGene and Framingham Heart studies (FHS). Linear and logistic regressions were used for analyses. Increased measures of HAAs (in ≥10 % of the lung) were significantly associated with ILA defined by visual inspection in both cohorts (P < 0.0001); however, the positive predictive values were not very high (19 % in COPDGene and 13 % in the FHS). In COPDGene, the association between HAAs and ILA defined by visual assessment was modified by the percentage of emphysema and body mass index. Although increased HAAs were associated with reductions in total lung capacity in both cohorts, there was no evidence for an association between measurement of HAAs and MUC5B promoter genotype in the FHS. Our findings demonstrate that increased measures of lung density may be helpful in determining the severity of lung volume reduction but, alone, are not strongly predictive of ILA defined by visual assessment. Moreover, HAAs were not associated with MUC5B promoter genotype.
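The HAA metric as described is the fraction of lung voxels with attenuation between -600 and -250 HU, with ≥10 % of the lung used as the threshold of interest. A minimal sketch (the toy array and mask handling are illustrative, not the study's pipeline):

```python
import numpy as np

def haa_fraction(hu, lung_mask, lo=-600.0, hi=-250.0):
    """Fraction of lung voxels whose attenuation lies in [lo, hi] HU,
    i.e. the 'high attenuation area' (HAA) measure described above."""
    voxels = hu[lung_mask]
    return float(np.mean((voxels >= lo) & (voxels <= hi)))

# Toy example: a tiny synthetic "CT" with a 3-voxel lung mask.
hu = np.array([[-900.0, -500.0], [-300.0, 50.0]])
mask = np.array([[True, True], [True, False]])
frac = haa_fraction(hu, mask)
print(frac, frac >= 0.10)  # HAA in >= 10% of the lung is the flagged range
```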
Song, Fujian; Xiong, Tengbin; Parekh-Bhurke, Sheetal; Loke, Yoon K; Sutton, Alex J; Eastwood, Alison J; Holland, Richard; Chen, Yen-Fu; Glenny, Anne-Marie; Deeks, Jonathan J; Altman, Doug G
2011-08-16
Objective To investigate the agreement between direct and indirect comparisons of competing healthcare interventions. Design Meta-epidemiological study based on a sample of meta-analyses of randomised controlled trials. Data sources Cochrane Database of Systematic Reviews and PubMed. Inclusion criteria Systematic reviews that provided sufficient data for both direct comparison and independent indirect comparisons of two interventions on the basis of a common comparator and in which the odds ratio could be used as the outcome statistic. Main outcome measure Inconsistency measured by the difference in the log odds ratio between the direct and indirect methods. Results The study included 112 independent trial networks (including 1552 trials with 478,775 patients in total) that allowed both direct and indirect comparison of two interventions. Indirect comparison had already been explicitly done in only 13 of the 85 Cochrane reviews included. The inconsistency between the direct and indirect comparison was statistically significant in 16 cases (14%, 95% confidence interval 9% to 22%). The statistically significant inconsistency was associated with fewer trials, subjectively assessed outcomes, and statistically significant effects of treatment in either direct or indirect comparisons. Owing to considerable inconsistency, many (14/39) of the statistically significant effects by direct comparison became non-significant when the direct and indirect estimates were combined. Conclusions Significant inconsistency between direct and indirect comparisons may be more prevalent than previously observed. Direct and indirect estimates should be combined in mixed treatment comparisons only after adequate assessment of the consistency of the evidence.
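The inconsistency measure used here, the difference in log odds ratio between direct and indirect methods, follows the standard adjusted indirect comparison (Bucher) construction: the indirect A-vs-B estimate through common comparator C is log OR(AC) minus log OR(BC). A minimal sketch with illustrative numbers (not data from the study):

```python
import math

def inconsistency(log_or_ab_direct, se_direct,
                  log_or_ac, se_ac, log_or_bc, se_bc):
    """Difference between direct and (Bucher-style) indirect log odds
    ratios for A vs B through common comparator C, with its z statistic."""
    indirect = log_or_ac - log_or_bc          # A vs C minus B vs C
    w = log_or_ab_direct - indirect           # inconsistency
    se_w = math.sqrt(se_direct**2 + se_ac**2 + se_bc**2)
    return w, w / se_w

# Hypothetical trial network; the numbers are illustrative only.
w, z = inconsistency(log_or_ab_direct=0.40, se_direct=0.15,
                     log_or_ac=0.90, se_ac=0.20,
                     log_or_bc=0.30, se_bc=0.18)
print(f"inconsistency = {w:.2f}, z = {z:.2f}")
```

A |z| above roughly 1.96 would mark the kind of statistically significant direct/indirect disagreement the review counts.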
The immediate economic impact of maternal deaths on rural Chinese households.
Ye, Fang; Wang, Haijun; Huntington, Dale; Zhou, Hong; Li, Yan; You, Fengzhi; Li, Jinhua; Cui, Wenlong; Yao, Meiling; Wang, Yan
2012-01-01
To identify the immediate economic impact of maternal death on rural Chinese households. Results are reported from a study that matched 195 households that had suffered a maternal death to 384 households that experienced a childbirth without maternal death in rural areas of three provinces in China, using a quantitative questionnaire to compare differences in direct and indirect costs between the two groups. The direct costs of a maternal death were significantly higher than the costs of a childbirth without a maternal death (US$4,119 vs. $370, p<0.001). More than 40% of the direct costs were attributed to funeral expenses. Hospitalization and emergency care expenses made up the largest proportion of non-funeral direct costs and were higher in households with a maternal death than in the comparison group (US$2,248 vs. $305, p<0.001). To cover most of the high direct costs, 44.1% of affected households used compensation from hospitals, while the remaining households (55.9%) borrowed money or took out loans as the major source of funds to offset direct costs. The median economic burden of the direct (and non-reimbursed) costs of a maternal death was quite high, at 37.0% of the household's annual income, approximately 4 times the threshold for an expense to be considered catastrophic. The immediate direct costs of maternal deaths are extremely catastrophic for rural Chinese households in the three provinces studied.
Shul'ts, E V; Baburin, I N; Karavaeva, T A; Karvasarskiĭ, B D; Slezin, V B
2011-01-01
Fifty-five patients with neurotic and neurosis-like disorders and 20 healthy controls, aged 17-64 years, were examined. The basic research method was electroencephalography (EEG) with fractal analysis of alpha power fluctuations. In patients, the changes in fractal structure were in the same direction: a decrease in the fractal indexes of low-frequency fluctuations and an increase in the fractal indexes of mid-frequency fluctuations. Patients with neurosis-like disorders, in comparison to those with neurotic disorders, were characterized by more pronounced quantitative changes in fractal structure, of a more extended character. This suggests the presence of deeper pathological changes in patients with neurosis-like disorders.
Resolving phase information of the optical local density of state with scattering near-field probes
NASA Astrophysics Data System (ADS)
Prasad, R.; Vincent, R.
2016-10-01
We theoretically discuss the link between the phase measured using scattering optical scanning near-field microscopy (s-SNOM) and the local density of optical states (LDOS). A remarkable result is that the LDOS information is directly encoded in the phase of the probe. Therefore, by monitoring the spatial variation of the trans-scattering phase, we locally measure the phase modulation associated with the probe and the optical paths. We demonstrate numerically that a technique involving two-phase imaging of a sample with two different-sized tips should allow imaging of the pLDOS. For this imaging method, numerical comparison with extinction probe measurement shows crucial qualitative and quantitative improvement.
Ion induced electron emission statistics under Agm- cluster bombardment of Ag
NASA Astrophysics Data System (ADS)
Breuers, A.; Penning, R.; Wucher, A.
2018-05-01
The electron emission from a polycrystalline silver surface under bombardment with Agm- cluster ions (m = 1, 2, 3) is investigated in terms of ion induced kinetic excitation. The electron yield γ is determined directly by a current measurement method on the one hand and implicitly by analysis of the electron emission statistics on the other. Successful measurements of the electron emission spectra enable a deeper understanding of the ion induced kinetic electron emission process, with particular emphasis on the effect of the projectile cluster size on the yield as well as on the emission statistics. The results allow a quantitative comparison to computer simulations performed for silver atoms and clusters impinging onto a silver surface.
NASA Astrophysics Data System (ADS)
Amengual, A.; Romero, R.; Homar, V.; Ramis, C.; Alonso, S.
2007-08-01
Particle response to shock waves in solids: dynamic witness plate/PIV method for detonations
NASA Astrophysics Data System (ADS)
Murphy, Michael J.; Adrian, Ronald J.
2007-08-01
Studies using transparent, polymeric witness plates consisting of polydimethlysiloxane (PDMS) have been conducted to measure the output of exploding bridge wire (EBW) detonators and exploding foil initiators (EFI). Polymeric witness plates are utilized to alleviate particle response issues that arise in gaseous flow fields containing shock waves and to allow measurements of shock-induced material velocities to be made using particle image velocimetry (PIV). Quantitative comparisons of velocity profiles across the shock waves in air and in PDMS demonstrate the improved response achieved by the dynamic witness plate method. Schlieren photographs complement the analysis through direct visualization of detonator-induced shock waves in the witness plates.
Goding, Julian C; Ragon, Dorisanne Y; O'Connor, Jack B; Boehm, Sarah J; Hupp, Amber M
2013-07-01
The fatty acid methyl ester (FAME) content of biodiesel fuels has traditionally been determined using gas chromatography with a polar stationary phase. In this study, a direct comparison of the separation of FAMEs present in various biodiesel samples on three polar stationary phases and one moderately polar stationary phase (with comparable column dimensions) was performed. Retention on each column was based on solubility in and polarity of the phase. Quantitative metrics describing the resolution of important FAME pairs indicate high resolution on all polar columns, yet the best resolution, particularly of geometric isomers, is achieved on the cyanopropyl column. In addition, the separation of four C18 monounsaturated isomers was optimized and the elution order determined on each column. FAME composition of various biodiesel fuel types was determined on each column to illustrate (1) chemical differences in biodiesels produced from different feedstocks and (2) chemical similarities in biodiesels of the same feedstock type produced in different locations and harvest seasons.
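Quantitative resolution metrics of the kind used to compare FAME pairs across columns are conventionally computed from retention times and baseline peak widths. A minimal sketch using the standard Rs formula; the retention values below are hypothetical, not data from this study:

```python
def resolution(t1, w1, t2, w2):
    """Chromatographic resolution of two peaks from retention times
    (t1 < t2) and baseline peak widths: Rs = 2*(t2 - t1) / (w1 + w2)."""
    return 2.0 * (t2 - t1) / (w1 + w2)

# Hypothetical C18:1 geometric isomer pair (times/widths in minutes):
rs = resolution(t1=21.40, w1=0.20, t2=21.70, w2=0.20)
print(rs)  # Rs >= 1.5 is conventionally taken as baseline resolution
```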
Morrison, P; Burnard, P
1989-04-01
The theoretical framework known as Six Category Intervention Analysis is described. This framework has been used in the teaching of interpersonal skills in various settings but there appears to be little or no empirical work to test out the theory. In the present study, an instrument was devised for assessing student nurses' perceptions of their interpersonal skills based on the category analysis. The findings of the study are presented and a quantitative comparison is made with the results of an earlier study of trained nurses' perceptions. Marked similarities were noted between the two sets of findings. The key trend to emerge was that both groups of nurses tended to perceive themselves as being more authoritative and less facilitative in their interpersonal relationships, in terms of the category analysis. This trend and others are discussed and suggestions made for future directions in research and training in the field of interpersonal skills in nursing. Implications for the theory of six category intervention analysis are also discussed.
Monomer volume fraction profiles in pH responsive planar polyelectrolyte brushes
Mahalik, Jyoti P.; Yang, Yubo; Deodhar, Chaitra V.; ...
2016-03-06
Spatial dependencies of monomer volume fraction profiles of pH responsive polyelectrolyte brushes were investigated using field theories and neutron reflectivity experiments. In particular, planar polyelectrolyte brushes in good solvent were studied and direct comparisons between predictions of the theories and experimental measurements are presented. The comparisons between the theories and the experimental data reveal that solvent entropy and ion-pairs resulting from adsorption of counterions from the added salt play key roles in affecting the monomer distribution and must be taken into account in modeling polyelectrolyte brushes. Furthermore, the utility of this physics-based approach based on these theories for the prediction and interpretation of neutron reflectivity profiles in the context of pH responsive planar polyelectrolyte brushes such as polybasic poly(2-(dimethylamino)ethyl methacrylate) (PDMAEMA) and polyacidic poly(methacrylic acid) (PMAA) brushes is demonstrated. The approach provides a quantitative way of estimating molecular weights of the polymers polymerized using surface-initiated atom transfer radical polymerization.
Li, Libo; Bentler, Peter M
2011-06-01
MacCallum, Browne, and Cai (2006) proposed a new framework for evaluation and power analysis of small differences between nested structural equation models (SEMs). In their framework, the null and alternative hypotheses for testing a small difference in fit and its related power analyses were defined by some chosen root-mean-square error of approximation (RMSEA) pairs. In this article, we develop a new method that quantifies those chosen RMSEA pairs and allows a quantitative comparison of them. Our method proposes the use of single RMSEA values to replace the choice of RMSEA pairs for model comparison and power analysis, thus avoiding the differential meaning of the chosen RMSEA pairs inherent in the approach of MacCallum et al. (2006). With this choice, the conventional cutoff values in model overall evaluation can directly be transferred and applied to the evaluation and power analysis of model differences.
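The RMSEA-pair power analysis referred to here rests on the usual noncentral chi-square framing: each RMSEA value maps to a noncentrality parameter (n-1)·df·RMSEA², and power is the probability of exceeding the null critical value under the alternative. A rough sketch under those assumptions, using a stdlib-only Monte Carlo sampler in place of an exact noncentral chi-square distribution; the numbers are illustrative, not the authors' implementation:

```python
import math
import random

def _ncx2_sample(df, nc, rng):
    # Noncentral chi-square draw: place all noncentrality in one component.
    x = (rng.gauss(0.0, 1.0) + math.sqrt(nc)) ** 2
    for _ in range(df - 1):
        x += rng.gauss(0.0, 1.0) ** 2
    return x

def power_rmsea_diff(n, df, rmsea0, rmsea1, alpha=0.05,
                     sims=10000, seed=1):
    """Monte Carlo power to reject H0: RMSEA = rmsea0 in favor of
    rmsea1 > rmsea0, for a chi-square test with df degrees of freedom."""
    rng = random.Random(seed)
    nc0 = (n - 1) * df * rmsea0 ** 2          # noncentrality under H0
    nc1 = (n - 1) * df * rmsea1 ** 2          # noncentrality under H1
    null = sorted(_ncx2_sample(df, nc0, rng) for _ in range(sims))
    crit = null[int((1 - alpha) * sims)]      # empirical critical value
    return sum(_ncx2_sample(df, nc1, rng) > crit
               for _ in range(sims)) / sims

print(round(power_rmsea_diff(n=500, df=20, rmsea0=0.05, rmsea1=0.08), 2))
```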
NASA Technical Reports Server (NTRS)
Witt, Adolf N.; Petersohn, Jens K.; Bohlin, Ralph C.; O'Connell, Robert W.; Roberts, Morton S.; Smith, Andrew M.; Stecher, Theodore P.
1992-01-01
The Ultraviolet Imaging Telescope, as part of the Astro-1 mission, was used to obtain high-resolution surface brightness distribution data in six ultraviolet wavelength bands for the bright reflection nebula NGC 7023. From the quantitative comparison of the measured surface brightness gradients, ratios of nebular to stellar flux, and detailed radial surface brightness profiles with corresponding data from the visible, two major conclusions result: (1) the scattering in the near- and far-ultraviolet in this nebula is more strongly forward-directed than in the visible; (2) the dust albedo in the ultraviolet for wavelengths not less than 140 nm is identical to that in the visible, with the exception of the 220 nm bump in the extinction curve. In the wavelength region of the bump, the albedo is reduced by 25 to 30 percent in comparison with wavelength regions both shorter and longer. This lower albedo is expected if the bump is a pure absorption feature.
Quantitative Comparisons of a Coarse-Grid LES with Experimental Data for Backward-Facing Step Flow
NASA Astrophysics Data System (ADS)
McDonough, J. M.
1999-11-01
A novel approach to LES employing an additive decomposition of both solutions and governing equations (similar to the "multi-level" approaches of Dubois et al., Dynamic Multilevel Methods and the Simulation of Turbulence, Cambridge University Press, 1999) is presented; its main structural features are lack of filtering of the governing equations (instead, solutions are filtered to remove aliasing due to under-resolution) and direct modeling of subgrid-scale primitive variables (rather than modeling their correlations) in the manner proposed by Hylin and McDonough (Int. J. Fluid Mech. Res. 26, 228-256, 1999). A 2-D implementation of this formalism is applied to the backward-facing step flow studied experimentally by Driver and Seegmiller (AIAA J. 23, 163-171, 1985) and Driver et al. (AIAA J. 25, 914-919, 1987), and run on grids sufficiently coarse to permit easy extension to 3-D, industrially realistic problems. Comparisons of computed and experimental mean quantities (velocity profiles, turbulence kinetic energy, reattachment lengths, etc.) and effects of grid refinement will be presented.
Dual modifications strategy to quantify neutral and sialylated N-glycans simultaneously by MALDI-MS.
Zhou, Hui; Warren, Peter G; Froehlich, John W; Lee, Richard S
2014-07-01
Differences in ionization efficiency among neutral and sialylated glycans prevent direct quantitative comparison by their respective mass spectrometric signals. To overcome this challenge, we developed an integrated chemical strategy, Dual Reactions for Analytical Glycomics (DRAG), to quantitatively compare neutral and sialylated glycans simultaneously by MALDI-MS. Initially, two glycan samples to be compared undergo reductive amination with 2-aminobenzoic acid and 2-(13)[C6]-aminobenzoic acid, respectively. The different isotope-incorporated glycans are then combined and subjected to the methylamidation of the sialic acid residues in one mixture, homogenizing the ionization responses for all neutral and sialylated glycans. By this approach, the expression change of relevant glycans between two samples is proportional to the ratios of doublet signals with a static 6 Da mass difference in MALDI-MS and the change in relative abundance of any glycan within samples can also be determined. The strategy was chemically validated using well-characterized N-glycans from bovine fetuin and IgG from human serum. By comparing the N-glycomes from a first morning (AM) versus an afternoon (PM) urine sample obtained from a single donor, we further demonstrated the ability of DRAG strategy to measure subtle quantitative differences in numerous urinary N-glycans.
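The relative-quantitation step of a DRAG-style experiment reduces to pairing peaks separated by the static 6 Da label difference and taking their intensity ratio. A minimal sketch; the peak-matching tolerance and the example peak list are illustrative assumptions, not the authors' processing pipeline:

```python
# Light (2-AA) and heavy (13C6-labeled 2-AA) forms of the same glycan
# appear as a doublet 6 Da apart; the intensity ratio gives the
# expression change between the two samples.
DELTA = 6.0  # static mass difference between the isotopic labels (Da)

def doublet_ratios(peaks, tol=0.02):
    """peaks: list of (m/z, intensity) tuples.
    Returns [(light_mz, light/heavy intensity ratio)] for peak pairs
    whose m/z difference matches DELTA within tol."""
    out = []
    for mz, inten in peaks:
        for mz2, inten2 in peaks:
            if abs((mz2 - mz) - DELTA) <= tol and inten2 > 0:
                out.append((mz, inten / inten2))
    return out

# Hypothetical MALDI peak list for one glycan doublet:
peaks = [(1809.64, 4.2e5), (1815.64, 2.1e5)]
for mz, r in doublet_ratios(peaks):
    print(f"glycan at m/z {mz:.2f}: sample1/sample2 = {r:.2f}")
```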
Standardizing Quality Assessment of Fused Remotely Sensed Images
NASA Astrophysics Data System (ADS)
Pohl, C.; Moellmann, J.; Fries, K.
2017-09-01
The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches to assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because visual assessment is inherently subjective and quantitative assessment is carried out against differing criteria. Depending on the criteria and indices, the result varies. It is therefore necessary to standardize both processes (qualitative and quantitative assessment) to allow an objective evaluation of image fusion quality. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared on various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
Casas-Vila, Núria; Scheibe, Marion; Freiwald, Anja; Kappei, Dennis; Butter, Falk
2015-11-17
To date, telomere research in fungi has mainly focused on Saccharomyces cerevisiae and Schizosaccharomyces pombe, despite the fact that both yeasts have degenerated telomeric repeats in contrast to the canonical TTAGGG motif found in vertebrates and also several other fungi. Using label-free quantitative proteomics, we here investigate the telosome of Neurospora crassa, a fungus with canonical telomeric repeats. We show that at least six of the candidates detected in our screen are direct TTAGGG-repeat binding proteins. While three of the direct interactors (NCU03416 [ncTbf1], NCU01991 [ncTbf2] and NCU02182 [ncTay1]) feature the known myb/homeobox DNA interaction domain also found in the vertebrate telomeric factors, we additionally show that a zinc-finger protein (NCU07846) and two proteins without any annotated DNA-binding domain (NCU02644 and NCU05718) are also direct double-strand TTAGGG binders. We further find two single-strand binders (NCU02404 [ncGbp2] and NCU07735 [ncTcg1]). By quantitative label-free interactomics we identify TTAGGG-binding proteins in Neurospora crassa, suggesting candidates for telomeric factors that are supported by phylogenomic comparison with yeast species. Intriguingly, homologs in yeast species with degenerated telomeric repeats are also TTAGGG-binding proteins, e.g. in S. cerevisiae Tbf1 recognizes the TTAGGG motif found in its subtelomeres. However, there is also a subset of proteins that is not conserved. While a rudimentary core TTAGGG-recognition machinery may be conserved across yeast species, our data suggests Neurospora as an emerging model organism with unique features.
Rapid enumeration of viable bacteria by image analysis
NASA Technical Reports Server (NTRS)
Singh, A.; Pyle, B. H.; McFeters, G. A.
1989-01-01
A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms ml-1) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.
Dai, Qi; Yang, Yanchun; Wang, Tianming
2008-10-15
Many proposed statistical measures can efficiently compare biological sequences to further infer their structures, functions and evolutionary information. They are related in spirit because all the ideas for sequence comparison try to use the information on the k-word distributions, the Markov model, or both. Motivated by adding k-word distributions to the Markov model directly, we investigated two novel statistical measures for sequence comparison, called wre.k.r and S2.k.r. The proposed measures were tested by similarity search, evaluation on functionally related regulatory sequences, and phylogenetic analysis, which offers a systematic and quantitative experimental assessment of our measures. Moreover, we compared our results with those of alignment-based and alignment-free methods. We grouped our experiments into two sets. The first, performed via ROC (receiver operating characteristic) analysis, aims at assessing the intrinsic ability of our statistical measures to search for similar sequences in a database and to discriminate functionally related regulatory sequences from unrelated sequences. The second aims at assessing how well our statistical measures serve phylogenetic analysis. The experimental assessment demonstrates that our similarity measures, which incorporate k-word distributions into the Markov model, are more efficient.
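To make the flavour of such alignment-free measures concrete, here is a minimal sketch that compares two sequences through their k-word frequency vectors using cosine similarity. This illustrates only the general k-word idea, not the wre.k.r or S2.k.r statistics defined in the paper; the sequences are toy examples.

```python
# Minimal alignment-free comparison via k-word (k-mer) frequency vectors.
from collections import Counter
from math import sqrt

def kword_freqs(seq, k):
    """Normalized k-word frequency vector of a sequence."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def cosine_similarity(f, g):
    """Cosine similarity between two sparse frequency vectors."""
    dot = sum(f.get(w, 0.0) * g.get(w, 0.0) for w in set(f) | set(g))
    norm = sqrt(sum(v * v for v in f.values())) * \
           sqrt(sum(v * v for v in g.values()))
    return dot / norm

a = "ACGTACGTACGT"
b = "ACGTTGCAACGT"
print(round(cosine_similarity(kword_freqs(a, 2), kword_freqs(b, 2)), 3))  # 0.784
```

Identical sequences score 1.0; unrelated ones score near 0, which is the intuition ROC analysis then tests at scale.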
Quantitative comparison of the in situ microbial communities in different biomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, D.C.; Ringelberg, D.B.; Palmer, R.J.
1995-12-31
A system to define microbial communities in different biomes requires the application of non-traditional methodology. Classical microbiological methods have severe limitations for the analysis of environmental samples. Pure-culture isolation, biochemical testing, and/or enumeration by direct microscopic counting are not well suited for the estimation of total biomass or the assessment of community composition within environmental samples. Such methods provide little insight into the in situ phenotypic activity of the extant microbiota, since these techniques depend on microbial growth and thus select against many environmental microorganisms which are non-culturable under a wide range of conditions. It has been repeatedly documented in the literature that viable counts or direct counts of bacteria attached to sediment grains are difficult to quantify and may grossly underestimate the extent of the existing community. The traditional tests provide little indication of the in situ nutritional status of the microbial community, or of toxicity within it. A more recent development, the MIDI Microbial Identification System, measures free and ester-linked fatty acids from isolated microorganisms. Bacterial isolates are identified by comparing their fatty acid profiles to the MIDI database, which contains over 8000 entries. The application of the MIDI system to the analysis of environmental samples, however, has significant drawbacks. The MIDI system was developed to identify clinical microorganisms and requires their isolation and culture on trypticase soy agar at 27°C. Since many isolates are unable to grow under these restrictive conditions, the system does not lend itself to identification of some environmental organisms.
A more applicable methodology for environmental microbial analysis is based on the liquid extraction and separation of microbial lipids from environmental samples, followed by quantitative analysis using gas chromatography.
High pressure rinsing system comparison
DOE Office of Scientific and Technical Information (OSTI.GOV)
D. Sertore; M. Fusetti; P. Michelato
2007-06-01
High pressure rinsing (HPR) is a key process for the surface preparation of high field superconducting cavities. A portable apparatus for water jet characterization, based on the momentum transferred between the water jet and a load cell, has been used in different laboratories. This apparatus allows the collection of quantitative parameters that characterize the HPR water jet. In this paper, we present a quantitative comparison of the water jets produced by the various nozzles routinely used for the HPR process in different laboratories.
2007-01-05
positive / false negatives. The quantitative on-site methods were evaluated using linear regression analysis and relative percent difference (RPD) comparison. [Remainder of this record is table-of-contents residue; recoverable section titles: Conclusion; Quantitative Analysis Using CRREL; Quantitative Analysis for NG by GC/TID.]
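The relative percent difference (RPD) mentioned in this fragment is a standard agreement metric between duplicate measurements; a minimal sketch of its usual definition (the sample values are hypothetical):

```python
# Relative percent difference: absolute difference divided by the mean
# of the two measurements, expressed as a percentage.
def rpd(a, b):
    """Relative percent difference between two measurements."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

print(round(rpd(12.0, 10.0), 2))  # 18.18
```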
Blue light dosage affects carotenoids and tocopherols in microgreens.
Samuolienė, Giedrė; Viršilė, Akvilė; Brazaitytė, Aušra; Jankauskienė, Julė; Sakalauskienė, Sandra; Vaštakaitė, Viktorija; Novičkovas, Algirdas; Viškelienė, Alina; Sasnauskas, Audrius; Duchovskis, Pavelas
2017-08-01
Mustard, beet and parsley were grown to harvest time under selected LEDs (638+660+731 nm) supplemented with 0%, 8%, 16%, 25% or 33% of 445 nm blue light. Concentrations of chlorophylls a and b, carotenoids, α- and β-carotenes, lutein, violaxanthin and zeaxanthin were 1.2 to 4.3 times higher under the 33% blue treatment in comparison to lower blue light dosages. Meanwhile, the accumulation of metabolites not directly connected with light reactions, such as tocopherols, was more influenced by a lower (16%) blue light dosage, increasing about 1.3 times. Thus, microgreen enrichment in carotenoid and xanthophyll pigments may be achieved using higher (16-33%) blue light intensities. Changes in metabolite quantities were not the result of changes in other carotenoid concentrations, but were more influenced by light treatment and depended on the species. Significant quantitative changes in response to blue light percentage were obtained for both directly and not directly light-dependent metabolite groups. Copyright © 2017 Elsevier Ltd. All rights reserved.
Paleomagnetic Analysis Using SQUID Microscopy
NASA Technical Reports Server (NTRS)
Weiss, Benjamin P.; Lima, Eduardo A.; Fong, Luis E.; Baudenbacher, Franz J.
2007-01-01
Superconducting quantum interference device (SQUID) microscopes are a new generation of instruments that map magnetic fields with unprecedented spatial resolution and moment sensitivity. Unlike standard rock magnetometers, SQUID microscopes map magnetic fields rather than measuring magnetic moments, such that the sample magnetization pattern must be retrieved from source model fits to the measured field data. In this paper, we presented the first direct comparison between paleomagnetic analyses on natural samples using joint measurements from SQUID microscopy and moment magnetometry. We demonstrated that, in combination with a priori geologic and petrographic data, SQUID microscopy can accurately characterize the magnetization of lunar glass spherules and Hawaiian basalt. The bulk moment magnitude and direction of these samples inferred from inversions of SQUID microscopy data match direct measurements on the same samples using moment magnetometry. In addition, these inversions provide unique constraints on the magnetization distribution within the sample. These measurements are among the most sensitive and highest resolution quantitative paleomagnetic studies of natural remanent magnetization to date. We expect that this technique will be able to extend many other standard paleomagnetic techniques to previously inaccessible microscale samples.
Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.
Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R
2014-08-01
To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. Values from 10 healthy volunteers were compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
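The agreement statistics reported above (mean difference and the spread seen in the Bland-Altman plot) follow a standard recipe; the sketch below illustrates it with hypothetical PBF values, not the study's data.

```python
# Bland-Altman agreement: bias (mean difference) and 95% limits of agreement
# between two measurement methods. Sample values are hypothetical.
from statistics import mean, stdev

def bland_altman(x, y):
    """Return (bias, (lower limit, upper limit)) for paired measurements."""
    diffs = [a - b for a, b in zip(x, y)]
    md = mean(diffs)
    sd = stdev(diffs)
    return md, (md - 1.96 * sd, md + 1.96 * sd)

pbf_fd      = [150.0, 142.0, 161.0, 155.0, 148.0]  # hypothetical mL/min/100mL
pbf_seepage = [151.0, 140.0, 158.0, 157.0, 150.0]
bias, (lo, hi) = bland_altman(pbf_fd, pbf_seepage)
print(round(bias, 2), round(lo, 2), round(hi, 2))  # 0.0 -4.6 4.6
```

A bias near zero with narrow limits, as reported in the abstract, indicates no systematic difference between the methods.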
Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul
2016-12-01
Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to simultaneously detect the isotopically labeled peptides in a single instrument run, guaranteeing relative quantitation for a large number of peptides without introducing variation from separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays considerable attention to proteins with only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summary using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed using a Gaussian mixture model and a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or getting stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of the number of peptide hits per protein, rescuing many proteins that would otherwise be discarded. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
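As a concrete illustration of the classification-based idea (though not the authors' implementation or their PSO variant), the sketch below fits a two-component 1-D Gaussian mixture to hypothetical log2 SILAC ratios by plain EM, separating "unchanged" from "changed" proteins.

```python
# Two-component 1-D Gaussian mixture fitted by EM; a generic textbook
# implementation, used here only to illustrate the clustering idea.
import math

def em_gmm_1d(data, iters=200):
    mu = [min(data), max(data)]          # crude deterministic initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            w = [pi[k] / math.sqrt(2 * math.pi * var[k]) *
                 math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in (0, 1)]
            s = w[0] + w[1]
            resp.append([w[0] / s, w[1] / s])
        # M-step: re-estimate weights, means and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
            pi[k] = nk / len(data)
    return mu, var, pi

# Hypothetical log2 ratios: six proteins near 0 (unchanged), three near 2.
data = [0.1, -0.2, 0.0, 0.15, -0.1, 0.05, 2.1, 1.9, 2.2]
mu, var, pi = em_gmm_1d(data)
print(sorted(round(m, 2) for m in mu))  # component means
```

Proteins assigned high responsibility under the off-center component would be flagged as potential markers, regardless of peptide count.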
Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao
2015-01-01
Cell migration is a cellular response involved in various biological processes such as cancer metastasis, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining cell migration rate by comparison of successive images may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on the impedimetric measurement technique. A no-damage wound was constructed by microfluidic phenomena, and cell migration activity under stimulation by a cytokine (interleukin-6) and an anti-cancer drug (doxorubicin) was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated with cell migration activity; therefore, the migration rate could be calculated. In addition, a good match was found between impedance measurement and conventional imaging analysis, but the impedimetric technique provides an objective and quantitative measurement. Based on our technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under stimulation with cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
Nielsen, Karsten H.; Karlsson, Stefan; Limbach, Rene; Wondraczek, Lothar
2015-01-01
The abrasion resistance of coated glass surfaces is an important parameter for judging lifetime performance, but practical testing procedures remain overly simplistic and often do not allow for direct conclusions on real-world degradation. Here, we combine quantitative two-dimensional image analysis and mechanical abrasion into a facile tool for probing the abrasion resistance of anti-reflective (AR) coatings. We determine variations in the average coated area, during and after controlled abrasion. Through comparison with other experimental techniques, we show that this method provides a practical, rapid and versatile tool for the evaluation of the abrasion resistance of sol-gel-derived thin films on glass. The method yields informative data, which correlates with measurements of diffuse reflectance and is further supported by qualitative investigations through scanning electron microscopy. In particular, the method directly addresses degradation of coating performance, i.e., the gradual areal loss of antireflective functionality. As an exemplary subject, we studied the abrasion resistance of state-of-the-art nanoporous SiO2 thin films which were derived from 5–6 wt% aqueous solutions of potassium silicates, or from colloidal suspensions of SiO2 nanoparticles. It is shown how abrasion resistance is governed by coating density and film adhesion, defining the trade-off between optimal AR performance and acceptable mechanical performance. PMID:26656260
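The "average coated area" metric described above reduces, at its simplest, to the coated-pixel fraction of a thresholded image. A minimal sketch under that assumption (the binary grid is hypothetical, not from the paper's image pipeline):

```python
# Fraction of surface still covered by coating, from a binary image
# (1 = coating present, 0 = coating abraded away).
def coated_fraction(binary_image):
    """binary_image: list of rows of 0/1 pixels."""
    total = sum(len(row) for row in binary_image)
    coated = sum(sum(row) for row in binary_image)
    return coated / total

after_abrasion = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
]
print(coated_fraction(after_abrasion))  # 0.75
```

Tracking this fraction over abrasion cycles gives the gradual areal loss of antireflective functionality that the method targets.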
Oikonomopoulos, Spyros; Wang, Yu Chang; Djambazian, Haig; Badescu, Dunarel; Ragoussis, Jiannis
2016-08-24
To assess the performance of the Oxford Nanopore Technologies MinION sequencing platform, cDNAs from the External RNA Controls Consortium (ERCC) RNA Spike-In mix were sequenced. This mix mimics mammalian mRNA species and consists of 92 polyadenylated transcripts with known concentration. cDNA libraries were generated using a template switching protocol to facilitate direct comparison between different sequencing platforms. The MinION was assessed for its ability to sequence the cDNAs directly with good accuracy in terms of abundance and full-length coverage. The abundance of the ERCC cDNA molecules sequenced by MinION agreed with their expected concentration. No length or GC content bias was observed. The majority of cDNAs were sequenced as full length. Additionally, a complex cDNA population derived from a human HEK-293 cell line was sequenced on the Illumina HiSeq 2500, PacBio RS II and ONT MinION platforms. We observed good agreement in the measured cDNA abundance between PacBio RS II and ONT MinION (Pearson r = 0.82 for isoforms longer than 700 bp) and between Illumina HiSeq 2500 and ONT MinION (Pearson r = 0.75). This indicates that the ONT MinION can quantitatively sequence both long and short full-length cDNA molecules.
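The cross-platform agreement figures quoted above are Pearson correlations between abundance estimates; a self-contained sketch with hypothetical transcript counts (not the study's data):

```python
# Pearson correlation between abundance estimates from two platforms.
from math import sqrt

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical read counts per transcript on each platform
minion = [120, 340, 15, 980, 55]
pacbio = [110, 360, 20, 940, 70]
print(round(pearson(minion, pacbio), 3))  # 0.999
```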
An alternative method for analysis of food taints using stir bar sorptive extraction.
Ridgway, Kathy; Lalljie, Sam P D; Smith, Roger M
2010-09-10
The determination of taints in food products currently can involve the use of several sample extraction techniques, including direct headspace (DHS), steam distillation extraction (SDE) and, more recently, solid phase microextraction (SPME). Each of these techniques has disadvantages, such as the use of large volumes of solvents (SDE), limitations in sensitivity (DHS), or application to date only for determination of individual or specific groups of tainting compounds (SPME). The use of stir bar sorptive extraction (SBSE) has been evaluated as a quantitative screening method for unknown tainting compounds in foods. A set of commonly investigated problem compounds, covering a range of physical and chemical properties, was examined. The method was optimised to give the best response for the majority of compounds, and performance was evaluated by examining the accuracy, precision, linearity, limits of detection and quantitation, and uncertainties for each analyte. For most compounds, SBSE gave the lowest limits of detection compared to steam distillation extraction or direct headspace analysis and in general was better than these established techniques. However, for methyl methacrylate and hexanal no response was observed following stir bar extraction under the optimised conditions. The assays were carried out using a single quadrupole GC-MS in scan mode. A comparison of acquisition modes and instrumentation was performed using standards to illustrate the increase in sensitivity possible using more targeted ion monitoring or a more sensitive high resolution mass spectrometer. This comparison illustrated the usefulness of this approach as an alternative to specialised glassware or expensive instrumentation. SBSE in particular offers a 'greener' extraction method through a large reduction in the use of organic solvents, and also minimises the potential for contamination from external laboratory sources, which is of particular concern for taint analysis.
Copyright © 2010 Elsevier B.V. All rights reserved.
Pallebage-Gamarallage, Menuka; Foxley, Sean; Menke, Ricarda A L; Huszar, Istvan N; Jenkinson, Mark; Tendler, Benjamin C; Wang, Chaoyue; Jbabdi, Saad; Turner, Martin R; Miller, Karla L; Ansorge, Olaf
2018-03-13
Amyotrophic lateral sclerosis (ALS) is a clinically and histopathologically heterogeneous neurodegenerative disorder, in which therapy is hindered by the rapid progression of disease and lack of biomarkers. Magnetic resonance imaging (MRI) has demonstrated its potential for detecting the pathological signature and tracking disease progression in ALS. However, the microstructural and molecular pathological substrate is poorly understood and generally defined histologically. One route to understanding and validating the pathophysiological correlates of MRI signal changes in ALS is to directly compare MRI to histology in post mortem human brains. The article delineates a universal whole brain sampling strategy of pathologically relevant grey matter (cortical and subcortical) and white matter tracts of interest suitable for histological evaluation and direct correlation with MRI. A standardised systematic sampling strategy that was compatible with co-registration of images across modalities was established for regions representing phosphorylated 43-kDa TAR DNA-binding protein (pTDP-43) patterns that were topographically recognisable with defined neuroanatomical landmarks. Moreover, tractography-guided sampling facilitated accurate delineation of white matter tracts of interest. A digital photography pipeline at various stages of sampling and histological processing was established to account for structural deformations that might impact alignment and registration of histological images to MRI volumes. Combined with quantitative digital histology image analysis, the proposed sampling strategy is suitable for routine implementation in a high-throughput manner for acquisition of large-scale histology datasets. Proof of concept was determined in the spinal cord of an ALS patient where multiple MRI modalities (T1, T2, FA and MD) demonstrated sensitivity to axonal degeneration and associated heightened inflammatory changes in the lateral corticospinal tract. 
Furthermore, qualitative comparison of R2* and susceptibility maps in the motor cortex of 2 ALS patients demonstrated varying degrees of hyperintense signal changes compared to a control. Upon histological evaluation of the same region, intensity of signal changes in both modalities appeared to correspond primarily to the degree of microglial activation. The proposed post mortem whole brain sampling methodology enables the accurate intraindividual study of pathological propagation and comparison with quantitative MRI data, to more fully understand the relationship of imaging signal changes with underlying pathophysiology in ALS.
Using Inequality Measures to Incorporate Environmental Justice into Regulatory Analyses
Abstract: Formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. However, there is a large literature on developing and applying quantitative...
Mickenautsch, Steffen; Yengopal, Veerasamy
2013-01-01
Naïve-indirect comparisons are comparisons between competing clinical interventions' evidence from separate (uncontrolled) trials. Direct comparisons are comparisons within randomised control trials (RCTs). The objective of this empirical study is to test the null hypothesis that trends and performance differences inferred from naïve-indirect comparisons and from direct comparisons/RCTs regarding the failure rates of amalgam and direct high-viscosity glass-ionomer cement (HVGIC) restorations in permanent posterior teeth have similar direction and magnitude. A total of 896 citations were identified through systematic literature search. From these, ten and two uncontrolled clinical longitudinal studies for HVGIC and amalgam, respectively, were included for naïve-indirect comparison and could be matched with three out of twenty RCTs. Summary effect sizes were computed as odds ratios (OR; 95% confidence intervals) and compared with those from RCTs. Trend directions were inferred from 95% confidence interval overlaps and direction of point estimates; magnitudes of performance differences were inferred from the median point estimates (OR) with 25% and 75% percentile range, for both types of comparison. The Mann-Whitney U test was applied to test for statistically significant differences between point estimates of both comparison types. Trends and performance differences inferred from naïve-indirect comparison based on evidence from uncontrolled clinical longitudinal studies and from direct comparisons based on RCT evidence are not the same. The distributions of the point estimates differed significantly for both comparison types (Mann-Whitney U = 25, n(indirect) = 26; n(direct) = 8; p = 0.0013, two-tailed). The null hypothesis was rejected. Trends and performance differences inferred from either comparison between HVGIC and amalgam restoration failure rates in permanent posterior teeth are not the same.
It is recommended that clinical practice guidance regarding HVGICs should rest on direct comparisons via RCTs and not on naïve-indirect comparisons based on uncontrolled longitudinal studies in order to avoid inflation of effect estimates.
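The summary effect sizes in the study above are odds ratios with 95% confidence intervals; the sketch below shows the standard 2x2 computation with hypothetical failure counts, not figures from the included trials.

```python
# Odds ratio with 95% CI from a 2x2 table, using the standard
# log-odds standard error. Counts below are hypothetical.
from math import log, exp, sqrt

def odds_ratio_ci(a, b, c, d):
    """a,b = failures/successes (group 1); c,d = failures/successes (group 2)."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(12, 88, 8, 92)  # hypothetical HVGIC vs amalgam
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.57 0.61 4.02
```

A confidence interval spanning 1 (as here) would indicate no significant difference in failure odds, which is why overlap of intervals was used to infer trend directions.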
Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.
2016-01-01
A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested on two digital PCR systems (Bio-Rad and RainDance) for quantitation of cytomegalovirus (CMV). Both commercial quantitative viral standards and patient samples (n = 16) were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
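Digital PCR quantitation such as that described above rests on a standard Poisson correction from the fraction of positive partitions; a minimal sketch of that textbook formula (the partition counts and droplet volume are hypothetical, not from the study):

```python
# Poisson correction for digital PCR: mean copies per partition from the
# positive-partition fraction, then concentration per microliter.
from math import log

def copies_per_partition(positives, total):
    """Mean template copies per partition, corrected for multiple occupancy."""
    p = positives / total
    return -log(1.0 - p)

def copies_per_ul(positives, total, partition_volume_nl):
    lam = copies_per_partition(positives, total)
    return lam / (partition_volume_nl * 1e-3)  # nL -> uL

# e.g. 4,000 positive droplets out of 20,000, assuming ~0.85 nL droplets
print(round(copies_per_ul(4000, 20000, 0.85), 1))  # 262.5
```

The correction matters because a partition can hold more than one template copy, so the raw positive fraction underestimates concentration.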
Jin, Qiaoling; Paunesku, Tatjana; Lai, Barry; ...
2016-08-31
Trace metals play important roles in biological function, and x-ray fluorescence microscopy (XFM) provides a way to quantitatively image their distribution within cells. The faithfulness of these measurements is dependent on proper sample preparation. Using mouse embryonic fibroblast NIH/3T3 cells as an example, we compare various approaches to the preparation of adherent mammalian cells for XFM imaging under ambient temperature. Direct side-by-side comparison shows that plunge-freezing-based cryoimmobilization provides more faithful preservation than conventional chemical fixation for most biologically important elements including P, S, Cl, K, Fe, Cu, Zn and possibly Ca in adherent mammalian cells. Although cells rinsed with fresh media had a great deal of extracellular background signal for Cl and Ca, this approach maintained cells at the best possible physiological status before rapid freezing and does not interfere with XFM analysis of other elements. If chemical fixation has to be chosen, the combination of 3% paraformaldehyde and 1.5% glutaraldehyde preserves S, Fe, Cu and Zn better than either fixative alone. Lastly, when chemically fixed cells were subjected to a variety of dehydration processes, air drying proved more suitable than other drying methods such as graded ethanol dehydration and freeze drying. This first detailed comparison for x-ray fluorescence microscopy shows how quantitative conclusions can be affected by the choice of cell preparation method.
Silica Coated Paper Substrate for Paper-Spray Analysis of Therapeutic Drugs in Dried Blood Spots
Zhang, Zhiping; Xu, Wei; Manicke, Nicholas E.; Cooks, R. Graham; Ouyang, Zheng
2011-01-01
Paper spray is a newly developed ambient ionization method that has been applied for direct qualitative and quantitative analysis of biological samples. The properties of the paper substrate and spray solution have a significant impact on the release of chemical compounds from complex sample matrices, the diffusion of the analytes through the substrate, and the formation of ions for mass spectrometry analysis. In this study, a commercially available silica-coated paper was explored in an attempt to improve the analysis of therapeutic drugs in dried blood spots (DBS). The dichloromethane/isopropanol solvent was identified as an optimal spray solvent for the analysis. A comparison was made with paper spray using chromatography paper as the substrate and methanol/water as the solvent for the analysis of verapamil, citalopram, amitriptyline, lidocaine and sunitinib in dried blood spots. The recovery efficiency of the analytes was demonstrated to be notably improved with the silica-coated paper, and the limit of quantitation (LOQ) for the drug analysis was 0.1 ng mL−1 using a commercial triple quadrupole mass spectrometer. The use of the silica paper substrate also resulted in a sensitivity improvement of 5-50 fold in comparison with chromatography papers, including the Whatman ET31 paper used for blood cards. Analysis using a handheld miniature mass spectrometer, the Mini 11, gave LOQs of 10-20 ng mL−1 for the tested drugs, which is sufficient to cover the therapeutic ranges of these drugs. PMID:22145627
Kim, Min Soon; Rodney, William N; Cooper, Tara; Kite, Chris; Reece, Gregory P; Markey, Mia K
2009-02-01
Scarring is a significant cause of dissatisfaction for women who undergo breast surgery. Scar tissue may be clinically distinguished from normal skin by aberrant colour, rough surface texture, increased thickness (hypertrophy) and firmness. Colorimeters or spectrophotometers can be used to quantitatively assess scar colour, but they require direct patient interaction and can cost thousands of dollars. By comparison, digital photography is already in widespread use to document clinical outcomes and requires less patient interaction. Thus, assessment of scar coloration by digital photography is an attractive alternative. The goal of this study was to compare colour measurements obtained by digital photography and colorimetry. Agreements between photographic and colorimetric measurements of colour were evaluated. Experimental conditions were controlled by performing measurements on artificial scars created by a make-up artist. The colorimetric measurements of the artificial scars were compared with those reported in the literature for real scars in order to confirm the validity of this approach. We assessed the agreement between the colorimetric and photographic measurements of colour using a hypothesis test for equivalence, the intraclass correlation coefficient and the Bland-Altman method. Overall, good agreement was obtained for three parameters (L*a*b*) measured by colorimetry and photography from the results of three statistical analyses. Colour measurements obtained by digital photography were equivalent to those obtained using colorimetry. Thus, digital photography is a reliable, cost-effective measurement method of skin colour and should be further investigated for quantitative analysis of surgical outcomes.
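The Bland-Altman agreement analysis used in the scar-colour study above reduces to a short calculation: the mean paired difference (bias) and its 95% limits of agreement. The sketch below is a minimal illustration with hypothetical paired L* (lightness) readings, not the study's data:

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman agreement: mean difference (bias) between paired
    measurements and the 95% limits of agreement (bias +/- 1.96 SD)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired L* readings for five artificial scars
photograph  = [61.8, 58.9, 64.6, 60.5, 59.2]
colorimeter = [62.1, 58.4, 65.0, 60.2, 59.7]
bias, (low, high) = bland_altman(photograph, colorimeter)
```

A small bias with narrow limits of agreement is the pattern the study reports as "equivalent" measurements.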
Purified oocysts of Cryptosporidium parvum were used to evaluate applicability of two quantitative PCR (qPCR) viability detection methods in raw surface water and disinfection treated water. Propidium monoazide-qPCR targeting hsp70 gene was compared to reverse transcription (RT)-...
Analytic Guided-Search Model of Human Performance Accuracy in Target- Localization Search Tasks
NASA Technical Reports Server (NTRS)
Eckstein, Miguel P.; Beutter, Brent R.; Stone, Leland S.
2000-01-01
Current models of human visual search have extended the traditional serial/parallel search dichotomy. Two successful models for predicting human visual search are the Guided Search model and the Signal Detection Theory model. Although these models are inherently different, it has been difficult to compare them because the Guided Search model is designed to predict response time, while Signal Detection Theory models are designed to predict performance accuracy. Moreover, current implementations of the Guided Search model require the use of Monte Carlo simulations, a method that makes quantitatively fitting the model's performance to human data computationally time-consuming. We have extended the Guided Search model to predict human accuracy in target-localization search tasks. We have also developed analytic expressions that simplify simulation of the model to the evaluation of a small set of equations using only three free parameters. This new implementation and extension of the Guided Search model will enable direct quantitative comparisons with human performance in target-localization search experiments and with the predictions of Signal Detection Theory and other search accuracy models.
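The Signal Detection Theory side of such comparisons has a standard closed-form accuracy expression: under a max rule, the target is correctly localized when its noisy response exceeds all M-1 distractor responses, giving P(correct) = ∫ φ(x − d′) Φ(x)^(M−1) dx. The sketch below evaluates that integral numerically; it illustrates the generic SDT localization model, not the authors' three-parameter Guided Search extension:

```python
import math

def phi(x):   # standard normal density
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):   # standard normal cumulative distribution
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def p_correct(d_prime, m, n=4000, lo=-8.0, hi=8.0):
    """Probability that the target location gives the maximum internal
    response among m locations (max rule): trapezoidal evaluation of
    the integral of phi(x - d') * Phi(x)**(m - 1) over x."""
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        x = lo + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * phi(x - d_prime) * Phi(x) ** (m - 1)
    return h * total
```

With d′ = 0 and two locations this correctly returns chance (0.5), and accuracy grows with d′ at fixed set size.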
Cheng, Zhiliang; Zaki, Ajlan Al; Hui, James Z; Tsourkas, Andrew
2012-01-01
Liposomes are intensively being developed for biomedical applications including drug and gene delivery. However, targeted liposomal delivery in cancer treatment is a very complicated multi-step process. Unfavorable liposome biodistribution upon intravenous administration and membrane destabilization in blood circulation could result in only a very small fraction of cargo reaching the tumors. It would therefore be desirable to develop new quantitative strategies to track liposomal delivery systems to improve the therapeutic index and decrease systemic toxicity. Here, we developed a simple and non-radiative method to quantify the tumor uptake of targeted and non-targeted control liposomes as well as their encapsulated contents simultaneously. Specifically, four different chelated lanthanide metals were encapsulated or surface-conjugated onto tumor-targeted and non-targeted liposomes, respectively. The two liposome formulations were then injected into tumor-bearing mice simultaneously and their tumor delivery was determined quantitatively via inductively coupled plasma-mass spectroscopy (ICP-MS), allowing for direct comparisons. Tumor uptake of the liposomes themselves and their encapsulated contents were consistent with targeted and non-targeted liposome formulations that were injected individually. PMID:22882145
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chuang, Claire Y.; Zepeda-Ruiz, Luis A.; Han, Sang M.
2015-06-01
Molecular dynamics simulations were used to study Ge island nucleation and growth on amorphous SiO2 substrates. This process is relevant in selective epitaxial growth of Ge on Si, for which SiO2 is often used as a template mask. The islanding process was studied over a wide range of temperatures and fluxes, using a recently proposed empirical potential model for the Si–SiO2–Ge system. The simulations provide an excellent quantitative picture of the Ge islanding and compare well with detailed experimental measurements. These quantitative comparisons were enabled by an analytical rate model as a bridge between simulations and experiments, despite the fact that deposition fluxes accessible in simulations and experiments are necessarily different by many orders of magnitude. In particular, the simulations led to accurate predictions of the critical island size and the scaling of island density as a function of temperature. Lastly, the overall approach used here should be useful not just for future studies in this particular system, but also for molecular simulations of deposition in other materials.
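The rate-model bridge referred to above is, in its simplest textbook form, a pair of coupled mean-field ODEs for monomer and island densities. The sketch below integrates the irreversible-nucleation equations (critical island size i* = 1, capture numbers set to 1); it is a generic stand-in under those assumptions, not the paper's actual model:

```python
def island_density(F, D, t_end, dt=1e-4):
    """Forward-Euler integration of mean-field rate equations for
    irreversible island nucleation (critical size i* = 1):
        dn1/dt = F - 2*D*n1**2 - D*n1*N   (monomer density n1)
        dN/dt  = D*n1**2                  (stable island density N)
    F is the deposition flux, D the monomer diffusion rate."""
    n1 = N = 0.0
    for _ in range(int(t_end / dt)):
        dn1 = F - 2.0 * D * n1 * n1 - D * n1 * N
        dN = D * n1 * n1
        n1 += dn1 * dt
        N += dN * dt
    return n1, N
```

In this regime the island density rises with flux at fixed diffusivity (classically N ~ (F/D)^(1/3)), which is the kind of scaling such models let one compare across the very different fluxes of simulation and experiment.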
Barkla, Bronwyn J.
2016-01-01
Modern day agriculture practice is narrowing the genetic diversity in our food supply. This may compromise the ability to obtain high yield under extreme climactic conditions, threatening food security for a rapidly growing world population. To identify genetic diversity, tolerance mechanisms of cultivars, landraces and wild relatives of major crops can be identified and ultimately exploited for yield improvement. Quantitative proteomics allows for the identification of proteins that may contribute to tolerance mechanisms by directly comparing protein abundance under stress conditions between genotypes differing in their stress responses. In this review, a summary is provided of the data accumulated from quantitative proteomic comparisons of crop genotypes/cultivars which present different stress tolerance responses when exposed to various abiotic stress conditions, including drought, salinity, high/low temperature, nutrient deficiency and UV-B irradiation. This field of research aims to identify molecular features that can be developed as biomarkers for crop improvement, however without accurate phenotyping, careful experimental design, statistical robustness and appropriate biomarker validation and verification it will be challenging to deliver what is promised. PMID:28248236
Tuerxunyiming, Muhadasi; Xian, Feng; Zi, Jin; Yimamu, Yilihamujiang; Abuduwayite, Reshalaiti; Ren, Yan; Li, Qidan; Abudula, Abulizi; Liu, SiQi; Mohemaiti, Patamu
2018-01-05
Maturity-onset diabetes of the young (MODY) is an inherited monogenic type of diabetes. Genetic mutations in MODY often cause nonsynonymous changes that directly lead to functional distortion of proteins and pathological consequences. Herein, we proposed that the inherited mutations found in a MODY family could cause a disturbance of protein abundance, specifically in serum. Serum samples were collected from a Uyghur MODY family through three generations, and the serum proteins, after depletion treatment, were examined by quantitative proteomics to characterize MODY-related serum proteins, followed by verification using targeted proteomic quantification. A total of 32 serum proteins were preliminarily identified as MODY-related. A further verification test on the individual samples demonstrated 12 candidates with significantly different abundance in the MODY patients. A comparison of the 12 proteins among the sera of type 1 diabetes, type 2 diabetes, MODY, and healthy subjects revealed a MODY-related protein signature composed of serum proteins such as SERPINA7, APOC4, LPA, C6, and F5.
An approach to quantitative sustainability assessment in the early stages of process design.
Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio
2008-06-15
A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
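The normalized-index idea in the sustainability paper above can be illustrated compactly: each impact of a process alternative is divided by the reference area's existing burden, and the resulting indices are aggregated with site policy weights into one performance indicator. The category names, weights, and numbers below are invented for illustration, not taken from the paper:

```python
def impact_fingerprint(impacts, reference, weights):
    """Normalize each impact category against the reference area's
    existing burden, then aggregate the normalized indices with
    policy weights into a single sustainability indicator."""
    indices = {k: impacts[k] / reference[k] for k in impacts}
    aggregate = sum(weights[k] * indices[k] for k in indices)
    return indices, aggregate

# Invented annual loads for one process alternative vs. its region
alternative = {"ghg": 120.0, "water": 30.0, "waste": 10.0}
region      = {"ghg": 6000.0, "water": 1500.0, "waste": 400.0}
policy_w    = {"ghg": 0.5, "water": 0.3, "waste": 0.2}
indices, score = impact_fingerprint(alternative, region, policy_w)
```

The per-category `indices` dict is the "impact fingerprint"; `score` is the aggregated indicator used to rank alternatives.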
Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki
2017-06-01
Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images by digital cameras allows analysis and long-term storage of the acquired images. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but require detached plant materials to capture images, and the destructive processes involved often induce deformation of plant materials (curled leaves, faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and distinctively quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
Francesconi, Andrea; Kasai, Miki; Petraitiene, Ruta; Petraitis, Vidmantas; Kelaher, Amy M.; Schaufele, Robert; Hope, William W.; Shea, Yvonne R.; Bacher, John; Walsh, Thomas J.
2006-01-01
Bronchoalveolar lavage (BAL) is widely used for evaluation of patients with suspected invasive pulmonary aspergillosis (IPA). However, the diagnostic yield of BAL for detection of IPA by culture and direct examination is limited. Earlier diagnosis may be facilitated by assays that can detect Aspergillus galactomannan antigen or DNA in BAL fluid. We therefore characterized and compared the diagnostic yields of a galactomannan enzyme immunoassay (GM EIA), quantitative real-time PCR (qPCR), and quantitative cultures in experiments using BAL fluid from neutropenic rabbits with experimentally induced IPA, defined as microbiologically and histologically evident invasion. The qPCR assay targeted the rRNA gene complex of Aspergillus fumigatus. The GM EIA and qPCR assay were characterized by receiver operating characteristic (ROC) curve analysis. With an optimal cutoff of 0.75, the GM EIA had a sensitivity and specificity of 100% in untreated controls. A decline in sensitivity (92%) was observed when antifungal therapy (AFT) was administered. The optimal cutoff for qPCR was a crossover of 36 cycles, with sensitivity and specificity of 80% and 100%, respectively. The sensitivity of qPCR also decreased with AFT to 50%. Quantitative culture of BAL had a sensitivity of 46% and a specificity of 100%. The sensitivity of quantitative culture decreased with AFT to 16%. The GM EIA and qPCR assay had greater sensitivity than culture in detection of A. fumigatus in BAL fluid in experimentally induced IPA (P ≤ 0.04). Use of the GM EIA and qPCR assay in conjunction with culture-based diagnostic methods applied to BAL fluid could facilitate accurate diagnosis and more timely initiation of specific therapy. PMID:16825367
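Sensitivity and specificity at a fixed assay cutoff, as reported for the GM EIA above, reduce to simple counting over the diseased and control groups. The sketch below uses hypothetical galactomannan index values, not the study's rabbit data:

```python
def sens_spec(diseased, healthy, cutoff):
    """Sensitivity and specificity of a continuous assay when values
    at or above the cutoff are called positive."""
    tp = sum(v >= cutoff for v in diseased)   # true positives
    tn = sum(v < cutoff for v in healthy)     # true negatives
    return tp / len(diseased), tn / len(healthy)

# Hypothetical GM index values in infected vs. control animals
infected = [1.2, 0.9, 2.4, 0.8, 1.6]
controls = [0.3, 0.5, 0.2, 0.7, 0.4]
sens, spec = sens_spec(infected, controls, cutoff=0.75)
```

An "optimal cutoff" in ROC analysis is the value of `cutoff` that best trades these two quantities off; here the invented data happen to separate perfectly at 0.75.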
Granvogl, Michael
2014-02-12
Three stable isotope dilution assays (SIDAs) were developed for the quantitation of (E)-2-butenal (crotonaldehyde) in heat-processed edible fats and oils as well as in food using synthesized [¹³C₄]-crotonaldehyde as internal standard. First, a direct headspace GC-MS method, followed by two indirect methods on the basis of derivatization with either pentafluorophenylhydrazine (GC-MS) or 2,4-dinitrophenylhydrazine (LC-MS/MS), was developed. All methods are also suitable for the quantitation of acrolein using the standard [¹³C₃]-acrolein. Applying these three methods on five different types of fats and oils varying in their fatty acid compositions revealed significantly varying crotonaldehyde concentrations for the different samples, but nearly identical quantitative data for all methods. Formed amounts of crotonaldehyde were dependent not only on the type of oil, e.g., 0.29-0.32 mg/kg of coconut oil or 33.9-34.4 mg/kg of linseed oil after heat-processing for 24 h at 180 °C, but also on the applied temperature and time. The results indicated that the concentration of formed crotonaldehyde seemed to be correlated with the amount of linolenic acid in the oils. Furthermore, the formation of crotonaldehyde was compared to that of its homologue acrolein, demonstrating that acrolein was always present in higher amounts in heat-processed oils, e.g., 12.3 mg of crotonaldehyde/kg of rapeseed oil in comparison to 23.4 mg of acrolein/kg after 24 h at 180 °C. Finally, crotonaldehyde was also quantitated in fried food, revealing concentrations from 12 to 25 μg/kg for potato chips and from 8 to 19 μg/kg for donuts, depending on the oil used.
A quantitative comparison of leading-edge vortices in incompressible and supersonic flows
DOT National Transportation Integrated Search
2002-01-14
When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of database owing to difficulties that pl...
Comparison of large-scale structures and velocities in the local universe
NASA Technical Reports Server (NTRS)
Yahil, Amos
1994-01-01
Comparison of the large-scale density and velocity fields in the local universe shows detailed agreement, strengthening the standard paradigm of the gravitational origin of these structures. Quantitative analysis can determine the cosmological density parameter, Omega, and biasing factor, b; there is virtually no sensitivity in any local analyses to the cosmological constant, lambda. Comparison of the dipole anisotropy of the cosmic microwave background with the acceleration due to the Infrared Astronomy Satellite (IRAS) galaxies puts the linear growth factor beta = Omega^0.6/b in the range 0.6 (+0.7/-0.3) (95% confidence). A direct comparison of the density and velocity fields of nearby galaxies gives beta = 1.3 (+0.7/-0.6), and nonlinear analysis yields the weaker limit Omega > 0.45 for b > 0.5 (again 95% confidence). A tighter limit, Omega > 0.3 (4-6 sigma), is obtained by a reconstruction of the probability distribution function of the initial fluctuations from which the structures observed today arose. The last two methods depend critically on the smooth velocity field determined from the observed velocities of nearby galaxies by the POTENT method. A new analysis of these velocities, with more than three times the data used to obtain the above quoted results, is now underway and promises to tighten the uncertainties considerably, as well as reduce systematic bias.
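The linear growth factor quoted above is a simple algebraic relation, beta = Omega^0.6 / b, which can be evaluated or inverted directly; a minimal sketch:

```python
def growth_factor(omega, b):
    """Linear redshift-distortion parameter beta = Omega**0.6 / b."""
    return omega ** 0.6 / b

def omega_from_beta(beta, b):
    """Invert beta = Omega**0.6 / b for the density parameter Omega,
    given an assumed bias factor b."""
    return (beta * b) ** (1.0 / 0.6)
```

For example, the quoted beta = 0.6 with b = 1 corresponds to Omega = 0.6^(1/0.6) ≈ 0.43, which is why beta measurements alone constrain only the combination Omega^0.6/b.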
Physical validation of a patient-specific contact finite element model of the ankle.
Anderson, Donald D; Goldsworthy, Jane K; Li, Wendy; James Rudert, M; Tochigi, Yuki; Brown, Thomas D
2007-01-01
A validation study was conducted to determine the extent to which computational ankle contact finite element (FE) results agreed with experimentally measured tibio-talar contact stress. Two cadaver ankles were loaded in separate test sessions, during which ankle contact stresses were measured with a high-resolution (Tekscan) pressure sensor. Corresponding contact FE analyses were subsequently performed for comparison. The agreement was good between FE-computed and experimentally measured mean (3.2% discrepancy for one ankle, 19.3% for the other) and maximum (1.5% and 6.2%) contact stress, as well as for contact area (1.7% and 14.9%). There was also excellent agreement between histograms of fractional areas of cartilage experiencing specific ranges of contact stress. Finally, point-by-point comparisons between the computed and measured contact stress distributions over the articular surface showed substantial agreement, with correlation coefficients of 90% for one ankle and 86% for the other. In the past, general qualitative, but little direct quantitative agreement has been demonstrated with articular joint contact FE models. The methods used for this validation enable formal comparison of computational and experimental results, and open the way for objective statistical measures of regional correlation between FE-computed contact stress distributions from comparison articular joint surfaces (e.g., those from an intact versus those with residual intra-articular fracture incongruity).
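The point-by-point agreement reported above (correlation coefficients of 90% and 86%) is an ordinary Pearson correlation between computed and measured contact stress over matched surface locations. The sketch below uses hypothetical stress samples, not the cadaver data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two stress distributions sampled
    at matching articular surface locations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

# Hypothetical contact stress (MPa) at five matched locations
fe_model = [2.1, 4.8, 6.3, 3.0, 5.5]
measured = [2.4, 4.5, 6.0, 3.3, 5.1]
r = pearson_r(fe_model, measured)
```

High r over many surface points is the kind of objective regional-agreement statistic the authors argue such validations enable.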
Social Comparison and Body Image in Adolescence: A Grounded Theory Approach
ERIC Educational Resources Information Center
Krayer, A.; Ingledew, D. K.; Iphofen, R.
2008-01-01
This study explored the use of social comparison appraisals in adolescents' lives with particular reference to enhancement appraisals which can be used to counter threats to the self. Social comparison theory has been increasingly used in quantitative research to understand the processes through which societal messages about appearance influence…
Target Scattering Metrics: Model-Model and Model-Data Comparisons
2017-12-13
measured synthetic aperture sonar (SAS) data or from numerical models is investigated. Metrics are needed for quantitative comparisons for signals...candidate metrics for model-model comparisons are examined here with a goal to consider raw data prior to its reduction to data products, which may...be suitable for input to classification schemes. The investigated metrics are then applied to model-data comparisons. INTRODUCTION Metrics for
Mickenautsch, Steffen; Yengopal, Veerasamy
2013-01-01
Background Naïve-indirect comparisons are comparisons between competing clinical interventions’ evidence from separate (uncontrolled) trials. Direct comparisons are comparisons within randomised control trials (RCTs). The objective of this empirical study is to test the null hypothesis that trends and performance differences inferred from naïve-indirect comparisons and from direct comparisons/RCTs regarding the failure rates of amalgam and direct high-viscosity glass-ionomer cement (HVGIC) restorations in permanent posterior teeth have similar direction and magnitude. Methods A total of 896 citations were identified through a systematic literature search. From these, ten and two uncontrolled clinical longitudinal studies for HVGIC and amalgam, respectively, were included for naïve-indirect comparison and could be matched with three out of twenty RCTs. Summary effect sizes were computed as odds ratios (OR; 95% confidence intervals) and compared with those from RCTs. Trend directions were inferred from 95% confidence interval overlaps and the direction of point estimates; magnitudes of performance differences were inferred from the median point estimates (OR) with 25% and 75% percentile range, for both types of comparison. The Mann-Whitney U test was applied to test for statistically significant differences between the point estimates of both comparison types. Results Trends and performance differences inferred from naïve-indirect comparison based on evidence from uncontrolled clinical longitudinal studies and from direct comparisons based on RCT evidence are not the same. The distributions of the point estimates differed significantly for both comparison types (Mann-Whitney U = 25, n_indirect = 26, n_direct = 8; p = 0.0013, two-tailed). Conclusion The null hypothesis was rejected. Trends and performance differences inferred from either comparison between HVGIC and amalgam restoration failure rates in permanent posterior teeth are not the same.
It is recommended that clinical practice guidance regarding HVGICs should rest on direct comparisons via RCTs and not on naïve-indirect comparisons based on uncontrolled longitudinal studies in order to avoid inflation of effect estimates. PMID:24205220
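The summary effect sizes in the study above are odds ratios with log-scale confidence intervals, a standard calculation from a 2x2 table. The restoration counts below are invented for illustration, not the review's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table (a, b = failures, successes in
    group 1; c, d = failures, successes in group 2) with a log-scale
    confidence interval: exp(ln OR +/- z * sqrt(1/a+1/b+1/c+1/d))."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0/a + 1.0/b + 1.0/c + 1.0/d)
    return or_, (math.exp(math.log(or_) - z * se),
                 math.exp(math.log(or_) + z * se))

# Invented failure counts: 8 of 120 restorations vs. 6 of 115
or_, (low, high) = odds_ratio_ci(8, 112, 6, 109)
```

When the interval spans 1, as here, no significant difference can be claimed; the paper's point is that such estimates inflate when the two arms come from separate uncontrolled studies rather than the same RCT.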
Magnetooptics of Exciton Rydberg States in a Monolayer Semiconductor
NASA Astrophysics Data System (ADS)
Stier, A. V.; Wilson, N. P.; Velizhanin, K. A.; Kono, J.; Xu, X.; Crooker, S. A.
2018-02-01
We report 65 T magnetoabsorption spectroscopy of exciton Rydberg states in the archetypal monolayer semiconductor WSe2. The strongly field-dependent and distinct energy shifts of the 2s, 3s, and 4s excited neutral excitons permit their unambiguous identification and allow for quantitative comparison with leading theoretical models. Both the sizes (via low-field diamagnetic shifts) and the energies of the ns exciton states agree remarkably well with detailed numerical simulations using the nonhydrogenic screened Keldysh potential for 2D semiconductors. Moreover, at the highest magnetic fields, the nearly linear diamagnetic shifts of the weakly bound 3s and 4s excitons provide a direct experimental measure of the exciton's reduced mass, m_r = 0.20 ± 0.01 m_0.
Ampe, Frédéric
2000-01-01
Based on 16S rRNA sequence comparison, we have designed a 20-mer oligonucleotide that targets a region specific to the species Lactobacillus manihotivorans recently isolated from sour cassava fermentation. The probe recognized the rRNA obtained from all the L. manihotivorans strains tested but did not recognize 56 strains of microorganisms from culture collections or directly isolated from sour cassava, including 29 species of lactic acid bacteria. This probe was then successfully used in quantitative RNA blots and demonstrated the importance of L. manihotivorans in the fermentation of sour cassava starch, which could represent up to 20% of total lactic acid bacteria. PMID:10788405
A QUANTITATIVE COMPARISON OF LUNAR ORBITAL NEUTRON DATA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eke, V. R.; Teodoro, L. F. A.; Lawrence, D. J.
2012-03-01
Data from the Lunar Exploration Neutron Detector (LEND) Collimated Sensors for Epithermal Neutrons (CSETN) are used in conjunction with a model based on results from the Lunar Prospector (LP) mission to quantify the extent of the background in the LEND CSETN. A simple likelihood analysis implies that at least 90% of the lunar component of the LEND CSETN flux results from high-energy epithermal (HEE) neutrons passing through the walls of the collimator. Thus, the effective FWHM of the LEND CSETN field of view is comparable to that of the omni-directional LP Neutron Spectrometer. The resulting map of HEE neutrons offers the opportunity to probe the hydrogen abundance at low latitudes and to provide constraints on the distribution of lunar water.
Roughness induced transition and heat transfer augmentation in hypersonic environments
NASA Astrophysics Data System (ADS)
Wassel, A. T.; Shih, W. C. L.; Courtney, J. F.
Boundary layer transition and surface heating distributions on graphite, fine weave carbon-carbon, and metallic nosetip materials were derived from surface temperature responses measured in nitrogen environments during both free-flight and track-guided testing in hypersonic environments. Innovative test procedures were developed, and heat transfer results were validated against established theory through experiments using a super-smooth tungsten model. Quantitative definitions of mean transition front locations were established by deriving heat flux distributions from measured temperatures, and comparisons made with existing nosetip transition correlations. Qualitative transition locations were inferred directly from temperature distributions to investigate preferred orientations on fine weave nosetips. Levels of roughness augmented heat transfer were generally shown to be below values predicted by state-of-the-art methods.
Multiple ionization of neon by soft x-rays at ultrahigh intensity
NASA Astrophysics Data System (ADS)
Guichard, R.; Richter, M.; Rost, J.-M.; Saalmann, U.; Sorokin, A. A.; Tiedtke, K.
2013-08-01
At the free-electron laser FLASH, multiple ionization of neon atoms was quantitatively investigated at photon energies of 93.0 and 90.5 eV. For ion charge states up to 6+, we compare the respective absolute photoionization yields with results from a minimal model and an elaborate description including standard sequential and direct photoionization channels. Both approaches are based on rate equations and take into account a Gaussian spatial intensity distribution of the laser beam. From the comparison we conclude that photoionization up to a charge of 5+ can be described by the minimal model which we interpret as sequential photoionization assisted by electron shake-up processes. For higher charges, the experimental ionization yields systematically exceed the elaborate rate-based prediction.
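Rate-equation descriptions of sequential photoionization like the minimal model above can be integrated directly: each charge state q is depleted at a rate proportional to the photon flux and its cross-section, feeding state q+1. The sketch below uses illustrative cross-sections (not neon values) and omits the Gaussian spatial intensity averaging the authors include:

```python
def sequential_yields(flux, sigmas, t_pulse, dt=1e-3):
    """Forward-Euler solution of rate equations for sequential
    photoionization: p[q] is the population of charge state q and
    sigmas[q] the one-photon cross-section for q -> q+1
    (flux, cross-sections, and pulse length in arbitrary units)."""
    p = [1.0] + [0.0] * len(sigmas)
    for _ in range(int(t_pulse / dt)):
        rates = [flux * s * p[q] for q, s in enumerate(sigmas)]
        for q, r in enumerate(rates):
            p[q] -= r * dt       # depletion of state q
            p[q + 1] += r * dt   # feeding of state q+1
    return p

populations = sequential_yields(1.0, [1.0, 0.5], 1.0)
```

Population is conserved by construction, and the intermediate charge states peak before the highest one grows in, which is the qualitative signature used when comparing such models to measured ion yields.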
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berkelbach, Timothy C., E-mail: tcb2112@columbia.edu; Reichman, David R., E-mail: drr2103@columbia.edu; Hybertsen, Mark S., E-mail: mhyberts@bnl.gov
We extend our previous work on singlet exciton fission in isolated dimers to the case of crystalline materials, focusing on pentacene as a canonical and concrete example. We discuss the proper interpretation of the character of low-lying excited states of relevance to singlet fission. In particular, we consider a variety of metrics for measuring charge-transfer character, conclusively demonstrating significant charge-transfer character in the low-lying excited states. The impact of this electronic structure on the subsequent singlet fission dynamics is assessed by performing real-time master-equation calculations involving hundreds of quantum states. We make direct comparisons with experimental absorption spectra and singlet fission rates, finding good quantitative agreement in both cases, and we discuss the mechanistic distinctions that exist between small isolated aggregates and bulk systems.
Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides
Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...
Radicevic, Zoran; Jelicic Dobrijevic, Ljiljana; Sovilj, Mirjana; Barlov, Ivana
2009-06-01
Aim of the research was to examine similarities and differences between the periods of experiencing visually stimulated directed speech-language information and periods of undirected attention. The examined group comprised N = 64 children, aged 4-5, with different speech-language disorders (developmental dysphasia, hyperactive syndrome with attention disorder, children with borderline intellectual abilities, autistic complex). Theta EEG was registered in children in the period of watching and describing the picture ("task"), and in the period of undirected attention ("passive period"). The children were recorded in standard EEG conditions, at 19 points of EEG registration and in longitudinal bipolar montage. Results in the observed age-operative theta rhythm indicated significant similarities and differences in the prevalence of spatial engagement of certain regions between the two hemispheres at the input and output of processing, which opens the possibility for more detailed analysis of conscious control of speech-language processing and its disorders.
Horowitz, A.J.
1986-01-01
Centrifugation, settling/centrifugation, and backflush-filtration procedures have been tested for the concentration of suspended sediment from water for subsequent trace-metal analysis. Either of the first two procedures is comparable with in-line filtration and can be carried out precisely, accurately, and with a facility that makes the procedures amenable to large-scale sampling and analysis programs. There is less potential for post-sampling alteration of suspended sediment-associated metal concentrations with the centrifugation procedure because sample stabilization is accomplished more rapidly than with settling/centrifugation. Sample preservation can be achieved by chilling. Suspended sediment associated metal levels can best be determined by direct analysis but can also be estimated from the difference between a set of unfiltered-digested and filtered subsamples. However, when suspended sediment concentrations (<150 mg/L) or trace-metal levels are low, the direct analysis approach makes quantitation more accurate and precise and can be accomplished with simpler analytical procedures.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruban, V. P., E-mail: ruban@itp.ac.ru
2015-05-15
The nonlinear dynamics of an obliquely oriented wave packet on a sea surface is analyzed analytically and numerically for various initial parameters of the packet in relation to the problem of the so-called rogue waves. Within the Gaussian variational ansatz applied to the corresponding (1+2)-dimensional hyperbolic nonlinear Schrödinger equation (NLSE), a simplified Lagrangian system of differential equations is derived that describes the evolution of the coefficients of the real and imaginary quadratic forms appearing in the Gaussian. This model provides a semi-quantitative description of the process of nonlinear spatiotemporal focusing, which is one of the most probable mechanisms of rogue wave formation in random wave fields. The system of equations is integrated in quadratures, which allows one to better understand the qualitative differences between linear and nonlinear focusing regimes of a wave packet. Predictions of the Gaussian model are compared with the results of direct numerical simulation of fully nonlinear long-crested waves.
Reverse phase protein microarrays: fluorometric and colorimetric detection.
Gallagher, Rosa I; Silvestri, Alessandra; Petricoin, Emanuel F; Liotta, Lance A; Espina, Virginia
2011-01-01
The Reverse Phase Protein Microarray (RPMA) is an array platform used to quantitate proteins and their posttranslationally modified forms. RPMAs are applicable for profiling key cellular signaling pathways and protein networks, allowing direct comparison of the activation state of proteins from multiple samples within the same array. The RPMA format consists of proteins immobilized directly on a nitrocellulose substratum. The analyte is subsequently probed with a primary antibody and a series of reagents for signal amplification and detection. Due to the diversity, low concentration, and large dynamic range of protein analytes, RPMAs require stringent signal amplification methods, high quality image acquisition, and software capable of precisely analyzing spot intensities on an array. Microarray detection strategies can be either fluorescent or colorimetric. The choice of a detection system depends on (a) the expected analyte concentration, (b) type of microarray imaging system, and (c) type of sample. The focus of this chapter is to describe RPMA detection and imaging using fluorescent and colorimetric (diaminobenzidine (DAB)) methods.
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of impact for public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. To explore the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to and increasing participation of under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis and in deciding what variables to measure may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors, and this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.
[Development of an Excel spreadsheet for meta-analysis of indirect and mixed treatment comparisons].
Tobías, Aurelio; Catalá-López, Ferrán; Roqué, Marta
2014-01-01
Meta-analyses in clinical research usually aim to evaluate treatment efficacy and safety in direct comparison with a single comparator. Indirect comparisons, using Bucher's method, can summarize primary data when information from direct comparisons is limited or nonexistent. Mixed comparisons combine estimates from direct and indirect comparisons, increasing statistical power. There is a need for simple applications for meta-analysis of indirect and mixed comparisons, and these can easily be conducted using a Microsoft Office Excel spreadsheet. We developed a user-friendly spreadsheet for indirect and mixed comparisons, aimed at clinical researchers interested in systematic reviews but unfamiliar with more advanced statistical packages. The proposed Excel spreadsheet for indirect and mixed comparisons can be of great use in clinical epidemiology, extending the knowledge provided by traditional meta-analysis when evidence from direct comparisons is limited or nonexistent.
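The Bucher adjusted indirect comparison described above reduces to simple arithmetic on an additive scale (e.g., log odds ratios): the indirect estimate of A versus C via a common comparator B is the difference of the two direct estimates, with the variances adding. The sketch below is illustrative only; the function names and the inverse-variance weighting used for the mixed estimate are our own framing, not the spreadsheet's implementation.

```python
import math

def bucher_indirect(d_ab, se_ab, d_cb, se_cb):
    """Adjusted indirect comparison (Bucher): effect of A vs C via the
    common comparator B, on an additive scale such as the log odds ratio."""
    d = d_ab - d_cb
    se = math.sqrt(se_ab ** 2 + se_cb ** 2)
    return d, se

def mixed_estimate(d_dir, se_dir, d_ind, se_ind):
    """Combine direct and indirect estimates by inverse-variance weighting."""
    w_dir, w_ind = 1.0 / se_dir ** 2, 1.0 / se_ind ** 2
    d = (w_dir * d_dir + w_ind * d_ind) / (w_dir + w_ind)
    se = math.sqrt(1.0 / (w_dir + w_ind))
    return d, se
```

For example, with log odds ratios d_AB = -0.5 (SE 0.2) and d_CB = -0.3 (SE 0.2), the indirect A-vs-C estimate is -0.2 with SE sqrt(0.08) ≈ 0.28; the inflated standard error is exactly why mixed comparisons add power when direct evidence also exists.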
Bayés, Àlex; Collins, Mark O.; Croning, Mike D. R.; van de Lagemaat, Louie N.; Choudhary, Jyoti S.; Grant, Seth G. N.
2012-01-01
Direct comparison of protein components from human and mouse excitatory synapses is important for determining the suitability of mice as models of human brain disease and to understand the evolution of the mammalian brain. The postsynaptic density is a highly complex set of proteins organized into molecular networks that play a central role in behavior and disease. We report the first direct comparison of the proteome of triplicate isolates of mouse and human cortical postsynaptic densities. The mouse postsynaptic density comprised 1556 proteins and the human one 1461. A large compositional overlap was observed; more than 70% of human postsynaptic density proteins were also observed in the mouse postsynaptic density. Quantitative analysis of postsynaptic density components in both species indicates a broadly similar profile of abundance but also shows that there is higher abundance variation between species than within species. Well known components of this synaptic structure are generally more abundant in the mouse postsynaptic density. Significant inter-species abundance differences exist in some families of key postsynaptic density proteins including glutamatergic neurotransmitter receptors and adaptor proteins. Furthermore, we have identified a closely interacting set of molecules enriched in the human postsynaptic density that could be involved in dendrite and spine structural plasticity. Understanding synapse proteome diversity within and between species will be important to further our understanding of brain complexity and disease. PMID:23071613
Janiszewska-Olszowska, Joanna; Tandecka, Katarzyna; Szatkiewicz, Tomasz; Sporniak-Tutak, Katarzyna; Grocholewicz, Katarzyna
2014-01-01
Aims: Presenting a new method for direct, quantitative analysis of the enamel surface; measurement of adhesive remnants and enamel loss resulting from debonding molar tubes. Material and methods: Buccal surfaces of fifteen extracted human molars were directly scanned with an optic blue-light 3D scanner to the nearest 2 μm. After 20 s of etching, molar tubes were bonded and, after 24 h of storage in 0.9% saline, debonded. Then 3D scanning was repeated. Superimposition and comparison were performed, and shape alterations of the entire objects were analyzed using specialized computer software. Residual adhesive heights as well as enamel loss depths were obtained for the entire buccal surfaces. Residual adhesive volume and enamel loss volume were calculated for every tooth. Results: The maximum height of adhesive remaining on the enamel surface was 0.76 mm, and the volume on particular teeth ranged from 0.047 mm3 to 4.16 mm3. The median adhesive remnant volume was 0.988 mm3. Mean depths of enamel loss for particular teeth ranged from 0.0076 mm to 0.0416 mm. The highest maximum depth of enamel loss was 0.207 mm. The median volume of enamel loss was 0.104 mm3 and the maximum volume was 1.484 mm3. Conclusions: Blue-light 3D scanning can provide direct, precise scans of the enamel surface, which can be superimposed in order to calculate shape alterations. Debonding molar tubes leaves a certain amount of adhesive remnants on the enamel; however, the interface fracture pattern varies for particular teeth, and areas of enamel loss are present as well. PMID:25208969
Freed, Melanie; de Zwart, Jacco A; Hariharan, Prasanna; Myers, Matthew R; Badano, Aldo
2011-10-01
To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions.
A comparison of manual and quantitative elbow strength testing.
Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R
2012-10-01
The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.
Jung, Yongsik; Jeong, Seong Kyun; Kang, Doo Kyoung; Moon, Yeorae; Kim, Tae Hee
2018-06-01
We quantitatively analyzed background parenchymal enhancement (BPE) in the whole breast according to the menstrual cycle and compared it with a qualitative analysis method. A data set of breast magnetic resonance imaging (MRI) from 273 breast cancer patients was used. For quantitative analysis, we used semiautomated in-house software with MATLAB. From each voxel of the whole breast, the software calculated BPE using the following equation: [(signal intensity [SI] at 1 min 30 s after contrast injection - baseline SI)/baseline SI] × 100%. In total, 53 patients had minimal, 108 mild, 87 moderate, and 25 marked BPE. On quantitative analysis, mean BPE values were 33.1% in the minimal, 42.1% in the mild, 59.1% in the moderate, and 81.9% in the marked BPE group, showing a significant difference (p = .009 for minimal vs. mild, p < 0.001 for other comparisons). Spearman's correlation test showed a strong significant correlation between qualitative and quantitative BPE (r = 0.63, p < 0.001). The mean BPE value was 48.7% for patients in the first week of the menstrual cycle, 43.5% in the second week, 49% in the third week, and 49.4% in the fourth week. The difference between the second and fourth weeks was significant (p = .005). Median, 90th percentile, and 10th percentile values were also significantly different between the second and fourth weeks but not in other comparisons (first vs. second, first vs. third, first vs. fourth, second vs. third, or third vs. fourth). Quantitative analysis of BPE correlated well with the qualitative BPE grade. Quantitative BPE values were lowest in the second week and highest in the fourth week. Copyright © 2018 Elsevier B.V. All rights reserved.
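The voxel-wise enhancement equation above is straightforward to vectorize. A minimal sketch follows; the function name is our assumption, and the study used in-house MATLAB software, so this Python version is illustrative only.

```python
import numpy as np

def percent_enhancement(si_post, si_base):
    """Voxel-wise BPE (%): [(SI at ~1 min 30 s post-contrast - baseline SI)
    / baseline SI] * 100, applied element-wise over the breast voxels."""
    si_base = np.asarray(si_base, dtype=float)
    si_post = np.asarray(si_post, dtype=float)
    return (si_post - si_base) / si_base * 100.0
```

Summary statistics such as the mean, median, and 10th/90th percentiles reported in the study would then be computed over the resulting per-voxel BPE map.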
NASA Astrophysics Data System (ADS)
Zhao, Minzhi; Li, Haiyun; Liu, Xiaochen; Wei, Jie; Ji, Jianguo; Yang, Shu; Hu, Zhiyuan; Wei, Shicheng
2016-03-01
Nano-sized hydroxyapatite (n-HA) is considered a bioactive material and is often mixed into the bone implant material polyetheretherketone (PEEK). To reveal the global protein expression modulations of osteoblasts in response to direct contact with a PEEK composite containing a high level (40%) of nano-sized hydroxyapatite (n-HA/PEEK), and to explain its comprehensive bio-effects, quantitative proteomic analysis was conducted on human osteoblast-like MG-63 cells cultured on n-HA/PEEK in comparison with pure PEEK. Results from quantitative proteomic analysis showed that the most enriched categories among the up-regulated proteins related to calcium ion processes and associated functions, while the most enriched categories among the down-regulated proteins related to RNA processing. This enhanced our understanding of the molecular mechanism by which the n-HA/PEEK composite promotes cell adhesion and differentiation while inhibiting cell proliferation. It also showed that although the calcium ion level of the incubation environment had not increased, the calcium fixed on the surface of the material alone influenced intracellular calcium-related processes, which was also reflected by the higher intracellular Ca2+ concentration on n-HA/PEEK. This study could lead to a more comprehensive understanding of the versatile biocompatibility of composite materials. It further proves that proteomics is useful in discovering new bio-effects.
Shultzaberger, Ryan K.; Paddock, Mark L.; Katsuki, Takeo; Greenspan, Ralph J.; Golden, Susan S.
2016-01-01
The temporal measurement of a bioluminescent reporter has proven to be one of the most powerful tools for characterizing circadian rhythms in the cyanobacterium Synechococcus elongatus. Primarily, two approaches have been used to automate this process: (1) detection of cell culture bioluminescence in 96-well plates by a photomultiplier tube-based plate-cycling luminometer (TopCount Microplate Scintillation and Luminescence Counter, Perkin Elmer) and (2) detection of individual colony bioluminescence by iteratively rotating a Petri dish under a cooled CCD camera using a computer-controlled turntable. Each approach has distinct advantages. The TopCount provides a more quantitative measurement of bioluminescence, enabling the direct comparison of clock output levels among strains. The computer-controlled turntable approach has a shorter set-up time and greater throughput, making it a more powerful phenotypic screening tool. While the latter approach is extremely useful, only a few labs have been able to build such an apparatus because of technical hurdles involved in coordinating and controlling both the camera and the turntable, and in processing the resulting images. This protocol provides instructions on how to construct, use, and process data from a computer-controlled turntable to measure the temporal changes in bioluminescence of individual cyanobacterial colonies. Furthermore, we describe how to prepare samples for use with the TopCount to minimize experimental noise, and generate meaningful quantitative measurements of clock output levels for advanced analysis. PMID:25662451
Mendoza-Parra, Marco-Antonio; Saravaki, Vincent; Cholley, Pierre-Etienne; Blum, Matthias; Billoré, Benjamin; Gronemeyer, Hinrich
2016-01-01
We have established a certification system for antibodies to be used in chromatin immunoprecipitation assays coupled to massively parallel sequencing (ChIP-seq). This certification comprises a standardized ChIP procedure and the attribution of a numerical quality control indicator (QCi) to biological replicate experiments. The QCi computation is based on a universally applicable quality assessment that quantifies the global deviation of randomly sampled subsets of a ChIP-seq dataset from the original genome-aligned sequence reads. Comparison with a QCi database of >28,000 ChIP-seq assays was used to attribute quality grades (ranging from 'AAA' to 'DDD') to a given dataset. In the present report we used the numerical QC system to assess the factors influencing the quality of ChIP-seq assays, including the nature of the target, the sequencing depth, and the commercial source of the antibody. We used this approach specifically to certify monoclonal and polyclonal antibodies obtained from Active Motif directed against the histone modification marks H3K4me3, H3K27ac and H3K9ac for ChIP-seq. The antibodies received grades from AAA to BBC ( www.ngs-qc.org). We propose to attribute such quantitative grading to all antibodies carrying the label "ChIP-seq grade".
Taiwo, Oluwadamilola O.; Finegan, Donal P.; Eastwood, David S.; Fife, Julie L.; Brown, Leon D.; Darr, Jawwad A.; Lee, Peter D.; Brett, Daniel J. L.; Shearing, Paul R.
2016-01-01
Lithium-ion battery performance is intrinsically linked to electrode microstructure. Quantitative measurement of key structural parameters of lithium-ion battery electrode microstructures will enable optimization as well as motivate systematic numerical studies for the improvement of battery performance. With the rapid development of 3-D imaging techniques, quantitative assessment of 3-D microstructures from 2-D image sections by stereological methods appears outmoded; however, in spite of the proliferation of tomographic imaging techniques, it remains significantly easier to obtain two-dimensional (2-D) data sets. In this study, stereological prediction and three-dimensional (3-D) analysis techniques for quantitative assessment of key geometric parameters for characterizing battery electrode microstructures are examined and compared. Lithium-ion battery electrodes were imaged using synchrotron-based X-ray tomographic microscopy. For each electrode sample investigated, stereological analysis was performed on reconstructed 2-D image sections generated from tomographic imaging, whereas direct 3-D analysis was performed on reconstructed image volumes. The analysis showed that geometric parameter estimation using 2-D image sections is bound to be associated with ambiguity and that volume-based 3-D characterization of nonconvex, irregular and interconnected particles can be used to more accurately quantify spatially-dependent parameters, such as tortuosity and pore-phase connectivity. PMID:26999804
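The stereological prediction discussed above rests on classical relations such as the Delesse principle, which equates the area fraction of a phase measured on random 2-D sections with its 3-D volume fraction. A minimal sketch, illustrative only (the function names and binary-mask inputs are our assumptions, not the authors' analysis pipeline):

```python
import numpy as np

def area_fraction(binary_section):
    """Delesse principle: the mean area fraction A_A of a phase on random
    2-D sections is an unbiased estimate of its volume fraction V_V."""
    return float(np.mean(np.asarray(binary_section, dtype=bool)))

def volume_fraction_3d(binary_volume):
    """Direct 3-D counterpart: voxel fraction of the phase in the volume."""
    return float(np.mean(np.asarray(binary_volume, dtype=bool)))
```

Note that while V_V can be recovered from sections this way, spatially dependent parameters such as tortuosity and pore-phase connectivity cannot, which is precisely the ambiguity of 2-D estimation that the study points to.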
What quantitative mechanical loading stimulates in vitro cultivation best?
Natenstedt, Jerry; Kok, Aimee C; Dankelman, Jenny; Tuijthof, Gabrielle Jm
2015-12-01
Articular cartilage has limited regeneration capacity. One of the factors that appears to affect the in vitro cultivation of articular cartilage is mechanical stimulation; so far, no combination of parameters has been identified that offers the best results. The goal is to review the literature in search of the best available set of quantitative mechanical stimuli leading to optimal in vitro cultivation. The databases Scopus and PubMed were used to survey the literature, and strict inclusion and exclusion criteria were applied regarding the presence of quantitative data. The review was performed by studying the type of loading (hydrostatic or direct compression), the loading magnitude, the frequency, and the loading regime (duration of loading) in comparison to quantitative evidence of cartilage quality response (cellular, signaling and mechanical). Thirty-three studies met all criteria, of which 8 studied human, 20 bovine, 2 equine, 1 ovine, 1 porcine and 1 canine cells using four different types of cultivated constructs. Six studies investigated loading magnitude within the same setup, three studies the frequency, and seven the loading regime. Nine studies presented mechanical tissue response. The studies suggest that a certain threshold exists for enhanced in vitro cultivation of cartilage explants (>20% strain and 0.5 Hz), and that chondrocyte-seeded constructs show the best results when loaded with physiological mechanical stimuli, that is, a loading pressure between 5 and 10 MPa and a loading frequency of 1 Hz, exerted at intermittent intervals for a week or longer. Critical aspects remain to be answered for translation into in vivo therapies.
NASA Astrophysics Data System (ADS)
Kainerstorfer, Jana M.; Amyot, Franck; Demos, Stavros G.; Hassan, Moinuddin; Chernomordik, Victor; Hitzenberger, Christoph K.; Gandjbakhche, Amir H.; Riley, Jason D.
2009-07-01
Quantitative assessment of skin chromophores in a non-invasive fashion is often desirable. In particular, pixel-wise assessment of blood volume and blood oxygenation is beneficial for improved diagnostics. We utilized a multi-spectral imaging system for acquiring diffuse reflectance images of healthy volunteers' lower forearms. Ischemia and reactive hyperemia were induced by occluding the upper arm with a pressure cuff for 5 min at 180 mmHg. Multi-spectral images were taken every 30 s, before, during and after occlusion. Image reconstruction for blood volume and blood oxygenation was performed using a two-layered skin model. As the images were taken in a non-contact way, strong artifacts related to the shape (curvature) of the arms were observed, making reconstruction of optical/physiological parameters highly inaccurate. We developed a curvature correction method, which is based on extracting the curvature directly from the acquired intensity images and does not require any additional measurements of the imaged object. The effectiveness of the algorithm was demonstrated on reconstruction results of blood volume and blood oxygenation for in vivo data during occlusion of the arm. Pixel-wise assessment of blood volume and blood oxygenation was made possible over the entire image area, and comparison of occlusion effects between veins and surrounding skin was performed. Induced ischemia during occlusion and reactive hyperemia afterwards were observed and quantitatively assessed. Furthermore, the influence of epidermal thickness on reconstruction results was evaluated, and the importance of exact knowledge of this parameter for fully quantitative assessment was pointed out.
Quantitative assessment of emphysema from whole lung CT scans: comparison with visual grading
NASA Astrophysics Data System (ADS)
Keller, Brad M.; Reeves, Anthony P.; Apanosovich, Tatiyana V.; Wang, Jianwei; Yankelevitz, David F.; Henschke, Claudia I.
2009-02-01
Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and for visual assessment by radiologists of the extent of disease present in the lungs. Several measures have been introduced for quantifying the extent of disease directly from CT data in order to complement the qualitative assessments made by radiologists. In this paper we compare emphysema index, mean lung density, histogram percentiles, and the fractal dimension against visual grade in order to evaluate how well quantitative scores predict radiologist visual scoring of emphysema from low-dose CT scans, and thus which measures can serve as surrogates for visual assessment. All measures were computed over nine divisions of the lung field (whole lung, individual lungs, and upper/middle/lower thirds of each lung) for each of 148 low-dose, whole-lung scans. In addition, a visual grade for each section was given by an expert radiologist. One-way ANOVA and multinomial logistic regression were used to determine the ability of the measures to predict visual grade from quantitative score. We found that all measures were able to distinguish between normal and severe grades (p<0.01), and between mild/moderate and all other grades (p<0.05). However, no measure was able to distinguish between mild and moderate cases. Approximately 65% prediction accuracy was achieved when using quantitative score to predict visual grade, rising to 73% if mild and moderate cases are treated as a single class.
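An emphysema index of the kind compared above is conventionally computed as the fraction of lung voxels below a fixed attenuation threshold. A minimal sketch follows; the -950 HU cutoff is a common choice in the literature and not necessarily the exact parameter used in this paper:

```python
def emphysema_index(lung_hu, threshold=-950):
    """Fraction of lung voxels whose attenuation (in Hounsfield units)
    falls below a density threshold; -950 HU is a commonly used cutoff
    for low-dose CT, assumed here for illustration."""
    voxels = list(lung_hu)
    return sum(1 for v in voxels if v < threshold) / len(voxels)

# Toy flattened voxel values standing in for a segmented lung field
index = emphysema_index([-1000, -960, -900, -800])  # 2 of 4 voxels below -950
```

In practice the same function would be applied per lung division (whole lung, each lung, and each third) to produce the regional scores described above.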
Porous polymer packings have been used successfully in many applications of direct aqueous injection gas chromatography. The authors have expanded the use of aqueous injection to the quantitative analysis of 68 alcohols, acetates, ketones, ethers, sulfides, aldehydes, diols, dion...
Fox, Bridget C; Devonshire, Alison S; Baradez, Marc-Olivier; Marshall, Damian; Foy, Carole A
2012-08-15
Single cell gene expression analysis can provide insights into development and disease progression by profiling individual cellular responses as opposed to reporting the global average of a population. Reverse transcription-quantitative polymerase chain reaction (RT-qPCR) is the "gold standard" for the quantification of gene expression levels; however, the technical performance of kits and platforms aimed at single cell analysis has not been fully defined in terms of sensitivity and assay comparability. We compared three kits using purification columns (PicoPure) or direct lysis (CellsDirect and Cells-to-CT) combined with a one- or two-step RT-qPCR approach using dilutions of cells and RNA standards to the single cell level. Single cell-level messenger RNA (mRNA) analysis was possible using all three methods, although the precision, linearity, and effect of lysis buffer and cell background differed depending on the approach used. The impact of using a microfluidic qPCR platform versus a standard instrument was investigated for potential variability introduced by preamplification of template or scaling down of the qPCR to nanoliter volumes using laser-dissected single cell samples. The two approaches were found to be comparable. These studies show that accurate gene expression analysis is achievable at the single cell level and highlight the importance of well-validated experimental procedures for low-level mRNA analysis.
Quantitative analysis of catch-up saccades during sustained pursuit.
de Brouwer, Sophie; Missal, Marcus; Barnes, Graham; Lefèvre, Philippe
2002-04-01
During visual tracking of a moving stimulus, primates orient their visual axis by combining two very different types of eye movements, smooth pursuit and saccades. The purpose of this paper was to investigate quantitatively the catch-up saccades occurring during sustained pursuit. We used a ramp-step-ramp paradigm to evoke catch-up saccades during sustained pursuit. In general, catch-up saccades followed the unexpected steps in position and velocity of the target. We observed catch-up saccades in the same direction as the smooth eye movement (forward saccades) as well as in the opposite direction (reverse saccades). We compared the main sequences of forward saccades, reverse saccades, and control saccades made to stationary targets. All three were significantly different from each other and were fully compatible with the hypothesis that the smooth pursuit component is added to the saccadic component during catch-up saccades. A multiple linear regression analysis was performed on the saccadic component to find the parameters determining the amplitude of catch-up saccades. We found that both position error and retinal slip are taken into account in catch-up saccade programming to predict the future trajectory of the moving target. We also demonstrated that the saccadic system needs a minimum period of approximately 90 ms to take changes in target trajectory into account. Finally, we reported a saturation (above 15 degrees/s) in the contribution of retinal slip to the amplitude of catch-up saccades.
NASA Astrophysics Data System (ADS)
Caley, T.; Roche, D. M.
2013-03-01
Oxygen stable isotopes (δ18O) are among the most widely used tools in paleoclimatology/paleoceanography. Simulating oxygen stable isotopes allows testing of how the past variability of these isotopes in water can be interpreted. By modelling the proxy directly in the model, the results can also be compared directly with the data. Water isotopes have been implemented in the global three-dimensional model of intermediate complexity iLOVECLIM, allowing fully coupled atmosphere-ocean simulations. In this study, we present the validation of the model results for present-day climate against a global database of oxygen stable isotopes in carbonates. The limitations of the model, together with the processes operating in the natural environment, reveal the complexity of using the continental calcite δ18O signal of speleothems for a data-model comparison exercise. On the contrary, the reconstructed surface ocean calcite δ18O signal in iLOVECLIM does show very good agreement with the late Holocene database (foraminifers) at global and regional scales. Our results indicate that temperature and the isotopic composition of the seawater are the main controls on the fossil δ18O signal recorded in foraminifer shells, and that depth habitat and seasonality play a role but are of secondary importance. We argue that a data-model comparison for surface ocean calcite δ18O in past climates, such as the last glacial maximum (≈21 000 yr), could constitute an interesting tool for mapping potential shifts of the frontal systems and circulation changes through time. Similarly, potential changes in intermediate oceanic circulation systems in the past could be documented by a data (benthic foraminifers)-model comparison exercise, although further investigations are necessary in order to quantitatively compare the results with data for the deep ocean.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-01-01
Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman analysis and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays prior to implementation in the clinical molecular laboratory.
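The Bland-Altman calculation at the center of the abstract above can be sketched in a few lines. This is a generic illustration with hypothetical paired assay values, not the authors' code:

```python
from statistics import mean, stdev

def bland_altman(x, y):
    """Bland-Altman statistics for paired measurements from two assays.

    Returns the mean difference (bias) and the 95% limits of agreement,
    bias +/- 1.96 * SD of the pairwise differences.
    """
    diffs = [a - b for a, b in zip(x, y)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample SD (n - 1 denominator)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired quantitative values from a new and a reference assay
new_assay = [0.10, 0.22, 0.35, 0.48, 0.60]
reference = [0.12, 0.20, 0.37, 0.50, 0.58]
bias, (lower, upper) = bland_altman(new_assay, reference)
# A constant error appears as a non-zero bias; a proportional error appears
# as a trend of the differences when plotted against the pairwise means.
```

This complements R2 because two assays can be tightly correlated (high R2) while still disagreeing by a constant or proportional offset that the bias and limits of agreement make explicit.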
Sergutina, A V; Rakhmanova, V I
2016-06-01
Monoamine oxidase activity was quantitatively assessed by a cytochemical method in brain structures (layers III and V of the sensorimotor cortex, caudate nucleus, nucleus accumbens, hippocampal CA3 field) of rats of the August line and the Wistar population with high and low locomotor activity in the open field test. Monoamine oxidase activity (substrate tryptamine) predominated in the nucleus accumbens of Wistar rats with high motor activity in comparison with rats with low locomotor activity. In August rats, enzyme activity (substrates tryptamine and serotonin) predominated in the hippocampus of animals with high motor activity. Comparison of August rats with low locomotor activity and Wistar rats with high motor activity (i.e., animals demonstrating maximum differences in motor function) revealed significantly higher activity of the enzyme (substrates tryptamine and serotonin) in the hippocampus of Wistar rats. The study demonstrates clear-cut morphochemical specificity of monoaminergic metabolism, based on differences in the cytochemical parameter "monoamine oxidase activity", in the studied brain structures responsible for the formation and realization of goal-directed behavior in Wistar and August rats.
Han, Z Y; Weng, W G
2011-05-15
In this paper, a qualitative and a quantitative risk assessment methods for urban natural gas pipeline network are proposed. The qualitative method is comprised of an index system, which includes a causation index, an inherent risk index, a consequence index and their corresponding weights. The quantitative method consists of a probability assessment, a consequences analysis and a risk evaluation. The outcome of the qualitative method is a qualitative risk value, and for quantitative method the outcomes are individual risk and social risk. In comparison with previous research, the qualitative method proposed in this paper is particularly suitable for urban natural gas pipeline network, and the quantitative method takes different consequences of accidents into consideration, such as toxic gas diffusion, jet flame, fire ball combustion and UVCE. Two sample urban natural gas pipeline networks are used to demonstrate these two methods. It is indicated that both of the two methods can be applied to practical application, and the choice of the methods depends on the actual basic data of the gas pipelines and the precision requirements of risk assessment. Crown Copyright © 2011. Published by Elsevier B.V. All rights reserved.
Burger, Stefan; Fraunholz, Thomas; Leirer, Christian; Hoppe, Ronald H W; Wixforth, Achim; Peter, Malte A; Franke, Thomas
2013-06-25
Phase decomposition in lipid membranes has been the subject of numerous investigations by both experiment and theoretical simulation, yet quantitative comparisons of the simulated data to the experimental results are rare. In this work, we present a novel way of comparing the temporal development of liquid-ordered domains obtained by numerically solving the Cahn-Hilliard equation and by inducing a phase transition in giant unilamellar vesicles (GUVs). Quantitative comparison is done by calculating the structure factor of the domain pattern. It turns out that the decomposition takes place in three distinct regimes in both experiment and simulation. These regimes are characterized by different rates of growth of the mean domain diameter, and there is quantitative agreement between experiment and simulation as to the duration of each regime and the absolute rate of growth in each regime.
There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method ...
Employment from Solar Energy: A Bright but Partly Cloudy Future.
ERIC Educational Resources Information Center
Smeltzer, K. K.; Santini, D. J.
A comparison of quantitative and qualitative employment effects of solar and conventional systems can prove the increased employment postulated as one of the significant secondary benefits of a shift from conventional to solar energy use. Current quantitative employment estimates show solar technology-induced employment to be generally greater…
ERIC Educational Resources Information Center
Arabacioglu, Taner; Akar-Vural, Ruken
2014-01-01
The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…
There are numerous quantitative real-time PCR (qPCR) assays available to detect and enumerate fecal pollution in ambient waters. Each assay employs distinct primers and probes that target different rRNA genes and microorganisms leading to potential variations in concentration es...
Andrew D. Bower; Bryce A. Richardson; Valerie Hipkins; Regina Rochefort; Carol Aubry
2011-01-01
Analysis of "neutral" molecular markers and "adaptive" quantitative traits are common methods of assessing genetic diversity and population structure. Molecular markers typically reflect the effects of demographic and stochastic processes but are generally assumed to not reflect natural selection. Conversely, quantitative (or "adaptive")...
Leadership and Culture-Building in Schools: Quantitative and Qualitative Understandings.
ERIC Educational Resources Information Center
Sashkin, Marshall; Sashkin, Molly G.
Understanding effective school leadership as a function of culture building through quantitative and qualitative analyses is the purpose of this paper. The two-part quantitative phase of the research focused on statistical measures of culture and leadership behavior directed toward culture building in the school. The first quantitative part…
One registration multi-atlas-based pseudo-CT generation for attenuation correction in PET/MRI.
Arabi, Hossein; Zaidi, Habib
2016-10-01
The outcome of a detailed assessment of various strategies for atlas-based whole-body bone segmentation from magnetic resonance imaging (MRI) was exploited to select the optimal parameters and setting, with the aim of proposing a novel one-registration multi-atlas (ORMA) pseudo-CT generation approach. The proposed approach consists of only one online registration between the target and reference images, regardless of the number of atlas images (N), while for the remaining atlas images, the pre-computed transformation matrices to the reference image are used to align them to the target image. The performance characteristics of the proposed method were evaluated and compared with conventional atlas-based attenuation map generation strategies (direct registration of the entire atlas images followed by voxel-wise weighting (VWW) and arithmetic averaging atlas fusion). To this end, four different positron emission tomography (PET) attenuation maps were generated via arithmetic averaging and VWW scheme using both direct registration and ORMA approaches as well as the 3-class attenuation map obtained from the Philips Ingenuity TF PET/MRI scanner commonly used in the clinical setting. The evaluation was performed based on the accuracy of extracted whole-body bones by the different attenuation maps and by quantitative analysis of resulting PET images compared to CT-based attenuation-corrected PET images serving as reference. The comparison of validation metrics regarding the accuracy of extracted bone using the different techniques demonstrated the superiority of the VWW atlas fusion algorithm achieving a Dice similarity measure of 0.82 ± 0.04 compared to arithmetic averaging atlas fusion (0.60 ± 0.02), which uses conventional direct registration. Application of the ORMA approach modestly compromised the accuracy, yielding a Dice similarity measure of 0.76 ± 0.05 for ORMA-VWW and 0.55 ± 0.03 for ORMA-averaging. 
The results of quantitative PET analysis followed the same trend, with less significant differences in terms of SUV bias, whereas massive improvements were observed compared to PET images corrected for attenuation using the 3-class attenuation map. The maximum absolute bias achieved by the VWW and VWW-ORMA methods was 6.4 ± 5.5 in the lung and 7.9 ± 4.8 in the bone, respectively. The proposed algorithm is capable of generating decent attenuation maps. The quantitative analysis revealed a good correlation between PET images corrected for attenuation using the proposed pseudo-CT generation approach and the corresponding CT images. The computational time is reduced by a factor of N at the expense of a modest decrease in quantitative accuracy, thus allowing a reasonable compromise between computing time and quantitative performance.
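The Dice similarity measure used above to score bone-segmentation overlap can be sketched as follows. This is a generic illustration on toy binary masks, not the authors' implementation:

```python
def dice(mask_a, mask_b):
    """Dice similarity coefficient between two flattened binary masks:
    2 * |A intersect B| / (|A| + |B|).
    1.0 means perfect overlap, 0.0 means no overlap."""
    a = [bool(v) for v in mask_a]
    b = [bool(v) for v in mask_b]
    intersection = sum(x and y for x, y in zip(a, b))
    total = sum(a) + sum(b)
    return 2.0 * intersection / total if total else 1.0

# Toy 1-D masks standing in for flattened 3-D bone segmentations
segmented = [1, 1, 0, 1, 0, 0]
reference = [1, 0, 0, 1, 1, 0]
score = dice(segmented, reference)  # 2*2 / (3+3) ~= 0.667
```

In the study, the same coefficient was computed between atlas-derived bone masks and the CT-derived reference, giving the 0.82, 0.76, 0.60 and 0.55 values reported above.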
Raineri, M; Traina, M; Rotolo, A; Candela, B; Lombardo, R M; Raineri, A A
1993-05-01
Thallium-201 scintigraphy is a widely used noninvasive procedure for the detection and prognostic assessment of patients with suspected or proven coronary artery disease. Thallium uptake can be evaluated by visual analysis or by quantitative interpretation. Quantitative scintigraphy enhances disease detection in individual coronary arteries and provides a more precise estimate of the amount of ischemic myocardium, distinguishing scar from hypoperfused tissue. Owing to the large amount of data, the analysis, interpretation and comparison of thallium uptake can be very complex. We designed a computer-based system for the interpretation of quantitative thallium-201 scintigraphy uptake data, built on a database (DataEase 4.2-DataEase Italia). Our software has the following functions: data storage; calculation; conversion of numerical data into different definitions classifying myocardial perfusion; uptake data comparison; automatic conclusions; and comparison of different scintigrams for the same patient. It comprises four sections: numeric analysis, descriptive analysis, automatic conclusion, and clinical remarks. We encoded appropriate information in the system as "logical paths" that use "IF ... THEN" rules. The software executes these rules in order to analyze the myocardial regions in the 3 phases of scintigraphic analysis (stress, redistribution, re-injection) and in the 3 projections (LAO 45 degrees, LAT, ANT), applying our uptake cutoff, and finally produces the automatic conclusions. For these reasons, our computer-based system could be considered a true "expert system".
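The "IF ... THEN" logic described above can be illustrated with a minimal sketch of a single-region rule. The cutoff value and category labels here are illustrative assumptions, not the paper's actual rules or uptake cutoffs:

```python
def classify_region(stress_uptake, redistribution_uptake, cutoff=0.5):
    """Toy IF ... THEN rules for one myocardial region.

    Uptake values are taken as normalized to the best-perfused region
    (0..1); the 0.5 cutoff and the category names are hypothetical,
    chosen only to show the rule structure.
    """
    if stress_uptake >= cutoff and redistribution_uptake >= cutoff:
        return "normal perfusion"
    if stress_uptake < cutoff and redistribution_uptake >= cutoff:
        return "reversible defect (ischemia)"
    return "fixed defect (scar)"

verdict = classify_region(0.3, 0.7)  # a stress defect that redistributes
```

A real system of this kind chains many such rules per region, phase, and projection before emitting an automatic conclusion.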
All you need is shape: Predicting shear banding in sand with LS-DEM
NASA Astrophysics Data System (ADS)
Kawamoto, Reid; Andò, Edward; Viggiani, Gioacchino; Andrade, José E.
2018-02-01
This paper presents discrete element method (DEM) simulations with experimental comparisons at multiple length scales, underscoring the crucial role of particle shape. The simulations build on technological advances in the DEM furnished by level sets (LS-DEM), which enable the mathematical representation of the surface of arbitrarily shaped particles such as grains of sand. We show that this ability to model shape enables unprecedented capture of the mechanics of granular materials across scales ranging from macroscopic behavior to local behavior to particle behavior. Specifically, the model is able to predict the onset and evolution of shear banding in sands, replicating the most advanced high-fidelity experiments in triaxial compression equipped with sequential X-ray tomography imaging. We present comparisons of the model and experiment at an unprecedented level of quantitative agreement, building a one-to-one model in which every particle in the more than 53,000-particle array has its own avatar or numerical twin. Furthermore, the boundary conditions of the experiment are faithfully captured by modeling the membrane effect as well as the platen displacement and tilting. The results show a computational tool that can give insight into the physics and mechanics of granular materials undergoing shear deformation and failure, with computational times comparable to those of the experiment. One quantitative measure extracted from the LS-DEM simulations that is currently not available experimentally is the evolution of three-dimensional force chains inside and outside of the shear band. We show that the rotations of the force chains are correlated with the rotations in stress principal directions.
Kim, Min Soon; Rodney, William N.; Cooper, Tara; Kite, Chris; Reece, Gregory P.; Markey, Mia K.
2011-01-01
Rationale, aims and objectives: Scarring is a significant cause of dissatisfaction for women who undergo breast surgery. Scar tissue may be clinically distinguished from normal skin by aberrant color, rough surface texture, increased thickness (hypertrophy), and firmness. Colorimeters or spectrophotometers can be used to quantitatively assess scar color, but they require direct patient interaction and can cost thousands of dollars. By comparison, digital photography is already in widespread use to document clinical outcomes and requires less patient interaction. Thus, assessment of scar coloration by digital photography is an attractive alternative. The goal of this study was to compare color measurements obtained by digital photography and colorimetry. Method: Agreement between photographic and colorimetric measurements of color was evaluated. Experimental conditions were controlled by performing measurements on artificial scars created by a makeup artist. The colorimetric measurements of the artificial scars were compared to those reported in the literature for real scars in order to confirm the validity of this approach. We assessed the agreement between the colorimetric and photographic measurements of color using a hypothesis test for equivalence, the intra-class correlation coefficient (ICC), and the Bland-Altman method. Results: Overall, good agreement across the three statistical analyses was obtained for the three parameters (L*a*b*) measured by colorimetry and photography. Conclusion: Color measurements obtained by digital photography were equivalent to those obtained using colorimetry. Thus, digital photography is a reliable, cost-effective method for measuring skin color and should be further investigated for quantitative analysis of surgical outcomes.
Radiation damage to nucleoprotein complexes in macromolecular crystallography
Bury, Charles; Garman, Elspeth F.; Ginn, Helen Mary; ...
2015-01-30
Significant progress has been made in macromolecular crystallography over recent years in both the understanding and mitigation of X-ray induced radiation damage when collecting diffraction data from crystalline proteins. Despite the large field that is productively engaged in the study of the radiation chemistry of nucleic acids, particularly of DNA, there are currently very few X-ray crystallographic studies on radiation damage mechanisms in nucleic acids. Quantitative comparison of damage to protein and DNA crystals separately is challenging, but many of the issues are circumvented by studying pre-formed biological nucleoprotein complexes, where direct comparison of each component can be made under the same controlled conditions. A model protein-DNA complex, C.Esp1396I, is employed to investigate specific damage mechanisms for protein and DNA in a biologically relevant complex over a large dose range (2.07-44.63 MGy). In order to allow a quantitative analysis of radiation damage sites from a complex series of macromolecular diffraction data, a computational method has been developed that is generally applicable to the field. Typical specific damage was observed for the protein on particular amino acids and for the DNA on, for example, the cleavage of base-sugar N1-C and sugar-phosphate C-O bonds. Strikingly, the DNA component was determined to be far more resistant to specific damage than the protein over the investigated dose range: the protein was susceptible to radiation damage at low doses, whereas damage to the DNA was observed only at significantly higher doses.
Comparative assessment of fluorescent proteins for in vivo imaging in an animal model system.
Heppert, Jennifer K; Dickinson, Daniel J; Pani, Ariel M; Higgins, Christopher D; Steward, Annette; Ahringer, Julie; Kuhn, Jeffrey R; Goldstein, Bob
2016-11-07
Fluorescent protein tags are fundamental tools used to visualize gene products and analyze their dynamics in vivo. Recent advances in genome editing have expedited the precise insertion of fluorescent protein tags into the genomes of diverse organisms. These advances expand the potential of in vivo imaging experiments and facilitate experimentation with new, bright, photostable fluorescent proteins. Most quantitative comparisons of the brightness and photostability of different fluorescent proteins have been made in vitro, removed from the biological variables that govern their performance in cells or organisms. To address this gap, we quantitatively assessed fluorescent protein properties in vivo in an animal model system. We generated transgenic Caenorhabditis elegans strains expressing green, yellow, or red fluorescent proteins in embryos and imaged embryos expressing different fluorescent proteins under the same conditions for direct comparison. We found that mNeonGreen was not as bright in vivo as predicted from in vitro data but is a better tag than GFP for specific kinds of experiments, and we report on optimal red fluorescent proteins. These results identify ideal fluorescent proteins for in vivo imaging in C. elegans embryos and suggest good candidate fluorescent proteins to test in other animal model systems.
Hawkins, Brent L; Van Puymbroeck, Marieke; Walter, Alysha; Sharp, Julia; Woshkolup, Kathleen; Urrea-Mendoza, Enrique; Revilla, Fredy; Schmid, Arlene A
2018-04-09
Parkinson's disease (PD) often leads to poor balance, increased falls, and fear of falling, all of which can reduce participation in life activities. Yoga, which usually includes physical exercise, can improve functioning and life participation; however, limited research has been conducted on the effects of yoga on life participation of individuals with PD. This study had two purposes: (1) to identify and understand the perceived activities and participation outcomes associated with a therapeutic yoga intervention for individuals with PD; and (2) to compare the perceived activities and participation outcomes with the outcomes measured in the clinical trial. A single-blind, randomized, waitlist-controlled, phase II exploratory pilot study using an after-trial embedded mixed methods design (clinical trial Pro00041068) evaluated the effect of an 8-week Hatha yoga intervention on individuals with PD. Directed content analysis was used to analyze focus group interviews with participants who completed the yoga intervention. Quantitative and qualitative data were merged and compared using a data comparison matrix. Qualitative analysis indicated many activities and participation outcomes. Comparison of qualitative and quantitative data indicated that the yoga intervention led to improved balance, mobility, and functional gait, and fewer falls. These outcomes reached beyond the intervention and into participants' daily lives. The results support the use of Hatha yoga as a community-based rehabilitation intervention for individuals with PD. Yoga, as part of an interdisciplinary approach to treatment, can improve many types of activities and participation outcomes (e.g., mobility, social relationships, self-care, handling stress, recreation).
Ku, Hyung-Keun; Lim, Hyuk-Min; Oh, Kyong-Hwa; Yang, Hyo-Jin; Jeong, Ji-Seon; Kim, Sook-Kyung
2013-03-01
The Bradford assay is a simple method for protein quantitation, but variation in the results between proteins is a matter of concern. In this study, we compared and normalized quantitative values from two models for protein quantitation, where the residues in the protein that bind to anionic Coomassie Brilliant Blue G-250 comprise either Arg and Lys (Method 1, M1) or Arg, Lys, and His (Method 2, M2). Use of the M2 model yielded much more consistent quantitation values compared with use of the M1 model, which exhibited marked overestimations against protein standards. Copyright © 2012 Elsevier Inc. All rights reserved.
Holland, Jason P; Green, Jennifer C
2010-04-15
The electronic absorption spectra of a range of copper and zinc complexes have been simulated by using time-dependent density functional theory (TD-DFT) calculations implemented in Gaussian03. In total, 41 exchange-correlation (XC) functionals including first-, second-, and third-generation (meta-generalized gradient approximation) DFT methods were compared in their ability to predict the experimental electronic absorption spectra. Both pure and hybrid DFT methods were tested and differences between restricted and unrestricted calculations were also investigated by comparison of analogous neutral zinc(II) and copper(II) complexes. TD-DFT calculated spectra were optimized with respect to the experimental electronic absorption spectra by use of a Matlab script. Direct comparison of the performance of each XC functional was achieved both qualitatively and quantitatively by comparison of optimized half-band widths, root-mean-squared errors (RMSE), energy scaling factors (epsilon(SF)), and overall quality-of-fit (Q(F)) parameters. Hybrid DFT methods were found to outperform all pure DFT functionals with B1LYP, B97-2, B97-1, X3LYP, and B98 functionals providing the highest quantitative and qualitative accuracy in both restricted and unrestricted systems. Of the functionals tested, B1LYP gave the most accurate results with both average RMSE and overall Q(F) < 3.5% and epsilon(SF) values close to unity (>0.990) for the copper complexes. The XC functional performance in spin-restricted TD-DFT calculations on the zinc complexes was found to be slightly worse. PBE1PBE, mPW1PW91 and B1LYP gave the most accurate results with typical RMSE and Q(F) values between 5.3 and 7.3%, and epsilon(SF) around 0.930. These studies illustrate the power of modern TD-DFT calculations for exploring excited state transitions of metal complexes. 2009 Wiley Periodicals, Inc.
Fisher statistics for analysis of diffusion tensor directional information.
Hutchinson, Elizabeth B; Rutecki, Paul A; Alexander, Andrew L; Sutula, Thomas P
2012-04-30
A statistical approach is presented for the quantitative analysis of diffusion tensor imaging (DTI) directional information using Fisher statistics, which were originally developed for the analysis of vectors in the field of paleomagnetism. In this framework, descriptive and inferential statistics have been formulated based on the Fisher probability density function, a spherical analogue of the normal distribution. The Fisher approach was evaluated for investigation of rat brain DTI maps to characterize tissue orientation in the corpus callosum, fornix, and hilus of the dorsal hippocampal dentate gyrus, and to compare directional properties in these regions following status epilepticus (SE) or traumatic brain injury (TBI) with values in healthy brains. Direction vectors were determined for each region of interest (ROI) for each brain sample and Fisher statistics were applied to calculate the mean direction vector and variance parameters in the corpus callosum, fornix, and dentate gyrus of normal rats and rats that experienced TBI or SE. Hypothesis testing was performed by calculation of Watson's F-statistic and associated p-value giving the likelihood that grouped observations were from the same directional distribution. In the fornix and midline corpus callosum, no directional differences were detected between groups; however, in the hilus, significant (p<0.0005) differences were found that robustly confirmed observations that were suggested by visual inspection of directionally encoded color DTI maps. The Fisher approach is a potentially useful analysis tool that may extend the current capabilities of DTI investigation by providing a means of statistical comparison of tissue structural orientation. Copyright © 2012 Elsevier B.V. All rights reserved.
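The descriptive Fisher statistics described above can be sketched as follows (a minimal illustration assuming unit direction vectors in 3-D; the kappa formula is the standard large-concentration approximation and not necessarily the exact estimator used by the authors):

```python
import numpy as np

def fisher_stats(vectors):
    """Descriptive Fisher statistics for a sample of direction vectors.

    Returns the mean direction, the resultant length R, and the
    concentration parameter kappa, using the common approximation
    kappa ~ (N - 1) / (N - R) for concentrated samples."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)  # enforce unit length
    resultant = v.sum(axis=0)
    R = np.linalg.norm(resultant)      # length of the vector sum
    N = len(v)
    mean_direction = resultant / R     # Fisher mean direction (unit vector)
    kappa = (N - 1) / (N - R)          # large kappa = tightly clustered directions
    return mean_direction, R, kappa
```

Group comparisons of the kind reported (e.g. Watson's F-test) build on exactly these per-group resultants.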
Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Ake; Winter, Reidar
2009-08-25
Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date, only sparse data are available comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and the TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 +/- 3.7% and -0.2 +/- 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant.
Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.
1989-01-01
A charged particle spectrometer for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode.
Fayn, J; Rubel, P
1988-01-01
The authors present a new computer program for serial ECG analysis that allows a direct comparison of any pair of three-dimensional ECGs and quantitatively assesses the degree of evolution of the spatial loops as well as of their initial, central, or terminal sectors. Loops and sectors are superposed as closely as possible, with the aim of overcoming tracing variability of nonpathological origin. As a result, optimal measures of evolution are computed and a tabular summary of measurements is dynamically configured with respect to the patient's history and is then printed. A multivariate classifier assigns each pair of tracings to one of four classes of evolution. Color graphic displays corresponding to several modes of representation may also be plotted.
Comparison and quantitative verification of mapping algorithms for whole genome bisulfite sequencing
USDA-ARS's Scientific Manuscript database
Coupling bisulfite conversion with next-generation sequencing (Bisulfite-seq) enables genome-wide measurement of DNA methylation, but poses unique challenges for mapping. However, despite a proliferation of Bisulfite-seq mapping tools, no systematic comparison of their genomic coverage and quantitat...
NASA Astrophysics Data System (ADS)
Gulliver, Eric A.
The objective of this thesis was to identify and develop techniques providing direct comparison between simulated and real packed particle mixture microstructures containing submicron-sized particles. This entailed devising techniques for simulating powder mixtures, producing real mixtures with known powder characteristics, sectioning real mixtures, interrogating mixture cross-sections, evaluating and quantifying the mixture interrogation process, and comparing interrogation results between mixtures. A drop-and-roll-type particle-packing model was used to generate simulations of random mixtures. The simulated mixtures were then evaluated to establish that they were not segregated and were free from gross defects. A powder processing protocol was established to provide real mixtures for direct comparison and for use in evaluating the simulation. The powder processing protocol was designed to minimize differences between measured particle size distributions and the particle size distributions in the mixture. A sectioning technique was developed that was capable of producing distortion-free cross-sections of fine-scale particulate mixtures. Tessellation analysis was used to interrogate mixture cross-sections, and statistical quality control charts were used to evaluate different types of tessellation analysis and to establish the importance of differences between simulated and real mixtures. The particle-packing program generated crescent-shaped pores below large particles but otherwise realistic-looking mixture microstructures. Focused ion beam milling was the only technique capable of sectioning particle compacts in a manner suitable for stereological analysis. Johnson-Mehl and Voronoi tessellation of the same cross-sections produced tessellation tiles with different tile-area populations.
Control chart analysis showed that Johnson-Mehl tessellation measurements are superior to Voronoi tessellation measurements for detecting variations in mixture microstructure, such as altered particle-size distributions or mixture composition. Control charts based on tessellation measurements were used for direct, quantitative comparisons between real and simulated mixtures. Four sets of simulated and real mixtures were examined. Data from real mixtures matched simulated data when the samples were well mixed and the particle size distributions and volume fractions of the components were identical. Analysis of mixture components that occupied less than approximately 10 vol% of the mixture was not practical unless the particle size of the component was extremely small and excellent-quality high-resolution compositional micrographs of the real sample were available. These methods of analysis should allow future researchers to systematically evaluate and predict the impact and importance of variables such as component volume fraction and component particle size distribution as they pertain to the uniformity of powder mixture microstructures.
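Tile-area statistics of the kind compared above can be sketched with scipy for the Voronoi case (an illustrative sketch only; the thesis also used Johnson-Mehl tessellation, and this computes just the areas of bounded Voronoi cells from 2-D section coordinates):

```python
import numpy as np
from scipy.spatial import Voronoi

def finite_cell_areas(points):
    """Areas of the bounded Voronoi cells of a 2-D point set.

    Border cells are unbounded (their region contains vertex index -1)
    and are skipped. Cell vertices are sorted by angle around the cell
    centroid, which is valid because Voronoi cells are convex, and the
    area follows from the shoelace formula."""
    vor = Voronoi(points)
    areas = []
    for region_index in vor.point_region:
        region = vor.regions[region_index]
        if not region or -1 in region:
            continue  # unbounded border cell
        poly = vor.vertices[region]
        centroid = poly.mean(axis=0)
        order = np.argsort(np.arctan2(poly[:, 1] - centroid[1],
                                      poly[:, 0] - centroid[0]))
        x, y = poly[order, 0], poly[order, 1]
        areas.append(0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1))))
    return np.array(areas)
```

The resulting tile-area population is what a control chart would then monitor for shifts between real and simulated mixtures.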
Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J
2014-05-15
Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. ¹⁵O-H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is also one of the more invasive methods. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O-H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to ¹⁵O-H₂O PET, with comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
There are numerous quantitative real-time PCR (qPCR) methods available to detect and enumerate human fecal pollution in ambient waters. Each assay employs distinct primers and/or probes and many target different genes and microorganisms leading to potential variations in method p...
Exciting New Images | Lunar Reconnaissance Orbiter Camera
The Moon's topography is slowly and relentlessly reshaped. Comparative study of the shapes of lunar craters raises the questions: how can a quantitative comparison be derived, and how can we quantify and compare the topography of a large number of craters? A method for quantitative characterization of impact crater topography is described in Mahanti, P. et al., 2014, Icarus v. 241.
A Comparison of Learning Cultures in Different Sizes and Types
ERIC Educational Resources Information Center
Brown, Paula D.; Finch, Kim S.; MacGregor, Cynthia
2012-01-01
This study compared relevant data and information about leadership and learning cultures in different sizes and types of high schools. Research was conducted using a quantitative design with a qualitative element. Quantitative data were gathered using a researcher-created survey. Independent sample t-tests were conducted to analyze the means of…
Does Pre-Service Preparation Matter? Examining an Old Question in New Ways
ERIC Educational Resources Information Center
Ronfeldt, Matthew
2014-01-01
Background: Over the past decade, most of the quantitative studies on teacher preparation have focused on comparisons between alternative and traditional routes. There has been relatively little quantitative research on specific features of teacher education that might cause certain pathways into teaching to be more effective than others. The vast…
Modern techniques for tracking fecal pollution in environmental waters require investing in DNA-based methods to determine the presence of specific fecal sources. To help water quality managers decide whether to employ routine polymerase chain reaction (PCR) or quantitative PC...
Aims: Compare specificity and sensitivity of quantitative PCR (qPCR) assays targeting single and multi-copy gene regions of Escherichia coli. Methods and Results: A previously reported assay targeting the uidA gene (uidA405) was used as the basis for comparing the taxono...
ERIC Educational Resources Information Center
Nishiyama, Takeshi; Suzuki, Masako; Adachi, Katsunori; Sumi, Satoshi; Okada, Kensuke; Kishino, Hirohisa; Sakai, Saeko; Kamio, Yoko; Kojima, Masayo; Suzuki, Sadao; Kanne, Stephen M.
2014-01-01
We comprehensively compared all available questionnaires for measuring quantitative autistic traits (QATs) in terms of reliability and construct validity in 3,147 non-clinical and 60 clinical subjects with normal intelligence. We examined four full-length forms, the Subthreshold Autism Trait Questionnaire (SATQ), the Broader Autism Phenotype…
NASA Astrophysics Data System (ADS)
Woolford, Alison; Holden, Marcia; Salit, Marc; Burns, Malcolm; Ellison, Stephen L. R.
2009-01-01
Key comparison CCQM-K61 was performed to demonstrate and document the capability of interested national metrology institutes in the determination of the quantity of specific DNA target in an aqueous solution. The study provides support for the following measurement claim: "Quantitation of a linearised plasmid DNA, based on a matched standard in a matrix of non-target DNA". The comparison was an activity of the Bioanalysis Working Group (BAWG) of the Comité Consultatif pour la Quantité de Matière and was coordinated by NIST (Gaithersburg, USA) and LGC (Teddington, UK). The following laboratories (in alphabetical order) participated in this key comparison: DMSC (Thailand); IRMM (European Union); KRISS (Republic of Korea); LGC (UK); NIM (China); NIST (USA); NMIA (Australia); NMIJ (Japan); VNIIM (Russian Federation). Good agreement was observed between the reported results of all nine of the participants. Uncertainty estimates did not account fully for the dispersion of results even after allowance for possible inhomogeneity in calibration materials. Preliminary studies suggest that the effects of fluorescence threshold setting might contribute to the excess dispersion, and further study of this topic is suggested. Main text: to reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
Elias, Andrew; Crayton, Samuel H; Warden-Rothman, Robert; Tsourkas, Andrew
2014-07-28
Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectroscopy was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples.
Long-wavelength Magnetic and Gravity Anomaly Correlations of Africa and Europe
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Hinze, W. J. (Principal Investigator); Olivier, R.
1984-01-01
Preliminary MAGSAT scalar magnetic anomaly data were compiled for comparison with long-wavelength-pass filtered free-air gravity anomalies and regional heat-flow and tectonic data. To facilitate the correlation analysis at satellite elevations over a spherical Earth, equivalent point source inversion was used to differentially reduce the magnetic satellite anomalies to the radial pole at 350 km elevation, and to upward continue the first radial derivative of the free-air gravity anomalies. Correlation patterns between these regional geopotential anomaly fields are quantitatively established by moving window linear regression based on Poisson's theorem. Prominent correlations include direct correspondences for the Baltic Shield, where both anomalies are negative, and the central Mediterranean and Zaire Basin, where both anomalies are positive. Inverse relationships are generally common over the Precambrian Shield in northwest Africa, the Basins and Shields in southern Africa, and the Alpine Orogenic Belt. Inverse correlations also persist over the North Sea Rifts, the Benue Rift, and more generally over the East African Rifts. The results of this quantitative correlation analysis support the general inverse relationships of gravity and magnetic anomalies observed for North American continental terrain, which may be broadly related to magnetic crustal thickness variations.
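The moving-window regression at the heart of this correlation analysis can be sketched as follows (a deliberately simplified 1-D illustration on assumed profile data; the actual analysis operates on 2-D anomaly maps, with Poisson's theorem supplying the physical interpretation of the local slope):

```python
def moving_window_slope(gravity, magnetic, window):
    """Least-squares slope of magnetic vs gravity anomalies within each
    sliding window. The sign of the local slope flags direct vs inverse
    correlation between the two geopotential anomaly fields."""
    slopes = []
    for i in range(len(gravity) - window + 1):
        g = gravity[i:i + window]
        m = magnetic[i:i + window]
        mean_g = sum(g) / window
        mean_m = sum(m) / window
        sxx = sum((x - mean_g) ** 2 for x in g)
        sxy = sum((x - mean_g) * (y - mean_m) for x, y in zip(g, m))
        slopes.append(sxy / sxx)
    return slopes
```

Mapping these windowed slopes over a region is what distinguishes, say, the direct correspondence of the Baltic Shield from the inverse correlations over the rifts.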
Long-wavelength magnetic and gravity anomaly correlations on Africa and Europe
NASA Technical Reports Server (NTRS)
Vonfrese, R. R. B.; Olivier, R.; Hinze, W. J.
1985-01-01
Preliminary MAGSAT scalar magnetic anomaly data were compiled for comparison with long-wavelength-pass filtered free-air gravity anomalies and regional heat-flow and tectonic data. To facilitate the correlation analysis at satellite elevations over a spherical Earth, equivalent point source inversion was used to differentially reduce the magnetic satellite anomalies to the radial pole at 350 km elevation, and to upward continue the first radial derivative of the free-air gravity anomalies. Correlation patterns between these regional geopotential anomaly fields are quantitatively established by moving window linear regression based on Poisson's theorem. Prominent correlations include direct correspondences for the Baltic Shield, where both anomalies are negative, and the central Mediterranean and Zaire Basin, where both anomalies are positive. Inverse relationships are generally common over the Precambrian Shield in northwest Africa, the Basins and Shields in southern Africa, and the Alpine Orogenic Belt. Inverse correlations also persist over the North Sea Rifts, the Benue Rift, and more generally over the East African Rifts. The results of this quantitative correlation analysis support the general inverse relationships of gravity and magnetic anomalies observed for North American continental terrain, which may be broadly related to magnetic crustal thickness variations.
Scholey, J J; Wilcox, P D; Wisnom, M R; Friswell, M I
2009-06-01
A model for quantifying the performance of acoustic emission (AE) systems on plate-like structures is presented. Employing a linear transfer function approach, the model is applicable to both isotropic and anisotropic materials. The model requires several inputs including source waveforms, phase velocity and attenuation. It is recognised that these variables may not be readily available; thus, efficient measurement techniques are presented for obtaining phase velocity and attenuation in a form that can be exploited directly in the model. Inspired by previously documented methods, the application of these techniques is examined and some important implications for propagation characterisation in plates are discussed. Example measurements are made on isotropic and anisotropic plates and, where possible, comparisons with numerical solutions are made. By inputting experimentally obtained data into the model, quantitative system metrics are examined for different threshold values and sensor locations. By producing plots describing areas of hit success and source location error, the ability to measure the performance of different AE system configurations is demonstrated. This quantitative approach will help to place AE testing on a more solid foundation, underpinning its use in industrial AE applications.
Sun, Bing; Zheng, Yun-Ling
2018-01-01
Currently there is no sensitive, precise, and reproducible method to quantitate alternative splicing of mRNA transcripts. Droplet digital™ PCR (ddPCR™) analysis allows for accurate digital counting for quantification of gene expression. Human telomerase reverse transcriptase (hTERT) is one of the essential components required for telomerase activity and for the maintenance of telomeres. Several alternatively spliced forms of hTERT mRNA in human primary and tumor cells have been reported in the literature. Using one pair of primers and two probes for hTERT, four alternatively spliced forms of hTERT (α-/β+, α+/β- single deletions, α-/β- double deletion, and nondeletion α+/β+) were accurately quantified through a novel analysis method via data collected from a single ddPCR reaction. In this chapter, we describe this ddPCR method that enables direct quantitative comparison of four alternatively spliced forms of the hTERT messenger RNA without the need for internal standards or multiple pairs of primers specific for each variant, eliminating the technical variation due to differential PCR amplification efficiency for different amplicons and the challenges of quantification using standard curves. This simple and straightforward method should have general utility for quantifying alternatively spliced gene transcripts.
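The Poisson correction that underlies all droplet-digital quantification of this kind can be sketched as follows (an illustrative sketch only; the 0.85 nL droplet volume is an assumed nominal value, not a parameter taken from the chapter):

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Poisson-corrected target concentration (copies per microliter).

    With a fraction p of positive droplets, the mean number of target
    copies per droplet is -ln(1 - p); dividing by the droplet volume
    (0.85 nL assumed here) converts to copies/uL."""
    p = positive / total
    copies_per_droplet = -math.log(1.0 - p)
    return copies_per_droplet / droplet_volume_ul
```

Because each splice form is counted absolutely this way, ratios between forms can be compared directly without standard curves, which is the advantage the chapter emphasizes.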
Achieving across-laboratory replicability in psychophysical scaling
Ward, Lawrence M.; Baumann, Michael; Moffat, Graeme; Roberts, Larry E.; Mori, Shuji; Rutledge-Taylor, Matthew; West, Robert L.
2015-01-01
It is well known that, although psychophysical scaling produces good qualitative agreement between experiments, precise quantitative agreement between experimental results, such as that routinely achieved in physics or biology, is rarely or never attained. A particularly galling example of this is the fact that power function exponents for the same psychological continuum, measured in different laboratories but ostensibly using the same scaling method, magnitude estimation, can vary by a factor of three. Constrained scaling (CS), in which observers first learn a standardized meaning for a set of numerical responses relative to a standard sensory continuum and then make magnitude judgments of other sensations using the learned response scale, has produced excellent quantitative agreement between individual observers’ psychophysical functions. Theoretically it could do the same for across-laboratory comparisons, although this needs to be tested directly. We compared nine different experiments from four different laboratories as an example of the level of across experiment and across-laboratory agreement achievable using CS. In general, we found across experiment and across-laboratory agreement using CS to be significantly superior to that typically obtained with conventional magnitude estimation techniques, although some of its potential remains to be realized. PMID:26191019
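The power-function exponents discussed above are conventionally estimated as the slope of a log-log regression of judged magnitude on stimulus intensity (Stevens' law); a minimal sketch:

```python
import math

def power_law_exponent(intensities, magnitudes):
    """Stevens' law exponent n in psi = k * phi**n, estimated as the
    least-squares slope of log(magnitude) against log(intensity)."""
    xs = [math.log(i) for i in intensities]
    ys = [math.log(m) for m in magnitudes]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    return sxy / sxx
```

The across-laboratory problem described above is precisely that this estimated exponent, for the same continuum, can vary severalfold between conventional magnitude-estimation experiments; constrained scaling aims to stabilize it.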
NASA Astrophysics Data System (ADS)
Devès, Guillaume; Cohen-Bouhacina, Touria; Ortega, Richard
2004-10-01
We used the nuclear microprobe techniques micro-PIXE (particle-induced X-ray emission), micro-RBS (Rutherford backscattering spectrometry) and scanning transmission ion microscopy (STIM) to characterize the trace element content and spatial distribution within biological samples (dehydrated cultured cells, tissues). The normalization of PIXE results is usually expressed in terms of sample dry mass as determined by micro-RBS recorded simultaneously with micro-PIXE. However, the main limit of RBS mass measurement is the sample mass loss occurring during irradiation, which can be up to 30% of the initial sample mass. We present here a new methodology for PIXE normalization and quantitative analysis of trace elements within biological samples based on dry mass measurement performed by means of STIM. The validation of STIM cell mass measurements was obtained by comparison with AFM sample thickness measurements. Results indicated the reliability of STIM mass measurement performed on biological samples and suggested that STIM should be performed for PIXE normalization. Further information deriving from direct confrontation of AFM and STIM analyses could also be obtained, such as in situ measurement of cell specific gravity within cell compartments (nucleolus and cytoplasm).
Image database for digital hand atlas
NASA Astrophysics Data System (ADS)
Cao, Fei; Huang, H. K.; Pietka, Ewa; Gilsanz, Vicente; Dey, Partha S.; Gertych, Arkadiusz; Pospiech-Kurkowska, Sywia
2003-05-01
Bone age assessment is a procedure frequently performed in pediatric patients to evaluate growth disorders. A commonly used method is atlas matching by visual comparison of a hand radiograph with a small reference set from the old Greulich-Pyle atlas. We have developed a new digital hand atlas with a large set of clinically normal hand images of diverse ethnic groups. In this paper, we present our system design and implementation of the digital atlas database to support computer-aided atlas matching for bone age assessment. The system consists of a hand atlas image database, a computer-aided diagnostic (CAD) software module for image processing and atlas matching, and a Web user interface. Users can use a Web browser to push DICOM images, directly or indirectly from PACS, to the CAD server for a bone age assessment. Quantitative features on the examined image, which reflect the skeletal maturity, are then extracted and compared with patterns from the atlas image database to assess the bone age. The digital atlas method, built on a large image database and current Internet technology, provides an alternative to supplement or replace the traditional one for a quantitative, accurate and cost-effective assessment of bone age.
Pohmann, Rolf; Künnecke, Basil; Fingerle, Jürgen; von Kienlin, Markus
2006-01-01
Non-invasive measurement of perfusion in skeletal muscle by in vivo magnetic resonance remains a challenge due to its low level and the correspondingly low signal-to-noise ratio. To enable accurate, quantitative, and time-resolved perfusion measurements in the leg muscle, a technique with a high sensitivity is required. By combining a flow-sensitive alternating inversion recovery (FAIR)-sequence with a single-voxel readout, we have developed a new technique to measure the perfusion in the rat gastrocnemius muscle at rest, yielding an average value of 19.4 +/- 4.8 mL/100 g/min (n = 22). In additional experiments, perfusion changes were elicited by acute ischemia and reperfusion or by exercise induced by electrical, noninvasive muscle stimulation with varying duration and intensity. The perfusion time courses during these manipulations were measured with a temporal resolution of 2.2 min, showing increases in perfusion of a factor of up to 2.5. In a direct comparison, the results agreed closely with values found with microsphere measurements in the same animals. The quantitative and noninvasive method can significantly facilitate the investigation of atherosclerotic diseases and the examination of drug efficacy.
Comparative and Quantitative Global Proteomics Approaches: An Overview
Deracinois, Barbara; Flahaut, Christophe; Duban-Deweer, Sophie; Karamanos, Yannis
2013-01-01
Proteomics has become a key tool for the study of biological systems. The comparison between two different physiological states allows unravelling the cellular and molecular mechanisms involved in a biological process. Proteomics can confirm the presence of proteins suggested by their mRNA content and provides a direct measure of the quantity present in a cell. Global and targeted proteomics strategies can be applied. Targeted proteomics strategies limit the number of features that will be monitored and then optimise the methods to obtain the highest sensitivity and throughput for a huge amount of samples. The advantage of global proteomics strategies is that no hypothesis is required, other than a measurable difference in one or more protein species between the samples. Global proteomics methods attempt to separate, quantify and identify all the proteins from a given sample. This review highlights only the different techniques of separation and quantification of proteins and peptides, in view of a comparative and quantitative global proteomics analysis. The in-gel and off-gel quantification of proteins will be discussed as well as the corresponding mass spectrometry technology. The overview is focused on the widespread techniques, while keeping in mind that each approach is modular and often overlaps with the others. PMID:28250403
Comparison of two trajectory based models for locating particle sources for two rural New York sites
NASA Astrophysics Data System (ADS)
Zhou, Liming; Hopke, Philip K.; Liu, Wei
Two back-trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their capabilities of identifying likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. Quantitative transport bias analysis attempts to take into account the distribution of concentrations around the directions of the back trajectories. In the full QTBA approach, deposition processes (wet and dry) are also considered; simplified QTBA omits deposition. It is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained by the previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six common sources for the two sites, sulfate, soil, zinc smelter, nitrate, wood smoke and copper smelter, were analyzed. The results of the two methods are consistent and locate large and clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of the source locations.
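The residence-time weighting idea can be sketched as follows (a deliberately simplified illustration: 1-degree grid cells and equal weighting of all trajectory hours are assumptions for exposition, not the published algorithm):

```python
import math
from collections import defaultdict

def rtwc_field(trajectories, concentrations, cell_size=1.0):
    """Residence-time weighted concentration field.

    Every (lat, lon) back-trajectory point deposits the concentration
    measured at the receptor into its grid cell; each cell's value is
    the mean over all trajectory points that fell in it, so cells
    repeatedly crossed by high-concentration trajectories stand out
    as likely source regions."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for trajectory, conc in zip(trajectories, concentrations):
        for lat, lon in trajectory:
            cell = (math.floor(lat / cell_size), math.floor(lon / cell_size))
            sums[cell] += conc
            counts[cell] += 1
    return {cell: sums[cell] / counts[cell] for cell in sums}
```

Feeding the field with source-contribution time series (rather than raw concentrations), as done in the study, localizes each factor-analytic source separately.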
LCSH and PRECIS in Music: A Comparison.
ERIC Educational Resources Information Center
Gabbard, Paula Beversdorf
1985-01-01
By studying examples of their applications by two major English language bibliographic agencies, this article compares strengths and weaknesses of PRECIS and Library of Congress Subject Headings for books about music. Highlights include quantitative and qualitative analysis, comparison of number of subject statements, and terminology problems in…
Comparison of Quantitative Antifungal Testing Methods for Textile Fabrics.
Imoto, Yasuo; Seino, Satoshi; Nakagawa, Takashi; Yamamoto, Takao A
2017-01-01
Quantitative antifungal testing methods for textile fabrics under growth-supportive conditions were studied. Fungal growth activities on unfinished textile fabrics and textile fabrics modified with Ag nanoparticles were investigated using the colony counting method and the luminescence method. Morphological changes of the fungi during incubation were investigated by microscopic observation. Comparison of the results indicated that the fungal growth activity values obtained with the colony counting method depended on the morphological state of the fungi on textile fabrics, whereas those obtained with the luminescence method did not. Our findings indicated that unique characteristics of each testing method must be taken into account for the proper evaluation of antifungal activity.
Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L
2018-02-01
A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
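The Bland-Altman comparison described above reduces to a mean difference (bias) between paired assay values and 95% limits of agreement around it. A minimal sketch of the standard formulation, with illustrative numbers rather than the authors' NGS data:

```python
import numpy as np

def bland_altman(x, y):
    """Bland-Altman agreement statistics for paired assay values.

    Returns (bias, lower, upper): the mean difference and the 95%
    limits of agreement (bias +/- 1.96 SD of the differences).
    """
    d = np.asarray(x, float) - np.asarray(y, float)
    bias = d.mean()
    sd = d.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# e.g. variant allele fractions from two runs (illustrative numbers)
ref = [0.10, 0.25, 0.40, 0.50]
new = [0.12, 0.24, 0.43, 0.52]
bias, lo, hi = bland_altman(new, ref)  # bias ~ 0.015, limits ~ +/-0.034 around it
```

A nonzero bias flags constant error; a bias that grows with the mean of the pair flags proportional error, which is where Deming regression complements this view.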
Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T
2016-12-01
Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. We compared a manual DNA extraction kit in combination with a validated in-house PCR assay against automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations in a range from 10^2 to 10^11 copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, with a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.
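The reported "median difference of 0.5 logs" between systems is a median of absolute log10 differences over paired measurements. A minimal sketch with invented paired loads (copies/g), not the study's data:

```python
import math

def median_log_diff(loads_a, loads_b):
    """Median absolute log10 difference between paired viral-load
    values (copies/g) reported by two detection systems."""
    diffs = sorted(abs(math.log10(a) - math.log10(b))
                   for a, b in zip(loads_a, loads_b))
    n = len(diffs)
    mid = n // 2
    return diffs[mid] if n % 2 else (diffs[mid - 1] + diffs[mid]) / 2

# illustrative paired measurements from two hypothetical systems
a = [1e5, 1e7, 1e4]
b = [2e5, 1e6, 5e4]
print(median_log_diff(a, b))  # ~0.70 logs
```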
Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).
Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan
2015-01-01
Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, i-QTaG using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of ¹³C₆/¹²C₆-2-aminobenzoic acid (2-AA)-labeled glycans from normal human serum. Finally, this method was applied to a direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of bovine fetuin glycoprotein. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.
NASA Astrophysics Data System (ADS)
Antropova, Natasha; Huynh, Benjamin; Giger, Maryellen
2017-03-01
Intuitive segmentation-based CADx/radiomic features, calculated from lesion segmentations of dynamic contrast-enhanced magnetic resonance images (DCE-MRIs), have been utilized in the task of distinguishing between malignant and benign lesions. Additionally, transfer learning with pre-trained deep convolutional neural networks (CNNs) allows for an alternative method of radiomics extraction, where the features are derived directly from the image data. However, a comparison of computer-extracted segmentation-based and CNN features in MRI breast lesion characterization has not yet been conducted. In our study, we used a DCE-MRI database of 640 breast cases: 191 benign and 449 malignant. Thirty-eight segmentation-based features were extracted automatically using our quantitative radiomics workstation. Also, 2D ROIs were selected around each lesion on the DCE-MRIs and directly input into a pre-trained CNN (AlexNet), yielding CNN features. Each method was investigated separately and in combination in terms of performance in the task of distinguishing between benign and malignant lesions. Area under the ROC curve (AUC) served as the figure of merit. Both methods yielded promising classification performance, with round-robin cross-validated AUC values of 0.88 (se = 0.01) and 0.76 (se = 0.02) for segmentation-based and deep learning methods, respectively. Combination of the two methods enhanced the performance in malignancy assessment, resulting in an AUC value of 0.91 (se = 0.01), a statistically significant improvement over the performance of the CNN method alone.
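The figure of merit used here, the area under the ROC curve, can be computed from the rank (Mann-Whitney) formulation: the probability that a randomly chosen malignant case scores above a randomly chosen benign one, counting ties as one half. A sketch with illustrative classifier scores (the study's cross-validation protocol is not reproduced):

```python
def auc(scores_pos, scores_neg):
    """ROC AUC via the Mann-Whitney U formulation: fraction of
    (positive, negative) pairs where the positive scores higher,
    with ties counted as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# hypothetical classifier outputs
malignant = [0.9, 0.8, 0.6]
benign = [0.7, 0.4, 0.3]
print(auc(malignant, benign))  # 8 of 9 pairs correctly ordered
```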
Comparison of fecal indicators with pathogenic bacteria and rotavirus in groundwater.
Ferguson, Andrew S; Layton, Alice C; Mailloux, Brian J; Culligan, Patricia J; Williams, Daniel E; Smartt, Abby E; Sayler, Gary S; Feighery, John; McKay, Larry D; Knappett, Peter S K; Alexandrova, Ekaterina; Arbit, Talia; Emch, Michael; Escamilla, Veronica; Ahmed, Kazi Matin; Alam, Md Jahangir; Streatfield, P Kim; Yunus, Mohammad; van Geen, Alexander
2012-08-01
Groundwater is routinely analyzed for fecal indicators but direct comparisons of fecal indicators to the presence of bacterial and viral pathogens are rare. This study was conducted in rural Bangladesh where the human population density is high, sanitation is poor, and groundwater pumped from shallow tubewells is often contaminated with fecal bacteria. Five indicator microorganisms (E. coli, total coliform, F+RNA coliphage, Bacteroides and human-associated Bacteroides) and various environmental parameters were compared to the direct detection of waterborne pathogens by quantitative PCR in groundwater pumped from 50 tubewells. Rotavirus was detected in groundwater filtrate from the largest proportion of tubewells (40%), followed by Shigella (10%), Vibrio (10%), and pathogenic E. coli (8%). Spearman rank correlations and sensitivity-specificity calculations indicate that some, but not all, combinations of indicators and environmental parameters can predict the presence of pathogens. Culture-dependent fecal indicator bacteria measured on a single date did not predict total bacterial pathogens, but annually averaged monthly measurements of culturable E. coli did improve prediction for total bacterial pathogens. A qPCR-based E. coli assay was the best indicator for the bacterial pathogens. F+RNA coliphages were neither correlated with nor sufficiently sensitive to rotavirus, but were predictive of bacterial pathogens. Since groundwater cannot be excluded as a significant source of diarrheal disease in Bangladesh and neighboring countries with similar characteristics, more effective methods for screening tubewells for microbial contamination are needed. Copyright © 2012 Elsevier B.V. All rights reserved.
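The sensitivity-specificity calculations mentioned above come from a 2x2 comparison of indicator detection against pathogen detection per well. A minimal sketch with invented per-tubewell booleans, not the study's data:

```python
def sens_spec(indicator_pos, pathogen_pos):
    """Sensitivity and specificity of a fecal indicator as a predictor
    of pathogen detection, from paired boolean results per well."""
    pairs = list(zip(indicator_pos, pathogen_pos))
    tp = sum(i and p for i, p in pairs)           # indicator+, pathogen+
    fn = sum((not i) and p for i, p in pairs)     # missed pathogens
    tn = sum((not i) and (not p) for i, p in pairs)
    fp = sum(i and (not p) for i, p in pairs)     # false alarms
    return tp / (tp + fn), tn / (tn + fp)

# illustrative results for 6 tubewells
ind = [True, True, False, False, True, False]
path = [True, False, True, False, False, False]
print(sens_spec(ind, path))  # (0.5, 0.5)
```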
Kobayashi, Takeshi; Reid, Joshua E S J; Shimizu, Seishi; Fyta, Maria; Smiatek, Jens
2017-07-26
We study the properties of residual water molecules at different mole fractions in dialkylimidazolium-based ionic liquids (ILs), namely 1-ethyl-3-methylimidazolium tetrafluoroborate (EMIM/BF₄) and 1-butyl-3-methylimidazolium tetrafluoroborate (BMIM/BF₄), by means of atomistic molecular dynamics (MD) simulations. The corresponding Kirkwood-Buff (KB) integrals for the water-ion and ion-ion correlation behavior are calculated by a direct evaluation of the radial distribution functions. The outcomes are compared to the corresponding KB integrals derived by an inverse approach based on experimental data. Our results reveal a quantitative agreement between both approaches, which paves the way toward a more reliable comparison between simulation and experimental results. The simulation outcomes further highlight that water, even at intermediate mole fractions, has a negligible influence on the ion distribution in the solution. More detailed analyses of the local/bulk partition coefficients and the partial structure factors reveal that water molecules at low mole fractions mainly remain in the monomeric state. A non-linear increase of higher-order water clusters can be found at larger water concentrations. For both ILs, water coordination is more pronounced around the cations than around the anions, which indicates that the IL cations are mainly responsible for water pairing mechanisms. Our simulations thus provide detailed insights into the properties of dialkylimidazolium-based ILs and their effects on water binding.
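The direct evaluation of Kirkwood-Buff integrals from radial distribution functions follows G_ij = 4π ∫ (g_ij(r) − 1) r² dr. A minimal numerical sketch that truncates the integral at the last sampled r; real IL analyses require the finite-size and convergence corrections the paper addresses:

```python
import numpy as np

def kb_integral(r, g):
    """Kirkwood-Buff integral G = 4*pi * int (g(r) - 1) r^2 dr,
    evaluated by the trapezoidal rule and truncated at r[-1].

    r: increasing radial grid; g: sampled radial distribution function.
    """
    r = np.asarray(r, float)
    g = np.asarray(g, float)
    y = (g - 1.0) * r**2
    # composite trapezoid on a possibly non-uniform grid
    return 4.0 * np.pi * np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(r))

# sanity check: an ideal (uncorrelated) fluid has g(r) = 1 and G = 0
r = np.linspace(0.0, 1.0, 101)
print(kb_integral(r, np.ones_like(r)))  # 0.0
```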
Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C
2004-09-30
Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.
Abdolhosseini, Sana; Ghiasvand, Ali Reza; Heidari, Nahid
2018-05-14
A stainless steel fiber was made porous and adhesive by platinization and then coated with nanostructured polypyrrole (PPy) using an appropriate electrophoretic deposition (EPD) method. The morphological surface structure and functional groups of the PPy-coated fiber were studied using scanning electron microscopy (SEM). The prepared fiber was used for comparison of direct immersion (DI) and electroenhanced direct immersion solid-phase microextraction (EE-DI-SPME) of nicotine in human plasma and urine samples, followed by determination with gas chromatography and flame ionization detection (GC-FID). The effects of the influential experimental parameters on the efficiency of the DI-SPME and EE-DI-SPME methods, including the pH and ionic strength of the sample solution, applied direct-current (DC) voltage, extraction temperature and time, and stirring rate, were optimized. Under the optimal conditions, the calibration curves for the DI-SPME-GC-FID and EE-DI-SPME-GC-FID methods were linear over the ranges of 0.1-10.0 μg mL⁻¹ and 0.001-10.0 μg mL⁻¹, respectively. The relative standard deviations (RSDs, n = 6) were found to be 6.1% and 4.6% for the DI and EE strategies, respectively. The limits of detection (LODs) of the DI-SPME-GC-FID and EE-DI-SPME-GC-FID methods were found to be 10 and 0.3 ng mL⁻¹, respectively. The relative recovery values (for the analysis of 1 μg mL⁻¹ nicotine) were found to be 91-110% for EE-DI-SPME and 75-105% for DI-SPME. The enrichment factors for DI-SPME and EE-DI-SPME sampling were obtained as 38,734 and 50,597, respectively. The results indicated that EE-SPME was more efficient for quantitation of nicotine in biological fluids. The developed procedure was successfully carried out for the extraction and measurement of nicotine in real plasma and urine samples.
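Quantitation against the linear calibration curves described above amounts to fitting signal = slope·concentration + intercept over the standards and inverting the line for unknowns. A generic external-calibration sketch (function names and numbers are illustrative, not from the study):

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares calibration line: signal = slope*conc + intercept
    (e.g. GC-FID peak area versus analyte concentration).
    Returns (slope, intercept)."""
    slope, intercept = np.polyfit(np.asarray(conc, float),
                                  np.asarray(signal, float), 1)
    return slope, intercept

def quantify(signal, slope, intercept):
    """Back-calculate an unknown concentration from its signal."""
    return (signal - intercept) / slope

# hypothetical standards: concentrations and instrument responses
slope, intercept = fit_calibration([0.0, 1.0, 2.0, 4.0],
                                   [1.0, 4.0, 7.0, 13.0])
print(quantify(7.0, slope, intercept))  # an unknown with signal 7.0
```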
Can NMR solve some significant challenges in metabolomics?
Gowda, G.A. Nagana; Raftery, Daniel
2015-01-01
The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact biospecimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. PMID:26476597
Shear-induced aggregation dynamics in a polymer microrod suspension
NASA Astrophysics Data System (ADS)
Kumar, Pramukta S.
A non-Brownian suspension of micron scale rods is found to exhibit reversible shear-driven formation of disordered aggregates resulting in dramatic viscosity enhancement at low shear rates. Aggregate formation is imaged at low magnification using a combined rheometer and fluorescence microscope system. The size and structure of these aggregates are found to depend on shear rate and concentration, with larger aggregates present at lower shear rates and higher concentrations. Quantitative measurements of the early-stage aggregation process are modeled by a collision driven growth of porous structures which show that the aggregate density increases with shear rate. A Krieger-Dougherty type constitutive relation and steady-state viscosity measurements are used to estimate the intrinsic viscosity of complex structures developed under shear. Higher magnification images are collected and used to validate the aggregate size versus density relationship, as well as to obtain particle flow fields via PIV. The flow fields provide a tantalizing view of fluctuations involved in the aggregation process. Interaction strength is estimated via contact force measurements and JKR theory and found to be extremely strong in comparison to shear forces present in the system, estimated using hydrodynamic arguments. All of the results are then combined to produce a consistent conceptual model of aggregation in the system that features testable consequences. These results represent a direct, quantitative, experimental study of aggregation and viscosity enhancement in a rod suspension, and demonstrate a strategy for inferring inaccessible microscopic geometric properties of a dynamic system through the combination of quantitative imaging and rheology.
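The Krieger-Dougherty constitutive relation used to estimate intrinsic viscosity takes the form η_r = (1 − φ/φ_max)^(−[η]·φ_max), where φ is the particle volume fraction. A minimal sketch (the packing-fraction and intrinsic-viscosity values below are generic illustrations, not the thesis's fitted values):

```python
def krieger_dougherty(phi, phi_max, intrinsic_viscosity):
    """Relative viscosity of a suspension from the Krieger-Dougherty
    relation: eta_r = (1 - phi/phi_max) ** (-[eta] * phi_max).

    phi: particle volume fraction; phi_max: maximum packing fraction;
    intrinsic_viscosity: [eta], e.g. 2.5 for dilute hard spheres.
    Diverges as phi approaches phi_max, capturing the dramatic
    viscosity enhancement of dense or aggregated suspensions.
    """
    return (1.0 - phi / phi_max) ** (-intrinsic_viscosity * phi_max)

# at half of random close packing, viscosity is already ~3x the solvent's
print(krieger_dougherty(0.32, 0.64, 2.5))  # 2**1.6 ~ 3.03
```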
NASA Astrophysics Data System (ADS)
Wu, Di; Donovan Wong, Molly; Li, Yuhua; Fajardo, Laurie; Zheng, Bin; Wu, Xizeng; Liu, Hong
2017-12-01
The objective of this study was to quantitatively investigate the ability to distribute microbubbles along the interface between two tissues, in an effort to improve the edge and/or boundary features in phase contrast imaging. The experiments were conducted by employing a custom designed tissue simulating phantom, which also simulated a clinical condition where the ligand-targeted microbubbles are self-aggregated on the endothelium of blood vessels surrounding malignant cells. Four different concentrations of microbubble suspensions were injected into the phantom: 0%, 0.1%, 0.2%, and 0.4%. A time delay of 5 min was implemented before image acquisition to allow the microbubbles to become distributed at the interface between the acrylic and the cavity simulating a blood vessel segment. For comparison purposes, images were acquired using three system configurations for both projection and tomosynthesis imaging with a fixed radiation dose delivery: conventional low-energy contact mode, low-energy in-line phase contrast and high-energy in-line phase contrast. The resultant images illustrate the edge feature enhancements in the in-line phase contrast imaging mode when the microbubble concentration is extremely low. The quantitative edge-enhancement-to-noise ratio calculations not only agree with the direct image observations, but also indicate that the edge feature enhancement can be improved by increasing the microbubble concentration. In addition, high-energy in-line phase contrast imaging provided better performance in detecting low-concentration microbubble distributions.
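The abstract does not define its edge-enhancement-to-noise ratio; one plausible form divides the peak-to-trough intensity swing across the edge by the noise level of a flat background region. A hypothetical sketch under that assumption only:

```python
import numpy as np

def edge_enhancement_to_noise(edge_profile, background):
    """One plausible edge-enhancement-to-noise ratio: peak-to-trough
    intensity swing across an edge line profile divided by the standard
    deviation of a flat background region. The paper's exact definition
    may differ; this is an assumed formulation for illustration.
    """
    profile = np.asarray(edge_profile, float)
    noise = np.asarray(background, float).std(ddof=1)
    return (profile.max() - profile.min()) / noise

# invented line profile across a phase-contrast edge, plus background
profile = [1.0, 1.0, 3.0, 0.0, 1.0]   # overshoot/undershoot at the edge
background = [1.0, 3.0, 1.0, 3.0]     # noisy flat region
print(edge_enhancement_to_noise(profile, background))
```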
McCann, Una D; Szabo, Zsolt; Seckin, Esen; Rosenblatt, Peter; Mathews, William B; Ravert, Hayden T; Dannals, Robert F; Ricaurte, George A
2005-09-01
(±)3,4-Methylenedioxymethamphetamine (MDMA, 'Ecstasy') is a widely used illicit drug that produces toxic effects on brain serotonin axons and axon terminals in animals. The results of clinical studies addressing MDMA's serotonin neurotoxic potential in humans have been inconclusive. In the present study, 23 abstinent MDMA users and 19 non-MDMA controls underwent quantitative positron emission tomography (PET) studies using [11C]McN5652 and [11C]DASB, first- and second-generation serotonin transporter (SERT) ligands previously validated in baboons for detecting MDMA-induced brain serotonin neurotoxicity. Global and regional distribution volumes (DVs) and two additional SERT-binding parameters (DV(spec) and DVR) were compared in the two subject populations using parametric statistical analyses. Data from PET studies revealed excellent correlations between the various binding parameters of [11C]McN5652 and [11C]DASB, both in individual brain regions and individual subjects. Global SERT reductions were found in MDMA users with both PET ligands, using all three of the above-mentioned SERT-binding parameters. Preplanned comparisons in 15 regions of interest demonstrated reductions in selected cortical and subcortical structures. Exploratory correlational analyses suggested that SERT measures recover with time, and that loss of the SERT is directly associated with MDMA use intensity. These quantitative PET data, obtained using validated first- and second-generation SERT PET ligands, provide strong evidence of reduced SERT density in some recreational MDMA users.
Shahgaldi, Kambiz; Gudmundsson, Petri; Manouras, Aristomenis; Brodin, Lars-Åke; Winter, Reidar
2009-01-01
Background Visual assessment of left ventricular ejection fraction (LVEF) is often used in clinical routine despite general recommendations to use quantitative biplane Simpson's (BPS) measurements. Even though quantitative methods are well validated and for many reasons preferable, the feasibility of visual assessment (eyeballing) is superior. To date there are only sparse data comparing visual EF assessment with quantitative methods. The aim of this study was to compare visual EF assessment by two-dimensional echocardiography (2DE) and triplane echocardiography (TPE) using quantitative real-time three-dimensional echocardiography (RT3DE) as the reference method. Methods Thirty patients were enrolled in the study. Eyeballing EF was assessed using apical 4- and 2-chamber views and TP mode by two experienced readers blinded to all clinical data. The measurements were compared to quantitative RT3DE. Results There was an excellent correlation between eyeballing EF by 2D and TP vs 3DE (r = 0.91 and 0.95, respectively) without any significant bias (-0.5 ± 3.7% and -0.2 ± 2.9%, respectively). Intraobserver variability was 3.8% for eyeballing 2DE, 3.2% for eyeballing TP and 2.3% for quantitative 3D-EF. Interobserver variability was 7.5% for eyeballing 2D and 8.4% for eyeballing TP. Conclusion Visual estimation of LVEF both using 2D and TP by an experienced reader correlates well with quantitative EF determined by RT3DE. There is an apparent trend towards smaller variability using TP in comparison to 2D; this was, however, not statistically significant. PMID:19706183
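The quantitative biplane Simpson's method referenced above sums elliptical discs whose orthogonal diameters come from the apical 4- and 2-chamber views, then forms EF from end-diastolic and end-systolic volumes. A minimal sketch with invented disc diameters, not patient data:

```python
import math

def simpson_biplane_volume(diam_4ch, diam_2ch, length):
    """LV volume by the biplane method of discs: the ventricle is cut
    into N elliptical discs whose two axes come from the apical 4- and
    2-chamber views; V = sum(pi/4 * a_i * b_i) * L / N."""
    n = len(diam_4ch)
    discs = sum(math.pi / 4.0 * a * b for a, b in zip(diam_4ch, diam_2ch))
    return discs * length / n

def ejection_fraction(edv, esv):
    """EF (%) from end-diastolic and end-systolic volumes."""
    return 100.0 * (edv - esv) / edv

# uniform 4 cm discs over a 10 cm long axis reduce to a cylinder
edv = simpson_biplane_volume([4.0] * 5, [4.0] * 5, 10.0)
print(ejection_fraction(edv, edv / 2.0))  # 50.0
```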
Mapping eQTL Networks with Mixed Graphical Markov Models
Tur, Inma; Roverato, Alberto; Castelo, Robert
2014-01-01
Expression quantitative trait loci (eQTL) mapping constitutes a challenging problem due to, among other reasons, the high-dimensional multivariate nature of gene-expression traits. Next to the expression heterogeneity produced by confounding factors and other sources of unwanted variation, indirect effects spread throughout genes as a result of genetic, molecular, and environmental perturbations. From a multivariate perspective one would like to adjust for the effect of all of these factors to end up with a network of direct associations connecting the path from genotype to phenotype. In this article we approach this challenge with mixed graphical Markov models, higher-order conditional independences, and q-order correlation graphs. These models show that additive genetic effects propagate through the network as a function of gene–gene correlations. Our estimation of the eQTL network underlying a well-studied yeast data set leads to a sparse structure with more direct genetic and regulatory associations that enable a straightforward comparison of the genetic control of gene expression across chromosomes. Interestingly, it also reveals that eQTLs explain most of the expression variability of network hub genes. PMID:25271303
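The distinction between direct and mediated associations that the graphical-model approach formalizes can be illustrated with full-order partial correlations computed from the precision matrix; this is a sketch of the underlying idea, not the authors' mixed graphical Markov model or q-order estimator:

```python
import numpy as np

def partial_correlations(data):
    """Full-order partial correlation matrix from the precision matrix
    (inverse covariance): rho_ij = -P_ij / sqrt(P_ii * P_jj).
    Variables linked only through intermediaries get near-zero entries,
    separating direct from mediated associations."""
    p = np.linalg.inv(np.cov(np.asarray(data, float), rowvar=False))
    d = np.sqrt(np.diag(p))
    rho = -p / np.outer(d, d)
    np.fill_diagonal(rho, 1.0)
    return rho

# simulated chain X -> Y -> Z: X and Z are marginally correlated
# but conditionally independent given Y
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
y = x + 0.5 * rng.normal(size=2000)
z = y + 0.5 * rng.normal(size=2000)
rho = partial_correlations(np.column_stack([x, y, z]))
print(rho[0, 2])  # near zero: no direct X-Z edge
```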
Application of preconditioned alternating direction method of multipliers in depth from focal stack
NASA Astrophysics Data System (ADS)
Javidnia, Hossein; Corcoran, Peter
2018-03-01
Postcapture refocusing effect in smartphone cameras is achievable using focal stacks. However, the accuracy of this effect is totally dependent on the combination of the depth layers in the stack. The accuracy of the extended depth of field effect in this application can be improved significantly by computing an accurate depth map, which has been an open issue for decades. To tackle this issue, a framework is proposed based on a preconditioned alternating direction method of multipliers for depth from the focal stack and synthetic defocus application. In addition to its ability to provide high structural accuracy, the optimization function of the proposed framework converges faster and to better solutions than state-of-the-art methods. The qualitative evaluation was done on 21 sets of focal stacks, and the optimization function was compared against five other methods. Later, 10 light-field image sets were transformed into focal stacks for quantitative evaluation purposes. Preliminary results indicate that the proposed framework has a better performance in terms of structural accuracy and optimization in comparison to the current state-of-the-art methods.
Vlaming, Hanneke; Molenaar, Thom M; van Welsem, Tibor; Poramba-Liyanage, Deepani W; Smith, Desiree E; Velds, Arno; Hoekman, Liesbeth; Korthout, Tessy; Hendriks, Sjoerd; Altelaar, A F Maarten; van Leeuwen, Fred
2016-12-06
Given the frequent misregulation of chromatin in cancer, it is important to understand the cellular mechanisms that regulate chromatin structure. However, systematic screening for epigenetic regulators is challenging and often relies on laborious assays or indirect reporter read-outs. Here we describe a strategy, Epi-ID, to directly assess chromatin status in thousands of mutants. In Epi-ID, chromatin status on DNA barcodes is interrogated by chromatin immunoprecipitation followed by deep sequencing, allowing for quantitative comparison of many mutants in parallel. Screening of a barcoded yeast knock-out collection for regulators of histone H3K79 methylation by Dot1 identified all known regulators as well as novel players and processes. These include histone deposition, homologous recombination, and adenosine kinase, which influences the methionine cycle. Gcn5, the acetyltransferase within the SAGA complex, was found to regulate histone methylation and H2B ubiquitination. The concept of Epi-ID is widely applicable and can be readily applied to other chromatin features.
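The quantitative comparison Epi-ID enables can be illustrated as a per-barcode enrichment score: ChIP barcode abundance over input abundance, each normalized to sequencing depth. A sketch with hypothetical mutant names and counts, not the published analysis pipeline:

```python
import math

def barcode_enrichment(chip_counts, input_counts):
    """Per-mutant chromatin-mark score as log2 of depth-normalized
    barcode abundance in the ChIP sample over the input sample.
    Negative scores suggest loss of the mark in that mutant."""
    chip_total = sum(chip_counts.values())
    input_total = sum(input_counts.values())
    return {bc: math.log2((chip_counts[bc] / chip_total) /
                          (input_counts[bc] / input_total))
            for bc in chip_counts}

# hypothetical barcode read counts: a methylation-dead mutant should
# drop out of an H3K79me ChIP relative to wild type
chip = {"dot1": 10, "wt": 500, "gcn5": 200}
inp = {"dot1": 100, "wt": 100, "gcn5": 100}
scores = barcode_enrichment(chip, inp)
print(scores["dot1"] < scores["wt"])  # True
```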
NASA Astrophysics Data System (ADS)
Gejji, Rohan M.
The management of combustion dynamics in gas turbine combustors has become more challenging as strict NOx/CO emission standards have led to engine operation in a narrow, lean regime. While premixed or partially premixed combustor configurations such as the Lean Premixed Pre-vaporized (LPP), Rich Quench Lean burn (RQL), and Lean Direct Injection (LDI) have shown a potential for reduced NOx emissions, they promote a coupling between acoustics, hydrodynamics and combustion that can lead to combustion instabilities. These couplings can be quite complex, and their detailed understanding is a pre-requisite to any engine development program and for the development of predictive capability for combustion instabilities through high-fidelity models. The overarching goal of this project is to assess the capability of high-fidelity simulation to predict combustion dynamics in low-emissions gas turbine combustors. A prototypical lean-direct-inject combustor was designed in a modular configuration so that a suitable geometry could be found by test. The combustor comprised a variable length air plenum and combustion chamber, air swirler, and fuel nozzle located inside a subsonic venturi. The venturi cross section and the fuel nozzle were consistent with previous studies. Test pressure was 1 MPa and variables included geometry and acoustic resonance, inlet temperatures, equivalence ratio, and type of liquid fuel. High-frequency pressure measurements in a well-instrumented metal chamber yielded frequencies and mode shapes as a function of inlet air temperature, equivalence ratio, fuel nozzle placement, and combustor acoustic resonances. The parametric survey was a significant effort, with over 105 tests on eight geometric configurations. A good dataset was obtained that could be used for both operating-point-dependent quantitative comparisons, and testing the ability of the simulation to predict more global trends. 
Results showed a very strong dependence of instability amplitude on the geometric configuration of the combustor, i.e., its acoustic resonance characteristics: measured pressure fluctuation amplitudes ranged from 5 kPa (0.5% of mean pressure) to 200 kPa (20% of mean pressure) depending on combustor geometry. The stability behavior also showed a consistent and pronounced dependence on equivalence ratio and inlet air temperature: instability amplitude increased with higher equivalence ratio and with lower inlet air temperature. A pronounced effect of fuel nozzle location on the combustion dynamics was also observed; combustion instabilities with the fuel nozzle at the venturi throat were stronger than in the configuration with the fuel nozzle 2.6 mm upstream of the throat. A second set of dynamics data was based on high-response-rate laser-based combustion diagnostics using an optically accessible combustor section. High-frequency measurements of OH*-chemiluminescence, OH-PLIF, and velocity fields using PIV were obtained at a relatively stable, low-equivalence-ratio case and at a less stable case at higher equivalence ratio. PIV measurements were performed at 5 kHz for non-reacting flow, but glare from the cylindrical quartz chamber limited the field of view to a small region in the combustor. Quantitative and qualitative comparisons were made for five different combinations of geometry and operating condition that yielded discriminating stability behavior in the experiment, with simulations that were carried out concurrently. Comparisons were made on the basis of trends and pressure-mode data, as well as with OH-PLIF measurements for the baseline geometry at equivalence ratios of 0.44 and 0.6. Overall, the ability of the simulation to match experimental data and trends was encouraging.
Dynamic Mode Decomposition (DMD) analysis was performed on two sets of computations - a global 2-step chemistry mechanism and an 18-step chemistry mechanism - and on the OH-PLIF images to allow comparison of dynamic patterns of heat release and OH distribution in the combustion zone. The DMD analysis was able to identify similar dominant unstable modes in the combustor. Recommendations for future work are based on the continued requirement for quantitative, spatio-temporally resolved data for direct comparison with computational efforts to develop predictive capabilities for combustion instabilities at relevant operating conditions. The discriminating instability behavior demonstrated for the prototypical combustor in this study is critical for any robust validation effort. Unit-physics-based scaling of the current effort to multi-element combustors, along with improvements in diagnostic techniques and analysis efforts, is recommended to advance understanding of the complex physics of the multi-phase, three-dimensional, turbulent combustion processes in the LDI combustor.
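The DMD analysis referenced above has a compact SVD-based formulation. Below is a minimal exact-DMD sketch in NumPy; this is an illustrative re-implementation under standard assumptions (snapshots in columns, optional rank truncation), not the thesis code:

```python
import numpy as np

def dmd_eigs(snapshots, rank=None):
    """Exact DMD: fit the best linear operator mapping each snapshot to the
    next one in a truncated POD basis; return its eigenvalues and modes."""
    X, Y = snapshots[:, :-1], snapshots[:, 1:]
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    if rank is not None:                         # optional truncation
        U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    B = Y @ Vh.conj().T @ np.diag(1.0 / s)       # Y V Sigma^-1
    Atilde = U.conj().T @ B                      # reduced linear operator
    eigvals, W = np.linalg.eig(Atilde)           # DMD eigenvalues (mode dynamics)
    modes = B @ W                                # exact DMD modes
    return eigvals, modes
```

For snapshots generated by a known linear map, the DMD eigenvalues recover the map's eigenvalues, which is the sense in which DMD identifies dominant dynamic modes.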
The U.S. Environmental Protection Agency (EPA) has provided recommended beach advisory values in its 2012 recreational water quality criteria (RWQC) for states wishing to use quantitative polymerase chain reaction (qPCR) for the monitoring of Enterococcus fecal indicator bacteria...
ERIC Educational Resources Information Center
Yilmaz, Kaya
2013-01-01
There has been much discussion about quantitative and qualitative approaches to research in different disciplines. In the behavioural and social sciences, these two paradigms are compared to reveal their relative strengths and weaknesses. But the debate about both traditions has commonly taken place in academic books. It is hard to find an article…
USDA-ARS's Scientific Manuscript database
Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...
The goal of this research was to compare the populations of 81 mold species in homes in the USA and UK using mould specific quantitative polymerase chain reaction (MSQPCR) technology. Dust samples were obtained from randomly selected homes in Great Britain (n=11). The mould populat...
ERIC Educational Resources Information Center
Hudson, Peter; Matthews, Kelly
2012-01-01
Women are underrepresented in science, technology, engineering and mathematics (STEM) areas in university settings; however this may be the result of attitude rather than aptitude. There is widespread agreement that quantitative problem-solving is essential for graduate competence and preparedness in science and other STEM subjects. The research…
Cell densities of the fecal pollution indicator genus, Enterococcus, were determined by a rapid (2-3 hr) quantitative PCR (QPCR) analysis based method in 100 ml water samples collected from recreational beaches on Lake Michigan and Lake Erie during the summer of 2003. Enumeration...
Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review
ERIC Educational Resources Information Center
Booker, Rhae-Ann Richardson
2010-01-01
This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…
ERIC Educational Resources Information Center
Jamison, Joseph A.
2013-01-01
This quantitative study sought to determine whether there were significant statistical differences between the performance scores of special education and general education students' scores when in team or solo-teaching environments as may occur in inclusively taught classrooms. The investigated problem occurs because despite education's stated…
Field, Christopher R.; Lubrano, Adam; Woytowitz, Morgan; Giordano, Braden C.; Rose-Pehrsson, Susan L.
2014-01-01
The direct liquid deposition of solution standards onto sorbent-filled thermal desorption tubes is used for the quantitative analysis of trace explosive vapor samples. The direct liquid deposition method yields a higher fidelity between the analysis of vapor samples and the analysis of solution standards than using separate injection methods for vapors and solutions, i.e., samples collected on vapor collection tubes and standards prepared in solution vials. Additionally, the method can account for instrumentation losses, which makes it ideal for minimizing variability and quantitative trace chemical detection. Gas chromatography with an electron capture detector is an instrumentation configuration sensitive to nitro-energetics, such as TNT and RDX, due to their relatively high electron affinity. However, vapor quantitation of these compounds is difficult without viable vapor standards. Thus, we eliminate the requirement for vapor standards by combining the sensitivity of the instrumentation with a direct liquid deposition protocol to analyze trace explosive vapor samples. PMID:25145416
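The quantitation step described above amounts to constructing a calibration curve from the liquid-deposited solution standards and inverting it for unknown vapor samples. A minimal sketch with hypothetical response values, assuming a linear detector response (ordinary least squares):

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(peak_area, slope, intercept):
    """Invert the calibration curve: detector response -> analyte mass."""
    return (peak_area - intercept) / slope
```

Because standards and samples pass through the same desorption tube and instrument, losses cancel in the calibration, which is the fidelity advantage the abstract describes.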
NASA Astrophysics Data System (ADS)
Yamamoto, Utako; Kobayashi, Tetsuo; Kito, Shinsuke; Koga, Yoshihiko
We have analyzed cerebral white matter using magnetic resonance diffusion tensor imaging (MR-DTI) to measure the diffusion anisotropy of water molecules. The goal of this study is the quantitative evaluation of schizophrenia. Diffusion tensor images are acquired for patients with schizophrenia and healthy comparison subjects, group-matched for age, sex, and handedness. Fiber tracking is performed on the superior longitudinal fasciculus for the comparison between the patient and comparison groups. We have analyzed and compared the cross-sectional area on the starting coronal plane and the mean and standard deviation of the fractional anisotropy and the apparent diffusion coefficient along fibers in the right and left hemispheres. In the right hemisphere, the cross-sectional areas in the patient group are significantly smaller than those in the comparison group. Furthermore, in the comparison group, the cross-sectional areas in the right hemisphere are significantly larger than those in the left hemisphere, whereas there is no significant difference in the patient group. These results suggest that the disruption of white matter integrity in schizophrenic patients may be evaluated quantitatively by comparing the cross-sectional areas of the superior longitudinal fasciculus in the right and left hemispheres.
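The anisotropy measures compared in this study have standard closed-form definitions in terms of the diffusion-tensor eigenvalues; a minimal sketch (the eigenvalues are assumed to have been computed already from the fitted tensor):

```python
import math

def mean_diffusivity(l1, l2, l3):
    """Mean diffusivity: the average eigenvalue of the diffusion tensor."""
    return (l1 + l2 + l3) / 3.0

def fractional_anisotropy(l1, l2, l3):
    """Standard FA definition: 0 for isotropic diffusion, 1 for diffusion
    restricted to a single axis."""
    norm = math.sqrt(l1 * l1 + l2 * l2 + l3 * l3)
    if norm == 0:
        return 0.0
    return math.sqrt(0.5 * ((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)) / norm
```

Averaging these quantities along tracked fibers gives the per-tract statistics compared between hemispheres and groups in the abstract.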
A Backscatter-Lidar Forward-Operator
NASA Astrophysics Data System (ADS)
Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland
2015-04-01
We have developed a forward-operator capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations on the basis of the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component for assimilating backscatter-lidar measurements. As many weather services already maintain networks of backscatter-lidars, such data are routinely acquired in an operational manner. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters, such as the aerosol size and both the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e., applying the backscatter-lidar forward-operator to model output.
Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model
Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.
2012-01-01
Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315
Topology Design for Directional Range Extension Networks with Antenna Blockage
2017-03-19
Shake, Thomas (MIT Lincoln Laboratory)
Using certain modeling approximations, the paper presents a quantitative analysis showing design trade-offs introduced by pod-based antenna blockages. Section IV develops quantitative relationships among key design elements and performance metrics; Section V considers some implications of the...
Klein, Brennan J; Li, Zhi; Durgin, Frank H
2016-04-01
What is the natural reference frame for seeing large-scale spatial scenes in locomotor action space? Prior studies indicate an asymmetric angular expansion in perceived direction in large-scale environments: Angular elevation relative to the horizon is perceptually exaggerated by a factor of 1.5, whereas azimuthal direction is exaggerated by a factor of about 1.25. Here participants made angular and spatial judgments when upright or on their sides to dissociate egocentric from allocentric reference frames. In Experiment 1, it was found that body orientation did not affect the magnitude of the up-down exaggeration of direction, suggesting that the relevant orientation reference frame for this directional bias is allocentric rather than egocentric. In Experiment 2, the comparison of large-scale horizontal and vertical extents was somewhat affected by viewer orientation, but only to the extent necessitated by the classic (5%) horizontal-vertical illusion (HVI) that is known to be retinotopic. Large-scale vertical extents continued to appear much larger than horizontal ground extents when observers lay sideways. When the visual world was reoriented in Experiment 3, the bias remained tied to the ground-based allocentric reference frame. The allocentric HVI is quantitatively consistent with differential angular exaggerations previously measured for elevation and azimuth in locomotor space. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Fan, Wen; Almirall, José
2014-03-01
A novel geometry configuration based on sorbent-coated glass microfibers packed within a glass capillary is used to sample volatile organic compounds, dynamically, in the headspace of an open system or in a partially open system to achieve quantitative extraction of the available volatiles of explosives with negligible breakthrough. Air is sampled through the newly developed sorbent-packed 2 cm long, 2 mm diameter capillary microextraction of volatiles (CMV) and subsequently introduced into a commercially available thermal desorption probe fitted directly into a GC injection port. A sorbent coating surface area of ∼5 × 10⁻² m², or 5,000 times greater than that of a single solid-phase microextraction (SPME) fiber, allows for fast (30 s), flow-through sampling of relatively large volumes using sampling flow rates of ∼1.5 L/min. A direct comparison of the new CMV extraction to a static (equilibrium) SPME extraction of the same headspace sample yields a 30 times improvement in sensitivity for the CMV when sampling nitroglycerine (NG), 2,4-dinitrotoluene (2,4-DNT), and diphenylamine (DPA) in a mixture containing a total mass of 500 ng of each analyte, when spiked into a liter-volume container. Calibration curves were established for all compounds studied, and the recovery was determined to be ∼1% or better after only 1 min of sampling time. Quantitative analysis is also possible using this extraction technique when the sampling temperature, flow rate, and time are kept constant between calibration curves and the sample.
Kelly, Martin J; Feeley, Iain H; O'Byrne, John M
2016-10-01
Direct to consumer (DTC) advertising, targeting the public over the physician, is an increasingly pervasive presence in medical clinics. It is trending toward a format of online interaction rather than that of traditional print and television advertising. We analyze patient-focused Web pages from the top 5 companies supplying prostheses for total hip arthroplasties, comparing them to the top 10 independent medical websites. Quantitative comparison is performed using the Journal of the American Medical Association (JAMA) benchmark and the DISCERN criteria; for comparative readability, we use the Flesch-Kincaid grade level, the Flesch reading ease, and the Gunning fog index. Content is analyzed for information on type of surgery and surgical approach. There is a statistically significant difference between the independent and DTC websites in both the mean DISCERN score (independent 74.6, standard deviation [SD] = 4.77; DTC 32.2, SD = 10.28; P = .0022) and the mean JAMA score (independent 3.45, SD = 0.49; DTC 1.9, SD = 0.74; P = .004). The difference between the readability scores is not statistically significant. The commercial content is found to be heavily biased in favor of the direct anterior approach and minimally invasive surgical techniques. We demonstrate that the quality of information on commercial websites is inferior to that of the independent sites. The advocacy of surgical approaches by industry to the patient group is a concern. This study underlines the importance of future regulation of commercial patient education Web pages. Copyright © 2016 Elsevier Inc. All rights reserved.
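The readability indices used in that comparison have published closed-form definitions; a minimal sketch, assuming the word, sentence, syllable, and complex-word (three or more syllables) counts have already been extracted from the text:

```python
def flesch_reading_ease(words, sentences, syllables):
    """Higher score = easier text (roughly 0-100 for ordinary prose)."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words, sentences, syllables):
    """US school-grade level required to understand the text."""
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

def gunning_fog(words, sentences, complex_words):
    """Gunning fog index; complex words have three or more syllables."""
    return 0.4 * ((words / sentences) + 100 * (complex_words / words))
```

For example, a 100-word sample with 5 sentences and 150 syllables scores about 59.6 on Flesch reading ease, i.e., "plain English" difficulty.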
Hout, David R; Schweitzer, Brock L; Lawrence, Kasey; Morris, Stephan W; Tucker, Tracy; Mazzola, Rosetta; Skelton, Rachel; McMahon, Frank; Handshoe, John; Lesperance, Mary; Karsan, Aly; Saltman, David L
2017-08-01
Patients with lung cancers harboring an activating anaplastic lymphoma kinase (ALK) rearrangement respond favorably to ALK inhibitor therapy. Fluorescence in situ hybridization (FISH) and immunohistochemistry (IHC) are validated and widely used screening tests for ALK rearrangements but both methods have limitations. The ALK RGQ RT-PCR Kit (RT-PCR) is a single tube quantitative real-time PCR assay for high throughput and automated interpretation of ALK expression. In this study, we performed a direct comparison of formalin-fixed paraffin-embedded (FFPE) lung cancer specimens using all three ALK detection methods. The RT-PCR test (diagnostic cut-off ΔCt of ≤8) was shown to be highly sensitive (100%) when compared to FISH and IHC. Sequencing of RNA detected full-length ALK transcripts or EML4-ALK and KIF5B-ALK fusion variants in discordant cases in which ALK expression was detected by the ALK RT-PCR test but negative by FISH and IHC. The overall specificity of the RT-PCR test for the detection of ALK in cases without full-length ALK expression was 94% in comparison to FISH and sequencing. These data support the ALK RT-PCR test as a highly efficient and reliable diagnostic screening approach to identify patients with non-small cell lung cancer whose tumors are driven by oncogenic ALK.
Retention time alignment of LC/MS data by a divide-and-conquer algorithm.
Zhang, Zhongqi
2012-04-01
Liquid chromatography-mass spectrometry (LC/MS) has become the method of choice for characterizing complex mixtures. These analyses often involve quantitative comparison of components in multiple samples. To achieve automated sample comparison, the components of interest must be detected and identified, and their retention times aligned and peak areas calculated. This article describes a simple pairwise iterative retention time alignment algorithm, based on the divide-and-conquer approach, for alignment of ion features detected in LC/MS experiments. In this iterative algorithm, ion features in the sample run are first aligned with features in the reference run by applying a single constant shift of retention time. The sample chromatogram is then divided into two shorter chromatograms, which are aligned to the reference chromatogram the same way. Each shorter chromatogram is further divided into even shorter chromatograms. This process continues until each chromatogram is sufficiently narrow so that ion features within it have a similar retention time shift. In six pairwise LC/MS alignment examples containing a total of 6507 confirmed true corresponding feature pairs with retention time shifts up to five peak widths, the algorithm successfully aligned these features with an error rate of 0.2%. The alignment algorithm is demonstrated to be fast, robust, fully automatic, and superior to other algorithms. After alignment and gap-filling of detected ion features, their abundances can be tabulated for direct comparison between samples.
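The divide-and-conquer recursion described above can be sketched in a few lines. This is a simplified toy, not the published implementation: the candidate-shift grid, the nearest-feature matching cost, and the stopping length are all illustrative choices.

```python
def best_shift(sample, reference, shifts):
    """Constant retention-time shift minimizing total distance of each
    sample feature to its nearest reference feature."""
    def cost(s):
        return sum(min(abs((t - s) - r) for r in reference) for t in sample)
    return min(shifts, key=cost)

def align(sample, reference, shifts=None, min_len=2):
    """Divide-and-conquer alignment: shift the whole segment by one constant,
    then split it in two and recurse until segments are short."""
    if shifts is None:
        shifts = [round(i * 0.01, 2) for i in range(-100, 101)]  # toy grid
    if not sample:
        return []
    s = best_shift(sample, reference, shifts)
    shifted = [t - s for t in sample]
    if len(sample) <= min_len:
        return shifted
    mid = len(sample) // 2
    return (align(shifted[:mid], reference, shifts, min_len)
            + align(shifted[mid:], reference, shifts, min_len))
```

The recursion captures the key idea of the algorithm: a single constant shift per segment, refined locally as the segments get narrower, so slowly varying retention-time drift is absorbed level by level.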
Booth, Brandon D; Vilt, Steven G; McCabe, Clare; Jennings, G Kane
2009-09-01
This Article presents a quantitative comparison of the frictional performance for monolayers derived from n-alkanethiolates on gold and n-alkyl trichlorosilanes on silicon. Monolayers were characterized by pin-on-disk tribometry, contact angle analysis, ellipsometry, and electrochemical impedance spectroscopy (EIS). Pin-on-disk microtribometry provided frictional analysis at applied normal loads from 10 to 1000 mN at a speed of 0.1 mm/s. At low loads (10 mN), methyl-terminated n-alkanethiolate self-assembled monolayers (SAMs) exhibited a 3-fold improvement in coefficient of friction over SAMs with hydroxyl- or carboxylic-acid-terminated surfaces. For monolayers prepared from both n-alkanethiols on gold and n-alkyl trichlorosilanes on silicon, a critical chain length of at least eight carbons is required for beneficial tribological performance at an applied load of 9.8 mN. Evidence for disruption of chemisorbed alkanethiolate SAMs with chain lengths n
Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I
2017-01-20
There are numerous applications of quantitative PCR in both diagnostics and basic research. As in many other techniques, the basis of quantification is comparison between different specimens (unknown, and known or reference) of the same entity. When the aim is to compare the real quantities of different species in samples, their separate, precise absolute quantification cannot be avoided. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approaches. It requires a plasmid standard containing the sequences of both amplicons to be compared (e.g. the target of interest and the endogenous control), which can serve as a reference sample with equal template copies for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied to transposon gene copy measurements, as well as to comparisons of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
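The ΔΔCt arithmetic behind the Ct shift method can be made concrete. A minimal sketch, assuming ideal two-fold amplification efficiency; the dual-insert plasmid standard, which carries equal copies of both amplicons, supplies the calibrating ΔCt:

```python
def template_ratio(ct_target_sample, ct_control_sample,
                   ct_target_standard, ct_control_standard, efficiency=2.0):
    """Ratio of target to control templates in an unknown sample.

    The plasmid standard contains both amplicons in equal copy number, so its
    delta-Ct calibrates away amplicon-specific offsets (the 'Ct shift').
    """
    d_ct_sample = ct_target_sample - ct_control_sample
    d_ct_standard = ct_target_standard - ct_control_standard
    return efficiency ** -(d_ct_sample - d_ct_standard)  # the delta-delta-Ct formula
```

For instance, a sample ΔCt of 2 cycles against a standard ΔCt of 0 implies the target template is present at one quarter the copy number of the control.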
Lucas, James E; Siegel, Justin B
2015-01-01
Enzyme active site residues are often highly conserved, indicating a significant role in function. In this study, we quantitate the functional contribution of all conserved molecular interactions occurring within a Michaelis complex for mannitol 2-dehydrogenase derived from Pseudomonas fluorescens (pfMDH). Through systematic mutagenesis of active site residues, we reveal that the molecular interactions in pfMDH mediated by highly conserved residues not directly involved in reaction chemistry can be as important to catalysis as those directly involved in the reaction chemistry. This quantitative analysis of the molecular interactions within the pfMDH active site provides direct insight into the functional role of each molecular interaction, several of which were unexpected based on canonical sequence conservation and structural analyses. PMID:25752240
NASA Astrophysics Data System (ADS)
Miao, Linling; Young, Charles D.; Sing, Charles E.
2017-07-01
Brownian Dynamics (BD) simulations are a standard tool for understanding the dynamics of polymers in and out of equilibrium. Quantitative comparison can be made to rheological measurements of dilute polymer solutions, as well as direct visual observations of fluorescently labeled DNA. The primary computational challenge with BD is the expensive calculation of hydrodynamic interactions (HI), which are necessary to capture physically realistic dynamics. The full HI calculation, performed via a Cholesky decomposition every time step, scales with the length of the polymer as O(N^3). This limits the calculation to a few hundred simulated particles. A number of approximations in the literature can lower this scaling to O(N^2 - N^2.25), and explicit solvent methods scale as O(N); however, both incur a significant constant per-time-step computational cost. Despite this progress, there remains a need for new or alternative methods of calculating hydrodynamic interactions; large polymer chains or semidilute polymer solutions remain computationally expensive. In this paper, we introduce an alternative method for calculating approximate hydrodynamic interactions. Our method relies on an iterative scheme to establish self-consistency between a hydrodynamic matrix that is averaged over the simulation and the hydrodynamic matrix used to run the simulation. Comparison to standard BD simulation and polymer theory results demonstrates that this method quantitatively captures both equilibrium and steady-state dynamics after only a few iterations. The use of an averaged hydrodynamic matrix allows the computationally expensive Brownian noise calculation to be performed infrequently, so that it is no longer the bottleneck of the simulation calculations. We also investigate limitations of this conformational averaging approach in ring polymers.
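The iterative self-consistency scheme can be illustrated schematically. In this toy sketch a scalar stands in for the conformationally averaged hydrodynamic matrix, and `run_sim` is a placeholder for a full BD run that returns the matrix averaged over the trajectory it produced; none of this is the authors' implementation:

```python
def self_consistent_average(run_sim, m0, tol=1e-10, max_iter=200):
    """Fixed-point loop: iterate until the quantity averaged over a simulation
    run matches the quantity that was used to drive that run."""
    m = m0
    for _ in range(max_iter):
        m_avg = run_sim(m)        # average measured during a run driven by m
        if abs(m_avg - m) < tol:  # self-consistent: input equals output
            return m_avg
        m = m_avg
    return m
```

With a contractive `run_sim`, the loop converges in a few iterations, mirroring the paper's observation that only a few iterations are needed to capture the dynamics.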
Fee, Timothy; Downs, Crawford; Eberhardt, Alan; Zhou, Yong; Berry, Joel
2016-07-01
It is well documented that electrospun tissue engineering scaffolds can be fabricated with variable degrees of fiber alignment to produce scaffolds with anisotropic mechanical properties. Several attempts have been made to quantify the degree of fiber alignment within an electrospun scaffold using image-based methods. However, these methods are limited by the inability to produce a quantitative measure of alignment that can be used to make comparisons across publications. Therefore, we have developed a new approach to quantifying the alignment present within a scaffold from scanning electron microscopic (SEM) images. The alignment is determined by using the Sobel approximation of the image gradient to determine the distribution of gradient angles within an image. These data were fit to a Von Mises distribution to find the dispersion parameter κ, which was used as a quantitative measure of fiber alignment. We fabricated four groups of electrospun polycaprolactone (PCL) + Gelatin scaffolds with alignments ranging from κ = 1.9 (aligned) to κ = 0.25 (random) and tested our alignment quantification method on these scaffolds. It was found that our alignment quantification method could distinguish between scaffolds of different alignments more accurately than two other published methods. Additionally, the alignment parameter κ was found to be a good predictor of the mechanical anisotropy of our electrospun scaffolds. The ability to quantify fiber alignment within scaffolds and make direct comparisons of scaffold fiber alignment across publications can reduce ambiguity between published results where cells are cultured on "highly aligned" fibrous scaffolds. This could have important implications for characterizing mechanics and cellular behavior on aligned tissue engineering scaffolds. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 1680-1686, 2016.
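The gradient-orientation approach can be sketched in a few lines. This is a simplified re-implementation, not the authors' code: gradient angles come from a plain 3x3 Sobel operator, and the κ estimate uses the standard piecewise approximation to the Von Mises maximum-likelihood fit, with angle doubling to handle axial (mod π) fiber orientations:

```python
import math

def sobel_angles(img):
    """Gradient orientation at each interior pixel via the 3x3 Sobel operator."""
    h, w = len(img), len(img[0])
    angles = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            if gx or gy:  # skip flat pixels with no defined orientation
                angles.append(math.atan2(gy, gx))
    return angles

def vonmises_kappa(angles):
    """Estimate the Von Mises dispersion kappa from the mean resultant length,
    using the standard piecewise approximation to the ML estimator."""
    # doubling maps axial (mod pi) orientations onto the full circle
    c = sum(math.cos(2 * a) for a in angles) / len(angles)
    s = sum(math.sin(2 * a) for a in angles) / len(angles)
    r = math.hypot(c, s)  # mean resultant length: 0 random, 1 fully aligned
    if r < 0.53:
        return 2*r + r**3 + 5*r**5/6
    if r < 0.85:
        return -0.4 + 1.39*r + 0.43/(1 - r)
    return 1/(r**3 - 4*r**2 + 3*r)
```

A larger κ means a tighter orientation distribution, matching the paper's use of κ = 1.9 for aligned and κ = 0.25 for random scaffolds.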
Gondim Teixeira, Pedro Augusto; Leplat, Christophe; Chen, Bailiang; De Verbizier, Jacques; Beaumont, Marine; Badr, Sammy; Cotten, Anne; Blum, Alain
2017-12-01
To evaluate intra-tumour and striated muscle T1 value heterogeneity and the influence of different methods of T1 estimation on the variability of quantitative perfusion parameters. Eighty-two patients with a histologically confirmed musculoskeletal tumour were prospectively included in this study and, with ethics committee approval, underwent contrast-enhanced MR perfusion and T1 mapping. T1 value variations in viable tumour areas and in normal-appearing striated muscle were assessed. In 20 cases, normal muscle perfusion parameters were calculated using three different methods: signal based and gadolinium concentration based on fixed and variable T1 values. Tumour and normal muscle T1 values were significantly different (p = 0.0008). T1 value heterogeneity was higher in tumours than in normal muscle (variation of 19.8% versus 13%). The T1 estimation method had a considerable influence on the variability of perfusion parameters. Fixed T1 values yielded higher coefficients of variation than variable T1 values (mean 109.6 ± 41.8% and 58.3 ± 14.1% respectively). Area under the curve was the least variable parameter (36%). T1 values in musculoskeletal tumours are significantly different from, and more heterogeneous than, those of normal muscle. Patient-specific T1 estimation is needed for direct inter-patient comparison of perfusion parameters. • T1 value variation in musculoskeletal tumours is considerable. • T1 values in muscle and tumours are significantly different. • Patient-specific T1 estimation is needed for comparison of inter-patient perfusion parameters. • Technical variation is higher in permeability than semiquantitative perfusion parameters.
Validation of a quantitative Eimeria spp. PCR for fresh droppings of broiler chickens.
Peek, H W; Ter Veen, C; Dijkman, R; Landman, W J M
2017-12-01
A quantitative Polymerase Chain Reaction (qPCR) for the seven chicken Eimeria spp. was modified and validated for direct use on fresh droppings. The analytical specificity of the qPCR on droppings was 100%. Its analytical sensitivity (non-sporulated oocysts/g droppings) was 41 for E. acervulina, ≤2900 for E. brunetti, 710 for E. praecox, 1500 for E. necatrix, 190 for E. tenella, 640 for E. maxima, and 1100 for E. mitis. Field validation of the qPCR was done using droppings with non-sporulated oocysts from 19 broiler flocks. To reduce the number of qPCR tests, five grams of each pooled sample (consisting of ten fresh droppings) per time point were blended into one mixed sample. Comparison of the oocysts per gram (OPG)-counting method with the qPCR using pooled samples (n = 1180) yielded a Pearson's correlation coefficient of 0.78 (95% CI: 0.76-0.80), and a Pearson's correlation coefficient of 0.76 (95% CI: 0.70-0.81) using mixed samples (n = 236). Comparison of the average of the OPG-counts of the five pooled samples with the mixed sample per time point (n = 236) showed a Pearson's correlation coefficient (R) of 0.94 (95% CI: 0.92-0.95) for the OPG-counting method and 0.87 (95% CI: 0.84-0.90) for the qPCR. This indicates that mixed samples are practically equivalent to the mean of five pooled samples. The good correlation between the OPG-counting method and the qPCR was further confirmed by the visual agreement between the total oocyst/g shedding patterns measured with both techniques in the 19 broiler flocks using the mixed samples.
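Correlation coefficients with 95% confidence intervals of the kind reported above are conventionally obtained via the Fisher z-transformation; a minimal sketch (the helper name is ours):

```python
import numpy as np
from scipy import stats

def pearson_with_ci(x, y, alpha=0.05):
    """Pearson's correlation coefficient with a (1 - alpha) confidence
    interval computed through the Fisher z-transformation."""
    r, _p = stats.pearsonr(x, y)
    n = len(x)
    z = np.arctanh(r)                       # Fisher transform of r
    se = 1.0 / np.sqrt(n - 3)               # approximate standard error of z
    zcrit = stats.norm.ppf(1.0 - alpha / 2.0)
    lo, hi = np.tanh(z - zcrit * se), np.tanh(z + zcrit * se)
    return r, (lo, hi)
```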
Gruen, D.M.; Young, C.E.; Pellin, M.J.
1989-12-26
A charged particle spectrometer is described for performing ultrasensitive quantitative analysis of selected atomic components removed from a sample. Significant improvements in performing energy and angular refocusing spectroscopy are accomplished by means of a two dimensional structure for generating predetermined electromagnetic field boundary conditions. Both resonance and non-resonance ionization of selected neutral atomic components allow accumulation of increased chemical information. A multiplexed operation between a SIMS mode and a neutral atomic component ionization mode with EARTOF analysis enables comparison of chemical information from secondary ions and neutral atomic components removed from the sample. An electronic system is described for switching high level signals, such as SIMS signals, directly to a transient recorder and through a charge amplifier to the transient recorder for a low level signal pulse counting mode, such as for a neutral atomic component ionization mode. 12 figs.
Probing multiscale transport and inhomogeneity in lithium-ion cells using in-situ neutron methods
Zhou, Hui; An, Ke; Allu, Srikanth; ...
2016-01-01
Here, we demonstrate for the first time the lithiation process in graphitic anodes using in-situ neutron radiography in a pouch cell format. The neutron absorption contrast shows a direct correlation between degree of lithiation and the discharge voltage plateau. Furthermore, we provide a semi-quantitative comparison between the observed spatial variations of the neutron attenuation line profile across the graphite electrode and the calculated lithium concentration profiles computed under similar electrochemical discharge conditions. In conjunction, in-situ neutron diffraction of a similar pouch cell under an identical test protocol was carried out to obtain information about the local phase changes upon lithiation. Combined in-situ radiography and diffraction opens up a powerful nondestructive method to understand the multi-scale nature of lithium transport and degradation in practical lithium-ion cells.
Clinical importance of voluntary and induced Bennett movement.
Tupac, R G
1978-07-01
A total of 136 dentulous patients were divided into three groups for purposes of quantitative pantographic comparison of voluntary and induced Bennett movement. The effects of patient age and operator experience on recording the Bennett movement were also studied. The results indicate that, for patients studied with Bennett movement induced in the manner described: 1. Experienced operators can obtain more induced Bennett movement than inexperienced operators. 2. Inducing Bennett movement has a greater effect on the immediate side shift component than it has on the progressive side shift component. 3. For older individuals the amount and direction of induced immediate side shift is greater than for younger patients; this difference is statistically highly significant and therefore clinically important. In conclusion, if the objective of a pantographic survey is to record the complete capacity of the joint to move, lateral jaw movements must be induced.
Deciphering the kinetic structure of multi-ion plasma shocks
Keenan, Brett D.; Simakov, Andrei N.; Chacón, Luis; ...
2017-11-15
Here, strong collisional shocks in multi-ion plasmas are featured in many high-energy-density environments, including inertial confinement fusion implosions. However, their basic structure and its dependence on key parameters (e.g., the Mach number and the plasma ion composition) are poorly understood, and inconsistencies in that regard remain in the literature. In particular, the shock width's dependence on the Mach number has been hotly debated for decades. Using a high-fidelity Vlasov-Fokker-Planck code, iFP, and direct comparisons to multi-ion hydrodynamic simulations and semianalytic predictions, we resolve the structure of steady-state planar shocks in D-3He plasmas. Additionally, we derive and confirm with kinetic simulations a quantitative description of the dependence of the shock width on the Mach number and initial ion concentration.
NASA Astrophysics Data System (ADS)
Wassel, A. T.; Shih, W. C. L.; Curtis, R. J.
1981-01-01
Boundary layer transition and surface heating distributions on graphite, fine-weave carbon-carbon, and metallic nosetip materials were derived from surface temperature responses measured in nitrogen environments during both free-flight and track-guided testing in the AEDC Hyperballistics Range/Track G. Innovative test procedures were developed, and heat transfer results were validated against established theory through experiments using a super-smooth tungsten model. Quantitative definitions of mean transition front locations were established by deriving heat flux distributions from measured temperatures, and comparisons were made with existing nosetip transition correlations. Qualitative transition locations were inferred directly from temperature distributions to investigate preferred orientations on fine-weave nosetips. Levels of roughness-augmented heat transfer were generally shown to be below values predicted by state-of-the-art methods.
Strength and reversibility of stereotypes for a rotary control with linear scales.
Chan, Alan H S; Chan, W H
2008-02-01
Using real mechanical controls, this experiment studied strength and reversibility of direction-of-motion stereotypes and response times for a rotary control with horizontal and vertical scales. Thirty-eight engineering undergraduates (34 men and 4 women) ages 23 to 47 years (M=29.8, SD=7.7) took part in the experiment voluntarily. The effects of instruction of change of pointer position and control plane on movement compatibility were analyzed with precise quantitative measures of strength and a reversibility index of stereotype. Comparisons of the strength and reversibility values of these two configurations with those of rotary control-circular display, rotary control-digital counter, four-way lever-circular display, and four-way lever-digital counter were made. The results of this study provided significant implications for the industrial design of control panels for improved human performance.
Pre-clinical characterization of tissue engineering constructs for bone and cartilage regeneration
Trachtenberg, Jordan E.; Vo, Tiffany N.; Mikos, Antonios G.
2014-01-01
Pre-clinical animal models play a crucial role in the translation of biomedical technologies from the bench top to the bedside. However, there is a need for improved techniques to evaluate implanted biomaterials within the host, including consideration of the care and ethics associated with animal studies, as well as the evaluation of host tissue repair in a clinically relevant manner. This review discusses non-invasive, quantitative, and real-time techniques for evaluating host-materials interactions, quality and rate of neotissue formation, and functional outcomes of implanted biomaterials for bone and cartilage tissue engineering. Specifically, a comparison will be presented for pre-clinical animal models, histological scoring systems, and non-invasive imaging modalities. Additionally, novel technologies to track delivered cells and growth factors will be discussed, including methods to directly correlate their release with tissue growth. PMID:25319726
Electronic energy level alignment at metal-molecule interfaces with a GW approach
NASA Astrophysics Data System (ADS)
Tamblyn, Isaac; Darancet, Pierre; Quek, Su Ying; Bonev, Stanimir A.; Neaton, Jeffrey B.
2011-11-01
Using density functional theory and many-body perturbation theory within a GW approximation, we calculate the electronic structure of a metal-molecule interface consisting of benzene diamine (BDA) adsorbed on Au(111). Through direct comparison with photoemission data, we show that a conventional G0W0 approach can underestimate the energy of the adsorbed molecular resonance relative to the Au Fermi level by up to 0.8 eV. The source of this discrepancy is twofold: a 0.7 eV underestimate of the gas phase ionization energy (IE), and a 0.2 eV overestimate of the Au work function. Refinements to self-energy calculations within the GW framework that account for deviations in both the Au work function and BDA gas-phase IE can result in an interfacial electronic level alignment in quantitative agreement with experiment.
Thermalization of oscillator chains with onsite anharmonicity and comparison with kinetic theory
Mendl, Christian B.; Lu, Jianfeng; Lukkarinen, Jani
2016-12-02
We perform microscopic molecular dynamics simulations of particle chains with an onsite anharmonicity to study relaxation of spatially homogeneous states to equilibrium, and directly compare the simulations with the corresponding Boltzmann-Peierls kinetic theory. The Wigner function serves as a common interface between the microscopic and kinetic level. We demonstrate quantitative agreement after an initial transient time interval. In particular, besides energy conservation, we observe the additional quasiconservation of the phonon density, defined via an ensemble average of the related microscopic field variables and exactly conserved by the kinetic equations. On superkinetic time scales, density quasiconservation is lost while energy remains conserved, and we find evidence for eventual relaxation of the density to its canonical ensemble value. However, the precise mechanism remains unknown and is not captured by the Boltzmann-Peierls equations.
NASA Technical Reports Server (NTRS)
Tedder, S. A.; OByrne, S.; Danehy, P. M.; Cutler, A. D.
2005-01-01
The dual-pump coherent anti-Stokes Raman spectroscopy (CARS) method was used to measure temperature and the absolute mole fractions of N2, O2 and H2 in a supersonic combustor. Experiments were conducted in the NASA Langley Direct-Connect Supersonic Combustion Test Facility. CARS measurements were performed at the facility nozzle exit and at three planes downstream of fuel injection. Processing the CARS measurements produced maps of the mean temperature, as well as quantitative N2 and O2 and qualitative H2 mean mole fraction fields at each plane. The CARS measurements were also used to compute correlations between fluctuations of the different simultaneously measured parameters. Comparisons were made between this 90 degree angle fuel injection case and a 30 degree fuel injection case previously presented at the 2004 Reno AIAA Meeting.
Objective analysis of pseudostress over the Indian Ocean using a direct-minimization approach
NASA Technical Reports Server (NTRS)
Legler, David M.; Navon, I. M.; O'Brien, James J.
1989-01-01
A technique not previously used in objective analysis of meteorological data is used here to produce monthly average surface pseudostress data over the Indian Ocean. An initial guess field is derived and a cost functional is constructed with five terms: approximation to the initial guess, approximation to climatology, a smoothness parameter, and two kinematic terms. The functional is minimized using a conjugate-gradient technique, and the weight for the climatology term controls the overall balance of influence between the climatology and the initial guess. Results from various weight combinations are presented for January and July 1984. Quantitative and qualitative comparisons to the subjective analysis are made to find which weight combination provides the best results. The weight on the approximation to climatology is found to balance the influence of the original field and climatology.
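A toy one-dimensional version of such a direct-minimization analysis, with quadratic penalties toward the initial guess and climatology plus a smoothness term (the paper's two kinematic terms are omitted, and the function name and weights are illustrative):

```python
import numpy as np
from scipy.optimize import minimize

def objective_analysis(first_guess, climatology,
                       w_guess=1.0, w_clim=0.5, w_smooth=0.1):
    """Minimize a cost functional with terms for closeness to the
    initial guess, closeness to climatology, and smoothness (squared
    second differences), using a conjugate-gradient method."""
    def cost(u):
        return (w_guess * np.sum((u - first_guess) ** 2)
                + w_clim * np.sum((u - climatology) ** 2)
                + w_smooth * np.sum(np.diff(u, 2) ** 2))
    result = minimize(cost, first_guess, method="CG")
    return result.x
```

Raising w_clim pulls the analysis toward climatology and setting it to zero reproduces the initial guess, mirroring the balancing role the abstract ascribes to the climatology weight.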
Hortin, Mitchell S; Bowden, Anton E
2016-11-01
Data has been published that quantifies the nonlinear, anisotropic material behaviour and pre-strain behaviour of the anterior longitudinal, supraspinous (SSL), and interspinous ligaments of the human lumbar spine. Additionally, data has been published on localized material properties of the SSL. These results have been incrementally incorporated into a previously validated finite element model of the human lumbar spine. Results suggest that the effects of increased ligament model fidelity on bone strain energy were moderate and the effects on disc pressure were slight, and do not justify a change in modelling strategy for most clinical applications. There were significant effects on the ligament stresses of the ligaments that were directly modified, suggesting that these phenomena should be included in FE models where ligament stresses are the desired metric.
GAO, L.; HAGEN, N.; TKACZYK, T.S.
2012-01-01
We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores' emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range over that achievable with filter-based imaging. PMID:22356127
dCLIP: a computational approach for comparative CLIP-seq analyses
2014-01-01
Although comparison of RNA-protein interaction profiles across different conditions has become increasingly important to understanding the function of RNA-binding proteins (RBPs), few computational approaches have been developed for quantitative comparison of CLIP-seq datasets. Here, we present an easy-to-use command line tool, dCLIP, for quantitative CLIP-seq comparative analysis. The two-stage method implemented in dCLIP, including a modified MA normalization method and a hidden Markov model, is shown to be able to effectively identify differential binding regions of RBPs in four CLIP-seq datasets, generated by HITS-CLIP, iCLIP and PAR-CLIP protocols. dCLIP is freely available at http://qbrc.swmed.edu/software/. PMID:24398258
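For intuition, the M (log-ratio) and A (mean log-intensity) quantities underlying MA normalization can be computed in a few lines. This median-shift sketch is a simplification, not dCLIP's modified MA method or its HMM stage, and every name in it is illustrative:

```python
import numpy as np

def ma_normalize(x, y, pseudocount=1.0):
    """Bare-bones MA normalization of two count vectors (e.g., CLIP-seq
    read counts in common bins): shift y so the median log-ratio M over
    the better-covered bins is zero."""
    x = np.asarray(x, float) + pseudocount
    y = np.asarray(y, float) + pseudocount
    m = np.log2(x) - np.log2(y)           # M: log fold-change per bin
    a = 0.5 * (np.log2(x) + np.log2(y))   # A: mean log intensity per bin
    expressed = a > np.median(a)          # restrict to better-covered bins
    shift = np.median(m[expressed])
    return y * 2.0 ** shift - pseudocount
```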
Direct comparison of optical lattice clocks with an intercontinental baseline of 9000 km.
Hachisu, H; Fujieda, M; Nagano, S; Gotoh, T; Nogami, A; Ido, T; Falke, St; Huntemann, N; Grebing, C; Lipphardt, B; Lisdat, Ch; Piester, D
2014-07-15
We have demonstrated a direct frequency comparison between two ⁸⁷Sr lattice clocks operated in intercontinentally separated laboratories in real time. Two-way satellite time and frequency transfer technique, based on the carrier-phase, was employed for a direct comparison, with a baseline of 9000 km between Japan and Germany. A frequency comparison was achieved for 83,640 s, resulting in a fractional difference of (1.1±1.6)×10⁻¹⁵, where the statistical part is the largest contributor to the uncertainty. This measurement directly confirms the agreement of the two optical frequency standards on an intercontinental scale.
Comparing Observations, 1st Experimental Edition.
ERIC Educational Resources Information Center
Butts, David P.
Objectives for this module include the ability to: (1) order objects by comparing a property which the objects have in common (such as length, area, volume or mass), (2) describe objects (length, area, volume, mass, etc.) by comparing them quantitatively using either arbitrary units of comparison or standard units of comparison, and (3) describe…
ERIC Educational Resources Information Center
Dimitropoulos, Anastasia; Ho, Alan Y.; Klaiman, Cheryl; Koenig, Kathy; Schultz, Robert T.
2009-01-01
In order to investigate unique and shared characteristics and to determine factors predictive of group classification, quantitative comparisons of behavioral and emotional problems were assessed using the Developmental Behavior Checklist (DBC-P) and the Vineland Adaptive Behavior Scales in autistic disorder, Williams syndrome (WS), and…
Amin-Hanjani, Sepideh; Singh, Amritha; Rifai, Hashem; Thulborn, Keith R; Alaraj, Ali; Aletich, Victor; Charbel, Fady T
2013-12-01
The optimal revascularization strategy for symptomatic adult moyamoya remains controversial. Whereas direct bypass offers immediate revascularization, indirect bypass can effectively induce collaterals over time. Using angiography and quantitative magnetic resonance angiography, we examined the relative contributions of direct and indirect bypass in moyamoya patients after combined direct superficial temporal artery-to-middle cerebral artery (STA-MCA) bypass and indirect encephaloduroarteriosynangiosis (EDAS). A retrospective review of moyamoya patients undergoing combined STA-MCA bypass and EDAS was conducted, excluding pediatric patients and hemorrhagic presentation. Patients with quantitative magnetic resonance angiography measurements of the direct bypass immediately and > 6 months postoperatively were included. Angiographic follow-up, when available, was used to assess EDAS collaterals at similar time intervals. Of 16 hemispheres in 13 patients, 11 (69%) demonstrated a significant (> 50%) decline in direct bypass flow at > 6 months compared with baseline, averaging a drop from 99 ± 35 to 12 ± 7 mL/min. Conversely, angiography in these hemispheres demonstrated prominent indirect collaterals, in concert with shrinkage of the STA graft. Decline in flow was apparent at a median of 9 months but was evident as early as 2 to 3 months. In this small cohort, a reciprocal relationship between direct STA bypass flow and indirect EDAS collaterals frequently occurred. This substantiates the notion that combined direct/indirect bypass can provide temporally complementary revascularization.
Malkyarenko, Dariya I; Chenevert, Thomas L
2014-12-01
To describe an efficient procedure to empirically characterize gradient nonlinearity and correct for the corresponding apparent diffusion coefficient (ADC) bias on a clinical magnetic resonance imaging (MRI) scanner. Spatial nonlinearity scalars for individual gradient coils along the superior and right directions were estimated via diffusion measurements of an isotropic ice-water phantom. A digital nonlinearity model from an independent scanner, described in the literature, was rescaled by system-specific scalars to approximate 3D bias correction maps. Correction efficacy was assessed by comparison to unbiased ADC values measured at isocenter. Empirically estimated nonlinearity scalars were confirmed by geometric distortion measurements of a regular grid phantom. The applied nonlinearity correction for arbitrarily oriented diffusion gradients reduced ADC bias from 20% down to 2% at clinically relevant offsets, both for isotropic and anisotropic media. Identical performance was achieved using either corrected diffusion-weighted imaging (DWI) intensities or corrected b-values for each direction in brain and ice-water. Direction-average trace image correction was adequate only for isotropic media. Empiric scalar adjustment of an independent gradient nonlinearity model adequately described DWI bias for a clinical scanner. The observed efficiency of the implemented ADC bias correction quantitatively agreed with previous theoretical predictions and numerical simulations. The described procedure provides an independent benchmark for nonlinearity bias correction of clinical MRI scanners.
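The corrected-b-value route can be illustrated with a mono-exponential signal model. The quadratic scaling of the effective b-value with a local gradient scaling factor c is the standard first-order picture, and the function names are our sketch, not the paper's implementation:

```python
import numpy as np

def adc(s0, sb, b):
    """Mono-exponential apparent diffusion coefficient from two signals."""
    return np.log(s0 / sb) / b

def adc_corrected(s0, sb, b_nominal, c):
    """ADC computed with a per-voxel effective b-value; c is the local
    gradient nonlinearity scaling (1 at isocenter), so b_eff = c**2 * b."""
    return adc(s0, sb, (c ** 2) * b_nominal)
```

Off isocenter, using the nominal b-value inflates the apparent ADC by roughly a factor of c**2, which is the bias the rescaled 3D correction maps remove.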
NASA Astrophysics Data System (ADS)
Warriner, Heidi E.; Safinya, Cyrus R.
1997-03-01
Using two complementary techniques, we have measured repulsive interactions in the L_α phase of very flexible membranes composed of the surfactant C12E5 and small amounts of polymer-lipids derived from polyethylene glycol (PEG-DMPE 5000, PEG-DMPE 2000 and PEG-DMPE 550). In the first method, the lamellar repeat distance of samples in equilibrium with a dextran solution of known osmotic pressure is determined, yielding a direct measurement of pressure versus distance. These data immediately differentiate the repulsive interaction between flexible polymer-decorated membranes from the polymer-brush forces found in rigid lamellar systems. In the second method, fits to high-resolution x-ray data yield the η parameter, proportional to (κB)^{-1/2}, where B is the layer compressional modulus and κ is the bending rigidity of a single membrane. Combining the two types of data to eliminate B, one can quantitatively determine the κ of a decorated membrane as a function of polymer-lipid concentration. For the bare C12E5 membrane, where κ is known, a direct comparison of the compressibility modulus values derived via the two methods is also possible. This work was supported by NSF-DMR-9624091; PRF-31352-AC7; CULAR-STB/UC:96-118.
Integrating Quantitative and Ethnographic Methods to Describe the Classroom. Report No. 5083.
ERIC Educational Resources Information Center
Malitz, David; And Others
The debate between proponents of ethnographic and quantitative methodology in classroom observation is reviewed, and the respective strengths and weaknesses of the two approaches are discussed. These methodologies are directly compared in a study that conducted simultaneous ethnographic and quantitative observations on nine classrooms. It is…
ERIC Educational Resources Information Center
Ferdinandi, Andrew D.; Li, Ming Hui
2007-01-01
The purpose of this quantitative study was to investigate the effect of counselor active rehabilitation service compared with the effect of standard rehabilitation counseling in assisting individuals with coexisting psychiatric and substance abuse disorders in attaining desired life roles. This study was conducted during a 6-month period in a…
Clinical applications of a quantitative analysis of regional left ventricular wall motion
NASA Technical Reports Server (NTRS)
Leighton, R. F.; Rich, J. M.; Pollack, M. E.; Altieri, P. I.
1975-01-01
Observations were summarized which may have clinical application. These were obtained from a quantitative analysis of wall motion that was used to detect both hypokinesis and tardokinesis in left ventricular cineangiograms. The method was based on statistical comparisons with normal values for regional wall motion derived from the cineangiograms of patients who were found not to have heart disease.
ERIC Educational Resources Information Center
Belue, Paul T.; Cavey, Laurie Overman; Kinzel, Margaret T.
2017-01-01
In this exploratory study, we examined the effects of a quantitative reasoning instructional approach to linear equations in two variables on community college students' conceptual understanding, procedural fluency, and reasoning ability. This was done in comparison to the use of a traditional procedural approach for instruction on the same topic.…
Daniel J. Manier; Richard D. Laven
2001-01-01
Using repeat photography, we conducted a qualitative and quantitative analysis of changes in forest cover on the western slope of the Rocky Mountains in Colorado. For the quantitative analysis, both images in a pair were classified using remote sensing and geographic information system (GIS) technologies. Comparisons were made using three landscape metrics: total...
In Vitro Comparison of Adipokine Export Signals.
Sharafi, Parisa; Kocaefe, Y Çetin
2016-01-01
Mammalian cells are widely used for recombinant protein production in research and biotechnology. Utilization of export signals significantly facilitates production and purification processes. Thirty-five years after the discovery of the mammalian export machinery, there still are obscurities regarding the efficiency of the export signals. The aim of this study was the comparative evaluation of the efficiency of selected export signals using adipocytes as a cell model. Adipocytes have a large capacity for protein secretion, including several enzymes, adipokines, and other signaling molecules, providing a valid system for a quantitative evaluation. Constructs with N-terminal fusion export signals were generated to express Enhanced Green Fluorescent Protein (EGFP) as a reporter for quantitative and qualitative evaluation. Furthermore, fluorescence microscopy was used to trace the intracellular traffic of the reporter. The export efficiency of six selected proteins secreted from adipocytes was evaluated. Quantitative comparison of intracellular and exported fractions of the recombinant constructs demonstrated a similar efficiency among the studied sequences, with minor variations. The export signal of Retinol Binding Protein (RBP4) exhibited the highest efficiency. This study presents the first quantitative data showing variations among export signals in adipocytes, which will help optimize recombinant protein distribution.
Gregorich, Steven E
2006-11-01
Comparative public health research makes wide use of self-report instruments. For example, research identifying and explaining health disparities across demographic strata may seek to understand the health effects of patient attitudes or private behaviors. Such personal attributes are difficult or impossible to observe directly and are often best measured by self-reports. Defensible use of self-reports in quantitative comparative research requires not only that the measured constructs have the same meaning across groups, but also that group comparisons of sample estimates (eg, means and variances) reflect true group differences and are not contaminated by group-specific attributes that are unrelated to the construct of interest. Evidence for these desirable properties of measurement instruments can be established within the confirmatory factor analysis (CFA) framework; a nested hierarchy of hypotheses is tested that addresses the cross-group invariance of the instrument's psychometric properties. By name, these hypotheses include configural, metric (or pattern), strong (or scalar), and strict factorial invariance. The CFA model and each of these hypotheses are described in nontechnical language. A worked example and technical appendices are included.
Comparison of Housing Construction Development in Selected Regions of Central Europe
NASA Astrophysics Data System (ADS)
Dvorský, Ján; Petráková, Zora; Hollý, Ján
2017-12-01
In fast-growing countries, the economic growth that came after the global financial crisis ought to be manifested in the development of housing policy. The development of a region is directly related to the increase in the quality of living of its inhabitants. Housing construction and its relation to the availability of housing is a key issue for the population overall. Comparison of its development in selected regions is important for experts in the field of construction, mayors of the regions, and the state, but especially for the inhabitants themselves. The aim of the article is to compare the number of new dwellings with building permits and completed dwellings with final building approval between selected regions using a method of mathematical statistics, analysis of variance (ANOVA). The article also uses the tools of descriptive statistics, such as a point graph, a graph of deviations from the average, and basic statistical characteristics of mean and variability. Qualitative factors influencing the construction of flats, as well as the causes of quantitative differences in the numbers of started and completed apartments in selected regions of Central Europe, are the subjects of the article's conclusions.
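The core comparison, testing whether mean dwelling counts differ across regions, can be sketched with a one-way ANOVA in SciPy. The per-region figures below are hypothetical, not the article's data:

```python
# One-way ANOVA on (hypothetical) completed dwellings per 1,000
# inhabitants, sampled over five years in three regions.
from scipy.stats import f_oneway

region_a = [3.1, 2.8, 3.4, 3.0, 2.9]
region_b = [2.2, 2.5, 2.1, 2.4, 2.3]
region_c = [3.0, 3.2, 2.9, 3.1, 3.3]

f_stat, p_value = f_oneway(region_a, region_b, region_c)
print(f"F = {f_stat:.2f}, p = {p_value:.5f}")
# A small p-value indicates the regional means differ.
```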
Comparison of the predictions of two road dust emission models with the measurements of a mobile van
NASA Astrophysics Data System (ADS)
Kauhaniemi, M.; Stojiljkovic, A.; Pirjola, L.; Karppinen, A.; Härkönen, J.; Kupiainen, K.; Kangas, L.; Aarnio, M. A.; Omstedt, G.; Denby, B. R.; Kukkonen, J.
2014-02-01
The predictions of two road dust suspension emission models were compared with the on-site mobile measurements of suspension emission factors. Such a quantitative comparison has not previously been reported in the reviewed literature. The models used were the Nordic collaboration model NORTRIP (NOn-exhaust Road TRaffic Induced Particle emissions) and the Swedish-Finnish FORE model (Forecasting Of Road dust Emissions). These models describe particulate matter generated by the wear of road surface due to traction control methods and processes that control the suspension of road dust particles into the air. An experimental measurement campaign was conducted using a mobile laboratory called SNIFFER, along two selected road segments in central Helsinki in 2007 and 2008. The suspended PM10 concentration was measured behind the left rear tyre and the street background PM10 concentration in front of the van. Both models reproduced the measured seasonal variation of suspension emission factors fairly well during both years at both measurement sites. However, both models substantially under-predicted the measured emission values. The results indicate that road dust emission models can be directly compared with mobile measurements; however, more extensive and versatile measurement campaigns will be needed in the future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Popovich, P.; Carter, T. A.; Friedman, B.
Numerical simulation of plasma turbulence in the Large Plasma Device (LAPD) [W. Gekelman, H. Pfister, Z. Lucky et al., Rev. Sci. Instrum. 62, 2875 (1991)] is presented. The model, implemented in the BOUndary Turbulence code [M. Umansky, X. Xu, B. Dudson et al., Contrib. Plasma Phys. 180, 887 (2009)], includes three-dimensional (3D) collisional fluid equations for plasma density, electron parallel momentum, and current continuity, and also includes the effects of ion-neutral collisions. In nonlinear simulations using measured LAPD density profiles but assuming a constant temperature profile for simplicity, self-consistent evolution of instabilities and nonlinearly generated zonal flows results in a saturated turbulent state. Comparisons of these simulations with measurements in LAPD plasmas reveal good qualitative and reasonable quantitative agreement, in particular in the frequency spectrum, spatial correlation, and amplitude probability distribution function of density fluctuations. For comparison with LAPD measurements, the plasma density profile in simulations is maintained either by direct azimuthal averaging on each time step or by adding a particle source/sink function. The inferred source/sink values are consistent with the estimated ionization source and parallel losses in LAPD. These simulations lay the groundwork for a more comprehensive effort to test fluid turbulence simulation against LAPD data.
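Two of the comparison diagnostics named above, the frequency spectrum and the amplitude probability distribution function (PDF) of density fluctuations, can be sketched on a synthetic time series. The 5 kHz tone, sampling rate, and noise level below are illustrative, not LAPD data:

```python
# Frequency spectrum and amplitude PDF of a synthetic fluctuation signal.
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-5                                    # sample spacing, s (illustrative)
t = np.arange(2**14) * dt
signal = np.sin(2 * np.pi * 5e3 * t) + 0.5 * rng.standard_normal(t.size)

# Frequency spectrum (power spectral density via the FFT).
freqs = np.fft.rfftfreq(t.size, dt)
psd = np.abs(np.fft.rfft(signal))**2 / t.size

# Amplitude PDF (normalized histogram of fluctuation amplitudes).
pdf, edges = np.histogram(signal, bins=50, density=True)

peak_hz = freqs[np.argmax(psd[1:]) + 1]      # skip the DC bin
print("peak frequency [Hz]:", peak_hz)
```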
Interdisciplinary research on patient-provider communication: a cross-method comparison.
Chou, Wen-Ying Sylvia; Han, Paul; Pilsner, Alison; Coa, Kisha; Greenberg, Larrie; Blatt, Benjamin
2011-01-01
Patient-provider communication, a key aspect of healthcare delivery, has been assessed through multiple methods for purposes of research, education, and quality control. Common techniques include satisfaction ratings and quantitatively- and qualitatively-oriented direct observations. Identifying the strengths and weaknesses of different approaches is critically important in determining the appropriate assessment method for a specific research or practical goal. Analyzing ten videotaped simulated encounters between medical students and Standardized Patients (SPs), this study compared three existing assessment methods applied to the same data set. Methods included: (1) dichotomized SP ratings on students' communication skills; (2) Roter Interaction Analysis System (RIAS) analysis; and (3) inductive discourse analysis informed by sociolinguistic theories. The large dichotomous contrast between good and poor ratings in (1) was not evidenced in any of the other methods. Following a discussion of strengths and weaknesses of each approach, we pilot-tested a combined assessment done by coders blinded to results of (1)-(3). This type of integrative approach has the potential of adding a quantifiable dimension to qualitative, discourse-based observations. Subjecting the same data set to separate analytic methods provides an excellent opportunity for methodological comparisons with the goal of informing future assessment of clinical encounters.
Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat
2012-11-20
The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes, in contrast to conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained by peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) (31)P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) a method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma).
Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
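The single-standard scheme above amounts to scaling each class's integrated peak area by its class-specific response factor relative to the one internal standard. A sketch of that arithmetic; the areas, RFs, and concentration units below are hypothetical, not the paper's calibration values:

```python
# Nontargeted class quantitation relative to a single internal standard (IS).
# All numbers are hypothetical illustrations of the scheme.
def lipid_class_conc(peak_area, rf, is_area, is_conc):
    """Concentration of a lipid class from its HILIC peak area,
    scaled by its response factor (RF) relative to the IS."""
    return peak_area / is_area * rf * is_conc

IS_AREA, IS_CONC = 1.0e6, 10.0                   # IS peak area, nmol/mL
rfs = {"PC": 1.2, "PE": 0.9, "SM": 1.1}          # hypothetical class RFs
areas = {"PC": 5.4e6, "PE": 2.0e6, "SM": 1.1e6}  # measured peak areas

for cls in rfs:
    c = lipid_class_conc(areas[cls], rfs[cls], IS_AREA, IS_CONC)
    print(f"{cls}: {c:.1f} nmol/mL")
```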
An approach to computing direction relations between separated object groups
NASA Astrophysics Data System (ADS)
Yan, H.; Wang, Z.; Li, J.
2013-06-01
Direction relations between object groups play an important role in qualitative spatial reasoning, spatial computation and spatial recognition. However, none of the existing models can be used to compute direction relations between object groups. To fill this gap, an approach to computing direction relations between separated object groups is proposed in this paper, which is theoretically based on Gestalt principles and the idea of multi-directions. The approach first triangulates the two object groups; it then constructs the Voronoi diagram between the two groups using the triangular network. After this, the normal of each Voronoi edge is calculated, and the quantitative expression of the direction relations is constructed. Finally, the quantitative direction relations are transformed into qualitative ones. The psychological experiments show that the proposed approach can obtain direction relations both between two single objects and between two object groups, and the results are correct from the point of view of spatial cognition.
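The final step, converting the quantitative direction (e.g., a mean normal angle of the Voronoi edges) into a qualitative relation, can be sketched as a sector mapping. The 8-sector cone partition below is an assumption for illustration, not necessarily the authors' scheme:

```python
# Map a quantitative direction to a qualitative relation via 45-degree
# cones (assumed partition; 0 degrees = east, counterclockwise positive).
def qualitative_direction(angle_deg):
    """Return one of 8 qualitative direction relations for an angle."""
    sectors = ["east", "northeast", "north", "northwest",
               "west", "southwest", "south", "southeast"]
    idx = int(((angle_deg % 360) + 22.5) // 45) % 8
    return sectors[idx]

print(qualitative_direction(10))   # a few degrees off due east
print(qualitative_direction(95))   # just past due north
```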
Freed, Melanie; de Zwart, Jacco A.; Hariharan, Prasanna; Myers, Matthew R.; Badano, Aldo
2011-01-01
Purpose: To develop a dynamic lesion phantom that is capable of producing physiological kinetic curves representative of those seen in human dynamic contrast-enhanced MRI (DCE-MRI) data. The objective of this phantom is to provide a platform for the quantitative comparison of DCE-MRI protocols to aid in the standardization and optimization of breast DCE-MRI. Methods: The dynamic lesion consists of a hollow, plastic mold with inlet and outlet tubes to allow flow of a contrast agent solution through the lesion over time. Border shape of the lesion can be controlled using the lesion mold production method. The configuration of the inlet and outlet tubes was determined using fluid transfer simulations. The total fluid flow rate was determined using x-ray images of the lesion for four different flow rates (0.25, 0.5, 1.0, and 1.5 ml/s) to evaluate the resultant kinetic curve shape and homogeneity of the contrast agent distribution in the dynamic lesion. High spatial and temporal resolution x-ray measurements were used to estimate the true kinetic curve behavior in the dynamic lesion for benign and malignant example curves. DCE-MRI example data were acquired of the dynamic phantom using a clinical protocol. Results: The optimal inlet and outlet tube configuration for the lesion molds was two inlet tubes separated by 30° and a single outlet tube directly between the two inlet tubes. X-ray measurements indicated that 1.0 ml/s was an appropriate total fluid flow rate and provided truth for comparison with MRI data of kinetic curves representative of benign and malignant lesions. DCE-MRI data demonstrated the ability of the phantom to produce realistic kinetic curves. Conclusions: The authors have constructed a dynamic lesion phantom, demonstrated its ability to produce physiological kinetic curves, and provided estimations of its true kinetic curve behavior. 
This lesion phantom provides a tool for the quantitative evaluation of DCE-MRI protocols, which may lead to improved discrimination of breast cancer lesions. PMID:21992378
O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A
2010-03-01
Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which the information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of (18)F -Fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves are used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetics analyses resulting from these three AIFs are compared with those resulting from radially sampled AIFs. 
The proposed penalty-based AIF extraction method is found to achieve significant improvements over the fixed template and the segmentation methods. As well as achieving acceptable kinetic parameter accuracy, the quality of fit of the region of interest (ROI) time-course data based on the extracted AIF, matches results based on arterially sampled AIFs. In comparison, significant deviation in the estimation of FDG flux and degradation in ROI data fit are found with the template and segmentation methods. The proposed AIF extraction method is recommended for practical use.
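The penalty formulation described above combines prior (population-derived) information with the image-derived data. As a conceptual toy illustrating that idea, not the authors' actual estimator, a precision-weighted (conjugate normal) combination for a single AIF time point:

```python
# Toy illustration of the penalty (Bayesian) idea behind the AIF
# estimator: an image-derived value is shrunk toward a population
# prior, weighted by the two precisions. Conceptual sketch only.
def penalized_estimate(image_value, image_var, prior_mean, prior_var):
    """Precision-weighted combination of data and prior."""
    w_data = 1.0 / image_var
    w_prior = 1.0 / prior_var
    return (w_data * image_value + w_prior * prior_mean) / (w_data + w_prior)

# Hypothetical values for one AIF time point (kBq/mL): a noisy
# image-derived sample pulled toward a tighter population prior.
print(penalized_estimate(image_value=12.0, image_var=4.0,
                         prior_mean=10.0, prior_var=1.0))
```

The relative weights make the behavior transparent: a noisy image measurement (large variance) defers to the prior, while a precise one dominates it.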
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Kimberly E; Gerdes, Kirk
2013-07-01
A new and complete GC–ICP-MS method is described for direct analysis of trace metals in a gas phase process stream. The proposed method is derived from standard analytical procedures developed for ICP-MS, which are regularly exercised in standard ICP-MS laboratories. In order to implement the method, a series of empirical factors were generated to calibrate detector response with respect to a known concentration of an internal standard analyte. Calibrated responses are ultimately used to determine the concentration of metal analytes in a gas stream using a semi-quantitative algorithm. The method was verified using a traditional gas injection from a GC sampling valve and a standard gas mixture containing either a 1 ppm Xe + Kr mix with helium balance or 100 ppm Xe with helium balance. Data collected for Xe and Kr gas analytes revealed that agreement within 6–20% of the actual concentration can be expected for various experimental conditions. To demonstrate the method using a relevant “unknown” gas mixture, experiments were performed for continuous 4- and 7-hour periods using a Hg-containing sample gas that was co-introduced into the GC sample loop with the xenon gas standard. System performance and detector response to the dilute concentration of the internal standard were pre-determined, which allowed semi-quantitative evaluation of the analyte. The calculated analyte concentrations varied during the course of the 4-hour experiment, particularly during the first hour of the analysis, where the actual Hg concentration was under-predicted by up to 72%. Calculated concentrations improved to within 30–60% for data collected after the first hour of the experiment. Similar results were seen during the 7-hour test, with the deviation from the actual concentration being 11–81% during the first hour and then decreasing for the remaining period. 
The method detection limit (MDL) was determined for mercury by injecting the sample gas into the system following a period of equilibration. The MDL for Hg was calculated as 6.8 μg·m⁻³. This work describes the first complete GC–ICP-MS method to directly analyze gas phase samples, and detailed sample calculations and comparisons to conventional ICP-MS methods are provided.
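An MDL of the kind quoted above is commonly computed from replicate low-level measurements via the Student-t approach (MDL = t at the 99th percentile times the replicate standard deviation). A sketch with hypothetical replicates, not the study's data:

```python
# Method detection limit from replicate low-level measurements
# (t-statistic approach; replicate values are hypothetical).
import numpy as np
from scipy.stats import t

replicates = np.array([6.1, 7.2, 6.8, 7.0, 6.5, 7.4, 6.6])  # ug/m3
n = replicates.size
mdl = t.ppf(0.99, n - 1) * replicates.std(ddof=1)
print(f"MDL = {mdl:.2f} ug/m3")
```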
Methodological triangulation in a study of social support for siblings of children with cancer.
Murray, J S
1999-10-01
Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.
Elias, Andrew; Crayton, Samuel H.; Warden-Rothman, Robert; Tsourkas, Andrew
2014-01-01
Given the rapidly expanding library of disease biomarkers and targeting agents, the number of unique targeted nanoparticles is growing exponentially. The high variability and expense of animal testing often makes it unfeasible to examine this large number of nanoparticles in vivo. This often leads to the investigation of a single formulation that performed best in vitro. However, nanoparticle performance in vivo depends on many variables, many of which cannot be adequately assessed with cell-based assays. To address this issue, we developed a lanthanide-doped nanoparticle method that allows quantitative comparison of multiple targeted nanoparticles simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with different targeting ligands were created, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectroscopy was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood and resected tumor samples. PMID:25068300
Comparison of quantitative and qualitative tests for glucose-6-phosphate dehydrogenase deficiency.
LaRue, Nicole; Kahn, Maria; Murray, Marjorie; Leader, Brandon T; Bansil, Pooja; McGray, Sarah; Kalnoky, Michael; Zhang, Hao; Huang, Huiqiang; Jiang, Hui; Domingo, Gonzalo J
2014-10-01
A barrier to eliminating Plasmodium vivax malaria is inadequate treatment of infected patients. 8-Aminoquinoline-based drugs clear the parasite; however, people with glucose-6-phosphate dehydrogenase (G6PD) deficiency are at risk for hemolysis from these drugs. Understanding the performance of G6PD deficiency tests is critical for patient safety. Two quantitative assays and two qualitative tests were evaluated. The comparison of quantitative assays gave a Pearson correlation coefficient of 0.7585 with significant difference in mean G6PD activity, highlighting the need to adhere to a single reference assay. Both qualitative tests had high sensitivity and negative predictive value at a cutoff G6PD value of 40% of normal activity if interpreted conservatively and performed under laboratory conditions. The performance of both tests dropped at a cutoff level of 45%. Cytochemical staining of specimens confirmed that heterozygous females with > 50% G6PD-deficient cells can seem normal by phenotypic tests. © The American Society of Tropical Medicine and Hygiene.
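The reported test metrics follow directly from a 2x2 table of the qualitative test against the quantitative reference at the 40%-of-normal-activity cutoff. A sketch with hypothetical counts, not the study's data:

```python
# Sensitivity and negative predictive value of a qualitative G6PD test
# against a quantitative reference. Counts are hypothetical.
def sensitivity(tp, fn):
    """Fraction of truly deficient samples flagged by the test."""
    return tp / (tp + fn)

def npv(tn, fn):
    """Fraction of test-normal samples that are truly normal."""
    return tn / (tn + fn)

# Hypothetical 2x2 counts: deficient = reference activity < 40% of normal.
tp, fn = 58, 2      # deficient samples flagged / missed by the test
tn, fp = 430, 25    # normal samples correctly passed / wrongly flagged

print(f"sensitivity = {sensitivity(tp, fn):.3f}")
print(f"NPV = {npv(tn, fn):.3f}")
```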
NASA Astrophysics Data System (ADS)
Guin, Arijit; Ramanathan, Ramya; Ritzi, Robert W.; Dominic, David F.; Lunt, Ian A.; Scheibe, Timothy D.; Freedman, Vicky L.
2010-04-01
In part 1 of this paper (Ramanathan et al., 2010b) we presented a methodology and a code for modeling the hierarchical sedimentary architecture in braided channel belt deposits. Here in part 2, the code was used to create a digital model of this architecture and the corresponding spatial distribution of permeability. The simulated architecture was compared to the real stratal architecture observed in an abandoned channel belt. The comparisons included assessments of similarity which were both qualitative and quantitative. The qualitative comparisons show that the geometries of unit types within the synthetic deposits are generally consistent with field observations. The unit types in the synthetic deposits would generally be recognized as representing their counterparts in nature, including cross stratasets, lobate and scroll bar deposits, and channel fills. Furthermore, the synthetic deposits have a hierarchical spatial relationship among these units consistent with observations from field exposures and geophysical images. In quantitative comparisons the proportions and the length, width, and height of unit types at different scales, across all levels of the stratal hierarchy, compare well between the synthetic and the natural deposits. A number of important attributes of the synthetic channel belt deposits are shown to be influenced by more than one level within the hierarchy of stratal architecture. First, the high-permeability open-framework gravels connected across all levels and thus formed preferential flow pathways; open-framework gravels are known to form preferential flow pathways in natural channel belt deposits. The nature of a connected cluster changed across different levels of the stratal hierarchy, and as a result of the geologic structure, the connectivity occurs at proportions of open-framework gravels below the theoretical percolation threshold for random infinite media. 
Second, when the channel belt model was populated with permeability distributions by lowest-level unit type, the composite permeability semivariogram contained structures that were identifiable at more than one scale, and each of these structures could be directly linked to unit types of different scales existing at different levels within the hierarchy of strata. These collective results are encouraging with respect to our goal that this model be relevant for testing ideas in future research on flow and transport in aquifers and reservoirs with multiscale heterogeneity.
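The composite-permeability semivariogram comparison above rests on the standard empirical estimator, gamma(h) = half the mean squared increment at lag h. A minimal 1-D sketch on a synthetic correlated transect (not the channel-belt data; the moving-average field is only a stand-in for spatial structure):

```python
# Empirical semivariogram of a synthetic 1-D log-permeability transect.
import numpy as np

def semivariogram(z, max_lag):
    """Empirical semivariogram gamma(h) for lags 1..max_lag (in samples)."""
    gamma = []
    for h in range(1, max_lag + 1):
        diffs = z[h:] - z[:-h]
        gamma.append(0.5 * np.mean(diffs**2))
    return np.array(gamma)

rng = np.random.default_rng(1)
# Synthetic correlated field: smoothed noise mimics spatial structure
# with a correlation length of ~10 samples.
z = np.convolve(rng.standard_normal(500), np.ones(10) / 10, mode="valid")
gamma = semivariogram(z, max_lag=30)
# gamma rises with lag and levels off near the field variance (the sill);
# nested structures at several scales would appear as multiple shoulders.
```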
NASA Astrophysics Data System (ADS)
Burton, Mike
2015-07-01
Magmatic degassing plays a key role in the dynamics of volcanic activity and also in contributing to the carbon, water and sulphur volatile cycles on Earth. Quantifying the fluxes of magmatic gas emitted from volcanoes is therefore of fundamental importance in Earth Science. This has been recognised since the beginning of modern volcanology, with initial measurements of volcanic SO2 flux being conducted with COrrelation SPECtrometer instruments from the late seventies. While COSPEC measurements continue today, they have been largely superseded by compact grating spectrometers, which were first introduced soon after the start of the 21st Century. Since 2006, a new approach to measuring fluxes has appeared, that of quantitative imaging of the SO2 slant column amount in a volcanic plume. Quantitative imaging of volcanic plumes has created new opportunities and challenges, and in April 2013 an ESF-funded MeMoVolC workshop was held, with the objectives of bringing together the main research groups, create a vibrant, interconnected, community, and examine the current state of the art of this new research frontier. This special issue of sixteen papers within the Journal of Volcanology and Geothermal Research is the direct result of the discussions, intercomparisons and results reported in that workshop. The papers report on the volcanological objectives of the plume imaging community, the state of the art of the technology used, intercomparisons, validations, novel methods and results from field applications. Quantitative plume imaging of volcanic plumes is achieved by using both infrared and ultraviolet wavelengths, with each wavelength offering a different trade-off of strengths and weaknesses, and the papers in this issue reflect this wavelength flexibility. Gas compositions can also be imaged, and this approach offers much promise in the quantification of chemical processing within plumes. 
One of the key advantages of the plume imaging approach is that we can achieve gas flux measurements at 1-10 Hz frequencies, allowing direct comparisons with geophysical measurements, opening new, interdisciplinary opportunities to deepen our understanding of volcanological processes. Several challenges still can be improved upon, such as dealing with light scattering issues and full automation of data processing. However, it is clear that quantitative plume imaging will have a lasting and profound impact on how volcano observatories operate, our ability to forecast and manage volcanic eruptions, our constraints of global volcanic gas fluxes, and on our understanding of magma dynamics.
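Underlying all of these imaging systems is the same flux retrieval: integrate the slant column amounts along a transect across the plume and multiply by the transport speed. A sketch with illustrative numbers (column amounts, pixel size, and plume speed are all assumed):

```python
# SO2 flux from an imaged plume cross-section: sum of column amounts
# times pixel width times plume speed. All numbers are illustrative.
import numpy as np

columns_kg_m2 = np.array([0.0, 0.8, 2.1, 3.5, 2.2, 0.9, 0.1]) * 1e-3
pixel_width_m = 5.0      # ground size of one image column across the plume
plume_speed_m_s = 8.0    # e.g., from image cross-correlation or wind data

flux_kg_s = columns_kg_m2.sum() * pixel_width_m * plume_speed_m_s
print(f"SO2 flux = {flux_kg_s:.3f} kg/s")
```

At imaging frame rates this computation repeats per frame, which is what yields the 1-10 Hz flux series mentioned above.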
Quantitative molecular analysis in mantle cell lymphoma.
Brízová, H; Hilská, I; Mrhalová, M; Kodet, R
2011-07-01
A molecular analysis has three major roles in modern oncopathology: as an aid in the differential diagnosis, in molecular monitoring of disease, and in estimation of the potential prognosis. In this report we review the application of molecular analysis in a group of patients with mantle cell lymphoma (MCL). We demonstrate that detection of the cyclin D1 mRNA level is a molecular marker in 98% of patients with MCL. Quantitative monitoring of cyclin D1 is specific and sensitive for the differential diagnosis and for the molecular monitoring of the disease in the bone marrow. Moreover, the dynamics of cyclin D1 in bone marrow reflect the disease development and predict the clinical course. We employed molecular analysis for precise quantitative detection of the proliferation markers Ki-67, topoisomerase IIα, and TPX2, which are described as effective prognostic factors. Using the molecular approach, it is possible to measure the proliferation rate in a reproducible, standard way, which is an essential prerequisite for using proliferation activity as a routine clinical tool. Compared with immunophenotyping, we may conclude that quantitative PCR-based analysis is a useful, reliable, rapid, reproducible, sensitive and specific method, broadening our diagnostic tools in hematopathology. In comparison to interphase FISH in paraffin sections, quantitative PCR is less technically demanding and less time-consuming, and it is more sensitive in detecting small changes in the mRNA level. Moreover, quantitative PCR is the only technology that provides precise and reproducible quantitative information about the expression level. Therefore it may be used to demonstrate the decrease or increase of a tumor-specific marker in bone marrow in comparison with a previously aspirated specimen. Thus, it has a powerful potential to monitor the course of the disease in correlation with clinical data.
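Relative mRNA levels of the kind described, a target such as cyclin D1 measured against a reference gene across serial marrow samples, are commonly reported via the delta-delta-Ct method. A sketch with hypothetical Ct values, not the study's measurements:

```python
# Relative quantification by real-time PCR: the delta-delta-Ct method
# gives fold change versus a reference gene and a calibrator sample.
# Ct values below are hypothetical.
def fold_change(ct_target_sample, ct_ref_sample, ct_target_calib, ct_ref_calib):
    """2^-ddCt relative expression (assumes ~100% amplification efficiency)."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_calib = ct_target_calib - ct_ref_calib
    return 2.0 ** -(d_ct_sample - d_ct_calib)

# Target vs. a housekeeping gene, follow-up marrow vs. diagnostic marrow:
# a higher target Ct at follow-up means less target mRNA.
print(fold_change(24.0, 18.0, 21.0, 18.0))  # 0.125, an 8-fold decrease
```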
Shahabpoor, Erfan; Pavic, Aleksandar
2017-09-12
Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE was carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels, with methods based on direct measurement of the ground reactions showing the highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show the highest practicality of the three classes of methods reviewed. 
Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the ground reactions estimation methods to include pathological gaits and natural variability of gait in real-life physical environment.
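The kinematics-based class of methods reviewed above rests on Newton's second law applied to the body's centre of mass. A deliberately simplified single-point-mass sketch; real estimators use segmental models and full 3-D kinematics, and all numbers here are illustrative:

```python
# Vertical ground reaction force from centre-of-mass (CoM) kinematics:
# by Newton's second law, F_vertical = m * (a_CoM + g) for a
# single-point-mass model of the body. Illustration only.
def vertical_grf(mass_kg, com_accel_m_s2, g=9.81):
    """Resultant vertical ground reaction force in newtons."""
    return mass_kg * (com_accel_m_s2 + g)

# 70 kg walker; IMU-derived vertical CoM acceleration of +1.5 m/s2
# during the loading phase of stance (hypothetical reading).
print(f"{vertical_grf(70.0, 1.5):.0f} N")
```

During double support this resultant force is shared between the two feet, which is one reason the single-mass model is only a starting point for the estimation methods reviewed.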
DOE Office of Scientific and Technical Information (OSTI.GOV)
McClellan, G.E.; Wiker, S.F.
1985-05-31
This report quantifies for the first time the relationship between the signs and symptoms of acute radiation sickness and those of motion sickness. With this relationship, a quantitative comparison is made between data on human performance degradation during motion sickness and estimates of performance degradation during radiation sickness. The comparison validates estimates made by the Intermediate Dose Program on the performance degradation from acute radiation sickness.
Long term pavement performance directive : annual profiler-dipstick comparisons
DOT National Transportation Integrated Search
1996-11-25
The objective of this directive is to initiate a formal program for Profiler-Dipstick comparisons. These comparison tests should be performed, as a minimum, on an annual basis, or within 90 days after major repairs to any of the LTPP profile measure...
Wave Turning and Flow Angle in the E-Region Ionosphere
NASA Astrophysics Data System (ADS)
Young, M.; Oppenheim, M. M.; Dimant, Y. S.
2016-12-01
This work presents results of particle-in-cell (PIC) simulations of Farley-Buneman (FB) turbulence at various altitudes in the high-latitude E-region ionosphere. In that region, the FB instability regularly produces meter-scale plasma irregularities. VHF radars observe coherent echoes via Bragg scatter from wave fronts parallel or anti-parallel to the radar line of sight (LoS) but do not necessarily measure the mean direction of wave propagation. Haldoupis (1984) conducted a study of diffuse radar aurora and found that the spectral width of back-scattered power depends critically on the angle between the radar LoS and the true flow direction, called the flow angle. Knowledge of the flow angle will allow researchers to better interpret observations of coherent back-scatter. Experiments designed to observe meter-scale irregularities in the E-region ionosphere created by the FB instability typically assume that the predominant flow direction is the E×B direction. However, the linear theory of Dimant and Oppenheim (2004) showed that FB waves should turn away from E×B, and particle-in-cell simulations by Oppenheim and Dimant (2013) support the theory. The present study comprises a quantitative analysis of back-scattered power, flow velocity, and spectral width as functions of the flow angle. It also demonstrates that the mean direction of meter-scale wave propagation may differ from the E×B direction by tens of degrees. The analysis includes 2-D and 3-D simulations at a range of altitudes in the auroral ionosphere. Comparison between 2-D and 3-D simulations illustrates the relative importance to the irregularity spectrum of a small but finite component in the direction parallel to B. Previous work has shown this small parallel component to be important to turbulent electron heating and nonlinear transport.
Türk, Ayşe Gözde; Sabuncu, Metin; Ünal, Sena; Önal, Banu; Ulusoy, Mübin
2016-01-01
The purpose of the study was to use the photonic imaging modality of optical coherence tomography (OCT) to compare the marginal adaptation of composite inlays fabricated by direct and indirect techniques. Class II cavities were prepared on 34 extracted human molar teeth. The cavities were randomly divided into two groups according to the inlay fabrication technique. The first group was restored directly on the cavities with a composite (Esthet X HD, Dentsply, Germany) after isolation. The second group was restored indirectly with the same composite material. Marginal adaptations were scanned before cementation with the invisible infrared light beam of an OCT system (Thorlabs), allowing measurement at 200 µm intervals. Restorations were cemented with a self-adhesive resin cement (SmartCem2, Dentsply), and marginal adaptations were then measured again with OCT. Mean values were statistically compared using independent-samples and paired-samples t-tests (p<0.05), before and after cementation. Direct inlays presented statistically smaller marginal discrepancy values than indirect inlays, before (p=0.00001442) and after (p=0.00001466) cementation. Marginal discrepancy values increased for all restorations after cementation (p=0.00008839 and p=0.000000952 for direct and indirect inlays, respectively). The mean marginal discrepancy value of the direct group increased from 56.88±20.04 µm to 91.88±31.7 µm, whereas that of the indirect group increased from 107.54±35.63 µm to 170.29±54.83 µm. Different techniques are available to detect marginal adaptation of restorations, but the OCT system can give quantitative information about resin cement thickness and its interaction with tooth and restoration in a nondestructive manner. Direct inlays presented smaller marginal discrepancy than indirect inlays. Marginal discrepancy values, which reflect cement thickness, increased for all restorations after cementation.
Showmaker, Kurt; Lawrence, Gary W.; Lu, Shien; Balbalian, Clarissa; Klink, Vincent P.
2011-01-01
A quantitative PCR procedure targeting the β-tubulin gene determined the number of Rotylenchulus reniformis Linford & Oliveira 1940 in metagenomic DNA samples isolated from soil. Notably, this quantification was achieved in the presence of other soil-dwelling plant-parasitic nematodes, including the sister genus Helicotylenchus Steiner, 1945. The methodology provides a framework for molecular diagnostics of nematodes from metagenomic DNA isolated directly from soil. PMID:22194958
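Absolute quantification of this kind usually runs through a Cq standard curve fitted to a dilution series of known target quantities. The sketch below shows that standard calculation, not the authors' published procedure; the calibration numbers in the test are invented.

```python
def fit_standard_curve(log10_quantities, cq_values):
    """Least-squares fit of Cq = slope * log10(quantity) + intercept."""
    n = len(log10_quantities)
    mx = sum(log10_quantities) / n
    my = sum(cq_values) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(log10_quantities, cq_values))
    sxx = sum((x - mx) ** 2 for x in log10_quantities)
    slope = sxy / sxx
    return slope, my - slope * mx

def quantity_from_cq(cq, slope, intercept):
    """Invert the standard curve to quantify an unknown sample."""
    return 10.0 ** ((cq - intercept) / slope)

def amplification_efficiency(slope):
    """Efficiency implied by the slope (1.0 = perfect doubling per cycle)."""
    return 10.0 ** (-1.0 / slope) - 1.0
```

A slope near -3.32 cycles per decade corresponds to ~100% amplification efficiency; target counts for soil samples are then read off the fitted curve from their measured Cq.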
Qualitative Versus Quantitative Social Support as a Predictor of Depression in the Elderly.
ERIC Educational Resources Information Center
Chwalisz, Kathleen D.; And Others
This study examined the relationship between qualitative and quantitative indicators of social support in the prediction of depression. Quantitative indicators were examined with regard to their direct effects on depression as well as their indirect effects through their relationship to perceived social support. Subjects were 301…
Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.
Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A
2018-01-08
For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.
ERIC Educational Resources Information Center
Barrows, Russell D.
2007-01-01
A one-way ANOVA experiment is performed to determine whether or not the three standardization methods are statistically different in determining the concentration of the three paraffin analytes. The laboratory exercise asks students to combine the three methods in a single analytical procedure of their own design to determine the concentration of…
Quantification of HCV RNA in Liver Tissue by bDNA Assay.
Dailey, P J; Collins, M L; Urdea, M S; Wilber, J C
1999-01-01
With this statement, Sherlock and Dooley have described two of the three major challenges involved in quantitatively measuring any analyte in tissue samples: the distribution of the analyte in the tissue; and the standard of reference, or denominator, with which to make comparisons between tissue samples. The third challenge for quantitative measurement of an analyte in tissue is to ensure reproducible and quantitative recovery of the analyte on extraction from tissue samples. This chapter describes a method that can be used to measure HCV RNA quantitatively in liver biopsy and tissue samples using the bDNA assay. All three of these challenges-distribution, denominator, and recovery-apply to the measurement of HCV RNA in liver biopsies.
Big fish in a big pond: a study of academic self concept in first year medical students.
Jackman, Kirsty; Wilson, Ian G; Seaton, Marjorie; Craven, Rhonda G
2011-07-27
Big-fish-little-pond effect (BFLPE) research has demonstrated that students in high-ability environments have lower academic self-concepts than equally able students in low-ability settings. Research has shown low academic self-concepts to be associated with negative educational outcomes. Social comparison processes have been implicated as fundamental to the BFLPE. Twenty first-year students in an Australian medical school completed a survey that included academic self-concept and social comparison measures, before and after their first written assessments. Focus groups were also conducted with a separate group of students to explore students' perceptions of competence, the medical school environment, and social comparison processes. The quantitative study did not reveal any changes in academic self-concept or self-evaluation. The qualitative study suggested that the attributions that students used when discussing performance were those that have been demonstrated to negatively affect self-concept. Students reported that the environment was slightly competitive and they used social comparison to evaluate their performance. Although the BFLPE was not evident in the quantitative study, results from the qualitative study suggest that the BFLPE might be operating, in that students were using attributions that are associated with lower self-concepts, the environment was slightly competitive, and social comparisons were used for evaluation.
Ghasemi Damavandi, Hamidreza; Sen Gupta, Ananya; Nelson, Robert K; Reddy, Christopher M
2016-01-01
Comprehensive two-dimensional gas chromatography [Formula: see text] provides high-resolution separations across hundreds of compounds in a complex mixture, thus unlocking unprecedented information for intricate quantitative interpretation. We exploit this compound diversity across the [Formula: see text] topography to provide quantitative compound-cognizant interpretation beyond target compound analysis with petroleum forensics as a practical application. We focus on the [Formula: see text] topography of biomarker hydrocarbons, hopanes and steranes, as they are generally recalcitrant to weathering. We introduce peak topography maps (PTM) and topography partitioning techniques that consider a notably broader and more diverse range of target and non-target biomarker compounds compared to traditional approaches that consider approximately 20 biomarker ratios. Specifically, we consider a range of 33-154 target and non-target biomarkers with highest-to-lowest peak ratio within an injection ranging from 4.86 to 19.6 (precise numbers depend on biomarker diversity of individual injections). We also provide a robust quantitative measure for directly determining "match" between samples, without necessitating training data sets. We validate our methods across 34 [Formula: see text] injections from a diverse portfolio of petroleum sources, and provide quantitative comparison of performance against established statistical methods such as principal components analysis (PCA). Our data set includes a wide range of samples collected following the 2010 Deepwater Horizon disaster that released approximately 160 million gallons of crude oil from the Macondo well (MW). Samples that were clearly collected following this disaster exhibit statistically significant match [Formula: see text] using PTM-based interpretation against other closely related sources. 
PTM-based interpretation also provides higher differentiation between closely correlated but distinct sources than obtained using PCA-based statistical comparisons. In addition to results based on this experimental field data, we also provide extensive perturbation analysis of the PTM method over numerical simulations that introduce random variability of peak locations over the [Formula: see text] biomarker ROI image of the MW pre-spill sample (sample [Formula: see text] in Additional file 4: Table S1). We compare the robustness of the cross-PTM score against peak location variability in both dimensions and compare the results against PCA analysis over the same set of simulated images. Detailed description of the simulation experiment and discussion of results are provided in Additional file 1: Section S8. We provide a peak-cognizant informational framework for quantitative interpretation of [Formula: see text] topography. Proposed topographic analysis enables [Formula: see text] forensic interpretation across target petroleum biomarkers, while including the nuances of lesser-known non-target biomarkers clustered around the target peaks. This allows potential discovery of hitherto unknown connections between target and non-target biomarkers.
ERIC Educational Resources Information Center
Epps, Sucari
2017-01-01
This quantitative study investigated the learning outcomes of students with disabilities in comparison to their non-disabled peers in a TK-12th grade school that offers a sixth-twelfth grade virtual public charter school program that currently serves students in the state of California. No differences were found between groups indicating…
Changren Weng; Thomas L. Kubisiak; C. Dana Nelson; James P. Geaghan; Michael Stine
1999-01-01
Single marker regression and single marker maximum likelihood estimation were used to detect quantitative trait loci (QTLs) controlling the early height growth of longleaf pine and slash pine using a ((longleaf pine x slash pine) x slash pine) BC1 population consisting of 83 progeny. Maximum likelihood estimation was found to be more powerful than regression and could...
Portillo, M C; Gonzalez, J M
2008-08-01
Molecular fingerprints of microbial communities are a common method for the analysis and comparison of environmental samples. The significance of differences between microbial community fingerprints was analyzed considering the presence of different phylotypes and their relative abundance. A method is proposed by simulating coverage of the analyzed communities as a function of sampling size applying a Cramér-von Mises statistic. Comparisons were performed by a Monte Carlo testing procedure. As an example, this procedure was used to compare several sediment samples from freshwater ponds using a relative quantitative PCR-DGGE profiling technique. The method was able to discriminate among different samples based on their molecular fingerprints, and confirmed the lack of differences between aliquots from a single sample.
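The comparison procedure described above, a Cramér-von Mises statistic evaluated by Monte Carlo testing, can be sketched in a toy form. This is an illustrative implementation, not the authors' code: it compares two fingerprint-derived abundance samples by permuting the pooled observations, and all data in the test are invented.

```python
import random

def cvm_statistic(x, y):
    """Two-sample Cramer-von Mises statistic (Anderson's form)."""
    n, m = len(x), len(y)
    pooled = sorted(x + y)

    def ecdf(sample, t):
        # empirical CDF of `sample` evaluated at t
        return sum(1 for v in sample if v <= t) / len(sample)

    return (n * m / (n + m)) * sum(
        (ecdf(x, t) - ecdf(y, t)) ** 2 for t in pooled) / (n + m)

def monte_carlo_test(x, y, n_iter=999, seed=1):
    """Permutation p-value: fraction of random relabelings of the pooled
    data whose statistic is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = cvm_statistic(x, y)
    pooled = x + y
    hits = 0
    for _ in range(n_iter):
        shuffled = pooled[:]
        rng.shuffle(shuffled)
        if cvm_statistic(shuffled[:len(x)], shuffled[len(x):]) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_iter + 1)
```

With two band-intensity profiles from different sediment samples, a small p-value flags the fingerprints as significantly different, while aliquots of a single sample should yield large p-values, matching the discrimination behavior reported above.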
Mapping of thermal injury in biologic tissues using quantitative pathologic techniques
NASA Astrophysics Data System (ADS)
Thomsen, Sharon L.
1999-05-01
Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparisons of lesion sizes and configurations for different instruments or heating sources, and (3) comparisons of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis and (9) wound organization and healing.
Leatemia, Lukas D; Susilo, Astrid P; van Berkel, Henk
2016-12-03
To identify students' readiness to perform self-directed learning and the underlying factors influencing it in a hybrid problem-based learning curriculum. A combination of quantitative and qualitative studies was conducted in five medical schools in Indonesia. In the quantitative study, the Self Directed Learning Readiness Scale was distributed to all students in all batches who had experience with the hybrid problem-based curriculum. They were categorized into low- and high-level groups based on the score of the questionnaire. Three focus group discussions (low-, high-, and mixed-level) were conducted in the qualitative study, with six to twelve students chosen randomly from each group, to find the factors influencing their self-directed learning readiness. Two researchers analysed the qualitative data as a measure of triangulation. The quantitative study showed only half of the students had a high level of self-directed learning readiness, and a similar trend also occurred in each batch. The proportion of students with a high level of self-directed learning readiness was lower in the senior students compared to more junior students. The qualitative study showed that problem-based learning processes, assessments, learning environment, students' lifestyles, students' perceptions of the topics, and mood were factors influencing their self-directed learning. A hybrid problem-based curriculum may not fully affect the students' self-directed learning. The curriculum system, teachers' experiences, students' backgrounds and cultural factors might contribute to students' difficulties in conducting self-directed learning.
Scatter and veiling glare corrections for quantitative digital subtraction angiography
NASA Astrophysics Data System (ADS)
Ersahin, Atila; Molloi, Sabee Y.; Qian, Yao-Jin
1994-05-01
In order to quantitate anatomical and physiological parameters such as vessel dimensions and volumetric blood flow, it is necessary to make corrections for scatter and veiling glare (SVG), which are the major sources of nonlinearities in videodensitometric digital subtraction angiography (DSA). A convolution filtering technique has been investigated to estimate SVG distribution in DSA images without the need to sample the SVG for each patient. This technique utilizes exposure parameters and image gray levels to estimate SVG intensity by predicting the total thickness for every pixel in the image. At this point, corrections were also made for variation of SVG fraction with beam energy and field size. To test its ability to estimate SVG intensity, the correction technique was applied to images of a Lucite step phantom, anthropomorphic chest phantom, head phantom, and animal models at different thicknesses, projections, and beam energies. The root-mean-square (rms) percentage errors of these estimates were obtained by comparison with direct SVG measurements made behind a lead strip. The average rms percentage errors in the SVG estimate for the 25 phantom studies and for the 17 animal studies were 6.22% and 7.96%, respectively. These results indicate that the SVG intensity can be estimated for a wide range of thicknesses, projections, and beam energies.
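The idea of a convolution-based SVG estimate can be illustrated with a toy version: convolve the image with a broad spread kernel, scale by a scatter fraction, and subtract. This is not the published algorithm; the kernel shape, the fixed scatter fraction, and the zero-padded boundary handling here are all placeholder assumptions.

```python
def scatter_estimate(image, kernel, fraction):
    """Estimate scatter/veiling glare as a scaled convolution of the image
    with a broad spread kernel (zero padding outside the image)."""
    h, w = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    oy, ox = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for j in range(kh):
                for i in range(kw):
                    yy, xx = y + j - oy, x + i - ox
                    if 0 <= yy < h and 0 <= xx < w:
                        acc += image[yy][xx] * kernel[j][i]
            out[y][x] = fraction * acc
    return out

def svg_correct(image, kernel, fraction):
    """Subtract the SVG estimate to approximate the primary-beam image."""
    svg = scatter_estimate(image, kernel, fraction)
    return [[image[y][x] - svg[y][x] for x in range(len(image[0]))]
            for y in range(len(image))]
```

In practice the scatter fraction would itself vary with beam energy, field size, and estimated thickness per pixel, which is the refinement the technique above provides.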
Whale, Alexandra S; Huggett, Jim F; Cowen, Simon; Speirs, Valerie; Shaw, Jacqui; Ellison, Stephen; Foy, Carole A; Scott, Daniel J
2012-06-01
One of the benefits of Digital PCR (dPCR) is the potential for unparalleled precision enabling smaller fold change measurements. An example of an assessment that could benefit from such improved precision is the measurement of tumour-associated copy number variation (CNV) in the cell free DNA (cfDNA) fraction of patient blood plasma. To investigate the potential precision of dPCR and compare it with the established technique of quantitative PCR (qPCR), we used breast cancer cell lines to investigate HER2 gene amplification and modelled a range of different CNVs. We showed that, with equal experimental replication, dPCR could measure a smaller CNV than qPCR. As dPCR precision is directly dependent upon both the number of replicate measurements and the template concentration, we also developed a method to assist the design of dPCR experiments for measuring CNV. Using an existing model (based on Poisson and binomial distributions) to derive an expression for the variance inherent in dPCR, we produced a power calculation to define the experimental size required to reliably detect a given fold change at a given template concentration. This work will facilitate any future translation of dPCR to key diagnostic applications, such as cancer diagnostics and analysis of cfDNA.
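The Poisson/binomial reasoning behind dPCR precision lends itself to a back-of-envelope sketch. This is not the authors' model or power calculation, only a delta-method approximation of the variance of the estimated copies per partition, turned into a rough replicate count for detecting a copy-number ratio; the partition count and concentrations in the test are illustrative.

```python
import math
from statistics import NormalDist

def lambda_from_positives(k, n):
    """Mean copies per partition from k positive partitions out of n,
    assuming Poisson loading, with a delta-method variance."""
    p = k / n
    lam = -math.log(1.0 - p)
    var = p / (n * (1.0 - p))  # Var(p_hat)/(1-p)^2
    return lam, var

def replicates_for_cnv(ratio, lam, n_partitions, alpha=0.05, power=0.8):
    """Rough number of replicate wells per condition needed to distinguish
    a copy-number ratio from 1 with a two-sided z-test on
    log(lambda_target / lambda_reference)."""
    p = 1.0 - math.exp(-lam)
    # delta-method variance of log(lambda_hat) for a single well
    var_log = p / (n_partitions * (1.0 - p) * lam ** 2)
    z = NormalDist()
    z_a = z.inv_cdf(1.0 - alpha / 2.0)
    z_b = z.inv_cdf(power)
    effect = math.log(ratio)
    # factor 2: two independent estimates (target and reference)
    return math.ceil(2.0 * var_log * (z_a + z_b) ** 2 / effect ** 2)
```

Consistent with the abstract, smaller fold changes demand sharply more replication at a fixed template concentration, which is why experimental sizing matters before attempting cfDNA measurements.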
Gao, Tingting; Wang, Xiaochang C; Chen, Rong; Ngo, Huu Hao; Guo, Wenshan
2015-04-01
Disability adjusted life year (DALY) has been widely used since the 1990s for evaluating the global and/or regional burden of disease. As many environmental pollutants are hazardous to human health, DALY is also recognized as an indicator to quantify the health impact of environmental pollution related to disease burden. Based on literature reviews, this article aims to give an overview of the applicable methodologies and research directions for using DALY as a tool for quantitative assessment of environmental pollution. With an introduction of the methodological framework of DALY, the requirements on data collection and manipulation for quantifying disease burdens are summarized. Regarding environmental pollutants hazardous to human beings, health effect/risk evaluation is indispensable for transforming pollution data into disease data through exposure and dose-response analyses, which require careful selection of models and determination of parameters. Following the methodological discussions, real cases are analyzed, with attention paid to chemical pollutants and pathogens usually encountered in environmental pollution. It can be seen from existing studies that DALY is advantageous over conventional environmental impact assessment for quantification and comparison of the risks resulting from environmental pollution. However, further studies are still required to standardize the methods of health effect evaluation regarding varied pollutants under varied circumstances before DALY calculation. Copyright © 2014 Elsevier B.V. All rights reserved.
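The core DALY bookkeeping behind such assessments fits in a few lines: DALY is the sum of years of life lost to premature mortality (YLL) and years lived with disability (YLD). The sketch below uses the basic form without age weighting or time discounting, and the numbers in the test are invented for illustration.

```python
def daly(deaths, years_lost_per_death, cases, disability_weight, years_with_condition):
    """DALY = YLL + YLD in its basic form (no age weighting or discounting)."""
    yll = deaths * years_lost_per_death                      # years of life lost
    yld = cases * disability_weight * years_with_condition   # years lived with disability
    return yll + yld
```

Health-effect evaluation of a pollutant then amounts to estimating the attributable deaths and cases through exposure and dose-response models, and feeding those estimates through this sum so that different pollution burdens become directly comparable.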
Bokulich, Nicholas A.
2013-01-01
Ultra-high-throughput sequencing (HTS) of fungal communities has been restricted by short read lengths and primer amplification bias, slowing the adoption of newer sequencing technologies to fungal community profiling. To address these issues, we evaluated the performance of several common internal transcribed spacer (ITS) primers and designed a novel primer set and work flow for simultaneous quantification and species-level interrogation of fungal consortia. Primer comparison and validation were performed in silico and by sequencing a “mock community” of mixed yeast species to explore the challenges of amplicon length and amplification bias for reconstructing defined yeast community structures. The amplicon size and distribution of this primer set are smaller than for all preexisting ITS primer sets, maximizing sequencing coverage of hypervariable ITS domains by very-short-amplicon, high-throughput sequencing platforms. This feature also enables the optional integration of quantitative PCR (qPCR) directly into the HTS preparatory work flow by substituting qPCR with these primers for standard PCR, yielding quantification of individual community members. The complete work flow described here, utilizing any of the qualified primer sets evaluated, can rapidly profile mixed fungal communities and capably reconstructed well-characterized beer and wine fermentation fungal communities. PMID:23377949
Metrics for comparing dynamic earthquake rupture simulations
Barall, Michael; Harris, Ruth A.
2014-01-01
Earthquakes are complex events that involve a myriad of interactions among multiple geologic features and processes. One of the tools that is available to assist with their study is computer simulation, particularly dynamic rupture simulation. A dynamic rupture simulation is a numerical model of the physical processes that occur during an earthquake. Starting with the fault geometry, friction constitutive law, initial stress conditions, and assumptions about the condition and response of the near‐fault rocks, a dynamic earthquake rupture simulation calculates the evolution of fault slip and stress over time as part of the elastodynamic numerical solution (Ⓔ see the simulation description in the electronic supplement to this article). The complexity of the computations in a dynamic rupture simulation makes it challenging to verify that the computer code is operating as intended, because there are no exact analytic solutions against which these codes’ results can be directly compared. One approach for checking if dynamic rupture computer codes are working satisfactorily is to compare each code’s results with the results of other dynamic rupture codes running the same earthquake simulation benchmark. To perform such a comparison consistently, it is necessary to have quantitative metrics. In this paper, we present a new method for quantitatively comparing the results of dynamic earthquake rupture computer simulation codes.
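Code-to-code benchmark comparisons of this kind need a numerical score. One generic example of such a score, not the metric defined in this paper, is a normalized RMS misfit between two codes' time histories of the same quantity (say, fault slip at a shared station), sketched here under the assumption that both codes output equally sampled series.

```python
def rms_misfit(a, b):
    """Normalized RMS difference between two equally sampled time series:
    sqrt(sum((a-b)^2) / sum(b^2)), with b taken as the reference code."""
    if len(a) != len(b):
        raise ValueError("series must share the same time sampling")
    num = sum((x - y) ** 2 for x, y in zip(a, b))
    den = sum(y ** 2 for y in b)
    return (num / den) ** 0.5
```

A misfit of 0 means the codes agree exactly; in practice, benchmark exercises set a tolerance on scores like this (and on scalar measures such as rupture-arrival-time differences) to decide whether two codes match.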
NASA Astrophysics Data System (ADS)
Illien, Françoise; Rodriguez, Nicolas; Amoura, Mehdi; Joliot, Alain; Pallerla, Manjula; Cribier, Sophie; Burlina, Fabienne; Sagan, Sandrine
2016-11-01
The mechanism of cell-penetrating peptide (CPP) entry into cells is unclear, preventing the development of more efficient vectors for biotechnological or therapeutic purposes. Here, we developed a protocol relying on fluorometry to distinguish endocytosis from direct membrane translocation, using Penetratin, TAT and R9. The quantities of internalized CPPs measured by fluorometry in cell lysates converge with those obtained by our previously reported mass spectrometry quantification method. By contrast, flow cytometry quantification faces several limitations due to fluorescence quenching processes that depend on the cell line and occur at peptide/cell ratios >6 × 10(8) for CF-Penetratin. The analysis of cellular internalization of a doubly labeled fluorescent and biotinylated Penetratin analogue by the two independent techniques, fluorometry and mass spectrometry, gave consistent results at the quantitative and qualitative levels. Both techniques revealed the use of two alternative translocation and endocytosis pathways, whose relative efficacy depends on cell-surface sugars and peptide concentration. We confirmed that Penetratin translocates at low concentration and uses endocytosis at high μM concentrations. We further demonstrate that the hydrophobic/hydrophilic nature of the N-terminal extremity impacts the internalization efficiency of CPPs. We expect these results and the associated protocols to help unravel the translocation pathway to the cytosol of cells.
Does Augmented Reality Affect High School Students' Learning Outcomes in Chemistry?
NASA Astrophysics Data System (ADS)
Renner, Jonathan Christopher
Some teens may prefer using a self-directed, constructivist, and technologic approach to learning rather than traditional classroom instruction. If this preference can be demonstrated, educators may adjust their teaching methodology accordingly. The guiding research question for this study focused on how augmented reality affects high school students' learning outcomes in chemistry, as measured by a pretest and posttest methodology, while ensuring that the individual outcomes were not the result of group collaboration. This study employed a quantitative, quasi-experimental study design that used a comparison and an experimental group. Inferential statistical analysis was employed. The study was conducted at a high school in southwest Colorado. Eighty-nine respondents returned completed and signed consent forms, and 78 participants completed the study. Results demonstrated that augmented reality instruction caused posttest scores to significantly increase, as compared to pretest scores, but it was not as effective as traditional classroom instruction. Scores did improve under both types of instruction; therefore, more research is needed in this area. The present study was the first quantitative experiment controlling for individual learning to validate augmented reality using mobile handheld digital devices that affected individual students' learning outcomes without group collaboration. This topic is important to the field of education, as it may help educators understand how students learn and may also change the way students are taught.
Elliott, D.G.; Applegate, L.J.; Murray, A.L.; Purcell, M.K.; McKibben, C.L.
2013-01-01
No gold standard assay exhibiting error-free classification of results has been identified for detection of Renibacterium salmoninarum, the causative agent of salmonid bacterial kidney disease. Validation of diagnostic assays for R. salmoninarum has been hindered by its unique characteristics and biology, and difficulties in locating suitable populations of reference test animals. Infection status of fish in test populations is often unknown, and it is commonly assumed that the assay yielding the most positive results has the highest diagnostic accuracy, without consideration of misclassification of results. In this research, quantification of R. salmoninarum in samples by bacteriological culture provided a standardized measure of viable bacteria to evaluate analytical performance characteristics (sensitivity, specificity and repeatability) of non-culture assays in three matrices (phosphate-buffered saline, ovarian fluid and kidney tissue). Non-culture assays included polyclonal enzyme-linked immunosorbent assay (ELISA), direct smear fluorescent antibody technique (FAT), membrane-filtration FAT, nested polymerase chain reaction (nested PCR) and three real-time quantitative PCR assays. Injection challenge of specific pathogen-free Chinook salmon, Oncorhynchus tshawytscha (Walbaum), with R. salmoninarum was used to estimate diagnostic sensitivity and specificity. Results did not identify a single assay demonstrating the highest analytical and diagnostic performance characteristics, but revealed strengths and weaknesses of each test.
Bentley-Hewitt, Kerry L; Hedderley, Duncan I; Monro, John; Martell, Sheridan; Smith, Hannah; Mishra, Suman
2016-05-25
Experimental methods are constantly being improved by new technology. Recently a new technology, NanoString®, has been introduced to the market for the analysis of gene expression. Our experiments used adipose and liver samples collected from a rat feeding trial to explore gene expression changes resulting from a diet of 7.5% seaweed. Both quantitative real-time polymerase chain reaction (qPCR) and NanoString methods were employed to look at expression of genes related to fat and glucose metabolism, and this paper compares results from both methods. We conclude that NanoString offers a valuable alternative to qPCR, and our data suggest that results are more accurate because of the reduced sample handling and direct quantification of gene copy number without the need for enzymatic amplification. However, we have highlighted a potential challenge for both methods, which needs to be addressed when designing primers or probes. We suggest completing a literature search for known splice variants of the gene of interest so that primers or probes can be designed to avoid exons affected by alternative splicing. Copyright © 2016 Elsevier B.V. All rights reserved.
Solid-state NMR for bacterial biofilms
NASA Astrophysics Data System (ADS)
Reichhardt, Courtney; Cegelski, Lynette
2014-04-01
Bacteria associate with surfaces and one another by elaborating an extracellular matrix to encapsulate cells, creating communities termed biofilms. Biofilms are beneficial in some ecological niches, but also contribute to the pathogenesis of serious and chronic infectious diseases. New approaches and quantitative measurements are needed to define the composition and architecture of bacterial biofilms to help drive the development of strategies to interfere with biofilm assembly. Solid-state nuclear magnetic resonance (NMR) is uniquely suited to the examination of insoluble and complex macromolecular and whole-cell systems. This article highlights three examples that implement solid-state NMR to deliver insights into bacterial biofilm composition and changes in cell-wall composition as cells transition to the biofilm lifestyle. Most recently, solid-state NMR measurements provided a total accounting of the protein and polysaccharide components in the extracellular matrix of an Escherichia coli biofilm and transformed our qualitative descriptions of matrix composition into chemical parameters that permit quantitative comparisons among samples. We present additional data for whole biofilm samples (cells plus the extracellular matrix) that complement matrix-only analyses. The study of bacterial biofilms by solid-state NMR is an exciting avenue ripe with many opportunities and we close the article by articulating some outstanding questions and future directions in this area.
Quantitative analysis of TALE-DNA interactions suggests polarity effects.
Meckler, Joshua F; Bhakta, Mital S; Kim, Moon-Soo; Ovadia, Robert; Habrian, Chris H; Zykovich, Artem; Yu, Abigail; Lockwood, Sarah H; Morbitzer, Robert; Elsäesser, Janett; Lahaye, Thomas; Segal, David J; Baldwin, Enoch P
2013-04-01
Transcription activator-like effectors (TALEs) have revolutionized the field of genome engineering. We present here a systematic assessment of TALE DNA recognition, using quantitative electrophoretic mobility shift assays and reporter gene activation assays. Within TALE proteins, tandem 34-amino-acid repeats recognize one base pair each and direct sequence-specific DNA binding through repeat variable di-residues (RVDs). We found that RVD choice can affect affinity by four orders of magnitude, with the relative RVD contribution in the order NG > HD ≈ NN > NI > NK. The NN repeat preferred the base G over A, whereas the NK repeat bound G with 10³-fold lower affinity. We compared AvrBs3, a naturally occurring TALE that recognizes its target using some atypical RVD-base combinations, with a designed TALE that precisely matches 'standard' RVDs with the target bases. This comparison revealed unexpected differences in sensitivity to substitutions of the invariant 5'-T. Another surprising observation was that base mismatches at the 5' end of the target site had more disruptive effects on affinity than those at the 3' end, particularly in designed TALEs. These results provide evidence that TALE-DNA recognition exhibits a hitherto undescribed polarity effect, in which the N-terminal repeats contribute more to affinity than the C-terminal ones.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, B.D.; Toole, A.P.; Callahan, B.G.
1991-12-01
Alkylphenols are a class of environmentally pervasive compounds, found both in natural (e.g., crude oils) and in anthropogenic (e.g., wood tar, coal gasification waste) materials. Despite the frequent environmental occurrence of these chemicals, there is a limited toxicity database on alkylphenols. The authors have therefore developed a 'toxicity equivalence approach' for alkylphenols which is based on their ability to inhibit, in a specific manner, the enzyme cyclooxygenase. Enzyme-inhibiting ability for individual alkylphenols can be estimated based on the quantitative structure-activity relationship developed by Dewhirst (1980) and is a function of the free hydroxyl group, electron-donating ring substituents, and hydrophobic aromatic ring substituents. The authors evaluated the toxicological significance of cyclooxygenase inhibition by comparison of the inhibitory capacity of alkylphenols with the inhibitory capacity of acetylsalicylic acid, or aspirin, a compound whose low-level effects are due to cyclooxygenase inhibition. Since nearly complete absorption for alkylphenols and aspirin is predicted, based on estimates of hydrophobicity and fraction of charged molecules at gastrointestinal pHs, risks from alkylphenols can be expressed directly in terms of 'milligram aspirin equivalence,' without correction for absorption differences. They recommend this method for assessing risks of mixtures of alkylphenols, especially for those compounds with no chronic toxicity data. 38 references.
Asarnow, Daniel; Rojo-Arreola, Liliana; Suzuki, Brian M; Caffrey, Conor R; Singh, Rahul
2015-05-01
Neglected tropical diseases (NTDs) caused by helminths constitute some of the most common infections of the world's poorest people. The etiological agents are complex and recalcitrant to standard techniques of molecular biology. Drug screening against helminths has often been phenotypic and typically involves manual description of drug effect and efficacy. A key challenge is to develop automated, quantitative approaches to drug screening against helminth diseases. The quantal dose-response calculator (QDREC) constitutes a significant step in this direction. It can be used to automatically determine quantitative dose-response characteristics and half-maximal effective concentration (EC50) values using image-based readouts from phenotypic screens, thereby allowing rigorous comparisons of the efficacies of drug compounds. QDREC has been developed and validated in the context of drug screening for schistosomiasis, one of the most important NTDs. However, it is equally applicable to general phenotypic screening involving helminths and other complex parasites. QDREC is publically available at: http://haddock4.sfsu.edu/qdrec2/. Source code and datasets are at: http://tintin.sfsu.edu/projects/phenotypicAssays.html. rahul@sfsu.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
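The quantal dose-response characterization that QDREC automates can be illustrated with a two-parameter Hill model fitted by least squares. This is a hypothetical sketch, not QDREC's actual implementation; the grid-search fitter and all numbers are invented for illustration:

```python
import math

def hill_response(dose, ec50, n):
    """Fraction of parasites affected at a given dose (two-parameter Hill model)."""
    return dose**n / (ec50**n + dose**n)

def fit_ec50(doses, fractions, ec50_grid, n_grid):
    """Least-squares grid search for EC50 and Hill slope (illustrative only;
    real dose-response software uses a proper nonlinear optimizer)."""
    best = None
    for ec50 in ec50_grid:
        for n in n_grid:
            sse = sum((hill_response(d, ec50, n) - f)**2
                      for d, f in zip(doses, fractions))
            if best is None or sse < best[0]:
                best = (sse, ec50, n)
    return best[1], best[2]

# Synthetic data generated from EC50 = 4.0, n = 2 (made up, not QDREC output):
doses = [0.5, 1, 2, 4, 8, 16]
fracs = [hill_response(d, 4.0, 2.0) for d in doses]
ec50, n = fit_ec50(doses, fracs,
                   [x / 10 for x in range(10, 101)], [1.0, 1.5, 2.0, 2.5])
print(ec50, n)  # recovers 4.0 2.0
```

In QDREC the "fraction affected" input comes from automated image-based phenotype readouts rather than manual scoring, which is what enables rigorous EC50 comparisons across compounds.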
Quasiparticle Level Alignment for Photocatalytic Interfaces.
Migani, Annapaola; Mowbray, Duncan J; Zhao, Jin; Petek, Hrvoje; Rubio, Angel
2014-05-13
Electronic level alignment at the interface between an adsorbed molecular layer and a semiconducting substrate determines the activity and efficiency of many photocatalytic materials. Standard density functional theory (DFT)-based methods have proven unable to provide a quantitative description of this level alignment. This requires a proper treatment of the anisotropic screening, necessitating the use of quasiparticle (QP) techniques. However, the computational complexity of QP algorithms has meant a quantitative description of interfacial levels has remained elusive. We provide a systematic study of a prototypical interface, bare and methanol-covered rutile TiO2(110) surfaces, to determine the type of many-body theory required to obtain an accurate description of the level alignment. This is accomplished via a direct comparison with metastable impact electron spectroscopy (MIES), ultraviolet photoelectron spectroscopy (UPS), and two-photon photoemission (2PP) spectroscopy. We consider GGA DFT, hybrid DFT, and G0W0, scQPGW1, scQPGW0, and scQPGW QP calculations. Our results demonstrate that G0W0, or our recently introduced scQPGW1 approach, are required to obtain the correct alignment of both the highest occupied and lowest unoccupied interfacial molecular levels (HOMO/LUMO). These calculations set a new standard in the interpretation of electronic structure probe experiments of complex organic molecule/semiconductor interfaces.
Jin, Sang-Man; Oh, Seung-Hoon; Oh, Bae Jun; Suh, Sunghwan; Bae, Ji Cheol; Lee, Jung Hee; Lee, Myung-Shik; Lee, Moon-Kyu; Kim, Kwang-Won; Kim, Jae Hyeon
2014-01-01
While a few studies have demonstrated the benefit of PEGylation in islet transplantation, most have employed renal subcapsular models, and none have performed direct comparisons of islet mass in intraportal islet transplantation using islet magnetic resonance imaging (MRI). In this study, our aim was to demonstrate the benefit of PEGylation in the early post-transplant period of intraportal islet transplantation with a novel algorithm for islet MRI. Islets were PEGylated after ferucarbotran labeling in a rat syngeneic intraportal islet transplantation model, followed by comparisons of post-transplant glycemic levels in recipient rats infused with PEGylated (n = 12) and non-PEGylated (n = 13) islets. The total area of hypointense spots and the number of hypointense spots larger than 1.758 mm² of PEGylated and non-PEGylated islets were quantitatively compared. The total area of hypointense spots (P < 0.05) and the number of hypointense spots larger than 1.758 mm² (P < 0.05) were higher in the PEGylated islet group 7 and 14 days post transplantation (DPT). These results translated into better post-transplant outcomes in the PEGylated islet group 28 DPT. In validation experiments, MRI parameters obtained 1, 7, and 14 DPT predicted normoglycemia 4 wk post-transplantation. We directly demonstrated the benefit of islet PEGylation in protection against nonspecific islet destruction in the early post-transplant period of intraportal islet transplantation using a novel algorithm for islet MRI. This novel algorithm could serve as a useful tool to demonstrate such benefit in future clinical trials of islet transplantation using PEGylated islets.
Vertyporokh, Lidiia; Taszłow, Paulina; Samorek-Pieróg, Małgorzata; Wojda, Iwona
2015-09-01
We aimed to investigate how exposure of infected insects to short-term heat shock affects the biochemical and molecular aspects of their immune response. Galleria mellonella larvae were exposed to 43°C for 15 min, 72 h after natural infection with the entomopathogenic fungus Beauveria bassiana. As a result, both qualitative and quantitative changes in hemolymph protein profiles, among them infection-induced changes in the amount of apolipophorin III (apoLp-III), were observed. Heat shock differentially affects the expression of the tested immune-related genes. It transiently inhibits expression of the antifungal peptides gallerimycin and galiomicin in both the fat body and hemocytes of infected larvae. A similar, though weaker, effect on apoLp-III gene expression was observed directly after heat shock. Nevertheless, in larvae that had recovered from heat shock, apoLp-III expression was higher in the fat body, but not in hemocytes, in comparison to unshocked larvae, consistent with the higher amount of this protein detected in the hemolymph of the infected, shocked larvae. Furthermore, lysozyme-type activity was higher directly after heat shock, while antifungal activity was significantly higher also in larvae that had recovered from heat shock, in comparison to the respective values in their non-shocked, infected counterparts. These results show how changes in the external temperature modulate the immune response of G. mellonella suffering from infection with its natural pathogen B. bassiana. Copyright © 2015 Elsevier Inc. All rights reserved.
Liu, Hanhua; Wakeford, Richard; Riddell, Anthony; O'Hagan, Jacqueline; MacGregor, David; Agius, Raymond; Wilson, Christine; Peace, Mark; de Vocht, Frank
2016-03-01
Any potential health effects of radiation emitted from radionuclides deposited in the bodies of workers exposed to radioactive materials can be directly investigated through epidemiological studies. However, estimates of radionuclide exposure and consequent tissue-specific doses, particularly for early workers for whom monitoring was relatively crude but exposures tended to be highest, can be uncertain, limiting the accuracy of risk estimates. We review the use of job-exposure matrices (JEMs) in peer-reviewed epidemiological and exposure assessment studies of nuclear industry workers exposed to radioactive materials as a method for addressing gaps in exposure data, and discuss methodology and comparability between studies. We identified nine studies of nuclear worker cohorts in France, Russia, the USA and the UK that had incorporated JEMs in their exposure assessments. All these JEMs were study or cohort-specific, and although broadly comparable methodologies were used in their construction, this is insufficient to enable the transfer of any one JEM to another study. Moreover there was often inadequate detail on whether, or how, JEMs were validated. JEMs have become more detailed and more quantitative, and this trend may eventually enable better comparison across, and the pooling of, studies. We conclude that JEMs have been shown to be a valuable exposure assessment methodology for imputation of missing exposure data for nuclear worker cohorts with data not missing at random. The next step forward for direct comparison or pooled analysis of complete cohorts would be the use of transparent and transferable methods.
Mordini, Federico E; Haddad, Tariq; Hsu, Li-Yueh; Kellman, Peter; Lowrey, Tracy B; Aletras, Anthony H; Bandettini, W Patricia; Arai, Andrew E
2014-01-01
This study's primary objective was to determine the sensitivity, specificity, and accuracy of fully quantitative stress perfusion cardiac magnetic resonance (CMR) versus a reference standard of quantitative coronary angiography. We hypothesized that fully quantitative analysis of stress perfusion CMR would have high diagnostic accuracy for identifying significant coronary artery stenosis and exceed the accuracy of semiquantitative measures of perfusion and qualitative interpretation. Relatively few studies apply fully quantitative CMR perfusion measures to patients with coronary disease and comparisons to semiquantitative and qualitative methods are limited. Dual bolus dipyridamole stress perfusion CMR exams were performed in 67 patients with clinical indications for assessment of myocardial ischemia. Stress perfusion images alone were analyzed with a fully quantitative perfusion (QP) method and 3 semiquantitative methods including contrast enhancement ratio, upslope index, and upslope integral. Comprehensive exams (cine imaging, stress/rest perfusion, late gadolinium enhancement) were analyzed qualitatively with 2 methods including the Duke algorithm and standard clinical interpretation. A 70% or greater stenosis by quantitative coronary angiography was considered abnormal. The optimum diagnostic threshold for QP determined by receiver-operating characteristic curve occurred when endocardial flow decreased to <50% of mean epicardial flow, which yielded a sensitivity of 87% and specificity of 93%. The area under the curve for QP was 92%, which was superior to semiquantitative methods: contrast enhancement ratio: 78%; upslope index: 82%; and upslope integral: 75% (p = 0.011, p = 0.019, p = 0.004 vs. QP, respectively). Area under the curve for QP was also superior to qualitative methods: Duke algorithm: 70%; and clinical interpretation: 78% (p < 0.001 and p < 0.001 vs. QP, respectively). 
Fully quantitative stress perfusion CMR has high diagnostic accuracy for detecting obstructive coronary artery disease. QP outperforms semiquantitative measures of perfusion and qualitative methods that incorporate a combination of cine, perfusion, and late gadolinium enhancement imaging. These findings suggest a potential clinical role for quantitative stress perfusion CMR. Copyright © 2014 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.
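The reported QP threshold (endocardial flow below 50% of mean epicardial flow) amounts to thresholding a per-segment flow ratio, from which sensitivity, specificity, and a rank-based AUC follow directly. A sketch with invented flow ratios, not data from the study:

```python
# Illustrative sketch (made-up numbers): classifying myocardial segments as
# ischemic when endocardial flow falls below 50% of mean epicardial flow,
# mirroring the QP decision rule described in the abstract.
def classify(flow_ratios, threshold=0.5):
    """flow_ratios: endocardial/epicardial flow per segment; below threshold = positive."""
    return [r < threshold for r in flow_ratios]

def roc_auc(positives, negatives):
    """Rank-based AUC: probability that a diseased segment scores lower
    than a normal one (ties count half)."""
    wins = sum(1.0 if p < n else 0.5 if p == n else 0.0
               for p in positives for n in negatives)
    return wins / (len(positives) * len(negatives))

diseased = [0.30, 0.42, 0.48, 0.55]   # hypothetical ratios, stenosed territories
normal   = [0.62, 0.75, 0.81, 0.90]   # hypothetical ratios, normal territories
sens = sum(classify(diseased)) / len(diseased)
spec = 1 - sum(classify(normal)) / len(normal)
print(sens, spec, roc_auc(diseased, normal))  # 0.75 1.0 1.0
```

Sweeping the threshold and recomputing sensitivity/specificity at each value is what traces out the receiver-operating characteristic curve used to pick the optimum cutoff.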
Can NMR solve some significant challenges in metabolomics?
Nagana Gowda, G A; Raftery, Daniel
2015-11-01
The field of metabolomics continues to witness rapid growth driven by fundamental studies, methods development, and applications in a number of disciplines that include biomedical science, plant and nutrition sciences, drug development, energy and environmental sciences, toxicology, etc. NMR spectroscopy is one of the two most widely used analytical platforms in the metabolomics field, along with mass spectrometry (MS). NMR's excellent reproducibility and quantitative accuracy, its ability to identify structures of unknown metabolites, its capacity to generate metabolite profiles using intact bio-specimens with no need for separation, and its capabilities for tracing metabolic pathways using isotope labeled substrates offer unique strengths for metabolomics applications. However, NMR's limited sensitivity and resolution continue to pose a major challenge and have restricted both the number and the quantitative accuracy of metabolites analyzed by NMR. Further, the analysis of highly complex biological samples has increased the demand for new methods with improved detection, better unknown identification, and more accurate quantitation of larger numbers of metabolites. Recent efforts have contributed significant improvements in these areas, and have thereby enhanced the pool of routinely quantifiable metabolites. Additionally, efforts focused on combining NMR and MS promise opportunities to exploit the combined strength of the two analytical platforms for direct comparison of the metabolite data, unknown identification and reliable biomarker discovery that continue to challenge the metabolomics field. This article presents our perspectives on the emerging trends in NMR-based metabolomics and NMR's continuing role in the field with an emphasis on recent and ongoing research from our laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.
Analytical progress in the theory of vesicles under linear flow
NASA Astrophysics Data System (ADS)
Farutin, Alexander; Biben, Thierry; Misbah, Chaouqi
2010-06-01
Vesicles are becoming a quite popular model for the study of red blood cells. This is a free boundary problem which is rather difficult to handle theoretically. Quantitative computational approaches also constitute a challenge. In addition, with numerical studies it is not easy to scan the whole parameter space within a reasonable time. Therefore, having quantitative analytical results is an essential advance that provides deeper understanding of observed features and can be used to accompany and possibly guide further numerical development. In this paper, shape evolution equations for a vesicle in a shear flow are derived analytically to cubic order in the deformation of the vesicle relative to a spherical shape (previous theories stopped at quadratic order). The phase diagram distinguishing regions of parameters where different types of motion (tank treading, tumbling, and vacillating breathing) are manifested is presented. This theory reveals unsuspected features: including higher-order terms and harmonics (even if they are not directly excited by the shear flow) is necessary, even when the shape is close to a sphere. Not only does this theory cure a quite large quantitative discrepancy between previous theories and recent experiments and numerical studies, but it also reveals a new phenomenon: the vacillating breathing (VB) mode band in parameter space, which was believed to saturate after a moderate shear rate, exhibits a striking widening beyond a critical shear rate. The widening results from excitation of the fourth-order harmonic. The obtained phase diagram is in remarkably good agreement with recent three-dimensional numerical simulations based on the boundary integral formulation. Comparison of our results with experiments is systematically made.
Summary of Quantitative Interpretation of IMAGE Far Ultraviolet Auroral Data
NASA Technical Reports Server (NTRS)
Frey, H. U.; Immel, T. J.; Mende, S. B.; Gerard, J.-C.; Hubert, B.; Habraken, S.; Span, J.; Gladstone, G. R.; Bisikalo, D. V.; Shematovich, V. I.;
2002-01-01
Direct imaging of the magnetosphere by instruments on the IMAGE spacecraft is supplemented by simultaneous observations of the global aurora in three far ultraviolet (FUV) wavelength bands. The purpose of the multi-wavelength imaging is to study the global auroral particle and energy input from the magnetosphere into the atmosphere. This paper describes the method for quantitative interpretation of FUV measurements. The Wide-Band Imaging Camera (WIC) provides broad-band ultraviolet images of the aurora with maximum spatial and temporal resolution by imaging the nitrogen lines and bands between 140 and 180 nm wavelength. The Spectrographic Imager (SI), a dual-wavelength monochromatic instrument, images both Doppler-shifted Lyman alpha emissions produced by precipitating protons, in the SI-12 channel, and OI 135.6 nm emissions, in the SI-13 channel. From the SI-12 Doppler-shifted Lyman alpha images it is possible to obtain the precipitating proton flux, provided assumptions are made regarding the mean energy of the protons. Knowledge of the proton component (flux and energy) allows the calculation of the contribution produced by protons in the WIC and SI-13 instruments. Comparison of the corrected WIC and SI-13 signals provides a measure of the electron mean energy, which can then be used to determine the electron energy flux. To accomplish this, reliable emission modeling and instrument calibrations are required. In-flight calibration using early-type stars was used to validate the pre-flight laboratory calibrations and determine long-term trends in sensitivity. In general, very reasonable agreement is found between in-situ measurements and remote quantitative determinations.
Van Dyke, M I; Morton, V K; McLellan, N L; Huck, P M
2010-09-01
Quantitative PCR and a culture method were used to investigate Campylobacter occurrence over 3 years in a watershed located in southern Ontario, Canada that is used as a source of drinking water. Direct DNA extraction from river water followed by quantitative PCR analysis detected thermophilic campylobacters at low concentrations (<130 cells 100 ml⁻¹) in 57-79% of samples taken from five locations. By comparison, a culture-based method detected Campylobacter in 0-23% of samples. Water quality parameters such as total Escherichia coli were not highly correlated with Campylobacter levels, although higher pathogen concentrations were observed at colder water temperatures (<10°C). Strains isolated from river water were primarily nalidixic acid-susceptible Campylobacter lari, and selected isolates were identified as Campylobacter lari ssp. concheus. Campylobacter from wild birds (seagulls, ducks and geese) were detected at similar rates using PCR (32%) and culture-based (29%) methods, and although Campylobacter jejuni was isolated most frequently, C. lari ssp. concheus was also detected. Campylobacter were frequently detected at low concentrations in the watershed. The higher prevalence rates obtained using quantitative PCR were likely due to the formation of viable but nonculturable cells and the low recovery of the culture method. In addition to animal and human waste, waterfowl can be an important contributor of Campylobacter in the environment. Results of this study show that Campylobacter in surface water can be an important vector for human disease transmission and that method selection is important in determining pathogen occurrence in a water environment. © 2010 The Authors. Journal compilation © 2010 The Society for Applied Microbiology.
Devrim, Burcu; Dinç, Erdal; Bozkir, Asuman
2014-01-01
Diphenhydramine hydrochloride (DPH), a histamine H1-receptor antagonist, is widely used as an antiallergic, antiemetic and antitussive drug found in many pharmaceutical preparations. In this study, a new reconstitutable syrup formulation of DPH was prepared because DPH is more stable in solid form than in liquid form. The quantitative estimation of the DPH content of a reconstitutable syrup formulation in the presence of the pharmaceutical excipients D-sorbitol, sodium citrate, sodium benzoate and sodium EDTA is not possible by direct absorbance measurement. Therefore, a signal processing approach based on the continuous wavelet transform was used to determine DPH in the reconstitutable syrup formulations and to eliminate the effect of excipients on the analysis. The absorption spectra of DPH in the range of 5.0-40.0 μg/mL were recorded between 200 and 300 nm. Various wavelet families were tested, and the Biorthogonal1.1 continuous wavelet transform (BIOR1.1-CWT) was found to be the optimal signal processing family for obtaining fast and reliable determination results and for overcoming excipient interference effects. For comparison, partial least squares (PLS) and principal component regression (PCR) methods were applied to the quantitative prediction of DPH in the same samples. The validity of the proposed BIOR1.1-CWT, PLS and PCR methods was confirmed by analyzing the prepared samples containing the mentioned excipients and by using the standard addition technique. The proposed graphical and numerical approaches were found suitable for the quantitative analysis of DPH in samples containing excipients.
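The wavelet-based signal processing idea can be illustrated generically: correlate the spectrum with dilated copies of a wavelet so that broad interfering backgrounds are suppressed while band-scale features stand out. The sketch below uses a Mexican-hat wavelet (not the Biorthogonal1.1 family used in the study) on a synthetic absorption-like band; all numbers are invented:

```python
import math

def mexican_hat(t):
    """Second derivative of a Gaussian (Mexican-hat wavelet), unnormalized."""
    return (1 - t * t) * math.exp(-t * t / 2)

def cwt_row(signal, scale):
    """One row of a continuous wavelet transform: correlate the signal with
    a dilated wavelet (naive O(N*W) loop; production CWT code uses FFTs)."""
    half = int(4 * scale)
    kernel = [mexican_hat(k / scale) / math.sqrt(scale)
              for k in range(-half, half + 1)]
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for k, w in enumerate(kernel):
            j = i + k - half
            if 0 <= j < len(signal):
                acc += w * signal[j]
        out.append(acc)
    return out

# A smooth synthetic band centered at index 50; the CWT response peaks there:
spectrum = [math.exp(-((x - 50) / 5)**2) for x in range(100)]
row = cwt_row(spectrum, 5.0)
print(row.index(max(row)))  # 50
```

In the actual method, amplitudes of the transformed spectra at selected points serve as the analytical signal for calibration, because the transform nulls the slowly varying excipient contribution.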
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chacko, M; Aldoohan, S; Sonnad, J
2015-06-15
Purpose: To evaluate quantitatively dose distributions from helical, axial and cone-beam CT clinical imaging techniques by measurement using a two-dimensional (2D) diode-array detector. Methods: 2D dose distributions from selected clinical protocols used for axial, helical and cone-beam CT imaging were measured using a diode-array detector (MapCheck2). The MapCheck2 is composed of solid-state diode detectors arranged in horizontal and vertical lines with a spacing of 10 mm. A GE LightSpeed CT simulator was used to acquire axial and helical CT images, and a kV on-board imager integrated with a Varian TrueBeam-STx machine was used to acquire cone-beam CT (CBCT) images. Results: The dose distributions from axial, helical and cone-beam CT were non-uniform over the region of interest, with strong spatial and angular dependence. In axial CT, a large dose gradient was measured that decreased from the lateral sides to the middle of the phantom, due to the large superficial dose at the sides of the phantom in comparison with the larger beam attenuation at the center. The dose decreased at the superior and inferior regions in comparison to the center of the phantom in axial CT. An asymmetry was found between the right-left and superior-inferior sides of the phantom, possibly due to angular dependence in the dose distributions. The dose level and distribution varied from one imaging technique to another. For the pelvis technique, axial CT deposited a mean dose of 3.67 cGy, helical CT a mean dose of 1.59 cGy, and CBCT a mean dose of 1.62 cGy. Conclusions: MapCheck2 provides a robust tool to measure 2D dose distributions for CT imaging directly with high-spatial-resolution detectors, in comparison with an ionization chamber that provides a single-point measurement or an average dose to the phantom. The dose distributions measured with MapCheck2 account for medium heterogeneity and can represent patient-specific dose.
Towards Extending Forward Kinematic Models on Hyper-Redundant Manipulator to Cooperative Bionic Arms
NASA Astrophysics Data System (ADS)
Singh, Inderjeet; Lakhal, Othman; Merzouki, Rochdi
2017-01-01
Forward kinematics is a stepping stone towards finding an inverse solution and, subsequently, a dynamic model of a robot. A study and comparison of various Forward Kinematic Models (FKMs) is therefore necessary for robot design. This paper compares three FKMs on the same hyper-redundant Compact Bionic Handling Assistant (CBHA) manipulator under the same conditions, with the aim of informing the modeling of cooperative bionic manipulators. Two of these methods are quantitative, the Arc Geometry HTM (Homogeneous Transformation Matrix) method and the Dual Quaternion method, while the third is a hybrid method that combines quantitative and qualitative approaches. The methods are compared theoretically, and experimental results are discussed to add further insight to the comparison. HTM, the most widely used and accepted technique, is taken as the reference, and the trajectory deviations of the other techniques are evaluated with respect to it. The comparison indicates which method yields an accurate kinematic behavior of the CBHA when controlled in real time.
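The HTM approach chains one homogeneous transformation per segment and reads the end-effector pose off the product. As a minimal stand-in (a planar two-link serial arm, not the CBHA's arc-geometry model), the mechanics of composing transforms look like this:

```python
import math

def htm(theta, length):
    """Planar homogeneous transformation: rotate by theta, then translate
    along the new x-axis by the link length (3x3 matrix as nested lists)."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, length * c],
            [s,  c, length * s],
            [0,  0, 1]]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def forward_kinematics(joint_angles, link_lengths):
    """Chain the per-link HTMs; the end-effector position is the last
    column of the product (a toy analogue of the CBHA's segment chain)."""
    t = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
    for theta, l in zip(joint_angles, link_lengths):
        t = matmul(t, htm(theta, l))
    return t[0][2], t[1][2]

# Two unit links, first joint at +90 degrees, second at -90 degrees:
x, y = forward_kinematics([math.pi / 2, -math.pi / 2], [1.0, 1.0])
print(round(x, 6), round(y, 6))  # 1.0 1.0
```

For a continuum manipulator like the CBHA, each "link" transform is instead parameterized by an arc's curvature, plane orientation, and length, but the composition step is identical.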
A comparison of two IPv4/IPv6 transition mechanisms - OpenVPN and IVI
NASA Astrophysics Data System (ADS)
Vu, Cong Tuan; Tran, Quang Anh; Jiang, Frank
2012-09-01
This document presents a comparison of two IPv4/IPv6 transition mechanisms: OpenVPN and IVI. While OpenVPN is based on tunneling technology, IVI is a stateless IPv4/IPv6 translation technique developed by the China Education and Research Network (CERNET). This research focuses on a quantitative and qualitative comparison of these two mechanisms: how they are applied in practice by Internet Service Providers, as well as their advantages and drawbacks.
A Model Comparison for Characterizing Protein Motions from Structure
NASA Astrophysics Data System (ADS)
David, Charles; Jacobs, Donald
2011-10-01
A comparative study is made using three computational models that characterize native-state dynamics starting from known protein structures taken from four distinct SCOP classifications. A geometrical simulation is performed, and the results are compared to the elastic network model and molecular dynamics. The essential dynamics is quantified by a direct analysis of a mode subspace constructed from ANM and a principal component analysis on both the FRODA and MD trajectories, using the root mean square inner product and principal angles. Relative subspace sizes and overlaps are visualized using the projection of displacement vectors on the model modes. Additionally, a mode subspace is constructed from PCA on an exemplar set of X-ray crystal structures in order to determine similarity with respect to the generated ensembles. Quantitative analysis reveals significant overlap across the three model subspaces and the model-independent subspace. These results indicate that structure is the key determinant of native-state dynamics.
Language lateralization in healthy right-handers.
Knecht, S; Deppe, M; Dräger, B; Bobe, L; Lohmann, H; Ringelstein, E; Henningsen, H
2000-01-01
Our knowledge about the variability of cerebral language lateralization is derived from studies of patients with brain lesions and thus possible secondary reorganization of cerebral functions. In healthy right-handed subjects 'atypical', i.e. right hemisphere language dominance, has generally been assumed to be exceedingly rare. To test this assumption we measured language lateralization in 188 healthy subjects with moderate and strong right-handedness (59% females) by a new non-invasive, quantitative technique previously validated by direct comparison with the intracarotid amobarbital procedure. During a word generation task the averaged hemispheric perfusion differences within the territories of the middle cerebral arteries were determined. (i) The natural distribution of language lateralization was found to occur along a bimodal continuum. (ii) Lateralization was equivalent in men and women. (iii) Right hemisphere dominance was found in 7.5% of subjects. These findings indicate that atypical language dominance in healthy right-handed subjects of either sex is considerably more common than previously suspected.
NASA Astrophysics Data System (ADS)
Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei
2018-02-01
For ship targets detection in cluttered infrared image sequences, a robust detection method, based on the probabilistic single Gaussian model of sea background in Fourier domain, is put forward. The amplitude spectrum sequences at each frequency point of the pure seawater images in Fourier domain, being more stable than the gray value sequences of each background pixel in the spatial domain, are regarded as a Gaussian model. Next, a probability weighted matrix is built based on the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, the false-alarm points are removed utilizing ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with others.
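The per-frequency Gaussian background model described above can be sketched simply: fit a mean and standard deviation to each frequency point's amplitude sequence over pure-seawater frames, then flag points in a new frame that deviate by more than k standard deviations. The probability-weighting matrix and shape-feature filtering steps are omitted, and all numbers are made up:

```python
import math

def gaussian_background_model(training_amps):
    """training_amps: for each frequency point, a list of amplitude-spectrum
    values from pure-seawater frames. Returns a (mean, std) per frequency."""
    model = []
    for seq in training_amps:
        mu = sum(seq) / len(seq)
        var = sum((a - mu)**2 for a in seq) / len(seq)
        model.append((mu, math.sqrt(var)))
    return model

def foreground_points(frame_amps, model, k=3.0):
    """Flag frequency points whose amplitude deviates more than k sigma
    from the background model (a minimal stand-in for the paper's test)."""
    return [abs(a - mu) > k * sigma + 1e-12
            for a, (mu, sigma) in zip(frame_amps, model)]

# Made-up amplitudes: the second frequency point jumps when a target appears.
train = [[1.0, 1.1, 0.9, 1.0], [2.0, 2.1, 1.9, 2.0]]
model = gaussian_background_model(train)
print(foreground_points([1.05, 5.0], model))  # [False, True]
```

The point of working in the Fourier domain, per the abstract, is that these per-frequency amplitude sequences are more stable over sea clutter than per-pixel gray values, so the Gaussian assumption holds better.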
NASA Astrophysics Data System (ADS)
Tran, Henry K.; Stanton, John F.; Miller, Terry A.
2018-01-01
The limitations associated with the common practice of fitting a quadratic Hamiltonian to vibronic levels of a Jahn-Teller system have been explored quantitatively. Satisfactory results for the prototypical X̃ ²E′ state of Li3 are obtained from fits to both experimental spectral data and to an "artificial" spectrum calculated by a quartic Hamiltonian which accurately reproduces the adiabatic potential obtained from state-of-the-art quantum chemistry calculations. However, the values of the Jahn-Teller parameters, stabilization energy, and pseudo-rotation barrier obtained from the quadratic fit differ markedly from those associated with the ab initio potential. Nonetheless, the RMS deviations of the fits are not strikingly different. Guidelines are suggested for comparing parameters obtained from fits to experiment to those obtained by direct calculation, but a principal conclusion of this work is that such comparisons must be done with a high degree of caution.
Mean-Field Scaling of the Superfluid to Mott Insulator Transition in a 2D Optical Superlattice.
Thomas, Claire K; Barter, Thomas H; Leung, Tsz-Him; Okano, Masayuki; Jo, Gyu-Boong; Guzman, Jennie; Kimchi, Itamar; Vishwanath, Ashvin; Stamper-Kurn, Dan M
2017-09-08
The mean-field treatment of the Bose-Hubbard model predicts properties of lattice-trapped gases to be insensitive to the specific lattice geometry once system energies are scaled by the lattice coordination number z. We test this scaling directly by comparing coherence properties of ⁸⁷Rb gases that are driven across the superfluid to Mott insulator transition within optical lattices of either the kagome (z=4) or the triangular (z=6) geometries. The coherent fraction measured for atoms in the kagome lattice is lower than for those in a triangular lattice with the same interaction and tunneling energies. A comparison of measurements from both lattices agrees quantitatively with the scaling prediction. We also study the response of the gas to a change in lattice geometry, and observe the dynamics as a strongly interacting kagome-lattice gas is suddenly "hole doped" by introducing the additional sites of the triangular lattice.
NASA Astrophysics Data System (ADS)
Johnston, L.; Heron, S. F.; Johnson, S.; Okano, R.; Benavente, D.; Iguel, J.; Perez, D. I.; Liu, G.; Geiger, E.; Eakin, C. M.
2016-02-01
In 2013 and 2014, the Mariana Archipelago experienced consecutive thermal stress events that resulted in widespread coral bleaching and mortality. Using in situ survey data collected across seven of the Northern Mariana Islands during the 2014 event, we undertook the first quantitative comparison between the National Oceanic and Atmospheric Administration's Coral Reef Watch 5 km satellite monitoring products and coral bleaching observations. Analysis of coral community characteristics, historical temperature conditions and thermal stress revealed a strong influence of coral biodiversity in the patterns of observed bleaching. This illustrates the importance of using local benthic characteristics to interpret the level of impact from thermal stress exposure. In an era of continuing climate change, accurate monitoring of thermal stress and prediction of coral bleaching are essential for resource managers and stakeholders to direct resources to the most effective management actions to conserve coral reefs.
NASA Astrophysics Data System (ADS)
Joiris, Claude R.
2000-12-01
The summer at-sea distribution of seabirds and marine mammals was quantitatively established both in Antarctica (Weddell Sea) and in the European Arctic: Greenland, Norwegian and Barents seas. Data can directly be compared, since the same transect counts were applied by the same team from the same icebreaking ship in both regions. The main conclusion is that densities of seabirds and marine mammals are similar in open water and at the ice edge from both polar regions, while the presence of Adélie penguins, minke whales and crabeater seals in densities more than one order of magnitude higher in Antarctic pack-ice must reflect a major ecological difference between both polar systems. The ecological implications of these observations are discussed, especially concerning important primary and secondary (krill) productions under the Weddell Sea pack-ice.
Koenderink, Jan J; van Doorn, Andrea J; Wagemans, Johan
2011-01-01
Depth is the feeling of remoteness, or separateness, that accompanies awareness in human modalities like vision and audition. In specific cases depths can be graded on an ordinal scale, or even measured quantitatively on an interval scale. In the case of pictorial vision this is complicated by the fact that human observers often appear to apply mental transformations that involve depths in distinct visual directions. This implies that a comparison of empirically determined depths between observers involves pictorial space as an integral entity, whereas comparing pictorial depths as such is meaningless. We describe the formal structure of pictorial space purely in the phenomenological domain, without taking recourse to the theories of optics which properly apply to physical space, a distinct ontological domain. We introduce a number of general ways to design and implement methods of geodesy in pictorial space, and discuss some basic problems associated with such measurements. We deal mainly with conceptual issues.
NASA Astrophysics Data System (ADS)
Avoine, Amaury; Hong, Phan Ngoc; Frederich, Hugo; Frigerio, Jean-Marc; Coolen, Laurent; Schwob, Catherine; Nga, Pham Thu; Gallas, Bruno; Maître, Agnès
2012-10-01
Self-assembled artificial opals (in particular silica opals) constitute a model system to study the optical properties of three-dimensional photonic crystals. The silica optical index is a key parameter to correctly describe an opal but is difficult to measure at the submicrometer scale and usually treated as a free parameter. Here, we propose a method to extract the silica index from the opal reflection spectra and we validate it by comparison with two independent methods based on infrared measurements. We show that this index gives a correct description of the opal reflection spectra, either by a band structure or by a Bragg approximation. In particular, we are able to provide explanations in quantitative agreement with the measurements for two features: the observation of a second reflection peak in specular direction, and the quasicollapse of the p-polarized main reflection peak at a typical angle of 54°.
Techniques for optimal crop selection in a controlled ecological life support system
NASA Technical Reports Server (NTRS)
Mccormack, Ann; Finn, Cory; Dunsky, Betsy
1993-01-01
A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
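The spreadsheet decision-analysis technique mentioned above amounts to a weighted scoring of candidate crops across competing criteria. A toy sketch with invented attribute values and weights (none of these numbers come from the NASA study):

```python
# Attributes normalized to [0, 1], where 1 is most favourable
# (e.g. high regenerative ability, low volume, low power demand).
crops = {
    "wheat":   {"regeneration": 0.8, "volume": 0.4, "power": 0.5},
    "lettuce": {"regeneration": 0.3, "volume": 0.9, "power": 0.8},
    "potato":  {"regeneration": 0.7, "volume": 0.6, "power": 0.6},
}
weights = {"regeneration": 0.5, "volume": 0.25, "power": 0.25}

def score(attrs):
    """Weighted-sum decision score; higher is better."""
    return sum(weights[k] * attrs[k] for k in weights)

# Rank candidate crops by their weighted score.
ranking = sorted(crops, key=lambda c: score(crops[c]), reverse=True)
```

A formal design-optimization pass, as in the second technique, would replace these fixed scores with plant-process models and constraints on total system volume and power.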
NASA Astrophysics Data System (ADS)
van Poppel, Bret; Owkes, Mark; Nelson, Thomas; Lee, Zachary; Sowell, Tyler; Benson, Michael; Vasquez Guzman, Pablo; Fahrig, Rebecca; Eaton, John; Kurman, Matthew; Kweon, Chol-Bum; Bravo, Luis
2014-11-01
In this work, we present high-fidelity Computational Fluid Dynamics (CFD) results of liquid fuel injection from a pressure-swirl atomizer and compare the simulations to experimental results obtained using both shadowgraphy and phase-averaged X-ray computed tomography (CT) scans. The CFD and experimental results focus on the dense near-nozzle region to identify the dominant mechanisms of breakup during primary atomization. Simulations are performed using the NGA code of Desjardins et al (JCP 227 (2008)) and employ the volume of fluid (VOF) method proposed by Owkes and Desjardins (JCP 270 (2013)), a second order accurate, un-split, conservative, three-dimensional VOF scheme providing second order density fluxes and capable of robust and accurate high density ratio simulations. Qualitative features and quantitative statistics are assessed and compared for the simulation and experimental results, including the onset of atomization, spray cone angle, and drop size and distribution.
Lin, Y W; Hee, S S
1998-07-24
A rapid, reliable and effective method for direct determination of the inert components, manufacturing by-products of the pesticide, and active ingredient in two malathion formulations has been established using capillary gas chromatography-mass spectrometry (GC-MS) with the internal standard method. The C2-, C3-, and C4-alkylbenzenes, the major pesticide manufacturing by-products (O,O,S-trimethylthionophosphate, diethyl maleate and O,O,O-trimethylthionophosphate), and malathion were resolved and quantified in the same chromatogram. Structural identification was based on MS total ion current data, comparison of GC retention times with those of authentic standards, and retention indices. O,O,S-Trimethylthionophosphate was quantified at 3.57 +/- 0.31% (w/w) in one malathion formulation. While the malathion contents were within specifications for both formulations, the total alkylbenzene contents were not.
Theory of space-charge polarization for determining ionic constants of electrolytic solutions
NASA Astrophysics Data System (ADS)
Sawada, Atsushi
2007-06-01
A theoretical expression of the complex dielectric constant attributed to space-charge polarization has been derived under an electric field calculated using Poisson's equation considering the effects of bound charges on ions. The frequency dependence of the complex dielectric constant of chlorobenzene solutions doped with tetrabutylammonium tetraphenylborate (TBATPB) has been analyzed using the theoretical expression, and the impact of the bound charges on the complex dielectric constant has been clarified quantitatively in comparison with a theory that does not consider the effect of the bound charges. The Stokes radius of TBA⁺ (= TPB⁻) determined by the present theory shows a good agreement with that determined by conductometry in the past; hence, the present theory should be applicable to the direct determination of the mobility of ion species in an electrolytic solution without the need to measure ionic limiting equivalent conductance and transport number.
Electronic entanglement in late transition metal oxides.
Thunström, Patrik; Di Marco, Igor; Eriksson, Olle
2012-11-02
We present a study of the entanglement in the electronic structure of the late transition metal monoxides (MnO, FeO, CoO, and NiO) obtained by means of density-functional theory in the local density approximation combined with dynamical mean-field theory. The impurity problem is solved through exact diagonalization, which grants full access to the thermally mixed many-body ground state density operator. The quality of the electronic structure is affirmed through a direct comparison between the calculated electronic excitation spectrum and photoemission experiments. Our treatment allows for a quantitative investigation of the entanglement in the electronic structure. Two main sources of entanglement are explicitly resolved through the use of a fidelity based geometrical entanglement measure, and additional information is gained from a complementary entropic entanglement measure. We show that the interplay of crystal field effects and Coulomb interaction causes the entanglement in CoO to take a particularly intricate form.
Role of small oligomers on the amyloidogenic aggregation free-energy landscape.
He, Xianglan; Giurleo, Jason T; Talaga, David S
2010-01-08
We combine atomic-force-microscopy particle-size-distribution measurements with earlier measurements on 1-anilino-8-naphthalene sulfonate, thioflavin T, and dynamic light scattering to develop a quantitative kinetic model for the aggregation of beta-lactoglobulin into amyloid. We directly compare our simulations to the population distributions provided by dynamic light scattering and atomic force microscopy. We combine species in the simulation according to structural type for comparison with fluorescence fingerprint results. The kinetic model of amyloidogenesis leads to an aggregation free-energy landscape. We define the roles of and propose a classification scheme for different oligomeric species based on their location in the aggregation free-energy landscape. We relate the different types of oligomers to the amyloid cascade hypothesis and the toxic oligomer hypothesis for amyloid-related diseases. We discuss existing kinetic mechanisms in terms of the different types of oligomers. We provide a possible resolution to the toxic oligomer-amyloid coincidence.
The infant rat as a model of bacterial meningitis.
Moxon, E R; Glode, M P; Sutton, A; Robbins, J B
1977-08-01
The pathogenesis of bacterial meningitis was studied in infant rats. Intranasal inoculation of greater than 10³ Haemophilus influenzae type b resulted in an incidence of bacteremia that was directly related to the size of the challenge inoculum. The temporal and quantitative relationship of bacteremia to meningitis indicated that bacteria spread to the meninges by the hematogenous route and that the magnitude of bacteremia was a primary determinant in the development of meningitis. In a separate series of experiments, infant rats that were fed Escherichia coli strain C94 (O7:K1:H-) became colonized and developed bacteremia and meningitis, but invasive disease was rare when rats were fed E. coli strain Easter (O75:K100:H5). A comparison of intranasal vs. oral challenge indicated that the nasopharynx was the most effective route for inducing H. influenzae bacteremia, whereas the gastrointestinal route was the more effective challenge route for the E. coli K1 serotype.
A critical comparison of electrical methods for measuring spin-orbit torques
NASA Astrophysics Data System (ADS)
Zhang, Xuanzi; Hung, Yu-Ming; Rehm, Laura; Kent, Andrew D.
Direct (DC) and alternating current (AC) transport measurements of spin-orbit torques (SOTs) in heavy metal-ferromagnet heterostructures with perpendicular magnetic anisotropy have been proposed and demonstrated. A DC method measures the change of the perpendicular magnetization component, while an AC method probes the first and second harmonic magnetization oscillations in response to an AC current (~1 kHz). Here we conduct both types of measurements on β-Ta/CoFeB/MgO in the form of patterned Hall bars (20 μm linewidth) and compare the results. Experimental results are in qualitative agreement with a macrospin model including Slonczewski-like and field-like SOTs. However, the effective field from the AC method is larger than that obtained from the DC method. We discuss the possible origins of the discrepancy and its implications for quantitatively determining SOTs. Research supported by the SRC-INDEX program, NSF-DMR-1309202 and NYU-DURF award.
Cool running: locomotor performance at low body temperature in mammals.
Rojas, A Daniella; Körtner, Gerhard; Geiser, Fritz
2012-10-23
Mammalian torpor saves enormous amounts of energy, but a widely assumed cost of torpor is immobility and therefore vulnerability to predators. Contrary to this assumption, some small marsupial mammals in the wild move while torpid at low body temperatures to basking sites, thereby minimizing energy expenditure during arousal. Hence, we quantified how mammalian locomotor performance is affected by body temperature. The three small marsupial species tested, known to use torpor and basking in the wild, could move while torpid at body temperatures as low as 14.8-17.9°C. Speed was a sigmoid function of body temperature, but body temperature effects on running speed were greater than those in an ectothermic lizard used for comparison. We provide the first quantitative data of movement at low body temperature in mammals, which have survival implications for wild heterothermic mammals, as directional movement at low body temperature permits both basking and predator avoidance.
Kalaycıoğlu, Zeynep; Erim, F Bedia
2017-04-15
Numerous recent scientific publications investigating the health benefits of pomegranate juice have greatly increased consumer interest in this fruit. The primary cause of the positive health effect of pomegranate is the unique antioxidant activity of this fruit. As a result of the increased attention given to pomegranate, the number of countries producing pomegranate has increased and new cultivars are appearing. The purpose of this review is to quantitatively establish the antioxidant activities, the total phenolic contents which are highly correlated to antioxidant activities, and the other important ingredients of pomegranate juices obtained from cultivars of different regions. Pomegranate wine, vinegar, and sour sauce obtained directly from pomegranate juice are included in this review. Comparison of aril juices with peel and seed extracts is also given. This data could be useful to the pomegranate industry in identifying and developing cultivars having commercial value.
The development and validation of the Physical Appearance Comparison Scale-3 (PACS-3).
Schaefer, Lauren M; Thompson, J Kevin
2018-05-21
Appearance comparison processes are implicated in the development of body-image disturbance and disordered eating. The Physical Appearance Comparison Scale-Revised (PACS-R) assesses the simple frequency of appearance comparisons; however, research has suggested that other aspects of appearance comparisons (e.g., comparison direction) may moderate the association between comparisons and their negative outcomes. In the current study, the PACS-R was revised to examine aspects of comparisons with relevance to body-image and eating outcomes. Specifically, the measure was modified to examine (a) dimensions of physical appearance relevant to men and women (i.e., weight-shape, muscularity, and overall physical appearance), (b) comparisons with proximal and distal targets, (c) upward versus downward comparisons, and (d) the acute emotional impact of comparisons. The newly revised measure, labeled the PACS-3, along with existing measures of appearance comparison, body satisfaction, eating pathology, and self-esteem, was completed by 1,533 college men and women. Exploratory and confirmatory factor analyses were conducted to examine the factor structure of the PACS-3. In addition, the reliability, convergent validity, and incremental validity of the PACS-3 scores were examined. The final PACS-3 comprises 27 items and 9 subscales: Proximal: Frequency, Distal: Frequency, Muscular: Frequency, Proximal: Direction, Distal: Direction, Muscular: Direction, Proximal: Effect, Distal: Effect, and Muscular: Effect. The PACS-3 subscale scores demonstrated good reliability and convergent validity. Moreover, the PACS-3 subscales greatly improved the prediction of body satisfaction and disordered eating relative to existing measures of appearance comparison. Overall, the PACS-3 improves upon existing scales and offers a comprehensive assessment of appearance-comparison processes.
Pitarch, Elena; Hernandez, Felix; ten Hove, Jan; Meiring, Hugo; Niesing, Willem; Dijkman, Ellen; Stolker, Linda; Hogendoorn, Elbert
2004-03-26
We have investigated the potential of capillary-column-switching liquid chromatography coupled to tandem mass spectrometry (cLC-MS-MS) for the quantitative on-line trace analysis of target compounds in aqueous solutions. The technical design of the nano-scale cLC system developed at our Institute for peptide and protein identification has been tested and evaluated for the direct trace analysis of drugs in water samples. Sulphamethoxazole, bezafibrate, metoprolol, carbamazepine and bisoprolol, occurring frequently in Dutch waters, were selected as test compounds. Adequate conditions for trapping, elution and MS-MS detection were investigated by employing laboratory-made 200 microm i.d. capillary columns packed with 5 microm aqua C18 material. In the final cLC-MS-MS conditions, a 1 cm length trapping column and a 4 cm length analytical column were selected. Under these conditions, the target compounds could be directly determined in water down to a level of around 50 ng/l employing only 25 microl of water sample. Validation was done by recovery experiments in ground-, surface- and drinking-water matrices as well as by the analysis of water samples with incurred residues previously analyzed with a conventional procedure involving off-line solid-phase extraction and narrow-bore LC with MS-MS detection. The new methodology provided recoveries (50-500 ng/l level) between 50 and 114% with RSDs (n = 3, each level) below 20% for most of the compounds. Despite the somewhat lower analytical performance in comparison to the conventional procedure, the on-line approach of the new methodology is very suitable for screening of drugs in aqueous samples.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwarz, S.; Ellen, R.P.; Grove, D.A.
1987-10-01
There is limited evidence, mostly indirect, to suggest that the adherence of Bacteroides gingivalis to teeth may be enhanced by the presence of gram-positive dental plaque bacteria like Actinomyces viscosus. The purpose of this study was to carry out direct quantitative assessments of the cohesion of B. gingivalis and A. viscosus by using an in vitro assay modeled on the natural sequence in which these two species colonize the teeth. The assay allowed comparisons to be made of the adherence of ³H-labeled B. gingivalis 2561 and 381 to saliva-coated hydroxyapatite beads (S-HA) and A. viscosus WVU627- or T14V-coated S-HA (actinobeads) in equilibrium and kinetics binding studies. A series of preliminary binding studies with ³H-labeled A. viscosus and parallel studies by scanning electron microscopy with unlabeled A. viscosus were conducted to establish a protocol by which actinobeads suitable for subsequent Bacteroides adherence experiments could be prepared. By scanning electron microscopy, the actinobeads had only small gaps of exposed S-HA between essentially irreversibly bound A. viscosus cells. Furthermore, B. gingivalis cells appeared to bind preferentially to the Actinomyces cells instead of the exposed S-HA. B. gingivalis binding to both S-HA and actinobeads was saturable with at least 2 × 10⁹ to 3 × 10⁹ cells per ml, and equilibrium with saturating concentrations was reached within 10 to 20 min. B. gingivalis always bound in greater numbers to the actinobeads than to S-HA. These findings provide direct measurements supporting the concept that cohesion with dental plaque bacteria like A. viscosus may foster the establishment of B. gingivalis on teeth by enhancing its adherence.
Kager, Simone; Budhota, Aamani; Deshmukh, Vishwanath A.; Kuah, Christopher W. K.; Yam, Lester H. L.; Xiang, Liming; Chua, Karen S. G.; Masia, Lorenzo; Campolo, Domenico
2017-01-01
Proprioception is a critical component for motor functions and directly affects motor learning after neurological injuries. Conventional methods for its assessment are generally ordinal in nature and hence lack sensitivity. Robotic devices designed to promote sensorimotor learning can potentially provide quantitative, precise, accurate, and reliable assessments of sensory impairments. In this paper, we investigate the clinical applicability and validity of using a planar 2 degrees of freedom robot to quantitatively assess proprioceptive deficits in post-stroke participants. Nine stroke survivors and nine healthy subjects participated in the study. Participants’ hand was passively moved to the target position guided by the H-Man robot (Criterion movement) and were asked to indicate during a second passive movement towards the same target (Matching movement) when they felt that they matched the target position. The assessment was carried out on a planar surface for movements in the forward and oblique directions in the contralateral and ipsilateral sides of the tested arm. The matching performance was evaluated in terms of error magnitude (absolute and signed) and its variability. Stroke patients showed higher variability in the estimation of the target position compared to the healthy participants. Further, an effect of target was found, with lower absolute errors in the contralateral side. Pairwise comparisons between individual stroke participants and control participants showed significant proprioceptive deficits in two patients. The proposed assessment of passive joint position sense was inherently simple and all participants, regardless of motor impairment level, could complete it in less than 10 minutes. Therefore, the method can potentially be carried out to detect changes in proprioceptive deficits in clinical settings. PMID:29161264
Swan, Hilton B; Deschaseaux, Elisabeth S M; Jones, Graham B; Eyre, Bradley D
2017-03-01
Dimethylsulfoniopropionate (DMSP) in scleractinian coral is usually analysed indirectly as dimethylsulfide (DMS) using gas chromatography (GC) with a sulfur-specific detector. We developed a headspace GC method for mass spectral analysis of DMSP in branching coral where hexa-deuterated DMSP (d6-DMSP) was added to samples and standards to optimise the analytical precision and quantitative accuracy. Using this indirect HS-GC-MS method, we show that common coral sample handling techniques did not alter DMSP concentrations in Acropora aspera and that endogenous DMS was insignificant compared to the store of DMSP in A. aspera. Field application of the indirect HS-GC-MS method in all seasons over a 5-year period at Heron Island in the southern Great Barrier Reef indicated that healthy colonies of A. aspera ordinarily seasonally conserve their branch tip store of DMSP; however, this store increased to a higher concentration under extended thermal stress conditions driven by a strong El Niño Southern Oscillation event. A liquid chromatography mass spectral method (LC-MS) was subsequently developed for direct analysis of DMSP in branching coral, also utilising the d6-DMSP internal standard. The quantitative comparison of DMSP in four species of Acropora coral by indirect HS-GC-MS and direct LC-MS analyses gave equivalent concentrations in A. aspera only; in the other three species, HS-GC-MS gave consistently higher concentrations, indicating that indirect analysis of DMSP may lead to artificially high values for some coral species. Graphical Abstract: Dimethylsulfoniopropionate (DMSP) was quantified in Acropora spp. of branching coral using deuterated stable isotope dilution mass spectrometry.
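Internal-standard quantification with d6-DMSP, as used above, reduces to a peak-area ratio calculation. A minimal sketch with illustrative numbers; in practice the response factor would be determined from spiked calibration standards:

```python
def analyte_amount(area_analyte, area_internal_std, amount_internal_std,
                   response_factor=1.0):
    """Isotope-dilution estimate of analyte amount from the ratio of the
    analyte peak area to the co-injected internal-standard peak area.
    response_factor corrects for any difference in detector response
    between the labeled and unlabeled forms."""
    return (area_analyte / area_internal_std) * amount_internal_std / response_factor

# e.g. a 2000-count DMSP peak vs a 1000-count d6-DMSP peak
# from a 50 nmol spike (values invented for illustration):
nmol_dmsp = analyte_amount(2000.0, 1000.0, 50.0)
```

Because the deuterated standard co-elutes and behaves like the analyte, this ratio cancels losses during sample handling, which is what makes the method's precision claims possible.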
ERIC Educational Resources Information Center
Viegas, Ricardo G.; Oliveira, Armando M.; Garriga-Trillo, Ana; Grieco, Alba
2012-01-01
In order to be treated quantitatively, subjective gains and losses (utilities/disutilities) must be psychologically measured. If legitimate comparisons are sought between them, measurement must be at least interval level, with a common unit. If comparisons of absolute magnitudes across gains and losses are further sought, as in standard…
The Effects of Handwriting Instruction on Reading for Students in Grades 1 and 2
ERIC Educational Resources Information Center
Stroik, Linda R.
2016-01-01
The purpose of this quantitative quasi-experimental group comparison study using a repeated measures comparison group design with random assignment of subjects to groups was to investigate the effects of handwriting instruction on reading progress for learners in grade 1 and grade 2. At three points in time, the number of words each student read…
ERIC Educational Resources Information Center
Casanova, Manuel F.; El-Baz, Ayman; Elnakib, Ahmed; Switala, Andrew E.; Williams, Emily L.; Williams, Diane L.; Minshew, Nancy J.; Conturo, Thomas E.
2011-01-01
Multiple studies suggest that the corpus callosum in patients with autism is reduced in size. This study attempts to elucidate the nature of this morphometric abnormality by analyzing the shape of this structure in 17 high-functioning patients with autism and an equal number of comparison participants matched for age, sex, IQ, and handedness. The…
Shock, Everett L; Holland, Melanie E
2007-12-01
A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.
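The watts-per-organism framing proposed above can be captured in a few lines. A sketch under stated assumptions: the factor separating "extreme" from "plush" environments is invented for illustration, since the abstract only says that in extreme environments supply barely exceeds demand:

```python
def habitability_margin(power_supply_w, power_demand_w):
    """Power available per organism beyond maintenance demand, in watts."""
    return power_supply_w - power_demand_w

def classify_environment(supply_w, demand_w, extreme_factor=1.1):
    """Place an environment on the habitability continuum.
    'Extreme': supply only barely exceeds demand (assumed cutoff)."""
    if supply_w <= demand_w:
        return "uninhabitable"
    return "extreme" if supply_w < extreme_factor * demand_w else "plush"
```

On this view, comparing metabolisms (as in the halophile example above) means comparing their demand curves against a measured environmental power supply.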
Brigo, Francesco; Bragazzi, Nicola; Nardone, Raffaele; Trinka, Eugen
2016-11-01
The aim of this study was to conduct a meta-analysis of published studies to directly compare intravenous (IV) levetiracetam (LEV) with IV phenytoin (PHT) or IV valproate (VPA) as second-line treatment of status epilepticus (SE), to indirectly compare intravenous IV LEV with IV VPA using common reference-based indirect comparison meta-analysis, and to verify whether results of indirect comparisons are consistent with results of head-to-head randomized controlled trials (RCTs) directly comparing IV LEV with IV VPA. Random-effects Mantel-Haenszel meta-analyses to obtain odds ratios (ORs) for efficacy and safety of LEV versus VPA and LEV or VPA versus PHT were used. Adjusted indirect comparisons between LEV and VPA were used. Two RCTs comparing LEV with PHT (144 episodes of SE) and 3 RCTs comparing VPA with PHT (227 episodes of SE) were included. Direct comparisons showed no difference in clinical seizure cessation, neither between VPA and PHT (OR: 1.07; 95% CI: 0.57 to 2.03) nor between LEV and PHT (OR: 1.18; 95% CI: 0.50 to 2.79). Indirect comparisons showed no difference between LEV and VPA for clinical seizure cessation (OR: 1.16; 95% CI: 0.45 to 2.97). Results of indirect comparisons are consistent with results of a recent RCT directly comparing LEV with VPA. The absence of a statistically significant difference in direct and indirect comparisons is due to the lack of sufficient statistical power to detect a difference. Conducting an RCT with too few participants to detect a clinically important difference or to estimate an effect with sufficient precision can be regarded as a waste of time and resources and may raise several ethical concerns, especially in RCTs on SE.
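The common reference-based indirect comparison used above follows the standard Bucher construction: subtract log odds ratios through the shared comparator (PHT) and add their variances. A sketch with the point estimates quoted above but invented standard errors (the study's exact variance inputs are not reproduced here, so the resulting interval is illustrative only):

```python
import math

def bucher_indirect(or_a_vs_c, se_log_a, or_b_vs_c, se_log_b, z=1.96):
    """Adjusted indirect comparison of A vs B via a common comparator C.
    Works on the log-OR scale; variances of the two direct comparisons add.
    Returns the indirect OR and its confidence interval."""
    log_or = math.log(or_a_vs_c) - math.log(or_b_vs_c)
    se = math.sqrt(se_log_a ** 2 + se_log_b ** 2)
    ci = (math.exp(log_or - z * se), math.exp(log_or + z * se))
    return math.exp(log_or), ci

# LEV vs PHT OR = 1.18, VPA vs PHT OR = 1.07 (direct estimates above);
# the standard errors 0.44 and 0.33 are assumptions for illustration.
or_lev_vpa, ci = bucher_indirect(1.18, 0.44, 1.07, 0.33)
```

Note how the added variances widen the interval relative to either direct comparison, which is why indirect comparisons are typically underpowered, exactly the limitation the abstract emphasizes.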
Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E
2013-01-01
To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level. Clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006 and 2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.
NASA Technical Reports Server (NTRS)
Bush, V. N.
1974-01-01
Plectonema boryanum is a filamentous blue green alga. Blue green algae have a procaryotic cellular organization similar to bacteria, but are usually obligate photoautotrophs, obtaining their carbon and energy from photosynthetic mechanisms similar to those of higher plants. This research deals with a comparison of three methods of quantitating filamentous populations: microscopic cell counts, the luciferase assay for ATP, and optical density measurements.
Comparison of two laboratory-based systems for evaluation of halos in intraocular lenses
Alexander, Elsinore; Wei, Xin; Lee, Shinwook
2018-01-01
Purpose Multifocal intraocular lenses (IOLs) can be associated with unwanted visual phenomena, including halos. Predicting potential for halos is desirable when designing new multifocal IOLs. Halo images from 6 IOL models were compared using the Optikos modulation transfer function bench system and a new high dynamic range (HDR) system. Materials and methods One monofocal, 1 extended depth of focus, and 4 multifocal IOLs were evaluated. An off-the-shelf optical bench was used to simulate a distant (>50 m) car headlight and record images. A custom HDR system was constructed using an imaging photometer to simulate headlight images and to measure quantitative halo luminance data. A metric was developed to characterize halo luminance properties. Clinical relevance was investigated by correlating halo measurements to visual outcomes questionnaire data. Results The Optikos system produced halo images useful for visual comparisons; however, measurements were relative and not quantitative. The HDR halo system provided objective and quantitative measurements used to create a metric from the area under the curve (AUC) of the logarithmic normalized halo profile. This proposed metric differentiated between IOL models, and linear regression analysis found strong correlations between AUC and subjective clinical ratings of halos. Conclusion The HDR system produced quantitative, preclinical metrics that correlated to patients’ subjective perception of halos. PMID:29503526
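The AUC metric described above can be sketched roughly as follows. This is a hypothetical reconstruction, since the abstract does not give the exact normalization of the halo profile: the radial luminance profile is normalized to its peak, log10-transformed, and integrated over radius, so a brighter, wider halo keeps the log-profile near zero and scores higher. The function name `halo_auc` and the clipping `floor` are illustrative choices.

```python
import numpy as np

def halo_auc(radius_deg, luminance, floor=1e-6):
    """Illustrative AUC metric for a halo: area under the log10 of the
    peak-normalized radial luminance profile. `floor` guards against
    log10(0) in dark regions of the profile."""
    lum = np.asarray(luminance, dtype=float)
    r = np.asarray(radius_deg, dtype=float)
    prof = np.clip(lum / lum.max(), floor, None)
    vals = np.log10(prof)
    dr = np.diff(r)
    return float(np.sum(dr * (vals[:-1] + vals[1:]) / 2.0))  # trapezoid rule
```

With two synthetic profiles, the slowly decaying (brighter) halo scores higher than the rapidly decaying one, matching the intended ranking behavior of such a metric.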
The Integral Method, a new approach to quantify bactericidal activity.
Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus
2015-08-01
The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization and particularly a comparison of these substances, however, are impossible with this information. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with only one number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by implementation of the agent's concentration C, the average specific bactericidal activity SBA=BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its application comprises the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
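The core computation, the area below the killing curve used as a divisor, can be sketched as follows. This is a hedged reconstruction: the abstract states that BA is based on the reciprocal area below the killing curve but does not give the full formula, so `log10_cfu[0] ** 2 / area` is used purely as an illustrative stand-in that yields the stated units of log10 CFU/min; `bactericidal_activity` is a hypothetical name, not the authors' implementation.

```python
import numpy as np

def bactericidal_activity(t_min, log10_cfu, conc_mM=None):
    """Integral-Method-style score from a killing curve (log10 CFU vs. time
    in minutes). A = area below the curve (log10 CFU x min); a faster kill
    gives a smaller area and hence a larger activity. If a concentration is
    given, also return the specific activity SBA = BA / C."""
    y = np.asarray(log10_cfu, dtype=float)
    t = np.asarray(t_min, dtype=float)
    area = float(np.sum(np.diff(t) * (y[:-1] + y[1:]) / 2.0))  # trapezoid rule
    ba = y[0] ** 2 / area  # illustrative normalization, units log10 CFU/min
    return ba if conc_mM is None else (ba, ba / conc_mM)
```

The key property the method relies on survives any such normalization: the whole course of the curve, not a single endpoint, determines the score.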
Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin
2017-08-15
Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy with a strategy that decouples quantitation from identification was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. In the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated according to retention time and accurate mass to identify differentially expressed proteins. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
Helicopter Blade-Vortex Interaction Noise with Comparisons to CFD Calculations
NASA Technical Reports Server (NTRS)
McCluer, Megan S.
1996-01-01
A comparison of experimental acoustics data and computational predictions was performed for a helicopter rotor blade interacting with a parallel vortex. The experiment was designed to examine the aerodynamics and acoustics of parallel Blade-Vortex Interaction (BVI) and was performed in the Ames Research Center (ARC) 80- by 120-Foot Subsonic Wind Tunnel. An independently generated vortex interacted with a small-scale, nonlifting helicopter rotor at the 180 deg azimuth angle to create the interaction in a controlled environment. Computational Fluid Dynamics (CFD) was used to calculate near-field pressure time histories. The CFD code, called Transonic Unsteady Rotor Navier-Stokes (TURNS), was used to make comparisons with the acoustic pressure measurement at two microphone locations and several test conditions. The test conditions examined included hover tip Mach numbers of 0.6 and 0.7, advance ratio of 0.2, positive and negative vortex rotation, and the vortex passing above and below the rotor blade by 0.25 rotor chords. The results show that the CFD qualitatively predicts the acoustic characteristics very well, but quantitatively overpredicts the peak-to-peak sound pressure level by 15 percent in most cases. There also exists a discrepancy in the phasing (about 4 deg) of the BVI event in some cases. Additional calculations were performed to examine the effects of vortex strength, thickness, time accuracy, and directionality. This study validates the TURNS code for prediction of near-field acoustic pressures of controlled parallel BVI.
76 FR 5719 - Pattern of Violations
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-02
... safety and health record of each mine rather than on a strictly quantitative comparison of mines to... several reservations, given the methodological difficulties involved in estimating the compensating wage...
Statistical design of quantitative mass spectrometry-based proteomic experiments.
Oberg, Ann L; Vitek, Olga
2009-05-01
We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
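The blocking principle discussed above can be made concrete with a small design generator. This is a minimal sketch, assuming blocks correspond to MS batches (e.g. days or instrument runs) and that each group contributes one specimen per block, i.e. a randomized complete block design; the function name and dictionary layout are illustrative, not from the review.

```python
import random

def randomized_complete_blocks(group_samples, seed=None):
    """Assign specimens to blocks so every block holds one specimen from
    each group, with run order randomized within each block. This keeps
    group contrasts free of block (batch) effects and lets an ANOVA
    separate the block term from the group term."""
    rng = random.Random(seed)
    groups = list(group_samples)
    n_blocks = min(len(v) for v in group_samples.values())
    # Randomize which specimen of each group lands in which block.
    pools = {g: rng.sample(group_samples[g], len(group_samples[g])) for g in groups}
    blocks = []
    for b in range(n_blocks):
        block = [pools[g][b] for g in groups]
        rng.shuffle(block)  # randomize within-block run order
        blocks.append(block)
    return blocks
```

Within each block all groups are measured under the same batch conditions, so batch-to-batch drift cancels out of the group comparison rather than inflating it.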
Quantitative analysis of diffusion tensor orientation: theoretical framework.
Wu, Yu-Chien; Field, Aaron S; Chung, Moo K; Badie, Benham; Alexander, Andrew L
2004-11-01
Diffusion-tensor MRI (DT-MRI) yields information about the magnitude, anisotropy, and orientation of water diffusion of brain tissues. Although white matter tractography and eigenvector color maps provide visually appealing displays of white matter tract organization, they do not easily lend themselves to quantitative and statistical analysis. In this study, a set of visual and quantitative tools for the investigation of tensor orientations in the human brain was developed. Visual tools included rose diagrams, which are spherical coordinate histograms of the major eigenvector directions, and 3D scatterplots of the major eigenvector angles. A scatter matrix of major eigenvector directions was used to describe the distribution of major eigenvectors in a defined anatomic region. A measure of eigenvector dispersion was developed to describe the degree of eigenvector coherence in the selected region. These tools were used to evaluate directional organization and the interhemispheric symmetry of DT-MRI data in five healthy human brains and two patients with infiltrative diseases of the white matter tracts. In normal anatomical white matter tracts, a high degree of directional coherence and interhemispheric symmetry was observed. The infiltrative diseases appeared to alter the eigenvector properties of affected white matter tracts, showing decreased eigenvector coherence and interhemispheric symmetry. This novel approach distills the rich, 3D information available from the diffusion tensor into a form that lends itself to quantitative analysis and statistical hypothesis testing. (c) 2004 Wiley-Liss, Inc.
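The scatter-matrix dispersion measure described above can be sketched compactly. This is a minimal sketch of one standard formulation for axial data (sign-invariant unit vectors); the exact normalization used in the paper may differ.

```python
import numpy as np

def eigenvector_dispersion(vectors):
    """Dispersion of major-eigenvector directions in a region of interest.
    Builds the scatter matrix T = mean(v v^T) over unit vectors -- invariant
    to the sign ambiguity of DT-MRI eigenvectors -- and returns
    1 - lambda_max(T): 0 for perfectly coherent directions, approaching
    2/3 for an isotropic distribution."""
    v = np.asarray(vectors, dtype=float)
    v = v / np.linalg.norm(v, axis=1, keepdims=True)
    T = (v[:, :, None] * v[:, None, :]).mean(axis=0)  # 3x3 scatter matrix
    return float(1.0 - np.linalg.eigvalsh(T)[-1])
```

Because v and -v contribute identically to v v^T, the measure is unaffected by the arbitrary sign of each eigenvector, which an ordinary vector mean would not be.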
Solomon, Justin; Mileto, Achille; Nelson, Rendon C; Roy Choudhury, Kingshuk; Samei, Ehsan
2016-04-01
To determine if radiation dose and reconstruction algorithm affect the computer-based extraction and analysis of quantitative imaging features in lung nodules, liver lesions, and renal stones at multi-detector row computed tomography (CT). Retrospective analysis of data from a prospective, multicenter, HIPAA-compliant, institutional review board-approved clinical trial was performed by extracting 23 quantitative imaging features (size, shape, attenuation, edge sharpness, pixel value distribution, and texture) of lesions on multi-detector row CT images of 20 adult patients (14 men, six women; mean age, 63 years; range, 38-72 years) referred for known or suspected focal liver lesions, lung nodules, or kidney stones. Data were acquired between September 2011 and April 2012. All multi-detector row CT scans were performed at two different radiation dose levels; images were reconstructed with filtered back projection, adaptive statistical iterative reconstruction, and model-based iterative reconstruction (MBIR) algorithms. A linear mixed-effects model was used to assess the effect of radiation dose and reconstruction algorithm on extracted features. Among the 23 imaging features assessed, radiation dose had a significant effect on five, three, and four of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Adaptive statistical iterative reconstruction had a significant effect on three, one, and one of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). MBIR reconstruction had a significant effect on nine, 11, and 15 of the features for liver lesions, lung nodules, and renal stones, respectively (P < .002 for all comparisons). Of note, the measured size of lung nodules and renal stones with MBIR was significantly different than those for the other two algorithms (P < .002 for all comparisons). 
Although lesion texture was significantly affected by the reconstruction algorithm used (an average of 3.33 features affected by MBIR across lesion types; P < .002 for all comparisons), no significant effect of the radiation dose setting was observed for all but one of the texture features (P = .002-.998). Radiation dose settings and reconstruction algorithms affect the extraction and analysis of quantitative imaging features in lesions at multi-detector row CT.
NASA Technical Reports Server (NTRS)
Beke, Andrew; Allen, J L
1953-01-01
Aerodynamic and performance characteristics of a conical spike nacelle-type inlet with two bypasses are presented at Mach numbers of 1.6, 1.8, and 2.0 for angles of attack up to 90 degrees. The bypasses were located 6 inlet diameters downstream of the inlet and were designed to discharge the bypass mass flow outward from the body axis. The inlet was designed to attain a mass-flow ratio of unity at a Mach number of 2.0. It is shown that discharging the bypass mass flow outward from the body nearly doubles the critical drag of a similar configuration but with bypass discharge in an axial direction. As a result of this greater drag, the net force on the model in the flight direction is reduced when comparison is made with the axial discharge case. The lift and pitching-moment coefficients are slightly higher than those for a configuration without bypasses. Approximately 25% of the maximum inlet mass flow was discharged through the bypasses, and the pressure-recovery and mass-flow characteristics were in qualitative and quantitative agreement with the results of an investigation of a similar configuration with axial discharge.
Verification of a VRF Heat Pump Computer Model in EnergyPlus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nigusse, Bereket; Raustad, Richard
2013-06-15
This paper provides verification results of the EnergyPlus variable refrigerant flow (VRF) heat pump computer model using manufacturer's performance data. The paper provides an overview of the VRF model, presents the verification methodology, and discusses the results. The verification provides quantitative comparison of full- and part-load performance to manufacturer's data in cooling-only and heating-only modes of operation. The VRF heat pump computer model uses dual-range bi-quadratic performance curves to represent capacity and Energy Input Ratio (EIR) as a function of indoor and outdoor air temperatures, and dual-range quadratic performance curves as a function of part-load ratio for modeling part-load performance. These performance curves are generated directly from manufacturer's published performance data. The verification compared the simulation output directly to manufacturer's performance data and found that the dual-range equation-fit VRF heat pump computer model predicts the manufacturer's performance data very well over a wide range of indoor and outdoor temperatures and part-load conditions. The predicted capacity and electric power deviations are comparable to those of equation-fit HVAC computer models commonly used for packaged and split unitary HVAC equipment.
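The dual-range bi-quadratic curves at the heart of the model have a simple closed form, which can be sketched directly. The coefficient values and the boundary-temperature switching in the usage below are illustrative, not taken from EnergyPlus inputs or any manufacturer's data.

```python
def biquadratic(coeffs, t_in, t_out):
    """Bi-quadratic performance curve: a capacity or EIR modifier as a
    function of indoor and outdoor air temperature:
    f = a + b*Ti + c*Ti^2 + d*To + e*To^2 + f*Ti*To."""
    a, b, c, d, e, f = coeffs
    return a + b * t_in + c * t_in ** 2 + d * t_out + e * t_out ** 2 + f * t_in * t_out

def dual_range(curves, boundary, t_in, t_out):
    """Dual-range evaluation: pick the low- or high-range coefficient set
    depending on which side of the boundary temperature t_out falls."""
    coeffs = curves["low"] if t_out <= boundary else curves["high"]
    return biquadratic(coeffs, t_in, t_out)
```

Splitting the operating envelope into two ranges lets each quadratic fit stay accurate over its own half of the temperature span instead of compromising across the whole map.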
NASA Technical Reports Server (NTRS)
Bell, Thomas L.; Kundu, Prasun K.; Kummerow, Christian D.; Einaudi, Franco (Technical Monitor)
2000-01-01
Quantitative use of satellite-derived maps of monthly rainfall requires some measure of the accuracy of the satellite estimates. The rainfall estimate for a given map grid box is subject to both remote-sensing error and, in the case of low-orbiting satellites, sampling error due to the limited number of observations of the grid box provided by the satellite. A simple model of rain behavior predicts that root-mean-square (RMS) random error in grid-box averages should depend in a simple way on the local average rain rate, and the predicted behavior has been seen in simulations using surface rain-gauge and radar data. This relationship was examined using satellite SSM/I data obtained over the western equatorial Pacific during TOGA COARE. RMS error inferred directly from SSM/I rainfall estimates was found to be larger than predicted from surface data, and to depend less on local rain rate than was predicted. Preliminary examination of TRMM microwave estimates shows better agreement with surface data. A simple method of estimating RMS error in satellite rainfall estimates is suggested, based on quantities that can be directly computed from the satellite data.
Tseng, Z. Jack; Flynn, John J.
2015-01-01
Morphology serves as a ubiquitous proxy in macroevolutionary studies to identify potential adaptive processes and patterns. Inferences of functional significance of phenotypes or their evolution are overwhelmingly based on data from living taxa. Yet, correspondence between form and function has been tested in only a few model species, and those linkages are highly complex. The lack of explicit methodologies to integrate form and function analyses within a deep-time and phylogenetic context weakens inferences of adaptive morphological evolution, by invoking but not testing form–function linkages. Here, we provide a novel approach to test mechanical properties at reconstructed ancestral nodes/taxa and the strength and direction of evolutionary pathways in feeding biomechanics, in a case study of carnivorous mammals. Using biomechanical profile comparisons that provide functional signals for the separation of feeding morphologies, we demonstrate, using experimental optimization criteria on estimation of strength and direction of functional changes on a phylogeny, that convergence in mechanical properties and degree of evolutionary optimization can be decoupled. This integrative approach is broadly applicable to other clades, by using quantitative data and model-based tests to evaluate interpretations of function from morphology and functional explanations for observed macroevolutionary pathways. PMID:25994295
Stalder, Claudio; Rüggeberg, Andres; Neururer, Christoph; Spangenberg, Jorge E.; Spezzaferri, Silvia
2018-01-01
The marine environment in the Gulf of Gabes (southern Tunisia) is severely impacted by phosphate industries. Nowadays, three localities, Sfax, Skhira and Gabes, produce phosphoric acid along the coasts of this Gulf and generate a large amount of phosphogypsum as a waste product. The Gabes phosphate industry is the major cause of pollution in the Gulf because most of the waste is directly discharged into the sea without preliminary treatment. This study investigates the marine environment in the proximity of the phosphate industries of Gabes and the coastal marine environment on the eastern coast of Djerba, which has no phosphate industry. The latter site can be considered as "pristine" and enables a direct comparison between polluted and “clean” adjacent areas. Phosphorus (by sequential extraction, SEDEX), Rock-Eval pyrolysis, C, H, N elemental composition, the stable carbon isotope composition of sedimentary organic matter, and X-ray diffraction (qualitative and quantitative analysis) were measured on sediments. Temperature, pH, and dissolved oxygen were measured in the water close to the sea floor at each station to estimate environmental conditions. These analyses are coupled with video surveys of the sea floor. This study reveals clear differences in pollution and eutrophication between the investigated areas. PMID:29771969
Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory
NASA Astrophysics Data System (ADS)
Deyi, Feng; Ichikawa, M.
1989-11-01
In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition, have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
NASA Astrophysics Data System (ADS)
Al-Mousa, Amjed A.
Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the device characteristics where these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as Atomic Force Microscopy (AFM) data which we are presenting here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy technique system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. Also, we present a case study to demonstrate the effectiveness of our methodology to identify quantitatively particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. 
A comparison with other techniques, such as thresholding, watershed segmentation, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.
Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B
2015-12-04
A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
Comparison of NGA-West2 directivity models
Spudich, Paul A.; Rowshandel, Badie; Shahi, Shrey; Baker, Jack W.; Chiou, Brian S-J
2014-01-01
Five directivity models have been developed based on data from the NGA-West2 database and based on numerical simulations of large strike-slip and reverse-slip earthquakes. All models avoid the use of normalized rupture dimension, enabling them to scale up to the largest earthquakes in a physically reasonable way. Four of the five models are explicitly “narrow-band” (in which the effect of directivity is maximum at a specific period that is a function of earthquake magnitude). Several strategies for determining the zero-level for directivity have been developed. We show comparisons of maps of the directivity amplification. This comparison suggests that the predicted geographic distributions of directivity amplification are dominated by effects of the models' assumptions, and more than one model should be used for ruptures dipping less than about 65 degrees.
Susilo, Astrid P.; van Berkel, Henk
2016-01-01
Objectives To identify students’ readiness to perform self-directed learning and the underlying factors influencing it in a hybrid problem based learning curriculum. Methods A combination of quantitative and qualitative studies was conducted in five medical schools in Indonesia. In the quantitative study, the Self Directed Learning Readiness Scale was distributed to all students in all batches who had experience with the hybrid problem based curriculum. They were categorized into low and high level based on the score of the questionnaire. Three focus group discussions (low-, high-, and mixed-level) were conducted in the qualitative study, with six to twelve students chosen randomly from each group, to find the factors influencing their self-directed learning readiness. Two researchers analysed the qualitative data as a measure of triangulation. Results The quantitative study showed only half of the students had a high level of self-directed learning readiness, and a similar trend also occurred in each batch. The proportion of students with a high level of self-directed learning readiness was lower in the senior students compared to more junior students. The qualitative study showed that problem based learning processes, assessments, the learning environment, students’ lifestyles, students’ perceptions of the topics, and mood were factors influencing their self-directed learning. Conclusion A hybrid problem based curriculum may not fully support students’ self-directed learning. The curriculum system, teachers’ experiences, students’ backgrounds, and cultural factors might contribute to students’ difficulties in conducting self-directed learning. PMID:27915308
Volis, S; Ormanbekova, D; Shulgina, I
2016-04-01
Evaluating the relative importance of neutral and adaptive processes as determinants of population differentiation across environments is a central theme of evolutionary biology. We applied the QST-FST comparison flanked by a direct test for local adaptation to infer the role of climate-driven selection and gene flow in population differentiation of an annual grass Avena sterilis in two distinct parts of the species range, edge and interior, which represent two globally different climates, desert and Mediterranean. In a multiyear reciprocal transplant experiment, the plants of desert and Mediterranean origin demonstrated home advantage, and population differentiation in several phenotypic traits related to reproduction exceeded neutral predictions, as determined by comparisons of QST values with theoretical FST distributions. Thus, variation in these traits likely resulted from local adaptation to desert and Mediterranean environments. The two separate common garden experiments conducted with different experimental design revealed that two population comparisons, in contrast to multi-population comparisons, are likely to detect population differences in virtually every trait, but many of these differences reflect effects of local rather than regional environment. We detected a general reduction in neutral (SSR) genetic variation but not in adaptive quantitative trait variation in peripheral desert as compared with Mediterranean core populations. On the other hand, the molecular data indicated intensive gene flow from the Mediterranean core towards desert periphery. Although species range position in our study (edge vs. interior) was confounded with climate (desert vs. Mediterranean), the results suggest that the gene flow from the species core does not have negative consequences for either performance of the peripheral plants or their adaptive potential. © 2016 John Wiley & Sons Ltd.
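The QST statistic underlying the comparison above has a standard closed form for an outcrossing species. Below is a minimal sketch; note the paper compared observed QST values against theoretical FST distributions rather than a single point value, which this toy comparison does not capture.

```python
def qst(var_between, var_within):
    """Q_ST for a quantitative trait in an outcrossing species:
    Vb / (Vb + 2*Vw), with Vb the between-population and Vw the additive
    within-population variance component."""
    return var_between / (var_between + 2.0 * var_within)

def exceeds_neutral(var_between, var_within, fst):
    """A trait is a candidate for divergent (climate-driven) selection
    when its Q_ST exceeds the neutral expectation given by F_ST."""
    return qst(var_between, var_within) > fst
```

The factor of 2 on Vw reflects that only additive genetic variance within populations, expressed in an outcrossing mating system, contributes to the neutral expectation; QST > FST then signals more between-population divergence than drift alone would produce.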