Sample records for quantitatively similar results

  1. A Quantitative Comparison of the Similarity between Genes and Geography in Worldwide Human Populations

    PubMed Central

    Wang, Chaolong; Zöllner, Sebastian; Rosenberg, Noah A.

    2012-01-01

    Multivariate statistical techniques such as principal components analysis (PCA) and multidimensional scaling (MDS) have been widely used to summarize the structure of human genetic variation, often in easily visualized two-dimensional maps. Many recent studies have reported similarity between geographic maps of population locations and MDS or PCA maps of genetic variation inferred from single-nucleotide polymorphisms (SNPs). However, this similarity has been evident primarily in a qualitative sense; and, because different multivariate techniques and marker sets have been used in different studies, it has not been possible to formally compare genetic variation datasets in terms of their levels of similarity with geography. In this study, using genome-wide SNP data from 128 populations worldwide, we perform a systematic analysis to quantitatively evaluate the similarity of genes and geography in different geographic regions. For each of a series of regions, we apply a Procrustes analysis approach to find an optimal transformation that maximizes the similarity between PCA maps of genetic variation and geographic maps of population locations. We consider examples in Europe, Sub-Saharan Africa, Asia, East Asia, and Central/South Asia, as well as in a worldwide sample, finding that significant similarity between genes and geography exists in general at different geographic levels. The similarity is highest in our examples for Asia and, once highly distinctive populations have been removed, Sub-Saharan Africa. Our results provide a quantitative assessment of the geographic structure of human genetic variation worldwide, supporting the view that geography plays a strong role in giving rise to human population structure. PMID:22927824
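
    The Procrustes step this abstract describes can be sketched compactly: superimpose a PCA map of genetic variation onto geographic coordinates by the optimal rotation and scaling, then score the fit. Below is a minimal sketch assuming SciPy; the square of four "population" locations is invented for illustration, and the similarity statistic t0 = sqrt(1 - D) is one common way to summarize the Procrustes residual D.

```python
# Minimal Procrustes sketch: align a (toy) PCA map to (toy) geographic
# coordinates and score the similarity of the two configurations.
import numpy as np
from scipy.spatial import procrustes

geo = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])  # lon, lat

# A "PCA map" that is the same square, rotated 90 degrees and scaled 3x.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pca = 3.0 * geo @ R.T

# procrustes() standardizes both configurations, finds the optimal
# orthogonal transform, and returns the residual sum of squares D.
_, _, disparity = procrustes(geo, pca)
similarity = np.sqrt(1.0 - disparity)  # Procrustes similarity statistic t0
print(round(similarity, 6))  # -> 1.0: a pure rotation+scale is recovered exactly
```

Real analyses permute population labels to attach a p-value to t0; here the point is only that rotation and scale differences between the two maps do not count against the similarity.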

  2. A quantitative comparison of the similarity between genes and geography in worldwide human populations.

    PubMed

    Wang, Chaolong; Zöllner, Sebastian; Rosenberg, Noah A

    2012-08-01

    Multivariate statistical techniques such as principal components analysis (PCA) and multidimensional scaling (MDS) have been widely used to summarize the structure of human genetic variation, often in easily visualized two-dimensional maps. Many recent studies have reported similarity between geographic maps of population locations and MDS or PCA maps of genetic variation inferred from single-nucleotide polymorphisms (SNPs). However, this similarity has been evident primarily in a qualitative sense; and, because different multivariate techniques and marker sets have been used in different studies, it has not been possible to formally compare genetic variation datasets in terms of their levels of similarity with geography. In this study, using genome-wide SNP data from 128 populations worldwide, we perform a systematic analysis to quantitatively evaluate the similarity of genes and geography in different geographic regions. For each of a series of regions, we apply a Procrustes analysis approach to find an optimal transformation that maximizes the similarity between PCA maps of genetic variation and geographic maps of population locations. We consider examples in Europe, Sub-Saharan Africa, Asia, East Asia, and Central/South Asia, as well as in a worldwide sample, finding that significant similarity between genes and geography exists in general at different geographic levels. The similarity is highest in our examples for Asia and, once highly distinctive populations have been removed, Sub-Saharan Africa. Our results provide a quantitative assessment of the geographic structure of human genetic variation worldwide, supporting the view that geography plays a strong role in giving rise to human population structure.

  3. Molecular basis of quantitative structure-properties relationships (QSPR): a quantum similarity approach.

    PubMed

    Ponec, R; Amat, L; Carbó-Dorca, R

    1999-05-01

    Since the dawn of quantitative structure-properties relationships (QSPR), empirical parameters related to structural, electronic and hydrophobic molecular properties have been used as molecular descriptors to determine such relationships. Among all these parameters, Hammett sigma constants and the logarithm of the octanol-water partition coefficient, log P, have been massively employed in QSPR studies. In the present paper, a new molecular descriptor, based on quantum similarity measures (QSM), is proposed as a general substitute of these empirical parameters. This work continues previous analyses related to the use of QSM to QSPR, introducing molecular quantum self-similarity measures (MQS-SM) as a single working parameter in some cases. The use of MQS-SM as a molecular descriptor is first confirmed from the correlation with the aforementioned empirical parameters. The Hammett equation has been examined using MQS-SM for a series of substituted carboxylic acids. Then, for a series of aliphatic alcohols and acetic acid esters, log P values have been correlated with the self-similarity measure between density functions in water and octanol of a given molecule. And finally, some examples and applications of MQS-SM to determine QSAR are presented. In all studied cases MQS-SM appeared to be excellent molecular descriptors usable in general QSPR applications of chemical interest.
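
    The quantum similarity measures the abstract builds on reduce, in the simplest case, to overlap integrals between electron densities, normalised into a Carbó-style index C_AB = Z_AB / sqrt(Z_AA * Z_BB) with Z_AB the overlap of the two densities. The sketch below is a hedged one-dimensional illustration with toy Gaussian "densities", not real molecular densities.

```python
# Carbo-style similarity index on toy 1-D "densities" (real MQS-SM work
# uses 3-D molecular electron densities; this shows only the arithmetic).
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def gaussian_density(mu, sigma):
    rho = np.exp(-0.5 * ((x - mu) / sigma) ** 2)
    return rho / (rho.sum() * dx)        # normalise: integral of rho = 1

def overlap(rho_a, rho_b):
    return (rho_a * rho_b).sum() * dx    # Z_AB = integral of rho_A * rho_B

def carbo_index(rho_a, rho_b):
    # C_AB = Z_AB / sqrt(Z_AA * Z_BB), so C in (0, 1], with 1 for identity.
    return overlap(rho_a, rho_b) / np.sqrt(overlap(rho_a, rho_a) * overlap(rho_b, rho_b))

rho1 = gaussian_density(0.0, 1.0)
rho2 = gaussian_density(0.5, 1.0)   # slightly shifted copy: high similarity
rho3 = gaussian_density(4.0, 1.0)   # far away: low similarity
print(carbo_index(rho1, rho2) > carbo_index(rho1, rho3))  # prints True
```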

  4. Molecular basis of quantitative structure-properties relationships (QSPR): A quantum similarity approach

    NASA Astrophysics Data System (ADS)

    Ponec, Robert; Amat, Lluís; Carbó-dorca, Ramon

    1999-05-01

    Since the dawn of quantitative structure-properties relationships (QSPR), empirical parameters related to structural, electronic and hydrophobic molecular properties have been used as molecular descriptors to determine such relationships. Among all these parameters, Hammett σ constants and the logarithm of the octanol- water partition coefficient, log P, have been massively employed in QSPR studies. In the present paper, a new molecular descriptor, based on quantum similarity measures (QSM), is proposed as a general substitute of these empirical parameters. This work continues previous analyses related to the use of QSM to QSPR, introducing molecular quantum self-similarity measures (MQS-SM) as a single working parameter in some cases. The use of MQS-SM as a molecular descriptor is first confirmed from the correlation with the aforementioned empirical parameters. The Hammett equation has been examined using MQS-SM for a series of substituted carboxylic acids. Then, for a series of aliphatic alcohols and acetic acid esters, log P values have been correlated with the self-similarity measure between density functions in water and octanol of a given molecule. And finally, some examples and applications of MQS-SM to determine QSAR are presented. In all studied cases MQS-SM appeared to be excellent molecular descriptors usable in general QSPR applications of chemical interest.

  5. Structural similarity based kriging for quantitative structure activity and property relationship modeling.

    PubMed

    Teixeira, Ana L; Falcao, Andre O

    2014-07-28

    Structurally similar molecules tend to have similar properties, i.e. closer molecules in the molecular space are more likely to yield similar property values while distant molecules are more likely to yield different values. Based on this principle, we propose the use of a new method that takes into account the high dimensionality of the molecular space, predicting chemical, physical, or biological properties based on the most similar compounds with measured properties. This methodology uses ordinary kriging coupled with three different molecular similarity approaches (based on molecular descriptors, fingerprints, and atom matching) which creates an interpolation map over the molecular space that is capable of predicting properties/activities for diverse chemical data sets. The proposed method was tested in two data sets of diverse chemical compounds collected from the literature and preprocessed. One of the data sets contained dihydrofolate reductase inhibition activity data, and the second molecules for which aqueous solubility was known. The overall predictive results using kriging for both data sets comply with the results obtained in the literature using typical QSPR/QSAR approaches. However, the procedure did not involve any type of descriptor selection or even minimal information about each problem, suggesting that this approach is directly applicable to a large spectrum of problems in QSAR/QSPR. Furthermore, the predictive results improve significantly with the similarity threshold between the training and testing compounds, allowing the definition of a confidence threshold of similarity and error estimation for each case inferred. The use of kriging for interpolation over the molecular metric space is independent of the training data set size, and no reparametrizations are necessary when more compounds are added or removed from the set, and increasing the size of the database will consequentially improve the quality of the estimations. Finally it is shown
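
    The core idea above, ordinary kriging over a molecular metric space, can be sketched in a few lines: treat a similarity between compounds as their covariance, solve the kriging system with the unbiasedness constraint, and predict the query's property as a weighted average of measured neighbours. The fingerprints and property values below are invented, and Tanimoto similarity stands in for whichever of the three similarity approaches is chosen.

```python
# Ordinary-kriging sketch over a toy molecular space.
import numpy as np

def tanimoto(a, b):
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return (a & b).sum() / (a | b).sum()

train_fps = [[1, 1, 0, 1, 0, 0],
             [1, 1, 1, 1, 0, 0],
             [0, 0, 1, 0, 1, 1]]
train_y = np.array([2.0, 2.2, 5.0])   # invented measured property values
query_fp = [1, 1, 0, 1, 1, 0]

n = len(train_fps)
C = np.array([[tanimoto(f, g) for g in train_fps] for f in train_fps])
c0 = np.array([tanimoto(query_fp, f) for f in train_fps])

# Ordinary kriging system: a Lagrange multiplier row/column enforces
# that the weights sum to 1 (the unbiasedness constraint).
A = np.ones((n + 1, n + 1))
A[:n, :n] = C
A[n, n] = 0.0
b = np.append(c0, 1.0)
w = np.linalg.solve(A, b)[:n]

prediction = w @ train_y   # weighted average of the neighbours' values
print(round(prediction, 3))
```

Because the weights come from solving the system rather than from a fitted global model, adding or removing compounds needs no reparametrization, which matches the claim in the abstract.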

  6. A novel approach to molecular similarity

    NASA Astrophysics Data System (ADS)

    Cooper, David L.; Allan, Neil L.

    1989-09-01

    We review briefly the general problem of assessing the similarity between one molecule and another. We propose a novel approach to the quantitative estimation of the similarity of two electron distributions. The procedure is based on momentum space concepts, and avoids many of the difficulties associated with the usual position space definitions. Results are presented for the model systems CH3CH2CH3, CH3OCH3, CH3SCH3, H2O and H2S.

  7. Detecting Distortion: Bridging Visual and Quantitative Reasoning on Similarity Tasks

    ERIC Educational Resources Information Center

    Cox, Dana C.; Lo, Jane-Jane

    2014-01-01

    This study is focused on identifying and describing the reasoning patterns of middle grade students when examining potentially similar figures. Described here is a framework that includes 11 strategies that students used during clinical interview to differentiate similar and non-similar figures. Two factors were found to influence the strategies…

  8. Networks of plants: how to measure similarity in vegetable species.

    PubMed

    Vivaldo, Gianna; Masi, Elisa; Pandolfi, Camilla; Mancuso, Stefano; Caldarelli, Guido

    2016-06-07

    Despite the common misconception of plants as nearly static organisms, they interact continuously with the environment and with each other. It is fair to assume that during their evolution they developed particular features to overcome similar problems and to exploit possibilities offered by the environment. In this paper we introduce various quantitative measures, based on recent advances in complex network theory, that allow the effective similarities of various species to be measured. By applying this approach to similarity in fruit-typology ecological traits, we obtain a clear plant classification in a way similar to traditional taxonomic classification. This result is not trivial, since a similar analysis based on diaspore morphological properties does not provide any clear parameter for classifying plant species. Complex network theory can then be used to determine which features, among many, can be used to distinguish the scope and possibly the evolution of plants. Future uses of this approach range from functional classification to quantitative determination of plant communities in nature.

  9. Approaches to quantitating the results of differentially dyed cottons

    USDA-ARS's Scientific Manuscript database

    The differential dyeing (DD) method has served as a subjective method for visually determining immature cotton fibers. In an attempt to quantitate the results of the differential dyeing method, and thus offer an efficient means of elucidating cotton maturity without visual discretion, image analysi...

  10. Towards a chromatographic similarity index to establish localised quantitative structure-retention relationships for retention prediction. II Use of Tanimoto similarity index in ion chromatography.

    PubMed

    Park, Soo Hyun; Talebi, Mohammad; Amos, Ruth I J; Tyteca, Eva; Haddad, Paul R; Szucs, Roman; Pohl, Christopher A; Dolan, John W

    2017-11-10

    Quantitative Structure-Retention Relationships (QSRR) are used to predict retention times of compounds based only on their chemical structures encoded by molecular descriptors. The main concern in QSRR modelling is to build models with high predictive power, allowing reliable retention prediction for the unknown compounds across the chromatographic space. With the aim of enhancing the prediction power of the models, in this work, our previously proposed QSRR modelling approach called "federation of local models" is extended in ion chromatography to predict retention times of unknown ions, where a local model for each target ion (unknown) is created using only structurally similar ions from the dataset. A Tanimoto similarity (TS) score was utilised as a measure of structural similarity and training sets were developed by including ions that were similar to the target ion, as defined by a threshold value. The prediction of retention parameters (a- and b-values) in the linear solvent strength (LSS) model in ion chromatography, log k = a - b log[eluent], allows the prediction of retention times under all eluent concentrations. The QSRR models for a- and b-values were developed by a genetic algorithm-partial least squares method using the retention data of inorganic and small organic anions and larger organic cations (molecular mass up to 507) on four Thermo Fisher Scientific columns (AS20, AS19, AS11HC and CS17). The corresponding predicted retention times were calculated by fitting the predicted a- and b-values of the models into the LSS model equation. The predicted retention times were also plotted against the experimental values to evaluate the goodness of fit and the predictive power of the models. The application of a TS threshold of 0.6 was found to successfully produce predictive and reliable QSRR models (Q²ext(F2) > 0.8 and Mean Absolute Error < 0.1), and hence accurate retention time predictions with an average Mean Absolute Error of 0.2 min. Crown Copyright
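
    The LSS model quoted in the abstract turns a predicted (a, b) pair into a retention time at any eluent concentration: log k = a - b log[eluent], then t_R = t0(1 + k). A short sketch; the a, b, dead-time, and concentration values below are illustrative, not taken from the paper.

```python
# Retention-time prediction from LSS parameters: log k = a - b*log[eluent].
import math

def retention_time(a, b, eluent_conc_mM, t0_min=1.0):
    log_k = a - b * math.log10(eluent_conc_mM)  # LSS model
    k = 10.0 ** log_k                           # retention factor
    return t0_min * (1.0 + k)                   # t_R = t0 * (1 + k)

a, b = 2.0, 1.5   # hypothetical QSRR-predicted parameters
for conc in (10.0, 20.0, 40.0):
    print(conc, "mM ->", round(retention_time(a, b, conc), 2), "min")
```

As expected from the model, doubling the eluent concentration shortens retention, which is why predicting a and b once suffices for all eluent strengths.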

  11. Detecting distortion: bridging visual and quantitative reasoning on similarity tasks

    NASA Astrophysics Data System (ADS)

    Cox, Dana C.; Lo, Jane-Jane

    2014-03-01

    This study is focused on identifying and describing the reasoning patterns of middle grade students when examining potentially similar figures. Described here is a framework that includes 11 strategies that students used during clinical interview to differentiate similar and non-similar figures. Two factors were found to influence the strategies students selected: the complexity of the figures being compared and the type of distortion present in nonsimilar pairings. Data from this study support the theory that distortions are identified as a dominant property of figures and that students use the presence and absence of distortion to visually decide if two figures are similar. Furthermore, this study shows that visual reasoning is not as primitive or nonconstructive as represented in earlier literature and supports students who are developing numeric reasoning strategies. This illuminates possible pathways students may take when advancing from using visual and additive reasoning strategies to using multiplicative proportional reasoning on similarity tasks. In particular, distortion detection is a visual activity that enables students to reflect upon and evaluate the validity and accuracy of differentiation and quantify perceived relationships leading to ratio. This study has implications for curriculum developers as well as future research.

  12. Fuzzy similarity measures for ultrasound tissue characterization

    NASA Astrophysics Data System (ADS)

    Emara, Salem M.; Badawi, Ahmed M.; Youssef, Abou-Bakr M.

    1995-03-01

    Computerized ultrasound tissue characterization has become an objective means for the diagnosis of disease. It is difficult to differentiate diffuse liver diseases, namely cirrhotic and fatty liver, from a normal one by visual inspection of ultrasound images. The visual criteria for differentiating diffuse diseases are rather confusing and highly dependent upon the sonographer's experience. The need for computerized tissue characterization is thus justified: to quantitatively assist the sonographer in accurate differentiation and to minimize the risk of erroneous interpretation. In this paper we used the fuzzy similarity measure as an approximate-reasoning technique to find the maximum degree of matching between an unknown case, defined by a feature vector, and a family of prototypes (knowledge base). The feature vector used for the matching process contains 8 quantitative parameters (textural, acoustical, and speckle parameters) extracted from the ultrasound image. The steps taken to match an unknown case with the family of prototypes (cirrhotic, fatty, normal) are: choosing the membership functions for each parameter; obtaining the fuzzification matrix for the unknown case and the family of prototypes; obtaining the similarity matrix by linguistic evaluation of two fuzzy quantities; and obtaining the degree of similarity by a simple aggregation method and fuzzy integrals. Finally, we find that the similarity-measure results are comparable to neural network classification techniques, and the method can be used in medical diagnosis to determine the pathology of the liver and to monitor the extent of the disease.

  13. Qualitative versus Quantitative Results: An Experimental Introduction to Data Interpretation.

    ERIC Educational Resources Information Center

    Johnson, Eric R.; Alter, Paula

    1989-01-01

    Described is an experiment in which the student can ascertain the meaning of a negative result from a qualitative test by performing a more sensitive quantitative test on the same sample. Methodology for testing urinary glucose with a spectrophotometer at 630 nm and with commercial assaying glucose strips is presented. (MVL)

  14. Fewer Doses of HPV Vaccine Result in Immune Response Similar to Three-Dose Regimen

    MedlinePlus

    NCI News Note: Fewer doses of HPV vaccine result in immune response similar to three-dose ... that two doses of a human papillomavirus (HPV) vaccine, trademarked as Cervarix, resulted in similar serum antibody ...

  15. Lemurs and macaques show similar numerical sensitivity.

    PubMed

    Jones, Sarah M; Pearson, John; DeWind, Nicholas K; Paulsen, David; Tenekedjieva, Ana-Maria; Brannon, Elizabeth M

    2014-05-01

    We investigated the precision of the approximate number system (ANS) in three lemur species (Lemur catta, Eulemur mongoz, and Eulemur macaco flavifrons), one Old World monkey species (Macaca mulatta) and humans (Homo sapiens). In Experiment 1, four individuals of each nonhuman primate species were trained to select the numerically larger of two visual arrays on a touchscreen. We estimated numerical acuity by modeling Weber fractions (w) and found quantitatively equivalent performance among all four nonhuman primate species. In Experiment 2, we tested adult humans in a similar procedure, and they outperformed the four nonhuman species but showed qualitatively similar performance. These results indicate that the ANS is conserved over the primate order.
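
    The Weber fraction w that this abstract models has a standard closed form in approximate-number-system work: the probability of correctly choosing the larger of two numerosities n1 < n2 is Phi((n2 - n1) / (w * sqrt(n1^2 + n2^2))), where Phi is the standard normal CDF. A hedged sketch with illustrative w values, not the paper's fitted estimates:

```python
# Standard ANS discrimination model: smaller w means sharper numerical acuity.
import math

def p_correct(n1, n2, w):
    n1, n2 = sorted((n1, n2))
    z = (n2 - n1) / (w * math.hypot(n1, n2))
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2)))  # standard normal CDF

# For a close ratio (8 vs 10), a sharper (smaller) w yields more correct choices:
print(round(p_correct(8, 10, w=0.25), 3))   # finer acuity
print(round(p_correct(8, 10, w=0.55), 3))   # coarser acuity
```

Fitting w per species to observed choice data is what allows the quantitative cross-species comparison the abstract reports.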

  16. A Uniqueness Result for Self-Similar Profiles to Smoluchowski's Coagulation Equation Revisited

    NASA Astrophysics Data System (ADS)

    Niethammer, B.; Throm, S.; Velázquez, J. J. L.

    2016-07-01

    In this note we indicate how to correct the proof of a uniqueness result in [6] for self-similar solutions to Smoluchowski's coagulation equation for kernels K = K(x,y) that are homogeneous of degree zero and close to constant in the sense that $-\varepsilon \le K(x,y) - 2 \le \varepsilon \left( (x/y)^{\alpha} + (y/x)^{\alpha} \right)$ for $\alpha \in [0, 1/2)$. Under the additional assumption, in comparison to [6], that K has an analytic extension to $\mathbb{C} \setminus (-\infty, 0]$ and that the precise asymptotic behaviour of K at the origin is prescribed, we prove that self-similar solutions with given mass are unique if $\varepsilon$ is sufficiently small. The complete details of the proof are available in [4]. In addition, we give here the proof of a uniqueness result for a related but simpler problem that appears in the description of self-similar solutions for $x \to \infty$.

  17. Quantitative MR imaging in fracture dating--Initial results.

    PubMed

    Baron, Katharina; Neumayer, Bernhard; Widek, Thomas; Schick, Fritz; Scheicher, Sylvia; Hassler, Eva; Scheurer, Eva

    2016-04-01

    For exact age determinations of bone fractures in a forensic context (e.g. in cases of child abuse) improved knowledge of the time course of the healing process and use of non-invasive modern imaging technology is of high importance. To date, fracture dating is based on radiographic methods by determining the callus status and thereby relying on an expert's experience. As a novel approach, this study aims to investigate the applicability of magnetic resonance imaging (MRI) for bone fracture dating by systematically investigating time-resolved changes in quantitative MR characteristics after a fracture event. Prior to investigating fracture healing in children, adults were examined for this study in order to test the methodology for this application. Altogether, 31 MR examinations in 17 subjects (♀: 11 ♂: 6; median age 34 ± 15 y, scanned 1-5 times over a period of up to 200 days after the fracture event) were performed on a clinical 3T MR scanner (TimTrio, Siemens AG, Germany). All subjects were treated conservatively for a fracture in either a long bone or in the collar bone. Both, qualitative and quantitative MR measurements were performed in all subjects. MR sequences for a quantitative measurement of relaxation times T1 and T2 in the fracture gap and musculature were applied. Maps of quantitative MR parameters T1, T2, and magnetisation transfer ratio (MTR) were calculated and evaluated by investigating changes over time in the fractured area by defined ROIs. Additionally, muscle areas were examined as reference regions to validate this approach. Quantitative evaluation of 23 MR data sets (12 test subjects, ♀: 7 ♂: 5) showed an initial peak in T1 values in the fractured area (T1=1895 ± 607 ms), which decreased over time to a value of 1094 ± 182 ms (200 days after the fracture event). T2 values also peaked for early-stage fractures (T2=115 ± 80 ms) and decreased to 73 ± 33 ms within 21 days after the fracture event. After that time point, no

  18. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  19. Epistemic Similarities between Students' Scientific and Supernatural Beliefs

    ERIC Educational Resources Information Center

    Shtulman, Andrew

    2013-01-01

    The evidential support for scientific claims is quantitatively and qualitatively superior to that for supernatural claims, yet students may not appreciate this difference in light of the fact that both types of claims are learned in similar ways (through testimony rather than firsthand observation) and perform similar functions (explaining…

  20. Why different gas flux velocity parameterizations result in so similar flux results in the North Atlantic?

    NASA Astrophysics Data System (ADS)

    Piskozub, Jacek; Wróbel, Iwona

    2016-04-01

    The North Atlantic is a crucial region for both ocean circulation and the carbon cycle. Most ocean deep waters are produced in the basin, making it a large CO2 sink. The region, close to the major oceanographic centres, has been well covered with cruises. This is why we have performed a study of the dependence of net CO2 flux upon the choice of gas transfer velocity (k) parameterization for this very region: the North Atlantic, including the European Arctic Seas. The study has been part of the ESA-funded OceanFlux GHG Evolution project and, at the same time, of a PhD thesis (of I.W.) funded by the Centre of Polar Studies "POLAR-KNOW" (a project of the Polish Ministry of Science). Early results were presented last year at EGU 2015 as PICO presentation EGU2015-11206-1. We have used FluxEngine, a tool created within an earlier ESA-funded project (OceanFlux Greenhouse Gases), to calculate the North Atlantic and global fluxes with different gas transfer velocity formulas. During the processing of the data, we noticed that the North Atlantic results for different k formulas are more similar (in the sense of relative error) than the global ones. This was true both for parameterizations using the same power of wind speed and when comparing wind-squared and wind-cubed parameterizations. This result was interesting because North Atlantic winds are stronger than the global average. Was the similarity of the flux results caused by the fact that the parameterizations were tuned to the North Atlantic area, where many of the early cruises measuring CO2 fugacities were performed? A closer look at the parameterizations and their history showed that not all of them were based on North Atlantic data. Some of them were tuned to the Southern Ocean, with even stronger winds, while some were based on global budgets of 14C. However, we have found two reasons, not reported before in the literature, for North Atlantic fluxes being more similar than global ones for different gas transfer velocity parameterizations.

  1. Link-Based Similarity Measures Using Reachability Vectors

    PubMed Central

    Yoon, Seok-Ho; Kim, Ji-Soo; Ryu, Minsoo; Choi, Ho-Jin

    2014-01-01

    We present a novel approach for computing link-based similarities among objects accurately by utilizing the link information pertaining to the objects involved. We discuss the problems with previous link-based similarity measures and propose a novel approach for computing link based similarities that does not suffer from these problems. In the proposed approach each target object is represented by a vector. Each element of the vector corresponds to all the objects in the given data, and the value of each element denotes the weight for the corresponding object. As for this weight value, we propose to utilize the probability of reaching from the target object to the specific object, computed using the “Random Walk with Restart” strategy. Then, we define the similarity between two objects as the cosine similarity of the two vectors. In this paper, we provide examples to show that our approach does not suffer from the aforementioned problems. We also evaluate the performance of the proposed methods in comparison with existing link-based measures, qualitatively and quantitatively, with respect to two kinds of data sets, scientific papers and Web documents. Our experimental results indicate that the proposed methods significantly outperform the existing measures. PMID:24701188
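
    The approach described above reduces to two steps: represent each node by its Random-Walk-with-Restart (RWR) reachability vector, then compare nodes by the cosine of those vectors. A minimal sketch on an invented 4-node path graph, with a typical restart probability of 0.15 (an assumption, not the paper's setting):

```python
# RWR reachability vectors + cosine similarity on a toy graph.
import numpy as np

# Adjacency of a small undirected path graph: 0-1, 1-2, 2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
P = A / A.sum(axis=0)            # column-stochastic transition matrix

def rwr_vector(seed, restart=0.15, iters=200):
    r = np.zeros(len(A))
    e = np.zeros(len(A)); e[seed] = 1.0
    for _ in range(iters):       # power iteration to the RWR stationary vector
        r = (1 - restart) * P @ r + restart * e
    return r                     # r[j] = reachability weight of node j from seed

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

v0, v1, v3 = rwr_vector(0), rwr_vector(1), rwr_vector(3)
# Nodes 0 and 1 are adjacent; 0 and 3 sit at opposite ends of the path.
print(cosine(v0, v1) > cosine(v0, v3))  # prints True
```

Because every element of the vector is a reachability probability rather than a raw link count, two nodes can be similar without sharing a direct edge, which is the property the paper exploits.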

  2. Graph Kernels for Molecular Similarity.

    PubMed

    Rupp, Matthias; Schneider, Gisbert

    2010-04-12

    Molecular similarity measures are important for many cheminformatics applications like ligand-based virtual screening and quantitative structure-property relationships. Graph kernels are formal similarity measures defined directly on graphs, such as the (annotated) molecular structure graph. Graph kernels are positive semi-definite functions, i.e., they correspond to inner products. This property makes them suitable for use with kernel-based machine learning algorithms such as support vector machines and Gaussian processes. We review the major types of kernels between graphs (based on random walks, subgraphs, and optimal assignments, respectively), and discuss their advantages, limitations, and successful applications in cheminformatics. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
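
    One kernel family the review covers, the random-walk kernel, can be sketched on unlabelled toy graphs: build the direct-product graph of the two inputs and sum a geometric series over walk lengths, k(G, H) = sum of the entries of (I - lam * Ax)^{-1}, where Ax is the product graph's adjacency. The graphs and the decay parameter lam below are illustrative; lam must be smaller than 1 over the largest eigenvalue of Ax for the series to converge.

```python
# Geometric random-walk graph kernel on the direct-product graph.
import numpy as np

def random_walk_kernel(A1, A2, lam=0.1):
    Ax = np.kron(A1, A2)                 # adjacency of the direct-product graph
    n = Ax.shape[0]
    # Sum over all walk lengths: sum_k lam^k * (number of common walks of length k).
    return np.linalg.inv(np.eye(n) - lam * Ax).sum()

path3 = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], float)   # path P3
tri   = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], float)   # triangle K3

k_pp = random_walk_kernel(path3, path3)
k_pt = random_walk_kernel(path3, tri)
k_tt = random_walk_kernel(tri, tri)

# Positive semi-definiteness gives a Cauchy-Schwarz-style normalisation to (0, 1]:
sim = k_pt / np.sqrt(k_pp * k_tt)
print(0.0 < sim <= 1.0)  # prints True
```

The normalised value behaves like a cosine in the kernel's feature space, which is what makes these kernels drop-in similarity measures for SVMs and Gaussian processes.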

  3. Can deja vu result from similarity to a prior experience? Support for the similarity hypothesis of deja vu.

    PubMed

    Cleary, Anne M; Ryals, Anthony J; Nomi, Jason S

    2009-12-01

    The strange feeling of having been somewhere or done something before--even though there is evidence to the contrary--is called déjà vu. Although déjà vu is beginning to receive attention among scientists (Brown, 2003, 2004), few studies have empirically investigated the phenomenon. We investigated the hypothesis that déjà vu is related to feelings of familiarity and that it can result from similarity between a novel scene and that of a scene experienced in one's past. We used a variation of the recognition-without-recall method of studying familiarity (Cleary, 2004) to examine instances in which participants failed to recall a studied scene in response to a configurally similar novel test scene. In such instances, resemblance to a previously viewed scene increased both feelings of familiarity and of déjà vu. Furthermore, in the absence of recall, resemblance of a novel scene to a previously viewed scene increased the probability of a reported déjà vu state for the novel scene, and feelings of familiarity with a novel scene were directly related to feelings of being in a déjà vu state.

  4. What Really Happens in Quantitative Group Research? Results of a Content Analysis of Recent Quantitative Research in "JSGW"

    ERIC Educational Resources Information Center

    Boyle, Lauren H.; Whittaker, Tiffany A.; Eyal, Maytal; McCarthy, Christopher J.

    2017-01-01

    The authors conducted a content analysis on quantitative studies published in "The Journal for Specialists in Group Work" ("JSGW") between 2012 and 2015. This brief report provides a general overview of the current practices of quantitative group research in counseling. The following study characteristics are reported and…

  5. Laparoscopic and open subtotal colectomies have similar short-term results.

    PubMed

    Hoogenboom, Froukje J; Bosker, Robbert J I; Groen, Henk; Meijerink, Wilhelmus J H J; Lamme, Bas; Pierie, Jean Pierre E N

    2013-01-01

    Laparoscopic subtotal colectomy (STC) is a complex procedure. It is possible that short-term benefits for segmental resections cannot be attributed to this complex procedure. This study aims to assess differences in short-term results for laparoscopic versus open STC during a 15-year single-institute experience. We reviewed consecutive patients undergoing laparoscopic or open elective or subacute STC from January 1997 to December 2012. Fifty-six laparoscopic and 50 open STCs were performed. The operation time was significantly longer in the laparoscopic group, median 266 min (range 121-420 min), compared to 153 min (range 90-408 min) in the open group (p < 0.001). Median hospital stay showed no statistical difference, 14 days (range 1-129 days) in the laparoscopic and 13 days (range 1-85 days) in the open group. Between-group postoperative complications were not statistically different. Laparoscopic STC has short-term results similar to the open procedure, except for a longer operation time. The laparoscopic approach for STC is therefore only advisable in selected patients combined with extensive preoperative counseling. Copyright © 2013 S. Karger AG, Basel.

  6. Similar Spectral Power Densities Within the Schumann Resonance and a Large Population of Quantitative Electroencephalographic Profiles: Supportive Evidence for Koenig and Pobachenko.

    PubMed

    Saroka, Kevin S; Vares, David E; Persinger, Michael A

    2016-01-01

    In 1954 and 1960, Koenig and his colleagues described the remarkable similarities in spectral power density profiles and patterns between the earth-ionosphere resonance and human brain activity, which also share magnitudes for both electric field (mV/m) and magnetic field (pT) components. In 2006, Pobachenko and colleagues reported real-time coherence between variations in the Schumann and brain activity spectra within the 6-16 Hz band for a small sample. We examined the ratios of the average potential differences (~3 μV) obtained by whole-brain quantitative electroencephalography (QEEG) between rostral-caudal and left-right (hemispheric) comparisons of 238 measurements from 184 individuals over a 3.5-year period. Spectral densities for the rostral-caudal axis revealed a powerful peak at 10.25 Hz, while the left-right peak was 1.95 Hz, with beat-differences of ~7.5 to 8 Hz. When global cerebral measures were employed, the first (7-8 Hz), second (13-14 Hz) and third (19-20 Hz) harmonics of the Schumann resonances were discernible in averaged QEEG profiles in some but not all participants. The intensity of the endogenous Schumann resonance was related to the 'best-of-fitness' of the traditional 4-class microstate model. Additional measurements demonstrated real-time coherence, for durations approximating microstates, in spectral power density variations between Schumann frequencies measured in Sudbury, Canada and Cumiana, Italy and the QEEGs of local subjects. Our results confirm the measurements reported by earlier researchers that demonstrated unexpected similarities in the spectral patterns and strengths of electromagnetic fields generated by the human brain and the earth-ionospheric cavity.

  8. Linear Quantitative Profiling Method Fast Monitors Alkaloids of Sophora Flavescens That Was Verified by Tri-Marker Analyses.

    PubMed

    Hou, Zhifei; Sun, Guoxiang; Guo, Yong

    2016-01-01

    The present study demonstrated the use of the Linear Quantitative Profiling Method (LQPM) to evaluate the quality of Alkaloids of Sophora flavescens (ASF) based on chromatographic fingerprints in an accurate, economical and fast way. Both linear qualitative and quantitative similarities were calculated in order to monitor the consistency of the samples. The results indicate that the linear qualitative similarity (LQLS) is not sufficiently discriminating due to the predominant presence of three alkaloid compounds (matrine, sophoridine and oxymatrine) in the test samples; however, the linear quantitative similarity (LQTS) was able to clearly distinguish the samples based on differences in the quantitative content of all the chemical components. In addition, the fingerprint analysis was also supported by the quantitative analysis of three marker compounds. The LQTS was found to be highly correlated with the contents of the marker compounds, indicating that quantitative analysis of the marker compounds may be substituted by the LQPM based on chromatographic fingerprints for the purpose of quantifying all chemicals of a complex sample system. Furthermore, once a reference fingerprint (RFP) has been developed from a standard preparation by direct detection and the composition similarities have been calculated, the LQPM can employ the classical mathematical model to quantify the multiple components of ASF samples effectively without any chemical standard.

  10. Exploring discrepancies between quantitative validation results and the geomorphic plausibility of statistical landslide susceptibility maps

    NASA Astrophysics Data System (ADS)

    Steger, Stefan; Brenning, Alexander; Bell, Rainer; Petschko, Helene; Glade, Thomas

    2016-06-01

    Empirical models are frequently applied to produce landslide susceptibility maps for large areas. Subsequent quantitative validation results are routinely used as the primary criteria to infer the validity and applicability of the final maps or to select one of several models. This study hypothesizes that such direct deductions can be misleading. The main objective was to explore discrepancies between the predictive performance of a landslide susceptibility model and the geomorphic plausibility of the resulting landslide susceptibility maps, with a particular emphasis on the influence of incomplete landslide inventories on modelling and validation results. The study was conducted within the Flysch Zone of Lower Austria (1,354 km²), which is known to be highly susceptible to landslides of the slide-type movement. Sixteen susceptibility models were generated by applying two statistical classifiers (logistic regression and generalized additive model) and two machine learning techniques (random forest and support vector machine) separately for two landslide inventories of differing completeness and two predictor sets. The results were validated quantitatively by estimating the area under the receiver operating characteristic curve (AUROC) with single holdout and spatial cross-validation techniques. The heuristic evaluation of the geomorphic plausibility of the final results was supported by findings of an exploratory data analysis, an estimation of odds ratios and an evaluation of the spatial structure of the final maps. The results showed that maps generated by different inventories, classifiers and predictors appeared different, while holdout validation revealed similarly high predictive performances. Spatial cross-validation proved useful for exposing spatially varying inconsistencies in the modelling results while additionally providing evidence for slightly overfitted machine learning-based models. However, the highest predictive performances were obtained for…
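
    The AUROC used for quantitative validation in records like this one can be computed directly as the normalized Mann-Whitney U statistic: the probability that a randomly chosen landslide cell receives a higher susceptibility score than a randomly chosen non-landslide cell. A minimal sketch with hypothetical scores and labels (not the authors' spatial cross-validation code):

```python
def auroc(scores, labels):
    # AUROC as the normalized Mann-Whitney U statistic: the fraction of
    # positive/negative pairs in which the positive outranks the negative
    # (ties count half). Assumes labels are 0/1 with both classes present.
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# hypothetical susceptibility scores; label 1 = observed landslide cell
scores = [0.9, 0.8, 0.75, 0.4, 0.3, 0.1]
labels = [1, 1, 0, 1, 0, 0]
```

A perfectly separating model scores 1.0 and a random one about 0.5; holdout and spatial cross-validation differ only in how the held-out scores and labels are collected.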

  11. How Are Notions of Childcare Similar or Different among American, Chinese, Japanese and Swedish Teachers?

    ERIC Educational Resources Information Center

    Izumi-Taylor, Satomi; Lee, Yu-Yuan; Franceschini, Louis

    2011-01-01

    The purpose of this study was to examine similarities and differences in the perceptions of childcare among American, Chinese, Japanese and Swedish early childhood teachers. Participants consisted of 78 American teachers, 156 Chinese teachers, 158 Japanese teachers, and 157 Swedish teachers. The results of quantitative analysis revealed that these…

  12. Interlaboratory Comparison of Quantitative PCR Test Results for Dehalococcoides

    EPA Science Inventory

    Quantitative PCR (qPCR) techniques have been widely used to measure Dehalococcoides (Dhc) DNA in the groundwater at field sites for several years. Interpretation of these data may be complicated when different laboratories using alternate methods conduct the analysis. An...

  13. Comparison of 99mTc-MDP SPECT qualitative vs quantitative results in patients with suspected condylar hyperplasia.

    PubMed

    López Buitrago, D F; Ruiz Botero, J; Corral, C M; Carmona, A R; Sabogal, A

    To compare qualitative vs quantitative results of Single Photon Emission Computerised Tomography (SPECT), calculated from the percentage of 99mTc-MDP (methylene diphosphonate) uptake, in condyles of patients with a presumptive clinical diagnosis of condylar hyperplasia. A retrospective, descriptive study was conducted on the 99mTc-MDP SPECT bone scintigraphy reports of 51 patients with a clinical impression of facial asymmetry related to condylar hyperplasia, referred by their specialist in orthodontics or maxillofacial surgery to a nuclear medicine department for this examination. Quantitative data on the 99mTc-MDP condylar uptake of each patient were obtained and compared with the qualitative image interpretation reported by a nuclear medicine expert. The concordance between the 51 qualitative and quantitative report results was established. The total sample included 32 women (63%) and 19 men (37%). The patient age range was 13-45 years (21±8 years). According to the qualitative reports, 19 patients were positive for right-side condylar hyperplasia and 12 for left-side condylar hyperplasia, with 8 bilateral and 12 negative. The quantitative reports diagnosed 16 positives for right-side condylar hyperplasia, 10 for left-side condylar hyperplasia, and 25 negatives. Nuclear medicine images are an important diagnostic tool, but the qualitative interpretation of the images is not as reliable as the quantitative calculation. The agreement between the two types of report is low (39.2%, Kappa=0.13; P>.2). The main limitation of quantitative reports is that they do not register bilateral condylar hyperplasia cases. Copyright © 2017 Elsevier España, S.L.U. y SEMNIM. All rights reserved.

  14. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  15. Fewer doses of HPV vaccine result in immune response similar to three-dose regimen

    Cancer.gov

    NCI scientists report that two doses of a human papillomavirus (HPV) vaccine, trademarked as Cervarix, resulted in similar serum antibody levels against two of the most carcinogenic types of HPV (16 and 18), compared to a standard three dose regimen.

  16. Behavioral similarity measurement based on image processing for robots that use imitative learning

    NASA Astrophysics Data System (ADS)

    Sterpin B., Dante G.; Martinez S., Fernando; Jacinto G., Edwar

    2017-02-01

    In the field of artificial societies, particularly those based on memetics, imitative behavior is essential for the development of cultural evolution. Applying this concept to robotics, through imitative learning a robot can acquire behavioral patterns from another robot. Assuming that the learning process must have an instructor and at least one apprentice, obtaining a quantitative measure of their behavioral similarity would be potentially useful, especially in artificial social systems focused on cultural evolution. In this paper the motor behavior of both kinds of robots, for two simple tasks, is represented by 2D binary images, which are processed in order to measure their behavioral similarity. The results shown here were obtained by comparing several similarity measurement methods for binary images.

  17. Predicting the performance of fingerprint similarity searching.

    PubMed

    Vogt, Martin; Bajorath, Jürgen

    2011-01-01

    Fingerprints are bit string representations of molecular structure that typically encode structural fragments, topological features, or pharmacophore patterns. Various fingerprint designs are utilized in virtual screening and their search performance essentially depends on three parameters: the nature of the fingerprint, the active compounds serving as reference molecules, and the composition of the screening database. It is of considerable interest and practical relevance to predict the performance of fingerprint similarity searching. A quantitative assessment of the potential that a fingerprint search might successfully retrieve active compounds, if available in the screening database, would substantially help to select the type of fingerprint most suitable for a given search problem. The method presented herein utilizes concepts from information theory to relate the fingerprint feature distributions of reference compounds to screening libraries. If these feature distributions do not sufficiently differ, active database compounds that are similar to reference molecules cannot be retrieved because they disappear in the "background." By quantifying the difference in feature distribution using the Kullback-Leibler divergence and relating the divergence to compound recovery rates obtained for different benchmark classes, fingerprint search performance can be quantitatively predicted.
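
    The information-theoretic idea in this record can be sketched in a few lines: treat each fingerprint bit as a Bernoulli feature, estimate its frequency in the reference set and in the screening database, and sum the per-bit Kullback-Leibler divergences. The toy bit vectors and the exact divergence formulation below are illustrative assumptions, not the authors' published protocol:

```python
import math

def bit_frequencies(fps):
    # per-bit set frequencies, clipped away from 0 and 1 so log() stays finite
    n, m = len(fps), len(fps[0])
    eps = 1e-6
    return [min(max(sum(fp[j] for fp in fps) / n, eps), 1 - eps) for j in range(m)]

def kl_divergence(p, q):
    # summed Kullback-Leibler divergence between two sets of Bernoulli
    # feature distributions p and q
    return sum(pj * math.log(pj / qj) + (1 - pj) * math.log((1 - pj) / (1 - qj))
               for pj, qj in zip(p, q))

refs = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 0, 0]]  # active reference compounds
db = [[0, 1, 1, 0], [0, 0, 1, 1], [1, 0, 1, 0]]    # screening database
divergence = kl_divergence(bit_frequencies(refs), bit_frequencies(db))
```

A divergence near zero means the actives "disappear in the background"; larger values suggest a fingerprint search is more likely to retrieve them.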

  18. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions from different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and methods developed for other network types cannot be applied to them. We therefore aimed to propose quantitative comparison methods for coexpression networks and to use the new method to find common biological mechanisms between Huntington's disease and brain aging. We proposed two similarity measures for the quantitative comparison of coexpression networks. We then performed experiments using known coexpression networks, showed the validity of the two measures, and evaluated threshold values for similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response. This suggests that up-regulated cell signalling related to cell death and immune/inflammation response may be the common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.

  19. Determining quantitative immunophenotypes and evaluating their implications

    NASA Astrophysics Data System (ADS)

    Redelman, Douglas; Hudig, Dorothy; Berner, Dave; Castell, Linda M.; Roberts, Don; Ensign, Wayne

    2002-05-01

    Quantitative immunophenotypes varied widely among > 100 healthy young males but were maintained at characteristic levels within individuals. The initial results (SPIE Proceedings 4260:226) that examined cell numbers and the quantitative expression of adhesion and lineage-specific molecules, e.g., CD2 and CD14, have now been confirmed and extended to include the quantitative expression of inducible molecules such as HLA-DR and perforin (Pf). Some properties, such as the ratio of T helper (Th) to T cytotoxic/suppressor (Tc/s) cells, are known to be genetically determined. Other properties, e.g., the T:B cell ratio, the amount of CD19 per B cell, etc., behaved similarly and may also be inherited traits. Since some patterns observed in these healthy individuals resembled those found in pathological situations, we tested whether the patterns could be associated with the occurrence of disease. The current study shows that there were associations between quantitative immunophenotypes and the subsequent incidence and severity of disease. For example, individuals with characteristically low levels of HLA-DR or B cells or reduced numbers of Pf+ Tc/s cells had more frequent and/or more severe upper respiratory infections. Quantitative immunophenotypes will be more widely measured if the necessary standards are available and if appropriate procedures are made more accessible.

  20. Modified DTW for a quantitative estimation of the similarity between rainfall time series

    NASA Astrophysics Data System (ADS)

    Djallel Dilmi, Mohamed; Barthès, Laurent; Mallet, Cécile; Chazottes, Aymeric

    2017-04-01

    …interpretation of this inter-correlation is not straightforward. We propose here to use an improvement of the Euclidean distance which integrates the global complexity of the rainfall series. The Dynamic Time Warping (DTW) algorithm used in speech recognition allows matching two time series that are shifted in time and provides the most probable time lag. However, the original formulation of the DTW suffers from some limitations; in particular, it is not well suited to rain intermittency. In this study we present an adaptation of the DTW for the analysis of rainfall time series: we used time series from the "Météo France" rain gauge network observed between January 1st, 2007 and December 31st, 2015 at 25 stations located in the Île de France area. We then analyze the results (e.g., the distance, and the relationship between the time lag detected by our method and other measured parameters such as wind speed and direction…) to show the ability of the proposed similarity measure to provide useful information on the rain structure. The possibility of using this similarity measure to define a quality indicator for a sensor integrated into an observation network is also envisaged.
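
    The unmodified algorithm that this record adapts can be sketched as follows; this is the textbook DTW recurrence with illustrative rain series, not the rain-intermittency variant proposed in the study:

```python
def dtw_distance(a, b):
    # classic dynamic time warping: D[i][j] is the cheapest alignment cost of
    # a[:i] and b[:j], with absolute difference as the local cost
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

# a rain event and the same event delayed by one time step: the warp absorbs
# the lag, so the DTW distance is 0 even though a point-by-point Euclidean
# comparison of the two series would not be
rain = [0, 0, 2, 5, 3, 0, 0, 0]
lagged = [0, 0, 0, 2, 5, 3, 0, 0]
```

The time lag itself can be recovered by backtracking the optimal warping path through `D`, which is what makes the method attractive for comparing rain gauges swept by the same advected rain cell.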

  1. Hydrophobic ionic liquids for quantitative bacterial cell lysis with subsequent DNA quantification.

    PubMed

    Fuchs-Telka, Sabine; Fister, Susanne; Mester, Patrick-Julian; Wagner, Martin; Rossmanith, Peter

    2017-02-01

    DNA is one of the most frequently analyzed molecules in the life sciences. In this article we describe a simple and fast protocol for quantitative DNA isolation from bacteria based on hydrophobic ionic liquid supported cell lysis at elevated temperatures (120-150 °C) for subsequent PCR-based analysis. From a set of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide was identified as the most suitable for quantitative cell lysis and DNA extraction because of limited quantitative PCR inhibition by the aqueous eluate as well as no detectable DNA uptake. The newly developed method was able to efficiently lyse Gram-negative bacterial cells, whereas Gram-positive cells were protected by their thick cell wall. The performance of the final protocol resulted in quantitative DNA extraction efficiencies for Gram-negative bacteria similar to those obtained with a commercial kit, whereas the number of handling steps, and especially the time required, was dramatically reduced. Graphical Abstract: After careful evaluation of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide ([BMPyr+][Ntf2−]) was identified as the most suitable ionic liquid for quantitative cell lysis and DNA extraction. When used for Gram-negative bacteria, the protocol presented is simple and very fast and achieves DNA extraction efficiencies similar to those obtained with a commercial kit. ddH2O, double-distilled water; qPCR, quantitative PCR.

  2. Rheumatic Heart Disease and Myxomatous Degeneration: Differences and Similarities of Valve Damage Resulting from Autoimmune Reactions and Matrix Disorganization.

    PubMed

    Martins, Carlo de Oliveira; Demarchi, Lea; Ferreira, Frederico Moraes; Pomerantzeff, Pablo Maria Alberto; Brandao, Carlos; Sampaio, Roney Orismar; Spina, Guilherme Sobreira; Kalil, Jorge; Cunha-Neto, Edecio; Guilherme, Luiza

    2017-01-01

    Autoimmune inflammatory reactions leading to rheumatic fever (RF) and rheumatic heart disease (RHD) result from untreated Streptococcus pyogenes throat infections in individuals who exhibit genetic susceptibility. Immune effector mechanisms have been described that lead to heart tissue damage culminating in mitral and aortic valve dysfunctions. In myxomatous valve degeneration (MXD), the mitral valve is also damaged, but through non-inflammatory mechanisms. Both diseases are characterized by structural valve disarray, and a previous proteomic analysis disclosed distinct profiles of differentially expressed matrix/structural proteins. Given their relevance in organizing valve tissue, we quantitatively evaluated the expression of vimentin, collagen VI, lumican, and vitronectin and performed immunohistochemical analysis of their distribution in valve tissue lesions of patients with both diseases. We identified abundant expression of two isoforms of vimentin (45 kDa, 42 kDa) with reduced expression of the full-size protein (54 kDa) in RHD valves. We also found increased vitronectin expression, reduced collagen VI expression and similar lumican expression between RHD and MXD valves. Immunohistochemical analysis indicated disrupted patterns of these proteins in myxomatous degeneration valves and disorganized distribution in rheumatic heart disease valves that correlated with clinical manifestations such as valve regurgitation or stenosis. Confocal microscopy analysis revealed a diverse pattern of distribution of collagen VI and lumican in RHD and MXD valves. Altogether, these results demonstrated distinct patterns of altered valve expression and tissue distribution/organization of structural/matrix proteins that play important pathophysiological roles in both valve diseases.

  3. Introduction to Quantitative Science, a Ninth-Grade Laboratory-Centered Course Stressing Quantitative Observation and Mathematical Analysis of Experimental Results. Final Report.

    ERIC Educational Resources Information Center

    Badar, Lawrence J.

    This report, in the form of a teacher's guide, presents materials for a ninth grade introductory course on Introduction to Quantitative Science (IQS). It is intended to replace a traditional ninth grade general science with a process oriented course that will (1) unify the sciences, and (2) provide a quantitative preparation for the new science…

  4. Guidelines for Reporting Quantitative Methods and Results in Primary Research

    ERIC Educational Resources Information Center

    Norris, John M.; Plonsky, Luke; Ross, Steven J.; Schoonen, Rob

    2015-01-01

    Adequate reporting of quantitative research about language learning involves careful consideration of the logic, rationale, and actions underlying both study designs and the ways in which data are analyzed. These guidelines, commissioned and vetted by the board of directors of "Language Learning," outline the basic expectations for…

  5. Parent-Child Similarity in Environmental Attitudes: A Pairwise Comparison

    ERIC Educational Resources Information Center

    Leppanen, Jaana M.; Haahla, Anu E.; Lensu, Anssi M.; Kuitunen, Markku T.

    2012-01-01

    Are adolescents' environmental attitudes similar to their parents' attitudes? The main objective of this study is to examine what quantitative associations, if any, exist in parent-child environmental attitudes within the family. The survey data was collected assessing attitudes toward the environment and nature from 15-year-old students (n = 237)…

  6. Similarly shaped letters evoke similar colors in grapheme-color synesthesia.

    PubMed

    Brang, David; Rouw, Romke; Ramachandran, V S; Coulson, Seana

    2011-04-01

    Grapheme-color synesthesia is a neurological condition in which viewing numbers or letters (graphemes) results in the concurrent sensation of color. While the anatomical substrates underlying this experience are well understood, little research to date has investigated factors influencing the particular colors associated with particular graphemes or how synesthesia occurs developmentally. A recent suggestion of such an interaction has been proposed in the cascaded cross-tuning (CCT) model of synesthesia, which posits that in synesthetes connections between grapheme regions and color area V4 participate in a competitive activation process, with synesthetic colors arising during the component-stage of grapheme processing. This model more directly suggests that graphemes sharing similar component features (lines, curves, etc.) should accordingly activate more similar synesthetic colors. To test this proposal, we created and regressed synesthetic color-similarity matrices for each of 52 synesthetes against a letter-confusability matrix, an unbiased measure of visual similarity among graphemes. Results of synesthetes' grapheme-color correspondences indeed revealed that more similarly shaped graphemes corresponded with more similar synesthetic colors, with stronger effects observed in individuals with more intense synesthetic experiences (projector synesthetes). These results support the CCT model of synesthesia, implicate early perceptual mechanisms as driving factors in the elicitation of synesthetic hues, and further highlight the relationship between conceptual and perceptual factors in this phenomenon. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. Similarities and differences between dreaming and waking cognition: an exploratory study.

    PubMed

    Kahan, T L; LaBerge, S; Levitan, L; Zimbardo, P

    1997-03-01

    Thirty-eight "practiced" dreamers (Study 1) and 50 "novice" dreamers (Study 2) completed questionnaires assessing the cognitive, metacognitive, and emotional qualities of recent waking and dreaming experiences. The present findings suggest that dreaming cognition is more similar to waking cognition than previously assumed and that the differences between dreaming and waking cognition are more quantitative than qualitative. Results from the two studies were generally consistent, indicating that high-order cognition during dreaming is not restricted to individuals practiced in dream recall or self-observation. None of the measured features was absent or infrequent in reports of either dreaming or waking experiences. Recollections of dreaming and waking experiences were similar for some cognitive features (e.g., attentional processes, internal commentary, and public self-consciousness) and different for other features (e.g., choice, event-related self-reflection, and affect).

  8. Discrimination and Measurements of Three Flavonols with Similar Structure Using Terahertz Spectroscopy and Chemometrics

    NASA Astrophysics Data System (ADS)

    Yan, Ling; Liu, Changhong; Qu, Hao; Liu, Wei; Zhang, Yan; Yang, Jianbo; Zheng, Lei

    2018-03-01

    The terahertz (THz) technique, a recently developed spectral method, has been investigated and used for the rapid discrimination and measurement of food compositions due to its low-energy and non-ionizing characteristics. In this study, THz spectroscopy combined with chemometrics has been utilized for qualitative and quantitative analysis of myricetin, quercetin, and kaempferol at concentrations of 0.025, 0.05, and 0.1 mg/mL. The qualitative discrimination was achieved by KNN, ELM, and RF models with spectral pre-treatments. An excellent discrimination (100% CCR in the prediction set) could be achieved using the RF model. Furthermore, the quantitative analyses were performed by partial least squares regression (PLSR) and least squares support vector machine (LS-SVM). Compared to the PLSR models, the LS-SVM yielded better results, with lower RMSEP (0.0044, 0.0039, and 0.0048), higher Rp (0.9601, 0.9688, and 0.9359), and higher RPD (8.6272, 9.6333, and 7.9083) for myricetin, quercetin, and kaempferol, respectively. Our results demonstrate that the THz spectroscopy technique is a powerful tool for the identification of three flavonols with similar chemical structures and the quantitative determination of their concentrations.

  9. Binary similarity measures for fingerprint analysis of qualitative metabolomic profiles.

    PubMed

    Rácz, Anita; Andrić, Filip; Bajusz, Dávid; Héberger, Károly

    2018-01-01

    Contemporary metabolomic fingerprinting is based on multiple spectrometric and chromatographic signals, used either alone or combined with structural and chemical information on metabolic markers at the qualitative and semiquantitative level. However, signal shifting, convolution, and matrix effects may compromise metabolomic patterns. The recent increase in the use of qualitative metabolomic data, described by the presence (1) or absence (0) of particular metabolites, demonstrates great potential in the field of metabolomic profiling and fingerprint analysis. The aim of this study is a comprehensive evaluation of binary similarity measures for the elucidation of patterns among samples of different botanical origin and various metabolomic profiles. Nine qualitative metabolomic data sets covering a wide range of natural products and metabolomic profiles were used to assess 44 binary similarity measures for the fingerprinting of plant extracts and natural products. The measures were analyzed by the novel sum of ranking differences (SRD) method to identify the most promising candidates. The Baroni-Urbani-Buser (BUB) and Hawkins-Dotson (HD) similarity coefficients were selected as the best measures by SRD and analysis of variance (ANOVA), while Dice (Di1), Yule, Russel-Rao, and Consonni-Todeschini 3 ranked the worst. ANOVA revealed that concordantly and intermediately symmetric similarity coefficients are better candidates for metabolomic fingerprinting than asymmetric and correlation-based ones. Fingerprint analysis based on the BUB and HD coefficients and qualitative metabolomic data performed as well as analysis of the quantitative metabolomic profiles. Fingerprint analysis based on qualitative metabolomic profiles and binary similarity measures thus proved a reliable way of finding the same or similar patterns in metabolomic data as those extracted from quantitative data.
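    For illustration (our sketch, not code from the paper), the best-ranked coefficient, Baroni-Urbani-Buser, can be computed from two presence/absence fingerprints using the standard contingency counts: a (both present), b and c (the two kinds of mismatch), and d (both absent).

```python
import math

def baroni_urbani_buser(fp1, fp2):
    """BUB similarity (sqrt(ad) + a) / (sqrt(ad) + a + b + c) for binary fingerprints."""
    a = sum(1 for x, y in zip(fp1, fp2) if x == 1 and y == 1)  # both present
    b = sum(1 for x, y in zip(fp1, fp2) if x == 1 and y == 0)  # present only in fp1
    c = sum(1 for x, y in zip(fp1, fp2) if x == 0 and y == 1)  # present only in fp2
    d = sum(1 for x, y in zip(fp1, fp2) if x == 0 and y == 0)  # both absent
    root_ad = math.sqrt(a * d)
    return (root_ad + a) / (root_ad + a + b + c)
```

Note that, unlike asymmetric measures such as Tanimoto or Russel-Rao, BUB credits shared absences (the d term), which is one reading of why the "concordantly symmetric" coefficients fared better here.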

  10. Self-similar conductance patterns in graphene Cantor-like structures.

    PubMed

    García-Cervantes, H; Gaggero-Sager, L M; Díaz-Guerrero, D S; Sotolongo-Costa, O; Rodríguez-Vargas, I

    2017-04-04

    Graphene has proven to be an ideal system for exotic transport phenomena. In this work, we report another exotic characteristic of electron transport in graphene. Namely, we show that the linear-regime conductance can present self-similar patterns with well-defined scaling rules once the graphene sheet is subjected to Cantor-like nanostructuring. As far as we know, this is one of the few systems in which a self-similar structure produces self-similar patterns in a physical property. These patterns are analysed quantitatively by obtaining the scaling rules that underlie them. It is worth noting that the transport properties are an average over the dispersion channels, which makes the existence of scale factors quite surprising. In addition, the fact that self-similarity is manifested in the conductance opens an excellent opportunity to test this fundamental property experimentally.

  11. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.

  12. Phenotypic plasticity and similarity among gall morphotypes on a superhost, Baccharis reticularia (Asteraceae).

    PubMed

    Formiga, A T; Silveira, F A O; Fernandes, G W; Isaias, R M S

    2015-03-01

    Understanding factors that modulate plant development is still a challenging task in plant biology. Although research has highlighted the role of abiotic and biotic factors in determining final plant structure, we know little of how these factors combine to produce specific developmental patterns. Here, we studied patterns of cell and tissue organisation in galled and non-galled organs of Baccharis reticularia, a Neotropical shrub that hosts over ten species of galling insects. We employed qualitative and quantitative approaches to understand patterns of growth and differentiation in its four most abundant gall morphotypes. We compared two leaf galls induced by sap-sucking Hemiptera and stem galls induced by a Lepidopteran and a Dipteran, Cecidomyiidae. The hypotheses tested were: (i) the more complex the galls, the more distinct they are from their non-galled host; (ii) galls induced on less plastic host organs, e.g., stems, develop under more morphogenetic constraints and, therefore, should be more similar among themselves than galls induced on more plastic organs. We also evaluated the plant sex preference of gall-inducing insects for oviposition. Simple galls were qualitatively and quantitatively more similar to non-galled organs than complex galls, thereby supporting the first hypothesis. Unexpectedly, stem galls were more similar to each other than to their host organ, hence only partially supporting the second hypothesis. Similarity among stem galls may be caused by the restrictive pattern of host stems. The opposite trend was observed for host leaves, which generate either similar or distinct gall morphotypes due to their higher phenotypic plasticity. The Relative Distance of Plasticity Index for non-galled stems and stem galls ranged from 0.02 to 0.42. Our results strongly suggest that both tissue plasticity and gall inducer identity interact to determine plant developmental patterns, and therefore, final gall structure. © 2014 German Botanical Society

  13. GFD-Net: A novel semantic similarity methodology for the analysis of gene networks.

    PubMed

    Díaz-Montaña, Juan J; Díaz-Díaz, Norberto; Gómez-Vela, Francisco

    2017-04-01

    Since the popularization of biological network inference methods, it has become crucial to create methods to validate the resulting models. Here we present GFD-Net, the first methodology that applies the concept of semantic similarity to gene network analysis. GFD-Net combines the concept of semantic similarity with the use of gene network topology to analyze the functional dissimilarity of gene networks based on Gene Ontology (GO). The main innovation of GFD-Net lies in the way that semantic similarity is used to analyze gene networks taking into account the network topology. GFD-Net selects a functionality for each gene (specified by a GO term), weights each edge according to the dissimilarity between the nodes at its ends and calculates a quantitative measure of the network functional dissimilarity, i.e. a quantitative value of the degree of dissimilarity between the connected genes. The robustness of GFD-Net as a gene network validation tool was demonstrated by performing a ROC analysis on several network repositories. Furthermore, a well-known network was analyzed showing that GFD-Net can also be used to infer knowledge. The relevance of GFD-Net becomes more evident in Section "GFD-Net applied to the study of human diseases" where an example of how GFD-Net can be applied to the study of human diseases is presented. GFD-Net is available as an open-source Cytoscape app which offers a user-friendly interface to configure and execute the algorithm as well as the ability to visualize and interact with the results(http://apps.cytoscape.org/apps/gfdnet). Copyright © 2017 Elsevier Inc. All rights reserved.

  14. The Problem of Limited Inter-rater Agreement in Modelling Music Similarity

    PubMed Central

    Flexer, Arthur; Grill, Thomas

    2016-01-01

    One of the central goals of Music Information Retrieval (MIR) is the quantification of similarity between or within pieces of music. These quantitative relations should mirror the human perception of music similarity, which is however highly subjective with low inter-rater agreement. Unfortunately this principal problem has been given little attention in MIR so far. Since it is not meaningful to have computational models that go beyond the level of human agreement, these levels of inter-rater agreement present a natural upper bound for any algorithmic approach. We will illustrate this fundamental problem in the evaluation of MIR systems using results from two typical application scenarios: (i) modelling of music similarity between pieces of music; (ii) music structure analysis within pieces of music. For both applications, we derive upper bounds of performance which are due to the limited inter-rater agreement. We compare these upper bounds to the performance of state-of-the-art MIR systems and show how the upper bounds prevent further progress in developing better MIR systems. PMID:28190932

  15. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    analysis, our results show similar diagnostic accuracy when comparing anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.

  16. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  17. Geography and Similarity of Regional Cuisines in China

    PubMed Central

    Zhu, Yu-Xiao; Huang, Junming; Zhang, Zi-Ke; Zhang, Qian-Ming; Zhou, Tao; Ahn, Yong-Yeol

    2013-01-01

    Food occupies a central position in every culture and it is therefore of great interest to understand the evolution of food culture. The advent of the World Wide Web and online recipe repositories have begun to provide unprecedented opportunities for data-driven, quantitative study of food culture. Here we harness an online database documenting recipes from various Chinese regional cuisines and investigate the similarity of regional cuisines in terms of geography and climate. We find that geographical proximity, rather than climate proximity, is a crucial factor that determines the similarity of regional cuisines. We develop a model of regional cuisine evolution that provides helpful clues for understanding the evolution of cuisines and cultures. PMID:24260166

  19. Kernel approach to molecular similarity based on iterative graph similarity.

    PubMed

    Rupp, Matthias; Proschak, Ewgenij; Schneider, Gisbert

    2007-01-01

    Similarity measures for molecules are of basic importance in chemical, biological, and pharmaceutical applications. We introduce a molecular similarity measure defined directly on the annotated molecular graph, based on iterative graph similarity and optimal assignments. We give an iterative algorithm for the computation of the proposed molecular similarity measure, prove its convergence and the uniqueness of the solution, and provide an upper bound on the required number of iterations necessary to achieve a desired precision. Empirical evidence for the positive semidefiniteness of certain parametrizations of our function is presented. We evaluated our molecular similarity measure by using it as a kernel in support vector machine classification and regression applied to several pharmaceutical and toxicological data sets, with encouraging results.

  20. Quantitative Doppler Analysis Using Conventional Color Flow Imaging Acquisitions.

    PubMed

    Karabiyik, Yucel; Ekroll, Ingvild Kinn; Eik-Nes, Sturla H; Lovstakken, Lasse

    2018-05-01

    Interleaved acquisitions used in conventional triplex mode result in a tradeoff between the frame rate and the quality of velocity estimates. On the other hand, workflow becomes inefficient when the user has to switch between different modes, and measurement variability is increased. This paper investigates the use of the power spectral Capon estimator for quantitative Doppler analysis using data acquired with conventional color flow imaging (CFI) schemes. To preserve the number of samples used for velocity estimation, only spatial averaging was utilized, and clutter rejection was performed after spectral estimation. The resulting velocity spectra were evaluated in terms of spectral width using a recently proposed spectral envelope estimator. The spectral envelopes were also used for Doppler index calculations using in vivo and string phantom acquisitions. In vivo results demonstrated that the Capon estimator can provide spectral estimates with sufficient quality for quantitative analysis using packet-based CFI acquisitions. The calculated Doppler indices were similar to the values calculated from spectrograms estimated on a commercial ultrasound scanner.

  1. Quantitative Comparison of Tandem Mass Spectra Obtained on Various Instruments

    NASA Astrophysics Data System (ADS)

    Bazsó, Fanni Laura; Ozohanics, Oliver; Schlosser, Gitta; Ludányi, Krisztina; Vékey, Károly; Drahos, László

    2016-08-01

    The similarity between two tandem mass spectra, which were measured on different instruments, was compared quantitatively using the similarity index (SI), defined as the dot product of the square root of peak intensities in the respective spectra. This function was found to be useful for comparing energy-dependent tandem mass spectra obtained on various instruments. Spectral comparisons show the similarity index in a 2D "heat map", indicating which collision energy combinations result in similar spectra, and how good this agreement is. The results and methodology can be used in the pharma industry to design experiments and equipment well suited for good reproducibility. We suggest that to get good long-term reproducibility, it is best to adjust the collision energy to yield a spectrum very similar to a reference spectrum. It is likely to yield better results than using the same tuning file, which, for example, does not take into account that contamination of the ion source due to extended use may influence instrument tuning. The methodology may be used to characterize energy dependence on various instrument types, to optimize instrumentation, and to study the influence or correlation between various experimental parameters.
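    The similarity index defined above, the dot product of square-rooted peak intensities, can be sketched as follows. The unit normalization is our assumption (so that identical spectra give SI = 1), and the intensity vectors are hypothetical:

```python
import math

def similarity_index(spectrum_a, spectrum_b):
    """Dot product of square-rooted, unit-normalized peak intensity vectors."""
    ra = [math.sqrt(i) for i in spectrum_a]
    rb = [math.sqrt(i) for i in spectrum_b]
    norm_a = math.sqrt(sum(x * x for x in ra))
    norm_b = math.sqrt(sum(x * x for x in rb))
    return sum(x * y for x, y in zip(ra, rb)) / (norm_a * norm_b)
```

Taking the square root before the dot product de-emphasizes dominant peaks, so the index is not swamped by a single intense fragment; spectra with disjoint peak sets score 0.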

  2. Toxmatch-a new software tool to aid in the development and evaluation of chemically similar groups.

    PubMed

    Patlewicz, G; Jeliazkova, N; Gallegos Saliner, A; Worth, A P

    2008-01-01

    Chemical similarity is a widely used concept in toxicology, based on the hypothesis that similar compounds should have similar biological activities. This forms the underlying basis for performing read-across, forming chemical groups and developing (Quantitative) Structure-Activity Relationships ((Q)SARs). Chemical similarity is often perceived as structural similarity, but in fact a number of other approaches can be used to assess similarity. A systematic similarity analysis usually comprises two main steps. First, the chemical structures to be compared are characterised in terms of relevant descriptors which encode their physicochemical, topological, geometrical and/or surface properties. Second, those descriptors are compared quantitatively using similarity (or dissimilarity) indices. This work outlines the use of chemical similarity principles in the formation of endpoint-specific chemical groupings. Examples are provided to illustrate the development and evaluation of chemical groupings using Toxmatch, a new software application recently commissioned by the European Chemicals Bureau (ECB) of the European Commission's Joint Research Centre. Insights from using this software are highlighted, with specific focus on the prospective application of chemical groupings under the new chemicals legislation, REACH.
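    The second step, quantitative comparison of descriptor vectors with a similarity index, can be illustrated with the continuous Tanimoto coefficient, one common choice (the record does not specify which indices Toxmatch implements, so this is an assumption for illustration):

```python
def tanimoto(a, b):
    """Continuous Tanimoto similarity between two real-valued descriptor vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    # Denominator: |a|^2 + |b|^2 - a.b; equals dot for identical vectors
    return dot / (sum(x * x for x in a) + sum(y * y for y in b) - dot)
```

The coefficient is 1 for identical descriptor vectors and decreases as the vectors diverge, making it a convenient score for ranking candidate analogues in a read-across grouping.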

  3. Meta-Storms: efficient search for similar microbial communities based on a novel indexing scheme and similarity score for metagenomic data.

    PubMed

    Su, Xiaoquan; Xu, Jian; Ning, Kang

    2012-10-01

    Effectively comparing different microbial communities (also referred to as 'metagenomic samples' here) at a large scale has long intrigued scientists: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we propose a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database with a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on them. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it achieves accuracies similar to those of the current popular significance testing-based methods. Meta-Storms method would serve as a suitable

  4. Quantifying the similarity of seismic polarizations

    NASA Astrophysics Data System (ADS)

    Jones, Joshua P.; Eaton, David W.; Caffagni, Enrico

    2016-02-01

    Assessing the similarities of seismic attributes can help identify tremor, low signal-to-noise (S/N) signals and converted or reflected phases, in addition to diagnosing site noise and sensor misalignment in arrays. Polarization analysis is a widely accepted method for studying the orientation and directional characteristics of seismic phases via computed attributes, but similarity is ordinarily discussed using qualitative comparisons with reference values or known seismic sources. Here we introduce a technique for quantitative polarization similarity that uses weighted histograms computed in short, overlapping time windows, drawing on methods adapted from the image processing and computer vision literature. Our method accounts for ambiguity in azimuth and incidence angle and variations in S/N ratio. Measuring polarization similarity allows easy identification of site noise and sensor misalignment and can help identify coherent noise and emergent or low S/N phase arrivals. Dissimilar azimuths during phase arrivals indicate misaligned horizontal components, dissimilar incidence angles during phase arrivals indicate misaligned vertical components and dissimilar linear polarization may indicate a secondary noise source. Using records of the Mw = 8.3 Sea of Okhotsk earthquake, from Canadian National Seismic Network broad-band sensors in British Columbia and Yukon Territory, Canada, and a vertical borehole array at Hoadley gas field, central Alberta, Canada, we demonstrate that our method is robust to station spacing. Discrete wavelet analysis extends polarization similarity to the time-frequency domain in a straightforward way. Time-frequency polarization similarities of borehole data suggest that a coherent noise source may have persisted above 8 Hz several months after peak resource extraction from a `flowback' type hydraulic fracture.

  5. Examining the Role of Numeracy in College STEM Courses: Results from the Quantitative Reasoning for College Science (QuaRCS) Assessment Instrument

    NASA Astrophysics Data System (ADS)

    Follette, Katherine B.; McCarthy, Donald W.; Dokter, Erin F.; Buxner, Sanlyn; Prather, Edward E.

    2016-01-01

    Is quantitative literacy a prerequisite for science literacy? Can students become discerning voters, savvy consumers and educated citizens without it? Should college science courses for nonmajors be focused on "science appreciation", or should they engage students in the messy quantitative realities of modern science? We will present results from the recently developed and validated Quantitative Reasoning for College Science (QuaRCS) Assessment, which probes both quantitative reasoning skills and attitudes toward mathematics. Based on data from nearly two thousand students enrolled in nineteen general education science courses, we show that students in these courses did not demonstrate significant skill or attitude improvements over the course of a single semester, but find encouraging evidence for longer term trends.

  6. Representational Similarity Analysis – Connecting the Branches of Systems Neuroscience

    PubMed Central

    Kriegeskorte, Nikolaus; Mur, Marieke; Bandettini, Peter

    2008-01-01

    A fundamental challenge for systems neuroscience is to quantitatively relate its three major branches of research: brain-activity measurement, behavioral measurement, and computational modeling. Using measured brain-activity patterns to evaluate computational network models is complicated by the need to define the correspondency between the units of the model and the channels of the brain-activity data, e.g., single-cell recordings or voxels from functional magnetic resonance imaging (fMRI). Similar correspondency problems complicate relating activity patterns between different modalities of brain-activity measurement (e.g., fMRI and invasive or scalp electrophysiology), and between subjects and species. In order to bridge these divides, we suggest abstracting from the activity patterns themselves and computing representational dissimilarity matrices (RDMs), which characterize the information carried by a given representation in a brain or model. Building on a rich psychological and mathematical literature on similarity analysis, we propose a new experimental and data-analytical framework called representational similarity analysis (RSA), in which multi-channel measures of neural activity are quantitatively related to each other and to computational theory and behavior by comparing RDMs. We demonstrate RSA by relating representations of visual objects as measured with fMRI in early visual cortex and the fusiform face area to computational models spanning a wide range of complexities. The RDMs are simultaneously related via second-level application of multidimensional scaling and tested using randomization and bootstrap techniques. We discuss the broad potential of RSA, including novel approaches to experimental design, and argue that these ideas, which have deep roots in psychology and neuroscience, will allow the integrated quantitative analysis of data from all three branches, thus contributing to a more unified systems neuroscience. PMID:19104670
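    A minimal sketch of building an RDM, using 1 − Pearson correlation as the dissimilarity measure (one common choice in the RSA literature, assumed here; the condition-by-channel activity patterns are hypothetical):

```python
def pearson(u, v):
    """Pearson correlation between two activity patterns."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    su = sum((a - mu) ** 2 for a in u) ** 0.5
    sv = sum((b - mv) ** 2 for b in v) ** 0.5
    return cov / (su * sv)

def rdm(patterns):
    """Representational dissimilarity matrix: 1 - correlation between all
    pairs of condition patterns (rows = conditions, columns = channels)."""
    n = len(patterns)
    return [[1.0 - pearson(patterns[i], patterns[j]) for j in range(n)]
            for i in range(n)]
```

Because the RDM abstracts away from the individual channels, RDMs computed from fMRI voxels, cell recordings, or model units can be compared directly, which is the correspondency-bridging point of the passage above.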

  7. An adaptive image sparse reconstruction method combined with nonlocal similarity and cosparsity for mixed Gaussian-Poisson noise removal

    NASA Astrophysics Data System (ADS)

    Chen, Yong-fei; Gao, Hong-xia; Wu, Zi-ling; Kang, Hui

    2018-01-01

    Compressed sensing (CS) has achieved great success in removing a single type of noise. However, it cannot efficiently restore images contaminated with mixed noise. This paper introduces nonlocal similarity and cosparsity, inspired by compressed sensing, to overcome the difficulties of mixed noise removal: nonlocal similarity exploits signal sparsity across similar patches, and cosparsity assumes that the signal is sparse after a possibly redundant transform. Meanwhile, an adaptive scheme is designed to balance mixed noise removal and detail preservation based on local variance. Finally, IRLSM and RACoSaMP are adopted to solve the objective function. Experimental results demonstrate that the proposed method is superior to conventional CS methods, such as K-SVD, and to the state-of-the-art nonlocally centralized sparse representation (NCSR) method, in terms of both visual results and quantitative measures.

  8. Teaching Fundamental Skills in Microsoft Excel to First-Year Students in Quantitative Analysis

    ERIC Educational Resources Information Center

    Rubin, Samuel J.; Abrams, Binyomin

    2015-01-01

    Despite their technological savvy, most students entering university lack the necessary computer skills to succeed in a quantitative analysis course, in which they are often expected to input, analyze, and plot results of experiments without any previous formal education in Microsoft Excel or similar programs. This lack of formal education results…

  9. Using self-organizing maps to classify humpback whale song units and quantify their similarity.

    PubMed

    Allen, Jenny A; Murray, Anita; Noad, Michael J; Dunlop, Rebecca A; Garland, Ellen C

    2017-10-01

    Classification of vocal signals can be undertaken using a wide variety of qualitative and quantitative techniques. Using east Australian humpback whale song from 2002 to 2014, a subset of vocal signals was acoustically measured and then classified using a Self-Organizing Map (SOM). The SOM created (1) an acoustic dictionary of units representing the song's repertoire, and (2) Cartesian distance measurements among all unit types (SOM nodes). Utilizing the SOM dictionary as a guide, additional song recordings from east Australia were rapidly (manually) transcribed. To assess the similarity in song sequences, the Cartesian distance output from the SOM was applied in Levenshtein distance similarity analyses as a weighting factor to better incorporate unit similarity in the calculation (previously a qualitative process). SOMs provide a more robust and repeatable means of categorizing acoustic signals along with a clear quantitative measurement of sound type similarity based on acoustic features. This method can be utilized for a wide variety of acoustic databases especially those containing very large datasets and can be applied across the vocalization research community to help address concerns surrounding inconsistency in manual classification.
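    The weighting idea described above, a Levenshtein distance whose substitution costs come from inter-unit distances, can be sketched as follows. Here `sub_cost` stands in for a lookup into the SOM's Cartesian distance table (hypothetical in this sketch), scaled to [0, 1]:

```python
def weighted_levenshtein(seq_a, seq_b, sub_cost):
    """Edit distance with unit insert/delete cost and substitution cost
    supplied by sub_cost(unit_a, unit_b), e.g. a scaled SOM node distance."""
    m, n = len(seq_a), len(seq_b)
    d = [[0.0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        d[i][0] = float(i)
    for j in range(1, n + 1):
        d[0][j] = float(j)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = sub_cost(seq_a[i - 1], seq_b[j - 1])
            d[i][j] = min(d[i - 1][j] + 1.0,        # deletion
                          d[i][j - 1] + 1.0,        # insertion
                          d[i - 1][j - 1] + cost)   # weighted substitution
    return d[m][n]
```

With a 0/1 substitution cost this reduces to the ordinary Levenshtein distance; plugging in SOM node distances makes substitutions between acoustically similar song units cheaper, so similar songs are no longer penalized as heavily as truly different ones.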

  10. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    PubMed

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-02-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type 2 subtypes could be distinguished, taking into account the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that in other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types more satisfactorily (bipolar and multipolars with few spines) we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences.

  11. Comparing Cognitive Interviewing and Online Probing: Do They Find Similar Results?

    ERIC Educational Resources Information Center

    Meitinger, Katharina; Behr, Dorothée

    2016-01-01

    This study compares the application of probing techniques in cognitive interviewing (CI) and online probing (OP). Even though the probing is similar, the methods differ regarding typical mode setting, sample size, level of interactivity, and goals. We analyzed probing answers to the International Social Survey Programme item battery on specific…

  12. Towards Personalized Medicine: Leveraging Patient Similarity and Drug Similarity Analytics

    PubMed Central

    Zhang, Ping; Wang, Fei; Hu, Jianying; Sorrentino, Robert

    2014-01-01

The rapid adoption of electronic health records (EHR) provides a comprehensive source for exploratory and predictive analytics to support clinical decision-making. In this paper, we investigate how to utilize EHR to tailor treatments to individual patients based on their likelihood of responding to a therapy. We construct a heterogeneous graph which includes two domains (patients and drugs) and encodes three relationships (patient similarity, drug similarity, and patient-drug prior associations). We describe a novel approach for performing a label propagation procedure to spread the label information representing the effectiveness of different drugs for different patients over this heterogeneous graph. The proposed method has been applied on a real-world EHR dataset to help identify personalized treatments for hypercholesterolemia. The experimental results demonstrate the effectiveness of the approach and suggest that the combination of appropriate patient similarity and drug similarity analytics could lead to actionable insights for personalized medicine. Particularly, by leveraging drug similarity in combination with patient similarity, our method could perform well even on new or rarely used drugs for which there are few records of known past performance. PMID:25717413
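The propagation step can be sketched generically: iterate F ← αS̃F + (1−α)Y over a normalized similarity matrix S̃ so that known drug-effectiveness labels spread to similar patients. This is a standard label-propagation scheme under assumed toy data, not the paper's exact algorithm.

```python
# Sketch: label propagation over a similarity graph (toy data, not the
# paper's EHR graph). Known labels in Y are spread to similar nodes.
import numpy as np

def propagate_labels(S, Y, alpha=0.5, n_iter=100):
    """S: symmetric similarity matrix; Y: initial label scores."""
    d = S.sum(axis=1)
    d[d == 0] = 1.0
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    S_norm = D_inv_sqrt @ S @ D_inv_sqrt          # symmetric normalization
    F = Y.astype(float).copy()
    for _ in range(n_iter):
        F = alpha * (S_norm @ F) + (1 - alpha) * Y  # spread + re-inject labels
    return F

# Toy example: 4 patients; the drug is known to work for patient 0.
S = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
Y = np.array([1.0, 0.0, 0.0, 0.0])
scores = propagate_labels(S, Y)
# Patients 1 and 2 (similar to patient 0) score higher than patient 3.
```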

  13. Using Qualitative Metasummary to Synthesize Qualitative and Quantitative Descriptive Findings

    PubMed Central

    Sandelowski, Margarete; Barroso, Julie; Voils, Corrine I.

    2008-01-01

    The new imperative in the health disciplines to be more methodologically inclusive has generated a growing interest in mixed research synthesis, or the integration of qualitative and quantitative research findings. Qualitative metasummary is a quantitatively oriented aggregation of qualitative findings originally developed to accommodate the distinctive features of qualitative surveys. Yet these findings are similar in form and mode of production to the descriptive findings researchers often present in addition to the results of bivariate and multivariable analyses. Qualitative metasummary, which includes the extraction, grouping, and formatting of findings, and the calculation of frequency and intensity effect sizes, can be used to produce mixed research syntheses and to conduct a posteriori analyses of the relationship between reports and findings. PMID:17243111
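The two effect sizes named above can be illustrated with a small sketch, assuming the usual metasummary definitions (frequency effect size: share of reports containing a finding; intensity effect size: share of all findings contained in one report). Report names and findings are invented.

```python
# Sketch (illustrative data): metasummary effect sizes.
reports = {
    "report_1": {"stigma", "disclosure", "adherence"},
    "report_2": {"stigma", "adherence"},
    "report_3": {"stigma"},
}
all_findings = set().union(*reports.values())

def frequency_effect_size(finding):
    """Fraction of reports in which a finding appears."""
    hits = sum(1 for f in reports.values() if finding in f)
    return hits / len(reports)

def intensity_effect_size(report):
    """Fraction of all distinct findings contained in one report."""
    return len(reports[report]) / len(all_findings)

freq = frequency_effect_size("stigma")      # 1.0: appears in every report
inten = intensity_effect_size("report_2")   # 2 of 3 findings
```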

  14. Are calanco landforms similar to river basins?

    PubMed

    Caraballo-Arias, N A; Ferro, V

    2017-12-15

In the past, badlands have often been considered ideal field laboratories for studying landscape evolution because of their geometrical similarity to larger fluvial systems. For a given hydrological process, however, no scientific proof exists that badlands can be considered a model of river basin prototypes. In this paper the measurements carried out on 45 Sicilian calanchi, a type of badland that appears as a small-scale hydrographic unit, are used to establish their morphological similarity with river systems whose data are available in the literature. First, the geomorphological similarity is studied by identifying the dimensionless groups, which can assume the same value or a scaled one in a fixed ratio, representing drainage basin shape, stream network and relief properties. Then, for each property, the dimensionless groups are calculated for the investigated calanchi and the river basins and their corresponding scale ratio is evaluated. The applicability of Hack's, Horton's and Melton's laws for establishing similarity criteria is also tested. The analysis leads to the conclusion that a quantitative morphological similarity between calanco landforms and river basins can be established using commonly applied dimensionless groups. In particular, the analysis showed that i) calanchi and river basins have geometrically similar shapes with respect to the parameters Rf and Re, with a scale factor close to 1, ii) calanchi and river basins are similar with respect to the bifurcation and length ratios (λ=1), iii) for the investigated calanchi the Melton number assumes values less than that of the river case (0.694), and a scale ratio ranging from 0.52 to 0.78 can be used, iv) calanchi and river basins have similar mean relief ratio values (λ=1.13) and v) calanchi present active geomorphic processes and therefore fall in a more juvenile stage with respect to river basins. Copyright © 2017 Elsevier B.V. All rights reserved.
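Two of the standard basin-shape dimensionless groups used in such comparisons can be sketched directly; the form factor Rf = A/L² and the elongation ratio Re = (2/L)·√(A/π) are classical definitions, while the area and length values below are hypothetical, not the paper's measurements.

```python
# Sketch (hypothetical basin data): dimensionless shape groups and their
# calanco-to-river scale ratio; a ratio near 1 supports similarity.
import math

def form_factor(area, length):
    return area / length ** 2

def elongation_ratio(area, length):
    return (2.0 / length) * math.sqrt(area / math.pi)

calanco = {"A": 450.0, "L": 40.0}        # small badland unit (m^2, m)
river = {"A": 120.0e6, "L": 21.0e3}      # large fluvial basin (m^2, m)

rf_ratio = form_factor(calanco["A"], calanco["L"]) / form_factor(river["A"], river["L"])
re_ratio = elongation_ratio(calanco["A"], calanco["L"]) / elongation_ratio(river["A"], river["L"])
# Both ratios come out close to 1 for these illustrative values.
```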

  15. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. 
The semi-quantitative culture method showed higher sensitivity than the quantitative method.
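The accuracy figures above come from the standard 2x2 definitions, which can be stated in a few lines; the counts used here are illustrative, not the study's raw case table.

```python
# Sketch: sensitivity and specificity from confusion-table counts
# (tp = true positives, fn = false negatives, etc.); counts are made up.
def sensitivity(tp, fn):
    return tp / (tp + fn)

def specificity(tn, fp):
    return tn / (tn + fp)

# e.g. a test detecting 8 of 11 true infections gives sensitivity ~72.7%,
# and 90 of 94 true negatives gives specificity ~95.7%.
sens = sensitivity(tp=8, fn=3)
spec = specificity(tn=90, fp=4)
```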

  16. Assessing the properties of internal standards for quantitative matrix-assisted laser desorption/ionization mass spectrometry of small molecules.

    PubMed

    Sleno, Lekha; Volmer, Dietrich A

    2006-01-01

    Growing interest in the ability to conduct quantitative assays for small molecules by matrix-assisted laser desorption/ionization (MALDI) has been the driving force for several recent studies. This present work includes the investigation of internal standards for these analyses using a high-repetition rate MALDI triple quadrupole instrument. Certain physicochemical properties are assessed for predicting possible matches for internal standards for different small molecules. The importance of similar molecular weight of an internal standard to its analyte is seen through experiments with a series of acylcarnitines, having a fixed charge site and growing alkyl chain length. Both acetyl- and hexanoyl-carnitine were systematically assessed with several other acylcarnitine compounds as internal standards. The results clearly demonstrate that closely matched molecular weights between analyte and internal standard are essential for acceptable quantitation results. Using alpha-cyano-4-hydroxycinnamic acid as the organic matrix, the similarities between analyte and internal standard remain the most important parameter and not necessarily their even distribution within the solid sample spot. Several 4-quinolone antibiotics as well as a diverse group of pharmaceutical drugs were tested as internal standards for the 4-quinolone, ciprofloxacin. Quantitative results were shown using the solution-phase properties, log D and pKa, of these molecules. Their distribution coefficients, log D, are demonstrated as a fundamental parameter for similar crystallization patterns of analyte and internal standard. In the end, it was also possible to quantify ciprofloxacin using a drug from a different compound class, namely quinidine, having a similar log D value as the analyte. Copyright 2006 John Wiley & Sons, Ltd.
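The selection criterion found essential above, a closely matched molecular weight, can be sketched as a simple nearest-match search; the candidate list and analyte mass are illustrative (approximate real acylcarnitine masses), and log D could be matched the same way.

```python
# Sketch: pick the internal-standard candidate whose molecular weight is
# closest to the analyte's (candidate MWs approximate, analyte hypothetical).
def closest_match(analyte_mw, candidates):
    return min(candidates, key=lambda name: abs(candidates[name] - analyte_mw))

candidates = {"acetylcarnitine": 203.2,
              "hexanoylcarnitine": 259.3,
              "octanoylcarnitine": 287.4}
best = closest_match(261.0, candidates)   # -> "hexanoylcarnitine"
```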

  17. Quantitative Boltzmann-Gibbs Principles via Orthogonal Polynomial Duality

    NASA Astrophysics Data System (ADS)

    Ayala, Mario; Carinci, Gioia; Redig, Frank

    2018-06-01

We study fluctuation fields of orthogonal polynomials in the context of particle systems with duality. We thereby obtain a systematic orthogonal decomposition of the fluctuation fields of local functions, where the order of every term can be quantified. This implies a quantitative generalization of the Boltzmann-Gibbs principle. In the context of independent random walkers, we complete this program, including fluctuation fields in a non-stationary context (local equilibrium). For other interacting particle systems with duality, such as the symmetric exclusion process, similar results can be obtained under precise conditions on the n-particle dynamics.

  18. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

In order to improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantitation. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantitative results by adjusting the Auger sensitivity factors until the two techniques agreed. We then verified the accuracy of AES quantitation with the revised sensitivity factors on other samples with different composition ratios; the results showed that the corrected relative sensitivity factors reduce the error of quantitative AES analysis to less than 10%. Peak definition is difficult in integral-spectrum AES analysis, since the choice of starting and ending points for determining the characteristic Auger peak area is highly uncertain. To simplify the analysis, we also processed the data in differential-spectrum form, performed quantitative analysis on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors, and again verified the accuracy on samples with different composition ratios. In this case the analytical error of quantitative AES was reduced to less than 9%. These results show that the accuracy of AES quantitative analysis can be greatly improved by using XPS to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. The good consistency obtained demonstrates the feasibility of this method.
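The quantification underlying both techniques is the standard relative-sensitivity-factor formula, C_i = (I_i/S_i) / Σ_j (I_j/S_j); correcting the factors S_i against XPS is what reduces the AES error described above. Intensities and factors below are hypothetical.

```python
# Sketch: relative-sensitivity-factor quantification as used in AES/XPS.
def atomic_fractions(intensities, sensitivity_factors):
    """C_i = (I_i / S_i) / sum_j (I_j / S_j)."""
    corrected = [i / s for i, s in zip(intensities, sensitivity_factors)]
    total = sum(corrected)
    return [c / total for c in corrected]

# Hypothetical binary film: two peak intensities and two (corrected)
# sensitivity factors give the two atomic fractions.
fracs = atomic_fractions([1200.0, 800.0], [0.8, 0.5])
# fracs sums to 1; each entry is an estimated atomic fraction.
```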

  19. Quantitative estimation of time-variable earthquake hazard by using fuzzy set theory

    NASA Astrophysics Data System (ADS)

    Deyi, Feng; Ichikawa, M.

    1989-11-01

In this paper, the various methods of fuzzy set theory, called fuzzy mathematics, have been applied to the quantitative estimation of the time-variable earthquake hazard. The results obtained consist of the following. (1) Quantitative estimation of the earthquake hazard on the basis of seismicity data. By using some methods of fuzzy mathematics, seismicity patterns before large earthquakes can be studied more clearly and more quantitatively, highly active periods in a given region and quiet periods of seismic activity before large earthquakes can be recognized, similarities in temporal variation of seismic activity and seismic gaps can be examined and, on the other hand, the time-variable earthquake hazard can be assessed directly on the basis of a series of statistical indices of seismicity. Two methods of fuzzy clustering analysis, the method of fuzzy similarity, and the direct method of fuzzy pattern recognition have been studied in particular. One method of fuzzy clustering analysis is based on fuzzy netting, and another is based on the fuzzy equivalent relation. (2) Quantitative estimation of the earthquake hazard on the basis of observational data for different precursors. The direct method of fuzzy pattern recognition has been applied to research on earthquake precursors of different kinds. On the basis of the temporal and spatial characteristics of recognized precursors, earthquake hazards in different terms can be estimated. This paper mainly deals with medium-short-term precursors observed in Japan and China.
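One common fuzzy similarity coefficient used in fuzzy clustering, the min-max coefficient, can be sketched for two vectors of seismicity indices; the index values below are hypothetical and the paper's exact similarity measure may differ.

```python
# Sketch: min-max fuzzy similarity between two activity-index vectors;
# values near 1 indicate similar seismicity patterns.
def fuzzy_min_max_similarity(a, b):
    num = sum(min(x, y) for x, y in zip(a, b))
    den = sum(max(x, y) for x, y in zip(a, b))
    return num / den if den else 1.0

# Hypothetical normalized indices (e.g. event rate, energy release, b-value).
period_1 = [0.8, 0.6, 0.9]
period_2 = [0.7, 0.6, 0.8]
sim = fuzzy_min_max_similarity(period_1, period_2)
```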

  20. Quantitative Procedures for the Assessment of Quality in Higher Education Institutions.

    ERIC Educational Resources Information Center

    Moran, Tom; Rowse, Glenwood

    The development of procedures designed to provide quantitative assessments of quality in higher education institutions are reviewed. These procedures employ a systems framework and utilize quantitative data to compare institutions or programs of similar types with one another. Three major elements essential in the development of models focusing on…

  1. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of LC-ESI-MS data were performed. This approach made it possible to promptly highlight metabolite similarities and differences among the various Astragalus spp. PCA results from LC-ESI-MS data of Astragalus samples were in reasonable agreement with both PCA results of ESI-MS data and quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
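The PCA step can be sketched on a tiny samples-by-channels intensity matrix via SVD; the matrix below is invented, standing in for ESI-MS intensities of the eight extracts.

```python
# Sketch: PCA scores of a (samples x m/z-channel) intensity matrix via SVD;
# extracts with similar metabolite profiles land close together.
import numpy as np

def pca_scores(X, n_components=2):
    Xc = X - X.mean(axis=0)                  # center each channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T          # sample scores

# 4 hypothetical extracts x 5 m/z channels; rows 0-1 and rows 2-3
# are two similar pairs with different profiles.
X = np.array([[5.0, 1.0, 0.2, 3.1, 0.0],
              [4.8, 1.1, 0.3, 3.0, 0.1],
              [0.5, 4.0, 2.9, 0.2, 3.8],
              [0.6, 3.9, 3.1, 0.3, 3.7]])
scores = pca_scores(X)
```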

  2. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays

    PubMed Central

    Guetterman, Timothy C.; Fetters, Michael D.; Creswell, John W.

    2015-01-01

    PURPOSE Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. METHODS We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. RESULTS The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. CONCLUSIONS Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. PMID:26553895

  3. Correlation Between Geometric Similarity of Ice Shapes and the Resulting Aerodynamic Performance Degradation: A Preliminary Investigation Using WIND

    NASA Technical Reports Server (NTRS)

    Wright, William B.; Chung, James

    1999-01-01

Aerodynamic performance calculations were performed using WIND on ten experimental ice shapes and the corresponding ten ice shapes predicted by LEWICE 2.0. The resulting data for lift coefficient and drag coefficient are presented. The differences in aerodynamic results between the experimental ice shapes and the LEWICE ice shapes were compared to the quantitative differences in ice shape geometry presented in an earlier report. Correlations were generated to determine the geometric features which have the most effect on performance degradation. Results show that maximum lift and stall angle can be correlated to the upper horn angle and the leading edge minimum thickness. Drag coefficient can be correlated to the upper horn angle and the frequency-weighted average of the Fourier coefficients. Pitching moment correlated with the upper horn angle and, to a much lesser extent, with the upper and lower horn thicknesses.

  4. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  5. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  6. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  7. A novel iris transillumination grading scale allowing flexible assessment with quantitative image analysis and visual matching.

    PubMed

    Wang, Chen; Brancusi, Flavia; Valivullah, Zaheer M; Anderson, Michael G; Cunningham, Denise; Hedberg-Buenz, Adam; Power, Bradley; Simeonov, Dimitre; Gahl, William A; Zein, Wadih M; Adams, David R; Brooks, Brian

    2018-01-01

    To develop a sensitive scale of iris transillumination suitable for clinical and research use, with the capability of either quantitative analysis or visual matching of images. Iris transillumination photographic images were used from 70 study subjects with ocular or oculocutaneous albinism. Subjects represented a broad range of ocular pigmentation. A subset of images was subjected to image analysis and ranking by both expert and nonexpert reviewers. Quantitative ordering of images was compared with ordering by visual inspection. Images were binned to establish an 8-point scale. Ranking consistency was evaluated using the Kendall rank correlation coefficient (Kendall's tau). Visual ranking results were assessed using Kendall's coefficient of concordance (Kendall's W) analysis. There was a high degree of correlation among the image analysis, expert-based and non-expert-based image rankings. Pairwise comparisons of the quantitative ranking with each reviewer generated an average Kendall's tau of 0.83 ± 0.04 (SD). Inter-rater correlation was also high with Kendall's W of 0.96, 0.95, and 0.95 for nonexpert, expert, and all reviewers, respectively. The current standard for assessing iris transillumination is expert assessment of clinical exam findings. We adapted an image-analysis technique to generate quantitative transillumination values. Quantitative ranking was shown to be highly similar to a ranking produced by both expert and nonexpert reviewers. This finding suggests that the image characteristics used to quantify iris transillumination do not require expert interpretation. Inter-rater rankings were also highly similar, suggesting that varied methods of transillumination ranking are robust in terms of producing reproducible results.
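Kendall's tau, used above to compare the quantitative ranking with reviewer rankings, can be computed from concordant and discordant pairs; the two rankings below are hypothetical and tie-free.

```python
# Sketch: Kendall's tau from concordant/discordant pair counts
# (no-ties case); rankings are illustrative.
from itertools import combinations

def kendall_tau(rank_a, rank_b):
    n = len(rank_a)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        s = (rank_a[i] - rank_a[j]) * (rank_b[i] - rank_b[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

quantitative = [1, 2, 3, 4, 5, 6, 7, 8]
reviewer = [1, 2, 4, 3, 5, 6, 8, 7]      # two adjacent swaps
tau = kendall_tau(quantitative, reviewer)  # 24/28 ~ 0.857
```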

  8. Case study, comparison of trial burn results from similar sulfuric acid regeneration plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Milaszewski, M.; Johns, T.; Dickerson, W.F.

The primary business of Rhodia Eco Services (Rhodia) is the regeneration of sulfuric acid. Sulfuric acid regeneration requires thermal decomposition of acid to sulfur dioxide, and remaking the acid through chemical reaction. The sulfuric acid regeneration furnace is the ideal place to process pumpable wastes for energy recovery and for thermal destruction. Rhodia is regulated by the Boiler and Industrial Furnace (BIF) regulations (40 CFR 266, Subpart H). The Hammond, Indiana plant is an interim status BIF facility and the Houston, Texas facility is renewing its RCRA incineration permit as a BIF facility. Both plants have conducted BIF Trial Burns with very similar results. The performance levels demonstrated were at levels better than RCRA/BIF standards for destruction and removal efficiency, metal, HCl/Cl, particulate, dioxin/furan, and organic emissions.

  9. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

The formation of cognitive schemes for plant anatomy concepts relies on processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task to analyze quantitative data produced by quantitative measurements of plant anatomy guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy, and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test based on the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy was tested according to Marzano, and a questionnaire was administered. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of the biology education students was better than that of the biology students.

  10. Self-similar gravity wave spectra resulting from the modulation of bound waves

    NASA Astrophysics Data System (ADS)

    Michel, Guillaume; Semin, Benoît; Cazaubiel, Annette; Haudin, Florence; Humbert, Thomas; Lepot, Simon; Bonnefoy, Félicien; Berhanu, Michaël; Falcon, Éric

    2018-05-01

We experimentally study the properties of nonlinear surface gravity waves in a large-scale basin. We consider two different configurations: a one-dimensional (1D) monochromatic wave forcing, and a two-dimensional (2D) forcing with bichromatic waves satisfying resonant-wave interaction conditions. For the 1D forcing, we find a discrete wave-energy spectrum dominated at high frequencies by bound waves whose amplitudes decrease as a power law of the frequency. Bound waves (e.g., bound to the carrier) are harmonics superimposed on the carrier wave, propagating with the same phase velocity as that of the carrier. When a narrow frequency random modulation is applied to this carrier, the high-frequency part of the wave-energy spectrum becomes continuous with the same frequency-power law. Similar results are found for the 2D forcing when a random modulation is also applied to both carrier waves. Our results thus show that all these nonlinear gravity wave spectra are dominated at high frequencies by the presence of bound waves, even in the configuration where resonant interactions occur. Moreover, in all these configurations, the power-law exponent of the spectrum is found to depend on the forcing amplitude with the same trend as the one found in previous gravity wave turbulence experiments. Such a set of bound waves may thus explain this dependence that was previously poorly understood.
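Extracting a power-law exponent of the kind discussed above amounts to a straight-line fit of the spectrum in log-log coordinates; the spectrum below is synthetic with a known exponent, purely for illustration.

```python
# Sketch: fit the exponent n of E(f) ~ f**(-n) by linear regression
# on log-log axes (synthetic, noise-free spectrum).
import numpy as np

f = np.linspace(10.0, 100.0, 50)        # frequency range (arbitrary units)
E = 2.5 * f ** (-4.0)                   # synthetic spectrum, exponent 4
slope, intercept = np.polyfit(np.log(f), np.log(E), 1)
exponent = -slope                       # recovered power-law exponent
```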

  11. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reddy, Jay P.; Lei, Xiudong; Huang, Sheng-Cheng

Purpose: To measure, by quantitative analysis of digital photographs, breast cosmetic outcome within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. Methods and Materials: From 2011 to 2014, 287 women aged ≥40 with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Results: Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Conclusions: Quantitative assessment of breast photographs reveals similar to improved cosmetic outcome with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting.
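The abstract does not give the exact M1-M6 definitions, but a generic symmetry measure of the stated type (bounded in (0, 1], with 1 meaning perfect symmetry) can be sketched as a ratio of a distance measured on each side; the distances below are hypothetical.

```python
# Sketch (hypothetical measure, not the paper's M1-M6): ratio of
# corresponding distances on the two sides, 1.0 = perfect symmetry.
def symmetry_measure(treated_side, untreated_side):
    lo, hi = sorted((treated_side, untreated_side))
    return lo / hi

m_vertical = symmetry_measure(14.2, 15.0)  # e.g. nipple-to-notch distances, cm
```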

  12. Quantitative structure-activity relationship of organosulphur compounds as soybean 15-lipoxygenase inhibitors using CoMFA and CoMSIA.

    PubMed

    Caballero, Julio; Fernández, Michael; Coll, Deysma

    2010-12-01

    Three-dimensional quantitative structure-activity relationship studies were carried out on a series of 28 organosulphur compounds as 15-lipoxygenase inhibitors using comparative molecular field analysis and comparative molecular similarity indices analysis. Quantitative information on structure-activity relationships is provided for further rational development and direction of selective synthesis. All models were built over a training set of 22 compounds. The best comparative molecular field analysis model included only the steric field and had a good Q² = 0.789. Comparative molecular similarity indices analysis outperformed comparative molecular field analysis: the best comparative molecular similarity indices analysis model also included only the steric field and had a Q² = 0.894. In addition, this model adequately predicted the compounds in the test set. Furthermore, plots of the steric comparative molecular similarity indices analysis field allowed conclusions to be drawn for the choice of suitable inhibitors. In this sense, our model should prove useful in future 15-lipoxygenase inhibitor design studies. © 2010 John Wiley & Sons A/S.
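
    The cross-validated Q² statistic reported for these models can be computed as sketched below. The descriptors and activities are synthetic, and ordinary least squares stands in for the PLS regression actually used in CoMFA/CoMSIA; only the leave-one-out Q² bookkeeping is the point here.

```python
import numpy as np

def loo_q2(X, y):
    """Leave-one-out cross-validated Q^2 = 1 - PRESS / SS."""
    X1 = np.column_stack([np.ones(len(y)), X])  # add intercept column
    press = 0.0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i               # hold out compound i
        coef, *_ = np.linalg.lstsq(X1[mask], y[mask], rcond=None)
        press += float(y[i] - X1[i] @ coef) ** 2    # squared prediction error
    ss = float(np.sum((y - y.mean()) ** 2))
    return 1.0 - press / ss

rng = np.random.default_rng(0)
X = rng.normal(size=(22, 3))   # 22 training compounds, 3 descriptors
y = X @ np.array([1.0, -0.5, 0.3]) + rng.normal(scale=0.1, size=22)
print(f"Q2 = {loo_q2(X, y):.3f}")
```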

  13. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    PubMed Central

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog, with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) to those in the SONs of more evolutionarily advanced species; and (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type, 2 subtypes could be distinguished according to the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that of other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types (bipolar and multipolar with few spines) more satisfactorily, we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences. PMID:1452481

  14. PRIORITIZING FUTURE RESEARCH ON OFF-LABEL PRESCRIBING: RESULTS OF A QUANTITATIVE EVALUATION

    PubMed Central

    Walton, Surrey M.; Schumock, Glen T.; Lee, Ky-Van; Alexander, G. Caleb; Meltzer, David; Stafford, Randall S.

    2015-01-01

    Background Drug use for indications not approved by the Food and Drug Administration exceeds 20% of prescribing. Available compendia indicate that a minority of off-label uses are well supported by evidence. Policy makers, however, lack information to identify where systematic reviews of the evidence or other research would be most valuable. Methods We developed a quantitative model for prioritizing individual drugs for future research on off-label uses. The base model incorporated three key factors: 1) the volume of off-label use with inadequate evidence, 2) safety, and 3) cost and market considerations. Nationally representative prescribing data were used to estimate the number of off-label drug uses by indication from 1/2005 through 6/2007 in the United States, and these indications were then categorized according to the adequacy of scientific support. Black box warnings and safety alerts were used to quantify drug safety. Drug cost, date of market entry, and marketing expenditures were used to quantify cost and market considerations. Each drug was assigned a relative value for each factor, and the factors were then weighted in the final model to produce a priority score. Sensitivity analyses were conducted by varying the weightings and model parameters. Results Drugs that were consistently ranked highly in both our base model and sensitivity analyses included quetiapine, warfarin, escitalopram, risperidone, montelukast, bupropion, sertraline, venlafaxine, celecoxib, lisinopril, duloxetine, trazodone, olanzapine, and epoetin alfa. Conclusion Future research into off-label drug use should focus on drugs used frequently with inadequate supporting evidence, particularly if further concerns are raised by known safety issues, high drug cost, recent market entry, and extensive marketing. Based on quantitative measures of these factors, we have prioritized drugs for which targeted research and policy activities have high potential value. PMID:19025425

  15. Investigating Correlation between Protein Sequence Similarity and Semantic Similarity Using Gene Ontology Annotations.

    PubMed

    Ikram, Najmul; Qadir, Muhammad Abdul; Afzal, Muhammad Tanvir

    2018-01-01

    Sequence similarity is a commonly used measure to compare proteins. With the increasing use of ontologies, semantic (function) similarity is gaining importance. The correlation between these measures has been applied in the evaluation of new semantic similarity methods and in protein function prediction. In this research, we investigate the relationship between the two similarity methods. The results suggest the absence of a strong correlation between sequence and semantic similarities. There is a large number of proteins with low sequence similarity and high semantic similarity. We observe that Pearson's correlation coefficient is not sufficient to explain the nature of this relationship. Interestingly, term semantic similarity values above 0 and below 1 do not seem to play a role in improving the correlation. That is, the correlation coefficient depends only on the number of common GO terms in the proteins under comparison, and the semantic similarity measurement method does not influence it. Semantic similarity and sequence similarity exhibit distinct behavior. These findings are significant for future work on protein comparison and will help in better understanding the semantic similarity between proteins.
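
    A minimal sketch of the correlation check described above, using invented (sequence similarity, semantic similarity) pairs; note how pairs with low sequence but high semantic similarity weaken the linear relationship that Pearson's r measures.

```python
import numpy as np

# Hypothetical similarity scores for eight protein pairs.
seq_sim = np.array([0.10, 0.15, 0.20, 0.90, 0.12, 0.85, 0.18, 0.95])
sem_sim = np.array([0.80, 0.75, 0.85, 0.90, 0.70, 0.95, 0.88, 0.92])

r = np.corrcoef(seq_sim, sem_sim)[0, 1]  # Pearson's correlation coefficient
print(f"Pearson r = {r:.3f}")
```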

  16. A Quantitative Comparison of Leading-edge Vortices in Incompressible and Supersonic Flows

    NASA Technical Reports Server (NTRS)

    Wang, F. Y.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2002-01-01

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of a database owing to difficulties that plague measurement techniques in high-speed flows. In the present paper an attempt is made to examine this practice by comparing quantitative data on the near-wake properties of such vortices in incompressible and supersonic flows. The incompressible flow data are obtained in experiments conducted in a low-speed wind tunnel. Detailed flow-field properties, including vorticity and turbulence characteristics, obtained by hot-wire and pressure probe surveys are documented. These data are compared, wherever possible, with available data from a past work for a Mach 2.49 flow for the same wing geometry and angles of attack. The results indicate that quantitative similarities exist in the distributions of total pressure and swirl velocity. However, the streamwise velocity of the core exhibits different trends. The axial flow characteristics of the vortices in the two regimes are examined, and a candidate theory is discussed.

  17. How to compare movement? A review of physical movement similarity measures in geographic information science and beyond.

    PubMed

    Ranacher, Peter; Tzavella, Katerina

    2014-05-27

    In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper, we first decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods.

  18. How to compare movement? A review of physical movement similarity measures in geographic information science and beyond

    PubMed Central

    Ranacher, Peter; Tzavella, Katerina

    2014-01-01

    In geographic information science, a plethora of different approaches and methods is used to assess the similarity of movement. Some of these approaches term two moving objects similar if they share akin paths. Others require objects to move at similar speed and yet others consider movement similar if it occurs at the same time. We believe that a structured and comprehensive classification of movement comparison measures is missing. We argue that such a classification not only depicts the status quo of qualitative and quantitative movement analysis, but also allows for identifying those aspects of movement for which similarity measures are scarce or entirely missing. In this review paper, we first decompose movement into its spatial, temporal, and spatiotemporal movement parameters. A movement parameter is a physical quantity of movement, such as speed, spatial path, or temporal duration. For each of these parameters we then review qualitative and quantitative methods of how to compare movement. Thus, we provide a systematic and comprehensive classification of different movement similarity measures used in geographic information science. This classification is a valuable first step toward a GIS toolbox comprising all relevant movement comparison methods. PMID:27019646

  19. Wordform Similarity Increases With Semantic Similarity: An Analysis of 100 Languages.

    PubMed

    Dautriche, Isabelle; Mahowald, Kyle; Gibson, Edward; Piantadosi, Steven T

    2017-11-01

    Although the mapping between form and meaning is often regarded as arbitrary, there are in fact well-known constraints on words which are the result of functional pressures associated with language use and its acquisition. In particular, languages have been shown to encode meaning distinctions in their sound properties, which may be important for language learning. Here, we investigate the relationship between semantic distance and phonological distance in the large-scale structure of the lexicon. We show evidence in 100 languages from a diverse array of language families that more semantically similar word pairs are also more phonologically similar. This suggests that there is an important statistical trend for lexicons to have semantically similar words be phonologically similar as well, possibly for functional reasons associated with language learning. Copyright © 2016 Cognitive Science Society, Inc.
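
    As an illustration of "wordform distance," a simple Levenshtein edit distance can serve as a crude stand-in for the phonological distance between words (the study's actual phonological measures are more refined):

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance: minimum number of insertions, deletions,
    and substitutions needed to turn string a into string b."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

# Semantically close pairs are often formally close as well.
print(edit_distance("cat", "rat"))    # 1
print(edit_distance("cat", "truck"))  # 5
```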

  20. Modified HS-SPME for determination of quantitative relations between low-molecular oxygen compounds in various matrices.

    PubMed

    Dawidowicz, Andrzej L; Szewczyk, Joanna; Dybowski, Michal P

    2016-09-07

    Quantitative relations between the individual constituents of a liquid sample similar to those established by its direct injection can be obtained by applying a polydimethylsiloxane (PDMS) fiber in a headspace solid-phase microextraction (HS-SPME) system containing the examined sample suspended in methyl silica oil. This paper shows that the analogous system, composed of a sample suspension/emulsion in polyethylene glycol (PEG) and a Carbowax fiber, yields quantitative relations between the components of the mixture similar to those established by direct analysis, but only for polar constituents. This is demonstrated for the essential oil (EO) components of savory, sage, mint and thyme, and for an artificial liquid mixture of polar constituents. The observed differences in quantitative relations between polar constituents estimated by the two procedures are insignificant (Fexp < Fcrit). The presented results indicate that the wider applicability of a system composed of a sample suspended in an oil of the same physicochemical character as the SPME fiber coating strongly depends on the character of the interactions between the analytes and the suspending liquid and between the analytes and the fiber coating. Copyright © 2016 Elsevier B.V. All rights reserved.
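
    The Fexp < Fcrit criterion mentioned above is a variance-ratio (F) test on the two sets of estimates. A sketch with invented replicate measurements:

```python
import numpy as np
from scipy.stats import f

# Hypothetical content estimates (%) of one polar constituent, from
# direct injection vs. the PEG-suspension HS-SPME procedure.
direct = np.array([12.1, 11.8, 12.4, 12.0, 12.2])
hs_spme = np.array([11.9, 12.3, 12.1, 12.5, 11.7])

v1, v2 = direct.var(ddof=1), hs_spme.var(ddof=1)
f_exp = max(v1, v2) / min(v1, v2)                          # experimental F ratio
f_crit = f.ppf(0.975, len(direct) - 1, len(hs_spme) - 1)   # two-sided, alpha = 0.05
print(f"F_exp = {f_exp:.2f}, F_crit = {f_crit:.2f}")
```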

  1. Marital assortment for genetic similarity.

    PubMed

    Eckman, Ronael E; Williams, Robert; Nagoshi, Craig

    2002-10-01

    The present study involved analyses of a Caucasian American sample (n=949) and a Japanese American sample (n=400) for factors supporting Genetic Similarity Theory (GST). The analyses found no evidence of genetic similarity between spouses in either sample in blood group analyses of nine loci; all results indicated random mating for blood group genes. The results also did not provide consistent support for the claim that spousal similarity is correlated with the degree of genetic influence on a trait for a set of seventeen individual-difference variables, with only the Caucasian sample yielding significant correlations for this analysis. A third analysis, examining the correlation between the presence of spousal genetic similarity and spousal similarity on observable traits, was not performed because spousal genetic similarity was not observed in either sample. The overall implication of the study is that GST is not supported as an explanation for spousal similarity in humans.

  2. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes, all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles to applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and the handling of non-Gaussian phenotypes), and demonstrate that many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839

  3. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    PubMed

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra-performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were all greater than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and identify the authenticity of R. rugosa.
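
    Fingerprint similarity between chromatograms is commonly computed as a cosine (congruence) coefficient; below is a minimal sketch with invented peak-intensity vectors sampled on a common retention-time grid. The SFDA software's exact algorithm is not reproduced here.

```python
import numpy as np

def fingerprint_similarity(a, b):
    """Cosine coefficient between two chromatographic fingerprints."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical intensities at seven shared retention times.
reference = np.array([0.0, 1.2, 5.4, 0.3, 2.1, 8.0, 0.1])
batch = np.array([0.1, 1.1, 5.2, 0.4, 2.3, 7.8, 0.2])

sim = fingerprint_similarity(reference, batch)
print(f"similarity = {sim:.3f}")
```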

  4. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282
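
    The paired t-test used to compare the two quantification methods can be sketched as follows, with invented γ-oryzanol measurements of the same samples by both methods:

```python
from scipy.stats import ttest_rel

# Hypothetical gamma-oryzanol content (mg/g) for six samples.
densitometric = [2.41, 2.58, 2.33, 2.47, 2.52, 2.39]
image_based = [2.44, 2.55, 2.36, 2.45, 2.50, 2.42]

t, p = ttest_rel(densitometric, image_based)  # paired comparison
print(f"t = {t:.2f}, p = {p:.3f}")  # large p -> no significant difference
```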

  5. Influence of echo time in quantitative proton MR spectroscopy using LCModel.

    PubMed

    Yamamoto, Tetsuya; Isobe, Tomonori; Akutsu, Hiroyoshi; Masumoto, Tomohiko; Ando, Hiroki; Sato, Eisuke; Takada, Kenta; Anno, Izumi; Matsumura, Akira

    2015-06-01

    The objective of this study was to elucidate the influence on quantitative analysis using LCModel when the echo time (TE) is longer than the values recommended in the spectrum acquisition specifications. A 3T magnetic resonance system was used to perform proton magnetic resonance spectroscopy. The participants were 5 healthy volunteers and 11 patients with glioma. Data were collected at TEs of 72, 144 and 288 ms. LCModel was used to quantify several metabolites (N-acetylaspartate, creatine and phosphocreatine, and choline-containing compounds). The results were compared with quantitative values obtained using the T2-corrected internal reference method. In healthy volunteers, at long TE, the quantitative values obtained using LCModel were up to 6.8-fold larger (p<0.05) than those obtained using the T2-corrected internal reference method. The ratios of the quantitative values obtained by the two methods differed between metabolites (p<0.05). In patients with glioma, the ratios of quantitative values obtained by the two methods tended to be larger at longer TE, as in the healthy volunteers, and large between-individual variation in the ratios was observed. In clinical practice, TE is sometimes set longer than the value recommended for LCModel. If TE is long, LCModel overestimates the quantitative value because it cannot compensate for signal attenuation, and this effect differs by metabolite and condition. Therefore, if TE is longer than recommended, it is necessary to account for the possibly reduced reliability of quantitative values calculated using LCModel. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework that can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model of the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  7. Analysis of perceived similarity between pairs of microcalcification clusters in mammograms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Juan; Jing, Hao; Wernick, Miles N.

    2014-05-15

    Purpose: Content-based image retrieval aims to assist radiologists by presenting example images with known pathology that are visually similar to the case being evaluated. In this work, the authors investigate several fundamental issues underlying the similarity ratings between pairs of microcalcification (MC) lesions on mammograms as judged by radiologists: the degree of variability in the similarity ratings, the impact of this variability on agreement between readers in retrieval of similar lesions, and the factors contributing to the readers’ similarity ratings. Methods: The authors conduct a reader study on a set of 1000 image pairs of MC lesions, in which a group of experienced breast radiologists rated the degree of similarity between each image pair. The image pairs are selected, from among possible pairings of 222 cases (110 malignant, 112 benign), based on quantitative image attributes (features) and the results of a preliminary reader study. Next, the authors apply analysis of variance (ANOVA) to quantify the level of variability in the readers’ similarity ratings, and study how the variability in individual reader ratings affects consistency between readers. The authors also measure the extent to which readers agree on images which are most similar to a given query, for which the Dice coefficient is used. To investigate how the similarity ratings potentially relate to the attributes underlying the cases, the authors study the fraction of perceptually similar images that also share the same benign or malignant pathology as the query image; moreover, the authors apply multidimensional scaling (MDS) to embed the cases according to their mutual perceptual similarity in a two-dimensional plot, which allows the authors to examine the manner in which similar lesions relate to one another in terms of benign or malignant pathology and clustered MCs. Results: The ANOVA results show that the coefficient of determination in the reader similarity ratings
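
    Reader agreement on the retrieved most-similar images, measured with the Dice coefficient as in the study, can be sketched with hypothetical case IDs:

```python
def dice(a, b):
    """Dice coefficient between two sets: 2|A ∩ B| / (|A| + |B|)."""
    if not a and not b:
        return 1.0
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical top-5 most-similar lesions chosen by two readers for one query.
reader1 = {"case_012", "case_047", "case_101", "case_150", "case_203"}
reader2 = {"case_012", "case_047", "case_150", "case_188", "case_219"}
print(f"Dice = {dice(reader1, reader2):.2f}")  # 3 shared cases -> 0.60
```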

  8. Quantitative High-Efficiency Cadmium-Zinc-Telluride SPECT with Dedicated Parallel-Hole Collimation System in Obese Patients: Results of a Multi-Center Study

    PubMed Central

    Nakazato, Ryo; Slomka, Piotr J.; Fish, Mathews; Schwartz, Ronald G.; Hayes, Sean W.; Thomson, Louise E.J.; Friedman, John D.; Lemley, Mark; Mackin, Maria L.; Peterson, Benjamin; Schwartz, Arielle M.; Doran, Jesse A.; Germano, Guido; Berman, Daniel S.

    2014-01-01

    Background Obesity is a common source of artifact on conventional SPECT myocardial perfusion imaging (MPI). We evaluated image quality and diagnostic performance of high-efficiency (HE) cadmium-zinc-telluride (CZT) parallel-hole SPECT-MPI for coronary artery disease (CAD) in obese patients. Methods and Results 118 consecutive obese patients at 3 centers (BMI 43.6±8.9 kg/m2, range 35–79.7 kg/m2) had upright/supine HE-SPECT and ICA >6 months (n=67) or low-likelihood of CAD (n=51). Stress quantitative total perfusion deficit (TPD) was assessed for upright (U-TPD), supine (S-TPD) and combined acquisitions (C-TPD). Image quality (IQ; 5=excellent; <3 nondiagnostic) was compared among BMI 35–39.9 (n=58), 40–44.9 (n=24) and ≥45 (n=36) groups. ROC-curve areas for CAD detection (≥50% stenosis) were 0.80, 0.80, and 0.87 for U-TPD, S-TPD, and C-TPD, respectively. Sensitivity/specificity was 82%/57% for U-TPD, 74%/71% for S-TPD, and 80%/82% for C-TPD. C-TPD had the highest specificity (P=.02). The C-TPD normalcy rate was higher than that of U-TPD (88% vs. 75%, P=.02). Mean IQ was similar among the BMI 35–39.9, 40–44.9 and ≥45 groups [4.6 vs. 4.4 vs. 4.5, respectively (P=.6)]. No patient had a non-diagnostic stress scan. Conclusions In obese patients, HE-SPECT MPI with dedicated parallel-hole collimation demonstrated high image quality, normalcy rate, and diagnostic accuracy for CAD by quantitative analysis of combined upright/supine acquisitions. PMID:25388380
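
    ROC-curve areas like those reported above can be computed without curve fitting via the rank-sum formulation of AUC: the probability that a randomly chosen diseased case scores higher than a non-diseased one. The TPD values below are invented for illustration.

```python
def auc(pos, neg):
    """AUC as P(score_pos > score_neg), with ties counted as half."""
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical stress TPD (%) with and without >=50% stenosis on ICA.
tpd_cad = [12.0, 8.5, 15.2, 3.0, 10.3]
tpd_no_cad = [2.1, 4.0, 1.5, 5.2, 3.3]
print(f"AUC = {auc(tpd_cad, tpd_no_cad):.2f}")  # 22/25 pairings -> 0.88
```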

  9. Quantitative susceptibility mapping of human brain at 3T: a multisite reproducibility study.

    PubMed

    Lin, P-Y; Chao, T-C; Wu, M-L

    2015-03-01

    Quantitative susceptibility mapping of the human brain has demonstrated strong potential in examining iron deposition, which may help in investigating possible brain pathology. This study assesses the reproducibility of quantitative susceptibility mapping across different imaging sites. In this study, the susceptibility values of 5 regions of interest in the human brain were measured on 9 healthy subjects following calibration by using phantom experiments. Each of the subjects was imaged 5 times on 1 scanner with the same procedure repeated on 3 different 3T systems so that both within-site and cross-site quantitative susceptibility mapping precision levels could be assessed. Two quantitative susceptibility mapping algorithms, similar in principle, one by using iterative regularization (iterative quantitative susceptibility mapping) and the other with analytic optimal solutions (deterministic quantitative susceptibility mapping), were implemented, and their performances were compared. Results show that while deterministic quantitative susceptibility mapping had nearly 700 times faster computation speed, residual streaking artifacts seem to be more prominent compared with iterative quantitative susceptibility mapping. With quantitative susceptibility mapping, the putamen, globus pallidus, and caudate nucleus showed smaller imprecision on the order of 0.005 ppm, whereas the red nucleus and substantia nigra, closer to the skull base, had a somewhat larger imprecision of approximately 0.01 ppm. Cross-site errors were not significantly larger than within-site errors. Possible sources of estimation errors are discussed. The reproducibility of quantitative susceptibility mapping in the human brain in vivo is regionally dependent, and the precision levels achieved with quantitative susceptibility mapping should allow longitudinal and multisite studies such as aging-related changes in brain tissue magnetic susceptibility. © 2015 by American Journal of Neuroradiology.

  10. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess these students' attitudes toward science, as well as the basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes toward science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  11. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework that can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model of the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The obtained simulation results are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  12. Selectivity of similar compounds' identification using IR spectrometry: β-Lactam antibiotics

    NASA Astrophysics Data System (ADS)

    Sadlej-Sosnowska, Nina; Ocios, Agnieszka; Fuks, Leon

    2006-07-01

    The study aims to develop a reliable, quantitative method for positive identification or discrimination of a substance when it is compared to a set of similar ones. In the course of the study, a group of structurally related compounds, namely a set of β-lactam antimicrobial agents, was explored. Identification of a substance was based on comparison of its spectrum with that of a reference material using two functional algorithms. The algorithm based on the calculation of the correlation coefficient between the first derivatives of the spectra proved more powerful than the one using the original spectra. The results in several spectral regions were then compared. Limiting values were proposed for the correlation coefficients that allow a substance to be qualified as identical to the reference.
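
    The first-derivative correlation algorithm described above can be sketched as follows. The "spectra" are synthetic curves, with the sample differing from the reference only by a baseline offset, which differentiation removes:

```python
import numpy as np

def derivative_correlation(spec_a, spec_b):
    """Correlation coefficient between the first derivatives of two
    spectra on a common wavenumber grid; differentiating suppresses
    baseline offsets and emphasizes band positions."""
    da, db = np.diff(spec_a), np.diff(spec_b)
    return float(np.corrcoef(da, db)[0, 1])

x = np.linspace(0, 4 * np.pi, 200)
reference = np.sin(x)
sample = np.sin(x) + 0.5   # same bands, shifted baseline

r = derivative_correlation(sample, reference)
print(f"r = {r:.3f}")  # baseline shift vanishes in the derivative -> r = 1.000
```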

  13. Defining an Analytic Framework to Evaluate Quantitative MRI Markers of Traumatic Axonal Injury: Preliminary Results in a Mouse Closed Head Injury Model

    PubMed Central

    Sadeghi, N.; Namjoshi, D.; Irfanoglu, M. O.; Wellington, C.; Diaz-Arrastia, R.

    2017-01-01

    Diffuse axonal injury (DAI) is a hallmark of traumatic brain injury (TBI) pathology. Recently, the Closed Head Injury Model of Engineered Rotational Acceleration (CHIMERA) was developed to generate an experimental model of DAI in a mouse. The characterization of DAI using diffusion tensor magnetic resonance imaging (MRI; diffusion tensor imaging, DTI) may provide a useful set of outcome measures for preclinical and clinical studies. The objective of this study was to identify the complex neurobiological underpinnings of DTI features following DAI using a comprehensive and quantitative evaluation of DTI and histopathology in the CHIMERA mouse model. A consistent neuroanatomical pattern of pathology in specific white matter tracts was identified across ex vivo DTI maps and photomicrographs of histology. These observations were confirmed by voxelwise and regional analysis of DTI maps, demonstrating reduced fractional anisotropy (FA) in distinct regions such as the optic tract. Similar regions were identified by quantitative histology and exhibited axonal damage as well as robust gliosis. Additional analysis using a machine-learning algorithm was performed to identify regions and metrics important for injury classification in a manner free from potential user bias. This analysis found that diffusion metrics were able to identify injured brains almost with the same degree of accuracy as the histology metrics. Good agreement between regions detected as abnormal by histology and MRI was also found. The findings of this work elucidate the complexity of cellular changes that give rise to imaging abnormalities and provide a comprehensive and quantitative evaluation of the relative importance of DTI and histological measures to detect brain injury. PMID:28966972

  14. Quantitative computed tomography and aerosol morphometry in COPD and alpha1-antitrypsin deficiency.

    PubMed

    Shaker, S B; Maltbaek, N; Brand, P; Haeussermann, S; Dirksen, A

    2005-01-01

    Relative area of emphysema below -910 Hounsfield units (RA-910) and 15th percentile density (PD15) are quantitative computed tomography (CT) parameters used in the diagnosis of emphysema. Newer concepts for the noninvasive diagnosis of emphysema are aerosol-derived airway morphometry, which measures effective airspace dimensions (EAD), and aerosol bolus dispersion (ABD). Quantitative CT, ABD and EAD were compared in 20 smokers with chronic obstructive pulmonary disease (COPD) and 22 patients with alpha1-antitrypsin deficiency (AAD) with a similar degree of airway obstruction and reduced diffusion capacity. In both groups, there was a significant correlation between RA-910 and PD15 and pulmonary function tests (PFTs). A significant correlation was also found between EAD, RA-910 and PD15 in the study population as a whole. Upon separation into the two groups, the significance disappeared for the smokers with COPD and strengthened for those with AAD, in whom EAD correlated significantly with RA-910 and PD15. ABD was similar in the two groups and did not correlate with PFTs or quantitative CT in either group. In conclusion, based on quantitative computed tomography and aerosol-derived airway morphometry, emphysema was significantly more severe in patients with alpha1-antitrypsin deficiency than in patients with usual emphysema, despite similar pulmonary function test results.

  15. Germicidal Activity against Carbapenem/Colistin-Resistant Enterobacteriaceae Using a Quantitative Carrier Test Method.

    PubMed

    Kanamori, Hajime; Rutala, William A; Gergen, Maria F; Sickbert-Bennett, Emily E; Weber, David J

    2018-05-07

    Susceptibility to germicides of carbapenem/colistin-resistant Enterobacteriaceae is poorly described. We investigated the efficacy of multiple germicides against these emerging antibiotic-resistant pathogens using the disc-based quantitative carrier test method, which produces results more similar to those encountered in healthcare settings than a suspension test does. Our study results demonstrated that germicides commonly used in healthcare facilities are likely to be effective against carbapenem/colistin-resistant Enterobacteriaceae when used appropriately. Copyright © 2018 American Society for Microbiology.

  16. Linearization improves the repeatability of quantitative dynamic contrast-enhanced MRI.

    PubMed

    Jones, Kyle M; Pagel, Mark D; Cárdenas-Rodríguez, Julio

    2018-04-01

    The purpose of this study was to compare the repeatability of the linear and nonlinear Tofts and reference region models (RRM) for dynamic contrast-enhanced MRI (DCE-MRI). Simulated and experimental DCE-MRI data from 12 rats with a C6 glioma flank tumor, acquired over three consecutive days, were analyzed using four quantitative and four semi-quantitative DCE-MRI metrics. The quantitative methods used were: 1) the linear Tofts model (LTM), 2) the non-linear Tofts model (NTM), 3) the linear RRM (LRRM), and 4) the non-linear RRM (NRRM). The following semi-quantitative metrics were used: 1) maximum enhancement ratio (MER), 2) time to peak (TTP), 3) initial area under the curve (iauc64), and 4) slope. LTM and NTM were used to estimate Ktrans, while LRRM and NRRM were used to estimate Ktrans relative to muscle (RKtrans). Repeatability was assessed by calculating the within-subject coefficient of variation (wSCV) and the percent intra-subject variation (iSV) determined with Gage R&R analysis. The iSV for RKtrans using LRRM was twofold lower than with NRRM under all simulated and experimental conditions. A similar trend was observed for the Tofts model, where LTM was at least 50% more repeatable than NTM under all experimental and simulated conditions. The semi-quantitative metrics iauc64 and MER were as repeatable as Ktrans and RKtrans estimated by LTM and LRRM, respectively. The iSV for iauc64 and MER was significantly lower than the iSV for slope and TTP. In both simulations and experiments, linearization improved the repeatability of quantitative DCE-MRI by at least 30%, making it as repeatable as the semi-quantitative metrics. Copyright © 2017 Elsevier Inc. All rights reserved.
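    To make the repeatability metric concrete, a within-subject coefficient of variation can be computed as below; the function and the data are hypothetical, shown only to illustrate the definition.

```python
import numpy as np

def within_subject_cv(measurements):
    """Within-subject coefficient of variation for repeated measurements.

    `measurements` is an (n_subjects, n_replicates) array of a DCE-MRI
    metric (e.g. a Ktrans estimate) from repeated scans; hypothetical data.
    """
    m = np.asarray(measurements, dtype=float)
    within_var = m.var(axis=1, ddof=1)   # per-subject variance across scans
    return np.sqrt(within_var.mean()) / m.mean()

# Three subjects scanned on three consecutive days (made-up values)
data = [[0.10, 0.11, 0.09],
        [0.20, 0.22, 0.21],
        [0.15, 0.14, 0.16]]
print(f"wCV = {within_subject_cv(data):.1%}")
```

    A lower wCV indicates that repeated scans of the same subject agree more closely, which is the sense in which the linearized models are "more repeatable".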

  17. RNA sequencing confirms similarities between PPI-responsive oesophageal eosinophilia and eosinophilic oesophagitis.

    PubMed

    Peterson, K A; Yoshigi, M; Hazel, M W; Delker, D A; Lin, E; Krishnamurthy, C; Consiglio, N; Robson, J; Yandell, M; Clayton, F

    2018-06-04

    Although current American guidelines distinguish proton pump inhibitor-responsive oesophageal eosinophilia (PPI-REE) from eosinophilic oesophagitis (EoE), these entities are broadly similar. While two microarray studies showed that they have similar transcriptomes, more extensive RNA sequencing studies have not been done previously. To determine whether RNA sequencing identifies genetic markers distinguishing PPI-REE from EoE. We retrospectively examined 13 PPI-REE and 14 EoE biopsies, matched for tissue eosinophil content, and 14 normal controls. Patients and controls were not PPI-treated at the time of biopsy. We did RNA sequencing on formalin-fixed, paraffin-embedded tissue, with differential expression confirmation by quantitative polymerase chain reaction (PCR). We validated the use of formalin-fixed, paraffin-embedded vs RNAlater-preserved tissue, and compared our formalin-fixed, paraffin-embedded EoE results to a prior EoE study. By RNA sequencing, no genes were differentially expressed between the EoE and PPI-REE groups at the false discovery rate (FDR) ≤0.01 level. Compared to normal controls, 1996 genes were differentially expressed in the PPI-REE group and 1306 genes in the EoE group. By less stringent criteria, only MAPK8IP2 was differentially expressed between PPI-REE and EoE (FDR = 0.029, 2.2-fold less in EoE than in PPI-REE), with similar results by PCR. KCNJ2, which was differentially expressed in a prior study, was similar in the EoE and PPI-REE groups by both RNA sequencing and real-time PCR. Eosinophilic oesophagitis and PPI-REE have comparable transcriptomes, confirming that they are part of the same disease continuum. © 2018 John Wiley & Sons Ltd.

  18. Nuclear markers reveal that inter-lake cichlids' similar morphologies do not reflect similar genealogy.

    PubMed

    Kassam, Daud; Seki, Shingo; Horic, Michio; Yamaoka, Kosaku

    2006-08-01

    The apparent inter-lake morphological similarity among cichlid species/genera of the East African Great Lakes has left evolutionary biologists asking whether such similarity is due to a shared common ancestor or to mere convergent evolution. To answer this question, we first used geometric morphometrics (GM) to quantify morphological similarity, and then used amplified fragment length polymorphism (AFLP) to determine whether similar morphologies imply shared ancestry or convergent evolution. GM revealed that not all presumed morphologically similar pairs were indeed similar, and the dendrogram generated from the AFLP data showed distinct clusters corresponding to each lake rather than to inter-lake morphologically similar pairs. These results imply that the morphological similarity is due to convergent evolution and not shared ancestry. The congruence of the GM- and AFLP-generated dendrograms implies that GM is capable of picking up phylogenetic signal, and thus can be a useful tool in phylogenetic systematics.

  19. The Gender Similarities Hypothesis

    ERIC Educational Resources Information Center

    Hyde, Janet Shibley

    2005-01-01

    The differences model, which argues that males and females are vastly different psychologically, dominates the popular media. Here, the author advances a very different view, the gender similarities hypothesis, which holds that males and females are similar on most, but not all, psychological variables. Results from a review of 46 meta-analyses…

  20. Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns

    NASA Astrophysics Data System (ADS)

    Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.

    2013-03-01

    Automated lung parenchymal classification usually relies on supervised learning from expert-chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (e.g. emphysema, ground glass, honeycombing, reticular and normal). Given the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data- and domain-dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combining the clusters obtained with multiple (n=33) probability-density-based similarity measures. Our study shows that hypergraph-based meta-clustering and probabilistic clustering provide optimal expert-metric agreement.

  1. Beyond Math Skills: Measuring Quantitative Reasoning in Context

    ERIC Educational Resources Information Center

    Grawe, Nathan D.

    2011-01-01

    It might be argued that quantitative and qualitative analyses are merely two alternative reflections of an overarching critical thinking. For instance, just as instructors of numeracy warn their charges to consider the construction of variables, teachers of qualitative approaches caution students to define terms. Similarly, an advocate of…

  2. Integrating Quantitative and Qualitative Results in Health Science Mixed Methods Research Through Joint Displays.

    PubMed

    Guetterman, Timothy C; Fetters, Michael D; Creswell, John W

    2015-11-01

    Mixed methods research is becoming an important methodology to investigate complex health-related topics, yet the meaningful integration of qualitative and quantitative data remains elusive and needs further development. A promising innovation to facilitate integration is the use of visual joint displays that bring data together visually to draw out new insights. The purpose of this study was to identify exemplar joint displays by analyzing the various types of joint displays being used in published articles. We searched for empirical articles that included joint displays in 3 journals that publish state-of-the-art mixed methods research. We analyzed each of 19 identified joint displays to extract the type of display, mixed methods design, purpose, rationale, qualitative and quantitative data sources, integration approaches, and analytic strategies. Our analysis focused on what each display communicated and its representation of mixed methods analysis. The most prevalent types of joint displays were statistics-by-themes and side-by-side comparisons. Innovative joint displays connected findings to theoretical frameworks or recommendations. Researchers used joint displays for convergent, explanatory sequential, exploratory sequential, and intervention designs. We identified exemplars for each of these designs by analyzing the inferences gained through using the joint display. Exemplars represented mixed methods integration, presented integrated results, and yielded new insights. Joint displays appear to provide a structure to discuss the integrated analysis and assist both researchers and readers in understanding how mixed methods provides new insights. We encourage researchers to use joint displays to integrate and represent mixed methods analysis and discuss their value. © 2015 Annals of Family Medicine, Inc.

  3. A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies.

    PubMed

    Sun, Dali; Hu, Tony Y

    2018-01-15

    Dark-field microscope (DFM) analysis of nanoparticle binding signal is highly useful for a variety of research and biomedical applications, but current applications for nanoparticle quantification rely on expensive DFM systems. The cost, size, and limited robustness of these DFMs limit their utility in non-laboratory settings. Most nanoparticle analyses use high-magnification DFM images, which are labor-intensive to acquire and subject to operator bias. Low-magnification DFM image capture is faster, but is subject to background from surface artifacts and debris, although image processing can partially compensate for this background signal. We thus mated an LED light source, a dark-field condenser and a 20× objective lens with a mobile phone camera to create an inexpensive, portable and robust DFM system suitable for use in non-laboratory conditions. This proof-of-concept mobile DFM device weighs less than 400 g and costs less than $2000, yet analysis of images captured with it reveals nanoparticle quantitation results similar to those acquired with a much larger and more expensive desktop DFM system. Our results suggest that similar devices may be useful for stable, nanoparticle-based activity and quantitation assays in resource-limited areas where conventional assay approaches are not practical. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. [Generics: essentially similar, bioequivalent but not identical].

    PubMed

    Even-Adin, D; De Muylder, J A; Sternon, J

    2001-12-01

    The use of generic formulations (GF) is presented as a potential source of budgetary savings in pharmaceutical expenditure. Infrequently prescribed in Belgium, they have gained new interest thanks to the recent introduction of the "reference reimbursement" system. Marketing authorization of GF is governed by European rules, but some questions about their identity with the original medications remain. Do similarities based only upon qualitative and quantitative composition in active molecules, pharmaceutical form and bioavailability give us all the requested guarantees? Several kinds of discordance can appear: the major elements of non-conformity are the nature of the excipients, the contents of the package leaflet and the quality of the bioavailability studies. However, in economic terms, the development of GF in the drug market appears to be an unavoidable phenomenon.

  5. Comprehensive evaluation of direct injection mass spectrometry for the quantitative profiling of volatiles in food samples

    PubMed Central

    2016-01-01

    Although qualitative strategies based on direct injection mass spectrometry (DIMS) have recently emerged as an alternative for the rapid classification of food samples, the potential of these approaches in quantitative tasks has scarcely been addressed to date. In this paper, the applicability of different multivariate regression procedures to data collected by DIMS from simulated mixtures has been evaluated. The most relevant factors affecting quantitation, such as random noise, the number of calibration samples, type of validation, mixture complexity and similarity of mass spectra, were also considered and comprehensively discussed. Based on the conclusions drawn from simulated data, and as an example of application, experimental mass spectral fingerprints collected by direct thermal desorption coupled to mass spectrometry were used for the quantitation of major volatiles in Thymus zygis subsp. zygis chemotypes. The results obtained, validated with the direct thermal desorption coupled to gas chromatography–mass spectrometry method here used as a reference, show the potential of DIMS approaches for the fast and precise quantitative profiling of volatiles in foods. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644978

  6. Bridging the Qualitative/Quantitative Software Divide

    PubMed Central

    Annechino, Rachelle; Antin, Tamar M. J.; Lee, Juliet P.

    2011-01-01

    To compare and combine qualitative and quantitative data collected from respondents in a mixed methods study, the research team developed a relational database to merge survey responses stored and analyzed in SPSS and semistructured interview responses stored and analyzed in the qualitative software package ATLAS.ti. The process of developing the database, as well as practical considerations for researchers who may wish to use similar methods, are explored. PMID:22003318

  7. Functional Module Search in Protein Networks based on Semantic Similarity Improves the Analysis of Proteomics Data*

    PubMed Central

    Boyanova, Desislava; Nilla, Santosh; Klau, Gunnar W.; Dandekar, Thomas; Müller, Tobias; Dittrich, Marcus

    2014-01-01

    The continuously evolving field of proteomics produces increasing amounts of data while improving the quality of protein identifications. Although quantitative measurements are becoming more popular, many proteomic studies are still based on non-quantitative methods for protein identification. These studies result in potentially large sets of identified proteins whose biological interpretation can be challenging. Systems biology develops innovative network-based methods that allow an integrated analysis of these data. Here we present a novel approach that combines prior knowledge of protein-protein interactions (PPI) with proteomics data using functional similarity measurements of interacting proteins. This integrated network analysis identifies exactly those network modules with a maximal consistent functional similarity, reflecting biological processes of the investigated cells. We validated our approach on small (H9N2 virus-infected gastric cells) and large (blood constituents) proteomic data sets. Using this novel algorithm, we identified characteristic functional modules in virus-infected cells, comprising key signaling proteins (e.g. the stress-related kinase RAF1), and demonstrated that this method allows a module-based functional characterization of cell types. Analysis of a large proteome data set of blood constituents resulted in clear separation of blood cells according to their developmental origin. A detailed investigation of the T-cell proteome further illustrates how the algorithm partitions large networks into functional subnetworks, each representing specific cellular functions. These results demonstrate that the integrated network approach not only allows a detailed analysis of proteome networks but also yields a functional decomposition of complex proteomic data sets, thereby providing deeper insights into the underlying cellular processes of the investigated system. PMID:24807868

  8. Quantitative fetal fibronectin and cervical length in symptomatic women: results from a prospective blinded cohort study.

    PubMed

    Levine, Lisa D; Downes, Katheryne L; Romero, Julie A; Pappas, Hope; Elovitz, Michal A

    2018-05-15

    Our objectives were to determine whether quantitative fetal fibronectin (fFN) and cervical length (CL) screening can be used alone or in combination as prognostic tests to identify symptomatic women at the highest or lowest risk for spontaneous preterm birth (sPTB). A prospective, blinded cohort study of women presenting with a singleton gestation to our triage unit between 22-33w6d with preterm labor symptoms was performed. Women with ruptured membranes, moderate/severe bleeding, and dilation >2 cm were excluded. The primary outcome was sPTB <37 weeks. We evaluated test characteristics of quantitative fFN and CL assessment, both separately and in combination, considering traditionally reported cut-points (fFN ≥50 and CL <25), as well as cut-points above and below these measures. We found interactions between fFN >50 and CL <25 and sPTB by parity and obstetric history (p < .05) and therefore stratified results. Test characteristics are presented with positive predictive value (PPV) and negative predictive value (NPV). Five hundred eighty women were enrolled and 537 women were available for analysis. Overall sPTB rate was 11.1%. Among nulliparous women, increasing levels of fFN were associated with increasing risk of sPTB, with PPV going from 26.5% at ≥20 ng/mL to 44.4% at ≥200 ng/mL. A cut-point of 20 ng/mL had higher sensitivity (69.2%) and higher NPV (96.8%) and therefore identified a "low-risk" group. fFN was not informative for multiparous women regardless of prior obstetrical history or quantitative level chosen. For all women, a shorter CL was associated with an increased sPTB risk. Among nulliparas and multiparas without a prior sPTB, a CL <20 mm optimized test characteristics (PPV 25 and 20%, NPV 95.5, and 92.7%, respectively). For multiparas with a prior sPTB, CL <25 mm was more useful. Using fFN and CL in combination for nulliparas did not improve test characteristics over using the individual fFN (p = .74) and CL (p = .31

  9. "The Math You Need" When Faculty Need It: Enhancing Quantitative Skills at a Broad Spectrum of Higher Education Institutions

    NASA Astrophysics Data System (ADS)

    Baer, E. M.; Wenner, J. M.

    2014-12-01

    Implementation of "The Math You Need, When You Need It" (TMYN) modules at a wide variety of institutions suggests a broad need for faculty support in helping students develop quantitative skills necessary in introductory geoscience courses. Designed to support students in applying geoscience relevant quantitative skills, TMYN modules are web-based, self-paced and commonly assigned outside of class. They include topics such as calculating slope, rearranging equations, and unit conversions and provide several applications of the mathematical technique to geoscience problems. Each instructor chooses modules that are applicable to the content in his/her individual course and students typically work through the module immediately before the module topic is applied in lab or class. Instructors assigned TMYN modules in their courses at more than 40 diverse institutions, including four-year colleges and universities (4YCs) that vary from non-selective to highly selective and open-door two-year colleges (2YCs). Analysis of module topics assigned, frequency of module use, and institutional characteristics reveals similarities and differences among faculty perception of required quantitative skills and incoming student ability at variably selective institutions. Results indicate that institutional type and selectivity are not correlated with module topic; that is, faculty apply similar quantitative skills in all introductory geoscience courses. For example, nearly every instructor assigned the unit conversions module, whereas very few required the trigonometry module. However, differences in number of assigned modules and faculty expectations are observed between 2YCs and 4YCs (no matter the selectivity). Two-year college faculty typically assign a higher number of modules per course and faculty at 4YCs more often combine portions of multiple modules or cover multiple mathematical concepts in a single assignment. These observations suggest that quantitative skills required

  10. Quantitative Assessment of Breast Cosmetic Outcome After Whole-Breast Irradiation.

    PubMed

    Reddy, Jay P; Lei, Xiudong; Huang, Sheng-Cheng; Nicklaus, Krista M; Fingeret, Michelle C; Shaitelman, Simona F; Hunt, Kelly K; Buchholz, Thomas A; Merchant, Fatima; Markey, Mia K; Smith, Benjamin D

    2017-04-01

    To measure breast cosmetic outcome, by quantitative analysis of digital photographs, within the setting of a randomized trial of conventionally fractionated (CF) and hypofractionated (HF) whole-breast irradiation (WBI), and to identify how quantitative cosmesis metrics were associated with patient- and physician-reported cosmesis and whether they differed by treatment arm. From 2011 to 2014, 287 women aged ≥40 years with ductal carcinoma in situ or early invasive breast cancer were randomized to HF-WBI (42.56 Gy/16 fractions [fx] + 10-12.5 Gy/4-5 fx boost) or CF-WBI (50 Gy/25 fx + 10-14 Gy/5-7 fx). At 1 year after treatment we collected digital photographs, patient-reported cosmesis using the Breast Cancer Treatment and Outcomes Scale, and physician-reported cosmesis using the Radiation Therapy Oncology Group scale. Six quantitative measures of breast symmetry, labeled M1-M6, were calculated from anteroposterior digital photographs. For each measure, values closer to 1 imply greater symmetry, and values closer to 0 imply greater asymmetry. Associations between M1-M6 and patient- and physician-reported cosmesis and treatment arm were evaluated using the Kruskal-Wallis test. Among 245 evaluable patients, patient-reported cosmesis was strongly associated with M1 (vertical symmetry measure) (P<.01). Physician-reported cosmesis was similarly correlated with M1 (P<.01) and also with M2 (vertical symmetry, P=.01) and M4 (horizontal symmetry, P=.03). At 1 year after treatment, HF-WBI resulted in better values of M2 (P=.02) and M3 (P<.01) than CF-WBI; treatment arm was not significantly associated with M1, M4, M5, or M6 (P≥.12). Quantitative assessment of breast photographs reveals similar or improved cosmetic outcomes with HF-WBI compared with CF-WBI 1 year after treatment. Assessing cosmetic outcome using these measures could be useful for future comparative effectiveness studies and outcome reporting. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. The gender similarities hypothesis.

    PubMed

    Hyde, Janet Shibley

    2005-09-01

    The differences model, which argues that males and females are vastly different psychologically, dominates the popular media. Here, the author advances a very different view, the gender similarities hypothesis, which holds that males and females are similar on most, but not all, psychological variables. Results from a review of 46 meta-analyses support the gender similarities hypothesis. Gender differences can vary substantially in magnitude at different ages and depend on the context in which measurement occurs. Overinflated claims of gender differences carry substantial costs in areas such as the workplace and relationships. Copyright (c) 2005 APA, all rights reserved.

  12. Production and certification of NIST Standard Reference Material 2372 Human DNA Quantitation Standard.

    PubMed

    Kline, Margaret C; Duewer, David L; Travis, John C; Smith, Melody V; Redman, Janette W; Vallone, Peter M; Decker, Amy E; Butler, John M

    2009-06-01

    Modern highly multiplexed short tandem repeat (STR) assays used by the forensic human-identity community require tight control of the initial amount of sample DNA amplified in the polymerase chain reaction (PCR) process. This, in turn, requires the ability to reproducibly measure the concentration of human DNA, [DNA], in a sample extract. Quantitative PCR (qPCR) techniques can determine the number of intact stretches of DNA of specified nucleotide sequence in an extremely small sample; however, these assays must be calibrated with DNA extracts of well-characterized and stable composition. By 2004, studies coordinated by or reported to the National Institute of Standards and Technology (NIST) indicated that a well-characterized, stable human DNA quantitation certified reference material (CRM) could help the forensic community reduce within- and among-laboratory quantitation variability. To ensure that the stability of such a quantitation standard can be monitored and that, if and when required, equivalent replacement materials can be prepared, a measurement of some stable quantity directly related to [DNA] is required. Using a long-established conventional relationship linking optical density (properly designated as decadic attenuance) at 260 nm with [DNA] in aqueous solution, NIST Standard Reference Material (SRM) 2372 Human DNA Quantitation Standard was issued in October 2007. This SRM consists of three quite different DNA extracts: a single-source male, a multiple-source female, and a mixture of male and female sources. All three SRM components have very similar optical densities, and thus very similar conventional [DNA]. The materials perform very similarly in several widely used gender-neutral assays, demonstrating that the combination of appropriate preparation methods and metrologically sound spectrophotometric measurements enables the preparation and certification of quantitation [DNA] standards that are both maintainable and of practical utility.
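    The long-established conventional relationship linking optical density at 260 nm to [DNA], mentioned above, can be sketched as follows. The function is hypothetical; the factor of 50 ng/µL per absorbance unit (1 cm path length) is the standard convention for double-stranded DNA.

```python
def dna_concentration(a260, dilution_factor=1.0, factor=50.0):
    """Approximate dsDNA concentration in ng/uL from absorbance at 260 nm.

    Uses the conventional relationship that an optical density of 1.0 at
    260 nm (1 cm path length) corresponds to ~50 ng/uL of double-stranded
    DNA; `dilution_factor` corrects for any dilution before measurement.
    """
    return a260 * factor * dilution_factor

print(dna_concentration(0.5))        # 25.0 ng/uL
print(dna_concentration(0.5, 10.0))  # 250.0 ng/uL for a 1:10 dilution
```

    Because the three SRM components have very similar optical densities, this convention yields very similar conventional [DNA] values for them, as the abstract notes.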

  13. Artificial neural networks applied to quantitative elemental analysis of organic material using PIXE

    NASA Astrophysics Data System (ADS)

    Correa, R.; Chesta, M. A.; Morales, J. R.; Dinator, M. I.; Requena, I.; Vila, I.

    2006-08-01

    An artificial neural network (ANN) has been trained with real-sample PIXE (particle-induced X-ray emission) spectra of organic substances. Following the training stage, the ANN was applied to a subset of similar samples, thus obtaining the elemental concentrations in muscle, liver and gills of Cyprinus carpio. Concentrations obtained with the ANN method are in full agreement with results from a standard analytical procedure, showing the high potential of ANNs in quantitative PIXE analyses.

  14. Quantitative relations between corruption and economic factors

    NASA Astrophysics Data System (ADS)

    Shao, Jia; Ivanov, Plamen Ch.; Podobnik, Boris; Stanley, H. Eugene

    2007-03-01

    We report quantitative relations between corruption level and economic factors, such as country wealth and foreign investment per capita, which are characterized by a power law spanning multiple scales of wealth and investment per capita. These relations hold for diverse countries, and also remain stable over different time periods. We also observe a negative correlation between level of corruption and long-term economic growth. We find similar results for two independent indices of corruption, suggesting that the relation between corruption and wealth does not depend on the specific measure of corruption. The functional relations we report have implications when assessing the relative level of corruption for two countries with comparable wealth, and for quantifying the impact of corruption on economic growth and foreign investment.
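    A power-law relation of the kind reported above is typically estimated by linear regression in log-log space; a minimal sketch with synthetic data, not the study's corruption or wealth indices:

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = c * x**alpha by least squares on log-transformed data."""
    slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
    return np.exp(intercept), slope   # (prefactor c, exponent alpha)

# Synthetic data exactly following y = 2 * x**1.5 (illustrative only)
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
c, alpha = fit_power_law(x, 2.0 * x**1.5)
```

    A straight line on a log-log plot across several decades of wealth or investment per capita is the signature of such a relation "spanning multiple scales".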

  15. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  16. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006.

    PubMed

    Chen, Lin; Ray, Shonket; Keller, Brad M; Pertuz, Said; McDonald, Elizabeth S; Conant, Emily F; Kontos, Despina

    2016-09-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88-0.95; weighted κ = 0.83-0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76-0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density estimation

  17. The Impact of Acquisition Dose on Quantitative Breast Density Estimation with Digital Mammography: Results from ACRIN PA 4006

    PubMed Central

    Chen, Lin; Ray, Shonket; Keller, Brad M.; Pertuz, Said; McDonald, Elizabeth S.; Conant, Emily F.

    2016-01-01

    Purpose To investigate the impact of radiation dose on breast density estimation in digital mammography. Materials and Methods With institutional review board approval and Health Insurance Portability and Accountability Act compliance under waiver of consent, a cohort of women from the American College of Radiology Imaging Network Pennsylvania 4006 trial was retrospectively analyzed. All patients underwent breast screening with a combination of dose protocols, including standard full-field digital mammography, low-dose digital mammography, and digital breast tomosynthesis. A total of 5832 images from 486 women were analyzed with previously validated, fully automated software for quantitative estimation of density. Clinical Breast Imaging Reporting and Data System (BI-RADS) density assessment results were also available from the trial reports. The influence of image acquisition radiation dose on quantitative breast density estimation was investigated with analysis of variance and linear regression. Pairwise comparisons of density estimations at different dose levels were performed with Student t test. Agreement of estimation was evaluated with quartile-weighted Cohen kappa values and Bland-Altman limits of agreement. Results Radiation dose of image acquisition did not significantly affect quantitative density measurements (analysis of variance, P = .37 to P = .75), with percent density demonstrating a high overall correlation between protocols (r = 0.88–0.95; weighted κ = 0.83–0.90). However, differences in breast percent density (1.04% and 3.84%, P < .05) were observed within high BI-RADS density categories, although they were significantly correlated across the different acquisition dose levels (r = 0.76–0.92, P < .05). Conclusion Precision and reproducibility of automated breast density measurements with digital mammography are not substantially affected by variations in radiation dose; thus, the use of low-dose techniques for the purpose of density

  18. What difference reveals about similarity.

    PubMed

    Sagi, Eyal; Gentner, Dedre; Lovett, Andrew

    2012-08-01

    Detecting that two images are different is faster for highly dissimilar images than for highly similar images. Paradoxically, we showed that the reverse occurs when people are asked to describe how two images differ--that is, to state a difference between two images. Following structure-mapping theory, we propose that this dissociation arises from the multistage nature of the comparison process. Detecting that two images are different can be done in the initial (local-matching) stage, but only for pairs with low overlap; thus, "different" responses are faster for low-similarity than for high-similarity pairs. In contrast, identifying a specific difference generally requires a full structural alignment of the two images, and this alignment process is faster for high-similarity pairs. We described four experiments that demonstrate this dissociation and show that the results can be simulated using the Structure-Mapping Engine. These results pose a significant challenge for nonstructural accounts of similarity comparison and suggest that structural alignment processes play a significant role in visual comparison. Copyright © 2012 Cognitive Science Society, Inc.

  19. Clinical performance of the LCx HCV RNA quantitative assay.

    PubMed

    Bertuzis, Rasa; Hardie, Alison; Hottentraeger, Barbara; Izopet, Jacques; Jilg, Wolfgang; Kaesdorf, Barbara; Leckie, Gregor; Leete, Jean; Perrin, Luc; Qiu, Chunfu; Ran, Iris; Schneider, George; Simmonds, Peter; Robinson, John

    2005-02-01

    This study was conducted to assess the performance of the Abbott laboratories LCx HCV RNA Quantitative Assay (LCx assay) in the clinical setting. Four clinical laboratories measured LCx assay precision, specificity, and linearity. In addition, a method comparison was conducted between the LCx assay and the Roche HCV Amplicor Monitor, version 2.0 (Roche Monitor 2.0) and the Bayer VERSANT HCV RNA 3.0 Assay (Bayer bDNA 3.0) quantitative assays. For precision, the observed LCx assay intra-assay standard deviation (S.D.) was 0.060-0.117 log IU/ml, the inter-assay S.D. was 0.083-0.133 log IU/ml, the inter-lot S.D. was 0.105-0.177 log IU/ml, the inter-site S.D. was 0.099-0.190 log IU/ml, and the total S.D. was 0.113-0.190 log IU/ml. The specificity of the LCx assay was 99.4% (542/545; 95% CI, 98.4-99.9%). For linearity, the mean pooled LCx assay results were linear (r=0.994) over the range of the panel (2.54-5.15 log IU/ml). A method comparison demonstrated a correlation coefficient of 0.881 between the LCx assay and Roche Monitor 2.0, 0.872 between the LCx assay and Bayer bDNA 3.0, and 0.870 between Roche Monitor 2.0 and Bayer bDNA 3.0. The mean LCx assay result was 0.04 log IU/ml (95% CI, -0.08, 0.01) lower than the mean Roche Monitor 2.0 result, but 0.57 log IU/ml (95% CI, 0.53, 0.61) higher than the mean Bayer bDNA 3.0 result. The mean Roche Monitor 2.0 result was 0.60 log IU/ml (95% CI, 0.56, 0.65) higher than the mean Bayer bDNA 3.0 result. The LCx assay quantitated genotypes 1-4 with statistical equivalency. The vast majority (98.9%, 278/281) of paired LCx assay-Roche Monitor 2.0 specimen results were within 1 log IU/ml. Similarly, 86.6% (240/277) of paired LCx assay and Bayer bDNA 3.0 specimen results were within 1 log, as were 85.6% (237/277) of paired Roche Monitor 2.0 and Bayer specimen results. These data demonstrate that the LCx assay may be used for quantitation of HCV RNA in HCV-infected individuals.

  20. Evaluation of quantitative PCR measurement of bacterial colonization of epithelial cells.

    PubMed

    Schmidt, Marcin T; Olejnik-Schmidt, Agnieszka K; Myszka, Kamila; Borkowska, Monika; Grajek, Włodzimierz

    2010-01-01

    Microbial colonization is an important step in establishing pathogenic or probiotic relations with host cells and in biofilm formation on industrial or medical devices. The aim of this work was to verify the applicability of quantitative PCR (real-time PCR) for measuring bacterial colonization of epithelial cells. Salmonella enterica and the Caco-2 intestinal epithelial cell line were used as a model. To verify the sensitivity of the assay, competition between the pathogen and a probiotic microorganism was tested. The qPCR method was compared to plate counting and a radiolabel approach, which are well-established techniques in this area of research. The three methods returned similar results. The radiolabel method had the best quantification accuracy, followed by qPCR. The plate count results showed a coefficient of variation twice that of qPCR. Quantitative PCR proved to be a reliable method for the enumeration of microbes in colonization assays. It has several advantages that make it very useful for analyzing mixed populations, where several different species or even strains can be monitored at the same time.
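
    The coefficient-of-variation comparison reported above can be illustrated with a small sketch on invented replicate counts (the `cv` helper and the numbers are assumptions, not the study's data):

    ```python
    import statistics

    # Toy replicate counts for two enumeration methods
    # (log10 CFU-equivalents per well; invented for illustration).
    qpcr  = [5.1, 5.0, 5.2, 5.1, 5.0]
    plate = [5.3, 4.8, 5.4, 4.7, 5.2]

    def cv(values):
        """Coefficient of variation in percent: sample standard
        deviation relative to the mean."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    print(f"qPCR CV = {cv(qpcr):.1f}%, plate count CV = {cv(plate):.1f}%")
    ```

    With these toy numbers the plate-count CV comes out several times larger than the qPCR CV, mirroring the direction of the abstract's finding.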

  1. Quantitative EEG analysis in minimally conscious state patients during postural changes.

    PubMed

    Greco, A; Carboncini, M C; Virgillito, A; Lanata, A; Valenza, G; Scilingo, E P

    2013-01-01

    Mobilization and postural changes of patients with cognitive impairment are standard clinical practices useful for both the psychic and physical rehabilitation process. During this process, several physiological signals, such as the Electroencephalogram (EEG), Electrocardiogram (ECG), Photoplethysmography (PPG), Respiration activity (RESP), and Electrodermal activity (EDA), are monitored and processed. In this paper we investigated how quantitative EEG (qEEG) changes with postural modifications in minimally conscious state patients. This study is quite novel and no similar experimental data can be found in the current literature; therefore, although the results are very encouraging, a quantitative analysis of the cortical areas activated in such postural changes still needs to be investigated in depth. More specifically, this paper shows EEG power spectra and brain symmetry index modifications during a verticalization procedure, from 0 to 60 degrees, of three patients in Minimally Conscious State (MCS) with a focal region of impairment. Experimental results show a significant increase of the power in the β band (12-30 Hz), commonly associated with the human alertness process, thus suggesting that mobilization and postural changes can have beneficial effects in MCS patients.

  2. LCS-TA to identify similar fragments in RNA 3D structures.

    PubMed

    Wiedemann, Jakub; Zok, Tomasz; Milostan, Maciej; Szachniuk, Marta

    2017-10-23

    In modern structural bioinformatics, comparison of molecular structures aimed at identifying and assessing similarities and differences between them is one of the most commonly performed procedures. It gives the basis for evaluation of in silico predicted models. It constitutes the preliminary step in searching for structural motifs. In particular, it supports tracing the molecular evolution. Faced with an ever-increasing amount of available structural data, researchers need a range of methods enabling comparative analysis of the structures from either a global or a local perspective. Herein, we present a new, superposition-independent method which processes pairs of RNA 3D structures to identify their local similarities. The similarity is considered in the context of structure bending and bond rotation, which are described by torsion angles. In the analyzed RNA structures, the method finds the longest continuous segments that show similar torsion within a user-defined threshold. The length of the segment is provided as the local similarity measure. The method has been implemented as the LCS-TA algorithm (Longest Continuous Segments in Torsion Angle space) and is incorporated into our MCQ4Structures application, freely available for download from http://www.cs.put.poznan.pl/tzok/mcq/ . The presented approach ties the torsion-angle-based method of structure analysis to the idea of local similarity identification by handling continuous 3D structure segments. The first method, implemented in MCQ4Structures, has been successfully utilized in the RNA-Puzzles initiative. The second one, originally applied in Euclidean space, is a component of the LGA (Local-Global Alignment) algorithm commonly used in assessing protein models submitted to CASP. This unique combination of concepts implemented in LCS-TA provides a new perspective on structure quality assessment in a local and quantitative aspect. A series of computational experiments show the first results of applying our method to comparison of RNA 3
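
    The central idea, finding the longest continuous segment whose torsion angles agree within a threshold, can be sketched as follows. The function names and the 15-degree threshold are illustrative assumptions, not the MCQ4Structures API:

    ```python
    # Sketch of the core LCS-TA idea on per-residue torsion angles
    # (degrees); in the real algorithm each residue carries several
    # torsion angles, collapsed here to one for brevity.
    def angle_diff(a, b):
        """Smallest absolute difference between two angles in degrees."""
        d = abs(a - b) % 360.0
        return min(d, 360.0 - d)

    def longest_similar_segment(angles1, angles2, threshold=15.0):
        """Longest run of consecutive residues whose torsion angles agree
        within `threshold` degrees; returns (start_index, length)."""
        best_start, best_len, start, length = 0, 0, 0, 0
        for i, (a, b) in enumerate(zip(angles1, angles2)):
            if angle_diff(a, b) <= threshold:
                if length == 0:
                    start = i
                length += 1
                if length > best_len:
                    best_start, best_len = start, length
            else:
                length = 0
        return best_start, best_len

    s1 = [-60, -55, 170, 175, 60, 58, -170]
    s2 = [-62, -50, 100, 178, 55, 61, -168]
    print(longest_similar_segment(s1, s2, threshold=15.0))  # -> (3, 4)
    ```

    Because only angle differences are compared, the measure is superposition-independent, which is the property the abstract emphasizes.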

  3. Quantitative distribution of GABA-immunoreactive neurons in cetacean visual cortex is similar to that in land mammals.

    PubMed

    Garey, L J; Takács, J; Revishchin, A V; Hámori, J

    1989-04-24

    Sections of the anterior portion of the visual cortex in the lateral gyrus of the Black Sea porpoise were studied to determine the neuronal architecture and numerical density, and the distribution of neurons immunoreactive to gamma-aminobutyric acid (GABA). Cytoarchitecture and neuronal density are similar to those described in another cetacean, the bottlenose dolphin. GABA-positive neurons are distributed through all layers of the visual cortex but are especially dense in layers II and III, and comprise some 20% of the total neuronal population in this part of the cortex. The distribution of GABA-positive neurons is similar to that found in land mammals.

  4. Diffraction enhanced X-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods for this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  5. Low-Energy Nuclear Reactions Resulting as Picometer Interactions with Similarity to K-Shell Electron Capture

    NASA Astrophysics Data System (ADS)

    Hora, H.; Miley, G. H.; Li, X. Z.; Kelly, J. C.; Osman, F.

    2006-02-01

    Since the appeal by Brian Josephson at the meeting of the Nobel Laureates in July 2004, it seems appropriate to summarize the following serious, reproducible and confirmed observations on reactions of protons or deuterons incorporated in host metals such as palladium. Some reflections on Rutherford's discovery of nuclear physics, the Cockcroft-Oliphant discovery of anomalous low-energy fusion reactions and the chemist Hahn's discovery of fission had to be included. Using a gaseous atmosphere or discharges between palladium targets, rather significant results were seen, e.g. from the "life after death" heat production of such high values per host atom that only nuclear reactions can be involved. This supports the earlier evaluation of neutron generation in fully reversible experiments with gas discharges, hinting that a reasonable screening effect - preferably in the swimming electron layer - may lead to reactions at nuclear distances d of picometers with reaction probability times U of about megaseconds, similar to K-shell electron capture radioactivity. Further electrolytic experiments led to low-energy nuclear reactions (LENR) where the involvement of pollution could be excluded from the appearance of very rarely occurring rare earth elements. A basically new theory for DD cross-sections is used to confirm the picometer-megasecond reactions of cold fusion. Other theoretical aspects are given from measured heavy element distributions similar to the standard abundance distribution, SAD, in the Universe, with consequences for endothermic heavy nuclei generation, magic numbers and quark-gluon plasmas.

  6. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  7. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
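
    The bootstrap construction of pseudo-predator signatures described above can be sketched roughly as below. The prey library, diet proportions, and function names are invented for illustration; the real QFASA workflow also applies calibration coefficients and an objective bootstrap-sample-size rule, both omitted here:

    ```python
    import random

    # Each prey signature is a list of fatty acid proportions summing to 1
    # (toy data: two species, two signatures each, three fatty acids).
    random.seed(42)
    prey_library = {
        "seal": [[0.50, 0.30, 0.20], [0.55, 0.25, 0.20]],
        "fish": [[0.20, 0.50, 0.30], [0.25, 0.45, 0.30]],
    }
    true_diet = {"seal": 0.7, "fish": 0.3}  # known diet of the pseudo-predator

    def pseudo_predator(prey_library, diet, n_boot=30):
        """Build one pseudo-predator signature: bootstrap-sample each
        species' prey signatures, average them, and mix by diet weight."""
        n_fa = len(next(iter(prey_library.values()))[0])
        sig = [0.0] * n_fa
        for species, weight in diet.items():
            sample = random.choices(prey_library[species], k=n_boot)
            for fa in range(n_fa):
                sig[fa] += weight * sum(s[fa] for s in sample) / n_boot
        return sig

    sig = pseudo_predator(prey_library, true_diet)
    print([round(x, 3) for x in sig])
    ```

    Because the pseudo-predator is built from a known diet, a QFASA estimator can then be run against it and its output compared with `true_diet` to assess bias and variance.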

  8. Semantic Similarity in Biomedical Ontologies

    PubMed Central

    Pesquita, Catia; Faria, Daniel; Falcão, André O.; Lord, Phillip; Couto, Francisco M.

    2009-01-01

    In recent years, ontologies have become a mainstream topic in biomedical research. When biological entities are described using a common schema, such as an ontology, they can be compared by means of their annotations. This type of comparison is called semantic similarity, since it assesses the degree of relatedness between two entities by the similarity in meaning of their annotations. The application of semantic similarity to biomedical ontologies is recent; nevertheless, several studies have been published in the last few years describing and evaluating diverse approaches. Semantic similarity has become a valuable tool for validating the results drawn from biomedical studies such as gene clustering, gene expression data analysis, prediction and validation of molecular interactions, and disease gene prioritization. We review semantic similarity measures applied to biomedical ontologies and propose their classification according to the strategies they employ: node-based versus edge-based and pairwise versus groupwise. We also present comparative assessment studies and discuss the implications of their results. We survey the existing implementations of semantic similarity measures, and we describe examples of applications to biomedical research. This will clarify how biomedical researchers can benefit from semantic similarity measures and help them choose the approach most suitable for their studies. Biomedical ontologies are evolving toward increased coverage, formality, and integration, and their use for annotation is increasingly becoming a focus of both effort by biomedical experts and application of automated annotation procedures to create corpora of higher quality and completeness than are currently available. Given that semantic similarity measures are directly dependent on these evolutions, we can expect to see them gaining more relevance and even becoming as essential as sequence similarity is today in biomedical research. PMID:19649320

  9. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
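
    The paired t-test used above to compare the two quantification methods can be sketched on invented numbers (this is a generic paired comparison, not the paper's data):

    ```python
    import math
    import statistics

    # Hypothetical gamma-oryzanol determinations (% w/w) on the same six
    # samples by the two methods; numbers invented for illustration.
    densitometric = [1.52, 1.48, 1.61, 1.55, 1.49, 1.58]
    image_based   = [1.50, 1.49, 1.60, 1.57, 1.47, 1.59]

    # Paired t statistic: mean of the per-sample differences divided by
    # its standard error.
    diffs = [a - b for a, b in zip(densitometric, image_based)]
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)
    t = mean_d / (sd_d / math.sqrt(len(diffs)))
    print(f"paired t = {t:.3f} on {len(diffs) - 1} df")
    # |t| below the two-sided 5% critical value (about 2.571 for 5 df)
    # indicates no significant difference between the methods.
    ```

    A |t| well under the critical value is the situation the abstract reports: no statistically significant difference between the densitometric and image-analysis results.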

  10. Ten Years of LibQual: A Study of Qualitative and Quantitative Survey Results at the University of Mississippi 2001-2010

    ERIC Educational Resources Information Center

    Greenwood, Judy T.; Watson, Alex P.; Dennis, Melissa

    2011-01-01

    This article analyzes quantitative adequacy gap scores and coded qualitative comments from LibQual surveys at the University of Mississippi from 2001 to 2010, looking for relationships between library policy changes and LibQual results and any other trends that emerged. Analysis found no relationship between changes in policy and survey results…

  11. WE-FG-207B-12: Quantitative Evaluation of a Spectral CT Scanner in a Phantom Study: Results of Spectral Reconstructions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Duan, X; Arbique, G; Guild, J

    Purpose: To evaluate the quantitative image quality of spectral reconstructions of phantom data from a spectral CT scanner. Methods: The spectral CT scanner (IQon Spectral CT, Philips Healthcare) is equipped with a dual-layer detector and generates conventional 80-140 kVp images and a variety of spectral reconstructions, e.g., virtual monochromatic (VM) images, virtual non-contrast (VNC) images, iodine maps, and effective atomic number (Z) images. A cylindrical solid water phantom (Gammex 472, 33 cm diameter and 5 cm thick) with iodine (2.0-20.0 mg I/ml) and calcium (50-600 mg/ml) rod inserts was scanned at 120 kVp and 27 mGy CTDIvol. Spectral reconstructions were evaluated by comparing image measurements with theoretical values calculated from nominal rod compositions provided by the phantom manufacturer. The theoretical VNC was calculated using water and iodine basis material decomposition, and the theoretical Z was calculated using two common methods, the chemical formula method (Z1) and the dual-energy ratio method (Z2). Results: Beam-hardening-like artifacts between high-attenuation calcium rods (≥300 mg/ml, >800 HU) influenced quantitative measurements, so the quantitative analysis was only performed on iodine rods using the images from the scan with all the calcium rods removed. The CT numbers of the iodine rods in the VM images (50-150 keV) were close to theoretical values, with an average difference of 2.4±6.9 HU. Compared with theoretical values, the average differences for iodine concentration, VNC CT number and effective Z of the iodine rods were −0.10±0.38 mg/ml, −0.1±8.2 HU, 0.25±0.06 (Z1) and −0.23±0.07 (Z2). Conclusion: The results indicate that the spectral CT scanner generates quantitatively accurate spectral reconstructions at clinically relevant iodine concentrations. Beam-hardening-like artifacts still exist when high-attenuation objects are present, and their impact on patient images needs further investigation.

  12. Creating Birds of Similar Feathers: Leveraging Similarity to Improve Teacher-Student Relationships and Academic Achievement

    ERIC Educational Resources Information Center

    Gehlbach, Hunter; Brinkworth, Maureen E.; King, Aaron M.; Hsu, Laura M.; McIntyre, Joseph; Rogers, Todd

    2016-01-01

    When people perceive themselves as similar to others, greater liking and closer relationships typically result. In the first randomized field experiment that leverages actual similarities to improve real-world relationships, we examined the affiliations between 315 9th grade students and their 25 teachers. Students in the treatment condition…

  13. Sample similarity analysis of angles of repose based on experimental results for DEM calibration

    NASA Astrophysics Data System (ADS)

    Tan, Yuan; Günthner, Willibald A.; Kessler, Stephan; Zhang, Lu

    2017-06-01

    As a fundamental material property, the particle-particle friction coefficient is usually calculated from the angle of repose, which can be obtained experimentally. In the present study, the bottomless cylinder test was carried out to investigate this friction coefficient for a biomass material, willow chips. Because of their irregular shape and varying particle size distribution, calculating the angle from a single slope section becomes less applicable and decisive. In previous studies, only one section of the uneven slope is chosen in most cases, although standard methods for defining a representative section are barely found. Hence, we present an efficient and reliable method based on 3D scanning, which was used to digitize the surface of the heaps and generate a point cloud. Two tangential lines of any selected section were then calculated through linear least-squares regression (LLSR), such that the left and right angles of repose of a pile could be derived. As the next step, a number of sections were stochastically selected and the calculations repeated accordingly to obtain a sample of angles, which was plotted in Cartesian coordinates as a scatter diagram. Subsequently, different samples were acquired through various selections of sections. By analyzing the similarities and differences of these samples, the reliability of the proposed method was verified. These initial results provide a realistic criterion for reducing the deviation between experiment and simulation that arises from the random selection of a single angle, and they will be compared with simulation results in the future.
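
    The LLSR step described above, fitting a line to one flank of the heap's cross-section and reading the angle of repose off its slope, can be sketched on toy point-cloud data (the coordinates below are invented, not scan data):

    ```python
    import math

    # (x, z) points sampled from one flank of a heap cross-section,
    # in metres (toy data standing in for a 3D-scan section).
    points = [(0.0, 0.00), (0.1, 0.06), (0.2, 0.11), (0.3, 0.18), (0.4, 0.23)]

    # Ordinary least-squares slope of z against x, then the angle of
    # repose as the arctangent of that slope.
    n = len(points)
    mx = sum(x for x, _ in points) / n
    mz = sum(z for _, z in points) / n
    slope = sum((x - mx) * (z - mz) for x, z in points) / \
            sum((x - mx) ** 2 for x, _ in points)
    angle_deg = math.degrees(math.atan(slope))
    print(f"angle of repose ~ {angle_deg:.1f} degrees")
    ```

    Repeating this fit over many stochastically chosen sections yields the sample of left/right angles whose scatter the study analyzes.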

  14. Pattern similarity study of functional sites in protein sequences: lysozymes and cystatins

    PubMed Central

    Nakai, Shuryo; Li-Chan, Eunice CY; Dou, Jinglie

    2005-01-01

    Background Although it is generally agreed that topography is more conserved than sequences, proteins sharing the same fold can have different functions, while there are protein families with low sequence similarity. An alternative method for profile analysis of characteristic conserved positions of the motifs within the 3D structures may be needed for functional annotation of protein sequences. Using the approach of quantitative structure-activity relationships (QSAR), we have proposed a new algorithm for postulating functional mechanisms on the basis of pattern similarity and average of property values of side-chains in segments within sequences. This approach was used to search for functional sites of proteins belonging to the lysozyme and cystatin families. Results Hydrophobicity and β-turn propensity of reference segments with 3–7 residues were used for the homology similarity search (HSS) for active sites. Hydrogen bonding was used as the side-chain property for searching the binding sites of lysozymes. The profiles of similarity constants and average values of these parameters as functions of their positions in the sequences could identify both active and substrate binding sites of the lysozyme of Streptomyces coelicolor, which has been reported as a new fold enzyme (Cellosyl). The same approach was successfully applied to cystatins, especially for postulating the mechanisms of amyloidosis of human cystatin C as well as human lysozyme. Conclusion Pattern similarity and average index values of structure-related properties of side chains in short segments of three residues or longer were, for the first time, successfully applied for predicting functional sites in sequences. This new approach may be applicable to studying functional sites in un-annotated proteins, for which complete 3D structures are not yet available. PMID:15904486

  15. Quantitative characterization of nanoscale polycrystalline magnets with electron magnetic circular dichroism.

    PubMed

    Muto, Shunsuke; Rusz, Ján; Tatsumi, Kazuyoshi; Adam, Roman; Arai, Shigeo; Kocevski, Vancho; Oppeneer, Peter M; Bürgler, Daniel E; Schneider, Claus M

    2014-01-01

    Electron magnetic circular dichroism (EMCD) allows the quantitative, element-selective determination of spin and orbital magnetic moments, similar to its well-established X-ray counterpart, X-ray magnetic circular dichroism (XMCD). As an advantage over XMCD, EMCD measurements are made using transmission electron microscopes, which are routinely operated at sub-nanometre resolution, thereby potentially allowing nanometre magnetic characterization. However, because of the low intensity of the EMCD signal, it has not yet been possible to obtain quantitative information from EMCD signals at the nanoscale. Here we demonstrate a new approach to EMCD measurements that considerably extends the reach of the technique. The statistical analysis introduced here yields robust quantitative EMCD signals. Moreover, we demonstrate that quantitative magnetic information can be routinely obtained using electron beams of only a few nanometres in diameter without imposing any restriction regarding the crystalline order of the specimen.

  16. Approach for Text Classification Based on the Similarity Measurement between Normal Cloud Models

    PubMed Central

    Dai, Jin; Liu, Xin

    2014-01-01

    The similarity between objects is a core research area of data mining. In order to reduce the interference caused by the uncertainty of natural language, a similarity measurement between normal cloud models is applied to text classification research. On this basis, a novel text classifier based on cloud concept jumping up (CCJU-TC) is proposed, which can efficiently accomplish the conversion between qualitative concepts and quantitative data. Through the conversion from a text set to a text information table based on the VSM model, the qualitative concepts extracted from texts of the same category are jumped up into a single whole-category concept. According to the cloud similarity between a test text and each category concept, the test text is assigned to the most similar category. Comparisons with other classifiers across different feature-selection sets show that CCJU-TC not only adapts well to different text features but also outperforms traditional classifiers. PMID:24711737
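    As a rough sketch of the machinery involved: the (Ex, En, He) parametrization below is the standard normal cloud model, but the similarity formula is a simplified illustration invented for this sketch, not the measurement used in the paper.

```python
import math
import random

# A normal cloud model is summarized by expectation Ex, entropy En, and
# hyper-entropy He. The forward cloud generator draws "cloud drops" from a
# Gaussian whose own standard deviation is itself Gaussian-distributed.

def cloud_drops(ex, en, he, n, seed=0):
    rng = random.Random(seed)
    return [rng.gauss(ex, abs(rng.gauss(en, he))) for _ in range(n)]

def cloud_similarity(c1, c2):
    """Toy similarity between two clouds (Ex, En, He): Gaussian overlap of the
    expectations, scaled by the combined entropies. Illustrative only."""
    ex1, en1, _ = c1
    ex2, en2, _ = c2
    spread = (en1 + en2) or 1e-12
    return math.exp(-((ex1 - ex2) ** 2) / (2.0 * spread ** 2))

category = (0.60, 0.10, 0.01)   # hypothetical category-level concept
text_a = (0.58, 0.12, 0.01)     # close to the category -> high similarity
text_b = (0.10, 0.12, 0.01)     # far from the category -> low similarity
```

    In a classifier of this general shape, a test text would be assigned to whichever category concept yields the highest cloud similarity.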

  17. PREDICTING TOXICOLOGICAL ENDPOINTS OF CHEMICALS USING QUANTITATIVE STRUCTURE-ACTIVITY RELATIONSHIPS (QSARS)

    EPA Science Inventory

    Quantitative structure-activity relationships (QSARs) are being developed to predict the toxicological endpoints for untested chemicals similar in structure to chemicals that have known experimental toxicological data. Based on a very large number of predetermined descriptors, a...

  18. Improved salvage of complicated microvascular transplants monitored with quantitative fluorometry.

    PubMed

    Whitney, T M; Lineaweaver, W C; Billys, J B; Siko, P P; Buncke, G M; Alpert, B S; Oliva, A; Buncke, H J

    1992-07-01

    Quantitative fluorometry has been used to monitor circulation in transplanted toes and cutaneous flaps in our unit since 1982. Analysis of 177 uncomplicated transplants monitored by quantitative fluorometry shows that this technique has low false indication rates for arterial occlusion (0.6 percent of patients) and venous occlusion (6.2 percent of patients). None of these patients was reexplored because of a false monitor reading, and except for single abnormal sequences, monitoring appropriately indicated intact circulation throughout the postoperative period. Quantitative fluorometry has correctly indicated vascular complications in 21 (91.3 percent) of 23 transplants over an 8-year period. The salvage rate (85.7 percent) of the fluorescein-monitored reexplored transplants was significantly higher than the salvage rates of similar reexplored transplants not monitored with fluorescein and of reexplored muscle flaps (which cannot be monitored with the fluorometer used at this unit). These clinical data indicate that quantitative fluorometry is a valid and useful postoperative monitor for transplanted toes and cutaneous flaps.

  19. Accuracy improvement of quantitative analysis by spatial confinement in laser-induced breakdown spectroscopy.

    PubMed

    Guo, L B; Hao, Z Q; Shen, M; Xiong, W; He, X N; Xie, Z Q; Gao, M; Li, X Y; Zeng, X Y; Lu, Y F

    2013-07-29

    To improve the accuracy of quantitative analysis in laser-induced breakdown spectroscopy (LIBS), the plasma produced by a Nd:YAG laser from steel targets was confined by a cavity. A number of elements with low concentrations, such as vanadium (V), chromium (Cr), and manganese (Mn), in the steel samples were investigated. After the optimization of the cavity dimension and laser fluence, significant enhancement factors of 4.2, 3.1, and 2.87 in the emission intensity of V, Cr, and Mn lines, respectively, were achieved at a laser fluence of 42.9 J/cm² using a hemispherical cavity (diameter: 5 mm). More importantly, the correlation coefficient of the calibration curve for the V I 440.85 nm/Fe I 438.35 nm line pair was increased from 0.946 (without the cavity) to 0.981 (with the cavity); similar results were also obtained for Cr I 425.43 nm/Fe I 425.08 nm and Mn I 476.64 nm/Fe I 492.05 nm. Therefore, it was demonstrated that the accuracy of quantitative analysis of low-concentration elements in steel samples was improved, because the plasma became uniform under spatial confinement. The results of this study provide a new pathway for improving the accuracy of quantitative LIBS analysis.

  20. Quantitative coronary plaque analysis predicts high-risk plaque morphology on coronary computed tomography angiography: results from the ROMICAT II trial.

    PubMed

    Liu, Ting; Maurovich-Horvat, Pál; Mayrhofer, Thomas; Puchner, Stefan B; Lu, Michael T; Ghemigian, Khristine; Kitslaar, Pieter H; Broersen, Alexander; Pursnani, Amit; Hoffmann, Udo; Ferencik, Maros

    2018-02-01

    Semi-automated software can provide quantitative assessment of atherosclerotic plaques on coronary CT angiography (CTA). The relationship between established qualitative high-risk plaque features and quantitative plaque measurements has not been studied. We analyzed the association between quantitative plaque measurements and qualitative high-risk plaque features on coronary CTA. We included 260 patients with plaque who underwent coronary CTA in the Rule Out Myocardial Infarction/Ischemia Using Computer Assisted Tomography (ROMICAT) II trial. Quantitative plaque assessment and qualitative plaque characterization were performed on a per coronary segment basis. Quantitative coronary plaque measurements included plaque volume, plaque burden, remodeling index, and diameter stenosis. In qualitative analysis, high-risk plaque was present if positive remodeling, low CT attenuation plaque, napkin-ring sign or spotty calcium were detected. Univariable and multivariable logistic regression analyses were performed to assess the association between quantitative and qualitative high-risk plaque assessment. Among 888 segments with coronary plaque, high-risk plaque was present in 391 (44.0%) segments by qualitative analysis. In quantitative analysis, segments with high-risk plaque had higher total plaque volume, low CT attenuation plaque volume, plaque burden and remodeling index. Quantitatively assessed low CT attenuation plaque volume (odds ratio 1.12 per 1 mm³, 95% CI 1.04-1.21), positive remodeling (odds ratio 1.25 per 0.1, 95% CI 1.10-1.41) and plaque burden (odds ratio 1.53 per 0.1, 95% CI 1.08-2.16) were associated with high-risk plaque. Quantitative coronary plaque characteristics (low CT attenuation plaque volume, positive remodeling and plaque burden) measured by semi-automated software correlated with qualitative assessment of high-risk plaque features.

  1. Quantiprot - a Python package for quantitative analysis of protein sequences.

    PubMed

    Konopka, Bogumił M; Marciniak, Marta; Dyrka, Witold

    2017-07-17

    The field of protein sequence analysis is dominated by tools rooted in substitution matrices and alignments. A complementary approach is provided by methods of quantitative characterization. A major advantage of the approach is that quantitative properties define a multidimensional solution space, where sequences can be related to each other and differences can be meaningfully interpreted. Quantiprot is a software package in Python, which provides a simple and consistent interface to multiple methods for quantitative characterization of protein sequences. The package can be used to calculate dozens of characteristics directly from sequences or using physico-chemical properties of amino acids. Besides basic measures, Quantiprot performs quantitative analysis of recurrence and determinism in the sequence, calculates the distribution of n-grams, and computes the Zipf's law coefficient. We propose three main fields of application of the Quantiprot package. First, quantitative characteristics can be used in alignment-free similarity searches, and in clustering of large and/or divergent sequence sets. Second, a feature space defined by quantitative properties can be used in comparative studies of protein families and organisms. Third, the feature space can be used for evaluating generative models, where a large number of sequences generated by the model can be compared to actually observed sequences.
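    Two of the characteristics mentioned, n-gram distributions and the Zipf's law coefficient, can be computed from scratch in a few lines. This is an independent illustration, not the Quantiprot API; the example sequence is invented.

```python
import math
from collections import Counter

def ngram_counts(sequence, n):
    """Count overlapping n-grams in a protein sequence."""
    return Counter(sequence[i:i + n] for i in range(len(sequence) - n + 1))

def zipf_slope(counts):
    """Least-squares slope of log(frequency) vs log(rank); a Zipf-like
    distribution gives a slope near -1."""
    freqs = sorted(counts.values(), reverse=True)
    xs = [math.log(r) for r in range(1, len(freqs) + 1)]
    ys = [math.log(f) for f in freqs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

bigrams = ngram_counts("MKVLAAGMKVLA", 2)  # hypothetical sequence
```

    The point of such alignment-free features is that two sequences can be compared through these numbers without ever computing an alignment.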

  2. Quantitative genetics

    USDA-ARS?s Scientific Manuscript database

    The majority of economically important traits targeted for cotton improvement are quantitatively inherited. In this chapter, the current state of cotton quantitative genetics is described and separated into four components. These components include: 1) traditional quantitative inheritance analysis, ...

  3. Legionella in water samples: how can you interpret the results obtained by quantitative PCR?

    PubMed

    Ditommaso, Savina; Ricciardi, Elisa; Giacomuzzi, Monica; Arauco Rivera, Susan R; Zotti, Carla M

    2015-02-01

    Evaluation of the potential risk associated with Legionella has traditionally been determined from culture-based methods. Quantitative polymerase chain reaction (qPCR) is an alternative tool that offers rapid, sensitive and specific detection of Legionella in environmental water samples. In this study we compare the results obtained by conventional qPCR (iQ-Check™ Quanti Legionella spp.; Bio-Rad) and by culture method on artificial samples prepared in Page's saline by addition of Legionella pneumophila serogroup 1 (ATCC 33152), and we analyse the selective quantification of viable Legionella cells by the qPCR-PMA method. The amount of Legionella DNA (GU) determined by qPCR was 28-fold higher than the load detected by culture (CFU). By applying qPCR combined with PMA treatment, we obtained a reduction of 98.5% of the qPCR signal from dead cells. We observed a dissimilarity in the ability of PMA to suppress the PCR signal in samples with different amounts of bacteria: the effective elimination of detection signals by PMA depended on the concentration of GU, and increasing amounts of cells resulted in higher values of reduction. Using the results from this study we created an algorithm to facilitate the interpretation of viable cell level estimation with qPCR-PMA.
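    The arithmetic behind such an interpretation can be sketched as follows. This is illustrative only, not the paper's actual algorithm; the ~28-fold GU/CFU ratio is the study's empirical observation for these samples, not a universal constant, and the example measurements are invented.

```python
def viable_signal_fraction(gu_total, gu_pma):
    """Fraction of the total qPCR signal remaining after PMA treatment,
    i.e. the part attributable to membrane-intact (viable) cells."""
    return gu_pma / gu_total

def cfu_equivalent(gu, gu_per_cfu=28.0):
    """Rough CFU-scale estimate from genomic units, assuming the ~28-fold
    GU/CFU ratio observed in this study."""
    return gu / gu_per_cfu

# Hypothetical sample: 10,000 GU measured without PMA, 1,500 GU with PMA.
fraction = viable_signal_fraction(10_000, 1_500)   # 0.15 of signal from viable cells
estimate = cfu_equivalent(1_500)                   # ~54 CFU-equivalents
```

    The paper's algorithm additionally accounts for the observation that PMA suppression itself varies with the GU concentration, which a fixed ratio like this cannot capture.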

  4. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    PubMed

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1].
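    The positive semidefinite requirement can be made concrete with a small sketch: a linear kernel on per-subject genotype vectors yields a Gram matrix K = XXᵀ, which satisfies vᵀKv ≥ 0 for every v. The genotype dosages below are invented.

```python
import random

def linear_kernel(x, y):
    """Similarity between two subjects: inner product of genotype vectors."""
    return sum(a * b for a, b in zip(x, y))

def gram_matrix(subjects, kernel):
    return [[kernel(a, b) for b in subjects] for a in subjects]

def quadratic_form(K, v):
    n = len(v)
    return sum(v[i] * K[i][j] * v[j] for i in range(n) for j in range(n))

# Invented genotype dosages (0/1/2 minor-allele counts) for three subjects.
subjects = [[0, 1, 2, 1], [2, 1, 0, 0], [1, 1, 1, 2]]
K = gram_matrix(subjects, linear_kernel)

# Spot-check positive semidefiniteness: v^T K v >= 0 for random v.
rng = random.Random(0)
checks = [quadratic_form(K, [rng.uniform(-1, 1) for _ in K]) for _ in range(200)]
```

    Any kernel constructed this way (as an inner product in some feature space) is positive semidefinite by construction, which is what licenses its use in mixed models, score statistics, and support vector machines.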

  5. Molecule kernels: a descriptor- and alignment-free quantitative structure-activity relationship approach.

    PubMed

    Mohr, Johannes A; Jain, Brijnesh J; Obermayer, Klaus

    2008-09-01

    Quantitative structure-activity relationship (QSAR) analysis is traditionally based on extracting a set of molecular descriptors and using them to build a predictive model. In this work, we propose a QSAR approach based directly on the similarity between the 3D structures of a set of molecules measured by a so-called molecule kernel, which is independent of the spatial prealignment of the compounds. Predictors can be built using the molecule kernel in conjunction with the potential support vector machine (P-SVM), a recently proposed machine learning method for dyadic data. The resulting models make direct use of the structural similarities between the compounds in the test set and a subset of the training set and do not require explicit descriptor construction. We evaluated the predictive performance of the proposed method on one classification and four regression QSAR datasets and compared its results to the results reported in the literature for several state-of-the-art descriptor-based and 3D QSAR approaches. In this comparison, the proposed molecule kernel method performed better than the other QSAR methods.

  6. Development of similarity theory for control systems

    NASA Astrophysics Data System (ADS)

    Myshlyaev, L. P.; Evtushenko, V. F.; Ivushkin, K. A.; Makarov, G. V.

    2018-05-01

    The area of effective application of the traditional similarity theory and the necessity of its development for control systems are discussed. The main statements underlying the similarity theory of control systems are given. The conditions for the similarity of control systems and the need for similarity control are formulated. Methods and algorithms for estimating and controlling the similarity of control systems, together with results from research on control systems based on their similarity, are presented. Similarity control comprises the ongoing evaluation of the degree of similarity between control systems, the generation of similarity-controlling actions, and the corresponding targeted change in the state of elements of the control systems.

  7. The relationship between quantitative measures of disc height and disc signal intensity with Pfirrmann score of disc degeneration.

    PubMed

    Salamat, Sara; Hutchings, John; Kwong, Clemens; Magnussen, John; Hancock, Mark J

    2016-01-01

    To assess the relationship between quantitative measures of disc height and signal intensity with the Pfirrmann disc degeneration scoring system and to test the inter-rater reliability of the quantitative measures. Participants were 76 people who had recently recovered from their last episode of acute low back pain and underwent MRI scan on a single 3T machine. At all 380 lumbar discs, quantitative measures of disc height and signal intensity were made by 2 independent raters and compared to Pfirrmann scores from a single radiologist. For quantitative measures of disc height and signal intensity a "raw" score and 2 adjusted ratios were calculated and the relationship with Pfirrmann scores was assessed. The inter-tester reliability of quantitative measures was also investigated. There was a strong linear relationship between quantitative disc signal intensity and Pfirrmann scores for grades 1-4, but not for grades 4 and 5. For disc height only, Pfirrmann grade 5 had significantly reduced disc height compared to all other grades. Results were similar regardless of whether raw or adjusted scores were used. Inter-rater reliability for the quantitative measures was excellent (ICC > 0.97). Quantitative measures of disc signal intensity were strongly related to Pfirrmann scores from grade 1 to 4; however disc height only differentiated between grade 4 and 5 Pfirrmann scores. Using adjusted ratios for quantitative measures of disc height or signal intensity did not significantly alter the relationship with Pfirrmann scores.

  8. Development of a relational database to capture and merge clinical history with the quantitative results of radionuclide renography.

    PubMed

    Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T

    2012-12-01

    Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). We have developed a relational database system to integrate the quantitative results of MAG3 image
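    A minimal sketch of the merge-and-export step described above, joining a quantitative-results table with transcribed clinical data on a shared patient key and emitting an XML summary. The patient IDs and field names are invented; the actual 342-field schema and the Interactive Data Language program are not reproduced.

```python
import xml.etree.ElementTree as ET

# Two toy tables keyed by patient ID: quantitative renography results (Q2)
# and transcribed clinical history (CLINICAL). Field names are hypothetical.
q2 = {"p001": {"relative_uptake_left": "54"},
      "p002": {"relative_uptake_left": "47"}}
clinical = {"p001": {"hydronephrosis": "yes", "ureteral_stent": "no"}}

root = ET.Element("patients")
for pid in sorted(q2.keys() & clinical.keys()):  # records present in both tables
    patient = ET.SubElement(root, "patient", id=pid)
    for table in (q2[pid], clinical[pid]):
        for field, value in table.items():
            ET.SubElement(patient, field).text = value

xml_summary = ET.tostring(root, encoding="unicode")
```

    As in the paper's design, only patients present in both tables contribute to the exported file; a parallel plain-text rendering of the same merged record would serve physician review.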

  9. Progress in Quantitative Viral Load Testing: Variability and Impact of the WHO Quantitative International Standards

    PubMed Central

    Sun, Y.; Tang, L.; Procop, G. W.; Hillyard, D. R.; Young, S. A.; Caliendo, A. M.

    2016-01-01

    It has been hoped that the recent availability of WHO quantitative standards would improve interlaboratory agreement for viral load testing; however, insufficient data are available to evaluate whether this has been the case. Results from 554 laboratories participating in proficiency testing surveys for quantitative PCR assays of cytomegalovirus (CMV), Epstein-Barr virus (EBV), BK virus (BKV), adenovirus (ADV), and human herpesvirus 6 (HHV6) were evaluated to determine overall result variability and then were stratified by assay manufacturer. The impact of calibration to international units/ml (CMV and EBV) on variability was also determined. Viral loads showed a high degree of interlaboratory variability for all tested viruses, with interquartile ranges as high as 1.46 log10 copies/ml and the overall range for a given sample up to 5.66 log10 copies/ml. Some improvement in result variability was seen when international units were adopted. This was particularly the case for EBV viral load results. Variability in viral load results remains a challenge across all viruses tested here; introduction of international quantitative standards may help reduce variability and does so more or less markedly for certain viruses. PMID:27852673

  10. Quantitative Muscle Ultrasonography in Carpal Tunnel Syndrome.

    PubMed

    Lee, Hyewon; Jee, Sungju; Park, Soo Ho; Ahn, Seung-Chan; Im, Juneho; Sohn, Min Kyun

    2016-12-01

    To assess the reliability of quantitative muscle ultrasonography (US) in healthy subjects and to evaluate the correlation between quantitative muscle US findings and electrodiagnostic study results in patients with carpal tunnel syndrome (CTS). The clinical significance of quantitative muscle US in CTS was also assessed. Twenty patients with CTS and 20 age-matched healthy volunteers were recruited. All control and CTS subjects underwent a bilateral median and ulnar nerve conduction study (NCS) and quantitative muscle US. Transverse US images of the abductor pollicis brevis (APB) and abductor digiti minimi (ADM) were obtained to measure muscle cross-sectional area (CSA), thickness, and echo intensity (EI). EI was determined using computer-assisted, grayscale analysis. Inter-rater and intra-rater reliability for quantitative muscle US in control subjects, and differences in muscle thickness, CSA, and EI between the CTS patient and control groups were analyzed. Relationships between quantitative US parameters and electrodiagnostic study results were evaluated. Quantitative muscle US had high inter-rater and intra-rater reliability in the control group. Muscle thickness and CSA were significantly decreased, and EI was significantly increased in the APB of the CTS group (all p<0.05). EI demonstrated a significant positive correlation with latency of the median motor and sensory NCS in CTS patients (p<0.05). These findings suggest that quantitative muscle US parameters may be useful for detecting muscle changes in CTS. Further study involving patients with other neuromuscular diseases is needed to evaluate peripheral muscle change using quantitative muscle US.

  11. Homophily of Vocabulary Usage: Beneficial Effects of Vocabulary Similarity on Online Health Communities Participation

    PubMed Central

    Park, Albert; Hartzler, Andrea L.; Huh, Jina; McDonald, David W.; Pratt, Wanda

    2015-01-01

    Online health communities provide popular platforms for individuals to exchange psychosocial support and form ties. Although regular active participation (i.e., posting to interact with other members) in online health communities can provide important benefits, sustained active participation remains challenging for these communities. Leveraging previous literature on homophily (i.e., “love of those who are like themselves”), we examined the relationship between vocabulary similarity (i.e., homophily of word usage) of thread posts and members’ future interaction in online health communities. We quantitatively measured vocabulary similarity by calculating, in a vector space model, cosine similarity between the original post and the first reply in 20,499 threads. Our findings across five online health communities suggest that vocabulary similarity is a significant predictor of members’ future interaction in online health communities. These findings carry practical implications for facilitating and sustaining online community participation through beneficial effects of homophily in the vocabulary of essential peer support. PMID:26958240
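    The vocabulary-similarity measure described, cosine similarity in a vector space model between an original post and its first reply, can be sketched in a few lines. Tokenization here is a naive whitespace split, and the example posts are invented.

```python
import math
from collections import Counter

def vocabulary_similarity(post, reply):
    """Cosine similarity between term-frequency vectors of two texts."""
    va = Counter(post.lower().split())
    vb = Counter(reply.lower().split())
    dot = sum(count * vb[term] for term, count in va.items())
    norm_a = math.sqrt(sum(c * c for c in va.values()))
    norm_b = math.sqrt(sum(c * c for c in vb.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

sim = vocabulary_similarity(
    "any advice on insulin dosing after exercise",
    "my insulin dosing advice is to test before exercise",
)
```

    In the study's setting, this score for each of the 20,499 threads becomes a predictor of whether the members interact again.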

  12. Econo-ESA in semantic text similarity.

    PubMed

    Rahutomo, Faisal; Aritsugi, Masayoshi

    2014-01-01

    Explicit semantic analysis (ESA) utilizes an immense Wikipedia index matrix in its interpreter part. This part of the analysis multiplies a large matrix by a term vector to produce a high-dimensional concept vector. A similarity measurement between two texts is then performed between two concept vectors with numerous dimensions, which makes both the interpretation and the similarity measurement steps computationally expensive. This paper proposes an economic scheme of ESA, named econo-ESA. We investigate two aspects of this proposal: dimensional reduction and experiments with various data. We use eight reused test collections in semantic text similarity. The experimental results show that both the dimensional reduction and the test collection characteristics can influence the results. They also show that an appropriate concept reduction in econo-ESA can decrease the cost with only minor differences from the results of the original ESA.
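    The interpreter and reduction steps can be sketched as follows. The tiny term-by-concept matrix is invented; real ESA uses a Wikipedia-scale index, and econo-ESA's actual reduction strategy may differ from the simple top-k truncation shown here.

```python
import math

def interpret(index, term_vector):
    """ESA interpreter: concept vector c_j = sum_i index[i][j] * t_i."""
    n_concepts = len(index[0])
    return [sum(index[i][j] * term_vector[i] for i in range(len(term_vector)))
            for j in range(n_concepts)]

def reduce_concepts(concept_vector, k):
    """econo-ESA-style reduction: keep only the k strongest concept dimensions."""
    keep = set(sorted(range(len(concept_vector)),
                      key=lambda j: abs(concept_vector[j]), reverse=True)[:k])
    return [v if j in keep else 0.0 for j, v in enumerate(concept_vector)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Invented 3-term x 4-concept index with tf-idf-like weights.
INDEX = [[0.9, 0.1, 0.0, 0.2],
         [0.0, 0.8, 0.3, 0.1],
         [0.4, 0.0, 0.7, 0.0]]

c1 = interpret(INDEX, [1, 0, 1])
c2 = interpret(INDEX, [1, 1, 0])
full = cosine(c1, c2)
reduced = cosine(reduce_concepts(c1, 2), reduce_concepts(c2, 2))
```

    Truncating concept dimensions shrinks both the matrix-vector product and the similarity computation, which is the cost saving the paper quantifies.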

  13. Species Determination and Quantitation in Mixtures Using MRM Mass Spectrometry of Peptides Applied to Meat Authentication

    PubMed Central

    Gunning, Yvonne; Watson, Andrew D.; Rigby, Neil M.; Philo, Mark; Peazer, Joshua K.; Kemsley, E. Kate

    2016-01-01

    We describe a simple protocol for identifying and quantifying the two components in binary mixtures of species possessing one or more similar proteins. Central to the method is the identification of 'corresponding proteins' in the species of interest, in other words proteins that are nominally the same but possess species-specific sequence differences. When subject to proteolysis, corresponding proteins will give rise to some peptides which are likewise similar but with species-specific variants. These are 'corresponding peptides'. Species-specific peptides can be used as markers for species determination, while pairs of corresponding peptides permit relative quantitation of two species in a mixture. The peptides are detected using multiple reaction monitoring (MRM) mass spectrometry, a highly specific technique that enables peptide-based species determination even in complex systems. In addition, the ratio of MRM peak areas deriving from corresponding peptides supports relative quantitation. Since corresponding proteins and peptides will, in the main, behave similarly in both processing and in experimental extraction and sample preparation, the relative quantitation should remain comparatively robust. In addition, this approach does not need the standards and calibrations required by absolute quantitation methods. The protocol is described in the context of red meats, which have convenient corresponding proteins in the form of their respective myoglobins. This application is relevant to food fraud detection: the method can detect 1% weight for weight of horse meat in beef. The corresponding protein, corresponding peptide (CPCP) relative quantitation using MRM peak area ratios gives good estimates of the weight for weight composition of a horse plus beef mixture. PMID:27685654
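    The relative quantitation step reduces to simple arithmetic on the peak areas of a corresponding peptide pair. The numbers below are invented, and the optional response-ratio correction is an assumption: the abstract's argument is precisely that corresponding peptides behave similarly enough for that ratio to sit near 1.

```python
def horse_fraction(area_horse_peptide, area_beef_peptide, response_ratio=1.0):
    """Weight-for-weight horse fraction estimated from MRM peak areas of a
    corresponding myoglobin peptide pair. response_ratio would correct for any
    residual difference in detector response (assumed ~1 for CPCP pairs)."""
    horse = area_horse_peptide / response_ratio
    return horse / (horse + area_beef_peptide)

# Invented peak areas roughly consistent with a 1% w/w horse-in-beef mix.
estimate = horse_fraction(1.0e4, 9.9e5)
```

    Because the estimate is a ratio of like quantities, it needs no external calibration standards, which is the practical advantage the protocol claims over absolute quantitation.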

  15. Quantitating Antibody Uptake In Vivo: Conditional Dependence on Antigen Expression Levels

    PubMed Central

    Thurber, Greg M.; Weissleder, Ralph

    2010-01-01

    Purpose Antibodies form an important class of cancer therapeutics, and there is intense interest in using them for imaging applications in diagnosis and monitoring of cancer treatment. Despite the expanding body of knowledge describing pharmacokinetic and pharmacodynamic interactions of antibodies in vivo, discrepancies remain over the effect of antigen expression level on tumoral uptake with some reports indicating a relationship between uptake and expression and others showing no correlation. Procedures Using a cell line with high EpCAM expression and moderate EGFR expression, fluorescent antibodies with similar plasma clearance were imaged in vivo. A mathematical model and mouse xenograft experiments were used to describe the effect of antigen expression on uptake of these high affinity antibodies. Results As predicted by the theoretical model, under subsaturating conditions, uptake of the antibodies in such tumors is similar because localization of both probes is limited by delivery from the vasculature. In a separate experiment, when the tumor is saturated, the uptake becomes dependent on the number of available binding sites. In addition, targeting of small micrometastases is shown to be higher than larger vascularized tumors. Conclusions These results are consistent with the prediction that high affinity antibody uptake is dependent on antigen expression levels for saturating doses and delivery for subsaturating doses. It is imperative for any probe to understand whether quantitative uptake is a measure of biomarker expression or transport to the region of interest. The data provide support for a predictive theoretical model of antibody uptake, enabling it to be used as a starting point for the design of more efficacious therapies and timely quantitative imaging probes. PMID:20809210
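    The conditional dependence can be captured in a deliberately minimal two-regime model (parameter values invented): below saturation, uptake is delivery-limited and insensitive to antigen level; at saturating doses, it is capped by the available binding sites.

```python
def tumor_uptake(dose, delivery_fraction=0.05, binding_sites=10.0):
    """Toy model of antibody retained in a tumor (arbitrary units):
    dose * delivery_fraction = what the vasculature delivers;
    binding_sites = the antigen capacity that can retain it."""
    delivered = delivery_fraction * dose
    return min(delivered, binding_sites)

# Subsaturating dose: uptake is the same regardless of antigen expression.
low_a = tumor_uptake(100, binding_sites=10.0)
low_b = tumor_uptake(100, binding_sites=100.0)

# Saturating dose: uptake now tracks the number of binding sites.
high_a = tumor_uptake(10_000, binding_sites=10.0)
high_b = tumor_uptake(10_000, binding_sites=100.0)
```

    This is the qualitative behavior the imaging data support: a probe only reports biomarker expression when dosed into the saturating regime; otherwise it reports transport.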

  16. Similar call signs

    DOT National Transportation Integrated Search

    2010-08-18

    This presentation was given at the Partnership for Safety Meeting in Washington, DC. It examines the issues that arise when call signs are visually similar or similar sounding. Visually similar call signs increase the chances of controller...

  17. Mathematical modelling and quantitative methods.

    PubMed

    Edler, L; Poirier, K; Dourson, M; Kleiner, J; Mileson, B; Nordmann, H; Renwick, A; Slob, W; Walton, K; Würtzen, G

    2002-01-01

    The present review reports on the mathematical methods and statistical techniques presently available for hazard characterisation. The state of the art of mathematical modelling and quantitative methods used currently for regulatory decision-making in Europe and additional potential methods for risk assessment of chemicals in food and diet are described. Existing practices of JECFA, FDA, EPA, etc., are examined for their similarities and differences. A framework is established for the development of new and improved quantitative methodologies. Areas for refinement, improvement and increase of efficiency of each method are identified in a gap analysis. Based on this critical evaluation, needs for future research are defined. It is concluded from our work that mathematical modelling of the dose-response relationship would improve the risk assessment process. An adequate characterisation of the dose-response relationship by mathematical modelling clearly requires the use of a sufficient number of dose groups to achieve a range of different response levels. This need not necessarily lead to an increase in the total number of animals in the study if an appropriate design is used. Chemical-specific data relating to the mode or mechanism of action and/or the toxicokinetics of the chemical should be used for dose-response characterisation whenever possible. It is concluded that a single method of hazard characterisation would not be suitable for all kinds of risk assessments, and that a range of different approaches is necessary so that the method used is the most appropriate for the data available and for the risk characterisation issue. Future refinements to dose-response characterisation should incorporate more clearly the extent of uncertainty and variability in the resulting output.
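    As a concrete instance of the dose-response modelling the review advocates, a Hill model can be inverted analytically to obtain a benchmark dose (BMD) at a chosen benchmark response (BMR). The parameters below are hypothetical; a real assessment would fit them to study data and propagate the uncertainty the review calls for.

```python
def hill(dose, top, ed50, n):
    """Hill dose-response: response rises from 0 toward `top`,
    reaching top/2 at dose `ed50`, with slope parameter `n`."""
    return top * dose ** n / (ed50 ** n + dose ** n)

def benchmark_dose(bmr, top, ed50, n):
    """Dose at which hill(...) equals bmr (requires 0 < bmr < top)."""
    return ed50 * (bmr / (top - bmr)) ** (1.0 / n)

# Hypothetical parameters: max response 1.0, ED50 of 10 dose units, slope 2.
bmd_10 = benchmark_dose(0.10, top=1.0, ed50=10.0, n=2.0)
```

    The review's point about study design follows directly from such a model: estimating top, ed50, and n reliably requires dose groups spanning a range of response levels, not just a no-effect and a high-effect group.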

  18. Quantitative analysis of bristle number in Drosophila mutants identifies genes involved in neural development

    NASA Technical Reports Server (NTRS)

    Norga, Koenraad K.; Gurganus, Marjorie C.; Dilda, Christy L.; Yamamoto, Akihiko; Lyman, Richard F.; Patel, Prajal H.; Rubin, Gerald M.; Hoskins, Roger A.; Mackay, Trudy F.; Bellen, Hugo J.

    2003-01-01

    BACKGROUND: The identification of the function of all genes that contribute to specific biological processes and complex traits is one of the major challenges in the postgenomic era. One approach is to employ forward genetic screens in genetically tractable model organisms. In Drosophila melanogaster, P element-mediated insertional mutagenesis is a versatile tool for the dissection of molecular pathways, and there is an ongoing effort to tag every gene with a P element insertion. However, the vast majority of P element insertion lines are viable and fertile as homozygotes and do not exhibit obvious phenotypic defects, perhaps because of the tendency for P elements to insert 5' of transcription units. Quantitative genetic analysis of subtle effects of P element mutations that have been induced in an isogenic background may be a highly efficient method for functional genome annotation. RESULTS: Here, we have tested the efficacy of this strategy by assessing the extent to which screening for quantitative effects of P elements on sensory bristle number can identify genes affecting neural development. We find that such quantitative screens uncover an unusually large number of genes that are known to function in neural development, as well as genes with yet uncharacterized effects on neural development, and novel loci. CONCLUSIONS: Our findings establish the use of quantitative trait analysis for functional genome annotation through forward genetics. Similar analyses of quantitative effects of P element insertions will facilitate our understanding of the genes affecting many other complex traits in Drosophila.

  19. Quantitative Analysis in the General Chemistry Laboratory: Training Students to Analyze Individual Results in the Context of Collective Data

    ERIC Educational Resources Information Center

    Ling, Chris D.; Bridgeman, Adam J.

    2011-01-01

    Titration experiments are ideal for generating large data sets for use in quantitative-analysis activities that are meaningful and transparent to general chemistry students. We report the successful implementation of a sophisticated quantitative exercise in which the students identify a series of unknown acids by determining their molar masses…

  20. Some Effects of Similarity Self-Disclosure

    ERIC Educational Resources Information Center

    Murphy, Kevin C.; Strong, Stanley R.

    1972-01-01

    College males were interviewed about how college had altered their friendships, values, and plans. The interviewers disclosed experiences and feelings similar to those revealed by the students. Results support Byrne's Law of Similarity in generating interpersonal attraction in the interview and suggest that the timing of self-disclosures is…

  1. Methods for quantitative and qualitative evaluation of vaginal microflora during menstruation.

    PubMed Central

    Onderdonk, A B; Zamarchi, G R; Walsh, J A; Mellor, R D; Muñoz, A; Kass, E H

    1986-01-01

    The quantitative and qualitative changes in the bacterial flora of the vagina during menstruation have received inadequate study. Similarly, the effect of vaginal tampons on the microbial flora as well as the relationship between the microbial flora of the vagina and that of the tampon has not been adequately evaluated. The purposes of the present study were (i) to develop quantitative methods for studying the vaginal flora and the flora of tampons obtained during menstruation and (ii) to determine whether there were differences between the microflora of the tampon and that of the vaginal vault. Tampon and swab samples were obtained at various times from eight young healthy volunteers for 8 to 10 menstrual cycles. Samples consisted of swabs from women wearing menstrual pads compared with swab and tampon samples taken at various times during the menstrual cycle. Samples were analyzed for total facultative and anaerobic bacterial counts, and the six dominant bacterial species in each culture were identified. Statistical evaluation of the results indicates that total bacterial counts decreased during menstruation and that swab and tampon samples yielded similar total counts per unit weight of sample. The numbers of bacteria in tampons tended to be lower than in swabs taken at the same time. Overall, during menstruation, the concentrations of lactobacilli declined, but otherwise there was little difference among the species found during menstruation compared with those found in intermenstrual samples. Cotton tampons had little discernible effect on the microbial flora. PMID:3954346

  2. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    associated with increased risk of breast cancer for both operators (OR = 4.0, 95% confidence interval (CI) 1.6-10.1, and 2.4, 95% CI 1.2-4.7). Quantitative measurement of BPU, defined as the ratio of average counts in fibroglandular tissue relative to that in fat, can be reliably performed by nonradiologist operators with a simple region-of-interest analysis tool. Similar to results obtained with subjective BPU categories, quantitative BPU is a functional imaging biomarker of breast cancer risk, independent of mammographic density and hormonal factors.

  3. QDIRT: Quantitative Direct and Indirect Testing of Sudomotor Function

    PubMed Central

    Gibbons, Christopher H.; Illigens, Ben MW; Centi, Justin; Freeman, Roy

    2011-01-01

    Objective To develop a novel assessment of sudomotor function. Background Post-ganglionic sudomotor function is currently evaluated using quantitative sudomotor axon reflex testing (QSART) or silicone impressions. We hypothesize that high-resolution digital photography has advanced sufficiently to allow quantitative direct and indirect testing of sudomotor function (QDIRT) with spatial and temporal resolution comparable to these techniques. Methods Sweating in 10 humans was stimulated on both forearms by iontophoresis of 10% acetylcholine. Silicone impressions were made and topical indicator dyes were digitally photographed every 15 seconds for 7 minutes after iontophoresis. Sweat droplets were quantified by size, location and percent surface area. Each test was repeated 8 times in each subject on alternating arms over 2 months. Another 10 subjects had silicone impressions, QDIRT and QSART performed on the dorsum of the right foot. Results The percent area of sweat photographically imaged correlated with silicone impressions at 5 minutes on the forearm (r = 0.92, p<0.01) and dorsal foot (r=0.85, p<0.01). The number of sweat droplets assessed with QDIRT correlated with the silicone impression although the droplet number was lower (162±28 vs. 341±56, p<0.01; r=0.83, p<0.01). QDIRT and QSART sudomotor assessments measured at the dorsum of the foot correlated for both sweat response (r=0.63, p<0.05) and sweat onset latency (r=0.52, p<0.05). Conclusions QDIRT measured both the direct and indirect sudomotor response with spatial resolution similar to silicone impressions, and with temporal resolution that is similar to QSART. QDIRT provides a novel tool for the evaluation of post-ganglionic sudomotor function. PMID:18541883

  4. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different techniques for calibrating AVIRIS data to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-based calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.

  5. Counterfactual Plausibility and Comparative Similarity.

    PubMed

    Stanley, Matthew L; Stewart, Gregory W; Brigard, Felipe De

    2017-05-01

    Counterfactual thinking involves imagining hypothetical alternatives to reality. Philosopher David Lewis (1973, 1979) argued that people estimate the subjective plausibility that a counterfactual event might have occurred by comparing an imagined possible world in which the counterfactual statement is true against the current, actual world in which the counterfactual statement is false. Accordingly, counterfactuals considered to be true in possible worlds comparatively more similar to ours are judged as more plausible than counterfactuals deemed true in possible worlds comparatively less similar. Although Lewis did not originally develop his notion of comparative similarity to be investigated as a psychological construct, this study builds upon his idea to empirically investigate comparative similarity as a possible psychological strategy for evaluating the perceived plausibility of counterfactual events. More specifically, we evaluate judgments of comparative similarity between episodic memories and episodic counterfactual events as a factor influencing people's judgments of plausibility in counterfactual simulations, and we also compare it against other factors thought to influence judgments of counterfactual plausibility, such as ease of simulation and prior simulation. Our results suggest that the greater the perceived similarity between the original memory and the episodic counterfactual event, the greater the perceived plausibility that the counterfactual event might have occurred. While similarity between actual and counterfactual events, ease of imagining, and prior simulation of the counterfactual event were all significantly related to counterfactual plausibility, comparative similarity best captured the variance in ratings of counterfactual plausibility. Implications for existing theories on the determinants of counterfactual plausibility are discussed. Copyright © 2016 Cognitive Science Society, Inc.

  6. Study on the Multi-marker Components Quantitative HPLC Fingerprint of the Compound Chinese Medicine Wuwei Changyanning Granule

    PubMed Central

    Yang, Xian; Yang, Shui-Ping; Zhang, Xue; Yu, Xiao-Dong; He, Qi-Yi; Wang, Bo-Chu

    2014-01-01

    The aim of this paper is to develop a rapid and highly sensitive quantitative HPLC fingerprint method with multiple indicator components for the Compound Chinese Medicine Wuwei Changyanning granule and the 5 herbs in its prescription. The quantitative fingerprint chromatogram with multiple indicators was investigated. (i) Six constituents (rutin, gallic acid, chlorogenic acid, atractylenolide I, pachymic acid and apigenin), each originating from one of the 5 herbs, were selected as quantitative markers, and their contents were determined by HPLC in 11 batches of granules and the corresponding 5 medicinal materials. (ii) The precision, stability and repeatability of the fingerprinting were investigated. In addition, the number of common peaks, the percentage of non-common peaks and the similarity were also studied. Of the 21 common peaks in the granule that could be traced to the 5 herbs, 10 came from Niuerfeng, 9 from Laliao, 3 from Baishu, 3 from Fuling and 5 from Guanghuoxiang. The results showed that the fingerprint identification method was reliable. PMID:25587307

  7. Proteins with similar architecture exhibit similar large-scale dynamic behavior.

    PubMed Central

    Keskin, O; Jernigan, R L; Bahar, I

    2000-01-01

    We have investigated the similarities and differences in the computed dynamic fluctuations exhibited by six members of a protein fold family with a coarse-grained Gaussian network model. Specifically, we consider the cofactor binding fragment of CysB; the lysine/arginine/ornithine-binding protein (LAO); the enzyme porphobilinogen deaminase (PBGD); the ribose-binding protein (RBP); the N-terminal lobe of ovotransferrin in apo-form (apo-OVOT); and the leucine/isoleucine/valine-binding protein (LIVBP). All have domains that resemble a Rossmann fold, but there are also some significant differences. Results indicate that similar global dynamic behavior is preserved for the members of a fold family, and that differences usually occur in regions only where specific function is localized. The present work is a computational demonstration that the scaffold of a protein fold may be utilized for diverse purposes. LAO requires a bound ligand before it conforms to the large-scale fluctuation behavior of the three other members of the family, CysB, PBGD, and RBP, all of which contain a substrate (cofactor) at the active site cleft. The dynamics of the ligand-free enzymes LIVBP and apo-OVOT, on the other hand, concur with that of unliganded LAO. The present results suggest that it is possible to construct structure alignments based on dynamic fluctuation behavior. PMID:10733987

  8. The Next Frontier: Quantitative Biochemistry in Living Cells.

    PubMed

    Honigmann, Alf; Nadler, André

    2018-01-09

    Researchers striving to convert biology into an exact science foremost rely on structural biology and biochemical reconstitution approaches to obtain quantitative data. However, cell biological research is moving at an ever-accelerating speed into areas where these approaches lose much of their edge. Intrinsically unstructured proteins and biochemical interaction networks composed of interchangeable, multivalent, and unspecific interactions pose unique challenges to quantitative biology, as do processes that occur in discrete cellular microenvironments. Here we argue that a conceptual change in our way of conducting biochemical experiments is required to take on these new challenges. We propose that reconstitution of cellular processes in vitro should be much more focused on mimicking the cellular environment in vivo, an approach that requires detailed knowledge of the material properties of cellular compartments, essentially requiring a material science of the cell. In a similar vein, we suggest that quantitative biochemical experiments in vitro should be accompanied by corresponding experiments in vivo, as many newly relevant cellular processes are highly context-dependent. In essence, this constitutes a call for chemical biologists to convert their discipline from a proof-of-principle science to an area that could rightfully be called quantitative biochemistry in living cells. In this essay, we discuss novel techniques and experimental strategies with regard to their potential to fulfill such ambitious aims.

  9. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
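    The sub-model idea, training regressions on limited composition ranges and blending their predictions into a single result, can be sketched as below. This toy version substitutes ordinary least squares for the PLS regression used by the authors, uses synthetic data in place of LIBS spectra, and invents a simple linear ramp as the blending weight; it illustrates the structure of the method, not the ChemCam calibration itself.

```python
# Toy "sub-model" regression: fit low- and high-range models, then blend
# their predictions, weighted by a full-range reference model.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))        # 200 synthetic "spectra", 10 channels
w_true = rng.normal(size=10)
y = X @ w_true + 50.0                 # synthetic element concentration

def fit(X, y):
    """Least-squares regression with intercept (stand-in for PLS)."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.column_stack([X, np.ones(len(X))]) @ coef

full = fit(X, y)                       # full-range reference model
low = fit(X[y < 50], y[y < 50])        # sub-model: low-concentration targets
high = fit(X[y >= 50], y[y >= 50])     # sub-model: high-concentration targets

X_new = rng.normal(size=(5, 10))
ref = predict(full, X_new)             # reference prediction picks the blend
# Invented blending weight: linear ramp over the reference prediction.
w_high = np.clip((ref - 40.0) / 20.0, 0.0, 1.0)
blended = (1 - w_high) * predict(low, X_new) + w_high * predict(high, X_new)
print(blended)
```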

  10. Bench press and push-up at comparable levels of muscle activity results in similar strength gains.

    PubMed

    Calatayud, Joaquin; Borreani, Sebastien; Colado, Juan C; Martin, Fernando; Tella, Victor; Andersen, Lars L

    2015-01-01

    Electromyography (EMG) exercise evaluation is commonly used to measure the intensity of muscle contraction. Although researchers assume that biomechanically comparable resistance exercises with similar high EMG levels will produce similar strength gains over the long term, no studies have actually corroborated this hypothesis. This study evaluated EMG levels during 6 repetition maximum (6RM) bench press and push-up, and subsequently performed a 5-week training period where subjects were randomly divided into 3 groups (i.e., 6RM bench press group, 6RM elastic band push-up group, or control group) to evaluate muscle strength gains. Thirty university students with advanced resistance training experience participated in the 2-part study. During the training period, exercises were performed using the same loads and variables that were used during the EMG data collection. At baseline, EMG amplitude showed no significant difference between 6RM bench press and band push-up. Significant differences among the groups were found for percent change (Δ) between pretest and posttest for 6RM (p = 0.017) and for 1 repetition maximum (1RM) (p < 0.001). The 6RM bench press group and the 6RM elastic band push-up group improved their 1RM and 6RM (Δ ranging from 13.65 to 22.21) tests significantly with similar gains, whereas the control group remained unchanged. Thus, when the EMG values are comparable and the same conditions are reproduced, the aforementioned exercises can provide similar muscle strength gains.

  11. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  12. Benefit-risk analysis: a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
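    The RV-NNT decision rule described above reduces to simple arithmetic: treat if the number needed to treat is smaller than the relative-value adjusted number needed to harm. The sketch below uses invented response rates, adverse event rates, and a hypothetical relative value; it illustrates the general idea rather than Holden's exact formulation.

```python
# Hedged arithmetic sketch of an RV-NNT style comparison.
def nnt(benefit_rate_drug, benefit_rate_control):
    """Number needed to treat for one additional responder."""
    return 1.0 / (benefit_rate_drug - benefit_rate_control)

def rv_nnh(harm_rate_drug, harm_rate_control, relative_value):
    """Number needed to harm, adjusted by the relative value patients
    place on avoiding the adverse event versus gaining the benefit."""
    return relative_value / (harm_rate_drug - harm_rate_control)

NNT = nnt(0.40, 0.20)             # 1 / 0.20 = 5 patients
RV_NNH = rv_nnh(0.08, 0.03, 0.5)  # 0.5 / 0.05 = 10 patients
print(NNT < RV_NNH)               # True: benefit-risk favours treatment
```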

  13. Identification among morphologically similar Argyreia (Convolvulaceae) based on leaf anatomy and phenetic analyses.

    PubMed

    Traiperm, Paweena; Chow, Janene; Nopun, Possathorn; Staples, G; Swangpol, Sasivimon C

    2017-12-01

    The genus Argyreia Lour. is one of the species-rich Asian genera in the family Convolvulaceae. Several species complexes were recognized in which taxon delimitation was imprecise, especially when examining herbarium materials without fully developed open flowers. The main goal of this study is to investigate and describe leaf anatomy for some morphologically similar Argyreia using epidermal peeling, leaf and petiole transverse sections, and scanning electron microscopy. Phenetic analyses including cluster analysis and principal component analysis were used to investigate the similarity of these morpho-types. Anatomical differences observed between the morpho-types include epidermal cell walls and the trichome types on the leaf epidermis. Additional differences in the leaf and petiole transverse sections include the epidermal cell shape of the adaxial leaf blade, the leaf margins, and the petiole transverse sectional outline. The phenogram from cluster analysis using the UPGMA method represented four groups with an R value of 0.87. Moreover, the important quantitative and qualitative leaf anatomical traits of the four groups were confirmed by the principal component analysis of the first two components. The results from phenetic analyses confirmed the anatomical differentiation between the morpho-types. Leaf anatomical features regarded as particularly informative for morpho-type differentiation can be used to supplement macro morphological identification.

  14. Quantitative proteomics in biological research.

    PubMed

    Wilm, Matthias

    2009-10-01

    Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.

  15. Interdependency in Multimodel Climate Projections: Component Replication and Result Similarity

    NASA Astrophysics Data System (ADS)

    Boé, Julien

    2018-03-01

    Multimodel ensembles are the main way to deal with model uncertainties in climate projections. However, the interdependencies between models that often share entire components make it difficult to combine their results in a satisfactory way. In this study, how the replication of components (atmosphere, ocean, land, and sea ice) between climate models impacts the proximity of their results is quantified precisely, in terms of climatological means and future changes. A clear relationship exists between the number of components shared by climate models and the proximity of their results. Even the impact of a single shared component is generally visible. These conclusions are true at both the global and regional scales. Given available data, it cannot be robustly concluded that some components are more important than others. Those results provide ways to estimate model interdependencies a priori rather than a posteriori based on their results, in order to define independence weights.

  16. Quantitative high-efficiency cadmium-zinc-telluride SPECT with dedicated parallel-hole collimation system in obese patients: results of a multi-center study.

    PubMed

    Nakazato, Ryo; Slomka, Piotr J; Fish, Mathews; Schwartz, Ronald G; Hayes, Sean W; Thomson, Louise E J; Friedman, John D; Lemley, Mark; Mackin, Maria L; Peterson, Benjamin; Schwartz, Arielle M; Doran, Jesse A; Germano, Guido; Berman, Daniel S

    2015-04-01

    Obesity is a common source of artifact on conventional SPECT myocardial perfusion imaging (MPI). We evaluated image quality and diagnostic performance of high-efficiency (HE) cadmium-zinc-telluride parallel-hole SPECT MPI for coronary artery disease (CAD) in obese patients. 118 consecutive obese patients at three centers (BMI 43.6 ± 8.9 kg·m⁻², range 35-79.7 kg·m⁻²) had upright/supine HE-SPECT and invasive coronary angiography > 6 months (n = 67) or low likelihood of CAD (n = 51). Stress quantitative total perfusion deficit (TPD) for upright (U-TPD), supine (S-TPD), and combined acquisitions (C-TPD) was assessed. Image quality (IQ; 5 = excellent; < 3 nondiagnostic) was compared among BMI 35-39.9 (n = 58), 40-44.9 (n = 24) and ≥45 (n = 36) groups. ROC curve area for CAD detection (≥50% stenosis) for U-TPD, S-TPD, and C-TPD were 0.80, 0.80, and 0.87, respectively. Sensitivity/specificity was 82%/57% for U-TPD, 74%/71% for S-TPD, and 80%/82% for C-TPD. C-TPD had highest specificity (P = .02). C-TPD normalcy rate was higher than U-TPD (88% vs 75%, P = .02). Mean IQ was similar among BMI 35-39.9, 40-44.9 and ≥45 groups [4.6 vs 4.4 vs 4.5, respectively (P = .6)]. No patient had a nondiagnostic stress scan. In obese patients, HE-SPECT MPI with dedicated parallel-hole collimation demonstrated high image quality, normalcy rate, and diagnostic accuracy for CAD by quantitative analysis of combined upright/supine acquisitions.

  17. Inter-rater reliability of motor unit number estimates and quantitative motor unit analysis in the tibialis anterior muscle.

    PubMed

    Boe, S G; Dalton, B H; Harwood, B; Doherty, T J; Rice, C L

    2009-05-01

    To establish the inter-rater reliability of decomposition-based quantitative electromyography (DQEMG) derived motor unit number estimates (MUNEs) and quantitative motor unit (MU) analysis. Using DQEMG, two examiners independently obtained a sample of needle and surface-detected motor unit potentials (MUPs) from the tibialis anterior muscle from 10 subjects. Coupled with a maximal M wave, surface-detected MUPs were used to derive a MUNE for each subject and each examiner. Additionally, size-related parameters of the individual MUs were obtained following quantitative MUP analysis. Test-retest MUNE values were similar with high reliability observed between examiners (ICC=0.87). Additionally, MUNE variability from test-retest as quantified by a 95% confidence interval was relatively low (+/-28 MUs). Lastly, quantitative data pertaining to MU size, complexity and firing rate were similar between examiners. MUNEs and quantitative MU data can be obtained with high reliability by two independent examiners using DQEMG. Establishing the inter-rater reliability of MUNEs and quantitative MU analysis using DQEMG is central to the clinical applicability of the technique. In addition to assessing response to treatments over time, multiple clinicians may be involved in the longitudinal assessment of the MU pool of individuals with disorders of the central or peripheral nervous system.

  18. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on similarity distance, matches signal patterns in a different way and thereby reveals distinct complexity behaviors. Simulations are conducted over synthetic data and traffic signals to provide a comparative study demonstrating the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.
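    For reference, conventional sample entropy, the baseline the paper generalises, can be sketched as follows; the paper's modification replaces the Chebyshev template match used below with a similarity-distance measure, which this minimal numpy version does not implement.

```python
# Minimal sketch of conventional sample entropy: the negative log of the
# probability that templates matching at length m still match at m + 1.
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy with Chebyshev template matching; r scales the
    tolerance by the signal's standard deviation."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def match_count(length):
        # all overlapping templates of the given length
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return (d <= tol).sum() - len(t)   # exclude self-matches

    B = match_count(m)                     # matches at length m
    A = match_count(m + 1)                 # matches persisting at m + 1
    return -np.log(A / B)

rng = np.random.default_rng(1)
se_noise = sample_entropy(rng.normal(size=300))         # irregular signal
se_sine = sample_entropy(np.sin(np.arange(300) / 5.0))  # regular signal
print(se_noise, se_sine)  # the regular signal scores lower
```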

  19. Extracting similar terms from multiple EMR-based semantic embeddings to support chart reviews.

    PubMed

    Cheng Ye, M S; Fabbri, Daniel

    2018-05-21

    Word embeddings project semantically similar terms into nearby points in a vector space. When trained on clinical text, these embeddings can be leveraged to improve keyword search and text highlighting. In this paper, we present methods to refine the selection process of similar terms from multiple EMR-based word embeddings, and evaluate their performance quantitatively and qualitatively across multiple chart review tasks. Word embeddings were trained on each clinical note type in an EMR. These embeddings were then combined, weighted, and truncated to select a refined set of similar terms to be used in keyword search and text highlighting. To evaluate their quality, we measured the similar terms' information retrieval (IR) performance using precision-at-K (P@5, P@10). Additionally, a user study evaluated users' search term preferences, while a timing study measured the time to answer a question from a clinical chart. The refined terms outperformed the baseline method's information retrieval performance (e.g., increasing the average P@5 from 0.48 to 0.60). Additionally, the refined terms were preferred by most users, and reduced the average time to answer a question. Clinical information can be more quickly retrieved and synthesized when using semantically similar terms from multiple embeddings. Copyright © 2018. Published by Elsevier Inc.
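    The precision-at-K scores reported above (P@5, P@10) measure the fraction of the top-K retrieved terms judged relevant. A minimal sketch, with invented clinical terms and relevance labels:

```python
# Precision-at-K over a ranked list of retrieved terms.
def precision_at_k(retrieved, relevant, k):
    """P@K = |top-K retrieved ∩ relevant| / K."""
    top_k = retrieved[:k]
    return sum(1 for term in top_k if term in relevant) / k

# Hypothetical ranked output for a query like "myocardial infarction".
retrieved = ["mi", "infarction", "heart", "nstemi", "stemi",
             "chest", "troponin", "ekg", "attack", "cardiac"]
relevant = {"mi", "infarction", "nstemi", "stemi", "troponin", "cardiac"}

print(precision_at_k(retrieved, relevant, 5))   # 4/5 = 0.8
print(precision_at_k(retrieved, relevant, 10))  # 6/10 = 0.6
```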

  20. Adapting Document Similarity Measures for Ligand-Based Virtual Screening.

    PubMed

    Himmat, Mubarak; Salim, Naomie; Al-Dabbagh, Mohammed Mumtaz; Saeed, Faisal; Ahmed, Ali

    2016-04-13

    Quantifying the similarity of molecules is one of the major tasks in virtual screening. Many similarity measures have been proposed for this purpose, some derived from document and text retrieval, since methods that perform well in document retrieval often also achieve good results in virtual screening. In this work, we propose a similarity measure for ligand-based virtual screening that has been derived from a text processing similarity measure and adapted to be suitable for virtual screening; we call this proposed measure the Adapted Similarity Measure of Text Processing (ASMTP). To evaluate and test the proposed ASMTP we conducted several experiments on two benchmark datasets: the Maximum Unbiased Validation (MUV) dataset and the MDL Drug Data Report (MDDR). The experiments were conducted by randomly choosing 10 reference structures from each class as queries and evaluating them by recall at cut-offs of 1% and 5%. The overall results are compared with several similarity methods, including the Tanimoto coefficient, which is considered the conventional and standard similarity coefficient for fingerprint-based similarity calculations. The results show that the proposed measure improves the performance of ligand-based virtual screening and outperforms the Tanimoto coefficient and the other methods.
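
    The Tanimoto coefficient mentioned as the conventional baseline reduces to a set ratio over fingerprint bits. A minimal sketch (the bit positions are arbitrary examples; ASMTP itself is not reproduced here):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two binary fingerprints given as
    sets of 'on' bit positions: |A ∩ B| / |A ∪ B|."""
    if not fp_a and not fp_b:
        return 1.0  # convention: two empty fingerprints are identical
    return len(fp_a & fp_b) / len(fp_a | fp_b)

query = {1, 4, 7, 9, 12}
candidate = {1, 4, 7, 13}
print(tanimoto(query, candidate))  # 3 shared bits / 6 total bits = 0.5
```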

  1. Bayes' theorem and quantitative risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kaplan, S.

    1994-12-31

    This paper argues that for a quantitative risk analysis (QRA) to be useful for public and private decision making, and for rallying the support necessary to implement those decisions, it is necessary that the QRA results be "trustable." Trustable means that the results are based solidly and logically on all the relevant evidence available. This, in turn, means that the quantitative results must be derived from the evidence using Bayes' theorem. Thus, it argues that analysts should strive to make their QRAs more clearly and explicitly Bayesian, and in this way make them more "evidence dependent" than "personality dependent."
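
    The Bayesian updating the author advocates is, at its core, a one-line normalisation: posterior ∝ likelihood × prior. A minimal sketch with invented prior weights and likelihoods for two failure-rate hypotheses (not numbers from the paper):

```python
def bayes_update(prior, likelihoods):
    """Posterior P(H_i | E) ∝ P(E | H_i) * P(H_i), normalised over hypotheses."""
    unnorm = [p * l for p, l in zip(prior, likelihoods)]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# Hypothetical QRA fragment: two hypotheses about a component failure
# rate (low vs. high), updated on the evidence "one failure observed".
prior = [0.7, 0.3]        # analyst's prior weights
likelihoods = [0.1, 0.6]  # P(evidence | hypothesis)
posterior = bayes_update(prior, likelihoods)
print(posterior)  # the high-rate hypothesis now dominates
```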

  2. A Similarity Analysis of Audio Signal to Develop a Human Activity Recognition Using Similarity Networks

    PubMed Central

    García-Hernández, Alejandra; Galván-Tejada, Jorge I.; Celaya-Padilla, José M.; Velasco-Elizondo, Perla; Cárdenas-Vargas, Rogelio

    2017-01-01

    Human Activity Recognition (HAR) is one of the main subjects of study in the areas of computer vision and machine learning due to the great benefits that can be achieved. Examples of the study areas are health prevention, security and surveillance, automotive research, and many others. The proposed approaches are carried out using machine learning techniques and present good results; however, it is difficult to observe how the descriptors of human activities are grouped, and a better understanding of the behavior of these descriptors is important for improving the ability to recognize human activities. This paper proposes a novel approach to HAR based on acoustic data and similarity networks. In this approach, we were able to characterize the sound of the activities and to identify those activities by looking for similarity in the sound pattern. We evaluated the similarity of the sounds considering mainly two features: the sound location and the materials that were used. As a result, the materials proved a better reference for classifying the human activities than the location. PMID:29160799

  3. A common neural code for similar conscious experiences in different individuals

    PubMed Central

    Naci, Lorina; Cusack, Rhodri; Anello, Mimma; Owen, Adrian M.

    2014-01-01

    The interpretation of human consciousness from brain activity, without recourse to speech or action, is one of the most provoking and challenging frontiers of modern neuroscience. We asked whether there is a common neural code that underpins similar conscious experiences, which could be used to decode these experiences in the absence of behavior. To this end, we used richly evocative stimulation (an engaging movie) portraying real-world events to elicit a similar conscious experience in different people. Common neural correlates of conscious experience were quantified and related to measurable, quantitative and qualitative, executive components of the movie through two additional behavioral investigations. The movie’s executive demands drove synchronized brain activity across healthy participants’ frontal and parietal cortices in regions known to support executive function. Moreover, the timing of activity in these regions was predicted by participants’ highly similar qualitative experience of the movie’s moment-to-moment executive demands, suggesting that synchronization of activity across participants underpinned their similar experience. Thus we demonstrate, for the first time to our knowledge, that a neural index based on executive function reliably predicted every healthy individual’s similar conscious experience in response to real-world events unfolding over time. This approach provided strong evidence for the conscious experience of a brain-injured patient, who had remained entirely behaviorally nonresponsive for 16 y. The patient’s executive engagement and moment-to-moment perception of the movie content were highly similar to that of every healthy participant. These findings shed light on the common basis of human consciousness and enable the interpretation of conscious experience in the absence of behavior. PMID:25225384

  4. Comparative Evaluation of Four Real-Time PCR Methods for the Quantitative Detection of Epstein-Barr Virus from Whole Blood Specimens.

    PubMed

    Buelow, Daelynn; Sun, Yilun; Tang, Li; Gu, Zhengming; Pounds, Stanley; Hayden, Randall

    2016-07-01

    Monitoring of Epstein-Barr virus (EBV) load in immunocompromised patients has become integral to their care. An increasing number of reagents are available for quantitative detection of EBV; however, there are few published comparative data. Four real-time PCR systems (one using laboratory-developed reagents and three using analyte-specific reagents) were compared with one another for detection of EBV from whole blood. Whole blood specimens seeded with EBV were used to determine quantitative linearity, analytical measurement range, lower limit of detection, and CV for each assay. Retrospective testing of 198 clinical samples was performed in parallel with all methods; results were compared to determine relative quantitative and qualitative performance. All assays showed similar performance. No significant difference was found in limit of detection (3.12-3.49 log10 copies/mL; P = 0.37). A strong qualitative correlation was seen with all assays that used clinical samples (positive detection rates of 89.5%-95.8%). Quantitative correlation of clinical samples across assays was also seen in pairwise regression analysis, with R(2) ranging from 0.83 to 0.95. Normalizing clinical sample results to IU/mL did not alter the quantitative correlation between assays. Quantitative EBV detection by real-time PCR can be performed over a wide linear dynamic range, using three different commercially available reagents and laboratory-developed methods. EBV was detected with comparable sensitivity and quantitative correlation for all assays. Copyright © 2016 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
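
    Quantitative real-time PCR assays of this kind interpolate copy number from a log-linear standard curve. A minimal sketch, assuming a textbook slope of -3.32 cycles per log10 (100% amplification efficiency); the intercept and Ct values are illustrative and not taken from the study:

```python
import math

def copies_from_ct(ct, slope=-3.32, intercept=40.0):
    """Interpolate copy number from a Ct value on a log-linear standard
    curve Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# A 10-fold dilution should shift Ct by |slope| cycles, i.e. one log10.
print(round(math.log10(copies_from_ct(26.8) / copies_from_ct(30.12)), 2))  # 1.0
```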

  5. Physical similarity or numerical representation counts in same-different, numerical comparison, physical comparison, and priming tasks?

    PubMed

    Zhang, Li; Xin, Ziqiang; Feng, Tingyong; Chen, Yinghe; Szűcs, Denes

    2018-03-01

    Recent studies have highlighted the fact that some tasks used to study symbolic number representations are confounded by judgments about physical similarity. Here, we investigated whether the contribution of physical similarity and numerical representation differed in the often-used symbolic same-different, numerical comparison, physical comparison, and priming tasks. Experiment 1 showed that subjective physical similarity was the best predictor of participants' performance in the same-different task, regardless of simultaneous or sequential presentation. Furthermore, the contribution of subjective physical similarity was larger in a simultaneous presentation than in a sequential presentation. Experiment 2 showed that only numerical representation was involved in numerical comparison. Experiment 3 showed that both subjective physical similarity and numerical representation contributed to participants' physical comparison performance. Finally, only numerical representation contributed to participants' performance in a priming task as revealed by Experiment 4. Taken together, the contribution of physical similarity and numerical representation depends on task demands. Performance primarily seems to rely on numerical properties in tasks that require explicit quantitative comparison judgments (physical or numerical), while physical stimulus properties exert an effect in the same-different task.

  6. When is Chemical Similarity Significant? The Statistical Distribution of Chemical Similarity Scores and Its Extreme Values

    PubMed Central

    Baldi, Pierre

    2010-01-01

    As repositories of chemical molecules continue to expand and become more open, it becomes increasingly important to develop tools to search them efficiently and assess the statistical significance of chemical similarity scores. Here we develop a general framework for understanding, modeling, predicting, and approximating the distribution of chemical similarity scores and its extreme values in large databases. The framework can be applied to different chemical representations and similarity measures but is demonstrated here using the most common binary fingerprints with the Tanimoto similarity measure. After introducing several probabilistic models of fingerprints, including the Conditional Gaussian Uniform model, we show that the distribution of Tanimoto scores can be approximated by the distribution of the ratio of two correlated Normal random variables associated with the corresponding unions and intersections. This remains true also when the distribution of similarity scores is conditioned on the size of the query molecules in order to derive more fine-grained results and improve chemical retrieval. The corresponding extreme value distributions for the maximum scores are approximated by Weibull distributions. From these various distributions and their analytical forms, Z-scores, E-values, and p-values are derived to assess the significance of similarity scores. In addition, the framework also allows one to predict the value of standard chemical retrieval metrics, such as Sensitivity and Specificity at fixed thresholds, or ROC (Receiver Operating Characteristic) curves at multiple thresholds, and to detect outliers in the form of atypical molecules. Numerous and diverse experiments, carried out in part with large sets of molecules from the ChemDB, show remarkable agreement between theory and empirical results. PMID:20540577
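
    The paper's null models (a ratio of correlated normals for the bulk, Weibull for the extremes) are richer than can be shown here, but the final step of converting a similarity score into a Z-score and one-sided tail p-value is generic. A sketch with made-up null parameters, not values from the paper:

```python
import math

def z_score(score, mean, std):
    """Standardised similarity score under an assumed null distribution."""
    return (score - mean) / std

def p_value_upper(z):
    """One-sided upper-tail probability of a standard normal, via erfc."""
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical null for Tanimoto scores of a query against a database:
z = z_score(0.82, mean=0.35, std=0.12)
print(round(z, 2))        # z ≈ 3.92
print(p_value_upper(z))   # small tail probability -> significant hit
```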

  7. Quantitative nephelometry

    MedlinePlus

    Quantitative nephelometry is a lab test to quickly and ...

  8. Risk assessment of false-positive quantitative real-time PCR results in food, due to detection of DNA originating from dead cells.

    PubMed

    Wolffs, Petra; Norling, Börje; Rådström, Peter

    2005-03-01

    Real-time PCR technology is increasingly used for detection and quantification of pathogens in food samples. A main disadvantage of nucleic acid detection is the inability to distinguish between signals originating from viable cells and from DNA released by dead cells. In order to gain knowledge concerning the risk of false-positive results due to detection of DNA originating from dead cells, quantitative PCR (qPCR) was used to investigate the degradation kinetics of free DNA in four types of meat samples. Results showed that the fastest degradation rate (1 log unit per 0.5 h) was observed in chicken homogenate, whereas the slowest rate (1 log unit per 120.5 h) was observed in pork rinse. Overall, results indicated that degradation occurred faster in chicken samples than in pork samples, and faster at higher temperatures. Based on these results, it was concluded that, especially in pork samples, there is a risk of false-positive PCR results. This was confirmed in a quantitative study on cell death and signal persistence over a period of 28 days, employing three different methods: viable counts, direct qPCR, and floatation, a recently developed discontinuous density centrifugation method, followed by qPCR. Results showed that direct qPCR overestimated the amount of cells in the samples by up to 10 times compared to viable counts, due to detection of DNA from dead cells. However, after using floatation prior to qPCR, results resembled the viable count data. This indicates that by using floatation as a sample treatment step prior to qPCR, the risk of false-positive PCR results due to detection of dead cells can be minimized.
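
    The reported degradation rates ("one log unit per x hours") translate directly into a linear decay on the log10 scale. A minimal sketch using the two rates quoted in the abstract and an invented starting load:

```python
def remaining_log_copies(initial_log, hours, hours_per_log):
    """Log10 copy number left after degradation expressed as
    'one log unit lost per hours_per_log hours' (floored at zero)."""
    return max(0.0, initial_log - hours / hours_per_log)

# Reported rates: chicken homogenate loses 1 log per 0.5 h,
# pork rinse 1 log per 120.5 h.  Hypothetical start: 6 log10 copies.
print(remaining_log_copies(6.0, 3.0, 0.5))    # chicken: 0.0, signal gone in 3 h
print(remaining_log_copies(6.0, 3.0, 120.5))  # pork: still ~5.98, persists
```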

  9. Anthropometric and quantitative EMG status of femoral quadriceps before and after conventional kinesitherapy with and without magnetotherapy.

    PubMed

    Graberski Matasović, M; Matasović, T; Markovac, Z

    1997-06-01

    The frequency of femoral quadriceps muscle hypotrophy has become a significant therapeutic problem. Efforts are being made to improve the standard scheme of kinesitherapeutic treatment by using additional, more effective therapeutic methods. Besides kinesitherapy, the authors used magnetotherapy in 30 of the 60 patients. A total of 60 patients of both sexes, of similar age groups and intensity of hypotrophy, were included in the study. They were divided into groups A and B, the experimental and the control one (30 patients each). The treatment was scheduled for the usual 5-6 weeks. Quantitative electromyographic analysis was used to check the treatment results achieved after 5 and 6 weeks of the treatment period. Analysis of the results confirmed the assumption that magnetotherapy may yield better and faster treatment results, disappearance of pain, and a decreased risk of complications. The same results were obtained in the experimental group, only one week earlier than in the control group. However, quantitative EMG analysis did not prove a sufficiently reliable and objective method for assessing the real condition of the muscle and the effects of treatment.

  10. Brain electromagnetic activity and lightning: potentially congruent scale-invariant quantitative properties

    PubMed Central

    Persinger, Michael A.

    2012-01-01

    The space-time characteristics of the axonal action potential are remarkably similar to the scaled equivalents of lightning. The energy and current densities from these transients within their respective volumes or cross-sectional areas are the same order of magnitude. Length–velocity ratios and temporal durations are nearly identical. There are similar chemical consequences such as the production of nitric oxide. Careful, quantitative examination of the characteristics of lightning may reveal analogous features of the action potential that could lead to a more accurate understanding of these powerful correlates of neurocognitive processes. PMID:22615688

  11. Stability of similarity measurements for bipartite networks

    PubMed Central

    Liu, Jian-Guo; Hou, Lei; Pan, Xue; Guo, Qiang; Zhou, Tao

    2016-01-01

    Similarity is a fundamental measure in network analyses and machine learning algorithms, with wide applications ranging from personalized recommendation to socio-economic dynamics. We argue that an effective similarity measurement should guarantee stability even under some information loss. With six bipartite networks, we investigate the stabilities of fifteen similarity measurements by comparing the similarity matrices of two data samples randomly divided from the original data sets. Results show that the fifteen measurements can be well classified into three clusters according to their stabilities, and that measurements in the same cluster have similar mathematical definitions. In addition, we develop a top-n-stability method for personalized recommendation, and find that unstable similarities recommend false information to users, and that the performance of recommendation is largely improved by using stable similarity measurements. This work provides a novel dimension for analyzing and evaluating similarity measurements, which can further find applications in link prediction, personalized recommendation, clustering algorithms, community detection and so on. PMID:26725688
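
    The stability test described (splitting the data in two and comparing the resulting similarity matrices) can be sketched for a single measurement, cosine similarity, on a synthetic bipartite network. The network size, density, and drift statistic below are invented for illustration; the paper evaluates fifteen measurements on six real networks.

```python
import random

def cosine_sim(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = sum(a * a for a in u) ** 0.5
    nv = sum(b * b for b in v) ** 0.5
    return dot / (nu * nv) if nu and nv else 0.0

def item_similarity_matrix(links, n_users, n_items):
    """Item-item cosine similarities from a set of (user, item) links."""
    cols = [[0.0] * n_users for _ in range(n_items)]
    for u, i in links:
        cols[i][u] = 1.0
    return [[cosine_sim(cols[i], cols[j]) for j in range(n_items)]
            for i in range(n_items)]

# Randomly split the links in two and compare the two similarity
# matrices; a stable measurement shows little drift between halves.
random.seed(0)
n_users, n_items = 200, 10
links = [(u, i) for u in range(n_users) for i in range(n_items)
         if random.random() < 0.3]
random.shuffle(links)
half = len(links) // 2
s1 = item_similarity_matrix(links[:half], n_users, n_items)
s2 = item_similarity_matrix(links[half:], n_users, n_items)
drift = sum(abs(s1[i][j] - s2[i][j]) for i in range(n_items)
            for j in range(n_items)) / n_items ** 2
print(drift)  # mean absolute difference between the two halves
```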

  13. Size-exclusive Nanosensor for Quantitative Analysis of Fullerene C60: A Concept Paper

    EPA Science Inventory

    This paper presents the first development of a mass-sensitive nanosensor for the isolation and quantitative analyses of engineered fullerene (C60) nanoparticles, while excluding mixtures of structurally similar fullerenes. Amino-modified beta cyclodextrin (β-CD-NH

  14. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    In order to develop the image processing widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results for the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window resizing analysis technique is employed. The results of the morphology and surface roughness analysis reveal micron- and nano-scale information that is characteristic of each rock type and its history. These could be used for mineral identification and for studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
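
    Of the roughness parameters analysed, Rq has a standard definition: the root-mean-square deviation of surface heights from the mean line. A minimal sketch with a synthetic profile (the heights are invented; the paper's window resizing analysis is not reproduced):

```python
def rq_roughness(heights):
    """Root-mean-square roughness Rq: RMS deviation of surface heights
    from their mean line."""
    mean = sum(heights) / len(heights)
    return (sum((h - mean) ** 2 for h in heights) / len(heights)) ** 0.5

profile = [0.0, 2.0, 0.0, -2.0] * 25  # synthetic profile with mean 0
print(rq_roughness(profile))  # sqrt(mean of {0, 4, 0, 4}) = sqrt(2) ≈ 1.414
```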

  15. Self-Similar Compressible Free Vortices

    NASA Technical Reports Server (NTRS)

    vonEllenrieder, Karl

    1998-01-01

    Lie group methods are used to find both exact and numerical similarity solutions for compressible perturbations to an incompressible, two-dimensional, axisymmetric vortex reference flow. The reference flow vorticity satisfies an eigenvalue problem for which the solutions are a set of two-dimensional, self-similar, incompressible vortices. These solutions are augmented by deriving a conserved quantity for each eigenvalue, and identifying a Lie group which leaves the reference flow equations invariant. The partial differential equations governing the compressible perturbations to these reference flows are also invariant under the action of the same group. The similarity variables found with this group are used to determine the decay rates of the velocities and thermodynamic variables in the self-similar flows, and to reduce the governing partial differential equations to a set of ordinary differential equations. The ODEs are solved analytically and numerically for a Taylor vortex reference flow, and numerically for an Oseen vortex reference flow. The solutions are used to examine the dependencies of the temperature, density, entropy, dissipation and radial velocity on the Prandtl number. Also, experimental data on compressible free vortex flow are compared to the analytical results, the evolution of vortices from initial states which are not self-similar is discussed, and the energy transfer in a slightly-compressible vortex is considered.

  16. Molecular similarity measures.

    PubMed

    Maggiora, Gerald M; Shanmugasundaram, Veerabahu

    2011-01-01

    Molecular similarity is a pervasive concept in chemistry. It is essential to many aspects of chemical reasoning and analysis and is perhaps the fundamental assumption underlying medicinal chemistry. Dissimilarity, the complement of similarity, also plays a major role in a growing number of applications of molecular diversity in combinatorial chemistry, high-throughput screening, and related fields. How molecular information is represented, called the representation problem, is important to the type of molecular similarity analysis (MSA) that can be carried out in any given situation. In this work, four types of mathematical structure are used to represent molecular information: sets, graphs, vectors, and functions. Molecular similarity is a pairwise relationship that induces structure into sets of molecules, giving rise to the concept of chemical space. Although all three concepts - molecular similarity, molecular representation, and chemical space - are treated in this chapter, the emphasis is on molecular similarity measures. Similarity measures, also called similarity coefficients or indices, are functions that map pairs of compatible molecular representations that are of the same mathematical form into real numbers usually, but not always, lying on the unit interval. This chapter presents a somewhat pedagogical discussion of many types of molecular similarity measures, their strengths and limitations, and their relationship to one another. An expanded account of the material on chemical spaces presented in the first edition of this book is also provided. It includes a discussion of the topography of activity landscapes and the role that activity cliffs in these landscapes play in structure-activity studies.

  17. EvolQG - An R package for evolutionary quantitative genetics

    PubMed Central

    Melo, Diogo; Garcia, Guilherme; Hubbe, Alex; Assis, Ana Paula; Marroig, Gabriel

    2016-01-01

    We present an open source package for performing evolutionary quantitative genetics analyses in the R environment for statistical computing. Evolutionary theory shows that evolution depends critically on the available variation in a given population. When dealing with many quantitative traits this variation is expressed in the form of a covariance matrix, particularly the additive genetic covariance matrix or sometimes the phenotypic matrix, when the genetic matrix is unavailable and there is evidence the phenotypic matrix is sufficiently similar to the genetic matrix. Given this mathematical representation of available variation, the EvolQG package provides functions for calculation of relevant evolutionary statistics; estimation of sampling error; corrections for this error; matrix comparison via correlations, distances and matrix decomposition; analysis of modularity patterns; and functions for testing evolutionary hypotheses on taxa diversification. PMID:27785352

  18. Quantitative and qualitative research across cultures and languages: cultural metrics and their application.

    PubMed

    Wagner, Wolfgang; Hansen, Karolina; Kronberger, Nicole

    2014-12-01

    Growing globalisation of the world draws attention to cultural differences between people from different countries or from different cultures within countries. Notwithstanding the diversity of people's worldviews, current cross-cultural research still faces the challenge of how to avoid ethnocentrism; comparing Western-driven phenomena with like variables across countries without checking their conceptual equivalence is clearly highly problematic. In the present article we argue that simple comparison of measurements (in the quantitative domain) or of semantic interpretations (in the qualitative domain) across cultures easily leads to inadequate results. Questionnaire items or text produced in interviews or via open-ended questions have culturally laden meanings and cannot be mapped onto the same semantic metric. We call the culture-specific space and relationship between variables or meanings a 'cultural metric', that is, a set of notions that are inter-related and that mutually specify each other's meaning. We illustrate the problems and their possible solutions with examples from quantitative and qualitative research. The suggested methods allow one to respect the semantic space of notions in cultures and language groups, so that the resulting similarities or differences between cultures can be better understood and interpreted.

  19. [THE COMPARATIVE ANALYSIS OF RESULTS OF DETECTION OF CARCINOGENIC TYPES OF HUMAN PAPILLOMA VIRUS BY QUALITATIVE AND QUANTITATIVE TESTS].

    PubMed

    Kuzmenko, E T; Labigina, A V; Leshenko, O Ya; Rusanov, D N; Kuzmenko, V V; Fedko, L P; Pak, I P

    2015-05-01

    The analysis of results of screening (n = 3208; sexually active citizens aged from 18 to 59 years) was carried out to detect oncogenic types of human papilloma virus using qualitative (1150 females and 720 males) and quantitative (real-time polymerase chain reaction; 843 females and 115 males) techniques. Human papilloma virus of high oncogenic type was detected in 65% and 68.4% of females and in 48.6% and 53% of males, correspondingly. Among the 12 types of human papilloma virus, the most frequently diagnosed was human papilloma virus 16, independently of the gender of those examined and the technique of analysis. In females, under application of qualitative tests the rate of human papilloma virus 16 made up 18.3% (n = 280), and under application of quantitative tests the rate made up 14.9% (n = 126; p ≤ 0.05). Under examination of males using qualitative tests the rate of human papilloma virus 16 made up 8.3% (n = 60), and under application of quantitative tests it made up 12.2% (n = 14; p ≥ 0.05). Under application of qualitative tests the rate of detection of the rest of the oncogenic types of human papilloma virus varied in females from 3.4% to 8.4% and in males from 1.8% to 5.9%. Under application of quantitative tests in females the rate of human papilloma virus with high viral load made up 68.4%, with medium viral load 2.85% (n = 24), and with low viral load 0.24% (n = 2). Under application of quantitative tests in males the rate of detection of types of human papilloma virus made up 53%, and in all cases a high viral load was established. In females, most of the oncogenic types of human papilloma virus (except for 31, 39, 59) are detected significantly more often than in males.

  20. Prioritization of candidate disease genes by combining topological similarity and semantic similarity.

    PubMed

    Liu, Bin; Jin, Min; Zeng, Pan

    2015-10-01

    The identification of gene-phenotype relationships is very important for the treatment of human diseases. Studies have shown that genes causing the same or similar phenotypes tend to interact with each other in a protein-protein interaction (PPI) network. Thus, many identification methods based on the PPI network model have achieved good results. However, in the PPI network, some interactions between the proteins encoded by candidate gene and the proteins encoded by known disease genes are very weak. Therefore, some studies have combined the PPI network with other genomic information and reported good predictive performances. However, we believe that the results could be further improved. In this paper, we propose a new method that uses the semantic similarity between the candidate gene and known disease genes to set the initial probability vector of a random walk with a restart algorithm in a human PPI network. The effectiveness of our method was demonstrated by leave-one-out cross-validation, and the experimental results indicated that our method outperformed other methods. Additionally, our method can predict new causative genes of multifactor diseases, including Parkinson's disease, breast cancer and obesity. The top predictions were good and consistent with the findings in the literature, which further illustrates the effectiveness of our method. Copyright © 2015 Elsevier Inc. All rights reserved.
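
    A random walk with restart of the kind described iterates p ← (1 − r)·Wp + r·p0 on the network, where W is the column-normalised adjacency and p0 encodes the seed probabilities (here, set by semantic similarity to known disease genes). A minimal sketch on a toy graph; the adjacency, seed weights, and restart probability are invented for illustration:

```python
def random_walk_with_restart(adj, p0, restart=0.3, tol=1e-10):
    """Iterate p <- (1-restart) * W p + restart * p0 to convergence,
    where W is the column-normalised adjacency matrix."""
    n = len(adj)
    deg = [sum(adj[j][i] for j in range(n)) for i in range(n)]  # column sums
    p = p0[:]
    while True:
        q = [(1 - restart) * sum(adj[i][j] * p[j] / deg[j]
                                 for j in range(n) if deg[j]) + restart * p0[i]
             for i in range(n)]
        if max(abs(a - b) for a, b in zip(q, p)) < tol:
            return q
        p = q

# Toy 4-node network: nodes 0 and 1 are seeds; node 2 bridges to node 3.
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
p0 = [0.5, 0.5, 0.0, 0.0]  # initial vector weighted by semantic similarity
scores = random_walk_with_restart(adj, p0)
print(scores)  # candidate nodes ranked by steady-state probability
```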

  1. Proteomic analysis of cellular soluble proteins from human bronchial smooth muscle cells by combining nondenaturing micro 2DE and quantitative LC-MS/MS. 2. Similarity search between protein maps for the analysis of protein complexes.

    PubMed

    Jin, Ya; Yuan, Qi; Zhang, Jun; Manabe, Takashi; Tan, Wen

    2015-09-01

    Human bronchial smooth muscle cell soluble proteins were analyzed by a combined method of nondenaturing micro 2DE, grid gel-cutting, and quantitative LC-MS/MS and a native protein map was prepared for each of the identified 4323 proteins [1]. A method to evaluate the degree of similarity between the protein maps was developed since we expected the proteins comprising a protein complex would be separated together under nondenaturing conditions. The following procedure was employed using Excel macros; (i) maps that have three or more squares with protein quantity data were selected (2328 maps), (ii) within each map, the quantity values of the squares were normalized setting the highest value to be 1.0, (iii) in comparing a map with another map, the smaller normalized quantity in two corresponding squares was taken and summed throughout the map to give an "overlap score," (iv) each map was compared against all the 2328 maps and the largest overlap score, obtained when a map was compared with itself, was set to be 1.0 thus providing 2328 "overlap factors," (v) step (iv) was repeated for all maps providing 2328 × 2328 matrix of overlap factors. From the matrix, protein pairs that showed overlap factors above 0.65 from both protein sides were selected (431 protein pairs). Each protein pair was searched in a database (UniProtKB) on complex formation and 301 protein pairs, which comprise 35 protein complexes, were found to be documented. These results demonstrated that native protein maps and their similarity search would enable simultaneous analysis of multiple protein complexes in cells. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
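
    Steps (i)-(iii) of the overlap-score procedure can be sketched directly: normalise each map so its largest quantity is 1.0, then sum the smaller of the two normalised quantities over shared squares. The square labels and quantities below are invented; the subsequent normalisation to overlap factors (dividing by each map's self-comparison score) is described in the abstract but not shown here.

```python
def overlap_score(map_a, map_b):
    """Overlap between two native protein maps, each given as a dict
    {square: quantity}.  Quantities are normalised so each map's
    maximum is 1.0, then min(a, b) is summed over shared squares."""
    def normalise(m):
        top = max(m.values())
        return {sq: q / top for sq, q in m.items()}
    a, b = normalise(map_a), normalise(map_b)
    return sum(min(a[sq], b[sq]) for sq in a.keys() & b.keys())

# Two hypothetical maps on a grid of squares; subunits of one complex
# co-migrate under nondenaturing separation, so their maps overlap.
map_x = {"B3": 80.0, "B4": 40.0, "C4": 20.0}
map_y = {"B3": 400.0, "B4": 160.0, "D1": 50.0}
print(round(overlap_score(map_x, map_y), 3))  # min(1, 1) + min(0.5, 0.4) = 1.4
```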

  2. Prospective evaluation of risk of vertebral fractures using quantitative ultrasound measurements and bone mineral density in a population-based sample of postmenopausal women: results of the Basel Osteoporosis Study.

    PubMed

    Hollaender, R; Hartl, F; Krieg, M-A; Tyndall, A; Geuckel, C; Buitrago-Tellez, C; Manghani, M; Kraenzlin, M; Theiler, R; Hans, D

    2009-03-01

    Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus, and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.

  3. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  4. Interpersonal similarity between body movements in face-to-face communication in daily life.

    PubMed

    Higo, Naoki; Ogawa, Ken-ichiro; Minemura, Juichi; Xu, Bujie; Nozawa, Takayuki; Ogata, Taiki; Ara, Koji; Yano, Kazuo; Miyake, Yoshihiro

    2014-01-01

    Individuals are embedded in social networks in which they communicate with others in their daily lives. Because smooth face-to-face communication is the key to maintaining these networks, measuring the smoothness of such communication is an important issue. One indicator of smoothness is the similarity of the body movements of the two individuals concerned. A typical example noted in experimental environments is the interpersonal synchronization of body movements such as nods and gestures during smooth face-to-face communication. It should therefore be possible to estimate quantitatively the smoothness of face-to-face communication in social networks through measurement of the synchronization of body movements. However, this is difficult because social networks, which differ from disciplined experimental environments, are open environments for the face-to-face communication between two individuals. In such open environments, their body movements become complicated by various external factors and may follow unstable and nonuniform patterns. Nevertheless, we consider there to be some interaction during face-to-face communication that leads to the interpersonal synchronization of body movements, which can be seen through the interpersonal similarity of body movements. The present study aims to clarify such interaction in terms of body movements during daily face-to-face communication in real organizations of more than 100 people. We analyzed data on the frequency of body movement for each individual during face-to-face communication, as measured by a wearable sensor, and evaluated the degree of interpersonal similarity of body movements between two individuals as their frequency difference. Furthermore, we generated uncorrelated data by resampling the data gathered and compared these two data sets statistically to distinguish the effects of actual face-to-face communication from those of the activities accompanying the communication. Our results confirm an

  5. Interpersonal Similarity between Body Movements in Face-To-Face Communication in Daily Life

    PubMed Central

    Higo, Naoki; Ogawa, Ken-ichiro; Minemura, Juichi; Xu, Bujie; Nozawa, Takayuki; Ogata, Taiki; Ara, Koji; Yano, Kazuo; Miyake, Yoshihiro

    2014-01-01

    Individuals are embedded in social networks in which they communicate with others in their daily lives. Because smooth face-to-face communication is the key to maintaining these networks, measuring the smoothness of such communication is an important issue. One indicator of smoothness is the similarity of the body movements of the two individuals concerned. A typical example noted in experimental environments is the interpersonal synchronization of body movements such as nods and gestures during smooth face-to-face communication. It should therefore be possible to estimate quantitatively the smoothness of face-to-face communication in social networks through measurement of the synchronization of body movements. However, this is difficult because social networks, which differ from disciplined experimental environments, are open environments for the face-to-face communication between two individuals. In such open environments, their body movements become complicated by various external factors and may follow unstable and nonuniform patterns. Nevertheless, we consider there to be some interaction during face-to-face communication that leads to the interpersonal synchronization of body movements, which can be seen through the interpersonal similarity of body movements. The present study aims to clarify such interaction in terms of body movements during daily face-to-face communication in real organizations of more than 100 people. We analyzed data on the frequency of body movement for each individual during face-to-face communication, as measured by a wearable sensor, and evaluated the degree of interpersonal similarity of body movements between two individuals as their frequency difference. Furthermore, we generated uncorrelated data by resampling the data gathered and compared these two data sets statistically to distinguish the effects of actual face-to-face communication from those of the activities accompanying the communication. Our results confirm an

  6. On the Development and Use of Large Chemical Similarity Networks, Informatics Best Practices and Novel Chemical Descriptors towards Materials Quantitative Structure Property Relationships

    ERIC Educational Resources Information Center

    Krein, Michael

    2011-01-01

    After decades of development and use in a variety of application areas, Quantitative Structure Property Relationships (QSPRs) and related descriptor-based statistical learning methods have achieved a level of infamy due to their misuse. The field is rife with past examples of overtrained models, overoptimistic performance assessment, and outright…

  7. Targeted, Site-specific quantitation of N- and O-glycopeptides using 18O-labeling and product ion based mass spectrometry.

    PubMed

    Srikanth, Jandhyam; Agalyadevi, Rathinasamy; Babu, Ponnusamy

    2017-02-01

    The site-specific quantitation of N- and O-glycosylation is vital to understanding the function(s) of different glycans expressed at a given site of a protein under physiological and disease conditions. The most commonly used quantification method, based on precursor ion intensity, is less accurate, and other, label-based methods are expensive and require enrichment of glycopeptides. Here, we used glycopeptide product (y and Y0) ions and 18O-labeling of the C-terminal carboxyl group as a strategy to obtain quantitative information about the fold-change and relative abundance of most of the glycoforms attached to the glycopeptides. As a proof of concept, the accuracy and robustness of this targeted, relative quantification LC-MS method was demonstrated using Rituximab. Furthermore, the N-glycopeptide quantification results were compared with a biosimilar of Rituximab and validated with quantitative data obtained from the 2-AB-UHPLC-FL method. We further demonstrated the intensity fold-change and relative abundance of 46 unique N- and O-glycopeptides and aglycopeptides from innovator and biosimilar samples of Etanercept using both normal-MS and product ion based quantitation. The results showed a very similar site-specific expression of N- and O-glycopeptides between the samples, but with subtle differences. Interestingly, we have also been able to quantify the macro-heterogeneity of all N- and O-glycopeptides of Etanercept. In addition to applications in biotherapeutics, the developed method can also be used for site-specific quantitation of N- and O-glycopeptides and aglycopeptides of glycoproteins with known glycosylation patterns.

  8. Multiplicative effects model with internal standard in mobile phase for quantitative liquid chromatography-mass spectrometry.

    PubMed

    Song, Mi; Chen, Zeng-Ping; Chen, Yao; Jin, Jing-Wen

    2014-07-01

    Liquid chromatography-mass spectrometry (LC-MS) assays suffer from signal instability caused by the gradual fouling of the ion source, vacuum instability, aging of the ion multiplier, etc. To address this issue, in this contribution, an internal standard was added into the mobile phase. The internal standard was therefore ionized and detected together with the analytes of interest by the mass spectrometer, ensuring that variations in measurement conditions and/or instrument have similar effects on the signal contributions of both the analytes of interest and the internal standard. Subsequently, based on this unique strategy of adding an internal standard to the mobile phase, a multiplicative effects model was developed for quantitative LC-MS assays and tested on a proof-of-concept model system: the determination of amino acids in water by LC-MS. The experimental results demonstrated that the proposed method could efficiently mitigate the detrimental effects of continuous signal variation, and achieved quantitative results with average relative predictive error values in the range of 8.0-15.0%, which were much more accurate than the corresponding results of the conventional internal standard method based on the peak height ratio and the partial least squares method (their average relative predictive error values were as high as 66.3% and 64.8%, respectively). Therefore, it is expected that the proposed method can be developed and extended to quantitative LC-MS analysis of more complex systems. Copyright © 2014 Elsevier B.V. All rights reserved.
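
    The rationale for putting the internal standard in the mobile phase, where every scan sees it, can be illustrated with a toy simulation. The response factors and drift range below are invented, and this simple ratio correction is only the most basic special case of the multiplicative effects model described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

true_conc = np.array([1.0, 2.0, 4.0, 8.0])   # analyte concentrations (a.u.)
k_analyte, k_is = 50.0, 30.0                 # hypothetical response factors
is_conc = 1.0                                # internal standard level in mobile phase

# Run-to-run multiplicative drift (ion-source fouling, multiplier aging, ...)
drift = rng.uniform(0.5, 1.5, size=true_conc.size)

analyte_signal = k_analyte * true_conc * drift
is_signal = k_is * is_conc * drift           # the same drift scales the IS signal

ratio = analyte_signal / is_signal           # multiplicative drift cancels exactly
recovered = ratio * (k_is * is_conc) / k_analyte
```

Because the drift multiplies analyte and internal-standard signals alike, the ratio removes it entirely in this idealized setting; the paper's model additionally handles cases where the drift does not affect both species identically.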

  9. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein

  10. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. 
PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein

  11. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-RNA-extraction method provided quantitative PCR data similar to those of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled), and the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-RNA-extraction technique was tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.

  12. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
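
    The sub-model idea can be sketched with ordinary least-squares stand-ins for the PLS models. The data, the two composition ranges, and the blending ramp below are all hypothetical; the point is only the structure of the method: a full-range model makes a first guess, which selects how to weight the range-restricted sub-models:

```python
import numpy as np

def fit(x, y):
    """Ordinary least-squares line y = a*x + b (stand-in for a PLS model)."""
    a, b = np.polyfit(x, y, 1)
    return a, b

def predict(model, x):
    a, b = model
    return a * x + b

# Synthetic "spectral feature" x vs. composition y, with a slope that
# changes between the low and high composition ranges.
x_lo, x_hi = np.linspace(0, 1, 20), np.linspace(1, 2, 20)
y_lo, y_hi = 2.0 * x_lo, 2.0 + 5.0 * (x_hi - 1.0)

full = fit(np.r_[x_lo, x_hi], np.r_[y_lo, y_hi])   # full-range model
sub_lo, sub_hi = fit(x_lo, y_lo), fit(x_hi, y_hi)  # range-restricted sub-models

def blended(x):
    """Weight the sub-model predictions using the full model's first guess."""
    guess = predict(full, x)
    w_hi = np.clip((guess - 1.0) / 2.0, 0.0, 1.0)  # hypothetical blending ramp
    return (1 - w_hi) * predict(sub_lo, x) + w_hi * predict(sub_hi, x)
```

Each sub-model fits its own composition range far better than the single full-range line, and the blend avoids a hard switch at the range boundary.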

  13. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE PAGES

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens; ...

    2016-12-15

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element’s emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple “sub-model” method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then “blending” these “sub-models” into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.

  14. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risks of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative powers for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performances, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
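
    The scoring step, turning a binary side-effect profile into a single weighted sum, can be sketched as follows. The profiles and weights here are randomly generated placeholders, in the spirit of the paper's randomly sampled weights; they are not real drug data:

```python
import numpy as np

rng = np.random.default_rng(1)

n_drugs, n_effects = 5, 8
# Binary side-effect profiles: profiles[i, j] = 1 if drug i has side effect j.
profiles = rng.integers(0, 2, size=(n_drugs, n_effects))
# Empirical severity weights for each side effect (made-up values; the paper
# samples different weights to show the method is robust to their choice).
weights = rng.random(n_effects)

scores = profiles @ weights    # quantitative score = weighted sum of side effects
ranking = np.argsort(-scores)  # riskiest drugs first
```

Because the score is a single number per drug, drugs can be ranked by risk directly, which a presence/absence classifier cannot do without such a transformation.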

  15. Artificial stone dust-induced functional and inflammatory abnormalities in exposed workers monitored quantitatively by biometrics.

    PubMed

    Ophir, Noa; Shai, Amir Bar; Alkalay, Yifat; Israeli, Shani; Korenstein, Rafi; Kramer, Mordechai R; Fireman, Elizabeth

    2016-01-01

    The manufacture of kitchen and bath countertops in Israel is based mainly on artificial stone that contains 93% silica as natural quartz, and ∼3500 workers are involved in cutting and processing it. Artificial stone produces high concentrations of silica dust. Exposure to crystalline silica may cause silicosis, an irreversible lung disease. Our aim was to screen exposed workers by quantitative biometric monitoring of functional and inflammatory parameters. 68 exposed artificial stone workers were compared to 48 nonexposed individuals (controls). Exposed workers filled in questionnaires, and all participants underwent pulmonary function tests and induced sputum analyses. Silica was quantitated by a Niton XL3 X-ray fluorescence spectrometer. Pulmonary function test results of exposed workers were significantly lower and induced sputa showed significantly higher neutrophilic inflammation compared to controls; both processes were slowed down by the use of protective measures in the workplace. Particle size distribution in induced sputum samples of exposed workers was similar to that of artificial stone dust, which contained aluminium, zirconium and titanium in addition to silica. In conclusion, the quantitation of biometric parameters is useful for monitoring workers exposed to artificial stone in order to avoid deterioration over time.

  16. A Similarity Analysis of Audio Signal to Develop a Human Activity Recognition Using Similarity Networks.

    PubMed

    García-Hernández, Alejandra; Galván-Tejada, Carlos E; Galván-Tejada, Jorge I; Celaya-Padilla, José M; Gamboa-Rosales, Hamurabi; Velasco-Elizondo, Perla; Cárdenas-Vargas, Rogelio

    2017-11-21

    Human Activity Recognition (HAR) is one of the main subjects of study in the areas of computer vision and machine learning due to the great benefits that can be achieved. Examples of the study areas are: health prevention, security and surveillance, automotive research, and many others. The proposed approaches are carried out using machine learning techniques and present good results. However, it is difficult to observe how the descriptors of human activities are grouped, and a better understanding of the behavior of these descriptors is important for improving the recognition of human activities. This paper proposes a novel approach to HAR based on acoustic data and similarity networks. In this approach, we were able to characterize the sound of the activities and identify those activities by looking for similarity in the sound pattern. We evaluated the similarity of the sounds considering mainly two features: the sound location and the materials that were used. As a result, the materials proved to be a better reference for classifying the human activities than the location.

  17. Usefulness of a Dual Macro- and Micro-Energy-Dispersive X-Ray Fluorescence Spectrometer to Develop Quantitative Methodologies for Historic Mortar and Related Materials Characterization.

    PubMed

    García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel

    2018-05-01

    Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads, and the whole volume of the sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. This new ED-XRF quantitative method also demonstrated the importance of matrix effects on the accuracy of the results, making it necessary to use Certified Reference Materials as standards. The results obtained with the ED-XRF quantitative method were compared with the ones obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work proved that μ-ED-XRF spectrometers can be used not only for acquiring high-resolution elemental map distributions, but also for performing accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as a destructive pretreatment in inductively coupled plasma mass spectrometry-based procedures.

  18. Electron-density descriptors as predictors in quantitative structure–activity/property relationships and drug design.

    PubMed

    Matta, Chérif F; Arabi, Alya A

    2011-06-01

    The use of electron density-based molecular descriptors in drug research, particularly in quantitative structure–activity relationship/quantitative structure–property relationship studies, is reviewed. The exposition starts with a discussion of molecular similarity and transferability in terms of the underlying electron density, which leads to a qualitative introduction to the quantum theory of atoms in molecules (QTAIM). The starting point of QTAIM is the topological analysis of molecular electron-density distributions to extract atomic and bond properties that characterize every atom and bond in the molecule. These atomic and bond properties have considerable potential as bases for the construction of robust quantitative structure–activity/property relationship models, as shown by selected examples in this review. QTAIM is applicable to the electron density calculated from quantum-chemical calculations and/or that obtained from ultra-high-resolution X-ray diffraction experiments followed by nonspherical refinement. Atomic and bond properties are introduced, followed by examples of application of each of these two families of descriptors. The review ends with a study whereby the molecular electrostatic potential, uniquely determined by the density, is used in conjunction with atomic properties to elucidate the reasons for the biological similarity of bioisosteres.

  19. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues.

    PubMed

    Maccabi, Ashkan; Shin, Andrew; Namiri, Nikan K; Bajwa, Neha; St John, Maie; Taylor, Zachary D; Grundfest, Warren; Saddik, George N

    2018-01-01

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high-contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties in VA images have been difficult to verify given the limited literature on viscoelastic properties of various normal and diseased tissue. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue in rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research.
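
    The elastic part of the Hertzian analysis of spherical-tip indentation can be sketched as follows. For a rigid sphere of radius R indenting an elastic half-space, F = (4/3)·E/(1 − ν²)·√R·δ^(3/2); the tip radius, Poisson ratio, and modulus below are invented values in the soft-tissue range, and the noise-free "data" simply lets the fit recover the slope of F versus δ^(3/2):

```python
import numpy as np

def hertz_force(delta, E, nu, R):
    """Hertz contact force for a rigid spherical tip of radius R indenting
    a flat elastic sample: F = (4/3) * E_red * sqrt(R) * delta**1.5,
    with reduced modulus E_red = E / (1 - nu**2)."""
    return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

R = 50e-6        # 50 um tip radius (hypothetical)
nu = 0.5         # incompressible soft-tissue assumption
E_true = 5e3     # 5 kPa, in the soft-tissue range

delta = np.linspace(1e-6, 10e-6, 50)   # indentation depths (m)
F = hertz_force(delta, E_true, nu, R)  # synthetic force-depth curve (N)

# Fit F against delta^1.5; the slope gives back the elastic modulus.
slope = np.linalg.lstsq(delta[:, None]**1.5, F[:, None], rcond=None)[0][0, 0]
E_fit = slope * 3.0 * (1.0 - nu**2) / (4.0 * np.sqrt(R))
```

Real indentation data would add a time-dependent (viscoelastic) relaxation term on top of this elastic response, which is how the study extracts the long-term shear modulus and time constant as well.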

  20. Quantitative characterization of viscoelastic behavior in tissue-mimicking phantoms and ex vivo animal tissues

    PubMed Central

    Shin, Andrew; Namiri, Nikan K.; Bajwa, Neha; St. John, Maie; Taylor, Zachary D.; Grundfest, Warren; Saddik, George N.

    2018-01-01

    Viscoelasticity of soft tissue is often related to pathology, and therefore, has become an important diagnostic indicator in the clinical assessment of suspect tissue. Surgeons, particularly within head and neck subsites, typically use palpation techniques for intra-operative tumor detection. This detection method, however, is highly subjective and often fails to detect small or deep abnormalities. Vibroacoustography (VA) and similar methods have previously been used to distinguish tissue with high contrast, but a firm understanding of the main contrast mechanism has yet to be verified. The contributions of tissue mechanical properties to VA images have been difficult to verify given the limited literature on the viscoelastic properties of various normal and diseased tissues. This paper aims to investigate viscoelasticity theory and present a detailed description of viscoelastic experimental results obtained in tissue-mimicking phantoms (TMPs) and ex vivo tissues to verify the main contrast mechanism in VA and similar imaging modalities. A spherical-tip micro-indentation technique was employed with the Hertzian model to acquire absolute, quantitative, point measurements of the elastic modulus (E), long-term shear modulus (η), and time constant (τ) in homogeneous TMPs and ex vivo tissue from rat liver and porcine liver and gallbladder. Viscoelastic differences observed between porcine liver and gallbladder tissue suggest that imaging modalities which utilize the mechanical properties of tissue as a primary contrast mechanism can potentially be used to quantitatively differentiate between proximate organs in a clinical setting. These results may facilitate more accurate tissue modeling and add information not currently available to the field of systems characterization and biomedical research. PMID:29373598

  1. Using Neutron Spectroscopy to Obtain Quantitative Composition Data of Ganymede's Surface from the Jupiter Ganymede Orbiter

    NASA Astrophysics Data System (ADS)

    Lawrence, D. J.; Maurice, S.; Patterson, G. W.; Hibbitts, C. A.

    2010-05-01

    Understanding the global composition of Ganymede's surface is a key goal of the Europa Jupiter System Mission (EJSM) that is being jointly planned by NASA and ESA. Current plans for obtaining surface information with the Jupiter Ganymede Orbiter (JGO) use spectral imaging measurements. While spectral imaging can provide good mineralogy-related information, quantitative data about elemental abundances can often be hindered by non-composition variations due to surface effects (e.g., space weathering, grain effects, temperature, etc.). Orbital neutron and gamma-ray spectroscopy can provide quantitative composition information that is complementary to spectral imaging measurements, as has been demonstrated with similar instrumental combinations at the Moon, Mars, and Mercury. Neutron and gamma-ray measurements have successfully returned abundance information in a hydrogen-rich environment on Mars. With regard to neutrons and gamma rays, there are many similarities between the Mars and Ganymede hydrogen-rich environments. In this study, we present results of neutron transport models, which show that quantitative composition information from Ganymede's surface can be obtained in a realistic mission scenario. Thermal and epithermal neutrons are jointly sensitive to the abundances of hydrogen and neutron-absorbing elements, such as iron and titanium. These neutron measurements can discriminate between regions that are rich or depleted in neutron-absorbing elements, even in the presence of large amounts of hydrogen. Details will be presented about how the neutron composition parameters can be used to meet high-level JGO science objectives, as well as an overview of a neutron spectrometer that can meet various mission and stringent environmental requirements.

  2. Engaging narratives evoke similar neural activity and lead to similar time perception.

    PubMed

    Cohen, Samantha S; Henin, Simon; Parra, Lucas C

    2017-07-04

    It is said that we lose track of time - that "time flies" - when we are engrossed in a story. How does engagement with the story cause this distorted perception of time, and what are its neural correlates? People commit both time and attentional resources to an engaging stimulus. For narrative videos, attentional engagement can be represented as the level of similarity between the electroencephalographic responses of different viewers. Here we show that this measure of neural engagement predicted the duration of time that viewers were willing to commit to narrative videos. Contrary to popular wisdom, engagement did not distort the average perception of time duration. Rather, more similar brain responses resulted in a more uniform perception of time across viewers. These findings suggest that by capturing the attention of an audience, narrative videos bring both neural processing and the subjective perception of time into synchrony.

  3. Quantitative modeling of the reaction/diffusion kinetics of two-chemistry photopolymers

    NASA Astrophysics Data System (ADS)

    Kowalski, Benjamin Andrew

    Optically driven diffusion in photopolymers is an appealing material platform for a broad range of applications, in which the recorded refractive index patterns serve either as images (e.g. data storage, display holography) or as optical elements (e.g. custom GRIN components, integrated optical devices). A quantitative understanding of the reaction/diffusion kinetics is difficult to obtain directly, but is nevertheless necessary in order to fully exploit the wide array of design freedoms in these materials. A general strategy for characterizing these kinetics is proposed, in which key processes are decoupled and independently measured. This strategy enables prediction of a material's potential refractive index change, solely on the basis of its chemical components. The degree to which a material does not reach this potential reveals the fraction of monomer that has participated in unwanted reactions, reducing spatial resolution and dynamic range. This approach is demonstrated for a model material similar to commercial media, achieving quantitative predictions of index response over three orders of magnitude in exposure dose (~1 to ~10³ mJ cm⁻²) and three orders of magnitude in feature size (0.35 to 500 microns). The resulting insights enable guided, rational design of new material formulations with demonstrated performance improvement.

  4. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous kinetic studies, the PE company found that there is a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable; on this basis they developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnology development and clinical research, and a better quantitative PCR technique is needed. The mathematical model submitted here combines achievements of related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), initial template number, and the other reaction conditions, and accurately reflects how PCR product molecules accumulate. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated PCR product quantity can be obtained from the initial template number. With this model, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e. on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the template number is between 100 and 1,000,000, the quantitative accuracy will exceed 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with this quantitative PCR analysis system yields results up to 80 times more accurate than the CT method.
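    The exponential model underlying these ideas can be sketched in a few lines: product grows as N_c = N0·(1 + E)^c, so the cycle at which the product crosses a fixed detection threshold is linear in log(N0). The threshold and efficiency values below are hypothetical, chosen only to make the relation concrete.

```python
import math

def threshold_cycle(n0, threshold=1e10, efficiency=1.0):
    # Exponential model N_c = n0 * (1 + efficiency)^c; solve for the cycle c
    # at which the product first reaches the detection threshold.
    return math.log(threshold / n0) / math.log(1.0 + efficiency)

# A 10-fold dilution of template shifts the threshold cycle by a constant
# log2(10) ~ 3.32 cycles at perfect efficiency -- the linear relation between
# log(initial template) and detection cycle that quantitative PCR exploits.
shift = threshold_cycle(1e3) - threshold_cycle(1e4)
```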

  5. Learning deep similarity in fundus photography

    NASA Astrophysics Data System (ADS)

    Chudzik, Piotr; Al-Diri, Bashir; Caliva, Francesco; Ometto, Giovanni; Hunter, Andrew

    2017-02-01

    Similarity learning is one of the most fundamental tasks in image analysis. The ability to extract similar images in the medical domain as part of content-based image retrieval (CBIR) systems has been researched for many years. The vast majority of methods used in CBIR systems are based on hand-crafted feature descriptors. The approximation of a similarity mapping for medical images is difficult due to the wide variety of pixel-level structures of interest. In fundus photography (FP) analysis, a subtle difference in, e.g., lesion and vessel shape and size can result in a different diagnosis. In this work, we demonstrate how to learn a similarity function for image patches derived directly from FP image data, without the need for manually designed feature descriptors. We used a convolutional neural network (CNN) with a novel architecture adapted for similarity learning to accomplish this task. Furthermore, we explored and studied multiple CNN architectures. We show that our method can approximate the similarity between FP patches more efficiently and accurately than state-of-the-art feature descriptors, including SIFT and SURF, using a publicly available dataset. Finally, we observe that our approach, which is purely data-driven, learns that features such as vessel calibre and orientation are important discriminative factors, which resembles the way humans reason about similarity. To the best of the authors' knowledge, this is the first attempt to approximate a visual similarity mapping in FP.
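    As a point of contrast with the learned CNN metric, a minimal hand-crafted patch similarity (cosine similarity of flattened pixel intensities) looks like this. It is a sketch of the kind of baseline such methods are compared against, not the paper's method.

```python
import math

def cosine_similarity(patch_a, patch_b):
    # Cosine similarity between two flattened image patches: 1.0 for identical
    # direction, 0.0 for orthogonal intensity patterns.
    dot = sum(a * b for a, b in zip(patch_a, patch_b))
    na = math.sqrt(sum(a * a for a in patch_a))
    nb = math.sqrt(sum(b * b for b in patch_b))
    return dot / (na * nb)
```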

  6. Assessing agreement between preclinical magnetic resonance imaging and histology: An evaluation of their image qualities and quantitative results

    PubMed Central

    Elschner, Cindy; Korn, Paula; Hauptstock, Maria; Schulz, Matthias C.; Range, Ursula; Jünger, Diana; Scheler, Ulrich

    2017-01-01

    One consequence of demographic change is the increasing demand for biocompatible materials for use in implants and prostheses. This is accompanied by a growing number of experimental animals, because the interactions between new biomaterials and their host tissues have to be investigated. To evaluate novel materials and engineered tissues, the use of non-destructive imaging modalities has been identified as a strategic priority. This provides the opportunity to study interactions repeatedly in individual animals, with the advantages of reduced biological variability and a decreased number of laboratory animals. However, histological techniques are still the gold standard in preclinical biomaterial research. The present article demonstrates a detailed method comparison between histology and magnetic resonance imaging, including a presentation of their image qualities as well as the detailed statistical analysis for assessing agreement between quantitative measures. As an example, the bony ingrowth of tissue-engineered bone substitutes for the treatment of a cleft-like maxillary bone defect was evaluated. Using a graphical concordance analysis, the mean difference between MRI results and histomorphometric measures was examined. The analysis revealed a small but significant bias for the bone volume (bias(Histo-MRI) = 2.40%, p < 0.005) and a clearly significant deviation for the remaining defect width (bias(Histo-MRI) = -6.73%, p << 0.005). However, the study also showed a considerable effect of the analyzed section position on the quantitative result: the bias between the data sets originated less from the imaging modalities than from the evaluation of different slice positions. The article demonstrates that method comparisons do not always require an additional, independent animal study. PMID:28666026
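    The graphical concordance analysis described here is, in essence, a Bland-Altman comparison: the bias is the mean of the paired histology-minus-MRI differences, bracketed by 95% limits of agreement. A minimal sketch with made-up paired measurements:

```python
def bland_altman(hist, mri):
    # Bland-Altman summary for paired measurements: mean difference (bias)
    # and 95% limits of agreement (bias +/- 1.96 * SD of the differences).
    diffs = [h - m for h, m in zip(hist, mri)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```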

  7. Noncontiguous atom matching structural similarity function.

    PubMed

    Teixeira, Ana L; Falcao, Andre O

    2013-10-28

    Measuring similarity between molecules is a fundamental problem in cheminformatics. Given that similar molecules tend to have similar physical, chemical, and biological properties, the notion of molecular similarity plays an important role in the exploration of molecular data sets, query-retrieval in molecular databases, and in structure-property/activity modeling. Various methods to define structural similarity between molecules are available in the literature, but so far none has been used with consistent and reliable results for all situations. We propose a new similarity method based on atom alignment for the analysis of structural similarity between molecules. This method is based on the comparison of the bonding profiles of atoms on comparable molecules, including features that are seldom found in other structural or graph matching approaches, like chirality or double bond stereoisomerism. The similarity measure is then defined on the annotated molecular graph, based on an iterative directed graph similarity procedure and optimal atom alignment between atoms using a pairwise matching algorithm. With the proposed approach the similarities detected are more intuitively understood, because similar atoms in the molecules are explicitly shown. This noncontiguous atom matching structural similarity method (NAMS) was tested and compared with one of the most widely used similarity methods (fingerprint-based similarity) using three difficult data sets with different characteristics. Despite having a higher computational cost, the method performed well, being able to distinguish both different and very similar hydrocarbons that were indistinguishable using a fingerprint-based approach. NAMS also verified the similarity principle using a data set of structurally similar steroids with differences in the binding affinity to the corticosteroid binding globulin receptor, by showing that pairs of steroids with a high degree of similarity (>80%) tend to have smaller differences in binding affinity.
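    For reference, the fingerprint-based similarity that NAMS is benchmarked against is typically the Tanimoto coefficient on binary fingerprints: the ratio of shared on-bits to all on-bits. A minimal sketch (the bit positions below are hypothetical):

```python
def tanimoto(fp_a, fp_b):
    # Tanimoto coefficient between two binary fingerprints, each given as the
    # collection of its on-bit positions: |A & B| / |A | B|.
    a, b = set(fp_a), set(fp_b)
    return len(a & b) / len(a | b)
```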

  8. Similarity analyses of chromatographic herbal fingerprints: a review.

    PubMed

    Goodarzi, Mohammad; Russell, Paul J; Vander Heyden, Yvan

    2013-12-04

    Herbal medicines are becoming popular again in developed countries because they are "natural", and people thus often assume that they are inherently safe. Herbs have also been used worldwide for many centuries in traditional medicine. Concern about their safety and efficacy has grown with increasing Western interest. Herbal materials and their extracts are very complex, often including hundreds of compounds. A thorough understanding of their chemical composition is essential for conducting a safety risk assessment. However, herbal material can show considerable variability. The chemical constituents and their amounts in a herb can differ due to growing conditions, such as climate and soil, the drying process, the harvest season, etc. Among the analytical methods, chromatographic fingerprinting has been recommended as a potential and reliable methodology for the identification and quality control of herbal medicines. Identification is needed to avoid fraud and adulteration. Currently, analyzing chromatographic herbal fingerprint data sets has become one of the most applied tools in the quality assessment of herbal materials. Mostly, the entire chromatographic profiles are used to identify or to evaluate the quality of the herbs investigated. Occasionally only a limited number of compounds are considered. One approach to the safety risk assessment is to determine whether the herbal material is substantially equivalent to material that is either readily consumed in the diet, has a history of application, or has earlier been commercialized, i.e. to what is considered as reference material. In order to help determine substantial equivalence using fingerprint approaches, a quantitative measurement of similarity is required. In this paper, different (dis)similarity approaches, such as (dis)similarity metrics or exploratory analysis approaches applied to herbal medicinal fingerprints, are discussed and illustrated with several case studies. Copyright © 2013
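    Two similarity metrics commonly applied to chromatographic fingerprints are the congruence (cosine) coefficient and the Pearson correlation coefficient, each treating a chromatogram as a vector of signal intensities. A minimal sketch (the profile vectors in the tests are invented):

```python
import math

def congruence(x, y):
    # Congruence (cosine) coefficient between two chromatographic profiles.
    dot = sum(a * b for a, b in zip(x, y))
    return dot / math.sqrt(sum(a * a for a in x) * sum(b * b for b in y))

def pearson(x, y):
    # Pearson correlation coefficient: congruence after mean-centering
    # each profile, which removes baseline offsets.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return congruence([a - mx for a in x], [b - my for b in y])
```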

  9. Comparison of two quantitative fit-test methods using N95 filtering facepiece respirators.

    PubMed

    Sietsema, Margaret; Brosseau, Lisa M

    2016-08-01

    Current regulations require annual fit testing before an employee can wear a respirator during work activities. The goal of this research is to determine whether respirator fit measured with two TSI Portacount instruments simultaneously sampling ambient particle concentrations inside and outside of the respirator facepiece is similar to fit measured during an ambient aerosol condensation nuclei counter quantitative fit test. Sixteen subjects (ten female; six male) were recruited to cover a range of facial sizes. Each subject donned an N95 filtering facepiece respirator and completed two fit tests in random order (ambient aerosol condensation nuclei counter quantitative fit test and two-instrument real-time fit test) without removing or adjusting the respirator between tests. Fit tests were compared using Spearman's rank correlation coefficients. The real-time two-instrument method fit factors were similar to those measured with the single-instrument quantitative fit test. The first four exercises were highly correlated (r > 0.7) between the two protocols. Respirator fit was altered during the talking or grimace exercise, both of which involve facial movements that could dislodge the facepiece. Our analyses suggest that the new real-time two-instrument methodology can be used in future studies to evaluate fit before and during work activities.
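    Spearman's rank correlation, used above to compare the two fit-test protocols, is simply the Pearson correlation of the ranks; it asks whether the two instruments order the fit factors the same way. A self-contained sketch:

```python
def _ranks(values):
    # Average ranks (1-based); tied values share the mean of their ranks.
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    # Spearman rank correlation: Pearson correlation of the rank vectors.
    rx, ry = _ranks(x), _ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```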

  10. Development of quantitative screen for 1550 chemicals with GC-MS.

    PubMed

    Bergmann, Alan J; Points, Gary L; Scott, Richard P; Wilson, Glenn; Anderson, Kim A

    2018-05-01

    With hundreds of thousands of chemicals in the environment, effective monitoring requires high-throughput analytical techniques. This paper presents a quantitative screening method for 1550 chemicals based on statistical modeling of responses, with identification and integration performed using deconvolution reporting software. The method was evaluated with representative environmental samples. We tested biological extracts, low-density polyethylene, and silicone passive sampling devices spiked with known concentrations of 196 representative chemicals. A multiple linear regression (R² = 0.80) was developed with molecular weight, logP, polar surface area, and fractional ion abundance to predict chemical responses within a factor of 2.5. Linearity beyond the calibration had R² > 0.97 for three orders of magnitude. Median limits of quantitation were estimated to be 201 pg/μL (1.9× standard deviation). The number of detected chemicals and the accuracy of quantitation were similar for environmental samples and standard solutions. To our knowledge, this is the most precise method for the largest number of semi-volatile organic chemicals lacking authentic standards. Accessible instrumentation and software make this method cost effective in quantifying a large, customizable list of chemicals. When paired with silicone wristband passive samplers, this quantitative screen will be very useful for epidemiology, where binning of concentrations is common. Graphical abstract: A multiple linear regression of chemical responses measured with GC-MS allowed quantitation of 1550 chemicals in samples such as silicone wristbands.
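    A multiple linear regression of the kind described (response predicted from molecular weight, logP, and similar descriptors) can be fit by ordinary least squares via the normal equations. The descriptor values and coefficients below are invented purely for illustration:

```python
def fit_linear(X, y):
    # Ordinary least squares via the normal equations (X^T X) b = (X^T y),
    # solved by Gaussian elimination with partial pivoting. Each row of X is
    # a descriptor vector; the caller includes an intercept column of 1s.
    n, p = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(p)] for i in range(p)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(p)]
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, p):
            f = A[r][col] / A[col][col]
            for c in range(col, p):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * p
    for i in reversed(range(p)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, p))) / A[i][i]
    return coef

# Hypothetical rows: [intercept, molecular weight, logP].
X = [[1.0, 100.0, 2.0], [1.0, 200.0, 3.0], [1.0, 150.0, 1.0], [1.0, 120.0, 4.0]]
y = [50.0, 99.0, 76.0, 58.0]  # generated from y = 2 + 0.5*MW - 1*logP
coef = fit_linear(X, y)
```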

  11. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
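    The quantitative step described above (integrate the measured refractive-index gradient, then apply the Gladstone-Dale relation) can be sketched for one image row. The refractive index of air and the Gladstone-Dale constant below are nominal values, and the gradient samples are hypothetical:

```python
def density_from_gradient(dndx, dx, n0=1.000272, gladstone_dale=2.26e-4):
    # Integrate dn/dx (per-pixel samples, spacing dx in m) with the
    # trapezoidal rule, starting from a known boundary value n0, then
    # convert n to density via Gladstone-Dale: rho = (n - 1) / K,
    # with K ~ 2.26e-4 m^3/kg for air in visible light.
    n = [n0]
    for i in range(1, len(dndx)):
        n.append(n[-1] + 0.5 * (dndx[i - 1] + dndx[i]) * dx)
    return [(v - 1.0) / gladstone_dale for v in n]

# A uniform gradient of 1e-3 /m sampled at 1 cm spacing.
rho = density_from_gradient([1e-3, 1e-3, 1e-3], 0.01)
```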

  12. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  13. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  14. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  15. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  16. 10 CFR 26.169 - Reporting Results.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... request. The laboratory shall routinely provide quantitative values for confirmatory opiate test results... requested quantitative values for the test result. (3) For a specimen that has an adulterated or substituted... of the standard curve, the laboratory may report to the MRO that the quantitative value “exceeds the...

  17. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses

    PubMed Central

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. PMID:27146161
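    Pre/post skill improvement of this kind is often summarized with a normalized gain, g = (post - pre) / (max - pre), the fraction of the available improvement a student actually achieved. This metric is an assumption here (the abstract does not name its statistic); it is shown only to make the comparison of gains concrete:

```python
def normalized_gain(pre, post, max_score=100.0):
    # Hake-style normalized gain: improvement divided by the improvement
    # still available at pretest. 0 = no gain, 1 = reached the maximum.
    return (post - pre) / (max_score - pre)
```

    On this scale, a student moving from 40 to 70 and a student moving from 80 to 90 both achieve g = 0.5, even though their raw gains differ, which is why it is useful when pretest scores vary across groups.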

  18. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.

  19. Phonological similarity and orthographic similarity affect probed serial recall of Chinese characters.

    PubMed

    Lin, Yi-Chen; Chen, Hsiang-Yu; Lai, Yvonne C; Wu, Denise H

    2015-04-01

    The previous literature on working memory (WM) has indicated that verbal materials are dominantly retained in phonological representations, whereas other linguistic information (e.g., orthography, semantics) only contributes to verbal WM minimally, if not negligibly. Although accumulating evidence has suggested that multiple linguistic components jointly support verbal WM, the visual/orthographic contribution has rarely been addressed in alphabetic languages, possibly due to the difficulty of dissociating the effects of word forms from the effects of their pronunciations in relatively shallow orthography. In the present study, we examined whether the orthographic representations of Chinese characters support the retention of verbal materials in this language of deep orthography. In Experiments 1a and 2, we independently manipulated the phonological and orthographic similarity of horizontal and vertical characters, respectively, and found that participants' accuracy of probed serial recall was reduced by both similar pronunciations and shared phonetic radicals in the to-be-remembered stimuli. Moreover, Experiment 1b showed that only the effect of phonological, but not that of orthographic, similarity was affected by concurrent articulatory suppression. Taken together, the present results indicate the indispensable contribution of orthographic representations to verbal WM of Chinese characters, and suggest that the linguistic characteristics of a specific language not only determine long-term linguistic-processing mechanisms, but also delineate the organization of verbal WM for that language.

  20. Quantitation of valve regurgitation severity by three-dimensional vena contracta area is superior to flow convergence method of quantitation on transesophageal echocardiography.

    PubMed

    Abudiab, Muaz M; Chao, Chieh-Ju; Liu, Shuang; Naqvi, Tasneem Z

    2017-07-01

    Quantitation of regurgitation severity using the proximal isovelocity surface area (PISA) method to calculate effective regurgitant orifice (ERO) area has limitations. Measurement of three-dimensional (3D) vena contracta area (VCA) accurately grades mitral regurgitation (MR) severity on transthoracic echocardiography (TTE). We evaluated 3D VCA quantitation of regurgitant jet severity using 3D transesophageal echocardiography (TEE) in 110 native mitral, aortic, and tricuspid valves and six prosthetic valves in patients with at least mild valvular regurgitation. The ASE-recommended integrative method, comprising semiquantitative and quantitative assessment of valvular regurgitation, was used as the reference method, including ERO area by 2D PISA for assigning the severity of the regurgitation grade. Mean age was 62.2±14.4 years; 3D VCA quantitation was feasible in 91% of regurgitant valves, compared to 78% by the PISA method. When both methods were feasible, and in the presence of a single regurgitant jet, 3D VCA and 2D PISA were similar in differentiating assigned severity (ANOVA, P<.001). In valves with multiple jets, however, 3D VCA had a better correlation with assigned severity (ANOVA, P<.0001). The agreement of 2D PISA and 3D VCA with the integrative method was 47% and 58% for moderate and 65% and 88% for severe regurgitation, respectively. Measurement of 3D VCA by TEE is superior to the 2D PISA method in the determination of regurgitation severity in multiple native and prosthetic valves. © 2017, Wiley Periodicals, Inc.
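    The 2D PISA calculation that the study compares against reduces to two standard formulas: regurgitant flow across a hemispheric isovelocity shell, Q = 2πr²·Va, divided by the peak jet velocity to give the ERO area, which times the jet's velocity-time integral gives regurgitant volume. A minimal sketch with hypothetical Doppler values:

```python
import math

def ero_pisa(radius_cm, aliasing_velocity_cms, peak_velocity_cms):
    # ERO area (cm^2): hemispheric flow 2*pi*r^2*Va divided by peak velocity.
    flow = 2.0 * math.pi * radius_cm ** 2 * aliasing_velocity_cms
    return flow / peak_velocity_cms

def regurgitant_volume(ero_cm2, vti_cm):
    # Regurgitant volume (mL) = ERO area x velocity-time integral of the jet.
    return ero_cm2 * vti_cm
```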

  1. Assessing deep and shallow learning methods for quantitative prediction of acute chemical toxicity.

    PubMed

    Liu, Ruifeng; Madore, Michael; Glover, Kyle P; Feasel, Michael G; Wallqvist, Anders

    2018-05-02

    Animal-based methods for assessing chemical toxicity are struggling to meet testing demands. In silico approaches, including machine-learning methods, are promising alternatives. Recently, deep neural networks (DNNs) were evaluated and reported to outperform other machine-learning methods for quantitative structure-activity relationship modeling of molecular properties. However, most of the reported performance evaluations relied on global performance metrics, such as the root mean squared error (RMSE) between the predicted and experimental values of all samples, without considering the impact of sample distribution across the activity spectrum. Here, we carried out an in-depth analysis of DNN performance for quantitative prediction of acute chemical toxicity using several datasets. We found that the overall performance of DNN models on datasets of up to 30,000 compounds was similar to that of random forest (RF) models, as measured by the RMSE and correlation coefficients between the predicted and experimental results. However, our detailed analyses demonstrated that global performance metrics are inappropriate for datasets with a highly uneven sample distribution, because they show a strong bias toward the most populated regions of the toxicity spectrum. For highly toxic compounds, DNN and RF models trained on all samples performed much worse than the global performance metrics indicated. Surprisingly, our variable nearest neighbor method, which utilizes only structurally similar compounds to make predictions, performed reasonably well, suggesting that information from close near neighbors in the training sets is a key determinant of acute toxicity predictions.
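    The pitfall with global RMSE on imbalanced data can be shown in a few lines: when most samples sit in the low-toxicity region, the overall RMSE stays small even though the rare highly toxic samples are badly mispredicted. The toy numbers below are invented:

```python
def rmse(pred, true):
    # Root mean squared error between predicted and reference values.
    return (sum((p - t) ** 2 for p, t in zip(pred, true)) / len(pred)) ** 0.5

# 9 low-toxicity samples predicted well, 1 highly toxic sample predicted badly.
true = [1.0] * 9 + [6.0]
pred = [1.1] * 9 + [3.0]
global_rmse = rmse(pred, true)   # looks respectable (< 1.0)
toxic_rmse = rmse(pred[9:], true[9:])  # error on the toxic sample is 3.0
```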

  2. Mining Diagnostic Assessment Data for Concept Similarity

    ERIC Educational Resources Information Center

    Madhyastha, Tara; Hunt, Earl

    2009-01-01

    This paper introduces a method for mining multiple-choice assessment data for similarity of the concepts represented by the multiple choice responses. The resulting similarity matrix can be used to visualize the distance between concepts in a lower-dimensional space. This gives an instructor a visualization of the relative difficulty of concepts…

  3. Quality of courses evaluated by 'predictions' rather than opinions: Fewer respondents needed for similar results.

    PubMed

    Cohen-Schotanus, Janke; Schönrock-Adema, Johanna; Schmidt, Henk G

    2010-01-01

    A well-known problem with student surveys is a too low response rate. Experiences with predicting electoral outcomes, which required much smaller sample sizes, inspired us to adopt a similar approach to course evaluation. We expected that having respondents estimate the average opinions of their peers required fewer respondents for comparable outcomes than giving own opinions. Two course evaluation studies were performed among successive first-year medical students (N = 380 and 450, respectively). Study 1: Half the cohort gave opinions on nine questions, while the other half predicted the average outcomes. A prize was offered for the three best predictions (motivational remedy). Study 2: Half the cohort gave opinions, a quarter made predictions without a prize and a quarter made predictions with previous year's results as prior knowledge (cognitive remedy). The numbers of respondents required for stable outcomes were determined following an iterative process. Differences between numbers of respondents required and between average scores were analysed with ANOVA. In both studies, the prediction conditions required significantly fewer respondents (p < 0.001) for comparable outcomes. The informed prediction condition required the fewest respondents (N < 20). Problems with response rates can be reduced by asking respondents to predict evaluation outcomes rather than giving opinions.

  4. The similarity law for hypersonic flow and requirements for dynamic similarity of related bodies in free flight

    NASA Technical Reports Server (NTRS)

    Hamaker, Frank M; Neice, Stanford E; Wong, Thomas J

    1953-01-01

    The similarity law for nonsteady, inviscid, hypersonic flow about slender three-dimensional shapes is derived. Conclusions drawn are shown to be valid for rotational flow. Requirements for dynamic similarity of related shapes in free flight are obtained. The law is examined for steady flow about related three-dimensional shapes. Results of an experimental investigation of the pressures acting on two inclined cones are found to check the law as it applies to bodies of revolution.

  5. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment

    PubMed Central

    Ferragina, Paolo; Giancarlo, Raffaele; Greco, Valentina; Manzini, Giovanni; Valiente, Gabriel

    2007-01-01

    Background Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness is tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both based on alignments and not, seems to be available. Results We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with methods based on alignments and not. We may group our experiments into two sets. 
The first one, performed via ROC (Receiver Operating Curve)…
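    As a concrete illustration of the compression-based approach, NCD (Normalized Compression Dissimilarity) is commonly computed as NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)), where C(s) is the compressed size of s. A minimal sketch follows, using zlib as the compressor; the study above evaluates 25 compressors, and zlib here is only an illustrative stand-in.

```python
# Hedged sketch of NCD, one of the three USM approximations discussed above.
import zlib

def c(s: bytes) -> int:
    """Compressed size of s, approximating its Kolmogorov complexity."""
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Dissimilarity: near 0 for similar strings."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ACGTACGTACGTACGT" * 50
b_seq = b"ACGTACGTACGTACGT" * 50   # identical sequence
r = b"TTGCAAGCCTGATCGA" * 50       # different repeated unit
print(ncd(a, b_seq))  # small: compressing the pair adds little information
print(ncd(a, r))      # larger: the pair shares less structure
```

    Because the measure needs only a compressor, it is alignment-free and applies uniformly to genomic sequences, proteomic sequences, and structure encodings alike.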

  6. Similar herpes zoster incidence across Europe: results from a systematic literature review

    PubMed Central

    2013-01-01

    Background Herpes zoster (HZ) is caused by reactivation of the varicella-zoster virus (VZV) and mainly affects individuals aged ≥50 years. The forthcoming European launch of a vaccine against HZ (Zostavax®) prompts the need for a better understanding of the epidemiology of HZ in Europe. Therefore the aim of this systematic review was to summarize the available data on HZ incidence in Europe and to describe age-specific incidence. Methods The Medline database of the National Library of Medicine was used to conduct a comprehensive literature search of population-based studies of HZ incidence published between 1960 and 2010 carried out in the 27 member countries of the European Union, Iceland, Norway and Switzerland. The identified articles were reviewed and scored according to a reading grid including various quality criteria, and HZ incidence data were extracted and presented by country. Results The search identified 21 studies, and revealed a similar annual HZ incidence throughout Europe, varying by country from 2.0 to 4.6/1 000 person-years with no clearly observed geographic trend. Despite the fact that age groups differed from one study to another, age-specific HZ incidence rates seemed to hold steady during the review period, at around 1/1 000 children <10 years, around 2/1 000 adults aged <40 years, and around 1–4/1 000 adults aged 40–50 years. They then increased rapidly after age 50 years to around 7–8/1 000, up to 10/1 000 after 80 years of age. Our review confirms that in Europe HZ incidence increases with age, and quite drastically after 50 years of age. In all of the 21 studies included in the present review, incidence rates were higher among women than men, and this difference increased with age. This review also highlights the need to identify standardized surveillance methods to improve the comparability of data within European Union Member States and to monitor the impact of VZV immunization on the epidemiology of HZ.

  7. GPU-Meta-Storms: computing the structure similarities among massive amount of microbial community samples using GPU.

    PubMed

    Su, Xiaoquan; Wang, Xuetao; Jing, Gongchao; Ning, Kang

    2014-04-01

    The number of microbial community samples is increasing with exponential speed. Data-mining among microbial community samples could facilitate the discovery of valuable biological information that is still hidden in the massive data. However, current methods for the comparison among microbial communities are limited by their ability to process large amount of samples each with complex community structure. We have developed an optimized GPU-based software, GPU-Meta-Storms, to efficiently measure the quantitative phylogenetic similarity among massive amount of microbial community samples. Our results have shown that GPU-Meta-Storms would be able to compute the pair-wise similarity scores for 10 240 samples within 20 min, which gained a speed-up of >17 000 times compared with single-core CPU, and >2600 times compared with 16-core CPU. Therefore, the high-performance of GPU-Meta-Storms could facilitate in-depth data mining among massive microbial community samples, and make the real-time analysis and monitoring of temporal or conditional changes for microbial communities possible. GPU-Meta-Storms is implemented by CUDA (Compute Unified Device Architecture) and C++. Source code is available at http://www.computationalbioenergy.org/meta-storms.html.

  8. Quantitative agreement between [(15)O]H2O PET and model free QUASAR MRI-derived cerebral blood flow and arterial blood volume.

    PubMed

    Heijtel, D F R; Petersen, E T; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; van Bavel, E T; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2016-04-01

    The purpose of this study was to assess whether there was an agreement between quantitative cerebral blood flow (CBF) and arterial cerebral blood volume (CBVA) measurements by [(15)O]H2O positron emission tomography (PET) and model-free QUASAR MRI. Twelve healthy subjects were scanned within a week in separate MRI and PET imaging sessions, after which quantitative and qualitative agreement between both modalities was assessed for gray matter, white matter and whole brain region of interests (ROI). The correlation between CBF measurements obtained with both modalities was moderate to high (r(2): 0.28-0.60, P < 0.05), although QUASAR significantly underestimated CBF by 30% (P < 0.001). CBVA was moderately correlated (r(2): 0.28-0.43, P < 0.05), with QUASAR yielding values that were only 27% of the [(15)O]H2O-derived values (P < 0.001). Group-wise voxel statistics identified minor areas with significant contrast differences between [(15)O]H2O PET and QUASAR MRI, indicating similar qualitative CBVA and CBF information by both modalities. In conclusion, the results of this study demonstrate that QUASAR MRI and [(15)O]H2O PET provide similar CBF and CBVA information, but with systematic quantitative discrepancies. Copyright © 2016 John Wiley & Sons, Ltd.

  9. Improving quantitative gas chromatography-electron ionization mass spectrometry results using a modified ion source: demonstration for a pharmaceutical application.

    PubMed

    D'Autry, Ward; Wolfs, Kris; Hoogmartens, Jos; Adams, Erwin; Van Schepdael, Ann

    2011-07-01

    Gas chromatography-mass spectrometry is a well established analytical technique. However, mass spectrometers with electron ionization sources may suffer from signal drifts, hereby negatively influencing quantitative performance. To demonstrate this phenomenon for a real application, a static headspace-gas chromatography method in combination with electron ionization-quadrupole mass spectrometry was optimized for the determination of residual dichloromethane in coronary stent coatings. Validating the method, the quantitative performance of an original stainless steel ion source was compared to that of a modified ion source. Ion source modification included the application of a gold coating on the repeller and exit plate. Several validation aspects such as limit of detection, limit of quantification, linearity and precision were evaluated using both ion sources. It was found that, as expected, the stainless steel ion source suffered from signal drift. As a consequence, non-linearity and high RSD values for repeated analyses were obtained. An additional experiment was performed to check whether an internal standard compound would lead to better results. It was found that the signal drift patterns of the analyte and internal standard were different, consequently leading to high RSD values for the response factor. With the modified ion source however, a more stable signal was observed resulting in acceptable linearity and precision. Moreover, it was also found that sensitivity improved compared to the stainless steel ion source. Finally, the optimized method with the modified ion source was applied to determine residual dichloromethane in the coating of coronary stents. The solvent was detected but found to be below the limit of quantification. Copyright © 2011 Elsevier B.V. All rights reserved.

  10. Chest Press Exercises With Different Stability Requirements Result in Similar Muscle Damage Recovery in Resistance-Trained Men.

    PubMed

    Ferreira, Diogo V; Ferreira-Júnior, João B; Soares, Saulo R S; Cadore, Eduardo L; Izquierdo, Mikel; Brown, Lee E; Bottaro, Martim

    2017-01-01

    Ferreira, DV, Ferreira-Júnior, JB, Soares, SRS, Cadore, EL, Izquierdo, M, Brown, LE, and Bottaro, M. Chest press exercises with different stability requirements result in similar muscle damage recovery in resistance trained men. J Strength Cond Res 31(1): 71-79, 2017-This study investigated the time course of 96 hours of muscle recovery after 3 different chest press exercises with different stability requirements in resistance-trained men. Twenty-seven men (23.5 ± 3.8 years) were randomly assigned to one of the 3 groups: (a) Smith machine bench press; (b) barbell bench press; or (c) dumbbell bench press. Participants performed 8 sets of 10 repetition maximum with 2 minutes rest between sets. Muscle thickness, peak torque (PT), and soreness were measured pre, post, 24, 48, 72, and 96 hours after exercise. There were no differences in the time course of PT or muscle thickness values of the pectoralis major (p = 0.98 and p = 0.91, respectively) or elbow extensors (p = 0.07 and p = 0.86, respectively) between groups. Muscle soreness of the pectoralis major was also not different between groups (p > 0.05). However, the Smith machine and barbell groups recovered from triceps brachii muscle soreness by 72 hours after exercise (p > 0.05), whereas the dumbbell group did not present any triceps brachii muscle soreness after exercise (p > 0.05). In conclusion, resistance-trained men experience similar muscle damage recovery after Smith machine, barbell, and dumbbell chest press exercise. However, muscle soreness of the elbow extensors takes a longer time to recover after using a barbell chest press exercise.

  11. [Similarity system theory to evaluate similarity of chromatographic fingerprints of traditional Chinese medicine].

    PubMed

    Liu, Yongsuo; Meng, Qinghua; Jiang, Shumin; Hu, Yuzhu

    2005-03-01

    The similarity evaluation of the fingerprints is one of the most important problems in the quality control of traditional Chinese medicine (TCM). Similarity measures used to evaluate the similarity of the common peaks in the chromatogram of TCM have been discussed. Comparative studies were carried out among the correlation coefficient, the cosine of the angle and an improved extent similarity method, using simulated and experimental data. The correlation coefficient and the cosine of the angle are not sensitive to differences between the data sets, even after normalization. According to similarity system theory, an improved extent similarity method was proposed. The improved extent similarity is more sensitive to differences between the data sets than the correlation coefficient and the cosine of the angle, and, unlike log-transformation, it does not require the character of the data sets to be changed. The improved extent similarity can be used to evaluate the similarity of the chromatographic fingerprints of TCM.
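    The insensitivity the abstract describes can be seen directly: the cosine of the angle depends only on the pattern of peak ratios, so two chromatograms whose peak areas differ by a constant factor score as identical. A small self-contained illustration (the peak values are made up):

```python
# Hedged illustration: cosine similarity ignores proportional differences
# between fingerprint peak areas.
import math

def cosine(x, y):
    """Cosine of the angle between two peak-area vectors."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return dot / (nx * ny)

peaks = [10.0, 5.0, 2.0, 1.0]
scaled = [3 * p for p in peaks]  # same pattern, three times the amount
print(cosine(peaks, scaled))     # essentially 1.0 despite the threefold difference
```

    A measure sensitive to such magnitude differences, like the extent similarity proposed above, would penalize the threefold discrepancy instead of reporting a perfect match.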

  12. Similarity-Dissimilarity Competition in Disjunctive Classification Tasks

    PubMed Central

    Mathy, Fabien; Haladjian, Harry H.; Laurent, Eric; Goldstone, Robert L.

    2013-01-01

    Typical disjunctive artificial classification tasks require participants to sort stimuli according to rules such as “x likes cars only when black and coupe OR white and SUV.” For categories like this, increasing the salience of the diagnostic dimensions has two simultaneous effects: increasing the distance between members of the same category and increasing the distance between members of opposite categories. Potentially, these two effects respectively hinder and facilitate classification learning, leading to competing predictions for learning. Increasing salience may lead members of the same category to be considered less similar, while members of separate categories might be considered more dissimilar. This implies a similarity-dissimilarity competition between two basic classification processes. When focusing on sub-category similarity, one would expect more difficult classification when members of the same category become less similar (disregarding the increase of between-category dissimilarity); however, the increase in between-category dissimilarity predicts a less difficult classification. Our categorization study suggests that participants rely more on using dissimilarities between opposite categories than on finding similarities between sub-categories. We connect our results to rule- and exemplar-based classification models. The pattern of influences of within- and between-category similarities is challenging for simple single-process categorization systems based on rules or exemplars. Instead, our results suggest that either these processes should be integrated in a hybrid model, or that category learning operates by forming clusters within each category. PMID:23403979

  13. Modern projection of the old electroscope for nuclear radiation quantitative work and demonstrations

    NASA Astrophysics Data System (ADS)

    Oliveira Bastos, Rodrigo; Baltokoski Boch, Layara

    2017-11-01

    Although quantitative measurements in radioactivity teaching and research are often believed to be possible only with high technology, early work in this area was fully accomplished with very simple apparatus such as zinc sulphide screens and electroscopes. This article presents an experimental practice using the electroscope, which is a very simple apparatus that has been widely used for educational purposes, although generally for qualitative work. The main objective is to show the possibility of measuring radioactivity not only in qualitative demonstrations, but also in quantitative experimental practices. The experimental set-up is a low-cost ion chamber connected to an electroscope in a configuration that is very similar to that used by Marie and Pierre Curie, Rutherford, Geiger, Pacini, Hess and other great researchers from the time of the big discoveries in nuclear and high-energy particle physics. An electroscope leaf is filmed and projected, permitting the collection of quantitative data for the measurement of the 220Rn half-life, collected from the emanation of the lantern mantles. The article presents the experimental procedures and the expected results, indicating that the experiment may provide support for nuclear physics classes. These practices could spread widely to either university or school didactic laboratories, and the apparatus has the potential to allow the development of new teaching activities for nuclear physics.

  14. Rice- or pork-based diets with similar calorie and nutrient content result in different rat gut microbiota.

    PubMed

    Qi, Xiaozhe; Xu, Wentao; Guo, Mingzhang; Chen, Siyuan; Liu, Yifei; He, Xiaoyun; Huang, Kunlun

    2017-11-01

    Rice is the most important food crop, and pork is the most widely eaten meat in the world. In this study, we compared the gut microbiota of rats fed rice or pork mixed diets with similar caloric contents. The physiological indices (body weights, hematology, serum chemistry, organ weights and histopathology) of the two groups were all within the normal range. The two diets did not induce differences in the diversity of gut bacteria. However, Firmicutes were significantly higher in the rice diet group, while Bacteroidetes were enriched in the pork diet group. Butyrate and the bacterial enzymes β-glucuronidase, β-glucosidase and nitroreductase in the feces were all drastically higher in the pork diet group. This study indicates that different diets with similar calorie and nutritional composition could change the community structure but not the diversity of rat fecal microbiota.

  15. Quantitating antibody uptake in vivo: conditional dependence on antigen expression levels.

    PubMed

    Thurber, Greg M; Weissleder, Ralph

    2011-08-01

    Antibodies form an important class of cancer therapeutics, and there is intense interest in using them for imaging applications in diagnosis and monitoring of cancer treatment. Despite the expanding body of knowledge describing pharmacokinetic and pharmacodynamic interactions of antibodies in vivo, discrepancies remain over the effect of antigen expression level on tumoral uptake, with some reports indicating a relationship between uptake and expression and others showing no correlation. Using a cell line with high epithelial cell adhesion molecule expression and moderate epidermal growth factor receptor expression, fluorescent antibodies with similar plasma clearance were imaged in vivo. A mathematical model and mouse xenograft experiments were used to describe the effect of antigen expression on uptake of these high-affinity antibodies. As predicted by the theoretical model, under subsaturating conditions, uptake of the antibodies in such tumors is similar because localization of both probes is limited by delivery from the vasculature. In a separate experiment, when the tumor is saturated, the uptake becomes dependent on the number of available binding sites. In addition, targeting of small micrometastases is shown to be higher than that of larger vascularized tumors. These results are consistent with the prediction that high-affinity antibody uptake is dependent on antigen expression levels for saturating doses and on delivery for subsaturating doses. For any probe, it is imperative to understand whether quantitative uptake is a measure of biomarker expression or of transport to the region of interest. The data provide support for a predictive theoretical model of antibody uptake, enabling it to be used as a starting point for the design of more efficacious therapies and timely quantitative imaging probes.

  16. Surface similarity-based molecular query-retrieval

    PubMed Central

    Singh, Rahul

    2007-01-01

    Background Discerning the similarity between molecules is a challenging problem in drug discovery as well as in molecular biology. The importance of this problem is due to the fact that the biochemical characteristics of a molecule are closely related to its structure. Therefore molecular similarity is a key notion in investigations targeting exploration of molecular structural space, query-retrieval in molecular databases, and structure-activity modelling. Determining molecular similarity is related to the choice of molecular representation. Currently, representations with high descriptive power and physical relevance like 3D surface-based descriptors are available. Information from such representations is both surface-based and volumetric. However, most techniques for determining molecular similarity tend to focus on idealized 2D graph-based descriptors due to the complexity that accompanies reasoning with more elaborate representations. Results This paper addresses the problem of determining similarity when molecules are described using complex surface-based representations. It proposes an intrinsic, spherical representation that systematically maps points on a molecular surface to points on a standard coordinate system (a sphere). Molecular surface properties such as shape, field strengths, and effects due to field super-positioning can then be captured as distributions on the surface of the sphere. Surface-based molecular similarity is subsequently determined by computing the similarity of the surface-property distributions using a novel formulation of histogram-intersection. The similarity formulation is not only sensitive to the 3D distribution of the surface properties, but is also highly efficient to compute.
Conclusion The proposed method obviates the computationally expensive step of molecular pose-optimisation, can incorporate conformational variations, and facilitates highly efficient determination of similarity by directly comparing molecular surfaces.
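    The classic histogram-intersection measure underlying this approach compares two property distributions bin by bin, summing the overlap; the paper proposes a novel formulation of it, while the sketch below shows only the standard normalized form, with made-up histograms.

```python
# Hedged sketch of normalized histogram intersection as a similarity
# measure between two surface-property distributions.
def hist_intersection(h1, h2):
    """Normalized histogram intersection in [0, 1] for equal-length bins."""
    if len(h1) != len(h2):
        raise ValueError("histograms must have the same number of bins")
    inter = sum(min(a, b) for a, b in zip(h1, h2))
    norm = min(sum(h1), sum(h2))
    return inter / norm if norm else 0.0

print(hist_intersection([4, 2, 1], [4, 2, 1]))  # identical -> 1.0
print(hist_intersection([4, 0, 0], [0, 0, 4]))  # disjoint -> 0.0
```

    Each comparison is a single linear pass over the bins, which is consistent with the efficiency claim above: no pose optimisation or surface alignment is needed once the distributions are in hand.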

  17. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  18. Terahertz time-domain spectroscopy and quantitative analysis of metal gluconates.

    PubMed

    Li, Shaoxian; Yang, Jingqi; Zhao, Hongwei; Yang, Na; Jing, Dandan; Zhang, Jianbing; Li, Qingnuan; Han, Jiaguang

    2015-01-01

    A series of metal gluconates (Na(+), K(+), Mg(2+), Ca(2+), Fe(2+), Cu(2+), and Zn(2+)) were investigated by terahertz (THz) time-domain spectroscopy. The absorption coefficients and refractive indices of the samples were obtained in the frequency range of 0.5-2.6 THz. The gluconates showed distinct THz characteristic fingerprints, and the dissimilarities reflect their different structures, hydrogen-bond networks, and molecular interactions. In addition, some common features were observed among these gluconates, and the similarities probably come from the similar carbohydrate anion group. The X-ray powder diffraction measurements of these metal gluconates were performed, and the copper(II) gluconate was found to be amorphous, corresponding to the monotonic increase feature in the THz absorption spectrum. The results suggest that THz spectroscopy is sensitive to molecular structure and physical form. Binary and ternary mixtures of different gluconates were quantitatively analyzed based on the Beer-Lambert law. A chemical map of a tablet containing calcium D-gluconate monohydrate and α-lactose in the polyethylene host was obtained by THz imaging. The study shows that THz technology is a useful tool in pharmaceutical research and quality control applications.
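    The Beer-Lambert analysis described above treats a mixture spectrum as a concentration-weighted sum of the pure-component spectra, which reduces quantitation to a linear least-squares fit. A minimal sketch with synthetic spectra (the sampling points and absorption values are made up, not data from the study):

```python
# Hedged sketch: quantifying a binary mixture via the Beer-Lambert law,
# where the mixture spectrum is a concentration-weighted sum of
# pure-component spectra, solved by ordinary least squares.
import numpy as np

# Pure-component absorption coefficients at 5 frequency points (synthetic).
comp_a = np.array([1.0, 2.0, 0.5, 0.1, 0.2])
comp_b = np.array([0.2, 0.1, 1.5, 2.5, 0.4])

true_frac = np.array([0.7, 0.3])
mixture = true_frac[0] * comp_a + true_frac[1] * comp_b

# Stack pure spectra as columns and solve mixture = A @ frac for frac.
A = np.column_stack([comp_a, comp_b])
frac, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print(frac)  # recovers the mixing fractions [0.7, 0.3]
```

    With real spectra the fit is over many frequency points and noise, so the recovered fractions are approximate; ternary mixtures simply add a third column to the design matrix.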

  19. Two developmentally temporal quantitative trait loci underlie convergent evolution of increased branchial bone length in sticklebacks

    PubMed Central

    Erickson, Priscilla A.; Glazer, Andrew M.; Cleves, Phillip A.; Smith, Alyson S.; Miller, Craig T.

    2014-01-01

    In convergent evolution, similar phenotypes evolve repeatedly in independent populations, often reflecting adaptation to similar environments. Understanding whether convergent evolution proceeds via similar or different genetic and developmental mechanisms offers insight towards the repeatability and predictability of evolution. Oceanic populations of threespine stickleback fish, Gasterosteus aculeatus, have repeatedly colonized countless freshwater lakes and streams, where new diets lead to morphological adaptations related to feeding. Here, we show that heritable increases in branchial bone length have convergently evolved in two independently derived freshwater stickleback populations. In both populations, an increased bone growth rate in juveniles underlies the convergent adult phenotype, and one population also has a longer cartilage template. Using F2 crosses from these two freshwater populations, we show that two quantitative trait loci (QTL) control branchial bone length at distinct points in development. In both populations, a QTL on chromosome 21 controls bone length throughout juvenile development, and a QTL on chromosome 4 controls bone length only in adults. In addition to these similar developmental profiles, these QTL show similar chromosomal locations in both populations. Our results suggest that sticklebacks have convergently evolved longer branchial bones using similar genetic and developmental programmes in two independently derived populations. PMID:24966315

  20. Benefits from living together? Clades whose species use similar habitats may persist as a result of eco-evolutionary feedbacks.

    PubMed

    Prinzing, Andreas; Ozinga, Wim A; Brändle, Martin; Courty, Pierre-Emmanuel; Hennion, Françoise; Labandeira, Conrad; Parisod, Christian; Pihain, Mickael; Bartish, Igor V

    2017-01-01

    SUMMARY: Recent decades have seen declines of entire plant clades while other clades persist despite changing environments. We suggest that one reason why some clades persist is that species within these clades use similar habitats, because such similarity may increase the degree of co-occurrence of species within clades. Traditionally, co-occurrence among clade members has been suggested to be disadvantageous because of increased competition and enemy pressure. Here, we hypothesize that increased co-occurrence among clade members promotes mutualist exchange, niche expansion or hybridization, thereby helping species avoid population decline from environmental change. We review the literature and analyse published data for hundreds of plant clades (genera) within a well-studied region and find major differences in the degree to which species within clades occupy similar habitats. We tentatively show that, in clades for which species occupy similar habitats, species tend to exhibit increased co-occurrence, mutualism, niche expansion, and hybridization - and rarely decline. Consistently, throughout the geological past, clades whose species occupied similar habitats often persisted through long time-spans. Overall, for many plant species, the occupation of similar habitats among fellow clade members apparently reduced their vulnerability to environmental change. Future research should identify when and how this previously unrecognized eco-evolutionary feedback operates. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.

  1. Quantitative research.

    PubMed

    Watson, Roger

    2015-04-01

    This article describes the basic tenets of quantitative research. The concepts of dependent and independent variables are addressed and the concept of measurement and its associated issues, such as error, reliability and validity, are explored. Experiments and surveys – the principal research designs in quantitative research – are described and key features explained. The importance of the double-blind randomised controlled trial is emphasised, alongside the importance of longitudinal surveys, as opposed to cross-sectional surveys. Essential features of data storage are covered, with an emphasis on safe, anonymous storage. Finally, the article explores the analysis of quantitative data, considering what may be analysed and the main uses of statistics in analysis.

  2. Electric Field Quantitative Measurement System and Method

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R. (Inventor)

    2016-01-01

    A method and system are provided for making a quantitative measurement of an electric field. A plurality of antennas separated from one another by known distances are arrayed in a region that extends in at least one dimension. A voltage difference between at least one selected pair of antennas is measured. Each voltage difference is divided by the known distance associated with the selected pair of antennas corresponding thereto to generate a resulting quantity. The plurality of resulting quantities defined over the region quantitatively describe an electric field therein.
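The core computation described above is a finite-difference estimate of the field from voltage readings. A minimal sketch, with illustrative positions, voltages, and function names that are not from the patent:

```python
# Sketch of the field-estimation step described above: each measured
# voltage difference between a pair of antennas, divided by their known
# separation, yields one local estimate of the electric field (V/m).
# Antenna positions and voltages below are illustrative values only.

def field_estimates(positions, voltages):
    """Return (midpoint, E) pairs for adjacent antenna pairs along one axis."""
    estimates = []
    for (x1, v1), (x2, v2) in zip(zip(positions, voltages),
                                  zip(positions[1:], voltages[1:])):
        distance = x2 - x1
        e_field = (v1 - v2) / distance  # E ~ -dV/dx, finite-difference form
        estimates.append(((x1 + x2) / 2, e_field))
    return estimates

positions = [0.0, 0.1, 0.2, 0.3]   # antenna locations (m)
voltages = [10.0, 9.0, 8.0, 7.0]   # measured potentials (V)
print(field_estimates(positions, voltages))  # one estimate per adjacent pair
```

With a linearly varying potential, as here, every pair returns the same field magnitude (about 10 V/m); a non-uniform field would show up as spatial variation across the array.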

  3. Similar protein expression profiles of ovarian and endometrial high-grade serous carcinomas.

    PubMed

    Hiramatsu, Kosuke; Yoshino, Kiyoshi; Serada, Satoshi; Yoshihara, Kosuke; Hori, Yumiko; Fujimoto, Minoru; Matsuzaki, Shinya; Egawa-Takata, Tomomi; Kobayashi, Eiji; Ueda, Yutaka; Morii, Eiichi; Enomoto, Takayuki; Naka, Tetsuji; Kimura, Tadashi

    2016-03-01

    Ovarian and endometrial high-grade serous carcinomas (HGSCs) have similar clinical and pathological characteristics; however, exhaustive protein expression profiling of these cancers has yet to be reported. We performed protein expression profiling on 14 cases of HGSCs (7 ovarian and 7 endometrial) and 18 endometrioid carcinomas (9 ovarian and 9 endometrial) using iTRAQ-based exhaustive and quantitative protein analysis. We identified 828 tumour-expressed proteins and evaluated the statistical similarity of protein expression profiles between ovarian and endometrial HGSCs using unsupervised hierarchical cluster analysis (P<0.01). Using 45 statistically highly expressed proteins in HGSCs, protein ontology analysis detected two enriched terms and proteins composing each term: IMP2 and MCM2. Immunohistochemical analyses confirmed the higher expression of IMP2 and MCM2 in ovarian and endometrial HGSCs as well as in tubal and peritoneal HGSCs than in endometrioid carcinomas (P<0.01). The knockdown of either IMP2 or MCM2 by siRNA interference significantly decreased the proliferation rate of ovarian HGSC cell line (P<0.01). We demonstrated the statistical similarity of the protein expression profiles of ovarian and endometrial HGSC beyond the organs. We suggest that increased IMP2 and MCM2 expression may underlie some of the rapid HGSC growth observed clinically.

  4. Statistical similarities of pre-earthquake electromagnetic emissions to biological and economic extreme events

    NASA Astrophysics Data System (ADS)

    Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos

    2014-05-01

    The term "complex" refers to a system whose phenomenological laws, which describe its global behavior, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation for universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share.
It is demonstrated that all three dynamical systems' observables can be analyzed within the framework of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event in each of these different systems present striking quantitative

  5. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills that are required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, a special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways

  6. The baryonic self similarity of dark matter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alard, C., E-mail: alard@iap.fr

    2014-06-20

    Cosmological simulations indicate that dark matter halos have specific self-similar properties. However, halo similarity is affected by baryonic feedback. By using momentum-driven winds as a model to represent baryon feedback, an equilibrium condition is derived which directly implies the emergence of a new type of similarity. The new self-similar solution has constant acceleration at a reference radius for both dark matter and baryons. This model receives strong support from observations of galaxies. The new self-similar properties imply that the total acceleration at larger distances is scale-free, that the transition between the dark matter- and baryon-dominated regimes occurs at a constant acceleration, and that the maximum amplitude of the velocity curve at larger distances is proportional to M^(1/4). These results demonstrate that this self-similar model is consistent with the basics of modified Newtonian dynamics (MOND) phenomenology. In agreement with the observations, the coincidence between the self-similar model and MOND breaks down at the scale of clusters of galaxies. Numerical experiments show that the behavior of the density near the origin is closely approximated by an Einasto profile.

  7. Context-dependent similarity effects in letter recognition.

    PubMed

    Kinoshita, Sachiko; Robidoux, Serje; Guilbert, Daniel; Norris, Dennis

    2015-10-01

    In visual word recognition tasks, digit primes that are visually similar to letter string targets (e.g., 4/A, 8/B) are known to facilitate letter identification relative to visually dissimilar digits (e.g., 6/A, 7/B); in contrast, with letter primes, visual similarity effects have been elusive. In the present study we show that the visual similarity effect with letter primes can be made to come and go, depending on whether it is necessary to discriminate between visually similar letters. The results support a Bayesian view which regards letter recognition not as a passive activation process driven by the fixed stimulus properties, but as a dynamic evidence accumulation process for a decision that is guided by the task context.

  8. A Quantitative Gas Chromatographic Ethanol Determination.

    ERIC Educational Resources Information Center

    Leary, James J.

    1983-01-01

    Describes a gas chromatographic experiment for the quantitative determination of volume percent ethanol in water ethanol solutions. Background information, procedures, and typical results are included. Accuracy and precision of results are both on the order of two percent. (JN)

  9. Development of one novel multiple-target plasmid for duplex quantitative PCR analysis of roundup ready soybean.

    PubMed

    Zhang, Haibo; Yang, Litao; Guo, Jinchao; Li, Xiang; Jiang, Lingxi; Zhang, Dabing

    2008-07-23

    To enforce the labeling regulations of genetically modified organisms (GMOs), the application of reference molecules as calibrators is becoming essential for practical quantification of GMOs. However, the reported reference molecules with tandem marker multiple targets have been proved not suitable for duplex PCR analysis. In this study, we developed one unique plasmid molecule based on one pMD-18T vector with three exogenous target DNA fragments of Roundup Ready soybean GTS 40-3-2 (RRS), that is, CaMV35S, NOS, and RRS event fragments, plus one fragment of soybean endogenous Lectin gene. This Lectin gene fragment was separated from the three exogenous target DNA fragments of RRS by inserting one 2.6 kb DNA fragment with no relatedness to RRS detection targets in this resultant plasmid. Then, we proved that this design allows the quantification of RRS using the three duplex real-time PCR assays targeting CaMV35S, NOS, and RRS events employing this reference molecule as the calibrator. In these duplex PCR assays, the limits of detection (LOD) and quantification (LOQ) were 10 and 50 copies, respectively. For the quantitative analysis of practical RRS samples, the results of accuracy and precision were similar to those of simplex PCR assays, for instance, the quantitative results were at the 1% level, the mean bias of the simplex and duplex PCR were 4.0% and 4.6%, respectively, and the statistic analysis ( t-test) showed that the quantitative data from duplex and simplex PCR had no significant discrepancy for each soybean sample. Obviously, duplex PCR analysis has the advantages of saving the costs of PCR reaction and reducing the experimental errors in simplex PCR testing. The strategy reported in the present study will be helpful for the development of new reference molecules suitable for duplex PCR quantitative assays of GMOs.
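As background to the quantification step, GM content in event-specific real-time PCR is commonly expressed as the ratio of event-target copies to endogenous-gene copies, each read off a standard curve built from the calibrator at known copy numbers. A hedged sketch of that arithmetic; the curve parameters and Ct values below are invented for illustration, not taken from the paper:

```python
# Illustrative only: convert Ct values to copy numbers via a log-linear
# standard curve (Ct = slope * log10(copies) + intercept), then express
# GM content as the event/endogenous copy-number ratio.

def copies_from_ct(ct, slope, intercept):
    """Copy number implied by a Ct value on a log-linear standard curve."""
    return 10 ** ((ct - intercept) / slope)

def gm_percent(ct_event, ct_endogenous, slope=-3.32, intercept=38.0):
    """GM content (%) as the ratio of event to endogenous copies.
    slope=-3.32 corresponds to ~100% PCR efficiency; intercept is
    a made-up curve parameter for this sketch."""
    event = copies_from_ct(ct_event, slope, intercept)
    endo = copies_from_ct(ct_endogenous, slope, intercept)
    return 100.0 * event / endo

# An event target amplifying ~6.6 cycles later than the endogenous gene
# corresponds to roughly 1% GM content on this curve
print(round(gm_percent(34.64, 28.0), 2))
```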

  10. On the Development and Use of Large Chemical Similarity Networks, Informatics Best Practices and Novel Chemical Descriptors Towards Materials Quantitative Structure Property Relationships

    NASA Astrophysics Data System (ADS)

    Krein, Michael

    After decades of development and use in a variety of application areas, Quantitative Structure Property Relationships (QSPRs) and related descriptor-based statistical learning methods have achieved a level of infamy due to their misuse. The field is rife with past examples of overtrained models, overoptimistic performance assessment, and outright cheating in the form of explicitly removing data to fit models. These actions do not serve the community well, nor are they beneficial to future predictions based on established models. In practice, in order to select combinations of descriptors and machine learning methods that might work best, one must consider the nature and size of the training and test datasets, be aware of existing hypotheses about the data, and resist the temptation to bias structure representation and modeling to explicitly fit the hypotheses. The definition and application of these best practices is important for obtaining actionable modeling outcomes, and for setting user expectations of modeling accuracy when predicting the endpoint values of unknowns. A wide variety of statistical learning approaches, descriptor types, and model validation strategies are explored herein, with the goals of helping end users understand the factors involved in creating and using QSPR models effectively, and to better understand relationships within the data, especially by looking at the problem space from multiple perspectives. Molecular relationships are commonly envisioned in a continuous high-dimensional space of numerical descriptors, referred to as chemistry space. Descriptor and similarity metric choice influence the partitioning of this space into regions corresponding to local structural similarity. These regions, known as domains of applicability, are most likely to be successfully modeled by a QSPR. In Chapter 2, the network topology and scaling relationships of several chemistry spaces are thoroughly investigated. Chemistry spaces studied include the

  11. Quantitative Simulation of QARBM Challenge Events During Radiation Belt Enhancements

    NASA Astrophysics Data System (ADS)

    Li, W.; Ma, Q.; Thorne, R. M.; Bortnik, J.; Chu, X.

    2017-12-01

    Various physical processes are known to affect energetic electron dynamics in the Earth's radiation belts, but their quantitative effects at different times and locations in space need further investigation. This presentation focuses on discussing the quantitative roles of various physical processes that affect Earth's radiation belt electron dynamics during radiation belt enhancement challenge events (storm-time vs. non-storm-time) selected by the GEM Quantitative Assessment of Radiation Belt Modeling (QARBM) focus group. We construct realistic global distributions of whistler-mode chorus waves, adopt various versions of radial diffusion models (statistical and event-specific), and use the global evolution of other potentially important plasma waves including plasmaspheric hiss, magnetosonic waves, and electromagnetic ion cyclotron waves from all available multi-satellite measurements. These state-of-the-art wave properties and distributions on a global scale are used to calculate diffusion coefficients, that are then adopted as inputs to simulate the dynamical electron evolution using a 3D diffusion simulation during the storm-time and the non-storm-time acceleration events respectively. We explore the similarities and differences in the dominant physical processes that cause radiation belt electron dynamics during the storm-time and non-storm-time acceleration events. The quantitative role of each physical process is determined by comparing against the Van Allen Probes electron observations at different energies, pitch angles, and L-MLT regions. This quantitative comparison further indicates instances when quasilinear theory is sufficient to explain the observed electron dynamics or when nonlinear interaction is required to reproduce the energetic electron evolution observed by the Van Allen Probes.

  12. Calibration of Wide-Field Deconvolution Microscopy for Quantitative Fluorescence Imaging

    PubMed Central

    Lee, Ji-Sook; Wee, Tse-Luen (Erika); Brown, Claire M.

    2014-01-01

    Deconvolution enhances contrast in fluorescence microscopy images, especially in low-contrast, high-background wide-field microscope images, improving characterization of features within the sample. Deconvolution can also be combined with other imaging modalities, such as confocal microscopy, and most software programs seek to improve resolution as well as contrast. Quantitative image analyses require instrument calibration and with deconvolution, necessitate that this process itself preserves the relative quantitative relationships between fluorescence intensities. To ensure that the quantitative nature of the data remains unaltered, deconvolution algorithms need to be tested thoroughly. This study investigated whether the deconvolution algorithms in AutoQuant X3 preserve relative quantitative intensity data. InSpeck Green calibration microspheres were prepared for imaging, z-stacks were collected using a wide-field microscope, and the images were deconvolved using the iterative deconvolution algorithms with default settings. Afterwards, the mean intensities and volumes of microspheres in the original and the deconvolved images were measured. Deconvolved data sets showed higher average microsphere intensities and smaller volumes than the original wide-field data sets. In original and deconvolved data sets, intensity means showed linear relationships with the relative microsphere intensities given by the manufacturer. Importantly, upon normalization, the trend lines were found to have similar slopes. In original and deconvolved images, the volumes of the microspheres were quite uniform for all relative microsphere intensities. We were able to show that AutoQuant X3 deconvolution software data are quantitative. In general, the protocol presented can be used to calibrate any fluorescence microscope or image processing and analysis procedure. PMID:24688321
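The slope-comparison check described above can be sketched as follows. The intensity values are invented for illustration; the real protocol uses measured microsphere means and the manufacturer's relative intensities:

```python
# Sketch of the calibration check: fit measured mean intensities against
# the manufacturer's relative microsphere intensities, normalize each
# data set, and compare slopes. If deconvolution preserves relative
# quantitation, the normalized slopes should be similar.

def slope(xs, ys):
    """Ordinary least-squares slope of ys regressed on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def normalize(ys):
    m = max(ys)
    return [y / m for y in ys]

relative = [0.01, 0.1, 0.3, 1.0]               # manufacturer values
widefield = [120.0, 1150.0, 3500.0, 11800.0]   # illustrative means (a.u.)
deconvolved = [300.0, 2900.0, 8800.0, 29500.0]

s_wf = slope(relative, normalize(widefield))
s_dc = slope(relative, normalize(deconvolved))
print(round(s_wf, 3), round(s_dc, 3))  # similar slopes -> quantitation preserved
```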

  13. FunSimMat: a comprehensive functional similarity database

    PubMed Central

    Schlicker, Andreas; Albrecht, Mario

    2008-01-01

    Functional similarity based on Gene Ontology (GO) annotation is used in diverse applications like gene clustering, gene expression data analysis, protein interaction prediction and evaluation. However, there exists no comprehensive resource of functional similarity values although such a database would facilitate the use of functional similarity measures in different applications. Here, we describe FunSimMat (Functional Similarity Matrix, http://funsimmat.bioinf.mpi-inf.mpg.de/), a large new database that provides several different semantic similarity measures for GO terms. It offers various precomputed functional similarity values for proteins contained in UniProtKB and for protein families in Pfam and SMART. The web interface allows users to efficiently perform both semantic similarity searches with GO terms and functional similarity searches with proteins or protein families. All results can be downloaded in tab-delimited files for use with other tools. An additional XML-RPC interface gives automatic online access to FunSimMat for programs and remote services. PMID:17932054

  14. Metabolite profiling and quantitative genetics of natural variation for flavonoids in Arabidopsis

    PubMed Central

    Routaboul, Jean-Marc; Dubos, Christian; Beck, Gilles; Marquis, Catherine; Bidzinski, Przemyslaw; Loudet, Olivier; Lepiniec, Loïc

    2012-01-01

    Little is known about the range and the genetic bases of naturally occurring variation for flavonoids. Using Arabidopsis thaliana seed as a model, the flavonoid content of 41 accessions and two recombinant inbred line (RIL) sets derived from divergent accessions (Cvi-0×Col-0 and Bay-0×Shahdara) were analysed. These accessions and RILs showed mainly quantitative rather than qualitative changes. To dissect the genetic architecture underlying these differences, a quantitative trait locus (QTL) analysis was performed on the two segregating populations. Twenty-two flavonoid QTLs were detected that accounted for 11–64% of the observed trait variations, only one QTL being common to both RIL sets. Sixteen of these QTLs were confirmed and coarsely mapped using heterogeneous inbred families (HIFs). Three genes, namely TRANSPARENT TESTA (TT)7, TT15, and MYB12, were proposed to underlie their variations since the corresponding mutants and QTLs displayed similar specific flavonoid changes. Interestingly, most loci did not co-localize with any gene known to be involved in flavonoid metabolism. This latter result shows that novel functions have yet to be characterized and paves the way for their isolation. PMID:22442426

  15. A Computer-Aided Analysis Method of SPECT Brain Images for Quantitative Treatment Monitoring: Performance Evaluations and Clinical Applications.

    PubMed

    Zheng, Xiujuan; Wei, Wentao; Huang, Qiu; Song, Shaoli; Wan, Jieqing; Huang, Gang

    2017-01-01

    The objective and quantitative analysis of longitudinal single photon emission computed tomography (SPECT) images are significant for the treatment monitoring of brain disorders. Therefore, a computer aided analysis (CAA) method is introduced to extract a change-rate map (CRM) as a parametric image for quantifying the changes of regional cerebral blood flow (rCBF) in longitudinal SPECT brain images. The performances of the CAA-CRM approach in treatment monitoring are evaluated by the computer simulations and clinical applications. The results of computer simulations show that the derived CRMs have high similarities with their ground truths when the lesion size is larger than system spatial resolution and the change rate is higher than 20%. In clinical applications, the CAA-CRM approach is used to assess the treatment of 50 patients with brain ischemia. The results demonstrate that CAA-CRM approach has a 93.4% accuracy of recovered region's localization. Moreover, the quantitative indexes of recovered regions derived from CRM are all significantly different among the groups and highly correlated with the experienced clinical diagnosis. In conclusion, the proposed CAA-CRM approach provides a convenient solution to generate a parametric image and derive the quantitative indexes from the longitudinal SPECT brain images for treatment monitoring.
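Assuming the change-rate map is the voxelwise relative change of rCBF between the two scans (the paper's full pipeline also involves registration and normalization, omitted here), a minimal sketch:

```python
# Minimal sketch of a change-rate map (CRM): voxelwise relative change
# of rCBF between a baseline and a follow-up SPECT image. The 2x2
# "images" below are illustrative values only.

def change_rate_map(baseline, followup, eps=1e-6):
    """Voxelwise change rate: (follow-up - baseline) / baseline.
    eps guards against division by zero in empty voxels."""
    return [[(f - b) / (b + eps) for b, f in zip(row_b, row_f)]
            for row_b, row_f in zip(baseline, followup)]

baseline = [[100.0, 80.0], [60.0, 50.0]]   # illustrative rCBF values
followup = [[130.0, 80.0], [45.0, 50.0]]
crm = change_rate_map(baseline, followup)
# Voxels whose |change rate| exceeds a chosen threshold (e.g. 20%)
# would be flagged as recovered or deteriorated regions.
print(crm)
```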

  16. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  17. New similarity of triangular fuzzy number and its application.

    PubMed

    Zhang, Xixiang; Ma, Weimin; Chen, Liping

    2014-01-01

    The similarity of triangular fuzzy numbers is an important metric for their application. Several approaches exist to measure the similarity of triangular fuzzy numbers; however, some of them tend to produce values that are too large. To make the similarity well distributed, a new method, SIAM (Shape's Indifferent Area and Midpoint), is put forward to measure the similarity of triangular fuzzy numbers, taking the shape's indifferent area and the midpoint of two triangular fuzzy numbers into consideration. Comparison with other similarity measurements shows the effectiveness of the proposed method. It is then applied to collaborative filtering recommendation to measure users' similarity. A collaborative filtering case is used to illustrate users' similarity based on the cloud model and on triangular fuzzy numbers; the result indicates that users' similarity based on triangular fuzzy numbers achieves better discrimination. Finally, a simulated collaborative filtering recommendation system is developed that uses the cloud model and triangular fuzzy numbers to express users' comprehensive evaluation of items; the results show that the accuracy of collaborative filtering recommendation based on triangular fuzzy numbers is higher.
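The SIAM formula itself is not reproduced in the abstract. As a generic illustration of triangular-fuzzy-number similarity, here is a common vertex-distance form (explicitly not the paper's measure):

```python
# Generic illustration only: a simple similarity for triangular fuzzy
# numbers based on mean absolute vertex distance. SIAM, by contrast,
# additionally incorporates shape area and midpoint information.

def tfn_similarity(a, b):
    """Similarity of triangular fuzzy numbers a=(a1,a2,a3), b=(b1,b2,b3):
    1 minus the mean absolute vertex distance, floored at 0."""
    d = sum(abs(x - y) for x, y in zip(a, b)) / 3.0
    return max(0.0, 1.0 - d)

print(tfn_similarity((0.2, 0.4, 0.6), (0.3, 0.5, 0.7)))  # close to 0.9
print(tfn_similarity((0.1, 0.2, 0.3), (0.1, 0.2, 0.3)))  # identical -> 1.0
```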

  18. Discrepancies between qualitative and quantitative evaluation of randomised controlled trial results: achieving clarity through mixed methods triangulation.

    PubMed

    Tonkin-Crine, Sarah; Anthierens, Sibyl; Hood, Kerenza; Yardley, Lucy; Cals, Jochen W L; Francis, Nick A; Coenen, Samuel; van der Velden, Alike W; Godycki-Cwirko, Maciek; Llor, Carl; Butler, Chris C; Verheij, Theo J M; Goossens, Herman; Little, Paul

    2016-05-12

    Mixed methods are commonly used in health services research; however, data are not often integrated to explore complementarity of findings. A triangulation protocol is one approach to integrating such data. A retrospective triangulation protocol was carried out on mixed methods data collected as part of a process evaluation of a trial. The multi-country randomised controlled trial found that a web-based training in communication skills (including use of a patient booklet) and the use of a C-reactive protein (CRP) point-of-care test decreased antibiotic prescribing by general practitioners (GPs) for acute cough. The process evaluation investigated GPs' and patients' experiences of taking part in the trial. Three analysts independently compared findings across four data sets: qualitative data collected via semi-structured interviews with (1) 62 patients and (2) 66 GPs, and quantitative data collected via questionnaires with (3) 2886 patients and (4) 346 GPs. Pairwise comparisons were made between data sets and were categorised as agreement, partial agreement, dissonance or silence. Three instances of dissonance occurred in 39 independent findings. GPs and patients reported different views on the use of a CRP test. GPs felt that the test was useful in convincing patients to accept a no-antibiotic decision, but patient data suggested that this was unnecessary if a full explanation was given. Whilst qualitative data indicated all patients were generally satisfied with their consultation, quantitative data indicated highest levels of satisfaction for those receiving a detailed explanation from their GP with a booklet giving advice on self-care. Both qualitative and quantitative data sets indicated higher patient enablement for those in the communication groups who had received a booklet. Use of CRP tests does not appear to engage patients or influence illness perceptions and its effect is more centred on changing clinician behaviour.
Communication skills and the patient

  19. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    PubMed

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are
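One way to see why quantitative genotypes extend naturally to pooled samples: if each quantitative genotype is read as an expected allele dosage in [0, 2], a pool's allele frequency is simply the mean dosage halved. A sketch under that simplifying assumption, with illustrative values:

```python
# Simplifying assumption (not the paper's full likelihood framework):
# a quantitative genotype is the expected count of the B allele (0..2),
# so a pool's B-allele frequency is the mean dosage divided by 2.

def pooled_allele_freq(dosages):
    """Estimate B-allele frequency of a DNA pool from per-individual
    quantitative genotypes (expected allele dosages in [0, 2])."""
    return sum(dosages) / (2.0 * len(dosages))

# Four individuals: near-certain AA, AB, AB, and BB calls as dosages
print(pooled_allele_freq([0.02, 1.0, 0.98, 2.0]))  # ~0.5
```

The same framework accommodates uncertain genotype calls directly, since a dosage of, say, 0.98 already encodes the residual probability of a non-AB genotype.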

  20. A blended design in acute care training: similar learning results, less training costs compared with a traditional format.

    PubMed

    Dankbaar, Mary E W; Storm, Diana J; Teeuwen, Irene C; Schuit, Stephanie C E

    2014-09-01

    Introduction There is a demand for more attractive and efficient training programmes in postgraduate health care training. This retrospective study aims to show the effectiveness of a blended versus traditional face-to-face training design. For nurses in postgraduate Acute and Intensive Care training, the effectiveness of a blended course design was compared with a traditional design. Methods In a first pilot study 57 students took a traditional course (2-h lecture and 2-h workshop) and 46 students took a blended course (2-h lecture and 2-h online self-study material). Test results were compared for both groups. After positive results in the pilot study, the design was replicated for the complete programme in Acute and Intensive Care. Now 16 students followed the traditional programme (11 days face-to-face education) and 31 students did the blended programme (7 days face-to-face and 40 h online self-study). An evaluation was done after the pilot and course costs were calculated. Results Results show that the traditional and blended groups were similar regarding the main characteristics and did not differ in learning results for both the pilot and the complete programme. Student evaluations of both designs were positive; however, the blended group were more confident that they had achieved the learning objectives. Training costs were reduced substantially. Conclusion The blended training design offers an effective and attractive training solution, leading to a significant reduction in costs.

  1. Semantic similarity between ontologies at different scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qingpeng; Haglin, David J.

    In the past decade, existing and new knowledge and datasets have been encoded in different ontologies for semantic web and biomedical research. Ontologies are often very large in terms of the number of concepts and relationships, which makes the analysis of ontologies and the represented knowledge graph computationally expensive and time-consuming. As the ontologies of various semantic web and biomedical applications usually show explicit hierarchical structures, it is interesting to explore the trade-offs between ontological scale and the preservation/precision of results when analyzing ontologies. This paper presents the first effort to examine the viability of this idea by studying the relationship between scaling biomedical ontologies at different levels and the resulting semantic similarity values. We evaluate the semantic similarity between three Gene Ontology slims (Plant, Yeast, and Candida, among which the latter two belong to the same kingdom, Fungi) using four popular measures commonly applied to biomedical ontologies (Resnik, Lin, Jiang-Conrath, and SimRel). The results of this study demonstrate that, with proper selection of scaling levels and similarity measures, we can significantly reduce the size of ontologies without losing substantial detail. In particular, the performance of Jiang-Conrath and Lin was more reliable and stable than that of the other two measures in this experiment, as shown by (a) consistently indicating that Yeast and Candida are more similar (as compared to Plant) at different scales, and (b) small deviations of the similarity values after excluding a majority of nodes from several lower scales. This study provides a deeper understanding of the application of semantic similarity to biomedical ontologies, and sheds light on how to choose appropriate semantic similarity measures for biomedical engineering.
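Of the four measures named, Lin similarity has a particularly compact form. A sketch given precomputed information-content (IC) values; the IC numbers below are invented for illustration, and real values are derived from annotation corpora:

```python
# Lin similarity for two ontology terms, given their information
# content and that of their most informative common ancestor (MICA):
# sim_Lin(t1, t2) = 2 * IC(MICA) / (IC(t1) + IC(t2)).
# IC values here are illustrative, not from any real GO corpus.

def lin_similarity(ic_t1, ic_t2, ic_mica):
    """Lin similarity from information-content values."""
    return 2.0 * ic_mica / (ic_t1 + ic_t2)

# Two terms with IC 5.0 and 7.0 sharing a common ancestor with IC 3.0
print(lin_similarity(5.0, 7.0, 3.0))  # 0.5
```

Because the measure is a ratio of ICs, it stays in [0, 1] and is comparatively insensitive to uniform rescaling of the corpus, which is consistent with the stability the study reports for Lin and Jiang-Conrath across ontology scales.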

  2. Similar fecal immunochemical test results in screening and referral colorectal cancer

    PubMed Central

    van Turenhout, Sietze T; van Rossum, Leo GM; Oort, Frank A; Laheij, Robert JF; van Rijn, Anne F; Terhaar sive Droste, Jochim S; Fockens, Paul; van der Hulst, René WM; Bouman, Anneke A; Jansen, Jan BMJ; Meijer, Gerrit A; Dekker, Evelien; Mulder, Chris JJ

    2012-01-01

    AIM: To improve the interpretation of fecal immunochemical test (FIT) results in colorectal cancer (CRC) cases from screening and referral cohorts. METHODS: In this comparative observational study, two prospective cohorts of CRC cases were compared. The first cohort was obtained from 10 322 average risk subjects invited for CRC screening with FIT, of which, only subjects with a positive FIT were referred for colonoscopy. The second cohort was obtained from 3637 subjects scheduled for elective colonoscopy with a positive FIT result. The same FIT and positivity threshold (OC sensor; ≥ 50 ng/mL) was used in both cohorts. Colonoscopy was performed in all referral subjects and in FIT positive screening subjects. All CRC cases were selected from both cohorts. Outcome measurements were mean FIT results and FIT scores per tissue tumor stage (T stage). RESULTS: One hundred and eighteen patients with CRC were included in the present study: 28 cases obtained from the screening cohort (64% male; mean age 65 years, SD 6.5) and 90 cases obtained from the referral cohort (58% male; mean age 69 years, SD 9.8). The mean FIT results found were higher in the referral cohort (829 ± 302 ng/mL vs 613 ± 368 ng/mL, P = 0.02). Tissue tumor stage (T stage) distribution was different between both populations [screening population: 13 (46%) T1, eight (29%) T2, six (21%) T3, one (4%) T4 carcinoma; referral population: 12 (13%) T1, 22 (24%) T2, 52 (58%) T3, four (4%) T4 carcinoma], and higher T stage was significantly associated with higher FIT results (P < 0.001). Per tumor stage, no significant difference in mean FIT results was observed (screening vs referral: T1 498 ± 382 ng/mL vs 725 ± 374 ng/mL, P = 0.22; T2 787 ± 303 ng/mL vs 794 ± 341 ng/mL, P = 0.79; T3 563 ± 368 ng/mL vs 870 ± 258 ng/mL, P = 0.13; T4 not available). After correction for T stage in logistic regression analysis, no significant differences in mean FIT results were observed between both types of cohorts (P = 0

  3. Quantitative oral dosing of water soluble and lipophilic contaminants in the Japanese medaka (Oryzias latipes).

    PubMed

    Schultz, I R; Reed, S; Pratt, A; Skillman, A D

    2007-02-01

Quantitative oral dosing in fish can be challenging, particularly with water soluble contaminants, which can leach into the aquarium water prior to ingestion. We applied a method of bioencapsulation using newly hatched brine shrimp (Artemia franciscana) nauplii to study the toxicokinetics of five chlorinated and brominated halogenated acetic acids (HAAs), which are drinking water disinfection by-products. These results are compared to those obtained in a previous study using a polybrominated diphenyl ether (PBDE-47), a highly lipophilic chemical. The HAAs and PBDE-47 were bioencapsulated by incubating freshly hatched A. franciscana nauplii in concentrated solutions of the study chemicals for 18 h. Aliquots of the brine shrimp were quantitatively removed for chemical analysis and fed to individual fish, which were able to consume 400-500 nauplii in less than 5 min. At select times after feeding, fish were euthanized and the HAA or PBDE-47 content was determined. The absorption of HAAs was quantitatively similar to that reported in previous rodent studies: rapid absorption, with peak body levels occurring within 1-2 h, followed by a rapid decline with an elimination half-life of 0.3-3 h, depending on the HAA. PBDE-47 was absorbed more slowly, with peak levels occurring by 18 h, and was eliminated very slowly, with an elimination half-life of 281 h.
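The contrast between the two elimination rates can be made concrete with first-order kinetics; the sketch below is standard pharmacokinetic arithmetic, not code from the study, and compares the fraction of the absorbed dose remaining 24 h after peak:

```python
import math

def fraction_remaining(t_half_h, t_h):
    """Fraction of a dose left after t_h hours of first-order elimination."""
    k = math.log(2) / t_half_h   # elimination rate constant (1/h)
    return math.exp(-k * t_h)

# 24 h after peak: a fast-cleared HAA (t1/2 = 3 h) vs PBDE-47 (t1/2 = 281 h)
print(fraction_remaining(3, 24))     # 2**-8, i.e. about 0.4% remains
print(fraction_remaining(281, 24))   # ~0.94, i.e. about 94% remains
```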

  4. Three-Dimensional Biologically Relevant Spectrum (BRS-3D): Shape Similarity Profile Based on PDB Ligands as Molecular Descriptors.

    PubMed

    Hu, Ben; Kuang, Zheng-Kun; Feng, Shi-Yu; Wang, Dong; He, Song-Bing; Kong, De-Xin

    2016-11-17

The crystallized ligands in the Protein Data Bank (PDB) can be treated as the inverse shapes of the active sites of their corresponding proteins. Therefore, the shape similarity between a molecule and PDB ligands indicates how likely the molecule is to bind the corresponding targets. In this paper, we propose a shape similarity profile that can be used as a molecular descriptor for ligand-based virtual screening. First, through three-dimensional (3D) structural clustering, 300 diverse ligands were extracted from the druggable protein-ligand database sc-PDB. Then, each of the molecules under scrutiny was flexibly superimposed onto the 300 ligands. Superimpositions were scored by shape overlap and property similarity, producing a 300-dimensional similarity array termed the "Three-Dimensional Biologically Relevant Spectrum (BRS-3D)". Finally, quantitative or discriminant models were developed from the 300-dimensional descriptor using machine learning methods (support vector machines). The effectiveness of this approach was evaluated using 42 benchmark data sets from the G protein-coupled receptor (GPCR) ligand library and the GPCR decoy database (GLL/GDD). We compared the performance of BRS-3D with other state-of-the-art 2D and 3D molecular descriptors. The results showed that models built with BRS-3D performed best for most GLL/GDD data sets. We also applied BRS-3D to histone deacetylase 1 inhibitor screening and GPCR subtype selectivity prediction. The advantages and disadvantages of this approach are discussed.
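The descriptor construction can be sketched as follows. The similarity function here is a toy stand-in (the paper scores flexible 3D superimpositions by shape overlap and property similarity), and all names are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

# 300 reference "ligand shapes" (toy 8-D vectors; the real method uses 3D
# conformers flexibly superimposed and scored for shape/property overlap).
REFS = rng.normal(size=(300, 8))

def toy_similarity(mol, ref):
    # stand-in for the shape-overlap/property score of one superimposition
    return float(np.exp(-np.linalg.norm(mol - ref)))

def brs3d(mol):
    """300-dimensional similarity profile of one molecule (its 'BRS-3D')."""
    return np.array([toy_similarity(mol, ref) for ref in REFS])

descriptor = brs3d(rng.normal(size=8))
print(descriptor.shape)   # (300,)
```

Each molecule thus becomes a fixed-length vector that any standard learner (the paper uses support vector machines) can consume.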

5. The rapid quantitation of the filamentous blue-green alga Plectonema boryanum by the luciferase assay for ATP

    NASA Technical Reports Server (NTRS)

    Bush, V. N.

    1974-01-01

Plectonema boryanum is a filamentous blue-green alga. Blue-green algae have a procaryotic cellular organization similar to that of bacteria, but are usually obligate photoautotrophs, obtaining their carbon and energy from photosynthetic mechanisms similar to those of higher plants. This research deals with a comparison of three methods of quantitating filamentous populations: microscopic cell counts, the luciferase assay for ATP, and optical density measurements.

  6. Primary enzyme quantitation

    DOEpatents

    Saunders, G.C.

    1982-03-04

    The disclosure relates to the quantitation of a primary enzyme concentration by utilizing a substrate for the primary enzyme labeled with a second enzyme which is an indicator enzyme. Enzyme catalysis of the substrate occurs and results in release of the indicator enzyme in an amount directly proportional to the amount of primary enzyme present. By quantifying the free indicator enzyme one determines the amount of primary enzyme present.

  7. Conventional physical therapy and physical therapy based on reflex stimulation showed similar results in children with myelomeningocele.

    PubMed

    Aizawa, Carolina Y P; Morales, Mariana P; Lundberg, Carolina; Moura, Maria Clara D Soares de; Pinto, Fernando C G; Voos, Mariana C; Hasue, Renata H

    2017-03-01

We aimed to investigate whether infants with myelomeningocele would improve their motor ability and functional independence after ten sessions of physical therapy, and to compare the outcomes of conventional physical therapy (CPT) with those of a physical therapy program based on reflex stimulation (RPT). Twelve children were allocated to CPT (n = 6, age 18.3 months) or RPT (n = 6, age 18.2 months). The RPT involved proprioceptive neuromuscular facilitation. Children were assessed with the Gross Motor Function Measure and the Pediatric Evaluation of Disability Inventory before and after treatment. Mann-Whitney tests compared the improvement of CPT versus RPT on the two scales, and Wilcoxon tests compared scores before versus after treatment within each group. Possible correlations between the two scales were tested with Spearman correlation coefficients. Both groups showed improvement on the self-care and mobility domains of both scales. There were no differences between the groups before or after the intervention. CPT and RPT showed similar results after ten weeks of treatment.

  8. From themes to hypotheses: following up with quantitative methods.

    PubMed

    Morgan, David L

    2015-06-01

    One important category of mixed-methods research designs consists of quantitative studies that follow up on qualitative research. In this case, the themes that serve as the results from the qualitative methods generate hypotheses for testing through the quantitative methods. That process requires operationalization to translate the concepts from the qualitative themes into quantitative variables. This article illustrates these procedures with examples that range from simple operationalization to the evaluation of complex models. It concludes with an argument for not only following up qualitative work with quantitative studies but also the reverse, and doing so by going beyond integrating methods within single projects to include broader mutual attention from qualitative and quantitative researchers who work in the same field. © The Author(s) 2015.

  9. Similarity Based Semantic Web Service Match

    NASA Astrophysics Data System (ADS)

    Peng, Hui; Niu, Wenjia; Huang, Ronghuai

Semantic web service discovery aims at returning the best-matching advertised services to the service requester by comparing the semantics of the requested service with those of each advertised service. The semantics of a web service are described in terms of inputs, outputs, preconditions, and results in the Ontology Web Language for Services (OWL-S), which was formalized by the W3C. In this paper we propose an algorithm that calculates the semantic similarity of two services as a weighted average of their input and output similarities. Case studies and applications show the effectiveness of our algorithm in service matching.
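The scoring idea reduces to a weighted average; a minimal sketch with illustrative equal weights (the paper's exact weighting scheme and per-concept matcher may differ):

```python
def service_similarity(input_sims, output_sims, w_in=0.5, w_out=0.5):
    """Weighted average of mean input-concept and output-concept similarities.

    input_sims/output_sims: per-parameter semantic similarities in [0, 1],
    e.g. from an ontology-based concept matcher. Weights are illustrative.
    """
    mean_in = sum(input_sims) / len(input_sims)
    mean_out = sum(output_sims) / len(output_sims)
    return w_in * mean_in + w_out * mean_out

# Request vs. advertised service: inputs match well, outputs only partially.
print(service_similarity([0.9, 0.7], [1.0, 0.4]))   # 0.75
```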

  10. Qualification of a Quantitative Laryngeal Imaging System Using Videostroboscopy and Videokymography

    PubMed Central

    Popolo, Peter S.; Titze, Ingo R.

    2008-01-01

    Objectives: We sought to determine whether full-cycle glottal width measurements could be obtained with a quantitative laryngeal imaging system using videostroboscopy, and whether glottal width and vocal fold length measurements were repeatable and reliable. Methods: Synthetic vocal folds were phonated on a laboratory bench, and dynamic images were obtained in repeated trials by use of videostroboscopy and videokymography (VKG) with an imaging system equipped with a 2-point laser projection device for measuring absolute dimensions. Video images were also obtained with an industrial videoscope system with a built-in laser measurement capability. Maximum glottal width and vocal fold length were compared among these 3 methods. Results: The average variation in maximum glottal width measurements between stroboscopic data and VKG data was 3.10%. The average variations in width measurements between the clinical system and the industrial system were 1.93% (stroboscopy) and 3.49% (VKG). The variations in vocal fold length were similarly small. The standard deviations across trials were 0.29 mm for width and 0.48 mm for length (stroboscopy), 0.18 mm for width (VKG), and 0.25 mm for width and 0.84 mm for length (industrial). Conclusions: For stable, periodic vibration, the full extent of the glottal width can be reliably measured with the quantitative videostroboscopy system. PMID:18646436

  11. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
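A minimal PLS1 (NIPALS) implementation illustrates the kind of model involved; this is a generic sketch on toy data, not the calibration pipeline used for ChemCam:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 (NIPALS): returns coefficients B so that centered
    X @ B approximates centered y."""
    Xk, yk = X.copy(), y.copy()
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xk.T @ yk                      # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xk @ w                         # scores
        tt = t @ t
        p = (Xk.T @ t) / tt                # X loadings
        q = (yk @ t) / tt                  # y loading
        Xk = Xk - np.outer(t, p)           # deflate X and y
        yk = yk - q * t
        W.append(w); P.append(p); Q.append(q)
    W, P = np.array(W).T, np.array(P).T
    return W @ np.linalg.solve(P.T @ W, np.array(Q))

# Toy "spectra" (20 samples x 10 channels) with a linear composition signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 10))
y = X @ rng.normal(size=10)
Xc, yc = X - X.mean(0), y - y.mean()
B = pls1_fit(Xc, yc, n_components=5)
pred = Xc @ B + y.mean()
print(np.linalg.norm(pred - y) < np.linalg.norm(yc))   # True: fit beats the mean
```

PLS2 extends the same scheme to a matrix of responses, predicting several elements at once.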

  12. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike those of its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear, the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for
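The multistage form and its linear low-dose behavior can be sketched directly; the coefficients below are illustrative, not fitted values:

```python
import math

def multistage_extra_risk(dose, q):
    """Multistage model P(d) = 1 - exp(-(q0 + q1*d + q2*d^2 + ...));
    returns the extra risk (P(d) - P(0)) / (1 - P(0))."""
    poly = sum(qi * dose**i for i, qi in enumerate(q))
    p_d = 1.0 - math.exp(-poly)
    p_0 = 1.0 - math.exp(-q[0])
    return (p_d - p_0) / (1.0 - p_0)

# Illustrative coefficients: at low dose, extra risk ~ q1 * d (linear slope),
# which is the slope the LMS bounds from above.
q = [0.01, 0.5, 0.2]
print(multistage_extra_risk(1e-6, q) / 1e-6)   # ~0.5, i.e. the q1 slope
```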

  13. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation exists in the field in the quantitative methods used to measure brain amyloid burden. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although they add cost and time. Several technical variations in amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing the brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted.
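Relative tracer retention against a reference region is conventionally summarized as a standardized uptake value ratio (SUVR); the abstract does not name SUVR explicitly, so the sketch below is only the standard ratio with toy uptake values, and the DIAN pipeline involves considerably more (e.g. partial volume correction, kinetic modeling):

```python
import numpy as np

def suvr(target_voxels, reference_voxels):
    """Standardized uptake value ratio: mean target tracer uptake divided by
    mean uptake in the reference region (e.g. cerebellar cortex, brainstem)."""
    return float(np.mean(target_voxels) / np.mean(reference_voxels))

# Toy uptake values: cortex retains twice the tracer of the reference region.
print(round(suvr([2.0, 2.2, 1.8], [1.0, 1.1, 0.9]), 3))   # 2.0
```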

  14. Sedation for electroencephalography with dexmedetomidine or chloral hydrate: a comparative study on the qualitative and quantitative electroencephalogram pattern.

    PubMed

    Fernandes, Magda L; Oliveira, Welser Machado de; Santos, Maria do Carmo Vasconcellos; Gomez, Renato S

    2015-01-01

Sedation for electroencephalography in uncooperative patients is a controversial issue because the majority of sedatives, hypnotics, and general anesthetics interfere with the brain's electrical activity. Chloral hydrate (CH) is typically used for this sedation, and dexmedetomidine (DEX) was recently tested because preliminary data suggest that this drug does not affect the electroencephalogram (EEG). The aim of the present study was to compare the EEG pattern during DEX or CH sedation to test the hypothesis that both drugs exert similar effects on the EEG. A total of 17 patients underwent 2 EEGs on 2 separate occasions, one with DEX and the other with CH. The qualitative EEG variables included the phases of sleep and the background activity. The quantitative EEG analysis was performed during the first 2 minutes of the second stage of sleep. The quantitative EEG variables included the density, duration, and amplitude of the sleep spindles and the absolute spectral power. The results showed that the qualitative analysis and the density, duration, and amplitude of sleep spindles did not differ between DEX and CH sedation. The power of the slow-frequency bands (δ and θ) was higher with DEX, but the power of the faster-frequency bands (α and β) was higher with CH. The total power was lower with DEX than with CH. The differences between DEX and CH in EEG power did not change the qualitative EEG interpretation, which was similar with the 2 drugs. Other studies comparing natural sleep and sleep induced by these drugs are needed to clarify the clinical relevance of the observed quantitative EEG differences.
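Absolute spectral power in the named frequency bands can be computed with a simple periodogram; this is a generic sketch on a synthetic signal, not the study's analysis pipeline:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Absolute spectral power in [f_lo, f_hi) Hz via a raw periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return float(np.sum(psd[mask]))

# Synthetic "EEG": a strong 2 Hz (delta) wave plus a weak 20 Hz (beta) wave.
fs = 256
t = np.arange(fs * 4) / fs
eeg = np.sin(2 * np.pi * 2 * t) + 0.2 * np.sin(2 * np.pi * 20 * t)
delta = band_power(eeg, fs, 0.5, 4)
beta = band_power(eeg, fs, 13, 30)
print(delta > beta)   # True: the slow band dominates this toy signal
```

In practice a windowed estimator (e.g. Welch's method) would be preferred over a raw periodogram.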

  15. Perceptual similarity of regional dialects of American English

    PubMed Central

    Clopper, Cynthia G.; Levi, Susannah V.; Pisoni, David B.

    2012-01-01

    Previous research on the perception of dialect variation has measured the perceptual similarity of talkers based on regional dialect using only indirect methods. In the present study, a paired comparison similarity ratings task was used to obtain direct measures of perceptual similarity. Naive listeners were asked to make explicit judgments about the similarity of a set of talkers based on regional dialect. The talkers represented four regional varieties of American English and both genders. Results revealed an additive effect of gender and dialect on mean similarity ratings and two primary dimensions of perceptual dialect similarity: geography (northern versus southern varieties) and dialect markedness (many versus few characteristic properties). The present findings are consistent with earlier research on the perception of dialect variation, as well as recent speech perception studies which demonstrate the integral role of talker gender in speech perception. PMID:16454310
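Recovering spatial dimensions from pairwise similarity judgments is classically done with multidimensional scaling; below is a minimal classical-MDS sketch on toy one-dimensional data (the study's own analysis may differ in detail):

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed an n x n dissimilarity matrix D
    into k dimensions via double centering and eigendecomposition."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J                 # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]          # largest eigenvalues first
    return vecs[:, order] * np.sqrt(np.maximum(vals[order], 0.0))

# Toy "talkers" at 1-D positions 0, 1, 3; MDS recovers the line (up to sign).
pts = np.array([0.0, 1.0, 3.0])
D = np.abs(pts[:, None] - pts[None, :])
X = classical_mds(D, k=1)
print(np.round(np.abs(X[:, 0] - X[0, 0]), 6))   # distances from first point: 0, 1, 3
```

With ratings data, dissimilarities would come from the paired-comparison judgments rather than physical distances.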

  16. Qualitative and quantitative effects of harmonic echocardiographic imaging on endocardial edge definition and side-lobe artifacts

    NASA Technical Reports Server (NTRS)

    Rubin, D. N.; Yazbek, N.; Garcia, M. J.; Stewart, W. J.; Thomas, J. D.

    2000-01-01

    Harmonic imaging is a new ultrasonographic technique that is designed to improve image quality by exploiting the spontaneous generation of higher frequencies as ultrasound propagates through tissue. We studied 51 difficult-to-image patients with blinded side-by-side cineloop evaluation of endocardial border definition by harmonic versus fundamental imaging. In addition, quantitative intensities from cavity versus wall were compared for harmonic versus fundamental imaging. Harmonic imaging improved left ventricular endocardial border delineation over fundamental imaging (superior: harmonic = 71.1%, fundamental = 18.7%; similar: 10.2%; P <.001). Quantitative analysis of 100 wall/cavity combinations demonstrated brighter wall segments and more strikingly darker cavities during harmonic imaging (cavity intensity on a 0 to 255 scale: fundamental = 15.6 +/- 8.6; harmonic = 6.0 +/- 5.3; P <.0001), which led to enhanced contrast between the wall and cavity (1.89 versus 1.19, P <.0001). Harmonic imaging reduces side-lobe artifacts, resulting in a darker cavity and brighter walls, thereby improving image contrast and endocardial delineation.

  17. Automated Quantitative Spectral Classification of Stars in Areas of the main Meridional Section of the Galaxy

    NASA Astrophysics Data System (ADS)

    Shvelidze, T. D.; Malyuto, V. D.

Quantitative spectral classification of F, G, and K stars in areas of the main meridional section of the Galaxy, for which proper motion data are available, has been performed with the 70-cm telescope of the Abastumani Astrophysical Observatory. Fundamental parameters have been obtained for 333 stars in four areas. Space densities of stars of different spectral types, the stellar luminosity function, and the relationships between the kinematics and metallicity of stars have been studied. The results have confirmed and complemented the conclusions drawn from previous spectroscopic and photometric surveys. Many plates have been obtained for other important directions in the sky: the Kapteyn areas, the Galactic anticentre, and the main meridional section of the Galaxy. These data can be treated with the same quantitative method applied here. The method may also be applied to other available and future spectroscopic data of similar resolution, notably data obtained with large-format CCD detectors on Schmidt-type telescopes.

  18. Quantitative Oxygenation Venography from MRI Phase

    PubMed Central

    Fan, Audrey P.; Bilgic, Berkin; Gagnon, Louis; Witzel, Thomas; Bhat, Himanshu; Rosen, Bruce R.; Adalsteinsson, Elfar

    2014-01-01

    Purpose To demonstrate acquisition and processing methods for quantitative oxygenation venograms that map in vivo oxygen saturation (SvO2) along cerebral venous vasculature. Methods Regularized quantitative susceptibility mapping (QSM) is used to reconstruct susceptibility values and estimate SvO2 in veins. QSM with ℓ1 and ℓ2 regularization are compared in numerical simulations of vessel structures with known magnetic susceptibility. Dual-echo, flow-compensated phase images are collected in three healthy volunteers to create QSM images. Bright veins in the susceptibility maps are vectorized and used to form a three-dimensional vascular mesh, or venogram, along which to display SvO2 values from QSM. Results Quantitative oxygenation venograms that map SvO2 along brain vessels of arbitrary orientation and geometry are shown in vivo. SvO2 values in major cerebral veins lie within the normal physiological range reported by 15O positron emission tomography. SvO2 from QSM is consistent with previous MR susceptometry methods for vessel segments oriented parallel to the main magnetic field. In vessel simulations, ℓ1 regularization results in less than 10% SvO2 absolute error across all vessel tilt orientations and provides more accurate SvO2 estimation than ℓ2 regularization. Conclusion The proposed analysis of susceptibility images enables reliable mapping of quantitative SvO2 along venograms and may facilitate clinical use of venous oxygenation imaging. PMID:24006229
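The ℓ2-regularized case has a closed form, which is one reason it is a common baseline (ℓ1 regularization instead requires an iterative solver such as ISTA/FISTA). Below is a generic Tikhonov sketch, not the QSM dipole-inversion code itself:

```python
import numpy as np

def tikhonov_solve(A, b, lam):
    """ell-2 (Tikhonov) regularized inversion:
    argmin_x ||A x - b||^2 + lam * ||x||^2, via the closed-form normal equations."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

# Well-posed toy check: with lam -> 0 and invertible A we recover x exactly.
A = np.array([[2.0, 0.0], [0.0, 4.0]])
x_true = np.array([1.0, -1.0])
x_hat = tikhonov_solve(A, A @ x_true, lam=1e-12)
print(np.round(x_hat, 6))   # [ 1. -1.]
```

In QSM, A is the (ill-conditioned) dipole-convolution operator, and the regularizer is what makes the susceptibility inversion stable.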

  19. Similar Squamous Cell Carcinoma Epithelium microRNA Expression in Never Smokers and Ever Smokers

    PubMed Central

    Kolokythas, Antonia; Zhou, Yalu; Schwartz, Joel L.; Adami, Guy R.

    2015-01-01

The incidence of oral tumors in patients who have never used mutagenic agents such as tobacco is increasing. In an effort to better understand these tumors, we studied microRNA (miRNA) expression in the tumor epithelium of never tobacco users, the tumor epithelium of ever tobacco users, and nonpathological control oral epithelium. A comparison of levels among 372 miRNAs in 12 never tobacco users with oral squamous cell carcinoma (OSCC) versus 10 healthy controls was made using the reverse transcription quantitative polymerase chain reaction. A similar analysis was done with 8 ever tobacco users with OSCC. These comparisons revealed miR-10b-5p, miR-196a-5p, and miR-31-5p as enriched in the tumor epithelium of OSCC in both never and ever tobacco users. Examination of The Cancer Genome Atlas (TCGA) project miRNA data on 305 OSCCs and 30 controls revealed that 100% of the miRNAs enriched in never smoker OSCCs in this patient group were also enriched in ever smoker OSCCs. Nonsupervised clustering of TCGA OSCCs was suggestive of two or four subgroups of tumors based on miRNA levels, with limited evidence for differences in tobacco exposure among the groups. Results from both patient groups together stress the importance of miR-196a-5p in OSCC malignancy in both never and ever smokers, and emphasize the overall similarity of miRNA expression in OSCCs in these two risk groups. This implies that there may be great similarity in the etiology of OSCC in never and ever smokers and that classifying OSCC based on tobacco exposure may not be helpful in the clinic. PMID:26544609

  20. Structure modification and functionality of whey proteins: quantitative structure-activity relationship approach.

    PubMed

    Nakai, S; Li-Chan, E

    1985-10-01

    According to the original idea of quantitative structure-activity relationship, electric, hydrophobic, and structural parameters should be taken into consideration for elucidating functionality. Changes in these parameters are reflected in the property of protein solubility upon modification of whey proteins by heating. Although solubility is itself a functional property, it has been utilized to explain other functionalities of proteins. However, better correlations were obtained when hydrophobic parameters of the proteins were used in conjunction with solubility. Various treatments reported in the literature were applied to whey protein concentrate in an attempt to obtain whipping and gelling properties similar to those of egg white. Mapping simplex optimization was used to search for the best results. Improvement in whipping properties by pepsin hydrolysis may have been due to higher protein solubility, and good gelling properties resulting from polyphosphate treatment may have been due to an increase in exposable hydrophobicity. However, the results of angel food cake making were still unsatisfactory.

  1. An integrative strategy for quantitative analysis of the N-glycoproteome in complex biological samples

    PubMed Central

    2014-01-01

Background The complexity of protein glycosylation makes it difficult to characterize glycosylation patterns on a proteomic scale. In this study, we developed an integrated strategy for quantitatively and comparatively analyzing N-glycosylation/glycoproteins from complex biological samples in a high-throughput manner. This strategy entailed separating and enriching glycopeptides/glycoproteins using lectin affinity chromatography, then tandem labeling them with 18O/16O to generate a mass shift of 6 Da between the paired glycopeptides, and finally analyzing them with liquid chromatography-mass spectrometry (LC-MS) and the automatic quantitative method we developed based on Mascot Distiller. Results The accuracy and repeatability of this strategy were first verified using standard glycoproteins; linearity was maintained within a range of 1:10–10:1. The peptide concentration ratios obtained by the self-built quantitative method were similar to both the manually calculated and theoretical values, with a standard deviation (SD) of 0.023–0.186 for glycopeptides. The feasibility of the strategy was further confirmed with serum from hepatocellular carcinoma (HCC) patients and healthy individuals; the expression of 44 glycopeptides and 30 glycoproteins differed significantly between HCC patient and control serum. Conclusions This strategy is accurate, repeatable, and efficient, and may be a useful tool for identification of disease-related N-glycosylation/glycoprotein changes. PMID:24428921

  2. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  3. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport of intensity (TIE) equation and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
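Under the simplifying assumption of uniform intensity, the TIE reduces to a Poisson equation that can be inverted with FFTs; the sketch below is a minimal generic solver of that reduced problem, not the VHM pipeline (which handles non-uniform intensity and real multi-plane data):

```python
import numpy as np

def tie_phase(dIdz, I0, k, dx):
    """Recover phase from the transport-of-intensity equation under uniform
    intensity I0:  laplacian(phi) = -(k / I0) * dI/dz, solved with an FFT
    Poisson inversion (the DC phase component is arbitrary)."""
    n = dIdz.shape[0]
    f = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(f, f, indexing="ij")
    lap = -4.0 * np.pi**2 * (FX**2 + FY**2)   # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                           # avoid division by zero at DC
    rhs = -(k / I0) * dIdz
    phi = np.fft.ifft2(np.fft.fft2(rhs) / lap).real
    return phi - phi.mean()                   # phase is defined up to a constant

# Sanity check: a z-uniform intensity stack (dI/dz = 0) implies flat phase.
flat = tie_phase(np.zeros((16, 16)), I0=1.0, k=2 * np.pi / 500e-9, dx=1e-6)
print(np.allclose(flat, 0.0))   # True
```

The VHM's contribution is obtaining the multiple focal planes (and hence dI/dz) in a single exposure via the multiplexed hologram.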

  4. Natural texture retrieval based on perceptual similarity measurement

    NASA Astrophysics Data System (ADS)

    Gao, Ying; Dong, Junyu; Lou, Jianwen; Qi, Lin; Liu, Jun

    2018-04-01

A typical texture retrieval system performs feature comparison and might not be able to make human-like judgments of image similarity. Meanwhile, it is commonly known that perceptual texture similarity is difficult to describe with traditional image features. In this paper, we propose a new texture retrieval scheme based on perceptual texture similarity. The key of the proposed scheme is that perceptual similarity is predicted by learning a non-linear mapping from the image feature space to the perceptual texture space using a Random Forest. We test the method on a natural texture dataset and apply it to a new wallpaper dataset. Experimental results demonstrate that the proposed texture retrieval scheme with perceptual similarity improves retrieval performance over traditional image features.
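The core idea, learning a non-linear map from feature space to perceptual scores, can be illustrated with a toy ensemble. As a hand-rolled stand-in for the paper's Random Forest, the sketch below bootstrap-aggregates regression stumps on one-dimensional data:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_stump(x, y):
    """Fit a one-split regression stump by brute-force threshold search."""
    best = None
    for thr in np.unique(x):
        left, right = y[x <= thr], y[x > thr]
        if len(left) == 0 or len(right) == 0:
            continue
        err = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if best is None or err < best[0]:
            best = (err, thr, left.mean(), right.mean())
    _, thr, lo, hi = best
    return lambda q: np.where(q <= thr, lo, hi)

def fit_bagged_stumps(x, y, n_trees=25):
    """Average many stumps, each trained on a bootstrap resample (bagging)."""
    models = []
    for _ in range(n_trees):
        idx = rng.integers(0, len(x), len(x))
        models.append(fit_stump(x[idx], y[idx]))
    return lambda q: np.mean([m(q) for m in models], axis=0)

# Toy non-linear "perceptual response" to a single image-feature score.
x = np.linspace(0, 1, 50)
y = (x > 0.5).astype(float)
predict = fit_bagged_stumps(x, y)
print(predict(np.array([0.1, 0.9])))   # -> [0. 1.]
```

A real Random Forest adds random feature subsetting and deep trees, but the bagged-ensemble regression principle is the same.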

  5. Quantitative cell biology: the essential role of theory.

    PubMed

    Howard, Jonathon

    2014-11-05

    Quantitative biology is a hot area, as evidenced by the recent establishment of institutes, graduate programs, and conferences with that name. But what is quantitative biology? What should it be? And how can it contribute to solving the big questions in biology? The past decade has seen very rapid development of quantitative experimental techniques, especially at the single-molecule and single-cell levels. In this essay, I argue that quantitative biology is much more than just the quantitation of these experimental results. Instead, it should be the application of the scientific method by which measurement is directed toward testing theories. In this view, quantitative biology is the recognition that theory and models play critical roles in biology, as they do in physics and engineering. By tying together experiment and theory, quantitative biology promises a deeper understanding of underlying mechanisms when the theory works, or new discoveries when it does not. © 2014 Howard. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  6. How do biological systems discriminate among physically similar ions?

    PubMed

    Diamond, J M

    1975-10-01

    This paper reviews the history of understanding how biological systems can discriminate so strikingly among physically similar ions, especially alkali cations. Appreciation of qualitative regularities ("permitted sequences") and quantitative regularities ("selectivity isotherms") in ion selectivity grew first from studies of ion exchangers and glass electrodes, then of biological systems such as enzymes and cell membranes, and most recently of lipid bilayers doped with model pores and carriers. Discrimination of ions depends on both electrostatic and steric forces. "Black-box" studies on intact biological membranes have in some cases yielded molecular clues to the structure of the actual biological pores and carriers. Major current problems involve the extraction of these molecules; how to do it, what to do when it is achieved, and how (and if) it is relevant to the central problems of membrane function. Further advances are expected soon from studies of rate barriers within membranes, of voltage-dependent ("excitable") conducting channels, and of increasingly complex model systems and biological membranes.

  7. Learning context-sensitive shape similarity by graph transduction.

    PubMed

    Bai, Xiang; Yang, Xingwei; Latecki, Longin Jan; Liu, Wenyu; Tu, Zhuowen

    2010-05-01

    Shape similarity and shape retrieval are very important topics in computer vision. The recent progress in this domain has been mostly driven by designing smart shape descriptors for providing better similarity measures between pairs of shapes. In this paper, we provide a new perspective on this problem by considering the existing shapes as a group, and study their similarity measures to the query shape in a graph structure. Our method is general and can be built on top of any existing shape similarity measure. For a given similarity measure, a new similarity is learned through graph transduction. The new similarity is learned iteratively so that the neighbors of a given shape influence its final similarity to the query. The basic idea here is related to PageRank ranking, which forms a foundation of Google Web search. The presented experimental results demonstrate that the proposed approach yields significant improvements over state-of-the-art shape matching algorithms. We obtained a retrieval rate of 91.61 percent on the MPEG-7 data set, which is the highest ever reported in the literature. Moreover, the similarity learned by the proposed method also achieves promising improvements on both shape classification and shape clustering.
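    A generic similarity-diffusion step of this kind can be sketched as follows (the exact update rules and normalizations in the paper differ; this is only the PageRank-like idea):

```python
import numpy as np

def graph_transduction(W, query_idx, alpha=0.25, iters=50):
    """Propagate similarity to a query through the shape graph.

    W: symmetric pairwise-similarity matrix (higher = more similar).
    Starting from the raw similarities to the query, each shape's score
    is repeatedly mixed with its neighbors' scores, so the manifold of
    existing shapes reshapes the final ranking.
    """
    P = W / W.sum(axis=1, keepdims=True)     # row-stochastic transition matrix
    y = W[:, query_idx].astype(float)        # initial similarity to the query
    f = y.copy()
    for _ in range(iters):
        f = alpha * P @ f + (1 - alpha) * y  # diffuse, then re-anchor to query
    return f
```

    Shapes in the same cluster as the query gain score from their neighbors, which is how context improves on pairwise matching alone.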

  8. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  9. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234

  10. Phoneme Similarity and Confusability

    ERIC Educational Resources Information Center

    Bailey, T.M.; Hahn, U.

    2005-01-01

    Similarity between component speech sounds influences language processing in numerous ways. Explanation and detailed prediction of linguistic performance consequently requires an understanding of these basic similarities. The research reported in this paper contrasts two broad classes of approach to the issue of phoneme similarity-theoretically…

  11. LC–MS/MS Quantitation of Esophagus Disease Blood Serum Glycoproteins by Enrichment with Hydrazide Chemistry and Lectin Affinity Chromatography

    PubMed Central

    2015-01-01

    Changes in glycosylation have been shown to have a profound correlation with development/malignancy in many cancer types. Currently, two major enrichment techniques have been widely applied in glycoproteomics, namely, lectin affinity chromatography (LAC)-based and hydrazide chemistry (HC)-based enrichments. Here we report the LC–MS/MS quantitative analyses of human blood serum glycoproteins and glycopeptides associated with esophageal diseases by LAC- and HC-based enrichment. The separate and complementary qualitative and quantitative data analyses of protein glycosylation were performed using both enrichment techniques. Chemometric and statistical evaluations (PCA plots and ANOVA tests, respectively) were employed to determine and confirm candidate cancer-associated glycoprotein/glycopeptide biomarkers. Of the 139 glycoproteins identified, 59 (a 42% overlap) were common to both enrichment techniques. This overlap is very similar to previously published studies. The quantitation and evaluation of significantly changed glycoproteins/glycopeptides are complementary between LAC and HC enrichments. LC–ESI–MS/MS analyses indicated that 7 glycoproteins enriched by LAC and 11 glycoproteins enriched by HC showed significantly different abundances between disease-free and disease cohorts. Multiple reaction monitoring quantitation found 13 glycopeptides by LAC enrichment and 10 glycosylation sites by HC enrichment to be statistically different among disease cohorts. PMID:25134008

  12. Optimization of homonuclear 2D NMR for fast quantitative analysis: application to tropine-nortropine mixtures.

    PubMed

    Giraudeau, Patrick; Guignard, Nadia; Hillion, Emilie; Baguet, Evelyne; Akoka, Serge

    2007-03-12

    Quantitative analysis by ¹H NMR is often hampered by heavily overlapping signals that may occur for complex mixtures, especially those containing similar compounds. Two-dimensional homonuclear NMR spectroscopy can overcome this difficulty. A thorough review of acquisition and post-processing parameters was carried out to obtain accurate and precise quantitative 2D J-resolved and DQF-COSY spectra in a much reduced time, thus limiting spectrometer instabilities over the course of the experiment. The number of t₁ increments was reduced as much as possible, and the standard deviation was improved by optimization of the spectral width, number of transients, phase cycling and apodization function. Localized polynomial baseline corrections were applied to the relevant chemical shift areas. Our method was applied to tropine-nortropine mixtures. Quantitative J-resolved spectra were obtained in less than 3 min and quantitative DQF-COSY spectra in 12 min, with an accuracy of 3% for J-spectroscopy and 2% for DQF-COSY, and a standard deviation smaller than 1%.

  13. Limiting similarity of competitive species and demographic stochasticity

    NASA Astrophysics Data System (ADS)

    Zheng, Xiu-Deng; Deng, Ling-Ling; Qiang, Wei-Ya; Cressman, Ross; Tao, Yi

    2017-04-01

    The limiting similarity of competitive species and its relationship with the competitive exclusion principle remains one of the most important concepts in ecology. In the 1970s, May [R. M. May, Stability and Complexity in Model Ecosystems (Princeton University Press, Princeton, NJ, 1973)] developed a concise theoretical framework to investigate the limiting similarity of competitive species. His theoretical results show that no limiting similarity threshold can be identified in the deterministic model system such that species more similar than the threshold never coexist. Theoretically, for competitive species coexisting in an unvarying environment, deterministic interspecific interactions and demographic stochasticity can be considered two sides of a coin. To investigate how the "tension" between these two forces affects the coexistence of competing species, a simple two-species competitive system based on May's model system is transformed into an equivalent replicator equation. The effect of demographic stochasticity on the system's stability is measured by the expected drift of the Lyapunov function. Our main results show that the limiting similarity of competitive species should not be considered an absolute measure. Specifically, very similar competitive species should be able to coexist in an environment with a high productivity level, but large differences between competitive species should be necessary in an ecosystem with a low productivity level.
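    The deterministic skeleton of May-type limiting similarity is symmetric two-species Lotka-Volterra competition, where the overlap coefficient alpha plays the role of species similarity. A minimal sketch with illustrative parameter values (the paper's stochastic replicator analysis is not reproduced; its point is that demographic noise, absent here, further constrains coexistence as alpha approaches 1):

```python
import numpy as np

def lotka_volterra(alpha, r=1.0, K=1.0, dt=0.01, steps=100_000):
    """Two competing species with symmetric competition coefficient
    alpha (niche overlap: alpha -> 1 means ecologically identical).
    In this deterministic limit the species coexist for any alpha < 1,
    settling at the symmetric equilibrium n* = K / (1 + alpha)."""
    n = np.array([0.3, 0.6])
    for _ in range(steps):
        growth = r * n * (1 - (n + alpha * n[::-1]) / K)
        n = np.maximum(n + dt * growth, 0.0)  # forward-Euler step, clipped at 0
    return n
```

    For alpha = 0.5 both densities converge to K/(1 + alpha) = 2/3; only adding stochasticity makes the effective similarity limit appear.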

  14. Competitive RT-PCR Strategy for Quantitative Evaluation of the Expression of Tilapia (Oreochromis niloticus) Growth Hormone Receptor Type I

    PubMed Central

    2009-01-01

    Quantitation of gene expression requires an accurate measurement of a specific transcript. In this paper, a competitive quantitative reverse transcription-polymerase chain reaction (RT-PCR) assay for tilapia growth hormone receptor type I is designed and validated. This experimental procedure was used to determine the abundance of the growth hormone receptor type I transcript in different tilapia tissues. The results obtained with this competitive RT-PCR were similar to real-time PCR results reported recently. This protocol provides a reliable and less expensive alternative to real-time PCR for quantifying specific genes. PMID:19495916

  15. Contrasting Ecosystem-Effects of Morphologically Similar Copepods

    PubMed Central

    Matthews, Blake; Hausch, Stephen; Winter, Christian; Suttle, Curtis A.; Shurin, Jonathan B.

    2011-01-01

    Organisms alter the biotic and abiotic conditions of ecosystems. They can modulate the availability of resources to other species (ecosystem engineering) and shape selection pressures on other organisms (niche construction). Very little is known about how the engineering effects of organisms vary among and within species, and, as a result, the ecosystem consequences of species diversification and phenotypic evolution are poorly understood. Here, using a common gardening experiment, we test whether morphologically similar species and populations of Diaptomidae copepods (Leptodiaptomus ashlandi, Hesperodiaptomus franciscanus, Skistodiaptomus oregonensis) have similar or different effects on the structure and function of freshwater ecosystems. We found that copepod species had contrasting effects on algal biomass, ammonium concentrations, and sedimentation rates, and that copepod populations had contrasting effects on prokaryote abundance, sedimentation rates, and gross primary productivity. The average size of ecosystem-effect contrasts between species was similar to those between populations, and was comparable to those between fish species and populations measured in previous common gardening experiments. Our results suggest that subtle morphological variation among and within species can cause multifarious and divergent ecosystem-effects. We conclude that using morphological trait variation to assess the functional similarity of organisms may underestimate the importance of species and population diversity for ecosystem functioning. PMID:22140432

  16. Tool independence for the Web Accessibility Quantitative Metric.

    PubMed

    Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio

    2009-07-01

    The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM is that it is evaluation-tool independent for ranking and accessibility monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples, and those values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining the same scores is undesirable since evaluation tools behave in different ways.

  17. Compression-based classification of biological sequences and structures via the Universal Similarity Metric: experimental assessment.

    PubMed

    Ferragina, Paolo; Giancarlo, Raffaele; Greco, Valentina; Manzini, Giovanni; Valiente, Gabriel

    2007-07-13

    Similarity of sequences is a key mathematical notion for Classification and Phylogenetic studies in Biology. It is currently primarily handled using alignments. However, the alignment methods seem inadequate for post-genomic studies since they do not scale well with data set size and they seem to be confined only to genomic and proteomic sequences. Therefore, alignment-free similarity measures are actively pursued. Among those, USM (Universal Similarity Metric) has gained prominence. It is based on the deep theory of Kolmogorov Complexity and universality is its most novel striking feature. Since it can only be approximated via data compression, USM is a methodology rather than a formula quantifying the similarity of two strings. Three approximations of USM are available, namely UCD (Universal Compression Dissimilarity), NCD (Normalized Compression Dissimilarity) and CD (Compression Dissimilarity). Their applicability and robustness is tested on various data sets yielding a first massive quantitative estimate that the USM methodology and its approximations are of value. Despite the rich theory developed around USM, its experimental assessment has limitations: only a few data compressors have been tested in conjunction with USM and mostly at a qualitative level, no comparison among UCD, NCD and CD is available and no comparison of USM with existing methods, both based on alignments and not, seems to be available. We experimentally test the USM methodology by using 25 compressors, all three of its known approximations and six data sets of relevance to Molecular Biology. This offers the first systematic and quantitative experimental assessment of this methodology, that naturally complements the many theoretical and the preliminary experimental results available. Moreover, we compare the USM methodology both with methods based on alignments and not. We may group our experiments into two sets. 
The first one, performed via ROC (Receiver Operating Characteristic) analysis, aims at
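    Of the three approximations, NCD has the simplest closed form, with C denoting compressed size: NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)). A minimal Python sketch, using zlib as a stand-in for the 25 compressors assessed in the paper:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, the practical stand-in for the
    Universal Similarity Metric: similar strings compress well together,
    so concatenating them adds little to the compressed size."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)
```

    Related sequences drive the value toward 0, unrelated ones toward 1; the choice of compressor is exactly the degree of freedom the paper's assessment varies.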

  18. Analysis of network motifs in cellular regulation: Structural similarities, input-output relations and signal integration.

    PubMed

    Straube, Ronny

    2017-12-01

    Much of the complexity of regulatory networks derives from the necessity to integrate multiple signals and to avoid malfunction due to cross-talk or harmful perturbations. Hence, one may expect that the input-output behavior of larger networks is not necessarily more complex than that of smaller network motifs which suggests that both can, under certain conditions, be described by similar equations. In this review, we illustrate this approach by discussing the similarities that exist in the steady state descriptions of a simple bimolecular reaction, covalent modification cycles and bacterial two-component systems. Interestingly, in all three systems fundamental input-output characteristics such as thresholds, ultrasensitivity or concentration robustness are described by structurally similar equations. Depending on the system the meaning of the parameters can differ ranging from protein concentrations and affinity constants to complex parameter combinations which allows for a quantitative understanding of signal integration in these systems. We argue that this approach may also be extended to larger regulatory networks. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    .4%, respectively. The area under the curve was 0.891. Results of the present study suggest that semi-quantitative and visual analyses showed statistically similar results. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. It seems from our results that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.

  20. Similarity and Congruence.

    ERIC Educational Resources Information Center

    Herman, Daniel L.

    This instructional unit is an introduction to the common properties of similarity and congruence. Manipulation of objects leads to a recognition of these properties. The ASA, SAS, and SSS theorems are not mentioned. Limited use is made in the application of the properties of size and shape preserved by similarity or congruence. A teacher's guide…

  1. Similarity, not complexity, determines visual working memory performance.

    PubMed

    Jackson, Margaret C; Linden, David E J; Roberts, Mark V; Kriegeskorte, Nikolaus; Haenschel, Corinna

    2015-11-01

    A number of studies have shown that visual working memory (WM) is poorer for complex versus simple items, traditionally accounted for by higher information load placing greater demands on encoding and storage capacity limits. Other research suggests that it may not be complexity that determines WM performance per se, but rather increased perceptual similarity between complex items as a result of a large amount of overlapping information. Increased similarity is thought to lead to greater comparison errors between items encoded into WM and the test item(s) presented at retrieval. However, previous studies have used different object categories to manipulate complexity and similarity, raising questions as to whether these effects are simply due to cross-category differences. For the first time, the relationship between complexity and similarity in WM is investigated here using the same stimulus category (abstract polygons). The authors used a delayed discrimination task to measure WM for 1-4 complex versus simple simultaneously presented items and manipulated the similarity between the single test item at retrieval and the sample items at encoding. WM was poorer for complex than simple items only when the test item was similar to 1 of the encoding items, and not when it was dissimilar or identical. The results provide clear support for reinterpretation of the complexity effect in WM as a similarity effect and highlight the importance of the retrieval stage in governing WM performance. The authors discuss how these findings can be reconciled with current models of WM capacity limits. (c) 2015 APA, all rights reserved.

  2. Similarity indices of meteo-climatic gauging stations: definition and comparison.

    PubMed

    Barca, Emanuele; Bruno, Delia Evelina; Passarella, Giuseppe

    2016-07-01

    Space-time dependencies among monitoring network stations have been investigated to detect and quantify similarity relationships among gauging stations. In this work, besides the well-known rank correlation index, two new similarity indices have been defined and applied to compute the similarity matrix for the Apulian meteo-climatic monitoring network. The similarity matrices can be applied to reliably address the issue of missing data in space-time series. In order to establish the effectiveness of the similarity indices, a simulation test was designed and performed with the aim of estimating missing monthly rainfall rates at a suitably selected gauging station. The results of the simulation allowed us to evaluate the effectiveness of the proposed similarity indices. Finally, the multiple imputation by chained equations method was used as a benchmark to provide an absolute yardstick for comparing the outcomes of the test. In conclusion, the newly proposed multiplicative similarity index proved at least as reliable as the selected benchmark.
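    The rank-correlation similarity index mentioned above, and its use for gap filling, can be sketched as follows (the rainfall series are synthetic stand-ins, and the paper's two new indices are not reproduced):

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical monthly rainfall series, one column per gauging station.
# Stations 0 and 1 share a common signal; station 2 is unrelated.
rng = np.random.default_rng(1)
base = rng.gamma(2.0, 30.0, size=120)                     # 10 years of months
rain = np.column_stack([base + rng.normal(0, 5, 120),
                        base + rng.normal(0, 5, 120),
                        rng.gamma(2.0, 30.0, size=120)])

rho, _ = spearmanr(rain)           # rank-correlation similarity matrix
np.fill_diagonal(rho, -np.inf)     # ignore self-similarity
donor = int(np.argmax(rho[0]))     # station most similar to station 0

# Impute a missing month at station 0 from the most similar station.
estimate = rain[59, donor]
```

    In practice the donor value would be rescaled (e.g., by quantile mapping) rather than copied, but the similarity matrix is what selects the donor.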

  3. Reference charts for young stands — a quantitative methodology for assessing tree performance

    Treesearch

    Lance A. Vickers; David R. Larsen; Benjamin O. Knapp; John M. Kabrick; Daniel C. Dey

    2017-01-01

    Reference charts have long been used in the medical field for quantitative clinical assessment of juvenile development by plotting distribution quantiles for a selected attribute (e.g., height) against age for specified peer populations. We propose that early stand dynamics is an area of study that could benefit from the descriptions and analyses offered by similar...

  4. Quantitative Resistance: More Than Just Perception of a Pathogen.

    PubMed

    Corwin, Jason A; Kliebenstein, Daniel J

    2017-04-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. © 2017 American Society of Plant Biologists. All rights reserved.

  5. Quantitative Resistance: More Than Just Perception of a Pathogen

    PubMed Central

    2017-01-01

    Molecular plant pathology has focused on studying large-effect qualitative resistance loci that predominantly function in detecting pathogens and/or transmitting signals resulting from pathogen detection. By contrast, less is known about quantitative resistance loci, particularly the molecular mechanisms controlling variation in quantitative resistance. Recent studies have provided insight into these mechanisms, showing that genetic variation at hundreds of causal genes may underpin quantitative resistance. Loci controlling quantitative resistance contain some of the same causal genes that mediate qualitative resistance, but the predominant mechanisms of quantitative resistance extend beyond pathogen recognition. Indeed, most causal genes for quantitative resistance encode specific defense-related outputs such as strengthening of the cell wall or defense compound biosynthesis. Extending previous work on qualitative resistance to focus on the mechanisms of quantitative resistance, such as the link between perception of microbe-associated molecular patterns and growth, has shown that the mechanisms underlying these defense outputs are also highly polygenic. Studies that include genetic variation in the pathogen have begun to highlight a potential need to rethink how the field considers broad-spectrum resistance and how it is affected by genetic variation within pathogen species and between pathogen species. These studies are broadening our understanding of quantitative resistance and highlighting the potentially vast scale of the genetic basis of quantitative resistance. PMID:28302676

  6. Representational Similarity of Body Parts in Human Occipitotemporal Cortex.

    PubMed

    Bracci, Stefania; Caramazza, Alfonso; Peelen, Marius V

    2015-09-23

    Regions in human lateral and ventral occipitotemporal cortices (OTC) respond selectively to pictures of the human body and its parts. What are the organizational principles underlying body part responses in these regions? Here we used representational similarity analysis (RSA) of fMRI data to test multiple possible organizational principles: shape similarity, physical proximity, cortical homunculus proximity, and semantic similarity. Participants viewed pictures of whole persons, chairs, and eight body parts (hands, arms, legs, feet, chests, waists, upper faces, and lower faces). The similarity of multivoxel activity patterns for all body part pairs was established in whole person-selective OTC regions. The resulting neural similarity matrices were then compared with similarity matrices capturing the hypothesized organizational principles. Results showed that the semantic similarity model best captured the neural similarity of body parts in lateral and ventral OTC, which followed an organization in three clusters: (1) body parts used as action effectors (hands, feet, arms, and legs), (2) noneffector body parts (chests and waists), and (3) face parts (upper and lower faces). Whole-brain RSA revealed, in addition to OTC, regions in parietal and frontal cortex in which neural similarity was related to semantic similarity. In contrast, neural similarity in occipital cortex was best predicted by shape similarity models. We suggest that the semantic organization of body parts in high-level visual cortex relates to the different functions associated with the three body part clusters, reflecting the unique processing and connectivity demands associated with the different types of information (e.g., action, social) different body parts (e.g., limbs, faces) convey. Significance statement: While the organization of body part representations in motor and somatosensory cortices has been well characterized, the principles underlying body part representations in visual cortex
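    The RSA machinery used here is generic: build a neural representational dissimilarity matrix (RDM) from condition-wise activity patterns, then correlate its upper triangle with each candidate model RDM. A minimal sketch (correlation distance and a Pearson model fit are common choices, not necessarily the authors' exact pipeline):

```python
import numpy as np

def rsa_fit(neural_patterns, model_rdm):
    """Correlate the neural dissimilarity structure of condition
    patterns (e.g., one multivoxel pattern per body part) with a
    hypothesized model RDM (shape, proximity, semantic, ...)."""
    n = len(neural_patterns)
    neural_rdm = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # correlation distance between condition patterns
            neural_rdm[i, j] = 1 - np.corrcoef(neural_patterns[i],
                                               neural_patterns[j])[0, 1]
    iu = np.triu_indices(n, k=1)  # compare unique pairs only
    return np.corrcoef(neural_rdm[iu], model_rdm[iu])[0, 1]
```

    The model RDM with the highest fit (semantic similarity, in this study's OTC results) is taken as the best account of the region's organization.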

  7. On the necessity of dissecting sequence similarity scores into segment-specific contributions for inferring protein homology, function prediction and annotation

    PubMed Central

    2014-01-01

    Background Protein sequence similarities to any types of non-globular segments (coiled coils, low complexity regions, transmembrane regions, long loops, etc. where either positional sequence conservation is the result of a very simple, physically induced pattern or rather integral sequence properties are critical) are pertinent sources for mistaken homologies. Regrettably, these considerations regularly escape attention in large-scale annotation studies since, often, there is no substitute to manual handling of these cases. Quantitative criteria are required to suppress events of function annotation transfer as a result of false homology assignments. Results The sequence homology concept is based on the similarity comparison between the structural elements, the basic building blocks for conferring the overall fold of a protein. We propose to dissect the total similarity score into fold-critical and other, remaining contributions and suggest that, for a valid homology statement, the fold-relevant score contribution should at least be significant on its own. As part of the article, we provide the DissectHMMER software program for dissecting HMMER2/3 scores into segment-specific contributions. We show that DissectHMMER reproduces HMMER2/3 scores with sufficient accuracy and that it is useful in automated decisions about homology for instructive sequence examples. To generalize the dissection concept for cases without 3D structural information, we find that a dissection based on alignment quality is an appropriate surrogate. The approach was applied to a large-scale study of SMART and PFAM domains in the space of seed sequences and in the space of UniProt/SwissProt. Conclusions Sequence similarity core dissection with regard to fold-critical and other contributions systematically suppresses false hits and, additionally, recovers previously obscured homology relationships such as the one between aquaporins and formate/nitrite transporters that, so far, was only

  8. A field- and laboratory-based quantitative analysis of alluvium: Relating analytical results to TIMS data

    NASA Technical Reports Server (NTRS)

    Wenrich, Melissa L.; Hamilton, Victoria E.; Christensen, Philip R.

    1995-01-01

    Thermal Infrared Multispectral Scanner (TIMS) data were acquired over the McDowell Mountains northeast of Scottsdale, Arizona during August 1994. The raw data were processed to emphasize lithologic differences using a decorrelation stretch and assigning bands 5, 3, and 1 to red, green, and blue, respectively. Processed data of alluvium flanking the mountains exhibit moderate color variation. The objective of this study was to determine, using a quantitative approach, what environmental variable(s), in the absence of bedrock, is/are responsible for influencing the spectral properties of the desert alluvial surface.

  9. Fish trophic level and the similarity of non-specific larval parasite assemblages.

    PubMed

    Timi, J T; Rossin, M A; Alarcos, A J; Braicovich, P E; Cantatore, D M P; Lanfranchi, A L

    2011-03-01

    Whereas the effect of parasites on food webs is increasingly recognised and has been extensively measured and modelled, the effect of food webs on the structure of parasite assemblages has not been quantified in a similar way. Here, we apply the concept of decay in community similarity with increasing distance, previously used for parasites in geographical, phylogenetic and ontogenetic contexts, to differences in the trophic level (TL) based on diet composition of fishes. It is proposed as an accurate quantitative method to measure rates of assemblage change as a function of host feeding habits and is applied, to our knowledge for the first time, across host species in marine waters. We focused on a suite of 15 species of trophically-transmitted and non-specific larval helminths across 16 fish species (1783 specimens, six orders, 14 families) with different sizes and TLs, gathered from the same ecosystem. Not all host species harboured the same number and types of parasites, reflecting the differences in their ecological characteristics. Using differences in TL and body length as measurements of size and trophic distances, we found that similarity at both infracommunity and component community levels showed a very clear decay pattern, based on parasite abundance and relative abundance, with increasing distance in TL, but was not related to changes in fish size, with TL thus emerging as the main explanatory factor for similarity of parasite assemblages. Furthermore, the relationships between host TL and assemblage similarity allowed identification of fishes for which the TL was under- or over-estimated and prediction of the TL of host species based on parasite data alone. Copyright © 2010 Australian Society for Parasitology Inc. Published by Elsevier Ltd. All rights reserved.
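    The decay-of-similarity analysis can be sketched minimally. Bray-Curtis (percentage) similarity and a least-squares slope are assumptions chosen for illustration; the study's exact similarity index and fitting procedure are not stated in the abstract:

    ```python
    import numpy as np

    def bray_curtis_similarity(a, b):
        """Percentage similarity between two abundance vectors
        (1 minus the Bray-Curtis dissimilarity)."""
        denom = a.sum() + b.sum()
        return 2.0 * np.minimum(a, b).sum() / denom if denom else 0.0

    def similarity_vs_trophic_distance(abundances, trophic_levels):
        """Pairwise parasite-community similarity against trophic-level distance.
        `abundances`: (n_hosts, n_parasites); `trophic_levels`: (n_hosts,)."""
        n = len(trophic_levels)
        sims, dists = [], []
        for i in range(n):
            for j in range(i + 1, n):
                sims.append(bray_curtis_similarity(abundances[i], abundances[j]))
                dists.append(abs(trophic_levels[i] - trophic_levels[j]))
        # A negative slope indicates decay of assemblage similarity
        # with increasing trophic distance between hosts.
        slope, intercept = np.polyfit(dists, sims, 1)
        return slope, intercept
    ```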

  10. Fluvial drainage networks: the fractal approach as an improvement of quantitative geomorphic analyses

    NASA Astrophysics Data System (ADS)

    Melelli, Laura; Liucci, Luisa; Vergari, Francesca; Ciccacci, Sirio; Del Monte, Maurizio

    2014-05-01

    Drainage basins are primary landscape units for geomorphological investigations. Both hillslopes and the river drainage system are fundamental components of drainage basin analysis. Like other geomorphological systems, drainage basins tend toward an equilibrium condition in which the sequence of erosion, transport and sedimentation approaches a state of minimum energy expenditure. This state is revealed by a characteristic geometry of landforms and of the drainage net. Several morphometric indexes can measure how far a drainage basin is from the theoretical equilibrium configuration, revealing possible external disturbance. In tectonically active areas, drainage basins are of primary importance for highlighting the style, amount and rate of tectonic impulses, and morphometric indexes allow the tectonic activity of different sectors of a study area to be classified. Moreover, drainage networks are characterized by a self-similar structure, which promotes the use of fractal theory to investigate the system. In this study, fractal techniques are employed together with quantitative geomorphological analysis to study the Upper Tiber Valley (UTV), a tectonic intermontane basin located in the northern Apennines (Umbria, central Italy). The area is the result of different tectonic phases. From the Late Pliocene to the present, the UTV has been strongly controlled by regional uplift and by an extensional phase, with different sets of normal faults playing a fundamental role in basin morphology. Thirty-four basins are taken into account for the quantitative analysis, twenty on the left side of the valley and the others on the right side. Using the fractal dimension of drainage networks, the results of Horton's laws, concavity and steepness indexes, and hypsometric curves, this study aims to obtain an evolutionary model of the UTV in which regional uplift is compared with local subsidence induced by normal fault activity. The results highlight a well-defined difference between western and eastern tributary basins

  11. Sensitive and quantitative measurement of gene expression directly from a small amount of whole blood.

    PubMed

    Zheng, Zhi; Luo, Yuling; McMaster, Gary K

    2006-07-01

    Accurate and precise quantification of mRNA in whole blood is made difficult by gene expression changes during blood processing, and by variations and biases introduced by sample preparations. We sought to develop a quantitative whole-blood mRNA assay that eliminates blood purification, RNA isolation, reverse transcription, and target amplification while providing high-quality data in an easy assay format. We performed single- and multiplex gene expression analysis with multiple hybridization probes to capture mRNA directly from blood lysate and used branched DNA to amplify the signal. The 96-well plate singleplex assay uses chemiluminescence detection, and the multiplex assay combines Luminex-encoded beads with fluorescent detection. The single- and multiplex assays could quantitatively measure as few as 6000 and 24,000 mRNA target molecules (0.01 and 0.04 amoles), respectively, in up to 25 microL of whole blood. Both formats had CVs < 10% and dynamic ranges of 3-4 logs. Assay sensitivities allowed quantitative measurement of gene expression in the minority of cells in whole blood. The signals from whole-blood lysate correlated well with signals from purified RNA of the same sample, and absolute mRNA quantification results from the assay were similar to those obtained by quantitative reverse transcription-PCR. Both single- and multiplex assay formats were compatible with common anticoagulants and PAXgene-treated samples; however, PAXgene preparations induced expression of known antiapoptotic genes in whole blood. Both the singleplex and the multiplex branched DNA assays can quantitatively measure mRNA expression directly from small volumes of whole blood. The assay offers an alternative to current technologies that depend on RNA isolation and is amenable to high-throughput gene expression analysis of whole blood.
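    The reported detection limits can be checked arithmetically; this sketch only converts attomoles to absolute molecule counts via Avogadro's number:

    ```python
    AVOGADRO = 6.022e23  # molecules per mole

    def amol_to_molecules(amol):
        """Convert attomoles (1 amol = 1e-18 mol) to a molecule count."""
        return amol * 1e-18 * AVOGADRO

    # The stated limits of 0.01 and 0.04 amol correspond to roughly
    # 6000 and 24,000 mRNA molecules, matching the figures in the abstract.
    ```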

  12. Scale-Similar Models for Large-Eddy Simulations

    NASA Technical Reports Server (NTRS)

    Sarghini, F.

    1999-01-01

    Scale-similar models employ multiple filtering operations to identify the smallest resolved scales, which have been shown to be the most active in the interaction with the unresolved subgrid scales. They do not assume that the principal axes of the strain-rate tensor are aligned with those of the subgrid-scale stress (SGS) tensor, and allow the explicit calculation of the SGS energy. They can provide backscatter in a numerically stable and physically realistic manner, and predict SGS stresses in regions that are well correlated with the locations where large Reynolds stress occurs. In this paper, eddy viscosity and mixed models, which include an eddy-viscosity part as well as a scale-similar contribution, are applied to the simulation of two flows, a high Reynolds number plane channel flow, and a three-dimensional, nonequilibrium flow. The results show that simulations without models or with the Smagorinsky model are unable to predict nonequilibrium effects. Dynamic models provide an improvement of the results: the adjustment of the coefficient results in more accurate prediction of the perturbation from equilibrium. The Lagrangian-ensemble approach [Meneveau et al., J. Fluid Mech. 319, 353 (1996)] is found to be very beneficial. Models that included a scale-similar term and a dissipative one, as well as the Lagrangian ensemble averaging, gave results in the best agreement with the direct simulation and experimental data.
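    The scale-similar contribution referred to above is commonly written in the Bardina form, with the mixed model adding a Smagorinsky eddy-viscosity part. The following equations are an assumption based on the standard LES literature, not formulas taken from this report:

    ```latex
    % Scale-similar (Bardina-type) SGS term, with \bar{\cdot} the grid filter:
    \tau_{ij}^{\mathrm{ss}} = C_L \left( \overline{\bar{u}_i \bar{u}_j}
        - \bar{\bar{u}}_i \, \bar{\bar{u}}_j \right)
    % Mixed model: eddy-viscosity (Smagorinsky) part plus the scale-similar part
    \tau_{ij} - \frac{\delta_{ij}}{3}\tau_{kk}
        = -2\, C_S^2 \Delta^2 \lvert \bar{S} \rvert \, \bar{S}_{ij}
        + C_L \left( \overline{\bar{u}_i \bar{u}_j}
        - \bar{\bar{u}}_i \, \bar{\bar{u}}_j \right)
    ```

    The first term is purely dissipative, while the scale-similar term supplies the backscatter and stress-strain misalignment discussed in the abstract.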

  13. Advanced quantitative measurement methodology in physics education research

    NASA Astrophysics Data System (ADS)

    Wang, Jing

    parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). Both theories provide test item statistics for educational inferences and decisions, and both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the contextual and conceptual features of the test items than classical test theory, and that the IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation concerns measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of these conditions is not rare in educational data. Two popular association measures, Pearson's correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the contextual and conceptual features of questions are discussed in detail. Results show that all the methods have their advantages and disadvantages, and special attention to test and data conditions is necessary. The last part of the dissertation focuses on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The
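    The behavior of a binary association measure can be illustrated with the phi coefficient, i.e. Pearson's correlation computed on two binary items. This is a minimal sketch of the standard formula, not the dissertation's own code:

    ```python
    import math

    def phi_coefficient(a, b, c, d):
        """Pearson correlation of two binary items from their 2x2 table:
        a = both correct, b = first only, c = second only, d = both wrong."""
        denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
        return (a * d - b * c) / denom if denom else 0.0
    ```

    Under extremely unbalanced marginals the attainable |phi| is capped well below 1, which is the kind of failure condition the dissertation examines.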

  14. Quantitative patterns of stylistic influence in the evolution of literature.

    PubMed

    Hughes, James M; Foti, Nicholas J; Krakauer, David C; Rockmore, Daniel N

    2012-05-15

    Literature is a form of expression whose temporal structure, both in content and style, provides a historical record of the evolution of culture. In this work we take on a quantitative analysis of literary style and conduct the first large-scale temporal stylometric study of literature by using the vast holdings in the Project Gutenberg Digital Library corpus. We find temporal stylistic localization among authors through the analysis of the similarity structure in feature vectors derived from content-free word usage, nonhomogeneous decay rates of stylistic influence, and an accelerating rate of decay of influence among modern authors. Within a given time period we also find evidence for stylistic coherence with a given literary topic, such that writers in different fields adopt different literary styles. This study gives quantitative support to the notion of a literary "style of a time" with a strong trend toward increasingly contemporaneous stylistic influence.

  15. Path Similarity Analysis: A Method for Quantifying Macromolecular Pathways

    PubMed Central

    Seyler, Sean L.; Kumar, Avishek; Thorpe, M. F.; Beckstein, Oliver

    2015-01-01

    Diverse classes of proteins function through large-scale conformational changes and various sophisticated computational algorithms have been proposed to enhance sampling of these macromolecular transition paths. Because such paths are curves in a high-dimensional space, it has been difficult to quantitatively compare multiple paths, a necessary prerequisite to, for instance, assess the quality of different algorithms. We introduce a method named Path Similarity Analysis (PSA) that enables us to quantify the similarity between two arbitrary paths and extract the atomic-scale determinants responsible for their differences. PSA utilizes the full information available in 3N-dimensional configuration space trajectories by employing the Hausdorff or Fréchet metrics (adopted from computational geometry) to quantify the degree of similarity between piecewise-linear curves. It thus completely avoids relying on projections into low dimensional spaces, as used in traditional approaches. To elucidate the principles of PSA, we quantified the effect of path roughness induced by thermal fluctuations using a toy model system. Using, as an example, the closed-to-open transitions of the enzyme adenylate kinase (AdK) in its substrate-free form, we compared a range of protein transition path-generating algorithms. Molecular dynamics-based dynamic importance sampling (DIMS) MD and targeted MD (TMD) and the purely geometric FRODA (Framework Rigidity Optimized Dynamics Algorithm) were tested along with seven other methods publicly available on servers, including several based on the popular elastic network model (ENM). PSA with clustering revealed that paths produced by a given method are more similar to each other than to those from another method and, for instance, that the ENM-based methods produced relatively similar paths. 
PSA applied to ensembles of DIMS MD and FRODA trajectories of the conformational transition of diphtheria toxin, a particularly challenging example, showed that
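    The Hausdorff metric underlying PSA can be sketched for discrete paths. This is a minimal illustration assuming paths are stored as (n_frames, n_dof) arrays of configuration-space points, not the PSA implementation itself:

    ```python
    import numpy as np

    def hausdorff(P, Q):
        """Discrete symmetric Hausdorff distance between two paths, each an
        (n_frames, n_dof) array of configuration-space points."""
        # Pairwise Euclidean distances between all frames of P and Q.
        D = np.linalg.norm(P[:, None, :] - Q[None, :, :], axis=-1)
        # The farthest any point of one path lies from its nearest
        # neighbor on the other path, taken in both directions.
        return max(D.min(axis=1).max(), D.min(axis=0).max())
    ```

    Because the metric acts on the full point sets, no projection into a low-dimensional space is needed, which is the property the abstract emphasizes.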

  16. The Role of Introductory Geosciences in Students' Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Wenner, J. M.; Manduca, C.; Baer, E. M.

    2006-12-01

    Quantitative literacy is more than mathematics; it is about reasoning with data. Colleges and universities have begun to recognize the distinction between mathematics and quantitative literacy, modifying curricula to reflect the need for numerate citizens. Although students may view geology as 'rocks for jocks', the geosciences are in fact rife with data, making introductory geoscience topics an excellent context for developing the quantitative literacy of students with diverse backgrounds. In addition, many news items that deal with quantitative skills, such as the global warming phenomenon, have their basis in the Earth sciences and can serve as timely examples of the importance of quantitative literacy for all students in introductory geology classrooms. Participants at a workshop held in 2006, 'Infusing Quantitative Literacy into Introductory Geoscience Courses,' discussed and explored the challenges and opportunities associated with the inclusion of quantitative material and brainstormed about effective practices for imparting quantitative literacy to students with diverse backgrounds. The tangible results of this workshop add to the growing collection of quantitative materials available through the DLESE- and NSF-supported Teaching Quantitative Skills in the Geosciences website, housed at SERC. There, faculty can find a collection of pages devoted to the successful incorporation of quantitative literacy in introductory geoscience. The resources on the website are designed to help faculty increase their comfort with presenting quantitative ideas to students with diverse mathematical abilities. A methods section on "Teaching Quantitative Literacy" (http://serc.carleton.edu/quantskills/methods/quantlit/index.html) focuses on connecting quantitative concepts with geoscience context and provides tips, troubleshooting advice and examples of quantitative activities. The goal in this section is to provide faculty with material that can be readily incorporated

  17. Evaluation of the Abbott RealTime HCV assay for quantitative detection of hepatitis C virus RNA.

    PubMed

    Michelin, Birgit D A; Muller, Zsofia; Stelzl, Evelyn; Marth, Egon; Kessler, Harald H

    2007-02-01

    The Abbott RealTime HCV assay for quantitative detection of HCV RNA has recently been introduced. In this study, the performance of the Abbott RealTime HCV assay was evaluated and compared to the COBAS AmpliPrep/COBAS TaqMan HCV test. Accuracy, linearity, interassay and intra-assay variations were determined, and a total of 243 routine clinical samples were investigated. When accuracy of the new assay was tested, the majority of results were found to be within +/-0.5 log(10) unit of the results obtained by reference laboratories. Determination of linearity resulted in a quasilinear curve up to 1.0 x 10(6)IU/ml. The interassay variation ranged from 15% to 32%, and the intra-assay variation ranged from 5% to 8%. When clinical samples were tested by the Abbott RealTime HCV assay and the results were compared with those obtained by the COBAS AmpliPrep/COBAS TaqMan HCV test, the results for 93% of all samples with positive results by both tests were found to be within +/-1.0 log(10) unit. The viral loads for all patients measured by the Abbott and Roche assays showed a high correlation (R(2)=0.93); quantitative results obtained by the Abbott assay were found to be lower than those obtained by the Roche assay. The Abbott RealTime HCV assay proved to be suitable for use in the routine diagnostic laboratory. The time to results was similar for both of the assays.

  18. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  19. Similarity analysis between quantum images

    NASA Astrophysics Data System (ADS)

    Zhou, Ri-Gui; Liu, XingAo; Zhu, Changming; Wei, Lai; Zhang, Xiafen; Ian, Hou

    2018-06-01

    Similarity analysis between quantum images is essential in quantum image processing, as it provides the foundation for other fields such as quantum image matching and quantum pattern recognition. In this paper, a quantum scheme based on a novel quantum image representation and the quantum amplitude amplification algorithm is proposed. At the end of the paper, three examples and simulation experiments show that the measurement result must be 0 when the two images are the same, and has a high probability of being 1 when the two images are different.

  20. Quantitative susceptibility mapping: Report from the 2016 reconstruction challenge.

    PubMed

    Langkammer, Christian; Schweser, Ferdinand; Shmueli, Karin; Kames, Christian; Li, Xu; Guo, Li; Milovic, Carlos; Kim, Jinsuh; Wei, Hongjiang; Bredies, Kristian; Buch, Sagar; Guo, Yihao; Liu, Zhe; Meineke, Jakob; Rauscher, Alexander; Marques, José P; Bilgic, Berkin

    2018-03-01

    The aim of the 2016 quantitative susceptibility mapping (QSM) reconstruction challenge was to test the ability of various QSM algorithms to recover the underlying susceptibility from phase data faithfully. Gradient-echo images of a healthy volunteer were acquired at 3T in a single orientation with 1.06 mm isotropic resolution. A reference susceptibility map was provided, which was computed using the susceptibility tensor imaging algorithm on data acquired at 12 head orientations. Susceptibility maps calculated from the single-orientation data were compared against the reference susceptibility map. Deviations were quantified using the following metrics: root mean squared error (RMSE), structure similarity index (SSIM), high-frequency error norm (HFEN), and the error in selected white and gray matter regions. Twenty-seven submissions were evaluated. Most of the best-scoring approaches estimated the spatial frequency content in the ill-conditioned domain of the dipole kernel using compressed sensing strategies. The top 10 maps in each category had similar error metrics but substantially different visual appearance. Because QSM algorithms were optimized to minimize error metrics, the resulting susceptibility maps suffered from over-smoothing and conspicuity loss in fine features such as vessels. As such, the challenge highlighted the need for better numerical image quality criteria. Magn Reson Med 79:1661-1673, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
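    A normalized RMSE of the kind used to score reconstructions can be sketched as follows; the percentage normalization by the reference norm is a common challenge convention and an assumption here, not a definition quoted from the report:

    ```python
    import numpy as np

    def nrmse_percent(recon, reference):
        """Root-mean-square error of a reconstructed map, expressed as a
        percentage of the reference map's norm."""
        return 100.0 * np.linalg.norm(recon - reference) / np.linalg.norm(reference)
    ```

    A perfect reconstruction scores 0, and a uniform 10% overestimate scores 10, which makes submissions with different scalings directly comparable.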

  1. Two Measurement Methods of Leaf Dry Matter Content Produce Similar Results in a Broad Range of Species

    PubMed Central

    Vaieretti, María Victoria; Díaz, Sandra; Vile, Denis; Garnier, Eric

    2007-01-01

    Background and Aims Leaf dry matter content (LDMC) is widely used as an indicator of plant resource use in plant functional trait databases. Two main methods have been proposed to measure LDMC, which basically differ in the rehydration procedure to which leaves are subjected after harvesting. These are the ‘complete rehydration’ protocol of Garnier et al. (2001, Functional Ecology 15: 688–695) and the ‘partial rehydration’ protocol of Vendramini et al. (2002, New Phytologist 154: 147–157). Methods To test differences in LDMC due to the use of different methods, LDMC was measured on 51 native and cultivated species representing a wide range of plant families and growth forms from central-western Argentina, following the complete rehydration and partial rehydration protocols. Key Results and Conclusions The LDMC values obtained by both methods were strongly and positively correlated, clearly showing that LDMC is highly conserved between the two procedures. These trends were not altered by the exclusion of plants with non-laminar leaves. Although the complete rehydration method is the safest to measure LDMC, the partial rehydration procedure produces similar results and is faster. It therefore appears as an acceptable option for those situations in which the complete rehydration method cannot be applied. Two notes of caution are given for cases in which different datasets are compared or combined: (1) the discrepancy between the two rehydration protocols is greatest in the case of low-LDMC (succulent or tender) leaves; (2) the results suggest that, when comparing many studies across unrelated datasets, differences in the measurement protocol may be less important than differences among seasons, years and the quality of local habitats. PMID:17353207

  2. Quantitative versus qualitative cultures of respiratory secretions for clinical outcomes in patients with ventilator-associated pneumonia.

    PubMed

    Berton, Danilo Cortozi; Kalil, Andre C; Cavalcanti, Manuela; Teixeira, Paulo José Zimermann

    2008-10-08

    rates (RR = 0.91, 95% CI 0.75 to 1.11). The analysis of all five RCTs showed there was no evidence of mortality reduction in the invasive group versus the non-invasive group (RR = 0.93, 95% CI 0.78 to 1.11). There were no significant differences between the interventions with respect to the number of days on mechanical ventilation, length of ICU stay or antibiotic change. There is no evidence that the use of quantitative cultures of respiratory secretions results in reduced mortality, reduced time in ICU and on mechanical ventilation, or higher rates of antibiotic change when compared to qualitative cultures in patients with VAP. Similar results were observed when invasive strategies were compared with non-invasive strategies.
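    A pooled risk ratio with a 95% CI such as "RR = 0.91, 95% CI 0.75 to 1.11" is conventionally computed on the log scale (the Katz method). This is a generic sketch with hypothetical event counts, not the review's actual data or meta-analytic pooling:

    ```python
    import math

    def risk_ratio_ci(e1, n1, e2, n2, z=1.96):
        """Risk ratio of group 1 vs group 2 with a 95% CI via the log method.
        e = number of events, n = group size."""
        rr = (e1 / n1) / (e2 / n2)
        se = math.sqrt(1 / e1 - 1 / n1 + 1 / e2 - 1 / n2)
        lo = math.exp(math.log(rr) - z * se)
        hi = math.exp(math.log(rr) + z * se)
        return rr, lo, hi
    ```

    A CI that spans 1.0, as in the quoted result, indicates no evidence of a mortality difference between the strategies.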

  3. The Use of Mouse Models of Breast Cancer and Quantitative Image Analysis to Evaluate Hormone Receptor Antigenicity after Microwave-assisted Formalin Fixation

    PubMed Central

    Engelberg, Jesse A.; Giberson, Richard T.; Young, Lawrence J.T.; Hubbard, Neil E.

    2014-01-01

    Microwave methods of fixation can dramatically shorten fixation times while preserving tissue structure; however, it remains unclear if adequate tissue antigenicity is preserved. To assess and validate antigenicity, robust quantitative methods and animal disease models are needed. We used two mouse mammary models of human breast cancer to evaluate microwave-assisted and standard 24-hr formalin fixation. The mouse models expressed four antigens prognostic for breast cancer outcome: estrogen receptor, progesterone receptor, Ki67, and human epidermal growth factor receptor 2. Using pathologist evaluation and novel methods of quantitative image analysis, we measured and compared the quality of antigen preservation, percentage of positive cells, and line plots of cell intensity. Visual evaluations by pathologists established that the amounts and patterns of staining were similar in tissues fixed by the different methods. The results of the quantitative image analysis provided a fine-grained evaluation, demonstrating that tissue antigenicity is preserved in tissues fixed using microwave methods. Evaluation of the results demonstrated that a 1-hr, 150-W fixation is better than a 45-min, 150-W fixation followed by a 15-min, 650-W fixation. The results demonstrated that microwave-assisted formalin fixation can standardize fixation times to 1 hr and produce immunohistochemistry that is in every way commensurate with longer conventional fixation methods. PMID:24682322

  4. Quantitative assessment of chronic postsurgical pain using the McGill Pain Questionnaire.

    PubMed

    Bruce, Julie; Poobalan, Amudha S; Smith, W Cairns S; Chambers, W Alastair

    2004-01-01

    The McGill Pain Questionnaire (MPQ) provides a quantitative profile of 3 major psychologic dimensions of pain: sensory-discriminative, motivational-affective, and cognitive-evaluative. Although the MPQ is frequently used as a pain measurement tool, no studies to date have compared the characteristics of chronic post-surgical pain after different surgical procedures using a quantitative scoring method. Three separate questionnaire surveys were administered to patients who had undergone surgery at different time points between 1990 and 2000. Surgical procedures selected were mastectomy (n = 511 patients), inguinal hernia repair (n = 351 patients), and cardiac surgery via a central chest wound with or without saphenous vein harvesting (n = 1348 patients). A standard questionnaire format with the MPQ was used for each survey. The IASP definition of chronic pain, continuously or intermittently for longer than 3 months, was used with other criteria for pain location. The type of chronic pain was compared between the surgical populations using 3 different analytical methods: the Pain Rating Intensity score using scale values, (PRI-S); the Pain Rating Intensity using weighted rank values multiplied by scale value (PRI-R); and number of words chosen (NWC). The prevalence of chronic pain after mastectomy, inguinal herniorrhaphy, and median sternotomy with or without saphenectomy was 43%, 30%, and 39% respectively. Chronic pain most frequently reported was sensory-discriminative in quality with similar proportions across different surgical sites. Average PRI-S values after mastectomy, hernia repair, sternotomy (without postoperative anginal symptoms), and saphenectomy were 14.06, 13.00, 12.03, and 8.06 respectively. Analysis was conducted on cardiac patients who reported anginal symptoms with chronic post-surgical pain (PRI-S value 14.28). 
Patients with moderate and severe pain were more likely to choose more than 10 pain descriptors, regardless of the operative site (P < 0
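    The PRI-S and NWC scoring described above can be sketched minimally. The descriptor names and scale values below are hypothetical placeholders, not the MPQ's actual word lists or weights:

    ```python
    def mpq_scores(chosen):
        """Score a set of chosen MPQ descriptors.
        `chosen`: mapping of descriptor -> its within-subclass scale value.
        Returns (PRI-S, NWC): the sum of scale values and the number of
        words chosen."""
        pri_s = sum(chosen.values())
        nwc = len(chosen)
        return pri_s, nwc

    # Hypothetical example: three descriptors with scale values 4, 2 and 3.
    example = {"throbbing": 4, "burning": 2, "tiring": 3}
    ```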

  5. Explosion Source Similarity Analysis via SVD

    NASA Astrophysics Data System (ADS)

    Yedlin, Matthew; Ben Horin, Yochai; Margrave, Gary

    2016-04-01

    An important seismological ingredient for establishing a regional seismic nuclear discriminant is the similarity analysis of a sequence of explosion sources. To investigate source similarity, we are fortunate to have access to a sequence of 1805 three-component recordings of quarry blasts, shot from March 2002 to January 2015. The centroid of these blasts has an estimated location 36.3E and 29.9N. All blasts were detonated by JPMC (Jordan Phosphate Mines Co.). All data were recorded at the Israeli NDC, HFRI, located at 30.03N and 35.03E. Data were first winnowed based on the distribution of maximum amplitudes in the neighborhood of the P-wave arrival. The winnowed data were then detrended using the algorithm of Cleveland et al (1990). The detrended data were bandpass filtered between 0.1 and 12 Hz using an eighth-order Butterworth filter. Finally, data were sorted based on maximum trace amplitude. Two similarity analysis approaches were used. First, for each component, the entire suite of traces was decomposed into its eigenvector representation by employing singular value decomposition (SVD). The data were then reconstructed using 10 percent of the singular values, with the resulting enhancement of the S-wave and surface wave arrivals. The results of this first method are then compared to the second analysis method based on the eigenface decomposition analysis of Turk and Pentland (1991). While both methods yield similar results in enhancement of data arrivals and reduction of data redundancy, more analysis is required to calibrate the recorded data to charge size, a quantity that was not available for the current study. References Cleveland, R. B., Cleveland, W. S., McRae, J. E., and Terpenning, I., Stl: A seasonal-trend decomposition procedure based on loess, Journal of Official Statistics, 6, No. 1, 3-73, 1990. Turk, M. and Pentland, A., Eigenfaces for recognition. Journal of cognitive neuroscience, 3(1), 71-86, 1991.
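    The SVD reconstruction step, keeping 10 percent of the singular values, can be sketched as follows; a minimal illustration with traces stored as matrix rows, not the study's processing code:

    ```python
    import numpy as np

    def lowrank_reconstruct(traces, fraction=0.10):
        """Reconstruct a suite of traces (one trace per row) keeping only
        the given fraction of the largest singular values, which retains
        the coherent arrivals and suppresses incoherent energy."""
        U, s, Vt = np.linalg.svd(traces, full_matrices=False)
        k = max(1, int(round(fraction * len(s))))
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
    ```

    Truncating the singular spectrum is what enhances the common S-wave and surface-wave arrivals across the blast sequence while reducing data redundancy.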

  6. Quantitative Susceptibility Mapping after Sports-Related Concussion.

    PubMed

    Koch, K M; Meier, T B; Karr, R; Nencka, A S; Muftuler, L T; McCrea, M

    2018-06-07

    Quantitative susceptibility mapping using MR imaging can assess changes in brain tissue structure and composition. This report presents preliminary results demonstrating changes in tissue magnetic susceptibility after sports-related concussion. Longitudinal quantitative susceptibility mapping metrics were produced from imaging data acquired from cohorts of concussed and control football athletes. One hundred thirty-six quantitative susceptibility mapping datasets were analyzed across 3 separate visits (24 hours after injury, 8 days postinjury, and 6 months postinjury). Longitudinal quantitative susceptibility mapping group analyses were performed on stability-thresholded brain tissue compartments and selected subregions. Clinical concussion metrics were also measured longitudinally in both cohorts and compared with the measured quantitative susceptibility mapping. Statistically significant increases in white matter susceptibility were identified in the concussed athlete group during the acute (24 hour) and subacute (day 8) period. These effects were most prominent at the 8-day visit but recovered and showed no significant difference from controls at the 6-month visit. The subcortical gray matter showed no statistically significant group differences. Observed susceptibility changes after concussion appeared to outlast self-reported clinical recovery metrics at a group level. At an individual subject level, susceptibility increases within the white matter showed statistically significant correlations with return-to-play durations. The results of this preliminary investigation suggest that sports-related concussion can induce physiologic changes to brain tissue that can be detected using MR imaging-based magnetic susceptibility estimates. In group analyses, the observed tissue changes appear to persist beyond those detected on clinical outcome assessments and were associated with return-to-play duration after sports-related concussion.

  7. A literature-driven method to calculate similarities among diseases.

    PubMed

    Kim, Hyunjin; Yoon, Youngmi; Ahn, Jaegyoon; Park, Sanghyun

    2015-11-01

    "Our lives are connected by a thousand invisible threads and along these sympathetic fibers, our actions run as causes and return to us as results". It is Herman Melville's famous quote describing connections among human lives. To paraphrase Melville, diseases are connected by many functional threads and along these sympathetic fibers, diseases run as causes and return as results. Melville's quote explains the reason for researching disease-disease similarity and disease networks. Measuring similarities between diseases and constructing a disease network can play an important role in disease function research and in disease treatment. To estimate disease-disease similarities, we proposed a novel literature-based method. The proposed method extracted disease-gene relations and disease-drug relations from the literature and used the frequencies of occurrence of the relations as features to calculate similarities among diseases. We also constructed a disease network with top-ranking disease pairs from our method. The proposed method discovered a larger number of answer disease pairs than other comparable methods and showed the lowest p-value. We presume that our method performed well because it uses literature data, uses all possible gene symbols and drug names as features of a disease, and determines feature values of diseases from the frequencies of co-occurrence of two entities. The disease-disease similarities from the proposed method can be used in computational biology research that relies on similarities among diseases. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
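    The abstract does not give the exact similarity function; assuming cosine similarity over literature co-occurrence frequency vectors (a common choice for this kind of feature representation), a minimal sketch:

    ```python
    import numpy as np

    def disease_similarity(freq_a, freq_b):
        """Cosine similarity between two diseases, each represented by a
        vector of co-occurrence frequencies with gene symbols and drug
        names extracted from the literature."""
        a = np.asarray(freq_a, dtype=float)
        b = np.asarray(freq_b, dtype=float)
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0
    ```

    Diseases sharing the same literature-derived gene and drug associations score near 1, while diseases with disjoint associations score 0, which is the behavior needed to rank disease pairs for network construction.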

  8. Qualitative versus quantitative methods in psychiatric research.

    PubMed

    Razafsha, Mahdi; Behforuzi, Hura; Azari, Hassan; Zhang, Zhiqun; Wang, Kevin K; Kobeissy, Firas H; Gold, Mark S

    2012-01-01

    Qualitative studies are regaining credibility after a period of being dismissed as "not being quantitative." Qualitative method is a broad umbrella term for research methodologies that describe and explain individuals' experiences, behaviors, interactions, and social contexts. In-depth interviews, focus groups, and participant observation are among the qualitative methods of inquiry commonly used in psychiatry. Researchers measure the frequency of events using quantitative methods; qualitative methods, however, provide a broader understanding of, and more thorough reasoning behind, an event, and are hence considered of special importance in psychiatry. Besides hypothesis generation in the earlier phases of research, qualitative methods can be employed in questionnaire design, establishment of diagnostic criteria, feasibility studies, and studies of attitudes and beliefs. Animal models are another area in which qualitative methods can be employed, especially when naturalistic observation of animal behavior is important. However, because qualitative results can reflect the researcher's own view, they need to be confirmed statistically with quantitative methods. The tendency to combine qualitative and quantitative methods as complementary approaches has emerged over recent years. By applying both methods of research, scientists can take advantage of the interpretative characteristics of qualitative methods as well as the experimental dimensions of quantitative methods.

  9. Towards quantitative imaging: stability of fully automated nodule segmentation across varied dose levels and reconstruction parameters in a low-dose CT screening patient cohort

    NASA Astrophysics Data System (ADS)

    Wahi-Anwar, M. Wasil; Emaminejad, Nastaran; Hoffman, John; Kim, Grace H.; Brown, Matthew S.; McNitt-Gray, Michael F.

    2018-02-01

    Quantitative imaging in lung cancer CT seeks to characterize nodules through quantitative features, usually from a region of interest delineating the nodule. The segmentation, however, can vary depending on the segmentation approach and image quality, which can affect the extracted feature values. In this study, we utilize a fully-automated nodule segmentation method - to avoid reader-influenced inconsistencies - to explore the effects of varied dose levels and reconstruction parameters on segmentation. Raw projection CT images from a low-dose screening patient cohort (N=59) were reconstructed at multiple dose levels (100%, 50%, 25%, 10%), two slice thicknesses (1.0mm, 0.6mm), and a medium kernel. Fully-automated nodule detection and segmentation were then applied, from which 12 nodules were selected. The Dice similarity coefficient (DSC) was used to assess the similarity of the segmentation ROIs of the same nodule across different reconstruction and dose conditions. Nodules at 1.0mm slice thickness and dose levels of 25% and 50% resulted in DSC values greater than 0.85 when compared to 100% dose, with lower dose leading to a lower average and wider spread of DSC values. At 0.6mm, the increased bias and wider spread of DSC values from lowering dose were more pronounced. The effects of dose reduction on DSC for CAD-segmented nodules were similar in magnitude to those of reducing the slice thickness from 1.0mm to 0.6mm. In conclusion, variation of dose and slice thickness can result in very different segmentations because of noise and image quality. However, there is some stability in segmentation overlap: even at 1.0mm slice thickness, an image at 25% of the low-dose scan's dose still yields segmentations similar to those of a full-dose scan.
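
    The Dice similarity coefficient used above has a standard definition, DSC = 2|A∩B| / (|A| + |B|). A minimal sketch for binary segmentation masks follows; the toy 5x5 masks are invented for illustration, not taken from the study.

```python
import numpy as np

def dice_coefficient(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary segmentation masks."""
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    total = a.sum() + b.sum()
    if total == 0:
        return 1.0  # convention used here: two empty masks count as identical
    return 2.0 * intersection / total

# Toy example: two overlapping "nodule" masks on a 5x5 grid
full_dose = np.zeros((5, 5), dtype=bool)
full_dose[1:4, 1:4] = True      # 9 voxels
low_dose = np.zeros((5, 5), dtype=bool)
low_dose[2:5, 1:4] = True       # 9 voxels, shifted down one row

print(dice_coefficient(full_dose, low_dose))  # 2*6/(9+9) ≈ 0.667
```

    A DSC of 1.0 indicates identical segmentations and 0.0 indicates no overlap, so the study's threshold of 0.85 corresponds to a substantial but imperfect agreement between dose conditions.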

  10. Quantitative imaging biomarkers: a review of statistical methods for technical performance assessment.

    PubMed

    Raunig, David L; McShane, Lisa M; Pennello, Gene; Gatsonis, Constantine; Carson, Paul L; Voyvodic, James T; Wahl, Richard L; Kurland, Brenda F; Schwarz, Adam J; Gönen, Mithat; Zahlmann, Gudrun; Kondratovich, Marina V; O'Donnell, Kevin; Petrick, Nicholas; Cole, Patricia E; Garra, Brian; Sullivan, Daniel C

    2015-02-01

    Technological developments and greater rigor in the quantitative measurement of biological features in medical images have given rise to an increased interest in using quantitative imaging biomarkers to measure changes in these features. Critical to the performance of a quantitative imaging biomarker in preclinical or clinical settings are three primary metrology areas of interest: measurement linearity and bias, repeatability, and the ability to consistently reproduce equivalent results when conditions change, as would be expected in any clinical trial. Unfortunately, performance studies to date differ greatly in designs, analysis methods, and metrics used to assess a quantitative imaging biomarker for clinical use. It is therefore difficult or impossible to integrate results from different studies or to use reported results to design studies. The Radiological Society of North America and the Quantitative Imaging Biomarker Alliance, with technical, radiological, and statistical experts, developed a set of technical performance analysis methods, metrics, and study designs that provide terminology, metrics, and methods consistent with widely accepted metrological standards. This document provides a consistent framework for the conduct and evaluation of quantitative imaging biomarker performance studies so that results from multiple studies can be compared, contrasted, or combined. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  11. Quantitative learning strategies based on word networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the increasing population of English learners, optimizing the learning process will have a significant impact on English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and investigate English learning strategies. In this paper, quantitative English learning strategies based on word networks and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness. In particular, the optimized-weight-first strategy and the segmented strategies outperform the others. The results provide opportunities for researchers and practitioners to reconsider the way English is taught and to design vocabularies quantitatively, balancing efficiency and learning costs based on the word network.
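
    The abstract does not give the weighting formula; the sketch below illustrates only the general idea of blending a word's usage frequency with the influence of already-learned neighbors in the word network. The graph, frequencies, and blend weight `alpha` are all invented for illustration.

```python
# Hypothetical word network: adjacency (co-occurrence) plus usage frequency.
graph = {
    "make": {"decision", "money", "sense"},
    "decision": {"make", "tree"},
    "money": {"make", "bank"},
    "sense": {"make"},
    "tree": {"decision"},
    "bank": {"money"},
}
freq = {"make": 900, "decision": 200, "money": 400,
        "sense": 300, "tree": 150, "bank": 250}

def next_word(learned: set, alpha: float = 0.5) -> str:
    """Pick the unlearned word maximizing a blend of normalized frequency and
    the fraction of its neighbors already learned (alpha is an illustrative
    weight, not a value from the paper)."""
    max_f = max(freq.values())
    def weight(w):
        support = sum(1 for n in graph[w] if n in learned)
        return alpha * freq[w] / max_f + (1 - alpha) * support / max(len(graph[w]), 1)
    return max((w for w in graph if w not in learned), key=weight)

learned = {"make"}
print(next_word(learned))  # "sense": modest frequency, but its only neighbor is learned
```

    Repeatedly calling `next_word` and adding the result to `learned` yields a greedy learning order; the paper's dynamically updated weights would recompute the network influence at each step in a similar spirit.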

  12. Does contraceptive treatment in wildlife result in side effects? A review of quantitative and anecdotal evidence.

    PubMed

    Gray, Meeghan E; Cameron, Elissa Z

    2010-01-01

    The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, before a contraceptive is used for population manipulation, it should minimally impact the behavior of individuals and populations; these effects have received less attention. Potential side effects have been identified theoretically, and we reviewed published studies that investigated side effects on the behavior and physiology of individuals or at the population level, which provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects but were usually based on qualitative data or anecdotes. A meta-analysis of quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient for investigating the secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about secondary effects of contraceptive treatment and that experimental studies are essential for firm conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex, and fertility on the behavior of wildlife species.

  13. Noise suppression for dual-energy CT via penalized weighted least-square optimization with similarity-based regularization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harms, Joseph; Wang, Tonghe; Petrongolo, Michael

    Purpose: Dual-energy CT (DECT) expands applications of CT imaging in its capability to decompose CT images into material images. However, decomposition via direct matrix inversion leads to large noise amplification and limits quantitative use of DECT. Their group has previously developed a noise suppression algorithm via penalized weighted least-square optimization with edge-preservation regularization (PWLS-EPR). In this paper, the authors improve method performance using the same framework of penalized weighted least-square optimization but with similarity-based regularization (PWLS-SBR), which substantially enhances the quality of decomposed images by retaining a more uniform noise power spectrum (NPS). Methods: The design of PWLS-SBR is based on the fact that averaging pixels of similar materials gives a low-noise image. For each pixel, the authors calculate the similarity to other pixels in its neighborhood by comparing CT values. Using an empirical Gaussian model, the authors assign high/low similarity value to one neighboring pixel if its CT value is close/far to the CT value of the pixel of interest. These similarity values are organized in matrix form, such that multiplication of the similarity matrix to the image vector reduces image noise. The similarity matrices are calculated on both high- and low-energy CT images and averaged. In PWLS-SBR, the authors include a regularization term to minimize the L-2 norm of the difference between the images without and with noise suppression via similarity matrix multiplication. By using all pixel information of the initial CT images rather than just those lying on or near edges, PWLS-SBR is superior to the previously developed PWLS-EPR, as supported by comparison studies on phantoms and a head-and-neck patient. Results: On the line-pair slice of the Catphan©600 phantom, PWLS-SBR outperforms PWLS-EPR and retains spatial resolution of 8 lp/cm, comparable to the original CT images, even at 90% reduction

  14. Noise suppression for dual-energy CT via penalized weighted least-square optimization with similarity-based regularization

    PubMed Central

    Harms, Joseph; Wang, Tonghe; Petrongolo, Michael; Niu, Tianye; Zhu, Lei

    2016-01-01

    Purpose: Dual-energy CT (DECT) expands applications of CT imaging in its capability to decompose CT images into material images. However, decomposition via direct matrix inversion leads to large noise amplification and limits quantitative use of DECT. Their group has previously developed a noise suppression algorithm via penalized weighted least-square optimization with edge-preservation regularization (PWLS-EPR). In this paper, the authors improve method performance using the same framework of penalized weighted least-square optimization but with similarity-based regularization (PWLS-SBR), which substantially enhances the quality of decomposed images by retaining a more uniform noise power spectrum (NPS). Methods: The design of PWLS-SBR is based on the fact that averaging pixels of similar materials gives a low-noise image. For each pixel, the authors calculate the similarity to other pixels in its neighborhood by comparing CT values. Using an empirical Gaussian model, the authors assign high/low similarity value to one neighboring pixel if its CT value is close/far to the CT value of the pixel of interest. These similarity values are organized in matrix form, such that multiplication of the similarity matrix to the image vector reduces image noise. The similarity matrices are calculated on both high- and low-energy CT images and averaged. In PWLS-SBR, the authors include a regularization term to minimize the L-2 norm of the difference between the images without and with noise suppression via similarity matrix multiplication. By using all pixel information of the initial CT images rather than just those lying on or near edges, PWLS-SBR is superior to the previously developed PWLS-EPR, as supported by comparison studies on phantoms and a head-and-neck patient. Results: On the line-pair slice of the Catphan©600 phantom, PWLS-SBR outperforms PWLS-EPR and retains spatial resolution of 8 lp/cm, comparable to the original CT images, even at 90% reduction in noise
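
    The core operation described in the Methods section - Gaussian similarity weights computed from CT-value differences, then applied as a weighted average over a pixel's neighborhood - can be sketched as follows. This is a simplified, loop-based illustration of the similarity-matrix idea, not the published PWLS-SBR algorithm, which embeds the similarity matrix inside a penalized weighted least-square objective; the neighborhood radius and `sigma_ct` are illustrative parameters.

```python
import numpy as np

def similarity_smooth(image: np.ndarray, sigma_ct: float, radius: int = 1) -> np.ndarray:
    """Replace each pixel by a similarity-weighted average of its neighborhood,
    with weights from a Gaussian in CT-value difference: pixels of similar
    material (similar CT value) contribute strongly, dissimilar ones weakly."""
    h, w = image.shape
    out = np.zeros_like(image, dtype=float)
    for i in range(h):
        for j in range(w):
            weights, values = [], []
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w:
                        diff = float(image[ni, nj]) - float(image[i, j])
                        weights.append(np.exp(-(diff ** 2) / (2 * sigma_ct ** 2)))
                        values.append(float(image[ni, nj]))
            out[i, j] = np.dot(weights, values) / np.sum(weights)
    return out

# Small demonstration on a noisy patch of roughly uniform material
noisy = np.array([[100.0, 104.0], [96.0, 100.0]])
print(similarity_smooth(noisy, sigma_ct=20.0))
```

    With a large `sigma_ct` relative to the noise, this tends toward a plain neighborhood average; with a small `sigma_ct`, only near-identical CT values are averaged, which is what preserves material boundaries better than edge-only regularization.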

  15. Histological Image Processing Features Induce a Quantitative Characterization of Chronic Tumor Hypoxia

    PubMed Central

    Grabocka, Elda; Bar-Sagi, Dafna; Mishra, Bud

    2016-01-01

    Hypoxia in tumors signifies resistance to therapy. Despite a wealth of tumor histology data, including anti-pimonidazole staining, no current methods use these data to induce a quantitative characterization of chronic tumor hypoxia in time and space. We use image-processing algorithms to develop a set of candidate image features that can formulate just such a quantitative description of xenografted colorectal chronic tumor hypoxia. Two features in particular give low-variance measures of chronic hypoxia near a vessel: intensity sampling that extends radially away from approximated blood vessel centroids, and multithresholding to segment tumor tissue into normal, hypoxic, and necrotic regions. From these features we derive a spatiotemporal logical expression whose truth value depends on its predicate clauses that are grounded in this histological evidence. As an alternative to the spatiotemporal logical formulation, we also propose a way to formulate a linear regression function that uses all of the image features to learn what chronic hypoxia looks like, and then gives a quantitative similarity score once it is trained on a set of histology images. PMID:27093539

  16. Alphabetic letter identification: Effects of perceivability, similarity, and bias

    PubMed Central

    Mueller, Shane T.; Weidemann, Christoph T.

    2012-01-01

    The legibility of the letters in the Latin alphabet has been measured numerous times since the beginning of experimental psychology. To identify the theoretical mechanisms attributed to letter identification, we report a comprehensive review of the literature, spanning more than a century. This review revealed that identification accuracy has frequently been attributed to a subset of three common sources: perceivability, bias, and similarity. However, simultaneous estimates of these values have rarely (if ever) been performed. We present the results of two new experiments that allow for the simultaneous estimation of these factors, and examine how the shape of a visual mask impacts each of them, as inferred through a new statistical model. Results showed that the shape and identity of the mask impacted the inferred perceivability, bias, and similarity space of a letter set, but that there were aspects of similarity that were robust to the choice of mask. The results illustrate how the psychological concepts of perceivability, bias, and similarity can be estimated simultaneously, and how each makes powerful contributions to visual letter identification. PMID:22036587

  17. Quantitative Literacy: Geosciences and Beyond

    NASA Astrophysics Data System (ADS)

    Richardson, R. M.; McCallum, W. G.

    2002-12-01

    Quantitative literacy seems like such a natural for the geosciences, right? The field has gone from its origin as a largely descriptive discipline to one where it is hard to imagine failing to bring a full range of mathematical tools to the solution of geological problems. Although there are many definitions of quantitative literacy, we have proposed one that is analogous to the UNESCO definition of conventional literacy: "A quantitatively literate person is one who, with understanding, can both read and represent quantitative information arising in his or her everyday life." Central to this definition is the concept that a curriculum for quantitative literacy must go beyond the basic ability to "read and write" mathematics and develop conceptual understanding. It is also critical that a curriculum for quantitative literacy be engaged with a context, be it everyday life, humanities, geoscience or other sciences, business, engineering, or technology. Thus, our definition works both within and outside the sciences. What role do geoscience faculty have in helping students become quantitatively literate? Is it our role, or that of the mathematicians? How does quantitative literacy vary between different scientific and engineering fields? Or between science and nonscience fields? We will argue that successful quantitative literacy curricula must be an across-the-curriculum responsibility. We will share examples of how quantitative literacy can be developed within a geoscience curriculum, beginning with introductory classes for nonmajors (using the Mauna Loa CO2 data set) through graduate courses in inverse theory (using singular value decomposition). We will highlight six approaches to across-the-curriculum efforts from national models: collaboration between mathematics and other faculty; gateway testing; intensive instructional support; workshops for nonmathematics faculty; quantitative reasoning requirement; and individual initiative by nonmathematics faculty.

  18. Personality similarity in negotiations: Testing the dyadic effects of similarity in interpersonal traits and the use of emotional displays on negotiation outcomes.

    PubMed

    Wilson, Kelly Schwind; DeRue, D Scott; Matta, Fadel K; Howe, Michael; Conlon, Donald E

    2016-10-01

    We build on the small but growing literature documenting personality influences on negotiation by examining how the joint disposition of both negotiators with respect to the interpersonal traits of agreeableness and extraversion influences important negotiation processes and outcomes. Building on similarity-attraction theory, we articulate and demonstrate how being similarly high or similarly low on agreeableness and extraversion leads dyad members to express more positive emotional displays during negotiation. Moreover, because of increased positive emotional displays, we show that dyads with such compositions also tend to reach agreements faster, perceive less relationship conflict, and have more positive impressions of their negotiation partner. Interestingly, these results hold regardless of whether negotiating dyads are similar in normatively positive (i.e., similarly agreeable and similarly extraverted) or normatively negative (i.e., similarly disagreeable and similarly introverted) ways. Overall, these findings demonstrate the importance of considering the dyad's personality configuration when attempting to understand the affective experience as well as the downstream outcomes of a negotiation. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

    A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  20. The predictive value of quantitative fibronectin testing in combination with cervical length measurement in symptomatic women.

    PubMed

    Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan

    2016-12-01

    The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on its combination with cervical length measurement. The purpose of this study was to compare quantitative and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with accuracy similar to that of the combination of cervical length and qualitative fibronectin. Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk

  1. UK audit of quantitative thyroid uptake imaging.

    PubMed

    Taylor, Jonathan C; Murray, Anthony W; Hall, David O; Barnfield, Mark C; O'Shaugnessy, Emma R; Carson, Kathryn J; Cullis, James; Towey, David J; Kenny, Bob

    2017-07-01

    A national audit of quantitative thyroid uptake imaging was conducted by the Nuclear Medicine Software Quality Group of the Institute of Physics and Engineering in Medicine in 2014/2015. The aims of the audit were to measure and assess the variability in thyroid uptake results across the UK and to compare local protocols with British Nuclear Medicine Society (BNMS) guidelines. Participants were invited through a combination of emails on a public mailbase and targeted invitations from regional co-ordinators. All participants were given a set of images from which to calculate quantitative measures and a spreadsheet for capturing results. The image data consisted of two sets of 10 anterior thyroid images, half of which were acquired after administration of Tc-pertechnetate and the other half after administration of I-iodide. Images of the administration syringes or thyroid phantoms were also included. In total, 54 participants responded to the audit. The median number of scans conducted per year was 50. A majority of centres had at least one noncompliance with BNMS guidelines. Of most concern was the widespread lack of injection-site imaging. Quantitative results showed that both intersite and intrasite variability were low for the Tc dataset. The coefficient of quartile deviation was between 0.03 and 0.13 for measurements of overall percentage uptake. Although the number of returns for the I dataset was smaller, the level of variability between participants was greater (the coefficient of quartile deviation was between 0.17 and 0.25). A UK-wide audit showed that thyroid uptake imaging is still a common test in the UK. It was found that most centres do not adhere to all aspects of the BNMS practice guidelines but that quantitative results are reasonably consistent for Tc-based scans.
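
    The coefficient of quartile deviation reported by the audit has a standard definition, (Q3 - Q1) / (Q3 + Q1). A small sketch with hypothetical percentage-uptake results (not the audit's data) shows the computation:

```python
import statistics

def coefficient_of_quartile_deviation(values):
    """(Q3 - Q1) / (Q3 + Q1): a scale-free dispersion measure robust to
    outliers, suited to comparing variability across sites."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # exclusive method (default)
    return (q3 - q1) / (q3 + q1)

# Hypothetical overall percentage-uptake results from seven centres
uptakes = [1.8, 2.0, 2.1, 2.2, 2.3, 2.5, 2.9]
print(round(coefficient_of_quartile_deviation(uptakes), 2))  # → 0.11
```

    Values near the audit's Tc range (0.03-0.13) indicate tight agreement between centres; the larger I-dataset values (0.17-0.25) correspond to a visibly wider interquartile spread relative to the typical result.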

  2. Pressure ratio effects on self-similar scalar mixing of high-pressure turbulent jets in a pressurized volume

    NASA Astrophysics Data System (ADS)

    Ruggles, Adam; Pickett, Lyle; Frank, Jonathan

    2014-11-01

    Many real-world combustion devices model fuel scalar mixing by assuming the self-similar behavior established for atmospheric free jets. This allows simple prediction of the mean and rms fuel scalar fields to describe the mixing. This approach has been adopted for supercritical liquid injection in diesel engines, where the liquid behaves as a dense fluid. The effect of the pressure ratio (injection to ambient) upon the self-similar collapse, when the ambient pressure is greater than atmospheric, has not been well characterized, particularly the effect upon mixing constants, jet spreading rates, and virtual origins. Changes in these self-similar parameters control the reproduction of the scalar mixing statistics. This experiment investigates the steady-state mixing of high-pressure ethylene jets in a pressurized pure-nitrogen environment for various pressure ratios and jet orifice diameters. Quantitative laser Rayleigh scattering imaging was performed utilizing a calibration procedure to account for the pressure effects upon scattering interference within the high-pressure vessel.

  3. Optimization of Statistical Methods Impact on Quantitative Proteomics Data.

    PubMed

    Pursiheimo, Anna; Vehmas, Anni P; Afzal, Saira; Suomi, Tomi; Chand, Thaman; Strauss, Leena; Poutanen, Matti; Rokka, Anne; Corthals, Garry L; Elo, Laura L

    2015-10-02

    As tools for quantitative label-free mass spectrometry (MS) rapidly develop, a consensus about the best practices is not apparent. In the work described here we compared popular statistical methods for detecting differential protein expression from quantitative MS data using both controlled experiments with known quantitative differences for specific proteins used as standards as well as "real" experiments where differences in protein abundance are not known a priori. Our results suggest that data-driven reproducibility-optimization can consistently produce reliable differential expression rankings for label-free proteome tools and are straightforward in their application.

  4. Quantitative, spectrally-resolved intraoperative fluorescence imaging

    PubMed Central

    Valdés, Pablo A.; Leblond, Frederic; Jacobs, Valerie L.; Wilson, Brian C.; Paulsen, Keith D.; Roberts, David W.

    2012-01-01

    Intraoperative visual fluorescence imaging (vFI) has emerged as a promising aid to surgical guidance, but does not fully exploit the potential of the fluorescent agents that are currently available. Here, we introduce a quantitative fluorescence imaging (qFI) approach that converts spectrally-resolved data into images of absolute fluorophore concentration pixel-by-pixel across the surgical field of view (FOV). The resulting estimates are linear, accurate, and precise relative to true values, and spectral decomposition of multiple fluorophores is also achieved. Experiments with protoporphyrin IX in a glioma rodent model demonstrate in vivo quantitative and spectrally-resolved fluorescence imaging of infiltrating tumor margins for the first time. Moreover, we present images from human surgery which detect residual tumor not evident with state-of-the-art vFI. The wide-field qFI technique has broad implications for intraoperative surgical guidance because it provides near real-time quantitative assessment of multiple fluorescent biomarkers across the operative field. PMID:23152935

  5. simDEF: definition-based semantic similarity measure of gene ontology terms for functional similarity analysis of genes.

    PubMed

    Pesaranghader, Ahmad; Matwin, Stan; Sokolova, Marina; Beiko, Robert G

    2016-05-01

    Measures of protein functional similarity are essential tools for function prediction, evaluation of protein-protein interactions (PPIs) and other applications. Several existing methods perform comparisons between proteins based on the semantic similarity of their GO terms; however, these measures are highly sensitive to modifications in the topological structure of GO, tend to be focused on specific analytical tasks and concentrate on the GO terms themselves rather than considering their textual definitions. We introduce simDEF, an efficient method for measuring semantic similarity of GO terms using their GO definitions, which is based on the Gloss Vector measure commonly used in natural language processing. The simDEF approach builds optimized definition vectors for all relevant GO terms, and expresses the similarity of a pair of proteins as the cosine of the angle between their definition vectors. Relative to existing similarity measures, when validated on a yeast reference database, simDEF improves correlation with sequence homology by up to 50%, shows a correlation improvement >4% with gene expression in the biological process hierarchy of GO and increases PPI predictability by >2.5% in F1 score for the molecular function hierarchy. Datasets, results and source code are available at http://kiwi.cs.dal.ca/Software/simDEF. Contact: ahmad.pgh@dal.ca or beiko@cs.dal.ca. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
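
    simDEF's optimized Gloss Vectors are more involved than this, but the final step described in the abstract - similarity as the cosine of the angle between definition vectors - can be illustrated with plain bag-of-words vectors. The GO-style definitions below are paraphrased examples invented for the sketch, not taken from the paper.

```python
from collections import Counter
import math

def definition_vector(definition: str) -> Counter:
    """Bag-of-words vector for a term definition (a simplification of the
    optimized Gloss Vectors that simDEF actually builds)."""
    return Counter(definition.lower().split())

def cosine(u: Counter, v: Counter) -> float:
    """Cosine of the angle between two sparse word-count vectors."""
    dot = sum(u[w] * v[w] for w in u)
    norm_u = math.sqrt(sum(c * c for c in u.values()))
    norm_v = math.sqrt(sum(c * c for c in v.values()))
    return dot / (norm_u * norm_v) if norm_u and norm_v else 0.0

go_a = "catalysis of the hydrolysis of ATP"
go_b = "catalysis of the hydrolysis of GTP"
go_c = "regulation of transcription by RNA polymerase"

print(cosine(definition_vector(go_a), definition_vector(go_b)))  # higher: definitions share most words
print(cosine(definition_vector(go_a), definition_vector(go_c)))  # lower: definitions barely overlap
```

    Comparing definitions rather than graph positions is what makes this kind of measure robust to topological changes in GO, which is the motivation the abstract gives for simDEF.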

  6. An anthropomorphic phantom for quantitative evaluation of breast MRI.

    PubMed

    Freed, Melanie; de Zwart, Jacco A; Loud, Jennifer T; El Khouli, Riham H; Myers, Kyle J; Greene, Mark H; Duyn, Jeff H; Badano, Aldo

    2011-02-01

    In this study, the authors aim to develop a physical, tissue-mimicking phantom for quantitative evaluation of breast MRI protocols. The objective of this phantom is to address the need for improved standardization in breast MRI and provide a platform for evaluating the influence of image protocol parameters on lesion detection and discrimination. Quantitative comparisons between patient and phantom image properties are presented. The phantom is constructed using a mixture of lard and egg whites, resulting in a random structure with separate adipose- and glandular-mimicking components. T1 and T2 relaxation times of the lard and egg components of the phantom were estimated at 1.5 T from inversion recovery and spin-echo scans, respectively, using maximum-likelihood methods. The image structure was examined quantitatively by calculating and comparing spatial covariance matrices of phantom and patient images. A static, enhancing lesion was introduced by creating a hollow mold with stereolithography and filling it with a gadolinium-doped water solution. Measured phantom relaxation values fall within 2 standard errors of human values from the literature and are reasonably stable over 9 months of testing. Comparison of the covariance matrices of phantom and patient data demonstrates that the phantom and patient data have similar image structure. Their covariance matrices are the same to within error bars in the anterior-posterior direction and to within about two error bars in the right-left direction. The signal from the phantom's adipose-mimicking material can be suppressed using active fat-suppression protocols. A static, enhancing lesion can also be included with the ability to change morphology and contrast agent concentration. The authors have constructed a phantom and demonstrated its ability to mimic human breast images in terms of key physical properties that are relevant to breast MRI. This phantom provides a platform for the optimization and standardization of

  7. Collection of quantitative chemical release field data.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demirgian, J.; Macha, S.; Loyola Univ.

    1999-01-01

    Detection and quantitation of chemicals in the environment requires Fourier-transform infrared (FTIR) instruments that are properly calibrated and tested. This calibration and testing requires field testing using matrices that are representative of actual instrument use conditions. Three methods commonly used for developing calibration files and training sets in the field are a closed optical cell or chamber, a large-scale chemical release, and a small-scale chemical release. There is no best method. The advantages and limitations of each method should be considered in evaluating field results. Proper calibration characterizes the sensitivity of an instrument, its ability to detect a component in different matrices, and the quantitative accuracy and precision of the results.

  8. Scoring in genetically modified organism proficiency tests based on log-transformed results.

    PubMed

    Thompson, Michael; Ellison, Stephen L R; Owen, Linda; Mathieson, Kenneth; Powell, Joanne; Key, Pauline; Wood, Roger; Damant, Andrew P

    2006-01-01

    The study considers data from 2 UK-based proficiency schemes and includes data from a total of 29 rounds and 43 test materials over a period of 3 years. The results from the 2 schemes are similar and reinforce each other. A model of the amplification process used in quantitative polymerase chain reaction determinations predicts a mixture of normal, binomial, and lognormal distributions, dominated by the latter two. As predicted, the study results consistently follow a positively skewed distribution. Log-transformation prior to calculating z-scores is effective in establishing near-symmetric distributions that are sufficiently close to normal to justify interpretation on the basis of the normal distribution.
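    The scoring scheme described above reduces to a small calculation: log-transform first, then score against the assigned value on the log scale. A minimal sketch; the assigned value and target standard deviation below are illustrative placeholders, not figures from the schemes.

    ```python
    import math

    def log_z_score(result, assigned_value, target_sd_log10):
        """z-score computed on log10-transformed results, so that a
        positively skewed raw distribution becomes near-symmetric.
        `assigned_value` and `target_sd_log10` are hypothetical scheme
        parameters, not values from the study."""
        return (math.log10(result) - math.log10(assigned_value)) / target_sd_log10

    # A result equal to the assigned value scores 0; with a target SD of
    # 0.2 log10 units, a ten-fold high result scores +5.
    ```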

  9. Quantitative method to assess caries via fluorescence imaging from the perspective of autofluorescence spectral analysis

    NASA Astrophysics Data System (ADS)

    Chen, Q. G.; Zhu, H. H.; Xu, Y.; Lin, B.; Chen, H.

    2015-08-01

    A quantitative method to discriminate caries lesions for a fluorescence imaging system is proposed in this paper. The autofluorescence spectral investigation of 39 teeth samples classified by the International Caries Detection and Assessment System levels was performed at 405 nm excitation. The major differences among the caries lesions were concentrated in the relative spectral intensity range of 565-750 nm. The spectral parameter, defined as the ratio of the 565-750 nm waveband to the whole spectral range, was calculated. The image component ratio R/(G + B) of color components was statistically computed by considering the spectral parameters (e.g. autofluorescence, optical filter, and spectral sensitivity) in our fluorescence color imaging system. Results showed that the spectral parameter and the image component ratio exhibited a linear relationship. Therefore, the image component ratio was graded as <0.66, 0.66-1.06, 1.06-1.62, and >1.62 to quantitatively classify sound, early decay, established decay, and severe decay tissues, respectively. Finally, the fluorescence images of caries were experimentally obtained, and the corresponding image component ratio distribution was compared with the classification result. This method can be applied to similar imaging systems.
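    The grading rule reported above is simple enough to state as code. A minimal sketch using the published thresholds; the channel values and function name are illustrative.

    ```python
    def classify_caries(r, g, b):
        """Grade a tooth region by the image component ratio R/(G + B),
        using the thresholds reported in the abstract (<0.66, 0.66-1.06,
        1.06-1.62, >1.62)."""
        ratio = r / (g + b)
        if ratio < 0.66:
            return "sound"
        elif ratio < 1.06:
            return "early decay"
        elif ratio < 1.62:
            return "established decay"
        return "severe decay"
    ```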

  10. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  11. A Quantitative Infrared Spectroscopy Experiment.

    ERIC Educational Resources Information Center

    Krahling, Mark D.; Eliason, Robert

    1985-01-01

    Although infrared spectroscopy is used primarily for qualitative identifications, it is possible to use it as a quantitative tool as well. The use of a standard curve to determine percent methanol in a 2,2,2-trifluoroethanol sample is described. Background information, experimental procedures, and results obtained are provided. (JN)

  12. Distributed Efficient Similarity Search Mechanism in Wireless Sensor Networks

    PubMed Central

    Ahmed, Khandakar; Gregory, Mark A.

    2015-01-01

    The Wireless Sensor Network similarity search problem has received considerable research attention due to sensor hardware imprecision and environmental parameter variations. Most of the state-of-the-art distributed data centric storage (DCS) schemes lack optimization for similarity queries of events. In this paper, a DCS scheme with metric based similarity searching (DCSMSS) is proposed. DCSMSS takes motivation from vector distance index, called iDistance, in order to transform the issue of similarity searching into the problem of an interval search in one dimension. In addition, a sector based distance routing algorithm is used to efficiently route messages. Extensive simulation results reveal that DCSMSS is highly efficient and significantly outperforms previous approaches in processing similarity search queries. PMID:25751081
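    The iDistance idea referred to above can be illustrated compactly: each point is keyed by the index of its nearest reference point plus its distance to that point, so a multi-dimensional similarity query becomes a handful of one-dimensional interval scans. This is a generic sketch of iDistance keying, not the DCSMSS protocol itself; the separation constant `c` is an assumption and must exceed any possible distance.

    ```python
    import math

    def idistance_key(point, reference_points, c=1000.0):
        """Map a multi-dimensional point to a one-dimensional key:
        i*c + dist(point, ref_i), where ref_i is the nearest of the
        reference points. Keys for different reference regions occupy
        disjoint intervals of the real line."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        i, ref = min(enumerate(reference_points),
                     key=lambda ir: dist(point, ir[1]))
        return i * c + dist(point, ref)
    ```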

  13. Intra-laboratory validation of chronic bee paralysis virus quantitation using an accredited standardised real-time quantitative RT-PCR method.

    PubMed

    Blanchard, Philippe; Regnault, Julie; Schurr, Frank; Dubois, Eric; Ribière, Magali

    2012-03-01

    Chronic bee paralysis virus (CBPV) is responsible for chronic bee paralysis, an infectious and contagious disease in adult honey bees (Apis mellifera L.). A real-time RT-PCR assay to quantitate the CBPV load is now available. To propose this assay as a reference method, it was characterised further in an intra-laboratory study during which the reliability and the repeatability of results and the performance of the assay were confirmed. The qPCR assay alone and the whole quantitation method (from sample RNA extraction to analysis) were both assessed following the ISO/IEC 17025 standard and the recent XP U47-600 standard issued by the French Standards Institute. The performance of the qPCR assay and of the overall CBPV quantitation method were validated over a 6 log range from 10^2 to 10^8 with a detection limit of 50 and 100 CBPV RNA copies, respectively, and the protocol of the real-time RT-qPCR assay for CBPV quantitation was approved by the French Accreditation Committee. Copyright © 2011 Elsevier B.V. All rights reserved.

  14. Evaluation of a rapid quantitative determination method of PSA concentration with gold immunochromatographic strips.

    PubMed

    Wu, Cheng-Ching; Lin, Hung-Yu; Wang, Chao-Ping; Lu, Li-Fen; Yu, Teng-Hung; Hung, Wei-Chin; Houng, Jer-Yiing; Chung, Fu-Mei; Lee, Yau-Jiunn; Hu, Jin-Jia

    2015-11-03

    Prostate cancer remains the most common cancer in men. Qualitative or semi-quantitative immunochromatographic measurements of prostate specific antigen (PSA) have been shown to be simple, noninvasive and feasible. The aim of this study was to evaluate an optimized gold immunochromatographic strip device for the detection of PSA, in which the results can be analysed using a Chromogenic Rapid Test Reader to quantitatively assess the test results. This reader measures the reflectance of the signal line via a charge-coupled device camera. For quantitative analysis, PSA concentration was computed via a calibration equation. Capillary blood samples from 305 men were evaluated, and two independent observers interpreted the test results after 12 min. Blood samples were also collected and tested with a conventional quantitative assay. Sensitivity, specificity, positive and negative predictive values, and accuracy of the PSA rapid quantitative test system were 100, 96.6, 89.5, 100, and 97.4 %, respectively. Reproducibility of the test was 99.2 %, and interobserver variation was 8 %, with a false positive rate of 3.4 %. The correlation coefficient between the ordinary quantitative assay and the rapid quantitative test was 0.960. The PSA rapid quantitative test system provided results quickly and was easy to use, so that tests using this system can be easily performed at outpatient clinics or elsewhere. This system may also be useful for initial cancer screening and for point-of-care testing, because results can be obtained within 12 min and at a cost lower than that of conventional quantitative assays.

  15. An Experimental Study on the Iso-Content-Based Angle Similarity Measure.

    ERIC Educational Resources Information Center

    Zhang, Jin; Rasmussen, Edie M.

    2002-01-01

    Retrieval performance of the iso-content-based angle similarity measure within the angle, distance, conjunction, disjunction, and ellipse retrieval models is compared with retrieval performance of the distance similarity measure and the angle similarity measure. Results show the iso-content-based angle similarity measure achieves satisfactory…

  16. Quantitative Glycomics Strategies*

    PubMed Central

    Mechref, Yehia; Hu, Yunli; Desantos-Garcia, Janie L.; Hussein, Ahmed; Tang, Haixu

    2013-01-01

    The correlations between protein glycosylation and many biological processes and diseases are increasing the demand for quantitative glycomics strategies enabling sensitive monitoring of changes in the abundance and structure of glycans. This is currently attained through multiple strategies employing several analytical techniques such as capillary electrophoresis, liquid chromatography, and mass spectrometry. The detection and quantification of glycans often involve labeling with ionic and/or hydrophobic reagents. This step is needed in order to enhance detection in spectroscopic and mass spectrometric measurements. Recently, labeling with stable isotopic reagents has also been presented as a very viable strategy enabling relative quantitation. The different strategies available for reliable and sensitive quantitative glycomics are herein described and discussed. PMID:23325767

  17. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should have mutually dissimilar fingerprints to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
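    The two components named above, fingerprint extraction and hash-based search, can be caricatured in a few lines. The sign-of-differences fingerprint and banded hashing below are generic audio-fingerprinting devices, not the exact FAST scheme, which uses wavelet features of spectral images.

    ```python
    import hashlib

    def fingerprint(window):
        """Toy fingerprint: binarize a feature vector by the sign of
        successive differences, so small amplitude distortions leave
        the signature unchanged."""
        return tuple(1 if b > a else 0 for a, b in zip(window, window[1:]))

    def lsh_bucket(fp, band, bands=4):
        """Locality-sensitive hashing by banding: fingerprints that
        agree on any one band hash to the same bucket, so similar
        waveforms collide with high probability."""
        n = len(fp) // bands
        chunk = fp[band * n:(band + 1) * n]
        return hashlib.md5(repr(chunk).encode()).hexdigest()
    ```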

  18. A Quantitative Theory of Human Color Choices

    PubMed Central

    Komarova, Natalia L.; Jameson, Kimberly A.

    2013-01-01

    The system for colorimetry adopted by the Commission Internationale de l’Eclairage (CIE) in 1931, along with its subsequent improvements, represents a family of light mixture models that has served well for many decades for stimulus specification and reproduction when highly controlled color standards are important. Still, with regard to color appearance many perceptual and cognitive factors are known to contribute to color similarity, and, in general, to all cognitive judgments of color. Using experimentally obtained odd-one-out triad similarity judgments from 52 observers, we demonstrate that CIE-based models can explain a good portion (but not all) of the color similarity data. Color difference quantified by CIELAB ΔE explained behavior at levels of 81% (across all colors), 79% (across red colors), and 66% (across blue colors). We show that the unexplained variation cannot be ascribed to inter- or intra-individual variations among the observers, and points to the presence of additional factors shared by the majority of responders. Based on this, we create a quantitative model of a lexicographic semiorder type, which shows how different perceptual and cognitive influences can trade-off when making color similarity judgments. We show that by incorporating additional influences related to categorical and lightness and saturation factors, the model explains more of the triad similarity behavior, namely, 91% (all colors), 90% (reds), and 87% (blues). We conclude that distance in a CIE model is but the first of several layers in a hierarchy of higher-order cognitive influences that shape color triad choices. We further discuss additional mitigating influences outside the scope of CIE modeling, which can be incorporated in this framework, including well-known influences from language, stimulus set effects, and color preference bias. We also discuss universal and cultural aspects of the model as well as non-uniformity of the color space with respect to different
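    For reference, the CIELAB ΔE used above to quantify color difference is, in its original CIE76 form, simply Euclidean distance in L*a*b* coordinates (later CIE refinements such as ΔE2000 reweight the terms; the abstract does not state which variant the study used):

    ```python
    import math

    def delta_e_cie76(lab1, lab2):
        """CIE76 color difference: Euclidean distance between two
        colors given as (L*, a*, b*) triples."""
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))
    ```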

  19. Statistical self-similarity of hotspot seamount volumes modeled as self-similar criticality

    USGS Publications Warehouse

    Tebbens, S.F.; Burroughs, S.M.; Barton, C.C.; Naar, D.F.

    2001-01-01

    The processes responsible for hotspot seamount formation are complex, yet the cumulative frequency-volume distribution of hotspot seamounts in the Easter Island/Salas y Gomez Chain (ESC) is found to be well-described by an upper-truncated power law. We develop a model for hotspot seamount formation where uniform energy input produces events initiated on a self-similar distribution of critical cells. We call this model Self-Similar Criticality (SSC). By allowing the spatial distribution of magma migration to be self-similar, the SSC model recreates the observed ESC seamount volume distribution. The SSC model may have broad applicability to other natural systems.
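    The upper-truncated power law mentioned above has a standard cumulative form in which the count falls to zero at the truncation size. A sketch with illustrative parameters, not the fitted ESC values:

    ```python
    def truncated_power_law_count(v, c, beta, v_truncation):
        """Upper-truncated power-law form commonly fitted to cumulative
        frequency-size data: N(>=v) = C * (v**-beta - vT**-beta).
        The count reaches zero exactly at the truncation size vT.
        Parameter values used with this sketch are illustrative."""
        return c * (v ** -beta - v_truncation ** -beta)
    ```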

  20. Gender similarities and differences.

    PubMed

    Hyde, Janet Shibley

    2014-01-01

    Whether men and women are fundamentally different or similar has been debated for more than a century. This review summarizes major theories designed to explain gender differences: evolutionary theories, cognitive social learning theory, sociocultural theory, and expectancy-value theory. The gender similarities hypothesis raises the possibility of theorizing gender similarities. Statistical methods for the analysis of gender differences and similarities are reviewed, including effect sizes, meta-analysis, taxometric analysis, and equivalence testing. Then, relying mainly on evidence from meta-analyses, gender differences are reviewed in cognitive performance (e.g., math performance), personality and social behaviors (e.g., temperament, emotions, aggression, and leadership), and psychological well-being. The evidence on gender differences in variance is summarized. The final sections explore applications of intersectionality and directions for future research.

  1. Improved collaborative filtering recommendation algorithm of similarity measure

    NASA Astrophysics Data System (ADS)

    Zhang, Baofu; Yuan, Baoping

    2017-05-01

    Collaborative filtering is one of the most widely used recommendation algorithms in personalized recommender systems. The key is to find the nearest-neighbor set of the active user by using a similarity measure. However, traditional similarity measures mainly focus on the similarity of the items users have rated in common, and ignore the relationship between those common items and all the items a user rates. Moreover, because the rating matrix is very sparse, the traditional collaborative filtering recommendation algorithm has low efficiency. To obtain better accuracy, this paper presents an improved similarity measure based on the common preference between users, the difference in rating scale, and the scores of common items; on this basis, a collaborative filtering recommendation algorithm using the improved similarity measure is proposed. Experimental results show that the algorithm effectively improves the quality of recommendation, thus alleviating the impact of data sparseness.
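    For contrast, the traditional measure the authors improve on is typically Pearson correlation computed only over the items two users have rated in common, which is exactly what makes it fragile on sparse data. A minimal sketch of that conventional baseline, not the paper's improved measure:

    ```python
    import math

    def pearson_similarity(ratings_u, ratings_v):
        """Conventional user-user Pearson similarity over co-rated items.
        Ratings are dicts mapping item id -> score; with fewer than two
        common items the similarity is undefined and returned as 0."""
        common = set(ratings_u) & set(ratings_v)
        if len(common) < 2:
            return 0.0
        mu = sum(ratings_u[i] for i in common) / len(common)
        mv = sum(ratings_v[i] for i in common) / len(common)
        num = sum((ratings_u[i] - mu) * (ratings_v[i] - mv) for i in common)
        den = (math.sqrt(sum((ratings_u[i] - mu) ** 2 for i in common)) *
               math.sqrt(sum((ratings_v[i] - mv) ** 2 for i in common)))
        return num / den if den else 0.0
    ```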

  2. Quantitation of TGF-beta1 mRNA in porcine mesangial cells by comparative kinetic RT/PCR: comparison with ribonuclease protection assay and in situ hybridization.

    PubMed

    Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F

    2001-01-01

    Gene expression can be examined with different techniques including ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as an internal standard, is a quantitative method for detecting significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared to RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in giving quantitative information on mRNA expression and indicate that comparative kinetic RT/PCR can be adopted as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.

  3. The Quantitative Methods Boot Camp: Teaching Quantitative Thinking and Computing Skills to Graduate Students in the Life Sciences

    PubMed Central

    Stefan, Melanie I.; Gutlerner, Johanna L.; Born, Richard T.; Springer, Michael

    2015-01-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a “boot camp” in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students’ engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others. PMID:25880064

  4. The quantitative methods boot camp: teaching quantitative thinking and computing skills to graduate students in the life sciences.

    PubMed

    Stefan, Melanie I; Gutlerner, Johanna L; Born, Richard T; Springer, Michael

    2015-04-01

    The past decade has seen a rapid increase in the ability of biologists to collect large amounts of data. It is therefore vital that research biologists acquire the necessary skills during their training to visualize, analyze, and interpret such data. To begin to meet this need, we have developed a "boot camp" in quantitative methods for biology graduate students at Harvard Medical School. The goal of this short, intensive course is to enable students to use computational tools to visualize and analyze data, to strengthen their computational thinking skills, and to simulate and thus extend their intuition about the behavior of complex biological systems. The boot camp teaches basic programming using biological examples from statistics, image processing, and data analysis. This integrative approach to teaching programming and quantitative reasoning motivates students' engagement by demonstrating the relevance of these skills to their work in life science laboratories. Students also have the opportunity to analyze their own data or explore a topic of interest in more detail. The class is taught with a mixture of short lectures, Socratic discussion, and in-class exercises. Students spend approximately 40% of their class time working through both short and long problems. A high instructor-to-student ratio allows students to get assistance or additional challenges when needed, thus enhancing the experience for students at all levels of mastery. Data collected from end-of-course surveys from the last five offerings of the course (between 2012 and 2014) show that students report high learning gains and feel that the course prepares them for solving quantitative and computational problems they will encounter in their research. We outline our course here which, together with the course materials freely available online under a Creative Commons License, should help to facilitate similar efforts by others.

  5. A Preliminary Quantitative Comparison of Vibratory Amplitude Using Rigid and Flexible Stroboscopic Assessment.

    PubMed

    Hosbach-Cannon, Carly J; Lowell, Soren Y; Kelley, Richard T; Colton, Raymond H

    2016-07-01

    The purpose of this study was to establish preliminary, quantitative data on amplitude of vibration during stroboscopic assessment in healthy speakers with normal voice characteristics. Amplitude of vocal fold vibration is a core physiological parameter used in diagnosing voice disorders, yet quantitative data are lacking to guide the determination of what constitutes normal vibratory amplitude. Eleven participants were assessed during sustained vowel production using rigid and flexible endoscopy with stroboscopy. Still images were extracted from digital recordings of a sustained /i/ produced at a comfortable pitch and loudness, with F0 controlled so that levels were within ±15% of each participant's comfortable mean level as determined from connected speech. Glottal width (GW), true vocal fold (TVF) length, and TVF width were measured from still frames representing the maximum open phase of the vibratory cycle. To control for anatomic and magnification differences across participants, GW was normalized to TVF length. GW as a ratio of TVF width was also computed for comparison with prior studies. Mean values and standard deviations were computed for the normalized measures. Paired t tests showed no significant differences between rigid and flexible endoscopy methods. Interrater and intrarater reliability values for raw measurements were found to be high (0.89-0.99). These preliminary quantitative data may be helpful in determining normality or abnormality of vocal fold vibration. Results indicate that quantified amplitude of vibration is similar between endoscopic methods, a clinically relevant finding for individuals performing and interpreting stroboscopic assessments. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  6. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.

  7. Protein structural similarity search by Ramachandran codes

    PubMed Central

    Lo, Wei-Cheng; Huang, Po-Jung; Chang, Chih-Hung; Lyu, Ping-Chiang

    2007-01-01

    Background Protein structural data has increased exponentially, such that fast and accurate tools are necessary for structural similarity search. To improve the search speed, several methods have been designed to reduce three-dimensional protein structures to one-dimensional text strings that are then analyzed by traditional sequence alignment methods; however, accuracy is usually sacrificed and the speed is still unable to match that of sequence similarity search tools. Here, we aimed to improve the linear encoding methodology and develop efficient search tools that can rapidly retrieve structural homologs from large protein databases. Results We propose a new linear encoding method, SARST (Structural similarity search Aided by Ramachandran Sequential Transformation). SARST transforms protein structures into text strings through a Ramachandran map organized by nearest-neighbor clustering and uses a regenerative approach to produce substitution matrices. Classical sequence similarity search methods can then be applied to the structural similarity search. Its accuracy is similar to Combinatorial Extension (CE) and it works over 243,000 times faster, searching 34,000 proteins in 0.34 sec with a 3.2-GHz CPU. SARST provides statistically meaningful expectation values to assess the retrieved information. It has been implemented as a web service and a stand-alone Java program that is able to run on many different platforms. Conclusion As a database search method, SARST can rapidly distinguish high from low similarities and efficiently retrieve homologous structures. It demonstrates that the easily accessible linear encoding methodology has the potential to serve as a foundation for efficient protein structural similarity search tools. These search tools should be applicable to automated and high-throughput functional annotations or predictions for the ever-increasing number of published protein structures in this post-genomic era. PMID:17716377
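    The linear-encoding step can be illustrated with a toy version: discretize each residue's (phi, psi) backbone angles into a cell of the Ramachandran map and emit one letter per residue, after which ordinary sequence alignment tools apply. SARST itself uses nearest-neighbor clusters of the map rather than the uniform grid assumed here.

    ```python
    def ramachandran_code(phi_psi_angles, bins=6):
        """Encode a structure as text: map each residue's (phi, psi)
        pair, in degrees on [-180, 180], to one of bins*bins grid cells
        and emit one letter per cell. A uniform-grid simplification of
        SARST's cluster-based encoding."""
        letters = []
        for phi, psi in phi_psi_angles:
            i = min(int((phi + 180.0) * bins / 360.0), bins - 1)
            j = min(int((psi + 180.0) * bins / 360.0), bins - 1)
            letters.append(chr(ord('A') + i * bins + j))
        return "".join(letters)
    ```

    Identical backbone conformations map to identical strings, so a run of alpha-helical residues, for example, becomes a repeated letter that sequence alignment can match cheaply.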

  8. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  9. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.

  10. Quantitative analyses of cell behaviors underlying notochord formation and extension in mouse embryos.

    PubMed

    Sausedo, R A; Schoenwolf, G C

    1994-05-01

    Formation and extension of the notochord (i.e., notogenesis) is one of the earliest and most obvious events of axis development in vertebrate embryos. In birds and mammals, prospective notochord cells arise from Hensen's node and come to lie beneath the midline of the neural plate. Throughout the period of neurulation, the notochord retains its close spatial relationship with the developing neural tube and undergoes rapid extension in concert with the overlying neuroepithelium. In the present study, we examined notochord development quantitatively in mouse embryos. C57BL/6 mouse embryos were collected at 8, 8.5, 9, 9.5, and 10 days of gestation. They were then embedded in paraffin and sectioned transversely. Serial sections from 21 embryos were stained with Schiff's reagent according to the Feulgen-Rossenbeck procedure and used for quantitative analyses of notochord extension. Quantitative analyses revealed that extension of the notochord involves cell division within the notochord proper and cell rearrangement within the notochordal plate (the immediate precursor of the notochord). In addition, extension of the notochord involves cell accretion, that is, the addition of cells to the notochord's caudal end, a process that involves considerable cell rearrangement at the notochordal plate-node interface. Extension of the mouse notochord occurs similarly to that described previously for birds (Sausedo and Schoenwolf, 1993 Anat. Rec. 237:58-70). That is, in both birds (i.e., quail and chick) and mouse embryos, notochord extension involves cell division, cell rearrangement, and cell accretion. Thus higher vertebrates utilize similar morphogenetic movements to effect notogenesis.

  11. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer’s Disease: Results from the DIAN Study Group

    PubMed Central

    Su, Yi; Blazey, Tyler M.; Owen, Christopher J.; Christensen, Jon J.; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C.; Ances, Beau M.; Snyder, Abraham Z.; Cash, Lisa A.; Koeppe, Robert A.; Klunk, William E.; Galasko, Douglas; Brickman, Adam M.; McDade, Eric; Ringman, John M.; Thompson, Paul M.; Saykin, Andrew J.; Ghetti, Bernardino; Sperling, Reisa A.; Johnson, Keith A.; Salloway, Stephen P.; Schofield, Peter R.; Masters, Colin L.; Villemagne, Victor L.; Fox, Nick C.; Förster, Stefan; Chen, Kewei; Reiman, Eric M.; Xiong, Chengjie; Marcus, Daniel S.; Weiner, Michael W.; Morris, John C.; Bateman, Randall J.; Benzinger, Tammie L. S.

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations to the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer’s Network (DIAN), an autosomal dominant Alzheimer’s disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brain-stem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improves sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power although it may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer’s disease population with PiB imaging, utilizing brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted. PMID:27010959

  12. An Illustration of Determining Quantitatively the Rock Mass Quality Parameters of the Hoek-Brown Failure Criterion

    NASA Astrophysics Data System (ADS)

    Wu, Li; Adoko, Amoussou Coffi; Li, Bo

    2018-04-01

    In tunneling, determining quantitatively the rock mass strength parameters of the Hoek-Brown (HB) failure criterion is useful since it can improve the reliability of the design of tunnel support systems. In this study, a quantitative method is proposed to determine the rock mass quality parameters of the HB failure criterion, namely the Geological Strength Index (GSI) and the disturbance factor (D), based on the structure of the drilling core and the weathering condition of the rock mass, combined with an acoustic wave test to calculate the strength of the rock mass. The Rock Mass Structure Index and the Rock Mass Weathering Index are used to quantify the GSI, while the longitudinal wave velocity (Vp) is employed to derive the value of D. The DK383+338 tunnel face of the Yaojia tunnel of the Shanghai-Kunming passenger dedicated line served as an illustration of how the methodology is implemented. The values of the GSI and D are obtained using the HB criterion and then using the proposed method. The measured in situ stress is used to evaluate their accuracy. To this end, the major and minor principal stresses are calculated based on the GSI and D given by the HB criterion and by the proposed method. The results indicated that both methods were close to the field observation, which suggests that the proposed method can be used for determining quantitatively the rock quality parameters as well. However, these results remain valid only for rock mass quality and rock type similar to those of the DK383+338 tunnel face of the Yaojia tunnel.

  13. Using SQL Databases for Sequence Similarity Searching and Analysis.

    PubMed

    Pearson, William R; Mackey, Aaron J

    2017-09-13

    Relational databases can integrate diverse types of information and manage large sets of similarity search results, greatly simplifying genome-scale analyses. By focusing on taxonomic subsets of sequences, relational databases can reduce the size and redundancy of sequence libraries and improve the statistical significance of homologs. In addition, by loading similarity search results into a relational database, it becomes possible to explore and summarize the relationships between all of the proteins in an organism and those in other biological kingdoms. This unit describes how to use relational databases to improve the efficiency of sequence similarity searching and demonstrates various large-scale genomic analyses of homology-related data. It also describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. The unit also introduces search_demo, a database that stores sequence similarity search results. The search_demo database is then used to explore the evolutionary relationships between E. coli proteins and proteins in other organisms in a large-scale comparative genomic analysis. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
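
    The workflow of loading similarity-search hits into a relational table and summarizing homologs can be sketched with an in-memory SQLite database. The schema, sequence names, and E-values below are invented for illustration and do not reflect the actual seqdb_demo or search_demo schemas.

```python
import sqlite3

# Store a few hypothetical similarity-search hits and summarize
# significant homologs per taxon, in the spirit of search_demo.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE hits (query TEXT, subject TEXT, evalue REAL, taxon TEXT)"
)
con.executemany("INSERT INTO hits VALUES (?, ?, ?, ?)", [
    ("ecoli_p1", "human_p9", 1e-30, "H. sapiens"),
    ("ecoli_p1", "yeast_p3", 1e-12, "S. cerevisiae"),
    ("ecoli_p2", "yeast_p7", 5e-3,  "S. cerevisiae"),  # not significant
])
# Count homologs per taxon below a significance cutoff.
rows = con.execute(
    "SELECT taxon, COUNT(*) FROM hits WHERE evalue < 1e-6 GROUP BY taxon"
).fetchall()
```

    Once hits live in a table, the same kind of GROUP BY query scales to genome-wide summaries across many organisms.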

  14. Slow erosion of a quantitative apple resistance to Venturia inaequalis based on an isolate-specific Quantitative Trait Locus.

    PubMed

    Caffier, Valérie; Le Cam, Bruno; Al Rifaï, Mehdi; Bellanger, Marie-Noëlle; Comby, Morgane; Denancé, Caroline; Didelot, Frédérique; Expert, Pascale; Kerdraon, Tifenn; Lemarquand, Arnaud; Ravon, Elisa; Durel, Charles-Eric

    2016-10-01

    Quantitative plant resistance affects the aggressiveness of pathogens and is usually considered more durable than qualitative resistance. However, the efficiency of a quantitative resistance based on an isolate-specific Quantitative Trait Locus (QTL) is expected to decrease over time due to the selection of isolates with a high level of aggressiveness on resistant plants. To test this hypothesis, we surveyed scab incidence over an eight-year period in an orchard planted with susceptible and quantitatively resistant apple genotypes. We sampled 79 Venturia inaequalis isolates from this orchard at three dates and we tested their level of aggressiveness under controlled conditions. Isolates sampled on resistant genotypes triggered higher lesion density and exhibited a higher sporulation rate on apple carrying the resistance allele of the QTL T1 compared to isolates sampled on susceptible genotypes. Due to this ability to select aggressive isolates, we expected the QTL T1 to be non-durable. However, our results showed that the quantitative resistance based on the QTL T1 remained efficient in orchard over an eight-year period, with only a slow decrease in efficiency and no detectable increase of the aggressiveness of fungal isolates over time. We conclude that knowledge on the specificity of a QTL is not sufficient to evaluate its durability. Deciphering molecular mechanisms associated with resistance QTLs, genetic determinants of aggressiveness and putative trade-offs within pathogen populations is needed to help in understanding the erosion processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. Effect of once-yearly zoledronic acid on the spine and hip as measured by quantitative computed tomography: results of the HORIZON Pivotal Fracture Trial

    PubMed Central

    Lang, T.; Boonen, S.; Cummings, S.; Delmas, P. D.; Cauley, J. A.; Horowitz, Z.; Kerzberg, E.; Bianchi, G.; Kendler, D.; Leung, P.; Man, Z.; Mesenbrink, P.; Eriksen, E. F.; Black, D. M.

    2016-01-01

    Summary Changes in bone mineral density and bone strength following treatment with zoledronic acid (ZOL) were measured by quantitative computed analysis (QCT) or dual-energy X-ray absorptiometry (DXA). ZOL treatment increased spine and hip BMD vs placebo, assessed by QCT and DXA. Changes in trabecular bone resulted in increased bone strength. Introduction To investigate bone mineral density (BMD) changes in trabecular and cortical bone, estimated by quantitative computed analysis (QCT) or dual-energy X-ray absorptiometry (DXA), and whether zoledronic acid 5 mg (ZOL) affects bone strength. Methods In 233 women from a randomized, controlled trial of once-yearly ZOL, lumbar spine, total hip, femoral neck, and trochanter were assessed by DXA and QCT (baseline, Month 36). Mean percentage changes from baseline and between-treatment differences (ZOL vs placebo, t-test) were evaluated. Results Mean between-treatment differences for lumbar spine BMD were significant by DXA (7.0%, p<0.01) and QCT (5.7%, p<0.0001). Between-treatment differences were significant for trabecular spine (p=0.0017) [non-parametric test], trabecular trochanter (10.7%, p<0.0001), total hip (10.8%, p<0.0001), and compressive strength indices at femoral neck (8.6%, p=0.0001), and trochanter (14.1%, p<0.0001). Conclusions Once-yearly ZOL increased hip and spine BMD vs placebo, assessed by QCT vs DXA. Changes in trabecular bone resulted in increased indices of compressive strength. PMID:19802508

  16. Improved personalized recommendation based on a similarity network

    NASA Astrophysics Data System (ADS)

    Wang, Ximeng; Liu, Yun; Xiong, Fei

    2016-08-01

    A recommender system helps individual users find preferred items rapidly and has attracted extensive attention in recent years. Many successful recommendation algorithms are designed on bipartite networks, such as network-based inference or heat conduction. However, most of these algorithms define their resource-allocation methods as an average allocation, which is not reasonable: average allocation reflects neither users' choice preferences nor the influence between users, and it therefore leads to non-personalized recommendation results. We propose a personalized recommendation approach that combines a similarity function and the bipartite network to generate a similarity network that improves the resource-allocation process. Our model introduces user influence into the recommender system and states that user influence can make the resource-allocation process more reasonable. We use four different metrics to evaluate our algorithms on three benchmark data sets. Experimental results show that the improved recommendation on a similarity network can obtain better accuracy and diversity than some competing approaches.
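
    The baseline resource-allocation process on a bipartite network can be sketched as the classic two-step spreading of network-based inference (ProbS). The toy user-item matrix below is invented for illustration, and the paper's similarity-network weighting is not reproduced here; this is only the standard scheme it builds on.

```python
import numpy as np

# User-item adjacency: rows are users, columns are items.
A = np.array([[1, 1, 0],   # user 0 collected items 0, 1
              [0, 1, 1],   # user 1 collected items 1, 2
              [1, 0, 1]])  # user 2 collected items 0, 2

k_user = A.sum(axis=1)  # user degrees
k_item = A.sum(axis=0)  # item degrees

# Two-step resource spreading (items -> users -> items):
# W[i, j] = (1 / k_item[j]) * sum_l A[l, i] * A[l, j] / k_user[l]
W = A.T @ (A / k_user[:, None]) / k_item[None, :]

scores = W @ A[0]  # recommendation scores for user 0's resource
```

    The columns of W each sum to one, so the user's total resource is conserved; uncollected items with high scores become the recommendations.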

  17. Reversing the similarity effect: The effect of presentation format.

    PubMed

    Cataldo, Andrea M; Cohen, Andrew L

    2018-06-01

    A context effect is a change in preference that occurs when alternatives are added to a choice set. Models of preferential choice that account for context effects largely assume a within-dimension comparison process. It has been shown, however, that the format in which a choice set is presented can influence comparison strategies. That is, a by-alternative or by-dimension grouping of the dimension values encourages within-alternative or within-dimension comparisons, respectively. For example, one classic context effect, the compromise effect, is strengthened by a by-dimension presentation format. Extrapolation from this result suggests that a second context effect, the similarity effect, will actually reverse when stimuli are presented in a by-dimension format. In the current study, we presented participants with a series of apartment choice sets designed to elicit the similarity effect, with either a by-alternative or by-dimension presentation format. Participants in the by-alternative condition demonstrated a standard similarity effect; however, participants in the by-dimension condition demonstrated a strong reverse similarity effect. The present data can be accounted for by Multialternative Decision Field Theory (MDFT) and the Multiattribute Linear Ballistic Accumulator (MLBA), but not Elimination by Aspects (EBA). Indeed, when some weak assumptions of within-dimension processes are met, MDFT and the MLBA predict the reverse similarity effect. These modeling results suggest that the similarity effect is governed by either forgetting and inhibition (MDFT), or attention to positive or negative differences (MLBA). These results demonstrate that flexibility in the comparison process needs to be incorporated into theories of preferential choice. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. A quantitative literature-curated gold standard for kinase-substrate pairs

    PubMed Central

    2011-01-01

    We describe the Yeast Kinase Interaction Database (KID, http://www.moseslab.csb.utoronto.ca/KID/), which contains high- and low-throughput data relevant to phosphorylation events. KID includes 6,225 low-throughput and 21,990 high-throughput interactions, from greater than 35,000 experiments. By quantitatively integrating these data, we identified 517 high-confidence kinase-substrate pairs that we consider a gold standard. We show that this gold standard can be used to assess published high-throughput datasets, suggesting that it will enable similar rigorous assessments in the future. PMID:21492431

  19. Dreaming and waking: similarities and differences revisited.

    PubMed

    Kahan, Tracey L; LaBerge, Stephen P

    2011-09-01

    Dreaming is often characterized as lacking high-order cognitive (HOC) skills. In two studies, we test the alternative hypothesis that the dreaming mind is highly similar to the waking mind. Multiple experience samples were obtained from late-night REM sleep and waking, following a systematic protocol described in Kahan (2001). Results indicated that reported dreaming and waking experiences are surprisingly similar in their cognitive and sensory qualities. Concurrently, ratings of dreaming and waking experiences were markedly different on questions of general reality orientation and logical organization (e.g., the bizarreness or typicality of the events, actions, and locations). Consistent with other recent studies (e.g., Bulkeley & Kahan, 2008; Kozmová & Wolman, 2006), experiences sampled from dreaming and waking were more similar with respect to their process features than with respect to their structural features. Copyright © 2010 Elsevier Inc. All rights reserved.

  20. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wahi-Anwar, M; Lo, P; Kim, H

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the most optimal local mean similarity, with local neighboring slices meeting the threshold requirement, is chosen as the classifier’s matched slice (if it exists). The classifier with the matched slice possessing the most optimal local mean similarity is then chosen as the ensemble’s best matching slice. If the best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include
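
    The slice-matching step of one such classifier can be sketched as follows: score each slice of the scan against the template with a similarity metric, keep only neighbourhoods whose slices all exceed a pre-trained threshold, and select the slice with the best local mean similarity. The metric (normalized correlation), threshold, and toy data below are illustrative assumptions; the paper does not specify its exact metric.

```python
import numpy as np

def best_matching_slice(scan, template, threshold, radius=1):
    """Return (index of matched slice or None, per-slice similarities)."""
    sims = np.array([
        np.corrcoef(s.ravel(), template.ravel())[0, 1] for s in scan
    ])
    best, best_score = None, -np.inf
    for i in range(radius, len(sims) - radius):
        window = sims[i - radius:i + radius + 1]
        # all neighbours must pass the threshold; pick best local mean
        if window.min() >= threshold and window.mean() > best_score:
            best, best_score = i, window.mean()
    return best, sims

rng = np.random.default_rng(0)
template = rng.random((8, 8))
scan = rng.random((10, 8, 8))          # mostly unrelated slices
scan[5] = template + 0.01 * rng.random((8, 8))  # embed a near-match
scan[4] = template + 0.05 * rng.random((8, 8))
scan[6] = template + 0.05 * rng.random((8, 8))
idx, sims = best_matching_slice(scan, template, threshold=0.8)
```

    Requiring the neighbouring slices to pass the threshold too, as described above, suppresses isolated spurious matches in the similarity distribution.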

  1. Quantitative Comparison of Mountain Belt Topographic Profiles on Earth and Venus

    NASA Astrophysics Data System (ADS)

    Stoddard, P. R.; Jurdy, D. M.

    2016-12-01

    Earth's mountain belts result from interactions between tectonic plates. Several styles of belts reflect the differing nature of those interactions: The narrow spine of the Andes results from subduction of the oceanic Nazca plate under the continental South American plate, the soaring Himalayas from the collision of India and Asia, the broad Rockies and Alaskan cordillera from multiple collisions, and the gentle Appalachians and Urals are remnants from ancient collisions. Venus' mountain chains - Maxwell, Freyja, Akna and Danu - surround Lakshmi Planum, a highland with an elevation of 4 km. These make up Ishtar Terra. Maxwell Montes ascends to over 11 km, the highest elevation on the planet. Freyja rises just over 7 km and Akna to about 6 km. The arcuate Danu belt on Ishtar's western boundary comes up to only 1.5 km over the planum. No other mountain belts exist on Venus. The origins of these venusian orogenic belts remain unknown. Earliest explanations invoked subduction around Lakshmi Planum; subsequent models included either up- or down-welling of the mantle, horizontal convergence, or crustal thickening. We quantitatively compare topography of Venus' mountain chains with Earth's for similarities and differences. Patterns may provide clues to the dynamics forming venusian orogenic belts. To do this, we find topographic profiles across the various chains, determine average profiles for each, and then correlate averages to establish the degree of similarity. From this correlation we construct a covariance matrix, diagonalized for eigenvalues, or principal components. These can be displayed as profiles. Correlations and principal components allow us to assess the degree of similarity and variability of the shapes of the average profiles. These analyses thus offer independent and objective modes of comparison; for example, with respect to terrestrial mid-ocean ridges, some Venus chasmata were shown to most closely resemble the ultra-slow Arctic spreading center.
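
    The profile-comparison workflow described above (average the cross-belt profiles, correlate the averages, then diagonalize for principal components) can be sketched with synthetic bell-shaped "belts"; the profiles below are invented illustrations, not real topography.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-1, 1, 50)  # distance across the belt (arbitrary units)

def belt_profiles(width, n=20):
    """n noisy cross-belt elevation profiles of a given characteristic width."""
    return np.array([
        np.exp(-(x / width) ** 2) + 0.05 * rng.standard_normal(x.size)
        for _ in range(n)
    ])

belts = {"narrow": belt_profiles(0.2),
         "broad": belt_profiles(0.6),
         "broad2": belt_profiles(0.62)}

# Average profile per belt, then correlate averages between belts.
names = list(belts)
profiles = np.array([belts[name].mean(axis=0) for name in names])
C = np.corrcoef(profiles)                      # similarity of average shapes
eigvals, eigvecs = np.linalg.eigh(np.cov(profiles))  # principal components
```

    The correlation matrix quantifies which belts share a profile shape, and the eigenvectors of the covariance matrix give the dominant modes of profile variability.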

  2. A quantitative philology of introspection

    PubMed Central

    Diuk, Carlos G.; Slezak, D. Fernandez; Raskovsky, I.; Sigman, M.; Cecchi, G. A.

    2012-01-01

    The cultural evolution of introspective thought has been recognized to undergo a drastic change during the middle of the first millennium BC. This period, known as the “Axial Age,” saw the birth of religions and philosophies still alive in modern culture, as well as the transition from orality to literacy—which led to the hypothesis of a link between introspection and literacy. Here we set out to examine the evolution of introspection in the Axial Age, studying the cultural record of the Greco-Roman and Judeo-Christian literary traditions. Using a statistical measure of semantic similarity, we identify a single “arrow of time” in the Old and New Testaments of the Bible, and a more complex non-monotonic dynamics in the Greco-Roman tradition reflecting the rise and fall of the respective societies. A comparable analysis of the twentieth century cultural record shows a steady increase in the incidence of introspective topics, punctuated by abrupt declines during and preceding the First and Second World Wars. Our results show that (a) it is possible to devise a consistent metric to quantify the history of a high-level concept such as introspection, cementing the path for a new quantitative philology and (b) to the extent that it is captured in the cultural record, the increased ability of human thought for self-reflection that the Axial Age brought about is still heavily determined by societal contingencies beyond the orality-literacy nexus. PMID:23015783
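
    A statistical measure of semantic similarity between texts can be illustrated in its simplest form as the cosine of their word-count vectors; the sentences below are invented, and the paper's actual metric operates in a richer semantic space, so this bag-of-words version is only a sketch.

```python
from collections import Counter
import math

def cosine_similarity(text_a, text_b):
    """Cosine of the word-count vectors of two texts."""
    a, b = Counter(text_a.split()), Counter(text_b.split())
    dot = sum(a[w] * b[w] for w in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm

s1 = "the spirit moved upon the waters"
s2 = "the spirit moved over the deep waters"
s3 = "taxes were collected in the province"
```

    Tracking how strongly successive texts resemble a reference vocabulary of introspective terms is one way such a metric yields a time series over a literary corpus.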

  3. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  4. Evaluation of reference gene suitability for quantitative expression analysis by quantitative polymerase chain reaction in the mandibular condyle of sheep.

    PubMed

    Jiang, Xin; Xue, Yang; Zhou, Hongzhi; Li, Shouhong; Zhang, Zongmin; Hou, Rui; Ding, Yuxiang; Hu, Kaijin

    2015-10-01

    Reference genes are commonly used as a reliable approach to normalize the results of quantitative polymerase chain reaction (qPCR), and to reduce errors in the relative quantification of gene expression. Suitable reference genes belonging to numerous functional classes have been identified for various types of species and tissue. However, little is currently known regarding the most suitable reference genes for bone, specifically for the sheep mandibular condyle. Sheep are important for the study of human bone diseases, particularly for temporomandibular diseases. The present study aimed to identify a set of reference genes suitable for the normalization of qPCR data from the mandibular condyle of sheep. A total of 12 reference genes belonging to various functional classes were selected, and the expression stability of the reference genes was determined in both the normal and fractured area of the sheep mandibular condyle. RefFinder, which integrates the following currently available computational algorithms: geNorm, NormFinder, BestKeeper, and the comparative ΔCt method, was used to compare and rank the candidate reference genes. The results obtained from the four methods demonstrated a similar trend: RPL19, ACTB, and PGK1 were the most stably expressed reference genes in the sheep mandibular condyle. As determined by RefFinder comprehensive analysis, the results of the present study suggested that RPL19 is the most suitable reference gene for studies associated with the sheep mandibular condyle. In addition, ACTB and PGK1 may be considered suitable alternatives.

  5. Quantitative Imaging in Cancer Evolution and Ecology

    PubMed Central

    Grove, Olya; Gillies, Robert J.

    2013-01-01

    Cancer therapy, even when highly targeted, typically fails because of the remarkable capacity of malignant cells to evolve effective adaptations. These evolutionary dynamics are both a cause and a consequence of cancer system heterogeneity at many scales, ranging from genetic properties of individual cells to large-scale imaging features. Tumors of the same organ and cell type can have remarkably diverse appearances in different patients. Furthermore, even within a single tumor, marked variations in imaging features, such as necrosis or contrast enhancement, are common. Similar spatial variations recently have been reported in genetic profiles. Radiologic heterogeneity within tumors is usually governed by variations in blood flow, whereas genetic heterogeneity is typically ascribed to random mutations. However, evolution within tumors, as in all living systems, is subject to Darwinian principles; thus, it is governed by predictable and reproducible interactions between environmental selection forces and cell phenotype (not genotype). This link between regional variations in environmental properties and cellular adaptive strategies may permit clinical imaging to be used to assess and monitor intratumoral evolution in individual patients. This approach is enabled by new methods that extract, report, and analyze quantitative, reproducible, and mineable clinical imaging data. However, most current quantitative metrics lack spatialness, expressing quantitative radiologic features as a single value for a region of interest encompassing the whole tumor. In contrast, spatially explicit image analysis recognizes that tumors are heterogeneous but not well mixed and defines regionally distinct habitats, some of which appear to harbor tumor populations that are more aggressive and less treatable than others. By identifying regional variations in key environmental selection forces and evidence of cellular adaptation, clinical imaging can enable us to define intratumoral

  6. Visual similarity in short-term recall for where and when.

    PubMed

    Jalbert, Annie; Saint-Aubin, Jean; Tremblay, Sébastien

    2008-03-01

    Two experiments examined the effects of visual similarity on short-term recall for where and when in the visual spatial domain. A series of squares of similar or dissimilar colours were serially presented at various locations on the screen. At recall, all coloured squares were simultaneously presented in a random order at the bottom of the screen, and the locations used for presentation were indicated by white squares. Participants were asked to place the colours at their appropriate location in their presentation order. Performance for location (where) and order (when) was assessed separately. Results revealed that similarity severely hinders both memory for what was where and memory for what was when, under quiet and articulatory suppression conditions. These results provide further evidence that similarity has a major impact on processing relational information in memory.

  7. Development and Assessment of Modules to Integrate Quantitative Skills in Introductory Biology Courses.

    PubMed

    Hoffman, Kathleen; Leupen, Sarah; Dowell, Kathy; Kephart, Kerrie; Leips, Jeff

    2016-01-01

    Redesigning undergraduate biology courses to integrate quantitative reasoning and skill development is critical to prepare students for careers in modern medicine and scientific research. In this paper, we report on the development, implementation, and assessment of stand-alone modules that integrate quantitative reasoning into introductory biology courses. Modules are designed to improve skills in quantitative numeracy, interpreting data sets using visual tools, and making inferences about biological phenomena using mathematical/statistical models. We also examine demographic/background data that predict student improvement in these skills through exposure to these modules. We carried out pre/postassessment tests across four semesters and used student interviews in one semester to examine how students at different levels approached quantitative problems. We found that students improved in all skills in most semesters, although there was variation in the degree of improvement among skills from semester to semester. One demographic variable, transfer status, stood out as a major predictor of the degree to which students improved (transfer students achieved much lower gains every semester, despite the fact that pretest scores in each focus area were similar between transfer and nontransfer students). We propose that increased exposure to quantitative skill development in biology courses is effective at building competency in quantitative reasoning. © 2016 K. Hoffman, S. Leupen, et al. CBE—Life Sciences Education © 2016 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  8. Processes of Similarity Judgment

    ERIC Educational Resources Information Center

    Larkey, Levi B.; Markman, Arthur B.

    2005-01-01

    Similarity underlies fundamental cognitive capabilities such as memory, categorization, decision making, problem solving, and reasoning. Although recent approaches to similarity appreciate the structure of mental representations, they differ in the processes posited to operate over these representations. We present an experiment that…

  9. Common neighbour structure and similarity intensity in complex networks

    NASA Astrophysics Data System (ADS)

    Hou, Lei; Liu, Kecheng

    2017-10-01

    Complex systems as networks always exhibit strong regularities, implying underlying mechanisms governing their evolution. In addition to the degree preference, similarity has been argued to be another driver for networks. Assuming a network is randomly organised without similarity preference, the present paper studies the expected number of common neighbours between vertices. A symmetrical similarity index is accordingly developed by removing such expected numbers from the observed common neighbours. The developed index can describe not only the similarities between vertices, but also the dissimilarities. We further apply the proposed index to measure the influence of similarity on the wiring patterns of networks. Fifteen empirical networks as well as artificial networks are examined in terms of similarity intensity and degree heterogeneity. Results on real networks indicate that social networks are strongly governed by similarity as well as the degree preference, while biological networks and infrastructure networks show no apparent similarity governance. Particularly, classical network models, such as the Barabási-Albert model, the Erdös-Rényi model and the Ring Lattice, cannot well describe social networks in terms of degree heterogeneity and similarity intensity. The findings may shed some light on the modelling and link prediction of different classes of networks.
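
    The idea of a corrected common-neighbour index can be sketched as observed common neighbours minus the number expected under random wiring. The expectation used below, k_i * k_j / N, is a simple random-graph approximation chosen for illustration; the paper derives its own expected value, which this sketch does not reproduce.

```python
import numpy as np

def corrected_cn(A):
    """Common-neighbour counts minus a random-wiring expectation."""
    k = A.sum(axis=1)            # vertex degrees
    n = len(A)
    observed = A @ A             # (A @ A)[i, j] = common neighbours of i, j
    expected = np.outer(k, k) / n  # illustrative random-graph expectation
    return observed - expected

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])
S = corrected_cn(A)
```

    Because the expectation is subtracted, the index is positive when two vertices share more neighbours than chance predicts and negative when they share fewer, so it captures dissimilarity as well as similarity.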

  10. A machine learning approach to quantifying geologic similarities between sites of gas hydrate accumulation

    NASA Astrophysics Data System (ADS)

    Runyan, T. E.; Wood, W. T.; Palmsten, M. L.; Zhang, R.

    2016-12-01

Gas hydrates, specifically methane hydrates, are sparsely sampled on a global scale, and their accumulation is difficult to predict geospatially. Several attempts have been made at estimating global inventories, and to some extent geospatial distribution, using geospatial extrapolations guided by geophysical and geochemical methods. Our objective is to quantitatively predict the geospatial likelihood of encountering methane hydrates, with uncertainty. Predictions could be incorporated into analyses of drilling hazards as well as climate change. We use global data sets (including water depth, temperature, pressure, TOC, sediment thickness, and heat flow) as parameters to train a k-nearest neighbor (KNN) machine learning technique. The KNN is unsupervised and non-parametric; we do not impose any interpretive influence on the prior probability distribution, so our results are strictly data driven. We have selected as test sites several locations where gas hydrates have been well studied, each with significantly different geologic settings. These include the Blake Ridge (U.S. East Coast), Hydrate Ridge (U.S. West Coast), and the Gulf of Mexico. We then use KNN to quantify similarities between these sites and determine, via the distance in parameter space, the likelihood and uncertainty of encountering gas hydrate anywhere in the world. Here we operate under the assumption that the distance in parameter space is proportional to the probability of the occurrence of gas hydrate. We then compare the global similarity maps made from our several test sites to identify the geologic (geophysical, bio-geochemical) parameters best suited for predicting gas hydrate occurrence.
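The distance-in-parameter-space assumption lends itself to a short sketch. Everything below is illustrative: the feature values are invented, not real site measurements, and z-score standardization plus Euclidean k-nearest-neighbour distance is one plausible reading of the approach, not the authors' exact pipeline.

```python
import numpy as np

# Hypothetical feature vectors [water depth (m), TOC (%), heat flow (mW/m^2)]
# for three well-studied hydrate sites; the values are illustrative only.
train = np.array([
    [2800.0, 1.5, 45.0],   # a Blake Ridge-like setting (invented numbers)
    [ 800.0, 2.0, 60.0],   # a Hydrate Ridge-like setting (invented numbers)
    [1500.0, 1.0, 40.0],   # a Gulf of Mexico-like setting (invented numbers)
])

def knn_distance(query, train, k=2):
    """Mean distance in standardized parameter space to the k nearest
    training sites; smaller distance = more 'hydrate-like' under the
    assumption that proximity in parameter space tracks probability."""
    mu, sigma = train.mean(axis=0), train.std(axis=0)
    z_train = (train - mu) / sigma
    z_query = (np.asarray(query) - mu) / sigma
    d = np.linalg.norm(z_train - z_query, axis=1)
    return np.sort(d)[:k].mean()

near = knn_distance([2700.0, 1.4, 46.0], train)  # close to a known site
far  = knn_distance([ 100.0, 0.1, 90.0], train)  # unlike any known site
```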

  11. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
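As a concrete example of the spectral-counting side of label-free quantitation, one widely used normalization is the normalized spectral abundance factor (NSAF), which divides each protein's length-corrected spectral count by the sample total; the counts and lengths below are made up.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: (SpC/L) for each protein,
    divided by the sum of SpC/L over all proteins in the sample."""
    ratios = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(ratios)
    return [r / total for r in ratios]

# Hypothetical example: three proteins with spectral counts and lengths (aa).
abundances = nsaf([120, 30, 50], [400, 150, 500])
```

The values sum to 1 by construction, so NSAF values are comparable across proteins within one sample.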

  12. ADvanced IMage Algebra (ADIMA): a novel method for depicting multiple sclerosis lesion heterogeneity, as demonstrated by quantitative MRI

    PubMed Central

    Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM

    2013-01-01

    Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551

  13. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  14. A competitive enzyme immunoassay for the quantitative detection of cocaine from banknotes and latent fingermarks.

    PubMed

    van der Heide, Susan; Garcia Calavia, Paula; Hardwick, Sheila; Hudson, Simon; Wolff, Kim; Russell, David A

    2015-05-01

A sensitive and versatile competitive enzyme immunoassay (cEIA) has been developed for the quantitative detection of cocaine in complex forensic samples. Polyclonal anti-cocaine antibody was purified from serum and deposited onto microtiter plates. The concentration of the cocaine antibody adsorbed onto the plates, and the dilution of the cocaine-HRP hapten, were both studied to achieve an optimised immunoassay. The method was successfully used to quantify cocaine in extracts taken from both paper currency and latent fingermarks. The limit of detection (LOD) of 0.162 ng mL(-1) achieved with the assay compares favourably to that of conventional chromatography-mass spectrometry techniques, with an appropriate sensitivity for the quantification of cocaine at the low concentrations present in some forensic samples. The cEIA was directly compared to LC-MS for the analysis of ten UK banknote samples. The results obtained from both techniques were statistically similar, suggesting that the immunoassay was unaffected by cross-reactivity with potentially interfering compounds. The cEIA was also used for the detection of cocaine in extracts from latent fingermarks. The results obtained were compared to the cocaine concentrations detected in oral fluid sampled from the same individual. Using the cEIA, we have shown, for the first time, that endogenously excreted cocaine can be detected and quantified from a single latent fingermark. Additionally, it has been shown that the presence of cocaine, at similar concentrations, in more than one latent fingermark from the same individual can be linked with the concentrations found in oral fluid. These results show that detection of drugs in latent fingermarks could directly indicate whether an individual has consumed the drug. The specificity and feasibility of measuring low concentrations of cocaine in complex forensic samples demonstrate the effectiveness and robustness of the assay. The immunoassay presents a simple and cost

  15. An adapted mindfulness-based stress reduction program for elders in a continuing care retirement community: quantitative and qualitative results from a pilot randomized controlled trial.

    PubMed

    Moss, Aleezé S; Reibel, Diane K; Greeson, Jeffrey M; Thapar, Anjali; Bubb, Rebecca; Salmon, Jacqueline; Newberg, Andrew B

    2015-06-01

The purpose of this study was to test the feasibility and effectiveness of an adapted 8-week Mindfulness-Based Stress Reduction (MBSR) program for elders in a continuing care community. This mixed-methods study used both quantitative and qualitative measures. A randomized waitlist control design was used for the quantitative aspect of the study. Thirty-nine elders (mean age 82 years) were randomized to MBSR (n = 20) or a waitlist control group (n = 19). Both groups completed pre-post measures of health-related quality of life, acceptance and psychological flexibility, facets of mindfulness, self-compassion, and psychological distress. A subset of MBSR participants completed qualitative interviews. MBSR participants showed significantly greater improvement in acceptance and psychological flexibility and in role limitations due to physical health. In the qualitative interviews, MBSR participants reported increased awareness, less judgment, and greater self-compassion. Study results demonstrate the feasibility and potential effectiveness of an adapted MBSR program in promoting mind-body health for elders. © The Author(s) 2014.

  16. Differences and similarities in cross-cultural perceptions of boundaries: a comparison of results from two studies.

    PubMed

    Miller, Patrice Marie; Bener, Abdulbari; Ghuloum, Suhaila; Commons, Michael Lamport; Burgut, F Tuna

    2012-01-01

There is a substantial literature on boundary excursions in clinician-patient relationships; however, very little empirical research exists. Even less information exists on how perceptions of this issue might differ across cultures. Prior to this study, empirical data on various kinds of boundary excursions were collected in different cultural contexts. First, clinicians from the U.S. and Brazil were asked to rate 173 boundary excursions for both their perceived harmfulness and their professional unacceptability (Miller et al., 2006). In a second study, colleagues from Qatar administered a slightly modified version to the mental health care professional staff of a hospital in Doha, Qatar (Ghuloum et al., 2011). In this paper, the results of these two separate studies are compared. The results showed some similarities and some differences in perceptions of the boundary behaviors. For example, both sets of cultures seem to agree that certain behaviors are seriously harmful and/or professionally unacceptable. These behaviors include some frankly sexual behavior, such as having sexual intercourse with a patient, as well as behavior related to doing business with the patient, and some disclosing behavior. There are also significant cultural differences in perceptions of how harmful some of the behaviors are. Qatari practitioners tended to rate behaviors that mix disclosing or personal elements with therapy as more harmful, but behaviors that involved interacting with patients outside of therapy as less serious. A factor analysis suggested that participants in the U.S./Brazil saw a much larger number of behaviors as making up a set of Core Boundary Violations, whereas Qatari respondents separated sexual behaviors from others. Finally, a Rasch analysis showed that both cultures perceived a continuum of boundary behaviors, from those that are least harmful or unprofessional to those that are highly harmful or unprofessional. One interpretation is that

  17. Validation of reference genes for quantitative expression analysis by real-time RT-PCR in Saccharomyces cerevisiae

    PubMed Central

    Teste, Marie-Ange; Duquenne, Manon; François, Jean M; Parrou, Jean-Luc

    2009-01-01

Background Real-time RT-PCR is the recommended method for quantitative gene expression analysis. A compulsory step is the selection of good reference genes for normalization. A few genes often referred to as HouseKeeping Genes (HSK), such as ACT1, RDN18 or PDA1 are among the most commonly used, as their expression is assumed to remain unchanged over a wide range of conditions. Since this assumption is very unlikely, a geometric averaging of multiple, carefully selected internal control genes is now strongly recommended for normalization to avoid this problem of expression variation of single reference genes. The aim of this work was to search for a set of reference genes for reliable gene expression analysis in Saccharomyces cerevisiae. Results From public microarray datasets, we selected potential reference genes whose expression remained apparently invariable during long-term growth on glucose. Using the algorithm geNorm, ALG9, TAF10, TFC1 and UBC6 turned out to be genes whose expression remained stable, independent of the growth conditions and the strain backgrounds tested in this study. We then showed that the geometric averaging of any subset of three genes among the six most stable genes resulted in very similar normalized data, which contrasted with inconsistent results among various biological samples when the normalization was performed with ACT1. Normalization with multiple selected genes was therefore applied to transcriptional analysis of genes involved in glycogen metabolism. We determined an induction ratio of 100-fold for GPH1 and 20-fold for GSY2 between the exponential phase and the diauxic shift on glucose. There was no induction of these two genes at this transition phase on galactose, although in both cases, the kinetics of glycogen accumulation was similar. In contrast, SGA1 expression was independent of the carbon source and increased by 3-fold in stationary phase. Conclusion In this work, we provided a set of genes that are suitable reference
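The recommended geometric averaging of multiple reference genes can be sketched as follows; the relative quantities assigned to ALG9, TAF10 and TFC1 are hypothetical placeholders, not values from the study.

```python
import math

def normalization_factor(ref_quantities):
    """geNorm-style normalization factor: the geometric mean of the
    relative quantities of the selected reference genes."""
    logs = [math.log(q) for q in ref_quantities.values()]
    return math.exp(sum(logs) / len(logs))

# Hypothetical relative quantities for three stable reference genes
# in one sample; real values come from the RT-PCR standard curves.
ref = {"ALG9": 0.82, "TAF10": 1.10, "TFC1": 0.95}
nf = normalization_factor(ref)
normalized_gph1 = 2.4 / nf   # raw relative quantity of a target gene, normalized
```

Using the geometric rather than arithmetic mean damps the influence of any single outlying reference gene.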

  18. Path similarity skeleton graph matching.

    PubMed

    Bai, Xiang; Latecki, Longin Jan

    2008-07-01

This paper presents a novel framework for shape recognition based on object silhouettes. The main idea is to match skeleton graphs by comparing the shortest paths between skeleton endpoints. In contrast to typical tree or graph matching methods, we completely ignore the topological graph structure. Our approach is motivated by the fact that visually similar skeleton graphs may have completely different topological structures. The proposed comparison of shortest paths between endpoints of skeleton graphs yields correct matching results in such cases. The skeletons are pruned by contour partitioning with Discrete Curve Evolution, which implies that the endpoints of skeleton branches correspond to visual parts of the objects. The experimental results demonstrate that our method is able to produce correct results in the presence of articulations, stretching, and occlusion.
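The core idea, comparing shortest paths between skeleton endpoints while ignoring topology, can be illustrated with a simplified sketch. The paper's actual matching compares geodesic paths together with shape information sampled along them; here only BFS path lengths between degree-1 endpoint nodes are used, on an invented toy skeleton.

```python
from collections import deque

def bfs_lengths(adj, source):
    """Unweighted shortest-path lengths from source (plain BFS)."""
    dist = {source: 0}
    q = deque([source])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def endpoint_path_signature(adj):
    """Sorted shortest-path lengths between all endpoint pairs
    (endpoints = degree-1 nodes), ignoring the rest of the topology."""
    endpoints = [u for u, nbrs in adj.items() if len(nbrs) == 1]
    sig = []
    for i, u in enumerate(endpoints):
        dist = bfs_lengths(adj, u)
        for v in endpoints[i + 1:]:
            sig.append(dist[v])
    return sorted(sig)

# A 'Y'-shaped skeleton: three endpoints joined at one junction node.
y_skel = {0: {3}, 1: {3}, 2: {3}, 3: {0, 1, 2}}
```

Two shapes could then be compared by the dissimilarity of their signatures rather than by graph isomorphism.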

  19. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: P/T × 100% = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparable years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for the two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
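The stated equation translates directly into code; the pixel areas here are arbitrary example values, as would be measured in Image J:

```python
def percentage_perforation(perforation_px, total_px):
    """P/T x 100%: perforation area as a percentage of the whole
    tympanic membrane area, both measured in pixels squared."""
    return perforation_px / total_px * 100

print(percentage_perforation(2500, 10000))  # -> 25.0
```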

  20. Comparative Performance of Reagents and Platforms for Quantitation of Cytomegalovirus DNA by Digital PCR

    PubMed Central

    Gu, Z.; Sam, S. S.; Sun, Y.; Tang, L.; Pounds, S.; Caliendo, A. M.

    2016-01-01

A potential benefit of digital PCR is a reduction in result variability across assays and platforms. Three sets of PCR reagents were tested for quantitation of cytomegalovirus (CMV) on two digital PCR systems (Bio-Rad and RainDance). Both commercial quantitative viral standards and 16 patient samples were tested. Quantitative accuracy (compared to nominal values) and variability were determined based on viral standard testing results. Quantitative correlation and variability were assessed with pairwise comparisons across all reagent-platform combinations for clinical plasma sample results. The three reagent sets, when used to assay quantitative standards on the Bio-Rad system, all showed a high degree of accuracy, low variability, and close agreement with one another. When used on the RainDance system, one of the three reagent sets appeared to have a much better correlation to nominal values than did the other two. Quantitative results for patient samples showed good correlation in most pairwise comparisons, with some showing poorer correlations when testing samples with low viral loads. Digital PCR is a robust method for measuring CMV viral load. Some degree of result variation may be seen, depending on platform and reagents used; this variation appears to be greater in samples with low viral load values. PMID:27535685
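For context, digital PCR quantitation itself rests on Poisson statistics: the copy concentration is recovered from the fraction of positive partitions. A minimal sketch, assuming a nominal 0.85 nL droplet volume (typical of one common droplet system) and invented counts:

```python
import math

def copies_per_microliter(positive, total, partition_volume_nl=0.85):
    """Poisson estimate of target concentration from a digital PCR run:
    lambda = -ln(1 - p) mean copies per partition, where p is the
    fraction of positive partitions, then scaled by partition volume."""
    p = positive / total
    lam = -math.log(1.0 - p)                   # mean copies per partition
    return lam / (partition_volume_nl * 1e-3)  # copies per microliter

# Hypothetical run: 5,000 positive droplets out of 20,000.
est = copies_per_microliter(5000, 20000)
```

This also hints at why low viral loads are noisier: with few positive partitions, the relative uncertainty on p, and hence on lambda, grows.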

  1. Preliminary Results of Acoustic Radiation Force Impulse Imaging by Combined Qualitative and Quantitative Analyses for Evaluation of Breast Lesions.

    PubMed

    Wang, Lin; Wan, Cai-Feng; Du, Jing; Li, Feng-Hua

    2018-04-15

The purpose of this study was to evaluate the application of a new elastographic technique, acoustic radiation force impulse (ARFI) imaging, and its diagnostic performance for characterizing breast lesions. One hundred consecutive female patients with 126 breast lesions were enrolled in our study. After routine breast ultrasound examinations, the patients underwent ARFI elasticity imaging. Virtual Touch tissue imaging (VTI) and Virtual Touch tissue quantification (Siemens Medical Solutions, Mountain View, CA) were used to qualitatively and quantitatively analyze the elasticity and hardness of tumors. A receiver operating characteristic curve analysis was performed to evaluate the diagnostic performance of ARFI for discrimination between benign and malignant breast lesions. Pathologic analysis revealed 40 lesions in the malignant group and 86 lesions in the benign group. Different VTI patterns were observed in benign and malignant breast lesions. Eighty lesions (93.0%) in the benign group had pattern 1, 2, or 3, whereas all pattern 4b lesions (n = 20 [50.0%]) were malignant. Regarding the quantitative analysis, the mean VTI-to-B-mode area ratio, internal shear wave velocity, and marginal shear wave velocity of benign lesions were statistically significantly lower than those of malignant lesions (all P < .001). The cutoff point for a scoring system constructed to evaluate the diagnostic performance of ARFI was estimated to be between 3 and 4 points for malignancy, with sensitivity of 77.5%, specificity of 96.5%, accuracy of 90.5%, and an area under the curve of 0.933. The application of ARFI technology has shown promising results by noninvasively providing substantial complementary information and could potentially serve as an effective diagnostic tool for differentiation between benign and malignant breast lesions. © 2018 by the American Institute of Ultrasound in Medicine.

  2. Average is Boring: How Similarity Kills a Meme's Success

    NASA Astrophysics Data System (ADS)

    Coscia, Michele

    2014-09-01

Every day we are exposed to different ideas, or memes, competing with each other for our attention. Previous research explained popularity and persistence heterogeneity of memes by assuming them to be in competition for limited attention resources, distributed in a heterogeneous social network. Little has been said about what characteristics make a specific meme more likely to be successful. We propose a similarity-based explanation: memes with higher similarity to other memes have a significant disadvantage in their potential popularity. We employ a meme similarity measure based on semantic text analysis and computer vision to prove that a meme is more likely to be successful and to thrive if its characteristics make it unique. Our results show that indeed successful memes are located in the periphery of the meme similarity space and that our similarity measure is a promising predictor of a meme's success.

  3. Average is boring: how similarity kills a meme's success.

    PubMed

    Coscia, Michele

    2014-09-26

Every day we are exposed to different ideas, or memes, competing with each other for our attention. Previous research explained popularity and persistence heterogeneity of memes by assuming them to be in competition for limited attention resources, distributed in a heterogeneous social network. Little has been said about what characteristics make a specific meme more likely to be successful. We propose a similarity-based explanation: memes with higher similarity to other memes have a significant disadvantage in their potential popularity. We employ a meme similarity measure based on semantic text analysis and computer vision to prove that a meme is more likely to be successful and to thrive if its characteristics make it unique. Our results show that indeed successful memes are located in the periphery of the meme similarity space and that our similarity measure is a promising predictor of a meme's success.

  4. Comparison of MPEG-1 digital videotape with digitized sVHS videotape for quantitative echocardiographic measurements

    NASA Technical Reports Server (NTRS)

Garcia, M. J.; Thomas, J. D.; Greenberg, N.; Sandelski, J.; Herrera, C.; Mudd, C.; Wicks, J.; Spencer, K.; Neumann, A.; Sankpal, B.

    2001-01-01

Digital format is rapidly emerging as a preferred method for displaying and retrieving echocardiographic studies. The qualitative diagnostic accuracy of Moving Pictures Experts Group (MPEG-1) compressed digital echocardiographic studies has been previously reported. The goals of the present study were to compare quantitative measurements derived from MPEG-1 recordings with the super-VHS (sVHS) videotape clinical standard. Six reviewers performed blinded measurements from still-frame images selected from 20 echocardiographic studies that were simultaneously acquired in sVHS and MPEG-1 formats. Measurements were obtainable in 1401 (95%) of 1486 MPEG-1 variables compared with 1356 (91%) of 1486 sVHS variables (P < .001). Excellent agreement existed between MPEG-1 and sVHS 2-dimensional linear measurements (r = 0.97; MPEG-1 = 0.95[sVHS] + 1.1 mm; P < .001; Delta = 9% +/- 10%), 2-dimensional area measurements (r = 0.89), color jet areas (r = 0.87, P < .001), and Doppler velocities (r = 0.92, P < .001). Interobserver variability was similar for both sVHS and MPEG-1 readings. Our results indicate that quantitative off-line measurements from MPEG-1 digitized echocardiographic studies are feasible and comparable to those obtained from sVHS.

  5. Classification of wheat: Badhwar profile similarity technique

    NASA Technical Reports Server (NTRS)

    Austin, W. W.

    1980-01-01

The Badhwar profile similarity classification technique used successfully for classification of corn was applied to spring wheat classifications. The software programs and the procedures used to generate full-scene classifications are presented, and numerical results of the acreage estimations are given.

  6. Quantitative habitability.

    PubMed

    Shock, Everett L; Holland, Melanie E

    2007-12-01

    A framework is proposed for a quantitative approach to studying habitability. Considerations of environmental supply and organismal demand of energy lead to the conclusions that power units are most appropriate and that the units for habitability become watts per organism. Extreme and plush environments are revealed to be on a habitability continuum, and extreme environments can be quantified as those where power supply only barely exceeds demand. Strategies for laboratory and field experiments are outlined that would quantify power supplies, power demands, and habitability. An example involving a comparison of various metabolisms pursued by halophiles is shown to be well on the way to a quantitative habitability analysis.

  7. Using the iPhone as a device for a rapid quantitative analysis of trinitrotoluene in soil.

    PubMed

    Choodum, Aree; Kanatharana, Proespichaya; Wongniramaikul, Worawit; Daeid, Niamh Nic

    2013-10-15

Mobile 'smart' phones have become almost ubiquitous in society and are typically equipped with a high-resolution digital camera which can be used to produce an image very conveniently. In this study, the built-in digital camera of a smart phone (iPhone) was used to capture the results from a rapid quantitative colorimetric test for trinitrotoluene (TNT) in soil. The results were compared to those from a digital single-lens reflex (DSLR) camera. The colored product from the selective test for TNT was quantified using an innovative application of photography in which the relationships between the Red Green Blue (RGB) values and the concentrations of the colorimetric product were exploited. The iPhone proved more convenient to use than the DSLR while providing similar analytical results with increased sensitivity. The wide linear range and low detection limits achieved were comparable with those from spectrophotometric quantification methods. Low relative errors in the range of 0.4 to 6.3% were achieved in the analysis of control samples and 0.4-6.2% for spiked soil extracts, with good precision (2.09-7.43% RSD) for the analysis over 4 days. The results demonstrate that the iPhone has the potential to serve as a novel platform for the development of a rapid, on-site, semi-quantitative field test for the analysis of explosives. © 2013 Elsevier B.V. All rights reserved.
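The exploited relationship between RGB values and product concentration amounts to a linear calibration that is then inverted for unknowns; a sketch with invented calibration points (the real work calibrated against actual channel intensities):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical calibration: TNT concentration (ppm) vs. mean red-channel
# intensity of the colored product in the photographed test spot.
conc      = [0.0, 5.0, 10.0, 20.0, 40.0]
intensity = [210.0, 190.0, 171.0, 130.0, 50.0]
a, b = fit_line(conc, intensity)   # darker product at higher concentration

def predict_conc(sample_intensity):
    """Invert the calibration line for an unknown sample."""
    return (sample_intensity - b) / a

unknown = predict_conc(150.0)      # concentration of an unknown extract
```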

  8. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  9. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There is a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kits and reinforce the utility of standardized methods and protocols.
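    As a minimal illustration of the SIS-based quantification such kits exercise, the endogenous peptide concentration follows from the light-to-heavy peak-area ratio; the peak areas and spike level below are invented for illustration, not values from the study:

```python
# Hedged sketch: single-point quantification with a stable isotope-labeled
# standard (SIS), as used in MRM assays. Assumes the SIS ("heavy") peptide
# co-elutes and ionizes like the endogenous ("light") peptide, so the
# peak-area ratio equals the molar ratio. All numbers are illustrative.

def concentration_from_sis(endogenous_area: float,
                           sis_area: float,
                           sis_conc_fmol_per_ul: float) -> float:
    """Endogenous concentration from the light/heavy peak-area ratio."""
    return (endogenous_area / sis_area) * sis_conc_fmol_per_ul

# Example: light area 2.4e5, heavy (SIS) area 1.2e5, 50 fmol/uL spiked.
conc = concentration_from_sis(2.4e5, 1.2e5, 50.0)  # 100.0 fmol/uL
```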

  10. Transformation and Alignment in Similarity

    ERIC Educational Resources Information Center

    Hodgetts, Carl J.; Hahn, Ulrike; Chater, Nick

    2009-01-01

    This paper contrasts two structural accounts of psychological similarity: structural alignment (SA) and Representational Distortion (RD). SA proposes that similarity is determined by how readily the structures of two objects can be brought into alignment; RD measures similarity by the complexity of the transformation that "distorts" one…

  11. Quantitative molecular characterization of bovine vitreous and lens with non-invasive dynamic light scattering

    NASA Technical Reports Server (NTRS)

    Ansari, R. R.; Suh, K. I.; Dunker, S.; Kitaya, N.; Sebag, J.

    2001-01-01

    The non-invasive technique of dynamic light scattering (DLS) was used to quantitatively characterize vitreous and lens structure on a molecular level by measuring the sizes of the predominant particles and mapping the three-dimensional topographic distribution of these structural macromolecules. The results of DLS measurements in five fresh adult bovine eyes were compared to DLS measurements in model solutions of hyaluronan (HA) and collagen (Coll). In the bovine eyes, DLS measurements were obtained from excised samples of gel and liquid vitreous and compared to the model solutions. Measurements in whole vitreous were obtained at multiple points posterior to the lens to generate a three-dimensional 'map' of molecular structure. The macromolecule distribution in bovine lens was similarly characterized. In each bovine vitreous (Bo Vit) specimen, DLS predominantly detected two distinct particles, which differed in diffusion properties and hence size. Comparisons with model vitreous solutions demonstrated that these most likely corresponded to the Coll and HA components of vitreous. Three-dimensional mapping of Bo Vit found heterogeneity throughout the vitreous body, with different particle size distributions for Coll and HA at different loci. In contrast, the three-dimensional distribution of lens macromolecules was more homogeneous. Thus, the non-invasive DLS technique can quantitate the average sizes of vitreous and lens macromolecules and map their three-dimensional distribution. This method for quantitatively assessing the macromolecular structure of vitreous and lens should be useful for clinical as well as experimental applications in health and disease. Copyright 2001 Academic Press.
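    The particle-sizing step underlying DLS can be sketched with the Stokes-Einstein relation, which converts a measured diffusion coefficient into a hydrodynamic radius; the diffusion coefficient, temperature, and viscosity below are illustrative assumptions, not measurements from this study:

```python
import math

# Hedged sketch: DLS infers particle size from the measured translational
# diffusion coefficient D via the Stokes-Einstein relation
#     R_h = k_B * T / (6 * pi * eta * D)
# The D value below is an illustrative number, not a reported measurement.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(d_m2_per_s: float, temp_k: float = 293.15,
                        viscosity_pa_s: float = 1.0e-3) -> float:
    """Hydrodynamic radius in metres (water-like viscosity by default)."""
    return K_B * temp_k / (6.0 * math.pi * viscosity_pa_s * d_m2_per_s)

# A slowly diffusing particle, D = 4e-13 m^2/s, works out to roughly 0.5 um.
r = hydrodynamic_radius(4.0e-13)
```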

  12. Sample selection in foreign similarity regions for multicrop experiments

    NASA Technical Reports Server (NTRS)

    Malin, J. T. (Principal Investigator)

    1981-01-01

    The selection of sample segments in the U.S. foreign similarity regions for development of proportion estimation procedures and error modeling for Argentina, Australia, Brazil, and USSR in AgRISTARS is described. Each sample was chosen to be similar in crop mix to the corresponding indicator region sample. Data sets, methods of selection, and resulting samples are discussed.

  13. Similar Genetic Architecture with Shared and Unique Quantitative Trait Loci for Bacterial Cold Water Disease Resistance in Two Rainbow Trout Breeding Populations

    PubMed Central

    Vallejo, Roger L.; Liu, Sixin; Gao, Guangtu; Fragomeni, Breno O.; Hernandez, Alvaro G.; Leeds, Timothy D.; Parsons, James E.; Martin, Kyle E.; Evenhuis, Jason P.; Welch, Timothy J.; Wiens, Gregory D.; Palti, Yniv

    2017-01-01

    Bacterial cold water disease (BCWD) causes significant mortality and economic losses in salmonid aquaculture. In previous studies, we identified moderate-large effect quantitative trait loci (QTL) for BCWD resistance in rainbow trout (Oncorhynchus mykiss). However, the recent availability of a 57 K SNP array and a reference genome assembly has enabled us to conduct genome-wide association studies (GWAS) that overcome several experimental limitations of our previous work. In the current study, we conducted GWAS for BCWD resistance in two rainbow trout breeding populations using two genotyping platforms, the 57 K Affymetrix SNP array and restriction-associated DNA (RAD) sequencing. Overall, we identified 14 moderate-large effect QTL that explained up to 60.8% of the genetic variance in one of the two populations and 27.7% in the other. Four of these QTL were found in both populations, explaining a substantial proportion of the variance, although major differences were also detected between the two populations. Our results confirm that BCWD resistance is controlled by the oligogenic inheritance of a few moderate-large effect loci and a large, unknown number of loci each having a small effect on BCWD resistance. We detected differences in QTL number and genome location between two GWAS models (weighted single-step GBLUP and Bayes B), which highlights the utility of using different models to uncover QTL. The RAD-SNPs detected a greater number of QTL than the 57 K SNP array in one population, suggesting that the RAD-SNPs may uncover polymorphisms that are more unique and informative for the specific population in which they were discovered. PMID:29109734

  14. Similar Genetic Architecture with Shared and Unique Quantitative Trait Loci for Bacterial Cold Water Disease Resistance in Two Rainbow Trout Breeding Populations.

    PubMed

    Vallejo, Roger L; Liu, Sixin; Gao, Guangtu; Fragomeni, Breno O; Hernandez, Alvaro G; Leeds, Timothy D; Parsons, James E; Martin, Kyle E; Evenhuis, Jason P; Welch, Timothy J; Wiens, Gregory D; Palti, Yniv

    2017-01-01

    Bacterial cold water disease (BCWD) causes significant mortality and economic losses in salmonid aquaculture. In previous studies, we identified moderate-large effect quantitative trait loci (QTL) for BCWD resistance in rainbow trout (Oncorhynchus mykiss). However, the recent availability of a 57 K SNP array and a reference genome assembly has enabled us to conduct genome-wide association studies (GWAS) that overcome several experimental limitations of our previous work. In the current study, we conducted GWAS for BCWD resistance in two rainbow trout breeding populations using two genotyping platforms, the 57 K Affymetrix SNP array and restriction-associated DNA (RAD) sequencing. Overall, we identified 14 moderate-large effect QTL that explained up to 60.8% of the genetic variance in one of the two populations and 27.7% in the other. Four of these QTL were found in both populations, explaining a substantial proportion of the variance, although major differences were also detected between the two populations. Our results confirm that BCWD resistance is controlled by the oligogenic inheritance of a few moderate-large effect loci and a large, unknown number of loci each having a small effect on BCWD resistance. We detected differences in QTL number and genome location between two GWAS models (weighted single-step GBLUP and Bayes B), which highlights the utility of using different models to uncover QTL. The RAD-SNPs detected a greater number of QTL than the 57 K SNP array in one population, suggesting that the RAD-SNPs may uncover polymorphisms that are more unique and informative for the specific population in which they were discovered.

  15. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are similarities in the challenges, as well as differences that are specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  16. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. This assessment helps determine what useful structural quantitative and qualitative data may be provided, from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or that are strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of the structures or standards from which the information was obtained. Examples of NDT technique capabilities and limitations are provided in tabular form. The NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end-item structure, and refurbishment operations.

  17. PHOG analysis of self-similarity in aesthetic images

    NASA Astrophysics Data System (ADS)

    Amirshahi, Seyed Ali; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2012-03-01

    In recent years, there have been efforts to define the statistical properties of aesthetic photographs and artworks using computer vision techniques. However, it is still an open question how to distinguish aesthetic from non-aesthetic images with a high recognition rate, possibly because aesthetic perception is also influenced by a large number of cultural variables. Nevertheless, the search for statistical properties of aesthetic images has not been futile. For example, we have shown that the radially averaged power spectrum of monochrome artworks of Western and Eastern provenance falls off according to a power law with increasing spatial frequency (1/f² characteristics). This finding implies that this particular subset of artworks possesses a Fourier power spectrum that is self-similar across different scales of spatial resolution. Other types of aesthetic images, such as cartoons, comics and mangas, also display this type of self-similarity, as do photographs of complex natural scenes. Since the human visual system is adapted to encode images of natural scenes in a particularly efficient way, we have argued that artists imitate these statistics in their artworks. In support of this notion, we presented results showing that artists portray human faces with the self-similar Fourier statistics of complex natural scenes, although real-world photographs of faces are not self-similar. In view of these previous findings, we investigated other statistical measures of self-similarity to characterize aesthetic and non-aesthetic images. In the present work, we propose a novel measure of self-similarity based on the Pyramid Histogram of Oriented Gradients (PHOG). For every image, we first calculate PHOG up to pyramid level 3. The similarity between the histogram of each section at a particular level and that of its parent section at the previous level (or the histogram at the ground level) is then calculated.
The proposed approach is tested on datasets of aesthetic and
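    A minimal sketch of a PHOG-style self-similarity score as described above: orientation histograms are computed for nested quadtree sections, and each child section's histogram is compared with its parent's via histogram intersection. The bin count, pyramid depth, and toy image are illustrative choices, not the authors' exact parameters:

```python
import math

# Hedged sketch of a PHOG-style self-similarity measure. Gradients are
# taken by finite differences, orientations are binned into an unsigned
# histogram per section, and each child section is scored against its
# parent by histogram intersection. All parameters are illustrative.

def orientation_histogram(img, y0, y1, x0, x1, bins=8):
    """Gradient-magnitude-weighted orientation histogram of a section."""
    hist = [0.0] * bins
    for y in range(max(y0, 1), y1):
        for x in range(max(x0, 1), x1):
            gy = img[y][x] - img[y - 1][x]
            gx = img[y][x] - img[y][x - 1]
            mag = math.hypot(gx, gy)
            if mag > 0:
                ang = math.atan2(gy, gx) % math.pi  # unsigned orientation
                hist[min(int(ang / math.pi * bins), bins - 1)] += mag
    s = sum(hist) or 1.0
    return [h / s for h in hist]  # normalize to unit mass

def phog_self_similarity(img, levels=3):
    """Mean histogram intersection of each section with its parent."""
    h, w = len(img), len(img[0])
    scores = []
    for lvl in range(1, levels + 1):
        n = 2 ** lvl   # sections per side at this level
        m = n // 2     # sections per side at the parent level
        for i in range(n):
            for j in range(n):
                child = orientation_histogram(
                    img, i * h // n, (i + 1) * h // n,
                    j * w // n, (j + 1) * w // n)
                parent = orientation_histogram(
                    img, (i // 2) * h // m, (i // 2 + 1) * h // m,
                    (j // 2) * w // m, (j // 2 + 1) * w // m)
                scores.append(sum(min(a, b) for a, b in zip(child, parent)))
    return sum(scores) / len(scores)

# Toy 16x16 ramp image; a strongly self-similar pattern scores near 1.
img = [[(x + y) % 16 for x in range(16)] for y in range(16)]
score = phog_self_similarity(img, levels=2)
```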

  18. The Effects of Similarity on High-Level Visual Working Memory Processing.

    PubMed

    Yang, Li; Mo, Lei

    2017-01-01

    Similarity has been observed to have opposite effects on visual working memory (VWM) for complex images. How can these discrepant results be reconciled? To answer this question, we used a change-detection paradigm to test visual working memory performance for multiple real-world objects. We found that working memory for moderate-similarity items was worse than that for either high- or low-similarity items. This pattern was unaffected by manipulations of stimulus type (faces vs. scenes), encoding duration (limited vs. self-paced), and presentation format (simultaneous vs. sequential). We also found that the similarity effects differed in strength across categories (scenes vs. faces). These results suggest that complex real-world objects are represented using a centre-surround inhibition organization. They support the category-specific cortical resource theory and further suggest that centre-surround inhibition organization may differ by category.

  19. Pollinators show flower colour preferences but flowers with similar colours do not attract similar pollinators

    PubMed Central

    Reverté, Sara; Retana, Javier; Gómez, José M.; Bosch, Jordi

    2016-01-01

    Background and aims Colour is one of the main floral traits used by pollinators to locate flowers. Although pollinators show innate colour preferences, the view that the colour of a flower may be considered an important predictor of its main pollinators is highly controversial because flower choice is highly context-dependent, and initial innate preferences may be overridden by subsequent associative learning. Our objective is to establish whether there is a relationship between flower colour and pollinator composition in natural communities. Methods We measured the flower reflectance spectrum and pollinator composition in four plant communities (85 plant species represented by 109 populations, and 32 305 plant–pollinator interactions in total). Pollinators were divided into six taxonomic groups: bees, ants, wasps, coleopterans, dipterans and lepidopterans. Key Results We found consistent associations between pollinator groups and certain colours. These associations matched innate preferences experimentally established for several pollinators and predictions of the pollination syndrome theory. However, flowers with similar colours did not attract similar pollinator assemblages. Conclusions The explanation for this paradoxical result is that most flower species are pollination generalists. We conclude that although pollinator colour preferences seem to condition plant–pollinator interactions, the selective force behind these preferences has not been strong enough to mediate the appearance and maintenance of tight colour-based plant–pollinator associations. PMID:27325897

  20. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
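    The Bayes' theorem calculation behind such posttest-probability (POD) plots can be sketched in its odds form; the pretest probability and likelihood-ratio values below are illustrative:

```python
# Hedged sketch of the odds form of Bayes' theorem underlying a POD plot:
#     posttest odds = pretest odds x likelihood ratio of the observed score.
# A generalized likelihood ratio would be read off a curve estimated from
# the quantitative score; here LR is simply given as an illustrative number.

def posttest_probability(pretest_p: float, likelihood_ratio: float) -> float:
    """Posttest probability of disease given the score's likelihood ratio."""
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A score with LR = 4 raises a 25% pretest probability to 4/7, about 57%.
p = posttest_probability(0.25, 4.0)
```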

  1. Effectiveness of a systematic approach to promote intersectoral collaboration in comprehensive school health promotion-a multiple-case study using quantitative and qualitative data.

    PubMed

    Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K

    2015-07-05

    We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaboration according to the DISC model were conducted, with 90 respondents (response 57%) at pretest and 69 respondents (52%) at posttest. NVivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles: (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.

  2. Self-similar structure and experimental signatures of suprathermal ion distribution in inertial confinement fusion implosions

    DOE PAGES

    Kagan, Grigory; Svyatskiy, D.; Rinderknecht, H. G.; ...

    2015-09-03

    The distribution function of suprathermal ions is found to be self-similar under conditions relevant to inertial confinement fusion hot spots. By utilizing this feature, interference between the hydrodynamic instabilities and kinetic effects is for the first time assessed quantitatively to find that the instabilities substantially aggravate the fusion reactivity reduction. The ion tail depletion is also shown to lower the experimentally inferred ion temperature, a novel kinetic effect that may explain the discrepancy between the exploding pusher experiments and rad-hydro simulations and contribute to the observation that temperature inferred from DD reaction products is lower than from DT at the National Ignition Facility.

  3. Self-Similar Structure and Experimental Signatures of Suprathermal Ion Distribution in Inertial Confinement Fusion Implosions

    NASA Astrophysics Data System (ADS)

    Kagan, Grigory; Svyatskiy, D.; Rinderknecht, H. G.; Rosenberg, M. J.; Zylstra, A. B.; Huang, C.-K.; McDevitt, C. J.

    2015-09-01

    The distribution function of suprathermal ions is found to be self-similar under conditions relevant to inertial confinement fusion hot spots. By utilizing this feature, interference between the hydrodynamic instabilities and kinetic effects is for the first time assessed quantitatively to find that the instabilities substantially aggravate the fusion reactivity reduction. The ion tail depletion is also shown to lower the experimentally inferred ion temperature, a novel kinetic effect that may explain the discrepancy between the exploding pusher experiments and rad-hydro simulations and contribute to the observation that temperature inferred from DD reaction products is lower than from DT at the National Ignition Facility.

  4. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study

    PubMed Central

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R.

    2017-01-01

    Purpose Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. To address this entanglement problem, we propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. Materials and methods Fourteen bone phantoms were designed with varying pore size, axial-, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. Results A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by 2–5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed.
While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this

  5. Inferring gene ontologies from pairwise similarity data

    PubMed Central

    Kramer, Michael; Dutkowski, Janusz; Yu, Michael; Bafna, Vineet; Ideker, Trey

    2014-01-01

    Motivation: While the manually curated Gene Ontology (GO) is widely used, inferring a GO directly from -omics data is a compelling new problem. Recognizing that ontologies are a directed acyclic graph (DAG) of terms and hierarchical relations, algorithms are needed that: (i) analyze a full matrix of gene–gene pairwise similarities from -omics data; (ii) infer true hierarchical structure in these data rather than enforcing hierarchy as a computational artifact; and (iii) respect biological pleiotropy, by which a term in the hierarchy can relate to multiple higher-level terms. Methods addressing these requirements are just beginning to emerge—none has been evaluated for GO inference. Methods: We consider two algorithms [Clique Extracted Ontology (CliXO), LocalFitness] that uniquely satisfy these requirements, compared with methods including standard clustering. CliXO is a new approach that finds maximal cliques in a network induced by progressive thresholding of a similarity matrix. We evaluate each method's ability to reconstruct the GO biological process ontology from a similarity matrix based on (a) semantic similarities for GO itself or (b) three -omics datasets for yeast. Results: For task (a) using semantic similarity, CliXO accurately reconstructs GO (>99% precision, recall) and outperforms other approaches (<20% precision, <20% recall). For task (b) using -omics data, CliXO outperforms other methods using two -omics datasets and achieves ∼30% precision and recall using YeastNet v3, similar to an earlier approach (Network Extracted Ontology) and better than LocalFitness or standard clustering (20–25% precision, recall). Conclusion: This study provides an algorithmic foundation for building gene ontologies by capturing hierarchical and pleiotropic structure embedded in biomolecular data. Contact: tideker@ucsd.edu PMID:24932003
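    The core CliXO step named above, maximal cliques under progressive thresholding of a similarity matrix, can be sketched as follows; the matrix and cutoffs are toy values, and the clique finder is a plain Bron-Kerbosch rather than the authors' implementation:

```python
# Hedged sketch: threshold a similarity matrix at successive cutoffs and
# report the maximal cliques at each one. Cliques found at a stricter
# cutoff nest inside those found at looser cutoffs; that nesting is what
# CliXO assembles into a DAG of ontology terms. Toy values throughout.

def maximal_cliques(adj):
    """Bron-Kerbosch (no pivoting); adj maps node -> set of neighbours."""
    out = []
    def bk(r, p, x):
        if not p and not x:
            out.append(frozenset(r))
            return
        for v in list(p):
            bk(r | {v}, p & adj[v], x & adj[v])
            p.remove(v)
            x.add(v)
    bk(set(), set(adj), set())
    return out

def cliques_by_threshold(sim, thresholds):
    """Maximal cliques (size > 1) of the graph induced at each cutoff."""
    n = len(sim)
    result = {}
    for t in thresholds:
        adj = {i: {j for j in range(n) if j != i and sim[i][j] >= t}
               for i in range(n)}
        result[t] = {c for c in maximal_cliques(adj) if len(c) > 1}
    return result

sim = [[1.0, 0.9, 0.8, 0.1],
       [0.9, 1.0, 0.7, 0.1],
       [0.8, 0.7, 1.0, 0.2],
       [0.1, 0.1, 0.2, 1.0]]
tiers = cliques_by_threshold(sim, [0.85, 0.6])
# At cutoff 0.85 only {0, 1} survives; at 0.6 it grows into {0, 1, 2}.
```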

  6. Quantitative indexes of aminonucleoside-induced nephrotic syndrome.

    PubMed Central

    Nevins, T. E.; Gaston, T.; Basgen, J. M.

    1984-01-01

    Aminonucleoside of puromycin (PAN) is known to cause altered glomerular permeability, resulting in a nephrotic syndrome in rats. The early sequence of this lesion was studied quantitatively, with the application of a new morphometric technique for determining epithelial foot process widths and a sensitive assay for quantifying urinary albumin excretion. Twenty-four hours following a single intraperitoneal injection of PAN, significant widening of foot processes was documented. Within 36 hours significant increases in urinary albumin excretion were observed. When control rats were examined, there was no clear correlation between epithelial foot process width and quantitative albumin excretion. However, in the PAN-treated animals, abnormal albuminuria only appeared in association with appreciable foot process expansion. These studies indicate that quantitative alterations occur in the rat glomerular capillary wall as early as 24 hours after PAN. Further studies of altered glomerular permeability may use these sensitive measures to more precisely define the temporal sequence and elucidate possible subgroups of experimental glomerular injury. PMID:6486243

  7. Quantitative self-assembly prediction yields targeted nanomedicines

    NASA Astrophysics Data System (ADS)

    Shamay, Yosi; Shah, Janki; Işık, Mehtap; Mizrachi, Aviram; Leibold, Josef; Tschaharganeh, Darjus F.; Roxbury, Daniel; Budhathoki-Uprety, Januka; Nawaly, Karla; Sugarman, James L.; Baut, Emily; Neiman, Michelle R.; Dacek, Megan; Ganesh, Kripa S.; Johnson, Darren C.; Sridharan, Ramya; Chu, Karen L.; Rajasekhar, Vinagolu K.; Lowe, Scott W.; Chodera, John D.; Heller, Daniel A.

    2018-02-01

    Development of targeted nanoparticle drug carriers often requires complex synthetic schemes involving both supramolecular self-assembly and chemical modification. These processes are generally difficult to predict, execute, and control. We describe herein a targeted drug delivery system that is accurately and quantitatively predicted to self-assemble into nanoparticles based on the molecular structures of precursor molecules, which are the drugs themselves. The drugs assemble with the aid of sulfated indocyanines into particles with ultrahigh drug loadings of up to 90%. We devised quantitative structure-nanoparticle assembly prediction (QSNAP) models to identify and validate electrotopological molecular descriptors as highly predictive indicators of nano-assembly and nanoparticle size. The resulting nanoparticles selectively targeted kinase inhibitors to caveolin-1-expressing human colon cancer and autochthonous liver cancer models to yield striking therapeutic effects while avoiding pERK inhibition in healthy skin. This finding enables the computational design of nanomedicines based on quantitative models for drug payload selection.

  8. Comprehensive Quantitative Analysis on Privacy Leak Behavior

    PubMed Central

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects. PMID:24066046

  9. Comprehensive quantitative analysis on privacy leak behavior.

    PubMed

    Fan, Lejun; Wang, Yuanzhuo; Jin, Xiaolong; Li, Jingyuan; Cheng, Xueqi; Jin, Shuyuan

    2013-01-01

    Privacy information is prone to be leaked by illegal software providers with various motivations. Privacy leak behavior has thus become an important research issue of cyber security. However, existing approaches can only qualitatively analyze privacy leak behavior of software applications. No quantitative approach, to the best of our knowledge, has been developed in the open literature. To fill this gap, in this paper we propose for the first time four quantitative metrics, namely, possibility, severity, crypticity, and manipulability, for privacy leak behavior analysis based on Privacy Petri Net (PPN). In order to compare the privacy leak behavior among different software, we further propose a comprehensive metric, namely, overall leak degree, based on these four metrics. Finally, we validate the effectiveness of the proposed approach using real-world software applications. The experimental results demonstrate that our approach can quantitatively analyze the privacy leak behaviors of various software types and reveal their characteristics from different aspects.

  10. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective, qualitative means of evaluation based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  11. Quantitative crystalline silica exposure assessment for a historical cohort epidemiologic study in the German porcelain industry.

    PubMed

    Birk, Thomas; Guldner, Karlheinz; Mundt, Kenneth A; Dahmann, Dirk; Adams, Robert C; Parsons, William

    2010-09-01

    A time-dependent quantitative assessment of silica exposure among nearly 18,000 German porcelain workers was conducted. The results will be used to evaluate exposure-response disease risks. Over 8000 historical industrial hygiene (IH) measurements with original sampling and analysis protocols from 1954-2006 were obtained from the German Berufsgenossenschaft der keramischen- und Glas-Industrie (BGGK) and used to construct a job exposure matrix (JEM). Early measurements from different devices were converted to modern gravimetric equivalent values. Conversion factors were derived from parallel historical measurements and new side-by-side measurements using historical and modern devices in laboratory dust tunnels and active workplace locations. Exposure values were summarized and smoothed using LOESS regression; estimates for early years were derived using backward extrapolation techniques. Employee work histories were merged with JEM values to determine cumulative crystalline silica exposures for cohort members. Average silica concentrations were derived for six primary similar exposure groups (SEGs) for 1938-2006. Over 40% of the cohort accumulated <0.5 mg/m(3)-years; just over one-third accumulated >1 mg/m(3)-years. Nearly 5000 workers had cumulative crystalline silica estimates >1.5 mg/m(3)-years. Similar numbers of men and women fell into each cumulative exposure category, except for 1113 women and 1567 men in the highest category. Over half of those hired before 1960 accumulated >3 mg/m(3)-years of crystalline silica, compared with 4.9% of those hired after 1960. Among those ever working in the materials preparation area, half accumulated >3 mg/m(3)-years, compared with 12% of those never working in this area. Quantitative respirable silica exposures were estimated for each member of this cohort, including employment periods for which sampling used now-obsolete technologies.
Although individual cumulative exposure estimates ranged from background to about 40 mg/m(3)-years
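The LOESS smoothing step mentioned above can be illustrated with a small sketch: a local linear regression with tricube weights over the nearest neighbors of each point. The yearly values and the smoothing fraction below are generic illustrations, not the study's actual series or parameters.

```python
def loess(xs, ys, frac=0.5):
    """Simple LOESS: local linear regression with tricube weights."""
    n = len(xs)
    k = max(2, int(frac * n))  # number of points in each local fit
    out = []
    for x0 in xs:
        # k nearest neighbours of x0
        idx = sorted(range(n), key=lambda i: abs(xs[i] - x0))[:k]
        d = max(abs(xs[i] - x0) for i in idx) or 1.0
        w = [(1 - (abs(xs[i] - x0) / d) ** 3) ** 3 for i in idx]
        # weighted least squares for y = a + b*x
        sw = sum(w)
        sx = sum(wi * xs[i] for wi, i in zip(w, idx))
        sy = sum(wi * ys[i] for wi, i in zip(w, idx))
        sxx = sum(wi * xs[i] ** 2 for wi, i in zip(w, idx))
        sxy = sum(wi * xs[i] * ys[i] for wi, i in zip(w, idx))
        denom = sw * sxx - sx * sx
        b = (sw * sxy - sx * sy) / denom if denom else 0.0
        a = (sy - b * sx) / sw
        out.append(a + b * x0)
    return out
```

On noisy yearly exposure averages, the output is a smoothed trend from which early years could then be back-extrapolated.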

  12. Clinical neurophysiology and quantitative sensory testing in the investigation of orofacial pain and sensory function.

    PubMed

    Jääskeläinen, Satu K

    2004-01-01

Chronic orofacial pain represents a diagnostic and treatment challenge for the clinician. Some conditions, such as atypical facial pain, still lack proper diagnostic criteria, and their etiology is not known. The recent development of neurophysiological methods and quantitative sensory testing for the examination of the trigeminal somatosensory system offers several tools for diagnostic and etiological investigation of orofacial pain. This review presents some of these techniques and the results of their application in studies on orofacial pain and sensory dysfunction. Clinical neurophysiological investigation has greater diagnostic accuracy and sensitivity than clinical examination in the detection of the neurogenic abnormalities of either peripheral or central origin that may underlie symptoms of orofacial pain and sensory dysfunction. Neurophysiological testing may also reveal trigeminal pathology when magnetic resonance imaging has failed to detect it, so these methods should be considered complementary to each other in the investigation of orofacial pain patients. The blink reflex, corneal reflex, jaw jerk, sensory neurography of the inferior alveolar nerve, and the recording of trigeminal somatosensory-evoked potentials with near-nerve stimulation have all proved to be sensitive and reliable in the detection of dysfunction of the myelinated sensory fibers of the trigeminal nerve or its central connections within the brainstem. With appropriately small thermodes, thermal quantitative sensory testing is useful for the detection of trigeminal small-fiber dysfunction (Adelta and C). In neuropathic conditions, it is most sensitive to lesions causing axonal injury. By combining different techniques for investigation of the trigeminal system, an accurate topographical diagnosis and profile of sensory fiber pathology can be determined. Neurophysiological and quantitative sensory tests have already highlighted some similarities among various orofacial pain conditions

  13. Morphological quantitative criteria and aesthetic evaluation of eight female Han face types.

    PubMed

    Zhao, Qiming; Zhou, Rongrong; Zhang, XuDong; Sun, Huafeng; Lu, Xin; Xia, Dongsheng; Song, Mingli; Liang, Yang

    2013-04-01

Human facial aesthetics relies on the classification of facial features and standards of attractiveness. However, there are no widely accepted quantitative criteria for facial attractiveness, particularly for Chinese Han faces. Establishing quantitative standards of attractiveness for facial landmarks within facial types is important for planning outcomes in cosmetic plastic surgery. The aim of this study was to determine quantitatively the criteria for attractiveness of eight female Chinese Han facial types. A photographic database of young Chinese Han women's faces was created. Photographed faces (450) were classified based on eight established types and scored for attractiveness. Measurements taken at seven standard facial landmarks and their relative proportions were analyzed for correlations to attractiveness scores. Attractive faces of each type were averaged via an image-morphing algorithm to generate synthetic facial types. Results were compared with the neoclassical ideal and data for Caucasians. Morphological proportions corresponding to the highest attractiveness scores for Chinese Han women differed from the neoclassical ideal. In our population of young, normal, healthy Han women, high attractiveness ratings were given to those with greater temporal width and pogonion-gonion distance, and smaller bizygomatic and bigonial widths. As attractiveness scores increased, the ratio of the temporal to bizygomatic widths increased, and the ratio of the distance between the pogonion and gonion to the bizygomatic width also increased slightly. Among the facial types, the oval and inverted triangular were the most attractive. The neoclassical ideal of attractiveness does not apply to Han faces. However, the proportion of faces considered attractive in this population was similar to that of Caucasian populations.

  14. Optimizing Nanoscale Quantitative Optical Imaging of Subfield Scattering Targets

    PubMed Central

    Henn, Mark-Alexander; Barnes, Bryan M.; Zhou, Hui; Sohn, Martin; Silver, Richard M.

    2016-01-01

    The full 3-D scattered field above finite sets of features has been shown to contain a continuum of spatial frequency information, and with novel optical microscopy techniques and electromagnetic modeling, deep-subwavelength geometrical parameters can be determined. Similarly, by using simulations, scattering geometries and experimental conditions can be established to tailor scattered fields that yield lower parametric uncertainties while decreasing the number of measurements and the area of such finite sets of features. Such optimized conditions are reported through quantitative optical imaging in 193 nm scatterfield microscopy using feature sets up to four times smaller in area than state-of-the-art critical dimension targets. PMID:27805660

  15. Investigating Pharmacological Similarity by Charting Chemical Space.

    PubMed

    Buonfiglio, Rosa; Engkvist, Ola; Várkonyi, Péter; Henz, Astrid; Vikeved, Elisabet; Backlund, Anders; Kogej, Thierry

    2015-11-23

In this study, biologically relevant areas of the chemical space were analyzed using ChemGPS-NP. This application enables comparing groups of ligands within a multidimensional space based on principal components derived from physicochemical descriptors. Also, 3D visualization of the ChemGPS-NP global map can be used to conveniently evaluate bioactive compound similarity and visually distinguish between different types or groups of compounds. To further establish ChemGPS-NP as a method to accurately represent the chemical space, a comparison with structure-based fingerprints was performed. Interesting complementarities between the two descriptions of molecules were observed. It has been shown that the accuracy of describing molecules with physicochemical descriptors like in ChemGPS-NP is similar to the accuracy of structural fingerprints in retrieving bioactive molecules. Lastly, pharmacological similarity of structurally diverse compounds has been investigated in ChemGPS-NP space. These results further strengthen the case of using ChemGPS-NP as a tool to explore and visualize chemical space.
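The principal-component projection that underlies maps of this kind can be sketched for the simplest two-descriptor case, where the leading axis has a closed form. The descriptor values below are made up for illustration; this is not the ChemGPS-NP implementation, which uses many descriptors and precomputed components.

```python
import math

def pca2_scores(points):
    """Project 2-D descriptor vectors onto their two principal axes."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    xs = [p[0] - mx for p in points]
    ys = [p[1] - my for p in points]
    a = sum(x * x for x in xs) / (n - 1)              # var(x)
    c = sum(y * y for y in ys) / (n - 1)              # var(y)
    b = sum(x * y for x, y in zip(xs, ys)) / (n - 1)  # cov(x, y)
    # closed-form angle of the first principal axis of a 2x2 covariance
    theta = 0.5 * math.atan2(2 * b, a - c)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    return [(x * cos_t + y * sin_t, -x * sin_t + y * cos_t)
            for x, y in zip(xs, ys)]
```

Each compound's two scores place it on a 2-D map in which nearby points have similar descriptor profiles.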

  16. Average is Boring: How Similarity Kills a Meme's Success

    PubMed Central

    Coscia, Michele

    2014-01-01

Every day we are exposed to different ideas, or memes, competing with each other for our attention. Previous research explained popularity and persistence heterogeneity of memes by assuming that memes compete for limited attention resources distributed in a heterogeneous social network. Little has been said about what characteristics make a specific meme more likely to be successful. We propose a similarity-based explanation: memes with higher similarity to other memes have a significant disadvantage in their potential popularity. We employ a meme similarity measure based on semantic text analysis and computer vision to prove that a meme is more likely to be successful and to thrive if its characteristics make it unique. Our results show that indeed successful memes are located in the periphery of the meme similarity space and that our similarity measure is a promising predictor of a meme's success. PMID:25257730

  17. Similarity constraints in testing of cooled engine parts

    NASA Technical Reports Server (NTRS)

    Colladay, R. S.; Stepka, F. S.

    1974-01-01

A study is made of the effect of testing cooled parts of current and advanced gas turbine engines at the reduced temperature and pressure conditions which maintain similarity with the engine environment. Some of the problems facing the experimentalist in evaluating heat transfer and aerodynamic performance when hardware is tested at conditions other than the actual engine environment are considered. Low temperature and pressure test environments can simulate the performance of actual size prototype engine hardware within the tolerance of experimental accuracy if appropriate similarity conditions are satisfied. Failure to adhere to these similarity constraints, whether because of test facility limitations or other reasons, can result in a number of serious errors in projecting the performance of test hardware to engine conditions.

  18. Toward a Social Psychology of Diagnosis: Similarity, Attraction, and Clinical Evaluation.

    ERIC Educational Resources Information Center

    Mazer, Donald B.

    1979-01-01

    Clinicians and undergraduates evaluated a client similar or dissimilar to themselves in political radicalism. Results document the presence of diagnostic bias, but only among student subjects is bias a function of similarity. For clinicians, the more radical client is seen as less disturbed. Similarity-attraction relationships were absent in both…

  19. Similarity increases altruistic punishment in humans

    PubMed Central

    Mussweiler, Thomas; Ockenfels, Axel

    2013-01-01

    Humans are attracted to similar others. As a consequence, social networks are homogeneous in sociodemographic, intrapersonal, and other characteristics—a principle called homophily. Despite abundant evidence showing the importance of interpersonal similarity and homophily for human relationships, their behavioral correlates and cognitive foundations are poorly understood. Here, we show that perceived similarity substantially increases altruistic punishment, a key mechanism underlying human cooperation. We induced (dis)similarity perception by manipulating basic cognitive mechanisms in an economic cooperation game that included a punishment phase. We found that similarity-focused participants were more willing to punish others’ uncooperative behavior. This influence of similarity is not explained by group identity, which has the opposite effect on altruistic punishment. Our findings demonstrate that pure similarity promotes reciprocity in ways known to encourage cooperation. At the same time, the increased willingness to punish norm violations among similarity-focused participants provides a rationale for why similar people are more likely to build stable social relationships. Finally, our findings show that altruistic punishment is differentially involved in encouraging cooperation under pure similarity vs. in-group conditions. PMID:24218611

  20. The Experience Elicited by Hallucinogens Presents the Highest Similarity to Dreaming within a Large Database of Psychoactive Substance Reports.

    PubMed

    Sanz, Camila; Tagliazucchi, Enzo

    2018-01-01

Ever since the modern rediscovery of psychedelic substances by Western society, several authors have independently proposed that their effects bear a high resemblance to the dreams and dreamlike experiences occurring naturally during the sleep-wake cycle. Recent studies in humans have provided neurophysiological evidence supporting this hypothesis. However, a rigorous comparative analysis of the phenomenology ("what it feels like" to experience these states) is currently lacking. We investigated the semantic similarity between a large number of subjective reports of psychoactive substances and reports of high/low lucidity dreams, and found that the highest-ranking substance in terms of the similarity to high lucidity dreams was the serotonergic psychedelic lysergic acid diethylamide (LSD), whereas the highest-ranking in terms of the similarity to dreams of low lucidity were plants of the Datura genus, rich in deliriant tropane alkaloids. Conversely, sedatives, stimulants, antipsychotics, and antidepressants comprised most of the lowest-ranking substances. An analysis of the most frequent words in the subjective reports of dreams and hallucinogens revealed that terms associated with perception ("see," "visual," "face," "reality," "color"), emotion ("fear"), setting ("outside," "inside," "street," "front," "behind") and relatives ("mom," "dad," "brother," "parent," "family") were the most prevalent across both experiences. In summary, we applied novel quantitative analyses to a large volume of empirical data to confirm the hypothesis that, among all psychoactive substances, hallucinogenic drugs elicit experiences with the highest semantic similarity to those of dreams. Our results and the associated methodological developments open the way to study the comparative phenomenology of different altered states of consciousness and its relationship with non-invasive measurements of brain physiology.
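As a rough illustration of comparing text reports by their word content, the sketch below computes a bag-of-words cosine similarity between two reports. This is a deliberately simplified stand-in; the study's actual semantic similarity measure is not specified in the abstract, and the example sentences are made up.

```python
import math
from collections import Counter

def cosine_sim(doc_a, doc_b):
    """Cosine similarity between two texts as word-count vectors."""
    a = Counter(doc_a.lower().split())
    b = Counter(doc_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Applied pairwise over a corpus of reports, such scores would let one rank substances by how closely their reports resemble dream reports.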

  1. A graph-based semantic similarity measure for the gene ontology.

    PubMed

    Alvarez, Marco A; Yan, Changhui

    2011-12-01

Existing methods for calculating semantic similarities between pairs of Gene Ontology (GO) terms and gene products often rely on external databases like Gene Ontology Annotation (GOA) that annotate gene products using the GO terms. This dependency leads to some limitations in real applications. Here, we present a semantic similarity algorithm (SSA) that relies exclusively on the GO. When calculating the semantic similarity between a pair of input GO terms, SSA takes into account the shortest path between them, the depth of their nearest common ancestor, and a novel similarity score calculated between the definitions of the involved GO terms. In our work, we use SSA to calculate semantic similarities between pairs of proteins by combining pairwise semantic similarities between the GO terms that annotate the involved proteins. The reliability of SSA was evaluated by comparing the resulting semantic similarities between proteins with the functional similarities between proteins derived from expert annotations or sequence similarity. Comparisons with existing state-of-the-art methods showed that SSA is highly competitive with the other methods. SSA provides a reliable measure for semantic similarity independent of external databases of functional-annotation observations.
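The two graph ingredients named above (shortest path between terms, depth of their nearest common ancestor) can be sketched on a toy ontology. The tiny DAG, the combining formula, and the term names below are illustrative assumptions, not the published SSA formula, which also scores term definitions.

```python
from collections import deque

# child -> parents in a toy DAG rooted at "root"
PARENTS = {
    "root": [],
    "process": ["root"],
    "metabolism": ["process"],
    "glycolysis": ["metabolism"],
    "respiration": ["metabolism"],
    "signaling": ["process"],
}

def ancestors(term):
    """All terms reachable upward from `term`, including itself."""
    seen, stack = {term}, [term]
    while stack:
        for p in PARENTS[stack.pop()]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def depth(term):
    """Length of the shortest chain of parents up to the root."""
    d, frontier = 0, {term}
    while "root" not in frontier:
        frontier = {p for t in frontier for p in PARENTS[t]}
        d += 1
    return d

def shortest_path(a, b):
    """BFS over the undirected parent/child graph."""
    adj = {t: set(ps) for t, ps in PARENTS.items()}
    for child, ps in PARENTS.items():
        for p in ps:
            adj[p].add(child)
    dist, q = {a: 0}, deque([a])
    while q:
        t = q.popleft()
        if t == b:
            return dist[t]
        for nxt in adj[t]:
            if nxt not in dist:
                dist[nxt] = dist[t] + 1
                q.append(nxt)

def similarity(a, b):
    lca_depth = max(depth(t) for t in ancestors(a) & ancestors(b))
    sp = shortest_path(a, b)
    # deeper common ancestor and shorter path => higher similarity
    return lca_depth / (lca_depth + sp)
```

Here two siblings under "metabolism" score higher than two terms whose nearest common ancestor is the shallower "process".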

  2. Quantification of map similarity to magnetic pre-screening for heavy metal pollution assessment in top soil

    NASA Astrophysics Data System (ADS)

    Cao, L.; Appel, E.; Roesler, W.; Ojha, G.

    2013-12-01

From numerous published results, the link between magnetic concentration and heavy metal (HM) concentrations is well established. However, bivariate correlation analysis does not imply causality, and if there are extreme values, which often appear in magnetic data, they can lead to seemingly excellent correlation. It seems clear that site selection for chemical sampling based on magnetic pre-screening can deliver a superior result for outlining HM pollution, but this conclusion has only been drawn from qualitative evaluation so far. In this study, we use map similarity comparison techniques to demonstrate the usefulness of a combined magnetic-chemical approach quantitatively. We chose available data around the 'Schwarze Pumpe', a large coal burning power plant complex located in eastern Germany. The site of 'Schwarze Pumpe' is suitable for a demonstration study as the soil in its surroundings is heavily polluted by fly ash, the magnetic natural background is very low, and magnetic investigations can be done in undisturbed forest soil. Magnetic susceptibility (MS) of top soil was measured by a Bartington MS2D surface sensor at 180 locations and by a SM400 downhole device in ~0.5m deep vertical sections at 90 locations. Cores from the 90 downhole sites were also studied for HM analysis. From these results 85 sites could be used to determine a spatial distribution map of HM contents reflecting the 'True' situation of pollution. Different sets comprising 30 sites were chosen arbitrarily from the above 85 sample sites (we refer to four such maps here: S1-4). Additionally, we determined a 'Targeted' map from 30 sites selected on the basis of the pre-screening MS results. The map comparison process is as follows: (1) categorization of all absolute values into five classes by the Natural Breaks classification method; (2) use of Delaunay triangulation for connecting the sample locations in the x-y plane; (3) determination of a distribution map of triangular planes with
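The core of such a map comparison, classifying site values into a small number of classes and scoring how often two maps agree, can be sketched as follows. Quantile breaks stand in here for the Natural Breaks (Jenks) method named above, and per-site class agreement stands in for the triangulated-plane comparison; the data are made up.

```python
def classify(values, n_classes=5):
    """Assign each value a class index 0..n_classes-1 by quantile breaks.

    (Simple stand-in for the Jenks Natural Breaks classification.)
    """
    ranked = sorted(values)
    breaks = [ranked[int(len(ranked) * i / n_classes)]
              for i in range(1, n_classes)]
    return [sum(v >= b for b in breaks) for v in values]

def agreement(map_a, map_b):
    """Fraction of sites whose class labels agree between two maps."""
    ca, cb = classify(map_a), classify(map_b)
    return sum(x == y for x, y in zip(ca, cb)) / len(ca)
```

A 'Targeted' map that reproduces the 'True' map's classes at most sites would score near 1.0, while arbitrary subsets would score lower.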

  3. Quantitative Finance

    NASA Astrophysics Data System (ADS)

    James, Jessica

    2017-01-01

    Quantitative finance is a field that has risen to prominence over the last few decades. It encompasses the complex models and calculations that value financial contracts, particularly those which reference events in the future, and apply probabilities to these events. While adding greatly to the flexibility of the market available to corporations and investors, it has also been blamed for worsening the impact of financial crises. But what exactly does quantitative finance encompass, and where did these ideas and models originate? We show that the mathematics behind finance and behind games of chance have tracked each other closely over the centuries and that many well-known physicists and mathematicians have contributed to the field.

  4. A similarity-based data warehousing environment for medical images.

    PubMed

    Teixeira, Jefferson William; Annibal, Luana Peixoto; Felipe, Joaquim Cezar; Ciferri, Ricardo Rodrigues; Ciferri, Cristina Dutra de Aguiar

    2015-11-01

A core issue of the decision-making process in the medical field is to support the execution of analytical (OLAP) similarity queries over images in data warehousing environments. In this paper, we focus on this issue. We propose imageDWE, a non-conventional data warehousing environment that enables the storage of intrinsic features taken from medical images in a data warehouse and supports OLAP similarity queries over them. To comply with this goal, we introduce the concept of perceptual layer, which is an abstraction used to represent an image dataset according to a given feature descriptor in order to enable similarity search. Based on this concept, we propose the imageDW, an extended data warehouse with dimension tables specifically designed to support one or more perceptual layers. We also detail how to build an imageDW and how to load image data into it. Furthermore, we show how to process OLAP similarity queries composed of a conventional predicate and a similarity search predicate that encompasses the specification of one or more perceptual layers. Moreover, we introduce an index technique to improve the OLAP query processing over images. We carried out performance tests over a data warehouse environment that consolidated medical images from exams of several modalities. The results demonstrated the feasibility and efficiency of our proposed imageDWE to manage images and to process OLAP similarity queries. The results also demonstrated that the use of the proposed index technique guaranteed a great improvement in query processing.

  5. On the virtues of automated quantitative structure-activity relationship: the new kid on the block.

    PubMed

    de Oliveira, Marcelo T; Katekawa, Edson

    2018-02-01

Quantitative structure-activity relationship (QSAR) has proved to be an invaluable tool in medicinal chemistry. Data availability at unprecedented levels through various databases has contributed to a resurgence of interest in QSAR. In this context, rapid generation of quality predictive models is highly desirable for hit identification and lead optimization. We showcase the application of an automated QSAR approach, which randomly selects multiple training/test sets and utilizes machine-learning algorithms to generate predictive models. Results demonstrate that AutoQSAR produces models of improved or similar quality to those generated by practitioners in the field but in just a fraction of the time. Despite the potential of the concept to the benefit of the community, the AutoQSAR opportunity has been largely undervalued.

  6. AutoQSAR: an automated machine learning tool for best-practice quantitative structure-activity relationship modeling.

    PubMed

    Dixon, Steven L; Duan, Jianxin; Smith, Ethan; Von Bargen, Christopher D; Sherman, Woody; Repasky, Matthew P

    2016-10-01

We introduce AutoQSAR, an automated machine-learning application to build, validate and deploy quantitative structure-activity relationship (QSAR) models. The process of descriptor generation, feature selection and the creation of a large number of QSAR models has been automated into a single workflow within AutoQSAR. The models are built using a variety of machine-learning methods, and each model is scored using a novel approach. Effectiveness of the method is demonstrated through comparison with literature QSAR models using identical datasets for six endpoints: protein-ligand binding affinity, solubility, blood-brain barrier permeability, carcinogenicity, mutagenicity and bioaccumulation in fish. AutoQSAR demonstrates predictive performance similar to or better than published results for four of the six endpoints while requiring minimal human time and expertise.

  7. Similar or Different?: The Importance of Similarities and Differences for Support between Siblings

    ERIC Educational Resources Information Center

    Voorpostel, Marieke; van der Lippe, Tanja; Dykstra, Pearl A.; Flap, Henk

    2007-01-01

    Using a large-scale Dutch national sample (N = 7,126), the authors examine the importance of similarities and differences in the sibling dyad for the provision of support. Similarities are assumed to enhance attraction and empathy; differences are assumed to be related to different possibilities for exchange. For helping with housework, helping…

  8. Low-dose CT for quantitative analysis in acute respiratory distress syndrome

    PubMed Central

    2013-01-01

Introduction: The clinical use of serial quantitative computed tomography (CT) to characterize lung disease and guide the optimization of mechanical ventilation in patients with acute respiratory distress syndrome (ARDS) is limited by the risk of cumulative radiation exposure and by the difficulties and risks related to transferring patients to the CT room. We evaluated the effects of tube current-time product (mAs) variations on quantitative results in healthy lungs and in experimental ARDS in order to support the use of low-dose CT for quantitative analysis. Methods: In 14 sheep, chest CT was performed at baseline and after the induction of ARDS via intravenous oleic acid injection. For each CT session, two consecutive scans were obtained applying two different mAs: 60 mAs was paired with 140, 15 or 7.5 mAs. All other CT parameters were kept unaltered (tube voltage 120 kVp, collimation 32 × 0.5 mm, pitch 0.85, matrix 512 × 512, pixel size 0.625 × 0.625 mm). Quantitative results obtained at different mAs were compared via Bland-Altman analysis. Results: Good agreement was observed between 60 mAs and 140 mAs and between 60 mAs and 15 mAs (all biases less than 1%). A further reduction of mAs to 7.5 mAs caused an increase in the bias of poorly aerated and nonaerated tissue (-2.9% and 2.4%, respectively) and determined a significant widening of the limits of agreement for the same compartments (-10.5% to 4.8% for poorly aerated tissue and -5.9% to 10.8% for nonaerated tissue). Estimated mean effective dose at 140, 60, 15 and 7.5 mAs corresponded to 17.8, 7.4, 2.0 and 0.9 mSv, respectively. Image noise of scans performed at 140, 60, 15 and 7.5 mAs corresponded to 10, 16, 38 and 74 Hounsfield units, respectively. Conclusions: A reduction of effective dose up to 70% has been achieved with minimal effects on lung quantitative results. Low-dose computed tomography provides accurate quantitative results and could be used to characterize lung compartment distribution and
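The Bland-Altman analysis used above reduces to a bias (mean of paired differences) and limits of agreement (bias ± 1.96 × SD of the differences). A minimal sketch, with made-up paired measurements rather than the study's CT data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bias and 95% limits of agreement for two paired measurement series."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    spread = 1.96 * stdev(diffs)  # half-width of the limits of agreement
    return bias, (bias - spread, bias + spread)
```

A small bias and narrow limits, as reported for 60 vs 140 mAs and 60 vs 15 mAs, indicate that the two acquisitions can be used interchangeably.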

  9. Finding Protein and Nucleotide Similarities with FASTA

    PubMed Central

    Pearson, William R.

    2016-01-01

The FASTA programs provide a comprehensive set of rapid similarity searching tools (fasta36, fastx36, tfastx36, fasty36, tfasty36), similar to those provided by the BLAST package, as well as programs for slower, optimal, local and global similarity searches (ssearch36, ggsearch36) and for searching with short peptides and oligonucleotides (fasts36, fastm36). The FASTA programs use an empirical strategy for estimating statistical significance that accommodates a range of similarity scoring matrices and gap penalties, improving alignment boundary accuracy and search sensitivity (Unit 3.5). The FASTA programs can produce “BLAST-like” alignment and tabular output, for ease of integration into existing analysis pipelines, and can search small, representative databases, and then report results for a larger set of sequences, using links from the smaller dataset. The FASTA programs work with a wide variety of database formats, including mySQL and postgreSQL databases (Unit 9.4). The programs also provide a strategy for integrating domain and active site annotations into alignments and highlighting the mutational state of functionally critical residues. These protocols describe how to use the FASTA programs to characterize protein and DNA sequences, using protein:protein, protein:DNA, and DNA:DNA comparisons. PMID:27010337

  10. Finding Protein and Nucleotide Similarities with FASTA.

    PubMed

    Pearson, William R

    2016-03-24

The FASTA programs provide a comprehensive set of rapid similarity searching tools (fasta36, fastx36, tfastx36, fasty36, tfasty36), similar to those provided by the BLAST package, as well as programs for slower, optimal, local, and global similarity searches (ssearch36, ggsearch36), and for searching with short peptides and oligonucleotides (fasts36, fastm36). The FASTA programs use an empirical strategy for estimating statistical significance that accommodates a range of similarity scoring matrices and gap penalties, improving alignment boundary accuracy and search sensitivity. The FASTA programs can produce "BLAST-like" alignment and tabular output, for ease of integration into existing analysis pipelines, and can search small, representative databases, and then report results for a larger set of sequences, using links from the smaller dataset. The FASTA programs work with a wide variety of database formats, including mySQL and postgreSQL databases. The programs also provide a strategy for integrating domain and active site annotations into alignments and highlighting the mutational state of functionally critical residues. These protocols describe how to use the FASTA programs to characterize protein and DNA sequences, using protein:protein, protein:DNA, and DNA:DNA comparisons.

  11. RFLP-facilitated investigation of the quantitative resistance of rice to brown planthopper ( Nilaparvata lugens).

    PubMed

    Xu, X. F.; Mei, H. W.; Luo, L. J.; Cheng, X. N.; Li, Z. K.

    2002-02-01

Quantitative trait loci (QTLs), conferring quantitative resistance to rice brown planthopper (BPH), were investigated using 160 F(11) recombinant inbred lines (RILs) from the Lemont/Teqing cross, a complete RFLP map, and replicated phenotyping of seedbox inoculation. The paternal indica parent, Teqing, was more resistant to BPH than the maternal japonica parent, Lemont. The RILs showed transgressive segregation for resistance to BPH. Seven main-effect QTLs and many epistatic QTL pairs were identified and mapped on the 12 rice chromosomes. Collectively, the main-effect and epistatic QTLs accounted for over 70% of the total variation in damage scores. Teqing has the resistance allele at four main-effect QTLs, and the Lemont allele resulted in resistance at the other three. Of the main-effect QTLs identified, QBphr5b was mapped to the vicinity of gl1, a major gene controlling leaf and stem pubescence. The Teqing allele controlling leaf and stem pubescence was associated with resistance, while the Lemont allele for glabrous stem and leaves was associated with susceptibility, indicating that this gene may have contributed to resistance through antixenosis. Similar to the reported BPH resistance genes, the other six detected main-effect QTLs were all mapped to regions where major disease resistance genes are located, suggesting they might have contributed either to antibiosis or tolerance. Our results indicated that marker-aided pyramiding of major resistance genes and QTLs should provide effective and stable control over this devastating pest.

  12. The impact of bereaved parents' perceived grief similarity on relationship satisfaction.

    PubMed

    Buyukcan-Tetik, Asuman; Finkenauer, Catrin; Schut, Henk; Stroebe, Margaret; Stroebe, Wolfgang

    2017-06-01

The present research focused on bereaved parents' perceived grief similarity, and aimed to investigate the concurrent and longitudinal effects of the perceptions that the partner has less, equal, or more grief intensity than oneself on relationship satisfaction. Participants of our longitudinal study were 229 heterosexual bereaved Dutch couples who completed questionnaires 6, 13, and 20 months after the loss of their child. Average age of participants was 40.7 (SD = 9.5). Across 3 study waves, participants' perceived grief similarity and relationship satisfaction were assessed. To control for their effects, own grief level, child's gender, expectedness of loss, parent's age, parent's gender, and time were also included in the analyses. Consistent with the hypotheses, cross-sectional results revealed that bereaved parents who perceived dissimilar levels of grief (less or more grief) had lower relationship satisfaction than bereaved parents who perceived similar levels of grief. This effect remained significant controlling for the effects of possible confounding variables and actual similarity in grief between partners. We also found that perceived grief similarity at the first study wave was related to the highest level of relationship satisfaction at the second study wave. Moreover, results showed that perceived grief similarity was associated with a higher level in partner's relationship satisfaction. Results are discussed considering the comparison and similarity in grief across bereaved partners after child loss.

  13. [A novel approach to NIR spectral quantitative analysis: semi-supervised least-squares support vector regression machine].

    PubMed

    Li, Lin; Xu, Shuo; An, Xin; Zhang, Lu-Da

    2011-10-01

In near infrared spectral quantitative analysis, the precision of measured samples' chemical values is the theoretical limit of the precision of quantitative analysis with mathematical models. However, the number of samples whose chemical values can be measured accurately is small. Many models exclude samples without chemical values, and consider only those with chemical values when modeling sample compositions' contents. To address this problem, a semi-supervised LS-SVR (S2 LS-SVR) model is proposed on the basis of LS-SVR, which can utilize samples without chemical values as well as those with chemical values. As with LS-SVR, training this model is equivalent to solving a linear system. Finally, the samples of flue-cured tobacco were taken as experimental material, and corresponding quantitative analysis models were constructed for four sample compositions' contents (total sugar, reducing sugar, total nitrogen and nicotine) with PLS regression, LS-SVR and S2 LS-SVR. For the S2 LS-SVR model, the average relative errors between actual values and predicted ones for the four sample compositions' contents are 6.62%, 7.56%, 6.11% and 8.20%, respectively, and the correlation coefficients are 0.9741, 0.9733, 0.9230 and 0.9486, respectively. Experimental results show the S2 LS-SVR model outperforms the other two, which verifies the feasibility and efficiency of the S2 LS-SVR model.
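The point that training reduces to solving a linear system can be sketched for plain (supervised) LS-SVR, whose KKT conditions yield one dense linear solve; the semi-supervised variant extends this system. The RBF kernel, gamma value, and toy 1-D data below are illustrative assumptions, not the paper's setup.

```python
import math

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def rbf(u, v, sigma=1.0):
    return math.exp(-(u - v) ** 2 / (2 * sigma ** 2))

def lssvr_fit(xs, ys, gamma=1e6):
    # KKT conditions reduce LS-SVR training to one linear solve:
    # [[0, 1^T], [1, K + I/gamma]] @ [b; alpha] = [0; y]
    n = len(xs)
    A = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n):
        A[0][i + 1] = A[i + 1][0] = 1.0
        for j in range(n):
            A[i + 1][j + 1] = rbf(xs[i], xs[j]) + (1.0 / gamma if i == j else 0.0)
    sol = solve(A, [0.0] + list(ys))
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvr_predict(xs, b, alpha, x_new):
    return b + sum(a * rbf(x, x_new) for a, x in zip(alpha, xs))
```

With a large gamma the regularization term I/gamma is tiny and the fitted model nearly interpolates the training responses.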

  14. Teaching quantitative biology: goals, assessments, and resources

    PubMed Central

    Aikens, Melissa L.; Dolan, Erin L.

    2014-01-01

    More than a decade has passed since the publication of BIO2010, calling for an increased emphasis on quantitative skills in the undergraduate biology curriculum. In that time, relatively few papers have been published that describe educational innovations in quantitative biology or provide evidence of their effects on students. Using a “backward design” framework, we lay out quantitative skill and attitude goals, assessment strategies, and teaching resources to help biologists teach more quantitatively. Collaborations between quantitative biologists and education researchers are necessary to develop a broader and more appropriate suite of assessment tools, and to provide much-needed evidence on how particular teaching strategies affect biology students' quantitative skill development and attitudes toward quantitative work. PMID:25368425

  15. Development and First Results of the Width-Tapered Beam Method for Adhesion Testing of Photovoltaic Material Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosco, Nick; Tracy, Jared; Dauskardt, Reinhold

    2016-11-21

    A fracture-mechanics-based approach for quantifying adhesion at every interface within the PV module laminate is presented. The usual requirements of monitoring crack length and specimen compliance are circumvented through development of a width-tapered cantilever beam method. The technique may be applied at both the module and the coupon level to yield a similar, quantitative measurement. Details of module and sample preparation are described, and first results on field-exposed modules deployed for over 27 years are presented.
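    Why crack-length monitoring can be dropped follows from the standard compliance (Irwin-Kies) relation for the energy release rate; a sketch with generic symbols, not notation taken from the paper:

    ```latex
    % Irwin-Kies relation: energy release rate G from applied load P,
    % specimen width b, and compliance C(a) as a function of crack length a
    G = \frac{P^2}{2b}\,\frac{\mathrm{d}C}{\mathrm{d}a}
    ```

    For a suitably width-tapered beam the geometry is chosen so that the factor (1/b) dC/da is approximately constant along the crack path, making G a function of the applied load alone; neither crack length nor compliance then needs to be tracked during the test.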

  16. Semantic similarity measure in biomedical domain leverage web search engine.

    PubMed

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei

    2010-01-01

    Semantic similarity measures play an essential role in information retrieval and natural language processing. In this paper we propose a page-count-based semantic similarity measure and apply it in the biomedical domain. Previous research on semantic-web-related applications has deployed various semantic similarity measures. Despite the usefulness of these measures in those applications, measuring the semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a web search engine. We define various similarity scores for two given terms P and Q, using the page counts for the queries P, Q, and P AND Q. Moreover, we propose a novel approach to computing semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated using support vector machines to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores, and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
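    Page-count similarity scores of this family are typically simple co-occurrence statistics over the counts for P, Q, and P AND Q. A hedged sketch of three common variants (these are standard formulations, not necessarily the paper's exact definitions; the index size N in the PMI variant is an assumed parameter):

    ```python
    import math

    def web_jaccard(p, q, pq):
        # p, q, pq: page counts for queries P, Q, and "P AND Q"
        return 0.0 if pq == 0 else pq / (p + q - pq)

    def web_dice(p, q, pq):
        return 0.0 if pq == 0 else 2 * pq / (p + q)

    def web_pmi(p, q, pq, N):
        # pointwise mutual information, treating page counts as
        # frequencies over an assumed total index size N
        if pq == 0 or p == 0 or q == 0:
            return 0.0
        return math.log2((pq / N) / ((p / N) * (q / N)))
    ```

    For example, two terms that each return 100 pages and co-occur on 50 give a Dice score of 0.5; in the combined approach, scores like these become features for the SVM.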

  17. Developing Geoscience Students' Quantitative Skills

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2005-12-01

    Sophisticated quantitative skills are an essential tool for the professional geoscientist. While students learn many of these skills in graduate school, it is increasingly important that they have a strong grounding in quantitative geoscience as undergraduates. Faculty have developed many strong approaches to teaching these skills in a wide variety of geoscience courses. A workshop in June 2005 brought together eight faculty teaching surface processes and climate change to discuss and refine activities they use and to publish them on the Teaching Quantitative Skills in the Geosciences website (serc.Carleton.edu/quantskills) for broader use. Workshop participants, in consultation with two mathematics faculty with expertise in math education, developed six review criteria to guide discussion: 1) Are the quantitative and geologic goals central and important? (e.g., problem solving, mastery of an important skill, modeling, relating theory to observation) 2) Does the activity lead to better problem solving? 3) Are the quantitative skills integrated with geoscience concepts in a way that makes sense for the learning environment and supports learning of both quantitative skills and geoscience? 4) Does the methodology support learning? (e.g., motivate and engage students; use multiple representations; incorporate reflection, discussion, and synthesis) 5) Are the materials complete and helpful to students? 6) How well has the activity worked when used? Workshop participants found that reviewing each other's activities was very productive because they thought about new ways to teach, and the experience of reviewing helped them think about their own activity from a different point of view. The review criteria focused their thinking about the activity and would be equally helpful in the design of a new activity. We invite a broad international discussion of the criteria (serc.Carleton.edu/quantskills/workshop05/review.html). The teaching activities can be found on the

  18. Similar herpes zoster incidence across Europe: results from a systematic literature review.

    PubMed

    Pinchinat, Sybil; Cebrián-Cuenca, Ana M; Bricout, Hélène; Johnson, Robert W

    2013-04-10

    Herpes zoster (HZ) is caused by reactivation of the varicella-zoster virus (VZV) and mainly affects individuals aged ≥50 years. The forthcoming European launch of a vaccine against HZ (Zostavax®) prompts the need for a better understanding of the epidemiology of HZ in Europe. Therefore, the aim of this systematic review was to summarize the available data on HZ incidence in Europe and to describe age-specific incidence. The Medline database of the National Library of Medicine was used to conduct a comprehensive literature search of population-based studies of HZ incidence published between 1960 and 2010 and carried out in the 27 member countries of the European Union, Iceland, Norway, and Switzerland. The identified articles were reviewed and scored according to a reading grid comprising various quality criteria, and HZ incidence data were extracted and presented by country. The search identified 21 studies and revealed a similar annual HZ incidence throughout Europe, varying by country from 2.0 to 4.6/1000 person-years with no clearly observed geographic trend. Although age groups differed from one study to another, age-specific HZ incidence rates seemed to hold steady during the review period, at around 1/1000 in children <10 years, around 2/1000 in adults aged <40 years, and around 1-4/1000 in adults aged 40-50 years. They then increased rapidly after age 50 years to around 7-8/1000, up to 10/1000 after 80 years of age. Our review confirms that in Europe HZ incidence increases with age, quite drastically after 50 years of age. In all 21 studies included in the present review, incidence rates were higher among women than men, and this difference increased with age. This review also highlights the need to identify standardized surveillance methods to improve the comparability of data within European Union Member States and to monitor the impact of VZV immunization on the epidemiology of HZ. Available data in Europe have shortcomings which

  19. Quantitative structure activity relationship studies of mushroom tyrosinase inhibitors

    NASA Astrophysics Data System (ADS)

    Xue, Chao-Bin; Luo, Wan-Chun; Ding, Qi; Liu, Shou-Zhu; Gao, Xing-Xiang

    2008-05-01

    Here, we report our results from quantitative structure-activity relationship (QSAR) studies on tyrosinase inhibitors. Tyrosinase, also known as phenoloxidase, is a key enzyme in animals, plants, and insects that catalyzes the hydroxylation of tyrosine into o-diphenols and the oxidation of o-diphenols into o-quinones. In the present study, the bioactivities of 48 derivatives of benzaldehyde, benzoic acid, and cinnamic acid compounds were used to construct three-dimensional quantitative structure-activity relationship (3D-QSAR) models using comparative molecular field analysis (CoMFA) and comparative molecular similarity indices analysis (CoMSIA). After superimposition using common-substructure-based alignments, robust and predictive 3D-QSAR models were obtained from CoMFA (q² = 0.855, r² = 0.978) and CoMSIA (q² = 0.841, r² = 0.946), with 6 optimum components. Chemical descriptors, including electronic (Hammett σ), hydrophobic (π), and steric (MR) parameters, a hydrogen-bond acceptor term (H-acc), and an indicator variable (I), were used to construct a 2D-QSAR model. The results of this QSAR indicated that π, MR, and H-acc account for 34.9%, 31.6%, and 26.7% of the calculated biological variance, respectively. Interactions between benzoic acid derivatives and tyrosinase active sites were also studied using a flexible molecular docking method (FlexX). The best-scored candidates were docked flexibly, and the interaction between the benzoic acid derivatives and the tyrosinase active site was elucidated in detail. These studies indicated that one possible mechanism for the interaction between benzoic acid derivatives and the tyrosinase active site is the formation of a hydrogen bond between the hydroxyl (aOH) and carbonyl oxygen atoms of Tyr98, which stabilizes the position of Tyr98 and prevents Tyr98 from participating in the interaction between tyrosinase and ORF378. We believe
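    A 2D-QSAR of this kind is, at its core, a multiple linear regression of biological activity on tabulated descriptors such as σ, π, MR, and H-acc. A minimal sketch with ordinary least squares (the descriptor matrix and activities below are illustrative, not the paper's data):

    ```python
    import numpy as np

    def fit_qsar(X, y):
        # Ordinary least squares for y ≈ X @ coef + intercept,
        # where columns of X are descriptor values per compound
        A = np.column_stack([X, np.ones(len(y))])
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        return beta[:-1], beta[-1]  # descriptor coefficients, intercept

    def r_squared(X, y, coef, intercept):
        # Coefficient of determination of the fitted model
        pred = X @ coef + intercept
        ss_res = np.sum((y - pred) ** 2)
        ss_tot = np.sum((y - y.mean()) ** 2)
        return 1.0 - ss_res / ss_tot
    ```

    The relative contribution of each descriptor to the explained variance, as reported in the abstract, would then come from a variance-decomposition step applied on top of such a fit.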

  20. Similarity of Cortical Activity Patterns Predicts Generalization Behavior

    PubMed Central

    Engineer, Crystal T.; Perez, Claudia A.; Carraway, Ryan S.; Chang, Kevin Q.; Roland, Jarod L.; Sloan, Andrew M.; Kilgard, Michael P.

    2013-01-01

    Humans and animals readily generalize previously learned knowledge to new situations. Determining similarity is critical for assigning category membership to a novel stimulus. We tested the hypothesis that category membership is initially encoded by the similarity of the activity pattern evoked by a novel stimulus to the patterns from known categories. We provide behavioral and neurophysiological evidence that activity patterns in primary auditory cortex contain sufficient information to explain behavioral categorization of novel speech sounds by rats. Our results suggest that category membership might be encoded by the similarity of the activity pattern evoked by a novel speech sound to the patterns evoked by known sounds. Categorization based on featureless pattern matching may represent a general neural mechanism for ensuring accurate generalization across sensory and cognitive systems. PMID:24147140
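    The pattern-matching idea the authors describe can be illustrated as a nearest-template classifier: a novel response is assigned to the known category whose stored activity pattern it resembles most. A sketch under that assumption, using Pearson correlation as the similarity measure (the template arrays are illustrative, not recorded cortical responses):

    ```python
    import numpy as np

    def classify_by_similarity(novel, templates):
        """Assign a novel 1-D activity pattern to the template label with
        the highest Pearson correlation; returns (label, correlation)."""
        best_label, best_r = None, -np.inf
        for label, pattern in templates.items():
            r = np.corrcoef(novel, pattern)[0, 1]
            if r > best_r:
                best_label, best_r = label, r
        return best_label, best_r
    ```

    In this framing, generalization to a novel sound falls out of the same mechanism as recognition: no explicit feature extraction is needed, only similarity between whole activity patterns.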