Sample records for extensive quantitative analysis

  1. Quantitative analysis on the urban flood mitigation effect by the extensive green roof system.

    PubMed

    Lee, J Y; Moon, H J; Kim, T I; Kim, H W; Han, M Y

    2013-10-01

Extensive green-roof systems are expected to have a synergetic effect in mitigating urban runoff, decreasing temperature and supplying water to a building. Mitigation of runoff through rainwater retention requires the effective design of a green-roof catchment. This study identified how to improve building runoff mitigation through quantitative analysis of an extensive green-roof system. Quantitative analysis of green-roof runoff characteristics indicated that the extensive green roof has a high water-retaining capacity in response to rainfall of less than 20 mm/h. As the rainfall intensity increased, the water-retaining capacity decreased. The catchment efficiency of an extensive green roof ranged from 0.44 to 0.52, indicating reduced runoff compared with the 0.9 efficiency of a concrete roof. Therefore, extensive green roofs are an effective storm-water best-management practice, and the proposed parameters can be applied to an algorithm for rainwater-harvesting tank design. © 2013 Elsevier Ltd. All rights reserved.
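The efficiency figures quoted above lend themselves to a quick back-of-the-envelope comparison. The sketch below assumes a simple mass-balance model, runoff volume = efficiency × area × rainfall depth; the roof area and event size are hypothetical, and only the efficiency values (0.44-0.52 green roof, 0.9 concrete) come from the abstract:

```python
# Mass-balance sketch of roof runoff using the catchment efficiencies
# reported in the abstract. V = e * A * R is an illustrative assumption,
# not the paper's tank-design algorithm.

def runoff_volume_m3(rainfall_mm: float, roof_area_m2: float, efficiency: float) -> float:
    """Runoff volume (m^3) = efficiency * roof area (m^2) * rainfall depth (m)."""
    return efficiency * roof_area_m2 * rainfall_mm / 1000.0

# A 20 mm event on a hypothetical 500 m^2 roof:
for label, e in [("green roof (low)", 0.44), ("green roof (high)", 0.52), ("concrete", 0.90)]:
    print(f"{label}: {runoff_volume_m3(20, 500, e):.1f} m^3")
```

On these assumed numbers the green roof yields roughly half the runoff of the concrete roof, which is the kind of margin a rainwater-harvesting tank design would be sized around.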

  2. Mineral Analysis of Whole Grain Total Cereal

    ERIC Educational Resources Information Center

    Hooker, Paul

    2005-01-01

    The quantitative analysis of elemental iron in Whole Grain Total Cereal using visible spectroscopy is suitable for a general chemistry course for science or nonscience majors. The more extensive mineral analysis, specifically for the elements iron, calcium and zinc, is suitable for an instrumental or quantitative analysis chemistry course.

  3. Topology Design for Directional Range Extension Networks with Antenna Blockage

    DTIC Science & Technology

    2017-03-19

introduced by pod-based antenna blockages. Using certain modeling approximations, the paper presents a quantitative analysis showing design trade-offs ... parameters. Section IV develops quantitative relationships among key design elements and performance metrics. Section V considers some implications of the ... Topology Design for Directional Range Extension Networks with Antenna Blockage. Thomas Shake, MIT Lincoln Laboratory, shake@ll.mit.edu. Abstract

  4. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
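One of the features listed above, peptide-to-protein rollup, can be illustrated with a common scheme: summarize each protein by the median of its observed peptide abundances. This is a generic sketch of the idea with made-up data, not DAnTE's specific implementation (DAnTE offers several rollup methods):

```python
# Generic peptide-to-protein rollup sketch: per protein, take the median
# of observed peptide abundances, skipping missing values (None).
from statistics import median

def rollup(peptides_by_protein):
    """peptides_by_protein: {protein_id: [abundance or None, ...]}"""
    protein_values = {}
    for protein, values in peptides_by_protein.items():
        observed = [v for v in values if v is not None]
        if observed:  # proteins with no observed peptides are dropped
            protein_values[protein] = median(observed)
    return protein_values

print(rollup({"P1": [10.0, 12.0, None], "P2": [8.0]}))  # {'P1': 11.0, 'P2': 8.0}
```

The median is a popular choice here because it is robust to a single aberrant peptide measurement; the missing-value handling shown (simply skipping None) is the simplest possible policy, whereas DAnTE also provides explicit imputation algorithms.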

  5. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.

  6. A Fan-Tastic Quantitative Exploration of Ohm's Law

    ERIC Educational Resources Information Center

    Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William

    2018-01-01

    Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence,…

  7. A comparison of manual and quantitative elbow strength testing.

    PubMed

    Shahgholi, Leili; Bengtson, Keith A; Bishop, Allen T; Shin, Alexander Y; Spinner, Robert J; Basford, Jeffrey R; Kaufman, Kenton R

    2012-10-01

    The aim of this study was to compare the clinical ratings of elbow strength obtained by skilled clinicians with objective strength measurement obtained through quantitative testing. A retrospective comparison of subject clinical records with quantitative strength testing results in a motion analysis laboratory was conducted. A total of 110 individuals between the ages of 8 and 65 yrs with traumatic brachial plexus injuries were identified. Patients underwent manual muscle strength testing as assessed on the 5-point British Medical Research Council Scale (5/5, normal; 0/5, absent) and quantitative elbow flexion and extension strength measurements. A total of 92 subjects had elbow flexion testing. Half of the subjects clinically assessed as having normal (5/5) elbow flexion strength on manual muscle testing exhibited less than 42% of their age-expected strength on quantitative testing. Eighty-four subjects had elbow extension strength testing. Similarly, half of those displaying normal elbow extension strength on manual muscle testing were found to have less than 62% of their age-expected values on quantitative testing. Significant differences between manual muscle testing and quantitative findings were not detected for the lesser (0-4) strength grades. Manual muscle testing, even when performed by experienced clinicians, may be more misleading than expected for subjects graded as having normal (5/5) strength. Manual muscle testing estimates for the lesser strength grades (1-4/5) seem reasonably accurate.

  8. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies, and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard, and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory-, and patient-based analyses showed good diagnostic performance, with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76, and AUC of 0.90, 0.84, and 0.87, respectively. In per-territory analysis, our results show similar diagnostic accuracy comparing anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory analysis sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.
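The per-study accuracy measures named in this abstract are computed from a diagnostic 2×2 table. A small sketch of the arithmetic; the counts below are illustrative, not taken from any included study:

```python
# Sensitivity and specificity from a diagnostic 2x2 table
# (TP, FP, TN, FN), as extracted per study in a meta-analysis
# like this one. The counts are made up for illustration.

def sensitivity(tp, fn):
    """Fraction of diseased subjects correctly detected."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """Fraction of healthy subjects correctly cleared."""
    return tn / (tn + fp)

tp, fp, tn, fn = 88, 28, 72, 12
print(sensitivity(tp, fn))  # 0.88
print(specificity(tn, fp))  # 0.72
```

Pooling such per-study pairs across studies (here by analysis territory, reference standard, and perfusion parameter) is what produces the summary sensitivity/specificity and AUC figures reported above.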

  9. Adduct ion-targeted qualitative and quantitative analysis of polyoxypregnanes by ultra-high pressure liquid chromatography coupled with triple quadrupole mass spectrometry.

    PubMed

    Wu, Xu; Zhu, Lin; Ma, Jiang; Ye, Yang; Lin, Ge

    2017-10-25

Polyoxypregnane and its glycosides (POPs) are frequently present in plants of the Asclepiadaceae family and have a variety of biological activities. There is a great need to comprehensively profile these phytochemicals and to quantify them for monitoring their contents in herbs and biological samples. However, POPs undergo extensive adduct ion formation in ESI-MS, which has posed a challenge for their qualitative and quantitative analysis. In the present study, we took advantage of such extensive adduct ion formation to investigate the suitability of adduct ion-targeted analysis of POPs. For the qualitative analysis, we first demonstrated that the sodium and ammonium adduct ion-targeted product ion scans (PIS) provided adequate MS/MS fragmentations for structural characterization of POPs. Aided by precursor ion (PI) scans, which showed high selectivity and sensitivity and improved peak-assignment confidence in conjunction with full scan (FS), the informative adduct ion-targeted PIS enabled rapid POP profiling. For the quantification, we used formic acid rather than ammonium acetate as an additive in the mobile phase to avoid simultaneous formation of sodium and ammonium adduct ions, which greatly improved the reproducibility of the MS response of POPs. By monitoring the solely formed sodium adduct ions [M+Na]+, a method for simultaneous quantification of 25 POPs in the dynamic multiple reaction monitoring mode was then developed and validated. Finally, the aforementioned methods were applied to qualitative and quantitative analysis of POPs in the extract of a traditional Chinese medicinal herb, Marsdenia tenacissima (Roxb.) Wight et Arn., and in plasma obtained from rats treated with this herb. The results demonstrated that adduct ion formation could be exploited for the qualitative and quantitative analysis of POPs, and our PI/FS-PIS scanning and sole [M+Na]+ ion monitoring significantly improved the analysis of POPs in both herbal and biological samples. This study also provides implications for the analysis of other compounds that undergo extensive adduct ion formation in ESI-MS. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Time-varying surface electromyography topography as a prognostic tool for chronic low back pain rehabilitation.

    PubMed

    Hu, Yong; Kwok, Jerry Weilun; Tse, Jessica Yuk-Hang; Luk, Keith Dip-Kei

    2014-06-01

Nonsurgical rehabilitation therapy is a commonly used strategy to treat chronic low back pain (LBP). The selection of the most appropriate therapeutic options is still a big challenge in clinical practice. Surface electromyography (sEMG) topography has been proposed as an objective assessment of LBP rehabilitation. The quantitative analysis of dynamic sEMG would provide an objective tool of prognosis for LBP rehabilitation. To evaluate the prognostic value of quantitative sEMG topographic analysis and to verify the accuracy of the performance of the proposed time-varying topographic parameters for identifying the patients who have a better response toward the rehabilitation program. A retrospective study of consecutive patients. Thirty-eight patients with chronic nonspecific LBP and 43 healthy subjects. The accuracy of the time-varying quantitative sEMG topographic analysis for monitoring LBP rehabilitation progress was determined by calculating the corresponding receiver-operating characteristic (ROC) curves. The physiologic measure was the sEMG during lumbar flexion and extension. Patients who suffered from chronic nonspecific LBP without a history of back surgery or any medical condition causing acute exacerbation of LBP during the clinical test were enrolled to perform the clinical test during the 12-week physiotherapy (PT) treatment. Low back pain patients were classified into two groups, "responding" and "nonresponding," based on the clinical assessment. The responding group referred to the LBP patients who began to recover after the PT treatment, whereas the nonresponding group referred to those LBP patients who did not recover or worsened after the treatment. The results of the time-varying analysis in the responding group were compared with those in the nonresponding group. In addition, the accuracy of the analysis was analyzed through ROC curves.
The time-varying analysis showed discrepancies in the root-mean-square difference (RMSD) parameters between the responding and nonresponding groups. The relative area (RA) and relative width (RW) of RMSD at flexion and extension in the responding group were significantly lower than those in the nonresponding group (p<.05). The areas under the ROC curve of RA and RW of RMSD at flexion and extension were greater than 0.7 and were statistically significant. The quantitative time-varying analysis of sEMG topography showed a significant difference between the healthy and LBP groups. The discrepancies in quantitative dynamic sEMG topography of the LBP group from the normal group, in terms of RA and RW of RMSD at flexion and extension, were able to identify those LBP subjects who would respond to a conservative rehabilitation program focused on functional restoration of the lumbar muscles. Copyright © 2014 Elsevier Inc. All rights reserved.
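The RMSD parameter above compares a patient's sEMG topography against a reference map. A minimal sketch of root-mean-square difference between two equal-size maps flattened to vectors; the framing is generic, not the study's exact pipeline (the study's RA and RW parameters are further derived from such RMSD values over the flexion-extension cycle):

```python
# Root-mean-square difference between two sEMG topography maps,
# each flattened to an equal-length vector of channel amplitudes.
from math import sqrt

def rmsd(map_a, map_b):
    assert len(map_a) == len(map_b), "maps must have the same number of channels"
    return sqrt(sum((a - b) ** 2 for a, b in zip(map_a, map_b)) / len(map_a))

print(rmsd([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0 (identical maps)
print(rmsd([0.0, 0.0], [3.0, 4.0]))            # sqrt((9 + 16) / 2), about 3.54
```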

  11. Putative regulatory sites unraveled by network-embedded thermodynamic analysis of metabolome data

    PubMed Central

    Kümmel, Anne; Panke, Sven; Heinemann, Matthias

    2006-01-01

As one of the most recent members of the omics family, large-scale quantitative metabolomics data are currently complementing our systems biology data pool and offer the chance to integrate the metabolite level into the functional analysis of cellular networks. Network-embedded thermodynamic analysis (NET analysis) is presented as a framework for mechanistic and model-based analysis of these data. By coupling the data to an operating metabolic network via the second law of thermodynamics and the metabolites' Gibbs energies of formation, NET analysis allows inferring functional principles from quantitative metabolite data; for example, it identifies reactions that are subject to active allosteric or genetic regulation, as exemplified with quantitative metabolite data from Escherichia coli and Saccharomyces cerevisiae. Moreover, the optimization framework of NET analysis was demonstrated to be a valuable tool to systematically investigate data sets for consistency, for the extension of sub-omic metabolome data sets and for resolving intracompartmental concentrations from cell-averaged metabolome data. Without requiring any kind of kinetic modeling, NET analysis represents a perfectly scalable and unbiased approach to uncover insights from quantitative metabolome data. PMID:16788595
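The second-law coupling described above has a compact standard form. In the conventional notation for reaction Gibbs energies (not necessarily the paper's own symbols), a measured concentration set is thermodynamically consistent with an operating flux distribution if every active reaction dissipates free energy:

```latex
\Delta_r G_j \;=\; \Delta_r G_j^{\circ} + RT \sum_i s_{ij} \ln c_i ,
\qquad
\operatorname{sign}(v_j)\,\Delta_r G_j < 0 \quad \text{for every reaction with } v_j \neq 0 ,
```

where $s_{ij}$ is the stoichiometric coefficient of metabolite $i$ in reaction $j$, $c_i$ its concentration, and $v_j$ the reaction flux. NET analysis checks whether the measured $c_i$ admit such a consistent assignment; reactions forced far from equilibrium by the data are the candidates for active regulation.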

  12. Quantitative analysis of single-molecule force spectroscopy on folded chromatin fibers

    PubMed Central

    Meng, He; Andresen, Kurt; van Noort, John

    2015-01-01

    Single-molecule techniques allow for picoNewton manipulation and nanometer accuracy measurements of single chromatin fibers. However, the complexity of the data, the heterogeneity of the composition of individual fibers and the relatively large fluctuations in extension of the fibers complicate a structural interpretation of such force-extension curves. Here we introduce a statistical mechanics model that quantitatively describes the extension of individual fibers in response to force on a per nucleosome basis. Four nucleosome conformations can be distinguished when pulling a chromatin fiber apart. A novel, transient conformation is introduced that coexists with single wrapped nucleosomes between 3 and 7 pN. Comparison of force-extension curves between single nucleosomes and chromatin fibers shows that embedding nucleosomes in a fiber stabilizes the nucleosome by 10 kBT. Chromatin fibers with 20- and 50-bp linker DNA follow a different unfolding pathway. These results have implications for accessibility of DNA in fully folded and partially unwrapped chromatin fibers and are vital for understanding force unfolding experiments on nucleosome arrays. PMID:25779043
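The force window quoted above, in which a transient conformation coexists with wrapped nucleosomes, is what a two-state Boltzmann picture predicts near the force where the free-energy difference between states crosses zero. A generic sketch with illustrative numbers (an unwrapping cost of about 13 kBT, an extension gain of about 25 nm, kBT ≈ 4.1 pN·nm at room temperature); these parameter values are assumptions for illustration, not the paper's fitted model:

```python
# Two-state Boltzmann sketch: occupancy of the unwrapped nucleosome
# state as a function of applied force. Parameter values illustrative.
from math import exp

KBT = 4.1  # thermal energy at room temperature, pN*nm

def p_unwrapped(force_pn, dg_kbt=13.0, dx_nm=25.0):
    """Probability of the unwrapped state under force (pN).

    dg_kbt: zero-force free-energy cost of unwrapping, in kBT (assumed)
    dx_nm:  extension gained on unwrapping, in nm (assumed)
    """
    dg_at_force = dg_kbt * KBT - force_pn * dx_nm  # pN*nm
    return 1.0 / (1.0 + exp(dg_at_force / KBT))

for f in (1.0, 2.13, 4.0):
    print(f"{f:.2f} pN -> p_unwrapped = {p_unwrapped(f):.3f}")
```

With these assumed numbers the transition is centered near 2 pN and is essentially complete by 4 pN; fitting such per-nucleosome state models to measured force-extension curves is the kind of quantitative description the abstract refers to.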

  13. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  14. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases.
CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  15. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Determination of differential expression of each protein is based on a standardized Z-statistic computed from the posterior distribution of the log-fold-change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the Clinical Proteomic Technology Assessment for Cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework and computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.

  16. TREX13 Data Analysis/Modeling

    DTIC Science & Technology

    2015-09-30

TREX13 data analysis/modeling. Dajun (DJ) Tang, Applied Physics Laboratory, University of Washington, 1013 NE 40th Street, Seattle, WA 98105 ... accuracy in those predictions. With extensive TREX13 data in hand, the objective now shifts to realizing the long-term goals using data analysis and ... be quantitatively addressed. The approach to analysis can be summarized into the following steps: 1. Based on measurements, assess to what degree

  17. Quantitative genetic versions of Hamilton's rule with empirical applications

    PubMed Central

    McGlothlin, Joel W.; Wolf, Jason B.; Brodie, Edmund D.; Moore, Allen J.

    2014-01-01

    Hamilton's theory of inclusive fitness revolutionized our understanding of the evolution of social interactions. Surprisingly, an incorporation of Hamilton's perspective into the quantitative genetic theory of phenotypic evolution has been slow, despite the popularity of quantitative genetics in evolutionary studies. Here, we discuss several versions of Hamilton's rule for social evolution from a quantitative genetic perspective, emphasizing its utility in empirical applications. Although evolutionary quantitative genetics offers methods to measure each of the critical parameters of Hamilton's rule, empirical work has lagged behind theory. In particular, we lack studies of selection on altruistic traits in the wild. Fitness costs and benefits of altruism can be estimated using a simple extension of phenotypic selection analysis that incorporates the traits of social interactants. We also discuss the importance of considering the genetic influence of the social environment, or indirect genetic effects (IGEs), in the context of Hamilton's rule. Research in social evolution has generated an extensive body of empirical work focusing—with good reason—almost solely on relatedness. We argue that quantifying the roles of social and non-social components of selection and IGEs, in addition to relatedness, is now timely and should provide unique additional insights into social evolution. PMID:24686930
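For reference, the rule at issue can be stated in its classic form, with the cost and benefit written as the partial regression (selection) gradients that the "extension of phenotypic selection analysis" mentioned above would estimate. The gradient symbols below are a standard illustrative notation, not necessarily the authors' own:

```latex
rb - c > 0
\qquad\Longleftrightarrow\qquad
\beta_{\text{own}} + r\,\beta_{\text{partner}} > 0,
\qquad
w = \alpha + \beta_{\text{own}}\, z + \beta_{\text{partner}}\, z' + \varepsilon ,
```

where $r$ is relatedness, $w$ is fitness, $z$ is the focal individual's trait and $z'$ its social partner's trait, so that $-\beta_{\text{own}}$ plays the role of the cost $c$ and $\beta_{\text{partner}}$ the role of the benefit $b$.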

  18. The Effectiveness of Second Language Strategy Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke

    2011-01-01

    Research on the effects of second language strategy instruction (SI) has been extensive yet inconclusive. This meta-analysis, therefore, aims to provide a reliable, quantitative measure of the effect of SI as well as a description of the relationship between SI and the variables that moderate its effectiveness (i.e., different learning contexts,…

  19. A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.

    PubMed

    Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E

    2016-01-01

Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR, and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remain challenging for the general community, requiring specialized expertise and ultimately limiting more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by and validate previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.

  20. INTRODUCTION TO THE LANDSCAPE ANALYSIS TOOLS ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. watershed or county). ...

  1. Quantifying patterns of research interest evolution

    NASA Astrophysics Data System (ADS)

    Jia, Tao; Wang, Dashun; Szymanski, Boleslaw

    Changing and shifting research interest is an integral part of a scientific career. Despite extensive investigations of various factors that influence a scientist's choice of research topics, quantitative assessments of mechanisms that give rise to macroscopic patterns characterizing research interest evolution of individual scientists remain limited. Here we perform a large-scale analysis of extensive publication records, finding that research interest change follows a reproducible pattern characterized by an exponential distribution. We identify three fundamental features responsible for the observed exponential distribution, which arise from a subtle interplay between exploitation and exploration in research interest evolution. We develop a random walk based model, which adequately reproduces our empirical observations. Our study presents one of the first quantitative analyses of macroscopic patterns governing research interest change, documenting a high degree of regularity underlying scientific research and individual careers.

  2. "Would You Like to Tidy up Now?" An Analysis of Adult Questioning in the English Foundation Stage

    ERIC Educational Resources Information Center

    Siraj-Blatchford, Iram; Manni, Laura

    2008-01-01

    This study provides an extension of analysis concerned with adult questioning carried out in the Researching Effective Pedagogy in the Early Years (REPEY) study. The REPEY study drew on robust quantitative data provided by the Effective Provision of Pre-School Education (EPPE) project to identify the particular pedagogical strategies being applied…

  3. Neural Extensions to Robust Parameter Design

    DTIC Science & Technology

    2010-09-01

    different ANNs to classify a winner in an NBA basketball game based simply on box score data. The results obtained from these authors showed remarkable… 27-29, 2009. Loeffelholz, B.J., Bednar, E., & Bauer, K.W. (2009). "Predicting NBA games using neural networks," Journal of Quantitative Analysis

  4. Ramifications of increased training in quantitative methodology.

    PubMed

    Zimiles, Herbert

    2009-01-01

    Comments on the article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America" by Aiken, West, and Millsap. The current author asks three questions that are provoked by the comprehensive identification of gaps and deficiencies in the training of quantitative methodology that led Aiken, West, and Millsap to call for expanded graduate instruction resources and programs. This comment calls for greater attention to how advances and expansion in the training of quantitative analysis are influencing who chooses to study psychology and how and what will be studied. PsycINFO Database Record 2009 APA.

  5. Dominant Epistasis Between Two Quantitative Trait Loci Governing Sporulation Efficiency in Yeast Saccharomyces cerevisiae

    PubMed Central

    Bergman, Juraj; Mitrikeski, Petar T.

    2015-01-01

    Summary: Sporulation efficiency in the yeast Saccharomyces cerevisiae is a well-established model for studying quantitative traits. A variety of genes and nucleotides causing different sporulation efficiencies in laboratory, as well as in wild strains, have already been extensively characterised (mainly by reciprocal hemizygosity analysis and nucleotide exchange methods). We applied a different strategy in order to analyze the variation in sporulation efficiency of laboratory yeast strains. Coupling classical quantitative genetic analysis with simulations of phenotypic distributions (a method we call phenotype modelling) enabled us to obtain a detailed picture of the quantitative trait loci (QTL) relationships underlying the phenotypic variation of this trait. Using this approach, we were able to uncover a dominant epistatic inheritance of loci governing the phenotype. Moreover, a molecular analysis of known causative quantitative trait genes and nucleotides allowed for the detection of novel alleles, potentially responsible for the observed phenotypic variation. Based on the molecular data, we hypothesise that the observed dominant epistatic relationship could be caused by the interaction of multiple quantitative trait nucleotides distributed across a 60-kb QTL region located on chromosome XIV and the RME1 locus on chromosome VII. Furthermore, we propose a model of molecular pathways which possibly underlie the phenotypic variation of this trait. PMID:27904371

  6. Quantitative analysis of Sudan dye adulteration in paprika powder using FTIR spectroscopy

    USDA-ARS?s Scientific Manuscript database

    The presence of Sudan dye used illegally for coloring in food stuffs has become a point of food safety concern, especially in paprika- and chili-containing food products. Fourier transform infrared (FTIR) spectroscopy has been extensively used as an analytical method for quality control and safety m...

  7. Quantitative Analysis of Non-Financial Motivators and Job Satisfaction of Information Technology Professionals

    ERIC Educational Resources Information Center

    Mieszczak, Gina L.

    2013-01-01

    Organizations depend extensively on Information Technology professionals to drive and deliver technology solutions quickly, efficiently, and effectively to achieve business goals and profitability. It has been demonstrated that professionals with experience specific to the company are valuable assets, and their departure puts technology projects…

  8. Multiplex, quantitative cellular analysis in large tissue volumes with clearing-enhanced 3D microscopy (Ce3D)

    PubMed Central

    Li, Weizhe; Germain, Ronald N.

    2017-01-01

    Organ homeostasis, cellular differentiation, signal relay, and in situ function all depend on the spatial organization of cells in complex tissues. For this reason, comprehensive, high-resolution mapping of cell positioning, phenotypic identity, and functional state in the context of macroscale tissue structure is critical to a deeper understanding of diverse biological processes. Here we report an easy-to-use method, clearing-enhanced 3D (Ce3D), which generates excellent tissue transparency for most organs, preserves cellular morphology and protein fluorescence, and is robustly compatible with antibody-based immunolabeling. This enhanced signal quality and capacity for extensive probe multiplexing permits quantitative analysis of distinct, highly intermixed cell populations in intact Ce3D-treated tissues via 3D histo-cytometry. We use this technology to demonstrate large-volume, high-resolution microscopy of diverse cell types in lymphoid and nonlymphoid organs, as well as to perform quantitative analysis of the composition and tissue distribution of multiple cell populations in lymphoid tissues. Combined with histo-cytometry, Ce3D provides a comprehensive strategy for volumetric quantitative imaging and analysis that bridges the gap between conventional section imaging and dissociation-based techniques. PMID:28808033

  9. A Fan-tastic Quantitative Exploration of Ohm's Law

    NASA Astrophysics Data System (ADS)

    Mitchell, Brandon; Ekey, Robert; McCullough, Roy; Reitz, William

    2018-02-01

    Teaching simple circuits and Ohm's law to students in the introductory classroom has been extensively investigated through the common practice of using incandescent light bulbs to help students develop a conceptual foundation before moving on to quantitative analysis. However, the bulb filaments' resistance has a large temperature dependence, which makes them less suitable as a tool for quantitative analysis. Some instructors show that light bulbs do not obey Ohm's law either outright or through inquiry-based laboratory experiments. Others avoid the subject altogether by using bulbs strictly for qualitative purposes and then later switching to resistors for a numerical analysis, or by changing the operating conditions of the bulb so that it is "barely" glowing. It seems incongruous to develop a conceptual basis for the behavior of simple circuits using bulbs only to later reveal that they do not follow Ohm's law. Recently, small computer fans were proposed as a suitable replacement of bulbs for qualitative analysis of simple circuits where the current is related to the rotational speed of the fans. In this contribution, we demonstrate that fans can also be used for quantitative measurements and provide suggestions for successful classroom implementation.

  10. ELISA-BASE: An Integrated Bioinformatics Tool for Analyzing and Tracking ELISA Microarray Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Amanda M.; Collett, James L.; Seurynck-Servoss, Shannon L.

    ELISA-BASE is an open-source database for capturing, organizing and analyzing protein enzyme-linked immunosorbent assay (ELISA) microarray data. ELISA-BASE is an extension of the BioArray Software Environment (BASE) database system, which was developed for DNA microarrays. In order to make BASE suitable for protein microarray experiments, we developed several plugins for importing and analyzing quantitative ELISA microarray data. Most notably, our Protein Microarray Analysis Tool (ProMAT) for processing quantitative ELISA data is now available as a plugin to the database.

  11. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data

    PubMed Central

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi

    2015-01-01

    Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366

  12. Biological Dynamics Markup Language (BDML): an open format for representing quantitative biological dynamics data.

    PubMed

    Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H L; Onami, Shuichi

    2015-04-01

    Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  13. A Simple and Computationally Efficient Approach to Multifactor Dimensionality Reduction Analysis of Gene-Gene Interactions for Quantitative Traits

    PubMed Central

    Gui, Jiang; Moore, Jason H.; Williams, Scott M.; Andrews, Peter; Hillege, Hans L.; van der Harst, Pim; Navis, Gerjan; Van Gilst, Wiek H.; Asselbergs, Folkert W.; Gilbert-Diamond, Diane

    2013-01-01

    We present an extension of the two-class multifactor dimensionality reduction (MDR) algorithm that enables detection and characterization of epistatic SNP-SNP interactions in the context of a quantitative trait. The proposed Quantitative MDR (QMDR) method handles continuous data by modifying MDR’s constructive induction algorithm to use a T-test. QMDR replaces the balanced accuracy metric with a T-test statistic as the score to determine the best interaction model. We used a simulation to identify the empirical distribution of QMDR’s testing score. We then applied QMDR to genetic data from the ongoing prospective Prevention of Renal and Vascular End-Stage Disease (PREVEND) study. PMID:23805232
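
    As a rough illustration of the scoring idea described above, the following Python sketch assigns each multilocus genotype cell to a high or low group by comparing its cell mean to the overall trait mean, then scores the pooled split with a Welch t statistic. The data are toy values; the published QMDR implementation differs in detail (e.g., cross-validation and permutation testing are omitted here).

```python
from statistics import mean, stdev
from math import sqrt

def qmdr_score(genotypes, trait):
    """QMDR-style score for one SNP-SNP model: pool genotype cells into
    high/low groups by cell mean vs. overall mean, then return a
    Welch two-sample t statistic between the pooled groups."""
    overall = mean(trait)
    cells = {}
    for g, t in zip(genotypes, trait):
        cells.setdefault(g, []).append(t)
    high, low = [], []
    for values in cells.values():
        (high if mean(values) >= overall else low).extend(values)
    n1, n2 = len(high), len(low)
    v1, v2 = stdev(high) ** 2, stdev(low) ** 2   # sample variances
    return (mean(high) - mean(low)) / sqrt(v1 / n1 + v2 / n2)

# Toy example: two SNPs coded 0/1/2, combined as genotype tuples
genos = [(0, 0), (0, 0), (1, 1), (1, 1), (2, 2), (2, 2)]
trait = [1.0, 1.2, 1.1, 0.9, 3.0, 3.2]
print(qmdr_score(genos, trait) > 0)  # True: the (2,2) cell drives the score
```

    In the full method, this score would be computed for every candidate SNP pair and compared against an empirical null distribution.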

  14. An Energy Management Programme for Grande Prairie Public School District. Energy Conservation: Energy Management.

    ERIC Educational Resources Information Center

    Calgary Univ. (Alberta).

    This report describes a pilot energy conservation project in Grande Prairie (Alberta) School District No. 2357. Extensive data collection and analysis were undertaken to provide a sound, quantitative basis for evaluation of the program. Energy conserving measures requiring capital outlays were not considered. During the project, electric demand…

  15. Quantitative analysis of ribosome–mRNA complexes at different translation stages

    PubMed Central

    Shirokikh, Nikolay E.; Alkalaeva, Elena Z.; Vassilenko, Konstantin S.; Afonina, Zhanna A.; Alekhina, Olga M.; Kisselev, Lev L.; Spirin, Alexander S.

    2010-01-01

    Inhibition of primer extension by ribosome–mRNA complexes (toeprinting) is a proven and powerful technique for studying mechanisms of mRNA translation. Here we have assayed an advanced toeprinting approach that employs fluorescently labeled DNA primers, followed by capillary electrophoresis utilizing standard instruments for sequencing and fragment analysis. We demonstrate that this improved technique is not merely fast and cost-effective, but also brings the primer extension inhibition method up to the next level. The electrophoretic pattern of the primer extension reaction can be characterized with a precision unattainable by the common toeprint analysis utilizing radioactive isotopes. This method allows us to detect and quantify stable ribosomal complexes at all stages of translation, including initiation, elongation and termination, generated during the complete translation process in both the in vitro reconstituted translation system and the cell lysate. We also point out the unique advantages of this new methodology, including the ability to assay sites of the ribosomal complex assembly on several mRNA species in the same reaction mixture. PMID:19910372

  16. Robust LOD scores for variance component-based linkage analysis.

    PubMed

    Blangero, J; Williams, J T; Almasy, L

    2000-01-01

    The variance component method is now widely used for linkage analysis of quantitative traits. Although this approach offers many advantages, the importance of the underlying assumption of multivariate normality of the trait distribution within pedigrees has not been studied extensively. Simulation studies have shown that traits with leptokurtic distributions yield linkage test statistics that exhibit excessive Type I error when analyzed naively. We derive analytical formulae relating the deviation from the expected asymptotic distribution of the lod score to the kurtosis and total heritability of the quantitative trait. A simple correction constant yields a robust lod score for any deviation from normality and for any pedigree structure, and effectively eliminates the problem of inflated Type I error due to misspecification of the underlying probability model in variance component-based linkage analysis.

  17. [Doppler echocardiography of tricuspid insufficiency. Methods of quantification].

    PubMed

    Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P

    1994-01-01

    Evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method is able to provide an accurate evaluation, mainly through use of the Doppler mode. Beyond new criteria still under evaluation (mainly the convergence zone of the regurgitant jet), several indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since calculation of the regurgitation fraction from pulsed Doppler does not seem to be reliable; an accurate semi-quantitative evaluation is nevertheless made possible by careful and consistent use of all the available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and try to define a practical approach.

  18. Non-Ionic Highly Permeable Polymer Shells for Encapsulation of Living Cells

    DTIC Science & Technology

    2011-05-01

    I would like to thank Irina Drachuk for her extensive assistance in data collection and analysis, and Drs. Veronika Kozlovskaya and Olga Shchepelina… considered complete when the intensity of the photobleached region stabilized. The quantitative analysis was performed using ImageJ software, and curve… E., Tannin-protein complexes as radical scavengers and radical sinks. J Agric Food Chem 2001, 49 (10), 4917-23. 53. Lopes, G. K.; Schulman, H. M

  19. Benefit-risk analysis : a brief review and proposed quantitative approaches.

    PubMed

    Holden, William L

    2003-01-01

    Given the current status of benefit-risk analysis as a largely qualitative method, two techniques for a quantitative synthesis of a drug's benefit and risk are proposed to allow a more objective approach. The recommended methods, relative-value adjusted number-needed-to-treat (RV-NNT) and its extension, minimum clinical efficacy (MCE) analysis, rely upon efficacy or effectiveness data, adverse event data and utility data from patients, describing their preferences for an outcome given potential risks. These methods, using hypothetical data for rheumatoid arthritis drugs, demonstrate that quantitative distinctions can be made between drugs which would better inform clinicians, drug regulators and patients about a drug's benefit-risk profile. If the number of patients needed to treat is less than the relative-value adjusted number-needed-to-harm in an RV-NNT analysis, patients are willing to undergo treatment with the experimental drug to derive a certain benefit knowing that they may be at risk for any of a series of potential adverse events. Similarly, the results of an MCE analysis allow for determining the worth of a new treatment relative to an older one, given not only the potential risks of adverse events and benefits that may be gained, but also by taking into account the risk of disease without any treatment. Quantitative methods of benefit-risk analysis have a place in the evaluative armamentarium of pharmacovigilance, especially those that incorporate patients' perspectives.
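
    The decision rule described above, in which treatment is favored when the number needed to treat (NNT) is smaller than the relative-value adjusted number needed to harm, can be sketched as follows. All probabilities and the relative value are hypothetical, and the exact RV-NNT formulation in the paper may differ; this is only an illustration of the comparison, not the author's computation.

```python
def nnt(p_event_control, p_event_treated):
    """Number needed to treat: reciprocal of the absolute risk reduction."""
    return 1.0 / (p_event_control - p_event_treated)

def rv_nnh(p_harm_treated, p_harm_control, relative_value):
    """Number needed to harm, adjusted by a patient-utility relative value
    (0 < relative_value <= 1 downweights harms patients judge less severe,
    making the adjusted NNH larger)."""
    return 1.0 / ((p_harm_treated - p_harm_control) * relative_value)

# Hypothetical rheumatoid-arthritis drug: treatment failure falls from 80%
# to 60%; serious adverse events rise from 2% to 5%; patients weight the
# harm at half the importance of the benefit.
benefit_nnt = nnt(0.80, 0.60)          # treat 5 patients to avoid one failure
harm_rvnnh = rv_nnh(0.05, 0.02, 0.5)   # ~66.7 after relative-value adjustment
print(benefit_nnt < harm_rvnnh)        # True: benefit-risk favors treatment
```

    Under these hypothetical inputs the NNT (5) is far below the adjusted NNH (~67), so the sketch concludes that patients would accept the risk for the benefit.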

  20. Online Learner Self-Regulation: Learning Presence Viewed through Quantitative Content- and Social Network Analysis

    ERIC Educational Resources Information Center

    Shea, Peter; Hayes, Suzanne; Smith, Sedef Uzuner; Vickers, Jason; Bidjerano, Temi; Gozza-Cohen, Mary; Jian, Shou-Bang; Pickett, Alexandra M.; Wilde, Jane; Tseng, Chi-Hua

    2013-01-01

    This paper presents an extension of an ongoing study of online learning framed within the community of inquiry (CoI) model (Garrison, Anderson, & Archer, 2001) in which we further examine a new construct labeled as "learning presence." We use learning presence to refer to the iterative processes of forethought and planning,…

  1. An Overview of Markov Chain Methods for the Study of Stage-Sequential Developmental Processes

    ERIC Educational Resources Information Center

    Kaplan, David

    2008-01-01

    This article presents an overview of quantitative methodologies for the study of stage-sequential development based on extensions of Markov chain modeling. Four methods are presented that exemplify the flexibility of this approach: the manifest Markov model, the latent Markov model, latent transition analysis, and the mixture latent Markov model.…

  2. Selection of internal reference genes for normalization of reverse transcription quantitative polymerase chain reaction (RT-qPCR) analysis in the rumen epithelium

    USDA-ARS?s Scientific Manuscript database

    The rumen is lined on the luminal side by a stratified squamous epithelium that is responsible for not only absorption, but also transport, extensive short-chain fatty acid (SCFA) metabolism and protection. Butyrate has been demonstrated to initiate the differentiation of the tissue following intro...

  3. Quantitative analyses of cell behaviors underlying notochord formation and extension in mouse embryos.

    PubMed

    Sausedo, R A; Schoenwolf, G C

    1994-05-01

    Formation and extension of the notochord (i.e., notogenesis) is one of the earliest and most obvious events of axis development in vertebrate embryos. In birds and mammals, prospective notochord cells arise from Hensen's node and come to lie beneath the midline of the neural plate. Throughout the period of neurulation, the notochord retains its close spatial relationship with the developing neural tube and undergoes rapid extension in concert with the overlying neuroepithelium. In the present study, we examined notochord development quantitatively in mouse embryos. C57BL/6 mouse embryos were collected at 8, 8.5, 9, 9.5, and 10 days of gestation. They were then embedded in paraffin and sectioned transversely. Serial sections from 21 embryos were stained with Schiff's reagent according to the Feulgen-Rossenbeck procedure and used for quantitative analyses of notochord extension. Quantitative analyses revealed that extension of the notochord involves cell division within the notochord proper and cell rearrangement within the notochordal plate (the immediate precursor of the notochord). In addition, extension of the notochord involves cell accretion, that is, the addition of cells to the notochord's caudal end, a process that involves considerable cell rearrangement at the notochordal plate-node interface. Extension of the mouse notochord occurs similarly to that described previously for birds (Sausedo and Schoenwolf, 1993 Anat. Rec. 237:58-70). That is, in both birds (i.e., quail and chick) and mouse embryos, notochord extension involves cell division, cell rearrangement, and cell accretion. Thus higher vertebrates utilize similar morphogenetic movements to effect notogenesis.

  4. Industrial ecology: Quantitative methods for exploring a lower carbon future

    NASA Astrophysics Data System (ADS)

    Thomas, Valerie M.

    2015-03-01

    Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
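
    The engineering-economics metrics surveyed above, net present value and the levelized cost of energy, reduce to short discounting formulas. A minimal Python sketch with hypothetical plant figures (capital cost, O&M, output, discount rate):

```python
def npv(rate, cash_flows):
    """Net present value of a series of annual cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def lcoe(capital, annual_cost, annual_mwh, rate, years):
    """Levelized cost of energy: discounted lifetime cost divided by
    discounted lifetime energy output ($/MWh)."""
    disc_cost = capital + sum(annual_cost / (1 + rate) ** t
                              for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + rate) ** t
                      for t in range(1, years + 1))
    return disc_cost / disc_energy

# Hypothetical plant: $1.2M capital, $30k/yr O&M, 2,000 MWh/yr,
# 20-year life, 7% discount rate
cost_per_mwh = lcoe(1_200_000, 30_000, 2_000, 0.07, 20)  # ~71.6 $/MWh
```

    The same two functions cover the "engineering economics and cost benefit analysis" toolkit the abstract mentions; lifecycle assessment and optimization require heavier machinery.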

  5. Analysis of arsenical metabolites in biological samples.

    PubMed

    Hernandez-Zavala, Araceli; Drobna, Zuzana; Styblo, Miroslav; Thomas, David J

    2009-11-01

    Quantitation of inorganic arsenic (iAs) and its methylated metabolites in biological samples provides dosimetric information needed to understand dose-response relations. Here, methods are described for separation of inorganic and mono-, di-, and trimethylated arsenicals by thin layer chromatography. This method has been extensively used to track the metabolism of the radionuclide [(73)As] in a variety of in vitro assay systems. In addition, a hydride generation-cryotrapping-gas chromatography-atomic absorption spectrometric method is described for the quantitation of arsenicals in biological samples. This method uses pH-selective hydride generation to differentiate among arsenicals containing trivalent or pentavalent arsenic.

  6. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
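
    The calibration-based quantitation this record describes (standard concentrations against analyte/internal-standard response, with linearity judged by R2 and accuracy by percent error) follows a standard least-squares workflow. A sketch with hypothetical response ratios, not the paper's data:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y = m*x + b, plus the R**2 of the fit."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    m = sxy / sxx
    b = my - m * mx
    ss_res = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return m, b, 1 - ss_res / ss_tot

# Hypothetical standards: analyte/internal-standard area ratio vs. ng/mL
conc  = [300, 600, 1200, 1800, 2500]
ratio = [0.31, 0.59, 1.22, 1.78, 2.52]
m, b, r2 = linear_fit(conc, ratio)
unknown = (1.00 - b) / m   # back-calculate the concentration of an unknown
print(r2 > 0.99, 900 < unknown < 1100)  # True True
```

    An internal standard makes the y-axis a response ratio, which is what lets the method tolerate matrix effects from the diesel background.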

  8. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  9. Quantitative analysis of major dibenzocyclooctane lignans in Schisandrae fructus by online TLC-DART-MS.

    PubMed

    Kim, Hye Jin; Oh, Myung Sook; Hong, Jongki; Jang, Young Pyo

    2011-01-01

    Direct analysis in real time (DART) ion source is a powerful ionising technique for the quick and easy detection of various organic molecules without any sample preparation steps, but the lack of quantitation capacity limits its extensive use in the field of phytochemical analysis. To improvise a new system which utilizes DART-MS as a hyphenated detector for quantitation. A total extract of Schisandra chinensis fruit was analyzed on a TLC plate and three major lignan compounds were quantitated by three different methods (UV densitometry, TLC-DART-MS and HPLC-UV) to compare the efficiency of each method. To introduce the TLC plate into the DART ion source at a constant velocity, a syringe pump was employed. The DART-MS total ion current chromatogram was recorded for the entire TLC plate. The concentration of each lignan compound was calculated from the calibration curve established with the standard compound. Gomisin A, gomisin N and schisandrin were well separated on a silica-coated TLC plate and the specific ion current chromatograms were successfully acquired from the TLC-DART-MS system. The TLC-DART-MS system for the quantitation of natural products showed better linearity and specificity than TLC densitometry, and consumed less time and solvent than the conventional HPLC method. A hyphenated system for the quantitation of phytochemicals from crude herbal drugs was successfully established. This system was shown to have a powerful analytical capacity for the prompt and efficient quantitation of natural products from crude drugs. Copyright © 2010 John Wiley & Sons, Ltd.

  10. Professional Learning Communities: An Analysis of Fifth Grade Special Education Student Achievement and Teacher Longevity in Two Texas School Districts

    ERIC Educational Resources Information Center

    Thacker, Teresa D.

    2013-01-01

    Professional Learning Communities (PLCs) are an emerging form of professional learning used nationwide as a means for educators to focus on job-embedded learning. Extensive qualitative data have been compiled regarding educators' perceptions of PLCs. However, little quantitative research has been conducted regarding the academic achievement…

  11. This View of Science: Stephen Jay Gould as Historian of Science and Scientific Historian, Popular Scientist and Scientific Popularizer.

    ERIC Educational Resources Information Center

    Shermer, Michael B.

    2002-01-01

    Presents the results of an extensive quantitative content analysis of Gould's 22 books, 101 book reviews, 479 scientific papers, and 300 Natural History essays in terms of subject matter and thematic dichotomies. Emphasizes the interaction between the subjects and themata, and how Gould has used the history of science to reinforce his evolutionary…

  12. A Primer on Value-Added Models: Towards a Better Understanding of the Quantitative Analysis of Student Achievement

    ERIC Educational Resources Information Center

    Nakamura, Yugo

    2013-01-01

    Value-added models (VAMs) have received considerable attention as a tool to transform our public education system. However, VAMs are studied by researchers from a broad range of academic disciplines who remain divided over the best methods for analyzing the models, and stakeholders without an extensive statistical background have been excluded…

  13. Tracing the metabolism of HT-2 toxin and T-2 toxin in barley by isotope-assisted untargeted screening and quantitative LC-HRMS analysis

    USDA-ARS?s Scientific Manuscript database

    An extensive study of the metabolism of the type-A trichothecene mycotoxins HT-2 toxin and T-2 toxin in barley using liquid chromatography coupled to high resolution mass spectrometry (LC-HRMS) is reported. A recently developed untargeted approach based on stable isotopic labelling, LC-Orbitrap-MS a...

  14. Space Station Freedom Water Recovery test total organic carbon accountability

    NASA Technical Reports Server (NTRS)

    Davidson, Michael W.; Slivon, Laurence; Sheldon, Linda; Traweek, Mary

    1991-01-01

    Marshall Space Flight Center's (MSFC) Water Recovery Test (WRT) addresses the concept of integrated hygiene and potable reuse water recovery systems baselined for Space Station Freedom (SSF). To assess the adequacy of water recovery system designs and the conformance of reclaimed water quality to established specifications, MSFC has initiated an extensive water characterization program. MSFC's goal is to quantitatively account for a large percentage of organic compounds present in waste and reclaimed hygiene and potable waters from the WRT and in humidity condensate from Spacelab missions. The program is organized into Phases A and B. Phase A's focus is qualitative and semi-quantitative; precise quantitative analyses are not emphasized. Phase B centers on a near-complete quantitative characterization of all water types. Technical approaches, along with Phase A and partial Phase B investigations on the compositional analysis of Total Organic Carbon (TOC) accountability, are presented.

  15. High-throughput real-time quantitative reverse transcription PCR.

    PubMed

    Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F

    2006-02-01

    Extensive detail on the application of real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected ΔCt method, and the comparative cycle time (ΔΔCt) method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
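
    The comparative cycle time calculation named above can be sketched as follows. All Ct values are hypothetical, and the sketch assumes amplification efficiency near 100% (the template doubles each cycle):

```python
# Minimal sketch of the comparative cycle-time (DeltaDeltaCt) method,
# assuming ~100% PCR efficiency (amplicon doubles each cycle).
def fold_change(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    delta_ct_sample = ct_target_sample - ct_ref_sample    # normalize to reference gene
    delta_ct_control = ct_target_control - ct_ref_control
    delta_delta_ct = delta_ct_sample - delta_ct_control   # compare to control condition
    return 2 ** (-delta_delta_ct)

# Example: the target gene amplifies 2 cycles earlier (relative to the
# reference gene) in the treated sample than in the control -> 4-fold up.
print(fold_change(22.0, 18.0, 24.0, 18.0))  # 4.0
```

    The efficiency-corrected variant replaces the base 2 with the measured per-cycle amplification factor for each primer pair.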

  16. Quantitative Analysis of Global Proteome and Lysine Acetylome Reveal the Differential Impacts of VPA and SAHA on HL60 Cells.

    PubMed

    Zhu, Xiaoyu; Liu, Xin; Cheng, Zhongyi; Zhu, Jun; Xu, Lei; Wang, Fengsong; Qi, Wulin; Yan, Jiawei; Liu, Ning; Sun, Zimin; Liu, Huilan; Peng, Xiaojun; Hao, Yingchan; Zheng, Nan; Wu, Quan

    2016-01-29

    Valproic acid (VPA) and suberoylanilide hydroxamic acid (SAHA) are both HDAC inhibitors (HDACi). Previous studies indicated that both inhibitors show therapeutic effects on acute myeloid leukaemia (AML), but the differential impacts of the two HDACi on AML treatment remain elusive. In this study, using a 3-plex SILAC-based quantitative proteomics technique, anti-acetyllysine antibody based affinity enrichment, high-resolution LC-MS/MS and intensive bioinformatic analysis, the quantitative proteome and acetylome in SAHA- and VPA-treated AML HL60 cells were extensively studied. In total, 5,775 proteins and 1,124 lysine acetylation sites were successfully obtained in response to VPA and SAHA treatment. VPA and SAHA treatment induced distinct proteome and acetylome profiles in AML HL60 cells. This study revealed the differential impacts of VPA and SAHA on the proteome/acetylome in AML cells, deepening our understanding of HDAC inhibitor mediated AML therapeutics.

  17. Developing a database for pedestrians' earthquake emergency evacuation in indoor scenarios.

    PubMed

    Zhou, Junxue; Li, Sha; Nie, Gaozhong; Fan, Xiwei; Tan, Jinxian; Li, Huayue; Pang, Xiaoke

    2018-01-01

    With the booming development of evacuation simulation software, developing an extensive database in indoor scenarios for evacuation models is imperative. In this paper, we conduct a qualitative and quantitative analysis of the collected videotapes and aim to provide a complete and unitary database of pedestrians' earthquake emergency response behaviors in indoor scenarios, including human-environment interactions. Using the qualitative analysis method, we extract keyword groups and keywords that code the response modes of pedestrians and construct a general decision flowchart using chronological organization. Using the quantitative analysis method, we analyze data on the delay time, evacuation speed, evacuation route and emergency exit choices. Furthermore, we study the effect of classroom layout on emergency evacuation. The database for indoor scenarios provides reliable input parameters and allows the construction of real and effective constraints for use in software and mathematical models. The database can also be used to validate the accuracy of evacuation models.

  18. Private Agricultural Extension System in Kenya: Practice and Policy Lessons

    ERIC Educational Resources Information Center

    Muyanga, Milu; Jayne, T. S.

    2008-01-01

    The private extension system has been at the centre of a debate triggered by inefficient public agricultural extension. The debate is anchored on the premise that the private sector is more efficient in extension service delivery. This study evaluates the private extension system in Kenya. It employs qualitative and quantitative methods. The results…

  19. Analysis of lard in meatball broth using Fourier transform infrared spectroscopy and chemometrics.

    PubMed

    Kurniawati, Endah; Rohman, Abdul; Triyana, Kuwat

    2014-01-01

    Meatball is one of the favorite foods in Indonesia. For economic reasons (due to the price difference), the substitution of beef with pork can occur. In this study, FTIR spectroscopy in combination with the chemometric methods of partial least squares (PLS) regression and principal component analysis (PCA) was used for analysis of pork fat (lard) in meatball broth. Lard in meatball broth was quantitatively determined in the wavenumber region of 1018-1284 cm⁻¹. The coefficient of determination (R²) and root mean square error of calibration (RMSEC) values obtained were 0.9975 and 1.34% (v/v), respectively. Furthermore, the classification of lard and beef fat in meatball broth, as well as in commercial samples, was performed in the wavenumber region of 1200-1000 cm⁻¹. The results showed that FTIR spectroscopy coupled with chemometrics can be used for quantitative analysis and classification of lard in meatball broth for Halal verification studies. The developed method is simple in operation, rapid, and does not involve extensive sample preparation. © 2013.
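
    The figures of merit reported (R² and RMSEC) can be computed from the actual versus PLS-predicted values of the calibration set. The values below are hypothetical lard contents, and this sketch uses a simple n denominator for RMSEC:

```python
import math

# Hypothetical calibration set: actual vs PLS-predicted lard content (% v/v).
actual    = [0.0, 10.0, 25.0, 50.0, 75.0, 100.0]
predicted = [1.0,  9.0, 26.5, 48.0, 76.0,  99.5]

n = len(actual)
mean_actual = sum(actual) / n
ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
ss_tot = sum((a - mean_actual) ** 2 for a in actual)

r2 = 1 - ss_res / ss_tot        # coefficient of determination
rmsec = math.sqrt(ss_res / n)   # root mean square error of calibration
print(round(r2, 4), round(rmsec, 2))
```

    A high R² with a low RMSEC, as reported in the abstract (0.9975 and 1.34% v/v), indicates that the calibration both tracks the trend and predicts individual samples closely.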

  20. The craniocaudal extension of posterolateral approaches and their combination: a quantitative anatomic and clinical analysis.

    PubMed

    Safavi-Abbasi, Sam; de Oliveira, Jean G; Deshmukh, Pushpa; Reis, Cassius V; Brasiliense, Leonardo B C; Crawford, Neil R; Feiz-Erfan, Iman; Spetzler, Robert F; Preul, Mark C

    2010-03-01

    The aim of this study was to describe quantitatively the properties of the posterolateral approaches and their combination. Six silicone-injected cadaveric heads were dissected bilaterally. Quantitative data were generated with the Optotrak 3020 system (Northern Digital, Waterloo, Canada) and Surgiscope (Elekta Instruments, Inc., Atlanta, GA), including key anatomic points on the skull base and brainstem. All parameters were measured after the basic retrosigmoid craniectomy and then after combination with a basic far-lateral extension. The clinical results of 20 patients who underwent a combined retrosigmoid and far-lateral approach were reviewed. The change in accessibility to the lower clivus was greatest after the far-lateral extension (mean change, 43.62 ± 10.98 mm²; P = .001). Accessibility to the constant landmarks, Meckel's cave, internal auditory meatus, and jugular foramen did not change significantly between the 2 approaches (P > .05). The greatest change in accessibility to soft tissue between the 2 approaches was to the lower brainstem (mean change, 33.88 ± 5.25 mm²; P = .0001). Total removal was achieved in 75% of the cases. The average postoperative Glasgow Outcome Scale score of patients who underwent the combined retrosigmoid and far-lateral approach improved significantly, compared with the preoperative scores. The combination of the far-lateral and simple retrosigmoid approaches significantly increases the petroclival working area and access to the cranial nerves. However, risk of injury to neurovascular structures and time needed to extend the craniotomy must be weighed against the increased working area and angles of attack.

  1. Purdue Extension: Employee Engagement and Leadership Style

    ERIC Educational Resources Information Center

    Abbott, Angela R.

    2017-01-01

    The purpose of this quantitative study was to assess the Purdue Extension county directors' level of engagement and leadership style and to examine the relationship between these two variables. The study aimed to inform a professional development training program for all Purdue Extension county extension directors. Survey data were collected from…

  2. Quantitative analysis and comparative study of four cities green pattern in API system on the background of big data

    NASA Astrophysics Data System (ADS)

    Xin, YANG; Si-qi, WU; Qi, ZHANG

    2018-05-01

    Beijing, London, Paris, and New York are typical world cities, so a comparative study of the four cities' green patterns is important for identifying gaps and advantages and for learning from one another; the paper thus provides a basis and new ideas for the development of metropolises in China. Against the background of big data, API (Application Programming Interface) systems can provide extensive and accurate basic data for studying urban green patterns in different geographical environments at home and abroad. On this basis, the Average Nearest Neighbor, Kernel Density and Standard Deviational Ellipse tools in the ArcGIS platform can process and summarize the data, enabling quantitative analysis of green patterns. The paper summarizes the uniqueness of each city's green pattern and the reasons for its formation on the basis of numerical comparison.
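
    The Average Nearest Neighbor statistic used in ArcGIS compares the observed mean nearest-neighbor distance with the value expected for a random pattern of the same density. A minimal sketch with hypothetical green-space centroids (the point coordinates and 100 km² study area are assumptions, not data from the paper):

```python
import math

def average_nearest_neighbor(points, area):
    """Average Nearest Neighbor ratio: <1 clustered, ~1 random, >1 dispersed."""
    n = len(points)
    # Observed mean distance from each point to its nearest neighbor.
    d_obs = sum(
        min(math.dist(p, q) for q in points if q is not p)
        for p in points
    ) / n
    # Expected mean nearest-neighbor distance for a random pattern
    # of n points over the given area.
    d_exp = 0.5 / math.sqrt(n / area)
    return d_obs / d_exp

# Hypothetical green-space centroids (km) inside a 100 km^2 study area.
pts = [(1.0, 1.0), (1.2, 1.1), (5.0, 5.0), (5.1, 4.8), (9.0, 2.0)]
print(round(average_nearest_neighbor(pts, area=100.0), 2))
```

    Here the ratio comes out well below 1, i.e. the green spaces in this toy example are clustered rather than evenly distributed.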

  3. Dynamics of uniaxially oriented elastomers using dielectric spectroscopy

    NASA Astrophysics Data System (ADS)

    Lee, Hyungki; Fragiadakis, Daniel; Martin, Darren; Runt, James

    2009-03-01

    We summarize our initial dielectric spectroscopy investigation of the dynamics of oriented segmented polyurethanes and crosslinked polyisoprene elastomers. A specially designed uniaxial stretching rig is used to control the draw ratio, and the electric field is applied normal to the draw direction. For the segmented PUs, we observe a dramatic reduction in relaxation strength of the soft phase segmental process with increasing extension ratio, accompanied by a modest decrease in relaxation frequency. Crosslinking of the polyisoprene was accomplished with dicumyl peroxide, and the dynamics of uncrosslinked and crosslinked versions are investigated in the undrawn state and at different extension ratios. Complementary analysis of the crosslinked PI is conducted with wide-angle X-ray diffraction (to examine possible strain-induced crystallization), DSC, and swelling experiments. Quantitative analysis of relaxation strengths and shapes as a function of draw ratio will be discussed.

  4. Validated ¹H and ¹³C Nuclear Magnetic Resonance Methods for the Quantitative Determination of Glycerol in Drug Injections.

    PubMed

    Lu, Jiaxi; Wang, Pengli; Wang, Qiuying; Wang, Yanan; Jiang, Miaomiao

    2018-05-15

    In the current study, we employed high-resolution proton and carbon nuclear magnetic resonance spectroscopy (¹H and ¹³C NMR) for quantitative analysis of glycerol in drug injections without any complex pre-treatment or derivatization of samples. The established methods were validated with good specificity, linearity, accuracy, precision, stability, and repeatability. Our results revealed that the glycerol content could be calculated directly from the ratios of peak areas to an internal standard in ¹H NMR spectra, while integration of peak heights was more appropriate for ¹³C NMR in combination with an external calibration of glycerol. The developed methods were both successfully applied to drug injections. Quantitative NMR methods show broad promise for glycerol determination in various liquid samples.
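
    The ¹H NMR quantitation described above reduces to a ratio of per-proton peak areas against an internal standard of known concentration. A minimal sketch; the 9-proton standard and all numeric values are hypothetical:

```python
# Internal-standard qNMR: the molar concentration of the analyte follows
# from the ratio of per-proton integrals against a standard of known
# concentration (all values below are assumed, for illustration only).
def qnmr_concentration(area_analyte, n_h_analyte,
                       area_standard, n_h_standard,
                       conc_standard_mM):
    per_proton_analyte = area_analyte / n_h_analyte
    per_proton_standard = area_standard / n_h_standard
    return conc_standard_mM * per_proton_analyte / per_proton_standard

# Glycerol has 5 non-exchangeable (CH) protons; suppose the internal
# standard contributes 9 equivalent protons at a known 10 mM concentration.
print(qnmr_concentration(2.5, 5, 1.8, 9, 10.0))  # mM
```

    Dividing each integral by the number of contributing protons is what makes signals from different molecules directly comparable.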

  5. Extensor indicis proprius tendon transfer using shear wave elastography.

    PubMed

    Lamouille, J; Müller, C; Aubry, S; Bensamoun, S; Raffoul, W; Durand, S

    2017-06-01

    The means for judging optimal tension during tendon transfers are approximate and not very quantifiable. The purpose of this study was to demonstrate the feasibility of quantitatively assessing muscular mechanical properties intraoperatively using ultrasound elastography (shear wave elastography [SWE]) during extensor indicis proprius (EIP) transfer. We report two cases of EIP transfer for post-traumatic rupture of the extensor pollicis longus muscle. Ultrasound acquisitions measured the elasticity modulus of the EIP muscle at different stages: rest, active extension, active extension against resistance, EIP section, distal passive traction of the tendon, after tendon transfer at rest and then during active extension. A preliminary analysis was conducted of the distribution of values for this modulus at the various transfer steps. Different shear wave velocity and elasticity modulus values were observed at the various transfer steps. The tension applied during the transfer seemed close to the resting tension if a traditional protocol were followed. The elasticity modulus varied by a factor of 37 between the active extension against resistance step (565.1 kPa) and after the tendon section (15.3 kPa). The elasticity modulus values were distributed in the same way for each patient. The therapeutic benefit of SWE elastography was studied for the first time in tendon transfers. Quantitative data on the elasticity modulus during this test may make it an effective means of improving intraoperative adjustments. Copyright © 2017 SFCM. Published by Elsevier Masson SAS. All rights reserved.
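
    The elasticity moduli reported by SWE systems follow from the measured shear wave speed under the standard assumptions of incompressible, isotropic, locally homogeneous tissue, via E = 3ρc². A minimal sketch; the 13.7 m/s input is an assumed value chosen only to land near the 565 kPa scale reported under resisted extension:

```python
# Standard shear-wave elastography relation, assuming incompressible,
# isotropic tissue: shear modulus mu = rho * c^2, Young's modulus E = 3 * mu.
def youngs_modulus_kpa(shear_wave_speed_m_s, density_kg_m3=1000.0):
    mu = density_kg_m3 * shear_wave_speed_m_s ** 2  # shear modulus (Pa)
    return 3.0 * mu / 1000.0                        # Young's modulus (kPa)

# ~13.7 m/s corresponds to a stiff, actively loaded muscle;
# ~2.3 m/s to the slack muscle after tendon section (~16 kPa).
print(round(youngs_modulus_kpa(13.7), 1))
print(round(youngs_modulus_kpa(2.3), 1))
```

    Because E scales with the square of the wave speed, the factor-of-37 modulus range in the abstract corresponds to only about a six-fold range in measured wave speed.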

  6. A new method for qualitative simulation of water resources systems: 1. Theory

    NASA Astrophysics Data System (ADS)

    Camara, A. S.; Pinheiro, M.; Antunes, M. P.; Seixas, M. J.

    1987-11-01

    A new dynamic modeling methodology, SLIN (Simulação Linguistica), allowing for the analysis of systems defined by linguistic variables, is presented. SLIN applies a set of logical rules avoiding fuzzy theoretic concepts. To make the transition from qualitative to quantitative modes, logical rules are used as well. Extensions of the methodology to simulation-optimization applications and multiexpert system modeling are also discussed.

  7. Practical applications of the bioinformatics toolbox for narrowing quantitative trait loci.

    PubMed

    Burgess-Herbert, Sarah L; Cox, Allison; Tsaih, Shirng-Wern; Paigen, Beverly

    2008-12-01

    Dissecting the genes involved in complex traits can be confounded by multiple factors, including extensive epistatic interactions among genes, the involvement of epigenetic regulators, and the variable expressivity of traits. Although quantitative trait locus (QTL) analysis has been a powerful tool for localizing the chromosomal regions underlying complex traits, systematically identifying the causal genes remains challenging. Here, through its application to plasma levels of high-density lipoprotein cholesterol (HDL) in mice, we demonstrate a strategy for narrowing QTL that utilizes comparative genomics and bioinformatics techniques. We show how QTL detected in multiple crosses are subjected to both combined cross analysis and haplotype block analysis; how QTL from one species are mapped to the concordant regions in another species; and how genomewide scans associating haplotype groups with their phenotypes can be used to prioritize the narrowed regions. Then we illustrate how these individual methods for narrowing QTL can be systematically integrated for mouse chromosomes 12 and 15, resulting in a significantly reduced number of candidate genes, often from hundreds to <10. Finally, we give an example of how additional bioinformatics resources can be combined with experiments to determine the most likely quantitative trait genes.

  8. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. 
To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  9. Changes in monosaccharides, organic acids and amino acids during Cabernet Sauvignon wine ageing based on a simultaneous analysis using gas chromatography-mass spectrometry.

    PubMed

    Zhang, Xin-Ke; Lan, Yi-Bin; Zhu, Bao-Qing; Xiang, Xiao-Feng; Duan, Chang-Qing; Shi, Ying

    2018-01-01

    Monosaccharides, organic acids and amino acids are the important flavour-related components in wines. The aim of this article is to develop and validate a method that could simultaneously analyse these compounds in wine based on silylation derivatisation and gas chromatography-mass spectrometry (GC-MS), and apply this method to the investigation of the changes of these compounds and speculate upon their related influences on Cabernet Sauvignon wine flavour during wine ageing. This work presented a new approach for wine analysis and provided more information concerning red wine ageing. This method could simultaneously quantitatively analyse 2 monosaccharides, 8 organic acids and 13 amino acids in wine. A validation experiment showed good linearity, sensitivity, reproducibility and recovery. Multiple derivatives of five amino acids have been found but their effects on quantitative analysis were negligible, except for methionine. The evolution pattern of each category was different, and we speculated that the corresponding mechanisms involving microorganism activities, physical interactions and chemical reactions had a great correlation with red wine flavours during ageing. Simultaneously quantitative analysis of monosaccharides, organic acids and amino acids in wine was feasible and reliable and this method has extensive application prospects. © 2017 Society of Chemical Industry.

  10. NecroQuant: quantitative assessment of radiological necrosis

    NASA Astrophysics Data System (ADS)

    Hwang, Darryl H.; Mohamed, Passant; Varghese, Bino A.; Cen, Steven Y.; Duddalwar, Vinay

    2017-11-01

    Clinicians can now objectively quantify tumor necrosis by Hounsfield units and enhancement characteristics from multiphase contrast-enhanced CT imaging. NecroQuant has been designed to work as part of a radiomics pipeline. The software is a departure from the conventional qualitative assessment of tumor necrosis, as it provides the user (radiologists and researchers) a simple interface to precisely and interactively define and measure necrosis in contrast-enhanced CT images. Although the software is tested here on renal masses, it can be re-configured to assess tumor necrosis across a variety of tumors from different body sites, providing a generalized, open, portable, and extensible quantitative analysis platform that is widely applicable across cancer types to quantify tumor necrosis.
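
    One simple way to quantify necrosis from Hounsfield units, in the spirit of the abstract, is to threshold the voxels of a segmented tumor. This sketch is illustrative only and is not NecroQuant's actual implementation; the threshold and ROI values are assumptions:

```python
import numpy as np

# Estimate the necrotic fraction of a segmented tumor as the share of
# voxels whose contrast-enhanced Hounsfield units fall below a threshold
# (non-enhancing, low-attenuation tissue).
def necrotic_fraction(hu_values, threshold_hu=30.0):
    hu = np.asarray(hu_values, dtype=float)
    return float(np.mean(hu < threshold_hu))

# Hypothetical tumor ROI: mostly enhancing tissue with a low-density core.
roi = np.array([80, 95, 110, 20, 15, 25, 70, 105])
print(necrotic_fraction(roi))  # fraction of voxels classified as necrotic
```

    In a multiphase protocol the same idea extends to enhancement: voxels whose HU change little between pre- and post-contrast phases can likewise be flagged as non-viable.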

  11. Role of citron kinase in dendritic morphogenesis of cortical neurons.

    PubMed

    Di Cunto, Ferdinando; Ferrara, Luciana; Curtetti, Roberta; Imarisio, Sara; Guazzone, Simona; Broccoli, Vania; Bulfone, Alessandro; Altruda, Fiorella; Vercelli, Alessandro; Silengo, Lorenzo

    2003-05-30

    Small GTPases of the rho family regulate the extensive rearrangements of the cytoskeleton that characterize neuronal differentiation. Citron kinase is a target molecule for activated rhoA, previously implicated in the control of cytokinesis. We have found that, in addition, it may play an important role in modulating the extension of neuronal processes. Using constitutively active and dominant negative mutants, we showed that citron kinase is involved in the morphologic differentiation of N1E-115 neuroblastoma cells induced by serum starvation. More importantly, quantitative analysis of the citron kinase knockout cerebral cortex showed that this molecule may differentially regulate the morphology of the dendritic compartment in corticocollicular versus callosally-projecting pyramidal neurons.

  12. Remote sensing image denoising application by generalized morphological component analysis

    NASA Astrophysics Data System (ADS)

    Yu, Chong; Chen, Xiong

    2014-12-01

    In this paper, we introduce a remote sensing image denoising method based on generalized morphological component analysis (GMCA). This algorithm is a further extension of the morphological component analysis (MCA) algorithm to the blind source separation framework. The iterative thresholding strategy adopted by the GMCA algorithm first works on the most significant features in the image, and then progressively incorporates smaller features to finely tune the parameters of the whole model. A mathematical analysis of the computational complexity of the GMCA algorithm is provided. Several comparison experiments with state-of-the-art denoising algorithms are reported. To make a quantitative assessment of the algorithms, the Peak Signal to Noise Ratio (PSNR) and Structural Similarity (SSIM) indices are calculated, assessing the denoising effect from the gray-level fidelity and structure-level fidelity aspects, respectively. Quantitative analysis of the experimental results, which is consistent with the visual effect of the denoised images, shows that the GMCA algorithm is highly effective for remote sensing image denoising; it is even hard to distinguish the original noiseless image from the image recovered by the GMCA algorithm.
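
    The PSNR index mentioned above can be computed directly from the mean squared error between the reference and restored images. A minimal sketch, not the paper's implementation; the pixel values are hypothetical and assumed to lie in 0..255:

```python
import math

# PSNR scores gray-level fidelity of a restored image against a reference:
# higher is better, infinite for identical images.
def psnr(reference, restored, max_val=255.0):
    n = len(reference)
    mse = sum((r - d) ** 2 for r, d in zip(reference, restored)) / n
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

ref = [52, 60, 75, 200, 180, 33]
den = [50, 62, 74, 198, 181, 35]
print(round(psnr(ref, den), 2))  # dB; higher means closer to the reference
```

    SSIM complements PSNR by comparing local means, variances and covariances over sliding windows, which is why the paper reports both: PSNR for gray-level fidelity and SSIM for structure-level fidelity.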

  13. Predictive value of magnetic resonance imaging determined tumor contact length for extracapsular extension of prostate cancer.

    PubMed

    Baco, Eduard; Rud, Erik; Vlatkovic, Ljiljana; Svindland, Aud; Eggesbø, Heidi B; Hung, Andrew J; Matsugasumi, Toru; Bernhard, Jean-Christophe; Gill, Inderbir S; Ukimura, Osamu

    2015-02-01

    Tumor contact length is defined as the amount of prostate cancer in contact with the prostatic capsule. We evaluated the ability of magnetic resonance imaging determined tumor contact length to predict microscopic extracapsular extension compared to existing predictors of extracapsular extension. We retrospectively analyzed the records of 111 consecutive patients with magnetic resonance imaging/ultrasound fusion targeted, biopsy proven prostate cancer who underwent radical prostatectomy from January 2010 to July 2013. Median patient age was 64 years and median prostate specific antigen was 8.9 ng/ml. Clinical stage was cT1 in 93 cases (84%) and cT2 in 18 (16%). Postoperative pathological analysis confirmed pT2 in 71 patients (64%) and pT3 in 40 (36%). We evaluated 1) in the radical prostatectomy specimen the correlation of microscopic extracapsular extension with pathological cancer volume, pathological tumor contact length and Gleason score, 2) the correlation between microscopic extracapsular extension and magnetic resonance imaging tumor contact length, and 3) the ability of preoperative variables to predict microscopic extracapsular extension. Logistic regression analysis revealed that pathological tumor contact length correlated better with microscopic extracapsular extension than the predictive power of pathological cancer volume (0.821 vs 0.685). The Spearman correlation between pathological and magnetic resonance imaging tumor contact length was r = 0.839 (p <0.0001). ROC AUC analysis revealed that magnetic resonance imaging tumor contact length outperformed cancer core involvement on targeted biopsy and the Partin tables to predict microscopic extracapsular extension (0.88 vs 0.70 and 0.63, respectively). At a magnetic resonance imaging tumor contact length threshold of 20 mm the accuracy for diagnosing microscopic extracapsular extension was superior to that of conventional magnetic resonance imaging criteria (82% vs 67%, p = 0.015). 
We developed a predicted probability plot curve of extracapsular extension according to magnetic resonance imaging tumor contact length. Magnetic resonance imaging determined tumor contact length could be a promising quantitative predictor of microscopic extracapsular extension. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
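
    The ROC AUC comparison above can be illustrated with a minimal sketch: the AUC equals the probability that a randomly chosen positive case (here, a patient with microscopic extracapsular extension) scores higher than a randomly chosen negative case. All tumor contact length values below are hypothetical, not the study's data:

```python
# Rank-based ROC AUC: the fraction of positive/negative pairs in which
# the positive case has the higher score (ties count half).
def roc_auc(pos_scores, neg_scores):
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores
        for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

tcl_ece    = [22, 31, 18, 27, 40]  # MRI tumor contact length (mm), with ECE
tcl_no_ece = [5, 12, 19, 8, 15]    # without ECE
print(roc_auc(tcl_ece, tcl_no_ece))
```

    An AUC near 0.88, as reported for magnetic resonance imaging tumor contact length, means the measure ranks ECE cases above non-ECE cases for the large majority of such pairs.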

  14. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    PubMed

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

    Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves, such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e., >10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
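
    The convergent-validity metrics reported (Pearson correlation, mean absolute difference, mean relative difference between methods) can be sketched on paired measurements. All excursion values below are hypothetical, not the study's data:

```python
import math

# Hypothetical paired excursion measurements (mm) from the two methods.
ultrasound = [4.1, 6.0, 8.2, 10.5, 12.1]
motion_cap = [4.4, 6.3, 8.5, 11.2, 13.0]

n = len(ultrasound)
mx = sum(ultrasound) / n
my = sum(motion_cap) / n

# Pearson correlation: association between the two methods.
cov = sum((x - mx) * (y - my) for x, y in zip(ultrasound, motion_cap))
sx = math.sqrt(sum((x - mx) ** 2 for x in ultrasound))
sy = math.sqrt(sum((y - my) ** 2 for y in motion_cap))
pearson_r = cov / (sx * sy)

# Agreement: mean absolute and mean relative difference between methods.
mean_abs_diff = sum(abs(x - y) for x, y in zip(ultrasound, motion_cap)) / n
mean_rel_diff = sum(abs(x - y) / y for x, y in zip(ultrasound, motion_cap)) / n

print(round(pearson_r, 3), round(mean_abs_diff, 2), round(mean_rel_diff * 100, 1))
```

    High correlation alone does not guarantee agreement (a systematic bias leaves r untouched), which is why the study reports both kinds of statistics.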

  15. Cortical hypometabolism and hypoperfusion in Parkinson's disease is extensive: probably even at early disease stages.

    PubMed

    Borghammer, Per; Chakravarty, Mallar; Jonsdottir, Kristjana Yr; Sato, Noriko; Matsuda, Hiroshi; Ito, Kengo; Arahata, Yutaka; Kato, Takashi; Gjedde, Albert

    2010-05-01

Recent cerebral blood flow (CBF) and glucose consumption (CMRglc) studies of Parkinson's disease (PD) revealed conflicting results. Using simulated data, we previously demonstrated that the often-reported subcortical hypermetabolism in PD could be explained as an artifact of biased global mean (GM) normalization, and that low-magnitude, extensive cortical hypometabolism is best detected by alternative data-driven normalization methods. Thus, we hypothesized that PD is characterized by extensive cortical hypometabolism but no concurrent widespread subcortical hypermetabolism and tested it on three independent samples of PD patients. We compared SPECT CBF images of 32 early-stage and 33 late-stage PD patients with those of 60 matched controls. We also compared PET FDG images from 23 late-stage PD patients with those of 13 controls. Three different normalization methods were compared: (1) GM normalization, (2) cerebellum normalization, (3) reference cluster normalization (Yakushev et al.). We employed standard voxel-based statistics (fMRIstat) and principal component analysis (SSM). Additionally, we performed a meta-analysis of all quantitative CBF and CMRglc studies in the literature to investigate whether the global mean (GM) values in PD are decreased. Voxel-based analysis with GM normalization and the SSM method performed similarly, i.e., both detected decreases in small cortical clusters and concomitant increases in extensive subcortical regions. Cerebellum normalization revealed more widespread cortical decreases but no subcortical increase. In all comparisons, the Yakushev method detected nearly identical patterns of very extensive cortical hypometabolism. Lastly, the meta-analyses demonstrated that global CBF and CMRglc values are decreased in PD. Based on the results, we conclude that PD most likely has widespread cortical hypometabolism, even at early disease stages. In contrast, extensive subcortical hypermetabolism is probably not a feature of PD.
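The global-mean normalization bias the authors describe is easy to demonstrate numerically. In this toy sketch (illustrative values, not patient data), cortex is genuinely hypometabolic and subcortex preserved; dividing by the global mean pulls the preserved subcortex above 1.0, mimicking hypermetabolism, while a preserved-region reference does not:

```python
def normalize(values, ref_indices):
    """Scale voxel values so the mean over a reference voxel set equals 1.

    ref_indices over all voxels  -> global mean (GM) normalization
    ref_indices over a preserved region -> reference-region normalization
    """
    ref_mean = sum(values[i] for i in ref_indices) / len(ref_indices)
    return [v / ref_mean for v in values]

# toy 6-voxel "image": cortex (indices 0-3) hypometabolic, subcortex (4-5) preserved
patient = [0.8, 0.8, 0.8, 0.8, 1.0, 1.0]

gm_pat = normalize(patient, range(6))   # GM: subcortex artifactually > 1.0
ref_pat = normalize(patient, [4, 5])    # preserved-region reference: cortex at 0.8
```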

  16. Extension of the Haseman-Elston regression model to longitudinal data.

    PubMed

    Won, Sungho; Elston, Robert C; Park, Taesung

    2006-01-01

We propose an extension to longitudinal data of the Haseman and Elston regression method for linkage analysis. The proposed model is a mixed model having several random effects. As response variable, we investigate the sibship sample mean corrected cross-product (smHE) and the BLUP-mean corrected cross-product (pmHE), comparing them with the original squared difference (oHE), the overall mean corrected cross-product (rHE), and the weighted average of the squared difference and the squared mean-corrected sum (wHE). The proposed model allows for the correlation structure of longitudinal data. Also, the model can test for gene × time interaction to discover genetic variation over time. The model was applied in an analysis of the Genetic Analysis Workshop 13 (GAW13) simulated dataset for a quantitative trait simulating systolic blood pressure. Independence models did not preserve the test sizes, while the mixed models with both family and sibpair random effects tended to preserve size well. Copyright 2006 S. Karger AG, Basel.
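For context, the original Haseman-Elston method that this record extends regresses the squared trait difference of each sib pair on the pair's estimated identity-by-descent (IBD) allele sharing at a marker; a significantly negative slope indicates linkage. A minimal single-marker sketch with invented sib-pair data:

```python
def haseman_elston(ibd, sqdiff):
    """Classic HE regression: squared sib-pair trait difference on IBD sharing.

    Returns (slope, intercept); a negative slope suggests linkage.
    """
    n = len(ibd)
    mx, my = sum(ibd) / n, sum(sqdiff) / n
    sxx = sum((x - mx) ** 2 for x in ibd)
    sxy = sum((x - mx) * (y - my) for x, y in zip(ibd, sqdiff))
    slope = sxy / sxx
    return slope, my - slope * mx

# hypothetical sib pairs: IBD proportion at a marker, squared trait difference
ibd = [0.0, 0.0, 0.5, 0.5, 1.0, 1.0]
sqdiff = [4.1, 3.8, 2.2, 2.5, 0.9, 1.1]
slope, intercept = haseman_elston(ibd, sqdiff)  # negative slope expected here
```

The longitudinal extension replaces this simple regression with a mixed model (family and sib-pair random effects) over repeated measurements, but the linkage signal enters in the same way.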

  17. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    PubMed Central

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-01-01

Local surface charge density of lipid membranes influences membrane–protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values. PMID:27561322

  18. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy

    NASA Astrophysics Data System (ADS)

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-08-01

Local surface charge density of lipid membranes influences membrane-protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values.

  19. Quantitative analysis of modified proteins and their positional isomers by tandem mass spectrometry: human histone H4.

    PubMed

    Pesavento, James J; Mizzen, Craig A; Kelleher, Neil L

    2006-07-01

    Here we show that fragment ion abundances from dissociation of ions created from mixtures of multiply modified histone H4 (11 kDa) or of N-terminal synthetic peptides (2 kDa) correspond to their respective intact ion abundances measured by Fourier transform mass spectrometry. Isomeric mixtures of modified forms of the same protein are resolved and quantitated with a precision of

  20. Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy.

    PubMed

    Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong

    2016-08-26

Local surface charge density of lipid membranes influences membrane-protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values.

  1. Predictors of occupational burnout among nurses: a dominance analysis of job stressors.

    PubMed

    Sun, Ji-Wei; Bai, Hua-Yu; Li, Jia-Huan; Lin, Ping-Zhen; Zhang, Hui-Hui; Cao, Feng-Lin

    2017-12-01

To quantitatively compare dimensions of job stressors' effects on nurses' burnout. Nurses, a key group of health service providers, often experience stressors at work. Extensive research has examined the relationship between job stressors and burnout; however, less has specifically compared the effects of job stressor domains on nurses' burnout. A quantitative cross-sectional survey examined three general hospitals in Jinan, China. Participants were 602 nurses. We compared five potential stressors' ability to predict nurses' burnout using dominance analysis and assuming that each stressor was intercorrelated. Strong positive correlations were found between all five job stressors and burnout. Interpersonal relationships and management issues most strongly predicted participants' burnout (11.3% of average variance). Job stressors, and particularly interpersonal relationships and management issues, significantly predict nurses' job burnout. Understanding the relative effect of job stressors may help identify fruitful areas for intervention and improve nurse recruitment and retention. © 2017 John Wiley & Sons Ltd.
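Dominance analysis, as used in this study, averages each predictor's incremental R² over all submodels that exclude it. For two standardized predictors the required R² values follow directly from the correlations, so the general-dominance weights can be sketched in closed form. The correlation values below are invented for illustration:

```python
def r2_two(r_y1, r_y2, r_12):
    """R^2 of y regressed on two standardized predictors, from correlations."""
    return (r_y1 ** 2 + r_y2 ** 2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12 ** 2)

def general_dominance(r_y1, r_y2, r_12):
    """Average incremental R^2 of each predictor over all submodels.

    With two predictors, each weight averages the predictor's solo R^2 and
    its increment when added to the other predictor. Weights sum to full R^2.
    """
    full = r2_two(r_y1, r_y2, r_12)
    d1 = 0.5 * (r_y1 ** 2 + (full - r_y2 ** 2))
    d2 = 0.5 * (r_y2 ** 2 + (full - r_y1 ** 2))
    return d1, d2, full

# invented correlations: burnout vs two stressor domains, stressors intercorrelated
d_interp, d_workload, full_r2 = general_dominance(0.45, 0.35, 0.50)
```

The additive decomposition (weights summing to the full-model R²) is what lets the study express each stressor's contribution as a share of explained variance.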

  2. Analysis of the optimal laminated target made up of discrete set of materials

    NASA Technical Reports Server (NTRS)

    Aptukov, Valery N.; Belousov, Valentin L.

    1991-01-01

    A new class of problems was analyzed to estimate an optimal structure of laminated targets fabricated from the specified set of homogeneous materials. An approximate description of the perforation process is based on the model of radial hole extension. The problem is solved by using the needle-type variation technique. The desired optimization conditions and quantitative/qualitative estimations of optimal targets were obtained and are discussed using specific examples.

  3. Defining the consequences of genetic variation on a proteome–wide scale

    PubMed Central

    Chick, Joel M.; Munger, Steven C.; Simecek, Petr; Huttlin, Edward L.; Choi, Kwangbom; Gatti, Daniel M.; Raghupathy, Narayanan; Svenson, Karen L.; Churchill, Gary A.; Gygi, Steven P.

    2016-01-01

Genetic variation modulates protein expression through both transcriptional and post-transcriptional mechanisms. To characterize the consequences of natural genetic diversity on the proteome, here we combine a multiplexed, mass spectrometry-based method for protein quantification with an emerging outbred mouse model containing extensive genetic variation from eight inbred founder strains. By measuring genome-wide transcript and protein expression in livers from 192 Diversity Outbred mice, we identify 2,866 protein quantitative trait loci (pQTL) with twice as many local as distant genetic variants. These data support distinct transcriptional and post-transcriptional models underlying the observed pQTL effects. Using a sensitive approach to mediation analysis, we often identified a second protein or transcript as the causal mediator of distant pQTL. Our analysis reveals an extensive network of direct protein–protein interactions. Finally, we show that local genotype can provide accurate predictions of protein abundance in an independent cohort of Collaborative Cross mice. PMID:27309819

  4. Barcode extension for analysis and reconstruction of structures

    NASA Astrophysics Data System (ADS)

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L.; Gootenberg, Jonathan S.; Yin, Peng

    2017-03-01

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.
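The quantification step described here, comparing barcode read counts between the product and monomer gel bands, reduces to a simple per-strand fraction. A sketch with hypothetical read counts (strand names and numbers invented):

```python
def incorporation(product_counts, monomer_counts):
    """Per-strand incorporation: fraction of a strand's reads found in the
    assembled-product band rather than the unincorporated-monomer band."""
    frac = {}
    for strand, p in product_counts.items():
        m = monomer_counts.get(strand, 0)
        frac[strand] = p / (p + m)
    return frac

# hypothetical next-generation sequencing read counts per barcoded strand
product = {"s01": 900, "s02": 850, "s03": 120}
monomer = {"s01": 100, "s02": 150, "s03": 880}
frac = incorporation(product, monomer)  # s03 incorporates poorly in this example
```

Comparing these fractions across strands is what lets the method relate assembly efficiency to strand geometry and sequence.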

  5. Barcode extension for analysis and reconstruction of structures.

    PubMed

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L; Gootenberg, Jonathan S; Yin, Peng

    2017-03-13

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures.

  6. Barcode extension for analysis and reconstruction of structures

    PubMed Central

    Myhrvold, Cameron; Baym, Michael; Hanikel, Nikita; Ong, Luvena L; Gootenberg, Jonathan S; Yin, Peng

    2017-01-01

    Collections of DNA sequences can be rationally designed to self-assemble into predictable three-dimensional structures. The geometric and functional diversity of DNA nanostructures created to date has been enhanced by improvements in DNA synthesis and computational design. However, existing methods for structure characterization typically image the final product or laboriously determine the presence of individual, labelled strands using gel electrophoresis. Here we introduce a new method of structure characterization that uses barcode extension and next-generation DNA sequencing to quantitatively measure the incorporation of every strand into a DNA nanostructure. By quantifying the relative abundances of distinct DNA species in product and monomer bands, we can study the influence of geometry and sequence on assembly. We have tested our method using 2D and 3D DNA brick and DNA origami structures. Our method is general and should be extensible to a wide variety of DNA nanostructures. PMID:28287117

  7. A functional-structural model of rice linking quantitative genetic information with morphological development and physiological processes.

    PubMed

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-04-01

    Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype-phenotype model, we present here a three-dimensional functional-structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed.

  8. A functional–structural model of rice linking quantitative genetic information with morphological development and physiological processes

    PubMed Central

    Xu, Lifeng; Henke, Michael; Zhu, Jun; Kurth, Winfried; Buck-Sorlin, Gerhard

    2011-01-01

    Background and Aims Although quantitative trait loci (QTL) analysis of yield-related traits for rice has developed rapidly, crop models using genotype information have been proposed only relatively recently. As a first step towards a generic genotype–phenotype model, we present here a three-dimensional functional–structural plant model (FSPM) of rice, in which some model parameters are controlled by functions describing the effect of main-effect and epistatic QTLs. Methods The model simulates the growth and development of rice based on selected ecophysiological processes, such as photosynthesis (source process) and organ formation, growth and extension (sink processes). It was devised using GroIMP, an interactive modelling platform based on the Relational Growth Grammar formalism (RGG). RGG rules describe the course of organ initiation and extension resulting in final morphology. The link between the phenotype (as represented by the simulated rice plant) and the QTL genotype was implemented via a data interface between the rice FSPM and the QTLNetwork software, which computes predictions of QTLs from map data and measured trait data. Key Results Using plant height and grain yield, it is shown how QTL information for a given trait can be used in an FSPM, computing and visualizing the phenotypes of different lines of a mapping population. Furthermore, we demonstrate how modification of a particular trait feeds back on the entire plant phenotype via the physiological processes considered. Conclusions We linked a rice FSPM to a quantitative genetic model, thereby employing QTL information to refine model parameters and visualizing the dynamics of development of the entire phenotype as a result of ecophysiological processes, including the trait(s) for which genetic information is available. Possibilities for further extension of the model, for example for the purposes of ideotype breeding, are discussed. PMID:21247905

  9. Informatics in radiology: automated structured reporting of imaging findings using the AIM standard and XML.

    PubMed

    Zimmerman, Stefan L; Kim, Woojin; Boonn, William W

    2011-01-01

    Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
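The serialization idea, turning workstation measurements into machine-readable XML, can be sketched with the standard library. The element and attribute names below are purely illustrative placeholders, not the actual AIM schema:

```python
import xml.etree.ElementTree as ET

def measurement_xml(annotation_id, value, unit):
    """Serialize one quantitative imaging measurement to an XML string.

    Element/attribute names here are invented for illustration; a real
    implementation would follow the AIM schema exactly.
    """
    root = ET.Element("ImageAnnotation", id=annotation_id)
    meas = ET.SubElement(root, "Measurement")
    ET.SubElement(meas, "Value").text = str(value)
    ET.SubElement(meas, "Unit").text = unit
    return ET.tostring(root, encoding="unicode")

doc = measurement_xml("lesion-1", 12.4, "mm")
```

Because the output is well-formed XML rather than free text, downstream code can parse and aggregate measurements across reports, which is exactly the data-mining benefit the abstract describes.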

  10. Protocol for the use of light upon extension real-time PCR for the determination of viral load in HBV infection.

    PubMed

    Li, Guimin; Li, Wangfeng; Liu, Lixia

    2012-01-01

    Real-time PCR has engendered wide acceptance for quantitation of hepatitis B virus (HBV) DNA in the blood due to its improved rapidity, sensitivity, reproducibility, and reduced contamination. Here we describe a cost-effective and highly sensitive HBV real-time quantitative assay based on the light upon extension real-time PCR platform and a simple and reliable HBV DNA preparation method using silica-coated magnetic beads.
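Viral-load quantitation by real-time PCR rests on a standard curve relating Ct to log10 input copies, fitted from a dilution series of a known standard. A generic sketch; the dilution-series numbers are invented and nothing here is specific to the light-upon-extension chemistry:

```python
def fit_standard_curve(log10_copies, cts):
    """Least-squares fit of Ct = intercept + slope * log10(copies)."""
    n = len(cts)
    mx, my = sum(log10_copies) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(log10_copies, cts))
             / sum((x - mx) ** 2 for x in log10_copies))
    return slope, my - slope * mx

def copies_from_ct(ct, slope, intercept):
    """Interpolate an unknown sample's copy number from its Ct value."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical 10-fold dilution series of an HBV DNA standard
log10_copies = [3, 4, 5, 6, 7]
cts = [33.2, 29.9, 26.6, 23.3, 20.0]
slope, intercept = fit_standard_curve(log10_copies, cts)  # slope near -3.3
viral_load = copies_from_ct(26.6, slope, intercept)
```

A slope near -3.32 corresponds to 100% amplification efficiency, which is why the slope is routinely checked when validating such assays.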

  11. DMS-prefiltered mass spectrometry for the detection of biomarkers

    NASA Astrophysics Data System (ADS)

    Coy, Stephen L.; Krylov, Evgeny V.; Nazarov, Erkinjon G.

    2008-04-01

    Technologies based on Differential Mobility Spectrometry (DMS) are ideally matched to rapid, sensitive, and selective detection of chemicals like biomarkers. Biomarkers linked to exposure to radiation, exposure to CWA's, exposure to toxic materials (TICs and TIMs) and to specific diseases are being examined in a number of laboratories. Screening for these types of exposure can be improved in accuracy and greatly speeded up by using DMS-MS instead of slower techniques like LC-MS and GC-MS. We have performed an extensive series of tests with nanospray-DMS-mass spectroscopy and standalone nanospray-DMS obtaining extensive information on chemistry and detectivity. DMS-MS systems implemented with low-resolution, low-cost, portable mass-spectrometry systems are very promising. Lowresolution mass spectrometry alone would be inadequate for the task, but with DMS pre-filtration to suppress interferences, can be quite effective, even for quantitative measurement. Bio-fluids and digests are well suited to ionization by electrospray and detection by mass-spectrometry, but signals from critical markers are overwhelmed by chemical noise from unrelated species, making essential quantitative analysis impossible. Sionex and collaborators have presented data using DMS to suppress chemical noise, allowing detection of cancer biomarkers in 10,000-fold excess of normal products 1,2. In addition, a linear dynamic range of approximately 2,000 has been demonstrated with accurate quantitation 3. We will review the range of possible applications and present new data on DMS-MS biomarker detection.

  12. Two-dimensional fuzzy fault tree analysis for chlorine release from a chlor-alkali industry using expert elicitation.

    PubMed

    Renjith, V R; Madhu, G; Nayagam, V Lakshmana Gomathi; Bhasi, A B

    2010-11-15

    The hazards associated with major accident hazard (MAH) industries are fire, explosion and toxic gas releases. Of these, toxic gas release is the worst as it has the potential to cause extensive fatalities. Qualitative and quantitative hazard analyses are essential for the identification and quantification of these hazards related to chemical industries. Fault tree analysis (FTA) is an established technique in hazard identification. This technique has the advantage of being both qualitative and quantitative, if the probabilities and frequencies of the basic events are known. This paper outlines the estimation of the probability of release of chlorine from storage and filling facility of chlor-alkali industry using FTA. An attempt has also been made to arrive at the probability of chlorine release using expert elicitation and proven fuzzy logic technique for Indian conditions. Sensitivity analysis has been done to evaluate the percentage contribution of each basic event that could lead to chlorine release. Two-dimensional fuzzy fault tree analysis (TDFFTA) has been proposed for balancing the hesitation factor involved in expert elicitation. Copyright © 2010 Elsevier B.V. All rights reserved.
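In a fault tree the top-event probability follows from the basic-event probabilities through the gate logic: for independent events, an AND gate multiplies probabilities and an OR gate complements the product of complements. A minimal crisp (non-fuzzy) sketch, with invented basic-event probabilities standing in for the expert-elicited values:

```python
def p_and(probs):
    """AND gate: all independent basic events must occur."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def p_or(probs):
    """OR gate: top event occurs if any independent input event occurs."""
    q = 1.0
    for p in probs:
        q *= (1.0 - p)
    return 1.0 - q

# invented basic-event probabilities for an illustrative release scenario:
# (valve failure AND alarm failure) OR gasket rupture OR overfill
p_release = p_or([p_and([1e-3, 2e-2]), 5e-5, 1.2e-4])
```

The fuzzy extension in the paper replaces each point probability with a fuzzy number to capture expert hesitation, but the gate arithmetic propagates through the tree in the same way.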

  13. Gene Level Meta-Analysis of Quantitative Traits by Functional Linear Models.

    PubMed

    Fan, Ruzong; Wang, Yifan; Boehnke, Michael; Chen, Wei; Li, Yun; Ren, Haobo; Lobach, Iryna; Xiong, Momiao

    2015-08-01

Meta-analysis of genetic data must account for differences among studies including study designs, markers genotyped, and covariates. The effects of genetic variants may differ from population to population, i.e., heterogeneity. Thus, meta-analysis combining data from multiple studies is difficult. Novel statistical methods for meta-analysis are needed. In this article, functional linear models are developed for meta-analyses that connect genetic data to quantitative traits, adjusting for covariates. The models can be used to analyze rare variants, common variants, or a combination of the two. Both likelihood-ratio test (LRT) and F-distributed statistics are introduced to test association between quantitative traits and multiple variants in one genetic region. Extensive simulations are performed to evaluate empirical type I error rates and power performance of the proposed tests. The proposed LRT and F-distributed statistics control the type I error very well and have higher power than the existing methods of the meta-analysis sequence kernel association test (MetaSKAT). We analyze four blood lipid levels in data from a meta-analysis of eight European studies. The proposed methods detect more significant associations than MetaSKAT and the P-values of the proposed LRT and F-distributed statistics are usually much smaller than those of MetaSKAT. The functional linear models and related test statistics can be useful in whole-genome and whole-exome association studies. Copyright © 2015 by the Genetics Society of America.
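The likelihood-ratio statistic used above compares nested model fits, LRT = 2(ℓ_alt − ℓ_null), referred to a chi-square distribution. For a single degree of freedom the tail probability has a closed form via the complementary error function. Sketch (the log-likelihood values are invented):

```python
import math

def lrt_pvalue_df1(ll_null, ll_alt):
    """Likelihood-ratio test with 1 degree of freedom.

    LRT = 2 * (ll_alt - ll_null); since chi2_1 is the square of a standard
    normal, P(chi2_1 > x) = erfc(sqrt(x / 2)).
    """
    lrt = 2.0 * (ll_alt - ll_null)
    return lrt, math.erfc(math.sqrt(lrt / 2.0))

lrt, p = lrt_pvalue_df1(ll_null=-1203.4, ll_alt=-1197.1)  # hypothetical fits
```

For the multi-degree-of-freedom tests in the paper the reference distribution changes, but the nested-model comparison is the same.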

  14. Quantitative analysis of the pendulum test: application to multiple sclerosis patients treated with botulinum toxin.

    PubMed

    Bianchi, L; Monaldi, F; Paolucci, S; Iani, C; Lacquaniti, F

    1999-01-01

    The aim of this study was to develop quantitative analytical methods in the application of the pendulum test to both normal and spastic subjects. The lower leg was released by a torque motor from different starting positions. The resulting changes in the knee angle were fitted by means of a time-varying model. Stiffness and viscosity coefficients were derived for each half-cycle oscillation in both flexion and extension, and for all knee starting positions. This method was applied to the assessment of the effects of Botulinum toxin A (BTX) in progressive multiple sclerosis patients in a follow-up study. About half of the patients showed a significant decrement in stiffness and viscosity coefficients.
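The stiffness and viscosity coefficients extracted from a pendulum test come from fitting the lower leg's swing to a linear second-order model, I·θ'' + B·θ' + K·θ = 0. A sketch recovering K and B from the damped period and two successive peak amplitudes; all numbers are hypothetical and this is a simplification of the study's time-varying fit:

```python
import math

def damping_ratio(peak1, peak2):
    """Damping ratio from the logarithmic decrement of successive peaks."""
    delta = math.log(peak1 / peak2)
    return delta / math.sqrt(4 * math.pi ** 2 + delta ** 2)

def knee_parameters(period, peak1, peak2, inertia):
    """Stiffness K and viscosity B of the linear limb model
    I*theta'' + B*theta' + K*theta = 0, from damped period and peak decay."""
    zeta = damping_ratio(peak1, peak2)
    wd = 2 * math.pi / period              # damped angular frequency
    wn = wd / math.sqrt(1 - zeta ** 2)     # undamped natural frequency
    K = inertia * wn ** 2
    B = 2 * zeta * inertia * wn
    return K, B

# hypothetical readout: 1.2 s oscillation period, peaks of 30 deg then 18 deg,
# lower-leg moment of inertia 0.35 kg*m^2
K, B = knee_parameters(1.2, 30.0, 18.0, inertia=0.35)
```

In spastic limbs the oscillation decays faster (larger logarithmic decrement), which is how a reduction in B and K after botulinum toxin shows up in the fitted coefficients.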

  15. Quantitative fluorescence correlation spectroscopy on DNA in living cells

    NASA Astrophysics Data System (ADS)

    Hodges, Cameron; Kafle, Rudra P.; Meiners, Jens-Christian

    2017-02-01

FCS is a fluorescence technique conventionally used to study the kinetics of fluorescent molecules in a dilute solution. Being a non-invasive technique, it is now drawing increasing interest for the study of more complex systems like the dynamics of DNA or proteins in living cells. Unlike an ordinary dye solution, the dynamics of macromolecules like proteins or entangled DNA in crowded environments is often slow and subdiffusive in nature. This in turn leads to longer residence times of the attached fluorophores in the excitation volume of the microscope, and artifacts from photobleaching abound that can easily obscure the signature of the molecular dynamics of interest and make quantitative analysis challenging. We discuss methods and procedures to make FCS applicable to quantitative studies of the dynamics of DNA in live prokaryotic and eukaryotic cells. The intensity autocorrelation function is computed from weighted arrival times of the photons on the detector in a way that maximizes the information content while simultaneously correcting for the effect of photobleaching, yielding an autocorrelation function that reflects only the underlying dynamics of the sample. This autocorrelation function in turn is used to calculate the mean square displacement of the fluorophores attached to DNA. The displacement data is more amenable to further quantitative analysis than the raw correlation functions. By using a suitable integral transform of the mean square displacement, we can then determine the viscoelastic moduli of the DNA in its cellular environment. The entire analysis procedure is extensively calibrated and validated using model systems and computational simulations.
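The central quantity in FCS is the normalized intensity autocorrelation G(τ) = ⟨δI(t)·δI(t+τ)⟩ / ⟨I⟩², which for slow, subdiffusive tracers decays over long lag times. A binned-trace sketch; the synthetic, slowly varying trace below stands in for real photon-count data and omits the photobleaching correction discussed above:

```python
import math

def intensity_acf(trace, max_lag):
    """Normalized intensity autocorrelation G(tau) for lags 1..max_lag,
    computed from a binned intensity trace."""
    n = len(trace)
    mean = sum(trace) / n
    g = []
    for lag in range(1, max_lag + 1):
        cov = sum((trace[t] - mean) * (trace[t + lag] - mean)
                  for t in range(n - lag)) / (n - lag)
        g.append(cov / mean ** 2)
    return g

# synthetic slowly varying intensity trace (stand-in for binned photon counts)
trace = [10.0 + 3.0 * math.sin(0.3 * t) for t in range(400)]
g = intensity_acf(trace, 12)  # decays as lag grows for a slowly varying signal
```

The slower the underlying motion, the more slowly G(τ) decays, which is why subdiffusive DNA dynamics and photobleaching (both long-timescale effects) are so easily confounded without the correction the abstract describes.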

  16. STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.

    PubMed

    Grillo, Vincenzo; Rossi, Francesca

    2013-02-01

A new graphical software tool (STEM_CELL) for the analysis of HRTEM and STEM-HAADF images is here introduced in detail. The advantage of the software, beyond its graphic interface, is to put together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Different implementations of and improvements to state-of-the-art approaches are reported for image analysis, filtering, normalization and background subtraction. In particular, two important methodological results are here highlighted: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. As matters stand, however, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. have proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = ⟨R_L · Ψ⟩, where Ψ is the profile structure factor and the average is taken over the sections. This method is based on classical stereological principles and has been verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. To extend the detailed application of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
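The profile roughness parameter R_L is the ratio of a digitized profile's true arc length to its projected length, and the Gokhale-Underwood relation then scales it by the profile structure factor Ψ. A sketch; the profile points are invented, and Ψ = 4/π is used only as an illustrative placeholder, since in practice Ψ must be estimated from the sectioning geometry:

```python
import math

def profile_roughness(xs, ys):
    """R_L: true arc length of a digitized fracture profile divided by its
    projected (horizontal) length. Equals 1 only for a perfectly flat profile."""
    arc = sum(math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
              for i in range(len(xs) - 1))
    return arc / (xs[-1] - xs[0])

def surface_roughness(r_l, psi):
    """Gokhale-Underwood estimate R_S = R_L * Psi (Psi: profile structure
    factor; the single-section form of the sectioning-average relation)."""
    return r_l * psi

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]            # invented digitized profile
ys = [0.0, 0.8, 0.1, 0.9, 0.2, 0.7]
r_l = profile_roughness(xs, ys)                 # > 1 for any non-flat profile
r_s = surface_roughness(r_l, psi=4 / math.pi)   # placeholder Psi value
```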

  18. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs.

    PubMed

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Zel, Jana; Gruden, Kristina

    2008-10-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarray. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low copy number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. The second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent-dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems were applied to test maize samples containing several transgenic lines, and NAIMA was shown to be sensitive down to two target copies and to provide quantitative data on transgenic contents in the range of 0.1-25%. The performance of NAIMA is comparable to singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification.

  19. NAIMA: target amplification strategy allowing quantitative on-chip detection of GMOs

    PubMed Central

    Morisset, Dany; Dobnik, David; Hamels, Sandrine; Žel, Jana; Gruden, Kristina

    2008-01-01

    We have developed a novel multiplex quantitative DNA-based target amplification method suitable for sensitive, specific and quantitative detection on microarray. This new method, named NASBA Implemented Microarray Analysis (NAIMA), was applied to GMO detection in food and feed, but its application can be extended to all fields of biology requiring simultaneous detection of low copy number DNA targets. In a first step, the use of tailed primers allows the multiplex synthesis of template DNAs in a primer extension reaction. The second step of the procedure consists of transcription-based amplification using universal primers. The cRNA product is then directly ligated to fluorescent-dye-labelled 3DNA dendrimers, allowing signal amplification, and hybridized without further purification on an oligonucleotide probe-based microarray for multiplex detection. Two triplex systems were applied to test maize samples containing several transgenic lines, and NAIMA was shown to be sensitive down to two target copies and to provide quantitative data on transgenic contents in the range of 0.1–25%. The performance of NAIMA is comparable to singleplex quantitative real-time PCR. In addition, NAIMA amplification is faster, since 20 min are sufficient to achieve full amplification. PMID:18710880

  20. Past, Present, and Future of Critical Quantitative Research in Higher Education

    ERIC Educational Resources Information Center

    Wells, Ryan S.; Stage, Frances K.

    2014-01-01

    This chapter discusses the evolution of the critical quantitative paradigm with an emphasis on extending this approach to new populations and new methods. Along with this extension of critical quantitative work, however, come continued challenges and tensions for researchers. This chapter recaps and responds to each chapter in the volume, and…

  1. Over the Hurdles: Barriers to Social Media Use in Extension Offices

    ERIC Educational Resources Information Center

    Newbury, Elizabeth; Humphreys, Lee; Fuess, Lucas

    2014-01-01

    The research reported here explored the perceived barriers to social media use by Extension educators. Using a sequential mixed method approach, the research was composed of two parts. The qualitative study used interview data (n = 27) from Wisconsin and New York Extension educators. The quantitative study gathered data from surveying Extension…

  2. Shewanella oneidensis MR-1 nanowires are outer membrane and periplasmic extensions of the extracellular electron transport components

    PubMed Central

    Pirbadian, Sahand; Barchinger, Sarah E.; Leung, Kar Man; Byun, Hye Suk; Jangir, Yamini; Bouhenni, Rachida A.; Reed, Samantha B.; Romine, Margaret F.; Saffarini, Daad A.; Shi, Liang; Gorby, Yuri A.; Golbeck, John H.; El-Naggar, Mohamed Y.

    2014-01-01

    Bacterial nanowires offer an extracellular electron transport (EET) pathway for linking the respiratory chain of bacteria to external surfaces, including oxidized metals in the environment and engineered electrodes in renewable energy devices. Despite the global, environmental, and technological consequences of this biotic–abiotic interaction, the composition, physiological relevance, and electron transport mechanisms of bacterial nanowires remain unclear. We report, to our knowledge, the first in vivo observations of the formation and respiratory impact of nanowires in the model metal-reducing microbe Shewanella oneidensis MR-1. Live fluorescence measurements, immunolabeling, and quantitative gene expression analysis point to S. oneidensis MR-1 nanowires as extensions of the outer membrane and periplasm that include the multiheme cytochromes responsible for EET, rather than pilin-based structures as previously thought. These membrane extensions are associated with outer membrane vesicles, structures ubiquitous in Gram-negative bacteria, and are consistent with bacterial nanowires that mediate long-range EET by the previously proposed multistep redox hopping mechanism. Redox-functionalized membrane and vesicular extensions may represent a general microbial strategy for electron transport and energy distribution. PMID:25143589

  3. Shewanella oneidensis MR-1 nanowires are outer membrane and periplasmic extensions of the extracellular electron transport components.

    PubMed

    Pirbadian, Sahand; Barchinger, Sarah E; Leung, Kar Man; Byun, Hye Suk; Jangir, Yamini; Bouhenni, Rachida A; Reed, Samantha B; Romine, Margaret F; Saffarini, Daad A; Shi, Liang; Gorby, Yuri A; Golbeck, John H; El-Naggar, Mohamed Y

    2014-09-02

    Bacterial nanowires offer an extracellular electron transport (EET) pathway for linking the respiratory chain of bacteria to external surfaces, including oxidized metals in the environment and engineered electrodes in renewable energy devices. Despite the global, environmental, and technological consequences of this biotic-abiotic interaction, the composition, physiological relevance, and electron transport mechanisms of bacterial nanowires remain unclear. We report, to our knowledge, the first in vivo observations of the formation and respiratory impact of nanowires in the model metal-reducing microbe Shewanella oneidensis MR-1. Live fluorescence measurements, immunolabeling, and quantitative gene expression analysis point to S. oneidensis MR-1 nanowires as extensions of the outer membrane and periplasm that include the multiheme cytochromes responsible for EET, rather than pilin-based structures as previously thought. These membrane extensions are associated with outer membrane vesicles, structures ubiquitous in Gram-negative bacteria, and are consistent with bacterial nanowires that mediate long-range EET by the previously proposed multistep redox hopping mechanism. Redox-functionalized membrane and vesicular extensions may represent a general microbial strategy for electron transport and energy distribution.

  4. Improving power and robustness for detecting genetic association with extreme-value sampling design.

    PubMed

    Chen, Hua Yun; Li, Mingyao

    2011-12-01

    Extreme-value sampling designs, which sample subjects with extremely large or small quantitative trait values, are commonly used in genetic association studies. Samples in such designs are often treated as "cases" and "controls" and analyzed using logistic regression. Such a case-control analysis ignores the potential dose-response relationship between the quantitative trait and the underlying trait locus and thus may lead to loss of power in detecting genetic association. An alternative approach to analyzing such data is to model the dose-response relationship by a linear regression model. However, parameter estimation from this model can be biased, which may lead to inflated type I errors. We propose a robust and efficient approach that takes into consideration both the biased sampling design and the potential dose-response relationship. Extensive simulations demonstrate that the proposed method is more powerful than the traditional logistic regression analysis and more robust than the linear regression analysis. We applied our method to the analysis of a candidate gene association study on high-density lipoprotein cholesterol (HDL-C) which included study subjects with extremely high or low HDL-C levels. Using our method, we identified several SNPs showing stronger evidence of association with HDL-C than in the traditional case-control logistic regression analysis. Our results suggest that it is important to appropriately model the quantitative trait and to adjust for the biased sampling when a dose-response relationship exists in extreme-value sampling designs. © 2011 Wiley Periodicals, Inc.

  5. Principles of Quantitative MR Imaging with Illustrated Review of Applicable Modular Pulse Diagrams.

    PubMed

    Mills, Andrew F; Sakai, Osamu; Anderson, Stephan W; Jara, Hernan

    2017-01-01

    Continued improvements in diagnostic accuracy using magnetic resonance (MR) imaging will require development of methods for tissue analysis that complement traditional qualitative MR imaging studies. Quantitative MR imaging is based on the measurement and interpretation of tissue-specific parameters independent of experimental design, whereas qualitative MR imaging relies on interpretation of tissue contrast that results from experimental pulse sequence parameters. Quantitative MR imaging represents a natural next step in the evolution of MR imaging practice, since quantitative MR imaging data can be acquired using currently available qualitative imaging pulse sequences without modifications to imaging equipment. While this approach offers the potential for improved diagnostic accuracy and workflow, awareness of this extension to qualitative imaging is generally low. This article reviews the basic physical concepts of MR imaging and how quantitative MR imaging is distinct from qualitative MR imaging, describes commonly measured tissue parameters in quantitative MR imaging, and presents the major available pulse sequences, organized hierarchically into conventional, hybrid, and multispectral sequences capable of calculating the main tissue parameters of T1, T2, and proton density. © RSNA, 2017.

  6. Prediction Analysis for Measles Epidemics

    NASA Astrophysics Data System (ADS)

    Sumi, Ayako; Ohtomo, Norio; Tanaka, Yukio; Sawamura, Sadashi; Olsen, Lars Folke; Kobayashi, Nobumichi

    2003-12-01

    A newly devised procedure for prediction analysis, a linearized version of the nonlinear least-squares method combined with the maximum entropy spectral analysis method, is proposed. The method was applied to time series data of measles case notifications in several communities in the UK, USA and Denmark. The dominant spectral lines observed in each power spectral density (PSD) can safely be assigned as fundamental periods. The optimum least-squares fitting (LSF) curve calculated using these fundamental periods essentially reproduces the underlying variation of the measles data, and an extension of the LSF curve can be used to predict measles case notifications quantitatively. Some discussion, including the predictability of chaotic time series, is also presented.
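The fitting step above, projecting the series onto sinusoids at the fundamental periods identified in the PSD and extrapolating the fitted curve forward, can be sketched in a few lines. A hedged illustration, assuming a single known 12-sample period and a synthetic series (not the paper's data or code); when each period divides the series length exactly, the Fourier projections below coincide with the least-squares solution:

```python
import math

def fit_periods(x, periods):
    """Fit x[n] = mean + sum_k (a_k cos + b_k sin) at the given fundamental
    periods via Fourier projection (exact least squares when each period
    divides len(x))."""
    n = len(x)
    mean = sum(x) / n
    coeffs = []
    for p in periods:
        a = 2.0 / n * sum(x[i] * math.cos(2 * math.pi * i / p) for i in range(n))
        b = 2.0 / n * sum(x[i] * math.sin(2 * math.pi * i / p) for i in range(n))
        coeffs.append((p, a, b))
    return mean, coeffs

def predict(mean, coeffs, i):
    """Evaluate the LSF curve at index i; i >= len(x) gives a forecast."""
    return mean + sum(a * math.cos(2 * math.pi * i / p) +
                      b * math.sin(2 * math.pi * i / p) for p, a, b in coeffs)

# Synthetic "case notification" series with a known 12-sample period.
series = [50 + 30 * math.cos(2 * math.pi * i / 12) for i in range(120)]
mean, coeffs = fit_periods(series, [12])
forecast = predict(mean, coeffs, 125)  # out-of-sample extrapolation
```

In the paper the periods themselves come from maximum entropy spectral analysis; here they are supplied by hand to keep the sketch self-contained.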

  7. Q-space analysis of light scattering by ice crystals

    NASA Astrophysics Data System (ADS)

    Heinson, Yuli W.; Maughan, Justin B.; Ding, Jiachen; Chakrabarti, Amitabha; Yang, Ping; Sorensen, Christopher M.

    2016-12-01

    Q-space analysis is applied to extensive simulations of the single-scattering properties of ice crystals with various habits/shapes over a range of sizes. The analysis uncovers features common to all the shapes: a forward scattering regime with intensity quantitatively related to the Rayleigh scattering by the particle and the internal coupling parameter, followed by a Guinier regime dependent upon the particle size, a complex power law regime with incipient two dimensional diffraction effects, and, in some cases, an enhanced backscattering regime. The effects of significant absorption on the scattering profile are also studied. The overall features found for the ice crystals are similar to features in scattering from same sized spheres.

  8. Conserved DNA methylation patterns in healthy blood cells and extensive changes in leukemia measured by a new quantitative technique

    PubMed Central

    Jelinek, Jaroslav; Liang, Shoudan; Lu, Yue; He, Rong; Ramagli, Louis S.; Shpall, Elizabeth J.; Estecio, Marcos R.H.; Issa, Jean-Pierre J.

    2012-01-01

    Genome-wide analysis of DNA methylation provides important information in a variety of diseases, including cancer. Here, we describe a simple method, Digital Restriction Enzyme Analysis of Methylation (DREAM), based on next-generation sequencing analysis of methylation-specific signatures created by sequential digestion of genomic DNA with SmaI and XmaI enzymes. DREAM provides information on 150,000 unique CpG sites, of which 39,000 are in CpG islands and 30,000 are at transcription start sites of 13,000 RefSeq genes. We analyzed DNA methylation in healthy white blood cells and found methylation patterns to be remarkably uniform. Interindividual differences > 30% were observed at only 227 of 28,331 (0.8%) autosomal CpG sites. Similarly, > 30% differences were observed at only 59 sites when comparing cord and adult blood. These conserved methylation patterns contrasted with extensive changes affecting 18–40% of CpG sites in a patient with acute myeloid leukemia and in two leukemia cell lines. The method is cost effective, quantitative (r2 = 0.93 when compared with bisulfite pyrosequencing) and reproducible (r2 = 0.997). Using 100-fold coverage, DREAM can detect differences in methylation greater than 10% or 30% with a false positive rate below 0.05 or 0.001, respectively. DREAM can be useful in quantifying epigenetic effects of environment and nutrition, correlating developmental epigenetic variation with phenotypes, understanding the epigenetics of cancer and chronic diseases, measuring the effects of drugs on DNA methylation, and deriving new biological insights into mammalian genomes. PMID:23075513
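The methylation-specific signatures exploited by DREAM can be illustrated with a toy read classifier. Because SmaI (methylation-sensitive) cuts CCC^GGG bluntly while XmaI (methylation-insensitive) cuts C^CCGGG, reads from unmethylated sites begin with GGG and reads from methylated sites begin with CCGGG; the per-site methylation level is then the methylated fraction of informative reads. A simplified sketch with invented read strings (it ignores end repair and mapping, which the real pipeline handles):

```python
def classify_read(read):
    """Classify a read by its 5' signature at a CCCGGG site.
    SmaI (methylation-sensitive) cut  -> read starts with GGG;
    XmaI (methylation-insensitive) cut -> read starts with CCGGG."""
    if read.startswith("CCGGG"):
        return "methylated"
    if read.startswith("GGG"):
        return "unmethylated"
    return "other"

def methylation_level(reads):
    """Fraction of informative reads carrying the methylated signature."""
    counts = {"methylated": 0, "unmethylated": 0, "other": 0}
    for r in reads:
        counts[classify_read(r)] += 1
    informative = counts["methylated"] + counts["unmethylated"]
    return counts["methylated"] / informative if informative else None

# 3 methylated-signature reads out of 10 informative reads -> 30%.
reads = ["CCGGGATT"] * 3 + ["GGGTACGT"] * 7
print(methylation_level(reads))  # 0.3
```

The CCGGG test must come first, since a CCGGG read would otherwise never be reached; the counting itself is what makes the assay quantitative rather than merely qualitative.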

  9. Dark Zones of Solid Propellant Flames: Critical Assessment and Quantitative Modeling of Experimental Datasets With Analysis of Chemical Pathways and Sensitivities

    DTIC Science & Technology

    2011-01-01

    ...promised to provide an extensive, definitive review critically assessing our current understanding of DZ structure and chemistry, and providing a documented...

  10. Field and Laboratory Application of a Gas Chromatograph Low Thermal Mass Resistively Heated Column System in Detecting Traditional and Non-Traditional Chemical Warfare Agents Using Solid Phase Micro-Extraction

    DTIC Science & Technology

    2005-02-01

    ...followed by extensive sample preparation procedures that are performed in a laboratory. Analysis is typically conducted by injecting a liquid or gas sample...Alfentanil, Remifentanil, Sufentanil, and Carfentanil) in a laboratory. (5) Quantitatively determine a maximum temperature ramping rate at which the LTM...RHT Column combined with a GC-MS can separate and analyze a mixture of non-traditional CWAs (i.e. Fentanyl, Alfentanil, Remifentanil, Sufentanil...

  11. Little cigars, big cigars: omissions and commissions of harm and harm reduction information on the Internet.

    PubMed

    Dollar, Katherine M; Mix, Jacqueline M; Kozlowski, Lynn T

    2008-05-01

    We conducted a comparative analysis of "harm," "harm reduction," and "little cigar" information about cigars on 10 major English-language health Web sites. The sites were from governmental and nongovernmental organizations based in seven different countries and included "harm" and "harm reduction" information, discussions of little cigars, quantitative estimates of health risks, and qualifying behavioral characteristics (inhalation, number per day). Of the 10 Web sites, 7 offered statements explicitly indicating that cigars may be safer than cigarettes. None of the Web sites reviewed described that little cigars are likely as dangerous as cigarettes. Some Web sites provided quantitative estimates of health risks and extensive discussions of qualifying factors. Reading grade levels were higher than desirable. Extensive and complex information on the reduced risks of cigars compared with cigarettes is available on Web sites affiliated with prominent health organizations. Yet these sites fail to warn consumers that popular cigarette-like little cigars and cigarillos are likely to be just as dangerous as cigarettes, even for those who have never smoked cigarettes. Improvement of these Web sites is urgently needed to provide the public with high-quality health information.

  12. Improving training in methodology enriches the science of psychology.

    PubMed

    Aiken, Leona S; West, Stephen G; Millsap, Roger E

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The current authors state that in their recent article they reported the results of an extensive survey of quantitative training in all PhD programs in North America. They compared these results with those of a similar survey conducted 12 years earlier (Aiken, West, Sechrest, & Reno, 1990) and raised issues for the future methodological training of substantive and quantitative researchers in psychology. The authors then respond to Zimiles's three questions. PsycINFO Database Record 2009 APA.

  13. Clinicopathological characteristics including BRAF V600E mutation status and PET/CT findings in papillary thyroid carcinoma.

    PubMed

    Choi, Eun Kyoung; Chong, Ari; Ha, Jung-Min; Jung, Chan Kwon; O, Joo Hyun; Kim, Sung Hoon

    2017-07-01

    We assessed the associations between FDG uptake in primary papillary thyroid carcinomas (PTCs) and clinicopathological features, including the BRAF V600E mutation, using quantitative and qualitative analyses of preoperative PET/CT data. This was a retrospective review of 106 patients with PTC who underwent PET/CT scans between February 2009 and January 2011 before undergoing total thyroidectomy. Data collected from surgical specimens were compared with FDG uptake in the primary tumour using quantitative and qualitative analyses of preoperative PET/CT data. Clinicopathological data included the primary tumour size, subtype, capsular invasion, extrathyroid extension, multifocality, BRAF V600E mutation status, lymph node metastasis and distant metastasis. The SUVmax of the primary tumour was significantly higher in patients with a primary tumour >1 cm, extrathyroid extension or the BRAF V600E mutation than in patients without these features (P<.001, .049 and <.001). Univariate analyses showed that primary tumour size, extrathyroid extension and BRAF V600E mutation status were associated with the SUVmax of the PTC. Multivariate analysis indicated that primary tumour size and the BRAF V600E mutation were associated with the SUVmax of the PTC. In a visual assessment, the primary tumour size was larger in FDG-avid than in non-FDG-avid PTCs (P<.001). There was no significant difference in the presence of multifocality, thyroid capsular invasion, extrathyroid extension, BRAF V600E mutation, lymph node metastasis or distant metastasis between FDG-avid and non-FDG-avid PTCs. Primary tumour size and the BRAF V600E mutation are significant factors associated with the SUVmax on preoperative PET/CT in patients with PTC. © 2017 John Wiley & Sons Ltd.

  14. Targeted quantitative analysis of Streptococcus pyogenes virulence factors by multiple reaction monitoring.

    PubMed

    Lange, Vinzenz; Malmström, Johan A; Didion, John; King, Nichole L; Johansson, Björn P; Schäfer, Juliane; Rameseder, Jonathan; Wong, Chee-Hong; Deutsch, Eric W; Brusniak, Mi-Youn; Bühlmann, Peter; Björck, Lars; Domon, Bruno; Aebersold, Ruedi

    2008-08-01

    In many studies, particularly in the field of systems biology, it is essential that identical protein sets are precisely quantified in multiple samples such as those representing differentially perturbed cell states. The high degree of reproducibility required for such experiments has not been achieved by classical mass spectrometry-based proteomics methods. In this study we describe the implementation of a targeted quantitative approach by which predetermined protein sets are first identified and subsequently quantified at high sensitivity reliably in multiple samples. This approach consists of three steps. First, the proteome is extensively mapped out by multidimensional fractionation and tandem mass spectrometry, and the data generated are assembled in the PeptideAtlas database. Second, based on this proteome map, peptides uniquely identifying the proteins of interest, proteotypic peptides, are selected, and multiple reaction monitoring (MRM) transitions are established and validated by MS2 spectrum acquisition. This process of peptide selection, transition selection, and validation is supported by a suite of software tools, TIQAM (Targeted Identification for Quantitative Analysis by MRM), described in this study. Third, the selected target protein set is quantified in multiple samples by MRM. Applying this approach we were able to reliably quantify low abundance virulence factors from cultures of the human pathogen Streptococcus pyogenes exposed to increasing amounts of plasma. The resulting quantitative protein patterns enabled us to clearly define the subset of virulence proteins that is regulated upon plasma exposure.

  15. Effect of quantum nuclear motion on hydrogen bonding

    NASA Astrophysics Data System (ADS)

    McKenzie, Ross H.; Bekker, Christiaan; Athokpam, Bijyalaxmi; Ramesh, Sai G.

    2014-05-01

    This work considers how the properties of hydrogen-bonded complexes, X-H⋯Y, are modified by the quantum motion of the shared proton. Using a simple two-diabatic-state model Hamiltonian, an analysis of the symmetric case, where the donor (X) and acceptor (Y) have the same proton affinity, is carried out. For quantitative comparisons, a parametrization specific to O-H⋯O complexes is used. The vibrational energy levels of the one-dimensional ground-state adiabatic potential of the model are used to make quantitative comparisons with a vast body of condensed-phase data, spanning a donor-acceptor separation (R) range of about 2.4-3.0 Å, i.e., from strong to weak hydrogen bonds. The position of the proton (which determines the X-H bond length) and its longitudinal vibrational frequency, along with the isotope effects in both, are described quantitatively. An analysis of the secondary geometric isotope effect, using a simple extension of the two-state model, yields improved agreement of the predicted variation with R of frequency isotope effects. The role of bending modes is also considered: their quantum effects compete with those of the stretching mode for weak to moderate H-bond strengths. In spite of the economy of the parametrization used, the model offers key insights into the defining features of H-bonds and semi-quantitatively captures several trends.

  16. Quantitative Cell Cycle Analysis Based on an Endogenous All-in-One Reporter for Cell Tracking and Classification.

    PubMed

    Zerjatke, Thomas; Gak, Igor A; Kirova, Dilyana; Fuhrmann, Markus; Daniel, Katrin; Gonciarz, Magdalena; Müller, Doris; Glauche, Ingmar; Mansfeld, Jörg

    2017-05-30

    Cell cycle kinetics are crucial to cell fate decisions. Although live imaging has provided extensive insights into this relationship at the single-cell level, the limited number of fluorescent markers that can be used in a single experiment has hindered efforts to link the dynamics of individual proteins responsible for decision making directly to cell cycle progression. Here, we present fluorescently tagged endogenous proliferating cell nuclear antigen (PCNA) as an all-in-one cell cycle reporter that allows simultaneous analysis of cell cycle progression, including the transition into quiescence, and the dynamics of individual fate determinants. We also provide an image analysis pipeline for automated segmentation, tracking, and classification of all cell cycle phases. Combining the all-in-one reporter with labeled endogenous cyclin D1 and p21 as prime examples of cell-cycle-regulated fate determinants, we show how cell cycle and quantitative protein dynamics can be simultaneously extracted to gain insights into G1 phase regulation and responses to perturbations. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  17. Quantitative analysis of small molecule-nucleic acid interactions with a biosensor surface and surface plasmon resonance detection.

    PubMed

    Liu, Yang; Wilson, W David

    2010-01-01

    Surface plasmon resonance (SPR) technology with biosensor surfaces has become a widely used tool for the study of nucleic acid interactions without any labeling requirements. The method provides simultaneous kinetic and equilibrium characterization of the interactions of biomolecules as well as small molecule-biopolymer binding. SPR monitors molecular interactions in real time and provides significant advantages over optical or calorimetric methods for systems with strong binding coupled to small spectroscopic signals and/or reaction heats. A detailed and practical guide for nucleic acid interaction analysis using SPR-biosensor methods is presented. Details of the SPR technology and basic fundamentals are described, with recommendations on the preparation of the SPR instrument, sensor chips, and samples, as well as extensive information on experimental design, quantitative and qualitative data analysis, and presentation. A specific example of the interaction of a minor-groove-binding agent with DNA is evaluated by both kinetic and steady-state SPR methods to illustrate the technique. Since molecules that bind cooperatively to specific DNA sequences are attractive for many applications, a cooperative small molecule-DNA interaction is also presented.
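The kinetic and steady-state analyses mentioned above are usually built on the 1:1 (Langmuir) interaction model: during association dR/dt = ka·C·(Rmax - R) - kd·R, during dissociation R(t) = R0·exp(-kd·t), and at equilibrium Req = Rmax·C/(C + KD) with KD = kd/ka. A minimal sketch of the closed-form sensorgram phases; the rate constants and concentrations are illustrative assumptions, not values from the chapter:

```python
import math

def association(t, conc, ka, kd, rmax):
    """Response during the association phase of a 1:1 sensorgram."""
    kobs = ka * conc + kd
    req = ka * conc * rmax / kobs
    return req * (1.0 - math.exp(-kobs * t))

def dissociation(t, r0, kd):
    """Response during the dissociation phase."""
    return r0 * math.exp(-kd * t)

def steady_state(conc, kd_eq, rmax):
    """Equilibrium response Req = Rmax * C / (C + KD)."""
    return rmax * conc / (conc + kd_eq)

ka, kd, rmax = 1e5, 1e-3, 100.0   # M^-1 s^-1, s^-1, RU (illustrative)
KD = kd / ka                       # equilibrium dissociation constant, 10 nM
# At an analyte concentration equal to KD, Req is half of Rmax.
print(steady_state(KD, KD, rmax))  # 50.0
```

Kinetic analysis fits ka and kd to the two phases directly, while steady-state analysis fits Req versus C to the hyperbola above; agreement between the two KD estimates is a common consistency check.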

  18. Linking Antisocial Behavior, Substance Use, and Personality: An Integrative Quantitative Model of the Adult Externalizing Spectrum

    PubMed Central

    Krueger, Robert F.; Markon, Kristian E.; Patrick, Christopher J.; Benning, Stephen D.; Kramer, Mark D.

    2008-01-01

    Antisocial behavior, substance use, and impulsive and aggressive personality traits often co-occur, forming a coherent spectrum of personality and psychopathology. In the current research, the authors developed a novel quantitative model of this spectrum. Over 3 waves of iterative data collection, 1,787 adult participants selected to represent a range across the externalizing spectrum provided extensive data about specific externalizing behaviors. Statistical methods such as item response theory and semiparametric factor analysis were used to model these data. The model and assessment instrument that emerged from the research shows how externalizing phenomena are organized hierarchically and cover a wide range of individual differences. The authors discuss the utility of this model for framing research on the correlates and the etiology of externalizing phenomena. PMID:18020714

  19. Colored Petri net modeling and simulation of signal transduction pathways.

    PubMed

    Lee, Dong-Yup; Zimmer, Ralf; Lee, Sang Yup; Park, Sunwon

    2006-03-01

    Presented herein is a methodology for quantitatively analyzing complex signaling networks using colored Petri nets (CPN). Mathematical as well as Petri net models for two basic reaction types were established, followed by extension to a large signal transduction system stimulated by epidermal growth factor (EGF) in an application study. The CPN models, based on the Petri net representation together with conservation and kinetic equations, were used to examine the dynamic behavior of the EGF signaling pathway. The usefulness of Petri nets for the quantitative analysis of signal transduction pathways is demonstrated. Moreover, the trade-offs between modeling capability and simulation efficiency for this pathway are explored, suggesting that the Petri net model can be invaluable in the initial stage of building a dynamic model.
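For a basic binding step A + B <=> AB, the conservation and kinetic equations underlying such a model reduce to mass-action ODEs: d[AB]/dt = k1[A][B] - k2[AB], with [A] + [AB] and [B] + [AB] conserved. A hedged forward-Euler sketch of that one reaction type (rate constants and initial concentrations are invented; this stands in for, rather than reproduces, the paper's CPN machinery):

```python
def simulate_binding(a0, b0, k1, k2, dt=0.001, steps=20000):
    """Forward-Euler integration of A + B <=> AB mass-action kinetics.
    Returns final concentrations (a, b, ab)."""
    a, b, ab = a0, b0, 0.0
    for _ in range(steps):
        rate = k1 * a * b - k2 * ab   # net forward flux
        a -= rate * dt
        b -= rate * dt
        ab += rate * dt
    return a, b, ab

a, b, ab = simulate_binding(a0=1.0, b0=0.5, k1=2.0, k2=0.1)
# Conservation: total A and total B are unchanged by the dynamics.
print(round(a + ab, 6), round(b + ab, 6))  # 1.0 0.5
```

A Petri-net version of the same step has places A, B, AB and one reversible transition; the token-flow balance of the net is exactly the conservation relation checked above.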

  20. SNP-VISTA: An Interactive SNPs Visualization Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shah, Nameeta; Teplitsky, Michael V.; Pennacchio, Len A.

    2005-07-05

    Recent advances in sequencing technologies promise better diagnostics for many diseases as well as better understanding of the evolution of microbial populations. Single Nucleotide Polymorphisms (SNPs) are established genetic markers that aid in the identification of loci affecting quantitative traits and/or disease in a wide variety of eukaryotic species. With today's technological capabilities, it is possible to re-sequence a large set of appropriate candidate genes in individuals with a given disease and then screen for causative mutations. In addition, SNPs have been used extensively in efforts to study the evolution of microbial populations, and the recent application of random shotgun sequencing to environmental samples makes possible more extensive SNP analysis of co-occurring and co-evolving microbial populations. The program is available at http://genome.lbl.gov/vista/snpvista.

  1. 77 FR 46750 - Agency Information Collection Extension

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-06

    ... Questionnaire Testing, Evaluation, and Research.'' The proposed collection will utilize qualitative and quantitative methodologies to pretest questionnaires and validate EIA survey forms data quality, including..., Evaluation, and Research; (3) Type of Request: Extension, Without Change, of a Previously Approved Collection...

  2. The diagnostic capability of laser induced fluorescence in the characterization of excised breast tissues

    NASA Astrophysics Data System (ADS)

    Galmed, A. H.; Elshemey, Wael M.

    2017-08-01

    Differentiating between normal, benign and malignant excised breast tissues is one of the major worldwide challenges that need a quantitative, fast and reliable technique in order to avoid personal errors in diagnosis. Laser induced fluorescence (LIF) is a promising technique that has been applied for the characterization of biological tissues including breast tissue. Unfortunately, only few studies have adopted a quantitative approach that can be directly applied for breast tissue characterization. This work provides a quantitative means for such characterization via introduction of several LIF characterization parameters and determining the diagnostic accuracy of each parameter in the differentiation between normal, benign and malignant excised breast tissues. Extensive analysis on 41 lyophilized breast samples using scatter diagrams, cut-off values, diagnostic indices and receiver operating characteristic (ROC) curves, shows that some spectral parameters (peak height and area under the peak) are superior for characterization of normal, benign and malignant breast tissues with high sensitivity (up to 0.91), specificity (up to 0.91) and accuracy ranking (highly accurate).

  3. A thorough experimental study of CH/π interactions in water: quantitative structure-stability relationships for carbohydrate/aromatic complexes.

    PubMed

    Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis

    2015-11-13

    CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.

  4. Creating an anthropomorphic digital MR phantom—an extensible tool for comparing and evaluating quantitative imaging algorithms

    NASA Astrophysics Data System (ADS)

    Bosca, Ryan J.; Jackson, Edward F.

    2016-01-01

    Assessing and mitigating the various sources of bias and variance associated with image quantification algorithms is essential to the use of such algorithms in clinical research and practice. Assessment is usually accomplished with grid-based digital reference objects (DRO) or, more recently, digital anthropomorphic phantoms based on normal human anatomy. Publicly available digital anthropomorphic phantoms can provide a basis for generating realistic model-based DROs that incorporate the heterogeneity commonly found in pathology. Using a publicly available vascular input function (VIF) and digital anthropomorphic phantom of a normal human brain, a methodology was developed to generate a DRO based on the general kinetic model (GKM) that represented realistic and heterogeneously enhancing pathology. GKM parameters were estimated from a deidentified clinical dynamic contrast-enhanced (DCE) MRI exam. This clinical imaging volume was co-registered with a discrete tissue model, and model parameters estimated from clinical images were used to synthesize a DCE-MRI exam that consisted of normal brain tissues and a heterogeneously enhancing brain tumor. An example application of spatial smoothing was used to illustrate potential applications in assessing quantitative imaging algorithms. A voxel-wise Bland-Altman analysis demonstrated negligible differences between the parameters estimated with and without spatial smoothing (using a small radius Gaussian kernel). In this work, we reported an extensible methodology for generating model-based anthropomorphic DROs containing normal and pathological tissue that can be used to assess quantitative imaging algorithms.
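
The forward step of the general kinetic model used to synthesize the DRO can be sketched as below. This is a minimal illustration, not the authors' implementation; the biexponential VIF and all parameter values are invented:

```python
import math

# Minimal sketch of synthesizing a tissue enhancement curve from the
# general kinetic (Tofts) model,
#   Ct(t) = Ktrans * integral_0^t Cp(tau) * exp(-kep * (t - tau)) dtau,
# approximated by a discrete Riemann sum. VIF shape and parameters invented.

def vif(t):
    """Hypothetical vascular input function Cp(t), arbitrary units."""
    return 5.0 * (math.exp(-0.2 * t) - math.exp(-2.0 * t)) if t >= 0 else 0.0

def tissue_curve(ktrans, kep, times, dt):
    ct = []
    for t in times:
        n = int(round(t / dt))
        # Riemann-sum approximation of the convolution integral up to t.
        s = sum(vif(i * dt) * math.exp(-kep * (t - i * dt)) for i in range(n + 1))
        ct.append(ktrans * s * dt)
    return ct

dt = 0.1
times = [i * dt for i in range(301)]   # 0 .. 30 time units
ct = tissue_curve(ktrans=0.25, kep=0.5, times=times, dt=dt)
```

Evaluating such curves voxel-by-voxel, with (Ktrans, kep) maps estimated from a clinical exam, yields a synthetic DCE-MRI series of the kind described.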

  5. Quantitative interpretations of Visible-NIR reflectance spectra of blood.

    PubMed

    Serebrennikova, Yulia M; Smith, Jennifer M; Huffman, Debra E; Leparc, German F; García-Rubio, Luis H

    2008-10-27

    This paper illustrates the implementation of a new theoretical model for rapid quantitative analysis of the Vis-NIR diffuse reflectance spectra of blood cultures. This new model is based on the photon diffusion theory and Mie scattering theory, formulated to account for multiple scattering populations and absorptive components. This study stresses the significance of a thorough solution of the scattering and absorption problem in order to accurately resolve the optically relevant parameters of blood culture components. With the advantages of being calibration-free and computationally fast, the new model has two basic requirements. First, wavelength-dependent refractive indices of the basic chemical constituents of blood culture components are needed. Second, multi-wavelength measurements, or at least measurements at a number of characteristic wavelengths equal to the degrees of freedom (i.e., the number of optically relevant parameters) of the blood culture system, are required. The blood culture analysis model was tested with a large number of diffuse reflectance spectra of blood culture samples characterized by an extensive range of the relevant parameters.

  6. Validating Quantitative Measurement Using Qualitative Data: Combining Rasch Scaling and Latent Semantic Analysis in Psychiatry

    NASA Astrophysics Data System (ADS)

    Lange, Rense

    2015-02-01

    An extension of concurrent validity is proposed that uses qualitative data for the purpose of validating quantitative measures. The approach relies on Latent Semantic Analysis (LSA), which places verbal (written) statements in a high-dimensional semantic space. Using data from a medical/psychiatric domain as a case study - Near Death Experiences, or NDE - we established concurrent validity by connecting NDErs' qualitative (written) experiential accounts with their locations on a Rasch scalable measure of NDE intensity. Concurrent validity received strong empirical support since the variance in the Rasch measures could be predicted reliably from the coordinates of their accounts in the LSA derived semantic space (R2 = 0.33). These coordinates also predicted NDErs' age with considerable precision (R2 = 0.25). Both estimates are probably artificially low due to the small available data samples (n = 588). It appears that Rasch scalability of NDE intensity is a prerequisite for these findings, as each intensity level is associated (at least probabilistically) with a well-defined pattern of item endorsements.
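
The LSA step above can be sketched with a truncated SVD. This is a hedged illustration only; the term counts and the choice of two dimensions are invented, not the study's data:

```python
import numpy as np

# Sketch of the LSA step: a term-by-document count matrix is reduced by
# truncated SVD, giving each written account coordinates in a low-
# dimensional semantic space; such coordinates could then serve as
# predictors for the Rasch measures. Counts and k are invented.

counts = np.array([        # rows = terms, columns = accounts
    [2.0, 0.0, 1.0, 0.0],
    [1.0, 1.0, 0.0, 0.0],
    [0.0, 2.0, 1.0, 1.0],
    [0.0, 0.0, 2.0, 1.0],
])

u, s, vt = np.linalg.svd(counts, full_matrices=False)
k = 2                                     # keep the two largest semantic axes
doc_coords = (np.diag(s[:k]) @ vt[:k]).T  # one coordinate row per account
```

Regressing a Rasch measure on `doc_coords` is the kind of prediction that produced the R2 values quoted above.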

  7. Identification of appropriate reference genes for human mesenchymal stem cell analysis by quantitative real-time PCR.

    PubMed

    Li, Xiuying; Yang, Qiwei; Bai, Jinping; Xuan, Yali; Wang, Yimin

    2015-01-01

    Normalization to a reference gene is the method of choice for quantitative reverse transcription-PCR (RT-qPCR) analysis. The stability of reference genes is critical for accurate experimental results and conclusions. We have evaluated the expression stability of eight commonly used reference genes found in four different human mesenchymal stem cells (MSC). Using geNorm, NormFinder and BestKeeper algorithms, we show that beta-2-microglobulin and peptidyl-prolyl isomerase A were the optimal reference genes for normalizing RT-qPCR data obtained from MSC, whereas the TATA box binding protein was not suitable due to its extensive variability in expression. Our findings emphasize the significance of validating reference genes for qPCR analyses. We offer a short list of reference genes to use for normalization and recommend some commercially-available software programs as a rapid approach to validate reference genes. We also demonstrate that the two most frequently used reference genes, β-actin and glyceraldehyde-3-phosphate dehydrogenase, are not suitable for normalization in many cases.
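
The idea behind a geNorm-style stability ranking can be sketched roughly as follows. This is a simplified illustration of the pairwise-variation concept, not the geNorm implementation, and the expression values are invented (B2M/PPIA constructed to co-vary, TBP to fluctuate):

```python
import math
import statistics

# Rough sketch of a geNorm-style stability measure M: for each candidate
# reference gene, take the standard deviation of the log2 expression ratios
# against every other candidate, then average. Lower M = more stable.

def stability_M(expr, gene):
    sds = []
    for other in expr:
        if other == gene:
            continue
        ratios = [math.log2(a / b) for a, b in zip(expr[gene], expr[other])]
        sds.append(statistics.stdev(ratios))
    return sum(sds) / len(sds)

expr = {   # gene -> expression across four samples (arbitrary, invented units)
    "B2M":  [10.0, 12.0, 11.0, 10.5],
    "PPIA": [20.0, 24.0, 22.0, 21.0],
    "TBP":  [5.0, 15.0, 2.0, 9.0],
}
m_values = {g: stability_M(expr, g) for g in expr}
most_stable = min(m_values, key=m_values.get)
```

With these invented numbers the erratic TBP scores the worst M, mirroring the abstract's finding that TBP was unsuitable.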

  8. Compartmentalized microchannel array for high-throughput analysis of single cell polarized growth and dynamics

    DOE PAGES

    Geng, Tao; Bredeweg, Erin L.; Szymanski, Craig J.; ...

    2015-11-04

    Interrogating polarized growth is technologically challenging due to extensive cellular branching and uncontrollable environmental conditions in conventional assays. Here we present a robust and high-performance microfluidic system that enables observations of polarized growth with enhanced temporal and spatial control over prolonged periods. The system has built-in tunability and versatility to accommodate a variety of science applications requiring precisely controlled environments. Using the model filamentous fungus, Neurospora crassa, this microfluidic system enabled direct visualization and analysis of cellular heterogeneity in a clonal fungal cell population, nuclear distribution and dynamics at the subhyphal level, and quantitative dynamics of gene expression with single hyphal compartment resolution in response to carbon source starvation and exchange experiments. Although the microfluidic device is demonstrated on filamentous fungi, our technology is immediately extensible to a wide array of other biosystems that exhibit similar polarized cell growth, with applications ranging from bioenergy production to human health.

  9. QGene 4.0, an extensible Java QTL-analysis platform.

    PubMed

    Joehanes, Roby; Nelson, James C

    2008-12-01

    Of many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.

  10. Definition of air quality measurements for monitoring space shuttle launches

    NASA Technical Reports Server (NTRS)

    Thorpe, R. D.

    1978-01-01

    A description of a recommended air quality monitoring network to characterize the impact of space shuttle launch operations on ambient air quality in the Kennedy Space Center (KSC) area is given. Analysis of ground cloud processes and prevalent meteorological conditions indicates that transient HCl depositions can be a cause for concern. The system designed to monitor HCl employs an extensive network of inexpensive detectors combined with a central analysis device. An acid rain network is also recommended. A quantitative measure of projected minimal long-term impact involves the limited monitoring of NOx and particulates. All recommended monitoring is confined to KSC property.

  11. 3-D interactive visualisation tools for HI spectral line imaging

    NASA Astrophysics Data System (ADS)

    van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.

    2017-06-01

    Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.

  12. Quantitative proteomic characterization of the lung extracellular matrix in chronic obstructive pulmonary disease and idiopathic pulmonary fibrosis.

    PubMed

    Åhrman, Emma; Hallgren, Oskar; Malmström, Lars; Hedström, Ulf; Malmström, Anders; Bjermer, Leif; Zhou, Xiao-Hong; Westergren-Thorsson, Gunilla; Malmström, Johan

    2018-03-01

    Remodeling of the extracellular matrix (ECM) is a common feature in lung diseases such as chronic obstructive pulmonary disease (COPD) and idiopathic pulmonary fibrosis (IPF). Here, we applied a sequential tissue extraction strategy to describe disease-specific remodeling of human lung tissue in disease, using end-stages of COPD and IPF. Our strategy was based on quantitative comparison of the disease proteomes, with specific focus on the matrisome, using data-independent acquisition and targeted data analysis (SWATH-MS). Our work provides an in-depth proteomic characterization of human lung tissue during impaired tissue remodeling. In addition, we show important quantitative and qualitative effects of the solubility of matrisome proteins. COPD was characterized by a disease-specific increase in ECM regulators, metalloproteinase inhibitor 3 (TIMP3) and matrix metalloproteinase 28 (MMP-28), whereas for IPF, impairment in cell adhesion proteins, such as collagen VI and laminins, was most prominent. For both diseases, we identified increased levels of proteins involved in the regulation of endopeptidase activity, with several proteins belonging to the serpin family. The established human lung quantitative proteome inventory and the construction of a tissue-specific protein assay library provides a resource for future quantitative proteomic analyses of human lung tissues. We present a sequential tissue extraction strategy to determine changes in extractability of matrisome proteins in end-stage COPD and IPF compared to healthy control tissue. Extensive quantitative analysis of the proteome changes of the disease states revealed altered solubility of matrisome proteins involved in ECM regulators and cell-ECM communication. The results highlight disease-specific remodeling mechanisms associated with COPD and IPF. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Intramolecular carbon and nitrogen isotope analysis by quantitative dry fragmentation of the phenylurea herbicide isoproturon in a combined injector/capillary reactor prior to GC separation.

    PubMed

    Penning, Holger; Elsner, Martin

    2007-11-01

    Potentially, compound-specific isotope analysis may provide unique information on the source and fate of pesticides in natural systems. Yet for isotope analysis, LC-based methods that rely on organic solvents often cannot be used, and GC-based analysis is frequently not possible due to thermolability of the analyte. A typical example of a compound with such properties is isoproturon (3-(4-isopropylphenyl)-1,1-dimethylurea), which belongs to the phenylurea herbicides, a class used extensively worldwide. To make isoproturon accessible to carbon and nitrogen isotope analysis, we developed a GC-based method in which isoproturon was quantitatively fragmented to dimethylamine and 4-isopropylphenylisocyanate. Fragmentation occurred only partially in the injector and was mainly achieved on a heated capillary column. The fragments were then chromatographically separated and individually measured by isotope ratio mass spectrometry. The reliability of the method was tested in hydrolysis experiments with three isotopically different batches of isoproturon. For all three products, the same isotope fractionation factors were observed during conversion, and the difference in isotope composition between the batches was preserved. This study demonstrates that fragmentation of phenylurea herbicides not only makes them accessible to isotope analysis but even enables determination of intramolecular isotope fractionation.
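
Isotope fractionation during degradation of the kind measured above is commonly described by the Rayleigh equation; the sketch below illustrates that relationship with invented numbers, not the study's data:

```python
import math

# Hedged sketch: isotope enrichment of a residual substrate during
# degradation, per the Rayleigh equation
#   delta_t = (delta_0 + 1000) * f**(epsilon / 1000) - 1000,
# with f the remaining substrate fraction and epsilon the enrichment
# factor (per mil). All values below are invented.

def rayleigh_delta(delta_0, epsilon, f):
    """Isotope signature of the residual substrate at remaining fraction f."""
    return (delta_0 + 1000.0) * f ** (epsilon / 1000.0) - 1000.0

delta_0 = -30.0     # initial carbon isotope signature (per mil)
epsilon = -5.0      # enrichment factor (per mil); negative = normal effect
residual = rayleigh_delta(delta_0, epsilon, f=0.5)   # heavier than delta_0
```

Fitting epsilon to measured (f, delta_t) pairs is how fractionation factors such as those compared across the three batches are typically obtained.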

  14. Qualitative and Quantitative Analysis for Facial Complexion in Traditional Chinese Medicine

    PubMed Central

    Zhao, Changbo; Li, Guo-zheng; Li, Fufeng; Wang, Zhi; Liu, Chang

    2014-01-01

    Facial diagnosis is an important and very intuitive diagnostic method in Traditional Chinese Medicine (TCM). However, due to its qualitative and experience-based subjective property, traditional facial diagnosis has a certain limitation in clinical medicine. The computerized inspection method provides classification models to recognize facial complexion (including color and gloss). However, previous works only study the classification problem of facial complexion, which we regard as qualitative analysis. Quantitative analysis of the severity or degree of facial complexion has not yet been reported. This paper aims to make both qualitative and quantitative analyses of facial complexion. We propose a novel feature representation of facial complexion from the whole face of patients. The features are established with four chromaticity bases split up by luminance distribution in CIELAB color space. Chromaticity bases are constructed from the facial dominant color using two-level clustering; the optimal luminance distribution is simply implemented with experimental comparisons. The features are shown to be more distinctive than previous facial complexion feature representations. Complexion recognition proceeds by training an SVM classifier with the optimal model parameters. In addition, further improved features are obtained by the weighted fusion of five local regions. Extensive experimental results show that the proposed features achieve the highest facial color recognition performance, with a total accuracy of 86.89%. Furthermore, the proposed recognition framework can analyze both the color and gloss degrees of facial complexion by learning a ranking function. PMID:24967342

  15. Digital storage and analysis of color Doppler echocardiograms

    NASA Technical Reports Server (NTRS)

    Chandra, S.; Thomas, J. D.

    1997-01-01

    Color Doppler flow mapping has played an important role in clinical echocardiography. Most of the clinical work, however, has been primarily qualitative. Although qualitative information is very valuable, there is considerable quantitative information stored within the velocity map that has not been extensively exploited so far. Recently, many researchers have shown interest in using the encoded velocities to address clinical problems such as quantification of valvular regurgitation, calculation of cardiac output, and characterization of ventricular filling. In this article, we review some basic physics and engineering aspects of color Doppler echocardiography, as well as drawbacks of trying to retrieve velocities from videotape data. Digital storage, which plays a critical role in performing quantitative analysis, is discussed in some detail with special attention to velocity encoding in DICOM 3.0 (the medical image storage standard) and the use of digital compression. Lossy compression can considerably reduce file size with minimal loss of information (mostly redundant); this is critical for digital storage because of the enormous amount of data generated (a 10-minute study could require 18 Gigabytes of storage capacity). Lossy JPEG compression and its impact on quantitative analysis have been studied, showing that images compressed at 27:1 using the JPEG algorithm compare favorably with directly digitized video images, the current gold standard. Some potential applications of these velocities in analyzing the proximal convergence zones, mitral inflow, and some areas of future development are also discussed in the article.
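
The storage figures quoted above combine directly, as a back-of-envelope check:

```python
# Back-of-envelope check of the storage figures quoted above: applying the
# studied 27:1 lossy JPEG ratio to an 18-gigabyte, 10-minute study.
uncompressed_gb = 18.0
compression_ratio = 27.0
compressed_gb = uncompressed_gb / compression_ratio   # roughly 0.67 GB
```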

  16. Quantitative analysis of the patellofemoral motion pattern using semi-automatic processing of 4D CT data.

    PubMed

    Forsberg, Daniel; Lindblom, Maria; Quick, Petter; Gauffin, Håkan

    2016-09-01

    To present a semi-automatic method with minimal user interaction for quantitative analysis of the patellofemoral motion pattern. 4D CT data capturing the patellofemoral motion pattern of a continuous flexion and extension were collected for five patients prone to patellar luxation, both pre- and post-surgically. For the proposed method, an observer places landmarks in a single 3D volume, which are then automatically propagated to the other volumes in the time sequence. From the landmarks in each volume, the measures patellar displacement, patellar tilt and angle between femur and tibia were computed. Evaluation of the observer variability showed the proposed semi-automatic method to be favorable over a fully manual counterpart, with an observer variability of approximately 1.5° for the angle between femur and tibia, 1.5 mm for the patellar displacement, and 4.0°-5.0° for the patellar tilt. The proposed method showed that surgery reduced the patellar displacement and tilt at maximum extension by approximately 10-15 mm and 15°-20° for three patients, but with less evident differences for two of the patients. A semi-automatic method suitable for quantification of the patellofemoral motion pattern as captured by 4D CT data has been presented. Its observer variability is on par with that of other methods, but with the distinct advantage of supporting continuous motions during the image acquisition.
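
The angle-between-bones measure can be computed from landmark-derived axis vectors as below. This is an illustrative sketch, not the authors' code; the vectors are invented:

```python
import math

# Illustrative sketch: with landmarks propagated to every time point, the
# angle between femur and tibia follows from two bone-axis vectors.
# The landmark-derived vectors below are invented.

def angle_between(u, v):
    """Angle between two 3-D vectors, in degrees."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (norm_u * norm_v)))

femur_axis = (0.0, 0.0, 1.0)
tibia_axis = (0.0, math.sin(math.radians(20.0)), math.cos(math.radians(20.0)))
flexion_angle = angle_between(femur_axis, tibia_axis)   # 20 degrees here
```

Repeating this per volume in the 4D series yields the angle trace over the flexion-extension cycle.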

  17. Illumina whole-genome complementary DNA-mediated annealing, selection, extension and ligation platform: assessing its performance in formalin-fixed, paraffin-embedded samples and identifying invasion pattern-related genes in oral squamous cell carcinoma.

    PubMed

    Loudig, Olivier; Brandwein-Gensler, Margaret; Kim, Ryung S; Lin, Juan; Isayeva, Tatyana; Liu, Christina; Segall, Jeffrey E; Kenny, Paraic A; Prystowsky, Michael B

    2011-12-01

    High-throughput gene expression profiling from formalin-fixed, paraffin-embedded tissues has become a reality, and several methods are now commercially available. The Illumina whole-genome complementary DNA-mediated annealing, selection, extension and ligation assay (Illumina, Inc) is a full-transcriptome version of the original 512-gene complementary DNA-mediated annealing, selection, extension and ligation assay, allowing high-throughput profiling of 24,526 annotated genes from degraded and formalin-fixed, paraffin-embedded RNA. This assay has the potential to allow identification of novel gene signatures associated with clinical outcome using banked archival pathology specimen resources. We tested the reproducibility of the whole-genome complementary DNA-mediated annealing, selection, extension and ligation assay and its sensitivity for detecting differentially expressed genes in RNA extracted from matched fresh and formalin-fixed, paraffin-embedded cells, after 1 and 13 months of storage, using the human breast cell lines MCF7 and MCF10A. Then, using tumor worst pattern of invasion as a classifier, 1 component of the "risk model," we selected 12 formalin-fixed, paraffin-embedded oral squamous cell carcinomas for whole-genome complementary DNA-mediated annealing, selection, extension and ligation assay analysis. We profiled 5 tumors with nonaggressive, nondispersed pattern of invasion, and 7 tumors with aggressive dispersed pattern of invasion and satellites scattered at least 1 mm apart. To minimize variability, the formalin-fixed, paraffin-embedded specimens were prepared from snap-frozen tissues, and RNA was obtained within 24 hours of fixation. One hundred four down-regulated genes and 72 up-regulated genes in tumors with aggressive dispersed pattern of invasion were identified. 
We performed quantitative reverse transcriptase polymerase chain reaction validation of 4 genes using Taqman assays and in situ protein detection of 1 gene by immunohistochemistry. Functional cluster analysis of genes up-regulated in tumors with aggressive pattern of invasion suggests presence of genes involved in cellular cytoarchitecture, some of which already associated with tumor invasion. Identification of these genes provides biologic rationale for our histologic classification, with regard to tumor invasion, and demonstrates that the whole-genome complementary DNA-mediated annealing, selection, extension and ligation assay is a powerful assay for profiling degraded RNA from archived specimens when combined with quantitative reverse transcriptase polymerase chain reaction validation. Copyright © 2011 Elsevier Inc. All rights reserved.

  18. Multivariate Qst–Fst Comparisons: A Neutrality Test for the Evolution of the G Matrix in Structured Populations

    PubMed Central

    Martin, Guillaume; Chapuis, Elodie; Goudet, Jérôme

    2008-01-01

    Neutrality tests in quantitative genetics provide a statistical framework for the detection of selection on polygenic traits in wild populations. However, the existing method based on comparisons of divergence at neutral markers and quantitative traits (Qst–Fst) suffers from several limitations that hinder a clear interpretation of the results with typical empirical designs. In this article, we propose a multivariate extension of this neutrality test based on empirical estimates of the among-populations (D) and within-populations (G) covariance matrices by MANOVA. A simple pattern is expected under neutrality: D = 2Fst/(1 − Fst)G, so that neutrality implies both proportionality of the two matrices and a specific value of the proportionality coefficient. This pattern is tested using Flury's framework for matrix comparison [common principal-component (CPC) analysis], a well-known tool in G matrix evolution studies. We show the importance of using a Bartlett adjustment of the test for the small sample sizes typically found in empirical studies. We propose a dual test: (i) that the proportionality coefficient is not different from its neutral expectation [2Fst/(1 − Fst)] and (ii) that the MANOVA estimates of mean square matrices between and among populations are proportional. These two tests combined provide a more stringent test for neutrality than the classic Qst–Fst comparison and avoid several statistical problems. Extensive simulations of realistic empirical designs suggest that these tests correctly detect the expected pattern under neutrality and have enough power to efficiently detect mild to strong selection (homogeneous, heterogeneous, or mixed) when it is occurring on a set of traits. This method also provides a rigorous and quantitative framework for disentangling the effects of different selection regimes and of drift on the evolution of the G matrix. 
We discuss practical requirements for the proper application of our test in empirical studies and potential extensions. PMID:18245845
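
The neutral expectation stated above, D = 2Fst/(1 − Fst) G, can be illustrated numerically. This sketch fabricates G and Fst so that neutrality holds exactly; the trace-ratio diagnostic is a deliberate simplification of the CPC/Bartlett machinery the paper actually uses:

```python
import numpy as np

# Sketch of the neutral expectation: under neutrality the among-population
# matrix D should equal c * G with c = 2*Fst / (1 - Fst). Values invented.

fst = 0.2
c_neutral = 2.0 * fst / (1.0 - fst)     # expected proportionality coefficient

G = np.array([[2.0, 0.5],
              [0.5, 1.0]])              # within-population covariance matrix
D = c_neutral * G                       # among-population matrix, neutral case

# A crude diagnostic for the coefficient (the real test uses CPC analysis
# with a Bartlett adjustment, not this trace ratio):
c_hat = np.trace(D) / np.trace(G)
```

Departures of the estimated coefficient from c_neutral, or non-proportionality of D and G, are the signatures of selection the dual test is designed to detect.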

  19. Human Spaceflight Architecture Model (HSFAM) Data Dictionary

    NASA Technical Reports Server (NTRS)

    Shishko, Robert

    2016-01-01

    HSFAM is a data model based on the DoDAF 2.02 data model with some purpose-built extensions. These extensions are designed to permit quantitative analyses regarding stakeholder concerns about technical feasibility, configuration and interface issues, and budgetary and/or economic viability.

  20. Macroscopic crack formation and extension in pristine and artificially aged PBX 9501

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Cheng; Thompson, Darla G

    2010-01-01

    A technique has been developed to quantitatively describe macroscopic cracks, both their location and extent, in heterogeneous high explosive and mock materials. By combining this technique with deformation field measurement using digital image correlation (DIC), we observe and measure the initiation, extension, and coalescence of internal cracks during the compression of Brazilian disks made of pristine and artificially aged PBX 9501 high explosives. Our results show quantitatively that aged PBX 9501 is not only weaker but also much more brittle than the pristine material, and is thus more susceptible to macroscopic cracking.
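
The matching step at the heart of DIC can be sketched in one dimension. This is an illustrative simplification (real DIC correlates 2-D image subsets with subpixel interpolation); all intensity values are invented:

```python
import math

# 1-D sketch of the core of digital image correlation (DIC): locate a
# reference intensity subset inside the deformed signal by maximizing
# zero-normalized cross-correlation (ZNCC). Values invented.

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length sequences."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db)

reference = [1.0, 3.0, 7.0, 4.0, 2.0]                 # undeformed subset
deformed = [0.0, 0.0, 1.0, 3.0, 7.0, 4.0, 2.0, 0.0]  # same subset, shifted

scores = [zncc(reference, deformed[i:i + len(reference)])
          for i in range(len(deformed) - len(reference) + 1)]
displacement = scores.index(max(scores))              # best-matching offset
```

Repeating this for a grid of subsets yields the deformation field in which crack openings appear as localized discontinuities.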

  1. Extension of nanoconfined DNA: Quantitative comparison between experiment and theory

    NASA Astrophysics Data System (ADS)

    Iarko, V.; Werner, E.; Nyberg, L. K.; Müller, V.; Fritzsche, J.; Ambjörnsson, T.; Beech, J. P.; Tegenfeldt, J. O.; Mehlig, K.; Westerlund, F.; Mehlig, B.

    2015-12-01

    The extension of DNA confined to nanochannels has been studied intensively and in detail. However, quantitative comparisons between experiments and model calculations are difficult because most theoretical predictions involve undetermined prefactors, and because the model parameters (contour length, Kuhn length, effective width) are difficult to compute reliably, leading to substantial uncertainties. Here we use a recent asymptotically exact theory for the DNA extension in the "extended de Gennes regime" that allows us to compare experimental results with theory. For this purpose, we performed experiments measuring the mean DNA extension and its standard deviation while varying the channel geometry, dye intercalation ratio, and ionic strength of the buffer. The experimental results agree very well with theory at high ionic strengths, indicating that the model parameters are reliable. At low ionic strengths, the agreement is less good. We discuss possible reasons. In principle, our approach allows us to measure the Kuhn length and the effective width of a single DNA molecule and more generally of semiflexible polymers in solution.

  2. A modified ion-selective electrode method for measurement of chloride in sweat.

    PubMed

    Finley, P R; Dye, J A; Lichti, D A; Byers, J M; Williams, R J

    1978-06-01

    A modified method of analysis of sweat chloride concentration with an ion-selective electrode is presented. The original method of sweat chloride analysis proposed by the Orion Research Corporation (Cambridge, Massachusetts 02139) is inadequate because it produces erratic and misleading results. The modified method was compared with the reference quantitative method of Gibson and Cooke. In the modified method, individual electrode pads are cut and placed in the electrodes rather than using the pads supplied by the company; pilocarpine nitrate (2,000 mg/l) is used in place of pilocarpine HCl (640 mg/l); sodium bicarbonate as the weak electrolyte is used instead of K2SO4. A 10-minute period for sweat accumulation is employed rather than a zero-time collection as in the original Orion method. The modification has been studied for reproducibility in individuals, reproducibility between right and left arm in individuals; it has been compared extensively with the quantitative method of Gibson and Cooke, both in normal individuals and in patients with cystic fibrosis. There is excellent agreement between the modified method and the quantitative reference method. There appears to be a slight bias toward higher concentrations of chloride from the right arm compared with the left arm, but this difference is not medically significant.

  3. Meta-analysis is not an exact science: Call for guidance on quantitative synthesis decisions.

    PubMed

    Haddaway, Neal R; Rytwinski, Trina

    2018-05-01

    Meta-analysis is becoming increasingly popular in the field of ecology and environmental management. It increases the effective power of analyses relative to single studies, and allows researchers to investigate effect modifiers and sources of heterogeneity that could not be easily examined within single studies. Many systematic reviewers will set out to conduct a meta-analysis as part of their synthesis, but meta-analysis requires a niche set of skills that are not widely held by the environmental research community. Each step in the process of carrying out a meta-analysis requires decisions that have both scientific and statistical implications. Reviewers are likely to be faced with a plethora of decisions over which effect size to choose, how to calculate variances, and how to build statistical models. Some of these decisions may be simple based on appropriateness of the options. At other times, reviewers must choose between equally valid approaches given the information available to them. This presents a significant problem when reviewers are attempting to conduct a reliable synthesis, such as a systematic review, where subjectivity is minimised and all decisions are documented and justified transparently. We propose three urgent, necessary developments within the evidence synthesis community. Firstly, we call on quantitative synthesis experts to improve guidance on how to prepare data for quantitative synthesis, providing explicit detail to support systematic reviewers. Secondly, we call on journal editors and evidence synthesis coordinating bodies (e.g. CEE) to ensure that quantitative synthesis methods are adequately reported in a transparent and repeatable manner in published systematic reviews. Finally, where faced with two or more broadly equally valid alternative methods or actions, reviewers should conduct multiple analyses, presenting all options, and discussing the implications of the different analytical approaches. 
We believe it is vital to tackle the possible subjectivity in quantitative synthesis described herein to ensure that the extensive efforts expended in producing systematic reviews and other evidence synthesis products are not wasted because of a lack of rigour or reliability in the final synthesis step. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Divergent synthesis and identification of the cellular targets of deoxyelephantopins

    NASA Astrophysics Data System (ADS)

    Lagoutte, Roman; Serba, Christelle; Abegg, Daniel; Hoch, Dominic G.; Adibekian, Alexander; Winssinger, Nicolas

    2016-08-01

    Herbal extracts containing sesquiterpene lactones have been extensively used in traditional medicine and are known to be rich in α,β-unsaturated functionalities that can covalently engage target proteins. Here we report synthetic methodologies to access analogues of deoxyelephantopin, a sesquiterpene lactone with anticancer properties. Using alkyne-tagged cellular probes and quantitative proteomics analysis, we identified several cellular targets of deoxyelephantopin. We further demonstrate that deoxyelephantopin antagonizes PPARγ activity in situ via covalent engagement of a cysteine residue in the zinc-finger motif of this nuclear receptor.

  5. Integration of GC/MS Instrumentation into the Undergraduate Laboratory: Separation and Identification of Fatty Acids in Commercial Fats and Oils

    NASA Astrophysics Data System (ADS)

    Rubinson, Judith F.; Neyer-Hilvert, Jennifer

    1997-09-01

    A laboratory experiment using a gas chromatography/mass selective detection method has been developed for the isolation, identification, and quantitation of fatty acid content of commercial fats and oils. Results for corn, nutmeg, peanut, and safflower oils are compared with literature values, and the results for corn oil are compared for two different trials of the experiment. In addition, a number of variations on the experiment are suggested including possible extension of the experiment for use in an instrumental analysis course.

  6. Photons Revisited

    NASA Astrophysics Data System (ADS)

    Batic, Matej; Begalli, Marcia; Han, Min Cheol; Hauf, Steffen; Hoff, Gabriela; Kim, Chan Hyeong; Kim, Han Sung; Grazia Pia, Maria; Saracco, Paolo; Weidenspointner, Georg

    2014-06-01

    A systematic review of methods and data for the Monte Carlo simulation of photon interactions is in progress: it concerns a wide set of theoretical modeling approaches and data libraries available for this purpose. Models and data libraries are assessed quantitatively with respect to an extensive collection of experimental measurements documented in the literature to determine their accuracy; this evaluation exploits rigorous statistical analysis methods. The computational performance of the associated modeling algorithms is evaluated as well. An overview of the assessment of photon interaction models and results of the experimental validation are presented.

  7. Pleiotropy Analysis of Quantitative Traits at Gene Level by Multivariate Functional Linear Models

    PubMed Central

    Wang, Yifan; Liu, Aiyi; Mills, James L.; Boehnke, Michael; Wilson, Alexander F.; Bailey-Wilson, Joan E.; Xiong, Momiao; Wu, Colin O.; Fan, Ruzong

    2015-01-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai–Bartlett trace, Hotelling–Lawley trace, and Wilks’s Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. PMID:25809955
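The three classical statistics named in this abstract (Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda) are all functions of the eigenvalues of HE⁻¹, where H and E are the hypothesis and error sums-of-squares-and-cross-products matrices. The sketch below illustrates those textbook definitions only; it is not the paper's functional-linear-model implementation, and the function name is an assumption.

```python
import numpy as np

def manova_statistics(H, E):
    """Compute the three classical MANOVA statistics from the hypothesis (H)
    and error (E) sums-of-squares-and-cross-products matrices.
    Illustrative sketch of the textbook definitions only."""
    # All three statistics are driven by the eigenvalues of H E^{-1}
    # (equivalently E^{-1} H, which has the same eigenvalues).
    eigvals = np.linalg.eigvals(np.linalg.solve(E, H)).real
    eigvals = np.clip(eigvals, 0.0, None)  # guard tiny negative round-off
    pillai = np.sum(eigvals / (1.0 + eigvals))    # Pillai-Bartlett trace
    hotelling = np.sum(eigvals)                   # Hotelling-Lawley trace
    wilks = np.prod(1.0 / (1.0 + eigvals))        # Wilks's Lambda
    return pillai, hotelling, wilks
```

In the approximate F-test framework described above, each of these statistics is then transformed to an F-distributed quantity with the appropriate degrees of freedom.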

  8. Pleiotropy analysis of quantitative traits at gene level by multivariate functional linear models.

    PubMed

    Wang, Yifan; Liu, Aiyi; Mills, James L; Boehnke, Michael; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao; Wu, Colin O; Fan, Ruzong

    2015-05-01

    In genetics, pleiotropy describes the genetic effect of a single gene on multiple phenotypic traits. A common approach is to analyze the phenotypic traits separately using univariate analyses and combine the test results through multiple comparisons. This approach may lead to low power. Multivariate functional linear models are developed to connect genetic variant data to multiple quantitative traits adjusting for covariates for a unified analysis. Three types of approximate F-distribution tests based on Pillai-Bartlett trace, Hotelling-Lawley trace, and Wilks's Lambda are introduced to test for association between multiple quantitative traits and multiple genetic variants in one genetic region. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and optimal sequence kernel association test (SKAT-O). Extensive simulations were performed to evaluate the false positive rates and power performance of the proposed models and tests. We show that the approximate F-distribution tests control the type I error rates very well. Overall, simultaneous analysis of multiple traits can increase power performance compared to an individual test of each trait. The proposed methods were applied to analyze (1) four lipid traits in eight European cohorts, and (2) three biochemical traits in the Trinity Students Study. The approximate F-distribution tests provide much more significant results than those of F-tests of univariate analysis and SKAT-O for the three biochemical traits. The approximate F-distribution tests of the proposed functional linear models are more sensitive than those of the traditional multivariate linear models that in turn are more sensitive than SKAT-O in the univariate case. The analysis of the four lipid traits and the three biochemical traits detects more association than SKAT-O in the univariate case. © 2015 WILEY PERIODICALS, INC.

  9. Semi-quantitative proteomics of mammalian cells upon short-term exposure to non-ionizing electromagnetic fields.

    PubMed

    Kuzniar, Arnold; Laffeber, Charlie; Eppink, Berina; Bezstarosti, Karel; Dekkers, Dick; Woelders, Henri; Zwamborn, A Peter M; Demmers, Jeroen; Lebbink, Joyce H G; Kanaar, Roland

    2017-01-01

    The potential effects of non-ionizing electromagnetic fields (EMFs), such as those emitted by power-lines (in extremely low frequency range), mobile cellular systems and wireless networking devices (in radio frequency range) on human health have been intensively researched and debated. However, how exposure to these EMFs may lead to biological changes underlying possible health effects is still unclear. To reveal EMF-induced molecular changes, unbiased experiments (without a priori focusing on specific biological processes) with sensitive readouts are required. We present the first proteome-wide semi-quantitative mass spectrometry analysis of human fibroblasts, osteosarcomas and mouse embryonic stem cells exposed to three types of non-ionizing EMFs (ELF 50 Hz, UMTS 2.1 GHz and WiFi 5.8 GHz). We performed controlled in vitro EMF exposures of metabolically labeled mammalian cells followed by reliable statistical analyses of differential protein- and pathway-level regulations using an array of established bioinformatics methods. Our results indicate that less than 1% of the quantitated human or mouse proteome responds to the EMFs by small changes in protein abundance. Further network-based analysis of the differentially regulated proteins did not detect significantly perturbed cellular processes or pathways in human and mouse cells in response to ELF, UMTS or WiFi exposure. In conclusion, our extensive bioinformatics analyses of semi-quantitative mass spectrometry data do not support the notion that the short-time exposures to non-ionizing EMFs have a consistent biologically significant bearing on mammalian cells in culture.

  10. Semi-quantitative proteomics of mammalian cells upon short-term exposure to non-ionizing electromagnetic fields

    PubMed Central

    Laffeber, Charlie; Eppink, Berina; Bezstarosti, Karel; Dekkers, Dick; Woelders, Henri; Zwamborn, A. Peter M.; Demmers, Jeroen; Lebbink, Joyce H. G.; Kanaar, Roland

    2017-01-01

    The potential effects of non-ionizing electromagnetic fields (EMFs), such as those emitted by power-lines (in extremely low frequency range), mobile cellular systems and wireless networking devices (in radio frequency range) on human health have been intensively researched and debated. However, how exposure to these EMFs may lead to biological changes underlying possible health effects is still unclear. To reveal EMF-induced molecular changes, unbiased experiments (without a priori focusing on specific biological processes) with sensitive readouts are required. We present the first proteome-wide semi-quantitative mass spectrometry analysis of human fibroblasts, osteosarcomas and mouse embryonic stem cells exposed to three types of non-ionizing EMFs (ELF 50 Hz, UMTS 2.1 GHz and WiFi 5.8 GHz). We performed controlled in vitro EMF exposures of metabolically labeled mammalian cells followed by reliable statistical analyses of differential protein- and pathway-level regulations using an array of established bioinformatics methods. Our results indicate that less than 1% of the quantitated human or mouse proteome responds to the EMFs by small changes in protein abundance. Further network-based analysis of the differentially regulated proteins did not detect significantly perturbed cellular processes or pathways in human and mouse cells in response to ELF, UMTS or WiFi exposure. In conclusion, our extensive bioinformatics analyses of semi-quantitative mass spectrometry data do not support the notion that the short-time exposures to non-ionizing EMFs have a consistent biologically significant bearing on mammalian cells in culture. PMID:28234898

  11. A quantitative comparison of corrective and perfective maintenance

    NASA Technical Reports Server (NTRS)

    Henry, Joel; Cain, James

    1994-01-01

    This paper presents a quantitative comparison of corrective and perfective software maintenance activities. The comparison utilizes basic data collected throughout the maintenance process. The data collected are extensive and allow the impact of both types of maintenance to be quantitatively evaluated and compared. Basic statistical techniques test relationships between and among process and product data. The results show interesting similarities and important differences in both process and product characteristics.

  12. Lithologic discrimination and alteration mapping from AVIRIS Data, Socorro, New Mexico

    NASA Technical Reports Server (NTRS)

    Beratan, K. K.; Delillo, N.; Jacobson, A.; Blom, R.; Chapin, C. E.

    1993-01-01

Geologic maps are, by their very nature, interpretive documents. In contrast, images prepared from AVIRIS data can be used as uninterpreted, and thus unbiased, geologic maps. We are having significant success applying AVIRIS data in this non-quantitative manner to geologic problems. Much of our success has come from the power of the Linked Windows Interactive Data System. LinkWinds is a visual data analysis and exploration system under development at JPL, designed to rapidly and interactively investigate large multivariate data sets. In this paper, we present information on the analysis technique, and preliminary results from research on potassium metasomatism, a distinctive and structurally significant type of alteration associated with crustal extension.

  13. Systemic Analysis Approaches for Air Transportation

    NASA Technical Reports Server (NTRS)

    Conway, Sheila

    2005-01-01

Air transportation system designers have had only limited success using traditional operations research and parametric modeling approaches in their analyses of innovations. They need a systemic methodology for modeling of safety-critical infrastructure that is comprehensive, objective, and sufficiently concrete, yet simple enough to be used with reasonable investment. The methodology must also be amenable to quantitative analysis so issues of system safety and stability can be rigorously addressed. However, air transportation has proven to be an extensive, complex system whose behavior is difficult to describe, let alone predict. There is a wide range of system analysis techniques available, but some are more appropriate for certain applications than others. Specifically in the area of complex system analysis, the literature suggests that both agent-based models and network analysis techniques may be useful. This paper discusses the theoretical basis for each approach in these applications, and explores their historic and potential further use for air transportation analysis.

  14. Quantitative Proteome Analysis Reveals Increased Content of Basement Membrane Proteins in Arteries From Patients With Type 2 Diabetes Mellitus and Lower Levels Among Metformin Users.

    PubMed

    Preil, Simone A R; Kristensen, Lars P; Beck, Hans C; Jensen, Pia S; Nielsen, Patricia S; Steiniche, Torben; Bjørling-Poulsen, Marina; Larsen, Martin R; Hansen, Maria L; Rasmussen, Lars M

    2015-10-01

    The increased risk of cardiovascular diseases in type 2 diabetes mellitus has been extensively documented, but the origins of the association remain largely unknown. We sought to determine changes in protein expressions in arterial tissue from patients with type 2 diabetes mellitus and moreover hypothesized that metformin intake influences the protein composition. We analyzed nonatherosclerotic repair arteries gathered at coronary bypass operations from 30 patients with type 2 diabetes mellitus and from 30 age- and sex-matched nondiabetic individuals. Quantitative proteome analysis was performed by isobaric tag for relative and absolute quantitation-labeling and liquid chromatography-mass spectrometry, tandem mass spectrometry analysis on individual arterial samples. The amounts of the basement membrane components, α1-type IV collagen and α2-type IV collagen, γ1-laminin and β2-laminin, were significantly increased in patients with diabetes mellitus. Moreover, the expressions of basement membrane components and other vascular proteins were significantly lower among metformin users when compared with nonusers. Patients treated with or without metformin had similar levels of hemoglobin A1c, cholesterol, and blood pressure. In addition, quantitative histomorphometry showed increased area fractions of collagen-stainable material in tunica intima and media among patients with diabetes mellitus. The distinct accumulation of arterial basement membrane proteins in type 2 diabetes mellitus discloses a similarity between the diabetic macroangiopathy and microangiopathy and suggests a molecular explanation behind the alterations in vascular remodeling, biomechanical properties, and aneurysm formation described in diabetes mellitus. The lower amounts of basement membrane components in metformin-treated individuals are compatible with the hypothesis of direct beneficial drug effects on the matrix composition in the vasculature. © 2015 American Heart Association, Inc.

  15. The calibration of video cameras for quantitative measurements

    NASA Technical Reports Server (NTRS)

    Snow, Walter L.; Childers, Brooks A.; Shortis, Mark R.

    1993-01-01

    Several different recent applications of velocimetry at Langley Research Center are described in order to show the need for video camera calibration for quantitative measurements. Problems peculiar to video sensing are discussed, including synchronization and timing, targeting, and lighting. The extension of the measurements to include radiometric estimates is addressed.

  16. Improving EFL Learners' Reading Levels through Extensive Reading

    ERIC Educational Resources Information Center

    Mermelstein, Aaron David

    2014-01-01

    Today there is an increasing amount of research promoting the effectiveness of extensive reading (ER) towards increasing learners' vocabulary, comprehension, reading speed, and motivation towards reading. However, little has been done to measure the effects of ER on learners' reading levels. This quantitative study examined the effects of ER on…

  17. Errors in quantitative backscattered electron analysis of bone standardized by energy-dispersive x-ray spectrometry.

    PubMed

    Vajda, E G; Skedros, J G; Bloebaum, R D

    1998-10-01

    Backscattered electron (BSE) imaging has proven to be a useful method for analyzing the mineral distribution in microscopic regions of bone. However, an accepted method of standardization has not been developed, limiting the utility of BSE imaging for truly quantitative analysis. Previous work has suggested that BSE images can be standardized by energy-dispersive x-ray spectrometry (EDX). Unfortunately, EDX-standardized BSE images tend to underestimate the mineral content of bone when compared with traditional ash measurements. The goal of this study is to investigate the nature of the deficit between EDX-standardized BSE images and ash measurements. A series of analytical standards, ashed bone specimens, and unembedded bone specimens were investigated to determine the source of the deficit previously reported. The primary source of error was found to be inaccurate ZAF corrections to account for the organic phase of the bone matrix. Conductive coatings, methylmethacrylate embedding media, and minor elemental constituents in bone mineral introduced negligible errors. It is suggested that the errors would remain constant and an empirical correction could be used to account for the deficit. However, extensive preliminary testing of the analysis equipment is essential.

  18. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
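As an illustration of why the normalization stage alone admits several defensible choices, here is a minimal sketch of one common option, per-run median centering of log2 intensities. The function name and the choice of global median centering are assumptions for illustration, not the specific workflows benchmarked in the study.

```python
import numpy as np

def median_normalize(intensity_matrix):
    """Median-center log2 intensities per run (column) so that runs become
    comparable. One of many reasonable normalization choices -- which is
    exactly why benchmark data sets like the ones described are needed.
    Rows are features (peptides), columns are LC-MS runs."""
    log_m = np.log2(intensity_matrix)
    col_medians = np.median(log_m, axis=0)
    # Shift every run so its median matches the grand median across runs.
    return log_m - col_medians + np.median(col_medians)
```

Swapping this step for, say, quantile or spike-in-anchored normalization can change downstream fold-change estimates, which is the kind of processing-dependent variation the benchmark quantifies.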

  19. Presentation Extensions of the SOAP

    NASA Technical Reports Server (NTRS)

    Carnright, Robert; Stodden, David; Coggi, John

    2009-01-01

A set of extensions of the Satellite Orbit Analysis Program (SOAP) enables simultaneous and/or sequential presentation of information from multiple sources. SOAP is used in the aerospace community as a means of collaborative visualization and analysis of data on planned spacecraft missions. The following definitions of terms also describe the display modalities of SOAP as now extended: (a) "View" signifies an animated three-dimensional (3D) scene, two-dimensional still image, plot of numerical data, or any other visible display derived from a computational simulation or other data source; (b) "Viewport" signifies a rectangular portion of a computer-display window containing a view; (c) "Palette" signifies a collection of one or more viewports configured for simultaneous (split-screen) display in the same window; (d) "Slide" signifies a palette with a beginning and ending time and an animation time step; and (e) "Presentation" signifies a prescribed sequence of slides. For example, multiple 3D views from different locations can be crafted for simultaneous display and combined with numerical plots and other representations of data for both qualitative and quantitative analysis. The resulting sets of views can be temporally sequenced to convey visual impressions of a sequence of events for a planned mission.

  20. Label-Free Relative Quantitation of Isobaric and Isomeric Human Histone H2A and H2B Variants by Fourier Transform Ion Cyclotron Resonance Top-Down MS/MS.

    PubMed

    Dang, Xibei; Singh, Amar; Spetman, Brian D; Nolan, Krystal D; Isaacs, Jennifer S; Dennis, Jonathan H; Dalton, Stephen; Marshall, Alan G; Young, Nicolas L

    2016-09-02

    Histone variants are known to play a central role in genome regulation and maintenance. However, many variants are inaccessible by antibody-based methods or bottom-up tandem mass spectrometry due to their highly similar sequences. For many, the only tractable approach is with intact protein top-down tandem mass spectrometry. Here, ultra-high-resolution FT-ICR MS and MS/MS yield quantitative relative abundances of all detected HeLa H2A and H2B isobaric and isomeric variants with a label-free approach. We extend the analysis to identify and relatively quantitate 16 proteoforms from 12 sequence variants of histone H2A and 10 proteoforms of histone H2B from three other cell lines: human embryonic stem cells (WA09), U937, and a prostate cancer cell line LaZ. The top-down MS/MS approach provides a path forward for more extensive elucidation of the biological role of many previously unstudied histone variants and post-translational modifications.

  1. Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.

    PubMed

    He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan

    2009-07-01

    Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the local binary patterns (LBPs) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. Support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
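The resistor-average distance used above has a compact closed form (Johnson and Sinanovic): the two directed KL divergences combined like parallel resistors, RAD(P,Q) = D(P‖Q)·D(Q‖P) / (D(P‖Q) + D(Q‖P)). A minimal sketch for comparing two LBP histograms follows; the epsilon smoothing and log base are assumed implementation details, not taken from the paper.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Discrete KL divergence D(p || q) in bits, with small-epsilon
    smoothing so empty histogram bins do not produce infinities."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log2(p / q)))

def resistor_average_distance(p, q):
    """Resistor-average distance: the two directed KL divergences
    combined like parallel resistors. In the paper this compares LBP
    histograms from the two sides of the face; a perfectly symmetric
    face gives a distance near zero."""
    d_pq, d_qp = kl_divergence(p, q), kl_divergence(q, p)
    if d_pq + d_qp == 0.0:
        return 0.0  # identical histograms: zero distance by convention
    return (d_pq * d_qp) / (d_pq + d_qp)
```

Such per-region distances would then form the feature vector fed to the support vector machine for H-B grading.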

  2. Agency Problems and Airport Security: Quantitative and Qualitative Evidence on the Impact of Security Training.

    PubMed

    de Gramatica, Martina; Massacci, Fabio; Shim, Woohyun; Turhan, Uğur; Williams, Julian

    2017-02-01

    We analyze the issue of agency costs in aviation security by combining results from a quantitative economic model with a qualitative study based on semi-structured interviews. Our model extends previous principal-agent models by combining the traditional fixed and varying monetary responses to physical and cognitive effort with nonmonetary welfare and potentially transferable value of employees' own human capital. To provide empirical evidence for the tradeoffs identified in the quantitative model, we have undertaken an extensive interview process with regulators, airport managers, security personnel, and those tasked with training security personnel from an airport operating in a relatively high-risk state, Turkey. Our results indicate that the effectiveness of additional training depends on the mix of "transferable skills" and "emotional" buy-in of the security agents. Principals need to identify on which side of a critical tipping point their agents are to ensure that additional training, with attached expectations of the burden of work, aligns the incentives of employees with the principals' own objectives. © 2016 Society for Risk Analysis.

  3. NUclear EVacuation Analysis Code (NUEVAC) : a tool for evaluation of sheltering and evacuation responses following urban nuclear detonations.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoshimura, Ann S.; Brandt, Larry D.

    2009-11-01

The NUclear EVacuation Analysis Code (NUEVAC) has been developed by Sandia National Laboratories to support the analysis of shelter-evacuate (S-E) strategies following an urban nuclear detonation. This tool can model a range of behaviors, including complex evacuation timing and path selection, as well as various sheltering or mixed evacuation and sheltering strategies. The calculations are based on externally generated, high resolution fallout deposition and plume data. Scenario setup and calculation outputs make extensive use of graphics and interactive features. This software is designed primarily to produce quantitative evaluations of nuclear detonation response options. However, the outputs have also proven useful in the communication of technical insights concerning shelter-evacuate tradeoffs to urban planning or response personnel.

  4. Amino-Acid Network Clique Analysis of Protein Mutation Non-Additive Effects: A Case Study of Lysozyme.

    PubMed

    Ming, Dengming; Chen, Rui; Huang, He

    2018-05-10

    Optimizing amino-acid mutations in enzyme design has been a very challenging task in modern bio-industrial applications. It is well known that many successful designs often hinge on extensive correlations among mutations at different sites within the enzyme; however, the underpinning mechanism for these correlations is far from clear. Here, we present a topology-based model to quantitatively characterize non-additive effects between mutations. The method is based on molecular dynamics simulations and amino-acid network clique analysis. It examines whether the two sites of a double-site mutation fall into a 3-clique structure, and associates this topological property of the mutation sites' spatial distribution with mutation additivity. We analyzed 13 dual mutations of T4 phage lysozyme and found that the clique-based model successfully distinguishes highly correlated or non-additive double-site mutations from additive ones whose component mutations are less correlated. We also applied the model to the protein Eglin c, whose structural topology is significantly different from that of T4 phage lysozyme, and found that the model can, to some extent, still identify non-additive mutations from additive ones. Our calculations showed that mutation non-additive effects may depend heavily on the structural topology relationship between mutation sites, which can be quantitatively determined using amino-acid network k-cliques. We also showed that double-site mutation correlations can be significantly altered by exerting a third mutation, indicating that more detailed physicochemical interactions should be considered along with the network clique-based model for a better understanding of this elusive mutation-correlation principle.
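The central graph test described above, whether two mutation sites fall into a common 3-clique of the amino-acid network, reduces to a simple condition: the two sites must be in contact and share at least one common neighbor. A minimal sketch on an adjacency-set representation follows; the contact map itself would come from the MD simulations, which are not reproduced here.

```python
def in_common_3clique(contacts, site_a, site_b):
    """Return True if the two mutation sites lie in a common 3-clique:
    they are in direct contact and share at least one neighbor that
    closes a triangle. `contacts` maps each residue to the set of
    residues it contacts (a stand-in for the MD-derived network)."""
    if site_b not in contacts.get(site_a, set()):
        return False  # no edge between the sites: no clique contains both
    common = contacts[site_a] & contacts.get(site_b, set())
    return len(common - {site_a, site_b}) > 0
```

Under the paper's model, a double-site mutation whose sites satisfy this predicate is the kind flagged as likely non-additive.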

  5. Probability Density Functions of Observed Rainfall in Montana

    NASA Technical Reports Server (NTRS)

    Larsen, Scott D.; Johnson, L. Ronald; Smith, Paul L.

    1995-01-01

    The question of whether a rain rate probability density function (PDF) can vary uniformly between precipitation events is examined. Image analysis on large samples of radar echoes is possible because of advances in technology. The data provided by such an analysis readily allow development of distributions of radar reflectivity factors (and, by extension, rain rates). Finding a PDF becomes a matter of finding a function that describes the curve approximating the resulting distributions. Ideally, either one PDF would exist for all cases, or many PDFs would share the same functional form with only systematic variations in parameters (such as size or shape). Satisfying either of these cases would validate the theoretical basis of the Area Time Integral (ATI). Using the method of moments and Elderton's curve selection criteria, the Pearson Type 1 equation was identified as a potential fit for 89% of the observed distributions. Further analysis indicates that the Type 1 curve does approximate the shape of the distributions but quantitatively does not produce a great fit.
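A Pearson Type 1 curve rescaled to the unit interval is a Beta distribution, so the method-of-moments step can be sketched compactly. This is an assumed minimal stand-in: the full Elderton selection criterion used in the paper also involves the third and fourth moments, which are omitted here.

```python
import numpy as np

def beta_method_of_moments(samples):
    """Method-of-moments fit of a Beta distribution -- the Pearson Type 1
    family rescaled to [0, 1]. Matches the sample mean and variance;
    valid only when the variance satisfies v < m(1 - m)."""
    x = np.asarray(samples, dtype=float)
    m, v = x.mean(), x.var()
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common  # (alpha, beta) shape parameters
```

With rain rates rescaled by their observed maximum, fits like this give the parameter estimates whose case-to-case stability the abstract is probing.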

  6. NASA Applications and Lessons Learned in Reliability Engineering

    NASA Technical Reports Server (NTRS)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the year to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses several reliability engineering applications used by NASA in areas such as risk management, inspection policies, components upgrades, reliability growth, integrated failure analysis, and physics based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of these case studies discussed are reliability based life limit extension of Shuttle Space Main Engine (SSME) hardware, Reliability based inspection policies for Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, impact of ET foam reliability on the Space Shuttle System risk, and reliability based Space Shuttle upgrade for safety. Special attention is given in this paper to the physics based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware including their potential use in a research and technology development environment.

  7. Fecal /sup 13/C analysis for the detection and quantitation of intestinal malabsorption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klein, P.D.; MacLean, W.C. Jr.; Watkins, J.B.

    Use of /sup 14/CO/sub 2/ breath tests and fecal analyses for the detection and quantitation of intestinal malabsorption has been extensively documented in adult subjects. The use of radioisotopes has extended the range of breath test applications to include pediatric and geriatric subjects. Here we report a fecal /sup 13/C analysis that can be used in conjunction with /sup 14/CO/sub 2/ breath tests. Twenty-four-hour fecal samples were collected before and after the administration of a labeled substrate. Simultaneous cholylglycine /sup 13/CO/sub 2/ breath tests and fecal assays were performed in five children. One child with bacterial overgrowth had an abnormal breath test and a normal fecal test. Of three children with ileal dysfunction, only one had an abnormal breath test, whereas the fecal test was abnormal in all three. Both the breath test and fecal test were abnormal for a child who had undergone an ileal resection. Both tests were normal for a child with ulcerative colitis.

  8. [Oligonucleotide derivatives in the nucleic acid hybridization analysis. II. Isothermal signal amplification in process of DNA analysis by minisequencing].

    PubMed

    Dmitrienko, E V; Khomiakova, E A; Pyshnaia; Bragin, A G; Vedernikov, V E; Pyshnyĭ, D V

    2010-01-01

    The isothermal amplification of reporter signal via limited probe extension (minisequencing) upon hybridization of nucleic acids has been studied. The intensity of reporter signal has been shown to increase due to enzymatic labeling of multiple probes upon consecutive hybridization with one DNA template both in homophase and heterophase assays using various kinds of detection signal: radioisotope label, fluorescent label, and enzyme-linked assay. The kinetic scheme of the process has been proposed and kinetic parameters for each step have been determined. The signal intensity has been shown to correlate with physicochemical characteristics of both complexes: probe/DNA and product/DNA. The maximum intensity has been observed at minimal difference between the thermodynamic stability of these complexes, provided the reaction temperature has been adjusted near their melting temperature values; rising or lowering the reaction temperature reduces the amount of reporting product. The signal intensity has been shown to decrease significantly upon hybridization with the DNA template containing single-nucleotide mismatches. Limited probe extension assay is useful not only for detection of DNA template but also for its quantitative characterization.

  9. Wound healing revised: A novel reepithelialization mechanism revealed by in vitro and in silico models

    PubMed Central

    Safferling, Kai; Sütterlin, Thomas; Westphal, Kathi; Ernst, Claudia; Breuhahn, Kai; James, Merlin; Jäger, Dirk; Halama, Niels

    2013-01-01

    Wound healing is a complex process in which a tissue’s individual cells have to be orchestrated in an efficient and robust way. We integrated multiplex protein analysis, immunohistochemical analysis, and whole-slide imaging into a novel medium-throughput platform for quantitatively capturing proliferation, differentiation, and migration in large numbers of organotypic skin cultures comprising epidermis and dermis. Using fluorescent time-lag staining, we were able to infer the source and final destination of keratinocytes in the healing epidermis. This revealed a novel extending shield reepithelialization mechanism, which we confirmed by computational multicellular modeling and perturbation of tongue extension. This work provides a consistent experimental and theoretical model for epidermal wound closure in 3D, negating the previously proposed concepts of epidermal tongue extension and highlighting the so far underestimated role of the surrounding tissue. Based on our findings, epidermal wound closure is a process in which cell behavior is orchestrated by a higher level of tissue control that 2D monolayer assays are not able to capture. PMID:24385489

  10. An image analysis system for near-infrared (NIR) fluorescence lymph imaging

    NASA Astrophysics Data System (ADS)

    Zhang, Jingdan; Zhou, Shaohua Kevin; Xiang, Xiaoyan; Rasmussen, John C.; Sevick-Muraca, Eva M.

    2011-03-01

    Quantitative analysis of lymphatic function is crucial for understanding the lymphatic system and diagnosing the associated diseases. Recently, a near-infrared (NIR) fluorescence imaging system was developed for real-time imaging of lymphatic propulsion by intradermal injection of a microdose of a NIR fluorophore distal to the lymphatics of interest. However, the previous analysis software is underdeveloped, requiring extensive time and effort to analyze a NIR image sequence. In this paper, we develop a number of image processing techniques to automate the data analysis workflow, including an object tracking algorithm to stabilize the subject and remove motion artifacts, an image representation named the flow map to characterize lymphatic flow more reliably, and an automatic algorithm to compute lymph velocity and frequency of propulsion. By integrating all these techniques into a system, the analysis workflow significantly reduces the amount of required user interaction and improves the reliability of the measurement.

  11. Quantitative non-invasive intracellular imaging of Plasmodium falciparum infected human erythrocytes

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2014-05-01

    Malaria is a virulent pathological condition which results in over a million annual deaths. The parasitic agent Plasmodium falciparum has been extensively studied in connection with this epidemic but much remains unknown about its development inside the red blood cell host. Optical and fluorescence imaging are among the two most common procedures for investigating infected erythrocytes but both require the introduction of exogenous contrast agents. In this letter, we present a procedure for the non-invasive in situ imaging of malaria infected red blood cells. The procedure is based on the utilization of simultaneously acquired quantitative phase and independent topography data to extract intracellular information. Our method allows for the identification of the developmental stages of the parasite and facilitates in situ analysis of the morphological changes associated with the progression of this disease. This information may assist in the development of efficacious treatment therapies for this condition.

  12. Natural extension of fast-slow decomposition for dynamical systems

    NASA Astrophysics Data System (ADS)

    Rubin, J. E.; Krauskopf, B.; Osinga, H. M.

    2018-01-01

    Modeling and parameter estimation to capture the dynamics of physical systems are often challenging because many parameters can range over orders of magnitude and are difficult to measure experimentally. Moreover, selecting a suitable model complexity requires a sufficient understanding of the model's potential use, such as highlighting essential mechanisms underlying qualitative behavior or precisely quantifying realistic dynamics. We present an approach that can guide model development and tuning to achieve desired qualitative and quantitative solution properties. It relies on the presence of disparate time scales and employs techniques of separating the dynamics of fast and slow variables, which are well known in the analysis of qualitative solution features. We build on these methods to show how it is also possible to obtain quantitative solution features by imposing designed dynamics for the slow variables in the form of specified two-dimensional paths in a bifurcation-parameter landscape.
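
    As a hedged illustration (a standard form from the fast-slow literature, not an equation taken from this paper), the separation of time scales referred to above is typically written as

```latex
\dot{x} = f(x, y), \qquad \dot{y} = \varepsilon \, g(x, y), \qquad 0 < \varepsilon \ll 1,
```

    where $x$ collects the fast variables and $y$ the slow variables: the fast dynamics relax with $y$ effectively frozen, while the slow drift stays near the critical manifold $\{f(x,y) = 0\}$. In this notation, the designed two-dimensional paths mentioned in the abstract prescribe how the slow variables traverse a bifurcation diagram of the fast subsystem.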

  13. Boiling points of halogenated aliphatic compounds: a quantitative structure-property relationship for prediction and validation.

    PubMed

    Oberg, Tomas

    2004-01-01

    Halogenated aliphatic compounds have many technical uses, but substances within this group are also ubiquitous environmental pollutants that can affect the ozone layer and contribute to global warming. The establishment of quantitative structure-property relationships is of interest not only to fill in gaps in the available database but also to validate experimental data already acquired. The three-dimensional structures of 240 compounds were modeled with molecular mechanics prior to the generation of empirical descriptors. Two bilinear projection methods, principal component analysis (PCA) and partial-least-squares regression (PLSR), were used to identify outliers. PLSR was subsequently used to build a multivariate calibration model by extracting the latent variables that describe most of the covariation between the molecular structure and the boiling point. Boiling points were also estimated with an extension of the group contribution method of Stein and Brown.

  14. Animal versus human oral drug bioavailability: Do they correlate?

    PubMed Central

    Musther, Helen; Olivares-Morales, Andrés; Hatley, Oliver J.D.; Liu, Bo; Rostami Hodjegan, Amin

    2014-01-01

    Oral bioavailability is a key consideration in the development of drug products, and the use of preclinical species to predict bioavailability in humans has long been debated. In order to clarify whether any correlation between human and animal bioavailability exists, an extensive analysis of published literature data was conducted. Due to the complex nature of bioavailability calculations, inclusion criteria were applied to ensure the integrity of the data. A database of 184 compounds was assembled. Linear regression for the reported compounds indicated no strong or predictive correlations to human data for all species, individually and combined. The lack of correlation in this extended dataset highlights that animal bioavailability is not quantitatively predictive of bioavailability in humans. Although qualitative (high/low bioavailability) indications might be possible, models taking into account species-specific factors that may affect bioavailability are recommended for developing quantitative predictions. PMID:23988844

  15. Ted Hall and the science of biological microprobe X-ray analysis: a historical perspective of methodology and biological dividends.

    PubMed

    Gupta, B L

    1991-06-01

    This review surveys the emergence of electron probe X-ray microanalysis as a quantitative method for measuring the chemical elements in situ. The extension of the method to the biological sciences under the influence of Ted Hall is reviewed. Some classical experiments by Hall and his colleagues in Cambridge, UK, previously unpublished, are described; as are some of the earliest quantitative results from the cryo-sections obtained in Cambridge and elsewhere. The progress of the methodology is critically evaluated from the earliest starts to the present state of the art. Particular attention has been focused on the application of the method in providing fresh insights into the role of ions in cell and tissue physiology and pathology. A comprehensive list of references is included for a further pursuit of the topics by the interested reader.

  16. Focus groups: a useful tool for curriculum evaluation.

    PubMed

    Frasier, P Y; Slatt, L; Kowlowitz, V; Kollisch, D O; Mintzer, M

    1997-01-01

    Focus group interviews have been used extensively in health services program planning, health education, and curriculum planning. However, with the exception of a few reports describing the use of focus groups for a basic science course evaluation and a clerkship's impact on medical students, the potential of focus groups as a tool for curriculum evaluation has not been explored. Focus groups are a valid stand-alone evaluation process, but they are most often used in combination with other quantitative and qualitative methods. Focus groups rely heavily on group interaction, combining elements of individual interviews and participant observation. This article compares the focus group interview with both quantitative and qualitative methods; discusses when to use focus group interviews; outlines a protocol for conducting focus groups, including a comparison of various styles of qualitative data analysis; and offers a case study, in which focus groups evaluated the effectiveness of a pilot preclinical curriculum.

  17. Direct visualization reveals kinetics of meiotic chromosome synapsis

    DOE PAGES

    Rog, Ofer; Dernburg, Abby F.

    2015-03-17

    The synaptonemal complex (SC) is a conserved protein complex that stabilizes interactions along homologous chromosomes (homologs) during meiosis. The SC regulates genetic exchanges between homologs, thereby enabling reductional division and the production of haploid gametes. Here, we directly observe SC assembly (synapsis) by optimizing methods for long-term fluorescence recording in C. elegans. We report that synapsis initiates independently on each chromosome pair at or near pairing centers, specialized regions required for homolog associations. Once initiated, the SC extends rapidly and mostly irreversibly to chromosome ends. Quantitation of SC initiation frequencies and extension rates reveals that initiation is a rate-limiting step in homolog interactions. Eliminating the dynein-driven chromosome movements that accompany synapsis severely retards SC extension, revealing a new role for these conserved motions. This work provides the first opportunity to directly observe and quantify key aspects of meiotic chromosome interactions and will enable future in vivo analysis of germline processes.

  18. On aerodynamic wake analysis and its relation to total aerodynamic drag in a wind tunnel environment

    NASA Astrophysics Data System (ADS)

    Guterres, Rui M.

    The present work was developed with the goal of advancing the state of the art in the application of three-dimensional wake data analysis to the quantification of aerodynamic drag on a body in a low speed wind tunnel environment. Analysis of the existing tools, their strengths and limitations is presented. Improvements to the existing analysis approaches were made. Software tools were developed to integrate the analysis into a practical tool. A comprehensive derivation of the equations needed for drag computations based on three dimensional separated wake data is developed. A set of complete steps ranging from the basic mathematical concept to the applicable engineering equations is presented. An extensive experimental study was conducted. Three representative body types were studied in varying ground effect conditions. A detailed qualitative wake analysis using wake imaging and two and three dimensional flow visualization was performed. Several significant features of the flow were identified and their relation to the total aerodynamic drag established. A comprehensive wake study of this type is shown to be in itself a powerful tool for the analysis of the wake aerodynamics and its relation to body drag. Quantitative wake analysis techniques were developed. Significant post processing and data conditioning tools and precision analysis were developed. The quality of the data is shown to be in direct correlation with the accuracy of the computed aerodynamic drag. Steps are taken to identify the sources of uncertainty. These are quantified when possible and the accuracy of the computed results is seen to significantly improve. When post processing alone does not resolve issues related to precision and accuracy, solutions are proposed. The improved quantitative wake analysis is applied to the wake data obtained. Guidelines are established that will lead to more successful implementation of these tools in future research programs. 
Close attention is paid to implementation issues that are of crucial importance for the accuracy of the results and that are not detailed in the literature. The impact of ground effect on the flows at hand is studied qualitatively and quantitatively. Its impact on the accuracy of the computations, as well as the wall drag incompatibility with the theoretical model followed, is discussed. The newly developed quantitative analysis provides significantly increased accuracy: the aerodynamic drag coefficient is computed to within one percent of the balance-measured value for the best cases.
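
    The core of any wake-survey drag computation is the momentum-deficit integral D = rho * ∬ u (U∞ - u) dA over the survey plane (the textbook form, not the thesis's full three-dimensional derivation). A minimal numerical sketch, with a synthetic Gaussian wake deficit standing in for measured data:

```python
import numpy as np

# Momentum-deficit drag over a survey plane; the Gaussian wake below is
# an assumed illustrative profile, not data from the experiments described.
rho, U_inf = 1.225, 30.0                  # air density [kg/m^3], freestream [m/s]
y = np.linspace(-0.5, 0.5, 201)           # survey-plane coordinates [m]
z = np.linspace(-0.5, 0.5, 201)
Y, Z = np.meshgrid(y, z)

sigma = 0.1                               # assumed wake width [m]
u = U_inf * (1.0 - 0.3 * np.exp(-(Y**2 + Z**2) / (2 * sigma**2)))

dA = (y[1] - y[0]) * (z[1] - z[0])        # grid cell area [m^2]
D = np.sum(rho * u * (U_inf - u)) * dA    # drag from momentum deficit [N]
print(f"drag = {D:.1f} N")
```

    In practice the survey data would first pass through the post-processing and data-conditioning steps the thesis emphasizes, since noise in u enters the integrand directly.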

  19. Music and Suicidality: A Quantitative Review and Extension

    ERIC Educational Resources Information Center

    Stack, Steven; Lester, David; Rosenberg, Jonathan S.

    2012-01-01

    This article provides the first quantitative review of the literature on music and suicidality. Multivariate logistic regression techniques are applied to 90 findings from 21 studies. Investigations employing ecological data on suicide completions are 19.2 times more apt than other studies to report a link between music and suicide. More recent…

  20. The applicability of TaqMan-based quantitative real-time PCR assays for detecting and enumerating Cryptosporidium spp. oocysts in the environment

    EPA Science Inventory

    Molecular detection methods such as PCR have been extensively used to type Cryptosporidium oocysts detected in the environment. More recently, studies have developed quantitative real-time PCR assays for detection and quantification of microbial contaminants in water as well as ...

  1. Urban children and nature: a summary of research on camping and outdoor education

    Treesearch

    William R., Jr. Burch

    1977-01-01

    This paper reports the preliminary findings of an extensive bibliographic search that identified studies of urban children in camp and outdoor education programs. These studies were systematically abstracted and classified as qualitative or quantitative. Twenty-five percent of the abstracted studies were quantitative. The major findings, techniques of study, and policy...

  2. 76 FR 51442 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-18

    ...-Adviser has designed the following quantitative stock selection rules to make allocation decisions and to..., the Sub-Adviser's investment process is quantitative. Based on extensive historical research, the Sub... open-end fund's portfolio composition must be subject to procedures designed to prevent the use and...

  3. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  4. Evaluating the Impact of Cooperative Extension Outreach via Twitter

    ERIC Educational Resources Information Center

    O'Neill, Barbara

    2014-01-01

    Twitter is increasingly being used by Extension educators as a teaching and program-marketing tool. It is not enough, however, to simply use Twitter to disseminate information. Steps must be taken to evaluate program impact with quantitative and qualitative data. This article described the following Twitter evaluation metrics: unique hashtags,…

  5. 3D Image Analysis of Geomaterials using Confocal Microscopy

    NASA Astrophysics Data System (ADS)

    Mulukutla, G.; Proussevitch, A.; Sahagian, D.

    2009-05-01

    Confocal microscopy is one of the most significant advances in optical microscopy of the last century. It is widely used in the biological sciences, but its application to geomaterials lingers due to a number of technical problems. Potentially, the technique can perform non-invasive testing on a laser-illuminated sample that fluoresces, using a unique optical sectioning capability that rejects out-of-focus light reaching the confocal aperture. Fluorescence in geomaterials is commonly induced using epoxy doped with a fluorochrome that is impregnated into the sample to enable discrimination of various features such as void space or material boundaries. However, for many geomaterials this method cannot be used, because they do not naturally fluoresce and because epoxy cannot be impregnated into inaccessible parts of the sample due to lack of permeability. As a result, confocal images of most geomaterials that have not undergone extensive sample preparation are of poor quality and lack the image and edge contrast necessary to apply any commonly used segmentation technique for quantitative study of features such as vesicularity and internal structure. In our present work, we are developing a methodology for quantitative 3D analysis of images of geomaterials collected using a confocal microscope, with a minimal amount of prior sample preparation and no added fluorescence. Two sample geomaterials, a volcanic melt sample and a crystal chip containing fluid inclusions, are used to assess the feasibility of the method. A step-by-step process of image analysis includes application of image filtration to enhance the edges or material interfaces and is based on two segmentation techniques: geodesic active contours and region competition. Both techniques have been applied extensively to the analysis of medical MRI images to segment anatomical structures.
Preliminary analysis suggests that there is distortion in the shapes of the segmented vesicles, vapor bubbles, and void spaces due to the optical measurements, so corrective actions are being explored. This will establish a practical and reliable framework for an adaptive 3D image processing technique for the analysis of geomaterials using confocal microscopy.

  6. Gear Shifting of Quadriceps during Isometric Knee Extension Disclosed Using Ultrasonography.

    PubMed

    Zhang, Shu; Huang, Weijian; Zeng, Yu; Shi, Wenxiu; Diao, Xianfen; Wei, Xiguang; Ling, Shan

    2018-01-01

    Ultrasonography has been widely employed to estimate the morphological changes of muscle during contraction. To further investigate the motion pattern of the quadriceps during isometric knee extension, we studied the relative motion pattern between the femur and quadriceps under ultrasonography. An interesting observation is that, although the force of isometric knee extension can be controlled to change almost linearly, the femur in the simultaneously captured ultrasound video sequences shows several different piecewise moving patterns. It is as if the quadriceps had several forward gear ratios, like a car, while moving from rest toward maximal voluntary contraction (MVC) and then returning to rest. To verify this assumption, we captured several ultrasound video sequences of isometric knee extension and collected the torque/force signal simultaneously. We then extracted the shape of the femur from these ultrasound video sequences using video processing techniques and studied the motion pattern both qualitatively and quantitatively. The phenomenon can be seen more easily via a comparison between the torque signal and the relative spatial distance between the femur and quadriceps. Furthermore, we used cluster analysis techniques to study the process, and the clustering results also provide preliminary support for the conclusion that, during both the ramp increasing and decreasing phases, quadriceps contraction may have several forward gear ratios relative to the femur.
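
    A minimal sketch of the cluster-analysis step, assuming (hypothetically) that the femur-quadriceps relative displacement settles into a few well-separated levels; k-means then recovers the distinct motion regimes. The signal below is synthetic, not the paper's ultrasound data.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in for the relative displacement signal, with three
# piecewise "gear" levels as the abstract describes qualitatively.
rng = np.random.default_rng(3)
signal = np.concatenate([
    rng.normal(0.0, 0.05, 100),   # regime 1: near rest
    rng.normal(1.0, 0.05, 100),   # regime 2: intermediate displacement
    rng.normal(2.0, 0.05, 100),   # regime 3: peak displacement
])

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(signal.reshape(-1, 1))
print(len(set(km.labels_)))       # distinct motion regimes recovered
```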

  7. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1); and the destruction of minimal quantities of sample (µg) compared to traditional solution and/or pyrolysis analyses (mg).

  8. The mass-lifetime relation

    NASA Astrophysics Data System (ADS)

    LoPresto, Michael C.

    2018-05-01

    In a recent "AstroNote," I described a simple exercise on the mass-luminosity relation for main sequence stars as an example of exposing students in a general education science course of lower mathematical level to the use of quantitative skills such as collecting and analyzing data. Here I present another attempt at a meaningful experience for such students that again involves both the gathering and analysis of numerical data and comparison with accepted result, this time on the relationship of the mass and lifetimes of main sequence stars. This experiment can stand alone or be used as an extension of the previous mass-luminosity relationship experiment.

  9. Petri net modelling of biological networks.

    PubMed

    Chaouiya, Claudine

    2007-07-01

    Mathematical modelling is increasingly used to get insights into the functioning of complex biological networks. In this context, Petri nets (PNs) have recently emerged as a promising tool among the various methods employed for the modelling and analysis of molecular networks. PNs come with a series of extensions, which allow different abstraction levels, from purely qualitative to more complex quantitative models. Noteworthily, each of these models preserves the underlying graph, which depicts the interactions between the biological components. This article intends to present the basics of the approach and to foster the potential role PNs could play in the development of the computational systems biology.
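
    A minimal qualitative Petri net, in the sense described above, fits in a few lines: places hold tokens, and a transition fires when its input places carry enough tokens, consuming and producing tokens accordingly. The reaction A + B -> C is a hypothetical example, not drawn from the article.

```python
# Tiny qualitative Petri net: markings are dicts mapping place -> token count.
def enabled(marking, inputs):
    """A transition is enabled when every input place has enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in inputs.items())

def fire(marking, inputs, outputs):
    """Fire a transition: consume input tokens, produce output tokens."""
    if not enabled(marking, inputs):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in inputs.items():
        m[p] -= n
    for p, n in outputs.items():
        m[p] = m.get(p, 0) + n
    return m

# Hypothetical reaction A + B -> C as a single transition.
m0 = {"A": 1, "B": 1, "C": 0}
m1 = fire(m0, inputs={"A": 1, "B": 1}, outputs={"C": 1})
print(m1)  # -> {'A': 0, 'B': 0, 'C': 1}
```

    Quantitative PN extensions attach rates or delays to transitions while preserving this same underlying interaction graph.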

  10. Subsonic/transonic stall flutter investigation of a rotating rig

    NASA Technical Reports Server (NTRS)

    Jutras, R. R.; Fost, R. B.; Chi, R. M.; Beacher, B. F.

    1981-01-01

    Stall flutter is investigated by obtaining detailed quantitative steady and unsteady aerodynamic and aeromechanical measurements in a typical fan rotor. The experimental investigation is made with a 31.3 percent scale model of the Quiet Engine Program Fan C rotor system. Both subsonic/transonic (torsional mode) flutter and supersonic (flexural mode) flutter are investigated. Extensive steady and unsteady data on the blade deformations and the aerodynamic properties surrounding the rotor are acquired while operating in both the steady and flutter modes. Analysis of these data shows that while there may be more than one traveling wave present during flutter, they are all forward traveling waves.

  11. Reference genes for reverse transcription quantitative PCR in canine brain tissue.

    PubMed

    Stassen, Quirine E M; Riemers, Frank M; Reijmerink, Hannah; Leegwater, Peter A J; Penning, Louis C

    2015-12-09

    In the last decade, canine models have been used extensively to study genetic causes of neurological disorders such as epilepsy and Alzheimer's disease and to unravel their pathophysiological pathways. Reverse transcription quantitative polymerase chain reaction is a sensitive and inexpensive method to study expression levels of genes involved in disease processes. Accurate normalisation with stably expressed, so-called reference genes is crucial for reliable expression analysis. Following the MIQE (minimum information for publication of quantitative real-time PCR experiments) guidelines, the expression of ten frequently used reference genes, namely YWHAZ, HMBS, B2M, SDHA, GAPDH, HPRT, RPL13A, RPS5, RPS19 and GUSB, was evaluated in seven brain regions (frontal lobe, parietal lobe, occipital lobe, temporal lobe, thalamus, hippocampus and cerebellum) and whole brain of healthy dogs. The stability of expression varied between different brain areas. Using the GeNorm and NormFinder software, HMBS, GAPDH and HPRT were the most reliable reference genes for whole brain. Furthermore, based on GeNorm calculations, it was concluded that as few as two to three reference genes are sufficient to obtain reliable normalisation, irrespective of the brain area. Our results amend and extend the limited previously published data on canine brain reference genes. Despite the excellent expression stability of HMBS, GAPDH and HPRT, evaluation of the expression stability of reference genes must be a standard and integral part of experimental design and subsequent data analysis.
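
    The GeNorm stability measure used above can be sketched as follows: for each candidate reference gene, M is the average standard deviation of its pairwise log2 expression ratios with every other candidate, and a lower M means more stable expression. The expression matrix below is synthetic, not the canine brain data.

```python
import numpy as np

def genorm_m(expr):
    """GeNorm-style stability M for each column of an
    (n_samples, n_genes) array of relative expression values."""
    logs = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        # stdev of the log2 ratio of gene j to each other candidate
        sds = [np.std(logs[:, j] - logs[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)
    return m

# Synthetic candidates: two tightly stable genes and one noisy gene.
rng = np.random.default_rng(0)
stable = 2.0 ** rng.normal(0.0, 0.05, size=(40, 2))
noisy = 2.0 ** rng.normal(0.0, 1.0, size=(40, 1))
m = genorm_m(np.hstack([stable, noisy]))
print(m.argmax())  # the unstable gene (column 2) gets the highest M
```

    GeNorm proper iterates this, discarding the worst gene and recomputing M until the most stable pair remains.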

  12. Quantitative phosphoproteome on the silkworm (Bombyx mori) cells infected with baculovirus.

    PubMed

    Shobahah, Jauharotus; Xue, Shengjie; Hu, Dongbing; Zhao, Cui; Wei, Ming; Quan, Yanping; Yu, Wei

    2017-06-19

    Bombyx mori has become an important model organism for many fundamental studies. Bombyx mori nucleopolyhedrovirus (BmNPV) is a significant pathogen to Bombyx mori, yet also an efficient vector for recombinant protein production. A previous study indicated that acetylation plays many vital roles in several cellular processes of Bombyx mori, while the global phosphorylation pattern upon BmNPV infection remains elusive. Employing tandem mass tag (TMT) labeling and phosphorylation affinity enrichment followed by high-resolution LC-MS/MS analysis and intensive bioinformatics analysis, the quantitative phosphoproteome in Bombyx mori cells infected by BmNPV at 24 hpi with an MOI of 10 was extensively examined. In total, 6480 phosphorylation sites in 2112 protein groups were identified, among which 4764 sites in 1717 proteins were quantified. Among the quantified proteins, 81 up-regulated and 25 down-regulated sites were identified with significant criteria (a quantitative ratio above 1.3 was considered up-regulation and below 0.77 down-regulation) and with significant p-value (p < 0.05). Some proteins of BmNPV were also hyperphosphorylated during infection, such as P6.9, 39 K, LEF-6, Ac58-like protein, Ac82-like protein and BRO-D. The phosphorylated proteins were primarily involved in several specific functions, of which we focused on binding activity, protein synthesis, viral replication and apoptosis through kinase activity.

  13. A Taylor weak-statement algorithm for hyperbolic conservation laws

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Kim, J. W.

    1987-01-01

    Finite element analysis, applied to computational fluid dynamics (CFD) problem classes, presents a formal procedure for establishing the ingredients of a discrete approximation numerical solution algorithm. A classical Galerkin weak-statement formulation, formed on a Taylor series extension of the conservation law system, is developed herein that embeds a set of parameters eligible for constraint according to specification of suitable norms. The derived family of Taylor weak statements is shown to contain, as special cases, over one dozen independently derived CFD algorithms published over the past several decades for the high speed flow problem class. A theoretical analysis is completed that facilitates direct qualitative comparisons. Numerical results for definitive linear and nonlinear test problems permit direct quantitative performance comparisons.

  14. Volumetric neuroimage analysis extensions for the MIPAV software package.

    PubMed

    Bazin, Pierre-Louis; Cuzzocreo, Jennifer L; Yassa, Michael A; Gandler, William; McAuliffe, Matthew J; Bassett, Susan S; Pham, Dzung L

    2007-09-15

    We describe a new collection of publicly available software tools for performing quantitative neuroimage analysis. The tools perform semi-automatic brain extraction, tissue classification, Talairach alignment, and atlas-based measurements within a user-friendly graphical environment. They are implemented as plug-ins for MIPAV, a freely available medical image processing software package from the National Institutes of Health. Because the plug-ins and MIPAV are implemented in Java, both can be utilized on nearly any operating system platform. In addition to the software plug-ins, we have also released a digital version of the Talairach atlas that can be used to perform regional volumetric analyses. Several studies are conducted applying the new tools to simulated and real neuroimaging data sets.

  15. Cell behaviors underlying notochord formation and extension in avian embryos: quantitative and immunocytochemical studies.

    PubMed

    Sausedo, R A; Schoenwolf, G C

    1993-09-01

    Formation and extension of the notochord is one of the earliest and most obvious events of axis development in vertebrate embryos. In birds, prospective notochord cells arise from Hensen's node and come to lie beneath the midline of the neural plate, where they assist in the process of neurulation and initiate the dorsoventral patterning of the neural tube through sequential inductive interactions. In the present study, we examined notochord development in avian embryos with quantitative and immunological procedures. Extension of the notochord occurs principally through accretion, that is, the addition of cells to its caudal end, a process that involves considerable cell rearrangement at the notochord-Hensen's node interface. In addition, cell division and cell rearrangement within the notochord proper contribute to notochord extension. Thus, extension of the notochord occurs in a manner that is significantly different from that of the adjacent, overlying, midline region of the neural plate (i.e., the median hinge-point region or future floor plate of the neural tube), which, as shown in a previous study from our laboratory (Schoenwolf and Alvarez: Development 106:427-439, 1989), extends caudally as its cells undergo two rounds of mediolateral cell-cell intercalation and two to three rounds of cell division.

  16. Pre-Service Teachers' Development of Technological Pedagogical Content Knowledge (TPACK) in the Context of a Secondary Science Teacher Education Program

    ERIC Educational Resources Information Center

    Habowski, Thomas; Mouza, Chrystalla

    2014-01-01

    This study investigates pre-service teachers' TPACK development in a secondary science teacher education program that combined a content-specific technology integration course with extensive field experience. Both quantitative and qualitative data were collected. Quantitative data were collected through a pre-post administration of the…

  17. Global spectral graph wavelet signature for surface analysis of carpal bones

    NASA Astrophysics Data System (ADS)

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A.

    2018-02-01

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework performs much better than the global point signature embedding approach for comparing shapes of the carpal bones across populations.
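    The spectral graph wavelet representation mentioned above is conventionally built from the eigenpairs of the discrete Laplace-Beltrami operator. In the standard formulation due to Hammond et al. (the abstract does not spell out the exact GSGW kernel, so the kernel g here is an assumption), a wavelet at scale t centered on vertex i is

```latex
\psi_{t,i}(j) \;=\; \sum_{\ell=0}^{N-1} g(t\,\lambda_\ell)\,\chi_\ell(i)\,\chi_\ell(j),
```

    where N is the number of mesh vertices, (λℓ, χℓ) are the eigenvalues and eigenfunctions of the discrete Laplace-Beltrami operator, and g is a band-pass kernel acting on the spectrum.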

  18. Global spectral graph wavelet signature for surface analysis of carpal bones.

    PubMed

    Masoumi, Majid; Rezaei, Mahsa; Ben Hamza, A

    2018-02-05

    Quantitative shape comparison is a fundamental problem in computer vision, geometry processing and medical imaging. In this paper, we present a spectral graph wavelet approach for shape analysis of the carpal bones of the human wrist. We employ spectral graph wavelets to represent the cortical surface of a carpal bone via the spectral geometric analysis of the Laplace-Beltrami operator in the discrete domain. We propose a global spectral graph wavelet (GSGW) descriptor that is isometry-invariant, efficient to compute, and combines the advantages of both low-pass and band-pass filters. We perform experiments on shapes of the carpal bones of ten women and ten men from a publicly available database of wrist bones. Using one-way multivariate analysis of variance (MANOVA) and permutation testing, we show through extensive experiments that the proposed GSGW framework performs much better than the global point signature embedding approach for comparing shapes of the carpal bones across populations.

  19. Measuring Agricultural Paradigmatic Preferences: The Redevelopment of an Instrument to Determine Individual and Collective Preferences--A Pilot Study

    ERIC Educational Resources Information Center

    Sanagorski, Laura; Murphrey, Theresa Pesl; Lawver, David E.; Baker, Matt; Lindner, James R.

    2013-01-01

    Sustainable agriculture is an area that is gaining momentum. Extension agents are expected to teach production methods that include sustainable agriculture, yet little is known regarding how Extension agents feel about this agricultural paradigm. The research reported here sought to further develop an instrument that could quantitatively measure…

  20. A Quantitative Assessment of an Outsourced Agricultural Extension Service in the Umzimkhulu District of KwaZulu-Natal, South Africa

    ERIC Educational Resources Information Center

    Lyne, Michael C.; Jonas, Nomonde; Ortmann, Gerald F.

    2018-01-01

    Purpose: This study evaluates the impact of an outsourced extension service delivered by Lima Rural Development Foundation (Lima) in the Umzimkhulu district of South Africa. The evaluation is conducted at both the household and program levels. Design/methodology/approach: Household impacts were estimated using two-stage regression with…

  1. ThunderSTORM: a comprehensive ImageJ plug-in for PALM and STORM data analysis and super-resolution imaging

    PubMed Central

    Ovesný, Martin; Křížek, Pavel; Borkovec, Josef; Švindrych, Zdeněk; Hagen, Guy M.

    2014-01-01

    Summary: ThunderSTORM is an open-source, interactive and modular plug-in for ImageJ designed for automated processing, analysis and visualization of data acquired by single-molecule localization microscopy methods such as photo-activated localization microscopy and stochastic optical reconstruction microscopy. ThunderSTORM offers an extensive collection of processing and post-processing methods so that users can easily adapt the process of analysis to their data. ThunderSTORM also offers a set of tools for creation of simulated data and quantitative performance evaluation of localization algorithms using Monte Carlo simulations. Availability and implementation: ThunderSTORM and the online documentation are both freely accessible at https://code.google.com/p/thunder-storm/ Contact: guy.hagen@lf1.cuni.cz Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24771516

  2. Multiplex titration RT-PCR: rapid determination of gene expression patterns for a large number of genes

    NASA Technical Reports Server (NTRS)

    Nebenfuhr, A.; Lomax, T. L.

    1998-01-01

    We have developed an improved method for determination of gene expression levels with RT-PCR. The procedure is rapid and does not require extensive optimization or densitometric analysis. Since the detection of individual transcripts is PCR-based, small amounts of tissue samples are sufficient for the analysis of expression patterns in large gene families. Using this method, we were able to rapidly screen nine members of the Aux/IAA family of auxin-responsive genes and identify those genes which vary in message abundance in a tissue- and light-specific manner. While not offering the accuracy of conventional semi-quantitative or competitive RT-PCR, our method allows quick screening of large numbers of genes in a wide range of RNA samples with just a thermal cycler and standard gel analysis equipment.

  3. Uncertainty Analysis of Radar and Gauge Rainfall Estimates in the Russian River Basin

    NASA Astrophysics Data System (ADS)

    Cifelli, R.; Chen, H.; Willie, D.; Reynolds, D.; Campbell, C.; Sukovich, E.

    2013-12-01

    Radar Quantitative Precipitation Estimation (QPE) has been a very important application of weather radar since it was introduced and made widely available after World War II. Although great progress has been made over the last two decades, it is still a challenging process especially in regions of complex terrain such as the western U.S. It is also extremely difficult to make direct use of radar precipitation data in quantitative hydrologic forecasting models. To improve the understanding of rainfall estimation and distributions in the NOAA Hydrometeorology Testbed in northern California (HMT-West), extensive evaluation of radar and gauge QPE products has been performed using a set of independent rain gauge data. This study focuses on the rainfall evaluation in the Russian River Basin. The statistical properties of the different gridded QPE products will be compared quantitatively. The main emphasis of this study will be on the analysis of uncertainties of the radar and gauge rainfall products that are subject to various sources of error. The spatial variation analysis of the radar estimates is performed by measuring the statistical distribution of the radar base data such as reflectivity and by the comparison with a rain gauge cluster. The application of mean field bias values to the radar rainfall data will also be described. The uncertainty analysis of the gauge rainfall will be focused on the comparison of traditional kriging and conditional bias penalized kriging (Seo 2012) methods. This comparison is performed with the retrospective Multisensor Precipitation Estimator (MPE) system installed at the NOAA Earth System Research Laboratory. The independent gauge set will again be used as the verification tool for the newly generated rainfall products.
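    Mean field bias adjustment, mentioned above, is commonly computed as the ratio of mean gauge accumulation to mean collocated radar accumulation and applied multiplicatively to the radar field. A minimal Python sketch under that convention (the specific HMT/MPE implementation details are not given in the abstract):

```python
def mean_field_bias(gauge_mm, radar_mm, eps=1e-6):
    """Multiplicative mean field bias: total gauge accumulation divided by
    total collocated radar accumulation (pairs with ~zero radar are skipped)."""
    pairs = [(g, r) for g, r in zip(gauge_mm, radar_mm) if r > eps]
    if not pairs:
        return 1.0  # no usable pairs: leave the radar field unchanged
    return sum(g for g, _ in pairs) / sum(r for _, r in pairs)

gauge_mm = [12.0, 8.0, 20.0, 5.0]   # storm-total gauge accumulations (illustrative)
radar_mm = [10.0, 6.0, 16.0, 4.0]   # radar estimates at the gauge pixels
bias = mean_field_bias(gauge_mm, radar_mm)   # 45 / 36 = 1.25
corrected = [bias * r for r in radar_mm]     # bias-adjusted radar field
```

    A single scalar bias corrects systematic over- or under-estimation across the whole domain; it cannot remove the range- and terrain-dependent errors that the kriging comparison in the study addresses.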

  4. Exploring generational cohort work satisfaction in hospital nurses.

    PubMed

    Gordon, Pamela Ann

    2017-07-03

    Purpose Although extensive research exists regarding job satisfaction, many previous studies used a more restrictive, quantitative methodology. The purpose of this qualitative study is to capture the perceptions of hospital nurses within generational cohorts regarding their work satisfaction. Design/methodology/approach A preliminary qualitative, phenomenological study design explored hospital nurses' work satisfaction within generational cohorts - Baby Boomers (1946-1964), Generation X (1965-1980) and Millennials (1981-2000). A South Florida hospital provided the venue for the research. In all, 15 full-time staff nurses, segmented into generational cohorts, participated in personal interviews to determine themes related to seven established factors of work satisfaction: pay, autonomy, task requirements, administration, doctor-nurse relationship, interaction and professional status. Findings An analysis of the transcribed interviews confirmed the importance of the seven factors of job satisfaction. Similarities and differences between the generational cohorts related to a combination of stages of life and generational attributes. Practical implications The results of any qualitative research relate only to the specific venue studied and are not generalizable. However, the information gleaned from this study is transferable and other organizations are encouraged to conduct their own research and compare the results. Originality/value This study is unique, as the seven factors from an extensively used and highly respected quantitative research instrument were applied as the basis for this qualitative inquiry into generational cohort job satisfaction in a hospital setting.

  5. Strain Variation in an Emerging Iridovirus of Warm-Water Fishes

    PubMed Central

    Goldberg, Tony L.; Coleman, David A.; Grant, Emily C.; Inendino, Kate R.; Philipp, David P.

    2003-01-01

    Although iridoviruses vary widely within and among genera with respect to their host range and virulence, variation within iridovirus species has been less extensively characterized. This study explores the nature and extent of intraspecific variation within an emerging iridovirus of North American warm-water fishes, largemouth bass virus (LMBV). Three LMBV isolates recovered from three distinct sources differed genetically and phenotypically. Genetically, the isolates differed in the banding patterns generated from amplified fragment length polymorphism analysis but not in their DNA sequences at two loci of different degrees of evolutionary stability. In vitro, the isolates replicated at identical rates in cell culture, as determined by real-time quantitative PCR of viral particles released into suspension. In vivo, the isolates varied over fivefold in virulence, as measured by the rate at which they induced mortality in juvenile largemouth bass. This variation was reflected in the viral loads of exposed fish, measured using real-time quantitative PCR; the most virulent viral strain also replicated to the highest level in fish. Together, these results justify the designation of these isolates as different strains of LMBV. Strain variation in iridoviruses could help explain why animal populations naturally infected with iridovirus pathogens vary so extensively in their clinical responses to infection. The results of this study are especially relevant to emerging iridoviruses of aquaculture systems and wildlife. PMID:12885900

  6. Amphibolite boudins in marble on Naxos, Greece: 3D analysis of multiphase deformation on a retrograde P-T path

    NASA Astrophysics Data System (ADS)

    Virgo, Simon; von Hagke, Christoph; Urai, Janos L.

    2017-04-01

    Boudins are periodic structures that form by layer-parallel extension in mechanically layered rocks. The characteristics of boudins, such as orientation and geometry, provide constraints on the paleo-stress field as well as the rheology of the rocks during deformation. However, most characterizations of boudinage are based on 2D observations and do not consider the 3-dimensional complexity and potentially non-coaxial polyphase genesis of boudinage structures. In marble quarries in the high-grade complex on Naxos, Greece, we studied spectacular outcrops of amphibolite and pegmatite boudins, in combination with serial slicing of quarried blocks, to reconstruct the 3D boudin structures. We identified five boudin generations, with two distinct generations of early, high-grade pinch-and-swell followed by two generations of brittle shearband and torn boudins formed along the retrograde path under greenschist-facies conditions. The five generations of boudinage indicate that E-W compression is the main mode of deformation in the marbles. The axis of extension changes from subvertical during pinch-and-swell deformation to subhorizontal N-S extension at later stages of deformation. Later phases of boudinage are influenced by existing boudin geometries, producing complex structures in 3D. In 2D section the complexity is not directly apparent and reveals itself only after statistical analysis of long continuous sections. Apart from implications for the regional geology, our findings highlight the importance of 3D characterization of boudinage structures for boudin classification. The insights gained from the analysis of multiphase boudinage structures on Naxos form the basis for quantitative boudin analysis to infer rheology, effective stress, vorticity and strain, and for a boudin classification scheme grounded in a complete mechanical description.

  7. Molecular and agronomic analysis of intraspecific variability in Capsicum baccatum var. pendulum accessions.

    PubMed

    Leite, P S S; Rodrigues, R; Silva, R N O; Pimenta, S; Medeiros, A M; Bento, C S; Gonçalves, L S A

    2016-10-05

    Capsicum baccatum is one of the most important chili peppers in South America, since this region is considered to be the center of origin and diversity of this species. In Brazil, C. baccatum has been widely explored by family farmers and there are different local names for each fruit phenotype, such as cambuci and dedo-de-moça (lady's finger). Although very popular among farmers and consumers, C. baccatum has been less extensively studied than other Capsicum species. This study describes the phenotypic and genotypic variability in C. baccatum var. pendulum accessions. Twenty-nine accessions from the Universidade Estadual do Norte Fluminense Darcy Ribeiro gene bank and one commercial genotype ('BRS-Mari') were evaluated for 53 morphoagronomic descriptors (31 qualitative and 22 quantitative traits). In addition, accessions were genotyped using 30 microsatellite primers. Three accessions from the C. annuum complex were included in the molecular characterization. Nine of the 31 qualitative descriptors were monomorphic, while all quantitative descriptors differed highly significantly between accessions (P < 0.01). Using the unweighted pair group method with arithmetic mean (UPGMA), four groups were obtained based on multicategoric variables and five groups based on quantitative variables. In the genotyping analysis, 12 polymorphic simple sequence repeat primers amplified in C. baccatum, with dissimilarity between accessions ranging from 0.13 to 0.91, permitting the formation of two distinct groups in the Bayesian analysis. These results indicate wide variability among the accessions when comparing phenotypic and genotypic data, and reveal distinct patterns of dissimilarity between matrices, indicating that both steps are valuable for the characterization of C. baccatum var. pendulum accessions.

  8. Structure-Activity Relationships Based on 3D-QSAR CoMFA/CoMSIA and Design of Aryloxypropanol-Amine Agonists with Selectivity for the Human β3-Adrenergic Receptor and Anti-Obesity and Anti-Diabetic Profiles.

    PubMed

    Lorca, Marcos; Morales-Verdejo, Cesar; Vásquez-Velásquez, David; Andrades-Lagos, Juan; Campanini-Salinas, Javier; Soto-Delgado, Jorge; Recabarren-Gajardo, Gonzalo; Mella, Jaime

    2018-05-16

    The wide tissue distribution of the adrenergic β3 receptor makes it a potential target for the treatment of multiple pathologies such as diabetes, obesity, depression, overactive bladder (OAB), and cancer. Currently, there is only one drug on the market, mirabegron, approved for the treatment of OAB. In the present study, we have carried out an extensive structure-activity relationship analysis of a series of 41 aryloxypropanolamine compounds based on three-dimensional quantitative structure-activity relationship (3D-QSAR) techniques. This is the first combined comparative molecular field analysis (CoMFA) and comparative molecular similarity index analysis (CoMSIA) study in a series of selective aryloxypropanolamines displaying anti-diabetes and anti-obesity pharmacological profiles. The best CoMFA and CoMSIA models presented values of r²(ncv) = 0.993 and 0.984 and values of r²(test) = 0.865 and 0.918, respectively. The results obtained were subjected to extensive external validation (q², r², r²m, etc.) and a final series of compounds was designed and their biological activity was predicted (best pEC50 = 8.561).

  9. Quantitative- and Phospho-Proteomic Analysis of the Yeast Response to the Tyrosine Kinase Inhibitor Imatinib to Pharmacoproteomics-Guided Drug Line Extension

    PubMed Central

    dos Santos, Sandra C.; Mira, Nuno P.; Moreira, Ana S.

    2012-01-01

    Abstract Imatinib mesylate (IM) is a potent tyrosine kinase inhibitor used as front-line therapy in chronic myeloid leukemia, a disease caused by the oncogenic kinase Bcr-Abl. Although the clinical success of IM set a new paradigm in molecular-targeted therapy, the emergence of IM resistance is a clinically significant problem. In an effort to obtain new insights into the mechanisms of adaptation and tolerance to IM, as well as the signaling pathways potentially affected by this drug, we performed a two-dimensional electrophoresis-based quantitative- and phospho-proteomic analysis in the eukaryotic model Saccharomyces cerevisiae. We singled out proteins that were either differentially expressed or differentially phosphorylated in response to IM, using the phosphoselective dye Pro-Q® Diamond, and identified 18 proteins in total. Ten were altered only at the content level (mostly decreased), while the remaining 8 possessed IM-repressed phosphorylation. These 18 proteins are mainly involved in cellular carbohydrate processes (glycolysis/gluconeogenesis), translation, protein folding, ion homeostasis, and nucleotide and amino acid metabolism. Remarkably, all 18 proteins have human functional homologs. A role for HSP70 proteins in the response to IM, as well as decreased glycolysis as a metabolic marker of IM action are suggested, consistent with findings from studies in human cell lines. The previously-proposed effect of IM as an inhibitor of vacuolar H+-ATPase function was supported by the identification of an underexpressed protein subunit of this complex. Taken together, these findings reinforce the role of yeast as a valuable eukaryotic model for pharmacological studies and identification of new drug targets, with potential clinical implications in drug reassignment or line extension under a personalized medicine perspective. PMID:22775238

  10. Climate Change Education: Quantitatively Assessing the Impact of a Botanical Garden as an Informal Learning Environment

    ERIC Educational Resources Information Center

    Sellmann, Daniela; Bogner, Franz X.

    2013-01-01

    Although informal learning environments have been studied extensively, ours is one of the first studies to quantitatively assess the impact of learning in botanical gardens on students' cognitive achievement. We observed a group of 10th graders participating in a one-day educational intervention on climate change implemented in a botanical garden.…

  11. Impact of "Grassroots on Work" (GROW) Extension Program to the Bachelor of Arts in Political Science Students' Sense of Civic Responsibility

    ERIC Educational Resources Information Center

    Paga, Mark Leo Huit

    2015-01-01

    Purpose: The purpose of this study was to determine the medium term effect of service-learning program or "Grassroots on Work" extension program to civic responsibility of AB Political Science students. Methodology: This study employed an impact evaluation research design and both qualitative and quantitative. The data on goals and…

  12. An approach to the systematic analysis of urinary steroids

    PubMed Central

    Menini, E.; Norymberski, J. K.

    1965-01-01

    1. Human urine, its extracts, extracts of urine pretreated with enzyme preparations containing β-glucuronidase and steroid sulphatase or β-glucuronidase alone, and products derived from the specific solvolysis of urinary steroid sulphates, were submitted to the following sequence of operations: reduction with borohydride; oxidation with a glycol-cleaving agent (bismuthate or periodate); separation of the products into ketones and others; oxidation of each fraction with tert.-butyl chromate, resolution of the end products by means of paper chromatography or gas–liquid chromatography or both. 2. Qualitative experiments indicated the kind of information the method and some of its modifications can provide. Quantitative experiments were restricted to the direct treatment of urine by the basic procedure outlined. It was partly shown and partly argued that the quantitative results were probably as informative about the composition of the major neutral urinary steroids (and certainly about their presumptive secretory precursors) as those obtained by a number of established analytical procedures. 3. A possible extension of the scope of the reported method was indicated. 4. A simple technique was introduced for the quantitative deposition of a solid sample on to a gas–liquid-chromatographic column. PMID:14333557

  13. Certified Reference Material for Use in 1H, 31P, and 19F Quantitative NMR, Ensuring Traceability to the International System of Units.

    PubMed

    Rigger, Romana; Rück, Alexander; Hellriegel, Christine; Sauermoser, Robert; Morf, Fabienne; Breitruck, Kathrin; Obkircher, Markus

    2017-09-01

    In recent years, quantitative NMR (qNMR) spectroscopy has become one of the most important tools for content determination of organic substances and quantitative evaluation of impurities. Using Certified Reference Materials (CRMs) as internal or external standards, the extensively used qNMR method can be applied for purity determination with unbroken traceability to the International System of Units (SI). The extension of qNMR into new application fields, e.g., metabolomics, environmental analysis, and physiological pathway studies, brings more complex molecules and systems, making the use of 1H qNMR challenging. A convenient workaround is the use of other NMR-active nuclei, namely 31P and 19F. This article presents the development of three classes of qNMR CRMs based on different NMR-active nuclei (1H, 31P, and 19F), and the corresponding approaches to establishing traceability to the SI through primary CRMs from the National Institute of Standards and Technology and the National Metrology Institute of Japan. These TraceCERT® qNMR CRMs are produced under ISO/IEC 17025 and ISO Guide 34 using high-performance qNMR.
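    Purity determination against an internal standard, as enabled by such qNMR CRMs, follows the standard relation P(x) = (I_x/I_std)·(N_std/N_x)·(M_x/M_std)·(m_std/m_x)·P(std). A minimal Python sketch with illustrative numbers (maleic acid as a typical internal standard; the analyte values are hypothetical):

```python
def qnmr_purity(i_x, i_std, n_x, n_std, m_x, m_std, w_x, w_std, p_std):
    """Analyte purity (mass fraction) by internal-standard qNMR.
    i: integrated signal area, n: number of nuclei giving the signal,
    m: molar mass (g/mol), w: weighed mass (mg), p_std: standard purity."""
    return (i_x / i_std) * (n_std / n_x) * (m_x / m_std) * (w_std / w_x) * p_std

# Illustrative numbers: maleic acid (M = 116.07 g/mol, 2 olefinic 1H) as the
# internal standard; analyte mass, molar mass and integrals are hypothetical.
purity = qnmr_purity(i_x=1.605, i_std=1.000, n_x=3, n_std=2,
                     m_x=206.28, m_std=116.07, w_x=20.0, w_std=10.0,
                     p_std=0.999)   # roughly 0.95, i.e. 95% mass fraction
```

    Traceability to the SI then rests on p_std, the certified purity of the CRM used as the standard.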

  14. Widespread age-related differences in the human brain microstructure revealed by quantitative magnetic resonance imaging.

    PubMed

    Callaghan, Martina F; Freund, Patrick; Draganski, Bogdan; Anderson, Elaine; Cappelletti, Marinella; Chowdhury, Rumana; Diedrichsen, Joern; Fitzgerald, Thomas H B; Smittenaar, Peter; Helms, Gunther; Lutti, Antoine; Weiskopf, Nikolaus

    2014-08-01

    A pressing need exists to disentangle age-related changes from pathologic neurodegeneration. This study aims to characterize the spatial pattern and age-related differences of biologically relevant measures in vivo over the course of normal aging. Quantitative multiparameter maps that provide neuroimaging biomarkers for myelination and iron levels, parameters sensitive to aging, were acquired from 138 healthy volunteers (age range: 19-75 years). Whole-brain voxel-wise analysis revealed a global pattern of age-related degeneration. Significant demyelination occurred principally in the white matter. The observed age-related differences in myelination were anatomically specific. In line with invasive histologic reports, higher age-related differences were seen in the genu of the corpus callosum than the splenium. Iron levels were significantly increased in the basal ganglia, red nucleus, and extensive cortical regions but decreased along the superior occipitofrontal fascicle and optic radiation. This whole-brain pattern of age-associated microstructural differences in the asymptomatic population provides insight into the neurobiology of aging. The results help build a quantitative baseline from which to examine and draw a dividing line between healthy aging and pathologic neurodegeneration. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Widespread age-related differences in the human brain microstructure revealed by quantitative magnetic resonance imaging☆

    PubMed Central

    Callaghan, Martina F.; Freund, Patrick; Draganski, Bogdan; Anderson, Elaine; Cappelletti, Marinella; Chowdhury, Rumana; Diedrichsen, Joern; FitzGerald, Thomas H.B.; Smittenaar, Peter; Helms, Gunther; Lutti, Antoine; Weiskopf, Nikolaus

    2014-01-01

    A pressing need exists to disentangle age-related changes from pathologic neurodegeneration. This study aims to characterize the spatial pattern and age-related differences of biologically relevant measures in vivo over the course of normal aging. Quantitative multiparameter maps that provide neuroimaging biomarkers for myelination and iron levels, parameters sensitive to aging, were acquired from 138 healthy volunteers (age range: 19–75 years). Whole-brain voxel-wise analysis revealed a global pattern of age-related degeneration. Significant demyelination occurred principally in the white matter. The observed age-related differences in myelination were anatomically specific. In line with invasive histologic reports, higher age-related differences were seen in the genu of the corpus callosum than the splenium. Iron levels were significantly increased in the basal ganglia, red nucleus, and extensive cortical regions but decreased along the superior occipitofrontal fascicle and optic radiation. This whole-brain pattern of age-associated microstructural differences in the asymptomatic population provides insight into the neurobiology of aging. The results help build a quantitative baseline from which to examine and draw a dividing line between healthy aging and pathologic neurodegeneration. PMID:24656835

  16. A Quantitative Acetylomic Analysis of Early Seed Development in Rice (Oryza sativa L.).

    PubMed

    Wang, Yifeng; Hou, Yuxuan; Qiu, Jiehua; Li, Zhiyong; Zhao, Juan; Tong, Xiaohong; Zhang, Jian

    2017-06-27

    Protein lysine acetylation (PKA) is a critical post-translational modification that regulates various developmental processes, including seed development. However, the acetylation events and their dynamics on a proteomic scale in this process remain largely unknown, especially in rice early seed development. We report the first quantitative acetylproteomic study focused on rice early seed development, employing a mass spectral-based (MS-based), label-free approach. A total of 1817 acetylsites on 1688 acetylpeptides from 972 acetylproteins were identified in pistils and in seeds at three and seven days after pollination, including 268 acetylproteins differentially acetylated among the three stages. Motif-X analysis revealed six significantly enriched motifs, such as (DxkK), (kH) and (kY), around the acetylsites of the identified rice seed acetylproteins. The proteins differentially acetylated among the three stages included adenosine diphosphate (ADP)-glucose pyrophosphorylases (AGPs), PDIL1-1 (protein disulfide isomerase-like 1-1), hexokinases, the pyruvate dehydrogenase complex (PDC) and numerous other regulators extensively involved in the starch and sucrose metabolism, glycolysis/gluconeogenesis, tricarboxylic acid (TCA) cycle and photosynthesis pathways during early seed development. This study greatly expands the rice acetylome dataset and sheds novel insight into the regulatory roles of PKA in rice early seed development.

  17. Kinetic Modeling of Accelerated Stability Testing Enabled by Second Harmonic Generation Microscopy.

    PubMed

    Song, Zhengtian; Sarkar, Sreya; Vogt, Andrew D; Danzer, Gerald D; Smith, Casey J; Gualtieri, Ellen J; Simpson, Garth J

    2018-04-03

The low limits of detection afforded by second harmonic generation (SHG) microscopy, coupled with image analysis algorithms, enabled quantitative modeling of the temperature-dependent crystallization of active pharmaceutical ingredients (APIs) within amorphous solid dispersions (ASDs). ASDs, in which an API is maintained in an amorphous state within a polymer matrix, are finding increasing use to address the solubility limitations of small-molecule APIs. Extensive stability testing is typically performed for ASD characterization, the time frame for which is often dictated by the earliest detectable onset of crystal formation. Here, a study of accelerated stability testing was conducted on ritonavir, a human immunodeficiency virus (HIV) protease inhibitor. Under accelerated stability testing conditions of 50 °C/75% RH and 40 °C/75% RH, ritonavir crystallization kinetics in amorphous solid dispersions were monitored by SHG microscopy. SHG microscopy coupled with image analysis yielded limits of detection for ritonavir crystals as low as 10 ppm, about two orders of magnitude lower than other methods currently available for detecting crystallinity in ASDs. The four-decade dynamic range of SHG microscopy enabled quantitative modeling with an established Johnson-Mehl-Avrami-Kolmogorov (JMAK) kinetic model. From the SHG images, nucleation and crystal growth rates were independently determined.
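The JMAK model named above has a standard closed form that can be fit to crystallized-fraction data. The sketch below is illustrative only (the abstract does not report the fitted parameters); the Avrami linearization used here is a common way to extract the rate constant k and exponent n from SHG-derived crystallinity fractions.

```python
import numpy as np

def jmak_fraction(t, k, n):
    """JMAK (Avrami) crystallized fraction: X(t) = 1 - exp(-k * t^n)."""
    return 1.0 - np.exp(-k * np.asarray(t, dtype=float) ** n)

def fit_jmak(t, X):
    """Recover (k, n) from the Avrami linearization:
    ln(-ln(1 - X)) = n * ln(t) + ln(k), a straight line in ln(t)."""
    y = np.log(-np.log(1.0 - np.asarray(X, dtype=float)))
    n, ln_k = np.polyfit(np.log(np.asarray(t, dtype=float)), y, 1)
    return np.exp(ln_k), n
```

Fitting the measured fraction at each storage condition would yield per-condition (k, n) pairs, from which nucleation- versus growth-dominated crystallization regimes can be distinguished.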

  18. Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.

    PubMed

    Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís

    2016-01-08

In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the full study of the sphingolipidome: its characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. In particular, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved by the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.

  19. Quantitative Proteomic Analysis of Serum Exosomes from Patients with Locally Advanced Pancreatic Cancer Undergoing Chemoradiotherapy.

    PubMed

    An, Mingrui; Lohse, Ines; Tan, Zhijing; Zhu, Jianhui; Wu, Jing; Kurapati, Himabindu; Morgan, Meredith A; Lawrence, Theodore S; Cuneo, Kyle C; Lubman, David M

    2017-04-07

    Pancreatic cancer is the third leading cause of cancer-related death in the USA. Despite extensive research, minimal improvements in patient outcomes have been achieved. Early identification of treatment response and metastasis would be valuable to determine the appropriate therapeutic course for patients. In this work, we isolated exosomes from the serum of 10 patients with locally advanced pancreatic cancer at serial time points over a course of therapy, and quantitative analysis was performed using the iTRAQ method. We detected approximately 700-800 exosomal proteins per sample, several of which have been implicated in metastasis and treatment resistance. We compared the exosomal proteome of patients at different time points during treatment to healthy controls and identified eight proteins that show global treatment-specific changes. We then tested the effect of patient-derived exosomes on the migration of tumor cells and found that patient-derived exosomes, but not healthy controls, induce cell migration, supporting their role in metastasis. Our data show that exosomes can be reliably extracted from patient serum and analyzed for protein content. The differential loading of exosomes during a course of therapy suggests that exosomes may provide novel insights into the development of treatment resistance and metastasis.

  20. Analysis of Sequence Data Under Multivariate Trait-Dependent Sampling.

    PubMed

    Tao, Ran; Zeng, Donglin; Franceschini, Nora; North, Kari E; Boerwinkle, Eric; Lin, Dan-Yu

    2015-06-01

    High-throughput DNA sequencing allows for the genotyping of common and rare variants for genetic association studies. At the present time and for the foreseeable future, it is not economically feasible to sequence all individuals in a large cohort. A cost-effective strategy is to sequence those individuals with extreme values of a quantitative trait. We consider the design under which the sampling depends on multiple quantitative traits. Under such trait-dependent sampling, standard linear regression analysis can result in bias of parameter estimation, inflation of type I error, and loss of power. We construct a likelihood function that properly reflects the sampling mechanism and utilizes all available data. We implement a computationally efficient EM algorithm and establish the theoretical properties of the resulting maximum likelihood estimators. Our methods can be used to perform separate inference on each trait or simultaneous inference on multiple traits. We pay special attention to gene-level association tests for rare variants. We demonstrate the superiority of the proposed methods over standard linear regression through extensive simulation studies. We provide applications to the Cohorts for Heart and Aging Research in Genomic Epidemiology Targeted Sequencing Study and the National Heart, Lung, and Blood Institute Exome Sequencing Project.
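The bias that motivates the likelihood method above is easy to reproduce in a toy simulation. The snippet below is a hypothetical illustration of the problem, not the authors' EM-based estimator: it regresses a quantitative trait on a standardized genotype score in the full cohort versus in a sample that keeps only the trait extremes.

```python
import numpy as np

rng = np.random.default_rng(42)
n, beta = 200_000, 0.3

g = rng.standard_normal(n)             # standardized genotype score
y = beta * g + rng.standard_normal(n)  # quantitative trait with true effect beta

def ols_slope(x, y):
    """Slope of a simple least-squares regression of y on x."""
    x, y = x - x.mean(), y - y.mean()
    return (x @ y) / (x @ x)

# Extreme-trait sampling: keep only the top and bottom deciles of y.
lo, hi = np.quantile(y, [0.10, 0.90])
keep = (y <= lo) | (y >= hi)

full_hat = ols_slope(g, y)                 # close to the true effect
extreme_hat = ols_slope(g[keep], y[keep])  # inflated away from the true effect
```

Standard OLS on the extreme sample overestimates the effect because selection on the outcome distorts the joint distribution of trait and genotype; the likelihood in the abstract explicitly models that sampling mechanism to remove the bias.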

  1. Genetical genomics of Populus leaf shape variation

    DOE PAGES

    Drost, Derek R.; Puranik, Swati; Novaes, Evandro; ...

    2015-06-30

Leaf morphology varies extensively among plant species and is under strong genetic control. Mutagenic screens in model systems have identified genes and established molecular mechanisms regulating leaf initiation, development, and shape. However, it is not known whether this diversity across plant species is related to naturally occurring variation at these genes. Quantitative trait locus (QTL) analysis has revealed polygenic control of leaf shape variation in different species, suggesting that loci discovered by mutagenesis may explain only part of the naturally occurring variation in leaf shape. Here we undertook a genetical genomics study in a poplar intersectional pseudo-backcross pedigree to identify genetic factors controlling leaf shape. The approach combined QTL discovery in a genetic linkage map anchored to the Populus trichocarpa reference genome sequence with transcriptome analysis.

  2. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy.

    PubMed

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-11-19

Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformational changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements of the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis of the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method by quantifying three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.
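For contrast with the EEM approach, the established donor-lifetime analysis the abstract refers to reduces to two textbook relations. The helper functions below are a generic sketch with names of my own choosing, not code from the paper.

```python
def fret_efficiency_from_lifetimes(tau_da_ns, tau_d_ns):
    """Donor-lifetime FRET readout: E = 1 - tau_DA / tau_D,
    where tau_DA is the donor lifetime with the acceptor present."""
    return 1.0 - tau_da_ns / tau_d_ns

def fret_efficiency_from_distance(r_nm, r0_nm):
    """Forster distance dependence: E = 1 / (1 + (r / R0)^6)."""
    return 1.0 / (1.0 + (r_nm / r0_nm) ** 6)
```

Free donor fluorophores bias the apparent tau_DA toward the unquenched lifetime, which is exactly the contamination that the FRET-specific EEM channel described above avoids.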

  3. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    PubMed Central

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformational changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency-domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements of the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis of the non-FRET channels, i.e. the donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method by quantifying three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  4. Performance analysis and experimental study on rainfall water purification with an extensive green roof matrix layer in Shanghai, China.

    PubMed

    Guo, Jiankang; Zhang, Yanting; Che, Shengquan

    2018-02-01

Current research has validated, to some extent, the purification of rainwater by the substrate layer of green roofs, though the effects of the substrate layer on rainwater purification have not been adequately quantified. The present study set up nine extensive green roof experimental combinations based on the precipitation characteristics observed in Shanghai, China. Rainfall events carrying different pollutant loads were simulated, and an L9(3^3) orthogonal design test was conducted to measure purification performance. The purification influence of the extensive green roof substrate layer was quantitatively analyzed to optimize the substrate thickness, substrate proportions, and sodium polyacrylate content for Shanghai. The experiments achieved removal of ammonium nitrogen (NH4+-N), lead (Pb), and zinc (Zn) from the artificial rainfall of up to 93.87%, 98.81%, and 94.55%, respectively, and the NH4+-N, Pb, and Zn event mean concentrations (EMC) were depressed to 0.263 mg/L, 0.002 mg/L and 0.018 mg/L, respectively, all well below the pollutant concentrations of the artificial rainfall. With reference to the rainfall chemical characteristics of Shanghai, a substrate 200 mm thick, with loam:perlite:cocopeat proportions of 1:1:2 and a sodium polyacrylate content of 2 g/L, is suggested for the design of an extensive green roof substrate to remove NH4+-N, Pb and Zn.
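The event mean concentration (EMC) and removal figures quoted above follow from two simple definitions; the sketch below uses hypothetical numbers rather than the study's raw data.

```python
def event_mean_concentration(concs_mg_per_L, volumes_L):
    """Flow-weighted EMC: total pollutant mass divided by total runoff volume."""
    total_mass = sum(c * v for c, v in zip(concs_mg_per_L, volumes_L))
    return total_mass / sum(volumes_L)

def removal_percent(inflow_conc, outflow_emc):
    """Percent reduction of concentration across the substrate layer."""
    return 100.0 * (inflow_conc - outflow_emc) / inflow_conc
```

For example, two runoff aliquots at 2.0 and 1.0 mg/L with volumes of 10 and 30 L give an EMC of 1.25 mg/L; an inflow of 1.0 mg/L depressed to an outflow EMC of 0.05 mg/L corresponds to 95% removal.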

  5. LC–MS Proteomics Analysis of the Insulin/IGF-1-Deficient Caenorhabditis elegans daf-2(e1370) Mutant Reveals Extensive Restructuring of Intermediary Metabolism

    PubMed Central

    2015-01-01

The insulin/IGF-1 receptor is a major known determinant of dauer formation, stress resistance, longevity, and metabolism in Caenorhabditis elegans. In the past, whole-genome transcript profiling was used extensively to study differential gene expression in response to reduced insulin/IGF-1 signaling, including the expression levels of metabolism-associated genes. Taking advantage of the recent developments in quantitative liquid chromatography mass spectrometry (LC–MS)-based proteomics, we profiled the proteomic changes that occur in response to activation of the DAF-16 transcription factor in the germline-less glp-4(bn2);daf-2(e1370) receptor mutant. Strikingly, the daf-2 profile suggests extensive reorganization of intermediary metabolism, characterized by the upregulation of many core intermediary metabolic pathways. These include glycolysis/gluconeogenesis, glycogenesis, pentose phosphate cycle, citric acid cycle, glyoxylate shunt, fatty acid β-oxidation, one-carbon metabolism, propionate and tyrosine catabolism, and complexes I, II, III, and V of the electron transport chain. Interestingly, we found simultaneous activation of reciprocally regulated metabolic pathways, which is indicative of spatiotemporal coordination of energy metabolism and/or extensive post-translational regulation of these enzymes. This restructuring of daf-2 metabolism is reminiscent of that of hypometabolic dauers, allowing the efficient and economical utilization of internal nutrient reserves and possibly also shunting metabolites through alternative energy-generating pathways to sustain longevity. PMID:24555535

  6. Multidimensional analysis of data obtained in experiments with X-ray emulsion chambers and extensive air showers

    NASA Technical Reports Server (NTRS)

    Chilingaryan, A. A.; Galfayan, S. K.; Zazyan, M. Z.; Dunaevsky, A. M.

    1985-01-01

Nonparametric statistical methods are used to carry out a quantitative comparison between the model and the experimental data. The same methods make it possible to select the events initiated by heavy nuclei and to calculate the fraction of such events. For this purpose, a sufficiently well-established set of artificial events describing the experiment is required. At present, the model with small scaling violation in the fragmentation region is the closest to the experiments; the treatment of gamma families obtained in the Pamir experiment is therefore being carried out with these models.

  7. Feasibility of line-ratio spectroscopy on helium and neon as edge diagnostic tool for Wendelstein 7-X

    DOE PAGES

    Barbui, T.; Krychowiak, M.; König, R.; ...

    2016-09-27

A beam emission spectroscopy system using thermal helium (He) and neon (Ne) has been set up at Wendelstein 7-X to measure edge electron temperature and density profiles, utilizing the line-ratio technique or its extension through the analysis of absolutely calibrated line emissions. The setup for a first systematic test of these quantitative atomic spectroscopy techniques in the limiter startup phase (OP1.1) is reported together with the first measured profiles. This setup and these first results are also an important test for developing the technique for the upcoming high-density, low-temperature island divertor regime.

  8. Translational benchmark risk analysis

    PubMed Central

    Piegorsch, Walter W.

    2010-01-01

    Translational development – in the sense of translating a mature methodology from one area of application to another, evolving area – is discussed for the use of benchmark doses in quantitative risk assessment. Illustrations are presented with traditional applications of the benchmark paradigm in biology and toxicology, and also with risk endpoints that differ from traditional toxicological archetypes. It is seen that the benchmark approach can apply to a diverse spectrum of risk management settings. This suggests a promising future for this important risk-analytic tool. Extensions of the method to a wider variety of applications represent a significant opportunity for enhancing environmental, biomedical, industrial, and socio-economic risk assessments. PMID:20953283

  9. OpenMS: a flexible open-source software platform for mass spectrometry data analysis.

    PubMed

    Röst, Hannes L; Sachsenberg, Timo; Aiche, Stephan; Bielow, Chris; Weisser, Hendrik; Aicheler, Fabian; Andreotti, Sandro; Ehrlich, Hans-Christian; Gutenbrunner, Petra; Kenar, Erhan; Liang, Xiao; Nahnsen, Sven; Nilse, Lars; Pfeuffer, Julianus; Rosenberger, George; Rurik, Marc; Schmitt, Uwe; Veit, Johannes; Walzer, Mathias; Wojnar, David; Wolski, Witold E; Schilling, Oliver; Choudhary, Jyoti S; Malmström, Lars; Aebersold, Ruedi; Reinert, Knut; Kohlbacher, Oliver

    2016-08-30

    High-resolution mass spectrometry (MS) has become an important tool in the life sciences, contributing to the diagnosis and understanding of human diseases, elucidating biomolecular structural information and characterizing cellular signaling networks. However, the rapid growth in the volume and complexity of MS data makes transparent, accurate and reproducible analysis difficult. We present OpenMS 2.0 (http://www.openms.de), a robust, open-source, cross-platform software specifically designed for the flexible and reproducible analysis of high-throughput MS data. The extensible OpenMS software implements common mass spectrometric data processing tasks through a well-defined application programming interface in C++ and Python and through standardized open data formats. OpenMS additionally provides a set of 185 tools and ready-made workflows for common mass spectrometric data processing tasks, which enable users to perform complex quantitative mass spectrometric analyses with ease.

  10. An assay for lateral line regeneration in adult zebrafish.

    PubMed

    Pisano, Gina C; Mason, Samantha M; Dhliwayo, Nyembezi; Intine, Robert V; Sarras, Michael P

    2014-04-08

Due to the clinical importance of hearing and balance disorders in humans, model organisms such as the zebrafish have been used to study lateral line development and regeneration. The zebrafish is particularly attractive for such studies because of its rapid development time and its high regenerative capacity. To date, zebrafish studies of lateral line regeneration have mainly utilized fish at the embryonic and larval stages because of the lower number of neuromasts at these stages, which makes quantitative analysis of lateral line regeneration and/or development easier. Because many zebrafish models of neurological and non-neurological diseases are studied in the adult fish and not in the embryo/larva, we focused on developing a quantitative lateral line regeneration assay in adult zebrafish that could be applied to current adult zebrafish disease models. Building on previous studies by Van Trump et al., which described procedures for ablation of hair cells in adult Mexican blind cave fish and zebrafish (Danio rerio), our assay was designed to allow quantitative comparison between control and experimental groups. This was accomplished by developing a regenerative neuromast standard curve based on the percentage of neuromasts reappearing over a 24 hr period following gentamicin-induced necrosis of hair cells in a defined region of the lateral line. The assay was also designed to allow extension of the analysis to the individual hair cell level when higher resolution is required.

  11. Quantitative computed tomography for the prediction of pulmonary function after lung cancer surgery: a simple method using simulation software.

    PubMed

    Ueda, Kazuhiro; Tanaka, Toshiki; Li, Tao-Sheng; Tanaka, Nobuyuki; Hamano, Kimikazu

    2009-03-01

    The prediction of pulmonary functional reserve is mandatory in therapeutic decision-making for patients with resectable lung cancer, especially those with underlying lung disease. Volumetric analysis in combination with densitometric analysis of the affected lung lobe or segment with quantitative computed tomography (CT) helps to identify residual pulmonary function, although the utility of this modality needs investigation. The subjects of this prospective study were 30 patients with resectable lung cancer. A three-dimensional CT lung model was created with voxels representing normal lung attenuation (-600 to -910 Hounsfield units). Residual pulmonary function was predicted by drawing a boundary line between the lung to be preserved and that to be resected, directly on the lung model. The predicted values were correlated with the postoperative measured values. The predicted and measured values corresponded well (r=0.89, p<0.001). Although the predicted values corresponded with values predicted by simple calculation using a segment-counting method (r=0.98), there were two outliers whose pulmonary functional reserves were predicted more accurately by CT than by segment counting. The measured pulmonary functional reserves were significantly higher than the predicted values in patients with extensive emphysematous areas (<-910 Hounsfield units), but not in patients with chronic obstructive pulmonary disease. Quantitative CT yielded accurate prediction of functional reserve after lung cancer surgery and helped to identify patients whose functional reserves are likely to be underestimated. Hence, this modality should be utilized for patients with marginal pulmonary function.
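The segment-counting baseline against which the CT predictions were compared has a standard arithmetic form: predicted postoperative function is preoperative function scaled by the fraction of functional segments preserved, conventionally out of 19 segments. A minimal sketch, with names of my own choosing:

```python
TOTAL_SEGMENTS = 19  # conventional count of functional bronchopulmonary segments

def predicted_postop_value(preop_value, segments_resected,
                           total_segments=TOTAL_SEGMENTS):
    """Segment-counting estimate of a postoperative lung function value,
    e.g. predicted postoperative FEV1 from a preoperative FEV1."""
    return preop_value * (1.0 - segments_resected / total_segments)
```

Quantitative CT effectively replaces this uniform per-segment weighting with voxel counts of normally attenuating lung, which is why it can better handle patients with extensive emphysematous areas.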

  12. Classification of upper limb disability levels of children with spastic unilateral cerebral palsy using K-means algorithm.

    PubMed

    Raouafi, Sana; Achiche, Sofiane; Begon, Mickael; Sarcher, Aurélie; Raison, Maxime

    2018-01-01

Treatment for cerebral palsy depends upon the severity of the child's condition and requires knowledge of upper limb disability. The aim of this study was to develop a systematic quantitative method for classifying the upper limb disability levels of children with spastic unilateral cerebral palsy based on upper limb movements and muscle activation. Thirteen children with spastic unilateral cerebral palsy and six typically developing children participated in this study. Patients were matched on age and on Manual Ability Classification System (MACS) levels I to III. Twenty-three kinematic and electromyographic (EMG) variables were collected from two tasks, and discriminant analysis and the K-means clustering algorithm were applied to these variables for each participant. Among the 23 variables, discriminant analysis identified only two that contained the most relevant information for predicting the four severity levels of spastic unilateral cerebral palsy fixed by the MACS: (1) the Falconer index (CAI_E), the ratio of biceps to triceps brachii activity during extension, and (2) the maximal extension angle (θ_extension,max). A good correlation (Kendall rank correlation coefficient = -0.53, p = 0.01) was found between the levels fixed by the MACS and the obtained classes. These findings suggest that the cost and effort needed to assess and characterize the disability level of a child can be further reduced.
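The clustering step in the abstract is plain K-means on a low-dimensional feature vector. The following is a generic Lloyd's-algorithm sketch run on synthetic two-dimensional points, not the study's data or code.

```python
import numpy as np

def kmeans(points, k, iters=50, seed=0):
    """Lloyd's algorithm: alternate nearest-center assignment and centroid update."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].astype(float)
    for _ in range(iters):
        # Distance of every point to every center, shape (n_points, k).
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):      # guard against empty clusters
                centers[j] = points[labels == j].mean(axis=0)
    return labels, centers
```

With one row of features per child, e.g. hypothetical (CAI_E, θ_extension,max) pairs, the resulting cluster labels can then be compared against MACS levels with a rank correlation, as the authors do.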

  13. Rediscovery and Revival of Analytical Refractometry for Protein Determination: Recombining Simplicity With Accuracy in the Digital Era.

    PubMed

    Anderle, Heinz; Weber, Alfred

    2016-03-01

    Among "vintage" methods of protein determination, quantitative analytical refractometry has received far less attention than well-established pharmacopoeial techniques based on protein nitrogen content, such as Dumas combustion (1831) and Kjeldahl digestion (1883). Protein determination by quantitative refractometry dates back to 1903 and has been extensively investigated and characterized in the following 30 years, but has since vanished into a few niche applications that may not require the degree of accuracy and precision essential for pharmaceutical analysis. However, because high-resolution and precision digital refractometers have replaced manual instruments, reducing time and resource consumption, the method appears particularly attractive from an economic, ergonomic, and environmental viewpoint. The sample solution can be measured without dilution or other preparation procedures than the separation of the protein-free matrix by ultrafiltration, which might even be omitted for a constant matrix and excipient composition. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Quantitative assessment of neural outgrowth using spatial light interference microscopy

    NASA Astrophysics Data System (ADS)

    Lee, Young Jae; Cintora, Pati; Arikkath, Jyothi; Akinsola, Olaoluwa; Kandel, Mikhail; Popescu, Gabriel; Best-Popescu, Catherine

    2017-06-01

Optimal growth and branching of axons and dendrites are critical for nervous system function. Neuritic length, arborization, and growth rate determine the innervation properties of neurons and define each cell's computational capability. Thus, to investigate nervous system function, we need to develop methods and instrumentation capable of quantifying various aspects of neural network formation: neurite extension, retraction, stability, and branching. During the last three decades, fluorescence microscopy has yielded enormous advances in our understanding of neurobiology. While fluorescent markers provide valuable specificity to imaging, photobleaching and phototoxicity often limit the duration of the investigation. Here, we used spatial light interference microscopy (SLIM) to quantitatively measure neurite outgrowth as a function of cell confluence. Because it is label-free and nondestructive, SLIM allows for long-term investigation over many hours. We found that neurons exhibit a higher growth rate of neurite length under low-confluence than under medium- and high-confluence conditions. We believe this methodology will aid investigators in performing unbiased, nondestructive analysis of morphometric neuronal parameters.

  15. Topological Cacti: Visualizing Contour-based Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weber, Gunther H.; Bremer, Peer-Timo; Pascucci, Valerio

    2011-05-26

Contours, the connected components of level sets, play an important role in understanding the global structure of a scalar field. In particular, their nesting behavior and topology, often represented in the form of a contour tree, have been used extensively for visualization and analysis. However, traditional contour trees only encode structural properties like the number of contours or the nesting of contours, but little quantitative information such as volume or other statistics. Here we use the segmentation implied by a contour tree to compute a large number of per-contour (interval) based statistics of both the function defining the contour tree as well as other co-located functions. We introduce a new visual metaphor for contour trees, called topological cacti, that extends the traditional toporrery display of a contour tree to display additional quantitative information as the width of the cactus trunk and the length of its spikes. We apply the new technique to scalar fields of varying dimension and different measures to demonstrate the effectiveness of the approach.

  16. High speed digital holographic interferometry for hypersonic flow visualization

    NASA Astrophysics Data System (ADS)

    Hegde, G. M.; Jagdeesh, G.; Reddy, K. P. J.

    2013-06-01

Optical imaging techniques have played a major role in understanding the dynamics of a variety of fluid flows, particularly in the study of hypersonic flows. Schlieren and shadowgraph techniques have been flow diagnostic tools for the investigation of compressible flows for more than a century. However, these techniques provide only qualitative information about the flow field. Other optical techniques, such as holographic interferometry and laser-induced fluorescence (LIF), have been used extensively for extracting quantitative information about high-speed flows. In this paper we present the application of the digital holographic interferometry (DHI) technique, integrated with a short-duration hypersonic shock tunnel facility having 1 ms test time, for quantitative flow visualization. The dynamics of the flow fields at hypersonic/supersonic speeds around different test models is visualized with DHI using a high-speed digital camera (0.2 million fps). These visualization results are compared with schlieren visualization and CFD simulation results. Fringe analysis is carried out to estimate the density of the flow field.

  17. Quantitative Understanding of SHAPE Mechanism from RNA Structure and Dynamics Analysis.

    PubMed

    Hurst, Travis; Xu, Xiaojun; Zhao, Peinan; Chen, Shi-Jie

    2018-05-10

    The selective 2'-hydroxyl acylation analyzed by primer extension (SHAPE) method probes RNA local structural and dynamic information at single nucleotide resolution. To gain quantitative insights into the relationship between nucleotide flexibility, RNA 3D structure, and SHAPE reactivity, we develop a 3D Structure-SHAPE Relationship model (3DSSR) to rebuild SHAPE profiles from 3D structures. The model starts from RNA structures and combines nucleotide interaction strength and conformational propensity, ligand (SHAPE reagent) accessibility, and base-pairing pattern through a composite function to quantify the correlation between SHAPE reactivity and nucleotide conformational stability. The 3DSSR model shows the relationship between SHAPE reactivity and RNA structure and energetics. Comparisons between the 3DSSR-predicted SHAPE profile and the experimental SHAPE data show correlation, suggesting that the extracted analytical function may have captured the key factors that determine the SHAPE reactivity profile. Furthermore, the theory offers an effective method to sieve RNA 3D models and exclude models that are incompatible with experimental SHAPE data.

  18. A Century of Progress in Molecular Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    McLafferty, Fred W.

    2011-07-01

    The first mass spectrum of a molecule was measured by J.J. Thomson in 1910. Mass spectrometry (MS) soon became crucial to the study of isotopes and atomic weights and to the development of atomic weapons for World War II. Its notable applications to molecules began with the quantitative analysis of light hydrocarbons during World War II. When I joined the Dow Chemical Company in 1950, MS was not favored by organic chemists. This situation improved only with an increased understanding of gaseous ion chemistry, which was obtained through the use of extensive reference data. Gas chromatography-MS was developed in 1956, and tandem MS was first used a decade later. In neutralization-reionization MS, an unusual, unstable species is prepared by ion-beam neutralization and characterized by reionization. Electrospray ionization of a protein mixture produces its corresponding ionized molecules. In top-down proteomics, ions from an individual component can be mass separated and subjected to collision-activated and electron-capture dissociation to provide extensive sequence information.

  19. Extensive site-directed mutagenesis reveals interconnected functional units in the alkaline phosphatase active site

    PubMed Central

    Sunden, Fanny; Peck, Ariana; Salzman, Julia; Ressl, Susanne; Herschlag, Daniel

    2015-01-01

    Enzymes enable life by accelerating reaction rates to biological timescales. Conventional studies have focused on identifying the residues that have a direct involvement in an enzymatic reaction, but these so-called ‘catalytic residues’ are embedded in extensive interaction networks. Although fundamental to our understanding of enzyme function, evolution, and engineering, the properties of these networks have yet to be quantitatively and systematically explored. We dissected an interaction network of five residues in the active site of Escherichia coli alkaline phosphatase. Analysis of the complex catalytic interdependence of specific residues identified three energetically independent but structurally interconnected functional units with distinct modes of cooperativity. From an evolutionary perspective, this network is orders of magnitude more probable to arise than a fully cooperative network. From a functional perspective, new catalytic insights emerge. Further, such comprehensive energetic characterization will be necessary to benchmark the algorithms required to rationally engineer highly efficient enzymes. DOI: http://dx.doi.org/10.7554/eLife.06181.001 PMID:25902402
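
    A standard way to quantify the energetic independence of active-site residues described above is a double-mutant cycle: if the free-energy costs of two single mutations sum to the cost of the double mutation, the residues contribute additively. A minimal sketch with hypothetical rate constants (not the paper's data):

```python
import math

R, T = 1.987e-3, 298.0  # gas constant (kcal/(mol*K)) and temperature (K)

def ddG(k_mut, k_wt):
    """Free-energy penalty of a mutation, from the rate-constant ratio."""
    return -R * T * math.log(k_mut / k_wt)

# Hypothetical relative rate constants (wild type = 1.0); illustrative only
k = {"wt": 1.0, "A": 1e-3, "B": 1e-2, "AB": 1e-5}

# Coupling energy: deviation of the double mutant from additivity
coupling = ddG(k["AB"], k["wt"]) - (ddG(k["A"], k["wt"]) + ddG(k["B"], k["wt"]))
# coupling near 0 kcal/mol -> the two residues act independently;
# a nonzero value quantifies their energetic cooperativity
```

    With these illustrative numbers the single-mutant effects sum exactly to the double-mutant effect, so the coupling energy is zero, i.e., the pair behaves as an energetically independent unit.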

  20. A draft map of the mouse pluripotent stem cell spatial proteome

    PubMed Central

    Christoforou, Andy; Mulvey, Claire M.; Breckels, Lisa M.; Geladaki, Aikaterini; Hurrell, Tracey; Hayward, Penelope C.; Naake, Thomas; Gatto, Laurent; Viner, Rosa; Arias, Alfonso Martinez; Lilley, Kathryn S.

    2016-01-01

    Knowledge of the subcellular distribution of proteins is vital for understanding cellular mechanisms. Capturing the subcellular proteome in a single experiment has proven challenging, with studies focusing on specific compartments or assigning proteins to subcellular niches with low resolution and/or accuracy. Here we introduce hyperLOPIT, a method that couples extensive fractionation and quantitative high-resolution accurate-mass spectrometry with multivariate data analysis. We apply hyperLOPIT to a pluripotent stem cell population whose subcellular proteome has not been extensively studied. We provide localization data on over 5,000 proteins with unprecedented spatial resolution to reveal the organization of organelles, sub-organellar compartments, protein complexes, functional networks and steady-state dynamics of proteins and unexpected subcellular locations. The method paves the way for characterizing the impact of post-transcriptional and post-translational modification on protein location and for studies of proteome-level locational changes upon cellular perturbation. An interactive open-source resource is presented that enables exploration of these data. PMID:26754106

  1. Extensive site-directed mutagenesis reveals interconnected functional units in the alkaline phosphatase active site

    DOE PAGES

    Sunden, Fanny; Peck, Ariana; Salzman, Julia; ...

    2015-04-22

    Enzymes enable life by accelerating reaction rates to biological timescales. Conventional studies have focused on identifying the residues that have a direct involvement in an enzymatic reaction, but these so-called ‘catalytic residues’ are embedded in extensive interaction networks. Although fundamental to our understanding of enzyme function, evolution, and engineering, the properties of these networks have yet to be quantitatively and systematically explored. We dissected an interaction network of five residues in the active site of Escherichia coli alkaline phosphatase. Analysis of the complex catalytic interdependence of specific residues identified three energetically independent but structurally interconnected functional units with distinct modes of cooperativity. From an evolutionary perspective, this network is orders of magnitude more probable to arise than a fully cooperative network. From a functional perspective, new catalytic insights emerge. Further, such comprehensive energetic characterization will be necessary to benchmark the algorithms required to rationally engineer highly efficient enzymes.

  2. Enhancing graphical literacy skills in the high school science classroom via authentic, intensive data collection and graphical representation exposure

    NASA Astrophysics Data System (ADS)

    Palmeri, Anthony

    This research project was developed to provide extensive practice and exposure to data collection and data representation in a high school science classroom. The student population engaged in this study included 40 high school sophomores enrolled in two microbiology classes. Laboratory investigations and activities were deliberately designed to include quantitative data collection that necessitated organization and graphical representation. These activities were embedded into the curriculum and conducted in conjunction with the normal and expected course content, rather than as a separate entity. It was expected that routine practice with graph construction and interpretation would result in improved competency when graphing data and proficiency in analyzing graphs. To objectively test the effectiveness in achieving this goal, a pre-test and post-test that included graph construction, interpretation, interpolation, extrapolation, and analysis was administered. Based on the results of a paired t-test, graphical literacy was significantly enhanced by extensive practice and exposure to data representation.
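
    The pre/post comparison described above reduces to a paired t-test on each student's score difference. A self-contained sketch with hypothetical scores (the study's actual data are not reproduced here):

```python
import math

def paired_t(pre, post):
    """Paired t-statistic for matched pre/post scores."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    t = mean / math.sqrt(var / n)
    return t, n - 1  # t-statistic and degrees of freedom

# Hypothetical scores for 8 students (the study itself had 40)
pre  = [55, 60, 48, 70, 62, 58, 65, 50]
post = [68, 72, 60, 78, 70, 66, 80, 63]
t, df = paired_t(pre, post)
```

    A t value exceeding the two-tailed critical value (about 2.36 for df = 7 at the 0.05 level) would indicate a significant gain in graphing scores.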

  3. Statistical significance of trace evidence matches using independent physicochemical measurements

    NASA Astrophysics Data System (ADS)

    Almirall, Jose R.; Cole, Michael; Furton, Kenneth G.; Gettinby, George

    1997-02-01

    A statistical approach to the significance of glass evidence is proposed using independent physicochemical measurements and chemometrics. Traditional interpretation of the significance of trace evidence matches or exclusions relies on qualitative descriptors such as 'indistinguishable from,' 'consistent with,' 'similar to,' etc. By performing physical and chemical measurements which are independent of one another, the significance of object exclusions or matches can be evaluated statistically. One of the problems with this approach is that the human brain is excellent at recognizing and classifying patterns and shapes but performs less well when an object is represented by a numerical list of attributes. Chemometrics can be employed to group similar objects using clustering algorithms and to provide statistical significance in a quantitative manner. This approach is enhanced when population databases exist or can be created, and the data in question can be evaluated against these databases. Since the selection of the variables used and their pre-processing can greatly influence the outcome, several different methods could be employed in order to obtain a more complete picture of the information contained in the data. Presently, we report on the analysis of glass samples using refractive index measurements and the quantitative analysis of the concentrations of the metals Mg, Al, Ca, Fe, Mn, Ba, Sr, Ti and Zr. The extension of this general approach to fiber and paint comparisons is also discussed. This statistical approach should not replace the current interpretative approaches to trace evidence matches or exclusions but rather yields an additional quantitative measure. The lack of sufficient general population databases containing the needed physicochemical measurements and the potential for confusion arising from statistical analysis currently hamper this approach, and ways of overcoming these obstacles are presented.
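
    The chemometric grouping described above can be illustrated by standardizing each independent measurement and computing inter-sample distances, so that refractive index and trace-element concentrations contribute on a common scale. A minimal sketch with hypothetical glass data (names and values are illustrative, not from the study):

```python
import math

# Hypothetical measurements: refractive index plus Mg, Ca, Fe (wt%)
samples = {
    "questioned": [1.5181, 2.41, 8.10, 0.11],  # fragment from the scene
    "known":      [1.5182, 2.39, 8.05, 0.12],  # suspected source
    "control":    [1.5230, 3.60, 7.20, 0.30],  # unrelated glass
}

# Standardize each variable (z-scores) so all measurements weigh equally
cols = list(zip(*samples.values()))
means = [sum(c) / len(c) for c in cols]
sds = [math.sqrt(sum((x - m) ** 2 for x in c) / (len(c) - 1))
       for c, m in zip(cols, means)]
std = {name: [(x - m) / s for x, m, s in zip(vals, means, sds)]
       for name, vals in samples.items()}

def dist(a, b):
    """Euclidean distance in standardized measurement space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(std[a], std[b])))

# Small distance -> the fragments group together under clustering
d_match = dist("questioned", "known")
d_nonmatch = dist("questioned", "control")
```

    A clustering algorithm applied to such standardized distances would place the questioned and known fragments in one group while separating the unrelated control.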

  4. The use of morphological characteristics and texture analysis in the identification of tissue composition in prostatic neoplasia.

    PubMed

    Diamond, James; Anderson, Neil H; Bartels, Peter H; Montironi, Rodolfo; Hamilton, Peter W

    2004-09-01

    Quantitative examination of prostate histology offers clues in the diagnostic classification of lesions and in the prediction of response to treatment and prognosis. To facilitate the collection of quantitative data, the development of machine vision systems is necessary. This study explored the use of imaging for identifying tissue abnormalities in prostate histology. Medium-power histological scenes were recorded from whole-mount radical prostatectomy sections at ×40 objective magnification and assessed by a pathologist as exhibiting stroma, normal tissue (nonneoplastic epithelial component), or prostatic carcinoma (PCa). A machine vision system was developed that divided the scenes into subregions of 100 × 100 pixels and subjected each to image-processing techniques. Analysis of morphological characteristics allowed the identification of normal tissue. Analysis of image texture demonstrated that Haralick feature 4 was the most suitable for discriminating stroma from PCa. Using these morphological and texture measurements, it was possible to define a classification scheme for each subregion. The machine vision system is designed to integrate these classification rules and generate digital maps of tissue composition from the classification of subregions; 79.3% of subregions were correctly classified. Established classification rates have demonstrated the validity of the methodology on small scenes; a logical extension was to apply the methodology to whole slide images via scanning technology. The machine vision system is capable of classifying these images. The machine vision system developed in this project facilitates the exploration of morphological and texture characteristics in quantifying tissue composition. It also illustrates the potential of quantitative methods to provide highly discriminatory information in the automated identification of prostatic lesions using computer vision.
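
    Haralick texture features are computed from a gray-level co-occurrence matrix (GLCM) built over each subregion. The sketch below constructs a horizontal GLCM for a tiny hypothetical tile and computes the contrast feature (Haralick's f2) as a simple stand-in; the study itself used feature 4, but the GLCM construction is the same:

```python
# Hypothetical grayscale tile (a tiny stand-in for a 100 x 100 subregion)
tile = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
]
levels = 8  # number of gray levels after quantization

def glcm(img, dx=1, dy=0):
    """Normalized co-occurrence matrix for pixel pairs at offset (dx, dy)."""
    m = [[0] * levels for _ in range(levels)]
    for y in range(len(img) - dy):
        for x in range(len(img[0]) - dx):
            m[img[y][x]][img[y + dy][x + dx]] += 1
    total = sum(map(sum, m))
    return [[count / total for count in row] for row in m]

def contrast(p):
    """Haralick contrast: sum of (i - j)^2 * p(i, j) over all level pairs."""
    return sum((i - j) ** 2 * p[i][j]
               for i in range(levels) for j in range(levels))

p = glcm(tile)
c = contrast(p)  # low for smooth regions, high for busy texture
```

    A per-subregion feature vector of such GLCM statistics is what feeds the classification rules described in the abstract.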

  5. Assessing covariate balance when using the generalized propensity score with quantitative or continuous exposures.

    PubMed

    Austin, Peter C

    2018-01-01

    Propensity score methods are increasingly being used to estimate the effects of treatments and exposures when using observational data. The propensity score was initially developed for use with binary exposures (e.g., active treatment vs. control). The generalized propensity score is an extension of the propensity score for use with quantitative exposures (e.g., dose or quantity of medication, income, years of education). A crucial component of any propensity score analysis is that of balance assessment. This entails assessing the degree to which conditioning on the propensity score (via matching, weighting, or stratification) has balanced measured baseline covariates between exposure groups. Methods for balance assessment have been well described and are frequently implemented when using the propensity score with binary exposures. However, there is a paucity of information on how to assess baseline covariate balance when using the generalized propensity score. We describe how methods based on the standardized difference can be adapted for use with quantitative exposures when using the generalized propensity score. We also describe a method based on assessing the correlation between the quantitative exposure and each covariate in the sample when weighted using generalized propensity score-based weights. We conducted a series of Monte Carlo simulations to evaluate the performance of these methods. We also compared two different methods of estimating the generalized propensity score: ordinary least squares regression and the covariate balancing propensity score method. We illustrate the application of these methods using data on patients hospitalized with a heart attack, with the quantitative exposure being creatinine level.
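
    The correlation-based diagnostic described above can be sketched as a weighted Pearson correlation between the quantitative exposure and each covariate, computed under the GPS-based weights; a value near zero indicates that covariate is balanced. All numbers below are hypothetical:

```python
def wmean(x, w):
    """Weighted mean."""
    return sum(xi * wi for xi, wi in zip(x, w)) / sum(w)

def wcorr(x, y, w):
    """Weighted Pearson correlation between exposure x and covariate y."""
    mx, my = wmean(x, w), wmean(y, w)
    sw = sum(w)
    cov = sum(wi * (xi - mx) * (yi - my)
              for xi, yi, wi in zip(x, y, w)) / sw
    vx = sum(wi * (xi - mx) ** 2 for xi, wi in zip(x, w)) / sw
    vy = sum(wi * (yi - my) ** 2 for yi, wi in zip(y, w)) / sw
    return cov / (vx * vy) ** 0.5

# Hypothetical data: quantitative exposure (dose), one baseline covariate
# (age), and weights derived from an estimated generalized propensity score
dose = [1.0, 2.0, 3.0, 4.0, 5.0]
age  = [60, 55, 58, 62, 57]
w    = [1.2, 0.8, 1.0, 0.9, 1.1]

balance = wcorr(dose, age, w)  # |balance| near 0 suggests age is balanced
```

    In practice this check would be repeated for every measured baseline covariate, reporting the full set of weighted correlations as the balance table.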

  6. Classroom Demonstrations of Polymer Principles.

    ERIC Educational Resources Information Center

    Rodriguez, F.

    1990-01-01

    Classroom demonstrations of selected mechanical properties of polymers are described that can be used to make quantitative measurements. Stiffness, strength, and extensibility are mechanical properties used to distinguish one polymer from another. (KR)

  7. Automated analysis of high-content microscopy data with deep learning.

    PubMed

    Kraus, Oren Z; Grys, Ben T; Ba, Jimmy; Chong, Yolanda; Frey, Brendan J; Boone, Charles; Andrews, Brenda J

    2017-04-18

    Existing computational pipelines for quantitative analysis of high-content microscopy data rely on traditional machine learning approaches that fail to accurately classify more than a single dataset without substantial tuning and training, which requires extensive analyst effort. Here, we demonstrate that the application of deep learning to biological image data can overcome the pitfalls associated with conventional machine learning classifiers. Using a deep convolutional neural network (DeepLoc) to analyze yeast cell images, we show improved performance over traditional approaches in the automated classification of protein subcellular localization. We also demonstrate the ability of DeepLoc to classify highly divergent image sets, including images of pheromone-arrested cells with abnormal cellular morphology, as well as images generated in different genetic backgrounds and in different laboratories. We offer an open-source implementation that enables updating DeepLoc on new microscopy datasets. This study highlights deep learning as an important tool for the expedited analysis of high-content microscopy data. © 2017 The Authors. Published under the terms of the CC BY 4.0 license.

  8. Time-series analysis of hepatitis A, B, C and E infections in a large Chinese city: application to prediction analysis.

    PubMed

    Sumi, A; Luo, T; Zhou, D; Yu, B; Kong, D; Kobayashi, N

    2013-05-01

    Viral hepatitis is recognized as one of the most frequently reported diseases, and especially in China, acute and chronic liver disease due to viral hepatitis has been a major public health problem. The present study aimed to analyse and predict surveillance data of infections of hepatitis A, B, C and E in Wuhan, China, by the method of time-series analysis (MemCalc, Suwa-Trast, Japan). On the basis of spectral analysis, fundamental modes explaining the underlying variation of the data for the years 2004-2008 were assigned. The model was constructed from these fundamental modes and reproduced the underlying variation of the data well. An extension of the model to the year 2009 could predict the data quantitatively. Our study suggests that the present method will allow us to model the temporal pattern of epidemics of viral hepatitis much more effectively than the artificial neural network approach used previously.
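
    MemCalc's maximum entropy spectral analysis is proprietary, but the modelling step it supports — fitting a trend plus dominant periodic modes, then extrapolating forward — can be sketched with an ordinary least-squares fit of one annual mode to hypothetical monthly case counts (a simplified stand-in, not the study's method or data):

```python
import math

# Hypothetical monthly case counts with a linear trend plus an annual cycle
months = list(range(48))  # four years of training data
cases = [100 + 0.5 * t + 20 * math.sin(2 * math.pi * t / 12) for t in months]

# Fit y(t) ~ a + b*t + c*sin(wt) + d*cos(wt); with the period fixed at
# 12 months this is linear in (a, b, c, d), solvable via normal equations.
w = 2 * math.pi / 12
X = [[1.0, t, math.sin(w * t), math.cos(w * t)] for t in months]

def solve_normal(X, y):
    """Least squares via normal equations and Gaussian elimination."""
    n = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(n)]
    for i in range(n):
        piv = A[i][i]
        for j in range(i, n):
            A[i][j] /= piv
        b[i] /= piv
        for r in range(n):
            if r != i:
                f = A[r][i]
                for j in range(i, n):
                    A[r][j] -= f * A[i][j]
                b[r] -= f * b[i]
    return b

coef = solve_normal(X, cases)  # [intercept, trend, sin-amp, cos-amp]

def predict(t):
    return sum(cf * f for cf, f in
               zip(coef, [1.0, t, math.sin(w * t), math.cos(w * t)]))

# Extend the fitted model one year ahead, as the study did for 2009
forecast = [predict(t) for t in range(48, 60)]
```

    With real surveillance data one would select several fundamental modes from the estimated power spectrum rather than fixing a single annual period in advance.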

  9. Quantitative and qualitative transcriptome analysis of four industrial strains of Claviceps purpurea with respect to ergot alkaloid production.

    PubMed

    Majeská Čudejková, Mária; Vojta, Petr; Valík, Josef; Galuszka, Petr

    2016-09-25

    The fungus Claviceps purpurea is a biotrophic phytopathogen widely used in the pharmaceutical industry for its ability to produce ergot alkaloids (EAs). The fungus attacks unfertilized ovaries of grasses and forms sclerotia, which represent the only type of tissue where the synthesis of EAs occurs. The biosynthetic pathway of EAs has been extensively studied; however, little is known concerning its regulation. Here, we present a quantitative transcriptome analysis of the sclerotial and mycelial tissues, providing a comprehensive view of the transcriptional differences between the tissues that produce EAs and those that do not, and between the pathogenic and non-pathogenic lifestyles. The results indicate metabolic changes coupled with sclerotial differentiation, which are likely needed as initiation factors for EA biosynthesis. One of the promising factors seems to be oxidative stress. Here, we focus on the identification of putative transcription factors and regulators involved in sclerotial differentiation, which might be involved in EA biosynthesis. To shed more light on the regulation of EA composition, whole transcriptome analysis of four industrial strains differing in their alkaloid spectra was performed. The results support the hypothesis that the composition of the amino acid pool in sclerotia is an important factor regulating the final structure of the ergopeptines produced by Claviceps purpurea. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. ComplexQuant: high-throughput computational pipeline for the global quantitative analysis of endogenous soluble protein complexes using high resolution protein HPLC and precision label-free LC/MS/MS.

    PubMed

    Wan, Cuihong; Liu, Jian; Fong, Vincent; Lugowski, Andrew; Stoilova, Snejana; Bethune-Waddell, Dylan; Borgeson, Blake; Havugimana, Pierre C; Marcotte, Edward M; Emili, Andrew

    2013-04-09

    The experimental isolation and characterization of stable multi-protein complexes are essential to understanding the molecular systems biology of a cell. To this end, we have developed a high-throughput proteomic platform for the systematic identification of native protein complexes based on extensive fractionation of soluble protein extracts by multi-bed ion exchange high performance liquid chromatography (IEX-HPLC) combined with exhaustive label-free LC/MS/MS shotgun profiling. To support these studies, we have built a companion data analysis software pipeline, termed ComplexQuant. Proteins present in the hundreds of fractions typically collected per experiment are first identified by exhaustively interrogating MS/MS spectra using multiple database search engines within an integrative probabilistic framework, while accounting for possible post-translation modifications. Protein abundance is then measured across the fractions based on normalized total spectral counts and precursor ion intensities using a dedicated tool, PepQuant. This analysis allows co-complex membership to be inferred based on the similarity of extracted protein co-elution profiles. Each computational step has been optimized for processing large-scale biochemical fractionation datasets, and the reliability of the integrated pipeline has been benchmarked extensively. This article is part of a Special Issue entitled: From protein structures to clinical applications. Copyright © 2012 Elsevier B.V. All rights reserved.
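
    Co-complex inference from co-elution rests on profile similarity: proteins in the same complex should show correlated abundance across the chromatographic fractions. A minimal sketch with hypothetical spectral-count profiles (names and values are illustrative):

```python
def pearson(a, b):
    """Pearson correlation between two elution profiles."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

# Hypothetical normalized spectral counts across 10 IEX-HPLC fractions
profiles = {
    "subunitA":  [0, 1, 8, 20, 9, 2, 0, 0, 0, 0],
    "subunitB":  [0, 2, 9, 18, 8, 1, 0, 0, 0, 0],   # co-elutes with A
    "unrelated": [5, 3, 0, 0, 0, 1, 6, 15, 7, 2],   # elutes elsewhere
}

r_ab = pearson(profiles["subunitA"], profiles["subunitB"])
r_ax = pearson(profiles["subunitA"], profiles["unrelated"])
# High r_ab and low r_ax would support A and B sharing a complex
```

    A pipeline like the one described would compute such similarities across all protein pairs (and typically across hundreds of fractions) before clustering profiles into putative complexes.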

  11. Structural and metabolic transitions of C4 leaf development and differentiation defined by microscopy and quantitative proteomics in maize.

    PubMed

    Majeran, Wojciech; Friso, Giulia; Ponnala, Lalit; Connolly, Brian; Huang, Mingshu; Reidel, Edwin; Zhang, Cankui; Asakura, Yukari; Bhuiyan, Nazmul H; Sun, Qi; Turgeon, Robert; van Wijk, Klaas J

    2010-11-01

    C4 grasses, such as maize (Zea mays), have high photosynthetic efficiency through combined biochemical and structural adaptations. C4 photosynthesis is established along the developmental axis of the leaf blade, leading from an undifferentiated leaf base just above the ligule into highly specialized mesophyll cells (MCs) and bundle sheath cells (BSCs) at the tip. To resolve the kinetics of maize leaf development and C4 differentiation and to obtain a systems-level understanding of maize leaf formation, the accumulation profiles of proteomes of the leaf and the isolated BSCs with their vascular bundle along the developmental gradient were determined using large-scale mass spectrometry. This was complemented by extensive qualitative and quantitative microscopy analysis of structural features (e.g., Kranz anatomy, plasmodesmata, cell wall, and organelles). More than 4300 proteins were identified and functionally annotated. Developmental protein accumulation profiles and hierarchical cluster analysis then determined the kinetics of organelle biogenesis, formation of cellular structures, metabolism, and coexpression patterns. Two main expression clusters were observed, each divided in subclusters, suggesting that a limited number of developmental regulatory networks organize concerted protein accumulation along the leaf gradient. The coexpression with BSC and MC markers provided strong candidates for further analysis of C4 specialization, in particular transporters and biogenesis factors. Based on the integrated information, we describe five developmental transitions that provide a conceptual and practical template for further analysis. An online protein expression viewer is provided through the Plant Proteome Database.

  12. Life cycle cost analysis of a stand-alone PV system in rural Kenya

    NASA Astrophysics Data System (ADS)

    Daly, Emma

    The purpose of this quantitative research study was to determine the economic feasibility of a stand-alone PV system to electrify a rural area in Kenya. The research conducted involved a comprehensive review of all the relevant literature associated with the study. Methodologies were extrapolated from this extensive literature to develop a model for the complete design and economic analysis of a stand-alone PV system. A women's center in rural Kenya was used as a worked example to demonstrate the workings of the model. The results suggest that electrifying the center using a stand-alone PV system is an economically viable option which is encouraging for the surrounding area. This model can be used as a business model to determine the economic feasibility of a stand-alone PV system in alternative sites in Kenya.
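
    A life cycle cost analysis of this kind typically sums the capital cost with the discounted present value of recurring operation-and-maintenance costs and scheduled component replacements. A sketch with illustrative figures (assumed for this example, not taken from the study):

```python
def present_value(cost, rate, year):
    """Discount a future cost back to today's value."""
    return cost / (1 + rate) ** year

def life_cycle_cost(capital, annual_om, replacements, rate=0.08, years=20):
    """capital: upfront system cost; annual_om: yearly operation and
    maintenance; replacements: {year: cost}, e.g. battery swaps."""
    lcc = capital
    lcc += sum(present_value(annual_om, rate, y) for y in range(1, years + 1))
    lcc += sum(present_value(c, rate, y) for y, c in replacements.items())
    return lcc

# Assumed figures in USD: PV array + installation, O&M, two battery swaps
lcc = life_cycle_cost(capital=6000, annual_om=120,
                      replacements={8: 1500, 16: 1500})
```

    Comparing this figure against the discounted cost of an alternative (e.g., a diesel generator or grid extension) over the same horizon is what establishes economic viability.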

  13. Quantitative Outcomes of a One Health approach to Study Global Health Challenges.

    PubMed

    Falzon, Laura C; Lechner, Isabel; Chantziaras, Ilias; Collineau, Lucie; Courcoul, Aurélie; Filippitzi, Maria-Eleni; Laukkanen-Ninios, Riikka; Peroz, Carole; Pinto Ferreira, Jorge; Postma, Merel; Prestmo, Pia G; Phythian, Clare J; Sarno, Eleonora; Vanantwerpen, Gerty; Vergne, Timothée; Grindlay, Douglas J C; Brennan, Marnie L

    2018-03-01

    Having gained momentum in the last decade, the One Health initiative promotes a holistic approach to address complex global health issues. Before recommending its adoption to stakeholders, however, it is paramount to first compile quantitative evidence of the benefit of such an approach. The aim of this scoping review was to identify and summarize primary research that describes monetary and non-monetary outcomes following adoption of a One Health approach. An extensive literature search yielded a total of 42,167 references, of which 85 were included in the final analysis. The top two biotic health issues addressed in these studies were rabies and malaria; the top abiotic health issue was air pollution. Most studies described collaborations between human and animal (n = 42), or human and environmental disciplines (n = 41); commonly reported interventions included vector control and animal vaccination. Monetary outcomes were commonly expressed as cost-benefit or cost-utility ratios; non-monetary outcomes were described using disease frequency or disease burden measurements. The majority of the studies reported positive or partially positive outcomes. This paper illustrates the variety of health challenges that can be addressed using a One Health approach, and provides tangible quantitative measures that can be used to evaluate future implementations of the One Health approach.

  14. Label-free protein sensing by employing blue phase liquid crystal.

    PubMed

    Lee, Mon-Juan; Chang, Chung-Huan; Lee, Wei

    2017-03-01

    Blue phases (BPs) are mesophases existing between the isotropic and chiral nematic phases of liquid crystals (LCs). In recent years, blue phase LCs (BPLCs) have been extensively studied in the field of LC science and display technology. However, the application of BPLCs in biosensing has not been explored. In this study, a BPLC-based biosensing technology was developed for the detection and quantitation of bovine serum albumin (BSA). The sensing platform was constructed by assembling an empty cell with two glass slides coated with homeotropic alignment layers and with immobilized BSA atop. The LC cells were heated to the isotropic phase and then allowed to cool down to and maintained at distinct BP temperatures for spectral measurements and texture observations. At BSA concentrations below 10⁻⁶ g/ml, we observed that the Bragg reflection wavelength blue-shifted with increasing concentration of BSA, suggesting that the BP is a potentially sensitive medium for the detection and quantitation of biomolecules. By using the BPLC at 37 °C and the same polymorphic material in the smectic A phase at 20 °C, two linear correlations were established for logarithmic BSA concentrations ranging from 10⁻⁹ to 10⁻⁶ g/ml and from 10⁻⁶ to 10⁻³ g/ml. Our results demonstrate the potential of BPLCs in biosensing and quantitative analysis of biomolecules.
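
    Quantitation from such a linear correlation amounts to fitting a calibration line of optical response versus log concentration and inverting it for unknown samples. A sketch with hypothetical shift readings (the study reports correlations, not these numbers):

```python
import math

# Hypothetical calibration: Bragg-reflection blue shift (nm) versus
# log10 of BSA concentration (g/ml) in the low-concentration regime
conc  = [1e-9, 1e-8, 1e-7, 1e-6]
shift = [1.1, 2.9, 5.2, 7.0]          # assumed instrument readings
x = [math.log10(c) for c in conc]

# Ordinary least-squares line: shift = slope * log10(conc) + intercept
n = len(x)
mx, my = sum(x) / n, sum(shift) / n
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, shift))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

def quantitate(observed_shift):
    """Invert the calibration line to estimate log10(concentration)."""
    return (observed_shift - intercept) / slope

log_c = quantitate(4.0)  # estimated log10 concentration for a 4 nm shift
```

    With two regimes, as in the abstract, one would fit a separate line per concentration range and pick the appropriate one for inversion.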

  15. Doctoral Training in Statistics, Measurement, and Methodology in Psychology: Replication and Extension of Aiken, West, Sechrest, and Reno's (1990) Survey of PhD Programs in North America

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2008-01-01

    In a survey of all PhD programs in psychology in the United States and Canada, the authors documented the quantitative methodology curriculum (statistics, measurement, and research design) to examine the extent to which innovations in quantitative methodology have diffused into the training of PhDs in psychology. In all, 201 psychology PhD…

  16. Research on Customer Value Based on Extension Data Mining

    NASA Astrophysics Data System (ADS)

    Chun-Yan, Yang; Wei-Hua, Li

    Extenics is a new discipline for dealing with contradiction problems through formalized models. Extension data mining (EDM) is the product of combining Extenics with data mining. It explores the acquisition of knowledge based on extension transformations, called extension knowledge (EK), by taking advantage of extension methods and data mining technology. EK includes extensible classification knowledge, conductive knowledge and so on. Extension data mining technology (EDMT) is a new data mining technology that mines EK in databases or data warehouses. Customer value (CV) weighs the importance of a customer relationship for an enterprise, with the enterprise as the subject and the customers as the objects of value assessment. CV varies continually. Mining the changing knowledge of CV in databases using EDMT, including both quantitative-change and qualitative-change knowledge, can provide a foundation for an enterprise to decide its customer relationship management (CRM) strategy. It can also provide a new idea for studying CV.

  17. TrackMate: An open and extensible platform for single-particle tracking.

    PubMed

    Tinevez, Jean-Yves; Perry, Nick; Schindelin, Johannes; Hoopes, Genevieve M; Reynolds, Gregory D; Laplantine, Emmanuel; Bednarek, Sebastian Y; Shorte, Spencer L; Eliceiri, Kevin W

    2017-02-15

    We present TrackMate, an open source Fiji plugin for the automated, semi-automated, and manual tracking of single particles. It offers a versatile and modular solution that works out of the box for end users, through a simple and intuitive user interface. It is also easily scriptable and adaptable, operating equally well on 1D over time, 2D over time, 3D over time, or other single and multi-channel image variants. TrackMate provides several visualization and analysis tools that aid in assessing the relevance of results. The utility of TrackMate is further enhanced through its ability to be readily customized to meet specific tracking problems. TrackMate is an extensible platform where developers can easily write their own detection, particle linking, visualization or analysis algorithms within the TrackMate environment. This evolving framework provides researchers with the opportunity to quickly develop and optimize new algorithms based on existing TrackMate modules without the need to write de novo user interfaces, including visualization, analysis and exporting tools. The current capabilities of TrackMate are presented in the context of three different biological problems. First, we perform Caenorhabditis elegans lineage analysis to assess how light-induced damage during imaging impairs its early development. Our TrackMate-based lineage analysis indicates the lack of a cell-specific light-sensitive mechanism. Second, we investigate the recruitment of NEMO (NF-κB essential modulator) clusters in fibroblasts after stimulation by the cytokine IL-1 and show that photodamage can generate artifacts in the shape of TrackMate-characterized movements that confuse motility analysis. Finally, we validate the use of TrackMate for quantitative lifetime analysis of clathrin-mediated endocytosis in plant cells. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  18. GiA Roots: software for the high throughput analysis of plant root system architecture.

    PubMed

    Galkovskyi, Taras; Mileyko, Yuriy; Bucksch, Alexander; Moore, Brad; Symonova, Olga; Price, Charles A; Topp, Christopher N; Iyer-Pascuzzi, Anjali S; Zurek, Paul R; Fang, Suqin; Harer, John; Benfey, Philip N; Weitz, Joshua S

    2012-07-26

    Characterizing root system architecture (RSA) is essential to understanding the development and function of vascular plants. Identifying RSA-associated genes also represents an underexplored opportunity for crop improvement. Software tools are needed to accelerate the pace at which quantitative traits of RSA are estimated from images of root networks. We have developed GiA Roots (General Image Analysis of Roots), a semi-automated software tool designed specifically for the high-throughput analysis of root system images. GiA Roots includes user-assisted algorithms to distinguish root from background and a fully automated pipeline that extracts dozens of root system phenotypes. Quantitative information on each phenotype, along with intermediate steps for full reproducibility, is returned to the end-user for downstream analysis. GiA Roots has a GUI front end and a command-line interface for interweaving the software into large-scale workflows. GiA Roots can also be extended to estimate novel phenotypes specified by the end-user. We demonstrate the use of GiA Roots on a set of 2393 images of rice roots representing 12 genotypes from the species Oryza sativa. We validate trait measurements against prior analyses of this image set that demonstrated that RSA traits are likely heritable and associated with genotypic differences. Moreover, we demonstrate that GiA Roots is extensible and an end-user can add functionality so that GiA Roots can estimate novel RSA traits. In summary, we show that the software can function as an efficient tool as part of a workflow to move from large numbers of root images to downstream analysis.

  19. Probing baryogenesis through the Higgs boson self-coupling

    NASA Astrophysics Data System (ADS)

    Reichert, M.; Eichhorn, A.; Gies, H.; Pawlowski, J. M.; Plehn, T.; Scherer, M. M.

    2018-04-01

    The link between a modified Higgs self-coupling and the strong first-order phase transition necessary for baryogenesis is well explored for polynomial extensions of the Higgs potential. We broaden this argument beyond leading polynomial expansions of the Higgs potential to higher polynomial terms and to nonpolynomial Higgs potentials. For our quantitative analysis we resort to the functional renormalization group, which allows us to evolve the full Higgs potential to higher scales and finite temperature. In all cases we find that a strong first-order phase transition manifests itself in an enhancement of the Higgs self-coupling by at least 50%, implying that such modified Higgs potentials should be accessible at the LHC.

  20. Mechanisms for Superconductivity in Cuprates compared with results from the Generalized McMillan-Rowell Analysis of High Resolution Laser-ARPES

    NASA Astrophysics Data System (ADS)

    Varma, Chandra; Choi, Han-Yong; Zhang, Wentao; Zhou, Xingjiang

    2012-02-01

    The spectra of fluctuations and their coupling to fermions have been deduced from extensive high-resolution laser ARPES on several BSCCO samples and quantitatively analyzed. We ask whether some of the theories for superconductivity in cuprates are consistent or inconsistent with the frequency and momentum dependence of the deductions. We find that any fluctuation spectra, for example that of antiferromagnetic fluctuations, whose frequency dependence varies significantly with momentum, are excluded. We consider the quantum-critical spectra of the loop-current order observed in under-doped cuprates and its coupling to fermions, and find it consistent with the data.

  1. A quantitative investigation of the solar modulation of cosmic-ray protons and helium nuclei. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Garrard, T. L.

    1972-01-01

    The differential energy spectra of cosmic ray protons and He nuclei were measured at energies up to 315 MeV/nucleon using balloon-borne and satellite-borne instruments. These spectra are presented for solar quiet times for the years 1966 through 1970. The data analysis is verified by extensive accelerator calibrations of the detector systems and by calculations and measurements of the production of secondary protons in the atmosphere. The spectra of protons and He nuclei in this energy range are dominated by the solar modulation of the local interstellar spectra. Numerical solutions to the transport equation are presented for a wide range of parameters.

  2. On the thermodynamic and kinetic investigations of a [c2]daisy chain polymer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hmadeh, Mohamad; Fang, Lei; Trabolsi, Ali

    2010-01-01

    We report a variety of [c2]daisy chain molecules which undergo quantitative, efficient, and fully reversible molecular movements upon the addition of base/acid in organic solvents. Such externally triggered molecular movements can induce the contraction and extension of the [c2]daisy chain molecule as a whole. A linear polymer of such a bistable [c2]daisy chain exerts similar types of movements and can be looked upon as a candidate for the development of artificial muscles. The spectrophotometric investigations of both the monomeric and polymeric bistable [c2]daisy chains, as well as the corresponding model compounds, were performed in MeCN at room temperature, in order to obtain the thermodynamic parameters for these mechanically interlocked molecules. Based on their spectrophotometric and thermodynamic characteristics, kinetic analysis of the acid/base-induced contraction and extension of the [c2]daisy chain monomer and polymer was conducted by employing a stopped-flow technique. These kinetic data suggest that the rates of contraction and extension for these [c2]daisy chain molecules are determined by the thermodynamic stabilities of the corresponding kinetic intermediates. Faster switching rates for both the contraction and extension processes of the polymeric [c2]daisy chain were observed when compared to those of its monomeric counterpart. These kinetic and thermodynamic investigations on [c2]daisy chain-based muscle-like compounds provide important information for those seeking an understanding of the mechanisms of actuation in mechanically interlocked macromolecules.

  3. LC-MS Proteomics Analysis of the Insulin/IGF-1 Deficient Caenorhabditis elegans daf-2(e1370) Mutant Reveals Extensive Restructuring of Intermediary Metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Depuydt, Geert G.; Xie, Fang; Petyuk, Vladislav A.

    2014-02-20

    The insulin/IGF-1 receptor is a major known determinant of dauer formation, stress resistance, longevity and metabolism in C. elegans. In the past, whole-genome transcript profiling was used extensively to study differential gene expression in response to reduced insulin/IGF-1 signaling, including expression levels of metabolism-associated genes. Taking advantage of the recent developments in quantitative liquid chromatography mass-spectrometry (LC-MS) based proteomics, we profiled the proteomic changes that occur in response to activation of the DAF-16 transcription factor in the germline-less glp-4(bn2); daf-2(e1370) receptor mutant. Strikingly, the daf-2 profile suggests extensive reorganization of intermediary metabolism, characterized by the up-regulation of many core intermediary metabolic pathways. These include glycolysis/gluconeogenesis, glycogenesis, pentose phosphate cycle, citric acid cycle, glyoxylate shunt, fatty acid β-oxidation, one-carbon metabolism, propionate and tyrosine catabolism, and complexes I, II, III and V of the electron transport chain. Interestingly, we found simultaneous activation of reciprocally regulated metabolic pathways, which is indicative of spatio-temporal coordination of energy metabolism and/or extensive post-translational regulation of these enzymes. This restructuring of daf-2 metabolism is reminiscent of that of hypometabolic dauers, allowing the efficient and economical utilization of internal nutrient reserves, possibly also shunting metabolites through alternative energy-generating pathways, in order to sustain longevity.

  4. LC–MS Proteomics Analysis of the Insulin/IGF-1-Deficient Caenorhabditis elegans daf-2(e1370) Mutant Reveals Extensive Restructuring of Intermediary Metabolism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Depuydt, Geert; Xie, Fang; Petyuk, Vladislav A.

    2014-04-04

    The insulin/IGF-1 receptor is a major known determinant of dauer formation, stress resistance, longevity, and metabolism in Caenorhabditis elegans. In the past, whole-genome transcript profiling was used extensively to study differential gene expression in response to reduced insulin/IGF-1 signaling, including the expression levels of metabolism-associated genes. Taking advantage of the recent developments in quantitative liquid chromatography mass spectrometry (LC–MS)-based proteomics, we profiled the proteomic changes that occur in response to activation of the DAF-16 transcription factor in the germline-less glp-4(bn2);daf-2(e1370) receptor mutant. Strikingly, the daf-2 profile suggests extensive reorganization of intermediary metabolism, characterized by the upregulation of many core intermediary metabolic pathways. These include glycolysis/gluconeogenesis, glycogenesis, pentose phosphate cycle, citric acid cycle, glyoxylate shunt, fatty acid β-oxidation, one-carbon metabolism, propionate and tyrosine catabolism, and complexes I, II, III, and V of the electron transport chain. Interestingly, we found simultaneous activation of reciprocally regulated metabolic pathways, which is indicative of spatiotemporal coordination of energy metabolism and/or extensive post-translational regulation of these enzymes. Finally, this restructuring of daf-2 metabolism is reminiscent of that of hypometabolic dauers, allowing the efficient and economical utilization of internal nutrient reserves and possibly also shunting metabolites through alternative energy-generating pathways to sustain longevity.

  5. Functional linear models for association analysis of quantitative traits.

    PubMed

    Fan, Ruzong; Wang, Yifan; Mills, James L; Wilson, Alexander F; Bailey-Wilson, Joan E; Xiong, Momiao

    2013-11-01

    Functional linear models are developed in this paper for testing associations between quantitative traits and genetic variants, which can be rare variants, common variants, or a combination of the two. By treating multiple genetic variants of an individual in a human population as a realization of a stochastic process, the genome of an individual in a chromosome region is a continuum of sequence data rather than discrete observations. The genome of an individual is viewed as a stochastic function that contains both linkage and linkage disequilibrium (LD) information of the genetic markers. By using techniques of functional data analysis, both fixed and mixed effect functional linear models are built to test the association between quantitative traits and genetic variants, adjusting for covariates. After extensive simulation analysis, it is shown that the F-distributed tests of the proposed fixed effect functional linear models have higher power than that of the sequence kernel association test (SKAT) and its optimal unified test (SKAT-O) for three scenarios in most cases: (1) the causal variants are all rare, (2) the causal variants are both rare and common, and (3) the causal variants are common. The superior performance of the fixed effect functional linear models is most likely due to their optimal utilization of both genetic linkage and LD information of multiple genetic variants in a genome and similarity among different individuals, while SKAT and SKAT-O only model the similarities and pairwise LD but do not model linkage and higher-order LD information sufficiently. In addition, the proposed fixed effect models generate accurate type I error rates in simulation studies. We also show that the functional kernel score tests of the proposed mixed effect functional linear models are preferable in candidate gene analysis and small sample problems. The methods are applied to analyze three biochemical traits in data from the Trinity Students Study. © 2013 WILEY PERIODICALS, INC.
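The fixed-effect construction described above can be sketched under strong simplifying assumptions: a plain polynomial basis stands in for the functional basis, there is no covariate adjustment, and all names and the simulated data are ours rather than from the paper.

```python
import numpy as np
from scipy.stats import f as f_dist

def flm_ftest(genotypes, positions, trait, n_basis=3):
    """F-test for association between a quantitative trait and a genomic
    region, treating each subject's genotype vector as discrete samples
    of a smooth function of position (simplified sketch)."""
    n, m = genotypes.shape
    span = positions.max() - positions.min()
    t = (positions - positions.min()) / (span if span > 0 else 1.0)
    B = np.vander(t, n_basis, increasing=True)   # basis functions 1, t, t^2, ...
    X = genotypes @ B / m                        # per-subject functional scores
    X = np.column_stack([np.ones(n), X])         # intercept plus scores
    beta = np.linalg.lstsq(X, trait, rcond=None)[0]
    rss_full = np.sum((trait - X @ beta) ** 2)
    rss_null = np.sum((trait - trait.mean()) ** 2)   # intercept-only model
    df1, df2 = n_basis, n - n_basis - 1
    F = ((rss_null - rss_full) / df1) / (rss_full / df2)
    return F, f_dist.sf(F, df1, df2)

# Simulated region: 10 variants, the first 3 causal for the trait.
rng = np.random.default_rng(0)
G = rng.binomial(2, 0.3, size=(200, 10)).astype(float)
pos = np.linspace(0.0, 1.0, 10)
y = 0.5 * G[:, :3].sum(axis=1) + rng.normal(size=200)
F, p = flm_ftest(G, pos, y)
```

With a genuine causal signal the region-level F statistic lands well above 1 and the p-value is small; mirroring the published models would require richer bases (e.g. B-splines or Fourier) and covariate adjustment.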

  6. Isotopic exchange during derivatization of platelet activating factor for gas chromatography-mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haroldsen, P.E.; Gaskell, S.J.; Weintraub, S.T.

    1991-04-01

    One approach to the quantitative analysis of platelet activating factor (PAF, 1-O-alkyl-2-acetyl-sn-glycerol-3-phosphocholine; also referred to as AGEPC, alkyl glyceryl ether phosphocholine) is hydrolytic removal of the phosphocholine group and conversion to an electron-capturing derivative for gas chromatography-negative ion mass spectrometry. (2H3)Acetyl-AGEPC has been commonly employed as an internal standard. When 1-hexadecyl-2-(2H3)acetyl glycerol (obtained by enzymatic hydrolysis of (2H3)-C16:0 AGEPC) is treated with pentafluorobenzoyl chloride at 120 degrees C, the resulting 3-pentafluorobenzoate derivative shows extensive loss of the deuterium label. This exchange is evidently acid-catalyzed since derivatization of 1-hexadecyl-2-acetyl glycerol under the same conditions in the presence of a trace of 2HCl results in the incorporation of up to three deuterium atoms. Isotope exchange can be avoided if the reaction is carried out at low temperature in the presence of base. Direct derivatization of (2H3)-C16:0 AGEPC by treatment with pentafluorobenzoyl chloride or heptafluorobutyric anhydride also results in loss of the deuterium label. The use of (13C2)-C16:0 AGEPC as an internal standard is recommended for rigorous quantitative analysis.

  7. Wide-scale quantitative phosphoproteomic analysis reveals that cold treatment of T cells closely mimics soluble antibody stimulation

    PubMed Central

    Ji, Qinqin; Salomon, Arthur R.

    2015-01-01

    The activation of T-lymphocytes through antigen-mediated T-cell receptor (TCR) clustering is vital in regulating the adaptive immune response. Although T cell receptor signaling has been extensively studied, the fundamental mechanisms of signal initiation are not fully understood. Reduced temperature initiated some of the hallmarks of TCR signaling, such as increased phosphorylation and activation of ERK and calcium release from the endoplasmic reticulum, as well as coalescence of T-cell membrane microdomains. The precise mechanism of TCR signaling initiation due to temperature change remains obscure. One critical question is whether signaling initiated by cold treatment of T cells differs from signaling initiated by crosslinking of the T cell receptor. To address this uncertainty, a wide-scale, quantitative mass spectrometry-based phosphoproteomic analysis was performed on T cells stimulated either by temperature shift or through crosslinking of the TCR. Careful statistical comparison between the two stimulations revealed a striking level of identity across the subset of 339 sites that changed significantly with both stimulations. This study demonstrates for the first time, at unprecedented detail, that T cell cold treatment was sufficient to initiate signaling patterns nearly identical to soluble antibody stimulation, shedding new light on the mechanism of activation of these critically important immune cells. PMID:25839225

  8. Quantitative Analysis of Cellular Metabolic Dissipative, Self-Organized Structures

    PubMed Central

    de la Fuente, Ildefonso Martínez

    2010-01-01

    One of the most important goals of the postgenomic era is understanding the metabolic dynamic processes and the functional structures generated by them. Extensive studies during the last three decades have shown that the dissipative self-organization of the functional enzymatic associations, the catalytic reactions produced during the metabolite channeling, the microcompartmentalization of these metabolic processes and the emergence of dissipative networks are the fundamental elements of the dynamical organization of cell metabolism. Here we present an overview of how mathematical models can be used to address the properties of dissipative metabolic structures at different organizational levels, both for individual enzymatic associations and for enzymatic networks. Recent analyses performed with dissipative metabolic networks have shown that unicellular organisms display a singular global enzymatic structure common to all living cellular organisms, which seems to be an intrinsic property of the functional metabolism as a whole. Mathematical models firmly based on experiments and their corresponding computational approaches are needed to fully grasp the molecular mechanisms of metabolic dynamical processes. They are necessary to enable the quantitative and qualitative analysis of the cellular catalytic reactions and also to help comprehend the conditions under which the structural dynamical phenomena and biological rhythms arise. Understanding the molecular mechanisms responsible for the metabolic dissipative structures is crucial for unraveling the dynamics of cellular life. PMID:20957111

  9. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry.

    PubMed

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested, and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low picomolar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.
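The internal-standard quantification step can be illustrated with a toy calibration. This is a generic isotope-dilution sketch with invented numbers, not the published protocol: fit the analyte/labeled-standard intensity ratio against known analyte amounts, then invert the line for an unknown sample.

```python
import numpy as np

def quantify_with_internal_standard(known_amounts, intensity_ratios, sample_ratio):
    """Fit a linear standard curve of (analyte / labeled-standard intensity
    ratio) vs. known analyte amount, then invert it for an unknown sample.
    Hypothetical sketch; amounts in pmol, all values invented."""
    slope, intercept = np.polyfit(known_amounts, intensity_ratios, 1)
    return (sample_ratio - intercept) / slope

# Calibration: known AHL amounts spiked with a fixed amount of the
# isotope-labeled standard, each yielding a measured intensity ratio.
known_pmol = np.array([1.0, 2.0, 5.0, 10.0])
ratios = np.array([0.11, 0.21, 0.51, 1.01])
estimate = quantify_with_internal_standard(known_pmol, ratios, sample_ratio=0.41)
print(round(estimate, 3))  # 4.0
```

Because the labeled standard co-ionizes with the analyte, the intensity ratio largely cancels matrix effects, which is what makes accurate standard curves possible without extensive purification.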

  10. Dancing Styles of Collective Cell Migration: Image-Based Computational Analysis of JRAB/MICAL-L2.

    PubMed

    Sakane, Ayuko; Yoshizawa, Shin; Yokota, Hideo; Sasaki, Takuya

    2018-01-01

    Collective cell migration is observed during morphogenesis, angiogenesis, and wound healing, and this type of cell migration also contributes to efficient metastasis in some kinds of cancers. Because collectively migrating cells are much better organized than a random assemblage of individual cells, there seems to be a kind of order in migrating clusters. Extensive research has identified a large number of molecules involved in collective cell migration, and these factors have been analyzed using dramatic advances in imaging technology. To date, however, it remains unclear how myriad cells are integrated as a single unit. Recently, we observed unbalanced collective cell migrations that can be likened to either precision dancing or awa-odori, a traditional Japanese dance similar in style to that of the Rio Carnival, caused by the impairment of the conformational change of JRAB/MICAL-L2. This review begins with a brief history of image-based computational analyses of cell migration, explains why quantitative analysis of the stylization of collective cell behavior is difficult, and finally introduces our recent work on JRAB/MICAL-L2 as a successful example of a multidisciplinary approach combining cell biology, live imaging, and computational biology. In combination, these methods have enabled quantitative evaluations of the "dancing style" of collective cell migration.

  11. Haplotypes in the APOA1-C3-A4-A5 gene cluster affect plasma lipids in both humans and baboons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qian-fei; Liu, Xin; O'Connell, Jeff

    2003-09-15

    Genetic studies in non-human primates serve as a potential strategy for identifying genomic intervals where polymorphisms impact upon human disease-related phenotypes. It remains unclear, however, whether independently arising polymorphisms in orthologous regions of non-human primates lead to similar variation in a quantitative trait found in both species. To explore this paradigm, we studied a baboon apolipoprotein gene cluster (APOA1/C3/A4/A5) for which the human gene orthologs have well established roles in influencing plasma HDL-cholesterol and triglyceride concentrations. Our extensive polymorphism analysis of this 68 kb gene cluster in 96 pedigreed baboons identified several haplotype blocks, each with limited diversity, consistent with haplotype findings in humans. To determine whether baboons, like humans, also have particular haplotypes associated with lipid phenotypes, we genotyped 634 well characterized baboons using 16 haplotype tagging SNPs. Genetic analysis of single SNPs, as well as haplotypes, revealed an association of APOA5 and APOC3 variants with HDL cholesterol and triglyceride concentrations, respectively. Thus, independent variation in orthologous genomic intervals does associate with similar quantitative lipid traits in both species, supporting the possibility of uncovering human QTL genes in a highly controlled non-human primate model.

  12. Characterization of Si (sub X)Ge (sub 1-x)/Si Heterostructures for Device Applications Using Spectroscopic Ellipsometry

    NASA Technical Reports Server (NTRS)

    Sieg, R. M.; Alterovitz, S. A.; Croke, E. T.; Harrell, M. J.; Tanner, M.; Wang, K. L.; Mena, R. A.; Young, P. G.

    1993-01-01

    Spectroscopic ellipsometry (SE) characterization of several complex Si (sub X)Ge (sub 1-x)/Si heterostructures prepared for device fabrication, including structures for heterojunction bipolar transistors (HBTs) and for p-type and n-type heterostructure modulation-doped field-effect transistors, has been performed. We have shown that SE can simultaneously determine all active layer thicknesses, Si (sub X)Ge (sub 1-x) compositions, and the oxide overlayer thickness, with only a general knowledge of the structure topology needed a priori. The characterization of HBT material included the SE analysis of a Si (sub X)Ge (sub 1-x) layer deeply buried (600 nanometers) under the silicon emitter and cap layers. In the SE analysis of n-type heterostructures, we examined for the first time a silicon layer under tensile strain. We found that an excellent fit can be obtained using optical constants of unstrained silicon to represent the strained silicon conduction layer. We also used SE to measure lateral sample homogeneity, providing quantitative identification of the inhomogeneous layer. Surface overlayers resulting from prior sample processing were also detected and measured quantitatively. These results should allow SE to be used extensively as a non-destructive means of characterizing Si (sub X)Ge (sub 1-x)/Si heterostructures prior to device fabrication and testing.

  13. Characterization methods for liquid interfacial layers

    NASA Astrophysics Data System (ADS)

    Javadi, A.; Mucic, N.; Karbaschi, M.; Won, J. Y.; Lotfi, M.; Dan, A.; Ulaganathan, V.; Gochev, G.; Makievski, A. V.; Kovalchuk, V. I.; Kovalchuk, N. M.; Krägel, J.; Miller, R.

    2013-05-01

    Liquid interfaces are met everywhere in our daily life. The corresponding interfacial properties and their modification play an important role in many modern technologies. The most prominent examples are the processes involved in the formation of foams and emulsions, as they are based on the fast creation of new surfaces, often of immense extent. During the formation of an emulsion, for example, all freshly created and already existing interfaces are permanently subject to all types of deformation. This clearly entails the need for quantitative knowledge of relevant dynamic interfacial properties and their changes under conditions pertinent to the technological processes. We report on the state of the art of interfacial layer characterization, including the determination of thermodynamic quantities as a baseline for further quantitative analysis of the more important dynamic interfacial characteristics. The main focus of the presented work is on the experimental possibilities available at present for obtaining dynamic interfacial parameters, such as interfacial tensions, adsorbed amounts, interfacial composition, and visco-elastic parameters, at the shortest available surface ages and fastest possible interfacial perturbations. The experimental opportunities are presented along with examples for selected systems and theoretical models for the best data analysis. We also report on simulation results and on concepts for necessary refinements and developments in this important field of interfacial dynamics.

  14. Identification and Quantification of N-Acyl Homoserine Lactones Involved in Bacterial Communication by Small-Scale Synthesis of Internal Standards and Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Leipert, Jan; Treitz, Christian; Leippe, Matthias; Tholey, Andreas

    2017-12-01

    N-acyl homoserine lactones (AHL) are small signal molecules involved in the quorum sensing of many gram-negative bacteria, and play an important role in biofilm formation and pathogenesis. Present analytical methods for identification and quantification of AHL require time-consuming sample preparation steps and are hampered by the lack of appropriate standards. Aiming at a fast and straightforward method for AHL analytics, we investigated the applicability of matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Suitable MALDI matrices, including crystalline and ionic liquid matrices, were tested, and the fragmentation of different AHL in collision-induced dissociation MS/MS was studied, providing information about characteristic marker fragment ions. Employing small-scale synthesis protocols, we established a versatile and cost-efficient procedure for fast generation of isotope-labeled AHL standards, which can be used without extensive purification and yielded accurate standard curves. Quantitative analysis was possible in the low picomolar range, with lower limits of quantification ranging from 1 to 5 pmol for different AHL. The developed methodology was successfully applied in a quantitative MALDI MS analysis of low-volume culture supernatants of Pseudomonas aeruginosa.

  15. High Concentrations of Atmospheric Ammonia Induce Alterations in the Hepatic Proteome of Broilers (Gallus gallus): An iTRAQ-Based Quantitative Proteomic Analysis

    PubMed Central

    Zhang, Jize; Li, Cong; Tang, Xiangfang; Lu, Qingping; Sa, Renna; Zhang, Hongfu

    2015-01-01

    With the development of the poultry industry, ammonia, as a main contaminant in the air, is causing increasing problems with broiler health. To date, most studies of ammonia toxicity have focused on the nervous system and the gastrointestinal tract in mammals. However, few detailed studies have been conducted on the hepatic response to ammonia toxicity in poultry, and the molecular mechanisms that underlie these effects remain unclear. In the present study, our group applied isobaric tags for relative and absolute quantitation (iTRAQ)-based quantitative proteomic analysis to investigate changes in the protein profile of hepatic tissue of broilers exposed to high concentrations of atmospheric ammonia, with the goal of characterizing the molecular mechanisms of chronic liver injury from exposure to high ambient levels of ammonia. Overall, 30 differentially expressed proteins that are involved in nutrient metabolism (energy, lipid, and amino acid), immune response, transcriptional and translational regulation, stress response, and detoxification were identified. In particular, two of these proteins, beta-1 galactosidase (GLB1) and A-kinase (PRKA) anchor protein 8-like (AKAP8L), were previously suggested to be potential biomarkers of chronic liver injury. In addition to the changes in the protein profile, serum parameters and histochemical analyses of hepatic tissue also showed extensive hepatic damage in ammonia-exposed broilers. Altogether, these findings suggest that long-term exposure to high concentrations of atmospheric ammonia can trigger chronic hepatic injury in broilers via different mechanisms, providing new information that can be used for intervention using nutritional strategies in the future. PMID:25901992

  16. Integration of PKPD relationships into benefit–risk analysis

    PubMed Central

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-01-01

    Aim Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit–risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit–risk assessment. In addition, we propose the use of pharmacokinetic–pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. Methods A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit–risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. Results A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit–risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit–risk balance before extensive evidence is generated in clinical practice. Conclusions Benefit–risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. PMID:25940398

  17. Integration of PKPD relationships into benefit-risk analysis.

    PubMed

    Bellanti, Francesco; van Wijk, Rob C; Danhof, Meindert; Della Pasqua, Oscar

    2015-11-01

    Despite the continuous endeavour to achieve high standards in medical care through effectiveness measures, a quantitative framework for the assessment of the benefit-risk balance of new medicines is lacking prior to regulatory approval. The aim of this short review is to summarise the approaches currently available for benefit-risk assessment. In addition, we propose the use of pharmacokinetic-pharmacodynamic (PKPD) modelling as the pharmacological basis for evidence synthesis and evaluation of novel therapeutic agents. A comprehensive literature search has been performed using MeSH terms in PubMed, in which articles describing benefit-risk assessment and modelling and simulation were identified. In parallel, a critical review of multi-criteria decision analysis (MCDA) is presented as a tool for characterising a drug's safety and efficacy profile. A definition of benefits and risks has been proposed by the European Medicines Agency (EMA), in which qualitative and quantitative elements are included. However, in spite of the value of MCDA as a quantitative method, decisions about benefit-risk balance continue to rely on subjective expert opinion. By contrast, a model-informed approach offers the opportunity for a more comprehensive evaluation of benefit-risk balance before extensive evidence is generated in clinical practice. Benefit-risk balance should be an integral part of the risk management plan and as such considered before marketing authorisation. Modelling and simulation can be incorporated into MCDA to support evidence synthesis as well as evidence generation, taking into account the underlying correlations between favourable and unfavourable effects. In addition, it represents a valuable tool for the optimization of protocol design in effectiveness trials. © 2015 The British Pharmacological Society.

  18. Effects of different amine fluoride concentrations on enamel remineralization.

    PubMed

    Naumova, E A; Niemann, N; Aretz, L; Arnold, W H

    2012-09-01

    The aim of this study was to investigate the effects of decreasing fluoride concentrations on human enamel subjected to repeated demineralizing challenges. In 24 teeth, 3 mm × 3 mm windows were prepared on the buccal and lingual sides and treated in a cycling demineralization-remineralization model. Remineralization was achieved with 100, 10 and 0.1 ppm fluoride from amine fluoride. Coronal sections were cut through the artificial lesions, and three sections per tooth were investigated using polarized light microscopy and scanning electron microscopy with quantitative element analysis. The morphology of the lesions was studied, and the extensions of the superficial layer and the body of the lesion were measured. Using element analysis, the Ca, P and F content was determined. The body of the lesion appeared remineralized after application of 100 ppm fluoride, while remineralization of the lesion was less successful after application of 10 and 0.1 ppm fluoride. The thickness of the superficial layer increased with decreasing fluoride concentrations, and the extension of the body of the lesion also increased. Ca and P content increased with increasing fluoride concentrations. The effectiveness of fluoride in enamel remineralization increased with increasing fluoride concentration. A consistently higher level of fluoride in saliva should be a goal in caries prevention. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Analysis of cocaine and metabolites in hair: validation and application of measurement of hydroxycocaine metabolites as evidence of cocaine ingestion.

    PubMed

    Schaffer, Michael; Cheng, Chen-Chih; Chao, Oscar; Hill, Virginia; Matsui, Paul

    2016-03-01

    An LC/MS/MS method to identify and quantitate the minor metabolites of cocaine (meta-, para-, and ortho-hydroxycocaine) in hair was developed and validated. Analysis was performed on a triple-quadrupole ABSciex API 3000 MS equipped with an atmospheric pressure ionization source via IonSpray (ESI). For LC, a Series 200 micro binary pump with a Perkin Elmer Model 200 autosampler was used. The limit of detection (LOD) and limit of quantification (LOQ) were 0.02 ng/10 mg hair, with linearity from 0.02 to 10 ng/10 mg hair. Concentrations of the para isomer in extensively washed hair samples were in the range of 1-2% of the cocaine in the sample, while the concentrations of the ortho form were considerably lower. The method was used to analyze large numbers of samples from two populations: workplace and criminal justice. In vitro experiments to determine whether deodorants or peroxide-containing cosmetic treatments could result in the presence of these metabolites in hair showed that this does not occur with extensively washed hair. The presence of hydroxycocaines, when detected after aggressive washing of the hair samples, provides a valuable additional indicator of ingestion of cocaine rather than mere environmental exposure.

  20. A case of extensive hyperostosis frontalis interna in an 87-year-old female human cadaver.

    PubMed

    Talarico, Ernest F; Prather, Andrew D; Hardt, Kevin D

    2008-04-01

    Hyperostosis frontalis interna (HFI) is a condition that involves thickening of the inner surface of the frontal bone with sparing of the midline. Little is known about the etiology and clinical presentation of HFI. We report unusual findings in a woman with extensive Type D hyperostosis of the frontal bone and a large hyperostotic nodule in the parietal bone with impingement on the precentral gyrus, distinguishing this from the common form of HFI. The scalp was dissected from the cranial vault, and the calvaria and brain were removed and digitally imaged. Bone specimens were embedded in methyl methacrylate plastic, sectioned, and stained using the Von Kossa Method with MacNeal's tetrachrome. Medical records were reviewed, and additional history was obtained through interviews with the donor's family. The calvaria had extensive, bilateral thickening of the frontal bone with irregular topography and clearly demarcated borders. The dura was adherent to all hyperostotic regions. A 3.5-cm nodule was visible on the inner table of the left parietal bone. The dura and cerebrum showed compression in this region, but it was unclear if this resulted in clinical ramifications. Microscopic analysis revealed a larger proportion of cancellous bone was present in regions of macroscopic hyperostosis. Quantitative analysis of sections through areas of gross hyperostosis demonstrated a lower proportion of lamellar bone than in the control. The patient exhibited symptoms that have been correlated to HFI in previous studies. We suggest that the HFI disease process was responsible for the manifestation of these symptoms in this patient. (c) 2008 Wiley-Liss, Inc.

  1. Investigation of Drag and Pressure Distribution of Windshields at High Speeds

    DTIC Science & Technology

    1942-01-01

    …sharp negative pressure peaks and by low positive pressure gradients over the tail. Of the windshields represented in figure 11, the … extension of the field with Mach number. In the quantitative discussion of interference, it is convenient to consider the velocity-increment coefficient … Before the effect of interference due to the wing and fuselage can be quantitatively estimated, the velocity increments due to these bodies must be…

  2. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
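
    Dempster's rule of combination, used here to integrate inferences from the parallel diagnostic models, can be sketched in a few lines. The fault hypotheses and mass values below are hypothetical illustrations, not values from the paper.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    Masses are dicts mapping frozenset hypotheses -> mass. Intersecting
    hypotheses reinforce each other; mass assigned to disjoint pairs is
    the conflict, which is normalised away.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict: evidence cannot be combined")
    norm = 1.0 - conflict
    return {h: m / norm for h, m in combined.items()}

# Two hypothetical diagnostic models over fault hypotheses {f1, f2}:
F1, F2, BOTH = frozenset({"f1"}), frozenset({"f2"}), frozenset({"f1", "f2"})
model_a = {F1: 0.6, BOTH: 0.4}   # leans towards fault f1
model_b = {F2: 0.5, BOTH: 0.5}   # leans towards fault f2
print(dempster_combine(model_a, model_b))
```

    The conflicting masses (here 0.6 × 0.5 on disjoint hypotheses) are discarded and the remainder renormalised, so the combined assignment still sums to one.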

  3. Estimation of Vulnerability Functions for Debris Flows Using Different Intensity Parameters

    NASA Astrophysics Data System (ADS)

    Akbas, S. O.; Blahut, J.; Luna, B. Q.; Sterlacchini, S.

    2009-04-01

    In landslide risk research, the majority of past studies have focused on hazard analysis, with only a few targeting the concept of vulnerability. When debris flows are considered, there is no consensus or even modest agreement on a generalized methodology to estimate the physical vulnerability of affected buildings. Very few quantitative relationships have been proposed between intensities and vulnerability values. More importantly, in most of the existing relationships, information on process intensity is often missing or only described semi-quantitatively. However, robust assessment of vulnerabilities, along with the associated uncertainties, is of utmost importance from a quantitative risk analysis point of view. On the morning of 13th July 2008, after more than two days of intense rainfall, several debris and mud flows were released in the central part of Valtellina, an Italian alpine valley in Lombardy Region. One of the largest muddy-debris flows occurred in Selvetta, a hamlet of Colorina municipality. The result was the complete destruction of two buildings and damage at varying severity levels to eight others. The authors had the chance to gather detailed information about the event by conducting extensive field work and interviews with local inhabitants, civil protection teams, and officials. In addition to the data gathered from the field studies, the main characteristics of the debris flow were estimated using numerical and empirical approaches. The extensive data obtained from the Selvetta event gave an opportunity to develop three separate empirical vulnerability curves, which are functions of deposition height, debris flow velocity, and pressure, respectively. Deposition heights were directly obtained from field surveys, whereas the velocity and pressure values were back-calculated using the finite difference program FLO-2D. The vulnerability was defined as the ratio between the monetary loss and the reconstruction value. The monetary losses were obtained from official RASDA documents, which were compiled for claim purposes. For each building, the approximate reconstruction value was calculated according to the building type and size, using the official data given in the Housing Prices Index prepared by the Engineers and Architects of Milan. The resulting vulnerability curves were compared to those in the literature, and among themselves. Specific recommendations were given regarding the most suitable parameter for characterizing the intensity of debris flows within the context of physical vulnerability.
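
    The core vulnerability definition (monetary loss divided by reconstruction value) and a simple intensity-vulnerability fit can be sketched as follows. All building data below are invented for illustration; the study's actual curves were fitted to the surveyed Selvetta data.

```python
# Sketch of the vulnerability computation and a simple intensity-vulnerability
# fit. Buildings, losses, and deposition heights are invented illustrations.

def vulnerability(loss, reconstruction_value):
    """Vulnerability as monetary loss / reconstruction value, capped at 1."""
    return min(loss / reconstruction_value, 1.0)

def linear_fit(xs, ys):
    """Ordinary least-squares line ys ~ a*xs + b (pure Python)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical buildings: (deposition height [m], loss [EUR], reconstruction [EUR])
buildings = [(0.4, 30_000, 300_000), (0.9, 120_000, 300_000),
             (1.5, 210_000, 300_000), (2.2, 330_000, 300_000)]
heights = [h for h, _, _ in buildings]
vulns = [vulnerability(l, r) for _, l, r in buildings]
slope, intercept = linear_fit(heights, vulns)
print(f"V(h) ~ {slope:.2f}*h + {intercept:.2f}")
```

    A linear fit is only a placeholder here; published debris-flow vulnerability curves are often sigmoid in the intensity parameter, but the loss-ratio definition is the same.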

  4. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing

    PubMed Central

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-01-01

    Aims: A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R2), using R2 as the primary metric of assay agreement. However, the use of R2 alone does not adequately quantify the constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. Methods: We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing (NGS) assays. NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Results: Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. The Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. Conclusions: The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of the performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. PMID:28747393
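
    As a minimal illustration of the Bland-Altman statistics discussed above, the following sketch computes the mean bias and 95% limits of agreement for paired assay measurements. The variant-allele-fraction values are hypothetical, not data from the study.

```python
import math

def bland_altman(xs, ys):
    """Bland-Altman agreement statistics for paired measurements from two
    assays: mean bias and 95% limits of agreement (bias +/- 1.96*SD)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical variant allele fractions: validated assay vs NGS assay.
ref = [0.05, 0.12, 0.25, 0.40, 0.48]
ngs = [0.06, 0.11, 0.27, 0.42, 0.47]
bias, (lo, hi) = bland_altman(ngs, ref)
print(f"bias={bias:+.3f}, 95% LoA=({lo:+.3f}, {hi:+.3f})")
```

    A non-zero bias indicates constant error between the assays; a difference that grows with the measurement magnitude (visible when differences are plotted against means) indicates proportional error, which is where Deming regression complements this analysis.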

  5. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty of establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions, where establishing ground truth becomes questionable, and smaller deviations for larger lesions, where ground-truth setup is more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.

  6. Purity-activity relationships of natural products: the case of anti-TB active ursolic acid.

    PubMed

    Jaki, Birgit U; Franzblau, Scott G; Chadwick, Lucas R; Lankin, David C; Zhang, Fangqiu; Wang, Yuehong; Pauli, Guido F

    2008-10-01

    The present study explores the variability of biological responses from the perspective of sample purity and introduces the concept of purity-activity relationships (PARs) in natural product research. The abundant plant triterpene ursolic acid (1) was selected as an exemplary natural product due to the overwhelming number yet inconsistent nature of its approximately 120 reported biological activities, which include anti-TB potential. Nine different samples of ursolic acid with purity certifications were obtained, and their purity was independently assessed by means of quantitative 1H NMR (qHNMR). Biological evaluation consisted of determining MICs against two strains of virulent Mycobacterium tuberculosis and IC50 values in Vero cells. Ab initio structure elucidation provided unequivocal structural confirmation and included an extensive 1H NMR spin system analysis, determination of nearly all J couplings and the complete NOE pattern, and led to the revision of earlier reports. As a net result, a sigmoid PAR profile of 1 was obtained, demonstrating the inverse correlation of purity and anti-TB bioactivity. The results imply that synergistic effects of 1 and its varying impurities are the likely cause of previously reported antimycobacterial potential. Generating PARs is a powerful extension of the routinely performed quantitative correlation of structure and activity ([Q]SAR). Advanced by the use of primary analytical methods such as qHNMR, PARs enable the elucidation of cases like 1 when increasing purity voids biological activity. This underlines the potential of PARs as a tool in drug discovery and synergy research and accentuates the need to routinely combine biological testing with purity assessment.

  7. [Quantitative surface analysis of Pt-Co, Cu-Au and Cu-Ag alloy films by XPS and AES].

    PubMed

    Li, Lian-Zhong; Zhuo, Shang-Jun; Shen, Ru-Xiang; Qian, Rong; Gao, Jie

    2013-11-01

    To improve the accuracy of quantitative AES analysis, we combined XPS with AES and studied how to reduce the error of AES quantification. Pt-Co, Cu-Au and Cu-Ag binary alloy thin films were selected as samples, and XPS was used to correct the AES quantification results by adjusting the Auger relative sensitivity factors until the two techniques gave closely matching compositions. The corrected sensitivity factors were then verified on other samples with different composition ratios; the results showed that they reduce the error in quantitative AES analysis to less than 10%. In integral-spectrum AES analysis, peak definition is difficult because the choice of starting and ending points for the characteristic Auger peak area introduces considerable uncertainty. To simplify the analysis, we also processed the data in differential-spectrum form, performed quantification on the basis of peak-to-peak height instead of peak area, corrected the relative sensitivity factors accordingly, and again verified the accuracy on samples with different composition ratios. In this form, the error in quantitative AES analysis was reduced to less than 9%. These results show that the accuracy of quantitative AES analysis can be greatly improved by using XPS to correct the Auger sensitivity factors, since matrix effects are thereby taken into account. The good consistency obtained demonstrates the feasibility of this method.
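
    The sensitivity-factor quantification underlying this correction can be sketched as follows: the atomic fraction of each element is its measured intensity divided by its relative sensitivity factor, normalised over all elements. The intensities and factors below are invented illustrations, not the paper's corrected values.

```python
# Sketch of relative-sensitivity-factor (RSF) quantification, the calculation
# that corrected Auger sensitivity factors feed into. Values are hypothetical.

def atomic_fractions(intensities, factors):
    """Atomic fraction of element i: (I_i / S_i) / sum_j (I_j / S_j),
    where I is the measured peak intensity (peak area or peak-to-peak
    height) and S the relative sensitivity factor."""
    weighted = {el: intensities[el] / factors[el] for el in intensities}
    total = sum(weighted.values())
    return {el: w / total for el, w in weighted.items()}

# Hypothetical Pt-Co film: AES peak-to-peak heights and corrected RSFs.
intensities = {"Pt": 1200.0, "Co": 800.0}
factors = {"Pt": 0.8, "Co": 0.5}   # RSFs after XPS-based correction
print(atomic_fractions(intensities, factors))
```

    Because the sensitivity factors appear only as divisors, any systematic error in them propagates directly into the composition, which is why correcting them against XPS reduces the overall quantification error.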

  8. Polymerase chain displacement reaction.

    PubMed

    Harris, Claire L; Sanchez-Vargas, Irma J; Olson, Ken E; Alphey, Luke; Fu, Guoliang

    2013-02-01

    Quantitative PCR assays are now the standard method for viral diagnostics. These assays must be specific, as well as sensitive, to detect the potentially low starting copy number of viral genomic material. We describe a new technique, polymerase chain displacement reaction (PCDR), which uses multiple nested primers in a rapid, capped, one-tube reaction that increases the sensitivity of normal quantitative PCR (qPCR) assays. Sensitivity was increased by approximately 10-fold in a proof-of-principle test on dengue virus sequence. In PCDR, when extension occurs from the outer primer, it displaces the extension strand produced from the inner primer by utilizing a polymerase that has strand displacement activity. This allows a greater than 2-fold increase of amplification product for each amplification cycle and therefore increased sensitivity and speed over conventional PCR. Increased sensitivity in PCDR would be useful in nucleic acid detection for viral diagnostics.
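
    The sensitivity argument can be illustrated with a toy growth calculation: if strand displacement lifts the per-cycle gain from 2x towards 3x, fewer cycles are needed to reach a detection threshold. The per-cycle factors, starting copy number, and threshold below are illustrative assumptions, not measured PCDR efficiencies.

```python
import math

def cycles_to_threshold(start_copies, per_cycle_factor, threshold):
    """Cycles needed for start_copies * factor**n to reach the threshold."""
    return math.ceil(math.log(threshold / start_copies, per_cycle_factor))

# Hypothetical run: 10 starting copies, detection at 1e10 amplicons.
start, threshold = 10, 1e10
print("qPCR  (2x/cycle):", cycles_to_threshold(start, 2.0, threshold))
print("PCDR (~3x/cycle):", cycles_to_threshold(start, 3.0, threshold))
```

    At a fixed cycle count, the faster growth translates into detection of proportionally fewer starting copies, consistent with the roughly 10-fold sensitivity gain reported.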

  9. On the analysis of time-of-flight spin-echo modulated dark-field imaging data

    NASA Astrophysics Data System (ADS)

    Sales, Morten; Plomp, Jeroen; Bouwman, Wim G.; Tremsin, Anton S.; Habicht, Klaus; Strobl, Markus

    2017-06-01

    Spin-Echo Modulated Small Angle Neutron Scattering with spatial resolution, i.e. quantitative Spin-Echo Dark-Field Imaging, is an emerging technique that couples neutron imaging with spatially resolved quantitative small-angle scattering information. However, the currently achievable modulation periods, of the order of millimetres, are superimposed on the images of the samples. So far this has required independent reduction and analysis of the image and scattering information encoded in the measured data, involving extensive curve-fitting routines. Besides requiring a priori decisions that potentially limit the extractable information content, this also hinders a straightforward judgement of data quality. In contrast, we propose a significantly simplified routine applied directly to the measured data, which not only allows an immediate first assessment of data quality and defers decisions on potentially information-limiting reduction steps to a later, better-informed stage, but also, as our results suggest, generally yields better analyses. In addition, the method makes it possible to drop the spatial-resolution detector requirement for non-spatially resolved Spin-Echo Modulated Small Angle Neutron Scattering.

  10. Stochastic hydrogeology: what professionals really need?

    PubMed

    Renard, Philippe

    2007-01-01

    Quantitative hydrogeology celebrated its 150th anniversary in 2006. Geostatistics is younger but has had a very large impact in hydrogeology. Today, geostatistics is used routinely to interpolate deterministically most of the parameters that are required to analyze a problem or make a quantitative analysis. In a small number of cases, geostatistics is combined with deterministic approaches to forecast uncertainty. At a more academic level, geostatistics is used extensively to study physical processes in heterogeneous aquifers. Yet, there is an important gap between the academic use and the routine applications of geostatistics. The reasons for this gap are diverse. These include aspects related to the hydrogeology consulting market, technical reasons such as the lack of widely available software, but also a number of misconceptions. A change in this situation requires acting at different levels. First, regulators must be convinced of the benefit of using geostatistics. Second, the economic potential of the approach must be emphasized to customers. Third, the relevance of the theories needs to be increased. Last, but not least, software, data sets, and computing infrastructure such as grid computing need to be widely available.

  11. Revealing martensitic transformation and α/β interface evolution in electron beam melting three-dimensional-printed Ti-6Al-4V

    PubMed Central

    Tan, Xipeng; Kok, Yihong; Toh, Wei Quan; Tan, Yu Jun; Descoins, Marion; Mangelinck, Dominique; Tor, Shu Beng; Leong, Kah Fai; Chua, Chee Kai

    2016-01-01

    As an important metal three-dimensional printing technology, electron beam melting (EBM) is gaining increasing attention due to its huge potential applications in aerospace and biomedical fields. EBM processing of Ti-6Al-4V, as well as its microstructure and mechanical properties, has been extensively investigated. However, quantitative studies of its microstructural evolution, which is indicative of the EBM thermal process, are still lacking. Here, we report the α′ martensitic transformation and α/β interface evolution in varied printing thicknesses of EBM-printed Ti-6Al-4V block samples by means of atom probe tomography. Quantitative chemical composition analysis suggests a general phase transformation sequence. With increasing in-fill hatched thickness, elemental partitioning ratios rise and the β volume fraction increases. Furthermore, we observe kinetic vanadium segregation and aluminum depletion at the interface front and the resultant α/β interface widening phenomenon. This may give rise to an increased α/β lattice mismatch and weakened α/β interfaces, which could account for the degraded strength as printing thickness increases. PMID:27185285

  12. Quantitative analysis of chromosome condensation in fission yeast.

    PubMed

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota; Haering, Christian H

    2013-03-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and derive defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote.
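
    The distance-based readout behind the assay can be sketched as follows: compute the 3D Euclidean distance between two fluorescently marked loci at each time point, with a sustained drop reflecting compaction. The coordinates below are invented for illustration, not data from the study.

```python
import math

# Toy sketch of the loci-distance readout. Coordinates (in micrometres)
# are invented; real assays track loci automatically in 3D image stacks.

def locus_distance(p, q):
    """3D Euclidean distance between two locus positions (x, y, z)."""
    return math.dist(p, q)

# One locus pair tracked across four time points:
locus_a = [(1.0, 2.0, 0.5), (1.1, 2.1, 0.5), (1.2, 2.0, 0.6), (1.3, 1.9, 0.6)]
locus_b = [(4.0, 2.0, 0.5), (3.5, 2.1, 0.5), (2.4, 2.0, 0.6), (1.9, 1.9, 0.6)]
distances = [locus_distance(a, b) for a, b in zip(locus_a, locus_b)]
print([round(d, 2) for d in distances])  # shrinking distances => condensation
```

    Condensation parameters such as onset time or compaction rate can then be derived from the shape of this distance-versus-time curve.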

  13. Quantitative Analysis of Chromosome Condensation in Fission Yeast

    PubMed Central

    Petrova, Boryana; Dehler, Sascha; Kruitwagen, Tom; Hériché, Jean-Karim; Miura, Kota

    2013-01-01

    Chromosomes undergo extensive conformational rearrangements in preparation for their segregation during cell divisions. Insights into the molecular mechanisms behind this still poorly understood condensation process require the development of new approaches to quantitatively assess chromosome formation in vivo. In this study, we present a live-cell microscopy-based chromosome condensation assay in the fission yeast Schizosaccharomyces pombe. By automatically tracking the three-dimensional distance changes between fluorescently marked chromosome loci at high temporal and spatial resolution, we analyze chromosome condensation during mitosis and meiosis and derive defined parameters to describe condensation dynamics. We demonstrate that this method can determine the contributions of condensin, topoisomerase II, and Aurora kinase to mitotic chromosome condensation. We furthermore show that the assay can identify proteins required for mitotic chromosome formation de novo by isolating mutants in condensin, DNA polymerase ε, and F-box DNA helicase I that are specifically defective in pro-/metaphase condensation. Thus, the chromosome condensation assay provides a direct and sensitive system for the discovery and characterization of components of the chromosome condensation machinery in a genetically tractable eukaryote. PMID:23263988

  14. Quantitative comparison of a human cancer cell surface proteome between interphase and mitosis.

    PubMed

    Özlü, Nurhan; Qureshi, Mohammad H; Toyoda, Yusuke; Renard, Bernhard Y; Mollaoglu, Gürkan; Özkan, Nazlı E; Bulbul, Selda; Poser, Ina; Timm, Wiebke; Hyman, Anthony A; Mitchison, Timothy J; Steen, Judith A

    2015-01-13

    The cell surface is the cellular compartment responsible for communication with the environment. The interior of mammalian cells undergoes dramatic reorganization when cells enter mitosis. These changes are triggered by activation of the CDK1 kinase and have been studied extensively. In contrast, very little is known of the cell surface changes during cell division. We undertook a quantitative proteomic comparison of cell surface-exposed proteins in human cancer cells that were tightly synchronized in mitosis or interphase. Six hundred and twenty-eight surface and surface-associated proteins in HeLa cells were identified; of these, 27 were significantly enriched at the cell surface in mitosis and 37 in interphase. Using imaging techniques, we confirmed the mitosis-selective cell surface localization of protocadherin PCDH7, a member of a family with anti-adhesive roles in embryos. We show that PCDH7 is required for development of full mitotic rounding pressure at the onset of mitosis. Our analysis provided basic information on how cell cycle progression affects the cell surface. It also provides potential pharmacodynamic biomarkers for anti-mitotic cancer chemotherapy. © 2014 The Authors.

  15. Quantitative comparison of a human cancer cell surface proteome between interphase and mitosis

    PubMed Central

    Özlü, Nurhan; Qureshi, Mohammad H; Toyoda, Yusuke; Renard, Bernhard Y; Mollaoglu, Gürkan; Özkan, Nazlı E; Bulbul, Selda; Poser, Ina; Timm, Wiebke; Hyman, Anthony A; Mitchison, Timothy J; Steen, Judith A

    2015-01-01

    The cell surface is the cellular compartment responsible for communication with the environment. The interior of mammalian cells undergoes dramatic reorganization when cells enter mitosis. These changes are triggered by activation of the CDK1 kinase and have been studied extensively. In contrast, very little is known of the cell surface changes during cell division. We undertook a quantitative proteomic comparison of cell surface-exposed proteins in human cancer cells that were tightly synchronized in mitosis or interphase. Six hundred and twenty-eight surface and surface-associated proteins in HeLa cells were identified; of these, 27 were significantly enriched at the cell surface in mitosis and 37 in interphase. Using imaging techniques, we confirmed the mitosis-selective cell surface localization of protocadherin PCDH7, a member of a family with anti-adhesive roles in embryos. We show that PCDH7 is required for development of full mitotic rounding pressure at the onset of mitosis. Our analysis provided basic information on how cell cycle progression affects the cell surface. It also provides potential pharmacodynamic biomarkers for anti-mitotic cancer chemotherapy. PMID:25476450

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Qibin; Monroe, Matthew E.; Schepmoes, Athena A.

    Non-enzymatic glycation of proteins is implicated in diabetes mellitus and its related complications. In this report, we extend our previous development and refinement of proteomics-based methods for the analysis of non-enzymatically glycated proteins to comprehensively identify glycated proteins in normal and diabetic human plasma and erythrocytes. Using immunodepletion, enrichment, and fractionation strategies, we identified 7749 unique glycated peptides, corresponding to 3742 unique glycated proteins. Semi-quantitative comparisons revealed a number of proteins with glycation levels significantly increased in diabetes relative to control samples and that erythrocyte proteins are more extensively glycated than plasma proteins. A glycation motif analysis revealed amino acids that are favored more than others in the protein primary structures in the vicinity of the glycation sites in both sample types. The glycated peptides and corresponding proteins reported here provide a foundation for the potential identification of novel markers for diabetes, glycemia, or diabetic complications.
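
    A glycation motif analysis of the kind described can be sketched as positional residue counting around the modified residue. The peptide windows below are invented toy data; the study derived its motifs from thousands of real glycation sites.

```python
from collections import Counter

# Toy sketch of a glycation motif analysis: count which residues appear at
# each position flanking the glycated lysine (position 0). Windows invented.

def positional_counts(windows, flank=2):
    """windows: sequences of length 2*flank+1 centred on the glycated residue.
    Returns {relative position: Counter of residues at that position}."""
    counts = {pos: Counter() for pos in range(-flank, flank + 1)}
    for w in windows:
        for i, aa in enumerate(w):
            counts[i - flank][aa] += 1
    return counts

windows = ["AEKGL", "VEKSA", "GDKGL", "AEKTV"]
counts = positional_counts(windows)
print(counts[-1].most_common(1))  # most frequent residue just before the site
```

    Real motif analyses additionally compare these position-specific frequencies against background residue frequencies to decide which enrichments are significant.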

  17. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics

    PubMed Central

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms. PMID:23176545
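
    The calibration-curve part of the assay characterization can be sketched with the common rule of thumb LOD = 3.3·s/slope and LOQ = 10·s/slope, where s is the residual standard deviation of the fitted line (an ICH-style convention; the paper's exact procedure may differ). The calibration data points below are invented for illustration.

```python
import math

# Sketch of calibration-curve characterization: fit a line to spiked amount
# vs response, then estimate LOD/LOQ from residual SD and slope. Data invented.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def lod_loq(xs, ys):
    slope, intercept = fit_line(xs, ys)
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    s = math.sqrt(sum(r * r for r in resid) / (len(xs) - 2))
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical calibration: peptide amount (fmol) vs MRM peak-area ratio.
amounts = [1, 2, 5, 10, 20, 50]
responses = [0.11, 0.19, 0.52, 0.98, 2.05, 4.95]
lod, loq = lod_loq(amounts, responses)
print(f"LOD ~ {lod:.2f} fmol, LOQ ~ {loq:.2f} fmol")
```

    In practice LOD/LOQ estimates for SID-MRM-MS assays are also cross-checked against replicate measurements at low concentration, since the regression-based estimate depends on how well the low end of the curve is sampled.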

  18. Statistical characterization of multiple-reaction monitoring mass spectrometry (MRM-MS) assays for quantitative proteomics.

    PubMed

    Mani, D R; Abbatiello, Susan E; Carr, Steven A

    2012-01-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) with stable isotope dilution (SID) is increasingly becoming a widely accepted assay for the quantification of proteins and peptides. These assays have shown great promise in relatively high throughput verification of candidate biomarkers. While the use of MRM-MS assays is well established in the small molecule realm, their introduction and use in proteomics is relatively recent. As such, statistical and computational methods for the analysis of MRM-MS data from proteins and peptides are still being developed. Based on our extensive experience with analyzing a wide range of SID-MRM-MS data, we set forth a methodology for analysis that encompasses significant aspects ranging from data quality assessment, assay characterization including calibration curves, limits of detection (LOD) and quantification (LOQ), and measurement of intra- and interlaboratory precision. We draw upon publicly available seminal datasets to illustrate our methods and algorithms.

  19. Integration of XNAT/PACS, DICOM, and Research Software for Automated Multi-modal Image Analysis.

    PubMed

    Gao, Yurui; Burns, Scott S; Lauzon, Carolyn B; Fong, Andrew E; James, Terry A; Lubar, Joel F; Thatcher, Robert W; Twillie, David A; Wirt, Michael D; Zola, Marc A; Logan, Bret W; Anderson, Adam W; Landman, Bennett A

    2013-03-29

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.

  20. Integration of XNAT/PACS, DICOM, and research software for automated multi-modal image analysis

    NASA Astrophysics Data System (ADS)

    Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.

    2013-03-01

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software.

  1. Integration of XNAT/PACS, DICOM, and Research Software for Automated Multi-modal Image Analysis

    PubMed Central

    Gao, Yurui; Burns, Scott S.; Lauzon, Carolyn B.; Fong, Andrew E.; James, Terry A.; Lubar, Joel F.; Thatcher, Robert W.; Twillie, David A.; Wirt, Michael D.; Zola, Marc A.; Logan, Bret W.; Anderson, Adam W.; Landman, Bennett A.

    2013-01-01

    Traumatic brain injury (TBI) is an increasingly important public health concern. While there are several promising avenues of intervention, clinical assessments are relatively coarse and comparative quantitative analysis is an emerging field. Imaging data provide potentially useful information for evaluating TBI across functional, structural, and microstructural phenotypes. Integration and management of disparate data types are major obstacles. In a multi-institution collaboration, we are collecting electroencephalography (EEG), structural MRI, diffusion tensor MRI (DTI), and single photon emission computed tomography (SPECT) from a large cohort of US Army service members exposed to mild or moderate TBI who are undergoing experimental treatment. We have constructed a robust informatics backbone for this project centered on the DICOM standard and eXtensible Neuroimaging Archive Toolkit (XNAT) server. Herein, we discuss (1) optimization of data transmission, validation and storage, (2) quality assurance and workflow management, and (3) integration of high performance computing with research software. PMID:24386548

  2. Carbothermic Synthesis of 820 μm UN Kernels: Literature Review, Thermodynamics, Analysis, and Related Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lindemer, Terrence; Voit, Stewart L; Silva, Chinthaka M

    2014-01-01

    The U.S. Department of Energy is considering a new nuclear fuel that would be less susceptible to ruptures during a loss-of-coolant accident. The fuel would consist of tristructural isotropic coated particles with large, dense uranium nitride (UN) kernels. This effort explores many factors involved in using gel-derived uranium oxide-carbon microspheres to make large UN kernels. Analysis of recent studies with sufficient experimental details is provided. Extensive thermodynamic calculations are used to predict carbon monoxide and other pressures for several different reactions that may be involved in conversion of uranium oxides and carbides to UN. Experimentally, the method for making the gel-derived microspheres is described. These were used in a microbalance with an attached mass spectrometer to determine details of carbothermic conversion in argon, nitrogen, or vacuum. A quantitative model is derived from experiments for vacuum conversion to a uranium oxide-carbide kernel.

  3. Application of the Probabilistic Dynamic Synthesis Method to the Analysis of a Realistic Structure

    NASA Technical Reports Server (NTRS)

    Brown, Andrew M.; Ferri, Aldo A.

    1998-01-01

    The Probabilistic Dynamic Synthesis (PDS) method is a new technique for obtaining the statistics of a desired response engineering quantity for a structure with non-deterministic parameters. The method uses measured data from modal testing of the structure as the input random variables, rather than more "primitive" quantities like geometry or material variation. This modal information is much more comprehensive and easily measured than the "primitive" information. The probabilistic analysis is carried out using either response surface reliability methods or Monte Carlo simulation. Previous work verified the feasibility of the PDS method on a simple seven degree-of-freedom spring-mass system. In this paper, the extensive issues involved in applying the method to a realistic three-substructure system are examined, and free and forced response analyses are performed. The results from using the method are promising, especially when the lack of alternatives for obtaining quantitative output for probabilistic structures is considered.

  4. Electrochemical performance investigations on the hydrogen depolarized CO2 concentrator

    NASA Technical Reports Server (NTRS)

    Aylward, J. R.

    1976-01-01

    An extensive investigation of anode and cathode polarization in complete cells and half cells was conducted to determine the factors affecting HDC electrode polarization and the nature of this polarization. Matrix-electrolyte-electrode interactions and cell electrolyte composition were also investigated. The electrodes were found to have normal performance capabilities. The HDC anode polarization characteristics were correlated with a theoretical kinetic analysis; and, except for some quantitative details, a rather complete understanding of the causes for HDC electrode polarization was formulated. One of the important findings resulting from the kinetic analysis was that platinum appears to catalyze the decomposition of carbonic acid to carbon dioxide and water. It was concluded that the abnormal voltage performance of the One Man ARS HDC cells was caused by insufficient cell electrolyte volume under normal operating conditions due to deficiencies in the reservoir-to-cell interfacing.

  5. Experimental Design and Bioinformatics Analysis for the Application of Metagenomics in Environmental Sciences and Biotechnology.

    PubMed

    Ju, Feng; Zhang, Tong

    2015-11-03

    Recent advances in DNA sequencing technologies have prompted the widespread application of metagenomics for the investigation of novel bioresources (e.g., industrial enzymes and bioactive molecules) and unknown biohazards (e.g., pathogens and antibiotic resistance genes) in natural and engineered microbial systems across multiple disciplines. This review discusses the rigorous experimental design and sample preparation in the context of applying metagenomics in environmental sciences and biotechnology. Moreover, this review summarizes the principles, methodologies, and state-of-the-art bioinformatics procedures, tools and database resources for metagenomics applications and discusses two popular strategies (analysis of unassembled reads versus assembled contigs/draft genomes) for quantitative or qualitative insights into microbial community structure and functions. Overall, this review aims to facilitate more extensive application of metagenomics in the investigation of uncultured microorganisms, novel enzymes, microbe-environment interactions, and biohazards in biotechnological applications where microbial communities are engineered for bioenergy production, wastewater treatment, and bioremediation.

  6. Structural and conformational determinants of macrocycle cell permeability.

    PubMed

    Over, Björn; Matsson, Pär; Tyrchan, Christian; Artursson, Per; Doak, Bradley C; Foley, Michael A; Hilgendorf, Constanze; Johnston, Stephen E; Lee, Maurice D; Lewis, Richard J; McCarren, Patrick; Muncipinto, Giovanni; Norinder, Ulf; Perry, Matthew W D; Duvall, Jeremy R; Kihlberg, Jan

    2016-12-01

    Macrocycles are of increasing interest as chemical probes and drugs for intractable targets like protein-protein interactions, but the determinants of their cell permeability and oral absorption are poorly understood. To enable rational design of cell-permeable macrocycles, we generated an extensive data set under consistent experimental conditions for more than 200 non-peptidic, de novo-designed macrocycles from the Broad Institute's diversity-oriented screening collection. This revealed how specific functional groups, substituents and molecular properties impact cell permeability. Analysis of energy-minimized structures for stereo- and regioisomeric sets provided fundamental insight into how dynamic, intramolecular interactions in the 3D conformations of macrocycles may be linked to physicochemical properties and permeability. Combined use of quantitative structure-permeability modeling and the procedure for conformational analysis now, for the first time, provides chemists with a rational approach to design cell-permeable non-peptidic macrocycles with potential for oral absorption.

  7. Sensing the intruder: a quantitative threshold for recognition cues perception in honeybees

    NASA Astrophysics Data System (ADS)

    Cappa, Federico; Bruschini, Claudia; Cipollini, Maria; Pieraccini, Giuseppe; Cervo, Rita

    2014-02-01

    The ability to discriminate between nestmates and non-nestmates is essential to defend social insect colonies from intruders. Over the years, nestmate recognition has been extensively studied in the honeybee Apis mellifera; nevertheless, the quantitative perceptual aspects at the basis of the recognition system represent an unexplored subject in this species. To test the existence of a quantitative perception threshold for cuticular hydrocarbon nestmate recognition cues, we conducted behavioural assays by presenting different amounts of a foreign forager's chemical profile to honeybees at the entrance of their colonies. We found an increase in the explorative and aggressive responses as the amount of cues increased based on a threshold mechanism, highlighting the importance of the quantitative perceptual features for the recognition processes in A. mellifera.

  8. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, which are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA-based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
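    A CPU-only sketch of the permutation-resampling idea that permGPU accelerates. The expression values are hypothetical and a difference-in-means statistic stands in for the package's actual test statistics:

```python
import random

def mean_diff(a, b):
    """Difference in group means, a simple two-group test statistic."""
    return sum(a) / len(a) - sum(b) / len(b)

def permutation_p(a, b, n_perm=10000, seed=0):
    """Two-sided permutation p-value: shuffle group labels and count how
    often the permuted statistic is at least as extreme as the observed one."""
    rng = random.Random(seed)
    observed = abs(mean_diff(a, b))
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if abs(mean_diff(pooled[:len(a)], pooled[len(a):])) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction avoids p = 0

expr_case = [5.1, 5.4, 6.0, 5.8]   # hypothetical expression values, group 1
expr_ctrl = [4.2, 4.0, 4.5, 4.1]   # group 2
p = permutation_p(expr_case, expr_ctrl)
```

    The inner loop is what a GPU implementation parallelises: each permutation is independent, which is why the problem is "embarrassingly parallel".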

  9. Occurrence and prevalence of antibiotic resistance in landfill leachate.

    PubMed

    Wang, Yangqing; Tang, Wei; Qiao, Jing; Song, Liyan

    2015-08-01

    Antibiotic resistance (AR) is extensively present in various environments, posing an emerging threat to public and environmental health. Landfills receive unused and unwanted antibiotics through household waste, as well as AR within waste (e.g., activated sludge and illegal clinical waste), and are therefore expected to serve as an important AR reservoir. In this study, we used culture-dependent methods and quantitative molecular techniques to detect and quantify antibiotic-resistant bacteria (ARB) and antibiotic resistance genes (ARGs) in 12 landfill leachate samples from six geographically different landfills in China. Five tested ARGs (tetO, tetW, bla(TEM), sulI, and sulII) and seven kinds of antibiotic-resistant heterotrophic ARB were extensively detected in all samples, demonstrating their occurrence in landfill. The detected high ratio (10(-2) to 10(-5)) of ARGs to 16S ribosomal RNA (rRNA) gene copies implied that ARGs are prevalent in landfill. Correlation analysis showed that ARGs (tetO, tetW, sulI, and sulII) significantly correlated to ambient bacterial 16S rRNA gene copies, suggesting that the abundance of bacteria in landfill leachate may play an important role in the horizontal spread of ARGs.
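    The ARG-to-16S normalisation and the correlation analysis can be sketched as follows. All counts are hypothetical, and Pearson's r stands in for whatever correlation measure the authors used:

```python
import math

def relative_abundance(arg_copies, rrna_copies):
    """ARG abundance normalised to bacterial 16S rRNA gene copies."""
    return arg_copies / rrna_copies

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((v - mx) ** 2 for v in x))
    sy = math.sqrt(sum((v - my) ** 2 for v in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

# hypothetical qPCR counts (gene copies per mL leachate) for three samples
tetO = [2.4e5, 8.1e5, 3.3e4]
s16  = [3.0e9, 9.1e9, 5.2e8]
ratios = [relative_abundance(a, r) for a, r in zip(tetO, s16)]
r = pearson(tetO, s16)
```

    The ratios in this toy example fall inside the 10^-2 to 10^-5 range reported in the abstract, and the high r illustrates the kind of ARG-to-16S correlation the study describes.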

  10. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, the neocortex, which can be subdivided into six layers based on cell density and cell types, can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
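    A naive, edge-correction-free version of the 3D Ripley's K estimate that RipleyGUI computes can be sketched as follows (this is an illustrative estimator, not the package's actual implementation, which includes edge corrections and inference tools):

```python
import math
from itertools import combinations

def ripley_k_3d(points, r, volume):
    """Naive 3D Ripley's K estimate (no edge correction):
    K(r) = V / (n*(n-1)) * number of ordered point pairs within distance r.
    Under complete spatial randomness, K(r) is approximately (4/3)*pi*r^3."""
    n = len(points)
    pairs = sum(1 for p, q in combinations(points, 2)
                if math.dist(p, q) <= r)
    return volume * 2 * pairs / (n * (n - 1))

# toy pattern: the 8 corners of a unit cube
corners = [(x, y, z) for x in (0.0, 1.0)
                     for y in (0.0, 1.0)
                     for z in (0.0, 1.0)]
k_all = ripley_k_3d(corners, r=2.0, volume=1.0)  # every pair counted -> K = V
```

    Comparing the empirical K(r) against the (4/3)πr³ reference curve is how clustering or regularity of a 3D cell distribution is detected.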

  11. LesionTracker: Extensible Open-Source Zero-Footprint Web Viewer for Cancer Imaging Research and Clinical Trials.

    PubMed

    Urban, Trinity; Ziegler, Erik; Lewis, Rob; Hafey, Chris; Sadow, Cheryl; Van den Abbeele, Annick D; Harris, Gordon J

    2017-11-01

    Oncology clinical trials have become increasingly dependent upon image-based surrogate endpoints for determining patient eligibility and treatment efficacy. As therapeutics have evolved and multiplied in number, the tumor metrics criteria used to characterize therapeutic response have become progressively more varied and complex. The growing intricacies of image-based response evaluation, together with rising expectations for rapid and consistent results reporting, make it difficult for site radiologists to adequately address local and multicenter imaging demands. These challenges demonstrate the need for advanced cancer imaging informatics tools that can help ensure protocol-compliant image evaluation while simultaneously promoting reviewer efficiency. LesionTracker is a quantitative imaging package optimized for oncology clinical trial workflows. The goal of the project is to create an open-source zero-footprint viewer for image analysis that is designed to be extensible as well as capable of being integrated into third-party systems for advanced imaging tools and clinical trials informatics platforms. Cancer Res; 77(21); e119-22. ©2017 American Association for Cancer Research.

  12. A cross-species socio-emotional behaviour development revealed by a multivariate analysis.

    PubMed

    Koshiba, Mamiko; Senoo, Aya; Mimura, Koki; Shirakawa, Yuka; Karino, Genta; Obara, Saya; Ozawa, Shinpei; Sekihara, Hitomi; Fukushima, Yuta; Ueda, Toyotoshi; Kishino, Hirohisa; Tanaka, Toshihisa; Ishibashi, Hidetoshi; Yamanouchi, Hideo; Yui, Kunio; Nakamura, Shun

    2013-01-01

    Recent progress in affective neuroscience and social neurobiology has been propelled by neuro-imaging technology and epigenetic approaches in the neurobiology of animal behaviour. However, quantitative measurement of socio-emotional development remains lacking, though sensory-motor development has been extensively studied in terms of digitised imaging analysis. Here, we developed a method for socio-emotional behaviour measurement based on video recordings under well-defined social contexts, using animal models with varied social sensory interaction during development. The behaviour features digitised from the video recordings were visualised in a multivariate statistical space using principal component analysis. The clustering of the behaviour parameters suggested the existence of species- and stage-specific as well as cross-species behaviour modules. These modules were used to characterise the behaviour of children with or without autism spectrum disorders (ASDs). We found that socio-emotional behaviour is highly dependent on social context and that the cross-species behaviour modules may predict the neurobiological basis of ASDs.
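    The principal component analysis step can be sketched with a stdlib-only power iteration for the dominant component. The "behaviour feature" data below are toy values; the study's actual feature set and PCA implementation are not specified here:

```python
import random

def first_pc(data, iters=200, seed=0):
    """Dominant principal component of mean-centred data via power iteration
    on the sample covariance matrix."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    X = [[row[j] - means[j] for j in range(d)] for row in data]
    # sample covariance matrix (d x d)
    C = [[sum(X[i][a] * X[i][b] for i in range(n)) / (n - 1)
          for b in range(d)] for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() for _ in range(d)]
    for _ in range(iters):
        w = [sum(C[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# toy 'behaviour features' lying mostly along the (1, 1) direction
obs = [[0.0, 0.1], [1.0, 0.9], [2.0, 2.1], [3.0, 2.9], [4.0, 4.1]]
pc1 = first_pc(obs)
```

    Projecting each observation onto the leading components is what produces the low-dimensional "multivariate statistic space" in which behaviour clusters are visualised.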

  13. Biotic and abiotic dynamics of a high solid-state anaerobic digestion box-type container system.

    PubMed

    Walter, Andreas; Probst, Maraike; Hinterberger, Stephan; Müller, Horst; Insam, Heribert

    2016-03-01

    A solid-state anaerobic digestion box-type container system for biomethane production was observed in 12 three-week batch fermentations. Reactor performance was monitored using physico-chemical analysis and the methanogenic community was identified using ANAEROCHIP-microarrays and quantitative PCR. A resilient community was found in all batches, despite variations in inoculum to substrate ratio, feedstock quality, and fluctuating reactor conditions. The consortia were dominated by mixotrophic Methanosarcina that were accompanied by hydrogenotrophic Methanobacterium, Methanoculleus, and Methanocorpusculum. The relationship between biotic and abiotic variables was investigated using bivariate correlation analysis and univariate analysis of variance. High amounts of biogas were produced in batches with high copy numbers of Methanosarcina. High copy numbers of Methanocorpusculum and extensive percolation, however, were found to negatively correlate with biogas production. Supporting these findings, a negative correlation was detected between Methanocorpusculum and Methanosarcina. Based on these results, this study suggests Methanosarcina as an indicator for well-functioning reactor performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Common cause evaluations in applied risk analysis of nuclear power plants. [PWR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taniguchi, T.; Ligon, D.; Stamatelatos, M.

    1983-04-01

    Qualitative and quantitative approaches were developed for the evaluation of common cause failures (CCFs) in nuclear power plants and were applied to the analysis of the auxiliary feedwater systems of several pressurized water reactors (PWRs). Key CCF variables were identified through a survey of experts in the field and a review of failure experience in operating PWRs. These variables were classified into categories of high, medium, and low defense against a CCF. Based on the results, a checklist was developed for analyzing CCFs of systems. Several known techniques for quantifying CCFs were also reviewed. The information provided valuable insights into the development of a new model for estimating CCF probabilities, which is an extension of and improvement over the Beta Factor method. As applied to the analysis of the PWR auxiliary feedwater systems, the method yielded much more realistic values than the original Beta Factor method for a one-out-of-three system.

  15. Quantitative phase and amplitude imaging using Differential-Interference Contrast (DIC) microscopy

    NASA Astrophysics Data System (ADS)

    Preza, Chrysanthe; O'Sullivan, Joseph A.

    2009-02-01

    We present an extension of the development of an alternating minimization (AM) method for the computation of a specimen's complex transmittance function (magnitude and phase) from DIC images. The ability to extract both quantitative phase and amplitude information from two rotationally-diverse DIC images (i.e., acquired by rotating the sample) extends previous efforts in computational DIC microscopy that have focused on quantitative phase imaging only. Simulation results show that the inverse problem at hand is sensitive to noise as well as to the choice of the AM algorithm parameters. The AM framework allows constraints and penalties on the magnitude and phase estimates to be incorporated in a principled manner. Towards this end, Green and De Pierro's "log-cosh" regularization penalty is applied to the magnitude of differences of neighboring values of the complex-valued function of the specimen during the AM iterations. The penalty is shown to be convex in the complex space. A procedure to approximate the penalty within the iterations is presented. In addition, a methodology to pre-compute AM parameters that are optimal with respect to the convergence rate of the AM algorithm is also presented. Both extensions of the AM method are investigated with simulations.
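    A minimal sketch of a log-cosh roughness penalty applied to the magnitude of differences of neighbouring values of a complex-valued estimate, as described above. The exact scaling, weighting, and neighbourhood structure used in the paper are assumptions here; a 1-D neighbourhood stands in for the 2-D image grid:

```python
import cmath
import math

def log_cosh_penalty(u, delta=1.0):
    """Roughness penalty on a 1-D complex-valued estimate:
    sum of log(cosh(|u[i+1] - u[i]| / delta)) over neighbouring pairs.
    Behaves quadratically for small differences and linearly for large
    ones, which is what makes it a convex, edge-tolerant regularizer."""
    return sum(math.log(math.cosh(abs(a - b) / delta))
               for a, b in zip(u, u[1:]))

# slowly varying phase vs. an alternating, discontinuous one
smooth = [cmath.rect(1.0, 0.1 * k) for k in range(10)]
jumpy  = [cmath.rect(1.0, math.pi * (k % 2)) for k in range(10)]
```

    In an AM scheme this term would be added to the data-fit objective, penalising rough estimates (like `jumpy`) far more than smooth ones.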

  16. In Vivo Quantitative Ultrasound Imaging and Scatter Assessments.

    NASA Astrophysics Data System (ADS)

    Lu, Zheng Feng

    There is evidence that "instrument independent" measurements of ultrasonic scattering properties would provide useful diagnostic information that is not available with conventional ultrasound imaging. This dissertation is a continuing effort to test the above hypothesis and to incorporate quantitative ultrasound methods into clinical examinations for early detection of diffuse liver disease. A well-established reference phantom method was employed to construct quantitative ultrasound images of tissue in vivo. The method was verified by extensive phantom tests. A new method was developed to measure the effective attenuation coefficient of the body wall. The method relates the slope of the difference between the echo signal power spectrum from a uniform region distal to the body wall and the echo signal power spectrum from a reference phantom to the body wall attenuation. The accuracy obtained from phantom tests suggests further studies with animal experiments. Clinically, thirty-five healthy subjects and sixteen patients with diffuse liver disease were studied by these quantitative ultrasound methods. The average attenuation coefficient in normals agreed with previous investigators' results; in vivo backscatter coefficients agreed with the results from normals measured by O'Donnell. Strong discriminating power (p < 0.001) was found for both attenuation and backscatter coefficients between fatty livers and normals; a significant difference (p < 0.01) was observed in the backscatter coefficient but not in the attenuation coefficient between cirrhotic livers and normals. An in vivo animal model of steroid hepatopathy was used to investigate the system sensitivity in detecting early changes in canine liver resulting from corticosteroid administration. 
The average attenuation coefficient slope increased from 0.7 dB/cm/MHz in controls to 0.82 dB/cm/MHz (at 6 MHz) in treated animals on day 14 of treatment, and the backscatter coefficient was 26 × 10^-4 cm^-1 sr^-1 in controls compared with 74 × 10^-4 cm^-1 sr^-1 (at 6 MHz) in treated animals. A simplified quantitative approach using video image signals was developed. Results derived both from the r.f. signal analysis and from the video signal analysis are sensitive to the changes in the liver in this animal model.

  17. Quantitative Radiomics System Decoding the Tumor Phenotype | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Our goal is to construct a publicly available computational radiomics system for the objective and automated extraction of quantitative imaging features that we believe will yield biomarkers of greater prognostic value compared with routinely extracted descriptors of tumor size. We will create a generalized, open, portable, and extensible radiomics platform that is widely applicable across cancer types and imaging modalities and describe how we will use lung and head and neck cancers as models to validate our developments.

  18. Three-dimensional quantitative flow diagnostics

    NASA Technical Reports Server (NTRS)

    Miles, Richard B.; Nosenchuck, Daniel M.

    1989-01-01

    The principles, capabilities, and practical implementation of advanced measurement techniques for the quantitative characterization of three-dimensional flows are reviewed. Consideration is given to particle, Rayleigh, and Raman scattering; fluorescence; flow marking by H2 bubbles, photochromism, photodissociation, and vibrationally excited molecules; light-sheet volume imaging; and stereo imaging. Also discussed are stereo schlieren methods, holographic particle imaging, optical tomography, acoustic and magnetic-resonance imaging, and the display of space-filling data. Extensive diagrams, graphs, photographs, sample images, and tables of numerical data are provided.

  19. MM&T: Testing of Electro-Optic Components.

    DTIC Science & Technology

    1981-02-01

    electro-optic components with special emphasis on diamond-turned optics. The primary purpose of that study was to determine where new government initiatives could be most effective in moving this area forward. Besides an ordered list of recommended government actions, this study has resulted in an extensive survey of experts (the most extensive yet made), the largest annotated bibliography in the field, an improved form of Ronchi testing giving quantitative results, a general approach to nonconjugate interferometry, a high-accuracy form of multiple-wavelength absolute

  20. Hyperthyroidism stimulates mitochondrial proton leak and ATP turnover in rat hepatocytes but does not change the overall kinetics of substrate oxidation reactions.

    PubMed

    Harper, M E; Brand, M D

    1994-08-01

    Thyroid hormones have well-known effects on oxidative phosphorylation, but there is little quantitative information on their important sites of action. We have used top-down elasticity analysis, an extension of metabolic control analysis, to identify the sites of action of thyroid hormones on oxidative phosphorylation in rat hepatocytes. We divided the oxidative phosphorylation system into three blocks of reactions: the substrate oxidation subsystem, the phosphorylating subsystem, and the mitochondrial proton leak subsystem and have identified those blocks of reactions whose kinetics are significantly changed by hyperthyroidism. Our results show significant effects on the kinetics of the proton leak and the phosphorylating subsystems. Quantitative analyses revealed that 43% of the increase in resting respiration rate in hyperthyroid hepatocytes compared with euthyroid hepatocytes was due to differences in the proton leak and 59% was due to differences in the activity of the phosphorylating subsystem. There were no significant effects on the substrate oxidation subsystem. Changes in nonmitochondrial oxygen consumption accounted for -2% of the change in respiration rate. Top-down control analysis revealed that the distribution of control over the rates of mitochondrial oxygen consumption, ATP synthesis and consumption, and proton leak and over mitochondrial membrane potential (delta psi m) was similar in hepatocytes from hyperthyroid and littermate-paired euthyroid controls. The results of this study include the first complete top-down elasticity and control analyses of oxidative phosphorylation in hepatocytes from hyperthyroid rats.
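    The reported contributions can be checked as a simple partition of the respiration-rate difference between hyperthyroid and euthyroid hepatocytes. The figures are taken from the abstract; the bookkeeping below is illustrative only:

```python
def partition_check(contributions):
    """In top-down analysis, the percentage contributions of the subsystems
    to a rate difference form an exhaustive partition and should sum to 100."""
    return sum(contributions.values())

# contributions (%) to the increased resting respiration rate, per the abstract
parts = {
    "proton leak": 43,
    "phosphorylating subsystem": 59,
    "substrate oxidation": 0,      # no significant effect reported
    "non-mitochondrial oxygen consumption": -2,
}
total = partition_check(parts)
```

    The -2% entry shows why the other contributions can exceed 100% individually while the partition still closes.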

  1. Refractive index measurements of single, spherical cells using digital holographic microscopy.

    PubMed

    Schürmann, Mirjam; Scholze, Jana; Müller, Paul; Chan, Chii J; Ekpenyong, Andrew E; Chalut, Kevin J; Guck, Jochen

    2015-01-01

    In this chapter, we introduce digital holographic microscopy (DHM) as a marker-free method to determine the refractive index of single, spherical cells in suspension. The refractive index is a conclusive measure in a biological context. Cell conditions, such as differentiation or infection, are known to yield significant changes in the refractive index. Furthermore, the refractive index of biological tissue determines the way it interacts with light. Besides the biological relevance of this interaction in the retina, many methods used in biology, including microscopy, rely on light-tissue or light-cell interactions. Hence, determining the refractive index of cells using DHM is valuable in many biological applications. This chapter covers the main topics that are important for the implementation of DHM: setup, sample preparation, and analysis. First, the optical setup is described in detail including notes and suggestions for the implementation. Following that, a protocol for the sample and measurement preparation is explained. In the analysis section, an algorithm for the determination of quantitative phase maps is described. Subsequently, all intermediate steps for the calculation of the refractive index of suspended cells are presented, exploiting their spherical shape. In the last section, a discussion of possible extensions to the setup, further measurement configurations, and additional analysis methods is given. Throughout this chapter, we describe a simple, robust, and thus easily reproducible implementation of DHM. The different possibilities for extensions show the diverse fields of application for this technique. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Operculum bone of carp (Cyprinus carpio sp.) scaffold is a new potential xenograft material: a preliminary study

    NASA Astrophysics Data System (ADS)

    Kartiwa, A.; Abbas, B.; Pandansari, P.; Prahasta, A.; Nandini, M.; Fadhlillah, M.; Subroto, T.; Panigoro, R.

    2017-02-01

    Orbital floor fracture with extensive bone loss causes herniation of the orbital tissue into the maxillary sinus, so graft implantation should be performed for orbital fractures with extensive bone loss. Different types of grafts have their own characteristics and advantages, and xenografts have been widely studied for use in bone defects. This study investigated Cyprinus carpio opercula bone as a potential xenograft, based on EDS chemical analysis using a ZAF Standardless Method of Quantitative Analysis (Oxide) and SEM examination conducted in the laboratory of Mathematics, Institute of Technology Bandung. In particular, the mass ratio of Ca to P (5.8/3.47) is 1.67, which is equivalent to stoichiometric hydroxyapatite (HA) (Aoki H, 1991, Science and medical applications of hydroxyapatite, Tokyo: Institute for Medical and Engineering, Tokyo Medical and Dental University). The C, N, and O signals indicate protein/amino acid (collagen) compounds, which serve as a matrix together with HA. SEM analysis showed that the matrix consists of porous, oval sheet-shaped structures that interconnect with each other, which makes a good scaffold. The pores comprise large pores (>200 microns) and, between them, smaller pores of 10 microns or less that can serve for the attachment of osteoblast cells. In conclusion, opercula bone of carp (Cyprinus carpio sp.) scaffold could be a new potential xenograft material.
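    For reference, the Ca/P ratios of stoichiometric hydroxyapatite, Ca10(PO4)6(OH)2, can be computed directly from the formula. Note, as a hedged cross-check, that the reported value of 1.67 matches the Ca/P molar ratio of HA; the Ca/P mass ratio of HA works out to about 2.16:

```python
# standard atomic masses (g/mol)
MASS = {"Ca": 40.078, "P": 30.974, "O": 15.999, "H": 1.008}

def hydroxyapatite_ratios():
    """Ca/P molar and mass ratios for stoichiometric hydroxyapatite,
    Ca10(PO4)6(OH)2: 10 Ca atoms and 6 P atoms per formula unit."""
    n_ca, n_p = 10, 6
    molar = n_ca / n_p
    mass = n_ca * MASS["Ca"] / (n_p * MASS["P"])
    return molar, mass

molar, mass = hydroxyapatite_ratios()  # molar ~ 1.667, mass ~ 2.157
```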

  3. Comparison of numerical model simulations and SFO wake vortex windline measurements

    DOT National Transportation Integrated Search

    2003-06-23

    To provide quantitative support for the Simultaneous Offset Instrument Approach (SOIA) procedure, an extensive data collection effort was undertaken at San Francisco International Airport by the Federal Aviation Administration (FAA, U.S. Dept. of Tra...

  4. A CAL-Based Undergraduate Genetics Course.

    ERIC Educational Resources Information Center

    Garbutt, K.; And Others

    1979-01-01

    Describes a second-year undergraduate practical course in quantitative genetics and biometrics, based upon computer-assisted learning (CAL); and discusses the educational benefits of the course, some problems encountered, and some implications of the extensive use of CAL. (Author/CMV)

  5. Analysis of the national school feeding program in the municipality of Viçosa, state of Minas Gerais

    PubMed Central

    Rocha, Naruna Pereira; Filgueiras, Mariana De Santis; de Albuquerque, Fernanda Martins; Milagres, Luana Cupertino; Castro, Ana Paula Pereira; Silva, Mariane Alves; da Costa, Glauce Dias; Priore, Silvia Eloiza; de Novaes, Juliana Farias

    2018-01-01

    OBJECTIVE To analyze the implementation of the Brazilian National School Feeding Program as a food and nutritional security policy in public schools. METHODS This is a cross-sectional study, with a quantitative and qualitative approach, carried out with 268 schoolchildren aged eight to nine years from the public school system of Viçosa, state of Minas Gerais, Brazil, in 2015. Interviews were carried out using semi-structured questionnaires with the children, parents, cooks, nutritionists, trainer of the Technical Assistance and Rural Extension Company, and president of the School Feeding Council. In order to analyze the implementation of the National School Feeding Program in Viçosa, we evaluated the direct weighing of the food served in the schools using mechanical balances with a capacity of up to 10 kg and the perception of the social players involved in the implementation of the National School Feeding Program. The children were questioned about the acceptance of and adherence to the food offered, in addition to the habit of bringing food from home. Parents reported knowledge about the School Feeding Program and Council. The qualitative analysis consisted of content analysis, and the quantitative analysis used the chi-square test, Fisher’s exact test, and Mann-Whitney test. We adopted a statistical significance level of 5% for the quantitative analysis. RESULTS Children reported low adherence to the school feeding program and most of them used to bring food from home. Irregularities were identified in the implementation of the National School Feeding Program, such as: inadequate number of nutritionists, suspension of Council meetings, inadequate infrastructure in the areas of preparation and distribution of meals, lack of training of cooks, lack of nutritional adequacy of the food offered, and lack of actions on food and nutritional education. The Program complied with the recommendations for purchasing food from family farms.
CONCLUSIONS The National School Feeding Program presented many irregularities in Viçosa. It is important to monitor the problems identified for better reformulation and planning of the Program, in order to guarantee the food and nutritional security of the children served. PMID:29489989

  6. Analysis of the national school feeding program in the municipality of Viçosa, state of Minas Gerais.

    PubMed

    Rocha, Naruna Pereira; Filgueiras, Mariana De Santis; Albuquerque, Fernanda Martins de; Milagres, Luana Cupertino; Castro, Ana Paula Pereira; Silva, Mariane Alves; Costa, Glauce Dias da; Priore, Silvia Eloiza; Novaes, Juliana Farias de

    2018-01-01

    OBJECTIVE To analyze the implementation of the Brazilian National School Feeding Program as a food and nutritional security policy in public schools. METHODS This is a cross-sectional study, with a quantitative and qualitative approach, carried out with 268 schoolchildren aged eight to nine years from the public school system of Viçosa, state of Minas Gerais, Brazil, in 2015. Interviews were carried out using semi-structured questionnaires with the children, parents, cooks, nutritionists, trainer of the Technical Assistance and Rural Extension Company, and president of the School Feeding Council. In order to analyze the implementation of the National School Feeding Program in Viçosa, we evaluated the direct weighing of the food served in the schools using mechanical balances with a capacity of up to 10 kg and the perception of the social players involved in the implementation of the National School Feeding Program. The children were questioned about the acceptance of and adherence to the food offered, in addition to the habit of bringing food from home. Parents reported knowledge about the School Feeding Program and Council. The qualitative analysis consisted of content analysis, and the quantitative analysis used the chi-square test, Fisher's exact test, and Mann-Whitney test. We adopted a statistical significance level of 5% for the quantitative analysis. RESULTS Children reported low adherence to the school feeding program and most of them used to bring food from home. Irregularities were identified in the implementation of the National School Feeding Program, such as: inadequate number of nutritionists, suspension of Council meetings, inadequate infrastructure in the areas of preparation and distribution of meals, lack of training of cooks, lack of nutritional adequacy of the food offered, and lack of actions on food and nutritional education. The Program complied with the recommendations for purchasing food from family farms.
CONCLUSIONS The National School Feeding Program presented many irregularities in Viçosa. It is important to monitor the problems identified for better reformulation and planning of the Program, in order to guarantee the food and nutritional security of the children served.

  7. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and validation of protocols for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  8. Cascading failure in scale-free networks with tunable clustering

    NASA Astrophysics Data System (ADS)

    Zhang, Xue-Jun; Gu, Bo; Guan, Xiang-Min; Zhu, Yan-Bo; Lv, Ren-Li

    2016-02-01

    Cascading failure is ubiquitous in many networked infrastructure systems, such as power grids, the Internet, and air transportation systems. In this paper, we extend the cascading failure model to a scale-free network with tunable clustering and focus on the effect of the clustering coefficient on system robustness. It is found that network robustness undergoes a nonmonotonic transition as the clustering coefficient increases: both highly and weakly clustered networks are fragile under intentional attack, while networks with a moderate clustering coefficient better resist the spread of cascades. We then provide an extensive explanation of this phenomenon from a microscopic point of view, supported by quantitative analysis. Our work can be useful for the design and optimization of infrastructure systems.
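
    An experiment of this kind can be sketched with networkx, whose Holme-Kim generator produces scale-free networks with tunable clustering via the triangle-closing probability p. The load model below (degree-based initial load, equal redistribution of a failed node's load to its surviving neighbours) is one common variant assumed for illustration; it is not necessarily the paper's exact model:

```python
import networkx as nx

# Minimal cascade sketch on a clustered scale-free network. Assumed,
# illustrative load model: initial load = degree**theta, capacity =
# (1 + alpha) * load; a failed node's load is split equally among its
# surviving neighbours, and any neighbour pushed over capacity fails too.

def cascade_survivors(G, alpha=0.3, theta=1.0):
    """Fail the highest-degree node, propagate overloads, count survivors."""
    load = {v: G.degree(v) ** theta for v in G}          # initial loads
    capacity = {v: (1 + alpha) * load[v] for v in G}     # tolerance alpha
    seed_node = max(G, key=G.degree)                     # intentional attack
    failed = {seed_node}
    frontier = [seed_node]
    while frontier:
        nxt = []
        for v in frontier:
            alive = [u for u in G[v] if u not in failed]
            if not alive:
                continue
            share = load[v] / len(alive)                 # equal local split
            for u in alive:
                load[u] += share
                if load[u] > capacity[u]:
                    failed.add(u)
                    nxt.append(u)
        frontier = nxt
    return G.number_of_nodes() - len(failed)

# p tunes the clustering coefficient at a (roughly) fixed degree distribution
G = nx.powerlaw_cluster_graph(n=500, m=3, p=0.4, seed=42)
survivors = cascade_survivors(G)
print(survivors)
```

Sweeping p from 0 to 1 and plotting the surviving fraction would reproduce the kind of robustness-versus-clustering curve the paper studies.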

  9. A mixed method pilot study: the researchers' experiences.

    PubMed

    Secomb, Jacinta M; Smith, Colleen

    2011-08-01

    This paper reports on the outcomes of a small, well-designed pilot study. Pilot studies often disseminate limited or statistically meaningless results without adding to the body of knowledge on their comparative research benefits. The design, a pre-test/post-test parallel-group randomised controlled trial combined with inductive content analysis of focus group transcripts, was tested specifically to improve outcomes in a proposed larger study. Strategies are now in place to overcome operational barriers and recruitment difficulties. Links between the qualitative and quantitative arms of the proposed larger study have been made; it is anticipated that this will add depth to the final report. More extensive reporting on the outcomes of pilot studies would assist researchers and increase the body of knowledge in this area.

  10. High energy PIXE: A tool to characterize multi-layer thick samples

    NASA Astrophysics Data System (ADS)

    Subercaze, A.; Koumeir, C.; Métivier, V.; Servagent, N.; Guertin, A.; Haddad, F.

    2018-02-01

    High energy PIXE is a useful and non-destructive tool for characterizing multi-layer thick samples such as cultural heritage objects. In a previous work, we demonstrated the possibility of performing quantitative analysis of simple multi-layer samples using high energy PIXE, without any assumption on their composition. In this work, an in-depth study of the parameters involved in the previously published method is proposed, along with its extension to more complex samples containing a repeated layer. Experiments have been performed at the ARRONAX cyclotron using 68 MeV protons. The thicknesses and sequences of a multi-layer sample including two different layers of the same element have been determined. The performance and limits of this method are presented and discussed.

  11. Hairy Root Cultures of Gymnema sylvestre R. Br. to Produce Gymnemic Acid.

    PubMed

    Rajashekar, J; Kumar, Vadlapudi; Veerashree, V; Poornima, D V; Sannabommaji, Torankumar; Gajula, Hari; Giridhara, B

    2016-01-01

    Gymnema sylvestre R. Br. (Asclepiadaceae) is an endangered species extensively used in the management of diabetes, obesity, and treatment of various diseases. Uncontrolled exploitation to meet the increasing demand and low seed viability hastens the disappearance of the plant from its natural habitat. Hairy root culture provides a suitable alternative for the enhanced production of active principles. The current protocol provides the optimized culture conditions for the establishment of hairy root cultures and elicitation studies and also confirmation of stable integration of A. rhizogenes plasmid T-DNA into host genetic material by PCR and RT-PCR. Furthermore, it also discusses the suitable methods for the extraction procedures, and qualitative and quantitative analysis of gymnemic acid by HPTLC and HPLC.

  12. Analysis of transient fission gas behaviour in oxide fuel using BISON and TRANSURANUS

    NASA Astrophysics Data System (ADS)

    Barani, T.; Bruschi, E.; Pizzocri, D.; Pastore, G.; Van Uffelen, P.; Williamson, R. L.; Luzzi, L.

    2017-04-01

    The modelling of fission gas behaviour is a crucial aspect of nuclear fuel performance analysis in view of the related effects on the thermo-mechanical performance of the fuel rod, which can be particularly significant during transients. In particular, experimental observations indicate that substantial fission gas release (FGR) can occur on a small time scale during transients (burst release). To accurately reproduce the rapid kinetics of the burst release process in fuel performance calculations, a model that accounts for non-diffusional mechanisms such as fuel micro-cracking is needed. In this work, we present and assess a model for transient fission gas behaviour in oxide fuel, which is applied as an extension of conventional diffusion-based models to introduce the burst release effect. The concept and governing equations of the model are presented, and the sensitivity of results to the newly introduced parameters is evaluated through an analytic sensitivity analysis. The model is assessed for application to integral fuel rod analysis by implementation in two structurally different fuel performance codes: BISON (multi-dimensional finite element code) and TRANSURANUS (1.5D code). Model assessment is based on the analysis of 19 light water reactor fuel rod irradiation experiments from the OECD/NEA IFPE (International Fuel Performance Experiments) database, all of which are simulated with both codes. The results point out an improvement in both the quantitative predictions of integral fuel rod FGR and the qualitative representation of the FGR kinetics with the transient model relative to the canonical, purely diffusion-based models of the codes. The overall quantitative improvement of the integral FGR predictions in the two codes is comparable. Moreover, calculated radial profiles of xenon concentration after irradiation are investigated and compared to experimental data, illustrating the underlying representation of the physical mechanisms of burst release.

  13. Multiple Neuropeptide-Coding Genes Involved in Planarian Pharynx Extension.

    PubMed

    Shimoyama, Seira; Inoue, Takeshi; Kashima, Makoto; Agata, Kiyokazu

    2016-06-01

    Planarian feeding behavior involves three steps: moving toward food, extending the pharynx from the planarian's ventral side after arriving at the food, and ingesting the food through the pharynx. Although pharynx extension is a remarkable behavior, it remains unknown which neuronal cell types are involved in its regulation. To identify neurons involved in regulating pharynx extension, we quantitatively analyzed pharynx extension and sought to identify these neurons by RNA interference (RNAi) and in situ hybridization. This assay, when performed using planarians with amputation of various body parts, clearly showed that the head portion is indispensable for inducing pharynx extension. We thus tested the effects of knockdown of brain neurons such as serotonergic, GABAergic, and dopaminergic neurons by RNAi, but did not observe any effects on pharynx extension behavior. However, animals with RNAi of the Prohormone Convertase 2 (PC2, a neuropeptide processing enzyme) gene did not perform the pharynx extension behavior, suggesting the possible involvement of neuropeptide(s) in the regulation of pharynx extension. We screened 24 neuropeptide-coding genes, analyzed their functions by RNAi using the pharynx extension assay system, and identified at least five neuropeptide genes involved in pharynx extension. These were expressed in different cells or neurons, and some of them were expressed in the brain, suggesting complex regulation of planarian feeding behavior by the nervous system.

  14. Members of the Dof transcription factor family in Triticum aestivum are associated with light-mediated gene regulation.

    PubMed

    Shaw, Lindsay M; McIntyre, C Lynne; Gresshoff, Peter M; Xue, Gang-Ping

    2009-11-01

    DNA binding with One Finger (Dof) protein is a plant-specific transcription factor implicated in the regulation of many important plant-specific processes, including photosynthesis and carbohydrate metabolism. This study has identified 31 Dof genes (TaDof) in bread wheat through extensive analysis of current nucleotide databases. Phylogenetic analysis suggests that the TaDof family can be divided into four clades. Expression analysis of the TaDof family across all major organs using quantitative RT-PCR and searches of the wheat genome array database revealed that the majority of TaDof members were predominately expressed in vegetative organs. A large number of TaDof members were down-regulated by drought and/or were responsive to the light and dark cycle. Further expression analysis revealed that light up-regulated TaDof members were highly correlated in expression with a number of genes that are involved in photosynthesis or sucrose transport. These data suggest that the TaDof family may have an important role in light-mediated gene regulation, including involvement in the photosynthetic process.

  15. C. elegans dystroglycan coordinates responsiveness of follower axons to dorsal/ventral and anterior/posterior guidance cues

    PubMed Central

    Johnson, Robert P.; Kramer, James M.

    2012-01-01

    Neural development in metazoans is characterized by the establishment of initial process tracts by pioneer axons and the subsequent extension of follower axons along these pioneer processes. Mechanisms governing the fidelity of follower extension along pioneered routes are largely unknown. In C. elegans, formation of the right angle-shaped lumbar commissure connecting the lumbar and preanal ganglia is an example of pioneer/follower dynamics. We find that the dystroglycan ortholog DGN-1 mediates the fidelity of follower lumbar commissure axon extension along the pioneer axon route. In dgn-1 mutants, the axon of the pioneer PVQ neuron faithfully establishes the lumbar commissure, but axons of follower lumbar neurons, such as PVC, frequently bypass the lumbar commissure and extend along an oblique trajectory directly toward the preanal ganglion. In contrast, disruption of the UNC-6/netrin guidance pathway principally perturbs PVQ ventral guidance to pioneer the lumbar commissure. Loss of DGN-1 in unc-6 mutants has a quantitatively similar effect on follower axon guidance regardless of PVQ axon route, indicating that DGN-1 does not mediate follower/pioneer adhesion. Instead, DGN-1 appears to block premature responsiveness of follower axons to a preanal ganglion-directed guidance cue which mediates ventral-to-anterior reorientation of lumbar commissure axons. Deletion analysis shows that only the most N-terminal DGN-1 domain is required for these activities. These studies suggest that dystroglycan modulation of growth cone responsiveness to conflicting guidance cues is important for restricting follower axon extension to the tracts laid down by pioneers. PMID:22275151

  16. Smartphone-Based Mobile Detection Platform for Molecular Diagnostics and Spatiotemporal Disease Mapping.

    PubMed

    Song, Jinzhao; Pandian, Vikram; Mauk, Michael G; Bau, Haim H; Cherry, Sara; Tisi, Laurence C; Liu, Changchun

    2018-04-03

    Rapid and quantitative molecular diagnostics in the field, at home, and at remote clinics is essential for evidence-based disease management, control, and prevention. Conventional molecular diagnostics requires extensive sample preparation, relatively sophisticated instruments, and trained personnel, restricting its use to centralized laboratories. To overcome these limitations, we designed a simple, inexpensive, hand-held, smartphone-based mobile detection platform, dubbed "smart-connected cup" (SCC), for rapid, connected, and quantitative molecular diagnostics. Our platform combines bioluminescent assay in real-time and loop-mediated isothermal amplification (BART-LAMP) technology with smartphone-based detection, eliminating the need for an excitation source and optical filters that are essential in fluorescent-based detection. The incubation heating for the isothermal amplification is provided, electricity-free, with an exothermic chemical reaction, and incubation temperature is regulated with a phase change material. A custom Android App was developed for bioluminescent signal monitoring and analysis, target quantification, data sharing, and spatiotemporal mapping of disease. SCC's utility is demonstrated by quantitative detection of Zika virus (ZIKV) in urine and saliva and HIV in blood within 45 min. We demonstrate SCC's connectivity for disease spatiotemporal mapping with a custom-designed website. Such a smart- and connected-diagnostic system does not require any lab facilities and is suitable for use at home, in the field, in the clinic, and particularly in resource-limited settings in the context of Internet of Medical Things (IoMT).

  17. Goethite surface reactivity: a macroscopic investigation unifying proton, chromate, carbonate, and lead(II) adsorption.

    PubMed

    Villalobos, Mario; Pérez-Gallegos, Ayax

    2008-10-15

    The goethite surface structure has been extensively studied, but no convincing quantitative description of its highly variable surface reactivity as inversely related to its specific surface area (SSA) has been found. The present study adds experimental evidence and provides a unified macroscopic explanation to this anomalous behavior from differences in average adsorption capacities, and not in average adsorption affinities. We investigated the chromate anion and lead(II) cation adsorption behavior onto three different goethites with SSA varying from 50 to 94 m(2)/g, and analyzed an extensive set of published anion adsorption and proton charging data for variable SSA goethites. Maximum chromate adsorption was found to occupy on average from 3.1 to 9.7 sites/nm(2), inversely related to SSA. Congruency of oxyanion and Pb(II) adsorption behavior based on fractional site occupancy using these values, and a site density analysis suggest that: (i) ion binding occurs to singly and doubly coordinated sites, (ii) proton binding occurs to singly and triply coordinated sites (ranging from 6.2 to 8 total sites/nm(2), in most cases), and (iii) a predominance of (210) and/or (010) faces explains the high reactivity of low SSA goethites. The results imply that the macroscopic goethite adsorption behavior may be predicted without a need to investigate extensive structural details of each specific goethite of interest.

  18. A biomechanical comparison of the Rogers interspinous and the Lovely-Carl tension band wiring techniques for fixation of the cervical spine.

    PubMed

    Brasil, A V; Coehlo, D G; Filho, T E; Braga, F M

    2000-07-01

    The authors conducted a biomechanical study in which they compared the use of the Rogers interspinous and the Lovely-Carl tension band wiring techniques for internal fixation of the cervical spine. An extensive biomechanical evaluation (stiffness in positive and negative rotations around the x, y, and z axes; range of motion in flexion-extension, bilateral axial rotation, and bilateral bending; and neutral zone in flexion-extension, bilateral axial rotation, and lateral bending to the right and to the left) was performed in two groups of intact calf cervical spines. After these initial tests, all specimens were subjected to a distractive flexion Stage 3 ligamentous lesion. Group 1 specimens then underwent surgical fixation by the Rogers technique, and Group 2 specimens underwent surgery by using the Lovely-Carl technique. After fixation, specimens were again submitted to the same biomechanical evaluation. The percentage increase or decrease between the pre- and postoperative parameters was calculated. These values were considered quantitative indicators of the efficacy of the techniques, and the efficacy of the two techniques was compared. Analysis of the findings demonstrated that the Lovely-Carl technique produced less restriction of movement than the Rogers technique without affecting stiffness, thus making the Lovely-Carl technique clinically less useful.

  19. Parameters for Pyrethroid Insecticide QSAR and PBPK/PD Models for Human Risk Assessment

    EPA Science Inventory

    This pyrethroid insecticide parameter review is an extension of our interest in developing quantitative structure–activity relationship–physiologically based pharmacokinetic/pharmacodynamic (QSAR-PBPK/PD) models for assessing health risks, which interest started with the organoph...

  20. Quantitative Morphologic Analysis of Boulder Shape and Surface Texture to Infer Environmental History: A Case Study of Rock Breakdown at the Ephrata Fan, Channeled Scabland, Washington

    NASA Technical Reports Server (NTRS)

    Ehlmann, Bethany L.; Viles, Heather A.; Bourke, Mary C.

    2008-01-01

    Boulder morphology reflects both lithology and climate and is dictated by the combined effects of erosion, transport, and weathering. At present, morphologic information at the boulder scale is underutilized as a recorder of environmental processes, partly because of the lack of a systematic quantitative parameter set for reporting and comparing data sets. We develop such a parameter set, incorporating a range of measures of boulder form and surface texture. We use standard shape metrics measured in the field and fractal and morphometric classification methods borrowed from landscape analysis and applied to laser-scanned molds. The parameter set was pilot tested on three populations of basalt boulders with distinct breakdown histories in the Channeled Scabland, Washington: (1) basalt outcrop talus; (2) flood-transported boulders recently excavated from a quarry; and (3) flood-transported boulders, extensively weathered in situ on the Ephrata Fan surface. Size and shape data were found to distinguish between flood-transported and untransported boulders. Size and edge angles (approximately 120 degrees) of flood-transported boulders suggest removal by preferential fracturing along preexisting columnar joints, and curvature data indicate rounding relative to outcrop boulders. Surface textural data show that boulders which have been exposed at the surface are significantly rougher than those buried by fan sediments. Past signatures diagnostic of flood transport still persist on surface boulders, despite ongoing overprinting by processes in the present breakdown environment through roughening and fracturing in situ. Further use of this quantitative boulder parameter set at other terrestrial and planetary sites will aid in cataloging and understanding morphologic signatures of environmental processes.
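
    For concreteness, two standard clast-form metrics of the sort such a parameter set draws on, Krumbein intercept sphericity and the Zingg ratio classification, can be computed from the three orthogonal axis lengths. The axis values below are hypothetical, and the study's actual parameter set is more extensive:

```python
# Illustrative standard clast-shape metrics of the kind used for boulder
# form studies; axis values are hypothetical, not from the study.

def shape_metrics(a, b, c):
    """a >= b >= c are the long, intermediate, and short axes (same units)."""
    assert a >= b >= c > 0
    ba, cb = b / a, c / b
    sphericity = (b * c / a ** 2) ** (1 / 3)      # Krumbein intercept sphericity
    # Zingg classification: 2/3 cutoffs on the two axis ratios
    if ba >= 2 / 3:
        zingg = "sphere" if cb >= 2 / 3 else "disc"
    else:
        zingg = "rod" if cb >= 2 / 3 else "blade"
    return sphericity, zingg

s, z = shape_metrics(1.2, 1.0, 0.9)   # a rounded, near-equant boulder (metres)
print(z)                               # -> sphere
print(round(s, 3))                     # -> 0.855
```

Tabulating such metrics per boulder population is what allows transported and untransported groups to be compared statistically.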

  1. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    PubMed

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
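
    The central regression behind LRE can be illustrated on synthetic data. The sketch below assumes the sigmoidal amplification model in which per-cycle efficiency declines linearly with fluorescence; the parameter values are invented, and this is not the LRE Analyzer's code:

```python
import numpy as np

# Core of the LRE idea (a sketch, not the LRE Analyzer's code): under a
# sigmoidal amplification model, cycle efficiency E_C falls linearly with
# fluorescence,
#     E_C = Emax * (1 - F/Fmax),
# so linearly regressing E_C against fluorescence recovers Emax (the
# y-intercept) and Fmax (the x-intercept) without a standard curve.

Emax, Fmax, F0 = 0.9, 10.0, 1e-4        # illustrative values
F = [F0]
for _ in range(60):                      # simulate an amplification profile
    F.append(F[-1] * (1 + Emax * (1 - F[-1] / Fmax)))
F = np.array(F)

E = F[1:] / F[:-1] - 1                   # per-cycle efficiency
mid = (F[:-1] > 0.05 * Fmax) & (F[:-1] < 0.95 * Fmax)   # central region
slope, intercept = np.polyfit(F[:-1][mid], E[mid], 1)

print(round(intercept, 3))               # recovered Emax -> 0.9
print(round(-intercept / slope, 3))      # recovered Fmax -> 10.0
```

In the real method the regression is applied to measured SYBR Green I fluorescence readings, and target quantity is then derived from the fitted profile rather than from a standard curve.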

  2. A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification

    PubMed Central

    Rutledge, Robert G.

    2011-01-01

    Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812

  3. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  4. Improving validation methods for molecular diagnostics: application of Bland-Altman, Deming and simple linear regression analyses in assay comparison and evaluation for next-generation sequencing.

    PubMed

    Misyura, Maksym; Sukhai, Mahadeo A; Kulasignam, Vathany; Zhang, Tong; Kamel-Reid, Suzanne; Stockley, Tracy L

    2018-02-01

    A standard approach in test evaluation is to compare results of the assay in validation to results from previously validated methods. For quantitative molecular diagnostic assays, comparison of test values is often performed using simple linear regression and the coefficient of determination (R²), using R² as the primary metric of assay agreement. However, the use of R² alone does not adequately quantify constant or proportional errors required for optimal test evaluation. More extensive statistical approaches, such as Bland-Altman and expanded interpretation of linear regression methods, can be used to more thoroughly compare data from quantitative molecular assays. We present the application of Bland-Altman and linear regression statistical methods to evaluate quantitative outputs from next-generation sequencing assays (NGS). NGS-derived data sets from assay validation experiments were used to demonstrate the utility of the statistical methods. Both Bland-Altman and linear regression were able to detect the presence and magnitude of constant and proportional error in quantitative values of NGS data. Deming linear regression was used in the context of assay comparison studies, while simple linear regression was used to analyse serial dilution data. Bland-Altman statistical approach was also adapted to quantify assay accuracy, including constant and proportional errors, and precision where theoretical and empirical values were known. The complementary application of the statistical methods described in this manuscript enables more extensive evaluation of performance characteristics of quantitative molecular assays, prior to implementation in the clinical molecular laboratory. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
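
    A minimal sketch of the two statistical ingredients on synthetic assay data: a Bland-Altman bias with 95% limits of agreement, and a Deming regression slope (equal-error-variance form) that exposes proportional error. The simulated constant (+0.8) and proportional (×1.05) errors are invented, not the paper's data.

```python
import numpy as np

# Synthetic assay-comparison sketch: Bland-Altman bias/limits of agreement
# plus a Deming regression slope (equal error variances assumed).
rng = np.random.default_rng(1)
ref = rng.uniform(5, 50, 40)                     # previously validated assay
new = 1.05 * ref + 0.8 + rng.normal(0, 0.5, 40)  # assay under validation

# Bland-Altman: mean difference (bias) and 95% limits of agreement
diff = new - ref
bias = diff.mean()
loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))

# Deming slope (lambda = 1): detects proportional error that R^2 alone hides
sxx, syy = np.var(ref, ddof=1), np.var(new, ddof=1)
sxy = np.cov(ref, new, ddof=1)[0, 1]
slope = (syy - sxx + np.sqrt((syy - sxx) ** 2 + 4 * sxy ** 2)) / (2 * sxy)

print(bias, loa, slope)
```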

  5. Geophysical data analysis and visualization using the Grid Analysis and Display System

    NASA Technical Reports Server (NTRS)

    Doty, Brian E.; Kinter, James L., III

    1995-01-01

    Several problems posed by the rapidly growing volume of geophysical data are described, and a selected set of existing solutions to these problems is outlined. A recently developed desktop software tool called the Grid Analysis and Display System (GrADS) is presented. The GrADS user interface is a natural extension of the standard procedures scientists apply to their geophysical data analysis problems. The basic GrADS operations have defaults that naturally map to data analysis actions, and there is a programmable interface for customizing data access and manipulation. The fundamental concept of the GrADS dimension environment, which defines both the space in which the geophysical data reside and the 'slice' of data which is being analyzed at a given time, is explained. The GrADS data storage and access model is described. An argument is made in favor of describable data formats rather than standard data formats. The manner in which GrADS users may perform operations on their data and display the results is also described. It is argued that two-dimensional graphics provides a powerful quantitative data analysis tool whose value is underestimated in the current development environment, which emphasizes three-dimensional structure modeling.
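
    The dimension-environment concept can be mimicked in a few lines of Python (illustrative only; this is not GrADS syntax): fixing some dimensions of a 4-D grid and leaving others varying defines the 'slice' that an operation acts on.

```python
import numpy as np

# Sketch of the "dimension environment" idea: a 4-D grid (time, level, lat,
# lon); fixing some dimensions and leaving others varying defines the slice
# of data that subsequent analysis operations act on.
data = np.arange(4 * 3 * 5 * 6, dtype=float).reshape(4, 3, 5, 6)

env = {"time": 2, "level": 0, "lat": slice(None), "lon": slice(None)}
sl = data[env["time"], env["level"], env["lat"], env["lon"]]  # 2-D lat/lon slice
print(sl.shape)               # (5, 6)

zonal_mean = sl.mean(axis=1)  # average over longitude, one value per latitude
print(zonal_mean.shape)       # (5,)
```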

  6. The Role of Crustal Strength in Controlling Magmatism and Melt Chemistry During Rifting and Breakup

    NASA Astrophysics Data System (ADS)

    Armitage, John J.; Petersen, Kenni D.; Pérez-Gussinyé, Marta

    2018-02-01

    The strength of the crust has a strong impact on the evolution of continental extension and breakup. Strong crust may promote focused narrow rifting, while wide rifting might be due to a weaker crustal architecture. The strength of the crust also influences deeper processes within the asthenosphere. To quantitatively test the implications of crustal strength on the evolution of continental rift zones, we developed a 2-D numerical model of lithosphere extension that can predict the rare Earth element (REE) chemistry of erupted lava. We find that a difference in crustal strength leads to a different rate of depletion in light elements relative to heavy elements. By comparing the model predictions to rock samples from the Basin and Range, USA, we can demonstrate that slow extension of a weak continental crust can explain the observed depletion in melt chemistry. The same comparison for the Main Ethiopian Rift suggests that magmatism within this narrow rift zone can be explained by the localization of strain caused by a strong lower crust. We demonstrate that the slow extension of a strong lower crust above a mantle of potential temperature of 1,350 °C can fit the observed REE trends and the upper mantle seismic velocity for the Main Ethiopian Rift. The thermo-mechanical model implies that melt composition could provide quantitative information on the style of breakup and the initial strength of the continental crust.
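
    The chemistry argument rests on partial-melting systematics; the standard batch-melting relation C_l = C_0 / (D + F(1 - D)) illustrates how light REEs (low bulk partition coefficient D) enrich faster than heavy REEs as the melt fraction F shrinks. The coefficients below are round illustrative values, not the study's.

```python
# Batch-melting sketch: C_liquid = C_0 / (D + F*(1 - D)). The partition
# coefficients for La and Yb below are round illustrative numbers (assumed),
# chosen only to show the La/Yb trend with melt fraction F.
C0 = 1.0                      # source concentration (normalised)
D_la, D_yb = 0.01, 0.30       # bulk partition coefficients (assumed)

for F in (0.01, 0.05, 0.20):  # increasing melt fraction
    la = C0 / (D_la + F * (1 - D_la))
    yb = C0 / (D_yb + F * (1 - D_yb))
    print(F, la / yb)         # La/Yb falls as melt fraction rises
```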

  7. Development of carbon plasma-coated multiwell plates for high-throughput mass spectrometric analysis of highly lipophilic fermentation products.

    PubMed

    Heinig, Uwe; Scholz, Susanne; Dahm, Pia; Grabowy, Udo; Jennewein, Stefan

    2010-08-01

    Classical approaches to strain improvement and metabolic engineering rely on rapid qualitative and quantitative analyses of the metabolites of interest. As an analytical tool, mass spectrometry (MS) has proven to be efficient and nearly universally applicable for timely screening of metabolites. Furthermore, gas chromatography (GC)/MS- and liquid chromatography (LC)/MS-based metabolite screens can often be adapted to high-throughput formats. We recently engineered a Saccharomyces cerevisiae strain to produce taxa-4(5),11(12)-diene, the first pathway-committing biosynthetic intermediate for the anticancer drug Taxol, through the heterologous and homologous expression of several genes related to isoprenoid biosynthesis. To date, GC/MS- and LC/MS-based high-throughput methods have been inherently difficult to adapt to the screening of isoprenoid-producing microbial strains due to the need for extensive sample preparation of these often highly lipophilic compounds. In the current work, we examined different approaches to the high-throughput analysis of taxa-4(5),11(12)-diene biosynthesizing yeast strains in a 96-deep-well format. Carbon plasma coating of standard 96-deep-well polypropylene plates allowed us to circumvent the inherent solvent instability of commonly used deep-well plates. In addition, efficient adsorption of the target isoprenoid product by the coated plates allowed rapid and simple qualitative and quantitative analyses of the individual cultures. Copyright 2010 Elsevier Inc. All rights reserved.

  8. Using Dynamic Contrast Enhanced MRI to Quantitatively Characterize Maternal Vascular Organization in the Primate Placenta

    PubMed Central

    Frias, A.E.; Schabel, M.C.; Roberts, V.H.J.; Tudorica, A.; Grigsby, P.L.; Oh, K.Y.; Kroenke, C. D.

    2015-01-01

    Purpose: The maternal microvasculature of the primate placenta is organized into 10-20 perfusion domains that are functionally optimized to facilitate nutrient exchange to support fetal growth. This study describes a dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) method for identifying vascular domains, and quantifying maternal blood flow in them. Methods: A rhesus macaque on the 133rd day of pregnancy (G133, term=165 days) underwent Doppler ultrasound (US) procedures, DCE-MRI, and Cesarean-section delivery. Serial T1-weighted images acquired throughout intravenous injection of a contrast reagent (CR) bolus were analyzed to obtain CR arrival time maps of the placenta. Results: Watershed segmentation of the arrival time map identified 16 perfusion domains. The number and location of these domains corresponded to anatomical cotyledonary units observed following delivery. Analysis of the CR wave front through each perfusion domain enabled determination of volumetric flow, which ranged from 9.03 to 44.9 mL/sec (25.2 ± 10.3 mL/sec). These estimates are supported by Doppler US results. Conclusions: The DCE-MRI analysis described here provides quantitative estimates of the number of maternal perfusion domains in a primate placenta, and estimates flow within each domain. Anticipated extensions of this technique are to the study of placental function in nonhuman primate models of obstetric complications. PMID:24753177
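
    A hypothetical sketch of the domain-counting step: threshold a synthetic contrast-arrival-time map and label the connected early-arrival regions. Connected-component labeling here is a simple stand-in for the watershed segmentation used in the study, and all values are invented.

```python
import numpy as np
from scipy import ndimage

# Synthetic arrival-time map: late-arrival background with two early-arrival
# "perfusion domains" (Gaussian wells). Thresholding plus labeling counts
# the domains; this stands in for the study's watershed segmentation.
yy, xx = np.mgrid[0:64, 0:64]
arrival = np.full((64, 64), 30.0)                  # background arrival [s]
for cy, cx in [(16, 16), (48, 40)]:                # two domain centres
    arrival -= 25.0 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 40.0)

labels, n_domains = ndimage.label(arrival < 20.0)  # early-arrival regions
print(n_domains)                                   # 2
```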

  9. Effects of biases in domain wall network evolution. II. Quantitative analysis

    NASA Astrophysics Data System (ADS)

    Correia, J. R. C. C. C.; Leite, I. S. C. R.; Martins, C. J. A. P.

    2018-04-01

    Domain walls form at phase transitions which break discrete symmetries. In a cosmological context, they often overclose the Universe (contrary to observational evidence), although one may prevent this by introducing biases or forcing anisotropic evolution of the walls. In a previous work [Correia et al., Phys. Rev. D 90, 023521 (2014), 10.1103/PhysRevD.90.023521], we numerically studied the evolution of various types of biased domain wall networks in the early Universe, confirming that anisotropic networks ultimately reach scaling while those with a biased potential or biased initial conditions decay. We also found that the analytic decay law obtained by Hindmarsh was in good agreement with simulations of biased potentials, but not of biased initial conditions, and suggested that the difference was related to the Gaussian approximation underlying the analytic law. Here, we extend our previous work in several ways. For the cases of biased potential and biased initial conditions, we study in detail the field distributions in the simulations, confirming that the validity (or not) of the Gaussian approximation is the key difference between the two cases. For anisotropic walls, we carry out a more extensive set of numerical simulations and compare them to the canonical velocity-dependent one-scale model for domain walls, finding that the model accurately predicts the linear scaling regime after isotropization. Overall, our analysis provides a quantitative description of the cosmological evolution of these networks.

  10. CellShape: A user-friendly image analysis tool for quantitative visualization of bacterial cell factories inside.

    PubMed

    Goñi-Moreno, Ángel; Kim, Juhyun; de Lorenzo, Víctor

    2017-02-01

    Visualization of the intracellular constituents of individual bacteria while performing as live biocatalysts is in principle doable through more or less sophisticated fluorescence microscopy. Unfortunately, rigorous quantitation of the wealth of data embodied in the resulting images requires bioinformatic tools that are not widely available within the community; moreover, such tools are often subject to licensing that impedes software reuse. In this context we have developed CellShape, a user-friendly platform for image analysis with subpixel precision and a double-threshold segmentation system for quantification of fluorescent signals stemming from single cells. CellShape is entirely coded in Python, a free, open-source programming language with widespread community support. For a developer, CellShape enhances extensibility (ease of software improvements) by acting as an interface to access and use existing Python modules; for an end-user, CellShape presents standalone executable files ready to open without installation. We have adopted this platform to analyse in unprecedented detail the tridimensional distribution of the constituents of the gene expression flow (DNA, RNA polymerase, mRNA and ribosomal proteins) in individual cells of the industrial platform strain Pseudomonas putida KT2440. While the first release of CellShape (v0.8) is readily operational, users and/or developers are enabled to expand the platform further. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
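
    The double-threshold idea can be sketched as hysteresis-style segmentation (an assumed simplification, not CellShape's actual code): keep low-threshold regions only if they contain at least one pixel above the high threshold, which suppresses dim background blobs.

```python
import numpy as np
from scipy import ndimage

# Hysteresis-style double-threshold segmentation sketch: label regions above
# the low threshold, then keep only components containing a high-threshold
# "seed" pixel. (Assumed to be in the spirit of a double-threshold system.)
def double_threshold(img, low, high):
    labels, _ = ndimage.label(img > low)
    keep = np.unique(labels[img > high])     # component ids with a bright seed
    keep = keep[keep != 0]
    return np.isin(labels, keep)

img = np.zeros((20, 20))
img[2:6, 2:6] = 0.9      # bright cell: passes both thresholds
img[12:16, 12:16] = 0.4  # dim blob: passes low but has no high-threshold seed
mask = double_threshold(img, low=0.3, high=0.7)
print(mask.sum())        # 16 -> only the bright 4x4 region survives
```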

  11. Two Algorithms for High-throughput and Multi-parametric Quantification of Drosophila Neuromuscular Junction Morphology.

    PubMed

    Castells-Nobau, Anna; Nijhof, Bonnie; Eidhof, Ilse; Wolf, Louis; Scheffer-de Gooyert, Jolanda M; Monedero, Ignacio; Torroja, Laura; van der Laak, Jeroen A W M; Schenck, Annette

    2017-05-03

    Synaptic morphology is tightly related to synaptic efficacy, and in many cases morphological synapse defects ultimately lead to synaptic malfunction. The Drosophila larval neuromuscular junction (NMJ), a well-established model for glutamatergic synapses, has been extensively studied for decades. Identification of mutations causing NMJ morphological defects revealed a repertoire of genes that regulate synapse development and function. Many of these were identified in large-scale studies that focused on qualitative approaches to detect morphological abnormalities of the Drosophila NMJ. A drawback of qualitative analyses is that many subtle players contributing to NMJ morphology likely remain unnoticed. Whereas quantitative analyses are required to detect the subtler morphological differences, such analyses are not yet commonly performed because they are laborious. This protocol describes in detail two image analysis algorithms "Drosophila NMJ Morphometrics" and "Drosophila NMJ Bouton Morphometrics", available as Fiji-compatible macros, for quantitative, accurate and objective morphometric analysis of the Drosophila NMJ. This methodology is developed to analyze NMJ terminals immunolabeled with the commonly used markers Dlg-1 and Brp. Additionally, its wider application to other markers such as Hrp, Csp and Syt is presented in this protocol. The macros are able to assess nine morphological NMJ features: NMJ area, NMJ perimeter, number of boutons, NMJ length, NMJ longest branch length, number of islands, number of branches, number of branching points and number of active zones in the NMJ terminal.
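
    Two of the nine listed features (NMJ area and perimeter) can be computed from a binary terminal mask as follows; the actual macros run inside Fiji, so this numpy version is only a conceptual sketch on a toy mask.

```python
import numpy as np
from scipy import ndimage

# Conceptual sketch: area = pixel count of the binary mask; perimeter = count
# of mask pixels removed by a single binary erosion (boundary pixels).
mask = np.zeros((12, 12), dtype=bool)
mask[3:9, 3:9] = True                     # a toy 6x6 "terminal"

area = int(mask.sum())                    # pixel count = 36
eroded = ndimage.binary_erosion(mask)
perimeter = int((mask & ~eroded).sum())   # boundary pixels = 20
print(area, perimeter)
```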

  12. Bayesian B-spline mapping for dynamic quantitative traits.

    PubMed

    Xing, Jun; Li, Jiahan; Yang, Runqing; Zhou, Xiaojing; Xu, Shizhong

    2012-04-01

    Owing to their ability and flexibility to describe individual gene expression at different time points, random regression (RR) analyses have become a popular procedure for the genetic analysis of dynamic traits whose phenotypes are collected over time. Specifically, when modelling the dynamic patterns of gene expression in the RR framework, B-splines have proven successful as an alternative to orthogonal polynomials. In the so-called Bayesian B-spline quantitative trait locus (QTL) mapping, B-splines are used to characterize the patterns of QTL effects and individual-specific time-dependent environmental errors over time, and the Bayesian shrinkage estimation method is employed to estimate model parameters. Extensive simulations demonstrate that (1) in terms of statistical power, Bayesian B-spline mapping outperforms interval mapping based on the maximum likelihood; (2) for a simulated dataset with a complicated growth curve generated by B-splines, Legendre polynomial-based Bayesian mapping is not capable of identifying the designed QTLs accurately, even when higher-order Legendre polynomials are considered; and (3) for a simulated dataset generated using Legendre polynomials, Bayesian B-spline mapping can find the same QTLs as those identified by Legendre polynomial analysis. All simulation results support the necessity and flexibility of B-splines in Bayesian mapping of dynamic traits. The proposed method is also applied to a real dataset, where QTLs controlling the growth trajectory of stem diameters in Populus are located.
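
    The B-spline ingredient can be illustrated directly: cubic basis functions over a clamped knot vector, whose weighted sum would model a time-varying QTL effect. The knot vector below is arbitrary (an assumption for illustration), and the partition-of-unity property is printed as a sanity check.

```python
import numpy as np
from scipy.interpolate import BSpline

# Cubic B-spline basis over a clamped knot vector on [0, 1]. Each column of B
# is one basis function evaluated on a time grid; a weighted sum of columns
# would model a smooth time-varying effect.
k = 3                                                       # cubic
t = np.concatenate(([0.0] * (k + 1), [0.25, 0.5, 0.75], [1.0] * (k + 1)))
n_basis = len(t) - k - 1                                    # 7 basis functions

x = np.linspace(0.0, 0.99, 10)                              # time grid
B = np.empty((x.size, n_basis))
for i in range(n_basis):
    c = np.zeros(n_basis)
    c[i] = 1.0                                              # pick out basis i
    B[:, i] = BSpline(t, c, k)(x)

print(B.sum(axis=1))   # partition of unity: each row sums to 1
```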

  13. Qualitative and quantitative analysis of chemical constituents of Ptychopetalum olacoides Benth.

    PubMed

    Tian, Xiao; Guo, Sen; He, Kan; Roller, Marc; Yang, Meiqi; Liu, Qingchao; Zhang, Li; Ho, Chi-Tang; Bai, Naisheng

    2018-02-01

    Ptychopetalum olacoides is a folk medicinal plant marketed for health care, especially in Brazil. Fourteen known compounds were isolated from P. olacoides and their chemical structures were elucidated by extensive spectroscopic data, including 1D NMR, 2D NMR, UV, IR and HR-ESI-MS. The 14 known compounds were identified as N-trans-feruloyl-3,5-dihydroxyindolin-2-one (1), magnoflorine (2), menisperine (3), 4-coumaroylserotonin (4), moschamine (5), luteolin (6), 4'-methoxyluteolin (7), 3-methoxyluteolin (8), 3, 7-dimethoxyluteolin (9), caffeic acid (10), ferulic acid (11), vanillic acid (12), syringic acid (13) and ginsenoside Re (14). To our knowledge, compounds (1-6, 13-14) were isolated from the plant for the first time. Additionally, quantitative analysis results indicated that calibration equations of compounds (1-3, 6, 9, 11-13) exhibited good linear regressions within the test ranges (R² ≥ 0.9990) and that magnoflorine and menisperine were the major constituents in the barks of P. olacoides. The contents of magnoflorine and menisperine accounted for 75.96% of all analytes. The phenolic components, in contrast, were present at lower levels, the highest content being no more than 1.04 mg/g. Collectively, these results suggested that alkaloids are the dominant substances in P. olacoides, which is relevant to the quality control and further use of P. olacoides.

  14. Conservation of the Nucleotide Excision Repair Pathway: Characterization of Hydra Xeroderma Pigmentosum Group F Homolog

    PubMed Central

    Barve, Apurva; Ghaskadbi, Saroj; Ghaskadbi, Surendra

    2013-01-01

    Hydra, one of the earliest metazoans with tissue grade organization and nervous system, is an animal with a remarkable regeneration capacity and shows no signs of organismal aging. We have for the first time identified genes of the nucleotide excision repair (NER) pathway from hydra. Here we report cloning and characterization of hydra homolog of xeroderma pigmentosum group F (XPF) gene that encodes a structure-specific 5′ endonuclease which is a crucial component of NER. In silico analysis shows that hydra XPF amino acid sequence is very similar to its counterparts from other animals, especially vertebrates, and shows all features essential for its function. By in situ hybridization, we show that hydra XPF is expressed prominently in the multipotent stem cell niche in the central region of the body column. Ectoderm of the diploblastic hydra was shown to express higher levels of XPF as compared to the endoderm by semi-quantitative RT-PCR. Semi-quantitative RT-PCR analysis also demonstrated that interstitial cells, a multipotent and rapidly cycling stem cell lineage of hydra, express higher levels of XPF mRNA than other cell types. Our data show that XPF and by extension, the NER pathway is highly conserved during evolution. The prominent expression of an NER gene in interstitial cells may have implications for the lack of senescence in hydra. PMID:23577191

  16. Characterization of T cell repertoire changes in acute Kawasaki disease

    PubMed Central

    1993-01-01

    Kawasaki disease (KD) is an acute multisystem vasculitis of unknown etiology that is associated with marked activation of T cells and monocyte/macrophages. Using a quantitative polymerase chain reaction (PCR) technique, we recently found that the acute phase of KD is associated with the expansion of T cells expressing the V beta 2 and V beta 8.1 gene segments. In the present work, we used a newly developed anti-V beta 2 monoclonal antibody (mAb) and studied a new group of KD patients to extend our previous PCR results. Immunofluorescence analysis confirmed that V beta 2-bearing T cells are selectively increased in patients with acute KD. The increase occurred primarily in the CD4 T cell subset. The percentages of V beta 2+ T cells as determined by mAb reactivity and flow cytometry correlated linearly with V beta expression as quantitated by PCR. However, T cells from acute KD patients appeared to express proportionately higher levels of V beta 2 transcripts per cell as compared with healthy controls or convalescent KD patients. Sequence analysis of T cell receptor beta chain genes of V beta 2 and V beta 8.1 expressing T cells from acute KD patients showed extensive junctional region diversity. These data showing polyclonal expansion of V beta 2+ and V beta 8+ T cells in acute KD provide additional insight into the immunopathogenesis of this disease. PMID:8094737

  17. Application of Monte Carlo cross-validation to identify pathway cross-talk in neonatal sepsis.

    PubMed

    Zhang, Yuxia; Liu, Cui; Wang, Jingna; Li, Xingxia

    2018-03-01

    To explore genetic pathway cross-talk in neonates with sepsis, an integrated approach was used in this paper. Gene expression profiles of normal uninfected neonates and neonates with sepsis were first integrated with biologic signaling pathways to explore the potential relationships between differentially expressed genes (DEGs) and pathways. For each pathway, a score was obtained from the gene expression data by quantitatively analyzing the pathway cross-talk. Paired pathways with high cross-talk were identified by random forest classification. The purpose of the work was to find the pairs of pathways best able to discriminate sepsis samples from normal samples. Ten such pairs of pathways were found, and among them the best two paired pathways were identified according to analysis of the extensive literature. Impact statement: To find the pairs of pathways best able to discriminate sepsis samples from normal samples, an RF classifier, the DS obtained from DEGs of significantly associated paired pathways, and Monte Carlo cross-validation were applied in this paper. Ten pairs of pathways were able to discriminate neonates with sepsis from normal uninfected neonates. Among them, the best two paired pathways ((7) IL-6 Signaling and Phospholipase C Signaling (PLC); (8) Glucocorticoid Receptor (GR) Signaling and Dendritic Cell Maturation) were identified according to analysis of the extensive literature.
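
    Monte Carlo cross-validation itself is easy to sketch: repeatedly split the samples at random into training and test sets and average the test accuracy. The paper used a random forest; a nearest-centroid rule stands in here to keep the example dependency-free, and the two-group data are simulated.

```python
import numpy as np

# Monte Carlo cross-validation sketch: 200 random 45/15 splits of 60 samples,
# scoring a nearest-centroid classifier (a stand-in for the paper's random
# forest) on simulated two-feature "pathway score" data.
rng = np.random.default_rng(2)
n = 60
X = np.vstack([rng.normal(0.0, 1.0, (n // 2, 2)),     # "normal" samples
               rng.normal(2.0, 1.0, (n // 2, 2))])    # "sepsis" samples
y = np.repeat([0, 1], n // 2)

accs = []
for _ in range(200):                                  # repeated random splits
    idx = rng.permutation(n)
    train, test = idx[:45], idx[45:]
    c0 = X[train][y[train] == 0].mean(axis=0)         # class centroids
    c1 = X[train][y[train] == 1].mean(axis=0)
    pred = (np.linalg.norm(X[test] - c1, axis=1)
            < np.linalg.norm(X[test] - c0, axis=1)).astype(int)
    accs.append((pred == y[test]).mean())

print(np.mean(accs))   # well above chance for separable classes
```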

  18. Neural chronometry and coherency across speed-accuracy demands reveal lack of homomorphism between computational and neural mechanisms of evidence accumulation.

    PubMed

    Heitz, Richard P; Schall, Jeffrey D

    2013-10-19

    The stochastic accumulation framework provides a mechanistic, quantitative account of perceptual decision-making and how task performance changes with experimental manipulations. Importantly, it provides an elegant account of the speed-accuracy trade-off (SAT), which has long been the litmus test for decision models, and also mimics the activity of single neurons in several key respects. Recently, we developed a paradigm whereby macaque monkeys trade speed for accuracy on cue during a visual search task. Single-unit activity in frontal eye field (FEF) was not homomorphic with the architecture of models, demonstrating that stochastic accumulators are an incomplete description of neural activity under SAT. This paper summarizes and extends this work, further demonstrating that the SAT leads to extensive, widespread changes in brain activity never before predicted. We will begin by reviewing our recently published work that establishes how spiking activity in FEF accomplishes SAT. Next, we provide two important extensions of this work. First, we report a new chronometric analysis suggesting that increases in perceptual gain with speed stress are evident in FEF synaptic input, implicating afferent sensory-processing sources. Second, we report a new analysis demonstrating selective influence of SAT on frequency coupling between FEF neurons and local field potentials. None of these observations correspond to the mechanics of current accumulator models.

  19. Live-cell imaging of G-actin dynamics using sequential FDAP

    PubMed Central

    Kiuchi, Tai; Nagai, Tomoaki; Ohashi, Kazumasa; Watanabe, Naoki; Mizuno, Kensaku

    2011-01-01

    Various microscopic techniques have been developed to understand the mechanisms that spatiotemporally control actin filament dynamics in live cells. Kinetic data on the processes of actin assembly and disassembly on F-actin have been accumulated. However, the kinetics of cytoplasmic G-actin, a key determinant for actin polymerization, has remained unclear because of a lack of appropriate methods to measure the G-actin concentration quantitatively. We have developed two new microscopic techniques based on the fluorescence decay after photoactivation (FDAP) time-lapse imaging of photoswitchable Dronpa-labeled actin. These techniques, sequential FDAP (s-FDAP) and multipoint FDAP, were used to measure the time-dependent changes in and spatial distribution of the G-actin concentration in live cells. Use of s-FDAP provided data on changes in the G-actin concentration with high temporal resolution; these data were useful for the model analysis of actin assembly processes in live cells. The s-FDAP analysis also provided evidence that the cytoplasmic G-actin concentration substantially decreases after cell stimulation and that the extent of stimulus-induced actin assembly and cell size extension are linearly correlated with the G-actin concentration before cell stimulation. The advantages of using s-FDAP and multipoint FDAP to measure spatiotemporal G-actin dynamics and the roles of G-actin concentration and ADF/cofilin in stimulus-induced actin assembly and lamellipodium extension in live cells are discussed. PMID:22754616
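
    FDAP-style decay curves are commonly summarized by an exponential model F(t) = A·exp(-kt); the sketch below fits a noise-free toy decay by log-linear regression to recover the rate k. This is a conceptual illustration, not the authors' analysis pipeline.

```python
import numpy as np

# Toy fluorescence-decay fit: log-transform F(t) = A*exp(-k*t) so that
# log F = log A - k*t, then recover k from the slope of a linear fit.
k_true, A = 0.5, 100.0
t = np.linspace(0, 10, 50)
F = A * np.exp(-k_true * t)              # noise-free toy decay

slope, intercept = np.polyfit(t, np.log(F), 1)
k_est = -slope
print(k_est)                             # ~0.5
```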

  20. Realist explanatory theory building method for social epidemiology: a protocol for a mixed method multilevel study of neighbourhood context and postnatal depression.

    PubMed

    Eastwood, John G; Jalaludin, Bin B; Kemp, Lynn A

    2014-01-01

    A recent criticism of social epidemiological studies, and multi-level studies in particular has been a paucity of theory. We will present here the protocol for a study that aims to build a theory of the social epidemiology of maternal depression. We use a critical realist approach which is trans-disciplinary, encompassing both quantitative and qualitative traditions, and that assumes both ontological and hierarchical stratification of reality. We describe a critical realist Explanatory Theory Building Method comprising of an: 1) emergent phase, 2) construction phase, and 3) confirmatory phase. A concurrent triangulated mixed method multilevel cross-sectional study design is described. The Emergent Phase uses: interviews, focus groups, exploratory data analysis, exploratory factor analysis, regression, and multilevel Bayesian spatial data analysis to detect and describe phenomena. Abductive and retroductive reasoning will be applied to: categorical principal component analysis, exploratory factor analysis, regression, coding of concepts and categories, constant comparative analysis, drawing of conceptual networks, and situational analysis to generate theoretical concepts. The Theory Construction Phase will include: 1) defining stratified levels; 2) analytic resolution; 3) abductive reasoning; 4) comparative analysis (triangulation); 5) retroduction; 6) postulate and proposition development; 7) comparison and assessment of theories; and 8) conceptual frameworks and model development. The strength of the critical realist methodology described is the extent to which this paradigm is able to support the epistemological, ontological, axiological, methodological and rhetorical positions of both quantitative and qualitative research in the field of social epidemiology. The extensive multilevel Bayesian studies, intensive qualitative studies, latent variable theory, abductive triangulation, and Inference to Best Explanation provide a strong foundation for Theory Construction. The study will contribute to defining the role that realism and mixed methods can play in explaining the social determinants and developmental origins of health and disease.

  1. Cross-system comparisons elucidate disturbance complexities and generalities

    USDA-ARS?s Scientific Manuscript database

    Given that ecological effects of disturbance have been extensively studied in many ecosystems, it is surprising that few quantitative syntheses across diverse ecosystems have been conducted. Building on existing research, we present a conceptual framework and an operational analog to integrate this ...

  2. Characterization of Defects in Composite Material Using Rapidly Acquired Leaky Lamb Wave Dispersion Data

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Y.; Mal, A.; Chang, Z.

    1998-01-01

    The phenomenon of Leaky Lamb Wave (LLW) in composite materials was first observed in 1982 using a Schlieren system. It has been studied extensively by numerous investigators and successfully shown to be an effective quantitative NDE tool.

  3. Distribution of distances between DNA barcode labels in nanochannels close to the persistence length

    NASA Astrophysics Data System (ADS)

    Reinhart, Wesley F.; Reifenberger, Jeff G.; Gupta, Damini; Muralidhar, Abhiram; Sheats, Julian; Cao, Han; Dorfman, Kevin D.

    2015-02-01

    We obtained experimental extension data for barcoded E. coli genomic DNA molecules confined in nanochannels from 40 nm to 51 nm in width. The resulting data set consists of 1,627,779 measurements of the distance between fluorescent probes on 25,407 individual molecules. The probability density for the extension between labels is negatively skewed, and the magnitude of the skewness is relatively insensitive to the distance between labels. The two Odijk theories for DNA confinement bracket the mean extension and its variance, consistent with the scaling arguments underlying the theories. We also find that a harmonic approximation to the free energy, obtained directly from the probability density for the distance between barcode labels, leads to substantial quantitative error in the variance of the extension data. These results suggest that a theory for DNA confinement in such channels must account for the anharmonic nature of the free energy as a function of chain extension.
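
    The skewness statistic used to characterize the extension distribution is straightforward to compute; here it is evaluated on a synthetic negatively skewed sample (a reflected gamma distribution), not the experimental label-to-label distances.

```python
import numpy as np

# Sample skewness of a synthetic left-skewed distribution: a reflected
# gamma(shape=2) sample has negative skewness, mimicking the negatively
# skewed extension distributions reported for confined DNA.
rng = np.random.default_rng(3)
x = -rng.gamma(shape=2.0, scale=1.0, size=100_000)   # left-skewed toy sample

m = x.mean()
skewness = ((x - m) ** 3).mean() / x.std() ** 3      # third standardised moment
print(skewness)                                      # negative
```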

  4. Online model checking approach based parameter estimation to a neuronal fate decision simulation model in Caenorhabditis elegans with hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Koh, Chuan Hock; Miyano, Satoru

    2011-05-01

    Mathematical modeling and simulation studies are playing an increasingly important role in helping researchers elucidate how living organisms function in cells. In systems biology, researchers typically tune many parameters manually to achieve simulation results that are consistent with biological knowledge. This severely limits the size and complexity of simulation models built. In order to break this limitation, we propose a computational framework to automatically estimate kinetic parameters for a given network structure. We utilized an online (on-the-fly) model checking technique (which saves resources compared to the offline approach), with a quantitative modeling and simulation architecture named hybrid functional Petri net with extension (HFPNe). We demonstrate the applicability of this framework by analyzing the underlying neuronal cell fate decision model (ASE fate model) in Caenorhabditis elegans. First, we built a quantitative ASE fate model containing 3327 components emulating nine genetic conditions. Then, using our developed efficient online model checker, MIRACH 1.0, together with parameter estimation, we ran 20 million simulation runs and were able to locate 57 parameter sets for 23 parameters in the model that are consistent with 45 biological rules extracted from published biological articles without much manual intervention. To evaluate the robustness of these 57 parameter sets, we ran another 20 million simulation runs using different magnitudes of noise. Among these models, one proved the most reasonable and robust owing to its high stability against stochastic noise. Our simulation results provide interesting biological findings that could inform future wet-lab experiments.

  5. Comparison of the In Vivo Biotransformation of Two Emerging Estrogenic Contaminants, BP2 and BPS, in Zebrafish Embryos and Adults

    PubMed Central

    Le Fol, Vincent; Brion, François; Hillenweck, Anne; Perdu, Elisabeth; Bruel, Sandrine; Aït-Aïssa, Selim; Cravedi, Jean-Pierre; Zalko, Daniel

    2017-01-01

    Zebrafish embryo assays are increasingly used in the toxicological assessment of endocrine disruptors. Among other advantages, these models are 3R-compliant and are fit for screening purposes. Biotransformation processes are well-recognized as a critical factor influencing toxic response, but major gaps of knowledge exist regarding the characterization of functional metabolic capacities expressed in zebrafish. Comparative metabolic studies between embryos and adults are even scarcer. Using 3H-labeled chemicals, we examined the fate of two estrogenic emerging contaminants, benzophenone-2 (BP2) and bisphenol S (BPS), in 4-day embryos and adult zebrafish. BPS and BP2 were exclusively metabolized through phase II pathways, with no major qualitative difference between larvae and adults except the occurrence of a BP2-di-glucuronide in adults. Quantitatively, the biotransformation of both molecules was more extensive in adults. For BPS, glucuronidation was the predominant pathway in adults and larvae. For BP2, glucuronidation was the major pathway in larvae, but sulfation predominated in adults, with ca. 40% conversion of parent BP2 and an extensive release of several conjugates into water. Further larvae/adults quantitative differences were demonstrated for both molecules, with higher residue concentrations measured in larvae. The study contributes novel data regarding the metabolism of BPS and BP2 in a fish model and shows that phase II conjugation pathways are already functional in 4-dpf-old zebrafish. Comparative analysis of BP2 and BPS metabolic profiles in zebrafish larvae and adults further supports the use of zebrafish embryo as a relevant model in which toxicity and estrogenic activity can be assessed, while taking into account the absorption and fate of tested substances. PMID:28346357

  6. Comparison of the In Vivo Biotransformation of Two Emerging Estrogenic Contaminants, BP2 and BPS, in Zebrafish Embryos and Adults.

    PubMed

    Le Fol, Vincent; Brion, François; Hillenweck, Anne; Perdu, Elisabeth; Bruel, Sandrine; Aït-Aïssa, Selim; Cravedi, Jean-Pierre; Zalko, Daniel

    2017-03-25

    Zebrafish embryo assays are increasingly used in the toxicological assessment of endocrine disruptors. Among other advantages, these models are 3R-compliant and are fit for screening purposes. Biotransformation processes are well-recognized as a critical factor influencing toxic response, but major gaps of knowledge exist regarding the characterization of functional metabolic capacities expressed in zebrafish. Comparative metabolic studies between embryos and adults are even scarcer. Using ³H-labeled chemicals, we examined the fate of two estrogenic emerging contaminants, benzophenone-2 (BP2) and bisphenol S (BPS), in 4-day embryos and adult zebrafish. BPS and BP2 were exclusively metabolized through phase II pathways, with no major qualitative difference between larvae and adults except the occurrence of a BP2-di-glucuronide in adults. Quantitatively, the biotransformation of both molecules was more extensive in adults. For BPS, glucuronidation was the predominant pathway in adults and larvae. For BP2, glucuronidation was the major pathway in larvae, but sulfation predominated in adults, with ca. 40% conversion of parent BP2 and an extensive release of several conjugates into water. Further larvae/adults quantitative differences were demonstrated for both molecules, with higher residue concentrations measured in larvae. The study contributes novel data regarding the metabolism of BPS and BP2 in a fish model and shows that phase II conjugation pathways are already functional in 4-dpf-old zebrafish. Comparative analysis of BP2 and BPS metabolic profiles in zebrafish larvae and adults further supports the use of zebrafish embryo as a relevant model in which toxicity and estrogenic activity can be assessed, while taking into account the absorption and fate of tested substances.

  7. A Quantitative Socio-hydrological Characterization of Water Security in Large-Scale Irrigation Systems

    NASA Astrophysics Data System (ADS)

    Siddiqi, A.; Muhammad, A.; Wescoat, J. L., Jr.

    2017-12-01

    Large-scale, legacy canal systems, such as the irrigation infrastructure in the Indus Basin in Punjab, Pakistan, have been primarily conceived, constructed, and operated with a techno-centric approach. The emerging socio-hydrological approaches provide a new lens for studying such systems to potentially identify fresh insights for addressing contemporary challenges of water security. In this work, using the partial definition of water security as "the reliable availability of an acceptable quantity and quality of water", supply reliability is construed as a partial measure of water security in irrigation systems. A set of metrics is used to quantitatively study reliability of surface supply in the canal systems of Punjab, Pakistan using an extensive dataset of 10-daily surface water deliveries over a decade (2007-2016) and of high frequency (10-minute) flow measurements over one year. The reliability quantification is based on comparison of actual deliveries and entitlements, which are a combination of hydrological and social constructs. The socio-hydrological lens highlights critical issues of how flows are measured, monitored, perceived, and experienced from the perspective of operators (government officials) and users (farmers). The analysis reveals varying levels of reliability (and by extension security) of supply when data is examined across multiple temporal and spatial scales. The results shed new light on the evolution of water security (as partially measured by supply reliability) for surface irrigation in the Punjab province of Pakistan and demonstrate that "information security" (defined as reliable availability of sufficiently detailed data) is vital for enabling water security. It is found that forecasting and management (both social processes) lead to differences between entitlements and actual deliveries, and there is significant potential to positively affect supply reliability through interventions in the social realm.
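A reliability metric of the kind described, comparing actual deliveries against entitlements period by period, can be sketched as follows. All numbers here are synthetic placeholders, not the Punjab dataset; the tolerance band and the 10-daily period structure are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 10-daily entitlements and deliveries for one canal:
# 36 periods per year over 10 years; volumes in arbitrary units.
entitlement = np.full(360, 100.0)
actual = entitlement * rng.normal(loc=0.9, scale=0.15, size=360)

delivery_ratio = actual / entitlement
# One simple reliability score: the fraction of periods in which delivery
# fell within +/-20% of the entitlement.
reliability = float(np.mean(np.abs(delivery_ratio - 1.0) <= 0.20))
print(round(reliability, 2))
```

Varying the aggregation window (10-daily vs. seasonal vs. annual) would change the score, which is the multi-scale effect the abstract highlights.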

  8. Kinematic analysis of upper extremity movement during drinking in hemiplegic subjects.

    PubMed

    Kim, Kyung; Song, Won-Kyung; Lee, Jeongsu; Lee, Hwi-Young; Park, Dae Sung; Ko, Byung-Woo; Kim, Jongbae

    2014-03-01

    It is necessary to analyze the kinematic properties of a paralyzed extremity to quantitatively determine the degree of impairment of hemiplegic people during functional activities of daily living (ADL) such as a drinking task. This study aimed to identify the kinematic differences between 16 hemiplegic and 32 able-bodied participants in relation to the task phases when drinking with a cup and the kinematic strategy used during motion with respect to the gravity direction. The subjects performed a drinking task that was divided into five phases according to Murphy's phase definition: reaching, forward transport, drinking, backward transport, and returning. We found that the groups differed in terms of the movement times and the joint angles and angular velocities of the shoulder, elbow, and wrist joints. Compared to the control group, the hemiplegic participants had a larger shoulder abduction angle of at most 17.1° during all the phases, a larger shoulder flexion angle of 7.6° during the reaching phase, and a smaller shoulder flexion angle of 6.4° during the backward transporting phase. Because of these shoulder joint patterns, a smaller elbow pronation peak angle of at most 13.1° and a larger wrist extension peak angle of 12.0° were found in the motions of the hemiplegic participants, as compensation to complete the drinking task. The movement in the gravity direction during the backward transporting phase resulted in a 15.9% larger peak angular velocity for elbow extension in the hemiplegic participants compared to that of the control group. These quantitative kinematic patterns help provide an understanding of the movements of an affected extremity and can be useful in designing rehabilitation robots to assist hemiplegic people with ADL. Copyright © 2013 Elsevier Ltd. All rights reserved.
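Joint angles of the kind reported (shoulder, elbow, wrist) are conventionally computed from 3-D marker positions as the angle at the middle joint. A minimal sketch, with marker coordinates invented purely for illustration:

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by markers a-b-c, e.g. shoulder-elbow-wrist."""
    u, v = a - b, c - b
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Hypothetical marker positions (metres): a fully extended arm, then a
# flexed one with the forearm hanging vertically.
shoulder  = np.array([0.0, 0.0, 1.4])
elbow     = np.array([0.0, 0.3, 1.4])
wrist_ext = np.array([0.0, 0.6, 1.4])   # straight arm
wrist_flx = np.array([0.0, 0.3, 1.1])   # forearm vertical

print(joint_angle(shoulder, elbow, wrist_ext))  # -> 180.0
print(joint_angle(shoulder, elbow, wrist_flx))  # -> 90.0
```

Differentiating such angle time series (finite differences over the motion-capture sampling interval) yields the angular velocities compared across groups in the study.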

  9. Immune clearance of highly pathogenic SIV infection.

    PubMed

    Hansen, Scott G; Piatak, Michael; Ventura, Abigail B; Hughes, Colette M; Gilbride, Roxanne M; Ford, Julia C; Oswald, Kelli; Shoemaker, Rebecca; Li, Yuan; Lewis, Matthew S; Gilliam, Awbrey N; Xu, Guangwu; Whizin, Nathan; Burwitz, Benjamin J; Planer, Shannon L; Turner, John M; Legasse, Alfred W; Axthelm, Michael K; Nelson, Jay A; Früh, Klaus; Sacha, Jonah B; Estes, Jacob D; Keele, Brandon F; Edlefsen, Paul T; Lifson, Jeffrey D; Picker, Louis J

    2013-10-03

    Established infections with the human and simian immunodeficiency viruses (HIV and SIV, respectively) are thought to be permanent with even the most effective immune responses and antiretroviral therapies only able to control, but not clear, these infections. Whether the residual virus that maintains these infections is vulnerable to clearance is a question of central importance to the future management of millions of HIV-infected individuals. We recently reported that approximately 50% of rhesus macaques (RM; Macaca mulatta) vaccinated with SIV protein-expressing rhesus cytomegalovirus (RhCMV/SIV) vectors manifest durable, aviraemic control of infection with the highly pathogenic strain SIVmac239 (ref. 5). Here we show that regardless of the route of challenge, RhCMV/SIV vector-elicited immune responses control SIVmac239 after demonstrable lymphatic and haematogenous viral dissemination, and that replication-competent SIV persists in several sites for weeks to months. Over time, however, protected RM lost signs of SIV infection, showing a consistent lack of measurable plasma- or tissue-associated virus using ultrasensitive assays, and a loss of T-cell reactivity to SIV determinants not in the vaccine. Extensive ultrasensitive quantitative PCR and quantitative PCR with reverse transcription analyses of tissues from RhCMV/SIV vector-protected RM necropsied 69-172 weeks after challenge did not detect SIV RNA or DNA sequences above background levels, and replication-competent SIV was not detected in these RM by extensive co-culture analysis of tissues or by adoptive transfer of 60 million haematolymphoid cells to naive RM. These data provide compelling evidence for progressive clearance of a pathogenic lentiviral infection, and suggest that some lentiviral reservoirs may be susceptible to the continuous effector memory T-cell-mediated immune surveillance elicited and maintained by cytomegalovirus vectors.

  10. Print News Coverage of School-Based HPV Vaccine Mandate

    PubMed Central

    Casciotti, Dana; Smith, Katherine C.; Andon, Lindsay; Vernick, Jon; Tsui, Amy; Klassen, Ann C.

    2015-01-01

    BACKGROUND In 2007, legislation was proposed in 24 states and the District of Columbia for school-based HPV vaccine mandates, and mandates were enacted in Texas, Virginia, and the District of Columbia. Media coverage of these events was extensive, and media messages both reflected and contributed to controversy surrounding these legislative activities. Messages communicated through the media are an important influence on adolescent and parent understanding of school-based vaccine mandates. METHODS We conducted structured text analysis of newspaper coverage, including quantitative analysis of 169 articles published in mandate jurisdictions from 2005-2009, and qualitative analysis of 63 articles from 2007. Our structured analysis identified topics, key stakeholders and sources, tone, and the presence of conflict. Qualitative thematic analysis identified key messages and issues. RESULTS Media coverage was often incomplete, providing little context about cervical cancer or screening. Skepticism and autonomy concerns were common. Messages reflected conflict and distrust of government activities, which could negatively impact this and other youth-focused public health initiatives. CONCLUSIONS If school health professionals are aware of the potential issues raised in media coverage of school-based health mandates, they will be more able to convey appropriate health education messages, and promote informed decision-making by parents and students. PMID:25099421

  11. Event time analysis of longitudinal neuroimage data.

    PubMed

    Sabuncu, Mert R; Bernal-Rusiel, Jorge L; Reuter, Martin; Greve, Douglas N; Fischl, Bruce

    2014-08-15

    This paper presents a method for the statistical analysis of the associations between longitudinal neuroimaging measurements, e.g., of cortical thickness, and the timing of a clinical event of interest, e.g., disease onset. The proposed approach consists of two steps, the first of which employs a linear mixed effects (LME) model to capture temporal variation in serial imaging data. The second step utilizes the extended Cox regression model to examine the relationship between time-dependent imaging measurements and the timing of the event of interest. We demonstrate the proposed method both for the univariate analysis of image-derived biomarkers, e.g., the volume of a structure of interest, and the exploratory mass-univariate analysis of measurements contained in maps, such as cortical thickness and gray matter density. The mass-univariate method employs a recently developed spatial extension of the LME model. We applied our method to analyze structural measurements computed using FreeSurfer, a widely used brain Magnetic Resonance Image (MRI) analysis software package. We provide a quantitative and objective empirical evaluation of the statistical performance of the proposed method on longitudinal data from subjects suffering from Mild Cognitive Impairment (MCI) at baseline. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. MEM spectral analysis for predicting influenza epidemics in Japan.

    PubMed

    Sumi, Ayako; Kamo, Ken-ichi

    2012-03-01

    The prediction of influenza epidemics has long been the focus of attention in epidemiology and mathematical biology. In this study, we tested whether time series analysis was useful for predicting the incidence of influenza in Japan. The method of time series analysis we used consists of spectral analysis based on the maximum entropy method (MEM) in the frequency domain and the nonlinear least squares method in the time domain. Using this time series analysis, we analyzed the incidence data of influenza in Japan from January 1948 to December 1998; these data are unique in that they covered the periods of pandemics in Japan in 1957, 1968, and 1977. On the basis of the MEM spectral analysis, we identified the periodic modes explaining the underlying variations of the incidence data. The optimum least squares fitting (LSF) curve calculated with the periodic modes reproduced the underlying variation of the incidence data. An extension of the LSF curve could be used to predict the incidence of influenza quantitatively. Our study suggested that MEM spectral analysis would allow us to model temporal variations of influenza epidemics with multiple periodic modes much more effectively than by using the method of conventional time series analysis, which has been used previously to investigate the behavior of temporal variations in influenza data.
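The MEM spectral step is beyond a short sketch, but the second stage, least squares fitting (LSF) of the identified periodic modes and extension of the fitted curve for prediction, can be shown directly. The sketch below assumes the dominant period (here 12 months) has already been identified, and uses a synthetic incidence-like series rather than the Japanese surveillance data:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(120)                      # months of synthetic incidence data
# Synthetic series: one annual periodic mode plus noise, standing in for
# the underlying variation of the incidence data.
y = 50.0 + 20.0 * np.sin(2 * np.pi * t / 12.0) + rng.normal(0.0, 2.0, t.size)

def design(t, periods):
    """Design matrix: a constant column plus sin/cos pairs per period."""
    cols = [np.ones(len(t))]
    for p in periods:
        cols += [np.sin(2 * np.pi * t / p), np.cos(2 * np.pi * t / p)]
    return np.column_stack(cols)

# Least squares fit of the periodic mode(s) to the observed series.
A = design(t, [12.0])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
fit = A @ coef

# Extending the LSF curve beyond the data gives a quantitative prediction.
t_future = np.arange(120, 132)
pred = design(t_future, [12.0]) @ coef
```

With several periodic modes (as the MEM analysis of the 1948-1998 data identifies), additional sin/cos column pairs are appended to the design matrix in the same way.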

  13. Assessing Classroom Assessment Techniques

    ERIC Educational Resources Information Center

    Simpson-Beck, Victoria

    2011-01-01

    Classroom assessment techniques (CATs) are teaching strategies that provide formative assessments of student learning. It has been argued that the use of CATs enhances and improves student learning. Although the various types of CATs have been extensively documented and qualitatively studied, there appears to be little quantitative research…

  14. Small- and Large-Effect Quantitative Trait Locus Interactions Underlie Variation in Yeast Sporulation Efficiency

    PubMed Central

    Lorenz, Kim; Cohen, Barak A.

    2012-01-01

    Quantitative trait loci (QTL) with small effects on phenotypic variation can be difficult to detect and analyze. Because of this a large fraction of the genetic architecture of many complex traits is not well understood. Here we use sporulation efficiency in Saccharomyces cerevisiae as a model complex trait to identify and study small-effect QTL. In crosses where the large-effect quantitative trait nucleotides (QTN) have been genetically fixed we identify small-effect QTL that explain approximately half of the remaining variation not explained by the major effects. We find that small-effect QTL are often physically linked to large-effect QTL and that there are extensive genetic interactions between small- and large-effect QTL. A more complete understanding of quantitative traits will require a better understanding of the numbers, effect sizes, and genetic interactions of small-effect QTL. PMID:22942125
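The notion of a locus "explaining" a fraction of phenotypic variation can be illustrated with a one-locus variance decomposition. The genotypes, effect size, and noise level below are invented for the sketch and are not the yeast cross data:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000
# Hypothetical cross: each segregant carries allele 0 or 1 at one
# small-effect locus; phenotype = sporulation efficiency with a small
# additive effect plus environmental noise.
genotype = rng.integers(0, 2, n)
phenotype = 0.5 + 0.05 * genotype + rng.normal(0.0, 0.1, n)

# Fraction of phenotypic variance explained by the locus: compare the
# within-genotype-class variance with the overall variance (one-locus R^2).
overall = phenotype.var()
within = np.mean([phenotype[genotype == g].var() for g in (0, 1)])
r2 = 1.0 - within / overall
print(round(float(r2), 3))
```

An R^2 of a few percent, as here, is exactly the regime where detection requires large panels or fixing the major-effect loci first, as the authors did.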

  15. 75 FR 54117 - Building Energy Standards Program: Preliminary Determination Regarding Energy Efficiency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-03

    ... Response to Comments on Previous Analysis C. Summary of the Comparative Analysis 1. Quantitative Analysis 2... preliminary quantitative analysis are specific building designs, in most cases with specific spaces defined... preliminary determination. C. Summary of the Comparative Analysis DOE carried out both a broad quantitative...

  16. Mass cytometry: a highly multiplexed single-cell technology for advancing drug development.

    PubMed

    Atkuri, Kondala R; Stevens, Jeffrey C; Neubert, Hendrik

    2015-02-01

    Advanced single-cell analysis technologies (e.g., mass cytometry) that help in multiplexing cellular measurements in limited-volume primary samples are critical in bridging discovery efforts to successful drug approval. Mass cytometry is the state-of-the-art technology in multiparametric single-cell analysis. Mass cytometers (also known as cytometry by time-of-flight or CyTOF) combine the cellular analysis principles of traditional fluorescence-based flow cytometry with the selectivity and quantitative power of inductively coupled plasma-mass spectrometry. Standard flow cytometry is limited in the number of parameters that can be measured owing to the overlap in signal when detecting fluorescently labeled antibodies. Mass cytometry uses antibodies tagged to stable isotopes of rare earth metals, which requires minimal signal compensation between the different metal tags. This unique feature enables researchers to seamlessly multiplex up to 40 independent measurements on single cells. We first describe the principles of mass cytometry and compare it with traditional flow cytometry. We then discuss the emerging and potential applications of CyTOF technology in the pharmaceutical industry, including quantitative and qualitative deep profiling of immune cells and their applications in assessing drug immunogenicity, extensive mapping of signaling networks in single cells, cell surface receptor quantification and multiplexed internalization kinetics, multiplexing sample analysis by barcoding, and establishing cell ontologies on the basis of phenotype and/or function. We end with a discussion of the anticipated impact of this technology on the drug development lifecycle with special emphasis on the utility of mass cytometry in deciphering a drug's pharmacokinetics and pharmacodynamics relationship. Copyright © 2014 by The American Society for Pharmacology and Experimental Therapeutics.

  17. Quantitative Proteomic and Microarray Analysis of the Archaeon Methanosarcina Acetivorans Grown with Acetate Versus Methanol*

    PubMed Central

    Li, Lingyun; Li, Qingbo; Rohlin, Lars; Kim, UnMi; Salmon, Kirsty; Rejtar, Tomas; Gunsalus, Robert P.; Karger, Barry L.; Ferry, James G.

    2008-01-01

    Methanosarcina acetivorans strain C2A is an acetate- and methanol-utilizing methane-producing organism for which the genome, the largest yet sequenced among the Archaea, reveals extensive physiological diversity. LC linear ion trap-FTICR mass spectrometry was employed to analyze acetate- vs. methanol-grown cells metabolically labeled with ¹⁴N vs. ¹⁵N, respectively, to obtain quantitative protein abundance ratios. DNA microarray analysis of acetate- vs. methanol-grown cells was also performed to determine gene expression ratios. The combined approaches were highly complementary, extending the physiological understanding of growth and methanogenesis. Of the 1081 proteins detected, 255 were ≥ 3-fold differentially abundant. DNA microarray analysis revealed 410 genes that were ≥ 2.5-fold differentially expressed out of 1972 genes with detected expression. The ratios of differentially abundant proteins were in good agreement with expression ratios of the encoding genes. Taken together, the results suggest several novel roles for electron transport components specific to acetate-grown cells, including two flavodoxins each specific for growth on acetate or methanol. Protein abundance ratios indicated that duplicate CO dehydrogenase/acetyl-CoA complexes function in the conversion of acetate to methane. Surprisingly, the protein abundance and gene expression ratios indicated a general stress response in acetate- vs. methanol-grown cells that included enzymes specific for polyphosphate accumulation and oxidative stress. The microarray analysis identified transcripts of several genes encoding regulatory proteins with identity to the PhoU, MarR, GlnK, and TetR families commonly found in the Bacteria domain. An analysis of neighboring genes suggested roles in controlling phosphate metabolism (PhoU), ammonia assimilation (GlnK), and molybdopterin cofactor biosynthesis (TetR). 
Finally, the proteomic and microarray results suggested roles for two-component regulatory systems specific for each growth substrate. PMID:17269732

  18. Semblance analysis to assess GPR data from a five-year forensic study of simulated clandestine graves

    NASA Astrophysics Data System (ADS)

    Booth, Adam D.; Pringle, Jamie K.

    2016-02-01

    Ground penetrating radar (GPR) surveys have proven useful for locating clandestine graves in a number of forensic searches. There has been extensive research into the geophysical monitoring of simulated clandestine graves in different burial scenarios and ground conditions. Whilst these studies have been used to suggest optimum dominant radar frequencies, the data themselves have not been quantitatively analysed to date. This study uses a common-offset configuration of semblance analysis, both to characterise velocity trends from GPR diffraction hyperbolae and, since the magnitude of a semblance response is proportional to signal-to-noise ratio, to quantify the strength of a forensic GPR response. 2D GPR profiles were acquired over a simulated clandestine burial, with a wrapped-pig cadaver monitored at three-month intervals between 2008 and 2013 with GPR antennas of three different centre-frequencies (110, 225 and 450 MHz). The GPR response to the cadaver was a strong diffraction hyperbola. Results show, in contrast to resistivity surveys, that semblance analysis has little sensitivity to changes attributable to decomposition, and only a subtle influence from seasonality: velocity increases (0.01-0.02 m/ns) were observed in summer, associated with a decrease (5-10%) in peak semblance magnitude, SM, and potentially in the reflectivity of the cadaver. The lowest-frequency antennas consistently gave the highest signal-to-noise ratio although the grave was nonetheless detectable by all frequencies trialled. These observations suggest that forensic GPR surveys could be undertaken with little seasonal hindrance. Whilst GPR analysis cannot currently provide a quantitative diagnostic proxy for time-since-burial, the consistency of responses suggests that graves will remain detectable beyond the five years shown here.
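The semblance measure itself is compact: for trace amplitudes a_i sampled along a trial hyperbola, S = (Σ a_i)² / (N Σ a_i²), which equals 1 for perfectly coherent traces and falls toward 1/N for incoherent ones. A toy velocity scan over a synthetic diffraction hyperbola, with all acquisition parameters invented (this is not the paper's processing chain, just the standard semblance formula):

```python
import numpy as np

dt = 0.2                                  # sample interval (ns)
t = np.arange(0.0, 40.0, dt)              # two-way time axis (ns)
x = np.linspace(-1.0, 1.0, 21)            # antenna offsets from apex (m)
v_true, t0 = 0.1, 10.0                    # velocity (m/ns), apex time (ns)

# Synthetic diffraction hyperbola: a Gaussian wavelet on each trace at
# t_i = sqrt(t0^2 + (2 x_i / v)^2).
ti = np.sqrt(t0**2 + (2.0 * x / v_true)**2)
traces = np.exp(-((t[None, :] - ti[:, None]) / 1.0)**2)

def semblance(v):
    """Semblance S = (sum a_i)^2 / (N sum a_i^2) along a trial hyperbola."""
    tpred = np.sqrt(t0**2 + (2.0 * x / v)**2)
    a = np.array([np.interp(tp, t, tr) for tp, tr in zip(tpred, traces)])
    return float(a.sum()**2 / (a.size * (a**2).sum()))

vels = np.linspace(0.06, 0.16, 101)
s = np.array([semblance(v) for v in vels])
v_best = vels[np.argmax(s)]
```

The velocity maximising semblance recovers the true medium velocity, and the peak semblance value is the signal-strength proxy the study tracks through time.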

  19. Agricultural science in the wild: a social network analysis of farmer knowledge exchange.

    PubMed

    Wood, Brennon A; Blair, Hugh T; Gray, David I; Kemp, Peter D; Kenyon, Paul R; Morris, Steve T; Sewell, Alison M

    2014-01-01

    Responding to demands for transformed farming practices requires new forms of knowledge. Given their scale and complexity, agricultural problems can no longer be solved by linear transfers in which technology developed by specialists passes to farmers by way of extension intermediaries. Recent research on alternative approaches has focused on the innovation systems formed by interactions between heterogeneous actors. Rather than linear transfer, systems theory highlights network facilitation as a specialized function. This paper contributes to our understanding of such facilitation by investigating the networks in which farmers discuss science. We report findings based on the study of a pastoral farming experiment collaboratively undertaken by a group of 17 farmers and five scientists. Analysis of prior contact and alter sharing between the group's members indicates strongly tied and decentralized networks. Farmer knowledge exchanges about the experiment have been investigated using a mix of quantitative and qualitative methods. Network surveys identified who the farmers contacted for knowledge before the study began and who they had talked to about the experiment by 18 months later. Open-ended interviews collected farmer statements about their most valuable contacts and these statements have been thematically analysed. The network analysis shows that farmers talked about the experiment with 192 people, most of whom were fellow farmers. Farmers with densely tied and occupationally homogeneous contacts grew their networks more than did farmers with contacts that are loosely tied and diverse. Thematic analysis reveals three general principles: farmers value knowledge delivered by persons rather than roles, privilege farming experience, and develop knowledge with empiricist rather than rationalist techniques. 
Taken together, these findings suggest that farmers deliberate about science in intensive and durable networks that have significant implications for theorizing agricultural innovation. The paper thus concludes by considering the findings' significance for current efforts to rethink agricultural extension.

  20. Discovery of Colorectal Cancer Biomarker Candidates by Membrane Proteomic Analysis and Subsequent Verification using Selected Reaction Monitoring (SRM) and Tissue Microarray (TMA) Analysis*

    PubMed Central

    Kume, Hideaki; Muraoka, Satoshi; Kuga, Takahisa; Adachi, Jun; Narumi, Ryohei; Watanabe, Shio; Kuwano, Masayoshi; Kodera, Yoshio; Matsushita, Kazuyuki; Fukuoka, Junya; Masuda, Takeshi; Ishihama, Yasushi; Matsubara, Hisahiro; Nomura, Fumio; Tomonaga, Takeshi

    2014-01-01

    Recent advances in quantitative proteomic technology have enabled the large-scale validation of biomarkers. We here performed a quantitative proteomic analysis of membrane fractions from colorectal cancer tissue to discover biomarker candidates, and then extensively validated the candidate proteins identified. A total of 5566 proteins were identified in six tissue samples, each of which was obtained from polyps and cancer with and without metastasis. GO cellular component analysis predicted that 3087 of these proteins were membrane proteins, whereas TMHMM algorithm predicted that 1567 proteins had a transmembrane domain. Differences were observed in the expression of 159 membrane proteins and 55 extracellular proteins between polyps and cancer without metastasis, while the expression of 32 membrane proteins and 17 extracellular proteins differed between cancer with and without metastasis. A total of 105 of these biomarker candidates were quantitated using selected (or multiple) reaction monitoring (SRM/MRM) with stable synthetic isotope-labeled peptides as an internal control. The results obtained revealed differences in the expression of 69 of these proteins, and this was subsequently verified in an independent set of patient samples (polyps (n = 10), cancer without metastasis (n = 10), cancer with metastasis (n = 10)). Significant differences were observed in the expression of 44 of these proteins, including ITGA5, GPRC5A, PDGFRB, and TFRC, which have already been shown to be overexpressed in colorectal cancer, as well as proteins with unknown function, such as C8orf55. The expression of C8orf55 was also shown to be high not only in colorectal cancer, but also in several cancer tissues using a multicancer tissue microarray, which included 1150 cores from 14 cancer tissues. 
This is the largest verification study of biomarker candidate membrane proteins to date; our methods for biomarker discovery and subsequent validation using SRM/MRM will contribute to the identification of useful biomarker candidates for various cancers. Data are available via ProteomeXchange with identifier PXD000851. PMID:24687888
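The SRM/MRM quantitation step with stable-isotope-labeled internal standards reduces, at its simplest, to a light/heavy peak-area ratio scaled by the known spike amount. A minimal sketch; the peak areas and spike level below are fabricated for illustration:

```python
# Hypothetical SRM peak areas for one target peptide across three samples.
light_area = [1.2e6, 3.4e6, 0.8e6]    # endogenous ("light") peptide
heavy_area = [2.0e6, 2.1e6, 1.9e6]    # spiked isotope-labeled standard
heavy_spike_fmol = 50.0               # known amount of standard added

# Endogenous amount = (light/heavy area ratio) x spiked standard amount.
amounts = [heavy_spike_fmol * l / h for l, h in zip(light_area, heavy_area)]
print([round(a, 1) for a in amounts])  # -> [30.0, 81.0, 21.1]
```

Because the heavy standard co-elutes and shares fragmentation with the endogenous peptide, this ratio cancels most run-to-run instrument variation, which is what makes SRM suitable for verifying candidates across patient cohorts.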

  1. Discovery of colorectal cancer biomarker candidates by membrane proteomic analysis and subsequent verification using selected reaction monitoring (SRM) and tissue microarray (TMA) analysis.

    PubMed

    Kume, Hideaki; Muraoka, Satoshi; Kuga, Takahisa; Adachi, Jun; Narumi, Ryohei; Watanabe, Shio; Kuwano, Masayoshi; Kodera, Yoshio; Matsushita, Kazuyuki; Fukuoka, Junya; Masuda, Takeshi; Ishihama, Yasushi; Matsubara, Hisahiro; Nomura, Fumio; Tomonaga, Takeshi

    2014-06-01

    Recent advances in quantitative proteomic technology have enabled the large-scale validation of biomarkers. We here performed a quantitative proteomic analysis of membrane fractions from colorectal cancer tissue to discover biomarker candidates, and then extensively validated the candidate proteins identified. A total of 5566 proteins were identified in six tissue samples, each of which was obtained from polyps and cancer with and without metastasis. GO cellular component analysis predicted that 3087 of these proteins were membrane proteins, whereas TMHMM algorithm predicted that 1567 proteins had a transmembrane domain. Differences were observed in the expression of 159 membrane proteins and 55 extracellular proteins between polyps and cancer without metastasis, while the expression of 32 membrane proteins and 17 extracellular proteins differed between cancer with and without metastasis. A total of 105 of these biomarker candidates were quantitated using selected (or multiple) reaction monitoring (SRM/MRM) with stable synthetic isotope-labeled peptides as an internal control. The results obtained revealed differences in the expression of 69 of these proteins, and this was subsequently verified in an independent set of patient samples (polyps (n = 10), cancer without metastasis (n = 10), cancer with metastasis (n = 10)). Significant differences were observed in the expression of 44 of these proteins, including ITGA5, GPRC5A, PDGFRB, and TFRC, which have already been shown to be overexpressed in colorectal cancer, as well as proteins with unknown function, such as C8orf55. The expression of C8orf55 was also shown to be high not only in colorectal cancer, but also in several cancer tissues using a multicancer tissue microarray, which included 1150 cores from 14 cancer tissues. 
This is the largest verification study of biomarker candidate membrane proteins to date; our methods for biomarker discovery and subsequent validation using SRM/MRM will contribute to the identification of useful biomarker candidates for various cancers. Data are available via ProteomeXchange with identifier PXD000851. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  2. A Combined XRD/XRF Instrument for Lunar Resource Assessment

    NASA Technical Reports Server (NTRS)

    Vaniman, D. T.; Bish, D. L.; Chipera, S. J.; Blacic, J. D.

    1992-01-01

Robotic surface missions to the Moon should be capable of measuring mineral as well as chemical abundances in regolith samples. Although much is already known about the lunar regolith, our data are far from comprehensive. Most of the regolith samples returned to Earth for analysis had lost the upper surface, or it was intermixed with deeper regolith. This upper surface is the part of the regolith most recently exposed to the solar wind; as such, it will be important for resource assessment. In addition, it may be far easier to mine and process the uppermost few centimeters of regolith over a broad area than to engage in deep excavation of a smaller area. The most direct means of analyzing the regolith surface will be by studies in situ. Beyond the impact-origin regolith surfaces, the Fe-rich glasses of mare pyroclastic deposits are of resource interest but are inadequately known; none of the extensive surface-exposed pyroclastic deposits of the Moon have been systematically sampled, although we know something about such deposits from the Apollo 17 site. Because of the potential importance of pyroclastic deposits, methods to quantify glass as well as mineral abundances will be important to resource evaluation. Combined X-ray diffraction (XRD) and X-ray fluorescence (XRF) analysis will address many resource characterization problems on the Moon. XRF methods are valuable for obtaining full major-element abundances with high precision. Such data, collected in parallel with quantitative mineralogy, permit unambiguous determination of both mineral and chemical abundances where concentrations are high enough to be of resource grade. Collection of both XRD and XRF data from a single sample provides simultaneous chemical and mineralogic information. These data can be used to correlate quantitative chemistry and mineralogy as a set of simultaneous linear equations, the solution of which can lead to full characterization of the sample.
The use of Rietveld methods for XRD data analysis can provide a powerful tool for quantitative mineralogy and for obtaining crystallographic data on complex minerals.
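The "simultaneous linear equations" idea above can be sketched numerically: if each column of a matrix holds the oxide mass fractions of one candidate mineral phase, then the bulk chemistry measured by XRF equals that matrix times the vector of phase abundances, which a least-squares solve recovers. The compositions below are invented placeholders, not real mineral data.

```python
import numpy as np

# Rows: oxides measured by XRF; columns: candidate mineral phases.
# These mass fractions are illustrative placeholders, not real minerals.
C = np.array([
    [0.53, 0.40, 0.10],   # oxide 1 fraction in phases A, B, C
    [0.30, 0.15, 0.45],   # oxide 2
    [0.17, 0.45, 0.45],   # oxide 3
])

# Bulk chemistry of the sample as measured by XRF (same oxide order).
c_bulk = np.array([0.405, 0.285, 0.310])

# Solve C @ x = c_bulk for the phase abundances x (least squares).
x, residuals, rank, _ = np.linalg.lstsq(C, c_bulk, rcond=None)
print(x)  # estimated abundances of phases A, B, C
```

In practice the system is usually overdetermined (more oxides than phases) and the fit residual itself is diagnostic of a missing or misidentified phase.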

  3. Quantitative analysis of population-scale family trees with millions of relatives.

    PubMed

    Kaplanis, Joanna; Gordon, Assaf; Shor, Tal; Weissbrod, Omer; Geiger, Dan; Wahl, Mary; Gershovits, Michael; Markus, Barak; Sheikh, Mona; Gymrek, Melissa; Bhatia, Gaurav; MacArthur, Daniel G; Price, Alkes L; Erlich, Yaniv

    2018-04-13

    Family trees have vast applications in fields as diverse as genetics, anthropology, and economics. However, the collection of extended family trees is tedious and usually relies on resources with limited geographical scope and complex data usage restrictions. We collected 86 million profiles from publicly available online data shared by genealogy enthusiasts. After extensive cleaning and validation, we obtained population-scale family trees, including a single pedigree of 13 million individuals. We leveraged the data to partition the genetic architecture of human longevity and to provide insights into the geographical dispersion of families. We also report a simple digital procedure to overlay other data sets with our resource. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  4. Distributed photovoltaic systems: Utility interface issues and their present status

    NASA Technical Reports Server (NTRS)

    Hassan, M.; Klein, J.

    1981-01-01

    Major technical issues involving the integration of distributed photovoltaics (PV) into electric utility systems are defined and their impacts are described quantitatively. An extensive literature search, interviews, and analysis yielded information about the work in progress and highlighted problem areas in which additional work and research are needed. The findings from the literature search were used to determine whether satisfactory solutions to the problems exist or whether satisfactory approaches to a solution are underway. It was discovered that very few standards, specifications, or guidelines currently exist that will aid industry in integrating PV into the utility system. Specific areas of concern identified are: (1) protection, (2) stability, (3) system unbalance, (4) voltage regulation and reactive power requirements, (5) harmonics, (6) utility operations, (7) safety, (8) metering, and (9) distribution system planning and design.

  5. Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.

    PubMed

    Huang, Chia-Chia; Pan, Tzu-Ming

    2005-05-18

    The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
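The conversion of a DNA mass (0.05 ng) into a genome copy number (20.5 copies) follows from the average molar mass of a base pair. A minimal sketch, assuming a soybean haploid genome of roughly 1.1 Gb and diploid template DNA (both assumptions for illustration, not values stated in the abstract):

```python
AVOGADRO = 6.022e23        # molecules per mole
G_PER_MOL_PER_BP = 650.0   # average mass of one DNA base pair (g/mol)

def genome_copies(dna_mass_ng, genome_size_bp, ploidy=2):
    """Estimate genome copy number in a DNA sample of known mass."""
    mass_per_copy_g = genome_size_bp * ploidy * G_PER_MOL_PER_BP / AVOGADRO
    return dna_mass_ng * 1e-9 / mass_per_copy_g

# 0.05 ng of genomic DNA, assuming a ~1.1 Gb haploid soybean genome:
copies = genome_copies(0.05, 1.1e9)
print(copies)
```

Under these assumptions the result comes out near 21 copies, in line with the paper's 20.5 (the small difference reflects the assumed genome size).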

  6. No Evidence for Extensions to the Standard Cosmological Model.

    PubMed

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-08

    We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).
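The lnB values above translate directly into posterior odds against each extension relative to flat ΛCDM (under the stated priors, and assuming equal prior odds). A small sketch of that conversion:

```python
import math

def posterior_odds(ln_b):
    """Odds for the model vs. the reference, given a log Bayes factor lnB."""
    return math.exp(ln_b)

def posterior_probability(ln_b):
    """Posterior probability of the model vs. the reference, equal prior odds."""
    b = math.exp(ln_b)
    return b / (1.0 + b)

# E.g. the correlated isocurvature models, with lnB = -7.8:
print(posterior_odds(-7.8))        # roughly 4e-4 to 1 against
print(posterior_probability(-7.8))
```

This is why lnB near -3 counts as "strongly disfavored" on conventional evidence scales: the odds against the extension are already about 20 to 1.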

  7. No Evidence for Extensions to the Standard Cosmological Model

    NASA Astrophysics Data System (ADS)

    Heavens, Alan; Fantaye, Yabebal; Sellentin, Elena; Eggers, Hans; Hosenie, Zafiirah; Kroon, Steve; Mootoovaloo, Arrykrishna

    2017-09-01

We compute the Bayesian evidence for models considered in the main analysis of Planck cosmic microwave background data. By utilizing carefully defined nearest-neighbor distances in parameter space, we reuse the Monte Carlo Markov chains already produced for parameter inference to compute Bayes factors B for many different model-data set combinations. The standard 6-parameter flat cold dark matter model with a cosmological constant (ΛCDM) is favored over all other models considered, with curvature being mildly favored only when cosmic microwave background lensing is not included. Many alternative models are strongly disfavored by the data, including primordial correlated isocurvature models (lnB=-7.8), nonzero scalar-to-tensor ratio (lnB=-4.3), running of the spectral index (lnB=-4.7), curvature (lnB=-3.6), nonstandard numbers of neutrinos (lnB=-3.1), nonstandard neutrino masses (lnB=-3.2), nonstandard lensing potential (lnB=-4.6), evolving dark energy (lnB=-3.2), sterile neutrinos (lnB=-6.9), and extra sterile neutrinos with a nonzero scalar-to-tensor ratio (lnB=-10.8). Other models are less strongly disfavored with respect to flat ΛCDM. As with all analyses based on Bayesian evidence, the final numbers depend on the widths of the parameter priors. We adopt the priors used in the Planck analysis, while performing a prior sensitivity analysis. Our quantitative conclusion is that extensions beyond the standard cosmological model are disfavored by Planck data. Only when newer Hubble constant measurements are included does ΛCDM become disfavored, and only mildly, compared with a dynamical dark energy model (lnB∼+2).

  8. Mini-Column Ion-Exchange Separation and Atomic Absorption Quantitation of Nickel, Cobalt, and Iron: An Undergraduate Quantitative Analysis Experiment.

    ERIC Educational Resources Information Center

    Anderson, James L.; And Others

    1980-01-01

Presents an undergraduate quantitative analysis experiment, describing an atomic absorption quantitation scheme that is fast, sensitive, and simple compared with other titration experiments. (CS)

  9. Energy Dispersive Spectrometry and Quantitative Analysis Short Course. Introduction to X-ray Energy Dispersive Spectrometry and Quantitative Analysis

    NASA Technical Reports Server (NTRS)

    Carpenter, Paul; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.

  10. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  11. Does contraceptive treatment in wildlife result in side effects? A review of quantitative and anecdotal evidence.

    PubMed

    Gray, Meeghan E; Cameron, Elissa Z

    2010-01-01

The efficacy of contraceptive treatments has been extensively tested, and several formulations are effective at reducing fertility in a range of species. However, a contraceptive should minimally impact the behavior of individuals and populations before it is used for population manipulation, and these effects have received less attention. Potential side effects have been identified theoretically, and we reviewed published studies that investigated side effects on the behavior and physiology of individuals or population-level effects, which provided mixed results. Physiological side effects were most prevalent. Most studies reported a lack of secondary effects, but were usually based on qualitative data or anecdotes. A meta-analysis of quantitative studies of side effects showed that secondary effects consistently occur across all categories and all contraceptive types. This contrasts with the qualitative studies, suggesting that anecdotal reports are insufficient to investigate secondary impacts of contraceptive treatment. We conclude that more research is needed to address fundamental questions about the secondary effects of contraceptive treatment and that experiments are essential for firm conclusions. In addition, researchers are missing a vital opportunity to use contraceptives as an experimental tool to test the influence of reproduction, sex, and fertility on the behavior of wildlife species.

  12. Detection of melamine in milk powder using MCT-based short-wave infrared hyperspectral imaging system.

    PubMed

    Lee, Hoonsoo; Kim, Moon S; Lohumi, Santosh; Cho, Byoung-Kwan

    2018-06-05

    Extensive research has been conducted on non-destructive and rapid detection of melamine in powdered foods in the last decade. While Raman and near-infrared hyperspectral imaging techniques have been successful in terms of non-destructive and rapid measurement, they have limitations with respect to measurement time and detection capability, respectively. Therefore, the objective of this study was to develop a mercury cadmium telluride (MCT)-based short-wave infrared (SWIR) hyperspectral imaging system and algorithm to detect melamine quantitatively in milk powder. The SWIR hyperspectral imaging system consisted of a custom-designed illumination system, a SWIR hyperspectral camera, a data acquisition module and a sample transfer table. SWIR hyperspectral images were obtained for melamine-milk samples with different melamine concentrations, pure melamine and pure milk powder. Analysis of variance and the partial least squares regression method over the 1000-2500 nm wavelength region were used to develop an optimal model for detection. The results showed that a melamine concentration as low as 50 ppm in melamine-milk powder samples could be detected. Thus, the MCT-based SWIR hyperspectral imaging system has the potential for quantitative and qualitative detection of adulterants in powder samples.
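The calibration step described above (predicting melamine concentration from spectra with partial least squares regression) can be sketched with a minimal NIPALS PLS1 implementation on synthetic data. The "spectra" and concentrations below are invented; the paper's actual wavelengths, preprocessing, and model are not reproduced here.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Fit a PLS1 regression via NIPALS; returns (x_mean, y_mean, beta)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    m = X.shape[1]
    W = np.zeros((m, n_components))   # weight vectors
    P = np.zeros((m, n_components))   # X loadings
    q = np.zeros(n_components)        # y loadings
    for a in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w                    # score vector
        tt = t @ t
        p = Xc.T @ t / tt
        qa = (yc @ t) / tt
        Xc = Xc - np.outer(t, p)      # deflate X and y
        yc = yc - qa * t
        W[:, a], P[:, a], q[a] = w, p, qa
    beta = W @ np.linalg.solve(P.T @ W, q)
    return x_mean, y_mean, beta

def pls1_predict(model, X):
    x_mean, y_mean, beta = model
    return (X - x_mean) @ beta + y_mean

# Synthetic "spectra" with concentrations depending linearly on a few bands.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))
y = X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0, 1.5]) + 2.0
model = pls1_fit(X, y, n_components=6)
y_hat = pls1_predict(model, X)
```

With noise-free linear data and the full number of components, PLS1 reproduces the ordinary least-squares fit exactly; in real spectroscopic calibration, far fewer components are retained and chosen by cross-validation.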

  13. Music and suicidality: a quantitative review and extension.

    PubMed

    Stack, Steven; Lester, David; Rosenberg, Jonathan S

    2012-12-01

This article provides the first quantitative review of the literature on music and suicidality. Multivariate logistic regression techniques are applied to 90 findings from 21 studies. Investigations employing ecological data on suicide completions are 19.2 times more apt than other studies to report a link between music and suicide. More recent studies and studies with large samples are also more apt than their counterparts to report significant results. Further, none of the studies based on experimental research designs found a link between music and suicide ideation, prompting us to conduct a brief content analysis of 24 suicide songs versus 24 nonsuicide songs from the same album. Using Linguistic Inquiry and Word Count software, we found no difference in the content of the suicide songs and controls, including the percentage of sad words, negative affect, and mentions of death, thus providing an explanation for the nonfindings from experimental research. In summary, ecologically based investigations (which capture at-risk persons not in typical school-based samples) and more recent investigations (which have used superior or new methodologies) tend to demonstrate a linkage between music and suicidality. Experimental research is needed with a control group of songs from an alternative genre with low suicidogenic content. © 2012 The American Association of Suicidology.

  14. Tracking antibiotic resistome during wastewater treatment using high throughput quantitative PCR.

    PubMed

    An, Xin-Li; Su, Jian-Qiang; Li, Bing; Ouyang, Wei-Ying; Zhao, Yi; Chen, Qing-Lin; Cui, Li; Chen, Hong; Gillings, Michael R; Zhang, Tong; Zhu, Yong-Guan

    2018-05-08

Wastewater treatment plants (WWTPs) contain diverse antibiotic resistance genes (ARGs) and are thus considered a major pathway for the dissemination of these genes into the environment. However, comprehensive evaluations of ARG dynamics during the wastewater treatment process, covering a broad spectrum of ARGs, are lacking. Here, we investigated the dynamics of ARGs and bacterial community structures in 114 samples from eleven Chinese WWTPs using high-throughput quantitative PCR and 16S rRNA-based Illumina sequencing analysis. A significant shift in ARG profiles was observed, and the wastewater treatment process significantly reduced the abundance and diversity of ARGs, lowering ARG concentrations by 1-2 orders of magnitude. Nevertheless, a considerable number of ARGs were detected in, and enriched in, effluents compared with influents. In particular, seven ARGs, mainly conferring resistance to beta-lactams and aminoglycosides, and three mobile genetic elements persisted in all WWTP samples after wastewater treatment. ARG profiles varied with wastewater treatment processes, seasons, and regions. This study tracked the footprint of ARGs during the wastewater treatment process, which should support assessments of the spread of ARGs from WWTPs and provide data for identifying management options to improve ARG mitigation in WWTPs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  15. MAPA distinguishes genotype-specific variability of highly similar regulatory protein isoforms in potato tuber.

    PubMed

    Hoehenwarter, Wolfgang; Larhlimi, Abdelhalim; Hummel, Jan; Egelhofer, Volker; Selbig, Joachim; van Dongen, Joost T; Wienkoop, Stefanie; Weckwerth, Wolfram

    2011-07-01

Mass Accuracy Precursor Alignment is a fast and flexible method for comparative proteome analysis that allows the comparison of unprecedented numbers of shotgun proteomics analyses on a personal computer in a matter of hours. We compared 183 LC-MS analyses and more than 2 million MS/MS spectra and could define and separate the proteomic phenotypes of field-grown tubers of 12 tetraploid cultivars of the crop plant Solanum tuberosum. Protein isoforms of patatin, as well as of other major gene families such as lipoxygenase and cysteine protease inhibitor that regulate tuber development, were found to be the primary source of variability between the cultivars. This suggests that differentially expressed protein isoforms modulate genotype-specific tuber development and the plant phenotype. We properly assigned the measured abundance of tryptic peptides to different protein isoforms that share extensive stretches of primary structure and thus inferred their abundance. Peptides unique to different protein isoforms were used to classify the remaining peptides assigned to the entire subset of isoforms based on a common abundance profile using multivariate statistical procedures. We identified nearly 4000 proteins, which we used for quantitative functional annotation, making this the most extensive study of the tuber proteome to date.

  16. Aeromagnetic maps with geologic interpretations for the Tularosa Valley, south-central New Mexico

    USGS Publications Warehouse

    Bath, G.D.

    1977-01-01

An aeromagnetic survey of the Tularosa Valley in south-central New Mexico has provided information on the igneous rocks that are buried beneath alluvium and colluvium. The data, compiled as residual magnetic anomalies, are shown on twelve maps at a scale of 1:62,500. Measurements of magnetic properties of samples collected in the valley and adjacent highlands give a basis for identifying the anomaly-producing rocks. Precambrian rocks of the crystalline basement have weakly induced magnetizations and produce anomalies having low magnetic intensities and low magnetic gradients. Late Cretaceous and Cenozoic intrusive rocks have moderately to strongly induced magnetizations; these intrusive rocks produce prominent magnetic anomalies having higher amplitudes and higher gradients. The Quaternary basalt has a strong remanent magnetization of normal polarity and produces narrow anomalies having high magnetic gradients. Interpretations include an increase in elevation to the top of buried Precambrian rock in the northern part of the valley, a large Late Cretaceous and Cenozoic intrusive near Alamogordo, and a southern extension of the intrusive rock exposed in the Jarilla Mountains. Evidence for the southern extension comes from a quantitative analysis of the magnetic anomalies.

  17. Compliant threads maximize spider silk connection strength and toughness

    PubMed Central

    Meyer, Avery; Pugno, Nicola M.; Cranford, Steven W.

    2014-01-01

Millions of years of evolution have adapted spider webs to achieve a range of functionalities, including the well-known capture of prey, with efficient use of material. One feature that has escaped extensive investigation is the silk-on-silk connection joint within spider webs, particularly from a structural mechanics perspective. We report a joint theoretical and computational analysis of an idealized silk-on-silk fibre junction. By modifying the theory of multiple peeling, we quantitatively compare the performance of the system while systematically increasing the rigidity of the anchor thread, both by scaling the stress–strain response and by introducing an applied pre-strain. The results of our study indicate that compliance is a virtue: the more extensible the anchorage, the tougher and stronger the connection becomes. According to the theoretical model, a compliant anchorage, in comparison with a rigid substrate, enormously increases the effective adhesion strength (work required to detach), independent of the adhered thread itself, an effect attributed to a nonlinear alignment between thread and anchor (contact peeling angle). The results can direct novel engineering design principles to achieve possible load transfer from compliant fibre-to-fibre anchorages, be they silk-on-silk or another, as-yet undeveloped, system. PMID:25008083

  18. Diffusion MRI: literature review in salivary gland tumors.

    PubMed

    Attyé, A; Troprès, I; Rouchy, R-C; Righini, C; Espinoza, S; Kastler, A; Krainik, A

    2017-07-01

Surgical resection is currently the best treatment for salivary gland tumors. A reliable magnetic resonance imaging mapping, encompassing tumor grade, location, and extension, may assist safe and effective tumor resection and provide better information for patients regarding potential risks and morbidity after surgical intervention. However, direct examination of tumor grade and extension using conventional morphological MRI remains difficult, often requiring contrast media injection and complex algorithms on perfusion imaging to estimate the degree of malignancy. In addition, the contrast-enhanced MRI technique may be problematic given the recently demonstrated gadolinium accumulation in the dentate nucleus of the cerebellum. Significant developments in magnetic resonance diffusion imaging, involving voxel-based quantitative analysis through measurement of the apparent diffusion coefficient, have enhanced our knowledge of the different histopathological salivary tumor grades. Other diffusion imaging-derived techniques, including high-order tractography models, have recently demonstrated their usefulness in assessing the facial nerve location in the context of parotid tumors. None of these imaging techniques requires contrast media injection. Our review starts by outlining the physical basis of diffusion imaging, before discussing findings from diagnostic studies testing its usefulness in assessing salivary gland tumors with diffusion MRI. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
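The apparent diffusion coefficient mentioned above is computed voxel-wise from the signal decay between diffusion weightings; in the simplest two-point case (one b=0 image and one diffusion-weighted image) it is a direct log-ratio. A minimal sketch, with b in s/mm² so the ADC comes out in mm²/s:

```python
import math

def adc_two_point(s0, s_b, b):
    """Apparent diffusion coefficient from signals at b=0 and b>0 (s/mm^2)."""
    return math.log(s0 / s_b) / b

# A voxel whose signal drops from 1000 to ~368 at b = 1000 s/mm^2:
print(adc_two_point(1000.0, 368.0, 1000.0))  # close to 1.0e-3 mm^2/s
```

Clinical pipelines fit the same exponential decay over several b-values per voxel, but the two-point form shows why no contrast agent is needed: the ADC is derived entirely from the diffusion-induced signal attenuation.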

  19. Status and opportunities for genomics research with rainbow trout

    USGS Publications Warehouse

    Thorgaard, G.H.; Bailey, G.S.; Williams, D.; Buhler, D.R.; Kaattari, S.L.; Ristow, S.S.; Hansen, J.D.; Winton, J.R.; Bartholomew, J.L.; Nagler, J.J.; Walsh, P.J.; Vijayan, M.M.; Devlin, R.H.; Hardy, R.W.; Overturf, K.E.; Young, W.P.; Robison, B.D.; Rexroad, C.; Palti, Y.

    2002-01-01

The rainbow trout (Oncorhynchus mykiss) is one of the most widely studied of model fish species. Extensive basic biological information has been collected for this species, which, because of their large size relative to other model fish species, are particularly suitable for studies requiring ample quantities of specific cells and tissue types. Rainbow trout have been widely utilized for research in carcinogenesis, toxicology, comparative immunology, disease ecology, physiology, and nutrition. They are distinctive in having evolved from a relatively recent tetraploid event, resulting in a high incidence of duplicated genes. Natural populations are available and have been well characterized for chromosomal, protein, molecular, and quantitative genetic variation. Their ease of culture and their experimental and aquacultural significance have led to the development of clonal lines and the widespread application of transgenic technology to this species. Numerous microsatellites have been isolated and two relatively detailed genetic maps have been developed. Extensive sequencing of expressed sequence tags has begun and four BAC libraries have been developed. The development and analysis of additional genomic sequence data will provide distinctive opportunities to address problems in areas such as the evolution of the immune system and duplicate genes. © 2002 Elsevier Science Inc. All rights reserved.

  20. Behavior coordination of mobile robotics using supervisory control of fuzzy discrete event systems.

    PubMed

    Jayasiri, Awantha; Mann, George K I; Gosine, Raymond G

    2011-10-01

    In order to incorporate the uncertainty and impreciseness present in real-world event-driven asynchronous systems, fuzzy discrete event systems (DESs) (FDESs) have been proposed as an extension to crisp DESs. In this paper, first, we propose an extension to the supervisory control theory of FDES by redefining fuzzy controllable and uncontrollable events. The proposed supervisor is capable of enabling feasible uncontrollable and controllable events with different possibilities. Then, the extended supervisory control framework of FDES is employed to model and control several navigational tasks of a mobile robot using the behavior-based approach. The robot has limited sensory capabilities, and the navigations have been performed in several unmodeled environments. The reactive and deliberative behaviors of the mobile robotic system are weighted through fuzzy uncontrollable and controllable events, respectively. By employing the proposed supervisory controller, a command-fusion-type behavior coordination is achieved. The observability of fuzzy events is incorporated to represent the sensory imprecision. As a systematic analysis of the system, a fuzzy-state-based controllability measure is introduced. The approach is implemented in both simulation and real time. A performance evaluation is performed to quantitatively estimate the validity of the proposed approach over its counterparts.

  1. Ancient Paradoxes Can Extend Mathematical Thinking

    ERIC Educational Resources Information Center

    Czocher, Jennifer A.; Moss, Diana L.

    2017-01-01

    This article presents the Snail problem, a relatively simple challenge about motion that offers engaging extensions involving the notion of infinity. It encourages students in grades 5-9 to connect mathematics learning to logic, history, and philosophy through analyzing the problem, making sense of quantitative relationships, and modeling with…

  2. The Causal Effects of Cultural Relevance: Evidence from an Ethnic Studies Curriculum

    ERIC Educational Resources Information Center

    Dee, Thomas S.; Penner, Emily K.

    2017-01-01

    An extensive theoretical and qualitative literature stresses the promise of instructional practices and content aligned with minority students' experiences. Ethnic studies courses provide an example of such "culturally relevant pedagogy" (CRP). Despite theoretical support, quantitative evidence on the effectiveness of these courses is…

  3. Quantitation of buried contamination by use of solvents

    NASA Technical Reports Server (NTRS)

    Pappas, S. P.; Hsiao, P.; Hill, L. W.

    1972-01-01

    Solubilization studies were carried out on various cured silicone resins. A solvent spectrum was prepared. It was found that complete dissolution of cured silicone resins could be achieved without extensive physical degradation of samples. Based on the solubilization results, amine solvents were selected for spore viability studies.

  4. Wavelet Analysis of Turbulent Spots and Other Coherent Structures in Unsteady Transition

    NASA Technical Reports Server (NTRS)

    Lewalle, Jacques

    1998-01-01

    This is a secondary analysis of a portion of the Halstead data. The hot-film traces from an embedded stage of a low pressure turbine have been extensively analyzed by Halstead et al. In this project, wavelet analysis is used to develop the quantitative characterization of individual coherent structures in terms of size, amplitude, phase, convection speed, etc., as well as phase-averaged time scales. The purposes of the study are (1) to extract information about turbulent time scales for comparison with unsteady model results (e.g. k/epsilon). Phase-averaged maps of dominant time scales will be presented; and (2) to evaluate any differences between wake-induced and natural spots that might affect model performance. Preliminary results, subject to verification with data at higher frequency resolution, indicate that spot properties are independent of their phase relative to the wake footprints: therefore requirements for the physical content of models are kept relatively simple. Incidentally, we also observed that spot substructures can be traced over several stations; further study will examine their possible impact.

  5. Carbon Nanotube Fiber Ionization Mass Spectrometry: A Fundamental Study of a Multi-Walled Carbon Nanotube Functionalized Corona Discharge Pin for Polycyclic Aromatic Hydrocarbons Analysis.

    PubMed

    Nahan, Keaton S; Alvarez, Noe; Shanov, Vesselin; Vonderheide, Anne

    2017-11-01

    Mass spectrometry continues to tackle many complicated tasks, and ongoing research seeks to simplify its instrumentation as well as sampling. The desorption electrospray ionization (DESI) source was the first ambient ionization source to function without extensive gas requirements and chromatography. Electrospray techniques generally have low efficiency for ionization of nonpolar analytes, and some researchers have resorted to methods such as direct analysis in real time (DART) or desorption atmospheric pressure chemical ionization (DAPCI) for their analysis. In this work, a carbon nanotube fiber ionization (nanoCFI) source was developed and was found to be capable of solid phase microextraction (SPME) of nonpolar analytes as well as ionization and sampling similar to that of direct probe atmospheric pressure chemical ionization (DP-APCI). Conductivity and adsorption were maintained by utilizing a corona pin functionalized with a multi-walled carbon nanotube (MWCNT) thread. Quantitative work with the nanoCFI source with a designed corona discharge pin insert demonstrated linearity up to 0.97 (R²) for three target PAHs with a phenanthrene internal standard.

  6. Finite Element-Based Mechanical Assessment of Bone Quality on the Basis of In Vivo Images.

    PubMed

    Pahr, Dieter H; Zysset, Philippe K

    2016-12-01

    Beyond bone mineral density (BMD), bone quality designates the mechanical integrity of bone tissue. In vivo images based on X-ray attenuation, such as CT reconstructions, provide size, shape, and local BMD distribution and may be exploited as input for finite element analysis (FEA) to assess bone fragility. Further key input parameters of FEA are the material properties of bone tissue. This review discusses the main determinants of bone mechanical properties and emphasizes the added value, as well as the important assumptions underlying finite element analysis. Bone tissue is a sophisticated, multiscale composite material that undergoes remodeling but exhibits a rather narrow band of tissue mineralization. Mechanically, bone tissue behaves elastically under physiologic loads and yields by cracking beyond critical strain levels. Through adequate cell-orchestrated modeling, trabecular bone tunes its mechanical properties by volume fraction and fabric. With proper calibration, these mechanical properties may be incorporated in quantitative CT-based finite element analysis that has been validated extensively with ex vivo experiments and has been applied increasingly in clinical trials to assess treatment efficacy against osteoporosis.
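
    In QCT-based FEA of the kind reviewed here, the local BMD distribution is typically converted to element-wise material properties through an empirical density-modulus relation. The power-law sketch below is illustrative only; `e0_mpa` and `exponent` are placeholder values, not the calibration discussed in this review.

```python
# Illustrative density-to-modulus mapping used when assigning material
# properties to voxels/elements in QCT-based finite element models.
# The power-law coefficients below are placeholders, not values
# calibrated against any specific validation study.
def elastic_modulus(rho_ash_g_cm3, e0_mpa=10500.0, exponent=2.29):
    """Map ash density (g/cm^3) to an isotropic elastic modulus (MPa)."""
    if rho_ash_g_cm3 <= 0:
        return 0.0
    return e0_mpa * rho_ash_g_cm3 ** exponent

# A low-density trabecular voxel vs. a high-density cortical voxel:
print(round(elastic_modulus(0.2)), round(elastic_modulus(1.2)))
```

    A relation of this shape is what lets a single calibrated model span the trabecular-to-cortical density range mentioned in the review.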

  7. High-Dimensional Heteroscedastic Regression with an Application to eQTL Data Analysis

    PubMed Central

    Daye, Z. John; Chen, Jinbo; Li, Hongzhe

    2011-01-01

    Summary: We consider the problem of high-dimensional regression under non-constant error variances. Despite being a common phenomenon in biological applications, heteroscedasticity has, so far, been largely ignored in high-dimensional analysis of genomic data sets. We propose a new methodology that allows non-constant error variances for high-dimensional estimation and model selection. Our method incorporates heteroscedasticity by simultaneously modeling both the mean and variance components via a novel doubly regularized approach. Extensive Monte Carlo simulations indicate that our proposed procedure can result in better estimation and variable selection than existing methods when heteroscedasticity arises from the presence of predictors explaining error variances and outliers. Further, we demonstrate the presence of heteroscedasticity in and apply our method to an expression quantitative trait loci (eQTL) study of 112 yeast segregants. The new procedure can automatically account for heteroscedasticity in identifying the eQTLs that are associated with gene expression variations and lead to smaller prediction errors. These results demonstrate the importance of considering heteroscedasticity in eQTL data analysis. PMID:22547833
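
    The doubly regularized idea of modeling mean and variance together can be sketched with a simple alternating scheme. The code below is a hypothetical stand-in (ridge penalties in place of the paper's regularizers, synthetic data), meant only to show how a predictor driving the error variance can be recovered alongside the mean model.

```python
import numpy as np

def hetero_ridge(X, y, lam_mean=1e-3, lam_var=1e-3, iters=20):
    # Alternate between a precision-weighted ridge fit for the mean and a
    # penalized least-squares fit of log squared residuals for the
    # log-variance model. A sketch of the doubly regularized idea only,
    # not the published algorithm.
    n, p = X.shape
    beta = np.zeros(p)   # mean-model coefficients
    gamma = np.zeros(p)  # log-variance-model coefficients
    for _ in range(iters):
        w = np.exp(-X @ gamma)  # per-observation weights 1 / sigma_i^2
        A = X.T @ (w[:, None] * X) + lam_mean * np.eye(p)
        beta = np.linalg.solve(A, X.T @ (w * y))
        z = np.log(np.maximum((y - X @ beta) ** 2, 1e-8))
        B = X.T @ X + lam_var * np.eye(p)
        gamma = np.linalg.solve(B, X.T @ z)
    return beta, gamma

# Synthetic data: column 1 drives the mean, column 2 drives the variance.
rng = np.random.default_rng(0)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n), rng.normal(size=n)])
y = 2.0 * X[:, 1] + np.exp(0.5 * X[:, 2]) * rng.normal(size=n)
beta, gamma = hetero_ridge(X, y)
print(round(beta[1], 2), round(gamma[2], 2))
```

    A plain homoscedastic fit would recover only `beta`; modeling `gamma` as well is what exposes the variance-driving predictor, which is the situation the simulations in the abstract describe.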

  8. Comparing DNA damage-processing pathways by computer analysis of chromosome painting data.

    PubMed

    Levy, Dan; Vazquez, Mariel; Cornforth, Michael; Loucas, Bradford; Sachs, Rainer K; Arsuaga, Javier

    2004-01-01

    Chromosome aberrations are large-scale illegitimate rearrangements of the genome. They are indicative of DNA damage and informative about damage processing pathways. Despite extensive investigations over many years, the mechanisms underlying aberration formation remain controversial. New experimental assays such as multiplex fluorescent in situ hybridization (mFISH) allow combinatorial "painting" of chromosomes and are promising for elucidating aberration formation mechanisms. Recently observed mFISH aberration patterns are so complex that computer and graph-theoretical methods are needed for their full analysis. An important part of the analysis is decomposing a chromosome rearrangement process into "cycles." A cycle of order n, characterized formally by the cyclic graph with 2n vertices, indicates that n chromatin breaks take part in a single irreducible reaction. We here describe algorithms for computing cycle structures from experimentally observed or computer-simulated mFISH aberration patterns. We show that analyzing cycles quantitatively can distinguish between different aberration formation mechanisms. In particular, we show that homology-based mechanisms do not generate the large number of complex aberrations, involving higher-order cycles, observed in irradiated human lymphocytes.
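
    The cycle decomposition at the heart of this analysis can be reproduced with a short graph traversal. Below is an illustrative reconstruction (labeling and input format invented for the example), not the authors' published algorithms: break ends are paired once by pre-break chromatin continuity and once by the misrejoined configuration, and the union of the two pairings splits into disjoint cycles whose order is half their vertex count.

```python
def cycle_orders(initial_joins, final_joins):
    """Decompose an exchange into cycle orders.

    Break ends are labeled 0..2n-1. `initial_joins` pairs ends by
    pre-break chromatin continuity; `final_joins` pairs them in the
    misrejoined configuration. Alternating between the two pairings
    traces out disjoint cycles; a cycle with 2n vertices has order n.
    """
    partner_a, partner_b = {}, {}
    for x, y in initial_joins:
        partner_a[x], partner_a[y] = y, x
    for x, y in final_joins:
        partner_b[x], partner_b[y] = y, x
    seen, orders = set(), []
    for start in partner_a:
        if start in seen:
            continue
        size, node, use_a = 0, start, True
        while True:
            seen.add(node)
            node = partner_a[node] if use_a else partner_b[node]
            use_a = not use_a
            size += 1
            if node == start and use_a:
                break
        orders.append(size // 2)
    return sorted(orders)

# A reciprocal translocation: two breaks with ends swapped -> one order-2 cycle.
print(cycle_orders([(0, 1), (2, 3)], [(0, 3), (2, 1)]))  # -> [2]
```

    Tallying the resulting orders over many aberration patterns gives the cycle-order distributions the abstract uses to separate formation mechanisms.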

  9. On mining complex sequential data by means of FCA and pattern structures

    NASA Astrophysics Data System (ADS)

    Buzmakov, Aleksey; Egho, Elias; Jay, Nicolas; Kuznetsov, Sergei O.; Napoli, Amedeo; Raïssi, Chedy

    2016-02-01

    Nowadays, datasets are available in very complex and heterogeneous forms. Mining of such data collections is essential to support many real-world applications ranging from healthcare to marketing. In this work, we focus on the analysis of "complex" sequential data by means of interesting sequential patterns. We approach the problem using the elegant mathematical framework of formal concept analysis and its extension based on "pattern structures". Pattern structures are used for mining complex data (such as sequences or graphs) and are based on a subsumption operation, which in our case is defined with respect to the partial order on sequences. We show how pattern structures along with projections (i.e. a data reduction of sequential structures) are able to enumerate more meaningful patterns and increase the computing efficiency of the approach. Finally, we show the applicability of the presented method for discovering and analysing interesting patient patterns from a French healthcare dataset on cancer. The quantitative and qualitative results (with annotations and analysis from a physician) are reported in this use case, which is the main motivation for this work.

  10. Integrated Path Differential Absorption Lidar Optimizations Based on Pre-Analyzed Atmospheric Data for ASCENDS Mission Applications

    NASA Technical Reports Server (NTRS)

    Pliutau, Denis; Prasad, Narasimha S.

    2012-01-01

    In this paper, a modeling method based on data reduction is investigated that incorporates pre-analyzed MERRA atmospheric fields for quantitative estimates of uncertainties introduced in the integrated path differential absorption methods for the sensing of various molecules including CO2. This approach represents an extension of our existing lidar modeling framework previously developed and allows effective on- and offline wavelength optimizations and weighting function analysis to minimize interference effects such as those due to temperature sensitivity and water vapor absorption. The new simulation methodology differs from the previous implementation in that it allows analysis of atmospheric effects over annual spans and the entire Earth coverage, which was achieved due to the data reduction methods employed. The effectiveness of the proposed simulation approach is demonstrated with application to the mixing ratio retrievals for the future ASCENDS mission. Independent analysis of multiple accuracy-limiting factors, including the temperature and water vapor interferences and selected system parameters, is further used to identify favorable spectral regions as well as wavelength combinations facilitating the reduction in total errors in the retrieved XCO2 values.

  11. Carbon Nanotube Fiber Ionization Mass Spectrometry: A Fundamental Study of a Multi-Walled Carbon Nanotube Functionalized Corona Discharge Pin for Polycyclic Aromatic Hydrocarbons Analysis

    NASA Astrophysics Data System (ADS)

    Nahan, Keaton S.; Alvarez, Noe; Shanov, Vesselin; Vonderheide, Anne

    2017-09-01

    Mass spectrometry continues to tackle many complicated tasks, and ongoing research seeks to simplify its instrumentation as well as sampling. The desorption electrospray ionization (DESI) source was the first ambient ionization source to function without extensive gas requirements and chromatography. Electrospray techniques generally have low efficiency for ionization of nonpolar analytes, and some researchers have resorted to methods such as direct analysis in real time (DART) or desorption atmospheric pressure chemical ionization (DAPCI) for their analysis. In this work, a carbon nanotube fiber ionization (nanoCFI) source was developed and was found to be capable of solid phase microextraction (SPME) of nonpolar analytes as well as ionization and sampling similar to that of direct probe atmospheric pressure chemical ionization (DP-APCI). Conductivity and adsorption were maintained by utilizing a corona pin functionalized with a multi-walled carbon nanotube (MWCNT) thread. Quantitative work with the nanoCFI source with a designed corona discharge pin insert demonstrated linearity up to 0.97 (R²) for three target PAHs with a phenanthrene internal standard.

  12. The Brazilian Experience with Agroecological Extension: A Critical Analysis of Reform in a Pluralistic Extension System

    ERIC Educational Resources Information Center

    Diesel, Vivien; Miná Dias, Marcelo

    2016-01-01

    Purpose: To analyze the Brazilian experience in designing and implementing a recent extension policy reform based on agroecology, and reflect on its wider theoretical implications for extension reform literature. Design/methodology/approach: Using a critical public analysis we characterize the evolution of Brazilian federal extension policy…

  13. Quantitative Analysis of High-Quality Officer Selection by Commandants Career-Level Education Board

    DTIC Science & Technology

    2017-03-01

    due to Marines being evaluated before the end of their initial service commitment. Our research utilizes quantitative variables to analyze the...not provide detailed information why. B. LIMITATIONS The photograph analysis in this research is strictly limited to a quantitative analysis in...

  14. Work, malaise, and well-being in Spanish and Latin-American doctors

    PubMed Central

    Ochoa, Paola; Blanch, Josep M

    2016-01-01

    ABSTRACT OBJECTIVE To analyze the relations between the meanings of working and the levels of doctors' work well-being in the context of their working conditions. METHOD The research combined the qualitative methodology of textual analysis and the quantitative one of correspondence factor analysis. A convenience, intentional, and stratified sample composed of 305 Spanish and Latin American doctors completed an extensive questionnaire on the topics of the research. RESULTS The general meaning of working for the group located in the quartile of malaise included perceptions of discomfort, frustration, and exhaustion. However, those showing higher levels of well-being, located on the opposite quartile, associated their working experience with good conditions and the development of their professional and personal competences. CONCLUSIONS The study provides empirical evidence of the relationship between contextual factors and the meanings of working for participants with higher levels of malaise, and of the importance granted both to intrinsic and extrinsic factors by those who scored highest on well-being. PMID:27191157

  15. The 'Book of Life' in the press: comparing German and Irish media discourse on human genome research.

    PubMed

    O'Mahony, Patrick; Schäfer, Mike Steffen

    2005-02-01

    The essay compares German and Irish media coverage of human genome research in the year 2000, using qualitative and quantitative frame analysis of a print media corpus. Drawing from a media-theoretical account of science communication, the study examines four analytic dimensions: (1) the influence of global and national sources of discourse; (2) the nature of elaboration on important themes; (3) the extent of societal participation in discourse production; (4) the cultural conditions in which the discourse resonates. The analysis shows that a global discursive package, emphasizing claims of scientific achievement and medical progress, dominates media coverage in both countries. However, German coverage is more extensive and elaborate, and includes a wider range of participants. Irish coverage more often incorporates the global package without further elaboration. These findings indicate that the global package is 'localized' differently due to national patterns of interests, German participation in human genome research, traditions of media coverage, and the domestic resonance of the issue.

  16. Use of mathematics to guide target selection in systems pharmacology; application to receptor tyrosine kinase (RTK) pathways.

    PubMed

    Benson, Neil; van der Graaf, Piet H; Peletier, Lambertus A

    2017-11-15

    A key element of the drug discovery process is target selection. Although the topic is subject to much discussion and experimental effort, there are no defined quantitative rules around optimal selection. Often 'rules of thumb', that have not been subject to rigorous exploration, are used. In this paper we explore the 'rule of thumb' notion that the molecule that initiates a pathway signal is the optimal target. Given the multi-factorial and complex nature of this question, we have simplified an example pathway to its logical minimum of two steps and used a mathematical model of this to explore the different options in the context of typical small and large molecule drugs. In this paper, we report the conclusions of our analysis and describe the analysis tool and methods used. These provide a platform to enable a more extensive enquiry into this important topic. Copyright © 2017 Elsevier B.V. All rights reserved.
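
    The two-step minimal pathway can be made concrete with a toy cascade model. The code below is a hypothetical sketch (unit rates, a simple competitive-inhibition occupancy factor), not the paper's model; it illustrates why the target-selection question is non-trivial in this toy setting: in a purely linear cascade the steady-state output depends only on the product of step rates, so target position starts to matter only once saturation or other nonlinearities enter.

```python
def cascade_output(inhibit_first, kd=1.0, drug=10.0, dt=0.01, steps=5000):
    # Minimal two-step signalling cascade, S -> A -> B, integrated by
    # forward Euler. A competitive inhibitor scales the targeted step's
    # rate by 1 / (1 + drug/kd). All rate constants are illustrative.
    occ = 1.0 / (1.0 + drug / kd)         # fractional activity remaining
    k1 = occ if inhibit_first else 1.0    # step 1: S activates A
    k2 = 1.0 if inhibit_first else occ    # step 2: A activates B
    a = b = 0.0
    for _ in range(steps):
        a += dt * (k1 * 1.0 - 0.5 * a)    # constant upstream stimulus S = 1
        b += dt * (k2 * a - 0.5 * b)
    return b

# Inhibiting step 1 vs step 2 gives the same steady-state output here:
print(round(cascade_output(True), 3), round(cascade_output(False), 3))
```

    With no drug the model settles at an output of 4.0; with the inhibitor at either step it attenuates identically, so distinguishing target positions requires the richer drug-modality and binding detail the paper analyzes.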

  17. Detection and characterization of naturally acquired West Nile virus infection in a female wild turkey.

    PubMed

    Zhang, Z; Wilson, F; Read, R; Pace, L; Zhang, S

    2006-03-01

    An adult female wild turkey exhibiting disorientation and failure to flee when approached was submitted to the Mississippi Veterinary Research and Diagnostic Laboratory. Gross pathologic examination revealed evidence of dehydration and the presence of modest numbers of adult nematodes in the small intestine. Histologic examination revealed extensive multifocal perivascular lymphocytic infiltration in brain, marked heterophilic hyperplasia in bone marrow, and multifocal interstitial lymphocytic infiltration in heart, pancreas, ventriculus, and skeletal muscles. West Nile virus (WNV) was isolated from the brain, lung, and kidney tissues using cultured Vero cells. Higher copy numbers of viral RNA were detected in brain, lung, and kidney than in heart, liver, or spleen by quantitative real-time reverse transcription-polymerase chain reaction (RRT-PCR) analysis. Immunohistochemical (IHC) analysis detected WNV antigen in various tissues including neurons, kidney, respiratory tract epithelium, heart, and bone marrow. On the basis of the data from this investigation, it is concluded that WNV caused encephalitis along with many other pathologic changes in the affected wild turkey.

  18. In-vivo dynamic characterization of microneedle skin penetration using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Enfield, Joey; O'Connell, Marie-Louise; Lawlor, Kate; Jonathan, Enock; O'Mahony, Conor; Leahy, Martin

    2010-07-01

    The use of microneedles as a method of circumventing the barrier properties of the stratum corneum is receiving much attention. Although skin disruption technologies and subsequent transdermal diffusion rates are being extensively studied, no accurate data on depth and closure kinetics of microneedle-induced skin pores are available, primarily due to the cumbersome techniques currently required for skin analysis. We report on the first use of optical coherence tomography technology to image microneedle penetration in real time and in vivo. We show that optical coherence tomography (OCT) can be used to painlessly measure stratum corneum and epidermis thickness, as well as microneedle penetration depth after microneedle insertion. Since OCT is a real-time, in-vivo, nondestructive technique, we also analyze skin healing characteristics and present quantitative data on micropore closure rate. Two locations (the volar forearm and dorsal aspect of the fingertip) have been assessed as suitable candidates for microneedle administration. The results illustrate the applicability of OCT analysis as a tool for microneedle-related skin characterization.

  19. Tertiary structural propensities reveal fundamental sequence/structure relationships.

    PubMed

    Zheng, Fan; Zhang, Jian; Grigoryan, Gevorg

    2015-05-05

    Extracting useful generalizations from the continually growing Protein Data Bank (PDB) is of central importance. We hypothesize that the PDB contains valuable quantitative information on the level of local tertiary structural motifs (TERMs). We show that by breaking a protein structure into its constituent TERMs, and querying the PDB to characterize the natural ensemble matching each, we can estimate the compatibility of the structure with a given amino acid sequence through a metric we term "structure score." Considering submissions from recent Critical Assessment of Structure Prediction (CASP) experiments, we found a strong correlation (R = 0.69) between structure score and model accuracy, with poorly predicted regions readily identifiable. This performance exceeds that of leading atomistic statistical energy functions. Furthermore, TERM-based analysis of two prototypical multi-state proteins rapidly produced structural insights fully consistent with prior extensive experimental studies. We thus find that TERM-based analysis should have considerable utility for protein structural biology. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. The effect of poverty-influenced, food-related consumer behaviors on obesity: An analysis of the NHANES flexible consumer behavioral module.

    PubMed

    O'Dare Wilson, Kellie

    2017-01-01

    Despite extensive research investigating obesity, the problem continues to increase, particularly in poor, minority, and under-resourced communities. However, the literature continues to demonstrate that many obesity-predicating variables are outside of personal volitional control, such as food-related consumer behaviors, which are strongly influenced by income and environment. This cross-sectional study (n = 5,109) employed secondary data analysis to quantitatively examine the effect of food-related consumer variables on obesity while controlling for covariates. Participants answered questions regarding money spent on food, time preparing meals, number of meals eaten at home and away from home, and types of food products consumed (frozen/fast foods, sodas, salty snacks, etc.). In this study, 48.9% of respondents were either overweight or obese. No significant associations were found between the contextual variables examined and BMI scores. However, given the sample's limitations illuminated in the study, further research regarding the relationship between obesity and poverty-influenced, food-related consumer behaviors is warranted.

  1. Hepatitis C Virus Antigenic Convergence

    PubMed Central

    Campo, David S.; Dimitrova, Zoya; Yokosawa, Jonny; Hoang, Duc; Perez, Nestor O.; Ramachandran, Sumathi; Khudyakov, Yury

    2012-01-01

    Vaccine development against hepatitis C virus (HCV) is hindered by poor understanding of factors defining cross-immunoreactivity among heterogeneous epitopes. Using synthetic peptides and mouse immunization as a model, we conducted a quantitative analysis of cross-immunoreactivity among variants of the HCV hypervariable region 1 (HVR1). Analysis of 26,883 immunological reactions among pairs of peptides showed that the distribution of cross-immunoreactivity among HVR1 variants was skewed, with antibodies against a few variants reacting with all tested peptides. The HVR1 cross-immunoreactivity was accurately modeled based on amino acid sequence alone. The tested peptides were mapped in the HVR1 sequence space, which was visualized as a network of 11,319 sequences. The HVR1 variants with a greater network centrality showed a broader cross-immunoreactivity. The entire sequence space is explored by each HCV genotype and subtype. These findings indicate that HVR1 antigenic diversity is extensively convergent and effectively limited, suggesting significant implications for vaccine development. PMID:22355779
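
    The link between network centrality and breadth of cross-immunoreactivity can be illustrated with a toy sequence-space network. The variants and the distance-1 connection rule below are invented for the example; the paper's network of 11,319 HVR1 sequences is built from real peptide data.

```python
from itertools import combinations

def degree_centrality(variants):
    # Connect equal-length variants that differ at exactly one position
    # (a toy stand-in for a sequence-space network) and return each
    # variant's degree normalized by the maximum possible degree.
    deg = {v: 0 for v in variants}
    for a, b in combinations(variants, 2):
        if len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1:
            deg[a] += 1
            deg[b] += 1
    n = max(len(variants) - 1, 1)
    return {v: d / n for v, d in deg.items()}

# Hypothetical short peptide variants (not real HVR1 sequences):
hvr1_like = ["TTHVT", "TTHVA", "TAHVT", "TTHAT", "GGSSG"]
cent = degree_centrality(hvr1_like)
print(max(cent, key=cent.get))  # -> 'TTHVT', the hub of this toy network
```

    In the abstract's terms, the hub variant plays the role of the broadly cross-immunoreactive sequences: high centrality in sequence space predicted broad antibody reactivity.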

  2. Comprehensive Identification of Glycated Peptides and Their Glycation Motifs in Plasma and Erythrocytes of Control and Diabetic Subjects

    PubMed Central

    Zhang, Qibin; Monroe, Matthew E.; Schepmoes, Athena A.; Clauss, Therese R. W.; Gritsenko, Marina A.; Meng, Da; Petyuk, Vladislav A.; Smith, Richard D.; Metz, Thomas O.

    2011-01-01

    Non-enzymatic glycation of proteins sets the stage for formation of advanced glycation end-products and development of chronic complications of diabetes. In this report, we extended our previous methods on proteomics analysis of glycated proteins to comprehensively identify glycated proteins in control and diabetic human plasma and erythrocytes. Using immunodepletion, enrichment, and fractionation strategies, we identified 7749 unique glycated peptides, corresponding to 3742 unique glycated proteins. Semi-quantitative comparisons showed that glycation levels of a number of proteins were significantly increased in diabetes and that erythrocyte proteins were more extensively glycated than plasma proteins. A glycation motif analysis revealed that some amino acids were favored more than others in the protein primary structures in the vicinity of the glycation sites in both sample types. The glycated peptides and corresponding proteins reported here provide a foundation for potential identification of novel markers for diabetes, hyperglycemia, and diabetic complications in future studies. PMID:21612289

  3. Polymerase/DNA interactions and enzymatic activity: multi-parameter analysis with electro-switchable biosurfaces

    NASA Astrophysics Data System (ADS)

    Langer, Andreas; Schräml, Michael; Strasser, Ralf; Daub, Herwin; Myers, Thomas; Heindl, Dieter; Rant, Ulrich

    2015-07-01

    The engineering of high-performance enzymes for future sequencing and PCR technologies as well as the development of many anticancer drugs requires a detailed analysis of DNA/RNA synthesis processes. However, due to the complex molecular interplay involved, real-time methodologies have not been available to obtain comprehensive information on both binding parameters and enzymatic activities. Here we introduce a chip-based method to investigate polymerases and their interactions with nucleic acids, which employs an electrical actuation of DNA templates on microelectrodes. Two measurement modes track both the dynamics of the induced switching process and the DNA extension simultaneously to quantitate binding kinetics, dissociation constants and thermodynamic energies. The high sensitivity of the method reveals previously unidentified tight binding states for Taq and Pol I (KF) DNA polymerases. Furthermore, the incorporation of label-free nucleotides can be followed in real-time and changes in the DNA polymerase conformation (finger closing) during enzymatic activity are observable.

  4. Evolution of the 3-dimensional video system for facial motion analysis: ten years' experiences and recent developments.

    PubMed

    Tzou, Chieh-Han John; Pona, Igor; Placheta, Eva; Hold, Alina; Michaelidou, Maria; Artner, Nicole; Kropatsch, Walter; Gerber, Hans; Frey, Manfred

    2012-08-01

    Since the implementation of the computer-aided system for assessing facial palsy in 1999 by Frey et al. (Plast Reconstr Surg. 1999;104:2032-2039), no similar system that can make an objective, three-dimensional, quantitative analysis of facial movements has been marketed. This system has been in routine use since its launch, and it has proven to be reliable, clinically applicable, and therapeutically accurate. With the cooperation of international partners, more than 200 patients were analyzed. Recent developments in computer vision (mostly in the area of generative face models, applying active appearance models and their extensions, optical flow, and video tracking) have been successfully incorporated to automate the prototype system. Further market-ready development and a business partner will be needed to enable the production of this system to enhance clinical methodology in diagnostic and prognostic accuracy as a personalized therapy concept, leading to better results and higher quality of life for patients with impaired facial function.

  5. A comparative analysis of high-throughput platforms for validation of a circulating microRNA signature in diabetic retinopathy.

    PubMed

    Farr, Ryan J; Januszewski, Andrzej S; Joglekar, Mugdha V; Liang, Helena; McAulley, Annie K; Hewitt, Alex W; Thomas, Helen E; Loudovaris, Tom; Kay, Thomas W H; Jenkins, Alicia; Hardikar, Anandwardhan A

    2015-06-02

    MicroRNAs are now increasingly recognized as biomarkers of disease progression. Several quantitative real-time PCR (qPCR) platforms have been developed to determine the relative levels of microRNAs in biological fluids. We systematically compared the detection of cellular and circulating microRNA using a standard 96-well platform, a high-content microfluidics platform and two ultra-high content platforms. We used extensive analytical tools to compute inter- and intra-run variability and concordance measured using fidelity scoring, coefficient of variation and cluster analysis. We carried out unprejudiced next generation sequencing to identify a microRNA signature for Diabetic Retinopathy (DR) and systematically assessed the validation of this signature on clinical samples using each of the above four qPCR platforms. The results indicate that sensitivity to measure low copy number microRNAs is inversely related to qPCR reaction volume and that the choice of platform for microRNA biomarker validation should be made based on the abundance of miRNAs of interest.
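
    Inter-platform concordance of the kind computed here rests on simple replicate statistics. The sketch below shows a coefficient-of-variation comparison with hypothetical Cq replicates and an invented 15% threshold; the paper's fidelity scoring and cluster analysis are more extensive than this.

```python
import statistics

def percent_cv(values):
    # Coefficient of variation (%) across replicate Cq measurements.
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def concordant(cv_a, cv_b, threshold=15.0):
    # A crude concordance call: both platforms either replicate a miRNA
    # within the CV threshold or both fail to. Threshold is illustrative.
    return (cv_a <= threshold) == (cv_b <= threshold)

plate_96 = [24.1, 24.5, 23.9]      # hypothetical replicate Cq values
microfluidic = [24.8, 26.2, 23.1]  # hypothetical, smaller-volume platform
cv1, cv2 = percent_cv(plate_96), percent_cv(microfluidic)
print(round(cv1, 2), round(cv2, 2), concordant(cv1, cv2))
```

    The higher CV of the smaller-volume replicates mirrors the abstract's finding that sensitivity for low-copy-number microRNAs degrades as qPCR reaction volume shrinks.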

  6. Quantifying strain partitioning between magmatic and amagmatic portions of the Afar triple junction of Ethiopia and Djibouti through use of contemporary and late Quaternary extension rates

    NASA Astrophysics Data System (ADS)

    Polun, S. G.; Hickcox, K.; Tesfaye, S.; Gomez, F. G.

    2016-12-01

    The central Afar rift in Ethiopia and Djibouti is a zone of accommodation between the onshore propagations of the Gulf of Aden and Red Sea oceanic spreading centers forming part of the Afar triple junction that divides the Arabia, Nubia, and Somalia plates. While extension in the onshore magmatic propagators is accommodated through magmatism and associated faulting, extension in the central Afar is accommodated solely by large and small faults. The contributions of these major faults to the overall strain budget can be well characterized, but smaller faults are more difficult to quantify. Sparse GPS data covering the region constrain the total extension budget across the diffuse triple junction zone. Late Quaternary slip rates for major faults in Hanle, Dobe, Guma, and Immino grabens were estimated using the quantitative analysis of faulted landforms. This forms a nearly complete transect from the onshore propagation of the Red Sea rift in Tendaho graben and the onshore propagation of the Gulf of Aden rift at Manda Inakir. Field surveying was accomplished using a combination of electronic distance measurer profiling and low altitude aerial surveying. Age constraints are provided from the Holocene lacustrine history or through terrestrial cosmogenic nuclide (TCN) dating of the faulted geomorphic surface. Along this transect, late Quaternary slip rates of major faults appear to accommodate 25% of the total horizontal stretching rate between the southern margin of Tendaho graben and the Red Sea coast, as determined from published GPS velocities. This constrains the proportion of total extension between Nubia and Arabia that is accommodated through major faulting in the central Afar, compared to the magmatism and associated faulting of the magmatic propagators elsewhere in the triple junction. 
    Along the transect, individual fault slip rates decrease from the southeast to the northwest, suggesting a 'Crank-Arm' model may be more applicable to explain the regional kinematics and the evolution of the triple junction.

  7. A Bayesian approach to reliability and confidence

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1989-01-01

    The historical evolution of NASA's interest in quantitative measures of reliability assessment is outlined. The introduction of some quantitative methodologies into the Vehicle Reliability Branch of the Safety, Reliability and Quality Assurance (SR and QA) Division at Johnson Space Center (JSC) was noted along with the development of the Extended Orbiter Duration--Weakest Link study which will utilize quantitative tools for a Bayesian statistical analysis. Extending the earlier work of NASA sponsor, Richard Heydorn, researchers were able to produce a consistent Bayesian estimate for the reliability of a component and hence by a simple extension for a system of components in some cases where the rate of failure is not constant but varies over time. Mechanical systems in general have this property since the reliability usually decreases markedly as the parts degrade over time. While they have been able to reduce the Bayesian estimator to a simple closed form for a large class of such systems, the form for the most general case needs to be attacked by the computer. Once a table is generated for this form, researchers will have a numerical form for the general solution. With this, the corresponding probability statements about the reliability of a system can be made in the most general setting. Note that the utilization of uniform Bayesian priors represents a worst case scenario in the sense that as researchers incorporate more expert opinion into the model, they will be able to improve the strength of the probability calculations.
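
    For the constant-failure-rate baseline, the closed-form Bayesian estimate reduces to a Beta posterior, and the system-level statement follows by multiplying component reliabilities in series. The sketch below uses hypothetical test counts; the time-varying failure-rate case described in the abstract needs the numerical treatment the author mentions and is not shown here.

```python
def beta_posterior_reliability(successes, trials, a0=1.0, b0=1.0):
    # Posterior mean reliability under a Beta(a0, b0) prior. a0 = b0 = 1
    # is the uniform prior mentioned in the abstract: a worst case that
    # sharpens as expert opinion is folded into the prior.
    return (a0 + successes) / (a0 + b0 + trials)

def series_system_reliability(component_results):
    # Simple extension to a series system: component reliabilities multiply.
    r = 1.0
    for successes, trials in component_results:
        r *= beta_posterior_reliability(successes, trials)
    return r

# Three components in series, each with its own (hypothetical) test record:
print(round(series_system_reliability([(98, 100), (47, 50), (19, 20)]), 3))
```

    Because the Beta posterior is a full distribution, not just a point estimate, it also supports the probability statements about system reliability that the abstract emphasizes.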

  8. In-cell RNA structure probing with SHAPE-MaP.

    PubMed

    Smola, Matthew J; Weeks, Kevin M

    2018-06-01

    This protocol is an extension to: Nat. Protoc. 10, 1643-1669 (2015); doi:10.1038/nprot.2015.103; published online 01 October 2015. RNAs play key roles in many cellular processes. The underlying structure of RNA is an important determinant of how transcripts function, are processed, and interact with RNA-binding proteins and ligands. RNA structure analysis by selective 2'-hydroxyl acylation analyzed by primer extension (SHAPE) takes advantage of the reactivity of small electrophilic chemical probes that react with the 2'-hydroxyl group to assess RNA structure at nucleotide resolution. When coupled with mutational profiling (MaP), in which modified nucleotides are detected as internal miscodings during reverse transcription and then read out by massively parallel sequencing, SHAPE yields quantitative per-nucleotide measurements of RNA structure. Here, we provide an extension to our previous in vitro SHAPE-MaP protocol with detailed guidance for undertaking and analyzing SHAPE-MaP probing experiments in live cells. The MaP strategy works for both abundant-transcriptome experiments and for cellular RNAs of low to moderate abundance, which are not well examined by whole-transcriptome methods. In-cell SHAPE-MaP, performed in roughly 3 d, can be applied in cell types ranging from bacteria to cultured mammalian cells and is compatible with a variety of structure-probing reagents. We detail several strategies by which in-cell SHAPE-MaP can inform new biological hypotheses and emphasize downstream analyses that reveal sequence or structure motifs important for RNA interactions in cells.
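
    The quantitative per-nucleotide readout follows the standard SHAPE-MaP arithmetic: background-subtract the untreated mutation rate from the modified-sample rate and normalize by the denatured control. The counts below are hypothetical, and real pipelines typically also apply read-depth filters and an outlier-trimmed normalization not shown in this sketch.

```python
def shape_reactivity(mod, untreated, denatured):
    # Per-nucleotide SHAPE-MaP profile. Each input is a list of
    # (mutation_count, read_depth) tuples, one tuple per nucleotide,
    # for the modified, untreated, and denatured samples respectively.
    profile = []
    for (m_mut, m_n), (u_mut, u_n), (d_mut, d_n) in zip(mod, untreated, denatured):
        rate_m = m_mut / m_n
        rate_u = u_mut / u_n
        rate_d = d_mut / d_n
        profile.append((rate_m - rate_u) / rate_d if rate_d > 0 else float("nan"))
    return profile

# Hypothetical (mutations, reads) counts for three nucleotides:
mod = [(50, 1000), (5, 1000), (30, 1000)]
unt = [(5, 1000), (4, 1000), (5, 1000)]
den = [(90, 1000), (80, 1000), (100, 1000)]
print([round(r, 2) for r in shape_reactivity(mod, unt, den)])
```

    High values flag flexible (likely unpaired) nucleotides, while values near zero indicate positions protected by structure or protein binding, which is the signal the in-cell protocol interrogates.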

  9. INTRODUCTION TO ATTILA (ANALYTICAL TOOLS INTERFACE FOR LANDSCAPE ASSESSMENTS): AN ARCVIEW EXTENSION

    EPA Science Inventory

    Geographic Information Systems (GIS) have become a powerful tool in the field of landscape ecology. A common application of GIS is the generation of landscape indicators, which are quantitative measurements of the status or potential health of an area (e.g. ecological region, wat...

  10. Local Government Leadership Education: Measuring the Impact of Leadership Skill Development on Public Officials

    ERIC Educational Resources Information Center

    Davis, Gregory A.; Lucente, Joe

    2012-01-01

    Many Extension leadership development programs have been evaluated for effectiveness. Little literature exists focusing on the evaluation of leadership development programs involving elected and appointed local officials. This article describes an annual program involving elected and appointed local officials and shares quantitative and…

  11. Fractionation of secondary metabolites of orange (Citrus sinensis L.) leaves by fast centrifugal partition chromatography

    USDA-ARS?s Scientific Manuscript database

    Conventional HPLC provides ready detection of the major phenolic compounds in methanol extracts of orange leaves, yet conventional HPLC also shows the presence of many more compounds, to an extent where extensive peak overlap prevents distinct peak detection and reliable quantitation. A more complet...

  12. A Multidimensional Model for the Identification of Dual-Exceptional Learners

    ERIC Educational Resources Information Center

    Al-Hroub, Anies

    2013-01-01

    This research takes mathematics as a model for investigating the definitions, identification, classification and characteristics of a group of gifted student related to the notion of "dual-exceptionality". An extensive process using qualitative and quantitative methods was conducted by a multidisciplinary team to develop and implement a…

  13. A Multiyear Investigation of Combating Bullying in Middle School: Stakeholder Perspectives

    ERIC Educational Resources Information Center

    Shriberg, David; Burns, Mallory; Desai, Poonam; Grunewald, Stephanie; Pitt, Rachel

    2015-01-01

    Working collaboratively to address bullying among middle school students is an ongoing challenge. This study used participatory action research to collaborate with key stakeholders within a middle school to identify needs and implement more effective practices. Extensive qualitative and quantitative data are presented, along with process…

  14. The Self-Systems: Facilitating Personal Well-Being Experiences at School

    ERIC Educational Resources Information Center

    Phan, Huy P.

    2017-01-01

    The focus of inquiry pertaining to quality learning and student well-being experiences at school has involved numerous studies, utilizing complex quantitative methodological approaches. In a similar vein, for consideration of research advancement, there has been extensive progress made regarding motivational tenets of effective learning and…

  15. Improving Training in Methodology Enriches the Science of Psychology

    ERIC Educational Resources Information Center

    Aiken, Leona S.; West, Stephen G.; Millsap, Roger E.

    2009-01-01

    Replies to the comment "Ramifications of increased training in quantitative methodology" by Herbert Zimiles on the current authors' original article "Doctoral training in statistics, measurement, and methodology in psychology: Replication and extension of Aiken, West, Sechrest, and Reno's (1990) survey of PhD programs in North America". The…

  16. Measuring landscape esthetics: the scenic beauty estimation method

    Treesearch

    Terry C. Daniel; Ron S. Boster

    1976-01-01

    The Scenic Beauty Estimation Method (SBE) provides quantitative measures of esthetic preferences for alternative wildland management systems. Extensive experimentation and testing with user, interest, and professional groups validated the method. SBE shows promise as an efficient and objective means for assessing the scenic beauty of public forests and wildlands, and...

  17. 76 FR 44020 - Proposed Collection; Comment Request; Generic Clearance for Partners and Customer Satisfaction...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    ... surveys, which will be both quantitative and qualitative, are designed to assess the quality of services... Request; Generic Clearance for Partners and Customer Satisfaction Surveys SUMMARY: In compliance with the... Voluntary Partners and Customers Satisfaction Surveys: Extension. The information collected in these surveys...

  18. 75 FR 47606 - Strategic Plan for Consumer Education via Cooperative Agreement (U18)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-06

    ... or quantitative research with stakeholders and meetings with stakeholder groups and consumer experts... and resulting from an extensive consumer research process. In 2007, PFSE joined with USDA to create... responsibilities of FDA. B. Research Objectives PFSE supports a large, complex, and multi-faceted consumer food...

  19. 78 FR 21366 - Agency Information Collection Activities: Announcement of Board Approval Under Delegated...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-10

    ....C. 552(b)(4)). Abstract: This voluntary survey collects qualitative and limited quantitative..., Division of Research and Statistics, Board of Governors of the Federal Reserve System, Washington, DC 20551... extension for three years, with revision, of the following survey: Report title: Senior Credit Officer...

  20. An Experimental Ecological Study of a Garden Compost Heap.

    ERIC Educational Resources Information Center

    Curds, Tracy

    1985-01-01

    A quantitative study of the fauna of a garden compost heap shows it to be similar to that of organisms found in soil and leaf litter. Materials, methods, and results are discussed and extensive tables of fauna lists, wet/dry masses, and statistical analyses are presented. (Author/DH)

  1. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  2. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  3. Risk assessment of the onset of Osgood-Schlatter disease using kinetic analysis of various motions in sports.

    PubMed

    Itoh, Gento; Ishii, Hideyuki; Kato, Haruyasu; Nagano, Yasuharu; Hayashi, Hiroteru; Funasaki, Hiroki

    2018-01-01

    Some studies have listed motions that may cause Osgood-Schlatter disease, but none have quantitatively assessed the load on the tibial tubercle by such motions. To quantitatively identify the load on the tibial tubercle through a biomechanical approach using various motions that may cause Osgood-Schlatter disease, and to compare the load between different motions. Eight healthy male subjects were included. They conducted 4 types of kicks with a soccer ball, 2 types of runs, 2 types of squats, 2 types of jump landings, 2 types of stops, 1 type of turn, and 1 type of cutting motion. The angular impulse was calculated for knee extension moments ≥1.0 Nm/kg, ≥1.5 Nm/kg, ≥2.0 Nm/kg, and ≥2.5 Nm/kg. After analysis of variance, the post-hoc test was used to perform pairwise comparisons between all groups. The motion with the highest mean angular impulse of knee extension moment ≥1.0 Nm/kg was the single-leg landing after a jump, and that with the second highest mean was the cutting motion. At ≥1.5 Nm/kg, ≥2.0 Nm/kg, and ≥2.5 Nm/kg, the cutting motion was the highest, followed by the jump with a single-leg landing. These two motions impose a large load on the tibial tubercle and are associated with a higher risk of developing Osgood-Schlatter disease. The mean angular impulse of the 2 types of runs was small at all thresholds. Motions with a high risk of developing Osgood-Schlatter disease and low-risk motions can be assessed in further detail if future studies can quantify the load and number of repetitions that may cause Osgood-Schlatter disease while considering age and the development stage. Scheduled training regimens that balance load on the tibial tubercle with low-load motions after a training day of many load-intensive motions may prevent athletes from developing Osgood-Schlatter disease and increase their participation in sports.
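
    The thresholded angular-impulse measure used in this study can be sketched as a trapezoidal integral over samples at or above a given knee-extension moment threshold. The sample values below, and the choice to zero out sub-threshold samples, are illustrative assumptions, not the paper's exact procedure.

```python
def thresholded_angular_impulse(times, moments, threshold):
    """Integrate the knee-extension moment (Nm/kg) over time, counting only
    samples at or above `threshold`, using the trapezoidal rule."""
    impulse = 0.0
    for (t0, m0), (t1, m1) in zip(zip(times, moments), zip(times[1:], moments[1:])):
        m0c = m0 if m0 >= threshold else 0.0
        m1c = m1 if m1 >= threshold else 0.0
        impulse += 0.5 * (m0c + m1c) * (t1 - t0)
    return impulse

# 100 Hz samples of a moment curve during a landing (invented values)
times = [i / 100 for i in range(6)]
moments = [0.5, 1.2, 2.4, 2.1, 1.1, 0.4]
print(thresholded_angular_impulse(times, moments, 1.0))
```

    Raising the threshold (as in the ≥1.5, ≥2.0, and ≥2.5 Nm/kg indicators) can only shrink the impulse, which is why the high-intensity motions separate more clearly at higher thresholds.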

  4. Risk assessment of the onset of Osgood–Schlatter disease using kinetic analysis of various motions in sports

    PubMed Central

    Ishii, Hideyuki; Kato, Haruyasu; Nagano, Yasuharu; Hayashi, Hiroteru; Funasaki, Hiroki

    2018-01-01

    Background Some studies have listed motions that may cause Osgood-Schlatter disease, but none have quantitatively assessed the load on the tibial tubercle by such motions. Purposes To quantitatively identify the load on the tibial tubercle through a biomechanical approach using various motions that may cause Osgood-Schlatter disease, and to compare the load between different motions. Methods Eight healthy male subjects were included. They conducted 4 types of kicks with a soccer ball, 2 types of runs, 2 types of squats, 2 types of jump landings, 2 types of stops, 1 type of turn, and 1 type of cutting motion. The angular impulse was calculated for knee extension moments ≥1.0 Nm/kg, ≥1.5 Nm/kg, ≥2.0 Nm/kg, and ≥2.5 Nm/kg. After analysis of variance, the post-hoc test was used to perform pairwise comparisons between all groups. Results/Conclusions The motion with the highest mean angular impulse of knee extension moment ≥1.0 Nm/kg was the single-leg landing after a jump, and that with the second highest mean was the cutting motion. At ≥1.5 Nm/kg, ≥2.0 Nm/kg, and ≥2.5 Nm/kg, the cutting motion was the highest, followed by the jump with a single-leg landing. These two motions impose a large load on the tibial tubercle and are associated with a higher risk of developing Osgood-Schlatter disease. The mean angular impulse of the 2 types of runs was small at all thresholds. Clinical relevance Motions with a high risk of developing Osgood-Schlatter disease and low-risk motions can be assessed in further detail if future studies can quantify the load and number of repetitions that may cause Osgood-Schlatter disease while considering age and the development stage. Scheduled training regimens that balance load on the tibial tubercle with low-load motions after a training day of many load-intensive motions may prevent athletes from developing Osgood-Schlatter disease and increase their participation in sports. PMID:29309422

  5. A minimalist biosensor: Quantitation of cyclic di-GMP using the conformational change of a riboswitch aptamer.

    PubMed

    Kellenberger, Colleen A; Sales-Lee, Jade; Pan, Yuchen; Gassaway, Madalee M; Herr, Amy E; Hammond, Ming C

    2015-01-01

    Cyclic di-GMP (c-di-GMP) is a second messenger that is important in regulating bacterial physiology and behavior, including motility and virulence. Many questions remain about the role and regulation of this signaling molecule, but current methods of detection are limited by either modest sensitivity or requirements for extensive sample purification. We have taken advantage of a natural, high affinity receptor of c-di-GMP, the Vc2 riboswitch aptamer, to develop a sensitive and rapid electrophoretic mobility shift assay (EMSA) for c-di-GMP quantitation that required minimal engineering of the RNA.

  6. Mapping asphalt pavement aging and condition using multiple endmember spectral mixture analysis in Beijing, China

    NASA Astrophysics Data System (ADS)

    Pan, Yifan; Zhang, Xianfeng; Tian, Jie; Jin, Xu; Luo, Lun; Yang, Ke

    2017-01-01

    Asphalt road reflectance spectra change as pavement ages. This provides the possibility for remote sensing to be used to monitor a change in asphalt pavement conditions. However, the relatively narrow geometry of roads and the relatively coarse spatial resolution of remotely sensed imagery result in mixtures between pavement and adjacent landcovers (e.g., vegetation, buildings, and soil), increasing uncertainties in spectral analysis. To overcome this problem, multiple endmember spectral mixture analysis (MESMA) was used to map the asphalt pavement condition using Worldview-2 satellite imagery in this study. Based on extensive field investigation and in situ measurements, aged asphalt pavements were categorized into four stages: preliminarily aged, moderately aged, heavily aged, and distressed. The spectral characteristics in the first three stages were further analyzed, and a MESMA unmixing analysis was conducted to map these three kinds of pavement conditions from the Worldview-2 image. The results showed that the road pavement conditions could be detected well and mapped with an overall accuracy of 81.71% and Kappa coefficient of 0.77. Finally, a quantitative assessment of the pavement conditions for each road segment in this study area was conducted to inform road maintenance management.

  7. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density.

    PubMed

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed for Python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and judiciously determine if there is a need for data trimming and at which points it should be done.
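
    As a minimal stand-alone illustration of the density the package works with (independent of ExGUtils itself), the ex-Gaussian is the sum of a Normal(mu, sigma) and an independent Exponential with mean tau; its density can be written with the complementary error function, and its mean is mu + tau. The parameter values below are merely plausible reaction-time magnitudes in milliseconds.

```python
import math
import random

def exgauss_pdf(x, mu, sigma, tau):
    """Density of the ex-Gaussian: Normal(mu, sigma) + independent Exponential(mean tau)."""
    arg = (mu - x) / tau + sigma ** 2 / (2 * tau ** 2)
    z = (mu - x + sigma ** 2 / tau) / (sigma * math.sqrt(2))
    return math.exp(arg) * math.erfc(z) / (2 * tau)

mu, sigma, tau = 400.0, 40.0, 100.0  # illustrative reaction-time parameters (ms)

# The density integrates to ~1 (Riemann sum on a wide grid, step 1 ms) ...
integral = sum(exgauss_pdf(150 + i, mu, sigma, tau) for i in range(2000))

# ... and simulated samples have mean ~ mu + tau
random.seed(1)
samples = [random.gauss(mu, sigma) + random.expovariate(1 / tau) for _ in range(20000)]
print(integral, sum(samples) / len(samples))
```

    The maximum-likelihood fits discussed in the abstract maximize the product of this density over the observed reaction times, whereas least-squares fits match the density to a histogram.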

  8. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    PubMed Central

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  9. ExGUtils: A Python Package for Statistical Analysis With the ex-Gaussian Probability Density

    PubMed Central

    Moret-Tatay, Carmen; Gamermann, Daniel; Navarro-Pardo, Esperanza; Fernández de Córdoba Castellá, Pedro

    2018-01-01

    The study of reaction times and their underlying cognitive processes is an important field in Psychology. Reaction times are often modeled through the ex-Gaussian distribution, because it provides a good fit to multiple empirical data. The complexity of this distribution makes the use of computational tools an essential element. Therefore, there is a strong need for efficient and versatile computational tools for the research in this area. In this manuscript we discuss some mathematical details of the ex-Gaussian distribution and apply the ExGUtils package, a set of functions and numerical tools, programmed for Python, developed for numerical analysis of data involving the ex-Gaussian probability density. In order to validate the package, we present an extensive analysis of fits obtained with it, discuss advantages and differences between the least squares and maximum likelihood methods and quantitatively evaluate the goodness of the obtained fits (which is usually an overlooked point in most literature in the area). The analysis done allows one to identify outliers in the empirical datasets and judiciously determine if there is a need for data trimming and at which points it should be done. PMID:29765345

  10. Managing complex research datasets using electronic tools: a meta-analysis exemplar.

    PubMed

    Brown, Sharon A; Martin, Ellen E; Garcia, Theresa J; Winter, Mary A; García, Alexandra A; Brown, Adama; Cuevas, Heather E; Sumlin, Lisa L

    2013-06-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, for example, EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process as well as enhancing communication among research team members. The purpose of this article is to describe the electronic processes designed, using commercially available software, for an extensive, quantitative model-testing meta-analysis. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to decide on which electronic tools to use, determine how these tools would be used, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members.

  11. Diagnostic value of (99m)Tc-3PRGD2 scintimammography for differentiation of malignant from benign breast lesions: Comparison of visual and semi-quantitative analysis.

    PubMed

    Chen, Qianqian; Xie, Qian; Zhao, Min; Chen, Bin; Gao, Shi; Zhang, Haishan; Xing, Hua; Ma, Qingjie

    2015-01-01

    To compare the diagnostic value of visual and semi-quantitative analysis of technetium-99m-poly-ethylene glycol, 4-arginine-glycine-aspartic acid ((99m)Tc-3PRGD2) scintimammography (SMG) for better differentiation of benign from malignant breast masses, and also investigate the incremental role of semi-quantitative index of SMG. A total of 72 patients with breast lesions were included in the study. Technetium-99m-3PRGD2 SMG was performed with single photon emission computed tomography (SPET) at 60 min after intravenous injection of 749 ± 86 MBq of the radiotracer. Images were evaluated by visual interpretation and semi-quantitative indices of tumor to non-tumor (T/N) ratios, which were compared with pathology results. Receiver operating characteristics (ROC) curve analyses were performed to determine the optimal visual grade, to calculate cut-off values of semi-quantitative indices, and to compare visual and semi-quantitative diagnostic values. Among the 72 patients, 89 lesions were confirmed by histopathology after fine needle aspiration biopsy or surgery, 48 malignant and 41 benign lesions. The mean T/N ratio of (99m)Tc-3PRGD2 SMG in malignant lesions was significantly higher than that in benign lesions (P<0.05). When visual grade 2 was used as the cut-off value for the detection of primary breast cancer, the sensitivity, specificity and accuracy were 81.3%, 70.7%, and 76.4%, respectively. When a T/N ratio of 2.01 was used as cut-off value, the sensitivity, specificity and accuracy were 79.2%, 75.6%, and 77.5%, respectively. According to ROC analysis, the area under the curve for semi-quantitative analysis was higher than that for visual analysis, but the statistical difference was not significant (P=0.372).
Compared with visual analysis or semi-quantitative analysis alone, the sensitivity, specificity and accuracy of visual analysis combined with semi-quantitative analysis in diagnosing primary breast cancer were higher, being: 87.5%, 82.9%, and 85.4%, respectively. The area under the curve was 0.891. Results of the present study suggest that the semi-quantitative and visual analysis statistically showed similar results. The semi-quantitative analysis provided incremental value additive to visual analysis of (99m)Tc-3PRGD2 SMG for the detection of breast cancer. It seems from our results that, when the tumor was located in the medial part of the breast, the semi-quantitative analysis gave better diagnostic results.
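
    The reported cut-off performance can be reproduced from the implied 2x2 confusion table: with 48 malignant and 41 benign lesions, a sensitivity of 79.2% and specificity of 75.6% correspond to 38 true positives and 31 true negatives. This back-calculation from the abstract is a sketch, not part of the paper.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity, and accuracy from a 2x2 confusion table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# 48 malignant (38 detected), 41 benign (31 correctly excluded) at the
# T/N-ratio cut-off of 2.01 reported in the abstract
print(diagnostic_metrics(tp=38, fn=10, tn=31, fp=10))
```

    The same arithmetic with the visual-grade cut-off counts reproduces the 81.3%/70.7%/76.4% figures, which is how the two readouts are compared on the ROC curve.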

  12. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. 
The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological analysis. Conclusions Fluorescence-guided resection may be a useful adjunct in the resection of skull base meningiomas. The use of a quantitative intraoperative probe to detect PpIX concentration allows more accurate determination of neoplastic tissue in meningiomas than visible fluorescence and is readily applicable in areas, such as the skull base, where complete resection is critical but difficult because of the vital structures surrounding the pathology. PMID:21529179

  13. Evaluation of acute ischemic stroke using quantitative EEG: a comparison with conventional EEG and CT scan.

    PubMed

    Murri, L; Gori, S; Massetani, R; Bonanni, E; Marcella, F; Milani, S

    1998-06-01

    The sensitivity of quantitative electroencephalogram (EEG) was compared with that of conventional EEG in patients with acute ischaemic stroke. In addition, a correlation between quantitative EEG data and computerized tomography (CT) scan findings was carried out for all the areas of lesion in order to reassess the actual role of EEG in the evaluation of stroke. Sixty-five patients were tested with conventional and quantitative EEG within 24 h from the onset of neurological symptoms, whereas CT scan was performed within 4 days from the onset of stroke. EEG was recorded from 19 electrodes placed upon the scalp according to the International 10-20 System. Spectral analysis was carried out on 30 artefact-free 4-sec epochs. For each channel absolute and relative power were calculated for the delta, theta, alpha and beta frequency bands and such data were successively represented in colour-coded maps. Ten patients with extensive lesions documented by CT scan were excluded. The results indicated that conventional EEG revealed abnormalities in 40 of 55 cases, while EEG mapping showed abnormalities in 46 of 55 cases: it showed focal abnormalities in five cases and nonfocal abnormalities in one of six cases which had appeared to be normal according to visual inspection of EEG. In a further 11 cases, where the conventional EEG revealed abnormalities in one hemisphere, the quantitative EEG and maps allowed to further localize abnormal activity in a more localized way. The sensitivity of both methods was higher for frontocentral, temporal and parieto-occipital cortical-subcortical infarctions than for basal ganglia and internal capsule lesions; however, quantitative EEG was more efficient for all areas of lesion in detecting cases that had appeared normal by visual inspection and was clearly superior in revealing focal abnormalities. 
When we considered the electrode related to which the maximum power of the delta frequency band is recorded, a fairly close correlation was found between the localization of the maximum delta power and the position of lesions documented by CT scan for all areas of lesion excepting those located in the striatocapsular area.
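
    The band-power computation underlying quantitative EEG maps can be sketched with a plain discrete Fourier transform: take an artefact-free epoch, compute the power spectrum, and sum the bins falling in a frequency band. The sampling rate, epoch length, and pure 2 Hz test signal below are illustrative, not the study's recording parameters.

```python
import math

def band_relative_power(signal, fs, f_lo, f_hi):
    """Relative power of the [f_lo, f_hi] Hz band from a plain DFT power
    spectrum (DC bin excluded)."""
    n = len(signal)
    total = band = 0.0
    for k in range(1, n // 2 + 1):
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(-signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power = re * re + im * im
        total += power
        if f_lo <= k * fs / n <= f_hi:
            band += power
    return band / total

# One 4-s epoch sampled at 64 Hz, dominated by 2 Hz (delta-band) activity
fs, n = 64, 256
epoch = [math.sin(2 * math.pi * 2.0 * t / fs) for t in range(n)]
print(band_relative_power(epoch, fs, 0.5, 4.0))
```

    Averaging such per-band powers over the 30 epochs per channel, then color-coding the 19 channel values, yields the topographic maps the study compares against CT findings.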

  14. Development of estimation system of knee extension strength using image features in ultrasound images of rectus femoris

    NASA Astrophysics Data System (ADS)

    Murakami, Hiroki; Watanabe, Tsuneo; Fukuoka, Daisuke; Terabayashi, Nobuo; Hara, Takeshi; Muramatsu, Chisako; Fujita, Hiroshi

    2016-04-01

    The term "locomotive syndrome" has been proposed to describe the state of requiring care due to musculoskeletal disorders, and the condition of being at high risk for it. Reduction of the knee extension strength is cited as one of the risk factors, and the accurate measurement of the strength is needed for the evaluation. The measurement of knee extension strength using a dynamometer is one of the most direct and quantitative methods. This study aims to develop a system for measuring the knee extension strength using the ultrasound images of the rectus femoris muscles obtained with non-invasive ultrasonic diagnostic equipment. First, we extract the muscle area from the ultrasound images and determine the image features, such as the thickness of the muscle. We combine these features with physical features, such as the patient's height, and build a regression model of the knee extension strength from training data. We have developed a system for estimating the knee extension strength by applying the regression model to the features obtained from test data. Using the test data of 168 cases, the correlation coefficient between the measured and estimated values was 0.82. This result suggests that this system can estimate knee extension strength with high accuracy.
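
    A regression model of the kind described, strength predicted from an image feature (muscle thickness) plus a physical feature (height), can be sketched with ordinary least squares via the normal equations. The feature set, coefficients, and data below are invented for illustration, not taken from the study.

```python
def fit_linear(X, y):
    """Ordinary least squares via the normal equations (X^T X) b = X^T y,
    solved by Gaussian elimination with partial pivoting. Each row of X
    should include a leading 1 for the intercept."""
    m, n = len(X[0]), len(X)
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(m)] for i in range(m)]
    b = [sum(X[r][i] * y[r] for r in range(n)) for i in range(m)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            f = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * m
    for i in reversed(range(m)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, m))) / A[i][i]
    return coef

# strength ~ intercept + muscle thickness (mm) + height (cm), synthetic data
rows = [[1.0, 18.0, 160.0], [1.0, 22.0, 170.0], [1.0, 25.0, 175.0], [1.0, 30.0, 182.0]]
strengths = [5.0 + 1.5 * t + 0.1 * h for _, t, h in rows]
print(fit_linear(rows, strengths))  # ~[5.0, 1.5, 0.1]
```

    In practice the model would be fit on the training cases and its quality judged, as in the abstract, by the correlation between measured and estimated strengths on held-out test cases.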

  15. Using Qualitative Hazard Analysis to Guide Quantitative Safety Analysis

    NASA Technical Reports Server (NTRS)

    Shortle, J. F.; Allocco, M.

    2005-01-01

    Quantitative methods can be beneficial in many types of safety investigations. However, there are many difficulties in using quantitative methods. For example, there may be little relevant data available. This paper proposes a framework for using quantitative hazard analysis to prioritize hazard scenarios most suitable for quantitative analysis. The framework first categorizes hazard scenarios by severity and likelihood. We then propose another metric, "modeling difficulty", that describes the complexity in modeling a given hazard scenario quantitatively. The combined metrics of severity, likelihood, and modeling difficulty help to prioritize hazard scenarios for which quantitative analysis should be applied. We have applied this methodology to proposed concepts of operations for reduced wake separation for airplane operations at closely spaced parallel runways.
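
    The combined prioritization of severity, likelihood, and modeling difficulty might be sketched as a simple scoring rule. The scales, scenario names, and the particular formula below are illustrative assumptions, not the paper's method.

```python
# Scenarios scored on severity and likelihood (higher = worse) and on
# modeling difficulty (higher = harder to quantify); all values invented.
scenarios = [
    ("wake encounter on parallel approach", 5, 2, 3),
    ("runway incursion",                    5, 1, 4),
    ("missed-approach conflict",            4, 2, 2),
]

def priority(scenario):
    """Rank high-risk, tractable scenarios first: risk = severity * likelihood,
    down-weighted by modeling difficulty."""
    _, severity, likelihood, difficulty = scenario
    return severity * likelihood / difficulty

for name, *_ in sorted(scenarios, key=priority, reverse=True):
    print(name)
```

    Any monotone combination of the three metrics would serve the same screening purpose; the point is that quantitative effort is spent first where risk is high and modeling is feasible.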

  16. A Complete Color Normalization Approach to Histopathology Images Using Color Cues Computed From Saturation-Weighted Statistics.

    PubMed

    Li, Xingyu; Plataniotis, Konstantinos N

    2015-07-01

    In digital histopathology, segmentation and disease diagnosis are achieved by quantitative analysis of image content. However, color variation across image samples makes it challenging to produce reliable results. This paper introduces a complete normalization scheme to address color variation in histopathology images jointly caused by inconsistent biopsy staining and nonstandard imaging conditions. Method: Unlike existing normalization methods that either address only a partial cause of color variation or lump the causes together, our method identifies the causes of color variation based on a microscopic imaging model and addresses inconsistency in biopsy imaging and staining with an illuminant normalization module and a spectral normalization module, respectively. In evaluation, we use two public datasets representative of histopathology images commonly received in clinics to examine the proposed method in terms of robustness to system settings, performance consistency in the presence of achromatic pixels, and normalization effectiveness with respect to preservation of histological information. Because the saturation-weighted statistics proposed in this study generate stable and reliable color cues for stain normalization, our scheme is robust to system parameters and insensitive to image content and achromatic colors. Extensive experimentation suggests that our approach outperforms state-of-the-art normalization methods, as the proposed method is the only approach that succeeds in preserving histological information after normalization. The proposed color normalization solution should be useful for mitigating the effects of color variation in pathology images on subsequent quantitative analysis.
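    The core intuition of a saturation-weighted statistic can be sketched as follows: when estimating a stain's representative color, weight each pixel by its saturation so that achromatic pixels (white background, grey artifacts) contribute little. The weighting rule and the data below are illustrative assumptions, not the paper's exact formulation.

```python
def saturation(rgb):
    """HSV-style saturation in [0, 1] for an (R, G, B) tuple."""
    mx, mn = max(rgb), min(rgb)
    return 0.0 if mx == 0 else (mx - mn) / mx

def weighted_mean_color(pixels):
    """Saturation-weighted mean color: achromatic pixels get ~zero weight."""
    weights = [saturation(p) for p in pixels]
    total = sum(weights)
    return tuple(sum(w * p[c] for w, p in zip(weights, pixels)) / total
                 for c in range(3))

# Mostly white background plus a few strongly stained (purple-ish) pixels.
pixels = [(250, 250, 250)] * 8 + [(120, 40, 160)] * 2
mean = weighted_mean_color(pixels)
```

An unweighted mean over the same pixels would be dominated by the white background (roughly (224, 208, 232)), whereas the saturation-weighted mean recovers the stain color, which is why such cues are stable against achromatic content.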

  17. Social media in epilepsy: A quantitative and qualitative analysis.

    PubMed

    Meng, Ying; Elkaim, Lior; Wang, Justin; Liu, Jessica; Alotaibi, Naif M; Ibrahim, George M; Fallah, Aria; Weil, Alexander G; Valiante, Taufik A; Lozano, Andres M; Rutka, James T

    2017-06-01

    While the social burden of epilepsy has been extensively studied, an evaluation of social media related to epilepsy may provide novel insight into disease perception, patient needs and access to treatments. The objective of this study is to assess patterns in social media and online communication usage related to epilepsy and its associated topics. We searched two major social media platforms (Facebook and Twitter) for public accounts dedicated to epilepsy. Results were analyzed using qualitative and quantitative methodologies. The former involved thematic and word count analysis for online posts and tweets on these platforms, while the latter employed descriptive statistics and non-parametric tests. Facebook had a higher number of pages (840 accounts) and users (3 million) compared to Twitter (137 accounts and 274,663 users). Foundation and support groups comprised most of the accounts and users on both Facebook and Twitter. The number of accounts increased by 100% from 2012 to 2016. Among the 403 posts and tweets analyzed, "providing information" on medications or correcting common misconceptions in epilepsy was the most common theme (48%). Surgical interventions for epilepsy were only mentioned in 1% of all posts and tweets. The current study provides a comprehensive reference on the usage of social media in epilepsy. The number of online users interested in epilepsy is likely the highest among all neurological conditions. Surgery, as a method of treating refractory epilepsy, however, could be underrepresented on social media. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Location identification of closed crack based on Duffing oscillator transient transition

    NASA Astrophysics Data System (ADS)

    Liu, Xiaofeng; Bo, Lin; Liu, Yaolu; Zhao, Youxuan; Zhang, Jun; Deng, Mingxi; Hu, Ning

    2018-02-01

    The existence of a closed micro-crack in plates can be detected using the nonlinear harmonic characteristics of the Lamb wave, but identifying the crack's location is difficult. Considering the transient nonlinear Lamb wave under noise interference, we propose a location identification method for the closed crack based on quantitative measurement of the Duffing oscillator's transient transition in phase space. A sliding short-time window was used to truncate the signal to be detected. Periodic extension processing of the transient nonlinear Lamb wave was then performed to ensure that the Duffing oscillator has adequate response time to reach a steady state. A transient autocorrelation method was used to reduce missed harmonic detections caused by the random, variable phase of the nonlinear Lamb wave. Moreover, to overcome the deficiency of phase-trajectory diagrams for quantitative analysis of the Duffing system state, and to eliminate misjudgment caused by harmonic frequency components contained in broadband noise, a logic-operation method for the oscillator state-transition function based on circular zone partition was adopted to establish a mapping between the oscillator's transition state and the time-domain information of the nonlinear harmonics. The final state-transition discriminant function of the Duffing oscillator was used as the basis for identifying the reflected and transmitted harmonics from the crack. Chirplet time-frequency analysis was conducted to identify the mode of the generated harmonics and determine their propagation speed. Through these steps, accurate identification of the closed crack's position was achieved.

  19. An Extensive Survey of Tyrosine Phosphorylation Revealing New Sites in Human Mammary Epithelial Cells

    PubMed Central

    Heibeck, Tyler H.; Ding, Shi-Jian; Opresko, Lee K.; Zhao, Rui; Schepmoes, Athena A.; Yang, Feng; Tolmachev, Aleksey V.; Monroe, Matthew E.; Camp, David G.; Smith, Richard D.; Wiley, H. Steven; Qian, Wei-Jun

    2010-01-01

    Protein tyrosine phosphorylation represents a central regulatory mechanism in cell signaling. Here we present an extensive survey of tyrosine phosphorylation sites in a normal-derived human mammary epithelial cell (HMEC) line, applying anti-phosphotyrosine peptide immunoaffinity purification coupled with high-sensitivity capillary liquid chromatography tandem mass spectrometry. A total of 481 tyrosine phosphorylation sites (covered by 716 unique peptides) from 285 proteins were confidently identified in HMEC following analysis of both the basal condition and acute stimulation with epidermal growth factor (EGF). The estimated false discovery rate was 1.0%, as determined by searching against a scrambled database. Comparison of these data with the existing literature showed significant agreement for previously reported sites. However, we observed 281 sites not previously reported for HMEC cultures, 29 of which have not been reported for any human cell or tissue system. The analysis showed that the majority of highly phosphorylated proteins were of relatively low abundance. Large differences in phosphorylation stoichiometry for sites within the same protein were also observed, raising the possibility of more important functional roles for such highly phosphorylated pTyr sites. By mapping to major signaling networks, such as the EGF receptor and insulin growth factor-1 receptor signaling pathways, many known proteins involved in these pathways were revealed to be tyrosine phosphorylated, providing interesting targets for future hypothesis-driven and targeted quantitative studies of tyrosine phosphorylation in HMEC or other human systems. PMID:19534553

  20. The Effectiveness of Asulam for Bracken (Pteridium aquilinum) Control in the United Kingdom: A Meta-Analysis

    NASA Astrophysics Data System (ADS)

    Stewart, Gavin B.; Pullin, Andrew S.; Tyler, Claire

    2007-11-01

    Bracken (Pteridium aquilinum) is a major problem for livestock-based extensive agriculture, conservation, recreation, and game management globally. It is an invasive species, often achieving dominance to the detriment of other species. Control is essential to maintain plant communities such as grassland and lowland heath, or if extensive grazing by domestic stock, particularly sheep, is to be viable on upland margins. Bracken is managed primarily by herbicide application or cutting, but other techniques including rolling, burning, and grazing are also used. Here we evaluate the evidence regarding the effectiveness of asulam for the control of bracken. Thirteen studies provided data for meta-analyses, which demonstrate that application of the herbicide asulam reduces bracken abundance. Subgroup analyses indicate that the number of treatments had an important impact, with multiple follow-up treatments more effective than one or two treatments. Management practices should reflect the requirement for repeated follow-up. There is insufficient experimental evidence for quantitative analysis of the effectiveness of other management interventions, although for cutting, and for comparisons of cutting with asulam application, this reflects a lack of reporting in the relevant papers. Systematic searching and meta-analytical synthesis have effectively demonstrated the limits of current knowledge based on recorded empirical evidence, and have increased the call for more rigorous monitoring of bracken control techniques. The lack of experimental evidence on the effectiveness of management such as rolling or grazing with hardy cattle breeds contrasts with the widespread acceptance of these practices through dissemination of experience.

  1. Infliximab-Related Infusion Reactions: Systematic Review

    PubMed Central

    Ron, Yulia; Kivity, Shmuel; Ben-Horin, Shomron; Israeli, Eran; Fraser, Gerald M.; Dotan, Iris; Chowers, Yehuda; Confino-Cohen, Ronit; Weiss, Batia

    2015-01-01

    Objective: Administration of infliximab is associated with a well-recognised risk of infusion reactions. The lack of a mechanism-based rationale for their prevention, and the absence of adequate and well-controlled studies, have led to the use of diverse empirical administration protocols. The aim of this study is to systematically review the evidence behind the strategies for preventing infusion reactions to infliximab, and for controlling reactions once they occur. Methods: We conducted an extensive search of the MEDLINE [PubMed] electronic database for reports addressing various aspects of infusion reactions to infliximab in IBD patients. Results: We examined the full texts of 105 potentially eligible articles. No randomised controlled trials that pre-defined infusion reactions as a primary outcome were found. Three RCTs evaluated infusion reactions as a secondary outcome; another four RCTs included infusion reactions in the safety evaluation analysis; and 62 additional studies focused on various aspects of mechanisms, risk, primary and secondary preventive measures, and management algorithms. Seven studies were added by a manual search of the reference lists of relevant articles. A total of 76 original studies were included in the quantitative analysis of existing strategies. Conclusions: There is still a paucity of systematic and controlled data on the risk, prevention, and management of infusion reactions to infliximab. We present working algorithms based on a systematic and extensive review of the available data. More randomised controlled trials are needed to investigate the efficacy of the proposed preventive and management algorithms. PMID:26092578

  2. Benchmarking patient improvement in physical therapy with data envelopment analysis.

    PubMed

    Friesner, Daniel; Neufelder, Donna; Raisor, Janet; Khayum, Mohammed

    2005-01-01

    The purpose of this article is to present a case study that documents how management science techniques (in particular data envelopment analysis) can be applied to performance improvement initiatives in an inpatient physical therapy setting. The data used in this study consist of patients referred for inpatient physical therapy following total knee replacement surgery (at a medium-sized medical facility in the Midwestern USA) during the fiscal year 2002. Data envelopment analysis (DEA) was applied to determine the efficiency of treatment, as well as to identify benchmarks for potential patient improvement. Statistical trends in the benchmarking and efficiency results were subsequently analyzed using non-parametric and parametric methods. Our analysis indicated that the rehabilitation process was largely effective in terms of providing consistent, quality care, as more than half of the patients in our study achieved the maximum amount of rehabilitation possible given available inputs. Among patients that did not achieve maximum results, most could obtain increases in the degree of flexion gain and reductions in the degree of knee extension. The study is retrospective in nature, and is not based on clinical trial or experimental data. Additionally, DEA results are inherently sensitive to sampling: adding or subtracting individuals from the sample may change the baseline against which efficiency and rehabilitation potential are measured. As such, therapists using this approach must ensure that the sample is representative of the general population, and must not contain significant measurement error. Third, individuals who choose total knee arthroplasty will incur a transient disability. However, this population does not generally fit the World Health Organization International Classification of Functioning, Disability and Health definition of disability if the surgical procedure is successful. 
Since the study focuses on the outcomes of physical therapy, range of motion measurements and circumferential measurements were chosen over more global measures of functional independence such as mobility, transfers and stair climbing. Applying this technique to data on patients with different disabilities (or the same disability with other outcome variables, such as Functional Independence Measure scores) may give dissimilar results. This case study provides an example of how quantitative management science tools can be applied in a manner that is both tractable and intuitive to the practising therapist, who may not have an extensive background in quantitative performance improvement or statistics. DEA has not previously been applied to rehabilitation, particularly in settings where managers have limited data available.
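The efficiency idea behind DEA can be illustrated in its simplest degenerate form: with one input and one output, a unit's efficiency is its output-per-input ratio normalized by the best observed ratio (the empirical "frontier"). Real DEA with multiple inputs and outputs solves a linear program per unit; the patient data below (therapy sessions in, degrees of flexion gained out) are hypothetical.

```python
def dea_efficiency(units):
    """units: {name: (input, output)} -> {name: efficiency in (0, 1]}.

    Single-input, single-output case: efficiency is the unit's
    output/input ratio divided by the best ratio in the sample.
    """
    best = max(out / inp for inp, out in units.values())
    return {name: (out / inp) / best for name, (inp, out) in units.items()}

# Hypothetical patients: (therapy sessions, degrees of flexion gained).
patients = {"A": (10.0, 40.0), "B": (8.0, 40.0), "C": (12.0, 36.0)}
eff = dea_efficiency(patients)
```

Note the sample-sensitivity the article warns about: adding a patient with a better ratio raises `best` and lowers every other efficiency score, so the benchmark moves with the sample.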

  3. Homeless people's access to primary care physiotherapy services: an exploratory, mixed-method investigation using a follow-up qualitative extension to core quantitative research.

    PubMed

    Dawes, Jo; Deaton, Stuart; Greenwood, Nan

    2017-06-30

    The purpose of this study was to appraise referrals of homeless patients to physiotherapy services and explore perceptions of barriers to access. This exploratory mixed-method study used a follow-up qualitative extension to core quantitative research design. Over 9 months, quantitative data were gathered from the healthcare records of homeless patients referred to physiotherapy by a general practitioner (GP) practice, including the number of referrals and demographic data of all homeless patients referred. Corresponding physiotherapy records of those people referred to physiotherapy were searched for the outcome of their care. Qualitative semi-structured telephone interviews, based on the quantitative findings, were carried out with staff involved with patient care from the referring GP practice and were used to expand insight into the quantitative findings. Two primary care sites provided data for this study: a GP practice dedicated exclusively to homeless people and the physiotherapy department receiving their referrals. Quantitative data from the healthcare records of 34 homeless patient referrals to physiotherapy were collected and analysed. In addition, five staff involved in patient care were interviewed. 34 referrals of homeless people were made to physiotherapy in a 9-month period. It was possible to match 25 of these to records from the physiotherapy department. Nine (36%) patients did not attend their first appointment; seven (28%) attended an initial appointment, but did not attend a subsequent appointment and were discharged from the service; five (20%) completed treatment and four patients (16%) had ongoing treatment. Semi-structured interviews revealed potential barriers preventing homeless people from accessing physiotherapy services, the complex factors being faced by those making referrals and possible ways to improve physiotherapy access. 
Homeless people with musculoskeletal problems may fail to access physiotherapy treatment, but opportunities exist to make access to physiotherapy easier. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  4. Deficient Contractor Business Systems: Applying the Value at Risk (VaR) Model to Earned Value Management Systems

    DTIC Science & Technology

    2013-06-30

    QUANTITATIVE RISK ANALYSIS The use of quantitative cost risk analysis tools can be valuable in measuring numerical risk to the government (Galway, 2004) ... assessment of the EVMS itself. Galway (2004) practically linked project quantitative risk assessment to EVM by focusing on cost, schedule, and ... Galway, L. (2004, February). Quantitative risk analysis for project management: A critical review (RAND Working Paper WR-112-RC

  5. On the accuracy of the Padé-resummed master equation approach to dissipative quantum dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hsing-Ta; Reichman, David R.; Berkelbach, Timothy C.

    2016-04-21

    Well-defined criteria are proposed for assessing the accuracy of quantum master equations whose memory functions are approximated by Padé resummation of the first two moments in the electronic coupling. These criteria partition the parameter space into distinct levels of expected accuracy, ranging from quantitatively accurate regimes to regions of parameter space where the approach is not expected to be applicable. Extensive comparison of Padé-resummed master equations with numerically exact results in the context of the spin–boson model demonstrates that the proposed criteria correctly demarcate the regions of parameter space where the Padé approximation is reliable. The applicability analysis we present is not confined to the specifics of the Hamiltonian under consideration and should provide guidelines for other classes of resummation techniques.

  6. Instrumentation development for study of Reynolds Analogy in reacting flows

    NASA Technical Reports Server (NTRS)

    Deturris, Dianne J.

    1995-01-01

    Boundary layers in supersonic reacting flows are not well understood. Recently a technique has been developed which makes more extensive surface measurements practical, increasing the capability to understand the turbulent boundary layer. A significant advance in this understanding would be the formulation of an analytic relation between the transfer of momentum and the transfer of heat for this flow, similar to the Reynolds Analogy that exists for laminar flow. A gauge has been designed and built which allows a thorough experimental investigation of the relative effects of heat transfer and skin friction in the presence of combustion. Direct concurrent measurements made at the same location, combined with local flow conditions, enable a quantitative analysis to obtain a relation between the surface drag and wall heating, as well as identifying possible ways of reducing both.

  7. Real medical benefit assessed by indirect comparison.

    PubMed

    Falissard, Bruno; Zylberman, Myriam; Cucherat, Michel; Izard, Valérie; Meyer, François

    2009-01-01

    Frequently, in data packages submitted for Marketing Approval to the CHMP, there is a lack of relevant head-to-head comparisons of medicinal products that could enable national authorities responsible for the approval of reimbursement to assess the Added Therapeutic Value (ASMR) of new clinical entities or line extensions of existing therapies. Indirect or mixed treatment comparisons (MTC) are methods stemming from the field of meta-analysis that have been designed to tackle this problem. Adjusted indirect comparisons, meta-regressions, mixed models, and Bayesian network analyses pool the results of randomised controlled trials (RCTs), enabling a quantitative synthesis. The REAL procedure, recently developed by the HAS (French National Authority for Health), is a mixture of an MTC and an effect model based on expert opinions. It is intended to translate the efficacy observed in the trials into the effectiveness expected in day-to-day clinical practice in France.

  8. Stability of drugs of abuse in urine samples stored at -20 degrees C.

    PubMed

    Dugan, S; Bogema, S; Schwartz, R W; Lappas, N T

    1994-01-01

    Isolated studies of the stability of individual drugs of abuse have been reported. However, few have evaluated stability in frozen urine samples stored for 12 months. We have determined the stability of 11-nor-9-carboxy-delta 9-tetrahydrocannabinol (9-COOH-THC), amphetamine, methamphetamine, morphine, codeine, cocaine, benzoylecgonine, and phencyclidine in 236 physiological urine samples. Following the initial quantitative analysis, the samples were stored at -20 degrees C for 12 months and then reanalyzed. All drug concentrations were determined by gas chromatographic-mass spectrometric methods with cutoff concentrations of 5 ng/mL for 9-COOH-THC and phencyclidine and 100 ng/mL for each of the other drugs. The average change in the concentrations of these drugs following this long-term storage was not extensive except for an average change of -37% in cocaine concentrations.
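    The stability metric reported above is simply the average relative change in concentration across paired measurements before and after storage. A minimal sketch with hypothetical paired concentrations (ng/mL), chosen to illustrate a cocaine-like loss on the order of the reported -37%:

```python
def average_percent_change(before, after):
    """Mean of per-sample percent changes: 100 * (after - before) / before."""
    changes = [100.0 * (a - b) / b for b, a in zip(before, after)]
    return sum(changes) / len(changes)

# Hypothetical paired concentrations before and after 12 months at -20 C.
initial = [200.0, 150.0, 300.0, 250.0]
restored = [120.0, 100.0, 180.0, 160.0]
avg = average_percent_change(initial, restored)
```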

  9. Methylation-independent adaptation in chemotaxis of Escherichia coli involves acetylation-dependent speed adaptation.

    PubMed

    Baron, Szilvia; Afanzar, Oshri; Eisenbach, Michael

    2017-01-01

    Chemoreceptor methylation and demethylation have been shown to be at the core of the adaptation mechanism in Escherichia coli chemotaxis. Nevertheless, mutants lacking the methylation machinery can adapt to some extent. Here we carried out an extensive quantitative analysis of chemotactic and chemokinetic methylation-independent adaptation. We show that partial or complete adaptation of the direction of flagellar rotation and of the swimming speed in the absence of the methylation machinery each occurs in a small fraction of cells. Furthermore, deletion of the main enzyme responsible for acetylation of the signaling molecule CheY prevented speed adaptation but not adaptation of the direction of rotation. These results suggest that methylation-independent adaptation in bacterial chemotaxis involves chemokinetic adaptation, which is dependent on CheY acetylation. © 2016 Federation of European Biochemical Societies.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimpe, T; Marchessoux, C; Rostang, J

    Purpose: Use of color images in medical imaging has increased significantly in the last few years. As of today there is no agreed standard on how color information should be visualized on medical color displays, resulting in large variability of color appearance and making consistency and quality assurance a challenge. This paper presents a proposal for an extension of the DICOM GSDF towards color. Methods: Visualization needs for several color modalities (multimodality imaging, nuclear medicine, digital pathology, quantitative imaging applications…) were studied. On this basis a proposal was made for the desired color behavior of medical color display systems, and its behavior and effect on color medical images were analyzed. Results: Several medical color modalities could benefit from perceptually linear color visualization, for reasons similar to those that motivated the GSDF for greyscale medical images. An extension of the GSDF (Greyscale Standard Display Function) to color is proposed: CSDF (Color Standard Display Function). CSDF is based on deltaE2000 and offers perceptually linear color behavior; it uses the GSDF for its neutral grey behavior. A comparison between sRGB/GSDF and CSDF confirms that CSDF significantly improves perceptual color linearity. Furthermore, results indicate that, because of the improved perceptual linearity, CSDF has the potential to increase the perceived contrast of clinically relevant color features. Conclusion: There is a need for an extension of the GSDF towards color visualization in order to guarantee consistency and quality. A first proposal (CSDF) for such an extension has been made. The behavior of a CSDF-calibrated display has been characterized and compared with sRGB/GSDF behavior. First results indicate that CSDF could have a positive influence on the perceived contrast of clinically relevant color features and could offer benefits for quantitative imaging applications. The authors are employees of Barco Healthcare.
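    What "perceptually linear" calibration means can be sketched with a toy lookup table: driving levels are spaced so that equal steps in digital value produce equal steps in CIE lightness L* (used here as a simple stand-in for the deltaE2000 spacing CSDF actually employs; the display parameters are hypothetical).

```python
def lstar_to_luminance(lstar):
    """Invert CIE L*: relative luminance Y in [0, 1] for L* in [0, 100]."""
    fy = (lstar + 16.0) / 116.0
    return fy ** 3 if lstar > 8.0 else lstar / 903.3

def perceptual_lut(levels=256):
    """Equal L* steps from black to white -> target luminance per level."""
    return [lstar_to_luminance(100.0 * i / (levels - 1)) for i in range(levels)]

lut = perceptual_lut()
```

A calibration step would then adjust the display so each driving level actually produces its target luminance; GSDF does the analogous thing for greyscale using just-noticeable differences rather than L*.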

  11. What Are We Doing When We Translate from Quantitative Models?

    PubMed Central

    Critchfield, Thomas S; Reed, Derek D

    2009-01-01

    Although quantitative analysis (in which behavior principles are defined in terms of equations) has become common in basic behavior analysis, translational efforts often examine everyday events through the lens of narrative versions of laboratory-derived principles. This approach to translation, although useful, is incomplete because equations may convey concepts that are difficult to capture in words. To support this point, we provide a nontechnical introduction to selected aspects of quantitative analysis; consider some issues that translational investigators (and, potentially, practitioners) confront when attempting to translate from quantitative models; and discuss examples of relevant translational studies. We conclude that, where behavior-science translation is concerned, the quantitative features of quantitative models cannot be ignored without sacrificing conceptual precision, scientific and practical insights, and the capacity of the basic and applied wings of behavior analysis to communicate effectively. PMID:22478533

  12. A Quantitative and Comparative Research on Chinese and English Numerical Phrases

    ERIC Educational Resources Information Center

    Chen, Peijun

    2010-01-01

    Numerical phrases have rich cultural connotations and connect closely with culture. Along with the extension of China's reform and opening-up policy, cross-cultural communication is becoming wider. Comparative research on cross-cultural languages is very important. Because of different cultural backgrounds, the cultural connotations of Chinese…

  13. Validity in Mixed Methods Research in Education: The Application of Habermas' Critical Theory

    ERIC Educational Resources Information Center

    Long, Haiying

    2017-01-01

    Mixed methods approach has developed into the third methodological movement in educational research. Validity in mixed methods research as an important issue, however, has not been examined as extensively as that of quantitative and qualitative research. Additionally, the previous discussions of validity in mixed methods research focus on research…

  14. Validating Personal Well-Being Experiences at School: A Quantitative Examination of Secondary School Students

    ERIC Educational Resources Information Center

    Phan, Huy P.; Ngu, Bing H.

    2015-01-01

    Progress in education has involved, to a large extent, a focus on individuals' well-being experiences at school (ACU and Erebus International, 2008; Fraillon, 2004). This line of inquiry has produced extensive findings, highlighting the diverse coverage and scope of this psychosocial theoretical orientation. We recently developed a theoretical…

  15. Psychometrics and Its Discontents: An Historical Perspective on the Discourse of the Measurement Tradition

    ERIC Educational Resources Information Center

    Schoenherr, Jordan Richard; Hamstra, Stanley J.

    2016-01-01

    Psychometrics has recently undergone extensive criticism within the medical education literature. The use of quantitative measurement using psychometric instruments such as response scales is thought to emphasize a narrow range of relevant learner skills and competencies. Recent reviews and commentaries suggest that a paradigm shift might be…

  16. Rear Admirals and Biochemists: Why Do They Want to Teach High School?

    ERIC Educational Resources Information Center

    Merseth, Katherine K.

    The shortage of qualified secondary school mathematics and science teachers is discussed and a program is described which aims to lessen the severity of this dilemma. The MidCareer Mathematics and Science Program (MCMS) at the Harvard Graduate School of Education demonstrates that quantitatively trained individuals with extensive knowledge of…

  17. Cyberbully, Cybervictim, and Forgiveness among Indonesian High School Students

    ERIC Educational Resources Information Center

    Safaria, Triantoro; Tentama, Fatwa; Suyono, Hadi

    2016-01-01

    Cyberbullying has become commonplace among Indonesian teenagers engaging in online spaces. However, this phenomenon has not been extensively researched in the Indonesian context. This study aims to examine the extent to which level of forgiveness contributes to varying degrees of cyberbullying. It is a quantitative study in which the…

  18. 19 CFR 10.252 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Act Extension of Atpa Benefits to Tuna and Certain Other Non-Textile Articles § 10.252 Definitions... duty and free of any quantitative restrictions in the case of tuna described in § 10.253(a)(1) and free... United States and for which a license has been issued pursuant to section 9 of the South Pacific Tuna Act...

  19. 19 CFR 10.252 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Act Extension of Atpa Benefits to Tuna and Certain Other Non-Textile Articles § 10.252 Definitions... duty and free of any quantitative restrictions in the case of tuna described in § 10.253(a)(1) and free... United States and for which a license has been issued pursuant to section 9 of the South Pacific Tuna Act...

  20. 19 CFR 10.252 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Act Extension of Atpa Benefits to Tuna and Certain Other Non-Textile Articles § 10.252 Definitions... duty and free of any quantitative restrictions in the case of tuna described in § 10.253(a)(1) and free... United States and for which a license has been issued pursuant to section 9 of the South Pacific Tuna Act...

  1. 19 CFR 10.252 - Definitions.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Act Extension of Atpa Benefits to Tuna and Certain Other Non-Textile Articles § 10.252 Definitions... duty and free of any quantitative restrictions in the case of tuna described in § 10.253(a)(1) and free... United States and for which a license has been issued pursuant to section 9 of the South Pacific Tuna Act...

  2. 19 CFR 10.252 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Act Extension of Atpa Benefits to Tuna and Certain Other Non-Textile Articles § 10.252 Definitions... duty and free of any quantitative restrictions in the case of tuna described in § 10.253(a)(1) and free... United States and for which a license has been issued pursuant to section 9 of the South Pacific Tuna Act...

  3. Calculation of High Angle of Attack Aerodynamics of Fighter Configurations. Volume 1. Steady

    DTIC Science & Technology

    1991-04-01

    patterns are now well known qualitatively for fighter configurations from extensive wind and water tunnel tests. However, development of quantitative ... [figure residue removed; recoverable captions: "Illustration of Flow Features Predicted in the Present Method" and "Figure 2. Definition of Airplane Coordinate Systems"]

  4. Use of in Vitro HTS-Derived Concentration-Response Data as Biological Descriptors Improves the Accuracy of QSAR Models of in Vivo Toxicity

    EPA Science Inventory

    Background: Quantitative high-throughput screening (qHTS) assays are increasingly being employed to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in co...

  5. Management-by-Results and Performance Measurement in Universities--Implications for Work Motivation

    ERIC Educational Resources Information Center

    Kallio, Kirsi-Mari; Kallio, Tomi J.

    2014-01-01

    The article focuses on the effects of management-by-results from the perspective of the work motivation of university employees. The study is based on extensive survey data among employees at Finnish universities. According to the results, performance measurement is based on quantitative rather than qualitative measures, and the current…

  6. Consortial Collaboration and the Creation of an Assessment Instrument for Community-Based Learning

    ERIC Educational Resources Information Center

    Murphy, Margueritte S.; Flowers, Kathleen S.

    2017-01-01

    This article describes the development of the Community-Based Learning (CBL) Scorecard by a grant-funded consortium of liberal arts institutions. The aim of the scorecard was to promote assessment that improves student learning with an instrument that employs a quantitative scale, allowing for benchmarking across institutions. Extensive interviews…

  7. Cross-system comparisons elucidate disturbance complexities and generalities

    Treesearch

    Debra P.C. Peters; Ariel E. Lugo; F. Stuart Chapin; Steward T.A. Pickett; Michael Duniway; Adrian V. Rocha; Frederick J. Swanson; Christine Laney; Julia Jones

    2011-01-01

    Given that ecological effects of disturbance have been extensively studied in many ecosystems, it is surprising that few quantitative syntheses across diverse ecosystems have been conducted. Multi-system studies tend to be qualitative because they focus on disturbance types that are difficult to measure in an ecologically relevant way. In addition, synthesis of...

  8. Distal and Proximal Vision: A Multi-Perspective Research in Sociology of Education

    ERIC Educational Resources Information Center

    Giancola, Orazio; Viteritti, Assunta

    2014-01-01

    Drawing inspiration from the research conducted in Italian schools involved in the reform process, the article proposes to investigate two visions in the research on Sociology of Education: one distal and the other proximal. The distal vision is offered by quantitative research nowadays supported by extensive public funding and framed as…

  9. Scale Development: Heterosexist Attitudes in Women's Collegiate Athletics

    ERIC Educational Resources Information Center

    Mullin, Elizabeth M.

    2013-01-01

    Homophobia and heterosexism in women's athletics have been studied extensively using a qualitative approach. Limited research from a quantitative approach has been conducted in the area and none with a sport-specific instrument. The purpose of the current study was to develop a valid and reliable questionnaire to measure heterosexist attitudes in…

  10. Does the Community of Inquiry Framework Predict Outcomes in Online MBA Courses?

    ERIC Educational Resources Information Center

    Arbaugh, J. B.

    2008-01-01

    While Garrison and colleagues' (2000) Community of Inquiry (CoI) framework has generated substantial interest among online learning researchers, it has yet to be subjected to extensive quantitative verification or tested for external validity. Using a sample of students from 55 online MBA courses, the findings of this study suggest strong…

  11. Analysis of airborne MAIS imaging spectrometric data for mineral exploration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang Jinnian; Zheng Lanfen; Tong Qingxi

    1996-11-01

    The high-spectral-resolution imaging spectrometric system made quantitative analysis and mapping of surface composition possible. The key issue is the quantitative approach to analyzing surface parameters from imaging spectrometer data. This paper describes the methods and stages of quantitative analysis. (1) Extracting surface reflectance from the imaging spectrometer image: laboratory and in-flight field measurements are conducted to calibrate the imaging spectrometer data, and atmospheric correction is applied to obtain ground reflectance using the empirical line method and radiative transfer modeling. (2) Determining the quantitative relationship between absorption-band parameters from the imaging spectrometer data and the chemical composition of minerals. (3) Spectral comparison between spectra from a spectral library and spectra derived from the imagery. A wavelet-analysis-based spectrum-matching technique for quantitative analysis of imaging spectrometer data has been developed. Airborne MAIS imaging spectrometer data were used for the analysis, and the results have been applied to mineral and petroleum exploration in the Tarim Basin area, China. 8 refs., 8 figs.
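The empirical line calibration mentioned in step (1) can be sketched as a per-band linear fit from sensor digital numbers (DN) to ground reflectance using field-measured calibration targets. This is only an illustrative sketch of the general technique, not the paper's implementation; the target DN values and reflectances below are invented.

```python
# Empirical line method (illustrative sketch): fit refl = gain*DN + offset
# per band from ground targets whose reflectance was measured in the field.
def empirical_line(dn_targets, refl_targets):
    """Least-squares fit of reflectance = gain * DN + offset."""
    n = len(dn_targets)
    mx = sum(dn_targets) / n
    my = sum(refl_targets) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(dn_targets, refl_targets))
    sxx = sum((x - mx) ** 2 for x in dn_targets)
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Hypothetical dark and bright field targets with lab-measured reflectance:
gain, offset = empirical_line([120.0, 840.0], [0.05, 0.60])
# Apply the fitted mapping to any pixel DN in that band:
refl = gain * 480.0 + offset
```

With two targets the fit is exact, so a DN halfway between the targets maps to the reflectance halfway between their reflectances.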

  12. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method.

    PubMed

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-02-01

    To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil.
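The method comparison described above can be sketched with a paired t statistic: the same samples are quantified by both methods and the statistic tests for a systematic difference. All measurement values below are invented for illustration; in practice scipy.stats.ttest_rel would return both the statistic and its p-value.

```python
# Paired comparison of two quantification methods (illustrative sketch).
import math
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic: mean of the differences over its standard error."""
    diffs = [x - y for x, y in zip(a, b)]
    return mean(diffs) / (stdev(diffs) / math.sqrt(len(diffs)))

densitometric = [102.1, 98.5, 101.3, 99.8, 100.6, 97.9]  # hypothetical assays
image_based   = [101.8, 98.9, 100.9, 100.1, 100.2, 98.3]
t = paired_t(densitometric, image_based)
# |t| near zero gives no evidence of a systematic difference between the
# methods, which is the kind of outcome the abstract reports.
```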

  13. Quantitative analysis of γ-oryzanol content in cold pressed rice bran oil by TLC-image analysis method

    PubMed Central

    Sakunpak, Apirak; Suksaeree, Jirapornchai; Monton, Chaowalit; Pathompak, Pathamaporn; Kraisintu, Krisana

    2014-01-01

    Objective To develop and validate an image analysis method for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. Methods TLC-densitometric and TLC-image analysis methods were developed, validated, and used for quantitative analysis of γ-oryzanol in cold pressed rice bran oil. The results obtained by these two different quantification methods were compared by paired t-test. Results Both assays provided good linearity, accuracy, reproducibility and selectivity for determination of γ-oryzanol. Conclusions The TLC-densitometric and TLC-image analysis methods provided a similar reproducibility, accuracy and selectivity for the quantitative determination of γ-oryzanol in cold pressed rice bran oil. A statistical comparison of the quantitative determinations of γ-oryzanol in samples did not show any statistically significant difference between TLC-densitometric and TLC-image analysis methods. As both methods were found to be equal, they therefore can be used for the determination of γ-oryzanol in cold pressed rice bran oil. PMID:25182282

  14. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates.

    PubMed

    Schwämmle, Veit; León, Ileana Rodríguez; Jensen, Ole Nørregaard

    2013-09-06

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple non-identical data sets with missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical replicates due to sample scarcity, duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant changes at the peptide level, for example in phospho-proteomics experiments. To assess the extent of this problem and its implications for large-scale proteome analysis, we investigated and optimized the performance of three statistical approaches using simulated and experimental data sets with varying numbers of missing values. We applied three tools for the detection of significantly changing features: the standard t-test, the moderated t-test (limma), and rank products. The rank product method was improved to work with data sets containing missing values. Extensive analysis of simulated and experimental data sets revealed that the performance of the statistical tools depended on simple properties of the data sets. High-confidence results were obtained with the limma and rank products methods for triplicate data sets that exhibited more than 1000 features and more than 50% missing values. The maximum number of differentially represented features was identified by using the limma and rank products methods in a complementary manner. We therefore recommend combined usage of these methods as an optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved rank products algorithm and the combined analysis is available.
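The core idea of extending rank products to missing values, as described above, can be sketched as follows. This is not the authors' R implementation; it is a minimal illustration in which the rank product of a feature is the geometric mean of its ranks taken over only the replicates where the feature was observed.

```python
# Rank-product sketch tolerant of missing values (illustrative, not the
# published R script). Each replicate is a dict feature -> fold-change;
# features absent from a replicate are treated as missing.
import math

def rank_product(replicates):
    log_rank_sum, observed = {}, {}
    for column in replicates:
        # Rank features in this replicate by descending fold-change (rank 1 = top).
        ordered = sorted(column, key=column.get, reverse=True)
        for rank, feat in enumerate(ordered, start=1):
            log_rank_sum[feat] = log_rank_sum.get(feat, 0.0) + math.log(rank)
            observed[feat] = observed.get(feat, 0) + 1
    # Geometric mean of ranks over the replicates where each feature appears.
    return {f: math.exp(log_rank_sum[f] / observed[f]) for f in log_rank_sum}

reps = [{"p1": 3.0, "p2": 1.2, "p3": 0.5},
        {"p1": 2.5, "p3": 0.7},             # p2 missing in this replicate
        {"p1": 2.8, "p2": 1.1, "p3": 0.4}]
rp = rank_product(reps)
# p1 is consistently top-ranked, so it gets the smallest rank product.
```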

  15. Compositional analysis and depth profiling of thin film CrO{sub 2} by heavy ion ERDA and standard RBS: a comparison

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khamlich, S., E-mail: skhamlich@gmail.com; Department of Chemistry, Tshwane University of Technology, Private Bag X 680, Pretoria, 0001; The African Laser Centre, CSIR campus, P.O. Box 395, Pretoria

    2012-08-15

    Chromium dioxide (CrO{sub 2}) thin film has generated considerable interest in applied research due to the wide variety of its technological applications. It has been extensively investigated in recent years, attracting the attention of researchers working on spintronic heterostructures and in the magnetic recording industry. However, its synthesis is usually a difficult task due to its metastable nature, and various synthesis techniques are being investigated. In this work a polycrystalline thin film of CrO{sub 2} was prepared by electron-beam evaporation of Cr{sub 2}O{sub 3} onto a Si substrate. The polycrystalline structure was confirmed through XRD analysis. The stoichiometry and elemental depth distribution of the deposited film were measured by the ion-beam nuclear analytical techniques heavy-ion elastic recoil detection analysis (ERDA) and Rutherford backscattering spectrometry (RBS), both of which have an advantage over non-nuclear spectrometries in that they can readily provide quantitative information about the concentration and distribution of different atomic species in a layer. Moreover, the analysis carried out highlights the importance of the complementary use of the two techniques to obtain a more complete description of elemental content and depth distribution in thin films. Graphical abstract: Heavy-ion elastic recoil detection analysis (ERDA) and Rutherford backscattering spectrometry (RBS) both have an advantage over non-nuclear spectrometries in that they can readily provide quantitative information about the concentration and distribution of different atomic species in a layer. Highlights: ► Thin films of CrO{sub 2} have been grown by e-beam evaporation of a Cr{sub 2}O{sub 3} target in vacuum. ► The composition was determined by heavy-ion ERDA and RBS. ► HI-ERDA and RBS provided information on the light and heavy elements, respectively.

  16. Integrated Analysis and Tools for Land Subsidence Surveying and Monitoring: a Semi-Quantitative Approach

    NASA Astrophysics Data System (ADS)

    Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.

    2015-10-01

    This paper presents an integrated approach to land-subsidence monitoring using measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, constantly surveys the land with state-of-the-art and innovative techniques, and a method able to integrate the results is an important and timely topic. Measurements now come from many sensor platforms, and their integration is strictly necessary; combining the different data sources should be done carefully, taking advantage of the best performance of each technique. An integrated analysis allows the interpretation of simultaneous temporal series of data coming from different sources and helps separate subsidence contributions. With this purpose, Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool in oil and gas activity: a hydrocarbon storage field where the comparison between SAR measurements and production volumes reveals a correlation between the two in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and assestimeters measure in the same area at the same time, giving the opportunity to analyse the data contextually. In the integrated analysis performed with PISAV, a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only method for interpreting the results. As a result, in the first test case a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages for monitoring land subsidence: it permits a first qualitative differentiation of the natural and anthropogenic components of subsidence, and it gives more reliability and coverage to each measurement by taking advantage of the strong points of each technique.
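The kind of semi-quantitative comparison described above (injected volume vs. SAR-derived vertical displacement) can be sketched as a simple correlation between the two time series. This is only an illustration of the general idea, not PISAV's analysis; all the numbers below are invented.

```python
# Pearson correlation between two co-located time series (illustrative sketch).
import math

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

injected = [0.0, 1.2, 2.5, 3.1, 4.0, 5.2]   # hypothetical monthly volumes
uplift   = [0.1, 0.9, 2.2, 2.8, 3.9, 4.8]   # hypothetical SAR displacement (mm)
r = pearson(injected, uplift)
# r close to +1 would indicate the strong volume/displacement correlation
# reported for the storage-field test case.
```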

  17. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area over the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. Using numerical models ensures that a full set of climate variables is generated in the downscaling process, dynamically consistent because of the constraints of physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To this end, nudging techniques, including grid (analysis) nudging and spectral nudging, have been used in different models. Studies have demonstrated the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged to, and thus the conclusions remain controversial. Alongside companion work on developing approaches for quantitative assessment of downscaled climate, in this study the two nudging techniques undergo extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides an objective comparison. Three types of downscaling experiments were performed for a selected month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and with nudging toward different variables in grid analysis nudging; in spectral nudging, we focus on testing the nudging coefficients and different wavenumbers on different model levels.
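The contrast between the two techniques compared above can be sketched in one dimension: grid nudging relaxes the model state toward the driving analysis at every point, while spectral nudging relaxes only the low-wavenumber (large-scale) components, leaving model-generated small scales alone. This is a highly simplified illustration, not the WRF implementation; the fields, coefficient, and cutoff wavenumber are invented.

```python
# 1-D sketch of grid vs spectral nudging (illustrative only).
import numpy as np

def grid_nudge(state, analysis, g, dt):
    # Newtonian relaxation toward the analysis at every grid point.
    return state + dt * g * (analysis - state)

def spectral_nudge(state, analysis, g, dt, kmax):
    # Relax only wavenumbers <= kmax of the (analysis - state) difference.
    diff_hat = np.fft.rfft(analysis - state)
    diff_hat[kmax + 1:] = 0.0
    return state + dt * g * np.fft.irfft(diff_hat, n=state.size)

x = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
analysis = np.sin(x)                           # smooth large-scale driving field
state = np.sin(x) + 0.3 * np.sin(12.0 * x)     # model adds small-scale detail
gridded = grid_nudge(state, analysis, g=1.0, dt=1.0)
spectral = spectral_nudge(state, analysis, g=1.0, dt=1.0, kmax=3)
# With g*dt = 1, grid nudging wipes the wavenumber-12 detail entirely,
# while spectral nudging leaves it untouched (the large scales already agree).
```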

  18. Study of Surface Cleaning Methods and Pyrolysis Temperature on Nano-Structured Carbon Films using X-ray Photoelectron Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerber, Pranita B.; Porter, Lisa M.; McCullough, L. A.

    2012-10-12

    Nanostructured carbon (ns-C) films fabricated by stabilization and pyrolysis of di-block copolymers are of interest for a variety of electrical/electronic applications due to their chemical inertness, high-temperature insensitivity, very high surface area, and tunable electrical resistivity over a wide range [Kulkarni et al., Synth. Met. 159, 177 (2009)]. Because of their high porosity and associated high specific surface area, controlled surface-cleaning studies are important for fabricating electronic devices from these films. In this study, quantification of surface composition and surface-cleaning studies on ns-C films synthesized by carbonization of di-block copolymers of polyacrylonitrile-b-poly(n-butyl acrylate) (PAN-b-PBA) at two different temperatures were carried out. X-ray photoelectron spectroscopy was used for elemental analysis, to determine the efficacy of various surface-cleaning methods for ns-C films, and to examine the polymer residues in the films. The in-situ surface-cleaning methods included HF vapor treatment, vacuum annealing, and exposure to UV-ozone. Quantitative analysis of high-resolution XPS scans showed 11 at. % nitrogen present in the films pyrolyzed at 600 °C, suggesting incomplete denitrogenation of the copolymer films. The nitrogen atomic concentration decreased significantly for films pyrolyzed at 900 °C, confirming extensive denitrogenation at that temperature. Furthermore, quantitative analysis of nitrogen sub-peaks indicated a higher loss of nitrogen atoms residing at the edges of graphitic clusters relative to nitrogen atoms within the clusters, suggesting higher graphitization with increasing pyrolysis temperature. Of the surface-cleaning methods investigated, in-situ annealing of the films at 300 °C for 40 min was found to be the most efficacious in removing adventitious carbon and oxygen impurities from the surface.

  19. Study of surface cleaning methods and pyrolysis temperatures on nanostructured carbon films using x-ray photoelectron spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerber, Pranita; Porter, Lisa M.; McCullough, Lynne A.

    2012-11-15

    Nanostructured carbon (ns-C) films fabricated by stabilization and pyrolysis of diblock copolymers are of interest for a variety of electrical/electronic applications due to their chemical inertness, high-temperature insensitivity, very high surface area, and tunable electrical resistivity over a wide range [Kulkarni et al., Synth. Met. 159, 177 (2009)]. Because of their high porosity and associated high specific surface area, controlled surface cleaning studies are important for fabricating electronic devices from these films. In this study, quantification of surface composition and surface cleaning studies on ns-C films synthesized by carbonization of diblock copolymers of polyacrylonitrile-b-poly(n-butyl acrylate) at two different temperatures were carried out. X-ray photoelectron spectroscopy was used for elemental analysis and to determine the efficacy of various surface cleaning methods for ns-C films and to examine the polymer residues in the films. The in-situ surface cleaning methods included HF vapor treatment, vacuum annealing, and exposure to UV-ozone. Quantitative analysis of high-resolution XPS scans showed 11 at. % nitrogen was present in the films pyrolyzed at 600 °C, suggesting incomplete denitrogenation of the copolymer films. The nitrogen atomic concentration decreased significantly for films pyrolyzed at 900 °C confirming extensive denitrogenation at that temperature. Furthermore, quantitative analysis of nitrogen subpeaks indicated higher loss of nitrogen atoms residing at the edge of graphitic clusters relative to that of nitrogen atoms within the graphitic clusters, suggesting higher graphitization with increasing pyrolysis temperature. Of the surface cleaning methods investigated, in-situ annealing of the films at 300 °C for 40 min was found to be the most efficacious in removing adventitious carbon and oxygen impurities from the surface.
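The XPS surface quantification used in the two records above can be sketched as a standard calculation: atomic concentrations follow from core-level peak areas divided by relative sensitivity factors (RSFs), then normalized. This is the generic textbook procedure, not the authors' exact workflow; the peak areas and RSF values below are invented for illustration.

```python
# XPS atomic-percent quantification from peak areas and RSFs (illustrative).
def atomic_percent(peak_areas, rsf):
    """Normalize RSF-corrected peak areas to atomic percentages."""
    corrected = {el: peak_areas[el] / rsf[el] for el in peak_areas}
    total = sum(corrected.values())
    return {el: 100.0 * v / total for el, v in corrected.items()}

areas = {"C1s": 8200.0, "N1s": 1800.0, "O1s": 2600.0}  # hypothetical counts
rsf   = {"C1s": 1.0,    "N1s": 1.8,    "O1s": 2.93}    # illustrative RSFs
comp = atomic_percent(areas, rsf)
# comp gives the kind of "at. %" figures quoted in the abstracts
# (e.g. the nitrogen concentration before and after higher-T pyrolysis).
```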

  20. Sensitive Quantification of Cannabinoids in Milk by Alkaline Saponification-Solid Phase Extraction Combined with Isotope Dilution UPLC-MS/MS.

    PubMed

    Wei, Binnian; McGuffey, James E; Blount, Benjamin C; Wang, Lanqing

    2016-01-01

    Maternal exposure to marijuana during the lactation period, whether active or passive, has prompted concerns about transmission of cannabinoids to breastfed infants and possible subsequent adverse health consequences. Assessing these health risks requires a sensitive analytical approach able to quantitatively measure trace-level cannabinoids in breast milk. Here, we describe a saponification-solid phase extraction approach combined with ultra-high-pressure liquid chromatography-tandem mass spectrometry for simultaneously quantifying Δ9-tetrahydrocannabinol (THC), cannabidiol (CBD), and cannabinol (CBN) in breast milk. We demonstrate for the first time that constraints on sensitivity can be overcome by utilizing alkaline saponification of the milk samples. After extensive optimization of the saponification procedure, the validated method exhibited limits of detection of 13, 4, and 66 pg/mL for THC, CBN, and CBD, respectively. Notably, the sensitivity achieved was significantly improved; for instance, the limit of detection for THC is at least 100-fold lower than those previously reported in the literature. This is essential for monitoring cannabinoids in breast milk resulting from passive or nonrecent active maternal exposure. Furthermore, we simultaneously acquired multiple reaction monitoring transitions for ¹²C- and ¹³C-analyte isotopes. This combined analysis largely facilitated data acquisition by reducing the repeat-analysis rate for samples exceeding the linear limits of the ¹²C-analytes. In addition to high sensitivity and a broad quantitation range, this method delivers excellent accuracy (relative error within ±10%), precision (relative standard deviation <10%), and efficient analysis. In future studies, we expect this method to play a critical role in assessing infant exposure to cannabinoids through breastfeeding.
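The pg/mL limits of detection quoted above can be illustrated with one common LOD estimate, the 3-sigma criterion (three times the standard deviation of blank responses divided by the calibration slope). The abstract does not specify the paper's exact LOD procedure, so this is only a generic sketch; the blank responses and slope below are invented.

```python
# 3-sigma limit-of-detection estimate (generic sketch, invented numbers).
from statistics import stdev

def lod_3sigma(blank_responses, calibration_slope):
    """LOD = 3 * SD(blank responses) / calibration slope, in conc. units."""
    return 3.0 * stdev(blank_responses) / calibration_slope

blanks = [10.2, 11.1, 9.8, 10.5, 10.9, 10.1]  # hypothetical blank peak areas
slope = 2400.0                                 # hypothetical area per (ng/mL)
lod = lod_3sigma(blanks, slope)                # ng/mL; here sub-pg/mL scale
```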
