Sample records for quantitative study based

  1. A Quantitative Comparative Study Measuring Consumer Satisfaction Based on Health Record Format

    ERIC Educational Resources Information Center

    Moore, Vivianne E.

    2013-01-01

    This research study used a quantitative comparative method to investigate the relationship between consumer satisfaction and communication based on the format of the health record. The central problem investigated in this research study related to the format of the health record used, consumer satisfaction with the care provided, and the effect on communication…

  2. Competency-Based Education: A Quantitative Study of the U.S. Air Force Noncommissioned Officer Academy

    ERIC Educational Resources Information Center

    Houser, Bonnie L.

    2017-01-01

    There are relatively few empirical studies that examine whether using a competency-based education (CBE) approach results in increased student learning or achievement when compared to traditional education approaches. This study uses a quantitative research methodology, a nonexperimental comparative descriptive research design, and a two-group…

  3. Quantitative genetic bases of anthocyanin variation in grape (Vitis vinifera L. ssp. sativa) berry: a quantitative trait locus to quantitative trait nucleotide integrated study.

    PubMed

    Fournier-Level, Alexandre; Le Cunff, Loïc; Gomez, Camila; Doligez, Agnès; Ageorges, Agnès; Roux, Catherine; Bertrand, Yves; Souquet, Jean-Marc; Cheynier, Véronique; This, Patrice

    2009-11-01

    The combination of QTL mapping studies of synthetic lines and association mapping studies of natural diversity represents an opportunity to throw light on the genetically based variation of quantitative traits. With the positional information provided through quantitative trait locus (QTL) mapping, which often leads to wide intervals encompassing numerous genes, it is now feasible to directly target candidate genes that are likely to be responsible for the observed variation in completely sequenced genomes and to test their effects through association genetics. This approach was performed in grape, a newly sequenced genome, to decipher the genetic architecture of anthocyanin content. Grapes may be either white or colored, ranging from the lightest pink to the darkest purple tones according to the amount of anthocyanin accumulated in the berry skin, which is a crucial trait for both wine quality and human nutrition. Although the determinism of the white phenotype has been fully identified, the genetic bases of the quantitative variation of anthocyanin content in berry skin remain unclear. A single QTL responsible for up to 62% of the variation in the anthocyanin content was mapped on a Syrah x Grenache F(1) pseudo-testcross. Among the 68 unigenes identified in the grape genome within the QTL interval, a cluster of four Myb-type genes was selected on the basis of physiological evidence (VvMybA1, VvMybA2, VvMybA3, and VvMybA4). From a core collection of natural resources (141 individuals), 32 polymorphisms revealed significant association, and extended linkage disequilibrium was observed. Using a multivariate regression method, we demonstrated that five polymorphisms in VvMybA genes except VvMybA4 (one retrotransposon, three single nucleotide polymorphisms and one 2-bp insertion/deletion) accounted for 84% of the observed variation. All these polymorphisms led to either structural changes in the MYB proteins or differences in the VvMybAs promoters. We concluded that

  4. A Quantitative Study of Teacher Readiness to Teach School-Based HIV/AIDS Education in Kenyan Primary Schools

    ERIC Educational Resources Information Center

    Lang'at, Edwin K.

    2014-01-01

    Purpose and Method of Study: The purpose of this study was to investigate teachers' self-perceived readiness to teach school-based HIV/AIDS Awareness and Prevention education in Kenyan primary schools based on their knowledge, attitudes and instructional confidence. This research utilized a non-experimental quantitative approach with a…

  5. Gene-Based Testing of Interactions in Association Studies of Quantitative Traits

    PubMed Central

    Ma, Li; Clark, Andrew G.; Keinan, Alon

    2013-01-01

    Various methods have been developed for identifying gene–gene interactions in genome-wide association studies (GWAS). However, most methods focus on individual markers as the testing unit, and the large number of such tests drastically erodes statistical power. In this study, we propose novel interaction tests of quantitative traits that are gene-based and that confer an advantage in both statistical power and biological interpretation. The framework of gene-based gene–gene interaction (GGG) tests combines marker-based interaction tests between all pairs of markers in two genes to produce a gene-level test for interaction between the two. The tests are based on an analytical formula we derive for the correlation between marker-based interaction tests due to linkage disequilibrium. We propose four GGG tests that extend the following P value combining methods: minimum P value, extended Simes procedure, truncated tail strength, and truncated P value product. Extensive simulations point to correct type I error rates of all tests and show that the two truncated tests are more powerful than the other tests in cases where markers involved in the underlying interaction are not directly genotyped and in cases of multiple underlying interactions. We applied our tests to pairs of genes that exhibit a protein–protein interaction to test for gene-level interactions underlying lipid levels using genotype data from the Atherosclerosis Risk in Communities study. We identified five novel interactions that are not evident from marker-based interaction testing and successfully replicated one of these interactions, between SMAD3 and NEDD9, in an independent sample from the Multi-Ethnic Study of Atherosclerosis. We conclude that our GGG tests show improved power to identify gene-level interactions in existing, as well as emerging, association studies. PMID:23468652
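
    The paper's gene-level statistic combines marker-pair interaction P values, with an analytical correction for linkage disequilibrium that is not reproduced here. The sketch below (hypothetical P values, independence assumed for simplicity) only illustrates two of the combining rules named above, the minimum-P and truncated-product ideas; it is not the authors' implementation.

```python
# Schematic sketch (not the authors' implementation): combining marker-pair
# interaction P values between two genes into a single gene-level summary.
# Pairs are treated as independent here; the paper additionally corrects for
# correlation among tests induced by linkage disequilibrium.
import numpy as np

def min_p_combined(pvals):
    """Minimum-P combination with a Sidak-style correction for the number of pairs."""
    pvals = np.asarray(pvals, dtype=float)
    m = pvals.size
    return 1.0 - (1.0 - pvals.min()) ** m

def truncated_product(pvals, tau=0.05):
    """Truncated P-value product: product of all P values <= tau.
    Returned as the raw statistic; significance would normally be assessed
    by permutation or an analytical null distribution."""
    pvals = np.asarray(pvals, dtype=float)
    kept = pvals[pvals <= tau]
    return float(np.prod(kept)) if kept.size else 1.0

# Hypothetical marker-pair interaction P values between gene A and gene B
pairwise_pvals = [0.21, 0.003, 0.47, 0.012, 0.30, 0.08]
print("min-P combined:", min_p_combined(pairwise_pvals))
print("truncated product statistic:", truncated_product(pairwise_pvals))
```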

  6. Quantitative learning strategies based on word networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yue-Tian-Yi; Jia, Zi-Yang; Tang, Yong; Xiong, Jason Jie; Zhang, Yi-Cheng

    2018-02-01

    Learning English requires considerable effort, but the way vocabulary is introduced in textbooks is not optimized for learning efficiency. With the growing population of English learners, optimizing the learning process can significantly improve English learning and teaching. Recent developments in big data analysis and complex network science provide additional opportunities to design and investigate strategies for English learning. In this paper, quantitative English learning strategies based on a word network and word usage information are proposed. The strategies integrate word frequency with topological structural information. By analyzing the influence of connected learned words, the learning weights for unlearned words and the dynamic updating of the network are studied. The results suggest that the quantitative strategies significantly improve learning efficiency while maintaining effectiveness; in particular, the optimized-weight-first strategy and the segmented strategies outperform the others. The results give researchers and practitioners an opportunity to reconsider how English is taught and how vocabularies are designed quantitatively, balancing efficiency and learning cost on the basis of the word network.
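
    As a rough illustration of the strategy described above, the following sketch (hypothetical words, edges and weighting formula; not the authors' exact algorithm) scores unlearned words by blending word frequency with the influence of already-learned neighbours in a small word network built with networkx.

```python
# Minimal sketch under stated assumptions: score each unlearned word by
# combining its usage frequency with the fraction of its neighbours in the
# word network that have already been learned.
import networkx as nx

G = nx.Graph()
# Hypothetical word network: edges connect co-occurring words; 'freq' is usage frequency.
words = {"the": 1000, "book": 120, "read": 90, "library": 30, "shelf": 12}
for w, f in words.items():
    G.add_node(w, freq=f)
G.add_edges_from([("the", "book"), ("book", "read"), ("book", "library"),
                  ("library", "shelf"), ("read", "library")])

learned = {"the", "book"}
max_freq = max(d["freq"] for _, d in G.nodes(data=True))

def learning_weight(word, alpha=0.5):
    """Blend normalised frequency with the share of neighbours already learned."""
    freq = G.nodes[word]["freq"]
    neigh = list(G.neighbors(word))
    support = sum(n in learned for n in neigh) / len(neigh) if neigh else 0.0
    return alpha * freq / max_freq + (1 - alpha) * support

# Pick the next word to learn: highest weight among unlearned words.
candidates = [w for w in G.nodes if w not in learned]
print(max(candidates, key=learning_weight))
```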

  7. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of evaluation methods in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat the evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  8. The Impact of Situation-Based Learning to Students’ Quantitative Literacy

    NASA Astrophysics Data System (ADS)

    Latifah, T.; Cahya, E.; Suhendra

    2017-09-01

    Nowadays, the use of quantities can be seen almost everywhere, and quantitative thinking, such as quantitative reasoning and quantitative literacy, is increasingly required in daily life. However, many people are still not fully equipped with quantitative thinking, and many individuals lack the quantitative skills needed to perform well in today’s society. Based on this issue, the research aims to improve students’ quantitative literacy in junior high school. Qualitative analysis of written student work and video observations during the experiment reveals that situation-based learning has an impact on students’ quantitative literacy.

  9. Assessing the reporting of categorised quantitative variables in observational epidemiological studies.

    PubMed

    Mabikwa, Onkabetse V; Greenwood, Darren C; Baxter, Paul D; Fleming, Sarah J

    2017-03-14

    One aspect to consider when reporting results of observational studies in epidemiology is how quantitative risk factors are analysed. The STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) guidelines recommend that researchers describe how they handle quantitative variables when analysing data. For categorised quantitative variables, the authors are required to provide reasons and justifications informing their practice. We investigated and assessed the practices and reporting of categorised quantitative variables in epidemiology. The assessment was based on five medical journals that publish epidemiological research. Observational studies published between April and June 2015 and investigating the relationships between quantitative exposures (or risk factors) and the outcomes were considered for assessment. A standard form was used to collect the data, and the reporting patterns amongst eligible studies were quantified and described. Out of 61 articles assessed for eligibility, 23 observational studies were included in the assessment. Categorisation of quantitative exposures occurred in 61% of these studies and reasons informing the practice were rarely provided. Only one article explained the choice of categorisation in the analysis. Transformation of quantitative exposures into four or five groups was common and dominant amongst studies using equally spaced categories. Dichotomisation was not popular; the practice featured in one article. Overall, the majority (86%) of the studies preferred ordered or arbitrary group categories. Other criteria used to decide categorical boundaries were based on established guidelines such as consensus statements and WHO standards. Categorisation of continuous variables remains a dominant practice in epidemiological studies. The reasons informing the practice of categorisation within published work are limited and remain unknown in most articles. The existing STROBE guidelines could provide stronger
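
    For readers unfamiliar with the categorisation practices being audited here, the short sketch below (hypothetical exposure data) shows the three patterns discussed: quantile groups, equally spaced categories and dichotomisation at a cut-point, using pandas.

```python
# Illustrative sketch only: common ways a continuous exposure is categorised
# in observational analyses, mirroring the practices assessed in this paper.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
exposure = pd.Series(rng.normal(25, 4, size=200), name="bmi")  # hypothetical exposure

quartiles = pd.qcut(exposure, q=4, labels=["Q1", "Q2", "Q3", "Q4"])   # ordered quantile groups
equal_width = pd.cut(exposure, bins=5)                                # equally spaced categories
dichotomised = pd.cut(exposure, bins=[-np.inf, 30, np.inf],
                      labels=["<30", ">=30"])                         # dichotomisation at a cut-point

print(quartiles.value_counts().sort_index())
print(equal_width.value_counts().sort_index())
print(dichotomised.value_counts())
```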

  10. A novel multi-walled carbon nanotube-based antibody conjugate for quantitative and semi-quantitative lateral flow assays.

    PubMed

    Sun, Wenjuan; Hu, Xiaolong; Liu, Jia; Zhang, Yurong; Lu, Jianzhong; Zeng, Libo

    2017-10-01

    In this study, multi-walled carbon nanotubes (MWCNTs) were applied in lateral flow strips (LFS) for semi-quantitative and quantitative assays. First, the solubility of the MWCNTs was improved using various surfactants to enhance their biocompatibility for practical application. The dispersed MWCNTs were conjugated with the methamphetamine (MET) antibody in a non-covalent manner and then manufactured into LFS for the quantitative detection of MET. The MWCNT-based lateral flow assay (MWCNTs-LFA) exhibited an excellent linear relationship between the test-line signal and MET concentration over the range of 62.5 to 1500 ng/mL. The sensitivity of the LFS was evaluated by conjugating MWCNTs with an HCG antibody; the MWCNT conjugation method was 10 times more sensitive than conjugation with classical colloidal gold nanoparticles. Taken together, our data demonstrate that MWCNTs-LFA is a more sensitive and reliable assay for semi-quantitative and quantitative detection and can be used in forensic analysis.
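
    The reported linear relationship between test-line signal and MET concentration amounts to a simple calibration fit. The sketch below uses entirely hypothetical signal values over the stated 62.5-1500 ng/mL range to illustrate the arithmetic; it is not the authors' data or processing pipeline.

```python
# Sketch (hypothetical data): fitting a linear calibration between the
# test-line signal of an MWCNT strip and MET concentration within the
# reported 62.5-1500 ng/mL working range, then inverting it for a sample.
import numpy as np

conc = np.array([62.5, 125, 250, 500, 1000, 1500])        # ng/mL standards
signal = np.array([0.08, 0.15, 0.29, 0.55, 1.02, 1.48])   # hypothetical test-line intensities

slope, intercept = np.polyfit(conc, signal, deg=1)
r2 = np.corrcoef(conc, signal)[0, 1] ** 2
print(f"signal = {slope:.5f} * conc + {intercept:.3f}  (R^2 = {r2:.4f})")

unknown_signal = 0.40
estimated_conc = (unknown_signal - intercept) / slope
print(f"estimated MET concentration: {estimated_conc:.1f} ng/mL")
```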

  11. Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study

    PubMed Central

    Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger

    2015-01-01

    Study design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their number and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations with qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not differ significantly from those of healthy IVDs (63.55 ms ±5.88 ms at month 1 and 62.61 ms ±5.02 ms) at either time point. Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle
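
    The voxel-filtering idea can be illustrated with a few lines of NumPy: keep only voxels whose T2-RT falls in an assumed NP-like range, then report the NP voxel count and mean T2-RT. The threshold values and the synthetic T2 map below are placeholders, not the authors' calibrated settings.

```python
# Schematic sketch of the voxel-filtering idea (thresholds are hypothetical,
# not the authors' calibrated values): keep only voxels whose T2 relaxation
# time is consistent with nucleus pulposus tissue, then report the NP voxel
# count (a proxy for nuclear size) and mean T2-RT (a proxy for hydration).
import numpy as np

rng = np.random.default_rng(1)
t2_map = rng.normal(45, 15, size=(64, 64))   # hypothetical T2-RT map of one disc (ms)

NP_T2_MIN, NP_T2_MAX = 55.0, 90.0            # assumed NP-like T2 range in ms
np_mask = (t2_map >= NP_T2_MIN) & (t2_map <= NP_T2_MAX)

np_voxel_count = int(np_mask.sum())
mean_np_t2 = float(t2_map[np_mask].mean()) if np_voxel_count else float("nan")
print(f"NP voxel count: {np_voxel_count}, mean NP T2-RT: {mean_np_t2:.1f} ms")
```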

  12. Statistical design of quantitative mass spectrometry-based proteomic experiments.

    PubMed

    Oberg, Ann L; Vitek, Olga

    2009-05-01

    We review the fundamental principles of statistical experimental design, and their application to quantitative mass spectrometry-based proteomics. We focus on class comparison using Analysis of Variance (ANOVA), and discuss how randomization, replication and blocking help avoid systematic biases due to the experimental procedure, and help optimize our ability to detect true quantitative changes between groups. We also discuss the issues of pooling multiple biological specimens for a single mass analysis, and calculation of the number of replicates in a future study. When applicable, we emphasize the parallels between designing quantitative proteomic experiments and experiments with gene expression microarrays, and give examples from that area of research. We illustrate the discussion using theoretical considerations, and using real-data examples of profiling of disease.
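
    As a minimal illustration of the design principles reviewed here (randomisation, replication and blocking), the sketch below fits a two-group comparison for one simulated protein with the MS batch treated as a blocking factor, using statsmodels; the data and effect sizes are invented.

```python
# Minimal sketch of the design principles discussed (randomisation, blocking,
# replication) applied to simulated quantitative proteomics data: a two-group
# comparison of one protein with the MS batch treated as a blocking factor.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
groups = np.repeat(["control", "disease"], 12)
blocks = np.tile(["batch1", "batch2", "batch3"], 8)           # MS runs as blocks
batch_effect = {"batch1": 0.0, "batch2": 0.4, "batch3": -0.3}
abundance = (rng.normal(10, 0.5, 24)
             + np.where(groups == "disease", 0.8, 0.0)        # true group difference
             + np.array([batch_effect[b] for b in blocks]))   # systematic block effect

df = pd.DataFrame({"abundance": abundance, "group": groups, "block": blocks})
model = smf.ols("abundance ~ C(group) + C(block)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```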

  13. Establishing the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS): Operationalizing Community-based Research in a Large National Quantitative Study.

    PubMed

    Loutfy, Mona; Greene, Saara; Kennedy, V Logan; Lewis, Johanna; Thomas-Pavanel, Jamie; Conway, Tracey; de Pokomandy, Alexandra; O'Brien, Nadia; Carter, Allison; Tharao, Wangari; Nicholson, Valerie; Beaver, Kerrigan; Dubuc, Danièle; Gahagan, Jacqueline; Proulx-Boucher, Karène; Hogg, Robert S; Kaida, Angela

    2016-08-19

    Community-based research has gained increasing recognition in health research over the last two decades. Such participatory research approaches are lauded for their ability to anchor research in lived experiences, ensuring cultural appropriateness, accessing local knowledge, reaching marginalized communities, building capacity, and facilitating research-to-action. While having these positive attributes, the community-based health research literature is predominantly composed of small projects, using qualitative methods, and set within geographically limited communities. Its use in larger health studies, including clinical trials and cohorts, is limited. We present the Canadian HIV Women's Sexual and Reproductive Health Cohort Study (CHIWOS), a large-scale, multi-site, national, longitudinal quantitative study that has operationalized community-based research in all steps of the research process. Successes, challenges and further considerations are offered. Through the integration of community-based research principles, we have been successful in: facilitating a two-year long formative phase for this study; developing a novel survey instrument with national involvement; training 39 Peer Research Associates (PRAs); offering ongoing comprehensive support to PRAs; and engaging in an ongoing iterative community-based research process. Our community-based research approach within CHIWOS demanded that we be cognizant of challenges managing a large national team, inherent power imbalances and challenges with communication, compensation and volunteering considerations, and extensive delays in institutional processes. It is important to consider the iterative nature of community-based research and to work through tensions that emerge given the diverse perspectives of numerous team members. Community-based research, as an approach to large-scale quantitative health research projects, is an increasingly viable methodological option. Community-based research has several

  14. The APOSTEL recommendations for reporting quantitative optical coherence tomography studies.

    PubMed

    Cruz-Herranz, Andrés; Balk, Lisanne J; Oberwahrenbrock, Timm; Saidha, Shiv; Martinez-Lapiscina, Elena H; Lagreze, Wolf A; Schuman, Joel S; Villoslada, Pablo; Calabresi, Peter; Balcer, Laura; Petzold, Axel; Green, Ari J; Paul, Friedemann; Brandt, Alexander U; Albrecht, Philipp

    2016-06-14

    To develop consensus recommendations for reporting of quantitative optical coherence tomography (OCT) study results. A panel of experienced OCT researchers (including 11 neurologists, 2 ophthalmologists, and 2 neuroscientists) discussed requirements for performing and reporting quantitative analyses of retinal morphology and developed a list of initial recommendations based on experience and previous studies. The list of recommendations was subsequently revised during several meetings of the coordinating group. We provide a 9-point checklist encompassing aspects deemed relevant when reporting quantitative OCT studies. The areas covered are study protocol, acquisition device, acquisition settings, scanning protocol, funduscopic imaging, postacquisition data selection, postacquisition data analysis, recommended nomenclature, and statistical analysis. The Advised Protocol for OCT Study Terminology and Elements recommendations include core items to standardize and improve quality of reporting in quantitative OCT studies. The recommendations will make reporting of quantitative OCT studies more consistent and in line with existing standards for reporting research in other biomedical areas. The recommendations originated from expert consensus and thus represent Class IV evidence. They will need to be regularly adjusted according to new insights and practices. © 2016 American Academy of Neurology.

  15. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale, complicated applications of remote sensing science. Validation of the workflow is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize the potential waste of time and resources. To address the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.

  16. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Key points: • Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
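
    The study's central comparison, correlating ordinal Goutallier grades with continuous fat-fraction values, reduces to a Spearman rank correlation per muscle compartment. The sketch below uses made-up grades and fat fractions purely to show the computation.

```python
# Sketch with hypothetical values: correlating semi-quantitative Goutallier
# grades (0-4) with quantitative MRI fat-fraction measurements, as done per
# muscle compartment in this study.
import numpy as np
from scipy import stats

goutallier = np.array([0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 4, 2])         # reader grades
fat_fraction = np.array([3.6, 4.1, 6.0, 5.5, 7.2, 9.8, 11.0,
                         13.5, 15.1, 18.0, 19.2, 10.4])              # percent, from fat-fraction maps

rho, p = stats.spearmanr(goutallier, fat_fraction)
print(f"Spearman rho = {rho:.2f}, P = {p:.4f}")
```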

  17. MR Imaging-based Semi-quantitative Methods for Knee Osteoarthritis

    PubMed Central

    JARRAYA, Mohamed; HAYASHI, Daichi; ROEMER, Frank Wolfgang; GUERMAZI, Ali

    2016-01-01

    Magnetic resonance imaging (MRI)-based semi-quantitative (SQ) methods applied to knee osteoarthritis (OA) were introduced during the last decade and have since fundamentally changed our understanding of knee OA pathology. Several epidemiological studies and clinical trials have used MRI-based SQ methods to evaluate different outcome measures. Interest in MRI-based SQ scoring systems has led to their continuous update and refinement. This article reviews the different SQ approaches for MRI-based whole-organ assessment of knee OA and also discusses practical aspects of whole-joint assessment. PMID:26632537

  18. Evaluating a Dutch cardiology primary care plus intervention on the Triple Aim outcomes: study design of a practice-based quantitative and qualitative research.

    PubMed

    Quanjel, Tessa C C; Spreeuwenberg, Marieke D; Struijs, Jeroen N; Baan, Caroline A; Ruwaard, Dirk

    2017-09-06

    In an attempt to deal with the pressures on the health-care system and to guarantee sustainability, changes are needed. This study focuses on a cardiology primary care plus intervention. Primary care plus (PC+) is a new health-care delivery model focused on substitution of specialist care in the hospital setting with specialist care in the primary care setting. The intervention consists of a cardiology PC+ centre in which cardiologists, supported by other health-care professionals, provide consultations in a primary care setting. The PC+ centre aims to improve the health of the population and quality of care as experienced by patients, and reduce the number of referrals to hospital-based outpatient specialist care in order to reduce health-care costs. These aims reflect the Triple Aim principle. Hence, the objectives of the study are to evaluate the cardiology PC+ centre in terms of the Triple Aim outcomes and to evaluate the process of the introduction of PC+. The study is a practice-based, quantitative study with a longitudinal observational design, and an additional qualitative study to supplement, interpret and improve the quantitative study. The study population of the quantitative part will consist of adult patients (≥18 years) with non-acute and low-complexity cardiology-related health complaints, who will be referred to the cardiology PC+ centre (intervention group) or hospital-based outpatient cardiology care (control group). All eligible patients will be asked to complete questionnaires at three different time points consisting of questions about their demographics, health status and experience of care. Additionally, quantitative data will be collected about health-care utilization and related health-care costs at the PC+ centre and the hospital. The qualitative part, consisting of semi-structured interviews, focus groups, and observations, is designed to evaluate the process as well as to amplify, clarify and explain quantitative results. This study

  19. Models of Quantitative Estimations: Rule-Based and Exemplar-Based Processes Compared

    ERIC Educational Resources Information Center

    von Helversen, Bettina; Rieskamp, Jorg

    2009-01-01

    The cognitive processes underlying quantitative estimations vary. Past research has identified task-contingent changes between rule-based and exemplar-based processes (P. Juslin, L. Karlsson, & H. Olsson, 2008). B. von Helversen and J. Rieskamp (2008), however, proposed a simple rule-based model--the mapping model--that outperformed the…

  20. Classification of cassava genotypes based on qualitative and quantitative data.

    PubMed

    Oliveira, E J; Oliveira Filho, O S; Santos, V S

    2015-02-02

    We evaluated the genetic variation of cassava accessions based on qualitative (binomial and multicategorical) and quantitative traits (continuous). We characterized 95 accessions obtained from the Cassava Germplasm Bank of Embrapa Mandioca e Fruticultura; we evaluated these accessions for 13 continuous, 10 binary, and 25 multicategorical traits. First, we analyzed the accessions based only on quantitative traits; next, we conducted joint analysis (qualitative and quantitative traits) based on the Ward-MLM method, which performs clustering in two stages. According to the pseudo-F, pseudo-t2, and maximum likelihood criteria, we identified five and four groups based on quantitative trait and joint analysis, respectively. The smaller number of groups identified based on joint analysis may be related to the nature of the data. On the other hand, quantitative data are more subject to environmental effects in the phenotype expression; this results in the absence of genetic differences, thereby contributing to greater differentiation among accessions. For most of the accessions, the maximum probability of classification was >0.90, independent of the trait analyzed, indicating a good fit of the clustering method. Differences in clustering according to the type of data implied that analysis of quantitative and qualitative traits in cassava germplasm might explore different genomic regions. On the other hand, when joint analysis was used, the means and ranges of genetic distances were high, indicating that the Ward-MLM method is very useful for clustering genotypes when there are several phenotypic traits, such as in the case of genetic resources and breeding programs.

  21. Portable smartphone based quantitative phase microscope

    NASA Astrophysics Data System (ADS)

    Meng, Xin; Tian, Xiaolin; Yu, Wei; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2018-01-01

    To realize a portable device with high-contrast imaging capability, we designed a smartphone-based quantitative phase microscope using the transport of intensity equation method. The system employs an objective and an eyepiece as the imaging system and a cost-effective LED as the illumination source; a 3-D printed cradle is used to align these components. Images of different focal planes are captured by manual focusing, followed by calculation of the sample phase via a self-developed Android application. To validate its accuracy, we first tested the device by measuring a random phase plate with known phases; a red blood cell smear, a Pap smear, broad bean epidermis sections and monocot root were then also measured to show its performance. Owing to its accuracy, high contrast, cost-effectiveness and portability, the portable smartphone-based quantitative phase microscope is a promising tool that could be adopted in the future for remote healthcare and medical diagnosis.
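
    Under the common uniform-intensity approximation, the transport of intensity equation reduces to a Poisson equation for the phase, which can be inverted with an FFT. The sketch below is a simplified, synthetic-data illustration of that idea, with an assumed wavelength and pixel pitch; it is not the authors' Android application.

```python
# Highly simplified sketch of transport-of-intensity (TIE) phase recovery
# under a uniform-intensity approximation: k * dI/dz = -I0 * laplacian(phi),
# solved with an FFT-based Poisson solver. Synthetic data throughout.
import numpy as np

wavelength = 632.8e-9                 # assumed illumination wavelength (m)
k = 2 * np.pi / wavelength
I0 = 1.0                              # assumed uniform mean intensity
n, pixel = 256, 2e-6                  # grid size and pixel pitch (m)

# Synthetic ground-truth phase and the intensity derivative it would produce
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * pixel
phi_true = 2.0 * np.exp(-(x**2 + y**2) / (2 * (20 * pixel) ** 2))

def laplacian_fft(f, dx):
    fx = np.fft.fftfreq(f.shape[1], d=dx) * 2 * np.pi
    fy = np.fft.fftfreq(f.shape[0], d=dx) * 2 * np.pi
    KX, KY = np.meshgrid(fx, fy)
    return np.real(np.fft.ifft2(-(KX**2 + KY**2) * np.fft.fft2(f)))

dIdz = -(I0 / k) * laplacian_fft(phi_true, pixel)   # what two defocused images would give

# Invert the Poisson equation to recover the phase (up to an arbitrary offset)
fx = np.fft.fftfreq(n, d=pixel) * 2 * np.pi
KX, KY = np.meshgrid(fx, fx)
k2 = KX**2 + KY**2
k2[0, 0] = 1.0                                       # avoid division by zero at DC
rhs = -(k / I0) * dIdz                               # equals laplacian(phi)
phi_rec = np.real(np.fft.ifft2(np.fft.fft2(rhs) / (-k2)))

err = np.max(np.abs((phi_rec - phi_rec.mean()) - (phi_true - phi_true.mean())))
print(f"max phase error: {err:.2e} rad")
```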

  22. Clinical study of quantitative diagnosis of early cervical cancer based on the classification of acetowhitening kinetics

    NASA Astrophysics Data System (ADS)

    Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.

    2010-03-01

    A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.

  23. [Study of Cervical Exfoliated Cell's DNA Quantitative Analysis Based on Multi-Spectral Imaging Technology].

    PubMed

    Wu, Zheng; Zeng, Li-bo; Wu, Qiong-shui

    2016-02-01

    Conventional cervical cancer screening methods mainly include TBS (The Bethesda System) classification and cellular DNA quantitative analysis. However, no previous work had combined the two methods on a single cell slide using a multiple staining protocol in which the cytoplasm is stained with Papanicolaou reagent and the nucleus with Feulgen reagent. The difficulty of this multiple staining method is that the absorbance of non-DNA material may interfere with the absorbance of DNA. We therefore set up a multi-spectral imaging system and established an absorbance unmixing model using multiple linear regression, based on the linear superposition of absorbances, to isolate the absorbance of DNA for quantitative DNA analysis, thereby combining the two conventional screening methods. A series of experiments showed no statistically significant difference (at the 1% test level) between the DNA absorbance calculated with the absorbance unmixing model and the measured DNA absorbance. In practical application, the 99% confidence interval of the DNA index of tetraploid cells screened by this method did not overlap the DNA-index interval used to identify cancer cells, verifying the accuracy and feasibility of quantitative DNA analysis with the multiple staining method. This analytical method therefore has broad application prospects and considerable market potential in the early diagnosis of cervical cancer and other cancers.
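
    The absorbance-unmixing step can be pictured as a per-pixel least-squares fit of the measured multi-band absorbance to a linear mix of a DNA (Feulgen) spectrum and a non-DNA (Papanicolaou) spectrum. The reference spectra and mixing coefficients in the sketch below are hypothetical.

```python
# Schematic sketch of the absorbance-unmixing idea: the measured absorbance at
# several wavelength bands is modelled as a linear superposition of a DNA
# (Feulgen) spectrum and a non-DNA (Papanicolaou) spectrum, and the DNA
# contribution is recovered by least squares. All spectra are hypothetical.
import numpy as np

# Reference absorbance spectra at 6 wavelength bands (arbitrary units)
dna_ref     = np.array([0.10, 0.35, 0.80, 0.95, 0.60, 0.20])
non_dna_ref = np.array([0.70, 0.55, 0.30, 0.15, 0.10, 0.05])
A = np.column_stack([dna_ref, non_dna_ref])          # mixing matrix

# Simulated measurement for one pixel: 0.6 parts DNA + 0.3 parts non-DNA + noise
rng = np.random.default_rng(3)
measured = A @ np.array([0.6, 0.3]) + rng.normal(0, 0.01, size=6)

coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
dna_absorbance = coeffs[0] * dna_ref                 # DNA-only component, band by band
print("estimated DNA / non-DNA contributions:", np.round(coeffs, 3))
print("integrated DNA absorbance:", round(float(dna_absorbance.sum()), 3))
```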

  24. Use of Standardized, Quantitative Digital Photography in a Multicenter Web-based Study

    PubMed Central

    Molnar, Joseph A.; Lew, Wesley K.; Rapp, Derek A.; Gordon, E. Stanley; Voignier, Denise; Rushing, Scott; Willner, William

    2009-01-01

    Objective: We developed a Web-based, blinded, prospective, randomized, multicenter trial, using standardized digital photography to clinically evaluate hand burn depth and accurately determine wound area with digital planimetry. Methods: Photos in each center were taken with identical digital cameras with standardized settings on a custom backdrop developed at Wake Forest University containing a gray, white, black, and centimeter scale. The images were downloaded, transferred via the Web, and stored on servers at the principal investigator's home institution. Color adjustments to each photo were made using Adobe Photoshop 6.0 (Adobe, San Jose, Calif). In an initial pilot study, model hands marked with circles of known areas were used to determine the accuracy of the planimetry technique. Two-dimensional digital planimetry using SigmaScan Pro 5.0 (SPSS Science, Chicago, Ill) was used to calculate wound area from the digital images. Results: Digital photography is a simple and cost-effective method for quantifying wound size when used in conjunction with digital planimetry (SigmaScan) and photo enhancement (Adobe Photoshop) programs. The accuracy of the SigmaScan program in calculating predetermined areas was within 4.7% (95% CI, 3.4%–5.9%). Dorsal hand burns of the initial 20 patients in a national study involving several centers were evaluated with this technique. Images obtained by individuals denying experience in photography proved reliable and useful for clinical evaluation and quantification of wound area. Conclusion: Standardized digital photography may be used quantitatively in a Web-based, multicenter trial of burn care. This technique could be modified for other medical studies with visual endpoints. PMID:19212431

  25. Use of standardized, quantitative digital photography in a multicenter Web-based study.

    PubMed

    Molnar, Joseph A; Lew, Wesley K; Rapp, Derek A; Gordon, E Stanley; Voignier, Denise; Rushing, Scott; Willner, William

    2009-01-01

    We developed a Web-based, blinded, prospective, randomized, multicenter trial, using standardized digital photography to clinically evaluate hand burn depth and accurately determine wound area with digital planimetry. Photos in each center were taken with identical digital cameras with standardized settings on a custom backdrop developed at Wake Forest University containing a gray, white, black, and centimeter scale. The images were downloaded, transferred via the Web, and stored on servers at the principal investigator's home institution. Color adjustments to each photo were made using Adobe Photoshop 6.0 (Adobe, San Jose, Calif). In an initial pilot study, model hands marked with circles of known areas were used to determine the accuracy of the planimetry technique. Two-dimensional digital planimetry using SigmaScan Pro 5.0 (SPSS Science, Chicago, Ill) was used to calculate wound area from the digital images. Digital photography is a simple and cost-effective method for quantifying wound size when used in conjunction with digital planimetry (SigmaScan) and photo enhancement (Adobe Photoshop) programs. The accuracy of the SigmaScan program in calculating predetermined areas was within 4.7% (95% CI, 3.4%-5.9%). Dorsal hand burns of the initial 20 patients in a national study involving several centers were evaluated with this technique. Images obtained by individuals denying experience in photography proved reliable and useful for clinical evaluation and quantification of wound area. Standardized digital photography may be used quantitatively in a Web-based, multicenter trial of burn care. This technique could be modified for other medical studies with visual endpoints.
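
    The planimetry arithmetic itself is straightforward: a traced wound mask is converted from pixels to square centimetres using the centimetre scale photographed on the standard backdrop. The sketch below uses a made-up mask and scale; the study itself relied on SigmaScan and Photoshop rather than custom code.

```python
# Sketch of the planimetry arithmetic: a traced wound mask is converted from
# pixels to square centimetres via the ruler photographed on the same backdrop.
# The mask and scale values here are made up.
import numpy as np

# Hypothetical binary mask of the traced wound (1 = wound pixel)
mask = np.zeros((400, 400), dtype=np.uint8)
mask[100:220, 150:260] = 1

pixels_per_cm = 38.0                                  # measured from the ruler in the same photo
pixel_area_cm2 = (1.0 / pixels_per_cm) ** 2
wound_area_cm2 = mask.sum() * pixel_area_cm2
print(f"wound area: {wound_area_cm2:.2f} cm^2")
```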

  26. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.

  27. Reproducibility and quantitation of amplicon sequencing-based detection

    PubMed Central

    Zhou, Jizhong; Wu, Liyou; Deng, Ye; Zhi, Xiaoyang; Jiang, Yi-Huei; Tu, Qichao; Xie, Jianping; Van Nostrand, Joy D; He, Zhili; Yang, Yunfeng

    2011-01-01

    To determine the reproducibility and quantitation of the amplicon sequencing-based detection approach for analyzing microbial community structure, a total of 24 microbial communities from a long-term global change experimental site were examined. Genomic DNA obtained from each community was used to amplify 16S rRNA genes with two or three barcode tags as technical replicates in the presence of a small quantity (0.1% wt/wt) of genomic DNA from Shewanella oneidensis MR-1 as the control. The technical reproducibility of the amplicon sequencing-based detection approach is quite low, with an average operational taxonomic unit (OTU) overlap of 17.2%±2.3% between two technical replicates, and 8.2%±2.3% among three technical replicates, which is most likely due to problems associated with random sampling processes. Such variations in technical replicates could have substantial effects on estimating β-diversity but less on α-diversity. A high variation was also observed in the control across different samples (for example, 66.7-fold for the forward primer), suggesting that the amplicon sequencing-based detection approach could not be quantitative. In addition, various strategies were examined to improve the comparability of amplicon sequencing data, such as increasing biological replicates, and removing singleton sequences and less-representative OTUs across biological replicates. Finally, as expected, various statistical analyses with preprocessed experimental data revealed clear differences in the composition and structure of microbial communities between warming and non-warming, or between clipping and non-clipping. Taken together, these results suggest that amplicon sequencing-based detection is useful in analyzing microbial community structure even though it is not reproducible and quantitative. However, great caution should be taken in experimental design and data interpretation when the amplicon sequencing-based detection approach is used for quantitative
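
    The reproducibility metric reported above, the percentage of OTUs shared between technical replicates, is easy to reproduce conceptually; the sketch below uses hypothetical OTU identifiers.

```python
# Sketch of the technical-reproducibility metric described above: the
# percentage of OTUs shared between two technical replicates of the same
# sample (OTU identifiers below are hypothetical).
rep1 = {"OTU1", "OTU2", "OTU3", "OTU5", "OTU8", "OTU9"}
rep2 = {"OTU2", "OTU3", "OTU7", "OTU9", "OTU11"}

shared = rep1 & rep2
overlap_pct = 100.0 * len(shared) / len(rep1 | rep2)
print(f"shared OTUs: {sorted(shared)}, overlap: {overlap_pct:.1f}%")
```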

  28. Model-Based Linkage Analysis of a Quantitative Trait.

    PubMed

    Song, Yeunjoo E; Song, Sunah; Schnell, Audrey H

    2017-01-01

    Linkage Analysis is a family-based method of analysis to examine whether any typed genetic markers cosegregate with a given trait, in this case a quantitative trait. If linkage exists, this is taken as evidence in support of a genetic basis for the trait. Historically, linkage analysis was performed using a binary disease trait, but has been extended to include quantitative disease measures. Quantitative traits are desirable as they provide more information than binary traits. Linkage analysis can be performed using single-marker methods (one marker at a time) or multipoint (using multiple markers simultaneously). In model-based linkage analysis the genetic model for the trait of interest is specified. There are many software options for performing linkage analysis. Here, we use the program package Statistical Analysis for Genetic Epidemiology (S.A.G.E.). S.A.G.E. was chosen because it also includes programs to perform data cleaning procedures and to generate and test genetic models for a quantitative trait, in addition to performing linkage analysis. We demonstrate in detail the process of running the program LODLINK to perform single-marker analysis, and MLOD to perform multipoint analysis using output from SEGREG, where SEGREG was used to determine the best fitting statistical model for the trait.
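
    For orientation, the classical two-point, phase-known LOD score behind model-based linkage analysis is sketched below; this is a textbook illustration, not the S.A.G.E. LODLINK implementation.

```python
# Minimal illustration of the model-based (LOD score) idea behind single-marker
# linkage analysis: for phase-known meioses with R recombinants and NR
# non-recombinants, LOD(theta) = log10( L(theta) / L(0.5) ). Textbook sketch only.
import numpy as np

def lod(theta, n_recomb, n_nonrecomb):
    l_theta = (theta ** n_recomb) * ((1 - theta) ** n_nonrecomb)
    l_null = 0.5 ** (n_recomb + n_nonrecomb)
    return np.log10(l_theta / l_null)

thetas = np.arange(0.01, 0.5, 0.01)
scores = [lod(t, n_recomb=2, n_nonrecomb=18) for t in thetas]
best = int(np.argmax(scores))
print(f"max LOD = {scores[best]:.2f} at theta = {thetas[best]:.2f}")
```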

  29. Validation of Greyscale-Based Quantitative Ultrasound in Manual Wheelchair Users

    PubMed Central

    Collinger, Jennifer L.; Fullerton, Bradley; Impink, Bradley G.; Koontz, Alicia M.; Boninger, Michael L.

    2010-01-01

    Objective: The primary aim of this study is to establish the validity of greyscale-based quantitative ultrasound (QUS) measures of the biceps and supraspinatus tendons. Design: Nine QUS measures of the biceps and supraspinatus tendons were computed from ultrasound images collected from sixty-seven manual wheelchair users. Shoulder pathology was measured using questionnaires, physical examination maneuvers, and a clinical ultrasound grading scale. Results: Increased age, duration of wheelchair use, and body mass correlated with a darker, more homogeneous tendon appearance. Subjects with pain during physical examination tests for biceps tenderness and acromioclavicular joint tenderness exhibited significantly different supraspinatus QUS values. Even when controlling for tendon depth, QUS measures of the biceps tendon differed significantly between subjects with healthy tendons, mild tendinosis, and severe tendinosis. Clinical grading of supraspinatus tendon health was correlated with QUS measures of the supraspinatus tendon. Conclusions: Quantitative ultrasound is a valid method to quantify tendinopathy and may allow for early detection of tendinosis. Manual wheelchair users are at a high risk for developing shoulder tendon pathology and may benefit from quantitative ultrasound-based research that focuses on identifying interventions designed to reduce this risk. PMID:20407304

  30. Impact of implementation choices on quantitative predictions of cell-based computational models

    NASA Astrophysics Data System (ADS)

    Kursawe, Jochen; Baker, Ruth E.; Fletcher, Alexander G.

    2017-09-01

    'Cell-based' models provide a powerful computational tool for studying the mechanisms underlying the growth and dynamics of biological tissues in health and disease. An increasing amount of quantitative data with cellular resolution has paved the way for the quantitative parameterisation and validation of such models. However, the numerical implementation of cell-based models remains challenging, and little work has been done to understand to what extent implementation choices may influence model predictions. Here, we consider the numerical implementation of a popular class of cell-based models called vertex models, which are often used to study epithelial tissues. In two-dimensional vertex models, a tissue is approximated as a tessellation of polygons and the vertices of these polygons move due to mechanical forces originating from the cells. Such models have been used extensively to study the mechanical regulation of tissue topology in the literature. Here, we analyse how the model predictions may be affected by numerical parameters, such as the size of the time step, and non-physical model parameters, such as length thresholds for cell rearrangement. We find that vertex positions and summary statistics are sensitive to several of these implementation parameters. For example, the predicted tissue size decreases with decreasing cell cycle durations, and cell rearrangement may be suppressed by large time steps. These findings are counter-intuitive and illustrate that model predictions need to be thoroughly analysed and implementation details carefully considered when applying cell-based computational models in a quantitative setting.
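
    The sensitivity to the time step that the authors report can be caricatured with a one-dimensional toy: an overdamped forward-Euler relaxation of a single "edge" toward its rest length. The example below only illustrates the numerical point; it is not a vertex model.

```python
# Toy illustration of the implementation-sensitivity point: an overdamped
# forward-Euler relaxation of one "edge" toward its rest length. Large time
# steps change (and can destabilise) the numerical result even though the
# underlying model is identical. A caricature, not a full vertex model.
import numpy as np

def relax_edge(dt, steps, k=1.0, rest_length=1.0, x0=2.0):
    """Forward Euler for dx/dt = -k * (x - rest_length)."""
    x = x0
    for _ in range(int(steps)):
        x += dt * (-k * (x - rest_length))
    return x

for dt in (0.01, 0.5, 1.5, 2.5):
    steps = 10.0 / dt                       # integrate approximately the same total time, t ~ 10
    print(f"dt = {dt:<4}: edge length after t~10 is {relax_edge(dt, steps):.4f}")
```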

  31. Distance-based microfluidic quantitative detection methods for point-of-care testing.

    PubMed

    Tian, Tian; Li, Jiuxing; Song, Yanling; Zhou, Leiji; Zhu, Zhi; Yang, Chaoyong James

    2016-04-07

    Equipment-free devices with quantitative readout are of great significance to point-of-care testing (POCT), which provides real-time readout to users and is especially important in low-resource settings. Among various equipment-free approaches, distance-based visual quantitative detection methods rely on reading the visual signal length for corresponding target concentrations, thus eliminating the need for sophisticated instruments. The distance-based methods are low-cost, user-friendly and can be integrated into portable analytical devices. Moreover, such methods enable quantitative detection of various targets by the naked eye. In this review, we first introduce the concept and history of distance-based visual quantitative detection methods. Then, we summarize the main methods for translation of molecular signals to distance-based readout and discuss different microfluidic platforms (glass, PDMS, paper and thread) in terms of applications in biomedical diagnostics, food safety monitoring, and environmental analysis. Finally, the potential and future perspectives are discussed.

  32. Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.

    PubMed

    Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei

    2017-09-01

    Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile.
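
    Digital PCR quantification conventionally converts the fraction of positive partitions into copy numbers through a Poisson correction. The sketch below shows that standard arithmetic with hypothetical partition counts; it is not Fluidigm's analysis software.

```python
# Sketch of the standard digital PCR quantification arithmetic (hypothetical
# numbers; not the vendor's software): the fraction of positive partitions is
# converted to a mean copy number per partition via the Poisson correction.
import math

total_partitions = 770          # e.g. one panel of a digital array
positive_partitions = 231

p = positive_partitions / total_partitions
lam = -math.log(1.0 - p)                        # mean copies per partition
total_copies = lam * total_partitions
print(f"lambda = {lam:.3f} copies/partition, ~{total_copies:.0f} copies loaded")
```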

  33. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  34. Three-dimensional quantitative structure-activity relationship studies on c-Src inhibitors based on different docking methods.

    PubMed

    Bairy, Santhosh Kumar; Suneel Kumar, B V S; Bhalla, Joseph Uday Tej; Pramod, A B; Ravikumar, Muttineni

    2009-04-01

    c-Src kinase plays an important role in cell growth and differentiation, and its inhibitors can be useful for the treatment of various diseases, including cancer, osteoporosis, and metastatic bone disease. Three-dimensional quantitative structure-activity relationship (3D-QSAR) studies were carried out on quinazoline derivatives inhibiting c-Src kinase. Molecular field analysis (MFA) models were developed with four different alignment techniques, namely GLIDE, GOLD, LIGANDFIT and least-squares-based methods. The GLIDE-based MFA model showed better results (leave-one-out cross-validated correlation coefficient r2(cv) = 0.923 and non-cross-validated correlation coefficient r2 = 0.958) than the other models. These results help us understand the nature of the descriptors required for the activity of these compounds and thereby provide guidelines for designing novel and potent c-Src kinase inhibitors.
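
    The leave-one-out cross-validated correlation coefficient quoted above (often written q2) can be computed generically as sketched below, here with a ridge model on made-up descriptor data rather than the actual molecular field descriptors.

```python
# Sketch of the leave-one-out cross-validated r^2 (commonly denoted q^2)
# reported for QSAR models, computed here with a generic linear model on
# made-up descriptor data rather than real molecular field descriptors.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(5)
X = rng.normal(size=(40, 6))                    # hypothetical field descriptors
y = X @ np.array([1.2, -0.8, 0.5, 0.0, 0.3, -0.1]) + rng.normal(0, 0.3, 40)  # activity-like values

model = Ridge(alpha=1.0)
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)
r2 = model.fit(X, y).score(X, y)
print(f"q^2 (LOO) = {q2:.3f}, non-cross-validated r^2 = {r2:.3f}")
```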

  35. Does Homework Really Matter for College Students in Quantitatively-Based Courses?

    ERIC Educational Resources Information Center

    Young, Nichole; Dollman, Amanda; Angel, N. Faye

    2016-01-01

    This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…

  36. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation are explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by the two analysis techniques, indicating an interesting change in the complexity of the cardiac dynamics during meditation that is supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of the physiological impact of meditation on its practitioners. PMID:26909045

  37. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation are explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by the two analysis techniques, indicating an interesting change in the complexity of the cardiac dynamics during meditation that is supported by quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of the physiological impact of meditation on its practitioners.
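
    The natural visibility graph used in this analysis connects two samples whenever the straight line between them stays above every intermediate sample. The quadratic-time sketch below applies that textbook definition to a synthetic heart-rate series; it is not the authors' full pipeline and omits the multifractal analysis.

```python
# Sketch of the natural visibility graph construction applied to a short
# synthetic heart-rate series: samples i and j are connected if the straight
# line between them stays above every intermediate sample. Quadratic-time
# textbook definition; not the authors' complete analysis.
import numpy as np
import networkx as nx

rng = np.random.default_rng(2)
heart_rate = 70 + 5 * np.sin(np.linspace(0, 6 * np.pi, 120)) + rng.normal(0, 1, 120)

def natural_visibility_graph(series):
    g = nx.Graph()
    n = len(series)
    g.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:
                g.add_edge(i, j)
    return g

g = natural_visibility_graph(heart_rate)
degrees = [d for _, d in g.degree()]
print("nodes:", g.number_of_nodes(), "edges:", g.number_of_edges(),
      "mean degree:", round(float(np.mean(degrees)), 2))
```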

  38. Synthesising quantitative and qualitative research in evidence-based patient information.

    PubMed

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-03-01

    Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and

  39. Synthesising quantitative and qualitative research in evidence‐based patient information

    PubMed Central

    Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan

    2007-01-01

    Background: Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence‐based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. Aims: This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Methods: Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non‐quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg “explain what the test involves”) was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. Results: 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. Conclusions: A

  20. 75 FR 9488 - Basel Comprehensive Quantitative Impact Study

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-02

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Basel Comprehensive Quantitative Impact... Quantitative Impact Study. OMB Number: 1550-0NEW. Form Numbers: N/A. Regulation requirement: 12 CFR Part 567... Basel II Capital Accord, the Basel Committee will conduct a quantitative impact study (QIS) to assess...

  1. Quantitative radiomics studies for tissue characterization: a review of technology and methodological procedures.

    PubMed

    Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter

    2017-02-01

    Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
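
    To make the overfitting concern concrete, a minimal Python sketch of one common precaution, penalized feature selection with cross-validation, is given below. It is illustrative only: the review does not prescribe a particular algorithm, and the feature matrix and outcome labels here are synthetic.

        import numpy as np
        from sklearn.linear_model import LogisticRegressionCV
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 500))       # 60 patients, 500 radiomic features (synthetic)
        y = rng.integers(0, 2, size=60)      # binary outcome (synthetic)

        # L1-penalized logistic regression with cross-validation retains only a sparse
        # subset of features, one way to reduce overfitting when the number of features
        # greatly exceeds the number of patients.
        model = LogisticRegressionCV(penalty="l1", solver="liblinear", Cs=10, cv=5)
        model.fit(StandardScaler().fit_transform(X), y)
        selected = np.flatnonzero(model.coef_.ravel() != 0)
        print(f"{selected.size} of {X.shape[1]} features retained")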

  2. Quantitative Courses in a Liberal Education Program: A Case Study

    ERIC Educational Resources Information Center

    Wismath, Shelly L.; Mackay, D. Bruce

    2012-01-01

    This essay argues for the importance of quantitative reasoning skills as part of a liberal education and describes the successful introduction of a mathematics-based quantitative skills course at a small Canadian university. Today's students need quantitative problem-solving skills, to function as adults, professionals, consumers, and citizens in…

  3. Returning to Work after Cancer: Quantitative Studies and Prototypical Narratives

    PubMed Central

    Steiner, John F.; Nowels, Carolyn T.; Main, Deborah S.

    2009-01-01

    Objective A combination of quantitative data and illustrative narratives may allow cancer survivorship researchers to disseminate their research findings more broadly. We identified recent, methodologically rigorous quantitative studies on return to work after cancer, summarized the themes from these studies, and illustrated those themes with narratives of individual cancer survivors. Methods We reviewed English-language studies of return to work for adult cancer survivors through June, 2008, and identified 13 general themes from papers that met methodological criteria (population-based sampling, prospective and longitudinal assessment, detailed assessment of work, evaluation of economic impact, assessment of moderators of work return, and large sample size). We drew survivorship narratives from a prior qualitative research study to illustrate these themes. Results Nine quantitative studies met 4 or more of our 6 methodological criteria. These studies suggested that most cancer survivors could return to work without residual disabilities. Cancer site, clinical prognosis, treatment modalities, socioeconomic status, and attributes of the job itself influenced the likelihood of work return. Three narratives - a typical survivor who returned to work after treatment, an individual unable to return to work, and an inspiring survivor who returned to work despite substantial barriers - illustrated many of the themes from the quantitative literature while providing additional contextual details. Conclusion Illustrative narratives can complement the findings of cancer survivorship research if researchers are rigorous and transparent in the selection, analysis, and retelling of those stories. PMID:19507264

  4. Quantitative analysis of multiple sclerosis: a feasibility study

    NASA Astrophysics Data System (ADS)

    Li, Lihong; Li, Xiang; Wei, Xinzhou; Sturm, Deborah; Lu, Hongbing; Liang, Zhengrong

    2006-03-01

    Multiple Sclerosis (MS) is an inflammatory and demyelinating disorder of the central nervous system with a presumed immune-mediated etiology. For treatment of MS, the measurements of white matter (WM), gray matter (GM), and cerebral spinal fluid (CSF) are often used in conjunction with clinical evaluation to provide a more objective measure of MS burden. In this paper, we apply a new unifying automatic mixture-based algorithm for segmentation of brain tissues to quantitatively analyze MS. The method takes into account the following effects that commonly appear in MR imaging: 1) The MR data is modeled as a stochastic process with an inherent inhomogeneity effect of smoothly varying intensity; 2) A new partial volume (PV) model is built in establishing the maximum a posterior (MAP) segmentation scheme; 3) Noise artifacts are minimized by a priori Markov random field (MRF) penalty indicating neighborhood correlation from tissue mixture. The volumes of brain tissues (WM, GM) and CSF are extracted from the mixture-based segmentation. Experimental results of feasibility studies on quantitative analysis of MS are presented.

  5. Magnetic Resonance-based Motion Correction for Quantitative PET in Simultaneous PET-MR Imaging.

    PubMed

    Rakvongthai, Yothin; El Fakhri, Georges

    2017-07-01

    Motion degrades image quality and quantitation of PET images, and is an obstacle to quantitative PET imaging. Simultaneous PET-MR offers a tool that can be used for correcting the motion in PET images by using anatomic information from MR imaging acquired concurrently. Motion correction can be performed by transforming a set of reconstructed PET images into the same frame or by incorporating the transformation into the system model and reconstructing the motion-corrected image. Several phantom and patient studies have validated that MR-based motion correction strategies have great promise for quantitative PET imaging in simultaneous PET-MR. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Experiencing Teaching and Learning Quantitative Reasoning in a Project-Based Context

    ERIC Educational Resources Information Center

    Muir, Tracey; Beswick, Kim; Callingham, Rosemary; Jade, Katara

    2016-01-01

    This paper presents the findings of a small-scale study that investigated the issues and challenges of teaching and learning about quantitative reasoning (QR) within a project-based learning (PjBL) context. Students and teachers were surveyed and interviewed about their experiences of learning and teaching QR in that context in contrast to…

  7. A Statistical Framework for Protein Quantitation in Bottom-Up MS-Based Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and associated confidence measures. Challenges include the presence of low quality or incorrectly identified peptides and informative missingness. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model that carefully accounts for informative missingness in peak intensities and allows unbiased, model-based, protein-level estimation and inference. The model is applicable to both label-based and label-free quantitation experiments. We also provide automated, model-based, algorithms for filtering of proteins and peptides as well as imputation of missing values. Two LC/MS datasets are used to illustrate the methods. In simulation studies, our methods are shown to achieve substantially more discoveries than standard alternatives. Availability: The software has been made available in the opensource proteomics platform DAnTE (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu Supplementary information: Supplementary data are available at Bioinformatics online.

  8. Quantitative detection of melamine based on terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhao, Xiaojing; Wang, Cuicui; Liu, Shangjian; Zuo, Jian; Zhou, Zihan; Zhang, Cunlin

    2018-01-01

    Melamine is an organic base and a trimer of cyanamide, with a 1,3,5-triazine skeleton. It is usually used for the production of plastics, glue and flame retardants. Melamine combines with acid and related compounds to form melamine cyanurate and related crystal structures, which have been implicated as contaminants or biomarkers in protein adulteration by lawbreakers, especially in milk powder. This paper focuses on developing a practical method for the quantitative detection of melamine in the fields of security inspection and nondestructive testing based on THz-TDS. Terahertz (THz) technology has promising applications for the detection and identification of materials because it offers spectroscopic specificity, good penetration and safety. Terahertz time-domain spectroscopy (THz-TDS) is a key technique for the spectroscopic measurement of materials based on ultrafast femtosecond lasers. In this study, melamine and its mixtures with polyethylene powder at different concentrations were measured using transmission THz-TDS, and the refractive index and absorption spectra of the different melamine concentrations were obtained over 0.2-2.8 THz. The refractive index spectra show a clear decline with decreasing concentration, and the absorption spectra exhibit two melamine peaks, at 1.98 THz and 2.28 THz. Based on the experimental results, the absorption coefficient and the concentration of melamine in the mixture are determined. Finally, methods for the quantitative detection of materials in the fields of nondestructive testing and quality control based on THz-TDS have been studied.
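
    As an illustration of the quantitation idea described above, the following sketch fits a linear calibration of absorption-peak amplitude against melamine concentration and inverts it for an unknown mixture. All numbers are hypothetical and are not the values measured in this study.

        import numpy as np

        # Hypothetical calibration data: absorption at the 1.98 THz melamine peak
        conc = np.array([5.0, 10.0, 20.0, 40.0])        # melamine content, wt% (assumed)
        alpha = np.array([2.1, 4.0, 8.3, 16.1])          # absorption coefficient, cm^-1 (assumed)

        slope, intercept = np.polyfit(conc, alpha, 1)    # linear calibration: alpha = slope*c + intercept

        alpha_unknown = 6.5                               # measured peak absorption of an unknown sample (assumed)
        c_unknown = (alpha_unknown - intercept) / slope   # invert the calibration
        print(f"estimated melamine content: {c_unknown:.1f} wt%")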

  9. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women

    PubMed Central

    Nardo, Lorenzo; Karampinos, Dimitrios C.; Joseph, Gabby B.; Yap, Samuel P.; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M.

    2013-01-01

    Objective The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Methods Sixty-two women (age 61±6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. Results A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P<0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0–4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Conclusion Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. PMID:22411305

  10. A low cost mobile phone dark-field microscope for nanoparticle-based quantitative studies.

    PubMed

    Sun, Dali; Hu, Tony Y

    2018-01-15

    Dark-field microscope (DFM) analysis of nanoparticle binding signal is highly useful for a variety of research and biomedical applications, but current applications for nanoparticle quantification rely on expensive DFM systems. The cost, size and limited robustness of these DFMs limit their utility in non-laboratory settings. Most nanoparticle analyses use high-magnification DFM images, which are labor intensive to acquire and subject to operator bias. Low-magnification DFM image capture is faster, but is subject to background from surface artifacts and debris, although image processing can partially compensate for background signal. We thus mated an LED light source, a dark-field condenser and a 20× objective lens with a mobile phone camera to create an inexpensive, portable and robust DFM system suitable for use in non-laboratory conditions. This proof-of-concept mobile DFM device weighs less than 400 g and costs less than $2000, but analysis of images captured with this device reveals similar nanoparticle quantitation results to those acquired with a much larger and more expensive desktop DFM system. Our results suggest that similar devices may be useful for quantification of stable, nanoparticle-based activity and quantitation assays in resource-limited areas where conventional assay approaches are not practical. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. A simple approach to quantitative analysis using three-dimensional spectra based on selected Zernike moments.

    PubMed

    Zhai, Hong Lin; Zhai, Yue Yuan; Li, Pei Zhen; Tian, Yue Li

    2013-01-21

    A very simple approach to quantitative analysis is proposed based on the technology of digital image processing using three-dimensional (3D) spectra obtained by high-performance liquid chromatography coupled with a diode array detector (HPLC-DAD). As region-based shape features of a grayscale image, Zernike moments with inherent invariance properties were employed to establish the linear quantitative models. This approach was applied to the quantitative analysis of three compounds in mixed samples using 3D HPLC-DAD spectra, and three linear models were obtained, respectively. The correlation coefficients (R²) for training and test sets were more than 0.999, and the statistical parameters and strict validation supported the reliability of the established models. The analytical results suggest that the Zernike moments selected by stepwise regression can be used in the quantitative analysis of target compounds. Our study provides a new idea for quantitative analysis using 3D spectra, which can be extended to the analysis of other 3D spectra obtained by different methods or instruments.
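
    The quantitative step, regressing concentration on a selected Zernike moment computed from the 3D HPLC-DAD landscape, can be sketched as follows. The moment values and concentrations are placeholders; the original work selected the informative moments by stepwise regression.

        import numpy as np

        # Placeholder training data: one selected Zernike moment per 3D spectrum
        # (in practice the moments are computed from the grayscale image of the
        # HPLC-DAD landscape and the informative ones chosen by stepwise regression).
        z_moment = np.array([0.12, 0.23, 0.35, 0.48, 0.59])   # assumed moment values
        conc     = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # assumed concentrations, ug/mL

        a, b = np.polyfit(z_moment, conc, 1)                  # linear quantitative model: c = a*Z + b
        pred = a * z_moment + b
        r2 = 1 - np.sum((conc - pred) ** 2) / np.sum((conc - conc.mean()) ** 2)
        print(f"c = {a:.1f} * Z + {b:.1f}, R^2 = {r2:.4f}")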

  12. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein−based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  13. Quantitative study of protein-protein interactions by quartz nanopipettes

    NASA Astrophysics Data System (ADS)

    Tiwari, Purushottam Babu; Astudillo, Luisana; Miksovska, Jaroslava; Wang, Xuewen; Li, Wenzhi; Darici, Yesim; He, Jin

    2014-08-01

    In this report, protein-modified quartz nanopipettes were used to quantitatively study protein-protein interactions in attoliter sensing volumes. As shown by numerical simulations, the ionic current through the conical-shaped nanopipette is very sensitive to the surface charge variation near the pore mouth. With the appropriate modification of negatively charged human neuroglobin (hNgb) onto the inner surface of a nanopipette, we were able to detect concentration-dependent current change when the hNgb-modified nanopipette tip was exposed to positively charged cytochrome c (Cyt c) with a series of concentrations in the bath solution. Such current change is due to the adsorption of Cyt c to the inner surface of the nanopipette through specific interactions with hNgb. In contrast, a smaller current change with weak concentration dependence was observed when Cyt c was replaced with lysozyme, which does not specifically bind to hNgb. The equilibrium dissociation constant (KD) for the Cyt c-hNgb complex formation was derived and the value matched very well with the result from surface plasmon resonance measurement. This is the first quantitative study of protein-protein interactions by a conical-shaped nanopore based on charge sensing. Our results demonstrate that nanopipettes can potentially be used as a label-free analytical tool to quantitatively characterize protein-protein interactions.

  14. Quantitative susceptibility mapping of human brain at 3T: a multisite reproducibility study.

    PubMed

    Lin, P-Y; Chao, T-C; Wu, M-L

    2015-03-01

    Quantitative susceptibility mapping of the human brain has demonstrated strong potential in examining iron deposition, which may help in investigating possible brain pathology. This study assesses the reproducibility of quantitative susceptibility mapping across different imaging sites. In this study, the susceptibility values of 5 regions of interest in the human brain were measured on 9 healthy subjects following calibration by using phantom experiments. Each of the subjects was imaged 5 times on 1 scanner with the same procedure repeated on 3 different 3T systems so that both within-site and cross-site quantitative susceptibility mapping precision levels could be assessed. Two quantitative susceptibility mapping algorithms, similar in principle, one by using iterative regularization (iterative quantitative susceptibility mapping) and the other with analytic optimal solutions (deterministic quantitative susceptibility mapping), were implemented, and their performances were compared. Results show that while deterministic quantitative susceptibility mapping had nearly 700 times faster computation speed, residual streaking artifacts seem to be more prominent compared with iterative quantitative susceptibility mapping. With quantitative susceptibility mapping, the putamen, globus pallidus, and caudate nucleus showed smaller imprecision on the order of 0.005 ppm, whereas the red nucleus and substantia nigra, closer to the skull base, had a somewhat larger imprecision of approximately 0.01 ppm. Cross-site errors were not significantly larger than within-site errors. Possible sources of estimation errors are discussed. The reproducibility of quantitative susceptibility mapping in the human brain in vivo is regionally dependent, and the precision levels achieved with quantitative susceptibility mapping should allow longitudinal and multisite studies such as aging-related changes in brain tissue magnetic susceptibility. © 2015 by American Journal of Neuroradiology.

  15. Quantitative structure-property relationship (correlation analysis) of phosphonic acid-based chelates in design of MRI contrast agent.

    PubMed

    Tiwari, Anjani K; Ojha, Himanshu; Kaul, Ankur; Dutta, Anupama; Srivastava, Pooja; Shukla, Gauri; Srivastava, Rakesh; Mishra, Anil K

    2009-07-01

    Nuclear magnetic resonance imaging is a very useful tool in modern medical diagnostics, especially when gadolinium(III)-based contrast agents are administered to the patient with the aim of increasing the image contrast between normal and diseased tissues. Using soft modelling techniques such as quantitative structure-activity relationship/quantitative structure-property relationship analysis after a suitable description of their molecular structure, we have studied a series of phosphonic acids for the design of new MRI contrast agents. Quantitative structure-property relationship studies with multiple linear regression analysis were applied to find correlations between different calculated molecular descriptors of the phosphonic acid-based chelating agents and their stability constants. The final quantitative structure-property relationship models for the phosphonic acid series were: Model 1, log K(ML) = 5.00243(+/-0.7102) - 0.0263(+/-0.540) MR, with n = 12, |r| = 0.942, s = 0.183, F = 99.165; and Model 2, log K(ML) = 5.06280(+/-0.3418) - 0.0252(+/-0.198) MR, with n = 12, |r| = 0.956, s = 0.186, F = 99.256.
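
    A worked example of applying Model 1 to estimate a stability constant from the molar refractivity (MR) descriptor might look like the sketch below, under the reading of the abstract's formula in which the second coefficient multiplies MR; the MR value itself is hypothetical.

        # Model 1 as quoted above: log K_ML = 5.00243 - 0.0263 * MR
        a, b = 5.00243, -0.0263

        mr = 35.0                              # hypothetical molar refractivity of a candidate chelate
        log_k_ml = a + b * mr                  # 5.00243 - 0.0263 * 35 = 4.08
        print(f"predicted log K_ML = {log_k_ml:.2f}")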

  16. Quantitative SIMS Imaging of Agar-Based Microbial Communities.

    PubMed

    Dunham, Sage J B; Ellis, Joseph F; Baig, Nameera F; Morales-Soto, Nydia; Cao, Tianyuan; Shrout, Joshua D; Bohn, Paul W; Sweedler, Jonathan V

    2018-05-01

    After several decades of widespread use for mapping elemental ions and small molecular fragments in surface science, secondary ion mass spectrometry (SIMS) has emerged as a powerful analytical tool for molecular imaging in biology. Biomolecular SIMS imaging has primarily been used as a qualitative technique; although the distribution of a single analyte can be accurately determined, it is difficult to map the absolute quantity of a compound or even to compare the relative abundance of one molecular species to that of another. We describe a method for quantitative SIMS imaging of small molecules in agar-based microbial communities. The microbes are cultivated on a thin film of agar, dried under nitrogen, and imaged directly with SIMS. By use of optical microscopy, we show that the area of the agar is reduced by 26 ± 2% (standard deviation) during dehydration, but the overall biofilm morphology and analyte distribution are largely retained. We detail a quantitative imaging methodology, in which the ion intensity of each analyte is (1) normalized to an external quadratic regression curve, (2) corrected for isomeric interference, and (3) filtered for sample-specific noise and lower and upper limits of quantitation. The end result is a two-dimensional surface density image for each analyte. The sample preparation and quantitation methods are validated by quantitatively imaging four alkyl-quinolone and alkyl-quinoline N-oxide signaling molecules (including Pseudomonas quinolone signal) in Pseudomonas aeruginosa colony biofilms. We show that the relative surface densities of the target biomolecules are substantially different from values inferred through direct intensity comparison and that the developed methodologies can be used to quantitatively compare as many ions as there are available standards.
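
    The three quantitation steps described above (external regression calibration, isomeric-interference correction, and filtering against limits of quantitation) can be sketched per pixel as follows. All constants are placeholders rather than the calibration values from the paper.

        import numpy as np

        def quantify_pixel(intensity, a, b, c, interference, loq_low, loq_high):
            """Convert a raw ion intensity into a surface density estimate.

            a, b, c      : coefficients of an external quadratic calibration
                           intensity = a*d**2 + b*d + c (placeholders)
            interference : estimated isomeric contribution to subtract
            loq_low/high : lower and upper limits of quantitation
            """
            corrected = intensity - interference           # interference correction
            roots = np.roots([a, b, c - corrected])        # invert the quadratic calibration
            valid = [r.real for r in roots if abs(r.imag) < 1e-9 and r.real >= 0]
            if not valid:
                return np.nan
            density = max(valid)
            if not (loq_low <= density <= loq_high):       # filter outside the limits of quantitation
                return np.nan
            return density

        print(quantify_pixel(1200.0, a=0.5, b=40.0, c=10.0,
                             interference=50.0, loq_low=1.0, loq_high=500.0))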

  17. A CZT-based blood counter for quantitative molecular imaging.

    PubMed

    Espagnet, Romain; Frezza, Andrea; Martin, Jean-Pierre; Hamel, Louis-André; Lechippey, Laëtitia; Beauregard, Jean-Mathieu; Després, Philippe

    2017-12-01

    Robust quantitative analysis in positron emission tomography (PET) and in single-photon emission computed tomography (SPECT) typically requires the time-activity curve as an input function for the pharmacokinetic modeling of tracer uptake. For this purpose, a new automated tool for the determination of blood activity as a function of time is presented. The device, compact enough to be used on the patient bed, relies on a peristaltic pump for continuous blood withdrawal at user-defined rates. Gamma detection is based on a 20 × 20 × 15 mm³ cadmium zinc telluride (CZT) detector, read by custom-made electronics and a field-programmable gate array-based signal processing unit. A graphical user interface (GUI) allows users to select parameters and easily perform acquisitions. This paper presents the overall design of the device as well as the results related to the detector performance in terms of stability, sensitivity and energy resolution. Results from a patient study are also reported. The device achieved a sensitivity of 7.1 cps/(kBq/mL) and a minimum detectable activity of 2.5 kBq/mL for ¹⁸F. The gamma counter also demonstrated excellent stability, with a deviation in count rates below 0.05% over 6 h. An energy resolution of 8% was achieved at 662 keV. The patient study was conclusive and demonstrated that the compact gamma blood counter developed has the sensitivity and the stability required to conduct quantitative molecular imaging studies in PET and SPECT.
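
    With the reported sensitivity, converting a measured count rate into a blood activity concentration is a one-line calculation; a small sketch with a hypothetical count rate is shown below.

        SENSITIVITY = 7.1          # cps per (kBq/mL), as reported for 18F
        MDA = 2.5                  # minimum detectable activity, kBq/mL

        count_rate = 42.0          # hypothetical measured count rate, cps (background subtracted)
        activity = count_rate / SENSITIVITY
        if activity < MDA:
            print("below minimum detectable activity")
        else:
            print(f"blood activity concentration: {activity:.1f} kBq/mL")   # ~5.9 kBq/mL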

  18. Slow erosion of a quantitative apple resistance to Venturia inaequalis based on an isolate-specific Quantitative Trait Locus.

    PubMed

    Caffier, Valérie; Le Cam, Bruno; Al Rifaï, Mehdi; Bellanger, Marie-Noëlle; Comby, Morgane; Denancé, Caroline; Didelot, Frédérique; Expert, Pascale; Kerdraon, Tifenn; Lemarquand, Arnaud; Ravon, Elisa; Durel, Charles-Eric

    2016-10-01

    Quantitative plant resistance affects the aggressiveness of pathogens and is usually considered more durable than qualitative resistance. However, the efficiency of a quantitative resistance based on an isolate-specific Quantitative Trait Locus (QTL) is expected to decrease over time due to the selection of isolates with a high level of aggressiveness on resistant plants. To test this hypothesis, we surveyed scab incidence over an eight-year period in an orchard planted with susceptible and quantitatively resistant apple genotypes. We sampled 79 Venturia inaequalis isolates from this orchard at three dates and we tested their level of aggressiveness under controlled conditions. Isolates sampled on resistant genotypes triggered higher lesion density and exhibited a higher sporulation rate on apple carrying the resistance allele of the QTL T1 compared to isolates sampled on susceptible genotypes. Due to this ability to select aggressive isolates, we expected the QTL T1 to be non-durable. However, our results showed that the quantitative resistance based on the QTL T1 remained efficient in orchard over an eight-year period, with only a slow decrease in efficiency and no detectable increase of the aggressiveness of fungal isolates over time. We conclude that knowledge on the specificity of a QTL is not sufficient to evaluate its durability. Deciphering molecular mechanisms associated with resistance QTLs, genetic determinants of aggressiveness and putative trade-offs within pathogen populations is needed to help in understanding the erosion processes. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high throughput techniques used in biomarker discovery are very time and cost intensive, highly specific and quantitative analytical alternative application methods are needed for the routine analysis. Conventional Western blotting allows detection of specific proteins to the level of single isotypes while its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By an additional fluorescent staining of all proteins immediately after their transfer to the blot membrane, it is possible to visualise simultaneously the antibody binding and the total protein profile. This allows for an accurate correction for protein load. Applying this normalisation it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate application technique for biomarker quantitation and suggest possibilities of employment that go far beyond.
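
    The load correction described above amounts to dividing each lane's antibody-specific fluorescence by the total-protein stain signal of the same lane; a minimal sketch with hypothetical intensities is given below.

        import numpy as np

        antibody_signal = np.array([15200.0, 18100.0, 9800.0])   # specific fluorescence per lane (assumed)
        total_protein   = np.array([52000.0, 61000.0, 33000.0])  # total-protein stain per lane (assumed)

        normalized = antibody_signal / total_protein              # corrects for unequal protein load
        print(np.round(normalized, 3))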

  20. Problem-based learning on quantitative analytical chemistry course

    NASA Astrophysics Data System (ADS)

    Fitri, Noor

    2017-12-01

    This research applies a problem-based learning method to a quantitative analytical chemistry course, "Analytical Chemistry II", especially as related to essential oil analysis. The learning outcomes of this course include understanding of the lectures, the skills to apply the course materials, and the ability to identify, formulate and solve chemical analysis problems. The role of study groups is quite important in improving students' learning ability and in completing independent and group tasks. Thus, students are not only aware of the basic concepts of Analytical Chemistry II, but are also able to understand and apply the analytical concepts they have studied to solve given analytical chemistry problems, and have the attitude and ability to work together to solve those problems. Based on the learning outcomes, it can be concluded that the problem-based learning method in the Analytical Chemistry II course improves students' knowledge, skills, ability and attitude. Students are not only skilled at solving problems in analytical chemistry, especially in essential oil analysis in accordance with the local genius of the Chemistry Department, Universitas Islam Indonesia, but are also skilled in working with computer programs and able to understand materials and problems in English.

  1. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  2. Quantitative analysis of factors that affect oil pipeline network accident based on Bayesian networks: A case study in China

    NASA Astrophysics Data System (ADS)

    Zhang, Chao; Qin, Ting Xin; Huang, Shuai; Wu, Jian Song; Meng, Xin Yan

    2018-06-01

    Some factors can affect the consequences of an oil pipeline accident, and their effects should be analyzed to improve emergency preparation and emergency response. Although there are qualitative models of risk factors' effects, quantitative models still need to be developed. In this study, we introduce a Bayesian network (BN) model for analyzing risk factors' effects in an oil pipeline accident case that happened in China. An incident evolution diagram is built to identify the risk factors, and the BN model is built based on a deployment rule for factor nodes and on expert knowledge combined through Dempster-Shafer evidence theory. The probabilities of incident consequences and of risk factors' effects can then be calculated. The most likely consequences given by this model are consistent with the case. Meanwhile, the quantitative estimates of risk factors' effects may provide a theoretical basis for choosing optimal risk treatment measures in oil pipeline management, which can be used in emergency preparation and emergency response.
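
    The abstract does not give the network structure, but the flavour of such a BN calculation can be conveyed with a toy two-node network in which the marginal probability of each consequence level is obtained by summing over the states of a risk factor. The conditional probability table below is hypothetical.

        import numpy as np

        # Toy example: one risk factor node F (present/absent) and one consequence
        # node C (minor/major). All probabilities are hypothetical.
        p_f = np.array([0.3, 0.7])                 # P(F = present), P(F = absent)
        p_c_given_f = np.array([[0.4, 0.6],        # P(C | F = present): minor, major
                                [0.9, 0.1]])       # P(C | F = absent):  minor, major

        p_c = p_f @ p_c_given_f                    # marginalize over the factor states
        print(dict(zip(["minor", "major"], np.round(p_c, 2))))   # {'minor': 0.75, 'major': 0.25}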

  3. Sensitivity analyses of exposure estimates from a quantitative job-exposure matrix (SYN-JEM) for use in community-based studies.

    PubMed

    Peters, Susan; Kromhout, Hans; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Vermeulen, Roel

    2013-01-01

    We describe the elaboration and sensitivity analyses of a quantitative job-exposure matrix (SYN-JEM) for respirable crystalline silica (RCS). The aim was to gain insight into the robustness of the SYN-JEM RCS estimates based on critical decisions taken in the elaboration process. SYN-JEM for RCS exposure consists of three axes (job, region, and year) based on estimates derived from a previously developed statistical model. To elaborate SYN-JEM, several decisions were taken: i.e. the application of (i) a single time trend; (ii) region-specific adjustments in RCS exposure; and (iii) a prior job-specific exposure level (by the semi-quantitative DOM-JEM), with an override of 0 mg/m³ for jobs a priori defined as non-exposed. Furthermore, we assumed that exposure levels reached a ceiling in 1960 and remained constant prior to this date. We applied SYN-JEM to the occupational histories of subjects from a large international pooled community-based case-control study. Cumulative exposure levels derived with SYN-JEM were compared with those from alternative models, described by Pearson correlation (Rp) and differences in unit of exposure (mg/m³-years). Alternative models concerned changes in application of job- and region-specific estimates and exposure ceiling, and omitting the a priori exposure ranking. Cumulative exposure levels for the study subjects ranged from 0.01 to 60 mg/m³-years, with a median of 1.76 mg/m³-years. Exposure levels derived from SYN-JEM and alternative models were overall highly correlated (Rp > 0.90), although somewhat lower when omitting the region estimate (Rp = 0.80) or not taking into account the assigned semi-quantitative exposure level (Rp = 0.65). Modification of the time trend (i.e. exposure ceiling at 1950 or 1970, or assuming a decline before 1960) caused the largest changes in absolute exposure levels (26-33% difference), but without changing the relative ranking (Rp = 0.99). Exposure estimates derived from SYN
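
    Applying a JEM to an occupational history reduces to summing exposure intensity times duration over the jobs held; the sketch below also compares two alternative exposure models with a Pearson correlation, in the spirit of the sensitivity analyses above. The intensity values are invented for illustration.

        import numpy as np

        # One subject's occupational history: (years held, RCS intensity from SYN-JEM,
        # RCS intensity from an alternative model), intensities in mg/m3 (assumed).
        history = [(5, 0.02, 0.03),
                   (12, 0.10, 0.08),
                   (8, 0.00, 0.00)]

        cum_synjem = sum(years * i1 for years, i1, _ in history)   # mg/m3-years
        cum_alt    = sum(years * i2 for years, _, i2 in history)
        print(cum_synjem, cum_alt)                                  # 1.3 1.11

        # Across many subjects the two models would be compared by Pearson correlation:
        subjects_synjem = np.array([1.3, 0.4, 7.9, 0.0, 2.2])
        subjects_alt    = np.array([1.1, 0.5, 6.8, 0.0, 2.5])
        print(np.corrcoef(subjects_synjem, subjects_alt)[0, 1])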

  4. "Standards"-Based Mathematics Curricula and the Promotion of Quantitative Literacy in Elementary School

    ERIC Educational Resources Information Center

    Wilkins, Jesse L. M.

    2015-01-01

    Background: Prior research has shown that students taught using "Standards"-based mathematics curricula tend to outperform students on measures of mathematics achievement. However, little research has focused particularly on the promotion of student quantitative literacy (QLT). In this study, the potential influence of the…

  5. Refining Intervention Targets in Family-Based Research: Lessons From Quantitative Behavioral Genetics

    PubMed Central

    Leve, Leslie D.; Harold, Gordon T.; Ge, Xiaojia; Neiderhiser, Jenae M.; Patterson, Gerald

    2010-01-01

    The results from a large body of family-based research studies indicate that modifying the environment (specifically dimensions of the social environment) through intervention is an effective mechanism for achieving positive outcomes. Parallel to this work is a growing body of evidence from genetically informed studies indicating that social environmental factors are central to enhancing or offsetting genetic influences. Increased precision in the understanding of the role of the social environment in offsetting genetic risk might provide new information about environmental mechanisms that could be applied to prevention science. However, at present, the multifaceted conceptualization of the environment in prevention science is mismatched with the more limited measurement of the environment in many genetically informed studies. A framework for translating quantitative behavioral genetic research to inform the development of preventive interventions is presented in this article. The measurement of environmental indices amenable to modification is discussed within the context of quantitative behavioral genetic studies. In particular, emphasis is placed on the necessary elements that lead to benefits in prevention science, specifically the development of evidence-based interventions. An example from an ongoing prospective adoption study is provided to illustrate the potential of this translational process to inform the selection of preventive intervention targets. PMID:21188273

  6. Quantitative study of protein-protein interactions by quartz nanopipettes.

    PubMed

    Tiwari, Purushottam Babu; Astudillo, Luisana; Miksovska, Jaroslava; Wang, Xuewen; Li, Wenzhi; Darici, Yesim; He, Jin

    2014-09-07

    In this report, protein-modified quartz nanopipettes were used to quantitatively study protein-protein interactions in attoliter sensing volumes. As shown by numerical simulations, the ionic current through the conical-shaped nanopipette is very sensitive to the surface charge variation near the pore mouth. With the appropriate modification of negatively charged human neuroglobin (hNgb) onto the inner surface of a nanopipette, we were able to detect concentration-dependent current change when the hNgb-modified nanopipette tip was exposed to positively charged cytochrome c (Cyt c) with a series of concentrations in the bath solution. Such current change is due to the adsorption of Cyt c to the inner surface of the nanopipette through specific interactions with hNgb. In contrast, a smaller current change with weak concentration dependence was observed when Cyt c was replaced with lysozyme, which does not specifically bind to hNgb. The equilibrium dissociation constant (KD) for the Cyt c-hNgb complex formation was derived and the value matched very well with the result from surface plasmon resonance measurement. This is the first quantitative study of protein-protein interactions by a conical-shaped nanopore based on charge sensing. Our results demonstrate that nanopipettes can potentially be used as a label-free analytical tool to quantitatively characterize protein-protein interactions.

  7. [Quantitative assessment of urban ecosystem services flow based on entropy theory: A case study of Beijing, China].

    PubMed

    Li, Jing Xin; Yang, Li; Yang, Lei; Zhang, Chao; Huo, Zhao Min; Chen, Min Hao; Luan, Xiao Feng

    2018-03-01

    Quantitative evaluation of ecosystem services is a primary premise for rational resource exploitation and sustainable development. Examining ecosystem services flow provides a scientific method to quantify ecosystem services. We built an assessment indicator system based on land cover/land use under the framework of four types of ecosystem services, and the types of ecosystem services flow were reclassified. Using entropy theory, the degree of disorder and the developing trend of the indicators and of the urban ecosystem were quantitatively assessed. Beijing was chosen as the study area, and twenty-four indicators were selected for evaluation. The results showed that the entropy value of the Beijing urban ecosystem during 2004 to 2015 was 0.794 and the entropy flow was -0.024, indicating a high degree of disorder and a system close to an unhealthy state. The system entropy reached maximum values three times, while the mean annual variation of the system entropy value increased gradually over three periods, indicating that human activities had negative effects on the urban ecosystem. The entropy flow reached its minimum value in 2007, implying that environmental quality was best in 2007. The coefficient of determination for the fitted relationship between the total permanent population of Beijing and the urban ecosystem entropy flow was 0.921, indicating that urban ecosystem health was highly correlated with the total permanent population.
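
    The abstract does not spell out the formulation, but a standard entropy calculation on normalized indicator values, of the kind often used in such assessments, can be sketched as follows with hypothetical indicator values.

        import numpy as np

        # Rows: years; columns: ecosystem-service indicators (hypothetical values).
        X = np.array([[0.8, 1.2, 3.0],
                      [0.9, 1.0, 2.5],
                      [1.1, 0.7, 2.0]])

        P = X / X.sum(axis=0)                                  # share of each year in each indicator
        n = X.shape[0]
        entropy = -(P * np.log(P)).sum(axis=0) / np.log(n)     # one normalized entropy per indicator
        print(np.round(entropy, 3))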

  8. The Development of Mathematical Knowledge for Teaching for Quantitative Reasoning Using Video-Based Instruction

    NASA Astrophysics Data System (ADS)

    Walters, Charles David

    Quantitative reasoning (P. W. Thompson, 1990, 1994) is a powerful mathematical tool that enables students to engage in rich problem solving across the curriculum. One way to support students' quantitative reasoning is to develop prospective secondary teachers' (PSTs) mathematical knowledge for teaching (MKT; Ball, Thames, & Phelps, 2008) related to quantitative reasoning. However, this may prove challenging, as prior to entering the classroom, PSTs often have few opportunities to develop MKT by examining and reflecting on students' thinking. Videos offer one avenue through which such opportunities are possible. In this study, I report on the design of a mini-course for PSTs that featured a series of videos created as part of a proof-of-concept NSF-funded project. These MathTalk videos highlight the ways in which the quantitative reasoning of two high school students developed over time. Using a mixed approach to grounded theory, I analyzed pre- and postinterviews using an extant coding scheme based on the Silverman and Thompson (2008) framework for the development of MKT. This analysis revealed a shift in participants' affect as well as three distinct shifts in their MKT around quantitative reasoning with distances, including shifts in: (a) quantitative reasoning; (b) point of view (decentering); and (c) orientation toward problem solving. Using the four-part focusing framework (Lobato, Hohensee, & Rhodehamel, 2013), I analyzed classroom data to account for how participants' noticing was linked with the shifts in MKT. Notably, their increased noticing of aspects of MKT around quantitative reasoning with distances, which features prominently in the MathTalk videos, seemed to contribute to the emergence of the shifts in MKT. Results from this study link elements of the learning environment to the development of specific facets of MKT around quantitative reasoning with distances. These connections suggest that vicarious experiences with two students' quantitative

  9. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for the monitoring of the disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the level of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified using the LI-COR (R) detection system (fluorescence based) rather than chemiluminescence and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity up to 17 pg/ml and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily to the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints in sample sizes and/or budget. PMID:25022797

  10. Confocal reflectance quantitative phase microscope system for cellular membranes dynamics study (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Singh, Vijay Raj; Yaqoob, Zahid; So, Peter T. C.

    2017-02-01

    Quantitative phase microscopy (QPM) techniques developed so far belong primarily to high-speed transmitted-light systems that have enough sensitivity to resolve membrane fluctuations and dynamics but no depth resolution. Therefore, most biomechanics studies using QPM today are confined to simple cells, such as RBCs, without internal organelles. An important instrument that would greatly extend the biomedical applications of QPM is a next-generation microscope with 3D capability and sufficient temporal resolution to study the biomechanics of complex eukaryotic cells, including the mechanics of their internal compartments. For eukaryotic cells, the depth-sectioning capability is critical and should be sufficient to distinguish nuclear membrane fluctuations from plasma membrane fluctuations. Further, this microscope must provide high temporal resolution, since typical eukaryotic membranes are substantially stiffer than RBCs. A confocal reflectance quantitative phase microscope is presented based on multi-pinhole scanning, with higher temporal resolution and sensitivity for the nuclear and plasma membranes of eukaryotic cells. The system hardware is based on an array of confocal pinholes generated by using the 'ON' state of a subset of micro-mirrors of a digital micro-mirror device (DMD, from Texas Instruments), and high-speed raster scanning provides 14 ms imaging speed in wide-field mode. A common-path interferometer is integrated at the imaging arm for detection of the specimens' quantitative phase information. The quantitative phase reconstruction of the system is investigated theoretically, and an application of the system is presented for measuring dimensional fluctuations of both the plasma and nuclear membranes of embryonic stem cells.

  11. iTRAQ-based Quantitative Proteomics Study in Patients with Refractory Mycoplasma pneumoniae Pneumonia.

    PubMed

    Yu, Jia-Lu; Song, Qi-Fang; Xie, Zhi-Wei; Jiang, Wen-Hui; Chen, Jia-Hui; Fan, Hui-Feng; Xie, Ya-Ping; Lu, Gen

    2017-09-25

    Mycoplasma pneumoniae (MP) is a leading cause of community-acquired pneumonia in children and young adults. Although MP pneumonia is usually benign and self-limited, in some cases it can develop into life-threatening refractory MP pneumonia (RMPP). However, the pathogenesis of RMPP is poorly understood. The identification and characterization of proteins related to RMPP could provide a proof of principle to facilitate appropriate diagnostic and therapeutic strategies for treating patients with MP. In this study, we used a quantitative proteomic technique (iTRAQ) to analyze MP-related proteins in serum samples from 5 patients with RMPP, 5 patients with non-refractory MP pneumonia (NRMPP), and 5 healthy children. Functional classification, sub-cellular localization, and protein interaction network analysis were carried out based on protein annotation through evolutionary relationship (PANTHER) and Cytoscape analysis. A total of 260 differentially expressed proteins were identified in the RMPP and NRMPP groups. Compared to the control group, the NRMPP and RMPP groups showed 134 (70 up-regulated and 64 down-regulated) and 126 (63 up-regulated and 63 down-regulated) differentially expressed proteins, respectively. The complex functional classification and protein interaction network of the identified proteins reflected the complex pathogenesis of RMPP. Our study provides the first comprehensive proteome map of RMPP-related proteins from MP pneumonia. These profiles may be useful as part of a diagnostic panel, and the identified proteins provide new insights into the pathological mechanisms underlying RMPP.

  12. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts, and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to O18/O16-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
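
    The core of the rule, keeping only peptides whose responses track each other linearly across samples, can be sketched as a correlation filter. The threshold and intensities below are illustrative and are not the published rule's exact criteria.

        import numpy as np

        # Rows: peptides of one protein; columns: samples (hypothetical log intensities).
        peptides = np.array([[20.1, 21.0, 22.2, 23.1],
                             [19.8, 20.7, 21.9, 22.9],
                             [18.0, 18.1, 17.9, 18.2],    # poorly correlated peptide
                             [20.5, 21.3, 22.4, 23.5]])

        ref = np.median(peptides, axis=0)                  # reference profile across samples
        r = np.array([np.corrcoef(p, ref)[0, 1] for p in peptides])
        selected = peptides[r > 0.95]                      # keep the linearly correlated peptides
        protein_profile = selected.mean(axis=0)            # roll up to the protein level
        print(np.round(r, 2), np.round(protein_profile, 2))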

  13. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes which enables the identification of protein species which exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses of data appropriate for the various experimental strategies that can be employed by quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and highlights various issues surrounding the design of experiments. The concepts and examples discussed within will show how robust design and analysis will lead to confident results that will ensure quantitative proteomics delivers.

  14. Evaluating the Effectiveness of Remedial Reading Courses at Community Colleges: A Quantitative Study

    ERIC Educational Resources Information Center

    Lavonier, Nicole

    2014-01-01

    The present study evaluated the effectiveness of two instructional approaches for remedial reading courses at a community college. The instructional approaches were strategic reading and traditional, textbook-based instruction. The two research questions that guided the quantitative, quasi-experimental study were: (a) what is the effect of…

  15. Quantitative methods to direct exploration based on hydrogeologic information

    USGS Publications Warehouse

    Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.

    2006-01-01

    Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
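
    The FOSM step named above combines the input covariance with the model sensitivities; in its simplest scalar-output form it is a quadratic form, as sketched below with arbitrary illustrative values.

        import numpy as np

        # Sensitivity of the model output (e.g. piezometric head at a point) to each of
        # three input parameters, and the input parameter covariance matrix (both hypothetical).
        s = np.array([0.8, -0.3, 1.5])
        C = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.25]])

        output_variance = s @ C @ s        # first-order second moment approximation
        print(round(float(output_variance), 4))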

  16. Study On The Application Of CBERS-02B To Quantitative Soil Erosion Monitoring

    NASA Astrophysics Data System (ADS)

    Shi, Mingchang; Xu, Jing; Wang, Lei; Wang, Xiaoyun; Mu, Jing

    2010-10-01

    Currently, the reduction of soil erosion is an important prerequisite for achieving ecological security. Since real-time, quantitative evaluation of regional soil erosion plays a significant role in reducing soil erosion, soil erosion models are more and more widely used. Based on the RUSLE model, this paper carries out quantitative soil erosion monitoring in the Xi River Basin and its surrounding areas by using CBERS-02B CCD, DEM, TRMM and other data, and validates the monitoring results against remote sensing investigation results from 2005. The monitoring results show that in 2009 the total amount of soil erosion in the study area was 1.94×10⁶ t, the erosion area was 2055.2 km² (54.06% of the total area), and the average soil erosion modulus was 509.7 t km⁻² a⁻¹. As a case of using CBERS-02B data for quantitative soil erosion monitoring, this study provides experience on the application of CBERS-02B data in the field of quantitative soil erosion monitoring and also for local soil erosion management.
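
    RUSLE itself is a multiplicative model; a minimal per-cell sketch is given below, with placeholder factor values rather than those derived from the CBERS-02B data.

        # RUSLE: A = R * K * L * S * C * P
        # A: annual soil loss per unit area; R: rainfall erosivity; K: soil erodibility;
        # L, S: slope length and steepness factors; C: cover management; P: support practice.
        def rusle_soil_loss(R, K, L, S, C, P):
            return R * K * L * S * C * P

        # Hypothetical factor values for one grid cell
        print(rusle_soil_loss(R=2400.0, K=0.03, L=1.2, S=1.5, C=0.2, P=0.8))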

  17. A Quantitative Corpus-Based Approach to English Spatial Particles: Conceptual Symmetry and Its Pedagogical Implications

    ERIC Educational Resources Information Center

    Chen, Alvin Cheng-Hsien

    2014-01-01

    The present study aims to investigate how conceptual symmetry plays a role in the use of spatial particles in English and to further examine its pedagogical implications via a corpus-based evaluation of the course books in senior high schools in Taiwan. More specifically, we adopt a quantitative corpus-based approach to investigate whether bipolar…

  18. Quantitative dispersion microscopy

    PubMed Central

    Fu, Dan; Choi, Wonshik; Sung, Yongjin; Yaqoob, Zahid; Dasari, Ramachandra R.; Feld, Michael

    2010-01-01

    Refractive index dispersion is an intrinsic optical property and a useful source of contrast in biological imaging studies. In this report, we present the first dispersion phase imaging of living eukaryotic cells. We have developed quantitative dispersion microscopy based on the principle of quantitative phase microscopy. The dual-wavelength quantitative phase microscope makes phase measurements at 310 nm and 400 nm wavelengths to quantify dispersion (refractive index increment ratio) of live cells. The measured dispersion of living HeLa cells is found to be around 1.088, which agrees well with that measured directly for protein solutions using total internal reflection. This technique, together with the dry mass and morphology measurements provided by quantitative phase microscopy, could prove to be a useful tool for distinguishing different types of biomaterials and studying spatial inhomogeneities of biological samples. PMID:21113234
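
    As a hedged sketch of how a refractive index increment ratio can be extracted from two phase maps, assuming a simple thin-sample phase model φ(λ) = 2π·h·Δn(λ)/λ with equal thickness h at both wavelengths; the phase values are invented and this is not the paper's calibration procedure.

    ```python
    def dispersion_ratio(phi_1, phi_2, lam_1, lam_2):
        """Ratio of refractive-index increments dn(lam_1)/dn(lam_2) from two phase
        measurements, assuming phi(lam) = 2*pi*h*dn(lam)/lam with the same thickness h."""
        return (phi_1 * lam_1) / (phi_2 * lam_2)

    # Hypothetical single-pixel phases (rad) at the two wavelengths named in the abstract
    print(dispersion_ratio(phi_1=2.40, phi_2=1.70, lam_1=310e-9, lam_2=400e-9))
    ```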

  19. A novel image-based quantitative method for the characterization of NETosis

    PubMed Central

    Zhao, Wenpu; Fogg, Darin K.; Kaplan, Mariana J.

    2015-01-01

    NETosis is a newly recognized mechanism of programmed neutrophil death. It is characterized by a stepwise progression of chromatin decondensation, membrane rupture, and release of bactericidal DNA-based structures called neutrophil extracellular traps (NETs). Conventional ‘suicidal’ NETosis has been described in pathogenic models of systemic autoimmune disorders. Recent in vivo studies suggest that a process of ‘vital’ NETosis also exists, in which chromatin is condensed and membrane integrity is preserved. Techniques to assess ‘suicidal’ or ‘vital’ NET formation in a specific, quantitative, rapid and semiautomated way have been lacking, hindering the characterization of this process. Here we have developed a new method to simultaneously assess both ‘suicidal’ and ‘vital’ NETosis, using high-speed multi-spectral imaging coupled to morphometric image analysis, to quantify spontaneous NET formation observed ex-vivo or stimulus-induced NET formation triggered in vitro. Use of imaging flow cytometry allows automated, quantitative and rapid analysis of subcellular morphology and texture, and introduces the potential for further investigation using NETosis as a biomarker in pre-clinical and clinical studies. PMID:26003624

  20. Study on quantitative risk assessment model of the third party damage for natural gas pipelines based on fuzzy comprehensive assessment

    NASA Astrophysics Data System (ADS)

    Qiu, Zeyang; Liang, Wei; Wang, Xue; Lin, Yang; Zhang, Meng

    2017-05-01

    As an important part of the national energy supply system, natural gas transmission pipelines can cause serious environmental pollution and loss of life and property in the event of an accident. Third party damage is one of the most significant causes of natural gas pipeline system accidents, and it is very important to establish an effective quantitative risk assessment model of third party damage in order to reduce the number of pipeline operation accidents. Because third party damage accidents have characteristics such as diversity, complexity and uncertainty, this paper establishes a quantitative risk assessment model of third party damage based on the Analytic Hierarchy Process (AHP) and Fuzzy Comprehensive Evaluation (FCE). First, the risk sources of third party damage are identified; then the weights of the factors are determined via an improved AHP; finally, the importance of each factor is calculated with the fuzzy comprehensive evaluation model. The results show that the quantitative risk assessment model is suitable for third party damage to natural gas pipelines, and improvement measures can be put forward to avoid accidents based on the importance of each factor.
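
    A hedged sketch of the AHP weighting step named in this record: weights are taken as the principal eigenvector of a pairwise comparison matrix and checked with a consistency ratio. The factors and comparison values below are hypothetical.

    ```python
    import numpy as np

    def ahp_weights(pairwise):
        """Return priority weights (principal eigenvector) and the consistency ratio."""
        vals, vecs = np.linalg.eig(pairwise)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = pairwise.shape[0]
        ci = (vals.real[k] - n) / (n - 1)              # consistency index
        ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.12)  # Saaty random index
        return w, ci / ri

    # Hypothetical 3-factor comparison: excavation activity vs. signage vs. patrol frequency
    M = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])
    w, cr = ahp_weights(M)
    print(w, cr)   # weights sum to 1; CR < 0.1 is usually taken as acceptably consistent
    ```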

  1. Quantitative Articles: Developing Studies for Publication in Counseling Journals

    ERIC Educational Resources Information Center

    Trusty, Jerry

    2011-01-01

    This article is presented as a guide for developing quantitative studies and preparing quantitative manuscripts for publication in counseling journals. It is intended as an aid for aspiring authors in conceptualizing studies and formulating valid research designs. Material is presented on choosing variables and measures and on selecting…

  2. Pilot study of Iopamidol-based quantitative pH imaging on a clinical 3T MR scanner.

    PubMed

    Müller-Lutz, Anja; Khalil, Nadia; Schmitt, Benjamin; Jellus, Vladimir; Pentang, Gael; Oeltzschner, Georg; Antoch, Gerald; Lanzman, Rotem S; Wittsack, Hans-Jörg

    2014-12-01

    The objective of this study was to demonstrate the feasibility of performing Iopamidol-based pH imaging on a clinical 3T magnetic resonance imaging (MRI) scanner using chemical exchange saturation transfer (CEST) imaging with pulse train presaturation. The pulse train presaturation scheme of a CEST sequence was investigated for Iopamidol-based pH measurements using a 3T magnetic resonance (MR) scanner. The CEST sequence was applied to eight tubes filled with 100-mM Iopamidol solutions with pH values ranging from 5.6 to 7.0. Calibration curves for pH quantification were determined, and the dependence of the measured pH values on the concentration of Iopamidol was investigated. An in vivo measurement was performed in one patient who had undergone a previous contrast-enhanced computed tomography (CT) scan with Iopamidol, and the pH values of urine measured with CEST MRI and with a pH meter were compared. In the measured pH range, pH imaging using CEST imaging with pulse train presaturation was possible. No dependence of the measured pH value on the concentration of Iopamidol was observed. In the in vivo investigation, the pH values in the human bladder measured by the Iopamidol CEST sequence and in urine were consistent. Our study shows the feasibility of using CEST imaging with Iopamidol for quantitative pH mapping in vitro and in vivo on a 3T MR scanner.

  3. Precise Quantitation of MicroRNA in a Single Cell with Droplet Digital PCR Based on Ligation Reaction.

    PubMed

    Tian, Hui; Sun, Yuanyuan; Liu, Chenghui; Duan, Xinrui; Tang, Wei; Li, Zhengping

    2016-12-06

    MicroRNA (miRNA) analysis in a single cell is extremely important because it allows deep understanding of the exact correlation between the miRNAs and cell functions. Herein, we wish to report a highly sensitive and precisely quantitative assay for miRNA detection based on ligation-based droplet digital polymerase chain reaction (ddPCR), which permits the quantitation of miRNA in a single cell. In this ligation-based ddPCR assay, two target-specific oligonucleotide probes can be simply designed to be complementary to the half-sequence of the target miRNA, respectively, which avoids the sophisticated design of reverse transcription and provides high specificity to discriminate a single-base difference among miRNAs with simple operations. After the miRNA-templated ligation, the ddPCR partitions individual ligated products into a water-in-oil droplet and digitally counts the fluorescence-positive and negative droplets after PCR amplification for quantification of the target molecules, which possesses the power of precise quantitation and robustness to variation in PCR efficiency. By integrating the advantages of the precise quantification of ddPCR and the simplicity of the ligation-based PCR, the proposed method can sensitively measure let-7a miRNA with a detection limit of 20 aM (12 copies per microliter), and even a single-base difference can be discriminated in let-7 family members. More importantly, due to its high selectivity and sensitivity, the proposed method can achieve precise quantitation of miRNAs in single-cell lysate. Therefore, the ligation-based ddPCR assay may serve as a useful tool to exactly reveal the miRNAs' actions in a single cell, which is of great importance for the study of miRNAs' biofunction as well as for the related biomedical studies.
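
    For orientation, the Poisson correction at the heart of droplet digital PCR can be sketched as follows; the droplet counts and the 0.85 nL droplet volume are assumptions for illustration, not values from this assay.

    ```python
    import math

    def ddpcr_copies_per_ul(n_positive, n_total, droplet_volume_nl=0.85):
        """Poisson-corrected target concentration from positive/negative droplet counts."""
        neg_fraction = (n_total - n_positive) / n_total
        lam = -math.log(neg_fraction)            # mean target copies per droplet
        return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

    # Hypothetical counts for a ligation-ddPCR miRNA measurement
    print(ddpcr_copies_per_ul(n_positive=1300, n_total=15000))
    ```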

  4. Quantitative and qualitative 5-aminolevulinic acid–induced protoporphyrin IX fluorescence in skull base meningiomas

    PubMed Central

    Bekelis, Kimon; Valdés, Pablo A.; Erkmen, Kadir; Leblond, Frederic; Kim, Anthony; Wilson, Brian C.; Harris, Brent T.; Paulsen, Keith D.; Roberts, David W.

    2011-01-01

    Object Complete resection of skull base meningiomas provides patients with the best chance for a cure; however, surgery is frequently difficult given the proximity of lesions to vital structures, such as cranial nerves, major vessels, and venous sinuses. Accurate discrimination between tumor and normal tissue is crucial for optimal tumor resection. Qualitative assessment of protoporphyrin IX (PpIX) fluorescence following the exogenous administration of 5-aminolevulinic acid (ALA) has demonstrated utility in malignant glioma resection but limited use in meningiomas. Here the authors demonstrate the use of ALA-induced PpIX fluorescence guidance in resecting a skull base meningioma and elaborate on the advantages and disadvantages provided by both quantitative and qualitative fluorescence methodologies in skull base meningioma resection. Methods A 52-year-old patient with a sphenoid wing WHO Grade I meningioma underwent tumor resection as part of an institutional review board–approved prospective study of fluorescence-guided resection. A surgical microscope modified for fluorescence imaging was used for the qualitative assessment of visible fluorescence, and an intraoperative probe for in situ fluorescence detection was utilized for quantitative measurements of PpIX. The authors assessed the detection capabilities of both the qualitative and quantitative fluorescence approaches. Results The patient harboring a sphenoid wing meningioma with intraorbital extension underwent radical resection of the tumor with both visibly and nonvisibly fluorescent regions. The patient underwent a complete resection without any complications. Some areas of the tumor demonstrated visible fluorescence. The quantitative probe detected neoplastic tissue better than the qualitative modified surgical microscope. The intraoperative probe was particularly useful in areas that did not reveal visible fluorescence, and tissue from these areas was confirmed as tumor following histopathological

  5. Retinal status analysis method based on feature extraction and quantitative grading in OCT images.

    PubMed

    Fu, Dongmei; Tong, Hejun; Zheng, Shuang; Luo, Ling; Gao, Fulin; Minar, Jiri

    2016-07-22

    Optical coherence tomography (OCT) is widely used in ophthalmology for viewing the morphology of the retina, which is important for disease detection and for assessing therapeutic effect. The diagnosis of retinal diseases is based primarily on the subjective analysis of OCT images by trained ophthalmologists. This paper describes an automatic OCT image analysis method for computer-aided disease diagnosis, a critical part of eye fundus diagnosis. The study analyzed 300 OCT images acquired by an Optovue Avanti RTVue XR (Optovue Corp., Fremont, CA). First, a normal retinal reference model based on retinal boundaries was established. Two kinds of quantitative methods, based on geometric features and morphological features, were then proposed, and a retinal abnormality grading decision-making method was put forward and used in the analysis and evaluation of multiple OCT images. A detailed analysis process is shown for four retinal OCT images with different degrees of abnormality; the final grading results verified that the analysis method can distinguish abnormal severity and lesion regions. In a simulation on 150 test images, the analysis of retinal status achieved a sensitivity of 0.94 and a specificity of 0.92. The proposed method can speed up the diagnostic process and objectively evaluate retinal status. It obtains the parameters and features associated with retinal morphology, and quantitative analysis and evaluation of these features, combined with the reference model, allows abnormality judgment of the target image and provides a reference for disease diagnosis.

  6. Exploring the Perceptions of College Instructors towards Computer Simulation Software Programs: A Quantitative Study

    ERIC Educational Resources Information Center

    Punch, Raymond J.

    2012-01-01

    The purpose of the quantitative regression study was to explore and identify relationships between college instructors' attitudes toward the use of computer-based simulation programs and their perceptions of the value of such programs. A relationship has been reported between attitudes toward use and perceptions of the value of…

  7. [Quality evaluation of rhubarb dispensing granules based on multi-component simultaneous quantitative analysis and bioassay].

    PubMed

    Tan, Peng; Zhang, Hai-Zhu; Zhang, Ding-Kun; Wu, Shan-Na; Niu, Ming; Wang, Jia-Bo; Xiao, Xiao-He

    2017-07-01

    This study attempts to evaluate the quality of Chinese formula granules by the combined use of multi-component simultaneous quantitative analysis and bioassay. Rhubarb dispensing granules were used as the model drug for this demonstrative study. The ultra-high performance liquid chromatography (UPLC) method was adopted for simultaneous quantitative determination of 10 anthraquinone derivatives (such as aloe emodin-8-O-β-D-glucoside) in rhubarb dispensing granules; the purgative biopotency of different batches of rhubarb dispensing granules was determined based on a compound diphenoxylate tablets-induced mouse constipation model; the blood activating biopotency of different batches of rhubarb dispensing granules was determined based on an in vitro rat antiplatelet aggregation model; SPSS 22.0 statistical software was used for correlation analysis between the 10 anthraquinone derivatives and the purgative and blood activating biopotencies. The results of the multi-component simultaneous quantitative analysis showed that there was a great difference in chemical characterization and certain differences in purgative biopotency and blood activating biopotency among the 10 batches of rhubarb dispensing granules. The correlation analysis showed that the intensity of purgative biopotency was significantly correlated with the content of conjugated anthraquinone glycosides (P<0.01), and the intensity of blood activating biopotency was significantly correlated with the content of free anthraquinone (P<0.01). In summary, the combined use of multi-component simultaneous quantitative analysis and bioassay can achieve objective quantification and a more comprehensive reflection of the overall quality difference among different batches of rhubarb dispensing granules. Copyright© by the Chinese Pharmaceutical Association.

  8. A Quantitative Study of Oxygen as a Metabolic Regulator

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; LaManna, Joseph C.; Cabera, Marco E.

    2000-01-01

    An acute reduction in oxygen delivery to a tissue is associated with metabolic changes aimed at maintaining ATP homeostasis. However, given the complexity of the human bioenergetic system, it is difficult to determine quantitatively how cellular metabolic processes interact to maintain ATP homeostasis during stress (e.g., hypoxia, ischemia, and exercise). In particular, we are interested in determining the mechanisms relating cellular oxygen concentration to observed metabolic responses at the cellular, tissue, organ, and whole body levels, and in quantifying how changes in tissue oxygen availability affect the pathways of ATP synthesis and the metabolites that control these pathways. In this study, we extend a previously developed mathematical model of human bioenergetics to provide a physicochemical framework that permits quantitative understanding of oxygen as a metabolic regulator. Specifically, the enhancement, sensitivity analysis, permits studying the effects of variations in tissue oxygenation and parameters controlling cellular respiration on glycolysis, lactate production, and pyruvate oxidation. The analysis can distinguish between parameters that must be determined accurately and those that require less precision, based on their effects on model predictions. This capability may prove to be important in optimizing experimental design, thus reducing the use of animals.

  9. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  10. Applying quantitative adiposity feature analysis models to predict benefit of bevacizumab-based chemotherapy in ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; More, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2016-03-01

    How to rationally identify epithelial ovarian cancer (EOC) patients who will benefit from bevacizumab or other antiangiogenic therapies is a critical issue in EOC treatment. The motivation of this study is to quantitatively measure adiposity features from CT images and investigate the feasibility of predicting the potential benefit for EOC patients with or without bevacizumab-based chemotherapy using multivariate statistical models built on quantitative adiposity image features. A dataset involving CT images from 59 advanced EOC patients was included; 32 patients received maintenance bevacizumab after primary chemotherapy and the remaining 27 patients did not. We developed a computer-aided detection (CAD) scheme to automatically segment subcutaneous fat areas (SFA) and visceral fat areas (VFA) and then extracted 7 adiposity-related quantitative features. Three multivariate data analysis models (linear regression, logistic regression and Cox proportional hazards regression) were applied to investigate the potential association between the model-generated prediction results and the patients' progression-free survival (PFS) and overall survival (OS). The results show that with all 3 statistical models, a statistically significant association was detected between the model-generated results and both clinical outcomes in the group of patients receiving maintenance bevacizumab (p<0.01), while no significant association was found for either PFS or OS in the group of patients not receiving maintenance bevacizumab. This study therefore demonstrated the feasibility of using statistical prediction models based on quantitative adiposity-related CT image features to generate a new clinical marker and predict the clinical outcome of EOC patients receiving maintenance bevacizumab-based chemotherapy.

  11. A quantitative analysis of qualitative studies in clinical journals for the 2000 publishing year

    PubMed Central

    McKibbon, Kathleen Ann; Gadd, Cynthia S

    2004-01-01

    Background Qualitative studies are becoming increasingly recognized as important to understanding health care with all of its richness and complexities. The purpose of this descriptive survey was to provide a quantitative evaluation of the qualitative studies published in 170 core clinical journals for 2000. Methods All identified studies that used qualitative methods were reviewed to ascertain which clinical journals publish qualitative studies and to extract research methods, content (persons and health care issues studied), and whether mixed methods (quantitative and qualitative methods) were used. Results 60 330 articles were reviewed. 355 reports of original qualitative studies and 12 systematic review articles were identified in 48 journals. Most of the journals were in the discipline of nursing. Only 4 of the most highly cited health care journals, based on ISI Science Citation Index (SCI) Impact Factors, published qualitative studies. 37 of the 355 original reports used both qualitative and quantitative (mixed) methods. Patients and non-health care settings were the most common groups of people studied. Diseases and conditions included cancer, mental health, pregnancy and childbirth, and cerebrovascular disease, with many other diseases and conditions represented. Phenomenology and grounded theory were commonly used; substantial ethnography was also present. No substantial differences were noted for content or methods when articles published in all disciplines were compared with articles published in nursing titles, or when studies with mixed methods were compared with studies that included only qualitative methods. Conclusions The clinical literature includes many qualitative studies, although they are often published in nursing journals or journals with low SCI Impact Factors. Many qualitative studies incorporate both qualitative and quantitative methods. PMID:15271221

  12. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure–activity relationship

    Treesearch

    Hui Wang; Mingyue Jiang; Shujun Li; Chung-Yun Hse; Chunde Jin; Fangli Sun; Zhuo Li

    2017-01-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure–activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and...

  13. A Study of Cognitive Load for Enhancing Student’s Quantitative Literacy in Inquiry Lab Learning

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahman, T.; Alifiani, D. P.; Khoerunnisa, R. S.

    2017-09-01

    Students often find it difficult to appreciate the relevance of quantitative analysis and concept attainment in the science class. This study measured student cognitive load during an inquiry lab on the respiratory system aimed at improving quantitative literacy. Participants were 40 11th graders from a senior high school in Indonesia. After the lesson, the degree of mental effort students felt it took to complete the learning tasks was measured with a 28-item self-report on a 4-point Likert scale. A Task Complexity Worksheet was used to assess the processing of quantitative information, and a paper-based test was applied to assess participants' concept achievement. The results showed that the inquiry instruction induced relatively low mental effort, high information processing and high concept achievement.

  14. Quantitative evaluation methods of skin condition based on texture feature parameters.

    PubMed

    Pang, Hui; Chen, Tianhua; Wang, Xiaoyi; Chang, Zhineng; Shao, Siqi; Zhao, Jing

    2017-03-01

    In order to quantitatively evaluate the improvement of skin condition after using skin care products and beauty treatments, a quantitative evaluation method for skin surface state and texture is presented, which is convenient, fast and non-destructive. Human skin images were collected by image sensors. First, a median filter with a 3 × 3 window is applied, and the hairy pixels on the skin are accurately located according to the gray mean value and color information. Bilinear interpolation is used to modify the gray value of the hairy pixels in order to eliminate the negative effect of noise and fine hairs on the texture. After this pretreatment, the gray level co-occurrence matrix (GLCM) is calculated. On this basis, four characteristic parameters, the second moment, contrast, entropy and correlation, and their mean values are calculated at 45° intervals. A quantitative evaluation model of skin texture based on the GLCM is established, which can calculate comprehensive parameters of skin condition. Experiments show that skin-condition evaluations obtained with this method agree with biochemical skin evaluation methods and are fully consistent with human visual assessment. The method overcomes the skin damage and long waiting times of biochemical evaluation, as well as the subjectivity and fuzziness of visual evaluation, and achieves non-destructive, rapid and quantitative evaluation of skin condition. It can be used for health assessment or classification of skin condition, and can also quantitatively evaluate subtle improvements after using skin care products or beauty treatments.
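
    A minimal sketch of the GLCM feature extraction described above, assuming scikit-image ≥ 0.19 (graycomatrix/graycoprops); entropy is not provided by graycoprops and is computed directly from the normalized matrix. The input image is random placeholder data, not skin imagery.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)  # placeholder image

    # GLCM at distance 1 for 0, 45, 90 and 135 degrees (the abstract's 45-degree intervals)
    glcm = graycomatrix(img, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)

    asm = graycoprops(glcm, "ASM").mean()                # (angular) second moment
    contrast = graycoprops(glcm, "contrast").mean()
    correlation = graycoprops(glcm, "correlation").mean()
    entropy = np.mean([-np.sum(p[p > 0] * np.log2(p[p > 0]))
                       for p in (glcm[:, :, 0, a] for a in range(glcm.shape[3]))])
    print(asm, contrast, correlation, entropy)
    ```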

  15. Extending Theory-Based Quantitative Predictions to New Health Behaviors.

    PubMed

    Brick, Leslie Ann D; Velicer, Wayne F; Redding, Colleen A; Rossi, Joseph S; Prochaska, James O

    2016-04-01

    Traditional null hypothesis significance testing suffers many limitations and is poorly adapted to theory testing. A proposed alternative approach, called Testing Theory-based Quantitative Predictions, uses effect size estimates and confidence intervals to directly test predictions based on theory. This paper replicates findings from previous smoking studies and extends the approach to diet and sun protection behaviors using baseline data from a Transtheoretical Model behavioral intervention (N = 5407). Effect size predictions were developed using two methods: (1) applying refined effect size estimates from previous smoking research or (2) using predictions developed by an expert panel. Thirteen of 15 predictions were confirmed for smoking. For diet, 7 of 14 predictions were confirmed using smoking predictions and 6 of 16 using expert panel predictions. For sun protection, 3 of 11 predictions were confirmed using smoking predictions and 5 of 19 using expert panel predictions. Expert panel predictions and smoking-based predictions poorly predicted effect sizes for diet and sun protection constructs. Future studies should aim to use previous empirical data to generate predictions whenever possible. The best results occur when there have been several iterations of predictions for a behavior, such as with smoking, demonstrating that expected values begin to converge on the population effect size. Overall, the study supports necessity in strengthening and revising theory with empirical data.

  16. In silico quantitative structure-toxicity relationship study of aromatic nitro compounds.

    PubMed

    Pasha, Farhan Ahmad; Neaz, Mohammad Morshed; Cho, Seung Joo; Ansari, Mohiuddin; Mishra, Sunil Kumar; Tiwari, Sharvan

    2009-05-01

    Small molecules often have toxicities that are a function of molecular structural features, and minor variations in structure can make a large difference in such toxicity. Consequently, in silico techniques may be used to correlate molecular toxicities with structural features. For nine different sets of aromatic nitro compounds with known toxicities against different targets, we developed ligand-based 2D quantitative structure-toxicity relationship models using 20 selected topological descriptors. Topological descriptors have several advantages, such as conformational independence and fast, straightforward computation, while yielding good results. Multiple linear regression analysis was used to correlate variations in toxicity with molecular properties. The information index on molecular size, the lopping centric index and the Kier flexibility index were identified as fundamental descriptors for different kinds of toxicity, showing that molecular size, branching and molecular flexibility may be particularly important factors in quantitative structure-toxicity relationship analysis. This study revealed that topological descriptor-guided quantitative structure-toxicity relationships provide a useful, cost- and time-efficient in silico tool for describing small-molecule toxicities.
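
    A minimal sketch of the multiple-linear-regression step named in this record, fitted by ordinary least squares on hypothetical descriptor values rather than the paper's dataset.

    ```python
    import numpy as np

    # Rows: compounds; columns: hypothetical topological descriptors
    # (e.g. information index on size, lopping centric index, Kier flexibility index)
    X = np.array([[3.1, 0.42, 2.8],
                  [2.7, 0.35, 3.1],
                  [3.6, 0.51, 2.2],
                  [2.9, 0.38, 2.9],
                  [3.3, 0.47, 2.5]])
    y = np.array([4.2, 3.8, 4.9, 3.9, 4.5])     # observed toxicity, e.g. -log(LC50)

    X1 = np.column_stack([np.ones(len(y)), X])  # add intercept column
    coef, *_ = np.linalg.lstsq(X1, y, rcond=None)
    pred = X1 @ coef
    r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
    print(coef, r2)
    ```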

  17. Development and application of absolute quantitative detection by duplex chamber-based digital PCR of genetically modified maize events without pretreatment steps.

    PubMed

    Zhu, Pengyu; Fu, Wei; Wang, Chenguang; Du, Zhixin; Huang, Kunlun; Zhu, Shuifang; Xu, Wentao

    2016-04-15

    The possibility of the absolute quantitation of GMO events by digital PCR was recently reported. However, most absolute quantitation methods based on digital PCR require pretreatment steps, and singleplex detection cannot meet the demand for absolute quantitation of GMO events, which is based on the ratio of foreign fragments to reference genes. Thus, to promote the absolute quantitative detection of different GMO events by digital PCR, we developed a quantitative detection method based on duplex digital PCR without pretreatment. Seven GMO events were tested in our study to evaluate the fitness of our method. The optimized combination of foreign and reference primers, the limit of quantitation (LOQ), the limit of detection (LOD) and the specificity were validated. The results showed that the LOQ of our method for the different GMO events was 0.5%, while the LOD was 0.1%. Additionally, we found that duplex digital PCR achieved detection results with lower RSD than singleplex digital PCR. In summary, the duplex digital PCR detection system is a simple and stable way to achieve the absolute quantitation of different GMO events, and the LOQ and LOD indicate that this method is suitable for the daily detection and quantitation of GMO events. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Allelic-based gene-gene interaction associated with quantitative traits.

    PubMed

    Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M

    2009-05-01

    Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level that contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na-K-2Cl cotransporter gene (SLC12A1) that contributes to variation in diastolic blood pressure.
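
    As a loose, hedged sketch of the general idea (not the authors' scoring rule): assign each subject a score derived from the allelic combination at two unlinked SNPs and test its association with the quantitative trait by simple regression; all genotypes and trait values below are simulated.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 500
    snp1 = rng.integers(0, 3, n)   # minor-allele counts (0, 1, 2) at unlinked locus 1
    snp2 = rng.integers(0, 3, n)   # minor-allele counts at unlinked locus 2

    # Hypothetical allelic-combination score: total minor alleles carried at either
    # locus (a simple stand-in for the paper's combination-based scoring rule)
    score = snp1 + snp2

    trait = 0.3 * (snp1 * snp2) + rng.normal(0, 1, n)  # simulated interaction effect
    slope, intercept, r, p, se = stats.linregress(score, trait)
    print(f"slope={slope:.3f}, p={p:.3g}")
    ```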

  19. A new LC-MS based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous system

    PubMed Central

    Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.

    2018-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed O18-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718

  20. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

    Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, the conventional technique with a single exposure degrades the efficiency of tumor detection due to structural overlap. Dual-energy techniques with energy-integrating detectors (EIDs) also cause an increase in radiation dose and inaccuracy in material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues induced by the conventional technique and by EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved the quantitative accuracy as well as reduced the radiation dose compared with the dual-energy technique with an EID. The quantitative accuracy of the contrast-enhanced spectral mammography based on a PCD improved slightly as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.
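
    For orientation, an idealized (monoenergetic-bin) sketch of two-material decomposition from two photon-counting energy bins; the attenuation coefficients and measurements are invented, and the study's polychromatic dual-energy model is more involved than this linear solve.

    ```python
    import numpy as np

    # Effective linear attenuation coefficients (1/cm) in two PCD energy bins,
    # hypothetical values for the [tissue, iodinated-contrast] basis materials
    mu_low  = np.array([0.80, 8.0])
    mu_high = np.array([0.50, 3.0])

    def decompose(line_integral_low, line_integral_high):
        """Solve [mu_low; mu_high] @ t = measured log attenuation for thicknesses t."""
        A = np.vstack([mu_low, mu_high])
        b = np.array([line_integral_low, line_integral_high])
        return np.linalg.solve(A, b)   # [t_tissue_cm, t_contrast_cm]

    # Hypothetical measurement: -ln(I/I0) in each bin for one detector pixel
    print(decompose(4.1, 2.6))
    ```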

  1. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to its high spatial resolution and absence of radiation. Semi-quantitative and quantitative analyses of CMR perfusion are based on signal-intensity curves produced during the first pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template; differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
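
    For orientation only, a naive pooling of 2×2 counts (summing TP/FP/TN/FN across studies) with invented numbers is sketched below; a formal meta-analysis would instead use a bivariate or hierarchical model, which this sketch does not implement.

    ```python
    import numpy as np

    # Hypothetical per-study counts: columns are TP, FP, FN, TN
    counts = np.array([[80, 20, 12, 90],
                       [55, 10,  9, 70],
                       [95, 30, 15, 60]])
    tp, fp, fn, tn = counts.sum(axis=0)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    print(f"pooled sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
    ```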

  2. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study.

    PubMed

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R

    2017-01-01

    Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. 14 bone phantoms were designed with varying pore size, axial-, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2-5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this parameter. This work has shown the

  3. Towards assessing cortical bone porosity using low-frequency quantitative acoustics: A phantom-based study

    PubMed Central

    Vogl, Florian; Bernet, Benjamin; Bolognesi, Daniele; Taylor, William R.

    2017-01-01

    Purpose Cortical porosity is a key characteristic governing the structural properties and mechanical behaviour of bone, and its quantification is therefore critical for understanding and monitoring the development of various bone pathologies such as osteoporosis. Axial transmission quantitative acoustics has been shown to be a promising technique for assessing bone health in a fast, non-invasive, and radiation-free manner. One major hurdle in bringing this approach to clinical application is the entanglement of the effects of individual characteristics (e.g. geometry, porosity, anisotropy etc.) on the measured wave propagation. In order to address this entanglement problem, we therefore propose a systematic bottom-up approach, in which only one bone property is varied, before addressing interaction effects. This work therefore investigated the sensitivity of low-frequency quantitative acoustics to changes in porosity as well as individual pore characteristics using specifically designed cortical bone phantoms. Materials and methods 14 bone phantoms were designed with varying pore size, axial-, and radial pore number, resulting in porosities (bone volume fraction) between 0% and 15%, similar to porosity values found in human cortical bone. All phantoms were manufactured using laser sintering, measured using axial-transmission acoustics and analysed using a full-wave approach. Experimental results were compared to theoretical predictions based on a modified Timoshenko theory. Results A clear dependence of phase velocity on frequency and porosity produced by increasing pore size or radial pore number was demonstrated, with the velocity decreasing by between 2–5 m/s per percent of additional porosity, which corresponds to -0.5% to -1.0% of wave speed. While the change in phase velocity due to axial pore number was consistent with the results due to pore size and radial pore number, the relative uncertainties for the estimates were too high to draw any conclusions for this

  4. Comparing the MRI-based Goutallier Classification to an experimental quantitative MR spectroscopic fat measurement of the supraspinatus muscle.

    PubMed

    Gilbert, Fabian; Böhm, Dirk; Eden, Lars; Schmalzl, Jonas; Meffert, Rainer H; Köstler, Herbert; Weng, Andreas M; Ziegler, Dirk

    2016-08-22

    The Goutallier Classification is a semi-quantitative classification system to determine the amount of fatty degeneration in rotator cuff muscles. Although initially proposed for axial computed tomography scans, it is currently applied to magnetic resonance imaging (MRI) scans. Its role in clinical use is controversial, as the reliability of the classification has been shown to be inconsistent. The purpose of this study was to compare the semi-quantitative MRI-based Goutallier Classification applied by 5 different raters to an experimental quantitative MR spectroscopic fat measurement in order to determine the correlation between this classification system and the true extent of fatty degeneration shown by spectroscopy. MRI scans of 42 patients with rotator cuff tears were examined by 5 shoulder surgeons and were graded according to the MRI-based Goutallier Classification proposed by Fuchs et al. Additionally, the fat/water ratio was measured with MR spectroscopy using the experimental SPLASH technique. The semi-quantitative grading according to the Goutallier Classification was statistically correlated with the quantitatively measured fat/water ratio using Spearman's rank correlation. Statistical analysis of the data revealed only fair correlation between the Goutallier Classification system and the quantitative fat/water ratio, with R = 0.35 (p < 0.05). By dichotomizing the scale the correlation was 0.72. The interobserver and intraobserver reliabilities were substantial with R = 0.62 and R = 0.74 (p < 0.01). The correlation between the semi-quantitative MRI-based Goutallier Classification system and MR spectroscopic fat measurement is weak. As an adequate estimation of fatty degeneration based on standard MRI may not be possible, quantitative methods need to be considered in order to increase diagnostic safety and thus provide patients with ideal care in regard to the amount of fatty degeneration. Spectroscopic MR measurement may increase the accuracy of
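
    The rank correlation reported in this record can be reproduced in spirit with scipy; the grades and fat/water ratios below are invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical data: Goutallier grade (0-4) and spectroscopic fat/water ratio
    goutallier = np.array([0, 1, 1, 2, 2, 3, 4, 0, 2, 3])
    fat_ratio  = np.array([0.04, 0.08, 0.15, 0.12, 0.30, 0.28, 0.45, 0.10, 0.22, 0.18])

    rho, p = stats.spearmanr(goutallier, fat_ratio)
    print(f"Spearman rho={rho:.2f}, p={p:.3f}")

    # Dichotomizing the grade (<=1 vs >=2), echoing the study's secondary analysis
    rho2, p2 = stats.spearmanr(goutallier >= 2, fat_ratio)
    print(f"dichotomized rho={rho2:.2f}")
    ```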

  5. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application and one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, which is used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid, the dynamic Grid workflow, based on a remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  6. Quantitative detection of bovine and porcine gelatin difference using surface plasmon resonance based biosensor

    NASA Astrophysics Data System (ADS)

    Wardani, Devy P.; Arifin, Muhammad; Suharyadi, Edi; Abraha, Kamsul

    2015-05-01

    Gelatin is a biopolymer derived from collagen that is widely used in food and pharmaceutical products. Due to religious restrictions and health issues regarding the consumption of gelatin extracted from certain species, it is necessary to establish a robust, reliable, sensitive and simple quantitative method to detect gelatin from different parent collagen species. To the best of our knowledge, there has not been a gelatin differentiation method based on an optical sensor that can detect gelatin from different species quantitatively. Surface plasmon resonance (SPR) based biosensing is known to be a sensitive, simple and label-free optical method for detecting biomaterials that is capable of quantitative detection. Therefore, we have utilized an SPR-based biosensor to detect the difference between bovine and porcine gelatin at various concentrations, from 0% to 10% (w/w). Here, we report the ability of the SPR-based biosensor to distinguish between the two gelatins, its sensitivity to changes in gelatin concentration, its reliability, and its limit of detection (LOD) and limit of quantification (LOQ). The sensor's LOD and LOQ for bovine gelatin concentration are 0.38% and 1.26% (w/w), while those for porcine gelatin concentration are 0.66% and 2.20% (w/w), respectively. The results show that the SPR-based biosensor is a promising tool for quantitatively detecting gelatin from different raw materials.
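
    LOD and LOQ figures of this kind are commonly derived from a calibration curve as 3.3·σ/S and 10·σ/S (σ: residual standard deviation, S: slope); a minimal sketch with fabricated SPR calibration points, not the paper's data or necessarily its exact definition:

    ```python
    import numpy as np

    # Hypothetical calibration: gelatin concentration (% w/w) vs. SPR response shift
    conc  = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
    shift = np.array([0.00, 0.11, 0.23, 0.33, 0.46, 0.55])

    slope, intercept = np.polyfit(conc, shift, 1)
    residuals = shift - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)          # residual standard deviation of the linear fit

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"LOD={lod:.2f}%  LOQ={loq:.2f}% (w/w)")
    ```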

  7. Quantitative breast tissue characterization using grating-based x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Willner, M.; Herzen, J.; Grandl, S.; Auweter, S.; Mayr, D.; Hipp, A.; Chabior, M.; Sarapata, A.; Achterhold, K.; Zanette, I.; Weitkamp, T.; Sztrókay, A.; Hellerhoff, K.; Reiser, M.; Pfeiffer, F.

    2014-04-01

    X-ray phase-contrast imaging has received growing interest in recent years due to its high capability in visualizing soft tissue. Breast imaging became the focus of particular attention as it is considered the most promising candidate for a first clinical application of this contrast modality. In this study, we investigate quantitative breast tissue characterization using grating-based phase-contrast computed tomography (CT) at conventional polychromatic x-ray sources. Different breast specimens have been scanned at a laboratory phase-contrast imaging setup and were correlated to histopathology. Ascertained tumor types include phylloides tumor, fibroadenoma and infiltrating lobular carcinoma. Identified tissue types comprising adipose, fibroglandular and tumor tissue have been analyzed in terms of phase-contrast Hounsfield units and are compared to high-quality, high-resolution data obtained with monochromatic synchrotron radiation, as well as calculated values based on tabulated tissue properties. The results give a good impression of the method’s prospects and limitations for potential tumor detection and the associated demands on such a phase-contrast breast CT system. Furthermore, the evaluated quantitative tissue values serve as a reference for simulations and the design of dedicated phantoms for phase-contrast mammography.

  8. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast

    PubMed Central

    Pang, Wei; Coghill, George M.

    2015-01-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively, and the obtained simulation results are presented with an analysis. Finally the future development of the Morven framework for modelling dynamic biological systems is discussed. PMID:25864377

  9. Quantitative Study of Interface/Interphase in Epoxy/Graphene-Based Nanocomposites by Combining STEM and EELS.

    PubMed

    Liu, Yu; Hamon, Ann-Lenaig; Haghi-Ashtiani, Paul; Reiss, Thomas; Fan, Benhui; He, Delong; Bai, Jinbo

    2016-12-14

    A quantitative study of the interphase and interface of graphene nanoplatelets (GNPs)/epoxy and graphene oxide (GO)/epoxy was carried out by combining scanning transmission electron microscopy (STEM) and electron energy-loss spectroscopy (EELS). The interphase regions between GNPs and epoxy matrix were clearly identified by the discrepancy of the plasmon peak positions in the low energy-loss spectra due to different valence electron densities. The spectrum acquisitions were carried out along lines across the interface. An interphase thickness of 13 and 12.5 nm was measured for GNPs/epoxy and GO/epoxy, respectively. The density of the GNPs/epoxy interphase was 2.89% higher than that of the epoxy matrix. However, the density of the GO/epoxy interphase was 1.37% lower than that of the epoxy matrix. The interphase layer thickness measured in this work is in good agreement with the transition layer theory, which proposed an area with modulus linearly varying across a finite width. The results provide an insight into the interphase for carbon-based polymer composites that can help to design the functionalization of nanofillers to improve the composite properties.

  10. Modelling of occupational respirable crystalline silica exposure for quantitative exposure assessment in community-based case-control studies.

    PubMed

    Peters, Susan; Vermeulen, Roel; Portengen, Lützen; Olsson, Ann; Kendzia, Benjamin; Vincent, Raymond; Savary, Barbara; Lavoué, Jérôme; Cavallo, Domenico; Cattaneo, Andrea; Mirabelli, Dario; Plato, Nils; Fevotte, Joelle; Pesch, Beate; Brüning, Thomas; Straif, Kurt; Kromhout, Hans

    2011-11-01

    We describe an empirical model for exposure to respirable crystalline silica (RCS) to create a quantitative job-exposure matrix (JEM) for community-based studies. Personal measurements of exposure to RCS from Europe and Canada were obtained for exposure modelling. A mixed-effects model was developed, with region/country and job title as random effect terms. The fixed effect terms included year of measurement, measurement strategy (representative or worst-case), sampling duration (minutes) and an a priori exposure intensity rating for each job from an independently developed JEM (none, low, high). 23,640 personal RCS exposure measurements, covering a time period from 1976 to 2009, were available for modelling. The model indicated an overall downward time trend in RCS exposure levels of -6% per year. Exposure levels were higher in the UK and Canada, and lower in Northern Europe and Germany. Worst-case sampling was associated with higher reported exposure levels, and an increase in sampling duration was associated with lower reported exposure levels. The highest predicted RCS exposure levels in the reference year (1998) were for chimney bricklayers (geometric mean 0.11 mg m(-3)) and for monument carvers and other stone cutters and carvers (0.10 mg m(-3)). The resulting model enables us to predict time-, job-, and region/country-specific exposure levels of RCS. These predictions will be used in the SYNERGY study, an ongoing pooled multinational community-based case-control study on lung cancer.
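
    A hedged sketch of the kind of mixed-effects exposure model described here (random intercept for job title; fixed effects for year, sampling strategy, duration and a priori rating), using statsmodels on a fabricated data frame; the published model also included region/country terms, omitted for brevity.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 300
    df = pd.DataFrame({
        "job": rng.choice(["bricklayer", "stone_cutter", "driller", "grinder"], n),
        "year": rng.integers(1976, 2010, n),
        "worst_case": rng.integers(0, 2, n),           # 1 = worst-case sampling strategy
        "ln_duration": np.log(rng.uniform(60, 480, n)),
        "rating": rng.choice([0, 1, 2], n),            # a priori JEM rating: none/low/high
    })
    # Fabricated log-exposure with a downward time trend plus noise
    df["ln_rcs"] = (-0.06 * (df["year"] - 1998) + 0.5 * df["rating"]
                    + 0.3 * df["worst_case"] + rng.normal(0, 0.8, n))

    model = smf.mixedlm("ln_rcs ~ year + worst_case + ln_duration + C(rating)",
                        df, groups=df["job"])
    print(model.fit().summary())
    ```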

  11. Synthesizing Quantitative Evidence for Evidence-based Nursing: Systematic Review.

    PubMed

    Oh, Eui Geum

    2016-06-01

    As evidence-based practice has become an important issue in healthcare settings, the educational needs for knowledge and skills for the generation and utilization of healthcare evidence are increasing. A systematic review (SR), one way of generating evidence, is a synthesis of primary scientific evidence, which summarizes the best evidence on a specific clinical question using a transparent, a priori, protocol-driven approach. SR methodology requires critical appraisal of primary studies, data extraction in a reliable and repeatable way, and examination of the validity of the results. SRs are considered the highest form of evidence in the evidence hierarchy, as they systematically search, identify, and summarize the available evidence to answer a focused clinical question with particular attention to the methodological quality of studies or the credibility of opinion and text. The purpose of this paper is to provide an overview of the fundamental knowledge, principles and processes of SR. The focus of this paper is on SRs, especially for the synthesis of quantitative data from primary research studies that examine the effectiveness of healthcare interventions. To activate evidence-based nursing care in various healthcare settings, the best available scientific evidence is an essential component. This paper includes some examples to promote understanding. Copyright © 2016. Published by Elsevier B.V.

  12. Plasmonic Metasurfaces Based on Nanopin-Cavity Resonator for Quantitative Colorimetric Ricin Sensing.

    PubMed

    Fan, Jiao-Rong; Zhu, Jia; Wu, Wen-Gang; Huang, Yun

    2017-01-01

    In view of the toxic potential of a bioweapon threat, rapid visual recognition and sensing of ricin has been of considerable interest but remains a challenging task to date. In this study, a gold nanopin-based colorimetric sensor is developed that realizes a multicolor variation for ricin qualitative recognition and analysis. It is revealed that such plasmonic metasurfaces based on a nanopin-cavity resonator exhibit a reflective color appearance, due to the excitation of standing-wave resonances of narrow bandwidth in the visible region. This clear color variation is a consequence of the reflective color mixing defined by different resonant wavelengths. In addition, the colored metasurfaces show a sharp color difference in a narrow refractive index range, which makes them especially well-suited for sensing applications. Therefore, this antibody-functionalized nanopin-cavity biosensor features high sensitivity and fast response, allowing for visual quantitative ricin detection within the range of 10–120 ng mL⁻¹ (0.15 × 10⁻⁹–1.8 × 10⁻⁹ M), a limit of detection of 10 ng mL⁻¹, and a typical measurement time of less than 10 min. The on-chip integration of such nanopin metasurfaces into a portable colorimetric microfluidic device may be envisaged for the quantitative study of a variety of biochemical molecules. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A quantitative risk-based model for reasoning over critical system properties

    NASA Technical Reports Server (NTRS)

    Feather, M. S.

    2002-01-01

    This position paper suggests the use of a quantitative risk-based model to help support reasoning and decision making that spans many of the critical properties such as security, safety, survivability, fault tolerance, and real-time.

  14. The Brain Network for Deductive Reasoning: A Quantitative Meta-analysis of 28 Neuroimaging Studies

    PubMed Central

    Prado, Jérôme; Chadha, Angad; Booth, James R.

    2011-01-01

    Over the course of the past decade, contradictory claims have been made regarding the neural bases of deductive reasoning. Researchers have been puzzled by apparent inconsistencies in the literature. Some have even questioned the effectiveness of the methodology used to study the neural bases of deductive reasoning. However, the idea that neuroimaging findings are inconsistent is not based on any quantitative evidence. Here, we report the results of a quantitative meta-analysis of 28 neuroimaging studies of deductive reasoning published between 1997 and 2010, totaling 382 participants. Consistent areas of activation across studies were identified using the multilevel kernel density analysis method. We found that results from neuroimaging studies are more consistent than has previously been assumed. Overall, studies consistently report activations in specific regions of a left fronto-parietal system, as well as in the left basal ganglia. This brain system can be decomposed into three subsystems that are specific to particular types of deductive arguments: relational, categorical, and propositional. These dissociations explain inconsistencies in the literature. However, they are incompatible with the notion that deductive reasoning is supported by a single cognitive system relying either on visuospatial or rule-based mechanisms. Our findings provide critical insight into the cognitive organization of deductive reasoning and need to be accounted for by cognitive theories. PMID:21568632

  15. Critical Quantitative Study of Immigrant Students

    ERIC Educational Resources Information Center

    Conway, Katherine M.

    2014-01-01

    The author discusses the importance of critical quantitative research for studies of immigrant students, a large and growing group, whose higher education experience is crucial to the future of the United States. The author outlines some of the distinctions to be made among immigrant students and recommends areas of future inquiry.

  16. Quantitative Residual Strain Analyses on Strain Hardened Nickel Based Alloy

    NASA Astrophysics Data System (ADS)

    Yonezawa, Toshio; Maeguchi, Takaharu; Goto, Toru; Juan, Hou

    Many papers have reported on the effects of strain hardening by cold rolling, grinding, welding, etc. on the stress corrosion cracking susceptibility of nickel-based alloys and austenitic stainless steels for LWR pipings and components. However, the residual strain introduced by cold rolling, grinding, welding, etc. has rarely been evaluated quantitatively.

  17. The Quantitative Evaluation of the Clinical and Translational Science Awards (CTSA) Program Based on Science Mapping and Scientometric Analysis

    PubMed Central

    Zhang, Yin; Wang, Lei

    2013-01-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included assisting its members in building a robust academic home for clinical and translational science and in attracting additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. PMID:24330689

  18. The quantitative evaluation of the Clinical and Translational Science Awards (CTSA) program based on science mapping and scientometric analysis.

    PubMed

    Zhang, Yin; Wang, Lei; Diao, Tianxi

    2013-12-01

    The Clinical and Translational Science Awards (CTSA) program is one of the most important initiatives in translational medical funding. The quantitative evaluation of the efficiency and performance of the CTSA program has a significant referential meaning for the decision making of global translational medical funding. Using science mapping and scientometric analytic tools, this study quantitatively analyzed the scientific articles funded by the CTSA program. The results of the study showed that the quantitative productivity of the CTSA program has increased steadily since 2008. In addition, the emerging trends of the research funded by the CTSA program covered clinical and basic medical research fields. The academic benefits of the CTSA program included assisting its members in building a robust academic home for clinical and translational science and in attracting additional financial support. This study provided a quantitative evaluation of the CTSA program based on science mapping and scientometric analysis. Further research is required to compare and optimize other quantitative methods and to integrate various research results. © 2013 Wiley Periodicals, Inc.

  19. Qualitative, semi-quantitative, and quantitative simulation of the osmoregulation system in yeast.

    PubMed

    Pang, Wei; Coghill, George M

    2015-05-01

    In this paper we demonstrate how Morven, a computational framework which can perform qualitative, semi-quantitative, and quantitative simulation of dynamical systems using the same model formalism, is applied to study the osmotic stress response pathway in yeast. First, the Morven framework itself is briefly introduced in terms of the model formalism employed and the output format. We then built a qualitative model for the biophysical process of osmoregulation in yeast, and a global qualitative-level picture was obtained through qualitative simulation of this model. Furthermore, we constructed a Morven model based on an existing quantitative model of the osmoregulation system. This model was then simulated qualitatively, semi-quantitatively, and quantitatively. The simulation results obtained are presented with an analysis. Finally, the future development of the Morven framework for modelling dynamic biological systems is discussed. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  20. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and are especially useful for samples that demand in vitro labeling. Due to the diversity of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic and flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging-based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant make it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.

  1. Quantitative study of tectonic geomorphology along Haiyuan fault based on airborne LiDAR

    USGS Publications Warehouse

    Chen, Tao; Zhang, Pei Zhen; Liu, Jing; Li, Chuan You; Ren, Zhi Kun; Hudnut, Kenneth W.

    2014-01-01

    High-precision and high-resolution topography are the fundamental data for active fault research. Light detection and ranging (LiDAR) presents a new approach to building detailed digital elevation models effectively. We take the Haiyuan fault in Gansu Province as an example of how LiDAR data may be used to improve the study of active faults and the risk assessment of related hazards. In the eastern segment of the Haiyuan fault, the Shaomayin site has been comprehensively investigated in previous research because of its exemplary tectonic topographic features. Based on the high-resolution LiDAR data, the horizontal and vertical coseismic offsets at the Shaomayin site are described. The measured horizontal value is about 8.6 m, and the vertical value is about 0.8 m. Using prior dating ages sampled from the same location, we estimate the horizontal slip rate as 4.0 ± 1.0 mm/a with high confidence and define the lower bound of the vertical slip rate as 0.4 ± 0.1 mm/a since the Holocene. Measurements of tectonic landform offsets made from the LiDAR data reproduce field measurements quite well. The offset landforms can easily be visualized on an office computer workstation, and specialized software may be used to obtain displacements quantitatively. By combining these offsets with existing chronological results, the fundamental link between fault activity and large earthquakes, as well as the potential risk of future earthquake hazards, can be better recognized.
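
    As an illustration of the arithmetic behind such slip-rate estimates (not the authors' exact calculation), a slip rate is an offset divided by the age of the offset feature; the dating age below is a hypothetical value chosen only so the numbers land near the reported 4.0 ± 1.0 mm/a.

      # Illustrative slip-rate arithmetic with a crude relative-error propagation.
      offset_m, offset_err_m = 8.6, 1.0      # horizontal offset (uncertainty assumed)
      age_a, age_err_a = 2150.0, 400.0       # hypothetical dating age in years

      rate_mm_per_a = offset_m * 1000.0 / age_a
      rel_err = ((offset_err_m / offset_m) ** 2 + (age_err_a / age_a) ** 2) ** 0.5
      print(f"slip rate ~ {rate_mm_per_a:.1f} +/- {rate_mm_per_a * rel_err:.1f} mm/a")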

  2. Quantitative insights for the design of substrate-based SIRT1 inhibitors.

    PubMed

    Kokkonen, Piia; Mellini, Paolo; Nyrhilä, Olli; Rahnasto-Rilla, Minna; Suuronen, Tiina; Kiviranta, Päivi; Huhtiniemi, Tero; Poso, Antti; Jarho, Elina; Lahtela-Kakkonen, Maija

    2014-08-01

    Sirtuin 1 (SIRT1) is the most studied human sirtuin and it catalyzes the deacetylation of acetylated lysine residues of its target proteins, for example histones. It is a promising drug target in the treatment of age-related diseases, such as neurodegenerative diseases and cancer. In this study, a series of known substrate-based sirtuin inhibitors was analyzed with comparative molecular field analysis (CoMFA), a three-dimensional quantitative structure-activity relationship (3D-QSAR) technique. The CoMFA model was validated both internally and externally, yielding a concordance correlation coefficient (CCC) of 0.88, a mean r²m of 0.66 and a Q²F3 of 0.89. Based on the CoMFA interaction contours, 13 new potential inhibitors with high predicted activity were designed, and the activities were verified by in vitro measurements. This work proposes an effective approach for the design and activity prediction of new potential substrate-based SIRT1 inhibitors. Copyright © 2014 Elsevier B.V. All rights reserved.
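
    Among the validation statistics quoted above is the concordance correlation coefficient (CCC). A short sketch of how Lin's CCC is computed from observed versus predicted activities is given below; the values are made up for illustration.

      # Lin's concordance correlation coefficient between observed and predicted
      # activities, as used to validate QSAR models. Values are invented.
      import numpy as np

      observed  = np.array([5.1, 6.3, 7.0, 5.8, 6.9, 7.4])
      predicted = np.array([5.3, 6.0, 7.2, 5.6, 6.7, 7.6])

      def ccc(x, y):
          sxy = np.cov(x, y, bias=True)[0, 1]
          return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

      print(f"CCC = {ccc(observed, predicted):.3f}")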

  3. Systematic assessment of survey scan and MS2-based abundance strategies for label-free quantitative proteomics using high-resolution MS data.

    PubMed

    Tu, Chengjian; Li, Jun; Sheng, Quanhu; Zhang, Ming; Qu, Jun

    2014-04-04

    Survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used; however, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed that IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R² > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts, but SpC did not; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positive/false-negative rates than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance compared with the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery.

  4. Systematic Assessment of Survey Scan and MS2-Based Abundance Strategies for Label-Free Quantitative Proteomics Using High-Resolution MS Data

    PubMed Central

    2015-01-01

    Survey-scan-based label-free methods have shown no compelling benefit over fragment ion (MS2)-based approaches when low-resolution mass spectrometry (MS) was used; however, the growing prevalence of high-resolution analyzers may have changed the game. This necessitates an updated, comparative investigation of these approaches for data acquired by high-resolution MS. Here, we compared survey scan-based (ion current, IC) and MS2-based abundance features, including spectral count (SpC) and MS2 total ion current (MS2-TIC), for quantitative analysis using various high-resolution LC/MS data sets. Key discoveries include: (i) a study with seven different biological data sets revealed that only IC achieved high reproducibility for lower-abundance proteins; (ii) evaluation with 5-replicate analyses of a yeast sample showed that IC provided much higher quantitative precision and less missing data; (iii) IC, SpC, and MS2-TIC all showed good quantitative linearity (R² > 0.99) over a >1000-fold concentration range; (iv) both MS2-TIC and IC showed good linear response to various protein loading amounts, but SpC did not; (v) quantification using a well-characterized CPTAC data set showed that IC exhibited markedly higher quantitative accuracy, higher sensitivity, and lower false-positive/false-negative rates than both SpC and MS2-TIC. Therefore, IC achieved an overall superior performance compared with the MS2-based strategies in terms of reproducibility, missing data, quantitative dynamic range, quantitative accuracy, and biomarker discovery. PMID:24635752
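
    A toy illustration of the two classes of abundance features compared in this work, computed from a hypothetical peptide-level table: the survey-scan ion current (summed MS1 peptide intensity per protein) and the spectral count (number of identified MS2 spectra per protein). Column names and numbers are invented.

      # Ion current vs. spectral count from a hypothetical peptide table.
      import pandas as pd

      peptides = pd.DataFrame({
          "protein": ["P1", "P1", "P1", "P2", "P2"],
          "ms1_intensity": [2.0e7, 5.5e6, 1.2e7, 8.0e5, 9.5e5],
          "spectra": [3, 1, 2, 1, 1],        # MS2 spectra matched to each peptide
      })

      summary = peptides.groupby("protein").agg(
          ion_current=("ms1_intensity", "sum"),
          spectral_count=("spectra", "sum"),
      )
      print(summary)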

  5. Validating internal controls for quantitative plant gene expression studies

    PubMed Central

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-01-01

    Background: Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Results: Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Conclusion: Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments. PMID:15317655

  6. Validating internal controls for quantitative plant gene expression studies.

    PubMed

    Brunner, Amy M; Yakovlev, Igor A; Strauss, Steven H

    2004-08-18

    Real-time reverse transcription PCR (RT-PCR) has greatly improved the ease and sensitivity of quantitative gene expression studies. However, accurate measurement of gene expression with this method relies on the choice of a valid reference for data normalization. Studies rarely verify that gene expression levels for reference genes are adequately consistent among the samples used, nor compare alternative genes to assess which are most reliable for the experimental conditions analyzed. Using real-time RT-PCR to study the expression of 10 poplar (genus Populus) housekeeping genes, we demonstrate a simple method for determining the degree of stability of gene expression over a set of experimental conditions. Based on a traditional method for analyzing the stability of varieties in plant breeding, it defines measures of gene expression stability from analysis of variance (ANOVA) and linear regression. We found that the potential internal control genes differed widely in their expression stability over the different tissues, developmental stages and environmental conditions studied. Our results support that quantitative comparisons of candidate reference genes are an important part of real-time RT-PCR studies that seek to precisely evaluate variation in gene expression. The method we demonstrated facilitates statistical and graphical evaluation of gene expression stability. Selection of the best reference gene for a given set of experimental conditions should enable detection of biologically significant changes in gene expression that are too small to be revealed by less precise methods, or when highly variable reference genes are unknowingly used in real-time RT-PCR experiments.
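
    As a minimal illustration of screening a candidate reference gene across conditions (not the authors' exact stability measure, which combines ANOVA with linear regression), a one-way ANOVA can be run on its expression values; a small F statistic and large p-value are consistent with stable expression. The numbers below are invented.

      # One-way ANOVA on a candidate reference gene across three tissues.
      from scipy.stats import f_oneway

      leaf  = [20.1, 19.8, 20.4, 20.0]   # hypothetical normalized expression values
      root  = [20.3, 20.0, 19.9, 20.2]
      xylem = [19.7, 20.2, 20.1, 19.9]

      f_stat, p_value = f_oneway(leaf, root, xylem)
      print(f"F = {f_stat:.2f}, p = {p_value:.3f}")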

  7. Quantitative evaluation of specific vulnerability to nitrate for groundwater resource protection based on process-based simulation model.

    PubMed

    Huan, Huan; Wang, Jinsheng; Zhai, Yuanzheng; Xi, Beidou; Li, Juan; Li, Mingxiao

    2016-04-15

    It has been proved that groundwater vulnerability assessment is an effective tool for groundwater protection. Nowadays, quantitative assessment methods for specific vulnerability are scarce due to limited understanding of the complicated contaminant fate and transport processes in the groundwater system. In this paper, a process-based simulation model of specific vulnerability to nitrate, using a 1D flow and solute transport model in the unsaturated vadose zone, is presented for groundwater resource protection. For this case study in Jilin City in northeast China, rate constants of denitrification and nitrification as well as adsorption constants of ammonium and nitrate in the vadose zone were acquired by laboratory experiments. The transfer time to the groundwater table, t50, was taken as the specific vulnerability indicator. Finally, overall vulnerability was assessed by establishing the relationship between groundwater net recharge, layer thickness and t50. The results suggested that the most vulnerable regions of Jilin City were mainly distributed in the floodplain of the Songhua River and Mangniu River. The least vulnerable areas mostly appear in the second terrace and the back of the first terrace. The overall area of low, relatively low and moderate vulnerability accounted for 76% of the study area, suggesting a relatively low possibility of suffering nitrate contamination. In addition, the sensitivity analysis showed that the most sensitive factors of specific vulnerability in the vadose zone included the groundwater net recharge rate, the physical properties of the soil medium and the rate constants of nitrate denitrification. By validating the suitability of the process-based simulation model for specific vulnerability and comparing it with an index-based method using a group of integrated indicators, more realistic and accurate specific vulnerability mapping could be acquired with the process-based simulation model. In addition, the advantages, disadvantages, constraint conditions and

  8. Stable isotopic labeling-based quantitative targeted glycomics (i-QTaG).

    PubMed

    Kim, Kyoung-Jin; Kim, Yoon-Woo; Kim, Yun-Gon; Park, Hae-Min; Jin, Jang Mi; Hwan Kim, Young; Yang, Yung-Hun; Kyu Lee, Jun; Chung, Junho; Lee, Sun-Gu; Saghatelian, Alan

    2015-01-01

    Mass spectrometry (MS) analysis combined with stable isotopic labeling is a promising method for the relative quantification of aberrant glycosylation in diseases and disorders. We developed a stable isotopic labeling-based quantitative targeted glycomics (i-QTaG) technique for the comparative and quantitative analysis of total N-glycans using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). We established the analytical procedure with the chemical derivatizations (i.e., sialic acid neutralization and stable isotopic labeling) of N-glycans using a model glycoprotein (bovine fetuin). Moreover, the i-QTaG method using MALDI-TOF MS was evaluated with various molar ratios (1:1, 1:2, 1:5) of ¹³C₆/¹²C₆-2-aminobenzoic acid-labeled glycans from normal human serum. Finally, this method was applied to the direct comparison of the total N-glycan profiles between normal human sera (n = 8) and prostate cancer patient sera (n = 17). The intensities of the N-glycan peaks from the i-QTaG method showed good linearity (R² > 0.99) with the amount of the bovine fetuin glycoprotein. The ratios of relative intensity between the isotopically 2-AA-labeled N-glycans were close to the theoretical molar ratios (1:1, 1:2, 1:5). We also demonstrated up-regulation of the Lewis antigen (~82%) in sera from prostate cancer patients. In this proof-of-concept study, we demonstrated that the i-QTaG method, which enables reliable comparative quantitation of total N-glycans via MALDI-TOF MS analysis, has the potential to diagnose and monitor alterations in glycosylation associated with disease states or biotherapeutics. © 2015 American Institute of Chemical Engineers.

  9. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
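
    A sketch of the general idea of DE-based wavelength selection on synthetic data (not the authors' implementation): continuous weights in [0, 1] are evolved, thresholded at 0.5 to select spectral points, and scored by the calibration error of a least-squares model plus a small sparsity penalty.

      # DE-based wavelength selection on synthetic spectra (illustration only).
      import numpy as np
      from scipy.optimize import differential_evolution

      rng = np.random.default_rng(1)
      n_samples, n_wavelengths = 40, 30
      spectra = rng.normal(size=(n_samples, n_wavelengths))
      true_coef = np.zeros(n_wavelengths)
      true_coef[[3, 7, 12]] = [1.0, -0.5, 2.0]       # only a few informative points
      concentration = spectra @ true_coef + rng.normal(scale=0.05, size=n_samples)

      def objective(weights):
          selected = weights > 0.5
          if not selected.any():
              return 1e6                              # penalize empty selections
          X = spectra[:, selected]
          coef, *_ = np.linalg.lstsq(X, concentration, rcond=None)
          rmse = float(np.sqrt(np.mean((X @ coef - concentration) ** 2)))
          return rmse + 0.02 * selected.sum()         # small sparsity penalty

      result = differential_evolution(objective, [(0.0, 1.0)] * n_wavelengths,
                                      maxiter=50, seed=1, polish=False)
      print("selected wavelengths:", np.flatnonzero(result.x > 0.5))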

  10. Publications on dementia in Medline 1974-2009: a quantitative bibliometric study.

    PubMed

    Theander, Sten S; Gustafson, Lars

    2013-05-01

    The aim is to describe the development of the scientific literature on dementia. We present a quantitative, bibliometric study of the literature on dementia, based on Medline, covering 36 years (1974-2009). Two samples of references to dementia papers were retrieved; the main sample, based on the MeSH term Dementia, holds more than 88,500 references. We compared the annual additions of references on dementia with the additions to total Medline. Changes in 'the Dementia to Medline ratio' (%) give the best information on the development. Publications on dementia increased 5.6 times faster than Medline. Most of this relative acceleration took place during 1980-1997, when the references on dementia increased from 0.17 to 0.78%. During the most recent 12 years, publications on dementia have been keeping pace with Medline and have stabilized around 0.8%. We have shown a large increase in the literature on dementia, relative both to the development of all medical research and to all psychiatric research. The bibliometric approach may be questioned, as quantitative methods treat articles as being of equal value, which is not true. If, for example, during a certain period, the research output is 'inflated' by a great number of repetitive papers, the quantitative method will give an unfair picture of the development. Our relative method, however, will give relevant results as, at each point in time, the proportion of 'valuable research' ought to be about the same in the dementia group as in total Medline. Copyright © 2012 John Wiley & Sons, Ltd.

  11. Patient expectations of treatment for back pain: a systematic review of qualitative and quantitative studies.

    PubMed

    Verbeek, Jos; Sengers, Marie-José; Riemens, Linda; Haafkens, Joke

    2004-10-15

    A systematic review of qualitative and quantitative studies. To summarize evidence from studies among patients with low back pain on their expectations of and satisfaction with treatment, as part of practice guideline development. Patients are often dissatisfied with treatment for acute or chronic back pain. We searched the literature for studies on patient expectations and satisfaction with treatment for low back pain. Treatment aspects related to expectations or satisfaction were identified in qualitative studies. Percentages of dissatisfied patients were calculated from quantitative studies. Twelve qualitative and eight quantitative studies were found. Qualitative studies revealed the following aspects that patients expect from treatment for back pain or with which they are dissatisfied. Patients want a clear diagnosis of the cause of their pain, information and instructions, pain relief, and a physical examination. Further expectations include more diagnostic tests, other therapy or referrals to specialists, and sickness certification. They expect confirmation from the healthcare provider that their pain is real. Like other patients, they want a confidence-based relationship that includes understanding, listening, respect, and being included in decision-making. The results from qualitative studies are confirmed by quantitative studies. Patients have explicit expectations on diagnosis, instructions, and interpersonal management. New strategies need to be developed in order to meet patients' expectations better. Practice guidelines should pay more attention to the best way of discussing the causes and diagnosis with the patient and should involve them in the decision-making process.

  12. Genetic programming based quantitative structure-retention relationships for the prediction of Kovats retention indices.

    PubMed

    Goel, Purva; Bapat, Sanket; Vyas, Renu; Tambe, Amruta; Tambe, Sanjeev S

    2015-11-13

    The development of quantitative structure-retention relationships (QSRR) aims at constructing an appropriate linear/nonlinear model for the prediction of the retention behavior (such as the Kovats retention index) of a solute on a chromatographic column. Commonly, multi-linear regression and artificial neural networks are used in QSRR development in gas chromatography (GC). In this study, an artificial-intelligence-based data-driven modeling formalism, namely genetic programming (GP), has been introduced for the development of quantitative structure-based models predicting Kovats retention indices (KRI). The novelty of the GP formalism is that, given an example dataset, it searches and optimizes both the form (structure) and the parameters of an appropriate linear/nonlinear data-fitting model. Thus, it is not necessary to pre-specify the form of the data-fitting model in GP-based modeling. These models are also less complex, simple to understand, and easy to deploy. The effectiveness of GP in constructing QSRRs has been demonstrated by developing models predicting KRIs of light hydrocarbons (case study I) and adamantane derivatives (case study II). In each case study, two-, three- and four-descriptor models have been developed using the KRI data available in the literature. The results of these studies clearly indicate that the GP-based models possess excellent KRI prediction accuracy and generalization capability. Specifically, the best performing four-descriptor models in both case studies have yielded high (>0.9) values of the coefficient of determination (R²) and low values of root mean squared error (RMSE) and mean absolute percent error (MAPE) for training, test and validation set data. The characteristic feature of this study is that it introduces a practical and effective GP-based method for developing QSRRs in gas chromatography that can be gainfully utilized for developing other types of data-driven models in chromatography science

  13. Fragment-based quantitative structure-activity relationship (FB-QSAR) for fragment-based drug design.

    PubMed

    Du, Qi-Shi; Huang, Ri-Bo; Wei, Yu-Tuo; Pang, Zong-Wen; Du, Li-Qin; Chou, Kuo-Chen

    2009-01-30

    In cooperation with fragment-based design, a new drug design method, the so-called "fragment-based quantitative structure-activity relationship" (FB-QSAR), is proposed. The essence of the new method is that the molecular framework of a family of drug candidates is divided into several fragments according to the substituents being investigated. The bioactivities of the molecules are correlated with the physicochemical properties of the molecular fragments through two sets of coefficients in the linear free energy equations: one coefficient set for the physicochemical properties and the other for the weight factors of the molecular fragments. Meanwhile, an iterative double least square (IDLS) technique is developed to solve for the two sets of coefficients alternately and iteratively on a training data set. The IDLS technique is a feedback procedure with machine learning ability. The standard two-dimensional quantitative structure-activity relationship (2D-QSAR) is a special case of FB-QSAR in which the whole molecule is treated as one entity. The FB-QSAR approach can remarkably enhance the predictive power and provide more structural insights into rational drug design. As an example, FB-QSAR is applied to build a predictive model of neuraminidase inhibitors for drug development against the H5N1 influenza virus. (c) 2008 Wiley Periodicals, Inc.
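
    A sketch of the alternating "iterative double least square" idea described above, on synthetic data: bioactivity is modelled as a weighted sum over fragments of a linear combination of fragment properties, and the property coefficients and fragment weight factors are solved by ordinary least squares in turn until the fit stabilizes. The tensor layout and values are invented for illustration.

      # IDLS-style alternating least squares: y_i = sum_f w_f * sum_p c_p * X[i,f,p].
      import numpy as np

      rng = np.random.default_rng(2)
      n_mol, n_frag, n_prop = 60, 4, 3
      X = rng.normal(size=(n_mol, n_frag, n_prop))   # fragment property tensor
      w_true = np.array([1.0, 0.5, 2.0, 1.5])
      c_true = np.array([0.8, -1.2, 0.4])
      y = np.einsum("ifp,f,p->i", X, w_true, c_true) + rng.normal(scale=0.05, size=n_mol)

      w, c = np.ones(n_frag), np.ones(n_prop)
      for _ in range(50):
          A = np.einsum("ifp,f->ip", X, w)           # fix w, solve for c
          c, *_ = np.linalg.lstsq(A, y, rcond=None)
          B = np.einsum("ifp,p->if", X, c)           # fix c, solve for w
          w, *_ = np.linalg.lstsq(B, y, rcond=None)

      pred = np.einsum("ifp,f,p->i", X, w, c)        # w and c identified up to scale
      print("R^2 =", 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2))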

  14. Multicomponent quantitative spectroscopic analysis without reference substances based on ICA modelling.

    PubMed

    Monakhova, Yulia B; Mushtakova, Svetlana P

    2017-05-01

    A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
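
    A sketch of the ICA step on synthetic overlapping spectra (not the authors' exact workflow or algorithm choice): FastICA resolves the mixture spectra into component-like spectral profiles, and the associated mixing coefficients can then be calibrated against the known concentrations of a calibration set. ICA recovers components only up to scale and sign, which is why such a calibration step is needed.

      # FastICA decomposition of synthetic two-component mixture spectra.
      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      wavelength = np.linspace(200, 400, 300)
      peak = lambda centre, width: np.exp(-((wavelength - centre) / width) ** 2)
      sources = np.vstack([peak(260, 15), peak(280, 20)])       # two pure spectra
      concentrations = rng.uniform(0.1, 1.0, size=(20, 2))      # 20 mixtures
      mixtures = concentrations @ sources + rng.normal(scale=0.01, size=(20, 300))

      ica = FastICA(n_components=2, random_state=0)
      profiles = ica.fit_transform(mixtures.T)   # (300, 2) resolved spectral profiles
      coeffs = ica.mixing_                       # (20, 2) per-mixture coefficients
      print(profiles.shape, coeffs.shape)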

  15. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed ¹⁸O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  16. Sieve-based device for MALDI sample preparation. III. Its power for quantitative measurements.

    PubMed

    Molin, Laura; Cristoni, Simone; Seraglia, Roberta; Traldi, Pietro

    2011-02-01

    Solid sample inhomogeneity is a weak point of traditional MALDI deposition techniques that reflects negatively on quantitative analysis. The recently developed sieve-based device (SBD) sample deposition method, based on the electrospraying of matrix/analyte solutions through a grounded sieve, allows the homogeneous deposition of microcrystals with dimensions smaller than that of the laser spot. In each microcrystal the matrix/analyte molar ratio can be considered constant. Then, by irradiating different portions of the microcrystal distribution, an identical response is obtained. This result suggests the employment of SBD in the development of quantitative procedures. For this aim, mixtures of different proteins of known molarity were analyzed, showing a good relationship between molarity and intensity ratios. This behaviour was also observed in the case of proteins with quite different ionic yields. The power of the developed method for quantitative evaluation was also tested by measuring the abundance of IGPP[Oxi]GPP[Oxi]GLMGPP (m/z 1219), present in the collagen-α-5(IV) chain precursor and differentially expressed in urine from healthy subjects and diabetic-nephropathic patients, confirming its overexpression in the presence of nephropathy. The data obtained indicate that SBD is a particularly effective method for quantitative analysis, including in biological fluids of interest. Copyright © 2011 John Wiley & Sons, Ltd.

  17. A quantitative study of gully erosion based on object-oriented analysis techniques: a case study in Beiyanzikou catchment of Qixia, Shandong, China.

    PubMed

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system in the Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted by linear regression to the elevations of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between the field-measured points along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m² and 5074.1790 m³, and 1316.1250 m² and 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion.

  18. A Quantitative Study of Gully Erosion Based on Object-Oriented Analysis Techniques: A Case Study in Beiyanzikou Catchment of Qixia, Shandong, China

    PubMed Central

    Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo

    2014-01-01

    This paper took a subregion of a small watershed gully system in the Beiyanzikou catchment of Qixia, China, as a study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder lines of the gullies. Finally, the original surface was fitted by linear regression to the elevations of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between the field-measured points along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m² and 5074.1790 m³, and 1316.1250 m² and 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
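
    A sketch of the volume-estimation idea used here, on a synthetic grid (not the authors' data): the pre-erosion surface is fitted by least squares to elevations along the extracted shoulder lines, and the erosion volume is the integral of the difference between that surface and the present DEM over the gully cells.

      # Fit a planar pre-erosion surface to rim elevations, then integrate depth.
      import numpy as np

      cell = 0.5                                       # DEM cell size in metres
      x, y = np.meshgrid(np.arange(0, 40, cell), np.arange(0, 40, cell))
      original = 100.0 + 0.02 * x + 0.05 * y           # "true" pre-erosion slope
      dem = original.copy()
      gully = np.abs(x - 20) < 3                       # carve a synthetic gully
      dem[gully] -= 2.0

      rim = np.abs(np.abs(x - 20) - 3.5) < cell        # cells just outside the gully
      A = np.column_stack([x[rim], y[rim], np.ones(rim.sum())])
      coef, *_ = np.linalg.lstsq(A, dem[rim], rcond=None)
      fitted = coef[0] * x + coef[1] * y + coef[2]     # reconstructed original surface

      depth = np.clip(fitted - dem, 0, None)
      volume = depth[gully].sum() * cell * cell
      print(f"estimated erosion volume: {volume:.1f} m^3")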

  19. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China*

    PubMed Central

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-01-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile the conflicts among benefits in the economic, social, and ecological sectors. PMID:19353749

  20. A new quantitative model of ecological compensation based on ecosystem capital in Zhejiang Province, China.

    PubMed

    Jin, Yan; Huang, Jing-feng; Peng, Dai-liang

    2009-04-01

    Ecological compensation is becoming one of the key, multidisciplinary issues in the field of resources and environmental management. Considering the changing relationship between gross domestic product (GDP) and ecological capital (EC) based on remote sensing estimation, we construct a new quantitative estimation model for ecological compensation, using the county as the study unit, and determine a standard value so as to evaluate ecological compensation from 2001 to 2004 in Zhejiang Province, China. Spatial differences in ecological compensation were significant among all the counties and districts. This model fills a gap in the field of quantitative evaluation of regional ecological compensation and provides a feasible way to reconcile the conflicts among benefits in the economic, social, and ecological sectors.

  1. Predictive value of EEG in postanoxic encephalopathy: A quantitative model-based approach.

    PubMed

    Efthymiou, Evdokia; Renzel, Roland; Baumann, Christian R; Poryazova, Rositsa; Imbach, Lukas L

    2017-10-01

    The majority of comatose patients after cardiac arrest do not regain consciousness due to severe postanoxic encephalopathy. Early and accurate outcome prediction is therefore essential in determining further therapeutic interventions. The electroencephalogram is a standardized and commonly available tool used to estimate prognosis in postanoxic patients. The identification of pathological EEG patterns with poor prognosis relies however primarily on visual EEG scoring by experts. We introduced a model-based approach of EEG analysis (state space model) that allows for an objective and quantitative description of spectral EEG variability. We retrospectively analyzed standard EEG recordings in 83 comatose patients after cardiac arrest between 2005 and 2013 in the intensive care unit of the University Hospital Zürich. Neurological outcome was assessed one month after cardiac arrest using the Cerebral Performance Category. For a dynamic and quantitative EEG analysis, we implemented a model-based approach (state space analysis) to quantify EEG background variability independent from visual scoring of EEG epochs. Spectral variability was compared between groups and correlated with clinical outcome parameters and visual EEG patterns. Quantitative assessment of spectral EEG variability (state space velocity) revealed significant differences between patients with poor and good outcome after cardiac arrest: Lower mean velocity in temporal electrodes (T4 and T5) was significantly associated with poor prognostic outcome (p<0.005) and correlated with independently identified visual EEG patterns such as generalized periodic discharges (p<0.02). Receiver operating characteristic (ROC) analysis confirmed the predictive value of lower state space velocity for poor clinical outcome after cardiac arrest (AUC 80.8, 70% sensitivity, 15% false positive rate). Model-based quantitative EEG analysis (state space analysis) provides a novel, complementary marker for prognosis in postanoxic
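
    A sketch of the ROC step with made-up numbers: since lower state-space velocity was associated with poor outcome, the negated velocity can serve as the score for predicting poor outcome, and the AUC and operating points follow from standard ROC analysis.

      # ROC analysis of a velocity-based predictor of poor outcome (toy data).
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      velocity = np.array([0.8, 1.2, 0.5, 0.4, 1.5, 0.6, 1.1, 0.3])   # hypothetical
      poor_outcome = np.array([1, 0, 1, 1, 0, 1, 0, 1])               # 1 = poor

      auc = roc_auc_score(poor_outcome, -velocity)   # lower velocity -> poorer outcome
      fpr, tpr, thresholds = roc_curve(poor_outcome, -velocity)
      print(f"AUC = {auc:.2f}")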

  2. Methodological aspects of multicenter studies with quantitative PET.

    PubMed

    Boellaard, Ronald

    2011-01-01

    Quantification of whole-body FDG PET studies is affected by many physiological and physical factors. Much of the variability in reported standardized uptake value (SUV) data seen in the literature results from the variability in methodology applied among these studies, i.e., due to the use of different scanners, acquisition and reconstruction settings, region of interest strategies, SUV normalization, and/or corrections methods. To date, the variability in applied methodology prohibits a proper comparison and exchange of quantitative FDG PET data. Consequently, the promising role of quantitative PET has been demonstrated in several monocentric studies, but these published results cannot be used directly as a guideline for clinical (multicenter) trials performed elsewhere. In this chapter, the main causes affecting whole-body FDG PET quantification and strategies to minimize its inter-institute variability are addressed.
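
    For reference, the body-weight-normalized SUV that underlies the variability discussed above is defined as the tissue activity concentration divided by the injected dose per unit body weight; a minimal sketch follows (decay correction and the other harmonization steps discussed in the chapter are omitted).

      # Body-weight-normalized SUV (tissue density assumed to be 1 g/mL).
      def suv_bw(tissue_kbq_per_ml, injected_dose_mbq, body_weight_kg):
          dose_kbq = injected_dose_mbq * 1000.0
          weight_g = body_weight_kg * 1000.0           # ~ volume in mL at 1 g/mL
          return tissue_kbq_per_ml / (dose_kbq / weight_g)

      print(suv_bw(tissue_kbq_per_ml=5.0, injected_dose_mbq=370.0, body_weight_kg=75.0))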

  3. Quantitative Methods in Psychology: Inevitable and Useless

    PubMed Central

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause–effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments. PMID:21833199

  4. Quantitative methods in psychology: inevitable and useless.

    PubMed

    Toomela, Aaro

    2010-01-01

    Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. Research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: structural-systemic that is based on Aristotelian thinking, and associative-quantitative that is based on Cartesian-Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for identification of cause-effect relationships between the events with no possible access to the understanding of the structures that underlie the processes. Quantitative methodology in particular as well as mathematical psychology in general, is useless for answering questions about structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

  5. Towards quantitative mass spectrometry-based metabolomics in microbial and mammalian systems.

    PubMed

    Kapoore, Rahul Vijay; Vaidyanathan, Seetharaman

    2016-10-28

    Metabolome analyses are a suite of analytical approaches that enable us to capture changes in the metabolome (small molecular weight components, typically less than 1500 Da) in biological systems. Mass spectrometry (MS) has been widely used for this purpose. The key challenge here is to be able to capture changes in a reproducible and reliable manner that is representative of the events that take place in vivo. Typically, the analysis is carried out in vitro, by isolating the system and extracting the metabolome. MS-based approaches enable us to capture metabolomic changes with high sensitivity and resolution. When developing the technique for different biological systems, there are challenges common to all systems as well as differences specific to the system under investigation. Here, we review some of the challenges in capturing quantitative changes in the metabolome with MS-based approaches, primarily in microbial and mammalian systems. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Author(s).

  6. Quantitative analyses of tartaric acid based on terahertz time domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Cao, Binghua; Fan, Mengbao

    2010-10-01

    Terahertz radiation occupies the region of the electromagnetic spectrum between the microwave and infrared bands. Quantitative analysis based on terahertz spectroscopy is very important for the application of terahertz techniques, but how to realize it is still under study. L-tartaric acid is widely used as an acidulant in beverages and other foods, such as soft drinks, wine, candy, bread and some colloidal sweetmeats. In this paper, terahertz time-domain spectroscopy is applied to quantify tartaric acid. Two methods are employed to process the terahertz spectra of samples with different tartaric acid contents. The first is linear regression combined with correlation analysis. The second is partial least squares (PLS), in which the absorption spectra in the 0.8-1.4 THz region are used to quantify the tartaric acid. To compare the performance of these two approaches, the relative error of the two methods is analyzed. For this experiment, the first method performs better than the second. However, the first method is suitable for the quantitative analysis of materials that have obvious terahertz absorption peaks, while for materials without obvious terahertz absorption peaks the second is more appropriate.
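
    A sketch of the PLS calibration step (the second method above) on synthetic absorption spectra; the band shape, frequency grid and contents are invented and are not the authors' data.

      # PLS calibration of tartaric acid content from synthetic THz spectra.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(4)
      freq = np.linspace(0.8, 1.4, 120)                    # THz
      band = np.exp(-((freq - 1.1) / 0.05) ** 2)           # synthetic absorption band
      content = rng.uniform(5, 50, size=30)                # % tartaric acid
      spectra = np.outer(content, band) + rng.normal(scale=0.2, size=(30, 120))

      pls = PLSRegression(n_components=3)
      pls.fit(spectra, content)
      predicted = pls.predict(spectra).ravel()
      rel_err = np.abs(predicted - content) / content
      print(f"mean relative error: {100 * rel_err.mean():.1f}%")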

  7. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews.

    PubMed

    Pluye, Pierre; Gagnon, Marie-Pierre; Griffiths, Frances; Johnson-Lafleur, Janique

    2009-04-01

    A new form of literature review has emerged, Mixed Studies Review (MSR). These reviews include qualitative, quantitative and mixed methods studies. In the present paper, we examine MSRs in health sciences, and provide guidance on processes that should be included and reported. However, there are no valid and usable criteria for concomitantly appraising the methodological quality of the qualitative, quantitative and mixed methods studies. To propose criteria for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies or study components. A three-step critical review was conducted. 2322 references were identified in MEDLINE, and their titles and abstracts were screened; 149 potentially relevant references were selected and the full-text papers were examined; 59 MSRs were retained and scrutinized using a deductive-inductive qualitative thematic data analysis. This revealed three types of MSR: convenience, reproducible, and systematic. Guided by a proposal, we conducted a qualitative thematic data analysis of the quality appraisal procedures used in the 17 systematic MSRs (SMSRs). Of 17 SMSRs, 12 showed clear quality appraisal procedures with explicit criteria but no SMSR used valid checklists to concomitantly appraise qualitative, quantitative and mixed methods studies. In two SMSRs, criteria were developed following a specific procedure. Checklists usually contained more criteria than needed. In four SMSRs, a reliability assessment was described or mentioned. While criteria for quality appraisal were usually based on descriptors that require specific methodological expertise (e.g., appropriateness), no SMSR described the fit between reviewers' expertise and appraised studies. Quality appraisal usually resulted in studies being ranked by methodological quality. A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for SMSRs. This

  8. Quantitative data standardization of X-ray based densitometry methods

    NASA Astrophysics Data System (ADS)

    Sergunova, K. A.; Petraikin, A. V.; Petrjajkin, F. A.; Akhmad, K. S.; Semenov, D. S.; Potrakhov, N. N.

    2018-02-01

    In the present work, the design of a special liquid phantom for assessing the accuracy of quantitative densitometric data is proposed. The dependencies between the measured bone mineral density (BMD) values and the given reference values are also presented for different X-ray-based densitometry techniques. The resulting linear relationships make it possible to introduce correction factors that increase the accuracy of BMD measurement by the QCT, DXA and DECT methods, and to use them for the standardization and comparison of measurements.
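
    The linear-correction idea described above can be illustrated as follows: the reference BMD values of the phantom inserts are regressed on the measured values, and the fitted line supplies the correction factors. The numbers below are invented.

      # Linear correction of measured BMD against phantom reference values.
      import numpy as np

      reference_bmd = np.array([50.0, 100.0, 200.0, 400.0, 800.0])   # mg/cm^3
      measured_bmd  = np.array([46.0, 93.0, 188.0, 379.0, 765.0])    # e.g. from QCT

      slope, intercept = np.polyfit(measured_bmd, reference_bmd, 1)
      corrected = slope * measured_bmd + intercept
      print(f"correction: BMD_corr = {slope:.3f} * BMD_meas + {intercept:.1f}")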

  9. The Adoption Process of Ricefield-Based Fish Seed Production in Northwest Bangladesh: An Understanding through Quantitative and Qualitative Investigation

    ERIC Educational Resources Information Center

    Haque, Mohammad Mahfujul; Little, David C.; Barman, Benoy K.; Wahab, Md. Abdul

    2010-01-01

    Purpose: The purpose of the study was to understand the adoption process of ricefield based fish seed production (RBFSP) that has been developed, promoted and established in Northwest Bangladesh. Design/Methodology/Approach: Quantitative investigation based on regression analysis and qualitative investigation using semi-structured interview were…

  10. [Study on once sampling quantitation based on information entropy of ISSR amplified bands of Houttuynia cordata].

    PubMed

    Wang, Haiqin; Liu, Wenlong; He, Fuyuan; Chen, Zuohong; Zhang, Xili; Xie, Xianggui; Zeng, Jiaoli; Duan, Xiaopeng

    2012-02-01

    To explore the once sampling quantitation (minimum single-sampling quantity) of Houttuynia cordata through the information entropy carried by its polymorphic DNA bands, that is, from the genetic polymorphism rather than the other forms in which the polymorphism of traditional Chinese medicines is expressed. The inter simple sequence repeat (ISSR) technique was applied to analyze the genetic polymorphism of H. cordata samples from the same GAP producing area; the DNA band patterns were transformed into information entropy, and the minimum once sampling quantitation was determined with a mathematical model. One hundred and thirty-four DNA bands were obtained by using 9 screened ISSR primers to amplify DNA samples from 46 strains of H. cordata from the same GAP area; the information entropy was H = 0.3656-0.9786, and the RSD was 14.75%. The once sampling quantitation was W = 11.22 kg (863 strains). The "minimum once sampling quantitation" was thus calculated from the perspective of the genetic polymorphism of H. cordata, and a great difference was found between this quantity and the amount derived from the fingerprint perspective.
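
    A sketch of how per-band Shannon entropies can be obtained from binary ISSR band patterns (in bits, consistent with the reported H range of roughly 0.37-0.98); the band matrix below is simulated, and the sampling-quantity model itself is not reproduced.

      # Per-band Shannon entropy (bits) from a binary strain x band matrix.
      import numpy as np

      rng = np.random.default_rng(5)
      bands = rng.integers(0, 2, size=(46, 134))    # 46 strains x 134 bands (simulated)

      p = bands.mean(axis=0)                        # frequency of band presence
      p = np.clip(p, 1e-12, 1 - 1e-12)              # avoid log(0)
      entropy = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
      print(f"H ranges from {entropy.min():.3f} to {entropy.max():.3f} bits")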

  11. Quantitative three-dimensional photoacoustic tomography of the finger joints: an in vivo study

    NASA Astrophysics Data System (ADS)

    Sun, Yao; Sobel, Eric; Jiang, Huabei

    2009-11-01

    We present for the first time in vivo full three-dimensional (3-D) photoacoustic tomography (PAT) of the distal interphalangeal joint in a human subject. Both absorbed energy density and absorption coefficient images of the joint are quantitatively obtained using our finite-element-based photoacoustic image reconstruction algorithm coupled with the photon diffusion equation. The results show that major anatomical features in the joint along with the side arteries can be imaged with a 1-MHz transducer in a spherical scanning geometry. In addition, the cartilages associated with the joint can be quantitatively differentiated from the phalanx. This in vivo study suggests that the 3-D PAT method described has the potential to be used for early diagnosis of joint diseases such as osteoarthritis and rheumatoid arthritis.

  12. A Study to Formulate Quantitative Guidelines for the Audio-Visual Communications Field. Final Report.

    ERIC Educational Resources Information Center

    Faris, Gene; Sherman, Mendel

    Quantitative guidelines for use in determining the audiovisual (AV) needs of educational institutions were developed by the October 14-16, 1965 Seminar of the NDEA (National Defense Education Act), Faris-Sherman study. The guidelines that emerged were based in part on a review of past efforts and existing standards but primarily reflected the…

  13. Information Technology Tools Analysis in Quantitative Courses of IT-Management (Case Study: M.Sc.-Tehran University)

    ERIC Educational Resources Information Center

    Eshlaghy, Abbas Toloie; Kaveh, Haydeh

    2009-01-01

    The purpose of this study was to determine the most suitable ICT-based education and define the most suitable e-content creation tools for quantitative courses in the IT-management Masters program. ICT-based tools and technologies are divided into three categories: the creation of e-content, the offering of e-content, and access to e-content. In…

  14. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics

    PubMed Central

    Lavallée-Adam, Mathieu

    2017-01-01

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. PMID:27010334

  15. Real time quantitative phase microscopy based on single-shot transport of intensity equation (ssTIE) method

    NASA Astrophysics Data System (ADS)

    Yu, Wei; Tian, Xiaolin; He, Xiaoliang; Song, Xiaojun; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-08-01

    Microscopy based on the transport of intensity equation provides quantitative phase distributions, which opens another perspective for cellular observation. However, it requires multi-focal image capture, and the mechanical or electrical scanning involved limits its real-time capability in sample detection. Here, in order to break through this restriction, real-time quantitative phase microscopy based on a single-shot transport of intensity equation method is proposed. A programmed phase mask is designed to realize simultaneous multi-focal image recording without any scanning; thus, phase distributions can be quantitatively retrieved in real time. It is believed the proposed method can potentially be applied in various biological and medical applications, especially for live cell imaging.

  16. Method and platform standardization in MRM-based quantitative plasma proteomics.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This

  17. Quantitative proteomics to study carbapenem resistance in Acinetobacter baumannii

    PubMed Central

    Tiwari, Vishvanath; Tiwari, Monalisa

    2014-01-01

    Acinetobacter baumannii is an opportunistic pathogen causing pneumonia, respiratory infections and urinary tract infections. The prevalence of this lethal pathogen is increasing gradually in clinical settings, where it can grow on artificial surfaces, utilize ethanol as a carbon source and resist desiccation. Carbapenems, a class of β-lactams, are the most commonly prescribed drugs against A. baumannii. Carbapenem resistance has emerged in A. baumannii, which creates significant health problems and is responsible for high morbidity and mortality. With the development of quantitative proteomics, considerable progress has been made in the study of carbapenem resistance in A. baumannii. Recent updates show that quantitative proteomics has emerged as an important tool for understanding the carbapenem resistance mechanism in A. baumannii. The present review also highlights the complementary nature of the different quantitative proteomic methods used to study carbapenem resistance and suggests combining multiple proteomic methods to understand the response of A. baumannii to antibiotics. PMID:25309531

  18. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform

    NASA Astrophysics Data System (ADS)

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-11-01

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  19. Quantitative Analyses about Market- and Prevalence-Based Needs for Adapted Physical Education Teachers in the Public Schools in the United States

    ERIC Educational Resources Information Center

    Zhang, Jiabei

    2011-01-01

    The purpose of this study was to analyze quantitative needs for more adapted physical education (APE) teachers based on both market- and prevalence-based models. The market-based need for more APE teachers was examined based on APE teacher positions funded, while the prevalence-based need for additional APE teachers was analyzed based on students…

  20. A Markov Chain-based quantitative study of angular distribution of photons through turbid slabs via isotropic light scattering

    NASA Astrophysics Data System (ADS)

    Li, Xuesong; Northrop, William F.

    2016-04-01

    This paper describes a quantitative approach to approximating multiple scattering through an isotropic turbid slab based on Markov chain theory. There is an increasing need to utilize multiple scattering for optical diagnostic purposes; however, existing methods are either inaccurate or computationally expensive. Here, we develop a novel Markov chain approximation approach to solve for the multiple scattering angular distribution (AD) that can accurately calculate the AD while significantly reducing computational cost compared to Monte Carlo simulation. We expect this work to stimulate ongoing multiple scattering research and the development of deterministic reconstruction algorithms based on AD measurements.
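
    As an illustration of the Markov-chain idea described above, the sketch below (an assumption-laden toy, not the authors' implementation) discretizes the polar angle, builds a single-scattering transition matrix from a Henyey-Greenstein phase function with a hypothetical anisotropy factor, and propagates the angular distribution through successive scattering orders via matrix powers instead of per-photon Monte Carlo sampling.

      import numpy as np

      # Discretize the polar angle and build a single-scattering transition matrix
      # from a Henyey-Greenstein phase function (g is a hypothetical anisotropy factor).
      n_bins, g = 180, 0.9
      theta = np.linspace(0.0, np.pi, n_bins)

      def hg(cos_t, g):
          # Henyey-Greenstein phase function (unnormalized on the theta grid).
          return (1 - g**2) / (1 + g**2 - 2 * g * cos_t) ** 1.5

      # P[i, j] = probability of scattering from angle bin i into angle bin j.
      # For this sketch the deflection depends only on the angle between the bins.
      P = np.array([[hg(np.cos(tj - ti), g) for tj in theta] for ti in theta])
      P *= np.sin(theta)[None, :]          # solid-angle weighting of the target bins
      P /= P.sum(axis=1, keepdims=True)    # row-normalize -> stochastic matrix

      # Initial distribution: a collimated beam along theta = 0.
      ad = np.zeros(n_bins)
      ad[0] = 1.0

      # Propagate through k scattering events: AD_k = AD_0 @ P^k (Chapman-Kolmogorov).
      for order in (1, 2, 5, 10):
          ad_k = ad @ np.linalg.matrix_power(P, order)
          print(f"after {order:2d} scatterings, mean polar angle = "
                f"{np.degrees(np.sum(ad_k * theta)):.1f} deg")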

  1. Experimental validation of a Monte-Carlo-based inversion scheme for 3D quantitative photoacoustic tomography

    NASA Astrophysics Data System (ADS)

    Buchmann, Jens; Kaplan, Bernhard A.; Prohaska, Steffen; Laufer, Jan

    2017-03-01

    Quantitative photoacoustic tomography (qPAT) aims to extract physiological parameters, such as blood oxygen saturation (sO2), from measured multi-wavelength image data sets. The challenge of this approach lies in the inherently nonlinear fluence distribution in the tissue, which has to be accounted for by using an appropriate model, and the large scale of the inverse problem. In addition, the accuracy of experimental and scanner-specific parameters, such as the wavelength dependence of the incident fluence, the acoustic detector response, the beam profile and divergence, needs to be considered. This study aims at quantitative imaging of blood sO2, as it has been shown to be a more robust parameter compared to absolute concentrations. We propose a Monte-Carlo-based inversion scheme in conjunction with a reduction in the number of variables achieved using image segmentation. The inversion scheme is experimentally validated in tissue-mimicking phantoms consisting of polymer tubes suspended in a scattering liquid. The tubes were filled with chromophore solutions at different concentration ratios. 3-D multi-spectral image data sets were acquired using a Fabry-Perot based PA scanner. A quantitative comparison of the measured data with the output of the forward model is presented. Parameter estimates of chromophore concentration ratios were found to be within 5 % of the true values.

  2. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    NASA Astrophysics Data System (ADS)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistical methods. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to analyze the different solution libraries and establish discriminant functions. Test solutions can be discriminated by these functions. After determining the variety of a test solution, a Spearman correlation test and principal components analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between this parameter and concentration, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of concentration quantitation as well.
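
    A minimal Python sketch of the analysis pipeline described above, using synthetic stand-ins for the eight COT characteristic values; a single LinearDiscriminantAnalysis stands in for the paper's two-stage discriminant analysis, and PCA plus a cubic spline map the reduced parameter to concentration.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.decomposition import PCA
      from scipy.interpolate import CubicSpline

      # Hypothetical training data: 8 COT characteristic values per droplet trace,
      # with a solution-type label and a known concentration for each trace.
      rng = np.random.default_rng(0)
      X_train = rng.normal(size=(40, 8))            # stand-in for real COT features
      y_type = np.repeat([0, 1, 2, 3], 10)          # methanol/ethanol/n-propanol/saline
      conc   = np.tile(np.linspace(1, 10, 10), 4)   # known concentrations

      # Stage 1: discriminate the solution type from the 8 characteristic values.
      lda = LinearDiscriminantAnalysis().fit(X_train, y_type)

      # Stage 2: within one solution type, reduce the features to a single
      # representative parameter and interpolate concentration with a cubic spline.
      mask = y_type == 0
      pca = PCA(n_components=1).fit(X_train[mask])
      param = pca.transform(X_train[mask]).ravel()
      order = np.argsort(param)
      spline = CubicSpline(param[order], conc[mask][order])

      x_test = rng.normal(size=(1, 8))              # one unknown droplet trace
      print("predicted type:", lda.predict(x_test)[0])
      print("estimated concentration:", float(spline(pca.transform(x_test).ravel()[0])))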

  3. Quantitative identification of chemical compounds by dual-soliton based coherent anti-Stokes Raman scattering spectroscopy

    NASA Astrophysics Data System (ADS)

    Chen, Kun; Wu, Tao; Li, Yan; Wei, Haoyun

    2017-12-01

    Coherent anti-Stokes Raman scattering (CARS) is a powerful nonlinear spectroscopy technique that is rapidly gaining recognition for the identification of different molecules. Unfortunately, molecular concentration information is generally not immediately accessible from the raw CARS signal due to the nonresonant background. In addition, mainstream biomedical applications of CARS are currently hampered by complex and bulky excitation setups. Here, we establish a dual-soliton Stokes based CARS spectroscopy scheme capable of quantifying sample molecules using a single fiber laser. This dual-soliton CARS scheme takes advantage of a differential configuration to achieve efficient suppression of the nonresonant background and therefore allows extraction of quantitative composition information. Besides, our all-fiber excitation source can probe most of the fingerprint region (1100-1800 cm-1) with a spectral resolution of 15 cm-1 under the spectral focusing mechanism, where considerably more information is contained throughout an entire spectrum than at just a single frequency within that spectrum. Systematic studies of the scope of application and several fundamental aspects are discussed. The quantitative capability is further demonstrated experimentally through the determination of oleic acid concentration based on the linear dependence of the signal on different Raman vibration bands.

  4. PCA-based groupwise image registration for quantitative MRI.

    PubMed

    Huizinga, W; Poot, D H J; Guyader, J-M; Klaassen, R; Coolen, B F; van Kranenburg, M; van Geuns, R J M; Uitterdijk, A; Polfliet, M; Vandemeulebroucke, J; Leemans, A; Niessen, W J; Klein, S

    2016-04-01

    Quantitative magnetic resonance imaging (qMRI) is a technique for estimating quantitative tissue properties, such as the T1 and T2 relaxation times, apparent diffusion coefficient (ADC), and various perfusion measures. This estimation is achieved by acquiring multiple images with different acquisition parameters (or at multiple time points after injection of a contrast agent) and by fitting a qMRI signal model to the image intensities. Image registration is often necessary to compensate for misalignments due to subject motion and/or geometric distortions caused by the acquisition. However, large differences in image appearance make accurate image registration challenging. In this work, we propose a groupwise image registration method for compensating misalignment in qMRI. The groupwise formulation of the method eliminates the requirement of choosing a reference image, thus avoiding a registration bias. The method minimizes a cost function that is based on principal component analysis (PCA), exploiting the fact that intensity changes in qMRI can be described by a low-dimensional signal model, but not requiring knowledge on the specific acquisition model. The method was evaluated on 4D CT data of the lungs, and both real and synthetic images of five different qMRI applications: T1 mapping in a porcine heart, combined T1 and T2 mapping in carotid arteries, ADC mapping in the abdomen, diffusion tensor mapping in the brain, and dynamic contrast-enhanced mapping in the abdomen. Each application is based on a different acquisition model. The method is compared to a mutual information-based pairwise registration method and four other state-of-the-art groupwise registration methods. Registration accuracy is evaluated in terms of the precision of the estimated qMRI parameters, overlap of segmented structures, distance between corresponding landmarks, and smoothness of the deformation. In all qMRI applications the proposed method performed better than or equally well as
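
    A toy illustration of the PCA-based dissimilarity idea (not the method's exact cost function or registration machinery): for a stack of images, the metric below measures how much intensity variation across the group falls outside a low-dimensional signal model, which should decrease as the images become better aligned.

      import numpy as np

      def pca_groupwise_metric(images, n_model=2):
          # images: array of shape (G, N) -- G time points, N voxels each.
          # The better the alignment, the more the intensity variation across the
          # group is explained by the first n_model principal components, so the
          # residual eigenvalue mass returned here decreases.
          X = images - images.mean(axis=1, keepdims=True)
          X /= X.std(axis=1, keepdims=True) + 1e-12
          K = (X @ X.T) / X.shape[1]            # G x G correlation-like matrix
          eigvals = np.sort(np.linalg.eigvalsh(K))[::-1]
          return eigvals[n_model:].sum()        # energy outside the signal model

      # Misaligning one image (circular shift) should increase the metric.
      rng = np.random.default_rng(1)
      base = rng.normal(size=256)
      stack_aligned = np.array([base * w + rng.normal(scale=0.05, size=256)
                                for w in np.linspace(0.5, 1.5, 6)])
      stack_shifted = stack_aligned.copy()
      stack_shifted[3] = np.roll(stack_shifted[3], 40)
      print(pca_groupwise_metric(stack_aligned), pca_groupwise_metric(stack_shifted))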

  5. Quantitative genetic models of sexual conflict based on interacting phenotypes.

    PubMed

    Moore, Allen J; Pizzari, Tommaso

    2005-05-01

    Evolutionary conflict arises between reproductive partners when alternative reproductive opportunities are available. Sexual conflict can generate sexually antagonistic selection, which mediates sexual selection and intersexual coevolution. However, despite intense interest, the evolutionary implications of sexual conflict remain unresolved. We propose a novel theoretical approach to study the evolution of sexually antagonistic phenotypes based on quantitative genetics and the measure of social selection arising from male-female interactions. We consider the phenotype of one sex as both a genetically influenced evolving trait as well as the (evolving) social environment in which the phenotype of the opposite sex evolves. Several important points emerge from our analysis, including the relationship between direct selection on one sex and indirect effects through selection on the opposite sex. We suggest that the proposed approach may be a valuable tool to complement other theoretical approaches currently used to study sexual conflict. Most importantly, our approach highlights areas where additional empirical data can help clarify the role of sexual conflict in the evolutionary process.
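
    For orientation, the interacting-phenotype formulation that this class of quantitative genetic models builds on can be written as follows (stated here only as background under the usual assumptions; the paper's sexual-conflict model adds sex-specific selection and coevolutionary terms):

        z_i = a_i + e_i + \psi\, z'_j ,

    where z_i is the phenotype of the focal individual, a_i and e_i are its additive genetic and environmental components, z'_j is the phenotype of the interacting partner of the opposite sex, and \psi measures the strength of the social effect of the partner's phenotype on the focal trait.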

  6. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  7. Preliminary research on quantitative methods of water resources carrying capacity based on water resources balance sheet

    NASA Astrophysics Data System (ADS)

    Wang, Yanqiu; Huang, Xiaorong; Gao, Linyun; Guo, Biying; Ma, Kai

    2018-06-01

    Water resources are not only basic natural resources, but also strategic economic resources and ecological control factors. Water resources carrying capacity constrains the sustainable development of a regional economy and society. Studies of water resources carrying capacity can provide helpful information about how the socioeconomic system is both supported and restrained by the water resources system. Based on the research of different scholars, the major problems in the study of water resources carrying capacity can be summarized as follows: the definition of water resources carrying capacity is not yet unified; quantification methods based on these inconsistent definitions are poor in operability; current quantitative research methods do not fully reflect the principles of sustainable development; and it is difficult to quantify the relationships among water resources, the economy and society, and the ecological environment. Therefore, it is necessary to develop a better quantitative evaluation method to determine regional water resources carrying capacity. This paper proposes a new approach to quantifying water resources carrying capacity, namely the compilation of a water resources balance sheet, to grasp regional water resources depletion and water environmental degradation (as well as regional water resources stock assets and liabilities), quantify the pressure of socioeconomic activities on the environment, and discuss the quantitative calculation methods and technical route for a water resources carrying capacity that embodies the substance of sustainable development.

  8. Decoding brain cancer dynamics: a quantitative histogram-based approach using temporal MRI

    NASA Astrophysics Data System (ADS)

    Zhou, Mu; Hall, Lawrence O.; Goldgof, Dmitry B.; Russo, Robin; Gillies, Robert J.; Gatenby, Robert A.

    2015-03-01

    Brain tumor heterogeneity remains a challenge for probing brain cancer evolutionary dynamics. In light of evolution, it is a priority to inspect the cancer system from a time-domain perspective since it explicitly tracks the dynamics of cancer variations. In this paper, we study the problem of exploring brain tumor heterogeneity from temporal clinical magnetic resonance imaging (MRI) data. Our goal is to discover evidence-based knowledge from such temporal imaging data, where multiple clinical MRI scans from Glioblastoma multiforme (GBM) patients are generated during therapy. In particular, we propose a quantitative histogram-based approach that builds a prediction model to measure the difference in histograms obtained from pre- and post-treatment. The study could significantly assist radiologists by providing a metric to identify distinctive patterns within each tumor, which is crucial for the goal of providing patient-specific treatments. We examine the proposed approach for a practical application - clinical survival group prediction. Experimental results show that our approach achieved 90.91% accuracy.
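
    A hedged sketch of the core step, with synthetic data in place of the clinical MRI scans: bin-wise differences between pre- and post-treatment intensity histograms serve as features for a classifier that predicts the survival group (a random forest is used here purely for illustration).

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      def histogram_diff_features(pre_roi, post_roi, bins=32, value_range=(0, 1)):
          # Bin-wise histogram difference between pre- and post-treatment MRI ROIs.
          h_pre, _ = np.histogram(pre_roi, bins=bins, range=value_range, density=True)
          h_post, _ = np.histogram(post_roi, bins=bins, range=value_range, density=True)
          return h_post - h_pre

      # Hypothetical cohort: each patient contributes a pre/post ROI pair and a
      # survival-group label (0 = short-term, 1 = long-term survivor).
      rng = np.random.default_rng(2)
      X, y = [], []
      for patient in range(40):
          label = patient % 2
          pre = rng.beta(2, 5, size=2000)
          post = rng.beta(2, 5 - 2 * label, size=2000)   # treatment shifts the histogram
          X.append(histogram_diff_features(pre, post))
          y.append(label)

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:30], y[:30])
      print("held-out accuracy:", clf.score(X[30:], y[30:]))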

  9. A quantitative acoustic emission study on fracture processes in ceramics based on wavelet packet decomposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ning, J. G.; Chu, L.; Ren, H. L., E-mail: huilanren@bit.edu.cn

    2014-08-28

    We base a quantitative acoustic emission (AE) study on fracture processes in alumina ceramics on wavelet packet decomposition and AE source location. According to the frequency characteristics, as well as energy and ringdown counts of AE, the fracture process is divided into four stages: crack closure, nucleation, development, and critical failure. Each of the AE signals is decomposed by a 2-level wavelet packet decomposition into four different (from-low-to-high) frequency bands (AA2, AD2, DA2, and DD2). The energy eigenvalues P0, P1, P2, and P3 corresponding to these four frequency bands are calculated. By analyzing changes in P0 and P3 in the four stages, we determine the inverse relationship between AE frequency and the crack source size during ceramic fracture. AE signals with regard to crack nucleation can be expressed when P0 is less than 5 and P3 more than 60; whereas AE signals with regard to dangerous crack propagation can be expressed when more than 92% of P0 is greater than 4, and more than 95% of P3 is less than 45. The Geiger location algorithm is used to locate AE sources and cracks in the sample. The results of this location algorithm are consistent with the positions of fractures in the sample when observed under a scanning electron microscope; thus the locations of fractures located with Geiger's method can reflect the fracture process. The stage division by location results is in good agreement with the division based on AE frequency characteristics. We find that both wavelet packet decomposition and Geiger's AE source location are suitable for the identification of the evolutionary process of cracks in alumina ceramics.
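
    The band-energy computation lends itself to a short sketch with PyWavelets; the AE waveform below is synthetic, and the four level-2 leaf bands are ordered from low to high frequency, playing the role of the energy eigenvalues P0-P3.

      import numpy as np
      import pywt

      def band_energy_percentages(signal, wavelet="db4"):
          # 2-level wavelet packet decomposition -> energy percentage per band,
          # with the leaf nodes sorted from low to high frequency.
          wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=2)
          nodes = wp.get_level(2, order="freq")
          energies = np.array([np.sum(node.data ** 2) for node in nodes])
          return 100.0 * energies / energies.sum()

      # Synthetic AE burst: a decaying high-frequency oscillation plus noise.
      t = np.linspace(0, 1e-3, 4096)
      burst = (np.exp(-t / 2e-4) * np.sin(2 * np.pi * 150e3 * t)
               + 0.05 * np.random.randn(t.size))
      P0, P1, P2, P3 = band_energy_percentages(burst)
      print(f"P0={P0:.1f}%  P1={P1:.1f}%  P2={P2:.1f}%  P3={P3:.1f}%")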

  10. A statistical framework for protein quantitation in bottom-up MS-based proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas

    2009-08-15

    ABSTRACT Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative, missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al. (2008)) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
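
    As a much simpler surrogate for the statistical model described above (which additionally distinguishes random from censoring missingness and yields confidence measures), the sketch below rolls peptide-level log intensities up to protein-level sample estimates with Tukey median polish, tolerating missing peaks.

      import numpy as np

      def median_polish_rollup(log_intensities, n_iter=10):
          # log_intensities: (n_peptides, n_samples) array for ONE protein; np.nan
          # marks missing peaks. Returns one abundance estimate per sample.
          x = log_intensities.copy()
          row_eff = np.zeros(x.shape[0])
          col_eff = np.zeros(x.shape[1])
          for _ in range(n_iter):
              row_med = np.nanmedian(x, axis=1)
              x -= row_med[:, None]
              row_eff += row_med
              col_med = np.nanmedian(x, axis=0)
              x -= col_med[None, :]
              col_eff += col_med
          # Overall protein level plus per-sample effect.
          return np.nanmedian(row_eff) + col_eff

      peptides = np.array([[20.1, 20.9, 22.0],
                           [18.3, 19.0, np.nan],
                           [21.5, 22.2, 23.1]])
      print(median_polish_rollup(peptides))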

  11. Using PSEA-Quant for Protein Set Enrichment Analysis of Quantitative Mass Spectrometry-Based Proteomics.

    PubMed

    Lavallée-Adam, Mathieu; Yates, John R

    2016-03-24

    PSEA-Quant analyzes quantitative mass spectrometry-based proteomics datasets to identify enrichments of annotations contained in repositories such as the Gene Ontology and Molecular Signature databases. It allows users to identify the annotations that are significantly enriched for reproducibly quantified high abundance proteins. PSEA-Quant is available on the Web and as a command-line tool. It is compatible with all label-free and isotopic labeling-based quantitative proteomics methods. This protocol describes how to use PSEA-Quant and interpret its output. The importance of each parameter as well as troubleshooting approaches are also discussed. © 2016 by John Wiley & Sons, Inc.

  12. Quantitative Study on Corrosion of Steel Strands Based on Self-Magnetic Flux Leakage.

    PubMed

    Xia, Runchuan; Zhou, Jianting; Zhang, Hong; Liao, Leng; Zhao, Ruiqiang; Zhang, Zeyu

    2018-05-02

    This paper proposes a new computational method to quantitatively and non-destructively determine the corrosion of steel strands by analyzing the self-magnetic flux leakage (SMFL) signals from them. The magnetic dipole model and three growth models (a logistic model, an exponential model, and a linear model) were proposed to theoretically analyze the characteristic value of the SMFL. Then, an experimental study of corrosion detection with a magnetic sensor was carried out. The setup of the magnetic scanning device and the signal collection method are also introduced. The results show that the logistic growth model is the optimal model for calculating the magnetic field, with good fitting effects. Combined with the analysis of the experimental data, the amplitudes of the calculated values (the B_xL(x, z) curves) agree with the measured values in general. This method provides significant application prospects for the evaluation of corrosion and of the residual bearing capacity of steel strands.
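
    A small sketch of the curve-fitting step, with synthetic stand-ins for the measured data: a logistic growth model is fitted to a SMFL characteristic value as a function of corrosion mass loss using scipy's curve_fit; the parameterization and numbers are assumptions for illustration only.

      import numpy as np
      from scipy.optimize import curve_fit

      def logistic_growth(m, k, a, r):
          # Logistic model: characteristic SMFL value vs. corrosion mass loss m.
          return k / (1.0 + a * np.exp(-r * m))

      # Synthetic stand-in for measured data: corrosion mass loss (%) vs. a SMFL
      # characteristic value extracted from the B_xL(x, z) curves.
      mass_loss = np.array([0, 2, 4, 6, 8, 10, 12, 15, 18, 20], dtype=float)
      smfl_value = np.array([5.1, 6.0, 7.8, 10.5, 14.9, 19.8, 24.3, 28.9, 31.2, 32.0])

      popt, _ = curve_fit(logistic_growth, mass_loss, smfl_value, p0=(35.0, 6.0, 0.3))
      k, a, r = popt
      print(f"fitted logistic model: B(m) = {k:.1f} / (1 + {a:.1f} * exp(-{r:.2f} m))")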

  13. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field of view (FoV) scanning, which often relies on mechanical translation; this not only slows down the measurement, but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. In order to achieve highly accurate quantitative imaging at fast speed, a digital micromirror device (DMD) is adopted in PIE to perform large-FoV scanning, controlled by on/off state coding of the DMD. Measurements were implemented on biological samples as well as a USAF resolution target, demonstrating the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed the DMD-based PIE technique provides a potential solution for medical observation and measurement.

  14. FDTD-based quantitative analysis of terahertz wave detection for multilayered structures.

    PubMed

    Tu, Wanli; Zhong, Shuncong; Shen, Yaochun; Zhou, Qing; Yao, Ligang

    2014-10-01

    Experimental investigations have shown that terahertz pulsed imaging (TPI) is able to quantitatively characterize a range of multilayered media (e.g., biological tissues, pharmaceutical tablet coatings, layered polymer composites, etc.). Advanced modeling of the interaction of terahertz radiation with a multilayered medium is required to enable the wide application of terahertz technology in a number of emerging fields, including nondestructive testing. Indeed, there have already been many theoretical analyses performed on the propagation of terahertz radiation in various multilayered media. However, to date, most of these studies used 1D or 2D models, and the dispersive nature of the dielectric layers was not considered or was simplified. In the present work, the theoretical framework of using terahertz waves for the quantitative characterization of multilayered media was established. A 3D model based on the finite difference time domain (FDTD) method is proposed. A batch of pharmaceutical tablets with a single coating layer of different coating thicknesses and different refractive indices was modeled. The reflected terahertz wave from such a sample was computed using the FDTD method, assuming that the incident terahertz wave is broadband, covering a frequency range up to 3.5 THz. The simulated results for all of the pharmaceutical-coated tablets considered were found to be in good agreement with the experimental results obtained using a commercial TPI system. In addition, we studied a three-layered medium to mimic the occurrence of defects in the sample.
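
    The study uses a full 3-D dispersive FDTD model; purely to illustrate the leapfrog update such solvers are built on, here is a minimal non-dispersive 1-D Yee-scheme sketch in which a broadband pulse meets a hypothetical coating layer (simple reflecting boundaries, so late-time samples include wall echoes).

      import numpy as np

      c0 = 3e8
      eps0, mu0 = 8.854e-12, 4e-7 * np.pi
      nz, nt = 800, 1600
      dz = 10e-6                       # 10 um cells
      dt = 0.5 * dz / c0               # Courant-stable time step

      eps_r = np.ones(nz)
      eps_r[400:500] = 2.56            # hypothetical coating layer, n ~ 1.6

      Ex = np.zeros(nz)
      Hy = np.zeros(nz)
      record = []

      for n in range(nt):
          # Leapfrog update: magnetic field half-step, then electric field.
          Hy[:-1] += dt / (mu0 * dz) * (Ex[1:] - Ex[:-1])
          Ex[1:] += dt / (eps0 * eps_r[1:] * dz) * (Hy[1:] - Hy[:-1])
          # Soft source: differentiated Gaussian pulse (broadband).
          t = (n - 300) * dt
          Ex[100] += -t / (50 * dt) * np.exp(-(t / (50 * dt)) ** 2)
          record.append(Ex[150])       # field sampled in front of the layer

      print("peak recorded field:", max(abs(v) for v in record))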

  15. Development and evaluation of a model-based downscatter compensation method for quantitative I-131 SPECT

    PubMed Central

    Song, Na; Du, Yong; He, Bin; Frey, Eric C.

    2011-01-01

    Purpose: The radionuclide 131I has found widespread use in targeted radionuclide therapy (TRT), partly due to the fact that it emits photons that can be imaged to perform treatment planning or posttherapy dose verification as well as beta rays that are suitable for therapy. In both the treatment planning and dose verification applications, it is necessary to estimate the activity distribution in organs or tumors at several time points. In vivo estimates of the 131I activity distribution at each time point can be obtained from quantitative single-photon emission computed tomography (QSPECT) images and organ activity estimates can be obtained either from QSPECT images or quantification of planar projection data. However, in addition to the photon used for imaging, 131I decay results in emission of a number of other higher-energy photons with significant abundances. These higher-energy photons can scatter in the body, collimator, or detector and be counted in the 364 keV photopeak energy window, resulting in reduced image contrast and degraded quantitative accuracy; these photons are referred to as downscatter. The goal of this study was to develop and evaluate a model-based downscatter compensation method specifically designed for the compensation of high-energy photons emitted by 131I and detected in the imaging energy window. Methods: In the evaluation study, we used a Monte Carlo simulation (MCS) code that had previously been validated for other radionuclides. Thus, in preparation for the evaluation study, we first validated the code for 131I imaging simulation by comparison with experimental data. Next, we assessed the accuracy of the downscatter model by comparing downscatter estimates with MCS results. Finally, we combined the downscatter model with iterative reconstruction-based compensation for attenuation (A) and scatter (S) and the full (D) collimator-detector response of the 364 keV photons to form a comprehensive compensation method. We evaluated this

  16. A novel gel based vehicle for the delivery of acetylcholine in quantitative sudomotor axon reflex testing.

    PubMed

    Sletten, David M; Kimpinski, Kurt; Weigand, Stephen D; Low, Phillip A

    2009-10-05

    This study describes a novel gel-based vehicle for the delivery of acetylcholine (ACh) during quantitative sudomotor axon reflex testing (QSART). A dose- and current-response study was undertaken on 20 healthy control participants to characterize the efficiency of a gel-based vehicle for the delivery of ACh. Values obtained for total sweat volume and latency to sweat onset with gel iontophoresis of ACh during QSART were comparable to previously published normative data using solution-based vehicles. Patient discomfort while utilizing the gel-based vehicle during the QSART procedure was minimal. Improvement in iontophoresis using the gel formulation as a vehicle for ACh delivery has the potential to lower the voltage required to overcome skin resistance during QSART and may result in improved patient comfort during the procedure.

  17. FRET-based genetically-encoded sensors for quantitative monitoring of metabolites.

    PubMed

    Mohsin, Mohd; Ahmad, Altaf; Iqbal, Muhammad

    2015-10-01

    Neighboring cells in the same tissue can exist in different states of dynamic activity. After genomics, proteomics and metabolomics, fluxomics is now equally important for generating accurate quantitative information on the cellular and sub-cellular dynamics of ions and metabolites, which is critical for a functional understanding of organisms. Various spectrometry techniques are used for monitoring ions and metabolites, although their temporal and spatial resolutions are limited. The discovery of fluorescent proteins and their variants has revolutionized cell biology. Therefore, novel tools and methods need to be deployed in specific cells and targeted to sub-cellular compartments in order to quantify target-molecule dynamics directly. We require tools that can measure cellular activities and protein dynamics with sub-cellular resolution. Biosensors based on fluorescence resonance energy transfer (FRET) are genetically encoded and hence can specifically target sub-cellular organelles by fusion to proteins or targeting sequences. Over the last decade, FRET-based genetically encoded sensors for molecules involved in energy production, reactive oxygen species and secondary messengers have helped to unravel key aspects of cellular physiology. This review, describing the design and principles of such sensors, presents a database of sensors for different analytes/processes, and illustrates examples of their application in quantitative live cell imaging.

  18. Smartphone based visual and quantitative assays on upconversional paper sensor.

    PubMed

    Mei, Qingsong; Jing, Huarong; Li, You; Yisibashaer, Wuerzha; Chen, Jian; Nan Li, Bing; Zhang, Yong

    2016-01-15

    The integration of smartphones with paper sensors has recently gained increasing attention because it enables quantitative and rapid analysis. However, smartphone-based upconversion paper sensors have been restricted by the lack of effective methods to acquire luminescence signals on test paper. Herein, by virtue of 3D printing technology, we developed an auxiliary reusable device, which orderly assembles a 980 nm mini-laser, an optical filter and a mini-cavity together, for digitally imaging the luminescence variations on test paper and quantitatively analyzing the pesticide thiram with a smartphone. In detail, copper ion-decorated NaYF4:Yb/Tm upconversion nanoparticles were fixed onto filter paper to form the test paper, and the blue luminescence on it is quenched after the addition of thiram through a luminescence resonance energy transfer mechanism. These variations can be monitored by the smartphone camera, and the blue-channel intensities of the obtained color images are then calculated to quantify the amount of thiram through a self-written Android program installed on the smartphone, offering a reliable and accurate detection limit of 0.1 μM for the system. This work provides an initial demonstration of integrating upconversion nanosensors with smartphone digital imaging for point-of-care analysis on a paper-based platform. Copyright © 2015 Elsevier B.V. All rights reserved.
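
    A sketch of the image-analysis step performed on the phone (the calibration numbers and the Stern-Volmer-style quenching fit are assumptions, not values from the paper): average the blue channel over the sensing spot and map the intensity to thiram concentration via the fitted calibration curve.

      import numpy as np

      def mean_blue_intensity(rgb_image, roi):
          # Average blue-channel value inside the paper sensing spot.
          # rgb_image: HxWx3 uint8 array; roi: (row_min, row_max, col_min, col_max).
          r0, r1, c0, c1 = roi
          return rgb_image[r0:r1, c0:c1, 2].astype(float).mean()

      # Hypothetical calibration: blue intensity I for known thiram concentrations,
      # modeled Stern-Volmer-style as I0/I = 1 + K*c (an assumption for this sketch).
      conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])          # uM
      blue = np.array([210., 180., 158., 126., 90., 58.])
      I0 = blue[0]
      K = np.polyfit(conc, I0 / blue - 1.0, 1)[0]               # slope of the linear fit

      def estimate_concentration(rgb_image, roi):
          I = mean_blue_intensity(rgb_image, roi)
          return max((I0 / I - 1.0) / K, 0.0)

      # Fake test image: a uniform spot whose blue channel has been partly quenched.
      test = np.zeros((100, 100, 3), dtype=np.uint8)
      test[40:60, 40:60, 2] = 120
      print(f"estimated thiram: {estimate_concentration(test, (40, 60, 40, 60)):.2f} uM")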

  19. Analog to digital workflow improvement: a quantitative study.

    PubMed

    Wideman, Catherine; Gallet, Jacqueline

    2006-01-01

    This study tracked a radiology department's conversion from utilization of a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality specific activities was reduced by 43%. The percentage of repeat examinations experienced with the digital system also decreased to 8% vs. the level of 9.5% experienced with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.

  20. Quantitative geomorphologic studies from spaceborne platforms

    NASA Technical Reports Server (NTRS)

    Williams, R. S., Jr.

    1985-01-01

    Although LANDSAT images of our planet represent a quantum improvement in the availability of a global image-data set for independent or comparative regional geomorphic studies of landforms, such images have several limitations which restrict their suitability for quantitative geomorphic investigations. The three most serious deficiencies are: (1) photogrammetric inaccuracies, (2) two-dimensional nature of the data, and (3) spatial resolution. These deficiencies are discussed, as well as the use of stereoscopic images and laser altimeter data.

  1. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
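
    The velocity-vector measurement on the dynamic bone images can be sketched with dense optical flow; the snippet below uses OpenCV's Farneback flow on synthetic frames as a stand-in for the study's local-area analysis, averaging the flow field over a coarse grid and converting it to mm/s with an assumed pixel spacing and frame interval.

      import cv2
      import numpy as np

      def rib_velocity_map(frame_prev, frame_next, frame_interval_s, pixel_mm, grid=16):
          # Dense velocity vectors (mm/s) between two consecutive bone-image frames,
          # averaged over a coarse grid of local areas.
          # Farneback parameters: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
          flow = cv2.calcOpticalFlowFarneback(frame_prev, frame_next, None,
                                              0.5, 3, 21, 3, 5, 1.1, 0)
          h, w = frame_prev.shape
          vmap = flow[:h - h % grid, :w - w % grid].reshape(
              h // grid, grid, w // grid, grid, 2).mean(axis=(1, 3))
          return vmap * pixel_mm / frame_interval_s     # pixels/frame -> mm/s

      # Synthetic example: a bright "rib" band shifted downward by 2 pixels.
      prev_img = np.zeros((128, 128), dtype=np.uint8)
      prev_img[60:66, :] = 200
      next_img = np.roll(prev_img, 2, axis=0)
      v = rib_velocity_map(prev_img, next_img, frame_interval_s=0.1, pixel_mm=0.4)
      print("peak vertical velocity (mm/s):", float(v[..., 1].max()))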

  2. Portable paper-based device for quantitative colorimetric assays relying on light reflectance principle.

    PubMed

    Li, Bowei; Fu, Longwen; Zhang, Wei; Feng, Weiwei; Chen, Lingxin

    2014-04-01

    This paper presents a novel paper-based analytical device that reads colorimetric paper assays through their light reflectance. The device is portable, low cost (<20 dollars), and lightweight (only 176 g), making it suitable for primary health care or on-site detection settings. Based on the light reflectance principle, the signal can be obtained directly, stably and in a user-friendly way with our device. We demonstrated the utility and broad applicability of this technique with measurements of different biological and pollution target samples (BSA, glucose, Fe, and nitrite). Moreover, real samples of Fe(II) and nitrite in the local tap water were successfully analyzed; compared with the standard UV absorption method, the quantitative results showed good performance, reproducibility, and reliability. This device can provide quantitative information very conveniently and shows great potential for a broad range of resource-limited analysis, medical diagnostics, and on-site environmental detection. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Multicenter AIDS Cohort Study Quantitative Coronary Plaque Progression Study: rationale and design.

    PubMed

    Nakanishi, Rine; Post, Wendy S; Osawa, Kazuhiro; Jayawardena, Eranthi; Kim, Michael; Sheidaee, Nasim; Nezarat, Negin; Rahmani, Sina; Kim, Nicholas; Hathiramani, Nicolai; Susarla, Shriraj; Palella, Frank; Witt, Mallory; Blaha, Michael J; Brown, Todd T; Kingsley, Lawrence; Haberlen, Sabina A; Dailing, Christopher; Budoff, Matthew J

    2018-01-01

    The association of HIV with coronary atherosclerosis has been established; however, the progression of coronary atherosclerosis over time among participants with HIV is not well known. The Multicenter AIDS Cohort Study Quantitative Coronary Plaque Progression Study is a large prospective multicenter study quantifying progression of coronary plaque assessed by serial coronary computed tomography angiography (CTA). HIV-infected and uninfected men who were enrolled in the Multicenter AIDS Cohort Study Cardiovascular Substudy were eligible to complete a follow-up contrast coronary CTA 3-6 years after baseline. We measured coronary plaque volume and characteristics (calcified and noncalcified plaque including fibrous, fibrous-fatty, and low attenuation) and vulnerable plaque among HIV-infected and uninfected men using semiautomated plaque software to investigate the progression of coronary atherosclerosis over time. We describe a novel, large prospective multicenter study investigating incidence, transition of characteristics, and progression in coronary atherosclerosis quantitatively assessed by serial coronary CTAs among HIV-infected and uninfected men.

  4. Real-time label-free quantitative fluorescence microscopy-based detection of ATP using a tunable fluorescent nano-aptasensor platform.

    PubMed

    Shrivastava, Sajal; Sohn, Il-Yung; Son, Young-Min; Lee, Won-Il; Lee, Nae-Eung

    2015-12-14

    Although real-time label-free fluorescent aptasensors based on nanomaterials are increasingly recognized as a useful strategy for the detection of target biomolecules with high fidelity, the lack of an imaging-based quantitative measurement platform limits their implementation with biological samples. Here we introduce an ensemble strategy for a real-time label-free fluorescent graphene (Gr) aptasensor platform. This platform employs aptamer length-dependent tunability, thus enabling the reagentless quantitative detection of biomolecules through computational processing coupled with real-time fluorescence imaging data. We demonstrate that this strategy effectively delivers dose-dependent quantitative readouts of adenosine triphosphate (ATP) concentration on chemical vapor deposited (CVD) Gr and reduced graphene oxide (rGO) surfaces, thereby providing cytotoxicity assessment. Compared with conventional fluorescence spectrometry methods, our highly efficient, universally applicable, and rational approach will facilitate broader implementation of imaging-based biosensing platforms for the quantitative evaluation of a range of target molecules.

  5. A Quantitative Comparative Study of Blended and Traditional Models in the Secondary Advanced Placement Statistics Classroom

    ERIC Educational Resources Information Center

    Owens, Susan T.

    2017-01-01

    Technology is becoming an integral tool in the classroom and can make a positive impact on how the students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistic students comparing Educational Testing Service (ETS) College Board AP Statistic examination scores…

  6. A quantitative Kirkpatrick Level 1 and 2 study of equipment specialist apprentice operations training

    NASA Astrophysics Data System (ADS)

    Hughes, Dirk D.

    The primary purpose of this quantitative experimental study is to compare employee learning outcomes for a course of study offered in two formats: explicit and tacit instructor-led operations training and explicit e-learning operations training. A Kirkpatrick Level 2 course examination is used to establish a pretest knowledge baseline and to measure posttest learning outcomes for each instructional format. A secondary purpose is to compare the responses of the two groups using a Kirkpatrick Level 1 customer satisfaction index survey. Several authors reported that the United States electric utility industry would face an employee attrition issue during the 2010 through 2015 period, at the same time that the industry would be experiencing an increased demand for electricity. There is now a demand for highly trained power plant operators. A review of the literature yielded few studies comparing instructor-led training and e-based training. Though the Electric Power Research Institute stated that the two training modes would both be acceptable forms of instruction, the organization did not develop a quantifiably justified recommendation between them. Subjects participated in a basic operations course and chose to take either the instructor-led or the e-based training course. The results of the study concluded that both instructor-led and e-based training provided significant learning to the participants. The Kirkpatrick Level 1 results indicated significantly better results for instructor-led training. There was not a significant difference in the Kirkpatrick Level 2 results between the two training modalities. Recommendations for future research include conducting quantitative studies, including a Phillips Level 5 study, and qualitative studies, including a more detailed examination of the customer satisfaction survey (Kirkpatrick Level 1).

  7. Study of quantitative changes of cereal allergenic proteins after food processing.

    PubMed

    Flodrová, Dana; Benkovská, Dagmar; Laštovičková, Markéta

    2015-03-30

    Within the last few years, the occurrence of food allergens and of the corresponding food allergies has been increasing; therefore, research into the individual allergens is required. In the present work, the effect of cereal processing on the amounts of allergenic proteins is studied by modern proteomic-based approaches. The most important wheat and barley allergens are low-molecular-weight (LMW) proteins. We therefore investigated the relative quantitative changes of these proteins after food technological processing, namely wheat couscous production and barley malting. A comparative study using mass spectrometry in connection with the technique of isobaric tag for relative and absolute quantification (iTRAQ) revealed that the amount of wheat allergenic LMW proteins decreased significantly during couscous production (approximately to 5-26% of their initial content in wheat flour). After barley malting, the amounts of the majority of LMW proteins decreased as well, although to a lesser extent than in the case of wheat/couscous; the level of two allergens even slightly increased. The suggested proteomic strategy proved to be a universal and sensitive method for fast and reliable identification of various cereal allergens and for monitoring their quantitative changes during food processing. Such information is important for consumers who suffer from allergies. © 2014 Society of Chemical Industry.

  8. Mass Spectrometry Based Identification of Geometric Isomers during Metabolic Stability Study of a New Cytotoxic Sulfonamide Derivatives Supported by Quantitative Structure-Retention Relationships

    PubMed Central

    Belka, Mariusz; Hewelt-Belka, Weronika; Sławiński, Jarosław; Bączek, Tomasz

    2014-01-01

    A set of 15 new sulfonamide derivatives presenting antitumor activity was subjected to a metabolic stability study. The results showed that, besides products of biotransformation, some additional peaks occurred in the chromatograms. Tandem mass spectrometry revealed the same mass and fragmentation pathway, suggesting that geometric isomerization had occurred. Thus, to support this hypothesis, quantitative structure-retention relationships were applied. Human liver microsomes were used as an in vitro model of metabolism. The biotransformation reactions were tracked by a liquid chromatography assay and, additionally, fragmentation mass spectra were recorded. In silico molecular modeling at a semi-empirical level was conducted as a starting point for molecular descriptor calculations. A quantitative structure-retention relationship model was built by applying multiple linear regression based on three selected three-dimensional descriptors. The studied compounds revealed high metabolic stability, with a tendency to form hydroxylated biotransformation products. However, significant chemical instability under conditions simulating human body fluids was noticed. According to the literature and the MS data, geometric isomerization was suggested. The developed in silico model was able to describe the relationship between the geometry of isomer pairs and their chromatographic retention properties, and thus it supported the hypothesis that the observed pairs of peaks are most likely geometric isomers. However, extensive structural investigations are needed to fully identify the isomers' geometry. An effort to describe the MS fragmentation pathways of novel chemical structures is often not enough to propose structures for potential metabolites and products of other chemical reactions that can be observed in compound solutions in early drug discovery studies. The results indicate that the relatively inexpensive and not time- and labor-consuming in silico approach could be a good supportive tool assisting the
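
    A hedged sketch of the QSRR step: multiple linear regression of retention times on three 3-D molecular descriptors, with entirely hypothetical descriptor values and retention times standing in for the study's data.

      import numpy as np
      from sklearn.linear_model import LinearRegression
      from sklearn.metrics import r2_score

      # Hypothetical data: three 3-D descriptors per compound (placeholders, not the
      # descriptors selected in the study) and measured retention times (min).
      X = np.array([[2.1, 310.0, 28.5],
                    [2.4, 305.2, 27.9],
                    [1.8, 322.7, 30.1],
                    [3.0, 298.4, 26.8],
                    [2.7, 315.9, 29.4],
                    [1.5, 330.3, 31.0],
                    [2.2, 311.8, 28.8],
                    [2.9, 300.1, 27.2]])
      t_r = np.array([11.2, 10.6, 12.4, 9.8, 11.0, 13.1, 11.4, 10.1])

      qsrr = LinearRegression().fit(X, t_r)
      print("coefficients:", qsrr.coef_, "intercept:", qsrr.intercept_)
      print("R^2 on training data:", r2_score(t_r, qsrr.predict(X)))

      # A pair of geometric isomers with slightly different 3-D descriptors should
      # then map to slightly different predicted retention times.
      isomer_pair = np.array([[2.3, 312.5, 28.9], [2.3, 308.0, 28.3]])
      print("predicted t_R for the isomer pair:", qsrr.predict(isomer_pair))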

  9. Quantitative phase microscopy for cellular dynamics based on transport of intensity equation.

    PubMed

    Li, Ying; Di, Jianglei; Ma, Chaojie; Zhang, Jiwei; Zhong, Jinzhan; Wang, Kaiqiang; Xi, Teli; Zhao, Jianlin

    2018-01-08

    We demonstrate a simple method for quantitative phase imaging of tiny transparent objects, such as living cells, based on the transport of intensity equation. The experiments are performed using an inverted bright-field microscope upgraded with a flipping imaging module, which enables the simultaneous creation of two laterally separated images with unequal defocus distances. This add-on module does not include any lenses or gratings and is cost-effective and easy to align. The validity of this method is confirmed by measurements of a microlens array and human osteoblastic cells in culture, indicating its potential in applications that require dynamic measurement of living cells and other transparent specimens in a quantitative, non-invasive and label-free manner.
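
    A compact sketch of how a phase map can be recovered from the two differently defocused images via the transport of intensity equation, assuming the simplified uniform-intensity form and an FFT-based Poisson inversion; the self-test uses a synthetic Gaussian phase object rather than real microscope data.

      import numpy as np

      def tie_phase(I_under, I_over, dz, wavelength, pixel_size):
          # Uniform-intensity TIE: laplacian(phi) = -(k/I0) dI/dz, inverted with an
          # FFT-based Poisson solver (the undetermined mean phase is set to zero).
          k = 2 * np.pi / wavelength
          dIdz = (I_over - I_under) / (2 * dz)
          I0 = 0.5 * (I_under + I_over).mean()
          ny, nx = dIdz.shape
          fx = np.fft.fftfreq(nx, d=pixel_size) * 2 * np.pi
          fy = np.fft.fftfreq(ny, d=pixel_size) * 2 * np.pi
          KX, KY = np.meshgrid(fx, fy)
          denom = KX**2 + KY**2
          denom[0, 0] = 1.0                       # placeholder to avoid divide-by-zero
          phi_hat = (k / I0) * np.fft.fft2(dIdz) / denom
          phi_hat[0, 0] = 0.0                     # mean phase is undetermined
          return np.real(np.fft.ifft2(phi_hat))

      # Self-test: weak Gaussian phase object propagated with I(z) ~ I0*(1 - z/k*lap(phi)).
      N, px = 256, 1e-6
      y, x = np.mgrid[-N // 2:N // 2, -N // 2:N // 2] * px
      phi_true = 2.0 * np.exp(-(x**2 + y**2) / (2 * (20 * px) ** 2))
      lap = (np.roll(phi_true, 1, 0) + np.roll(phi_true, -1, 0) +
             np.roll(phi_true, 1, 1) + np.roll(phi_true, -1, 1) - 4 * phi_true) / px**2
      wavelength, dz = 632.8e-9, 5e-6
      k = 2 * np.pi / wavelength
      I_minus = 1.0 + (dz / k) * lap
      I_plus = 1.0 - (dz / k) * lap
      phi_rec = tie_phase(I_minus, I_plus, dz, wavelength, px)
      err = (phi_rec - phi_rec.mean()) - (phi_true - phi_true.mean())
      print("max |error| (rad):", np.abs(err).max())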

  10. Liquid crystal-based biosensor with backscattering interferometry: A quantitative approach.

    PubMed

    Khan, Mashooq; Park, Soo-Young

    2017-01-15

    We developed a new technology that uses backscattering interferometry (BSI) to quantitatively measure nematic liquid crystal (NLC)-based biosensors, which usually rely on texture reading for on/off signals. The LC-based BSI comprised an octadecyltrichlorosilane (OTS)-coated square capillary filled with 4-cyano-4'-pentylbiphenyl (5CB, a nematic LC at room temperature). The LC/water interface in the capillary was functionalized with a coating of poly(acrylicacid-b-4-cyanobiphenyl-4'-oxyundecylacrylate) (PAA-b-LCP) and immobilized with the enzymes glucose oxidase (GOx) and horseradish peroxidase (HRP) through covalent linkage to the PAA chains (5CB_PAA-GOx:HRP) for glucose detection. Laser irradiation of the LC near the LC/water interface resulted in backscattered fringes with high contrast. The change in the spatial position of the fringes (because of the change in the orientation of the LC caused by the GOx:HRP enzymatic reaction of glucose) altered the output voltage of the photodetector when its active area was aligned with the edge of one of the fringes. The change in intensity at the photodetector allowed the detection limit of the instrument to be as low as 0.008 mM, with a linear range of 0.02-9 mM and a short response time (~60 s). This LC-based BSI technique allows for quantitative, sensitive, selective, reproducible, easily obtainable, and interference-free detection over a large linear dynamic range and for practical applications with human serum. Copyright © 2016 Elsevier B.V. All rights reserved.

  11. Towards quantitative PET/MRI: a review of MR-based attenuation correction techniques.

    PubMed

    Hofmann, Matthias; Pichler, Bernd; Schölkopf, Bernhard; Beyer, Thomas

    2009-03-01

    Positron emission tomography (PET) is a fully quantitative technology for imaging metabolic pathways and dynamic processes in vivo. Attenuation correction of raw PET data is a prerequisite for quantification and is typically based on separate transmission measurements. In PET/CT attenuation correction, however, is performed routinely based on the available CT transmission data. Recently, combined PET/magnetic resonance (MR) has been proposed as a viable alternative to PET/CT. Current concepts of PET/MRI do not include CT-like transmission sources and, therefore, alternative methods of PET attenuation correction must be found. This article reviews existing approaches to MR-based attenuation correction (MR-AC). Most groups have proposed MR-AC algorithms for brain PET studies and more recently also for torso PET/MR imaging. Most MR-AC strategies require the use of complementary MR and transmission images, or morphology templates generated from transmission images. We review and discuss these algorithms and point out challenges for using MR-AC in clinical routine. MR-AC is work-in-progress with potentially promising results from a template-based approach applicable to both brain and torso imaging. While efforts are ongoing in making clinically viable MR-AC fully automatic, further studies are required to realize the potential benefits of MR-based motion compensation and partial volume correction of the PET data.

  12. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived, each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
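
    As a concrete illustration of the evidence-fusion step described above, the toy sketch below applies Dempster's rule of combination to two basic probability assignments from hypothetical diagnostic models; the fault labels and mass values are invented for illustration and are not taken from the article.

    ```python
    # Toy sketch of Dempster's rule of combination for two mass functions over a
    # small frame of discernment of fault hypotheses. Values are illustrative only.
    from itertools import product

    def dempster_combine(m1, m2):
        """Combine two mass functions keyed by frozensets of hypotheses."""
        combined, conflict = {}, 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb                   # mass falling on the empty set
        if conflict >= 1.0:
            raise ValueError("total conflict; evidence cannot be combined")
        return {s: m / (1.0 - conflict) for s, m in combined.items()}

    # Hypothetical evidence from two parallel diagnostic models about faults f1, f2:
    m_model1 = {frozenset({"f1"}): 0.6, frozenset({"f1", "f2"}): 0.4}
    m_model2 = {frozenset({"f2"}): 0.3, frozenset({"f1", "f2"}): 0.7}
    for hypotheses, mass in dempster_combine(m_model1, m_model2).items():
        print(set(hypotheses), round(mass, 3))
    ```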

  13. Toward Quantitative Small Animal Pinhole SPECT: Assessment of Quantitation Accuracy Prior to Image Compensations

    PubMed Central

    Chen, Chia-Lin; Wang, Yuchuan; Lee, Jason J. S.; Tsui, Benjamin M. W.

    2011-01-01

    Purpose We assessed the quantitation accuracy of small animal pinhole single photon emission computed tomography (SPECT) under the current preclinical settings, where image compensations are not routinely applied. Procedures The effects of several common image-degrading factors and imaging parameters on quantitation accuracy were evaluated using Monte-Carlo simulation methods. Typical preclinical imaging configurations were modeled, and quantitative analyses were performed based on image reconstructions without compensating for attenuation, scatter, and limited system resolution. Results Using mouse-sized phantom studies as examples, attenuation effects alone degraded quantitation accuracy by up to −18% (Tc-99m or In-111) or −41% (I-125). The inclusion of scatter effects changed the above numbers to −12% (Tc-99m or In-111) and −21% (I-125), respectively, indicating the significance of scatter in quantitative I-125 imaging. Region-of-interest (ROI) definitions have greater impacts on regional quantitation accuracy for small sphere sources as compared to attenuation and scatter effects. For the same ROI, SPECT acquisitions using pinhole apertures of different sizes could significantly affect the outcome, whereas the use of different radii-of-rotation yielded negligible differences in quantitation accuracy for the imaging configurations simulated. Conclusions We have systematically quantified the influence of several factors affecting the quantitation accuracy of small animal pinhole SPECT. In order to consistently achieve accurate quantitation within 5% of the truth, comprehensive image compensation methods are needed. PMID:19048346

  14. Quantitative gene-gene and gene-environment mapping for leaf shape variation using tree-based models.

    PubMed

    Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun

    2017-01-01

    Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
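
    The framework's final step, ranking the relative importance of genetic and environmental variables with tree-based models, can be sketched generically as below. This is an illustration with synthetic predictors and an off-the-shelf random-forest regressor, not the authors' pipeline or the Populus szechuanica data set.

    ```python
    # Hedged sketch of tree-based importance ranking for gene-gene and
    # gene-environment effects on a shape score. All data are synthetic.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    n = 200
    X = pd.DataFrame({
        "snp_1":       rng.integers(0, 3, n),        # genotype coded 0/1/2
        "snp_2":       rng.integers(0, 3, n),
        "elevation_m": rng.normal(3000, 300, n),     # environment covariate
        "soil_ph":     rng.normal(6.5, 0.4, n),
    })
    # synthetic leaf-shape score with a gene-environment interaction term
    y = (0.5 * X["snp_1"] + 0.002 * X["elevation_m"]
         + 0.3 * X["snp_2"] * (X["soil_ph"] - 6.5) + rng.normal(0, 0.5, n))

    forest = RandomForestRegressor(n_estimators=500, random_state=0).fit(X, y)
    ranking = pd.Series(forest.feature_importances_, index=X.columns).sort_values(ascending=False)
    print(ranking)   # relative importance of genetic and environmental variables
    ```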

  15. ABRF-PRG07: advanced quantitative proteomics study.

    PubMed

    Falick, Arnold M; Lane, William S; Lilley, Kathryn S; MacCoss, Michael J; Phinney, Brett S; Sherman, Nicholas E; Weintraub, Susan T; Witkowska, H Ewa; Yates, Nathan A

    2011-04-01

    A major challenge for core facilities is determining quantitative protein differences across complex biological samples. Although there are numerous techniques in the literature for relative and absolute protein quantification, the majority is nonroutine and can be challenging to carry out effectively. There are few studies comparing these technologies in terms of their reproducibility, accuracy, and precision, and no studies to date deal with performance across multiple laboratories with varied levels of expertise. Here, we describe an Association of Biomolecular Resource Facilities (ABRF) Proteomics Research Group (PRG) study based on samples composed of a complex protein mixture into which 12 known proteins were added at varying but defined ratios. All of the proteins were present at the same concentration in each of three tubes that were provided. The primary goal of this study was to allow each laboratory to evaluate its capabilities and approaches with regard to: detection and identification of proteins spiked into samples that also contain complex mixtures of background proteins and determination of relative quantities of the spiked proteins. The results returned by 43 participants were compiled by the PRG, which also collected information about the strategies used to assess overall performance and as an aid to development of optimized protocols for the methodologies used. The most accurate results were generally reported by the most experienced laboratories. Among laboratories that used the same technique, values that were closer to the expected ratio were obtained by more experienced groups.

  16. Undergraduate Students' Quantitative Reasoning in Economic Contexts

    ERIC Educational Resources Information Center

    Mkhatshwa, Thembinkosi Peter; Doerr, Helen M.

    2018-01-01

    Contributing to a growing body of research on undergraduate students' quantitative reasoning, the study reported in this article used task-based interviews to investigate business calculus students' quantitative reasoning when solving two optimization tasks situated in the context of revenue and profit maximization. Analysis of verbal responses…

  17. Impact of image quality on OCT angiography based quantitative measurements.

    PubMed

    Al-Sheikh, Mayss; Ghasemi Falavarjani, Khalil; Akil, Handan; Sadda, SriniVas R

    2017-01-01

    To study the impact of image quality on quantitative measurements and the frequency of segmentation error with optical coherence tomography angiography (OCTA). Seventeen eyes of 10 healthy individuals were included in this study. OCTA was performed using a swept-source device (Triton, Topcon). Each subject underwent three scanning sessions 1-2 min apart; the first two scans were obtained under standard conditions and, for the third session, the image quality index was reduced by application of a topical ointment. En face OCTA images of the retinal vasculature were generated using the default segmentation for the superficial and deep retinal layers (SRL, DRL). The intraclass correlation coefficient (ICC) was used as a measure of repeatability. The frequency of segmentation error, motion artifact, banding artifact and projection artifact was also compared among the three sessions. The frequency of segmentation error and motion artifact was statistically similar between high and low image quality sessions (P = 0.707 and P = 1, respectively). However, the frequency of projection and banding artifacts was higher with lower image quality. The vessel density in the SRL was highly repeatable in the high image quality sessions (ICC = 0.8); however, the repeatability was low when comparing the high and low image quality measurements (ICC = 0.3). In the DRL, the repeatability of the vessel density measurements was fair in the high quality sessions (ICC = 0.6 and ICC = 0.5, with and without automatic artifact removal, respectively) and poor when comparing high and low image quality sessions (ICC = 0.3 and ICC = 0.06, with and without automatic artifact removal, respectively). The frequency of artifacts is higher and the repeatability of the measurements is lower with lower image quality. The impact of the image quality index should always be considered in OCTA-based quantitative measurements.
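
    Since repeatability here is summarized with an intraclass correlation coefficient, the sketch below shows one common way such a value can be computed for paired-session vessel-density measurements (one-way random-effects, single-measure ICC). The abstract does not state which ICC variant or software was used, so both the formula choice and the toy numbers are assumptions for illustration.

    ```python
    # One-way random-effects, single-measure ICC for repeated measurements,
    # computed from scratch. Vessel-density values below are made up.
    import numpy as np

    def icc_oneway(measurements):
        """measurements: array of shape (n_subjects, k_sessions)."""
        x = np.asarray(measurements, dtype=float)
        n, k = x.shape
        grand = x.mean()
        subj_means = x.mean(axis=1)
        ms_between = k * np.sum((subj_means - grand) ** 2) / (n - 1)   # between-subject mean square
        ms_within = np.sum((x - subj_means[:, None]) ** 2) / (n * (k - 1))  # within-subject mean square
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    vessel_density = np.array([[48.2, 47.9], [51.0, 50.1], [45.3, 46.0],
                               [49.8, 49.5], [52.4, 51.7]])   # % values, 5 eyes x 2 sessions
    print(round(icc_oneway(vessel_density), 2))
    ```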

  18. Multifunctional sample preparation kit and on-chip quantitative nucleic acid sequence-based amplification tests for microbial detection.

    PubMed

    Zhao, Xinyan; Dong, Tao

    2012-10-16

    This study reports a quantitative nucleic acid sequence-based amplification (Q-NASBA) microfluidic platform composed of a membrane-based sampling module, a sample preparation cassette, and a 24-channel Q-NASBA chip for environmental investigations of aquatic microorganisms. This low-cost, highly efficient sampling module connects seamlessly with the subsequent sample preparation and quantitative detection steps and is designed for the collection of microbial communities from aquatic environments. Eight kinds of commercial membrane filters are evaluated using Saccharomyces cerevisiae, Escherichia coli, and Staphylococcus aureus as model microorganisms. After the microorganisms are concentrated on the membrane filters, the retentate can easily be conserved in a transport medium (TM) buffer and sent to a remote laboratory. A Q-NASBA-oriented sample preparation cassette was specifically designed to extract DNA/RNA molecules directly from the cells captured on the membranes. The extract is then analyzed within Q-NASBA chips that are compatible with common laboratory microplate readers. In particular, a novel analytical algorithm is developed for simple but robust on-chip Q-NASBA assays. The reported multifunctional microfluidic system can detect small numbers of microorganisms quantitatively and simultaneously. Further research should be conducted to simplify and standardize ecological investigations of aquatic environments.

  19. A gold nanoparticle-based semi-quantitative and quantitative ultrasensitive paper sensor for the detection of twenty mycotoxins

    NASA Astrophysics Data System (ADS)

    Kong, Dezhao; Liu, Liqiang; Song, Shanshan; Suryoprabowo, Steven; Li, Aike; Kuang, Hua; Wang, Libing; Xu, Chuanlai

    2016-02-01

    A semi-quantitative and quantitative multi-immunochromatographic (ICA) strip detection assay was developed for the simultaneous detection of twenty types of mycotoxins from five classes, including zearalenones (ZEAs), deoxynivalenols (DONs), T-2 toxins (T-2s), aflatoxins (AFs), and fumonisins (FBs), in cereal food samples. Sensitive and specific monoclonal antibodies were selected for this assay. The semi-quantitative results were obtained within 20 min by the naked eye, with visual limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.1-0.5, 2.5-250, 0.5-1, 0.25-1 and 2.5-10 μg kg-1, and cut-off values of 0.25-1, 5-500, 1-10, 0.5-2.5 and 5-25 μg kg-1, respectively. The quantitative results were obtained using a hand-held strip scan reader, with the calculated limits of detection for ZEAs, DONs, T-2s, AFs and FBs of 0.04-0.17, 0.06-49, 0.15-0.22, 0.056-0.49 and 0.53-1.05 μg kg-1, respectively. The analytical results of spiked samples were in accordance with the accurate content in the simultaneous detection analysis. This newly developed ICA strip assay is suitable for the on-site detection and rapid initial screening of mycotoxins in cereal samples, facilitating both semi-quantitative and quantitative determination.

  20. Quantitative 3D breast magnetic resonance imaging fibroglandular tissue analysis and correlation with qualitative assessments: a feasibility study.

    PubMed

    Ha, Richard; Mema, Eralda; Guo, Xiaotao; Mango, Victoria; Desperito, Elise; Ha, Jason; Wynn, Ralph; Zhao, Binsheng

    2016-04-01

    The amount of fibroglandular tissue (FGT) has been linked to breast cancer risk based on mammographic density studies. Currently, the qualitative assessment of FGT on mammogram (MG) and magnetic resonance imaging (MRI) is prone to intra- and inter-observer variability. The purpose of this study is to develop an objective quantitative FGT measurement tool for breast MRI that could provide significant clinical value. An IRB-approved study was performed. Sixty breast MRI cases with qualitative assessment of mammographic breast density and MRI FGT were randomly selected for quantitative analysis from routine breast MRIs performed at our institution from 1/2013 to 12/2014. Blinded to the qualitative data, whole-breast and FGT contours were delineated on T1-weighted pre-contrast sagittal images using an in-house, proprietary segmentation algorithm which combines region-based active contours and a level set approach. FGT (%) was calculated as [segmented FGT volume (mm³) / segmented whole-breast volume (mm³)] × 100. Statistical correlation analysis was performed between quantified FGT (%) on MRI and qualitative assessments of mammographic breast density and MRI FGT. There was a significant positive correlation between quantitative MRI FGT assessment and qualitative MRI FGT (r=0.809, n=60, P<0.001) and mammographic density assessment (r=0.805, n=60, P<0.001). There was a significant correlation between qualitative MRI FGT assessment and mammographic density assessment (r=0.725, n=60, P<0.001). The four qualitative assessment categories of FGT corresponded to calculated mean quantitative FGT (%) values of 4.61% (95% CI, 0-12.3%), 8.74% (7.3-10.2%), 18.1% (15.1-21.1%), and 37.4% (29.5-45.3%). Quantitative measures of FGT (%) were computed with data derived from breast MRI and correlated significantly with conventional qualitative assessments. This quantitative technique may prove to be a valuable tool in clinical use by providing computer generated standardized
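
    The FGT (%) formula above is a straightforward ratio of segmented volumes; the sketch below shows that arithmetic on toy binary masks. The segmentation algorithm itself is proprietary and is not reproduced; the mask shapes and voxel size are invented for illustration.

    ```python
    # Illustrative FGT (%) computation from two binary segmentation masks
    # (whole breast and fibroglandular tissue). Toy data only.
    import numpy as np

    def fgt_percent(fgt_mask, breast_mask, voxel_volume_mm3):
        """FGT (%) = segmented FGT volume / segmented whole-breast volume x 100."""
        fgt_vol = fgt_mask.sum() * voxel_volume_mm3
        breast_vol = breast_mask.sum() * voxel_volume_mm3
        return 100.0 * fgt_vol / breast_vol

    # toy 3-D masks: the breast occupies a block, the FGT a smaller block inside it
    breast = np.zeros((40, 40, 40), dtype=bool); breast[5:35, 5:35, 5:35] = True
    fgt = np.zeros_like(breast);                 fgt[15:25, 15:25, 15:25] = True
    print(round(fgt_percent(fgt, breast, voxel_volume_mm3=1.0), 1))   # ~3.7 %
    ```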

  1. Quantitative structure activity relationship studies of mushroom tyrosinase inhibitors

    NASA Astrophysics Data System (ADS)

    Xue, Chao-Bin; Luo, Wan-Chun; Ding, Qi; Liu, Shou-Zhu; Gao, Xing-Xiang

    2008-05-01

    Here, we report our results from quantitative structure-activity relationship studies on tyrosinase inhibitors. Interactions between benzoic acid derivatives and tyrosinase active sites were also studied using a molecular docking method. These studies indicated that one possible mechanism for the interaction between benzoic acid derivatives and the tyrosinase active site is the formation of a hydrogen-bond between the hydroxyl (aOH) and carbonyl oxygen atoms of Tyr98, which stabilized the position of Tyr98 and prevented Tyr98 from participating in the interaction between tyrosinase and ORF378. Tyrosinase, also known as phenoloxidase, is a key enzyme in animals, plants and insects that is responsible for catalyzing the hydroxylation of tyrosine into o-diphenols and the oxidation of o-diphenols into o-quinones. In the present study, the bioactivities of 48 derivatives of benzaldehyde, benzoic acid, and cinnamic acid compounds were used to construct three-dimensional quantitative structure-activity relationship (3D-QSAR) models using comparative molecular field (CoMFA) and comparative molecular similarity indices (CoMSIA) analyses. After superimposition using common substructure-based alignments, robust and predictive 3D-QSAR models were obtained from CoMFA (q² = 0.855, r² = 0.978) and CoMSIA (q² = 0.841, r² = 0.946), with 6 optimum components. Chemical descriptors, including electronic (Hammett σ), hydrophobic (π), and steric (MR) parameters, hydrogen bond acceptor (H-acc), and indicator variable (I), were used to construct a 2D-QSAR model. The results of this QSAR indicated that π, MR, and H-acc account for 34.9, 31.6, and 26.7% of the calculated biological variance, respectively. The molecular interactions between ligand and target were studied using a flexible docking method (FlexX). The best scored candidates were docked flexibly, and the interaction between the benzoic acid derivatives and the tyrosinase active site was elucidated in detail. We believe
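
    For the 2D-QSAR part described above (activity regressed on Hammett sigma, hydrophobic pi, steric MR, hydrogen-bond-acceptor and indicator descriptors), a generic ordinary-least-squares workflow looks like the sketch below. The descriptor table and coefficients are synthetic placeholders, not the 48 compounds or the published model.

    ```python
    # Hedged sketch of a 2D-QSAR multiple linear regression on classical
    # descriptors. All descriptor values and activities are synthetic.
    import numpy as np
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(1)
    n = 48
    X = np.column_stack([
        rng.normal(0.0, 0.3, n),    # Hammett sigma
        rng.normal(0.5, 0.8, n),    # hydrophobic pi
        rng.normal(10.0, 3.0, n),   # molar refractivity MR
        rng.integers(0, 4, n),      # H-bond acceptor count
        rng.integers(0, 2, n),      # indicator variable I
    ])
    y = 0.4 * X[:, 1] + 0.08 * X[:, 2] + 0.3 * X[:, 3] + rng.normal(0, 0.3, n)  # synthetic pIC50

    model = LinearRegression().fit(X, y)
    print("r^2 =", round(model.score(X, y), 3))
    print("coefficients:", np.round(model.coef_, 3))
    ```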

  2. Optical coherence tomography based microangiography for quantitative monitoring of structural and vascular changes in a rat model of acute uveitis in vivo: a preliminary study

    NASA Astrophysics Data System (ADS)

    Choi, Woo June; Pepple, Kathryn L.; Zhi, Zhongwei; Wang, Ruikang K.

    2015-01-01

    Uveitis models in rodents are important in the investigation of pathogenesis in human uveitis and the development of appropriate therapeutic strategies for treatment. Quantitative monitoring of ocular inflammation in small animal models provides an objective metric to assess uveitis progression and/or therapeutic effects. We present a new application of optical coherence tomography (OCT) and OCT-based microangiography (OMAG) to a rat model of acute anterior uveitis induced by intravitreal injection of a killed mycobacterial extract. OCT/OMAG is used to provide noninvasive three-dimensional imaging of the anterior segment of the eyes prior to injection (baseline) and two days post-injection (peak inflammation) in rats with and without steroid treatments. OCT imaging identifies characteristic structural and vascular changes in the anterior segment of the inflamed animals when compared to baseline images. Characteristics of inflammation identified include anterior chamber cells, corneal edema, pupillary membranes, and iris vasodilation. In contrast, no significant difference from the control is observed for the steroid-treated eye. These findings are compared with the histology assessment of the same eyes. In addition, quantitative measurements of central corneal thickness and iris vessel diameter are determined. This pilot study demonstrates that OCT-based microangiography promises to be a useful tool for the assessment and management of uveitis in vivo.

  3. Issues in Quantitative Analysis of Ultraviolet Imager (UV) Data: Airglow

    NASA Technical Reports Server (NTRS)

    Germany, G. A.; Richards, P. G.; Spann, J. F.; Brittnacher, M. J.; Parks, G. K.

    1999-01-01

    The GGS Ultraviolet Imager (UVI) has proven to be especially valuable in correlative substorm, auroral morphology, and extended statistical studies of the auroral regions. Such studies are based on knowledge of the location, spatial, and temporal behavior of auroral emissions. More quantitative studies, based on absolute radiometric intensities from UVI images, require a more intimate knowledge of the instrument behavior and data processing requirements and are inherently more difficult than studies based on relative knowledge of the oval location. In this study, UVI airglow observations are analyzed and compared with model predictions to illustrate issues that arise in quantitative analysis of UVI images. These issues include instrument calibration, long term changes in sensitivity, and imager flat field response as well as proper background correction. Airglow emissions are chosen for this study because of their relatively straightforward modeling requirements and because of their implications for thermospheric compositional studies. The analysis issues discussed here, however, are identical to those faced in quantitative auroral studies.

  4. DMD-based quantitative phase microscopy and optical diffraction tomography

    NASA Astrophysics Data System (ADS)

    Zhou, Renjie

    2018-02-01

    Digital micromirror devices (DMDs), which offer high speed and a high degree of freedom in steering illumination light, have been increasingly applied to optical microscopy systems in recent years. Lately, we introduced DMDs into digital holography to enable new imaging modalities and overcome existing imaging limitations. In this paper, we will first present our progress in using DMDs to demonstrate laser-illumination Fourier ptychographic microscopy (FPM) with shot-noise-limited detection. After that, we will present a novel common-path quantitative phase microscopy (QPM) system based on a DMD. Building on those early developments, a DMD-based high-speed optical diffraction tomography (ODT) system has recently been demonstrated, and the results will also be presented. This ODT system achieves video-rate 3D refractive-index imaging, which can potentially enable observation of high-speed 3D structural changes in samples.

  5. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  6. A versatile quantitation platform based on platinum nanoparticles incorporated volumetric bar-chart chip for highly sensitive assays.

    PubMed

    Wang, Yuzhen; Zhu, Guixian; Qi, Wenjin; Li, Ying; Song, Yujun

    2016-11-15

    The platinum nanoparticle-incorporated volumetric bar-chart chip (PtNPs-V-Chip) can be used for point-of-care tests by providing a quantitative, visualized readout without any assistance from instruments, data processing, or graphic plotting. To improve the sensitivity of the PtNPs-V-Chip, hybridization chain reaction was employed in this quantitation platform for highly sensitive assays that can detect as little as 16 pM Ebola virus DNA, 0.01 ng/mL carcinoembryonic antigen (CEA), and 10 HER2-expressing cancer cells. Based on this amplification strategy, a 100-fold decrease in the detection limit was achieved for DNA by increasing the number of platinum nanoparticle catalysts per captured analyte. This quantitation platform can also distinguish a single-base mismatch in DNA hybridization and observe the concentration threshold of CEA. The new strategy lays the foundation for this quantitation platform to be applied in forensic analysis, biothreat detection, clinical diagnostics and drug screening. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. A quantitative study to design an experimental setup for photoacoustic imaging.

    PubMed

    Marion, Adrien; Boutet, Jérôme; Debourdeau, Mathieu; Dinten, Jean-Marc; Vray, Didier

    2011-01-01

    During the last decade, a new modality called photoacoustic imaging has emerged. The increasing interest in this modality stems from the fact that it combines advantages of ultrasound and optical imaging, i.e. high contrast due to optical absorption and low acoustic attenuation in biological tissues. It is thus possible to study vascularization because blood has a high optical absorption coefficient. Papers in the literature often focus on applications and rarely discuss quantitative parameters. The goal of this paper is to provide quantitative elements for designing an acquisition setup. By defining the targeted resolution and penetration depth, it is then possible to evaluate which kinds of excitation and reception systems have to be used. First, we recall the theoretical background of the photoacoustic effect before describing experiments based on a nanosecond laser at 1064 nm and 2.25-5 MHz transducers. Second, we present results on the relation linking laser fluence to signal amplitude and on the axial and lateral resolutions of our acquisition setup. We verify the linear relation between fluence and amplitude before estimating the axial resolution at 550 μm for a 2.25 MHz ultrasonic transducer. Concerning lateral resolution, we show that a reconstruction technique based on a curvilinear acquisition of 30 lines improves it by a factor of 3 compared to a lateral displacement. Future work will include improvement of the lateral resolution using probes, as in ultrasound imaging, instead of single-element transducers.
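
    As a rough cross-check of the reported 550 μm axial resolution, the sketch below applies a common bandwidth-limited rule of thumb (axial resolution ≈ 0.88·c/bandwidth). Both the rule of thumb and the assumed 100% fractional bandwidth are outside assumptions, not values stated in the paper, which measured the resolution experimentally.

    ```python
    # Back-of-the-envelope axial-resolution estimate for a bandwidth-limited
    # detector. The 0.88*c/BW rule and the fractional bandwidth are assumptions.
    SPEED_OF_SOUND = 1500.0          # m/s, typical soft tissue

    def axial_resolution(center_freq_hz, fractional_bandwidth=1.0, c=SPEED_OF_SOUND):
        bandwidth = center_freq_hz * fractional_bandwidth
        return 0.88 * c / bandwidth   # metres

    print(f"{axial_resolution(2.25e6) * 1e6:.0f} um")   # ~590 um, same order as the measured 550 um
    ```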

  8. Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.

    PubMed

    Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P

    2013-12-16

    Mass spectrometry is already a well-established protein identification tool, and recent methodological and technological developments have also made possible the extraction of quantitative protein-abundance data in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and, therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable effort in improving the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI, namely the MIAPE Quant guidelines, which have been developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and

  9. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  10. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR-based method offers the possibility of a faster and more quantitative assay. Alu sequences are primate-specific and are present in many copies in the human genome, making them an excellent target, or marker, for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time and automated quantitation, as well as a large dynamic range (128 ng/μL to 0.5 pg/μL).
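
    To make the quantitation step concrete, the sketch below shows a generic real-time PCR standard-curve calculation: Ct regressed on log10 of input DNA, amplification efficiency derived from the slope, and unknowns read back off the line. The Ct values and concentrations are invented and are not data from the published Alu assay.

    ```python
    # Generic qPCR standard-curve quantitation sketch (illustrative values only).
    import numpy as np

    log_qty = np.log10([128, 12.8, 1.28, 0.128, 0.0128])          # ng/uL standards
    ct = np.array([18.1, 21.5, 24.9, 28.3, 31.8])                  # hypothetical Ct values

    slope, intercept = np.polyfit(log_qty, ct, 1)                  # Ct = slope*log10(qty) + intercept
    efficiency = 10 ** (-1.0 / slope) - 1.0                        # amplification efficiency

    def quantify(ct_unknown):
        """Convert an unknown sample's Ct back to ng/uL via the standard curve."""
        return 10 ** ((ct_unknown - intercept) / slope)

    print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, unknown={quantify(26.0):.3f} ng/uL")
    ```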

  11. Optimized protocol for quantitative multiple reaction monitoring-based proteomic analysis of formalin-fixed, paraffin embedded tissues

    PubMed Central

    Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.

    2016-01-01

    Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933

  12. Improvement of semi-quantitative small-animal PET data with recovery coefficients: a phantom and rat study.

    PubMed

    Aide, Nicolas; Louis, Marie-Hélène; Dutoit, Soizic; Labiche, Alexandre; Lemoisson, Edwige; Briand, Mélanie; Nataf, Valérie; Poulain, Laurent; Gauduchon, Pascal; Talbot, Jean-Noël; Montravers, Françoise

    2007-10-01

    To evaluate the accuracy of semi-quantitative small-animal PET data, uncorrected for attenuation, and then of the same semi-quantitative data corrected by means of recovery coefficients (RCs) based on phantom studies. A phantom containing six fillable spheres (diameter range: 4.4-14 mm) was filled with an 18F-FDG solution (spheres/background activity = 10.1, 5.1 and 2.5). RCs, defined as measured activity/expected activity, were calculated. Nude rats harbouring tumours (n=50) were imaged after injection of 18F-FDG and sacrificed. The standardized uptake value (SUV) in tumours was determined with small-animal PET and compared to ex-vivo counting (ex-vivo SUV). Small-animal PET SUVs were corrected with RCs based on the greatest tumour diameter. Tumour proliferation was assessed with cyclin A immunostaining and correlated to the SUV. RCs ranged from 0.33 for the smallest sphere to 0.72 for the largest. A sigmoidal correlation was found between RCs and sphere diameters (r² = 0.99). Small-animal PET SUVs were well correlated with ex-vivo SUVs (y = 0.48x - 0.2; r² = 0.71), and the use of RCs based on the greatest tumour diameter significantly improved the regression (y = 0.84x - 0.81; r² = 0.77), except for tumours with extensive necrosis. Similar results were obtained without sacrificing animals, by using PET images to estimate tumour dimensions. RC-based corrections improved the correlation between small-animal PET SUVs and tumour proliferation (uncorrected data: Rho = 0.79; corrected data: Rho = 0.83). Recovery correction significantly improves both the accuracy of small-animal PET semi-quantitative data in rat studies and their correlation with tumour proliferation, except for largely necrotic tumours.
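
    The correction strategy described above can be sketched as a two-step recipe: fit a sigmoid recovery coefficient as a function of sphere diameter from the phantom data, then divide each measured SUV by the RC predicted from the greatest tumour diameter. The numbers and the particular sigmoid parameterization below are illustrative assumptions, not the published fit.

    ```python
    # Sketch of recovery-coefficient correction: sigmoid RC(diameter) fit on
    # phantom spheres, then SUV / RC for a tumour. Values are placeholders.
    import numpy as np
    from scipy.optimize import curve_fit

    def sigmoid(d, rc_max, d50, slope):
        return rc_max / (1.0 + np.exp(-(d - d50) / slope))

    diameters_mm = np.array([4.4, 5.5, 7.0, 9.0, 11.0, 14.0])     # phantom spheres
    rc_measured = np.array([0.33, 0.42, 0.52, 0.61, 0.68, 0.72])   # measured/expected activity

    params, _ = curve_fit(sigmoid, diameters_mm, rc_measured, p0=[0.75, 7.0, 2.0])

    def corrected_suv(suv_pet, tumour_diameter_mm):
        """Apply the partial-volume correction to a measured SUV."""
        return suv_pet / sigmoid(tumour_diameter_mm, *params)

    print(round(corrected_suv(suv_pet=2.1, tumour_diameter_mm=8.0), 2))
    ```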

  13. Retention of Nontraditional Students: A Quantitative Research Study

    ERIC Educational Resources Information Center

    Nichols, Shirley J.

    2009-01-01

    The purpose of this quantitative correlational research study was to investigate, describe, and measure factors influencing retention of nontraditional first and second year students at a university located in the Midwestern United States. Retention of adult students has become a major issue for many institutions of higher education and many…

  14. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of peoples' experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  15. Nonparametric evaluation of quantitative traits in population-based association studies when the genetic model is unknown.

    PubMed

    Konietschke, Frank; Libiger, Ondrej; Hothorn, Ludwig A

    2012-01-01

    Statistical association between a single nucleotide polymorphism (SNP) genotype and a quantitative trait in genome-wide association studies is usually assessed using a linear regression model or, in the case of non-normally distributed trait values, using the Kruskal-Wallis test. While linear regression models assume an additive mode of inheritance via equidistant genotype scores, the Kruskal-Wallis test merely tests global differences in trait values among the three genotype groups. Both approaches thus exhibit suboptimal power when the underlying inheritance mode is dominant or recessive. Furthermore, these tests do not perform well in the common situations in which only a few trait values are available in a rare genotype category (imbalance), or in which the values associated with the three genotype categories exhibit unequal variance (variance heterogeneity). We propose a maximum test based on a Marcus-type multiple contrast test for relative effect sizes. This test allows model-specific testing of dominant, additive or recessive modes of inheritance, and it is robust against variance heterogeneity. We show how to obtain mode-specific simultaneous confidence intervals for the relative effect sizes to aid in interpreting the biological relevance of the results. Further, we discuss the use of a related all-pairwise comparisons contrast test with range-preserving confidence intervals as an alternative to the Kruskal-Wallis heterogeneity test. We applied the proposed maximum test to the Bogalusa Heart Study dataset and gained a remarkable increase in the power to detect association, particularly for rare genotypes. Our simulation study also demonstrated that the proposed non-parametric tests control the family-wise error rate in the presence of non-normality and variance heterogeneity, in contrast to the standard parametric approaches. We provide a publicly available R library nparcomp that can be used to estimate simultaneous confidence intervals or compatible

  16. A Quantitative ADME-based Tool for Exploring Human ...

    EPA Pesticide Factsheets

    Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It

  17. The relationship between international trade and non-nutritional health outcomes: A systematic review of quantitative studies.

    PubMed

    Burns, Darren K; Jones, Andrew P; Suhrcke, Marc

    2016-03-01

    Markets throughout the world have been reducing barriers to international trade and investment in recent years. The resulting increases in levels of international trade and investment have subsequently generated research interest into the potential population health impact. We present a systematic review of quantitative studies investigating the relationship between international trade, foreign direct investment and non-nutritional health outcomes. Articles were systematically collected from the SCOPUS, PubMed, EconLit and Web of Science databases. Due to the heterogeneous nature of the evidence considered, the 16 included articles were subdivided into individual level data analyses, selected country analyses and international panel analyses. Articles were then quality assessed using a tool developed as part of the project. Nine of the studies were assessed to be high quality, six as medium quality, and one as low quality. The evidence from the quantitative literature suggests that overall, there appears to be a beneficial association between international trade and population health. There was also evidence of the importance of foreign direct investment, yet a lack of research considering the direction of causality. Taken together, quantitative research into the relationship between trade and non-nutritional health indicates trade to be beneficial, yet this body of research is still in its infancy. Future quantitative studies based on this foundation will provide a stronger basis on which to inform relevant national and international institutions about the health consequences of trade policies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Classification-based quantitative analysis of stable isotope labeling by amino acids in cell culture (SILAC) data.

    PubMed

    Kim, Seongho; Carruthers, Nicholas; Lee, Joohyoung; Chinni, Sreenivasa; Stemmer, Paul

    2016-12-01

    Stable isotope labeling by amino acids in cell culture (SILAC) is a practical and powerful approach for quantitative proteomic analysis. A key advantage of SILAC is the ability to simultaneously detect the isotopically labeled peptides in a single instrument run and so guarantee relative quantitation for a large number of peptides without introducing any variation caused by separate experiments. However, few approaches are available for assessing protein ratios, and none of the existing algorithms pays much attention to proteins with only one peptide hit. We introduce new quantitative approaches to SILAC protein-level summarization using classification-based methodologies, such as Gaussian mixture models with EM algorithms and their Bayesian counterparts, as well as K-means clustering. In addition, a new approach is developed that combines a Gaussian mixture model with a stochastic, metaheuristic global optimization algorithm, particle swarm optimization (PSO), to avoid premature convergence or becoming stuck in a local optimum. Our simulation studies show that the newly developed PSO-based method performs best in terms of F1 score, and the proposed methods further demonstrate the ability to detect potential markers in real SILAC experimental data. The developed approach is applicable regardless of the number of peptide hits a protein has, rescuing many proteins that would otherwise be removed. Furthermore, no additional correction for multiple comparisons is necessary for the developed methods, enabling direct interpretation of the analysis outcomes. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
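
    A minimal version of the classification-based idea, using only an off-the-shelf Gaussian mixture fit on peptide-level log-ratios and taking the dominant component as the protein-level summary, is sketched below. The PSO-refined and Bayesian variants from the paper are not reproduced, and the ratios are synthetic.

    ```python
    # Hedged sketch: summarise SILAC peptide log2(H/L) ratios for one protein
    # with a two-component Gaussian mixture; report the dominant component mean.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(42)
    # peptide log2 ratios: a main regulated population plus a few outliers
    log_ratios = np.concatenate([rng.normal(1.0, 0.15, 12),
                                 rng.normal(-0.2, 0.1, 3)]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(log_ratios)
    dominant = int(np.argmax(gmm.weights_))            # component with most peptides
    protein_log2_ratio = gmm.means_[dominant, 0]
    print(f"protein-level log2(H/L) = {protein_log2_ratio:.2f}")
    ```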

  19. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  20. A Quantitative Study Identifying Political Strategies Used by Principals of Dual Language Programs

    ERIC Educational Resources Information Center

    Girard, Guadalupe

    2017-01-01

    Purpose. The purpose of this quantitative study was to identify the external and internal political strategies used by principals that allow them to successfully navigate the political environment surrounding dual language programs. Methodology. This quantitative study used descriptive research to collect, analyze, and report data that identified…

  1. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure-activity relationship.

    PubMed

    Wang, Hui; Jiang, Mingyue; Li, Shujun; Hse, Chung-Yun; Jin, Chunde; Sun, Fangli; Li, Zhuo

    2017-09-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure-activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and Penicillium citrinum (P. citrinum) were analysed. The QSAR models (R² = 0.9346 for A. niger, R² = 0.9590 for P. citrinum) were constructed and validated. The models indicated that the molecular polarity and the Max atomic orbital electronic population had a significant effect on antifungal activity. Based on the best QSAR models, two new compounds were designed and synthesized. Antifungal activity tests proved that both of them have great bioactivity against the selected fungi.

  2. Design of cinnamaldehyde amino acid Schiff base compounds based on the quantitative structure–activity relationship

    PubMed Central

    Wang, Hui; Jiang, Mingyue; Hse, Chung-Yun; Jin, Chunde; Sun, Fangli; Li, Zhuo

    2017-01-01

    Cinnamaldehyde amino acid Schiff base (CAAS) is a new class of safe, bioactive compounds which could be developed as potential antifungal agents for fungal infections. To design new cinnamaldehyde amino acid Schiff base compounds with high bioactivity, the quantitative structure–activity relationships (QSARs) for CAAS compounds against Aspergillus niger (A. niger) and Penicillium citrinum (P. citrinum) were analysed. The QSAR models (R² = 0.9346 for A. niger, R² = 0.9590 for P. citrinum) were constructed and validated. The models indicated that the molecular polarity and the Max atomic orbital electronic population had a significant effect on antifungal activity. Based on the best QSAR models, two new compounds were designed and synthesized. Antifungal activity tests proved that both of them have great bioactivity against the selected fungi. PMID:28989758

  3. Communication patterns in a psychotherapy following traumatic brain injury: A quantitative case study based on symbolic dynamics

    PubMed Central

    2011-01-01

    Background The role of psychotherapy in the treatment of traumatic brain injury is receiving increased attention. The evaluation of psychotherapy with these patients has been conducted largely in the absence of quantitative data concerning the therapy itself. Quantitative methods for characterizing the sequence-sensitive structure of patient-therapist communication are now being developed with the objective of improving the effectiveness of psychotherapy following traumatic brain injury. Methods The content of three therapy session transcripts (sessions were separated by four months) obtained from a patient with a history of several motor vehicle accidents who was receiving dialectical behavior therapy was scored and analyzed using methods derived from the mathematical theory of symbolic dynamics. Results The analysis of symbol frequencies was largely uninformative. When repeated triples were examined a marked pattern of change in content was observed over the three sessions. The context free grammar complexity and the Lempel-Ziv complexity were calculated for each therapy session. For both measures, the rate of complexity generation, expressed as bits per minute, increased longitudinally during the course of therapy. The between-session increases in complexity generation rates are consistent with calculations of mutual information. Taken together these results indicate that there was a quantifiable increase in the variability of patient-therapist verbal behavior during the course of therapy. Comparison of complexity values against values obtained from equiprobable random surrogates established the presence of a nonrandom structure in patient-therapist dialog (P = .002). Conclusions While recognizing that only limited conclusions can be based on a case history, it can be noted that these quantitative observations are consistent with qualitative clinical observations of increases in the flexibility of discourse during therapy. These procedures can be of particular
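
    One of the measures used above, Lempel-Ziv complexity, can be computed for any coded symbol sequence with a short parsing routine; a sketch is given below. The symbol string is an invented toy sequence, not scored therapy content, and the exact complexity variant used by the authors may differ.

    ```python
    # Sketch of a Lempel-Ziv (LZ76-style) complexity count for a symbol sequence.
    def lempel_ziv_complexity(sequence):
        """Count the phrases in a left-to-right LZ parsing of the sequence."""
        i, c, n = 0, 0, len(sequence)
        while i < n:
            length = 1
            # extend the current phrase while it is still a substring of the prefix
            while i + length <= n and sequence[i:i + length] in sequence[:i + length - 1]:
                length += 1
            c += 1
            i += length
        return c

    symbols = "ABABCABCDABCDE" * 3          # toy coded transcript
    print(lempel_ziv_complexity(symbols))   # number of distinct parsed phrases
    ```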

  4. Design of 3-D adipospheres for quantitative metabolic study

    PubMed Central

    Akama, Takeshi; Leung, Brendan M.; Labuz, Joseph M.; Takayama, Shuichi; Chun, Tae-Hwa

    2017-01-01

    Quantitative assessment of adipose mitochondrial activity is critical for a better understanding of adipose tissue function in obesity and diabetes. While the two-dimensional (2-D) tissue culture method has been sufficient to discover key molecules that regulate adipocyte differentiation and function, the method is insufficient to determine the role of extracellular matrix (ECM) molecules and their modifiers, such as matrix metalloproteinases (MMPs), in regulating adipocyte function in three-dimensional (3-D) in vivo-like microenvironments. By using a 3-D hanging drop tissue culture system, we are able to produce scalable 3-D adipospheres that are suitable for quantitative mitochondrial study in a 3-D microenvironment. PMID:28244051

  5. Quantitative Methods in the Study of Local History

    ERIC Educational Resources Information Center

    Davey, Pene

    1974-01-01

    The author suggests how the quantitative analysis of data from census records, assessment rolls, and newspapers may be integrated into the classroom. Suggestions for obtaining quantitative data are provided. (DE)

  6. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  7. Relative validity of a semi-quantitative, web-based FFQ used in the 'Snart Forældre' cohort - a Danish study of diet and fertility.

    PubMed

    Knudsen, Vibeke K; Hatch, Elizabeth E; Cueto, Heidi; Tucker, Katherine L; Wise, Lauren; Christensen, Tue; Mikkelsen, Ellen M

    2016-04-01

    To assess the relative validity of a semi-quantitative, web-based FFQ completed by female pregnancy planners in the Danish 'Snart Forældre' study. We validated a web-based FFQ based on the FFQ used in the Danish National Birth Cohort against a 4 d food diary (FD) and assessed the relative validity of intakes of foods and nutrients. We compared means and medians of intakes, and calculated Pearson correlation coefficients and de-attenuated coefficients to assess agreement between the two methods. We also calculated the proportion correctly classified based on the same or adjacent quintile of intake and the proportion grossly misclassified (extreme quintiles). Participants (n = 128) in the 'Snart Forældre' study who had completed the web-based FFQ were invited to participate in the validation study. The final sample comprised ninety-seven women aged 20-42 years from the 'Snart Forældre' study. Reported intakes of dairy products, vegetables and potatoes were higher in the FFQ compared with the FD, whereas reported intakes of fruit, meat, sugar and beverages were lower in the FFQ than in the FD. Overall the de-attenuated correlation coefficients were acceptable, ranging from 0·33 for energy to 0·93 for vitamin D. The majority of the women were classified in the same or adjacent quintile and few women were misclassified (extreme quintiles). The web-based FFQ performs well for ranking women of reproductive age according to high or low intake of foods and nutrients and, thus, provides a solid basis for investigating associations between diet and fertility.

  8. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model

    PubMed Central

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software “Kongoh” for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1–4 persons’ contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI’s contribution in true contributors and non-contributors by using 2–4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI’s contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples. PMID:29149210

  9. Development and validation of open-source software for DNA mixture interpretation based on a quantitative continuous model.

    PubMed

    Manabe, Sho; Morimoto, Chie; Hamano, Yuya; Fujimoto, Shuntaro; Tamaki, Keiji

    2017-01-01

    In criminal investigations, forensic scientists need to evaluate DNA mixtures. The estimation of the number of contributors and evaluation of the contribution of a person of interest (POI) from these samples are challenging. In this study, we developed a new open-source software "Kongoh" for interpreting DNA mixture based on a quantitative continuous model. The model uses quantitative information of peak heights in the DNA profile and considers the effect of artifacts and allelic drop-out. By using this software, the likelihoods of 1-4 persons' contributions are calculated, and the most optimal number of contributors is automatically determined; this differs from other open-source software. Therefore, we can eliminate the need to manually determine the number of contributors before the analysis. Kongoh also considers allele- or locus-specific effects of biological parameters based on the experimental data. We then validated Kongoh by calculating the likelihood ratio (LR) of a POI's contribution in true contributors and non-contributors by using 2-4 person mixtures analyzed through a 15 short tandem repeat typing system. Most LR values obtained from Kongoh during true-contributor testing strongly supported the POI's contribution even for small amounts or degraded DNA samples. Kongoh correctly rejected a false hypothesis in the non-contributor testing, generated reproducible LR values, and demonstrated higher accuracy of the estimated number of contributors than another software based on the quantitative continuous model. Therefore, Kongoh is useful in accurately interpreting DNA evidence like mixtures and small amounts or degraded DNA samples.

  10. [Quantitative classification-based occupational health management for electroplating enterprises in Baoan District of Shenzhen, China].

    PubMed

    Zhang, Sheng; Huang, Jinsheng; Yang, Baigbing; Lin, Binjie; Xu, Xinyun; Chen, Jinru; Zhao, Zhuandi; Tu, Xiaozhi; Bin, Haihua

    2014-04-01

    To improve the occupational health management levels in electroplating enterprises with quantitative classification measures and to provide a scientific basis for the prevention and control of occupational hazards in electroplating enterprises and the protection of workers' health. A quantitative classification table was created for the occupational health management in electroplating enterprises. The evaluation indicators included 6 items and 27 sub-items, with a total score of 100 points. Forty electroplating enterprises were selected and scored according to the quantitative classification table. These electroplating enterprises were classified into grades A, B, and C based on the scores. Among 40 electroplating enterprises, 11 (27.5%) had scores of >85 points (grade A), 23 (57.5%) had scores of 60-85 points (grade B), and 6 (15.0%) had scores of <60 points (grade C). Quantitative classification management for electroplating enterprises is a valuable attempt, which is helpful for the supervision and management by the health department and provides an effective method for the self-management of enterprises.
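
    As a worked illustration of the grading step described above, the sketch below applies the reported cut-offs (>85 points for grade A, 60-85 for grade B, <60 for grade C) to total scores; the example scores are hypothetical.

        # Grade an enterprise's total occupational-health score (0-100 points).
        def grade(total_score):
            if total_score > 85:
                return "A"
            if total_score >= 60:
                return "B"
            return "C"

        print([grade(s) for s in (92, 73, 55)])   # ['A', 'B', 'C']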

  11. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors’ two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, ex vivo, and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed by the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the authors discovered that the errors ranged from 0% for relatively small-size targets to 26% for relatively large-size targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular

  12. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative powers for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfactory results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
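
    The abstract describes two computational steps: turning a drug's side-effect profile into a single weighted score, and averaging the scores predicted from several feature types. The sketch below illustrates both steps with made-up weights and predictions; it is not the authors' implementation.

        import numpy as np

        def quantitative_score(profile, weights):
            """Weighted sum over a 0/1 side-effect profile."""
            return float(np.dot(profile, weights))

        def ensemble_average(per_feature_predictions):
            """Average the scores predicted from different feature sets
            (e.g. substructures, targets, indications)."""
            return np.mean(per_feature_predictions, axis=0)

        profile = np.array([1, 0, 1, 1])            # presence/absence of 4 side effects
        weights = np.array([0.9, 0.2, 0.5, 0.1])    # hypothetical severity weights
        print(quantitative_score(profile, weights))              # 1.5
        print(ensemble_average([[1.4, 0.3], [1.6, 0.5]]))        # [1.5 0.4]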

  13. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, the biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Owing to efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing has been achieved, featuring a ratio change of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
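
    The self-ratiometric readout can be summarized in a few lines: pH is inferred from the ratio of the pH-responsive 475 nm band to the 645 nm reference band via a calibration curve. The sketch below assumes a linear calibration with the reported sensitivity of about 3.56 ratio units per pH unit; the intercept and the example intensities are assumptions, not values from the paper.

        SLOPE = 3.56       # reported sensitivity: ratio change per pH unit
        INTERCEPT = -8.0   # hypothetical calibration intercept

        def ph_from_ratio(i_475, i_645):
            """Infer pHi from the 475 nm / 645 nm upconversion intensity ratio."""
            return (i_475 / i_645 - INTERCEPT) / SLOPE

        print(round(ph_from_ratio(i_475=13.4, i_645=1.0), 2))   # ~6.01 on this toy calibration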

  14. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

    Accurate quantitation of intracellular pH (pHi) is of great importance in revealing cellular activities and providing early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles and/or dye pairs have already been developed for pHi sensing. To date, the biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonant energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as the pHi response and self-ratiometric reference signal, respectively. This direct quantitative sensing approach circumvents the traditional software-based post-processing of images, which may lead to relatively large uncertainty in the results. Owing to efficient FRET and the absence of fluorescence background, highly sensitive and accurate sensing has been achieved, featuring a ratio change of 3.56 per pH unit over the pHi range 3.0-7.0 with a deviation of less than 0.43. This approach should facilitate research in pHi-related areas and the development of intracellular drug delivery systems.

  15. 3D-quantitative structure-activity relationship study for the design of novel enterovirus A71 3C protease inhibitors.

    PubMed

    Nie, Quandeng; Xu, Xiaoyi; Zhang, Qi; Ma, Yuying; Yin, Zheng; Shang, Luqing

    2018-06-07

    A three-dimensional quantitative structure-activity relationship model of enterovirus A71 3C protease inhibitors was constructed in this study. The protein-ligand interaction fingerprint was analyzed to generate a pharmacophore model. A predictive and reliable three-dimensional quantitative structure-activity relationship model was built based on the Flexible Alignment of AutoGPA. Moreover, three novel compounds (I-III) were designed and evaluated for their biochemical activity against 3C protease and anti-enterovirus A71 activity in vitro. Compound III exhibited excellent inhibitory activity (IC50 = 0.031 ± 0.005 μM, EC50 = 0.036 ± 0.007 μM). Thus, this study provides a useful quantitative structure-activity relationship model to develop potent inhibitors for enterovirus A71 3C protease. This article is protected by copyright. All rights reserved.

  16. A web-based quantitative signal detection system on adverse drug reaction in China.

    PubMed

    Li, Chanjuan; Xia, Jielai; Deng, Jianxiong; Chen, Wenge; Wang, Suzhen; Jiang, Jing; Chen, Guanquan

    2009-07-01

    To establish a web-based quantitative signal detection system for adverse drug reactions (ADRs) based on spontaneous reporting to the Guangdong province drug-monitoring database in China. Using Microsoft Visual Basic and Active Server Pages programming languages and SQL Server 2000, a web-based system with three software modules was programmed to perform data preparation and association detection, and to generate reports. Information component (IC), the internationally recognized measure of disproportionality for quantitative signal detection, was integrated into the system, and its capacity for signal detection was tested with ADR reports collected from 1 January 2002 to 30 June 2007 in Guangdong. A total of 2,496 associations including known signals were mined from the test database. Signals (e.g., cefradine-induced hematuria) were found early by using the IC analysis. In addition, 291 drug-ADR associations were alerted for the first time in the second quarter of 2007. The system can be used for the detection of significant associations from the Guangdong drug-monitoring database and could be an extremely useful adjunct to the expert assessment of very large numbers of spontaneously reported ADRs for the first time in China.
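
    For readers unfamiliar with the information component (IC), one common count-based formulation is IC = log2(P(drug & ADR) / (P(drug) x P(ADR))), estimated from the report database; a positive IC flags a drug-ADR pair reported more often than expected under independence. The sketch below uses hypothetical counts and omits the shrinkage and credibility intervals applied in practical implementations.

        import math

        def information_component(n_drug_adr, n_drug, n_adr, n_total):
            """Disproportionality of a drug-ADR pair relative to independence."""
            p_joint = n_drug_adr / n_total
            p_expected = (n_drug / n_total) * (n_adr / n_total)
            return math.log2(p_joint / p_expected)

        # Hypothetical counts: 40 reports of the pair among 100,000 reports.
        print(round(information_component(40, 800, 1200, 100_000), 2))   # ~2.06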

  17. Hybrid quantitative MRI using chemical shift displacement and recovery-based simultaneous water and lipid imaging: A preliminary study.

    PubMed

    Ohno, Naoki; Miyati, Tosiaki; Suzuki, Shuto; Kan, Hirohito; Aoki, Toshitaka; Nakamura, Yoshitaka; Hiramatsu, Yuki; Kobayashi, Satoshi; Gabata, Toshifumi

    2018-07-01

    To suppress olefinic signals and enable simultaneous and quantitative estimation of multiple functional parameters associated with water and lipid, we investigated a modified method using chemical shift displacement and recovery-based separation of lipid tissue (SPLIT) involving acquisitions with different inversion times (TIs), echo times (TEs), and b-values. Single-shot diffusion echo-planar imaging (SSD-EPI) with multiple b-values (0-3000 s/mm2) was performed without fat suppression to separate water and lipid images using the chemical shift displacement of lipid signals in the phase-encoding direction. An inversion pulse (TI = 292 ms) was applied to SSD-EPI to remove olefinic signals. Consecutively, SSD-EPI (b = 0 s/mm2) was performed with TI = 0 ms and TE = 31.8 ms for T1 and T2 measurements, respectively. Under these conditions, transverse water and lipid images at the maximum diameter of the right calf were obtained in six healthy subjects. T1, T2, and the apparent diffusion coefficients (ADC) were then calculated for the tibialis anterior (TA), gastrocnemius (GM), and soleus (SL) muscles, tibialis bone marrow (TB), and subcutaneous fat (SF). Perfusion-related (D*) and restricted diffusion coefficients (D) were calculated for the muscles. Lastly, the lipid fractions (LF) of the muscles were determined after T1 and T2 corrections. The modified SPLIT method facilitated sufficient separation of water and lipid images of the calf, and the inversion pulse with TI of 292 ms effectively suppressed olefinic signals. All quantitative parameters obtained with the modified SPLIT method were found to be in general agreement with those previously reported in the literature. The modified SPLIT technique enabled sufficient suppression of olefinic signals and simultaneous acquisition of quantitative parameters including diffusion, perfusion, T1 and T2 relaxation times, and LF. Copyright © 2018. Published by Elsevier Inc.
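
    As a small illustration of one of the quantitative parameters above, the sketch below fits a mono-exponential diffusion model S(b) = S0*exp(-b*ADC) to multi-b-value signals with a log-linear least-squares fit; the data are synthetic, and the study's perfusion (D*), restricted diffusion (D), T1, T2 and lipid-fraction fits are not reproduced here.

        import numpy as np

        def fit_adc(b_values, signals):
            """Return (ADC, S0) from a log-linear fit of S(b) = S0 * exp(-b * ADC)."""
            b = np.asarray(b_values, dtype=float)
            log_s = np.log(np.asarray(signals, dtype=float))
            slope, intercept = np.polyfit(b, log_s, 1)
            return -slope, np.exp(intercept)

        b = [0, 500, 1000, 2000]                 # s/mm2
        s = [1000.0, 670.3, 449.3, 201.9]        # synthetic signals, ADC ~ 0.8e-3 mm2/s
        adc, s0 = fit_adc(b, s)
        print(f"ADC = {adc:.2e} mm2/s, S0 = {s0:.0f}")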

  18. Metstoich--Teaching Quantitative Metabolism and Energetics in Biochemical Engineering

    ERIC Educational Resources Information Center

    Wong, Kelvin W. W.; Barford, John P.

    2010-01-01

    Metstoich, a metabolic calculator developed for teaching, can provide a novel way to teach quantitative metabolism to biochemical engineering students. It can also introduce biochemistry/life science students to the quantitative aspects of life science subjects they have studied. Metstoich links traditional biochemistry-based metabolic approaches…

  19. Nonmydriatic fluorescence-based quantitative imaging of human macular pigment distributions

    NASA Astrophysics Data System (ADS)

    Sharifzadeh, Mohsen; Bernstein, Paul S.; Gellermann, Werner

    2006-10-01

    We have developed a CCD-camera-based nonmydriatic instrument that detects fluorescence from retinal lipofuscin chromophores ("autofluorescence") as a means to indirectly quantify and spatially image the distribution of macular pigment (MP). The lipofuscin fluorescence intensity is reduced at all retinal locations containing MP, since MP has a competing absorption in the blue-green wavelength region. Projecting a large diameter, 488 nm excitation spot onto the retina, centered on the fovea, but extending into the macular periphery, and comparing lipofuscin fluorescence intensities outside and inside the foveal area, it is possible to spatially map out the distribution of MP. Spectrally selective detection of the lipofuscin fluorescence reveals an important wavelength dependence of the obtainable image contrast and deduced MP optical density levels, showing that it is important to block out interfering fluorescence contributions in the detection setup originating from ocular media such as the lens. Measuring 70 healthy human volunteer subjects with no ocular pathologies, we find widely varying spatial extent of MP, distinctly differing distribution patterns of MP, and strongly differing absolute MP levels among individuals. Our population study suggests that MP imaging based on lipofuscin fluorescence is useful as a relatively simple, objective, and quantitative noninvasive optical technique suitable to rapidly screen MP levels and distributions in healthy humans with undilated pupils.

  20. A Quantitative Study of Faculty Perceptions and Attitudes on Asynchronous Virtual Teamwork Using the Technology Acceptance Model

    ERIC Educational Resources Information Center

    Wolusky, G. Anthony

    2016-01-01

    This quantitative study used a web-based questionnaire to assess the attitudes and perceptions of online and hybrid faculty towards student-centered asynchronous virtual teamwork (AVT) using the technology acceptance model (TAM) of Davis (1989). AVT is online student participation in a team approach to problem-solving culminating in a written…

  1. Multigrid-based reconstruction algorithm for quantitative photoacoustic tomography

    PubMed Central

    Li, Shengfu; Montcel, Bruno; Yuan, Zhen; Liu, Wanyu; Vray, Didier

    2015-01-01

    This paper proposes a multigrid inversion framework for quantitative photoacoustic tomography reconstruction. The forward model of optical fluence distribution and the inverse problem are solved at multiple resolutions. A fixed-point iteration scheme is formulated for each resolution and used as a cost function. The simulated and experimental results for quantitative photoacoustic tomography reconstruction show that the proposed multigrid inversion can dramatically reduce the required number of iterations for the optimization process without loss of reliability in the results. PMID:26203371

  2. Quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria by gas chromatography-mass spectrometry.

    PubMed

    Guan, Wenna; Zhao, Hui; Lu, Xuefeng; Wang, Cong; Yang, Menglong; Bai, Fali

    2011-11-11

    Simple and rapid quantitative determination of fatty-acid-based biofuels is of great importance for studying progress in the genetic engineering of microalgae for biofuel production. Ideal biofuels produced from biological systems should be chemically similar to petroleum, such as fatty-acid-based molecules including free fatty acids, fatty acid methyl esters, fatty acid ethyl esters, fatty alcohols and fatty alkanes. This study established a gas chromatography-mass spectrometry (GC-MS) method for simultaneous quantification of seven free fatty acids, nine fatty acid methyl esters, five fatty acid ethyl esters, five fatty alcohols and three fatty alkanes produced by wild-type Synechocystis PCC 6803 and its genetically engineered strain. Data obtained from GC-MS analyses were quantified using internal standard peak area comparisons. The linearity, limit of detection (LOD) and precision (RSD) of the method were evaluated. The results demonstrated that fatty-acid-based biofuels can be directly determined by GC-MS without derivatization. Therefore, rapid and reliable quantitative analysis of fatty-acid-based biofuels produced by wild-type and genetically engineered cyanobacteria can be achieved using the GC-MS method established in this work. Copyright © 2011 Elsevier B.V. All rights reserved.
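
    Internal-standard quantification, as used above, reduces to a peak-area ratio scaled by the internal-standard concentration and a response factor from calibration. A minimal sketch with hypothetical numbers:

        def quantify(area_analyte, area_is, conc_is, response_factor=1.0):
            """Analyte concentration from the analyte/internal-standard peak-area ratio."""
            return (area_analyte / area_is) * conc_is / response_factor

        # Hypothetical peak areas and a 10 mg/L internal standard.
        print(quantify(area_analyte=2.4e6, area_is=1.2e6, conc_is=10.0))   # 20.0 mg/L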

  3. Improved quantitative analysis of spectra using a new method of obtaining derivative spectra based on a singular perturbation technique.

    PubMed

    Li, Zhigang; Wang, Qiaoyun; Lv, Jiangtao; Ma, Zhenhe; Yang, Linjuan

    2015-06-01

    Spectroscopy is often applied when a rapid quantitative analysis is required, but one challenge is the translation of raw spectra into a final analysis. Derivative spectra are often used as a preliminary preprocessing step to resolve overlapping signals, enhance signal properties, and suppress unwanted spectral features that arise due to non-ideal instrument and sample properties. In this study, to improve the quantitative analysis of near-infrared spectra, derivatives of noisy raw spectral data need to be estimated with high accuracy. A new spectral estimator based on a singular perturbation technique, called the singular perturbation spectra estimator (SPSE), is presented, and a stability analysis of the estimator is given. Theoretical analysis and simulation results confirm that the derivatives can be estimated with high accuracy using this estimator. Furthermore, the effectiveness of the estimator for processing noisy infrared spectra is evaluated using the analysis of beer spectra. The derivative spectra of the beer and marzipan samples are used to build calibration models using partial least squares (PLS) modeling. The results show that PLS based on the new estimator can achieve better performance compared with the Savitzky-Golay algorithm and can serve as an alternative choice for quantitative analytical applications.
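
    The SPSE estimator itself is not reproduced here; as a point of reference, the sketch below computes the Savitzky-Golay derivative that the study uses as its baseline, applied to a noisy synthetic spectrum before such derivatives would be fed into a PLS calibration model.

        import numpy as np
        from scipy.signal import savgol_filter

        x = np.linspace(0, 10, 500)
        spectrum = np.exp(-(x - 4) ** 2) + 0.5 * np.exp(-(x - 7) ** 2 / 0.5)
        noisy = spectrum + np.random.normal(0, 0.01, x.size)

        # First-derivative spectrum via Savitzky-Golay (21-point window, quadratic fit).
        d1 = savgol_filter(noisy, window_length=21, polyorder=2, deriv=1, delta=x[1] - x[0])
        print(d1.shape)   # (500,)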

  4. Quantitative study of bundle size effect on thermal conductivity of single-walled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Feng, Ya; Inoue, Taiki; An, Hua; Xiang, Rong; Chiashi, Shohei; Maruyama, Shigeo

    2018-05-01

    Compared with isolated single-walled carbon nanotubes (SWNTs), thermal conductivity is greatly impeded in SWNT bundles; however, the measurement of the bundle size effect is difficult. In this study, the number of SWNTs in a bundle was determined based on the transferred horizontally aligned SWNTs on a suspended micro-thermometer to quantitatively study the effect of the bundle size on thermal conductivity. Increasing the bundle size significantly degraded the thermal conductivity. For isolated SWNTs, thermal conductivity was approximately 5000 ± 1000 W m-1 K-1 at room temperature, three times larger than that of the four-SWNT bundle. The logarithmic deterioration of thermal conductivity with increasing bundle size can be attributed to the increased scattering rate with neighboring SWNTs, based on kinetic theory.
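
    For orientation, the kinetic-theory relation invoked above is kappa = (1/3)*C*v*L (volumetric heat capacity C, phonon group velocity v, mean free path L), so a shorter effective mean free path caused by scattering with neighboring tubes in a bundle directly lowers the predicted conductivity. The numbers below are illustrative only and are not taken from the study.

        def kappa_kinetic(c_volumetric, velocity, mean_free_path):
            """Kinetic-theory estimate of thermal conductivity, kappa = C*v*L/3."""
            return c_volumetric * velocity * mean_free_path / 3.0

        # Hypothetical values chosen only to land near the reported ~5000 W/(m K).
        print(kappa_kinetic(c_volumetric=1.7e6, velocity=1.0e4, mean_free_path=8.8e-7))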

  5. Study protocol: quantitative fibronectin to help decision-making in women with symptoms of preterm labour (QUIDS) part 2, UK Prospective Cohort Study

    PubMed Central

    Wotherspoon, Lisa M; Boyd, Kathleen Anne; Morris, Rachel K; Jackson, Lesley; Chandiramani, Manju; David, Anna L; Khalil, Asma; Shennan, Andrew; Hodgetts Morton, Victoria; Lavender, Tina; Khan, Khalid; Harper-Clarke, Susan; Mol, Ben; Riley, Richard D; Norrie, John; Norman, Jane

    2018-01-01

    Introduction The aim of the QUIDS study is to develop a decision support tool for the management of women with symptoms and signs of preterm labour, based on a validated prognostic model using quantitative fetal fibronectin (fFN) concentration, in combination with clinical risk factors. Methods and analysis The study will evaluate the Rapid fFN 10Q System (Hologic, Marlborough, Massachusetts, USA) which quantifies fFN in a vaginal swab. In QUIDS part 2, we will perform a prospective cohort study in at least eight UK consultant-led maternity units, in women with symptoms of preterm labour at 22+0 to 34+6 weeks gestation to externally validate a prognostic model developed in QUIDS part 1. The effects of quantitative fFN on anxiety will be assessed, and acceptability of the test and prognostic model will be evaluated in a subgroup of women and clinicians (n=30). The sample size is 1600 women (with estimated 96–192 events of preterm delivery within 7 days of testing). Clinicians will be informed of the qualitative fFN result (positive/negative) but be blinded to quantitative fFN result. Research midwives will collect outcome data from the maternal and neonatal clinical records. The final validated prognostic model will be presented as a mobile or web-based application. Ethics and dissemination The study is funded by the National Institute of Healthcare Research Health Technology Assessment (HTA 14/32/01). It has been approved by the West of Scotland Research Ethics Committee (16/WS/0068). Version Protocol V.2, Date 1 November 2016. Trial registration number ISRCTN41598423 and CPMS: 31277. PMID:29674373

  6. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    PubMed

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for detection of differentially expressed proteins (DEPs). The current data analysis platform typically relies on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework EBprot that directly models the peptide-protein hierarchy and rewards the proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study to show that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than the methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. [Study on ethnic medicine quantitative reference herb,Tibetan medicine fruits of Capsicum frutescens as a case].

    PubMed

    Zan, Ke; Cui, Gan; Guo, Li-Nong; Ma, Shuang-Cheng; Zheng, Jian

    2018-05-01

    The high price and limited availability of reference substances have become obstacles to HPLC assays of ethnic medicines. A new method based on a quantitative reference herb (QRH) was proposed. Specific chromatograms of the fruits of Capsicum frutescens were employed to determine peak positions, and an HPLC quantitative reference herb was prepared from the fruits of C. frutescens. The content of capsaicin and dihydrocapsaicin in the quantitative reference herb was determined by HPLC. Eleven batches of fruits of C. frutescens were analyzed with the quantitative reference herb and the reference substances, respectively. The results showed no difference. The present method is feasible for the quality control of ethnic medicines, and quantitative reference herbs are suitable to replace reference substances in assays. Copyright© by the Chinese Pharmaceutical Association.

  8. PCR-free quantitative detection of genetically modified organism from raw materials. An electrochemiluminescence-based bio bar code method.

    PubMed

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R

    2008-05-15

    A bio bar code assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio bar code assay requires lengthy experimental procedures including the preparation and release of bar code DNA probes from the target-nanoparticle complex and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio bar code assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl) ruthenium (TBR)-labeled bar code DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products.

  9. Diagnostic accuracy of semi-automatic quantitative metrics as an alternative to expert reading of CT myocardial perfusion in the CORE320 study.

    PubMed

    Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C

    To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by area under the receiver operating characteristic curve (AUC). Of the total 377 included patients, 66% were male, median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.9), respectively. In vessel-based analyses, the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading to detect hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
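
    The two analysis steps described above (combining several continuous CTP metrics with logistic regression into one score, then assessing that score against the reference standard by AUC) can be sketched as follows; the data are random placeholders, not CORE320 data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 8))        # 8 semi-automatic CTP metrics per patient
        y = rng.integers(0, 2, size=200)     # reference standard: significant CAD yes/no

        model = LogisticRegression(max_iter=1000).fit(X, y)
        score = model.predict_proba(X)[:, 1]             # combined quantitative CTP metric
        print(round(roc_auc_score(y, score), 2))         # diagnostic accuracy as AUC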

  10. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system and compares a method based on steady-state analysis with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state-based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret the storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided for the quantitative estimation of material properties.

  11. Studying Biology to Understand Risk: Dosimetry Models and Quantitative Adverse Outcome Pathways

    EPA Science Inventory

    Confidence in the quantitative prediction of risk is increased when the prediction is based to as great an extent as possible on the relevant biological factors that constitute the pathway from exposure to adverse outcome. With the first examples now over 40 years old, physiologi...

  12. Diversity of respiratory impedance based on quantitative computed tomography in patients with COPD.

    PubMed

    Wada, Yosuke; Kitaguchi, Yoshiaki; Yasuo, Masanori; Ueno, Fumika; Kawakami, Satoshi; Fukushima, Kiyoyasu; Fujimoto, Keisaku; Hanaoka, Masayuki

    2018-01-01

    This study was conducted in order to investigate the diversity of respiratory physiology, including the respiratory impedance and reversibility of airway obstruction, based on quantitative computed tomography (CT) in patients with COPD. Medical records of 174 stable COPD patients were retrospectively reviewed to obtain the patients' clinical data, including the pulmonary function and imaging data. According to the software-based quantification of the degree of emphysema and airway wall thickness, the patients were classified into the "normal by CT" phenotype, the airway-dominant phenotype, the emphysema-dominant phenotype, and the mixed phenotype. The pulmonary function, including the respiratory impedance evaluated by using the forced oscillation technique (FOT) and the reversibility of airway obstruction in response to inhaled short-acting β2-agonists, was then compared among the four phenotypes. The respiratory system resistance at 5 and 20 Hz (R5 and R20) was significantly higher, and the respiratory system reactance at 5 Hz (X5) was significantly more negative in the airway-dominant and mixed phenotypes than in the other phenotypes. The within-breath changes of X5 (ΔX5) were significantly greater in the mixed phenotype than in the "normal by CT" and emphysema-dominant phenotypes. The FOT parameters (R5, R20, and X5) were significantly correlated with indices of the degree of airway wall thickness and significantly but weakly correlated with the reversibility of airway obstruction. There was no significant correlation between the FOT parameters (R5, R20, and X5) and the degree of emphysema. There is a diversity of respiratory physiology, including the respiratory impedance and reversibility of airway obstruction, based on quantitative CT in patients with COPD. The FOT measurements may reflect the degree of airway disease and aid in detecting airway remodeling in patients with COPD.

  13. Pilot clinical study for quantitative spectral diagnosis of non-melanoma skin cancer.

    PubMed

    Rajaram, Narasimhan; Reichenberg, Jason S; Migden, Michael R; Nguyen, Tri H; Tunnell, James W

    2010-12-01

    Several research groups have demonstrated the non-invasive diagnostic potential of diffuse optical spectroscopy (DOS) and laser-induced fluorescence (LIF) techniques for early cancer detection. By combining both modalities, one can simultaneously measure quantitative parameters related to the morphology, function and biochemical composition of tissue and use them to diagnose malignancy. The objective of this study was to use a quantitative reflectance/fluorescence spectroscopic technique to determine the optical properties of normal skin and non-melanoma skin cancers and the ability to accurately classify them. An additional goal was to determine the ability of the technique to differentiate non-melanoma skin cancers from normal skin. The study comprised 48 lesions measured from 40 patients scheduled for a biopsy of suspected non-melanoma skin cancers. White light reflectance and laser-induced fluorescence spectra (wavelength range = 350-700 nm) were collected from each suspected lesion and adjacent clinically normal skin using a custom-built, optical fiber-based clinical instrument. After measurement, the skin sites were biopsied and categorized according to histopathology. Using a quantitative model, we extracted various optical parameters from the measured spectra that could be correlated to the physiological state of tissue. Scattering from cancerous lesions was significantly lower than normal skin for every lesion group, whereas absorption parameters were significantly higher. Using numerical cut-offs for our optical parameters, our clinical instrument could classify basal cell carcinomas with a sensitivity and specificity of 94% and 89%, respectively. Similarly, the instrument classified actinic keratoses and squamous cell carcinomas with a sensitivity of 100% and specificity of 50%. The measured optical properties and fluorophore contributions of normal skin and non-melanoma skin cancers are significantly different from each other and correlate well
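
    Classification by a numerical cut-off on an optical parameter, and the resulting sensitivity and specificity, can be sketched as below; the example assumes a parameter (such as a scattering coefficient) that is lower in lesions than in normal skin, and all values are synthetic.

        import numpy as np

        def sens_spec(values, labels, cutoff):
            """Call a site 'lesion' when the parameter falls below the cut-off."""
            values, labels = np.asarray(values), np.asarray(labels)
            pred = values < cutoff
            tp = np.sum(pred & (labels == 1)); fn = np.sum(~pred & (labels == 1))
            tn = np.sum(~pred & (labels == 0)); fp = np.sum(pred & (labels == 0))
            return tp / (tp + fn), tn / (tn + fp)

        values = [0.8, 0.9, 1.4, 1.6, 0.7, 1.5]   # hypothetical scattering values
        labels = [1,   1,   0,   0,   1,   0]     # 1 = cancer, 0 = normal (histopathology)
        print(sens_spec(values, labels, cutoff=1.0))   # (1.0, 1.0) on this toy data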

  14. Nuclear medicine and imaging research: Quantitative studies in radiopharmaceutical science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copper, M.; Beck, R.N.

    1991-06-01

    During the past three years the program has undergone a substantial revitalization. There has been no significant change in the scientific direction of this grant, in which emphasis continues to be placed on developing new or improved methods of obtaining quantitative data from radiotracer imaging studies. However, considerable scientific progress has been made in the three areas of interest: Radiochemistry, Quantitative Methodologies, and Experimental Methods and Feasibility Studies, resulting in a sharper focus of perspective and improved integration of the overall scientific effort. Changes in Faculty and staff, including development of new collaborations, have contributed to this, as has acquisition of additional and new equipment and renovations and expansion of the core facilities. 121 refs., 30 figs., 2 tabs.

  15. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, specific core fields and institutes, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into the relevant research in the future. © 2014 Wiley Periodicals, Inc.

  16. Temporal Data Set Reduction Based on D-Optimality for Quantitative FLIM-FRET Imaging.

    PubMed

    Omer, Travis; Intes, Xavier; Hahn, Juergen

    2015-01-01

    Fluorescence lifetime imaging (FLIM) when paired with Förster resonance energy transfer (FLIM-FRET) enables the monitoring of nanoscale interactions in living biological samples. FLIM-FRET model-based estimation methods allow the quantitative retrieval of parameters such as the quenched (interacting) and unquenched (non-interacting) fractional populations of the donor fluorophore and/or the distance of the interactions. The quantitative accuracy of such model-based approaches is dependent on multiple factors such as signal-to-noise ratio and number of temporal points acquired when sampling the fluorescence decays. For high-throughput or in vivo applications of FLIM-FRET, it is desirable to acquire a limited number of temporal points for fast acquisition times. Yet, it is critical to acquire temporal data sets with sufficient information content to allow for accurate FLIM-FRET parameter estimation. Herein, an optimal experimental design approach based upon sensitivity analysis is presented in order to identify the time points that provide the best quantitative estimates of the parameters for a determined number of temporal sampling points. More specifically, the D-optimality criterion is employed to identify, within a sparse temporal data set, the set of time points leading to optimal estimations of the quenched fractional population of the donor fluorophore. Overall, a reduced set of 10 time points (compared to a typical complete set of 90 time points) was identified to have minimal impact on parameter estimation accuracy (≈5%), with in silico and in vivo experiment validations. This reduction of the number of needed time points by almost an order of magnitude allows the use of FLIM-FRET for certain high-throughput applications which would be infeasible if the entire number of time sampling points were used.
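
    The core of D-optimal selection is choosing the subset of candidate time points whose sensitivity matrix J (rows = time points, columns = model parameters) maximizes det(J^T J). The sketch below does this by exhaustive search on a small random sensitivity matrix for clarity; the paper's FLIM-FRET model and its actual sensitivities are not used.

        import numpy as np
        from itertools import combinations

        def d_optimal_subset(J, k):
            """Return the k row indices of J maximizing det(J_s^T J_s)."""
            best_det, best_idx = -np.inf, None
            for idx in combinations(range(J.shape[0]), k):
                Js = J[list(idx), :]
                det = np.linalg.det(Js.T @ Js)
                if det > best_det:
                    best_det, best_idx = det, idx
            return best_idx

        J = np.random.default_rng(1).normal(size=(12, 3))   # 12 candidate times, 3 parameters
        print(d_optimal_subset(J, k=5))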

  17. Gold Nanoparticle Labeling Based ICP-MS Detection/Measurement of Bacteria, and Their Quantitative Photothermal Destruction

    PubMed Central

    Lin, Yunfeng

    2015-01-01

    Bacteria such as Salmonella and E. coli present a great challenge in public health care in today’s society. Protection of public safety against bacterial contamination and rapid diagnosis of infection require simple and fast assays for the detection and elimination of bacterial pathogens. Using Salmonella DT104 as an example bacterial strain for our investigation, we report a rapid and sensitive assay for the qualitative and quantitative detection of bacteria by using antibody affinity binding, popcorn-shaped gold nanoparticle (GNPOP) labeling, surface-enhanced Raman spectroscopy (SERS), and inductively coupled plasma mass spectrometry (ICP-MS) detection. For qualitative analysis, our assay can detect Salmonella within 10 min by Raman spectroscopy; for quantitative analysis, our assay has the ability to measure as few as 100 Salmonella DT104 in a 1 mL sample (100 CFU/mL) within 40 min. Based on the quantitative detection, we investigated the quantitative destruction of Salmonella DT104 and the assay’s photothermal efficiency in order to reduce the amount of GNPOPs in the assay to ultimately eliminate any potential side effects/toxicity to the surrounding cells in vivo. Results suggest that our assay may serve as a promising candidate for the qualitative and quantitative detection and elimination of a variety of bacterial pathogens. PMID:26417447

  18. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    NASA Astrophysics Data System (ADS)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; Zheng, Lu; Jiang, Zhanzhi; Ganesan, Vishal; Wang, Yayu; Lai, Keji

    2018-04-01

    We report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  19. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    PubMed

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-02-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type 2 subtypes could be distinguished, taking into account the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that in other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types more satisfactorily (bipolar and multipolars with few spines) we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences.

  20. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    PubMed

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    As in other areas of health research, there has been increasing use of qualitative methods to study public health problems such as injuries and injury prevention. Likewise, the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Indeed, using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  1. 3D-quantitative structure-activity relationship studies on benzothiadiazepine hydroxamates as inhibitors of tumor necrosis factor-alpha converting enzyme.

    PubMed

    Murumkar, Prashant R; Giridhar, Rajani; Yadav, Mange Ram

    2008-04-01

    A set of 29 benzothiadiazepine hydroxamates having selective tumor necrosis factor-alpha converting enzyme inhibitory activity were used to compare the quality and predictive power of 3D-quantitative structure-activity relationship, comparative molecular field analysis, and comparative molecular similarity indices models for the atom-based, centroid/atom-based, database, and docked conformer-based alignments. Removal of two outliers from the initial training set of molecules improved the predictivity of the models. Among the 3D-quantitative structure-activity relationship models developed using the above four alignments, the database alignment provided the optimal predictive comparative molecular field analysis model for the training set with cross-validated r2 (q2) = 0.510, non-cross-validated r2 = 0.972, standard error of estimates (s) = 0.098, and F = 215.44 and the optimal comparative molecular similarity indices model with cross-validated r2 (q2) = 0.556, non-cross-validated r2 = 0.946, standard error of estimates (s) = 0.163, and F = 99.785. These models also showed the best test set prediction for six compounds with predictive r2 values of 0.460 and 0.535, respectively. The contour maps obtained from the 3D-quantitative structure-activity relationship studies were appraised for activity trends for the molecules analyzed. The comparative molecular similarity indices models exhibited good external predictivity compared with that of the comparative molecular field analysis models. The data generated from the present study helped us to further design and report some novel and potent tumor necrosis factor-alpha converting enzyme inhibitors.

  2. Smartphone-Based Dual-Modality Imaging System for Quantitative Detection of Color or Fluorescent Lateral Flow Immunochromatographic Strips

    NASA Astrophysics Data System (ADS)

    Hou, Yafei; Wang, Kan; Xiao, Kun; Qin, Weijian; Lu, Wenting; Tao, Wei; Cui, Daxiang

    2017-04-01

    Nowadays, lateral flow immunochromatographic assays are increasingly popular as a diagnostic tool for point-of-care (POC) testing owing to their simplicity, specificity, and sensitivity. Hence, quantitative detection and wider practical application are urgently needed in medical examination. In this study, a smartphone-based dual-modality imaging system was developed for quantitative detection of color or fluorescent lateral flow test strips, which can be operated anywhere at any time. In this system, the optical device was designed to provide both white and ultraviolet (UV) illumination, tunable for different strips, and the Sobel operator algorithm was used in the software to enhance the ability to identify the test area from the background using boundary information. Moreover, extracting the RGB (red, green, and blue) components of color strips, or only the red component of fluorescent strips, markedly improves signal intensity and sensitivity. Fifty samples were used to evaluate the accuracy of this system, and the detection limit was calculated separately for human chorionic gonadotropin (HCG) and carcinoembryonic antigen (CEA). The results indicated that the smartphone-controlled dual-modality imaging system could provide various POC diagnoses and is a potential technology for developing the next generation of portable systems in the near future.
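
    The two image-processing ideas mentioned (locating the test-line region from Sobel edge information and quantifying it from a color channel) can be sketched as follows on a synthetic strip image; the edge threshold and the fake test line are assumptions for illustration.

        import numpy as np
        from scipy import ndimage

        img = np.zeros((60, 200, 3))
        img[:, 80:90, 0] = 0.9                      # a fake red test line

        red = img[:, :, 0]                          # red-channel signal
        edges = np.hypot(ndimage.sobel(red, axis=0), ndimage.sobel(red, axis=1))

        cols = np.where(edges.max(axis=0) > 0.5)[0]           # columns with strong edges
        signal = red[:, cols.min():cols.max() + 1].mean()     # mean intensity in the test area
        print(cols.min(), cols.max(), round(signal, 2))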

  3. Quantitative Outline-based Shape Analysis and Classification of Planetary Craterforms using Supervised Learning Models

    NASA Astrophysics Data System (ADS)

    Slezak, Thomas Joseph; Radebaugh, Jani; Christiansen, Eric

    2017-10-01

    The shapes of craterforms on planetary surfaces provide rich information about their origins and evolution. While morphologic information provides rich visual clues to geologic processes and properties, this information is less easily communicated quantitatively. This study examines the morphology of craterforms using the quantitative outline-based shape methods of geometric morphometrics, commonly used in biology and paleontology. We examine and compare landforms on planetary surfaces using shape, a property of morphology that is invariant to translation, rotation, and size. We quantify the shapes of paterae on Io, martian calderas, terrestrial basaltic shield calderas, terrestrial ash-flow calderas, and lunar impact craters using elliptic Fourier analysis (EFA) and the Zahn and Roskies (Z-R) shape function, or tangent angle approach, to produce multivariate shape descriptors. These shape descriptors are subjected to multivariate statistical analysis including canonical variate analysis (CVA), a multiple-comparison variant of discriminant analysis, to investigate the link between craterform shape and classification. Paterae on Io are most similar in shape to terrestrial ash-flow calderas, and the shapes of terrestrial basaltic shield volcanoes are most similar to martian calderas. The shapes of lunar impact craters, including simple, transitional, and complex morphologies, are classified with a 100% rate of success in all models. Multiple CVA models effectively predict and classify different craterforms using shape-based identification and demonstrate significant potential for use in the analysis of planetary surfaces.

  4. Quantitative DLA-based compressed sensing for T1-weighted acquisitions

    NASA Astrophysics Data System (ADS)

    Svehla, Pavel; Nguyen, Khieu-Van; Li, Jing-Rebecca; Ciobanu, Luisa

    2017-08-01

    High resolution Manganese Enhanced Magnetic Resonance Imaging (MEMRI), which uses manganese as a T1 contrast agent, has great potential for functional imaging of live neuronal tissue at single neuron scale. However, reaching high resolutions often requires long acquisition times, which can lead to reduced image quality due to sample deterioration and hardware instability. Compressed Sensing (CS) techniques offer the opportunity to significantly reduce the imaging time. The purpose of this work is to test the feasibility of CS acquisitions based on Diffusion Limited Aggregation (DLA) sampling patterns for high resolution quantitative T1-weighted imaging. Fully encoded and DLA-CS T1-weighted images of Aplysia californica neural tissue were acquired on a 17.2T MRI system. The MR signal corresponding to single, identified neurons was quantified for both versions of the T1-weighted images. For a 50% undersampling, DLA-CS can accurately quantify signal intensities in T1-weighted acquisitions, with a difference of only 1.37% compared to the fully encoded data and minimal impact on image spatial resolution. In addition, we compared the conventional polynomial undersampling scheme with the DLA and showed that, for the data at hand, the latter performs better. Depending on the image signal to noise ratio, higher undersampling ratios can be used to further reduce the acquisition time in MEMRI based functional studies of living tissues.

  5. Smartphone based hand-held quantitative phase microscope using the transport of intensity equation method.

    PubMed

    Meng, Xin; Huang, Huachuan; Yan, Keding; Tian, Xiaolin; Yu, Wei; Cui, Haoyang; Kong, Yan; Xue, Liang; Liu, Cheng; Wang, Shouyu

    2016-12-20

    In order to realize high contrast imaging with portable devices for potential mobile healthcare, we demonstrate a hand-held smartphone based quantitative phase microscope using the transport of intensity equation method. With a cost-effective illumination source and compact microscope system, multi-focal images of samples can be captured by the smartphone's camera via manual focusing. Phase retrieval is performed using a self-developed Android application, which calculates sample phases from multi-plane intensities via solving the Poisson equation. We test the portable microscope using a random phase plate with known phases, and to further demonstrate its performance, a red blood cell smear, a Pap smear and monocot root and broad bean epidermis sections are also successfully imaged. Considering its advantages as an accurate, high-contrast, cost-effective and field-portable device, the smartphone based hand-held quantitative phase microscope is a promising tool which can be adopted in the future in remote healthcare and medical diagnosis.
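
    A textbook special case of the transport of intensity equation (TIE) may help make the Poisson step concrete: under a nearly uniform in-focus intensity I0, the TIE reduces to laplacian(phi) = -(k/I0)*dI/dz, which can be solved with an FFT-based Poisson solver. This is only a simplified sketch under that assumption, not the app's full multi-plane solver.

        import numpy as np

        def tie_phase(i_minus, i_plus, dz, wavelength, pixel, i0):
            """Recover phase from two defocused intensities (uniform-intensity TIE)."""
            k = 2 * np.pi / wavelength
            rhs = -k * (i_plus - i_minus) / (2 * dz) / i0     # -(k/I0) * dI/dz

            ny, nx = rhs.shape
            fy = 2 * np.pi * np.fft.fftfreq(ny, d=pixel)
            fx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel)
            k2 = fy[:, None] ** 2 + fx[None, :] ** 2
            k2[0, 0] = 1.0                                    # avoid division by zero

            phi_hat = -np.fft.fft2(rhs) / k2                  # solve laplacian(phi) = rhs
            phi_hat[0, 0] = 0.0                               # fix the arbitrary mean phase
            return np.real(np.fft.ifft2(phi_hat))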

  6. Genome-Wide Association Studies of Quantitatively Measured Skin, Hair, and Eye Pigmentation in Four European Populations

    PubMed Central

    Candille, Sophie I.; Absher, Devin M.; Beleza, Sandra; Bauchet, Marc; McEvoy, Brian; Garrison, Nanibaa’ A.; Li, Jun Z.; Myers, Richard M.; Barsh, Gregory S.; Tang, Hua; Shriver, Mark D.

    2012-01-01

    Pigmentation of the skin, hair, and eyes varies both within and between human populations. Identifying the genes and alleles underlying this variation has been the goal of many candidate gene and several genome-wide association studies (GWAS). Most GWAS for pigmentary traits to date have been based on subjective phenotypes using categorical scales. But skin, hair, and eye pigmentation vary continuously. Here, we seek to characterize quantitative variation in these traits objectively and accurately and to determine their genetic basis. Objective and quantitative measures of skin, hair, and eye color were made using reflectance or digital spectroscopy in Europeans from Ireland, Poland, Italy, and Portugal. A GWAS was conducted for the three quantitative pigmentation phenotypes in 176 women across 313,763 SNP loci, and replication of the most significant associations was attempted in a sample of 294 European men and women from the same countries. We find that the pigmentation phenotypes are highly stratified along axes of European genetic differentiation. The country of sampling explains approximately 35% of the variation in skin pigmentation, 31% of the variation in hair pigmentation, and 40% of the variation in eye pigmentation. All three quantitative phenotypes are correlated with each other. In our two-stage association study, we reproduce the association of rs1667394 at the OCA2/HERC2 locus with eye color but we do not identify new genetic determinants of skin and hair pigmentation supporting the lack of major genes affecting skin and hair color variation within Europe and suggesting that not only careful phenotyping but also larger cohorts are required to understand the genetic architecture of these complex quantitative traits. Interestingly, we also see that in each of these four populations, men are more lightly pigmented in the unexposed skin of the inner arm than women, a fact that is underappreciated and may vary across the world. PMID:23118974

  7. A quantitative study of nanoparticle skin penetration with interactive segmentation.

    PubMed

    Lee, Onseok; Lee, See Hyun; Jeong, Sang Hoon; Kim, Jaeyoung; Ryu, Hwa Jung; Oh, Chilhwan; Son, Sang Wook

    2016-10-01

    In the last decade, the application of nanotechnology techniques has expanded within diverse areas such as pharmacology, medicine, and optical science. Despite such wide-ranging possibilities for implementation into practice, the mechanisms behind nanoparticle skin absorption remain unknown. Moreover, the main mode of investigation has been qualitative analysis. Using interactive segmentation, this study suggests a method of objectively and quantitatively analyzing the mechanisms underlying the skin absorption of nanoparticles. Silica nanoparticles (SNPs) were assessed using transmission electron microscopy and applied to the human skin equivalent model. Captured fluorescence images of this model were used to evaluate degrees of skin penetration. These images underwent interactive segmentation and image processing in addition to statistical quantitative analyses of calculated image parameters including the mean, integrated density, skewness, kurtosis, and area fraction. In images from both groups, the distribution area and intensity of fluorescent silica gradually increased in proportion to time. Statistical significance was reached after 2 days in the negative charge group but only after 4 days in the positive charge group, indicating a difference in the time course of penetration. Furthermore, the quantity of silica per unit area showed a dramatic change after 6 days in the negative charge group. Although this quantitative result agrees with the qualitative assessment, it is meaningful in that it was confirmed by statistical analysis of image-processing-based quantitation. The present study suggests that the surface charge of SNPs could play an important role in the percutaneous absorption of NPs. These findings can help achieve a better understanding of the percutaneous transport of NPs. In addition, these results provide important guidance for the design of NPs for biomedical applications.
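
    For readers unfamiliar with the image parameters listed above, the short sketch below shows one way to compute them from a fluorescence image and a binary segmentation mask; the function and variable names are illustrative assumptions, not the study's actual pipeline.

    import numpy as np
    from scipy import stats

    def fluorescence_parameters(image, mask):
        """image: 2-D fluorescence intensities; mask: boolean segmentation of the silica signal."""
        pixels = image[mask].astype(float)
        return {
            "mean": pixels.mean(),
            "integrated_density": pixels.sum(),        # total signal within the segmented region
            "skewness": stats.skew(pixels),
            "kurtosis": stats.kurtosis(pixels),
            "area_fraction": mask.sum() / mask.size,   # fraction of the image occupied by signal
        }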

  8. PCR-free quantitative detection of genetically modified organism from raw materials – A novel electrochemiluminescence-based bio-barcode method

    PubMed Central

    Zhu, Debin; Tang, Yabing; Xing, Da; Chen, Wei R.

    2018-01-01

    Bio-barcode assay based on oligonucleotide-modified gold nanoparticles (Au-NPs) provides a PCR-free method for quantitative detection of nucleic acid targets. However, the current bio-barcode assay requires lengthy experimental procedures including the preparation and release of barcode DNA probes from the target-nanoparticle complex, and immobilization and hybridization of the probes for quantification. Herein, we report a novel PCR-free electrochemiluminescence (ECL)-based bio-barcode assay for the quantitative detection of genetically modified organism (GMO) from raw materials. It consists of tris-(2,2'-bipyridyl)ruthenium (TBR)-labeled barcode DNA, nucleic acid hybridization using Au-NPs and biotin-labeled probes, and selective capture of the hybridization complex by streptavidin-coated paramagnetic beads. The detection of target DNA is realized by direct measurement of ECL emission of TBR. It can quantitatively detect target nucleic acids with high speed and sensitivity. This method can be used to quantitatively detect GMO fragments from real GMO products. PMID:18386909

  9. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and there is still a need to develop new techniques. Experiments were performed on five (5) closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the color microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated using the acquired CT images. Our study shows that HUDRCT correlates well with MBF and FFR (y = 0.07245 + 0.09963x, r2 = 0.898). In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, with an Area Under the Curve (AUC) of 0.927. HUDRCT has the potential to be developed as a useful indicator for quantitative assessment of myocardial perfusion.
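
    A minimal sketch of the two statistical steps described, a linear fit of HUDRCT against the reference measurements and a ROC analysis of diagnostic performance, is given below; the FFR cut-off of 0.8 and the assumption that lower HUDRCT indicates ischemia are illustrative choices, not values taken from the paper.

    import numpy as np
    from scipy import stats
    from sklearn.metrics import roc_auc_score

    def evaluate_index(hudr, ffr, ffr_cutoff=0.8):
        """Linear fit of the CT index against FFR plus ROC analysis for ischemia detection."""
        slope, intercept, r, p, se = stats.linregress(hudr, ffr)
        ischemic = (ffr < ffr_cutoff).astype(int)     # hypothetical definition of significant ischemia
        auc = roc_auc_score(ischemic, -hudr)          # lower index assumed to indicate ischemia
        return {"slope": slope, "intercept": intercept, "r2": r**2, "auc": auc}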

  10. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm⁻² has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm⁻² within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
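
    The extraction of iodine projected thickness from a dual-energy pair can be illustrated by a per-pixel two-material decomposition; the attenuation coefficients below are placeholders and the simple log-subtraction model is a simplification of the analytical expressions used in the paper, not the authors' calibration.

    import numpy as np

    MU = np.array([[0.25, 6.0],    # assumed mass attenuation (cm^2/g) of [tissue, iodine] at the low energy
                   [0.20, 2.5]])   # assumed values at the high energy

    def iodine_projected_thickness(I_low, I_high, I0_low, I0_high):
        """Solve the 2x2 attenuation system per pixel; returns iodine projected thickness in g/cm^2."""
        logs = np.stack([-np.log(I_low / I0_low),
                         -np.log(I_high / I0_high)], axis=-1)   # shape (..., 2)
        thickness = logs @ np.linalg.inv(MU).T                  # per-pixel solution of MU @ t = logs
        return thickness[..., 1]                                # iodine component (multiply by 1000 for mg/cm^2)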

  11. The acellular matrix (ACM) for bladder tissue engineering: A quantitative magnetic resonance imaging study.

    PubMed

    Cheng, Hai-Ling Margaret; Loai, Yasir; Beaumont, Marine; Farhat, Walid A

    2010-08-01

    Bladder acellular matrices (ACMs) derived from natural tissue are gaining increasing attention for their role in tissue engineering and regeneration. Unlike conventional scaffolds based on biodegradable polymers or gels, ACMs possess native biomechanical and many acquired biologic properties. Efforts to optimize ACM-based scaffolds are ongoing and would be greatly assisted by a noninvasive means to characterize scaffold properties and monitor interaction with cells. MRI is well suited to this role, but research with MRI for scaffold characterization has been limited. This study presents initial results from quantitative MRI measurements for bladder ACM characterization and investigates the effects of incorporating hyaluronic acid, a natural biomaterial useful in tissue-engineering and regeneration. Measured MR relaxation times (T(1), T(2)) and diffusion coefficient were consistent with increased water uptake and glycosaminoglycan content observed on biochemistry in hyaluronic acid ACMs. Multicomponent MRI provided greater specificity, with diffusion data showing an acellular environment and T(2) components distinguishing the separate effects of increased glycosaminoglycans and hydration. These results suggest that quantitative MRI may provide useful information on matrix composition and structure, which is valuable in guiding further development using bladder ACMs for organ regeneration and in strategies involving the use of hyaluronic acid.

  12. Effective Heart Disease Detection Based on Quantitative Computerized Traditional Chinese Medicine Using Representation Based Classifiers.

    PubMed

    Shu, Ting; Zhang, Bob; Tang, Yuan Yan

    2017-01-01

    At present, heart disease is the number one cause of death worldwide. Traditionally, heart disease is commonly detected using blood tests, electrocardiogram, cardiac computerized tomography scan, cardiac magnetic resonance imaging, and so on. However, these traditional diagnostic methods are time consuming and/or invasive. In this paper, we propose an effective noninvasive computerized method based on facial images to quantitatively detect heart disease. Specifically, facial key block color features are extracted from facial images and analyzed using the Probabilistic Collaborative Representation Based Classifier. The idea of facial key block color analysis is founded in Traditional Chinese Medicine. A new dataset consisting of 581 heart disease and 581 healthy samples was experimented by the proposed method. In order to optimize the Probabilistic Collaborative Representation Based Classifier, an analysis of its parameters was performed. According to the experimental results, the proposed method obtains the highest accuracy compared with other classifiers and is proven to be effective at heart disease detection.
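
    As an aid to understanding representation based classification, the sketch below implements a plain (non-probabilistic) collaborative representation classifier, a simplified stand-in for the Probabilistic Collaborative Representation Based Classifier used in the paper; the regularization value and variable names are assumptions.

    import numpy as np

    def crc_classify(X_train, y_train, x_test, lam=1e-3):
        """Represent x_test over all training samples with ridge regularization,
        then assign the class whose samples best reconstruct it."""
        D = X_train.T                                            # dictionary: columns are training samples
        alpha = np.linalg.solve(D.T @ D + lam * np.eye(D.shape[1]), D.T @ x_test)
        classes = np.unique(y_train)
        residuals = []
        for c in classes:
            idx = (y_train == c)
            recon = D[:, idx] @ alpha[idx]                       # class-specific reconstruction
            residuals.append(np.linalg.norm(x_test - recon))
        return classes[int(np.argmin(residuals))]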

  13. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  14. Quantitative measurements of nanoscale permittivity and conductivity using tuning-fork-based microwave impedance microscopy

    DOE PAGES

    Wu, Xiaoyu; Hao, Zhenqi; Wu, Di; ...

    2018-04-01

    Here, we report quantitative measurements of nanoscale permittivity and conductivity using tuning-fork (TF) based microwave impedance microscopy (MIM). The system is operated under the driving amplitude modulation mode, which ensures satisfactory feedback stability on samples with rough surfaces. The demodulated MIM signals on a series of bulk dielectrics are in good agreement with results simulated by finite-element analysis. Using the TF-MIM, we have visualized the evolution of nanoscale conductance on back-gated MoS2 field effect transistors, and the results are consistent with the transport data. Our work suggests that quantitative analysis of mesoscopic electrical properties can be achieved by near-field microwave imaging with small distance modulation.

  15. Looking at the Brains behind Figurative Language--A Quantitative Meta-Analysis of Neuroimaging Studies on Metaphor, Idiom, and Irony Processing

    ERIC Educational Resources Information Center

    Bohrn, Isabel C.; Altmann, Ulrike; Jacobs, Arthur M.

    2012-01-01

    A quantitative, coordinate-based meta-analysis combined data from 354 participants across 22 fMRI studies and one positron emission tomography (PET) study to identify the differences in neural correlates of figurative and literal language processing, and to investigate the role of the right hemisphere (RH) in figurative language processing.…

  16. Generating Linear Equations Based on Quantitative Reasoning

    ERIC Educational Resources Information Center

    Lee, Mi Yeon

    2017-01-01

    The Common Core's Standards for Mathematical Practice encourage teachers to develop their students' ability to reason abstractly and quantitatively by helping students make sense of quantities and their relationships within problem situations. The seventh-grade content standards include objectives pertaining to developing linear equations in…

  17. [Self-perception of health care team leaders in Andalusia. A quantitative and qualitative study].

    PubMed

    García-Romera, I; Danet, A; March-Cerdà, J C

    To determine the perception and self-assessment of leadership among health care team leaders in Andalusia. Design: Exploratory descriptive study using quantitative and qualitative methodology, developed between 2013 and 2015, using a questionnaire and semi-structured interviews. Setting: Andalusia. All health managers from the Primary Care Management Units and Health Management Areas of the Departments of Paediatrics, Emergency and Internal Medicine were included in the quantitative study. A purposive sample of 24 health managers was used for the qualitative study. Descriptive statistical study and bivariate analysis of comparison of means. Content analysis of the semi-structured interviews: codification, category tree, and triangulation of results. The best self-assessment dimension relates to support, and the worst to considering oneself a 'good leader'. The definition of a 'good leader' includes: honesty, trust, and attitudes of good communication, closeness, appreciation, and reinforcement of the health team members. Different leadership styles were perceived. The main difficulties for leadership are related to the economic crisis and the management of personal conflicts. Health managers describe an adaptive leadership style, based on personal and professional support, and using communication as the main cohesive element for the team project. More studies on leaders' perspectives are important, in order to better understand their experiences, needs and expectations. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  18. Prognostic Value of Quantitative Stress Perfusion Cardiac Magnetic Resonance.

    PubMed

    Sammut, Eva C; Villa, Adriana D M; Di Giovine, Gabriella; Dancy, Luke; Bosio, Filippo; Gibbs, Thomas; Jeyabraba, Swarna; Schwenke, Susanne; Williams, Steven E; Marber, Michael; Alfakih, Khaled; Ismail, Tevfik F; Razavi, Reza; Chiribiri, Amedeo

    2018-05-01

    This study sought to evaluate the prognostic usefulness of visual and quantitative perfusion cardiac magnetic resonance (CMR) ischemic burden in an unselected group of patients and to assess the validity of consensus-based ischemic burden thresholds extrapolated from nuclear studies. There are limited data on the prognostic value of assessing myocardial ischemic burden by CMR, and there are none using quantitative perfusion analysis. Patients with suspected coronary artery disease referred for adenosine-stress perfusion CMR were included (n = 395; 70% male; age 58 ± 13 years). The primary endpoint was a composite of cardiovascular death, nonfatal myocardial infarction, aborted sudden death, and revascularization after 90 days. Perfusion scans were assessed visually and with quantitative analysis. Cross-validated Cox regression analysis and net reclassification improvement were used to assess the incremental prognostic value of visual or quantitative perfusion analysis over a baseline clinical model, initially as continuous covariates, then using accepted thresholds of ≥2 segments or ≥10% myocardium. After a median 460 days (interquartile range: 190 to 869 days) follow-up, 52 patients reached the primary endpoint. At 2 years, the addition of ischemic burden was found to increase prognostic value over a baseline model of age, sex, and late gadolinium enhancement (baseline model area under the curve [AUC]: 0.75; visual AUC: 0.84; quantitative AUC: 0.85). Dichotomized quantitative ischemic burden performed better than visual assessment (net reclassification improvement 0.043 vs. 0.003 against baseline model). This study was the first to address the prognostic benefit of quantitative analysis of perfusion CMR and to support the use of consensus-based ischemic burden thresholds by perfusion CMR for prognostic evaluation of patients with suspected coronary artery disease. Quantitative analysis provided incremental prognostic value to visual assessment and

  19. Digital micromirror device-based common-path quantitative phase imaging.

    PubMed

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T C

    2017-04-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after the sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the "off" state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high sensitivity QPI, which has a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for broader adoption.

  20. Quantitative Assessment of Parkinsonian Tremor Based on an Inertial Measurement Unit

    PubMed Central

    Dai, Houde; Zhang, Pengyue; Lueth, Tim C.

    2015-01-01

    Quantitative assessment of parkinsonian tremor based on inertial sensors can provide reliable feedback on the effect of medication. In this regard, the features of parkinsonian tremor and its unique properties such as motor fluctuations and dyskinesia are taken into account. Least-square-estimation models are used to assess the severities of rest, postural, and action tremors. In addition, a time-frequency signal analysis algorithm for tremor state detection was also included in the tremor assessment method. This inertial sensor-based method was verified through comparison with an electromagnetic motion tracking system. Seven Parkinson’s disease (PD) patients were tested using this tremor assessment system. The measured tremor amplitudes correlated well with the judgments of a neurologist (r = 0.98). The systematic analysis of sensor-based tremor quantification and the corresponding experiments could be of great help in monitoring the severity of parkinsonian tremor. PMID:26426020
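
    A deliberately simplified illustration of inertial-sensor tremor quantification is sketched below: band-pass filtering the angular-rate signal to a typical parkinsonian tremor band and taking the RMS amplitude. The band limits, sampling rate and filter order are assumptions and do not reproduce the least-square-estimation models or time-frequency algorithm of the paper.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def tremor_rms(gyro, fs=100.0, band=(3.0, 7.0)):
        """gyro: 1-D angular rate signal; returns the RMS amplitude in the tremor band."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        tremor = filtfilt(b, a, gyro)                 # zero-phase band-pass filtering
        return np.sqrt(np.mean(tremor ** 2))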

  1. Cytoarchitectonic and quantitative Golgi study of the hedgehog supraoptic nucleus.

    PubMed Central

    Caminero, A A; Machín, C; Sanchez-Toscano, F

    1992-01-01

    A cytoarchitectural study was made of the supraoptic nucleus (SON) of the hedgehog with special attention to the quantitative comparison of its main neuronal types. The main purposes were (1) to relate the characteristics of this nucleus in the hedgehog (a primitive mammalian insectivorous brain) with those in the SONs of more evolutionarily advanced species; (2) to identify quantitatively the dendritic fields of the main neuronal types in the hedgehog SON and to study their synaptic connectivity. From a descriptive standpoint, 3 neuronal types were found with respect to the number of dendritic stems arising from the neuronal soma: bipolar neurons (48%), multipolar neurons (45.5%) and monopolar neurons (6.5%). Within the multipolar type 2 subtypes could be distinguished, taking into account the number of dendritic spines: (a) with few spines (93%) and (b) very spiny (7%). These results indicate that the hedgehog SON is similar to that in other species except for the very spiny neurons, the significance of which is discussed. In order to characterise the main types more satisfactorily (bipolar and multipolars with few spines) we undertook a quantitative Golgi study of their dendritic fields. Although the patterns of the dendritic field are similar in both neuronal types, the differences in the location of their connectivity can reflect functional changes and alterations in relation to the synaptic afferences. PMID:1452481

  2. Aquatic toxicity of acrylates and methacrylates: quantitative structure-activity relationships based on Kow and LC50

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinert, K.H.

    1987-12-01

    Recent EPA scrutiny of acrylate and methacrylate monomers has resulted in restrictive consent orders and Significant New Use Rules under the Toxic Substances Control Act, based on structure-activity relationships using mouse skin painting studies. The concern is centered on human health issues regarding worker and consumer exposure. Environmental issues, such as aquatic toxicity, are still of concern. Understanding the relationships and environmental risks to aquatic organisms may improve the understanding of the potential risks to human health. This study evaluates the quantitative structure-activity relationships from measured log Kow's and log LC50's for Pimephales promelas (fathead minnow) and Carassius auratus (goldfish). Scientific support of the current regulations is also addressed. Two monomer classes were designated: acrylates and methacrylates. Spearman rank correlation and linear regression were run. Based on this study, an ecotoxicological difference exists between acrylates and methacrylates. Regulatory activities and scientific study should reflect this difference.
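
    The two statistics used in the study, Spearman rank correlation and a linear regression of log LC50 on log Kow, can be computed as in the sketch below; the numerical values are hypothetical placeholders, not measured data from the paper.

    import numpy as np
    from scipy import stats

    log_kow  = np.array([0.3, 0.8, 1.2, 1.9, 2.5])     # hypothetical measured log Kow values
    log_lc50 = np.array([1.7, 1.3, 1.0, 0.6, 0.1])     # hypothetical fathead minnow log LC50 values

    rho, p_rank = stats.spearmanr(log_kow, log_lc50)
    slope, intercept, r, p_fit, se = stats.linregress(log_kow, log_lc50)
    print(f"Spearman rho = {rho:.2f}; log LC50 = {intercept:.2f} + {slope:.2f} * log Kow (r^2 = {r**2:.2f})")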

  3. Tissue microarrays and quantitative tissue-based image analysis as a tool for oncology biomarker and diagnostic development.

    PubMed

    Dolled-Filhart, Marisa P; Gustavson, Mark D

    2012-11-01

    Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.

  4. Quantitative evaluation of analyte transport on microfluidic paper-based analytical devices (μPADs).

    PubMed

    Ota, Riki; Yamada, Kentaro; Suzuki, Koji; Citterio, Daniel

    2018-02-07

    The transport efficiency during capillary flow-driven sample transport on microfluidic paper-based analytical devices (μPADs) made from filter paper has been investigated for a selection of model analytes (Ni²⁺, Zn²⁺, Cu²⁺, PO₄³⁻, bovine serum albumin, sulforhodamine B, amaranth) representing metal cations, complex anions, proteins and anionic molecules. For the first time, the transport of the analytical target compounds, rather than the sample liquid, has been quantitatively evaluated by means of colorimetry and absorption spectrometry-based methods. The experiments have revealed that small paperfluidic channel dimensions, additional user operation steps (e.g. control of sample volume, sample dilution, washing step) as well as the introduction of sample liquid wicking areas allow analyte transport efficiency to be increased. It is also shown that the interaction of analytes with the negatively charged cellulosic paper substrate surface is strongly influenced by the physico-chemical properties of the model analyte and can in some cases (Cu²⁺) result in nearly complete analyte depletion during sample transport. The quantitative information gained through these experiments is expected to contribute to the development of more sensitive μPADs.

  5. Quantitative study on spatio-temporal change of urban landscape pattern based on RS/GIS: a case of Xi'an metropolitan area in China

    NASA Astrophysics Data System (ADS)

    Chen, Meiwu; Zong, Yueguang; Ma, Qiang; Li, Jian

    2007-06-01

    The study of landscape pattern is an important field of urban land use and ecological change. Since the 1990s, the widely accepted Patch-Corridor-Matrix model has generally been used for qualitative description of landscape patterns. In recent years, quantitative evaluation of urban landscape dynamics has become a hot research topic. After a critical review of existing research methods for landscape pattern, a new RS/GIS-based approach is put forward in this paper, comprising three steps, "General pattern characteristics - Gradient differentiation feature - Directional signature of the landscape", which we call GGD. The method is applied to a case study of the Xi'an metropolitan area in China. The result shows that the method is effective for quantitative study of the urban landscape. The preparation for the GGD method is to set up a research platform based on RS and GIS. Using Geographical Information System software (ArcGIS 9.0 and Erdas), the authors interpreted remote sensing images from different years and classified the landscape types of the research region. By calculating various landscape-level indices with the software Fragstats 3.3 as an assisting tool and adopting the three GGD steps combined with landscape indices, this paper assesses the landscape spatial pattern of the urban area: 1) General pattern characteristics analysis obtains the transition probabilities of the various landscape types through a Markov chain and predicts landscape transformation by introducing a CA model; the analysis emphasizes the total landscape structure and its change over time; 2) Gradient characteristic analysis, which creates gradient zones outward from the city center at fixed distances and contrastively analyzes the landscape indices of each subarea, stresses the spatial character of the landscape pattern, verifies urban morphology theories and provides a quantitative basis for the establishment of urban form. Therefore, the analysis is useful for supervising urban
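
    The "General pattern characteristics" step can be illustrated with a small sketch that estimates a Markov transition probability matrix from two co-registered classified rasters and projects class areas forward; the class coding, array names and projection call are assumptions for illustration, not the GGD implementation.

    import numpy as np

    def transition_matrix(raster_t1, raster_t2, n_classes):
        """Rows: class at time 1; columns: class at time 2; classes assumed coded 0..n_classes-1."""
        counts = np.zeros((n_classes, n_classes))
        for a, b in zip(raster_t1.ravel(), raster_t2.ravel()):
            counts[int(a), int(b)] += 1
        row_sums = counts.sum(axis=1, keepdims=True)
        row_sums[row_sums == 0] = 1.0                  # avoid division by zero for absent classes
        return counts / row_sums

    def project(area_fractions, P, steps=1):
        """Simple Markov-chain forecast of class area fractions after the given number of steps."""
        return area_fractions @ np.linalg.matrix_power(P, steps)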

  6. Development of a Postcolumn Infused-Internal Standard Liquid Chromatography Mass Spectrometry Method for Quantitative Metabolomics Studies.

    PubMed

    Liao, Hsiao-Wei; Chen, Guan-Yuan; Wu, Ming-Shiang; Liao, Wei-Chih; Lin, Ching-Hung; Kuo, Ching-Hua

    2017-02-03

    Quantitative metabolomics has become much more important in clinical research in recent years. Individual differences in matrix effects (MEs) and the injection order effect are two major factors that reduce the quantification accuracy in liquid chromatography-electrospray ionization-mass spectrometry-based (LC-ESI-MS) metabolomics studies. This study proposed a postcolumn infused-internal standard (PCI-IS) combined with a matrix normalization factor (MNF) strategy to improve the analytical accuracy of quantitative metabolomics. The PCI-IS combined with the MNF method was applied for a targeted metabolomics study of amino acids (AAs). D8-Phenylalanine was used as the PCI-IS, and it was postcolumn-infused into the ESI interface for calibration purposes. The MNF was used to bridge the AA response in a standard solution with the plasma samples. The MEs caused signal changes that were corrected by dividing the AA signal intensities by the PCI-IS intensities after adjustment with the MNF. After the method validation, we evaluated the method applicability for breast cancer research using 100 plasma samples. The quantification results revealed that the 11 tested AAs exhibit an accuracy between 88.2 and 110.7%. The principal component analysis score plot revealed that the injection order effect can be successfully removed, and most of the within-group variation of the tested AAs decreased after the PCI-IS correction. Finally, targeted metabolomics studies on the AAs showed that tryptophan was expressed more in malignant patients than in the benign group. We anticipate that a similar approach can be applied to other endogenous metabolites to facilitate quantitative metabolomics studies.
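
    The correction described above, dividing each amino acid signal by the PCI-IS intensity after adjustment with the matrix normalization factor, can be written compactly as below; the variable names and example numbers are assumptions for illustration, not the authors' code or data.

    import numpy as np

    def pci_is_correct(aa_intensity, pci_is_intensity, mnf):
        """Correct analyte responses for matrix effects and injection-order drift."""
        return aa_intensity / (pci_is_intensity * mnf)

    # example: one analyte measured across an injection sequence (hypothetical values)
    aa  = np.array([1.00e6, 0.92e6, 1.10e6])    # raw amino acid peak areas
    std = np.array([5.0e5, 4.6e5, 5.6e5])       # PCI-IS intensities at the same retention time
    print(pci_is_correct(aa, std, mnf=1.05))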

  7. GIS based quantitative morphometric analysis and its consequences: a case study from Shanur River Basin, Maharashtra India

    NASA Astrophysics Data System (ADS)

    Pande, Chaitanya B.; Moharir, Kanak

    2017-05-01

    A morphometric analysis of the Shanur basin has been carried out using geoprocessing techniques in GIS. These techniques are found relevant for the extraction of the river basin and its drainage networks. The extracted drainage network was classified according to Strahler's system of classification, and it reveals that the terrain exhibits a dendritic to sub-dendritic drainage pattern. Hence, from the study it is concluded that remote sensing data (SRTM-DEM data of 30 m resolution) coupled with geoprocessing techniques prove to be a competent tool for morphometric analysis and evaluation of the linear, slope, areal and relief aspects of morphometric parameters. The combined outcomes have established the topographical and even recent developmental situations in the basin, which will also change the setup of the region. It is therefore necessary to analyze high-level parameters of drainage and environment for suitable planning and management of water resource and land resource development plans. The Shanur drainage basin is sprawled over an area of 281.33 km2. The slope of the basin varies from 1 to 10%, and the slope variation is chiefly controlled by the local geology and erosion cycles. The main stream length ratio of the basin is 14.92, indicating that the study area is elongated with moderate relief and steep slopes. The morphometric parameters of the stream have been analyzed and calculated by applying standard methods and techniques, viz. Horton (Trans Am Geophys Union 13:350-361, 1945), Miller (A quantitative geomorphologic study of drainage basin characteristics in the Clinch Mountain area, Virginia and Tennessee, Columbia University, Department of Geology, Technical Report No. 3, Contract N6 ONR 271-300, 1953), and Strahler (Handbook of applied hydrology, McGraw Hill Book Company, New York, 1964). GIS-based analysis of all morphometric parameters shows that the erosional development of the area by the streams has progressed well beyond maturity and lithology is

  8. An IBM PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  9. Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research

    PubMed Central

    SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN

    2015-01-01

    Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073

  10. Single and two-shot quantitative phase imaging using Hilbert-Huang Transform based fringe pattern analysis

    NASA Astrophysics Data System (ADS)

    Trusiak, Maciej; Micó, Vicente; Patorski, Krzysztof; García-Monreal, Javier; Sluzewski, Lukasz; Ferreira, Carlos

    2016-08-01

    In this contribution we propose two Hilbert-Huang Transform based algorithms for fast and accurate single-shot and two-shot quantitative phase imaging applicable in both on-axis and off-axis configurations. In the first scheme, a single fringe pattern containing information about the biological phase-sample under study is adaptively pre-filtered using an empirical mode decomposition based approach. It is then phase demodulated by the Hilbert Spiral Transform aided by Principal Component Analysis for local fringe orientation estimation. Orientation calculation enables efficient analysis of closed fringes and can be avoided by using an arbitrarily phase-shifted two-shot Gram-Schmidt Orthonormalization scheme aided by Hilbert-Huang Transform pre-filtering. This two-shot approach is a trade-off between single-frame and temporal phase shifting demodulation. Robustness of the proposed techniques is corroborated using experimental digital holographic microscopy studies of polystyrene micro-beads and red blood cells. Both algorithms compare favorably with the temporal phase shifting scheme, which is used as a reference method.
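
    A deliberately simplified one-dimensional illustration of Hilbert-transform fringe demodulation is given below; the actual method is two-dimensional (empirical mode decomposition pre-filtering, Hilbert Spiral Transform and PCA-based orientation estimation), so this sketch with a synthetic carrier only conveys the basic idea.

    import numpy as np
    from scipy.signal import hilbert

    x = np.linspace(0, 1, 1024)
    carrier = 2 * np.pi * 40 * x                          # assumed off-axis carrier fringes
    phase_true = 3.0 * np.exp(-((x - 0.5) / 0.1) ** 2)    # synthetic sample phase
    fringe = np.cos(carrier + phase_true)                  # fringe signal with background already removed

    analytic = hilbert(fringe)                             # analytic signal via the Hilbert transform
    phase = np.unwrap(np.angle(analytic)) - carrier        # subtract the known carrier
    phase -= phase[0]                                       # remove constant offset; approximately recovers phase_true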

  11. Accuracy and Precision of Silicon Based Impression Media for Quantitative Areal Texture Analysis

    PubMed Central

    Goodall, Robert H.; Darras, Laurent P.; Purnell, Mark A.

    2015-01-01

    Areal surface texture analysis is becoming widespread across a diverse range of applications, from engineering to ecology. In many studies silicon based impression media are used to replicate surfaces, and the fidelity of replication defines the quality of data collected. However, while different investigators have used different impression media, the fidelity of surface replication has not been subjected to quantitative analysis based on areal texture data. Here we present the results of an analysis of the accuracy and precision with which different silicon based impression media of varying composition and viscosity replicate rough and smooth surfaces. Both accuracy and precision vary greatly between different media. High viscosity media tested show very low accuracy and precision, and most other compounds showed either the same pattern, or low accuracy and high precision, or low precision and high accuracy. Of the media tested, mid viscosity President Jet Regular Body and low viscosity President Jet Light Body (Coltène Whaledent) are the only compounds to show high levels of accuracy and precision on both surface types. Our results show that data acquired from different impression media are not comparable, supporting calls for greater standardisation of methods in areal texture analysis. PMID:25991505

  12. Designing automation for human use: empirical studies and quantitative models.

    PubMed

    Parasuraman, R

    2000-07-01

    An emerging knowledge base of human performance research can provide guidelines for designing automation that can be used effectively by human operators of complex systems. Which functions should be automated and to what extent in a given system? A model for types and levels of automation that provides a framework and an objective basis for making such choices is described. The human performance consequences of particular types and levels of automation constitute primary evaluative criteria for automation design when using the model. Four human performance areas are considered--mental workload, situation awareness, complacency and skill degradation. Secondary evaluative criteria include such factors as automation reliability, the risks of decision/action consequences and the ease of systems integration. In addition to this qualitative approach, quantitative models can inform design. Several computational and formal models of human interaction with automation that have been proposed by various researchers are reviewed. An important future research need is the integration of qualitative and quantitative approaches. Application of these models provides an objective basis for designing automation for effective human use.

  13. Beyond Epilepsy: How Can Quantitative Electroencephalography Improve Conventional Electroencephalography Findings? A Systematic Review of Comparative EEG Studies.

    PubMed

    Martins, Cassio Henrique Taques; Assunção, Catarina De Marchi

    2018-01-01

    Knowing the frequency composition of brain electrical activity is a fundamental element in both research and clinical applications of electroencephalography. The quantitative analysis of brain electrical activity uses computer resources to evaluate the electroencephalography and allows quantification of the data. The contribution of the quantitative perspective is unique, since conventional electroencephalography, based on visual examination of the tracing, is not as objective. A systematic review was performed on the MEDLINE database in October 2017. The authors independently analyzed the studies, by title and abstract, and selected articles that met the inclusion criteria: comparative studies, not older than 30 years, that compared the use of the conventional electroencephalogram (EEG) with the use of the quantitative electroencephalogram (QEEG), in the English language. One hundred twelve articles were automatically selected by the MEDLINE search engine, but only six met the above criteria. The review found that, at a 95% confidence interval, QEEG had no statistically higher sensitivity than EEG in four of the six studies reviewed. However, these results must be viewed with appropriate caution, particularly as groups between studies were not matched on important variables such as gender, age, type of illness, recovery stage, and treatment. The authors' findings in this systematic review are suggestive of the importance of QEEG as an auxiliary tool to traditional EEG and, as such, justify further refinement, standardization, and eventually the future execution of a head-to-head prospective study comparing the two methods.

  14. Art or Science? An Evidence-Based Approach to Human Facial Beauty a Quantitative Analysis Towards an Informed Clinical Aesthetic Practice.

    PubMed

    Harrar, Harpal; Myers, Simon; Ghanem, Ali M

    2018-02-01

    Patients often seek guidance from aesthetic practitioners regarding treatments to enhance their 'beauty'. Is there a science behind the art of assessment, and if so, is it measurable? Through the centuries, this question has challenged scholars, artists and surgeons. This study aims to undertake a review of the evidence behind quantitative facial measurements in assessing beauty, to help the practitioner in everyday aesthetic practice. A Medline and Embase search for beauty, facial features and quantitative analysis was undertaken. Inclusion criteria were studies on adults, and exclusions included studies undertaken for dental, cleft lip, oncology, burns or reconstructive surgeries. The abstracts and papers were appraised, and further studies that were considered inappropriate were excluded. The data were extracted using a standardised table. The final dataset was appraised in accordance with the PRISMA checklist and Holland and Rees' critique tools. Of the 1253 studies screened, 1139 were excluded from abstracts and a further 70 excluded from full text articles. The remaining 44 were assessed qualitatively and quantitatively. It became evident that the datasets were not comparable. Nevertheless, common themes were obvious, and these were summarised. Despite measures of the beauty of individual components and of the sum of all the parts, such as symmetry and the golden ratio, we are still far from establishing what truly constitutes quantitative beauty. Perhaps beauty is truly in the 'eyes of the beholder' (and perhaps in the eyes of the subject too). This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors at www.springer.com/00266.

  15. [Quantitative Analysis of Heavy Metals in Water with LIBS Based on Signal-to-Background Ratio].

    PubMed

    Hu, Li; Zhao, Nan-jing; Liu, Wen-qing; Fang, Li; Zhang, Da-hai; Wang, Yin; Meng, De Shuo; Yu, Yang; Ma, Ming-jun

    2015-07-01

    There are many factors that influence the precision and accuracy of quantitative analysis with LIBS technology. In-depth analysis shows that the background spectrum and the characteristic spectrum follow approximately the same trend as the temperature changes, so signal-to-background ratio (S/B) measurement and regression analysis can compensate for the spectral line intensity changes caused by system parameters such as laser power and spectral receiving efficiency. Because the measurement data were limited and nonlinear, we used the support vector machine (SVM) regression algorithm. The experimental results showed that the method could improve the stability and the accuracy of quantitative analysis by LIBS; the relative standard deviation and average relative error of the test set were 4.7% and 9.5%, respectively. The data fitting method based on the signal-to-background ratio (S/B) is less susceptible to matrix elements, the background spectrum, etc., and provides a data processing reference for real-time online quantitative LIBS analysis technology.
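
    A minimal sketch of the described regression, using support vector regression to map the signal-to-background ratio of a characteristic line to element concentration, is shown below; the training values and hyperparameters are placeholders, not data or settings from the paper.

    import numpy as np
    from sklearn.svm import SVR

    s_b_ratio = np.array([[0.8], [1.5], [2.3], [3.1], [4.0]])   # hypothetical S/B values
    conc_mg_l = np.array([5.0, 10.0, 20.0, 30.0, 40.0])          # hypothetical concentrations (mg/L)

    model = SVR(kernel="rbf", C=10.0, epsilon=0.5).fit(s_b_ratio, conc_mg_l)
    print(model.predict([[2.7]]))    # predicted concentration for a new S/B measurement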

  16. Quantitative Assessment of a Field-Based Course on Integrative Geology, Ecology and Cultural History

    ERIC Educational Resources Information Center

    Sheppard, Paul R.; Donaldson, Brad A.; Huckleberry, Gary

    2010-01-01

    A field-based course at the University of Arizona called Sense of Place (SOP) covers the geology, ecology and cultural history of the Tucson area. SOP was quantitatively assessed for pedagogical effectiveness. Students of the Spring 2008 course were given pre- and post-course word association surveys in order to assess awareness and comprehension…

  17. 'Stories' or 'snapshots'? A study directed at comparing qualitative and quantitative approaches to curriculum evaluation.

    PubMed

    Pateman, B; Jinks, A M

    1999-01-01

    The focus of this paper is a study designed to explore the validity of quantitative approaches of student evaluation in a pre-registration degree programme. As managers of the students' education we were concerned that the quantitative method, which used lecturer criteria, may not fully represent students' views. The approach taken is that of a process-type strategy for curriculum evaluation as described by Parlett and Hamilton (1972). The aim of the study is to produce illuminative data, or students' 'stories' of their educational experiences through use of semi-structured interviews. The results are then compared to the current quantitative measurement tools designed to obtain 'snapshots' of the educational effectiveness of the curriculum. The quantitative measurement tools use Likert scale measurements of teacher-devised criterion statements. The results of the study give a rich source of qualitative data which can be used to inform future curriculum development. However, complete validation of the current quantitative instruments used was not achieved in this study. Student and teacher agendas in respect of important issues pertaining to the course programme were found to differ. Limitations of the study are given. There is discussion of the options open to the management team with regard to future development of curriculum evaluation systems.

  18. The emerging science of quantitative imaging biomarkers terminology and definitions for scientific studies and regulatory submissions.

    PubMed

    Kessler, Larry G; Barnhart, Huiman X; Buckler, Andrew J; Choudhury, Kingshuk Roy; Kondratovich, Marina V; Toledano, Alicia; Guimaraes, Alexander R; Filice, Ross; Zhang, Zheng; Sullivan, Daniel C

    2015-02-01

    The development and implementation of quantitative imaging biomarkers has been hampered by the inconsistent and often incorrect use of terminology related to these markers. Sponsored by the Radiological Society of North America, an interdisciplinary group of radiologists, statisticians, physicists, and other researchers worked to develop a comprehensive terminology to serve as a foundation for quantitative imaging biomarker claims. Where possible, this working group adapted existing definitions derived from national or international standards bodies rather than invent new definitions for these terms. This terminology also serves as a foundation for the design of studies that evaluate the technical performance of quantitative imaging biomarkers and for studies of algorithms that generate the quantitative imaging biomarkers from clinical scans. This paper provides examples of research studies and quantitative imaging biomarker claims that use terminology consistent with these definitions as well as examples of the rampant confusion in this emerging field. We provide recommendations for appropriate use of quantitative imaging biomarker terminological concepts. It is hoped that this document will assist researchers and regulatory reviewers who examine quantitative imaging biomarkers and will also inform regulatory guidance. More consistent and correct use of terminology could advance regulatory science, improve clinical research, and provide better care for patients who undergo imaging studies. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  19. Recidivism, Disciplinary History, and Institutional Adjustment: A Quantitative Study Examining Correctional Education Programs

    ERIC Educational Resources Information Center

    Flamer, Eric, Sr.

    2012-01-01

    Establishing college-degree programs for prison inmates is an evidence-based effective instructional strategy in reducing recidivism. Evaluating academic arenas as a resource to improve behavior and levels of functioning within correctional facilities is a necessary component of inmate academic programs. The purpose of this quantitative,…

  20. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network was employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network was used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
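
    A hedged sketch of the weighting stage is given below: a feed-forward network is trained on the pre-processed criteria and relative criterion weights are then derived from the trained model. Because the abstract does not state the exact weight-extraction rule, permutation importance is used here as a stand-in, and the dataset is synthetic.

    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.inspection import permutation_importance

    rng = np.random.default_rng(0)
    X = rng.random((500, 34))                     # 34 pre-processed criteria (synthetic)
    y = (0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * rng.random(500) > 0.5).astype(int)   # suitable / not suitable

    net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0).fit(X, y)
    imp = permutation_importance(net, X, y, n_repeats=10, random_state=0).importances_mean
    weights = np.clip(imp, 0, None)
    weights /= weights.sum()                      # normalized criterion weights for the MCDA overlay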

  1. Multiplexed and Microparticle-based Analyses: Quantitative Tools for the Large-Scale Analysis of Biological Systems

    PubMed Central

    Nolan, John P.; Mandy, Francis

    2008-01-01

    While the term flow cytometry refers to the measurement of cells, making sensitive multiparameter optical measurements in a flowing sample stream is a very general analytical approach. The past few years have seen an explosion in the application of flow cytometry technology for molecular analysis and measurements using micro-particles as solid supports. While microsphere-based molecular analyses using flow cytometry date back three decades, the need for highly parallel quantitative molecular measurements that has arisen from various genomic and proteomic advances has driven the development of particle encoding technology to enable highly multiplexed assays. Multiplexed particle-based immunoassays are now commonplace, and new assays are being developed to study genes, protein function, and molecular assembly. Numerous efforts are underway to extend the multiplexing capabilities of microparticle-based assays through new approaches to particle encoding and analyte reporting. The impact of these developments will be seen in the basic research and clinical laboratories, as well as in drug development. PMID:16604537

  2. The Functional Resonance Analysis Method for a systemic risk based environmental auditing in a sinter plant: A semi-quantitative approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patriarca, Riccardo, E-mail: riccardo.patriarca@uniroma1.it; Di Gravio, Giulio; Costantino, Francesco

    Environmental auditing is a main issue for any production plant and assessing environmental performance is crucial to identify risk factors. The complexity of current plants arises from interactions among technological, human and organizational system components, which are often transient and not easily detectable. The auditing thus requires a systemic perspective, rather than focusing on individual behaviors, as emerged in recent research in the safety domain for socio-technical systems. We explore the significance of modeling the interactions of system components in everyday work, by the application of a recent systemic method, i.e. the Functional Resonance Analysis Method (FRAM), in order to define dynamically the system structure. We present also an innovative evolution of traditional FRAM following a semi-quantitative approach based on Monte Carlo simulation. This paper represents the first contribution related to the application of FRAM in the environmental context, moreover considering a consistent evolution based on Monte Carlo simulation. The case study of an environmental risk auditing in a sinter plant validates the research, showing the benefits in terms of identifying potential critical activities, related mitigating actions and comprehensive environmental monitoring indicators. - Highlights: • We discuss the relevance of a systemic risk based environmental audit. • We present FRAM to represent functional interactions of the system. • We develop a semi-quantitative FRAM framework to assess environmental risks. • We apply the semi-quantitative FRAM framework to build a model for a sinter plant.

  3. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry-especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.

  4. Renilla luciferase-based quantitation of Potato virus A infection initiated with Agrobacterium infiltration of N. benthamiana leaves.

    PubMed

    Eskelin, K; Suntio, T; Hyvärinen, S; Hafren, A; Mäkinen, K

    2010-03-01

    A quantitation method based on the sensitive detection of Renilla luciferase (Rluc) activity was developed and optimized for Potato virus A (PVA; genus Potyviridae) gene expression. This system is based on infections initiated by Agrobacterium infiltration and subsequent detection of the translation of PVA::Rluc RNA, which is enhanced by viral replication, first within the cells infected initially and later by translation and replication within new cells after spread of the virus. Firefly luciferase (Fluc) was used as an internal control to normalize the Rluc activity. An approximately 10-fold difference in the Rluc/Fluc activity ratio between a movement-deficient and a replication-deficient mutant was observed starting from 48 h post Agrobacterium infiltration (h.p.i.). The Rluc activity derived from wild type (wt) PVA increased significantly between 48 and 72 h.p.i. and the Rluc/Fluc activity deviated clearly from that of the mutant viruses. Quantitation of the Rluc and Fluc mRNAs by semi-quantitative RT-PCR indicated that increases and decreases in the Renilla reniformis luciferase (rluc) mRNA levels coincided with changes in Rluc activity. However, a subtle increase in the mRNA level led to pronounced changes in Rluc activity. PVA CP accumulation was quantitated by enzyme-linked immunosorbent assay. The increase in Rluc activity correlated closely with virus accumulation. Copyright (c) 2009 Elsevier B.V. All rights reserved.

  5. Digital micromirror device-based common-path quantitative phase imaging

    PubMed Central

    Zheng, Cheng; Zhou, Renjie; Kuang, Cuifang; Zhao, Guangyuan; Yaqoob, Zahid; So, Peter T. C.

    2017-01-01

    We propose a novel common-path quantitative phase imaging (QPI) method based on a digital micromirror device (DMD). The DMD is placed in a plane conjugate to the objective back-aperture plane for the purpose of generating two plane waves that illuminate the sample. A pinhole is used in the detection arm to filter one of the beams after the sample to create a reference beam. Additionally, a transmission-type liquid crystal device, placed at the objective back-aperture plane, eliminates the specular reflection noise arising from all the “off” state DMD micromirrors, which is common in all DMD-based illuminations. We have demonstrated high sensitivity QPI, which has a measured spatial and temporal noise of 4.92 nm and 2.16 nm, respectively. Experiments with calibrated polystyrene beads illustrate the desired phase measurement accuracy. In addition, we have measured the dynamic height maps of red blood cell membrane fluctuations, showing the efficacy of the proposed system for live cell imaging. Most importantly, the DMD grants the system convenience in varying the interference fringe period on the camera to easily satisfy the pixel sampling conditions. This feature also alleviates the pinhole alignment complexity. We envision that the proposed DMD-based common-path QPI system will allow for system miniaturization and automation for broader adaptation. PMID:28362789

  6. Nurses' perceptions of evidence-based practice: a quantitative study at a teaching hospital in Iran.

    PubMed

    Shafiei, Ebrahim; Baratimarnani, Ahmad; Goharinezhad, Salime; Kalhor, Rohollah; Azmal, Mohammad

    2014-01-01

    Evidence-based practice (EBP) provides nurses with a method for using critically appraised and scientifically proven evidence to deliver quality health care and make the best decisions, leading to quality outcomes. The purpose of this study was to measure the practice, attitude and knowledge/skill of evidence-based practice of nurses in a teaching hospital in Iran. This cross-sectional study was conducted in 2011. The study sample was composed of 195 nurses who were working at the Fatemeh Zahra Hospital affiliated to Bushehr University of Medical Sciences (BPUMS). The survey instrument was a questionnaire based on the Upton and Upton study. This tool measures nurses' perceptions in the three sub-scales of practice, attitude and knowledge/skill of evidence-based practice. Descriptive statistical analysis was used to analyze the data. Pearson correlation coefficients were used to examine the relationship between subscales. The overall mean score of evidence-based practice in this study was 4.48 ± 1.26 out of 7, and the scores for the three subscales of practice, attitude and knowledge/skill in evidence-based practice were 4.58 ± 1.24, 4.57 ± 1.35 and 4.39 ± 1.20, respectively. There was a strong relationship between the knowledge and practice subscales (r = 0.73, p < 0.01). Findings of the study indicate that more training and education are required for evidence-based nursing. Successful implementation of evidence-based nursing depends on organizational plans and empowerment programs in hospitals. Hence, hospital managers should formulate a comprehensive strategy for improving EBP.

  7. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry-based proteomics has evolved as a promising technology over the last decade and is undergoing dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS is targeting the fast-growing field of proteomics-related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods, which incorporate metabolic labelling in live cells, ICP-MS-based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Principles of quantitation of viral loads using nucleic acid sequence-based amplification in combination with homogeneous detection using molecular beacons.

    PubMed

    Weusten, Jos J A M; Carpay, Wim M; Oosterlaken, Tom A M; van Zuijlen, Martien C A; van de Wiel, Paul A

    2002-03-15

    For quantitative NASBA-based viral load assays using homogeneous detection with molecular beacons, such as the NucliSens EasyQ HIV-1 assay, a quantitation algorithm is required. During the amplification process there is a constant growth in the concentration of amplicons to which the beacon can bind while generating a fluorescence signal. The overall fluorescence curve contains kinetic information on both amplicon formation and beacon binding, but only the former is relevant for quantitation. In the current paper, mathematical modeling of the relevant processes is used to develop an equation describing the fluorescence curve as a function of the amplification time and the relevant kinetic parameters. This equation allows reconstruction of RNA formation, which is characterized by an exponential increase in concentrations as long as the primer concentrations are not rate limiting and by linear growth over time after the primer pool is depleted. During the linear growth phase, the actual quantitation is based on assessing the amplicon formation rate from the viral RNA relative to that from a fixed amount of calibrator RNA. The quantitation procedure has been successfully applied in the NucliSens EasyQ HIV-1 assay.
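
    As a rough illustration of the two-phase amplicon growth and calibrator-based quantitation described above, the sketch below models an exponential phase followed by a linear phase and estimates the unknown viral load from the ratio of linear-phase formation rates. All rate constants, times and copy numbers are invented for illustration and are not the assay's actual parameters.

```python
# Illustrative model of the two-phase amplicon growth described above:
# exponential while primers are in excess, then linear after primer depletion.
# All rate constants, times and copy numbers are invented for this sketch and
# are not the assay's actual parameters.
import numpy as np

def amplicon_concentration(t, n0, k_exp, t_switch, linear_rate):
    """Amplicon level vs time: exponential up to t_switch, linear afterwards."""
    t = np.asarray(t, dtype=float)
    exponential_part = n0 * np.exp(k_exp * np.minimum(t, t_switch))
    linear_part = np.where(t > t_switch, linear_rate * (t - t_switch), 0.0)
    return exponential_part + linear_part

def viral_load_from_slopes(slope_virus, slope_calibrator, calibrator_copies):
    """Linear-phase quantitation: compare amplicon formation rates (slopes)
    of the viral RNA and a fixed amount of calibrator RNA."""
    return calibrator_copies * slope_virus / slope_calibrator

t = np.linspace(0, 90, 200)   # minutes, illustrative
curve = amplicon_concentration(t, n0=1e3, k_exp=0.25, t_switch=30, linear_rate=5e4)
print(f"Final amplicon level (arbitrary units): {curve[-1]:.2e}")
print(f"Estimated viral load: {viral_load_from_slopes(2.0, 1.0, 1e5):.0f} copies")
```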

  9. Quantitative characterization of the carbon/carbon composites components based on video of polarized light microscope.

    PubMed

    Li, Yixian; Qi, Lehua; Song, Yongshan; Chao, Xujiang

    2017-06-01

    The components of carbon/carbon (C/C) composites have a significant influence on the thermal and mechanical properties, so a quantitative characterization of the components is necessary to study the microstructure of C/C composites and, further, to improve their macroscopic properties. Considering that the extinction crosses of the pyrocarbon matrix have significant moving features, polarized light microscope (PLM) video is used to characterize C/C composites quantitatively because it contains sufficient dynamic and structural information. The optical flow method is then introduced to compute the optical flow field between adjacent frames and to segment the components of C/C composites from the PLM images by image processing. Meanwhile, the matrix with different textures is re-segmented by the length difference of the motion vectors, and the fraction of each component and the extinction angle of the pyrocarbon matrix are calculated directly. Finally, the C/C composites are successfully characterized in terms of carbon fiber, pyrocarbon, and pores by a series of image processing operators based on PLM video, and the errors of the component fractions are less than 15%. © 2017 Wiley Periodicals, Inc.
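
    The general workflow described above can be sketched with a dense optical-flow computation between two adjacent frames followed by a motion-magnitude threshold. This is an assumed workflow, not the authors' exact pipeline; the frames and threshold below are synthetic stand-ins.

```python
# Sketch (assumed workflow, not the authors' exact pipeline): dense optical flow
# between two adjacent PLM frames, then a motion-magnitude threshold to separate
# moving extinction crosses (pyrocarbon matrix) from static regions (fibres, pores).
# The frames below are synthetic stand-ins for real PLM video frames.
import cv2
import numpy as np

h, w = 240, 320
prev_gray = np.zeros((h, w), dtype=np.uint8)
next_gray = np.zeros((h, w), dtype=np.uint8)
cv2.circle(prev_gray, (100, 120), 20, 255, -1)   # a bright "extinction cross" feature
cv2.circle(next_gray, (104, 120), 20, 255, -1)   # same feature shifted by 4 pixels

# Farneback dense optical flow between the two adjacent frames
flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                    pyr_scale=0.5, levels=3, winsize=15,
                                    iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
magnitude, _ = cv2.cartToPolar(flow[..., 0], flow[..., 1])

# Segment by motion-vector length (threshold chosen arbitrarily for illustration);
# in the paper the matrix textures are further re-segmented by vector length.
moving = magnitude > 1.0
print(f"Moving (matrix-like) area fraction: {moving.mean():.2%}")
```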

  10. WE-DE-207B-04: Quantitative Contrast-Enhanced Spectral Mammography Based On Photon-Counting Detectors: A Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, H; Zhou, B; Beidokhti, D

    Purpose: To investigate the feasibility of accurate quantification of iodine mass thickness in contrast-enhanced spectral mammography. Methods: Experimental phantom studies were performed on a spectral mammography system based on Si strip photon-counting detectors. Dual-energy images were acquired using 40 kVp and a splitting energy of 34 keV with 3 mm Al pre-filtration. The initial calibration was done with glandular and adipose tissue equivalent phantoms of uniform thicknesses and iodine disk phantoms of various concentrations. A secondary calibration was carried out using the iodine signal obtained from the dual-energy decomposed images and the known background phantom thicknesses and densities. The iodine signal quantification method was validated using phantoms composed of a mixture of glandular and adipose materials, for various breast thicknesses and densities. Finally, the traditional dual-energy weighted subtraction method was also studied as a comparison. The measured iodine signal from both methods was compared to the known iodine concentrations of the disk phantoms to characterize the quantification accuracy. Results: There was good agreement between the iodine mass thicknesses measured using the proposed method and the known values. The root-mean-square (RMS) error was estimated to be 0.2 mg/cm2. The traditional weighted subtraction method also predicted a linear correlation between the measured signal and the known iodine mass thickness. However, the correlation slope and offset values were strongly dependent on the total breast thickness and density. Conclusion: The results of the current study suggest that iodine mass thickness can be accurately quantified with contrast-enhanced spectral mammography. The quantitative information can potentially improve the differentiation between benign and malignant lesions. Grant funding from Philips Medical Systems.
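
    The sketch below illustrates a generic dual-energy weighted log-subtraction together with a linear calibration from subtracted signal to iodine mass thickness and an RMS-error check. It is not the authors' decomposition algorithm; the weight, images and calibration data are made up for illustration.

```python
# Generic dual-energy sketch (not the authors' decomposition algorithm):
# a weighted log-subtraction suppresses the tissue background, and a linear
# calibration maps the subtracted signal to iodine mass thickness.
# All numbers below are made up for illustration.
import numpy as np

def weighted_subtraction(low_bin_img, high_bin_img, w):
    """Weighted log-subtraction of the low-energy-bin image from the high-energy bin."""
    return np.log(high_bin_img) - w * np.log(low_bin_img)

low_bin = np.array([[0.80, 0.78], [0.60, 0.58]])   # toy transmission images
high_bin = np.array([[0.90, 0.89], [0.75, 0.74]])
sub_img = weighted_subtraction(low_bin, high_bin, w=0.7)
print("Subtracted image:\n", sub_img.round(3))

# Illustrative calibration: known iodine disks (mg/cm^2) vs measured subtracted signal
known_iodine = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
signal = np.array([0.01, 0.12, 0.23, 0.46, 0.93])      # invented readings

slope, offset = np.polyfit(signal, known_iodine, 1)    # signal -> mg/cm^2 mapping
estimated = slope * signal + offset
rms_error = np.sqrt(np.mean((estimated - known_iodine) ** 2))
print(f"RMS error of the calibration: {rms_error:.2f} mg/cm^2")
```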

  11. Family-based childhood obesity prevention interventions: a systematic review and quantitative content analysis.

    PubMed

    Ash, Tayla; Agaronov, Alen; Young, Ta'Loria; Aftosmes-Tobio, Alyssa; Davison, Kirsten K

    2017-08-24

    A wide range of interventions has been implemented and tested to prevent obesity in children. Given parents' influence and control over children's energy-balance behaviors, including diet, physical activity, media use, and sleep, family interventions are a key strategy in this effort. The objective of this study was to profile the field of recent family-based childhood obesity prevention interventions by employing systematic review and quantitative content analysis methods to identify gaps in the knowledge base. Using a comprehensive search strategy, we searched the PubMed, PsycINFO, and CINAHL databases to identify eligible interventions aimed at preventing childhood obesity with an active family component published between 2008 and 2015. Characteristics of study design, behavioral domains targeted, and sample demographics were extracted from eligible articles using a comprehensive codebook. More than 90% of the 119 eligible interventions were based in the United States, Europe, or Australia. Most interventions targeted children 2-5 years of age (43%) or 6-10 years of age (35%), with few studies targeting the prenatal period (8%) or children 14-17 years of age (7%). The home (28%), primary health care (27%), and community (33%) were the most common intervention settings. Diet (90%) and physical activity (82%) were more frequently targeted in interventions than media use (55%) and sleep (20%). Only 16% of interventions targeted all four behavioral domains. In addition to studies in developing countries, racial minorities and non-traditional families were also underrepresented. Hispanic/Latino and families of low socioeconomic status were highly represented. The limited number of interventions targeting diverse populations and obesity risk behaviors beyond diet and physical activity inhibits the development of comprehensive, tailored interventions. To ensure a broad evidence base, more interventions implemented in developing countries and targeting racial

  12. Behavioral and molecular studies of quantitative differences in hygienic behavior in honeybees.

    PubMed

    Gempe, Tanja; Stach, Silke; Bienefeld, Kaspar; Otte, Marianne; Beye, Martin

    2016-10-21

    Hygienic behavior (HB) enables honeybees to tolerate parasites, including infection with the parasitic mite Varroa destructor, and it is a well-known example of a quantitative genetic trait. The understanding of the molecular processes underpinning the quantitative differences in this behavior remains limited. We performed gene expression studies in worker bees that displayed quantitative genetic differences in HB. We established a high and a low genetic source of HB performance and studied the engagement in HB of single worker bees under the same environmental conditions. We found that the percentage of worker bees that engaged in a hygienic behavioral task tripled in the high versus low HB source, suggesting that genetic differences may mediate differences in stimulated states to perform HB. We found 501 differentially expressed genes (DEGs) in the brains of hygienic- versus non-hygienic-performing workers from the high HB source, and 342 DEGs in the brains of hygienic- versus non-hygienic-performing workers from the low HB source group. "Cell surface receptor ligand signal transduction" in the high and "negative regulation of cell communication" in the low HB source were overrepresented molecular processes, suggesting that these molecular processes in the brain may play a role in the regulation of quantitative differences in HB. Moreover, only 21 HB-associated DEGs were common between the high and low HB sources. Better HB colony performance is achieved primarily by a high number of bees engaging in hygienic tasks, which is associated with distinct molecular processes in the brain. We propose that different gene products and pathways may mediate the quantitative genetic differences in HB.

  13. A computer system to be used with laser-based endoscopy for quantitative diagnosis of early gastric cancer.

    PubMed

    Miyaki, Rie; Yoshida, Shigeto; Tanaka, Shinji; Kominami, Yoko; Sanomura, Yoji; Matsuo, Taiji; Oka, Shiro; Raytchev, Bisser; Tamaki, Toru; Koide, Tetsushi; Kaneda, Kazufumi; Yoshihara, Masaharu; Chayama, Kazuaki

    2015-02-01

    To evaluate the usefulness of a newly devised computer system for use with laser-based endoscopy in differentiating between early gastric cancer, reddened lesions, and surrounding tissue. Narrow-band imaging based on laser light illumination has come into recent use. We devised a support vector machine (SVM)-based analysis system to be used with the newly devised endoscopy system to quantitatively identify gastric cancer on images obtained by magnifying endoscopy with blue-laser imaging (BLI). We evaluated the usefulness of the computer system in combination with the new endoscopy system. We evaluated the system as applied to 100 consecutive early gastric cancers in 95 patients examined by BLI magnification at Hiroshima University Hospital. We produced a set of images from the 100 early gastric cancers; 40 flat or slightly depressed, small, reddened lesions; and surrounding tissues, and we attempted to identify gastric cancer, reddened lesions, and surrounding tissue quantitatively. The average SVM output value was 0.846 ± 0.220 for cancerous lesions, 0.381 ± 0.349 for reddened lesions, and 0.219 ± 0.277 for surrounding tissue, with the SVM output value for cancerous lesions being significantly greater than that for reddened lesions or surrounding tissue. The average SVM output value for differentiated-type cancer was 0.840 ± 0.207 and for undifferentiated-type cancer was 0.865 ± 0.259. Although further development is needed, we conclude that our computer-based analysis system used with BLI will identify gastric cancers quantitatively.

  14. iTRAQ-Based Quantitative Proteomic Analysis of Spirulina platensis in Response to Low Temperature Stress

    PubMed Central

    Li, Qingye; Chang, Rong; Sun, Yijun; Li, Bosheng

    2016-01-01

    Low temperature (LT) is one of the most important abiotic stresses that can significantly reduce crop yield. To gain insight into how Spirulina responds to LT stress, comprehensive physiological and proteomic analyses were conducted in this study. Significant decreases in growth and pigment levels as well as excessive accumulation of compatible osmolytes were observed in response to LT stress. An isobaric tag for relative and absolute quantitation (iTRAQ)-based quantitative proteomics approach was used to identify changes in protein abundance in Spirulina under LT. A total of 3,782 proteins were identified, of which 1,062 showed differential expression. Bioinformatics analysis indicated that differentially expressed proteins that were enriched in photosynthesis, carbohydrate metabolism, amino acid biosynthesis, and translation are important for the maintenance of cellular homeostasis and metabolic balance in Spirulina when subjected to LT stress. The up-regulation of proteins involved in gluconeogenesis, starch and sucrose metabolism, and amino acid biosynthesis served as coping mechanisms of Spirulina in response to LT stress. Moreover, the down-regulated expression of proteins involved in glycolysis, TCA cycle, pentose phosphate pathway, photosynthesis, and translation were associated with reduced energy consumption. The findings of the present study allow a better understanding of the response of Spirulina to LT stress and may facilitate in the elucidation of mechanisms underlying LT tolerance. PMID:27902743

  15. iTRAQ-Based Quantitative Proteomic Analysis of Spirulina platensis in Response to Low Temperature Stress.

    PubMed

    Li, Qingye; Chang, Rong; Sun, Yijun; Li, Bosheng

    2016-01-01

    Low temperature (LT) is one of the most important abiotic stresses that can significantly reduce crop yield. To gain insight into how Spirulina responds to LT stress, comprehensive physiological and proteomic analyses were conducted in this study. Significant decreases in growth and pigment levels as well as excessive accumulation of compatible osmolytes were observed in response to LT stress. An isobaric tag for relative and absolute quantitation (iTRAQ)-based quantitative proteomics approach was used to identify changes in protein abundance in Spirulina under LT. A total of 3,782 proteins were identified, of which 1,062 showed differential expression. Bioinformatics analysis indicated that differentially expressed proteins that were enriched in photosynthesis, carbohydrate metabolism, amino acid biosynthesis, and translation are important for the maintenance of cellular homeostasis and metabolic balance in Spirulina when subjected to LT stress. The up-regulation of proteins involved in gluconeogenesis, starch and sucrose metabolism, and amino acid biosynthesis served as coping mechanisms of Spirulina in response to LT stress. Moreover, the down-regulated expression of proteins involved in glycolysis, TCA cycle, pentose phosphate pathway, photosynthesis, and translation were associated with reduced energy consumption. The findings of the present study allow a better understanding of the response of Spirulina to LT stress and may facilitate in the elucidation of mechanisms underlying LT tolerance.

  16. Poem Generator: A Comparative Quantitative Evaluation of a Microworlds-Based Learning Approach for Teaching English

    ERIC Educational Resources Information Center

    Jenkins, Craig

    2015-01-01

    This paper is a comparative quantitative evaluation of an approach to teaching poetry in the subject domain of English that employs a "guided discovery" pedagogy using computer-based microworlds. It uses a quasi-experimental design in order to measure performance gains in computational thinking and poetic thinking following a…

  17. Fifteen years of quantitative trait loci studies in fish: challenges and future directions.

    PubMed

    Ashton, David T; Ritchie, Peter A; Wellenreuther, Maren

    2017-03-01

    Understanding the genetic basis of phenotypic variation is a major challenge in biology. Here, we systematically evaluate 146 quantitative trait loci (QTL) studies on teleost fish over the last 15 years to investigate (i) temporal trends and (ii) factors affecting QTL detection and fine-mapping. The number of fish QTL studies per year increased over the review period and identified a cumulative number of 3632 putative QTLs. Most studies used linkage-based mapping approaches and were conducted on nonmodel species with limited genomic resources. A gradual and moderate increase in the size of the mapping population and a sharp increase in marker density from 2011 onwards were observed; however, the number of QTLs and variance explained by QTLs changed only minimally over the review period. Based on these findings, we discuss the causative factors and outline how larger sample sizes, phenomics, comparative genomics, epigenetics and software development could improve both the quantity and quality of QTLs in future genotype-phenotype studies. Given that the technical limitations on DNA sequencing have mostly been overcome in recent years, a renewed focus on these and other study design factors will likely lead to significant improvements in QTL studies in the future. © 2016 John Wiley & Sons Ltd.

  18. Proteus mirabilis biofilm - qualitative and quantitative colorimetric methods-based evaluation.

    PubMed

    Kwiecinska-Piróg, Joanna; Bogiel, Tomasz; Skowron, Krzysztof; Wieckowska, Ewa; Gospodarek, Eugenia

    2014-01-01

    The ability of Proteus mirabilis strains to form biofilm is a current research topic worldwide. In this study, the biofilm formation of P. mirabilis strains derived from the urine of catheterized and non-catheterized patients was investigated. A total of 39 P. mirabilis strains isolated from urine samples of patients of the Dr. Antoni Jurasz University Hospital No. 1 clinics in Bydgoszcz between 2011 and 2012 were used. Biofilm formation was evaluated using two independent quantitative and qualitative methods with TTC (2,3,5-triphenyl-tetrazolium chloride) and CV (crystal violet) application. The obtained results confirmed biofilm formation by all the examined strains, except in the quantitative method with TTC, in which 7.7% of the strains did not show this ability. It was shown that P. mirabilis rods have the ability to form biofilm on the surfaces of both biomaterials applied, polystyrene and polyvinyl chloride (Nelaton catheters). The differences in the ability to form biofilm observed between P. mirabilis strains derived from the urine of catheterized and non-catheterized patients were not statistically significant.

  19. Simulation and the Development of Clinical Judgment: A Quantitative Study

    ERIC Educational Resources Information Center

    Holland, Susan

    2015-01-01

    The purpose of this quantitative pretest-posttest quasi-experimental research study was to explore the effect of the Nursing Education Simulation Design (NESD) on clinical judgment in associate degree nursing students and to compare the differences between groups when the NESD guided simulation in order to identify educational strategies promoting…

  20. Suicide in the media: a quantitative review of studies based on non-fictional stories.

    PubMed

    Stack, Steven

    2005-04-01

    Research on the effect of suicide stories in the media on suicide in the real world has been marked by much debate and inconsistent findings. Recent narrative reviews have suggested that research based on nonfictional models is more apt to uncover imitative effects than research based on fictional models. There is, however, substantial variation in media effects within the research restricted to nonfictional accounts of suicide. The present analysis provides some explanations of the variation in findings in the work on nonfictional media. Logistic regression techniques applied to 419 findings from 55 studies determined that: (1) studies measuring the presence of either an entertainment or political celebrity were 5.27 times more likely to find a copycat effect, (2) studies focusing on stories that stressed negative definitions of suicide were 99% less likely to report a copycat effect, (3) research based on television stories (which receive less coverage than print stories) was 79% less likely to find a copycat effect, and (4) studies focusing on female suicide were 4.89 times more likely to report a copycat effect than other studies. The full logistic regression model correctly classified 77.3% of the findings from the 55 studies. Methodological differences among studies are associated with discrepancies in their results.
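
    The kind of finding-level logistic regression described above can be sketched as follows: each row is one finding, the outcome is whether a copycat effect was reported, and the exponentiated coefficients give odds ratios. The data below are synthetic; the predictor names simply mirror the factors listed in the abstract.

```python
# Sketch of a finding-level logistic regression: exp(coefficient) gives the
# odds ratio for each study characteristic. Data here are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 419
X = np.column_stack([
    rng.integers(0, 2, n),   # celebrity story (1 = yes)
    rng.integers(0, 2, n),   # negative definition of suicide stressed
    rng.integers(0, 2, n),   # televised (vs print) story
    rng.integers(0, 2, n),   # female suicide focus
])
logits = -0.5 + 1.7 * X[:, 0] - 2.0 * X[:, 1] - 1.2 * X[:, 2] + 1.5 * X[:, 3]
y = rng.binomial(1, 1 / (1 + np.exp(-logits)))        # copycat effect found?

model = sm.Logit(y, sm.add_constant(X)).fit(disp=False)
odds_ratios = np.exp(model.params[1:])
print(dict(zip(["celebrity", "negative_definition", "television", "female"],
               odds_ratios.round(2))))
```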

  1. Quantitation of 47 human tear proteins using high resolution multiple reaction monitoring (HR-MRM) based-mass spectrometry.

    PubMed

    Tong, Louis; Zhou, Xi Yuan; Jylha, Antti; Aapola, Ulla; Liu, Dan Ning; Koh, Siew Kwan; Tian, Dechao; Quah, Joanne; Uusitalo, Hannu; Beuerman, Roger W; Zhou, Lei

    2015-02-06

    Tear proteins are intimately related to the pathophysiology of the ocular surface. Many recent studies have demonstrated that the tear is an accessible fluid for studying eye diseases and biomarker discovery. This study describes a high resolution multiple reaction monitoring (HR-MRM) approach for developing assays for quantification of biologically important tear proteins. Human tear samples were collected from 1000 subjects with no eye complaints (411 male, 589 female, average age: 55.5 ± 14.5 years) after obtaining informed consent. Tear samples were collected using Schirmer's strips and pooled into a single global control sample. Quantification of proteins was carried out by selecting "signature" peptides derived by trypsin digestion. A 1-h nanoLC-MS/MS run was used to quantify the tear proteins in HR-MRM mode. Good reproducibility of signal intensity (using peak areas) was demonstrated for all 47 HR-MRM assays with an average coefficient of variation (CV%) of 4.82% (range: 1.52-10.30%). All assays showed consistent retention time with a CV of less than 0.80% (average: 0.57%). HR-MRM absolute quantitation of eight tear proteins was demonstrated using stable isotope-labeled peptides. In this study, we demonstrated for the first time the technique to quantify 47 human tear proteins in HR-MRM mode using approximately 1 μl of human tear sample. These multiplexed HR-MRM-based assays show great promise for further development for biomarker validation in human tear samples. Both discovery-based and targeted quantitative proteomics can be achieved in a single quadrupole time-of-flight mass spectrometer platform (TripleTOF 5600 system). Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Comparative evaluation of two quantitative test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface: a precollaborative study.

    PubMed

    Tomasino, Stephen F; Hamilton, Martin A

    2007-01-01

    Two quantitative carrier-based test methods for determining the efficacy of liquid sporicides and sterilants on a hard surface, the Standard Quantitative Carrier Test Method-ASTM E 2111-00 and an adaptation of a quantitative micro-method as reported by Sagripanti and Bonifacino, were compared in this study. The methods were selected based on their desirable characteristics (e.g., well-developed protocol, previous use with spores, fully quantitative, and use of readily available equipment) for testing liquid sporicides and sterilants on a hard surface. In this paper, the Sagripanti-Bonifacino procedure is referred to as the Three Step Method (TSM). AOAC Official Method 966.04 was included in this study as a reference method. Three laboratories participated in the evaluation. Three chemical treatments were tested: (1) 3000 ppm sodium hypochlorite with pH adjusted to 7.0, (2) a hydrogen peroxide/peroxyacetic acid product, and (3) 3000 ppm sodium hypochlorite with pH unadjusted (pH of approximately 10.0). A fourth treatment, 6000 ppm sodium hypochlorite solution with pH adjusted to 7.0, was included only for Method 966.04 as a positive control (high level of efficacy). The contact time was 10 min for all chemical treatments except the 6000 ppm sodium hypochlorite treatment which was tested at 30 min. Each chemical treatment was tested 3 times using each of the methods. Only 2 of the laboratories performed the AOAC method. Method performance was assessed by the within-laboratory variance, between-laboratory variance, and total variance associated with the log reduction (LR) estimates generated by each quantitative method. The quantitative methods performed similarly, and the LR values generated by each method were not statistically different for the 3 treatments evaluated. Based on feedback from the participating laboratories, compared to the TSM, ASTM E 2111-00 was more resource demanding and required more set-up time. The logistical and resource concerns identified

  3. A comparative study of qualitative and quantitative methods for the assessment of adhesive remnant after bracket debonding.

    PubMed

    Cehreli, S Burcak; Polat-Ozsoy, Omur; Sar, Cagla; Cubukcu, H Evren; Cehreli, Zafer C

    2012-04-01

    The amount of the residual adhesive after bracket debonding is frequently assessed in a qualitative manner, utilizing the adhesive remnant index (ARI). This study aimed to investigate whether quantitative assessment of the adhesive remnant yields more precise results compared to qualitative methods utilizing the 4- and 5-point ARI scales. Twenty debonded brackets were selected. Evaluation and scoring of the adhesive remnant on bracket bases were made consecutively using: 1. qualitative assessment (visual scoring) and 2. quantitative measurement (image analysis) on digital photographs. Image analysis was made on scanning electron micrographs (SEM) and high-precision elemental maps of the adhesive remnant as determined by energy dispersed X-ray spectrometry. Evaluations were made in accordance with the original 4-point and the modified 5-point ARI scales. Intra-class correlation coefficients (ICCs) were calculated, and the data were evaluated using Friedman test followed by Wilcoxon signed ranks test with Bonferroni correction. ICC statistics indicated high levels of agreement for qualitative visual scoring among examiners. The 4-point ARI scale was compliant with the SEM assessments but indicated significantly less adhesive remnant compared to the results of quantitative elemental mapping. When the 5-point scale was used, both quantitative techniques yielded similar results with those obtained qualitatively. These results indicate that qualitative visual scoring using the ARI is capable of generating similar results with those assessed by quantitative image analysis techniques. In particular, visual scoring with the 5-point ARI scale can yield similar results with both the SEM analysis and elemental mapping.

  4. Selecting the most appropriate inferential statistical test for your quantitative research study.

    PubMed

    Bettany-Saltikov, Josette; Whittaker, Victoria Jane

    2014-06-01

    To discuss the issues and processes relating to the selection of the most appropriate statistical test. A review of the basic research concepts together with a number of clinical scenarios is used to illustrate this. Quantitative nursing research generally features the use of empirical data which necessitates the selection of both descriptive and statistical tests. Different types of research questions can be answered by different types of research designs, which in turn need to be matched to a specific statistical test(s). Discursive paper. This paper discusses the issues relating to the selection of the most appropriate statistical test and makes some recommendations as to how these might be dealt with. When conducting empirical quantitative studies, a number of key issues need to be considered. Considerations for selecting the most appropriate statistical tests are discussed and flow charts provided to facilitate this process. When nursing clinicians and researchers conduct quantitative research studies, it is crucial that the most appropriate statistical test is selected to enable valid conclusions to be made. © 2013 John Wiley & Sons Ltd.

  5. Subjective Quantitative Studies of Human Agency

    ERIC Educational Resources Information Center

    Alkire, Sabina

    2005-01-01

    Amartya Sen's writings have articulated the importance of human agency, and identified the need for information on agency freedom to inform our evaluation of social arrangements. Many approaches to poverty reduction stress the need for empowerment. This paper reviews "subjective quantitative measures of human agency at the individual level." It…

  6. [Exercise Therapy in German Medical Rehabilitation - an Analysis based on Quantitative Routine Data].

    PubMed

    Brüggemann, Silke; Sewöster, Daniela; Kranzmann, Angela

    2018-02-01

    This study describes the quantitative importance of exercise therapy in German medical rehabilitation based on 2014 routine data of the German Pension Insurance. It also shows changes in comparison with data from 2007. Data from 710,012 rehabilitation discharge letters comprising 83,677,802 treatments from the central indications in medical rehabilitation were analysed descriptively. Overall, 35.4% of treatments could be classified as exercise therapy. Total and relative duration, percentage of individual treatment and kind of exercise treatment varied between indications in 2007 as well as in 2014. There were also differences between sexes, age groups and settings. During the period examined, the high importance of exercise therapy in German medical rehabilitation increased further. The results point to a meaningful concept behind the composition of exercise therapy, taking indication- and disease-related factors into account. © Georg Thieme Verlag KG Stuttgart · New York.

  7. Nuclear medicine and quantitative imaging research (quantitative studies in radiopharmaceutical science): Comprehensive progress report, April 1, 1986-December 31, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1988-06-01

    This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)

  8. Quantitative assessment of human-induced impacts based on net primary productivity in Guangzhou, China.

    PubMed

    Wu, Yanyan; Wu, Zhifeng

    2018-04-01

    Urban expansion and land cover change driven primarily by human activities have significant influences on the urban eco-environment and, together with climate change, jointly alter net primary productivity (NPP). However, at the spatiotemporal scale, there has been limited quantitative analysis of the impacts of human activities independent of climate change on NPP. We chose Guangzhou city as a study area to analyze the impacts of human activities on NPP, as well as the spatiotemporal variations of those impacts within three segments, using a relative impact index (RII) based on potential NPP (NPPp), actual NPP (NPPact), and NPP appropriation due to land use/land cover change (NPPlulc). The spatial patterns and dynamics of NPPact and NPPlulc were evaluated, and the impacts of human activities on NPP during the process of urban sprawl were quantitatively analyzed and assessed using the RII. The results showed that NPPact and NPPlulc in the study area had clear spatial heterogeneity; between 2001 and 2013 there was a declining trend in NPPact and an increasing trend in NPPlulc, and those trends were especially significant in the 10-40-km segment. The results also revealed that more than 91.0% of pixels in the whole study region had positive RII values, while the lowest average RII values were found in the > 40-km segment (39.03%), indicating that human activities were not the main cause of the change in NPP there; meanwhile, the average RII was greater than 65.0% in the other two segments, suggesting that they were subjected to severe anthropogenic disturbances. The RII values in all three segments of the study area increased, indicating increasing human interference. The 10-40-km buffer zone had the largest slope value (0.5665), suggesting that this segment was closely associated with growing human disturbances. Particularly noteworthy is the fact that the > 40-km segment had a large slope value (0.3323) and required more conservation efforts. Based

  9. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    Summary We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the use of the proposed full-spectrum imaging technique may result in a threefold improvement in signal dynamic range compared to that achieved with filter-based imaging. PMID:22356127
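
    The spectral-unmixing filtering mentioned above can be illustrated with a per-pixel nonnegative least-squares fit of known emission endmembers to the measured spectrum. This is a generic sketch with synthetic spectra, not the authors' processing pipeline.

```python
# Generic spectral-unmixing sketch: estimate fluorophore abundances for one pixel
# by nonnegative least squares against known emission endmembers (synthetic data).
import numpy as np
from scipy.optimize import nnls

n_bands, n_fluors = 32, 5
rng = np.random.default_rng(6)
endmembers = np.abs(rng.normal(size=(n_bands, n_fluors)))   # reference emission spectra
true_abundances = np.array([0.5, 0.0, 1.2, 0.3, 0.0])
pixel_spectrum = endmembers @ true_abundances + rng.normal(0, 0.01, n_bands)

abundances, residual = nnls(endmembers, pixel_spectrum)
print("Recovered abundances:", abundances.round(2))
```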

  10. Quantitative study of FORC diagrams in thermally corrected Stoner- Wohlfarth nanoparticles systems

    NASA Astrophysics Data System (ADS)

    De Biasi, E.; Curiale, J.; Zysler, R. D.

    2016-12-01

    The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to achieve quantitative information, requires an appropriate model of the studied system. For that reason, most FORC studies are used for qualitative analysis. In magnetic systems, thermal fluctuations "blur" the signatures of the anisotropy, volume and particle interaction distributions; therefore, thermal effects in nanoparticle systems conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner-Wohlfarth nanoparticle systems (easy axes along the external field orientation). In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypical cases of magnetic nanoparticles. Our results show that, from the quantitative information obtained from the diagrams under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy is improved at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, with the mean value and its deviation being the only important parameters. Therefore, it is possible to obtain an accurate result for the inversion and interaction fields despite the features of the volume distribution.

  11. A quantitative study on magnesium alloy stent biodegradation.

    PubMed

    Gao, Yuanming; Wang, Lizhen; Gu, Xuenan; Chu, Zhaowei; Guo, Meng; Fan, Yubo

    2018-06-06

    Insufficient scaffolding time caused by rapid corrosion is the main problem of magnesium alloy stents (MAS). The finite element method has been used to investigate the corrosion of MAS; however, related studies have mostly described the corrosion of all elements from the viewpoint of one-dimensional corrosion. Multi-dimensional corrosion significantly influences the mechanical integrity of MAS structures such as edges and corners. In this study, the effects of multi-dimensional corrosion were studied quantitatively by experiment, and a phenomenological corrosion model was then developed to account for these effects. We performed immersion tests with magnesium alloy (AZ31B) cubes that had different numbers of exposed surfaces in order to analyze the dimensional differences. The corrosion rates of the cubes were found to be almost proportional to their exposed-surface numbers, especially when pitting corrosion was not marked. The cubes also represented the hexahedron elements in the simulation. In conclusion, the corrosion rate of every element accelerates with increasing corrosion-surface numbers in multi-dimensional corrosion. The damage ratios among elements of the same size are proportional to the ratios of corrosion-surface numbers under uniform corrosion. The finite element simulation using the proposed model provided more details of the changes in morphology and mechanics during the scaffolding time by removing 25.7% of the elements of the MAS. The proposed corrosion model reflects the effects of multiple dimensions on corrosion and can be used to predict the degradation process of MAS quantitatively. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Evaluation of chemotherapy response in ovarian cancer treatment using quantitative CT image biomarkers: a preliminary study

    NASA Astrophysics Data System (ADS)

    Qiu, Yuchen; Tan, Maxine; McMeekin, Scott; Thai, Theresa; Moore, Kathleen; Ding, Kai; Liu, Hong; Zheng, Bin

    2015-03-01

    The purpose of this study is to identify and apply quantitative image biomarkers for early prediction of tumor response to chemotherapy among ovarian cancer patients who participated in clinical trials testing new drugs. In the experiment, we retrospectively selected 30 cases from patients who participated in Phase I clinical trials of new drugs or drug agents for ovarian cancer treatment. Each case is composed of two sets of CT images acquired pre- and post-treatment (4-6 weeks after starting treatment). A computer-aided detection (CAD) scheme was developed to extract and analyze the quantitative image features of the metastatic tumors previously tracked by the radiologists using the standard Response Evaluation Criteria in Solid Tumors (RECIST) guideline. The CAD scheme first segmented 3-D tumor volumes from the background using a hybrid tumor segmentation scheme. Then, for each segmented tumor, CAD computed three quantitative image features, namely the changes in tumor volume, tumor CT number (density) and density variance. The feature changes were calculated between the matched tumors tracked on the CT images acquired pre- and post-treatment. Finally, CAD predicted each patient's 6-month progression-free survival (PFS) using a decision-tree-based classifier. The performance of the CAD scheme was compared with the RECIST category. The results show that the CAD scheme achieved a prediction accuracy of 76.7% (23/30 cases) with a Kappa coefficient of 0.493, which is significantly higher than the performance of the RECIST prediction, with a prediction accuracy and Kappa coefficient of 60% (17/30) and 0.062, respectively. This study demonstrated the feasibility of analyzing quantitative image features to improve the accuracy of early prediction of tumor response to new testing drugs or therapeutic methods for ovarian cancer patients.
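
    The final classification step described above can be sketched as a decision-tree classifier predicting 6-month PFS from the three per-case feature changes, scored with accuracy and Cohen's kappa. The feature values and labels below are synthetic, not the study's data.

```python
# Sketch of the final classification step: a decision-tree classifier predicting
# 6-month progression-free survival (PFS) from the three per-case feature changes,
# evaluated with accuracy and Cohen's kappa. Features and labels are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(1)
n_cases = 30
X = np.column_stack([
    rng.normal(-10, 20, n_cases),   # % change in tumor volume
    rng.normal(-5, 10, n_cases),    # change in mean tumor CT number (HU)
    rng.normal(0, 5, n_cases),      # change in CT-number variance
])
y = rng.integers(0, 2, n_cases)     # 6-month PFS label (1 = progression-free)

clf = DecisionTreeClassifier(max_depth=2, random_state=0)
y_pred = cross_val_predict(clf, X, y, cv=3)

print("accuracy:", round(float((y_pred == y).mean()), 3))
print("kappa:   ", round(cohen_kappa_score(y, y_pred), 3))
```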

  13. TECHNOLOGICAL INNOVATION IN NEUROSURGERY: A QUANTITATIVE STUDY

    PubMed Central

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-01-01

    Object Technological innovation within healthcare may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technologically intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical technique. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation respectively. Methods A patent database was searched between 1960 and 2010 using the search terms “neurosurgeon” OR “neurosurgical” OR “neurosurgery”. The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top performing technology cluster was then selected as an exemplar for more detailed analysis of individual patents. Results In all, 11,672 patents and 208,203 publications relating to neurosurgery were identified. The top performing technology clusters over the 50 years were: image guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes and endoscopes. Image guidance and neuromodulation devices demonstrated a highly correlated rapid rise in patents and publications, suggesting they are areas of technology expansion. In-depth analysis of neuromodulation patents revealed that the majority of high performing patents were related to Deep Brain Stimulation (DBS). Conclusions Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery. PMID:25699414

  14. Technological innovation in neurosurgery: a quantitative study.

    PubMed

    Marcus, Hani J; Hughes-Hallett, Archie; Kwasnicki, Richard M; Darzi, Ara; Yang, Guang-Zhong; Nandi, Dipankar

    2015-07-01

    Technological innovation within health care may be defined as the introduction of a new technology that initiates a change in clinical practice. Neurosurgery is a particularly technology-intensive surgical discipline, and new technologies have preceded many of the major advances in operative neurosurgical techniques. The aim of the present study was to quantitatively evaluate technological innovation in neurosurgery using patents and peer-reviewed publications as metrics of technology development and clinical translation, respectively. The authors searched a patent database for articles published between 1960 and 2010 using the Boolean search term "neurosurgeon OR neurosurgical OR neurosurgery." The top 50 performing patent codes were then grouped into technology clusters. Patent and publication growth curves were then generated for these technology clusters. A top-performing technology cluster was then selected as an exemplar for a more detailed analysis of individual patents. In all, 11,672 patents and 208,203 publications related to neurosurgery were identified. The top-performing technology clusters during these 50 years were image-guidance devices, clinical neurophysiology devices, neuromodulation devices, operating microscopes, and endoscopes. In relation to image-guidance and neuromodulation devices, the authors found a highly correlated rapid rise in the numbers of patents and publications, which suggests that these are areas of technology expansion. An in-depth analysis of neuromodulation-device patents revealed that the majority of well-performing patents were related to deep brain stimulation. Patent and publication data may be used to quantitatively evaluate technological innovation in neurosurgery.

  15. A Quantitative Correlational Study of Teacher Preparation Program on Student Achievement

    ERIC Educational Resources Information Center

    Dingman, Jacob Blackstone

    2010-01-01

    The purpose of this quantitative correlational study was to identify the relationship between the type of teacher preparation program and student performance on the seventh and eighth grade mathematics state assessments in rural school settings. The study included a survey of a convenience sample of 36 teachers from Colorado and Washington school…

  16. A template-based approach to semi-quantitative SPECT myocardial perfusion imaging: Independent of normal databases.

    PubMed

    Hughes, Tyler; Shcherbinin, Sergey; Celler, Anna

    2011-07-01

    Normal patient databases (NPDs) are used to distinguish between normal and abnormal perfusion in SPECT myocardial perfusion imaging (MPI) and have gained wide acceptance in the clinical environment, yet there are limitations to this approach. This study introduces a template-based method for semi-quantitative MPI, which attempts to overcome some of the NPD limitations. Our approach involves the construction of a 3D digital healthy heart template from the delineation of the patient's left ventricle in the SPECT image. This patient-specific template of the heart, filled with uniform activity, is then analytically projected and reconstructed using the same algorithm as the original image. Subsequent to generating bulls-eye maps for the patient image (PB) and the template image (TB), a ratio (PB/TB) is calculated, which produces a reconstruction-artifact corrected image (CB). Finally, a threshold is used to define defects within CB, enabling measurement of the perfusion defect extent (EXT). The SPECT-based template (Ts) measurements were compared to those of a CT-based "ideal" template (TI). Twenty digital phantoms were simulated: male and female, each with one healthy heart and nine hearts with various defects. Four physical phantom studies were performed modeling a healthy heart and three hearts with different defects. The phantom represented a thorax with spine, lung, and left ventricle inserts. Images were acquired on General Electric's (GE) Infinia Hawkeye SPECT/CT camera using a standard clinical MPI protocol. Finally, our method was applied to 14 patient MPI rest/stress studies acquired on the GE Infinia Hawkeye SPECT/CT camera and compared to the results obtained from Cedars-Sinai's QPS software. In the simulation studies, the true EXT correlated well with the TI (slope = 1.08; offset = -0.40%; r = 0.99) and Ts (slope = 0.90; offset = 0.27%; r = 0.99) methods with no significant differences between them. Similarly, strong correlations were measured for EXT
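
    The template-correction step described above reduces to a simple ratio and threshold, sketched below: divide the patient bulls-eye map (PB) by the uniform-template bulls-eye map (TB) and threshold the corrected map (CB) to estimate defect extent. The maps and the threshold value here are synthetic placeholders, not the study's data.

```python
# Sketch of the template-correction step: CB = PB / TB, then a threshold on the
# corrected map gives the defect extent (EXT). Maps and threshold are synthetic.
import numpy as np

rng = np.random.default_rng(2)
PB = rng.uniform(0.5, 1.0, size=(64, 64))     # patient polar map (normalised counts)
TB = rng.uniform(0.8, 1.0, size=(64, 64))     # template polar map (reconstruction effects only)

CB = PB / TB                                  # artifact-corrected map
CB /= CB.max()                                # normalise to peak uptake

threshold = 0.6                               # assumed defect threshold
defect_mask = CB < threshold
extent_percent = 100.0 * defect_mask.mean()   # EXT: % of the polar map below threshold
print(f"Defect extent: {extent_percent:.1f}% of the left ventricle")
```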

  17. A Quantitative Study Examining Teacher Stress, Burnout, and Self-Efficacy

    ERIC Educational Resources Information Center

    Stephenson, Timar D.

    2012-01-01

    The purpose of this quantitative, correlational study was to examine the relationships between stress, burnout, and self-efficacy in public school teachers in the Turks and Caicos Islands. The Teacher Stress Inventory was used to collect data on teacher stress, the Maslach Burnout Inventory Educators Survey was used to obtain data on teacher…

  18. A globotetraosylceramide (Gb₄) receptor-based ELISA for quantitative detection of Shiga toxin 2e.

    PubMed

    Togashi, Katsuhiro; Sasaki, Shiho; Sato, Wataru

    2015-08-01

    Currently, no simple assays are available for routine quantitative detection of Escherichia coli-produced Shiga toxin 2e (Stx2e) that causes porcine edema disease. Here, we present a novel quantitative detection method for Stx2e based on the measurement of Stx2e binding to the specific globotetraosylceramide (Gb4) receptor by ELISA (Gb4-ELISA). No cross-reactivity was found with the other Shiga toxins Stx1 and Stx2, indicating high specificity. When the recombinant Stx2e B subunit (Stx2eB) was used, the absorbance measured by Gb4-ELISA increased linearly with Stx2eB concentration in the range of 20-2,500 ng/ml. The Gb4-ELISA method can be easily performed, suggesting that it would be a useful diagnostic tool for porcine edema disease.
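
    Since the abstract reports a linear absorbance response over 20-2,500 ng/ml, a simple calibration sketch is shown below: fit a line to standards, then invert it for unknown samples. The absorbance readings are invented for illustration and are not data from the assay.

```python
# Sketch of a linear ELISA calibration over the reported 20-2,500 ng/ml range:
# fit absorbance vs Stx2eB concentration, then invert the fit for unknowns.
# Absorbance values are invented for illustration.
import numpy as np

conc = np.array([20, 100, 500, 1000, 2500], dtype=float)      # ng/ml standards
absorbance = np.array([0.05, 0.18, 0.72, 1.41, 3.40])         # made-up readings

slope, intercept = np.polyfit(conc, absorbance, 1)

def stx2e_concentration(a_reading):
    """Estimate Stx2e concentration (ng/ml) from an absorbance reading."""
    return (a_reading - intercept) / slope

print(f"Sample at A = 0.90 -> {stx2e_concentration(0.90):.0f} ng/ml")
```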

  19. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  20. Quantitative studies of sulphate conjugation by isolated rat liver cells using [35S]sulphate.

    PubMed

    Dawson, J; Knowles, R G; Pogson, C I

    1991-06-21

    We have developed a simple, rapid and sensitive method for the study of sulphate conjugation in isolated liver cells based on the incorporation of 35S from [35S]sulphate. Excess [35S]sulphate is removed by a barium precipitation procedure, leaving [35S]sulphate conjugates in solution. We have used this method to examine the kinetics of sulphation of N-acetyl-p-aminophenol (acetaminophen), 4-nitrophenol and 1-naphthol in isolated rat liver cells. The efficiency of recovery of the sulphate conjugates was greater than 86%. The method is applicable to the quantitative study of sulphate conjugation of any substrate which forms a sulphate conjugate that is soluble in the presence of barium, without the need for standards or radiolabelled sulphate acceptors.

  1. A novel approach for evaluating the performance of real time quantitative loop-mediated isothermal amplification-based methods.

    PubMed

    Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A

    2014-12-01

    Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays and could assist assay development and validation activities.
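
    The paper's exact IDT definition belongs to the cited study; as a rough analogue of PCR efficiency, the sketch below estimates a doubling time from the exponential phase of a real-time amplification curve by a log-linear fit. The signal values and phase limits are synthetic.

```python
# Generic doubling-time estimate from the exponential phase of a real-time
# amplification curve (the cited IDT definition may differ; this is only an
# illustrative analogue of PCR efficiency). Signal values are synthetic.
import numpy as np

time_min = np.arange(0, 30, 1.0)
signal = 100 * 2 ** (time_min / 2.5) + np.random.default_rng(3).normal(0, 50, time_min.size)

# Restrict to the exponential phase and fit log2(signal) vs time:
# the slope is doublings per minute, so the doubling time is 1 / slope.
exp_phase = (signal > 500) & (signal < 50_000)
slope, _ = np.polyfit(time_min[exp_phase], np.log2(signal[exp_phase]), 1)
doubling_time = 1.0 / slope
print(f"Estimated isothermal doubling time: {doubling_time:.2f} min")
```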

  2. Diffraction enhanced X-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

    Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods of this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  3. Quantitative Detection of Cracks in Steel Using Eddy Current Pulsed Thermography.

    PubMed

    Shi, Zhanqun; Xu, Xiaoyu; Ma, Jiaojiao; Zhen, Dong; Zhang, Hao

    2018-04-02

    Small cracks are common defects in steel and often lead to catastrophic accidents in industrial applications. Various nondestructive testing methods have been investigated for crack detection; however, most current methods focus on qualitative crack identification and image processing. In this study, eddy current pulsed thermography (ECPT) was applied for quantitative crack detection based on derivative analysis of temperature variation. The effects of the excitation parameters on the temperature variation were analyzed in a simulation study. The crack profile and position were identified in the thermal image using the Canny edge detection algorithm. One or more trajectories were then determined through the crack profile in order to locate the crack boundary from its temperature distribution, and the slope curve along each trajectory was obtained. Finally, quantitative analysis of the crack sizes was performed by analyzing the features of the slope curves. The experimental verification showed that the crack sizes could be quantitatively detected with errors of less than 1%. Therefore, the proposed ECPT method was demonstrated to be a feasible and effective nondestructive approach for quantitative crack detection.
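
    The two image-processing steps named above (Canny edge detection of the crack profile, then a slope curve along a trajectory through it) can be sketched as follows. This is a hedged illustration under assumed inputs: the thermal frame is a 2D temperature array, the row index and Canny thresholds are placeholders, and the crude width estimate is not the paper's sizing procedure.

```python
import numpy as np
import cv2

def crack_profile_and_slope(thermal_frame, row):
    """Locate the crack profile with Canny edge detection, then compute the
    slope (first derivative) of the temperature distribution along one
    horizontal trajectory. Thresholds and the sizing rule are placeholders."""
    # Canny operates on 8-bit images, so rescale the temperature map first.
    scaled = cv2.normalize(thermal_frame, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(scaled, 50, 150)            # binary crack-profile image

    # Temperature profile along the chosen trajectory and its slope curve.
    profile = thermal_frame[row, :]
    slope = np.gradient(profile)                  # dT/dx along the trajectory

    # Crude size estimate: distance between the extreme slope values,
    # which bracket the crack boundary in the temperature distribution.
    left, right = np.argmax(slope), np.argmin(slope)
    crack_width_px = abs(right - left)
    return edges, slope, crack_width_px
```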

  4. A qualitative and quantitative laser-based computer-aided flow visualization method. M.S. Thesis, 1992 Final Report

    NASA Technical Reports Server (NTRS)

    Canacci, Victor A.; Braun, M. Jack

    1994-01-01

    The experimental approach presented here offers a nonintrusive, qualitative and quantitative evaluation of full field flow patterns applicable to various geometries and a variety of fluids. This Full Flow Field Tracking (FFFT) Particle Image Velocimetry (PIV) technique, by means of particle tracers illuminated by a laser light sheet, offers an alternative to Laser Doppler Velocimetry (LDV) and to intrusive systems such as hot wire/film anemometry. The method captures the flow patterns and allows quantitative determination of the velocities, accelerations, and mass flows of an entire flow field. It uses a computer-based digitizing system attached through an imaging board to a low-luminosity camera. A customized optical train allows the system to function as a long distance microscope (LDM), permitting magnifications of areas of interest of up to 100 times. Presented in addition to the method itself are studies in which the flow patterns and velocities were observed and evaluated in three distinct geometries, with three different working fluids. The first study involved pressure and flow analysis of a brush seal in oil. The next application involved studying the velocity and flow patterns in a cowl lip cooling passage of an air-breathing aircraft engine using water as the working fluid. Finally, the method was extended to a study in air to examine the flows in a staggered pin arrangement located on one side of a branched duct.

  5. Quantitative analysis of fracture surface by roughness and fractal method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, X.W.; Tian, J.F.; Kang, Y.

    1995-09-01

    In recent years there has been extensive research and great development in quantitative fractography, which acts as an integral part of fractographic analysis. A prominent technique for studying the fracture surface is based on fracture profile generation, and the major means for characterizing the profile quantitatively are roughness and fractal methods. In this way, quantitative indexes such as the roughness parameters R_L for the profile and R_S for the surface, and the fractal dimensions D_L for the profile and D_S for the surface, can be measured. Given the relationships between these indexes and the mechanical properties of materials, it is possible to achieve the goal of protecting materials from fracture. But, as the case stands, the theory and experimental technology of quantitative fractography are still imperfect and remain to be studied further. Recently, Gokhale and Underwood et al. proposed an assumption-free method for estimating the surface roughness by vertically sectioning the fracture surface with sections at an angle of 120 deg to each other, which can be expressed as R_S = ⟨R_L·Ψ⟩ (averaged over the sections), where Ψ is the profile structure factor. This method is based on classical stereological principles and was verified with the aid of computer simulations for some ruled surfaces. The results are considered to be applicable to fracture surfaces of arbitrary complexity and anisotropy. In order to extend the detailed application of this method in quantitative fractography, the authors made a study of roughness and fractal methods based on it by performing quantitative measurements on some typical low-temperature impact fractures.
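
    A minimal sketch of the profile-roughness calculation underlying the relation above: R_L is the true length of a digitized fracture profile divided by its projected length, and R_S is estimated by averaging R_L·Ψ over the vertical sections. The profiles, section count and Ψ value below are synthetic placeholders, not measured data.

```python
import numpy as np

def profile_roughness(x, z):
    """R_L: true length of the fracture profile divided by its projected length."""
    true_length = np.sum(np.hypot(np.diff(x), np.diff(z)))
    projected_length = x[-1] - x[0]
    return true_length / projected_length

# Three vertical-section profiles at 120 degrees to each other (synthetic data).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 500)                                  # mm along each section
profiles = [np.cumsum(rng.normal(0, 0.02, x.size)) for _ in range(3)]

r_l_values = np.array([profile_roughness(x, z) for z in profiles])

psi = 1.27                             # placeholder profile structure factor
r_s = np.mean(r_l_values * psi)        # R_S = average of R_L*Psi over the sections
print(f"R_L per section: {r_l_values}, estimated R_S: {r_s:.3f}")
```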

  6. Extracting quantitative measures from EAP: a small clinical study using BFOR.

    PubMed

    Hosseinbor, A Pasha; Chung, Moo K; Wu, Yu-Chien; Fleming, John O; Field, Aaron S; Alexander, Andrew L

    2012-01-01

    The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents, and hence providing rich information about complex tissue microstructure properties. Bessel Fourier orientation reconstruction (BFOR) is one of several analytical, non-Cartesian EAP reconstruction schemes employing multiple shell acquisitions that have recently been proposed. Such modeling bases have not yet been fully exploited in the extraction of rotationally invariant q-space indices that describe the degree of diffusion anisotropy/restrictivity. Such quantitative measures include the zero-displacement probability (P(o)), mean squared displacement (MSD), q-space inverse variance (QIV), and generalized fractional anisotropy (GFA), and all are simply scalar features of the EAP. In this study, a general relationship between MSD and q-space diffusion signal is derived and an EAP-based definition of GFA is introduced. A significant part of the paper is dedicated to utilizing BFOR in a clinical dataset, comprised of 5 multiple sclerosis (MS) patients and 4 healthy controls, to estimate P(o), MSD, QIV, and GFA of corpus callosum, and specifically, to see if such indices can detect changes between normal appearing white matter (NAWM) and healthy white matter (WM). Although the sample size is small, this study is a proof of concept that can be extended to larger sample sizes in the future.

  7. Quantitative contrast-enhanced spectral mammography based on photon-counting detectors: A feasibility study.

    PubMed

    Ding, Huanjun; Molloi, Sabee

    2017-08-01

    , the correlation slope and offset values were strongly dependent on the total breast thickness and density. The results of this study suggest that iodine mass thickness for cm-scale lesions can be accurately quantified with contrast-enhanced spectral mammography. The quantitative information can potentially improve the differential power for malignancy. © 2017 American Association of Physicists in Medicine.

  8. Brain Injury Lesion Imaging Using Preconditioned Quantitative Susceptibility Mapping without Skull Stripping.

    PubMed

    Soman, S; Liu, Z; Kim, G; Nemec, U; Holdsworth, S J; Main, K; Lee, B; Kolakowsky-Hayner, S; Selim, M; Furst, A J; Massaband, P; Yesavage, J; Adamson, M M; Spincemallie, P; Moseley, M; Wang, Y

    2018-04-01

    Identifying cerebral microhemorrhage burden can aid in the diagnosis and management of traumatic brain injury, stroke, hypertension, and cerebral amyloid angiopathy. MR imaging susceptibility-based methods are more sensitive than CT for detecting cerebral microhemorrhage, but methods other than quantitative susceptibility mapping provide results that vary with field strength and TE, require additional phase maps to distinguish blood from calcification, and depict cerebral microhemorrhages as bloom artifacts. Quantitative susceptibility mapping provides universal quantification of tissue magnetic property without these constraints but traditionally requires a mask generated by skull-stripping, which can pose challenges at tissue interphases. We evaluated the preconditioned quantitative susceptibility mapping MR imaging method, which does not require skull-stripping, for improved depiction of brain parenchyma and pathology. Fifty-six subjects underwent brain MR imaging with a 3D multiecho gradient recalled echo acquisition. Mask-based quantitative susceptibility mapping images were created using a commonly used mask-based quantitative susceptibility mapping method, and preconditioned quantitative susceptibility images were made using precondition-based total field inversion. All images were reviewed by a neuroradiologist and a radiology resident. Ten subjects (18%), all with traumatic brain injury, demonstrated blood products on 3D gradient recalled echo imaging. All lesions were visible on preconditioned quantitative susceptibility mapping, while 6 were not visible on mask-based quantitative susceptibility mapping. Thirty-one subjects (55%) demonstrated brain parenchyma and/or lesions that were visible on preconditioned quantitative susceptibility mapping but not on mask-based quantitative susceptibility mapping. Six subjects (11%) demonstrated pons artifacts on preconditioned quantitative susceptibility mapping and mask-based quantitative susceptibility mapping

  9. Assessing the Impact of a Race-Based Course on Counseling Students: A Quantitative Study

    ERIC Educational Resources Information Center

    Paone, Tina R.; Malott, Krista M.; Barr, Jason J.

    2015-01-01

    This study sought to determine changes in 121 White counseling students following their participation in an experiential, race-based course taught in a group format. Pre- and postoutcomes were reported based on instruments that measured White racial identity development, White privilege, color blindness, and the costs of racism. Findings indicated…

  10. [Diagnostic value of quantitative cultures of endotracheal aspirate in ventilator-associated pneumonia: a multicenter study].

    PubMed

    Valencia Arango, M; Torres Martí, A; Insausti Ordeñana, J; Alvarez Lerma, F; Carrasco Joaquinet, N; Herranz Casado, M; Tirapu León, J P

    2003-09-01

    To study the validity of quantitative cultures of tracheal aspirate (TA) in comparison with the plugged telescoping catheter (PTC) for the diagnosis of mechanical ventilator-associated pneumonia. Prospective multicenter study enrolling patients undergoing mechanical ventilation for longer than 72 hours. TA samples were collected from patients with suspected ventilator-associated pneumonia, followed by PTC sampling. Quantitative cultures were performed on all samples. Patients were classified according to the presence or not of pneumonia, based on clinical and radiologic criteria, clinical course and autopsy findings. The cutoff point was ≥10^3 colony-forming units (cfu)/mL for PTC cultures; the TA cutoffs analyzed were ≥10^5 and ≥10^6 cfu/mL. Of the 120 patients studied, 84 had diagnoses of pneumonia and 36 did not (controls). The sensitivity values for TA ≥10^6, TA ≥10^5, and PTC, respectively, were 54% (95% confidence interval [CI], 42%-64%), 71% (95% CI, 60%-81%), and 68% (95% CI, 57%-78%). The specificity values were 75% (95% CI, 58%-88%), 58% (95% CI, 41%-74%), and 75% (95% CI, 58%-88%), respectively. Staphylococcus aureus was the microorganism most frequently isolated in both TA and PTC samples, followed in frequency by Pseudomonas aeruginosa in TA samples and Haemophilus influenzae in PTC samples. No significant difference was found between the sensitivity of TA ≥10^5 and that of PTC, nor between the specificities of TA ≥10^6 and PTC. Quantitative cultures of TA can therefore be considered acceptable for the diagnosis of ventilator-associated pneumonia.
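
    The sensitivity and specificity figures quoted above follow from simple 2x2 counts with normal-approximation confidence intervals. The sketch below reproduces that kind of calculation; the counts are illustrative placeholders, not the study's raw data.

```python
import math

def proportion_with_ci(successes, total, z=1.96):
    """Point estimate and normal-approximation 95% confidence interval."""
    p = successes / total
    half_width = z * math.sqrt(p * (1 - p) / total)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Illustrative 2x2 counts for one diagnostic cutoff (placeholders only).
tp, fn = 60, 24        # patients with pneumonia: test positive / negative
tn, fp = 27, 9         # controls: test negative / positive

sens, sens_lo, sens_hi = proportion_with_ci(tp, tp + fn)
spec, spec_lo, spec_hi = proportion_with_ci(tn, tn + fp)
print(f"sensitivity {sens:.0%} (95% CI {sens_lo:.0%}-{sens_hi:.0%})")
print(f"specificity {spec:.0%} (95% CI {spec_lo:.0%}-{spec_hi:.0%})")
```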

  11. Quantitative and qualitative approaches in the study of poverty and adolescent development: separation or integration?

    PubMed

    Leung, Janet T Y; Shek, Daniel T L

    2011-01-01

    This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.

  12. Targeted, Site-specific quantitation of N- and O-glycopeptides using 18O-labeling and product ion based mass spectrometry.

    PubMed

    Srikanth, Jandhyam; Agalyadevi, Rathinasamy; Babu, Ponnusamy

    2017-02-01

    The site-specific quantitation of N- and O-glycosylation is vital to understanding the function(s) of different glycans expressed at a given site of a protein under physiological and disease conditions. The most commonly used precursor ion intensity-based quantification method is less accurate, and other labeled methods are expensive and require enrichment of glycopeptides. Here, we used glycopeptide product (y and Y0) ions and 18O-labeling of the C-terminal carboxyl group as a strategy to obtain quantitative information about the fold-change and relative abundance of most of the glycoforms attached to the glycopeptides. As a proof of concept, the accuracy and robustness of this targeted, relative quantification LC-MS method were demonstrated using Rituximab. Furthermore, the N-glycopeptide quantification results were compared with a biosimilar of Rituximab and validated with quantitative data obtained from the 2-AB-UHPLC-FL method. We further demonstrated the intensity fold-change and relative abundance of 46 unique N- and O-glycopeptides and aglycopeptides from innovator and biosimilar samples of Etanercept using both normal-MS and product ion based quantitation. The results showed a very similar site-specific expression of N- and O-glycopeptides between the samples, but with subtle differences. Interestingly, we have also been able to quantify the macro-heterogeneity of all N- and O-glycopeptides of Etanercept. In addition to applications in biotherapeutics, the developed method can also be used for site-specific quantitation of N- and O-glycopeptides and aglycopeptides of glycoproteins with known glycosylation patterns.

  13. Quantitative Correlational Study: Emotional Intelligence and Project Outcomes among Hispanics in Technology

    ERIC Educational Resources Information Center

    Trejo, Arturo

    2013-01-01

    The present quantitative correlational research study explored relationships between Emotional Intelligence (EI) competencies, such as self-awareness, self-management, social awareness, and relationship management, and project management outcomes: scope creep, in-budget project cost, and project timeliness. The study was conducted within the…

  14. Connecting qualitative observation and quantitative measurement for enhancing quantitative literacy in plant anatomy course

    NASA Astrophysics Data System (ADS)

    Nuraeni, E.; Rahmat, A.

    2018-05-01

    Formation of cognitive schemes for plant anatomy concepts involves processing qualitative and quantitative data obtained from microscopic observations. To enhance students' quantitative literacy, the strategy of the plant anatomy course was modified by adding a task in which students analyze quantitative data produced by quantitative measurement of plant anatomy, guided by the course material. Participants in this study were 24 biology students and 35 biology education students. A quantitative literacy test, a test of complex thinking in plant anatomy and a questionnaire were used to evaluate the course. Quantitative literacy data were collected with a quantitative literacy test using the rubric from the Association of American Colleges and Universities, complex thinking in plant anatomy was tested according to Marzano, and a questionnaire was administered. Quantitative literacy data were categorized according to modified Rhodes and Finley categories. The results showed that the quantitative literacy of biology education students was better than that of biology students.

  15. Using the realist perspective to link theory from qualitative evidence synthesis to quantitative studies: Broadening the matrix approach.

    PubMed

    van Grootel, Leonie; van Wesel, Floryt; O'Mara-Eves, Alison; Thomas, James; Hox, Joop; Boeije, Hennie

    2017-09-01

    This study describes an approach for the use of a specific type of qualitative evidence synthesis in the matrix approach, a mixed studies reviewing method. The matrix approach compares quantitative and qualitative data on the review level by juxtaposing concrete recommendations from the qualitative evidence synthesis against interventions in primary quantitative studies. However, types of qualitative evidence syntheses that are associated with theory building generate theoretical models instead of recommendations. Therefore, the output from these types of qualitative evidence syntheses cannot directly be used for the matrix approach but requires transformation. This approach allows for the transformation of these types of output. The approach enables the inference of moderation effects instead of direct effects from the theoretical model developed in a qualitative evidence synthesis. Recommendations for practice are formulated on the basis of interactional relations inferred from the qualitative evidence synthesis. In doing so, we apply the realist perspective to model variables from the qualitative evidence synthesis according to the context-mechanism-outcome configuration. A worked example shows that it is possible to identify recommendations from a theory-building qualitative evidence synthesis using the realist perspective. We created subsets of the interventions from primary quantitative studies based on whether they matched the recommendations or not and compared the weighted mean effect sizes of the subsets. The comparison shows a slight difference in effect sizes between the groups of studies. The study concludes that the approach enhances the applicability of the matrix approach. Copyright © 2017 John Wiley & Sons, Ltd.
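
    The final comparison step described above, splitting primary studies by whether they match the recommendations and comparing weighted mean effect sizes, can be sketched as below. This is a hedged illustration with hypothetical effect sizes and variances, using a standard fixed-effect (inverse-variance) weighting rather than the authors' specific procedure.

```python
import numpy as np

def weighted_mean_effect(effects, variances):
    """Fixed-effect (inverse-variance weighted) mean effect size with its SE."""
    w = 1.0 / np.asarray(variances)
    mean = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return mean, se

# Hypothetical per-study effect sizes (e.g., standardized mean differences) and
# variances, split by whether the intervention matched the recommendations.
matched_d,   matched_var   = [0.42, 0.35, 0.51, 0.28], [0.02, 0.03, 0.04, 0.02]
unmatched_d, unmatched_var = [0.30, 0.22, 0.37],       [0.03, 0.05, 0.02]

m_mean, m_se = weighted_mean_effect(matched_d, matched_var)
u_mean, u_se = weighted_mean_effect(unmatched_d, unmatched_var)
print(f"matched subset:   {m_mean:.2f} ± {m_se:.2f}")
print(f"unmatched subset: {u_mean:.2f} ± {u_se:.2f}")
```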

  16. Assessment of treatment response during chemoradiation therapy for pancreatic cancer based on quantitative radiomic analysis of daily CTs: An exploratory study.

    PubMed

    Chen, Xiaojian; Oshima, Kiyoko; Schott, Diane; Wu, Hui; Hall, William; Song, Yingqiu; Tao, Yalan; Li, Dingjie; Zheng, Cheng; Knechtges, Paul; Erickson, Beth; Li, X Allen

    2017-01-01

    In an effort toward early assessment of treatment response, we investigated radiation-induced changes in quantitative CT features of the tumor during the delivery of chemoradiation therapy (CRT) for pancreatic cancer. Diagnostic-quality CT data acquired daily during routine CT-guided CRT using a CT-on-rails system for 20 pancreatic head cancer patients were analyzed. On each daily CT, the pancreatic head, the spinal cord and the aorta were delineated and the histograms of CT number (CTN) in these contours were extracted. Eight histogram-based radiomic metrics, including the mean CTN (MCTN), peak position, volume, standard deviation (SD), skewness, kurtosis, energy and entropy, were calculated for each fraction. A paired t-test was used to check the significance of the change of a specific metric at a specific time. A GEE model was used to test the association between changes of metrics over time for different pathology responses. In general, the CTN histogram in the pancreatic head (but not in the spinal cord) changed during CRT delivery. Changes from the 1st to the 26th fraction in MCTN ranged from -15.8 to 3.9 HU with an average of -4.7 HU (p<0.001). Meanwhile, the volume decreased, the skewness increased (less skewed), and the kurtosis decreased (less peaked). The changes of MCTN, volume, skewness, and kurtosis became significant after two weeks of treatment. Patient pathological response was associated with the changes of MCTN, SD, and skewness. In cases of good response, patients tended to have large reductions in MCTN and skewness, and large increases in SD and kurtosis. Significant changes in CT radiomic features, such as the MCTN, skewness, and kurtosis of the tumor, were observed during the course of CRT for pancreatic cancer based on quantitative analysis of daily CTs. These changes may potentially be used for early assessment of treatment response and stratification for therapeutic intensification.
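
    For reference, the eight histogram-based first-order metrics named above can all be computed from the CT numbers inside one contour. The sketch below shows one plausible set of definitions (the exact formulas, bin count and voxel volume are assumptions, and the input is synthetic), not the authors' implementation.

```python
import numpy as np
from scipy import stats

def histogram_metrics(ctn_values, voxel_volume_mm3=1.0, bins=64):
    """First-order histogram metrics of the CT numbers inside one contour.
    `ctn_values` is a 1D array of Hounsfield units; bin count and voxel
    volume are placeholders."""
    hist, edges = np.histogram(ctn_values, bins=bins)
    prob = hist / hist.sum()                               # normalized histogram
    peak_position = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
    nonzero = prob[prob > 0]
    return {
        "mean_ctn":      float(np.mean(ctn_values)),
        "peak_position": float(peak_position),
        "volume_mm3":    ctn_values.size * voxel_volume_mm3,
        "sd":            float(np.std(ctn_values, ddof=1)),
        "skewness":      float(stats.skew(ctn_values)),
        "kurtosis":      float(stats.kurtosis(ctn_values)),   # excess kurtosis
        "energy":        float(np.sum(prob ** 2)),
        "entropy":       float(-np.sum(nonzero * np.log2(nonzero))),
    }

# Example with synthetic CT numbers for a pancreatic-head contour (HU, hypothetical).
values = np.random.default_rng(1).normal(40, 12, 5000)
print(histogram_metrics(values))
```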

  17. Single Fluorescence Channel-based Multiplex Detection of Avian Influenza Virus by Quantitative PCR with Intercalating Dye

    PubMed Central

    Ahberg, Christian D.; Manz, Andreas; Neuzil, Pavel

    2015-01-01

    Since its invention in 1985, the polymerase chain reaction (PCR) has become a well-established method for amplification and detection of segments of double-stranded DNA. Incorporation of a fluorogenic probe or DNA intercalating dyes (such as SYBR Green) into the PCR mixture allows real-time reaction monitoring and extraction of quantitative information (qPCR). Probes with different excitation spectra enable multiplex qPCR of several DNA segments using multi-channel optical detection systems. Here we show multiplex qPCR using an economical EvaGreen-based system with single optical channel detection. Previously reported non-quantitative multiplex real-time PCR techniques based on intercalating dyes analyze products only once the PCR is completed, by performing melting curve analysis (MCA). The technique presented in this paper is both qualitative and quantitative, as it provides information about the presence of multiple DNA strands as well as the number of starting copies in the tested sample. Besides providing an important internal control, multiplex qPCR also allows detection of the concentrations of more than one DNA strand within the same sample. Detection of the avian influenza virus H7N9 by PCR is a well-established method. Multiplex qPCR greatly enhances its specificity, as it is capable of distinguishing both the haemagglutinin (HA) and neuraminidase (NA) genes as well as their ratio. PMID:26088868

  18. Supply chain risk management of newspaper industry: A quantitative study

    NASA Astrophysics Data System (ADS)

    Sartika, Viny; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    The newspaper industry has several distinctive features that make it stand out from other industries. The strict delivery deadline and zero inventory lead to a very short time frame for production and distribution. On one hand, there is pressure from the newsroom to start production as late as possible in order to enter the latest news, while on the other hand there is pressure from production and distribution to start production as early as possible. Supply chain risk management is needed to determine the best strategy for dealing with possible risks in the newspaper industry. In a case study of a newspaper in Surakarta, a quantitative approach to newspaper supply chain risk management is taken by calculating the expected cost of each risk based on the magnitude of its impact and the probability of the risk event. The calculation results show that the five risks with the highest values are newspaper delays to the end customer, broken plate, misprint, machine downtime, and delayed delivery of newspaper content. Appropriate mitigation strategies to cope with these risk events are then analyzed.
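
    The ranking step above amounts to expected cost of risk = probability of the risk event × magnitude of its impact. The sketch below shows that arithmetic with the five risk names mentioned in the abstract; the probabilities and impact costs are invented for illustration only.

```python
# Expected cost of risk = probability of the risk event x magnitude of its impact.
# The probabilities and impact values below are invented placeholders.
risks = {
    "newspaper delay to end customer": (0.30, 50_000_000),   # (probability, impact)
    "broken plate":                    (0.10, 80_000_000),
    "misprint":                        (0.15, 40_000_000),
    "machine downtime":                (0.05, 120_000_000),
    "delayed delivery of content":     (0.25, 20_000_000),
}

expected_cost = {name: p * impact for name, (p, impact) in risks.items()}

# Rank risks from highest to lowest expected cost to prioritize mitigation.
for name, cost in sorted(expected_cost.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:35s} expected cost ≈ {cost:,.0f}")
```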

  19. A GIS-based Quantitative Approach for the Search of Clandestine Graves, Italy.

    PubMed

    Somma, Roberta; Cascio, Maria; Silvestro, Massimiliano; Torre, Eliana

    2018-05-01

    Previous research on the RAG color-coded prioritization systems for the discovery of clandestine graves has not considered all the factors influencing the burial site choice within a GIS project. The goal of this technical note was to discuss a GIS-based quantitative approach for the search of clandestine graves. The method is based on cross-referenced RAG maps with cumulative suitability factors to host a burial, leading to the editing of different search scenarios for ground searches showing high-(Red), medium-(Amber), and low-(Green) priority areas. The application of this procedure allowed several outcomes to be determined: If the concealment occurs at night, then the "search scenario without the visibility" will be the most effective one; if the concealment occurs in daylight, then the "search scenario with the DSM-based visibility" will be most appropriate; the different search scenarios may be cross-referenced with offender's confessions and eyewitnesses' testimonies to verify the veracity of their statements. © 2017 American Academy of Forensic Sciences.
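
    A hedged sketch of the cross-referencing idea described above: several suitability-factor rasters on a common grid are summed into a cumulative score and classified into Red/Amber/Green priority cells. The factor names, weights and quantile thresholds are assumptions for illustration, not the note's actual GIS workflow.

```python
import numpy as np

# Hypothetical suitability-factor rasters on a common grid (values in [0, 1],
# higher = more favourable for a burial): soil, vegetation cover, access, visibility.
rng = np.random.default_rng(2)
soil, vegetation, access, visibility = (rng.random((100, 100)) for _ in range(4))

# Cumulative suitability as an equally weighted sum of the factors.
# For a "scenario without visibility" simply drop that layer from the sum.
cumulative = soil + vegetation + access + (1.0 - visibility)

# RAG classification: top scores = Red (high priority), middle = Amber, rest = Green.
red_threshold, amber_threshold = np.quantile(cumulative, [0.9, 0.6])
rag = np.where(cumulative >= red_threshold, "R",
      np.where(cumulative >= amber_threshold, "A", "G"))

unique, counts = np.unique(rag, return_counts=True)
print(dict(zip(unique, counts)))    # number of Red/Amber/Green grid cells
```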

  20. Multi-scale modeling of microstructure dependent intergranular brittle fracture using a quantitative phase-field based method

    DOE PAGES

    Chakraborty, Pritam; Zhang, Yongfeng; Tonks, Michael R.

    2015-12-07

    The fracture behavior of brittle materials is strongly influenced by their underlying microstructure, which needs explicit consideration for accurate prediction of fracture properties and the associated scatter. In this work, a hierarchical multi-scale approach is pursued to model microstructure-sensitive brittle fracture. A quantitative phase-field based fracture model is utilized to capture the complex crack growth behavior in the microstructure, and the related parameters are calibrated from lower length scale atomistic simulations instead of engineering scale experimental data. The workability of this approach is demonstrated by performing porosity-dependent intergranular fracture simulations in UO2 and comparing the predictions with experiments.

  2. Identification of common coexpression modules based on quantitative network comparison.

    PubMed

    Jo, Yousang; Kim, Sanghyeon; Lee, Doheon

    2018-06-13

    Finding common molecular interactions across different samples is essential to understanding diseases and other biological processes. Coexpression networks and their modules directly reflect sample-specific interactions among genes. Therefore, identification of common coexpression networks or modules may reveal the molecular mechanism of a complex disease or the relationship between biological processes. However, there has been no quantitative comparison method for coexpression networks, and the methods we examined for other network types cannot be applied to coexpression networks. We therefore aimed to propose quantitative comparison methods for coexpression networks and to find common biological mechanisms between Huntington's disease and brain aging using the new method. We proposed two similarity measures for quantitative comparison of coexpression networks and performed experiments using known coexpression networks. We showed the validity of the two measures and, from these experiments, evaluated threshold values for identifying similar coexpression network pairs. Using these similarity measures and thresholds, we quantitatively measured the similarity between disease-specific and aging-related coexpression modules and found similar Huntington's disease-aging coexpression module pairs. These modules are related to brain development, cell death, and immune response, suggesting that up-regulated cell signalling related to cell death and the immune/inflammation response may be a common molecular mechanism in the pathophysiology of HD and normal brain aging in the frontal cortex.

  3. Validation of a new UNIX-based quantitative coronary angiographic system for the measurement of coronary artery lesions.

    PubMed

    Bell, M R; Britson, P J; Chu, A; Holmes, D R; Bresnahan, J F; Schwartz, R S

    1997-01-01

    We describe a method of validation of computerized quantitative coronary arteriography and report the results of a new UNIX-based quantitative coronary arteriography software program developed for rapid on-line (digital) and off-line (digital or cinefilm) analysis. The UNIX operating system is widely available in computer systems using very fast processors and has excellent graphics capabilities. The system is potentially compatible with any cardiac digital x-ray system for on-line analysis and has been designed to incorporate an integrated database, have on-line and immediate recall capabilities, and provide digital access to all data. The accuracy (mean signed differences of the observed minus the true dimensions) and precision (pooled standard deviations of the measurements) of the program were determined using x-ray vessel phantoms. Intra- and interobserver variabilities were assessed from in vivo studies during routine clinical coronary arteriography. Precision from the x-ray phantom studies (6-in. field of view) was 0.066 mm for digital images and 0.060 mm for digitized cine images. Accuracy was 0.076 mm (overestimation) for digital images compared to 0.008 mm for digitized cine images. Diagnostic coronary catheters were also used for calibration; accuracy varied according to the size of the catheter and whether or not it was filled with iodinated contrast. Intra- and interobserver variabilities were excellent and indicated that coronary lesion measurements were relatively user-independent. Thus, this easy-to-use and very fast UNIX-based program appears to be robust, with optimal accuracy and precision for clinical and research applications.
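
    The accuracy and precision definitions quoted above (mean signed difference and pooled standard deviation) reduce to simple arithmetic over repeated phantom measurements. The sketch below illustrates that calculation with invented measurements; it is not the validation software itself.

```python
import numpy as np

def accuracy_and_precision(observed_by_phantom, true_diameters):
    """Accuracy = mean signed difference (observed - true) pooled over phantoms;
    precision = pooled standard deviation of the repeated measurements."""
    signed_diffs, variances, dofs = [], [], []
    for obs, true in zip(observed_by_phantom, true_diameters):
        obs = np.asarray(obs, dtype=float)
        signed_diffs.append(obs - true)
        variances.append(np.var(obs, ddof=1))
        dofs.append(obs.size - 1)
    accuracy = np.mean(np.concatenate(signed_diffs))
    pooled_sd = np.sqrt(np.sum(np.array(variances) * dofs) / np.sum(dofs))
    return accuracy, pooled_sd

# Repeated measurements (mm) of three phantom vessel segments; values are invented.
observed = [[1.02, 1.08, 1.05], [2.01, 1.96, 2.04], [3.10, 3.05, 3.02]]
true = [1.00, 2.00, 3.00]
acc, prec = accuracy_and_precision(observed, true)
print(f"accuracy {acc:+.3f} mm, precision {prec:.3f} mm")
```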

  4. Quantitative Susceptibility Mapping Indicates a Disturbed Brain Iron Homeostasis in Neuromyelitis Optica - A Pilot Study.

    PubMed

    Doring, Thomas Martin; Granado, Vanessa; Rueda, Fernanda; Deistung, Andreas; Reichenbach, Juergen R; Tukamoto, Gustavo; Gasparetto, Emerson Leandro; Schweser, Ferdinand

    2016-01-01

    Dysregulation of brain iron homeostasis is a hallmark of many neurodegenerative diseases and can be associated with oxidative stress. The objective of this study was to investigate brain iron in patients with Neuromyelitis Optica (NMO) using quantitative susceptibility mapping (QSM), a quantitative iron-sensitive MRI technique. 12 clinically confirmed NMO patients (6 female and 6 male; age 35.4y±14.2y) and 12 age- and sex-matched healthy controls (7 female and 5 male; age 33.9±11.3y) underwent MRI of the brain at 3 Tesla. Quantitative maps of the effective transverse relaxation rate (R2*) and magnetic susceptibility were calculated and a blinded ROI-based group comparison analysis was performed. Normality of the data and differences between patients and controls were tested by Kolmogorov-Smirnov and t-test, respectively. Correlation with age was studied using Spearman's rank correlation and an ANCOVA-like analysis. Magnetic susceptibility values were decreased in the red nucleus (p<0.01; d>0.95; between -15 and -22 ppb depending on reference region) with a trend toward increasing differences with age. R2* revealed significantly decreased relaxation in the optic radiations of five of the 12 patients (p<0.0001; -3.136±0.567 s^-1). Decreased relaxation in the optic radiation is indicative of demyelination, which is in line with previous findings. Decreased magnetic susceptibility in the red nucleus is indicative of a lower brain iron concentration, a chemical redistribution of iron into less magnetic forms, or both. Further investigations are necessary to elucidate the pathological cause or consequence of this finding.

  5. Genome-wide Association Study of a Quantitative Disordered Gambling Trait

    PubMed Central

    Lind, Penelope A.; Zhu, Gu; Montgomery, Grant W; Madden, Pamela A.F.; Heath, Andrew C.; Martin, Nicholas G.; Slutske, Wendy S.

    2012-01-01

    Disordered gambling is a moderately heritable trait, but the underlying genetic basis is largely unknown. We performed a genome-wide association study (GWAS) for disordered gambling using a quantitative factor score in 1,312 twins from 894 Australian families. Association was conducted for 2,381,914 single nucleotide polymorphisms (SNPs) using the family-based association test in Merlin followed by gene and pathway enrichment analyses. Although no SNP reached genome-wide significance, six achieved P-values < 1 × 10^-5 with variants in three genes (MT1X, ATXN1 and VLDLR) implicated in disordered gambling. Secondary case-control analyses found two SNPs on chromosome 9 (rs1106076 and rs12305135 near VLDLR) and rs10812227 near FZD10 on chromosome 12 to be significantly associated with lifetime DSM-IV pathological gambling and SOGS classified probable pathological gambling status. Furthermore, several addiction-related pathways were enriched for SNPs associated with disordered gambling. Finally, gene-based analysis of 24 candidate genes for dopamine agonist induced gambling in individuals with Parkinson’s disease suggested an enrichment of SNPs associated with disordered gambling. We report the first GWAS of disordered gambling. While further replication is required, the identification of susceptibility loci and biological pathways will be important in characterizing the biological mechanisms that underpin disordered gambling. PMID:22780124

  6. Towards quantitative assessment of calciphylaxis

    NASA Astrophysics Data System (ADS)

    Deserno, Thomas M.; Sárándi, István.; Jose, Abin; Haak, Daniel; Jonas, Stephan; Specht, Paula; Brandenburg, Vincent

    2014-03-01

    Calciphylaxis is a rare disease with devastating complications and high morbidity and mortality. It is characterized by systemic medial calcification of the arteries yielding necrotic skin ulcerations. In this paper, we aim to support the installation of multi-center registries for calciphylaxis, which include a photographic documentation of skin necrosis. However, photographs acquired in different centers under different conditions, using different equipment and photographers, cannot be compared quantitatively. For normalization, we use a simple color pad that is placed into the field of view, segmented from the image, and whose color fields are analyzed. In total, 24 colors are printed on that scale. A least-squares approach is used to determine the affine color transform. Furthermore, the card allows scale normalization. We provide a case study for qualitative assessment. In addition, the method is evaluated quantitatively using 10 images of two sets of different captures of the same necrosis. The variability of quantitative measurements based on free-hand photography is assessed with regard to geometric and color distortions before and after our simple calibration procedure. Using automated image processing, the standard deviation of the measurements is significantly reduced. The coefficients of variation are 5-20% for geometry and 2-10% for color. Hence, quantitative assessment of calciphylaxis becomes practicable and will contribute to a better understanding of this rare but fatal disease.
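
    The least-squares affine color transform mentioned above can be sketched as a small linear-algebra problem: the measured RGB values of the 24 color fields are mapped onto their reference values by solving for a 4x3 matrix. The example below uses synthetic patch colors and distortion; it is an illustration of the idea, not the registry software.

```python
import numpy as np

def fit_affine_color_transform(measured_rgb, reference_rgb):
    """Solve for the affine transform T such that reference ≈ [measured | 1] @ T,
    in the least-squares sense (T is a 4x3 matrix)."""
    ones = np.ones((measured_rgb.shape[0], 1))
    design = np.hstack([measured_rgb, ones])            # N x 4 design matrix
    transform, *_ = np.linalg.lstsq(design, reference_rgb, rcond=None)
    return transform

def apply_transform(rgb, transform):
    ones = np.ones((rgb.shape[0], 1))
    return np.hstack([rgb, ones]) @ transform

# Synthetic example: 24 reference patch colors and "photographed" versions
# distorted by an unknown affine change (cast + gain) plus a little noise.
rng = np.random.default_rng(3)
reference = rng.uniform(0, 255, size=(24, 3))
true_gain, true_offset = np.diag([0.9, 1.1, 0.8]), np.array([10.0, -5.0, 20.0])
measured = reference @ true_gain + true_offset + rng.normal(0, 1.0, (24, 3))

T = fit_affine_color_transform(measured, reference)
corrected = apply_transform(measured, T)
print("mean residual per channel:", np.abs(corrected - reference).mean(axis=0))
```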

  7. Experiencing teaching and learning quantitative reasoning in a project-based context

    NASA Astrophysics Data System (ADS)

    Muir, Tracey; Beswick, Kim; Callingham, Rosemary; Jade, Katara

    2016-12-01

    This paper presents the findings of a small-scale study that investigated the issues and challenges of teaching and learning about quantitative reasoning (QR) within a project-based learning (PjBL) context. Students and teachers were surveyed and interviewed about their experiences of learning and teaching QR in that context in contrast to teaching and learning mathematics in more traditional settings. The grade 9-12 student participants were characterised by a history of disengagement with mathematics and school in general, and the teacher participants were non-mathematics specialist teachers. Both students and teachers were new to the PjBL situation, which resulted in the teaching/learning relationship being a reciprocal one. The findings indicated that students and teachers viewed QR positively, particularly when compared with traditional mathematics teaching, yet tensions were identified for aspects such as implementation of curriculum and integration of relevant mathematics into projects. Both sets of participants identified situations where learning QR was particularly successful, along with concerns or difficulties about integrating QR into project work. The findings have implications for educators, who may need to examine their own approaches to mathematics teaching, particularly in terms of facilitating student engagement with the subject.

  8. Motor Events during Healthy Sleep: A Quantitative Polysomnographic Study

    PubMed Central

    Frauscher, Birgit; Gabelia, David; Mitterling, Thomas; Biermayr, Marlene; Bregler, Deborah; Ehrmann, Laura; Ulmer, Hanno; Högl, Birgit

    2014-01-01

    Study Objectives: Many sleep disorders are characterized by increased motor activity during sleep. In contrast, studies on motor activity during physiological sleep are largely lacking. We quantitatively investigated a large range of motor phenomena during polysomnography in physiological sleep. Design: Prospective polysomnographic investigation. Setting: Academic referral sleep laboratory. Participants: One hundred healthy sleepers age 19-77 y were strictly selected from a representative population sample by a two-step screening procedure. Interventions: N/A. Measurements and Results: Polysomnography according to American Academy of Sleep Medicine (AASM) standards was performed, and quantitative normative values were established for periodic limb movements in sleep (PLMS), high frequency leg movements (HFLM), fragmentary myoclonus (FM), neck myoclonus (NM), and rapid eye movement (REM)-related electromyographic (EMG) activity. Thirty-six subjects had a PLMS index > 5/h, 18 had a PLMS index > 15/h (90th percentile: 24.8/h). Thirty-three subjects had HFLM (90th percentile: four sequences/night). All subjects had FM (90th percentile 143.7/h sleep). Nine subjects fulfilled AASM criteria for excessive FM. Thirty-five subjects had NM (90th percentile: 8.8/h REM sleep). For REM sleep, different EMG activity measures for the mentalis and flexor digitorum superficialis muscles were calculated: the 90th percentile for phasic mentalis EMG activity for 30-sec epochs according to AASM recommendation was 15.6%, and for tonic mentalis EMG activity 2.6%. Twenty-five subjects exceeded the recently proposed phasic mentalis cutoff of 11%. None of the subjects exceeded the tonic mentalis cutoff of 9.6%. Conclusion: Quantification of motor phenomena is a basic prerequisite to develop normative values, and is a first step toward a more precise description of the various motor phenomena present during sleep. Because rates of motor events were unexpectedly high even in physiological

  9. Comparative quantitative study of astrocytes and capillary distribution in optic nerve laminar regions.

    PubMed

    Balaratnasingam, Chandrakumar; Kang, Min H; Yu, Paula; Chan, Geoffrey; Morgan, William H; Cringle, Stephen J; Yu, Dao-Yi

    2014-04-01

    Retinal ganglion cell (RGC) axonal structure and function in the optic nerve head (ONH) is predominantly supported by astrocytes and capillaries. There is good experimental evidence to demonstrate that RGC axons are perturbed in a non-uniform manner following ONH injury, and it is likely that the pattern of RGC axonal modification bears some correlation with the quantitative properties of astrocytes and capillaries within laminar compartments. Although there have been some excellent topographic studies concerning glial and microvascular networks in the ONH, our knowledge regarding the quantitative properties of these structures is limited. This report is an in-depth quantitative, structural analysis of astrocytes and capillaries in the prelaminar, lamina cribrosa and postlaminar compartments of the ONH. 49 optic nerves from human (n = 10), pig (n = 12), horse (n = 6), rat (n = 11) and rabbit (n = 10) eyes were studied. Immunohistochemical and high-magnification confocal microscopy techniques were used to co-localise astrocytes, capillaries and nuclei in the mid-portion of the optic nerve. Quantitative methodology was used to determine the area occupied by astrocyte processes, the area occupied by microglia processes, nuclear density and the area occupied by capillaries in each laminar compartment. Comparisons were made within and between species. Relationships between ONH histomorphometry and astrocyte-capillary constitution were also explored. This study demonstrates that there are significant differences in the quantitative properties of capillaries and astrocytes between the laminar compartments of the human ONH. Astrocyte processes occupied the greatest area in the lamina cribrosa compartment of the human ONH, implicating it as an area of great metabolic demand. Microglia were found to occupy only a small proportion of tissue in the rat, rabbit and pig optic nerve, suggesting that the astrocyte is the predominant glial cell type in the optic nerve. This study also demonstrates

  10. Contribution of insula in Parkinson's disease: A quantitative meta-analysis study.

    PubMed

    Criaud, Marion; Christopher, Leigh; Boulinguez, Philippe; Ballanger, Benedicte; Lang, Anthony E; Cho, Sang S; Houle, Sylvain; Strafella, Antonio P

    2016-04-01

    The insula region is known to be an integrating hub interacting with multiple brain networks involved in cognitive, affective, sensory, and autonomic processes. There is growing evidence suggesting that this region may have an important role in Parkinson's disease (PD). Thus, to investigate the functional organization of the insular cortex and its potential role in parkinsonian features, we used a coordinate-based quantitative meta-analysis approach, the activation likelihood estimation. A total of 132 insular foci were selected from 96 published experiments comprising the five functional categories: cognition, affective/behavioral symptoms, bodily awareness/autonomic function, sensorimotor function, and nonspecific resting functional changes associated with the disease. We found a significant convergence of activation maxima related to PD in different insular regions including anterior and posterior regions bilaterally. This study provides evidence of an important functional distribution of different domains within the insular cortex in PD, particularly in relation to nonmotor aspects, with an influence of medication effect. © 2016 Wiley Periodicals, Inc.

  11. Comparative measurement and quantitative risk assessment of alcohol consumption through wastewater-based epidemiology: An international study in 20 cities.

    PubMed

    Ryu, Yeonsuk; Barceló, Damià; Barron, Leon P; Bijlsma, Lubertus; Castiglioni, Sara; de Voogt, Pim; Emke, Erik; Hernández, Félix; Lai, Foon Yin; Lopes, Alvaro; de Alda, Miren López; Mastroianni, Nicola; Munro, Kelly; O'Brien, Jake; Ort, Christoph; Plósz, Benedek G; Reid, Malcolm J; Yargeau, Viviane; Thomas, Kevin V

    2016-09-15

    Quantitative measurement of drug consumption biomarkers in wastewater can provide objective information on community drug use patterns and trends. This study presents the measurement of alcohol consumption in 20 cities across 11 countries through the use of wastewater-based epidemiology (WBE), and reports the application of these data for the risk assessment of alcohol on a population scale using the margin of exposure (MOE) approach. Raw 24-h composite wastewater samples were collected over a one-week period from 20 cities following a common protocol. For each sample a specific and stable alcohol consumption biomarker, ethyl sulfate (EtS), was determined by liquid chromatography coupled to tandem mass spectrometry. The EtS concentrations were used for estimation of per capita alcohol consumption in each city, which was further compared with international reports and applied for risk assessment by MOE. The average per capita consumption in the 20 cities ranged between 6.4 and 44.3 L/day/1000 inhabitants. An increase in alcohol consumption during the weekend occurred in all cities; however, the level of this increase was found to differ. In contrast to conventional data (sales statistics and interviews), WBE revealed geographical differences in the level and pattern of actual alcohol consumption at an inter-city level. All the sampled cities were in the "high risk" category (MOE<10) and the average MOE for the whole population studied was 2.5. These results allowed direct comparisons of alcohol consumption levels, patterns and risks among the cities. This study shows that WBE can provide timely and complementary information on alcohol use and alcohol associated risks in terms of exposure at the community level. Copyright © 2016 Elsevier B.V. All rights reserved.
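
    The back-calculation chain described above (EtS concentration → daily load → ethanol consumed → per capita consumption and MOE) can be sketched as below. Every numeric parameter here is a placeholder assumption for illustration (the excretion fraction, BMDL, body weight, flow and population are not the study's values), so only the structure of the arithmetic should be taken from this sketch.

```python
# Wastewater-based epidemiology back-calculation sketch (all numbers assumed).
ets_concentration_ug_per_l = 10.0       # measured ethyl sulfate in raw wastewater
daily_flow_m3 = 150_000                 # 24-h composite flow of the treatment plant
population = 500_000                    # inhabitants served by the catchment

# Daily EtS load: ug/L multiplied by m3 of flow gives mg.
ets_load_mg_per_day = ets_concentration_ug_per_l * daily_flow_m3

# Convert the EtS load back to ethanol consumed: divide by the (assumed) fraction
# of ethanol excreted as EtS, rescale by the molar-mass ratio ethanol/EtS.
excreted_fraction = 0.00012             # assumed; varies between studies
ethanol_mg_per_day = ets_load_mg_per_day / excreted_fraction * (46.07 / 126.13)

# Express as litres of pure alcohol per 1000 inhabitants per day (density ~0.789 g/mL).
litres_per_day_per_1000 = ethanol_mg_per_day / 1000 / 0.789 / 1000 / (population / 1000)

# Margin of exposure: toxicological reference dose divided by the estimated exposure.
bmdl_mg_per_kg_day = 531.0              # assumed reference value, per kg body weight
body_weight_kg = 70.0
exposure_mg_per_kg_day = ethanol_mg_per_day / population / body_weight_kg
moe = bmdl_mg_per_kg_day / exposure_mg_per_kg_day

print(f"≈ {litres_per_day_per_1000:.1f} L pure alcohol/day/1000 inhabitants, MOE ≈ {moe:.1f}")
```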

  12. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

    Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of the genome shared among individuals in natural populations, which promises (more) accurate estimates of quantitative genetic parameters in virtually any species. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers a greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. An analytical approach based on ESI-MS, LC-MS and PCA for the quali-quantitative analysis of cycloartane derivatives in Astragalus spp.

    PubMed

    Napolitano, Assunta; Akay, Seref; Mari, Angela; Bedir, Erdal; Pizza, Cosimo; Piacente, Sonia

    2013-11-01

    Astragalus species are widely used as health foods and dietary supplements, as well as drugs in traditional medicine. To rapidly evaluate metabolite similarities and differences among the EtOH extracts of the roots of eight commercial Astragalus spp., an approach based on direct analyses by ESI-MS followed by PCA of the ESI-MS data was carried out. Subsequently, quali-quantitative analyses of cycloartane derivatives in the eight Astragalus spp. by LC-ESI-MS(n) and PCA of the LC-ESI-MS data were performed. This approach allowed metabolite similarities and differences among the various Astragalus spp. to be promptly highlighted. The PCA results from the LC-ESI-MS data of the Astragalus samples were in reasonable agreement with both the PCA results from the ESI-MS data and the quantitative results. This study affords an analytical method for the quali-quantitative determination of cycloartane derivatives in herbal preparations used as health and food supplements. Copyright © 2013 Elsevier B.V. All rights reserved.
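
    As a hedged illustration of the PCA step applied to MS data of this kind (not the authors' workflow), the sketch below autoscales a samples-by-ions intensity matrix and projects the samples onto the first two principal components. The matrix here is synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Synthetic intensity matrix: 8 extracts (rows) x 50 ion signals (columns).
# Real input would be ESI-MS or integrated LC-ESI-MS peak intensities.
rng = np.random.default_rng(4)
intensities = rng.lognormal(mean=5.0, sigma=1.0, size=(8, 50))

# Autoscale each ion (mean 0, unit variance) so abundant ions do not dominate,
# then project the samples onto the first two principal components.
scaled = StandardScaler().fit_transform(intensities)
pca = PCA(n_components=2)
scores = pca.fit_transform(scaled)

for i, (pc1, pc2) in enumerate(scores, start=1):
    print(f"sample {i}: PC1={pc1:+.2f}, PC2={pc2:+.2f}")
print("explained variance ratio:", pca.explained_variance_ratio_)
```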

  14. The flexibility of a generic LC-MS/MS method for the quantitative analysis of therapeutic proteins based on human immunoglobulin G and related constructs in animal studies.

    PubMed

    Lanshoeft, Christian; Wolf, Thierry; Walles, Markus; Barteau, Samuel; Picard, Franck; Kretz, Olivier; Cianférani, Sarah; Heudi, Olivier

    2016-11-30

    An increasing demand for new analytical methods accompanies the growing number of biotherapeutic programs being prosecuted in the pharmaceutical industry. Whilst the immunoassay has been the standard method for decades, great interest in assays based on liquid chromatography tandem mass spectrometry (LC-MS/MS) is evolving. In the present work, the development of a generic method for the quantitative analysis of therapeutic proteins based on human immunoglobulin G (hIgG) in rat serum is reported. The method is based on four generic peptides, GPSVFPLAPSSK (GPS), TTPPVLDSDGSFFLYSK (TTP), VVSVLTVLHQDWLNGK (VVS) and FNWYVDGVEVHNAK (FNW), originating from different parts of the fraction crystallizable (Fc) region of a reference hIgG1 (hIgG1A). A tryptic pellet digestion of rat serum spiked with hIgG1A and a stable isotope labeled protein (hIgG1B) used as internal standard (ISTD) was applied prior to LC-MS/MS analysis. The upper limit of quantification was 1000 μg/mL. The lower limit of quantification was 1.00 μg/mL for GPS, TTP and VVS, and 5.00 μg/mL for FNW. Accuracy and precision data met acceptance criteria over three days. The presented method was further successfully applied to the quantitative analysis of other hIgG1s (hIgG1C and hIgG1D) and hIgG4-based therapeutic proteins in spiked quality control (QC) samples in monkey and rat serum using calibration standards (Cs) prepared with hIgG1A in rat serum. In order to extend the applicability of our generic approach, a bispecific-bivalent hIgG1 (bb-hIgG1) and two lysine-conjugated antibody-drug conjugates (ADC1 and ADC2) were incorporated as well. The observed values for spiked QC samples in monkey serum were satisfactory with GPS for the determination of bb-hIgG1, whereas the FNW and TTP peptides were suitable for the ADCs. Moreover, comparable mean concentration-time profiles were obtained from monkeys previously dosed intravenously with ADC2, measured against Cs samples prepared either with hIgG1A in rat serum

  15. Dioscin Inhibits HSC-T6 Cell Migration via Adjusting SDC-4 Expression: Insights from iTRAQ-Based Quantitative Proteomics.

    PubMed

    Yin, Lianhong; Qi, Yan; Xu, Youwei; Xu, Lina; Han, Xu; Tao, Xufeng; Song, Shasha; Peng, Jinyong

    2017-01-01

    Hepatic stellate cell (HSC) migration, an important bioprocess, contributes to the development of liver fibrosis. Our previous studies found potent activity of dioscin against liver fibrosis through inhibiting HSC proliferation, triggering senescence and inducing apoptosis of activated HSCs, but the molecular mechanisms associated with cell migration had not been clarified. In this work, an iTRAQ (isobaric tags for relative and absolute quantitation)-based quantitative proteomics study was carried out, and a total of 1566 differentially expressed proteins with fold change ≥2.0 and p < 0.05 were identified in HSC-T6 cells treated with dioscin (5.0 μg/mL). Based on Gene Ontology classification, String and KEGG pathway assays, the effects of dioscin in inhibiting cell migration via regulating SDC-4 were examined. The results of wound-healing, cell migration and western blotting assays indicated that dioscin significantly inhibits HSC-T6 cell migration through an SDC-4-dependent signal pathway by affecting the expression levels of Fn, PKCα, Src, FAK, and ERK1/2. Specific SDC-4 knockdown by shRNA also blocked HSC-T6 cell migration, and dioscin slightly enhanced the inhibiting effect. Taken together, the present work showed that SDC-4 plays a crucial role in HSC-T6 cell adhesion and migration in the action of dioscin against liver fibrosis, and may be a potent therapeutic target for fibrotic diseases.

  16. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139
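
    Among the quantitative MRI techniques listed above, T2 mapping is the simplest to illustrate: a mono-exponential decay S(TE) = S0·exp(-TE/T2) is fitted to the signal measured at several echo times. The sketch below fits a single synthetic voxel; echo times, noise level and starting values are assumptions, not parameters from any cited study.

```python
import numpy as np
from scipy.optimize import curve_fit

def mono_exponential(te, s0, t2):
    """Signal model used for T2 mapping: S(TE) = S0 * exp(-TE / T2)."""
    return s0 * np.exp(-te / t2)

# Synthetic multi-echo measurement for one cartilage voxel (echo times in ms).
echo_times = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
true_s0, true_t2 = 1000.0, 45.0
signal = mono_exponential(echo_times, true_s0, true_t2)
signal += np.random.default_rng(5).normal(0, 5.0, signal.size)    # measurement noise

# Nonlinear least-squares fit; p0 gives rough starting values for S0 and T2.
params, _ = curve_fit(mono_exponential, echo_times, signal, p0=[signal[0], 30.0])
print(f"fitted S0 = {params[0]:.0f}, T2 = {params[1]:.1f} ms")
```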

  17. Quantitative real-time PCR approaches for microbial community studies in wastewater treatment systems: applications and considerations.

    PubMed

    Kim, Jaai; Lim, Juntaek; Lee, Changsoo

    2013-12-01

    Quantitative real-time PCR (qPCR) has been widely used in recent environmental microbial ecology studies as a tool for detecting and quantifying microorganisms of interest, which aids in a better understanding of the complexity of wastewater microbial communities. Although qPCR can be used to provide more specific and accurate quantification than other molecular techniques, it does have limitations that must be considered when applying it in practice. This article reviews the principle of qPCR quantification and its applications to microbial ecology studies in various wastewater treatment environments. Here we also address several limitations of qPCR-based approaches that can affect the validity of quantification data: template nucleic acid quality, nucleic acid extraction efficiency, specificity of group-specific primers and probes, amplification of nonviable DNA, gene copy number variation, and limited number of sequences in the database. Even with such limitations, qPCR is reportedly among the best methods for quantitatively investigating environmental microbial communities. The application of qPCR is and will continue to be increasingly common in studies of wastewater treatment systems. To obtain reliable analyses, however, the limitations that have often been overlooked must be carefully considered when interpreting the results. Copyright © 2013 Elsevier Inc. All rights reserved.

  18. Label-Free, LC-MS-Based Assays to Quantitate Small-Molecule Antagonist Binding to the Mammalian BLT1 Receptor.

    PubMed

    Chen, Xun; Stout, Steven; Mueller, Uwe; Boykow, George; Visconti, Richard; Siliphaivanh, Phieng; Spencer, Kerrie; Presland, Jeremy; Kavana, Michael; Basso, Andrea D; McLaren, David G; Myers, Robert W

    2017-08-01

    We have developed and validated label-free, liquid chromatography-mass spectrometry (LC-MS)-based equilibrium direct and competition binding assays to quantitate small-molecule antagonist binding to recombinant human and mouse BLT1 receptors expressed in HEK 293 cell membranes. Procedurally, these binding assays involve (1) equilibration of the BLT1 receptor and probe ligand, with or without a competitor; (2) vacuum filtration through cationic glass fiber filters to separate receptor-bound from free probe ligand; and (3) LC-MS analysis in selected reaction monitoring mode for bound probe ligand quantitation. Two novel, optimized probe ligands, compounds 1 and 2, were identified by screening 20 unlabeled BLT1 antagonists for direct binding. Saturation direct binding studies confirmed the high affinity, and dissociation studies established the rapid binding kinetics of probe ligands 1 and 2. Competition binding assays were established using both probe ligands, and the affinities of structurally diverse BLT1 antagonists were measured. Both binding assay formats can be executed with high specificity and sensitivity and moderate throughput (96-well plate format) using these approaches. This highly versatile, label-free method for studying ligand binding to membrane-associated receptors should find broad application as an alternative to traditional methods using labeled ligands.

  19. Gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of Japanese encephalitis virus

    NASA Astrophysics Data System (ADS)

    Huang, Su-Hua; Yang, Tsuey-Ching; Tsai, Ming-Hong; Tsai, I.-Shou; Lu, Huang-Chih; Chuang, Pei-Hsin; Wan, Lei; Lin, Ying-Ju; Lai, Chih-Ho; Lin, Cheng-Wen

    2008-10-01

    Virus isolation and antibody detection are routinely used for diagnosis of Japanese encephalitis virus (JEV) infection, but the low level of transient viremia in some JE patients makes JEV isolation from clinical and surveillance samples very difficult. We describe the use of gold nanoparticle-based RT-PCR and real-time quantitative RT-PCR assays for detection of JEV from its RNA genome. We tested the effect of gold nanoparticles on four different PCR systems, including conventional PCR, reverse-transcription PCR (RT-PCR), and SYBR green real-time PCR and RT-PCR assays for diagnosis in the acute phase of JEV infection. Gold nanoparticles increased the amplification yield of the PCR product and shortened the PCR time compared to the conventional reaction. In addition, nanogold-based real-time RT-PCR showed a linear relationship between Ct and template amount using ten-fold dilutions of JEV. The nanogold-based RT-PCR and real-time quantitative RT-PCR assays were able to detect low levels (1-10,000 copies) of the JEV RNA genomes extracted from culture medium or whole blood, providing early diagnostic tools for the detection of low-level viremia in the acute-phase infection. The assays described here were simple, sensitive, and rapid approaches for detection and quantitation of JEV in tissue-cultured samples as well as clinical samples.
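
    As an illustration of the standard-curve quantification that underlies real-time RT-PCR assays like this one, the sketch below fits Ct against log10 template copies for a hypothetical ten-fold dilution series and back-calculates an unknown sample. The Ct values and the copies_from_ct helper are illustrative assumptions, not data or code from the study.

    ```python
    import numpy as np

    # Hypothetical ten-fold dilution series of viral RNA standards (copies per
    # reaction) and their measured Ct values; the numbers are illustrative only.
    copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5])
    ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7])

    # Fit the standard curve: Ct = slope * log10(copies) + intercept
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)

    # Amplification efficiency estimated from the slope: E = 10^(-1/slope) - 1
    efficiency = 10 ** (-1.0 / slope) - 1.0

    def copies_from_ct(ct_value: float) -> float:
        """Back-calculate an unknown sample's copy number from its Ct."""
        return 10 ** ((ct_value - intercept) / slope)

    print(f"slope={slope:.2f}, intercept={intercept:.2f}, efficiency={efficiency:.1%}")
    print(f"Estimated copies at Ct=25: {copies_from_ct(25.0):.0f}")
    ```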

  20. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
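
    The quantitative link proposed above, from contaminant mass recovered in HVAC filter dust to a time-averaged air concentration, can be thought of as a simple mass balance. The sketch below is a minimal illustration under assumed, hypothetical inputs (flow rate, runtime, capture efficiency); it is not the authors' model, which would also need to account for factors such as dust recovery efficiency from the filter.

    ```python
    def time_averaged_concentration(mass_extracted_ug: float,
                                    flow_m3_per_h: float,
                                    runtime_h: float,
                                    capture_efficiency: float) -> float:
        """
        Simplified mass balance linking contaminant mass recovered from HVAC
        filter dust to a time-averaged airborne concentration (ug/m^3).

        mass_extracted_ug  : contaminant mass recovered from the filter (ug)
        flow_m3_per_h      : volumetric flow rate through the filter (m^3/h)
        runtime_h          : total system runtime over the sampling period (h)
        capture_efficiency : fraction of the particle-bound contaminant retained
        """
        air_volume_m3 = flow_m3_per_h * runtime_h
        return mass_extracted_ug / (capture_efficiency * air_volume_m3)

    # Illustrative numbers only: 50 ug recovered, 1500 m^3/h, 400 h runtime, 60% capture
    print(time_averaged_concentration(50.0, 1500.0, 400.0, 0.6))
    ```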

  1. Peer support for parents of children with chronic disabling conditions: a systematic review of quantitative and qualitative studies.

    PubMed

    Shilling, Val; Morris, Christopher; Thompson-Coon, Jo; Ukoumunne, Obioha; Rogers, Morwenna; Logan, Stuart

    2013-07-01

    To review the qualitative and quantitative evidence of the benefits of peer support for parents of children with disabling conditions in the context of health, well-being, impact on family, and economic and service implications. We comprehensively searched multiple databases. Eligible studies evaluated parent-to-parent support and reported on the psychological health and experience of giving or receiving support. There were no limits on the child's condition, study design, language, date, or setting. We sought to aggregate quantitative data; findings of qualitative studies were combined using thematic analysis. Qualitative and quantitative data were brought together in a narrative synthesis. Seventeen papers were included: nine qualitative studies, seven quantitative studies, and one mixed-methods evaluation. Four themes were identified from qualitative studies: (1) shared social identity, (2) learning from the experiences of others, (3) personal growth, and (4) supporting others. Some quantitative studies reported a positive effect of peer support on psychological health and other outcomes; however, this was not consistently confirmed. It was not possible to aggregate data across studies. No costing data were identified. Qualitative studies strongly suggest that parents perceive benefit from peer support programmes, an effect seen across different types of support and conditions. However, quantitative studies provide inconsistent evidence of positive effects. Further research should explore whether this dissonance is substantive or an artefact of how outcomes have been measured. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.

  2. Results of Studying Astronomy Students’ Science Literacy, Quantitative Literacy, and Information Literacy

    NASA Astrophysics Data System (ADS)

    Buxner, Sanlyn; Impey, Chris David; Follette, Katherine B.; Dokter, Erin F.; McCarthy, Don; Vezino, Beau; Formanek, Martin; Romine, James M.; Brock, Laci; Neiberding, Megan; Prather, Edward E.

    2017-01-01

    Introductory astronomy courses often serve as terminal science courses for non-science majors and present an opportunity to assess future non-scientists' attitudes towards science as well as basic scientific knowledge and scientific analysis skills that may remain unchanged after college. Through a series of studies, we have been able to evaluate students' basic science knowledge, attitudes towards science, quantitative literacy, and information literacy. In the Fall of 2015, we conducted a case study of a single class, administering all relevant surveys to an undergraduate class of 20 students. We will present our analysis of trends from each of these studies as well as the comparison case study. In general, we have found that students' basic scientific knowledge has remained stable over the past quarter century. In all of our studies, there is a strong relationship between student attitudes and their science and quantitative knowledge and skills. Additionally, students' information literacy is strongly connected to their attitudes and basic scientific knowledge. We are currently expanding these studies to include new audiences and will discuss the implications of our findings for instructors.

  3. Dissociative conceptual and quantitative problem solving outcomes across interactive engagement and traditional format introductory physics

    NASA Astrophysics Data System (ADS)

    McDaniel, Mark A.; Stoen, Siera M.; Frey, Regina F.; Markow, Zachary E.; Hynes, K. Mairin; Zhao, Jiuqing; Cahill, Michael J.

    2016-12-01

    The existing literature indicates that interactive-engagement (IE) based general physics classes improve conceptual learning relative to more traditional lecture-oriented classrooms. Very little research, however, has examined quantitative problem-solving outcomes from IE-based relative to traditional lecture-based physics classes. The present study included both pre- and post-course conceptual-learning assessments and a new quantitative physics problem-solving assessment that included three representative conservation of energy problems from a first-semester calculus-based college physics course. Scores for problem translation, plan coherence, solution execution, and evaluation of solution plausibility were extracted for each problem. Over 450 students in three IE-based sections and two traditional lecture sections taught at the same university during the same semester participated. As expected, the IE-based course produced more robust gains on a Force Concept Inventory than did the lecture course. By contrast, when the full sample was considered, gains in quantitative problem solving were significantly greater for lecture than for IE-based physics; when students were matched on pre-test scores, there was still no advantage for IE-based physics on gains in quantitative problem solving. Further, the association between performance on the concept inventory and quantitative problem solving was minimal. These results highlight that improved conceptual understanding does not necessarily support improved quantitative physics problem solving, and that the instructional method appears to have less bearing on gains in quantitative problem solving than do the kinds of problems emphasized in the courses and homework and the overlap of these problems with those on the assessment.

  4. Quantitative and Qualitative Study of Intestinal Flora in Neonates

    PubMed Central

    Sharma, Nidhi; Chaudhry, Rama; Panigrahi, Pinaki

    2012-01-01

    Background: In the neonatal period the developing intestinal barrier function provides a sub-optimal mucosal defense against infection. Establishment of the normal commensal micro-flora plays a vital role in this process. Aims: To determine aerobic and anaerobic bacteria by quantitative and qualitative methods from faecal samples of neonates. Settings and Design: A prospective study was carried out in two groups in a tertiary care hospital; Group A comprised preterm infants and Group B full-term infants. Materials and Methods: Sixty-two preterm infants weighing < 1500 g with gestational age < 34 weeks and twenty-nine full-term infants aged 4 weeks were included. Quantitation of bacterial load was done by ten-fold serial dilutions on respective media. Statistical Analysis: The data were analyzed using EPIINFO-Ver 6.04. Results and Conclusions: The predominant aerobic bacterium was Klebsiella pneumoniae. Preterm infants were colonized by an average of 2.1 aerobic and 0.1 anaerobic bacterial species. Quantitation showed faecal bacterial colony counts ranging from 10^4-10^13 CFU/g. Gram-negative and gram-positive bacteria increased gradually over an interval of 2 to 3 weeks. Mean log CFU of gram-negative and gram-positive bacteria were statistically insignificant from day 3 to day 14 (P > 0.05). On day 21 there was a significant change in colonization of both bacterial species (P < 0.05). Potentially pathogenic aerobic bacteria dominate the intestinal flora of premature babies nursed in the neonatal unit. There is a need to investigate interventions to offset this imbalance in the gut micro-ecology of premature babies. PMID:23326075

  5. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures to models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.

  6. Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.

    PubMed

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H

    2014-05-01

    Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and to validate the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable the facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel. This article is part of a Special Issue entitled: Biomarkers: A Proteomic Challenge. © 2013.
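
    To make the SIS-peptide idea concrete, the sketch below back-calculates an analyte concentration from a light/heavy (analyte/SIS) peak-area ratio using a calibration regression. The calibration points, concentrations and the concentration_from_ratio helper are hypothetical placeholders, not values from the paper's 142-protein panel.

    ```python
    import numpy as np

    # Hypothetical calibration: known analyte concentrations (ng/mL) spiked into
    # control plasma and the observed light/heavy (analyte/SIS) peak-area ratios.
    conc = np.array([50.0, 100.0, 250.0, 500.0, 1000.0])
    ratio = np.array([0.11, 0.21, 0.53, 1.04, 2.08])

    # Linear regression of peak-area ratio against concentration
    slope, intercept = np.polyfit(conc, ratio, 1)

    def concentration_from_ratio(observed_ratio: float) -> float:
        """Back-calculate an unknown sample's concentration from its ratio."""
        return (observed_ratio - intercept) / slope

    print(f"Estimated concentration for ratio 0.80: "
          f"{concentration_from_ratio(0.80):.1f} ng/mL")
    ```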

  7. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand if and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available or even acquirable are not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure a difference between the measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques to transform data of varied quality and enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method to use for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based study and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
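
    As a minimal illustration of the optimal-scaling idea mentioned above (comparing a model prediction against data that are only known up to an arbitrary scale), the sketch below computes the least-squares scale factor in closed form before evaluating the fitness. The model and data values are invented for illustration and do not come from the review.

    ```python
    import numpy as np

    # Model prediction in arbitrary units vs. relative measurements
    # (e.g. fluorescence intensities); both vectors are illustrative only.
    model = np.array([0.2, 0.5, 0.9, 1.4, 1.8])
    data = np.array([1.1, 2.4, 4.6, 7.2, 8.9])

    def scaled_sse(s: float) -> float:
        """Sum of squared errors after multiplying the model by scale factor s."""
        return float(np.sum((s * model - data) ** 2))

    # Closed form: the least-squares scale factor is <model, data> / <model, model>.
    s_opt = np.dot(model, data) / np.dot(model, model)
    print(f"optimal scale factor: {s_opt:.3f}, SSE at optimum: {scaled_sse(s_opt):.3f}")
    ```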

  8. Quantitative secondary electron detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, Jyoti; Joy, David C.; Nayak, Subuhadarshi

    Quantitative Secondary Electron Detection (QSED) using an array of solid-state device (SSD) based electron counters enables critical dimension metrology measurements in materials such as semiconductors, nanomaterials, and biological samples. The methods and devices effect quantitative detection of secondary electrons with an array of solid-state detectors: the array senses secondary electrons with a plurality of solid-state detectors, and the number of secondary electrons is counted with a time-to-digital converter circuit operating in counter mode.

  9. Quantitative nanoparticle tracking: applications to nanomedicine.

    PubMed

    Huang, Feiran; Dempsey, Christopher; Chona, Daniela; Suh, Junghae

    2011-06-01

    Particle tracking is an invaluable technique to extract quantitative and qualitative information regarding the transport of nanomaterials through complex biological environments. This technique can be used to probe the dynamic behavior of nanoparticles as they interact with and navigate through intra- and extra-cellular barriers. In this article, we focus on the recent developments in the application of particle-tracking technology to nanomedicine, including the study of synthetic and virus-based materials designed for gene and drug delivery. Specifically, we cover research where mean square displacements of nanomaterial transport were explicitly determined in order to quantitatively assess the transport of nanoparticles through biological environments. Particle-tracking experiments can provide important insights that may help guide the design of more intelligent and effective diagnostic and therapeutic nanoparticles.
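
    Since the review centers on mean square displacements extracted from trajectories, the sketch below computes a time-averaged MSD for a single 2D track and estimates a diffusion coefficient from its slope. The synthetic random walk, the frame interval, and the step size are assumptions made for illustration, not data from any of the reviewed studies.

    ```python
    import numpy as np

    def mean_square_displacement(track: np.ndarray, max_lag: int) -> np.ndarray:
        """
        Time-averaged mean square displacement for a single 2D trajectory.

        track   : array of shape (n_frames, 2) with x, y positions
        max_lag : largest lag (in frames) to evaluate
        """
        msd = np.empty(max_lag)
        for lag in range(1, max_lag + 1):
            displacements = track[lag:] - track[:-lag]
            msd[lag - 1] = np.mean(np.sum(displacements ** 2, axis=1))
        return msd

    # Synthetic random walk standing in for a tracked nanoparticle (positions in um)
    rng = np.random.default_rng(0)
    track = np.cumsum(rng.normal(scale=0.05, size=(500, 2)), axis=0)
    msd = mean_square_displacement(track, max_lag=50)

    # For pure Brownian motion in 2D, MSD(t) ~ 4*D*t; estimate D from the slope.
    lags = np.arange(1, 51) * 0.1  # assumed frame interval of 0.1 s
    D = np.polyfit(lags, msd, 1)[0] / 4.0
    print(f"Estimated diffusion coefficient: {D:.4f} um^2/s")
    ```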

  10. A probe-based quantitative PCR assay for detecting Tetracapsuloides bryosalmonae in fish tissue and environmental DNA water samples

    USGS Publications Warehouse

    Hutchins, Patrick; Sepulveda, Adam; Martin, Renee; Hopper, Lacey

    2017-01-01

    A probe-based quantitative real-time PCR assay was developed to detect Tetracapsuloides bryosalmonae, which causes proliferative kidney disease in salmonid fish, in kidney tissue and environmental DNA (eDNA) water samples. The limits of detection and quantification were 7 and 100 DNA copies for calibration standards and T. bryosalmonae was reliably detected down to 100 copies in tissue and eDNA samples. The assay presented here is a highly sensitive and quantitative tool for detecting T. bryosalmonae with potential applications for tissue diagnostics and environmental detection.

  11. Project-Based Learning in Undergraduate Environmental Chemistry Laboratory: Using EPA Methods to Guide Student Method Development for Pesticide Quantitation

    ERIC Educational Resources Information Center

    Davis, Eric J.; Pauls, Steve; Dick, Jonathan

    2017-01-01

    Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…

  12. Quantitative Analysis of Endocytic Recycling of Membrane Proteins by Monoclonal Antibody-Based Recycling Assays.

    PubMed

    Blagojević Zagorac, Gordana; Mahmutefendić, Hana; Maćešić, Senka; Karleuša, Ljerka; Lučin, Pero

    2017-03-01

    In this report, we present an analysis of several recycling protocols based on labeling of membrane proteins with specific monoclonal antibodies (mAbs). We analyzed recycling of membrane proteins that are internalized by clathrin-dependent endocytosis, represented by the transferrin receptor, and by clathrin-independent endocytosis, represented by the Major Histocompatibility Class I molecules. Cell surface membrane proteins were labeled with mAbs and recycling of mAb:protein complexes was determined by several approaches. Our study demonstrates that direct and indirect detection of recycled mAb:protein complexes at the cell surface underestimate the recycling pool, especially for clathrin-dependent membrane proteins that are rapidly reinternalized after recycling. Recycling protocols based on the capture of recycled mAb:protein complexes require the use of the Alexa Fluor 488 conjugated secondary antibodies or FITC-conjugated secondary antibodies in combination with inhibitors of endosomal acidification and degradation. Finally, protocols based on the capture of recycled proteins that are labeled with Alexa Fluor 488 conjugated primary antibodies and quenching of fluorescence by the anti-Alexa Fluor 488 displayed the same quantitative assessment of recycling as the antibody-capture protocols. J. Cell. Physiol. 232: 463-476, 2017. © 2016 Wiley Periodicals, Inc.

  13. Leadership Trust in Virtual Teams Using Communication Tools: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Clark, Robert Lynn

    2014-01-01

    The purpose of this quantitative correlational study was to address leadership trust in virtual teams using communication tools in a small south-central, family-owned pharmaceutical organization, with multiple dispersed locations located in the United States. The results of the current research study could assist leaders to develop a communication…

  14. Systems for Lung Volume Standardization during Static and Dynamic MDCT-based Quantitative Assessment of Pulmonary Structure and Function

    PubMed Central

    Fuld, Matthew K.; Grout, Randall; Guo, Junfeng; Morgan, John H.; Hoffman, Eric A.

    2013-01-01

    Rationale and Objectives: Multidetector-row Computed Tomography (MDCT) has emerged as a tool for quantitative assessment of parenchymal destruction, air trapping (density metrics) and airway remodeling (metrics relating airway wall and lumen geometry) in chronic obstructive pulmonary disease (COPD) and asthma. Critical to the accuracy and interpretability of these MDCT-derived metrics is the assurance that the lungs are scanned during a breath-hold at a standardized volume. Materials and Methods: A computer monitored turbine-based flow meter system was developed to control patient breath-holds and facilitate static imaging at fixed percentages of the vital capacity. Due to calibration challenges with gas density changes during multi-breath xenon-CT an alternative system was required. The design incorporated dual rolling seal pistons. Both systems were tested in a laboratory environment and human subject trials. Results: The turbine-based system successfully controlled lung volumes in 32/37 subjects, having a linear relationship for CT measured air volume between repeated scans: for all scans, the mean and confidence interval of the differences (scan1-scan2) was −9 ml (−169, 151); for TLC alone 6 ml (−164, 177); for FRC alone, −23 ml (−172, 126). The dual-piston system successfully controlled lung volume in 31/41 subjects. Study failures related largely to subject non-compliance with verbal instruction and gas leaks around the mouthpiece. Conclusion: We demonstrate the successful use of a turbine-based system for static lung volume control and demonstrate its inadequacies for dynamic xenon-CT studies. Implementation of a dual-rolling seal spirometer has been shown to adequately control lung volume for multi-breath wash-in xenon-CT studies. These systems coupled with proper patient coaching provide the tools for the use of CT to quantitate regional lung structure and function. The wash-in xenon-CT method for assessing regional lung function, while not

  15. Systems for lung volume standardization during static and dynamic MDCT-based quantitative assessment of pulmonary structure and function.

    PubMed

    Fuld, Matthew K; Grout, Randall W; Guo, Junfeng; Morgan, John H; Hoffman, Eric A

    2012-08-01

    Multidetector-row computed tomography (MDCT) has emerged as a tool for quantitative assessment of parenchymal destruction, air trapping (density metrics), and airway remodeling (metrics relating airway wall and lumen geometry) in chronic obstructive pulmonary disease (COPD) and asthma. Critical to the accuracy and interpretability of these MDCT-derived metrics is the assurance that the lungs are scanned during a breathhold at a standardized volume. A computer monitored turbine-based flow meter system was developed to control patient breathholds and facilitate static imaging at fixed percentages of the vital capacity. Because of calibration challenges with gas density changes during multibreath xenon CT, an alternative system was required. The design incorporated dual rolling seal pistons. Both systems were tested in a laboratory environment and human subject trials. The turbine-based system successfully controlled lung volumes in 32/37 subjects, having a linear relationship for CT measured air volume between repeated scans: for all scans, the mean and confidence interval of the differences (scan1-scan2) was -9 mL (-169, 151); for total lung capacity alone 6 mL (-164, 177); for functional residual capacity alone, -23 mL (-172, 126). The dual-piston system successfully controlled lung volume in 31/41 subjects. Study failures related largely to subject noncompliance with verbal instruction and gas leaks around the mouthpiece. We demonstrate the successful use of a turbine-based system for static lung volume control and demonstrate its inadequacies for dynamic xenon CT studies. Implementation of a dual-rolling seal spirometer has been shown to adequately control lung volume for multibreath wash-in xenon CT studies. These systems coupled with proper patient coaching provide the tools for the use of CT to quantitate regional lung structure and function. The wash-in xenon CT method for assessing regional lung function, although not necessarily practical for routine
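
    The repeated-scan repeatability reported above (a mean difference with an interval around it) can be summarized with a Bland-Altman-style calculation. The sketch below computes the mean difference and 95% limits of agreement for hypothetical paired scan volumes; it illustrates that kind of summary only and is not the paper's exact statistical procedure or data.

    ```python
    import numpy as np

    def agreement_stats(scan1_ml: np.ndarray, scan2_ml: np.ndarray):
        """
        Mean difference and 95% limits of agreement (Bland-Altman style) between
        CT-measured air volumes from repeated, volume-controlled scans.
        """
        diff = scan1_ml - scan2_ml
        mean_diff = diff.mean()
        sd_diff = diff.std(ddof=1)
        return mean_diff, (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

    # Illustrative repeated-scan volumes (mL); the values are made up.
    scan1 = np.array([5210.0, 4980.0, 6150.0, 3890.0, 5470.0])
    scan2 = np.array([5180.0, 5025.0, 6101.0, 3940.0, 5415.0])
    mean_diff, limits = agreement_stats(scan1, scan2)
    print(f"mean difference = {mean_diff:.0f} mL, 95% limits of agreement = "
          f"({limits[0]:.0f}, {limits[1]:.0f}) mL")
    ```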

  16. Model of Values-Based Management Process in Schools: A Mixed Design Study

    ERIC Educational Resources Information Center

    Dogan, Soner

    2016-01-01

    The aim of this paper is to evaluate the school administrators' values-based management behaviours according to the teachers' perceptions and opinions and, accordingly, to build a model of values-based management process in schools. The study was conducted using explanatory design which is inclusive of both quantitative and qualitative methods.…

  17. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

    A brief summary of the classical methods of quantitative determination of thorium in ores and thoriferous products is given to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the utilization of the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements are discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or smaller. (J.S.R.)

  18. A Methodological Self-Study of Quantitizing: Negotiating Meaning and Revealing Multiplicity

    ERIC Educational Resources Information Center

    Seltzer-Kelly, Deborah; Westwood, Sean J.; Pena-Guzman, David M.

    2012-01-01

    This inquiry developed during the process of "quantitizing" qualitative data the authors had gathered for a mixed methods curriculum efficacy study. Rather than providing the intended rigor to their data coding process, their use of an intercoder reliability metric prompted their investigation of the multiplicity and messiness that, as they…

  19. The Positive Alternative Credit Experience (PACE) Program a Quantitative Comparative Study

    ERIC Educational Resources Information Center

    Warren, Rebecca Anne

    2011-01-01

    The purpose of this quantitative comparative study was to evaluate the Positive Alternative Credit Experience (PACE) Program using an objectives-oriented approach to a formative program evaluation. The PACE Program was a semester-long high school alternative education program designed to serve students at-risk for academic failure or dropping out…

  20. Sustainable Urban Forestry Potential Based Quantitative And Qualitative Measurement Using Geospatial Technique

    NASA Astrophysics Data System (ADS)

    Rosli, A. Z.; Reba, M. N. M.; Roslan, N.; Room, M. H. M.

    2014-02-01

    In order to maintain the stability of natural ecosystems around urban areas, urban forestry is the best initiative to maintain and control green space in our country. Integration between remote sensing (RS) and geospatial information systems (GIS) serves as an effective tool for monitoring environmental changes and for planning, managing and developing sustainable urbanization. This paper aims to assess the capability of integrated RS and GIS to provide information on potential urban forest sites based on qualitative and quantitative measures using priority parameter ranking in the new township of Nusajaya. A SPOT image was used to provide high spatial accuracy, while maps of topography, land use, soil group, hydrology, a Digital Elevation Model (DEM) and soil series data were applied to enhance the satellite image in detecting and locating present attributes and features on the ground. The Multi-Criteria Decision Making (MCDM) technique provides structured, pairwise quantification and comparison of elements and criteria for priority ranking for urban forestry purposes. Slope, soil texture, drainage, spatial area, availability of natural resources, and vicinity to urban areas are the criteria considered in this study. This study highlights that priority ranking with MCDM is a cost-effective tool for decision-making in urban forestry planning and landscaping.
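
    To show the shape of such a criteria-based ranking, the sketch below scores hypothetical candidate sites with a simple weighted sum over the criteria named above. The weights, site names and scores are invented, and the paper's actual MCDM procedure (pairwise comparisons and priority ranking) may differ in detail from this simplification.

    ```python
    # Minimal weighted-sum MCDM sketch for ranking candidate urban-forestry sites.
    criteria_weights = {
        "slope": 0.20,
        "soil_texture": 0.15,
        "drainage": 0.15,
        "spatial_area": 0.20,
        "natural_resources": 0.15,
        "vicinity_to_urban": 0.15,
    }

    # Candidate sites scored 1 (poor) to 5 (excellent) on each criterion.
    sites = {
        "site_A": {"slope": 4, "soil_texture": 3, "drainage": 5,
                   "spatial_area": 4, "natural_resources": 3, "vicinity_to_urban": 5},
        "site_B": {"slope": 2, "soil_texture": 4, "drainage": 3,
                   "spatial_area": 5, "natural_resources": 4, "vicinity_to_urban": 2},
    }

    def suitability(scores: dict) -> float:
        """Weighted sum of criterion scores for one candidate site."""
        return sum(criteria_weights[c] * s for c, s in scores.items())

    for name in sorted(sites, key=lambda n: suitability(sites[n]), reverse=True):
        print(f"{name}: suitability = {suitability(sites[name]):.2f}")
    ```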

  1. [The positioning of nursing research in the academic studies: the origin and development of qualitative and quantitative studies].

    PubMed

    Lu, Pei-Pei; Ting, Shing-Shiang; Chen, Mei-Ling; Tang, Woung-Ru

    2005-12-01

    The purpose of this study is to discuss the historical context of qualitative and quantitative research so as to explain the principles of qualitative study and to examine the positioning of nursing research within academic study as a whole. This paper guides readers through the historical context of empirical science, discusses the influences of qualitative and quantitative research on nursing research, then investigates the nature of research paradigms and examines the positioning of nursing research, which includes the characteristics of fields such as the natural sciences, the humanities and social studies, and science, and lastly presents the research standards proposed by Yardley in 2000. The research paradigms include Positivism, Postpositivism, Criticism, and Constructivism, which can be compared in terms of Ontology, Epistemology, and Methodology. The nature of a paradigm is determined by its assumptions with respect to Ontology, Epistemology, and Methodology. The paradigm determines how the researcher views the world and decides on what to answer, how to research, and how to answer. The difference in academic environment is reflected in the long-term dialogue between qualitative and quantitative studies, as well as in the standards for criticism. This paper introduces the method for evaluating the quality of qualitative study proposed by Yardley in 2000, namely sensitivity to context, commitment and rigour, transparency and coherence, and impact and importance. The paper is intended to provide a guideline for readers in evaluating the quality of qualitative study.

  2. Understanding Pre-Quantitative Risk in Projects

    NASA Technical Reports Server (NTRS)

    Cooper, Lynne P.

    2011-01-01

    Standard approaches to risk management in projects depend on the ability of teams to identify risks and quantify the probabilities and consequences of these risks (e.g., the 5 x 5 risk matrix). However, long before quantification does - or even can - occur, and long after, teams make decisions based on their pre-quantitative understanding of risk. These decisions can have long-lasting impacts on the project. While significant research has looked at the process of how to quantify risk, our understanding of how teams conceive of and manage pre-quantitative risk is lacking. This paper introduces the concept of pre-quantitative risk and discusses the implications of addressing pre-quantitative risk in projects.

  3. Affinity for Quantitative Tools: Undergraduate Marketing Students Moving beyond Quantitative Anxiety

    ERIC Educational Resources Information Center

    Tarasi, Crina O.; Wilson, J. Holton; Puri, Cheenu; Divine, Richard L.

    2013-01-01

    Marketing students are known as less likely to have an affinity for the quantitative aspects of the marketing discipline. In this article, we study the reasons why this might be true and develop a parsimonious 20-item scale for measuring quantitative affinity in undergraduate marketing students. The scale was administered to a sample of business…

  4. Skeletal scintigraphy and quantitative tracer studies in metabolic bone disease

    NASA Astrophysics Data System (ADS)

    Fogelman, Ignac

    Bone scan imaging with the current bone seeking radiopharmaceuticals, the technetium-99m labelled diphosphonates, has dramatically improved our ability to evaluate skeletal pathology. In this thesis, chapter 1 presents a review of the history of bone scanning, summarises present concepts as to the mechanism of uptake of bone seeking agents and briefly illustrates the role of bone scanning in clinical practice. In chapter 2 the applications of bone scan imaging and quantitative tracer techniques derived from the bone scan in the detection of metabolic bone disease are discussed. Since skeletal uptake of Tc-99m diphosphonate depends upon skeletal metabolism one might expect that the bone scan would be of considerable value in the assessment of metabolic bone disease. However in these disorders the whole skeleton is often diffusely involved by the metabolic process and simple visual inspection of the scan image may not reveal the uniformly increased uptake of tracer. Certain patterns of bone scan abnormality have, however, been reported in patients with primary hyperparathyroidism and renal osteo-dystrophy; the present studies extend these observations and introduce the concept of "metabolic features" which are often recognisable in conditions with generalised increased bone turnover. As an aid to systematic recognition of these features on a given bone scan image a semi-quantitative scoring system, the metabolic index, was introduced. The metabolic index allowed differentiation between various groups of patients with metabolic disorders and a control population. In addition, in a bone scan study of patients with acromegaly, it was found that the metabolic index correlated well with disease activity as measured by serum growth hormone levels. The metabolic index was, however, found to be a relatively insensitive means of identifying disease in individual patients. Patients with increased bone turnover will have an absolute increase in skeletal uptake of tracer. As a

  5. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  6. Conducting quantitative synthesis when comparing medical interventions: AHRQ and the Effective Health Care Program.

    PubMed

    Fu, Rongwei; Gartlehner, Gerald; Grant, Mark; Shamliyan, Tatyana; Sedrakyan, Art; Wilt, Timothy J; Griffith, Lauren; Oremus, Mark; Raina, Parminder; Ismaila, Afisi; Santaguida, Pasqualina; Lau, Joseph; Trikalinos, Thomas A

    2011-11-01

    The aim of this article is to establish recommendations for conducting quantitative synthesis, or meta-analysis, using study-level data in comparative effectiveness reviews (CERs) for the Evidence-based Practice Center (EPC) program of the Agency for Healthcare Research and Quality. We focused on recurrent issues in the EPC program, and the recommendations were developed using group discussion and consensus based on current knowledge in the literature. We first discussed considerations for deciding whether to combine studies, followed by discussions on indirect comparison and incorporation of indirect evidence. Then, we described our recommendations on choosing effect measures and statistical models, giving special attention to combining studies with rare events, and on testing and exploring heterogeneity. Finally, we briefly presented recommendations on combining studies of mixed design and on sensitivity analysis. Quantitative synthesis should be conducted in a transparent and consistent way. Inclusion of multiple alternative interventions in CERs increases the complexity of quantitative synthesis, whereas the basic issues remain crucial considerations in quantitative synthesis for a CER. We will cover more issues in future versions and update and improve recommendations with the accumulation of new research to advance the goal of transparency and consistency. Copyright © 2011 Elsevier Inc. All rights reserved.
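
    As one concrete example of the kind of study-level pooling such guidance covers, the sketch below implements DerSimonian-Laird random-effects pooling of effect estimates. It is only an illustration of one common model choice, not a statement of the article's specific recommendations, and the effect sizes and variances are invented.

    ```python
    import numpy as np

    def dersimonian_laird(effects: np.ndarray, variances: np.ndarray):
        """
        DerSimonian-Laird random-effects pooling of study-level effect estimates.
        Returns the pooled effect, its standard error, and the between-study
        heterogeneity variance tau^2.
        """
        w = 1.0 / variances                        # fixed-effect weights
        fixed = np.sum(w * effects) / np.sum(w)
        q = np.sum(w * (effects - fixed) ** 2)     # Cochran's Q
        df = len(effects) - 1
        c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)              # between-study variance
        w_star = 1.0 / (variances + tau2)          # random-effects weights
        pooled = np.sum(w_star * effects) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, se, tau2

    # Illustrative log odds ratios and variances from five hypothetical studies.
    effects = np.array([-0.30, -0.10, -0.45, 0.05, -0.25])
    variances = np.array([0.04, 0.09, 0.05, 0.12, 0.06])
    pooled, se, tau2 = dersimonian_laird(effects, variances)
    print(f"pooled log OR = {pooled:.3f} (SE {se:.3f}), tau^2 = {tau2:.3f}")
    ```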

  7. Quantitative MRI and spectroscopy of bone marrow

    PubMed Central

    Ruschke, Stefan; Dieckmeyer, Michael; Diefenbach, Maximilian; Franz, Daniela; Gersing, Alexandra S.; Krug, Roland; Baum, Thomas

    2017-01-01

    Bone marrow is one of the largest organs in the human body, enclosing adipocytes, hematopoietic stem cells, which are responsible for blood cell production, and mesenchymal stem cells, which are responsible for the production of adipocytes and bone cells. Magnetic resonance imaging (MRI) is the ideal imaging modality to monitor bone marrow changes in healthy and pathological states, thanks to its inherent rich soft‐tissue contrast. Quantitative bone marrow MRI and magnetic resonance spectroscopy (MRS) techniques have also been developed in order to quantify changes in bone marrow water–fat composition, cellularity and perfusion in different pathologies, and to assist in understanding the role of bone marrow in the pathophysiology of systemic diseases (e.g. osteoporosis). The present review summarizes a large selection of studies published until March 2017 in proton‐based quantitative MRI and MRS of bone marrow. Some basic knowledge about bone marrow anatomy and physiology is first reviewed. The most important technical aspects of quantitative MR methods measuring bone marrow water–fat composition, fatty acid composition, perfusion, and diffusion are then described. Finally, previous MR studies are reviewed on the application of quantitative MR techniques in both healthy aging and diseased bone marrow affected by osteoporosis, fractures, metabolic diseases, multiple myeloma, and bone metastases. Level of Evidence: 3 Technical Efficacy: Stage 2 J. Magn. Reson. Imaging 2018;47:332–353. PMID:28570033

  8. Magnetoresistive biosensors for quantitative proteomics

    NASA Astrophysics Data System (ADS)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, offering low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  9. Questionnaire-based person trip visualization and its integration to quantitative measurements in Myanmar

    NASA Astrophysics Data System (ADS)

    Kimijiama, S.; Nagai, M.

    2016-06-01

    With telecommunication development in Myanmar, person trip surveys are expected to shift from conversational questionnaires to GPS surveys. Integrating historical questionnaire data with GPS surveys and visualizing them is very important for evaluating chronological trip changes against socio-economic and environmental events. The objectives of this paper are to: (a) visualize questionnaire-based person trip data, (b) compare the errors between questionnaire and GPS data sets with respect to sex and age, and (c) assess trip behaviour in time series. In total, 345 individual respondents were selected through random stratification, and person trips were assessed using both a questionnaire and a GPS survey for each. Conversion of trip information, such as destinations, from the questionnaires was conducted using GIS. The results show that the errors between the two data sets in the number of trips, total trip distance and total trip duration are 25.5%, 33.2% and 37.2%, respectively. The smaller errors are found among working-age females mainly employed in project-related activities generated by foreign investment. Trip distance increased yearly. The study concludes that visualizing questionnaire-based person trip data and integrating it with current quantitative measurements are very useful for exploring historical trip changes and understanding the impacts of socio-economic events.
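
    The questionnaire-versus-GPS error percentages reported above amount to a per-respondent relative error averaged over the sample. The sketch below shows that calculation on invented respondent records; the field names and values are hypothetical and the paper's exact aggregation may differ.

    ```python
    # Percentage error between questionnaire-reported and GPS-measured trip metrics,
    # computed per respondent and averaged; respondent values are hypothetical.

    def percent_error(questionnaire: float, gps: float) -> float:
        """Absolute error of the questionnaire value relative to the GPS value."""
        return abs(questionnaire - gps) / gps * 100.0

    respondents = [
        {"trips_q": 3, "trips_gps": 4, "dist_q_km": 12.0, "dist_gps_km": 15.5},
        {"trips_q": 2, "trips_gps": 2, "dist_q_km": 8.0,  "dist_gps_km": 9.6},
        {"trips_q": 5, "trips_gps": 4, "dist_q_km": 20.0, "dist_gps_km": 26.0},
    ]

    trip_errors = [percent_error(r["trips_q"], r["trips_gps"]) for r in respondents]
    dist_errors = [percent_error(r["dist_q_km"], r["dist_gps_km"]) for r in respondents]
    print(f"mean trip-count error: {sum(trip_errors) / len(trip_errors):.1f}%")
    print(f"mean trip-distance error: {sum(dist_errors) / len(dist_errors):.1f}%")
    ```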

  10. A Quantitative Study of Oxygen as a Metabolic Regulator

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, Krishnan; LaManna, Joseph C.; Cabrera, Marco E.

    1999-01-01

    An acute reduction in oxygen (O2) delivery to a tissue is generally associated with a decrease in phosphocreatine, increases in ADP, NADH/NAD, and inorganic phosphate, increased rates of glycolysis and lactate production, and reduced rates of pyruvate and fatty acid oxidation. However, given the complexity of the human bioenergetic system and its components, it is difficult to determine quantitatively how cellular metabolic processes interact to maintain ATP homeostasis during stress (e.g., hypoxia, ischemia, and exercise). Of special interest is the determination of mechanisms relating tissue oxygenation to observed metabolic responses at the tissue, organ, and whole body levels and the quantification of how changes in tissue O2 availability affect the pathways of ATP synthesis and the metabolites that control these pathways. In this study, we extend a previously developed mathematical model of human bioenergetics to provide a physicochemical framework that permits quantitative understanding of O2 as a metabolic regulator. Specifically, the enhancement permits studying the effects of variations in tissue oxygenation and in parameters controlling the rate of cellular respiration on glycolysis, lactate production, and pyruvate oxidation. The whole body is described as a bioenergetic system consisting of metabolically distinct tissue/organ subsystems that exchange materials with the blood. In order to study the dynamic response of each subsystem to stimuli, we solve the ordinary differential equations describing the temporal evolution of metabolite levels, given the initial concentrations. The solver used in the present study is the packaged code LSODE, as implemented in the NASA Lewis kinetics and sensitivity analysis code, LSENS. A major advantage of LSENS is the efficient procedures supporting systematic sensitivity analysis, which provides the basic methods for studying parameter sensitivities (i.e., changes in model behavior due to parameter variation
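
    The study's model is a detailed system of ordinary differential equations solved with LSODE/LSENS; the sketch below only illustrates the general pattern of defining a metabolite ODE right-hand side and comparing oxygen levels, using a deliberately toy two-pool system solved with SciPy's LSODA solver. The rate constants, species and equations are invented for illustration and are not the paper's model.

    ```python
    from scipy.integrate import solve_ivp

    def toy_metabolism(t, y, k_glycolysis, k_oxidation, o2):
        """
        Toy two-pool model: pyruvate (y[0]) and lactate (y[1]) in mM.
        Oxidative removal of pyruvate scales with oxygen availability 'o2';
        part of the glycolytic flux spills into lactate when o2 is low.
        Purely illustrative, not the paper's bioenergetic model.
        """
        pyruvate, lactate = y
        production = k_glycolysis
        oxidation = k_oxidation * o2 * pyruvate
        to_lactate = k_glycolysis * (1.0 - o2) * 0.5
        d_pyruvate = production - oxidation - to_lactate
        d_lactate = to_lactate - 0.1 * lactate
        return [d_pyruvate, d_lactate]

    # Compare normoxia (o2=1.0) with hypoxia (o2=0.3); rate constants are arbitrary.
    for o2 in (1.0, 0.3):
        sol = solve_ivp(toy_metabolism, t_span=(0.0, 60.0), y0=[0.1, 0.2],
                        args=(0.05, 0.5, o2), method="LSODA")
        print(f"o2={o2}: pyruvate={sol.y[0, -1]:.3f} mM, lactate={sol.y[1, -1]:.3f} mM")
    ```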

  11. Qualitative and quantitative feedback in the context of competency-based education.

    PubMed

    Tekian, Ara; Watling, Christopher J; Roberts, Trudie E; Steinert, Yvonne; Norcini, John

    2017-12-01

    Research indicates the importance and usefulness of feedback, yet with the shift of medical curricula toward competencies, feedback is not well understood in this context. This paper attempts to identify how feedback fits within a competency-based curriculum. After careful consideration of the literature, the following conclusions are drawn: (1) Because feedback is predicated on assessment, the assessment should be designed to optimize and prevent inaccuracies in feedback; (2) Giving qualitative feedback in the form of a conversation would lend credibility to the feedback, address emotional obstacles and create a context in which feedback is comfortable; (3) Quantitative feedback in the form of individualized data could fulfill the demand for more feedback, help students devise strategies on how to improve, allow students to compare themselves to their peers, recognizing that big data have limitations; and (4) Faculty development needs to incorporate and promote cultural and systems changes with regard to feedback. A better understanding of the role of feedback in competency-based education could result in more efficient learning for students.

  12. Quantum dot nanoprobe-based quantitative analysis for prostate cancer (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Kang, Benedict J.; Jang, Gun Hyuk; Park, Sungwook; Lee, Kwan Hyi

    2016-09-01

    Prostate cancer is one of the leading causes of cancer-related death among Caucasian adult males in Europe and the United States of America. However, it has a high recovery rate when proper treatment is delivered to the patient. There are cases of over- or under-treatment which exacerbate the disease state, indicating the importance of a therapeutic approach appropriate to the stage of the disease. Recognition of these unmet needs has raised the need for stratification of the disease, and there have been attempts to stratify based on biomarker expression patterns over the course of disease progression. To closely observe these biomarker expression patterns, we propose a quantitative imaging method using fabricated quantum dot-based nanoprobes to quantify biomarker expression on the surface of prostate cancer cells. To characterize the cell lines and analyze biomarker levels, cluster of differentiation 44 (CD44), prostate specific membrane antigen (PSMA), and epithelial cell adhesion molecule (EpCAM) are used. Each selected biomarker was quantified per cell line, from which we established a biomarker signature for each prostate cancer cell line.

  13. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

    Identification of new biomarkers specific for various pathological conditions is an important field in medical sciences. Body fluids have emerging potential in biomarker studies, especially those which are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva, sweat, etc. may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies, and to present advances in quantitative body fluid proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aiming at biomarker discovery: starting from identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to the antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. The body fluids collected by non-invasive means have high relevance in medicine; they are good sources of biomarkers used in establishing diagnoses, following up disease progression and predicting high-risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus and urine for local and systemic diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. The effectiveness of clinical networks in improving quality of care and patient outcomes: a systematic review of quantitative and qualitative studies.

    PubMed

    Brown, Bernadette Bea; Patel, Cyra; McInnes, Elizabeth; Mays, Nicholas; Young, Jane; Haines, Mary

    2016-08-08

    Reorganisation of healthcare services into networks of clinical experts is increasing as a strategy to promote the uptake of evidence based practice and to improve patient care. This is reflected in significant financial investment in clinical networks. However, there is still some question as to whether clinical networks are effective vehicles for quality improvement. The aim of this systematic review was to ascertain the effectiveness of clinical networks and identify how successful networks improve quality of care and patient outcomes. A systematic search was undertaken in accordance with the PRISMA approach in Medline, Embase, CINAHL and PubMed for relevant papers between 1 January 1996 and 30 September 2014. Established protocols were used separately to examine and assess the evidence from quantitative and qualitative primary studies and then integrate findings. A total of 22 eligible studies (9 quantitative; 13 qualitative) were included. Of the quantitative studies, seven focused on improving quality of care and two focused on improving patient outcomes. Quantitative studies were limited by a lack of rigorous experimental design. The evidence indicates that clinical networks can be effective vehicles for quality improvement in service delivery and patient outcomes across a range of clinical disciplines. However, there was variability in the networks' ability to make meaningful network- or system-wide change in more complex processes such as those requiring intensive professional education or more comprehensive redesign of care pathways. Findings from qualitative studies indicated networks that had a positive impact on quality of care and patients outcomes were those that had adequate resources, credible leadership and efficient management coupled with effective communication strategies and collaborative trusting relationships. There is evidence that clinical networks can improve the delivery of healthcare though there are few high quality quantitative

  15. Efficient quantitative assessment of facial paralysis using iris segmentation and active contour-based key points detection with hybrid classifier.

    PubMed

    Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo

    2016-03-12

    Facial palsy or paralysis (FP) is a symptom characterized by the loss of voluntary muscle movement on one side of the human face, which can be very devastating for patients. Traditional methods are solely dependent on the clinician's judgment and are therefore time-consuming and subjective in nature. Hence, a quantitative assessment system is invaluable for physicians beginning the rehabilitation process, and producing a reliable and robust method is challenging and still underway. We introduce a novel approach for the quantitative assessment of facial paralysis that tackles the classification problem of FP type and degree of severity. Specifically, a novel method of quantitative assessment is presented: an algorithm that extracts the human iris and detects facial landmarks, and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining the optimized Daugman's algorithm and a Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmarks or key points. To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial feature segmentation are automatically selected. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e. rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, FP type classification, and facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach. Experiments show that the proposed method is efficient. Facial movement feature extraction from facial images based on iris segmentation and LAC-based key point detection, along with a hybrid classifier, provides a more efficient way of addressing the classification problem of facial palsy type and degree
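
    The paper's pipeline rests on iris segmentation and LAC-based landmarks plus a rule-based component; the sketch below only illustrates the final step, forming side-to-side symmetry ratios and feeding them to a regularized logistic regression, on synthetic measurements. The feature set, generated values and class labels are assumptions for illustration, not the study's data or full classifier.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def symmetry_scores(left: np.ndarray, right: np.ndarray) -> np.ndarray:
        """
        Per-feature symmetry score: ratio of the smaller to the larger measurement
        between the two sides of the face (1.0 = perfect symmetry).
        """
        return np.minimum(left, right) / np.maximum(left, right)

    # Synthetic measurements (e.g. eyebrow lift, eye closure, mouth excursion in px)
    rng = np.random.default_rng(1)
    healthy_left = rng.normal(30, 2, size=(40, 3))
    healthy_right = healthy_left + rng.normal(0, 1, size=(40, 3))
    palsy_left = rng.normal(30, 2, size=(40, 3))
    palsy_right = palsy_left * rng.uniform(0.4, 0.8, size=(40, 3))  # weaker side

    X = np.vstack([symmetry_scores(healthy_left, healthy_right),
                   symmetry_scores(palsy_left, palsy_right)])
    y = np.array([0] * 40 + [1] * 40)  # 0 = healthy, 1 = facial palsy

    clf = LogisticRegression(C=1.0).fit(X, y)  # L2-regularized logistic regression
    print("training accuracy:", clf.score(X, y))
    ```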

  16. A quantitative swab is a good non-invasive alternative to a quantitative biopsy for quantifying bacterial load in wounds healing by second intention in horses.

    PubMed

    Van Hecke, L L; Hermans, K; Haspeslagh, M; Chiers, K; Pint, E; Boyen, F; Martens, A M

    2017-07-01

    The aim of this study was to evaluate different techniques for diagnosing wound infection in wounds healing by second intention in horses and to assess the effect of a vortex and sonication protocol on quantitative bacteriology in specimens with a histologically confirmed biofilm. In 50 wounds healing by second intention, a clinical assessment, a quantitative swab, a semi-quantitative swab, and a swab for cytology were compared to a quantitative tissue biopsy (reference standard). Part of the biopsy specimen was examined histologically for evidence of a biofilm. There was a significant, high correlation (P<0.001; r=0.747) between the outcome of the quantitative swabs and the quantitative biopsies. The semi-quantitative swabs showed a significant, moderate correlation with the quantitative biopsies (P<0.001; ρ=0.524). Higher white blood cell counts for cytology were significantly associated with lower log10 colony-forming units (CFU) in the wounds (P=0.02). Wounds with black granulation tissue showed significantly higher log10 CFU (P=0.003). Specimens with biofilms did not yield higher bacteriological counts after a vortex and sonication protocol was performed to release bacteria from the biofilm. Based on these findings, a quantitative swab is an acceptable non-invasive alternative to a quantitative biopsy for quantifying bacterial load in equine wounds healing by second intention. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Data from SILAC-based quantitative analysis of lysates from mouse microglial cells treated with Withaferin A (WA).

    PubMed

    Narayan, Malathi; Seeley, Kent W; Jinwal, Umesh K

    2016-06-01

    Mass spectrometry data collected in a study analyzing the effect of withaferin A (WA) on a mouse microglial (N9) cell line is presented in this article. Data was collected from SILAC-based quantitative analysis of lysates from mouse microglial cells treated with either WA or DMSO vehicle control. This article reports all the proteins that were identified in this analysis. The data presented here is related to the published research article on the effect of WA on the differential regulation of proteins in mouse microglial cells [1]. Mass spectrometry data has also been deposited in the ProteomeXchange with the identifier PXD003032.

  18. Reverse Brain Drain of South Asian IT Professionals: A Quantitative Repatriation Study

    ERIC Educational Resources Information Center

    Suppiah, Nithiyananthan

    2014-01-01

    The purpose of the present quantitative correlational study was to examine if a relationship existed between the RBD phenomenon and cultural, economic, or political factors of the native countries of South Asian IT professionals living in the United States. The study on reverse brain drain was conducted to explore a growing phenomenon in the…

  19. Liquid chromatography-mass spectrometry-based quantitative proteomics.

    PubMed

    Linscheid, Michael W; Ahrends, Robert; Pieper, Stefan; Kühn, Andreas

    2009-01-01

    During the last decades, the molecular sciences revolutionized biomedical research and gave rise to the biotechnology industry. During the next decades, the application of the quantitative sciences--informatics, physics, chemistry, and engineering--to biomedical research will bring about the next revolution, one that will improve human healthcare and certainly create new technologies, since there is no doubt that small changes can have great effects. It is not a question of "yes" or "no," but of "how much," to make the best use of the medical options we will have. In this context, the development of accurate analytical methods must be considered a cornerstone, since the understanding of biological processes will be impossible without information about the minute changes induced in cells by interactions of cell constituents with all sorts of endogenous and exogenous influences and disturbances. The first quantitative techniques that were developed allowed the monitoring of relative changes only, but they clearly showed the significance of the information obtained. The recent advent of techniques claiming to quantify proteins and peptides not only relative to each other, but also in an absolute fashion, promises another quantum leap, since knowing absolute amounts will allow comparisons even between unrelated species, and the definition of such parameters will permit modeling biological systems much more accurately than before. To bring these promises to life, several approaches are under development at this point in time, and this review focuses on those developments.

  20. Quantitative Appearance Inspection for Film Coated Tablets.

    PubMed

    Yoshino, Hiroyuki; Yamashita, Kazunari; Iwao, Yasunori; Noguchi, Shuji; Itai, Shigeru

    2016-01-01

    The decision criteria for the physical appearance of pharmaceutical products are subjective and qualitative means of evaluation that are based entirely on human interpretation. In this study, we have developed a comprehensive method for the quantitative analysis of the physical appearance of film coated tablets. Three different kinds of film coated tablets with considerable differences in their physical appearances were manufactured as models, and their surface roughness, contact angle, color measurements and physicochemical properties were investigated as potential characteristics for the quantitative analysis of their physical appearance. All of these characteristics were useful for the quantitative evaluation of the physical appearances of the tablets, and could potentially be used to establish decision criteria to assess the quality of tablets. In particular, the analysis of the surface roughness and film coating properties of the tablets by terahertz spectroscopy allowed for an effective evaluation of the tablets' properties. These results indicated the possibility of inspecting the appearance of tablets during the film coating process.

  1. A Study of Engagement in Neighborhood-Based Child Welfare Services

    ERIC Educational Resources Information Center

    Altman, Julie Cooper

    2008-01-01

    This article reports the results of a mixed-method study that examined processes and outcomes of parent-worker engagement in child welfare. Knowledge gained from a qualitative exploration of engagement at one neighborhood-based child welfare agency informed the gathering of quantitative data from 74 different parent-worker dyads in this sequential…

  2. Experimental study on differences in clivus chordoma bone invasion: an iTRAQ-based quantitative proteomic analysis.

    PubMed

    Wu, Zhen; Wang, Liang; Guo, Zhengguang; Wang, Ke; Zhang, Yang; Tian, Kaibing; Zhang, Junting; Sun, Wei; Yu, Chunjiang

    2015-01-01

    Although chordoma is a bone tumor, significant differences in the extent of bone invasion exist in skull base chordoma; these differences directly affect the extent of surgical resection and have an impact on prognosis. However, the underlying mechanism of this phenomenon is not clearly understood. Therefore, we used an iTRAQ-based quantitative proteomics strategy to identify potential molecular signatures and to find predictive markers of the discrepancy in bone invasion of clivus chordoma. According to bone invasion classification criteria, 35 specimens of clivus chordoma were classified as either endophytic type (Type I) or exophytic type (Type II). An initial screening of six endophytic and six exophytic specimens was performed, and 250 differentially expressed proteins were identified. Through GO and IPA analysis, we found evidence that the expression of inflammatory activity-associated proteins was up-regulated in the endophytic type, whereas the expression of cell motility-associated proteins was up-regulated in the exophytic type. Moreover, the TGFβ1 and mTOR signaling pathways seemed to be related to bone invasion. Thus, TGFβ1, PI3K, Akt, mTOR, and PTEN were validated in the following 23 samples by immunohistochemistry and western blot. The expression levels of TGFβ1 and PTEN were significantly lower in the endophytic type than in the exophytic type, suggesting that TGFβ1 may play an important role in bone invasion; the mechanisms may be related to an increased inflammatory cell response and a decline in cytoskeletal protein expression. PTEN was confirmed to be associated with the degree of bone invasion. The PI3K/AKT/mTOR signaling pathway might also be associated with bone invasion, but a larger sample size is needed for verification. These results, for the first time, not only demonstrate the biological changes that occur in different growth patterns from the perspective of proteomics, but also provide novel markers that may help to reveal the mechanisms

  3. [Impact of point spread function correction in standardized uptake value quantitation for positron emission tomography images: a study based on phantom experiments and clinical images].

    PubMed

    Nakamura, Akihiro; Tanizaki, Yasuo; Takeuchi, Miho; Ito, Shigeru; Sano, Yoshitaka; Sato, Mayumi; Kanno, Toshihiko; Okada, Hiroyuki; Torizuka, Tatsuo; Nishizawa, Sadahiko

    2014-06-01

    While point spread function (PSF)-based positron emission tomography (PET) reconstruction effectively improves the spatial resolution and image quality of PET, it may damage its quantitative properties by producing edge artifacts, or Gibbs artifacts, which appear to cause overestimation of regional radioactivity concentration. In this report, we investigated how edge artifacts negatively affect the quantitative properties of PET. Experiments with a National Electrical Manufacturers Association (NEMA) phantom, containing radioactive spheres of a variety of sizes and a background filled with cold air or water, or with radioactive solutions, showed that profiles modified by edge artifacts were reproducible regardless of background μ values, and that the effects of edge artifacts increased with increasing sphere-to-background radioactivity concentration ratio (S/B ratio). Profiles were also affected by edge artifacts in a complex fashion in response to variable combinations of sphere sizes and S/B ratios, and central single-peak overestimation of up to 50% was occasionally noted in relatively small spheres with high S/B ratios. The effects of edge artifacts were obscured in spheres with low S/B ratios. In patient images with a variety of focal lesions, areas of higher radioactivity accumulation were generally more enhanced by edge artifacts, but the effects varied depending on the size of, and accumulation in, the lesion. PET images generated using PSF-based reconstruction are therefore not appropriate for the evaluation of SUV.

  4. Quantitative skeletal evaluation based on cervical vertebral maturation: a longitudinal study of adolescents with normal occlusion.

    PubMed

    Chen, L; Liu, J; Xu, T; Long, X; Lin, J

    2010-07-01

    The study aims were to investigate the correlation between vertebral shape and hand-wrist maturation and to select characteristic parameters of C2-C5 (the second to fifth cervical vertebrae) for cervical vertebral maturation determination by mixed longitudinal data. 87 adolescents (32 males, 55 females) aged 8-18 years with normal occlusion were studied. Sequential lateral cephalograms and hand-wrist radiographs were taken annually for 6 consecutive years. Lateral cephalograms were divided into 11 maturation groups according to Fishman Skeletal Maturity Indicators (SMI). 62 morphological measurements of C2-C5 at 11 different developmental stages (SMI1-11) were measured and analysed. Locally weighted scatterplot smoothing, correlation coefficient analysis and variable cluster analysis were used for statistical analysis. Of the 62 cervical vertebral parameters, 44 were positively correlated with SMI, 6 were negatively correlated and 12 were not correlated. The correlation coefficients between cervical vertebral parameters and SMI were relatively high. Characteristic parameters for quantitative analysis of cervical vertebral maturation were selected. In summary, cervical vertebral maturation could be used reliably to evaluate the skeletal stage instead of the hand-wrist radiographic method. Selected characteristic parameters offered a simple and objective reference for the assessment of skeletal maturity and timing of orthognathic surgery. Copyright 2010 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  5. Quantitative Agent Based Model of Opinion Dynamics: Polish Elections of 2015

    PubMed Central

    Sobkowicz, Pawel

    2016-01-01

    We present results of an abstract, agent-based model of opinion dynamics simulations based on the emotion/information/opinion (E/I/O) approach, applied to a strongly polarized society, corresponding to the Polish political scene between 2005 and 2015. Under certain conditions the model leads to metastable coexistence of two subcommunities of comparable size (supporting the corresponding opinions)—which corresponds to the bipartisan split found in Poland. Spurred by the recent breakdown of this political duopoly, which occurred in 2015, we present a model extension that describes both the long-term coexistence of the two opposing opinions and a rapid, transitory change due to the appearance of a third-party alternative. We provide a quantitative comparison of the model with the results of polls and elections in Poland, testing the assumptions related to the modeled processes and the parameters used in the simulations. It is shown that, when the propaganda messages of the two incumbent parties differ in emotional tone, the political status quo may be unstable. The asymmetry of the emotions within the support bases of the two parties allows one of them to be ‘invaded’ by a newcomer third party very quickly, while the second remains immune to such invasion. PMID:27171226

  6. Evaluation of airway protection: Quantitative timing measures versus penetration/aspiration score.

    PubMed

    Kendall, Katherine A

    2017-10-01

    Quantitative measures of swallowing function may improve the reliability and accuracy of modified barium swallow (MBS) study interpretation. Quantitative study analysis has not been widely instituted, however, secondary to concerns about the time required to make measures and a lack of research demonstrating impact on MBS interpretation. This study compares the accuracy of the penetration/aspiration (PEN/ASP) scale (an observational visual-perceptual assessment tool) to quantitative measures of airway closure timing relative to the arrival of the bolus at the upper esophageal sphincter in identifying a failure of airway protection during deglutition. Retrospective review of clinical swallowing data from a university-based outpatient clinic. Swallowing data from 426 patients were reviewed. Patients with normal PEN/ASP scores were identified, and the results of quantitative airway closure timing measures for three liquid bolus sizes were evaluated. The incidence of significant airway closure delay with and without a normal PEN/ASP score was determined. Inter-rater reliability for the quantitative measures was calculated. In patients with a normal PEN/ASP score, 33% demonstrated a delay in airway closure on at least one swallow during the MBS study. There was no correlation between PEN/ASP score and airway closure delay. Inter-rater reliability for the quantitative measure of airway closure timing was nearly perfect (intraclass correlation coefficient = 0.973). The use of quantitative measures of swallowing function, in conjunction with traditional visual perceptual methods of MBS study interpretation, improves the identification of airway closure delay, and hence, potential aspiration risk, even when no penetration or aspiration is apparent on the MBS study. Level of evidence: 4. Laryngoscope, 127:2314-2318, 2017. © 2017 The American Laryngological, Rhinological and Otological Society, Inc.

  7. Outdoor blue spaces, human health and well-being: A systematic review of quantitative studies.

    PubMed

    Gascon, Mireia; Zijlema, Wilma; Vert, Cristina; White, Mathew P; Nieuwenhuijsen, Mark J

    2017-11-01

    A growing number of quantitative studies have investigated the potential benefits of outdoor blue spaces (lakes, rivers, the sea, etc.) for human health, but there is not yet a systematic review synthesizing this evidence. To systematically review the current quantitative evidence on the human health and well-being benefits of outdoor blue spaces. Following PRISMA guidelines for reporting systematic reviews and meta-analyses, observational and experimental quantitative studies focusing on both residential and non-residential outdoor blue space exposure were searched using specific keywords. In total 35 studies were included in the current systematic review, most of them being classified as of "good quality" (N=22). The balance of evidence suggested a positive association between greater exposure to outdoor blue spaces and both benefits to mental health and well-being (N=12 studies) and levels of physical activity (N=13 studies). The evidence of an association between outdoor blue space exposure and general health (N=6 studies), obesity (N=8 studies) and cardiovascular (N=4 studies) and related outcomes was less consistent. Although encouraging, there remain relatively few studies and a large degree of heterogeneity in terms of study design, exposure metrics and outcome measures, making synthesis difficult. Further research is needed using longitudinal designs and natural experiments, preferably across a broader range of countries, to better understand the causal associations between blue spaces, health and well-being. Copyright © 2017 Elsevier GmbH. All rights reserved.

  8. EFFECTIVE REMOVAL METHOD OF ILLEGAL PARKING BICYCLES BASED ON THE QUANTITATIVE CHANGE AFTER REMOVAL

    NASA Astrophysics Data System (ADS)

    Toi, Satoshi; Kajita, Yoshitaka; Nishikawa, Shuichirou

    This study aims to identify an effective method for removing illegally parked bicycles, based on an analysis of how the number of illegally parked bicycles changes after removal. We built a time-and-space quantitative distribution model of illegally parked bicycles after removal that considers the logistic increase of illegally parked bicycles and several behaviors concerning direct return or indirect return to the original parking place and avoidance of the original parking place, based on a survey of actual illegal bicycle parking in the TENJIN area of FUKUOKA city. Moreover, we built a simulation model incorporating the above model and calculated the number of illegally parked bicycles when the removal frequency and the number of bicycles removed at one time are varied. Four notable results were obtained: (1) the speed of recovery after removal of illegally parked bicycles differs by zone; (2) thorough removal is effective for keeping the number of illegally parked bicycles at a lower level; (3) removal in one zone causes an increase of bicycles in other zones where the level of illegal parking is lower; and (4) the relationship between the effects and costs of removing illegally parked bicycles was clarified.
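
    The sketch below is a minimal, hypothetical rendering of the simulation idea described above: logistic growth of illegally parked bicycles in a single zone punctuated by periodic removal events. All parameter values (growth rate, capacity, removal schedule) are invented for illustration and are not taken from the study.

        # Minimal sketch (hypothetical parameters): logistic growth of illegally
        # parked bicycles in one zone, with a fixed number removed periodically.
        def simulate(days, r=0.5, capacity=300, removal_interval=7,
                     removed_per_event=200, n0=50):
            counts, n = [], float(n0)
            for day in range(days):
                n += r * n * (1.0 - n / capacity)          # logistic recovery
                if (day + 1) % removal_interval == 0:
                    n = max(0.0, n - removed_per_event)    # enforcement event
                counts.append(n)
            return counts

        if __name__ == "__main__":
            weekly = simulate(days=28, removal_interval=7)
            frequent = simulate(days=28, removal_interval=2)
            print(f"mean with weekly removal:   {sum(weekly) / len(weekly):.0f}")
            print(f"mean with frequent removal: {sum(frequent) / len(frequent):.0f}")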

  9. Comparison of methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles.

    PubMed

    Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E

    2013-01-01

    To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level; clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006 to 2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.

  10. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low and high density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques to increase reliability for microplastic analysis. Particle size was shown to have a particular influence on the qualitative and quantitative performance of the DSC signals. Both identification (based on the characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is particularly required for this analytical approach.
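
    The following sketch illustrates the two DSC-based steps described above, identification from the characteristic onset temperature and mass quantitation from the integrated melting enthalpy, under the simplifying assumption of ideal reference materials. The reference onsets and fusion enthalpies are rough placeholder values, not figures from the study.

        # Illustrative sketch only: identify a polymer by its nearest reference DSC
        # onset temperature, then estimate its mass from the measured melting
        # enthalpy. Reference onsets (degC) and specific fusion enthalpies (J/g)
        # are placeholder, literature-style values, not values from the study.
        REFERENCE = {
            "LDPE": {"onset_C": 105.0, "dH_fus_J_per_g": 140.0},
            "HDPE": {"onset_C": 125.0, "dH_fus_J_per_g": 180.0},
            "PP":   {"onset_C": 160.0, "dH_fus_J_per_g": 100.0},
            "PET":  {"onset_C": 245.0, "dH_fus_J_per_g": 120.0},
        }

        def identify(onset_C):
            """Nearest-onset classification of the melting endotherm."""
            return min(REFERENCE, key=lambda p: abs(REFERENCE[p]["onset_C"] - onset_C))

        def estimate_mass_mg(polymer, peak_enthalpy_mJ):
            """Mass (mg) = integrated peak enthalpy (mJ) / fusion enthalpy (J/g)."""
            return peak_enthalpy_mJ / REFERENCE[polymer]["dH_fus_J_per_g"]

        polymer = identify(onset_C=126.3)
        print(f"{polymer}: estimated mass {estimate_mass_mg(polymer, 36.0):.2f} mg")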

  11. Marker-based quantitative genetics in the wild?: the heritability and genetic correlation of chemical defenses in eucalyptus.

    PubMed

    Andrew, R L; Peakall, R; Wallis, I R; Wood, J T; Knight, E J; Foley, W J

    2005-12-01

    Marker-based methods for estimating heritability and genetic correlation in the wild have attracted interest because traditional methods may be impractical or introduce bias via G x E effects, mating system variation, and sampling effects. However, they have not been widely used, especially in plants. A regression-based approach, which uses a continuous measure of genetic relatedness, promises to be particularly appropriate for use in plants with mixed-mating systems and overlapping generations. Using this method, we found significant narrow-sense heritability of foliar defense chemicals in a natural population of Eucalyptus melliodora. We also demonstrated a genetic basis for the phenotypic correlation underlying an ecological example of conditioned flavor aversion involving different biosynthetic pathways. Our results revealed that heritability estimates depend on the spatial scale of the analysis in a way that offers insight into the distribution of genetic and environmental variance. This study is the first to successfully use a marker-based method to measure quantitative genetic parameters in a tree. We suggest that this method will prove to be a useful tool in other studies and offer some recommendations for future applications of the method.

  12. Sexing chick mRNA: A protocol based on quantitative real-time polymerase chain reaction.

    PubMed

    Wan, Z; Lu, Y; Rui, L; Yu, X; Li, Z

    2017-03-01

    The accurate identification of sex in birds is important for research on avian sex determination and differentiation. Polymerase chain reaction (PCR)-based methods have been widely applied for the molecular sexing of birds. However, these methods have used genomic DNA. Here, we present the first sexing protocol for chick mRNA based on real-time quantitative PCR. We demonstrate that this method can accurately determine sex using mRNA from chick gonads and other tissues, such as heart, liver, spleen, lung, and muscle. The strategy of this protocol also may be suitable for other species in which sex is determined by the inheritance of sex chromosomes (ZZ male and ZW female). © 2016 Poultry Science Association Inc.

  13. Quantitative inactivation-mechanisms of P. digitatum and A. niger spores based on atomic oxygen dose

    NASA Astrophysics Data System (ADS)

    Ito, Masafumi; Hashizume, Hiroshi; Ohta, Takayuki; Hori, Masaru

    2014-10-01

    We have quantitatively investigated the inactivation mechanisms of Penicillium digitatum and Aspergillus niger spores using an atmospheric-pressure radical source. The radical source was specially developed to supply only neutral radicals, without charged species or UV-light emission. Reactive oxygen radical densities, such as ground-state oxygen atoms, excited-state oxygen molecules and ozone, were measured using VUV and UV absorption spectroscopy. The measurements and the spore treatments were carried out in an Ar-purged chamber to eliminate the influence of OH, NOx and other species. The results revealed that the inactivation of spores can be explained by the atomic-oxygen dose under conditions employing neutral ROS irradiation. On the basis of the dose, we observed changes in intracellular organelles and membrane functions using TEM, SEM and confocal laser fluorescence microscopy. From these results, we discuss the detailed inactivation mechanisms quantitatively based on atomic-oxygen dose.

  14. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Recombinant plasmid-based quantitative Real-Time PCR analysis of Salmonella enterica serotypes and its application to milk samples.

    PubMed

    Gokduman, Kurtulus; Avsaroglu, M Dilek; Cakiris, Aris; Ustek, Duran; Gurakan, G Candan

    2016-03-01

    The aim of the current study was to develop a new, rapid, sensitive and quantitative Salmonella detection method using a Real-Time PCR technique based on an inexpensive, easy-to-produce, convenient and standardized recombinant plasmid positive control. To achieve this, two recombinant plasmids were constructed as reference molecules by cloning the two most commonly used Salmonella-specific target gene regions, invA and ttrRSBC. The more rapid detection enabled by the developed method (21 h) compared with the traditional culture method (90 h) allows the quantitative evaluation of Salmonella (quantification limits of 10^1 CFU/ml for the invA target and 10^0 CFU/ml for the ttrRSBC target), as illustrated using milk samples. Three advantages illustrated by the current study demonstrate the potential of the newly developed method for routine analyses in the medical, veterinary, food and water/environmental sectors: (I) the method provides fast analyses, including the simultaneous detection and determination of correct pathogen counts; (II) the method is applicable to challenging samples, such as milk; (III) the method's positive controls (recombinant plasmids) are reproducible in large quantities without the need to construct new calibration curves. Copyright © 2016 Elsevier B.V. All rights reserved.
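
    A minimal sketch of the quantification step implied by a plasmid-based positive control: a standard curve of Cq against log10 copy number is fitted to a plasmid dilution series and used to back-calculate unknowns. All Cq values below are invented; the assay-specific curve would come from the invA or ttrRSBC plasmid controls.

        # Hypothetical sketch: build a standard curve (Cq vs. log10 copies) from
        # serial dilutions of a recombinant-plasmid control, then back-calculate
        # the copy number of an unknown sample from its measured Cq.
        import numpy as np

        log10_copies = np.array([6, 5, 4, 3, 2, 1], dtype=float)        # dilution series
        cq_values    = np.array([16.1, 19.5, 22.8, 26.2, 29.7, 33.1])   # made-up Cq

        slope, intercept = np.polyfit(log10_copies, cq_values, deg=1)
        efficiency = 10 ** (-1.0 / slope) - 1.0          # amplification efficiency

        def quantify(cq):
            """Return estimated copies per reaction for an observed Cq."""
            return 10 ** ((cq - intercept) / slope)

        print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
        print(f"unknown sample at Cq=24.9 -> {quantify(24.9):.0f} copies/reaction")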

  16. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data

    NASA Astrophysics Data System (ADS)

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-01

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.

  17. Machine learning-based kinetic modeling: a robust and reproducible solution for quantitative analysis of dynamic PET data.

    PubMed

    Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia

    2017-05-07

    A variety of compartment models are used for the quantitative analysis of dynamic positron emission tomography (PET) data. Traditionally, these models use an iterative fitting (IF) method to find the least squares between the measured and calculated values over time, which may encounter some problems such as the overfitting of model parameters and a lack of reproducibility, especially when handling noisy data or error data. In this paper, a machine learning (ML) based kinetic modeling method is introduced, which can fully utilize a historical reference database to build a moderate kinetic model directly dealing with noisy data but not trying to smooth the noise in the image. Also, due to the database, the presented method is capable of automatically adjusting the models using a multi-thread grid parameter searching technique. Furthermore, a candidate competition concept is proposed to combine the advantages of the ML and IF modeling methods, which could find a balance between fitting to historical data and to the unseen target curve. The machine learning based method provides a robust and reproducible solution that is user-independent for VOI-based and pixel-wise quantitative analysis of dynamic PET data.
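
    As a toy illustration of the grid-parameter-searching idea (not the authors' ML framework or reference database), the sketch below fits a one-tissue compartment model to a noisy tissue time-activity curve by exhaustive search over (K1, k2); the input function, sampling grid and parameter ranges are all hypothetical.

        # Sketch only: fit a one-tissue compartment model, C_t(t) = K1*exp(-k2*t)
        # convolved with the plasma input C_p(t), by grid search over (K1, k2).
        import numpy as np

        t = np.linspace(0.0, 60.0, 121)                  # minutes; dt = 0.5
        cp = 10.0 * t * np.exp(-t / 3.0)                 # toy plasma input function

        def model_tac(K1, k2, t, cp):
            """Discrete convolution of the input function with K1*exp(-k2*t)."""
            dt = t[1] - t[0]
            irf = K1 * np.exp(-k2 * t)
            return np.convolve(cp, irf)[: len(t)] * dt

        rng = np.random.default_rng(1)
        measured = model_tac(0.12, 0.08, t, cp) + rng.normal(0.0, 0.2, t.size)

        K1_grid = np.linspace(0.01, 0.5, 50)
        k2_grid = np.linspace(0.01, 0.5, 50)
        best = min(((K1, k2) for K1 in K1_grid for k2 in k2_grid),
                   key=lambda p: np.sum((measured - model_tac(p[0], p[1], t, cp)) ** 2))
        print("grid-search estimate of (K1, k2):", tuple(round(v, 3) for v in best))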

  18. Synthesis, quantitative structure-property relationship study of novel fluorescence active 2-pyrazolines and application.

    PubMed

    Girgis, Adel S; Basta, Altaf H; El-Saied, Houssni; Mohamed, Mohamed A; Bedair, Ahmad H; Salim, Ahmad S

    2018-03-01

    A variety of fluorescence-active fluorinated pyrazolines 13-33 was synthesized in good yields through cyclocondensation reaction of propenones 1-9 with aryl hydrazines 10-12 . Some of the synthesized compounds provided promising fluorescence properties with quantum yield ( Φ ) higher than that of quinine sulfate (standard reference). Quantitative structure-property relationship studies were undertaken supporting the exhibited fluorescence properties and estimating the parameters governing properties. Five synthesized fluorescence-active pyrazolines ( 13 , 15 , 18 , 19 and 23 ) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These investigated fluorescence compounds, especially compounds 19 and 23 , provide improvements in strength properties of paper sheets. Based on the observed performance they can be used as markers in security documents.

  19. Synthesis, quantitative structure-property relationship study of novel fluorescence active 2-pyrazolines and application

    NASA Astrophysics Data System (ADS)

    Girgis, Adel S.; Basta, Altaf H.; El-Saied, Houssni; Mohamed, Mohamed A.; Bedair, Ahmad H.; Salim, Ahmad S.

    2018-03-01

    A variety of fluorescence-active fluorinated pyrazolines 13-33 was synthesized in good yields through cyclocondensation reaction of propenones 1-9 with aryl hydrazines 10-12. Some of the synthesized compounds provided promising fluorescence properties with quantum yield (Φ) higher than that of quinine sulfate (standard reference). Quantitative structure-property relationship studies were undertaken supporting the exhibited fluorescence properties and estimating the parameters governing properties. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These investigated fluorescence compounds, especially compounds 19 and 23, provide improvements in strength properties of paper sheets. Based on the observed performance they can be used as markers in security documents.

  20. Prospective evaluation of risk of vertebral fractures using quantitative ultrasound measurements and bone mineral density in a population-based sample of postmenopausal women: results of the Basel Osteoporosis Study.

    PubMed

    Hollaender, R; Hartl, F; Krieg, M-A; Tyndall, A; Geuckel, C; Buitrago-Tellez, C; Manghani, M; Kraenzlin, M; Theiler, R; Hans, D

    2009-03-01

    Prospective studies have shown that quantitative ultrasound (QUS) techniques predict the risk of fracture of the proximal femur with standardised risk ratios similar to those of dual-energy x-ray absorptiometry (DXA). Few studies have investigated these devices for the prediction of vertebral fractures. The Basel Osteoporosis Study (BOS) is a population-based prospective study to assess the performance of QUS devices and DXA in predicting incident vertebral fractures. 432 women aged 60-80 years were followed up for 3 years. Incident vertebral fractures were assessed radiologically. Bone measurements using DXA (spine and hip) and QUS measurements (calcaneus and proximal phalanges) were performed. Measurements were assessed for their value in predicting incident vertebral fractures using logistic regression. QUS measurements at the calcaneus and DXA measurements discriminated between women with and without incident vertebral fracture (20% height reduction). The relative risks (RRs) for vertebral fracture, adjusted for age, were 2.3 for the Stiffness Index (SI) and 2.8 for the Quantitative Ultrasound Index (QUI) at the calcaneus and 2.0 for bone mineral density at the lumbar spine. The predictive value (AUC (95% CI)) of QUS measurements at the calcaneus remained highly significant (0.70 for SI, 0.72 for the QUI, and 0.67 for DXA at the lumbar spine) even after adjustment for other confounding variables. QUS of the calcaneus and bone mineral density measurements were shown to be significant predictors of incident vertebral fracture. The RRs for QUS measurements at the calcaneus are of similar magnitude to those for DXA measurements.
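
    The sketch below mimics the style of analysis reported (an age-adjusted logistic regression of incident vertebral fracture on a standardized QUS index, summarized by an odds ratio per SD and an AUC) on simulated data; the coefficients and cohort are invented and do not reproduce the study's results.

        # Hypothetical sketch: age-adjusted logistic regression of incident
        # vertebral fracture on a standardized QUS Stiffness Index, with
        # discrimination summarized by the area under the ROC curve (AUC).
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 432
        age = rng.uniform(60, 80, n)
        stiffness = rng.normal(85, 15, n)                # toy QUS Stiffness Index (SI)
        logit = -3.0 + 0.05 * (age - 70) - 0.04 * (stiffness - 85)
        fracture = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

        si_sd = (stiffness - stiffness.mean()) / stiffness.std()
        age_sd = (age - age.mean()) / age.std()
        X = np.column_stack([si_sd, age_sd])             # SI in SD units, age-adjusted
        model = LogisticRegression().fit(X, fracture)
        or_per_sd_decrease = np.exp(-model.coef_[0][0])  # lower QUS -> higher risk
        auc = roc_auc_score(fracture, model.predict_proba(X)[:, 1])
        print(f"OR per SD decrease in SI: {or_per_sd_decrease:.2f}, AUC: {auc:.2f}")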

  1. Quantitative strain and compositional studies of InxGa1-xAs Epilayer in a GaAs-based pHEMT device structure by TEM techniques.

    PubMed

    Sridhara Rao, Duggi V; Sankarasubramanian, Ramachandran; Muraleedharan, Kuttanellore; Mehrtens, Thorsten; Rosenauer, Andreas; Banerjee, Dipankar

    2014-08-01

    In GaAs-based pseudomorphic high-electron mobility transistor device structures, strain and composition of the InxGa1-xAs channel layer are very important as they influence the electronic properties of these devices. In this context, transmission electron microscopy techniques such as (002) dark-field imaging, high-resolution transmission electron microscopy (HRTEM) imaging, scanning transmission electron microscopy-high angle annular dark field (STEM-HAADF) imaging and selected area diffraction, are useful. A quantitative comparative study using these techniques is relevant for assessing the merits and limitations of the respective techniques. In this article, we have investigated strain and composition of the InxGa1-xAs layer with the mentioned techniques and compared the results. The HRTEM images were investigated with strain state analysis. The indium content in this layer was quantified by HAADF imaging and correlated with STEM simulations. The studies showed that the InxGa1-xAs channel layer was pseudomorphically grown leading to tetragonal strain along the [001] growth direction and that the average indium content (x) in the epilayer is ~0.12. We found consistency in the results obtained using various methods of analysis.

  2. Speleogenesis, geometry, and topology of caves: A quantitative study of 3D karst conduits

    NASA Astrophysics Data System (ADS)

    Jouves, Johan; Viseur, Sophie; Arfib, Bruno; Baudement, Cécile; Camus, Hubert; Collon, Pauline; Guglielmi, Yves

    2017-12-01

    Karst systems are hierarchically spatially organized three-dimensional (3D) networks of conduits behaving as drains for groundwater flow. Recently, geostatistical approaches have been proposed to generate karst networks from data and parameters stemming from analogous observed karst features. Other studies have qualitatively highlighted relationships between speleogenetic processes and cave patterns. However, few studies have been performed to quantitatively define these relationships. This paper reports a quantitative study of cave geometries and topologies that takes the underlying speleogenetic processes into account. In order to study the spatial organization of caves, a 3D numerical database was built from 26 caves, corresponding to 621 km of cumulative cave passages representative of the variety of karst network patterns. The database includes 3D speleological surveys for which the speleogenetic context is known, allowing the polygenic karst networks to be divided into 48 monogenic cave samples and classified into four cave patterns: vadose branchwork (VB), water-table cave (WTC), looping cave (LC), and angular maze (AM). Eight morphometric cave descriptors were calculated: four geometrical parameters (width-height ratio, tortuosity, curvature, and vertical index) and four topological ones (degree of node connectivity, α and γ graph indices, and ramification index). The results were validated by statistical analyses (Kruskal-Wallis test and PCA). The VB patterns are clearly distinct from AM ones and from a third group including WTC and LC. A quantitative database of cave morphology characteristics is provided, depending on their speleogenetic processes. These characteristics can be used to constrain and/or validate 3D geostatistical simulations. This study shows how important it is to relate the geometry and connectivity of cave networks to recharge and flow processes. Conversely, the approach developed here provides proxies to estimate the evolution of
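
    Two of the descriptor families mentioned above can be illustrated on a toy conduit graph, as in the sketch below: a γ-type graph index in its common planar form E/(3(N-2)) and a simple 3D tortuosity (path length divided by the straight-line distance between endpoints). The graph, coordinates and exact index definitions are illustrative assumptions, not the authors' dataset or code.

        # Illustrative sketch (not the authors' code): gamma graph index and path
        # tortuosity computed for a toy 3D conduit network using networkx.
        import networkx as nx
        import numpy as np

        G = nx.Graph()
        nodes = {0: (0, 0, 0), 1: (10, 0, -2), 2: (20, 5, -4),
                 3: (20, -5, -3), 4: (30, 0, -6)}
        for n, xyz in nodes.items():
            G.add_node(n, pos=np.array(xyz, dtype=float))
        G.add_edges_from([(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)])

        def gamma_index(G):
            """Common planar-graph form: edges over the maximum possible edges."""
            E, N = G.number_of_edges(), G.number_of_nodes()
            return E / (3.0 * (N - 2))

        def tortuosity(G, path):
            """3D path length divided by straight-line distance between endpoints."""
            pos = nx.get_node_attributes(G, "pos")
            length = sum(np.linalg.norm(pos[a] - pos[b]) for a, b in zip(path, path[1:]))
            return length / np.linalg.norm(pos[path[-1]] - pos[path[0]])

        path = nx.shortest_path(G, 0, 4)
        print(f"gamma: {gamma_index(G):.2f}, tortuosity of {path}: {tortuosity(G, path):.2f}")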

  3. Quantitative Decision Making.

    ERIC Educational Resources Information Center

    Baldwin, Grover H.

    The use of quantitative decision making tools provides the decision maker with a range of alternatives among which to decide, permits acceptance and use of the optimal solution, and decreases risk. Training line administrators in the use of these tools can help school business officials obtain reliable information upon which to base district…

  4. Kinetic quantitation of cerebral PET-FDG studies without concurrent blood sampling: statistical recovery of the arterial input function.

    PubMed

    O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A

    2010-03-01

    Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which the information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of (18)F -Fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves are used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetics analyses resulting from these three AIFs are compared with those resulting from radially sampled AIFs. The proposed penalty-based AIF extraction method is found to

  5. Quality control for quantitative PCR based on amplification compatibility test.

    PubMed

    Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W

    2010-04-01

    Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by the generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used to calculate Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods using only the sample-specific amplification efficiency as a reporter of compatibility and demonstrate improved identification performance using the multivariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
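
    The sketch below conveys the general idea in a deliberately simplified, univariate form (the paper's method is multivariate and model-based): an efficiency-related slope is extracted from the steepest part of each log-amplification curve, and test reactions are scored against a reference set with a Z-score. The curves, window size and parameter choice are invented for illustration.

        # Simplified, univariate sketch of the compatibility-test idea: score each
        # reaction's efficiency-related parameter against a reference set.
        import numpy as np

        def log_phase_slope(fluorescence, window=5):
            """Slope of log-fluorescence over the steepest `window`-cycle stretch,
            used here as a crude efficiency-related parameter."""
            logf = np.log(np.asarray(fluorescence, dtype=float))
            slopes = [np.polyfit(np.arange(window), logf[i:i + window], 1)[0]
                      for i in range(logf.size - window)]
            return max(slopes)

        def z_scores(test_curves, reference_curves):
            ref = np.array([log_phase_slope(c) for c in reference_curves])
            return [(log_phase_slope(c) - ref.mean()) / ref.std(ddof=1)
                    for c in test_curves]

        rng = np.random.default_rng(3)
        cycles = np.arange(40)
        def curve(midpoint, scale):                      # toy sigmoid amplification curve
            sig = 1.0 / (1.0 + np.exp(-(cycles - midpoint) / scale))
            return 0.05 + sig + rng.normal(0.0, 0.005, cycles.size)

        references = [curve(20.0, 1.5) for _ in range(8)]    # calibration-like reactions
        inhibited = curve(26.0, 4.0)                         # shallower exponential phase
        print([f"{z:+.1f}" for z in z_scores([references[0], inhibited], references)])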

  6. Measuring the Beginning: A Quantitative Study of the Transition to Higher Education

    ERIC Educational Resources Information Center

    Brooman, Simon; Darwent, Sue

    2014-01-01

    This quantitative study measures change in certain factors known to influence success of first-year students during the transition to higher education: self-efficacy, autonomous learning and social integration. A social integration scale was developed with three subscales: "sense of belonging", "relationship with staff" and…

  7. Conceptual Diversity, Moderators, and Theoretical Issues in Quantitative Studies of Cultural Capital Theory

    ERIC Educational Resources Information Center

    Tan, Cheng Yong

    2017-01-01

    The present study reviewed quantitative empirical studies examining the relationship between cultural capital and student achievement. Results showed that researchers had conceptualized and measured cultural capital in different ways. It is argued that the more holistic understanding of the construct beyond highbrow cultural consumption must be…

  8. Contribution of Insula in Parkinson’s Disease: A Quantitative Meta-Analysis Study

    PubMed Central

    Criaud, Marion; Christopher, Leigh; Boulinguez, Philippe; Ballanger, Benedicte; Lang, Anthony E.; Cho, Sang S.; Houle, Sylvain; Strafella, Antonio P.

    2016-01-01

    The insula region is known to be an integrating hub interacting with multiple brain networks involved in cognitive, affective, sensory, and autonomic processes. There is growing evidence suggesting that this region may have an important role in Parkinson’s disease (PD). Thus, to investigate the functional organization of the insular cortex and its potential role in parkinsonian features, we used a coordinate-based quantitative meta-analysis approach, the activation likelihood estimation. A total of 132 insular foci were selected from 96 published experiments comprising the five functional categories: cognition, affective/behavioral symptoms, bodily awareness/autonomic function, sensorimotor function, and nonspecific resting functional changes associated with the disease. We found a significant convergence of activation maxima related to PD in different insular regions including anterior and posterior regions bilaterally. This study provides evidence of an important functional distribution of different domains within the insular cortex in PD, particularly in relation to nonmotor aspects, with an influence of medication effect. PMID:26800238

  9. Affordable, automatic quantitative fall risk assessment based on clinical balance scales and Kinect data.

    PubMed

    Colagiorgio, P; Romano, F; Sardi, F; Moraschini, M; Sozzi, A; Bejor, M; Ricevuti, G; Buizza, A; Ramat, S

    2014-01-01

    The problem of correct fall risk assessment is becoming more and more critical with the ageing of the population. In spite of the available approaches allowing a quantitative analysis of the human movement control system's performance, the clinical assessment and diagnostic approach to fall risk assessment still relies mostly on non-quantitative exams, such as clinical scales. This work documents our current effort to develop a novel method to assess balance control abilities through a system implementing an automatic evaluation of exercises drawn from balance assessment scales. Our aim is to overcome the classical limits characterizing these scales, i.e., limited granularity and limited inter-/intra-examiner reliability, to obtain objective scores and more detailed information allowing fall risk to be predicted. We used Microsoft Kinect to record subjects' movements while performing challenging exercises drawn from clinical balance scales. We then computed a set of parameters quantifying the execution of the exercises and fed them to a supervised classifier to perform a classification based on the clinical score. We obtained a good accuracy (~82%) and especially a high sensitivity (~83%).
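
    As a hypothetical sketch of the final classification step, the code below feeds invented Kinect-derived movement parameters (the feature names are assumptions, not the study's parameter set) to a supervised classifier trained against binary risk labels and reports cross-validated accuracy.

        # Hypothetical sketch: invented movement parameters -> supervised classifier,
        # evaluated with cross-validation (not the authors' feature set or model).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(4)
        n = 120
        features = np.column_stack([
            rng.normal(4.0, 1.5, n),    # e.g. trunk sway amplitude (cm) -- assumed
            rng.normal(12.0, 4.0, n),   # e.g. exercise completion time (s) -- assumed
            rng.normal(0.6, 0.2, n),    # e.g. mean centre-of-mass velocity (m/s) -- assumed
        ])
        risk = (features[:, 0] + 0.3 * features[:, 1] + rng.normal(0, 1, n) > 8.5).astype(int)

        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(clf, features, risk, cv=5)
        print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")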

  10. Nuclear medicine and imaging research (quantitative studies in radiopharmaceutical science)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1990-09-01

    This is a report of progress in Year Two (January 1, 1990--December 31, 1990) of Grant FG02-86ER60438, "Quantitative Studies in Radiopharmaceutical Science," awarded for the three-year period January 1, 1989--December 31, 1991 as a competitive renewal following a site visit in the fall of 1988. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. 25 refs., 13 figs., 1 tab.

  11. Quantitative Amyloid Imaging in Autosomal Dominant Alzheimer's Disease: Results from the DIAN Study Group.

    PubMed

    Su, Yi; Blazey, Tyler M; Owen, Christopher J; Christensen, Jon J; Friedrichsen, Karl; Joseph-Mathurin, Nelly; Wang, Qing; Hornbeck, Russ C; Ances, Beau M; Snyder, Abraham Z; Cash, Lisa A; Koeppe, Robert A; Klunk, William E; Galasko, Douglas; Brickman, Adam M; McDade, Eric; Ringman, John M; Thompson, Paul M; Saykin, Andrew J; Ghetti, Bernardino; Sperling, Reisa A; Johnson, Keith A; Salloway, Stephen P; Schofield, Peter R; Masters, Colin L; Villemagne, Victor L; Fox, Nick C; Förster, Stefan; Chen, Kewei; Reiman, Eric M; Xiong, Chengjie; Marcus, Daniel S; Weiner, Michael W; Morris, John C; Bateman, Randall J; Benzinger, Tammie L S

    2016-01-01

    Amyloid imaging plays an important role in the research and diagnosis of dementing disorders. Substantial variation in quantitative methods to measure brain amyloid burden exists in the field. The aim of this work is to investigate the impact of methodological variations on the quantification of amyloid burden using data from the Dominantly Inherited Alzheimer's Network (DIAN), an autosomal dominant Alzheimer's disease population. Cross-sectional and longitudinal [11C]-Pittsburgh Compound B (PiB) PET imaging data from the DIAN study were analyzed. Four candidate reference regions were investigated for estimation of brain amyloid burden. A regional spread function based technique was also investigated for the correction of partial volume effects. Cerebellar cortex, brainstem, and white matter regions all had stable tracer retention during the course of disease. Partial volume correction consistently improved sensitivity to group differences and longitudinal changes over time. White matter referencing improved statistical power in detecting longitudinal changes in relative tracer retention; however, the reason for this improvement is unclear and requires further investigation. Full dynamic acquisition and kinetic modeling improved statistical power, although they may add cost and time. Several technical variations to amyloid burden quantification were examined in this study. Partial volume correction emerged as the strategy that most consistently improved statistical power for the detection of both longitudinal changes and across-group differences. For the autosomal dominant Alzheimer's disease population with PiB imaging, utilizing the brainstem as a reference region with partial volume correction may be optimal for current interventional trials. Further investigation of technical issues in quantitative amyloid imaging in different study populations using different amyloid imaging tracers is warranted.
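
    The core quantity behind these comparisons is a target-to-reference uptake ratio; the minimal sketch below computes such an SUVR from a toy PET volume and two boolean region masks (e.g. a cortical target and a cerebellar, brainstem or white-matter reference). The arrays and masks are synthetic.

        # Minimal sketch (synthetic data): standardized uptake value ratio (SUVR) =
        # mean uptake in a target region / mean uptake in a reference region.
        import numpy as np

        def suvr(pet_image, target_mask, reference_mask):
            return pet_image[target_mask].mean() / pet_image[reference_mask].mean()

        rng = np.random.default_rng(5)
        pet = rng.gamma(shape=2.0, scale=1.0, size=(64, 64, 64))   # toy PET volume
        target = np.zeros(pet.shape, dtype=bool)
        target[20:30, 20:30, 20:30] = True                         # toy cortical ROI
        reference = np.zeros(pet.shape, dtype=bool)
        reference[40:50, 40:50, 40:50] = True                      # toy reference ROI
        print(f"SUVR: {suvr(pet, target, reference):.2f}")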

  12. Characteristics of quantitative nursing research from 1990 to 2010.

    PubMed

    Yarcheski, Adela; Mahon, Noreen E

    2013-12-01

    To assess author credentials of quantitative research in nursing, the composition of the research teams, and the disciplinary focus of the theories tested. Nursing Research, Western Journal of Nursing Research, and Journal of Advanced Nursing were selected for this descriptive study; 1990, 1995, 2000, 2005, and 2010 were included. The final sample consisted of 484 quantitative research articles. From 1990 to 2010, there was an increase in first authors holding doctoral degrees, research from other countries, and funding. Solo authorship decreased; multi-authorship and multidisciplinary teams increased. Theories tested were mostly from psychology; the testing of nursing theory was modest. Multidisciplinary research far outdistanced interdisciplinary research. Quantitative nursing research can be characterized as multidisciplinary (distinct theories from different disciplines) rather than discipline-specific to nursing. Interdisciplinary (theories synthesized from different disciplines) research has been conducted minimally. This study provides information about the growth of the scientific knowledge base of nursing, which has implications for practice. © 2013 Sigma Theta Tau International.

  13. Adjusting for treatment effects in studies of quantitative traits: antihypertensive therapy and systolic blood pressure.

    PubMed

    Tobin, Martin D; Sheehan, Nuala A; Scurrah, Katrina J; Burton, Paul R

    2005-10-15

    A population-based study of a quantitative trait may be seriously compromised when the trait is subject to the effects of a treatment. For example, in a typical study of quantitative blood pressure (BP) 15 per cent or more of middle-aged subjects may take antihypertensive treatment. Without appropriate correction, this can lead to substantial shrinkage in the estimated effect of aetiological determinants of scientific interest and a marked reduction in statistical power. Correction relies upon imputation, in treated subjects, of the underlying BP from the observed BP having invoked one or more assumptions about the bioclinical setting. There is a range of different assumptions that may be made, and a number of different analytical models that may be used. In this paper, we motivate an approach based on a censored normal regression model and compare it with a range of other methods that are currently used or advocated. We compare these methods in simulated data sets and assess the estimation bias and the loss of power that ensue when treatment effects are not appropriately addressed. We also apply the same methods to real data and demonstrate a pattern of behaviour that is consistent with that in the simulation studies. Although all approaches to analysis are necessarily approximations, we conclude that two of the adjustment methods appear to perform well across a range of realistic settings. These are: (1) the addition of a sensible constant to the observed BP in treated subjects; and (2) the censored normal regression model. A third, non-parametric, method based on averaging ordered residuals may also be advocated in some settings. On the other hand, three approaches that are used relatively commonly are fundamentally flawed and should not be used at all. These are: (i) ignoring the problem altogether and analysing observed BP in treated subjects as if it was underlying BP; (ii) fitting a conventional regression model with treatment as a binary covariate; and (iii
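
    The sketch below illustrates the two adjustment methods recommended above on simulated data: (1) adding a sensible constant to observed BP in treated subjects, and (2) a censored-normal (Tobit-type) regression in which treated observations are taken as right-censored at their observed values. The data-generating model, the constant and the optimizer settings are illustrative choices, not the paper's.

        # Sketch of the two recommended adjustments on simulated SBP data.
        import numpy as np
        from scipy import optimize, stats

        rng = np.random.default_rng(6)
        n = 500
        age = rng.uniform(40, 70, n)
        underlying = 110.0 + 0.8 * (age - 40.0) + rng.normal(0.0, 12.0, n)
        treated = rng.random(n) < 0.6 * (underlying > 150)   # treatment targets high SBP
        observed = underlying - 15.0 * treated               # treatment lowers observed SBP

        # Method 1: add a sensible constant (here 10 mmHg) to treated readings.
        slope_const = np.polyfit(age, observed + 10.0 * treated, 1)[0]

        # Method 2: censored-normal regression (treated readings are right-censored,
        # i.e. the underlying SBP is at least as high as the observed value).
        def neg_loglik(params):
            b0, b1, log_sigma = params
            sigma = np.exp(log_sigma)
            mu = b0 + b1 * age
            ll = stats.norm.logpdf(observed[~treated], mu[~treated], sigma).sum()
            ll += stats.norm.logsf(observed[treated], mu[treated], sigma).sum()
            return -ll

        fit = optimize.minimize(neg_loglik, x0=[100.0, 1.0, np.log(10.0)],
                                method="Nelder-Mead")
        print(f"age effect (true 0.80): constant-add {slope_const:.2f}, "
              f"censored model {fit.x[1]:.2f} mmHg/yr")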

  14. Selection of reference genes for gene expression studies in virus-infected monocots using quantitative real-time PCR.

    PubMed

    Zhang, Kun; Niu, Shaofang; Di, Dianping; Shi, Lindan; Liu, Deshui; Cao, Xiuling; Miao, Hongqin; Wang, Xianbing; Han, Chenggui; Yu, Jialin; Li, Dawei; Zhang, Yongliang

    2013-10-10

    Both genome-wide transcriptomic surveys of mRNA expression profiles and virus-induced gene silencing-based molecular studies of target genes during virus-plant interactions involve the precise estimation of transcript abundance. Quantitative real-time PCR (qPCR) is the most widely adopted technique for mRNA quantification. In order to obtain reliable quantification of transcripts, identification of the best reference genes forms the basis of the preliminary work. Nevertheless, the stability of internal controls in virus-infected monocots needs to be fully explored. In this work, the suitability of ten housekeeping genes (ACT, EF1α, FBOX, GAPDH, GTPB, PP2A, SAND, TUBβ, UBC18 and UK) for potential use as reference genes in qPCR was investigated in five different monocot plants (Brachypodium, barley, sorghum, wheat and maize) under infection with different viruses, including Barley stripe mosaic virus (BSMV), Brome mosaic virus (BMV), Rice black-streaked dwarf virus (RBSDV) and Sugarcane mosaic virus (SCMV). By using three different algorithms, the most appropriate reference genes or their combinations were identified for different experimental sets, and their effectiveness for the normalisation of expression studies was further validated by quantitative analysis of a well-studied PR-1 gene. These results facilitate the selection of desirable reference genes for more accurate gene expression studies in virus-infected monocots. Copyright © 2013 Elsevier B.V. All rights reserved.
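
    For illustration, the sketch below computes one widely used stability measure, a geNorm-style M value (mean standard deviation of pairwise log2 expression ratios; lower means more stable), on invented expression data for a few of the candidate genes. The study itself combined three algorithms, which are not reproduced here.

        # Illustrative sketch of a geNorm-style stability measure on invented data.
        import numpy as np

        def gene_stability_m(expression):
            """expression: (samples x genes) array of relative expression values."""
            logx = np.log2(expression)
            n_genes = logx.shape[1]
            m_values = []
            for j in range(n_genes):
                sds = [np.std(logx[:, j] - logx[:, k], ddof=1)
                       for k in range(n_genes) if k != j]
                m_values.append(np.mean(sds))
            return np.array(m_values)

        rng = np.random.default_rng(7)
        genes = ["ACT", "EF1a", "GAPDH", "UBC18"]
        expr = rng.lognormal(mean=0.0, sigma=[0.1, 0.15, 0.6, 0.2], size=(20, 4))
        for g, m in sorted(zip(genes, gene_stability_m(expr)), key=lambda t: t[1]):
            print(f"{g:6s} M = {m:.2f}")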

  15. Test of Achievement in Quantitative Economics for Secondary Schools: Construction and Validation Using Item Response Theory

    ERIC Educational Resources Information Center

    Eleje, Lydia I.; Esomonu, Nkechi P. M.

    2018-01-01

    A test to measure achievement in quantitative economics among secondary school students was developed and validated in this study. The test is made up of 20 multiple-choice items constructed on the basis of quantitative economics sub-skills. Six research questions guided the study. Preliminary validation was done by two experienced teachers in…

  16. Quantitative imaging of aggregated emulsions.

    PubMed

    Penfold, Robert; Watson, Andrew D; Mackie, Alan R; Hibberd, David J

    2006-02-28

    Noise reduction, restoration, and segmentation methods are developed for the quantitative structural analysis in three dimensions of aggregated oil-in-water emulsion systems imaged by fluorescence confocal laser scanning microscopy. Mindful of typical industrial formulations, the methods are demonstrated for concentrated (30% volume fraction) and polydisperse emulsions. Following a regularized deconvolution step using an analytic optical transfer function and appropriate binary thresholding, novel application of the Euclidean distance map provides effective discrimination of closely clustered emulsion droplets with size variation over at least 1 order of magnitude. The a priori assumption of spherical nonintersecting objects provides crucial information to combat the ill-posed inverse problem presented by locating individual particles. Position coordinates and size estimates are recovered with sufficient precision to permit quantitative study of static geometrical features. In particular, aggregate morphology is characterized by a novel void distribution measure based on the generalized Apollonius problem. This is also compared with conventional Voronoi/Delauney analysis.
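
    A minimal sketch, under the a priori assumption of spherical (here circular) non-intersecting objects, of how a Euclidean distance map can separate closely clustered droplets in a thresholded binary image and recover centre positions and size estimates; the synthetic image, footprint size and distance cutoff are illustrative choices, not the authors' processing chain.

```python
# Hedged sketch: Euclidean distance map of a binary image, with local maxima taken
# as droplet centres and the distance value at each maximum as a radius estimate.
import numpy as np
from scipy import ndimage

# Synthetic binary image: two overlapping discs standing in for clustered droplets.
yy, xx = np.mgrid[0:200, 0:200]
binary = ((xx - 80) ** 2 + (yy - 100) ** 2 < 40 ** 2) | \
         ((xx - 140) ** 2 + (yy - 100) ** 2 < 30 ** 2)

# Distance map: each foreground pixel holds its distance to the nearest background pixel.
dist = ndimage.distance_transform_edt(binary)

# Local maxima of the distance map approximate droplet centres (spheres assumed a priori).
footprint = np.ones((15, 15))
is_peak = (dist == ndimage.maximum_filter(dist, footprint=footprint)) & (dist > 5)
labels, n_peaks = ndimage.label(is_peak)
centres = ndimage.center_of_mass(is_peak, labels, range(1, n_peaks + 1))

for cy, cx in centres:
    print(f"centre = ({cx:.0f}, {cy:.0f}),  radius ~ {dist[int(cy), int(cx)]:.1f} px")
```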

  17. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality, resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis, including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, post-processing techniques such as the trabecular bone score (TBS) allow information on three-dimensional architecture to be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising for capturing not only bone micro-architecture but also processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  18. Teachers' Attitudes toward African American Vernacular English: A Quantitative Correlational Study

    ERIC Educational Resources Information Center

    Daily, Danny L., Jr.

    2017-01-01

    African American students who use African American Vernacular English (AAVE) in the academic setting are subject to negative misconceptions from English educators. Negative teacher attitudes might cause African American students to lack commitment to learning. The purpose of this quantitative correlational study was to examine whether English teachers…

  19. Optical tweezers based force measurement system for quantitating binding interactions: system design and application for the study of bacterial adhesion.

    PubMed

    Fällman, Erik; Schedin, Staffan; Jass, Jana; Andersson, Magnus; Uhlin, Bernt Eric; Axner, Ove

    2004-06-15

    An optical force measurement system for quantitating forces in the pN range between micrometer-sized objects has been developed. The system was based upon optical tweezers in combination with a sensitive position detection system and constructed around an inverted microscope. A trapped particle in the focus of the high numerical aperture microscope-objective behaves like an omnidirectional mechanical spring in response to an external force. The particle's displacement from the equilibrium position is therefore a direct measure of the exerted force. A weak probe laser beam, focused directly below the trapping focus, was used for position detection of the trapped particle (a polystyrene bead). The bead and the condenser focus the light to a distinct spot in the far field, monitored by a position sensitive detector. Various calibration procedures were implemented in order to provide absolute force measurements. The system has been used to measure the binding forces between Escherichia coli bacterial adhesins and galabiose-functionalized beads.

  20. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed theoretical reasoning on the effectiveness of these markers in trait-specific prediction and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875

  1. Quantitative risk assessment of E. coli in street-vended cassava-based delicacies in the Philippines

    NASA Astrophysics Data System (ADS)

    Mesias, I. C. P.

    2018-01-01

    In the Philippines, rootcrop-based food products are gaining popularity in the street food trade. However, a number of street-vended food products in the country are reported to be contaminated with E. coli, posing a possible risk to consumers. In this study, information on quantitative risk assessment of E. coli in street-vended cassava-based delicacies was generated. The assessment started with the prevalence and concentration of E. coli at post-production in packages of the cassava-based delicacies. The ComBase growth predictor was used to trace the E. coli population at each step of the food chain. The @Risk software package, version 6 (Palisade, USA), was used to run the simulations. Scenarios in the post-production to consumption pathway were simulated, and the effect was assessed in relation to exposure to the defined infective dose. In the worst-case scenario, minimum and most likely concentrations of 6.3 and 7.8 log CFU of E. coli per serving, respectively, were observed. The simulation revealed that lowering the temperature in the chain considerably decreased the E. coli concentration prior to consumption and subsequently decreased the percentage of exposures exceeding the infective dose. Exposure to the infective dose, however, increased with longer lag times from post-production to consumption.
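
    A minimal sketch of a Monte Carlo exposure model in the spirit of the simulation described above; the growth model, input distributions and infective-dose threshold are assumptions for illustration, not the study's fitted values or ComBase outputs.

```python
# Hedged sketch: Monte Carlo simulation of E. coli exposure per serving of a
# street-vended cassava-based product. All distributions and thresholds are assumed.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

initial_log_cfu_g = rng.triangular(1.0, 2.0, 3.5, n)    # concentration at post-production
temp_c            = rng.uniform(25, 35, n)               # holding temperature
hold_time_h       = rng.uniform(1, 6, n)                 # lag from post-production to consumption
serving_g         = 100.0

# Crude growth rate (log10 CFU/h) increasing with temperature above a minimum.
growth_rate = np.clip(0.03 * (temp_c - 10.0), 0, None)
log_cfu_g_at_consumption = np.clip(initial_log_cfu_g + growth_rate * hold_time_h, None, 9.0)

dose_log_cfu_per_serving = log_cfu_g_at_consumption + np.log10(serving_g)
infective_dose_log_cfu = 6.0                             # assumed threshold
p_exceed = np.mean(dose_log_cfu_per_serving >= infective_dose_log_cfu)
print(f"fraction of servings at or above the assumed infective dose: {p_exceed:.2%}")
```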

  2. 1, 2, 3, 4: infusing quantitative literacy into introductory biology.

    PubMed

    Speth, Elena Bray; Momsen, Jennifer L; Moyerbrailean, Gregory A; Ebert-May, Diane; Long, Tammy M; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills.

  3. 1, 2, 3, 4: Infusing Quantitative Literacy into Introductory Biology

    PubMed Central

    Momsen, Jennifer L.; Moyerbrailean, Gregory A.; Ebert-May, Diane; Long, Tammy M.; Wyse, Sara; Linton, Debra

    2010-01-01

    Biology of the twenty-first century is an increasingly quantitative science. Undergraduate biology education therefore needs to provide opportunities for students to develop fluency in the tools and language of quantitative disciplines. Quantitative literacy (QL) is important for future scientists as well as for citizens, who need to interpret numeric information and data-based claims regarding nearly every aspect of daily life. To address the need for QL in biology education, we incorporated quantitative concepts throughout a semester-long introductory biology course at a large research university. Early in the course, we assessed the quantitative skills that students bring to the introductory biology classroom and found that students had difficulties in performing simple calculations, representing data graphically, and articulating data-driven arguments. In response to students' learning needs, we infused the course with quantitative concepts aligned with the existing course content and learning objectives. The effectiveness of this approach is demonstrated by significant improvement in the quality of students' graphical representations of biological data. Infusing QL in introductory biology presents challenges. Our study, however, supports the conclusion that it is feasible in the context of an existing course, consistent with the goals of college biology education, and promotes students' development of important quantitative skills. PMID:20810965

  4. A thorough experimental study of CH/π interactions in water: quantitative structure-stability relationships for carbohydrate/aromatic complexes.

    PubMed

    Jiménez-Moreno, Ester; Jiménez-Osés, Gonzalo; Gómez, Ana M; Santana, Andrés G; Corzana, Francisco; Bastida, Agatha; Jiménez-Barbero, Jesus; Asensio, Juan Luis

    2015-11-13

    CH/π interactions play a key role in a large variety of molecular recognition processes of biological relevance. However, their origins and structural determinants in water remain poorly understood. In order to improve our comprehension of these important interaction modes, we have performed a quantitative experimental analysis of a large data set comprising 117 chemically diverse carbohydrate/aromatic stacking complexes, prepared through a dynamic combinatorial approach recently developed by our group. The obtained free energies provide a detailed picture of the structure-stability relationships that govern the association process, opening the door to the rational design of improved carbohydrate-based ligands or carbohydrate receptors. Moreover, this experimental data set, supported by quantum mechanical calculations, has contributed to the understanding of the main driving forces that promote complex formation, underlining the key role played by coulombic and solvophobic forces on the stabilization of these complexes. This represents the most quantitative and extensive experimental study reported so far for CH/π complexes in water.

  5. [Quantitative study of the prothallial morphogenesis in Asplenium species].

    PubMed

    Henriet, M; Auquière, J P; Moens, P

    1976-01-01

    A previous paper presented a qualitative analysis of gametophytic development in nine Asplenium species. In the present quantitative study, we specify the parental relationships among these species. The surface area of the gametophyte and the number of marginal hairs increase differently in each species, and the density of the marginal hairs depends on the species considered. The relation among the morphological gametophytic parameters is constant within a given group of species. Principal component analysis was performed on all the parameters measured during prothallial development; it confirms the parental relationships among the diploid and tetraploid species from a morphological point of view.

  6. Molecular design of anticancer drug leads based on three-dimensional quantitative structure-activity relationship.

    PubMed

    Huang, Xiao Yan; Shan, Zhi Jie; Zhai, Hong Lin; Li, Li Na; Zhang, Xiao Yun

    2011-08-22

    Heat shock protein 90 (Hsp90) takes part in the development of several cancers. Novobiocin, a typical C-terminal inhibitor of Hsp90, will probably be used as an important anticancer drug in the future. In this work, we extracted the relevant structural information and designed new novobiocin derivatives based on a three-dimensional quantitative structure-activity relationship (3D QSAR). Comparative molecular field analysis and comparative molecular similarity indices analysis models with high predictive capability were established, and their reliability is supported by the statistical parameters. Based on the several important influence factors obtained from these models, six new novobiocin derivatives with higher predicted inhibitory activities were designed and confirmed by molecular simulation with our models, providing potential anticancer drug leads for further research.

  7. Quantitative analysis of comparative genomic hybridization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manoir, S. du; Bentz, M.; Joos, S.

    1995-01-01

    Comparative genomic hybridization (CGH) is a new molecular cytogenetic method for the detection of chromosomal imbalances. Following cohybridization of DNA prepared from a sample to be studied and control DNA to normal metaphase spreads, probes are detected via different fluorochromes. The ratio of the test and control fluorescence intensities along a chromosome reflects the relative copy number of segments of a chromosome in the test genome. Quantitative evaluation of CGH experiments is required for the determination of low copy changes, e.g., monosomy or trisomy, and for the definition of the breakpoints involved in unbalanced rearrangements. In this study, a program for quantitation of CGH preparations is presented. This program is based on the extraction of the fluorescence ratio profile along each chromosome, followed by averaging of individual profiles from several metaphase spreads. Objective parameters critical for quantitative evaluations were tested, and the criteria for selection of suitable CGH preparations are described. The granularity of the chromosome painting and the regional inhomogeneity of fluorescence intensities in metaphase spreads proved to be crucial parameters. The coefficient of variation of the ratio value for chromosomes in balanced state (CVBS) provides a general quality criterion for CGH experiments. Different cutoff levels (thresholds) of average fluorescence ratio values were compared for their specificity and sensitivity with regard to the detection of chromosomal imbalances. 27 refs., 15 figs., 1 tab.
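
    A minimal sketch of the ratio-profile logic described above: per-metaphase fluorescence ratio profiles are averaged along one chromosome, fixed thresholds call gains and losses, and a coefficient of variation over balanced bins serves as a quality indicator. The simulated profiles and the 0.8/1.2 cutoffs are illustrative assumptions, not the program's implementation.

```python
# Hedged sketch: average test/control ratio profile over several metaphases, then
# call gains/losses with fixed thresholds. Profiles are simulated for one chromosome.
import numpy as np

rng = np.random.default_rng(7)
n_metaphases, n_bins = 10, 100                 # positions along one chromosome

true_ratio = np.ones(n_bins)
true_ratio[60:] = 1.5                           # simulated gain of the distal segment
profiles = true_ratio + rng.normal(0, 0.15, size=(n_metaphases, n_bins))

mean_profile = profiles.mean(axis=0)
sd_profile = profiles.std(axis=0, ddof=1)

loss = mean_profile < 0.8                       # lower cutoff
gain = mean_profile > 1.2                       # upper cutoff
balanced = ~(loss | gain)
cv_balanced = sd_profile[balanced].mean() / mean_profile[balanced].mean()

print("bins called as gain:", int(gain.sum()))
print("bins called as loss:", int(loss.sum()))
print(f"quality indicator (CV over balanced bins): {cv_balanced:.3f}")
```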

  8. Spontaneous Focusing on Quantitative Relations: Towards a Characterization

    ERIC Educational Resources Information Center

    Degrande, Tine; Verschaffel, Lieven; Van Dooren, Wim

    2017-01-01

    In contrast to previous studies on Spontaneous Focusing on Quantitative Relations (SFOR), the present study investigated not only the "extent" to which children focus on (multiplicative) quantitative relations, but also the "nature" of children's quantitative focus (i.e., the types of quantitative relations that children focus…

  9. Quantitative measures of healthy aging and biological age

    PubMed Central

    Kim, Sangkyu; Jazwinski, S. Michal

    2015-01-01

    Numerous genetic and non-genetic factors contribute to aging. To facilitate the study of these factors, various descriptors of biological aging, including ‘successful aging’ and ‘frailty’, have been put forth as integrative functional measures of aging. A separate but related quantitative approach is the ‘frailty index’, which has been operationalized and frequently used. Various frailty indices have been constructed. Although based on different numbers and types of health variables, frailty indices possess several common properties that make them useful across different studies. We have been using a frailty index termed FI34 based on 34 health variables. Like other frailty indices, FI34 increases non-linearly with advancing age and is a better indicator of biological aging than chronological age. FI34 has a substantial genetic basis. Using FI34, we found elevated levels of resting metabolic rate linked to declining health in nonagenarians. Using FI34 as a quantitative phenotype, we have also found a genomic region on chromosome 12 that is associated with healthy aging and longevity. PMID:26005669
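
    A minimal sketch of a deficit-accumulation frailty index of the general kind described above, computed as the proportion of assessed health deficits present; the 34-item layout only mirrors FI34 in spirit, since the actual variables and scoring are not given here.

```python
# Hedged sketch: frailty index = proportion of health deficits present among the
# items assessed (missing items are simply excluded). Example data are invented.
def frailty_index(deficits):
    """deficits: list of values in [0, 1], one per health variable (None = missing)."""
    answered = [d for d in deficits if d is not None]
    if not answered:
        raise ValueError("no health variables answered")
    return sum(answered) / len(answered)

# Example: 34 items, most absent (0), a few partial (0.5) or present (1), two missing.
example = [0] * 25 + [0.5, 0.5, 1, 1, 1, 0, 0] + [None, None]
print(f"FI = {frailty_index(example):.3f}")   # higher value = frailer
```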

  10. Domestic violence against women in India: A systematic review of a decade of quantitative studies.

    PubMed

    Kalokhe, Ameeta; Del Rio, Carlos; Dunkle, Kristin; Stephenson, Rob; Metheny, Nicholas; Paranjape, Anuradha; Sahay, Seema

    2017-04-01

    Domestic violence (DV) is prevalent among women in India and has been associated with poor mental and physical health. We performed a systematic review of 137 quantitative studies published in the prior decade that directly evaluated the DV experiences of Indian women to summarise the breadth of recent work and identify gaps in the literature. Among studies surveying at least two forms of abuse, a median 41% of women reported experiencing DV during their lifetime and 30% in the past year. We noted substantial inter-study variance in DV prevalence estimates, attributable in part to different study populations and settings, but also to a lack of standardisation, validation, and cultural adaptation of DV survey instruments. There was a paucity of studies evaluating the DV experiences of women over age 50, women in live-in relationships, same-sex relationships, or tribal villages, and women from the northern regions of India. Additionally, our review highlighted a gap in research evaluating the impact of DV on physical health. We conclude with a research agenda calling for additional qualitative and longitudinal quantitative studies to explore the DV correlates proposed by this quantitative literature to inform the development of a culturally tailored DV scale and prevention strategies.

  11. Domestic violence against women in India: A systematic review of a decade of quantitative studies

    PubMed Central

    Kalokhe, Ameeta; del Rio, Carlos; Dunkle, Kristin; Stephenson, Rob; Metheny, Nicholas; Paranjape, Anuradha; Sahay, Seema

    2016-01-01

    Domestic violence (DV) is prevalent among women in India and has been associated with poor mental and physical health. We performed a systematic review of 137 quantitative studies published in the prior decade that directly evaluated the DV experiences of Indian women to summarise the breadth of recent work and identify gaps in the literature. Among studies surveying at least two forms of abuse, a median 41% of women reported experiencing DV during their lifetime and 30% in the past year. We noted substantial inter-study variance in DV prevalence estimates, attributable in part to different study populations and settings, but also to a lack of standardisation, validation, and cultural adaptation of DV survey instruments. There was a paucity of studies evaluating the DV experiences of women over age 50, women in live-in relationships, same-sex relationships, or tribal villages, and women from the northern regions of India. Additionally, our review highlighted a gap in research evaluating the impact of DV on physical health. We conclude with a research agenda calling for additional qualitative and longitudinal quantitative studies to explore the DV correlates proposed by this quantitative literature to inform the development of a culturally tailored DV scale and prevention strategies. PMID:26886155

  12. A Quantitative Study: Enhancing the Productivity of the Emotionally Challenged High School Students

    ERIC Educational Resources Information Center

    Mammen, John

    2013-01-01

    This quantitative, causal-comparative study examined the degree of influence the parent-teacher relationship can have on the grade point averages and graduation rates of students in an alternative school setting. Findings of this study revealed that active parent-teacher communication had a direct relationship with the success rate of…

  13. Infrared thermography quantitative image processing

    NASA Astrophysics Data System (ADS)

    Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB

    2017-11-01

    Infrared thermography is an imaging technique that has the ability to provide a map of temperature distribution of an object’s surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body’s surface can contribute to the diagnosis and follow-up of certain disorders. Although the thermographic image can give a fairly good visual estimation of distribution homogeneity and temperature pattern differences between two symmetric body parts, it is important to extract a quantitative measurement characterising temperature. Certain approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI). A number of indices have been developed by researchers to that end. In this study, a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back area of scoliotic patients. The indices are based on first-order statistical parameters describing temperature distribution. Analysis and comparison of these indices allow the temperature distribution pattern of the back trunk to be evaluated against that expected in subjects who are healthy with respect to spinal problems.
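
    A minimal sketch of first-order statistical indices computed over two symmetric ROIs of a thermogram, plus a simple left-right asymmetry measure; the synthetic temperature map, ROI coordinates and choice of indices are illustrative and do not reproduce the paper's index definitions.

```python
# Hedged sketch: first-order statistics per symmetric ROI and a mean-difference
# asymmetry index. The "thermogram" below is simulated, not patient data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
thermogram = 31.0 + rng.normal(0, 0.4, size=(240, 320))   # surface temperature map, deg C
thermogram[60:180, 200:280] += 0.6                         # warmer right lower-back region

roi_left  = thermogram[60:180, 40:120]
roi_right = thermogram[60:180, 200:280]

def first_order_indices(roi):
    flat = roi.ravel()
    return {"mean": flat.mean(), "sd": flat.std(ddof=1), "skew": stats.skew(flat)}

left, right = first_order_indices(roi_left), first_order_indices(roi_right)
delta_t = right["mean"] - left["mean"]
print("left :", {k: round(float(v), 3) for k, v in left.items()})
print("right:", {k: round(float(v), 3) for k, v in right.items()})
print(f"asymmetry (mean right - mean left): {delta_t:.2f} deg C")
```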

  14. Qualitative and Quantitative Imaging Evaluation of Renal Cell Carcinoma Subtypes with Grating-based X-ray Phase-contrast CT

    NASA Astrophysics Data System (ADS)

    Braunagel, Margarita; Birnbacher, Lorenz; Willner, Marian; Marschner, Mathias; De Marco, Fabio; Viermetz, Manuel; Notohamiprodjo, Susan; Hellbach, Katharina; Auweter, Sigrid; Link, Vera; Woischke, Christine; Reiser, Maximilian F.; Pfeiffer, Franz; Notohamiprodjo, Mike; Herzen, Julia

    2017-03-01

    Current clinical imaging methods face limitations in the detection and correct characterization of different subtypes of renal cell carcinoma (RCC), while these are important for therapy and prognosis. The present study evaluates the potential of grating-based X-ray phase-contrast computed tomography (gbPC-CT) for visualization and characterization of human RCC subtypes. The imaging results for 23 ex vivo formalin-fixed human kidney specimens obtained with phase-contrast CT were compared to the results of the absorption-based CT (gbCT), clinical CT and a 3T MRI and validated using histology. Regions of interest were placed on each specimen for quantitative evaluation. Qualitative and quantitative gbPC-CT imaging could significantly discriminate between normal kidney cortex (54 ± 4 HUp) and clear cell (42 ± 10), papillary (43 ± 6) and chromophobe RCCs (39 ± 7), p < 0.05 respectively. The sensitivity for detection of tumor areas was 100%, 50% and 40% for gbPC-CT, gbCT and clinical CT, respectively. RCC architecture like fibrous strands, pseudocapsules, necrosis or hyalinization was depicted clearly in gbPC-CT and was not equally well visualized in gbCT, clinical CT and MRI. The results show that gbPC-CT enables improved discrimination of normal kidney parenchyma and tumorous tissues as well as different soft-tissue components of RCCs without the use of contrast media.

  15. Two approaches to improving mental health care: positivist/quantitative versus skill-based/qualitative.

    PubMed

    Luchins, Daniel

    2012-01-01

    The quality improvement model currently used in medicine and mental health was adopted from industry, where it developed out of early 20th-century efforts to apply a positivist/quantitative agenda to improving manufacturing. This article questions the application of this model to mental health care. It argues that (1) developing "operational definitions" for something as value-laden as "quality" risks conflating two realms, what we measure with what we value; (2) when measurements that are tied to individuals are aggregated to establish benchmarks and goals, unwarranted mathematical assumptions are made; (3) choosing clinical outcomes is problematic; (4) there is little relationship between process measures and clinical outcomes; and (5) since changes in quality indices do not relate to improved clinical care, management's reliance on such indices provides an illusory sense of control. An alternative model is the older, skill-based/qualitative approach to knowing, which relies on "implicit/ expert" knowledge. These two approaches offer a series of contrasts: quality versus excellence, competence versus expertise, management versus leadership, extrinsic versus intrinsic rewards. The article concludes that we need not totally dispense with the current quality improvement model, but rather should balance quantitative efforts with the older qualitative approach in a mixed methods model.

  16. SearchLight: a freely available web-based quantitative spectral analysis tool (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Prabhat, Prashant; Peet, Michael; Erdogan, Turan

    2016-03-01

    In order to design a fluorescence experiment, typically the spectra of a fluorophore and of a filter set are overlaid on a single graph and the spectral overlap is evaluated intuitively. However, in a typical fluorescence imaging system the fluorophores and optical filters are not the only wavelength dependent variables - even the excitation light sources have been changing. For example, LED Light Engines may have a significantly different spectral response compared to the traditional metal-halide lamps. Therefore, for a more accurate assessment of fluorophore-to-filter-set compatibility, all sources of spectral variation should be taken into account simultaneously. Additionally, intuitive or qualitative evaluation of many spectra does not necessarily provide a realistic assessment of the system performance. "SearchLight" is a freely available web-based spectral plotting and analysis tool that can be used to address the need for accurate, quantitative spectral evaluation of fluorescence measurement systems. This tool is available at: http://searchlight.semrock.com/. Based on a detailed mathematical framework [1], SearchLight calculates signal, noise, and signal-to-noise ratio for multiple combinations of fluorophores, filter sets, light sources and detectors. SearchLight allows for qualitative and quantitative evaluation of the compatibility of filter sets with fluorophores, analysis of bleed-through, identification of optimized spectral edge locations for a set of filters under specific experimental conditions, and guidance regarding labeling protocols in multiplexing imaging assays. Entire SearchLight sessions can be shared with colleagues and collaborators and saved for future reference. [1] Anderson, N., Prabhat, P. and Erdogan, T., Spectral Modeling in Fluorescence Microscopy, http://www.semrock.com (2010).
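
    A rough sketch of the kind of quantitative spectral bookkeeping such a tool performs, assuming a simple separable model in which the relative detected signal is an excitation-side integral times an emission-side integral; all spectra are synthetic Gaussians and ideal bandpass filters, not real fluorophore, filter or light-source data, and the detailed framework cited above is not reproduced.

```python
# Hedged sketch: relative signal ~ (source x excitation filter x absorption integral)
#                               x (emission x emission filter x detector QE integral).
import numpy as np

wl = np.arange(400.0, 751.0, 1.0)                 # wavelength grid, nm

def gaussian(center, width):
    return np.exp(-0.5 * ((wl - center) / width) ** 2)

def bandpass(low, high):
    return ((wl >= low) & (wl <= high)).astype(float)

source      = gaussian(470, 15)                   # e.g. an LED emission line
absorption  = gaussian(490, 20)                   # fluorophore excitation spectrum
emission    = gaussian(525, 25)                   # fluorophore emission spectrum
exc_filter  = bandpass(455, 485)
em_filter   = bandpass(500, 550)
detector_qe = 0.7 * np.ones_like(wl)

excitation_term = np.trapz(source * exc_filter * absorption, wl)
collection_term = np.trapz(emission * em_filter * detector_qe, wl)
relative_signal = excitation_term * collection_term
print(f"relative signal (arbitrary units): {relative_signal:.1f}")
```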

  17. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies.

    PubMed

    Silva-Rodríguez, Jesús; Aguiar, Pablo; Sánchez, Manuel; Mosquera, Javier; Luna-Vega, Víctor; Cortés, Julia; Garrido, Miguel; Pombar, Miguel; Ruibal, Alvaro

    2014-05-01

    Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standard uptake value (SUV) values for this effect in clinical routine. One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with higher extravasated doses were also evaluated using a threshold-based method. Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%) with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify this fraction of patients that might be corrected for paravenous injection effect. The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correct SUV quantification in those patients fulfilling the proposed criterion.

  18. Correction for FDG PET dose extravasations: Monte Carlo validation and quantitative evaluation of patient studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silva-Rodríguez, Jesús, E-mail: jesus.silva.rodriguez@sergas.es; Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es; Servicio de Medicina Nuclear, Complexo Hospitalario Universidade de Santiago de Compostela

    Purpose: Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standard uptake value (SUV) values for this effect in clinical routine. Methods: One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with higher extravasated doses were also evaluated using a threshold-based method. Results: Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%) with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify this fraction of patients that might be corrected for paravenous injection effect. Conclusions: The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correct SUV quantification in those patients fulfilling the proposed criterion.
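
    A minimal sketch of the correction idea, assuming the extravasated activity has been estimated from a ROI over the injection site and decay-corrected back to injection time; the function names, the F-18 half-life handling and the example numbers are illustrative, not the authors' validated protocol.

```python
# Hedged sketch: rescale SUV by the effectively administered dose once the
# extravasated fraction has been estimated.
import math

def corrected_suv(suv_measured, injected_mbq, extravasated_mbq):
    """SUV scales with 1/administered dose, so correct by the retained fraction."""
    effective_mbq = injected_mbq - extravasated_mbq
    if effective_mbq <= 0:
        raise ValueError("extravasated activity cannot exceed the injected dose")
    return suv_measured * injected_mbq / effective_mbq

def decay_corrected_activity(activity_mbq, minutes_elapsed, half_life_min=109.77):
    """Decay-correct a ROI activity measurement back to injection time (F-18)."""
    return activity_mbq * math.exp(math.log(2) * minutes_elapsed / half_life_min)

# Example: part of a 300 MBq injection stays at the injection site.
roi_activity_at_scan = 22.0                      # MBq measured in the arm ROI
extravasated = decay_corrected_activity(roi_activity_at_scan, 60.0)
print(f"extravasated ~ {extravasated:.1f} MBq")
print(f"SUVmax 4.0 becomes {corrected_suv(4.0, 300.0, extravasated):.2f} after correction")
```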

  19. Application of image analysis in studies of quantitative disease resistance, exemplified using common bacterial blight-common bean pathosystem.

    PubMed

    Xie, Weilong; Yu, Kangfu; Pauls, K Peter; Navabi, Alireza

    2012-04-01

    The effectiveness of image analysis (IA) compared with an ordinal visual scale, for quantitative measurement of disease severity, its application in quantitative genetic studies, and its effect on the estimates of genetic parameters were investigated. Studies were performed using eight backcross-derived families of common bean (Phaseolus vulgaris) (n = 172) segregating for the molecular marker SU91, known to be associated with a quantitative trait locus (QTL) for resistance to common bacterial blight (CBB), caused by Xanthomonas campestris pv. phaseoli and X. fuscans subsp. fuscans. Even though both IA and visual assessments were highly repeatable, IA was more sensitive in detecting quantitative differences between bean genotypes. The CBB phenotypic difference between the two SU91 genotypic groups was consistently more than fivefold for IA assessments but generally only two- to threefold for visual assessments. Results suggest that the visual assessment results in overestimation of the effect of QTL in genetic studies. This may have been caused by lack of additivity and uneven intervals of the visual scale. Although visual assessment of disease severity is a useful tool for general selection in breeding programs, assessments using IA may be more suitable for phenotypic evaluations in quantitative genetic studies involving CBB resistance as well as other foliar diseases.

  20. Synthesis, quantitative structure–property relationship study of novel fluorescence active 2-pyrazolines and application

    PubMed Central

    Girgis, Adel S.; El-Saied, Houssni; Mohamed, Mohamed A.; Bedair, Ahmad H.; Salim, Ahmad S.

    2018-01-01

    A variety of fluorescence-active fluorinated pyrazolines 13–33 was synthesized in good yields through the cyclocondensation reaction of propenones 1–9 with aryl hydrazines 10–12. Some of the synthesized compounds exhibited promising fluorescence properties, with quantum yields (Φ) higher than that of quinine sulfate (standard reference). Quantitative structure–property relationship studies were undertaken to support the observed fluorescence properties and to estimate the parameters governing them. Five synthesized fluorescence-active pyrazolines (13, 15, 18, 19 and 23) with variable Φ were selected for treating two types of paper sheets (Fabriano and Bible paper). These fluorescence compounds, especially compounds 19 and 23, improve the strength properties of the paper sheets. Based on the observed performance, they can be used as markers in security documents. PMID:29657796

  1. A generalised individual-based algorithm for modelling the evolution of quantitative herbicide resistance in arable weed populations.

    PubMed

    Liu, Chun; Bridges, Melissa E; Kaundun, Shiv S; Glasgow, Les; Owen, Micheal Dk; Neve, Paul

    2017-02-01

    Simulation models are useful tools for predicting and comparing the risk of herbicide resistance in weed populations under different management strategies. Most existing models assume a monogenic mechanism governing herbicide resistance evolution. However, growing evidence suggests that herbicide resistance is often inherited in a polygenic or quantitative fashion. Therefore, we constructed a generalised modelling framework to simulate the evolution of quantitative herbicide resistance in summer annual weeds. Real-field management parameters based on Amaranthus tuberculatus (Moq.) Sauer (syn. rudis) control with glyphosate and mesotrione in Midwestern US maize-soybean agroecosystems demonstrated that the model can represent evolved herbicide resistance in realistic timescales. Sensitivity analyses showed that genetic and management parameters were impactful on the rate of quantitative herbicide resistance evolution, whilst biological parameters such as emergence and seed bank mortality were less important. The simulation model provides a robust and widely applicable framework for predicting the evolution of quantitative herbicide resistance in summer annual weed populations. The sensitivity analyses identified weed characteristics that would favour herbicide resistance evolution, including high annual fecundity, large resistance phenotypic variance and pre-existing herbicide resistance. Implications for herbicide resistance management and potential use of the model are discussed. © 2016 Society of Chemical Industry.

  2. Variable selection based near infrared spectroscopy quantitative and qualitative analysis on wheat wet gluten

    NASA Astrophysics Data System (ADS)

    Lü, Chengxu; Jiang, Xunpeng; Zhou, Xingfan; Zhang, Yinqiao; Zhang, Naiqian; Wei, Chongfeng; Mao, Wenhua

    2017-10-01

    Wet gluten is a useful quality indicator for wheat, and short-wave near infrared spectroscopy (NIRS) is a high-performance technique with the advantages of being economical, rapid and nondestructive. To study the feasibility of short-wave NIRS for analyzing wet gluten directly from wheat seed, 54 representative wheat seed samples were collected and scanned by spectrometer. Eight spectral pretreatment methods and a genetic algorithm (GA) variable selection method were used to optimize the analysis. Both quantitative and qualitative models of wet gluten were built by partial least squares regression and discriminant analysis. For quantitative analysis, normalization was the optimal pretreatment method, 17 wet-gluten-sensitive variables were selected by the GA, and the GA model performed better than the all-variable model, with R2V = 0.88 and RMSEV = 1.47. For qualitative analysis, automatic weighted least squares baseline correction was the optimal pretreatment method, and the all-variable models performed better than the GA models. The correct classification rates for the three classes (<24%, 24-30% and >30% wet gluten content) were 95.45%, 84.52% and 90.00%, respectively. The short-wave NIRS technique shows potential for both quantitative and qualitative analysis of wet gluten in wheat seed.
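
    A minimal sketch of a PLS calibration with wavelength-subset selection, assuming simulated spectra; a crude random-subset search stands in for the genetic algorithm actually used, and the 17-variable subset size simply echoes the number reported above.

```python
# Hedged sketch: PLS calibration of wet gluten from simulated NIR spectra, with a
# random-subset search standing in for GA wavelength selection.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 54, 200
X = rng.normal(size=(n_samples, n_wavelengths))
informative = [20, 55, 90, 130, 170]                      # pretend gluten-sensitive bands
y = X[:, informative].sum(axis=1) * 2.0 + 27.0 + rng.normal(0, 0.5, n_samples)

def cv_r2(columns):
    pls = PLSRegression(n_components=min(5, len(columns)))
    return cross_val_score(pls, X[:, columns], y, cv=5, scoring="r2").mean()

best_subset, best_score = list(range(n_wavelengths)), cv_r2(range(n_wavelengths))
for _ in range(200):                                      # stand-in for GA generations
    subset = sorted(rng.choice(n_wavelengths, size=17, replace=False))
    score = cv_r2(subset)
    if score > best_score:
        best_subset, best_score = subset, score

print(f"all-variable model     R2(CV) = {cv_r2(range(n_wavelengths)):.3f}")
print(f"best 17-variable model R2(CV) = {best_score:.3f}")
```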

  3. Quantitative Peptidomics with Five-plex Reductive Methylation labels

    NASA Astrophysics Data System (ADS)

    Tashima, Alexandre K.; Fricker, Lloyd D.

    2017-12-01

    Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing for quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of formaldehyde and cyanoborohydride reagents, allowing for five-plex quantitation. However, the mass difference between each of these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.

  4. Quantitative Peptidomics with Five-plex Reductive Methylation labels

    NASA Astrophysics Data System (ADS)

    Tashima, Alexandre K.; Fricker, Lloyd D.

    2018-05-01

    Quantitative peptidomics and proteomics often use chemical tags to covalently modify peptides with reagents that differ in the number of stable isotopes, allowing for quantitation of the relative peptide levels in the original sample based on the peak height of each isotopic form. Different chemical reagents have been used as tags for quantitative peptidomics and proteomics, and all have strengths and weaknesses. One of the simplest approaches uses formaldehyde and sodium cyanoborohydride to methylate amines, converting primary and secondary amines into tertiary amines. Up to five different isotopic forms can be generated, depending on the isotopic forms of formaldehyde and cyanoborohydride reagents, allowing for five-plex quantitation. However, the mass difference between each of these forms is only 1 Da per methyl group incorporated into the peptide, and for many peptides there is substantial overlap from the natural abundance of 13C and other isotopes. In this study, we calculated the contribution from the natural isotopes for 26 native peptides and derived equations to correct the peak intensities. These equations were applied to data from a study using human embryonic kidney HEK293T cells in which five replicates were treated with 100 nM vinblastine for 3 h and compared with five replicates of cells treated with control medium. The correction equations brought the replicates to the expected 1:1 ratios and revealed significant decreases in levels of 21 peptides upon vinblastine treatment. These equations enable accurate quantitation of small changes in peptide levels using the reductive methylation labeling approach.
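
    A hedged sketch of a natural-abundance correction for five peaks spaced 1 Da apart, assuming only the 13C contribution matters and that each label's envelope can be modelled as a binomial distribution; the peptide carbon count and peak heights are invented, and the paper's full correction equations are not reproduced.

```python
# Hedged sketch: deconvolve overlapping 5-plex peaks by inverting a matrix whose
# columns hold each label's (carbon-only) natural isotope envelope.
import numpy as np
from math import comb

def carbon_envelope(n_carbons, n_peaks=5, p13c=0.0107):
    """Relative intensities of M+0 ... M+(n_peaks-1) from 13C alone (binomial)."""
    return np.array([comb(n_carbons, k) * p13c**k * (1 - p13c)**(n_carbons - k)
                     for k in range(n_peaks)])

def correct_intensities(observed, n_carbons):
    """Solve observed = A @ true, where column j is the envelope of label j (+j Da)."""
    env = carbon_envelope(n_carbons)
    A = np.zeros((5, 5))
    for j in range(5):
        for k in range(5 - j):
            A[j + k, j] = env[k]
    true = np.linalg.solve(A, np.asarray(observed, dtype=float))
    return np.clip(true, 0, None)

observed = [100.0, 118.0, 121.0, 119.0, 117.0]   # raw 5-plex peak heights (invented)
print(correct_intensities(observed, n_carbons=60).round(1))
```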

  5. A quantitative electroencephalographic study of meditation and binaural beat entrainment.

    PubMed

    Lavallee, Christina F; Koren, Stanley A; Persinger, Michael A

    2011-04-01

    The study objective was to determine the quantitative electroencephalographic correlates of meditation, as well as the effects of hindering (15 Hz) and facilitative (7 Hz) binaural beats on the meditative process. The study was a mixed design, with experience of the subject as the primary between-subject measure and power of the six classic frequency bands (δ, θ, low α, high α, β, γ), neocortical lobe (frontal, temporal, parietal, occipital), hemisphere (left, right), and condition (meditation only, meditation with 7-Hz beats, meditation with 15-Hz beats) as the within-subject measures. The study was conducted at Laurentian University in Sudbury, Ontario, Canada. The subjects comprised novice (mean of 8 months experience) and experienced (mean of 18 years experience) meditators recruited from local meditation groups. Experimental manipulation included application of hindering and facilitative binaural beats to the meditative process. Experienced meditators displayed increased left temporal lobe δ power when the facilitative binaural beats were applied, whereas the effect was not observed for the novice subjects in this condition. When the hindering binaural beats were introduced, the novice subjects consistently displayed more γ power than the experienced subjects over the course of their meditation, relative to baseline. Based on the results of this study, novice meditators were not able to maintain certain levels of θ power in the occipital regions when hindering binaural beats were presented, whereas when the facilitative binaural beats were presented, the experienced meditators displayed increased θ power in the left temporal lobe. These results suggest that the experienced meditators have developed techniques over the course of their meditation practice to counter hindering environmental stimuli, whereas the novice meditators have not yet developed those techniques.
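
    A minimal sketch of the band-power computation underlying QEEG measures of this kind, assuming a single synthetic channel, Welch spectral estimation and conventional band edges; none of these choices are taken from the study itself.

```python
# Hedged sketch: Welch power spectral density of one EEG channel, integrated over
# the classic frequency bands. The signal below is simulated.
import numpy as np
from scipy.signal import welch

fs = 256.0                                   # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)                 # one minute of data
rng = np.random.default_rng(5)
eeg = (20 * np.sin(2 * np.pi * 7 * t)        # strong theta component
       + 10 * np.sin(2 * np.pi * 10 * t)     # alpha component
       + rng.normal(0, 5, t.size))           # broadband noise

bands = {"delta": (1, 4), "theta": (4, 8), "low alpha": (8, 10),
         "high alpha": (10, 13), "beta": (13, 30), "gamma": (30, 45)}

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
for name, (lo, hi) in bands.items():
    mask = (freqs >= lo) & (freqs < hi)
    power = np.trapz(psd[mask], freqs[mask])  # integrate PSD over the band
    print(f"{name:10s}: {power:8.1f} uV^2")
```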

  6. Exploring the use of storytelling in quantitative research fields using a multiple case study method

    NASA Astrophysics Data System (ADS)

    Matthews, Lori N. Hamlet

    The purpose of this study was to explore the emerging use of storytelling in quantitative research fields. The focus was not on examining storytelling in research, but rather on how stories are used in various ways within the social context of quantitative research environments. In-depth interviews were conducted with seven professionals who had experience using storytelling in their work, and my personal experience with the subject matter was also used as a source of data, following the notion of researcher-as-instrument. This study is qualitative in nature and is guided by two supporting theoretical frameworks, the sociological perspective and narrative inquiry. A multiple case study methodology was used to gain insight into why participants decided to use stories or storytelling in a quantitative research environment that may not be traditionally open to such methods. This study also attempted to identify how storytelling can strengthen or supplement existing research, as well as what value stories can provide to the practice of research in general. Five themes emerged from the data and were grouped under two headings, "Experiencing Research" and "Story Work." The themes were found to be consistent with four main theoretical functions of storytelling identified in existing scholarly literature: (a) sense-making; (b) meaning-making; (c) culture; and (d) communal function. The five themes that emerged from this study and were consistent with the existing literature are: (a) social context; (b) quantitative versus qualitative; (c) we think and learn in terms of stories; (d) stories tie experiences together; and (e) making sense and meaning. Recommendations are offered in the form of implications for various social contexts, and topics for further research are presented as well.

  7. A comparative study of quantitative microsegregation analyses performed during the solidification of the Ni-base superalloy CMSX-10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Seong-Moon, E-mail: castme@kims.re.kr; Jeong, Hi-Won; Ahn, Young-Keun

    Quantitative microsegregation analyses were systematically carried out during the solidification of the Ni-base superalloy CMSX-10 to clarify the methodological effect on the quantification of microsegregation and to fully understand the solidification microstructure. Three experimental techniques, namely, mushy zone quenching (MZQ), planar directional solidification followed by quenching (PDSQ), and random sampling (RS), were implemented for the analysis of microsegregation tendency and the magnitude of solute elements by electron probe microanalysis. The microprobe data and the calculation results of the diffusion field ahead of the solid/liquid (S/L) interface of PDSQ samples revealed that the liquid composition at the S/L interface is significantly influenced by quenching. By applying the PDSQ technique, it was also found that the partition coefficients of all solute elements do not change appreciably during the solidification of primary γ. All three techniques could reasonably predict the segregation behavior of most solute elements. Nevertheless, the RS approach has a tendency to overestimate the magnitude of segregation for most solute elements when compared to the MZQ and PDSQ techniques. Moreover, the segregation direction of Cr and Mo predicted by the RS approach was found to be opposite from the results obtained by the MZQ and PDSQ techniques. This conflicting segregation behavior of Cr and Mo was discussed intensively. It was shown that the formation of Cr-rich areas near the γ/γ′ eutectic in various Ni-base superalloys, including the CMSX-10 alloy, could be successfully explained by the results of microprobe analysis performed on a sample quenched during the planar directional solidification of γ/γ′ eutectic. - Highlights: • Methodological effect on the quantification of microsegregation was clarified. • The liquid composition at the S/L interface was influenced by quenching. • The segregation direction of Cr varied depending on

  8. A simple hemostasis model for the quantitative evaluation of hydrogel-based local hemostatic biomaterials on tissue surface.

    PubMed

    Murakami, Yoshihiko; Yokoyama, Masayuki; Nishida, Hiroshi; Tomizawa, Yasuko; Kurosawa, Hiromi

    2008-09-01

    Several hemostat hydrogels are clinically used, and some other agents are studied for safer, more facile, and more efficient hemostasis. In the present paper, we proposed a novel method to evaluate local hemostat hydrogels on tissue surfaces. The procedure consisted of the following steps: (step 1) a mouse was fixed on a cork board, and its abdomen was incised; (step 2) serous fluid was carefully removed because it affected the estimation of the weight gained by the filter paper, and parafilm and preweighed filter paper were placed beneath the liver (the parafilm prevented the filter paper's absorption of gradually oozing serous fluid); (step 3) the cork board was tilted and maintained at an angle of about 45 degrees so that the bleeding would more easily flow from the liver toward the filter paper; and (step 4) the bleeding lasted for 3 min. In this step, a hemostat was applied to the liver wound immediately after the liver was pricked with a needle. We found that (1) careful removal of serous fluid prior to bleeding and (2) quantitative determination of the amount of excess aqueous solution that oozed out from a hemostat were important for a rigorous evaluation of hemostat efficacy. We successfully evaluated the efficacy of a fibrin-based hemostat hydrogel using our method. The method proposed in the present study enabled the quantitative, accurate, and easy evaluation of the efficacy of local hemostatic hydrogels, which act as tissue-adhesive agents on biointerfaces.

  9. Quantitation of Localized 31P Magnetic Resonance Spectra Based on the Reciprocity Principle

    NASA Astrophysics Data System (ADS)

    Kreis, R.; Slotboom, J.; Pietz, J.; Jung, B.; Boesch, C.

    2001-04-01

    There is a need for absolute quantitation methods in 31P magnetic resonance spectroscopy, because none of the phosphorus-containing metabolites is necessarily constant in pathology. Here, a method for absolute quantitation of in vivo 31P MR spectra that provides reproducible metabolite contents in institutional or standard units is described. It relies on the reciprocity principle, i.e., the proportionality between the B1 field map and the map of reception strength for a coil with identical relative current distributions in receive and transmit mode. Cerebral tissue contents of 31P metabolites were determined in a predominantly white matter-containing location in healthy subjects. The results are in good agreement with the literature and the interexamination coefficient of variance is better than that in most previous studies. A gender difference found for some of the 31P metabolites may be explained by different voxel composition.
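
    A sketch, in equations, of how the reciprocity principle is commonly operationalized for absolute quantitation; the calibration constant, the U90 notation and the final expression are a generic formulation under these assumptions, not the paper's exact correction chain.

```latex
% Generic reciprocity argument: for a coil with identical relative current
% distributions in transmit and receive mode,
\[
  S \;\propto\; \omega_0\, B_1^{-}(\mathbf{r})\, m(\mathbf{r}),
  \qquad
  B_1^{-}(\mathbf{r})\big|_{\mathrm{unit\ current}} \;=\; B_1^{+}(\mathbf{r})\big|_{\mathrm{unit\ current}},
\]
% so the reception sensitivity can be read off a transmit (B1) field map. If the
% transmit field per unit drive scales as 1/U_90 (the reference amplitude of a
% 90-degree pulse at the voxel), the metabolite content follows, up to a constant k
% fixed with a reference of known concentration:
\[
  [\mathrm{met}] \;\approx\; k\, \frac{S_{\mathrm{met}}\, U_{90}}{V_{\mathrm{voxel}}}.
\]
```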

  10. Quantitative Relationship Between Cumulative Risk Alleles Based on Genome-Wide Association Studies and Type 2 Diabetes Mellitus: A Systematic Review and Meta-analysis.

    PubMed

    Kodama, Satoru; Fujihara, Kazuya; Ishiguro, Hajime; Horikawa, Chika; Ohara, Nobumasa; Yachi, Yoko; Tanaka, Shiro; Shimano, Hitoshi; Kato, Kiminori; Hanyu, Osamu; Sone, Hirohito

    2018-01-05

    Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. Logarithm of the estimated OR (log OR) of T2DM for 1 increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13-1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08-1.13). There was large heterogeneity in the magnitude of log OR (P < 0.001 for both cross-sectional studies and longitudinal studies). The top 10 commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest risk of prevalent or incident T2DM, although the heterogeneity in the used genes among studies requires us to interpret the results with caution.

  11. Quantitative Relationship Between Cumulative Risk Alleles Based on Genome-Wide Association Studies and Type 2 Diabetes Mellitus: A Systematic Review and Meta-analysis

    PubMed Central

    Kodama, Satoru; Fujihara, Kazuya; Ishiguro, Hajime; Horikawa, Chika; Ohara, Nobumasa; Yachi, Yoko; Tanaka, Shiro; Shimano, Hitoshi; Kato, Kiminori; Hanyu, Osamu; Sone, Hirohito

    2018-01-01

    Many epidemiological studies have assessed the genetic risk of having undiagnosed or of developing type 2 diabetes mellitus (T2DM) using several single nucleotide polymorphisms (SNPs) based on findings of genome-wide association studies (GWAS). However, the quantitative association of cumulative risk alleles (RAs) of such SNPs with T2DM risk has been unclear. The aim of this meta-analysis is to review the strength of the association between cumulative RAs and T2DM risk. Systematic literature searches were conducted for cross-sectional or longitudinal studies that examined odds ratios (ORs) for T2DM in relation to genetic profiles. Logarithm of the estimated OR (log OR) of T2DM for 1 increment in RAs carried (1-ΔRA) in each study was pooled using a random-effects model. There were 46 eligible studies that included 74,880 cases among 249,365 participants. In 32 studies with a cross-sectional design, the pooled OR for T2DM morbidity for 1-ΔRA was 1.16 (95% confidence interval [CI], 1.13–1.19). In 15 studies that had a longitudinal design, the OR for incident T2DM was 1.10 (95% CI, 1.08–1.13). There was large heterogeneity in the magnitude of log OR (P < 0.001 for both cross-sectional studies and longitudinal studies). The top 10 commonly used genes significantly explained the variance in the log OR (P = 0.04 for cross-sectional studies; P = 0.006 for longitudinal studies). The current meta-analysis indicated that carrying 1-ΔRA in T2DM-associated SNPs was associated with a modest risk of prevalent or incident T2DM, although the heterogeneity in the used genes among studies requires us to interpret the results with caution. PMID:29093303
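
    A minimal sketch of DerSimonian-Laird random-effects pooling of per-study log odds ratios for one risk-allele increment, the kind of synthesis described above; the study-level estimates and standard errors are invented for illustration.

```python
# Hedged sketch: random-effects (DerSimonian-Laird) pooling of log ORs per 1-ΔRA.
import numpy as np

log_or = np.log(np.array([1.12, 1.20, 1.08, 1.18, 1.15]))   # per-study log OR
se     = np.array([0.03, 0.05, 0.04, 0.06, 0.02])           # per-study standard errors

w_fixed = 1 / se**2
theta_fixed = np.sum(w_fixed * log_or) / np.sum(w_fixed)

# Between-study heterogeneity: DerSimonian-Laird estimator of tau^2.
Q = np.sum(w_fixed * (log_or - theta_fixed) ** 2)
df = len(log_or) - 1
tau2 = max(0.0, (Q - df) / (np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)))

w_random = 1 / (se**2 + tau2)
theta = np.sum(w_random * log_or) / np.sum(w_random)
se_theta = np.sqrt(1 / np.sum(w_random))
ci = np.exp([theta - 1.96 * se_theta, theta + 1.96 * se_theta])

print(f"pooled OR per risk allele: {np.exp(theta):.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")
print(f"Q = {Q:.2f}, tau^2 = {tau2:.4f}")
```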

  12. An Inquiry-Based Quantitative Reasoning Course for Business Students

    ERIC Educational Resources Information Center

    Piercey, Victor; Militzer, Erin

    2017-01-01

    Quantitative Reasoning for Business is a two-semester sequence that serves as an alternative to elementary and intermediate algebra for first-year business students with weak mathematical preparation. Students who take the sequence have been retained at a higher rate and have demonstrated a larger reduction in math anxiety than those who take the…

  13. Quantitative and multiplexed detection for blood typing based on quantum dot-magnetic bead assay.

    PubMed

    Xu, Ting; Zhang, Qiang; Fan, Ya-Han; Li, Ru-Qing; Lu, Hua; Zhao, Shu-Ming; Jiang, Tian-Lun

    2017-01-01

    Accurate and reliable blood grouping is essential for safe blood transfusion. However, conventional methods are qualitative and detect only a single antigen at a time. We overcame these limitations by developing a simple, quantitative, and multiplexed detection method for blood grouping using quantum dots (QDs) and magnetic beads. In the QD fluorescence assay (QFA), blood group A and B antigens were quantified using QD labeling and magnetic beads, and the blood groups were identified according to the R value (a ratio calculated from the fluorescence intensities of the dual QD labels) of the A and B antigens. The optimized performance of the QFA was established by blood typing 791 clinical samples. Quantitative and multiplexed detection of blood group antigens can be completed within 35 min with more than 10⁵ red blood cells. Under optimized conditions, assay performance is satisfactory even for weak samples. The between-day and within-day coefficients of variation were less than 10%, and reproducibility was good. The ABO blood groups of 791 clinical samples were identified by QFA, and the accuracy was 100% compared with the tube test. Receiver operating characteristic curves revealed that the QFA has high sensitivity and specificity for clinical samples, and the cutoff points of the R value for the A and B antigens were 1.483 and 1.576, respectively. In this study, we reported a novel quantitative and multiplexed method for the identification of ABO blood groups and presented an effective alternative for quantitative blood typing. This method can be used as an effective tool to improve blood typing and further guarantee clinical transfusion safety.
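
    The decision rule implied by the reported cutoffs can be sketched as below. The R values, the helper name, and the exact classification logic are illustrative assumptions, since the abstract gives only the two cutoff points.

        # Cutoffs taken from the reported ROC analysis; the decision rule below
        # is an illustrative reconstruction, not the authors' exact logic.
        CUTOFF_A = 1.483
        CUTOFF_B = 1.576

        def abo_from_r_values(r_a: float, r_b: float) -> str:
            """Classify an ABO group from the R values of the A and B antigens."""
            has_a = r_a >= CUTOFF_A
            has_b = r_b >= CUTOFF_B
            if has_a and has_b:
                return "AB"
            if has_a:
                return "A"
            if has_b:
                return "B"
            return "O"

        print(abo_from_r_values(2.1, 0.9))  # -> "A"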

  14. Implementing Response to Intervention in Title I Elementary Schools: A Quantitative Study of Teacher Response Relationships

    ERIC Educational Resources Information Center

    Webster, Katina F.

    2012-01-01

    General educators and special educators in Title I elementary schools perceive the relationships between principles of RTI and their state RTI framework, the implementation of RTI, and professional development received in RTI differently. A quantitative survey-based research methodology was employed including the use of Cronbach's alpha to…

  15. Quantitative analysis of glycated albumin in serum based on ATR-FTIR spectrum combined with SiPLS and SVM.

    PubMed

    Li, Yuanpeng; Li, Fucui; Yang, Xinhao; Guo, Liu; Huang, Furong; Chen, Zhenqiang; Chen, Xingdan; Zheng, Shifu

    2018-08-05

    A rapid quantitative analysis model for determining glycated albumin (GA) content based on attenuated total reflectance Fourier transform infrared (ATR-FTIR) spectroscopy combined with linear SiPLS and nonlinear SVM has been developed. First, the reference GA content in human serum was determined by an enzymatic GA method, and the ATR-FTIR spectra of serum samples from a health-examination population were obtained. Spectral data from the whole mid-infrared region (4000–600 cm⁻¹) and from the characteristic region of GA (1800–800 cm⁻¹) were used for quantitative analysis. Second, several preprocessing steps, including first derivative, second derivative, variable standardization, and spectral normalization, were performed. Finally, quantitative regression models were established using SiPLS and SVM. The SiPLS modeling results were as follows: root mean square error of cross-validation (RMSECV_T) = 0.523 g/L, calibration coefficient (R_C) = 0.937, root mean square error of prediction (RMSEP_T) = 0.787 g/L, and prediction coefficient (R_P) = 0.938. The SVM modeling results were as follows: RMSECV_T = 0.0048 g/L, R_C = 0.998, RMSEP_T = 0.442 g/L, and R_P = 0.916. The results indicated that model performance improved significantly after preprocessing and optimization of the characteristic regions, and that the nonlinear SVM model performed considerably better than the linear SiPLS model. Hence, the quantitative analysis model for GA in human serum based on ATR-FTIR combined with SiPLS and SVM is effective; it requires no sample pretreatment, is simple to operate, and is time-efficient, providing a rapid and accurate method for determining GA content.
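
    The modeling workflow (preprocessing, calibration, and RMSECV/RMSEP evaluation) can be sketched in Python with scikit-learn. Note that plain PLS regression stands in for the interval-selecting SiPLS used in the paper, and the spectra and GA values below are synthetic placeholders.

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.svm import SVR
        from sklearn.model_selection import cross_val_predict, train_test_split
        from sklearn.metrics import mean_squared_error

        def snv(spectra):
            """Standard normal variate: center and scale each spectrum (row)."""
            return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

        # Synthetic stand-in data: 60 spectra x 600 wavenumber points, GA content in g/L
        rng = np.random.default_rng(0)
        X = snv(rng.normal(size=(60, 600)))
        y = rng.uniform(0.5, 4.0, size=60)

        X_cal, X_test, y_cal, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

        for name, model in [("PLS", PLSRegression(n_components=8)), ("SVR", SVR(C=10.0, epsilon=0.1))]:
            y_cv = np.ravel(cross_val_predict(model, X_cal, y_cal, cv=5))  # cross-validated predictions
            model.fit(X_cal, y_cal)
            y_pred = np.ravel(model.predict(X_test))                       # hold-out predictions
            rmsecv = np.sqrt(mean_squared_error(y_cal, y_cv))
            rmsep = np.sqrt(mean_squared_error(y_test, y_pred))
            print(f"{name}: RMSECV={rmsecv:.3f} g/L, RMSEP={rmsep:.3f} g/L")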

  16. Quantile-based permutation thresholds for quantitative trait loci hotspots.

    PubMed

    Neto, Elias Chaibub; Keller, Mark P; Broman, Andrew F; Attie, Alan D; Jansen, Ritsert C; Broman, Karl W; Yandell, Brian S

    2012-08-01

    Quantitative trait loci (QTL) hotspots (genomic locations affecting many traits) are a common feature in genetical genomics studies and are biologically interesting since they may harbor critical regulators. Therefore, statistical procedures to assess the significance of hotspots are of key importance. One approach, randomly allocating observed QTL across the genomic locations separately by trait, implicitly assumes all traits are uncorrelated. Recently, an empirical test for QTL hotspots was proposed on the basis of the number of traits that exceed a predetermined LOD value, such as the standard permutation LOD threshold. The permutation null distribution of the maximum number of traits across all genomic locations preserves the correlation structure among the phenotypes, avoiding the detection of spurious hotspots due to nongenetic correlation induced by uncontrolled environmental factors and unmeasured variables. However, by considering only the number of traits above a threshold, without accounting for the magnitude of the LOD scores, relevant information is lost. In particular, biologically interesting hotspots composed of a moderate to small number of traits with strong LOD scores may be neglected as nonsignificant. In this article we propose a quantile-based permutation approach that simultaneously accounts for the number and the LOD scores of traits within the hotspots. By considering a sliding scale of mapping thresholds, our method can assess the statistical significance of both small and large hotspots. Although the proposed approach can be applied to any type of heritable high-volume "omic" data set, we restrict our attention to expression (e)QTL analysis. We assess and compare the performances of these three methods in simulations and we illustrate how our approach can effectively assess the significance of moderate and small hotspots with strong LOD scores in a yeast expression data set.
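
    A minimal sketch of the counting step behind such a quantile-based hotspot threshold is given below. The permuted LOD matrices are simulated stand-ins; in a real analysis each would come from re-mapping row-permuted phenotypes so that the correlation structure among traits is preserved, as the abstract describes.

        import numpy as np

        rng = np.random.default_rng(1)

        def hotspot_size_threshold(perm_lod_matrices, lod_thresholds, alpha=0.05):
            """For each LOD threshold, return the 1-alpha quantile of the genome-wide
            maximum hotspot size (number of traits exceeding the threshold at a locus)
            over the permutation null distributions."""
            n_perm = len(perm_lod_matrices)
            out = {}
            for t in lod_thresholds:
                max_counts = np.empty(n_perm)
                for i, lod in enumerate(perm_lod_matrices):
                    counts = (lod >= t).sum(axis=0)   # traits above threshold at each locus
                    max_counts[i] = counts.max()      # genome-wide maximum hotspot size
                out[t] = np.quantile(max_counts, 1 - alpha)
            return out

        # Stand-in permutation results: 100 permutations of a 200-trait x 100-locus LOD matrix
        perm_lods = [np.abs(rng.normal(scale=1.5, size=(200, 100))) for _ in range(100)]
        print(hotspot_size_threshold(perm_lods, lod_thresholds=[2.0, 3.0, 5.0]))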

  17. Development and application of SINE multilocus and quantitative genetic markers to study oilseed rape (Brassica napus L.) crops.

    PubMed

    Allnutt, T R; Roper, K; Henry, C

    2008-01-23

    A genetic marker system based on the S1 Short Interspersed Elements (SINEs) in the important commercial crop oilseed rape (Brassica napus L.) has been developed. SINEs provided a successful multilocus, dominant marker system that was capable of clearly delineating winter- and spring-type crop varieties. Sixteen of 20 varieties tested showed unique profiles from the 17 polymorphic SINE markers generated. The 3' or 5' flanking regions of nine SINE markers were cloned and sequenced. In addition, one putative pre-transposition SINE allele was cloned and sequenced. Two SINE flanking sequences were used to design real-time PCR assays. These quantitative SINE assays were applied to study the genetic structure of eight fields of oilseed rape crops. The studied fields were more genetically diverse than expected for the chosen loci (mean H_T = 0.23). The spatial distribution of SINE marker frequencies was highly structured in some fields, suggesting locations of volunteer impurities within the crop. In one case, the assay identified a mislabeling of the crop variety. SINE markers were a useful tool for crop genetics, phylogenetics, variety identification, and purity analysis. The use and further application of quantitative, real-time PCR markers are discussed.
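
    For readers unfamiliar with diversity estimates from dominant markers such as SINE presence/absence scores, the following generic Python sketch estimates per-locus gene diversity under Hardy-Weinberg assumptions. It is a standard textbook estimator offered for orientation only, not necessarily the exact H_T statistic used by the authors, and the toy scores are invented.

        import numpy as np

        def dominant_marker_diversity(band_presence):
            """Estimate per-locus gene diversity (2pq) from dominant presence/absence
            scores (individuals x loci), assuming Hardy-Weinberg proportions so that
            the null-allele frequency q is sqrt(proportion of band-absent individuals)."""
            band_presence = np.asarray(band_presence, dtype=float)
            freq_absent = 1.0 - band_presence.mean(axis=0)  # proportion lacking the band at each locus
            q = np.sqrt(freq_absent)                        # null (band-absent) allele frequency
            p = 1.0 - q
            return 2.0 * p * q                              # expected heterozygosity per locus

        # Toy scores: 10 plants x 4 SINE loci (1 = band present, 0 = absent)
        scores = np.random.default_rng(2).integers(0, 2, size=(10, 4))
        h = dominant_marker_diversity(scores)
        print(h.round(2), "mean:", h.mean().round(2))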

  18. Quantitative multiplex detection of biomarkers on a waveguide-based biosensor using quantum dots

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Hongzhi; Mukundan, Harshini; Martinez, Jennifer S

    2009-01-01

    The quantitative, simultaneous detection of multiple biomarkers with high sensitivity and specificity is critical for biomedical diagnostics, drug discovery, and biomarker characterization [Wilson 2006, Tok 2006, Straub 2005, Joos 2002, Jani 2000]. Detection systems relying on optical signal transduction are, in general, advantageous because they are fast, portable, inexpensive, and sensitive, and have the potential for multiplex detection of analytes of interest. However, conventional immunoassays for the detection of biomarkers, such as Enzyme-Linked Immunosorbent Assays (ELISAs), are semi-quantitative, time-consuming, and insensitive. ELISAs are also limited by high non-specific binding, especially when used with complex biological samples such as serum and urine (REF). Organic fluorophores that are commonly used in such applications lack photostability and possess a narrow Stokes shift, which makes simultaneous detection of multiple fluorophores with a single excitation source difficult, thereby restricting their use in multiplex assays. The above limitations of traditional assay platforms have resulted in the increased use of nanotechnology-based tools and techniques in the fields of medical imaging [ref], targeted drug delivery [Caruthers 2007, Liu 2007], and sensing [ref]. One such area of increasing interest is the use of semiconductor quantum dots (QDs) for biomedical research and diagnostics [Gao and Cui 2004, Voura 2004, Michalet 2005, Chan 2002, Jaiswal 2004, Gao 2005, Medintz 2005, So 2006, Wu 2003]. Compared to organic dyes, QDs provide several advantages for use in immunoassay platforms, including broad absorption bands with high extinction coefficients, narrow and symmetric emission bands with high quantum yields, high photostability, and a large Stokes shift [Michalet 2005, Gu 2002]. These features prompted the use of QDs as probes in biodetection [Michalet 2005, Medintz 2005]. For example, Jaiswal et al. reported long term

  19. Quantitative background parenchymal uptake on molecular breast imaging and breast cancer risk: a case-control study.

    PubMed

    Hruska, Carrie B; Geske, Jennifer R; Swanson, Tiffinee N; Mammel, Alyssa N; Lake, David S; Manduca, Armando; Conners, Amy Lynn; Whaley, Dana H; Scott, Christopher G; Carter, Rickey E; Rhodes, Deborah J; O'Connor, Michael K; Vachon, Celine M

    2018-06-05

    Background parenchymal uptake (BPU), which refers to the level of Tc-99m sestamibi uptake within normal fibroglandular tissue on molecular breast imaging (MBI), has been identified as a breast cancer risk factor, independent of mammographic density. Prior analyses have used subjective categories to describe BPU. We evaluate a new quantitative method for assessing BPU by testing its reproducibility, comparing quantitative results with previously established subjective BPU categories, and determining the association of quantitative BPU with breast cancer risk. Two nonradiologist operators independently performed region-of-interest analysis on MBI images viewed in conjunction with corresponding digital mammograms. Quantitative BPU was defined as a unitless ratio of the average pixel intensity (counts/pixel) within the fibroglandular tissue versus the average pixel intensity in fat. Operator agreement and the correlation of quantitative BPU measures with subjective BPU categories assessed by expert radiologists were determined. Percent density on mammograms was estimated using Cumulus. The association of quantitative BPU with breast cancer (per one unit BPU) was examined within an established case-control study of 62 incident breast cancer cases and 177 matched controls. Quantitative BPU ranged from 0.4 to 3.2 across all subjects and was, on average, higher in cases than in controls (1.4 versus 1.2, p < 0.007 for both operators). Quantitative BPU was strongly correlated with subjective BPU categories (Spearman's r = 0.59 to 0.69, p < 0.0001, for each paired combination of two operators and two radiologists). Interoperator and intraoperator agreement in the quantitative BPU measure, assessed by intraclass correlation, were 0.92 and 0.98, respectively. Quantitative BPU measures showed either no correlation or weak negative correlation with mammographic percent density. In a model adjusted for body mass index and percent density, higher quantitative BPU was
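
    The quantitative BPU definition above (mean counts/pixel in the fibroglandular-tissue ROI divided by mean counts/pixel in the fat ROI) translates directly into a few lines of Python. The image, the ROI masks, and the function name below are synthetic illustrations, not the authors' implementation.

        import numpy as np

        def quantitative_bpu(mbi_image, fibroglandular_mask, fat_mask):
            """Quantitative BPU: the unitless ratio of mean pixel intensity (counts/pixel)
            within the fibroglandular-tissue ROI to that within the fat ROI."""
            return mbi_image[fibroglandular_mask].mean() / mbi_image[fat_mask].mean()

        # Toy example: a 100x100 counts image with hypothetical rectangular ROI masks
        rng = np.random.default_rng(3)
        img = rng.poisson(lam=50, size=(100, 100)).astype(float)
        fg_mask = np.zeros_like(img, dtype=bool); fg_mask[20:60, 20:60] = True
        fat_mask = np.zeros_like(img, dtype=bool); fat_mask[70:90, 70:90] = True
        img[fg_mask] *= 1.3   # simulate modestly higher uptake in fibroglandular tissue
        print(round(quantitative_bpu(img, fg_mask, fat_mask), 2))   # approximately 1.3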

  20. The Relationship between Stress, Coping Style, and Academic Satisfaction: A Quantitative Study

    ERIC Educational Resources Information Center

    Hodge-Windover, Sheila T.

    2017-01-01

    College students experience a great deal of stress, which is associated with poor health and poor levels of academic satisfaction which can lead to low retention. The purpose of this quantitative correlational study was to investigate how stress and coping style predict academic satisfaction and understand how and if coping style moderates the…