Applying Mixed Methods Research at the Synthesis Level: An Overview
ERIC Educational Resources Information Center
Heyvaert, Mieke; Maes, Bea; Onghena, Patrick
2011-01-01
Historically, qualitative and quantitative approaches have been applied relatively separately in synthesizing qualitative and quantitative evidence, respectively, in several research domains. However, mixed methods approaches are becoming increasingly popular nowadays, and practices of combining qualitative and quantitative research components at…
de Greef-van der Sandt, I; Newgreen, D; Schaddelee, M; Dorrepaal, C; Martina, R; Ridder, A; van Maanen, R
2016-04-01
A multicriteria decision analysis (MCDA) approach was developed and used to estimate the benefit-risk of solifenacin and mirabegron and their combination in the treatment of overactive bladder (OAB). The objectives were 1) to develop an MCDA tool to compare drug effects in OAB quantitatively, 2) to establish transparency in the evaluation of the benefit-risk profile of various dose combinations, and 3) to quantify the added value of combination use compared to monotherapies. The MCDA model was developed using efficacy, safety, and tolerability attributes, and the results of a phase II factorial design combination study were evaluated. Combinations of solifenacin 5 mg with mirabegron 25 mg or 50 mg (5+25 and 5+50) scored the highest clinical utility, supporting phase III development of solifenacin-mirabegron combination therapy at these dose regimens. This case study underlines the benefit of using a quantitative approach in clinical drug development programs. © 2015 The American Society for Clinical Pharmacology and Therapeutics.
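As a worked illustration of the linear-additive utility model typically used in MCDA (the study's actual attributes and weights are not given in the abstract; the names and numbers below are hypothetical), a minimal sketch in Python:

    def clinical_utility(scores, weights):
        # Linear-additive MCDA: utility = sum over attributes of weight * score.
        assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights should sum to 1"
        return sum(weights[a] * scores[a] for a in weights)

    # Hypothetical attributes and 0-1 normalized scores, not the study's values.
    weights = {"efficacy": 0.5, "safety": 0.3, "tolerability": 0.2}
    monotherapy = {"efficacy": 0.55, "safety": 0.80, "tolerability": 0.85}
    combination = {"efficacy": 0.70, "safety": 0.75, "tolerability": 0.80}
    print(clinical_utility(monotherapy, weights))  # 0.685
    print(clinical_utility(combination, weights))  # 0.735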
Boersema, Paul J.; Foong, Leong Yan; Ding, Vanessa M. Y.; Lemeer, Simone; van Breukelen, Bas; Philp, Robin; Boekhorst, Jos; Snel, Berend; den Hertog, Jeroen; Choo, Andre B. H.; Heck, Albert J. R.
2010-01-01
Several mass spectrometry-based assays have emerged for the quantitative profiling of cellular tyrosine phosphorylation. Ideally, these methods should reveal the exact sites of tyrosine phosphorylation, be quantitative, and not be cost-prohibitive. The latter is often an issue as typically several milligrams of (stable isotope-labeled) starting protein material are required to enable the detection of low abundance phosphotyrosine peptides. Here, we adopted and refined a peptide-centric immunoaffinity purification approach for the quantitative analysis of tyrosine phosphorylation by combining it with a cost-effective stable isotope dimethyl labeling method. We were able to identify by mass spectrometry, using just two LC-MS/MS runs, more than 1100 unique non-redundant phosphopeptides in HeLa cells from about 4 mg of starting material without requiring any further affinity enrichment, as close to 80% of the identified peptides were tyrosine phosphorylated peptides. Stable isotope dimethyl labeling could be incorporated prior to the immunoaffinity purification, even for the large quantities (mg) of peptide material used, enabling the quantification of differences in tyrosine phosphorylation upon pervanadate treatment or epidermal growth factor stimulation. Analysis of the epidermal growth factor-stimulated HeLa cells, a frequently used model system for tyrosine phosphorylation, resulted in the quantification of 73 regulated unique phosphotyrosine peptides. The quantitative data were found to be exceptionally consistent with the literature, evidencing that such a targeted quantitative phosphoproteomics approach can provide reproducible results. In general, the combination of immunoaffinity purification of tyrosine phosphorylated peptides with large scale stable isotope dimethyl labeling provides a cost-effective approach that can alleviate variation in sample preparation and analysis as samples can be combined early on. Using this approach, a rather complete qualitative and quantitative picture of tyrosine phosphorylation signaling events can be generated. PMID:19770167
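The quantitation step in such stable isotope dimethyl labeling experiments reduces to comparing channel intensities per peptide; a minimal sketch, with hypothetical peak intensities for one light/heavy pair:

    import math

    # Relative quantification from one light/heavy dimethyl-labeled peptide pair;
    # the summed MS1 peak intensities below are hypothetical.
    light = 3.2e6   # e.g., pervanadate-treated channel
    heavy = 1.6e6   # e.g., control channel
    log2_fold_change = math.log2(light / heavy)   # = 1.0, i.e., 2-fold up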
Safavi-Abbasi, Sam; de Oliveira, Jean G; Deshmukh, Pushpa; Reis, Cassius V; Brasiliense, Leonardo B C; Crawford, Neil R; Feiz-Erfan, Iman; Spetzler, Robert F; Preul, Mark C
2010-03-01
The aim of this study was to describe quantitatively the properties of the posterolateral approaches and their combination. Six silicone-injected cadaveric heads were dissected bilaterally. Quantitative data were generated with the Optotrak 3020 system (Northern Digital, Waterloo, Canada) and Surgiscope (Elekta Instruments, Inc., Atlanta, GA), including key anatomic points on the skull base and brainstem. All parameters were measured after the basic retrosigmoid craniectomy and then after combination with a basic far-lateral extension. The clinical results of 20 patients who underwent a combined retrosigmoid and far-lateral approach were reviewed. The change in accessibility to the lower clivus was greatest after the far-lateral extension (mean change, 43.62 ± 10.98 mm²; P = .001). Accessibility to the constant landmarks, Meckel's cave, internal auditory meatus, and jugular foramen did not change significantly between the 2 approaches (P > .05). The greatest change in accessibility to soft tissue between the 2 approaches was to the lower brainstem (mean change, 33.88 ± 5.25 mm²; P = .0001). Total removal was achieved in 75% of the cases. The average postoperative Glasgow Outcome Scale score of patients who underwent the combined retrosigmoid and far-lateral approach improved significantly, compared with the preoperative scores. The combination of the far-lateral and simple retrosigmoid approaches significantly increases the petroclival working area and access to the cranial nerves. However, risk of injury to neurovascular structures and time needed to extend the craniotomy must be weighed against the increased working area and angles of attack.
Rosedale, Mary; Malaspina, Dolores; Malamud, Daniel; Strauss, Shiela M; Horne, Jaclyn D; Abouzied, Salman; Cruciani, Ricardo A; Knotkova, Helena
2012-01-01
This article reports and discusses how quantitative (physiological and behavioral) and qualitative methods are being combined in an open-label pilot feasibility study. The study evaluates safety, tolerability, and acceptability of a protocol to treat depression in HIV-infected individuals, using a 2-week block of transcranial direct current stimulation (tDCS) over the dorsolateral prefrontal cortex. Major depressive disorder (MDD) is the second most prevalent psychiatric disorder after substance abuse among HIV-positive adults, and novel antidepressant treatments are needed for this vulnerable population. The authors describe the challenges and contributions derived from different research perspectives and methodological approaches and provide a philosophical framework for combining quantitative and qualitative measurements for a fuller examination of the disorder. Four methodological points are presented: (1) the value of combining quantitative and qualitative approaches; (2) the need for context-specific measures when studying patients with medical and psychiatric comorbidities; (3) the importance of research designs that integrate physiological, behavioral, and qualitative approaches when evaluating novel treatments; and (4) the need to explore the relationships between biomarkers, clinical symptom assessments, patient self-evaluations, and patient experiences when developing new, patient-centered protocols. The authors conclude that the complexity of studying novel treatments in complex and new patient populations requires complex research designs to capture the richness of data that inform translational research.
Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
A Qualitative-Quantitative H-NMR Experiment for the Instrumental Analysis Laboratory.
ERIC Educational Resources Information Center
Phillips, John S.; Leary, James J.
1986-01-01
Describes an experiment combining qualitative and quantitative information from hydrogen nuclear magnetic resonance spectra. Reviews theory, discusses the experimental approach, and provides sample results. (JM)
Halcomb, Elizabeth; Hickman, Louise
2015-04-08
Mixed methods research involves the use of qualitative and quantitative data in a single research project. It represents an alternative methodological approach, combining qualitative and quantitative research approaches, which enables nurse researchers to explore complex phenomena in detail. This article provides a practical overview of mixed methods research and its application in nursing, to guide the novice researcher considering a mixed methods research project.
1992-05-21
complete dependence on nerves. Organ culture of sciatic nerves, combined with an assay for axolotl transferrin developed earlier, allows quantitative study...axonal release of various unknown proteins. Combining this approach with the ELISA for quantitative measurement of axolotl transferrin developed with...light microscope autoradiographic analysis following binding of radiolabelled Tf. Studies of Tf synthesis will employ cDNA probes for axolotl Tf mRNA
Combining formal and functional approaches to topic structure.
Zellers, Margaret; Post, Brechtje
2012-03-01
Fragmentation between formal and functional approaches to prosodic variation is an ongoing problem in linguistic research. In particular, the frameworks of the Phonetics of Talk-in-Interaction (PTI) and Empirical Phonology (EP) take very different theoretical and methodological approaches to this kind of variation. We argue that it is fruitful to adopt the insights of both PTI's qualitative analysis and EP's quantitative analysis and combine them into a multiple-methods approach. One realm in which it is possible to combine these frameworks is in the analysis of discourse topic structure and the prosodic cues relevant to it. By combining a quantitative and a qualitative approach to discourse topic structure, it is possible to give a better account of the observed variation in prosody, for example in the case of fundamental frequency (F0) peak timing, which can be explained in terms of pitch accent distribution over different topic structure categories. Similarly, local and global patterns in speech rate variation can be better explained and motivated by adopting insights from both PTI and EP in the study of topic structure. Combining PTI and EP can provide better accounts of speech data as well as opening up new avenues of investigation which would not have been possible in either approach alone.
Kim, Jae Yoon; Moon, Jun-Cheol; Kim, Hyo Chul; Shin, Seungho; Song, Kitae; Kim, Kyung-Hee; Lee, Byung-Moo
2017-01-01
Premise of the study: Positional cloning in combination with phenotyping is a general approach to identify disease-resistance gene candidates in plants; however, it requires several time-consuming steps including population or fine mapping. Therefore, in the present study, we suggest a new combined strategy to improve the identification of disease-resistance gene candidates. Methods and Results: Downy mildew (DM)–resistant maize was selected from five cultivars using a spreader row technique. Positional cloning and bioinformatics tools were used to identify the DM-resistance quantitative trait locus marker (bnlg1702) and 47 protein-coding gene annotations. Eventually, five DM-resistance gene candidates, including bZIP34, Bak1, and Ppr, were identified by quantitative reverse-transcription PCR (RT-PCR) without fine mapping of the bnlg1702 locus. Conclusions: The combined protocol with the spreader row technique, quantitative trait locus positional cloning, and quantitative RT-PCR was effective for identifying DM-resistance candidate genes. This cloning approach may be applied to other whole-genome-sequenced crops or resistance to other diseases. PMID:28224059
Quantitative consensus of supervised learners for diffuse lung parenchymal HRCT patterns
NASA Astrophysics Data System (ADS)
Raghunath, Sushravya; Rajagopalan, Srinivasan; Karwoski, Ronald A.; Bartholmai, Brian J.; Robb, Richard A.
2013-03-01
Automated lung parenchymal classification usually relies on supervised learning of expert-chosen regions representative of the visually differentiable HRCT patterns specific to different pathologies (e.g., emphysema, ground glass, honeycombing, reticular, and normal). Considering the elusiveness of a single most discriminating similarity measure, a plurality of weak learners can be combined to improve machine learnability. Though a number of quantitative combination strategies exist, their efficacy is data and domain dependent. In this paper, we investigate multiple (N=12) quantitative consensus approaches to combine the clusters obtained with multiple (n=33) probability density-based similarity measures. Our study shows that hypergraph-based meta-clustering and probabilistic clustering provide optimal expert-metric agreement.
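The abstract does not spell out the twelve consensus strategies; one common family is evidence-accumulation (co-association) consensus, sketched below under that assumption (the cluster labels and k are hypothetical):

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def consensus_labels(label_sets, k):
        # Average co-association: fraction of clusterings that put i and j together.
        n = len(label_sets[0])
        coassoc = np.zeros((n, n))
        for labels in label_sets:
            labels = np.asarray(labels)
            coassoc += (labels[:, None] == labels[None, :]).astype(float)
        coassoc /= len(label_sets)
        dist = 1.0 - coassoc                      # consensus distance matrix
        np.fill_diagonal(dist, 0.0)
        Z = linkage(squareform(dist, checks=False), method="average")
        return fcluster(Z, t=k, criterion="maxclust")

    # Hypothetical: three weak clusterings of six ROI feature vectors, k=3.
    print(consensus_labels([[0, 0, 1, 1, 2, 2], [0, 0, 1, 2, 2, 2], [0, 1, 1, 1, 2, 2]], 3))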
Urban Multisensory Laboratory, an Approach to Model Urban Space Human Perception
NASA Astrophysics Data System (ADS)
González, T.; Sol, D.; Saenz, J.; Clavijo, D.; García, H.
2017-09-01
An urban sensory lab (USL, or LUS by its Spanish acronym) is a new and avant-garde approach to studying and analyzing a city. The construction of this approach allows the development of new methodologies to identify the emotional response of public space users. The laboratory combines qualitative analysis proposed by urbanists with quantitative measures managed by data analysis applications. USL is a new approach to go beyond the borders of urban knowledge. The design thinking strategy allows us to implement methods to understand the results provided by our technique. In this first approach, the interpretation is done by hand. However, our goal is to combine design thinking and machine learning in order to analyze the qualitative and quantitative data automatically. The results are currently being used by students from the Urbanism and Architecture courses to gain a better understanding of public spaces in Puebla, Mexico, and their interaction with people.
Readability Approaches: Implications for Turkey
ERIC Educational Resources Information Center
Ulusoy, Mustafa
2006-01-01
Finding the right fit between students' reading ability and textbooks is very important for comprehension. Readability studies aim to analyse texts to find the right fit between students and texts. In this literature review, readability studies are classified under quantitative, qualitative and combined quantitative-qualitative readability…
ERIC Educational Resources Information Center
Thang, Siew Ming; Mustaffa, Rosniah; Wong, Fook Fei; Noor, Noorizah Mohd.; Mahmud, Najihah; Latif, Hafizah; Aziz, Mohd. Sallehhudin Abd.
2013-01-01
Blended learning has been described as a pedagogical approach that combines effectiveness and socialization opportunities of the classroom with the technologically enhanced active learning possibilities of the online environment (Dziuban, Hartman, & Moskal, 2004). It has also been depicted as an approach that combines traditional learning with…
Quantitative Resistance to Plant Pathogens in Pyramiding Strategies for Durable Crop Protection.
Pilet-Nayel, Marie-Laure; Moury, Benoît; Caffier, Valérie; Montarry, Josselin; Kerlan, Marie-Claire; Fournet, Sylvain; Durel, Charles-Eric; Delourme, Régine
2017-01-01
Quantitative resistance has gained interest in plant breeding for pathogen control in low-input cropping systems. Although quantitative resistance frequently has only a partial effect and is difficult to select, it is considered more durable than major resistance (R) genes. With the exponential development of molecular markers over the past 20 years, resistance QTL have been more accurately detected and better integrated into breeding strategies for resistant varieties with increased potential for durability. This review summarizes current knowledge on the genetic inheritance, molecular basis, and durability of quantitative resistance. Based on this knowledge, we discuss how strategies that combine major R genes and QTL in crops can maintain the effectiveness of plant resistance to pathogens. Combining resistance QTL with complementary modes of action appears to be an interesting strategy for breeding effective and potentially durable resistance. Combining quantitative resistance with major R genes has proven to be a valuable approach for extending the effectiveness of major genes. In the plant genomics era, improved tools and methods are becoming available to better integrate quantitative resistance into breeding strategies. Nevertheless, optimal combinations of resistance loci will still have to be identified to preserve resistance effectiveness over time for durable crop protection.
Marchand, Jérémy; Martineau, Estelle; Guitton, Yann; Dervilly-Pinel, Gaud; Giraudeau, Patrick
2017-02-01
Multi-dimensional NMR is an appealing approach for dealing with the challenging complexity of biological samples in metabolomics. This article describes how spectroscopists have recently challenged their imagination in order to make 2D NMR a powerful tool for quantitative metabolomics, based on innovative pulse sequences combined with meticulous analytical chemistry approaches. Clever time-saving strategies have also been explored to make 2D NMR a high-throughput tool for metabolomics, relying on alternative data acquisition schemes such as ultrafast NMR. Currently, much work is aimed at drastically boosting the NMR sensitivity thanks to hyperpolarisation techniques, which have been used in combination with fast acquisition methods and could greatly expand the application potential of NMR metabolomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
What Good Are Statistics that Don't Generalize?
ERIC Educational Resources Information Center
Shaffer, David Williamson; Serlin, Ronald C.
2004-01-01
Quantitative and qualitative inquiry are sometimes portrayed as distinct and incompatible paradigms for research in education. Approaches to combining qualitative and quantitative research typically "integrate" the two methods by letting them co-exist independently within a single research study. Here we describe intra-sample statistical analysis…
Yoshikawa, Hirokazu; Weisner, Thomas S; Kalil, Ariel; Way, Niobe
2008-03-01
Multiple methods are vital to understanding development as a dynamic, transactional process. This article focuses on the ways in which quantitative and qualitative methodologies can be combined to enrich developmental science and the study of human development, focusing on the practical questions of "when" and "how." Research situations that may be especially suited to mixing qualitative and quantitative approaches are described. The authors also discuss potential choices for using mixed quantitative- qualitative approaches in study design, sampling, construction of measures or interview protocols, collaborations, and data analysis relevant to developmental science. Finally, they discuss some common pitfalls that occur in mixing these methods and include suggestions for surmounting them.
Mixed methods in gerontological research: Do the qualitative and quantitative data “touch”?
Happ, Mary Beth
2010-01-01
This paper distinguishes between parallel and integrated mixed methods research approaches. Barriers to integrated mixed methods approaches in gerontological research are discussed and critiqued. The author presents examples of mixed methods gerontological research to illustrate approaches to data integration at the levels of data analysis, interpretation, and research reporting. As a summary of the methodological literature, four basic levels of mixed methods data combination are proposed. Opportunities for mixing qualitative and quantitative data are explored using contemporary examples from published studies. Data transformation and visual display, judiciously applied, are proposed as pathways to fuller mixed methods data integration and analysis. Finally, practical strategies for mixing qualitative and quantitative data types are explicated as gerontological research moves beyond parallel mixed methods approaches to achieve data integration. PMID:20077973
Virtualising the Quantitative Research Methods Course: An Island-Based Approach
ERIC Educational Resources Information Center
Baglin, James; Reece, John; Baker, Jenalle
2015-01-01
Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…
Application of Mixed-Methods Approaches to Higher Education and Intersectional Analyses
ERIC Educational Resources Information Center
Griffin, Kimberly A.; Museus, Samuel D.
2011-01-01
In this article, the authors discuss the utility of combining quantitative and qualitative methods in conducting intersectional analyses. First, they discuss some of the paradigmatic underpinnings of qualitative and quantitative research, and how these methods can be used in intersectional analyses. They then consider how paradigmatic pragmatism…
Allelic-based gene-gene interaction associated with quantitative traits.
Jung, Jeesun; Sun, Bin; Kwon, Deukwoo; Koller, Daniel L; Foroud, Tatiana M
2009-05-01
Recent studies have shown that quantitative phenotypes may be influenced not only by multiple single nucleotide polymorphisms (SNPs) within a gene but also by the interaction between SNPs at unlinked genes. We propose a new statistical approach that can detect gene-gene interactions at the allelic level which contribute to the phenotypic variation in a quantitative trait. By testing for the association of allelic combinations at multiple unlinked loci with a quantitative trait, we can detect the SNP allelic interaction whether or not it can be detected as a main effect. Our proposed method assigns a score to unrelated subjects according to their allelic combination inferred from observed genotypes at two or more unlinked SNPs, and then tests for the association of the allelic score with a quantitative trait. To investigate the statistical properties of the proposed method, we performed a simulation study to estimate type I error rates and power and demonstrated that this allelic approach achieves greater power than the more commonly used genotypic approach to test for gene-gene interaction. As an example, the proposed method was applied to data obtained as part of a candidate gene study of sodium retention by the kidney. We found that this method detects an interaction between the calcium-sensing receptor gene (CaSR), the chloride channel gene (CLCNKB) and the Na-K-2Cl cotransporter gene (SLC12A1) that contributes to variation in diastolic blood pressure.
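A minimal sketch of the score-then-regress idea described above, assuming the allelic score is simply a per-subject count for a given two-locus allele combination (the paper's exact scoring may differ; the data are simulated):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 200
    # Hypothetical allelic-combination score: how many copies of a specific
    # two-locus allele combination each unrelated subject carries (0-4).
    score = rng.integers(0, 5, size=n)
    # Simulated quantitative trait (e.g., diastolic blood pressure) with a
    # modest allelic-interaction effect added.
    trait = 90 + 2.5 * score + rng.normal(0, 8, n)
    result = stats.linregress(score, trait)
    print(f"slope={result.slope:.2f}, p={result.pvalue:.2g}")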
Mixed Methods Approaches in Family Science Research
ERIC Educational Resources Information Center
Plano Clark, Vicki L.; Huddleston-Casas, Catherine A.; Churchill, Susan L.; Green, Denise O'Neil; Garrett, Amanda L.
2008-01-01
The complex phenomena of interest to family scientists require the use of quantitative and qualitative approaches. Researchers across the social sciences are now turning to mixed methods designs that combine these two approaches. Mixed methods research has great promise for addressing family science topics, but only if researchers understand the…
The Eudaimonic and Hedonic Components of Happiness: Qualitative and Quantitative Findings
ERIC Educational Resources Information Center
Fave, Antonella Delle; Brdar, Ingrid; Freire, Teresa; Vella-Brodrick, Dianne; Wissing, Marie P.
2011-01-01
This paper illustrates a new project developed by a cross-country team of researchers, with the aim of studying the hedonic and eudaimonic components of happiness through a mixed method approach combining both qualitative and quantitative analyses. Data were collected from 666 participants in Australia, Croatia, Germany, Italy, Portugal, Spain,…
A Study Assessing the Potential of Negative Effects in Interdisciplinary Math-Biology Instruction
ERIC Educational Resources Information Center
Madlung, Andreas; Bremer, Martina; Himelblau, Edward; Tullis, Alexa
2011-01-01
There is increasing enthusiasm for teaching approaches that combine mathematics and biology. The call for integrating more quantitative work in biology education has led to new teaching tools that improve quantitative skills. Little is known, however, about whether increasing interdisciplinary work can lead to adverse effects, such as the…
Chen, Wen Hao; Yang, Sam Y. S.; Xiao, Ti Qiao; Mayo, Sherry C.; Wang, Yu Dan; Wang, Hai Peng
2014-01-01
Quantifying three-dimensional spatial distributions of pores and material compositions in samples is a key materials characterization challenge, particularly in samples where compositions are distributed across a range of length scales, and where such compositions have similar X-ray absorption properties, such as in coal. Consequently, obtaining detailed information within sub-regions of a multi-length-scale sample by conventional approaches may not provide the resolution and level of detail one might desire. Herein, an approach for quantitative high-definition determination of material compositions from X-ray local computed tomography combined with a data-constrained modelling method is proposed. The approach can dramatically improve the spatial resolution and reveal finer details within a region of interest of a sample larger than the field of view than conventional techniques can. A coal sample containing distributions of porosity and several mineral compositions is employed to demonstrate the approach. The optimal experimental parameters were analyzed in advance. The quantitative results demonstrated that the approach can reveal significantly finer details of compositional distributions in the sample region of interest. The elevated spatial resolution is crucial for coal-bed methane reservoir evaluation and for understanding the transformation of the minerals during coal processing. The method is generic and can be applied to three-dimensional compositional characterization of other materials. PMID:24763649
NASA Astrophysics Data System (ADS)
Tavakoli, Vahid; Stoddard, Marcus F.; Amini, Amir A.
2013-03-01
Quantitative motion analysis of echocardiographic images helps clinicians with the diagnosis and therapy of patients suffering from cardiac disease. Quantitative analysis is usually based on TDI (Tissue Doppler Imaging) or speckle tracking. These methods are based on two independent techniques - the Doppler Effect and image registration, respectively. In order to increase the accuracy of the speckle tracking technique and cope with the angle dependency of TDI, herein, a combined approach dubbed TDIOF (Tissue Doppler Imaging Optical Flow) is proposed. TDIOF is formulated based on the combination of B-mode and Doppler energy terms in an optical flow framework and minimized using algebraic equations. In this paper, we report on validations with simulated, physical cardiac phantom, and in-vivo patient data. It is shown that the additional Doppler term is able to increase the accuracy of speckle tracking, the basis for several commercially available echocardiography analysis techniques.
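The abstract does not give TDIOF's exact formulation; a plausible combined objective of the kind described, pairing a B-mode brightness-constancy term and a smoothness term with a Doppler-consistency term, would be the following (here \alpha and \beta are assumed weights, \mathbf{d} the ultrasound beam direction, and v_D the Doppler velocity):

    E(\mathbf{u}) = \int_{\Omega} \left( \nabla I \cdot \mathbf{u} + I_t \right)^2 d\mathbf{x}
                  + \alpha \int_{\Omega} \lVert \nabla \mathbf{u} \rVert^2 d\mathbf{x}
                  + \beta \int_{\Omega} \left( \mathbf{u} \cdot \mathbf{d} - v_D \right)^2 d\mathbf{x}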
Effectiveness of Social Media for Communicating Health Messages in Ghana
ERIC Educational Resources Information Center
Bannor, Richard; Asare, Anthony Kwame; Bawole, Justice Nyigmah
2017-01-01
Purpose: The purpose of this paper is to develop an in-depth understanding of the effectiveness, evolution and dynamism of the current health communication media used in Ghana. Design/methodology/approach: This paper uses a multi-method design that combines qualitative and quantitative approaches. In-depth interviews are…
Illustrating a Mixed-Method Approach for Validating Culturally Specific Constructs
ERIC Educational Resources Information Center
Hitchcock, J.H.; Nastasi, B.K.; Dai, D.Y.; Newman, J.; Jayasena, A.; Bernstein-Moore, R.; Sarkar, S.; Varjas, K.
2005-01-01
The purpose of this article is to illustrate a mixed-method approach (i.e., combining qualitative and quantitative methods) for advancing the study of construct validation in cross-cultural research. The article offers a detailed illustration of the approach using the responses 612 Sri Lankan adolescents provided to an ethnographic survey. Such…
ERIC Educational Resources Information Center
Davis, Eric J.; Pauls, Steve; Dick, Jonathan
2017-01-01
Presented is a project-based learning (PBL) laboratory approach for an upper-division environmental chemistry or quantitative analysis course. In this work, a combined laboratory class of 11 environmental chemistry students developed a method based on published EPA methods for the extraction of dichlorodiphenyltrichloroethane (DDT) and its…
[Mixed methods research in public health: issues and illustration].
Guével, Marie-Renée; Pommier, Jeanine
2012-01-01
For many years, researchers in a range of fields have combined quantitative and qualitative methods. However, the combined use of quantitative and qualitative methods has only recently been conceptualized and defined as mixed methods research. Some authors have described the emerging field as a third methodological tradition (in addition to the qualitative and quantitative traditions). Mixed methods research combines different perspectives and facilitates the study of complex interventions or programs, particularly in public health, an area where interdisciplinarity is critical. However, the existing literature is primarily in English. By contrast, the literature in French remains limited. The purpose of this paper is to present the emergence of mixed methods research for francophone public health specialists. A literature review was conducted to identify the main characteristics of mixed methods research. The results provide an overall picture of the mixed methods approach through its history, definitions, and applications, and highlight the tools developed to clarify the approach (typologies) and to implement it (integration of results and quality standards). The tools highlighted in the literature review are illustrated by a study conducted in France. Mixed methods research opens new possibilities for examining complex research questions and provides relevant and promising opportunities for addressing current public health issues in France.
ERIC Educational Resources Information Center
Hampden-Thompson, Gillian; Lubben, Fred; Bennett, Judith
2011-01-01
Quantitative secondary analysis of large-scale data can be combined with in-depth qualitative methods. In this paper, we discuss the role of this combined methods approach in examining the uptake of physics and chemistry in post compulsory schooling for students in England. The secondary data analysis of the National Pupil Database (NPD) served…
Reverse engineering systems models of regulation: discovery, prediction and mechanisms.
Ashworth, Justin; Wurtmann, Elisabeth J; Baliga, Nitin S
2012-08-01
Biological systems can now be understood in comprehensive and quantitative detail using systems biology approaches. Putative genome-scale models can be built rapidly based upon biological inventories and strategic system-wide molecular measurements. Current models combine statistical associations, causative abstractions, and known molecular mechanisms to explain and predict quantitative and complex phenotypes. This top-down 'reverse engineering' approach generates useful organism-scale models despite noise and incompleteness in data and knowledge. Here we review and discuss the reverse engineering of biological systems using top-down data-driven approaches, in order to improve discovery, hypothesis generation, and the inference of biological properties. Copyright © 2011 Elsevier Ltd. All rights reserved.
Paradigms, pragmatism and possibilities: mixed-methods research in speech and language therapy.
Glogowska, Margaret
2011-01-01
After the decades of the so-called 'paradigm wars' in social science research methodology and the controversy about the relative place and value of quantitative and qualitative research methodologies, 'paradigm peace' appears to have now been declared. This has come about as many researchers have begun to take a 'pragmatic' approach in the selection of research methodology, choosing the methodology best suited to answering the research question rather than conforming to a methodological orthodoxy. With the differences in the philosophical underpinnings of the two traditions set to one side, an increasing awareness, and valuing, of the 'mixed-methods' approach to research is now present in the fields of social, educational and health research. This paper explores what is meant by mixed-methods research and the ways in which quantitative and qualitative methodologies and methods can be combined and integrated, particularly in the broad field of health services research and the narrower one of speech and language therapy. It discusses the ways in which methodological approaches have already been combined and integrated in health services research and speech and language therapy, highlighting the suitability of mixed-methods research for answering the typically multifaceted questions arising from the provision of complex interventions. The challenges of combining and integrating quantitative and qualitative methods and the barriers to the adoption of mixed-methods approaches are also considered. Questions about healthcare as it is provided in the 21st century call for a range of methodological approaches. This is particularly the case for human communication and its disorders, where mixed-methods research offers a wealth of possibilities. In turn, speech and language therapy research should be able to contribute substantively to the future development of mixed-methods research. © 2010 Royal College of Speech & Language Therapists.
Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian
2016-02-01
Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft with a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combined quantifiable peptide data from both quantitative schemes increased the overall number of peptide level measurements and protein level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes. This quantifiable coverage improvement, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of quantitative estimates by 181%, so that some BC subtype-specific proteins were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique for obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
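The Ratio-of-Ratios idea can be made concrete with a small numeric sketch: because the heavy-labeled reference is spiked into each xenograft at a constant proportion, it cancels out of the ratio of light/heavy ratios (the intensities below are hypothetical):

    # The heavy-labeled (AACT/SILAC) reference is spiked into each xenograft at a
    # constant proportion, so it cancels in the ratio of ratios; intensities are
    # hypothetical for one peptide measured in basal (A) and luminal (B) tumors.
    light_A, heavy_A = 4.0e6, 2.0e6
    light_B, heavy_B = 1.0e6, 2.0e6
    ratio_of_ratios = (light_A / heavy_A) / (light_B / heavy_B)   # = 4.0 (A vs. B)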
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis workflow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Weaving a Stronger Fabric for Improved Outcomes
ERIC Educational Resources Information Center
Lobry de Bruyn, Lisa; Prior, Julian; Lenehan, Jo
2014-01-01
Purpose: To explain how training and education events (TEEs) can be designed to increase the likelihood of achieving behavioural objectives. Approach: The approach combined both a quantitative review of evaluation surveys undertaken at the time of the TEE, and qualitative telephone interviews with selected attendees (20–25% of the total population…
Industrial ecology: Quantitative methods for exploring a lower carbon future
NASA Astrophysics Data System (ADS)
Thomas, Valerie M.
2015-03-01
Quantitative methods for environmental and cost analyses of energy, industrial, and infrastructure systems are briefly introduced and surveyed, with the aim of encouraging broader utilization and development of quantitative methods in sustainable energy research. Material and energy flow analyses can provide an overall system overview. The methods of engineering economics and cost-benefit analysis, such as net present values, are the most straightforward approach for evaluating investment options, with the levelized cost of energy being a widely used metric in electricity analyses. Environmental lifecycle assessment has been extensively developed, with both detailed process-based and comprehensive input-output approaches available. Optimization methods provide an opportunity to go beyond engineering economics to develop detailed least-cost or least-impact combinations of many different choices.
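For reference, the levelized cost of energy mentioned above has the standard discounted form, with C_t the total costs and E_t the energy delivered in year t over lifetime T, and r the discount rate:

    \mathrm{LCOE} = \frac{\sum_{t=0}^{T} C_t \, (1+r)^{-t}}{\sum_{t=0}^{T} E_t \, (1+r)^{-t}}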
Partovi, Sasan; Yuh, Roger; Pirozzi, Sara; Lu, Ziang; Couturier, Spencer; Grosse, Ulrich; Schluchter, Mark D; Nelson, Aaron; Jones, Robert; O’Donnell, James K; Faulhaber, Peter
2017-01-01
The objective of this study was to assess the ability of a quantitative software-aided approach to improve the diagnostic accuracy of 18F FDG PET for Alzheimer’s dementia over visual analysis alone. Twenty normal subjects (M:F-12:8; mean age 80.6 years) and twenty mild AD subjects (M:F-12:8; mean age 70.6 years) with 18F FDG PET scans were obtained from the ADNI database. Three blinded readers interpreted these PET images first using a visual qualitative approach and then using a quantitative software-aided approach. Images were classified on two five-point scales based on normal/abnormal (1-definitely normal; 5-definitely abnormal) and presence of AD (1-definitely not AD; 5-definitely AD). Diagnostic sensitivity, specificity, and accuracy for both approaches were compared based on the aforementioned scales. The sensitivity, specificity, and accuracy for the normal vs. abnormal readings of all readers combined were higher when comparing the software-aided vs. visual approach (sensitivity 0.93 vs. 0.83 P = 0.0466; specificity 0.85 vs. 0.60 P = 0.0005; accuracy 0.89 vs. 0.72 P<0.0001). The specificity and accuracy for absence vs. presence of AD of all readers combined were higher when comparing the software-aided vs. visual approach (specificity 0.90 vs. 0.70 P = 0.0008; accuracy 0.81 vs. 0.72 P = 0.0356). Sensitivities of the software-aided and visual approaches did not differ significantly (0.72 vs. 0.73 P = 0.74). The quantitative software-aided approach appears to improve the performance of 18F FDG PET for the diagnosis of mild AD. It may be helpful for experienced 18F FDG PET readers analyzing challenging cases. PMID:28123864
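The comparison rests on the standard definitions of the three metrics, with TP, TN, FP, and FN counted against the reference diagnostic labels:

    \text{sensitivity} = \frac{TP}{TP + FN}, \qquad
    \text{specificity} = \frac{TN}{TN + FP}, \qquad
    \text{accuracy} = \frac{TP + TN}{TP + TN + FP + FN}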
Burnum-Johnson, Kristin E.; Nie, Song; Casey, Cameron P.; Monroe, Matthew E.; Orton, Daniel J.; Ibrahim, Yehia M.; Gritsenko, Marina A.; Clauss, Therese R. W.; Shukla, Anil K.; Moore, Ronald J.; Purvine, Samuel O.; Shi, Tujin; Qian, Weijun; Liu, Tao; Baker, Erin S.; Smith, Richard D.
2016-01-01
Current proteomic approaches include both broad discovery measurements and quantitative targeted analyses. In many cases, discovery measurements are initially used to identify potentially important proteins (e.g. candidate biomarkers) and then targeted studies are employed to quantify a limited number of selected proteins. Both approaches, however, suffer from limitations. Discovery measurements aim to sample the whole proteome but have lower sensitivity, accuracy, and quantitation precision than targeted approaches, whereas targeted measurements are significantly more sensitive but only sample a limited portion of the proteome. Herein, we describe a new approach that performs both discovery and targeted monitoring (DTM) in a single analysis by combining liquid chromatography, ion mobility spectrometry and mass spectrometry (LC-IMS-MS). In DTM, heavy labeled target peptides are spiked into tryptic digests and both the labeled and unlabeled peptides are detected using LC-IMS-MS instrumentation. Compared with the broad LC-MS discovery measurements, DTM yields greater peptide/protein coverage and detects lower abundance species. DTM also achieved detection limits similar to selected reaction monitoring (SRM) indicating its potential for combined high quality discovery and targeted analyses, which is a significant step toward the convergence of discovery and targeted approaches. PMID:27670688
Race and Older Mothers’ Differentiation: A Sequential Quantitative and Qualitative Analysis
Sechrist, Jori; Suitor, J. Jill; Riffin, Catherine; Taylor-Watson, Kadari; Pillemer, Karl
2011-01-01
The goal of this paper is to demonstrate a process by which qualitative and quantitative approaches are combined to reveal patterns in the data that are unlikely to be detected and confirmed by either method alone. Specifically, we take a sequential approach to combining qualitative and quantitative data to explore race differences in how mothers differentiate among their adult children. We began with a standard multivariate analysis examining race differences in mothers’ differentiation among their adult children regarding emotional closeness and confiding. Finding no race differences in this analysis, we conducted an in-depth comparison of the Black and White mothers’ narratives to determine whether there were underlying patterns that we had been unable to detect in our first analysis. Using this method, we found that Black mothers were substantially more likely than White mothers to emphasize interpersonal relationships within the family when describing differences among their children. In our final step, we developed a measure of familism based on the qualitative data and conducted a multivariate analysis to confirm the patterns revealed by the in-depth comparison of the mother’s narratives. We conclude that using such a sequential mixed methods approach to data analysis has the potential to shed new light on complex family relations. PMID:21967639
Using mixed methods effectively in prevention science: designs, procedures, and examples.
Zhang, Wanqing; Watanabe-Galloway, Shinobu
2014-10-01
There is growing interest in using a combination of quantitative and qualitative methods to generate evidence about the effectiveness of health prevention, services, and intervention programs. With the emerging importance of mixed methods research across the social and health sciences, there has been an increased recognition of the value of using mixed methods for addressing research questions in different disciplines. We illustrate the mixed methods approach in prevention research, showing design procedures used in several published research articles. In this paper, we focused on two commonly used mixed methods designs: concurrent and sequential mixed methods designs. We discuss the types of mixed methods designs, the reasons for, and advantages of using a particular type of design, and the procedures of qualitative and quantitative data collection and integration. The studies reviewed in this paper show that the essence of qualitative research is to explore complex dynamic phenomena in prevention science, and the advantage of using mixed methods is that quantitative data can yield generalizable results and qualitative data can provide extensive insights. However, the emphasis of methodological rigor in a mixed methods application also requires considerable expertise in both qualitative and quantitative methods. Besides the necessary skills and effective interdisciplinary collaboration, this combined approach also requires an open-mindedness and reflection from the involved researchers.
Tau-U: A Quantitative Approach for Analysis of Single-Case Experimental Data in Aphasia.
Lee, Jaime B; Cherney, Leora R
2018-03-01
Tau-U is a quantitative approach for analyzing single-case experimental design (SCED) data. It combines nonoverlap between phases with intervention phase trend and can correct for a baseline trend (Parker, Vannest, & Davis, 2011). We demonstrate the utility of Tau-U by comparing it with the standardized mean difference approach (Busk & Serlin, 1992) that is widely reported within the aphasia SCED literature. Repeated writing measures from 3 participants with chronic aphasia who received computer-based writing treatment are analyzed visually and quantitatively using both Tau-U and the standardized mean difference approach. Visual analysis alone was insufficient for determining an effect between the intervention and writing improvement. The standardized mean difference yielded effect sizes ranging from 4.18 to 26.72 for trained items and 1.25 to 3.20 for untrained items. Tau-U yielded significant (p < .05) effect sizes for 2 of 3 participants for trained probes and 1 of 3 participants for untrained probes. A baseline trend correction was applied to data from 2 of 3 participants. Tau-U has the unique advantage of allowing for the correction of an undesirable baseline trend. Although further study is needed, Tau-U shows promise as a quantitative approach to augment visual analysis of SCED data in aphasia.
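A minimal sketch of the two effect-size computations compared above: the Busk and Serlin (1992) standardized mean difference and the simple A-versus-B nonoverlap component that Tau-U builds on (the full Tau-U baseline-trend correction is omitted; the probe scores are hypothetical):

    import numpy as np

    def busk_serlin_d(baseline, intervention):
        # Standardized mean difference: (mean of B - mean of A) / SD of baseline.
        a, b = np.asarray(baseline, float), np.asarray(intervention, float)
        return (b.mean() - a.mean()) / a.std(ddof=1)

    def tau_ab(baseline, intervention):
        # A-vs-B nonoverlap: (pairs where B > A) - (pairs where B < A),
        # divided by all n_A * n_B cross-phase pairs (no trend correction).
        a, b = np.asarray(baseline, float), np.asarray(intervention, float)
        diffs = b[None, :] - a[:, None]
        return (np.sum(diffs > 0) - np.sum(diffs < 0)) / diffs.size

    base = [2, 3, 2, 4]           # hypothetical baseline writing probes
    interv = [5, 6, 7, 7, 8]      # hypothetical intervention-phase probes
    print(busk_serlin_d(base, interv), tau_ab(base, interv))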
Not Accepted by the Family: "Being Difficult" or "Being Different"?
ERIC Educational Resources Information Center
Komter, Aafke; Voorpostel, Marieke; Pels, Trees
2011-01-01
Using data from the Netherlands Kinship Panel Study (NKPS) and combining a quantitative approach and a qualitative approach (N = 8,148 and n = 43, respectively), this study investigates the mechanisms associated with a lack of acceptance by one's family. From the total NKPS sample, 12.1% did not feel (entirely) accepted by their family. The…
Pupils' Views of Religious Education in a Pluralistic Educational Context
ERIC Educational Resources Information Center
Kuusisto, Arniika; Kallioniemi, Arto
2014-01-01
This article examines Finnish pupils' views of religious education (RE) in a pluralistic educational context. The focus is on pupils' views of the aims and different approaches to RE in a multi-faith school. The study utilised a mixed method approach, combining quantitative and qualitative data. It employed a survey (n = 1301) and interviews (n =…
The Physics of Music with Interdisciplinary Approach: A Case of Prospective Music Teachers
ERIC Educational Resources Information Center
Turna, Özge; Bolat, Mualla
2016-01-01
Physics of music is an area that is covered by interdisciplinary approach. In this study it is aimed to determine prospective music teachers' level of association with physics concepts which are related to music. The research is a case study which combines qualitative and quantitative methods. Eighty-four students who were studying at the…
Combined qualitative and quantitative research designs.
Seymour, Jane
2012-12-01
Mixed methods research designs have been recognized as important in addressing complexity and are recommended particularly in the development and evaluation of complex interventions. This article reports a review of studies in palliative care published between 2010 and March 2012 that combine qualitative and quantitative approaches. A synthesis of approaches to mixed methods research taken in 28 examples of published research studies of relevance to palliative and supportive care is provided, using a typology based on a classic categorization put forward in 1992. Mixed-method studies are becoming more frequently employed in palliative care research and resonate with the complexity of the palliative care endeavour. Undertaking mixed methods research requires a sophisticated understanding of the research process and recognition of some of the underlying complexities encountered when working with different traditions and perspectives on issues of: sampling, validity, reliability and rigour, different sources of data and different data collection and analysis techniques.
Sintes, Eva; Herndl, Gerhard J.
2006-01-01
Catalyzed reporter deposition fluorescence in situ hybridization combined with microautoradiography (MICRO-CARD-FISH) is increasingly being used to obtain qualitative information on substrate uptake by individual members of specific prokaryotic communities. Here we evaluated the potential for using this approach quantitatively by relating the measured silver grain area around cells taking up 3H-labeled leucine to bulk leucine uptake measurements. The increase in the silver grain area over time around leucine-assimilating cells of coastal bacterial assemblages was linear during 4 to 6 h of incubation. By establishing standardized conditions for specific activity levels and concomitantly performing uptake measurements with the bulk community, MICRO-CARD-FISH can be used quantitatively to determine uptake rates on a single-cell level. Therefore, this approach allows comparisons of single-cell activities for bacterial communities obtained from different sites or growing under different ecological conditions. PMID:16950912
Ishikawa, Akira
2017-11-27
Large numbers of quantitative trait loci (QTL) affecting complex diseases and other quantitative traits have been reported in humans and model animals. However, the genetic architecture of these traits remains elusive due to the difficulty in identifying causal quantitative trait genes (QTGs) for common QTL with relatively small phenotypic effects. A traditional strategy based on techniques such as positional cloning does not always enable identification of a single candidate gene for a QTL of interest because it is difficult to narrow down a target genomic interval of the QTL to a very small interval harboring only one gene. A combination of gene expression analysis and statistical causal analysis can greatly reduce the number of candidate genes. This integrated approach provides causal evidence that one of the candidate genes is a putative QTG for the QTL. Using this approach, I have recently succeeded in identifying a single putative QTG for resistance to obesity in mice. Here, I outline the integration approach and discuss its usefulness using my studies as an example.
Standardization approaches in absolute quantitative proteomics with mass spectrometry.
Calderón-Celis, Francisco; Encinar, Jorge Ruiz; Sanz-Medel, Alfredo
2017-07-31
Mass spectrometry-based approaches have enabled important breakthroughs in quantitative proteomics in the last decades. This development is reflected in the better quantitative assessment of protein levels as well as in the understanding of post-translational modifications and protein complexes and networks. Nowadays, the focus of quantitative proteomics has shifted from the relative determination of proteins (i.e., differential expression between two or more cellular states) to absolute quantity determination, required for a more-thorough characterization of biological models and comprehension of the proteome dynamism, as well as for the search and validation of novel protein biomarkers. However, the physico-chemical environment of the analyte species strongly affects the ionization efficiency in most mass spectrometry (MS) types, which thereby requires the use of specially designed standardization approaches to provide absolute quantifications. The most common such approaches nowadays include (i) the use of stable isotope-labeled peptide standards, isotopologues of the target proteotypic peptides expected after tryptic digestion of the target protein; (ii) use of stable isotope-labeled protein standards to compensate for sample preparation, sample loss, and proteolysis steps; (iii) isobaric reagents, which after fragmentation in the MS/MS analysis provide a final detectable mass shift, can be used to tag both analyte and standard samples; (iv) label-free approaches in which the absolute quantitative data are not obtained through the use of any kind of labeling, but from computational normalization of the raw data and adequate standards; (v) elemental mass spectrometry-based workflows able to directly provide absolute quantification of peptides/proteins that contain an ICP-detectable element. A critical insight from the Analytical Chemistry perspective of the different standardization approaches and their combinations used so far for absolute quantitative MS-based (molecular and elemental) proteomics is provided in this review. © 2017 Wiley Periodicals, Inc.
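Approach (i) reduces to a simple isotope-dilution calculation; a minimal sketch with hypothetical peak areas:

    # Approach (i): isotope dilution with a stable isotope-labeled peptide standard.
    # A known amount of the heavy isotopologue is spiked in; the analyte amount
    # follows from the light/heavy peak-area ratio (areas are hypothetical).
    spike_fmol = 50.0
    area_light, area_heavy = 8.4e5, 6.0e5
    analyte_fmol = spike_fmol * (area_light / area_heavy)   # = 70 fmol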
Paturzo, Marco; Colaceci, Sofia; Clari, Marco; Mottola, Antonella; Alvaro, Rosaria; Vellone, Ercole
2016-01-01
Mixed methods designs: an innovative methodological approach for nursing research. Mixed methods (MM) research designs combine qualitative and quantitative approaches in the research process, in a single study or series of studies. Their use can provide a wider understanding of multifaceted phenomena. This article presents a general overview of the structure and design of MM in order to spread this approach within the Italian nursing research community. The MM designs most commonly used in the nursing field are the convergent parallel design, the sequential explanatory design, the exploratory sequential design and the embedded design. For each design a research example is presented. The use of MM can add value for improving clinical practice because, through the integration of qualitative and quantitative methods, researchers can better assess complex phenomena typical of nursing.
Hu, Zhe-Yi; Parker, Robert B.; Herring, Vanessa L.; Laizure, S. Casey
2012-01-01
Dabigatran etexilate (DABE) is an oral prodrug that is rapidly converted by esterases to dabigatran (DAB), a direct inhibitor of thrombin. To elucidate the esterase-mediated metabolic pathway of DABE, a high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based metabolite identification and semi-quantitative estimation approach was developed. To overcome the poor full-scan sensitivity of conventional triple quadrupole mass spectrometry, precursor-product ion pairs were predicted to search for potential in vitro metabolites. The detected metabolites were confirmed by product ion scans. A dilution method was introduced to evaluate the matrix effects on tentatively identified metabolites without chemical standards. Quantitative information on detected metabolites was obtained using 'metabolite standards' generated from incubation samples containing a high concentration of metabolite, in combination with a correction factor for mass spectrometry response. Two in vitro metabolites of DABE (M1 and M2) were identified and quantified by the semi-quantitative estimation approach. It is noteworthy that CES1 converts DABE to M1, while CES2 mediates the conversion of DABE to M2. M1 (or M2) is further metabolized to DAB by CES2 (or CES1). The approach presented here provides a solution to a bioanalytical need for fast identification and semi-quantitative estimation of CES metabolites in preclinical samples. PMID:23239178
Lindén, Rolf O; Eronen, Ville-Pekka; Aittokallio, Tero
2011-03-24
High-throughput genetic screening approaches have enabled systematic means to study how interactions among gene mutations contribute to quantitative fitness phenotypes, with the aim of providing insights into the functional wiring diagrams of genetic interaction networks on a global scale. However, it is poorly known how well these quantitative interaction measurements agree across the screening approaches, which hinders their integrated use toward improving the coverage and quality of the genetic interaction maps in yeast and other organisms. Using large-scale data matrices from epistatic miniarray profiling (E-MAP), genetic interaction mapping (GIM), and synthetic genetic array (SGA) approaches, we carried out here a systematic comparative evaluation among these quantitative maps of genetic interactions in yeast. The relatively low association between the original interaction measurements or their customized scores could be improved using a matrix-based modelling framework, which enables the use of single- and double-mutant fitness estimates and measurements, respectively, when scoring genetic interactions. Toward an integrative analysis, we show how the detections from the different screening approaches can be combined to suggest novel positive and negative interactions which are complementary to those obtained using any single screening approach alone. The matrix approximation procedure has been made available to support the design and analysis of the future screening studies. We have shown here that even if the correlation between the currently available quantitative genetic interaction maps in yeast is relatively low, their comparability can be improved by means of our computational matrix approximation procedure, which will enable integrative analysis and detection of a wider spectrum of genetic interactions using data from the complementary screening approaches.
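For readers unfamiliar with quantitative interaction scoring, the sketch below shows one common multiplicative-model definition of a genetic interaction score computed from single- and double-mutant fitness values; the published E-MAP, GIM and SGA scores are more elaborate, so this is illustrative only.

```python
def interaction_score(w_a, w_b, w_ab):
    """Multiplicative-model genetic interaction score: eps = W_ab - W_a * W_b.
    Negative eps suggests a synthetic sick/lethal (negative) interaction,
    positive eps an alleviating (positive) interaction."""
    return w_ab - w_a * w_b

eps = interaction_score(w_a=0.9, w_b=0.8, w_ab=0.4)  # 0.4 - 0.72 = -0.32
```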
Mixed methods research - the best of both worlds?
van Griensven, Hubert; Moore, Ann P; Hall, Valerie
2014-10-01
There has been a bias towards quantitative research approaches within manual therapy, which may have resulted in a narrow understanding of manual therapy practice. The aim of this Masterclass is to make a contribution to the expansion of methodologies used in manual therapy enquiry by discussing mixed methods research (MMR), a methodology which utilises both qualitative and quantitative methods within a single study in order to provide more comprehensive insights. To review rationales for MMR, as well as some of the common design options and potential difficulties. The paper also discusses theoretical frameworks that have been used to underpin qualitative and quantitative research, and ongoing debates about the possibility of combining them. Complexities associated with health and manual therapy cannot always be investigated satisfactorily by using a single research method. Some issues require a more comprehensive understanding, which may be provided by combining the strengths of quantitative and qualitative methods in a mixed methods study. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Costanzi, Stefano; Tikhonova, Irina G.; Harden, T. Kendall; Jacobson, Kenneth A.
2009-11-01
Accurate in silico models for the quantitative prediction of the activity of G protein-coupled receptor (GPCR) ligands would greatly facilitate the process of drug discovery and development. Several methodologies have been developed based on the properties of the ligands, the direct study of the receptor-ligand interactions, or a combination of both approaches. Ligand-based three-dimensional quantitative structure-activity relationships (3D-QSAR) techniques, not requiring knowledge of the receptor structure, have been historically the first to be applied to the prediction of the activity of GPCR ligands. They are generally endowed with robustness and good ranking ability; however they are highly dependent on training sets. Structure-based techniques generally do not provide the level of accuracy necessary to yield meaningful rankings when applied to GPCR homology models. However, they are essentially independent from training sets and have a sufficient level of accuracy to allow an effective discrimination between binders and nonbinders, thus qualifying as viable lead discovery tools. The combination of ligand and structure-based methodologies in the form of receptor-based 3D-QSAR and ligand and structure-based consensus models results in robust and accurate quantitative predictions. The contribution of the structure-based component to these combined approaches is expected to become more substantial and effective in the future, as more sophisticated scoring functions are developed and more detailed structural information on GPCRs is gathered.
ERIC Educational Resources Information Center
Eyisi, Daniel
2016-01-01
Research in science education is to discover the truth which involves the combination of reasoning and experiences. In order to find out appropriate teaching methods that are necessary for teaching science students problem-solving skills, different research approaches are used by educational researchers based on the data collection and analysis…
Combining Qualitative and Quantitative Approaches: Some Arguments for Mixed Methods Research
ERIC Educational Resources Information Center
Lund, Thorleif
2012-01-01
One purpose of the present paper is to elaborate 4 general advantages of the mixed methods approach. Another purpose is to propose a 5-phase evaluation design, and to demonstrate its usefulness for mixed methods research. The account is limited to research on groups in need of treatment, i.e., vulnerable groups, and the advantages of mixed methods…
A quantitative approach to combine sources in stable isotope mixing models
Stable isotope mixing models, used to estimate source contributions to a mixture, typically yield highly uncertain estimates when there are many sources and relatively few isotope elements. Previously, ecologists have either accepted the uncertain contribution estimates for indiv...
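For intuition, the exactly determined case (two sources, one isotope) can be solved in closed form, as the sketch below shows; with many sources and few isotopes the system becomes underdetermined, which is the source of the uncertainty the abstract refers to. All values are hypothetical.

```python
def two_source_fractions(d_mix, d_src1, d_src2):
    """Exactly determined two-source, one-isotope mixing model:
    p1 * d_src1 + p2 * d_src2 = d_mix, with p1 + p2 = 1."""
    p1 = (d_mix - d_src2) / (d_src1 - d_src2)
    return p1, 1.0 - p1

# Hypothetical delta-13C values (per mil) for a mixture and two sources.
p1, p2 = two_source_fractions(d_mix=-24.0, d_src1=-20.0, d_src2=-28.0)
# -> (0.5, 0.5): each source contributes half of the mixture
```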
Optical properties of acute kidney injury measured by quantitative phase imaging
Ban, Sungbea; Min, Eunjung; Baek, Songyee; Kwon, Hyug Moo; Popescu, Gabriel
2018-01-01
The diagnosis of acute kidney injury (AKI) has been based mainly on histology, immunohistochemistry and western blot. Though these approaches are widely accepted in the field, they have inherent limitations due to the lack of high-throughput and quantitative information. For a better understanding of prognosis in AKI, we present a new approach using quantitative phase imaging combined with a wide-field scanning platform. Through the phase-delay information from the tissue, we were able to predict the stage of AKI based on various optical properties such as the light scattering coefficient and anisotropy. These optical parameters quantify the deterioration process in the AKI tissue model. Our device would be a very useful tool where fast feedback on tissue pathology is required or where diseases are related to mechanical properties such as fibrosis. PMID:29541494
Burnum-Johnson, Kristin E; Nie, Song; Casey, Cameron P; Monroe, Matthew E; Orton, Daniel J; Ibrahim, Yehia M; Gritsenko, Marina A; Clauss, Therese R W; Shukla, Anil K; Moore, Ronald J; Purvine, Samuel O; Shi, Tujin; Qian, Weijun; Liu, Tao; Baker, Erin S; Smith, Richard D
2016-12-01
Current proteomic approaches include both broad discovery measurements and quantitative targeted analyses. In many cases, discovery measurements are initially used to identify potentially important proteins (e.g. candidate biomarkers) and then targeted studies are employed to quantify a limited number of selected proteins. Both approaches, however, suffer from limitations. Discovery measurements aim to sample the whole proteome but have lower sensitivity, accuracy, and quantitation precision than targeted approaches, whereas targeted measurements are significantly more sensitive but only sample a limited portion of the proteome. Herein, we describe a new approach that performs both discovery and targeted monitoring (DTM) in a single analysis by combining liquid chromatography, ion mobility spectrometry and mass spectrometry (LC-IMS-MS). In DTM, heavy labeled target peptides are spiked into tryptic digests and both the labeled and unlabeled peptides are detected using LC-IMS-MS instrumentation. Compared with the broad LC-MS discovery measurements, DTM yields greater peptide/protein coverage and detects lower abundance species. DTM also achieved detection limits similar to selected reaction monitoring (SRM) indicating its potential for combined high quality discovery and targeted analyses, which is a significant step toward the convergence of discovery and targeted approaches. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
NASA Astrophysics Data System (ADS)
Shevade, Abhijit V.; Ryan, Margaret A.; Homer, Margie L.; Zhou, Hanying; Manfreda, Allison M.; Lara, Liana M.; Yen, Shiao-Pin S.; Jewell, April D.; Manatt, Kenneth S.; Kisor, Adam K.
We have developed a Quantitative Structure-Activity Relationships (QSAR) based approach to correlate the response of chemical sensors in an array with molecular descriptors. A novel molecular descriptor set has been developed; this set combines descriptors of sensing film-analyte interactions, representing sensor response, with a basic analyte descriptor set commonly used in QSAR studies. The descriptors are obtained using a combination of molecular modeling tools and empirical and semi-empirical Quantitative Structure-Property Relationships (QSPR) methods. The sensors under investigation are polymer-carbon sensing films which have been exposed to analyte vapors at parts-per-million (ppm) concentrations; response is measured as change in film resistance. Statistically validated QSAR models have been developed using Genetic Function Approximations (GFA) for a sensor array for a given training data set. The applicability of the sensor response models has been tested by using them to predict sensor activities for test analytes not included in the training set used for model development. The validated QSAR sensor response models show good predictive ability. The QSAR approach is a promising computational tool for sensing materials evaluation and selection. It can also be used to predict the response of an existing sensing film to new target analytes.
Rodriguez-Falces, Javier
2013-12-01
In electrophysiology studies, it is becoming increasingly common to explain experimental observations using both descriptive methods and quantitative approaches. However, some electrophysiological phenomena, such as the generation of extracellular potentials that results from the propagation of the excitation source along the muscle fiber, are difficult to describe and conceptualize. In addition, most traditional approaches aimed at describing extracellular potentials consist of complex mathematical machinery that gives no chance for physical interpretation. The aim of the present study is to present a new method to teach the formation of extracellular potentials around a muscle fiber from both a descriptive and quantitative perspective. The implementation of this method was tested through a written exam and a satisfaction survey. The new method enhanced the ability of students to visualize the generation of bioelectrical potentials. In addition, the new approach improved students' understanding of how changes in the fiber-to-electrode distance and in the shape of the excitation source are translated into changes in the extracellular potential. The survey results show that combining general principles of electrical fields with accurate graphic imagery gives students an intuitive, yet quantitative, feel for electrophysiological signals and enhances their motivation to continue their studies in the biomedical engineering field.
Deployment of e-health services - a business model engineering strategy.
Kijl, Björn; Nieuwenhuis, Lambert J M; Huis in 't Veld, Rianne M H A; Hermens, Hermie J; Vollenbroek-Hutten, Miriam M R
2010-01-01
We designed a business model for deploying a myofeedback-based teletreatment service. An iterative and combined qualitative and quantitative action design approach was used for developing the business model and the related value network. Insights from surveys, desk research, expert interviews, workshops and quantitative modelling were combined to produce the first business model and then to refine it in three design cycles. The business model engineering strategy provided important insights which led to an improved, more viable and feasible business model and related value network design. Based on this experience, we conclude that the process of early stage business model engineering reduces risk and produces substantial savings in costs and resources related to service deployment.
Cantow, Kathleen; Arakelyan, Karen; Seeliger, Erdmann; Niendorf, Thoralf; Pohlmann, Andreas
2016-01-01
In vivo assessment of renal perfusion and oxygenation under (patho)physiological conditions by means of noninvasive diagnostic imaging is conceptually appealing. Blood oxygen level-dependent (BOLD) magnetic resonance imaging (MRI) and quantitative parametric mapping of the magnetic resonance (MR) relaxation times T2* and T2 are thought to provide surrogates of renal tissue oxygenation. The validity and efficacy of this technique for quantitative characterization of local tissue oxygenation and its changes under different functional conditions have not yet been systematically examined and remain to be established. For this purpose, the development of integrative multimodality approaches is essential. Here we describe an integrated hybrid approach (MR-PHYSIOL) that combines established quantitative physiological measurements with T2* (T2) mapping and MR-based kidney size measurements. Standardized reversible (patho)physiologically relevant interventions, such as brief periods of aortic occlusion, hypoxia, and hyperoxia, are used to detail the relation between the MR-PHYSIOL parameters, in particular between renal T2* and tissue oxygenation.
Serials Evaluation: An Innovative Approach.
ERIC Educational Resources Information Center
Berger, Marilyn; Devine, Jane
1990-01-01
Describes a method of analyzing serials collections in special libraries that combines evaluative criteria with database management technology. Choice of computer software is discussed, qualitative information used to evaluate subject coverage is examined, and quantitative and descriptive data that can be used for collection management are…
Enhancing Institutional Assessment Efforts through Qualitative Methods
ERIC Educational Resources Information Center
Van Note Chism, Nancy; Banta, Trudy W.
2007-01-01
Qualitative methods can do much to describe context and illuminate the why behind patterns encountered in institutional assessment. Alone, or in combination with quantitative methods, they should be the approach of choice for many of the most important assessment questions. (Contains 1 table.)
Towards quantitative classification of folded proteins in terms of elementary functions.
Hu, Shuangwei; Krokhotin, Andrei; Niemi, Antti J; Peng, Xubiao
2011-04-01
A comparative classification scheme provides a good basis for several approaches to understanding proteins, including prediction of relations between their structure and biological function. But it remains a challenge to combine a classification scheme that describes a protein in terms of its well-organized secondary structures, and that often requires direct human input, with an atomary-level physics-based approach in which a protein is fundamentally nothing more than an ensemble of mutually interacting carbon, hydrogen, oxygen, and nitrogen atoms. In order to bridge these two complementary approaches to proteins, conceptually novel tools need to be introduced. Here we explain how an approach toward geometric characterization of entire folded proteins can be based on a single explicit elementary function that is familiar from nonlinear physical systems, where it is known as the kink soliton. Our approach enables the conversion of hierarchical structural information into a quantitative form that allows a folded protein to be characterized in terms of a small number of global parameters that are in principle computable from atomary-level considerations. As an example we describe in detail how the native fold of the myoglobin 1M6C emerges from a combination of kink solitons with a very high atomary-level accuracy. We also verify that our approach describes longer loops and loops connecting α helices with β strands with the same overall accuracy. ©2011 American Physical Society
Kimura, Akatsuki; Celani, Antonio; Nagao, Hiromichi; Stasevich, Timothy; Nakamura, Kazuyuki
2015-01-01
Construction of quantitative models is a primary goal of quantitative biology, which aims to understand cellular and organismal phenomena in a quantitative manner. In this article, we introduce optimization procedures to search for parameters in a quantitative model that can reproduce experimental data. The aim of optimization is to minimize the sum of squared errors (SSE) in a prediction or to maximize likelihood. A (local) maximum of likelihood or (local) minimum of the SSE can efficiently be identified using gradient approaches. Addition of a stochastic process enables us to identify the global maximum/minimum without becoming trapped in local maxima/minima. Sampling approaches take advantage of increasing computational power to test numerous sets of parameters in order to determine the optimum set. By combining Bayesian inference with gradient or sampling approaches, we can estimate both the optimum parameters and the form of the likelihood function related to the parameters. Finally, we introduce four examples of research that utilize parameter optimization to obtain biological insights from quantified data: transcriptional regulation, bacterial chemotaxis, morphogenesis, and cell cycle regulation. With practical knowledge of parameter optimization, cell and developmental biologists can develop realistic models that reproduce their observations and thus, obtain mechanistic insights into phenomena of interest.
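As a minimal example of the gradient-based SSE minimization described above, the sketch below fits a toy exponential-decay model with scipy; the model, data and starting guess are invented for illustration, and a stochastic or sampling step (as the article discusses) would guard against local minima.

```python
import numpy as np
from scipy.optimize import minimize

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y_obs = np.array([2.1, 1.3, 0.8, 0.5, 0.3])  # hypothetical quantified data

def sse(params):
    a, k = params
    y_pred = a * np.exp(-k * t)           # a toy quantitative model
    return np.sum((y_obs - y_pred) ** 2)  # sum of squared errors

# Gradient-based local search from an initial guess.
result = minimize(sse, x0=[1.0, 0.5], method="BFGS")
a_opt, k_opt = result.x
```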
Approaching human language with complex networks
NASA Astrophysics Data System (ADS)
Cong, Jin; Liu, Haitao
2014-12-01
The interest in modeling and analyzing human language with complex networks is on the rise in recent years and a considerable body of research in this area has already been accumulated. We survey three major lines of linguistic research from the complex network approach: 1) characterization of human language as a multi-level system with complex network analysis; 2) linguistic typological research with the application of linguistic networks and their quantitative measures; and 3) relationships between the system-level complexity of human language (determined by the topology of linguistic networks) and microscopic linguistic (e.g., syntactic) features (as the traditional concern of linguistics). We show that the models and quantitative tools of complex networks, when exploited properly, can constitute an operational methodology for linguistic inquiry, which contributes to the understanding of human language and the development of linguistics. We conclude our review with suggestions for future linguistic research from the complex network approach: 1) relationships between the system-level complexity of human language and microscopic linguistic features; 2) expansion of research scope from the global properties to other levels of granularity of linguistic networks; and 3) combination of linguistic network analysis with other quantitative studies of language (such as quantitative linguistics).
CUMULATIVE RISK ASSESSMENT FOR QUANTITATIVE RESPONSE DATA
The Relative Potency Factor approach (RPF) is used to normalize and combine different toxic potencies among a group of chemicals selected for cumulative risk assessment. The RPF method assumes that the slopes of the dose-response functions are all equal; but this method depends o...
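A brief sketch of the RPF normalization itself (not of the paper's extension): each chemical's dose is scaled by its relative potency factor and the results are summed into an index-chemical-equivalent dose. All numbers are hypothetical.

```python
def index_equivalent_dose(doses, rpfs):
    """RPF normalization: scale each chemical's dose by its relative potency
    factor and sum into an index-chemical-equivalent dose (assumes parallel
    dose-response slopes, the assumption the abstract notes)."""
    return sum(d * r for d, r in zip(doses, rpfs))

cum_dose = index_equivalent_dose(doses=[0.1, 0.5, 0.2], rpfs=[1.0, 0.3, 2.0])
# -> 0.1 + 0.15 + 0.4 = 0.65 index-chemical equivalents
```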
Soulet, Fabienne; Kilarski, Witold W.; Roux-Dalvai, Florence; Herbert, John M. J.; Sacewicz, Izabela; Mouton-Barbosa, Emmanuelle; Bicknell, Roy; Lalor, Patricia; Monsarrat, Bernard; Bikfalvi, Andreas
2013-01-01
In order to map the extracellular or membrane proteome associated with the vasculature and the stroma in an embryonic organism in vivo, we developed a biotinylation technique for chicken embryo and combined it with mass spectrometry and bioinformatic analysis. We also applied this procedure to implanted tumors growing on the chorioallantoic membrane or after the induction of granulation tissue. Membrane and extracellular matrix proteins were the most abundant components identified. Relative quantitative analysis revealed differential protein expression patterns in several tissues. Through a bioinformatic approach, we determined endothelial cell protein expression signatures, which allowed us to identify several proteins not yet reported to be associated with endothelial cells or the vasculature. This is the first study reported so far that applies in vivo biotinylation, in combination with robust label-free quantitative proteomics approaches and bioinformatic analysis, to an embryonic organism. It also provides the first description of the vascular and matrix proteome of the embryo that might constitute the starting point for further developments. PMID:23674615
Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura
2018-06-01
There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic techniques. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size proved to have a particular influence on the qualitative and quantitative performance of the DSC signals. Both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, proper sample treatment, including sieving of suspended particles, is required for this analytical approach.
Global, quantitative and dynamic mapping of protein subcellular localization.
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg Hh
2016-06-09
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology.
Haines, Seth S.; Diffendorfer, James; Balistrieri, Laurie S.; Berger, Byron R.; Cook, Troy A.; Gautier, Donald L.; Gallegos, Tanya J.; Gerritsen, Margot; Graffy, Elisabeth; Hawkins, Sarah; Johnson, Kathleen; Macknick, Jordan; McMahon, Peter; Modde, Tim; Pierce, Brenda; Schuenemeyer, John H.; Semmens, Darius; Simon, Benjamin; Taylor, Jason; Walton-Day, Katherine
2013-01-01
Natural resource planning at all scales demands methods for assessing the impacts of resource development and use, and in particular it requires standardized methods that yield robust and unbiased results. Building from existing probabilistic methods for assessing the volumes of energy and mineral resources, we provide an algorithm for consistent, reproducible, quantitative assessment of resource development impacts. The approach combines probabilistic input data with Monte Carlo statistical methods to determine probabilistic outputs that convey the uncertainties inherent in the data. For example, one can utilize our algorithm to combine data from a natural gas resource assessment with maps of sage grouse leks and piñon-juniper woodlands in the same area to estimate possible future habitat impacts due to possible future gas development. As another example: one could combine geochemical data and maps of lynx habitat with data from a mineral deposit assessment in the same area to determine possible future mining impacts on water resources and lynx habitat. The approach can be applied to a broad range of positive and negative resource development impacts, such as water quantity or quality, economic benefits, or air quality, limited only by the availability of necessary input data and quantified relationships among geologic resources, development alternatives, and impacts. The framework enables quantitative evaluation of the trade-offs inherent in resource management decision-making, including cumulative impacts, to address societal concerns and policy aspects of resource development.
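The following toy Monte Carlo sketch illustrates the algorithm's general pattern of propagating probabilistic inputs to probabilistic impact outputs; the distributions, parameters, units and the one-well-per-pad assumption are invented and not taken from the assessment.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical probabilistic inputs: recoverable gas (BCF), per-well
# recovery (BCF/well), and disturbed habitat per well pad (ha).
gas_bcf = rng.lognormal(mean=np.log(500.0), sigma=0.4, size=n)
bcf_per_well = rng.triangular(1.0, 2.0, 4.0, size=n)
ha_per_pad = rng.triangular(1.5, 3.0, 6.0, size=n)

# Propagate input uncertainty to a probabilistic habitat-impact output,
# assuming (for illustration) one well per pad.
wells = gas_bcf / bcf_per_well
habitat_ha = wells * ha_per_pad
p5, p50, p95 = np.percentile(habitat_ha, [5, 50, 95])
```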
Lim, Eelin L.; Tomita, Aoy V.; Thilly, William G.; Polz, Martin F.
2001-01-01
A novel quantitative PCR (QPCR) approach, which combines competitive PCR with constant-denaturant capillary electrophoresis (CDCE), was adapted for enumerating microbial cells in environmental samples using the marine nanoflagellate Cafeteria roenbergensis as a model organism. Competitive PCR has been used successfully for quantification of DNA in environmental samples. However, this technique is labor intensive, and its accuracy is dependent on an internal competitor, which must possess the same amplification efficiency as the target yet can be easily discriminated from the target DNA. The use of CDCE circumvented these problems, as its high resolution permitted the use of an internal competitor which differed from the target DNA fragment by a single base and thus ensured that both sequences could be amplified with equal efficiency. The sensitivity of CDCE also enabled specific and precise detection of sequences over a broad range of concentrations. The combined competitive QPCR and CDCE approach accurately enumerated C. roenbergensis cells in eutrophic, coastal seawater at abundances ranging from approximately 10 to 10⁴ cells ml⁻¹. The QPCR cell estimates were confirmed by fluorescent in situ hybridization counts, but estimates of samples with <50 cells ml⁻¹ by QPCR were less variable. This novel approach extends the usefulness of competitive QPCR by demonstrating its ability to reliably enumerate microorganisms at a range of environmentally relevant cell concentrations in complex aquatic samples. PMID:11525983
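Illustratively, the competitive-PCR arithmetic underlying the approach is simple: with equal amplification efficiencies, the post-PCR signal ratio preserves the initial template ratio. The sketch below shows this principle with hypothetical numbers.

```python
def target_copies(competitor_copies_added, target_peak, competitor_peak):
    """Competitive PCR principle: with equal amplification efficiency (ensured
    here by a single-base difference resolved by CDCE), the post-PCR peak
    ratio preserves the initial template ratio."""
    return competitor_copies_added * (target_peak / competitor_peak)

copies = target_copies(1e4, target_peak=3.0e5, competitor_peak=1.5e5)
# -> 2e4 initial target copies in the reaction
```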
Cai, Xiang; Shen, Liguo; Zhang, Meijia; Chen, Jianrong; Hong, Huachang; Lin, Hongjun
2017-11-01
Quantitatively evaluating the interaction energy between two randomly rough surfaces is a prerequisite to quantitatively understanding and controlling membrane fouling in membrane bioreactors (MBRs). In this study, a new unified approach to construct rough topographies and to quantify the interaction energy between a randomly rough particle and a randomly rough membrane was proposed. It was found that natural rough topographies of both foulants and membrane could be well constructed by a modified two-variable Weierstrass-Mandelbrot (WM) function drawn from fractal theory. Spatial differential relationships between two constructed surfaces were accordingly established. Thereafter, a new approach combining these relationships, the surface element integration (SEI) method and the composite Simpson's rule was derived to calculate the interaction energy between two randomly rough surfaces in a submerged MBR. The obtained results indicate the profound effects of surface morphology on interaction energy and membrane fouling. This study provides a basic approach for investigating membrane fouling and interface behaviors. Copyright © 2017 Elsevier Ltd. All rights reserved.
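The paper's modified two-variable WM function is not reproduced here; instead, the sketch below shows the classic one-variable WM profile often used for fractal rough surfaces, purely to illustrate how such topographies are constructed (parameters are arbitrary).

```python
import numpy as np

def wm_profile(x, D=1.5, gamma=1.5, n_terms=30, G=1e-2):
    """Classic one-variable Weierstrass-Mandelbrot roughness profile,
    z(x) = G^(D-1) * sum_n cos(2*pi*gamma^n*x) / gamma^((2-D)*n),
    shown only to illustrate fractal roughness construction; the paper uses
    a modified two-variable WM function for full 2-D topographies."""
    n = np.arange(n_terms)
    return G ** (D - 1) * np.sum(
        np.cos(2 * np.pi * gamma ** n * x[:, None]) / gamma ** ((2 - D) * n),
        axis=1,
    )

x = np.linspace(0.0, 1.0, 512)
z = wm_profile(x)  # synthetic rough profile; D controls the fractal dimension
```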
NASA Astrophysics Data System (ADS)
Murray, Cathryn Clarke; Wong, Janson; Singh, Gerald G.; Mach, Megan; Lerner, Jackie; Ranieri, Bernardo; Peterson St-Laurent, Guillaume; Guimaraes, Alice; Chan, Kai M. A.
2018-06-01
Environmental assessment is the process that decision-makers rely on to predict, evaluate, and prevent biophysical, social, and economic impacts of potential project developments. The determination of significance in environmental assessment is central to environmental management in many nations. We reviewed ten recent environmental impact assessments from British Columbia, Canada, systematically scoring the significance determinations, the approaches used by assessors, the use of thresholds in significance determination, threshold exceedances, and the outcomes. Findings of significant impacts were exceedingly rare, and practitioners used a combination of significance determination approaches, most commonly relying upon reasoned argumentation. Quantitative thresholds were rarely employed, with less than 10% of the valued components evaluated using thresholds. Even where quantitative thresholds for significance were exceeded, in every case practitioners used a variety of rationales to demote negative impacts to non-significance. These rationales include combinations of the scale (temporal and spatial) of impacts, an already exceeded baseline, model uncertainty, and/or substitution of less stringent thresholds. Governments and agencies can better protect resources by requiring clear and defensible significance determinations, by making government-defined thresholds legally enforceable and accountable, and by requiring or encouraging significance determination through inclusive and collaborative approaches.
Genomic approaches for the elucidation of genes and gene networks underlying cardiovascular traits.
Adriaens, M E; Bezzina, C R
2018-06-22
Genome-wide association studies have shed light on the association between natural genetic variation and cardiovascular traits. However, linking a cardiovascular trait associated locus to a candidate gene or set of candidate genes for prioritization for follow-up mechanistic studies is anything but straightforward. Genomic technologies based on next-generation sequencing technology nowadays offer multiple opportunities to dissect gene regulatory networks underlying genetic cardiovascular trait associations, thereby aiding in the identification of candidate genes at unprecedented scale. RNA sequencing in particular becomes a powerful tool when combined with genotyping to identify loci that modulate transcript abundance, known as expression quantitative trait loci (eQTL), or loci modulating transcript splicing, known as splicing quantitative trait loci (sQTL). Additionally, the allele-specific resolution of RNA-sequencing technology enables estimation of allelic imbalance, a state where the two alleles of a gene are expressed at a ratio differing from the expected 1:1 ratio. When multiple high-throughput approaches are combined with deep phenotyping in a single study, a comprehensive elucidation of the relationship between genotype and phenotype comes into view, an approach known as systems genetics. In this review, we cover key applications of systems genetics in the broad cardiovascular field.
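As a minimal illustration of the allelic imbalance estimation mentioned above, the sketch below applies a binomial test to allele-specific read counts at a heterozygous site; real pipelines additionally correct for mapping bias and overdispersion, and the counts here are hypothetical.

```python
from scipy.stats import binomtest

# Allelic imbalance: do RNA-seq reads over a heterozygous site depart from
# the expected 1:1 ratio between the two alleles?
ref_reads, alt_reads = 72, 40
result = binomtest(ref_reads, n=ref_reads + alt_reads, p=0.5)
imbalanced = result.pvalue < 0.05  # flag candidate allele-specific expression
```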
Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.
Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L
2015-09-01
Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches, leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.
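A hedged sketch of the core sparse-coding step only: a subject patch is modeled as a sparse combination of atlas patches via orthogonal matching pursuit. The dictionary here is random and the sparsity level arbitrary; the paper's dictionary learning and label fusion are more involved.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
atlas_patches = rng.random((64, 500))  # hypothetical dictionary: patches as columns
subject_patch = atlas_patches[:, [3, 42]] @ np.array([0.7, 0.3])  # toy subject patch

# Model the subject patch as a sparse combination of atlas patches.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=5, fit_intercept=False)
omp.fit(atlas_patches, subject_patch)
weights = omp.coef_  # nonzero weights select the relevant atlas patches;
# tissue memberships could then follow by weighting those patches' atlas labels.
```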
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.
Barbosa, Jocelyn; Lee, Kyubum; Lee, Sunwon; Lodhi, Bilal; Cho, Jae-Gu; Seo, Woo-Keun; Kang, Jaewoo
2016-03-12
Facial palsy or paralysis (FP) is a symptom involving the loss of voluntary muscle movement in one side of the human face, which can be devastating for patients. Traditional assessment methods depend solely on the clinician's judgment and are therefore time consuming and subjective in nature. Hence, a quantitative assessment system becomes invaluable for physicians beginning the rehabilitation process, and producing a reliable and robust method is challenging and still underway. We introduce a novel approach for the quantitative assessment of facial paralysis that tackles the classification problem for FP type and degree of severity. Specifically, a novel method of quantitative assessment is presented: an algorithm that extracts the human iris and detects facial landmarks, and a hybrid approach combining rule-based and machine learning algorithms to analyze and prognosticate facial paralysis using the captured images. A method combining the optimized Daugman's algorithm and the Localized Active Contour (LAC) model is proposed to efficiently extract the iris and facial landmarks or key points. To improve the performance of LAC, appropriate parameters of the initial evolving curve for facial feature segmentation are automatically selected. The symmetry score is measured by the ratio between features extracted from the two sides of the face. Hybrid classifiers (i.e. rule-based with regularized logistic regression) were employed for discriminating healthy and unhealthy subjects, for FP type classification, and for facial paralysis grading based on the House-Brackmann (H-B) scale. Quantitative analysis was performed to evaluate the performance of the proposed approach, and experiments demonstrate its efficiency: facial movement feature extraction based on iris segmentation and LAC-based key point detection, along with a hybrid classifier, provides a more efficient way of addressing the classification problem for facial palsy type and degree of severity. Combining iris segmentation and the key point-based method has several merits that are essential for our real application. Aside from the facial key points, iris segmentation provides a significant contribution as it describes the changes in iris exposure while performing facial expressions. It reveals the significant difference between the healthy side and the severe palsy side when raising eyebrows with both eyes directed upward, and can model the typical changes in the iris region.
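As a small illustration of the symmetry score mentioned in the abstract, the sketch below takes the ratio of a feature (for example, iris exposure area) between the two sides of the face; the min/max normalization and the feature values are hypothetical.

```python
def symmetry_score(left_feature, right_feature):
    """Symmetry score as the ratio between corresponding features measured
    on the two sides of the face; values near 1 indicate symmetry, and
    marked deviations point toward the palsied side."""
    return min(left_feature, right_feature) / max(left_feature, right_feature)

score = symmetry_score(left_feature=312.0, right_feature=198.0)  # ~0.63
```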
Naming Block Structures: A Multimodal Approach
ERIC Educational Resources Information Center
Cohen, Lynn; Uhry, Joanna
2011-01-01
This study describes symbolic representation in block play in a culturally diverse suburban preschool classroom. Block play is "multimodal" and can allow children to experiment with materials to represent the world in many forms of literacy. Combined qualitative and quantitative data from seventy-seven block structures were collected and analyzed.…
Outstanding Teachers and Learner-Centered Teaching Practices at a Private Liberal Arts Institution
ERIC Educational Resources Information Center
Verst, Amy L.
2010-01-01
Using a combined quantitative, qualitative approach, this study explores the teaching practices of outstanding faculty at a private liberal arts institution by posing questions that revolve around learner-centered teaching practices, characteristics of outstanding teachers, effective teaching, and pressures on the professoriate related to the…
Beginning Special Education Teachers in Israel: Perceived Self-Efficacy
ERIC Educational Resources Information Center
Gavish, Bella; Bar-On, Sari; Shein-Kahalon, Rivka
2016-01-01
The purpose of this study was to examine perceived self-efficacy among beginning special education teachers in Israel related to their educational roles and responsibilities. Ninety-three first-year teachers participated in the study. The research was carried out using the mixed method approach, combining qualitative and quantitative research…
Do Highly Effective Principals Also Have High Levels of Cultural Intelligence?
ERIC Educational Resources Information Center
Naughton, Whitney Michelle
2010-01-01
Purpose: The purpose of this study was to determine if elementary school principals who exhibit characteristics of highly effective principals also possess high levels of cultural intelligence. Methodology: Three instruments were used in this study, combining both qualitative and quantitative approaches to the collection of data. The first…
Illinois Community College System. Performance Report For Fiscal Year 2004
ERIC Educational Resources Information Center
Illinois Community College Board, 2004
2004-01-01
The Illinois Community College System Performance Report replaces the Results Report and reflects an initial effort to increasingly streamline and integrate state outcomes and progress reporting in Illinois. The fresh approach taken this year further combines qualitative information and quantitative data reporting. The Performance Report is…
Combining In Vivo, In Vitro And Toxicokinetics Data In Readacross: A Case Study Using Caffeine (SOT)
Readacross can be used to fill data gaps in chemical safety assessment. Readacross starts with the identification and evaluation of source analogs, including assessment of the physicochemical and mechanistic similarity of source analogs. We describe an approach to quantitative re...
The Effects of Mobile Collaborative Activities in a Second Language Course
ERIC Educational Resources Information Center
Ilic, Peter
2015-01-01
This research is designed to explore the areas of collaborative learning and the use of smartphones as a support for collaborative learning through a year-long exploratory multiple case study approach integrating both qualitative and quantitative data analysis. Qualitative exploratory interviews are combined with Multidimensional Scaling Analysis…
Defining Sustainability Metric Targets in an Institutional Setting
ERIC Educational Resources Information Center
Rauch, Jason N.; Newman, Julie
2009-01-01
Purpose: The purpose of this paper is to expand on the development of university and college sustainability metrics by implementing an adaptable metric target strategy. Design/methodology/approach: A combined qualitative and quantitative methodology is derived that both defines what a sustainable metric target might be and describes the path a…
Stories and Statistics: Describing a Mixed Methods Study of Effective Classroom Practice
ERIC Educational Resources Information Center
Kington, Alison; Sammons, Pam; Day, Christopher; Regan, Elaine
2011-01-01
The Effective Classroom Practice project aimed to identify key factors that contribute to effective teaching in primary and secondary phases of schooling in different socioeconomic contexts. This article addresses the ways in which qualitative and quantitative approaches were combined within an integrated design to provide a comprehensive…
Combined analysis of magnetic and gravity anomalies using normalized source strength (NSS)
NASA Astrophysics Data System (ADS)
Li, L.; Wu, Y.
2017-12-01
Gravity and magnetic fields are potential fields, and their inversion is inherently non-unique. Combined analysis of magnetic and gravity anomalies based on Poisson's relation is used to identify homologous gravity and magnetic anomalies and to decrease this ambiguity. The traditional combined analysis linearly regresses the reduction-to-pole (RTP) magnetic anomaly against the first-order vertical derivative of the gravity anomaly, and provides a quantitative or semi-quantitative interpretation by calculating the correlation coefficient, slope and intercept. In this calculation, due to the effect of remanent magnetization, the RTP anomaly still contains the effect of oblique magnetization; in this case homologous gravity and magnetic anomalies yield uncorrelated results in the linear regression. The normalized source strength (NSS), which can be computed from the magnetic tensor matrix, is insensitive to remanence. Here we present a new combined analysis using NSS. Based on Poisson's relation, the gravity tensor matrix can be transformed, under the homologous-source condition, into the pseudomagnetic tensor matrix corresponding to magnetization along the geomagnetic field direction. The NSS of the pseudomagnetic tensor matrix and of the original magnetic tensor matrix are calculated and a linear regression analysis is carried out. The calculated correlation coefficient, slope and intercept indicate the degree of homology, the Poisson's ratio, and the distribution of remanence, respectively. We test the approach on a synthetic model with complex magnetization; the results show that it can still identify a common source under strong remanence and recover the Poisson's ratio. Finally, the approach is applied to field data from China. The results demonstrate that it is feasible.
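For concreteness, NSS can be computed from the eigenvalues of the (pseudo)magnetic gradient tensor; the sketch below uses the eigenvalue form reported in the NSS literature (after Beiki et al.) and is illustrative rather than the authors' exact implementation.

```python
import numpy as np

def nss(tensor):
    """Normalized source strength from a 3x3 symmetric magnetic gradient
    tensor, using the eigenvalue form mu = sqrt(-l2^2 - l1*l3) with
    l1 >= l2 >= l3 (after Beiki et al.); a sketch of the quantity the
    abstract transforms and regresses, not the paper's code."""
    lam = np.sort(np.linalg.eigvalsh(tensor))[::-1]  # descending eigenvalues
    return np.sqrt(max(-lam[1] ** 2 - lam[0] * lam[2], 0.0))

T = np.array([[2.0, 0.5, 0.1],
              [0.5, -1.0, 0.3],
              [0.1, 0.3, -1.0]])  # hypothetical traceless symmetric tensor
mu = nss(T)
```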
The FAQUIRE Approach: FAst, QUantitative, hIghly Resolved and sEnsitivity Enhanced 1H, 13C Data.
Farjon, Jonathan; Milande, Clément; Martineau, Estelle; Akoka, Serge; Giraudeau, Patrick
2018-02-06
The targeted analysis of metabolites in complex mixtures is a challenging issue. NMR is one of the major tools in this field, but there is a strong need for more sensitive, better-resolved, and faster quantitative methods. In this framework, we introduce the concept of FAst, QUantitative, hIghly Resolved and sEnsitivity enhanced (FAQUIRE) NMR to push forward the limits of metabolite NMR analysis. Quantitative 2D 1H-13C maps are promising alternatives for enhancing the spectral resolution but are highly time-consuming because of (i) the intrinsic nature of 2D acquisition, (ii) the longer recycling times required for quantitative conditions, and (iii) the higher number of scans needed to lower the limits of detection/quantification and so access low-concentration metabolites. To reach this aim, speeding up the recently developed QUantItative Perfected and pUre shifted HSQC (QUIPU HSQC) is an interesting attempt to develop the FAQUIRE concept. Thanks to the combination of spectral aliasing, non-uniform sampling, and variable repetition time, the acquisition time of 2D quantitative maps is reduced by a factor of 6 to 9, while a high spectral resolution is conserved thanks to a pure shift approach. The analytical potential of the new Quick QUIPU HSQC (Q QUIPU HSQC) is evaluated on a model metabolite sample, and its potential is shown on breast-cell extracts embedding metabolites at millimolar to submillimolar concentrations.
Validation of a Three-Dimensional Method for Counting and Sizing Podocytes in Whole Glomeruli
van der Wolde, James W.; Schulze, Keith E.; Short, Kieran M.; Wong, Milagros N.; Bensley, Jonathan G.; Cullen-McEwen, Luise A.; Caruana, Georgina; Hokke, Stacey N.; Li, Jinhua; Firth, Stephen D.; Harper, Ian S.; Nikolic-Paterson, David J.; Bertram, John F.
2016-01-01
Podocyte depletion is sufficient for the development of numerous glomerular diseases and can be absolute (loss of podocytes) or relative (reduced number of podocytes per volume of glomerulus). Commonly used methods to quantify podocyte depletion introduce bias, whereas gold standard stereologic methodologies are time consuming and impractical. We developed a novel approach for assessing podocyte depletion in whole glomeruli that combines immunofluorescence, optical clearing, confocal microscopy, and three-dimensional analysis. We validated this method in a transgenic mouse model of selective podocyte depletion, in which we determined dose-dependent alterations in several quantitative indices of podocyte depletion. This new approach provides a quantitative tool for the comprehensive and time-efficient analysis of podocyte depletion in whole glomeruli. PMID:26975438
Supporting Worth Mapping with Sentence Completion
NASA Astrophysics Data System (ADS)
Cockton, Gilbert; Kujala, Sari; Nurkka, Piia; Hölttä, Taneli
Expectations for design and evaluation approaches are set by the development practices within which they are used. Worth-Centred Development (WCD) seeks to both shape and fit such practices. We report a study that combined two WCD approaches. Sentence completion gathered credible quantitative data on user values, which were used to identify relevant values and aversions of two player groups for an on-line gambling site. These values provided human value elements for a complementary WCD approach of worth mapping. Initial worth maps were extended in three workshops, which focused on outcomes and user experiences that could be better addressed in the current product and associated marketing materials. We describe how worth maps were prepared for, and presented in, workshops, and how product owners and associated business roles evaluated the combination of WCD approaches. Based on our experiences, we offer practical advice on this combination.
Nixon, Gavin J; Svenstrup, Helle F; Donald, Carol E; Carder, Caroline; Stephenson, Judith M; Morris-Jones, Stephen; Huggett, Jim F; Foy, Carole A
2014-12-01
Molecular diagnostic measurements are currently underpinned by the polymerase chain reaction (PCR). There are also a number of alternative nucleic acid amplification technologies which, unlike PCR, work at a single temperature. These 'isothermal' methods reportedly offer potential advantages over PCR, such as simplicity, speed and resistance to inhibitors, and could also be used for quantitative molecular analysis. However, there are currently limited mechanisms to evaluate their quantitative performance, which would assist assay development and study comparisons. This study uses a sexually transmitted infection diagnostic model in combination with an adapted metric termed the isothermal doubling time (IDT), akin to PCR efficiency, to compare quantitative PCR and quantitative loop-mediated isothermal amplification (qLAMP) assays, and to quantify the impact of matrix interference. The performance metric described here facilitates the comparison of qLAMP assays, which could assist assay development and validation activities.
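A hedged sketch of how an IDT-like metric could be estimated from a qLAMP dilution series: if amplification follows N(t) = N0 * 2^(t/IDT), the time to reach a fixed detection threshold is linear in log2 of the input copies, with slope -IDT. The assay values below are invented for illustration.

```python
import numpy as np

def isothermal_doubling_time(copies, t_threshold_min):
    """Estimate a doubling time from a dilution series: with
    N(t) = N0 * 2^(t/IDT), time-to-threshold is linear in log2(N0)
    and the (negative) slope of the fit estimates -IDT."""
    slope, _ = np.polyfit(np.log2(copies), t_threshold_min, 1)
    return -slope

# Hypothetical qLAMP dilution series: input copies vs. time to threshold.
idt = isothermal_doubling_time([1e2, 1e3, 1e4, 1e5],
                               [32.0, 27.0, 22.1, 17.2])
# -> roughly 1.5 min per doubling
```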
Quantitative phase retrieval with arbitrary pupil and illumination
Claus, Rene A.; Naulleau, Patrick P.; Neureuther, Andrew R.; ...
2015-10-02
We present a general algorithm for combining measurements taken under various illumination and imaging conditions to quantitatively extract the amplitude and phase of an object wave. The algorithm uses the weak object transfer function, which incorporates arbitrary pupil functions and partially coherent illumination. The approach is extended beyond the weak object regime using an iterative algorithm. Finally, we demonstrate the method on measurements of Extreme Ultraviolet Lithography (EUV) multilayer mask defects taken in an EUV zone plate microscope with both a standard zone plate lens and a zone plate implementing Zernike phase contrast.
Berchialla, Paola; Scarinzi, Cecilia; Snidero, Silvia; Gregori, Dario
2016-08-01
Risk assessment is the systematic study of decisions subject to uncertain consequences. Increasing interest has been focused on modeling techniques like Bayesian Networks because of their capability of (1) combining, in a probabilistic framework, different types of evidence, including both expert judgments and objective data; (2) overturning previous beliefs in the light of new information being received; and (3) making predictions even with incomplete data. In this work, we propose a comparison among Bayesian Networks and other classical Quantitative Risk Assessment techniques such as Neural Networks, Classification Trees, Random Forests and Logistic Regression models. Hybrid approaches, combining both Classification Trees and Bayesian Networks, were also considered. Among Bayesian Networks, a clear distinction between the purely data-driven approach and the combination of expert knowledge with objective data is made. The aim of this paper is to evaluate which of these models can best be applied, in the framework of Quantitative Risk Assessment, to assess the safety of children who are exposed to the risk of inhalation/insertion/aspiration of consumer products. The issue of preventing injuries in children is of paramount importance, in particular where product design is involved: quantifying the risk associated with product characteristics can be of great usefulness in informing product safety design regulation. Data from the European Registry of Foreign Bodies Injuries formed the starting evidence for risk assessment. Results showed that Bayesian Networks offer both ease of interpretability and accuracy in making predictions, even if simpler models like logistic regression still performed well. © The Author(s) 2013.
There are thousands of environmental chemicals subject to regulatory decisions for endocrine disrupting potential. A promising approach to manage this large universe of untested chemicals is to use a prioritization filter that combines in vitro assays with in silico QSA...
ERIC Educational Resources Information Center
Liu, Shiang-Yao; Lin, Chuan-Shun; Tsai, Chin-Chung
2011-01-01
This study aims to test the nature of the assumption that there are relationships between scientific epistemological views (SEVs) and reasoning processes in socioscientific decision making. A mixed methodology that combines both qualitative and quantitative approaches of data collection and analysis was adopted not only to verify the assumption…
ERIC Educational Resources Information Center
Spillane, James P.; Pareja, Amber Stitziel; Dorner, Lisa; Barnes, Carol; May, Henry; Huff, Jason; Camburn, Eric
2010-01-01
In this paper we describe how we mixed research approaches in a Randomized Control Trial (RCT) of a school principal professional development program. Using examples from our study, we illustrate how combining qualitative and quantitative data can address some key challenges, from validating instruments and measures of mediator variables to…
ERIC Educational Resources Information Center
Li, Winnie Sim Siew; Arshad, Mohammad Yusof
2015-01-01
Purpose: Inquiry teaching has been suggested as one of the important approaches to teaching chemistry. This study investigates inquiry practices among chemistry teachers. Method: A combination of quantitative and qualitative methods was applied in this study to provide detailed information about inquiry teaching practices. Questionnaires,…
Integrating Science and Management to Assess Forest Ecosystem Vulnerability to Climate Change
Leslie A. Brandt; Patricia R. Butler; Stephen D. Handler; Maria K. Janowiak; P. Danielle Shannon; Christopher W. Swanston
2017-01-01
We developed the ecosystem vulnerability assessment approach (EVAA) to help inform potential adaptation actions in response to a changing climate. EVAA combines multiple quantitative models and expert elicitation from scientists and land managers. In each of eight assessment areas, a panel of local experts determined potential vulnerability of forest ecosystems to...
Promoting Students' Motivation in Learning Physical Science--An Action Research Approach.
ERIC Educational Resources Information Center
Tuan, Hsiao-Lin; Chin, Chi-Chin; Tsai, Chih-Chung
This study reported how four science teachers used action research to promote their students' motivation in learning physical science. Four teachers, each with one of their 8th grade physical science classes, participated in the study. A combination of qualitative and quantitative research designs was used in the study, and data collection included…
The Working Alliance in Online Therapy with Young People: Preliminary Findings
ERIC Educational Resources Information Center
Hanley, Terry
2009-01-01
This paper examines the quality of the working alliance in online counseling relationships with young people. A mixed method approach has been adopted which combines the completion of a self-report quantitative measure (the "Therapeutic Alliance Quality Scale") and qualitative interviews with service users of a UK-based service…
Assigning and Combining Probabilities in Single-Case Studies
ERIC Educational Resources Information Center
Manolov, Rumen; Solanas, Antonio
2012-01-01
There is currently a considerable diversity of quantitative measures available for summarizing the results in single-case studies. Given that the interpretation of some of them is difficult due to the lack of established benchmarks, the current article proposes an approach for obtaining further numerical evidence on the importance of the results,…
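For general background, one standard way to combine independent p-values across studies or cases is Fisher's method, shown below; the article proposes its own procedure for assigning the probabilities, so this is context rather than a reproduction of its approach.

```python
# Fisher's method for combining independent p-values (scipy implementation).
from scipy import stats

p_values = [0.04, 0.20, 0.11]  # hypothetical per-case p-values
stat, p_combined = stats.combine_pvalues(p_values, method="fisher")
print(f"chi2 = {stat:.2f}, combined p = {p_combined:.4f}")
```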
NASA Astrophysics Data System (ADS)
D'Angelo, Paola; Migliorati, Valentina; Mancini, Giordano; Barone, Vincenzo; Chillemi, Giovanni
2008-02-01
The structural and dynamic properties of the solvated Hg2+ ion in aqueous solution have been investigated by a combined experimental-theoretical approach employing x-ray absorption spectroscopy and molecular dynamics (MD) simulations. This method allows one to perform a quantitative analysis of the x-ray absorption near-edge structure (XANES) spectra of ionic solutions using a proper description of the thermal and structural fluctuations. XANES spectra have been computed starting from the MD trajectory, without carrying out any minimization in the structural parameter space. The XANES experimental data are accurately reproduced by a first-shell heptacoordinated cluster only if the second hydration shell is included in the calculations. These results confirm both the existence of a sevenfold first hydration shell for the Hg2+ ion in aqueous solution and the reliability of the potentials used in the MD simulations. The combination of MD and XANES is found to be very helpful for gaining important new insights into the quantitative estimation of structural properties of disordered systems.
Measurement issues related to data collection on the World Wide Web.
Strickland, Ora L; Moloney, Margaret F; Dietrich, Alexa S; Myerburg, Stuart; Cotsonis, George A; Johnson, Robert V
2003-01-01
As the World Wide Web has become more prominent as a mode of communication, it has opened up new possibilities for research data collection. This article identifies measurement issues that arise with Internet data collection, relevant to both qualitative and quantitative research approaches, as they occurred in a triangulated Internet study of perimenopausal women with migraine headaches. Issues associated with quantitative data collection over the Internet include (a) selecting and designing Internet data collection protocols that adequately address study aims while also taking advantage of the Internet, (b) ensuring the reliability and validity of Internet data collected, (c) adapting quantitative paper-and-pencil data collection protocols for the Internet, (d) making Internet data collection practical for respondents and researchers, and (e) ensuring the quality of quantitative data collected. Qualitative data collection over the Internet needs to remain true to the philosophical stance of the qualitative approach selected. Researcher expertise in qualitative data collection must be combined with expertise in computer technology and information services if data are to be of the highest quality. The advantages and limitations of collecting qualitative data in real time or at a later time are explored, as well as approaches to enhance qualitative data collection over the Internet. It was concluded that, like any research approach or method, Internet data collection requires considerable creativity, expertise, and planning to take advantage of the technology for the collection of reliable and valid research data.
Sardiu, Mihaela E; Gilmore, Joshua M; Carrozza, Michael J; Li, Bing; Workman, Jerry L; Florens, Laurence; Washburn, Michael P
2009-10-06
Protein complexes are key molecular machines executing a variety of essential cellular processes. Despite the availability of genome-wide protein-protein interaction studies, determining the connectivity between proteins within a complex remains a major challenge. Here we demonstrate a method that is able to predict the relationship of proteins within a stable protein complex. We employed a combination of computational approaches and a systematic collection of quantitative proteomics data from wild-type and deletion strain purifications to build a quantitative deletion-interaction network map and subsequently convert the resulting data into an interdependency-interaction model of a complex. We applied this approach to a data set generated from components of the Saccharomyces cerevisiae Rpd3 histone deacetylase complexes, which consists of two distinct small and large complexes that are held together by a module consisting of Rpd3, Sin3 and Ume1. The resulting representation reveals new protein-protein interactions and new submodule relationships, providing novel information for mapping the functional organization of a complex.
Global, quantitative and dynamic mapping of protein subcellular localization
Itzhak, Daniel N; Tyanova, Stefka; Cox, Jürgen; Borner, Georg HH
2016-01-01
Subcellular localization critically influences protein function, and cells control protein localization to regulate biological processes. We have developed and applied Dynamic Organellar Maps, a proteomic method that allows global mapping of protein translocation events. We initially used maps statically to generate a database with localization and absolute copy number information for over 8700 proteins from HeLa cells, approaching comprehensive coverage. All major organelles were resolved, with exceptional prediction accuracy (estimated at >92%). Combining spatial and abundance information yielded an unprecedented quantitative view of HeLa cell anatomy and organellar composition, at the protein level. We subsequently demonstrated the dynamic capabilities of the approach by capturing translocation events following EGF stimulation, which we integrated into a quantitative model. Dynamic Organellar Maps enable the proteome-wide analysis of physiological protein movements, without requiring any reagents specific to the investigated process, and will thus be widely applicable in cell biology. DOI: http://dx.doi.org/10.7554/eLife.16950.001 PMID:27278775
Experimental approaches to well controlled studies of thin-film nucleation and growth.
NASA Technical Reports Server (NTRS)
Poppa, H.; Moorhead, R. D.; Heinemann, K.
1972-01-01
Particular features and the performance of two experimental systems are described for quantitative studies of thin-film nucleation and growth processes including epitaxial depositions. System I consists of a modified LEED-Auger instrument combined with high-resolution electron microscopy. System II is a UHV electron microscope adapted for in-situ deposition studies. The two systems complement each other ideally, and the combined use of both can result in a comprehensive investigation of vapor deposition processes not obtainable with any other known method.
Using mixed methods in health research
Woodman, Jenny
2013-01-01
Summary Mixed methods research is the use of quantitative and qualitative methods in a single study or series of studies. It is an emergent methodology which is increasingly used by health researchers, especially within health services research. There is a growing literature on the theory, design and critical appraisal of mixed methods research. However, there are few papers that summarize this methodological approach for health practitioners who wish to conduct or critically engage with mixed methods studies. The objective of this paper is to provide an accessible introduction to mixed methods for clinicians and researchers unfamiliar with this approach. We present a synthesis of key methodological literature on mixed methods research, with examples from our own work and that of others, to illustrate the practical applications of this approach within health research. We summarize definitions of mixed methods research, the value of this approach, key aspects of study design and analysis, and discuss the potential challenges of combining quantitative and qualitative methods and data. One of the key challenges within mixed methods research is the successful integration of quantitative and qualitative data during analysis and interpretation. However, the integration of different types of data can generate insights into a research question, resulting in enriched understanding of complex health research problems. PMID:23885291
Systems toxicology: from basic research to risk assessment.
Sturla, Shana J; Boobis, Alan R; FitzGerald, Rex E; Hoeng, Julia; Kavlock, Robert J; Schirmer, Kristin; Whelan, Maurice; Wilks, Martin F; Peitsch, Manuel C
2014-03-17
Systems Toxicology is the integration of classical toxicology with quantitative analysis of large networks of molecular and functional changes occurring across multiple levels of biological organization. Society demands increasingly close scrutiny of the potential health risks associated with exposure to chemicals present in our everyday life, leading to an increasing need for more predictive and accurate risk-assessment approaches. Developing such approaches requires a detailed mechanistic understanding of the ways in which xenobiotic substances perturb biological systems and lead to adverse outcomes. Thus, Systems Toxicology approaches offer modern strategies for gaining such mechanistic knowledge by combining advanced analytical and computational tools. Furthermore, Systems Toxicology is a means for the identification and application of biomarkers for improved safety assessments. In Systems Toxicology, quantitative systems-wide molecular changes in the context of an exposure are measured, and a causal chain of molecular events linking exposures with adverse outcomes (i.e., functional and apical end points) is deciphered. Mathematical models are then built to describe these processes in a quantitative manner. The integrated data analysis leads to the identification of how biological networks are perturbed by the exposure and enables the development of predictive mathematical models of toxicological processes. This perspective integrates current knowledge regarding bioanalytical approaches, computational analysis, and the potential for improved risk assessment.
Kang, Bo-Sik; Lee, Jang-Eun; Park, Hyun-Jin
2014-06-01
In a Korean rice wine (makgeolli) model, we sought to develop a prediction model capable of establishing a quantitative relationship between the initial amino acids in makgeolli mash and major aromatic compounds, such as fusel alcohols, their acetate esters, and ethyl esters of fatty acids, in the brewed makgeolli. A mass-spectrometry-based electronic nose (MS-EN) was used to qualitatively discriminate between makgeollis made from mashes with different amino acid compositions. Following this measurement, headspace solid-phase microextraction coupled to gas chromatography-mass spectrometry (GC-MS) combined with the partial least-squares regression (PLSR) method was employed to quantitatively correlate the amino acid composition of makgeolli mash with the major aromatic compounds evolved during makgeolli fermentation. In qualitative prediction with MS-EN analysis, the makgeollis were well discriminated according to the volatile compounds derived from the amino acids of the mash. Twenty-seven ion fragments with mass-to-charge ratios (m/z) of 55 to 98 amu were responsible for the discrimination. In the GC-MS combined with PLSR method, a quantitative comparison between the initial amino acids of makgeolli mash and the fusel compounds of makgeolli demonstrated that the coefficient of determination (R²) for most of the fusel compounds ranged from 0.77 to 0.94, in good correlation, except for 2-phenylethanol (R² = 0.21), whereas R² for ethyl esters of medium-chain fatty acids (MCFAs), including ethyl caproate, ethyl caprylate, and ethyl caprate, was 0.17 to 0.40, in poor correlation. Amino acids are known to affect aroma in alcoholic beverages. In this study, we demonstrated that an electronic nose can qualitatively differentiate Korean rice wines (makgeollis) by the volatile compounds evolved from amino acids, with rapidity and reproducibility, and that, subsequently, a quantitative correlation with acceptable R² between amino acids and fusel compounds could be established via HS-SPME GC-MS combined with partial least-squares regression. Our approach of predicting the quantities of volatile compounds in the finished product from the initial conditions of fermentation will give food researchers insight into modifying and optimizing the qualities of the corresponding products. © 2014 Institute of Food Technologists®
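A minimal sketch of the PLSR step named above, regressing a volatile-compound level on amino acid composition and reporting R²; the data are simulated stand-ins for the GC-MS measurements.

```python
# Partial least-squares regression of one fusel-compound level on
# initial amino acid composition, with R-squared as the fit metric.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
X = rng.normal(size=(30, 17))                    # 30 mashes x 17 amino acids
true_w = rng.normal(size=17)
y = X @ true_w + rng.normal(scale=2.0, size=30)  # one fusel alcohol level

pls = PLSRegression(n_components=3)
pls.fit(X, y)
print("R^2 =", round(r2_score(y, pls.predict(X).ravel()), 2))
```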
Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia
2013-05-30
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.
Advanced imaging techniques in brain tumors
2009-01-01
Perfusion, permeability, and magnetic resonance spectroscopy (MRS) are now widely used in research and clinical settings. In the clinical setting, qualitative, semi-quantitative, and quantitative approaches, ranging from review of color-coded maps to region-of-interest analysis and analysis of signal intensity curves, are applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relatively low sensitivity of metabolite ratios from MRS, the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging, and the correction and normalization methods that can be applied. Combining and applying these different imaging techniques in a multi-parametric, algorithmic fashion in the clinical setting can be shown to increase diagnostic specificity and confidence. PMID:19965287
Bacterial responses to antibiotics and their combinations.
Mitosch, Karin; Bollenbach, Tobias
2014-12-01
Antibiotics affect bacterial cell physiology at many levels. Rather than just compensating for the direct cellular defects caused by the drug, bacteria respond to antibiotics by changing their morphology, macromolecular composition, metabolism, gene expression and possibly even their mutation rate. Inevitably, these processes affect each other, resulting in a complex response with changes in the expression of numerous genes. Genome-wide approaches can thus help in gaining a comprehensive understanding of bacterial responses to antibiotics. In addition, a combination of experimental and theoretical approaches is needed for identifying general principles that underlie these responses. Here, we review recent progress in our understanding of bacterial responses to antibiotics and their combinations, focusing on effects at the levels of growth rate and gene expression. We concentrate on studies performed in controlled laboratory conditions, which combine promising experimental techniques with quantitative data analysis and mathematical modeling. While these basic research approaches are not immediately applicable in the clinic, uncovering the principles and mechanisms underlying bacterial responses to antibiotics may, in the long term, contribute to the development of new treatment strategies to cope with and prevent the rise of resistant pathogenic bacteria.
Lin, Kai; Zhang, Lanwei; Han, Xue; Meng, Zhaoxu; Zhang, Jianming; Wu, Yifan; Cheng, Dayou
2018-03-28
In this study, Qula casein derived from yak milk casein was hydrolyzed using a two-enzyme combination approach, and peptides with high angiotensin I-converting enzyme (ACE) inhibitory activity were screened by quantitative structure-activity relationship (QSAR) modeling integrated with molecular docking analysis. Hydrolysates (<3 kDa) derived from combinations of thermolysin + alcalase and thermolysin + proteinase K demonstrated high ACE inhibitory activities. Peptide sequences in hydrolysates derived from these two combinations were identified by liquid chromatography-tandem mass spectrometry (LC-MS/MS). On the basis of the QSAR modeling prediction, a total of 16 peptides were selected for molecular docking analysis. The docking study revealed that four of the peptides (KFPQY, MPFPKYP, MFPPQ, and QWQVL) bound the active site of ACE. These four novel peptides were chemically synthesized, and their IC50 was determined. Among these peptides, KFPQY showed the highest ACE inhibitory activity (IC50 = 12.37 ± 0.43 μM). Our study indicated that Qula casein presents an excellent source to produce ACE inhibitory peptides.
A novel approach for evaluating the risk of health care failure modes.
Chang, Dong Shang; Chung, Jenq Hann; Sun, Kuo Lung; Yang, Fu Chiang
2012-12-01
Failure mode and effects analysis (FMEA) can be employed to reduce medical errors by identifying the risk ranking of health care failure modes and taking priority action for safety improvement. The purpose of this paper is to propose a novel approach to data analysis: integrating FMEA with a mathematical tool, data envelopment analysis (DEA) with the slack-based measure (SBM). The risk indexes of FMEA (severity, occurrence, and detection) are viewed as multiple inputs of DEA. The practicality and usefulness of the proposed approach is illustrated by a health care case. As a systematic approach for improving the service quality of health care, the approach can offer quantitative corrective information on the risk indexes that thereafter reduces failure possibility. For safety improvement, these new targets for the risk indexes could be used for management by objectives. FMEA alone cannot provide quantitative corrective information on risk indexes; the proposed approach overcomes this chief shortcoming of FMEA. By combining the DEA SBM model with FMEA, the two goals of increased patient safety and reduced medical cost can be achieved together.
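For orientation, the sketch below computes the classical FMEA risk priority number (RPN), the ranking that the paper's DEA-SBM integration refines by treating severity, occurrence, and detection as DEA inputs. The failure modes and ratings are invented.

```python
# Classical FMEA ranking: RPN = severity x occurrence x detection.
failure_modes = {
    # name: (severity, occurrence, detection), each rated 1-10
    "wrong medication dispensed": (9, 3, 4),
    "patient ID mismatch":        (8, 2, 2),
    "delayed lab result":         (5, 6, 5),
}
ranked = sorted(failure_modes.items(),
                key=lambda kv: -(kv[1][0] * kv[1][1] * kv[1][2]))
for name, (s, o, d) in ranked:
    print(f"RPN = {s * o * d:3d}  {name}")
```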
High-Content Screening for Quantitative Cell Biology.
Mattiazzi Usaj, Mojca; Styles, Erin B; Verster, Adrian J; Friesen, Helena; Boone, Charles; Andrews, Brenda J
2016-08-01
High-content screening (HCS), which combines automated fluorescence microscopy with quantitative image analysis, allows the acquisition of unbiased multiparametric data at the single cell level. This approach has been used to address diverse biological questions and identify a plethora of quantitative phenotypes of varying complexity in numerous different model systems. Here, we describe some recent applications of HCS, ranging from the identification of genes required for specific biological processes to the characterization of genetic interactions. We review the steps involved in the design of useful biological assays and automated image analysis, and describe major challenges associated with each. Additionally, we highlight emerging technologies and future challenges, and discuss how the field of HCS might be enhanced in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
Sender–receiver systems and applying information theory for quantitative synthetic biology
Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark
2015-01-01
Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
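The quantitative basis referred to above is mutual information between sent and received signals. The sketch below computes I(S;R) for an assumed joint distribution of a noisy two-state channel; the distribution is illustrative.

```python
# Mutual information I(S;R) in bits from a joint distribution P(S, R).
import numpy as np

# Rows = sender state, columns = receiver state (illustrative crosstalk).
p_joint = np.array([[0.40, 0.10],
                    [0.05, 0.45]])
p_s = p_joint.sum(axis=1, keepdims=True)   # marginal over receiver
p_r = p_joint.sum(axis=0, keepdims=True)   # marginal over sender

mi_bits = np.sum(p_joint * np.log2(p_joint / (p_s * p_r)))
print(f"I(S;R) = {mi_bits:.3f} bits")      # bounded by 1 bit for binary S
```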
Issues in evaluation: evaluating assessments of elderly people using a combination of methods.
McEwan, R T
1989-02-01
In evaluating a health service, individuals will give differing accounts of its performance according to their experiences of the service and the evaluative perspective they adopt. The value of a service may also change through time and according to the particular part of the service studied. Traditional health care evaluations have generally not accounted for this variability because of the approaches used. Studies evaluating screening or assessment programmes for the elderly have focused on programme effectiveness and efficiency, using relatively inflexible quantitative methods. Evaluative approaches must reflect the complexity of health service provision, and methods must vary to suit the particular research objective. Under these circumstances, this paper presents the case for the use of multiple triangulation in evaluative research, where differing methods and perspectives are combined in one study. Emphasis is placed on the applications and benefits of subjectivist approaches in evaluation. An example of combined methods is provided in the form of an evaluation of the Newcastle Care Plan for the Elderly.
Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun
2017-01-01
Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
Confidence estimation for quantitative photoacoustic imaging
NASA Astrophysics Data System (ADS)
Gröhl, Janek; Kirchner, Thomas; Maier-Hein, Lena
2018-02-01
Quantification of photoacoustic (PA) images is one of the major challenges currently being addressed in PA research. Tissue properties can be quantified by correcting the recorded PA signal with an estimation of the corresponding fluence. Fluence estimation itself, however, is an ill-posed inverse problem which usually needs simplifying assumptions to be solved with state-of-the-art methods. These simplifications, as well as noise and artifacts in PA images reduce the accuracy of quantitative PA imaging (PAI). This reduction in accuracy is often localized to image regions where the assumptions do not hold true. This impedes the reconstruction of functional parameters when averaging over entire regions of interest (ROI). Averaging over a subset of voxels with a high accuracy would lead to an improved estimation of such parameters. To achieve this, we propose a novel approach to the local estimation of confidence in quantitative reconstructions of PA images. It makes use of conditional probability densities to estimate confidence intervals alongside the actual quantification. It encapsulates an estimation of the errors introduced by fluence estimation as well as signal noise. We validate the approach using Monte Carlo generated data in combination with a recently introduced machine learning-based approach to quantitative PAI. Our experiments show at least a two-fold improvement in quantification accuracy when evaluating on voxels with high confidence instead of thresholding signal intensity.
Lee, Alex Pui-Wai; Fang, Fang; Jin, Chun-Na; Kam, Kevin Ka-Ho; Tsui, Gary K W; Wong, Kenneth K Y; Looi, Jen-Li; Wong, Randolph H L; Wan, Song; Sun, Jing Ping; Underwood, Malcolm J; Yu, Cheuk-Man
2014-01-01
The mitral valve (MV) has complex 3-dimensional (3D) morphology and motion. Advances in real-time 3D echocardiography (RT3DE) have revolutionized clinical imaging of the MV by providing clinicians with realistic visualization of the valve. Thus far, RT3DE of the MV structure and dynamics has adopted an approach that depends largely on subjective and qualitative interpretation of the 3D images of the valve, rather than objective and reproducible measurement. RT3DE combined with image-processing computer techniques provides precise segmentation and reliable quantification of the complex 3D morphology and rapid motion of the MV. This new approach to imaging may provide additional quantitative descriptions that are useful in diagnostic and therapeutic decision-making. Quantitative analysis of the MV using RT3DE has increased our understanding of the pathologic mechanism of degenerative, ischemic, functional, and rheumatic MV disease. Most recently, 3D morphologic quantification has entered into clinical use to provide more accurate diagnosis of MV disease and for planning surgery and transcatheter interventions. Current limitations of this quantitative approach to MV imaging include labor-intensiveness during image segmentation and the lack of a clear definition of the clinical significance of many of the morphologic parameters. This review summarizes the current development and applications of quantitative analysis of the MV morphology using RT3DE.
Using mixed methods when researching communities.
Ochieng, Bertha M N; Meetoo, Danny
2015-09-01
To argue for the use of mixed methods when researching communities. Although research involving minority communities is now advanced, not enough effort has been made to formulate methodological linkages between qualitative and quantitative methods in most studies. For instance, the quantitative approaches used by epidemiologists and others in examining the wellbeing of communities are usually empirical. While the rationale for this is sound, quantitative findings can be expanded with data from in-depth qualitative approaches, such as interviews or observations, which are likely to provide insights into the experiences of people in those communities and their relationships with their wellbeing. Academic databases including The Cochrane Library, MEDLINE, CINAHL, AMED, INTERNURSE, Science Direct, Web of Knowledge and PubMed. An iterative process of identifying eligible literature was carried out by comprehensively searching electronic databases. Using mixed-methods approaches is likely to address any potential drawbacks of individual methods by exploiting the strengths of each at the various stages of research. Combining methods can provide additional ways of looking at a complex problem and improve the understanding of a community's experiences. However, it is important for researchers to use the different methods interactively during their research. The use of qualitative and quantitative methods is likely to enrich our understanding of the interrelationship between wellbeing and the experiences of communities. This should help researchers to explore socio-cultural factors and experiences of health and healthcare practice more effectively.
Quantitative methods to direct exploration based on hydrogeologic information
Graettinger, A.J.; Lee, J.; Reeves, H.W.; Dethan, D.
2006-01-01
Quantitatively Directed Exploration (QDE) approaches based on information such as model sensitivity, input data covariance and model output covariance are presented. Seven approaches for directing exploration are developed, applied, and evaluated on a synthetic hydrogeologic site. The QDE approaches evaluate input information uncertainty, subsurface model sensitivity and, most importantly, output covariance to identify the next location to sample. Spatial input parameter values and covariances are calculated with the multivariate conditional probability calculation from a limited number of samples. A variogram structure is used during data extrapolation to describe the spatial continuity, or correlation, of subsurface information. Model sensitivity can be determined by perturbing input data and evaluating output response or, as in this work, sensitivities can be programmed directly into an analysis model. Output covariance is calculated by the First-Order Second Moment (FOSM) method, which combines the covariance of input information with model sensitivity. A groundwater flow example, modeled in MODFLOW-2000, is chosen to demonstrate the seven QDE approaches. MODFLOW-2000 is used to obtain the piezometric head and the model sensitivity simultaneously. The seven QDE approaches are evaluated based on the accuracy of the modeled piezometric head after information from a QDE sample is added. For the synthetic site used in this study, the QDE approach that identifies the location of hydraulic conductivity that contributes the most to the overall piezometric head variance proved to be the best method to quantitatively direct exploration. © IWA Publishing 2006.
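A minimal sketch of the FOSM step named above: the output covariance is approximated by propagating the input covariance through the model sensitivities, Cov(y) ~= J Cov(x) J^T, and the best-performing QDE method samples where the resulting head variance is largest. All values are illustrative.

```python
# First-Order Second Moment (FOSM) propagation of input covariance
# through a sensitivity (Jacobian) matrix to get output covariance.
import numpy as np

# Sensitivities of 3 head observations to 2 inputs (e.g. K zones), dh/dx.
J = np.array([[0.8, 0.1],
              [0.5, 0.4],
              [0.2, 0.9]])
cov_x = np.array([[0.25, 0.05],    # input covariance, e.g. from the
                  [0.05, 0.36]])   # conditional probability calculation

cov_y = J @ cov_x @ J.T            # FOSM output covariance
head_variance = np.diag(cov_y)     # variance of each modeled head
print("head variances:", head_variance)
print("largest-variance location (next sample):", int(np.argmax(head_variance)))
```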
Ou, Keli; Kesuma, Djohan; Ganesan, Kumaresan; Yu, Kun; Soon, Sou Yen; Lee, Suet Ying; Goh, Xin Pei; Hooi, Michelle; Chen, Wei; Jikuya, Hiroyuki; Ichikawa, Tetsuo; Kuyama, Hiroki; Matsuo, Ei-ichi; Nishimura, Osamu; Tan, Patrick
2006-09-01
The identification of drug-responsive biomarkers in complex protein mixtures is an important goal of quantitative proteomics. Here, we describe a novel approach for identifying such drug-induced protein alterations, which combines 2-nitrobenzenesulfenyl chloride (NBS) tryptophan labeling with two-dimensional gel electrophoresis (2DE)/mass spectrometry (MS). Lysates from drug-treated and control samples are labeled with light or heavy NBS moiety and separated on a common 2DE gel, and protein alterations are identified by MS through the differential intensity of paired NBS peptide peaks. Using NBS/2DE/MS, we profiled the proteomic alterations induced by tamoxifen (TAM) in the estrogen receptor (ER) positive MCF-7 breast cancer cell line. Of 88 protein spots that significantly changed upon TAM treatment, 44 spots representing 23 distinct protein species were successfully identified with NBS-paired peptides. Of these 23 TAM-altered proteins, 16 (70%) have not been previously associated with TAM or ER activity. We found the NBS labeling procedure to be both technically and biologically reproducible, and the NBS/2DE/MS alterations exhibited good concordance with conventional 2DE differential protein quantitation, with discrepancies largely due to the comigration of distinct proteins in the regular 2DE gels. To validate the NBS/2DE/MS results, we used immunoblotting to confirm GRP78, CK19, and PA2G4 as bona fide TAM-regulated proteins. Furthermore, we demonstrate that PA2G4 expression can serve as a novel prognostic factor for disease-free survival in two independent breast cancer patient cohorts. To our knowledge, this is the first report describing the proteomic changes in breast cancer cells induced by TAM, the most commonly used selective estrogen receptor modulator (SERM). Our results indicate that NBS/2DE/MS may represent a more reliable approach for cellular protein quantitation than conventional 2DE approaches.
Berisha, Arton; Dold, Sebastian; Guenther, Sabine; Desbenoit, Nicolas; Takats, Zoltan; Spengler, Bernhard; Römpp, Andreas
2014-08-30
An ideal method for bioanalytical applications would deliver spatially resolved quantitative information in real time and without sample preparation. In reality these requirements can typically not be met by a single analytical technique. Therefore, we combine different mass spectrometry approaches: chromatographic separation, ambient ionization and imaging techniques, in order to obtain comprehensive information about metabolites in complex biological samples. Samples were analyzed by laser desorption followed by electrospray ionization (LD-ESI) as an ambient ionization technique, by matrix-assisted laser desorption/ionization (MALDI) mass spectrometry imaging for spatial distribution analysis and by high-performance liquid chromatography/electrospray ionization mass spectrometry (HPLC/ESI-MS) for quantitation and validation of compound identification. All MS data were acquired with high mass resolution and accurate mass (using orbital trapping and ion cyclotron resonance mass spectrometers). Grape berries were analyzed and evaluated in detail, whereas wheat seeds and mouse brain tissue were analyzed in proof-of-concept experiments. In situ measurements by LD-ESI without any sample preparation allowed for fast screening of plant metabolites on the grape surface. MALDI imaging of grape cross sections at 20 µm pixel size revealed the detailed distribution of metabolites which were in accordance with their biological function. HPLC/ESI-MS was used to quantify 13 anthocyanin species as well as to separate and identify isomeric compounds. A total of 41 metabolites (amino acids, carbohydrates, anthocyanins) were identified with all three approaches. Mass accuracy for all MS measurements was better than 2 ppm (root mean square error). The combined approach provides fast screening capabilities, spatial distribution information and the possibility to quantify metabolites. Accurate mass measurements proved to be critical in order to reliably combine data from different MS techniques. Initial results on the mycotoxin deoxynivalenol (DON) in wheat seed and phospholipids in mouse brain as a model for mammalian tissue indicate a broad applicability of the presented workflow. Copyright © 2014 John Wiley & Sons, Ltd.
Clinical review: Reunification of acid–base physiology
Kellum, John A
2005-01-01
Recent advances in acid–base physiology and in the epidemiology of acid–base disorders have refined our understanding of the basic control mechanisms that determine blood pH in health and disease. These refinements have also brought parity between the newer, quantitative and older, descriptive approaches to acid–base physiology. This review explores how the new and older approaches to acid–base physiology can be reconciled and combined to result in a powerful bedside tool. A case based tutorial is also provided. PMID:16277739
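For background, the sketch below contrasts the two traditions the review reconciles: the descriptive Henderson-Hasselbalch relation and, from the quantitative (Stewart) side, the apparent strong ion difference. Inputs are typical textbook values, not data from the article.

```python
# Two bedside acid-base calculations from the two traditions.
import math

def hh_ph(hco3_mmol_l: float, pco2_mmhg: float) -> float:
    """Henderson-Hasselbalch: pH = 6.1 + log10([HCO3-] / (0.03 * pCO2))."""
    return 6.1 + math.log10(hco3_mmol_l / (0.03 * pco2_mmhg))

def sid_apparent(na, k, ca, mg, cl, lactate):
    """Apparent strong ion difference (mEq/L), a core Stewart variable."""
    return (na + k + ca + mg) - (cl + lactate)

print(f"pH   = {hh_ph(24.0, 40.0):.2f}")                    # ~7.40
print(f"SIDa = {sid_apparent(140, 4, 2.4, 1.5, 105, 1):.1f} mEq/L")
```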
Automatic Gleason grading of prostate cancer using quantitative phase imaging and machine learning
NASA Astrophysics Data System (ADS)
Nguyen, Tan H.; Sridharan, Shamira; Macias, Virgilia; Kajdacsy-Balla, Andre; Melamed, Jonathan; Do, Minh N.; Popescu, Gabriel
2017-03-01
We present an approach for automatic diagnosis of tissue biopsies. Our methodology consists of a quantitative phase imaging tissue scanner and machine learning algorithms to process these data. We illustrate the performance by automatic Gleason grading of prostate specimens. The imaging system operates on the principle of interferometry and, as a result, reports on the nanoscale architecture of the unlabeled specimen. We use these data to train a random forest classifier to learn textural behaviors of prostate samples and classify each pixel in the image into different classes. Automatic diagnosis results were computed from the segmented regions. By combining morphological features with quantitative information from the glands and stroma, logistic regression was used to discriminate regions with Gleason grade 3 versus grade 4 cancer in prostatectomy tissue. The overall accuracy of this classification, derived from a receiver operating characteristic (ROC) curve, was 82%, which is in the range of human error when interobserver variability is considered. We anticipate that our approach will provide a clinically objective and quantitative metric for Gleason grading, allowing us to corroborate results across instruments and laboratories and feed the computer algorithms for improved accuracy.
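A minimal sketch of the final discrimination step described above: logistic regression on per-region features with the AUC taken from an ROC curve. The features and labels are simulated stand-ins for the gland and stroma measurements.

```python
# Logistic regression on region-level features, evaluated by ROC AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.normal(size=(400, 6))            # 6 hypothetical region features
# In this analogy, y = 1 denotes Gleason grade 4 and y = 0 grade 3.
y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(scale=1.0, size=400) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"ROC AUC = {auc:.2f}")
```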
Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L
2013-04-16
Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
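The sketch below fits one plausible cross-talk-aware calibration form: if a fraction of the analyte's isotope pattern overlaps the internal standard channel, the measured analyte/IS ratio bends as R(x) = ax/(1 + bx) rather than staying linear. The functional form and constants are illustrative assumptions; the paper determines its constants experimentally for each analyte/internal standard pair.

```python
# Nonlinear calibration fit accounting for analyte-to-IS signal cross talk.
import numpy as np
from scipy.optimize import curve_fit

def response(x, a, b):
    # x: analyte concentration; b: analyte-to-IS cross-contribution constant
    return a * x / (1.0 + b * x)

conc = np.array([0.5, 1, 2, 5, 10, 20, 50.0])            # calibrators
ratio = response(conc, a=0.10, b=0.02)                    # simulated readings
ratio *= 1 + np.random.default_rng(4).normal(scale=0.01, size=conc.size)

(a_fit, b_fit), _ = curve_fit(response, conc, ratio, p0=(0.1, 0.0))
print(f"a = {a_fit:.4f}, b = {b_fit:.4f}")

# Inverting R = a*x/(1 + b*x) gives x = R/(a - b*R) for back-calculation.
r_meas = 0.08
print(f"back-calculated concentration: {r_meas / (a_fit - b_fit * r_meas):.2f}")
```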
Quantification of EEG reactivity in comatose patients
Hermans, Mathilde C.; Westover, M. Brandon; van Putten, Michel J.A.M.; Hirsch, Lawrence J.; Gaspard, Nicolas
2016-01-01
Objective: EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. Methods: In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of the probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. Results: The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha, and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinions of the experts (Gwet's AC1: 65–70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by inter-expert agreement regarding reactivity for each individual case. Conclusion: Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Significance: Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. PMID:26183757
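A minimal sketch of the spectral temporal symmetry idea: compare band power in the minute before versus after stimulation onset. The study's probability values came from fitted combinations of 13 parameters; here only a simple per-band power ratio on synthetic data is shown.

```python
# Band-power comparison of pre- vs post-stimulation EEG segments.
import numpy as np
from scipy.signal import welch

fs = 256                                   # Hz, assumed sampling rate
rng = np.random.default_rng(5)
pre  = rng.normal(size=60 * fs)            # 1 min of EEG before stimulus
post = rng.normal(size=60 * fs) * 1.3      # 1 min after (simulated change)

bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
f, p_pre  = welch(pre,  fs=fs, nperseg=4 * fs)
_, p_post = welch(post, fs=fs, nperseg=4 * fs)

for name, (lo, hi) in bands.items():
    m = (f >= lo) & (f < hi)
    ratio = p_post[m].sum() / p_pre[m].sum()
    print(f"{name:5s} post/pre power ratio = {ratio:.2f}")
```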
2015-01-01
The use of mixed methods (combining quantitative and qualitative data) is developing in a variety of forms, especially in the health field. Our own research has adopted this perspective from the outset. We have sought all along to innovate in various ways and especially to develop an equal partnership, in the sense of not allowing any single approach to dominate. After briefly describing mixed methods, in this article we explain and illustrate how we have exploited both qualitative and quantitative methods to answer our research questions, ending with a reflective analysis of our experiment. PMID:26559730
Sresht, Vishnu; Lewandowski, Eric P; Blankschtein, Daniel; Jusufi, Arben
2017-08-22
A molecular modeling approach is presented with a focus on quantitative predictions of the surface tension of aqueous surfactant solutions. The approach combines classical Molecular Dynamics (MD) simulations with a molecular-thermodynamic theory (MTT) [Y. J. Nikas, S. Puvvada, D. Blankschtein, Langmuir 1992, 8, 2680]. The MD component is used to calculate thermodynamic and molecular parameters that are needed in the MTT model to determine the surface tension isotherm. The MD/MTT approach provides the important link between the surfactant bulk concentration, the experimental control parameter, and the surfactant surface concentration, the MD control parameter. We demonstrate the capability of the MD/MTT modeling approach on nonionic alkyl polyethylene glycol surfactants at the air-water interface and observe reasonable agreement of the predicted surface tensions with the experimental surface tension data over a wide range of surfactant concentrations below the critical micelle concentration. Our modeling approach can be extended to ionic surfactants and their mixtures with both ionic and nonionic surfactants at liquid-liquid interfaces.
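For context, the standard mechanical route to surface tension in an MD slab simulation uses the pressure-tensor anisotropy, gamma = (Lz/2)(<Pzz> - (<Pxx> + <Pyy>)/2) for two interfaces normal to z. The numbers below are illustrative; the paper's MD component additionally extracts molecular parameters that feed the MTT model.

```python
# Surface tension from the diagonal pressure-tensor components of a slab.
import numpy as np

Lz_nm = 12.0                                # box length normal to interface
rng = np.random.default_rng(6)
# Hypothetical time series of diagonal pressure components (bar).
pxx = rng.normal(1000.0, 30.0, size=5000)
pyy = rng.normal(1000.0, 30.0, size=5000)
pzz = rng.normal(1100.0, 30.0, size=5000)

gamma_bar_nm = 0.5 * Lz_nm * (pzz.mean() - 0.5 * (pxx.mean() + pyy.mean()))
gamma_mN_m = gamma_bar_nm * 0.1             # 1 bar*nm = 0.1 mN/m
print(f"surface tension ~ {gamma_mN_m:.1f} mN/m")
```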
ERIC Educational Resources Information Center
O'Halloran, Kay L.; Tan, Sabine; Pham, Duc-Son; Bateman, John; Vande Moere, Andrew
2018-01-01
This article demonstrates how a digital environment offers new opportunities for transforming qualitative data into quantitative data in order to use data mining and information visualization for mixed methods research. The digital approach to mixed methods research is illustrated by a framework which combines qualitative methods of multimodal…
ERIC Educational Resources Information Center
Kadambi, Michaela A.; Truscott, Derek
2008-01-01
Concept mapping (a combined qualitative/quantitative approach) was used to clarify and understand 72 Canadian professionals' experience of what they found to be traumatizing about their work with sexual violence survivors in community settings. A sample of 30 professionals providing community-based treatment to survivors of sexual violence sorted…
A Mixed Methods Content Analysis of the Research Literature in Science Education
ERIC Educational Resources Information Center
Schram, Asta B.
2014-01-01
In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…
ERIC Educational Resources Information Center
Pigott, Julian
2012-01-01
In this paper I give an overview of recent developments in the L2 motivation field, in particular the movement away from quantitative, questionnaire-based methodologies toward smaller-scale qualitative studies incorporating concepts from complexity theory. While complexity theory provides useful concepts for exploring motivation in new ways, it…
ERIC Educational Resources Information Center
Powell, Heather; Mihalas, Stephanie; Onwuegbuzie, Anthony J.; Suldo, Shannon; Daley, Christine E.
2008-01-01
This article illustrates the utility of mixed methods research (i.e., combining quantitative and qualitative techniques) to the field of school psychology. First, the use of mixed methods approaches in school psychology practice is discussed. Second, the mixed methods research process is described in terms of school psychology research. Third, the…
Children's Structured Conceptualizations of Their Beliefs on the Causes of Learning Difficulties
ERIC Educational Resources Information Center
Nowicki, Elizabeth A.; Brown, Jason; Stepien, Magdalena
2014-01-01
Elementary school children between 9 and 12 years of age were interviewed on what they believed to be the causes of learning difficulties and were invited to take part in the analysis of the data. We achieved this with Trochim's concept mapping approach that combines qualitative and quantitative data analyses. Study results indicated that children…
Mapping Mixed Methods Research: Methods, Measures, and Meaning
ERIC Educational Resources Information Center
Wheeldon, J.
2010-01-01
This article explores how concept maps and mind maps can be used as data collection tools in mixed methods research to combine the clarity of quantitative counts with the nuance of qualitative reflections. Based on more traditional mixed methods approaches, this article details how the use of pre/post concept maps can be used to design qualitative…
Brenner, Stephan; Muula, Adamson S; Robyn, Paul Jacob; Bärnighausen, Till; Sarker, Malabika; Mathanga, Don P; Bossert, Thomas; De Allegri, Manuela
2014-04-22
In this article we present a study design to evaluate the causal impact of providing supply-side performance-based financing incentives in combination with a demand-side cash transfer component on equitable access to and quality of maternal and neonatal healthcare services. This intervention is introduced to selected emergency obstetric care facilities and catchment area populations in four districts in Malawi. We describe and discuss our study protocol with regard to the research aims, the local implementation context, and our rationale for selecting a mixed methods explanatory design with a quasi-experimental quantitative component. The quantitative research component consists of a controlled pre- and post-test design with multiple post-test measurements. This allows us to quantitatively measure 'equitable access to healthcare services' at the community level and 'healthcare quality' at the health facility level. Guided by a theoretical framework of causal relationships, we determined a number of input, process, and output indicators to evaluate both intended and unintended effects of the intervention. Overall causal impact estimates will result from a difference-in-difference analysis comparing selected indicators across intervention and control facilities/catchment populations over time. To further explain the heterogeneity of quantitatively observed effects and to understand the experiential dimensions of financial incentives for clients and providers, we designed a qualitative component in line with the overall explanatory mixed methods approach. This component consists of in-depth interviews and focus group discussions with providers, service users, non-users, and policy stakeholders. In this explanatory design, comprehensive understanding of expected and unexpected effects of the intervention on both access and quality will emerge through careful triangulation at two levels: across multiple quantitative elements and across quantitative and qualitative elements. Combining a traditional quasi-experimental controlled pre- and post-test design with an explanatory mixed methods model permits an additional assessment of organizational and behavioral changes affecting complex processes. Through this impact evaluation approach, our design will not only create robust evidence measures for the outcome of interest, but also generate insights into how and why the investigated interventions produce certain intended and unintended effects, allowing for a more in-depth evaluation approach.
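A minimal sketch of the difference-in-difference estimator at the core of the quantitative component; the indicator values are invented.

```python
# Difference-in-difference: change in the intervention group minus
# change in the control group over the same period.
pre_intervention, post_intervention = 0.52, 0.68   # e.g. share of supervised
pre_control,      post_control      = 0.50, 0.55   # deliveries in catchment

did = (post_intervention - pre_intervention) - (post_control - pre_control)
print(f"difference-in-difference estimate: {did:+.2f}")
# In practice this is fit as a regression with group and period effects,
# which also yields standard errors and allows covariate adjustment.
```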
Advances in chemical labeling of proteins in living cells.
Yan, Qi; Bruchez, Marcel P
2015-04-01
The pursuit of quantitative biological information via imaging requires robust labeling approaches that can be used in multiple applications and with a variety of detectable colors and properties. In addition to conventional fluorescent proteins, chemists and biologists have come together to provide a range of approaches that combine dye chemistry with the convenience of genetic targeting. This hybrid-tagging approach amalgamates the rational design of properties available through synthetic dye chemistry with the robust biological targeting available with genetic encoding. In this review, we discuss the current range of approaches that have been exploited for dye targeting or for targeting and activation and some of the recent applications that are uniquely permitted by these hybrid-tagging approaches.
Dekker, Job; Belmont, Andrew S; Guttman, Mitchell; Leshyk, Victor O; Lis, John T; Lomvardas, Stavros; Mirny, Leonid A; O'Shea, Clodagh C; Park, Peter J; Ren, Bing; Politz, Joan C Ritland; Shendure, Jay; Zhong, Sheng
2017-09-13
The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic insights into how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental technologies will be combined with biophysical approaches to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells.
NASA Astrophysics Data System (ADS)
Xu, Xiaoqing; Wang, Yawei; Ji, Ying; Xu, Yuanyuan; Xie, Ming; Han, Hao
2018-05-01
A new approach to quantitative phase imaging using four interferograms with special phase shifts in dual-wavelength in-line phase-shifting interferometry is presented. In this method, positive and negative 2π phase shifts are employed to easily separate the incoherent addition of two single-wavelength interferograms by combining the phase-shifting technique with a subtraction procedure; the quantitative phase at one of the two wavelengths can then be obtained from two intensities without the corresponding dc terms by use of the properties of the trigonometric functions. The quantitative phase of the other wavelength can be retrieved from two dc-term-suppressed intensities obtained by employing the two-step phase-shifting technique or a filtering technique in the frequency domain. The proposed method is illustrated with theory, and its effectiveness is demonstrated by simulation experiments on a spherical cap and a HeLa cell.
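For background, the sketch below shows the standard four-step phase-shifting recovery (shifts of 0, π/2, π, 3π/2), phi = atan2(I4 - I2, I1 - I3). The paper's method instead uses positive and negative 2π shifts to separate the two wavelengths, but the same trigonometric machinery underlies the recovery.

```python
# Four-step phase-shifting recovery on synthetic interferograms.
import numpy as np

phi_true = np.linspace(-3.0, 3.0, 512)              # synthetic object phase
a, b = 1.0, 0.8                                     # background, modulation
I = [a + b * np.cos(phi_true + d) for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]

# I4 - I2 = 2b*sin(phi), I1 - I3 = 2b*cos(phi), so atan2 recovers phi.
phi_rec = np.arctan2(I[3] - I[1], I[0] - I[2])
print("max recovery error:", np.max(np.abs(phi_rec - phi_true)))
```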
Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun
2014-09-01
In this study, we demonstrated the feasibility of a semi-quantitative approach for estimating cellular SiO2 nanoparticles (NPs), based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed on HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering intensity and core diameters was proposed, which can be applied for the determination of cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.
Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia
2014-01-01
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730
Wirtz, M A; Strohmer, J
2016-06-01
A wide range of empirical research methods may be adopted to develop and evaluate interventions in rehabilitation research. Qualitative research methods emphasize the relevance of an open research focus and natural proximity to the object of research. Accordingly, qualitative methods offer particular benefits when researchers strive to identify and organize previously unknown information (inductive purpose). Quantitative research methods, by contrast, require a high degree of standardization and transparency in the research process, together with clear definitions of efficacy and effectiveness (deductive purpose). These paradigmatic approaches are characterized by almost opposite key characteristics, application standards, purposes and quality criteria. Hence, specific aspects have to be considered if researchers aim to select or combine these approaches in order to ensure an optimal gain in knowledge. © Georg Thieme Verlag KG Stuttgart · New York.
NASA Astrophysics Data System (ADS)
Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi
2013-02-01
The combination of fluorescent contrast agents with microscopy is a powerful technique to obtain real time images of tissue histology without the need for fixing, sectioning, and staining. The potential of this technology lies in the identification of robust methods for image segmentation and quantitation, particularly in heterogeneous tissues. Our solution is to apply sparse decomposition (SD) to monochrome images of fluorescently-stained microanatomy to segment and quantify distinct tissue types. The clinical utility of our approach is demonstrated by imaging excised margins in a cohort of mice after surgical resection of a sarcoma. Representative images of excised margins were used to optimize the formulation of SD and tune parameters associated with the algorithm. Our results demonstrate that SD is a robust solution that can advance vital fluorescence microscopy as a clinically significant technology.
Flynn, Lorna C; McCulloch, Peter G; Morgan, Lauren J; Robertson, Eleanor R; New, Steve J; Stedman, Francesca E; Martin, Graham P
2016-12-01
To analyze the challenges encountered during surgical quality improvement interventions, and explain the relative success of different intervention strategies. Understanding why and how interventions work is vital for developing improvement science. The S3 Program of studies tested whether combining interventions addressing culture and system was more likely to result in improvement than either approach alone. Quantitative results supported this theory. This qualitative study investigates why this happened, what aspects of the interventions and their implementation most affected improvement, and the implications for similar programs. Semistructured interviews were conducted with hospital staff (23) and research team members (11) involved in S3 studies. Analysis was based on the constant comparative method, with coding conducted concurrently with data collection. Themes were identified and developed in relation to the program theory behind S3. The superior performance of combined intervention over single intervention arms appeared related to greater awareness and ability to act, supporting the S3 hypothesis. However, we also noted unforeseen differences in implementation that seemed to amplify this difference. The greater ambition and more sophisticated approach in combined intervention arms resulted in requests for more intensive expert support, which seemed crucial in their success. The contextual challenges encountered have potential implications for the replicability and sustainability of the approach. Our findings support the S3 hypothesis, triangulating with quantitative results and providing an explanatory account of the causal relationship between interventions and outcomes. They also highlight the importance of implementation strategies, and of factors outside the control of program designers.
Tenti, Lorenzo; Maynau, Daniel; Angeli, Celestino; Calzado, Carmen J
2016-07-21
A new strategy based on orthogonal valence-bond analysis of the wave function combined with intermediate Hamiltonian theory has been applied to the evaluation of the magnetic coupling constants in two antiferromagnetic (AF) systems. This approach provides both a quantitative estimate of the J value and a detailed analysis of the main physical mechanisms controlling the coupling, using a combined perturbative + variational scheme. The procedure requires a selection of the dominant excitations to be treated variationally. Two methods have been employed: a brute-force selection, using a logic similar to that of the CIPSI approach, or entanglement measures, which identify the most strongly interacting orbitals in the system. Once a reduced set of excitations (about 300 determinants) is established, the interaction matrix is dressed at second order of perturbation by the remaining excitations of the CI space. The diagonalization of the dressed matrix provides J values in good agreement with experimental ones, at very low cost. This approach demonstrates the key role of d → d* excitations in the quantitative description of the magnetic coupling, as well as the importance of using an extended active space, including the bridging ligand orbitals, for the binuclear model of the intermediates of multicopper oxidases. The method is a promising tool for dealing with complex systems containing several active centers, as an alternative to both pure variational and DFT approaches.
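In symbols, the dressing step takes the standard intermediate-effective-Hamiltonian form (a sketch with assumed notation, not quoted from the paper): over the selected model space S of dominant determinants, the discarded determinants α of the CI space contribute at second order.

```latex
% Second-order dressed interaction matrix over the selected space S;
% \alpha runs over the determinants of the CI space left outside S.
\tilde{H}_{ij} \;=\; \langle i|\hat{H}|j\rangle
  \;+\; \sum_{\alpha \notin S}
  \frac{\langle i|\hat{H}|\alpha\rangle \, \langle \alpha|\hat{H}|j\rangle}
       {E^{(0)} - E^{(0)}_{\alpha}},
\qquad i, j \in S .
```

Diagonalizing the dressed matrix yields the lowest spin states, from which the coupling constant follows; for two S = 1/2 centers, one common convention is J = E_singlet − E_triplet.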
Modeling Tumor Clonal Evolution for Drug Combinations Design.
Zhao, Boyang; Hemann, Michael T; Lauffenburger, Douglas A
2016-03-01
Cancer is a clonal evolutionary process. This presents challenges for effective therapeutic intervention, given the constant selective pressure towards drug resistance. Mathematical modeling from population genetics, evolutionary dynamics, and engineering perspectives is increasingly being employed to study tumor progression, intratumoral heterogeneity, drug resistance, and rational drug scheduling and combination design. In this review, we discuss the promising opportunities these interdisciplinary approaches hold for advances in cancer biology and treatment. We propose that quantitative modeling perspectives can complement emerging experimental technologies to facilitate enhanced understanding of disease progression and improved capabilities for therapeutic drug regimen design.
Masè, Michela; Cristoforetti, Alessandro; Avogaro, Laura; Tessarolo, Francesco; Piccoli, Federico; Caola, Iole; Pederzolli, Carlo; Graffigna, Angelo; Ravelli, Flavia
2015-01-01
The assessment of collagen structure in cardiac pathology, such as atrial fibrillation (AF), is essential for a complete understanding of the disease. This paper introduces a novel methodology for the quantitative description of collagen network properties, based on the combination of nonlinear optical microscopy with a spectral approach to image processing and analysis. Second-harmonic generation (SHG) microscopy was applied to atrial tissue samples from cardiac surgery patients, providing label-free, selective visualization of the collagen structure. The spectral analysis framework, based on the 2D-FFT, was applied to the SHG images, yielding a multiparametric description of collagen fiber orientation (angle and anisotropy indexes) and texture scale (dominant wavelength and peak dispersion indexes). The proof-of-concept application of the methodology showed the capability of our approach to detect and quantify differences in the structural properties of the collagen network in AF versus sinus rhythm patients. These results suggest the potential of our approach in the assessment of collagen properties in cardiac pathologies related to a fibrotic structural component.
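A minimal sketch of such a 2D-FFT orientation analysis (Python, numpy only); the angle and anisotropy definitions below are generic illustrations, not the paper's exact indexes:

```python
import numpy as np

def collagen_orientation_metrics(img):
    """Dominant fiber angle (degrees) and a simple anisotropy index
    from the 2D power spectrum of an SHG image."""
    spec = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(spec) ** 2
    ny, nx = img.shape
    y, x = np.indices((ny, nx))
    cy, cx = ny // 2, nx // 2
    angle = np.arctan2(y - cy, x - cx) % np.pi          # orientation is pi-periodic
    radius = np.hypot(x - cx, y - cy)
    keep = (radius > 2) & (radius < min(nx, ny) // 2)   # drop dc and corners
    # Angular distribution of spectral power
    edges = np.linspace(0, np.pi, 181)
    hist, _ = np.histogram(angle[keep], bins=edges, weights=power[keep])
    centers = edges[:-1] + np.diff(edges) / 2
    # Resultant-vector anisotropy on doubled angles: 0 = isotropic, 1 = aligned
    c = (hist * np.cos(2 * centers)).sum()
    s = (hist * np.sin(2 * centers)).sum()
    aniso = np.hypot(c, s) / hist.sum()
    # Fibers in image space run perpendicular to the dominant spectral angle
    fiber_angle = (centers[np.argmax(hist)] + np.pi / 2) % np.pi
    return np.degrees(fiber_angle), aniso
```

A radial (rather than angular) marginal of the same power spectrum would analogously yield texture-scale measures such as a dominant wavelength.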
Hansen, Matthew; O’Brien, Kerth; Meckler, Garth; Chang, Anna Marie; Guise, Jeanne-Marie
2016-01-01
Mixed methods research has significant potential to broaden the scope of emergency care and specifically emergency medical services investigation. Mixed methods studies involve the coordinated use of qualitative and quantitative research approaches to gain a fuller understanding of practice. By combining what is learnt from multiple methods, these approaches can help to characterise complex healthcare systems, identify the mechanisms of complex problems such as medical errors and understand aspects of human interaction such as communication, behaviour and team performance. Mixed methods approaches may be particularly useful for out-of-hospital care researchers because care is provided in complex systems where equipment, interpersonal interactions, societal norms, environment and other factors influence patient outcomes. The overall objectives of this paper are to (1) introduce the fundamental concepts and approaches of mixed methods research and (2) describe the interrelation and complementary features of the quantitative and qualitative components of mixed methods studies using specific examples from the Children’s Safety Initiative-Emergency Medical Services (CSI-EMS), a large National Institutes of Health-funded research project conducted in the USA. PMID:26949970
Saitou, Takashi; Imamura, Takeshi
2016-01-01
Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the discovery of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and in various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the development of mathematical techniques for retrieving meaningful information from data produced with Fucci, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation. © 2015 Japanese Society of Developmental Biologists.
Evans, Adam R; Robinson, Renã A S
2013-11-01
Recently, we reported a novel proteomics quantitation scheme termed "combined precursor isotopic labeling and isobaric tagging (cPILOT)" that allows for the identification and quantitation of nitrated peptides in as many as 12-16 samples in a single experiment. cPILOT offers enhanced multiplexing and posttranslational modification specificity; however, it excludes global quantitation of all peptides present in a mixture and, like other isobaric tagging methods, underestimates reporter ion ratios due to precursor co-isolation. Here, we present a novel chemical workflow for cPILOT that can be used for global tagging of all peptides in a mixture. Specifically, through low-pH precursor dimethylation of tryptic or LysC peptides followed by high-pH tandem mass tags, the same reporter ion can be used twice in a single experiment. Also, to improve triple-stage mass spectrometry (MS3) data acquisition, a selective MS3 method that focuses on product selection of the y1 fragment of lysine-terminated peptides is incorporated into the workflow. This novel cPILOT workflow has potential for global peptide quantitation that could lead to enhanced sample multiplexing and increase the number of quantifiable spectra obtained from MS3 acquisition methods. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Roff, Derek A; Fairbairn, Daphne J
2007-01-01
Predicting evolutionary change is the central goal of evolutionary biology because it is the primary means by which we can test evolutionary hypotheses. In this article, we analyze the pattern of evolutionary change in a laboratory population of the wing-dimorphic sand cricket Gryllus firmus resulting from relaxation of selection favoring the migratory (long-winged) morph. Based on a well-characterized trade-off between fecundity and flight capability, we predict that evolution in the laboratory environment should result in a reduction in the proportion of long-winged morphs. We also predict increased fecundity and reduced functionality and weight of the major flight muscles in long-winged females but little change in short-winged (flightless) females. Based on quantitative genetic theory, we predict that the regression equation describing the trade-off between ovary weight and weight of the major flight muscles will show a change in its intercept but not in its slope. Comparisons across generations verify all of these predictions. Further, using values of genetic parameters estimated from previous studies, we show that a quantitative genetic simulation model can account for not only the qualitative changes but also the evolutionary trajectory. These results demonstrate the power of combining quantitative genetic and physiological approaches for understanding the evolution of complex traits.
Kasprowicz, Richard; Rand, Emma; O'Toole, Peter J; Signoret, Nathalie
2018-05-22
Cell-to-cell communication engages signaling and spatiotemporal reorganization events driven by highly context-dependent and dynamic intercellular interactions, which are difficult to capture within heterogeneous primary cell cultures. Here, we present a straightforward correlative imaging approach utilizing commonly available instrumentation to sample large numbers of cell-cell interaction events, allowing qualitative and quantitative characterization of rare functioning cell-conjugates based on calcium signals. We applied this approach to examine a previously uncharacterized immunological synapse, investigating autologous human blood CD4+ T cells and monocyte-derived macrophages (MDMs) forming functional conjugates in vitro. Populations of signaling conjugates were visualized, tracked and analyzed by combining live imaging, calcium recording and multivariate statistical analysis. Correlative immunofluorescence was added to quantify endogenous molecular recruitments at the cell-cell junction. By analyzing a large number of rare conjugates, we were able to define calcium signatures associated with different states of CD4+ T cell-MDM interactions. Quantitative image analysis of immunostained conjugates detected the propensity of endogenous T cell surface markers and intracellular organelles to polarize towards cell-cell junctions with high and sustained calcium signaling profiles, hence defining immunological synapses. Overall, we developed a broadly applicable approach enabling detailed single cell- and population-based investigations of rare cell-cell communication events with primary cells.
NASA Astrophysics Data System (ADS)
Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.
2006-03-01
Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolutions than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of the OCT system, obtained by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed novel approach performs well in speckle noise removal, enhancement, and segmentation of the various cellular layers of the retina using the OCT system.
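One standard way to interpolate between the two equations is complex diffusion (in the spirit of Gilboa et al.; the abstract does not spell out the exact scheme, so this is an illustrative form with assumed notation):

```latex
% I(x,y;t): evolving OCT image; c = r e^{i\theta}: complex diffusion
% coefficient. As \theta \to 0, the real part smooths speckle like linear
% diffusion while the imaginary part acts as a smoothed Laplacian (an edge
% cue usable for layer segmentation); \theta = \pi/2 recovers the free
% Schr\"odinger equation.
\frac{\partial I}{\partial t} \;=\; \nabla \cdot \left( c\, \nabla I \right),
\qquad c = r\, e^{i\theta} .
```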
Size Matters: Penis Size and Sexual Position in Gay Porn Profiles.
Brennan, Joseph
2018-01-01
This article combines qualitative and quantitative textual approaches to the representation of penis size and sexual position of performers in 10 of the most visited gay pornography Web sites currently in operation. Specifically, in excess of 6,900 performer profiles sourced from 10 commercial Web sites are analyzed. Textual analysis of the profile descriptions is combined with a quantitative representation of disclosed penis size and sexual position, which is presented visually by two figures. The figures confirm that these sites generally market themselves as featuring penises that are extraordinarily large and find a sample-wide correlation between smaller penis sizes (5-6.5 inches) and receptive sexual acts (bottoming), and larger (8.5-13 inches) with penetrative acts (topping). These observations are supported through the qualitative textual readings of how the performers are described on these popular sites, revealing the narratives and marketing strategies that shape the construction of popular porn brands, performers, and profitable fantasies.
Craig, Derek; Mazilu, Michael; Dholakia, Kishan
2015-01-01
Raman spectroscopy has proven to be an indispensable technique for the identification of various types of analytes because of the fingerprint vibrational spectrum obtained. Paper microfluidics has also emerged as a low-cost, easy-to-fabricate, and portable approach for point-of-care testing. However, due to inherent background fluorescence, combining Raman spectroscopy with paper microfluidics has to date been an unmet challenge in the absence of surface-enhanced mechanisms. We describe the first use of wavelength modulated Raman spectroscopy (WMRS) for analysis on a paper microfluidics platform. This study demonstrates the ability to suppress the background fluorescence of the paper using WMRS and the subsequent implementation of this technique for pharmaceutical analysis. The results demonstrate that it is possible to discriminate between paracetamol and ibuprofen, whilst also being able to detect the presence of each analyte quantitatively at nanomolar concentrations. PMID:25938464
Revised planetary protection policy for solar system exploration.
DeVincenzi, D L; Stabekis, P D
1984-01-01
In order to control contamination of planets by terrestrial microorganisms and organic constituents, U.S. planetary missions have been governed by a planetary protection (or planetary quarantine) policy which has changed little since 1972. This policy has recently been reviewed in light of new information obtained from planetary exploration during the past decade and because of changes to, or uncertainties in, some parameters used in the existing quantitative approach. On the basis of this analysis, a revised planetary protection policy with the following key features is proposed: deemphasizing the use of mathematical models and quantitative analyses; establishing requirements for target planet/mission type (i.e., orbiter, lander, etc.) combinations; considering sample return missions a separate category; simplifying documentation; and imposing implementing procedures (i.e., trajectory biasing, cleanroom assembly, spacecraft sterilization, etc.) by exception, i.e., only if the planet/mission combination warrants such controls.
ERIC Educational Resources Information Center
Kiliç, Çigdem
2017-01-01
This study examined pre-service primary school teachers' performance in posing problems that require knowledge of problem-solving strategies. Quantitative and qualitative methods were combined. The 120 participants were asked to pose a problem that could be solved by using find-a-pattern, a particular problem-solving strategy. After that,…
ERIC Educational Resources Information Center
Aktas, Elif; Yurt, Serap Uzuner
2017-01-01
The aim of this study was to determine the effect of a learning environment in which digital stories are used as learning material on motivation, academic success, retention, and students' opinions. The study was carried out with a mixed method, a combination of quantitative and qualitative research approaches. The study was implemented…
NASA Astrophysics Data System (ADS)
Clevers, Jan G. P. W.
2018-05-01
This book provides a comprehensive and timely overview of all aspects of hyperspectral remote sensing together with various applications. As such, it is an excellent reference for both students and professionals active in the field of optical remote sensing. It deals with all aspects of retrieving quantitative information on biophysical properties of the Earth's surface, the data corrections needed, and the range of analysis approaches available.
ERIC Educational Resources Information Center
Poder, Kaire; Kerem, Kaie; Lauri, Triin
2013-01-01
We seek out the good institutional features of European choice policies that can enhance both equity and efficiency at the system level. For causality analysis, we construct a typology of 28 European educational systems using fuzzy-set analysis. We combine five independent variables to indicate institutional features of school choice…
Using Image Modelling to Teach Newton's Laws with the Ollie Trick
ERIC Educational Resources Information Center
Dias, Marco Adriano; Carvalho, Paulo Simeão; Vianna, Deise Miranda
2016-01-01
Image modelling is a video-based teaching tool that combines strobe images and video analysis. This tool can enable a qualitative and a quantitative approach to the teaching of physics in a much more engaging and appealing way than traditional expository practice. In a specific scenario shown in this paper, the Ollie trick, we…
patches to cycle from sink to source status and back. Objective: Through a combination of field studies and state-of-the-art quantitative models, we... landscapes with dynamic changes in habitat quality due to management. We also validated our general approach by comparing patterns in our focal species to general, cross-taxa patterns.
Report on Contract W911NF-05-1-0339 (Clarkson University)
2012-11-30
voltammetry and impedance spectroscopy: voltage dependent parameters of a silicon solar cell under controlled illumination and temperature, Energy... voltammetry for quantitative evaluation of temperature and voltage dependent parameters of a silicon solar cell, Solar Energy, (11 2011): 0. doi: 10.1016... characterization of silicon solar cells in the electro-analytical approach: Combined measurements of temperature and voltage dependent electrical
Jonathan M. Cohen; Jean C. Mangun; Mae A. Davenport; Andrew D. Carver
2008-01-01
Diverse public opinions, competing management goals, and polarized interest groups combine with problems of scale to create a complex management arena for managers in the Central Hardwood Forest region. A mixed-methods approach that incorporated quantitative analysis of data from a photo evaluation-attitude scale survey instrument was used to assess attitudes toward...
Guo, Rui; Wang, Yiqin; Yan, Hanxia; Yan, Jianjun; Yuan, Fengyin; Xu, Zhaoxia; Liu, Guoping; Xu, Wenjie
2015-01-01
Objective. This research provides objective and quantitative parameters of traditional Chinese medicine (TCM) pulse conditions for distinguishing between patients with coronary heart disease (CHD) and normal people, using a proposed classification approach based on the Hilbert-Huang transform (HHT) and random forest. Methods. Energy and sample entropy features were extracted by applying the HHT to TCM pulse signals treated as time series. Using the random forest classifier, the two types of extracted features and their combination were each used as input data to establish classification models. Results. Statistical analysis showed significant differences in pulse energy and sample entropy between the CHD group and the normal group. Moreover, when the energy features, sample entropy features, and their combination were input as pulse feature vectors, the corresponding average recognition rates were 84%, 76.35%, and 90.21%, respectively. Conclusion. The proposed approach can be appropriately used to analyze the pulses of patients with CHD, laying a foundation for research on objective and quantitative criteria for disease diagnosis or Zheng differentiation.
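A minimal sketch of this kind of pipeline in Python, assuming the PyEMD package (pip package EMD-signal) for the empirical mode decomposition underlying the HHT and scikit-learn's random forest; the sample-entropy implementation, number of IMFs, and all parameter values are illustrative, not the authors' exact configuration:

```python
import numpy as np
from PyEMD import EMD                      # pip install EMD-signal
from sklearn.ensemble import RandomForestClassifier

def sample_entropy(x, m=2, r_frac=0.2):
    """Plain O(n^2) sample entropy of a 1-D signal (illustrative)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * x.std()
    def matches(length):
        t = np.lib.stride_tricks.sliding_window_view(x, length)
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=-1)
        return ((d <= r).sum() - len(t)) / 2   # pairs, excluding self-matches
    b, a = matches(m), matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else float("nan")

def pulse_features(signal, n_imfs=6):
    """Energy + sample entropy of the first IMFs of a pulse waveform."""
    imfs = EMD().emd(np.asarray(signal, dtype=float))[:n_imfs]
    feats = [float(np.sum(imf ** 2)) for imf in imfs]   # energy features
    feats += [sample_entropy(imf) for imf in imfs]      # entropy features
    feats += [0.0] * (2 * n_imfs - len(feats))          # pad if fewer IMFs
    return feats

# X_raw: list of pulse waveforms; y: 1 = CHD, 0 = normal
# clf = RandomForestClassifier(n_estimators=200).fit(
#           [pulse_features(s) for s in X_raw], y)
```

Concatenating the two feature families mirrors the combined feature vector that gave the highest average recognition rate in the abstract.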
Oisjöen, Fredrik; Schneiderman, Justin F; Astalan, Andrea Prieto; Kalabukhov, Alexey; Johansson, Christer; Winkler, Dag
2010-01-15
We demonstrate a one-step wash-free bioassay measurement system capable of tracking biochemical binding events. Our approach combines the high resolution of frequency-domain and the high speed of time-domain measurements in a single device, together with a fast one-step bioassay. The one-step nature of our magnetic nanoparticle (MNP) based assay reduces the time between sample extraction and quantitative results while mitigating the risks of contamination associated with washing steps. Our method also enables tracking of binding events, offering the possibility of, for example, investigating how chemical/biological environments affect the rate of a binding process or studying the action of certain drugs. We detect specific biological binding events occurring on the surfaces of fluid-suspended MNPs that modify their magnetic relaxation behavior. Herein, we extrapolate a modest sensitivity to analyte of 100 ng/ml with the present setup using our rapid one-step bioassay. More importantly, we determine the size distributions of the MNP systems with theoretical fits to our data obtained from the two complementary measurement modalities and demonstrate quantitative agreement between them. Copyright 2009 Elsevier B.V. All rights reserved.
Validation of two ribosomal RNA removal methods for microbial metatranscriptomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
He, Shaomei; Wurtzel, Omri; Singh, Kanwar
2010-10-01
The predominance of rRNAs in the transcriptome is a major technical challenge in sequence-based analysis of cDNAs from microbial isolates and communities. Several approaches have been applied to deplete rRNAs from (meta)transcriptomes, but no systematic investigation of potential biases introduced by any of these approaches has been reported. Here we validated the effectiveness and fidelity of the two most commonly used approaches, subtractive hybridization and exonuclease digestion, as well as combinations of these treatments, on two synthetic five-microorganism metatranscriptomes using massively parallel sequencing. We found that the effectiveness of rRNA removal was a function of community composition and RNA integrity for these treatments. Subtractive hybridization alone introduced the least bias in relative transcript abundance, whereas exonuclease and in particular combined treatments greatly compromised mRNA abundance fidelity. Illumina sequencing itself also can compromise quantitative data analysis by introducing a G+C bias between runs.
Mwakanyamale, Kisa; Day-Lewis, Frederick D.; Slater, Lee D.
2013-01-01
Fiber-optic distributed temperature sensing (FO-DTS) is increasingly used to map zones of focused groundwater/surface-water exchange (GWSWE). Previous studies of GWSWE using FO-DTS involved identification of zones of focused exchange based on arbitrary cutoffs of FO-DTS time-series statistics (e.g., variance, cross-correlation between temperature and stage, or spectral power). New approaches are needed to extract more quantitative information from large, complex FO-DTS data sets while concurrently providing an assessment of the uncertainty associated with mapping zones of focused GWSWE. Toward this end, we present a strategy combining discriminant analysis (DA) and spectral analysis (SA). We demonstrate the approach using field experimental data from a reach of the Columbia River adjacent to the Hanford 300 Area site. Results of the combined SA/DA approach are shown to be superior to previous results from qualitative interpretation of FO-DTS spectra alone.
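A minimal sketch of combining spectral features with discriminant analysis in this spirit (Python; the band choices, units, and labels are illustrative assumptions, not the study's configuration):

```python
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def spectral_features(temp_series, fs_per_day):
    """Spectral power of one cable location's FO-DTS temperature time
    series in a few bands (diurnal and semi-diurnal, in cycles/day)."""
    f, pxx = welch(temp_series, fs=fs_per_day,
                   nperseg=min(len(temp_series), 1024))
    bands = [(0.8, 1.2), (1.8, 2.2)]          # illustrative band edges
    return [pxx[(f >= lo) & (f < hi)].sum() for lo, hi in bands]

# X: one feature row per cable location; y: 1 = known focused exchange,
# 0 = not (training labels from independent field observations)
# lda = LinearDiscriminantAnalysis().fit(X, y)
# p_focused = lda.predict_proba(X_unlabeled)[:, 1]
```

The LDA posterior probability at each unlabeled location gives a natural handle on the classification uncertainty the abstract calls for.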
Complementary approaches to diagnosing marine diseases: a union of the modern and the classic
Burge, Colleen A.; Friedman, Carolyn S.; Getchell, Rodman G.; House, Marcia; Lafferty, Kevin D.; Mydlarz, Laura D.; Prager, Katherine C.; Sutherland, Kathryn P.; Renault, Tristan; Kiryu, Ikunari; Vega-Thurber, Rebecca
2016-01-01
Linking marine epizootics to a specific aetiology is notoriously difficult. Recent diagnostic successes show that marine disease diagnosis requires both modern, cutting-edge technology (e.g. metagenomics, quantitative real-time PCR) and more classic methods (e.g. transect surveys, histopathology and cell culture). Here, we discuss how this combination of traditional and modern approaches is necessary for rapid and accurate identification of marine diseases, and emphasize how sole reliance on any one technology or technique may lead disease investigations astray. We present diagnostic approaches at different scales, from the macro (environment, community, population and organismal scales) to the micro (tissue, organ, cell and genomic scales). We use disease case studies from a broad range of taxa to illustrate diagnostic successes from combining traditional and modern diagnostic methods. Finally, we recognize the need for increased capacity of centralized databases, networks, data repositories and contingency plans for diagnosis and management of marine disease.
Revenu, Céline; Streichan, Sebastian; Donà, Erika; Lecaudey, Virginie; Hufnagel, Lars; Gilmour, Darren
2014-03-01
The directed migration of cell collectives drives the formation of complex organ systems. A characteristic feature of many migrating collectives is a 'tissue-scale' polarity, whereby 'leader' cells at the edge of the tissue guide trailing 'followers' that become assembled into polarised epithelial tissues en route. Here, we combine quantitative imaging and perturbation approaches to investigate epithelial cell state transitions during collective migration and organogenesis, using the zebrafish lateral line primordium as an in vivo model. A readout of three-dimensional cell polarity, based on centrosomal-nucleus axes, allows the transition from migrating leaders to assembled followers to be quantitatively resolved for the first time in vivo. Using live reporters and a novel fluorescent protein timer approach, we investigate changes in cell-cell adhesion underlying this transition by monitoring cadherin receptor localisation and stability. This reveals that while cadherin 2 is expressed across the entire tissue, functional apical junctions are first assembled in the transition zone and become progressively more stable across the leader-follower axis of the tissue. Perturbation experiments demonstrate that the formation of these apical adherens junctions requires dynamic microtubules. However, once stabilised, adherens junction maintenance is microtubule independent. Combined, these data identify a mechanism for regulating leader-to-follower transitions within migrating collectives, based on the relocation and stabilisation of cadherins, and reveal a key role for dynamic microtubules in this process.
[A novel quantitative approach to study dynamic anaerobic process at micro scale].
Zhang, Zhong-Liang; Wu, Jing; Jiang, Jian-Kai; Jiang, Jie; Li, Huai-Zhi
2012-11-01
Anaerobic digestion is attracting increasing interest because of advantages such as low cost and the recovery of clean energy. In order to overcome the drawbacks of existing methods for studying the dynamic anaerobic process, a novel microscale quantitative approach at the granule level was developed, combining a microdevice with quantitative image analysis techniques. The experiment displayed the process and characteristics of gas production at static state for the first time, and the results indicated that the method has satisfactory repeatability. The gas production process at static state could be divided into three stages: a rapid linear increasing stage, a decelerated increasing stage, and a slow linear increasing stage. The rapid linear increasing stage was long and the biogas rate was high under a high initial organic loading rate. The results showed that it is feasible to carry out the anaerobic process in the microdevice; furthermore, this novel method is reliable and can clearly display the dynamic process of the anaerobic reaction at the micro scale. The results are helpful for understanding the anaerobic process.
Zhang, Kai; Tang, Chaohua; Liang, Xiaowei; Zhao, Qingyu; Zhang, Junmin
2018-01-10
Salbutamol, a selective β2-agonist, endangers the safety of animal products as a result of illegal use in food animals. In this study, an iTRAQ-based untargeted quantitative proteomic approach was applied to screen potential protein biomarkers in plasma of cattle before and after treatment with salbutamol for 21 days. A total of 62 plasma proteins were significantly affected by salbutamol treatment, which can be used as potential biomarkers to screen for the illegal use of salbutamol in beef cattle. Enzyme-linked immunosorbent assay measurements of five selected proteins demonstrated the reliability of iTRAQ-based proteomics in screening of candidate biomarkers among the plasma proteins. The plasma samples collected before and after salbutamol treatment were well-separated by principal component analysis (PCA) using the differentially expressed proteins. These results suggested that an iTRAQ-based untargeted quantitative proteomic strategy combined with PCA pattern recognition methods can discriminate differences in plasma protein profiles collected before and after salbutamol treatment.
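A minimal sketch of the PCA pattern-recognition step on a protein-abundance matrix (Python with scikit-learn; the variable names and the log preprocessing are illustrative assumptions):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

def pca_scores(abundance, n_components=2):
    """abundance: samples x proteins matrix of reporter-ion ratios
    (rows: plasma samples collected before and after treatment)."""
    X = StandardScaler().fit_transform(np.log2(abundance))
    pca = PCA(n_components=n_components)
    scores = pca.fit_transform(X)        # coordinates for a score plot
    return scores, pca.explained_variance_ratio_

# Clear separation of pre- vs post-treatment samples along the first
# components indicates the differential proteins discriminate the groups.
```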
The use of "mixing" procedure of mixed methods in health services research.
Zhang, Wanqing; Creswell, John
2013-08-01
Mixed methods research has emerged alongside qualitative and quantitative approaches as an important tool for health services researchers. Despite growing interest among health services researchers in using mixed methods designs, little has been done to identify the procedural aspects of doing so. To describe how mixed methods researchers mix the qualitative and quantitative aspects of their studies in health services research. We searched PubMed for articles using mixed methods in health services research published between January 1, 2006 and December 30, 2010. We identified and reviewed 30 published health services research articles on studies in which mixed methods had been used. We selected 3 articles as illustrations to help health services researchers conceptualize the type of mixing procedures that they were using. Three main "mixing" procedures have been applied within these studies: (1) the researchers analyzed the 2 types of data at the same time but separately and integrated the results during interpretation; (2) the researchers connected the qualitative and quantitative portions in phases in such a way that 1 approach was built upon the findings of the other approach; and (3) the researchers mixed the 2 data types by embedding the analysis of 1 data type within the other. "Mixing" in mixed methods is more than just the combination of 2 independent components of quantitative and qualitative data. The use of "mixing" procedures in health services research involves the integration, connection, and embedding of these 2 data components.
Bayesian parameter estimation in spectral quantitative photoacoustic tomography
NASA Astrophysics Data System (ADS)
Pulkkinen, Aki; Cox, Ben T.; Arridge, Simon R.; Kaipio, Jari P.; Tarvainen, Tanja
2016-03-01
Photoacoustic tomography (PAT) is an imaging technique combining the strong contrast of optical imaging with the high spatial resolution of ultrasound imaging. These strengths are achieved via the photoacoustic effect, where the spatial absorption of a light pulse is converted into a measurable propagating ultrasound wave. The method is seen as a potential tool for small animal imaging, pre-clinical investigations, the study of blood vessels and vasculature, as well as for cancer imaging. The goal in PAT is to form an image of the absorbed optical energy density field via acoustic inverse problem approaches from the measured ultrasound data. Quantitative PAT (QPAT) proceeds from these images and forms quantitative estimates of the optical properties of the target. This optical inverse problem of QPAT is ill-posed. To alleviate the issue, spectral QPAT (SQPAT) utilizes PAT data formed at multiple optical wavelengths simultaneously with optical parameter models of tissue to form quantitative estimates of the parameters of interest. In this work, the inverse problem of SQPAT is investigated. Light propagation is modelled using the diffusion equation. Optical absorption is described as a chromophore-concentration-weighted sum of known chromophore absorption spectra. Scattering is described by Mie scattering theory with an exponential power law. In the inverse problem, the spatially varying unknown parameters of interest are the chromophore concentrations, the Mie scattering parameters (power-law factor and exponent), and the Grüneisen parameter. The inverse problem is approached with a Bayesian method. It is numerically demonstrated that estimation of all parameters of interest is possible with this approach.
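In symbols, the spectral parametrization described above can be written as follows (a sketch with assumed notation; λ0 is a reference wavelength):

```latex
% Absorption: weighted sum of known chromophore spectra \varepsilon_k,
% with spatially varying concentrations c_k(x).
\mu_a(x,\lambda) \;=\; \sum_{k} c_k(x)\,\varepsilon_k(\lambda),
\qquad
% Reduced scattering: power law with factor a(x) and exponent b(x).
\mu_s'(x,\lambda) \;=\; a(x)\left(\frac{\lambda}{\lambda_0}\right)^{-b(x)} .
```

The Bayesian inverse problem then targets the posterior of the spatial fields c_k, a, b, and the Grüneisen parameter given the multi-wavelength absorbed-energy images.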
Jastrzembski, Jillian A; Bee, Madeleine Y; Sacks, Gavin L
2017-10-25
Ambient ionization mass spectrometric (AI-MS) techniques like direct analysis in real time (DART) offer the potential for rapid quantitative analyses of trace volatiles in food matrices, but performance is generally limited by the lack of preconcentration and extraction steps. The sensitivity and selectivity of AI-MS approaches can be improved through solid-phase microextraction (SPME) with appropriate thin-film geometries, for example, solid-phase mesh-enhanced sorption from headspace (SPMESH). This work improves the SPMESH-DART-MS approach for use in food analyses and validates the approach for trace volatile analysis for two compounds in real samples (grape macerates). SPMESH units prepared with different sorbent coatings were evaluated for their ability to extract a range of odor-active volatiles, with poly(dimethylsiloxane)/divinylbenzene giving the most satisfactory results. In combination with high-resolution mass spectrometry (HRMS), detection limits for SPMESH-DART-MS under 4 ng/L in less than 30 s acquisition times could be achieved for some volatiles [3-isobutyl-2-methoxypyrazine (IBMP) and β-damascenone]. A comparison of SPMESH-DART-MS and SPME-GC-MS quantitation of linalool and IBMP demonstrates excellent agreement between the two methods for real grape samples (r2 ≥ 0.90), although linalool measurements appeared to also include isobaric interference.
NASA Astrophysics Data System (ADS)
Hazenberg, P.; Torfs, P. J. J. F.; Leijnse, H.; Delrieu, G.; Uijlenhoet, R.
2013-09-01
This paper presents a novel approach to estimating the vertical profile of reflectivity (VPR) from volumetric weather radar data using both a traditional Eulerian and a newly proposed Lagrangian implementation. For the latter implementation, the recently developed Rotational Carpenter Square Cluster Algorithm (RoCaSCA) is used to delineate precipitation regions at different reflectivity levels. A piecewise linear VPR is estimated for precipitation classified as either stratiform or as neither stratiform nor convective. As a second aspect of this paper, a novel approach is presented which is able to account for the impact of VPR uncertainty on the estimated radar rainfall variability. Results show that implementation of the VPR identification and correction procedure has a positive impact on quantitative precipitation estimates from radar. Unfortunately, visibility problems severely limit the impact of the Lagrangian implementation beyond distances of 100 km. However, by combining this procedure with the global Eulerian VPR estimation procedure for a given rainfall type (stratiform, and neither stratiform nor convective), the quality of the quantitative precipitation estimates increases up to a distance of 150 km. Analyses of the impact of VPR uncertainty show that this aspect accounts for a large fraction of the differences between weather radar rainfall estimates and rain gauge measurements.
Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M
2016-03-15
Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. Resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically-relevant dosing strategy was identified.
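A minimal sketch of the kind of tumour growth inhibition simulation such a model supports (deliberately simplified to a single turnover equation with two saturable drug effects; the structure, parameter values, and exposure functions are illustrative assumptions, not the published model):

```python
import numpy as np
from scipy.integrate import solve_ivp

def tgi_rhs(t, y, kg, kd, emax_v, ec50_v, emax_a, ec50_a, conc_v, conc_a):
    """dV/dt for tumour volume V: net growth kg minus a kill term driven
    by saturable effects of vemurafenib (v) and abemaciclib (a)."""
    v = y[0]
    cv, ca = conc_v(t), conc_a(t)
    effect = emax_v * cv / (ec50_v + cv) + emax_a * ca / (ec50_a + ca)
    return [kg * v - kd * effect * v]

# Intermittent vemurafenib (3 weeks on / 1 week off), continuous abemaciclib
conc_v = lambda t: 1.0 if (t % 28.0) < 21.0 else 0.0  # arbitrary exposure units
conc_a = lambda t: 1.0
sol = solve_ivp(tgi_rhs, (0.0, 84.0), [100.0], max_step=0.5,
                args=(0.05, 0.12, 1.0, 0.5, 0.8, 0.5, conc_v, conc_a))
# sol.y[0] traces tumour volume; swapping the exposure functions allows
# continuous vs intermittent schedules to be compared in silico.
```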
Characterization of quantum well structures using a photocathode electron microscope
NASA Technical Reports Server (NTRS)
Spencer, Michael G.; Scott, Craig J.
1989-01-01
Present-day integrated circuits pose a challenge to conventional electronic and mechanical test methods. Feature sizes in the submicron and nanometric regime require radical approaches in order to facilitate electrical contact to circuits and devices being tested. In addition, microwave operating frequencies require careful attention to distributed effects when considering the electrical signal paths within and external to the device under test. An alternative testing approach which combines the best of electrical and optical time-domain testing is presented, namely photocathode electron microscope quantitative voltage contrast (PEMQVC).
NASA Technical Reports Server (NTRS)
DeCarvalho, N. V.; Chen, B. Y.; Pinho, S. T.; Baiz, P. M.; Ratcliffe, J. G.; Tay, T. E.
2013-01-01
A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.
Katritzky, Alan R; Fara, Dan C; Yang, Hongfang; Karelson, Mati; Suzuki, Takahiro; Solov'ev, Vitaly P; Varnek, Alexandre
2004-01-01
CODESSA-PRO was used to model binding energies for 1:1 complexation systems between 218 organic guest molecules and beta-cyclodextrin, using a seven-parameter equation with R2 = 0.796 and Rcv2 = 0.779. Fragment-based TRAIL calculations gave a better fit with R2 = 0.943 and Rcv2 = 0.848 for 195 data points in the database. The advantages and disadvantages of each approach are discussed, and it is concluded that a combination of the two approaches has much promise from a practical viewpoint.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burnum-Johnson, Kristin E.; Nie, Song; Casey, Cameron P.
Current proteomics approaches comprise both broad discovery measurements and more quantitative targeted measurements. These two different measurement types are used to initially identify potentially important proteins (e.g., candidate biomarkers) and then enable improved quantification for a limited number of selected proteins. However, both approaches suffer from limitations, particularly the lower sensitivity, accuracy, and quantitation precision of discovery approaches compared to targeted approaches, and the limited proteome coverage provided by targeted approaches. Herein, we describe a new proteomics approach that allows both discovery and targeted monitoring (DTM) in a single analysis using liquid chromatography, ion mobility spectrometry and mass spectrometry (LC-IMS-MS). In DTM, heavy labeled peptides for target ions are spiked into tryptic digests and both the labeled and unlabeled peptides are broadly detected using LC-IMS-MS instrumentation, allowing the benefits of discovery and targeted approaches. To understand the possible improvement of the DTM approach, it was compared to LC-MS broad measurements using an accurate mass and time tag database and selected reaction monitoring (SRM) targeted measurements. The DTM results yielded greater peptide/protein coverage and a significant improvement in the detection of lower abundance species compared to LC-MS discovery measurements. DTM was also observed to have similar detection limits as SRM for the targeted measurements, indicating its potential for combining the discovery and targeted approaches.
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods.
Pommier, Jeanine; Guével, Marie-Renée; Jourdan, Didier
2010-01-28
Schools are key settings for health promotion (HP), but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents the research protocol of a program developed to evaluate HP. After a review of HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework combined with a mixed methods (MM) design is demonstrated. The design is based on a systemic approach to evaluation, taking into account mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined throughout the evaluation framework is detailed. This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community.
Quantification of EEG reactivity in comatose patients.
Hermans, Mathilde C; Westover, M Brandon; van Putten, Michel J A M; Hirsch, Lawrence J; Gaspard, Nicolas
2016-01-01
EEG reactivity is an important predictor of outcome in comatose patients. However, visual analysis of reactivity is prone to subjectivity and may benefit from quantitative approaches. In EEG segments recorded during reactivity testing in 59 comatose patients, 13 quantitative EEG parameters were used to compare the spectral characteristics of 1-minute segments before and after the onset of stimulation (spectral temporal symmetry). Reactivity was quantified with probability values estimated using combinations of these parameters. The accuracy of the probability values as a reactivity classifier was evaluated against the consensus assessment of three expert clinical electroencephalographers using visual analysis. The binary classifier assessing spectral temporal symmetry in four frequency bands (delta, theta, alpha and beta) showed the best accuracy (median AUC: 0.95) and was accompanied by substantial agreement with the individual opinions of experts (Gwet's AC1: 65-70%), at least as good as inter-expert agreement (AC1: 55%). Probability values also reflected the degree of reactivity, as measured by the inter-expert agreement regarding reactivity for each individual case. Automated quantitative EEG approaches based on a probabilistic description of spectral temporal symmetry reliably quantify EEG reactivity. Quantitative EEG may be useful for evaluating reactivity in comatose patients, offering increased objectivity. Copyright © 2015 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
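A minimal sketch of one way to score spectral temporal symmetry around a stimulus (Python with scipy; the band definitions and the symmetry statistic are illustrative, not the study's 13 parameters):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(seg, fs):
    f, pxx = welch(seg, fs=fs, nperseg=int(2 * fs))
    return np.array([pxx[(f >= lo) & (f < hi)].sum() for lo, hi in BANDS.values()])

def spectral_asymmetry(pre, post, fs):
    """Per-band log power ratio between the minute before and the minute
    after stimulation onset; values near 0 mean symmetric spectra (no
    reactivity), large magnitudes suggest a reactive change."""
    return np.log(band_powers(post, fs) / band_powers(pre, fs))

# Feeding these per-band asymmetries into a simple probabilistic classifier
# (e.g., logistic regression trained on expert-labelled segments) yields a
# probability of reactivity per stimulation trial.
```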
Mixed Methods Designs for Sports Medicine Research.
Kay, Melissa C; Kucera, Kristen L
2018-07-01
Mixed methods research is a relatively new approach in the field of sports medicine, in which the benefits of qualitative and quantitative research are combined while each offsets the other's flaws. Despite its known and successful use in other populations, it has been used only minimally in sports medicine, in studies of the clinician perspective, concussion, and patient outcomes. There is therefore a need for this approach to be applied to other topic areas not easily addressed by one type of research approach in isolation, such as retirement from sport, the effects of and return from injury, and catastrophic injury. Copyright © 2018 Elsevier Inc. All rights reserved.
Mobile Phone Sensing of Cocaine in a Lateral Flow Assay Combined with a Biomimetic Material.
Guler, Emine; Yilmaz Sengel, Tulay; Gumus, Z Pinar; Arslan, Mustafa; Coskunol, Hakan; Timur, Suna; Yagci, Yusuf
2017-09-19
Lateral flow assays (LFAs) are an ideal choice for drug abuse testing, favored for their practicability, portability, and rapidity. LFA-based on-site rapid screening devices provide a positive/negative judgment in a short response time. The competitive assay format conventionally applied to small-molecule analytes such as abused drugs restricts the quantitation ability of LFA strips. We report herein, for the first time, a new strategy using a noncompetitive assay format via a biomimetic material, namely poly(p-phenylene) β-cyclodextrin poly(ethylene glycol) (PPP-CD-g-PEG), combined with gold nanoparticle (AuNP) conjugates as the labeling agent to recognize the target cocaine molecule in the test zone. The intensity of the red color visualized in the test line, which reflects the cocaine concentration, was analyzed via a smartphone application. Significantly, the combination of this platform with a smartphone application provides quantitative data on the cocaine amount, making it a very inventive and attractive approach especially for on-site applications at critical points such as traffic stops and the workplace.
Méndez, Lucía; Ciordia, Sergio; Fernández, María Soledad; Juárez, Silvia; Ramos, Antonio; Pazos, Manuel; Gallardo, José M; Torres, Josep Lluís; Nogués, M Rosa; Medina, Isabel
2017-03-01
This study considered the physiological modulation of liver proteins due to supplementation with fish oils under two dietary backgrounds, standard or high in fat and sucrose (HFHS), and their combination with grape polyphenols. Using a quantitative proteomics approach, we showed that the capacity of the supplements to regulate proteins depended on the diet: 10 different proteins changed under the standard diets, while 45 changed under the HFHS diets, and only a few proteins were found altered in common. In both contexts, however, fish oils were the main regulatory force, although the addition of polyphenols was able to modulate some of the fish oils' effects. Moreover, we demonstrated the ability of fish oils and their combination with grape polyphenols, when included in an unhealthy diet, to improve biochemical parameters, reduce lipogenesis and glycolysis enzymes, enhance fatty acid beta-oxidation and insulin signaling, and ameliorate endoplasmic reticulum stress and protein oxidation. Copyright © 2016 Elsevier Inc. All rights reserved.
Social policies related to parenthood and capabilities of Slovenian parents.
Mrčela, Aleksandra Kanjuo; Sadar, Nevenka Černigoj
2011-01-01
We apply Sen's capability approach to evaluate the capabilities of Slovenian parents to reconcile paid work and family in the context of the transition to a market economy. We examine how different levels of capabilities together affect the work–life balance (WLB) of employed parents. We combine both quantitative and qualitative methodological approaches. The results of our quantitative and qualitative research show that increased precariousness of employment and intensification of work create gaps between the legal and normative possibilities for successful reconciliation strategies and actual use of such arrangements in Slovenia. The existing social policies and the acceptance of gender equality in the sphere of paid work enhance capabilities for reconciliation of paid work and parenthood, whereas the intensification of working lives, the dominance of paid work over other parts of life, and the acceptance of gender inequalities in parental and household responsibilities limit parents’ capabilities to achieve WLB.
Davies, Stephen R; Jones, Kai; Goldys, Anna; Alamgir, Mahuiddin; Chan, Benjamin K H; Elgindy, Cecile; Mitchell, Peter S R; Tarrant, Gregory J; Krishnaswami, Maya R; Luo, Yawen; Moawad, Michael; Lawes, Douglas; Hook, James M
2015-04-01
Quantitative NMR spectroscopy (qNMR) has been examined for purity assessment using a range of organic calibration standards of varying structural complexities, certified using the traditional mass balance approach. Demonstrated equivalence between the two independent purity values confirmed the accuracy of qNMR and highlighted the benefit of using both methods in tandem to minimise the potential for hidden bias, thereby conferring greater confidence in the overall purity assessment. A comprehensive approach to purity assessment is detailed, utilising, where appropriate, multiple peaks in the qNMR spectrum, chosen on the basis of scientific reason and statistical analysis. Two examples are presented in which differences between the purity assignment by qNMR and mass balance are addressed in different ways depending on the requirement of the end user, affording fit-for-purpose calibration standards in a cost-effective manner.
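The purity relationship underlying qNMR is standard and worth making explicit. The sketch below implements the generic internal-standard equation, relating signal integrals (I), proton counts (N), molar masses (M) and weighed masses (m) of analyte and calibration standard; the numbers are hypothetical and this is not a reconstruction of the certification protocol described above.

```python
# Generic qNMR purity equation with an internal calibration standard.
def qnmr_purity(I_a, N_a, M_a, m_a, I_c, N_c, M_c, m_c, P_c):
    """Analyte purity from analyte (a) vs. calibrant (c) quantities."""
    return (I_a / I_c) * (N_c / N_a) * (M_a / M_c) * (m_c / m_a) * P_c

# Hypothetical integrals, proton counts, masses and calibrant purity.
print(qnmr_purity(I_a=1.02, N_a=2, M_a=230.26, m_a=10.1,
                  I_c=1.00, N_c=3, M_c=172.18, m_c=9.8, P_c=0.9995))
```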
León, Ileana R.; Schwämmle, Veit; Jensen, Ole N.; Sprenger, Richard R.
2013-01-01
The majority of mass spectrometry-based protein quantification studies uses peptide-centric analytical methods and thus strongly relies on efficient and unbiased protein digestion protocols for sample preparation. We present a novel objective approach to assess protein digestion efficiency using a combination of qualitative and quantitative liquid chromatography-tandem MS methods and statistical data analysis. In contrast to previous studies we employed both standard qualitative as well as data-independent quantitative workflows to systematically assess trypsin digestion efficiency and bias using mitochondrial protein fractions. We evaluated nine trypsin-based digestion protocols, based on standard in-solution or on spin filter-aided digestion, including new optimized protocols. We investigated various reagents for protein solubilization and denaturation (dodecyl sulfate, deoxycholate, urea), several trypsin digestion conditions (buffer, RapiGest, deoxycholate, urea), and two methods for removal of detergents before analysis of peptides (acid precipitation or phase separation with ethyl acetate). Our data-independent quantitative liquid chromatography-tandem MS workflow quantified over 3700 distinct peptides with 96% completeness between all protocols and replicates, with an average 40% protein sequence coverage and an average of 11 peptides identified per protein. Systematic quantitative and statistical analysis of physicochemical parameters demonstrated that deoxycholate-assisted in-solution digestion combined with phase transfer allows for efficient, unbiased generation and recovery of peptides from all protein classes, including membrane proteins. This deoxycholate-assisted protocol was also optimal for spin filter-aided digestions as compared with existing methods. PMID:23792921
Analyzing 7000 texts on deep brain stimulation: what do they tell us?
Ineichen, Christian; Christen, Markus
2015-01-01
The enormous increase in numbers of scientific publications in the last decades requires quantitative methods for obtaining a better understanding of topics and developments in various fields. In this exploratory study, we investigate the emergence, trends, and connections of topics within the whole text corpus of the deep brain stimulation (DBS) literature based on more than 7000 papers (title and abstracts) published between 1991 to 2014 using a network approach. Taking the co-occurrence of basic terms that represent important topics within DBS as starting point, we outline the statistics of interconnections between DBS indications, anatomical targets, positive, and negative effects, as well as methodological, technological, and economic issues. This quantitative approach confirms known trends within the literature (e.g., regarding the emergence of psychiatric indications). The data also reflect an increased discussion about complex issues such as personality connected tightly to the ethical context, as well as an apparent focus on depression as important DBS indication, where the co-occurrence of terms related to negative effects is low both for the indication as well as the related anatomical targets. We also discuss consequences of the analysis from a bioethical perspective, i.e., how such a quantitative analysis could uncover hidden subject matters that have ethical relevance. For example, we find that hardware-related issues in DBS are far more robustly connected to an ethical context compared to impulsivity, concrete side-effects or death/suicide. Our contribution also outlines the methodology of quantitative text analysis that combines statistical approaches with expert knowledge. It thus serves as an example how innovative quantitative tools can be made useful for gaining a better understanding in the field of DBS.
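As an illustration of the co-occurrence starting point described above, the following sketch counts how often curated terms appear together in the same abstract, yielding a weighted co-occurrence graph; the term list and substring-matching rule are simplified assumptions, not the authors' actual pipeline.

```python
# Toy term co-occurrence count over a corpus of abstracts.
from itertools import combinations
from collections import Counter

TERMS = ["depression", "subthalamic nucleus", "impulsivity", "ethics"]  # assumed term list

def cooccurrence(abstracts, terms=TERMS):
    counts = Counter()
    for text in abstracts:
        present = [t for t in terms if t in text.lower()]
        for a, b in combinations(sorted(present), 2):
            counts[(a, b)] += 1          # edge weight = number of shared abstracts
    return counts

docs = ["DBS for depression raises ethics questions ...",
        "Subthalamic nucleus stimulation and impulsivity ..."]
print(cooccurrence(docs))
```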
Östlund, Ulrika; Kidd, Lisa; Wengström, Yvonne; Rowa-Dewar, Neneh
2011-03-01
It has been argued that mixed methods research can be useful in nursing and health science because of the complexity of the phenomena studied. However, the integration of qualitative and quantitative approaches continues to be a topic of much debate, and there is a need for a rigorous framework for designing and interpreting mixed methods research. This paper explores the analytical approaches (i.e. parallel, concurrent or sequential) used in mixed methods studies within healthcare and exemplifies the use of triangulation as a methodological metaphor for drawing inferences from qualitative and quantitative findings originating from such analyses. This review of the literature used systematic principles in searching CINAHL, Medline and PsycINFO for healthcare research studies which employed a mixed methods approach and were published in the English language between January 1999 and September 2009. In total, 168 studies were included in the results. Most studies originated in the United States of America (USA), the United Kingdom (UK) and Canada. The analytic approach most widely used was parallel data analysis. A number of studies used sequential data analysis; far fewer studies employed concurrent data analysis. Very few of these studies clearly articulated the purpose for using a mixed methods design. The use of the methodological metaphor of triangulation on convergent, complementary, and divergent results from mixed methods studies is exemplified and an example of developing theory from such data is provided. A trend for conducting parallel data analysis on quantitative and qualitative data in mixed methods healthcare research has been identified in the studies included in this review. Using triangulation as a methodological metaphor can facilitate the integration of qualitative and quantitative findings and help researchers to clarify their theoretical propositions and the basis of their results. This can offer a better understanding of the links between theory and empirical findings, challenge theoretical assumptions and develop new theory. Copyright © 2010 Elsevier Ltd. All rights reserved.
A Computer Mediated Learning Environment for a Joint and Expeditionary Mindset
2010-08-01
Tashakkori & Teddlie, 1998). In the second part of each interview, the two experts were asked for their opinions on issues related to learner-centered...naturalistic observations (Camic et al., 2003; Denzin & Lincoln, 2003; Tashakkori & Teddlie, 1998). The concurrent development of a grounded theory 9... Tashakkori , A., & Teddlie, C. (1998). Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks, CA: Sage. Wallace
ERIC Educational Resources Information Center
Ranieri, Antonio, Ed.
2013-01-01
This report provides an analysis of the labour market impacts of EU policy interventions designed to support the transition to a job-rich, low-carbon economy. The approach taken is innovative as it combines quantitative (econometric modelling) and qualitative (case study) methods to investigate the expected impact of sustainable energy policies on…
USDA-ARS?s Scientific Manuscript database
Frogeye leaf spot (FLS), caused by Cercospora sojina Hara, is a common disease of soybean. Significant yield losses (10–60%) have been attributed to FLS. We present a novel trapping approach using Vaseline coated slides placed at a 45° angle within a passive, wind-vane spore trap used in combination...
Modeling Tumor Clonal Evolution for Drug Combinations Design
Zhao, Boyang; Hemann, Michael T.; Lauffenburger, Douglas A.
2016-01-01
Cancer is a clonal evolutionary process. This presents challenges for effective therapeutic intervention, given the constant selective pressure towards drug resistance. Mathematical modeling from population genetics, evolutionary dynamics, and engineering perspectives is being increasingly employed to study tumor progression, intratumoral heterogeneity, drug resistance, and rational drug scheduling and combinations design. In this review, we discuss promising opportunities these inter-disciplinary approaches hold for advances in cancer biology and treatment. We propose that quantitative modeling perspectives can complement emerging experimental technologies to facilitate enhanced understanding of disease progression and improved capabilities for therapeutic drug regimen designs. PMID:28435907
Combining landscape-level conservation planning and biodiversity offset programs: a case study.
Underwood, Jared G
2011-01-01
Habitat loss is a major factor in the endangerment and extinction of species around the world. One promising strategy to balance continued habitat loss and biodiversity conservation is that of biodiversity offsets. However, a major concern with offset programs is their consistency with landscape-level conservation goals. While merging offset policies and landscape-level conservation planning is thought to provide advantages over a traditional disconnected approach, few such landscape-level conservation-offset plans have been designed and implemented, so the effectiveness of such a strategy remains uncertain. In this study, we quantitatively assess the conservation impact of combining landscape-level conservation planning and biodiversity offset programs by comparing regions of San Diego County, USA with the combined approach to regions with only an offset program. This comparison is generally very difficult due to a variety of complicating factors. We overcome these complications and quantify the benefits to rare and threatened species of implementing a combined approach by assessing the amount of each species' predicted distribution, and the number of documented locations, conserved in comparison to the same metric for areas with an offset policy alone. We found that adoption of the combined approach has increased conservation for many rare species, often 5-10 times more than in the comparison area, and that conservation has been focused in the areas most important for these species. The level of conservation achieved reduces uncertainty that these species will persist in the region into the future. This San Diego County example demonstrates the potential benefits of combining landscape-level conservation planning and biodiversity offset programs.
Quantitative mass spectrometry methods for pharmaceutical analysis
Loos, Glenn; Van Schepdael, Ann
2016-01-01
Quantitative pharmaceutical analysis is nowadays frequently executed using mass spectrometry. Electrospray ionization coupled to a (hybrid) triple quadrupole mass spectrometer is generally used in combination with solid-phase extraction and liquid chromatography. Furthermore, isotopically labelled standards are often used to correct for ion suppression. The challenges in producing sensitive but reliable quantitative data depend on the instrumentation, sample preparation and hyphenated techniques. In this contribution, different approaches to enhance the ionization efficiencies using modified source geometries and improved ion guidance are provided. Furthermore, possibilities to minimize, assess and correct for matrix interferences caused by co-eluting substances are described. With the focus on pharmaceuticals in the environment and bioanalysis, different separation techniques, trends in liquid chromatography and sample preparation methods to minimize matrix effects and increase sensitivity are discussed. Although highly sensitive methods providing automated multi-residue analysis are generally the aim, (less sensitive) miniaturized set-ups have great potential due to their suitability for in-field usage. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644982
Bandara, Gayan C; Heist, Christopher A; Remcho, Vincent T
2018-02-20
Copper is widely applied in industrial and technological applications and is an essential micronutrient for humans and animals. However, exposure to high environmental levels of copper, especially through drinking water, can lead to copper toxicity, resulting in severe acute and chronic health effects. Therefore, regular monitoring of aqueous copper ions has become necessary as recent anthropogenic activities have led to elevated environmental concentrations of copper. On-site monitoring processes require an inexpensive, simple, and portable analytical approach capable of generating reliable qualitative and quantitative data efficiently. Membrane-based lateral flow microfluidic devices are ideal candidates as they facilitate rapid, inexpensive, and portable measurements. Here we present a simple, chromatographic separation approach in combination with a visual detection method for Cu2+ quantitation, performed in a lateral flow microfluidic channel. This method appreciably minimizes interferences by incorporating a nonspecific polymer inclusion membrane (PIM) based assay with a "dot-counting" approach to quantification. In this study, hydrophobic polycaprolactone (PCL)-filled glass microfiber (GMF) membranes were used as the base substrate onto which the PIM was evenly dispensed as an array of dots. The devices thus prepared were then selectively exposed to oxygen radicals through a mask to generate a hydrophilic surface path along which the sample was wicked. Using this approach, copper concentrations from 1 to 20 ppm were quantified from 5 μL samples using only visual observation of the assay device.
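The dot-counting idea reduces quantification to counting which array positions develop visible colour. A toy sketch of that step follows; the dot intensities, threshold, and calibration from dot counts to ppm are hypothetical.

```python
# "Dot-counting" in miniature: more analyte -> more dots develop visible colour.
import numpy as np

def count_visible_dots(dot_intensities, threshold=0.5):
    """Count array positions whose colour intensity exceeds a visual threshold."""
    return int(np.sum(np.asarray(dot_intensities) > threshold))

calibration = {1: 2, 5: 5, 10: 8, 20: 12}          # ppm -> expected visible dots (hypothetical)
observed = count_visible_dots([0.9, 0.8, 0.7, 0.6, 0.55, 0.2, 0.1])
estimate = min(calibration, key=lambda ppm: abs(calibration[ppm] - observed))
print(f"{observed} dots visible -> ~{estimate} ppm Cu2+")
```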
A traits-based approach for prioritizing species for monitoring and surrogacy selection
Pracheil, Brenda M.; McManamay, Ryan A.; Bevelhimer, Mark S.; ...
2016-11-28
The bar for justifying the use of vertebrate animals for study is being increasingly raised, thus requiring increased rigor for species selection and study design. Although we have power analyses to provide quantitative backing for the numbers of organisms used, quantitative backing for selection of study species is not frequently employed. This can be especially important when measuring the impacts of ecosystem alteration, when study species must be chosen that are both sensitive to the alteration and of sufficient abundance for study. Just as important is providing justification for designation of surrogate species for study, especially when the species of interest is rare or of conservation concern and selection of an appropriate surrogate can have legal implications. In this study, we use a combination of GIS, a fish traits database and multivariate statistical analyses to quantitatively prioritize species for study and to determine potential study surrogate species. We provide two case studies to illustrate our quantitative, traits-based approach for designating study species and surrogate species. In the first case study, we select broadly representative fish species to understand the effects of turbine passage on adult fishes based on traits that suggest sensitivity to turbine passage. In our second case study, we present a framework for selecting a surrogate species for an endangered species. Lastly, we suggest that our traits-based framework can provide quantitative backing and added justification to selection of study species while expanding the inference space of study results.
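A minimal sketch of one way such a traits-based surrogate search can work: embed species in a standardized trait space and take the nearest neighbour of the species of interest. The species names, trait values and Euclidean distance rule below are hypothetical stand-ins for the GIS/traits-database/multivariate workflow described above.

```python
# Hedged sketch: nearest-neighbour surrogate selection in standardized trait space.
import numpy as np

def nearest_surrogate(traits, names, target):
    """traits: (n_species, n_traits) array; returns the closest other species."""
    z = (traits - traits.mean(axis=0)) / traits.std(axis=0)   # standardize each trait
    t = names.index(target)
    d = np.linalg.norm(z - z[t], axis=1)
    d[t] = np.inf                                             # exclude the target itself
    return names[int(np.argmin(d))]

names = ["sturgeon_A", "sturgeon_B", "shiner_C"]              # hypothetical species
traits = np.array([[120.0, 15.0, 1.0],                        # e.g. length, fecundity, benthic flag
                   [110.0, 14.0, 1.0],
                   [9.0, 200.0, 0.0]])
print(nearest_surrogate(traits, names, "sturgeon_A"))
```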
Pucher, Katharina K; Candel, Math J J M; Krumeich, Anja; Boot, Nicole M W M; De Vries, Nanne K
2015-07-05
We report on the longitudinal quantitative and qualitative data resulting from a two-year trajectory (2008-2011) based on the DIagnosis of Sustainable Collaboration (DISC) model. This trajectory aimed to support regional coordinators of comprehensive school health promotion (CSHP) in systematically developing change management and project management to establish intersectoral collaboration. Multilevel analyses of quantitative data on the determinants of collaborations according to the DISC model were done, with 90 respondents (response 57%) at pretest and 69 respondents (52%) at posttest. NVivo analyses of the qualitative data collected during the trajectory included minutes of monthly/bimonthly personal/telephone interviews (N = 65) with regional coordinators, and documents they produced about their activities. Quantitative data showed major improvements in change management and project management. There were also improvements in consensus development, commitment formation, formalization of the CSHP, and alignment of policies, although organizational problems within the collaboration increased. Content analyses of qualitative data identified five main management styles, including (1) facilitating active involvement of relevant parties; (2) informing collaborating parties; (3) controlling and (4) supporting their task accomplishment; and (5) coordinating the collaborative processes. We have contributed to the fundamental understanding of the development of intersectoral collaboration by combining qualitative and quantitative data. Our results support a systematic approach to intersectoral collaboration using the DISC model. They also suggest five main management styles to improve intersectoral collaboration in the initial stage. The outcomes are useful for health professionals involved in similar ventures.
NASA Astrophysics Data System (ADS)
Kudomi, Nobuyuki; Watabe, Hiroshi; Hayashi, Takuya; Iida, Hidehiro
2007-04-01
Cerebral metabolic rate of oxygen (CMRO2), oxygen extraction fraction (OEF) and cerebral blood flow (CBF) images can be quantified using positron emission tomography (PET) by administering 15O-labelled water (H2(15)O) and oxygen (15O2). Conventionally, those images are measured with separate scans for three tracers: C15O for CBV, H2(15)O for CBF and 15O2 for CMRO2, with additional waiting times between the scans to minimize the influence of radioactivity from the previous tracer, which results in a relatively long study period. We have proposed a dual tracer autoradiographic (DARG) approach (Kudomi et al 2005), which enables CBF, OEF and CMRO2 to be measured rapidly by sequentially administering H2(15)O and 15O2 within a short time. Because quantitative CBF and CMRO2 values are sensitive to the arterial input function, it is necessary to obtain an accurate input function, and a drawback of this approach is that it requires separating the measured arterial blood time-activity curve (TAC) into pure water and oxygen input functions in the presence of residual radioactivity from the first injected tracer. For this separation, frequent manual sampling was required. The present paper describes two calculation methods, namely a linear and a model-based method, to separate the measured arterial TAC into its water and oxygen components. In order to validate these methods, we first generated a blood TAC for the DARG approach by combining the water and oxygen input functions obtained in a series of PET studies on normal human subjects. The combined data were then separated into water and oxygen components by the present methods. CBF and CMRO2 were calculated using those separated input functions and the tissue TAC. Errors in the CBF and CMRO2 values obtained by the DARG approach remained within the acceptable range (5%) when the area under the curve of the second tracer's input function was larger than half that of the first. Bias and deviation in those values were also comparable to those of the conventional method when noise was imposed on the arterial TAC. We conclude that the present calculation-based methods could be of use for quantitatively calculating CBF and CMRO2 with the DARG approach.
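The linear separation idea can be illustrated compactly: fit the decaying tail of the first tracer just before the second administration, extrapolate it forward, and subtract to recover the second tracer's input. This is an assumed simplification for illustration, not the published algorithm (which must also handle decay correction and model constraints).

```python
# Hedged sketch: split a dual-tracer arterial curve by tail extrapolation.
import numpy as np

def separate_inputs(t, total, t_second):
    """t, total: arterial time-activity curve; t_second: 2nd-tracer start (s)."""
    tail = (t > t_second - 60) & (t < t_second)           # last minute of 1st tracer alone
    slope, intercept = np.polyfit(t[tail], np.log(total[tail] + 1e-12), 1)
    first = np.where(t >= t_second, np.exp(intercept + slope * t), total)
    second = np.clip(total - first, 0, None)              # residual = 2nd-tracer input
    return first, second

t = np.arange(0.0, 360.0, 1.0)                            # synthetic demo curve
total = np.exp(-t / 120) + np.where(t >= 180, 0.8 * np.exp(-(t - 180) / 120), 0.0)
water, oxygen = separate_inputs(t, total, t_second=180)
print(round(oxygen.max(), 2))                             # ~0.8, recovered 2nd-tracer peak
```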
Chen, Xin; Qin, Shanshan; Chen, Shuai; Li, Jinlong; Li, Lixin; Wang, Zhongling; Wang, Quan; Lin, Jianping; Yang, Cheng; Shui, Wenqing
2015-01-01
In fragment-based lead discovery (FBLD), a cascade combining multiple orthogonal technologies is required for reliable detection and characterization of fragment binding to the target. Given the limitations of the mainstream screening techniques, we presented a ligand-observed mass spectrometry approach to expand the toolkits and increase the flexibility of building a FBLD pipeline especially for tough targets. In this study, this approach was integrated into a FBLD program targeting the HCV RNA polymerase NS5B. Our ligand-observed mass spectrometry analysis resulted in the discovery of 10 hits from a 384-member fragment library through two independent screens of complex cocktails and a follow-up validation assay. Moreover, this MS-based approach enabled quantitative measurement of weak binding affinities of fragments which was in general consistent with SPR analysis. Five out of the ten hits were then successfully translated to X-ray structures of fragment-bound complexes to lay a foundation for structure-based inhibitor design. With distinctive strengths in terms of high capacity and speed, minimal method development, easy sample preparation, low material consumption and quantitative capability, this MS-based assay is anticipated to be a valuable addition to the repertoire of current fragment screening techniques. PMID:25666181
Advantages and limitations of quantitative PCR (Q-PCR)-based approaches in microbial ecology.
Smith, Cindy J; Osborn, A Mark
2009-01-01
Quantitative PCR (Q-PCR or real-time PCR) approaches are now widely applied in microbial ecology to quantify the abundance and expression of taxonomic and functional gene markers within the environment. Q-PCR-based analyses combine 'traditional' end-point detection PCR with fluorescent detection technologies to record the accumulation of amplicons in 'real time' during each cycle of the PCR amplification. By detection of amplicons during the early exponential phase of the PCR, this enables the quantification of gene (or transcript) numbers when these are proportional to the starting template concentration. When Q-PCR is coupled with a preceding reverse transcription reaction, it can be used to quantify gene expression (RT-Q-PCR). This review firstly addresses the theoretical and practical implementation of Q-PCR and RT-Q-PCR protocols in microbial ecology, highlighting key experimental considerations. Secondly, we review the applications of (RT)-Q-PCR analyses in environmental microbiology and evaluate the contribution and advances gained from such approaches. Finally, we conclude by offering future perspectives on the application of (RT)-Q-PCR in furthering understanding in microbial ecology, in particular, when coupled with other molecular approaches and more traditional investigations of environmental systems.
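For readers unfamiliar with the quantification step, the standard-curve arithmetic behind absolute Q-PCR quantification looks like the following sketch; the Cq values are synthetic, and amplification efficiency follows from the slope of the dilution series.

```python
# Standard-curve Q-PCR quantification: Cq is linear in log10(starting copies).
import numpy as np

log10_copies = np.array([2, 3, 4, 5, 6])           # serial dilution standards
cq = np.array([33.1, 29.8, 26.4, 23.0, 19.7])       # hypothetical quantification cycles

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10 ** (-1 / slope) - 1                  # ~1.0 means 100% per cycle

unknown_cq = 25.2
copies = 10 ** ((unknown_cq - intercept) / slope)    # invert the standard curve
print(f"efficiency={efficiency:.2f}, estimated copies={copies:.0f}")
```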
Sacci, Robert L; Black, Jennifer M.; Wisinger, Nina; ...
2015-02-23
The performance characteristics of Li-ion batteries are intrinsically linked to evolving nanoscale interfacial electrochemical reactions. To probe the mechanisms of solid electrolyte interphase formation and Li electrodeposition from a standard battery electrolyte, we use in situ electrochemical scanning transmission electron microscopy for controlled potential sweep-hold electrochemical measurements with simultaneous BF and ADF STEM image acquisition. Through a combined quantitative electrochemical measurement and quantitative STEM imaging approach, based upon electron scattering theory, we show that chemically sensitive ADF STEM imaging can be used to estimate the density of evolving SEI constituents and distinguish contrast mechanisms of Li-bearing components in the liquid cell.
Guidance for using mixed methods design in nursing practice research.
Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia
2016-08-01
The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.
A quantitative approach to neuropsychiatry: The why and the how.
Kas, Martien J; Penninx, Brenda; Sommer, Bernd; Serretti, Alessandro; Arango, Celso; Marston, Hugh
2017-12-12
The current nosology of neuropsychiatric disorders allows for a pragmatic approach to treatment choice, regulation and clinical research. However, without a biological rationale for these disorders, drug development has stagnated. The recently EU-funded PRISM project aims to develop a quantitative biological approach to the understanding and classification of neuropsychiatric diseases to accelerate the discovery and development of better treatments. By combining clinical data sets from major worldwide disease cohorts and by applying innovative technologies to deeply phenotype stratified patient groups, we will define a set of quantifiable biological parameters for social withdrawal and cognitive deficits common to Schizophrenia (SZ), Major Depression (MD), and Alzheimer's Disease (AD). These studies aim to provide new classification and assessment tools for social and cognitive performance across neuropsychiatric disorders, clinically relevant substrates for treatment development, and predictive, preclinical animal systems. With patients and regulatory agencies, we seek to provide clear routes for the future translation and regulatory approval for new treatments and provide solutions to the growing public health challenges of psychiatry and neurology. Copyright © 2017. Published by Elsevier Ltd.
Tucker, George; Loh, Po-Ru; Berger, Bonnie
2013-10-04
Comprehensive protein-protein interaction (PPI) maps are a powerful resource for uncovering the molecular basis of genetic interactions and providing mechanistic insights. Over the past decade, high-throughput experimental techniques have been developed to generate PPI maps at proteome scale, first using yeast two-hybrid approaches and more recently via affinity purification combined with mass spectrometry (AP-MS). Unfortunately, data from both protocols are prone to both high false positive and false negative rates. To address these issues, many methods have been developed to post-process raw PPI data. However, with few exceptions, these methods only analyze binary experimental data (in which each potential interaction tested is deemed either observed or unobserved), neglecting quantitative information available from AP-MS such as spectral counts. We propose a novel method for incorporating quantitative information from AP-MS data into existing PPI inference methods that analyze binary interaction data. Our approach introduces a probabilistic framework that models the statistical noise inherent in observations of co-purifications. Using a sampling-based approach, we model the uncertainty of interactions with low spectral counts by generating an ensemble of possible alternative experimental outcomes. We then apply the existing method of choice to each alternative outcome and aggregate results over the ensemble. We validate our approach on three recent AP-MS data sets and demonstrate performance comparable to or better than state-of-the-art methods. Additionally, we provide an in-depth discussion comparing the theoretical bases of existing approaches and identify common aspects that may be key to their performance. Our sampling framework extends the existing body of work on PPI analysis using binary interaction data to apply to the richer quantitative data now commonly available through AP-MS assays. This framework is quite general, and many enhancements are likely possible. Fruitful future directions may include investigating more sophisticated schemes for converting spectral counts to probabilities and applying the framework to direct protein complex prediction methods.
Spermatozoa quality assessment: a combined holographic and Raman microscopy approach
NASA Astrophysics Data System (ADS)
De Angelis, Annalisa; Ferrara, Maria A.; Di Caprio, Giuseppe; Managò, Stefano; Sirleto, Luigi; Coppola, Giuseppe; De Luca, Anna Chiara
2015-05-01
Semen analysis is widely used as a diagnostic tool for assessing male fertility and for controlling and managing animal reproduction. The most important parameters measured in a semen analysis are morphology and biochemical alterations. For obtaining such information, non-invasive, label-free and non-destructive techniques have to be used. Digital Holography (DH) combined with Raman Spectroscopy (RS) could represent the perfect candidate for a rapid, non-destructive and highly sensitive morphological and biochemical sperm cell analysis. In this study, the combined DH-RS approach is used for a complete analysis of single bovine spermatozoa. High-resolution images of bovine sperm have been obtained by DH microscopy from the reconstruction of a single acquired hologram, highlighting morphological alterations in some cases. Quantitative 3D reconstructions of sperm heads, both normal and anomalous, have been studied, and an unexpected structure of the post-acrosomal region of the head has been detected. Such anomalies have also been confirmed by Raman imaging analysis, suggesting the protein vibrations as an associated Raman marker of the defect.
Understanding and planning ecological restoration of plant-pollinator networks.
Devoto, Mariano; Bailey, Sallie; Craze, Paul; Memmott, Jane
2012-04-01
Theory developed from studying changes in the structure and function of communities during natural or managed succession can guide the restoration of particular communities. We constructed 30 quantitative plant-flower visitor networks along a managed successional gradient to identify the main drivers of change in network structure. We then applied two alternative restoration strategies in silico (restoring for functional complementarity or redundancy) to data from our early successional plots to examine whether different strategies affected the restoration trajectories. Changes in network structure were explained by a combination of age, tree density and variation in tree diameter, even when variance explained by undergrowth structure was accounted for first. A combination of field data, a network approach and numerical simulations helped to identify which species should be given restoration priority in the context of different restoration targets. This combined approach provides a powerful tool for directing management decisions, particularly when management seeks to restore or conserve ecosystem function. © 2012 Blackwell Publishing Ltd/CNRS.
ERIC Educational Resources Information Center
Uzuner Yurt, Serap; Aktas, Elif
2016-01-01
In this study, the effects of the use of peer tutoring in Effective and Good Speech Course on students' success, perception of speech self-efficacy and speaking skills were examined. The study, designed as a mixed pattern in which quantitative and qualitative research approaches were combined, was carried out together with 57 students in 2014 to…
Glycoproteins Enrichment and LC-MS/MS Glycoproteomics in Central Nervous System Applications.
Zhu, Rui; Song, Ehwang; Hussein, Ahmed; Kobeissy, Firas H; Mechref, Yehia
2017-01-01
Proteins and glycoproteins play important biological roles in the central nervous system (CNS). Qualitative and quantitative evaluation of protein and glycoprotein expression in the CNS is critical to reveal the inherent biomolecular mechanisms of CNS diseases. This chapter describes proteomic and glycoproteomic approaches based on liquid chromatography/tandem mass spectrometry (LC-MS or LC-MS/MS) for the qualitative and quantitative assessment of proteins and glycoproteins expressed in the CNS. Proteins and glycoproteins, extracted by a mass spectrometry friendly surfactant from CNS samples, were subjected to enzymatic (tryptic) digestion and three down-stream analyses: (1) a nano LC system coupled with a high-resolution MS instrument to achieve a qualitative proteomic profile, (2) a nano LC system combined with a triple quadrupole MS to quantify identified proteins, and (3) glycoprotein enrichment prior to LC-MS/MS analysis. Enrichment techniques can be applied to improve coverage of low abundant glycopeptides/glycoproteins. An example described in this chapter is hydrophilic interaction liquid chromatographic (HILIC) enrichment to capture glycopeptides, allowing efficient removal of peptides. The combination of three LC-MS/MS-based approaches is capable of the investigation of large-scale proteins and glycoproteins from CNS with an in-depth coverage, thus offering a full view of protein and glycoprotein changes in CNS.
A Routine 'Top-Down' Approach to Analysis of the Human Serum Proteome.
D'Silva, Arlene M; Hyett, Jon A; Coorssen, Jens R
2017-06-06
Serum provides a rich source of potential biomarker proteoforms. One of the major obstacles in analysing serum proteomes is detecting lower abundance proteins owing to the presence of hyper-abundant species (e.g., serum albumin and immunoglobulins). Although depletion methods have been used to address this, these can lead to the concomitant removal of non-targeted protein species, and thus raise issues of specificity, reproducibility, and the capacity for meaningful quantitative analyses. Altering the native stoichiometry of the proteome components may thus yield a more complex series of issues than dealing directly with the inherent complexity of the sample. Hence, here we targeted method refinements so as to ensure optimum resolution of serum proteomes via a top-down two-dimensional gel electrophoresis (2DE) approach that enables the routine assessment of proteoforms and is fully compatible with subsequent mass spectrometric analyses. Testing included various fractionation and non-fractionation approaches. The data show that resolving 500 µg protein on 17 cm pH 3-10 non-linear immobilised pH gradient strips in the first dimension, followed by second-dimension resolution on 7-20% gradient gels with a combination of lithium dodecyl sulfate (LDS) and sodium dodecyl sulfate (SDS) detergents, markedly improves the resolution and detection of proteoforms in serum. In addition, well established third-dimension electrophoretic separations in combination with deep imaging further contributed to the best available resolution, detection, and thus quantitative top-down analysis of serum proteomes.
Cao, Yiping; Sivaganesan, Mano; Kelty, Catherine A; Wang, Dan; Boehm, Alexandria B; Griffith, John F; Weisberg, Stephen B; Shanks, Orin C
2018-01-01
Human fecal pollution of recreational waters remains a public health concern worldwide. As a result, there is a growing interest in the application of human-associated fecal source identification quantitative real-time PCR (qPCR) technologies for water quality research and management. However, there are currently no standardized approaches for field implementation and interpretation of qPCR data. In this study, a standardized HF183/BacR287 qPCR method was combined with a water sampling strategy and a novel Bayesian weighted average approach to establish a human fecal contamination score (HFS) that can be used to prioritize sampling sites for remediation based on measured human waste levels. The HFS was then used to investigate 975 study design scenarios utilizing different combinations of sites with varying sampling intensities (daily to once per week) and number of qPCR replicates per sample (2-14 replicates). Findings demonstrate that site prioritization with HFS is feasible and that both sampling intensity and number of qPCR replicates influence reliability of HFS estimates. The novel data analysis strategy presented here provides a prescribed approach for the implementation and interpretation of human-associated HF183/BacR287 qPCR data with the goal of site prioritization based on human fecal pollution levels. In addition, information is provided for future users to customize study designs for optimal HFS performance. Published by Elsevier Ltd.
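A loosely hedged sketch of a site-level score in this spirit appears below, weighting each sampling event by the precision of its qPCR replicates; the published Bayesian weighted average model is richer than this inverse-variance shortcut, and all concentrations are hypothetical.

```python
# Toy site prioritization score from replicated qPCR measurements.
import numpy as np

def site_score(replicate_sets):
    """Precision-weighted mean of per-event log10 concentrations."""
    means = np.array([np.mean(r) for r in replicate_sets])
    weights = np.array([1.0 / (np.var(r, ddof=1) + 1e-6) for r in replicate_sets])
    return float(np.average(means, weights=weights))

site_a = [[2.1, 2.3, 2.2], [1.8, 2.0, 1.9]]        # log10 copies/100 mL, hypothetical
site_b = [[3.4, 3.6, 3.5], [3.2, 3.1, 3.3]]
scores = {"A": site_score(site_a), "B": site_score(site_b)}
print(sorted(scores.items(), key=lambda kv: -kv[1]))   # highest score = remediate first
```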
Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi
The aim of this study was to investigate whether the use of a combination of quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of the quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were found with ROC analysis. Combined ROC curves were computed for combinations of quantitative metrics and qualitative signs to determine diagnostic accuracy; sensitivity, specificity, and positive and negative predictive values were found, and the best model combination was formed. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH when compared with the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). For qualitative signs, the highest individual distinctive power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. For quantitative metrics, the highest individual distinctive power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by the combination of dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided with a combination of quantitative metrics and qualitative signs.
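One common way to evaluate such a combined classifier is to fit a logistic model on the quantitative metrics plus binary signs and report the ROC AUC. The sketch below does this on synthetic data shaped like the study (34 patients, 34 controls); it is not the authors' analysis, and all values are simulated.

```python
# Hedged sketch: AUC of a combined metric + sign classifier on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 68                                                  # 34 IH patients + 34 controls
mpd = np.r_[rng.normal(3.5, 1.0, n // 2),               # lower MPD in IH (simulated)
            rng.normal(6.5, 1.0, n // 2)]
dural = np.r_[rng.random(n // 2) < 0.85, rng.random(n // 2) < 0.10]
venous = np.r_[rng.random(n // 2) < 0.70, rng.random(n // 2) < 0.15]

X = np.column_stack([mpd, dural, venous])
y = np.r_[np.ones(n // 2), np.zeros(n // 2)]            # 1 = IH, 0 = control

model = LogisticRegression().fit(X, y)
print("combined AUC:", roc_auc_score(y, model.predict_proba(X)[:, 1]))
```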
Bakri, Barbara; Weimer, Marco; Hauck, Gerrit; Reich, Gabriele
2015-11-01
The scope of the study was (1) to develop a lean quantitative calibration for real-time near-infrared (NIR) blend monitoring, which meets the requirements of early development of pharmaceutical products, and (2) to compare the prediction performance of this approach with the results obtained from stratified sampling using a sample thief in combination with off-line high pressure liquid chromatography (HPLC) and at-line near-infrared chemical imaging (NIRCI). Tablets were manufactured from powder blends and analyzed with NIRCI and HPLC to verify the real-time results. The model formulation contained 25% w/w naproxen as a cohesive active pharmaceutical ingredient (API), microcrystalline cellulose and croscarmellose sodium as cohesive excipients, and free-flowing mannitol. Five in-line NIR calibration approaches, all using the spectra from the end of the blending process as reference for PLS modeling, were compared in terms of selectivity, precision, prediction accuracy and robustness. High selectivity could be achieved with a "reduced", i.e. API- and time-saving, approach (35% reduction of the API amount) based on six concentration levels of the API, with three levels realized by three independent powder blends and the additional levels obtained by simply increasing the API concentration in these blends. Accuracy and robustness were further improved by combining this calibration set with a second independent data set comprising different excipient concentrations and reflecting different environmental conditions. The combined calibration model was used to monitor the blending process of independent batches. For this model formulation the target concentration of the API could be achieved within 3 min, indicating a short blending time. The in-line NIR approach was verified by stratified sampling HPLC and NIRCI results. All three methods revealed comparable results regarding blend end point determination. Differences in both mean API concentration and RSD values could be attributed to differences in effective sample size and thief sampling errors. This conclusion was supported by HPLC and NIRCI analysis of tablets manufactured from powder blends after different blending times. In summary, the study clearly demonstrates the ability to develop efficient and robust quantitative calibrations for real-time NIR powder blend monitoring with a reduced set of powder blends while avoiding any bias caused by physical sampling. Copyright © 2015 Elsevier B.V. All rights reserved.
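The core calibration step, stripped of the study's reference-spectrum strategy and preprocessing, is an ordinary PLS regression from spectra to API content. A synthetic-data sketch (the spectral response model, noise level and component count are assumptions):

```python
# Generic PLS calibration for quantitative NIR blend monitoring (toy data).
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
n_samples, n_wavelengths = 60, 200
api = rng.uniform(15, 35, n_samples)                    # % w/w API levels
spectra = np.outer(api, rng.random(n_wavelengths))      # toy linear spectral response
spectra += 0.05 * rng.standard_normal(spectra.shape)    # measurement noise

pls = PLSRegression(n_components=3).fit(spectra, api)
print("predicted % w/w:", pls.predict(spectra[:3]).ravel())
```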
Combining Radiography and Passive Measurements for Radiological Threat Detection in Cargo
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; White, Timothy A.; Jarman, Kenneth D.
Radiography is widely understood to provide information complementary to passive detection: while not directly sensitive to radiological materials, radiography can reveal highly shielded regions which may mask a passive radiological signal. We present a method for combining radiographic and passive data which uses the radiograph to provide an estimate of scatter and attenuation for possible sources. This approach allows quantitative use of radiographic images without relying on image interpretation, and results in a probabilistic description of likely source locations and strengths. We present first results for this method for a simple modeled test case of a cargo container driving through a PVT portal. With this inversion approach, we address criteria for an integrated passive and radiographic screening system and how detection of SNM threats might be improved in such a system.
NASA Astrophysics Data System (ADS)
Radun, Jenni E.; Virtanen, Toni; Olives, Jean-Luc; Vaahteranoksa, Mikko; Vuori, Tero; Nyman, Göte
2007-01-01
We present an effective method for comparing the subjective audiovisual quality of different video cameras and the features related to quality changes. Both quantitative estimation of overall quality and qualitative description of critical quality features are achieved by the method. The aim was to combine two image quality evaluation methods, the quantitative Absolute Category Rating (ACR) method with hidden reference removal and the qualitative Interpretation-Based Quality (IBQ) method, in order to see how they complement each other in audiovisual quality estimation tasks. 26 observers estimated the audiovisual quality of six different cameras, mainly mobile phone video cameras. In order to achieve an efficient subjective estimation of audiovisual quality, only two contents with different quality requirements were recorded with each camera. The results show that the subjectively important quality features were more related to the overall estimations of the cameras' visual video quality than to features related to sound. The data demonstrated two significant quality dimensions related to visual quality: darkness and sharpness. We conclude that the qualitative methodology can complement quantitative quality estimations with audiovisual material as well. The IBQ approach is especially valuable when the induced quality changes are multidimensional.
Evaluation of health promotion in schools: a realistic evaluation approach using mixed methods
2010-01-01
Background: Schools are key settings for health promotion (HP) but the development of suitable approaches for evaluating HP in schools is still a major topic of discussion. This article presents a research protocol of a program developed to evaluate HP. After reviewing HP evaluation issues, the various possible approaches are analyzed and the importance of a realistic evaluation framework and a mixed methods (MM) design are demonstrated. Methods/Design: The design is based on a systemic approach to evaluation, taking into account the mechanisms, context and outcomes, as defined in realistic evaluation, adjusted to our own French context using an MM approach. The characteristics of the design are illustrated through the evaluation of a nationwide HP program in French primary schools designed to enhance children's social, emotional and physical health by improving teachers' HP practices and promoting a healthy school environment. An embedded MM design is used in which a qualitative data set plays a supportive, secondary role in a study based primarily on a different quantitative data set. The way the qualitative and quantitative approaches are combined through the entire evaluation framework is detailed. Discussion: This study is a contribution towards the development of suitable approaches for evaluating HP programs in schools. The systemic approach of the evaluation carried out in this research is appropriate since it takes account of the limitations of traditional evaluation approaches and considers suggestions made by the HP research community. PMID:20109202
Tate, Sonya C; Burke, Teresa F; Hartman, Daisy; Kulanthaivel, Palaniappan; Beckmann, Richard P; Cronier, Damien M
2016-01-01
Background: Resistance to BRAF inhibition is a major cause of treatment failure for BRAF-mutated metastatic melanoma patients. Abemaciclib, a cyclin-dependent kinase 4 and 6 inhibitor, overcomes this resistance in xenograft tumours and offers a promising drug combination. The present work aims to characterise the quantitative pharmacology of the abemaciclib/vemurafenib combination using a semimechanistic pharmacokinetic/pharmacodynamic modelling approach and to identify an optimum dosing regimen for potential clinical evaluation. Methods: A PK/biomarker model was developed to connect abemaciclib/vemurafenib concentrations to changes in MAPK and cell cycle pathway biomarkers in A375 BRAF-mutated melanoma xenografts. Resultant tumour growth inhibition was described by relating (i) MAPK pathway inhibition to apoptosis, (ii) mitotic cell density to tumour growth and, under resistant conditions, (iii) retinoblastoma protein inhibition to cell survival. Results: The model successfully described vemurafenib/abemaciclib-mediated changes in MAPK pathway and cell cycle biomarkers. Initial tumour shrinkage by vemurafenib, acquisition of resistance and subsequent abemaciclib-mediated efficacy were successfully captured and externally validated. Model simulations illustrate the benefit of intermittent vemurafenib therapy over continuous treatment, and indicate that continuous abemaciclib in combination with intermittent vemurafenib offers the potential for considerable tumour regression. Conclusions: The quantitative pharmacology of the abemaciclib/vemurafenib combination was successfully characterised and an optimised, clinically-relevant dosing strategy was identified. PMID:26978007
Mapping surface charge density of lipid bilayers by quantitative surface conductivity microscopy
Klausen, Lasse Hyldgaard; Fuhs, Thomas; Dong, Mingdong
2016-01-01
Local surface charge density of lipid membranes influences membrane–protein interactions leading to distinct functions in all living cells, and it is a vital parameter in understanding membrane-binding mechanisms, liposome design and drug delivery. Despite the significance, no method has so far been capable of mapping surface charge densities under physiologically relevant conditions. Here, we use a scanning nanopipette setup (scanning ion-conductance microscope) combined with a novel algorithm to investigate the surface conductivity near supported lipid bilayers, and we present a new approach, quantitative surface conductivity microscopy (QSCM), capable of mapping surface charge density with high-quantitative precision and nanoscale resolution. The method is validated through an extensive theoretical analysis of the ionic current at the nanopipette tip, and we demonstrate the capacity of QSCM by mapping the surface charge density of model cationic, anionic and zwitterionic lipids with results accurately matching theoretical values. PMID:27561322
Multidisciplinary, interdisciplinary, or dysfunctional? Team working in mixed-methods research.
O'Cathain, Alicia; Murphy, Elizabeth; Nicholl, Jon
2008-11-01
Combining qualitative and quantitative methods in a single study-otherwise known as mixed-methods research-is common. In health research these projects can be delivered by research teams. A typical scenario, for example, involves medical sociologists delivering qualitative components and researchers from medicine or health economics delivering quantitative components. We undertook semistructured interviews with 20 researchers who had worked on mixed-methods studies in health services research to explore the facilitators of and barriers to exploiting the potential of this approach. Team working emerged as a key issue, with three models of team working apparent: multidisciplinary, interdisciplinary, and dysfunctional. Interdisciplinary research was associated with integration of data or findings from the qualitative and quantitative components in both the final reports and the peer-reviewed publications. Methodological respect between team members and a principal investigator who valued integration emerged as essential to achieving integrated research outcomes.
von Gunten, Lucien; D'Andrea, William J.; Bradley, Raymond S.; Huang, Yongsong
2012-01-01
High-resolution paleoclimate reconstructions are often restricted by the difficulties of sampling geologic archives in great detail and the analytical costs of processing large numbers of samples. Using sediments from Lake Braya Sø, Greenland, we introduce a new method that provides a quantitative high-resolution paleoclimate record by combining measurements of the alkenone unsaturation index with non-destructive scanning reflectance spectroscopic measurements in the visible range (VIS-RS). The proxy-to-proxy (PTP) method exploits two distinct calibrations: the in situ calibration of the alkenone unsaturation index to lake water temperature and the calibration of scanning VIS-RS data to down-core alkenone data. Using this approach, we produced a quantitative temperature record that is longer and has 5 times higher sampling resolution than the original time series, thereby allowing detection of temperature variability in frequency bands characteristic of the AMO over the past 7,000 years. PMID:22934132
Combined linkage and association mapping of flowering time in Sunflower (Helianthus annuus L.).
Cadic, Elena; Coque, Marie; Vear, Felicity; Grezes-Besset, Bruno; Pauquet, Jerôme; Piquemal, Joël; Lippi, Yannick; Blanchard, Philippe; Romestant, Michel; Pouilly, Nicolas; Rengel, David; Gouzy, Jerôme; Langlade, Nicolas; Mangin, Brigitte; Vincourt, Patrick
2013-05-01
Association mapping and linkage mapping were used to identify quantitative trait loci (QTL) and/or causative mutations involved in the control of flowering time in cultivated sunflower Helianthus annuus. A panel of 384 inbred lines was phenotyped through testcrosses with two tester inbred lines across 15 location × year combinations. A recombinant inbred line (RIL) population comprising 273 lines was phenotyped both per se and through testcrosses with one or two testers in 16 location × year combinations. In the association mapping approach, kinship estimation using 5,923 single nucleotide polymorphisms was found to be the best covariate to correct for effects of panel structure. Linkage disequilibrium decay ranged from 0.08 to 0.26 cM for a threshold of 0.20, after correcting for structure effects, depending on the linkage group (LG) and the ancestry of inbred lines. A possible hitchhiking effect is hypothesized for LG10 and LG08. A total of 11 regions across 10 LGs were found to be associated with flowering time, and QTLs were mapped on 11 LGs in the RIL population. Whereas eight regions were demonstrated to be common between the two approaches, the linkage disequilibrium approach did not detect a documented QTL that was confirmed using the linkage mapping approach.
Thermodynamics of Hydrophobic Amino Acids in Solution: A Combined Experimental–Computational Study
Song, Lingshuang; Yang, Lin; Meng, Jie; ...
2016-12-29
Here, we present a joint experimental-computational study to quantitatively describe the thermodynamics of hydrophobic leucine amino acids in aqueous solution. X-ray scattering data were acquired at a series of solute and salt concentrations to effectively measure inter-leucine interactions, indicating that a major scattering peak is observed consistently at q = 0.83 Å⁻¹. Atomistic molecular dynamics simulations were then performed and compared with the scattering data, achieving high consistency at both small and wider scattering angles (q = 0–1.5 Å⁻¹). This experimental-computational consistency not only enables a first glimpse of the leucine-leucine interaction landscape, in which two leucine molecules align mostly in a parallel rather than anti-parallel fashion, but also allows us to derive effective leucine-leucine interactions in solution. Collectively, this combined approach of employing experimental scattering and molecular simulation enables a quantitative characterization of effective inter-molecular interactions of hydrophobic amino acids, critical for protein function and dynamics such as protein folding.
Mukumbang, Ferdinand C; Alindekane, Leka Marcel
2017-04-01
The aim of this study was to explore the teacher identity formation dynamics of student nurse-educators with respect to subject matter, pedagogy and didactics. A case study using a descriptive quantitative design was employed. Using a cross-sectional approach, data were collected in 2014 using a self-administered questionnaire. Participants were asked to self-evaluate their teaching competencies on the nursing subject matter, pedagogical expertise and didactical expertise. Using descriptive analysis, we determined the central tendencies of the constructs. The descriptive analysis revealed a very small variance (0.0011) and standard deviation (0.04) among the means of the three constructs, which indicates a fair balance in the contribution of subject matter, pedagogy and didactics towards teacher identity formation. Nursing student-educators can thus achieve a balanced combination of subject-matter, pedagogical and didactical expertise during the formation of their teacher identity. This could be indicative of how effective the training programme is in helping the students achieve a balanced teacher identity.
Emory, Hamlin; Wells, Christopher; Mizrahi, Neptune
2015-07-01
Two adolescent females with absence epilepsy were classified, one as having attention deficit disorder and the other as having bipolar disorder. Physical and cognitive exams identified hypotension, bradycardia, and cognitive dysfunction. Their initial electroencephalograms (EEGs) were considered slightly slow, but within normal limits. Quantitative EEG (QEEG) data included relative theta excess and low alpha mean frequencies. A combined treatment of antiepileptic drugs with a catecholamine agonist/reuptake inhibitor was used sequentially. Both patients' physical and cognitive functions improved and they have remained seizure free. The clinical outcomes were correlated with statistically significant changes in QEEG measures toward normal Z-scores in both anterior and posterior regions. In addition, low resolution electromagnetic tomography (LORETA) Z-scored source correlation analyses of the initial and treated QEEG data showed normalized patterns, supporting a neuroanatomic resolution. This study presents preliminary evidence for a neurophysiologic approach to patients with absence epilepsy and comorbid disorders and may provide a method for further research. © EEG and Clinical Neuroscience Society (ECNS) 2014.
Que, Emily L.; Bleher, Reiner; Duncan, Francesca E.; Kong, Betty Y.; Gleber, Sophie C.; Vogt, Stefan; Chen, Si; Garwin, Seth A.; Bayer, Amanda R.; Dravid, Vinayak; Woodruff, Teresa K.; O’Halloran, Thomas V.
2015-01-01
Fertilization of a mammalian egg induces a series of ‘zinc sparks’ that are necessary for inducing the egg-to-embryo transition. Despite the importance of these zinc efflux events, little is known about their origin. To understand the molecular mechanism of the zinc spark we combined four physical approaches to resolve zinc distributions in single cells: a chemical probe for dynamic live-cell fluorescence imaging and a combination of scanning transmission electron microscopy with energy dispersive spectroscopy, X-ray fluorescence microscopy, and 3D elemental tomography for high resolution elemental mapping. We show that the zinc spark arises from a system of thousands of zinc-loaded vesicles, each of which contains, on average, 10(6) zinc atoms. These vesicles undergo dynamic movement during oocyte maturation and exocytosis at the time of fertilization. The discovery of these vesicles and the demonstration that zinc sparks originate from them provides a quantitative framework for understanding how zinc fluxes regulate cellular processes. PMID:25615666
Hempel, Kristina; Herbst, Florian-Alexander; Moche, Martin; Hecker, Michael; Becher, Dörte
2011-04-01
Staphylococcus aureus is capable of colonizing and infecting humans by means of its arsenal of surface-exposed and secreted proteins. Iron-limited conditions in mammalian body fluids serve as a major environmental signal to bacteria to express virulence determinants. Here we present a comprehensive, gel-free, GeLC-MS/MS-based quantitative proteome profiling of S. aureus under this infection-relevant condition. 14N/15N metabolic labeling and three complementary approaches were combined for relative quantitative analyses of surface-associated proteins. The surface-exposed and secreted proteome profiling approaches comprise trypsin shaving, biotinylation, and precipitation of the supernatant. By analysis of the outer subproteomic and cytoplasmic protein fractions, 1210 proteins could be identified, including 221 surface-associated proteins. Thus, access was enabled to 70% of the predicted cell wall-associated proteins, 80% of the predicted sortase substrates, two-thirds of lipoproteins and more than 50% of secreted and cytoplasmic proteins. Under iron deficiency, 158 surface-associated proteins were quantified. Twenty-nine proteins were found in altered amounts, with surface-exposed proteins in particular strongly induced, such as the iron-regulated surface determinant proteins IsdA, IsdB, IsdC and IsdD as well as lipid-anchored iron compound-binding proteins. The work provides a crucial basis for understanding S. aureus pathophysiology through methods that allow quantitative surface proteome profiling.
Insights from quantitative metaproteomics and protein-stable isotope probing into microbial ecology.
von Bergen, Martin; Jehmlich, Nico; Taubert, Martin; Vogt, Carsten; Bastida, Felipe; Herbst, Florian-Alexander; Schmidt, Frank; Richnow, Hans-Hermann; Seifert, Jana
2013-10-01
The recent development of metaproteomics has enabled the direct identification and quantification of expressed proteins from microbial communities in situ, without the need for microbial enrichment. This became possible by (1) significant increases in quality and quantity of metagenome data and by improvements in (2) accuracy and (3) sensitivity of modern mass spectrometers (MS). The identification of physiologically relevant enzymes can help to understand the role of specific species within a community or an ecological niche. Besides identification, relative and absolute quantitation is also crucial. We review label-free and label-based methods of quantitation in MS-based proteome analysis and the contribution of quantitative proteome data to microbial ecology. Additionally, approaches of protein-based stable isotope probing (protein-SIP) for deciphering community structures are reviewed. Information on the species-specific metabolic activity can be obtained when substrates or nutrients are labeled with stable isotopes in a protein-SIP approach. The stable isotopes (13C, 15N, 36S) are incorporated into proteins, and the rate of incorporation can be used for assessing the metabolic activity of the corresponding species. We focus on the relevance of the metabolic and phylogenetic information retrieved with protein-SIP studies and on detecting and quantifying the carbon flux within microbial consortia. Furthermore, the combination of protein-SIP with established tools in microbial ecology, such as other stable isotope probing techniques, is discussed.
Glass, Samuel; Gajwani, Ruchika; Turner-Halliday, Fiona
Background and Aims. Research on child maltreatment has largely overlooked the under-five age group and focuses primarily on quantitative measurement. This mixed-methods study of maltreated children (N = 92) entering care (age 6-60 months) combines a quantitative focus on the associations between care journey characteristics and mental health outcomes with a qualitative exploration of maltreatment in four different families. Methods. Care journey data was obtained from social care records; mental health and attachment assessments were carried out following entry to care; qualitative data comprised semistructured interviews with professionals, foster carers, and parents. Results. Significant associations were found between suspected sexual abuse and increased DAI inhibited attachment symptoms (p = 0.001) and between reported domestic violence and decreased DAI inhibited (p = 0.016) and disinhibited (p = 0.004) attachment symptoms. Two qualitative themes demonstrate the complexity of assessing maltreatment: (1) overlapping maltreatment factors occur in most cases and (2) maltreatment effects may be particularly challenging to isolate. Conclusions. Qualitative exploration has underscored the complexity of assessing maltreatment, indicating why expected associations were not found in this study and posing questions for the quantitative measurement of maltreatment in general. We therefore suggest a new categorisation of maltreatment and call for the complementary research lenses of further mixed-methods approaches.
Ueda, Jun; Yoshimura, Hajime; Shimizu, Keiji; Hino, Megumu; Kohara, Nobuo
2017-07-01
Visual and semi-quantitative assessments of 123I-FP-CIT single-photon emission computed tomography (SPECT) are useful for the diagnosis of dopaminergic neurodegenerative diseases (dNDD), including Parkinson's disease, dementia with Lewy bodies, progressive supranuclear palsy, multiple system atrophy, and corticobasal degeneration. However, the diagnostic value of combined visual and semi-quantitative assessment in dNDD remains unclear. Among 239 consecutive patients with a newly diagnosed possible parkinsonian syndrome who underwent 123I-FP-CIT SPECT in our medical center, 114 patients with a disease duration less than 7 years were diagnosed as dNDD with the established criteria or as non-dNDD according to clinical judgment. We retrospectively examined their clinical characteristics and visual and semi-quantitative assessments of 123I-FP-CIT SPECT. The striatal binding ratio (SBR) was used as a semi-quantitative measure of 123I-FP-CIT SPECT. We calculated the sensitivity and specificity of visual assessment alone, semi-quantitative assessment alone, and combined visual and semi-quantitative assessment for the diagnosis of dNDD. SBR was correlated with visual assessment. Some dNDD patients with a normal visual assessment had an abnormal SBR, and vice versa. There was no statistically significant difference between the sensitivity of diagnosis with visual assessment alone and semi-quantitative assessment alone (91.2 vs. 86.8%, respectively, p = 0.29). Combined visual and semi-quantitative assessment demonstrated superior sensitivity (96.7%) to visual assessment (p = 0.03) or semi-quantitative assessment (p = 0.003) alone with equal specificity. Visual and semi-quantitative assessments of 123I-FP-CIT SPECT are helpful for the diagnosis of dNDD, and combined visual and semi-quantitative assessment shows superior sensitivity with equal specificity.
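The combined-assessment arithmetic can be made concrete with a small sketch. The labels below (truth, visual, sbr_low) are invented toy data, not the study cohort; note that a naive either-positive rule generally trades away some specificity, whereas the study reports that its combined read kept specificity equal.

```python
# Sketch: sensitivity/specificity of a visual read, an SBR cutoff, and an
# "either test positive" combination, on made-up toy labels.
import numpy as np

truth   = np.array([1, 1, 1, 1, 0, 0, 0, 1, 1, 0])  # 1 = dNDD by criteria
visual  = np.array([1, 1, 0, 1, 0, 0, 0, 1, 0, 0])  # abnormal visual read
sbr_low = np.array([1, 0, 1, 1, 0, 0, 0, 1, 1, 0])  # SBR below cutoff

def sens_spec(pred, truth):
    tp = np.sum((pred == 1) & (truth == 1))
    tn = np.sum((pred == 0) & (truth == 0))
    return tp / truth.sum(), tn / (truth == 0).sum()

combined = (visual | sbr_low).astype(int)  # positive if either test positive
for name, pred in [("visual", visual), ("SBR", sbr_low), ("combined", combined)]:
    se, sp = sens_spec(pred, truth)
    print(f"{name:9s} sensitivity={se:.2f} specificity={sp:.2f}")
```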
Mapping tropical rainforest canopies using multi-temporal spaceborne imaging spectroscopy
NASA Astrophysics Data System (ADS)
Somers, Ben; Asner, Gregory P.
2013-10-01
The use of imaging spectroscopy for floristic mapping of forests is complicated by the spectral similarity among coexisting species. Here we evaluated an alternative spectral unmixing strategy combining a time series of EO-1 Hyperion images and an automated feature selection strategy in MESMA. Instead of using the same spectral subset to unmix each image pixel, our modified approach allowed the spectral subsets to vary on a per-pixel basis such that each pixel is evaluated using a spectral subset tuned towards maximal separability of its specific endmember class combination or species mixture. The potential of the new approach for floristic mapping of tree species in Hawaiian rainforests was quantitatively demonstrated using both simulated and actual hyperspectral image time-series. With a Cohen's Kappa coefficient of 0.65, our approach provided a more accurate tree species map compared to MESMA (Kappa = 0.54). In addition, by the selection of spectral subsets our approach was about 90% faster than MESMA. The flexible or adaptive use of band sets in spectral unmixing as such provides an interesting avenue to address spectral similarities in complex vegetation canopies.
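The per-pixel subset-selection idea can be sketched in a few lines: for each pixel, every small endmember subset from the library is unmixed with nonnegative least squares and the subset with the lowest reconstruction error wins. This is an illustrative simplification (random synthetic spectra, residual-based selection rather than the paper's separability-tuned selection), not the MESMA implementation.

```python
# Sketch: exhaustive per-pixel endmember subset selection via nonnegative
# least squares, on a synthetic library and pixel.
import itertools
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
n_bands, n_members = 50, 5
library = rng.random((n_bands, n_members))            # endmember spectra
true_abund = np.array([0.6, 0.4, 0.0, 0.0, 0.0])
pixel = library @ true_abund + 0.01 * rng.standard_normal(n_bands)

best = (np.inf, None, None)
for subset in itertools.combinations(range(n_members), 2):
    sub = library[:, list(subset)]
    abund, resid = nnls(sub, pixel)                   # nonnegative abundances
    if resid < best[0]:
        best = (resid, subset, abund)
print("selected endmembers:", best[1], "abundances:", np.round(best[2], 2))
```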
Morris, Jeffrey S; Baladandayuthapani, Veerabhadran; Herrick, Richard C; Sanna, Pietro; Gutstein, Howard
2011-01-01
Image data are increasingly encountered and are of growing importance in many areas of science. Much of these data are quantitative image data, which are characterized by intensities that represent some measurement of interest in the scanned images. The data typically consist of multiple images on the same domain and the goal of the research is to combine the quantitative information across images to make inference about populations or interventions. In this paper, we present a unified analysis framework for the analysis of quantitative image data using a Bayesian functional mixed model approach. This framework is flexible enough to handle complex, irregular images with many local features, and can model the simultaneous effects of multiple factors on the image intensities and account for the correlation between images induced by the design. We introduce a general isomorphic modeling approach to fitting the functional mixed model, of which the wavelet-based functional mixed model is one special case. With suitable modeling choices, this approach leads to efficient calculations and can result in flexible modeling and adaptive smoothing of the salient features in the data. The proposed method has the following advantages: it can be run automatically, it produces inferential plots indicating which regions of the image are associated with each factor, it simultaneously considers the practical and statistical significance of findings, and it controls the false discovery rate. Although the method we present is general and can be applied to quantitative image data from any application, in this paper we focus on image-based proteomic data. We apply our method to an animal study investigating the effects of opiate addiction on the brain proteome. Our image-based functional mixed model approach finds results that are missed with conventional spot-based analysis approaches. In particular, we find that the significant regions of the image identified by the proposed method frequently correspond to subregions of visible spots that may represent post-translational modifications or co-migrating proteins that cannot be visually resolved from adjacent, more abundant proteins on the gel image. Thus, it is possible that this image-based approach may actually improve the realized resolution of the gel, revealing differentially expressed proteins that would not have even been detected as spots by modern spot-based analyses.
Popp, Oliver; Müller, Dirk; Didzus, Katharina; Paul, Wolfgang; Lipsmeier, Florian; Kirchner, Florian; Niklas, Jens; Mauch, Klaus; Beaucamp, Nicola
2016-09-01
In-depth characterization of high-producer cell lines and bioprocesses is vital to ensure robust and consistent production of recombinant therapeutic proteins in high quantity and quality for clinical applications. This requires applying appropriate methods during bioprocess development to enable meaningful characterization of CHO clones and processes. Here, we present a novel hybrid approach for supporting comprehensive characterization of metabolic clone performance. The approach combines metabolite profiling with multivariate data analysis and fluxomics to enable a data-driven mechanistic analysis of key metabolic traits associated with desired cell phenotypes. We applied the methodology to quantify and compare metabolic performance in a set of 10 recombinant CHO-K1 producer clones and a host cell line. The comprehensive characterization enabled us to derive an extended set of clone performance criteria that not only captured growth and product formation, but also incorporated information on intracellular clone physiology and on metabolic changes during the process. These criteria served to establish a quantitative clone ranking and allowed us to identify metabolic differences between high-producing CHO-K1 clones yielding comparably high product titers. Through multivariate data analysis of the combined metabolite and flux data we uncovered common metabolic traits characteristic of high-producer clones in the screening setup. This included high intracellular rates of glutamine synthesis, low cysteine uptake, reduced excretion of aspartate and glutamate, and low intracellular degradation rates of branched-chain amino acids and of histidine. Finally, the above approach was integrated into a workflow that enables standardized high-content selection of CHO producer clones in a high-throughput fashion. In conclusion, the combination of quantitative metabolite profiling, multivariate data analysis, and mechanistic network model simulations can identify metabolic traits characteristic of high-performance clones and enables informed decisions on which clones provide a good match for a particular process platform. The proposed approach also provides a mechanistic link between observed clone phenotype, process setup, and feeding regimes, and thereby offers concrete starting points for subsequent process optimization. Biotechnol. Bioeng. 2016;113: 2005-2019. © 2016 Wiley Periodicals, Inc.
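The multivariate step can be sketched as a principal component analysis over clone-by-metabolite profiles, with clones then inspected along the leading components. The data below are synthetic placeholders; this is not the paper's workflow or its performance criteria.

```python
# Sketch: PCA of clone metabolite/flux profiles to expose traits that
# separate high producers from the rest. Data are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(6)
n_clones, n_metabolites = 11, 40
profiles = rng.normal(size=(n_clones, n_metabolites))
profiles[:4] += 1.5 * rng.random(n_metabolites)  # pretend 4 high producers share a trait

pca = PCA(n_components=2).fit(profiles)
scores = pca.transform(profiles)
for i, (pc1, pc2) in enumerate(scores):
    print(f"clone {i:2d}: PC1={pc1:+.2f} PC2={pc2:+.2f}")
print("explained variance:", np.round(pca.explained_variance_ratio_, 2))
```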
A New Algorithm Using Cross-Assignment for Label-Free Quantitation with LC/LTQ-FT MS
Andreev, Victor P.; Li, Lingyun; Cao, Lei; Gu, Ye; Rejtar, Tomas; Wu, Shiaw-Lin; Karger, Barry L.
2008-01-01
A new algorithm is described for label-free quantitation of relative protein abundances across multiple complex proteomic samples. Q-MEND is based on the denoising and peak picking algorithm, MEND, previously developed in our laboratory. Q-MEND takes advantage of the high resolution and mass accuracy of the hybrid LTQ-FT mass spectrometer (or other high-resolution mass spectrometers, such as a Q-TOF MS). The strategy, termed “cross-assignment”, is introduced to increase substantially the number of quantitated proteins. In this approach, all MS/MS identifications for the set of analyzed samples are combined into a master ID list, and then each LC/MS run is searched for the features that can be assigned to a specific identification from that master list. The reliability of quantitation is enhanced by quantitating separately all peptide charge states, along with a scoring procedure to filter out less reliable peptide abundance measurements. The effectiveness of Q-MEND is illustrated in the relative quantitative analysis of E. coli samples spiked with known amounts of non-E. coli protein digests. A mean quantitation accuracy of 7% and mean precision of 15% is demonstrated. Q-MEND can perform relative quantitation of a set of LC/MS datasets without manual intervention and can generate files compatible with the Guidelines for Proteomic Data Publication. PMID:17441747
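The cross-assignment strategy lends itself to a compact sketch: pool all MS/MS identifications into a master list, then search each run's detected features for matches within m/z and retention-time tolerances. The peptide names, tolerances, and data structures below are invented for illustration and are not Q-MEND's internals.

```python
# Sketch of cross-assignment: match each master-list identification to
# features detected in a run, within m/z and RT tolerances (assumed values).
MZ_TOL, RT_TOL = 0.01, 1.0   # Da and minutes (assumed)

master = [  # (peptide, m/z, retention time) pooled across all runs
    ("PEPTIDEA", 650.32, 24.1),
    ("PEPTIDEB", 712.88, 31.7),
]
run_features = [  # detected LC/MS features in one run: (m/z, RT, intensity)
    (650.33, 24.3, 1.2e6),
    (712.87, 31.5, 4.8e5),
    (900.10, 10.0, 2.0e5),
]

for pep, mz, rt in master:
    hits = [f for f in run_features
            if abs(f[0] - mz) <= MZ_TOL and abs(f[1] - rt) <= RT_TOL]
    if hits:
        print(f"{pep}: quantified, intensity={max(h[2] for h in hits):.2e}")
    else:
        print(f"{pep}: no matching feature in this run")
```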
The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods
NASA Astrophysics Data System (ADS)
Crootof, A.; Albrecht, T.; Scott, C. A.
2017-12-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence, or gaps and inconsistent use, of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed that (a) use of specific and reproducible methods for nexus assessment is uncommon: less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food, the very concept they purport to address; (c) assessments strongly favor quantitative approaches: 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos: only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed-method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.
Sinclair, Karen; Kinable, Els; Grosch, Kai; Wang, Jixian
2016-05-01
In current industry practice, it is difficult to assess QT effects at potential therapeutic doses based on Phase I dose-escalation trials in oncology due to data scarcity, particularly in combination trials. In this paper, we propose to use dose-concentration and concentration-QT models jointly to model the exposures and effects of multiple drugs in combination. The fitted models can then be used to make early predictions of QT prolongation to aid in choosing recommended dose combinations for further investigation. The models consider potential correlation between concentrations of test drugs and potential drug-drug interactions at the PK and QT levels. In addition, this approach allows for the assessment of the probability of QT prolongation exceeding given thresholds of clinical significance. The performance of this approach was examined via simulation under practical scenarios for dose-escalation trials of a combination of two drugs. The simulation results show that invaluable information on QT effects at therapeutic dose combinations can be gained by the proposed approach. Early detection of dose combinations with substantial QT prolongation is evaluated effectively through the CIs of the predicted peak QT prolongation at each dose combination. Furthermore, the probability of QT prolongation exceeding a certain threshold is also computed to support early detection of safety signals while accounting for uncertainty associated with data from Phase I studies. While the prediction of QT effects is sensitive to the dose-escalation process, this sensitivity and the limited sample size should be considered when providing support to the decision-making process for further developing certain dose combinations. Copyright © 2016 John Wiley & Sons, Ltd.
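A hedged Monte Carlo sketch of the exceedance-probability idea follows, assuming a linear concentration-QT model with additive effects for the two drugs; all distributions and parameter values are made up for illustration and are not from the paper.

```python
# Sketch: sample PK exposures and slope uncertainty, then estimate the
# probability that peak QT prolongation exceeds 10 ms at a dose combination.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
cmax_a = rng.lognormal(mean=np.log(2.0), sigma=0.4, size=n)  # drug A Cmax
cmax_b = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=n)  # drug B Cmax
slope_a = rng.normal(2.0, 0.5, size=n)   # ms per unit concentration (uncertain)
slope_b = rng.normal(1.0, 0.3, size=n)

dqt = slope_a * cmax_a + slope_b * cmax_b   # additive concentration-QT effects
print(f"mean peak dQTc = {dqt.mean():.1f} ms")
print(f"P(dQTc > 10 ms) = {(dqt > 10).mean():.3f}")
```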
Hare, Jonathan A.; Morrison, Wendy E.; Nelson, Mark W.; Stachura, Megan M.; Teeters, Eric J.; Griffis, Roger B.; Alexander, Michael A.; Scott, James D.; Alade, Larry; Bell, Richard J.; Chute, Antonie S.; Curti, Kiersten L.; Curtis, Tobey H.; Kircheis, Daniel; Kocik, John F.; Lucey, Sean M.; McCandless, Camilla T.; Milke, Lisa M.; Richardson, David E.; Robillard, Eric; Walsh, Harvey J.; McManus, M. Conor; Marancik, Katrin E.; Griswold, Carolyn A.
2016-01-01
Climate change and decadal variability are impacting marine fish and invertebrate species worldwide and these impacts will continue for the foreseeable future. Quantitative approaches have been developed to examine climate impacts on productivity, abundance, and distribution of various marine fish and invertebrate species. However, it is difficult to apply these approaches to large numbers of species owing to the lack of mechanistic understanding sufficient for quantitative analyses, as well as the lack of scientific infrastructure to support these more detailed studies. Vulnerability assessments provide a framework for evaluating climate impacts over a broad range of species with existing information. These methods combine the exposure of a species to a stressor (climate change and decadal variability) and the sensitivity of species to the stressor. These two components are then combined to estimate an overall vulnerability. Quantitative data are used when available, but qualitative information and expert opinion are used when quantitative data is lacking. Here we conduct a climate vulnerability assessment on 82 fish and invertebrate species in the Northeast U.S. Shelf including exploited, forage, and protected species. We define climate vulnerability as the extent to which abundance or productivity of a species in the region could be impacted by climate change and decadal variability. We find that the overall climate vulnerability is high to very high for approximately half the species assessed; diadromous and benthic invertebrate species exhibit the greatest vulnerability. In addition, the majority of species included in the assessment have a high potential for a change in distribution in response to projected changes in climate. Negative effects of climate change are expected for approximately half of the species assessed, but some species are expected to be positively affected (e.g., increase in productivity or move into the region). These results will inform research and management activities related to understanding and adapting marine fisheries management and conservation to climate change and decadal variability. PMID:26839967
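The exposure-times-sensitivity logic can be sketched as a small scoring routine. Species names, rubric scores, and the binning rule below are invented placeholders; the published assessment uses its own expert-scoring and logic rules.

```python
# Sketch: combine an exposure score and a sensitivity score (1-4 rubric
# values) into an overall vulnerability rank via a crude product binning.
RANKS = {1: "low", 2: "moderate", 3: "high", 4: "very high"}

species_scores = {          # (exposure, sensitivity), 1-4 each; invented
    "species_A": (4, 3),
    "species_B": (2, 2),
    "species_C": (3, 4),
}

for name, (exposure, sensitivity) in species_scores.items():
    product = exposure * sensitivity            # 1..16
    overall = min(4, 1 + (product - 1) // 4)    # crude 4-bin mapping (assumed)
    print(f"{name}: exposure={exposure} sensitivity={sensitivity} "
          f"-> {RANKS[overall]} vulnerability")
```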
Hui, Lung Kit; Ng, Roger M K; Pau, Lui; Yip, K C
2012-03-01
The aim of this study was to explore and describe the subjective experiences of agoraphobia in Hong Kong Chinese. This was a cross-sectional descriptive study, using a combined qualitative-quantitative approach. In the qualitative part, two focus groups were held with nine participants suffering from DSM-IV panic disorder with agoraphobia who were being followed up in a regional hospital in Hong Kong. The audiotaped discussions were transcribed and analysed into four main categories and 13 subcategories based on a grounded theory approach. One subcategory ('Fear of making others worried and being a burden to others') was identified as a novel, culture-specific concept in agoraphobia that was not reported in Western literature. In the quantitative part, this subcategory was redefined and measured by a two-item, self-rated questionnaire survey in another 35 participants suffering from DSM-IV defined panic disorder with agoraphobia. Qualitative data showed that the clinical manifestations of agoraphobia were specifically related to the underlying corresponding catastrophic cognitions. An individual's agoraphobic cognitions and symptoms were highly related to the identity of the surrounding people during panic attacks in agoraphobic situations, which reflected the characteristic structure of the Chinese interpersonal network. Participants preferred to rely on themselves to cope with the anxiety first, then turned to their family members for help due to higher interpersonal trust. Participants also expressed fear of affecting others due to their illness. A new sub-theme of agoraphobia ('Fear of making others worried and being a burden to others') was extracted from the qualitative data. Its validity was confirmed by the quantitative description of this new theme using a self-rated questionnaire as a methodological triangulation. The central theme to emerge from the qualitative data was that agoraphobia is a clinical condition closely related to Chinese cultural factors. 'Fear of making others worried and being a burden to others' is a new concept in agoraphobia worthy of further study.
An optimal strategy for functional mapping of dynamic trait loci.
Jin, Tianbo; Li, Jiahan; Guo, Ying; Zhou, Xiaojing; Yang, Runqing; Wu, Rongling
2010-02-01
As an emerging powerful approach for mapping quantitative trait loci (QTLs) responsible for dynamic traits, functional mapping models the time-dependent mean vector with biologically meaningful equations and is likely to generate biologically relevant and interpretable results. Given the autocorrelated nature of a dynamic trait, functional mapping also requires modelling the structure of the covariance matrix. In this article, we provide a comprehensive set of approaches for modelling the covariance structure and incorporate each of these approaches into the framework of functional mapping. The Bayesian information criterion (BIC) values are used as a model selection criterion to choose the optimal combination of submodels for the mean vector and covariance structure. In an example of leaf age growth from a rice molecular genetic project, the best submodel combination was found to be the Gaussian model for the correlation structure, the power equation of order 1 for the variance, and the power curve for the mean vector. Under this combination, several significant QTLs for leaf age growth trajectories were detected on different chromosomes. Our model can be well used to study the genetic architecture of dynamic traits of agricultural value.
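The BIC-based choice among covariance submodels can be illustrated with simulated longitudinal data, comparing a compound-symmetry structure against an AR(1) structure. This is a deliberately stripped-down sketch (covariance parameters are fixed rather than fitted), not the paper's model set.

```python
# Sketch: BIC comparison of two residual covariance structures for a
# longitudinal trait measured at T time points on n individuals.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
T, n = 6, 80
times = np.arange(T)
true_cov = 0.7 ** np.abs(np.subtract.outer(times, times))   # AR(1) truth
data = rng.multivariate_normal(np.zeros(T), true_cov, size=n)

def bic(cov, k_params):
    ll = multivariate_normal(mean=np.zeros(T), cov=cov).logpdf(data).sum()
    return -2 * ll + k_params * np.log(n)

cs = 0.5 * np.ones((T, T)) + 0.5 * np.eye(T)                # compound symmetry
ar1 = 0.7 ** np.abs(np.subtract.outer(times, times))        # AR(1), rho fixed
print(f"BIC compound symmetry: {bic(cs, 2):.1f}")
print(f"BIC AR(1):             {bic(ar1, 2):.1f}  (lower is better)")
```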
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pincus, David; Ryan, Christopher J.; Smith, Richard D.
2013-03-12
Cell signaling systems transmit information by post-translationally modifying signaling proteins, often via phosphorylation. While thousands of sites of phosphorylation have been identified in proteomic studies, the vast majority of sites have no known function. Assigning functional roles to the catalog of uncharacterized phosphorylation sites is a key research challenge. Here we present a general approach to address this challenge and apply it to a prototypical signaling pathway, the pheromone response pathway in Saccharomyces cerevisiae. The pheromone pathway includes a mitogen activated protein kinase (MAPK) cascade activated by a G-protein coupled receptor (GPCR). We used mass spectrometry-based proteomics to identify sites whose phosphorylation changed when the system was active, and evolutionary conservation to assign priority to a list of candidate MAPK regulatory sites. We made targeted alterations in those sites, and measured the effects of the mutations on pheromone pathway output in single cells. Our work identified six new sites that quantitatively tuned system output. We developed simple computational models to find system architectures that recapitulated the quantitative phenotypes of the mutants. Our results identify a number of regulated phosphorylation events that contribute to adjust the input-output relationship of this model eukaryotic signaling system. We believe this combined approach constitutes a general means not only to reveal modification sites required to turn a pathway on and off, but also those required for more subtle quantitative effects that tune pathway output. Our results further suggest that relatively small quantitative influences from individual regulatory phosphorylation events endow signaling systems with plasticity that evolution may exploit to quantitatively tailor signaling outcomes.
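The screening step, selecting sites that both respond to pathway activation and are evolutionarily conserved, reduces to a simple filter. The protein names below are real pheromone-pathway components, but the residues, fold changes, conservation scores, and thresholds are invented for illustration.

```python
# Sketch: prioritize candidate regulatory phosphosites by requiring both a
# phosphorylation change on activation and cross-species conservation.
FOLD_CHANGE_MIN, CONSERVATION_MIN = 2.0, 0.8   # thresholds (assumed)

sites = [  # (protein, residue, fold change on activation, conservation score)
    ("Ste5",  "S329", 3.5, 0.95),   # residues and scores are hypothetical
    ("Fus3",  "T180", 1.2, 0.99),
    ("Ste20", "S547", 2.8, 0.40),
]

candidates = [s for s in sites
              if s[2] >= FOLD_CHANGE_MIN and s[3] >= CONSERVATION_MIN]
for protein, residue, fc, cons in candidates:
    print(f"candidate regulatory site: {protein} {residue} "
          f"(fold change {fc:.1f}, conservation {cons:.2f})")
```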
Quantitative evaluation of phase processing approaches in susceptibility weighted imaging
NASA Astrophysics Data System (ADS)
Li, Ningzhi; Wang, Wen-Tung; Sati, Pascal; Pham, Dzung L.; Butman, John A.
2012-03-01
Susceptibility weighted imaging (SWI) takes advantage of the local variation in susceptibility between different tissues to enable highly detailed visualization of the cerebral venous system and sensitive detection of intracranial hemorrhages. Thus, it has been increasingly used in magnetic resonance imaging studies of traumatic brain injury as well as other intracranial pathologies. In SWI, magnitude information is combined with phase information to enhance the susceptibility-induced image contrast. Because of global susceptibility variations across the image, the rate of phase accumulation varies widely, resulting in phase wrapping artifacts that interfere with the local assessment of phase variation. Homodyne filtering is a common approach to eliminate this global phase variation. However, filter size requires careful selection in order to preserve image contrast and avoid errors resulting from residual phase wraps. An alternative approach is to apply phase unwrapping prior to high-pass filtering. A suitable phase unwrapping algorithm guarantees no residual phase wraps, but additional computational steps are required. In this work, we quantitatively evaluate these two phase processing approaches on both simulated and real data using different filters and cutoff frequencies. Our analysis leads to an improved understanding of the relationship between phase wraps, susceptibility effects, and acquisition parameters. Although homodyne filtering approaches are faster and more straightforward, phase unwrapping approaches perform more accurately in a wider variety of acquisition scenarios.
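The homodyne approach described above is compact enough to sketch: divide the complex image by a low-pass-filtered copy of itself, keep the residual (high-pass) phase, build a negative-phase mask, and multiply it into the magnitude. The filter size and the fourfold mask multiplication below are typical choices, not values from this study, and the input is random synthetic data.

```python
# Sketch of a standard homodyne high-pass phase filter for SWI.
import numpy as np

rng = np.random.default_rng(3)
img = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))

def lowpass(x, keep=32):
    """Keep only the central keep x keep block of k-space."""
    k = np.fft.fftshift(np.fft.fft2(x))
    mask = np.zeros(k.shape)
    c0, c1 = k.shape[0] // 2, k.shape[1] // 2
    mask[c0 - keep // 2:c0 + keep // 2, c1 - keep // 2:c1 + keep // 2] = 1.0
    return np.fft.ifft2(np.fft.ifftshift(k * mask))

hp_phase = np.angle(img / (lowpass(img) + 1e-12))           # high-pass phase
phase_mask = np.clip((hp_phase + np.pi) / np.pi, 0.0, 1.0)  # negative phase < 1
swi = np.abs(img) * phase_mask ** 4                         # typical 4x application
print("SWI image:", swi.shape, swi.dtype)
```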
Modeling Drug- and Chemical-Induced Hepatotoxicity with Systems Biology Approaches
Bhattacharya, Sudin; Shoda, Lisl K.M.; Zhang, Qiang; Woods, Courtney G.; Howell, Brett A.; Siler, Scott Q.; Woodhead, Jeffrey L.; Yang, Yuching; McMullen, Patrick; Watkins, Paul B.; Andersen, Melvin E.
2012-01-01
We provide an overview of computational systems biology approaches as applied to the study of chemical- and drug-induced toxicity. The concept of “toxicity pathways” is described in the context of the 2007 US National Academies of Science report, “Toxicity testing in the 21st Century: A Vision and A Strategy.” Pathway mapping and modeling based on network biology concepts are a key component of the vision laid out in this report for a more biologically based analysis of dose-response behavior and the safety of chemicals and drugs. We focus on toxicity of the liver (hepatotoxicity) – a complex phenotypic response with contributions from a number of different cell types and biological processes. We describe three case studies of complementary multi-scale computational modeling approaches to understand perturbation of toxicity pathways in the human liver as a result of exposure to environmental contaminants and specific drugs. One approach involves development of a spatial, multicellular “virtual tissue” model of the liver lobule that combines molecular circuits in individual hepatocytes with cell–cell interactions and blood-mediated transport of toxicants through hepatic sinusoids, to enable quantitative, mechanistic prediction of hepatic dose-response for activation of the aryl hydrocarbon receptor toxicity pathway. Simultaneously, methods are being developed to extract quantitative maps of intracellular signaling and transcriptional regulatory networks perturbed by environmental contaminants, using a combination of gene expression and genome-wide protein-DNA interaction data. A predictive physiological model (DILIsym™) to understand drug-induced liver injury (DILI), the most common adverse event leading to termination of clinical development programs and regulatory actions on drugs, is also described. The model initially focuses on reactive metabolite-induced DILI in response to administration of acetaminophen, and spans multiple biological scales. PMID:23248599
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oakdale, James S.; Kwisnek, Luke; Fokin, Valery V.
2016-06-10
Functional polystyrenes and polyacrylamides, containing combinations of fluorosulfate, aromatic silyl ether, and azide side chains, were used as scaffolds to demonstrate the postpolymerization modification capabilities of sulfur(VI) fluoride exchange (SuFEx) and CuAAC chemistries. Fluorescent dyes bearing appropriate functional groups were sequentially attached to the backbone of the copolymers, quantitatively and selectively addressing their reactive partners. Furthermore, this combined SuFEx and CuAAC approach proved to be robust and versatile, allowing for a rare accomplishment: triple orthogonal functionalization of a copolymer under essentially ambient conditions without protecting groups.
Yanina, Irina Y; Popov, Alexey P; Bykov, Alexander V; Meglinski, Igor V; Tuchin, Valery V
2018-01-01
Observation of temperature-mediated phase transitions between lipid components of adipose tissues has been performed by combined use of Abbe refractometry and optical coherence tomography. The phase transitions of the lipid components were clearly observed in the range of temperatures from 24°C to 60°C, and assessed by quantitatively monitoring the changes of the refractive index of 1- to 2-mm-thick porcine fat tissue slices. The developed approach has great potential as an alternative method for obtaining accurate information on the processes occurring during thermal lipolysis. (2018) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J
2017-11-01
Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to be required for further progress in understanding and treating AD. Copyright © 2017 the Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
SC3 - consensus clustering of single-cell RNA-Seq data
Kiselev, Vladimir Yu.; Kirschner, Kristina; Schaub, Michael T.; Andrews, Tallulah; Yiu, Andrew; Chandra, Tamir; Natarajan, Kedar N; Reik, Wolf; Barahona, Mauricio; Green, Anthony R; Hemberg, Martin
2017-01-01
Single-cell RNA-seq (scRNA-seq) enables a quantitative cell-type characterisation based on global transcriptome profiles. We present Single-Cell Consensus Clustering (SC3), a user-friendly tool for unsupervised clustering which achieves high accuracy and robustness by combining multiple clustering solutions through a consensus approach. We demonstrate that SC3 is capable of identifying subclones based on the transcriptomes from neoplastic cells collected from patients. PMID:28346451
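The consensus idea can be sketched as follows: cluster the same cells several ways, count how often each pair of cells lands in the same cluster, and then cluster that consensus matrix. This is a toy illustration in the spirit of SC3 (synthetic data, variation over PCA depth only), not the SC3 implementation.

```python
# Sketch: build a consensus matrix from several k-means runs on different
# PCA embeddings, then derive final clusters from the consensus matrix.
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X = np.vstack([rng.normal(loc, 1.0, size=(30, 20)) for loc in (0, 4, 8)])
k, n = 3, X.shape[0]

consensus = np.zeros((n, n))
depths = (2, 5, 10)                      # vary the PCA depth per run
for d in depths:
    emb = PCA(n_components=d).fit_transform(X)
    labels = KMeans(n_clusters=k, n_init=10, random_state=d).fit_predict(emb)
    consensus += (labels[:, None] == labels[None, :])
consensus /= len(depths)                 # fraction of runs co-clustering each pair

# hierarchical clustering of the consensus matrix (rows used as features)
final = AgglomerativeClustering(n_clusters=k).fit_predict(consensus)
print("cluster sizes:", np.bincount(final))
```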
Application of Fuzzy Logic to Matrix FMECA
NASA Astrophysics Data System (ADS)
Shankar, N. Ravi; Prabhu, B. S.
2001-04-01
A methodology combining the benefits of Fuzzy Logic and Matrix FMEA is presented in this paper. The presented methodology extends the risk prioritization beyond the conventional Risk Priority Number (RPN) method. Fuzzy logic is used to calculate the criticality rank. Also the matrix approach is improved further to develop a pictorial representation retaining all relevant qualitative and quantitative information of several FMEA elements relationships. The methodology presented is demonstrated by application to an illustrative example.
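A toy contrast between the conventional RPN and a fuzzy criticality rank is sketched below. The triangular membership functions and the single min-rule are invented for illustration and are far simpler than a full fuzzy inference system.

```python
# Sketch: conventional RPN (= S x O x D) next to a toy fuzzy criticality
# rank built from triangular "high" membership functions on a 1-10 scale.
def tri(x, a, b, c):
    """Triangular membership function on [a, c] peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_criticality(severity, occurrence, detection):
    # a high detection rating means the failure is hard to detect (usual scale)
    hi_s = tri(severity, 4, 10, 16)
    hi_o = tri(occurrence, 4, 10, 16)
    hi_d = tri(detection, 4, 10, 16)
    # single illustrative rule: criticality = strength of "all inputs high"
    return min(hi_s, hi_o, hi_d)

for s, o, d in [(9, 3, 2), (6, 6, 6), (10, 8, 9)]:
    print(f"S={s} O={o} D={d}: RPN={s * o * d:3d}  "
          f"fuzzy criticality={fuzzy_criticality(s, o, d):.2f}")
```

A case like S=9, O=3, D=2 illustrates the motivation: its RPN is moderate, and the fuzzy rank also flags it as low only because all three inputs must be high under this particular (invented) rule.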
Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution
Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn
2012-01-01
Enzymes are tremendously proficient catalysts, which can be used as extracellular catalysts for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner, and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory, and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign, but will always be limited by the vastness of sequence space combined with the low frequency for desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907
Ikeda, Mayumi; Ishima, Yu; Shibata, Akitomo; Chuang, Victor T G; Sawa, Tomohiro; Ihara, Hideshi; Watanabe, Hiroshi; Xian, Ming; Ouchi, Yuya; Shimizu, Taro; Ando, Hidenori; Ukawa, Masami; Ishida, Tatsuhiro; Akaike, Takaaki; Otagiri, Masaki; Maruyama, Toru
2017-05-29
Hydrogen sulfide (H2S) signaling involves polysulfide (RSSnSR′) formation on various proteins. However, the current lack of sensitive polysulfide detection assays poses methodological challenges for understanding sulfane sulfur homeostasis and signaling. We developed a novel combined assay by modifying Sulfide Antioxidant Buffer (SAOB) to produce an "Elimination Method of Sulfide from Polysulfide" (EMSP) treatment solution that liberates sulfide, followed by a methylene blue (MB) sulfide detection assay. The combined EMSP-MB sulfide detection assay performed on low-molecular-weight sulfur species showed that sulfide was produced from trisulfide compounds such as glutathione trisulfide and diallyl trisulfide, but not from thiol compounds such as cysteine, cystine, and glutathione. In the case of plasma proteins, this novel combined detection assay revealed that approximately 14.7, 1.7, 3.9, and 3.7 mol sulfide/mol protein were released from human serum albumin, α1-antitrypsin, α1-acid glycoprotein, and ovalbumin, respectively, suggesting that serum albumin is a major pool of polysulfide in the human blood circulation. Taken together with the results for albumins of different species, the liberated sulfide correlates well with cysteine content rather than methionine content, indicating that cysteine is the site of polysulfide incorporation. With this novel sulfide detection assay, polysulfide concentrations of approximately 8,000, 120, and 1,100 μM were quantitated in healthy human plasma, saliva, and tears, respectively. Our promising polysulfide-specific detection assay can be a very important tool because quantitative determination of polysulfide sheds light on the functional consequences of protein-bound cysteine polysulfide and expands the research area from reactive oxygen to reactive polysulfide species. Copyright © 2017 Elsevier B.V. All rights reserved.
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-01-01
Background: Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors, but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Methods: Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. Results: A ‘Rich Picture’ was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. Conclusions: When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. PMID:28062603
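The CART step in the quantitative arm is a standard technique; a minimal, self-contained Python sketch with synthetic stand-ins for the audit variables (the column names and coefficients below are invented, not the study's):

    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(0)
    n = 2000
    audit = pd.DataFrame({
        "premature": rng.integers(0, 2, n),        # invented audit variables
        "weight_z": rng.normal(0, 1, n),
        "complex_diagnosis": rng.integers(0, 2, n),
    })
    # Synthetic outcome loosely driven by the covariates
    risk = 0.05 + 0.10 * audit["premature"] + 0.08 * audit["complex_diagnosis"]
    audit["adverse_outcome"] = rng.random(n) < risk

    X, y = audit.drop(columns="adverse_outcome"), audit["adverse_outcome"]
    # A shallow tree keeps the risk groups interpretable for stakeholders
    tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=100).fit(X, y)
    print(export_text(tree, feature_names=list(X.columns)))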
LeBlanc, André; Michaud, Sarah A; Percy, Andrew J; Hardie, Darryl B; Yang, Juncong; Sinclair, Nicholas J; Proudfoot, Jillaine I; Pistawka, Adam; Smith, Derek S; Borchers, Christoph H
2017-07-07
When quantifying endogenous plasma proteins for fundamental and biomedical research - as well as for clinical applications - precise, reproducible, and robust assays are required. Targeted detection of peptides in a bottom-up strategy is the most common and precise mass spectrometry-based quantitation approach when combined with the use of stable isotope-labeled peptides. However, when measuring protein in plasma, the unknown endogenous levels prevent the implementation of the best calibration strategies, since no blank matrix is available. Consequently, several alternative calibration strategies are employed by different laboratories. In this study, these methods were compared to a new approach using two different stable isotope-labeled standard (SIS) peptide isotopologues for each endogenous peptide to be quantified, enabling an external calibration curve as well as the quality control samples to be prepared in pooled human plasma without interference from endogenous peptides. This strategy improves the analytical performance of the assay and enables the accuracy of the assay to be monitored, which can also facilitate method development and validation.
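The calibration idea generalizes to a few lines of arithmetic; a minimal numeric sketch with invented values, where isotopologue A is spiked at known, varying levels to build the curve in pooled plasma, isotopologue B serves as the fixed-level internal standard, and the endogenous (light) peptide is read off the same curve:

    import numpy as np

    conc_A = np.array([1, 5, 10, 50, 100.0])               # spiked standard A, fmol/uL
    area_A = np.array([0.9e4, 5.2e4, 9.8e4, 51e4, 99e4])   # measured peak areas
    area_B = 2.0e5                                         # fixed-level standard B

    # Calibrate the A/B area ratio against concentration (weighted regression is
    # common in practice; ordinary least squares keeps the sketch short)
    slope, intercept = np.polyfit(conc_A, area_A / area_B, 1)

    area_endogenous = 3.1e4                                # light peptide in a sample
    conc_endogenous = (area_endogenous / area_B - intercept) / slope
    print(f"endogenous peptide ~ {conc_endogenous:.1f} fmol/uL")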
Müller, Andreas; Neukam, Martin; Ivanova, Anna; Sönmez, Anke; Münster, Carla; Kretschmar, Susanne; Kalaidzidis, Yannis; Kurth, Thomas; Verbavatz, Jean-Marc; Solimena, Michele
2017-02-02
Correlative light and electron microscopy (CLEM) is a powerful approach to investigate the molecular ultrastructure of labeled cell compartments. However, quantitative CLEM studies are rare, mainly due to small sample sizes and the sensitivity of fluorescent proteins to strong fixatives and contrasting reagents for EM. Here, we show that fusion of a self-labeling protein to insulin allows for the quantification of age-distinct insulin granule pools in pancreatic beta cells by a combination of super-resolution and transmission electron microscopy on Tokuyasu cryosections. In contrast to fluorescent proteins like GFP, organic dyes covalently bound to self-labeling proteins retain their fluorescence in epoxy resin following high-pressure freezing and freeze substitution, or remarkably even after strong chemical fixation. This enables the assessment of age-defined granule morphology and degradation. Finally, we demonstrate that this CLEM protocol is highly versatile, being suitable for single and dual fluorescent labeling and detection of different proteins with optimal ultrastructure preservation and contrast.
COMPUTER-AIDED DRUG DISCOVERY AND DEVELOPMENT (CADDD): in silico-chemico-biological approach
Kapetanovic, I.M.
2008-01-01
It is generally recognized that drug discovery and development are very time- and resource-consuming processes. There is an ever-growing effort to apply computational power to the combined chemical and biological space in order to streamline drug discovery, design, development, and optimization. In the biomedical arena, computer-aided or in silico design is being utilized to expedite and facilitate hit identification and hit-to-lead selection, to optimize the absorption, distribution, metabolism, excretion, and toxicity profile, and to avoid safety issues. Commonly used computational approaches include ligand-based drug design (pharmacophore, a 3-D spatial arrangement of chemical features essential for biological activity), structure-based drug design (drug-target docking), and quantitative structure-activity and quantitative structure-property relationships. Regulatory agencies as well as the pharmaceutical industry are actively involved in the development of computational tools that will improve the effectiveness and efficiency of the drug discovery and development process, decrease the use of animals, and increase predictability. It is expected that the power of CADDD will grow as the technology continues to evolve. PMID:17229415
Gao, Xi; Kong, Bo; Vigil, R Dennis
2017-01-01
A comprehensive quantitative model incorporating the effects of fluid flow patterns, light distribution, and algal growth kinetics on biomass growth rate is developed in order to predict the performance of a Taylor vortex algal photobioreactor for culturing Chlorella vulgaris. A commonly used Lagrangian strategy for coupling the various factors influencing algal growth was employed, whereby results from computational fluid dynamics and radiation transport simulations were used to compute numerous microorganism light exposure histories, and this information in turn was used to estimate the global biomass specific growth rate. The simulations provide good quantitative agreement with experimental data and correctly predict the trend in reactor performance as a key reactor operating parameter (inner cylinder rotation speed) is varied. However, biomass growth curves are consistently over-predicted; potential causes of these over-predictions and drawbacks of the Lagrangian approach are addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
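In its simplest form, the Lagrangian coupling amounts to averaging a kinetic light response over simulated exposure histories. A minimal Python sketch under stated assumptions (toy growth kinetics and synthetic histories standing in for the CFD and radiation transport outputs):

    import numpy as np

    def local_growth_rate(I, mu_max=0.15, I_k=120.0):
        """Toy light-limited kinetics (1/h); I in umol photons m^-2 s^-1."""
        return mu_max * I / (I_k + I)

    # light_histories[p, t]: irradiance along particle p's simulated track,
    # normally sampled from a radiative-transfer field at CFD particle positions
    rng = np.random.default_rng(0)
    light_histories = rng.gamma(shape=2.0, scale=100.0, size=(1000, 500))

    # Average the instantaneous kinetic response over particles and time steps
    mu_global = local_growth_rate(light_histories).mean()
    print(f"estimated global specific growth rate: {mu_global:.3f} 1/h")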
El-Rami, Fadi; Nelson, Kristina; Xu, Ping
2017-01-01
Streptococcus sanguinis is a commensal and early colonizer of the oral cavity as well as an opportunistic pathogen in infectious endocarditis. Extracting the soluble proteome of this bacterium provides deep insights into the dynamic physiological changes under different growth and stress conditions, thus defining “proteomic signatures” as targets for therapeutic intervention. In this protocol, we describe an experimentally verified approach to extract maximal cytoplasmic proteins from the Streptococcus sanguinis SK36 strain. A combination of procedures was adopted that broke the thick cell wall barrier and minimized denaturation of the intracellular proteome, using optimized buffers and a sonication step. The extracted proteome was quantified using the Pierce BCA Protein Quantitation assay, and protein bands were macroscopically assessed by Coomassie Blue staining. Finally, high-resolution detection of the extracted proteins was conducted on a Synapt G2Si mass spectrometer, followed by label-free relative quantification via Progenesis QI. In conclusion, this pipeline for proteomic extraction and analysis of soluble proteins provides a fundamental tool in deciphering the biological complexity of Streptococcus sanguinis. PMID:29152022
Aarons, Gregory A.; Fettes, Danielle L.; Sommerfeld, David H.; Palinkas, Lawrence
2013-01-01
Many public sector service systems and provider organizations are in some phase of learning about or implementing evidence-based interventions. Child welfare service systems represent a context where implementation spans system, management, and organizational concerns. Research utilizing mixed methods that combine qualitative and quantitative design, data collection, and analytic approaches is particularly well suited to understanding both the process and outcomes of dissemination and implementation efforts in child welfare systems. This paper describes the process of using mixed methods in implementation research and provides an applied example of an examination of factors impacting staff retention during an evidence-based intervention implementation in a statewide child welfare system. We integrate qualitative data with previously published quantitative analyses of job autonomy and staff turnover during this statewide implementation project in order to illustrate the utility of mixed method approaches in providing a more comprehensive understanding of opportunities and challenges in implementation research. PMID:22146861
Quantitative lung perfusion evaluation using Fourier decomposition perfusion MRI.
Kjørstad, Åsmund; Corteville, Dominique M R; Fischer, Andre; Henzler, Thomas; Schmid-Bindert, Gerald; Zöllner, Frank G; Schad, Lothar R
2014-08-01
To quantitatively evaluate lung perfusion using Fourier decomposition perfusion MRI. The Fourier decomposition (FD) method is a noninvasive method for assessing ventilation- and perfusion-related information in the lungs, where the perfusion maps in particular have shown promise for clinical use. However, the perfusion maps are nonquantitative and dimensionless, making follow-ups and direct comparisons between patients difficult. We present an approach to obtain physically meaningful and quantifiable perfusion maps using the FD method. The standard FD perfusion images are quantified by comparing the partially blood-filled pixels in the lung parenchyma with the fully blood-filled pixels in the aorta. The percentage of blood in a pixel is then combined with the temporal information, yielding quantitative blood flow values. The values of 10 healthy volunteers are compared with SEEPAGE measurements, which have shown high consistency with dynamic contrast-enhanced MRI. All pulmonary blood flow (PBF) values are within the expected range. The two methods are in good agreement (mean difference = 0.2 mL/min/100 mL, mean absolute difference = 11 mL/min/100 mL, mean PBF-FD = 150 mL/min/100 mL, mean PBF-SEEPAGE = 151 mL/min/100 mL). The Bland-Altman plot shows a good spread of values, indicating no systematic bias between the methods. Quantitative lung perfusion can be obtained using the Fourier decomposition method combined with a small amount of postprocessing. Copyright © 2013 Wiley Periodicals, Inc.
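The quantification step reduces to a normalization; a minimal Python sketch with invented numbers (the published pipeline also involves registration, lung masking, and careful aortic ROI selection, all omitted here):

    import numpy as np

    rng = np.random.default_rng(0)
    fd_amplitude = rng.uniform(20, 80, size=(64, 64))   # stand-in FD perfusion map
    aorta_amplitude = 1850.0    # FD amplitude of fully blood-filled aortic pixels
    heart_rate = 65.0           # beats per minute during acquisition

    blood_fraction = np.clip(fd_amplitude / aorta_amplitude, 0.0, 1.0)
    pbf = blood_fraction * heart_rate * 100.0           # mL/min per 100 mL of lung
    print(f"mean PBF: {pbf.mean():.0f} mL/min/100 mL")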
A Study Assessing the Potential of Negative Effects in Interdisciplinary Math–Biology Instruction
Madlung, Andreas; Bremer, Martina; Himelblau, Edward; Tullis, Alexa
2011-01-01
There is increasing enthusiasm for teaching approaches that combine mathematics and biology. The call for integrating more quantitative work in biology education has led to new teaching tools that improve quantitative skills. Little is known, however, about whether increasing interdisciplinary work can lead to adverse effects, such as the development of broader but shallower skills, the possibility that math anxiety causes some students to disengage in the classroom, or, paradoxically, that students focus so much on the mathematics that they lose sight of its application to the biological concepts at the center of the unit at hand. We have developed and assessed an integrative learning module and found disciplinary learning gains to be equally strong in first-year students who actively engaged in embedded quantitative calculations as in those students who were merely presented with quantitative data in the context of interpreting biological and biostatistical results. When presented to advanced biology students, our quantitative learning tool increased test performance significantly. We conclude from our study that the addition of mathematical calculations to the first-year and advanced biology curricula did not hinder overall student learning, and may increase disciplinary learning and data interpretation skills in advanced students. PMID:21364099
Vasquez, Joshua J; Hussien, Rajaa; Aguilar-Rodriguez, Brandon; Junger, Henrik; Dobi, Dejan; Henrich, Timothy J; Thanh, Cassandra; Gibson, Erica; Hogan, Louise E; McCune, Joseph; Hunt, Peter W; Stoddart, Cheryl A; Laszik, Zoltan G
2018-06-01
Persistent tissue reservoirs of HIV present a major barrier to cure. Defining subsets of infected cells in tissues is a major focus of HIV cure research. Herein, we describe a novel multiplexed in situ hybridization (ISH) (RNAscope) protocol to detect HIV-DNA (vDNA) and HIV-RNA (vRNA) in formalin-fixed paraffin-embedded (FFPE) human tissues in combination with immunofluorescence (IF) phenotyping of the infected cells. We show that multiplexed IF and ISH (mIFISH) is suitable for quantitative assessment of HIV vRNA and vDNA and that multiparameter IF phenotyping allows precise identification of the cellular source of the ISH signal. We also provide semi-quantitative data on the impact of various tissue fixatives on the detectability of vDNA and vRNA with RNAscope technology. Finally, we describe methods to quantitate the ISH signal on whole-slide digital images and validation of the quantitative ISH data with quantitative real-time PCR for vRNA. It is our hope that this approach will provide insight into the biology of HIV tissue reservoirs and inform strategies aimed at curing HIV.
Baldwin, Nicole E.; Chesler, Elissa J.; Kirov, Stefan; ...
2005-01-01
Gene expression microarray data can be used for the assembly of genetic coexpression network graphs. Using mRNA samples obtained from recombinant inbred Mus musculus strains, it is possible to integrate allelic variation with molecular and higher-order phenotypes. The depth of quantitative genetic analysis of microarray data can be vastly enhanced utilizing this mouse resource in combination with powerful computational algorithms, platforms, and data repositories. The resulting network graphs transect many levels of biological scale. This approach is illustrated with the extraction of cliques of putatively co-regulated genes and their annotation using gene ontology analysis and cis-regulatory element discovery. The causal basis for co-regulation is detected through the use of quantitative trait locus mapping.
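The clique-extraction step has a compact standard form; a minimal Python sketch under stated assumptions (a synthetic expression matrix with four built-in co-expression blocks and an illustrative correlation cutoff):

    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    latent = rng.normal(size=(4, 100))                 # 4 shared expression programs
    expr = np.repeat(latent, 10, axis=0) + 0.5 * rng.normal(size=(40, 100))
    corr = np.corrcoef(expr)                           # gene-gene correlations

    G = nx.Graph()
    genes = [f"gene_{i}" for i in range(corr.shape[0])]
    G.add_nodes_from(genes)
    for i in range(len(genes)):
        for j in range(i + 1, len(genes)):
            if abs(corr[i, j]) > 0.7:                  # illustrative cutoff
                G.add_edge(genes[i], genes[j])

    cliques = [c for c in nx.find_cliques(G) if len(c) >= 4]
    print(f"{len(cliques)} maximal cliques of size >= 4")  # candidates for GO annotation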
Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.
2014-01-01
Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using a PubMed literature search, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139
Bartlett, John M S; Christiansen, Jason; Gustavson, Mark; Rimm, David L; Piper, Tammy; van de Velde, Cornelis J H; Hasenburg, Annette; Kieback, Dirk G; Putter, Hein; Markopoulos, Christos J; Dirix, Luc Y; Seynaeve, Caroline; Rea, Daniel W
2016-01-01
Hormone receptors, HER2/neu, and Ki-67 are markers of residual risk in early breast cancer. An algorithm (IHC4) combining these markers may provide additional information on residual risk of recurrence in patients treated with hormone therapy. We sought to independently validate the IHC4 algorithm, originally developed on the trans-ATAC (Arimidex, Tamoxifen, Alone or in Combination Trial) cohort, in the multinational Tamoxifen Versus Exemestane Adjuvant Multicenter Trial (TEAM) cohort by comparing 2 methodologies. The IHC4 biomarker expression was quantified on TEAM cohort samples (n = 2919) by using 2 independent methodologies (conventional 3,3'-diaminobenzidine [DAB] immunohistochemistry with image analysis and standardized quantitative immunofluorescence [QIF] by AQUA technology). The IHC4 scores were calculated by using the same previously established coefficients and then compared with recurrence-free and distant recurrence-free survival, using multivariate Cox proportional hazards modeling. The QIF model was highly significant for prediction of residual risk (P < .001), with continuous model scores showing a hazard ratio (HR) of 1.012 (95% confidence interval [95% CI]: 1.010-1.014), which was significantly higher than that for the DAB model (HR: 1.008; 95% CI: 1.006-1.009; P < .001). Each model added significant prognostic value in addition to recognized clinical prognostic factors, including nodal status, in multivariate analyses. Quantitative immunofluorescence, however, showed more accuracy with respect to overall residual risk assessment than the DAB model. The use of the IHC4 algorithm was validated on the TEAM trial for predicting residual risk in patients with breast cancer. These data support the use of the IHC4 algorithm clinically, but quantitative and standardized approaches need to be used.
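The survival-modeling step is standard; a minimal Python sketch of estimating a per-unit hazard ratio for a continuous score, assuming synthetic data and the lifelines library (not necessarily the software used in the study):

    import numpy as np
    import pandas as pd
    from lifelines import CoxPHFitter

    rng = np.random.default_rng(1)
    n = 500
    score = rng.uniform(0, 100, n)            # stand-in continuous risk score
    # Synthetic survival times whose hazard rises with the score (beta = 0.01)
    times = rng.exponential(scale=10.0 / np.exp(0.01 * score))
    df = pd.DataFrame({"score": score,
                       "time_years": np.minimum(times, 10.0),
                       "recurrence": (times < 10.0).astype(int)})  # censor at 10 y

    cph = CoxPHFitter().fit(df, duration_col="time_years", event_col="recurrence")
    print(cph.hazard_ratios_["score"])        # per-unit HR, close to exp(0.01) = 1.010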
Hard x-ray phase contrast microscopy - techniques and applications
NASA Astrophysics Data System (ADS)
Holzner, Christian
In 1918, Einstein provided the first description of the nature of the refractive index for X-rays, showing that phase contrast effects are significant. A century later, most x-ray microscopy and nearly all medical imaging remains based on absorption contrast, even though phase contrast offers orders of magnitude improvements in contrast and reduced radiation exposure at multi-keV x-ray energies. The work presented is concerned with developing practical and quantitative methods of phase contrast for x-ray microscopy. A theoretical framework for imaging in phase contrast is put forward; this is used to obtain quantitative images in a scanning microscope using a segmented detector, and to correct for artifacts in a commercial phase contrast x-ray nano-tomography system. The principle of reciprocity between scanning and full-field microscopes is then used to arrive at a novel solution: Zernike contrast in a scanning microscope. These approaches are compared on a theoretical and experimental basis in direct connection with applications using multi-keV x-ray microscopes at the Advanced Photon Source at Argonne National Laboratory. Phase contrast provides the best means to image mass and ultrastructure of light elements that mainly constitute biological matter, while stimulated x-ray fluorescence provides high sensitivity for studies of the distribution of heavier trace elements, such as metals. These approaches are combined in a complementary way to yield quantitative maps of elemental concentration from 2D images, with elements placed in their ultrastructural context. The combination of x-ray fluorescence and phase contrast poses an ideal match for routine, high resolution tomographic imaging of biological samples in the future. The presented techniques and demonstration experiments will help pave the way for this development.
NASA Astrophysics Data System (ADS)
Bhave, Ajay; Dessai, Suraje; Conway, Declan; Stainforth, David
2016-04-01
Deep uncertainty in future climate change and socio-economic conditions necessitates the use of assess-risk-of-policy approaches over predict-then-act approaches for adaptation decision making. Robust Decision Making (RDM) approaches embody this principle and help evaluate the ability of adaptation options to satisfy stakeholder preferences under wide-ranging future conditions. This study involves the simultaneous application of two RDM approaches, one qualitative and one quantitative, in the Cauvery River Basin in Karnataka (population ~23 million), India. The study aims to (a) determine robust water resources adaptation options for the 2030s and 2050s and (b) compare the usefulness of a qualitative stakeholder-driven approach with a quantitative modelling approach. A combination of climatic and socio-economic narratives was used to develop a large set of future scenarios. Climatic narratives were developed through structured expert elicitation with a group of experts on the Indian Summer Monsoon, and socio-economic narratives were developed to reflect potential future urban and agricultural water demand. In the qualitative RDM approach, a stakeholder workshop helped elicit key vulnerabilities, water resources adaptation options and performance criteria for evaluating options. During a second workshop, stakeholders discussed and evaluated adaptation options against the performance criteria for a large number of scenarios of climatic and socio-economic change in the basin. In the quantitative RDM approach, a Water Evaluation And Planning (WEAP) model was forced by precipitation and evapotranspiration data coherent with the climatic narratives, together with water demand data based on the socio-economic narratives. We find that, compared to business-as-usual conditions, options addressing urban water demand satisfy performance criteria across scenarios and provide co-benefits such as energy savings and reduced groundwater depletion, while options reducing agricultural water demand significantly affect downstream water availability. Water demand options demonstrate potential to improve environmental flow conditions and satisfy legal water supply requirements for downstream riparian states. On the other hand, currently planned large-scale infrastructural projects demonstrate reduced value in certain scenarios, illustrating the lock-in effects of large-scale infrastructure. From a methodological perspective, we find that while the stakeholder-driven approach revealed robust options in a resource-light manner and helped initiate much-needed interaction amongst stakeholders, the modelling approach provides complementary quantitative information. The study reveals robust adaptation options for this important basin and provides a strong methodological basis for future studies that support adaptation decision making.
Quantitative myocardial perfusion from static cardiac and dynamic arterial CT
NASA Astrophysics Data System (ADS)
Bindschadler, Michael; Branch, Kelley R.; Alessio, Adam M.
2018-05-01
Quantitative myocardial blood flow (MBF) estimation by dynamic contrast enhanced cardiac computed tomography (CT) requires multi-frame acquisition of contrast transit through the blood pool and myocardium to inform the arterial input and tissue response functions. Both the input and the tissue response functions for the entire myocardium are sampled with each acquisition. However, the long breath holds and frequent sampling can result in significant motion artifacts and relatively high radiation dose. To address these limitations, we propose and evaluate a new static cardiac and dynamic arterial (SCDA) quantitative MBF approach where (1) the input function is well sampled using either prediction from pre-scan timing bolus data or measurement from dynamic thin-slice ‘bolus tracking’ acquisitions, and (2) the whole-heart tissue response data are limited to one contrast enhanced CT acquisition. A perfusion model uses the dynamic arterial input function to generate a family of possible myocardial contrast enhancement curves corresponding to a range of MBF values. Combined with the timing of the single whole-heart acquisition, these curves generate a lookup table relating myocardial contrast enhancement to quantitative MBF. We tested the SCDA approach in 28 patients that underwent a full dynamic CT protocol both at rest and under vasodilator stress conditions. Using the measured input function plus a single (enhanced CT only) or double (enhanced plus contrast-free baseline CT) myocardial acquisition yielded MBF estimates with root mean square (RMS) errors of 1.2 ml/min/g and 0.35 ml/min/g, and radiation dose reductions of 90% and 83%, respectively. The prediction of the input function based on timing bolus data and the static acquisition had an RMS error of 26.0% relative to the measured input function, which led to MBF estimation errors more than threefold higher than those obtained with the measured input function. SCDA presents a new, simplified approach for quantitative perfusion imaging with an acquisition strategy offering substantial radiation dose and computational complexity savings over dynamic CT.
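The lookup-table idea can be sketched with a toy kinetic model (everything below is invented for illustration: the gamma-variate input function, the one-parameter impulse response, and the numbers):

    import numpy as np

    t = np.linspace(0, 40, 401)                     # s, densely sampled AIF
    aif = 400.0 * (t / 8.0) * np.exp(1 - t / 8.0)   # toy gamma-variate input (HU)

    def tissue_curve(mbf_ml_min_g, t, aif):
        """Toy tissue response: AIF convolved with a flow-scaled kernel."""
        flow = mbf_ml_min_g / 60.0                  # convert to 1/s units
        kernel = flow * np.exp(-flow * t)           # toy impulse response
        dt = t[1] - t[0]
        return np.convolve(aif, kernel)[: len(t)] * dt

    mbf_grid = np.linspace(0.3, 5.0, 100)           # candidate flows, mL/min/g
    t_acq = 20.0                                    # s, time of the static scan
    lookup = np.array([tissue_curve(m, t, aif)[np.searchsorted(t, t_acq)]
                       for m in mbf_grid])

    enhancement_observed = 55.0                     # HU in a myocardial voxel
    mbf_estimate = mbf_grid[np.argmin(np.abs(lookup - enhancement_observed))]
    print(f"MBF ~ {mbf_estimate:.2f} mL/min/g")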
Vikingsson, Svante; Dahlberg, Jan-Olof; Hansson, Johan; Höiom, Veronica; Gréen, Henrik
2017-06-01
Dabrafenib is an inhibitor of BRAF V600E used for treating metastatic melanoma, but a majority of patients experience adverse effects. Methods to measure the levels of dabrafenib and its major metabolites during treatment are needed to allow development of individualized dosing strategies to reduce the burden of such adverse events. In this study, an LC-MS/MS method capable of measuring dabrafenib quantitatively and six metabolites semi-quantitatively is presented. The method is fully validated with regard to dabrafenib in human plasma in the range 5-5000 ng/mL. The analytes were separated on a C18 column after protein precipitation and detected in positive electrospray ionization mode using a Xevo TQ triple quadrupole mass spectrometer. As no commercial reference standards are available, the calibration curve of dabrafenib was used for semi-quantification of the dabrafenib metabolites. Compared to earlier methods, the presented method represents a simpler and more cost-effective approach suitable for clinical studies. Graphical abstract: Combined multiple reaction monitoring transitions of dabrafenib and metabolites in a typical case sample.
Hansen, Matthew; O'Brien, Kerth; Meckler, Garth; Chang, Anna Marie; Guise, Jeanne-Marie
2016-07-01
Mixed methods research has significant potential to broaden the scope of emergency care research, and specifically emergency medical services investigation. Mixed methods studies involve the coordinated use of qualitative and quantitative research approaches to gain a fuller understanding of practice. By combining what is learnt from multiple methods, these approaches can help to characterise complex healthcare systems, identify the mechanisms of complex problems such as medical errors, and understand aspects of human interaction such as communication, behaviour and team performance. Mixed methods approaches may be particularly useful for out-of-hospital care researchers because care is provided in complex systems where equipment, interpersonal interactions, societal norms, environment and other factors influence patient outcomes. The overall objectives of this paper are to (1) introduce the fundamental concepts and approaches of mixed methods research and (2) describe the interrelation and complementary features of the quantitative and qualitative components of mixed methods studies using specific examples from the Children's Safety Initiative-Emergency Medical Services (CSI-EMS), a large National Institutes of Health-funded research project conducted in the USA. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decision-making context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provide a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
Observation of the immune response of cells and tissue through multimodal label-free microscopy
NASA Astrophysics Data System (ADS)
Pavillon, Nicolas; Smith, Nicholas I.
2017-02-01
We present applications of a label-free approach to assess the immune response based on the combination of interferometric microscopy and Raman spectroscopy, which makes it possible to simultaneously acquire morphological and molecular information of live cells. We employ this approach to derive statistical models for predicting the activation state of macrophage cells based both on morphological parameters extracted from the high-throughput full-field quantitative phase imaging, and on the molecular content information acquired through Raman spectroscopy. We also employ a system for 3D imaging based on coherence gating, enabling specific targeting of the Raman channel to structures of interest within tissue.
Customized Body Mapping to Facilitate the Ergonomic Design of Sportswear.
Cao, Mingliang; Li, Yi; Guo, Yueping; Yao, Lei; Pan, Zhigeng
2016-01-01
A successful high-performance sportswear design that considers human factors should result in a significant increase in thermal comfort and reduce energy loss. The authors describe a body-mapping approach that facilitates the effective ergonomic design of sportswear. Their general framework can be customized based on the functional requirements of various sports and sportswear, the desired combination and selection of mapping areas for the human body, and customized quantitative data distribution of target physiological indicators.
Spin structure of electron subbands in (110)-grown quantum wells
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nestoklon, M. O.; Tarasenko, S. A.; Jancu, J.-M.
We present the theory of the fine structure of electron states in symmetric and asymmetric zinc-blende-type quantum wells with the (110) crystallographic orientation. By combining the symmetry analysis, the sp3d5s* tight-binding method, and the envelope-function approach, we obtain a quantitative description of the in-plane wave vector, well width, and applied electric field dependencies of the zero-magnetic-field spin splitting of electron subbands and extract spin-orbit-coupling parameters.
Zhou, Xiaoying; Schoenung, Julie M
2009-12-15
Two quantitative indicators are most widely used to assess the extent of compliance of industrial facilities with environmental regulations: the quantity of hazardous waste generated and the amount of toxics released. These indicators, albeit useful for some environmental monitoring, fail to account for direct or indirect effects on human and environmental health, especially when aggregating the total quantity of releases for a facility or industry sector. Thus, there is a need for a more comprehensive approach that can prioritize a particular chemical (or industry sector) on the basis of its relevant environmental performance and impact on human health. Accordingly, the objective of the present study is to formulate an aggregation of tools that can simultaneously capture multiple effects and several environmental impact categories. This approach allows us to compare and combine results generated with the aid of select U.S.-based quantitative impact assessment tools, thereby supplementing compliance-based metrics such as data from the U.S. Toxic Release Inventory. A case study, which presents findings for the U.S. chemical manufacturing industry, is presented to illustrate the aggregation of these tools. Environmental impacts due to both upstream and manufacturing activities are also evaluated for each industry sector. The proposed combinatorial analysis allows for a more robust evaluation for rating and prioritizing the environmental impacts of industrial waste.
Monitoring cryptic amphibians and reptiles in a Florida state park.
Engeman, Richard M; Meshaka, Walter E; Severson, Robert; Severson, Mary Ann; Kaufman, Greg; Groninger, N Paige; Smith, Henry T
2016-04-01
We monitored cryptic herpetofauna at Savannas Preserve State Park, Florida, by combining artificial cover counts with a quantitative paradigm for constructing and calculating population indices. Weekly indices were calculated from two consecutive days of data collection each week for 7 months from mid-winter to mid-summer in three habitats. Seventeen species were observed at least once, and time trends using index values were followed for six species. Among these, abundance and seasonal pattern information were obtained for an exotic species (greenhouse frog) and a species identified by the Florida Committee on Rare and Endangered Plants and Animals as threatened (Florida scrub lizard). We identified winter as the optimal time in this area to monitor populations for conducting annual assessments. This combined observation and indexing approach could provide managers or researchers with an economical means to quantitatively index population trends for multiple cryptic herpetofauna species simultaneously. Using artificial cover to sample within a population indexing design can be generalized beyond monitoring herpetofauna. Other forms of artificial cover that can serve as observation stations include aquatic artificial substrates, artificial tree cavities, artificial reefs and other artificial aquatic structures, and artificial sea grass units, among many others. A wide range of taxa are suitable for population monitoring using artificial cover as observation stations in the approach we present, including insects, soil invertebrates, micro and macro aquatic invertebrates, fish, crustaceans, and small mammals.
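The indexing arithmetic itself is simple; a minimal Python sketch with invented counts (the published paradigm also supplies a mixed-model variance estimate, which a plain standard error only approximates):

    import numpy as np

    # counts[d, s]: animals of one species found under cover object s on day d
    counts = np.array([[2, 0, 1, 3, 0, 1],
                       [1, 1, 0, 2, 1, 0]])     # 2 days x 6 stations

    daily_means = counts.mean(axis=1)           # mean count per station, each day
    index = daily_means.mean()                  # weekly population index
    se = daily_means.std(ddof=1) / np.sqrt(len(daily_means))
    print(f"weekly index = {index:.2f} +/- {se:.2f}")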
Developing a Social, Cultural and Economic Report Card for a Regional Industrial Harbour
Pascoe, Sean; Tobin, Renae; Windle, Jill; Cannard, Toni; Marshall, Nadine; Kabir, Zobaidul; Flint, Nicole
2016-01-01
Report cards are increasingly used to provide ongoing snap-shots of progress towards specific ecosystem health goals, particularly in coastal regions where planners need to balance competing demands for coastal resources from a range of industries. While most previous report cards focus on the biophysical components of the system, there is a growing interest in including the social and economic implications of ecosystem management to provide a greater social-ecological system understanding. Such a report card was requested on the Gladstone Harbour area in central Queensland, Australia. Gladstone Harbour adjoins the southern Great Barrier Reef, and is also a major industrial and shipping port. Balancing social, economic and environmental interests is therefore of great concern to the regional managers. While environmental benchmarking procedures are well established within Australia (and elsewhere), a method for assessing social and economic performance of coastal management is generally lacking. The key aim of this study was to develop and pilot a system for the development of a report card relating to appropriate cultural, social and economic objectives. The approach developed uses a range of multicriteria decision analysis methods to assess and combine different qualitative and quantitative measures, including the use of Bayesian Belief Networks to combine the different measures and provide an overall quantitative score for each of the key management objectives. The approach developed is readily transferable for purposes of similar assessments in other regions. PMID:26839949
Pelgrom, K.; Stam, P.; Lindhout, P.
2008-01-01
In plants, several population types [F2, recombinant inbred lines, backcross inbred lines (BILs), etc.] are used for quantitative trait locus (QTL) analyses. However, dissection of the trait of interest and subsequent confirmation by introgression of QTLs for breeding purposes has not been as successful as predicted from theoretical calculations. More practical knowledge of different QTL mapping approaches is needed. In this study, we describe the detection and mapping of quantitative resistances to downy mildew in a set of 29 BILs of cultivated lettuce (L. sativa) containing genome segments introgressed from wild lettuce (L. saligna). Introgression regions that are associated with quantitative resistance are considered to harbor a QTL. Furthermore, we compare these results with those from an existing F2 population derived from the same parents. We identified six QTLs in our BIL approach compared to only three in the F2 approach, with two QTLs in common. We performed a simulation study based on our actual data to help us interpret these results. This revealed that two newly detected QTLs in the BILs had gone unnoticed in the F2, due to a combination of recessiveness of the trait and skewed segregation, causing a deficit of the wild species alleles. This study clearly illustrates the added value of extended genetic studies on two different population types (BILs and F2) to dissect complex genetic traits. PMID:18251002
Dynamics of driven flow with exclusion in graphenelike structures
NASA Astrophysics Data System (ADS)
Stinchcombe, R. B.; de Queiroz, S. L. A.
2015-05-01
We present a mean-field theory for the dynamics of driven flow with exclusion in graphenelike structures, and numerically check its predictions. We treat first a specific combination of bond transmissivity rates, where mean field predicts, and numerics to a large extent confirms, that the sublattice structure characteristic of honeycomb networks becomes irrelevant. Dynamics, in the various regions of the phase diagram set by open boundary injection and ejection rates, is then in general identical to that of one-dimensional systems, although some discrepancies remain between mean-field theory and numerical results, in similar ways for both geometries. However, at the critical point for which the characteristic exponent is z = 3/2 in one dimension, the mean-field value z = 2 is approached for very large systems with constant (finite) aspect ratio. We also treat a second combination of bond (and boundary) rates where, more typically, sublattice distinction persists. For the two rate combinations, in continuum or late-time limits, respectively, the coupled sets of mean-field dynamical equations become tractable with various techniques and give a two-band spectrum, gapless in the critical phase. While for the second rate combination quantitative discrepancies between mean-field theory and simulations increase for most properties and boundary rates investigated, theory still is qualitatively correct in general, and gives a fairly good quantitative account of features such as the late-time evolution of density profile differences from their steady-state values.
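For orientation, the one-dimensional mean-field equations for this class of models take a standard form (a sketch of the generic exclusion-process case with bulk hopping rate p, injection rate alpha, and ejection rate beta; the paper's honeycomb version couples two analogous sets of equations, one per sublattice, with bond-dependent transmissivities):

    \frac{d\rho_i}{dt} = p\,\rho_{i-1}(1-\rho_i) - p\,\rho_i(1-\rho_{i+1}), \qquad 1 < i < L,
    \frac{d\rho_1}{dt} = \alpha(1-\rho_1) - p\,\rho_1(1-\rho_2),
    \frac{d\rho_L}{dt} = p\,\rho_{L-1}(1-\rho_L) - \beta\,\rho_L.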
Cazes, Marie-Hélène
2006-09-01
The development of demographic studies in anthropology is directly linked to the success of population genetics. The anthropodemographic or anthropogenetic approach is thus underpinned by questions of genetics. While demographers focus on population dynamics and renewal in quantitative terms, population geneticists refer not to individuals but to the sets of genes carried by individuals in a population. Their aim is to detect the factors and processes which influence the genetic evolution of a group, i.e. which modify gene frequencies from one generation to the next. Among them are the factors which affect modes of reproduction. To illustrate the association of these three approaches (demographic, anthropological and genetic), I use here the example of matrimonial exchanges, which lie at the heart of the population renewal process, among the Dogon of Boni, a Malian ethnic group living in the southern Sahel. We can see how successive analyses, starting with endogamy at the macroscopic level and moving down to the individual with choice of spouse and preferential marriage, combining both quantitative and qualitative approaches, can be used to obtain a detailed description of matrimonial exchanges which sheds light upon and complements the three different viewpoints.
Comparison of two trajectory based models for locating particle sources for two rural New York sites
NASA Astrophysics Data System (ADS)
Zhou, Liming; Hopke, Philip K.; Liu, Wei
Two back-trajectory-based statistical models, simplified quantitative transport bias analysis (QTBA) and residence-time weighted concentrations (RTWC), have been compared for their capabilities of identifying likely locations of source emissions contributing to observed particle concentrations at Potsdam and Stockton, New York. QTBA attempts to take into account the distribution of concentrations around the directions of the back trajectories. In the full QTBA approach, deposition processes (wet and dry) are also considered; simplified QTBA omits deposition. It is best used with multiple-site data. Similarly, the RTWC approach uses concentrations measured at different sites along with the back trajectories to distribute the concentration contributions across the spatial domain of the trajectories. In this study, these models are used in combination with the source contribution values obtained by a previous positive matrix factorization analysis of particle composition data from Potsdam and Stockton. The six sources common to the two sites (sulfate, soil, zinc smelter, nitrate, wood smoke, and copper smelter) were analyzed. The results of the two methods are consistent and locate large and clearly defined sources well. The RTWC approach can find more minor sources but may also give unrealistic estimates of source locations.
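The RTWC construction itself is a short computation; a minimal Python sketch with toy trajectory data (each back-trajectory endpoint is taken to represent an equal residence time in its grid cell):

    import numpy as np

    n_lat, n_lon = 50, 80
    weighted_sum = np.zeros((n_lat, n_lon))
    endpoint_count = np.zeros((n_lat, n_lon))

    # Toy input: (receptor concentration, grid cells crossed by its back trajectory)
    trajectories = [
        (5.2, [(10, 20), (11, 21), (12, 22)]),
        (1.1, [(11, 21), (12, 23)]),
    ]

    for conc, cells in trajectories:
        for i, j in cells:
            weighted_sum[i, j] += conc
            endpoint_count[i, j] += 1.0

    with np.errstate(invalid="ignore"):
        rtwc = weighted_sum / endpoint_count    # NaN where no trajectory passed
    print(rtwc[11, 21])                         # cell seen by both trajectories: 3.15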
Fournier-Level, Alexandre; Le Cunff, Loïc; Gomez, Camila; Doligez, Agnès; Ageorges, Agnès; Roux, Catherine; Bertrand, Yves; Souquet, Jean-Marc; Cheynier, Véronique; This, Patrice
2009-11-01
The combination of QTL mapping studies of synthetic lines and association mapping studies of natural diversity represents an opportunity to throw light on the genetically based variation of quantitative traits. With the positional information provided through quantitative trait locus (QTL) mapping, which often leads to wide intervals encompassing numerous genes, it is now feasible to directly target candidate genes that are likely to be responsible for the observed variation in completely sequenced genomes and to test their effects through association genetics. This approach was performed in grape, a newly sequenced genome, to decipher the genetic architecture of anthocyanin content. Grapes may be either white or colored, ranging from the lightest pink to the darkest purple tones according to the amount of anthocyanin accumulated in the berry skin, which is a crucial trait for both wine quality and human nutrition. Although the determinism of the white phenotype has been fully identified, the genetic bases of the quantitative variation of anthocyanin content in berry skin remain unclear. A single QTL responsible for up to 62% of the variation in anthocyanin content was mapped on a Syrah x Grenache F1 pseudo-testcross. Among the 68 unigenes identified in the grape genome within the QTL interval, a cluster of four Myb-type genes was selected on the basis of physiological evidence (VvMybA1, VvMybA2, VvMybA3, and VvMybA4). From a core collection of natural resources (141 individuals), 32 polymorphisms revealed significant association, and extended linkage disequilibrium was observed. Using a multivariate regression method, we demonstrated that five polymorphisms in the VvMybA genes other than VvMybA4 (one retrotransposon, three single nucleotide polymorphisms, and one 2-bp insertion/deletion) accounted for 84% of the observed variation. All these polymorphisms led to either structural changes in the MYB proteins or differences in the VvMybA promoters. We concluded that the continuous variation in anthocyanin content in grape was explained mainly by a single cluster of three VvMybA genes. The use of natural diversity helped to reduce one QTL to a set of five quantitative trait nucleotides and gave a clear picture of how these genes combine their effects to shape grape color. Such analysis also illustrates how isogenes combine their effects to shape a complex quantitative trait and enables the definition of markers directly targeted for upcoming breeding programs.
Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression.
Henn, Mark-Alexander; Silver, Richard M; Villarrubia, John S; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M; Ming, Bin; Vladár, András E
2015-01-01
Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ² function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges.
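The correlated-error objective the abstract alludes to is generalized least squares; a minimal Python sketch with toy numbers, where a shared systematic term makes the covariance non-diagonal:

    import numpy as np

    def chi2(residuals, cov):
        """chi^2 = r^T C^{-1} r, with r = y_measured - y_model(parameters)."""
        return residuals @ np.linalg.solve(cov, residuals)

    # Toy example: 4 measurements with 2% uncorrelated noise plus a fully
    # correlated 1% systematic error shared across one tool's channels
    y = np.array([1.02, 0.98, 1.01, 0.99])
    model = np.ones(4)
    cov = np.diag((0.02 * y) ** 2) + np.outer(0.01 * y, 0.01 * y)
    print(f"chi^2 = {chi2(y - model, cov):.2f}")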
Azimzadeh, Omid; Scherthan, Harry; Yentrapalli, Ramesh; Barjaktarovic, Zarko; Ueffing, Marius; Conrad, Marcus; Neff, Frauke; Calzada-Wack, Julia; Aubele, Michaela; Buske, Christian; Atkinson, Michael J; Hauck, Stefanie M; Tapio, Soile
2012-04-18
Qualitative proteome profiling of formalin-fixed, paraffin-embedded (FFPE) tissue is advancing the field of clinical proteomics. However, quantitative proteome analysis of FFPE tissue is hampered by the lack of an efficient labelling method. The use of conventional protein labelling on FFPE tissue has turned out to be inefficient, as classical labelling targets lysine residues that are blocked by the formalin treatment. The aim of this study was to establish a quantitative proteomics analysis of FFPE tissue by combining the label-free approach with optimised protein extraction and separation conditions. As a model system, we used FFPE heart tissue from control C57BL/6 mice and from mice exposed to total body irradiation at a gamma-ray dose of 3 gray. We identified 32 deregulated proteins (p≤0.05) in irradiated hearts 24 h after the exposure. The proteomics data were further evaluated and validated by bioinformatics and immunoblotting. In good agreement with our previous results using fresh-frozen tissue, the analysis indicated radiation-induced alterations in three main biological pathways: the respiratory chain, lipid metabolism, and pyruvate metabolism. The label-free approach enables the quantitative measurement of radiation-induced alterations in FFPE tissue and facilitates retrospective biomarker identification using clinical archives. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Uras-Aytemiz, Nevin; Abrrey Monreal, I.; Devlin, J. Paul
2011-10-01
A simple method has been developed for the measurement of high quality FTIR spectra of aerosols of gas-hydrate nanoparticles. This method enables quantitative observation of gas hydrates that form on subsecond timescales using our all-vapor approach, which relies on an ether catalyst rather than high pressures to promote hydrate formation. The sampling method is versatile, allowing routine studies at temperatures ranging from 120 to 210 K of either a single gas or the competitive uptake of different gas molecules in the small cages of the hydrates. The present study emphasizes hydrate aerosols formed by pulsing vapor mixtures into a cold chamber held at 160 or 180 K, with aerosol spectra from 6 scans recorded on average 8 s after "instantaneous" hydrate formation, as well as spectra of the gas hydrates as they evolve with time. Quantitative aerosol data are reported and analyzed for single small-cage guests and for mixed hydrates of CO2, CH4, C2H2, N2O, N2, and air. The approach, combined with the instant formation of gas hydrates from vapors only, offers promise for optimizing methods for the formation and control of gas hydrates.
Ullmann-Zeunert, Lynn; Muck, Alexander; Wielsch, Natalie; Hufsky, Franziska; Stanton, Mariana A; Bartram, Stefan; Böcker, Sebastian; Baldwin, Ian T; Groten, Karin; Svatoš, Aleš
2012-10-05
Herbivory leads to changes in the allocation of nitrogen among different pools and tissues; however, a detailed quantitative analysis of these changes has been lacking. Here, we demonstrate that a mass spectrometric data-independent acquisition approach known as LC-MS(E), combined with a novel algorithm to quantify heavy-atom enrichment in peptides, is able to quantify elicited changes in protein amounts and (15)N flux in a high-throughput manner. Reliable identification and quantitation of rabbit phosphorylase b protein spiked into leaf protein extract was achieved. The linear dynamic range, the reproducibility of technical and biological replicates, and the differences between measured and expected (15)N-incorporation into the small (SSU) and large (LSU) subunits of ribulose-1,5-bisphosphate-carboxylase/oxygenase (RuBisCO) and RuBisCO activase 2 (RCA2) of Nicotiana attenuata plants grown in hydroponic culture at different known concentrations of (15)N-labeled nitrate were used to further evaluate the procedure. The utility of the method for whole-plant studies in ecologically realistic contexts was demonstrated by using (15)N-pulse protocols on plants growing in soil under unknown (15)N-incorporation levels. Additionally, we quantified the amounts of lipoxygenase 2 (LOX2) protein, an enzyme important in antiherbivore defense responses, demonstrating that the approach allows for in-depth quantitative proteomics and (15)N flux analyses of the metabolic dynamics elicited during plant-herbivore interactions.
Cowan, Dallas M; Maskrey, Joshua R; Fung, Ernest S; Woods, Tyler A; Stabryla, Lisa M; Scott, Paul K; Finley, Brent L
2016-07-01
Alcohol concentrations in biological matrices offer information regarding an individual's intoxication level at a given time. In forensic cases, the alcohol concentration in the blood (BAC) at the time of death is sometimes used interchangeably with the BAC measured post-mortem, without consideration of alcohol concentration changes in the body after death. However, post-mortem factors must be taken into account for accurate forensic determination of the BAC prior to death in order to avoid incorrect conclusions. The main objective of this work was to describe best practices for relating ante-mortem and post-mortem alcohol concentrations, using a combination of modeling, empirical data, and other qualitative considerations. The Widmark modeling approach is a best-practices method for superimposing multiple alcohol doses ingested at various times, with alcohol elimination rates adjusted for individual body factors. We combined the selected ante-mortem model with a suggested approach for roughly estimating post-mortem changes in BAC, and then analyzed the available data on post-mortem alcohol production in human bodies and on potential markers of alcohol production through decomposition and putrefaction. Hypothetical cases illustrate best-practice approaches for determining alcohol concentrations in biological matrices ante-mortem, as well as potential issues encountered with quantitative post-mortem approaches. This study provides information for standardizing BAC determination in forensic toxicology, while minimizing real-world case uncertainties. Copyright © 2016 Elsevier Inc. All rights reserved.
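A minimal sketch of the Widmark superposition idea described above, assuming hypothetical doses and typical textbook body factors (r of roughly 0.68 for men, beta of roughly 0.15 g/kg/h); elimination is approximated as zero-order from the first dose onward, which glosses over absorption kinetics.

    def bac_widmark(t_h, doses, weight_kg, r=0.68, beta=0.15):
        """doses: list of (grams_ethanol, ingestion_time_h) pairs.
        Returns BAC in g/kg (per mille); zero-order elimination at rate beta."""
        absorbed = sum(a for a, t0 in doses if t_h >= t0)
        c = absorbed / (r * weight_kg)          # Widmark: C0 = A / (r * W)
        t_first = min(t0 for a, t0 in doses)
        return max(c - beta * max(t_h - t_first, 0.0), 0.0)

    # e.g., two 14 g drinks at t = 0 h and t = 1 h, 70 kg person
    print(f"BAC at 3 h: {bac_widmark(3.0, [(14, 0.0), (14, 1.0)], 70):.2f} g/kg")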
A hybrid agent-based approach for modeling microbiological systems.
Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing
2008-11-21
Models for systems biology commonly adopt differential-equation or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach in which biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10(3) cells and 1.2x10(6) molecules. The model produces cell migration patterns that are comparable to laboratory observations.
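The hybrid scheme can be caricatured in a few lines: a continuous field (molecule quantities) is updated by a discretized diffusion-decay equation, while discrete cells (agents) follow an if-then rule biased up the local gradient. Domain size, rates, and step counts below are arbitrary placeholders, not the paper's chemotaxis model.

    import numpy as np

    rng = np.random.default_rng(1)
    L, steps = 30, 1000
    field = np.zeros(L + 1)                  # quantity layer: attractant per site
    cells = rng.uniform(0, L, size=50)       # agent layer: one position per cell

    for _ in range(steps):
        field[L] += 1.0                      # secretion at the x = L boundary
        field[1:-1] += 0.25 * (field[2:] + field[:-2] - 2 * field[1:-1])  # diffusion
        field *= 0.995                       # first-order decay
        idx = cells.astype(int).clip(0, L - 1)
        grad = field[idx + 1] - field[idx]   # each agent senses its local gradient
        cells += 0.05 * np.sign(grad) + rng.normal(0, 0.1, cells.size)
        cells = cells.clip(0, L)

    print(f"mean cell position: {cells.mean():.1f} (attractant source at x = {L})")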
NASA Astrophysics Data System (ADS)
Seidel, H.; Blüthner, R.; Hinz, B.; Schust, M.
1998-08-01
The guidance on the effects of vibration on health in standards for whole-body vibration (WBV) does not provide quantitative relationships between WBV and health risk. This paper aims at elucidating exposure-response relationships. An analysis of published data on the static and dynamic strength of vertebrae and bone, loaded at various frequencies under different conditions, provided the basis for a theoretical approach to evaluating repetitive loads on the lumbar spine (“internal loads”). The approach enabled the calculation of combinations of amplitudes and numbers of internal stress cycles that are “equivalent” with respect to cumulative fatigue failure. In order to discover the relation between external peak accelerations at the seat and internal peak loads, biodynamic data from experiments (36 subjects, three somatotypes, two different postures, relaxed and bent forward; random WBV, a(w) r.m.s. 1.4 m s(-2), containing high transients) were used as input to a biomechanical model. Internal pressure changes were calculated using individual areas of vertebral endplates. The assessment of WBV was based on the quantitative relations between peak accelerations at the seat and pressures predicted for the disk L5/S1. For identical exposures, clearly higher rates of pressure rise were predicted for the bent-forward posture than for the relaxed posture. The risk assessment for internal forces considered the combined internal static and dynamic loads, in relation to the predicted individual strength, and Miner's hypothesis. For exposure durations between 1 min and 8 h, energy-equivalent vibration magnitudes (formula B.1, ISO 2631-1, 1997) and equivalent vibration magnitudes according to formula B.2 (time dependence over-energetic) were compared with equivalent combinations of upward peak accelerations and exposure durations according to predicted cumulative fatigue failures of lumbar vertebrae. Formula B.1 seems to underestimate the health risk caused by high magnitudes; formula B.2 is recommended for the evaluation of such conditions.
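The cumulative-fatigue step rests on Miner's hypothesis: damage fractions n_i/N_i contributed by each load block add linearly, and failure is predicted when the sum reaches 1. A sketch with an invented power-law S-N curve; the A and m values are placeholders, not the vertebral-strength data used in the paper.

    def cycles_to_failure(stress_mpa, A=1e12, m=6.0):
        """Hypothetical power-law S-N curve: N = A * S**-m."""
        return A * stress_mpa ** -m

    def miner_damage(load_blocks):
        """load_blocks: iterable of (stress_mpa, n_cycles) pairs."""
        return sum(n / cycles_to_failure(s) for s, n in load_blocks)

    D = miner_damage([(4.0, 2e6), (6.0, 5e5), (8.0, 1e5)])
    print(f"cumulative damage D = {D:.3f} (failure predicted at D >= 1)")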
Kazakiewicz, Denis; Karr, Jonathan R; Langner, Karol M; Plewczynski, Dariusz
2015-12-01
Bacteria are increasingly resistant to existing antibiotics, which target a narrow range of pathways. New methods are needed to identify targets, including repositioning targets among distantly related species. We developed a novel combination of systems and structural modeling and bioinformatics to reposition known antibiotics and targets to new species. We applied this approach to Mycoplasma genitalium, a common cause of urethritis. First, we used quantitative metabolic modeling to identify enzymes whose expression affects the cellular growth rate. Second, we searched the literature for inhibitors of homologs of the most fragile enzymes. Next, we used sequence alignment to verify that the binding site is conserved in M. genitalium but not in humans. Lastly, we used molecular docking to verify that the reported inhibitors preferentially interact with M. genitalium proteins over their human homologs. Thymidylate kinase was the top predicted target and piperidinylthymines were the top compounds. Further work is needed to experimentally validate piperidinylthymines. In summary, combined systems and structural modeling is a powerful tool for drug repositioning. Copyright © 2015 Elsevier Ltd. All rights reserved.
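The first step, identifying "fragile" enzymes from a metabolic model, can be sketched as a toy flux-balance problem: maximize a biomass flux subject to steady-state mass balance, then re-solve with each enzyme's flux forced to zero. The two-metabolite network below is invented and far smaller than a genome-scale M. genitalium model.

    import numpy as np
    from scipy.optimize import linprog

    # columns: uptake, enzyme1, enzyme2, biomass; rows: metabolites A, B
    S = np.array([[1, -1, -1,  0],
                  [0,  1,  1, -1]])
    bounds = [(0, 10), (0, 8), (0, 8), (0, None)]
    base = -linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=bounds).fun

    for j, name in [(1, "enzyme1"), (2, "enzyme2")]:
        kb = list(bounds)
        kb[j] = (0, 0)                        # in silico knockout
        ko = linprog(c=[0, 0, 0, -1], A_eq=S, b_eq=[0, 0], bounds=kb)
        print(f"{name} knockout: growth at {-ko.fun / base:.2f} of wild type")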
Barišić, Josip; Filipović Marijić, Vlatka; Mijošek, Tatjana; Čož-Rakovac, Rozelindra; Dragun, Zrinka; Krasnići, Nesrete; Ivanković, Dušica; Kružlicová, Dáša; Erk, Marijana
2018-06-14
In the present study, a novel histopathological approach using the fish intestine as a sensitive bioindicator organ of pollution impact in freshwater ecosystems was proposed. Histopathological alterations were compared between native brown trout (Salmo trutta Linnaeus, 1758) from a reference location (the Krka River spring) and a pollution-impacted location (influenced by technological/municipal wastewaters and agricultural runoff near the town of Knin) on the karst Krka River in Croatia. In brown trout from both locations, severe parasitic infestation with the acanthocephalan species Dentitruncus trutae was found, enabling evaluation of the histopathology of acanthocephalan infestation, which indicated a parasite-induced tissue reaction in the form of inflammatory, necrotic, and hyperplastic responses extending throughout the lamina epithelialis mucosae, lamina propria, and lamina muscularis mucosae. A new semi-quantitative histological approach was proposed in which alterations are classified into three reaction patterns: control tissue appearance, moderate (progressive) tissue impairment, and severe (regressive and inflammatory) tissue damage. The most frequent progressive alteration was hyperplasia of the epithelium at the reference site, whereas the most frequent regressive alterations were atrophy and necrosis, seen at the polluted site. Furthermore, the histopathological approach was combined with micromorphological and macromorphological assessment as an additional indicator of pollution impact. Among the 15 intestinal measures observed, two biomarkers of intestinal tissue damage proved significant, the height of the supranuclear space (hSN) and the number of mucous cells per 100 μm of intestinal mucosal fold (nM), both of which were significantly lower in fish from the polluted area than in those from the reference site. The results indicated that a combined histological and morphological approach applied to fish intestinal tissue might be used as a valuable biological tool for assessing pollution impact on aquatic organisms. Therefore, semi-quantitative scoring and multiparametric morphological assessment of the magnitude of intestinal tissue lesions should become a common approach for handling environmental pollution impact. Copyright © 2018 Elsevier B.V. All rights reserved.
Computational Prediction of Metabolism: Sites, Products, SAR, P450 Enzyme Dynamics, and Mechanisms
2012-01-01
Metabolism of xenobiotics remains a central challenge for the discovery and development of drugs, cosmetics, nutritional supplements, and agrochemicals. Metabolic transformations are frequently related to the incidence of toxic effects that may result from the emergence of reactive species, the systemic accumulation of metabolites, or the induction of metabolic pathways. Experimental investigation of the metabolism of small organic molecules is particularly resource demanding; hence, computational methods are of considerable interest to complement experimental approaches. This review provides a broad overview of structure- and ligand-based computational methods for the prediction of xenobiotic metabolism. Current computational approaches to xenobiotic metabolism are discussed from three major perspectives: (i) prediction of sites of metabolism (SOMs), (ii) elucidation of potential metabolites and their chemical structures, and (iii) prediction of direct and indirect effects of xenobiotics on metabolizing enzymes, with a focus on the cytochrome P450 (CYP) superfamily, the cardinal xenobiotic-metabolizing enzymes. For each of these domains, a variety of approaches and their applications are systematically reviewed, including expert systems, data mining approaches, quantitative structure–activity relationships (QSARs), machine learning-based methods, pharmacophore-based algorithms, shape-focused techniques, molecular interaction fields (MIFs), reactivity-focused techniques, protein–ligand docking, molecular dynamics (MD) simulations, and combinations of methods. Predictive metabolism is a developing area, and there is still enormous potential for improvement. However, it is clear that the combination of rapidly increasing amounts of available ligand- and structure-related experimental data (in particular, quantitative data) with novel and diverse simulation and modeling approaches is accelerating the development of effective tools for the prediction of in vivo metabolism, as reflected by the diverse and comprehensive data sources and methods reviewed here. This review attempts to survey the range and scope of computational methods applied to metabolism prediction and to compare and contrast their applicability and performance. PMID:22339582
Less label, more free: approaches in label-free quantitative mass spectrometry.
Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A
2011-02-01
In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
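As one concrete example of the spectral-counting family reviewed here, the normalized spectral abundance factor (NSAF) divides each protein's spectral count by its length and renormalizes across the sample; the counts and lengths below are hypothetical.

    counts  = {"proteinA": 52, "proteinB": 8, "proteinC": 21}      # spectral counts
    lengths = {"proteinA": 410, "proteinB": 120, "proteinC": 300}  # protein lengths

    saf = {p: counts[p] / lengths[p] for p in counts}   # length-corrected counts
    total = sum(saf.values())
    nsaf = {p: round(v / total, 3) for p, v in saf.items()}
    print(nsaf)   # relative abundance estimates that sum to 1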
Sun, Jimeng; Hu, Jianying; Luo, Dijun; Markatou, Marianthi; Wang, Fei; Edabollahi, Shahram; Steinhubl, Steven E.; Daar, Zahra; Stewart, Walter F.
2012-01-01
Background: The ability to identify the risk factors related to an adverse condition, e.g., heart failure (HF) diagnosis, is very important for improving care quality and reducing cost. Existing approaches for risk factor identification are either knowledge driven (from guidelines or literature) or data driven (from observational data). No existing method provides a model to effectively combine expert knowledge with data-driven insight for risk factor identification. Methods: We present a systematic approach to enhance known knowledge-based risk factors with additional potential risk factors derived from data. The core of our approach is a sparse regression model with regularization terms that correspond to both knowledge- and data-driven risk factors. Results: The approach is validated using a large dataset containing 4,644 heart failure cases and 45,981 controls. The outpatient electronic health records (EHRs) for these patients include diagnoses, medications, and lab results from 2003 to 2010. We demonstrate that the proposed method can identify complementary risk factors that are not among the existing known factors and can better predict the onset of HF. We quantitatively compare different sets of risk factors in the context of predicting the onset of HF using the area under the ROC curve (AUC) as the performance metric. The combined knowledge and data risk factors significantly outperform knowledge-based risk factors alone. Furthermore, the additional risk factors were confirmed to be clinically meaningful by a cardiologist. Conclusion: We present a systematic framework for combining knowledge- and data-driven insights for risk factor identification. We demonstrate the power of this framework in the context of predicting the onset of HF, where our approach successfully identifies intuitive and predictive risk factors beyond a set of known HF risk factors. PMID:23304365
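The core model can be approximated with off-the-shelf tools: a weighted L1 penalty that shrinks data-driven candidate features more aggressively than known risk factors, implemented via the standard column-rescaling trick. The data, weights, and regularization strength below are invented, and the authors' exact regularizer may differ from this sketch.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n, p_known, p_cand = 1000, 5, 50
    X = rng.normal(size=(n, p_known + p_cand))
    logit = X[:, 0] + 0.8 * X[:, 1] + 0.6 * X[:, p_known + 3]  # hidden data-driven factor
    y = (logit + rng.logistic(size=n) > 0).astype(int)

    w = np.r_[np.full(p_known, 0.2), np.ones(p_cand)]  # lighter penalty on known factors
    model = LogisticRegression(penalty="l1", C=0.5, solver="liblinear")
    model.fit(X / w, y)                    # column rescaling == per-feature penalty w
    coef = model.coef_.ravel() / w         # map back to the original scale
    print("candidates selected:", np.flatnonzero(np.abs(coef[p_known:]) > 1e-8))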
Stochasticity in the signalling network of a model microbe
NASA Astrophysics Data System (ADS)
Bischofs, Ilka; Foley, Jonathan; Battenberg, Eric; Fontaine-Bodin, Lisa; Price, Gavin; Wolf, Denise; Arkin, Adam
2007-03-01
The soil-dwelling bacterium Bacillus subtilis is an excellent model organism for studying stochastic stress response induction in an isoclonal population. Subjected to the same stressor, cells undergo different fates, including sporulation, competence, degradative enzyme synthesis, and motility. For example, under conditions of nutrient deprivation and high cell density, only a portion of the cell population forms an endospore. Here we use a combined experimental and theoretical approach to study stochastic sporulation induction in Bacillus subtilis. Using several fluorescent reporter strains, we apply time-lapse fluorescence microscopy in combination with quantitative image analysis to study cell fate progression on a single-cell basis and elucidate key noise generators in the underlying cellular network.
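Stochastic induction of this kind is typically simulated with Gillespie's algorithm; the sketch below runs it on a minimal birth-death expression module with invented rates, illustrating the copy-number noise such networks generate (it is not the actual sporulation circuit).

    import numpy as np

    rng = np.random.default_rng(2)
    k_syn, k_deg, t_end = 5.0, 0.1, 200.0   # invented synthesis/degradation rates
    t, x = 0.0, 0
    while t < t_end:
        rates = (k_syn, k_deg * x)
        total = rates[0] + rates[1]
        t += rng.exponential(1.0 / total)   # waiting time to the next reaction
        if rng.uniform() < rates[0] / total:
            x += 1                          # synthesis event
        else:
            x -= 1                          # degradation event
    print(f"copy number at t_end: {x}; steady-state mean = k_syn/k_deg = {k_syn/k_deg:.0f}")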
Lin, Yang-Cheng; Yeh, Chung-Hsing; Wang, Chen-Cheng; Wei, Chun-Chun
2012-01-01
How to design highly reputable and hot-selling products is an essential issue in product design. Whether consumers choose a product depends largely on their perception of the product image. The consumer-oriented design approach presented in this paper helps product designers incorporate consumers' perceptions of product forms into the design process. The approach uses quantification theory type I (QTTI), grey prediction (a linear modeling technique), and neural networks (a nonlinear modeling technique) to determine the optimal form combination of product design for matching a given product image. An experimental study based on the concept of Kansei Engineering is conducted to collect numerical data for examining the relationship between consumers' perception of product image and the product form elements of personal digital assistants (PDAs). The performance comparison shows that the QTTI model is good enough to help product designers determine the optimal form combination of product design. Although the PDA form design is used as a case study, the approach is applicable to other consumer products with various design elements and product images. The approach provides an effective mechanism for facilitating the consumer-oriented product design process.
An overview of quantitative approaches in Gestalt perception.
Jäkel, Frank; Singh, Manish; Wichmann, Felix A; Herzog, Michael H
2016-09-01
Gestalt psychology is often criticized as lacking quantitative measurements and precise mathematical models. While this is true of the early Gestalt school, today there are many quantitative approaches in Gestalt perception and the special issue of Vision Research "Quantitative Approaches in Gestalt Perception" showcases the current state-of-the-art. In this article we give an overview of these current approaches. For example, ideal observer models are one of the standard quantitative tools in vision research and there is a clear trend to try and apply this tool to Gestalt perception and thereby integrate Gestalt perception into mainstream vision research. More generally, Bayesian models, long popular in other areas of vision research, are increasingly being employed to model perceptual grouping as well. Thus, although experimental and theoretical approaches to Gestalt perception remain quite diverse, we are hopeful that these quantitative trends will pave the way for a unified theory. Copyright © 2016 Elsevier Ltd. All rights reserved.
Computational Biochemistry-Enzyme Mechanisms Explored.
Culka, Martin; Gisdon, Florian J; Ullmann, G Matthias
2017-01-01
Understanding enzyme mechanisms is a major task in comprehending how living cells work. Recent advances in biomolecular research provide huge amounts of data on enzyme kinetics and structure. The analysis of diverse experimental results and their combination into an overall picture is, however, often challenging. Microscopic details of enzymatic processes are often anticipated based on several hints from macroscopic experimental data. Computational biochemistry aims at creating a computational model of an enzyme in order to explain the microscopic details of the catalytic process and to reproduce or predict macroscopic experimental findings. The results of such computations are in part complementary to experimental data and provide an explanation of a biochemical process at the microscopic level. In order to evaluate the mechanism of an enzyme, a structural model is constructed that can be analyzed by several theoretical approaches. Several simulation methods can and should be combined to get a reliable picture of the process of interest. Furthermore, abstract models of biological systems can be constructed by combining computational and experimental data. In this review, we discuss structural computational models of enzymatic systems. We first discuss various models for simulating enzyme catalysis. Furthermore, we review various approaches to characterizing enzyme mechanisms both qualitatively and quantitatively using different modeling approaches. © 2017 Elsevier Inc. All rights reserved.
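One common quantitative bridge between such computations and kinetics experiments is the Eyring equation, which converts a computed activation free energy into a rate constant; the barrier height below is a placeholder value, not taken from any particular enzyme.

    import math

    kB, h, R, T = 1.380649e-23, 6.62607015e-34, 8.314462618, 298.15
    dG_kcal = 15.0                              # hypothetical barrier in kcal/mol
    # Eyring: k = (kB*T/h) * exp(-dG_act / (R*T)), with dG_act in J/mol
    k = (kB * T / h) * math.exp(-dG_kcal * 4184.0 / (R * T))
    print(f"k(298 K) = {k:.2e} s^-1 for a {dG_kcal} kcal/mol barrier")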
Mátrai, Zoltán; Péley, Gábor; Kovács, Tibor; Rényi, Vámos Ferenc; Szívós, Edit; Szabó, Eszter; Vereczkey, Ildikó; Török, Klára; Köves, István
2006-01-29
Peritoneal carcinomatosis is a fatal diagnosis, associated with poor prognosis and quality of life. Survival is usually estimated in months. Traditionally, surgery for peritoneal carcinomatosis was indicated only as a palliative measure. Advances in tumour biology, cytoreductive surgery, and pharmacology have improved the approach to this condition. An aggressive combined approach to peritoneal surface malignancy involves peritonectomy and perioperative intraperitoneal hyperthermic chemotherapy. Cytoreductive surgery reduces carcinomatosis to microscopic residual disease so that intraperitoneal hyperthermic chemotherapy is able to eradicate the cancer. Hyperthermic chemotherapy enhances the cytotoxicity of the drugs and increases their penetration into the cancerous tissue. Careful patient selection is crucial for this multimodality approach. Quantitative prognostic indicators, such as the peritoneal cancer index and the completeness of cytoreduction score, are useful in the assessment of outcome. Cytoreductive surgery combined with intraperitoneal hyperthermic chemotherapy improves survival but is associated with significant morbidity and mortality. This review is based on a case report of a 22-year-old female patient who had peritoneal carcinomatosis of inflammatory myofibroblastic sarcoma and was treated by cytoreductive surgery and intraperitoneal hyperthermic chemotherapy at our department.
A versatile method for groundwater vulnerability projections in future scenarios.
Stevenazzi, Stefania; Bonfanti, Marianna; Masetti, Marco; Nghiem, Son V; Sorichetta, Alessandro
2017-02-01
Water scarcity and associated risks are serious societal problems. A major challenge for the future will be to ensure the short-term and long-term provision of accessible and safe freshwater to meet the needs of the rapidly growing human population and changes in land cover and land use, where conservation and protection play a key role. Through a Bayesian spatial statistical method, a time-dependent approach for groundwater vulnerability assessment is developed to account for both the recent status of groundwater contamination and its evolution, as required by the European Union (Groundwater Directive, 2006/118/EC). This approach combines natural and anthropogenic factors to identify areas with a critical combination of high levels and increasing trends of nitrate concentrations, together with a quantitative evaluation of how different future scenarios would impact the quality of groundwater resources in a given area. In particular, the proposed approach can determine potential impacts on groundwater resources if policies are maintained at the status quo or if new measures are implemented for safeguarding groundwater quality, as natural factors are changing under climatic or anthropogenic stresses. Copyright © 2016 Elsevier Ltd. All rights reserved.
Achleitner, Stefan; De Toffol, Sara; Engelhard, Carolina; Rauch, Wolfgang
2005-04-01
The European Water Framework Directive (WFD) is probably the most important environmental management directive enacted over the last decade in the European Union. The directive aims at achieving an overall good ecological status in all European water bodies. In this article, we discuss the implementation steps of the WFD and their implications for environmental engineering practice, focusing on rivers as the main receiving waters. Arising challenges for engineers and scientists are seen in the quantitative assessment of water quality, where standardized systems are needed to estimate the biological status. This is equally of concern in engineering planning, where the prediction of ecological impacts is required. Studies dealing with both classification and prediction of ecological water quality are reviewed. Further, the combined emission-water quality approach is discussed. The common understanding of this combined approach is to apply the more stringent of the water quality and emission standards in a given case. In contrast, the Austrian water act, for example, enables the application of only the water-quality-based approach, at least on a temporary basis.
Nonparametric regression applied to quantitative structure-activity relationships
Constans; Hirst
2000-03-01
Several nonparametric regressors have been applied to modeling quantitative structure-activity relationship (QSAR) data. The simplest regressor, the Nadaraya-Watson, was assessed in a genuine multivariate setting. Other regressors, the local linear and the shifted Nadaraya-Watson, were implemented within additive models, a computationally more expedient approach better suited for low-density designs. Performance was benchmarked against the nonlinear method of smoothing splines. A linear reference point was provided by multilinear regression (MLR). Variable selection was explored using systematic combinations of different variables and combinations of principal components. For the data set examined, 47 inhibitors of dopamine beta-hydroxylase, the additive nonparametric regressors have greater predictive accuracy (as measured by the mean absolute error of the predictions or the Pearson correlation in cross-validation trials) than MLR. The use of principal components did not improve the performance of the nonparametric regressors over use of the original descriptors, since the original descriptors are not strongly correlated. It remains to be seen if the nonparametric regressors can be successfully coupled with better variable selection and dimensionality reduction in the context of high-dimensional QSARs.
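For reference, the Nadaraya-Watson estimator is a kernel-weighted local average; below is a minimal multivariate version on synthetic descriptor data (the 47-compound size mirrors the study, but the descriptors, activities, and bandwidth are invented).

    import numpy as np

    def nadaraya_watson(X_train, y_train, X_query, bandwidth=1.0):
        d2 = ((X_query[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
        w = np.exp(-0.5 * d2 / bandwidth ** 2)    # Gaussian kernel weights
        return (w @ y_train) / w.sum(axis=1)      # locally weighted average

    rng = np.random.default_rng(3)
    X = rng.uniform(-2, 2, size=(47, 4))             # 47 compounds, 4 descriptors
    y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=47)  # invented activity values
    print(nadaraya_watson(X, y, X[:5], bandwidth=0.8))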
White-light diffraction phase microscopy at doubled space-bandwidth product.
Shan, Mingguang; Kandel, Mikhail E; Majeed, Hassaan; Nastasa, Viorel; Popescu, Gabriel
2016-12-12
White-light diffraction phase microscopy (wDPM) is a quantitative phase imaging method that benefits from both temporal and spatial phase sensitivity, granted, respectively, by the common-path geometry and white light illumination. However, like all off-axis quantitative phase imaging methods, wDPM is characterized by a reduced space-bandwidth product compared to phase-shifting approaches. This happens essentially because the ultimate resolution of the image is governed by the period of the interferogram and not just the diffraction limit. As a result, off-axis techniques generate single-shot, i.e., high time-bandwidth, phase measurements at the expense of either spatial resolution or field of view. Here, we show that by combining phase-shifting and off-axis acquisition, the original space-bandwidth product is preserved. Specifically, we developed phase-shifting diffraction phase microscopy with white light, in which we measure and combine two phase-shifted interferograms. Due to the white light illumination, the phase images are characterized by low spatial noise, i.e., <1 nm pathlength. We illustrate the operation of the instrument with test samples, blood cells, and unlabeled prostate tissue biopsies.
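The phase-shifting principle can be illustrated with the textbook four-step algorithm; note this is a simplification, since the paper's wDPM variant combines only two white-light shifted interferograms. With frames shifted by multiples of pi/2, the wrapped phase follows from an arctangent of frame differences; the fringe pattern below is synthetic.

    import numpy as np

    x = np.linspace(0, 4 * np.pi, 512)
    phi_true = 0.5 * np.sin(x)                    # synthetic object phase
    a, b, k = 2.0, 1.0, 5.0                       # background, modulation, carrier
    I1, I2, I3, I4 = (a + b * np.cos(k * x + phi_true + d)
                      for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2))
    phi = np.unwrap(np.arctan2(I4 - I2, I1 - I3)) - k * x   # remove the carrier
    print(f"max reconstruction error: {np.abs(phi - phi_true).max():.1e} rad")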
Que, Emily L.; Bleher, Reiner; Duncan, Francesca E.; ...
2014-12-15
Fertilization of a mammalian egg induces a series of ‘zinc sparks’ that are necessary for inducing the egg-to-embryo transition. Despite the importance of these zinc efflux events, little is known about their origin. To understand the molecular mechanism of the zinc spark, we combined four physical approaches to resolve zinc distributions in single cells: a chemical probe for dynamic live-cell fluorescence imaging and a combination of scanning transmission electron microscopy with energy dispersive spectroscopy, X-ray fluorescence microscopy, and 3D elemental tomography for high resolution elemental mapping. Here we show that the zinc spark arises from a system of thousands of zinc-loaded vesicles, each of which contains, on average, 10(6) zinc atoms. These vesicles undergo dynamic movement during oocyte maturation and exocytosis at the time of fertilization. We conclude that the discovery of these vesicles and the demonstration that zinc sparks originate from them provides a quantitative framework for understanding how zinc fluxes regulate cellular processes.
Choice of Illumination System & Fluorophore for Multiplex Immunofluorescence on FFPE Tissue Sections
Kishen, Ria E. B.; Kluth, David C.; Bellamy, Christopher O. C.
2016-01-01
The recent availability of novel dyes and alternative light sources to facilitate complex tissue immunofluorescence studies such as multiplex labelling has not been matched by reports critically evaluating the considerations and relative benefits of these new tools, particularly in combination. Product information is often limited to wavelengths used for older fluorophores (FITC, TRITC & corresponding Alexa dyes family). Consequently, novel agents such as Quantum dots are not widely appreciated or used, despite highly favourable properties including extremely bright emission, stability and potentially reduced tissue autofluorescence at the excitation wavelength. Using spectral analysis, we report here a detailed critical appraisal and comparative evaluation of different light sources and fluorophores in multiplex immunofluorescence of clinical biopsy sections. The comparison includes mercury light, metal halide and 3 different LED-based systems, using 7 Qdots (525, 565, 585, 605, 625, 705), Cy3 and Cy5. We discuss the considerations relevant to achieving the best combination of light source and fluorophore for accurate multiplex fluorescence quantitation. We highlight practical limitations and confounders to quantitation with filter-based approaches. PMID:27632367
Methodological triangulation in a study of social support for siblings of children with cancer.
Murray, J S
1999-10-01
Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.
Hanna, Debra; Romero, Klaus; Schito, Marco
2017-03-01
The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.
Tyurin, Vladimir A.; Tyurina, Yulia Y.; Jung, Mi-Yeon; Tungekar, Muhammad A.; Wasserloos, Karla J.; Bayir, Hülya; Greenberger, Joel S.; Kochanek, Patrick M.; Shvedova, Anna A.; Pitt, Bruce; Kagan, Valerian E.
2009-01-01
Oxidation of two anionic phospholipids - cardiolipin (CL) in mitochondria and phosphatidylserine (PS) in extramitochondrial compartments - is an important signaling event, particularly during the execution of programmed cell death and clearance of apoptotic cells. Quantitative analysis of CL and PS oxidation products is central to understanding their molecular mechanisms of action. We combined the identification of diverse phospholipid molecular species by ESI-MS with quantitative assessments of lipid hydroperoxides using a fluorescence HPLC-based protocol. We characterized CL and PS oxidation products formed in a model system (cyt c/H2O2), in apoptotic cells (neurons, pulmonary artery endothelial cells), and in mouse lung under inflammatory/oxidative stress conditions (hyperoxia, inhalation of single-walled carbon nanotubes). Our results demonstrate the usefulness of this approach for quantitative assessments, identification of individual molecular species, and structural characterization of anionic phospholipids that undergo oxidative modification in cells and tissues. PMID:19328050
Novel cardiac magnetic resonance biomarkers: native T1 and extracellular volume myocardial mapping.
Cannaò, Paola Maria; Altabella, Luisa; Petrini, Marcello; Alì, Marco; Secchi, Francesco; Sardanelli, Francesco
2016-04-28
Cardiac magnetic resonance (CMR) is a non-invasive diagnostic tool playing a key role in the assessment of cardiac morphology and function as well as in tissue characterization. Late gadolinium enhancement is a fundamental CMR technique for detecting focal or regional abnormalities such as scar tissue, replacement fibrosis, or inflammation using qualitative, semi-quantitative, or quantitative methods; it does not, however, allow evaluation of the whole myocardium in the presence of diffuse disease. The novel T1 mapping approach permits a quantitative assessment of the entire myocardium, providing a voxel-by-voxel map of the native T1 relaxation time obtained before the intravenous administration of gadolinium-based contrast material. By combining T1 data obtained before and after contrast injection, it is also possible to calculate the voxel-by-voxel extracellular volume (ECV), yielding another myocardial parametric map. This article describes the technical challenges and clinical perspectives of these two novel CMR biomarkers: myocardial native T1 and ECV mapping.
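The ECV calculation itself is a short formula: the myocardial-to-blood ratio of contrast-induced changes in the relaxation rate R1 = 1/T1, scaled by one minus hematocrit. The T1 values below are hypothetical but within a physiological range.

    def ecv(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hct):
        """ECV = (1 - Hct) * deltaR1_myocardium / deltaR1_blood, with R1 = 1/T1."""
        d_r1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
        d_r1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
        return (1.0 - hct) * d_r1_myo / d_r1_blood

    # hypothetical pre/post-contrast T1 values in ms and a hematocrit of 0.42
    print(f"ECV = {ecv(1000, 450, 1600, 300, hct=0.42):.1%}")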
The right hemisphere in esthetic perception.
Bromberger, Bianca; Sternschein, Rebecca; Widick, Page; Smith, William; Chatterjee, Anjan
2011-01-01
Little is known about the neuropsychology of art perception and evaluation. Most neuropsychological approaches to art have focused on art production and have been anecdotal and qualitative. The field is in desperate need of quantitative methods if it is to advance. Here, we combine a quantitative approach to the assessment of art with modern voxel-lesion-symptom-mapping methods to determine brain-behavior relationships in art perception. We hypothesized that the perception of different attributes of art is likely to be disrupted by damage to different regions of the brain. Twenty participants with right hemisphere damage were given the Assessment of Art Attributes, which is designed to quantify judgments of descriptive attributes of visual art. Each participant rated 24 paintings on 6 conceptual attributes (depictive accuracy, abstractness, emotion, symbolism, realism, and animacy) and 6 perceptual attributes (depth, color temperature, color saturation, balance, stroke, and simplicity), as well as their interest in and preference for these paintings. Deviation scores were obtained for each brain-damaged participant for each attribute, based on correlations with group average ratings from 30 age-matched healthy participants. Right hemisphere damage affected participants' judgments of abstractness, accuracy, and stroke quality. Damage to areas within different parts of the frontoparietal and lateral temporal cortices produced deviations in judgments of four of six conceptual attributes (abstractness, symbolism, realism, and animacy). Of the formal attributes, only depth was affected by inferior prefrontal damage. No areas of brain damage were associated with deviations in interestingness or preference judgments. The perception of conceptual and formal attributes in artwork may in part dissociate from each other and from evaluative judgments. More generally, this approach demonstrates the feasibility of quantitative approaches to the neuropsychology of art.
Characterization and prediction of chemical functions and weight fractions in consumer products.
Isaacs, Kristin K; Goldsmith, Michael-Rock; Egeghy, Peter; Phillips, Katherine; Brooks, Raina; Hong, Tao; Wambaugh, John F
2016-01-01
Assessing exposures from the thousands of chemicals in commerce requires quantitative information on the chemical constituents of consumer products. Unfortunately, gaps in available composition data prevent assessment of exposure to chemicals in many products. Here we propose filling these gaps via consideration of chemical functional role. We obtained function information for thousands of chemicals from public sources and used a clustering algorithm to assign chemicals into 35 harmonized function categories (e.g., plasticizers, antimicrobials, solvents). We combined these functions with weight fraction data for 4115 personal care products (PCPs) to characterize the composition of 66 different product categories (e.g., shampoos). We analyzed the combined weight fraction/function dataset using machine learning techniques to develop quantitative structure property relationship (QSPR) classifier models for 22 functions and for weight fraction, based on chemical-specific descriptors (including chemical properties). We applied these classifier models to a library of 10196 data-poor chemicals. Our predictions of chemical function and composition will inform exposure-based screening of chemicals in PCPs for combination with hazard data in risk-based evaluation frameworks. As new information becomes available, this approach can be applied to other classes of products and the chemicals they contain in order to provide essential consumer product data for use in exposure-based chemical prioritization.
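In the same spirit as the QSPR classifiers described here, a random-forest function classifier can be set up in a few lines of scikit-learn; the descriptors and labels below are synthetic stand-ins, not the harmonized function data from the study.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(600, 12))                  # stand-in descriptor matrix
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # e.g., "plasticizer" vs. not

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(2))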
Quantitative identification of senescent cells in aging and disease.
Biran, Anat; Zada, Lior; Abou Karam, Paula; Vadai, Ezra; Roitman, Lior; Ovadya, Yossi; Porat, Ziv; Krizhanovsky, Valery
2017-08-01
Senescent cells are present in premalignant lesions and sites of tissue damage and accumulate in tissues with age. In vivo identification, quantification and characterization of senescent cells are challenging tasks that limit our understanding of the role of senescent cells in diseases and aging. Here, we present a new way to precisely quantify and identify senescent cells in tissues on a single-cell basis. The method combines a senescence-associated beta-galactosidase assay with staining of molecular markers for cellular senescence and of cellular identity. By utilizing technology that combines flow cytometry with high-content image analysis, we were able to quantify senescent cells in tumors, fibrotic tissues, and tissues of aged mice. Our approach also yielded the finding that senescent cells in tissues of aged mice are larger than nonsenescent cells. Thus, this method provides a basis for quantitative assessment of senescent cells and it offers proof of principle for combination of different markers of senescence. It paves the way for screening of senescent cells for identification of new senescence biomarkers, genes that bypass senescence or senolytic compounds that eliminate senescent cells, thus enabling a deeper understanding of the senescent state in vivo. © 2017 The Authors. Aging Cell published by the Anatomical Society and John Wiley & Sons Ltd.
Khodakov, Dmitriy; Wang, Chunyan; Zhang, David Yu
2016-10-01
Nucleic acid sequence variations have been implicated in many diseases, and reliable detection and quantitation of DNA/RNA biomarkers can inform effective therapeutic action, enabling precision medicine. Nucleic acid analysis technologies being translated into the clinic can broadly be classified into hybridization, PCR, and sequencing, as well as their combinations. Here we review the molecular mechanisms of popular commercial assays, and their progress in translation into in vitro diagnostics. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
Synthesis of qualitative research and evidence-based nursing.
Flemming, Kate
Evidence-based nursing is central to nursing practice. Systematic reviews have played a key part in providing evidence for decision making in nursing. Traditionally, these have consisted of syntheses of randomised controlled trials. New approaches to combining research include the synthesis of qualitative research. This article discusses the development of research synthesis as a method for: creating evidence of effectiveness identified in quantitative research; making more effective use of primary data; enhancing the generalizability of qualitative research; and identifying future nursing research topics.
Optical assay for biotechnology and clinical diagnosis.
Moczko, Ewa; Cauchi, Michael; Turner, Claire; Meglinski, Igor; Piletsky, Sergey
2011-08-01
In this paper, we present an optical diagnostic assay consisting of a mixture of environment-sensitive fluorescent dyes combined with multivariate data analysis for the quantitative and qualitative examination of biological and clinical samples. The performance of the assay is based on the analysis of the spectra of the selected fluorescent dyes, with an operational principle similar to that of electronic nose and electronic tongue systems. This approach has been successfully applied to the monitoring of growing cell cultures and the identification of gastrointestinal diseases in humans.
Liao, Hsiao-Wei; Chen, Guan-Yuan; Wu, Ming-Shiang; Liao, Wei-Chih; Lin, Ching-Hung; Kuo, Ching-Hua
2017-02-03
Quantitative metabolomics has become much more important in clinical research in recent years. Individual differences in matrix effects (MEs) and the injection order effect are two major factors that reduce quantification accuracy in liquid chromatography-electrospray ionization-mass spectrometry (LC-ESI-MS)-based metabolomics studies. This study proposed a postcolumn-infused internal standard (PCI-IS) strategy combined with a matrix normalization factor (MNF) to improve the analytical accuracy of quantitative metabolomics. The PCI-IS combined with the MNF method was applied to a targeted metabolomics study of amino acids (AAs). D8-Phenylalanine was used as the PCI-IS and was postcolumn-infused into the ESI interface for calibration purposes. The MNF was used to bridge the AA response in a standard solution with that in the plasma samples. ME-induced signal changes were corrected by dividing the AA signal intensities by the PCI-IS intensities after adjustment with the MNF. After method validation, we evaluated the method's applicability for breast cancer research using 100 plasma samples. The quantification results revealed that the 11 tested AAs exhibit an accuracy between 88.2 and 110.7%. The principal component analysis score plot revealed that the injection order effect was successfully removed, and most of the within-group variation of the tested AAs decreased after the PCI-IS correction. Finally, the targeted metabolomics study of the AAs showed that tryptophan levels were higher in malignant patients than in the benign group. We anticipate that a similar approach can be applied to other endogenous metabolites to facilitate quantitative metabolomics studies.
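The correction described above reduces to a one-line ratio; the sketch below applies it with invented signal intensities, since the study's calibration details (e.g., how the MNF is derived) are not reproduced here.

    def pci_is_correct(analyte_signal, pci_is_signal, mnf):
        """Matrix effects suppress the analyte and the postcolumn-infused IS
        alike, so their ratio cancels the suppression; the MNF bridges the
        standard-solution response to the plasma response."""
        return analyte_signal / pci_is_signal * mnf

    # hypothetical raw intensities and normalization factor
    print(pci_is_correct(analyte_signal=8.4e5, pci_is_signal=2.1e5, mnf=1.15))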
Marchand, C; Gagnayre, R; d'Ivernois, J F
1996-01-01
There are very few examples of health training assessment in developing countries. Such an undertaking faces a number of difficulties concerning the problems inherent to assessment, the particular and unstable nature of the environment, and the problems associated with humanitarian action and development aid. It is difficult to choose between a formal and a natural approach. Indeed, a dual approach, combining quantitative and qualitative data seems best suited to a variety of cultural contexts of variable stability. Faced with these difficulties, a criteria-based, formative, quality-oriented assessment aimed at improving teaching and learning methods should be able to satisfy the needs of training professionals. We propose a training assessment guide based on an assessment model which aims to improve training techniques using comprehensive, descriptive and prescriptive approaches.
Sedykh, Alexander; Zhu, Hao; Tang, Hao; Zhang, Liying; Richard, Ann; Rusyn, Ivan; Tropsha, Alexander
2011-01-01
Background Quantitative high-throughput screening (qHTS) assays are increasingly being used to inform chemical hazard identification. Hundreds of chemicals have been tested in dozens of cell lines across extensive concentration ranges by the National Toxicology Program in collaboration with the National Institutes of Health Chemical Genomics Center. Objectives Our goal was to test a hypothesis that dose–response data points of the qHTS assays can serve as biological descriptors of assayed chemicals and, when combined with conventional chemical descriptors, improve the accuracy of quantitative structure–activity relationship (QSAR) models applied to prediction of in vivo toxicity end points. Methods We obtained cell viability qHTS concentration–response data for 1,408 substances assayed in 13 cell lines from PubChem; for a subset of these compounds, rodent acute toxicity half-maximal lethal dose (LD50) data were also available. We used the k nearest neighbor classification and random forest QSAR methods to model LD50 data using chemical descriptors either alone (conventional models) or combined with biological descriptors derived from the concentration–response qHTS data (hybrid models). Critical to our approach was the use of a novel noise-filtering algorithm to treat qHTS data. Results Both the external classification accuracy and coverage (i.e., fraction of compounds in the external set that fall within the applicability domain) of the hybrid QSAR models were superior to conventional models. Conclusions Concentration–response qHTS data may serve as informative biological descriptors of molecules that, when combined with conventional chemical descriptors, may considerably improve the accuracy and utility of computational approaches for predicting in vivo animal toxicity end points. PMID:20980217
Wei, Binnian; McGuffey, James E; Blount, Benjamin C; Wang, Lanqing
2016-01-01
Maternal exposure to marijuana during the lactation period, whether active or passive, has prompted concerns about transmission of cannabinoids to breastfed infants and possible subsequent adverse health consequences. Assessing these health risks requires a sensitive analytical approach that is able to quantitatively measure trace-level cannabinoids in breast milk. Here, we describe a saponification-solid phase extraction approach combined with ultra-high-pressure liquid chromatography-tandem mass spectrometry for simultaneously quantifying Δ9-tetrahydrocannabinol (THC), cannabidiol (CBD), and cannabinol (CBN) in breast milk. We demonstrate for the first time that constraints on sensitivity can be overcome by alkaline saponification of the milk samples. After extensive optimization of the saponification procedure, the validated method exhibited limits of detection of 13, 4, and 66 pg/mL for THC, CBN, and CBD, respectively. Notably, the sensitivity achieved was significantly improved; the limit of detection for THC, for instance, is at least 100-fold lower than previously reported in the literature. This is essential for monitoring cannabinoids in breast milk resulting from passive or nonrecent active maternal exposure. Furthermore, we simultaneously acquired multiple reaction monitoring transitions for (12)C- and (13)C-analyte isotopes. This combined analysis largely facilitated data acquisition by reducing the repeat-analysis rate for samples exceeding the linear limits of the (12)C-analytes. In addition to high sensitivity and a broad quantitation range, this method delivers excellent accuracy (relative error within ±10%), precision (relative standard deviation <10%), and efficient analysis. In future studies, we expect this method to play a critical role in assessing infant exposure to cannabinoids through breastfeeding.
Hamad, Eradah O; Savundranayagam, Marie Y; Holmes, Jeffrey D; Kinsella, Elizabeth Anne; Johnson, Andrew M
2016-03-08
Twitter's 140-character microblog posts are increasingly used to access information and facilitate discussions among health care professionals and between patients with chronic conditions and their caregivers. Recently, efforts have emerged to investigate the content of health care-related posts on Twitter. This marks a new area for researchers to investigate and apply content analysis (CA). In current infodemiology, infoveillance and digital disease detection research initiatives, quantitative and qualitative Twitter data are often combined, and there are no clear guidelines for researchers to follow when collecting and evaluating Twitter-driven content. The aim of this study was to identify studies on health care and social media that used Twitter feeds as a primary data source and CA as an analysis technique. We evaluated the resulting 18 studies based on a narrative review of previous methodological studies and textbooks to determine the criteria and main features of quantitative and qualitative CA. We then used the key features of CA and mixed-methods research designs to propose the combined content-analysis (CCA) model as a solid research framework for designing, conducting, and evaluating investigations of Twitter-driven content. We conducted a PubMed search to collect studies published between 2010 and 2014 that used CA to analyze health care-related tweets. The PubMed search and reference list checks of selected papers identified 21 papers. We excluded 3 papers and further analyzed 18. Results suggest that the methods used in these studies were not purely quantitative or qualitative, and the mixed-methods design was not explicitly chosen for data collection and analysis. A solid research framework is needed for researchers who intend to analyze Twitter data through the use of CA. We propose the CCA model as a useful framework that provides a straightforward approach to guide Twitter-driven studies and that adds rigor to health care social media investigations. We provide suggestions for the use of the CCA model in elder care-related contexts.
van Ruymbeke, E; Lee, H; Chang, T; Nikopoulou, A; Hadjichristidis, N; Snijkers, F; Vlassopoulos, D
2014-07-21
An emerging challenge in polymer physics is the quantitative understanding of the influence of macromolecular architecture (i.e., branching) on the rheological response of entangled complex polymers. Recent investigations of the rheology of well-defined architecturally complex polymers have determined the molecular composition of the measured samples and identified the role of side-products. The combination of different characterization techniques, experimental and/or theoretical, represents the current state-of-the-art. Here we review this interdisciplinary approach to the molecular rheology of complex polymers and show the importance of cross-checking these different tools to ensure an accurate characterization of a given polymeric sample. We use statistical tools to relate the information available from the synthesis protocol of a sample to its experimental molar mass distribution (typically obtained from size exclusion chromatography), and hence obtain precise information about its structural composition, i.e., to enhance the existing sensitivity limit. We critically discuss the use of linear rheology as a reliable quantitative characterization tool, along with the recently developed temperature gradient interaction chromatography. The latter, which has emerged as an indispensable characterization tool for branched architectures, offers unprecedented sensitivity in detecting the presence of different molecular structures in a sample. Combining these techniques is imperative in order to quantify the molecular composition of a polymer and its consequences on macroscopic properties. We validate this approach by means of a new model asymmetric comb polymer which was synthesized anionically, thoroughly characterized, and whose rheology was carefully analyzed. The main result is that the rheological signal reveals fine molecular details, which must be taken into account to fully elucidate the viscoelastic response of entangled branched polymers. It is important to appreciate that even optimal model systems, i.e., those synthesized with high-vacuum anionic methods, need thorough characterization via a combination of techniques. Besides helping to improve synthetic techniques, this methodology will be significant in fine-tuning mesoscopic tube-based models and in addressing outstanding issues such as the quantitative description of the constraint release mechanism.
Dolled-Filhart, Marisa P; Gustavson, Mark D
2012-11-01
Translational oncology has been improved by using tissue microarrays (TMAs), which facilitate biomarker analysis of large cohorts on a single slide. This has allowed for rapid analysis and validation of potential biomarkers for prognostic and predictive value, as well as for evaluation of biomarker prevalence. Coupled with quantitative analysis of immunohistochemical (IHC) staining, objective and standardized biomarker data from tumor samples can further advance companion diagnostic approaches for the identification of drug-responsive or resistant patient subpopulations. This review covers the advantages, disadvantages and applications of TMAs for biomarker research. Research literature and reviews of TMAs and quantitative image analysis methodology have been surveyed for this review (with an AQUA® analysis focus). Applications such as multi-marker diagnostic development and pathway-based biomarker subpopulation analyses are described. Tissue microarrays are a useful tool for biomarker analyses including prevalence surveys, disease progression assessment and addressing potential prognostic or predictive value. By combining quantitative image analysis with TMAs, analyses will be more objective and reproducible, allowing for more robust IHC-based diagnostic test development. Quantitative multi-biomarker IHC diagnostic tests that can predict drug response will allow for greater success of clinical trials for targeted therapies and provide more personalized clinical decision making.
Pesavento, James J; Bullock, Courtney R; LeDuc, Richard D; Mizzen, Craig A; Kelleher, Neil L
2008-05-30
Quantitative proteomics has focused heavily on correlating protein abundances, ratios, and dynamics by developing methods that are protein expression-centric (e.g. isotope coded affinity tag, isobaric tag for relative and absolute quantification, etc.). These methods effectively detect changes in protein abundance but fail to provide a comprehensive perspective of the diversity of proteins such as histones, which are regulated by post-translational modifications. Here, we report the characterization of modified forms of HeLa cell histone H4 with a dynamic range >10^4 using a strictly Top Down mass spectrometric approach coupled with two dimensions of liquid chromatography. This enhanced dynamic range enabled the precise characterization and quantitation of 42 forms uniquely modified by combinations of methylation and acetylation, including those with trimethylated Lys-20, monomethylated Arg-3, and the novel dimethylated Arg-3 (each <1% of all H4 forms). Quantitative analyses revealed distinct trends in acetylation site occupancy depending on Lys-20 methylation state. Because both modifications are dynamically regulated through the cell cycle, we simultaneously investigated acetylation and methylation kinetics through three cell cycle phases and used these data to statistically assess the robustness of our quantitative analysis. This work represents the most comprehensive analysis of histone H4 forms present in human cells reported to date.
Revisiting the Quantitative-Qualitative Debate: Implications for Mixed-Methods Research
SALE, JOANNA E. M.; LOHFELD, LYNNE H.; BRAZIL, KEVIN
2015-01-01
Health care research includes many studies that combine quantitative and qualitative methods. In this paper, we revisit the quantitative-qualitative debate and review the arguments for and against using mixed-methods. In addition, we discuss the implications stemming from our view, that the paradigms upon which the methods are based have a different view of reality and therefore a different view of the phenomenon under study. Because the two paradigms do not study the same phenomena, quantitative and qualitative methods cannot be combined for cross-validation or triangulation purposes. However, they can be combined for complementary purposes. Future standards for mixed-methods research should clearly reflect this recommendation. PMID:26523073
Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.
2015-01-01
In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay, which is currently considered the “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724
Machine learning for predicting the response of breast cancer to neoadjuvant chemotherapy
Mani, Subramani; Chen, Yukun; Li, Xia; Arlinghaus, Lori; Chakravarthy, A Bapsi; Abramson, Vandana; Bhave, Sandeep R; Levy, Mia A; Xu, Hua; Yankeelov, Thomas E
2013-01-01
Objective To employ machine learning methods to predict the eventual therapeutic response of breast cancer patients after a single cycle of neoadjuvant chemotherapy (NAC). Materials and methods Quantitative dynamic contrast-enhanced MRI and diffusion-weighted MRI data were acquired on 28 patients before and after one cycle of NAC. A total of 118 semiquantitative and quantitative parameters were derived from these data and combined with 11 clinical variables. We used Bayesian logistic regression in combination with feature selection using a machine learning framework for predictive model building. Results The best predictive models using feature selection obtained an area under the curve of 0.86 and an accuracy of 0.86, with a sensitivity of 0.88 and a specificity of 0.82. Discussion With the numerous options for NAC available, development of a method to predict response early in the course of therapy is needed. Unfortunately, by the time most patients are found not to be responding, their disease may no longer be surgically resectable, and this situation could be avoided by the development of techniques to assess response earlier in the treatment regimen. The method outlined here is one possible solution to this important clinical problem. Conclusions Predictive modeling approaches based on machine learning using readily available clinical and quantitative MRI data show promise in distinguishing breast cancer responders from non-responders after the first cycle of NAC. PMID:23616206
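As a rough illustration of the modeling strategy described above (feature selection plus a regularized logistic classifier evaluated by cross-validated AUC), here is a minimal sketch; the L2-penalized model stands in for the Bayesian logistic regression named in the abstract, and all data are random placeholders:

    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(28, 129))   # 28 patients x (118 MRI + 11 clinical) features
    y = np.repeat([0, 1], 14)        # responder / non-responder labels (placeholder)

    # Feature selection lives inside the pipeline so each CV fold ranks
    # features on its own training split only (no information leakage).
    model = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=10),
                          LogisticRegression(max_iter=1000))
    auc = cross_val_score(model, X, y, cv=4, scoring="roc_auc")
    print("cross-validated AUC:", auc.mean())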
Insights into multimodal imaging classification of ADHD
Colby, John B.; Rudie, Jeffrey D.; Brown, Jesse A.; Douglas, Pamela K.; Cohen, Mark S.; Shehzad, Zarrar
2012-01-01
Attention deficit hyperactivity disorder (ADHD) currently is diagnosed in children by clinicians via subjective ADHD-specific behavioral instruments and by reports from the parents and teachers. Considering its high prevalence and large economic and societal costs, a quantitative tool that aids in diagnosis by characterizing underlying neurobiology would be extremely valuable. This provided motivation for the ADHD-200 machine learning (ML) competition, a multisite collaborative effort to investigate imaging classifiers for ADHD. Here we present our ML approach, which used structural and functional magnetic resonance imaging data, combined with demographic information, to discriminate individuals with ADHD from typically developing (TD) children across eight different research sites. Structural features included quantitative metrics from 113 cortical and non-cortical regions. Functional features included Pearson correlation functional connectivity matrices, nodal and global graph theoretical measures, nodal power spectra, voxelwise global connectivity, and voxelwise regional homogeneity. We performed feature ranking for each site and modality using the multiple support vector machine recursive feature elimination (SVM-RFE) algorithm, and feature subset selection by optimizing the expected generalization performance of a radial basis function kernel SVM (RBF-SVM) trained across a range of the top features. Site-specific RBF-SVMs using these optimal feature sets from each imaging modality were used to predict the class labels of an independent hold-out test set. A voting approach was used to combine these multiple predictions and assign final class labels. With this methodology we were able to predict diagnosis of ADHD with 55% accuracy (versus a 39% chance level in this sample), 33% sensitivity, and 80% specificity. This approach also allowed us to evaluate predictive structural and functional features giving insight into abnormal brain circuitry in ADHD. PMID:22912605
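A minimal sketch of the ranking-then-classification scheme described above; since recursive feature elimination needs feature weights, a linear SVM performs the ranking and the RBF-SVM is then trained on the selected subset. All data, sizes, and the single-modality voting below are placeholders, not the competition pipeline:

    import numpy as np
    from sklearn.feature_selection import RFE
    from sklearn.svm import SVC, LinearSVC

    rng = np.random.default_rng(1)
    X_train = rng.normal(size=(200, 113))   # e.g., 113 structural features
    y_train = rng.integers(0, 2, size=200)
    X_test = rng.normal(size=(50, 113))

    # SVM-RFE: iteratively drop the lowest-weighted features
    ranker = RFE(LinearSVC(dual=False, max_iter=5000), n_features_to_select=20)
    ranker.fit(X_train, y_train)

    clf = SVC(kernel="rbf", gamma="scale")
    clf.fit(X_train[:, ranker.support_], y_train)
    votes_structural = clf.predict(X_test[:, ranker.support_])

    # One prediction vector per modality/site would be stacked here;
    # a simple majority vote assigns the final labels.
    all_votes = np.stack([votes_structural])   # add functional-modality votes, etc.
    final_labels = (all_votes.mean(axis=0) >= 0.5).astype(int)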
Quantitative dose-response assessment of inhalation exposures to toxic air pollutants
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarabek, A.M.; Foureman, G.L.; Gift, J.S.
1997-12-31
Implementation of the 1990 Clean Air Act Amendments, including evaluation of residual risks, requires accurate human health risk estimates of both acute and chronic inhalation exposures to toxic air pollutants. The U.S. Environmental Protection Agency's National Center for Environmental Assessment, Research Triangle Park, NC, has a research program that addresses several key issues for development of improved quantitative approaches for dose-response assessment. This paper describes three projects underway in the program. Project A describes a Bayesian approach that was developed to base dose-response estimates on combined data sets and that expresses these estimates as probability density functions. A categorical regression model has been developed that allows for the combination of all available acute data, with toxicity expressed as severity categories (e.g., mild, moderate, severe), and with both duration and concentration as governing factors. Project C encompasses two refinements to uncertainty factors (UFs) often applied to extrapolate dose-response estimates from laboratory animal data to human equivalent concentrations. Traditional UFs have been based on analyses of oral administration and may not be appropriate for extrapolation of inhalation exposures. Refinement of the UF applied to account for the use of subchronic rather than chronic data was based on an analysis of data from inhalation exposures (Project C-1). Mathematical modeling using the BMD approach was used to calculate the dose-response estimates for comparison between the subchronic and chronic data so that the estimates were not subject to dose-spacing or sample size variability. The second UF that was refined for extrapolation of inhalation data was the adjustment for the use of a LOAEL rather than a NOAEL (Project C-2).
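In the spirit of the categorical regression described above (severity categories governed by both concentration and duration), here is a minimal ordinal-regression sketch; the model family, covariates, and all values are assumptions for illustration, not the EPA's actual model:

    import numpy as np
    import pandas as pd
    from statsmodels.miscmodels.ordinal_model import OrderedModel

    rng = np.random.default_rng(2)
    n = 120
    conc = rng.uniform(1.0, 100.0, n)   # exposure concentration (ppm)
    dur = rng.uniform(0.5, 8.0, n)      # exposure duration (h)

    # Simulated ordered severity: 0 none, 1 mild, 2 moderate, 3 severe
    latent = 0.03 * conc + 0.3 * dur + rng.logistic(size=n)
    severity = pd.Series(pd.Categorical(np.digitize(latent, [1.5, 3.0, 4.5]),
                                        ordered=True))

    X = pd.DataFrame({"log_conc": np.log(conc), "log_dur": np.log(dur)})
    res = OrderedModel(severity, X, distr="logit").fit(method="bfgs", disp=False)
    print(res.summary())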
Spies, Annika; Korzun, Viktor; Bayles, Rosemary; Rajaraman, Jeyaraman; Himmelbach, Axel; Hedley, Pete E.; Schweizer, Patrick
2012-01-01
Race-non-specific, or quantitative, pathogen resistance is of high importance to plant breeders due to its expected durability. However, it is usually controlled by multiple quantitative trait loci (QTL) and therefore difficult to handle in practice. Knowing the genes that underlie race-non-specific resistance (NR) would allow its exploitation in a more targeted manner. Here, we performed an association-genetic study in a customized worldwide collection of spring barley accessions for candidate genes of race-NR to the powdery mildew fungus Blumeria graminis f. sp. hordei (Bgh) and combined data with results from QTL mapping as well as functional-genomics approaches. This led to the identification of 11 associated genes with converging evidence for an important role in race-NR in the presence of the Mlo gene for basal susceptibility. Outstanding in this respect was the gene encoding the transcription factor WRKY2. The results suggest that unlocking plant genetic resources and integrating functional-genomic with genetic approaches can accelerate the discovery of genes underlying race-NR in barley and other crop plants. PMID:22629270
NASA Astrophysics Data System (ADS)
Zhang, Jianyuan; Hu, Bin; Chen, Wenjuan; Moore, Philip; Xu, Tingting; Dong, Qunxi; Liu, Zhenyu; Luo, Yuejia; Chen, Shanguang
2014-12-01
The focus of the study is the estimation of the effects of microgravity on central nervous activity and its underlying influencing mechanisms. To assess microgravity-induced physiological and psychological effects, quantitative EEG features, cardiovascular indicators, mood states, and cognitive performance data were collected over a 45-day period using a −6° head-down bed rest (HDBR) integrated approach. The results demonstrated significant differences in the EEG data, including an increased theta wave, a decreased beta wave, and a reduced complexity of the brain, accompanied by an increased heart rate and pulse rate, decreased positive emotion, and degraded emotion-conflict monitoring performance. The canonical correlation analysis (CCA) based cardiovascular- and cognitive-related EEG models showed that the cardiovascular effect on EEG mainly affected the bilateral temporal region, while the cognitive effect impacted the parietal-occipital and frontal regions. The results obtained in the study support the use of an approach which combines a multi-factor influential mechanism hypothesis. The changes in the EEG data may be influenced by both cardiovascular and cognitive effects.
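Canonical correlation analysis, as used above, finds paired linear combinations of two feature sets that correlate maximally. A minimal sketch with placeholder data (the real study's features and session counts are not reproduced here):

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(3)
    eeg = rng.normal(size=(45, 16))      # sessions x quantitative EEG features
    cardio = rng.normal(size=(45, 4))    # heart rate, pulse rate, mood, task score

    cca = CCA(n_components=2)
    eeg_c, cardio_c = cca.fit_transform(eeg, cardio)

    # Canonical correlations: correlation between paired canonical variates
    r = [np.corrcoef(eeg_c[:, k], cardio_c[:, k])[0, 1] for k in range(2)]
    print("canonical correlations:", r)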
Stochastic hydrogeology: what professionals really need?
Renard, Philippe
2007-01-01
Quantitative hydrogeology celebrated its 150th anniversary in 2006. Geostatistics is younger but has had a very large impact in hydrogeology. Today, geostatistics is used routinely to interpolate deterministically most of the parameters that are required to analyze a problem or make a quantitative analysis. In a small number of cases, geostatistics is combined with deterministic approaches to forecast uncertainty. At a more academic level, geostatistics is used extensively to study physical processes in heterogeneous aquifers. Yet, there is an important gap between the academic use and the routine applications of geostatistics. The reasons for this gap are diverse. These include aspects related to the hydrogeology consulting market, technical reasons such as the lack of widely available software, but also a number of misconceptions. A change in this situation requires acting at different levels. First, regulators must be convinced of the benefit of using geostatistics. Second, the economic potential of the approach must be emphasized to customers. Third, the relevance of the theories needs to be increased. Last, but not least, software, data sets, and computing infrastructure such as grid computing need to be widely available.
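The routine deterministic interpolation mentioned above is typically ordinary kriging. A minimal sketch, assuming the third-party PyKrige package and hypothetical scattered observations:

    import numpy as np
    from pykrige.ok import OrdinaryKriging   # assumes the PyKrige package

    rng = np.random.default_rng(4)
    x = rng.uniform(0.0, 1000.0, 30)         # observation coordinates (m)
    y = rng.uniform(0.0, 1000.0, 30)
    z = 50.0 + 0.01 * x - 0.005 * y + rng.normal(scale=0.5, size=30)  # e.g., heads

    ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
    gridx = np.linspace(0.0, 1000.0, 50)
    gridy = np.linspace(0.0, 1000.0, 50)
    zhat, kvar = ok.execute("grid", gridx, gridy)  # estimates + kriging variance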
Simultaneous fluorescence and quantitative phase microscopy with single-pixel detectors
NASA Astrophysics Data System (ADS)
Liu, Yang; Suo, Jinli; Zhang, Yuanlong; Dai, Qionghai
2018-02-01
Multimodal microscopy offers high flexibility for biomedical observation and diagnosis. Conventional multimodal approaches either use multiple cameras or a single camera that spatially multiplexes different modes. The former demands expert alignment and the latter suffers from limited spatial resolution. Here, we report an alignment-free, full-resolution, simultaneous fluorescence and quantitative phase imaging approach using single-pixel detectors. By combining reference-free interferometry with single-pixel detection, we encode the phase and fluorescence of the sample in two detection arms at the same time. We then employ structured illumination and the correlated measurements between the sample and the illuminations for reconstruction. The recovered fluorescence and phase images are inherently aligned thanks to single-pixel detection. To validate the proposed method, we built a proof-of-concept setup, first imaging the phase of glass etched to a depth of a few hundred nanometers and then imaging the fluorescence and phase of a quantum dot drop. This method holds great potential for multispectral fluorescence microscopy with additional single-pixel detectors or a spectrometer. Besides, this cost-efficient multimodal system might find broad applications in biomedical science and neuroscience.
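The core of single-pixel imaging with structured illumination can be illustrated with orthogonal patterns: each pattern yields one scalar (bucket) measurement, and correlating the measurements with the patterns recovers the image. A toy sketch with Hadamard patterns (the paper's interferometric phase arm is not modeled here):

    import numpy as np
    from scipy.linalg import hadamard

    n = 32
    N = n * n                        # pixel count; power of 2 for hadamard()
    H = hadamard(N)                  # each row is one +/-1 illumination pattern

    scene = np.zeros(N)
    scene[200:260] = 1.0             # hypothetical object (flattened image)

    y = H @ scene                    # one single-pixel measurement per pattern
    recovered = (H.T @ y) / N        # exact inverse, since H.T @ H = N * I
    img = recovered.reshape(n, n)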
Magnetic resonance imaging in active surveillance—a modern approach
Moore, Caroline M.
2018-01-01
In recent years, active surveillance has been increasingly adopted as a conservative management approach to low and sometimes intermediate risk prostate cancer, to avoid or delay treatment until there is evidence of higher risk disease. A number of studies have investigated the role of multiparametric magnetic resonance imaging (mpMRI) in this setting. MpMRI refers to the use of multiple MRI sequences (T2-weighted anatomical and functional imaging which can include diffusion-weighted imaging, dynamic contrast enhanced imaging, spectroscopy). Each of the parameters investigates different aspects of the prostate gland (anatomy, cellularity, vascularity, etc.). In addition to a qualitative assessment, the radiologist can also extrapolate quantitative imaging biomarkers from these sequences, for example the apparent diffusion coefficient from diffusion-weighted imaging. There are many different types of articles (e.g., reviews, commentaries, consensus meetings, etc.) that address the use of mpMRI in men on active surveillance for prostate cancer. In this paper, we compare original articles that investigate the role of the different mpMRI sequences in men on active surveillance for prostate cancer, in order to discuss the relative utility of the different sequences, and combinations of sequences. We searched MEDLINE/PubMed for manuscripts published from inception to 1st December 2017. The search terms used were (prostate cancer or prostate adenocarcinoma or prostatic carcinoma or prostate carcinoma or prostatic adenocarcinoma) and (MRI or NMR or magnetic resonance imaging or mpMRI or multiparametric MRI) and active surveillance. Overall, 425 publications were found. All abstracts were reviewed to identify papers with original data. Twenty-five papers were analysed and summarised. Some papers based their analysis only on one mpMRI sequence, while others assessed two or more. The evidence from this review suggests that qualitative assessments and quantitative data from different mpMRI sequences hold promise in the management of men on active surveillance for prostate cancer. Both qualitative and quantitative approaches should be considered when assessing mpMRI of the prostate. There is a need for robust studies assessing the relative utility of different combinations of sequences in a systematic manner to determine the most efficient use of mpMRI in men on active surveillance. PMID:29594026
NASA Astrophysics Data System (ADS)
Li, Hong-Yi; Sivapalan, Murugesu; Tian, Fuqiang; Harman, Ciaran
2014-12-01
Inspired by the Dunne diagram, the climatic and landscape controls on the partitioning of annual runoff into its various components (Hortonian and Dunne overland flow and subsurface stormflow) are assessed quantitatively, from a purely theoretical perspective. A simple distributed hydrologic model has been built sufficient to simulate the effects of different combinations of climate, soil, and topography on the runoff generation processes. The model is driven by a sequence of simple hypothetical precipitation events, for a large combination of climate and landscape properties, and hydrologic responses at the catchment scale are obtained through aggregation of grid-scale responses. It is found, first, that the water balance responses, including relative contributions of different runoff generation mechanisms, could be related to a small set of dimensionless similarity parameters. These capture the competition between the wetting, drying, storage, and drainage functions underlying the catchment responses, and in this way, provide a quantitative approximation of the conceptual Dunne diagram. Second, only a subset of all hypothetical catchment/climate combinations is found to be "behavioral," in terms of falling sufficiently close to the Budyko curve, describing mean annual runoff as a function of climate aridity. Furthermore, these behavioral combinations are mostly consistent with the qualitative picture presented in the Dunne diagram, indicating clearly the commonality between the Budyko curve and the Dunne diagram. These analyses also suggest clear interrelationships amongst the "behavioral" climate, soil, and topography parameter combinations, implying these catchment properties may be constrained to be codependent in order to satisfy the Budyko curve.
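For reference, the Budyko curve used above as the behavioral constraint is commonly written in its Budyko (1974) form; a short sketch evaluating it, with phi = Ep/P denoting the aridity index:

    import numpy as np

    def budyko_evap_ratio(phi):
        """Budyko (1974): E/P = sqrt(phi * tanh(1/phi) * (1 - exp(-phi)))."""
        return np.sqrt(phi * np.tanh(1.0 / phi) * (1.0 - np.exp(-phi)))

    phi = np.linspace(0.2, 5.0, 25)                  # climate aridity index Ep/P
    runoff_ratio = 1.0 - budyko_evap_ratio(phi)      # long-term Q/P along the curve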
Yegnasubramanian, Srinivasan; Lin, Xiaohui; Haffner, Michael C; DeMarzo, Angelo M; Nelson, William G
2006-02-09
Hypermethylation of CpG island (CGI) sequences is a nearly universal somatic genome alteration in cancer. Rapid and sensitive detection of DNA hypermethylation would aid in cancer diagnosis and risk stratification. We present a novel technique, called COMPARE-MS, that can rapidly and quantitatively detect CGI hypermethylation with high sensitivity and specificity in hundreds of samples simultaneously. To quantitate CGI hypermethylation, COMPARE-MS uses real-time PCR of DNA that was first digested by methylation-sensitive restriction enzymes and then precipitated by methyl-binding domain polypeptides immobilized on a magnetic solid matrix. We show that COMPARE-MS could detect five genome equivalents of methylated CGIs in a 1000- to 10,000-fold excess of unmethylated DNA. COMPARE-MS was used to rapidly quantitate hypermethylation at multiple CGIs in >155 prostate tissues, including benign and malignant prostate specimens, and prostate cell lines. This analysis showed that GSTP1, MDR1 and PTGS2 CGI hypermethylation as determined by COMPARE-MS could differentiate between malignant and benign prostate with sensitivities >95% and specificities approaching 100%. This novel technology could significantly improve our ability to detect CGI hypermethylation.
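The real-time PCR readout underlying COMPARE-MS converts threshold cycles to input amounts via a standard curve; a minimal sketch with hypothetical calibration values (not the paper's):

    import numpy as np

    # Hypothetical standard curve: Ct = intercept + slope * log10(copies)
    slope, intercept = -3.4, 37.5

    def genome_equivalents(ct):
        return 10.0 ** ((ct - intercept) / slope)

    print(genome_equivalents(np.array([35.0, 30.0, 25.0])))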
Wu, Zengnan; Khan, Mashooq; Mao, Sifeng; Lin, Ling; Lin, Jin-Ming
2018-05-01
Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is a fast analysis tool for the detection of a wide range of analytes. However, the heterogeneous distribution of matrix/analyte cocrystals, variation in signal intensity, and poor experimental reproducibility at different locations of the same spot make quantitative analysis difficult. In this work, carbon nanotubes (CNTs) were employed as an adsorbent for analyte and matrix on a conductive porous membrane serving as a novel mass target plate. Sample pretreatment was achieved by enrichment and dead-end filtration, and the sample was dried by solid-liquid separation. This approach enables a homogeneous distribution of analyte in the matrix, good shot-to-shot reproducibility in signals, and quantitative detection of peptide and protein at different concentrations with correlation coefficients (R²) of 0.9920 and 0.9909, respectively. The simple and rapid sample preparation, uniform distribution of analyte, easy quantitative detection, and high reproducibility make this technique useful and may diversify the application of MALDI-MS for quantitative detection of a variety of proteins. Copyright © 2018 Elsevier B.V. All rights reserved.
LeBlanc, André; Shiao, Tze Chieh; Roy, René; Sleno, Lekha
2014-09-15
Acetaminophen is known to cause hepatotoxicity via the formation of a reactive metabolite, N-acetyl-p-benzoquinone imine (NAPQI), which binds covalently to liver proteins. Serum albumin (SA) is known to be covalently modified by NAPQI and, being present at high concentrations in the bloodstream, is a potential biomarker for assessing the level of protein modification by NAPQI. A newly developed method for the absolute quantitation of serum albumin containing NAPQI covalently bound to its active-site cysteine (Cys34) is described. This optimized assay represents the first absolute quantitation of a modified protein of very low stoichiometric abundance using a protein-level standard combined with isotope dilution. The LC-MS/MS assay is based on a protein standard modified with a custom-designed reagent, yielding a surrogate peptide (following digestion) that is a positional isomer of the target peptide modified by NAPQI. To illustrate the potential of this approach, the method was applied to quantify NAPQI-modified SA in plasma from rats dosed with acetaminophen. The resulting method is highly sensitive (capable of quantifying down to 0.0006% of total RSA in its NAPQI-modified form) and yields excellent precision and accuracy statistics. A time-course pharmacokinetic study was performed to test the usefulness of this method for following acetaminophen-induced covalent binding at four dosing levels (75-600 mg/kg IP), showing the viability of this approach for directly monitoring in vivo samples. This approach can reliably quantify NAPQI-modified albumin, allowing direct monitoring of acetaminophen-related covalent binding.
Morawski, Markus; Kirilina, Evgeniya; Scherf, Nico; Jäger, Carsten; Reimann, Katja; Trampel, Robert; Gavriilidis, Filippos; Geyer, Stefan; Biedermann, Bernd; Arendt, Thomas; Weiskopf, Nikolaus
2017-11-28
Recent breakthroughs in magnetic resonance imaging (MRI) have enabled quantitative relaxometry and diffusion-weighted imaging with sub-millimeter resolution. Combined with biophysical models of MR contrast, the emerging methods promise in vivo mapping of cyto- and myelo-architectonics, i.e., in vivo histology using MRI (hMRI) in humans. The hMRI methods require histological reference data for model building and validation. This is currently provided by MRI on post mortem human brain tissue in combination with classical histology on sections. However, this well-established approach is limited to qualitative 2D information, while a systematic validation of hMRI requires quantitative 3D information on macroscopic voxels. We present a promising histological method based on optical 3D imaging combined with a tissue clearing method, Clear Lipid-exchanged Acrylamide-hybridized Rigid Imaging compatible Tissue hYdrogel (CLARITY), adapted for hMRI validation. Adapting CLARITY to the needs of hMRI is challenging due to poor antibody penetration into large sample volumes and the high opacity of aged post mortem human brain tissue. In a pilot experiment we achieved transparency of up to 8 mm-thick, and immunohistochemical staining of up to 5 mm-thick, post mortem brain tissue by a combination of active and passive clearing with prolonged clearing and staining times. We combined 3D optical imaging of the cleared samples with tailored image processing methods. We demonstrated the feasibility of quantifying neuron density and fiber orientation distribution, and of classifying cell types, within a volume similar in size to a typical MRI voxel. The presented combination of MRI, 3D optical microscopy and image processing is a promising tool for validation of MRI-based microstructure estimates. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
A powerful approach reveals numerous expression quantitative trait haplotypes in multiple tissues.
Ying, Dingge; Li, Mulin Jun; Sham, Pak Chung; Li, Miaoxin
2018-04-26
Recently, many studies have shown that single nucleotide polymorphisms (SNPs) affect gene expression and contribute to the development of complex traits/diseases in a tissue context-dependent manner. However, little is known about the influence of haplotypes on gene expression and complex traits, which reflects the interaction effect between SNPs. In the present study, we first propose a regulatory region guided eQTL haplotype association analysis approach, and then systematically investigate expression quantitative trait loci (eQTL) haplotypes in 20 different tissues using the approach. The approach reduces the computational burden by using regulatory predictions for candidate SNP selection and by applying multiple testing corrections for non-independent haplotypes. The application results in multiple tissues showed that haplotype-based eQTLs not only increased the number of eQTL genes in a tissue specific manner, but were also enriched in loci associated with complex traits in a tissue-matched manner. In addition, we found that tag SNPs of eQTL haplotypes from whole blood were selectively enriched in certain combinations of regulatory elements (e.g. promoters and enhancers) according to predicted chromatin states. In summary, this eQTL haplotype detection approach, together with the application results, sheds insight into the synergistic effect of sequence variants on gene expression and their susceptibility to complex diseases. The executable application "eHaplo" is implemented in Java and is publicly available at http://grass.cgs.hku.hk/limx/ehaplo/. Contact: jonsonfox@gmail.com, limiaoxin@mail.sysu.edu.cn. Supplementary data are available at Bioinformatics online.
Breaking from binaries - using a sequential mixed methods design.
Larkin, Patricia Mary; Begley, Cecily Marion; Devane, Declan
2014-03-01
To outline the traditional worldviews of healthcare research and discuss the benefits and challenges of using mixed methods approaches in contributing to the development of nursing and midwifery knowledge. There has been much debate about the contribution of mixed methods research to nursing and midwifery knowledge in recent years. A sequential exploratory design is used as an exemplar of a mixed methods approach. The study discussed used a combination of focus-group interviews and a quantitative instrument to obtain a fuller understanding of women's experiences of childbirth. In the mixed methods study example, qualitative data were analysed using thematic analysis and quantitative data using regression analysis. Polarised debates about the veracity, philosophical integrity and motivation for conducting mixed methods research have largely abated. A mixed methods approach can contribute to a deeper, more contextual understanding of a variety of subjects and experiences; as a result, it furthers knowledge that can be used in clinical practice. The purpose of the research study should be the main instigator when choosing from an array of mixed methods research designs. Mixed methods research offers a variety of models that can augment investigative capabilities and provide richer data than can a discrete method alone. This paper offers an example of an exploratory, sequential approach to investigating women's childbirth experiences. A clear framework for the conduct and integration of the different phases of the mixed methods research process is provided. This approach can be used by practitioners and policy makers to improve practice.
An interdisciplinary approach for earthquake modelling and forecasting
NASA Astrophysics Data System (ADS)
Han, P.; Zhuang, J.; Hattori, K.; Ogata, Y.
2016-12-01
Earthquakes are among the most serious disasters, causing heavy casualties and economic losses. Especially in the past two decades, huge/mega earthquakes have hit many countries. Effective earthquake forecasting (including time, location, and magnitude) has therefore become extremely important and urgent. To date, various heuristically derived algorithms have been developed for forecasting earthquakes. Generally, they can be classified into two types: catalog-based approaches and non-catalog-based approaches. Thanks to the rapid development of statistical seismology in the past 30 years, we are now able to evaluate the performance of these earthquake forecast approaches quantitatively. Although a certain amount of precursory information is available in both earthquake catalogs and non-catalog observations, earthquake forecasting is still far from satisfactory. In most cases, the precursory phenomena have been studied individually. An earthquake model that combines self-exciting and mutually exciting elements was developed by Ogata and Utsu from the Hawkes process. The core idea of this combined model is that the status of the event at present is controlled by the event itself (self-exciting) and all the external factors (mutually exciting) in the past. In essence, the conditional intensity function is a time-varying Poisson process with rate λ(t), which is composed of the background rate, the self-exciting term (the information from past seismic events), and the external excitation term (the information from past non-seismic observations). This model shows us a way to integrate the catalog-based forecast and the non-catalog-based forecast. Against this background, we are trying to develop a new earthquake forecast model which combines catalog-based and non-catalog-based approaches.
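One common way to write such a combined conditional intensity (the exponential kernel and the symbols here are illustrative assumptions, not Ogata and Utsu's exact formulation):

    \lambda(t) = \mu
      + \sum_{i:\, t_i < t} \kappa\, e^{-\beta (t - t_i)}   % self-exciting: past earthquakes
      + \sum_{j:\, \tau_j < t} \alpha_j\, h(t - \tau_j)     % mutually exciting: non-seismic observations

where \mu is the background rate, the first sum responds to past seismic events at times t_i, and the second sum is driven by past non-seismic observations at times \tau_j through a response kernel h.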
Cost-effectiveness analysis of risk-reduction measures to reach water safety targets.
Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof; Pettersson, Thomas J R
2011-01-01
Identifying the most suitable risk-reduction measures in drinking water systems requires a thorough analysis of possible alternatives. In addition to the effects on the risk level, also the economic aspects of the risk-reduction alternatives are commonly considered important. Drinking water supplies are complex systems and to avoid sub-optimisation of risk-reduction measures, the entire system from source to tap needs to be considered. There is a lack of methods for quantification of water supply risk reduction in an economic context for entire drinking water systems. The aim of this paper is to present a novel approach for risk assessment in combination with economic analysis to evaluate risk-reduction measures based on a source-to-tap approach. The approach combines a probabilistic and dynamic fault tree method with cost-effectiveness analysis (CEA). The developed approach comprises the following main parts: (1) quantification of risk reduction of alternatives using a probabilistic fault tree model of the entire system; (2) combination of the modelling results with CEA; and (3) evaluation of the alternatives with respect to the risk reduction, the probability of not reaching water safety targets and the cost-effectiveness. The fault tree method and CEA enable comparison of risk-reduction measures in the same quantitative unit and consider costs and uncertainties. The approach provides a structured and thorough analysis of risk-reduction measures that facilitates transparency and long-term planning of drinking water systems in order to avoid sub-optimisation of available resources for risk reduction. Copyright © 2010 Elsevier Ltd. All rights reserved.
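The final comparison step can be reduced to a few lines: each alternative's fault-tree risk estimate is combined with its cost into a cost-effectiveness ratio. A minimal sketch with hypothetical probabilities and costs, not values from the paper:

    # Hypothetical values for illustration
    baseline_risk = 0.12   # P(failing the water safety target), from the fault tree

    alternatives = {
        # name: (risk with the measure in place, annualized cost)
        "new UV barrier":         (0.05, 180.0),
        "extended monitoring":    (0.09, 40.0),
        "source protection zone": (0.07, 90.0),
    }

    for name, (risk, cost) in alternatives.items():
        reduction = baseline_risk - risk
        cer = cost / reduction             # cost per unit of risk reduction
        print(f"{name}: risk reduction {reduction:.2f}, CER {cer:.0f}")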
Quali-quantitative analysis (QQA): why it could open new frontiers for holistic health practice.
Bell, Erica
2006-12-15
Holistic health practice is often described as being about understanding the larger contexts of patients, their health services, and their communities. Yet do traditional quantitative and qualitative health research methods produce the best possible evidence for the holistic practices of doctors, nurses, and allied health professionals? This paper argues "no", and examines the potential of a cutting-edge, social science research method--Quali-Quantitative Research (QQA)--for providing better evidence for holistic practice, particularly in small-N populations, such as rural and remote communities. It does so with reference to the international literature on holistic medicine, as well as three holistic health projects conducted in Tasmania: about prevention of falls in older people, adolescent substance abuse, and interventions for children aged 0-5 exposed to domestic violence. The findings suggest that much health research fails to capture rigorously the contextual complexity of holistic health challenges: the multiple different needs of individual patients, and the interprofessional approaches needed to deliver multidisciplinary and multiservice health interventions tailored to meet those needs in particular community contexts. QQA offers a "configurational", case-based, diversity-oriented approach to analysing data that combines qualitative and quantitative techniques to overcome the limitations of both research traditions. The author concludes that QQA could open new frontiers for holistic health by helping doctors, nurses, and allied health professionals answer a fundamental question presented by complex health challenges: "Given this set of whole-of-patient needs, what elements of which interventions in what services would work best in this particular community?"
Esposito, Dominick; Taylor, Erin Fries; Gold, Marsha
2009-02-01
Interest in disease management programs continues to grow as managed care plans, the federal and state governments, and other organizations consider such efforts as a means to improve health care quality and reduce costs. These efforts vary in size, scope, and target population. While large-scale programs provide the means to measure impacts, evaluation of smaller interventions remains valuable as they often represent the early planning stages of larger initiatives. This paper describes a multi-method approach for evaluating small interventions that sought to improve the quality of care for Medicaid beneficiaries with multiple chronic conditions. Our approach relied on quantitative and qualitative methods to develop a complete understanding of each intervention. Quantitative data in the form of both process measures, such as case manager contacts, and outcome measures, such as hospital use, were reported and analyzed. Qualitative information was collected through interviews and the development of logic models to document the flow of intervention activities and how they were intended to affect outcomes. The logic models helped us to understand the underlying reasons for the success or lack thereof of each intervention. The analysis provides useful information on several fronts. First, qualitative data provided valuable information about implementation. Second, process measures helped determine whether implementation occurred as anticipated. Third, outcome measures indicated the potential for favorable results later, possibly suggesting further study. Finally, the evaluation of qualitative and quantitative data in combination helped us assess the potential promise of each intervention and identify common themes and challenges across all interventions.
Video stereolization: combining motion analysis with user interaction.
Liao, Miao; Gao, Jizhou; Yang, Ruigang; Gong, Minglun
2012-07-01
We present a semiautomatic system that converts conventional videos into stereoscopic videos by combining motion analysis with user interaction, aiming to transfer as much labeling work as possible from the user to the computer. In addition to the widely used structure from motion (SFM) techniques, we develop two new methods that analyze the optical flow to provide additional qualitative depth constraints. They remove the camera movement restriction imposed by SFM so that general motions can be used in scene depth estimation, the central problem in mono-to-stereo conversion. With these algorithms, the user's labeling task is significantly simplified. We further developed a quadratic programming approach to incorporate both quantitative depth and qualitative depth (such as those from user scribbling) to recover dense depth maps for all frames, from which stereoscopic views can be synthesized. In addition to visual results, we present user study results showing that our approach is more intuitive and less labor intensive, while producing 3D effects comparable to those from current state-of-the-art interactive algorithms.
Analysis of brain activity and response to colour stimuli during learning tasks: an EEG study
NASA Astrophysics Data System (ADS)
Folgieri, Raffaella; Lucchiari, Claudio; Marini, Daniele
2013-02-01
The research project intends to demonstrate how EEG detection through a BCI device can improve the analysis and interpretation of colour-driven cognitive processes through the combined approach of cognitive science and information technology methods. To this end, we designed an experiment based on comparing the results of the traditional (qualitative and quantitative) cognitive analysis approach with the EEG signal analysis of the evoked potentials. In our case, the sensorial stimulus is represented by colours, while the cognitive task consists in remembering words appearing on the screen, with different combinations of foreground (word) and background colours. In this work we analysed data collected from a sample of students involved in a learning process during which they received visual stimuli based on colour variation. The stimuli concerned both the background of the text to learn and the colour of the characters. The experiment indicated some interesting results concerning the use of primary (RGB) and complementary (CMY) colours.
Joint reconstruction of x-ray fluorescence and transmission tomography
Di, Zichao Wendy; Chen, Si; Hong, Young Pyo; Jacobsen, Chris; Leyffer, Sven; Wild, Stefan M.
2017-01-01
X-ray fluorescence tomography is based on the detection of fluorescence x-ray photons produced following x-ray absorption while a specimen is rotated; it provides information on the 3D distribution of selected elements within a sample. One limitation in the quality of sample recovery is the separation of elemental signals due to the finite energy resolution of the detector. Another limitation is the effect of self-absorption, which can lead to inaccurate results with dense samples. To recover a higher quality elemental map, we combine x-ray fluorescence detection with a second data modality: conventional x-ray transmission tomography using absorption. By using these combined signals in a nonlinear optimization-based approach, we demonstrate the benefit of our algorithm on real experimental data and obtain an improved quantitative reconstruction of the spatial distribution of dominant elements in the sample. Compared with single-modality inversion based on x-ray fluorescence alone, this joint inversion approach reduces ill-posedness and should result in improved elemental quantification and better correction of self-absorption. PMID:28788848
Marti, Guillaume; Boccard, Julien; Mehl, Florence; Debrus, Benjamin; Marcourt, Laurence; Merle, Philippe; Delort, Estelle; Baroux, Lucie; Sommer, Horst; Rudaz, Serge; Wolfender, Jean-Luc
2014-05-01
The detailed characterization of cold-pressed lemon oils (CPLOs) is of great importance for the flavor and fragrance (F&F) industry. Since authenticity controls based on standard analytical techniques can be bypassed by elaborately adulterated oils that feign a higher quality, a combination of advanced orthogonal methods has been developed. The present study describes a combined metabolomic approach based on UHPLC-TOF-MS profiling and ¹H NMR fingerprinting to highlight metabolite differences in a set of representative samples used in the F&F industry. A new protocol was set up and adapted to the use of CPLO residues. Multivariate analysis based on both fingerprinting methods showed significant chemical variations between Argentinian and Italian samples. Discriminating markers identified in mixtures belong to the furocoumarins, flavonoids, terpenoids and fatty acids. Quantitative NMR revealed low citropten and high bergamottin content in Italian samples. The developed metabolomic approach applied to CPLO residues opens new perspectives for authenticity assessment. Copyright © 2013 Elsevier Ltd. All rights reserved.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
... quantitative approaches to determine the levels of taking that would result in a negligible impact to affected species or stocks of marine mammals. The quantitative approach is more appropriate for serious injury and... required a more quantitative approach for assessing what level of removals from a population stock of...
Pluye, Pierre; Hong, Quan Nha
2014-01-01
This article provides an overview of mixed methods research and mixed studies reviews. These two approaches are used to combine the strengths of quantitative and qualitative methods and to compensate for their respective limitations. This article is structured in three main parts. First, the epistemological background for mixed methods will be presented. Afterward, we present the main types of mixed methods research designs and techniques as well as guidance for planning, conducting, and appraising mixed methods research. In the last part, we describe the main types of mixed studies reviews and provide a tool kit and examples. Future research needs to offer guidance for assessing mixed methods research and reporting mixed studies reviews, among other challenges.
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2015-07-01
We combine pulsed photothermal radiometry (PPTR) depth profiling with diffuse reflectance spectroscopy (DRS) measurements for a comprehensive analysis of bruise evolution in vivo. While PPTR enables extraction of detailed depth distribution and concentration profiles of selected absorbers (e.g. melanin, hemoglobin), DRS provides information in a wide range of visible wavelengths and thus offers an additional insight into dynamics of the hemoglobin degradation products. Combining the two approaches enables us to quantitatively characterize bruise evolution dynamics. Our results indicate temporal variations of the bruise evolution parameters in the course of bruise self-healing process. The obtained parameter values and trends represent a basis for a future development of an objective technique for bruise age determination.
Ashouri, Sajad; Abedi, Mohsen; Abdollahi, Masoud; Dehghan Manshadi, Farideh; Parnianpour, Mohamad; Khalaf, Kinda
2017-10-01
This paper presents a novel approach for evaluating low back pain (LBP) in various settings. The proposed system uses cost-effective inertial sensors, in conjunction with pattern recognition techniques, to identify sensitive classifiers for discriminating LBP patients. For validation, 24 healthy individuals and 28 low back pain patients performed trunk motion tasks in five different directions. Four combinations of these motions were selected based on the literature, and the corresponding kinematic data were collected. Upon filtering (4th order, low pass Butterworth filter) and normalizing the data, Principal Component Analysis was used for feature extraction, while a Support Vector Machine classifier was applied for data classification. The results reveal that non-linear kernel classification can be adequately employed for low back pain identification. Our preliminary results demonstrate that using a single inertial sensor placed on the thorax, in conjunction with a relatively simple test protocol, can identify low back pain with an accuracy of 96%, a sensitivity of 100%, and a specificity of 92%. While our approach shows promising results, further validation in a larger population is required before the methodology can be used as a practical quantitative assessment tool for the detection of low back pain in clinical/rehabilitation settings. Copyright © 2017 Elsevier Ltd. All rights reserved.
Pouillot, Régis; Chen, Yuhuan; Hoelzer, Karin
2015-02-01
When developing quantitative risk assessment models, a fundamental consideration for risk assessors is to decide whether to evaluate changes in bacterial levels in terms of concentrations or in terms of bacterial numbers. Although modeling bacteria in terms of integer numbers may be regarded as a more intuitive and rigorous choice, modeling bacterial concentrations is more popular as it is generally less mathematically complex. We tested three different modeling approaches in a simulation study. The first approach considered bacterial concentrations; the second considered the number of bacteria in contaminated units, and the third considered the expected number of bacteria in contaminated units. Simulation results indicate that modeling concentrations tends to overestimate risk compared to modeling the number of bacteria. A sensitivity analysis using a regression tree suggests that processes which include drastic scenarios consisting of combinations of large bacterial inactivation followed by large bacterial growth frequently lead to a >10-fold overestimation of the average risk when modeling concentrations as opposed to bacterial numbers. Alternatively, the approach of modeling the expected number of bacteria in positive units generates results similar to the second method and is easier to use, thus potentially representing a promising compromise. Published by Elsevier Ltd.
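The overestimation mechanism is easy to reproduce: under a drastic kill step, the integer model sterilizes most units (which then cannot regrow), while the concentration model lets a fractional residue grow back in every unit. A toy sketch with an assumed exponential dose-response and illustrative values only:

    import numpy as np

    rng = np.random.default_rng(5)
    n_units = 1_000_000
    c0 = 0.5                    # mean initial contamination, CFU/unit (hypothetical)
    kill = 10.0 ** -4           # 4-log inactivation
    growth = 10.0 ** 5          # 5-log growth
    r = 1e-3                    # exponential dose-response parameter (hypothetical)

    def dose_response(dose):
        return 1.0 - np.exp(-r * dose)

    # Concentration model: every unit carries the same fractional dose
    risk_conc = dose_response(c0 * kill * growth)

    # Number model: integer counts; a unit sterilized by the kill step stays at zero
    n0 = rng.poisson(c0, n_units)
    survivors = rng.binomial(n0, kill)
    risk_count = dose_response(survivors * growth).mean()

    print(risk_conc, risk_count)   # the concentration model is markedly higher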
Hazard Screening Methods for Nanomaterials: A Comparative Study
Murphy, Finbarr; Mullins, Martin; Furxhi, Irini; Costa, Anna L.; Simeone, Felice C.
2018-01-01
Hazard identification is the key step in risk assessment and management of manufactured nanomaterials (NM). However, the rapid commercialisation of nano-enabled products continues to out-pace the development of a prudent risk management mechanism that is widely accepted by the scientific community and enforced by regulators. Nevertheless, a growing body of academic literature is developing promising quantitative methods. Two approaches have gained significant currency. Bayesian networks (BN) are a probabilistic, machine learning approach, while the weight of evidence (WoE) statistical framework is based on expert elicitation. This comparative study investigates the efficacy of quantitative WoE and Bayesian methodologies in ranking the potential hazard of metal and metal-oxide NMs (TiO2, Ag, and ZnO). This research finds that hazard ranking is consistent across both risk assessment approaches. The BN and WoE models both utilize physico-chemical, toxicological, and study-type data to infer the hazard potential. The BN exhibits more stability when the models are perturbed with new data. The BN has the significant advantage of self-learning with new data; however, this assumes all input data are equally valid. This research finds that a combination of WoE, which would rank input data, with the BN is the optimal hazard assessment framework. PMID:29495342
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1995-06-01
The basic relationships between stress and strain under cyclic conditions of loading are not at present well understood. It would seem that information of this type is vital for a fundamental approach to understand the fatigue behavior of dynamically loaded structures. In this paper, experimental and computational methods are utilized to study the fatigue behavior of a thin aluminum cantilever plate subjected to dynamic loading. The studies are performed by combining optomechanical and finite element methods. The cantilever plate is loaded periodically by excitation set at a fixed amplitude and at a specific resonance frequency of the plate. By continuously applying this type of loading and using holographic interferometry, the behavior of the plate during a specific period of time is investigated. Quantitative information is obtained from laser vibrometry data which are utilized by a finite element program to calculate strains and stresses assuming a homogeneous and isotropic material and constant strain elements. It is shown that the use of experimental and computational hybrid methodologies allows identification of different zones of the plate that are fatigue critical. This optomechanical approach proves to be a viable tool for understanding of fatigue behavior of mechanical components and for performing optimization of structures subjected to fatigue conditions.
Schaafsma, G; Hertsenberg, A J; Marquart, J
2011-08-01
Within the framework of REACH, an assessment regarding local dermal effects and skin sensitisation should be performed for substances. Quantitative hazard information for these effects is often not available. Furthermore, it is difficult to relate the way in which animals are exposed in dermal toxicity studies directly to dermal exposure in practice. In the absence of quantitative information, a qualitative assessment for dermal effects is the most reasonable option. The qualitative approach as proposed in the REACH guidance recommends only general risk management measures (RMM) for three categories with a low, moderate and high identified hazard, without specifying which RMM are needed for a specific exposure scenario. We propose to differentiate frequency of exposure based on differences in activities and to compare measured and estimated local skin exposure levels with rules of thumb for evaluation of control of risks per hazard category. For workers, specific RMM regimes are assigned to each combination of hazard category and process category (PROC). For consumers, a strategy in which RMM are arranged from product-integrated measures to the use of personal protective equipment (PPE) is presented. Our approach may be transferred into automated assessment tools like Chesar and CEFIC GES. Copyright © 2011 Elsevier Inc. All rights reserved.
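As a purely hypothetical illustration of the proposed differentiation (the hazard bands, PROC code, and measures below are invented placeholders, not the authors' actual assignments), the worker-side logic amounts to a lookup from a (hazard category, PROC) pair to an RMM regime:

```python
# Hypothetical mapping only -- the hazard bands, PROC codes and measures
# are invented for illustration, not taken from the REACH guidance.
RMM_REGIMES = {
    ("low", "PROC7"): ["general ventilation"],
    ("moderate", "PROC7"): ["local exhaust ventilation", "gloves"],
    ("high", "PROC7"): ["closed system", "gloves", "protective clothing"],
}

def workers_rmm(hazard_category, proc):
    """Return the RMM regime assigned to a hazard/PROC combination."""
    regime = RMM_REGIMES.get((hazard_category, proc))
    if regime is None:
        raise ValueError("no regime defined for this combination")
    return regime

print(workers_rmm("moderate", "PROC7"))
```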
Baeßler, Bettina; Schaarschmidt, Frank; Treutlein, Melanie; Stehning, Christian; Schnackenburg, Bernhard; Michels, Guido; Maintz, David; Bunck, Alexander C
2017-12-01
To re-evaluate a recently suggested approach of quantifying myocardial oedema and increased tissue inhomogeneity in myocarditis by T2-mapping. Cardiac magnetic resonance data of 99 patients with myocarditis were retrospectively analysed. Thirty healthy volunteers served as controls. T2-mapping data were acquired at 1.5 T using a gradient-spin-echo T2-mapping sequence. T2-maps were segmented according to the 16-segment AHA model. Segmental T2-values, segmental pixel-standard deviation (SD) and the derived parameters maxT2, maxSD and madSD were analysed and compared to the established Lake Louise criteria (LLC). A re-estimation of logistic regression models revealed that all models containing an SD-parameter were superior to any model containing global myocardial T2. Using a combined cut-off of 1.8 ms for madSD + 68 ms for maxT2 resulted in a diagnostic sensitivity of 75% and specificity of 80% and showed a diagnostic performance similar to LLC in receiver operating characteristic analyses. Combining madSD, maxT2 and late gadolinium enhancement (LGE) in a model resulted in a diagnostic performance superior to LLC (sensitivity 93%, specificity 83%). The results show that the novel T2-mapping-derived parameters exhibit an additional diagnostic value over LGE, with the inherent potential to overcome the current limitations of T2-mapping. • A novel quantitative approach to myocardial oedema imaging in myocarditis was re-evaluated. • The T2-mapping-derived parameters maxT2 and madSD were compared to traditional Lake Louise criteria. • Using maxT2 and madSD with dedicated cut-offs performs similarly to Lake Louise criteria. • Adding maxT2 and madSD to LGE results in further increased diagnostic performance. • This novel approach has the potential to overcome the limitations of T2-mapping.
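The combined cut-off reported above can be read as a simple decision rule; a minimal sketch follows, with the caveat that the cut-off values come from the abstract while the OR combination of the two criteria is an assumption on our part:

```python
def myocarditis_positive(madSD_ms, maxT2_ms):
    # Cut-offs from the abstract (1.8 ms, 68 ms); combining them with OR
    # is an assumption, not a detail stated in the abstract.
    return madSD_ms > 1.8 or maxT2_ms > 68.0

print(myocarditis_positive(2.1, 65.0))  # True: madSD exceeds its cut-off
print(myocarditis_positive(1.2, 65.0))  # False: neither criterion met
```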
Clustering of samples and variables with mixed-type data
Edelmann, Dominic; Kopp-Schneider, Annette
2017-01-01
Analysis of data measured on different scales is a relevant challenge. Biomedical studies often focus on high-throughput datasets of, e.g., quantitative measurements. However, the need for integration of other features possibly measured on different scales, e.g. clinical or cytogenetic factors, becomes increasingly important. The analysis results (e.g. a selection of relevant genes) are then visualized, while adding further information, like clinical factors, on top. However, a more integrative approach is desirable, where all available data are analyzed jointly, and where also in the visualization different data sources are combined in a more natural way. Here we specifically target integrative visualization and present a heatmap-style graphic display. To this end, we develop and explore methods for clustering mixed-type data, with special focus on clustering variables. Clustering of variables does not receive as much attention in the literature as does clustering of samples. We extend the variables clustering methodology by two new approaches, one based on the combination of different association measures and the other on distance correlation. With simulation studies we evaluate and compare different clustering strategies. Applying specific methods for mixed-type data proves to be comparable and in many cases beneficial as compared to standard approaches applied to corresponding quantitative or binarized data. Our two novel approaches for mixed-type variables show similar or better performance than the existing methods ClustOfVar and bias-corrected mutual information. Further, in contrast to ClustOfVar, our methods provide dissimilarity matrices, which is an advantage, especially for the purpose of visualization. Real data examples aim to give an impression of various kinds of potential applications for the integrative heatmap and other graphical displays based on dissimilarity matrices. We demonstrate that the presented integrative heatmap provides more information than common data displays about the relationship among variables and samples. The described clustering and visualization methods are implemented in our R package CluMix available from https://cran.r-project.org/web/packages/CluMix. PMID:29182671
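For readers unfamiliar with mixed-type dissimilarities, the sketch below shows the standard Gower-style construction that such methods build on (a baseline illustration, not the authors' novel association-combination or distance-correlation measures): range-scaled differences for quantitative columns and simple mismatches for categorical ones, averaged across columns to give exactly the kind of dissimilarity matrix used for clustering and heatmap ordering.

```python
import numpy as np
import pandas as pd

def gower_dissimilarity(df: pd.DataFrame) -> np.ndarray:
    """Standard Gower dissimilarity for mixed-type samples: range-scaled
    absolute differences for numeric columns, simple mismatch for
    categorical columns, averaged over columns."""
    n = len(df)
    d = np.zeros((n, n))
    for col in df.columns:
        x = df[col]
        if pd.api.types.is_numeric_dtype(x):
            rng = x.max() - x.min() or 1.0
            part = np.abs(x.values[:, None] - x.values[None, :]) / rng
        else:
            part = (x.values[:, None] != x.values[None, :]).astype(float)
        d += part
    return d / df.shape[1]

df = pd.DataFrame({"expr": [1.2, 3.4, 2.2], "grade": ["I", "III", "I"]})
print(gower_dissimilarity(df).round(2))
```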
NASA Astrophysics Data System (ADS)
McLeod, Euan
2016-03-01
The sizing of individual nanoparticles and the recovery of the distributions of sizes from populations of nanoparticles provide valuable information in virology, exosome analysis, air and water quality monitoring, and nanomaterials synthesis. Conventional approaches for nanoparticle sizing include those based on costly or low-throughput laboratory-scale equipment such as transmission electron microscopy or nanoparticle tracking analysis, as well as those approaches that only provide population-averaged quantities, such as dynamic light scattering. Some of these limitations can be overcome using a new family of alternative approaches based on quantitative phase imaging that combines lensfree holographic on-chip microscopy with self-assembled liquid nanolenses. In these approaches, the particles of interest are deposited onto a glass coverslip and the sample is coated with either pure liquid polyethylene glycol (PEG) or aqueous solutions of PEG. Due to surface tension, the PEG self-assembles into nano-scale lenses around the particles of interest. These nanolenses enhance the scattering signatures of the embedded particles such that individual nanoparticles as small as 40 nm are clearly visible in phase images reconstructed from captured holograms. The magnitude of the phase quantitatively corresponds to particle size with an accuracy of +/-11 nm. This family of approaches can individually size more than 10^5 particles in parallel, can handle a large dynamic range of particle sizes (40 nm - 100s of microns), and can accurately size multi-modal distributions of particles. Furthermore, the entire approach has been implemented in a compact and cost-effective device suitable for use in the field or in low-resource settings.
Lanfranchi, Fiorella; Alaimo, Sara; Conway, P M
2014-01-01
In 2010, Italian regulatory guidelines were issued consisting of a stepwise procedure for the assessment and management of work-related stress. However, research that empirically examines whether this procedure proves effective in accurately identifying critical psychosocial factors and informing risk management is scarce. To examine the differential sensitivity of two approaches to risk assessment, the first based on objective instruments only, the second consisting of an integrated approach combining different methods and theoretical perspectives. We examined a sample of 306 healthcare employees in a large hospital in northern Italy, using a series of tools, both quantitative (an observational checklist and the HSE-IT and MOHQ questionnaires) and qualitative (focus groups). Through instrument-specific reference values, we then compared risk profiles between different homogeneous groups within the institution. The psychosocial work environment appeared far more positive when adopting the first rather than the second approach to risk assessment. The latter approach was also more sensitive in detecting between-group differences in risk profiles. Furthermore, the focus groups returned a more context-specific picture of the psychosocial work environment. Finally, going beyond the emphasis on negative working conditions inherent in the other quantitative instruments, the MOHQ also allowed for identifying health-promoting factors in need of improvement. Although more research is needed to confirm our findings, the present study suggests that using an integrated approach to assess the psychosocial work environment may be the most effective way to accurately identify risk factors and support the management process.
Crowe, Sonya; Brown, Katherine; Tregay, Jenifer; Wray, Jo; Knowles, Rachel; Ridout, Deborah A; Bull, Catherine; Utley, Martin
2017-08-01
Improving integration and continuity of care across sectors within resource constraints is a priority in many health systems. Qualitative operational research methods of problem structuring have been used to address quality improvement in services involving multiple sectors but not in combination with quantitative operational research methods that enable targeting of interventions according to patient risk. We aimed to combine these methods to augment and inform an improvement initiative concerning infants with congenital heart disease (CHD) whose complex care pathway spans multiple sectors. Soft systems methodology was used to consider systematically changes to services from the perspectives of community, primary, secondary and tertiary care professionals and a patient group, incorporating relevant evidence. Classification and regression tree (CART) analysis of national audit datasets was conducted along with data visualisation designed to inform service improvement within the context of limited resources. A 'Rich Picture' was developed capturing the main features of services for infants with CHD pertinent to service improvement. This was used, along with a graphical summary of the CART analysis, to guide discussions about targeting interventions at specific patient risk groups. Agreement was reached across representatives of relevant health professions and patients on a coherent set of targeted recommendations for quality improvement. These fed into national decisions about service provision and commissioning. When tackling complex problems in service provision across multiple settings, it is important to acknowledge and work with multiple perspectives systematically and to consider targeting service improvements in response to confined resources. Our research demonstrates that applying a combination of qualitative and quantitative operational research methods is one approach to doing so that warrants further consideration. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
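A minimal sketch of the CART ingredient (with invented features and outcome, not the national audit variables) shows how a shallow tree yields interpretable patient risk groups that interventions can then target:

```python
# Minimal sketch with synthetic data; feature names are invented stand-ins
# for the kind of clinical variables a national audit dataset might hold.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 3))            # e.g. weight, age, saturation
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=500) > 1).astype(int)

# A shallow tree keeps the risk groups few and interpretable.
tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=50)
tree.fit(X, y)
print(export_text(tree, feature_names=["weight", "age", "saturation"]))
```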
Ab initio structures and polarizabilities of sodium clusters
NASA Astrophysics Data System (ADS)
Kronik, Leeor; Vasiliev, Igor; Jain, Manish; Chelikowsky, James R.
2001-09-01
We present quantitative ab initio calculations for Na cluster structures and polarizabilities, for all cluster sizes up to 20 atoms. Our calculations are performed by combining an ab initio core-corrected pseudopotential and a gradient-corrected density functional within a real space approach. We find the cluster bonding to be very floppy and catalog a host of low-energy quasi-degenerate isomers for all second-decade clusters. The existence of these isomers results in a band of polarizability values for each cluster size even at zero temperature. This eliminates any finer structure in the polarizability curve. We further show that the experimental polarizability values are consistently underestimated by calculations at zero temperature. By computing the effects of structure expansion and distortion due to a finite temperature we arrive at a quantitative agreement between theory and experiment.
Quinn, T. Alexander; Kohl, Peter
2013-01-01
Since the development of the first mathematical cardiac cell model 50 years ago, computational modelling has become an increasingly powerful tool for the analysis of data and for the integration of information related to complex cardiac behaviour. Current models build on decades of iteration between experiment and theory, representing a collective understanding of cardiac function. All models, whether computational, experimental, or conceptual, are simplified representations of reality and, like tools in a toolbox, suitable for specific applications. Their range of applicability can be explored (and expanded) by iterative combination of ‘wet’ and ‘dry’ investigation, where experimental or clinical data are used to first build and then validate computational models (allowing integration of previous findings, quantitative assessment of conceptual models, and projection across relevant spatial and temporal scales), while computational simulations are utilized for plausibility assessment, hypotheses-generation, and prediction (thereby defining further experimental research targets). When implemented effectively, this combined wet/dry research approach can support the development of a more complete and cohesive understanding of integrated biological function. This review illustrates the utility of such an approach, based on recent examples of multi-scale studies of cardiac structure and mechano-electric function. PMID:23334215
Shafirkin, A V; Mukhamedieva, L N; Tatarkin, S V; Barantseva, M Iu
2012-01-01
The aim of the work was to analyze the existing issues in ensuring safety during extended orbital and exploration missions, in order to estimate actual values of the total radiation risk for the crew, as well as the risks of other delayed effects of simultaneous exposure to ionizing radiation, chemical pollutants in cabin air, and a number of other stressful factors inevitable in space flight. The design of chronic experiments for separate and combined studies, reproducing the cabin air composition and radiation doses of actual orbital and projected exploration missions, is outlined. To set safety limits, new approaches should be applied that describe the gradual degradation from the norm to pathology, in addition to several generalized quantitative indices of adaptation and strain of the regulatory systems, as well as of the effectiveness of the compensatory body reserve against separate and combined exposures.
Principles of dynamical modularity in biological regulatory networks
Deritei, Dávid; Aird, William C.; Ercsey-Ravasz, Mária; Regan, Erzsébet Ravasz
2016-01-01
Intractable diseases such as cancer are associated with breakdown in multiple individual functions, which conspire to create unhealthy phenotype-combinations. An important challenge is to decipher how these functions are coordinated in health and disease. We approach this by drawing on dynamical systems theory. We posit that distinct phenotype-combinations are generated by interactions among robust regulatory switches, each in control of a discrete set of phenotypic outcomes. First, we demonstrate the advantage of characterizing multi-switch regulatory systems in terms of their constituent switches by building a multiswitch cell cycle model which points to novel, testable interactions critical for early G2/M commitment to division. Second, we define quantitative measures of dynamical modularity, namely that global cell states are discrete combinations of switch-level phenotypes. Finally, we formulate three general principles that govern the way coupled switches coordinate their function. PMID:26979940
Optimizing Hybrid Metrology: Rigorous Implementation of Bayesian and Combined Regression
Henn, Mark-Alexander; Silver, Richard M.; Villarrubia, John S.; Zhang, Nien Fan; Zhou, Hui; Barnes, Bryan M.; Ming, Bin; Vladár, András E.
2015-01-01
Hybrid metrology, e.g., the combination of several measurement techniques to determine critical dimensions, is an increasingly important approach to meet the needs of the semiconductor industry. A proper use of hybrid metrology may yield not only more reliable estimates for the quantitative characterization of 3-D structures but also a more realistic estimation of the corresponding uncertainties. Recent developments at the National Institute of Standards and Technology (NIST) feature the combination of optical critical dimension (OCD) measurements and scanning electron microscope (SEM) results. The hybrid methodology offers the potential to make measurements of essential 3-D attributes that may not be otherwise feasible. However, combining techniques gives rise to essential challenges in error analysis and comparing results from different instrument models, especially the effect of systematic and highly correlated errors in the measurement on the χ2 function that is minimized. Both hypothetical examples and measurement data are used to illustrate solutions to these challenges. PMID:26681991
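The error-analysis challenge can be stated compactly: with correlated errors, the quantity to minimise is the generalised χ2, r^T C^{-1} r with C the error covariance, rather than an ordinary sum of squares (which implicitly takes C diagonal). A minimal numerical sketch with invented numbers:

```python
# Minimal sketch of the generalised chi-squared; numbers are invented.
import numpy as np

def chi2(residuals: np.ndarray, cov: np.ndarray) -> float:
    """Generalised chi-squared r^T C^{-1} r for residuals with covariance C."""
    return float(residuals @ np.linalg.solve(cov, residuals))

r = np.array([0.5, 0.6])
c_uncorr = np.diag([0.25, 0.25])                  # independent errors
c_corr = np.array([[0.25, 0.20], [0.20, 0.25]])   # strongly correlated
print(chi2(r, c_uncorr))  # 2.44
print(chi2(r, c_corr))    # ~1.44: correlation changes the weighting
```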
Vázquez-Rowe, Ian; Iribarren, Diego
2015-01-01
Life-cycle (LC) approaches play a significant role in energy policy making to determine the environmental impacts associated with the choice of energy source. Data envelopment analysis (DEA) can be combined with LC approaches to provide quantitative benchmarks that orientate the performance of energy systems towards environmental sustainability, with different implications depending on the selected LC + DEA method. The present paper examines currently available LC + DEA methods and develops a novel method combining carbon footprinting (CFP) and DEA. Thus, the CFP + DEA method is proposed, a five-step structure including data collection for multiple homogenous entities, calculation of target operating points, evaluation of current and target carbon footprints, and result interpretation. As the current context for energy policy implies an anthropocentric perspective with focus on the global warming impact of energy systems, the CFP + DEA method is foreseen to be the most consistent LC + DEA approach to provide benchmarks for energy policy making. The fact that this method relies on the definition of operating points with optimised resource intensity helps to moderate the concerns about the omission of other environmental impacts. Moreover, the CFP + DEA method benefits from CFP specifications in terms of flexibility, understanding, and reporting.
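To make the DEA building block concrete, the sketch below computes an input-oriented CCR efficiency score with scipy (a textbook formulation under invented data, not the paper's CFP + DEA implementation); in the CFP + DEA setting the input would be the carbon footprint of each entity and the output its functional unit delivered.

```python
# Minimal sketch of input-oriented CCR DEA; data are invented.
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of unit o: min theta s.t. a nonnegative combination of
    observed units produces >= Y[o] using <= theta * X[o]."""
    n, m = X.shape          # n units, m inputs
    s = Y.shape[1]          # s outputs
    c = np.r_[1.0, np.zeros(n)]               # variables: [theta, lambdas]
    A_in = np.c_[-X[o][:, None], X.T]         # sum lam*x <= theta * x_o
    A_out = np.c_[np.zeros((s, 1)), -Y.T]     # sum lam*y >= y_o
    res = linprog(c, A_ub=np.r_[A_in, A_out],
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

X = np.array([[2.0], [4.0], [3.0]])   # e.g. kg CO2e per entity
Y = np.array([[1.0], [1.0], [1.0]])   # one functional unit each
print([round(ccr_efficiency(X, Y, o), 2) for o in range(3)])  # [1.0, 0.5, 0.67]
```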
Godinez, William J; Rohr, Karl
2015-02-01
Tracking subcellular structures as well as viral structures displayed as 'particles' in fluorescence microscopy images yields quantitative information on the underlying dynamical processes. We have developed an approach for tracking multiple fluorescent particles based on probabilistic data association. The approach combines a localization scheme that uses a bottom-up strategy based on the spot-enhancing filter as well as a top-down strategy based on an ellipsoidal sampling scheme that uses the Gaussian probability distributions computed by a Kalman filter. The localization scheme yields multiple measurements that are incorporated into the Kalman filter via a combined innovation, where the association probabilities are interpreted as weights calculated using an image likelihood. To track objects in close proximity, we compute the support of each image position relative to the neighboring objects of a tracked object and use this support to recalculate the weights. To cope with multiple motion models, we integrated the interacting multiple model algorithm. The approach has been successfully applied to synthetic 2-D and 3-D images as well as to real 2-D and 3-D microscopy images, and the performance has been quantified. In addition, the approach was successfully applied to the 2-D and 3-D image data of the recent Particle Tracking Challenge at the IEEE International Symposium on Biomedical Imaging (ISBI) 2012.
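A one-dimensional sketch (invented numbers, deliberately omitting the gating and the spread-of-innovations covariance term of full probabilistic data association) illustrates the combined-innovation idea: each candidate measurement contributes to a single Kalman update in proportion to its association probability.

```python
# 1-D combined-innovation sketch; not the authors' full 2-D/3-D pipeline.
import numpy as np

x, P = 10.0, 4.0        # predicted state and variance
R = 1.0                 # measurement noise variance
z = np.array([9.2, 10.9, 14.0])          # candidate measurements

S = P + R                                # innovation covariance
K = P / S                                # Kalman gain
lik = np.exp(-0.5 * (z - x) ** 2 / S)    # measurement likelihoods
w = lik / lik.sum()                      # association probabilities
nu = np.sum(w * (z - x))                 # combined innovation

x_new = x + K * nu
P_new = P - K * S * K                    # spread-of-innovations term omitted
print(round(x_new, 2), round(P_new, 2))
```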
Geary, Nori
2013-02-01
Analysis of the interactive effects of combinations of hormones or other manipulations with qualitatively similar individual effects is an important topic in basic and clinical endocrinology as well as other branches of basic and clinical research related to integrative physiology. Functional, as opposed to mechanistic, analyses of interactions rely on the concept of synergy, which can be defined qualitatively as a cooperative action or quantitatively as a supra-additive effect according to some metric for the addition of different dose-effect curves. Unfortunately, dose-effect curve addition is far from straightforward; rather, it requires the development of an axiomatic mathematical theory. I review the mathematical soundness, face validity, and utility of the most frequently used approaches to supra-additive synergy. These criteria highlight serious problems in the two most common synergy approaches, response additivity and Loewe additivity, which is the basis of the isobole and related response surface approaches. I conclude that there is no adequate, generally applicable, supra-additive synergy metric appropriate for endocrinology or any other field of basic and clinical integrative physiology. I recommend that these metrics be abandoned in favor of the simpler definition of synergy as a cooperative, i.e., nonantagonistic, effect. This simple definition avoids mathematical difficulties, is easily applicable, meets regulatory requirements for combination therapy development, and suffices to advance phenomenological basic research to mechanistic studies of interactions and clinical combination therapy research.
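For concreteness, the Loewe interaction index that underlies the isobole approach criticised here can be written in a few lines (doses are invented; the review's point is precisely that reading an index below 1 as synergy is problematic):

```python
def loewe_index(d1, d2, D1, D2):
    """Loewe interaction index: d1/D1 + d2/D2, where D1 and D2 are the
    doses of each agent alone producing the same effect as the (d1, d2)
    combination. An index < 1 is conventionally read as supra-additive."""
    return d1 / D1 + d2 / D2

# Invented numbers: half the single-agent dose of each drug suffices.
print(loewe_index(d1=2.0, d2=5.0, D1=8.0, D2=20.0))  # 0.5 -> "synergy"
```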
Diffusion Lung Imaging with Hyperpolarized Gas MRI
Yablonskiy, Dmitriy A; Sukstanskii, Alexander L; Quirk, James D
2015-01-01
Lung imaging using conventional 1H MRI presents great challenges due to the low density of lung tissue, lung motion, and very fast lung tissue transverse relaxation (typical T2* is about 1-2 ms). MRI with hyperpolarized gases (3He and 129Xe) provides a valuable alternative due to the very strong signal originating from inhaled gas residing in the lung airspaces and relatively slow gas T2* relaxation (typical T2* is about 20-30 ms). However, in vivo human experiments must be done very fast, usually during a single breath-hold. In this review we describe recent developments in diffusion lung MRI with hyperpolarized gases. We show that a combination of modeling of gas diffusion in lung airspaces and diffusion measurements with variable diffusion-sensitizing gradients allows extraction of quantitative information on the lung microstructure at the alveolar level. This approach, called in vivo lung morphometry, provides, from a less than 15-second MRI scan, quantitative values and spatial distributions of the same physiological parameters as are measured by means of “standard” invasive stereology (mean linear intercept, surface-to-volume ratio, density of alveoli, etc.). In addition, the approach makes it possible to evaluate some advanced Weibel parameters characterizing lung microstructure: average radii of alveolar sacs and ducts, as well as the depth of their alveolar sleeves. Such measurements, providing in vivo information on the integrity of pulmonary acinar airways and their changes in different diseases, are of great importance and interest to a broad range of physiologists and clinicians. We also discuss a new type of experiment based on the in vivo lung morphometry technique combined with quantitative CT measurements, as well as with gradient-echo MRI measurements of hyperpolarized gas transverse relaxation in the lung airspaces. Such experiments provide additional information on the blood vessel volume fraction, specific gas volume, and the length of acinar airways, and allow evaluation of lung parenchymal and non-parenchymal tissue. PMID:26676342
Quantitative mouse brain phenotyping based on single and multispectral MR protocols
Badea, Alexandra; Gewalt, Sally; Avants, Brian B.; Cook, James J.; Johnson, G. Allan
2013-01-01
Sophisticated image analysis methods have been developed for the human brain, but such tools still need to be adapted and optimized for quantitative small animal imaging. We propose a framework for quantitative anatomical phenotyping in mouse models of neurological and psychiatric conditions. The framework encompasses an atlas space, image acquisition protocols, and software tools to register images into this space. We show that a suite of segmentation tools (Avants, Epstein et al., 2008) designed for human neuroimaging can be incorporated into a pipeline for segmenting mouse brain images acquired with multispectral magnetic resonance imaging (MR) protocols. We present a flexible approach for segmenting such hyperimages, optimizing registration, and identifying optimal combinations of image channels for particular structures. Brain imaging with T1, T2* and T2 contrasts yielded accuracy in the range of 83% for hippocampus and caudate putamen (Hc and CPu), but only 54% in white matter tracts, and 44% for the ventricles. The addition of diffusion tensor parameter images improved accuracy for large gray matter structures (by >5%), white matter (10%), and ventricles (15%). The use of Markov random field segmentation further improved overall accuracy in the C57BL/6 strain by 6%; so Dice coefficients for Hc and CPu reached 93%, for white matter 79%, for ventricles 68%, and for substantia nigra 80%. We demonstrate the segmentation pipeline for the widely used C57BL/6 strain, and two test strains (BXD29, APP/TTA). This approach appears promising for characterizing temporal changes in mouse models of human neurological and psychiatric conditions, and may provide anatomical constraints for other preclinical imaging, e.g. fMRI and molecular imaging. This is the first demonstration that multiple MR imaging modalities combined with multivariate segmentation methods lead to significant improvements in anatomical segmentation in the mouse brain. PMID:22836174
The interplay between QSAR/QSPR studies and partial order ranking and formal concept analyses.
Carlsen, Lars
2009-04-17
The often observed scarcity of physical-chemical as well as toxicological data hampers the assessment of potentially hazardous chemicals released to the environment. In such cases, Quantitative Structure-Activity Relationships/Quantitative Structure-Property Relationships (QSAR/QSPR) constitute an obvious alternative for rapidly, effectively and inexpensively generating missing experimental values. However, further treatment of the data is typically necessary, e.g., to elucidate the possible relations between the single compounds as well as the implications and associations between the various parameters used for the combined characterization of the compounds under investigation. In the present paper the application of QSAR/QSPR in combination with Partial Order Ranking (POR) methodologies is reviewed and new aspects using Formal Concept Analysis (FCA) are introduced. Where POR constitutes an attractive method for, e.g., prioritizing a series of chemical substances based on the simultaneous inclusion of a range of parameters, FCA gives important information on the implications and associations between the parameters. The combined approach thus constitutes an attractive method for a preliminary assessment of the impact on environmental and human health of primary pollutants, or of a primary pollutant together with a possible suite of subsequent transformation products that may be persistent, bioaccumulative and toxic. The present review focuses on the environmental and human health impact of residuals of the rocket fuel 1,1-dimethylhydrazine (heptyl) and its transformation products as an illustrative example.
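The partial-order step can be illustrated with a minimal dominance check (descriptor names and values invented): compound a dominates b when it is at least as high on every parameter and strictly higher on at least one, and pairs where neither dominates remain incomparable, which is what separates POR from a forced total ranking.

```python
# Minimal POR dominance sketch; descriptors and values are invented.
def dominates(a, b):
    """True if a is >= b on every parameter and > b on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# (persistence, bioaccumulation, toxicity), scaled to [0, 1]
compounds = {"A": (0.9, 0.8, 0.7), "B": (0.5, 0.4, 0.3), "C": (0.6, 0.9, 0.2)}
for p in compounds:
    for q in compounds:
        if dominates(compounds[p], compounds[q]):
            print(f"{p} dominates {q}")
# Only "A dominates B" is printed; A/C and B/C are incomparable pairs.
```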
NASA Astrophysics Data System (ADS)
Ibrahim, Bashirah; Ding, Lin; Heckler, Andrew F.; White, Daniel R.; Badeau, Ryan
2017-12-01
We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students' mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students' simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students' formulation and combination of equations. Several reasons may explain this difference, including the students' different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
Chen, Lian; Zhou, Shenglu; Wu, Shaohua; Wang, Chunhui; Li, Baojie; Li, Yan; Wang, Junxiao
2018-08-01
Two quantitative methods (emission inventory and isotope ratio analysis) were combined to apportion source contributions of heavy metals entering agricultural soils in the Lihe River watershed (Taihu region, east China). Source apportionment based on the emission inventory method indicated that for Cd, Cr, Cu, Pb, and Zn, the mean percentage input from atmospheric deposition was highest (62-85%), followed by irrigation (12-27%) and fertilization (1-14%). Thus, the heavy metals were derived mainly from industrial activities and traffic emissions. For Ni the combined percentage input from irrigation and fertilization was approximately 20% higher than that from atmospheric deposition, indicating that Ni was mainly derived from agricultural activities. Based on isotope ratio analysis, atmospheric deposition accounted for 57-93% of Pb entering soil, with a mean value of 69.3%, which indicates that this was the major source of Pb entering soil in the study area. The mean contributions of irrigation and fertilization to Pb pollution of soil ranged from 0% to 10%, indicating that they played only a marginally important role. Overall, the results obtained using the two methods were similar. This study provides a reliable approach for source apportionment of heavy metals entering agricultural soils in the study area, and clearly has potential application for future studies in other regions. Copyright © 2018 Elsevier Ltd. All rights reserved.
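The isotope-ratio step is, at its core, an end-member mixing calculation; a minimal two-source sketch follows (the ratios are invented, and real apportionment uses measured end-member ranges and often more than two sources):

```python
# Two-end-member mixing sketch; isotope ratios below are invented.
def source1_fraction(r_sample, r_source1, r_source2):
    """Solve r_sample = f*r_source1 + (1 - f)*r_source2 for f."""
    return (r_sample - r_source2) / (r_source1 - r_source2)

# e.g. 206Pb/207Pb of the soil input vs. two candidate sources
f = source1_fraction(r_sample=1.16, r_source1=1.14, r_source2=1.20)
print(f"atmospheric deposition share: {f:.0%}")  # 67%
```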
2010-01-01
Background The modular approach to analysis of genetically modified organisms (GMOs) relies on the independence of the modules combined (i.e. DNA extraction and GM quantification). The validity of this assumption has to be proved on the basis of specific performance criteria. Results An experiment was conducted using, as a reference, the validated quantitative real-time polymerase chain reaction (PCR) module for detection of glyphosate-tolerant Roundup Ready® GM soybean (RRS). Different DNA extraction modules (CTAB, Wizard and Dellaporta), were used to extract DNA from different food/feed matrices (feed, biscuit and certified reference material [CRM 1%]) containing the target of the real-time PCR module used for validation. Purity and structural integrity (absence of inhibition) were used as basic criteria that a DNA extraction module must satisfy in order to provide suitable template DNA for quantitative real-time (RT) PCR-based GMO analysis. When performance criteria were applied (removal of non-compliant DNA extracts), the independence of GMO quantification from the extraction method and matrix was statistically proved, except in the case of Wizard applied to biscuit. A fuzzy logic-based procedure also confirmed the relatively poor performance of the Wizard/biscuit combination. Conclusions For RRS, this study recognises that modularity can be generally accepted, with the limitation of avoiding combining highly processed material (i.e. biscuit) with a magnetic-beads system (i.e. Wizard). PMID:20687918
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator)
1983-01-01
The geometric quality of the TM and MSS film products was evaluated by making selective photo measurements such as scale, linear, and area determinations, and by measuring the coordinates of known features on both the film products and map products and then relating these paired observations using a standard linear least-squares regression approach. Quantitative interpretation tests are described which evaluate the quality and utility of the TM film products and various band combinations for detecting and identifying important forest and agricultural features.
Microwave resonances in dielectric samples probed in Corbino geometry: simulation and experiment.
Felger, M Maximilian; Dressel, Martin; Scheffler, Marc
2013-11-01
The Corbino approach, where the sample of interest terminates a coaxial cable, is a well-established method for microwave spectroscopy. If the sample is dielectric and if the probe geometry basically forms a conductive cavity, this combination can sustain well-defined microwave resonances that are detrimental for broadband measurements. Here, we present detailed simulations and measurements to investigate the resonance frequencies as a function of sample and probe size and of sample permittivity. This allows a quantitative optimization to increase the frequency of the lowest-lying resonance.
On the stability and instantaneous velocity of grasped frictionless objects
NASA Technical Reports Server (NTRS)
Trinkle, Jeffrey C.
1992-01-01
A quantitative test for form closure valid for any number of contact points is formulated as a linear program, the optimal objective value of which provides a measure of how far a grasp is from losing form closure. Another contribution of the study is the formulation of a linear program whose solution yields the same information as the classical approach. The benefit of the formulation is that explicit testing of all possible combinations of contact interactions can be avoided by the algorithm used to solve the linear program.
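A minimal sketch of such a linear program follows (a generic formulation consistent with the abstract, not necessarily the paper's exact one): maximise d subject to G λ = 0, λi ≥ d and a normalisation Σλ ≤ k, where G holds the contact wrenches; an optimal d* > 0 certifies form closure and its magnitude measures the margin.

```python
# Generic form-closure LP sketch; the normalisation choice is ours.
import numpy as np
from scipy.optimize import linprog

def form_closure_measure(G):
    k = G.shape[1]                      # number of contact wrenches
    c = np.r_[np.zeros(k), -1.0]        # maximise d (minimise -d)
    A_ub = np.r_[np.c_[-np.eye(k), np.ones((k, 1))],   # d - lam_i <= 0
                 np.c_[np.ones((1, k)), 0.0]]          # sum(lam) <= k
    b_ub = np.r_[np.zeros(k), k]
    A_eq = np.c_[G, np.zeros((G.shape[0], 1))]         # G @ lam = 0
    res = linprog(c, A_ub, b_ub, A_eq, np.zeros(G.shape[0]),
                  bounds=[(0, None)] * k + [(None, None)])
    return -res.fun

# 1-D toy: opposing contacts give closure, same-side contacts do not.
print(form_closure_measure(np.array([[1.0, -1.0]])))  # 1.0: form closure
print(form_closure_measure(np.array([[1.0, 1.0]])))   # 0.0: no closure
```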
Publics and biobanks: Pan-European diversity and the challenge of responsible innovation.
Gaskell, George; Gottweis, Herbert; Starkbaum, Johannes; Gerber, Monica M; Broerse, Jacqueline; Gottweis, Ursula; Hobbs, Abbi; Helén, Ilpo; Paschou, Maria; Snell, Karoliina; Soulier, Alexandra
2013-01-01
This article examines public perceptions of biobanks in Europe using a multi-method approach combining quantitative and qualitative data. It is shown that public support for biobanks in Europe is variable and dependent on a range of interconnected factors: people's engagement with biobanks; concerns about privacy and data security, and trust in the socio-political system, key actors and institutions involved in biobanks. We argue that the biobank community needs to acknowledge the impact of these factors if they are to successfully develop and integrate biobanks at a pan-European level.
Report on Physics of Channelization: Theory, Experiment, and Observation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kudrolli, Arshad
2014-05-19
The project involved a study of the physical processes that create eroded channels and drainage networks. A particular focus was on how the shapes of the channels and of the network depend on the nature of the fluid flow. Our approach was to combine theoretical, experimental, and observational studies in close collaboration with Professor Daniel Rothman of the Massachusetts Institute of Technology. Laboratory-scale experiments were developed, and quantitative data on the shape of the pattern and on the erosion dynamics were obtained with a laser-aided topography technique and fluorescent optical imaging techniques.
Image registration of low signal-to-noise cryo-STEM data.
Savitzky, Benjamin H; El Baggari, Ismail; Clement, Colin B; Waite, Emily; Goodge, Berit H; Baek, David J; Sheckelton, John P; Pasco, Christopher; Nair, Hari; Schreiber, Nathaniel J; Hoffman, Jason; Admasu, Alemayehu S; Kim, Jaewook; Cheong, Sang-Wook; Bhattacharya, Anand; Schlom, Darrell G; McQueen, Tyrel M; Hovden, Robert; Kourkoutis, Lena F
2018-08-01
Combining multiple fast image acquisitions to mitigate scan noise and drift artifacts has proven essential for picometer precision, quantitative analysis of atomic resolution scanning transmission electron microscopy (STEM) data. For very low signal-to-noise ratio (SNR) image stacks - frequently required for undistorted imaging at liquid nitrogen temperatures - image registration is particularly delicate, and standard approaches may either fail, or produce subtly specious reconstructed lattice images. We present an approach which effectively registers and averages image stacks which are challenging due to their low-SNR and propensity for unit cell misalignments. Registering all possible image pairs in a multi-image stack leads to significant information surplus. In combination with a simple physical picture of stage drift, this enables identification of incorrect image registrations, and determination of the optimal image shifts from the complete set of relative shifts. We demonstrate the effectiveness of our approach on experimental, cryogenic STEM datasets, highlighting subtle artifacts endemic to low-SNR lattice images and how they can be avoided. High-SNR average images with information transfer out to 0.72 Å are achieved at 300 kV and with the sample cooled to near liquid nitrogen temperature. Copyright © 2018 Elsevier B.V. All rights reserved.
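The information surplus from registering all image pairs can be exploited with a small linear system: each measured relative shift contributes one equation r_ij ≈ s_j − s_i, and the per-image shifts follow by least squares, with large residuals flagging incorrect registrations. A minimal one-axis sketch with invented numbers (the authors' pipeline additionally uses a physical drift model):

```python
# Least-squares recovery of absolute shifts from pairwise shifts;
# one axis, invented measurements.
import numpy as np

pairs = [(0, 1), (0, 2), (1, 2)]                 # all image pairs
r = np.array([1.0, 3.1, 2.0])                    # measured relative shifts
n = 3                                            # number of images

A = np.zeros((len(pairs) + 1, n))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0             # r_ij ~ s_j - s_i
A[-1, 0] = 1.0                                   # gauge fix: s_0 = 0
b = np.r_[r, 0.0]

s, *_ = np.linalg.lstsq(A, b, rcond=None)
residuals = A[:-1] @ s - r                       # large -> bad registration
print(np.round(s, 2), np.round(residuals, 2))
```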
Industrial inspection of specular surfaces using a new calibration procedure
NASA Astrophysics Data System (ADS)
Aswendt, Petra; Hofling, Roland; Gartner, Soren
2005-06-01
The methodology of phase-encoded reflection measurements has become a valuable tool for the industrial inspection of components with glossy surfaces. The measuring principle provides outstanding sensitivity to tiny variations of surface curvature, so that sub-micron waviness and flaws are reliably detected. Quantitative curvature measurements can be obtained from a simple approach if the object is almost flat. 3D objects with a high aspect ratio require more effort to determine both the coordinates and the normal direction of a surface point unambiguously. Stereoscopic solutions have been reported using more than one camera for a certain surface area. This paper describes the combined double-camera steady-surface approach (DCSS), which is well suited for implementation in industrial testing stations.
Sparse reconstruction of liver cirrhosis from monocular mini-laparoscopic sequences
NASA Astrophysics Data System (ADS)
Marcinczak, Jan Marek; Painer, Sven; Grigat, Rolf-Rainer
2015-03-01
Mini-laparoscopy is a technique which is used by clinicians to inspect the liver surface with ultra-thin laparoscopes. However, so far no quantitative measures based on mini-laparoscopic sequences are possible. This paper presents a Structure from Motion (SfM) based methodology to do 3D reconstruction of liver cirrhosis from mini-laparoscopic videos. The approach combines state-of-the-art tracking, pose estimation, outlier rejection and global optimization to obtain a sparse reconstruction of the cirrhotic liver surface. Specular reflection segmentation is included into the reconstruction framework to increase the robustness of the reconstruction. The presented approach is evaluated on 15 endoscopic sequences using three cirrhotic liver phantoms. The median reconstruction accuracy ranges from 0.3 mm to 1 mm.
2004-09-01
The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.
Assessment of umbilical artery flow and fetal heart rate to predict delivery time in bitches.
Giannico, Amália Turner; Garcia, Daniela Aparecida Ayres; Gil, Elaine Mayumi Ueno; Sousa, Marlos Gonçalves; Froes, Tilde Rodrigues
2016-10-15
The aim of this study was to quantitatively investigate the oscillation of the fetal heart rate (HR) in advance of normal delivery and whether this index could be used to indicate impending delivery. In addition, fetal HR oscillation and umbilical artery resistive index (RI) were correlated to determine if the combination of these parameters provided a more accurate prediction of the time of delivery. Sonographic evaluation was performed in 11 pregnant bitches to evaluate the fetal HR and umbilical artery RI at the following antepartum times: 120 to 96 hours, 72 to 48 hours, 24 to 12 hours, and 12 to 1 hours. Statistical analysis indicated a correlation between the oscillation of fetal HR and the umbilical artery RI. As delivery approached, a considerable reduction in the umbilical artery RI was documented, and greater oscillations between maximum and minimum HRs occurred. We conclude that the quantitative analysis of fetal HR oscillations may be used to predict the time of delivery in bitches. The combination of fetal HR and umbilical artery RI together may provide a more accurate prediction of the time of delivery. Copyright © 2016 Elsevier Inc. All rights reserved.
Connelly, N A; Knuth, B A
1998-10-01
Information format can influence the extent to which target audiences understand and respond to risk-related information. This study examined four elements of risk information presentation format. Using printed materials, we examined target audience perceptions about: (a) reading level; (b) use of diagrams vs. text; (c) commanding versus cajoling tone; and (d) use of qualitative vs. quantitative information presented in a risk ladder. We used the risk communication topic of human health concerns related to eating noncommercial Great Lakes fish affected by chemical contaminants. Results from the comparisons of specific communication formats indicated that multiple formats are required to meet the needs of a significant percent of anglers for three of the four format types examined. Advisory text should be reviewed to ensure the reading level is geared to abilities of the target audience. For many audiences, a combination of qualitative and quantitative information, and a combination of diagrams and text may be most effective. For most audiences, a cajoling rather than commanding tone better provides them with the information they need to make a decision about fish consumption. Segmenting audiences regarding information needs and communication formats may help clarify which approaches to take with each audience.
Electrochemical and Infrared Absorption Spectroscopy Detection of SF₆ Decomposition Products.
Dong, Ming; Zhang, Chongxing; Ren, Ming; Albarracín, Ricardo; Ye, Rixin
2017-11-15
Sulfur hexafluoride (SF₆) gas-insulated electrical equipment is widely used in high-voltage (HV) and extra-high-voltage (EHV) power systems. Partial discharge (PD) and local heating can occur in the electrical equipment because of insulation faults, which results in SF₆ decomposition and ultimately generates several types of decomposition products. These SF₆ decomposition products can be qualitatively and quantitatively detected with relevant detection methods, and such detection contributes to diagnosing the internal faults and evaluating the security risks of the equipment. At present, multiple detection methods exist for analyzing the SF₆ decomposition products, and electrochemical sensing (ES) and infrared (IR) spectroscopy are well suited for application in online detection. In this study, the combination of ES with IR spectroscopy is used to detect SF₆ gas decomposition. First, the characteristics of these two detection methods are studied, and the data analysis matrix is established. Then, a qualitative and quantitative analysis ES-IR model is established by adopting a two-step approach. A SF₆ decomposition detector is designed and manufactured by combining an electrochemical sensor and IR spectroscopy technology. The detector is used to detect SF₆ gas decomposition and is verified to reliably and accurately detect the gas components and concentrations.
Sacci, Robert L; Black, Jennifer M; Balke, Nina; Dudney, Nancy J; More, Karren L; Unocic, Raymond R
2015-03-11
The performance characteristics of Li-ion batteries are intrinsically linked to evolving nanoscale interfacial electrochemical reactions. To probe the mechanisms of solid electrolyte interphase (SEI) formation and to track Li nucleation and growth mechanisms from a standard organic battery electrolyte (LiPF6 in EC:DMC), we used in situ electrochemical scanning transmission electron microscopy (ec-S/TEM) to perform controlled electrochemical potential sweep measurements while simultaneously imaging site-specific structures resulting from electrochemical reactions. A combined quantitative electrochemical measurement and STEM imaging approach is used to demonstrate that chemically sensitive annular dark field STEM imaging can be used to estimate the density of the evolving SEI and to identify Li-containing phases formed in the liquid cell. We report that the SEI is approximately twice as dense as the electrolyte as determined from imaging and electron scattering theory. We also observe site-specific locations where Li nucleates and grows on the surface and edge of the glassy carbon electrode. Lastly, this report demonstrates the investigative power of quantitative nanoscale imaging combined with electrochemical measurements for studying fluid-solid interfaces and their evolving chemistries.
Andersen, S T; Erichsen, A C; Mark, O; Albrechtsen, H-J
2013-12-01
Quantitative microbial risk assessments (QMRAs) often lack data on water quality, leading to great uncertainty because of the many assumptions required. The quantity of waste water contamination was estimated and included in a QMRA of an extreme rain event leading to combined sewer overflow (CSO) to bathing water where an ironman competition later took place. Two dynamic models, (1) a drainage model and (2) a 3D hydrodynamic model, estimated the dilution of waste water from source to recipient. The drainage model estimated that 2.6% of waste water was left in the system before CSO, and the hydrodynamic model estimated that 4.8% of the recipient bathing water came from the CSO, so on average there was 0.13% waste water in the bathing water during the ironman competition. The total estimated incidence rate from a conservative estimate of the pathogenic load of five reference pathogens was 42%, comparable to 55% in an epidemiological study of the case. The combination of applying dynamic models and exposure data led to an improved QMRA that included an estimate of the dilution factor. This approach has not been described previously.
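The headline dilution figure is simply the product of the two model outputs, as a quick check shows (the small mismatch with the quoted 0.13% is rounding of the inputs):

```python
frac_ww_in_cso = 0.026        # waste water left in the system before CSO
frac_cso_in_bathing = 0.048   # bathing water originating from the CSO
print(f"{frac_ww_in_cso * frac_cso_in_bathing:.2%}")  # 0.12%, ~ the 0.13% quoted
```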
Understanding online health information: Evaluation, tools, and strategies.
Beaunoyer, Elisabeth; Arsenault, Marianne; Lomanowska, Anna M; Guitton, Matthieu J
2017-02-01
Considering the status of the Internet as a prominent source of health information, assessing online health material has become a central issue in patient education. We describe the strategies available to evaluate the characteristics of online health information, including readability, emotional content, understandability, and usability. Popular tools used in the assessment of readability, emotional content and comprehensibility of online health information were reviewed. Tools designed to evaluate both printed and online material were considered. Readability tools are widely used in the evaluation of online health material and are highly covariant. Assessment of the emotional content of online health-related communications via sentiment analysis tools is becoming more popular. Understandability and usability tools have been developed specifically for health-related material, but each tool has important limitations and has been tested on a limited number of health issues. Despite the availability of numerous assessment tools, their overall reliability differs between readability (high) and understandability (low). Approaches combining multiple assessment tools and involving both quantitative and qualitative observations would optimize assessment strategies. Effective assessment of online health information should rely on mixed strategies combining quantitative and qualitative evaluations. Assessment tools should be selected according to their functional properties and compatibility with the target material. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
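As an example of the readability tools discussed, the sketch below computes the classic Flesch Reading Ease score; the formula is standard, but the syllable counter is a crude vowel-group heuristic, so outputs are approximate.

```python
# Flesch Reading Ease sketch; the syllable count is a rough heuristic.
import re

def syllables(word: str) -> int:
    # Count groups of consecutive vowels as syllables (crude heuristic).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / sentences - 84.6 * syl / len(words)

print(round(flesch_reading_ease("Take your pills daily. Call us with questions."), 1))
```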
Kennedy, Jacob J.; Whiteaker, Jeffrey R.; Schoenherr, Regine M.; Yan, Ping; Allison, Kimberly; Shipley, Melissa; Lerch, Melissa; Hoofnagle, Andrew N.; Baird, Geoffrey Stuart; Paulovich, Amanda G.
2016-01-01
Despite a clinical, economic, and regulatory imperative to develop companion diagnostics, precious few new biomarkers have been successfully translated into clinical use, due in part to inadequate protein assay technologies to support large-scale testing of hundreds of candidate biomarkers in formalin-fixed paraffin embedded (FFPE) tissues. While the feasibility of using targeted, multiple reaction monitoring-mass spectrometry (MRM-MS) for quantitative analyses of FFPE tissues has been demonstrated, protocols have not been systematically optimized for robust quantification across a large number of analytes, nor has the performance of peptide immuno-MRM been evaluated. To address this gap, we used a test battery approach coupled to MRM-MS with the addition of stable isotope labeled standard peptides (targeting 512 analytes) to quantitatively evaluate the performance of three extraction protocols in combination with three trypsin digestion protocols (i.e. 9 processes). A process based on RapiGest buffer extraction and urea-based digestion was identified to enable similar quantitation results from FFPE and frozen tissues. Using the optimized protocols for MRM-based analysis of FFPE tissues, median precision was 11.4% (across 249 analytes). There was excellent correlation between measurements made on matched FFPE and frozen tissues, both for direct MRM analysis (R2 = 0.94) and immuno-MRM (R2 = 0.89). The optimized process enables highly reproducible, multiplex, standardizable, quantitative MRM in archival tissue specimens. PMID:27462933
Sun, Lirui; Jia, Longfei; Xie, Xing; Xie, Kaizhou; Wang, Jianfeng; Liu, Jianyu; Cui, Lulu; Zhang, Genxi; Dai, Guojun; Wang, Jinyu
2016-02-01
In the present study, we developed a simple, rapid and specific method for the quantitative analysis of the contents of amoxicillin (AMO), AMO metabolites and ampicillin (AMP) in eggs. This method uses a simple liquid-liquid extraction with acetonitrile followed by liquid chromatography-tandem mass spectrometry (LC-MS/MS). The optimized method has been validated according to requirements defined by the European Union and the Food and Drug Administration. Extraction recoveries of the target compounds from egg at 5, 10 and 25 μg/kg were all higher than 80%, with relative standard deviations not exceeding 10.00%. The limits of quantification in eggs were below the maximum residue limits (MRLs). The decision limits (CCα) ranged between 11.1 and 11.5 μg/kg, while the detection capabilities (CCβ) ranged from 12.1 to 13.0 μg/kg. These values were very close to the corresponding MRLs. Finally, the new approach was successfully verified for the quantitative determination of these analytes in 40 commercial eggs from local supermarkets. Copyright © 2015 Elsevier Ltd. All rights reserved.
A Simple Configuration for Quantitative Phase Contrast Microscopy of Transmissible Samples
NASA Astrophysics Data System (ADS)
Sengupta, Chandan; Dasgupta, Koustav; Bhattacharya, K.
Phase microscopy attempts to visualize and quantify the phase distribution of samples which are otherwise invisible under a microscope without the use of stains. The two principal approaches to phase microscopy are essentially those of Fourier-plane modulation and interferometric techniques. Although the former, first proposed by Zernike, had been the harbinger of phase microscopy, it was the latter that allowed for quantitative evaluation of phase samples. However, interferometric techniques are fraught with associated problems such as a complicated setup involving mirrors and beam-splitters, the need for a matched objective in the reference arm, and the need for vibration isolation. The present work proposes a single-element cube beam-splitter (CBS) interferometer combined with a microscope objective (MO) for interference microscopy. Because of the monolithic nature of the interferometer, the system is almost insensitive to vibrations and relatively simple to align. It is also shown that phase-shifting properties may be introduced by suitable use of polarizing devices. Initial results showing the quantitative three-dimensional phase profiles of simulated and actual biological specimens are presented.
The place of words and numbers in psychiatric research.
Falissard, Bruno; Révah, Anne; Yang, Suzanne; Fagot-Largeault, Anne
2013-11-18
In recent decades, there has been widespread debate in the human and social sciences regarding the compatibility and the relative merits of quantitative and qualitative approaches in research. In psychiatry, depending on disciplines and traditions, objects of study can be represented either in words or using two types of mathematization. In the latter case, the use of mathematics in psychiatry is most often only local, as opposed to global as in the case of classical mechanics. Relationships between these objects of study can in turn be explored in three different ways: 1/ by a hermeneutic process, 2/ using statistics, the most frequent method in psychiatric research today, 3/ using equations, i.e. using mathematical relationships that are formal and deterministic. The 3 ways of representing entities (with language, locally with mathematics or globally with mathematics) and the 3 ways of expressing the relationships between entities (using hermeneutics, statistics or equations) can be combined in a cross-tabulation, and nearly all nine combinations can be described using examples. A typology of this nature may be useful in assessing which epistemological perspectives are currently dominant in a constantly evolving field such as psychiatry, and which other perspectives still need to be developed. It also contributes to undermining the overly simplistic and counterproductive beliefs that accompany the assumption of a Manichean "quantitative/qualitative" dichotomy. Systematic examination of this set of typologies could be useful in indicating new directions for future research beyond the quantitative/qualitative divide.
NASA Astrophysics Data System (ADS)
Kar, Supratik; Roy, Juganta K.; Leszczynski, Jerzy
2017-06-01
Advances in solar cell technology require the design of new organic dye sensitizers for dye-sensitized solar cells with high power conversion efficiency to circumvent the disadvantages of silicon-based solar cells. In silico studies, including quantitative structure-property relationship analysis combined with quantum chemical analysis, were employed to understand the primary electron transfer mechanism and photo-physical properties of 273 arylamine organic dyes from 11 diverse chemical families specific to the iodine electrolyte. The direct quantitative structure-property relationship models enable identification of the essential electronic and structural attributes necessary for quantifying the molecular prerequisites of the 11 classes of arylamine organic dyes responsible for high power conversion efficiency of dye-sensitized solar cells. Tetrahydroquinoline, N,N'-dialkylaniline and indoline are among the least explored classes of arylamine organic dyes for dye-sensitized solar cells. Therefore, the properties identified from the corresponding quantitative structure-property relationship models of these classes were employed in the design of "lead dyes". Subsequently, a series of electrochemical and photo-physical parameters was computed for the designed dyes to check the variables required for electron flow in dye-sensitized solar cells. The combined computational techniques yielded seven promising lead dyes for each of the three chemical classes considered. Significant increments (130, 183, and 46%) in predicted power conversion efficiency were observed, compared with the existing dye with the highest experimental power conversion efficiency, for tetrahydroquinoline, N,N'-dialkylaniline and indoline, respectively, while maintaining the required electrochemical parameters.
Combining disparate data for decision making
NASA Astrophysics Data System (ADS)
Gettings, M. E.
2010-12-01
Combining information of disparate types from multiple data or model sources is a fundamental task in decision making theory. Procedures for combining and utilizing quantitative data with uncertainties are well-developed in several approaches, but methods for including qualitative and semi-quantitative data are much less so. Possibility theory offers an approach to treating all three data types in an objective and repeatable way. In decision making, biases are frequently present in several forms, including those arising from data quality, data spatial and temporal distribution, and the analyst's knowledge and beliefs as to which data or models are most important. The latter bias is particularly evident in the case of qualitative data and there are numerous examples of analysts feeling that a qualitative dataset is more relevant than a quantified one. Possibility theory and fuzzy logic now provide fairly general rules for quantifying qualitative and semi-quantitative data in ways that are repeatable and minimally biased. Once a set of quantified data and/or model layers is obtained, there are several methods of combining them to obtain insight useful in decision making. These include: various combinations of layers using formal fuzzy logic (for example, layer A and (layer B or layer C) but not layer D); connecting the layers with varying influence links in a Fuzzy Cognitive Map; and using the set of layers for the universe of discourse for agent based model simulations. One example of logical combinations that have proven useful is the definition of possible habitat for valley fever fungus (Coccidioides sp.) using variables such as soil type, altitude, aspect, moisture and temperature. A second example is the delineation of the lithology and possible mineralization of several areas beneath basin fill in southern Arizona. A Fuzzy Cognitive Map example is the impacts of development and operation of a hypothetical mine in an area adjacent to a city. In this model variables such as water use, environmental quality measures (visual and geochemical), deposit quality, rate of development, and commodity price combine in complex ways to yield frequently counter-intuitive results. By varying the interaction strengths linking the variables, insight into the complex interactions of the system can be gained. An example using agent-based modeling is a model designed to test the hypothesis that new valley fever fungus sites could be established from existing sites by wind transport of fungal spores. The variables include layers simulating precipitation, temperature, soil moisture, and soil chemistry based on historical climate records and studies of known valley fever habitat. Numerous agent-based model runs show that the system is self organizing to the extent that there will be new sites established by wind transport over decadal scales. Possibility theory provides a framework for gaining insight into the interaction of known or suspected variables in a complex system. Once the data layers are quantified into possibility functions, varying hypotheses of the relative importance of variables and processes can be obtained by repeated combinations with varying weights. This permits an evaluation of the effects of various data layers, their uncertainties, and biases from the layers, all of which improve the objectivity of decision making.
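As a toy illustration of the formal fuzzy-logic layer combination quoted above ("layer A and (layer B or layer C) but not layer D"), using the common min/max/complement operators on hypothetical membership grids (one of several possible operator choices; the layers are not the specific datasets used in the studies described):

    import numpy as np

    # Hypothetical possibility (membership) grids with values in [0, 1]
    layer_a = np.array([[0.9, 0.4], [0.7, 0.2]])
    layer_b = np.array([[0.6, 0.8], [0.3, 0.5]])
    layer_c = np.array([[0.7, 0.9], [0.8, 0.1]])
    layer_d = np.array([[0.2, 0.9], [0.1, 0.3]])

    f_and = np.minimum          # fuzzy AND (min operator)
    f_or = np.maximum           # fuzzy OR  (max operator)

    def f_not(a):               # fuzzy NOT (complement)
        return 1.0 - a

    # layer A and (layer B or layer C) but not layer D
    combined = f_and(f_and(layer_a, f_or(layer_b, layer_c)), f_not(layer_d))
    print(combined)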
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. Given the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach offers a way to integrate quantitative and qualitative approaches and an alternative strategy for studying the impact of economic disadvantage on family processes and adolescent development.
Ostovaneh, Mohammad R; Vavere, Andrea L; Mehra, Vishal C; Kofoed, Klaus F; Matheson, Matthew B; Arbab-Zadeh, Armin; Fujisawa, Yasuko; Schuijf, Joanne D; Rochitte, Carlos E; Scholte, Arthur J; Kitagawa, Kakuya; Dewey, Marc; Cox, Christopher; DiCarli, Marcelo F; George, Richard T; Lima, Joao A C
To determine the diagnostic accuracy of semi-automatic quantitative metrics compared to expert reading for interpretation of computed tomography perfusion (CTP) imaging. The CORE320 multicenter diagnostic accuracy clinical study enrolled patients between 45 and 85 years of age who were clinically referred for invasive coronary angiography (ICA). Computed tomography angiography (CTA), CTP, single photon emission computed tomography (SPECT), and ICA images were interpreted manually in blinded core laboratories by two experienced readers. Additionally, eight quantitative CTP metrics as continuous values were computed semi-automatically from myocardial and blood attenuation and were combined using logistic regression to derive a final quantitative CTP metric score. For the reference standard, hemodynamically significant coronary artery disease (CAD) was defined as a quantitative ICA stenosis of 50% or greater and a corresponding perfusion defect by SPECT. Diagnostic accuracy was determined by the area under the receiver operating characteristic curve (AUC). Of the 377 included patients, 66% were male, the median age was 62 (IQR: 56, 68) years, and 27% had prior myocardial infarction. In the patient-based analysis, the AUC (95% CI) for combined CTA-CTP expert reading and combined CTA-CTP semi-automatic quantitative metrics was 0.87 (0.84-0.91) and 0.86 (0.83-0.90), respectively. In vessel-based analyses the AUCs were 0.85 (0.82-0.88) and 0.84 (0.81-0.87), respectively. No significant difference in AUC was found between combined CTA-CTP expert reading and CTA-CTP semi-automatic quantitative metrics in patient-based or vessel-based analyses (p > 0.05 for all). Combined CTA-CTP semi-automatic quantitative metrics are as accurate as CTA-CTP expert reading for detecting hemodynamically significant CAD. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
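A minimal sketch of the kind of score construction described above: several continuous metrics combined by logistic regression into a single score, evaluated by AUC (entirely synthetic data; the study's eight specific CTP metrics, reference standard, and validation scheme are not reproduced here):

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(377, 8))         # eight quantitative CTP metrics (synthetic)
    y = (X[:, 0] + rng.normal(size=377) > 0).astype(int)   # synthetic CAD labels

    model = LogisticRegression().fit(X, y)
    score = model.predict_proba(X)[:, 1]  # final combined quantitative metric score
    print("in-sample AUC:", roc_auc_score(y, score))

In practice such a score would be validated on held-out data rather than in-sample.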
Chiarenza, Giuseppe A; Villa, Stefania; Galan, Lidice; Valdes-Sosa, Pedro; Bosch-Bayard, Jorge
2018-05-19
Oppositional defiant disorder (ODD) is frequently associated with Attention Deficit Hyperactivity Disorder (ADHD), but no clear neurophysiological evidence exists that distinguishes the two groups. Our aim was to identify biomarkers that distinguish children with ADHD combined subtype (ADHD_C) from children with ADHD_C + ODD, by combining the results of quantitative EEG (qEEG) and the Junior Temperament Character Inventory (JTCI). 28 ADHD_C and 22 ADHD_C + ODD children who met the DSM-V criteria participated in the study. JTCI and EEG data were analyzed. A stability-based biomarker identification methodology was applied to the JTCI and the qEEG separately and combined. The qEEG was tested at the scalp and source levels. The classification power of the selected biomarkers was tested with a robust ROC technique. The best discriminant power was obtained when the JTCI and qEEG were analyzed together. Novelty seeking, self-directedness and cooperativeness were selected as biomarkers, together with F4 and Cz in Delta, Fz and F4 in Theta, and F7 and F8 in Beta, with a robust AUC of 0.95 for the ROC. At the source level, the regions were the right lateral and medial orbito-frontal cortex, cingular region, angular gyrus, right inferior occipital gyrus, occipital pole and the left insula in Theta, Alpha and Beta. The robust estimate of the total AUC was 0.91. These structures are part of extensive networks of the novelty seeking, self-directedness and cooperativeness systems that seem dysregulated in these children. These methods represent an original approach to associating differences in personality and behavior with specific neuronal systems and subsystems. Copyright © 2018 Elsevier B.V. All rights reserved.
A Quantitative Approach to the Formal Verification of Real-Time Systems.
1996-09-01
A Quantitative Approach to the Formal Verification of Real-Time Systems. Sergio Vale Aguiar Campos, September 1996, CMU-CS-96-199, School of Computer Science, Carnegie Mellon University. The views and conclusions contained in this document are those of the author and should not be interpreted as representing the official policies, either expressed or implied, of NSF, the Semiconductor Research Corporation, ARPA or the U.S. government. Keywords: real-time systems, formal verification, symbolic ...
2008-09-01
Fragmentary footnote text citing Neuman, Social Research Methods: Qualitative and Quantitative Approaches (Boston: Pearson, 2006), pp. 149-152, and www.socialresearchmethods.net/kb/strucres.php (accessed July 12, 2008). The project demanded the use of a primarily qualitative approach to studying enforcement practices.
Quantitative prediction of drug side effects based on drug-related features.
Niu, Yanqing; Zhang, Wen
2017-09-01
Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs, and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since the weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
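A sketch of the scoring scheme described above (hypothetical arrays; the weights are random, mirroring the paper's simulation setup, and the three per-feature predictors are stand-ins for models trained on substructures, targets, and indications):

    import numpy as np

    rng = np.random.default_rng(1)
    n_drugs, n_effects = 100, 50
    profiles = rng.integers(0, 2, size=(n_drugs, n_effects))  # 1 = effect present
    weights = rng.random(n_effects)           # randomly generated weights

    # Quantitative score of a drug: weighted sum over its side-effect profile
    scores = profiles @ weights

    # Average scoring ensemble: mean of predictions from feature-specific models
    pred_substructures = scores + rng.normal(0, 0.5, n_drugs)  # stand-in predictor
    pred_targets = scores + rng.normal(0, 0.5, n_drugs)        # stand-in predictor
    pred_indications = scores + rng.normal(0, 0.5, n_drugs)    # stand-in predictor
    ensemble = np.mean([pred_substructures, pred_targets, pred_indications], axis=0)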
Bruijn, Merel M C; Kamphuis, Esme I; Hoesli, Irene M; Martinez de Tejada, Begoña; Loccufier, Anne R; Kühnert, Maritta; Helmer, Hanns; Franz, Marie; Porath, Martina M; Oudijk, Martijn A; Jacquemyn, Yves; Schulzke, Sven M; Vetter, Grit; Hoste, Griet; Vis, Jolande Y; Kok, Marjolein; Mol, Ben W J; van Baaren, Gert-Jan
2016-12-01
The combination of the qualitative fetal fibronectin test and cervical length measurement has a high negative predictive value for preterm birth within 7 days; however, positive prediction is poor. A new bedside quantitative fetal fibronectin test showed potential additional value over the conventional qualitative test, but there is limited evidence on its combination with cervical length measurement. The purpose of this study was to compare quantitative fetal fibronectin and qualitative fetal fibronectin testing in the prediction of spontaneous preterm birth within 7 days in symptomatic women who undergo cervical length measurement. We performed a European multicenter cohort study in 10 perinatal centers in 5 countries. Women between 24 and 34 weeks of gestation with signs of active labor and intact membranes underwent quantitative fibronectin testing and cervical length measurement. We assessed the risk of preterm birth within 7 days in predefined strata based on fibronectin concentration and cervical length. Of 455 women who were included in the study, 48 women (11%) delivered within 7 days. A combination of cervical length and qualitative fibronectin resulted in the identification of 246 women who were at low risk: 164 women with a cervix between 15 and 30 mm and a negative fibronectin test (<50 ng/mL; preterm birth rate, 2%) and 82 women with a cervix at >30 mm (preterm birth rate, 2%). Use of quantitative fibronectin alone resulted in a predicted risk of preterm birth within 7 days that ranged from 2% in the group with the lowest fibronectin level (<10 ng/mL) to 38% in the group with the highest fibronectin level (>500 ng/mL), with accuracy similar to that of the combination of cervical length and qualitative fibronectin. Combining cervical length and quantitative fibronectin resulted in the identification of an additional 19 women at low risk (preterm birth rate, 5%), using a threshold of 10 ng/mL in women with a cervix at <15 mm, and 6 women at high risk (preterm birth rate, 33%) using a threshold of >500 ng/mL in women with a cervix at >30 mm. In women with threatened preterm birth, quantitative fibronectin testing alone performs as well as the combination of cervical length and qualitative fibronectin. Possibly, the combination of quantitative fibronectin testing and cervical length increases this predictive capacity. Cost-effectiveness analysis and the availability of these tests in a local setting should determine the final choice. Copyright © 2016 Elsevier Inc. All rights reserved.
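An illustrative stratification function using only the thresholds and rates quoted above (the function and its naming are hypothetical, not part of the study protocol):

    def preterm_risk_stratum(fibronectin_ng_ml, cervical_length_mm):
        # Strata and reported 7-day preterm birth rates taken from the abstract
        if cervical_length_mm > 30:
            if fibronectin_ng_ml > 500:
                return "high risk (reported rate 33%)"
            return "low risk (reported rate 2%)"
        if 15 <= cervical_length_mm <= 30 and fibronectin_ng_ml < 50:
            return "low risk (reported rate 2%)"
        if cervical_length_mm < 15 and fibronectin_ng_ml < 10:
            return "low risk (reported rate 5%)"
        return "not classified as low risk in the quoted strata"

    print(preterm_risk_stratum(8, 12))   # -> low risk (reported rate 5%)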
Quantitative Measures of Immersion in Cloud and the Biogeography of Cloud Forests
NASA Technical Reports Server (NTRS)
Lawton, R. O.; Nair, U. S.; Ray, D.; Regmi, A.; Pounds, J. A.; Welch, R. M.
2010-01-01
Sites described as tropical montane cloud forests differ greatly, in part because observers tend to differ in their opinion as to what constitutes frequent and prolonged immersion in cloud. This definitional difficulty interferes with hydrologic analyses, assessments of environmental impacts on ecosystems, and biogeographical analyses of cloud forest communities and species. Quantitative measurements of cloud immersion can be obtained on site, but the observations are necessarily spatially limited, although well-placed observers can examine 10-50 km of a mountain range under rainless conditions. Regional analyses, however, require observations at a broader scale. This chapter discusses remote sensing and modeling approaches that can provide quantitative measures of the spatiotemporal patterns of cloud cover and cloud immersion in tropical mountain ranges. These approaches integrate remote sensing tools of various spatial resolutions and frequencies of observation, digital elevation models, regional atmospheric models, and ground-based observations to provide measures of cloud cover, cloud base height, and the intersection of cloud and terrain. This combined approach was applied to the Monteverde region of northern Costa Rica to illustrate how the proportion of time the forest is immersed in cloud may vary spatially and temporally. The observed spatial variation was largely due to patterns of airflow over the mountains. The temporal variation reflected the diurnal rise and fall of the orographic cloud base, which was influenced in turn by synoptic weather conditions, the seasonal movement of the Intertropical Convergence Zone and the north-easterly trade winds. Knowledge of the proportion of the time that sites are immersed in clouds should facilitate ecological comparisons and biogeographical analyses, as well as land use planning and hydrologic assessments in areas where intensive on-site work is not feasible.
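A toy sketch of the terrain-cloud intersection idea described above (hypothetical arrays standing in for a digital elevation model and modeled hourly cloud base/top heights; the actual study combined satellite retrievals, a regional atmospheric model, and ground observations):

    import numpy as np

    rng = np.random.default_rng(2)
    elevation = rng.uniform(800, 1800, size=(4, 4))        # DEM, meters (hypothetical)
    cloud_base = rng.uniform(1000, 1600, size=(24, 1, 1))  # hourly cloud base, m
    cloud_top = cloud_base + rng.uniform(200, 600, size=(24, 1, 1))

    # A pixel is immersed when the terrain lies within the cloud layer
    immersed = (elevation >= cloud_base) & (elevation <= cloud_top)   # (24, 4, 4)
    fraction_immersed = immersed.mean(axis=0)  # per-pixel fraction of hours in cloud
    print(fraction_immersed)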
Le Pogam, Adrien; Hatt, Mathieu; Descourt, Patrice; Boussion, Nicolas; Tsoumpas, Charalampos; Turkheimer, Federico E; Prunier-Aesch, Caroline; Baulieu, Jean-Louis; Guilloteau, Denis; Visvikis, Dimitris
2011-09-01
Partial volume effects (PVEs) are consequences of the limited spatial resolution in emission tomography leading to underestimation of uptake in tissues of size similar to the point spread function (PSF) of the scanner as well as activity spillover between adjacent structures. Among PVE correction methodologies, a voxel-wise mutual multiresolution analysis (MMA) was recently introduced. MMA is based on the extraction and transformation of high resolution details from an anatomical image (MR/CT) and their subsequent incorporation into a low-resolution PET image using wavelet decompositions. Although this method allows creating PVE corrected images, it is based on a 2D global correlation model, which may introduce artifacts in regions where no significant correlation exists between anatomical and functional details. A new model was designed to overcome these two issues (2D only and global correlation) using a 3D wavelet decomposition process combined with a local analysis. The algorithm was evaluated on synthetic, simulated and patient images, and its performance was compared to the original approach as well as the geometric transfer matrix (GTM) method. Quantitative performance was similar to the 2D global model and GTM in correlated cases. In cases where mismatches between anatomical and functional information were present, the new model outperformed the 2D global approach, avoiding artifacts and significantly improving quality of the corrected images and their quantitative accuracy. A new 3D local model was proposed for a voxel-wise PVE correction based on the original mutual multiresolution analysis approach. Its evaluation demonstrated an improved and more robust qualitative and quantitative accuracy compared to the original MMA methodology, particularly in the absence of full correlation between anatomical and functional information.
Molins, C; Hogendoorn, E A; Dijkman, E; Heusinkveld, H A; Baumann, R A
2000-02-11
The combination of microwave-assisted solvent extraction (MASE) and reversed-phase liquid chromatography (RPLC) with UV detection has been investigated for the efficient determination of phenylurea herbicides in soils, involving the single-residue method (SRM) approach (linuron) and the multi-residue method (MRM) approach (monuron, monolinuron, isoproturon, metobromuron, diuron and linuron). Critical parameters of MASE, viz., extraction temperature, water content and extraction solvent, were varied in order to optimise recoveries of the analytes while simultaneously minimising co-extraction of soil interferences. The optimised extraction procedure was applied to different types of soil with an organic carbon content of 0.4-16.7%. Besides freshly spiked soil samples, method validation included the analysis of samples with aged residues. A comparative study of the applicability of RPLC-UV without and with the use of column switching for the processing of uncleaned extracts was carried out. For some of the tested analyte/matrix combinations the one-column approach (LC mode) is feasible. In comparison to LC, coupled-column LC (LC-LC mode) provides high selectivity in single-residue analysis (linuron) and, although less pronounced in multi-residue analysis (all six phenylurea herbicides), the clean-up performance of LC-LC improves both time of analysis and sample throughput. In the MRM approach, the developed procedure involving MASE and LC-LC-UV provided acceptable recoveries (range, 80-120%) and RSDs (<12%) at levels of 10 microg/kg (n=9) and 50 microg/kg (n=7), respectively, for most analyte/matrix combinations. Recoveries from aged-residue samples spiked at a level of 100 microg/kg (n=7) ranged, depending on the analyte/soil type combination, from 41 to 113%, with RSDs ranging from 1 to 35%. In the SRM approach, the developed LC-LC procedure was applied to the determination of linuron in 28 sandy soil samples collected in a field study. Linuron could be determined in soil with a limit of quantitation of 10 microg/kg.
NASA Astrophysics Data System (ADS)
Chambers, Timothy
This dissertation presents the results of an experiment that measured the learning outcomes associated with three different pedagogical approaches to introductory physics labs. These three pedagogical approaches presented students with the same apparatus and covered the same physics content, but used different lab manuals to guide students through distinct cognitive processes in conducting their laboratory investigations. We administered post-tests containing multiple-choice conceptual questions and free-response quantitative problems one week after students completed these laboratory investigations. In addition, we collected data from the laboratory practical exam taken by students at the end of the semester. Using these data sets, we compared the learning outcomes for the three curricula in three dimensions of ability: conceptual understanding, quantitative problem-solving skill, and laboratory skills. Our three pedagogical approaches are as follows. Guided labs lead students through their investigations via a combination of Socratic-style questioning and direct instruction, while students record their data and answers to written questions in the manual during the experiment. Traditional labs provide detailed written instructions, which students follow to complete the lab objectives. Open labs provide students with a set of apparatus and a question to be answered, and leave students to devise and execute an experiment to answer the question. In general, we find that students performing Guided labs perform better on some conceptual assessment items, and that students performing Open labs perform significantly better on experimental tasks. Combining a classical test theory analysis of post-test results with in-lab classroom observations allows us to identify individual components of the laboratory manuals and investigations that are likely to have influenced the observed differences in learning outcomes associated with the different pedagogical approaches. Due to the novel nature of this research and the large number of item-level results we produced, we recommend additional research to determine the reproducibility of our results. Analyzing the data with item response theory yields additional information about the performance of our students on both conceptual questions and quantitative problems. We find that performing lab activities on a topic does lead to better-than-expected performance on some conceptual questions regardless of pedagogical approach, but that this acquired conceptual understanding is strongly context-dependent. The results also suggest that a single "Newtonian reasoning ability" is inadequate to explain student response patterns to items from the Force Concept Inventory. We develop a framework for applying polytomous item response theory to the analysis of quantitative free-response problems and for analyzing how features of student solutions are influenced by problem-solving ability. Patterns in how students at different abilities approach our post-test problems are revealed, and we find hints as to how features of a free-response problem influence its item parameters. The item-response theory framework we develop provides a foundation for future development of quantitative free-response research instruments. Chapter 1 of the dissertation presents a brief history of physics education research and motivates the present study. 
Chapter 2 describes our experimental methodology and discusses the treatments applied to students and the instruments used to measure their learning. Chapter 3 provides an introduction to the statistical and analytical methods used in our data analysis. Chapter 4 presents the full data set, analyzed using both classical test theory and item response theory. Chapter 5 contains a discussion of the implications of our results and a data-driven analysis of our experimental methods. Chapter 6 describes the importance of this work to the field and discusses the relevance of our research to curriculum development and to future work in physics education research.
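For readers unfamiliar with the item response theory machinery referenced above, a generic two-parameter logistic (2PL) item response function (standard IRT; not the dissertation's specific polytomous framework) looks like this:

    import numpy as np

    def p_correct_2pl(theta, a, b):
        # Probability of answering an item correctly, given ability theta,
        # item discrimination a, and item difficulty b
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    theta = np.linspace(-3, 3, 7)              # range of student abilities
    print(p_correct_2pl(theta, a=1.2, b=0.5))  # item characteristic curve samples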
NASA Astrophysics Data System (ADS)
Nijzink, Remko C.; Samaniego, Luis; Mai, Juliane; Kumar, Rohini; Thober, Stephan; Zink, Matthias; Schäfer, David; Savenije, Hubert H. G.; Hrachowitz, Markus
2016-03-01
Heterogeneity of landscape features like terrain, soil, and vegetation properties affects the partitioning of water and energy. However, it remains unclear to what extent an explicit representation of this heterogeneity at the sub-grid scale of distributed hydrological models can improve the hydrological consistency and the robustness of such models. In this study, hydrological process complexity arising from sub-grid topography heterogeneity was incorporated into the distributed mesoscale Hydrologic Model (mHM). Seven study catchments across Europe were used to test whether (1) the incorporation of additional sub-grid variability on the basis of landscape-derived response units improves model internal dynamics, (2) the application of semi-quantitative, expert-knowledge-based model constraints reduces model uncertainty, and whether (3) the combined use of sub-grid response units and model constraints improves the spatial transferability of the model. Unconstrained and constrained versions of both the original mHM and mHMtopo, which allows for topography-based sub-grid heterogeneity, were calibrated for each catchment individually following a multi-objective calibration strategy. In addition, four of the study catchments were simultaneously calibrated and their feasible parameter sets were transferred to the remaining three receiver catchments. In a post-calibration evaluation procedure the probabilities of model and transferability improvement, when accounting for sub-grid variability and/or applying expert-knowledge-based model constraints, were assessed on the basis of a set of hydrological signatures. In terms of the Euclidean distance to the optimal model, used as an overall measure of model performance with respect to the individual signatures, the model improvement achieved by introducing sub-grid heterogeneity to mHM in mHMtopo was on average 13 %. The addition of semi-quantitative constraints to mHM and mHMtopo resulted in improvements of 13 and 19 %, respectively, compared to the base case of the unconstrained mHM. Most significant improvements in signature representations were, in particular, achieved for low flow statistics. The application of prior semi-quantitative constraints further improved the partitioning between runoff and evaporative fluxes. In addition, it was shown that suitable semi-quantitative prior constraints in combination with the transfer-function-based regularization approach of mHM can be beneficial for spatial model transferability, as the Euclidean distances for the signatures improved on average by 2 %. The effect of semi-quantitative prior constraints combined with topography-guided sub-grid heterogeneity on transferability showed a more variable picture of improvements and deteriorations, but most improvements were observed for low flow statistics.
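A minimal sketch of using a Euclidean distance to an ideal point as an aggregate performance measure across hydrological signatures (the normalization and signature set here are hypothetical; the study's exact formulation may differ):

    import numpy as np

    # Hypothetical per-signature performances on [0, 1], where 1 is perfect
    signature_scores = np.array([0.85, 0.70, 0.92, 0.60])

    ideal = np.ones_like(signature_scores)
    distance = np.linalg.norm(ideal - signature_scores)
    print(distance)   # smaller distance = closer to the optimal model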
Agapova, Maria; Devine, Emily Beth; Bresnahan, Brian W; Higashi, Mitchell K; Garrison, Louis P
2014-09-01
Health agencies making regulatory marketing-authorization decisions use qualitative and quantitative approaches to assess expected benefits and expected risks associated with medical interventions. There is, however, no universal standard approach that regulatory agencies consistently use to conduct benefit-risk assessment (BRA) for pharmaceuticals or medical devices, including for imaging technologies. Economics, health services research, and health outcomes research use quantitative approaches to elicit preferences of stakeholders, identify priorities, and model health conditions and health intervention effects. Challenges to BRA in medical devices are outlined, highlighting additional barriers in radiology. Three quantitative methods--multi-criteria decision analysis, health outcomes modeling and stated-choice survey--are assessed using criteria that are important in balancing benefits and risks of medical devices and imaging technologies. To be useful in regulatory BRA, quantitative methods need to: aggregate multiple benefits and risks, incorporate qualitative considerations, account for uncertainty, and make clear whose preferences/priorities are being used. Each quantitative method performs differently across these criteria and little is known about how BRA estimates and conclusions vary by approach. While no specific quantitative method is likely to be the strongest in all of the important areas, quantitative methods may have a place in BRA of medical devices and radiology. Quantitative BRA approaches have been more widely applied in medicines, with fewer BRAs in devices. Despite substantial differences in characteristics of pharmaceuticals and devices, BRA methods may be as applicable to medical devices and imaging technologies as they are to pharmaceuticals. Further research to guide the development and selection of quantitative BRA methods for medical devices and imaging technologies is needed. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.
Zhang, Xirui; Daaboul, George G; Spuhler, Philipp S; Dröge, Peter; Ünlü, M Selim
2016-03-14
DNA-binding proteins play crucial roles in the maintenance and functions of the genome and yet, their specific binding mechanisms are not fully understood. Recently, it was discovered that DNA-binding proteins recognize specific binding sites to carry out their functions through an indirect readout mechanism by recognizing and capturing DNA conformational flexibility and deformation. High-throughput DNA microarray-based methods that provide large-scale protein-DNA binding information have shown effective and comprehensive analysis of protein-DNA binding affinities, but do not provide information of DNA conformational changes in specific protein-DNA complexes. Building on the high-throughput capability of DNA microarrays, we demonstrate a quantitative approach that simultaneously measures the amount of protein binding to DNA and nanometer-scale DNA conformational change induced by protein binding in a microarray format. Both measurements rely on spectral interferometry on a layered substrate using a single optical instrument in two distinct modalities. In the first modality, we quantitate the amount of binding of protein to surface-immobilized DNA in each DNA spot using a label-free spectral reflectivity technique that accurately measures the surface densities of protein and DNA accumulated on the substrate. In the second modality, for each DNA spot, we simultaneously measure DNA conformational change using a fluorescence vertical sectioning technique that determines average axial height of fluorophores tagged to specific nucleotides of the surface-immobilized DNA. The approach presented in this paper, when combined with current high-throughput DNA microarray-based technologies, has the potential to serve as a rapid and simple method for quantitative and large-scale characterization of conformational specific protein-DNA interactions.
Durable resistance: A key to sustainable management of pathogens and pests
Mundt, Christopher C.
2014-01-01
This review briefly addresses what has been learned about resistance durability in recent years, as well as the questions that still remain. Molecular analyses of major gene interactions have potential to contribute to both breeding for resistance and improved understanding of virulence impacts on pathogen fitness. Though the molecular basis of quantitative resistance is less clear, substantial evidence has accumulated for the relative simplicity of inheritance. There is increasing evidence for specific interactions with quantitative resistance, though implications of this for durability are still unknown. Mechanisms by which resistance gene pyramids contribute to durability remain elusive, though ideas have been generated for identifying gene combinations that may be more durable. Though cultivar mixtures and related approaches have been used successfully, identifying the diseases and conditions that are most conducive to the use of diversity has been surprisingly difficult, and the selective influence of diversity on pathogen populations is complex. The importance of considering resistance durability in a landscape context has received increasing emphasis and is an important future area of research. Experimental systems are being developed to test resistance gene deployment strategies that previously could be addressed only with logic and observation. The value of molecular markers for identifying and pyramiding major genes is quite clear, but the successful use of quantitative trait loci (QTL) for marker-assisted selection of quantitative resistance will depend greatly on the degree to which the identified QTL are expressed in different genetic backgrounds. Transgenic approaches will likely provide opportunities for control of some recalcitrant pathogens, though issues of durability for transgenes are likely to be no different than other genes for resistance. The need for high-quality phenotypic analysis and screening methodologies is a priority, and field-based studies are likely to remain of signal importance in the foreseeable future. PMID:24486735
Universal fragment descriptors for predicting properties of inorganic crystals
NASA Astrophysics Data System (ADS)
Isayev, Olexandr; Oses, Corey; Toher, Cormac; Gossett, Eric; Curtarolo, Stefano; Tropsha, Alexander
2017-06-01
Although historically materials discovery has been driven by a laborious trial-and-error process, knowledge-driven materials design can now be enabled by the rational combination of Machine Learning methods and materials databases. Here, data from the AFLOW repository for ab initio calculations is combined with Quantitative Materials Structure-Property Relationship models to predict important properties: metal/insulator classification, band gap energy, bulk/shear moduli, Debye temperature and heat capacities. The prediction's accuracy compares well with the quality of the training data for virtually any stoichiometric inorganic crystalline material, reciprocating the available thermomechanical experimental data. The universality of the approach is attributed to the construction of the descriptors: Property-Labelled Materials Fragments. The representations require only minimal structural input allowing straightforward implementations of simple heuristic design rules.
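In the spirit of the fragment-descriptor models described above, a generic structure-property regression sketch (synthetic stand-in descriptors and property values; the AFLOW data and the Property-Labelled Materials Fragments construction are not reproduced here, and gradient boosting is used as a representative regressor):

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)
    X = rng.normal(size=(500, 40))     # stand-in fragment descriptors
    y = X[:, :5].sum(axis=1) + rng.normal(0, 0.1, size=500)   # stand-in property

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    model = GradientBoostingRegressor().fit(X_tr, y_tr)
    print("held-out R^2:", model.score(X_te, y_te))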
Zherebtsov, Evgeny A; Zherebtsova, Angelina I; Doronin, Alexander; Dunaev, Andrey V; Podmasteryev, Konstantin V; Bykov, Alexander; Meglinski, Igor
2017-04-01
We introduce a noninvasive diagnostic approach for functional monitoring of blood microflows in capillaries and thermoregulatory vessels within the skin. The measuring system is based on the combined use of laser Doppler flowmetry and skin contact thermometry. The obtained results suggest that monitoring of blood microcirculation during occlusion, performed in conjunction with skin temperature measurements in a thermally stabilized medium, has great potential for quantitative assessment of angiospastic dysfunctions of the peripheral blood vessels. The indices of blood flow reserve and temperature response were measured and used as the primary parameters of the functional diagnostics of the peripheral vessels of the skin. Utilizing these parameters, a simple phenomenological model has been suggested to identify patients with angiospastic violations in the vascular system.
Kletenik-Edelman, Orly; Reichman, David R; Rabani, Eran
2011-01-28
A novel quantum mode coupling theory combined with a kinetic approach is developed for the description of collective density fluctuations in quantum liquids characterized by Boltzmann statistics. Three mode-coupling approximations are presented and applied to study the dynamic response of para-hydrogen near the triple point and normal liquid helium above the λ-transition. The theory is compared with experimental results and to the exact imaginary time data generated by path integral Monte Carlo simulations. While for liquid para-hydrogen the combination of kinetic and quantum mode-coupling theory provides semi-quantitative results for both short and long time dynamics, it fails for normal liquid helium. A discussion of this failure based on the ideal gas limit is presented.
Translational research impacting on crop productivity in drought-prone environments.
Reynolds, Matthew; Tuberosa, Roberto
2008-04-01
Conventional breeding for drought-prone environments (DPE) has been complemented by using exotic germplasm to extend crop gene pools and physiological approaches that consider water uptake (WU), water-use efficiency (WUE), and harvest index (HI) as drivers of yield. Drivers are associated with proxy genetic markers, such as carbon-isotope discrimination for WUE, canopy temperature for WU, and anthesis-silking interval for HI in maize. Molecular markers associated with relevant quantitative trait loci are being developed. WUE has also been increased through combining understanding of root-to-shoot signaling with deficit irrigation. Impacts in DPE will be accelerated by combining proven technologies with promising new strategies such as marker-assisted selection, and genetic transformation, as well as conservation agriculture that can increase WU while averting soil degradation.
Kisely, Stephen; Kendall, Elizabeth
2011-08-01
Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features with transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.
Population- and individual-specific regulatory variation in Sardinia.
Pala, Mauro; Zappala, Zachary; Marongiu, Mara; Li, Xin; Davis, Joe R; Cusano, Roberto; Crobu, Francesca; Kukurba, Kimberly R; Gloudemans, Michael J; Reinier, Frederic; Berutti, Riccardo; Piras, Maria G; Mulas, Antonella; Zoledziewska, Magdalena; Marongiu, Michele; Sorokin, Elena P; Hess, Gaelen T; Smith, Kevin S; Busonero, Fabio; Maschio, Andrea; Steri, Maristella; Sidore, Carlo; Sanna, Serena; Fiorillo, Edoardo; Bassik, Michael C; Sawcer, Stephen J; Battle, Alexis; Novembre, John; Jones, Chris; Angius, Andrea; Abecasis, Gonçalo R; Schlessinger, David; Cucca, Francesco; Montgomery, Stephen B
2017-05-01
Genetic studies of complex traits have mainly identified associations with noncoding variants. To further determine the contribution of regulatory variation, we combined whole-genome and transcriptome data for 624 individuals from Sardinia to identify common and rare variants that influence gene expression and splicing. We identified 21,183 expression quantitative trait loci (eQTLs) and 6,768 splicing quantitative trait loci (sQTLs), including 619 new QTLs. We identified high-frequency QTLs and found evidence of selection near genes involved in malarial resistance and increased multiple sclerosis risk, reflecting the epidemiological history of Sardinia. Using family relationships, we identified 809 segregating expression outliers (median z score of 2.97), averaging 13.3 genes per individual. Outlier genes were enriched for proximal rare variants, providing a new approach to study large-effect regulatory variants and their relevance to traits. Our results provide insight into the effects of regulatory variants and their relationship to population history and individual genetic risk.
Metabolic modelling in the development of cell factories by synthetic biology
Jouhten, Paula
2012-01-01
Cell factories are commonly microbial organisms utilized for bioconversion of renewable resources to bulk or high value chemicals. Introduction of novel production pathways in chassis strains is the core of the development of cell factories by synthetic biology. Synthetic biology aims to create novel biological functions and systems not found in nature by combining biology with engineering. The workflow of the development of novel cell factories with synthetic biology is ideally linear, which will be attainable with the quantitative engineering approach, high-quality predictive models, and libraries of well-characterized parts. Different types of metabolic models, mathematical representations of metabolism and its components, enzymes and metabolites, are useful in particular phases of the synthetic biology workflow. In this minireview, the role of metabolic modelling in synthetic biology will be discussed with a review of the current status of compatible methods and models for the in silico design and quantitative evaluation of a cell factory. PMID:24688669
How to Perform a Systematic Review and Meta-analysis of Diagnostic Imaging Studies.
Cronin, Paul; Kelly, Aine Marie; Altaee, Duaa; Foerster, Bradley; Petrou, Myria; Dwamena, Ben A
2018-05-01
A systematic review is a comprehensive search, critical evaluation, and synthesis of all the relevant studies on a specific (clinical) topic that can be applied to the evaluation of diagnostic and screening imaging studies. It can be a qualitative or a quantitative (meta-analysis) review of available literature. A meta-analysis uses statistical methods to combine and summarize the results of several studies. In this review, a 12-step approach to performing a systematic review (and meta-analysis) is outlined under the four domains: (1) Problem Formulation and Data Acquisition, (2) Quality Appraisal of Eligible Studies, (3) Statistical Analysis of Quantitative Data, and (4) Clinical Interpretation of the Evidence. This review is specifically geared toward the performance of a systematic review and meta-analysis of diagnostic test accuracy (imaging) studies. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
Pflock, Tobias J; Oellerich, Silke; Southall, June; Cogdell, Richard J; Ullmann, G Matthias; Köhler, Jürgen
2011-07-21
We have employed time-resolved spectroscopy on the picosecond time scale, in combination with dynamic Monte Carlo simulations, to investigate the photophysical properties of light-harvesting 2 (LH2) complexes from the purple photosynthetic bacterium Rhodopseudomonas acidophila. The variations of the fluorescence transients were studied as a function of the excitation fluence, the repetition rate of the excitation and the sample preparation conditions. Here we present the results obtained on detergent-solubilized LH2 complexes, i.e., avoiding intercomplex interactions, and show that a simple four-state model is sufficient to capture the experimental observations quantitatively without the need for any free parameters. This approach allows us to obtain a quantitative measure of the singlet-triplet annihilation rate in isolated, noninteracting LH2 complexes.
General solution for quantitative dark-field contrast imaging with grating interferometers
NASA Astrophysics Data System (ADS)
Strobl, M.
2014-11-01
Grating-interferometer-based imaging with X-rays and neutrons has proven to hold huge potential for applications in key research fields spanning biology and medicine as well as engineering and magnetism, respectively. The thereby accessible dark-field imaging modality carries the promise of access to structural information beyond the reach of direct spatial resolution. However, an approach that allows this outstanding potential to be exploited for non-destructive materials characterization has been missing until now; such an approach is reported here. It enables quantitative structural small-angle scattering information to be obtained, combined with up to 3-dimensional spatial image resolution, even at lab-based X-ray or neutron sources. The implied two orders of magnitude efficiency gain compared to currently available techniques in this regime paves the way for unprecedented structural investigations of complex sample systems of interest for materials science in a vast range of fields.
Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas
2018-02-06
We propose an original concept to measure enantiomeric excesses accurately from proton NMR spectra, which combines high-resolution techniques based on spatial encoding of the sample with the use of optically active, weakly orienting solvents. We show that it is possible to simulate accurately the dipolar edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations measured on experimental data, in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find that there is a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method for the determination of enantiomeric excesses based on the observation of 1H nuclei.
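For reference, the enantiomeric excess itself follows from the integrals of the two enantiomers' resolved signals by the standard definition (a generic helper, not code from the study):

    def enantiomeric_excess(integral_major, integral_minor):
        # ee (%) from the integrals of the resolved signals of the two enantiomers
        return (integral_major - integral_minor) / (integral_major + integral_minor) * 100

    print(enantiomeric_excess(97.5, 2.5))   # -> 95.0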
New approaches to investigating social gestures in autism spectrum disorder
2012-01-01
The combination of economic games and human neuroimaging presents the possibility of using economic probes to identify biomarkers for quantitative features of healthy and diseased cognition. These probes span a range of important cognitive functions, but one new use is in the domain of reciprocating social exchange with other humans - a capacity perturbed in a number of psychopathologies. We summarize the use of a reciprocating exchange game to elicit neural and behavioral signatures for subjects diagnosed with autism spectrum disorder (ASD). Furthermore, we outline early efforts to capture features of social exchange in computational models and use these to identify quantitative behavioral differences between subjects with ASD and matched controls. Lastly, we summarize a number of subsequent studies inspired by the modeling results, which suggest new neural and behavioral signatures that could be used to characterize subtle deficits in information processing during interactions with other humans. PMID:22958572
A Resource of Quantitative Functional Annotation for Homo sapiens Genes.
Taşan, Murat; Drabkin, Harold J; Beaver, John E; Chua, Hon Nian; Dunham, Julie; Tian, Weidong; Blake, Judith A; Roth, Frederick P
2012-02-01
The body of human genomic and proteomic evidence continues to grow at ever-increasing rates, while annotation efforts struggle to keep pace. A surprisingly small fraction of human genes have clear, documented associations with specific functions, and new functions continue to be found for characterized genes. Here we assembled an integrated collection of diverse genomic and proteomic data for 21,341 human genes and make quantitative associations of each to 4333 Gene Ontology terms. We combined guilt-by-profiling and guilt-by-association approaches to exploit features unique to the data types. Performance was evaluated by cross-validation, prospective validation, and by manual evaluation with the biological literature. Functional-linkage networks (FLNs) were also constructed, and their utility was demonstrated by identifying candidate glioma-related genes using a seed network from genome-wide association studies. Our annotations are presented-alongside existing validated annotations-in a publicly accessible and searchable web interface.
Tu, Qing; Lange, Björn; Parlak, Zehra; Lopes, Joao Marcelo J; Blum, Volker; Zauscher, Stefan
2016-07-26
Interfaces and subsurface layers are critical for the performance of devices made of 2D materials and heterostructures. Facile, nondestructive, and quantitative ways to characterize the structure of atomically thin, layered materials are thus essential to ensure control of the resultant properties. Here, we show that contact-resonance atomic force microscopy, which is exquisitely sensitive to stiffness changes that arise from even a single atomic layer of a van der Waals-adhered material, is a powerful experimental tool to address this challenge. A combined density functional theory and continuum modeling approach is introduced that yields subsurface-sensitive, nanomechanical fingerprints associated with specific, well-defined structure models of individual surface domains. Where such models are known, this information can be correlated with experimentally obtained contact-resonance frequency maps to reveal the (sub)surface structure of different domains on the sample.
Numerical Investigation of Vertical Plunging Jet Using a Hybrid Multifluid–VOF Multiphase CFD Solver
Shonibare, Olabanji Y.; Wardle, Kent E.
2015-06-28
A novel hybrid multiphase flow solver has been used to conduct simulations of a vertical plunging liquid jet. This solver combines a multifluid methodology with selective interface sharpening to enable simulation of both the initial jet impingement and the long-time entrained bubble plume phenomena. Models are implemented for variable bubble size capturing and dynamic switching of interface-sharpened regions to capture transitions from the initially fully segregated flow types into the dispersed bubbly flow regime. It was found that the solver was able to capture the salient features of the flow phenomena under study, and areas for quantitative improvement have been explored and identified. In particular, a population balance approach is employed, and detailed calibration of the underlying models with experimental data is required to enable quantitative prediction of bubble size and distribution and to capture the transition between segregated and dispersed flow types with greater fidelity.
Luo, Zhi; Marson, Domenico; Ong, Quy K; Loiudice, Anna; Kohlbrecher, Joachim; Radulescu, Aurel; Krause-Heuer, Anwen; Darwish, Tamim; Balog, Sandor; Buonsanti, Raffaella; Svergun, Dmitri I; Posocco, Paola; Stellacci, Francesco
2018-04-09
The ligand shell (LS) determines a number of nanoparticles' properties. Nanoparticles' cores can be accurately characterized, yet the structure of the LS, when composed of a mixture of molecules, can be described only qualitatively (e.g., patchy, Janus, and random). Here we show that a quantitative description of the LS morphology of monodisperse nanoparticles can be obtained using small-angle neutron scattering (SANS), measured at multiple contrasts achieved by either ligand or solvent deuteration. Three-dimensional models of the nanoparticles' core and LS are generated using an ab initio reconstruction method. Characteristic length scales extracted from the models are compared with simulations. We also characterize the evolution of the LS upon thermal annealing, and investigate the LS morphology of mixed-ligand copper and silver nanoparticles as well as gold nanoparticles coated with ternary mixtures. Our results suggest that SANS combined with multiphase modeling is a versatile approach for the characterization of nanoparticles' LS.
Mechanochemical models of processive molecular motors
NASA Astrophysics Data System (ADS)
Lan, Ganhui; Sun, Sean X.
2012-05-01
Motor proteins are the molecular engines powering the living cell. These nanometre-sized molecules convert chemical energy, both enthalpic and entropic, into useful mechanical work. High-resolution single molecule experiments can now observe motor protein movement with increasing precision. The emerging data must be combined with structural and kinetic measurements to develop a quantitative mechanism. This article describes a modelling framework in which a quantitative understanding of motor behaviour can be developed from the protein structure. The framework is applied to myosin motors, with emphasis on how synchrony between motor domains gives rise to processive unidirectional movement. The modelling approach shows that the elasticity of protein domains is important in regulating motor function. Simple models of protein domain elasticity are presented. The framework can be generalized to other motor systems, or to an ensemble of motors such as in muscle contraction. Indeed, for hundreds of myosins, our framework reduces to the Huxley-Simmons description of muscle movement in the mean-field limit.
NASA Astrophysics Data System (ADS)
Cho, Sehyeon; Choi, Min Ji; Kim, Minju; Lee, Sunhoe; Lee, Jinsung; Lee, Seok Joon; Cho, Haelim; Lee, Kyung-Tae; Lee, Jae Yeol
2015-03-01
A series of 3,4-dihydroquinazoline derivatives with anti-cancer activities against human lung cancer A549 cells were subjected to three-dimensional quantitative structure-activity relationship (3D-QSAR) studies using the comparative molecular similarity indices analysis (CoMSIA) approach. The most potent compound, 1, was used to align the molecules. As a result, the best prediction was obtained with CoMSIA combining the steric, electrostatic, hydrophobic, hydrogen bond donor, and hydrogen bond acceptor fields (q2 = 0.720, r2 = 0.897). This model was validated by an external test set of 6 compounds, giving a satisfactory predictive r2 value of 0.923, as well as by a scrambling stability test. This model should guide the design of potent 3,4-dihydroquinazoline derivatives as anti-cancer agents for the treatment of human lung cancer.
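For readers unfamiliar with the q2 statistic quoted above, the following sketch computes a leave-one-out cross-validated q2 (q2 = 1 - PRESS/SS) using PLS regression as a stand-in for the CoMSIA field analysis; the descriptor matrix and activities are random placeholders, not the study's data.

```python
# Sketch of the leave-one-out cross-validated q^2 statistic underlying
# 3D-QSAR model validation (q^2 = 1 - PRESS/SS). PLS regression stands in
# for the CoMSIA field analysis; X and y are random placeholders.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 50))          # 30 compounds x 50 field descriptors
y = X[:, :3].sum(axis=1) + rng.normal(scale=0.1, size=30)  # toy activities

press, ss = 0.0, float(((y - y.mean()) ** 2).sum())
for train, test in LeaveOneOut().split(X):
    model = PLSRegression(n_components=5).fit(X[train], y[train])
    press += float(((y[test] - model.predict(X[test]).ravel()) ** 2).sum())

q2 = 1.0 - press / ss
print(f"LOO q^2 = {q2:.3f}")   # values above ~0.5 are conventionally acceptable
```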
Uludağ, Yildiz; Piletsky, Sergey A; Turner, Anthony P F; Cooper, Matthew A
2007-11-01
Biomimetic recognition elements employed for the detection of analytes are commonly based on proteinaceous affibodies, immunoglobulins, single-chain and single-domain antibody fragments, or aptamers. The alternative supra-molecular approach using a molecularly imprinted polymer now has proven utility in numerous applications ranging from liquid chromatography to bioassays. Despite inherent advantages compared with biochemical/biological recognition (which include robustness, storage endurance and lower costs), there are few contributions that describe quantitative analytical applications of molecularly imprinted polymers for relevant small molecular mass compounds in real-world samples. There is, however, a significant literature describing the use of low-power, portable piezoelectric transducers to detect analytes in environmental monitoring and other application areas. Here we review the combination of molecularly imprinted polymers as recognition elements with piezoelectric biosensors for the quantitative detection of small molecules. Analytes are classified by type and sample matrix presentation, and various molecularly imprinted polymer synthetic fabrication strategies are also reviewed.
Hispanic fertility, immigration, and race in the twenty-first century
Parrado, Emilio A.; Flippen, Chenoa A.
2012-01-01
In this paper we systematically describe the connection between immigration and fertility in light of the increasing nativist reaction to Hispanic groups. We follow a life-course perspective to directly link migration and fertility transitions. The analysis combines original qualitative and quantitative data collected in Durham/Chapel Hill, NC as well as national level information from the Current Population Survey. The qualitative data provides a person-centered approach to the connection between migration and fertility that we then extend in quantitative analyses. Results demonstrate that standard demographic measures that treat migration and fertility as separate processes considerably distort the childbearing experience of immigrant women, inflating fertility estimates for Hispanics as a whole. Once this connection is taken into consideration the fertility levels of Hispanic women are much lower than those reported with standard measures and the fertility-specific contribution of Hispanics to U.S. population growth is much reduced. PMID:23066430
Automated solid-phase extraction workstations combined with quantitative bioanalytical LC/MS.
Huang, N H; Kagel, J R; Rossi, D T
1999-03-01
An automated solid-phase extraction workstation was used to develop, characterize and validate an LC/MS/MS method for quantifying a novel lipid-regulating drug in dog plasma. Method development was facilitated by workstation functions that allowed wash solvents of varying organic composition to be mixed and tested automatically. Precision estimates for this approach were within 9.8% relative standard deviation (RSD) across the calibration range. Accuracy for replicate determinations of quality controls was between -7.2 and +6.2% relative error (RE) over the range 5-1,000 ng/ml. Recoveries were evaluated for a wide variety of wash solvents, elution solvents and sorbents. Optimized recoveries were generally >95%. A sample throughput benchmark for the method was approximately 8 min per sample. Because of parallel sample processing, 100 samples were extracted in less than 120 min. The approach has proven useful with LC/MS/MS using multiple reaction monitoring (MRM).
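A minimal sketch of the two figures of merit quoted above, precision as relative standard deviation and accuracy as relative error; the replicate QC values are invented, not data from this study.

```python
# Hedged sketch of the validation metrics: relative standard deviation
# (precision) and relative error (accuracy) for replicate QC determinations.
# The replicate concentrations below are hypothetical.

import statistics

def rsd_percent(values):
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def re_percent(values, nominal):
    return 100.0 * (statistics.mean(values) - nominal) / nominal

qc_replicates = [96.1, 102.4, 98.8, 101.0, 97.5]   # ng/ml, hypothetical
print(f"RSD = {rsd_percent(qc_replicates):.1f}%")
print(f"RE  = {re_percent(qc_replicates, nominal=100.0):+.1f}%")
```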
Receptor-based 3D-QSAR in Drug Design: Methods and Applications in Kinase Studies.
Fang, Cheng; Xiao, Zhiyan
2016-01-01
Receptor-based 3D-QSAR strategy represents a superior integration of structure-based drug design (SBDD) and three-dimensional quantitative structure-activity relationship (3D-QSAR) analysis. It combines the accurate prediction of ligand poses by the SBDD approach with the good predictability and interpretability of statistical models derived from the 3D-QSAR approach. Extensive efforts have been devoted to the development of receptor-based 3D-QSAR methods, and two alternative approaches have been exploited. One involves computing the binding interactions between a receptor and a ligand to generate structure-based descriptors for QSAR analyses. The other applies various docking protocols to generate optimal ligand poses so as to provide reliable molecular alignments for conventional 3D-QSAR operations. This review highlights new concepts and methodologies recently developed in the field of receptor-based 3D-QSAR and, in particular, covers its application in kinase studies.
de Vargas Roditi, Laura; Claassen, Manfred
2015-08-01
Novel technological developments enable the profiling of single cell populations with respect to their spatial and molecular setup. These include single cell sequencing, flow cytometry and multiparametric imaging approaches, and they open unprecedented possibilities to learn about the heterogeneity, dynamics and interplay of the different cell types which constitute tissues and multicellular organisms. Statistical and dynamic systems theory approaches have been applied to quantitatively describe a variety of cellular processes, such as transcription and cell signaling. Machine learning approaches have been developed to define cell types, their mutual relationships, and the differentiation hierarchies shaping heterogeneous cell populations, yielding insights into topics such as immune cell differentiation and tumor cell type composition. This combination of experimental and computational advances has opened perspectives towards learning predictive multi-scale models of heterogeneous cell populations. Copyright © 2014 Elsevier Ltd. All rights reserved.
Multicriteria mapping of stakeholder preferences in regulating nanotechnology
NASA Astrophysics Data System (ADS)
Hansen, Steffen Foss
2010-08-01
In order to facilitate stakeholder discussions on how to regulate nanotechnology, the open-source program multicriteria mapping (MCM) was used to structure 26 interviews with stakeholders in the USA. MCM offers a systematic, part-quantitative, part-qualitative approach to clarifying why some regulatory options (bans, moratoriums, voluntary measures, etc.) were deemed acceptable or unacceptable by various stakeholders and which criteria stakeholders used to evaluate the different regulatory options. Adopting an incremental approach and implementing a new regulatory framework were evaluated as the best options, whereas a complete ban and no additional regulation of nanotechnology were found to be the least favorable. The criteria applied differed substantially among stakeholders and included social, ethical, regulatory, environmental, and health issues. Opinions on future regulation seem far less polarized than expected, and it seems that stakeholders would welcome a combination of voluntary measures, an incremental approach and the formation of a new regulatory framework.
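A minimal weighted-scoring sketch in the spirit of multicriteria mapping is shown below; the options mirror those named above, but the criteria, weights, and scores are invented for illustration (MCM itself elicits option-specific scores and weights per interviewee).

```python
# Minimal multicriteria-scoring sketch: each regulatory option is scored
# against weighted criteria and options are ranked by weighted sum.
# All numbers are invented placeholders, not interview data.

import numpy as np

options = ["ban", "moratorium", "voluntary measures",
           "incremental approach", "new regulatory framework"]
criteria_weights = np.array([0.4, 0.3, 0.3])   # e.g. health, environment, cost

# rows: options, cols: criteria, scores on a 0-100 scale (hypothetical)
scores = np.array([
    [80, 85, 10],
    [70, 75, 25],
    [40, 35, 80],
    [75, 70, 70],
    [85, 80, 55],
])

weighted = scores @ criteria_weights
for opt, w in sorted(zip(options, weighted), key=lambda p: -p[1]):
    print(f"{opt:25s} {w:5.1f}")
```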
Recent Advances in Targeted and Untargeted Metabolomics by NMR and MS/NMR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingol, Kerem
Metabolomics has made significant progress on multiple fronts in the last 18 months. This minireview aims to give an overview of these advancements in light of their contribution to targeted and untargeted metabolomics. New computational approaches have emerged to overcome the manual absolute quantitation step for metabolites in 1D 1H NMR spectra, providing more consistency in inter-laboratory comparisons. Integration of 2D NMR metabolomics databases under a unified web server has allowed very accurate identification of the metabolites that have been catalogued in these databases. For the remaining uncatalogued and unknown metabolites, new cheminformatics approaches have been developed by combining NMR and mass spectrometry. These hybrid NMR/MS approaches have accelerated the identification of unknowns in untargeted studies, and they now allow ever larger numbers of metabolites to be profiled in application studies.
Tug-of-war lacunarity—A novel approach for estimating lacunarity
NASA Astrophysics Data System (ADS)
Reiss, Martin A.; Lemmerer, Birgit; Hanslmeier, Arnold; Ahammer, Helmut
2016-11-01
Modern instrumentation provides us with massive repositories of digital images that will likely only grow in the future. It has therefore become increasingly important to automate the analysis of digital images, e.g., with methods from pattern recognition. These methods aim to characterize the visual appearance of captured textures with quantitative measures. As such, lacunarity is a useful multi-scale measure of a texture's heterogeneity, but it demands high computational effort. Here we investigate a novel approach based on the tug-of-war algorithm, which estimates lacunarity in a single pass over the image. We computed lacunarity for theoretical and real-world sample images, and found that the investigated approach is able to estimate lacunarity with low uncertainty. We conclude that the proposed method combines low computational effort with high accuracy, and that its application may have utility in the analysis of high-resolution images.
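The following sketch illustrates one way a tug-of-war (random-sign) estimator can approximate fixed-grid lacunarity, Λ(r) = B·ΣM_b²/(ΣM_b)², in a single pass over the foreground pixels; it is an assumption-laden reconstruction of the idea, not the authors' exact algorithm.

```python
# Sketch of a tug-of-war style one-pass lacunarity estimate for a binary
# image, under the fixed-grid definition  Λ(r) = B * sum(M_b^2) / (sum M_b)^2.
# The second moment sum(M_b^2) is estimated with random-sign sketches
# (E[Z^2] = sum M_b^2); details may differ from the published algorithm.

import numpy as np

def tug_of_war_lacunarity(img: np.ndarray, r: int, n_sketches: int = 64,
                          seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    ys, xs = np.nonzero(img)                 # one pass over foreground pixels
    box_ids = (ys // r) * (img.shape[1] // r + 1) + (xs // r)
    n_boxes = (img.shape[0] // r) * (img.shape[1] // r)
    total_mass = len(ys)
    z2 = []
    for _ in range(n_sketches):
        signs = rng.choice([-1.0, 1.0], size=box_ids.max() + 1)
        z = signs[box_ids].sum()             # Z = sum_b s_b * M_b
        z2.append(z * z)                     # E[Z^2] = sum_b M_b^2
    return n_boxes * float(np.mean(z2)) / total_mass ** 2

img = (np.random.default_rng(1).random((256, 256)) < 0.2)
print(tug_of_war_lacunarity(img.astype(np.uint8), r=16))  # ~1 for random noise
```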
Papamokos, George; Silins, Ilona
2016-01-01
There is an increasing need for new, reliable, non-animal-based methods to predict and test the toxicity of chemicals. Quantitative structure-activity relationship (QSAR), a computer-based method linking chemical structures with biological activities, is used in predictive toxicology. In this study, we tested an approach that combines QSAR data with literature profiles of carcinogenic modes of action automatically generated by a text-mining tool. The aim was to generate data patterns to identify associations between chemical structures and biological mechanisms related to carcinogenesis. Using these two methods, individually and combined, we evaluated 96 rat carcinogens of the hematopoietic system, liver, lung, and skin. We found that skin and lung rat carcinogens were mainly mutagenic, while the group of carcinogens affecting the hematopoietic system and the liver also included a large proportion of non-mutagens. The automatic literature analysis showed that mutagenicity was a frequently reported endpoint in the literature on these carcinogens; however, less common endpoints such as immunosuppression and hormonal receptor-mediated effects were also found in connection with some of the carcinogens, results of potential importance for certain target organs. The combined approach, using QSAR and text-mining techniques, could be useful for identifying more detailed information on biological mechanisms and their relation to chemical structures. The method can be particularly useful for increasing the understanding of structure-activity relationships for non-mutagens. PMID:27625608
Combinatorial Drug Screening Identifies Ewing Sarcoma-specific Sensitivities.
Radic-Sarikas, Branka; Tsafou, Kalliopi P; Emdal, Kristina B; Papamarkou, Theodore; Huber, Kilian V M; Mutz, Cornelia; Toretsky, Jeffrey A; Bennett, Keiryn L; Olsen, Jesper V; Brunak, Søren; Kovar, Heinrich; Superti-Furga, Giulio
2017-01-01
Improvements in survival for pediatric and adolescent Ewing sarcoma patients have been modest over the past 20 years. Combinations of anticancer agents endure as an option to overcome resistance to single treatments caused by compensatory pathways. Moreover, combinations are thought to lessen any associated adverse side effects through reduced dosing, which is particularly important in childhood tumors. Using a parallel phenotypic combinatorial screening approach of cells derived from three pediatric tumor types, we identified Ewing sarcoma-specific interactions of a diverse set of targeted agents, including approved drugs. We were able to retrieve highly synergistic drug combinations specific for Ewing sarcoma and identified signaling processes important for Ewing sarcoma cell proliferation that are determined by EWS-FLI1. We generated a molecular target profile of PKC412, a multikinase inhibitor with strong synergistic propensity in Ewing sarcoma, revealing its targets in critical Ewing sarcoma signaling routes. Using a multilevel experimental approach including quantitative phosphoproteomics, we analyzed the molecular rationale behind the disease-specific synergistic effect of simultaneous application of PKC412 and IGF1R inhibitors. The mechanism of the drug synergy between these inhibitors is different from the sum of the mechanisms of the single agents. The combination effectively inhibited pathway crosstalk and averted feedback loop repression, in an EWS-FLI1-dependent manner. Mol Cancer Ther; 16(1); 88-101. ©2016 American Association for Cancer Research.
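As one common way to quantify the synergy discussed above, the sketch below computes a Bliss-independence excess from single-agent and combination inhibition fractions; the numbers are invented, and the study's own scoring may differ.

```python
# Illustrative Bliss-independence synergy score, one common way to quantify
# drug-combination effects in phenotypic screens (not necessarily the scoring
# used in this study). Fractional inhibitions below are invented.

def bliss_excess(e_a: float, e_b: float, e_ab: float) -> float:
    """Observed minus expected combined inhibition (0..1 scale).
    Positive values indicate synergy, negative antagonism."""
    expected = e_a + e_b - e_a * e_b
    return e_ab - expected

# e.g. hypothetical PKC412 + IGF1R-inhibitor responses
print(f"Bliss excess: {bliss_excess(0.30, 0.25, 0.70):+.2f}")  # ~+0.22
```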
Image-based quantification and mathematical modeling of spatial heterogeneity in ESC colonies.
Herberg, Maria; Zerjatke, Thomas; de Back, Walter; Glauche, Ingmar; Roeder, Ingo
2015-06-01
Pluripotent embryonic stem cells (ESCs) have the potential to differentiate into cells of all three germ layers. This unique property has been extensively studied on the intracellular, transcriptional level. However, ESCs typically form clusters of cells with distinct size and shape, and establish spatial structures that are vital for the maintenance of pluripotency. Even though it is recognized that the cells' arrangement and local interactions play a role in fate decision processes, the relations between transcriptional and spatial patterns have not yet been studied. We present a systems biology approach which combines live-cell imaging, quantitative image analysis, and multiscale, mathematical modeling of ESC growth. In particular, we develop quantitative measures of the morphology and of the spatial clustering of ESCs with different expression levels and apply them to images of both in vitro and in silico cultures. Using the same measures, we are able to compare model scenarios with different assumptions on cell-cell adhesions and intercellular feedback mechanisms directly with experimental data. Applying our methodology to microscopy images of cultured ESCs, we demonstrate that the emerging colonies are highly variable regarding both morphological and spatial fluorescence patterns. Moreover, we can show that most ESC colonies contain only one cluster of cells with high self-renewing capacity. These cells are preferentially located in the interior of a colony structure. The integrated approach combining image analysis with mathematical modeling allows us to reveal potential transcription factor related cellular and intercellular mechanisms behind the emergence of observed patterns that cannot be derived from images directly. © 2015 International Society for Advancement of Cytometry.
Kasiri, Keyvan; Kazemi, Kamran; Dehghani, Mohammad Javad; Helfroush, Mohammad Sadegh
2013-01-01
In this paper, we present a new semi-automatic brain tissue segmentation method based on a hybrid hierarchical approach that combines a brain atlas, as a priori information, with a least-squares support vector machine (LS-SVM). The method consists of three steps. In the first two steps, the skull is removed and the cerebrospinal fluid (CSF) is extracted. These two steps are performed using the FMRIB Automated Segmentation Tool integrated in the FSL software (FSL-FAST), developed at the Oxford Centre for Functional MRI of the Brain (FMRIB). Then, in the third step, the LS-SVM is used to segment grey matter (GM) and white matter (WM). The training samples for the LS-SVM are selected from the registered brain atlas. Voxel intensities and spatial positions are selected as the two feature groups for training and testing. The SVM, as a powerful discriminator, is able to handle nonlinear classification problems; however, it cannot provide posterior probabilities. Thus, we use a sigmoid function to map the SVM output into probabilities. The proposed method is used to segment CSF, GM and WM from simulated magnetic resonance imaging (MRI) using the BrainWeb MRI simulator and from real data provided by the Internet Brain Segmentation Repository. The semi-automatically segmented brain tissues were evaluated by comparison with the corresponding ground truth. The Dice and Jaccard similarity coefficients, sensitivity and specificity were calculated for the quantitative validation of the results. The quantitative results show that the proposed method segments brain tissues accurately with respect to the corresponding ground truth. PMID:24696800
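A minimal sketch of the Dice and Jaccard similarity coefficients used for the quantitative validation; the toy masks are placeholders.

```python
# Overlap metrics between a segmented binary mask and its ground truth.

import numpy as np

def dice(seg: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(seg, gt).sum()
    return 2.0 * inter / (seg.sum() + gt.sum())

def jaccard(seg: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(seg, gt).sum()
    return inter / np.logical_or(seg, gt).sum()

seg = np.array([[1, 1, 0], [0, 1, 0]], bool)   # toy masks
gt  = np.array([[1, 1, 0], [0, 0, 1]], bool)
print(dice(seg, gt), jaccard(seg, gt))          # ~0.667 and 0.5
```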
Wavelet-Based Visible and Infrared Image Fusion: A Comparative Study
Sappa, Angel D.; Carvajal, Juan A.; Aguilera, Cristhian A.; Oliveira, Miguel; Romero, Dennis; Vintimilla, Boris X.
2016-01-01
This paper evaluates different wavelet-based cross-spectral image fusion strategies adopted to merge visible and infrared images. The objective is to find the best setup independently of the evaluation metric used to measure the performance. Quantitative performance results are obtained with state-of-the-art approaches together with adaptations proposed in the current work. The options evaluated here result from the combination of different setups in the wavelet image decomposition stage with different fusion strategies for the final merging stage that generates the resulting representation. Most approaches evaluate results according to the application for which they are intended. Sometimes a human observer is selected to judge the quality of the obtained results. In the current work, quantitative values are considered in order to find correlations between setups and the performance of the obtained results; these correlations can be used to define criteria for selecting the best fusion strategy for a given pair of cross-spectral images. The whole procedure is evaluated with a large set of correctly registered visible and infrared image pairs, including both Near InfraRed (NIR) and Long Wave InfraRed (LWIR). PMID:27294938
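A sketch of one representative wavelet fusion setup of the kind evaluated here, using PyWavelets: average the approximation coefficients and take the maximum-absolute detail coefficients before reconstruction. The wavelet, level, and fusion rules are illustrative choices, not the paper's best setup.

```python
# Hedged sketch of a wavelet fusion strategy for two registered images:
# decompose both, average the approximation band, take max-abs details,
# and reconstruct. Requires PyWavelets (pywt).

import numpy as np
import pywt

def fuse_wavelet(vis: np.ndarray, ir: np.ndarray,
                 wavelet: str = "db2", level: int = 2) -> np.ndarray:
    cv = pywt.wavedec2(vis.astype(float), wavelet, level=level)
    ci = pywt.wavedec2(ir.astype(float), wavelet, level=level)
    fused = [(cv[0] + ci[0]) / 2.0]                  # approximation: average
    for dv, di in zip(cv[1:], ci[1:]):               # details: max-abs rule
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(dv, di)))
    return pywt.waverec2(fused, wavelet)

rng = np.random.default_rng(0)
vis, ir = rng.random((128, 128)), rng.random((128, 128))
print(fuse_wavelet(vis, ir).shape)   # (128, 128)
```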
Automatic detection and quantitative analysis of cells in the mouse primary motor cortex
NASA Astrophysics Data System (ADS)
Meng, Yunlong; He, Yong; Wu, Jingpeng; Chen, Shangbin; Li, Anan; Gong, Hui
2014-09-01
Neuronal cells play a very important role in metabolic regulation and mechanistic control, so cell number is a fundamental determinant of brain function. By combining suitable cell-labeling approaches with recently proposed three-dimensional optical imaging techniques, whole mouse brain coronal sections can be acquired at 1-μm voxel resolution. We have developed a completely automatic pipeline to perform cell centroid detection, and provide three-dimensional quantitative information on cells in the primary motor cortex of the C57BL/6 mouse. It involves four principal steps: i) preprocessing; ii) image binarization; iii) cell centroid extraction and contour segmentation; and iv) laminar density estimation. Investigations of the presented method reveal promising detection accuracy in terms of recall and precision, with an average recall rate of 92.1% and an average precision rate of 86.2%. We also analyze the laminar density distribution of cells from the pial surface to the corpus callosum from the output vectorizations of detected cell centroids in mouse primary motor cortex, and find significant variations in cellular density distribution across different layers. This automatic cell centroid detection approach will be beneficial for fast cell counting and accurate density estimation, as time-consuming and error-prone manual identification is avoided.
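To illustrate how recall and precision figures like those above can be computed, the sketch below matches detected centroids to annotated ones within a distance tolerance using a KD-tree; the coordinates and tolerance are invented.

```python
# Sketch: compute detection recall/precision by greedily matching detected
# cell centroids to manually annotated ones within a distance tolerance.

import numpy as np
from scipy.spatial import cKDTree

def match_centroids(detected, annotated, tol=5.0):
    tree = cKDTree(annotated)
    dists, idx = tree.query(detected, k=1)
    matched, tp = set(), 0
    for d, i in zip(dists, idx):              # greedy one-to-one matching
        if d <= tol and i not in matched:
            matched.add(i)
            tp += 1
    return tp / len(annotated), tp / len(detected)   # recall, precision

rng = np.random.default_rng(0)
annotated = rng.random((100, 3)) * 200        # ground-truth centroids (µm)
detected = annotated[:90] + rng.normal(0, 1, (90, 3))   # 90 found, jittered
print(match_centroids(detected, annotated))   # recall 0.9, precision 1.0
```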
Daher, Céline; Pimenta, Vanessa; Bellot-Gurlet, Ludovic
2014-11-01
The compositions of ancient varnishes are mainly determined destructively by separation methods coupled to mass spectrometry. In this study, a methodology for non-invasive quantitative analysis of varnishes by vibrational spectroscopies is proposed. To this end, simplified experimental varnishes of colophony and linseed oil were prepared according to 18th century traditional recipes with increasing colophony/linseed oil mass concentration ratios. FT-Raman and IR analyses using ATR and non-invasive reflectance modes were performed on the "pure" materials and on the different mixtures. Then, a new approach involving spectral decomposition was developed, treating each mixture spectrum as a linear combination of the pure-material spectra and yielding the relative amount of each component. Specific spectral regions were treated, and the results show good agreement between the prepared and calculated amounts of the two compounds. We were thus able to detect and quantify from 10% to 50% of colophony in linseed oil using non-invasive techniques that can also be applied in situ with portable instruments when it comes to museum varnished objects and artifacts. Copyright © 2014 Elsevier B.V. All rights reserved.
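A minimal sketch of the spectral-decomposition calculation, treating a mixture spectrum as a non-negative linear combination of the pure-material spectra and recovering relative amounts by non-negative least squares; the Gaussian "spectra" are invented stand-ins for the FT-Raman/IR data.

```python
# Spectral unmixing sketch: recover component fractions from a mixture
# spectrum modeled as a non-negative linear combination of pure spectra.

import numpy as np
from scipy.optimize import nnls

wavenumbers = np.linspace(600, 1800, 400)
def band(center, width):                      # simple Gaussian band
    return np.exp(-((wavenumbers - center) / width) ** 2)

colophony = band(1690, 40) + 0.6 * band(1450, 30)    # invented "pure" spectra
linseed   = band(1740, 25) + 0.8 * band(1160, 35)

mixture = 0.3 * colophony + 0.7 * linseed            # 30/70 mixture
A = np.column_stack([colophony, linseed])
coeffs, residual = nnls(A, mixture)
fractions = coeffs / coeffs.sum()
print(fractions)    # ~[0.3, 0.7], recovering the prepared proportions
```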
Malucelli, Emil; Iotti, Stefano; Gianoncelli, Alessandra; Fratini, Michela; Merolle, Lucia; Notargiacomo, Andrea; Marraccini, Chiara; Sargenti, Azzurra; Cappadone, Concettina; Farruggia, Giovanna; Bukreeva, Inna; Lombardo, Marco; Trombini, Claudio; Maier, Jeanette A; Lagomarsino, Stefano
2014-05-20
We report a method that allows a complete quantitative characterization of whole single cells, assessing the total amount of carbon, nitrogen, oxygen, sodium, and magnesium and providing submicrometer maps of element molar concentration, cell density, mass, and volume. This approach allows quantifying elements down to 10^6 atoms/μm^3. This result was obtained by applying a multimodal fusion approach that combines synchrotron radiation microscopy techniques with off-line atomic force microscopy. The proposed method permits us to find the element concentration in addition to the mass fraction and provides a deeper and more complete knowledge of cell composition. We performed measurements on LoVo human colon cancer cells sensitive (LoVo-S) and resistant (LoVo-R) to doxorubicin. The comparison of LoVo-S and LoVo-R revealed different patterns in the maps of Mg concentration, with higher values within the nucleus in LoVo-R cells and in the perinuclear region in LoVo-S cells. This feature was not so evident for the other elements, suggesting that Mg compartmentalization could be a significant trait of drug-resistant cells.
NASA Astrophysics Data System (ADS)
Banerjee, Paromita; Soni, Jalpa; Ghosh, Nirmalya; Sengupta, Tapas K.
2013-02-01
There is considerable current interest in developing methods that help to understand and quantify cellular association in growing bacterial colonies, which is also important for the detection and identification of bacterial species. A novel approach is used here to probe the morphological and structural changes occurring during the growth of colonies of Bacillus thuringiensis under different environmental conditions (normal nutrient agar, glucose added as an additional nutrient, and 3 mM arsenate added as an additional toxic material). This approach combines quantitative Mueller matrix polarimetry, to extract intrinsic polarization properties, with inverse analysis of the polarization-preserving part of the light scattering spectra to determine the fractal parameter H (Hurst exponent) using the Born approximation. Interesting differences are observed in the intrinsic polarization parameters and also in the Hurst exponent, which is a measure of the fractality of the pattern formed by bacteria growing as a colony. These findings are further confirmed by optical microscopic studies of the same samples, and the results indicate a very strong and distinct dependence on the environmental conditions during growth, which can be exploited to characterize different bacterial species and their growth patterns.
McMeekin, T A
2007-09-01
Predictive microbiology is considered in the context of the conference theme "chance, innovation and challenge", together with the impact of quantitative approaches on food microbiology, generally. The contents of four prominent texts on predictive microbiology are analysed and the major contributions of two meat microbiologists, Drs. T.A. Roberts and C.O. Gill, to the early development of predictive microbiology are highlighted. These provide a segue into R&D trends in predictive microbiology, including the Refrigeration Index, an example of science-based, outcome-focussed food safety regulation. Rapid advances in technologies and systems for application of predictive models are indicated and measures to judge the impact of predictive microbiology are suggested in terms of research outputs and outcomes. The penultimate section considers the future of predictive microbiology and advances that will become possible when data on population responses are combined with data derived from physiological and molecular studies in a systems biology approach. Whilst the emphasis is on science and technology for food safety management, it is suggested that decreases in foodborne illness will also arise from minimising human error by changing the food safety culture.
Ho, Jenny T. C.; Smit, August B.; Li, Ka Wan
2018-01-01
Data-independent acquisition (DIA) is an emerging technology for quantitative proteomics. Current DIA focusses on the identification and quantitation of fragment ions that are generated from multiple peptides contained in the same selection window of several to tens of m/z. An alternative approach is WiSIM-DIA, which combines conventional DIA with wide-SIM (wide selected-ion monitoring) windows to partition the precursor m/z space and produce high-quality precursor ion chromatograms. However, WiSIM-DIA has been underexplored; it remains unclear if it is a viable alternative to DIA. We demonstrate that WiSIM-DIA quantified more than 24,000 unique peptides over five orders of magnitude in a single 2 h analysis of a neuronal synapse-enriched fraction, compared to 31,000 in DIA. There is a strong correlation between the abundance values of peptides quantified in both the DIA and WiSIM-DIA datasets. Interestingly, the S/N ratios of these peptides are not correlated. We further show that peptide identification directly from DIA spectra identified >2000 proteins, including unique peptides not found in spectral libraries generated by DDA. PMID:29134766
O'Neill, Sharon; Mathis, Magalie; Kovačič, Lidija; Zhang, Suisheng; Reinhardt, Jürgen; Scholz, Dimitri; Schopfer, Ulrich; Bouhelal, Rochdi; Knaus, Ulla G
2018-06-08
Protein-protein interactions critically regulate many biological systems, but quantifying the functional assembly of multipass membrane complexes in their native context is still challenging. Here, we combined modeling-assisted protein modification and information from human disease variants with a minimal-size fusion tag, split-luciferase-based approach to probe assembly of the NADPH oxidase 4 (NOX4)-p22 phox enzyme, an integral membrane complex with unresolved structure, which is required for electron transfer and generation of reactive oxygen species (ROS). Integrated analyses of heterodimerization, trafficking, and catalytic activity identified determinants of the NOX4-p22 phox interaction, such as heme incorporation into NOX4 and hot spot residues in transmembrane domains 1 and 4 of p22 phox. Moreover, their effect on NOX4 maturation and ROS generation was analyzed. We propose that this reversible and quantitative protein-protein interaction technique, with its small split-fragment approach, will provide a protein engineering and discovery tool not only for NOX research, but also for other intricate membrane protein complexes, and may thereby facilitate new drug discovery strategies for managing NOX-associated diseases. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.
Chilisa, Bagele; Mohiemang, Irene; Mpeta, Kolentino Nyamadzapasi; Malinga, Tumane; Ntshwarang, Poloko; Koyabe, Bramwell Walela; Heeren, G. Anita
2016-01-01
Culture and tradition influence behaviour. Multiple-partner and concurrent relationships are held responsible for the increase of HIV infection in Sub-Saharan Africa. A contextualized “Theory of Planned Behaviour” was used to identify predictors of the intention to practice monogamy. In a mixed-method design, qualitative data from focus groups and stories were combined with quantitative data from a survey. The qualitative data added a socio-cultural belief domain to the behavioural beliefs; attitudes, subjective norms, and perceived behavioural control predicted the intention to practice monogamy. The adolescents showed a tendency towards having more than one sexual partner. The normative beliefs and the socio-cultural beliefs also predicted intentions, while hedonistic belief and partner reaction did not. In contextualizing theory-based interventions, it is important to draw from the stories and the language that circulate in a community about a given behaviour. More studies are needed on ways to combine qualitative approaches with quantitative approaches to inform the development of theory-based, culturally appropriate and context-specific intervention strategies to reduce the risk of HIV. PMID:28090169
Nonlinear optical microscopy and ultrasound imaging of human cervical structure
NASA Astrophysics Data System (ADS)
Reusch, Lisa M.; Feltovich, Helen; Carlson, Lindsey C.; Hall, Gunnsteinn; Campagnola, Paul J.; Eliceiri, Kevin W.; Hall, Timothy J.
2013-03-01
The cervix softens and shortens as its collagen microstructure rearranges in preparation for birth, but premature change may lead to premature birth. The global preterm birth rate has not decreased despite decades of research, likely because cervical microstructure is poorly understood. Our group has developed a multilevel approach to evaluating the human cervix. We are developing quantitative ultrasound (QUS) techniques for noninvasive interrogation of cervical microstructure and corroborating those results with high-resolution images of microstructure from second harmonic generation imaging (SHG) microscopy. We obtain ultrasound measurements from hysterectomy specimens, prepare the tissue for SHG, and stitch together several hundred images to create a comprehensive view of large areas of cervix. The images are analyzed for collagen orientation and alignment with curvelet transform, and registered with QUS data, facilitating multiscale analysis in which the micron-scale SHG images and millimeter-scale ultrasound data interpretation inform each other. This novel combination of modalities allows comprehensive characterization of cervical microstructure in high resolution. Through a detailed comparative study, we demonstrate that SHG imaging both corroborates the quantitative ultrasound measurements and provides further insight. Ultimately, a comprehensive understanding of specific microstructural cervical change in pregnancy should lead to novel approaches to the prevention of preterm birth.
NASA Astrophysics Data System (ADS)
Jézéquel, Tangi; Silvestre, Virginie; Dinis, Katy; Giraudeau, Patrick; Akoka, Serge
2018-04-01
Isotope ratio monitoring by 13C NMR spectrometry (irm-13C NMR) provides the complete intramolecular position-specific 13C composition at natural abundance. It represents a powerful tool to track the (bio)chemical pathway which has led to the synthesis of targeted molecules, since it allows Position-specific Isotope Analysis (PSIA). Because of the very small range of variation of 13C natural abundance values (50‰), irm-13C NMR requires 1‰ accuracy and thus highly quantitative analysis by 13C NMR. Until now, the conventional strategy to determine the position-specific abundance xi has relied on the combination of irm-MS (isotope ratio monitoring Mass Spectrometry) and quantitative 13C NMR. However, this approach presents a serious drawback since it relies on two different techniques and requires the signals of all the carbons of the analyzed compound to be measured separately, which is not always possible. To circumvent this constraint, we recently proposed a new methodology to perform 13C isotopic analysis using an internal reference method and relying on NMR only. The method combines a highly quantitative 1H NMR pulse sequence (named DWET) with a 13C isotopic NMR measurement. However, the recently published DWET sequence is unsuited for samples with short T1, which is a serious limitation for irm-13C NMR experiments where a relaxing agent is added. In this context, we suggest two variants of the DWET, called Multi-WET and Profiled-WET, developed and optimized to reach the same accuracy of 1‰ with better immunity towards T1 variations. Their performance is evaluated on the determination of the 13C isotopic profile of vanillin. Both pulse sequences show 1‰ accuracy with increased robustness to pulse miscalibrations compared to the initial DWET method. This constitutes a major advance in the context of irm-13C NMR since it is now possible to perform isotopic analysis with high relaxing agent concentrations, leading to a strong reduction of the overall experiment time.
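For orientation, the conventional irm-MS plus quantitative NMR combination mentioned above can be sketched as follows, under the textbook assumption that each quantitative 13C signal area is proportional to that position's 13C content, so that xi = n · x_mean · Si/ΣS; the areas and the molecular-average abundance are invented numbers.

```python
# Hedged sketch of the conventional irm-MS + quantitative NMR combination:
# irm-MS gives the molecular-average 13C abundance, the 13C NMR peak areas
# give each position's share, and xi = n * x_mean * S_i / sum(S).
# All numerical values below are invented placeholders.

n_carbons = 8                      # e.g. vanillin has 8 carbons
x_mean = 0.01085                   # molecular-average 13C abundance (irm-MS)
areas = [1.02, 0.99, 1.01, 0.98, 1.00, 1.03, 0.97, 1.00]  # hypothetical

total = sum(areas)
for i, s in enumerate(areas, start=1):
    x_i = n_carbons * x_mean * s / total
    print(f"C{i}: x = {x_i:.5f} ({1000*(x_i/x_mean - 1):+.1f} per mil vs mean)")
```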
Gunawardena, Harsha P.; Feltcher, Meghan E.; Wrobel, John A.; Gu, Sheng; Braunstein, Miriam; Chen, Xian
2015-01-01
The Mycobacterium tuberculosis (MTB) membrane is rich in antigens that are potential targets for diagnostics and the development of new vaccines. To better understand the mechanisms underlying MTB virulence and identify new targets for therapeutic intervention, we investigated the differential composition of membrane proteomes between virulent M. tuberculosis H37Rv (MTB) and the Mycobacterium bovis BCG vaccine strain. To compare the membrane proteomes, we used LC-MS/MS analysis in combination with label-free quantitative (LFQ) proteomics, utilizing the area under the curve (AUC) of the extracted ion chromatograms (XIC) of peptides obtained from m/z and retention time alignment of MS1 features. With this approach, we obtained relative abundance ratios for 2,203 identified membrane-associated proteins with high confidence. Of these proteins, 294 showed statistically significant differences of at least 2-fold in relative abundance between MTB and BCG membrane fractions. Our comparative analysis detected as absent several proteins associated with known genomic regions of difference between MTB and BCG, which validated the accuracy of our approach. In further support of our label-free quantitative data, we verified select protein differences by immunoblotting. To our knowledge, we have generated the first comprehensive, high-coverage profile of comparative membrane proteome changes between virulent MTB and its attenuated relative BCG, which helps elucidate the proteomic basis of the intrinsic virulence of the MTB pathogen. PMID:24093440
NASA Astrophysics Data System (ADS)
Mok, Aaron T. Y.; Lee, Kelvin C. M.; Wong, Kenneth K. Y.; Tsia, Kevin K.
2018-02-01
Biophysical properties of cells could complement and correlate with biochemical markers to characterize a multitude of cellular states. Changes in cell size, dry mass and subcellular morphology, for instance, are relevant to cell-cycle progression, which is prevalently evaluated by DNA-targeted fluorescence measurements. Quantitative-phase microscopy (QPM) is among the effective biophysical phenotyping tools that can quantify cell size and sub-cellular dry mass density distribution of single cells at high spatial resolution. However, the limited camera frame rate, and thus imaging throughput, makes QPM incompatible with high-throughput flow cytometry, a gold standard in multiparametric cell-based assays. Here we present a high-throughput approach for label-free analysis of the cell cycle based on quantitative-phase time-stretch imaging flow cytometry at a throughput of >10,000 cells/s. Our time-stretch QPM system enables sub-cellular resolution even at high speed, allowing us to extract a multitude (at least 24) of single-cell biophysical phenotypes (from both amplitude and phase images). Those phenotypes can be combined to track cell-cycle progression based on a t-distributed stochastic neighbor embedding (t-SNE) algorithm. Using multivariate analysis of variance (MANOVA) discriminant analysis, cell-cycle phases can also be predicted label-free with high accuracy: >90% in G1 and G2 phase, and >80% in S phase. We anticipate that high-throughput label-free cell-cycle characterization could open new approaches for large-scale single-cell analysis, bringing new mechanistic insights into complex biological processes including disease pathogenesis.
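A minimal sketch of the t-SNE embedding step used to track cell-cycle progression from the extracted phenotypes; the feature matrix is a random placeholder for the 24 biophysical phenotypes per cell.

```python
# Sketch: embed multi-phenotype single-cell data with t-SNE for visualizing
# cell-cycle progression. Data are random placeholders, not measurements.

import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
phenotypes = rng.normal(size=(2000, 24))   # 2000 cells x 24 biophysical features

embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(phenotypes)
print(embedding.shape)                     # (2000, 2) map for visualization
```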
Ebalunode, Jerry O; Zheng, Weifan; Tropsha, Alexander
2011-01-01
Optimization of chemical library composition affords more efficient identification of hits from biological screening experiments. The optimization could be achieved through rational selection of reagents used in combinatorial library synthesis. However, with the rapid advent of parallel synthesis methods and the availability of millions of compounds synthesized by many vendors, it may be more efficient to design targeted libraries by means of virtual screening of commercial compound collections. This chapter reviews the application of advanced cheminformatics approaches such as quantitative structure-activity relationships (QSAR) and pharmacophore modeling (both ligand and structure based) for virtual screening. Both approaches rely on empirical SAR data to build models; thus, the emphasis is placed on achieving models of the highest rigor and external predictive power. We present several examples of successful applications of both approaches for virtual screening to illustrate their utility. We suggest that the expert use of both QSAR and pharmacophore models, either independently or in combination, enables users to achieve targeted libraries enriched with experimentally confirmed hit compounds.
Quantitative assessment in thermal image segmentation for artistic objects
NASA Astrophysics Data System (ADS)
Yousefi, Bardia; Sfarra, Stefano; Maldague, Xavier P. V.
2017-07-01
The application of thermal and infrared technology in different areas of research is increasing considerably. These applications involve Non-destructive Testing (NDT), medical analysis (Computer Aided Diagnosis/Detection, CAD), and arts and archaeology, among many others. In the arts and archaeology field, infrared technology provides significant contributions in terms of finding defects in possibly impaired regions. This has been done through a wide range of different thermographic experiments and infrared methods. The approach proposed here focuses on the application of known factor analysis methods, such as standard Non-Negative Matrix Factorization (NMF) optimized by gradient-descent-based multiplicative rules (SNMF1) and standard NMF optimized by the non-negative least squares (NNLS) active-set algorithm (SNMF2), and eigendecomposition approaches such as Principal Component Thermography (PCT) and Candid Covariance-Free Incremental Principal Component Thermography (CCIPCT), to obtain thermal features. On one hand, these methods are usually applied as preprocessing before clustering for the purpose of segmenting possible defects. On the other hand, a wavelet-based data fusion combines the data of each method with PCT to increase the accuracy of the algorithm. The quantitative assessment of these approaches indicates considerable segmentation capability at reasonable computational complexity, showing promising performance and confirming the outlined properties. In particular, a polychromatic wooden statue and a fresco were analyzed using the above-mentioned methods, and interesting results were obtained.
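A rough sketch of applying standard NMF to a thermographic sequence to obtain thermal feature maps, in the spirit of SNMF1/SNMF2 (here via scikit-learn's solver rather than the exact optimizers named above); the frames are random placeholders.

```python
# Sketch: factorize a thermal image sequence into a few basis "thermal
# features" with NMF. Each frame is flattened; pixels x time is factorized.

import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
frames = rng.random((50, 64, 64))             # 50 thermal frames, 64x64 px
X = frames.reshape(50, -1).T                  # pixels x time, non-negative

model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
W = model.fit_transform(X)                    # pixel-wise feature maps
H = model.components_                         # temporal signatures

feature_maps = W.T.reshape(5, 64, 64)         # maps to threshold/cluster next
print(feature_maps.shape)
```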
Improved Determination of Subnuclear Position Enabled by Three-Dimensional Membrane Reconstruction.
Zhao, Yao; Schreiner, Sarah M; Koo, Peter K; Colombi, Paolo; King, Megan C; Mochrie, Simon G J
2016-07-12
Many aspects of chromatin biology are influenced by the nuclear compartment in which a locus resides, from transcriptional regulation to DNA repair. Further, the dynamic and variable localization of a particular locus across cell populations and over time makes analysis of a large number of cells critical. As a consequence, robust and automatable methods to measure the position of individual loci within the nuclear volume in populations of cells are necessary to support quantitative analysis of nuclear position. Here, we describe a three-dimensional membrane reconstruction approach that uses fluorescently tagged nuclear envelope or endoplasmic reticulum membrane marker proteins to precisely map the nuclear volume. This approach is robust to a variety of nuclear shapes, providing greater biological accuracy than alternative methods that enforce nuclear circularity, while also describing nuclear position in all three dimensions. By combining this method with established approaches to reconstruct the position of diffraction-limited chromatin markers (in this case, lac operator arrays bound by lacI-GFP), the distribution of loci positions within the nuclear volume with respect to the nuclear periphery can be quantitatively obtained. This stand-alone image analysis pipeline should be of broad practical utility for individuals interested in various aspects of chromatin biology, while also providing, to our knowledge, a new conceptual framework for investigators who study organelle shape. Copyright © 2016 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Petticrew, Mark; Rehfuess, Eva; Noyes, Jane; Higgins, Julian P T; Mayhew, Alain; Pantoja, Tomas; Shemilt, Ian; Sowden, Amanda
2013-11-01
Although there is increasing interest in the evaluation of complex interventions, there is little guidance on how evidence from complex interventions may be reviewed and synthesized, and the relevance of the plethora of evidence synthesis methods to complexity is unclear. This article aims to explore how different meta-analytical approaches can be used to examine aspects of complexity; describe the contribution of various narrative, tabular, and graphical approaches to synthesis; and give an overview of the potential choice of selected qualitative and mixed-method evidence synthesis approaches. The methodological discussions presented here build on a 2-day workshop held in Montebello, Canada, in January 2012, involving methodological experts from the Campbell and Cochrane Collaborations and from other international review centers (Anderson L, Petticrew M, Chandler J, et al. Systematic reviews of complex interventions. In press). These systematic review methodologists discussed the broad range of existing methods and considered the relevance of these methods to reviews of complex interventions. The evidence from primary studies of complex interventions may be qualitative or quantitative. There is a wide range of methodological options for reviewing and presenting this evidence. Specific contributions of statistical approaches include the use of meta-analysis, meta-regression, and Bayesian methods, whereas narrative summary approaches provide valuable precursors or alternatives to these. Qualitative and mixed-method approaches include thematic synthesis, framework synthesis, and realist synthesis. A suitable combination of these approaches allows synthesis of evidence for understanding complex interventions. Reviewers need to consider which aspects of complex interventions should be a focus of their review and what types of quantitative and/or qualitative studies they will be including, and this will inform their choice of review methods. These may range from standard meta-analysis through to more complex mixed-method synthesis and synthesis approaches that incorporate theory and/or users' perspectives. Copyright © 2013 Elsevier Inc. All rights reserved.
Using the Blended Learning Approach in a Quantitative Literacy Course
ERIC Educational Resources Information Center
Botts, Ryan T.; Carter, Lori; Crockett, Catherine
2018-01-01
The efforts to improve the quantitative reasoning (quantitative literacy) skills of college students in the United States have been gaining momentum in recent years. At the same time, the blended learning approach to course delivery has gained in popularity, promising better learning with flexible modalities and pace. This paper presents the…
ERIC Educational Resources Information Center
Smith, John K.; Heshusius, Lous
Educational researchers have claimed that the quantitative and qualitative approaches to educational inquiry are, indeed, compatible. However, it would be unfortunate to discontinue this debate. The quantitative-qualitative debate began with the interpretive approach to social inquiry. Dilthey argued that since cultural/moral sciences differ from…
Hart-Smith, Gene; Reis, Rodrigo S.; Waterhouse, Peter M.; Wilkins, Marc R.
2017-01-01
Quantitative proteomics strategies – which are playing important roles in the expanding field of plant molecular systems biology – are traditionally designated as either hypothesis driven or non-hypothesis driven. Many of these strategies aim to select individual peptide ions for tandem mass spectrometry (MS/MS), and to do this mixed hypothesis driven and non-hypothesis driven approaches are theoretically simple to implement. In-depth investigations into the efficacies of such approaches have, however, yet to be described. In this study, using combined samples of unlabeled and metabolically 15N-labeled Arabidopsis thaliana proteins, we investigate the mixed use of targeted data acquisition (TDA) and data dependent acquisition (DDA) – referred to as TDA/DDA – to facilitate both hypothesis driven and non-hypothesis driven quantitative data collection in individual LC-MS/MS experiments. To investigate TDA/DDA for hypothesis driven data collection, 7 miRNA target proteins of differing size and abundance were targeted using inclusion lists comprising 1558 m/z values, using 3 different TDA/DDA experimental designs. In samples in which targeted peptide ions were of particularly low abundance (i.e., predominantly only marginally above mass analyser detection limits), TDA/DDA produced statistically significant increases in the number of targeted peptides identified (230 ± 8 versus 80 ± 3 for DDA; p = 1.1 × 10^-3) and quantified (35 ± 3 versus 21 ± 2 for DDA; p = 0.038) per experiment relative to the use of DDA only. These expected improvements in hypothesis driven data collection were observed alongside unexpected improvements in non-hypothesis driven data collection. Untargeted peptide ions with m/z values matching those in inclusion lists were repeatedly identified and quantified across technical replicate TDA/DDA experiments, resulting in significant increases in the percentages of proteins repeatedly quantified in TDA/DDA experiments only relative to DDA experiments only (33.0 ± 2.6% versus 8.0 ± 2.7%, respectively; p = 0.011). These results were observed together with uncompromised broad-scale MS/MS data collection in TDA/DDA experiments relative to DDA experiments. Using our observations, we provide guidelines for TDA/DDA method design for quantitative plant proteomics studies, and suggest that TDA/DDA is a broadly underutilized proteomics data acquisition strategy. PMID:29021799
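To illustrate the inclusion-list mechanics behind TDA, the sketch below computes monoisotopic m/z values for hypothetical target peptides at charges 2 and 3; the sequences are placeholders, and a real list (like the 1558-value one above) would also cover the 15N-labelled forms.

```python
# Hedged sketch of building a TDA inclusion list: monoisotopic m/z values
# for target peptides at several charge states. Sequences are invented.

MONO = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
        "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
        "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
        "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
        "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931}
WATER, PROTON = 18.010565, 1.007276

def mz(peptide: str, charge: int) -> float:
    mass = sum(MONO[aa] for aa in peptide) + WATER
    return (mass + charge * PROTON) / charge

targets = ["SAMPLEPEPTIDEK", "TARGETPEPTIDER"]        # hypothetical
inclusion_list = sorted(mz(p, z) for p in targets for z in (2, 3))
print([f"{v:.4f}" for v in inclusion_list])
```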
Combining Qualitative and Quantitative Data: An Example.
ERIC Educational Resources Information Center
Sikka, Anjoo; And Others
Methodology from an ongoing research study to validate teaching techniques for deaf and blind students provides an example of the ways that several types of quantitative and qualitative data can be combined in analysis. Four teacher and student pairs were selected. The students were between 14 and 21 years old, had both auditory and visual…
Male-male competition, female mate choice and their interaction: determining total sexual selection.
Hunt, John; Breuker, Casper J; Sadowski, Jennifer A; Moore, Allen J
2009-01-01
Empirical studies of sexual selection typically focus on one of the two mechanisms of sexual selection without integrating them into a description of total sexual selection, or study total sexual selection without quantifying the contributions of all of the mechanisms of sexual selection. However, this can provide an incomplete or misleading view of how sexually selected traits evolve if the mechanisms of sexual selection are opposing or differ in form. Here, we take a two-fold approach to advocate a direction for future studies of sexual selection. We first show how a quantitative partitioning and examination of sexual selection mechanisms can be informative, by identifying illustrative studies that describe both male-male competition and female mate choice acting on the same trait. In our sample, the most common trait where this occurred was body size, and selection was typically linear. We found that male-male competition and female mate choice can be reinforcing or opposing, although the former is most common in the literature. The mechanisms of sexual selection can occur simultaneously or sequentially, and we found they were more likely to be opposing when the mechanisms operated sequentially. The degree to which, and the timing with which, these mechanisms interact have important implications for the operation of sexual selection and need to be considered in designing studies. Our examples highlight where empirical data are needed. We especially lack standardized measures of the form and strength of selection imposed by each mechanism of sexual selection and of how they combine to determine total sexual selection. Secondly, using quantitative genetic principles, we outline how the selection imposed by individual mechanisms can be measured and combined to estimate the total strength and form of sexual selection. We discuss the evolutionary consequences of combining the mechanisms of sexual selection and interpreting total sexual selection. We suggest how this approach may result in empirical progress in the field of sexual selection.
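As a toy illustration of measuring and combining selection from each mechanism, the sketch below estimates standardized linear selection gradients by regressing relative fitness on a trait for two selection episodes and naively sums them; real episode-by-episode partitions (e.g., Arnold-Wade style) require careful weighting, so this is a simplified, assumption-laden sketch rather than the authors' method.

```python
# Sketch: estimate a standardized linear selection gradient per mechanism
# (slope of relative fitness on the standardized trait) and combine episodes.
# Data are simulated; combining by simple summation is a naive assumption.

import numpy as np

rng = np.random.default_rng(0)
z = rng.normal(size=500)                       # standardized body size
w_competition = 1 + 0.30 * z + rng.normal(0, 0.5, 500)   # male-male episode
w_choice = 1 - 0.10 * z + rng.normal(0, 0.5, 500)        # mate-choice episode

def gradient(w, z):
    w_rel = w / w.mean()                       # relative fitness
    return np.polyfit(z, w_rel, 1)[0]          # slope = selection gradient

b1, b2 = gradient(w_competition, z), gradient(w_choice, z)
print(f"competition {b1:+.2f}, choice {b2:+.2f}, total {b1 + b2:+.2f}")
```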