Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol
2011-01-01
A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032
Weisner, Thomas S; Fiese, Barbara H
2011-12-01
Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.
Sano, Yuko; Okuyama, Chio; Iehara, Tomoko; Matsushima, Shigenori; Yamada, Kei; Hosoi, Hajime; Nishimura, Tsunehiko
2012-07-01
The purpose of this study is to evaluate a new semi-quantitative estimation method using the (123)I-MIBG retention ratio to assess response to chemotherapy for advanced neuroblastoma. Thirteen children with advanced neuroblastoma (International Neuroblastoma Risk Group Staging System: stage M) were examined in a total of 51 studies with (123)I-MIBG scintigraphy (before and during chemotherapy). We proposed a new semi-quantitative method using the MIBG retention ratio (count obtained with the delayed image/count obtained with the early image, with decay correction) to estimate MIBG accumulation. We analyzed the total (123)I-MIBG retention ratio (TMRR: total body count obtained with the delayed image/total body count obtained with the early image, with decay correction) and compared it with a scoring method in terms of correlation with tumor markers. TMRR showed significantly higher correlations with urinary catecholamine metabolites before chemotherapy (VMA: r(2) = 0.45, P < 0.05; HVA: r(2) = 0.627, P < 0.01) than the MIBG score (VMA: r(2) = 0.19, P = 0.082; HVA: r(2) = 0.25, P = 0.137). There were relatively good correlations between serial changes of TMRR and those of urinary catecholamine metabolites (VMA: r(2) = 0.274, P < 0.001; HVA: r(2) = 0.448, P < 0.0001) compared with serial changes of the MIBG score and those of tumor markers (VMA: r(2) = 0.01, P = 0.537; HVA: r(2) = 0.084, P = 0.697) during chemotherapy for advanced neuroblastoma. TMRR could be a useful semi-quantitative method for estimating early response to chemotherapy of advanced neuroblastoma because of its high correlation with urinary catecholamine metabolites.
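The retention ratio defined above reduces to a decay-corrected count ratio between the delayed and early scans. Below is a minimal sketch of that calculation; the 13.2 h physical half-life of I-123 is a standard value, and the count numbers and scan times are invented for illustration, not taken from the paper.

```python
# Decay-corrected retention ratio (TMRR-style), a sketch.
import math

I123_HALF_LIFE_H = 13.2  # physical half-life of I-123, in hours

def decay_corrected_ratio(early_counts: float, delayed_counts: float,
                          hours_between_scans: float) -> float:
    """Ratio of delayed to early counts after correcting the delayed
    image for physical decay of I-123."""
    decay_factor = math.exp(-math.log(2) * hours_between_scans / I123_HALF_LIFE_H)
    corrected_delayed = delayed_counts / decay_factor  # undo physical decay
    return corrected_delayed / early_counts

# Example: whole-body counts at 6 h (early) and 24 h (delayed) post-injection
tmrr = decay_corrected_ratio(early_counts=1.8e6, delayed_counts=6.1e5,
                             hours_between_scans=18.0)
print(f"TMRR = {tmrr:.3f}")
```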
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
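Multimarker qHNMR quantitation of the kind described above ultimately rests on the standard internal-calibrant relation: analyte content follows from proton-normalized integral ratios, scaled by molar masses and weighed-in masses. A minimal sketch with invented numbers follows; the paper's QM-qHNMR additionally performs full spin-system analysis to deconvolute overlapped resonances, which this sketch does not attempt.

```python
# Internal-calibrant qHNMR content calculation, a sketch.
def qhnmr_content_percent(i_analyte, n_analyte, m_analyte,
                          i_cal, n_cal, m_cal,
                          mass_cal_mg, mass_sample_mg, purity_cal=1.0):
    """Analyte mass fraction (%) from one resolved resonance, relative
    to an internal calibrant of known purity.
    i_* = integral, n_* = protons in the resonance, m_* = molar mass."""
    ratio = (i_analyte / n_analyte) / (i_cal / n_cal)
    return (100.0 * ratio * (m_analyte / m_cal)
            * (mass_cal_mg / mass_sample_mg) * purity_cal)

# Example: one isoflavone resonance (1H) against a 2H calibrant resonance
print(f"{qhnmr_content_percent(0.85, 1, 270.2, 1.00, 2, 194.2, 2.1, 10.3):.1f}% w/w")
```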
Optical Ptychographic Microscope for Quantitative Bio-Mechanical Imaging
NASA Astrophysics Data System (ADS)
Anthony, Nicholas; Cadenazzi, Guido; Nugent, Keith; Abbey, Brian
The role that mechanical forces play in biological processes such as cell movement and death is becoming of significant interest as we further develop our understanding of the inner workings of cells. The most common method used to obtain stress information is photoelasticity, which maps a sample's birefringence, or its direction-dependent refractive indices, using polarized light. However, this method provides only qualitative data; for stress information to be useful, quantitative data are required. Ptychography is a method for quantitatively determining the phase of a sample's complex transmission function. The technique relies upon the collection of multiple overlapping coherent diffraction patterns from laterally displaced points on the sample. The overlap of measurement points provides complementary information that significantly aids in the reconstruction of the complex wavefield exiting the sample and allows for quantitative imaging of weakly interacting specimens. Here we describe recent advances at La Trobe University, Melbourne, toward achieving quantitative birefringence mapping using polarized-light ptychography, with applications in cell mechanics. Australian Synchrotron, ARC Centre of Excellence for Advanced Molecular Imaging.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-28
... systems. E. Quantitative Methods for Comparing Capital Frameworks The NPR sought comment on how the... industry while assessing levels of capital. This commenter points out that maintaining reliable comparative data over time could make quantitative methods for this purpose difficult. For example, evaluating asset...
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
Quantitative Imaging in Cancer Clinical Trials
Yankeelov, Thomas E.; Mankoff, David A.; Schwartz, Lawrence H.; Lieberman, Frank S.; Buatti, John M.; Mountz, James M.; Erickson, Bradley J.; Fennessy, Fiona M.M.; Huang, Wei; Kalpathy-Cramer, Jayashree; Wahl, Richard L.; Linden, Hannah M.; Kinahan, Paul; Zhao, Binsheng; Hylton, Nola M.; Gillies, Robert J.; Clarke, Laurence; Nordstrom, Robert; Rubin, Daniel L.
2015-01-01
As anti-cancer therapies designed to target specific molecular pathways have been developed, it has become critical to develop methods to assess the response induced by such agents. While traditional, anatomic CT and MRI exams are useful in many settings, there is increasing evidence that these methods cannot answer the fundamental biological and physiological questions essential for assessment and, eventually, prediction of treatment response in the clinical trial setting, especially in the critical period soon after treatment is initiated. To optimally apply advances in quantitative imaging methods to trials of targeted cancer therapy, new infrastructure improvements are needed that incorporate these emerging techniques into the settings where they are most likely to have impact. In this review, we first elucidate the needs for therapeutic response assessment in the era of molecularly targeted therapy and describe how quantitative imaging can most effectively provide scientifically and clinically relevant data. We then describe the tools and methods required to apply quantitative imaging and provide concrete examples of work making these advances practically available for routine application in clinical trials. We conclude by proposing strategies to surmount barriers to wider incorporation of these quantitative imaging methods into clinical trials and, eventually, clinical practice. Our goal is to encourage and guide the oncology community to deploy standardized quantitative imaging techniques in clinical trials to further personalize care for cancer patients, and to provide a more efficient path for the development of improved targeted therapies. PMID:26773162
Systems Biology, Neuroimaging, Neuropsychology, Neuroconnectivity and Traumatic Brain Injury
Bigler, Erin D.
2016-01-01
The patient who sustains a traumatic brain injury (TBI) typically undergoes neuroimaging studies, usually in the form of computed tomography (CT) and magnetic resonance imaging (MRI). In most cases the neuroimaging findings are clinically assessed with descriptive statements that provide qualitative information about the presence/absence of visually identifiable abnormalities; little if any of the potential information in a scan is analyzed in any quantitative manner, except in research settings. Fortunately, major advances have been made, especially during the last decade, with regard to image quantification techniques, especially those that involve automated image analysis methods. This review argues that a systems biology approach to understanding quantitative neuroimaging findings in TBI provides an appropriate framework for better utilizing the information derived from quantitative neuroimaging and its relation to neuropsychological outcome. Different image analysis methods are reviewed in an attempt to integrate quantitative neuroimaging methods with neuropsychological outcome measures and to illustrate how different neuroimaging techniques tap different aspects of TBI-related neuropathology. Likewise, how different neuropathologies may relate to neuropsychological outcome is explored by examining how damage influences brain connectivity and neural networks. Emphasis is placed on the dynamic changes that occur following TBI and how best to capture those pathologies via different neuroimaging methods. However, traditional clinical neuropsychological techniques are not well suited for interpretation based on contemporary and advanced neuroimaging methods and network analyses. Significant improvements need to be made in the cognitive and behavioral assessment of the brain-injured individual to better interface with advances in neuroimaging-based network analyses. Viewing both neuroimaging and neuropsychological processes within a systems biology perspective could represent a significant advancement for the field. PMID:27555810
ERIC Educational Resources Information Center
Yeager, Joseph; Sommer, Linda
2007-01-01
Combining psycholinguistic technologies and systems analysis created advances in motivational profiling and numerous new behavioral engineering applications. These advances leapfrog many mainstream statistical research methods, producing superior research results via cause-effect language mechanisms. Entire industries explore motives ranging from…
Methods for assessing geodiversity
NASA Astrophysics Data System (ADS)
Zwoliński, Zbigniew; Najwer, Alicja; Giardino, Marco
2017-04-01
The accepted systematics of geodiversity assessment methods will be presented in three categories: qualitative, quantitative and qualitative-quantitative. Qualitative methods are usually descriptive methods that are suited to nominal and ordinal data. Quantitative methods use a different set of parameters and indicators to determine the characteristics of geodiversity in the area being researched. Qualitative-quantitative methods are a good combination of the collection of quantitative data (i.e. digital) and cause-effect data (i.e. relational and explanatory). It seems that at the current stage of the development of geodiversity research methods, qualitative-quantitative methods are the most advanced and best assess the geodiversity of the study area. Their particular advantage is the integration of data from different sources and with different substantive content. Among the distinguishing features of the quantitative and qualitative-quantitative methods for assessing geodiversity are their wide use within geographic information systems, both at the stage of data collection and data integration, as well as numerical processing and their presentation. The unresolved problem for these methods, however, is the possibility of their validation. It seems that currently the best method of validation is direct field confrontation. Looking to the next few years, the development of qualitative-quantitative methods connected with cognitive issues should be expected, oriented towards ontology and the Semantic Web.
[Methods of quantitative proteomics].
Kopylov, A T; Zgoda, V G
2007-01-01
In modern science, proteomic analysis is inseparable from other fields of systems biology. Commanding huge resources, quantitative proteomics handles colossal amounts of information on the molecular mechanisms of life. Advances in proteomics help researchers to solve complex problems of cell signaling, posttranslational modification, structural and functional homology of proteins, molecular diagnostics, etc. More than 40 methods have been developed in proteomics for the quantitative analysis of proteins. Although each method is unique and has certain advantages and disadvantages, all of them use isotope labels (tags). In this review we consider the most popular and effective methods, covering both chemical modification of proteins and metabolic and enzymatic approaches to isotope labeling.
New High Throughput Methods to Estimate Chemical Exposure
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing an...
Percy, Andrew J; Yang, Juncong; Hardie, Darryl B; Chambers, Andrew G; Tamura-Wells, Jessica; Borchers, Christoph H
2015-06-15
Spurred on by the growing demand for panels of validated disease biomarkers, increasing efforts have focused on advancing qualitative and quantitative tools for more highly multiplexed and sensitive analyses of a multitude of analytes in various human biofluids. In quantitative proteomics, evolving strategies involve the use of the targeted multiple reaction monitoring (MRM) mode of mass spectrometry (MS) with stable isotope-labeled standards (SIS) used for internal normalization. Using that preferred approach with non-invasive urine samples, we have systematically advanced and rigorously assessed the methodology toward the precise quantitation of the largest multiplexed panel of candidate protein biomarkers in human urine to date. The concentrations of the 136 proteins span >5 orders of magnitude (from 8.6 μg/mL to 25 pg/mL), with average CVs of 8.6% over process triplicates. Detailed here are our quantitative method, the analysis strategy, a feasibility application to prostate cancer samples, and a discussion of the utility of this method in translational studies.
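SIS-based internal normalization, as used above, converts a light/heavy peak-area ratio into an endogenous concentration, and replicate CVs then summarize precision. A minimal sketch follows, assuming equal MS response factors for light and heavy forms; the peak areas and spike level are invented, and the paper's actual workflow (calibration curves, 136-plex scheduling) is far more involved.

```python
# Internal-standard (SIS) quantitation and CV, a sketch.
import statistics

def concentration_from_sis(light_area: float, heavy_area: float,
                           sis_conc_ng_per_ml: float) -> float:
    """Endogenous (light) concentration from the light/heavy peak-area
    ratio, assuming equal response factors for the two forms."""
    return (light_area / heavy_area) * sis_conc_ng_per_ml

# Process triplicate of one peptide, with 50 ng/mL SIS spiked in
replicates = [concentration_from_sis(la, ha, 50.0)
              for la, ha in [(1.21e5, 2.50e5), (1.18e5, 2.43e5), (1.30e5, 2.61e5)]]
mean = statistics.mean(replicates)
cv_percent = 100 * statistics.stdev(replicates) / mean  # coefficient of variation
print(f"mean = {mean:.1f} ng/mL, CV = {cv_percent:.1f}%")
```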
VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.
Little, Todd D; Wang, Eugene W; Gorrall, Britt K
2017-06-01
This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology.
Advances in Imaging Approaches to Fracture Risk Evaluation
Manhard, Mary Kate; Nyman, Jeffry S.; Does, Mark D.
2016-01-01
Fragility fractures are a growing problem worldwide, and current methods for diagnosing osteoporosis do not always identify individuals who require treatment to prevent a fracture and may misidentify those not at risk. Traditionally, fracture risk is assessed using dual-energy X-ray absorptiometry, which provides measurements of areal bone mineral density (BMD) at sites prone to fracture. Recent advances in imaging show promise in adding new information that could improve the prediction of fracture risk in the clinic. As reviewed herein, advances in quantitative computed tomography (QCT) predict hip and vertebral body strength; high-resolution peripheral QCT (HR-pQCT) and micro-magnetic resonance imaging (μMRI) assess the micro-architecture of trabecular bone; quantitative ultrasound (QUS) measures the modulus or tissue stiffness of cortical bone; and quantitative ultra-short echo time MRI methods quantify the concentrations of bound water and pore water in cortical bone, which reflect a variety of mechanical properties of bone. Each of these technologies provides unique characteristics of bone and may improve fracture risk diagnoses and reduce the prevalence of fractures by helping to guide treatment decisions. PMID:27816505
Evaluation of background parenchymal enhancement on breast MRI: a systematic review
Signori, Alessio; Valdora, Francesca; Rossi, Federica; Calabrese, Massimo; Durando, Manuela; Mariscotto, Giovanna; Tagliafico, Alberto
2017-01-01
Objective: To perform a systematic review of the methods used for background parenchymal enhancement (BPE) evaluation on breast MRI. Methods: Studies dealing with BPE assessment on breast MRI were retrieved from major medical libraries independently by four reviewers up to 6 October 2015. The keywords used for database searching were “background parenchymal enhancement”, “parenchymal enhancement”, “MRI” and “breast”. Studies were included if qualitative and/or quantitative methods for BPE assessment were described. Results: Of the 420 studies identified, a total of 52 articles were included in the systematic review. 28 studies performed only a qualitative assessment of BPE, 13 studies performed only a quantitative assessment and 11 studies performed both qualitative and quantitative assessments. A wide heterogeneity was found in the MRI sequences and in the quantitative methods used for BPE assessment. Conclusion: A wide variability exists in the quantitative evaluation of BPE on breast MRI. More studies focused on a reliable and comparable method for quantitative BPE assessment are needed. Advances in knowledge: More studies focused on a quantitative BPE assessment are needed. PMID:27925480
Clementi, Massimo; Bagnarelli, Patrizia
2015-10-01
In the last two decades, the development of quantitative molecular methods has characterized the evolution of clinical virology more than any other methodological advancement. Using these methods, a great number of studies have efficiently addressed in vivo the role of viral load, viral replication activity, and viral transcriptional profiles as correlates of disease outcome and progression, and have highlighted the physio-pathology of important virus diseases of humans. Furthermore, these studies have contributed to a better understanding of virus-host interactions and have sharply revolutionized the research strategies in basic and medical virology. In addition, and importantly from a medical point of view, quantitative methods have provided a rationale for therapeutic intervention and therapy monitoring in medically important viral diseases. Despite the advances in technology and the development of three generations of molecular methods within the last two decades (competitive PCR, real-time PCR, and digital PCR), great challenges still remain for viral testing, related not only to standardization, accuracy, and precision, but also to the selection of the best molecular targets for clinical use and to the identification of thresholds for risk stratification and therapeutic decisions. Future research directions, novel methods and technical improvements could be important in addressing these challenges.
Recent advances in the sequencing of relevant water intrusion fungi by the EPA, combined with the development of probes and primers, have allowed for the unequivocal quantitative and qualitative identification of fungi in selected matrices.
In this pilot study, quantitative...
Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology
Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa L.
2016-01-01
Quantitative methods (QM) dominate empirical research in psychology. Unfortunately, most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many of the recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step towards eliminating these concerns. The current paper includes two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians. This includes a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199
Cross-Proportions: A Conceptual Method for Developing Quantitative Problem-Solving Skills
ERIC Educational Resources Information Center
Cook, Elzbieta; Cook, Stephen L.
2005-01-01
The cross-proportion method allows both the instructor and the student to easily determine where an error is made during problem solving. The C-P method supports a strong cognitive foundation upon which students can develop other diagnostic methods as they advance in chemistry and scientific careers.
Targeted quantitation of proteins by mass spectrometry.
Liebler, Daniel C; Zimmerman, Lisa J
2013-06-04
Quantitative measurement of proteins is one of the most fundamental analytical tasks in a biochemistry laboratory, but widely used immunochemical methods often have limited specificity and high measurement variation. In this review, we discuss applications of multiple-reaction monitoring (MRM) mass spectrometry, which allows sensitive, precise quantitative analyses of peptides and the proteins from which they are derived. Systematic development of MRM assays is permitted by databases of peptide mass spectra and sequences, software tools for analysis design and data analysis, and rapid evolution of tandem mass spectrometer technology. Key advantages of MRM assays are the ability to target specific peptide sequences, including variants and modified forms, and the capacity for multiplexing that allows analysis of dozens to hundreds of peptides. Different quantitative standardization methods provide options that balance precision, sensitivity, and assay cost. Targeted protein quantitation by MRM and related mass spectrometry methods can advance biochemistry by transforming approaches to protein measurement.
Mapping the Mixed Methods–Mixed Research Synthesis Terrain
Sandelowski, Margarete; Voils, Corrine I.; Leeman, Jennifer; Crandell, Jamie L.
2012-01-01
Mixed methods–mixed research synthesis is a form of systematic review in which the findings of qualitative and quantitative studies are integrated via qualitative and/or quantitative methods. Although methodological advances have been made, efforts to differentiate research synthesis methods have been too focused on methods and not focused enough on the defining logics of research synthesis—each of which may be operationalized in different ways—or on the research findings themselves that are targeted for synthesis. The conduct of mixed methods–mixed research synthesis studies may more usefully be understood in terms of the logics of aggregation and configuration. Neither logic is preferable to the other nor tied exclusively to any one method or to any one side of the qualitative/quantitative binary. PMID:23066379
Brown, J Quincy; Vishwanath, Karthik; Palmer, Gregory M; Ramanujam, Nirmala
2009-02-01
Methods of optical spectroscopy that provide quantitative, physically or physiologically meaningful measures of tissue properties are an attractive tool for the study, diagnosis, prognosis, and treatment of various cancers. Recent development of methodologies to convert measured reflectance and fluorescence spectra from tissue to cancer-relevant parameters such as vascular volume, oxygenation, extracellular matrix extent, metabolic redox states, and cellular proliferation have significantly advanced the field of tissue optical spectroscopy. The number of publications reporting quantitative tissue spectroscopy results in the UV-visible wavelength range has increased sharply in the past three years, and includes new and emerging studies that correlate optically measured parameters with independent measures such as immunohistochemistry, which should aid in increased clinical acceptance of these technologies.
Mosley, Garrett L; Nguyen, Phuong; Wu, Benjamin M; Kamei, Daniel T
2016-08-07
The lateral-flow immunoassay (LFA) is a well-established diagnostic technology that has recently seen significant advancements due in part to the rapidly expanding fields of paper diagnostics and paper-fluidics. As LFA-based diagnostics become more complex, it becomes increasingly important to quantitatively determine important parameters during the design and evaluation process. However, current experimental methods for determining these parameters have certain limitations when applied to LFA systems. In this work, we describe our novel methods of combining paper and radioactive measurements to determine nanoprobe molarity, the number of antibodies per nanoprobe, and the forward and reverse rate constants for nanoprobe binding to immobilized target on the LFA test line. Using a model LFA system that detects for the presence of the protein transferrin (Tf), we demonstrate the application of our methods, which involve quantitative experimentation and mathematical modeling. We also compare the results of our rate constant experiments with traditional experiments to demonstrate how our methods more appropriately capture the influence of the LFA environment on the binding interaction. Our novel experimental approaches can therefore more efficiently guide the research process for LFA design, leading to more rapid advancement of the field of paper-based diagnostics.
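The rate-constant estimation described above amounts to fitting a binding model to time-resolved test-line data. A minimal sketch under a simple 1:1 Langmuir kinetic assumption follows; the time points, signal values, probe concentration, and initial guesses are all invented, and the authors' actual model of the LFA environment is more detailed.

```python
# Fit forward/reverse rate constants to test-line binding data, a sketch.
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

C_PROBE = 1.0e-9  # nanoprobe concentration, molar (assumed constant)
B_MAX = 1.0       # immobilized target sites, normalized

def bound_fraction(t, k_on, k_off):
    """Integrate dB/dt = k_on*C*(B_max - B) - k_off*B from B(0) = 0."""
    def rhs(B, _t):
        return k_on * C_PROBE * (B_MAX - B) - k_off * B
    return odeint(rhs, 0.0, t).ravel()

# Synthetic time course (seconds) and normalized bound signal
t_data = np.array([0, 30, 60, 120, 240, 480, 960], dtype=float)
b_data = np.array([0.0, 0.08, 0.15, 0.27, 0.43, 0.58, 0.66])

(k_on, k_off), _ = curve_fit(bound_fraction, t_data, b_data,
                             p0=[1e6, 1e-3], bounds=(0, np.inf))
print(f"k_on ~ {k_on:.2e} 1/(M*s), k_off ~ {k_off:.2e} 1/s")
```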
Nomi, Yuri; Annaka, Hironori; Sato, Shinji; Ueta, Etsuko; Ohkura, Tsuyoshi; Yamamoto, Kazuhiro; Homma, Seiichi; Suzuki, Emiko; Otsuka, Yuzuru
2016-11-09
The aim of this study was to develop a simple and sensitive method to analyze several advanced glycation end products (AGEs) simultaneously using liquid chromatography-tandem mass spectrometry (LC-MS/MS), and to apply this method to the quantitation of AGEs in brown-colored foods. The developed method enabled seven AGEs to be separated and quantitated simultaneously, and was applied to the determination of free AGEs contained in various kinds of soy sauce and beer. The major AGEs in soy sauce and beer were Nε-carboxymethyllysine (CML), Nε-carboxyethyllysine (CEL), and Nδ-(5-hydro-5-methyl-4-imidazolon-2-yl)ornithine (MG-H1). Using the developed LC-MS/MS method, recovery tests on soy sauce and beer samples showed recovery values of 85.3-103.9% for CML, 95.9-107.4% for CEL, and 69.5-123.2% for MG-H1. Notably, this is the first report of free CML, CEL, and MG-H1 in beer. Furthermore, long-term storage and the heating process of soy sauce increased CML and MG-H1.
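The recovery values quoted above follow the usual spike-recovery arithmetic: the fraction of a known added amount that the assay measures back. A minimal sketch with invented numbers (these are not the paper's measurements):

```python
# Spike-recovery calculation, a sketch.
def recovery_percent(measured_spiked: float, measured_unspiked: float,
                     amount_added: float) -> float:
    """Percent of a known spiked amount that the assay recovers."""
    return 100.0 * (measured_spiked - measured_unspiked) / amount_added

# Example: CML in a soy sauce sample spiked with 10.0 ng/mL
print(f"CML recovery = {recovery_percent(18.9, 9.1, 10.0):.1f}%")  # 98.0%
```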
The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods
NASA Astrophysics Data System (ADS)
Crootof, A.; Albrecht, T.; Scott, C. A.
2017-12-01
The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper finds that, to make the WEF nexus effective as a policy-relevant analytical tool, methods are needed that incorporate social and political dimensions of water, energy, and food; utilize multiple and interdisciplinary approaches; and engage stakeholders and policy-makers.
Advanced body composition assessment: from body mass index to body composition profiling.
Borga, Magnus; West, Janne; Bell, Jimmy D; Harvey, Nicholas C; Romu, Thobias; Heymsfield, Steven B; Dahlqvist Leinhard, Olof
2018-06-01
This paper gives a brief overview of common non-invasive techniques for body composition analysis and a more in-depth review of a body composition assessment method based on fat-referenced quantitative MRI. Earlier published studies of this method are summarized, and a previously unpublished validation study, based on 4753 subjects from the UK Biobank imaging cohort, comparing the quantitative MRI method with dual-energy X-ray absorptiometry (DXA) is presented. For whole-body measurements of adipose tissue (AT) or fat and lean tissue (LT), DXA and quantitative MRI show excellent agreement, with linear correlations of 0.99 and 0.97 and coefficients of variation (CV) of 4.5 and 4.6 per cent for fat (computed from AT) and LT, respectively, but agreement was significantly lower for visceral adipose tissue, with a CV of >20 per cent. The additional ability of MRI to also measure muscle volumes, muscle AT infiltration and ectopic fat, in combination with rapid scanning protocols and efficient image analysis tools, makes quantitative MRI a powerful tool for advanced body composition assessment.
Examining the Inclusion of Quantitative Research in a Meta-Ethnographic Review
ERIC Educational Resources Information Center
Booker, Rhae-Ann Richardson
2010-01-01
This study explored how one might extend meta-ethnography to quantitative research for the advancement of interpretive review methods. Using the same population of 139 studies on racial-ethnic matching as data, my investigation entailed an extended meta-ethnography (EME) and comparison of its results to a published meta-analysis (PMA). Adhering to…
Apostolou, N; Papazoglou, Th; Koutsouris, D
2006-01-01
Image fusion is a process of combining information from multiple sensors. It is a useful tool implemented in the treatment planning programme of Gamma Knife Radiosurgery. In this paper we evaluate advanced image fusion algorithms on the Matlab platform using head images. We developed nine grayscale image fusion methods in Matlab: average, principal component analysis (PCA), discrete wavelet transform (DWT), Laplacian, filter-subtract-decimate (FSD), contrast, gradient, morphological pyramid, and shift-invariant discrete wavelet transform (SIDWT). We test these methods qualitatively and quantitatively. The quantitative criteria we use are the Root Mean Square Error (RMSE), Mutual Information (MI), Standard Deviation (STD), Entropy (H), Difference Entropy (DH) and Cross Entropy (CEN). The qualitative criteria are: natural appearance, brilliance contrast, presence of complementary features and enhancement of common features. Finally, we make clinically useful suggestions.
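Two of the quantitative criteria named above are straightforward to compute. Below is a minimal sketch applying RMSE and Shannon entropy to a trivial average-fusion of two images; the array shapes, random test data, and 8-bit histogram binning are illustrative assumptions, and the paper's own implementation is in Matlab rather than Python.

```python
# Average fusion plus two fusion-quality metrics, a sketch.
import numpy as np

def average_fusion(img_a: np.ndarray, img_b: np.ndarray) -> np.ndarray:
    """Simplest pixel-level fusion: the mean of the two inputs."""
    return (img_a.astype(float) + img_b.astype(float)) / 2.0

def rmse(reference: np.ndarray, fused: np.ndarray) -> float:
    return float(np.sqrt(np.mean((reference.astype(float) - fused) ** 2)))

def entropy(img: np.ndarray) -> float:
    """Shannon entropy of the 8-bit gray-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
a = rng.integers(0, 256, (128, 128))   # stand-ins for two head images
b = rng.integers(0, 256, (128, 128))
f = average_fusion(a, b)
print(f"RMSE vs a: {rmse(a, f):.2f}, entropy of fused: {entropy(f):.2f} bits")
```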
Quantitative Evaluation Method of Each Generation Margin for Power System Planning
NASA Astrophysics Data System (ADS)
Su, Su; Tanaka, Kazuyuki
As power system deregulation advances, competition among power companies is intensifying, and they seek more efficient system planning using existing facilities; an efficient system planning method is therefore needed. This paper proposes a quantitative evaluation method for the (N-1) generation margin that considers overload and voltage stability restrictions. For the generation margin related to overload, a fast solution method that avoids recalculating the (N-1) Y-matrix is proposed. For voltage stability, the paper proposes an efficient method to search for the stability limit. The IEEE30 model system, composed of 6 generators and 14 load nodes, is employed to validate the proposed method. According to the results, the proposed method can reduce the computational cost of the generation margin related to overload under the (N-1) condition and can specify the margin quantitatively.
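To illustrate the flavor of an overload-constrained (N-1) check, here is a minimal sketch using a DC power-transfer-distribution-factor (PTDF) approximation: trip each generator in turn, redispatch the survivors proportionally, and report the smallest remaining line headroom. The network data are invented, and this generic approximation is not the paper's method, which avoids Y-matrix recomputation and also treats voltage stability.

```python
# (N-1) generator-outage overload margin via DC PTDF, a toy sketch.
import numpy as np

ptdf = np.array([[0.6, 0.2],    # MW flow on each line per MW injected
                 [0.4, 0.8]])   # at each of two generator buses
line_limits = np.array([120.0, 150.0])   # MW thermal limits
gen_output = np.array([80.0, 60.0])      # current dispatch, MW

def overload_margin_after_outage(outaged: int) -> float:
    """Smallest remaining line headroom when one generator trips and
    the others pick up its output proportionally."""
    surviving = [g for g in range(len(gen_output)) if g != outaged]
    redispatch = gen_output.copy()
    redispatch[outaged] = 0.0
    redispatch[surviving] += gen_output[outaged] * (
        gen_output[surviving] / gen_output[surviving].sum())
    flows = ptdf @ redispatch
    return float(np.min(line_limits - np.abs(flows)))

for g in range(2):
    print(f"outage of G{g}: margin = {overload_margin_after_outage(g):.1f} MW")
```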
Wright, Kevin A; Bouffard, Leana A
2016-02-01
The qualitative analysis of individual cases has a prominent place in the development of criminological theory, yet progression in the scientific study of crime has largely been viewed as a distinctly quantitative endeavor. In the process, much of the theoretical depth and precision supplied by earlier methods of criminological knowledge production has been sacrificed. The current work argues for a return to our criminological roots by supplementing quantitative analyses with the qualitative inspection of individual cases. We provide a specific example of a literature (i.e., criminal specialization/versatility) that has become increasingly quantitative and could benefit from the use of the proposed approach. We conclude by offering additional areas of research that might be advanced by the framework presented here.
ERIC Educational Resources Information Center
Bramwell-Lalor, Sharon; Rainford, Marcia
2015-01-01
This paper reports on a Mixed Methods study involving an investigation into the attitudes of advanced level biology teachers towards assessment and describes the teachers' experiences while being engaged in Assessment for Learning (AfL) practices such as sharing of learning objectives and peer- and self-assessment. Quantitative data were collected…
Analysis of live cell images: Methods, tools and opportunities.
Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens
2017-02-15
Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. The review also includes an overview of the different available software packages and toolkits.
Illustrating a Mixed-Method Approach for Validating Culturally Specific Constructs
ERIC Educational Resources Information Center
Hitchcock, J.H.; Nastasi, B.K.; Dai, D.Y.; Newman, J.; Jayasena, A.; Bernstein-Moore, R.; Sarkar, S.; Varjas, K.
2005-01-01
The purpose of this article is to illustrate a mixed-method approach (i.e., combining qualitative and quantitative methods) for advancing the study of construct validation in cross-cultural research. The article offers a detailed illustration of the approach using the responses 612 Sri Lankan adolescents provided to an ethnographic survey. Such…
A General Method for Targeted Quantitative Cross-Linking Mass Spectrometry.
Chavez, Juan D; Eng, Jimmy K; Schweppe, Devin K; Cilia, Michelle; Rivera, Keith; Zhong, Xuefei; Wu, Xia; Allen, Terrence; Khurgel, Moshe; Kumar, Akhilesh; Lampropoulos, Athanasios; Larsson, Mårten; Maity, Shuvadeep; Morozov, Yaroslav; Pathmasiri, Wimal; Perez-Neut, Mathew; Pineyro-Ruiz, Coriness; Polina, Elizabeth; Post, Stephanie; Rider, Mark; Tokmina-Roszyk, Dorota; Tyson, Katherine; Vieira Parrine Sant'Ana, Debora; Bruce, James E
2016-01-01
Chemical cross-linking mass spectrometry (XL-MS) provides protein structural information by identifying covalently linked proximal amino acid residues on protein surfaces. The information gained by this technique is complementary to other structural biology methods such as X-ray crystallography, NMR and cryo-electron microscopy [1]. The extension of traditional quantitative proteomics methods with chemical cross-linking can provide information on the structural dynamics of protein structures and protein complexes. The identification and quantitation of cross-linked peptides remains challenging for the general community, requiring specialized expertise that ultimately limits more widespread adoption of the technique. We describe a general method for targeted quantitative mass spectrometric analysis of cross-linked peptide pairs. We report the adaptation of the widely used, open-source software package Skyline for the analysis of quantitative XL-MS data as a means for data analysis and sharing of methods. We demonstrate the utility and robustness of the method with a cross-laboratory study and present data that are supported by, and validate, previously published data on quantified cross-linked peptide pairs. This advance provides an easy-to-use resource so that any lab with access to an LC-MS system capable of performing targeted quantitative analysis can quickly and accurately measure dynamic changes in protein structure and protein interactions.
Vessel wall characterization using quantitative MRI: what's in a number?
Coolen, Bram F; Calcagno, Claudia; van Ooij, Pim; Fayad, Zahi A; Strijkers, Gustav J; Nederveen, Aart J
2018-02-01
The past decade has witnessed the rapid development of new MRI technology for vessel wall imaging. Today, with advances in MRI hardware and pulse sequences, quantitative MRI of the vessel wall represents a real alternative to conventional qualitative imaging, which is hindered by significant intra- and inter-observer variability. Quantitative MRI can measure several important morphological and functional characteristics of the vessel wall. This review provides a detailed introduction to novel quantitative MRI methods for measuring vessel wall dimensions, plaque composition and permeability, endothelial shear stress and wall stiffness. Together, these methods show the versatility of non-invasive quantitative MRI for probing vascular disease at several stages. These quantitative MRI biomarkers can play an important role in the context of both treatment response monitoring and risk prediction. Given the rapid developments in scan acceleration techniques and novel image reconstruction, we foresee the possibility of integrating the acquisition of multiple quantitative vessel wall parameters within a single scan session.
Intelligent model-based diagnostics for vehicle health management
NASA Astrophysics Data System (ADS)
Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki
2003-08-01
The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S
2016-11-01
There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating developing COAs using a small sample. We review the benefits such methods are purported to offer from both practical and statistical standpoints, and we detail several problematic areas, spanning both practical concerns and statistical theory, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2012-02-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases, such as uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.
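One common way to turn an angle scan into a quantitative signal is to locate the reflectivity minimum with sub-sample precision and track its shift over time. The sketch below uses a three-point parabolic fit around the dip; the synthetic scan and the fitting choice are illustrative assumptions, not the authors' refined analysis method.

```python
# Resonance-angle extraction from an angle-scanning SPR curve, a sketch.
import numpy as np

def resonance_angle(angles: np.ndarray, reflectivity: np.ndarray) -> float:
    """Sub-sample position of the reflectivity minimum via a parabola
    through the minimum sample and its two neighbors."""
    i = int(np.argmin(reflectivity))
    i = max(1, min(i, len(angles) - 2))       # keep neighbors in range
    a, b, c = reflectivity[i - 1], reflectivity[i], reflectivity[i + 1]
    offset = 0.5 * (a - c) / (a - 2 * b + c)  # vertex of the parabola
    return float(angles[i] + offset * (angles[1] - angles[0]))

angles = np.linspace(40.0, 50.0, 201)                      # degrees
dip = 1.0 - 0.8 * np.exp(-((angles - 44.37) / 0.5) ** 2)   # synthetic scan
print(f"resonance angle ~ {resonance_angle(angles, dip):.3f} deg")
```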
Da Silva, Laeticia; Collino, Sebastiano; Cominetti, Ornella; Martin, Francois-Pierre; Montoliu, Ivan; Moreno, Sergio Oller; Corthesy, John; Kaput, Jim; Kussmann, Martin; Monteiro, Jacqueline Pontes; Guiraud, Seu Ping
2016-09-01
There is increasing interest in the profiling and quantitation of methionine pathway metabolites for health management research. Currently, several analytical approaches are required to cover metabolites and co-factors. We report the development and the validation of a method for the simultaneous detection and quantitation of 13 metabolites in red blood cells. The method, validated in a cohort of healthy human volunteers, shows a high level of accuracy and reproducibility. This high-throughput protocol provides a robust coverage of central metabolites and co-factors in one single analysis and in a high-throughput fashion. In large-scale clinical settings, the use of such an approach will significantly advance the field of nutritional research in health and disease.
Takamura, Ayari; Watanabe, Ken; Akutsu, Tomoko
2017-07-01
Identification of human semen is indispensable for the investigation of sexual assaults. Fluorescence staining methods using commercial kits, such as the series of SPERM HY-LITER™ kits, have been useful for detecting human sperm via strong fluorescence. These kits have been examined from various forensic aspects. However, because of a lack of evaluation methods, previous studies provided neither objective, quantitative descriptions of the results nor clear criteria for the decisions reached. In addition, the variety of validations was considerably limited. In this study, we conducted more advanced validations of SPERM HY-LITER™ Express using our established image analysis method. Use of this method enabled objective and specific identification of fluorescent sperm spots and quantitative comparisons of sperm detection performance under complex experimental conditions. For body fluid mixtures, we examined interference with the fluorescence staining from other body fluid components. Effects of sample decomposition were simulated under high-humidity and high-temperature conditions. Semen with quite low sperm concentrations, such as azoospermia and oligospermia samples, represented the most challenging cases for application of the kit. Finally, the tolerance of the kit against various acidic and basic environments was analyzed. The validations herein provide useful information, previously unobtainable, for practical applications of the SPERM HY-LITER™ Express kit. Moreover, the versatility of our image analysis method toward various complex cases was demonstrated.
Advanced quantitative measurement methodology in physics education research
NASA Astrophysics Data System (ADS)
Wang, Jing
The ultimate goal of physics education research (PER) is to develop a theoretical framework to understand and improve the learning process. In this journey of discovery, assessment serves as our headlamp and alpenstock. It sometimes detects signals in student mental structures, and sometimes presents the difference between expert understanding and novice understanding. Quantitative assessment is an important area in PER. Developing research-based effective assessment instruments and making meaningful inferences based on these instruments have always been important goals of the PER community. Quantitative studies are often conducted to provide bases for test development and result interpretation. Statistics are frequently used in quantitative studies. The selection of statistical methods, and the interpretation of the results obtained by these methods, should be connected to the educational context. In this connecting process, the issue of educational models is often raised. Many widely used statistical methods do not make assumptions about the mental structure of subjects, nor do they provide explanations tailored to the educational audience. There are other methods that consider the mental structure and are tailored to provide strong connections between statistics and education. These methods often involve model assumptions and parameter estimation, and are mathematically complicated. The dissertation provides a practical view of some advanced quantitative assessment methods. The common feature of these methods is that they all make educational/psychological model assumptions beyond the minimum mathematical model. The purpose of the study is to provide a comparison between these advanced methods and purely mathematical methods, based on the performance of the two types of methods in physics education settings. In particular, the comparison uses both physics content assessments and scientific ability assessments. The dissertation includes three parts. The first part involves the comparison between item response theory (IRT) and classical test theory (CTT). The two theories both provide test item statistics for educational inferences and decisions. Both are applied to Force Concept Inventory data obtained from students enrolled at The Ohio State University. Effort was made to examine the similarities and differences between the two theories, and possible explanations for the differences. The study suggests that item response theory is more sensitive to the context and conceptual features of the test items than classical test theory. The IRT parameters provide a better measure than CTT parameters for the educational audience to investigate item features. The second part of the dissertation is on measures of association for binary data. In quantitative assessment, binary data are often encountered because of their simplicity. The currently popular measures of association fail under some extremely unbalanced conditions, yet the occurrence of these conditions is not rare in educational data. Two popular association measures, the Pearson correlation and the tetrachoric correlation, are examined. A new method, model-based association, is introduced, and an educational testing constraint is discussed. The existing popular methods are compared with the model-based association measure with and without the constraint. Connections between the value of association and the context and conceptual features of questions are discussed in detail.
Results show that all the methods have their advantages and disadvantages, and special attention to the test and data conditions is necessary. The last part of the dissertation is focused on exploratory factor analysis (EFA). The theoretical advantages of EFA are discussed, and typical misunderstandings and misuses of EFA are explored. EFA is performed on Lawson's Classroom Test of Scientific Reasoning (LCTSR), a widely used assessment of scientific reasoning skills. The reasoning ability structures for U.S. and Chinese students at different educational levels are given by the analysis. A final discussion of advanced quantitative assessment methodology versus purely mathematical methodology is presented at the end.
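The IRT-versus-CTT contrast at the heart of the first part can be made concrete with a few lines of code: CTT summarizes an item by its proportion correct (difficulty) and item-total correlation (discrimination), while IRT models the probability of a correct response as a curve over latent ability. The sketch below uses an invented 0/1 response matrix and a two-parameter logistic (2PL) curve with assumed parameters; it is an illustration of the two statistics families, not the dissertation's analysis.

```python
# CTT item statistics vs. a 2PL IRT item characteristic curve, a sketch.
import numpy as np

rng = np.random.default_rng(1)
responses = rng.integers(0, 2, (500, 30))    # 500 students x 30 items, 0/1

total = responses.sum(axis=1)
difficulty = responses.mean(axis=0)           # CTT "P value" per item

def point_biserial(item: np.ndarray, total_score: np.ndarray) -> float:
    """CTT discrimination: correlation of an item with the rest score."""
    rest = total_score - item                 # avoid item-total overlap
    return float(np.corrcoef(item, rest)[0, 1])

def icc_2pl(theta: np.ndarray, a: float, b: float) -> np.ndarray:
    """2PL IRT item characteristic curve: P(correct | ability theta)."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

print(f"item 0: difficulty={difficulty[0]:.2f}, "
      f"discrimination={point_biserial(responses[:, 0], total):.2f}")
theta = np.linspace(-3, 3, 7)
print("2PL ICC (a=1.2, b=0.0):", np.round(icc_2pl(theta, 1.2, 0.0), 2))
```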
Planner-Based Control of Advanced Life Support Systems
NASA Technical Reports Server (NTRS)
Muscettola, Nicola; Kortenkamp, David; Fry, Chuck; Bell, Scott
2005-01-01
The paper describes an approach to the integration of qualitative and quantitative modeling techniques for advanced life support (ALS) systems. Developing reliable control strategies that scale up to fully integrated life support systems requires augmenting quantitative models and control algorithms with the abstractions provided by qualitative, symbolic models and their associated high-level control strategies. This will allow for effective management of the combinatorics due to the integration of a large number of ALS subsystems. By focusing control actions at different levels of detail and reactivity, we can use faster, simpler responses at the lowest level and predictive but more complex responses at the higher levels of abstraction. In particular, methods from model-based planning and scheduling can provide effective resource management over long time periods. We describe a reference implementation of an advanced control system using the IDEA control architecture developed at NASA Ames Research Center. IDEA uses planning/scheduling as the sole reasoning method for predictive and reactive closed-loop control. We describe preliminary experiments in planner-based control of ALS carried out on an integrated ALS simulation developed at NASA Johnson Space Center.
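A toy sketch of the layered idea described above (all names and dynamics are invented for illustration; this is not the IDEA architecture): a slow planning tier revises a setpoint over long horizons, while a fast reactive loop tracks it at every tick.

```python
# Toy two-tier controller. The slow tier ("planner") runs infrequently and
# picks setpoints; the fast tier tracks the current setpoint reactively.
# Subsystem, units, and dynamics are invented for illustration only.

def planner(o2_reserve):
    """Slow tier: choose an O2 generation setpoint for the next period."""
    return 0.9 if o2_reserve < 50.0 else 0.5

def reactive_loop(setpoint, rate, gain=0.2):
    """Fast tier: simple proportional tracking of the planner's setpoint."""
    return rate + gain * (setpoint - rate)

rate, reserve = 0.5, 60.0
for t in range(48):
    if t % 12 == 0:                        # planner runs once per 12 ticks
        setpoint = planner(reserve)
    rate = reactive_loop(setpoint, rate)   # reactive loop runs every tick
    reserve += rate - 0.6                  # crew consumes 0.6 units per tick
print(f"final reserve: {reserve:.1f}, final generation rate: {rate:.2f}")
```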
Hattotuwagama, Channa K; Guan, Pingping; Doytchinova, Irini A; Flower, Darren R
2004-11-21
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatic disciplines. Predictive computational models of peptide-major histocompatibility complex (MHC) binding affinity, based on QSAR technology, have become a vital component of modern computational immunovaccinology. Historically, such approaches were built around semi-qualitative classification methods, but these are now giving way to quantitative regression methods. The additive method, an established immunoinformatics technique for the quantitative prediction of peptide-protein affinity, was used here to identify the sequence dependence of peptide binding specificity for three mouse class I MHC alleles: H2-D^b, H2-K^b and H2-K^k. As we show, in terms of reliability the resulting models represent a significant advance on existing methods. They can be used for the accurate prediction of T-cell epitopes and are freely available online ( http://www.jenner.ac.uk/MHCPred).
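The additive method models affinity as a constant plus position-specific amino-acid contributions. A minimal sketch of that idea (synthetic data, not the authors' implementation), fitting the contributions by least squares over one-hot-encoded 9-mer peptides:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
PEPTIDE_LEN = 9

def one_hot(peptide):
    """Encode a 9-mer as a flat indicator vector of position-specific residues."""
    x = np.zeros(PEPTIDE_LEN * len(AMINO_ACIDS))
    for pos, aa in enumerate(peptide):
        x[pos * len(AMINO_ACIDS) + AMINO_ACIDS.index(aa)] = 1.0
    return x

# Invented training data: (peptide, pIC50) pairs stand in for real binding data.
rng = np.random.default_rng(1)
true_w = rng.normal(scale=0.3, size=PEPTIDE_LEN * len(AMINO_ACIDS))
peptides = ["".join(rng.choice(list(AMINO_ACIDS), PEPTIDE_LEN)) for _ in range(300)]
X = np.array([one_hot(p) for p in peptides])
y = 6.0 + X @ true_w + rng.normal(scale=0.1, size=len(peptides))

# Least-squares fit of the constant term plus per-position contributions.
A = np.hstack([np.ones((len(y), 1)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
const, contributions = coef[0], coef[1:]

predicted = const + one_hot(peptides[0]) @ contributions
print(f"peptide {peptides[0]}: predicted pIC50 = {predicted:.2f}, actual = {y[0]:.2f}")
```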
3D Slicer as an Image Computing Platform for the Quantitative Imaging Network
Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron
2012-01-01
Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to the reproducibility and efficiency of quantitative imaging techniques, and the success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free, open-source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation in supporting versatile visualizations, but it also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, while providing abstractions for the common tasks of data communication, visualization and user interface development. Compared with other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by existing QIN teams, and we elaborate on future directions that can further facilitate the development and validation of imaging biomarkers using 3D Slicer. PMID:22770690
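A minimal sketch of the programming-platform use described above, intended for the Python console of a recent 3D Slicer build (the file path is hypothetical):

```python
# Run inside the 3D Slicer Python console (recent Slicer versions).
# Loads a volume and pulls its voxels into NumPy for a quick summary.
import numpy as np
import slicer

volumeNode = slicer.util.loadVolume("/path/to/scan.nrrd")  # hypothetical path
voxels = slicer.util.arrayFromVolume(volumeNode)           # NumPy view of the image

print("shape:", voxels.shape)
print("mean intensity:", float(np.mean(voxels)))
```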
Dispersion Interactions between Rare Gas Atoms: Testing the London Equation Using ab Initio Methods
ERIC Educational Resources Information Center
Halpern, Arthur M.
2011-01-01
A computational chemistry experiment is described in which students can use advanced ab initio quantum mechanical methods to test the ability of the London equation to account quantitatively for the attractive (dispersion) interactions between rare gas atoms. Using readily available electronic structure applications, students can calculate the…
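For reference, the London expression the experiment tests has the standard form below, where I_A and I_B are the ionization energies, alpha_A and alpha_B the static polarizabilities, and r the interatomic separation:

```latex
E_{\mathrm{disp}}(r) \approx -\frac{3}{2}\,
\frac{I_A I_B}{I_A + I_B}\,
\frac{\alpha_A \alpha_B}{r^{6}}
```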
RECENT ADVANCES IN QUANTITATIVE NEUROPROTEOMICS
Craft, George E; Chen, Anshu; Nairn, Angus C
2014-01-01
The field of proteomics is undergoing rapid development in a number of different areas, including improvements in mass spectrometric platforms, peptide identification algorithms and bioinformatics. In particular, new and improved approaches have established robust methods that not only allow for in-depth and accurate peptide and protein identification and modification mapping, but also allow for sensitive measurement of relative or absolute quantitation. These methods are beginning to be applied to neuroproteomics, but the central nervous system poses many specific challenges for quantitative proteomics, given the large number of different neuronal cell types that are intermixed and that exhibit distinct patterns of gene and protein expression. This review highlights recent advances in quantitative neuroproteomics, with a focus on work published over the last five years that applies emerging methods to normal brain function as well as to neuropsychiatric disorders, including schizophrenia and drug addiction, and to neurodegenerative diseases, including Parkinson's disease and Alzheimer's disease. While older methods such as two-dimensional polyacrylamide gel electrophoresis continue to be used, a variety of more in-depth MS-based approaches, including label-based (ICAT, iTRAQ, TMT, SILAC, SILAM), label-free (including MRM and SWATH) and absolute quantification methods, are rapidly being applied to neurobiological investigations of normal and diseased brain tissue as well as of cerebrospinal fluid (CSF). The biological implications of many of these studies remain to be clearly established, there is a clear need for standardization of experimental design and data analysis, and the analysis of protein changes in specific neuronal cell types in the central nervous system remains a serious challenge; nevertheless, the quality and depth of the more recent quantitative proteomics studies are beginning to shed light on a number of aspects of neuroscience relating to normal brain function and to the changes in protein expression and regulation that occur in neuropsychiatric and neurodegenerative disorders. PMID:23623823
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
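For context, qHNMR quantitation rests on the proportionality of integrated signal area to the number of contributing nuclei. Relative to an internal calibrant, the textbook relation (not taken from this paper) is, with I = integrated signal area, N = number of protons giving rise to the signal, M = molar mass, m = mass, and subscripts a = analyte and cal = calibrant:

```latex
\frac{m_a}{m_{\mathrm{cal}}}
= \frac{I_a}{I_{\mathrm{cal}}}
  \cdot \frac{N_{\mathrm{cal}}}{N_a}
  \cdot \frac{M_a}{M_{\mathrm{cal}}}
```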
2005-06-01
[Garbled report fragment: describes an assessment process for civil affairs units in Afghanistan that offers a significant advance over other assessment methods; involved units agreed that the process was well worth... Cites Bryman, A., "The debate about quantitative and qualitative research: a question of method or epistemology," British Journal of...]
Probabilistic Geoacoustic Inversion in Complex Environments
2015-09-30
Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC. [Report fragments:] ...long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must... The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is...
NASA Astrophysics Data System (ADS)
Edmiston, John Kearney
This work explores the field of continuum plasticity on two fronts. On the theory side, we establish a complete specification of a phenomenological theory of plasticity for single crystals. The model serves as an alternative to the popular crystal plasticity formulation. Such a model has been previously proposed in the literature; the new contribution made here is the constitutive framework and the resulting simulations. We calibrate the model to available data and use a simple numerical method to explore the resulting predictions in plane strain boundary value problems. Results show promise for further investigation of the plasticity model. Conveniently, this theory comes with a corresponding experimental tool in X-ray diffraction. Recent advances in hardware technology at synchrotron sources have led to increased use of the technique for studies of plasticity in the bulk of materials. The method has been successful in qualitative observations of material behavior, but its use in quantitative studies seeking to extract material properties is open for investigation. Therefore, in the second component of the thesis, several contributions are made to synchrotron X-ray diffraction experiments, in terms of method development as well as the quantitative reporting of constitutive parameters. In the area of method development, analytical tools are developed to determine the available precision of this type of experiment, a crucial aspect in deciding whether the method can be used for quantitative studies. We also extract kinematic information relating to intragranular inhomogeneity which is not accessible with traditional methods of data analysis. In the area of constitutive parameter identification, we use the method to extract parameters corresponding to the proposed formulation of plasticity for a titanium alloy (HCP) continuously sampled by X-ray diffraction during uniaxial extension. These results, and the lessons learned from these efforts, constitute an early report on the quantitative value of undertaking such a line of experimentation for the study of plastic deformation processes.
NASA Astrophysics Data System (ADS)
Donnay, Karsten
2015-03-01
The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hungate, Bruce; Pett-Ridge, Jennifer; Blazewicz, Steven
In this project, we developed an innovative and ground-breaking technique, quantitative stable isotope probing (qSIP), which uses density separation of nucleic acids as a quantitative measurement technique. This work is substantial because it advances SIP beyond the qualitative technique that has dominated the field for years. The first methods paper was published in Applied and Environmental Microbiology (Hungate et al. 2015), and describes the mathematical model underlying the quantitative interpretation. A second methods paper (Schwartz et al. 2015) provides a conceptual overview of the method and its application to research problems. A third methods paper was just published (Koch et al. 2018), in which we develop the quantitative model combining sequencing and isotope data to estimate actual rates of microbial growth and death in natural populations. This work has met much enthusiasm in scientific presentations around the world. It has met with equally enthusiastic resistance in the peer-review process, though our record of publication to date argues that people are accepting the merits of the approach. The skepticism and resistance are also potentially signs that this technique is pushing the field forward, albeit with some of the discomfort that accompanies extrapolation. Part of this is a cultural element: the field of microbiology is not accustomed to the assumptions of ecosystem science. Research conducted in this project has pushed the philosophical perspective that major advances can occur when we advocate a sound merger between the traditions of strong inference in microbiology and those of grounded scaling in ecosystem science.
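The core quantity in qSIP is the abundance-weighted average buoyant density (WAD) of a taxon across gradient fractions; the shift in WAD between unlabeled and labeled treatments reflects isotope incorporation. A minimal sketch with invented numbers (see Hungate et al. 2015 for the full model):

```python
import numpy as np

# Buoyant density of each gradient fraction (g/mL) and, per fraction, a
# taxon's abundance (relative abundance x total gene copies). Numbers invented.
density = np.array([1.70, 1.71, 1.72, 1.73, 1.74, 1.75])
abund_unlabeled = np.array([5.0, 9.0, 12.0, 7.0, 2.0, 0.5])
abund_labeled   = np.array([1.0, 3.0, 8.0, 11.0, 9.0, 4.0])

def weighted_average_density(dens, abund):
    """Abundance-weighted mean buoyant density of a taxon across fractions."""
    return np.sum(dens * abund) / np.sum(abund)

wad_u = weighted_average_density(density, abund_unlabeled)
wad_l = weighted_average_density(density, abund_labeled)
print(f"WAD unlabeled = {wad_u:.4f} g/mL, labeled = {wad_l:.4f} g/mL")
print(f"density shift = {wad_l - wad_u:.4f} g/mL (isotope incorporation signal)")
```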
Sakai, Shinobu; Adachi, Reiko; Akiyama, Hiroshi; Teshima, Reiko
2013-06-19
A labeling system for food allergenic ingredients was established in Japan in April 2002. To monitor the labeling, the Japanese government announced official methods for detecting allergens in processed foods in November 2002. The official methods consist of quantitative screening tests using enzyme-linked immunosorbent assays (ELISAs) and qualitative confirmation tests using Western blotting or polymerase chain reactions (PCR). In addition, the Japanese government designated 10 μg protein/g food (the corresponding allergenic ingredient soluble protein weight/food weight), determined by ELISA, as the labeling threshold. To standardize the official methods, the criteria for the validation protocol were described in the official guidelines. This paper, which was presented at the Advances in Food Allergen Detection Symposium, ACS National Meeting and Expo, San Diego, CA, Spring 2012, describes the validation protocol outlined in the official Japanese guidelines, the results of interlaboratory studies for the quantitative detection method (ELISA for crustacean proteins) and the qualitative detection method (PCR for shrimp and crab DNAs), and the reliability of the detection methods.
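Quantitative ELISA readouts of this kind are typically interpolated on a calibrator curve; a common choice is a four-parameter logistic (4PL) fit. A sketch with invented calibrator data (the 4PL choice is ours, not specified by the Japanese guideline):

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = lower asymptote, d = upper asymptote,
    c = inflection concentration, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Invented calibrator data: protein concentration (ug/g) vs. absorbance.
conc = np.array([0.31, 0.63, 1.25, 2.5, 5.0, 10.0, 20.0])
od   = np.array([0.05, 0.09, 0.17, 0.33, 0.61, 1.02, 1.45])

params, _ = curve_fit(four_pl, conc, od, p0=[0.0, 1.5, 8.0, 2.0], maxfev=10000)
a, b, c, d = params

def invert_4pl(y):
    """Interpolate an unknown's concentration from its absorbance."""
    return c * (((a - d) / (y - d)) - 1.0) ** (1.0 / b)

print(f"sample at OD 0.70 -> {invert_4pl(0.70):.2f} ug/g "
      f"(labeling threshold: 10 ug/g)")
```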
Atmospheric Pressure Photoionization Tandem Mass Spectrometry of Androgens in Prostate Cancer
Lih, Fred Bjørn; Titus, Mark A.; Mohler, James L.; Tomer, Kenneth B.
2010-01-01
Androgen deprivation therapy is the most common treatment option for advanced prostate cancer. Almost all prostate cancers recur during androgen deprivation therapy, and new evidence suggests that androgen receptor activation persists despite castrate levels of circulating androgens. Quantitation of tissue levels of androgens is critical to understanding the mechanism of recurrence of prostate cancer during androgen deprivation therapy. A liquid chromatography atmospheric pressure photoionization tandem mass spectrometric method was developed for quantitation of tissue levels of androgens. Quantitation of the saturated keto-steroids dihydrotestosterone and 5-α-androstanedione required detection of a novel parent ion, [M + 15]+. The nature of this parent ion was explored and the method applied to prostate tissue and cell culture with comparison to results achieved using electrospray ionization. PMID:20560527
ADVANCED COMPUTATIONAL METHODS IN DOSE MODELING
The overall goal of the EPA-ORD NERL research program on Computational Toxicology (CompTox) is to provide the Agency with the tools of modern chemistry, biology, and computing to improve quantitative risk assessments and reduce uncertainties in the source-to-adverse outcome conti...
20180312 - Mechanistic Modeling of Developmental Defects through Computational Embryology (SOT)
Significant advances in the genome sciences, in automated high-throughput screening (HTS), and in alternative methods for testing enable rapid profiling of chemical libraries for quantitative effects on diverse cellular activities. While a surfeit of HTS data and information is n...
Hanna, Debra; Romero, Klaus; Schito, Marco
2017-03-01
The development of novel tuberculosis (TB) multi-drug regimens that are more efficacious and of shorter duration requires a robust drug development pipeline. Advances in quantitative modeling and simulation can be used to maximize the utility of patient-level data from prior and contemporary clinical trials, thus optimizing study design for anti-TB regimens. This perspective article highlights the work of seven project teams developing first-in-class translational and quantitative methodologies that aim to inform drug development decision-making, dose selection, trial design, and safety assessments, in order to achieve shorter and safer therapies for patients in need. These tools offer the opportunity to evaluate multiple hypotheses and provide a means to identify, quantify, and understand relevant sources of variability, to optimize translation and clinical trial design. When incorporated into the broader regulatory sciences framework, these efforts have the potential to transform the development paradigm for TB combination development, as well as other areas of global health. Copyright © 2016. Published by Elsevier Ltd.
Leake, Mark C
2016-01-01
Our understanding of the processes involved in infection has grown enormously in the past decade, due in part to emerging methods of biophysics. This new insight has been enabled by advances in interdisciplinary experimental technologies and theoretical methods at the cutting-edge interface of the life and physical sciences. For example, several state-of-the-art biophysical tools, used in conjunction with molecular and cell biology approaches, now enable investigation of infection in living cells. There are also new, emerging interfacial science tools that significantly improve the resolution of quantitative measurements in both space and time; these include single-molecule biophysics methods and super-resolution microscopy approaches. These new technological tools in particular have underpinned much new understanding of the dynamic processes of infection at the molecular length scale. Valuable recent advances in theoretical approaches of biophysics are also enabling predictive modelling that generates new understanding of infection. Here, I discuss these advances, take stock of our knowledge of the biophysics of infection, and consider where future advances may lead.
Quantitative Analysis of Cell Nucleus Organisation
Shiels, Carol; Adams, Niall M; Islam, Suhail A; Stephens, David A; Freemont, Paul S
2007-01-01
There are almost 1,300 entries for higher eukaryotes in the Nuclear Protein Database. The proteins' subcellular distribution patterns within interphase nuclei can be complex, ranging from diffuse to punctate or microspeckled, yet they all work together in a coordinated and controlled manner within the three-dimensional confines of the nuclear volume. In this review we describe recent advances in the use of quantitative methods to understand nuclear spatial organisation and discuss some of the practical applications resulting from this work. PMID:17676980
Three-dimensional quantitative flow diagnostics
NASA Technical Reports Server (NTRS)
Miles, Richard B.; Nosenchuck, Daniel M.
1989-01-01
The principles, capabilities, and practical implementation of advanced measurement techniques for the quantitative characterization of three-dimensional flows are reviewed. Consideration is given to particle, Rayleigh, and Raman scattering; fluorescence; flow marking by H2 bubbles, photochromism, photodissociation, and vibrationally excited molecules; light-sheet volume imaging; and stereo imaging. Also discussed are stereo schlieren methods, holographic particle imaging, optical tomography, acoustic and magnetic-resonance imaging, and the display of space-filling data. Extensive diagrams, graphs, photographs, sample images, and tables of numerical data are provided.
eSIP: A Novel Solution-Based Sectioned Image Property Approach for Microscope Calibration
Butzlaff, Malte; Weigel, Arwed; Ponimaskin, Evgeni; Zeug, Andre
2015-01-01
Fluorescence confocal microscopy represents one of the central tools in modern science, and a growing amount of research relies on the development of novel microscopic methods. During the last decade, numerous microscopic approaches were developed for the investigation of various scientific questions, and earlier qualitative imaging methods have been replaced by advanced quantitative methods that extract more and more information from a given sample. However, modern microscope systems, complex as they are, require very precise and appropriate calibration routines, in particular when quantitative measurements are to be compared over longer time scales or between different setups. Multispectral beads of sub-resolution size are often used to describe the point spread function and thus the optical properties of the microscope. More recently, a fluorescent layer was utilized to describe the axial profile for each pixel, which allows a spatially resolved characterization. However, fabrication of a thin fluorescent layer with a matching refractive index has not yet been solved technically. We therefore propose a novel type of calibration concept for sectioned image property (SIP) measurements that is based on a fluorescent solution, making the calibration concept available to a broader number of users. Compared with the previous approach, additional information can be obtained with this extended SIP chart approach, including penetration depth, detected number of photons, and illumination profile shape. Furthermore, because the complete profile is fitted, our method is less susceptible to noise. Overall, the extended SIP approach represents a simple and highly reproducible method, allowing setup-independent calibration and alignment procedures, which is mandatory for advanced quantitative microscopy. PMID:26244982
NASA Technical Reports Server (NTRS)
Young, S. G.
1973-01-01
The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma-prime eutectic, a gamma matrix containing fine gamma-prime, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high-W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscopy, light metallography, and replica electron microscopy methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.
Research using qualitative, quantitative or mixed methods and choice based on the research.
McCusker, K; Gunaydin, S
2015-10-01
Research is fundamental to the advancement of medicine and critical to identifying the optimal therapies unique to particular societies. This is easily observed in the dynamics of pharmacology, surgical technique and medical equipment today versus just a few years ago. Advancements in knowledge synthesis and reporting guidelines enhance the quality, scope and applicability of results, thus improving health science and clinical practice and advancing health policy. While such advancements are critical to the progression of optimal health care, the high cost associated with these endeavors cannot be ignored, so research methods themselves need to be evaluated to identify the most efficient approaches. The primary objective of this paper is to examine a specific research methodology as applied to clinical research, especially extracorporeal circulation, and to consider its prognosis for the future. © The Author(s) 2014.
Using mixed methods when researching communities.
Ochieng, Bertha M N; Meetoo, Danny
2015-09-01
To argue for the use of mixed methods when researching communities. Although research involving minority communities is now advanced, not enough effort has been made to formulate methodological linkages between qualitative and quantitative methods in most studies. For instance, the quantitative approaches used by epidemiologists and others in examining the wellbeing of communities are usually empirical. While the rationale for this is sound, quantitative findings can be expanded with data from in-depth qualitative approaches, such as interviews or observations, which are likely to provide insights into the experiences of people in those communities and their relationships with their wellbeing. Academic databases searched included The Cochrane Library, MEDLINE, CINAHL, AMED, INTERNURSE, Science Direct, Web of Knowledge and PubMed; eligible literature was identified through an iterative, comprehensive search of these electronic databases. Using mixed-methods approaches is likely to address the potential drawbacks of individual methods by exploiting the strengths of each at the various stages of research. Combining methods can provide additional ways of looking at a complex problem and improve understanding of a community's experiences. However, it is important for researchers to use the different methods interactively during their research. The use of qualitative and quantitative methods is likely to enrich our understanding of the interrelationship between wellbeing and the experiences of communities, and should help researchers to explore socio-cultural factors and experiences of health and healthcare practice more effectively.
The "Science of HRD Research": Reshaping HRD Research through Scientometrics
ERIC Educational Resources Information Center
Wang, Greg G.; Gilley, Jerry W.; Sun, Judy Y.
2012-01-01
We explore opportunities for assessing and advancing Human Resource Development (HRD) research through an integrative literature review of scientometric theories and methods. Known as the "science of science," scientometrics is concerned with the quantitative study of scholarly communications, disciplinary structure and assessment and measurement…
Radiogenomics and radiotherapy response modeling
NASA Astrophysics Data System (ADS)
El Naqa, Issam; Kerns, Sarah L.; Coates, James; Luo, Yi; Speers, Corey; West, Catharine M. L.; Rosenstein, Barry S.; Ten Haken, Randall K.
2017-08-01
Advances in patient-specific information and biotechnology have contributed to a new era of computational medicine. Radiogenomics has emerged as a new field that investigates the role of genetics in treatment response to radiation therapy. Radiation oncology is currently attempting to embrace these recent advances and add to its rich history by maintaining its prominent role as a quantitative leader in oncologic response modeling. Here, we provide an overview of radiogenomics starting with genotyping, data aggregation, and application of different modeling approaches based on modifying traditional radiobiological methods or application of advanced machine learning techniques. We highlight the current status and potential for this new field to reshape the landscape of outcome modeling in radiotherapy and drive future advances in computational oncology.
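As one example of the "traditional radiobiological methods" such outcome modeling builds on (the choice of example is ours, not the review's), the Lyman-Kutcher-Burman model maps a dose-volume histogram to a normal tissue complication probability:

```latex
\mathrm{NTCP} = \Phi\!\left(\frac{\mathrm{EUD} - TD_{50}}{m \cdot TD_{50}}\right),
\qquad
\mathrm{EUD} = \left(\sum_i v_i\, D_i^{\,1/n}\right)^{n}
```

Here Phi is the standard normal CDF, TD50 the uniform dose producing a 50% complication rate, m a slope parameter, n the volume-effect parameter, and (v_i, D_i) the fractional-volume and dose bins of the dose-volume histogram.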
Image analysis and modeling in medical image computing. Recent developments and advances.
Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T
2012-01-01
Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice, e.g., to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the degree of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements of clinical routine. In this focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models into the image analysis process enables improvements in automation, accuracy, reproducibility and robustness, and model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present the latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany; all manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications, and medical images such as radiographic images, dual-energy CT images, MR images, diffusion tensor images and microscopic images are analyzed. The applications emphasize the high potential and wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body. Hence, model-based image computing methods are important tools for improving medical diagnostics and patient treatment in the future.
Piedra, Jose; Ontiveros, Maria; Miravet, Susana; Penalva, Cristina; Monfar, Mercè; Chillon, Miguel
2015-02-01
Recombinant adeno-associated viruses (rAAVs) are promising vectors in preclinical and clinical assays for the treatment of diseases with gene therapy strategies. Recent technological advances in amplification and purification have allowed the production of highly purified rAAV vector preparations. Although quantitative polymerase chain reaction (qPCR) is the current method of choice for titrating rAAV genomes, it shows high variability. In this work, we report a rapid and robust rAAV titration method based on the quantitation of encapsidated DNA with the fluorescent dye PicoGreen®. This method allows detection from 3×10^10 viral genomes/ml up to 2.4×10^13 viral genomes/ml in a linear range. Contrasted with dot blot or qPCR, the PicoGreen-based assay has less intra- and interassay variability. Moreover, quantitation is rapid, does not require specific primers or probes, and is independent of the rAAV pseudotype analyzed. In summary, development of this universal rAAV-titering method may have substantive implications in rAAV technology.
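For orientation, converting a fluorescence-derived DNA concentration into a genome titer is a short calculation. A sketch with invented numbers, assuming a single-stranded 4,700-nt vector genome and roughly 330 g/mol per nucleotide (real assays calibrate against a DNA standard curve):

```python
# Convert a PicoGreen-derived DNA concentration to an AAV genome titer.
# Assumptions (invented for illustration): single-stranded 4,700-nt genome,
# ~330 g/mol per nucleotide, and a measured DNA concentration of 100 ng/mL.
AVOGADRO = 6.022e23
genome_length_nt = 4700
dna_ng_per_ml = 100.0

grams_per_genome = genome_length_nt * 330.0 / AVOGADRO
titer_vg_per_ml = (dna_ng_per_ml * 1e-9) / grams_per_genome
print(f"titer ~ {titer_vg_per_ml:.2e} vg/mL")   # ~3.9e10 vg/mL
```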
Stereotype Threat? Male and Female Students in Advanced High School Courses
NASA Astrophysics Data System (ADS)
Corra, Mamadi
Propositions of stereotype threat theory imply that the social consequences of academic distinction in advanced quantitative areas (such as math and the physical sciences) for women may promote the underrepresentation of female students in advanced quantitative academic courses. The hypothesis that female students will be underrepresented in advanced quantitative (honors and advanced placement math and physical science) courses is tested using academic performance and enrollment data for high school students in a "Student/Parent Informed Choice" (open registration) school district in North Carolina. Results show female students to be overrepresented in both advanced verbal/writing intensive (honors and advanced placement English, foreign language, and social science) and advanced quantitative (honors and advanced placement math and physical science) courses compared to their proportion of the student body. More surprisingly, results also indicate female students (compared to male students) to be overrepresented in advanced courses compared to their proportion of high-performing students. Furthermore, as with patterns observed at the district level, additional analysis of enrollment data for the entire state reveals similar results. Taken together, the findings call into question the prevailing presumption that female students continue to be underrepresented in math and physical science courses. Instead, the changing social context within which females and males experience schooling may provide an explanation for the findings.
Designing a mixed methods study in primary care.
Creswell, John W; Fetters, Michael D; Ivankova, Nataliya V
2004-01-01
Mixed methods or multimethod research holds potential for rigorous, methodologically sound investigations in primary care. The objective of this study was to use criteria from the literature to evaluate 5 mixed methods studies in primary care and to advance 3 models useful for designing such investigations. We first identified criteria from the social and behavioral sciences to analyze mixed methods studies in primary care research. We then used the criteria to evaluate 5 mixed methods investigations published in primary care research journals. Of the 5 studies analyzed, 3 included a rationale for mixing based on the need to develop a quantitative instrument from qualitative data or to converge information to best understand the research topic. Quantitative data collection involved structured interviews, observational checklists, and chart audits that were analyzed using descriptive and inferential statistical procedures. Qualitative data consisted of semistructured interviews and field observations that were analyzed using coding to develop themes and categories. The studies showed diverse forms of priority: equal priority, qualitative priority, and quantitative priority. Data collection involved quantitative and qualitative data gathered both concurrently and sequentially. The integration of the quantitative and qualitative data in these studies occurred between data analysis from one phase and data collection from a subsequent phase, while analyzing the data, and when reporting the results. We recommend instrument-building, triangulation, and data transformation models for mixed methods designs as useful frameworks to add rigor to investigations in primary care. We also discuss the limitations of our study and the need for future research.
Thinking Fast and Slow about Causality: Response to Palinkas
ERIC Educational Resources Information Center
Marsh, Jeanne C.
2014-01-01
Larry Palinkas advances the developing science of social work by providing an explanation of how social science research methods, both qualitative and quantitative, can improve our capacity to draw causal inferences. Understanding causal relations and making causal inferences--with the promise of being able to predict and control outcomes--is…
Real-time PCR: Advanced technologies and applications
USDA-ARS?s Scientific Manuscript database
This book brings together contributions from 20 experts in the field of PCR, providing a broad perspective of the applications of quantitative real-time PCR (qPCR). The editors state in the preface that the aim is to provide detailed insight into underlying principles and methods of qPCR to provide ...
Reflections on Education and Social Mobility
ERIC Educational Resources Information Center
Halsey, A. H.
2013-01-01
This article is a brief personal reflection on the state of research into the relation between education and social mobility. Quantitative methods are both essential and advancing in this field. Sociologists seek scientific solutions but achieve ethical neutrality only with difficulty because all are tempted to bias from social and political…
The National Health Educator Job Analysis 2010: Process and Outcomes
ERIC Educational Resources Information Center
Doyle, Eva I.; Caro, Carla M.; Lysoby, Linda; Auld, M. Elaine; Smith, Becky J.; Muenzen, Patricia M.
2012-01-01
The National Health Educator Job Analysis 2010 was conducted to update the competencies model for entry- and advanced-level health educators. Qualitative and quantitative methods were used. Structured interviews, focus groups, and a modified Delphi technique were implemented to engage 59 health educators from diverse work settings and experience…
Simulation and Advanced Practice Nursing Education
ERIC Educational Resources Information Center
Blue, Dawn I.
2016-01-01
This quantitative study compared changes in level of confidence resulting from participation in simulation or traditional instructional methods for BSN (Bachelor of Science in Nursing) to DNP (Doctor of Nursing Practice) students in a nurse practitioner course when they entered the clinical practicum. Simulation has been used in many disciplines…
e-Learning Success Model: An Information Systems Perspective
ERIC Educational Resources Information Center
Lee-Post, Anita
2009-01-01
This paper reports the observations made and experience gained from developing and delivering an online quantitative methods course for Business undergraduates. Inspired by issues and challenges experienced in developing the online course, a model is advanced to address the question of how to guide the design, development, and delivery of…
QuantCrit: Rectifying Quantitative Methods through Critical Race Theory
ERIC Educational Resources Information Center
Garcia, Nichole M.; López, Nancy; Vélez, Verónica N.
2018-01-01
Critical race theory (CRT) in education centers, examines, and seeks to transform the relationship that undergirds race, racism, and power. CRT scholars have applied a critical race framework to advance research methodologies, namely qualitative interventions. Informed by this work, and 15 years later, this article reconsiders the possibilities of…
Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia AM
2013-01-01
Background: There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. Objective: To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. Methods: We obtained conventional PDw and T2w images from 10 patients with relapsing–remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Results: Our study’s ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. Conclusion: ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished. PMID:23037551
Current methods and advances in bone densitometry
NASA Technical Reports Server (NTRS)
Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.
1995-01-01
Bone mass is the primary, although not the only, determinant of fracture. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance (MRI) in the detection and management of osteoporosis.
NASA Technical Reports Server (NTRS)
Miller, James G.
1993-01-01
In this Progress Report, we describe our current research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the characterization of stitched composite materials and bonded aluminum plate specimens. One purpose of this investigation is to identify and characterize specific features of polar backscatter interrogation which enhance the ability of ultrasound to detect flaws in a stitched composite laminate. Another focus is to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize bonded aluminum lap joints. As an approach to implementing quantitative ultrasonic inspection methods to both of these materials, we focus on the physics that underlies the detection of flaws in such materials.
Van Oudenhove, Laurence; Devreese, Bart
2013-06-01
Proteomics has evolved substantially since its early days, some 20 years ago. In this mini-review, we aim to provide an overview of general methodologies and more recent developments in mass spectrometric approaches used for relative and absolute quantitation of proteins. Enhancement of sensitivity of the mass spectrometers as well as improved sample preparation and protein fractionation methods are resulting in a more comprehensive analysis of proteomes. We also document some upcoming trends for quantitative proteomics such as the use of label-free quantification methods. Hopefully, microbiologists will continue to explore proteomics as a tool in their research to understand the adaptation of microorganisms to their ever changing environment. We encourage them to incorporate some of the described new developments in mass spectrometry to facilitate their analyses and improve the general knowledge of the fascinating world of microorganisms.
Quantitative ultrasonic evaluation of engineering properties in metals, composites and ceramics
NASA Technical Reports Server (NTRS)
Vary, A.
1980-01-01
Ultrasonic technology from the perspective of nondestructive evaluation approaches to material strength prediction and property verification is reviewed. Emergent advanced technology involving quantitative ultrasonic techniques for materials characterization is described. Ultrasonic methods are particularly useful in this area because they involve mechanical elastic waves that are strongly modulated by the same morphological factors that govern mechanical strength and dynamic failure processes. It is emphasized that the technology is in its infancy and that much effort is still required before all the available techniques can be transferred from laboratory to industrial environments.
Hill, Ryan C; Oman, Trent J; Shan, Guomin; Schafer, Barry; Eble, Julie; Chen, Cynthia
2015-08-26
Currently, traditional immunochemistry technologies such as enzyme-linked immunosorbent assays (ELISA) are the predominant analytical tool used to measure levels of recombinant proteins expressed in genetically engineered (GE) plants. Recent advances in agricultural biotechnology have created a need to develop methods capable of selectively detecting and quantifying multiple proteins in complex matrices because of increasing numbers of transgenic proteins being coexpressed or "stacked" to achieve tolerance to multiple herbicides or to provide multiple modes of action for insect control. A multiplexing analytical method utilizing liquid chromatography with tandem mass spectrometry (LC-MS/MS) has been developed and validated to quantify three herbicide-tolerant proteins in soybean tissues: aryloxyalkanoate dioxygenase (AAD-12), 5-enol-pyruvylshikimate-3-phosphate synthase (2mEPSPS), and phosphinothricin acetyltransferase (PAT). Results from the validation showed high recovery and precision over multiple analysts and laboratories. Results from this method were comparable to those obtained with ELISA with respect to protein quantitation, and the described method was demonstrated to be suitable for multiplex quantitation of transgenic proteins in GE crops.
Wu, Jianning; Wu, Bin
2015-01-01
The accurate identification of gait asymmetry is very beneficial to the assessment of at-risk gait in clinical applications. This paper investigated the application of a classification method based on a statistical learning algorithm to quantify gait symmetry, under the assumption that the degree of intrinsic change in the dynamical system of gait is associated with differing statistical distributions of gait variables between the left and right lower limbs; that is, discriminating small differences in similarity between the limbs is treated as recognizing their differing probability distributions. The kinetic gait data of 60 participants were recorded using a strain gauge force platform during normal walking. The classification method, built on an advanced statistical learning algorithm (a support vector machine for binary classification), is adopted to quantitatively evaluate gait symmetry. The experimental results showed that the proposed method could capture more of the intrinsic dynamic information hidden in gait variables and recognize right-left gait patterns with superior generalization performance. Moreover, the proposed technique could identify small but significant differences between the lower limbs that the traditional symmetry index method for gait misses. The proposed algorithm could become an effective tool for early identification of gait asymmetry in the elderly in clinical diagnosis. PMID:25705672
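A minimal sketch of the general approach (not the authors' exact pipeline): train a binary SVM to separate left-limb from right-limb feature distributions and read cross-validated accuracy as a symmetry measure, with chance-level accuracy indicating symmetric gait.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Invented features standing in for kinetic gait variables (e.g., peak
# vertical ground reaction force, loading rate) from left and right limbs.
rng = np.random.default_rng(42)
left  = rng.normal(loc=[1.10, 0.95], scale=0.05, size=(60, 2))
right = rng.normal(loc=[1.13, 0.92], scale=0.05, size=(60, 2))  # slight asymmetry

X = np.vstack([left, right])
y = np.array([0] * 60 + [1] * 60)   # 0 = left limb, 1 = right limb

# If left/right distributions are indistinguishable, accuracy ~ chance (0.5);
# accuracy well above chance signals asymmetry between the limbs.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"left-vs-right separability: {scores.mean():.2f} +/- {scores.std():.2f}")
```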
Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim
2015-01-01
Quantitative phase imaging (QPI) brought innovation to noninvasive observation of live cell dynamics seen as cell behavior. Unlike the Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is either visualized as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of changes quantified in picograms. Then in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. Construction of the DPDs method and results exhibiting the approach are presented. Advantage of the DPDs application is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
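A sketch of the DPD computation on synthetic phase images (all parameters invented), using the common QPI conversion from phase to dry-mass surface density, sigma = phi * lambda / (2 * pi * alpha), with specific refraction increment alpha of roughly 0.2 ml/g (equivalently 0.2 um^3/pg):

```python
import numpy as np

WAVELENGTH_UM = 0.65        # illumination wavelength (um), invented
ALPHA_UM3_PER_PG = 0.2      # specific refraction increment ~0.2 mL/g = 0.2 um^3/pg
PIXEL_AREA_UM2 = 0.1        # sample area imaged by one pixel (um^2), invented

def phase_to_dry_mass(phase, pixel_area=PIXEL_AREA_UM2):
    """Convert a phase image (radians) to dry mass per pixel (pg)."""
    surface_density = phase * WAVELENGTH_UM / (2.0 * np.pi * ALPHA_UM3_PER_PG)
    return surface_density * pixel_area

# Two consecutive quantitative phase images of a cell (invented data).
rng = np.random.default_rng(7)
frame0 = np.abs(rng.normal(1.0, 0.1, size=(64, 64)))
frame1 = frame0 + rng.normal(0.0, 0.02, size=(64, 64))   # small redistributions

# Dynamic phase difference: per-pixel mass change between the two frames.
dpd = phase_to_dry_mass(frame1) - phase_to_dry_mass(frame0)
print(f"net dry-mass change: {dpd.sum():+.2f} pg")
print(f"redistributed mass (|change| summed): {np.abs(dpd).sum():.2f} pg")
```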
NASA Astrophysics Data System (ADS)
Krizova, Aneta; Collakova, Jana; Dostal, Zbynek; Kvasnica, Lukas; Uhlirova, Hana; Zikmund, Tomas; Vesely, Pavel; Chmelik, Radim
2015-11-01
Quantitative phase imaging (QPI) has brought innovation to the noninvasive observation of live cell dynamics seen as cell behavior. Unlike Zernike phase contrast or differential interference contrast, QPI provides quantitative information about cell dry mass distribution. We used such data for the objective evaluation of live cell behavioral dynamics by the advanced method of dynamic phase differences (DPDs). The DPDs method is considered a rational instrument offered by QPI. By subtracting the antecedent from the subsequent image in a time-lapse series, only the changes in mass distribution in the cell are detected. The result is visualized either as a two-dimensional color-coded projection of these two states of the cell or as a time dependence of the changes quantified in picograms. Then, in a series of time-lapse recordings, the chain of cell mass distribution changes that would otherwise escape attention is revealed. Consequently, new salient features of live cell behavior should emerge. The construction of the DPDs method and results illustrating the approach are presented. The advantage of applying DPDs is demonstrated on cells exposed to an osmotic challenge. For time-lapse acquisition of quantitative phase images, the recently developed coherence-controlled holographic microscope was employed.
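A minimal sketch of the DPDs computation as described: subtract the antecedent from the subsequent quantitative phase image and express the change as dry mass in picograms. The wavelength, pixel size, and specific refractive increment below are assumed values for illustration, not the authors' instrument parameters.

```python
# Sketch, not the authors' implementation: phase-to-dry-mass conversion uses
# m = OPD * pixel_area / alpha, with alpha ~0.18 ml/g (= 0.18 um^3/pg) assumed.
import numpy as np

wavelength_um = 0.65            # assumed illumination wavelength
alpha_um3_per_pg = 0.18         # assumed specific refractive increment
pixel_area_um2 = 0.1 * 0.1      # assumed pixel size

def dry_mass_pg(phase_rad):
    """Convert a phase map (radians) to per-pixel dry mass (pg)."""
    opd_um = phase_rad * wavelength_um / (2 * np.pi)   # optical path difference
    return opd_um * pixel_area_um2 / alpha_um3_per_pg

rng = np.random.default_rng(1)
phase_t0 = rng.random((256, 256))                       # antecedent frame
phase_t1 = phase_t0 + rng.normal(0, 0.01, (256, 256))   # subsequent frame

dpd = dry_mass_pg(phase_t1) - dry_mass_pg(phase_t0)     # dynamic phase difference
print(f"net dry-mass change: {dpd.sum():+.3f} pg")
```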
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2013-03-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases: uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes is presented as an excellent example of the capabilities of the SPR imaging system.
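As one illustration of how average coverage can be extracted once lateral heterogeneity is acknowledged, the fitted effective index can be treated as a linear mixture of bare and fully coated regions. This is a simple stand-in under a linear effective-medium assumption, not the authors' refined analysis; all index values are assumed.

```python
# Illustrative (not the authors' refined method): estimate average fractional
# coverage of a laterally heterogeneous film by treating the measured effective
# refractive index as a linear mixture of bare and fully coated regions.
n_bare = 1.333      # assumed ambient (water) index over bare metal
n_film = 1.45       # assumed index of a fully formed adlayer
n_measured = 1.38   # effective index fitted from the SPR angle shift

coverage = (n_measured - n_bare) / (n_film - n_bare)
print(f"average coverage ~ {coverage:.0%}")
```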
Nondestructive Evaluation of Adhesive Bond Quality: State of the Art Review
1989-06-01
... conducted using a medium-focused, 1/4-inch diameter, 10-MHz transducer ... Harnik (6) developed a quantitative method of testing adhesive bonded joints ... Couchman, "Spectral Analysis Technique of Ultrasonic NDT of Advanced ..."; E. Harnik, "The Testing of Adhesive-Bonded Joints by a Very High Resolu ..."
Making and Using a Sensing Polymeric Material for Cu[superscript 2+]
ERIC Educational Resources Information Center
Paddock, Jean R.; Maghasi, Anne T.; Heineman, William R.; Seliskar, Carl J.
2005-01-01
A simple chemical sensor-related experiment rooted in the synthesis of polymeric materials, for use in either an advanced high-school or an undergraduate college laboratory, is presented. Students are introduced to, and combine, the concepts of chemical sensors, polymer chemistry, spectroscopy, metal chelates, and quantitative analytical methods.
Fuzzy method of recognition of high molecular substances in evidence-based biology
NASA Astrophysics Data System (ADS)
Olevskyi, V. I.; Smetanin, V. T.; Olevska, Yu. B.
2017-10-01
Modern requirements for reliable, high-quality research results put mathematical methods of data analysis at the forefront. Because of this, evidence-based methods of processing experimental data have become increasingly popular in the biological sciences and medicine. Their basis is meta-analysis, a method for the quantitative generalization of a large number of randomized trials addressing the same problem, which are often contradictory and performed by different authors. It allows identifying the most important trends and quantitative indicators in the data, verifying advanced hypotheses, and discovering new effects in the population genotype. Existing methods of recognizing high-molecular substances by gel electrophoresis of proteins under denaturing conditions are based on approximate comparisons of the contrast of electrophoregrams with a standard solution of known substances. We propose a fuzzy method for modeling experimental data to increase the accuracy and validity of findings in the detection of new proteins.
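Since the passage rests on meta-analysis as the quantitative pooling of many trials, a minimal fixed-effect, inverse-variance sketch may help; the effect sizes and standard errors are invented for illustration, not data from the study.

```python
# Minimal fixed-effect (inverse-variance) meta-analysis sketch; all numbers
# are made-up placeholders for illustration.
import numpy as np

effects = np.array([0.42, 0.30, 0.55, 0.18])   # per-trial effect estimates
se = np.array([0.15, 0.20, 0.25, 0.10])        # their standard errors

w = 1.0 / se**2                                # inverse-variance weights
pooled = np.sum(w * effects) / np.sum(w)       # weighted mean effect
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)
print(f"pooled effect {pooled:.3f}, 95% CI [{ci[0]:.3f}, {ci[1]:.3f}]")
```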
Artificial intelligence in radiology.
Hosny, Ahmed; Parmar, Chintan; Quackenbush, John; Schwartz, Lawrence H; Aerts, Hugo J W L
2018-05-17
Artificial intelligence (AI) algorithms, particularly deep learning, have demonstrated remarkable progress in image-recognition tasks. Methods ranging from convolutional neural networks to variational autoencoders have found myriad applications in the medical image analysis field, propelling it forward at a rapid pace. Historically, in radiology practice, trained physicians visually assessed medical images for the detection, characterization and monitoring of diseases. AI methods excel at automatically recognizing complex patterns in imaging data and providing quantitative, rather than qualitative, assessments of radiographic characteristics. In this Opinion article, we establish a general understanding of AI methods, particularly those pertaining to image-based tasks. We explore how these methods could impact multiple facets of radiology, with a general focus on applications in oncology, and demonstrate ways in which these methods are advancing the field. Finally, we discuss the challenges facing clinical implementation and provide our perspective on how the domain could be advanced.
Major advances in testing of dairy products: milk component and dairy product attribute testing.
Barbano, D M; Lynch, J M
2006-04-01
Milk component analysis is relatively unusual in the field of quantitative analytical chemistry because an analytical test result determines the allocation of very large amounts of money between buyers and sellers of milk. Therefore, there is high incentive to develop and refine these methods to achieve a level of analytical performance rarely demanded of most methods or laboratory staff working in analytical chemistry. In the last 25 yr, well-defined statistical methods to characterize and validate analytical method performance combined with significant improvements in both the chemical and instrumental methods have allowed achievement of improved analytical performance for payment testing. A shift from marketing commodity dairy products to the development, manufacture, and marketing of value added dairy foods for specific market segments has created a need for instrumental and sensory approaches and quantitative data to support product development and marketing. Bringing together sensory data from quantitative descriptive analysis and analytical data from gas chromatography olfactometry for identification of odor-active compounds in complex natural dairy foods has enabled the sensory scientist and analytical chemist to work together to improve the consistency and quality of dairy food flavors.
Doria, Andrea S; Zhang, Ningning; Lundin, Bjorn; Hilliard, Pamela; Man, Carina; Weiss, Ruth; Detzler, Gary; Blanchette, Victor; Moineddin, Rahim; Eckstein, Felix; Sussman, Marshall S
2014-05-01
Recent advances in hemophilia prophylaxis have raised the need for accurate noninvasive methods for the assessment of early cartilage damage in maturing joints to guide the initiation of prophylaxis. Such methods can be either semiquantitative or quantitative. Whereas semiquantitative scores are less time-consuming to perform than quantitative methods, they are prone to subjective interpretation. To test the feasibility of a manual segmentation and a quantitative methodology for cross-sectional evaluation of articular cartilage status in growing ankles of children with blood-induced arthritis, as compared with a semiquantitative scoring system and clinical-radiographic constructs. Twelve boys, 11 with hemophilia (A, n = 9; B, n = 2) and 1 with von Willebrand disease (median age: 13; range: 6-17), underwent physical examination and MRI at 1.5 T. Two radiologists semiquantitatively scored the MRIs for cartilage pathology (surface erosions, cartilage loss) with blinding to clinical information. An experienced operator applied a validated quantitative 3-D MRI method to determine the percentage area of denuded bone (dAB) and the cartilage thickness (ThCtAB) in the joints' MRIs. Quantitative and semiquantitative MRI methods and clinical-radiographic constructs (Hemophilia Joint Health Score [HJHS], Pettersson radiograph scores) were compared. Moderate correlations were noted between erosions and dAB (r = 0.62, P = 0.03) in the talus but not in the distal tibia (P > 0.05). Whereas substantial to high correlations (r range: 0.70-0.94, P < 0.05) were observed between erosions, cartilage loss, HJHS and Pettersson scores both at the distal tibia and talus levels, moderate/borderline substantial (r range: 0.55-0.61, P < 0.05) correlations were noted between dAB/ThCtAB and clinical-radiographic constructs. Whereas the semiquantitative method of assessing cartilage status is closely associated with clinical-radiographic scores in cross-sectional studies of blood-induced arthropathy, quantitative measures provide independent information and are therefore less applicable for that research design.
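A hedged sketch of the two quantitative measures named above, dAB (percentage area of denuded bone) and ThCtAB (cartilage thickness over the total area of subchondral bone), computed from a per-pixel thickness map. The map here is synthetic, and the definitions are generic ones, not the validated software's exact implementation.

```python
# Sketch: dAB and ThCtAB from a synthetic cartilage-thickness map over the
# segmented total area of subchondral bone (tAB); definitions are generic.
import numpy as np

rng = np.random.default_rng(2)
thickness_mm = rng.gamma(shape=4.0, scale=0.3, size=(128, 128))
thickness_mm[rng.random((128, 128)) < 0.05] = 0.0   # simulate denuded pixels

dAB = 100.0 * np.mean(thickness_mm == 0)            # % area of denuded bone
ThCtAB = thickness_mm.mean()                        # mean thickness over tAB
print(f"dAB = {dAB:.1f}%, ThCtAB = {ThCtAB:.2f} mm")
```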
Investigation on microfluidic particles manipulation by holographic 3D tracking strategies
NASA Astrophysics Data System (ADS)
Cacace, Teresa; Paturzo, Melania; Memmolo, Pasquale; Vassalli, Massimo; Fraldi, Massimiliano; Mensitieri, Giuseppe; Ferraro, Pietro
2017-06-01
We demonstrate a 3D holographic tracking method to investigate particle motion in a microfluidic channel, both unperturbed and while inducing migration through microfluidic manipulation. Digital holography (DH) in microscopy is a full-field, label-free imaging technique able to provide quantitative phase contrast. The employed 3D tracking method proceeds in steps. First, displacements along the optical axis are assessed by numerical refocusing criteria. In particular, an automatic refocusing method to recover the particles' axial position is implemented using a contrast-based refocusing criterion. Then, the transverse position of the in-focus object is evaluated through quantitative phase-map segmentation and a centroid-based 2D tracking strategy. DH is thus suggested as a powerful approach for controlling the manipulation of particles and biological samples, as well as a possible aid to the precise design and implementation of advanced lab-on-chip microfluidic devices.
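A compact sketch of the two tracking steps described: choose the axial position maximizing a contrast-based refocusing criterion, then centroid the segmented in-focus phase map. The loop runs over precomputed trial reconstructions rather than performing actual holographic propagation, and the contrast metric and threshold are illustrative assumptions.

```python
# Sketch of contrast-based autofocus followed by centroid 2D localization;
# the "stack" stands in for numerically refocused reconstructions of one
# hologram at trial depths (no real DH propagator is implemented here).
import numpy as np

def contrast(img):
    """Simple contrast criterion: normalized intensity variance."""
    return img.var() / (img.mean() ** 2 + 1e-12)

def track_3d(stack, z_positions, threshold):
    """stack: (n_z, H, W) reconstructions of one hologram at trial depths."""
    best = int(np.argmax([contrast(img) for img in stack]))
    z = z_positions[best]
    mask = stack[best] > threshold              # phase-map segmentation
    ys, xs = np.nonzero(mask)
    return xs.mean(), ys.mean(), z              # centroid + refocused depth

rng = np.random.default_rng(3)
stack = rng.random((11, 64, 64))
x, y, z = track_3d(stack, z_positions=np.linspace(-50, 50, 11), threshold=0.99)
print(f"particle at x={x:.1f}px, y={y:.1f}px, z={z:.1f}um")
```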
2012-09-10
Advanced Technology Research Center, U.S. Army Medical Research and Materiel Command, Fort Detrick, Maryland 21702, United States. ABSTRACT: Toxicological ... species. Thus, it is more advantageous to predict the toxicological effects of a compound on humans directly from the human toxicological data of related ... compounds. However, many popular quantitative structure-activity relationship (QSAR) methods that build a single global model by fitting all training ...
Systems Biology-Driven Hypotheses Tested In Vivo: The Need for Advancing Molecular Imaging Tools.
Verma, Garima; Palombo, Alessandro; Grigioni, Mauro; La Monaca, Morena; D'Avenio, Giuseppe
2018-01-01
Processing and interpretation of biological images may provide invaluable insights on complex, living systems because images capture the overall dynamics as a "whole." Therefore, "extraction" of key, quantitative morphological parameters could be, at least in principle, helpful in building a reliable systems biology approach to understanding living objects. Molecular imaging tools for systems biology models have attained widespread usage in modern experimental laboratories. Here, we provide an overview of advances in computational technology and the different instrumentations focused on molecular image processing and analysis. Quantitative data analysis through various open-source software packages and algorithmic protocols will provide a novel approach for modeling the experimental research program. Besides this, we also highlight predicted future trends regarding methods for automatically analyzing biological data. Such tools will be very useful for understanding the detailed biological and mathematical representations underlying in-silico systems biology processes and their modeling properties.
Metrology Standards for Quantitative Imaging Biomarkers
Obuchowski, Nancy A.; Kessler, Larry G.; Raunig, David L.; Gatsonis, Constantine; Huang, Erich P.; Kondratovich, Marina; McShane, Lisa M.; Reeves, Anthony P.; Barboriak, Daniel P.; Guimaraes, Alexander R.; Wahl, Richard L.
2015-01-01
Although investigators in the imaging community have been active in developing and evaluating quantitative imaging biomarkers (QIBs), the development and implementation of QIBs have been hampered by the inconsistent or incorrect use of terminology or methods for technical performance and statistical concepts. Technical performance is an assessment of how a test performs in reference objects or subjects under controlled conditions. In this article, some of the relevant statistical concepts are reviewed, methods that can be used for evaluating and comparing QIBs are described, and some of the technical performance issues related to imaging biomarkers are discussed. More consistent and correct use of terminology and study design principles will improve clinical research, advance regulatory science, and foster better care for patients who undergo imaging studies. © RSNA, 2015 PMID:26267831
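As a concrete example of a technical-performance assessment of the kind the article discusses, test-retest repeatability is often summarized by the repeatability coefficient RC = 2.77 × wSD, where wSD is the within-subject standard deviation. The sketch below uses synthetic test-retest pairs, not data from the article.

```python
# Sketch of a repeatability analysis for a quantitative imaging biomarker;
# subjects and measurement noise are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(4)
truth = rng.normal(10.0, 2.0, size=50)            # 50 subjects' true values
test = truth + rng.normal(0, 0.4, size=50)        # measurement 1
retest = truth + rng.normal(0, 0.4, size=50)      # measurement 2

wsd = np.sqrt(np.mean((test - retest) ** 2) / 2)  # within-subject SD from pairs
rc = 2.77 * wsd
print(f"wSD = {wsd:.3f}; RC = {rc:.3f} (95% of repeat differences within ±RC)")
```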
Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.
Mantle, M D
2011-09-30
The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
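The relaxation effects described in the first half of the review enter image quantitation through the idealized spin-echo signal equation S = S0 (1 − exp(−TR/T1)) exp(−TE/T2). A minimal sketch of how T1 and T2 weighting can bias apparent signal between regions of equal spin density follows; all parameter values are assumed for illustration.

```python
# Sketch of T1/T2 weighting bias using the idealized spin-echo signal equation.
import numpy as np

def spin_echo_signal(s0, tr_ms, te_ms, t1_ms, t2_ms):
    """Idealized spin-echo signal: S = S0 * (1 - exp(-TR/T1)) * exp(-TE/T2)."""
    return s0 * (1 - np.exp(-tr_ms / t1_ms)) * np.exp(-te_ms / t2_ms)

# Two regions with equal spin density but different relaxation times:
s_a = spin_echo_signal(1.0, tr_ms=1000, te_ms=20, t1_ms=800, t2_ms=80)
s_b = spin_echo_signal(1.0, tr_ms=1000, te_ms=20, t1_ms=300, t2_ms=40)
print(f"apparent signals: {s_a:.3f} vs {s_b:.3f} despite equal spin density")
```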
MO-G-12A-01: Quantitative Imaging Metrology: What Should Be Assessed and How?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Giger, M; Petrick, N; Obuchowski, N
The first two symposia in the Quantitative Imaging Track focused on 1) the introduction of quantitative imaging (QI) challenges and opportunities, and QI efforts of agencies and organizations such as the RSNA, NCI, FDA, and NIST, and 2) the techniques, applications, and challenges of QI, with specific examples from CT, PET/CT, and MR. This third symposium in the QI Track will focus on metrology and its importance in successfully advancing the QI field. While the specific focus will be on QI, many of the concepts presented are more broadly applicable to many areas of medical physics research and applications. As such, the topics discussed should be of interest to medical physicists involved in imaging as well as therapy. The first talk of the session will focus on the introduction to metrology and why it is critically important in QI. The second talk will focus on appropriate methods for technical performance assessment. The third talk will address statistically valid methods for algorithm comparison, a common problem not only in QI but also in other areas of medical physics. The final talk in the session will address strategies for publication of results that will allow statistically valid meta-analyses, which is critical for combining results of individual studies with typically small sample sizes in a manner that can best inform decisions and advance the field. Learning Objectives: Understand the importance of metrology in the QI efforts. Understand appropriate methods for technical performance assessment. Understand methods for comparing algorithms with or without reference data (i.e., “ground truth”). Understand the challenges and importance of reporting results in a manner that allows for statistically valid meta-analyses.
Chapter 16: Lignin Visualization: Advanced Microscopy Techniques for Lignin Characterization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, Yining; Donohoe, Bryon S
Visualization of lignin in plant cell walls, with both spatial and chemical resolution, is emerging as an important tool to understand lignin's role in the plant cell wall's nanoscale architecture and to understand and design processes intended to modify the lignin. As such, this chapter reviews recent advances in advanced imaging methods with respect to lignin in plant cell walls. This review focuses on the importance of lignin detection and localization for studies in both plant biology and biotechnology. Challenges going forward to identify and delineate lignin from other plant cell wall components and to quantitatively analyze lignin in whole cell walls from native plant tissue and treated biomass are also discussed.
ERIC Educational Resources Information Center
Nelson, Regina K.
2013-01-01
A mixed-methods research study was designed to test whether undergraduate engineering students were better prepared to learn advanced topics in biomedical engineering if they learned physiology via a quantitative, concept-based approach rather than a qualitative, system-based approach. Experiments were conducted with undergraduate engineering…
ERIC Educational Resources Information Center
Brown, N. Anthony; Solovieva, Raissa V.; Eggett, Dennis L.
2011-01-01
This research describes a method applied at a U.S. university in a third-year Russian language course designed to facilitate Advanced and Superior second language writing proficiency through the forum of argumentation and debate. Participants had extensive informal language experience living in a Russian-speaking country but comparatively little…
ERIC Educational Resources Information Center
Ojose, Bobby
2015-01-01
This study investigated proportional reasoning and the related concepts of decimal, percent, and ratio. In particular, the research focused on analyzing the gaps and understandings that grade 6, 7, and 8 students have, and advanced factors accounting for such gaps and understandings. The study employed a mixed methods approach in which quantitative data was…
USDA-ARS?s Scientific Manuscript database
Backcross breeding is an important method to improve elite cultivars for traits controlled by a small number of loci but has been used less frequently to improve quantitatively controlled traits. Resistances to Fusarium ear rot and contamination by the associated mycotoxin fumonisin in maize are qua...
ERIC Educational Resources Information Center
Preacher, Kristopher J.; Kelley, Ken
2011-01-01
The statistical analysis of mediation effects has become an indispensable tool for helping scientists investigate processes thought to be causal. Yet, in spite of many recent advances in the estimation and testing of mediation effects, little attention has been given to methods for communicating effect size and the practical importance of those…
A Quantitative Analysis of Organizational Factors That Relate to Data Mining Success
ERIC Educational Resources Information Center
Huebner, Richard A.
2017-01-01
The ubiquity of data in various forms has fueled the need for advanced data-mining techniques within organizations. The advent of data mining methods used to uncover hidden nuggets of information buried within large data sets has also fueled the need for determining how these unique projects can be successful. There are many challenges associated…
System safety education focused on flight safety
NASA Technical Reports Server (NTRS)
Holt, E.
1971-01-01
The measures necessary for achieving higher levels of system safety are analyzed with an eye toward maintaining the combat capability of the Air Force. Several education courses were provided for personnel involved in safety management. Data include: (1) Flight Safety Officer Course, (2) Advanced Safety Program Management, (3) Fundamentals of System Safety, and (4) Quantitative Methods of Safety Analysis.
Cargo, Margaret; Harris, Janet; Pantoja, Tomas; Booth, Andrew; Harden, Angela; Hannes, Karin; Thomas, James; Flemming, Kate; Garside, Ruth; Noyes, Jane
2018-05-01
This article provides reviewers with guidance on methods for identifying and processing evidence to understand intervention implementation. Strategies, tools, and methods are applied to the systematic review process to illustrate how process and implementation can be addressed using quantitative, qualitative, and other sources of evidence (i.e., descriptive textual and nonempirical). Reviewers can take steps to navigate the heterogeneity and level of uncertainty present in the concepts, measures, and methods used to assess implementation. Activities can be undertaken in advance of a Cochrane quantitative review to develop program theory and logic models that situate implementation in the causal chain. Four search strategies are offered to retrieve process and implementation evidence. Recommendations are made for addressing rigor or risk of bias in process evaluation or implementation evidence. Strategies are recommended for locating and extracting data from primary studies. The basic logic is presented to assist reviewers to make initial review-level judgments about implementation failure and theory failure. Although strategies, tools, and methods can assist reviewers to address process and implementation using quantitative, qualitative, and other forms of evidence, few exemplar reviews exist. There is a need for further methodological development and trialing of proposed approaches. Copyright © 2017 Elsevier Inc. All rights reserved.
Landslide hazard assessment: recent trends and techniques.
Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S
2013-01-01
Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic, and multi-criteria decision-making processes. However, no single method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare results in order to find the best-suited model. This paper presents a review of research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote Sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful in detecting, mapping, and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geo-spatial technologies have opened the doors for detailed and accurate assessment of landslide hazards.
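Many of the semi-quantitative LHZ methods surveyed reduce to a weighted overlay of rated causative-factor rasters. A minimal sketch follows; the factor ratings, weights (e.g., as might come from a multi-criteria method such as AHP), and zone breaks are all illustrative assumptions, not values from the review.

```python
# Sketch of a weighted-overlay landslide hazard zonation on synthetic rasters.
import numpy as np

rng = np.random.default_rng(5)
slope = rng.integers(1, 6, size=(100, 100))       # factor ratings 1..5
lithology = rng.integers(1, 6, size=(100, 100))
landuse = rng.integers(1, 6, size=(100, 100))

weights = {"slope": 0.5, "lithology": 0.3, "landuse": 0.2}  # assumed weights
lhz = (weights["slope"] * slope
       + weights["lithology"] * lithology
       + weights["landuse"] * landuse)

# Classify the composite index into hazard zones by quantile breaks.
zones = np.digitize(lhz, np.quantile(lhz, [0.5, 0.8, 0.95]))
print("cells per zone (low..very high):", np.bincount(zones.ravel()))
```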
NASA Astrophysics Data System (ADS)
Sugano, Koji; Ikegami, Kohei; Isono, Yoshitada
2017-06-01
In this paper, a method for characterizing Raman enhancement for highly sensitive and quantitative surface-enhanced Raman spectroscopy (SERS) is reported. A particle dimer shows a marked electromagnetic enhancement when the particle connection direction is matched to the polarization direction of the incident light. In this study, dimers were arrayed by nanotrench-guided self-assembly to produce a marked total Raman enhancement. By measuring acetonedicarboxylic acid, the fabricated structures were characterized for SERS as a function of the polarization angle relative to the particle connection direction. This indicates that the fabricated structures produce an effective SERS enhancement, dominated by the electromagnetic enhancement. Then, we measured 4,4′-bipyridine, a pesticide material, for quantitative analysis. In advance, we evaluated the enhancement of the particle structure by Raman measurement of acetonedicarboxylic acid. Finally, we compared the Raman intensities of acetonedicarboxylic acid and 4,4′-bipyridine; their intensities showed good correlation. The advantage of evaluating the substrate enhancement in advance was thus demonstrated. The developed SERS characterization method is expected to be applied to various quantitative trace analyses of molecules with high sensitivity.
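The substrate enhancement evaluated in advance is conventionally summarized by the analyte enhancement factor EF = (I_SERS/N_SERS)/(I_ref/N_ref). A tiny worked example with placeholder intensities and molecule counts (not the paper's measurements):

```python
# Worked example of the conventional SERS enhancement factor; all values
# are invented placeholders for illustration.
i_sers, n_sers = 5.0e4, 1.0e6    # SERS counts and molecules probed
i_ref, n_ref = 2.0e3, 1.0e10     # normal-Raman counts and molecules probed

ef = (i_sers / n_sers) / (i_ref / n_ref)
print(f"enhancement factor ~ {ef:.1e}")   # ~2.5e5 for these placeholders
```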
Choi, Seo Yeon; Yang, Nuri; Jeon, Soo Kyung; Yoon, Tae Hyun
2014-09-01
In this study, we have demonstrated the feasibility of a semi-quantitative approach for the estimation of cellular SiO2 nanoparticles (NPs), based on flow cytometry measurements of their normalized side scattering intensity. In order to improve our understanding of the quantitative aspects of cell-nanoparticle interactions, flow cytometry, transmission electron microscopy, and X-ray fluorescence experiments were carefully performed on HeLa cells exposed to SiO2 NPs with different core diameters, hydrodynamic sizes, and surface charges. Based on the observed relationships among the experimental data, a semi-quantitative method for estimating cellular SiO2 NPs from their normalized side scattering and core diameters was proposed, which can be applied for the determination of cellular SiO2 NPs within their size-dependent linear ranges. © 2014 International Society for Advancement of Cytometry.
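A sketch of the semi-quantitative estimation idea: within the size-dependent linear range, calibrate normalized side-scatter intensity against an independent measure of cellular NP content (e.g., X-ray fluorescence) and invert the fit for new cells. All values below are synthetic placeholders, not the study's calibration data.

```python
# Sketch of a linear side-scatter calibration for cellular NP estimation.
import numpy as np

cellular_np_pg = np.array([0.0, 5.0, 10.0, 20.0, 40.0])   # e.g. from XRF
rng = np.random.default_rng(6)
norm_ssc = 1.0 + 0.05 * cellular_np_pg + rng.normal(0, 0.02, 5)

slope, intercept = np.polyfit(cellular_np_pg, norm_ssc, 1)  # calibration fit
estimate = (1.62 - intercept) / slope      # invert for a new cell with SSC 1.62
print(f"estimated cellular SiO2 NPs ~ {estimate:.1f} pg")
```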
Quantitative Modeling of Earth Surface Processes
NASA Astrophysics Data System (ADS)
Pelletier, Jon D.
This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
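In the spirit of the book's progression from core equations to small programs, here is a minimal explicit finite-difference sketch of linear hillslope diffusion, ∂z/∂t = D ∂²z/∂x², applied to an initial scarp. The grid, diffusivity, and time span are illustrative choices, not an exercise from the text.

```python
# Explicit finite-difference sketch of linear hillslope diffusion on a scarp.
import numpy as np

D = 0.01                    # diffusivity, m^2/yr (assumed)
dx, dt = 1.0, 10.0          # grid spacing (m), time step (yr); dt < dx^2/(2D)
x = np.arange(0.0, 100.0, dx)
z = np.where(x < 50.0, 10.0, 0.0)     # initial fault-scarp profile

for _ in range(1000):       # evolve 10,000 years
    z[1:-1] += D * dt * (z[2:] - 2 * z[1:-1] + z[:-2]) / dx**2

print(f"max slope after 10 kyr: {np.abs(np.diff(z)).max() / dx:.3f}")
```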
Highlights of NASA's Role in Developing State-of-the-Art Nondestructive Evaluation for Composites
NASA Technical Reports Server (NTRS)
2001-01-01
Since the 1970s, when the promise of composites was being pursued for aeronautics applications, NASA has had programs that addressed the development of NDE methods for composites. These efforts included both microscopic and macroscopic NDE. At the microscopic level, NDE investigations interrogated composites at the submicron to micron level to understand a composite's microstructure. A novel microfocus CT system was developed, as was the science underlying the application of acoustic microscopy to a composite's component material properties. On the macroscopic scale, NDE techniques were developed that made inspection capabilities faster and more quantitative. Techniques such as stiffness imaging, ultrasonic arrays, laser-based ultrasound, advanced acoustic emission, thermography, and novel health monitoring systems were researched. Underlying these methods has been a strong modeling capability that has aided in method development.
A Method for Label-Free, Differential Top-Down Proteomics.
Ntai, Ioanna; Toby, Timothy K; LeDuc, Richard D; Kelleher, Neil L
2016-01-01
Biomarker discovery in translational research has heavily relied on labeled and label-free quantitative bottom-up proteomics. Here, we describe a new approach to biomarker studies that utilizes high-throughput top-down proteomics and is the first to offer whole-protein characterization and relative quantitation within the same experiment. Using yeast as a model, we report procedures for a label-free approach to quantify the relative abundance of intact proteins ranging from 0 to 30 kDa in two different states. In this chapter, we describe the integrated methodology for the large-scale profiling and quantitation of the intact proteome by liquid chromatography-mass spectrometry (LC-MS) without the need for metabolic or chemical labeling. This recent advance for quantitative top-down proteomics is best implemented with a robust and highly controlled sample preparation workflow before data acquisition on a high-resolution mass spectrometer, and the application of a hierarchical linear statistical model to account for the multiple levels of variance contained in quantitative proteomic comparisons of samples for basic and clinical research.
Using Active Learning to Teach Concepts and Methods in Quantitative Biology.
Waldrop, Lindsay D; Adolph, Stephen C; Diniz Behn, Cecilia G; Braley, Emily; Drew, Joshua A; Full, Robert J; Gross, Louis J; Jungck, John A; Kohler, Brynja; Prairie, Jennifer C; Shtylla, Blerta; Miller, Laura A
2015-11-01
This article provides a summary of the ideas discussed at the 2015 Annual Meeting of the Society for Integrative and Comparative Biology society-wide symposium on Leading Students and Faculty to Quantitative Biology through Active Learning. It also includes a brief review of the recent advancements in incorporating active learning approaches into quantitative biology classrooms. We begin with an overview of recent literature that shows that active learning can improve students' outcomes in Science, Technology, Engineering and Math Education disciplines. We then discuss how this approach can be particularly useful when teaching topics in quantitative biology. Next, we describe some of the recent initiatives to develop hands-on activities in quantitative biology at both the graduate and the undergraduate levels. Throughout the article we provide resources for educators who wish to integrate active learning and technology into their classrooms. © The Author 2015. Published by Oxford University Press on behalf of the Society for Integrative and Comparative Biology. All rights reserved. For permissions please email: journals.permissions@oup.com.
Campbell, David J T; Tam-Tham, Helen; Dhaliwal, Kirnvir K; Manns, Braden J; Hemmelgarn, Brenda R; Sanmartin, Claudia; King-Shier, Kathryn
2017-01-01
Mixed methods research, the use of both qualitative and quantitative methods within 1 program of study, is becoming increasingly popular to allow investigators to explore patient experiences (qualitative) and also measure outcomes (quantitative). Coronary artery disease and its risk factors are some of the most studied conditions; however, the extent to which mixed methods studies are being conducted in these content areas is unknown. We sought to comprehensively describe the characteristics of published mixed methods studies on coronary artery disease and major risk factors (diabetes mellitus and hypertension). We conducted a scoping review of the literature indexed in PubMed, Medline, EMBASE, and CINAHL. We identified 811 abstracts for screening, of which 254 articles underwent full-text review and 97 reports of 81 studies met criteria for inclusion. The majority of studies in this area were conducted in the past 10 years by nurse researchers from the United States and United Kingdom. Diabetes mellitus was the most common content area for mixed methods investigation (compared with coronary artery disease and hypertension). Most authors described their rationale for using mixed methods as complementarity and did not describe study priority or how they reconciled differences in methodological paradigms. Some mixed methods study designs were more commonly used than others, including concurrent timing and integration at the interpretation stage. Qualitative strands were most commonly descriptive studies using interviews for data collection. Quantitative strands were most commonly cross-sectional observational studies, which relied heavily on self-report data such as surveys and scales. Although mixed methods research is becoming increasingly popular in the area of coronary artery disease and its risk factors, many of the more advanced mixed methods, qualitative, and quantitative techniques have not been commonly used in these areas. © 2016 American Heart Association, Inc.
Quantitative live-cell imaging of human immunodeficiency virus (HIV-1) assembly.
Baumgärtel, Viola; Müller, Barbara; Lamb, Don C
2012-05-01
Advances in fluorescence methodologies make it possible to investigate biological systems in unprecedented detail. Over the last few years, quantitative live-cell imaging has increasingly been used to study the dynamic interactions of viruses with cells and is expected to become even more indispensable in the future. Here, we describe different fluorescence labeling strategies that have been used to label HIV-1 for live cell imaging and the fluorescence based methods used to visualize individual aspects of virus-cell interactions. This review presents an overview of experimental methods and recent experiments that have employed quantitative microscopy in order to elucidate the dynamics of late stages in the HIV-1 replication cycle. This includes cytosolic interactions of the main structural protein, Gag, with itself and the viral RNA genome, the recruitment of Gag and RNA to the plasma membrane, virion assembly at the membrane and the recruitment of cellular proteins involved in HIV-1 release to the nascent budding site.
Quantitative Live-Cell Imaging of Human Immunodeficiency Virus (HIV-1) Assembly
Baumgärtel, Viola; Müller, Barbara; Lamb, Don C.
2012-01-01
Advances in fluorescence methodologies make it possible to investigate biological systems in unprecedented detail. Over the last few years, quantitative live-cell imaging has increasingly been used to study the dynamic interactions of viruses with cells and is expected to become even more indispensable in the future. Here, we describe different fluorescence labeling strategies that have been used to label HIV-1 for live cell imaging and the fluorescence based methods used to visualize individual aspects of virus-cell interactions. This review presents an overview of experimental methods and recent experiments that have employed quantitative microscopy in order to elucidate the dynamics of late stages in the HIV-1 replication cycle. This includes cytosolic interactions of the main structural protein, Gag, with itself and the viral RNA genome, the recruitment of Gag and RNA to the plasma membrane, virion assembly at the membrane and the recruitment of cellular proteins involved in HIV-1 release to the nascent budding site. PMID:22754649
Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
Turner, Cameron R; Miller, Derryl J; Coyne, Kathryn J; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species.
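A generic sketch of how a qPCR assay of this kind is made quantitative: Cq values from a dilution series of standards define a log-linear calibration, whose slope gives amplification efficiency and whose inversion converts a sample's Cq to copy number. The Cq values below are invented for illustration, not the study's calibration data.

```python
# Sketch of qPCR standard-curve quantification with invented Cq values.
import numpy as np

log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)   # standard dilutions
cq = np.array([18.1, 21.5, 24.9, 28.4, 31.8])           # measured Cq values

slope, intercept = np.polyfit(log10_copies, cq, 1)      # ~ -3.32 if 100% efficient
efficiency = 10 ** (-1 / slope) - 1

sample_cq = 27.0
sample_copies = 10 ** ((sample_cq - intercept) / slope)
print(f"efficiency ~ {efficiency:.1%}; sample ~ {sample_copies:.0f} copies/reaction")
```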
Guo, Baoshan; Lei, Cheng; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke
2017-05-01
The development of reliable, sustainable, and economical sources of alternative fuels to petroleum is required to tackle the global energy crisis. One such alternative is microalgal biofuel, which is expected to play a key role in reducing the detrimental effects of global warming as microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid amounts and fail to characterize a diverse population of microalgal cells with single-cell resolution in a non-invasive and interference-free manner. Here, high-throughput label-free single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy was demonstrated. In particular, Euglena gracilis, an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets, was investigated. The optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch quantitative phase microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase maps of every single cell at a high throughput of 10,000 cells/s, enabling accurate cell classification without the need for fluorescent staining. Specifically, the dataset was used to characterize heterogeneous populations of E. gracilis cells under two different culture conditions (nitrogen-sufficient and nitrogen-deficient) and achieve cell classification with an error rate of only 2.15%. The method holds promise as an effective analytical tool for microalgae-based biofuel production. © 2017 International Society for Advancement of Cytometry.
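A schematic stand-in for the classification step: train a classifier on per-cell features derived from the opacity and phase maps and report the held-out error rate. The features, data, and logistic-regression model are assumptions for illustration; the authors' pipeline used machine learning on time-stretch images and reported a 2.15% error rate.

```python
# Sketch: classify culture conditions from synthetic per-cell opacity/phase
# features; not the authors' model or data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 2000  # cells per culture condition
# Assumed features per cell: [mean phase, phase variance, mean opacity]
nitrogen_sufficient = rng.normal([1.0, 0.20, 0.5], 0.1, size=(n, 3))
nitrogen_deficient = rng.normal([1.3, 0.35, 0.6], 0.1, size=(n, 3))  # more lipid

X = np.vstack([nitrogen_sufficient, nitrogen_deficient])
y = np.repeat([0, 1], n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print(f"held-out error rate: {1 - clf.score(X_te, y_te):.2%}")
```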
Yiannakas, Marios C; Tozer, Daniel J; Schmierer, Klaus; Chard, Declan T; Anderson, Valerie M; Altmann, Daniel R; Miller, David H; Wheeler-Kingshott, Claudia A M
2013-05-01
There are modest correlations between multiple sclerosis (MS) disability and white matter lesion (WML) volumes, as measured by T2-weighted (T2w) magnetic resonance imaging (MRI) scans (T2-WML). This may partly reflect pathological heterogeneity in WMLs, which is not apparent on T2w scans. To determine if ADvanced IMage Algebra (ADIMA), a novel MRI post-processing method, can reveal WML heterogeneity from proton-density weighted (PDw) and T2w images. We obtained conventional PDw and T2w images from 10 patients with relapsing-remitting MS (RRMS) and ADIMA images were calculated from these. We classified all WML into bright (ADIMA-b) and dark (ADIMA-d) sub-regions, which were segmented. We obtained conventional T2-WML and T1-WML volumes for comparison, as well as the following quantitative magnetic resonance parameters: magnetisation transfer ratio (MTR), T1 and T2. Also, we assessed the reproducibility of the segmentation for ADIMA-b, ADIMA-d and T2-WML. Our study's ADIMA-derived volumes correlated with conventional lesion volumes (p < 0.05). ADIMA-b exhibited higher T1 and T2, and lower MTR than the T2-WML (p < 0.001). Despite the similarity in T1 values between ADIMA-b and T1-WML, these regions were only partly overlapping with each other. ADIMA-d exhibited quantitative characteristics similar to T2-WML; however, they were only partly overlapping. Mean intra- and inter-observer coefficients of variation for ADIMA-b, ADIMA-d and T2-WML volumes were all < 6 % and < 10 %, respectively. ADIMA enabled the simple classification of WML into two groups having different quantitative magnetic resonance properties, which can be reproducibly distinguished.
Xu, Leilei; Wang, Fang; Xu, Ying; Wang, Yi; Zhang, Cuiping; Qin, Xue; Yu, Hongxiu; Yang, Pengyuan
2015-12-07
As a key post-translational modification mechanism, protein acetylation plays critical roles in regulating and/or coordinating cell metabolism. Acetylation is a prevalent modification process in enzymes. Protein acetylation modification occurs in sub-stoichiometric amounts; therefore, extracting biologically meaningful information from these acetylation sites requires an adaptable, sensitive, specific, and robust method for their quantification. In this work, we combine immunoassays and multiple reaction monitoring-mass spectrometry (MRM-MS) technology to develop an absolute quantification method for acetylation modification. With this hybrid method, we quantified the acetylation level of metabolic enzymes, which could demonstrate the regulatory mechanisms of the studied enzymes. The development of this quantitative workflow is a pivotal step for advancing our knowledge and understanding of the regulatory effects of protein acetylation in physiology and pathophysiology.
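A tiny worked example of the usual arithmetic behind MRM-based absolute quantification with a stable-isotope-labeled (SIL) internal standard, offered as a generic illustration: the peak areas and spike amount are invented, and the paper's hybrid immunoassay-MRM workflow involves additional steps.

```python
# Generic SIL internal-standard quantification arithmetic (values invented).
area_light = 8.4e5          # endogenous (acetylated) peptide peak area
area_heavy = 2.1e5          # SIL internal-standard peak area
spiked_fmol = 50.0          # amount of SIL standard spiked per sample

endogenous_fmol = (area_light / area_heavy) * spiked_fmol
print(f"endogenous peptide ~ {endogenous_fmol:.0f} fmol")
```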
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heyman, Heino M.; Zhang, Xing; Tang, Keqi
2016-02-16
Metabolomics is the quantitative analysis of all metabolites in a given sample. Due to the chemical complexity of the metabolome, optimal separations are required for comprehensive identification and quantification of sample constituents. This chapter provides an overview of both conventional and advanced separation methods in practice for reducing the complexity of metabolite extracts delivered to the mass spectrometer detector, and covers gas chromatography (GC), liquid chromatography (LC), capillary electrophoresis (CE), supercritical fluid chromatography (SFC), and ion mobility spectrometry (IMS) separation techniques coupled with mass spectrometry (MS), in both uni-dimensional and multi-dimensional approaches.
Applications of Microfluidics in Quantitative Biology.
Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang
2018-05-01
Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
The Use of Google Apps to Support Teaching and Learning: A UAE Case Study
ERIC Educational Resources Information Center
Ishtaiwa-Dweikat, Fawzi Fayez; Aburezeq, Ibtehal M.
2016-01-01
This study was designed to examine pre-service teachers' proficiency, use, and their perceptions of the benefits and barriers to the use of Google Apps to support their teaching and learning practices. Quantitative and qualitative methods were used. The findings showed that almost two-thirds (63%) of participants were advanced or expert users of…
ERIC Educational Resources Information Center
López, Nancy; Erwin, Christopher; Binder, Melissa; Chavez, Mario Javier
2018-01-01
We appeal to critical race theory and intersectionality to examine achievement gaps at a large public university in the American southwest from 2000 to 2015. Using white, high-income women as our reference group, we report linear combinations of marginal effects for six-year graduation rates and developmental course taking across 20 distinct…
How Are They Now? Longer Term Effects of eCoaching through Online Bug-in-Ear Technology
ERIC Educational Resources Information Center
Rock, Marcia L.; Schumacker, Randall E.; Gregg, Madeleine; Howard, Pamela W.; Gable, Robert A.; Zigmond, Naomi
2014-01-01
In this study, using mixed methods, we investigated the longer term effects of eCoaching through advanced online bug-in-ear (BIE) technology. Quantitative data on five dependent variables were extracted from 14 participants' electronically archived video files at three points in time--Spring 1 (i.e., baseline, which was the first semester of…
ERIC Educational Resources Information Center
Saunders, Daniel B.; Kolek, Ethan A.; Williams, Elizabeth A.; Wells, Ryan S.
2016-01-01
Previous research has found the field of higher education, particularly in the United States, is dominated by functionalist approaches, a preponderance of survey data, and the ubiquitous use of advanced quantitative methods to investigate educational phenomena. This descriptive study aims to illuminate why the field is constructed in this way.…
ERIC Educational Resources Information Center
Pogrow, Stanley
2017-01-01
One of the major successes of advanced quantitative methods has been their seeming ability to provide unbiased determinations of which education practices are effective for education in general and for improving the educational achievement and opportunity of the neediest students. The power of this methodology as applied in the top education…
Advanced Productivity Analysis Methods for Air Traffic Control Operations
1976-12-01
Routine Work; 4.2.2. Surveillance Work; 4.2.3. Conflict Processing Work ... (crossing and overtake conflicts) includes potential-conflict recognition, assessment, and resolution decision making and A/N voice communications ... makers to utilize quantitative and dynamic analysis as a tool for decision-making. 1.1.3 Types of Simulation Models: Although there are many ways to ...
ERIC Educational Resources Information Center
Carpenter, Serena; Hoag, Anne; Grant, August E.; Bowe, Brian J.
2015-01-01
The newsroom is a powerful influence in a journalist's identity formation. Research has yet to verify the socializing impact of academia. This research utilized the quantitative survey method, applying it to undergraduate journalism students (n = 798) to assess how academic status relates to students' degree motivations, life values, and technology…
ERIC Educational Resources Information Center
Plomin, Robert; Davis, Oliver S. P.
2009-01-01
Background: Much of what we thought we knew about genetics needs to be modified in light of recent discoveries. What are the implications of these advances for identifying genes responsible for the high heritability of many behavioural disorders and dimensions in childhood? Methods: Although quantitative genetics such as twin studies will continue…
Ohmido, Nobuko; Fukui, Kiichi; Kinoshita, Toshiro
2010-01-01
Fluorescence in situ hybridization (FISH) is an effective method for the physical mapping of genes and repetitive DNA sequences on chromosomes. Physical mapping of unique nucleotide sequences on specific rice chromosome regions was performed using a combination of chromosome identification and highly sensitive FISH. Increases in the detection sensitivity of smaller DNA sequences and improvements in spatial resolution have ushered in a new phase in FISH technology. Thus, it is now possible to perform in situ hybridization on somatic chromosomes, pachytene chromosomes, and even on extended DNA fibers (EDFs). Pachytene-FISH allows the integration of genetic linkage maps and quantitative chromosome maps. Visualization methods using FISH can reveal the spatial organization of the centromere, heterochromatin/euchromatin, and the terminal structures of rice chromosomes. Furthermore, EDF-FISH and the DNA combing technique can resolve a spatial distance of 1 kb between adjacent DNA sequences, and the detection of even a 300-bp target is now feasible. The copy numbers of various repetitive sequences and the sizes of various DNA molecules were quantitatively measured using the molecular combing technique. This review describes the significance of these advances in molecular cytology in rice and discusses future applications in plant studies using visualization techniques.
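The quantitative length measurements mentioned for EDF-FISH and molecular combing typically rely on the extension of B-form DNA, roughly 0.34 nm per base pair (about 2.94 kb per micron; combing can stretch fibers somewhat beyond this). A one-line conversion sketch with an assumed signal length:

```python
# Fiber-length-to-kilobase conversion; the 0.34 nm/bp extension of B-form DNA
# is the standard assumption, and the measured length is a placeholder.
signal_length_um = 12.5
kb_per_um = 1.0 / 0.34e-3 / 1000    # 0.34 nm/bp -> ~2.94 kb per micron

print(f"target size ~ {signal_length_um * kb_per_um:.0f} kb")
```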
Gangolli, Mihika; Holleran, Laurena; Kim, Joong Hee; Stein, Thor D.; Alvarez, Victor; McKee, Ann C.; Brody, David L.
2017-01-01
Advanced diffusion MRI methods have recently been proposed for detection of pathologies such as traumatic axonal injury and chronic traumatic encephalopathy, which commonly affect complex cortical brain regions. However, radiological-pathological correlations in human brain tissue that detail the relationship between the multi-component diffusion signal and the underlying pathology are lacking. We present a nonlinear, voxel-based, two-dimensional coregistration method that is useful for matching diffusion signals to quantitative metrics of high-resolution histological images. When validated in ex vivo human cortical tissue at a 250 × 250 × 500 micron spatial resolution, the method proved robust in correlations between generalized q-sampling imaging and histologically based white matter fiber orientations, with r = 0.94 for the primary fiber direction and r = 0.88 for the secondary fiber direction in each voxel. Importantly, however, the correlation was substantially worse with reduced spatial resolution or with fiber orientations derived using a diffusion tensor model. Furthermore, we have detailed a quantitative histological metric of white matter fiber integrity, termed power coherence, capable of distinguishing architecturally complex but intact white matter from disrupted white matter regions. These methods may allow for more sensitive and specific radiological-pathological correlations of neurodegenerative diseases affecting complex gray and white matter. PMID:28365421
Computer Aided Enzyme Design and Catalytic Concepts
Frushicheva, Maria P.; Mills, Matthew J. L.; Schopf, Patrick; Singh, Manoj K.; Warshel, Arieh
2014-01-01
Gaining a deeper understanding of enzyme catalysis is of great practical and fundamental importance. Over the years it has become clear that despite advances made in experimental mutational studies, a quantitative understanding of enzyme catalysis will not be possible without the use of computer modeling approaches. While we believe that electrostatic preorganization is by far the most important catalytic factor, convincing the wider scientific community of this may require the demonstration of effective rational enzyme design. Here we make the point that the main current advances in enzyme design are basically advances in directed evolution and that computer aided enzyme design must involve approaches that can reproduce catalysis in well-defined test cases. Such an approach is provided by the empirical valence bond method. PMID:24814389
Rautiainen, Jari; Nissi, Mikko J.; Salo, Elli-Noora; Tiitu, Virpi; Finnilä, Mikko A.J.; Aho, Olli-Matti; Saarakkala, Simo; Lehenkari, Petri; Ellermann, Jutta; Nieminen, Miika T.
2014-01-01
Purpose: To evaluate the sensitivity of quantitative MRI techniques (T1, T1,Gd, T2, continuous wave (CW) T1ρ dispersion, adiabatic T1ρ, adiabatic T2ρ, RAFF, and inversion-prepared magnetization transfer (MT)) for the assessment of human articular cartilage with varying degrees of natural degeneration. Methods: Osteochondral samples (n = 14) were obtained from the tibial plateaus of patients undergoing total knee replacement. MRI of the specimens was performed at 9.4 T and the relaxation time maps were evaluated in the cartilage zones. For reference, quantitative histology, OARSI grading, and biomechanical measurements were performed and correlated with MRI findings. Results: All MRI parameters, except T1,Gd, showed statistically significant differences in tangential and full-thickness ROIs between early and advanced osteoarthritis (OA) groups, as classified by OARSI grading. CW-T1ρ showed significant dispersion in all ROIs and featured the classical laminar structure of cartilage with spin-lock powers below 1000 Hz. Adiabatic T1ρ, T2ρ, CW-T1ρ, MT, and RAFF correlated strongly with OARSI grade and biomechanical parameters. Conclusion: MRI parameters were able to differentiate between early and advanced OA. Furthermore, rotating frame methods, namely adiabatic T1ρ, adiabatic T2ρ, CW-T1ρ and RAFF, as well as the MT experiment, correlated strongly with biomechanical parameters and OARSI grade, suggesting high sensitivity of the parameters for cartilage degeneration. PMID:25104181
Andreatta, Massimo; Schafer-Nielsen, Claus; Lund, Ole; Buus, Søren; Nielsen, Morten
2011-01-01
Recent advances in high-throughput technologies have made it possible to generate both gene and protein sequence data at an unprecedented rate and scale, thereby enabling entirely new "omics"-based approaches towards the analysis of complex biological processes. However, the amount and complexity of data that even a single experiment can produce seriously challenges researchers with limited bioinformatics expertise, who need to handle, analyze and interpret the data before it can be understood in a biological context. Thus, there is an unmet need for tools allowing non-bioinformatics users to interpret large data sets. We have recently developed a method, NNAlign, which is generally applicable to any biological problem where quantitative peptide data is available. This method efficiently identifies underlying sequence patterns by simultaneously aligning peptide sequences and identifying motifs associated with quantitative readouts. Here, we provide a web-based implementation of NNAlign allowing non-expert end-users to submit their data (optionally adjusting method parameters), and in return receive a trained method (including a visual representation of the identified motif) that subsequently can be used as a prediction method and applied to unknown proteins/peptides. We have successfully applied this method to several different data sets including peptide microarray-derived sets containing more than 100,000 data points. NNAlign is available online at http://www.cbs.dtu.dk/services/NNAlign. PMID:22073191
Køppe, Simo; Dammeyer, Jesper
2014-09-01
The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used; over the last 30 years, the use of more advanced statistical methods has grown explosively, and articles using no statistics, or only simple methods, have all but disappeared.
Tong, Haizhou; Gao, Feng; Yin, Jiapeng; Shi, Zehong; Song, Tao; Li, Haidong; Sun, Xiaomei; Wang, Yongqian; Yin, Ningbei; Zhao, Zhenmin
2015-11-01
Trans-sutural distraction osteogenesis (TSDO) is an alternative method for the early treatment of midfacial hypoplasia in growing patients with cleft lip and palate (CLP). The purpose of this study was to analyze three-dimensional (3D) midfacial skeletal changes after TSDO and to explore the mechanism in this process. All patients with nonsyndromic CLP who underwent bone-borne TSDO for midfacial hypoplasia from 2005 to 2014 were reviewed in this retrospective study. 3D morphological and quantitative measurement analyses were performed to evaluate midfacial skeletal changes by superimposition of preoperative and postoperative computed tomographic images. Twenty-six patients with a mean age of 11.5 years met the inclusion criteria. The 3D morphological findings exhibited the most significant suture stress changes at the pterygomaxillary suture area, with obvious bone generation in all patients. The whole midfacial skeleton advanced progressively in a craniocaudal direction along the midface segment, associated with morphological changes in the skeleton itself. The 3D quantitative measurement findings showed differential advancement of each landmark at the maxillary alveolus, zygomatic bone, orbital rim, and nasal bone, consistent with the morphological findings. TSDO allows rotation advancement of the midfacial skeleton to achieve occlusal correction and facial harmony through the mechanism of both suture remodeling and bone remodeling. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.
Wutich, Amber; Budds, Jessica; Eichelberger, Laura; Geere, Jo; Harris, Leila; Horney, Jennifer; Jepson, Wendy; Norman, Emma; O'Reilly, Kathleen; Pearson, Amber; Shah, Sameer; Shinn, Jamie; Simpson, Karen; Staddon, Chad; Stoler, Justin; Teodoro, Manuel P; Young, Sera
2017-11-01
Household water insecurity has serious implications for the health, livelihoods and wellbeing of people around the world. Existing methods to assess the state of household water insecurity focus largely on water quality, quantity or adequacy, source or reliability, and affordability. These methods have significant advantages in terms of their simplicity and comparability, but are widely recognized to oversimplify and underestimate the global burden of household water insecurity. In contrast, a broader definition of household water insecurity should include entitlements and human capabilities, sociocultural dynamics, and political institutions and processes. This paper proposes a mix of qualitative and quantitative methods that can be widely adopted across cultural, geographic, and demographic contexts to assess hard-to-measure dimensions of household water insecurity. In doing so, it critically evaluates existing methods for assessing household water insecurity and suggests ways in which methodological innovations advance a broader definition of household water insecurity.
NASA Astrophysics Data System (ADS)
Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.
2016-02-01
Two advanced, accurate, and precise chemometric methods were developed for the simultaneous determination of amlodipine besylate (AML) and atorvastatin calcium (ATV) in the presence of their acidic degradation products in tablet dosage forms. The first method was Partial Least Squares (PLS-1) and the second was Artificial Neural Networks (ANN). PLS was compared to ANN models with and without a variable selection procedure (genetic algorithm, GA). For proper analysis, a 5-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the interfering species. Fifteen mixtures were used as the calibration set and the other ten as the validation set to validate the prediction ability of the suggested models. The proposed methods were successfully applied to the analysis of pharmaceutical tablets containing AML and ATV. The results indicated the ability of the models to resolve the highly overlapped spectra of the quinary mixture while using an inexpensive and easy-to-handle instrument, the UV-VIS spectrophotometer.
Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan
2018-01-01
Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred titles/abstracts were reviewed in parallel, with the first 30 used for calibration and the subsequent 70 used to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full-text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
NASA Astrophysics Data System (ADS)
Darwish, Hany W.; Hassan, Said A.; Salem, Maissa Y.; El-Zeany, Badr A.
2014-03-01
Different chemometric models were applied for the quantitative analysis of Amlodipine (AML), Valsartan (VAL), and Hydrochlorothiazide (HCT) in a ternary mixture, namely Partial Least Squares (PLS) as a traditional chemometric model and Artificial Neural Networks (ANN) as an advanced model. PLS and ANN were applied with and without a variable selection procedure (genetic algorithm, GA) and a data compression procedure (principal component analysis, PCA). The chemometric methods applied were PLS-1, GA-PLS, ANN, GA-ANN, and PCA-ANN. The methods were used for the quantitative analysis of the drugs in raw materials and in a pharmaceutical dosage form by processing the UV spectral data. A 3-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the drugs. Fifteen mixtures were used as a calibration set and the other ten as a validation set to validate the prediction ability of the suggested methods. The validity of the proposed methods was assessed using the standard addition technique.
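For readers unfamiliar with the PLS-1 workflow these chemometric abstracts describe, a minimal sketch follows: calibrate a one-analyte PLS model on 15 mixtures and validate on 10, mirroring the split above. The Gaussian absorption bands, concentrations, and noise level are synthetic stand-ins, not the authors' data.

```python
# Minimal PLS-1 calibration/validation sketch on synthetic UV spectra.
# All spectra, concentrations, and noise levels are invented stand-ins.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
wavelengths = np.linspace(200, 400, 200)

def band(center, width):
    # Gaussian absorption band standing in for a pure-component spectrum.
    return np.exp(-((wavelengths - center) / width) ** 2)

pure = np.vstack([band(240, 15), band(280, 20), band(320, 12)])  # 3 analytes
conc = rng.uniform(0.2, 1.0, size=(25, 3))        # 25 synthetic mixtures
spectra = conc @ pure + rng.normal(0, 0.005, size=(25, 200))

X_cal, X_val = spectra[:15], spectra[15:]         # 15 calibration, 10 validation
y_cal, y_val = conc[:15, 0], conc[15:, 0]         # PLS-1: one analyte at a time

pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
rmsep = mean_squared_error(y_val, pls.predict(X_val).ravel()) ** 0.5
print(f"RMSEP, analyte 1: {rmsep:.4f}")
```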
Ultrasonic NDE Simulation for Composite Manufacturing Defects
NASA Technical Reports Server (NTRS)
Leckey, Cara A. C.; Juarez, Peter D.
2016-01-01
The increased use of composites in aerospace components is expected to continue into the future. The large-scale use of composites in aerospace necessitates the development of composite-appropriate nondestructive evaluation (NDE) methods to quantitatively characterize defects in as-manufactured parts and damage incurred during or after manufacturing. Ultrasonic techniques are one of the most common approaches for defect/damage detection in composite materials. One key technical challenge area included in NASA's Advanced Composites Project is to develop optimized rapid inspection methods for composite materials. Common manufacturing defects in carbon fiber reinforced polymer (CFRP) composites include fiber waviness (in-plane and out-of-plane), porosity, and disbonds, among others. This paper is an overview of ongoing work to develop ultrasonic wavefield based methods for characterizing manufacturing waviness defects. The paper describes the development and implementation of a custom ultrasound simulation tool that is used to model ultrasonic wave interaction with in-plane fiber waviness (also known as marcelling). Wavefield data processing methods are applied to the simulation data to explore possible routes for quantitative defect characterization.
NASA Technical Reports Server (NTRS)
Roth, Don J.; Kautz, Harold E.; Abel, Phillip B.; Whalen, Mike F.; Hendricks, J. Lynne; Bodis, James R.
2000-01-01
Surface topography, which significantly affects the performance of many industrial components, is normally measured with diamond-tip profilometry over small areas or with optical scattering methods over larger areas. To develop air-coupled surface profilometry, the NASA Glenn Research Center at Lewis Field initiated a Space Act Agreement with Sonix, Inc., through two Glenn programs, the Advanced High Temperature Engine Materials Program (HITEMP) and COMMTECH. The work resulted in quantitative surface topography profiles obtained using only high-frequency, focused ultrasonic pulses in air. The method is nondestructive, noninvasive, and noncontact, and it does not require light-reflective surfaces. Air surface profiling may be desirable when diamond-tip or laser-based methods are impractical, such as over large areas, when a significant depth range is required, or for curved surfaces. When the configuration is optimized, the method is reasonably rapid and all the quantitative analysis facilities are online, including two- and three-dimensional visualization, extreme value filtering (for faulty data), and leveling.
Genetics and child psychiatry: I. Advances in quantitative and molecular genetics.
Rutter, M; Silberg, J; O'Connor, T; Simonoff, E
1999-01-01
Advances in quantitative psychiatric genetics as a whole are reviewed with respect to conceptual and methodological issues in relation to statistical model fitting, new genetic designs, twin and adoptee studies, definition of the phenotype, pervasiveness of genetic influences, pervasiveness of environmental influences, shared and nonshared environmental effects, and nature-nurture interplay. Advances in molecular genetics are discussed in relation to the shifts in research strategies to investigate multifactorial disorders (affected relative linkage designs, association strategies, and quantitative trait loci studies); new techniques and identified genetic mechanisms (expansion of trinucleotide repeats, genomic imprinting, mitochondrial DNA, fluorescent in-situ hybridisation, behavioural phenotypes, and animal models); and the successful localisation of genes.
Probabilistic framework for product design optimization and risk management
NASA Astrophysics Data System (ADS)
Keski-Rahkonen, J. K.
2018-05-01
Probabilistic methods have gradually gained ground within engineering practice, but it is still the industry standard to use deterministic safety-margin approaches to dimension components and qualitative methods to manage product risks. These methods are suitable for baseline design work, but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and, furthermore, on how to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and aims to harness their best features and simplify the process so that it is applicable in practical engineering work. The recommended process applies the Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds to the existing literature by introducing a practical framework for using probabilistic models in quantitative risk management and product life cycle cost optimization. The main focus is on mechanical failure modes, due to the well-developed methods used to predict these types of failures. However, the same framework can be applied to any type of failure mode as long as predictive models can be developed.
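To make the load-resistance idea concrete, here is a minimal Monte Carlo sketch: it draws random resistance and load values, counts limit-state violations, and reports the failure probability with its sampling error. The distributions and parameters are illustrative assumptions, not values from the paper.

```python
# Monte Carlo failure-probability estimate on a load-resistance model.
# Distributions and parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
n = 1_000_000

resistance = rng.lognormal(mean=np.log(500.0), sigma=0.08, size=n)  # strength, MPa
load = rng.normal(loc=400.0, scale=30.0, size=n)                    # stress, MPa

failures = load > resistance            # limit state g = R - L < 0
p_f = failures.mean()
se = np.sqrt(p_f * (1.0 - p_f) / n)     # standard error of the estimate
print(f"estimated P(failure) = {p_f:.3e} +/- {se:.1e}")
```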
Geerts, Hugo; Dacks, Penny A; Devanarayan, Viswanath; Haas, Magali; Khachaturian, Zaven S; Gordon, Mark Forrest; Maudsley, Stuart; Romero, Klaus; Stephenson, Diane
2016-09-01
Massive investment and technological advances in the collection of extensive and longitudinal information on thousands of Alzheimer patients result in large amounts of data. These "big-data" databases can potentially advance CNS research and drug development. However, although necessary, they are not sufficient, and we posit that they must be matched with analytical methods that go beyond retrospective data-driven associations with various clinical phenotypes. Although these empirically derived associations can generate novel and useful hypotheses, they need to be organically integrated into a quantitative understanding of the pathology that can be actionable for drug discovery and development. We argue that mechanism-based modeling and simulation approaches, in which existing domain knowledge is formally integrated using complexity science and quantitative systems pharmacology, can be combined with data-driven analytics to generate predictive, actionable knowledge for drug discovery programs, target validation, and optimization of clinical development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Owens, Susan T.
2017-01-01
Technology is becoming an integral tool in the classroom and can make a positive impact on how students learn. This quantitative comparative research study examined gender-based differences among secondary Advanced Placement (AP) Statistics students, comparing Educational Testing Service (ETS) College Board AP Statistics examination scores…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peyrin, Francoise; Attali, Dominique; Chappard, Christine
Purpose: Trabecular bone microarchitecture is made of a complex network of plate and rod structures evolving with age and disease. The purpose of this article is to propose a new 3D local analysis method for the quantitative assessment of parameters related to the geometry of trabecular bone microarchitecture. Methods: The method is based on the topologic classification of the medial axis of the 3D image into branches, rods, and plates. Thanks to the reversibility of the medial axis, the classification is next extended to the whole 3D image. Finally, the percentages of rods and plates as well as their mean thicknesses are calculated. The method was applied both to simulated test images and 3D micro-CT images of human trabecular bone. Results: The classification of simulated phantoms made of plates and rods shows that the maximum error in the quantitative percentages of plates and rods is less than 6% and smaller than with the structure model index (SMI). Micro-CT images of human femoral bone taken in osteoporosis and early or advanced osteoarthritis were analyzed. Despite the large physiological variability, the present method avoids the underestimation of rods observed with other local methods. The relative percentages of rods and plates were not significantly different between osteoarthritis and osteoporotic groups, whereas their absolute percentages reflected an increase of rod and plate thicknesses in advanced osteoarthritis, with higher relative and absolute numbers of nodes as well. Conclusions: The proposed method is model-independent, robust to surface irregularities, and enables geometrical characterization of not only skeletal structures but entire 3D images. Its application provided more accurate results than the standard SMI on simple simulated phantoms, but the discrepancy observed on the advanced osteoarthritis group raises questions that will require further investigations. The systematic use of such a local method in the characterization of trabecular bone samples could provide new insight into bone microarchitecture changes related to bone diseases or to those induced by drugs or therapy.
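The full plate/rod classification is beyond a short example, but the first steps of such a medial-axis analysis (skeletonizing a binary volume and reading local thickness off the distance map) can be sketched with SciPy and scikit-image. The smoothed random volume below is only a stand-in for a segmented micro-CT image, and this is not the authors' algorithm.

```python
# First steps of a medial-axis analysis of trabecular-like structures:
# skeletonize a binary volume, then sample local thickness from the
# Euclidean distance map on the skeleton.
import numpy as np
from scipy import ndimage
from skimage.morphology import skeletonize  # handles 3D in scikit-image >= 0.23
                                            # (older releases: skeletonize_3d)

rng = np.random.default_rng(1)
volume = ndimage.gaussian_filter(rng.random((64, 64, 64)), sigma=3) > 0.5

skeleton = skeletonize(volume).astype(bool)     # voxelized medial axis
dist = ndimage.distance_transform_edt(volume)   # distance to background

thickness = 2.0 * dist[skeleton]                # local structure thickness
print(f"skeleton voxels: {int(skeleton.sum())}, "
      f"mean local thickness: {thickness.mean():.2f} voxels")
```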
Missing data in forest ecology and management: advances in quantitative methods [Preface]
Tara Barrett; Matti Maltamo
2012-01-01
In recent years, substantial progress has been made in handling missing data issues for applications in forest ecology and management, particularly in the area of imputation techniques. A session on this topic was held at the XXIII IUFRO World Congress in Seoul, South Korea, on August 23-28, 2010, resulting in this special issue of six papers that address recent...
A Review of the Statistical and Quantitative Methods Used to Study Alcohol-Attributable Crime
Fitterer, Jessica L.; Nelson, Trisalyn A.
2015-01-01
Modelling the relationship between alcohol consumption and crime generates new knowledge for crime prevention strategies. Advances in data, particularly data with spatial and temporal attributes, have led to a growing suite of applied methods for modelling. In support of alcohol and crime researchers, we synthesized and critiqued existing methods of spatially and quantitatively modelling the effects of alcohol exposure on crime to aid method selection, and identify new opportunities for analysis strategies. We searched the alcohol-crime literature from 1950 to January 2014. Analyses that statistically evaluated or mapped the association between alcohol and crime were included. For modelling purposes, crime data were most often derived from generalized police reports, aggregated to large spatial units such as census tracts or postal codes, and standardized by residential population data. Sixty-eight of the 90 selected studies included geospatial data, of which 48 used cross-sectional datasets. Regression was the predominant modelling choice (n = 78), though many variations existed depending on the data. There are opportunities to improve information for alcohol-attributable crime prevention by using alternative population data to standardize crime rates, sourcing crime information from non-traditional platforms (social media), increasing the number of panel studies, and conducting analysis at the local level (neighbourhood, block, or point). Due to the spatio-temporal advances in crime data, we expect a continued uptake of flexible Bayesian hierarchical modelling, a greater inclusion of spatial-temporal point pattern analysis, and a shift toward prospective (forecast) modelling over small areas (e.g., blocks). PMID:26418016
Less is More: Membrane Protein Digestion Beyond Urea–Trypsin Solution for Next-level Proteomics
Zhang, Xi
2015-01-01
The goal of next-level bottom-up membrane proteomics is protein function investigation, via high-coverage high-throughput peptide-centric quantitation of expression, modifications and dynamic structures at systems scale. Yet efficient digestion of mammalian membrane proteins presents a daunting barrier, and prevalent day-long urea–trypsin in-solution digestion proved insufficient to reach this goal. Many efforts contributed incremental advances over past years, but involved protein denaturation that disconnected measurement from functional states. Beyond denaturation, the recent discovery of structure/proteomics omni-compatible detergent n-dodecyl-β-d-maltopyranoside, combined with pepsin and PNGase F columns, enabled breakthroughs in membrane protein digestion: a 2010 DDM-low-TCEP (DLT) method for H/D-exchange (HDX) using human G protein-coupled receptor, and a 2015 flow/detergent-facilitated protease and de-PTM digestions (FDD) for integrative deep sequencing and quantitation using full-length human ion channel complex. Distinguishing protein solubilization from denaturation, protease digestion reliability from theoretical specificity, and reduction from alkylation, these methods shifted day(s)-long paradigms into minutes, and afforded fully automatable (HDX)-protein-peptide-(tandem mass tag)-HPLC pipelines to instantly measure functional proteins at deep coverage, high peptide reproducibility, low artifacts and minimal leakage. Promoting—not destroying—structures and activities harnessed membrane proteins for the next-level streamlined functional proteomics. This review analyzes recent advances in membrane protein digestion methods and highlights critical discoveries for future proteomics. PMID:26081834
A Checklist for Successful Quantitative Live Cell Imaging in Systems Biology
Sung, Myong-Hee
2013-01-01
Mathematical modeling of signaling and gene regulatory networks has provided unique insights about systems behaviors for many cell biological problems of medical importance. Quantitative single cell monitoring has a crucial role in advancing systems modeling of molecular networks. However, due to the multidisciplinary techniques that are necessary for adaptation of such systems biology approaches, dissemination to a wide research community has been relatively slow. In this essay, I focus on some technical aspects that are often under-appreciated, yet critical in harnessing live cell imaging methods to achieve single-cell-level understanding and quantitative modeling of molecular networks. The importance of these technical considerations will be elaborated with examples of successes and shortcomings. Future efforts will benefit by avoiding some pitfalls and by utilizing the lessons collectively learned from recent applications of imaging in systems biology. PMID:24709701
Glycan reductive isotope labeling for quantitative glycomics.
Xia, Baoyun; Feasley, Christa L; Sachdev, Goverdhan P; Smith, David F; Cummings, Richard D
2009-04-15
Many diseases and disorders are characterized by quantitative and/or qualitative changes in complex carbohydrates. Mass spectrometry methods show promise in monitoring and detecting these important biological changes. Here we report a new glycomics method, termed glycan reductive isotope labeling (GRIL), where free glycans are derivatized by reductive amination with the differentially coded stable isotope tags [(12)C(6)]aniline and [(13)C(6)]aniline. These dual-labeled aniline-tagged glycans can be recovered by reverse-phase chromatography and can be quantified based on ultraviolet (UV) absorbance and relative ion abundances. Unlike previously reported isotopically coded reagents for glycans, GRIL does not contain deuterium, which can be chromatographically resolved. Our method shows no chromatographic resolution of differentially labeled glycans. Mixtures of differentially tagged glycans can be directly compared and quantified using mass spectrometric techniques. We demonstrate the use of GRIL to determine relative differences in glycan amount and composition. We analyze free glycans and glycans enzymatically or chemically released from a variety of standard glycoproteins, as well as human and mouse serum glycoproteins, using this method. This technique allows linear relative quantitation of glycans over a 10-fold concentration range and can accurately quantify sub-picomole levels of released glycans, providing a needed advancement in the field of glycomics. PMID:19454239
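The quantitation step of a GRIL-style experiment reduces to pairing light and heavy tagged peaks and taking intensity ratios. Below is a minimal sketch, assuming singly charged ions and an invented peak list; the 6.02013 Da spacing is simply six times the 13C-12C mass difference (6 × 1.003355 Da) between the two aniline tags.

```python
# Pair light/heavy GRIL-tagged glycan peaks and compute relative abundance.
# Peak list is invented; singly charged ions assumed.
HEAVY_SHIFT = 6.02013   # Da, [13C6]- minus [12C6]-aniline tag mass
TOL = 0.01              # Da, matching tolerance (instrument-dependent)

peaks = [               # (m/z, intensity)
    (1106.42, 8.0e5), (1112.44, 7.6e5),
    (1268.47, 2.1e5), (1274.49, 4.3e5),
]

for mz_l, i_light in peaks:
    for mz_h, i_heavy in peaks:
        if abs((mz_h - mz_l) - HEAVY_SHIFT) < TOL:
            print(f"glycan at m/z {mz_l:.2f}: light/heavy = {i_light / i_heavy:.2f}")
```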
Wound Healing Finally Enters the Age of Molecular Diagnostic Medicine
Tatum, Owatha L.; Dowd, Scot E.
2012-01-01
Background: Many wounds are difficult to heal because of the large, complex community of microbes present within the wound. The Problem: Classical laboratory culture methods do not provide an accurate picture of the microbial interactions or representation of microorganisms within a wound. There is an inherent bias in diagnosis based upon classical culture, stemming from the ability of certain organisms to thrive in culture while others are underrepresented or fail to be identified in culture altogether. Chronic wounds also contain polymicrobial infections existing as a cooperative community that is resistant to antibiotic therapy. Basic/Clinical Science Advances: New methods in molecular diagnostic medicine allow the identification of nearly all organisms present in a wound, irrespective of the ability of these organisms to be grown in culture. Advances in DNA analyses allow absolute identification of microorganisms from very small clinical specimens. These new methods also provide a quantitative representation of all microorganisms contributing to these polymicrobial infections. Clinical Care Relevance: Technological advances in laboratory diagnostics can significantly shorten the time required to heal chronic wounds. Identification of the genetic signatures of organisms present within a wound allows clinicians to identify and treat the primary organisms responsible for nonhealing wounds. Conclusion: Advanced genetic technologies targeting the specific needs of wound care patients are now accessible to all wound care clinicians. PMID:24527290
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klein, Adam
2015-01-01
This thesis presents work on advancements and applications of methodology for the analysis of biological samples using mass spectrometry. Included in this work are improvements to chemical cross-linking mass spectrometry (CXMS) for the study of protein structures and mass spectrometry imaging and quantitative analysis to study plant metabolites. Applications include using matrix-assisted laser desorption/ionization-mass spectrometry imaging (MALDI-MSI) to further explore metabolic heterogeneity in plant tissues and chemical interactions at the interface between plants and pests. Additional work was focused on developing liquid chromatography-mass spectrometry (LC-MS) methods to investigate metabolites associated with plant-pest interactions.
Dreizin, David; Nam, Arthur J; Hirsch, Jeffrey; Bernstein, Mark P
2018-06-20
This article reviews the conceptual framework, available evidence, and practical considerations pertaining to nascent and emerging advances in patient-centered CT imaging and CT-guided surgery for maxillofacial trauma. These include cinematic rendering (a novel method for advanced 3D visualization), incorporation of quantitative CT imaging into the assessment of orbital fractures, low-dose CT imaging protocols made possible with contemporary scanners and reconstruction techniques, the rapidly growing use of cone-beam CT, virtual fracture reduction with design software for surgical pre-planning, the use of 3D printing for fabricating models and implants, and new avenues in CT-guided computer-aided surgery.
The role of magnetic resonance imaging in the diagnosis of Parkinson's disease: a review.
Al-Radaideh, Ali M; Rababah, Eman M
2016-01-01
Parkinson's disease (PD) is the second most common neurodegenerative disease, after Alzheimer's, in elderly people. Different structural and functional neuroimaging methods play an important role in the early diagnosis of neurodegenerative diseases. This review discusses the role of magnetic resonance imaging (MRI) in the diagnosis of PD. MRI provides clinicians with structural and functional information about the human brain noninvasively. Advanced quantitative MRI techniques have shown promise for detecting pathological changes related to different stages of PD. Collectively, advanced MRI techniques at high and ultrahigh magnetic fields aid in a better understanding of the nature and progression of PD. Copyright © 2016 Elsevier Inc. All rights reserved.
Diffusion-Weighted Imaging Outside the Brain: Consensus Statement From an ISMRM-Sponsored Workshop
Taouli, Bachir; Beer, Ambros J.; Chenevert, Thomas; Collins, David; Lehman, Constance; Matos, Celso; Padhani, Anwar R.; Rosenkrantz, Andrew B.; Shukla-Dave, Amita; Sigmund, Eric; Tanenbaum, Lawrence; Thoeny, Harriet; Thomassin-Naggara, Isabelle; Barbieri, Sebastiano; Corcuera-Solano, Idoia; Orton, Matthew; Partridge, Savannah C.; Koh, Dow-Mu
2016-01-01
The significant advances in magnetic resonance imaging (MRI) hardware and software, sequence design, and postprocessing methods have made diffusion-weighted imaging (DWI) an important part of body MRI protocols and have fueled extensive research on quantitative diffusion outside the brain, particularly in the oncologic setting. In this review, we summarize the most up-to-date information on DWI acquisition and clinical applications outside the brain, as discussed in an ISMRM-sponsored symposium held in April 2015. We first introduce recent advances in acquisition, processing, and quality control; then review scientific evidence in major organ systems; and finally describe future directions. PMID:26892827
Highly Reproducible Label Free Quantitative Proteomic Analysis of RNA Polymerase Complexes
Mosley, Amber L.; Sardiu, Mihaela E.; Pattenden, Samantha G.; Workman, Jerry L.; Florens, Laurence; Washburn, Michael P.
2011-01-01
The use of quantitative proteomics methods to study protein complexes has the potential to provide in-depth information on the abundance of different protein components as well as their modification state in various cellular conditions. To interrogate protein complex quantitation using shotgun proteomic methods, we have focused on the analysis of protein complexes using label-free multidimensional protein identification technology and studied the reproducibility of biological replicates. For these studies, we focused on three highly related and essential multi-protein enzymes, RNA polymerase I, II, and III from Saccharomyces cerevisiae. We found that label-free quantitation using spectral counting is highly reproducible at the protein and peptide level when analyzing RNA polymerase I, II, and III. In addition, we show that peptide sampling does not follow a random sampling model, and we show the need for advanced computational models to predict peptide detection probabilities. In order to address these issues, we used the APEX protocol to model the expected peptide detectability based on whole cell lysate acquired using the same multidimensional protein identification technology analysis used for the protein complexes. Neither method was able to predict the peptide sampling levels that we observed using replicate multidimensional protein identification technology analyses. In addition to the analysis of the RNA polymerase complexes, our analysis provides quantitative information about several RNAP associated proteins including the RNAPII elongation factor complexes DSIF and TFIIF. Our data shows that DSIF and TFIIF are the most highly enriched RNAP accessory factors in Rpb3-TAP purifications and demonstrate our ability to measure low level associated protein abundance across biological replicates. In addition, our quantitative data supports a model in which DSIF and TFIIF interact with RNAPII in a dynamic fashion in agreement with previously published reports. PMID:21048197
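As one concrete example of the spectral-counting arithmetic behind such analyses, the normalized spectral abundance factor (NSAF) divides each protein's spectral counts by its length and renormalizes across the run. The sketch below uses invented counts (the subunit lengths are approximate S. cerevisiae values) and is an illustration, not the paper's pipeline.

```python
# NSAF (normalized spectral abundance factor) from spectral counts:
# NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j).
# Counts are invented; lengths are approximate S. cerevisiae values.
proteins = {              # protein: (spectral counts, length in residues)
    "Rpb1": (420, 1733),
    "Rpb2": (395, 1224),
    "Rpb3": (160, 318),
    "Spt5": (85, 1063),   # DSIF subunit
}

saf = {p: spc / length for p, (spc, length) in proteins.items()}
total = sum(saf.values())
nsaf = {p: v / total for p, v in saf.items()}

for p, v in sorted(nsaf.items(), key=lambda kv: -kv[1]):
    print(f"{p}: NSAF = {v:.3f}")
```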
The promise of mixed-methods for advancing Latino health research.
Apesoa-Varano, Ester Carolina; Hinton, Ladson
2013-09-01
Mixed-methods research in the social sciences has been conducted for quite some time. More recently, mixed-methods have become popular in health research, with the National Institutes of Health leading the impetus to fund studies that implement such an approach. The public health issues facing us today are great, and they range from policy and other macro-level issues, to systems-level problems, to individuals' health behaviors. For Latinos, who are projected to become the largest minority group bearing a great deal of the burden of social inequality in the U.S., it is important to understand the deeply rooted nature of these health disparities in order to close the gap in health outcomes. Mixed-methodology thus holds promise for advancing research on Latino health by tackling health disparities from a variety of standpoints and approaches. The aim of this manuscript is to provide two examples of mixed methods research, each of which addresses a health topic of considerable importance to older Latinos and their families. These two examples will illustrate a) the complementary use of qualitative and quantitative methods to advance the health of older Latinos in an area that is important from a public health perspective, and b) the "translation" of findings from observational studies (informed by social science and medicine) to the development and testing of interventions.
Advanced NDE techniques for quantitative characterization of aircraft
NASA Technical Reports Server (NTRS)
Heyman, Joseph S.; Winfree, William P.
1990-01-01
Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.
Current trends in quantitative proteomics - an update.
Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H
2017-05-01
Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Cheng, Wei-Chung; Suresh, Nitin; Hua, Hong
2016-05-01
With improved diagnostic capabilities and complex optical designs, endoscopic technologies are advancing. As one of several important optical performance characteristics, geometric distortion can negatively affect size estimation and feature-identification-related diagnosis. Therefore, a quantitative and simple distortion evaluation method is imperative for both the endoscope industry and medical device regulators. However, no such method is available yet. While image correction techniques are rather mature, they depend heavily on computational power to process multidimensional image data based on complex mathematical models, i.e., they are difficult to understand. Some commonly used distortion evaluation methods, such as picture height distortion (DPH) or radial distortion (DRAD), are either too simple to accurately describe the distortion or subject to the error of deriving a reference image. We developed the basic local magnification (ML) method to evaluate endoscope distortion. Based on this method, we also developed ways to calculate DPH and DRAD. The method overcomes the aforementioned limitations, has clear physical meaning across the whole field of view, and can facilitate lesion size estimation during diagnosis. Most importantly, the method can help bring endoscopic technology to market and could potentially be adopted in an international endoscope standard.
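A plausible minimal reading of the local magnification idea: treat ML as the derivative of the image-side radius with respect to the object-side radius and normalize it to its value at the field center. The barrel-distortion model and its coefficient below are assumptions standing in for measured grid-target data, not the authors' calibration.

```python
# Local magnification (ML) read as d(r_img)/d(r_obj), normalized to the
# field center. The barrel-distortion model and coefficient are assumed.
import numpy as np

def distort(r, k=-0.2):
    # Simple radial barrel-distortion model mapping object to image radius.
    return r * (1.0 + k * r**2)

r_obj = np.linspace(0.0, 1.0, 201)   # normalized field radius, object side
r_img = distort(r_obj)

ml = np.gradient(r_img, r_obj)       # local magnification along the radius
ml_rel = ml / ml[0]                  # 1.0 at center; <1 means compression

drad_pct = 100.0 * (r_img[-1] / r_obj[-1] - 1.0)   # edge radial distortion
print(f"relative ML at field edge: {ml_rel[-1]:.2f}, "
      f"edge radial distortion: {drad_pct:.1f}%")
```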
Mirsky, Simcha K; Barnea, Itay; Levi, Mattan; Greenspan, Hayit; Shaked, Natan T
2017-09-01
Currently, the delicate process of selecting sperm cells for in vitro fertilization (IVF) is still based on the subjective, qualitative analysis of experienced clinicians using non-quantitative optical microscopy techniques. In this work, a method was developed for the automated analysis of sperm cells based on quantitative phase maps acquired through interferometric phase microscopy (IPM). Over 1,400 human sperm cells from 8 donors were imaged using IPM, and an algorithm was designed to digitally isolate sperm cell heads from the quantitative phase maps, taking into consideration both the cell 3D morphology and contents, and to extract features describing sperm head morphology. A subset of these features was used to train a support vector machine (SVM) classifier to automatically classify sperm of good and bad morphology. The SVM achieves an area under the receiver operating characteristic curve of 88.59% and an area under the precision-recall curve of 88.67%, as well as precisions of 90% or higher. We believe that our automatic analysis can become the basis for objective and automatic sperm cell selection in IVF. © 2017 International Society for Advancement of Cytometry.
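The classification stage maps naturally onto a standard scikit-learn pipeline: scale the per-cell morphology features, train an SVM, and score with ROC and precision-recall AUCs. The feature matrix and labels below are randomly generated stand-ins for the phase-map-derived descriptors, so only the workflow, not the numbers, mirrors the study.

```python
# SVM classification of per-cell morphology features, scored by ROC/PR AUC.
# Features and labels are synthetic stand-ins for phase-map descriptors.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score, average_precision_score

rng = np.random.default_rng(7)
n_cells, n_features = 1400, 12          # ~1,400 cells, as in the study
X = rng.normal(size=(n_cells, n_features))
# Labels loosely tied to two features so the synthetic task is learnable.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.8, size=n_cells) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
clf.fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]
print(f"ROC AUC: {roc_auc_score(y_te, scores):.3f}, "
      f"PR AUC: {average_precision_score(y_te, scores):.3f}")
```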
Community Game Day: Using an End-of-Life Conversation Game to Encourage Advance Care Planning.
Van Scoy, Lauren J; Reading, Jean M; Hopkins, Margaret; Smith, Brandi; Dillon, Judy; Green, Michael J; Levi, Benjamin H
2017-11-01
Advance care planning (ACP) is an important process that involves discussing and documenting one's values and preferences for medical care, particularly end-of-life treatments. This convergent, mixed-methods study assessed whether an end-of-life conversation card game is an acceptable and effective means for performing ACP for patients with chronic illness and/or their caregivers when deployed in a community setting. Twenty-two games (n = 93 participants) were held in community settings surrounding Hershey, PA in 2016. Participants were recruited using random sampling from patient databases and also convenience sampling (i.e., flyers). Quantitative questionnaires and qualitative focus group interviews were administered to assess the game experience and subsequent performance of ACP behaviors. Both quantitative and qualitative data found that Community Game Day was a well-received, positive experience for participants and 75% of participants performed ACP within three months post-intervention. These findings suggest that using a conversation game during community outreach is a useful approach for engaging patients and caregivers in ACP. The convergence of quantitative and qualitative data strongly supports the continued investigation of the game in randomized controlled trials. Copyright © 2017 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Yehia, Ali M.; Arafa, Reham M.; Abbas, Samah S.; Amer, Sawsan M.
2016-01-01
The spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate, and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative, or photo-degradation products. The proposed ratio difference, derivative ratio, and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg mL-1. Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all of its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. An experimental design of 25 synthetic mixtures (three factors at five levels) was used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in the quantitative and qualitative analysis of CFQ along with its hydrolytic, oxidative, and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These methods were simple and cost-effective compared with the manufacturer's RP-HPLC method.
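For the simplest of these, the ratio difference method, the arithmetic for a binary analyte/degradant mixture can be sketched directly: dividing by the degradant's spectrum turns its contribution into a constant, which a two-wavelength amplitude difference cancels. The Gaussian spectra, wavelengths, and concentrations below are invented, not measured CFQ data.

```python
# Ratio-difference sketch for a binary mixture (analyte + one degradant).
# Dividing by the degradant spectrum makes its term constant; differencing
# two wavelengths of the ratio spectrum cancels it. Spectra are synthetic.
import numpy as np

wl = np.linspace(220, 320, 500)
analyte = np.exp(-((wl - 255) / 12.0) ** 2)     # pure analyte, unit conc.
degradant = np.exp(-((wl - 285) / 18.0) ** 2)   # pure degradant, unit conc.

mixture = 0.7 * analyte + 0.4 * degradant       # "unknown": 0.7 (analyte)

ratio = mixture / degradant                     # = 0.7*(A/D) + 0.4
i1, i2 = np.argmin(abs(wl - 245)), np.argmin(abs(wl - 260))
delta_mix = ratio[i1] - ratio[i2]
delta_pure = (analyte / degradant)[i1] - (analyte / degradant)[i2]
print(f"recovered analyte concentration: {delta_mix / delta_pure:.2f}")
```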
Gómez-Batiste, Xavier; Mateo-Ortega, Dolors; Lasmarías, Cristina; Novellas, Anna; Espinosa, Jose; Beas, Elba; Ela, Sara; Barbero, Javier
2017-02-01
We aimed to describe the overall quantitative and qualitative results of a "La Caixa" Foundation and World Health Organization Collaborating Center Program entitled "Comprehensive Care for Patients with Advanced Illnesses and their Families" after four years of experience. Qualitative and quantitative methods were employed to assess the program. Quasi-experimental, prospective, multicenter, single-group, and pretest/posttest methods were utilized to assess the quantitative data. The effectiveness of psychosocial interventions was assessed at baseline (visit 1) and after four follow-up visits. The following dimensions were assessed: mood state, discomfort, anxiety, degree of adjustment or adaptation to disease, and suffering. We also assessed the four dimensions of the spiritual pain scale: faith or spiritual beliefs, valuable faith or spiritual beliefs, meaning in life, and peace of mind/forgiveness. Qualitative analyses were performed via surveys to evaluate stakeholder satisfaction. We built 29 psychosocial support teams involving 133 professionals, mainly psychologists and social workers. During the study period, 8,964 patients and 11,810 family members attended. Significant improvements were observed in the psychosocial and spiritual dimensions assessed. Patients, family members, and stakeholders all showed high levels of satisfaction. This model of psychosocial care could serve as an example for other countries that wish to improve psychosocial and spiritual support. Our results confirm that specific psychosocial interventions delivered by well-trained experts can help to ease suffering and discomfort in end-of-life and palliative care patients, particularly those with high levels of pain or emotional distress.
Preparation of metagenomic libraries from naturally occurring marine viruses.
Solonenko, Sergei A; Sullivan, Matthew B
2013-01-01
Microbes are now well recognized as major drivers of the biogeochemical cycling that fuels the Earth, and their viruses (phages) are known to be abundant and important in microbial mortality, horizontal gene transfer, and modulating microbial metabolic output. Investigation of environmental phages has been frustrated by an inability to culture the vast majority of naturally occurring diversity coupled with the lack of robust, quantitative, culture-independent methods for studying this uncultured majority. However, for double-stranded DNA phages, a quantitative viral metagenomic sample-to-sequence workflow now exists. Here, we review these advances with special emphasis on the technical details of preparing DNA sequencing libraries for metagenomic sequencing from environmentally relevant low-input DNA samples. Library preparation steps broadly involve manipulating the sample DNA by fragmentation, end repair and adaptor ligation, size fractionation, and amplification. One critical area of future research and development is parallel advances for alternate nucleic acid types such as single-stranded DNA and RNA viruses that are also abundant in nature. Combinations of recent advances in fragmentation (e.g., acoustic shearing and tagmentation), ligation reactions (adaptor-to-template ratio reference table availability), size fractionation (non-gel-sizing), and amplification (linear amplification for deep sequencing and linker amplification protocols) enhance our ability to generate quantitatively representative metagenomic datasets from low-input DNA samples. Such datasets are already providing new insights into the role of viruses in marine systems and will continue to do so as new environments are explored and synergies and paradigms emerge from large-scale comparative analyses. © 2013 Elsevier Inc. All rights reserved.
Advanced wave field sensing using computational shear interferometry
NASA Astrophysics Data System (ADS)
Falldorf, Claas; Agour, Mostafa; Bergmann, Ralf B.
2014-07-01
In this publication we give a brief introduction to the field of Computational Shear Interferometry (CoSI), which allows arbitrary wave fields to be determined from a set of shear interferograms. We discuss limitations of the method with respect to the coherence of the underlying wave field and present various numerical methods to recover it from its sheared representations. Finally, we show experimental results on digital holography of objects with rough surfaces using a fiber-coupled light emitting diode, as well as quantitative phase contrast imaging and numerical refocusing in Differential Interference Contrast (DIC) microscopy.
The evolving role of MRI in the assessment of coronary artery disease.
Blackwell, G G; Pohost, G M
1995-04-13
Magnetic resonance imaging (MRI) methods are positioned to make a major impact on the care of patients with ischemic heart disease. Further advances are expected in the areas of myocardial perfusion imaging and noninvasive MRI coronary "angiography." Work also continues on determining quantitative flow via MRI. Although expensive, the unique ability of MRI methods to provide multiple pieces of information in a single examination may make this technology cost effective. The concept of a "one-stop shop" is progressing steadily toward clinical reality.
Safety evaluation methodology for advanced coal extraction systems
NASA Technical Reports Server (NTRS)
Zimmerman, W. F.
1981-01-01
Qualitative and quantitative evaluation methods for coal extraction systems were developed. The analysis examines the soundness of the design, whether or not the major hazards have been eliminated or reduced, and how the reduction would be accomplished. The quantitative methodology establishes the approximate impact of hazards on injury levels. The results are weighted by peculiar geological elements, specialized safety training, peculiar mine environmental aspects, and reductions in labor force. The outcome is compared with injury level requirements based on similar, safer industries to get a measure of the new system's success in reducing injuries. This approach provides a more detailed and comprehensive analysis of hazards and their effects than existing safety analyses.
ERIC Educational Resources Information Center
Zhang, Lin
2014-01-01
Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tool. This paper discusses the emerging issues, such as how can learning effectiveness be understood in relation to…
1974-03-01
approaches to specific problems within the foreign affairs community (e.g., within DoD). The lag resulted from a variety of factors, in... Prepared by: Herman M. Weil, Aaron Greenberg, Larry German, Douglas Hartwick, Michael R. Leavitt. Sponsored by: Defense Advanced... Dr. G. Robert Franco, Project Director; Dr. Herman M. Weil; Mr. Aaron Greenberg; Mr. Larry German; Mr. Douglas Hartwick; Dr. Michael R. Leavitt
Compliance and patching and atropine amblyopia treatments.
Wang, Jingyun
2015-09-01
In the past 20 years, there has been a great advancement in knowledge pertaining to compliance with amblyopia treatments. The occlusion dose monitor introduced quantitative monitoring methods in patching, which sparked our initial understanding of the dose-response relationship for patching amblyopia treatment. This review focuses on current compliance knowledge and the impact it has on patching and atropine amblyopia treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Alongi, Pierpaolo; Iaccarino, Leonardo; Losa, Marco; Del Vecchio, Antonella; Gerevini, Simonetta; Plebani, Valentina; Di Muzio, Nadia; Mortini, Pietro; Gianolli, Luigi; Perani, Daniela
2018-05-25
Even though the benefits of radiation therapy are well established, it is important to recognize the broad spectrum of radiation-induced changes, particularly in the central nervous system. Possible damage to the brain parenchyma may have clinical consequences, and cognitive impairment in particular might be one of the major complications of radiotherapy. To date, no studies have investigated the effects of focal radiation therapy on brain structure and function together with the assessment of clinical outcomes at long follow-up. In this prospective study, we evaluated possible late effects on the brain after radiation therapy in six patients affected by cranial base tumors who underwent gamma knife treatment or tomotherapy, using a standardized neuropsychological battery, MRI, and 18F-FDG PET with SPM and semi-quantitative methods. Neuropsychological examinations showed no cognitive impairment after treatment. In all patients, both MRI assessment and 18F-FDG PET revealed no local or distant anatomical or metabolic late effects. The present study supports the safety of advanced radiation therapy techniques. 18F-FDG PET, using SPM and semi-quantitative methods, might be a valuable tool to evaluate cerebral radiotoxicity in patients treated for brain neoplasms. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Curtis, Tyler E; Roeder, Ryan K
2017-10-01
Advances in photon-counting detectors have enabled quantitative material decomposition using multi-energy or spectral computed tomography (CT). Supervised methods for material decomposition utilize an estimated attenuation for each material of interest at each photon energy level, which must be calibrated based upon calculated or measured values for known compositions. Measurements using a calibration phantom can advantageously account for system-specific noise, but the effect of calibration methods on the material basis matrix and subsequent quantitative material decomposition has not been experimentally investigated. Therefore, the objective of this study was to investigate the influence of the range and number of contrast agent concentrations within a modular calibration phantom on the accuracy of quantitative material decomposition in the image domain. Gadolinium was chosen as a model contrast agent in imaging phantoms, which also contained bone tissue and water as negative controls. The maximum gadolinium concentration (30, 60, and 90 mM) and total number of concentrations (2, 4, and 7) were independently varied to systematically investigate effects of the material basis matrix and scaling factor calibration on the quantitative (root mean squared error, RMSE) and spatial (sensitivity and specificity) accuracy of material decomposition. Images of calibration and sample phantoms were acquired using a commercially available photon-counting spectral micro-CT system with five energy bins selected to normalize photon counts and leverage the contrast agent k-edge. Material decomposition of gadolinium, calcium, and water was performed for each calibration method using a maximum a posteriori estimator. Both the quantitative and spatial accuracy of material decomposition were most improved by using an increased maximum gadolinium concentration (range) in the basis matrix calibration; the effects of using a greater number of concentrations were relatively small in magnitude by comparison. The material basis matrix calibration was more sensitive to changes in the calibration methods than the scaling factor calibration. The material basis matrix calibration significantly influenced both the quantitative and spatial accuracy of material decomposition, while the scaling factor calibration influenced quantitative but not spatial accuracy. Importantly, the median RMSE of material decomposition was as low as ~1.5 mM (~0.24 mg/mL gadolinium), which was similar in magnitude to that measured by optical spectroscopy on the same samples. The accuracy of quantitative material decomposition in photon-counting spectral CT was significantly influenced by calibration methods, which must therefore be carefully considered for the intended diagnostic imaging application.
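A minimal numeric sketch of the image-domain decomposition step described above, simplified to ordinary least squares rather than the study's maximum a posteriori estimator; the basis-matrix entries and voxel attenuations below are invented for illustration (in practice the matrix comes from the calibration phantom):

    import numpy as np

    # Hypothetical calibrated material basis matrix: response per unit amount
    # of (gadolinium, calcium, water) in each of 5 photon-counting energy bins.
    M = np.array([
        [0.42, 0.30, 0.20],
        [0.55, 0.26, 0.19],  # bin just above the Gd k-edge: Gd response jumps
        [0.48, 0.22, 0.18],
        [0.40, 0.19, 0.17],
        [0.33, 0.16, 0.16],
    ])

    # Measured attenuation of one voxel in the 5 bins (illustrative numbers).
    mu = np.array([0.305, 0.330, 0.295, 0.262, 0.227])

    # Least-squares decomposition: solve M @ x ~= mu for the material amounts.
    x, *_ = np.linalg.lstsq(M, mu, rcond=None)
    print(dict(zip(["Gd", "Ca", "water"], x.round(3))))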
Towards standardized assessment of endoscope optical performance: geometric distortion
NASA Astrophysics Data System (ADS)
Wang, Quanzeng; Desai, Viraj N.; Ngo, Ying Z.; Cheng, Wei-Chung; Pfefer, Joshua
2013-12-01
Technological advances in endoscopes, such as capsule, ultrathin and disposable devices, promise significant improvements in safety, clinical effectiveness and patient acceptance. Unfortunately, the industry lacks test methods for preclinical evaluation of key optical performance characteristics (OPCs) of endoscopic devices that are quantitative, objective and well-validated. As a result, it is difficult for researchers and developers to compare image quality and evaluate equivalence to, or improvement upon, prior technologies. While endoscope OPCs include resolution, field of view, and depth of field, among others, our focus in this paper is geometric image distortion. We reviewed specific test methods for distortion and then developed an objective, quantitative test method based on well-defined experimental and data processing steps to evaluate radial distortion in the full field of view of an endoscopic imaging system. Our measurements and analyses showed that a second-degree polynomial equation could well describe the radial distortion curve of a traditional endoscope. The distortion evaluation method was effective for correcting the image and can be used to explain other widely accepted evaluation methods such as picture height distortion. Development of consensus standards based on promising test methods for image quality assessment, such as the method studied here, will facilitate clinical implementation of innovative endoscopic devices.
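Since the paper reports that a second-degree polynomial describes the radial distortion curve well, the fitting step can be sketched compactly; the grid-target radii below are fabricated, and the least-squares polynomial fit is a generic stand-in for the authors' exact processing steps:

    import numpy as np

    # Hypothetical grid-target data: true radial positions on the target (mm)
    # and apparent radii in the endoscope image (pixels), with barrel
    # distortion growing toward the edge of the field of view.
    r_true = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
    r_img = np.array([0.0, 39.8, 78.6, 115.4, 149.6, 180.1])

    # Second-degree polynomial through the origin: r_img = a*r + b*r**2.
    A = np.column_stack([r_true, r_true**2])
    (a, b), *_ = np.linalg.lstsq(A, r_img, rcond=None)

    # Local percent distortion relative to the paraxial magnification a.
    pct = 100.0 * (r_img[1:] - a * r_true[1:]) / (a * r_true[1:])
    print(f"fit: r_img = {a:.2f}*r + {b:.3f}*r^2")
    print("percent distortion:", np.round(pct, 1))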
Profiles of dyadic adjustment for advanced prostate cancer to inform couple-based intervention.
Elliott, Kate-Ellen J; Scott, Jennifer L; Monsour, Michael; Nuwayhid, Fadi
2015-01-01
The purpose of the study is to describe, from a relational perspective, partners' psychological adjustment, coping and support needs for advanced prostate cancer. A mixed methods design was adopted, employing triangulation of qualitative and quantitative data, to produce dyadic profiles of adjustment for six couples recruited from the urology clinics of local hospitals in Tasmania, Australia. Dyads completed a video-taped communication task, semi-structured interview and standardised self-report questionnaires. Themes identified were associated with the dyadic challenges of the disease experience (e.g. relationship intimacy, disease progression and carer burden). Couples with poor psychological adjustment profiles had both clinical and global distress, treatment side-effects, carer burden and poor general health. Resilient couples demonstrated relationship closeness and adaptive cognitive and behavioural coping strategies. The themes informed the adaptation of an effective program for couples coping with women's cancers (CanCOPE) to create a program for couples facing advanced prostate cancer (ProCOPE-Adv). Mixed method results inform the development of psychological therapy components for couples coping with advanced prostate cancer. The concomitance of co-morbid health problems may have implications for access and engagement of older adult populations in face-to-face intervention.
Quantitative Microscopic Analysis of Plasma Membrane Receptor Dynamics in Living Plant Cells.
Luo, Yu; Russinova, Eugenia
2017-01-01
Plasma membrane-localized receptors are essential for cellular communication and signal transduction. In Arabidopsis thaliana, BRASSINOSTEROID INSENSITIVE1 (BRI1) is one of the receptors that is activated by binding its ligand, the brassinosteroid (BR) hormone, at the cell surface to regulate diverse plant developmental processes. The availability of BRI1 in the plasma membrane is related to its signaling output and is known to be controlled by dynamic endomembrane trafficking. Advances in fluorescence labeling and confocal microscopy techniques have enabled us to gain a better understanding of plasma membrane receptor dynamics in living cells. Here we describe different quantitative microscopy methods to monitor the relative steady-state levels of the BRI1 protein in the plasma membrane of root epidermal cells and its relative exocytosis and recycling rates. The methods can also be applied to analyze similar dynamics of other plasma membrane-localized receptors.
NASA Astrophysics Data System (ADS)
Zhang, Lin
2014-02-01
Educators design and create various technology tools to scaffold students' learning. As more and more technology designs are incorporated into learning, growing attention has been paid to the study of technology-based learning tools. This paper discusses emerging issues, such as how learning effectiveness can be understood in relation to different technology features, and how pieces of qualitative and quantitative results can be integrated to achieve a broader understanding of technology designs. To address these issues, this paper proposes a meta-analysis method. Detailed explanations of the structure of the methodology and its scientific mechanism are provided for discussion and suggestions. This paper ends with an in-depth discussion of the concerns and questions that educational researchers might raise, such as how this methodology takes care of learning contexts.
Variables affecting learning in a simulation experience: a mixed methods study.
Beischel, Kelly P
2013-02-01
The primary purpose of this study was to test a hypothesized model describing the direct effects of learning variables on anxiety and cognitive learning outcomes in a high-fidelity simulation (HFS) experience. The secondary purpose was to explain and explore student perceptions concerning the qualities and context of HFS affecting anxiety and learning. This study used a mixed methods quantitative-dominant explanatory design with concurrent qualitative data collection to examine variables affecting learning in undergraduate, beginning nursing students (N = 124). Being ready to learn, having a strong auditory-verbal learning style, and being prepared for simulation directly affected anxiety, whereas learning outcomes were directly affected by having strong auditory-verbal and hands-on learning styles. Anxiety did not quantitatively mediate cognitive learning outcomes as theorized, although students qualitatively reported debilitating levels of anxiety. This study advances nursing education science by providing evidence concerning variables affecting learning outcomes in HFS.
Recent advances in mass spectrometry-based proteomics of gastric cancer.
Kang, Changwon; Lee, Yejin; Lee, J Eugene
2016-10-07
The last decade has witnessed remarkable technological advances in mass spectrometry-based proteomics. The development of proteomics techniques has enabled the reliable analysis of complex proteomes, leading to the identification and quantification of thousands of proteins in gastric cancer cells, tissues, and sera. This quantitative information has been used to profile the anomalies in gastric cancer and provide insights into the pathogenic mechanism of the disease. In this review, we mainly focus on the advances in mass spectrometry and quantitative proteomics that were achieved in the last five years and how these up-and-coming technologies are employed to track biochemical changes in gastric cancer cells. We conclude by presenting a perspective on quantitative proteomics and its future applications in the clinic and translational gastric cancer research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Honggyu; Zhang, Jack Y.; Raghavan, Santosh; ...
2016-12-22
Unveiling the identity, spatial configuration, and microscopic structure of point defects is one of the key challenges in materials science. Here, we demonstrate that quantitative scanning transmission electron microscopy (STEM) can be used to directly observe Sr vacancies in SrTiO3 and to determine the atom column relaxations around them. By combining recent advances in quantitative STEM, including variable-angle, high-angle annular dark-field imaging and rigid registration methods, with frozen phonon multislice image simulations, we identify which Sr columns contain vacancies and quantify the number of vacancies in them. Picometer-precision measurements of the surrounding atom column positions show that the nearest-neighbor Ti atoms are displaced away from the Sr vacancies. The results open up a new methodology for studying the microscopic mechanisms by which point defects control materials properties.
Aultman, Julie; Baughman, Kristin R; Ludwick, Ruth
2018-05-18
To examine barriers to advance care planning (ACP) experienced by care managers (CMs) through a mixed methods approach. A concurrent nested design was used to acquire a deeper understanding of ACP and to identify nuances between quantitative and qualitative data. Past quantitative studies of providers have identified barriers related to time, culture, knowledge, responsibility, and availability of legal documents. These barriers, and accompanying attitudes and feelings, have been taken at face value without rich qualitative data to identify under what conditions and to what extent a barrier impacts care. A two-part multisite, mixed methods study was conducted using surveys and focus groups. Surveys were completed by 458 CMs at 10 Area Agencies on Aging, and 62 CMs participated in one of 8 focus groups. Data were analyzed using a concurrent nested design with individual data analysis and a merged data approach. There were three main distinctions between the quantitative and qualitative data. First, while CMs reported on the survey that ACP was not too time consuming, focus group data revealed that time was an issue, especially in relation to competing priorities. Second, on the survey 60% of the CMs reported they had enough knowledge, but qualitative data revealed more nuances. Last, the comfort levels reported in the quantitative data were less overt in the qualitative data, where additional feelings and attitudes were revealed, e.g., frustration with families and preferences for more physician involvement. CMs reported their attitudes about ACP, clarified through a rigorous mixed methods analysis. Care managers can successfully lead ACP discussions, but require further education, resources, and team-based guidance. ACP is essential for reducing emotional, social, and financial burdens associated with health care decision-making, and CMs can positively impact ACP discussions when appropriately supported by the clinical community. The many nuances in the ACP process that we found illustrate the need for ongoing discussion, education and research on this important topic.
Recent advances in quantitative high throughput and high content data analysis.
Moutsatsos, Ioannis K; Parker, Christian N
2016-01-01
High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry applications. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results, will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.
Darwish, Hany W; Hassan, Said A; Salem, Maissa Y; El-Zeany, Badr A
2016-02-05
Two advanced, accurate and precise chemometric methods are developed for the simultaneous determination of amlodipine besylate (AML) and atorvastatin calcium (ATV) in the presence of their acidic degradation products in tablet dosage forms. The first method was Partial Least Squares (PLS-1) and the second was Artificial Neural Networks (ANN). PLS was compared to ANN models with and without a variable selection procedure (genetic algorithm, GA). For proper analysis, a 5-factor, 5-level experimental design was established, resulting in 25 mixtures containing different ratios of the interfering species. Fifteen mixtures were used as the calibration set and the other ten as the validation set to validate the prediction ability of the suggested models. The proposed methods were successfully applied to the analysis of pharmaceutical tablets containing AML and ATV. The results demonstrated the ability of these models to resolve the highly overlapped spectra of the quinary mixture using inexpensive and easy-to-handle instruments such as the UV-VIS spectrophotometer.
Advances in methods for detection of anaerobic ammonium oxidizing (anammox) bacteria.
Li, Meng; Gu, Ji-Dong
2011-05-01
Anaerobic ammonium oxidation (anammox), the biochemical process oxidizing ammonium into dinitrogen gas using nitrite as an electron acceptor, was recognized for its significant role in the global nitrogen cycle only relatively recently, and its ubiquitous distribution in a wide range of environments has changed our knowledge about the contributors to the global nitrogen cycle. Currently, several groups of methods are used to detect anammox bacteria based on their physiological and biochemical characteristics, their cellular chemical composition, and both the 16S rRNA gene and selective functional genes as biomarkers, including the hydrazine oxidoreductase and nitrite reductase encoding genes hzo and nirS, respectively. Results from these methods, coupled with advances in quantitative PCR, reverse transcription of mRNA and stable isotope labeling, have improved our understanding of the distribution, diversity, and activity of anammox bacteria in different environments, both natural and engineered. In this review, we summarize the methods used to detect anammox bacteria in various environments, highlight the strengths and weaknesses of these methods, and discuss potential future developments of existing and new techniques.
NASA Astrophysics Data System (ADS)
McCray, Wilmon Wil L., Jr.
The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to students in U.S. college and university undergraduate and graduate degree programs in systems engineering and the computing sciences (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The study presents a detailed gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
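A toy version of the kind of Monte Carlo prediction described, with assumed input distributions propagated through a simple linear index; the drivers, weights, and distributions are invented stand-ins, not the calibrated ACSI model:

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # Monte Carlo trials

    # Hypothetical satisfaction drivers sampled from assumed distributions.
    quality = rng.normal(8.0, 0.6, n)      # perceived quality, 0-10 scale
    value = rng.normal(7.2, 0.8, n)        # perceived value, 0-10 scale
    expectation = rng.normal(7.5, 0.5, n)  # customer expectations, 0-10 scale

    # Simple linear index on a 0-100 scale with invented weights.
    csi = 10 * (0.5 * quality + 0.3 * value + 0.2 * expectation)

    # Baseline and predicted spread of future index scores.
    print(f"mean CSI: {csi.mean():.1f}")
    print(f"90% interval: {np.percentile(csi, 5):.1f}-{np.percentile(csi, 95):.1f}")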
Advances in multiplexed MRM-based protein biomarker quantitation toward clinical utility.
Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Hardie, Darryl B; Borchers, Christoph H
2014-05-01
Accurate and rapid protein quantitation is essential for screening biomarkers for disease stratification and monitoring, and for validating the hundreds of putative markers in human biofluids, including blood plasma. An analytical method that utilizes stable isotope-labeled standard (SIS) peptides and selected/multiple reaction monitoring-mass spectrometry (SRM/MRM-MS) has emerged as a promising technique for determining protein concentrations. This targeted approach has analytical merit, but its true potential (in terms of sensitivity and multiplexing) has yet to be realized. Described herein is a method that extends the multiplexing ability of the MRM method to enable the quantitation of 142 high-to-moderate abundance proteins (from 31 mg/mL to 44 ng/mL) in undepleted and non-enriched human plasma in a single run. The proteins have been reported to be associated with a wide variety of non-communicable diseases (NCDs), from cardiovascular disease (CVD) to diabetes. The concentrations of these proteins in human plasma are inferred from interference-free peptides functioning as molecular surrogates (2 peptides per protein, on average). A revised data analysis strategy, involving the linear regression equation of normal control plasma, has been instituted to enable facile application to patient samples, as demonstrated in separate nutrigenomics and CVD studies. The exceptional robustness of the LC/MS platform and the quantitative method, as well as its high throughput, makes the assay suitable for application to patient samples for the verification of a condensed or complete protein panel.
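The arithmetic at the heart of SIS-based MRM quantitation is a light-to-heavy peak-area ratio scaled by the known spike level; a minimal sketch with invented numbers follows (the regression-based variant in the abstract additionally calibrates this ratio against a standard curve built in control plasma):

    # Endogenous ("light") peptide quantified against a stable isotope-labeled
    # standard ("heavy") spiked at a known concentration. Invented values.
    light_area = 1.84e6   # integrated MRM peak area, endogenous peptide
    heavy_area = 2.30e6   # integrated MRM peak area, SIS peptide
    sis_conc = 50.0       # spiked SIS concentration, fmol/uL

    ratio = light_area / heavy_area
    endogenous = ratio * sis_conc
    print(f"L/H ratio = {ratio:.3f} -> ~{endogenous:.1f} fmol/uL endogenous")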
Sensitivity analysis of infectious disease models: methods, advances and their application
Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.
2013-01-01
Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods (scatter plots, the Morris and Sobol’ methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method) and detail their relative merits and pitfalls when applied to a microparasite (cholera) and a macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, which is especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
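Of the methods named, Latin hypercube sampling with partial rank correlation coefficients is compact enough to sketch; the toy "model" below stands in for an epidemic simulator, and the PRCC is computed by the standard route of correlating rank-regression residuals:

    import numpy as np
    from scipy.stats import qmc, rankdata, pearsonr

    rng = np.random.default_rng(1)
    n, k = 500, 3  # samples; toy parameters (e.g., transmission, recovery, contact)

    # Latin hypercube sample scaled to assumed parameter ranges.
    X = qmc.scale(qmc.LatinHypercube(d=k, seed=1).random(n),
                  l_bounds=[0.1, 0.05, 1.0], u_bounds=[0.5, 0.25, 10.0])

    # Stand-in model output, e.g. epidemic peak size (not a real SIR solve).
    Y = X[:, 0] * X[:, 2] / X[:, 1] + rng.normal(0, 0.5, n)

    def prcc(X, Y, j):
        """Partial rank correlation of parameter j with output Y."""
        R = np.column_stack([rankdata(c) for c in X.T])
        ry = rankdata(Y)
        A = np.column_stack([np.delete(R, j, axis=1), np.ones(len(Y))])
        res_x = R[:, j] - A @ np.linalg.lstsq(A, R[:, j], rcond=None)[0]
        res_y = ry - A @ np.linalg.lstsq(A, ry, rcond=None)[0]
        return pearsonr(res_x, res_y)[0]

    print([round(prcc(X, Y, j), 2) for j in range(k)])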
Radiation exposure in X-ray-based imaging techniques used in osteoporosis
Adams, Judith E.; Guglielmi, Giuseppe; Link, Thomas M.
2010-01-01
Recent advances in medical X-ray imaging have enabled the development of new techniques capable of assessing not only bone quantity but also structure. This article provides (a) a brief review of the current X-ray methods used for quantitative assessment of the skeleton, (b) data on the levels of radiation exposure associated with these methods and (c) information about radiation safety issues. Radiation doses associated with dual-energy X-ray absorptiometry are very low. However, as with any X-ray imaging technique, each particular examination must always be clinically justified. When an examination is justified, the emphasis must be on dose optimisation of imaging protocols. Dose optimisation is more important for paediatric examinations because children are more vulnerable to radiation than adults. Methods based on multi-detector CT (MDCT) are associated with higher radiation doses. New 3D volumetric hip and spine quantitative computed tomography (QCT) techniques and high-resolution MDCT for evaluation of bone structure deliver doses to patients from 1 to 3 mSv. Low-dose protocols are needed to reduce radiation exposure from these methods and minimise associated health risks. PMID:20559834
Schneider, Barbara St Pierre; Nicholas, Jennifer; Kurrus, Jeffrey E
2013-01-01
To compare the methodologic quality and study/report characteristics between quantitative clinical nursing and nursing education research articles. The methodologic quality of quantitative nursing education research needs to advance to a higher level, and clinical research can provide guidance for nursing education to reach this level. One hundred quantitative clinical research articles from high-impact journals published in 2007 and 37 education research articles from high-impact journals published in 2006 to 2007 were chosen for analysis. Clinical articles had significantly higher quality scores than education articles in three domains: number of institutions studied, type of data, and outcomes. The findings indicate three ways in which nursing education researchers can strengthen the methodologic quality of their quantitative research. With this approach, greater funding may be secured for advancing the science of nursing education.
Burns, Malcolm; Wiseman, Gordon; Knight, Angus; Bramley, Peter; Foster, Lucy; Rollinson, Sophie; Damant, Andrew; Primrose, Sandy
2016-01-07
Following a report on a significant amount of horse DNA being detected in a beef burger product on sale to the public at a UK supermarket in early 2013, the Elliott report was published in 2014 and contained a list of recommendations for helping ensure food integrity. One of the recommendations included improving laboratory testing capacity and capability to ensure a harmonised approach for testing for food authenticity. Molecular biologists have developed exquisitely sensitive methods based on the polymerase chain reaction (PCR) or mass spectrometry for detecting the presence of particular nucleic acid or peptide/protein sequences. These methods have been shown to be specific and sensitive in terms of lower limits of applicability, but they are largely qualitative in nature. Historically, the conversion of these qualitative techniques into reliable quantitative methods has been beset with problems even when used on relatively simple sample matrices. When the methods are applied to complex sample matrices, as found in many foods, the problems are magnified resulting in a high measurement uncertainty associated with the result which may mean that the assay is not fit for purpose. However, recent advances in the technology and the understanding of molecular biology approaches have further given rise to the re-assessment of these methods for their quantitative potential. This review focuses on important issues for consideration when validating a molecular biology assay and the various factors that can impact on the measurement uncertainty of a result associated with molecular biology approaches used in detection of food fraud, with a particular focus on quantitative PCR-based and proteomics assays.
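One concrete route from qualitative PCR to a quantitative result is a real-time PCR standard curve: quantification cycles (Cq) from known copy-number standards are regressed against log10 concentration and unknowns are read off the fitted line. A sketch with invented Cq values follows; real food matrices add the matrix effects and measurement uncertainty the review discusses:

    import numpy as np

    # Invented standard curve: log10(target DNA copies) vs quantification cycle.
    log10_copies = np.array([2.0, 3.0, 4.0, 5.0, 6.0])
    cq = np.array([31.9, 28.6, 25.2, 21.8, 18.5])

    # Linear fit Cq = m*log10(copies) + b; slope near -3.32 means ~100% efficiency.
    m, b = np.polyfit(log10_copies, cq, 1)
    efficiency = 10 ** (-1.0 / m) - 1.0

    # Read an unknown sample off the curve.
    cq_unknown = 24.1
    copies = 10 ** ((cq_unknown - b) / m)
    print(f"slope = {m:.2f}, efficiency = {efficiency:.1%}, unknown ~ {copies:,.0f} copies")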
Pennell, Matthew W; Harmon, Luke J
2013-06-01
Recent innovations in phylogenetic comparative methods (PCMs) have spurred a renaissance of research into the causes and consequences of large-scale patterns of biodiversity. In this paper, we review these advances. We also highlight the potential of comparative methods to integrate across fields and focus on three examples where such integration might be particularly valuable: quantitative genetics, community ecology, and paleobiology. We argue that PCMs will continue to be a key set of tools in evolutionary biology, shedding new light on how evolutionary processes have shaped patterns of biodiversity through deep time.
Less is More: Membrane Protein Digestion Beyond Urea-Trypsin Solution for Next-level Proteomics.
Zhang, Xi
2015-09-01
The goal of next-level bottom-up membrane proteomics is protein function investigation, via high-coverage, high-throughput, peptide-centric quantitation of expression, modifications and dynamic structures at systems scale. Yet efficient digestion of mammalian membrane proteins presents a daunting barrier, and the prevalent day-long urea-trypsin in-solution digestion has proved insufficient to reach this goal. Many efforts have contributed incremental advances over past years, but involved protein denaturation that disconnected measurement from functional states. Beyond denaturation, the recent discovery of the structure/proteomics omni-compatible detergent n-dodecyl-β-d-maltopyranoside, combined with pepsin and PNGase F columns, enabled breakthroughs in membrane protein digestion: a 2010 DDM-low-TCEP (DLT) method for H/D-exchange (HDX) using a human G protein-coupled receptor, and a 2015 flow/detergent-facilitated protease and de-PTM digestion (FDD) method for integrative deep sequencing and quantitation using a full-length human ion channel complex. Distinguishing protein solubilization from denaturation, protease digestion reliability from theoretical specificity, and reduction from alkylation, these methods shifted day(s)-long paradigms into minutes, and afforded fully automatable (HDX)-protein-peptide-(tandem mass tag)-HPLC pipelines to measure functional proteins instantly at deep coverage, with high peptide reproducibility, low artifacts and minimal leakage. Promoting, not destroying, structures and activities harnessed membrane proteins for next-level streamlined functional proteomics. This review analyzes recent advances in membrane protein digestion methods and highlights critical discoveries for future proteomics.
Hemodynamic flow visualization of early embryonic great vessels using μPIV.
Goktas, Selda; Chen, Chia-Yuan; Kowalski, William J; Pekkan, Kerem
2015-01-01
Microparticle image velocimetry (μPIV) is an evolving quantitative methodology for closely and accurately monitoring cardiac flow dynamics and mechanotransduction during vascular morphogenesis. While the PIV technique has a long history, contemporary developments in advanced microscopy have significantly expanded its power. This chapter presents three new methods for μPIV acquisition in selected embryonic structures, achieved through advanced optical imaging: (1) high-speed confocal scanning of transgenic zebrafish embryos, in which the transgenic erythrocytes act as the tracing particles; (2) microinjection of artificial seeding particles in chick embryos visualized with stereomicroscopy; and (3) real-time, time-resolved optical coherence tomography acquisition of vitelline vessel flow profiles in chick embryos, tracking the erythrocytes.
Advances in human chorionic gonadotropin detection technologies: a review.
Fan, Jing; Wang, Mandy; Wang, Chengyin; Cao, Yu
2017-10-01
Human chorionic gonadotropin (HCG) is a glycoprotein secreted by placental trophoblast cells in pregnancy. HCG is a heterodimer composed of two different subunits, α and β, with the latter being unique to HCG. As well as being the most important diagnostic marker for pregnancy, HCG is also a tumor marker; quantitative detection of HCG is therefore of great value. Numerous advanced technologies have been developed for measuring HCG concentration, including electrochemical immunoassay, chemiluminescent immunoassay, fluorescence immunoassay, resonance scattering spectrometry, atomic emission spectrometry, radioimmunoassay, and MS. Some pursue simple and easy operation, while others emphasize accuracy and applications in clinical medicine. This review provides a comprehensive summary of these various methods of detecting HCG.
A Study Assessing the Potential of Negative Effects in Interdisciplinary Math–Biology Instruction
Madlung, Andreas; Bremer, Martina; Himelblau, Edward; Tullis, Alexa
2011-01-01
There is increasing enthusiasm for teaching approaches that combine mathematics and biology. The call for integrating more quantitative work into biology education has led to new teaching tools that improve quantitative skills. Little is known, however, about whether increasing interdisciplinary work can lead to adverse effects, such as the development of broader but shallower skills, or the possibility that math anxiety causes some students to disengage in the classroom or, paradoxically, to focus so much on the mathematics that they lose sight of its application to the biological concepts at the center of the unit at hand. We developed and assessed an integrative learning module and found disciplinary learning gains to be equally strong in first-year students who actively engaged in embedded quantitative calculations as in those who were merely presented with quantitative data in the context of interpreting biological and biostatistical results. When presented to advanced biology students, our quantitative learning tool increased test performance significantly. We conclude from our study that the addition of mathematical calculations to the first-year and advanced biology curricula did not hinder overall student learning, and may increase disciplinary learning and data-interpretation skills in advanced students. PMID:21364099
Crayton, Samuel H.; Elias, Andrew; Al-Zaki, Ajlan; Cheng, Zhiliang; Tsourkas, Andrew
2011-01-01
Recent advances in material science and chemistry have led to the development of nanoparticles with diverse physicochemical properties, e.g. size, charge, shape, and surface chemistry. Evaluating which physicochemical properties are best for imaging and therapeutic studies is challenging, not only because of the multitude of samples to evaluate, but also because of the large experimental variability associated with in vivo studies (e.g. differences in tumor size, injected dose, subject weight, etc.). To address this issue, we have developed a lanthanide-doped nanoparticle system and analytical method that allow for the quantitative comparison of multiple nanoparticle compositions simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with a range of different sizes and charges were synthesized, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectroscopy (ICP-MS) was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood samples and the resected tumor and organs. The method proved generalizable to other nanoparticle platforms, including dendrimers, liposomes, and polymersomes. This approach provides a simple, cost-effective, and non-radioactive method to quantitatively compare tumor localization, biodistribution, and blood clearance of more than 10 nanoparticle compositions simultaneously, removing subject-to-subject variability. PMID:22100983
Eroglu, Ela; Okada, Shigeru; Melis, Anastasios
2011-08-01
Six different strains of the green microalgae Botryococcus belonging to the A-race or B-race, accumulating alkadiene or botryococcene hydrocarbons, respectively, were compared for biomass and hydrocarbon productivities. Biomass productivity was assessed gravimetrically upon strain growth in the laboratory under defined conditions. Hydrocarbon productivities were measured by three different and independent experimental approaches, including density equilibrium of the intact cells and micro-colonies, spectrophotometric analysis of hydrocarbon extracts, and gravimetric quantitation of eluted hydrocarbons. All three hydrocarbon-quantitation methods yielded similar results for each of the strains examined. The B-race microalgae Botryococcus braunii var. Showa and Kawaguchi-1 constitutively accumulated botryococcene hydrocarbons equivalent to 30% and 20%, respectively, of their overall biomass. The A-race microalgae Botryococcus braunii, varieties Yamanaka, UTEX 2441 and UTEX LB572, constitutively accumulated alkadiene hydrocarbons equivalent to 14%, 13% and 10% of their overall biomass, respectively. Botryococcus sudeticus (UTEX 2629), a morphologically different green microalga, had the lowest hydrocarbon accumulation, equal to about 3% of its overall biomass. The results validate the density equilibrium and spectrophotometric analysis methods for the quantitation of botryococcene-type hydrocarbons. These analytical advances will serve in the screening and selection of B. braunii and of other microalgae in efforts to identify those having a high hydrocarbon content for use in commercial applications.
Tataw, David Besong; Ekúndayò, Olúgbémiga T
2017-01-01
This article reports on the use of a sequential and integrated mixed-methods approach in a focused population and small-area analysis. The study framework integrates focus groups, survey research, and community engagement strategies in a search for evidence related to prostate cancer screening services utilization as a component of cancer prevention planning in a marginalized African American community in the United States. Research and data analysis methods are synthesized by aggregation, configuration, and interpretive analysis. The results of synthesis show that qualitative and quantitative data validate and complement each other in advancing our knowledge of population characteristics, variable associations, the complex context in which variables exist, and the best options for prevention and service planning. Synthesis of findings and interpretive analysis provided two important explanations which seemed inexplicable in regression outputs: (a) Focus group data on the limitations of the church as an educational source explain the negative association between preferred educational channels and screening behavior found in quantitative analysis. (b) Focus group data on unwelcoming provider environments explain the inconsistent relationship between knowledge of local sites and screening services utilization found in quantitative analysis. The findings suggest that planners, evaluators, and scientists should grow their planning and evaluation evidence from the community they serve.
Cell bioprocessing in space - Applications of analytical cytology
NASA Technical Reports Server (NTRS)
Todd, P.; Hymer, W. C.; Goolsby, C. L.; Hatfield, J. M.; Morrison, D. R.
1988-01-01
Cell bioprocessing experiments in space are reviewed and the development of on-board cell analytical cytology techniques that can serve such experiments is discussed. Methods and results of experiments involving the cultivation and separation of eukaryotic cells in space are presented. It is suggested that an advanced cytometer should be developed for the quantitative analysis of large numbers of specimens of suspended eukaryotic cells and bioparticles in experiments on the Space Station.
Ge, Jing; Zhang, Guoping
2015-01-01
Advanced intelligent methodologies could help detect and predict diseases from EEG signals in cases where manual analysis is inefficient or unavailable, for instance in epileptic seizure detection and prediction. The diversity and evolution of epileptic seizures make the underlying disease very difficult to detect and identify. Fortunately, determinism and nonlinearity in a time series can characterize state changes. The literature indicates that Delay Vector Variance (DVV) can examine nonlinearity to gain insight into EEG signals, but very limited work has been done on a quantitative DVV approach, whose outcomes need to be evaluated for detecting epileptic seizures. The objective was therefore to develop a new epileptic seizure detection method based on quantitative DVV. This method employed an improved delay vector variance (IDVV) to extract the nonlinearity value as a distinct feature, and proposed a multi-kernel strategy in an extreme learning machine (ELM) network to provide precise disease detection and prediction. Nonlinearity proved more sensitive than energy and entropy: 87.5% overall recognition accuracy and 75.0% overall forecasting accuracy were achieved. The proposed IDVV and multi-kernel ELM based method was feasible and effective for epileptic EEG detection, and hence has importance for practical applications.
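A compact, hedged sketch of the plain delay vector variance computation (not the paper's improved IDVV, whose details are not given in the abstract): embed the signal, and for a range of neighbourhood radii compute the variance of the targets of neighbouring delay vectors, normalized by the overall target variance. Values well below 1 at small radii indicate predictable, possibly nonlinear, structure:

    import numpy as np
    from scipy.spatial.distance import pdist, squareform

    def dvv(x, m=3, tau=1, n_radii=20, min_neighbors=20):
        """Plain delay vector variance of a 1-D signal (sketch, not the IDVV)."""
        x = np.asarray(x, dtype=float)
        N = len(x) - m * tau
        dv = np.array([x[i:i + m * tau:tau] for i in range(N)])  # delay vectors
        tgt = x[m * tau:]                                        # their targets
        D = squareform(pdist(dv))                                # pairwise distances
        radii = np.linspace(D.mean() - 2 * D.std(), D.mean() + 2 * D.std(), n_radii)
        nvar = []
        for r in radii:
            v = [np.var(tgt[D[i] <= r]) for i in range(N)
                 if (D[i] <= r).sum() >= min_neighbors]
            nvar.append(np.mean(v) / np.var(tgt) if v else np.nan)
        return radii, np.array(nvar)

    t = np.arange(2000)
    x = np.sin(0.05 * t) + 0.1 * np.random.default_rng(0).standard_normal(2000)
    radii, nvar = dvv(x)
    print(np.nanmin(nvar))  # well below 1 for a structured, predictable signal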
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... possess appropriate advanced degrees and experience in test development or psychometric research, to... quantitative domains, the Secretary reviews only those subtests covering verbal and quantitative domains...
The Promise of Mixed-Methods for Advancing Latino Health Research
Apesoa-Varano, Ester Carolina; Hinton, Ladson
2015-01-01
Mixed-methods research in the social sciences has been conducted for quite some time. More recently, mixed methods have become popular in health research, with the National Institutes of Health leading the impetus to fund studies that implement such an approach. The public health issues facing us today are great, and they range from policy and other macro-level issues, to systems-level problems, to individuals' health behaviors. For Latinos, who are projected to become the largest minority group bearing a great deal of the burden of social inequality in the U.S., it is important to understand the deeply rooted nature of these health disparities in order to close the gap in health outcomes. Mixed methodology thus holds promise for advancing research on Latino health by tackling health disparities from a variety of standpoints and approaches. The aim of this manuscript is to provide two examples of mixed methods research, each of which addresses a health topic of considerable importance to older Latinos and their families. These two examples will illustrate (a) the complementary use of qualitative and quantitative methods to advance the health of older Latinos in an area that is important from a public health perspective, and (b) the “translation” of findings from observational studies (informed by social science and medicine) to the development and testing of interventions. PMID:23996325
Analysis of dynamic brain oscillations: methodological advances.
Le Van Quyen, Michel; Bragin, Anatol
2007-07-01
In recent years, new recording technologies have advanced such that, at high temporal and spatial resolutions, oscillations of neuronal networks can be identified from simultaneous, multisite recordings. However, because of the deluge of multichannel data generated by these experiments, achieving the full potential of parallel neuronal recordings also depends on the development of new mathematical methods that can extract meaningful information relating to time, frequency and space. Here, we aim to bridge this gap by focusing on up-to-date recording techniques for measurement of network oscillations and new analysis tools for their quantitative assessment. In particular, we emphasize how these methods can be applied, what property might be inferred from neuronal signals and potentially productive future directions. This review is part of the INMED and TINS special issue, Physiogenic and pathogenic oscillations: the beauty and the beast, derived from presentations at the annual INMED and TINS symposium (http://inmednet.com).
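As a small illustration of the time-frequency tools such reviews survey, the sketch below computes a spectrogram of a synthetic signal whose oscillation switches from the theta band to the gamma band; this is generic scipy usage, not a method taken from the review itself:

    import numpy as np
    from scipy.signal import spectrogram

    fs = 1000.0  # sampling rate, Hz
    t = np.arange(0, 10, 1 / fs)
    # Synthetic "network oscillation": 6 Hz for 5 s, then 40 Hz, plus noise.
    x = np.where(t < 5, np.sin(2 * np.pi * 6 * t), np.sin(2 * np.pi * 40 * t))
    x = x + 0.3 * np.random.default_rng(0).standard_normal(t.size)

    f, tt, Sxx = spectrogram(x, fs=fs, nperseg=512, noverlap=384)
    # The dominant frequency per time window tracks the 6 Hz -> 40 Hz switch.
    print(np.round(f[Sxx.argmax(axis=0)], 1))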
Principles and applications of polymerase chain reaction in medical diagnostic fields: a review
Valones, Marcela Agne Alves; Guimarães, Rafael Lima; Brandão, Lucas André Cavalcanti; de Souza, Paulo Roberto Eleutério; de Albuquerque Tavares Carvalho, Alessandra; Crovela, Sergio
2009-01-01
Recent developments in molecular methods have revolutionized the detection and characterization of microorganisms in a broad range of medical diagnostic fields, including virology, mycology, parasitology, microbiology and dentistry. Among these methods, Polymerase Chain Reaction (PCR) has generated great benefits and allowed scientific advancements. PCR is an excellent technique for the rapid detection of pathogens, including those difficult to culture. Along with conventional PCR techniques, Real-Time PCR has emerged as a technological innovation and is playing an ever-increasing role in clinical diagnostics and research laboratories. Due to its capacity to generate both qualitative and quantitative results, Real-Time PCR is considered a fast and accurate platform. The aim of the present literature review is to explore the clinical usefulness and potential of both conventional PCR and Real-Time PCR assays in diverse medical fields, addressing its main uses and advances. PMID:24031310
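On the Real-Time PCR side, relative quantification is commonly done with the 2^-ΔΔCt calculation; a minimal sketch with invented cycle-threshold values follows (the method assumes near-100% amplification efficiency for both target and reference):

    # 2^-ddCt relative quantification (invented Ct values).
    ct_target_sample, ct_ref_sample = 24.5, 18.0    # gene of interest / reference gene
    ct_target_control, ct_ref_control = 27.1, 18.2

    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    fold_change = 2 ** -(d_ct_sample - d_ct_control)
    print(f"expression vs control: {fold_change:.1f}-fold")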
[The role of meta-analysis in assessing the treatment of advanced non-small cell lung cancer].
Pérol, M; Pérol, D
2004-02-01
Meta-analysis is a statistical method allowing evaluation of the direction and quantitative importance of a treatment effect observed in randomized trials that have tested the treatment without providing a definitive conclusion. In the present review, we discuss the methodology and the contribution of meta-analyses to the treatment of advanced-stage or metastatic non-small-cell lung cancer. In this area of oncology, meta-analyses have provided decisive information demonstrating the impact of chemotherapy on patient survival. They have also helped define a cisplatin-based two-drug regimen as the gold standard treatment for patients with a satisfactory general status. Recently, the meta-analysis method was used to measure the influence of gemcitabine in combination with platinum salts and demonstrated a small but significant survival benefit, confirming that gemcitabine in combination with cisplatin remains a gold standard treatment.
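The quantitative core of such a meta-analysis is inverse-variance weighted pooling of per-trial effect estimates; a minimal fixed-effect sketch on invented log hazard ratios follows (real survival meta-analyses add heterogeneity assessment and often a random-effects model):

    import numpy as np

    # Invented per-trial log hazard ratios and standard errors.
    log_hr = np.array([-0.22, -0.10, -0.35, -0.05])
    se = np.array([0.12, 0.15, 0.20, 0.10])

    w = 1.0 / se**2                           # inverse-variance weights
    pooled = np.sum(w * log_hr) / np.sum(w)   # fixed-effect pooled estimate
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    print(f"pooled HR = {np.exp(pooled):.2f} "
          f"(95% CI {np.exp(lo):.2f}-{np.exp(hi):.2f})")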
NASA Astrophysics Data System (ADS)
Chyasnavichyus, Marius; Young, Seth L.; Tsukruk, Vladimir V.
2015-08-01
Probing of micro- and nanoscale mechanical properties of soft materials with atomic force microscopy (AFM) gives essential information about the performance of nanostructured polymer systems, natural nanocomposites and ultrathin coatings, and about cell functioning. AFM provides an efficient, and in some cases the exclusive, way to study these properties nondestructively in a controlled environment. Precise force control in AFM methods allows application to a variety of soft materials and can be used to go beyond elastic properties and examine temperature- and rate-dependent material responses. In this review, we discuss experimental AFM methods currently used in the field of soft nanostructured composites and biomaterials. We discuss advantages and disadvantages of common AFM probing techniques, which allow both qualitative and quantitative mapping of the elastic modulus of soft materials with nanoscale resolution. We also discuss several advanced techniques for more elaborate measurements of the viscoelastic properties of soft materials, and experiments on single cells.
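A common quantitative route from an AFM force-indentation curve to an elastic modulus, one of the standard approaches in this field, is a Hertz-model fit, F = (4/3)·E/(1 - ν²)·√R·δ^(3/2) for a spherical tip; in the sketch below the tip radius, Poisson ratio, and force data are all invented for illustration:

    import numpy as np
    from scipy.optimize import curve_fit

    R = 20e-9   # assumed spherical tip radius, m
    nu = 0.5    # assumed Poisson ratio (incompressible soft sample)

    def hertz(delta, E):
        """Hertz contact force of a sphere: F = 4/3 * E/(1-nu^2) * sqrt(R) * delta^1.5."""
        return (4.0 / 3.0) * (E / (1.0 - nu**2)) * np.sqrt(R) * delta**1.5

    # Invented force curve: 0-100 nm indentation of a ~50 kPa material plus noise.
    delta = np.linspace(0.0, 100e-9, 50)
    F = hertz(delta, 50e3) + 1e-12 * np.random.default_rng(0).standard_normal(50)

    (E_fit,), _ = curve_fit(hertz, delta, F, p0=[10e3])
    print(f"fitted Young's modulus ~ {E_fit / 1e3:.1f} kPa")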
Multidimensional Processing and Visual Rendering of Complex 3D Biomedical Images
NASA Technical Reports Server (NTRS)
Sams, Clarence F.
2016-01-01
The proposed technology uses advanced image analysis techniques to maximize the resolution and utility of medical imaging methods being used during spaceflight. We utilize COTS technology for medical imaging, but our applications require higher resolution assessment of the medical images than is routinely applied with nominal system software. By leveraging advanced data reduction and multidimensional imaging techniques utilized in analysis of Planetary Sciences and Cell Biology imaging, it is possible to significantly increase the information extracted from the onboard biomedical imaging systems. Year 1 focused on application of these techniques to the ocular images collected on ground test subjects and ISS crewmembers. Focus was on the choroidal vasculature and the structure of the optic disc. Methods allowed for increased resolution and quantitation of structural changes enabling detailed assessment of progression over time. These techniques enhance the monitoring and evaluation of crew vision issues during space flight.
Consolidation of molecular testing in clinical virology.
Scagnolari, Carolina; Turriziani, Ombretta; Monteleone, Katia; Pierangeli, Alessandra; Antonelli, Guido
2017-04-01
The development of quantitative methods for the detection of viral nucleic acids has significantly improved our ability to manage disease progression and to assess the efficacy of antiviral treatment. Moreover, major advances in molecular technologies during the last decade have allowed the identification of new host genetic markers associated with antiviral drug response, and will also profoundly change the way we perform virus diagnostics in the coming years. Areas covered: In this review, we describe the history and development of diagnostic methods in virology, with particular emphasis on the gradual evolution and recent advances toward the introduction of multiparametric platforms for syndromic diagnosis. In parallel, we outline the consolidation of viral genome quantification practice in different clinical settings. Expert commentary: More rapid, accurate and affordable molecular technology can be expected, with particular emphasis on emerging techniques (next generation sequencing, digital PCR, point-of-care testing and syndromic diagnosis) to simplify viral diagnosis in the near future.
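Among the emerging techniques mentioned, digital PCR has a particularly clean quantitative model: partition the sample, count positive partitions, and invert the Poisson distribution. A minimal sketch with invented counts:

    import math

    # Invented digital PCR run: 20,000 partitions of 0.85 nL each.
    n_partitions = 20000
    n_positive = 4300
    vol_nl = 0.85

    p = n_positive / n_partitions
    lam = -math.log(1.0 - p)   # mean copies per partition (Poisson inversion)
    copies_per_ul = lam * 1000.0 / vol_nl
    print(f"lambda = {lam:.3f} -> ~{copies_per_ul:,.0f} copies/uL")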
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
Dynamic Quantitative Trait Locus Analysis of Plant Phenomic Data.
Li, Zitong; Sillanpää, Mikko J
2015-12-01
Advanced platforms have recently become available for automatic and systematic quantification of plant growth and development. These new techniques can efficiently produce multiple measurements of phenotypes over time, and introduce time as an extra dimension to quantitative trait locus (QTL) studies. Functional mapping utilizes a class of statistical models for identifying QTLs associated with the growth characteristics of interest. A major benefit of functional mapping is that it integrates information over multiple timepoints, and therefore could increase the statistical power for QTL detection. We review the current development of computationally efficient functional mapping methods which provide invaluable tools for analyzing large-scale timecourse data that are readily available in our post-genome era.
Ulmer, Candice Z; Ragland, Jared M; Koelmel, Jeremy P; Heckert, Alan; Jones, Christina M; Garrett, Timothy J; Yost, Richard A; Bowden, John A
2017-12-19
As advances in analytical separation techniques, mass spectrometry instrumentation, and data processing platforms continue to spur growth in the lipidomics field, more structurally unique lipid species are detected and annotated. The lipidomics community is in need of benchmark reference values to assess the validity of various lipidomics workflows in providing accurate quantitative measurements across the diverse lipidome. LipidQC addresses the harmonization challenge in lipid quantitation by providing a semiautomated process, independent of analytical platform, for visual comparison of experimental results of National Institute of Standards and Technology Standard Reference Material (SRM) 1950, "Metabolites in Frozen Human Plasma", against benchmark consensus mean concentrations derived from the NIST Lipidomics Interlaboratory Comparison Exercise.
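The comparison LipidQC automates can be reduced to a standard metrology check: a zeta-score of the laboratory value against the consensus benchmark, combining both uncertainties. The sketch below shows the generic statistic with invented numbers, not LipidQC's exact output:

    import math

    # Invented example: one lipid's concentration in SRM 1950 as measured by a
    # laboratory, versus a consensus benchmark, both with standard uncertainties.
    x_lab, u_lab = 1.42, 0.08   # lab mean and uncertainty (e.g., umol/mL)
    x_ref, u_ref = 1.30, 0.05   # consensus mean and uncertainty

    zeta = (x_lab - x_ref) / math.sqrt(u_lab**2 + u_ref**2)
    verdict = "consistent" if abs(zeta) <= 2 else "flag for review"
    print(f"zeta = {zeta:.2f} -> {verdict}")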
Chemical Bonding in Sulfide Minerals
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vaughan, David J.; Rosso, Kevin M.
An understanding of chemical bonding and electronic structure in sulfide minerals is central to any attempt at understanding their crystal structures, stabilities and physical properties. It is also an essential precursor to understanding reactivity through modeling surface structure at the molecular scale. In recent decades, there have been remarkable advances in first principles (ab initio) methods for the quantitative calculation of electronic structure. These advances have been made possible by the very rapid development of high performance computers. Several review volumes that chart the applications of these developments in mineralogy and geochemistry are available (Tossell and Vaughan, 1992; Cygan and Kubicki, 2001). An important feature of the sulfide minerals is the diversity of their electronic structures, as evidenced by their electrical and magnetic properties (see Pearce et al. 2006, this volume). Thus, sulfide minerals range from insulators through semiconductors to metals, and exhibit every type of magnetic behavior. This has presented problems for those attempting to develop bonding models for sulfides, and also led to certain misconceptions regarding the kinds of models that may be appropriate. In this chapter, chemical bonding and electronic structure models for sulfides are reviewed with emphasis on more recent developments. Although the fully ab initio quantitative methods are now capable of a remarkable degree of sophistication in terms of agreement with experiment and potential to interpret and predict behavior with varying conditions, both qualitative and more simplistic quantitative approaches will also be briefly discussed. This is because we believe that the insights which they provide are still helpful to those studying sulfide minerals. In addition to the application of electronic structure models and calculations to solid sulfides, work on sulfide mineral surfaces (Rosso and Vaughan 2006a,b) and solution complexes and clusters (Rickard and Luther, 2006) are discussed in detail later in this volume.
Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo
2013-10-10
The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
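Of the two quantitation approaches compared, Multiple Headspace Extraction (MHE) exploits the fact that successive extractions of the same vial deplete the analyte geometrically, so the total amount can be projected from a log-linear fit of peak area against extraction number. A minimal numerical sketch with invented peak areas:

```python
import numpy as np

# Illustrative MHE peak areas for one analyte over four successive
# headspace extractions of the same vial (values are made up).
areas = np.array([10500.0, 7350.0, 5140.0, 3600.0])
i = np.arange(len(areas))

# ln(A_i) = ln(A_1) - q*i, so a linear fit gives the exhaustion constant q
slope, intercept = np.polyfit(i, np.log(areas), 1)
q = -slope
A1 = np.exp(intercept)

# The total extractable amount is the sum of the geometric series
A_total = A1 / (1.0 - np.exp(-q))
print(f"q = {q:.3f}, projected total peak area = {A_total:.0f}")
```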
Klein, Theo; Viner, Rosa I; Overall, Christopher M
2016-10-28
Adaptive immunity is the specialized defence mechanism in vertebrates that evolved to eliminate pathogens. Specialized lymphocytes recognize specific protein epitopes through antigen receptors to mount potent immune responses, many of which are initiated by nuclear factor-kappa B activation and gene transcription. Most, if not all, pathways in adaptive immunity are further regulated by post-translational modification (PTM) of signalling proteins, e.g. phosphorylation, citrullination, ubiquitination and proteolytic processing. The importance of PTMs is reflected by genetic or acquired defects in these pathways that lead to a dysfunctional immune response. Here we discuss the state of the art in targeted proteomics and systems biology approaches to dissect the PTM landscape specifically regarding ubiquitination and proteolysis in B- and T-cell activation. Recent advances have occurred in methods for specific enrichment and targeted quantitation. Together with improved instrument sensitivity, these advances enable the accurate analysis of often rare PTM events that are opaque to conventional proteomics approaches, now rendering in-depth analysis and pathway dissection possible. We discuss published approaches, including as a case study the profiling of the N-terminome of lymphocytes of a rare patient with a genetic defect in the paracaspase protease MALT1, a key regulator protease in antigen-driven signalling, which was manifested by elevated linear ubiquitination.This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.
Fairhurst, Karen; Ziebland, Sue; Wyke, Sally; Seaman, Peter; Glasier, Anna
2004-07-01
The Lothian Emergency Contraception Project (LECP)--a primary care-based intervention to offer advance supplies of emergency contraception (EC) to women aged 16-29--was not associated with a reduction in abortion rates. We undertook case studies, utilizing qualitative and quantitative methods, to evaluate the intervention. In this article we present findings from qualitative interviews with 44 primary care professionals working at case study sites and 22 women who had received advance supplies, in order to explain this failure. Professionals reported that women rarely asked for advance supplies of EC and they were reluctant to offer supplies to women because of concerns about contradictory sexual health messages implied by the offer, a perceived association of EC use with chaotic behavior by women, views about the sort of women suitable for advance supplies and practical difficulties making the offer. Women were reluctant to ask for advance supplies because of misgivings about the appropriateness of offering advance supplies to everybody and concerns about being judged by health professionals as morally inadequate. If advance provision of EC is to be successful in reducing abortion rates, professionals must address their concerns about EC and develop imaginative ways of encouraging women most at risk of unwanted pregnancy to take supplies home.
Sharpening advanced land imager multispectral data using a sensor model
Lemeshewsky, G.P.; ,
2005-01-01
The Advanced Land Imager (ALI) instrument on NASA's Earth Observing One (EO-1) satellite provides nine spectral bands at 30-m ground sample distance (GSD) and a 10-m GSD panchromatic band. This report describes an image sharpening technique in which the higher spatial resolution information of the panchromatic band is used to increase the spatial resolution of ALI multispectral (MS) data. To preserve the spectral characteristics, this technique combines reported deconvolution deblurring methods for the MS data with highpass filter-based fusion methods for the Pan data. The deblurring process uses the point spread function (PSF) model of the ALI sensor; the PSF was calculated from pre-launch calibration data. Performance was evaluated using simulated ALI MS data generated by degrading the spatial resolution of high-resolution IKONOS satellite MS data. A quantitative measure of performance was the error between the sharpened MS data and the high-resolution reference. This report also compares performance with that of a reported method that includes PSF information. Preliminary results indicate improved sharpening with the method reported here.
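The highpass-filter fusion step described here has a simple generic form: upsample the multispectral band to the panchromatic grid, then add the high-frequency residual of the panchromatic band. The sketch below shows generic HPF fusion under assumed parameters (a Gaussian surrogate for the low-pass step and an illustrative gain); it is not the report's PSF-based deblurring chain.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, zoom

def hpf_sharpen(ms_band, pan, scale=3, sigma=1.5, gain=1.0):
    """Generic high-pass-filter fusion: inject the pan band's
    high-frequency detail into an upsampled multispectral band."""
    ms_up = zoom(ms_band, scale, order=3)        # e.g. 30 m -> 10 m grid
    ms_up = ms_up[:pan.shape[0], :pan.shape[1]]  # guard against rounding
    detail = pan - gaussian_filter(pan, sigma)   # high-pass of pan
    return ms_up + gain * detail

rng = np.random.default_rng(1)
pan = rng.random((60, 60))
ms = zoom(pan, 1 / 3, order=1)   # simulate a coarser 20x20 MS band
print(hpf_sharpen(ms, pan, scale=3).shape)
```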
Quantitative genetic methods depending on the nature of the phenotypic trait.
de Villemereuil, Pierre
2018-01-24
A consequence of the assumptions of the infinitesimal model, one of the most important theoretical foundations of quantitative genetics, is that phenotypic traits are predicted to be most often normally distributed (so-called Gaussian traits). But phenotypic traits, especially those interesting for evolutionary biology, might be shaped according to very diverse distributions. Here, I show how quantitative genetics tools have been extended to account for a wider diversity of phenotypic traits using first the threshold model and then more recently using generalized linear mixed models. I explore the assumptions behind these models and how they can be used to study the genetics of non-Gaussian complex traits. I also comment on three recent methodological advances in quantitative genetics that widen our ability to study new kinds of traits: the use of "modular" hierarchical modeling (e.g., to study survival in the context of capture-recapture approaches for wild populations); the use of aster models to study a set of traits with conditional relationships (e.g., life-history traits); and, finally, the study of high-dimensional traits, such as gene expression. © 2018 New York Academy of Sciences.
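For binary traits of the kind handled by the threshold model, a heritability estimated on the observed 0/1 scale is conventionally converted to the underlying liability scale. A short sketch of the standard Dempster-Lerner transformation follows; the example numbers are arbitrary.

```python
from scipy.stats import norm

def liability_h2(h2_obs, prevalence):
    """Dempster-Lerner transformation of an observed-scale heritability
    for a binary (threshold) trait to the liability scale."""
    t = norm.ppf(1.0 - prevalence)  # threshold on the liability scale
    z = norm.pdf(t)                 # normal density at the threshold
    return h2_obs * prevalence * (1.0 - prevalence) / z**2

print(liability_h2(h2_obs=0.10, prevalence=0.25))
```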
Yehia, Ali M; Arafa, Reham M; Abbas, Samah S; Amer, Sawsan M
2016-01-15
Spectral resolution of cefquinome sulfate (CFQ) in the presence of its degradation products was studied. Three selective, accurate and rapid spectrophotometric methods were developed for the determination of CFQ in the presence of either its hydrolytic, oxidative or photo-degradation products. The proposed ratio difference, derivative ratio and mean centering methods are ratio-manipulating spectrophotometric methods that were satisfactorily applied for the selective determination of CFQ within a linear range of 5.0-40.0 μg mL(-1). Concentration Residuals Augmented Classical Least Squares was applied and evaluated for the determination of the cited drug in the presence of all its degradation products. Traditional Partial Least Squares regression was also applied and benchmarked against the proposed advanced multivariate calibration. Twenty-five experimentally designed synthetic mixtures of three factors at five levels were used to calibrate and validate the multivariate models. Advanced chemometrics succeeded in the quantitative and qualitative analyses of CFQ along with its hydrolytic, oxidative and photo-degradation products. The proposed methods were applied successfully to the analysis of different pharmaceutical formulations. These developed methods were simple and cost-effective compared with the manufacturer's RP-HPLC method. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo
2018-04-01
For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, the development of advanced vascular quantification algorithms is still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo 3-D vascular structures of a rat iris with a label-free imaging method and also accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
Xiu, Junshan; Liu, Shiming; Sun, Meiling; Dong, Lili
2018-01-20
The photoelectric performance of metal-ion-doped TiO2 films can be improved by changing the composition and concentration of the additive elements. In this work, TiO2 films doped with different Sn concentrations were obtained with the hydrothermal method. Qualitative and quantitative analysis of the Sn element in the TiO2 films was achieved with laser-induced breakdown spectroscopy (LIBS), with calibration curves plotted accordingly. The photoelectric characteristics of TiO2 films doped with different Sn contents were observed with UV-visible absorption spectra and J-V curves. All results showed that Sn doping red-shifted the optical absorption and improved the photoelectric properties of the TiO2 films. When the concentration of Sn doping in the TiO2 films was 11.89 mmol/L, as calculated from the LIBS calibration curves, the current density of the film was the largest, indicating the best photoelectric performance. These results indicate that LIBS is a feasible measurement method for the qualitative and quantitative analysis of additive elements in metal oxide nanometer-scale films.
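Quantitation by LIBS as described rests on an ordinary calibration curve: emission-line intensity is regressed on known dopant concentrations and the fitted line is inverted for unknown samples. A minimal sketch with invented intensities, not the authors' data:

```python
import numpy as np

# Hypothetical Sn emission-line intensities (a.u.) for films of known
# dopant concentration (mmol/L); all numbers are illustrative.
conc = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
intensity = np.array([120.0, 960.0, 1830.0, 2690.0, 3570.0])

slope, intercept = np.polyfit(conc, intensity, 1)
r2 = np.corrcoef(conc, intensity)[0, 1] ** 2

# Invert the calibration curve for an unknown sample
unknown = 2200.0
print(f"R^2 = {r2:.4f}, estimated Sn = {(unknown - intercept) / slope:.2f} mmol/L")
```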
Quantitative optical metrology with CMOS cameras
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.
2004-08-01
Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
Advancing the Science of Qualitative Research to Promote Health Equity.
Griffith, Derek M; Shelton, Rachel C; Kegler, Michelle
2017-10-01
Qualitative methods have long been a part of health education research, but how qualitative approaches advance health equity has not been well described. Qualitative research is an increasingly important methodologic tool to use in efforts to understand, inform, and advance health equity. Qualitative research provides critical insight into the subjective meaning and context of health that can be essential for understanding where and how to intervene to inform health equity research and practice. We describe the larger context for this special theme issue of Health Education & Behavior, provide brief overviews of the 15 articles that comprise the issue, and discuss the promise of qualitative research that seeks to contextualize and illuminate answers to research questions in efforts to promote health equity. We highlight the critical role that qualitative research can play in considering and incorporating a diverse array of contextual information that is difficult to capture in quantitative research.
NASA Astrophysics Data System (ADS)
Pratt, Jon R.; Kramar, John A.; Newell, David B.; Smith, Douglas T.
2005-05-01
If nanomechanical testing is to evolve into a tool for process and quality control in semiconductor fabrication, great advances in throughput, repeatability, and accuracy of the associated instruments and measurements will be required. A recent grant awarded by the NIST Advanced Technology Program seeks to address the throughput issue by developing a high-speed AFM-based platform for quantitative nanomechanical measurements. The following paper speaks to the issue of quantitative accuracy by presenting an overview of various standards and techniques under development at NIST and other national metrology institutes (NMIs) that can provide a metrological basis for nanomechanical testing. The infrastructure we describe places firm emphasis on traceability to the International System of Units, paving the way for truly quantitative, rather than qualitative, physical property testing.
Comparison of Variance-to-Mean Ratio Methods for Reparables Inventory Management
2006-03-01
for Recoverable Items in the ALS [Advanced Logistics System] Marginal Analysis Algorithms". Marginal analysis is a microeconomics technique used...in the Demands Workbook. The quantitative expected backorder and aircraft availability percentage result. Each of the 30 simulations is run five...10A, B-2A, C-17A and F-15E aircraft. The data was selected from D200A's Ddb04 tables and flying hour programs respectively. The two workbook (OIM
Balboni, Michael J.; Babar, Amenah; Dillinger, Jennifer; Phelps, Andrea C.; George, Emily; Block, Susan D.; Kachnic, Lisa; Hunt, Jessica; Peteet, John; Prigerson, Holly G.; VanderWeele, Tyler J.; Balboni, Tracy A.
2012-01-01
Context Although prayer potentially serves as an important practice in offering religious/spiritual support, its role in the clinical setting remains disputed. Few data exist to guide the role of patient-practitioner prayer in the setting of advanced illness. Objectives To inform the role of prayer in the setting of life-threatening illness, this study used mixed quantitative-qualitative methods to describe the viewpoints expressed by patients with advanced cancer, oncology nurses, and oncology physicians concerning the appropriateness of clinician prayer. Methods This is a cross-sectional, multisite, mixed-methods study of advanced cancer patients (n = 70), oncology physicians (n = 206), and oncology nurses (n = 115). Semistructured interviews were used to assess respondents’ attitudes toward the appropriate role of prayer in the context of advanced cancer. Theme extraction was performed based on interdisciplinary input using grounded theory. Results Most advanced cancer patients (71%), nurses (83%), and physicians (65%) reported that patient-initiated patient-practitioner prayer was at least occasionally appropriate. Furthermore, clinician prayer was viewed as at least occasionally appropriate by the majority of patients (64%), nurses (76%), and physicians (59%). Of those patients who could envision themselves asking their physician or nurse for prayer (61%), 86% would find this form of prayer spiritually supportive. Most patients (80%) viewed practitioner-initiated prayer as spiritually supportive. Open-ended responses regarding the appropriateness of patient-practitioner prayer in the advanced cancer setting revealed six themes shaping respondents’ viewpoints: necessary conditions for prayer, potential benefits of prayer, critical attitudes toward prayer, positive attitudes toward prayer, potential negative consequences of prayer, and prayer alternatives. Conclusion Most patients and practitioners view patient-practitioner prayer as at least occasionally appropriate in the advanced cancer setting, and most patients view prayer as spiritually supportive. However, the appropriateness of patient-practitioner prayer is case specific, requiring consideration of multiple factors. PMID:21276700
Use of Nanoparticle Contrast Agents for Cell Tracking with Computed Tomography
2017-01-01
Efforts to develop novel cell-based therapies originated with the first bone marrow transplant on a leukemia patient in 1956. Preclinical and clinical examples of cell-based treatment strategies have shown promising results across many disciplines in medicine, with recent advances in immune cell therapies for cancer producing remarkable response rates, even in patients with multiple treatment failures. However, cell-based therapies suffer from inconsistent outcomes, motivating the search for tools that allow monitoring of cell delivery and behavior in vivo. Noninvasive cell imaging techniques, also known as cell tracking, have been developed to address this issue. These tools can allow real-time, quantitative, and long-term monitoring of transplanted cells in the recipient, providing insight on cell migration, distribution, viability, differentiation, and fate, all of which play crucial roles in treatment efficacy. Understanding these parameters allows the optimization of cell choice, delivery route, and dosage for therapy and advances cell-based therapy for specific clinical uses. To date, most cell tracking work has centered on imaging modalities such as MRI, radionuclide imaging, and optical imaging. However, X-ray computed tomography (CT) is an emerging method for cell tracking that has several strengths such as high spatial and temporal resolution, and excellent quantitative capabilities. The advantages of CT for cell tracking are enhanced by its wide availability and cost effectiveness, allowing CT to become one of the most popular clinical imaging modalities and a key asset in disease diagnosis. In this review, we will discuss recent advances in cell tracking methods using X-ray CT in various applications, in addition to predictions on how the field will progress. PMID:28485976
Neltner, Janna Hackett; Abner, Erin Lynn; Schmitt, Frederick A; Denison, Stephanie Kay; Anderson, Sonya; Patel, Ela; Nelson, Peter T
2012-12-01
Quantitative neuropathologic methods provide information that is important for both research and clinical applications. The technologic advancement of digital pathology and image analysis offers new solutions to enable valid quantification of pathologic severity that is reproducible between raters regardless of experience. Using an Aperio ScanScope XT and its accompanying image analysis software, we designed algorithms for quantitation of amyloid and tau pathologies on 65 β-amyloid (6F/3D antibody) and 48 phospho-tau (PHF-1)-immunostained sections of human temporal neocortex. Quantitative digital pathologic data were compared with manual pathology counts. There were excellent correlations between manually counted and digitally analyzed neuropathologic parameters (R² = 0.56-0.72). Data were highly reproducible among 3 participants with varying degrees of expertise in neuropathology (intraclass correlation coefficient values, >0.910). Digital quantification also provided additional parameters, including average plaque area, which shows statistically significant differences when samples are stratified according to apolipoprotein E allele status (average plaque area, 380.9 μm² in apolipoprotein E ε4 carriers vs 274.4 μm² for noncarriers; p < 0.001). Thus, digital pathology offers a rigorous and reproducible method for quantifying Alzheimer disease neuropathologic changes and may provide additional insights into morphologic characteristics that were previously more challenging to assess because of technical limitations.
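Agreement statistics of the kind reported here, correlation between manual and digital counts plus intraclass correlation across raters, are easy to reproduce. Below is a sketch of a single-measures ICC on invented rater data; treating it as ICC(3,1) (two-way mixed, fixed raters) is an assumption, since the abstract does not state the variant used.

```python
import numpy as np

def icc3_1(ratings):
    """ICC(3,1): single-measures consistency of the same targets
    scored by the same fixed raters (two-way mixed model)."""
    n, k = ratings.shape
    grand = ratings.mean()
    ss_rows = k * ((ratings.mean(axis=1) - grand) ** 2).sum()
    ss_cols = n * ((ratings.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Invented plaque counts: 8 sections scored by 3 raters
scores = np.array([[12, 14, 13], [30, 28, 31], [7, 8, 8], [22, 25, 24],
                   [41, 39, 42], [15, 17, 16], [3, 4, 3], [27, 26, 28]], float)
print(f"ICC(3,1) = {icc3_1(scores):.3f}")
```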
An assessment of advance regeneration and herbaceous communities in Pennsylvania forests
W. H. McWilliams; S. L. Stout; T. W. Bowersox; L. H. McCormick
1993-01-01
Intensive research in several parts of Pennsylvania has demonstrated that dense herbaceous cover can inhibit the development of advance regeneration, and that advance regeneration often is lacking in Pennsylvania forests. The lack of quantitative statewide information on forest understory prompted a study to characterize advance regeneration and fern inhibition for...
Cardiac imaging: working towards fully-automated machine analysis & interpretation.
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-03-01
Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.
Code of Federal Regulations, 2014 CFR
2014-01-01
... funds or total capital within the group. Quantitative disclosures (d) The aggregate amount of surplus... and conditions of the main features of all regulatory capital instruments. Quantitative disclosures (b... capital to support current and future activities. Quantitative disclosures (b) Risk-weighted assets for...
12 CFR 217.173 - Disclosures by certain advanced approaches Board-regulated institutions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... total capital within the group. Quantitative disclosures (d) The aggregate amount of surplus capital of... conditions of the main features of all regulatory capital instruments. Quantitative disclosures (b) The... current and future activities. Quantitative disclosures (b) Risk-weighted assets for credit risk from: (1...
12 CFR 324.173 - Disclosures by certain advanced approaches FDIC-supervised institutions.
Code of Federal Regulations, 2014 CFR
2014-01-01
... total capital within the group. Quantitative disclosures (d) The aggregate amount of surplus capital of... conditions of the main features of all regulatory capital instruments. Quantitative disclosures (b) The... current and future activities. Quantitative disclosures (b) Risk-weighted assets for credit risk from:(1...
An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise
ERIC Educational Resources Information Center
Parker, Richard H.
2011-01-01
An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…
Johnson, Heath E; Haugh, Jason M
2013-12-02
This unit focuses on the use of total internal reflection fluorescence (TIRF) microscopy and image analysis methods to study the dynamics of signal transduction mediated by class I phosphoinositide 3-kinases (PI3Ks) in mammalian cells. The first four protocols cover live-cell imaging experiments, image acquisition parameters, and basic image processing and segmentation. These methods are generally applicable to live-cell TIRF experiments. The remaining protocols outline more advanced image analysis methods, which were developed in our laboratory for the purpose of characterizing the spatiotemporal dynamics of PI3K signaling. These methods may be extended to analyze other cellular processes monitored using fluorescent biosensors. Copyright © 2013 John Wiley & Sons, Inc.
Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.
The LLNA: A Brief Review of Recent Advances and Limitations
Anderson, Stacey E.; Siegel, Paul D.; Meade, B. J.
2011-01-01
Allergic contact dermatitis is the second most commonly reported occupational illness, accounting for 10% to 15% of all occupational diseases. This highlights the importance of developing rapid and sensitive methods for hazard identification of chemical sensitizers. The murine local lymph node assay (LLNA) was developed and validated for the identification of low molecular weight sensitizing chemicals. It provides several benefits over other tests for sensitization because it provides a quantitative endpoint, dose-responsive data, and allows for prediction of potency. However, there are also several concerns with this assay including: levels of false positive responses, variability due to vehicle, and predictivity. This report serves as a concise review which briefly summarizes the progress, advances and limitations of the assay over the last decade. PMID:21747867
Optimisation of quantitative lung SPECT applied to mild COPD: a software phantom simulation study.
Norberg, Pernilla; Olsson, Anna; Alm Carlsson, Gudrun; Sandborg, Michael; Gustafsson, Agnetha
2015-01-01
The amount of inhomogeneities in a (99m)Tc Technegas single-photon emission computed tomography (SPECT) lung image, caused by reduced ventilation in lung regions affected by chronic obstructive pulmonary disease (COPD), is correlated with disease advancement. A quantitative analysis method measuring these inhomogeneities, the CVT method, was proposed in earlier work. To detect mild COPD, which is a difficult task, optimised parameter values are needed. In this work, the CVT method was optimised with respect to the parameter values of acquisition, reconstruction and analysis. The ordered subset expectation maximisation (OSEM) algorithm was used for reconstructing the lung SPECT images. As a first step towards clinical application of the CVT method in detecting mild COPD, this study was based on simulated SPECT images of an advanced anthropomorphic lung software phantom including respiratory and cardiac motion, where the mild COPD lung had an overall ventilation reduction of 5%. The best separation between healthy and mild COPD lung images, as determined using the CVT measure of ventilation inhomogeneity and 125 MBq of (99m)Tc, was obtained using a low-energy high-resolution (LEHR) collimator and a power 6 Butterworth post-filter with a cutoff frequency of 0.6 to 0.7 cm(-1). Sixty-four reconstruction updates and a small kernel size should be used when the whole lung is analysed; for the reduced lung, a greater number of updates and a larger kernel size are needed. A LEHR collimator and 125 MBq of (99m)Tc, together with an optimal combination of cutoff frequency, number of updates and kernel size, gave the best result. Suboptimal selection of cutoff frequency, number of updates or kernel size will reduce the imaging system's ability to detect mild COPD in the lung phantom.
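The Butterworth post-filter whose cutoff and power are tuned in this optimisation has the standard frequency-domain form H(f) = 1 / (1 + (f/f_c)^(2n)). Below is a generic 2-D implementation; the pixel size is an assumed value, the mapping of the vendor "power" parameter onto the exponent is an assumption (conventions vary), and the snippet illustrates only the filter, not the study's OSEM reconstruction or CVT analysis.

```python
import numpy as np

def butterworth_lowpass(image, cutoff_cm, order, pixel_cm):
    """Frequency-domain Butterworth low-pass post-filter as commonly
    applied to reconstructed SPECT slices (cutoff in cycles/cm)."""
    ny, nx = image.shape
    fy = np.fft.fftfreq(ny, d=pixel_cm)[:, None]  # cycles/cm
    fx = np.fft.fftfreq(nx, d=pixel_cm)[None, :]
    f = np.hypot(fy, fx)
    H = 1.0 / (1.0 + (f / cutoff_cm) ** (2 * order))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

slice_ = np.random.default_rng(2).random((128, 128))
smoothed = butterworth_lowpass(slice_, cutoff_cm=0.6, order=6,
                               pixel_cm=0.44)  # assumed pixel size
print(smoothed.shape)
```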
Hall, Sue; Edmonds, Polly; Harding, Richard; Chochinov, Harvey; Higginson, Irene J
2009-01-01
Background Loss of dignity for people with advanced cancer is associated with high levels of psychological and spiritual distress and the loss of the will to live. Dignity Therapy is a brief psychotherapy, which has been developed to help promote dignity and reduce distress. It comprises a recorded interview, which is transcribed, edited, then returned to the patient, who can bequeath it to people of their choosing. Piloting in Canada, Australia and the USA has suggested that Dignity Therapy is beneficial to people with advanced cancer and their families. The aims of this study are to assess the feasibility, acceptability and potential effectiveness of Dignity Therapy to reduce psychological and spiritual distress in people with advanced cancer who have been referred to hospital-based palliative care teams in the UK, and to pilot the methods for a Phase III RCT. Design A randomised controlled open-label trial. Forty patients with advanced cancer are randomly allocated to one of two groups: (i) Intervention (Dignity Therapy offered in addition to any standard care), and (ii) Control group (standard care). Recipients of the 'generativity' documents are asked their views on taking part in the study and the therapy. Both quantitative and qualitative outcomes are assessed in face-to-face interviews at baseline and at approximately one and four weeks after the intervention (equivalent in the control group). The primary outcome is patients' sense of dignity (potential effectiveness) assessed by the Patient Dignity Inventory. Secondary outcomes for patients include distress, hopefulness and quality of life. In view of the relatively small sample size, quantitative analyses are mainly descriptive. The qualitative analysis uses the Framework method. Discussion Dignity Therapy is brief, can be delivered at the bedside and may help both patients and their families. This detailed exploratory research will show whether it is feasible to offer Dignity Therapy to patients with advanced cancer, many of whom are likely to be in the terminal stage of their illness, whether it is acceptable to them and their families, whether it is likely to be effective, and whether a Phase III RCT is desirable. Trial registration Current Controlled Clinical Trials: ISRCTN29868352 PMID:19445711
Bryce, Richard A
2011-04-01
The ability to accurately predict the interaction of a ligand with its receptor is a key limitation in computer-aided drug design approaches such as virtual screening and de novo design. In this article, we examine current strategies for a physics-based approach to scoring of protein-ligand affinity, as well as outlining recent developments in force fields and quantum chemical techniques. We also consider advances in the development and application of simulation-based free energy methods to study protein-ligand interactions. Fuelled by recent advances in computational algorithms and hardware, there is the opportunity for increased integration of physics-based scoring approaches at earlier stages in computationally guided drug discovery. Specifically, we envisage increased use of implicit solvent models and simulation-based scoring methods as tools for computing the affinities of large virtual ligand libraries. Approaches based on end point simulations and reference potentials allow the application of more advanced potential energy functions to prediction of protein-ligand binding affinities. Comprehensive evaluation of polarizable force fields and quantum mechanical (QM)/molecular mechanical and QM methods in scoring of protein-ligand interactions is required, particularly in their ability to address challenging targets such as metalloproteins and other proteins that make highly polar interactions. Finally, we anticipate increasingly quantitative free energy perturbation and thermodynamic integration methods that are practical for optimization of hits obtained from screened ligand libraries.
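Among the simulation-based free energy methods discussed, thermodynamic integration is the most compact to state: the free energy difference is the integral over the coupling parameter lambda of the ensemble average of dU/dlambda. A toy quadrature sketch with invented values:

```python
import numpy as np

# Illustrative <dU/dlambda> averages (kcal/mol) at five coupling values;
# the numbers are made up for demonstration.
lam = np.array([0.0, 0.25, 0.5, 0.75, 1.0])
dudl = np.array([-3.1, -1.8, -0.4, 1.1, 2.6])

# Delta G = integral of <dU/dlambda> d(lambda), via trapezoidal quadrature
dG = (0.5 * (dudl[1:] + dudl[:-1]) * np.diff(lam)).sum()
print(f"Delta G = {dG:.2f} kcal/mol")
```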
Jung, Chanho; Kim, Changick
2014-08-01
Automatic segmentation of cell nuclei clusters is a key building block in systems for quantitative analysis of microscopy cell images. For that reason, it has received great attention over the last decade, and diverse automatic approaches to segment clustered nuclei with varying levels of performance under different test conditions have been proposed in the literature. To the best of our knowledge, however, so far there is no comparative study on the methods. This study is a first attempt to fill this research gap. More precisely, the purpose of this study is to present an objective performance comparison of existing state-of-the-art segmentation methods. Particularly, the impact of their accuracy on the classification of thyroid follicular lesions is also investigated "quantitatively" under the same experimental conditions, to evaluate the applicability of the methods. Thirteen different segmentation approaches are compared in terms of not only errors in nuclei segmentation and delineation, but also their impact on the performance of a system to classify thyroid follicular lesions using different metrics (e.g., diagnostic accuracy, sensitivity, specificity, etc.). Extensive experiments have been conducted on a total of 204 digitized thyroid biopsy specimens. Our study demonstrates that significant diagnostic errors can be avoided using more advanced segmentation approaches. We believe that this comprehensive comparative study serves as a reference point and guide for developers and practitioners in choosing an appropriate automatic segmentation technique for building automated systems for classifying follicular thyroid lesions. © 2014 International Society for Advancement of Cytometry.
Advanced ensemble modelling of flexible macromolecules using X-ray solution scattering.
Tria, Giancarlo; Mertens, Haydyn D T; Kachala, Michael; Svergun, Dmitri I
2015-03-01
Dynamic ensembles of macromolecules mediate essential processes in biology. Understanding the mechanisms driving the function and molecular interactions of 'unstructured' and flexible molecules requires alternative approaches to those traditionally employed in structural biology. Small-angle X-ray scattering (SAXS) is an established method for structural characterization of biological macromolecules in solution, and is directly applicable to the study of flexible systems such as intrinsically disordered proteins and multi-domain proteins with unstructured regions. The Ensemble Optimization Method (EOM) [Bernadó et al. (2007). J. Am. Chem. Soc. 129, 5656-5664] was the first approach to introduce the concept of ensemble fitting of the SAXS data from flexible systems. In this approach, a large pool of macromolecules covering the available conformational space is generated and a sub-ensemble of conformers coexisting in solution is selected, guided by the fit to the experimental SAXS data. This paper presents a series of new developments and advancements to the method, including significantly enhanced functionality and also quantitative metrics for the characterization of the results. Building on the original concept of ensemble optimization, the algorithms for pool generation have been redesigned to allow for the construction of partially or completely symmetric oligomeric models, and the selection procedure was improved to refine the size of the ensemble. Quantitative measures of the flexibility of the system studied, based on the characteristic integral parameters of the selected ensemble, are introduced. These improvements are implemented in the new EOM version 2.0, and the capabilities as well as inherent limitations of the ensemble approach in SAXS, and of EOM 2.0 in particular, are discussed.
Meng, Fan-Cheng; Wu, Zheng-Feng; Yin, Zhi-Qi; Lin, Li-Gen; Wang, Ruibing; Zhang, Qing-Wen
2018-01-01
Coptidis rhizoma (CR) is the dried rhizome of Coptis chinensis Franch., C. deltoidea C. Y. Cheng et Hsiao or C. teeta Wall. (Ranunculaceae) and is commonly used in Traditional Chinese Medicine for the treatment of various diseases including bacillary dysentery, typhoid, tuberculosis, epidemic cerebrospinal meningitis, empyrosis, pertussis, and other illnesses. A literature survey was conducted via the SciFinder, ScienceDirect, PubMed, Springer, and Wiley databases. A total of 139 selected references were classified on the basis of their research scopes, including chemical investigation, quality evaluation and pharmacological studies. Many types of secondary metabolites including alkaloids, lignans, phenylpropanoids, flavonoids, phenolic compounds, saccharides, and steroids have been isolated from CR. Among them, protoberberine-type alkaloids, such as berberine, palmatine, coptisine, epiberberine, jatrorrhizine, columamine, are the main components of CR. Quantitative determination of these alkaloids is a very important aspect of the quality evaluation of CR. In recent years, with advances in isolation and detection technologies, many new instruments and methods have been developed for the quantitative and qualitative analysis of the main alkaloids from CR. Quality control of CR provides a basis for its safe pharmacological application. These quality evaluation methods are also frequently employed to screen the active components from CR. Various investigations have shown that CR and its main alkaloids exhibit many powerful pharmacological effects including anti-inflammatory, anti-cancer, anti-diabetic, neuroprotective, cardioprotective, hypoglycemic, anti-Alzheimer and hepatoprotective activities. This review summarizes recent phytochemical investigations, quality evaluation methods, and biological studies focusing on CR and its main alkaloids.
NASA Astrophysics Data System (ADS)
Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.
2016-03-01
A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients to responders/non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on Hilbert-Schmidt independence criterion (HSIC) was used for the design of feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability was ideally obtained. The results of the classification using the developed CAT system indicated an improvement of performance compared to a CAT system with basic features using histogram of intensity.
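The empirical HSIC statistic used here for both the feature extraction and the dissimilarity measure has a compact form: HSIC = tr(KHLH) / (n-1)^2, where K and L are kernel matrices on the two paired samples and H is the centering matrix. A sketch with RBF kernels on toy "pre" and "mid" feature vectors; the kernel width and the data are assumptions, not the study's parametric maps.

```python
import numpy as np

def rbf_kernel(X, sigma):
    # Pairwise RBF (Gaussian) kernel matrix on the rows of X
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2.0 * sigma ** 2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical Hilbert-Schmidt independence criterion
    between two paired samples, using RBF kernels."""
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    K, L = rbf_kernel(X, sigma), rbf_kernel(Y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(3)
pre = rng.random((50, 5))                   # "pre-treatment" features
mid = pre + rng.normal(0, 0.1, pre.shape)   # dependent "mid-treatment" features
print(hsic(pre, mid), hsic(pre, rng.random((50, 5))))  # high vs near-zero
```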
Senesi, Giorgio S; Senesi, Nicola
2016-09-28
Soil organic carbon (OC) measurement is a crucial factor for quantifying soil C pools and inventories and monitoring the inherent temporal and spatial heterogeneity and changes of soil OC content. These are relevant issues in addressing the sustainable management of terrestrial OC, aiming to enhance C sequestration in soil and thus mitigate the impact of increasing CO2 concentration in the atmosphere and related effects on global climate change. Nowadays, dry combustion by an elemental analyzer or wet combustion by dichromate oxidation of the soil sample are the most recommended and commonly used methods for quantitative soil OC determination. However, the unanimously recognized uncertainties and limitations of these classical, laborious methods have prompted research efforts focusing on the development and application of more advanced and appealing techniques and methods for the measurement of soil OC in the laboratory and possibly in situ in the field. Among these, laser-induced breakdown spectroscopy (LIBS) has attracted the greatest interest for its unique advantages. After an introduction and a highlight of the LIBS basic principles, instrumentation, methodologies and supporting chemometric methods, the main body of this review provides an historical and critical overview of the developments and results obtained to date by the application of LIBS to the quantitative measurement of soil C and especially OC content. A brief critical summary of LIBS advantages and limitations/drawbacks, including some final remarks and future perspectives, concludes this review. Copyright © 2016 Elsevier B.V. All rights reserved.
Mistry, Rashmita S; White, Elizabeth S; Chow, Kirby A; Griffin, Katherine M; Nenadal, Lindsey
2016-01-01
Mixed methods research approaches are gaining traction across various social science disciplines, including among developmental scientists. In this chapter, we discuss the utility of a mixed methods research approach in examining issues related to equity and justice. We incorporate a brief overview of quantitative and qualitative monomethod research approaches in our larger discussion of the advantages, procedures, and considerations of employing a mixed methods design to advance developmental science from an equity and justice perspective. To better illustrate the theoretical and practical significance of a mixed methods research approach, we include examples of research conducted on children and adolescents' conceptions of economic inequality as one example of developmental science research with an equity and justice frame. © 2016 Elsevier Inc. All rights reserved.
Semantic Edge Based Disparity Estimation Using Adaptive Dynamic Programming for Binocular Sensors
Zhu, Dongchen; Li, Jiamao; Wang, Xianshun; Peng, Jingquan; Shi, Wenjun; Zhang, Xiaolin
2018-01-01
Disparity calculation is crucial for binocular sensor ranging. The disparity estimation based on edges is an important branch in the research of sparse stereo matching and plays an important role in visual navigation. In this paper, we propose a robust sparse stereo matching method based on the semantic edges. Some simple matching costs are used first, and then a novel adaptive dynamic programming algorithm is proposed to obtain optimal solutions. This algorithm makes use of the disparity or semantic consistency constraint between the stereo images to adaptively search parameters, which can improve the robustness of our method. The proposed method is compared quantitatively and qualitatively with the traditional dynamic programming method, some dense stereo matching methods, and the advanced edge-based method respectively. Experiments show that our method can provide superior performance on the above comparison. PMID:29614028
Fung, Eliza N; Bryan, Peter; Kozhich, Alexander
2016-04-01
LC-MS/MS has been investigated to quantify protein therapeutics in biological matrices. The protein therapeutics is digested by an enzyme to generate surrogate peptide(s) before LC-MS/MS analysis. One challenge is isolating the protein therapeutics in the presence of the large number of endogenous proteins in biological matrices. Immunocapture, in which a capture agent is used to preferentially bind the protein therapeutics over other proteins, is gaining traction. The protein therapeutics is eluted for digestion and LC-MS/MS analysis. One area of tremendous potential for immunocapture-LC-MS/MS is to obtain quantitative data where a ligand-binding assay alone is not sufficient, for example, quantitation of antidrug antibody complexes. Herein, we present an overview of recent advances in enzyme digestion and immunocapture applicable to protein quantitation.
Hirschfeld, Gerrit; Blankenburg, Markus R; Süß, Moritz; Zernikow, Boris
2015-01-01
The assessment of somatosensory function is a cornerstone of research and clinical practice in neurology. Recent initiatives have developed novel protocols for quantitative sensory testing (QST). Application of these methods has led to intriguing findings, such as the presence of lower pain thresholds in healthy children compared to healthy adolescents. In this article, we (re-)introduce the basic concepts of signal detection theory (SDT) as a method to investigate such differences in somatosensory function in detail. SDT describes participants' responses according to two parameters, sensitivity and response bias. Sensitivity refers to individuals' ability to discriminate between painful and non-painful stimulations. Response bias refers to individuals' criterion for giving a "painful" response. We describe how multilevel models can be used to estimate these parameters and to overcome central critiques of these methods. To provide an example, we apply these methods to data from the mechanical pain sensitivity test of the QST protocol. The results show that adolescents are more sensitive to mechanical pain and contradict the idea that younger children simply use more lenient criteria to report pain. Overall, we hope that the wider use of multilevel modeling to describe somatosensory functioning may advance neurology research and practice.
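Under the equal-variance Gaussian SDT model described here, the two parameters follow directly from hit and false-alarm rates: d' = z(H) - z(F) and c = -(z(H) + z(F)) / 2. A minimal sketch from raw counts; the counts are invented, and the multilevel estimation the article advocates is not shown.

```python
from scipy.stats import norm

def sdt_parameters(hits, misses, false_alarms, correct_rejections):
    """Equal-variance Gaussian SDT: sensitivity d' and response bias c,
    with a 0.5 log-linear correction against rates of exactly 0 or 1."""
    h = (hits + 0.5) / (hits + misses + 1)
    f = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = norm.ppf(h) - norm.ppf(f)
    criterion = -0.5 * (norm.ppf(h) + norm.ppf(f))
    return d_prime, criterion

# Example: 100 painful and 100 non-painful stimulations
print(sdt_parameters(hits=78, misses=22, false_alarms=30, correct_rejections=70))
```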
NASA Astrophysics Data System (ADS)
Shaked, Natan T.; Girshovitz, Pinhas; Frenklach, Irena
2014-06-01
We present our recent advances in the development of compact, highly portable and inexpensive wide-field interferometric modules. By a smart design of the interferometric system, including the use of low-coherence illumination sources and a common-path off-axis geometry of the interferometers, spatial and temporal noise levels of the resulting quantitative thickness profile can be sub-nanometric, while the phase profile is processed in real time. In addition, due to novel experimentally implemented multiplexing methods, we can capture low-coherence off-axis interferograms with a significantly extended field of view and at faster acquisition rates. Using these techniques, we quantitatively imaged rapid dynamics of live biological cells including sperm cells and unicellular microorganisms. Then, we demonstrated dynamic profiling during lithography processes of microscopic elements, with thicknesses that may vary from several nanometers to hundreds of microns. Finally, we present new algorithms for fast reconstruction (including digital phase unwrapping) of off-axis interferograms, which allow real-time processing at more than video rate on regular single-core computers.
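Off-axis interferogram reconstruction of the kind these algorithms accelerate typically proceeds by Fourier filtering: isolate one cross-correlation lobe around the spatial carrier, recentre it, and take the argument of the inverse transform as the wrapped phase. The sketch below is generic; the crop size, carrier frequency and the toy "cell" are assumptions rather than the authors' multiplexing or unwrapping algorithms, and the output is a cropped (reduced-resolution) wrapped phase map.

```python
import numpy as np

def off_axis_phase(igram, crop=32):
    """Fourier-based reconstruction of an off-axis interferogram:
    keep one cross-correlation lobe and demodulate it."""
    F = np.fft.fftshift(np.fft.fft2(igram))
    ny, nx = F.shape
    half = F.copy()
    half[:, : nx // 2] = 0                                           # one side lobe only
    half[ny // 2 - 4 : ny // 2 + 4, nx // 2 - 4 : nx // 2 + 4] = 0   # suppress DC
    cy, cx = np.unravel_index(np.argmax(np.abs(half)), half.shape)
    term = half[cy - crop : cy + crop, cx - crop : cx + crop]        # recentre lobe
    return np.angle(np.fft.ifft2(np.fft.ifftshift(term)))           # wrapped phase

# Toy interferogram: tilt fringes plus a Gaussian "cell" phase bump
y, x = np.mgrid[0:256, 0:256]
bump = 2.0 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / 800.0)
igram = 1 + np.cos(0.5 * x + bump)
print(off_axis_phase(igram).shape)
```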
Quantitative prediction of solute strengthening in aluminium alloys.
Leyson, Gerard Paul M; Curtin, William A; Hector, Louis G; Woodward, Christopher F
2010-09-01
Despite significant advances in computational materials science, a quantitative, parameter-free prediction of the mechanical properties of alloys has been difficult to achieve from first principles. Here, we present a new analytic theory that, with input from first-principles calculations, is able to predict the strengthening of aluminium by substitutional solute atoms. Solute-dislocation interaction energies in and around the dislocation core are first calculated using density functional theory and a flexible-boundary-condition method. An analytic model for the strength, or stress to move a dislocation, owing to the random field of solutes, is then presented. The theory, which has no adjustable parameters and is extendable to other metallic alloys, predicts both the energy barriers to dislocation motion and the zero-temperature flow stress, allowing for predictions of finite-temperature flow stresses. Quantitative comparisons with experimental flow stresses at temperature T=78 K are made for Al-X alloys (X=Mg, Si, Cu, Cr) and good agreement is obtained.
Carreno-Quintero, Natalia; Acharjee, Animesh; Maliepaard, Chris; Bachem, Christian W.B.; Mumm, Roland; Bouwmeester, Harro; Visser, Richard G.F.; Keurentjes, Joost J.B.
2012-01-01
Recent advances in -omics technologies such as transcriptomics, metabolomics, and proteomics along with genotypic profiling have permitted dissection of the genetics of complex traits represented by molecular phenotypes in nonmodel species. To identify the genetic factors underlying variation in primary metabolism in potato (Solanum tuberosum), we have profiled primary metabolite content in a diploid potato mapping population, derived from crosses between S. tuberosum and wild relatives, using gas chromatography-time of flight-mass spectrometry. In total, 139 polar metabolites were detected, of which we identified metabolite quantitative trait loci for approximately 72% of the detected compounds. In order to obtain an insight into the relationships between metabolic traits and classical phenotypic traits, we also analyzed statistical associations between them. The combined analysis of genetic information through quantitative trait locus coincidence and the application of statistical learning methods provide information on putative indicators associated with the alterations in metabolic networks that affect complex phenotypic traits. PMID:22223596
Practical Stereology Applications for the Pathologist.
Brown, Danielle L
2017-05-01
Qualitative histopathology is the gold standard for routine examination of morphological tissue changes in the regulatory or academic environment. The human eye is exceptional at pattern recognition but often cannot detect small changes in quantity. In cases where detection of subtle quantitative changes is critical, more sensitive methods are required. Two-dimensional histomorphometry can provide additional quantitative information and is quite useful in many cases. However, the data it provides may not be representative of the entire tissue and, as such, the approach makes several assumptions, which are sources of bias. In contrast, stereology is design based rather than assumption based and uses stringent sampling methods to obtain accurate and precise 3-dimensional information using geometrical and statistical principles. Recent advances in technology have made stereology more approachable and practical for the pathologist in both regulatory and academic environments. This review introduces pathologists to the basic principles of stereology and walks the reader through some real-world examples for the application of these principles in the workplace.
Hyperspectral Imaging and SPA-LDA Quantitative Analysis for Detection of Colon Cancer Tissue
NASA Astrophysics Data System (ADS)
Yuan, X.; Zhang, D.; Wang, Ch.; Dai, B.; Zhao, M.; Li, B.
2018-05-01
Hyperspectral imaging (HSI) has been demonstrated to provide a rapid, precise, and noninvasive method for cancer detection. However, because HSI generates large volumes of data, quantitative analysis is often necessary to distill information useful for distinguishing cancerous from normal tissue. To demonstrate that HSI with our proposed algorithm can make this distinction, we built a Vis-NIR HSI setup and acquired many spectral images of colon tissues, and then used a successive projections algorithm (SPA) to analyze the hyperspectral image data of the tissues. This was used to build an identification model based on linear discriminant analysis (LDA) using the relative reflectance values of the effective wavelengths. Other tissues were used as a prediction set to verify the reliability of the identification model. The results suggest that Vis-NIR hyperspectral images, together with the spectroscopic classification method, provide a new approach for reliable and safe diagnosis of colon cancer and could lead to advances in cancer diagnosis generally.
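The SPA-LDA pipeline named in the title can be sketched compactly: the successive projections algorithm greedily picks wavelengths whose spectral columns are maximally orthogonal to those already chosen, and a linear discriminant classifier is then trained on the selected bands. The implementation below runs on random stand-in data; the band count, starting wavelength and data are assumptions, not the study's spectra.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def spa_select(X, n_bands, start=0):
    """Successive projections algorithm: greedily select columns
    (wavelengths) maximally orthogonal to those already chosen."""
    Xp = X.astype(float).copy()
    selected = [start]
    for _ in range(n_bands - 1):
        v = Xp[:, selected[-1]]
        Xp = Xp - np.outer(v, v @ Xp) / (v @ v)  # project out direction v
        Xp[:, selected] = 0.0                    # never re-pick a band
        selected.append(int(np.argmax(np.linalg.norm(Xp, axis=0))))
    return selected

rng = np.random.default_rng(4)
spectra = rng.random((120, 200))   # 120 pixels x 200 wavelengths (toy)
labels = rng.integers(0, 2, 120)   # cancerous vs normal (toy)
bands = spa_select(spectra, n_bands=8)
model = LinearDiscriminantAnalysis().fit(spectra[:, bands], labels)
print(bands, model.score(spectra[:, bands], labels))
```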
Crayton, Samuel H; Elias, Drew R; Al Zaki, Ajlan; Cheng, Zhiliang; Tsourkas, Andrew
2012-02-01
Recent advances in materials science and chemistry have led to the development of nanoparticles with diverse physicochemical properties, e.g. size, charge, shape, and surface chemistry. Evaluating which physicochemical properties are best for imaging and therapeutic studies is challenging not only because of the multitude of samples to evaluate, but also because of the large experimental variability associated with in vivo studies (e.g. differences in tumor size, injected dose, subject weight, etc.). To address this issue, we have developed a lanthanide-doped nanoparticle system and analytical method that allows for the quantitative comparison of multiple nanoparticle compositions simultaneously. Specifically, superparamagnetic iron oxide (SPIO) nanoparticles with a range of different sizes and charges were synthesized, each with a unique lanthanide dopant. Following the simultaneous injection of the various SPIO compositions into tumor-bearing mice, inductively coupled plasma mass spectroscopy (ICP-MS) was used to quantitatively and orthogonally assess the concentration of each SPIO composition in serial blood samples and the resected tumor and organs. The method proved generalizable to other nanoparticle platforms, including dendrimers, liposomes, and polymersomes. This approach provides a simple, cost-effective, and non-radioactive method to quantitatively compare tumor localization, biodistribution, and blood clearance of more than 10 nanoparticle compositions simultaneously, removing subject-to-subject variability. Copyright © 2011 Elsevier Ltd. All rights reserved.
Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good Are the Data?
Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.
2005-01-01
Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide the qualitative and quantitative food composition data necessary for high quality epidemiological and clinical research. Measurements of flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still being popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, polyphenol content and antioxidant activity are also estimated from values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. At present, qualitative information regarding proanthocyanidin structure is obtained by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques. The lack of appropriate standards is the single most important factor that limits the aforementioned analyses. However, with ever expanding research on flavanols, proanthocyanidins, and health, and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for select flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597
[Statistical analysis of German radiologic periodicals: developmental trends in the last 10 years].
Golder, W
1999-09-01
To identify which statistical tests are applied in German radiological publications, to what extent their use has changed during the last decade, and which factors might be responsible for this development. The major articles published in "ROFO" and "DER RADIOLOGE" during 1988, 1993 and 1998 were reviewed for statistical content. The contributions were classified by principal focus and radiological subspecialty. The methods used were assigned to descriptive, basal and advanced statistics. Sample size, significance level and power were established. The use of experts' assistance was monitored. Finally, we calculated the so-called cumulative accessibility of the publications. 525 contributions were found to be eligible. In 1988, 87% used descriptive statistics only, 12.5% basal, and 0.5% advanced statistics. The corresponding figures in 1993 and 1998 are 62% and 49%, 32% and 41%, and 6% and 10%, respectively. Statistical techniques were most likely to be used in research on musculoskeletal imaging and in articles dedicated to MRI. Six basic categories of statistical methods account for the complete statistical analysis appearing in 90% of the articles. ROC analysis is the single most common advanced technique. Authors increasingly make use of statistical experts' advice and of statistical software. During the last decade, the use of statistical methods in German radiological journals has fundamentally improved, both quantitatively and qualitatively. Presently, advanced techniques account for 20% of the pertinent statistical tests. This development seems to be promoted by the increasing availability of statistical analysis software.
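Since ROC analysis is singled out as the most common advanced technique, a minimal sketch may be useful; it assumes scikit-learn and uses made-up reader confidence scores against a made-up ground truth.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

truth  = np.array([0, 0, 1, 1, 0, 1, 1, 0, 1, 0])   # hypothetical disease status
scores = np.array([1, 2, 4, 5, 2, 3, 5, 1, 4, 3])   # hypothetical reader confidence (1-5)

fpr, tpr, _ = roc_curve(truth, scores)
print("operating points (FPR, TPR):", list(zip(fpr.round(2), tpr.round(2))))
print("AUC =", roc_auc_score(truth, scores))
```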
DIGE Analysis of Human Tissues.
Gelfi, Cecilia; Capitanio, Daniele
2018-01-01
Two-dimensional difference gel electrophoresis (2-D DIGE) is an advanced and elegant gel-electrophoretic analytical tool for comparative protein assessment. It is based on two-dimensional gel electrophoresis (2-DE) separation of fluorescently labeled protein extracts. The tagging procedures are designed not to interfere with the chemical properties of proteins with respect to their pI and electrophoretic mobility, provided a proper labeling protocol is followed. Either a two-dye or a three-dye system can be adopted, depending on the specific application. Furthermore, the use of an internal pooled standard makes 2-D DIGE a highly accurate quantitative method, enabling multiple protein samples to be separated on the same two-dimensional gel. The image matching and cross-gel statistical analysis generate robust quantitative results, making data validation by independent technologies successful.
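The internal pooled standard works by simple ratioing: every spot volume in a sample channel is divided by the same spot's volume in the Cy2 pooled-standard channel on the same gel, which cancels gel-to-gel variation. A minimal sketch with invented spot volumes:

```python
import numpy as np

# Spot volumes per gel: sample channel (Cy3 or Cy5) and pooled standard (Cy2).
gels = [
    {"sample": np.array([1200.0, 340.0, 560.0]), "standard": np.array([1000.0, 400.0, 500.0])},
    {"sample": np.array([ 900.0, 300.0, 700.0]), "standard": np.array([ 800.0, 350.0, 550.0])},
]

# Standardized log2 abundance makes the same spot comparable across gels.
for i, g in enumerate(gels, 1):
    print(f"gel {i}: standardized log2 ratios =", np.round(np.log2(g["sample"] / g["standard"]), 3))
```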
Santos, Radleigh G; Appel, Jon R; Giulianotti, Marc A; Edwards, Bruce S; Sklar, Larry A; Houghten, Richard A; Pinilla, Clemencia
2013-05-30
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays.
Advanced imaging techniques in brain tumors
2009-01-01
Perfusion, permeability and magnetic resonance spectroscopy (MRS) are now widely used in the research and clinical settings. In the clinical setting, qualitative, semi-quantitative and quantitative approaches, ranging from review of color-coded maps to region-of-interest analysis and analysis of signal intensity curves, are being applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relatively low sensitivity of metabolite ratios from MRS and the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging, along with the correction and normalization methods that can be applied. Combining and applying these different imaging techniques in a multi-parametric, algorithmic fashion in the clinical setting can increase diagnostic specificity and confidence. PMID:19965287
Iron in Multiple Sclerosis and Its Noninvasive Imaging with Quantitative Susceptibility Mapping
Stüber, Carsten; Pitt, David; Wang, Yi
2016-01-01
Iron is considered to play a key role in the development and progression of multiple sclerosis (MS). In particular, iron that accumulates in myeloid cells after the blood-brain barrier (BBB) seals may contribute to chronic inflammation, oxidative stress and, eventually, neurodegeneration. Magnetic resonance imaging (MRI) is a well-established tool for the non-invasive study of MS. In recent years, an advanced MRI method, quantitative susceptibility mapping (QSM), has made it possible to study brain iron through in vivo imaging. Moreover, immunohistochemical investigations have helped define the lesional and cellular distribution of iron in MS brain tissue. Imaging studies in MS patients and of brain tissue, combined with histological studies, have provided important insights into the role of iron in inflammation and neurodegeneration in MS. PMID:26784172
Endobiogeny: a global approach to systems biology (part 1 of 2).
Lapraz, Jean-Claude; Hedayat, Kamyar M
2013-01-01
Endobiogeny is a global systems approach to human biology that may offer an advance in clinical medicine grounded in the scientific principles of rigor and experimentation and the humanistic principles of individualization of care and alleviation of suffering with minimization of harm. Endobiogeny is neither a movement away from modern science nor an uncritical embrace of pre-rational methods of inquiry, but a synthesis of quantitative and qualitative relationships reflected in a systems approach to life and based on new mathematical paradigms of pattern recognition.
Advanced IR System For Supersonic Boundary Layer Transition Flight Experiment
NASA Technical Reports Server (NTRS)
Banks, Daniel W.
2008-01-01
Infrared thermography is a preferred method for investigating boundary layer transition in flight: it is global and non-intrusive, and it can also be used to visualize and characterize other fluid-mechanic phenomena such as shock impingement and separation. The F-15-based system was updated with a new camera and digital video recorder to support high-Reynolds-number transition tests. Digital recording improves image quality and analysis capability, allows accurate quantitative (temperature) measurements, and, through image processing, provides greater enhancement that permits analysis of smaller-scale phenomena.
Cartography and Connectomes (Perspective article for the Neuron 25th Anniversary Issue)
Van Essen, David C.
2013-01-01
The past 25 years have seen great progress in parcellating the cerebral cortex into a mosaic of many distinct areas in mice, monkeys, and humans. Quantitative studies of inter-areal connectivity have revealed an unexpectedly large number of pathways and a wide range of connection strengths in mouse and macaque cortex. In humans, advances in analyzing ‘structural’ and ‘functional’ connectivity using powerful but indirect noninvasive neuroimaging methods are yielding intriguing insights about brain circuits, their variability across individuals, and their relationship to behavior. PMID:24183027
Coherent imaging at the diffraction limit
Thibault, Pierre; Guizar-Sicairos, Manuel; Menzel, Andreas
2014-01-01
X-ray ptychography, a scanning coherent diffractive imaging technique, holds promise for imaging with dose-limited resolution and sensitivity. If the foreseen increase of coherent flux by orders of magnitude can be matched by additional technological and analytical advances, ptychography may approach imaging speeds familiar from full-field methods while retaining its inherently quantitative nature and metrological versatility. Beyond promises of high throughput, spectroscopic applications in three dimensions become feasible, as do measurements of sample dynamics through time-resolved imaging or careful characterization of decoherence effects. PMID:25177990
Exploring pain pathophysiology in patients.
Sommer, Claudia
2016-11-04
Although animal models of pain have yielded invaluable information on the basic processes underlying pain pathophysiology, translation to humans remains a problem. This Review summarizes what information has been gained from the direct study of patients with chronic pain. The techniques discussed range from patient phenotyping using quantitative sensory testing to specialized nociceptor neurophysiology, imaging methods of peripheral nociceptors, analyses of body fluids, genetics and epigenetics, and the generation of sensory neurons from patients via induced pluripotent stem cells. Copyright © 2016, American Association for the Advancement of Science.
Fast multipole methods on a cluster of GPUs for the meshless simulation of turbulence
NASA Astrophysics Data System (ADS)
Yokota, R.; Narumi, T.; Sakamaki, R.; Kameoka, S.; Obi, S.; Yasuoka, K.
2009-11-01
Recent advances in the parallelizability of fast N-body algorithms and the programmability of graphics processing units (GPUs) have opened a new path for particle-based simulations. For the simulation of turbulence, vortex methods can now be considered an interesting alternative to finite difference and spectral methods. The present study focuses on the efficient implementation of the fast multipole method and pseudo-particle method on a cluster of NVIDIA GeForce 8800 GT GPUs, and applies this to a vortex method calculation of homogeneous isotropic turbulence. The results of the present vortex method agree quantitatively with those of the reference calculation using a spectral method. We achieved a maximum speed of 7.48 TFlops using 64 GPUs, at a cost performance near $9.4 per GFlops. The calculation of the present vortex method on 64 GPUs took 4120 s, while the spectral method on 32 CPUs took 4910 s.
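For context, the pairwise kernel that the fast multipole method accelerates is the O(N²) Biot-Savart sum over point vortices; a direct 2D evaluation, with hypothetical positions and circulations, looks like this (the FMM replaces the quadratic double loop with hierarchical multipole expansions):

```python
import numpy as np

def direct_velocity(x, y, gamma, delta=1e-3):
    """O(N^2) Biot-Savart sum for regularized 2D point vortices:
    u = -sum_j Gamma_j*(y-y_j)/(2*pi*r^2), v = sum_j Gamma_j*(x-x_j)/(2*pi*r^2)."""
    dx = x[:, None] - x[None, :]
    dy = y[:, None] - y[None, :]
    r2 = dx**2 + dy**2 + delta**2            # smoothing also zeroes self-terms
    u = -(gamma[None, :] * dy / (2 * np.pi * r2)).sum(axis=1)
    v =  (gamma[None, :] * dx / (2 * np.pi * r2)).sum(axis=1)
    return u, v

rng = np.random.default_rng(1)
x, y, g = rng.random(1000), rng.random(1000), rng.normal(size=1000)
u, v = direct_velocity(x, y, g)              # the sum an FMM would approximate
```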
Tepper, Ronnie
2017-01-01
Background: Workplaces today demand graduates who are prepared with field-specific knowledge, advanced social skills, problem-solving skills, and integration capabilities. Meeting these goals with didactic learning (DL) is becoming increasingly difficult. Enhanced training methods that would better prepare tomorrow's graduates must be more engaging and game-like, such as feedback-based e-learning or simulation-based training, while saving time. Empirical evidence regarding the effectiveness of advanced learning methods is lacking, and objective quantitative research comparing advanced training methods with DL is sparse. Objectives: This quantitative study assessed the effectiveness of a computerized interactive simulator coupled with an instructor who monitored students' progress and provided immediate Web-based feedback. Methods: A low-cost, globally accessible, telemedicine simulator, developed at the Technion - Israel Institute of Technology, Haifa, Israel, was used. A previous study in the field of interventional cardiology, evaluating the efficacy of the simulator for enhancing learning via knowledge exams, presented promising results, with average scores of 94% after training versus 54% before training (n=20, P<.001). Two independent experiments involving obstetrics and gynecology (Ob-Gyn) physicians and senior ultrasound sonographers, with 32 subjects, were conducted using a new interactive concept of the WOZ (Wizard of Oz) simulator platform. The contribution of an instructor to learning outcomes was evaluated by comparing students' knowledge before and after each interactive instructor-led session, as well as after fully automated e-learning, in the field of Ob-Gyn. Results from objective knowledge tests were analyzed using hypothesis testing and model fitting. Results: A significant advantage (P=.01) was found in favor of the WOZ training approach. Content type and training audience were not significant. Conclusions: This study evaluated the contribution of an integrated teaching environment using a computerized interactive simulator, with an instructor providing immediate Web-based feedback to trainees. Involvement of an instructor in the simulation-based training process provided better learning outcomes; varying the training content and trainee populations did not affect the overall learning gains. PMID:28432039
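The before/after comparison described here is, at its simplest, a paired test on the same trainees' scores; a sketch with invented scores (not the study's data), assuming SciPy:

```python
import numpy as np
from scipy import stats

before = np.array([52, 61, 48, 55, 60, 50, 58, 49])   # hypothetical pre-test scores (%)
after  = np.array([88, 95, 90, 86, 93, 85, 91, 89])   # hypothetical post-test scores (%)

t, p = stats.ttest_rel(after, before)                  # paired t-test on the gains
print(f"mean gain = {np.mean(after - before):.1f} points, t = {t:.2f}, p = {p:.4f}")
```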
Alves, Antoine; Attik, Nina; Bayon, Yves; Royet, Elodie; Wirth, Carine; Bourges, Xavier; Piat, Alexis; Dolmazon, Gaëlle; Clermont, Gaëlle; Boutrand, Jean-Pierre; Grosgogeat, Brigitte; Gritsch, Kerstin
2018-03-14
The paradigm shift brought about by the expansion of tissue engineering and regenerative medicine away from the use of biomaterials currently calls into question the value of histopathologic methods in the evaluation of biological changes. To date, the available evaluation tools are not fully consistent and satisfactory for these advanced therapies. We have developed a new, simple and inexpensive quantitative digital approach that provides key metrics for structural and compositional characterization of regenerated tissues. For example, the metrics provide the tissue ingrowth rate (TIR), which integrates two separate indicators, the cell ingrowth rate (CIR) and the total collagen content (TCC), as featured in the equation TIR% = CIR% + TCC%. Moreover, a subset of quantitative indicators describing the directional organization of the collagen (relating structure and mechanical function of tissues), the ratio of collagen I to collagen III (remodeling quality) and the optical anisotropy of the collagen (a maturity indicator) was automatically assessed as well. Using an image analyzer, all metrics were extracted from only two serial sections stained with either Feulgen & Rossenbeck (cell specific) or Picrosirius Red F3BA (collagen specific). To validate this new procedure, three-dimensional (3D) scaffolds were intraperitoneally implanted in healthy and in diabetic rats. It was hypothesized that the healing tissue would be quantitatively and significantly delayed and of poorer quality in diabetic rats in comparison to healthy rats. In addition, a chemically modified 3D scaffold was similarly implanted in a third group of healthy rats, with the assumption that modulation of the ingrown tissue would be quantitatively detectable in comparison to the 3D scaffold-healthy group. After 21 days of implantation, both hypotheses were verified by use of this novel computerized approach. When the two methods were run in parallel, the quantitative results revealed fine details and differences not detected by the semi-quantitative assessment, demonstrating the importance of quantitative analysis in the performance evaluation of soft tissue healing. This automated and supervised method reduced operator dependency and proved to be simple, sensitive, cost-effective and time-effective. It supports objective therapeutic comparisons and helps to elucidate regeneration and the dynamics of a functional tissue.
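The composite metric reduces to area fractions over a region of interest: CIR% is the cell-positive fraction from the Feulgen & Rossenbeck section, TCC% the collagen-positive fraction from the Picrosirius Red section, and TIR% their sum. A sketch using hypothetical binary segmentations in place of the stained-section images:

```python
import numpy as np

def ingrowth_metrics(cell_mask, collagen_mask, roi_mask):
    """CIR% = cell-positive area / ROI area, TCC% = collagen-positive area / ROI area,
    TIR% = CIR% + TCC% (the composite indicator quoted in the abstract)."""
    roi = roi_mask.astype(bool)
    n_roi = np.count_nonzero(roi)
    cir = 100.0 * np.count_nonzero(cell_mask & roi) / n_roi
    tcc = 100.0 * np.count_nonzero(collagen_mask & roi) / n_roi
    return cir, tcc, cir + tcc

rng = np.random.default_rng(2)
cells    = rng.random((512, 512)) > 0.8    # placeholder cell segmentation
collagen = rng.random((512, 512)) > 0.7    # placeholder collagen segmentation
roi      = np.ones((512, 512), dtype=bool)
print("CIR%%, TCC%%, TIR%% = %.1f, %.1f, %.1f" % ingrowth_metrics(cells, collagen, roi))
```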
Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James
2017-01-01
Recent advances in instrumentation and data analysis in field flow fractionation with multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The method's sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with a fitting value R(2) of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distributions of monomeric and oligomeric virions. However, the monomer fraction of the cell-grown virus showed greater size variety. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay are discussed. PMID:23916678
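The linearity claim (R(2) of 0.99 across injected volumes) corresponds to an ordinary least-squares fit of detector response against injected amount; a sketch with invented numbers, assuming SciPy:

```python
import numpy as np
from scipy import stats

volume_ul = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # hypothetical injected volumes
response  = np.array([1.1, 2.0, 4.3, 8.1, 16.4])      # hypothetical integrated MALS signal

fit = stats.linregress(volume_ul, response)
print(f"slope = {fit.slope:.3f}, R^2 = {fit.rvalue**2:.3f}")   # linearity check
```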
Glycoprotein Disease Markers and Single Protein-omics
Chandler, Kevin; Goldman, Radoslav
2013-01-01
Glycoproteins are well represented among biomarkers for inflammatory and cancer diseases. Secreted and membrane-associated glycoproteins make excellent targets for noninvasive detection. In this review, we discuss clinically applicable markers of cancer diseases and methods for their analysis. High throughput discovery continues to supply marker candidates with unusual glycan structures, altered glycoprotein abundance, or distribution of site-specific glycoforms. Improved analytical methods are needed to unlock the potential of these discoveries in validated clinical assays. A new generation of targeted quantitative assays is expected to advance the use of glycoproteins in early detection of diseases, molecular disease classification, and monitoring of therapeutic interventions. PMID:23399550
An, Mingrui; Lohse, Ines; Tan, Zhijing; Zhu, Jianhui; Wu, Jing; Kurapati, Himabindu; Morgan, Meredith A; Lawrence, Theodore S; Cuneo, Kyle C; Lubman, David M
2017-04-07
Pancreatic cancer is the third leading cause of cancer-related death in the USA. Despite extensive research, minimal improvements in patient outcomes have been achieved. Early identification of treatment response and metastasis would be valuable to determine the appropriate therapeutic course for patients. In this work, we isolated exosomes from the serum of 10 patients with locally advanced pancreatic cancer at serial time points over a course of therapy, and quantitative analysis was performed using the iTRAQ method. We detected approximately 700-800 exosomal proteins per sample, several of which have been implicated in metastasis and treatment resistance. We compared the exosomal proteome of patients at different time points during treatment to healthy controls and identified eight proteins that show global treatment-specific changes. We then tested the effect of patient-derived exosomes on the migration of tumor cells and found that patient-derived exosomes, but not healthy controls, induce cell migration, supporting their role in metastasis. Our data show that exosomes can be reliably extracted from patient serum and analyzed for protein content. The differential loading of exosomes during a course of therapy suggests that exosomes may provide novel insights into the development of treatment resistance and metastasis.
A Quantitative ADME-based Tool for Exploring Human ...
Exposure to a wide range of chemicals through our daily habits and routines is ubiquitous and largely unavoidable within modern society. The potential for human exposure, however, has not been quantified for the vast majority of chemicals with wide commercial use. Creative advances in exposure science are needed to support efficient and effective evaluation and management of chemical risks, particularly for chemicals in consumer products. The U.S. Environmental Protection Agency Office of Research and Development is developing, or collaborating in the development of, scientifically-defensible methods for making quantitative or semi-quantitative exposure predictions. The Exposure Prioritization (Ex Priori) model is a simplified, quantitative visual dashboard that provides a rank-ordered internalized dose metric to simultaneously explore exposures across chemical space (not chemical by chemical). Diverse data streams are integrated within the interface such that different exposure scenarios for “individual,” “population,” or “professional” time-use profiles can be interchanged to tailor exposure and quantitatively explore multi-chemical signatures of exposure, internalized dose (uptake), body burden, and elimination. Ex Priori has been designed as an adaptable systems framework that synthesizes knowledge from various domains and is amenable to new knowledge/information. As such, it algorithmically captures the totality of exposure across pathways. It
Li, Peiwu; Zhang, Zhaowei; Hu, Xiaofeng; Zhang, Qi
2013-01-01
Mass spectrometric techniques are essential for advanced research in food safety and environmental monitoring. These fields are important for securing the health of humans and animals, and for ensuring environmental security. Mycotoxins, toxic secondary metabolites of filamentous fungi, are major contaminants of agricultural products, food and feed, biological samples, and the environment as a whole. Mycotoxins can cause cancers, nephritic and hepatic diseases, various hemorrhagic syndromes, and immune and neurological disorders. Mycotoxin-contaminated food and feed can provoke trade conflicts, resulting in massive economic losses. Risk assessment of mycotoxin contamination for humans and animals generally depends on clear identification and reliable quantitation in diversified matrices. Pioneering work on mycotoxin quantitation using mass spectrometry (MS) was performed in the early 1970s. Now, unambiguous confirmation and quantitation of mycotoxins can be readily achieved with a variety of hyphenated techniques that combine chromatographic separation with MS, including liquid chromatography (LC) or gas chromatography (GC). With the advent of atmospheric pressure ionization, LC-MS has become a routine technique. Recently, the co-occurrence of multiple mycotoxins in the same sample has drawn increasing attention. Thus, modern analyses must be able to detect and quantitate multiple mycotoxins in a single run. Improvements in tandem MS techniques have been made to achieve this purpose. This review describes advanced research on mycotoxin determination using hyphenated chromatographic-MS techniques, but it is not an exhaustive survey of all the literature published on this topic. The present work provides an overview of the various hyphenated chromatographic-MS-based strategies that have been applied to mycotoxin analysis, with a focus on recent developments. The use of chromatographic-MS to measure levels of mycotoxins, including aflatoxins, ochratoxins, patulin, trichothecenes, zearalenone, and fumonisins, is discussed in detail. Both free and masked mycotoxins are included in this review because their sample preparation methods differ. Techniques are described in terms of sample preparation, internal standards, LC/ultra-performance LC (UPLC) optimization, and applications and surveys. Several emerging hyphenated MS techniques are discussed as well, including multidimensional chromatography-MS, capillary electrophoresis-MS, and surface plasmon resonance array-MS. © 2013 Wiley Periodicals, Inc.
Duchnowska, Renata; Biernat, Wojciech; Szostakiewicz, Barbara; Sperinde, Jeff; Piette, Fanny; Haddad, Mojgan; Paquet, Agnes; Lie, Yolanda; Czartoryska-Arłukowicz, Bogumiła; Wysocki, Piotr; Jankowski, Tomasz; Radecka, Barbara; Foszczyńska-Kłoda, Małgorzata; Litwiniuk, Maria; Dȩbska, Sylwia; Weidler, Jodi; Huang, Weidong; Buyse, Marc; Bates, Michael
2012-01-01
Background. Patients with human epidermal growth factor receptor (HER)-2+ breast cancer are at particularly high risk for brain metastases; however, the biological basis is not fully understood. Using a novel HER-2 assay, we investigated the correlation between quantitative HER-2 expression in primary breast cancers and the time to brain metastasis (TTBM) in HER-2+ advanced breast cancer patients treated with trastuzumab. Methods. The study group included 142 consecutive patients who were administered trastuzumab-based therapy for HER-2+ metastatic breast cancer. HER-2/neu gene copy number was quantified as the HER-2/centromeric probe for chromosome 17 (CEP17) ratio by central laboratory fluorescence in situ hybridization (FISH). HER-2 protein was quantified as total HER-2 protein expression (H2T) by the HERmark® assay (Monogram Biosciences, Inc., South San Francisco, CA) in formalin-fixed, paraffin-embedded tumor samples. HER-2 variables were correlated with clinical features and TTBM was measured from the initiation of trastuzumab-containing therapy. Results. A higher H2T level (continuous variable) was correlated with shorter TTBM, whereas HER-2 amplification by FISH and a continuous HER-2/CEP17 ratio were not predictive (p = .013, .28, and .25, respectively). In the subset of patients that was centrally determined by FISH to be HER-2+, an above-the-median H2T level was significantly associated with a shorter TTBM (hazard ratio, [HR], 2.4; p = .005), whereas this was not true for the median HER-2/CEP17 ratio by FISH (p = .4). Correlation between a continuous H2T level and TTBM was confirmed on multivariate analysis (HR, 3.3; p = .024). Conclusions. These data reveal a strong relationship between the quantitative HER-2 protein expression level and the risk for brain relapse in HER-2+ advanced breast cancer patients. Consequently, quantitative assessment of HER-2 protein expression may inform and facilitate refinements in therapeutic treatment strategies for selected subpopulations of patients in this group. PMID:22234627
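The headline comparison, above- versus below-median H2T against time to brain metastasis, is a standard two-group survival analysis; a sketch with invented data, assuming the lifelines package is available (a Cox model on continuous H2T would mirror the multivariate analysis):

```python
import numpy as np
from lifelines.statistics import logrank_test

# Hypothetical data: TTBM in months, event flag (1 = brain metastasis), H2T level
t     = np.array([6.0, 14.0, 9.0, 30.0, 22.0, 8.0, 18.0, 27.0, 12.0, 35.0])
event = np.array([1, 1, 1, 0, 1, 1, 0, 0, 1, 0])
h2t   = np.array([9.1, 7.4, 12.0, 2.1, 3.3, 10.5, 2.8, 1.9, 8.8, 2.4])

high = h2t > np.median(h2t)   # dichotomize at the median, as in the abstract
res = logrank_test(t[high], t[~high],
                   event_observed_A=event[high], event_observed_B=event[~high])
print("log-rank p =", res.p_value)
```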
Advances in Integrated Computational Materials Engineering "ICME"
NASA Astrophysics Data System (ADS)
Hirsch, Jürgen
The methods of Integrated Computational Materials Engineering that were developed and successfully applied for Aluminium have been constantly improved. The main aspects and recent advances of integrated material and process modeling are simulations of material properties, such as strength and forming properties, and of the specific microstructure evolution during processing (rolling, extrusion, annealing), under the influence of material constitution and process variations, through the production process down to the final application. Examples are discussed for the through-process simulation of microstructures and related properties of Aluminium sheet, including DC ingot casting, pre-heating and homogenization, hot and cold rolling, and final annealing. New results are included for the simulation of solution annealing and age hardening of 6xxx alloys for automotive applications. Physically based quantitative descriptions and computer-assisted evaluation methods are new ICME approaches that integrate new simulation tools for customer applications as well, such as heat-affected zones in the welding of age-hardening alloys. Approaches for estimating the effects of specific elements, relevant given the growing volumes of recycled material used even in high-end Aluminium products, are also discussed and are of special interest to the Aluminium-producing industries.
Cuevas, Soledad
Agriculture is a major contributor to greenhouse gas emissions, an important part of which is associated with deforestation and indirect land use change. Appropriate and coherent food policies can play an important role in aligning health, economic and environmental goals. From the point of view of policy analysis, however, this requires multi-sectoral, interdisciplinary approaches, which can be highly complex. Important methodological advances in the area are not exempt from limitations and criticism. We argue that there is scope for further developments in integrated quantitative and qualitative policy analysis combining existing methods, including mathematical modelling and stakeholder analysis. We outline methodological trends in the field, briefly characterise integrated mixed-methods policy analysis and identify contributions, challenges and opportunities for future research. In particular, this type of approach can help address issues of uncertainty and context-specific validity, incorporate multiple perspectives and help advance meaningful interdisciplinary collaboration in the field. Substantial challenges remain, however, such as the integration of key issues related to non-communicable disease, or the incorporation of a broader range of qualitative approaches that can address important cultural and ethical dimensions of food.
Public and patient involvement in quantitative health research: A statistical perspective.
Hannigan, Ailish
2018-06-19
The majority of studies included in recent reviews of the impact of public and patient involvement (PPI) in health research had a qualitative design. PPI in solely quantitative designs is underexplored, particularly its impact on statistical analysis. Statisticians in practice have a long history of working in both consultative (indirect) and collaborative (direct) roles in health research, yet their perspective on PPI in quantitative health research has never been explicitly examined. This article explores the potential and challenges of PPI from a statistical perspective at distinct stages of quantitative research, that is, sampling, measurement and statistical analysis, distinguishing between indirect and direct PPI. Statistical analysis is underpinned by having a representative sample, and a collaborative or direct approach to PPI may help achieve that by supporting access to, and increasing participation of, under-represented groups in the population. Acknowledging and valuing the role of lay knowledge of the context in statistical analysis, and in deciding what variables to measure, may support collective learning and advance scientific understanding, as evidenced by the use of participatory modelling in other disciplines. A recurring issue for quantitative researchers, which reflects quantitative sampling methods, is the selection and required number of PPI contributors; this requires further methodological development. Direct approaches to PPI in quantitative health research may potentially increase its impact, but the facilitation and partnership skills required may require further training for all stakeholders, including statisticians. © 2018 The Authors Health Expectations published by John Wiley & Sons Ltd.
Zheng, Xianlin; Lu, Yiqing; Zhao, Jiangbo; Zhang, Yuhai; Ren, Wei; Liu, Deming; Lu, Jie; Piper, James A; Leif, Robert C; Liu, Xiaogang; Jin, Dayong
2016-01-19
Compared with routine microscopy imaging of a few analytes at a time, rapid scanning through the whole sample area of a microscope slide to locate every single target object offers many advantages in terms of simplicity, speed, throughput, and potential for robust quantitative analysis. Existing techniques that accommodate solid-phase samples incorporating individual micrometer-sized targets generally rely on digital microscopy and image analysis, with intrinsically low throughput and reliability. Here, we report an advanced on-the-fly stage scanning method to achieve high-precision target location across the whole slide. By integrating X- and Y-axis linear encoders into a motorized stage as virtual "grids" that provide real-time positional references, we demonstrate an orthogonal scanning automated microscopy (OSAM) technique that can search a coverslip area of 50 × 24 mm(2) in just 5.3 min and locate individual 15 μm lanthanide luminescent microspheres with standard deviations of 1.38 and 1.75 μm in the X and Y directions. Alongside implementation of an autofocus unit that compensates in real time for the tilt of a slide along the Z-axis, we increase the luminescence detection efficiency by 35% with an improved coefficient of variation. We demonstrate the capability of advanced OSAM for robust quantification of luminescence intensities and lifetimes for a variety of micrometer-scale luminescent targets, specifically single down-shifting and upconversion microspheres, crystalline microplates, and color-barcoded microrods, as well as quantitative suspension array assays of biotinylated-DNA-functionalized upconversion nanoparticles.
van den Broek, Irene; Blokland, Marco; Nessen, Merel A; Sterk, Saskia
2015-01-01
Detection of the misuse of peptides and proteins as growth promoters is a major issue for sport and food regulatory agencies. The limitations of current analytical detection strategies for this class of compounds, in combination with their efficacy in growth promotion, make peptide and protein drugs highly susceptible to abuse by athletes or farmers seeking products to illicitly enhance muscle growth. Mass spectrometry (MS) for qualitative analysis of peptides and proteins is well established, particularly due to tremendous efforts in the proteomics community. Similarly, owing to advancements in targeted proteomic strategies and the rapid growth of protein-based biopharmaceuticals, MS for quantitative analysis of peptides and proteins is becoming more widely accepted. These continuous advances in MS instrumentation and MS-based methodologies offer enormous opportunities for detection and confirmation of peptides and proteins. Therefore, MS seems to be the method of choice to improve the qualitative and quantitative analysis of peptides and proteins with growth-promoting properties. This review addresses the opportunities of MS for peptide and protein analysis in veterinary control and sports-doping control, with a particular focus on detection of illicit growth promotion. An overview of potential peptide and protein targets, including their amino acid sequence characteristics and current MS-based detection strategies, is therefore provided. Furthermore, improvements of current and new detection strategies with state-of-the-art MS instrumentation are discussed for qualitative and quantitative approaches. © 2013 Wiley Periodicals, Inc.
Ilovich, Ohad; Qutaish, Mohammed; Hesterman, Jacob; Orcutt, Kelly; Hoppin, Jack; Polyak, Ildiko; Seaman, Marc; Abu-Yousif, Adnan; Cvet, Donna; Bradley, Daniel
2018-05-04
In vitro properties of antibody-drug conjugates (ADCs), such as binding, internalization, and cytotoxicity, are often well characterized prior to in vivo studies. Interpretation of in vivo studies could be significantly enhanced by molecular imaging tools. We present here a dual-isotope cryo-imaging quantitative autoradiography (CIQA) methodology, combined with advanced 3D imaging and analysis, allowing for the simultaneous study of both antibody and payload distribution in tissues of interest in a preclinical setting. Methods: TAK-264, an investigational anti-guanylyl cyclase C (GCC)-targeting ADC, was synthesized utilizing tritiated monomethyl auristatin E (MMAE). The tritiated ADC was then conjugated to DTPA, labeled with indium-111, and evaluated in vivo in GCC-positive and GCC-negative tumor-bearing animals. Results: CIQA reveals the time course of drug release from the ADC and its distribution into tumor regions that are less accessible to the antibody. For GCC-positive tumors, a representative section obtained 96 hours post tracer injection showed only 0.8% of voxels with co-localized signal, versus over 15% of voxels for a GCC-negative tumor section, suggesting successful and specific cleaving of the toxin in the antigen-positive lesions. Conclusion: The combination of an established autoradiography technology with advanced image analysis methodologies affords an experimental tool that can support detailed characterization of ADC tumor penetration and pharmacokinetics. Copyright © 2018 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
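The reported voxel statistics amount to thresholding the two registered isotope volumes and counting where both signals are present; a sketch with random placeholder volumes (the threshold convention and denominator are assumptions, not the paper's exact definition):

```python
import numpy as np

def colocalized_fraction(antibody_vol, payload_vol, thr_ab, thr_pl):
    """Percentage of signal-bearing voxels where the 111In (antibody) and
    3H (payload) autoradiography signals are both above threshold."""
    both = (antibody_vol > thr_ab) & (payload_vol > thr_pl)
    either = (antibody_vol > thr_ab) | (payload_vol > thr_pl)
    return 100.0 * both.sum() / max(either.sum(), 1)

rng = np.random.default_rng(3)
ab, pl = rng.random((64, 64, 64)), rng.random((64, 64, 64))   # placeholder volumes
print(f"co-localized voxels: {colocalized_fraction(ab, pl, 0.9, 0.9):.2f} %")
```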
Cardiac imaging: working towards fully-automated machine analysis & interpretation
Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido
2017-01-01
Introduction: Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized, individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques that incorporate machine learning principles. The review focuses on the cardiac imaging techniques that are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan, and integration with clinical data, also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully automated machine interpretation. PMID:28277804
Helitzer, Deborah; Morahan, Page; Chang, Shine; Gleason, Katharine; Cardinali, Gina; Wu, Chih-Chieh
2012-01-01
Background: Surprisingly little research is available to explain the well-documented organizational and societal influences on persistent inequities in the advancement of women faculty. Methods: The Systems of Career Influences Model is a framework for exploring factors influencing women's progression to advanced academic rank, executive positions, and informal leadership roles in academic medicine. The model situates faculty as agents within a complex adaptive system consisting of a trajectory of career advancement with opportunities for formal professional development programming; a dynamic system of influences of organizational policies, practices, and culture; and a dynamic system of individual choices and decisions. These systems of influence may promote or inhibit career advancement. Within this system, women weigh competing influences to make career advancement decisions, and leaders of academic health centers prioritize limited resources to support the school's mission. Results and Conclusions: The Systems of Career Influences Model proved useful for identifying key research questions. We used the model to probe how research in academic career development might be applied to the content and methods of formal professional development programs. We generated a series of questions and hypotheses about how such programs might influence the career development of health science faculty members. Using the model as a guide, we designed a study with a quantitative and qualitative design. These analyses should provide insight into what works in recruiting and supporting productive men and women faculty in academic medical centers. PMID:23101486
Advances in physical activity monitoring and lifestyle interventions in obesity: a review.
Bonomi, A G; Westerterp, K R
2012-02-01
Obesity represents a strong risk factor for developing chronic diseases. Strategies for disease prevention often promote lifestyle changes encouraging participation in physical activity. However, determining what amount of physical activity is necessary for achieving specific health benefits has been hampered by the lack of accurate instruments for monitoring physical activity and the related physiological outcomes. This review presents recent advances in activity-monitoring technology and their application to support interventions for health promotion. Activity monitors have evolved from step counters and devices measuring physical activity duration and intensity to more advanced systems providing quantitative and qualitative information on an individual's activity behavior. Correspondingly, methods to predict activity-related energy expenditure from bodily acceleration and subject characteristics have advanced from linear regression to innovative algorithms capable of determining physical activity types and the related metabolic costs. These novel techniques can monitor modes of sedentary behavior as well as engagement in specific activity types, which helps to evaluate the effectiveness of lifestyle interventions. In conclusion, advances in activity monitoring have the potential to support the design of response-dependent physical activity recommendations that are needed to generate effective and personalized lifestyle interventions for health promotion.
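The progression from counts to energy expenditure starts with exactly the linear-regression step mentioned above; a sketch with invented calibration data, assuming scikit-learn (real models add activity-type classification before applying type-specific cost equations):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

counts     = np.array([120.0, 540.0, 980.0, 1500.0, 2300.0, 3100.0])  # hypothetical counts/min
mass_kg    = np.array([62.0, 75.0, 81.0, 68.0, 90.0, 77.0])           # subject characteristics
aee_kj_min = np.array([1.1, 2.4, 3.8, 4.9, 7.2, 8.8])                 # measured activity EE

X = np.column_stack([counts, mass_kg])
model = LinearRegression().fit(X, aee_kj_min)
print("predicted AEE (kJ/min):", model.predict([[1800.0, 70.0]]))
```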
Technology readiness levels for advanced nuclear fuels and materials development
Carmack, W. J.; Braase, L. A.; Wigeland, R. A.; ...
2016-12-23
The Technology Readiness Level (TRL) process is used to quantitatively assess the maturity of a given technology. It was pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s to develop and deploy new systems for space applications. The process was subsequently adopted by the Department of Defense (DoD) to develop and deploy new technology and systems for defense applications, as well as by the Department of Energy (DOE) to evaluate the maturity of new technologies in major construction projects. Advanced nuclear fuels and materials development is a critical technology needed for improving the performance and safety of current and advanced reactors, and ultimately for closing the nuclear fuel cycle. Because deployment of new nuclear fuel forms requires a lengthy and expensive research, development, and demonstration program, applying the TRL concept to the advanced fuel development program is very useful as a management, communication, and tracking tool. Furthermore, this article provides examples regarding the methods by which TRLs are currently used to assess the maturity of nuclear fuels and materials under development in the DOE Fuel Cycle Research and Development (FCRD) Program within the Advanced Fuels Campaign (AFC).
Advances in analytical instrumentation have not only increased the number and types of chemicals measured, but reduced the quantitation limits, allowing these chemicals to be detected at progressively lower concentrations in various environmental matrices. Such analytical advanc...
Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E
2014-01-01
Over the past decade, rapid advances have occurred in the understanding of RNA expression and its regulation. The quantitative polymerase chain reaction (qPCR) has become the gold standard for quantifying gene expression. Microfluidic next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes, where cell counts are readily available.
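A loose sketch of the input-quantity idea, efficiency-corrected quantification scaled to the number of input cells and referenced against a universal cDNA sample run alongside; the exact algebra is defined in the published method, and the Ct values, efficiency, and cell count below are hypothetical:

```python
def expression_per_cell(ct_sample, ct_universal_ref, efficiency, n_cells):
    """Efficiency-corrected expression per input cell, relative to a universal
    reference cDNA measured on the same run (enables cross-batch comparison)."""
    fold_vs_reference = (1.0 + efficiency) ** (ct_universal_ref - ct_sample)
    return fold_vs_reference / n_cells

# Hypothetical target amplifying at 95% efficiency, 10,000 input cells
print(expression_per_cell(ct_sample=24.8, ct_universal_ref=27.1,
                          efficiency=0.95, n_cells=10_000))
```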
Does Homework Really Matter for College Students in Quantitatively-Based Courses?
ERIC Educational Resources Information Center
Young, Nichole; Dollman, Amanda; Angel, N. Faye
2016-01-01
This investigation was initiated by two students in an Advanced Computer Applications course. They sought to examine the influence of graded homework on final grades in quantitatively-based business courses. They were provided with data from three quantitatively-based core business courses over a period of five years for a total of 10 semesters of…
Quantitative multimodality imaging in cancer research and therapy.
Yankeelov, Thomas E; Abramson, Richard G; Quarles, C Chad
2014-11-01
Advances in hardware and software have enabled the realization of clinically feasible, quantitative multimodality imaging of tissue pathophysiology. Earlier efforts relating to multimodality imaging of cancer have focused on the integration of anatomical and functional characteristics, such as PET-CT and single-photon emission CT (SPECT-CT), whereas more-recent advances and applications have involved the integration of multiple quantitative, functional measurements (for example, multiple PET tracers, varied MRI contrast mechanisms, and PET-MRI), thereby providing a more-comprehensive characterization of the tumour phenotype. The enormous amount of complementary quantitative data generated by such studies is beginning to offer unique insights into opportunities to optimize care for individual patients. Although important technical optimization and improved biological interpretation of multimodality imaging findings are needed, this approach can already be applied informatively in clinical trials of cancer therapeutics using existing tools. These concepts are discussed herein.
Henly, Susan J; McCarthy, Donna O; Wyman, Jean F; Heitkemper, Margaret M; Redeker, Nancy S; Titler, Marita G; McCarthy, Ann Marie; Stone, Patricia W; Moore, Shirley M; Alt-White, Anna C; Conley, Yvette P; Dunbar-Jacob, Jacqueline
2015-01-01
The Council for the Advancement of Nursing Science aims to "facilitate and recognize life-long nursing science career development" as an important part of its mission. In light of fast-paced advances in science and technology that are inspiring new questions and methods of investigation in the health sciences, the Council for the Advancement of Nursing Science convened the Idea Festival for Nursing Science Education and appointed the Idea Festival Advisory Committee (IFAC) to stimulate dialogue about linking PhD education with a renewed vision for preparation of the next generation of nursing scientists. Building on the 2005 National Research Council report Advancing The Nation's Health Needs and the 2010 American Association of Colleges of Nursing Position Statement on the Research-Focused Doctorate Pathways to Excellence, the IFAC specifically addressed the capacity of PhD programs to prepare nursing scientists to conduct cutting-edge research in the following key emerging and priority areas of health sciences research: omics and the microbiome; health behavior, behavior change, and biobehavioral science; patient-reported outcomes; big data, e-science, and informatics; quantitative sciences; translation science; and health economics. The purpose of this article is to (a) describe IFAC activities, (b) summarize 2014 discussions hosted as part of the Idea Festival, and (c) present IFAC recommendations for incorporating these emerging areas of science and technology into research-focused doctoral programs committed to preparing graduates for lifelong, competitive careers in nursing science. The recommendations address clearer articulation of program focus areas; inclusion of foundational knowledge in emerging areas of science in core courses on nursing science and research methods; faculty composition; prerequisite student knowledge and skills; and in-depth, interdisciplinary training in the content and methods of the supporting areas of science. Copyright © 2015 Elsevier Inc. All rights reserved.
Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.
Handels, H; Ehrhardt, J
2009-01-01
Medical image computing has become one of the most challenging fields in medical informatics. In the image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters that characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness, and the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and risk analysis of patients, and will gain importance in future diagnostics and therapy. From a methodological point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized, application-specific systems and integration into the clinical workflow; enhanced computational models for image analysis and virtual reality training systems; integration of different image computing methods; further integration of multimodal image data and biosignals; and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
NASA Astrophysics Data System (ADS)
Amako, Eri; Enjoji, Takaharu; Uchida, Satoshi; Tochikubo, Fumiyoshi
Constant monitoring and immediate control of fermentation processes are required for advanced quality preservation in the food industry. In the present work, simple estimation of the metabolic states of heat-injured Escherichia coli (E. coli) in a micro-cell was investigated using the dielectrophoretic impedance measurement (DEPIM) method. The temporal change in conductance across the micro-gap (ΔG) was measured for various heat-treatment temperatures. In addition, the dependence of enzyme activity, growth capacity and membrane condition of E. coli on heat-treatment temperature was analyzed with conventional biological methods. Consequently, a quantitative correlation between ΔG and these biological properties was obtained. This result suggests that the DEPIM method can serve as an effective technique for monitoring complex changes in the various biological states of microorganisms.
Zhang, Gong; Huang, Guangwei; Xiao, Lu; Mitchell, Alyson E
2011-11-23
A sensitive and reliable LC-(ESI)MS/MS method was developed and validated for the simultaneous analysis of five common advanced glycation endproducts (AGEs) after enzymatic digestion in raw and roasted almonds. The AGEs included carboxymethyl-lysine (CML), carboxyethyl-lysine (CEL), pyrraline (Pyr), argpyrimidine (Arg-p), and pentosidine (Pento-s). This method allows accurate quantitation of both free forms and protein-bound adducts of the target AGEs. Results indicate that CML and CEL are found in both raw and roasted almonds. Pyr was identified for the first time in roasted almonds and accounted for 64.4% of the free plus bound measured AGEs. Arg-p and Pento-s were below the limit of detection in all almond samples tested. Free AGEs accounted for 1.3-26.8% of the free plus bound measured AGEs, indicating that protein-bound forms predominate. The roasting process significantly increased CML, CEL, and Pyr formation, but no significant correlation was observed between these AGEs and roasting temperature.
Rogers, Richard S; Abernathy, Michael; Richardson, Douglas D; Rouse, Jason C; Sperry, Justin B; Swann, Patrick; Wypych, Jette; Yu, Christopher; Zang, Li; Deshpande, Rohini
2017-11-30
Today, we are experiencing unprecedented growth and innovation within the pharmaceutical industry. Established protein therapeutic modalities, such as recombinant human proteins, monoclonal antibodies (mAbs), and fusion proteins, are being used to treat previously unmet medical needs. Novel therapies such as bispecific T cell engagers (BiTEs), chimeric antigen receptor T cells (CAR-Ts), siRNA, and gene therapies are paving the path towards increasingly personalized medicine. This advancement of new indications and therapeutic modalities is paralleled by development of new analytical technologies and methods that provide enhanced information content in a more efficient manner. Recently, a liquid chromatography-mass spectrometry (LC-MS) multi-attribute method (MAM) has been developed and designed for improved simultaneous detection, identification, quantitation, and quality control (monitoring) of molecular attributes (Rogers et al. MAbs 7(5):881-90, 2015). Based on peptide mapping principles, this powerful tool represents a true advancement in testing methodology that can be utilized not only during product characterization, formulation development, stability testing, and development of the manufacturing process, but also as a platform quality control method in dispositioning clinical materials for both innovative biotherapeutics and biosimilars.
Laser Speckle Contrast Imaging of Cerebral Blood Flow
Dunn, Andrew K.
2011-01-01
Laser speckle contrast imaging (LSCI) has emerged over the past decade as a powerful, yet simple, method for imaging of blood flow dynamics in real time. The rapid adoption of LSCI for physiological studies is due to the relative ease and low cost of building an instrument as well as the ability to quantify blood flow changes with excellent spatial and temporal resolution. Although measurements are limited to superficial tissues with no depth resolution, LSCI has been instrumental in pre-clinical studies of neurological disorders as well as clinical applications including dermatological, neurosurgical and endoscopic studies. Recently a number of technical advances have been developed to improve the quantitative accuracy and temporal resolution of speckle imaging. This article reviews some of these recent advances and describes several applications of speckle imaging. PMID:22109805
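The core LSCI computation is the local speckle contrast, K = sigma / mean intensity, evaluated in a sliding window over the raw speckle image; lower K indicates more motion blurring of the speckle during the exposure, i.e. faster flow. A minimal numpy/scipy sketch, in which the window size and the synthetic test frame are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def speckle_contrast(raw, window=7):
    """Local speckle contrast K = sigma / mean over a sliding window.

    Lower K means the speckle pattern blurred more during the exposure,
    i.e. faster motion (higher flow)."""
    img = raw.astype(float)
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img ** 2, window)
    var = np.maximum(mean_sq - mean ** 2, 0.0)  # clamp small negative float error
    return np.sqrt(var) / np.maximum(mean, 1e-12)

# Illustrative use on a synthetic frame of fully developed static speckle
rng = np.random.default_rng(0)
frame = rng.exponential(scale=100.0, size=(256, 256))
K = speckle_contrast(frame, window=7)
print(K.mean())  # approximately 1 for static, fully developed speckle
```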
Microscopic Imaging and Spectroscopy with Scattered Light
Boustany, Nada N.; Boppart, Stephen A.; Backman, Vadim
2012-01-01
Optical contrast based on elastic scattering interactions between light and matter can be used to probe cellular structure and dynamics, and image tissue architecture. The quantitative nature and high sensitivity of light scattering signals to subtle alterations in tissue morphology, as well as the ability to visualize unstained tissue in vivo, has recently generated significant interest in optical scatter based biosensing and imaging. Here we review the fundamental methodologies used to acquire and interpret optical scatter data. We report on recent findings in this field and present current advances in optical scatter techniques and computational methods. Cellular and tissue data enabled by current advances in optical scatter spectroscopy and imaging stand to impact a variety of biomedical applications including clinical tissue diagnosis, in vivo imaging, drug discovery and basic cell biology. PMID:20617940
Climate change and dengue: a critical and systematic review of quantitative modelling approaches
2014-01-01
Background Many studies have found associations between climatic conditions and dengue transmission. However, there is a debate about the future impacts of climate change on dengue transmission. This paper reviewed epidemiological evidence on the relationship between climate and dengue with a focus on quantitative methods for assessing the potential impacts of climate change on global dengue transmission. Methods A literature search was conducted in October 2012, using the electronic databases PubMed, Scopus, ScienceDirect, ProQuest, and Web of Science. The search focused on peer-reviewed journal articles published in English from January 1991 through October 2012. Results Sixteen studies met the inclusion criteria and most studies showed that the transmission of dengue is highly sensitive to climatic conditions, especially temperature, rainfall and relative humidity. Studies on the potential impacts of climate change on dengue indicate increased climatic suitability for transmission and an expansion of the geographic regions at risk during this century. A variety of quantitative modelling approaches were used in the studies. Several key methodological issues and current knowledge gaps were identified through this review. Conclusions It is important to assemble spatio-temporal patterns of dengue transmission compatible with long-term data on climate and other socio-ecological changes and this would advance projections of dengue risks associated with climate change. PMID:24669859
Ornatsky, Olga I.; Kinach, Robert; Bandura, Dmitry R.; Lou, Xudong; Tanner, Scott D.; Baranov, Vladimir I.; Nitz, Mark; Winnik, Mitchell A.
2008-01-01
Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping. PMID:19122859
Computer-assisted qualitative data analysis software.
Cope, Diane G
2014-05-01
Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.
Minority faculty members' resilience and academic productivity: are they related?
Cora-Bramble, Denice; Zhang, Kehua; Castillo-Page, Laura
2010-09-01
To explore whether there is a relationship between resilience and academic productivity of minority faculty members in U.S. academic health centers. For the purposes of the study, the authors defined academic productivity as peer-reviewed and non-peer-reviewed publications, grants, and academic promotion. In 2007, the authors simultaneously collected quantitative and qualitative data by using a triangulation (mixed-method) design. Past participants in the Association of American Medical Colleges' Minority Faculty Career Development Seminar completed the Web-based 70-item Personal Resilience Questionnaire (PRQ). In addition, two focus groups were conducted with past seminar participants. Seventy-four minority faculty members completed the PRQ, and 15 participated in the two focus groups. The quantitative data showed a positive correlation between demographic, educational, and academic productivity variables and certain resilience subscale scores. Common themes that emerged from the qualitative data were categorized under four major domains: existing barriers to academic advancement, internal protective factors or cultural buffers, external institutional or environmental facilitators, and necessary attributes for ensuring academic productivity and advancement. Certain resilience subscales showed correlation with academic productivity of minority faculty members, and specific personal and/or cultural characteristics were identified as enablers. Minority faculty members may benefit from skill development and coaching that extends beyond the traditional scope of faculty development programs and that specifically targets modifiable resilience characteristics. Additional research is needed, but such nontraditional, resilience-centered intervention strategies may positively affect the advancement of minority faculty in academic medicine.
NASA Astrophysics Data System (ADS)
Ruiz-Villanueva, Virginia; Piégay, Hervé; Gurnell, Angela A.; Marston, Richard A.; Stoffel, Markus
2016-09-01
Large wood is an important physical component of woodland rivers and significantly influences river morphology. It is also a key component of stream ecosystems. However, large wood is also a source of risk for human activities as it may damage infrastructure, block river channels, and induce flooding. Therefore, the analysis and quantification of large wood and its mobility are crucial for understanding and managing wood in rivers. As the number of large-wood-related studies by researchers, river managers, and stakeholders increases, documentation of commonly used and newly available techniques and their effectiveness has become increasingly relevant. Important data and knowledge have been obtained from the application of very different approaches and have generated a significant body of valuable information representative of different environments. This review provides a comprehensive qualitative and quantitative summary of recent advances regarding the different processes involved in large wood dynamics in fluvial systems, including wood budgeting and wood mechanics. First, some key definitions and concepts are introduced. Second, advances in quantifying large wood dynamics are reviewed; in particular, how measurements and modeling can be combined to integrate our understanding of how large wood moves through and is retained within river systems. Throughout, we present a quantitative and integrated meta-analysis compiled from different studies and geographical regions. Finally, we conclude by highlighting areas of particular research importance and their likely future trajectories, and we consider a particularly underresearched area so as to stress the future challenges for large wood research.
Li, Jun; Jiang, Bin; Song, Hongwei; ...
2015-04-17
Here, we survey the recent advances in theoretical understanding of quantum state resolved dynamics, using the title reactions as examples. It is shown that the progress was made possible by major developments in two areas. First, an accurate analytical representation of many high-level ab initio points over a large configuration space can now be made with high fidelity and the necessary permutation symmetry. The resulting full-dimensional global potential energy surfaces enable dynamical calculations using either quasi-classical trajectory or, more importantly, quantum mechanical methods. The second advance is the development of accurate and efficient quantum dynamical methods, which are necessary for providing a reliable treatment of quantum effects in reaction dynamics such as tunneling, resonances, and zero-point energy. The powerful combination of the two advances has allowed us to achieve a quantitatively accurate characterization of the reaction dynamics, which unveiled rich dynamical features such as steric steering, strong mode specificity, and bond selectivity. The dependence of reactivity on reactant modes can be rationalized by the recently proposed sudden vector projection model, which attributes the mode specificity and bond selectivity to the coupling of reactant modes with the reaction coordinate at the relevant transition state. The deeper insights provided by these theoretical studies have advanced our understanding of reaction dynamics to a new level.
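The sudden vector projection model invoked above reduces to a simple geometric quantity: the normalized overlap between a reactant normal-mode displacement vector and the reaction-coordinate vector at the transition state. A minimal sketch with invented 3N-dimensional vectors; all numbers are placeholders, not data from the surveyed reactions:

```python
import numpy as np

def svp_overlap(mode_vec, rc_vec):
    """Sudden vector projection: cosine overlap between a reactant
    normal-mode displacement vector and the reaction-coordinate vector
    at the transition state. A larger overlap suggests the mode more
    strongly promotes the reaction."""
    mode = mode_vec / np.linalg.norm(mode_vec)
    rc = rc_vec / np.linalg.norm(rc_vec)
    return abs(np.dot(mode, rc))

# Hypothetical mass-weighted displacement vectors, for illustration only
stretch = np.array([0.0, 0.0, 0.7, 0.0, 0.0, -0.7])
bend = np.array([0.5, 0.0, 0.0, -0.5, 0.0, 0.0])
rc = np.array([0.1, 0.0, 0.6, 0.0, 0.0, -0.6])
for name, v in [("stretch", stretch), ("bend", bend)]:
    print(name, round(svp_overlap(v, rc), 3))  # stretch overlaps strongly, bend weakly
```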
Comparative policy analysis for alcohol and drugs: Current state of the field.
Ritter, Alison; Livingston, Michael; Chalmers, Jenny; Berends, Lynda; Reuter, Peter
2016-05-01
A central policy research question concerns the extent to which specific policies produce certain effects, and cross-national (or between state/province) comparisons appear to be an ideal way to answer such a question. This paper explores the current state of comparative policy analysis (CPA) with respect to alcohol and drug policies. We created a database of journal articles published between 2010 and 2014 as the body of CPA work for analysis. We used this database of 57 articles to clarify, extract and analyse the ways in which CPA has been defined. Quantitative and qualitative analysis of the CPA methods employed, the policy areas that have been studied, and differences between alcohol CPA and drug CPA are explored. There is a lack of clear definition as to what counts as a CPA. The two criteria for a CPA (explicit study of a policy, and comparison across two or more geographic locations) exclude descriptive epidemiology and single-state comparisons. With the strict definition, most CPAs were with reference to alcohol (42%), although the most common policy to be analysed was medical cannabis (23%). The vast majority of papers undertook quantitative data analysis, with a variety of advanced statistical methods. We identified five approaches to policy specification: classification or categorical coding of policy as present or absent; the use of an index; implied policy differences; described policy differences; and data-driven policy coding. Each of these has limitations, but perhaps the most common limitation was the inability of the methods to account for differences between policy-as-stated and policy-as-implemented. There is significant diversity in CPA methods for the analysis of alcohol and drug policy, and some substantial challenges with the currently employed methods. The absence of clear boundaries to a definition of what counts as a 'comparative policy analysis' may account for the methodological plurality but also appears to stand in the way of advancing the techniques. Copyright © 2016 Elsevier B.V. All rights reserved.
Santos, Radleigh G.; Appel, Jon R.; Giulianotti, Marc A.; Edwards, Bruce S.; Sklar, Larry A.; Houghten, Richard A.; Pinilla, Clemencia
2014-01-01
In the past 20 years, synthetic combinatorial methods have fundamentally advanced the ability to synthesize and screen large numbers of compounds for drug discovery and basic research. Mixture-based libraries and positional scanning deconvolution combine two approaches for the rapid identification of specific scaffolds and active ligands. Here we present a quantitative assessment of the screening of 32 positional scanning libraries in the identification of highly specific and selective ligands for two formylpeptide receptors. We also compare and contrast two mixture-based library approaches using a mathematical model to facilitate the selection of active scaffolds and libraries to be pursued for further evaluation. The flexibility demonstrated in the differently formatted mixture-based libraries allows for their screening in a wide range of assays. PMID:23722730
Metals handbook. Volume 12: Fractography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
ASM International has published this handbook in response to the growing interest in the science of fractography, the result of improved methods of preparing specimens, advances in photographic techniques and equipment, refinement of the scanning electron microscope, and the introduction of quantitative fractography. The book covers all aspects of fracture examination and interpretation, including electron and quantitative fractography. The text is accompanied by line drawings, graphs, and photographic illustrations of fracture surfaces and microstructural features. Articles explain and illustrate the principal modes of fracture and the effects of loading history, environment, and materials quality on fracture appearance. An atlas of fractographs constitutes the second half of the volume and contains more than 1300 fractographs, including a collection of ferrous and nonferrous alloy parts. Supplemental illustrations of failed metal-matrix composites, resin-matrix composites, polymers, and electronic materials are provided.
NASA Astrophysics Data System (ADS)
Mueller, Jenna L.; Harmany, Zachary T.; Mito, Jeffrey K.; Kennedy, Stephanie A.; Kim, Yongbaek; Dodd, Leslie; Geradts, Joseph; Kirsch, David G.; Willett, Rebecca M.; Brown, J. Quincy; Ramanujam, Nimmi
2013-02-01
The combination of fluorescent contrast agents with microscopy is a powerful technique to obtain real time images of tissue histology without the need for fixing, sectioning, and staining. The potential of this technology lies in the identification of robust methods for image segmentation and quantitation, particularly in heterogeneous tissues. Our solution is to apply sparse decomposition (SD) to monochrome images of fluorescently-stained microanatomy to segment and quantify distinct tissue types. The clinical utility of our approach is demonstrated by imaging excised margins in a cohort of mice after surgical resection of a sarcoma. Representative images of excised margins were used to optimize the formulation of SD and tune parameters associated with the algorithm. Our results demonstrate that SD is a robust solution that can advance vital fluorescence microscopy as a clinically significant technology.
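As a rough illustration of segmenting by sparse decomposition (not the authors' exact SD formulation), one can learn a dictionary from patches of one tissue class and classify new patches by their sparse-reconstruction residual; the patch data below are random stand-ins, not microscopy images:

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

# Learn a patch dictionary from one tissue class, then measure how well new
# patches are represented sparsely by it; a poor fit (large residual)
# suggests the patch belongs to a different tissue type.
rng = np.random.default_rng(1)
patches_a = rng.normal(0.0, 1.0, (500, 64))  # stand-in "tumor" patches (8x8, flattened)
patches_b = rng.normal(3.0, 1.0, (500, 64))  # stand-in "muscle" patches

dico = MiniBatchDictionaryLearning(n_components=32, alpha=1.0, random_state=0)
dico.fit(patches_a)

def residual(patches, dico):
    codes = dico.transform(patches)        # sparse codes for each patch
    recon = codes @ dico.components_       # reconstruction from the dictionary
    return np.linalg.norm(patches - recon, axis=1)

# Patches from the "foreign" class reconstruct poorly on average
print(residual(patches_a, dico).mean(), residual(patches_b, dico).mean())
```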
Simulation Credibility: Advances in Verification, Validation, and Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Mehta, Unmeel B. (Editor); Eklund, Dean R.; Romero, Vicente J.; Pearce, Jeffrey A.; Keim, Nicholas S.
2016-01-01
Decision makers and other users of simulations need quantified simulation credibility in order to make simulation-based critical decisions and to use simulations effectively. The credibility of a simulation is quantified by its accuracy in terms of uncertainty, and the responsibility of establishing credibility lies with the creator of the simulation. In this volume, we present some state-of-the-art philosophies, principles, and frameworks. The contributing authors involved in this publication have been dedicated to advancing simulation credibility. They detail and provide examples of key advances over the last 10 years in the processes used to quantify simulation credibility: verification, validation, and uncertainty quantification. The philosophies and assessment methods presented here are anticipated to be useful to other technical communities conducting continuum physics-based simulations; for example, issues related to the establishment of simulation credibility in the discipline of propulsion are discussed. We envision that simulation creators will find this volume very useful to guide and assist them in quantitatively conveying the credibility of their simulations.
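A worked miniature of the uncertainty-quantification step discussed above: propagate input uncertainty through a simulation by Monte Carlo sampling and report an output interval. The model function and the input distributions are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulation(drag_coeff, velocity):
    # Stand-in "simulation": a dynamic-pressure-style model, q = 0.5 * rho * Cd * v^2
    return 0.5 * 1.225 * drag_coeff * velocity**2

n = 100_000
cd = rng.normal(0.30, 0.02, n)   # uncertain drag coefficient (illustrative)
v = rng.normal(50.0, 1.5, n)     # uncertain velocity, m/s (illustrative)
out = simulation(cd, v)

lo, hi = np.percentile(out, [2.5, 97.5])
print(f"mean={out.mean():.1f}  95% interval=[{lo:.1f}, {hi:.1f}]")
```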
Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics
Shi, Tujin; Su, Dian; Liu, Tao; Tang, Keqi; Camp, David G.; Qian, Wei-Jun; Smith, Richard D.
2012-01-01
Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for, e.g., detecting low-abundance biomarkers likely present in the low ng/mL to pg/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides including posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low to sub-ng/mL level in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed. PMID:22577010
Gong, Xing-Chu; Chen, Teng; Qu, Hai-Bin
2017-03-01
Quality by design (QbD) is an advanced pharmaceutical quality control concept. The application of the QbD concept in the research and development of pharmaceutical processes of traditional Chinese medicines (TCM) mainly contains five parts: the definition of critical processes and their evaluation criteria, the determination of critical process parameters and critical material attributes, the establishment of quantitative models, the development of design space, and the application and continuous improvement of control strategy. In this work, recent research advances in QbD concept implementation methods in the secondary development of Chinese patent medicines were reviewed, and five promising fields for the implementation of the QbD concept were pointed out, including the research and development of new TCM drugs and Chinese medicine granules for formulation, the modeling of pharmaceutical processes, the development of control strategies based on industrial big data, strengthening research on process scale-up rules, and the development of new pharmaceutical equipment. Copyright© by the Chinese Pharmaceutical Association.
Advancing the sensitivity of selected reaction monitoring-based targeted quantitative proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shi, Tujin; Su, Dian; Liu, Tao
2012-04-01
Selected reaction monitoring (SRM)—also known as multiple reaction monitoring (MRM)—has emerged as a promising high-throughput targeted protein quantification technology for candidate biomarker verification and systems biology applications. A major bottleneck for current SRM technology, however, is insufficient sensitivity for, e.g., detecting low-abundance biomarkers likely present in the pg/mL to low ng/mL range in human blood plasma or serum, or extremely low-abundance signaling proteins in cells or tissues. Herein we review recent advances in methods and technologies, including front-end immunoaffinity depletion, fractionation, selective enrichment of target proteins/peptides or their posttranslational modifications (PTMs), as well as advances in MS instrumentation, which have significantly enhanced the overall sensitivity of SRM assays and enabled the detection of low-abundance proteins at low to sub-ng/mL level in human blood plasma or serum. General perspectives on the potential of achieving sufficient sensitivity for detection of pg/mL level proteins in plasma are also discussed.
Thompson, John W; Sorum, Alexander W; Hsieh-Wilson, Linda C
2018-06-23
The dynamic posttranslational modification O-linked β-N-acetylglucosamine glycosylation (O-GlcNAcylation) is present on thousands of intracellular proteins in the brain. Like phosphorylation, O-GlcNAcylation is inducible and plays important functional roles in both physiology and disease. Recent advances in mass spectrometry (MS) and bioconjugation methods are now enabling the mapping of O-GlcNAcylation events to individual sites in proteins. However, our understanding of which glycosylation events are necessary for regulating protein function and controlling specific processes, phenotypes, or diseases remains in its infancy. Given the sheer number of O-GlcNAc sites, methods are greatly needed to identify promising sites and prioritize them for time- and resource-intensive functional studies. Revealing sites that are dynamically altered by different stimuli or disease states will likely go a long way in this regard. Here, we describe advanced methods for identifying O-GlcNAc sites on individual proteins and across the proteome, and for determining their stoichiometry in vivo. We also highlight emerging technologies for quantitative, site-specific MS-based O-GlcNAc proteomics (O-GlcNAcomics), which allow proteome-wide tracking of O-GlcNAcylation dynamics at individual sites. These cutting-edge technologies are beginning to bridge the gap between the high-throughput cataloging of O-GlcNAcylated proteins and the relatively low-throughput study of individual proteins. By uncovering the O-GlcNAcylation events that change in specific physiological and disease contexts, these new approaches are providing key insights into the regulatory functions of O-GlcNAc in the brain, including their roles in neuroprotection, neuronal signaling, learning and memory, and neurodegenerative diseases.
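Site stoichiometry, as referenced above, is commonly expressed as the occupied fraction of a site. A minimal sketch assuming equal MS response for the glycosylated and unmodified peptide forms (a simplification) and hypothetical intensities:

```python
def site_stoichiometry(glyco_intensity, unmod_intensity):
    """Fraction of a site occupied by O-GlcNAc, from the MS intensities of
    the glycosylated and unmodified forms of the same peptide. Assumes
    equal ionization efficiency for both forms, which is a simplification."""
    total = glyco_intensity + unmod_intensity
    return glyco_intensity / total if total else float("nan")

# Hypothetical intensities for one site under two conditions
print(round(site_stoichiometry(2.1e6, 8.4e6), 2))   # basal: 0.20
print(round(site_stoichiometry(5.6e6, 4.9e6), 2))   # stimulated: 0.53
```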
Hattotuwagama, Channa K; Doytchinova, Irini A; Flower, Darren R
2007-01-01
Quantitative structure-activity relationship (QSAR) analysis is a cornerstone of modern informatics. Predictive computational models of peptide-major histocompatibility complex (MHC)-binding affinity based on QSAR technology have now become important components of modern computational immunovaccinology. Historically, such approaches have been built around semiqualitative, classification methods, but these are now giving way to quantitative regression methods. We review three methods: a 2D-QSAR additive partial least squares (PLS) method and a 3D-QSAR comparative molecular similarity index analysis (CoMSIA) method, which can identify the sequence dependence of peptide-binding specificity for various class I MHC alleles from the reported binding affinities (IC50) of peptide sets; and an iterative self-consistent (ISC) PLS-based additive method, a recently developed extension to the additive method for the affinity prediction of class II peptides. The QSAR methods presented here have established themselves as immunoinformatic techniques complementary to existing methodology, useful in the quantitative prediction of binding affinity: current methods for the in silico identification of T-cell epitopes (which form the basis of many vaccines, diagnostics, and reagents) rely on the accurate computational prediction of peptide-MHC affinity. We have reviewed various human and mouse class I and class II allele models. Studied alleles comprise HLA-A*0101, HLA-A*0201, HLA-A*0202, HLA-A*0203, HLA-A*0206, HLA-A*0301, HLA-A*1101, HLA-A*3101, HLA-A*6801, HLA-A*6802, HLA-B*3501, H2-K(k), H2-K(b), H2-D(b), HLA-DRB1*0101, HLA-DRB1*0401, HLA-DRB1*0701, I-A(b), I-A(d), I-A(k), I-A(S), I-E(d), and I-E(k). In this chapter we provide a step-by-step guide to building these predictive models and assessing their reliability; the resulting models represent an advance on existing methods. The peptides used in this study are available from the AntiJen database (http://www.jenner.ac.uk/AntiJen). The PLS method is available commercially in the SYBYL molecular modeling software package. The resulting models, which can be used for accurate T-cell epitope prediction, will be made freely available online at http://www.jenner.ac.uk/MHCPred.
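The additive PLS approach described above is, at its core, a linear regression of affinity on position-wise residue descriptors. A minimal scikit-learn sketch with synthetic placeholder data (not peptides from AntiJen), standing in for the commercial SYBYL implementation:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Rows are peptides; columns are indicator descriptors (amino acid identity
# at each binding position); the target is binding affinity as pIC50.
# All data here are random placeholders for illustration.
rng = np.random.default_rng(7)
X = rng.integers(0, 2, size=(120, 180)).astype(float)       # position x residue indicators
y = X @ rng.normal(0, 0.1, 180) + rng.normal(0, 0.2, 120)   # synthetic pIC50 values

pls = PLSRegression(n_components=5)
q2 = cross_val_score(pls, X, y, cv=5, scoring="r2").mean()  # cross-validated q2
print(f"cross-validated q2 = {q2:.2f}")
```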
Schilling, Birgit; Rardin, Matthew J; MacLean, Brendan X; Zawadzka, Anna M; Frewen, Barbara E; Cusack, Michael P; Sorensen, Dylan J; Bereman, Michael S; Jing, Enxuan; Wu, Christine C; Verdin, Eric; Kahn, C Ronald; Maccoss, Michael J; Gibson, Bradford W
2012-05-01
Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture (SILAC) cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models.
Schilling, Birgit; Rardin, Matthew J.; MacLean, Brendan X.; Zawadzka, Anna M.; Frewen, Barbara E.; Cusack, Michael P.; Sorensen, Dylan J.; Bereman, Michael S.; Jing, Enxuan; Wu, Christine C.; Verdin, Eric; Kahn, C. Ronald; MacCoss, Michael J.; Gibson, Bradford W.
2012-01-01
Despite advances in metabolic and postmetabolic labeling methods for quantitative proteomics, there remains a need for improved label-free approaches. This need is particularly pressing for workflows that incorporate affinity enrichment at the peptide level, where isobaric chemical labels such as isobaric tags for relative and absolute quantitation and tandem mass tags may prove problematic or where stable isotope labeling with amino acids in cell culture (SILAC) cannot be readily applied. Skyline is a freely available, open source software tool for quantitative data processing and proteomic analysis. We expanded the capabilities of Skyline to process ion intensity chromatograms of peptide analytes from full scan mass spectral data (MS1) acquired during HPLC MS/MS proteomic experiments. Moreover, unlike existing programs, Skyline MS1 filtering can be used with mass spectrometers from four major vendors, which allows results to be compared directly across laboratories. The new quantitative and graphical tools now available in Skyline specifically support interrogation of multiple acquisitions for MS1 filtering, including visual inspection of peak picking and both automated and manual integration, key features often lacking in existing software. In addition, Skyline MS1 filtering displays retention time indicators from underlying MS/MS data contained within the spectral library to ensure proper peak selection. The modular structure of Skyline also provides well defined, customizable data reports and thus allows users to directly connect to existing statistical programs for post hoc data analysis. To demonstrate the utility of the MS1 filtering approach, we have carried out experiments on several MS platforms and have specifically examined the performance of this method to quantify two important post-translational modifications: acetylation and phosphorylation, in peptide-centric affinity workflows of increasing complexity using mouse and human models. PMID:22454539
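The essence of MS1 filtering, extracting ion-intensity chromatograms for a peptide from full-scan data, can be sketched in a few lines. The scan container below is a simplified stand-in for a real mzML reader, not Skyline's internal data model, and the target m/z is hypothetical:

```python
import numpy as np

def extract_xic(scans, target_mz, ppm_tol=10.0):
    """Extracted ion chromatogram from full-scan MS1 data: for each scan,
    sum the intensity within a ppm window around the target m/z.

    `scans` is a list of (rt, mz_array, intensity_array) tuples, a
    simplified stand-in for an mzML reader."""
    tol = target_mz * ppm_tol * 1e-6
    rts, xic = [], []
    for rt, mz, inten in scans:
        mask = np.abs(mz - target_mz) <= tol
        rts.append(rt)
        xic.append(inten[mask].sum())
    return np.array(rts), np.array(xic)

# Usage sketch: peak area then follows from integrating xic over the
# elution window, e.g.
#   rts, xic = extract_xic(scans, target_mz=523.2764)
#   area = np.trapz(xic, rts)   # np.trapezoid on NumPy >= 2.0
```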
Randomized controlled trials and meta-analysis in medical education: what role do they play?
Cook, David A
2012-01-01
Education researchers seek to understand what works, for whom, in what circumstances. Unfortunately, educational environments are complex and research itself is highly context dependent. Faced with these challenges, some have argued that qualitative methods should supplant quantitative methods such as randomized controlled trials (RCTs) and meta-analysis. I disagree. Good qualitative and mixed-methods research is complementary to, rather than exclusive of, quantitative methods. The complexity and challenges we face should not beguile us into ignoring methods that provide strong evidence. What, then, is the proper role for RCTs and meta-analysis in medical education? First, the choice of study design depends on the research question. RCTs and meta-analysis are appropriate for many, but not all, study goals. They have compelling strengths but also numerous limitations. Second, strong methods will not compensate for a pointless question. RCTs do not advance the science when they make confounded comparisons, or make comparisons with no intervention. Third, clinical medicine now faces many of the same challenges we encounter in education. We can learn much from other fields about how to handle complexity in RCTs. Finally, no single study will definitively answer any research question. We need carefully planned, theory-building, programmatic research, reflecting a variety of paradigms and approaches, as we accumulate evidence to change the art and science of education.
Quantitative PCR detection of Batrachochytrium dendrobatidis DNA from sediments and water
Kirshtein, Julie D.; Anderson, Chauncey W.; Wood, J.S.; Longcore, Joyce E.; Voytek, Mary A.
2007-01-01
The fungal pathogen Batrachochytrium dendrobatidis (Bd) causes chytridiomycosis, a disease implicated in amphibian declines on 5 continents. Polymerase chain reaction (PCR) primer sets exist with which amphibians can be tested for this disease, and advances in sampling techniques allow non-invasive testing of animals. We developed filtering and PCR-based quantitative methods, by modifying existing PCR assays, to detect Bd DNA in water and sediments without the need for testing amphibians; we tested the methods at 4 field sites. The SYBR based assay using Boyle primers (SYBR/Boyle assay) and the Taqman based assay using Wood primers performed similarly with samples generated in the laboratory (Bd spiked filters), but the SYBR/Boyle assay detected Bd DNA in more field samples. We detected Bd DNA in water from 3 of 4 sites tested, including one pond historically negative for chytridiomycosis. Zoospore equivalents in sampled water ranged from 19 to 454 l(-1) (nominal detection limit is 10 DNA copies, or about 0.06 zoospore). We did not detect Bd DNA in sediments collected at any site. Our filtering and amplification methods provide a new tool to investigate critical aspects of Bd in the environment. © Inter-Research 2007.
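Converting qPCR copy numbers into zoospore equivalents per liter, as reported above, is a chain of simple normalizations. In this sketch every parameter value (copies per zoospore, fraction of the extract analyzed, volume filtered) is an illustrative placeholder that would need per-assay calibration:

```python
def zoospore_equivalents_per_liter(dna_copies,
                                   copies_per_zoospore=170.0,
                                   fraction_analyzed=0.1,
                                   volume_filtered_l=0.6):
    """Back-calculate zoospore equivalents per liter of filtered water
    from a qPCR copy-number result. All default values are illustrative
    placeholders; copies-per-zoospore in particular is assay-specific."""
    zoospores_in_reaction = dna_copies / copies_per_zoospore
    zoospores_in_sample = zoospores_in_reaction / fraction_analyzed
    return zoospores_in_sample / volume_filtered_l

print(f"{zoospore_equivalents_per_liter(2300.0):.0f} zoospore equivalents per liter")
```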
Novel Contrast Mechanisms at 3 Tesla and 7 Tesla
Regatte, Ravinder R.; Schweitzer, Mark E.
2013-01-01
Osteoarthritis (OA) is the most common musculoskeletal degenerative disease, affecting millions of people. Although OA has been considered primarily a cartilage disorder associated with focal cartilage degeneration, it is accompanied by well-known changes in subchondral and trabecular bone, including sclerosis and osteophyte formation. The exact cause of OA initiation and progression remains under debate, but OA typically first affects weightbearing joints such as the knee. Magnetic resonance imaging (MRI) has been recognized as a potential tool for quantitative assessment of cartilage abnormalities due to its excellent soft tissue contrast. Over the last two decades, several new MR biochemical imaging methods have been developed to characterize the disease process and possibly predict the progression of knee OA. These new MR biochemical methods play an important role not only for diagnosis of disease at an early stage, but also for their potential use in monitoring outcome of various drug therapies (success or failure). Recent advances in multicoil radiofrequency technology and high field systems (3 T and above) significantly improve the sensitivity and specificity of imaging studies for the diagnosis of musculoskeletal disorders. The current state-of-the-art MR imaging methods are briefly reviewed for the quantitative biochemical and functional imaging assessment of musculoskeletal systems. PMID:18850506
Yamane, Naoe; Takami, Tomonori; Tozuka, Zenzaburo; Sugiyama, Yuichi; Yamazaki, Akira; Kumagai, Yuji
2009-01-01
A sample treatment procedure and a highly sensitive liquid chromatography/tandem mass spectrometry (LC/MS/MS) method for the quantitative determination of nicardipine in human plasma were developed for a microdose clinical trial with nicardipine, a non-radioisotope-labeled drug. The calibration curve was linear in the range of 1-500 pg/mL using 1 mL of plasma. Analytical method validation for the clinical dose, for which the calibration curve was linear in the range of 0.2-100 ng/mL using 20 microL of plasma, was also conducted. Each method was successfully applied to plasma determinations using LC/MS/MS after administration of a microdose (100 microg) and a clinical dose (20 mg) to each of six healthy volunteers. We tested new approaches in the search for metabolites in plasma after microdosing. In vitro metabolites of nicardipine were characterized using linear ion trap-Fourier transform ion cyclotron resonance mass spectrometry (LIT-FTICRMS), and the nine metabolites predicted to be in plasma were analyzed using LC/MS/MS. There is a strong possibility that LC/MS/MS analysis of metabolites may prove useful in microdose clinical trials with non-radioisotope-labeled drugs.
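Quantitation over a linear range such as 1-500 pg/mL rests on a linear calibration curve. A minimal sketch with invented peak-area ratios; the 1/x^2 weighting is a common bioanalytical choice, not necessarily the one used in this study:

```python
import numpy as np

# Weighted linear calibration: fit peak-area ratio vs. nominal concentration,
# then back-calculate an unknown. All numbers are illustrative.
conc = np.array([1, 2, 5, 10, 50, 100, 250, 500], dtype=float)    # pg/mL
ratio = np.array([0.011, 0.021, 0.052, 0.10, 0.51, 1.02, 2.49, 5.05])

# np.polyfit weights multiply the residuals before squaring, so w = 1/x
# gives squared residuals weighted by 1/x^2.
slope, intercept = np.polyfit(conc, ratio, 1, w=1.0 / conc)

unknown_ratio = 0.34
print(f"back-calculated: {(unknown_ratio - intercept) / slope:.1f} pg/mL")
```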
Sparse QSAR modelling methods for therapeutic and regenerative medicine
NASA Astrophysics Data System (ADS)
Winkler, David A.
2018-02-01
The quantitative structure-activity relationships method was popularized by Hansch and Fujita over 50 years ago. The usefulness of the method for drug design and development has been shown in the intervening years. As it was developed initially to elucidate which molecular properties modulated the relative potency of putative agrochemicals, and at a time when computing resources were scarce, there is much scope for applying modern mathematical methods to improve the QSAR method and to extend the general concept to the discovery and optimization of bioactive molecules and materials more broadly. I describe research over the past two decades in which we have rebuilt the unit operations of the QSAR method using improved mathematical techniques, and have applied this valuable platform technology to new important areas of research and industry such as nanoscience, omics technologies, advanced materials, and regenerative medicine. This paper was presented as the 2017 ACS Herman Skolnik lecture.
Niwa, Miki; Katada, Naonobu
2013-10-01
In this review, a method for the characterization of zeolite acidity by the temperature-programmed desorption (TPD) of ammonia, and its improvement by simultaneous IR measurement and DFT calculation, are described. First, various methods of ammonia TPD are explained, since the measurements have been conducted under the concepts of kinetics, equilibrium, or diffusion control. It is, however, emphasized that the ubiquitous TPD experiment is governed by the equilibrium between ammonia molecules in the gas phase and on the surface. Therefore, a method to measure quantitatively the strength of the acid site (∆H upon ammonia desorption) under equilibrium-controlled conditions is elucidated. Then, a quantitative relationship between ∆H and the H0 function is proposed, based on which the acid strength ∆H can be converted into the H0 function. The identification of the desorption peaks and the quantitative measurement of the number of acid sites are then explained. In order to overcome a serious disadvantage of the method (i.e., no information is provided about the structure of acid sites), the simultaneous measurement of IR spectroscopy with ammonia TPD, named IRMS-TPD (infrared spectroscopy/mass spectrometry-temperature-programmed desorption), is proposed. Based on this improved measurement, Brønsted and Lewis acid sites were differentiated and the distribution of Brønsted OH was revealed. The acidity characterized by IRMS-TPD was further supported by theoretical DFT calculation. Thus, the advanced study of zeolite acidity at the molecular level was made possible. Advantages and disadvantages of the ammonia TPD experiment are discussed, and understanding of the catalytic cracking activity based on the derived acidity profile is explained. Copyright © 2013 The Chemical Society of Japan and Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Advances in mass spectrometry-based approaches for neuropeptide analysis].
Ji, Qianyue; Ma, Min; Peng, Xin; Jia, Chenxi
2017-07-25
Neuropeptides are an important class of endogenous bioactive substances involved in the function of the nervous system, and connect the brain with other neural and peripheral organs. Mass spectrometry-based neuropeptidomics is designed to study neuropeptides in a large-scale manner and obtain important molecular information to further understand the mechanism of nervous system regulation and the pathogenesis of neurological diseases. This review summarizes the basic strategies for the study of neuropeptides using mass spectrometry, including sample preparation and processing, qualitative and quantitative methods, and mass spectrometry imaging.
Molecularly imprinted polymers for the detection of illegal drugs and additives: a review.
Xiao, Deli; Jiang, Yue; Bi, Yanping
2018-04-04
This review (with 154 refs.) describes the current status of using molecularly imprinted polymers (MIPs) in the extraction and quantitation of illicit drugs and additives. The review starts with an introduction to some synthesis methods (lump MIPs, spherical MIPs, surface imprinting) of MIPs using illicit drugs and additives as templates. The next section covers applications, with subsections on the detection of illegal additives in food, of doping in sports, and of illicit addictive drugs. A particular focus is directed towards current limitations and challenges: the optimization of methods for the preparation of MIPs, their applicability to aqueous samples, the leakage of template molecules, and the identification of the best balance between adsorption capacity and selectivity factor. Finally, the need for convincing characterization methods, the lack of uniform parameters for defining selectivity, and the merits and demerits of MIPs prepared using nanomaterials are addressed. Strategies are suggested to solve existing problems, and future developments are discussed with respect to a more widespread use in relevant fields. Graphical abstract: This review gives a comprehensive overview of the advances made in molecular imprinting of polymers for use in the extraction and quantitation of illicit drugs and additives. Methods for synthesis, highlighted applications, limitations and current challenges are specifically addressed.
Health risk behaviours amongst school adolescents: protocol for a mixed methods study.
El Achhab, Youness; El Ammari, Abdelghaffar; El Kazdouh, Hicham; Najdi, Adil; Berraho, Mohamed; Tachfouti, Nabil; Lamri, Driss; El Fakir, Samira; Nejjari, Chakib
2016-11-29
Determining the risky behaviours of adolescents provides valuable information for designing appropriate intervention programmes for advancing adolescents' health. However, these behaviours are not fully addressed by researchers in a comprehensive approach. We report the protocol of a mixed methods study designed to investigate the health risk behaviours of Moroccan adolescents with the goal of identifying suitable strategies to address their health concerns. We used a sequential two-phase explanatory mixed-methods study design. The approach begins with the collection of quantitative data, followed by the collection of qualitative data to explain and enrich the quantitative findings. In the first phase, the global school-based student health survey (GSHS) was administered to 800 students who were between 14 and 19 years of age. The second phase engaged adolescents, parents and teachers in focus groups and assessed education documents to explore the level of coverage of health education in the middle school curriculum. To obtain opinions about strategies to reduce Moroccan adolescents' health risk behaviours, a nominal group technique will be used. The findings of this mixed methods sequential explanatory study provide insights into the risk behaviours that need to be considered if intervention programmes and preventive strategies are to be designed to promote adolescents' health in Moroccan schools.
Single Molecule Approaches in RNA-Protein Interactions.
Serebrov, Victor; Moore, Melissa J
RNA-protein interactions govern every aspect of RNA metabolism, and aberrant RNA-binding proteins are the cause of hundreds of genetic diseases. Quantitative measurements of these interactions are necessary in order to understand mechanisms leading to diseases and to develop efficient therapies. Existing methods of RNA-protein interactome capture can afford a comprehensive snapshot of RNA-protein interaction networks but lack the ability to characterize the dynamics of these interactions. As all ensemble methods, their resolution is also limited by statistical averaging. Here we discuss recent advances in single molecule techniques that have the potential to tackle these challenges. We also provide a thorough overview of single molecule colocalization microscopy and the essential protein and RNA tagging and detection techniques.
Hybrid electro-optics and chipscale integration of electronics and photonics
NASA Astrophysics Data System (ADS)
Dalton, L. R.; Robinson, B. H.; Elder, D. L.; Tillack, A. F.; Johnson, L. E.
2017-08-01
Taken together, theory-guided nano-engineering of organic electro-optic materials and hybrid device architectures have permitted dramatic improvement of the performance of electro-optic devices. For example, the voltage-length product has been improved by nearly a factor of 10(4), bandwidths have been extended to nearly 200 GHz, device footprints reduced to less than 200 μm(2), and femtojoule energy efficiency achieved. This presentation discusses the utilization of new coarse-grained theoretical methods and advanced quantum mechanical methods to quantitatively simulate the physical properties of new classes of organic electro-optic materials and to evaluate their performance in nanoscopic device architectures, accounting for the effect on chromophore ordering at interfaces in nanoscopic waveguides.
Sibling recurrence and the genetic epidemiology of autism
Constantino, John N.; Zhang, Yi; Frazier, Thomas; Abbacchi, Anna M.; Law, Paul
2010-01-01
Objective Although the symptoms of autism exhibit quantitative distributions in nature, estimates of recurrence risk in families have never previously considered or incorporated quantitative characterization of the autistic phenotype among siblings. Method We report the results of quantitative characterization of 2,920 children from 1,235 families participating in a national volunteer register who met the criteria of having at least one child clinically-affected by an autism spectrum disorder (ASD) and at least one full biological sibling. Results The occurrence of a traditionally-defined ASD in an additional child occurred in 10.9% of the families. An additional 20% of non-ASD-affected siblings had a history of language delay, half of whom had exhibited autistic qualities of speech. Quantitative characterization using the Social Responsiveness Scale (SRS) supported previously-reported aggregation of a wide range of subclinical (quantitative) autistic traits among otherwise unaffected children in multiple-incidence families, and a relative absence of quantitative autistic traits among siblings in single-incidence autism families. Girls whose standardized severity ratings fell above a first percentile severity threshold (relative to the general population distribution) were significantly less likely to have elicited community diagnoses than their male counterparts. Conclusions These data suggest that, depending on how it is defined, sibling recurrence in ASD may exceed previously-published estimates, and varies as a function of family type. The results support differences in mechanisms of genetic transmission between simplex and multiplex autism, and advance current understanding of the genetic epidemiology of autism. PMID:20889652
Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui
2012-06-20
With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been devoted to improving MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification have also been developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes recent advances in such techniques and their applications in the qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
Carpenter, Paul; Curreri, Peter A. (Technical Monitor)
2002-01-01
This course will cover practical applications of the energy-dispersive spectrometer (EDS) to x-ray microanalysis. Topics covered will include detector technology, advances in pulse processing, resolution and performance monitoring, detector modeling, peak deconvolution and fitting, qualitative and quantitative analysis, compositional mapping, and standards. An emphasis will be placed on use of the EDS for quantitative analysis, with discussion of typical problems encountered in the analysis of a wide range of materials and sample geometries.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction into the key theories describing polymer brush regimes, a user's guide is provided to estimating maximum chain coverage and, importantly, to examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular on single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
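Of the approaches assessed above, the dry-thickness method has the simplest closed form: sigma = h * rho * N_A / Mn. A short sketch with illustrative polymer parameters, assuming a uniform dry layer of known bulk density:

```python
AVOGADRO = 6.022e23  # 1/mol

def grafting_density(dry_thickness_nm, density_g_cm3, mn_g_mol):
    """Grafting density in chains/nm^2 from the dry-thickness method:
    sigma = h * rho * N_A / Mn, with rho converted from g/cm^3 to g/nm^3.
    Assumes a uniform, fully collapsed dry layer of known bulk density."""
    rho_g_nm3 = density_g_cm3 * 1e-21   # 1 cm^3 = 1e21 nm^3
    return dry_thickness_nm * rho_g_nm3 * AVOGADRO / mn_g_mol

# Illustrative numbers: 10 nm dry polystyrene-like layer, rho = 1.05 g/cm^3,
# Mn = 50 kg/mol -> roughly 0.13 chains/nm^2, a typical brush-regime value
print(f"{grafting_density(10.0, 1.05, 5.0e4):.3f} chains/nm^2")
```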
Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment
NASA Technical Reports Server (NTRS)
Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.
1979-01-01
The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can predict many of their features; however, predictions from analytic models based on finite element computer analysis do not agree with the data in certain respects. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to crack opening displacement, crack volume, and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.
Advancing Usability Evaluation through Human Reliability Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronald L. Boring; David I. Gertman
2005-07-01
This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues.
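The UEP idea can be sketched as a SPAR-H-style multiplication of a nominal error probability by performance shaping factors derived from violated heuristics. The multiplier values and heuristic names below are invented placeholders, not calibrated SPAR-H factors:

```python
# Nominal human error probability for the task type (illustrative value)
NOMINAL_HEP = 0.001

# Hypothetical usability heuristics treated as performance shaping factors;
# each violation multiplies the nominal error probability.
PSF_MULTIPLIERS = {
    "poor_visibility_of_system_status": 5.0,
    "mismatch_with_real_world_conventions": 2.0,
    "no_undo_support": 10.0,
}

def usability_error_probability(violations):
    """SPAR-H-style UEP: nominal probability scaled by the PSF multiplier
    of each violated heuristic, capped at 1 since it is a probability."""
    uep = NOMINAL_HEP
    for v in violations:
        uep *= PSF_MULTIPLIERS[v]
    return min(uep, 1.0)

print(usability_error_probability(
    ["poor_visibility_of_system_status", "no_undo_support"]))  # 0.05
```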
Wagstaff, Jane L; Taylor, Samantha L; Howard, Mark J
2013-04-05
This review aims to illustrate that STD NMR is not simply a method for drug screening and discovery, but has qualitative and quantitative applications that can answer fundamental and applied biological and biomedical questions involving molecular interactions between ligands and proteins. We begin with a basic introduction to the technique of STD NMR and report on recent advances and biological applications of STD, including studies following the interactions of non-steroidal anti-inflammatories, probing the minimum binding requirements for virus infection, and understanding the inhibition of amyloid fibre formation. We expand on this introduction by reporting recent STD NMR studies of live-cell receptor systems, new methodologies using scanning STD, magic-angle spinning STD, and approaches to use STD NMR in a quantitative fashion for the determination of dissociation constants and group epitope mapping (GEM). We finish by outlining new approaches that have the potential to influence future applications of the technique: NMR isotope-editing, heteronuclear multidimensional STD, and (19)F STD methods that are becoming more amenable due to the latest NMR equipment technologies.
Quantitative NDE of Composite Structures at NASA
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.
2015-01-01
The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over long periods. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase sensitive ultrasonic methods, infrared thermography, and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large-area NDE for rapidly characterizing aerospace composites.
Yang, Guang-Fu; Huang, Xiaoqin
2006-01-01
Over forty years have elapsed since Hansch and Fujita published their pioneering work on quantitative structure-activity relationships (QSAR). Following the introduction of Comparative Molecular Field Analysis (CoMFA) by Cramer in 1988, other three-dimensional QSAR methods have been developed. Currently, the combination of classical QSAR with other computational techniques at the three-dimensional level is of greatest interest and is widely used in the process of modern drug discovery and design. During the last several decades, a number of different methodologies incorporating a range of molecular descriptors and different statistical regression approaches have been proposed and successfully applied in the development of new drugs; thus the QSAR method has proven indispensable not only for the reliable prediction of specific properties of new compounds, but also for helping to elucidate the possible molecular mechanisms of receptor-ligand interactions. Here, we review the recent developments in QSAR and their applications in rational drug design, focusing on the reasonable selection of novel molecular descriptors and the construction of predictive QSAR models with the help of advanced computational techniques.
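A minimal sketch of a classical Hansch-type QSAR, activity regressed on a few molecular descriptors by ordinary least squares; the descriptor values and activities are synthetic placeholders, and a real study would use validated descriptor sets and cross-validation.

```python
import numpy as np

# Rows: compounds; columns: descriptors (e.g. logP, molar refractivity, sigma).
X = np.array([
    [1.2, 25.3, 0.10],
    [2.1, 30.1, -0.05],
    [0.8, 22.7, 0.21],
    [3.0, 35.9, -0.12],
    [1.7, 28.4, 0.03],
])
y = np.array([5.1, 6.0, 4.7, 6.8, 5.6])        # e.g. pIC50 values (synthetic)

A = np.hstack([X, np.ones((X.shape[0], 1))])   # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares regression
pred = A @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 3), "r^2 =", round(r2, 3))
```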
Trends in fluorescence imaging and related techniques to unravel biological information.
Haustein, Elke; Schwille, Petra
2007-09-01
Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting edge research, as the visual monitoring of life processes still belongs to the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snap-shot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics.
Adachi, Tetsuya; Pezzotti, Giuseppe; Yamamoto, Toshiro; Ichioka, Hiroaki; Boffelli, Marco; Zhu, Wenliang; Kanamura, Narisato
2015-05-01
A systematic investigation, based on highly spectrally resolved Raman spectroscopy, was undertaken to investigate the efficacy of vibrational assessments in locating chemical and crystallographic fingerprints for the characterization of dental caries and the early detection of non-cavitated carious lesions. Raman results published by other authors have indicated possible approaches for this method. However, they conspicuously lacked physical insight at the molecular scale and, thus, the rigor necessary to prove the efficacy of this spectroscopic method. After solving basic physical challenges in a companion paper, we apply those solutions here in the form of newly developed Raman algorithms for practical dental research. Relevant differences in mineral crystallite (average) orientation and texture distribution were revealed for diseased enamel at different stages compared with healthy mineralized enamel. Clear spectroscopic features could be directly translated into a rigorous and quantitative classification of the crystallographic and chemical characteristics of diseased enamel structures. The Raman procedure enabled us to trace back otherwise invisible characteristics in early caries, in the translucent zone (i.e., the advancing front of the disease), and in the body of the lesion of cavitated caries.
Quantitative Laughter Detection, Measurement, and Classification-A Critical Survey.
Cosentino, Sarah; Sessa, Salvatore; Takanishi, Atsuo
2016-01-01
The study of human nonverbal social behaviors has taken a more quantitative and computational approach in recent years due to the development of smart interfaces and virtual agents or robots able to interact socially. One of the most interesting nonverbal social behaviors, producing a characteristic vocal signal, is laughing. Laughter is produced in several different situations: in response to external physical, cognitive, or emotional stimuli; to negotiate social interactions; and also, pathologically, as a consequence of neural damage. For this reason, laughter has attracted researchers from many disciplines. A consequence of this multidisciplinarity is the absence of a holistic vision of this complex behavior: the methods of analysis and classification of laughter, as well as the terminology used, are heterogeneous, and the findings are sometimes contradictory and poorly documented. This survey aims at collecting and presenting objective measurement methods and results from a variety of studies in different fields, to contribute to building a unified model and taxonomy of laughter. This could be successfully used for advances in several fields, from artificial intelligence and human-robot interaction to medicine and psychiatry.
Cunefare, David; Cooper, Robert F; Higgins, Brian; Katz, David F; Dubra, Alfredo; Carroll, Joseph; Farsiu, Sina
2016-05-01
Quantitative analysis of the cone photoreceptor mosaic in the living retina is potentially useful for early diagnosis and prognosis of many ocular diseases. Non-confocal split detector based adaptive optics scanning light ophthalmoscope (AOSLO) imaging reveals the cone photoreceptor inner segment mosaics often not visualized on confocal AOSLO imaging. Despite recent advances in automated cone segmentation algorithms for confocal AOSLO imagery, quantitative analysis of split detector AOSLO images is currently a time-consuming manual process. In this paper, we present the fully automatic adaptive filtering and local detection (AFLD) method for detecting cones in split detector AOSLO images. We validated our algorithm on 80 images from 10 subjects, showing an overall mean Dice's coefficient of 0.95 (standard deviation 0.03), when comparing our AFLD algorithm to an expert grader. This is comparable to the inter-observer Dice's coefficient of 0.94 (standard deviation 0.04). To the best of our knowledge, this is the first validated, fully-automated segmentation method which has been applied to split detector AOSLO images.
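The validation metric reported above, Dice's coefficient, reduces to a one-line set comparison; a sketch on synthetic binary detection masks standing in for the algorithm's and the grader's cone maps.

```python
import numpy as np

def dice(a: np.ndarray, b: np.ndarray) -> float:
    """Dice's coefficient 2|A intersect B| / (|A| + |B|) for binary masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

algo   = np.array([[0, 1, 1], [0, 1, 0], [1, 0, 0]])   # algorithm detections
grader = np.array([[0, 1, 1], [1, 1, 0], [1, 0, 0]])   # expert grader marks
print(f"Dice = {dice(algo, grader):.2f}")              # 8/9 = 0.89 here
```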
The costs of introducing new technologies into space systems
NASA Technical Reports Server (NTRS)
Dodson, E. N.; Partma, H.; Ruhland, W.
1992-01-01
A review is conducted of cost-research studies intended to provide guidelines for cost estimates of integrating new technologies into existing satellite systems. Quantitative methods are described for determining the technological state-of-the-art so that proposed programs can be evaluated accurately in terms of their contribution to technological development. The R&D costs associated with the proposed programs are then assessed with attention given to the technological advances. Also incorporated quantifiably are any reductions in the costs of production, operations, and support afforded by the advanced technologies. The proposed model is employed in relation to a satellite sizing and cost study in which a tradeoff between increased R&D costs and reduced production costs is examined. The technology/cost model provides a consistent yardstick for assessing the true relative economic impact of introducing novel techniques and technologies.
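A toy version of the R&D-versus-production tradeoff such a technology/cost model weighs, with entirely hypothetical cost figures.

```python
# Hypothetical lifecycle-cost comparison for inserting a new technology.
units = 8                       # satellites to be produced

baseline = {"rdte": 120.0, "unit_cost": 45.0, "ops_per_unit": 10.0}   # $M
advanced = {"rdte": 180.0, "unit_cost": 32.0, "ops_per_unit": 7.0}    # $M

def lifecycle(c, n):
    """Total cost: R&D plus per-unit production, operations, and support."""
    return c["rdte"] + n * (c["unit_cost"] + c["ops_per_unit"])

delta = lifecycle(baseline, units) - lifecycle(advanced, units)
print(f"net saving from the advanced technology: ${delta:.0f}M over {units} units")
```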
Computational Medicine: Translating Models to Clinical Care
Winslow, Raimond L.; Trayanova, Natalia; Geman, Donald; Miller, Michael I.
2013-01-01
Because of the inherent complexity of coupled nonlinear biological systems, the development of computational models is necessary for achieving a quantitative understanding of their structure and function in health and disease. Statistical learning is applied to high-dimensional biomolecular data to create models that describe relationships between molecules and networks. Multiscale modeling links networks to cells, organs, and organ systems. Computational approaches are used to characterize anatomic shape and its variations in health and disease. In each case, the purposes of modeling are to capture all that we know about disease and to develop improved therapies tailored to the needs of individuals. We discuss advances in computational medicine, with specific examples in the fields of cancer, diabetes, cardiology, and neurology. Advances in translating these computational methods to the clinic are described, as well as challenges in applying models for improving patient health. PMID:23115356
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
34 CFR 668.145 - Test approval procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... appropriate advanced degrees and experience in test development or psychometric research, to determine whether... contain multiple sub-tests measuring content domains other than verbal and quantitative domains, the Secretary reviews only those sub-tests covering the verbal and quantitative domains. (b)(1) If the Secretary...
Strains on the nano- and microscale in nickel-titanium: An advanced TEM study
NASA Astrophysics Data System (ADS)
Tirry, Wim
2007-12-01
A general introduction to shape memory behavior and the martensitic transformation is given in chapter 1, with specific information concerning the NiTi material. The technique used to study the material is transmission electron microscopy (TEM), the basics of which are explained in chapter 2. The main goal was to apply more advanced TEM techniques in order to measure some aspects in a quantitative way rather than the qualitative way that is mostly the case in conventional TEM. (1) Quantitative electron diffraction was used to refine the structure of Ni4Ti3 precipitates; this was done by using the MSLS method in combination with density functional theory (DFT) calculations. (2) These Ni4Ti3 precipitates are (semi-)coherent, which results in a strain field in the matrix close to the precipitate. High resolution TEM (HRTEM) in combination with image processing techniques was used to measure these strain fields. The obtained results are compared to the Eshelby model for elliptical inclusions; the major difference is an underestimation of the strain magnitude by the model. One of the algorithms used to extract strain information from HRTEM images is the geometric phase method. (3) The Ni4Ti3-Ni4Ti3 and precipitate-matrix interfaces were investigated with HRTEM, showing that the precipitate-matrix interface might be diffuse over a range of 3 nm. (4) In-situ straining experiments were performed on single crystalline and superelastic polycrystalline NiTi samples. It appears that the strain-induced martensite plates in the polycrystalline sample show no sign of twinning. This contradicts what is expected and is discussed in light of the crystallographic theory of martensite; in addition, a first model explaining this behavior is proposed. In this dissertation the main attention is divided between the material aspects of NiTi and how to apply these more advanced TEM techniques.
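A one-dimensional sketch of the geometric phase method named in (2): the local phase of a lattice fringe is isolated by Fourier filtering around a Bragg peak, and its gradient gives the local strain. Synthetic fringes with a known 1% strained region are used so the recovered value can be checked; real geometric phase analysis operates on 2D HRTEM images.

```python
import numpy as np

# Synthetic HRTEM-like fringes: reference period 10 px, 1% strain in the middle.
n = 1024
g = 2 * np.pi / 10.0                    # reference reciprocal vector (rad/px)
x = np.arange(n)
strain = np.where((x > 400) & (x < 600), 0.01, 0.0)
u = np.cumsum(strain)                   # displacement field u(x) = integral of strain
signal = np.cos(g * (x + u))            # fringes carrying the strain as a phase

# Geometric phase step: isolate the +g Bragg peak, take the residual phase.
F = np.fft.fft(signal)
k = 2 * np.pi * np.fft.fftfreq(n, d=1.0)
mask = np.abs(k - g) < 0.2 * g          # window around the +g peak only
filtered = np.fft.ifft(F * mask)
phi = np.unwrap(np.angle(filtered * np.exp(-1j * g * x)))   # phi = g * u(x)
strain_est = np.gradient(phi) / g       # epsilon = (1/g) * dphi/dx

print("recovered strain in the strained region: %.4f"
      % strain_est[450:550].mean())     # should be close to 0.01
```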
Image Analysis of DNA Fiber and Nucleus in Plants.
Ohmido, Nobuko; Wako, Toshiyuki; Kato, Seiji; Fukui, Kiichi
2016-01-01
Advances in cytology have led to the application of a wide range of visualization methods in plant genome studies. Image analysis methods are indispensable tools where morphology, density, and color play important roles in biological systems. Visualization and image analysis methods are useful techniques in the analyses of the detailed structure and function of extended DNA fibers (EDFs) and interphase nuclei. The EDF has the highest spatial resolving power for revealing genome structure, and it can be used for physical mapping, especially for closely located genes and tandemly repeated sequences. On the other hand, analyzing nuclear DNA and proteins can reveal nuclear structure and functions. In this chapter, we describe the image analysis protocol for quantitatively analyzing two types of plant genome material: EDFs and interphase nuclei.
Ju, Hyunjin; Lee, Deuck Hang; Cho, Hae-Chang; Kim, Kang Su; Yoon, Seyoon; Seo, Soo-Yeon
2014-01-01
In this study, hydrophilic chemical grout using silanol (HCGS) was adopted to overcome the performance limitations of epoxy materials used for strengthening existing buildings and civil engineering structures. The enhanced material performances of HCGS were introduced, and applied to the section enlargement method, which is one of the typical structural strengthening methods used in practice. To evaluate the excellent structural strengthening performance of the HCGS, structural tests were conducted on reinforced concrete beams, and analyses on the flexural behaviors of test specimens were performed by modified partial interaction theory (PIT). In particular, to improve the constructability of the section enlargement method, an advanced strengthening method was proposed, in which the precast panel was directly attached to the bottom of the damaged structural member by HCGS, and the degree of connection of the test specimens, strengthened by the section enlargement method, were quantitatively evaluated by PIT-based analysis. PMID:28788708
New High Throughput Methods to Estimate Chemical ...
EPA has made many recent advances in high throughput bioactivity testing. However, concurrent advances in rapid, quantitative prediction of human and ecological exposures have been lacking, despite the clear importance of both measures for a risk-based approach to prioritizing and screening chemicals. A recent report by the National Research Council of the National Academies, Exposure Science in the 21st Century: A Vision and a Strategy (NRC 2012) laid out a number of applications in chemical evaluation of both toxicity and risk in critical need of quantitative exposure predictions, including screening and prioritization of chemicals for targeted toxicity testing, focused exposure assessments or monitoring studies, and quantification of population vulnerability. Despite these significant needs, for the majority of chemicals (e.g. non-pesticide environmental compounds) there are no or limited estimates of exposure. For example, exposure estimates exist for only 7% of the ToxCast Phase II chemical list. In addition, the data required for generating exposure estimates for large numbers of chemicals is severely lacking (Egeghy et al. 2012). This SAP reviewed the use of EPA's ExpoCast model to rapidly estimate potential chemical exposures for prioritization and screening purposes. The focus was on bounded chemical exposure values for people and the environment for the Endocrine Disruptor Screening Program (EDSP) Universe of Chemicals. In addition to exposure, the SAP
Aligned fibers direct collective cell migration to engineer closing and nonclosing wound gaps
Sharma, Puja; Ng, Colin; Jana, Aniket; Padhi, Abinash; Szymanski, Paige; Lee, Jerry S. H.; Behkam, Bahareh; Nain, Amrinder S.
2017-01-01
Cell emergence onto damaged or organized fibrous extracellular matrix (ECM) is a crucial precursor to collective cell migration in wound closure and cancer metastasis, respectively. However, there is a fundamental gap in our quantitative understanding of the role of local ECM size and arrangement in cell emergence–based migration and local gap closure. Here, using ECM-mimicking nanofibers bridging cell monolayers, we describe a method to recapitulate and quantitatively describe these in vivo behaviors over multispatial (single cell to cell sheets) and temporal (minutes to weeks) scales. On fiber arrays with large interfiber spacing, cells emerge (invade) either singularly by breaking cell–cell junctions analogous to release of a stretched rubber band (recoil), or in groups of a few cells (chains), whereas on closely spaced fibers, multiple chains emerge collectively. Advancing cells on fibers form cell streams, which support suspended cell sheets (SCS) of various sizes and curvatures. SCS converge to form local gaps that close based on both the gap size and shape. We document that cell stream spacing of 375 µm and larger hinders SCS advancement, thus providing the ability to engineer closing and nonclosing gaps. Altogether, we highlight the importance of studying cell-fiber interactions and matrix structural remodeling in fundamental and translational cell biology. PMID:28747440
Advances in In Vitro and In Silico Tools for Toxicokinetic Dose ...
Recent advances in in vitro assays, in silico tools, and systems biology approaches provide opportunities for a refined mechanistic understanding for chemical safety assessment that will ultimately lead to reduced reliance on animal-based methods. With the U.S. commercial chemical landscape encompassing thousands of chemicals with limited data, safety assessment strategies that reliably and efficiently predict in vivo systemic exposures and subsequent in vivo effects are a priority. Quantitative in vitro-in vivo extrapolation (QIVIVE) is a methodology that facilitates the explicit and quantitative application of in vitro experimental data and in silico modeling to predict in vivo system behaviors, and it can be applied to predict chemical toxicokinetics, toxicodynamics, and also population variability. Tiered strategies that incorporate sufficient information to reliably inform the relevant decision context will facilitate acceptance of these alternative data streams for safety assessments. This abstract does not necessarily reflect U.S. EPA policy. This talk will provide an update to an international audience on the state of science being conducted within the EPA’s Office of Research and Development to develop and refine approaches that estimate internal chemical concentrations following a given exposure, known as toxicokinetics. Toxicokinetic approaches hold great potential in their ability to link in vitro activities or toxicities identified during high-throughput screen
NASA Astrophysics Data System (ADS)
Takanabe, Akifumi; Tanaka, Masahito; Taniguchi, Atsuo; Yamanaka, Hisashi; Asahi, Toru
2014-07-01
To improve our ability to identify single crystals causing arthritis, we have developed a practical measurement system for polarized light microscopy called advanced compensated polarized light microscopy (A-CPLM). The A-CPLM system is constructed by employing a conventional phase retardation plate, an optical fibre, and a charge-coupled device spectrometer in a polarized light microscope. We applied the A-CPLM system to measure linear birefringence (LB) in the visible region, an optical anisotropic property, for tiny single crystals causing arthritis, i.e., monosodium urate monohydrate (MSUM) and calcium pyrophosphate dihydrate (CPPD). The A-CPLM system performance was evaluated by comparing the experimental data obtained using the A-CPLM system with (i) literature data for a standard sample, MgF2, and (ii) experimental data obtained using an established optical method, the high-accuracy universal polarimeter, for MSUM. The A-CPLM system was found to be applicable for measuring the LB spectra of the single crystals of MSUM and CPPD, which cause arthritis, in the visible region. We quantitatively reveal the large difference in LB between MSUM and CPPD crystals. These results demonstrate the usefulness of the A-CPLM system for distinguishing the crystals causing arthritis.
DAWN (Design Assistant Workstation) for advanced physical-chemical life support systems
NASA Technical Reports Server (NTRS)
Rudokas, Mary R.; Cantwell, Elizabeth R.; Robinson, Peter I.; Shenk, Timothy W.
1989-01-01
This paper reports the results of a project supported by the National Aeronautics and Space Administration, Office of Aeronautics and Space Technology (NASA-OAST) under the Advanced Life Support Development Program. It is an initial attempt to integrate artificial intelligence techniques (via expert systems) with conventional quantitative modeling tools for advanced physical-chemical life support systems. The addition of artificial intelligence techniques will assist the designer in the definition and simulation of loosely/well-defined life support processes/problems as well as assist in the capture of design knowledge, both quantitative and qualitative. Expert system and conventional modeling tools are integrated to provide a design workstation that assists the engineer/scientist in creating, evaluating, documenting and optimizing physical-chemical life support systems for short-term and extended duration missions.
NASA Astrophysics Data System (ADS)
Bramwell-Lalor, Sharon; Rainford, Marcia
2014-03-01
This paper reports on teachers' use of concept mapping as an alternative assessment strategy in advanced level biology classes and its effects on students' cognitive skills on selected biology concepts. Using a mixed methods approach, the study employed a pre-test/post-test quasi-experimental design involving 156 students and 8 teachers from intact classes. A researcher-constructed Biology Cognitive Skills Test was used to collect the quantitative data. Qualitative data were collected through interviews and students' personal documents. The data showed that the participants utilized concept mapping in various ways and they described positive experiences while being engaged in its use. The main challenge cited by teachers was the limited time available for more consistent use. The results showed that the use of concept mapping in advanced level biology can lead to learning gains that exceed those achieved in classes where mainly traditional methods are used. The students in the concept mapping experimental groups performed significantly better than their peers in the control group on both the lower-order (F(1) = 21.508; p < .001) and higher-order (F(1) = 42.842, p < .001) cognitive items of the biology test. A mean effect size of .56 was calculated representing the contribution of treatment to the students' performance on the test items.
Koga, Katia Hiromoto; Moriguchi, Sonia Marta; Neto, Jorge Nahás; Peres, Stela Verzinhasse; Silva, Eduardo Tinóis da; Sarri, Almir José; Michelin, Odair Carlito; Marques, Mariangela Esther Alencar; Griva, Beatriz Lotufo
2010-01-01
To evaluate the tumor response to neoadjuvant chemotherapy, 99mTc-sestamibi breast scintigraphy was proposed as a quantitative method. Fifty-five patients with ductal carcinoma were studied. They underwent breast scintigraphy before and after neoadjuvant chemotherapy, along with clinical assessment and surgical specimen analysis. The regions of interest on the lesion and contralateral breast were identified, and the pixel counts were used to evaluate lesion uptake in relation to background radiation. The ratio of these counts before to after neoadjuvant chemotherapy was assessed. The decrease in uptake rate due to chemotherapy characterized the scintigraphy tumor response. The Kruskal-Wallis test was used to compare the mean scintigraphic tumor response and histological type. Dunn’s multiple comparison test was used to detect differences between histological types. The Mann-Whitney test was used to compare means between quantitative and qualitative variables: scintigraphic tumor response vs. clinical response and uptake before chemotherapy vs. scintigraphic tumor response. The Spearman’s test was used to correlate the quantitative variables of clinical reduction in tumor size and scintigraphic tumor response. All of the variables compared presented significant differences. The change in 99mTc-sestamibi uptake noted on breast scintigraphy, before to after neoadjuvant chemotherapy, may be used as an effective method for evaluating the response to neoadjuvant chemotherapy, since this quantification reflects the biological behavior of the tumor towards the chemotherapy regimen. Furthermore, additional analysis on the uptake rate before chemotherapy may accurately predict treatment response. PMID:22966312
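The quantitation described reduces to ratios of ROI pixel counts; a sketch with invented counts.

```python
# ROI pixel counts (invented): lesion vs. contralateral-breast background.
before = {"lesion": 5200, "background": 1300}
after  = {"lesion": 2100, "background": 1250}

uptake_before = before["lesion"] / before["background"]   # 4.00
uptake_after  = after["lesion"] / after["background"]     # 1.68

# Scintigraphic tumor response: percentage decrease in the uptake ratio.
response = 100 * (1 - uptake_after / uptake_before)
print(f"scintigraphic tumor response: {response:.1f}% decrease in uptake ratio")
```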
Wu, Shulian; Huang, Yudian; Tang, Qinggong; Li, Zhifang; Horng, Hannah; Li, Jiatian; Wu, Zaihua; Chen, Yu; Li, Hui
2018-03-01
Preoperative neoadjuvant treatment in locally advanced breast cancer is recognized as an effective adjuvant therapy, as it improves treatment outcomes. However, the potential complications remain a threat, so there is an urgent clinical need to assess both the tumor response and changes in its microenvironment using non-invasive and precise identification techniques. Here, two-photon microscopy was employed to detect morphological alterations in breast cancer progression and regression throughout chemotherapy. The changes in structure were analyzed based on the autofluorescence and collagen of differing statuses. Parameters including the optical redox ratio, the ratio of the second harmonic generation and autofluorescence signals, collagen density, and collagen shape orientation were studied. Results indicate that these parameters are potential indicators for evaluating breast tumors and their microenvironment changes during progression and chemotherapy. Combined analyses of these parameters could provide a quantitative, novel method for monitoring tumor therapy.
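Two of the listed parameters are per-pixel ratios; a sketch with synthetic channel images. The FAD/(FAD + NADH) redox definition and the channel assignments are common conventions assumed here, not necessarily the exact definitions used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
nadh = rng.uniform(50, 200, size=(64, 64))   # NADH autofluorescence channel
fad  = rng.uniform(30, 150, size=(64, 64))   # FAD autofluorescence channel
shg  = rng.uniform(10, 120, size=(64, 64))   # second harmonic (collagen) channel

redox = fad / (fad + nadh)                   # optical redox ratio per pixel
shg_af = shg / (fad + nadh)                  # SHG-to-autofluorescence ratio

print(f"mean redox ratio: {redox.mean():.3f}, mean SHG/AF ratio: {shg_af.mean():.3f}")
```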
Pellett, Sabine; Du, Zhong-wei; Pier, Christina L; Tepp, William H; Zhang, Su-chun; Johnson, Eric A
2011-01-07
Botulinum neurotoxins (BoNTs), the most poisonous protein toxins known, represent a serious bioterrorism threat but are also used as a unique and important bio-pharmaceutical to treat an increasing myriad of neurological disorders. The only currently accepted detection method by the United States Food and Drug Administration for biological activity of BoNTs and for potency determination of pharmaceutical preparations is the mouse bioassay (MBA). Recent advances have indicated that cell-based assays using primary neuronal cells can provide an equally sensitive and robust detection platform as the MBA to reliably and quantitatively detect biologically active BoNTs. This study reports for the first time a BoNT detection assay using mouse embryonic stem cells to produce a neuronal cell culture. The data presented indicate that this assay can reliably detect BoNT/A with a similar sensitivity as the MBA. Published by Elsevier Inc.
Microscopy and microanalysis 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bailey, G.W.; Corbett, J.M.; Dimlich, R.V.W.
1996-12-31
The Proceedings of this Annual Meeting contain papers from members of the three societies. These proceedings emphasize the common research interests and attempt to eliminate some unwanted overlap. Topics covered are: microscopic analysis of animals with altered gene expression and in-situ gene and antibody localizations, high-resolution elemental mapping of nucleoprotein interactions, plant biology and pathology, quantitative HREM analysis of perfect and defected materials, computational methods for TEM image analysis, high-resolution FESM in materials research, frontiers in polymer microscopy and microanalysis, oxidation and corrosion, micro XRD and XRF, molecular microspectroscopy and spectral imaging, advances in confocal and multidimensional light microscopy, analytical electron microscopy in biology, correlative microscopy in biological sciences, grain-boundary microengineering, surfaces and interfaces, telepresence microscopy in education and research, MSA educational outreach, quantitative electron probe microanalysis, frontiers of analytical electron microscopy, critical issues in ceramic microstructures, dynamic organization of the cell, pathology, microbiology, high-resolution biological and cryo SEM, and scanning-probe microscopy.
Principles of Metamorphic Petrology
NASA Astrophysics Data System (ADS)
Williams, Michael L.
2009-05-01
The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.
Kennard, Raymond; DeSisto, William J; Giririjan, Thanu Praba; Mason, Michael D
2008-04-07
Mesoporous silica membranes fabricated by the surfactant-templated sol-gel process have received attention because of the potential to prepare membranes with a narrow pore size distribution and ordering of the interconnected pores. Potential applications include ultrafiltration, biological separations and drug delivery, and separators in lithium-ion batteries. Despite advancements in the synthesis and characterization of these membranes, a quantitative description of the membrane microstructure remains a challenge. Currently the membrane microstructure is characterized by combining results from several techniques, i.e., gas permeance testing, x-ray diffraction, scanning electron microscopy, transmission electron microscopy, and permporometry. The results from these ensemble methods are then compiled and the data fitted to a particular flow model. Although these methods are very effective in determining membrane performance, general pore size distribution, and defect concentration, they are unable to monitor molecular paths through the membrane and quantitatively measure molecular interactions between the molecular species and the pore network. Single-molecule imaging techniques enable optical measurements that probe materials on nanometer length scales through observation of individual molecules without the influence of averaging. Using single-molecule imaging spectroscopy, we can quantitatively characterize the interaction between the probe molecule and the interior of the pore within mesoporous silica membranes. This approach is radically different from typical membrane characterization methods in that it has the potential to spatially sample the underlying pore structure distribution, the surface energy, and the transport properties. Our hope is that this new fundamental knowledge can be quantitatively linked to both the preparation and the performance of membranes, leading to the advancement of membrane science and technology. Fluorescent molecules, 1,1'-dioctadecyl-3,3,3',3'-tetramethylindocarbocyanine perchlorate, used to interrogate the available free volume in their vicinity, were loaded into the mesoporous silica membranes at subnanomolar concentrations. The mesoporous silica films were prepared using a nonionic ethylene oxide-propylene oxide-ethylene oxide triblock copolymer surfactant, Pluronic P123, on single crystal silicon substrates using dip coating of a silica sol. Membranes were prepared resulting in an average pore diameter of approximately 5 nm as measured by helium and nitrogen permeance and porosimetry. Fluorescent images and time transient experiments were recorded using a custom built single-molecule scanning confocal microscope at differing temperatures (10, 20, 30, 40, and 50 degrees C). Time-dependent polarization anisotropy was used to obtain the enthalpy of adsorption and Henry's law constant of the probe molecule.
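Two of the extracted quantities reduce to short calculations: the steady-state polarization anisotropy, and the adsorption enthalpy from a van't Hoff fit of the temperature-dependent equilibrium constant. The intensities and constants below are invented for illustration.

```python
import numpy as np

def anisotropy(i_par, i_perp, g=1.0):
    """Fluorescence anisotropy r = (I_par - G*I_perp) / (I_par + 2*G*I_perp)."""
    return (i_par - g * i_perp) / (i_par + 2 * g * i_perp)

print(f"r = {anisotropy(1200.0, 800.0):.3f}")

# Van't Hoff analysis: ln K vs 1/T is linear with slope -dH/R.
T = np.array([283.0, 293.0, 303.0, 313.0, 323.0])   # K (10-50 degrees C)
K = np.array([8.1, 5.9, 4.5, 3.5, 2.8])             # invented Henry's constants

slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
R = 8.314  # J/(mol*K)
print(f"adsorption enthalpy dH = {-slope * R / 1000:.1f} kJ/mol")  # negative: exothermic
```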
Investigation of iterative image reconstruction in three-dimensional optoacoustic tomography
Wang, Kun; Su, Richard; Oraevsky, Alexander A; Anastasio, Mark A
2012-01-01
Iterative image reconstruction algorithms for optoacoustic tomography (OAT), also known as photoacoustic tomography, have the ability to improve image quality over analytic algorithms due to their ability to incorporate accurate models of the imaging physics, instrument response, and measurement noise. However, to date, there have been few reported attempts to employ advanced iterative image reconstruction algorithms for improving image quality in three-dimensional (3D) OAT. In this work, we implement and investigate two iterative image reconstruction methods for use with a 3D OAT small animal imager: namely, a penalized least-squares (PLS) method employing a quadratic smoothness penalty and a PLS method employing a total variation norm penalty. The reconstruction algorithms employ accurate models of the ultrasonic transducer impulse responses. Experimental data sets are employed to compare the performances of the iterative reconstruction algorithms to that of a 3D filtered backprojection (FBP) algorithm. By use of quantitative measures of image quality, we demonstrate that the iterative reconstruction algorithms can mitigate image artifacts and preserve spatial resolution more effectively than FBP algorithms. These features suggest that the use of advanced image reconstruction algorithms can improve the effectiveness of 3D OAT while reducing the amount of data required for biomedical applications. PMID:22864062
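A toy sketch of the penalized least-squares formulation with a quadratic smoothness penalty, minimizing ||Hf - g||^2 + beta*||Df||^2 by gradient descent; the small random system matrix stands in for the imaging and transducer model, so this illustrates the optimization structure only, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 64, 48                       # image size, number of measurements
H = rng.normal(size=(m, n))         # toy imaging operator (stands in for the
                                    # acoustic/transducer forward model)
f_true = np.zeros(n)
f_true[20:40] = 1.0
g = H @ f_true + 0.05 * rng.normal(size=m)   # noisy "measurements"

D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]     # first-difference operator
beta, step = 1.0, 1e-3                       # penalty weight, descent step

f = np.zeros(n)
for _ in range(2000):
    # gradient of ||Hf - g||^2/2 + beta*||Df||^2/2
    grad = H.T @ (H @ f - g) + beta * (D.T @ (D @ f))
    f -= step * grad

print("relative reconstruction error: %.3f"
      % (np.linalg.norm(f - f_true) / np.linalg.norm(f_true)))
```

Swapping the quadratic penalty for a total variation norm gives the second method the abstract mentions, at the cost of a nonsmooth optimization problem.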
Breeding and quantitative genetics advances in sunflower Sclerotinia research
USDA-ARS?s Scientific Manuscript database
Genetic research of the sunflower research unit, USDA-ARS, in Fargo, ND, was discussed in a presentation to a group of producers, industry representatives, and scientists. The need for sunflower quantitative genetics research to find and capture Sclerotinia resistance is increasing with every year t...
Engineering and physical sciences in oncology: challenges and opportunities.
Mitchell, Michael J; Jain, Rakesh K; Langer, Robert
2017-11-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas.
Alves, Julio Cesar Laurentino; Poppi, Ronei Jesus
2013-11-07
Highly polluting fuels based on non-renewable resources such as fossil fuels need to be replaced with potentially less polluting renewable fuels derived from vegetable or animal biomass. These so-called biofuels are a reality nowadays, and many countries have taken up the challenge of increasing the use of different types of biofuels, such as ethanol and biodiesel (fatty acid alkyl esters), often mixed with petroleum derivatives such as gasoline and diesel, respectively. The quantitative determination of these fuel blends using simple, fast and low cost methods based on near infrared (NIR) spectroscopy combined with chemometric methods has been reported. However, advanced biofuels based on a mixture of hydrocarbons or a single hydrocarbon molecule, such as farnesane (2,6,10-trimethyldodecane), a hydrocarbon renewable diesel, can also be used in mixtures with biodiesel and petroleum diesel fuel, and the use of NIR spectroscopy for the quantitative determination of a ternary fuel blend of these two hydrocarbon-based fuels and biodiesel can be a useful tool for quality control. This work presents the development of an analytical method for the quantitative determination of hydrocarbon renewable diesel (farnesane), biodiesel and petroleum diesel fuel blends using NIR spectroscopy combined with chemometric methods, such as partial least squares (PLS) and support vector machines (SVM). This development leads to a more accurate, simpler, faster and cheaper method when compared to the standard reference method ASTM D6866, with the main advantage of providing the individual quantification of two different biofuels in a mixture with petroleum diesel fuel. Using the developed PLS model, the three fuel blend components were determined simultaneously, with root mean square errors of prediction (RMSEP) of 0.25%, 0.19% and 0.38% for hydrocarbon renewable diesel, biodiesel and petroleum diesel, respectively; the values obtained were in agreement with those suggested by reference methods for the determination of renewable fuels.
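A minimal sketch of the chemometric core, PLS regression from NIR spectra to blend composition with RMSEP computed on held-out samples; the spectra are synthetic linear mixtures, whereas real models are built from measured spectra of gravimetrically prepared blends.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(2)
n_samples, n_wavelengths = 60, 200

# Synthetic 3-component blends (fractions sum to 1) and linear-mixture spectra.
comp = rng.dirichlet(np.ones(3), size=n_samples)       # farnesane/biodiesel/diesel
pure = rng.uniform(0.2, 1.0, size=(3, n_wavelengths))  # pure-component "spectra"
X = comp @ pure + 0.01 * rng.normal(size=(n_samples, n_wavelengths))

train, test = slice(0, 45), slice(45, 60)
pls = PLSRegression(n_components=3).fit(X[train], comp[train])
pred = pls.predict(X[test])

rmsep = np.sqrt(np.mean((pred - comp[test]) ** 2, axis=0)) * 100
print("RMSEP (% abs) per component:", np.round(rmsep, 2))
```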
2012-01-01
Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths, with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind. PMID:22369037
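A sketch of the clustering step described in the Results: phenotype time series compared with a correlation-based distance and grouped by hierarchical clustering. The series are synthetic stand-ins for the motion- and appearance-based phenotype trajectories, and the distance and linkage choices are illustrative, not necessarily those of the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 50)

# Synthetic phenotype time series: two response archetypes plus noise.
decay = np.exp(-3 * t)                    # motility collapses under the drug
flat = np.ones_like(t)                    # no phenotypic response
series = np.vstack([decay + 0.05 * rng.normal(size=50) for _ in range(5)] +
                   [flat + 0.05 * rng.normal(size=50) for _ in range(5)])

dist = pdist(series, metric="correlation")        # 1 - Pearson correlation
Z = linkage(dist, method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
print("cluster assignments:", labels)             # two clean groups expected
```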
NASA Astrophysics Data System (ADS)
Goddard, Braden
The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges to overcome, the largest being the need for high-precision active and passive measurements that produce results with acceptably small uncertainties.
Saitou, Takashi; Imamura, Takeshi
2016-01-01
Cell cycle progression is strictly coordinated to ensure proper tissue growth, development, and regeneration of multicellular organisms. Spatiotemporal visualization of cell cycle phases directly helps us to obtain a deeper understanding of controlled, multicellular, cell cycle progression. The fluorescent ubiquitination-based cell cycle indicator (Fucci) system allows us to monitor, in living cells, the G1 and the S/G2/M phases of the cell cycle in red and green fluorescent colors, respectively. Since the discovery of Fucci technology, it has found numerous applications in the characterization of the timing of cell cycle phase transitions under diverse conditions and various biological processes. However, due to the complexity of cell cycle dynamics, understanding of specific patterns of cell cycle progression is still far from complete. In order to tackle this issue, quantitative approaches combined with mathematical modeling seem to be essential. Here, we review several studies that attempted to integrate Fucci technology and mathematical models to obtain quantitative information regarding cell cycle regulatory patterns. Focusing on the technological development of utilizing mathematics to retrieve meaningful information from the Fucci producing data, we discuss how the combined methods advance a quantitative understanding of cell cycle regulation. © 2015 Japanese Society of Developmental Biologists.
Zimmerman, Stefan L; Kim, Woojin; Boonn, William W
2011-01-01
Quantitative and descriptive imaging data are a vital component of the radiology report and are frequently of paramount importance to the ordering physician. Unfortunately, current methods of recording these data in the report are both inefficient and error prone. In addition, the free-text, unstructured format of a radiology report makes aggregate analysis of data from multiple reports difficult or even impossible without manual intervention. A structured reporting work flow has been developed that allows quantitative data created at an advanced imaging workstation to be seamlessly integrated into the radiology report with minimal radiologist intervention. As an intermediary step between the workstation and the reporting software, quantitative and descriptive data are converted into an extensible markup language (XML) file in a standardized format specified by the Annotation and Image Markup (AIM) project of the National Institutes of Health Cancer Biomedical Informatics Grid. The AIM standard was created to allow image annotation data to be stored in a uniform machine-readable format. These XML files containing imaging data can also be stored on a local database for data mining and analysis. This structured work flow solution has the potential to improve radiologist efficiency, reduce errors, and facilitate storage of quantitative and descriptive imaging data for research. Copyright © RSNA, 2011.
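A sketch of the serialization idea, writing a quantitative measurement to XML so it can be parsed by reporting software or mined later; the element and attribute names here are illustrative only and do not follow the actual AIM schema.

```python
import xml.etree.ElementTree as ET

# Illustrative structure only -- the real AIM schema defines its own
# element names, namespaces, and controlled terminology.
annotation = ET.Element("ImageAnnotation", uid="1.2.3.4")
measurement = ET.SubElement(annotation, "Measurement")
ET.SubElement(measurement, "Label").text = "Liver lesion, segment VII"
ET.SubElement(measurement, "Value", unit="mm").text = "23.4"
ET.SubElement(measurement, "SeriesInstanceUID").text = "1.2.840.99999.1"

xml_bytes = ET.tostring(annotation, encoding="utf-8")
print(xml_bytes.decode())   # ready to be stored, mined, or pulled into a report
```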
Assessing physical activity using wearable monitors: measures of physical activity.
Butte, Nancy F; Ekelund, Ulf; Westerterp, Klaas R
2012-01-01
Physical activity may be defined broadly as "all bodily actions produced by the contraction of skeletal muscle that increase energy expenditure above basal level." Physical activity is a complex construct that can be classified into major categories qualitatively, quantitatively, or contextually. The quantitative assessment of physical activity using wearable monitors is grounded in the measurement of energy expenditure. Six main categories of wearable monitors are currently available to investigators: pedometers, load transducers/foot-contact monitors, accelerometers, HR monitors, combined accelerometer and HR monitors, and multiple sensor systems. Currently available monitors are capable of measuring total physical activity as well as components of physical activity that play important roles in human health. The selection of wearable monitors for measuring physical activity will depend on the physical activity component of interest, study objectives, characteristics of the target population, and study feasibility in terms of cost and logistics. Future development of sensors and analytical techniques for assessing physical activity should focus on the dynamic ranges of sensors, comparability for sensor output across manufacturers, and the application of advanced modeling techniques to predict energy expenditure and classify physical activities. New approaches for qualitatively classifying physical activity should be validated using direct observation or recording. New sensors and methods for quantitatively assessing physical activity should be validated in laboratory and free-living populations using criterion methods of calorimetry or doubly labeled water.
Hsieh, Anne M-Y; Polyakova, Olena; Fu, Guodong; Chazen, Ronald S; MacMillan, Christina; Witterick, Ian J; Ralhan, Ranju; Walfish, Paul G
2018-04-13
Recognition of noninvasive follicular thyroid neoplasms with papillary-like nuclear features (NIFTP) that distinguishes them from invasive malignant encapsulated follicular variant of papillary thyroid carcinoma (EFVPTC) can prevent overtreatment of NIFTP patients. We and others have previously reported that programmed death-ligand 1 (PD-L1) is a useful biomarker in thyroid tumors; however, all reports to date have relied on manual scoring that is time consuming as well as subject to individual bias. Consequently, we developed a digital image analysis (DIA) protocol for cytoplasmic and membranous stain quantitation (ThyApp) and evaluated three tumor sampling methods [Systematic Uniform Random Sampling, hotspot nucleus, and hotspot nucleus/3,3'-Diaminobenzidine (DAB)]. A patient cohort of 153 cases consisting of 48 NIFTP, 44 EFVPTC, 26 benign nodules, and 35 encapsulated follicular lesions/neoplasms with lymphocytic thyroiditis (LT) was studied. ThyApp quantitation of PD-L1 expression revealed a significant difference between invasive EFVPTC and NIFTP, but none between NIFTP and benign nodules. ThyApp integrated with the hotspot nucleus tumor sampling method proved to be the most clinically relevant, consumed the least processing time, and eliminated interobserver variance. In conclusion, the fully automatic DIA algorithm, developed using a histomorphological approach, objectively quantitated PD-L1 expression in encapsulated thyroid neoplasms and outperformed manual scoring in reproducibility and efficiency.
Chen, Ran; Zhang, Yuntao; Sahneh, Faryad Darabi; Scoglio, Caterina M; Wohlleben, Wendel; Haase, Andrea; Monteiro-Riviere, Nancy A; Riviere, Jim E
2014-09-23
Quantitative characterization of nanoparticle interactions with their surrounding environment is vital for safe nanotechnological development and standardization. A recent quantitative measure, the biological surface adsorption index (BSAI), has demonstrated promising applications in nanomaterial surface characterization and biological/environmental prediction. This paper further advances the approach beyond the application of five descriptors in the original BSAI to address the concentration dependence of the descriptors, enabling better prediction of the adsorption profile and more accurate categorization of nanomaterials based on their surface properties. Statistical analysis on the obtained adsorption data was performed based on three different models: the original BSAI, a concentration-dependent polynomial model, and an infinite dilution model. These advancements in BSAI modeling showed a promising development in the application of quantitative predictive modeling in biological applications, nanomedicine, and environmental safety assessment of nanomaterials.
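A sketch of the concentration-dependent extension described: rather than a single fixed value, an adsorption descriptor is modeled as a low-order polynomial in log concentration. The data and the quadratic form are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

# Invented: measured adsorption coefficient k of one probe compound on a
# nanomaterial surface at several concentrations (arbitrary units).
conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0])
k = np.array([120.0, 95.0, 70.0, 52.0, 40.0, 33.0])

logc = np.log10(conc)
coeffs = np.polyfit(logc, np.log10(k), 2)   # descriptor as quadratic in log c
model = np.poly1d(coeffs)

c_query = 0.5
print(f"predicted k at c={c_query}: {10 ** model(np.log10(c_query)):.1f}")
```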
NASA Astrophysics Data System (ADS)
Leker, Lindsey Beth
Stereotype threat is a widely researched phenomenon shown to impact performance in testing and evaluation situations (Katz, Roberts, & Robinson, 1965; Steele & Aronson, 1995). When related to gender, stereotype threat can lead women to score lower than men on standardized math exams (Spencer, Steele, & Quinn, 1999). Stereotype threat may be one reason women have lower enrollment in most science, technology, engineering, and mathematics (STEM) majors, hold fewer STEM careers than men, and have a higher attrition rate in STEM professions (Hill, Corbet, & Rose, 2010; Picho & Brown 2011; Sorby & Baartmans, 2000). Most research has investigated stereotype threat using experiments, yielding mixed results (Stoet & Geary, 2012). Thus, there is a need to explore stereotype threat using quantitative surveys and qualitative methods to examine other contextual factors that contribute to gender differences in STEM fields. This dissertation outlined a mixed methods study designed to, first, qualitatively explore stereotype threat and contextual factors related to high-achieving women in STEM fields, as well as women who have failed and/or avoided STEM fields. Then, the quantitative portion of the study used the themes from the qualitative phase to create a survey that measured stereotype threat and other contextual variables related to STEM success and failure/avoidance. Fifteen participants were interviewed for the qualitative phase of the study, and six themes emerged. The quantitative survey was completed by 242 undergraduate participants. T-tests, correlations, regressions, and mediation analyses were used to analyze the data. There were significant relationships between stereotype threat and STEM confidence, STEM anxiety, giving up in STEM, and STEM achievement. Overall, this mixed methods study advanced qualitative research on stereotype threat, developed a much-needed scale for the measurement of stereotype threat, and tested the developed scale.
Evaluating Inquiry-Based Learning as a Means to Advance Individual Student Achievement
ERIC Educational Resources Information Center
Ziemer, Cherilyn G.
2013-01-01
Although inquiry-based learning has been debated throughout the greater educational community and demonstrated with some effect in modern classrooms, little quantitative analysis has been performed to empirically validate sustained benefits. This quantitative study focused on whether inquiry-based pedagogy actually brought about sustained and…
Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.
ERIC Educational Resources Information Center
Moffat, A. J.; And Others
Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Szczykutowicz, T; Rubert, N; Ranallo, F
Purpose: A framework for explaining differences in image quality to non-technical audiences in medical imaging is needed. Currently, this task is something that is learned “on the job.” The lack of a formal methodology for communicating optimal acquisition parameters into the clinic effectively mitigates many technological advances. As a community, medical physicists need to be held responsible not only for advancing image science, but also for ensuring its proper use in the clinic. This work outlines a framework that bridges the gap between the results from quantitative image quality metrics like detectability, MTF, and NPS and their effect on specific anatomical structures present in diagnostic imaging tasks. Methods: Specific structures of clinical importance were identified for a body, an extremity, a chest, and a temporal bone protocol. Using these structures, quantitative metrics were used to identify the parameter space that should yield optimal image quality, constrained within the confines of clinical logistics and dose considerations. The reading room workflow for presenting the proposed changes for imaging each of these structures is presented. The workflow consists of displaying images for physician review consisting of different combinations of acquisition parameters guided by quantitative metrics. Examples of using detectability index, MTF, NPS, noise and noise non-uniformity are provided. During review, the physician was forced to judge the image quality solely on those features they need for diagnosis, not on the overall “look” of the image. Results: We found that in many cases, use of this framework settled disagreements between physicians. Once forced to judge images on the ability to detect specific structures, inter-reader agreement was obtained. Conclusion: This framework will provide consulting, research/industrial, or in-house physicists with clinically relevant imaging tasks to guide reading room image review. This framework avoids use of the overall “look” or “feel” to dictate acquisition parameter selection. Equipment grants: GE Healthcare.
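Of the metrics named, the noise power spectrum is the most mechanical to compute, and a compact version helps make the framework concrete. The sketch below is a standard periodogram estimate from uniform-region ROIs, not the authors' code.

```python
import numpy as np

def nps_2d(rois, pixel_mm=0.5):
    """Ensemble noise power spectrum from uniform-region ROIs.
    rois: array (n_roi, N, N) of CT numbers from a uniform phantom.
    Returns the 2D NPS in HU^2 mm^2 (standard periodogram estimate)."""
    n_roi, N, _ = rois.shape
    detrended = rois - rois.mean(axis=(1, 2), keepdims=True)  # remove mean
    dft = np.fft.fft2(detrended)                              # per-ROI 2D DFT
    return (pixel_mm ** 2 / (N * N)) * np.mean(np.abs(dft) ** 2, axis=0)

# Sanity check: white noise of variance sigma^2 has a flat NPS whose
# integral over frequency returns sigma^2.
rois = np.random.default_rng(3).normal(scale=10.0, size=(64, 128, 128))
nps = nps_2d(rois, pixel_mm=0.5)
var = nps.sum() / (0.5 * 128) ** 2   # sum * df_x * df_y, df = 1/(N*dx)
print(round(var, 1))                 # ~100 = sigma^2
```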
Evolving Concepts and Teaching Approaches In Tectonics and Sedimentation.
ERIC Educational Resources Information Center
Graham, Stephan Alan
1983-01-01
Discusses five recent advances in sedimentary tectonics, noting how they are incorporated into college curricula. Advances discussed include basin type, tectonic setting, facies analysis (in conjunction with basin type/setting), stratigraphic analysis of reflection seismic data, and quantitative analysis of subsidence histories of sedimentary…
Jones, Louise; Candy, Bridget; Davis, Sarah; Elliott, Margaret; Gola, Anna; Harrington, Jane; Kupeli, Nuriye; Lord, Kathryn; Moore, Kirsten; Scott, Sharon; Vickerstaff, Victoria; Omar, Rumana Z; King, Michael; Leavey, Gerard; Nazareth, Irwin; Sampson, Elizabeth L
2015-01-01
Background: The prevalence of dementia is rising worldwide and many people will die with the disease. Symptoms towards the end of life may be inadequately managed and informal and professional carers poorly supported. There are few evidence-based interventions to improve end-of-life care in advanced dementia. Aim: To develop an integrated, whole systems, evidence-based intervention that is pragmatic and feasible to improve end-of-life care for people with advanced dementia and support those close to them. Design: A realist-based approach in which qualitative and quantitative data assisted the development of statements. These were incorporated into the RAND/UCLA appropriateness method to achieve consensus on intervention components. Components were mapped to underlying theory of whole systems change and the intervention described in a detailed manual. Setting/participants: Data were collected from people with dementia, carers and health and social care professionals in England, from expert opinion and existing literature. Professional stakeholders in all four countries of the United Kingdom contributed to the RAND/UCLA appropriateness method process. Results: A total of 29 statements were agreed and mapped to individual, group, organisational and economic/political levels of healthcare systems. The resulting main intervention components are as follows: (1) influencing local service organisation through facilitation of integrated multi-disciplinary care, (2) providing training and support for formal and informal carers and (3) influencing local healthcare commissioning and priorities of service providers. Conclusion: Use of in-depth data, consensus methods and theoretical understanding of the intervention components produced an evidence-based intervention for further testing in end-of-life care in advanced dementia. PMID:26354388
Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret
2015-06-11
Little is known about how well or under what conditions health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time. The later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand the concept mapping quantitative results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue later-stage translation research questions required for population impact. Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence about the value of sustaining interventions over time, identifying correlates of sustainability along with strategies for sustaining evidence-supported interventions, advancing the theoretical base and research designs for sustainability research, and advancing the workforce capacity, research culture, and funding mechanisms for this important work.
The efficiency of close inbreeding to reduce genetic adaptation to captivity
Theodorou, K; Couvet, D
2015-01-01
Although ex situ conservation is indispensable for thousands of species, captive breeding is associated with negative genetic changes: loss of genetic variance and genetic adaptation to captivity that is deleterious in the wild. We used quantitative genetic individual-based simulations to model the effect of genetic management on the evolution of a quantitative trait and the associated fitness of wild-born individuals that are brought to captivity. We also examined the feasibility of the breeding strategies under a scenario of a large number of loci subject to deleterious mutations. We compared two breeding strategies: repeated half-sib mating and a method of minimizing mean coancestry (referred to as gc/mc). Our major finding was that half-sib mating is more effective in reducing genetic adaptation to captivity than the gc/mc method. Moreover, half-sib mating retains larger allelic and adaptive genetic variance. Relative to initial standing variation, the additive variance of the quantitative trait increased under half-sib mating during the sojourn in captivity. Although fragmentation into smaller populations improves the efficiency of the gc/mc method, half-sib mating still performs better in the scenarios tested. Half-sib mating shows two caveats that could mitigate its beneficial effects: low heterozygosity and high risk of extinction when populations are of low fecundity and size and one of the following conditions is met: (i) the strength of selection in captivity is comparable with that in the wild, (ii) deleterious mutations are numerous and only slightly deleterious. Experimental validation of half-sib mating is therefore needed for the advancement of captive breeding programs. PMID:25052417
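The core intuition — that confining breeding to small, closed family units weakens adaptation to the captive environment because drift dominates selection — can be sketched with a toy simulation. The code below is a deliberately minimal infinitesimal-model stand-in (fixed segregation variance, no explicit pedigree or mutation), not the authors' individual-based model of half-sib mating or gc/mc management.

```python
import numpy as np

rng = np.random.default_rng(4)

def mean_shift(n_pairs, t=20, opt_captive=1.0, vs=1.0, va=0.5):
    """Trait under Gaussian stabilizing selection toward the captive
    optimum. Smaller breeding units respond less to selection because
    drift dominates within each line."""
    z = rng.normal(0.0, np.sqrt(va), size=2 * n_pairs)   # breeding values
    for _ in range(t):
        w = np.exp(-(z - opt_captive) ** 2 / (2 * vs))   # captive fitness
        parents = rng.choice(z, size=2 * n_pairs, p=w / w.sum())
        mid = parents.reshape(2, -1).mean(axis=0)        # midparent values
        z = np.repeat(mid, 2) + rng.normal(0, np.sqrt(va / 2), 2 * n_pairs)
    return z.mean()   # shift toward the captive optimum = adaptation

# A large panmictic population adapts toward 1.0; tiny lines mostly drift.
print(mean_shift(n_pairs=100), mean_shift(n_pairs=2))
```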
Contado, Catia; Argazzi, Roberto; Amendola, Vincenzo
2016-11-04
Many advanced industrial and biomedical applications that use silver nanoparticles (AgNPs) require that particles are not only nano-sized, but also well dispersed, not aggregated and not agglomerated. This study presents two methods able to rapidly give the sizes of monodisperse AgNP suspensions in the dimensional range of 20-100 nm. The first method, based on the application of Mie's theory, determines the particle sizes from the values of the surface plasmon resonance wavelength (SPRmax), read from optical absorption spectra recorded between 190 nm and 800 nm. The computed sizes were compared with those determined by transmission electron microscopy (TEM) and dynamic light scattering (DLS), and agreed with the nominal values to within 13% (for 20 nm NPs) to 1% (for 100 nm NPs). The second method is based on the combination of Sedimentation Field Flow Fractionation (SdFFF, now sold as Centrifugal FFF, CFFF) and Optical Absorption Spectroscopy (OAS) to obtain sizes and quantitative particle size distributions for monodisperse, non-aggregated AgNP suspensions. The SdFFF separation capability, well suited to sizing NPs, benefits greatly from the application of Mie's theory to the UV-vis signal elaboration, producing quantitative mass-based particle size distributions from which trusted number-based particle size distributions can be derived. The silver mass distributions were verified and supported by detecting the Ag concentration off-line with graphite furnace atomic absorption spectrometry (GF-AAS). Copyright © 2016 Elsevier B.V. All rights reserved.
Extracting and Converting Quantitative Data into Human Error Probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tuan Q. Tran; Ronald L. Boring; Jeffrey C. Joe
2007-08-01
This paper discusses a proposed method using a combination of advanced statistical approaches (e.g., meta-analysis, regression, structural equation modeling) that will not only convert different empirical results into a common metric for scaling individual PSF effects, but will also examine the complex interrelationships among PSFs. Furthermore, the paper discusses how the derived statistical estimates (i.e., effect sizes) can be mapped onto an HRA method (e.g., SPAR-H) to generate HEPs that can then be used in probabilistic risk assessment (PRA). The paper concludes with a discussion of the benefits of using the academic literature to assist HRA analysts in generating sound HEPs and HRA developers in validating current HRA models and formulating new ones.
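The mapping step can be made concrete with a toy composition rule in the SPAR-H spirit: a nominal error probability scaled by PSF multipliers. The effect-size-to-multiplier anchors below are pure assumptions for illustration, not values from SPAR-H or this paper.

```python
# Hedged sketch of the composition step the paper maps onto: a nominal
# HEP is scaled by multipliers derived from PSF effect sizes.

def effect_size_to_multiplier(d):
    """Map a standardized effect size (e.g., from meta-analysis) onto a
    PSF multiplier; these anchor points are assumptions, not SPAR-H."""
    if d < 0.2:
        return 1.0    # negligible effect
    if d < 0.5:
        return 2.0    # small
    if d < 0.8:
        return 5.0    # medium
    return 10.0       # large

def hep(nominal, effect_sizes):
    m = 1.0
    for d in effect_sizes:
        m *= effect_size_to_multiplier(d)
    return min(nominal * m, 1.0)   # probabilities are capped at 1

print(hep(0.001, [0.3, 0.9]))      # 0.001 * 2 * 10 = 0.02
```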
Folding and unfolding single RNA molecules under tension
Woodside, Michael T; García-García, Cuauhtémoc; Block, Steven M
2010-01-01
Single-molecule force spectroscopy constitutes a powerful method for probing RNA folding: it allows the kinetic, energetic, and structural properties of intermediate and transition states to be determined quantitatively, yielding new insights into folding pathways and energy landscapes. Recent advances in experimental and theoretical methods, including fluctuation theorems, kinetic theories, novel force clamps, and ultrastable instruments, have opened new avenues for study. These tools have been used to probe folding in simple model systems, for example, RNA and DNA hairpins. Knowledge gained from such systems is helping to build our understanding of more complex RNA structures composed of multiple elements, as well as how nucleic acids interact with proteins involved in key cellular activities, such as transcription and translation. PMID:18786653
Aptasensors for quantitative detection of Salmonella Typhimurium.
Ansari, Najmeh; Yazdian-Robati, Rezvan; Shahdordizadeh, Mahin; Wang, Zhouping; Ghazvini, Kiarash
2017-09-15
Salmonella is one of the most frequent causes of food-borne infectious disease. Among the nearly 2500 documented serotypes, Salmonella Typhimurium is the leading serotype associated with salmonellosis worldwide. Many different methods have been developed for the detection and quantification of S. Typhimurium. Most of these assays are expensive, time consuming and require difficult sample preparation steps. Therefore, it is necessary to develop rapid, robust, cost-effective and sensitive alternative detection methods. In recent years, aptasensors have been used for detection of S. Typhimurium in different samples. In this review, recent advances and applications of aptasensors for the detection and quantification of S. Typhimurium are summarized in detail. Copyright © 2017 Elsevier Inc. All rights reserved.
A facile method to modify bentonite nanoclay with silane
NASA Astrophysics Data System (ADS)
Abeywardena, Sujani B. Y.; Perera, Srimala; Nalin de Silva, K. M.; Tissera, Nadeeka P.
2017-07-01
Immobilization of smectite clay onto a desirable surface has received much attention, since its nanospace can be utilized for many applications in materials science. Here, we present an efficient method to functionalize the surface of bentonite nanoclay (BNC) by grafting (3-aminopropyl)triethoxysilane (APTES). Infrared spectroscopy and elemental analysis confirmed the presence of organic chains and amine groups in the modified nanoclay. XRD analysis confirmed grafting of APTES on the surface of the bentonite nanoclay without intercalation. The extent of the surface modification was quantitatively confirmed by TGA analysis. Modified BNC can covalently couple with different material surfaces, allowing its nanospace to be utilized for intercalation of cations, bio-molecules, and polymeric materials, for use in advanced military, aerospace, pharmaceutical, and many other commercial applications.
Methods, Tools and Current Perspectives in Proteogenomics
Ruggles, Kelly V.; Krug, Karsten; Wang, Xiaojing; Clauser, Karl R.; Wang, Jing; Payne, Samuel H.; Fenyö, David; Zhang, Bing; Mani, D. R.
2017-01-01
With combined technological advancements in high-throughput next-generation sequencing and deep mass spectrometry-based proteomics, proteogenomics, i.e. the integrative analysis of proteomic and genomic data, has emerged as a new research field. Early efforts in the field were focused on improving protein identification using sample-specific genomic and transcriptomic sequencing data. More recently, integrative analysis of quantitative measurements from genomic and proteomic studies has yielded novel insights into gene expression regulation, cell signaling, and disease. Many methods and tools have been developed or adapted to enable an array of integrative proteogenomic approaches, and in this article we systematically classify published methods and tools into four major categories: (1) Sequence-centric proteogenomics; (2) Analysis of proteogenomic relationships; (3) Integrative modeling of proteogenomic data; and (4) Data sharing and visualization. We provide a comprehensive review of methods and available tools in each category and highlight their typical applications. PMID:28456751
Comparison of Different Methods of Grading a Level Turn Task on a Flight Simulator
NASA Technical Reports Server (NTRS)
Heath, Bruce E.; Crier, Tomyka
2003-01-01
With the advancements in the computing power of personal computers, PC-based flight simulators and trainers have opened new avenues in the training of airplane pilots. It may be desirable to have the flight simulator make a quantitative evaluation of the progress of a pilot's training, thereby reducing the demands on the flight instructor, who must otherwise watch every flight. In an experiment, university students conducted six different flights, each consisting of two level turns. The flights were three minutes in duration. By evaluating videotapes, two certified flight instructors provided separate letter grades for each turn. These level turns were also evaluated using two other computer-based grading methods. One method determined automated grades based on prescribed tolerances in bank angle, airspeed, and altitude. The other method used deviations in altitude and bank angle to compute a performance index and performance grades.
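A tolerance-based autograder of the first kind is straightforward to sketch: score the fraction of flight time within prescribed bands on all three channels and map that to a letter grade. The targets, tolerances, and grade cutoffs below are hypothetical, not those used in the study.

```python
import numpy as np

def grade_turn(bank_deg, airspeed_kt, altitude_ft,
               bank_target=30, speed_target=100, alt_target=3000,
               tol=(5.0, 10.0, 100.0)):
    """Illustrative tolerance-based grade for a level turn: the fraction
    of samples within tolerance on all three channels, mapped to a letter.
    Targets and tolerances are assumptions, not the study's values."""
    ok = ((np.abs(bank_deg - bank_target) <= tol[0]) &
          (np.abs(airspeed_kt - speed_target) <= tol[1]) &
          (np.abs(altitude_ft - alt_target) <= tol[2]))
    frac = ok.mean()
    for cut, letter in [(0.9, "A"), (0.8, "B"), (0.7, "C"), (0.6, "D")]:
        if frac >= cut:
            return letter, frac
    return "F", frac

t = np.linspace(0, 180, 1800)   # a 3-minute flight sampled at 10 Hz
bank = 30 + 3 * np.sin(t / 7)
speed = 100 + 5 * np.sin(t / 11)
alt = 3000 + 60 * np.sin(t / 13)
print(grade_turn(bank, speed, alt))   # ('A', 1.0) for this smooth turn
```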
Katsarov, Plamen; Gergov, Georgi; Alin, Aylin; Pilicheva, Bissera; Al-Degs, Yahya; Simeonov, Vasil; Kassarova, Margarita
2018-03-01
The predictive power of partial least squares (PLS) and multivariate curve resolution-alternating least squares (MCR-ALS) methods has been studied for simultaneous quantitative analysis of the binary drug combination doxylamine succinate and pyridoxine hydrochloride. Analysis of first-order UV overlapped spectra was performed using different PLS models - classical PLS1 and PLS2 as well as partial robust M-regression (PRM). These linear models were compared to MCR-ALS with equality and correlation constraints (MCR-ALS-CC). All techniques operated within the full spectral region and extracted maximum information for the drugs analysed. The developed chemometric methods were validated on external sample sets and applied to the analyses of pharmaceutical formulations. The obtained statistical parameters were satisfactory for both calibration and validation sets. All developed methods can be successfully applied for simultaneous spectrophotometric determination of doxylamine and pyridoxine, both in laboratory-prepared mixtures and in commercial dosage forms.
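For readers unfamiliar with PLS calibration on overlapped spectra, a minimal sketch with scikit-learn conveys the workflow: simulate two overlapped pure-component bands, mix them per Beer-Lambert, fit a two-component PLS2 model, and predict unknown concentrations. The spectra are synthetic stand-ins, not the doxylamine/pyridoxine data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(5)
wl = np.linspace(220, 320, 200)
s1 = np.exp(-((wl - 260) / 15) ** 2)   # pure-component band 1
s2 = np.exp(-((wl - 275) / 20) ** 2)   # pure-component band 2 (overlapped)

# Calibration set: 30 mixtures with known concentrations, Beer-Lambert
# additivity plus a little instrumental noise.
C = rng.uniform(1, 10, size=(30, 2))
A = C @ np.vstack([s1, s2]) + rng.normal(scale=0.002, size=(30, 200))

pls = PLSRegression(n_components=2).fit(A, C)   # PLS2: both analytes at once
C_test = np.array([[4.0, 7.0]])
A_test = C_test @ np.vstack([s1, s2])
print(pls.predict(A_test))                      # ~[[4.0, 7.0]]
```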
Integrated System Test of the Advanced Instructional System (AIS). Final Report.
ERIC Educational Resources Information Center
Lintz, Larry M.; And Others
The integrated system test for the Advanced Instructional System (AIS) was designed to provide quantitative information regarding training time reductions resulting from certain computer managed instruction features. The reliabilities of these features and of support systems were also investigated. Basic computer managed instruction reduced…
Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul
2012-01-01
Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths, with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs. Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular, and can be extended to other helminthic diseases, which together afflict a large part of humankind.
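The time-series step can be illustrated compactly: represent each parasite as a trace, compute pairwise similarities, and cluster. The paper analyzes several similarity measures (including elastic ones such as dynamic time warping); the sketch below uses plain Euclidean distance on synthetic motility traces purely to show the pipeline shape.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Synthetic phenotypic time-series: 20 drug-affected (low motility) and
# 20 control-like (high motility) parasites, 100 time points each.
rng = np.random.default_rng(6)
slow = rng.normal(0.2, 0.05, size=(20, 100))
fast = rng.normal(0.8, 0.05, size=(20, 100))
traces = np.vstack([slow, fast])

D = pdist(traces)                               # pairwise Euclidean distances
labels = fcluster(linkage(D, method="average"), t=2, criterion="maxclust")
print(labels[:20], labels[20:])                 # two clean clusters
```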
Polarization sensitive optical coherence tomography – a review [Invited]
de Boer, Johannes F.; Hitzenberger, Christoph K.; Yasuno, Yoshiaki
2017-01-01
Optical coherence tomography (OCT) is now a well-established modality for high-resolution cross-sectional and three-dimensional imaging of transparent and translucent samples and tissues. Conventional, intensity-based OCT, however, does not provide tissue-specific contrast, causing ambiguity in image interpretation in several cases. Polarization sensitive (PS) OCT takes advantage of the fact that several materials and tissues can change the light's polarization state, adding an additional contrast channel and providing quantitative information. In this paper, we review basic and advanced methods of PS-OCT and demonstrate its use in selected biomedical applications. PMID:28663869
Light and sound - emerging imaging techniques for inflammatory bowel disease
Knieling, Ferdinand; Waldner, Maximilian J
2016-01-01
Patients with inflammatory bowel disease are known to have a high demand for recurrent evaluation of therapy and disease activity. Further, the risk of developing cancer increases year by year as the disease progresses. New, mostly radiation-free, quick-to-perform and quantitative methods are challenging conventional endoscopy with biopsy as the gold standard. In particular, new physical imaging approaches utilizing light and sound waves have facilitated the development of advanced functional and molecular modalities. Besides these advantages, they hold the promise of predicting personalized therapeutic responses and of sparing patients frequent invasive procedures. Within this article we highlight their potential for initial diagnosis, assessment of disease activity and surveillance of cancer development, covering established techniques and recent advances such as wide-view full-spectrum endoscopy, chromoendoscopy, autofluorescence endoscopy, endocytoscopy, confocal laser endoscopy, multiphoton endoscopy, molecular imaging endoscopy, B-mode and Doppler ultrasound, contrast-enhanced ultrasound, ultrasound molecular imaging, and elastography. PMID:27433080
Recent Advances in Targeted and Untargeted Metabolomics by NMR and MS/NMR Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bingol, Kerem
Metabolomics has made significant progress on multiple fronts in the last 18 months. This minireview aims to give an overview of these advancements in light of their contribution to targeted and untargeted metabolomics. New computational approaches have emerged to automate the manual absolute quantitation step for metabolites in 1D 1H NMR spectra, providing more consistency in inter-laboratory comparisons. Integration of 2D NMR metabolomics databases under a unified web server has allowed very accurate identification of the metabolites that have been catalogued in these databases. For the remaining uncatalogued and unknown metabolites, new cheminformatics approaches have been developed by combining NMR and mass spectrometry. These hybrid NMR/MS approaches have accelerated the identification of unknowns in untargeted studies, and they now allow ever larger numbers of metabolites to be profiled in application studies.
Development of imaging techniques to study the pathogenesis of biosafety level 2/3 infectious agents
Rella, Courtney E.; Ruel, Nancy; Eugenin, Eliseo A.
2015-01-01
Despite significant advances in microbiology and molecular biology over the last decades, several infectious diseases remain global concerns, resulting in the death of millions of people worldwide each year. According to the Center for Disease Control (CDC) in 2012, there were 34 million people infected with HIV, 8.7 million new cases of tuberculosis, 500 million cases of hepatitis, and 50–100 million people infected with dengue. Several of these pathogens, despite high incidence, do not have reliable clinical detection methods. New or improved protocols have been generated to enhance detection and quantitation of several pathogens using high-end microscopy (light, confocal, and STORM microscopy) and imaging software. In the current manuscript, we discuss these approaches and the theories behind these methodologies. Thus, advances in imaging techniques will open new possibilities to discover therapeutic interventions to reduce or eliminate the devastating consequences of infectious diseases. PMID:24990818
NASA Astrophysics Data System (ADS)
Dietz, Laura
The Science Teaching Advancement through Modeling Physical Science (STAMPS) professional development workshop was evaluated for effectiveness in improving teachers' and students' content knowledge. Previous research has shown modeling to be an effective method of instruction for improving student and teacher content knowledge, as evidenced by assessment scores. Data include teacher scores on the Force Concept Inventory (FCI; Hestenes, Wells, & Swackhamer, 1992) and the Chemistry Concept Inventory (CCI; Jenkins, Birk, Bauer, Krause, & Pavelich, 2004), as well as student scores on a physics and chemistry assessment. The quantitative data are supported by teacher responses to a post-workshop survey and by classroom observations. Evaluation of the data shows that the STAMPS professional development workshop was successful in improving both student and teacher content knowledge. Conclusions and suggestions for future study are also included.
Integration of infrared thermography into various maintenance methodologies
NASA Astrophysics Data System (ADS)
Morgan, William T.
1993-04-01
Maintenance methodologies are in developmental stages throughout the world as global competitiveness drives all industries to improve operational efficiencies. Rapid progress in technical advancements has added an additional strain on maintenance organizations to progressively change. Accompanying needs for advanced training and documentation is the demand for utilization of various analytical instruments and quantitative methods. Infrared thermography is one of the primary elements of engineered approaches to maintenance. Current maintenance methodologies can be divided into six categories: Routine ('Breakdown'), Preventive, Predictive, Proactive, Reliability-Based, and Total Productive (TPM) maintenance. Each of these methodologies has a distinctive approach to achieving improved operational efficiencies. Popular thought is that infrared thermography is a Predictive maintenance tool. While this is true, it is also true that it can be effectively integrated into each of the maintenance methodologies to achieve desired results. The six maintenance strategies will be defined, and infrared applications integrated into each will be presented in tabular form.
Biological soft X-ray tomography on beamline 2.1 at the Advanced Light Source.
Le Gros, Mark A; McDermott, Gerry; Cinquin, Bertrand P; Smith, Elizabeth A; Do, Myan; Chao, Weilun L; Naulleau, Patrick P; Larabell, Carolyn A
2014-11-01
Beamline 2.1 (XM-2) is a transmission soft X-ray microscope in sector 2 of the Advanced Light Source at Lawrence Berkeley National Laboratory. XM-2 was designed, built and is now operated by the National Center for X-ray Tomography as a National Institutes of Health Biomedical Technology Research Resource. XM-2 is equipped with a cryogenic rotation stage to enable tomographic data collection from cryo-preserved cells, including large mammalian cells. During data collection the specimen is illuminated with `water window' X-rays (284-543 eV). Illuminating photons are attenuated an order of magnitude more strongly by biomolecules than by water. Consequently, differences in molecular composition generate quantitative contrast in images of the specimen. Soft X-ray tomography is an information-rich three-dimensional imaging method that can be applied either as a standalone technique or as a component modality in correlative imaging studies.
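The quantitative contrast mechanism described follows directly from Beer-Lambert attenuation. The linear absorption coefficients below are rough, illustrative water-window values (protein roughly an order of magnitude above water), not beamline calibration data.

```python
import numpy as np

# Back-of-envelope water-window contrast, per Beer-Lambert: T = exp(-mu*t).
mu_water, mu_protein = 0.11, 1.1   # 1/um, approximate illustrative values
thickness_um = 0.5                 # a 0.5-um feature in an ice matrix

T_water = np.exp(-mu_water * thickness_um)
T_protein = np.exp(-mu_protein * thickness_um)
print(f"transmission: water {T_water:.2f}, protein {T_protein:.2f}")
# Because mu is linear in composition, -ln(T)/t gives a quantitative
# measure of molecular density in each reconstructed voxel.
```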
Emerging Imaging and Genomic Tools for Developmental Systems Biology.
Liu, Zhe; Keller, Philipp J
2016-03-21
Animal development is a complex and dynamic process orchestrated by exquisitely timed cell lineage commitment, divisions, migration, and morphological changes at the single-cell level. In the past decade, extensive genetic, stem cell, and genomic studies provided crucial insights into molecular underpinnings and the functional importance of genetic pathways governing various cellular differentiation processes. However, it is still largely unknown how the precise coordination of these pathways is achieved at the whole-organism level and how the highly regulated spatiotemporal choreography of development is established in turn. Here, we discuss the latest technological advances in imaging and single-cell genomics that hold great promise for advancing our understanding of this intricate process. We propose an integrated approach that combines such methods to quantitatively decipher in vivo cellular dynamic behaviors and their underlying molecular mechanisms at the systems level with single-cell, single-molecule resolution. Copyright © 2016 Elsevier Inc. All rights reserved.
LES of an Advancing Helicopter Rotor, and Near to Far Wake Assessment
NASA Astrophysics Data System (ADS)
Caprace, Denis-Gabriel; Duponcheel, Matthieu; Chatelain, Philippe; Winckelmans, Grégoire
2017-11-01
Helicopter wake physics involve complex, unsteady vortical flows which have been only scarcely addressed in past studies. The present work focuses on LES of the wake flow behind an advancing rotor, to support the investigation of rotorcraft wake physics and decay mechanisms. A hybrid Vortex Particle-Mesh (VPM) method is employed to simulate the wake of an articulated four-bladed rotor in trimmed conditions, at an advance ratio of 0.41. The simulation domain extends to 30 rotor diameters downstream. The coarse scale aerodynamics of the blades are accounted for through enhanced immersed lifting lines. The vorticity generation mechanisms, the roll-up of the near wake and the resulting established far wake are described (i) qualitatively in terms of vortex dynamics using rotor polar plots and 3D visualizations; (ii) quantitatively using classical integral diagnostics. The power spectra measured by velocity probes in the wake are also presented. The analysis shows that the wake reaches a fully turbulent equilibrium state at a distance of about 30 diameters downstream. This work is supported by the Belgian french community F.R.S.-FNRS.
Computational understanding of Li-ion batteries
NASA Astrophysics Data System (ADS)
Urban, Alexander; Seo, Dong-Hwa; Ceder, Gerbrand
2016-03-01
Over the last two decades, computational methods have made tremendous advances, and today many key properties of lithium-ion batteries can be accurately predicted by first principles calculations. For this reason, computations have become a cornerstone of battery-related research by providing insight into fundamental processes that are not otherwise accessible, such as ionic diffusion mechanisms and electronic structure effects, as well as a quantitative comparison with experimental results. The aim of this review is to provide an overview of state-of-the-art ab initio approaches for the modelling of battery materials. We consider techniques for the computation of equilibrium cell voltages, 0-Kelvin and finite-temperature voltage profiles, ionic mobility and thermal and electrolyte stability. The strengths and weaknesses of different electronic structure methods, such as DFT+U and hybrid functionals, are discussed in the context of voltage and phase diagram predictions, and we review the merits of lattice models for the evaluation of finite-temperature thermodynamics and kinetics. With such a complete set of methods at hand, first principles calculations of ordered, crystalline solids, i.e., of most electrode materials and solid electrolytes, have become reliable and quantitative. However, the description of molecular materials and disordered or amorphous phases remains an important challenge. We highlight recent exciting progress in this area, especially regarding the modelling of organic electrolytes and solid-electrolyte interfaces.
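The equilibrium cell voltage computation mentioned first reduces to a one-line formula once DFT total energies are in hand: V = -[E(Li_x2 Host) - E(Li_x1 Host) - (x2 - x1)·E(Li metal)] / (x2 - x1), with energies per formula unit in eV giving volts directly, since one electron transfers per Li. The energies in the sketch below are placeholders, not real DFT outputs; only the formula itself is standard.

```python
def average_voltage(E_lithiated, E_delithiated, x2, x1, E_li_metal=-1.90):
    """Average intercalation voltage between Li contents x1 and x2.
    Energies in eV per formula unit; E_li_metal is a placeholder value
    for the energy per atom of metallic Li, not a DFT result."""
    return -(E_lithiated - E_delithiated - (x2 - x1) * E_li_metal) / (x2 - x1)

# Hypothetical LiMO2 <-> MO2 couple:
print(average_voltage(E_lithiated=-45.0, E_delithiated=-39.2, x2=1.0, x1=0.0))
# -(-45.0 + 39.2 + 1.9) = 3.9 V
```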
Mata, Ricardo A
2010-05-21
In this Perspective, several developments in the field of quantum mechanics/molecular mechanics (QM/MM) approaches are reviewed. Emphasis is placed on the use of correlated wavefunction theory and new state of the art methods for the treatment of large quantum systems. Until recently, computational chemistry approaches to large/complex chemical problems have seldom been considered as tools for quantitative predictions. However, due to the tremendous development of computational resources and new quantum chemical methods, it is nowadays possible to describe the electronic structure of biomolecules at levels of theory which a decade ago were only possible for system sizes of up to 20 atoms. These advances are here outlined in the context of QM/MM. The article concludes with a short outlook on upcoming developments and possible bottlenecks for future applications.
NASA Astrophysics Data System (ADS)
Bart, Gerhard; Aerne, Ernst Tino; Burri, Martin; Zwicky, Hans-Urs
1986-11-01
Cladding carburization during irradiation of advanced mixed uranium-plutonium carbide fast breeder reactor fuel is a possibly life-limiting fuel pin factor. The quantitative assessment of such clad carbon embrittlement is difficult to perform by electron microprobe analysis because of sample surface contamination and the very low energy of the carbon Kα X-ray transition. The work presented here describes a method developed at the Swiss Federal Institute for Reactor Research (EIR) that uses shielded secondary ion mass spectrometry (SIMS) as an accurate tool to determine radial distribution profiles of carbon in radioactive stainless steel fuel pin cladding. Compared with nuclear microprobe analysis (NMA) [1], which is also an accurate method for carbon analysis, the SIMS method distinguishes itself by its versatility for simultaneous determination of additional impurities.
Albrecht, Simone; Mittermayr, Stefan; Smith, Josh; Martín, Silvia Millán; Doherty, Margaret; Bones, Jonathan
2017-01-01
Quantitative glycomics represents an actively expanding research field ranging from the discovery of disease-associated glycan alterations to the quantitative characterization of N-glycans on therapeutic proteins. Commonly used analytical platforms for comparative relative quantitation of complex glycan samples include MALDI-TOF-MS or chromatographic glycan profiling with subsequent data alignment and statistical evaluation. Limitations of such approaches include run-to-run technical variation and the potential introduction of subjectivity during data processing. Here, we introduce an offline 2D LC-MSE workflow for the fractionation and relative quantitation of twoplex isotopically labeled N-linked oligosaccharides using neutral 12C6 and 13C6 aniline (Δmass = 6 Da). Additional linkage-specific derivatization of sialic acids using 4-(4,6-dimethoxy-1,3,5-triazin-2-yl)-4-methylmorpholinium chloride offered simultaneous and advanced in-depth structural characterization. The potential of the method was demonstrated for the differential analysis of structurally defined N-glycans released from serum proteins of patients diagnosed with various stages of colorectal cancer. The described twoplex 12C6/13C6 aniline 2D LC-MS platform is ideally suited for differential glycomic analysis of structurally complex N-glycan pools due to the combination and analysis of samples in a single LC-MS injection and the associated minimization of technical variation. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Thermographic Inspection of Advanced Composite Materials
NASA Technical Reports Server (NTRS)
Cramer, K. Elliott
2004-01-01
As the use of advanced composite materials continues to increase in the aerospace community, the need for a quantitative, rapid, in situ inspection technology has become a critical concern throughout the industry. In many applications it is necessary to monitor changes in these materials over an extended period of time to determine the effects of various load conditions. Additionally, the detection and characterization of defects such as delaminations, is of great concern. This paper will present the application of infrared thermography to characterize various composite materials and show the advantages of different heat source types. Finally, various analysis methodologies used for quantitative material property characterization will be discussed.
Quantitative structural MRI for early detection of Alzheimer’s disease
McEvoy, Linda K; Brewer, James B
2011-01-01
Alzheimer’s disease (AD) is a common progressive neurodegenerative disorder that is not currently diagnosed until a patient reaches the stage of dementia. There is a pressing need to identify AD at an earlier stage, so that treatment, when available, can begin early. Quantitative structural MRI is sensitive to the neurodegeneration that occurs in mild and preclinical AD, and is predictive of decline to dementia in individuals with mild cognitive impairment. Objective evidence of ongoing brain atrophy will be critical for risk/benefit decisions once potentially aggressive, disease-modifying treatments become available. Recent advances have paved the way for the use of quantitative structural MRI in clinical practice, and initial clinical use has been promising. However, further experience with these measures in the relatively unselected patient populations seen in clinical practice is needed to complete translation of the recent enormous advances in scientific knowledge of AD into the clinical realm. PMID:20977326
Getting quantitative about consequences of cross-ecosystem resource subsidies on recipient consumers
Richardson, John S.; Wipfli, Mark S.
2016-01-01
Most studies of cross-ecosystem resource subsidies have demonstrated positive effects on recipient consumer populations, often with very large effect sizes. However, it is important to move beyond these initial addition–exclusion experiments to consider the quantitative consequences for populations across gradients in the rates and quality of resource inputs. In our introduction to this special issue, we describe at least four potential models that describe functional relationships between subsidy input rates and consumer responses, most of them asymptotic. Here we aim to advance our quantitative understanding of how subsidy inputs influence recipient consumers and their communities. In the papers following, fish were either the recipient consumers or the subsidy as carcasses of anadromous species. Advancing general, predictive models will enable us to further consider what other factors are potentially co-limiting (e.g., nutrients, other population interactions, physical habitat, etc.) and better integrate resource subsidies into consumer–resource, biophysical dynamics models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loo, Jr., Billy W.
2000-06-01
The study of the exocrine pancreatic acinar cell has been central to the development of models of many cellular processes, especially of protein transport and secretion. Traditional methods used to examine this system have provided a wealth of qualitative information from which mechanistic models have been inferred. However they have lacked the ability to make quantitative measurements, particularly of the distribution of protein in the cell, information critical for grounding of models in terms of magnitude and relative significance. This dissertation describes the development and application of new tools that were used to measure the protein content of the major intracellular compartments in the acinar cell, particularly the zymogen granule. Soft x-ray microscopy permits image formation with high resolution and contrast determined by the underlying protein content of tissue rather than staining avidity. A sample preparation method compatible with x-ray microscopy was developed and its properties evaluated. Automatic computerized methods were developed to acquire, calibrate, and analyze large volumes of x-ray microscopic images of exocrine pancreatic tissue sections. Statistics were compiled on the protein density of several organelles, and on the protein density, size, and spatial distribution of tens of thousands of zymogen granules. The results of these measurements, and how they compare to predictions of different models of protein transport, are discussed.
Gounder, Murugesan K.; Lin, Hongxia; Stein, Mark; Goodin, Susan; Bertino, Joseph R.; Kong, Ah-Ng Tony; DiPaola, Robert S.
2015-01-01
2-Deoxyglucose (2-DG), an analog of glucose, is widely used to interfere with glycolysis in tumor cells and is studied as a therapeutic approach in clinical trials. To evaluate the pharmacokinetics of 2-DG, we describe the development and validation of a sensitive HPLC fluorescence method for the quantitation of 2-DG in plasma. Plasma samples were deproteinized with methanol and the supernatant was dried at 45°C. The residues were dissolved in methanolic sodium acetate/boric acid solution. 2-DG and other monosaccharides were derivatized to 2-aminobenzoic acid derivatives in a single step in the presence of sodium cyanoborohydride at 80°C for 45 min. The analytes were separated on a YMC ODS C18 reversed-phase column using gradient elution. The excitation and emission wavelengths were set at 360 and 425 nm. The 2-DG calibration curves were linear over the range of 0.63 to 300 μg/mL with a limit of detection of 0.5 μg/mL. The assay provided satisfactory intra-day and inter-day precision with RSD less than 9.8%, and the accuracy ranged from 86.8% to 110.0%. The HPLC method is reproducible and suitable for the quantitation of 2-DG in plasma. The method was successfully applied to characterize the pharmacokinetic profile of 2-DG in patients with advanced solid tumors. PMID:21932382
The EPA Office of Water has recognized a critical need for tribes, states and federal agencies to be able to quantitatively assess the condition of the nations wetland resources. Currently, greater than 85% of states, tribes, and territories are lacking even rudimentary biologic...
Quantitative proteomics in cardiovascular research: global and targeted strategies
Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun
2014-01-01
Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501
Berniak, K; Rybak, P; Bernas, T; Zarębski, M; Biela, E; Zhao, H; Darzynkiewicz, Z; Dobrucki, J W
2013-10-01
A method of quantitative analysis of the spatial (3D) relationship between discrete nuclear events detected by confocal microscopy is described and applied in the analysis of the dependence between sites of DNA damage signaling (γH2AX foci) and DNA replication (EdU incorporation) in cells subjected to treatments with camptothecin (Cpt) or hydrogen peroxide (H2O2). Cpt induces γH2AX foci, likely reporting formation of DNA double-strand breaks (DSBs), almost exclusively at sites of DNA replication. This finding is consistent with the known mechanism of induction of DSBs by DNA topoisomerase I (topo1) inhibitors at the sites of collisions of the moving replication forks with topo1-DNA "cleavable complexes" stabilized by Cpt. Whereas an increased level of H2AX histone phosphorylation is seen in S phase of cells subjected to H2O2, only a minor proportion of γH2AX foci coincide with DNA replication sites. Thus, the increased level of H2AX phosphorylation induced by H2O2 is not a direct consequence of formation of DNA lesions at the sites of moving DNA replication forks. These data suggest that oxidative stress induced by H2O2 and formation of the primary H2O2-induced lesions (8-oxo-7,8-dihydroguanosine) inhibit replication globally and trigger formation of γH2AX at various distances from replication forks. Quantitative analysis of the frequency of DNA replication sites and γH2AX foci also suggests that stalling of replication forks by Cpt leads to activation of new DNA replication origins. © 2013 International Society for Advancement of Cytometry.
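The spatial analysis reduces to nearest-neighbor distances between two 3D point sets. A minimal sketch with a k-d tree is below; the colocalization radius and the synthetic foci are illustrative choices, not the paper's parameters.

```python
import numpy as np
from scipy.spatial import cKDTree

def fraction_colocalized(foci_xyz, replication_xyz, radius_um=0.3):
    """Fraction of gammaH2AX foci lying within `radius_um` of the nearest
    EdU replication site in 3D; the radius is an illustrative choice."""
    d, _ = cKDTree(replication_xyz).query(foci_xyz)   # nearest-neighbor dist
    return float(np.mean(d <= radius_um))

rng = np.random.default_rng(7)
replication = rng.uniform(0, 10, size=(300, 3))   # um, nucleus-sized box
on_forks = replication[:150] + rng.normal(0, 0.05, (150, 3))  # Cpt-like foci
anywhere = rng.uniform(0, 10, size=(150, 3))                  # H2O2-like foci
print(fraction_colocalized(on_forks, replication),   # ~1.0
      fraction_colocalized(anywhere, replication))   # ~0.03
```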
USDA-ARS's Scientific Manuscript database
Two advanced backcross populations were developed between a popular southern US tropical japonica rice (Oryza sativa L.) cultivar, Bengal, and two different Oryza nivara accessions (IRGC100898; IRGC104705) to identify quantitative trait loci (QTLs) related to sheath blight (SB) disease resistance. ...
Less label, more free: approaches in label-free quantitative mass spectrometry.
Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A
2011-02-01
In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
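Of the spectral counting approaches such a review covers, the normalized spectral abundance factor (NSAF) is among the simplest to state and compute: NSAF_i = (SpC_i / L_i) / Σ_j (SpC_j / L_j), where SpC is a protein's spectral count and L its length. A minimal sketch:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor, a common counting-based
    label-free relative quantitation measure:
    NSAF_i = (SpC_i / L_i) / sum_j (SpC_j / L_j)."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Three proteins: counts must be length-normalized before comparison.
print(nsaf(spectral_counts=[120, 30, 30], lengths=[600, 150, 300]))
# -> [0.4, 0.4, 0.2]: proteins 1 and 2 have equal relative abundance
#    despite a 4-fold difference in raw spectral counts.
```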
NASA Astrophysics Data System (ADS)
Chiarot, C. B.; Siewerdsen, J. H.; Haycocks, T.; Moseley, D. J.; Jaffray, D. A.
2005-11-01
Development, characterization, and quality assurance of advanced x-ray imaging technologies require phantoms that are quantitative and well suited to such modalities. This note reports on the design, construction, and use of an innovative phantom developed for advanced imaging technologies (e.g., multi-detector CT and the numerous applications of flat-panel detectors in dual-energy imaging, tomosynthesis, and cone-beam CT) in diagnostic and image-guided procedures. The design addresses shortcomings of existing phantoms by incorporating criteria satisfied by no other single phantom: (1) inserts are fully 3D—spherically symmetric rather than cylindrical; (2) modules are quantitative, presenting objects of known size and contrast for quality assurance and image quality investigation; (3) features are incorporated in ideal and semi-realistic (anthropomorphic) contexts; and (4) the phantom allows devices to be inserted and manipulated in an accessible module (right lung). The phantom consists of five primary modules: (1) head, featuring contrast-detail spheres approximating brain lesions; (2) left lung, featuring contrast-detail spheres approximating lung nodules; (3) right lung, an accessible hull in which devices may be placed and manipulated; (4) liver, featuring contrast-detail spheres approximating metastases; and (5) abdomen/pelvis, featuring simulated kidneys, colon, rectum, bladder, and prostate. The phantom represents a two-fold evolution in design philosophy—from 2D (cylindrically symmetric) to fully 3D, and from exclusively qualitative or quantitative to a design accommodating quantitative study within an anatomical context. It has proven a valuable tool in investigations throughout our institution, including low-dose CT, dual-energy radiography, and cone-beam CT for image-guided radiation therapy and surgery.
Achieving Integration in Mixed Methods Designs—Principles and Practices
Fetters, Michael D; Curry, Leslie A; Creswell, John W
2013-01-01
Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835
Achieving integration in mixed methods designs-principles and practices.
Fetters, Michael D; Curry, Leslie A; Creswell, John W
2013-12-01
Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.
[Progress in stable isotope labeled quantitative proteomics methods].
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2013-06-01
Quantitative proteomics is an important research field in post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the fast development in biology research. In this work, we discuss the progress in the stable isotope labeling methods for quantitative proteomics including relative and absolute quantitative proteomics, and then give our opinions on the outlook of proteome quantification methods.
Human judgment vs. quantitative models for the management of ecological resources.
Holden, Matthew H; Ellner, Stephen P
2016-07-01
Despite major advances in quantitative approaches to natural resource management, there has been resistance to using these tools in the actual practice of managing ecological populations. Given a managed system and a set of assumptions, translated into a model, optimization methods can be used to solve for the most cost-effective management actions. However, when the underlying assumptions are not met, such methods can potentially lead to decisions that harm the environment and economy. Managers who develop decisions based on past experience and judgment, without the aid of mathematical models, can potentially learn about the system and develop flexible management strategies. However, these strategies are often based on subjective criteria and equally invalid and often unstated assumptions. Given the drawbacks of both methods, it is unclear whether simple quantitative models improve environmental decision making over expert opinion. In this study, we explore how well students, using their experience and judgment, manage simulated fishery populations in an online computer game and compare their management outcomes to the performance of model-based decisions. We consider harvest decisions generated using four different quantitative models: (1) the model used to produce the simulated population dynamics observed in the game, with the values of all parameters known (as a control), (2) the same model, but with unknown parameter values that must be estimated during the game from observed data, (3) models that are structurally different from those used to simulate the population dynamics, and (4) a model that ignores age structure. Humans on average performed much worse than the models in cases 1-3, but in a small minority of scenarios, models produced worse outcomes than those resulting from students making decisions based on experience and judgment. When the models ignored age structure, they generated poorly performing management decisions, but still outperformed students using experience and judgment 66% of the time. © 2016 by the Ecological Society of America.
Wan, Cai-Feng; Liu, Xue-Song; Wang, Lin; Zhang, Jie; Lu, Jin-Song; Li, Feng-Hua
2018-06-01
To clarify whether the quantitative parameters of contrast-enhanced ultrasound (CEUS) can be used to predict pathological complete response (pCR) in patients with locally advanced breast cancer receiving neoadjuvant chemotherapy (NAC). Fifty-one patients with histologically proved locally advanced breast cancer scheduled for NAC were enrolled. The quantitative data for CEUS and the tumor diameter were collected at baseline and before surgery, and compared with the pathological response. Multiple logistic regression analysis was performed to examine quantitative parameters at CEUS and the tumor diameter to predict the pCR, and receiver operating characteristic (ROC) curve analysis was used as a summary statistic. Multiple logistic regression analysis revealed that PEAK (the maximum intensity of the time-intensity curve during bolus transit), PEAK%, TTP% (time to peak), and diameter% were significant independent predictors of pCR; the area under the ROC curve was 0.932 (Az1), with a sensitivity and specificity to predict pCR of 93.7% and 80.0%. The area under the ROC curve for the quantitative parameters was 0.927 (Az2), with a sensitivity and specificity of 81.2% and 94.3%. For diameter%, the area under the ROC curve was 0.786 (Az3), with a sensitivity and specificity of 93.8% and 54.3%. The values of Az1 and Az2 were significantly higher than that of Az3 (P = 0.027 and P = 0.034, respectively). However, there was no significant difference between the values of Az1 and Az2 (P = 0.825). Quantitative analysis of tumor blood perfusion with CEUS is superior to diameter% to predict pCR, and can be used as a functional technique to evaluate tumor response to NAC. Copyright © 2018. Published by Elsevier B.V.
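A hedged sketch of the analysis pattern the abstract describes, multiple logistic regression followed by ROC analysis, using scikit-learn on placeholder data (the feature matrix and labels below are random stand-ins, not the study's data):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve

# Hypothetical predictor matrix: columns standing in for PEAK%, TTP%, diameter%.
rng = np.random.default_rng(1)
X = rng.normal(size=(51, 3))
y = rng.integers(0, 2, size=51)          # 1 = pCR, 0 = non-pCR (placeholder labels)

model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
auc = roc_auc_score(y, scores)           # analogous to the reported Az values
fpr, tpr, thresholds = roc_curve(y, scores)
youden = np.argmax(tpr - fpr)            # one common cut-point choice
print(f"AUC={auc:.3f}  sens={tpr[youden]:.2f}  spec={1 - fpr[youden]:.2f}")
```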
Chambers, Andrew G.; Percy, Andrew J.; Yang, Juncong; Camenzind, Alexander G.; Borchers, Christoph H.
2013-01-01
Dried blood spot (DBS) sampling, coupled with multiple reaction monitoring mass spectrometry (MRM-MS), is a well-established approach for quantifying a wide range of small molecule biomarkers and drugs. This sampling procedure is simpler and less invasive than those required for traditional plasma or serum samples, enabling collection by minimally trained personnel. Many analytes are stable in the DBS format without refrigeration, which reduces the cost and logistical challenges of sample collection in remote locations. These advantages make DBS sample collection desirable for advancing personalized medicine through population-wide biomarker screening. Here we expand this technology by demonstrating the first multiplexed method for the quantitation of endogenous proteins in DBS samples. A panel of 60 abundant proteins in human blood was targeted by monitoring proteotypic tryptic peptides and their stable isotope-labeled analogs by MRM. Linear calibration curves were obtained for 40 of the 65 peptide targets, demonstrating that multiple proteins can be quantitatively extracted from DBS collection cards. The method was also highly reproducible, with a coefficient of variation of <15% for all 40 peptides. Overall, this assay quantified 37 proteins spanning a range of more than four orders of magnitude in concentration within a single 25 min LC/MRM-MS analysis. The protein abundances of the 33 proteins quantified in matching DBS and whole blood samples showed an excellent correlation, with a slope of 0.96 and an R2 value of 0.97. Furthermore, the measured concentrations for 80% of the proteins were stable for at least 10 days when stored at −20 °C, 4 °C and 37 °C. This work represents an important first step in evaluating the integration of DBS sampling with highly multiplexed MRM for quantitation of endogenous proteins. PMID:23221968
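The calibration-curve and reproducibility calculations behind such MRM assays reduce to simple arithmetic; a sketch with invented numbers (the concentrations, area ratios, and replicate values are illustrative, not the paper's):

```python
import numpy as np

# Hypothetical calibration data for one peptide: spiked concentration (fmol/µL)
# vs. peak-area ratio of the natural peptide to its stable isotope-labeled analog.
conc = np.array([0.5, 1, 5, 10, 50, 100], dtype=float)
ratio = np.array([0.012, 0.024, 0.13, 0.25, 1.21, 2.46])

slope, intercept = np.polyfit(conc, ratio, 1)   # linear calibration curve

def quantify(observed_ratio):
    """Back-calculate concentration from a measured area ratio."""
    return (observed_ratio - intercept) / slope

replicates = np.array([0.24, 0.26, 0.25, 0.23])        # repeat measurements
cv = 100 * replicates.std(ddof=1) / replicates.mean()  # coefficient of variation, %
print(quantify(0.25), f"CV = {cv:.1f}%")
```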
NanoDrop Microvolume Quantitation of Nucleic Acids
Desjardins, Philippe; Conklin, Deborah
2010-01-01
Biomolecular assays are continually being developed that use progressively smaller amounts of material, often precluding the use of conventional cuvette-based instruments for nucleic acid quantitation in favor of those that can perform microvolume quantitation. The NanoDrop microvolume sample retention system (Thermo Scientific NanoDrop Products) functions by combining fiber optic technology and natural surface tension properties to capture and retain minute amounts of sample independent of traditional containment apparatus such as cuvettes or capillaries. Furthermore, the system employs shorter path lengths, which result in a broad range of nucleic acid concentration measurements, essentially eliminating the need to perform dilutions. Reducing the volume of sample required for spectroscopic analysis also facilitates the inclusion of additional quality control steps throughout many molecular workflows, increasing efficiency and ultimately leading to greater confidence in downstream results. The need for high-sensitivity fluorescent analysis of limited mass has also emerged with recent experimental advances. Using the same microvolume sample retention technology, fluorescent measurements may be performed with 2 μL of material, allowing fluorescent assay volume requirements to be significantly reduced. Such microreactions of 10 μL or less are now possible using a dedicated microvolume fluorospectrometer. Two microvolume nucleic acid quantitation protocols will be demonstrated that use integrated sample retention systems as practical alternatives to traditional cuvette-based protocols. First, a direct A260 absorbance method using a microvolume spectrophotometer is described. This is followed by a demonstration of a fluorescence-based method that enables reduced-volume fluorescence reactions with a microvolume fluorospectrometer. These novel techniques enable the assessment of nucleic acid concentrations ranging from 1 pg/μL to 15,000 ng/μL with minimal consumption of sample. PMID:21189466
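The direct A260 method rests on the Beer-Lambert relation with the standard conversion factor of 50 ng/µL per absorbance unit for double-stranded DNA at a 10 mm path; a minimal sketch (the function name and inputs are hypothetical, and raw absorbance at the stated path is assumed):

```python
def dsdna_conc_ng_per_ul(a260, pathlength_mm=1.0, dilution=1.0):
    """Concentration from A260 via the Beer-Lambert relation.

    Uses the standard factor of 50 ng/µL per absorbance unit for
    double-stranded DNA at a 10 mm path, rescaled to the short path
    lengths used in microvolume instruments.
    """
    a260_10mm = a260 * (10.0 / pathlength_mm)   # normalize to a 10 mm path
    return a260_10mm * 50.0 * dilution

print(dsdna_conc_ng_per_ul(0.75, pathlength_mm=1.0))   # -> 375 ng/µL
```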
[Analysis and Control of in Vivo Kinetics of Exosomes for the Development of Exosome-based DDS].
Takahashi, Yuki; Nishikawa, Makiya; Takakura, Yoshinobu
2016-01-01
Exosomes are secretory membrane vesicles containing lipids, proteins, and nucleic acids. They act as intercellular transporters by delivering their components to exosome recipient cells. Based on their endogenous delivery system properties, exosomes are expected to become drug delivery systems (DDS) for various molecules such as nucleic acid-based drugs. Important factors such as drug loading into exosomes, production, and pharmacokinetics of exosomes need to be considered for the development of exosome-based DDS. Of these, the pharmacokinetics of exosomes have rarely been studied, probably because of the lack of quantitative evaluation methods for in vivo exosomal pharmacokinetics. We selected lactadherin as an exosome-tropic protein and developed it as a fusion protein with Gaussia luciferase to label exosomes for in vivo imaging. In addition, a fusion protein of lactadherin and streptavidin was developed, and the tissue distribution of exosomes was quantitatively evaluated by radiolabeling the exosomes using (125)I-labeled biotin. Using labeled exosomes, we found that intravenously injected exosomes were rapidly cleared from the systemic circulation by macrophages. In addition, the exosomes were mainly distributed to the liver, lung, and spleen. We also examined the effect of exosome isolation methods on their physicochemical and pharmacokinetic properties. We found that exosomes collected by the ultracentrifugation-based density-gradient method were more dispersed than exosomes collected by other methods, including the ultracentrifugation-based pelleting method. The gradient method is more time-consuming than the others; therefore, the development of a more efficient method for exosome isolation will advance the development of exosome-based DDS.
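Rapid clearance like that described is often summarized by fitting a one-compartment elimination model to the blood time course; a sketch with invented data points (not the study's measurements):

```python
import numpy as np

# Hypothetical blood-concentration time course (% of injected dose/mL) after
# intravenous injection of labeled exosomes; rapid macrophage-mediated clearance.
t = np.array([1, 5, 10, 30, 60], dtype=float)       # minutes
c = np.array([20.0, 8.5, 3.9, 0.4, 0.05])

# One-compartment model C(t) = C0 * exp(-k t); fit on the log scale.
slope, log_c0 = np.polyfit(t, np.log(c), 1)
half_life = np.log(2) / -slope
print(f"elimination half-life ≈ {half_life:.1f} min")
```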
Hiler, Daniel J.; Barabas, Marie E.; Griffiths, Lyra M.; Dyer, Michael A.
2017-01-01
Postmitotic differentiated neurons are among the most difficult cells to reprogram into induced pluripotent stem cells (iPSCs) because they have poor viability when cultured as dissociated cells. Other protocols to reprogram postmitotic neurons have required the inactivation of the p53 tumor suppressor. We describe a method that does not require p53 inactivation and induces reprogramming in cells purified from the retinae of reprogrammable mice in aggregates with wild-type retinal cells. After the first 10 days of reprogramming, the aggregates are then dispersed and plated on irradiated feeder cells to propagate and isolate individual iPSC clones. The reprogramming efficiency of different neuronal populations at any stage of development can be quantitated using this protocol. Reprogramming retinal neurons with this protocol will take 56 days, and these retina-derived iPSCs can undergo retinal differentiation to produce retinae in 34 days. In addition, we describe a quantitative assessment of retinal differentiation from these neuron-derived iPSCs called STEM-RET. The procedure quantitates eye field specification, optic cup formation, and retinal differentiation in 3-dimensional cultures using molecular, cellular and morphological criteria. An advanced level of cell culture experience is required to carry out this protocol. PMID:27658012
Cui, Xingye; Hu, Jie; Choi, Jane Ru; Huang, Yalin; Wang, Xuemin; Lu, Tian Jian; Xu, Feng
2016-09-07
A volumetric meter chip was developed for quantitative point-of-care (POC) analysis of bovine catalase, a bioindicator of bovine mastitis, in milk samples. The meter chip displays multiplexed quantitative results by presenting the distance of ink-bar advancement, which is readable by the naked eye. The meter chip comprises a poly(methyl methacrylate) (PMMA) layer, a double-sided adhesive (DSA) layer and a glass slide layer fabricated by a laser-etching method that is simple, rapid (∼3 min per chip), and cost-effective (∼$0.2 per chip). Specially designed "U shape" reaction cells are covered by an adhesive tape that serves as an on-off switch, enabling simple operation of the assay. As a proof of concept, we employed the developed meter chip for the quantification of bovine catalase in raw milk samples, detecting catalase concentrations as low as 20 μg/mL. The meter chip has great potential to detect various target analytes for a wide range of POC applications. Copyright © 2016 Elsevier B.V. All rights reserved.
Light sheet theta microscopy for rapid high-resolution imaging of large biological samples.
Migliori, Bianca; Datta, Malika S; Dupre, Christophe; Apak, Mehmet C; Asano, Shoh; Gao, Ruixuan; Boyden, Edward S; Hermanson, Ola; Yuste, Rafael; Tomer, Raju
2018-05-29
Advances in tissue clearing and molecular labeling methods are enabling unprecedented optical access to large intact biological systems. These developments fuel the need for high-speed microscopy approaches to image large samples quantitatively and at high resolution. While light sheet microscopy (LSM), with its high planar imaging speed and low photo-bleaching, can be effective, scaling up to larger imaging volumes has been hindered by the use of orthogonal light sheet illumination. To address this fundamental limitation, we have developed light sheet theta microscopy (LSTM), which uniformly illuminates samples from the same side as the detection objective, thereby eliminating limits on lateral dimensions without sacrificing the imaging resolution, depth, and speed. We present a detailed characterization of LSTM, and demonstrate its complementary advantages over LSM for rapid high-resolution quantitative imaging of large intact samples with high uniform quality. The reported LSTM approach is a significant step for the rapid high-resolution quantitative mapping of the structure and function of very large biological systems, such as a clarified thick coronal slab of human brain and uniformly expanded tissues, and also for rapid volumetric calcium imaging of highly motile animals, such as Hydra, undergoing non-isomorphic body shape changes.
Schlippenbach, Trixi von; Oefner, Peter J; Gronwald, Wolfram
2018-03-09
Non-uniform sampling (NUS) allows the accelerated acquisition of multidimensional NMR spectra. The aim of this contribution was the systematic evaluation of the impact of various quantitative NUS parameters on the accuracy and precision of 2D NMR measurements of urinary metabolites. Urine aliquots spiked with varying concentrations (15.6-500.0 µM) of tryptophan, tyrosine, glutamine, glutamic acid, lactic acid, and threonine, which can only be resolved fully by 2D NMR, were used to assess the influence of the sampling scheme, reconstruction algorithm, amount of omitted data points, and seed value on the quantitative performance of NUS in 1H,1H-TOCSY and 1H,1H-COSY45 NMR spectroscopy. Sinusoidal Poisson-gap sampling and a compressed sensing approach employing the iterative re-weighted least squares method for spectral reconstruction allowed a 50% reduction in measurement time while maintaining sufficient quantitative accuracy and precision for both types of homonuclear 2D NMR spectroscopy. Together with other advances in instrument design, such as state-of-the-art cryogenic probes, use of 2D NMR spectroscopy in large biomedical cohort studies seems feasible.
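Sinusoidal Poisson-gap sampling can be sketched in a few lines; the following is a simplified reconstruction of the idea (dense sampling of early increments, Poisson-distributed gaps weighted by a sine ramp), not the exact published scheduler, and the tuning loop is a crude assumption:

```python
import numpy as np

def sinusoidal_poisson_gap(n_total=256, fraction=0.5, seed=7):
    """Pick indirect-dimension increments with sinusoidally weighted Poisson gaps.

    Gaps between sampled points follow a Poisson law whose mean is smallest
    at the start of the FID (sine weight near 0), so early, high-signal
    increments are sampled densely. lam is rescaled until roughly
    n_total*fraction points are kept; may return a near-target schedule.
    """
    rng = np.random.default_rng(seed)
    target = int(n_total * fraction)
    lam = 2.0 * (1.0 - fraction) / fraction
    for _ in range(200):                       # adjust lam until the count matches
        points, i = [], 0
        while i < n_total:
            points.append(i)
            theta = np.pi * (i + 0.5) / (2 * n_total)
            i += 1 + rng.poisson(lam * np.sin(theta))
        if len(points) == target:
            return np.array(points)
        lam *= len(points) / target            # too many points -> larger gaps
    return np.array(points)

schedule = sinusoidal_poisson_gap()
print(len(schedule), schedule[:10])
```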
Yaginuma, Hideyuki; Kawai, Shinnosuke; Tabata, Kazuhito V.; Tomiyama, Keisuke; Kakizuka, Akira; Komatsuzaki, Tamiki; Noji, Hiroyuki; Imamura, Hiromi
2014-01-01
Recent advances in quantitative single-cell analysis have revealed large diversity in gene expression levels between individual cells, which could affect the physiology and/or fate of each cell. In contrast, for most metabolites, concentrations have only been measurable as ensemble averages of many cells. In living cells, adenosine triphosphate (ATP) is a critically important metabolite that powers many intracellular reactions. Quantitative measurement of the absolute ATP concentration in individual cells has not been achieved because of the lack of reliable methods. In this study, we developed a new genetically encoded ratiometric fluorescent ATP indicator “QUEEN”, which is composed of a single circularly permuted fluorescent protein and a bacterial ATP binding protein. Unlike previous FRET-based indicators, QUEEN was apparently insensitive to changes in bacterial growth rate. Importantly, intracellular ATP concentrations of large numbers of bacterial cells calculated from QUEEN fluorescence were almost equal to those from a firefly luciferase assay. Thus, QUEEN is suitable for quantifying the absolute ATP concentration inside bacterial cells. Finally, by imaging QUEEN signals from single cells, we found that, even for a genetically identical Escherichia coli cell population, absolute concentrations of intracellular ATP were significantly diverse between individual cells from the same culture. PMID:25283467
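Converting a ratiometric readout to an absolute concentration typically goes through an in vitro calibration; a hedged sketch assuming a Hill-type binding curve, with all calibration constants (r_min, r_max, kd_mM, n) as illustrative placeholders rather than QUEEN's published values:

```python
import numpy as np

def atp_from_ratio(r, r_min=0.4, r_max=2.0, kd_mM=1.0, n=2.0):
    """Convert a QUEEN-style excitation ratio to [ATP] via a Hill calibration.

    r_min/r_max are the ratios at zero and saturating ATP; kd_mM and n are
    the apparent affinity and cooperativity. All four are illustrative values
    that would come from an in vitro calibration of the purified sensor.
    """
    occ = np.clip((r - r_min) / (r_max - r_min), 1e-6, 1 - 1e-6)
    return kd_mM * (occ / (1.0 - occ)) ** (1.0 / n)   # inverse Hill equation

print(atp_from_ratio(1.5))   # single-cell ratio -> mM ATP
```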
[Perimetric changes in advanced glaucoma].
Feraru, Crenguta Ioana; Pantalon, Anca
2011-01-01
The evaluation of various perimetric aspects in advanced glaucoma stages correlated to morpho-functional changes. MATERIAL AND METHOD: Retrospective clinical trial over a 10-month period that included patients with advanced glaucoma stages, for whom several computerised visual field tests were recorded (central 24-2 strategy, 10-2 strategy with either III or V Goldmann stimulus spot size) along with other morpho-functional ocular parameters: VA, IOP, optic disc analysis. We included in our study 56 eyes from 45 patients. In most cases (89%) it was an open angle glaucoma (either primary or secondary). Mean visual acuity was 0.45 +/- 0.28. Regarding the perimetric deficit, 83% had advanced deficit, 9% moderate and 8% early visual changes. As to the perimetric type of defect, we found a majority with general reduction of sensitivity (33 eyes) plus ring-shaped scotoma. In 6 eyes (10.7%) having left only a central island of vision, we performed the central 10-2 strategy with III or V Goldmann stimulus spot size. Statistical analysis showed weak correlation between visual acuity and the quantitative perimetric parameters (MD and PSD), and variance analysis found a multiple correlation parameter of p = 0.07, which shows there is no linear correspondence between the morpho-functional parameters VA-MD(PSD) and C/D ratio. In advanced glaucoma stages, the perimetric changes are mostly severe. Perimetric evaluation is essential in these stages and needs to be individualised.
Magrane, Diane; Helitzer, Deborah; Morahan, Page; Chang, Shine; Gleason, Katharine; Cardinali, Gina; Wu, Chih-Chieh
2012-12-01
Surprisingly little research is available to explain the well-documented organizational and societal influences on persistent inequities in advancement of women faculty. The Systems of Career Influences Model is a framework for exploring factors influencing women's progression to advanced academic rank, executive positions, and informal leadership roles in academic medicine. The model situates faculty as agents within a complex adaptive system consisting of a trajectory of career advancement with opportunities for formal professional development programming; a dynamic system of influences of organizational policies, practices, and culture; and a dynamic system of individual choices and decisions. These systems of influence may promote or inhibit career advancement. Within this system, women weigh competing influences to make career advancement decisions, and leaders of academic health centers prioritize limited resources to support the school's mission. The Systems of Career Influences Model proved useful to identify key research questions. We used the model to probe how research in academic career development might be applied to content and methods of formal professional development programs. We generated a series of questions and hypotheses about how professional development programs might influence professional development of health science faculty members. Using the model as a guide, we developed a study using a quantitative and qualitative design. These analyses should provide insight into what works in recruiting and supporting productive men and women faculty in academic medical centers.
An Overview of Advanced SILAC-Labeling Strategies for Quantitative Proteomics.
Terzi, F; Cambridge, S
2017-01-01
Comparative, quantitative mass spectrometry of proteins provides great insight into protein abundance and function, but some molecular characteristics related to protein dynamics are not so easily obtained. Because the metabolic incorporation of stable amino acid isotopes allows the extraction of distinct temporal and spatial aspects of protein dynamics, the SILAC methodology is uniquely suited to be adapted for advanced labeling strategies. New SILAC strategies have emerged that allow deeper foraging into the complexity of cellular proteomes. Here, we review a few advanced SILAC-labeling strategies that have been published in the last years. Among them, different subsaturating-labeling as well as dual-labeling schemes are most prominent for a range of analyses including those of neuronal proteomes, secretion, or cell-cell-induced stimulations. These recent developments suggest that much more information can be gained from proteomic analyses if the labeling strategies are specifically tailored toward the experimental design. © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Heglund, Brian
Educators recognize the importance of reasoning ability for development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation, however, argumentation is not explicitly taught outside logic and philosophy---subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as self-reported sense of improvement in argument evaluation and low perceived value of the learning task, respectively. The discussion presents implications for practice and research, such as introducing motivation scaffolds to support appreciation of task value, and addressing major differences between the design of this study and similar published studies, respectively. This work provides contributions in that it tested the effect of Computer-Assisted Argumentation Mapping on the critical thinking skills of twelfth-grade students within the context of evaluating physics arguments, a previously unexplored age group and domain.
Recent advances in computational structural reliability analysis methods
NASA Astrophysics Data System (ADS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-10-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
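The simplest probabilistic alternative to a safety factor is a direct Monte Carlo estimate of the failure probability for a limit state g(X) = R - S; a minimal sketch with assumed normal distributions and illustrative parameters:

```python
import numpy as np

# Limit state g(X) = R - S: failure when the load S exceeds the resistance R.
rng = np.random.default_rng(42)
n = 1_000_000
resistance = rng.normal(loc=300.0, scale=30.0, size=n)   # e.g., member strength
load = rng.normal(loc=200.0, scale=40.0, size=n)         # e.g., applied stress

failures = resistance - load < 0.0
pf = failures.mean()                              # estimated failure probability
se = np.sqrt(pf * (1.0 - pf) / n)                 # Monte Carlo standard error
print(f"P_f ≈ {pf:.2e} ± {se:.1e}")               # analytic value here is Φ(-2) ≈ 0.023
```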
Cheng, Feixiong; Zhao, Junfei; Zhao, Zhongming
2016-07-01
Cancer is often driven by the accumulation of genetic alterations, including single nucleotide variants, small insertions or deletions, gene fusions, copy-number variations, and large chromosomal rearrangements. Recent advances in next-generation sequencing technologies have helped investigators generate massive amounts of cancer genomic data and catalog somatic mutations in both common and rare cancer types. So far, the somatic mutation landscapes and signatures of >10 major cancer types have been reported; however, pinpointing driver mutations and cancer genes from millions of available cancer somatic mutations remains a monumental challenge. To tackle this important task, many methods and computational tools have been developed during the past several years, and thus a review of these advances is urgently needed. Here, we first summarize the main features of these methods and tools for whole-exome, whole-genome and whole-transcriptome sequencing data. Then, we discuss major challenges, such as intra-tumor heterogeneity, tumor sample saturation and the functionality of synonymous mutations in cancer, all of which may result in false-positive discoveries. Finally, we highlight new directions in studying the regulatory roles of noncoding somatic mutations and in quantitatively measuring circulating tumor DNA in cancer. This review may help investigators find an appropriate tool for detecting potential driver or actionable mutations in rapidly emerging precision cancer medicine. © The Author 2015. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
Recent advances in computational structural reliability analysis methods
NASA Technical Reports Server (NTRS)
Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.
1993-01-01
The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
NASA Technical Reports Server (NTRS)
Ghaffarian, Reza; Evans, John W.
2014-01-01
For five decades, the semiconductor industry has distinguished itself by the rapid pace of improvement in the miniaturization of electronics products, following Moore's Law. Now scaling has hit a brick wall, forcing a paradigm shift. The industry roadmaps recognize the scaling limitation and project that packaging technologies will meet further miniaturization needs, a.k.a. "More than Moore". This paper presents packaging technology trends and the accelerated reliability testing methods currently being practiced. It then presents industry status on key advanced electronic packages, factors affecting accelerated solder joint reliability of area array packages, and IPC/JEDEC/Mil specifications for characterization of assemblies under accelerated thermal and mechanical loading. Finally, it presents an example demonstrating how accelerated testing and analysis have been effectively employed in the development of complex spacecraft, thereby reducing risk. Quantitative assessments necessarily involve the mathematics of probability and statistics. In addition, accelerated tests need to be designed with consideration of the desired risk posture and schedule for the particular project. Such assessments relieve risks without imposing additional costs and constraints that are not value-added for a particular mission. Furthermore, in the course of development of complex systems, variances and defects will inevitably present themselves and require a decision concerning their disposition, necessitating quantitative assessments. In summary, this paper presents a comprehensive viewpoint, from technology to systems, including the benefits and impact of accelerated testing in offsetting risk.
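For temperature-driven failure mechanisms, accelerated test planning often starts from the Arrhenius acceleration factor; a sketch (the 0.7 eV activation energy and temperatures are illustrative, and thermal-cycling fatigue of solder joints would instead use models such as Norris-Landzberg):

```python
import math

def arrhenius_af(ea_ev, t_use_c, t_stress_c):
    """Acceleration factor between use and stress temperatures (Arrhenius model)."""
    k = 8.617e-5                                  # Boltzmann constant, eV/K
    t_use, t_stress = t_use_c + 273.15, t_stress_c + 273.15
    return math.exp(ea_ev / k * (1.0 / t_use - 1.0 / t_stress))

# e.g., a 0.7 eV failure mechanism, 55 °C use vs. 125 °C accelerated test
print(f"AF ≈ {arrhenius_af(0.7, 55.0, 125.0):.0f}")
```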
Li, Qingli; Zhang, Jingfa; Wang, Yiting; Xu, Guoteng
2009-12-01
A molecular spectral imaging system has been developed based on microscopy and spectral imaging technology. The system is capable of acquiring molecular spectral images from 400 nm to 800 nm with 2 nm wavelength increments. The basic principles, instrumental systems, and system calibration method are introduced, along with applications for calculating stain uptake by tissues. As a case study, the system is used for determining the pathogenesis of diabetic retinopathy and evaluating the therapeutic effects of erythropoietin. Some molecular spectral images of retinal sections of normal, diabetic, and treated rats were collected and analyzed. The typical transmittance curves of positive spots stained for albumin and advanced glycation end products are retrieved from molecular spectral data with the spectral response calibration algorithm. To explore and evaluate the protective effect of erythropoietin (EPO) on retinal albumin leakage of streptozotocin-induced diabetic rats, an algorithm based on the Beer-Lambert law is presented. The algorithm can assess the uptake by histologic retinal sections of stains used in quantitative pathology to label albumin leakage and advanced glycation end products formation. Experimental results show that the system is helpful for the ophthalmologist to reveal the pathogenesis of diabetic retinopathy and explore the protective effect of erythropoietin on retinal cells of diabetic rats. It also highlights the potential of molecular spectral imaging technology to provide more effective and reliable diagnostic criteria in pathology.
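The stain-uptake calculation follows from the Beer-Lambert law: optical density is the negative log of transmittance and, at fixed path length, scales linearly with chromophore concentration. A minimal per-pixel sketch with random placeholder data:

```python
import numpy as np

def optical_density(transmittance):
    """Per-pixel optical density (absorbance) from a transmittance image."""
    t = np.clip(transmittance, 1e-4, 1.0)       # avoid log(0)
    return -np.log10(t)

# By Beer-Lambert, OD = epsilon * c * l, so for a fixed section thickness the
# OD at an absorption wavelength scales linearly with stain (albumin) uptake.
t_image = np.random.default_rng(3).uniform(0.2, 0.9, size=(64, 64))
od = optical_density(t_image)
relative_uptake = od.mean()                     # summary measure over the section
print(f"mean OD = {relative_uptake:.3f}")
```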
Analytical methods in sphingolipidomics: Quantitative and profiling approaches in food analysis.
Canela, Núria; Herrero, Pol; Mariné, Sílvia; Nadal, Pedro; Ras, Maria Rosa; Rodríguez, Miguel Ángel; Arola, Lluís
2016-01-08
In recent years, sphingolipidomics has emerged as an interesting omic science that encompasses the study of the full sphingolipidome characterization, content, structure and activity in cells, tissues or organisms. Like other omics, it has the potential to impact biomarker discovery, drug development and systems biology knowledge. Concretely, dietary food sphingolipids have gained considerable importance due to their extensively reported bioactivity. Because of the complexity of this lipid family and their diversity among foods, powerful analytical methodologies are needed for their study. The analytical tools developed in the past have been improved with the enormous advances made in recent years in mass spectrometry (MS) and chromatography, which allow the convenient and sensitive identification and quantitation of sphingolipid classes and form the basis of current sphingolipidomics methodologies. In addition, novel hyphenated nuclear magnetic resonance (NMR) strategies, new ionization strategies, and MS imaging are outlined as promising technologies to shape the future of sphingolipid analyses. This review traces the analytical methods of sphingolipidomics in food analysis concerning sample extraction, chromatographic separation, the identification and quantification of sphingolipids by MS and their structural elucidation by NMR. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sachan, Ritesh; Zhang, Yanwen; Ou, Xin
Here we demonstrate the enhanced imaging capabilities of an aberration-corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore-structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used to extract the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. A further use of the method is shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps establish the structure and geometry of the ion tracks quantitatively.
Sachan, Ritesh; Zhang, Yanwen; Ou, Xin; ...
2016-12-13
Here we demonstrate the enhanced imaging capabilities of an aberration-corrected scanning transmission electron microscope to advance the understanding of ion track structure in pyrochlore-structured materials (i.e., Gd2Ti2O7 and Gd2TiZrO7). Track formation occurs due to the inelastic transfer of energy from incident ions to electrons, and atomic-level details of track morphology as a function of energy loss are revealed in the present work. A comparison of imaging details obtained by varying the collection angles of the detectors is also discussed. A quantitative analysis of phase identification using high-angle annular dark field imaging is performed on the ion tracks. Finally, a novel 3-dimensional track reconstruction method is provided that is based on depth-dependent imaging of the ion tracks. The technique is used to extract the atomic-level details of nanoscale features, such as the disordered ion tracks, which are embedded in a relatively thick matrix. A further use of the method is shown by measuring the tilt of the ion tracks relative to the electron beam incidence, which helps establish the structure and geometry of the ion tracks quantitatively.
High Resolution Microendoscopy for Quantitative Diagnosis of Esophageal Neoplasia
NASA Astrophysics Data System (ADS)
Shin, Dongsuk
Esophageal cancer is the eighth most common cancer in the world. Cancers of the esophagus account for 3.8% of all cancer cases, with approximately 482,300 new cases reported in 2008 worldwide. In the United States alone, it is estimated that approximately 18,000 new cases will be diagnosed in 2013, and 15,210 deaths are expected. Advances in surgery and chemoradiation therapy have not led to a significant increase in survival rates, primarily because diagnosis often occurs at an advanced and incurable stage, when treatment is more difficult and less successful. Accurate, objective methods for early detection of esophageal neoplasia are needed. Here, quantitative classification algorithms for high resolution microendoscopic images were developed to distinguish between esophageal neoplastic and non-neoplastic tissue. A clinical study in 177 patients with esophageal squamous cell carcinoma (ESCC) was performed to evaluate the diagnostic performance of the classification algorithm in collaboration with the Mount Sinai Medical Center in the United States, the First Hospital of Jilin University in China, and the Cancer Institute and Hospital of the Chinese Academy of Medical Sciences in China. The study reported a sensitivity and specificity of 93% and 92%, respectively, in the training set; 87% and 97%, respectively, in the test set; and 84% and 95%, respectively, in an independent validation set. Another clinical study in 31 patients with Barrett's esophagus resulted in a sensitivity of 84% and a specificity of 85%. Finally, a compact, portable version of the high resolution microendoscopy (HRME) device using a consumer-grade camera was developed and a series of biomedical experimental studies were carried out to assess the capability of the device.
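Sensitivity and specificity as reported here come straight from the confusion matrix of the classifier's binary calls; a small self-contained sketch with made-up labels:

```python
import numpy as np

def sens_spec(y_true, y_pred):
    """Sensitivity and specificity from binary labels (1 = neoplastic)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative site-level calls from a quantitative image classifier
sens, spec = sens_spec([1, 1, 1, 0, 0, 0, 0, 1], [1, 1, 0, 0, 0, 1, 0, 1])
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```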
Veeraraghavan, Rengasayee; Gourdie, Robert G
2016-11-07
The spatial association between proteins is crucial to understanding how they function in biological systems. Colocalization analysis of fluorescence microscopy images is widely used to assess this. However, colocalization analysis performed on two-dimensional images with diffraction-limited resolution merely indicates that the proteins are within 200-300 nm of each other in the xy-plane and within 500-700 nm of each other along the z-axis. Here we demonstrate a novel three-dimensional quantitative analysis applicable to single-molecule positional data: stochastic optical reconstruction microscopy-based relative localization analysis (STORM-RLA). This method offers significant advantages: 1) STORM imaging affords 20-nm resolution in the xy-plane and <50 nm along the z-axis; 2) STORM-RLA provides a quantitative assessment of the frequency and degree of overlap between clusters of colabeled proteins; and 3) STORM-RLA also calculates the precise distances between both overlapping and nonoverlapping clusters in three dimensions. Thus STORM-RLA represents a significant advance in the high-throughput quantitative assessment of the spatial organization of proteins. © 2016 Veeraraghavan and Gourdie. This article is distributed by The American Society for Cell Biology under license from the author(s). Two months after publication it is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
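The distance computations at the core of an analysis like STORM-RLA are straightforward once cluster centroids are in hand; a sketch with hypothetical centroids and an assumed 50 nm overlap threshold (not the method's actual parameters):

```python
import numpy as np
from scipy.spatial.distance import cdist

# Hypothetical cluster centroids (nm) for two co-labeled proteins from a
# STORM localization analysis; each row is one cluster's (x, y, z) centroid.
clusters_a = np.array([[100.0, 220.0, 40.0], [480.0, 90.0, 130.0]])
clusters_b = np.array([[115.0, 240.0, 55.0], [900.0, 410.0, 60.0]])

d = cdist(clusters_a, clusters_b)           # all pairwise 3D distances, nm
nearest = d.min(axis=1)                     # each A cluster's nearest B cluster
overlap_frac = (nearest < 50.0).mean()      # fraction within a 50 nm threshold
print(nearest, overlap_frac)
```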
Critical Appraisal of Emergency Medicine Education Research: The Best Publications of 2014.
Yarris, Lalena M; Juve, Amy Miller; Coates, Wendy C; Fisher, Jonathan; Heitz, Corey; Shayne, Philip; Farrell, Susan E
2015-11-01
The objective was to critically appraise and highlight rigorous education research study articles published in 2014 whose outcomes advance the science of emergency medicine (EM) education. A search of the English language literature in 2014 querying Education Resources Information Center (ERIC), PsychINFO, PubMed, and Scopus identified 243 EM-related articles using either quantitative (hypothesis-testing or observational investigations of educational interventions) or qualitative (exploring important phenomena in EM education) methods. Two reviewers independently screened all of the publications using previously established exclusion criteria. Six reviewers then independently scored the 25 selected publications using either a qualitative or a quantitative scoring system. Each scoring system consisted of nine criteria. Selected criteria were based on accepted educational review literature and chosen a priori. Both scoring systems use parallel scoring metrics and have been used previously within this annual review. Twenty-five medical education research papers (22 quantitative, three qualitative) met the criteria for inclusion and were reviewed. Five quantitative and two qualitative studies were ranked most highly by the reviewers as exemplary and are summarized in this article. This annual critical appraisal series highlights seven excellent EM education research studies, meeting a priori criteria and published in 2014. Methodologic strengths in the 2014 papers are noted, and current trends in medical education research in EM are discussed. © 2015 by the Society for Academic Emergency Medicine.
Myers, David S.; Ivanova, Pavlina T.; Milne, Stephen B.; Brown, H. Alex
2012-01-01
As technology expands what it is possible to accurately measure, so too the challenges faced by modern mass spectrometry applications expand. A high level of accuracy in lipid quantitation across thousands of chemical species simultaneously is demanded. While relative changes in lipid amounts with varying conditions may provide initial insights or point to novel targets, there are many questions that require determination of lipid analyte absolute quantitation. Glycerophospholipids present a significant challenge in this regard, given the headgroup diversity, large number of possible acyl chain combinations, and vast range of ionization efficiency of species. Lipidomic output is being used more often not just for profiling of the masses of species, but also for highly-targeted flux-based measurements which put additional burdens on the quantitation pipeline. These first two challenges bring into sharp focus the need for a robust lipidomics workflow including deisotoping, differentiation from background noise, use of multiple internal standards per lipid class, and the use of a scriptable environment in order to create maximum user flexibility and maintain metadata on the parameters of the data analysis as it occurs. As lipidomics technology develops and delivers more output on a larger number of analytes, so must the sophistication of statistical post-processing also continue to advance. High-dimensional data analysis methods involving clustering, lipid pathway analysis, and false discovery rate limitation are becoming standard practices in a maturing field. PMID:21683157
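The class-matched internal-standard step the authors describe amounts to a simple ratio calculation per analyte; a sketch in which the function name, areas, and response factor are illustrative:

```python
import numpy as np

def quantify_with_is(analyte_area, is_area, is_conc_pmol, response_factor=1.0):
    """Single-point internal-standard quantitation common in lipidomics.

    conc = (A_analyte / A_IS) * C_IS / RF, applied after deisotoping and
    noise filtering; one (ideally several) class-matched internal standard
    per lipid class corrects for class-specific ionization efficiency.
    """
    return (analyte_area / is_area) * is_conc_pmol / response_factor

areas = np.array([1.2e6, 3.4e5, 8.9e5])          # analyte peak areas (one class)
print(quantify_with_is(areas, is_area=5.0e5, is_conc_pmol=10.0))
```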
Quantitative (q)AOP for aromatase inhibition as case study to advance qAOP development practices
Here we describe how “read across” of a quantitative adverse outcome pathway (qAOP) developed with data for one chemical can be used to screen impacts of other chemicals. We developed a qAOP starting with inhibition of CYP19A (aromatase) in fathead minnows (FHM) as th...
Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turteltaub, K. W.; Bench, G.; Buchholz, B. A.
The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes, and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: (1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates, specifically by developing and validating methods for macromolecule labeling, characterization, and quantitation; (2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies, specifically by demonstrating robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; (3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory instruments; and (4) provide high-throughput 14C BioAMS analysis for collaborative and service clients.
Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turteltaub, K. W.; Bench, G.; Buchholz, B. A.
2016-04-08
The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes, and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: (1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates, specifically by developing and validating methods for macromolecule labeling, characterization, and quantitation; (2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies, specifically by demonstrating robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; (3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory instruments; and (4) provide high-throughput 14C BioAMS analysis for collaborative and service clients.
Flow Pattern Phenomena in Two-Phase Flow in Microchannels
NASA Astrophysics Data System (ADS)
Keska, Jerry K.; Simon, William E.
2004-02-01
Space transportation systems require high-performance thermal protection and fluid management techniques for systems ranging from cryogenic fluid management devices to primary structures and propulsion systems exposed to extremely high temperatures, as well as for other space systems such as cooling or environment control for advanced space suits and integrated circuits. Although considerable developmental effort is being expended to bring potentially applicable technologies to a readiness level for practical use, new and innovative methods are still needed. One such method is the concept of Advanced Micro Cooling Modules (AMCMs), which are essentially compact two-phase heat exchangers constructed of microchannels and designed to remove large amounts of heat rapidly from critical systems by incorporating phase transition. The development of AMCMs requires fundamental technological advancement in many areas, including: (1) development of measurement methods/systems for flow-pattern measurement/identification for two-phase mixtures in microchannels; (2) development of a phenomenological model for two-phase flow which includes the quantitative measure of flow patterns; and (3) database development for multiphase heat transfer/fluid dynamics flows in microchannels. This paper focuses on the results of experimental research in the phenomena of two-phase flow in microchannels. The work encompasses both an experimental and an analytical approach to incorporating flow patterns for air-water mixtures flowing in a microchannel, which are necessary tools for the optimal design of AMCMs. Specifically, the following topics are addressed: (1) design and construction of a sensitive test system for two-phase flow in microchannels, one which measures ac and dc components of in-situ physical mixture parameters including spatial concentration using concomitant methods; (2) data acquisition and analysis in the amplitude, time, and frequency domains; and (3) analysis of results including evaluation of data acquisition techniques and their validity for application in flow pattern determination.
FDTD-based quantitative analysis of terahertz wave detection for multilayered structures.
Tu, Wanli; Zhong, Shuncong; Shen, Yaochun; Zhou, Qing; Yao, Ligang
2014-10-01
Experimental investigations have shown that terahertz pulsed imaging (TPI) is able to quantitatively characterize a range of multilayered media (e.g., biological tissues, pharmaceutical tablet coatings, layered polymer composites, etc.). Advanced modeling of the interaction of terahertz radiation with a multilayered medium is required to enable the wide application of terahertz technology in a number of emerging fields, including nondestructive testing. Indeed, many theoretical analyses have already been performed on the propagation of terahertz radiation in various multilayered media. However, to date, most of these studies used 1D or 2D models, and the dispersive nature of the dielectric layers was not considered or was simplified. In the present work, the theoretical framework for using terahertz waves for the quantitative characterization of multilayered media was established. A 3D model based on the finite difference time domain (FDTD) method is proposed. A batch of pharmaceutical tablets with a single coating layer of different coating thicknesses and different refractive indices was modeled. The reflected terahertz wave from such a sample was computed using the FDTD method, assuming that the incident terahertz wave is broadband, covering a frequency range up to 3.5 THz. The simulated results for all of the pharmaceutical-coated tablets considered were found to be in good agreement with the experimental results obtained using a commercial TPI system. In addition, we studied a three-layered medium to mimic the occurrence of defects in the sample.
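For intuition about the FDTD core, a 1D toy version (normalized units, a single non-dispersive coating layer, no absorbing boundaries) shows the update structure; the model in the paper is 3D and dispersive, so this is only a sketch:

```python
import numpy as np

# Minimal 1D FDTD sketch: a pulse crossing a single dielectric coating layer.
# Normalized Yee updates with unit Courant number; real THz tablet models are
# 3D and include material dispersion, so this only shows the update structure.
nz, steps = 400, 700
ez, hy = np.zeros(nz), np.zeros(nz)
eps = np.ones(nz)
eps[200:240] = 2.25              # coating layer with refractive index n = 1.5

refl = 0.0
for t in range(steps):
    hy[:-1] += ez[1:] - ez[:-1]                 # H update from the curl of E
    ez[1:] += (hy[1:] - hy[:-1]) / eps[1:]      # E update from the curl of H
    ez[20] += np.exp(-((t - 60.0) / 20.0) ** 2) # soft Gaussian source
    if 300 < t < 600:
        refl = max(refl, abs(ez[100]))          # probe the layer's reflection

print(f"peak reflected amplitude ≈ {refl:.3f}")
```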
Lee, Chang-Ro; Lee, Jung Hun; Park, Kwang Seung; Jeong, Byeong Chul; Lee, Sang Hee
2015-01-01
The increase of methicillin-resistant Staphylococcus aureus (MRSA) and vancomycin-resistant Enterococcus (VRE) poses a worldwide and serious health threat. Although new antibiotics, such as daptomycin and linezolid, have been developed for the treatment of infections of Gram-positive pathogens, the emergence of daptomycin-resistant and linezolid-resistant strains during therapy has now increased clinical treatment failures. In the past few years, studies using quantitative proteomic methods have provided a considerable progress in understanding antibiotic resistance mechanisms. In this review, to understand the resistance mechanisms to four clinically important antibiotics (methicillin, vancomycin, linezolid, and daptomycin) used in the treatment of Gram-positive pathogens, we summarize recent advances in studies on resistance mechanisms using quantitative proteomic methods, and also examine proteins playing an important role in the bacterial mechanisms of resistance to the four antibiotics. Proteomic researches can identify proteins whose expression levels are changed in the resistance mechanism to only one antibiotic, such as LiaH in daptomycin resistance and PrsA in vancomycin resistance, and many proteins simultaneously involved in resistance mechanisms to various antibiotics. Most of resistance-related proteins, which are simultaneously associated with resistance mechanisms to several antibiotics, play important roles in regulating bacterial envelope biogenesis, or compensating for the fitness cost of antibiotic resistance. Therefore, proteomic data confirm that antibiotic resistance requires the fitness cost and the bacterial envelope is an important factor in antibiotic resistance. PMID:26322035
Online Statistical Modeling (Regression Analysis) for Independent Responses
NASA Astrophysics Data System (ADS)
Made Tirta, I.; Anggraeni, Dian; Pandutama, Martinus
2017-06-01
Regression analysis (statistical modelling) is among the statistical methods most frequently needed in analyzing quantitative data, especially to model the relationship between response and explanatory variables. Nowadays, statistical models have been developed in various directions to model various types of complex relationships in data. A rich variety of advanced and recent statistical models is available mostly in open source software (one of them being R). However, these advanced statistical models are not very friendly to novice R users, since they are based on programming scripts or a command line interface. Our research aims to develop a web interface (based on R and Shiny), so that the most recent and advanced statistical models are readily available, accessible and applicable on the web. We previously made an interface in the form of an e-tutorial for several modern and advanced statistical models in R, especially for independent responses (including linear models/LM, generalized linear models/GLM, generalized additive models/GAM and generalized additive models for location scale and shape/GAMLSS). In this research we unified them in the form of data analysis, including models using computer-intensive statistics (bootstrap and Markov chain Monte Carlo/MCMC). All are readily accessible on our online Virtual Statistics Laboratory. The web interface makes statistical modelling easier to apply and easier to compare in order to find the most appropriate model for the data.
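The abstract's models are exposed through R and Shiny; as a language-neutral illustration of the GLM class it lists, here is an analogous Poisson regression fit in Python with statsmodels on simulated data (variable names and coefficients are arbitrary):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: a Poisson-count response modeled by one predictor,
# analogous to the GLM family the abstract's web interface exposes.
rng = np.random.default_rng(0)
x = rng.uniform(0, 2, size=200)
y = rng.poisson(np.exp(0.5 + 0.8 * x))

X = sm.add_constant(x)
glm = sm.GLM(y, X, family=sm.families.Poisson()).fit()
print(glm.params)        # estimates of the intercept and slope (0.5, 0.8)
```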
Engineering and physical sciences in oncology: challenges and opportunities
Mitchell, Michael J.; Jain, Rakesh K.; Langer, Robert
2017-01-01
The principles of engineering and physics have been applied to oncology for nearly 50 years. Engineers and physical scientists have made contributions to all aspects of cancer biology, from quantitative understanding of tumour growth and progression to improved detection and treatment of cancer. Many early efforts focused on experimental and computational modelling of drug distribution, cell cycle kinetics and tumour growth dynamics. In the past decade, we have witnessed exponential growth at the interface of engineering, physics and oncology that has been fuelled by advances in fields including materials science, microfabrication, nanomedicine, microfluidics and imaging, and catalysed by new programmes at the National Institutes of Health (NIH), including the National Institute of Biomedical Imaging and Bioengineering (NIBIB), Physical Sciences in Oncology, and the National Cancer Institute (NCI) Alliance for Nanotechnology. Here, we review the advances made at the interface of engineering and physical sciences and oncology in four important areas: the physical microenvironment of the tumour; technological advances in drug delivery; cellular and molecular imaging; and microfluidics and microfabrication. We discuss the research advances, opportunities and challenges for integrating engineering and physical sciences with oncology to develop new methods to study, detect and treat cancer, and we also describe the future outlook for these emerging areas. PMID:29026204
Quantitative imaging as cancer biomarker
NASA Astrophysics Data System (ADS)
Mankoff, David A.
2015-03-01
The ability to assay tumor biologic features and the impact of drugs on tumor biology is fundamental to drug development. Advances in our ability to measure genomics, gene expression, protein expression, and cellular biology have led to a host of new targets for anticancer drug therapy. In translating new drugs into clinical trials and clinical practice, these same assays serve to identify patients most likely to benefit from specific anticancer treatments. As cancer therapy becomes more individualized and targeted, there is an increasing need to characterize tumors and identify therapeutic targets to select therapy most likely to be successful in treating the individual patient's cancer. Thus far assays to identify cancer therapeutic targets or anticancer drug pharmacodynamics have been based upon in vitro assay of tissue or blood samples. Advances in molecular imaging, particularly PET, have led to the ability to perform quantitative non-invasive molecular assays. Imaging has traditionally relied on structural and anatomic features to detect cancer and determine its extent. More recently, imaging has expanded to include the ability to image regional biochemistry and molecular biology, often termed molecular imaging. Molecular imaging can be considered an in vivo assay technique, capable of measuring regional tumor biology without perturbing it. This makes molecular imaging a unique tool for cancer drug development, complementary to traditional assay methods, and a potentially powerful method for guiding targeted therapy in clinical trials and clinical practice. The ability to quantify, in absolute measures, regional in vivo biologic parameters strongly supports the use of molecular imaging as a tool to guide therapy. This review summarizes current and future applications of quantitative molecular imaging as a biomarker for cancer therapy, including the use of imaging to (1) identify patients whose tumors express a specific therapeutic target; (2) determine whether the drug reaches the target; (3) identify an early response to treatment; and (4) predict the impact of therapy on long-term outcomes such as survival. The manuscript reviews basic concepts important in the application of molecular imaging to cancer drug therapy, in general, and will discuss specific examples of studies in humans, and highlight future directions, including ongoing multi-center clinical trials using molecular imaging as a cancer biomarker.
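One of the simplest absolute quantitative measures used in PET imaging of the kind discussed here is the standardized uptake value; a sketch of the arithmetic (the input values are illustrative):

```python
def suv(tissue_kbq_per_ml, injected_mbq, weight_kg):
    """Standardized uptake value, the basic quantitative PET measure.

    SUV = tissue activity concentration / (injected dose / body weight);
    with kBq/mL, MBq, and kg the unit factors cancel (1 g/mL tissue assumed).
    """
    return tissue_kbq_per_ml / (injected_mbq * 1000.0 / (weight_kg * 1000.0))

print(suv(tissue_kbq_per_ml=12.0, injected_mbq=370.0, weight_kg=70.0))  # ≈ 2.3
```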
3D reconstruction of cystoscopy videos for comprehensive bladder records
Lurie, Kristen L.; Angst, Roland; Zlatev, Dimitar V.; Liao, Joseph C.; Ellerbee Bowden, Audrey K.
2017-01-01
White light endoscopy is widely used for diagnostic imaging of the interior of organs and body cavities, but the inability to correlate individual 2D images with 3D organ morphology limits its utility for quantitative or longitudinal studies of disease physiology or cancer surveillance. As a result, most endoscopy videos, which carry enormous data potential, are used only for real-time guidance and are discarded after collection. We present a computational method to reconstruct and visualize a 3D model of organs from an endoscopic video that captures the shape and surface appearance of the organ. A key aspect of our strategy is the use of advanced computer vision techniques and unmodified, clinical-grade endoscopy hardware with few constraints on the image acquisition protocol, which presents a low barrier to clinical translation. We validate the accuracy and robustness of our reconstruction and co-registration method using cystoscopy videos from tissue-mimicking bladder phantoms and show clinical utility during cystoscopy in the operating room for bladder cancer evaluation. As our method can powerfully augment the visual medical record of the appearance of internal organs, it is broadly applicable to endoscopy and represents a significant advance in cancer surveillance opportunities for big-data cancer research. PMID:28736658
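The abstract above does not spell out the vision pipeline, but a standard first step in video-based 3D reconstruction is detecting and matching features between consecutive frames. A minimal sketch using OpenCV follows; the ORB detector and brute-force matcher are illustrative assumptions, not components stated in the paper.

```python
import cv2

def match_consecutive_frames(frame_a, frame_b, max_matches=200):
    """Detect ORB keypoints in two endoscopy frames and return the
    best descriptor matches -- the raw material for pose estimation
    and surface reconstruction."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(frame_a, None)
    kp_b, des_b = orb.detectAndCompute(frame_b, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_a, des_b), key=lambda m: m.distance)
    return kp_a, kp_b, matches[:max_matches]
```

Matched keypoints would then feed essential-matrix estimation and triangulation in a full structure-from-motion pipeline.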
Advancement via Individual Determination: A Model for Equity in Secondary Mathematics
ERIC Educational Resources Information Center
Hodges, Cynthia D.
2013-01-01
This study examined the impact of Advancement Via Individual Determination (AVID) methodologies on the mathematics achievement of African American, European American, and Hispanic students as measured by the State of Texas Assessment of Academic Readiness (STAAR) End of Course (EOC) for Algebra I. This quantitative nonexperimental ex post facto…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-12
... Biologics Evaluation and Research (HFM-210), Food and Drug Administration, 1401 Rockville Pike, Suite 200N... Person) at least 7 days in advance. SUPPLEMENTARY INFORMATION: Quantitative risk assessments (QRAs) are... and maintaining critical relationships both within the Center for Biologics Evaluation and Research...
USDA-ARS?s Scientific Manuscript database
Greenbug infestations of sorghum can cause severe, above-economic-threshold damage in the Great Plains of the United States. This study aimed to identify quantitative trait loci (QTL) and potential candidate genes residing within the QTL region responsible for greenbug resistance in an advanced ma...
NASA Astrophysics Data System (ADS)
Jézéquel, Tangi; Silvestre, Virginie; Dinis, Katy; Giraudeau, Patrick; Akoka, Serge
2018-04-01
Isotope ratio monitoring by 13C NMR spectrometry (irm-13C NMR) provides the complete 13C intramolecular position-specific composition at natural abundance. It represents a powerful tool to track the (bio)chemical pathway which has led to the synthesis of targeted molecules, since it allows Position-specific Isotope Analysis (PSIA). Due to the very small composition range of 13C natural abundance values (about 50‰, the range over which the isotopic composition of a given nucleus varies), irm-13C NMR requires 1‰ accuracy and thus highly quantitative analysis by 13C NMR. Until now, the conventional strategy to determine the position-specific abundance xi has relied on the combination of irm-MS (isotope ratio monitoring Mass Spectrometry) and quantitative 13C NMR. However, this approach presents a serious drawback since it relies on two different techniques and requires separately measuring the signals of all the carbons of the analyzed compound, which is not always possible. To circumvent this constraint, we recently proposed a new methodology to perform 13C isotopic analysis using an internal reference method and relying on NMR only. The method combines a highly quantitative 1H NMR pulse sequence (named DWET) with a 13C isotopic NMR measurement. However, the recently published DWET sequence is unsuited for samples with short T1, which forms a serious limitation for irm-13C NMR experiments where a relaxing agent is added. In this context, we suggest two variants of the DWET, called Multi-WET and Profiled-WET, developed and optimized to reach the same 1‰ accuracy with better immunity to T1 variations. Their performance is evaluated on the determination of the 13C isotopic profile of vanillin. Both pulse sequences show 1‰ accuracy with increased robustness to pulse miscalibrations compared with the initial DWET method. This constitutes a major advance in the context of irm-13C NMR since it is now possible to perform isotopic analysis with high relaxing agent concentrations, leading to a strong reduction of the overall experiment time.
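For orientation, position-specific isotope analysis conventionally relates each site-specific ratio to the NMR signal fractions and the molecule-averaged ratio from irm-MS; a standard form (not quoted from this abstract) is

$$R_i = n \, f_i \, R_{\mathrm{mol}}, \qquad \delta^{13}\mathrm{C}_i = \left(\frac{R_i}{R_{\mathrm{VPDB}}} - 1\right) \times 1000\ \text{‰},$$

where \(f_i\) is the fraction of the total quantitative 13C NMR signal at carbon position \(i\), \(n\) is the number of carbon positions, \(R_{\mathrm{mol}}\) is the molecule-averaged 13C/12C ratio, and \(R_{\mathrm{VPDB}}\) is the international reference ratio. The 1‰ accuracy requirement falls on the measurement of the \(f_i\).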
de Sisternes, Luis; Jonna, Gowtham; Moss, Jason; Marmor, Michael F.; Leng, Theodore; Rubin, Daniel L.
2017-01-01
This work introduces and evaluates an automated intra-retinal segmentation method for spectral-domain optical coherence tomography (SD-OCT) retinal images. While quantitative assessment of retinal features in SD-OCT data is important, manual segmentation is extremely time-consuming and subjective. We address challenges that have hindered prior automated methods, including poor performance with diseased retinas relative to healthy retinas, and data smoothing that obscures image features such as small retinal drusen. Our novel segmentation approach is based on the iterative adaptation of a weighted median process, wherein a three-dimensional weighting function is defined according to image intensity and gradient properties, and a set of smoothness constraints and pre-defined rules are considered. We compared the segmentation results for 9 outlines associated with intra-retinal boundaries to those drawn by hand by two retinal specialists and to those produced by an independent state-of-the-art automated software tool in a set of 42 clinical images (from 14 patients). These images were obtained with a Zeiss Cirrus SD-OCT system, including healthy, early or intermediate AMD, and advanced AMD eyes. As a qualitative evaluation of accuracy, a highly experienced third independent reader blindly rated the quality of the outlines produced by each method. The accuracy and image detail of our method was superior in healthy and early or intermediate AMD eyes (98.15% and 97.78% of results not needing substantial editing) to the automated method we compared against. While the performance was not as good in advanced AMD (68.89%), it was still better than the manual outlines or the comparison method (which failed in such cases). We also tested our method's performance on images acquired with a different SD-OCT manufacturer, collected from a large publicly available data set (114 healthy and 255 AMD eyes), and compared the data quantitatively to reference standard markings of the internal limiting membrane and inner boundary of the retinal pigment epithelium, producing a mean unsigned positioning error of 6.04 ± 7.83 µm (mean under 2 pixels). Our automated method should be applicable to data from different OCT manufacturers and offers detailed layer segmentations in healthy and AMD eyes. PMID:28663874
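The iterative core named above is a weighted median. As a minimal illustrative sketch of that primitive only (the paper's three-dimensional weighting function and smoothness constraints are not reproduced here):

```python
import numpy as np

def weighted_median(values, weights):
    """Weighted median: the value at which the cumulative weight
    first reaches half of the total weight."""
    order = np.argsort(values)
    v = np.asarray(values, dtype=float)[order]
    w = np.asarray(weights, dtype=float)[order]
    cum = np.cumsum(w)
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

# A low-weight outlier barely shifts the estimate:
print(weighted_median([3.0, 1.0, 2.0, 10.0], [1.0, 1.0, 1.0, 0.1]))  # -> 2.0
```

In an intensity- and gradient-weighted variant, the weights would be derived from local image properties around each candidate boundary position.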
Qualitative and Quantitative Distinctions in Personality Disorder
Wright, Aidan G. C.
2011-01-01
The “categorical-dimensional debate” has catalyzed a wealth of empirical advances in the study of personality pathology. However, this debate is merely one articulation of a broader conceptual question regarding whether to define and describe psychopathology as a quantitatively extreme expression of normal functioning or as qualitatively distinct in its process. In this paper I argue that dynamic models of personality (e.g., object-relations, cognitive-affective processing system) offer the conceptual scaffolding to reconcile these seemingly incompatible approaches to characterizing the relationship between normal and pathological personality. I propose that advances in personality assessment that sample behavior and experiences intensively provide the empirical techniques, whereas interpersonal theory offers an integrative theoretical framework, for accomplishing this goal. PMID:22804676
The use of continuous culture in systems biology investigations.
Winder, Catherine L; Lanthaler, Karin
2011-01-01
When acquiring data for systems biology studies, it is essential to perform the experiments in controlled and reproducible conditions. Advances in the fields of proteomics and metabolomics allow the quantitative analysis of the components of the biological cell. It is essential to include a method in the experimental pipeline to culture the biological system in controlled and reproducible conditions to facilitate the acquisition of high-quality data. The employment of continuous culture methods for the growth of microorganisms is an ideal tool to achieve these objectives. This chapter will review the continuous culture approaches which may be applied in such studies, outline the experimental options which should be considered, and describe the approach applied in the production of steady-state cultures of Saccharomyces cerevisiae. Copyright © 2011 Elsevier Inc. All rights reserved.
Evidence of an application of a variable MEMS capacitive sensor for detecting shunt occlusions
NASA Astrophysics Data System (ADS)
Apigo, David J.; Bartholomew, Philip L.; Russell, Thomas; Kanwal, Alokik; Farrow, Reginald C.; Thomas, Gordon A.
2017-04-01
A sensor was tested subdurally and in vitro, simulating a supine infant with a ventriculoperitoneal shunt and controlled occlusions. The variable MEMS capacitive device is able to detect and forecast blockages, in a manner analogous to early-detection procedures for cancer patients. For example, with gradual occlusion development over a year, the method forecasts danger more than one month ahead of blockage. The method also distinguishes between ventricular and peritoneal occlusions. Because the sensor provides quantitative data on the dynamics of the cerebrospinal fluid, it can help test new therapies and advance the understanding of hydrocephalus as well as idiopathic normal pressure hydrocephalus. The sensor appears to be a substantial advance in the management of brain injuries treated with shunts and has the potential to make a significant impact in a clinical setting.
Advances in the use of observed spatial patterns of catchment hydrological response
NASA Astrophysics Data System (ADS)
Grayson, Rodger B.; Blöschl, Günter; Western, Andrew W.; McMahon, Thomas A.
Over the past two decades there have been repeated calls for the collection of new data for use in developing hydrological science. The last few years have begun to bear fruit from the seeds sown by these calls, through increases in the availability and utility of remote sensing data, as well as the execution of campaigns in research catchments aimed at providing new data for advancing hydrological understanding and predictive capability. In this paper we discuss some philosophical considerations related to model complexity, data availability and predictive performance, highlighting the potential of observed patterns in moving the science and practice of catchment hydrology forward. We then review advances that have arisen from recent work on spatial patterns, including the characterisation of spatial structure and heterogeneity, and the use of patterns for developing, calibrating and testing distributed hydrological models. We illustrate progress via examples using observed patterns of snow cover, runoff occurrence and soil moisture. Methods for the comparison of patterns are presented, illustrating how they can be used to assess hydrologically important characteristics of model performance. These methods include point-to-point comparisons, spatial relationships between errors and landscape parameters, transects, and optimal local alignment. It is argued that the progress made to date augurs well for future developments, but there is scope for improvements in several areas. These include better quantitative methods for pattern comparisons, better use of pattern information in data assimilation and modelling, and a call for improved archiving of data from field studies to assist in comparative studies for generalising results and developing fundamental understanding.
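As one plausible reading of the point-to-point comparisons mentioned above, the sketch below scores a simulated spatial pattern against an observed one on a common grid; the function name and the NaN-masking convention are assumptions, not taken from the paper.

```python
import numpy as np

def point_to_point_stats(observed, simulated):
    """Compare two spatial patterns cell by cell: bias, RMSE, and
    Pearson correlation over cells valid in both grids."""
    obs, sim = np.ravel(observed), np.ravel(simulated)
    mask = ~(np.isnan(obs) | np.isnan(sim))
    obs, sim = obs[mask], sim[mask]
    err = sim - obs
    return {"bias": err.mean(),
            "rmse": np.sqrt((err ** 2).mean()),
            "r": np.corrcoef(obs, sim)[0, 1]}
```

Such cell-by-cell scores are the simplest of the comparison methods listed; transects and optimal local alignment relax the requirement of exact spatial correspondence.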
A Review on Segmentation of Positron Emission Tomography Images
Foster, Brent; Bagci, Ulas; Mansoor, Awais; Xu, Ziyue; Mollura, Daniel J.
2014-01-01
Positron Emission Tomography (PET), a non-invasive functional imaging method at the molecular level, images the distribution of biologically targeted radiotracers with high sensitivity. PET imaging provides detailed quantitative information about many diseases and is often used to evaluate inflammation, infection, and cancer by detecting emitted photons from a radiotracer localized to abnormal cells. In order to differentiate abnormal tissue from surrounding areas in PET images, image segmentation methods play a vital role; therefore, accurate image segmentation is often necessary for proper disease detection, diagnosis, treatment planning, and follow-ups. In this review paper, we present state-of-the-art PET image segmentation methods, as well as the recent advances in image segmentation techniques. In order to make this manuscript self-contained, we also briefly explain the fundamentals of PET imaging, the challenges of diagnostic PET image analysis, and the effects of these challenges on the segmentation results. PMID:24845019
Sequence of eruptive events in the Vesuvio area recorded in shallow-water Ionian Sea sediments
NASA Astrophysics Data System (ADS)
Taricco, C.; Alessio, S.; Vivaldo, G.
2008-01-01
The dating of the cores we drilled from the Gallipoli terrace in the Gulf of Taranto (Ionian Sea), previously obtained by tephroanalysis, is checked by applying a method to objectively recognize volcanic events. This automatic statistical procedure allows identifying pulse-like features in a series and quantitatively evaluating the confidence level at which the significant peaks are detected. We applied it to the 2000-year-long pyroxene series of the GT89-3 core, on which the dating is based. The method confirms the previous dating by detecting at a high confidence level the peaks originally used, and it indicates a few possible undocumented eruptions. Moreover, a spectral analysis, focussed on the long-term variability of the pyroxene series and performed by several advanced methods, reveals that the volcanic pulses are superimposed on a millennial trend and a 400-year oscillation.
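The statistical procedure itself is not given in the abstract. As a stand-in that illustrates the idea of flagging pulse-like peaks against an estimated noise level, one might write the following; the moving-average detrend and the MAD-based noise estimate are assumptions, not the authors' method.

```python
import numpy as np
from scipy.signal import find_peaks

def detect_pulses(series, n_sigma=3.0, window=51):
    """Flag pulse-like peaks whose prominence exceeds n_sigma times a
    robust (MAD-based) noise estimate of the detrended series."""
    x = np.asarray(series, dtype=float)
    trend = np.convolve(x, np.ones(window) / window, mode="same")
    resid = x - trend
    noise = 1.4826 * np.median(np.abs(resid - np.median(resid)))
    peaks, props = find_peaks(resid, prominence=n_sigma * noise)
    return peaks, props["prominences"]
```

Raising n_sigma trades sensitivity to undocumented eruptions for confidence in each detection, mirroring the confidence-level control described above.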
Friche, Amélia Augusta de Lima; Dias, Maria Angélica de Salles; Reis, Priscila Brandão Dos; Dias, Cláudia Silva; Caiaffa, Waleska Teixeira
2015-11-01
There is little scientific evidence that urban upgrading helps improve health or reduce inequities. This article presents the design for the BH-Viva Project, a "quasi-experimental", multiphase, mixed-methods study with quantitative and qualitative components, proposing an analytical model for monitoring the effects that interventions in the urban environment can have on residents' health in slums in Belo Horizonte, Minas Gerais State, Brazil. A preliminary analysis revealed intra-urban differences in age-specific mortality when comparing areas with and without interventions; the mortality rate from 2002 to 2012 was stable in the "formal city", increased in slums without interventions, and decreased in slums with interventions. BH-Viva represents an effort at advancing methodological issues, providing learning and theoretical backing for urban health research and research methods, allowing their application and extension to other urban contexts.
NASA Astrophysics Data System (ADS)
Vacarescu, Rares Anthony
The human fingernail is a vital structure used daily and can provide a wealth of information through the body's biological feedback. By studying the quantitative mechanical and acoustic properties of fingernails, the scarcely investigated field of ungual research can be advanced. The aim of this thesis is to investigate fingernail properties using pulse-echo ultrasound. The work involves the application of a custom-developed portable ultrasonic device in hospital-based data collection and the extension of the ultrasonic methodology to the calculation of acoustic impedance, density and elasticity. The results show that the reflectance method can be used to determine fingernail properties with a maximum deviation of 17% from literature values. Repeatability of measurements fell within a 95% confidence interval. The ultrasonic reflectance method was thus validated and may have potential clinical and cosmetic applications.
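Pulse-echo reflectance work of this kind conventionally rests on the normal-incidence pressure reflection coefficient between the coupling medium and the nail; a minimal sketch under that standard assumption (the thesis's exact formulation is not quoted):

```python
def impedance_from_reflectance(z_coupling, r):
    """Solve R = (Z2 - Z1) / (Z2 + Z1) for the nail impedance Z2,
    given the coupling-medium impedance Z1 and the measured |R|."""
    return z_coupling * (1 + r) / (1 - r)

# Illustrative values: water coupling (~1.48 MRayl) and R = 0.35
z_nail = impedance_from_reflectance(1.48e6, 0.35)
print(f"Estimated nail impedance: {z_nail / 1e6:.2f} MRayl")
# Density then follows from Z = rho * c once the sound speed is known.
```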
Image reconstruction for PET/CT scanners: past achievements and future challenges
Tong, Shan; Alessio, Adam M; Kinahan, Paul E
2011-01-01
PET is a medical imaging modality with proven clinical value for disease diagnosis and treatment monitoring. The integration of PET and CT on modern scanners provides a synergy of the two imaging modalities. Through different mathematical algorithms, PET data can be reconstructed into the spatial distribution of the injected radiotracer. With dynamic imaging, kinetic parameters of specific biological processes can also be determined. Numerous efforts have been devoted to the development of PET image reconstruction methods over the last four decades, encompassing analytic and iterative reconstruction methods. This article provides an overview of the commonly used methods. Current challenges in PET image reconstruction include more accurate quantitation, time-of-flight (TOF) imaging, system modeling, motion correction and dynamic reconstruction. Advances in these aspects could enhance the use of PET/CT imaging in patient care and in clinical research studies of pathophysiology and therapeutic interventions. PMID:21339831
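Of the iterative methods such overviews cover, the classic maximum-likelihood expectation-maximization (MLEM) update is the prototypical example. Below is a toy dense-matrix sketch; real scanners use sparse system models plus attenuation, randoms, and scatter corrections that are omitted here.

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM reconstruction for emission tomography.

    A : (n_bins, n_voxels) system matrix; y : measured sinogram counts.
    Multiplicative update: x <- x / (A^T 1) * A^T (y / (A x)).
    """
    y = np.asarray(y, dtype=float)
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])  # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        ratio = np.divide(y, proj, out=np.zeros_like(y), where=proj > 0)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x
```

The multiplicative form keeps the image nonnegative, one reason EM-type methods displaced purely analytic filtered back-projection for quantitative work.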
Quantifying and characterizing proanthocyanidins in cranberries in relation to urinary tract health.
Krueger, Christian G; Reed, Jess D; Feliciano, Rodrigo P; Howell, Amy B
2013-05-01
The "A-type" proanthocyanidins in cranberry fruit (Vaccinium macrocarpon Ait.) are bioactive components associated with prevention of urinary tract infections (UTI). Cranberry juice, fruit (fresh and dried), functional foods, and cranberry dietary supplements are promoted for prevention of UTI and for maintenance of urinary tract health (UTH), on the basis of their content of cranberry proanthocyanidins (c-PAC) with "A-type" interflavan bonds. With increasing consumer use of cranberries for maintenance of UTH and an expanding number of commercial cranberry products of different types, the availability of unified methods for measuring levels of c-PAC is important. This review discusses quantitative and qualitative analysis of c-PAC with "A-type" interflavan bonds in relation to their biological activity for UTI prevention. The integrity (including authenticity, standardization, efficacy, and safety) of cranberry fruit, juices, and dietary supplements may now be measured by using recent advances in mass spectrometry, liquid chromatography, production of c-PAC standards, and improved simple quantitative techniques.
Multimodal imaging of ischemic wounds
NASA Astrophysics Data System (ADS)
Zhang, Shiwu; Gnyawali, Surya; Huang, Jiwei; Liu, Peng; Gordillo, Gayle; Sen, Chandan K.; Xu, Ronald
2012-12-01
The wound healing process involves the reparative phases of inflammation, proliferation, and remodeling. Interrupting any of these phases may result in chronically unhealed wounds, amputation, or even patient death. Quantitative assessment of wound tissue ischemia, perfusion, and inflammation provides critical information for appropriate detection, staging, and treatment of chronic wounds. However, no method is available for noninvasive, simultaneous, and quantitative imaging of these tissue parameters. We integrated hyperspectral, laser speckle, and thermographic imaging modalities into a single setup for multimodal assessment of tissue oxygenation, perfusion, and inflammation characteristics. Advanced algorithms were developed for accurate reconstruction of wound oxygenation and appropriate co-registration between the different imaging modalities. The multimodal wound imaging system was validated in an ongoing clinical trial approved by the OSU IRB, in which a wound 3 mm in diameter was introduced on a healthy subject's lower extremity and the healing process was serially monitored with the multimodal imaging setup. Our experiments demonstrated the clinical usability of multimodal wound imaging.
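The paper's reconstruction algorithms are not given in the abstract, but oxygenation maps from hyperspectral data are commonly obtained by linear unmixing of hemoglobin spectra under a modified Beer-Lambert model. A per-pixel sketch under that assumption (function and variable names are illustrative):

```python
import numpy as np

def oxygen_saturation(absorbance, ext_hbo2, ext_hb):
    """Estimate tissue oxygen saturation StO2 from one pixel's
    absorbance spectrum by least-squares unmixing of the
    oxy-/deoxyhemoglobin extinction spectra."""
    E = np.column_stack([ext_hbo2, ext_hb])      # (n_wavelengths, 2)
    (c_hbo2, c_hb), *_ = np.linalg.lstsq(E, absorbance, rcond=None)
    return c_hbo2 / (c_hbo2 + c_hb)
```

Repeating this per pixel yields the oxygenation map that is then co-registered with the perfusion (laser speckle) and inflammation (thermographic) images.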
Monogioudi, Evanthia; Permi, Perttu; Filpponen, Ilari; Lienemann, Michael; Li, Bin; Argyropoulos, Dimitris; Buchert, Johanna; Mattinen, Maija-Liisa
2011-02-23
Cross-linking of β-casein by Trichoderma reesei tyrosinase (TrTyr) and Streptoverticillium mobaraense transglutaminase (Tgase) was analyzed by (31)P nuclear magnetic resonance (NMR) spectroscopy in ionic liquid (IL). According to (31)P NMR, 91% of the tyrosine side chains were cross-linked by TrTyr at high dosages. When Tgase was used, no changes were observed because a different cross-linking mechanism was operational. However, this verified the success of the phosphitylation of phenolics within the protein matrix in the IL. Atomic force microscopy (AFM) in the solid state showed that disk-shaped nanoparticles were formed in the reactions, with average diameters of 80 and 20 nm for TrTyr and Tgase, respectively. These data further advance the current understanding of the action of tyrosinases on proteins at the molecular and chemical-bond levels. Quantitative (31)P NMR in IL was shown to be a simple and efficient method for the study of protein modification.
Quantitative assessment of neural outgrowth using spatial light interference microscopy
NASA Astrophysics Data System (ADS)
Lee, Young Jae; Cintora, Pati; Arikkath, Jyothi; Akinsola, Olaoluwa; Kandel, Mikhail; Popescu, Gabriel; Best-Popescu, Catherine
2017-06-01
Optimal growth as well as branching of axons and dendrites is critical for nervous system function. Neuritic length, arborization, and growth rate determine the innervation properties of neurons and define each cell's computational capability. Thus, to investigate nervous system function, we need to develop methods and instrumentation techniques capable of quantifying various aspects of neural network formation: neuron process extension, retraction, stability, and branching. During the last three decades, fluorescence microscopy has yielded enormous advances in our understanding of neurobiology. While fluorescent markers provide valuable specificity to imaging, photobleaching and phototoxicity often limit the duration of the investigation. Here, we used spatial light interference microscopy (SLIM) to quantitatively measure neurite outgrowth as a function of cell confluence. Because it is label-free and nondestructive, SLIM allows for long-term investigation over many hours. We found that neurons exhibit a higher growth rate of neurite length in low-confluence versus medium- and high-confluence conditions. We believe this methodology will aid investigators in performing unbiased, nondestructive analysis of morphometric neuronal parameters.
Altenbuchner, Amelie; Haug, Sonja; Kretschmer, Rainer; Weber, Karsten
2018-01-01
This preparatory study paves the way for an implementation of individualized monitoring and feedback of physical motion, using conventional motion trackers, in the rehabilitation of geriatric trauma patients. Regaining mobility is accompanied by improved quality of life in persons of very advanced age recovering from fragility fractures. A quantitative survey of regained physical mobility yields recommendations on how to use motion trackers effectively in a clinical geriatric setting. The study combines quantitative and qualitative, interdisciplinary and mutually complementary research approaches (sociology, health research, philosophy/ethics, medical informatics, nursing science, gerontology and physical therapy). While motion-tracker use is being validated in geriatric traumatology, preliminary data are used to develop target-group-oriented motion feedback. In addition, the measurement accuracy of a questionnaire on the quality of life of multimorbid geriatric patients (FLQM) is tested. Implementing a new technology in a complex clinical setting needs to be based on a strong theoretical background, but it will not succeed without careful field testing.
Genotype-phenotype association study via new multi-task learning model
Huo, Zhouyuan; Shen, Dinggang
2018-01-01
Research on the associations between genetic variations and imaging phenotypes is developing rapidly with advances in high-throughput genotyping and brain imaging techniques. Regression analysis of single nucleotide polymorphisms (SNPs) and imaging measures as quantitative traits (QTs) has been proposed to identify the quantitative trait loci (QTL) via multi-task learning models. Recent studies consider the interlinked structures within SNPs and imaging QTs through group lasso, e.g. the ℓ2,1-norm, leading to better predictive results and insights into SNPs. However, group sparsity is not enough to represent the correlation between multiple tasks, and ℓ2,1-norm regularization is not robust either. In this paper, we propose a new multi-task learning model to analyze the associations between SNPs and QTs. We suppose that a low-rank structure is also beneficial for uncovering the correlation between genetic variations and imaging phenotypes. Finally, we conduct regression analysis of SNPs and QTs. Experimental results show that our model is more accurate in prediction than the compared methods and presents new insights into SNPs. PMID:29218896
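As a sketch of the class of objective described, a multi-task regression combining group sparsity with a low-rank penalty might read (the paper's exact formulation and weighting are not reproduced here):

$$\min_{W} \ \lVert Y - XW \rVert_F^2 + \lambda \lVert W \rVert_{2,1} + \gamma \lVert W \rVert_{*},$$

where \(X\) holds the SNP genotypes, \(Y\) the imaging QTs, \(\lVert W \rVert_{2,1} = \sum_i \lVert w^i \rVert_2\) selects SNPs shared across tasks, and the nuclear norm \(\lVert W \rVert_{*}\) encourages a low-rank \(W\) that captures correlated traits.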
Sakamoto, Seiichi; Putalun, Waraporn; Vimolmangkang, Sornkanok; Phoolcharoen, Waranyoo; Shoyama, Yukihiro; Tanaka, Hiroyuki; Morimoto, Satoshi
2018-01-01
Immunoassays are antibody-based analytical methods for quantitative/qualitative analysis. Since the principle of immunoassays is based on the specific antigen-antibody reaction, the assays have been utilized worldwide for diagnosis, pharmacokinetic studies by drug monitoring, and the quality control of commercially available products. Berson and Yalow were the first to develop an immunoassay, known as radioimmunoassay (RIA), for detecting endogenous plasma insulin [1], a development for which Yalow was awarded the Nobel Prize in Physiology or Medicine in 1977. Even today, after half a century, immunoassays are widely utilized with some modifications from the originally proposed system; e.g., radioisotopes have been replaced with enzymes because of safety concerns regarding the use of radioactivity, a format referred to as enzyme immunoassay/enzyme-linked immunosorbent assay (ELISA). In addition, progress has been made in ELISA with the recent advances in recombinant DNA technology, leading to an increase in the range of antibodies, probes, and even systems. This review article describes ELISA and its applications for the detection of plant secondary metabolites.
Propagating Qualitative Values Through Quantitative Equations
NASA Technical Reports Server (NTRS)
Kulkarni, Deepak
1992-01-01
In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values represented as interval values through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem. These may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly and through arbitrary algebraic expressions approximately. The algorithm was found applicable to the Space Shuttle Reaction Control System model.
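To make the setting concrete: representing a qualitative value as an interval and pushing it through an arithmetic expression is the basic operation at stake. A minimal interval-arithmetic sketch follows (illustrative only; the report's linear-time algorithm is not reproduced):

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

# Qualitative value "small and nonnegative" propagated through m * (a + b):
m, a, b = Interval(0.0, 0.5), Interval(1.0, 2.0), Interval(-1.0, 1.0)
print(m * (a + b))  # Interval(lo=0.0, hi=1.5)
```

Naive interval evaluation can overestimate bounds when a variable appears more than once in an expression, which is exactly where exact, structure-aware propagation of the kind described above matters.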
Systematic review of psychosocial outcomes for patients with advanced melanoma.
Dunn, Jeff; Watson, Maggie; Aitken, Joanne F; Hyde, Melissa K
2017-11-01
New advanced melanoma therapies are associated with improved survival; however, quality of survivorship, particularly psychosocial outcomes, for patients overall and those treated with newer therapies is unclear. Synthesize qualitative and quantitative evidence about psychosocial outcomes for advanced (stage III/IV) melanoma patients. Five databases were searched (01/01/1980 to 31/01/2016). Inclusion criteria were as follows: advanced melanoma patients or sub-group analysis; assessed psychosocial outcomes; and English language. Fifty-two studies met review criteria (4 qualitative, 48 quantitative). Trials comprise mostly medical not psychosocial interventions, with psychosocial outcomes assessed within broader quality of life measures. Patients receiving chemotherapy or IFN-alpha showed decreased emotional and social function and increased distress. Five trials of newer therapies appeared to show improvements in emotional and social function. Descriptive studies suggest that patients with advanced, versus localized disease, had decreased emotional and social function and increased distress. Contributors to distress were largely unexplored, and no clear framework described coping/adjustment trajectories. Patients with advanced versus localized disease had more supportive care needs, particularly amount, quality, and timing of melanoma-related information, communication with and emotional support from clinicians. Limitations included: lack of theoretical underpinnings guiding study design; inconsistent measurement approaches; small sample sizes; non-representative sampling; and cross-sectional design. Quality trial evidence is needed to clarify the impact of treatment innovations for advanced melanoma on patients' psychosocial well-being. Survivorship research and subsequent translation of that knowledge into programs and services currently lags behind gains in the medical treatment of advanced melanoma, a troubling circumstance that requires immediate and focused attention. Copyright © 2016 John Wiley & Sons, Ltd.
Single Cell Genomics: Approaches and Utility in Immunology
Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A
2017-01-01
Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population-level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102