Sample records for quantitative research techniques

  1. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Nondestructive Evaluation for Aerospace Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara; Cramer, Elliott; Perey, Daniel

    2015-01-01

    Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure the safety, reliability, and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including the development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites-focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.

  3. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by the need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes across the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, and process improvement methods, and with how they relate to process performance. Organizations are starting to embrace the Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as de facto process improvement frameworks for improving business process performance. High-maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, together with process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study presents a detailed gap analysis covering the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, the gaps that exist in the literature, and a comparison identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
    The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrate the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as the framework for the simulation model.
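    A baseline-and-predict simulation of the kind the dissertation describes can be sketched in a few lines. The driver variables, weights, and distributions below are invented for illustration; the actual ACSI model is a latent-variable structural model, not this simple weighted average.

```python
import random

def simulate_acsi(n_trials=10_000, seed=42):
    """Monte Carlo sketch: draw illustrative driver scores (quality,
    expectations, perceived value) and combine them with assumed weights
    to produce a distribution of customer-satisfaction index scores."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_trials):
        quality = rng.gauss(80, 5)        # assumed mean/SD on a 0-100 scale
        expectations = rng.gauss(75, 7)   # all three distributions are
        value = rng.gauss(70, 6)          # illustrative assumptions
        # Assumed linear weighting; the real ACSI structure is a
        # partial-least-squares latent-variable model, not this average.
        scores.append(0.5 * quality + 0.3 * expectations + 0.2 * value)
    mean = sum(scores) / n_trials
    sd = (sum((s - mean) ** 2 for s in scores) / (n_trials - 1)) ** 0.5
    return mean, sd

mean, sd = simulate_acsi()
```

    Running many trials yields a baseline distribution whose mean and spread can anchor control limits and sensitivity analysis of the predicted index.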

  4. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that the visualization technique is a precursor to thorough quantitative analysis and to subsequent physical understanding of these flow fields.

  5. Research Methods in Education

    ERIC Educational Resources Information Center

    Check, Joseph; Schutt, Russell K.

    2011-01-01

    "Research Methods in Education" introduces research methods as an integrated set of techniques for investigating questions about the educational world. This lively, innovative text helps students connect technique and substance, appreciate the value of both qualitative and quantitative methodologies, and make ethical research decisions.…

  6. To label or not to label: applications of quantitative proteomics in neuroscience research.

    PubMed

    Filiou, Michaela D; Martins-de-Souza, Daniel; Guest, Paul C; Bahn, Sabine; Turck, Christoph W

    2012-02-01

    Proteomics has provided researchers with a sophisticated toolbox of labeling-based and label-free quantitative methods. These are now being applied in neuroscience research where they have already contributed to the elucidation of fundamental mechanisms and the discovery of candidate biomarkers. In this review, we evaluate and compare labeling-based and label-free quantitative proteomic techniques for applications in neuroscience research. We discuss the considerations required for the analysis of brain and central nervous system specimens, the experimental design of quantitative proteomic workflows as well as the feasibility, advantages, and disadvantages of the available techniques for neuroscience-oriented questions. Furthermore, we assess the use of labeled standards as internal controls for comparative studies in humans and review applications of labeling-based and label-free mass spectrometry approaches in relevant model organisms and human subjects. Providing a comprehensive guide of feasible and meaningful quantitative proteomic methodologies for neuroscience research is crucial not only for overcoming current limitations but also for gaining useful insights into brain function and translating proteomics from bench to bedside. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  8. Examining the Teachers' Emotional Labor Behavior

    ERIC Educational Resources Information Center

    Tösten, Rasim; Sahin, Çigdem Çelik

    2017-01-01

    The aim of this research is to investigate the teachers' emotional labour behaviours and to determine the reasons of the differences. In the research, mixed research methods including both quantitative and qualitative techniques were used. The population of the study was comprised of 280 teachers (266 for quantitative, 14 for qualitative…

  9. University Students' Understanding of the Concepts Empirical, Theoretical, Qualitative and Quantitative Research

    ERIC Educational Resources Information Center

    Murtonen, Mari

    2015-01-01

    University research education in many disciplines is frequently confronted by problems with students' weak level of understanding of research concepts. A mind map technique was used to investigate how students understand central methodological concepts of empirical, theoretical, qualitative and quantitative. The main hypothesis was that some…

  10. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
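    One of the core models in texts of this kind, linear hillslope diffusion, illustrates the "core set of equations and solution techniques" theme. The sketch below is a generic explicit finite-difference solver, not code from the book; the grid spacing, diffusivity, and initial ridge profile are arbitrary.

```python
def diffuse_hillslope(z, dx=1.0, dt=0.2, kappa=1.0, steps=100):
    """Explicit finite-difference solution of the linear hillslope
    diffusion equation dz/dt = kappa * d2z/dx2 on a 1-D profile.
    Fixed-elevation boundaries; the explicit scheme is stable only
    when dt <= dx**2 / (2 * kappa)."""
    z = list(z)
    for _ in range(steps):
        new = z[:]  # boundary nodes keep their elevation
        for i in range(1, len(z) - 1):
            new[i] = z[i] + kappa * dt / dx**2 * (z[i + 1] - 2 * z[i] + z[i - 1])
        z = new
    return z

# A sharp ridge relaxes toward a smooth, rounded profile over time.
profile = diffuse_hillslope([0, 0, 0, 10, 0, 0, 0], steps=50)
```

    The same update rule, extended to two dimensions and coupled to uplift or channel incision terms, underlies many of the landscape evolution models the book covers.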

  12. NASA Lewis Research Center Futuring Workshop

    NASA Technical Reports Server (NTRS)

    Boroush, Mark; Stover, John; Thomas, Charles

    1987-01-01

    On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.

  13. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust that has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests its enormous potential. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
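    The quantitative link described in the abstract amounts to a one-line mass balance: the time-averaged air concentration is the recovered mass divided by the product of the filter's capture efficiency and the total air volume filtered. The function below is a sketch of that relation; the parameter names and example numbers are illustrative, not values from the study.

```python
def airborne_concentration(mass_ug, flow_m3_per_h, runtime_h, capture_efficiency):
    """Estimate the time-averaged airborne concentration (ug/m^3) of a
    particle-bound contaminant from the mass recovered off an HVAC filter.

    Assumes: mass_ug is the contaminant mass extracted from the filter dust,
    flow * runtime is the total air volume that passed through the filter,
    and capture_efficiency is the filter's collection efficiency for the
    particle sizes carrying the contaminant (an assumed input here)."""
    if not 0 < capture_efficiency <= 1:
        raise ValueError("capture efficiency must be in (0, 1]")
    air_volume_m3 = flow_m3_per_h * runtime_h
    return mass_ug / (capture_efficiency * air_volume_m3)

# e.g., 500 ug recovered; 1700 m^3/h flow; 720 h of runtime; 60% efficiency
c = airborne_concentration(500, 1700, 720, 0.6)
```

    In practice each input carries its own uncertainty (especially runtime and size-resolved efficiency), which is why the review calls for direct comparison against reference sampling methods.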

  14. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than following the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  15. Secondary Analysis of Qualitative Data.

    ERIC Educational Resources Information Center

    Turner, Paul D.

    The reanalysis of data to answer the original research question with better statistical techniques or to answer new questions with old data is not uncommon in quantitative studies. Meta analysis and research syntheses have increased with the increase in research using similar statistical analyses, refinements of analytical techniques, and the…

  16. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  17. Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.

    PubMed

    Mantle, M D

    2011-09-30

    The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.

  18. [Qualitative techniques for public health research and the development of health care services: more than just another technique].

    PubMed

    March Cerdà, J C; Prieto Rodríguez, M A; Hernán García, M; Solas Gaspar, O

    1999-01-01

    In the context of the debate over the two current approaches to health science research (qualitative and quantitative), the paper argues for complementarity between the techniques that contribute to a better knowledge of populations and communities, and for offering effective solutions to different problems. The article analyzes the usefulness of qualitative methods, describes the techniques and procedures most frequently used to guarantee the validity and reliability of research findings, and closes by arguing for the combined use of qualitative and quantitative approaches. Working together, and learning from each other, in this way will enrich research and interventions in the fields of public health and health management. Qualitative methods are useful for a sound understanding of the issue being investigated or evaluated from the point of view of the participants under study. Key techniques, listed from most to least structured, include: structured interview, Delphi, nominal group, case study, semi-structured interview, focus group, brainstorming, discussion group, in-depth interview, life story, and participant observation.

  19. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made in quantitative assessment techniques and methods for the urban thermal environment. These techniques have developed from statistical analysis of the urban-scale thermal environment based on historical weather-station data toward dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the progress of ground meteorological observation, thermal infrared remote sensing, and numerical simulation. The potential advantages and disadvantages, applicability, and development trends of these techniques were also summarized, with the aim of adding to the fundamental knowledge needed to understand urban thermal environment assessment and optimization.

  20. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. Through extensive kinetic studies, PE (PerkinElmer) found a linear relation between the initial template number and the cycle at which the accumulating fluorescent product becomes detectable, and on that basis developed the quantitative PCR technique used in the PE7700 and PE5700 instruments. The error of that technique, however, is still too great for biotechnology development and clinical research, so a better quantitative PCR technique is needed. The mathematical model submitted here draws on related work and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number, and the other reaction conditions, and accurately reflects the accumulation of PCR product molecules. Accurate quantitative PCR analysis can be performed using this relation, and the accumulated PCR product quantity can be obtained from the initial template number. Using this model, the quantification error depends only on the accuracy of the fluorescence intensity measurement, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to six digits and the template size is between 100 and 1,000,000, the quantification accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with the proposed quantitative analysis system yields results roughly 80 times more accurate than the CT method.
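    For contrast, the Ct-based relative quantification that the article criticizes can be stated in two lines: at the detection threshold, N0 * (1 + E)**Ct is constant, so the ratio of two initial template amounts follows from the difference in threshold cycles. The sketch below is the textbook Ct relation, not the authors' proposed model.

```python
def initial_template_ratio(ct_sample, ct_reference, efficiency=1.0):
    """Standard Ct-based relative quantification: the threshold is
    crossed when N0 * (1 + E)**Ct reaches a fixed fluorescence level,
    so the ratio of two initial template amounts is (1 + E)**dCt,
    where dCt = ct_reference - ct_sample and E is the amplification
    efficiency (E = 1 means perfect doubling each cycle)."""
    return (1 + efficiency) ** (ct_reference - ct_sample)

# A sample crossing the threshold 3 cycles earlier than the reference
# carries 2**3 = 8x the starting template (at perfect efficiency).
ratio = initial_template_ratio(ct_sample=22, ct_reference=25)
```

    The article's complaint is visible in the formula: any error in the assumed efficiency E compounds exponentially with dCt, which is one motivation for modeling the full accumulation curve instead.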

  21. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
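    The counting core of such an algorithm reduces to connected-component labeling of a thresholded image: each connected bright region is counted as one candidate capillary. The pure-Python sketch below illustrates only that step; the published algorithm's preprocessing, calibration to capillaries per millimeter, and semi-automated review are not reproduced here.

```python
from collections import deque

def count_objects(binary):
    """Count connected foreground regions (4-connectivity) in a binary
    image via breadth-first flood fill. A simplified stand-in for the
    capillary-counting step: after thresholding a nailfold image, each
    connected bright region is counted once."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1                      # new region found
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:                    # flood-fill the region
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

image = [
    [1, 1, 0, 0, 1],
    [0, 0, 0, 0, 1],
    [1, 0, 1, 1, 0],
]
n = count_objects(image)  # four separate bright regions
```

    Production image-analysis code would typically use a library labeling routine instead, but the logic is the same.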

  1. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  2. Application of Person-Centered Approaches to Critical Quantitative Research: Exploring Inequities in College Financing Strategies

    ERIC Educational Resources Information Center

    Malcom-Piqueux, Lindsey

    2014-01-01

    This chapter discusses the utility of person-centered approaches to critical quantitative researchers. These techniques, which identify groups of individuals who share similar attributes, experiences, or outcomes, are contrasted with more commonly used variable-centered approaches. An illustrative example of a latent class analysis of the college…

  3. CPTAC Accelerates Precision Proteomics Biomedical Research | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.

  4. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  5. Focused Group Interviews as an Innovative Quanti-Qualitative Methodology (QQM): Integrating Quantitative Elements into a Qualitative Methodology

    ERIC Educational Resources Information Center

    Grim, Brian J.; Harmon, Alison H.; Gromis, Judy C.

    2006-01-01

    There is a sharp divide between quantitative and qualitative methodologies in the social sciences. We investigate an innovative way to bridge this gap that incorporates quantitative techniques into a qualitative method, the "quanti-qualitative method" (QQM). Specifically, our research utilized small survey questionnaires and experiment-like…

  6. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
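    For two predictors, the commonality decomposition has a closed form: each predictor's unique contribution is the drop in R-squared when it is removed, and the common component is whatever remains of the full-model R-squared. The sketch below applies that standard definition to synthetic data; nothing here is taken from the (truncated) abstract.

```python
import numpy as np

def r_squared(X, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def commonality_two_predictors(x1, x2, y):
    """Two-predictor commonality analysis: partition the full-model R^2
    into each predictor's unique contribution and their shared (common)
    variance, rather than relying on beta weights alone."""
    r2_full = r_squared(np.column_stack([x1, x2]), y)
    r2_x1 = r_squared(x1.reshape(-1, 1), y)
    r2_x2 = r_squared(x2.reshape(-1, 1), y)
    return {
        "unique_x1": r2_full - r2_x2,   # gain from adding x1 last
        "unique_x2": r2_full - r2_x1,   # gain from adding x2 last
        "common": r2_x1 + r2_x2 - r2_full,
        "total_r2": r2_full,
    }

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = 0.6 * x1 + rng.normal(size=200)       # correlated predictors
y = 1.0 * x1 + 0.5 * x2 + rng.normal(size=200)
parts = commonality_two_predictors(x1, x2, y)
```

    With correlated predictors the common component is substantial, which is exactly the variance that beta weights obscure; with more predictors the decomposition has 2^k - 1 components and is usually computed with dedicated routines.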

  7. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in three main phases of the application and are categorized as: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.

  8. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  9. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  10. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... estimates. (vi) Comparison of proposed prices with prices obtained through market research for the same or...

  11. TPS as an Effective Technique to Enhance the Students' Achievement on Writing Descriptive Text

    ERIC Educational Resources Information Center

    Sumarsih, M. Pd.; Sanjaya, Dedi

    2013-01-01

    Students' achievement in writing descriptive text is very low, in this study Think Pair Share (TPS) is applied to solve the problem. Action research is conducted for the result. Additionally, qualitative and quantitative techniques are applied in this research. The subject of this research is grade VIII in Junior High School in Indonesia. From…

  12. Developing High-Frequency Quantitative Ultrasound Techniques to Characterize Three-Dimensional Engineered Tissues

    NASA Astrophysics Data System (ADS)

    Mercado, Karla Patricia E.

    Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.
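    At its core, the integrated backscatter coefficient (IBC) the thesis uses is a spectral ratio: the sample's echo power spectrum normalized by a reference spectrum and averaged over the analysis bandwidth. The sketch below shows only that ratio; real IBC estimation also applies attenuation, gate-length, and diffraction corrections, and the signals, sampling rate, and bandwidth here are synthetic stand-ins.

```python
import numpy as np

def integrated_backscatter_coefficient(sample_rf, reference_rf, fs, band):
    """Illustrative IBC estimate: the ratio of the sample's echo power
    spectrum to that of a reference echo (e.g., from a planar reflector
    or a reference phantom), averaged over the analysis band (Hz)."""
    freqs = np.fft.rfftfreq(len(sample_rf), d=1 / fs)
    s_pow = np.abs(np.fft.rfft(sample_rf)) ** 2
    r_pow = np.abs(np.fft.rfft(reference_rf)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1]) & (r_pow > 0)
    return float(np.mean(s_pow[mask] / r_pow[mask]))

rng = np.random.default_rng(1)
reference = rng.normal(size=1024)   # stand-in reference echo
sample = 2.0 * reference            # sample scattering 4x the power
ibc = integrated_backscatter_coefficient(sample, reference,
                                         fs=40e6, band=(5e6, 15e6))
```

    Because the reference divides out the system's transfer function, the resulting parameter is system-independent in the sense the abstract describes, which is what allows comparisons across transducers and scanners.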

  13. Allan Cormack, Computerized Axial Tomography (CAT), and Magnetic Resonance

    Science.gov Websites

    Radiopharmaceuticals, DOE Technical Report, 1977; Emission Computed Tomography: A New Technique for the Quantitative...; Extending the Power of Nuclear Magnetic Resonance Techniques; Magnetic Resonance Imaging Research

  14. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  15. 48 CFR 15.404-1 - Proposal analysis techniques.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... are: I Price Analysis, II Quantitative Techniques for Contract Pricing, III Cost Analysis, IV Advanced... obtained through market research for the same or similar items. (vii) Analysis of data other than certified...

  16. Qualitative research methods: key features and insights gained from use in infection prevention research.

    PubMed

    Forman, Jane; Creswell, John W; Damschroder, Laura; Kowalski, Christine P; Krein, Sarah L

    2008-12-01

    Infection control professionals and hospital epidemiologists are accustomed to using quantitative research. Although quantitative studies are extremely important in the field of infection control and prevention, often they cannot help us explain why certain factors affect the use of infection control practices or identify the underlying mechanisms through which they do so. Qualitative research methods, which use open-ended techniques such as interviews to collect data and nonstatistical techniques to analyze them, provide detailed, diverse insights into individuals, useful quotes that bring realism to applied research, and information about how different health care settings operate. Qualitative research can illuminate the processes underlying statistical correlations, inform the development of interventions, and show how interventions work to produce observed outcomes. This article describes the key features of qualitative research and the advantages that such features add to existing quantitative research approaches in the study of infection control. We address the goal of qualitative research, the nature of the research process, sampling, data collection and analysis, validity, generalizability of findings, and presentation of findings. Health services researchers are increasingly using qualitative methods to address practical problems by uncovering interacting influences in complex health care environments. Qualitative research methods, applied with expertise and rigor, can contribute important insights to infection prevention efforts.

  17. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  18. Quantitative acoustic emission monitoring of fatigue cracks in fracture critical steel bridges.

    DOT National Transportation Integrated Search

    2014-01-01

    The objective of this research is to evaluate the feasibility of employing quantitative acoustic emission (AE) techniques for monitoring fatigue crack initiation and propagation in steel bridge members. Three A36 compact tension steel specimens w...

  19. Glutenite bodies sequence division of the upper Es4 in northern Minfeng zone of Dongying Sag, Bohai Bay Basin, China

    NASA Astrophysics Data System (ADS)

    Shao, Xupeng

    2017-04-01

    Glutenite bodies are widely developed in the northern Minfeng zone of Dongying Sag, but their litho-electric relationship is not clear. In addition, because the conventional sequence stratigraphic research method has the drawback of involving too many subjective human factors, it has limited the deepening of regional sequence stratigraphic research. The wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data have the advantage of dividing sequence stratigraphy quantitatively compared with the conventional methods. On the basis of the conventional sequence research method, this paper used the above techniques to divide the fourth-order sequence of the upper Es4 in the northern Minfeng zone of Dongying Sag. The research shows that the wavelet transform technique based on logging data and the time-frequency analysis technique based on seismic data are essentially consistent: both divide sequence stratigraphy quantitatively in the frequency domain. The wavelet transform technique has a high resolution and is suitable for areas with wells; the seismic time-frequency analysis technique has wide applicability but a low resolution, so the two techniques should be combined. The upper Es4 in the northern Minfeng zone of Dongying Sag is a complete third-order sequence, which can be further subdivided into 5 fourth-order sequences with the depositional characteristics of a fining-upward sequence in granularity. Key words: Dongying Sag, northern Minfeng zone, wavelet transform technique, time-frequency analysis technique, the upper Es4, sequence stratigraphy
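The idea behind wavelet-based sequence division can be made concrete with a toy example: a Gaussian-windowed sinusoid (a Morlet-like atom) responds most strongly when its trial period matches the cyclicity of the log curve, so the dominant depositional period can be read off objectively rather than picked by eye. The code below is a hedged illustration with a synthetic "log"; it is not the authors' workflow, and all parameters are invented.

```python
import math

def wavelet_power(signal, dz, period):
    """Squared magnitude of the inner product between the signal and a
    Gaussian-windowed complex sinusoid of the given period, centered
    mid-record (a single Morlet-like wavelet coefficient)."""
    n = len(signal)
    z0 = 0.5 * n * dz
    sigma = period  # envelope width on the order of one trial period
    re = im = 0.0
    for i, x in enumerate(signal):
        z = i * dz - z0
        w = math.exp(-0.5 * (z / sigma) ** 2)
        re += x * w * math.cos(2 * math.pi * z / period)
        im += x * w * math.sin(2 * math.pi * z / period)
    return re * re + im * im

# Synthetic "gamma-ray log" with an 8 m depositional cycle, sampled at 1 m
log = [math.cos(2 * math.pi * z / 8.0) for z in range(128)]
p_match = wavelet_power(log, 1.0, 8.0)   # trial period matches the cycle
p_miss = wavelet_power(log, 1.0, 20.0)   # mismatched trial period
```

Scanning `period` over a range of trial values and locating maxima of `wavelet_power` is the frequency-domain analogue of picking sequence boundaries by hand.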

  20. Using Large Data Sets to Study College Education Trajectories

    ERIC Educational Resources Information Center

    Oseguera, Leticia; Hwang, Jihee

    2014-01-01

    This chapter presents various considerations researchers undertook to conduct a quantitative study on low-income students using a national data set. Specifically, it describes how a critical quantitative scholar approaches guiding frameworks, variable operationalization, analytic techniques, and result interpretation. Results inform how…

  1. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  2. Quantitative Characterization of Nanostructured Materials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dr. Frank

    The two-and-a-half day symposium on the "Quantitative Characterization of Nanostructured Materials" will be the first comprehensive meeting on this topic held under the auspices of a major U.S. professional society. Spring MRS Meetings provide a natural venue for this symposium as they attract a broad audience of researchers that represents a cross-section of the state-of-the-art regarding synthesis, structure-property relations, and applications of nanostructured materials. Close interactions among the experts in local structure measurements and materials researchers will help both to identify measurement needs pertinent to real-world materials problems and to familiarize the materials research community with the state-of-the-art local structuremore » measurement techniques. We have chosen invited speakers that reflect the multidisciplinary and international nature of this topic and the need to continually nurture productive interfaces among university, government and industrial laboratories. The intent of the symposium is to provide an interdisciplinary forum for discussion and exchange of ideas on the recent progress in quantitative characterization of structural order in nanomaterials using different experimental techniques and theory. The symposium is expected to facilitate discussions on optimal approaches for determining atomic structure at the nanoscale using combined inputs from multiple measurement techniques.« less

  3. Quantitative techniques for musculoskeletal MRI at 7 Tesla.

    PubMed

    Bangerter, Neal K; Taylor, Meredith D; Tarbox, Grayson J; Palmer, Antony J; Park, Daniel J

    2016-12-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems.
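Of the techniques listed, T2 mapping is the easiest to make concrete: multi-echo signal decays as S(TE) = S0 * exp(-TE/T2), so T2 can be recovered per voxel by a log-linear least-squares fit. A hedged sketch on noiseless synthetic single-voxel data (echo times and the T2 value are invented for illustration):

```python
import math

def fit_t2(echo_times_ms, signals):
    """Mono-exponential fit S = S0*exp(-TE/T2) via least squares on ln(S).
    Returns (S0, T2) in the units of the inputs."""
    n = len(echo_times_ms)
    ys = [math.log(s) for s in signals]
    mx = sum(echo_times_ms) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in echo_times_ms)
    sxy = sum((x - mx) * (y - my) for x, y in zip(echo_times_ms, ys))
    slope = sxy / sxx            # equals -1/T2
    intercept = my - slope * mx  # equals ln(S0)
    return math.exp(intercept), -1.0 / slope

# Synthetic voxel: S0 = 100, T2 = 40 ms, echoes every 10 ms
tes = [10, 20, 30, 40, 50, 60]
sig = [100.0 * math.exp(-te / 40.0) for te in tes]
s0, t2 = fit_t2(tes, sig)
```

Real 7 T data would need noise-floor handling and nonlinear fitting, but the log-linear fit is the standard starting point.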

  4. Qualitative Research in Adult, Career, and Career-Technical Education. Practitioner File.

    ERIC Educational Resources Information Center

    Imel, Susan; Kerka, Sandra; Wonacott, Michael E.

    Directed at practitioners in adult and career education, this document defines qualitative research, compares qualitative research to quantitative research, describes the "war" between proponents of each kind of research, describes how to assess qualitative research, and explains how to choose and use qualitative techniques. Pitfalls of…

  5. Development of quantitative laser ionization mass spectrometry (LIMS). Final report, 1 Aug 87-1 Jan 90

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Odom, R.W.

    1991-06-04

    The objective of the research was to develop quantitative microanalysis methods for dielectric thin films using the laser ionization mass spectrometry (LIMS) technique. The research involved preparation of thin (5,000 A) films of SiO2, Al2O3, MgF2, TiO2, Cr2O3, Ta2O5, Si3N4, and ZrO2, and doping these films with ion implant impurities of 11B, 40Ca, 56Fe, 68Zn, 81Br, and 121Sb. Laser ionization mass spectrometry (LIMS), secondary ion mass spectrometry (SIMS) and Rutherford backscattering spectrometry (RBS) were performed on these films. The research demonstrated quantitative LIMS analysis down to detection levels of 10-100 ppm, and led to the development of (1) a compound thin film standards product line for the performing organization, (2) routine LIMS analytical methods, and (3) the manufacture of high speed preamplifiers for time-of-flight mass spectrometry (TOF-MS) techniques.

  6. Ecosystems and People: Qualitative Insights

    EPA Science Inventory

    Both qualitative and quantitative techniques are crucial in researching human impacts from ecological changes. This matches the importance of ?mixed methods? approaches in other disciplines. Qualitative research helps explore the relevancy and transferability of the foundational ...

  7. Electron Energy Distribution and Transfer Phenomena in Non-Equilibrium Gases

    DTIC Science & Technology

    2016-09-01

    and quantitative determination of species difficult. In a mass spectrometry study on boron chemistry a few decades ago, a technique of isotopic... In this FTMS study on TEB, by means of the high-mass-resolution spectrum to distinguish the isobaric ions, we have identified and quantitatively... During this 3-year in-house experimental research task, researchers in the

  8. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for... materials are available for purchase from the Environmental Protection Agency, Research Triangle Park, NC...

  9. Aerodynamic measurement techniques. [laser based diagnostic techniques

    NASA Technical Reports Server (NTRS)

    Hunter, W. W., Jr.

    1976-01-01

    Laser characteristics of intensity, monochromaticity, spatial coherence, and temporal coherence were developed to advance laser-based diagnostic techniques for aerodynamics-related research. Two broad categories, visualization and optical measurements, were considered, and three techniques received significant attention: holography, laser velocimetry, and Raman scattering. Examples of quantitative laser velocimeter and Raman scattering measurements of velocity, temperature, and density indicated the potential of these nonintrusive techniques.
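Laser velocimetry in its common dual-beam (fringe) configuration reduces to one relation: crossing beams form fringes of spacing d = lambda / (2 sin(theta/2)), and a particle's velocity component normal to the fringes is v = f_D * d, where f_D is the measured Doppler burst frequency. A sketch with illustrative numbers (the wavelength and beam angle below are assumptions, not values from the report):

```python
import math

def fringe_spacing_m(wavelength_m, full_angle_rad):
    """Fringe spacing of a dual-beam LDV probe volume."""
    return wavelength_m / (2.0 * math.sin(full_angle_rad / 2.0))

def velocity_m_s(doppler_freq_hz, wavelength_m, full_angle_rad):
    """Particle velocity component normal to the fringes: v = f_D * d."""
    return doppler_freq_hz * fringe_spacing_m(wavelength_m, full_angle_rad)

# Illustrative: 514.5 nm argon-ion line, 10 degree beam crossing, 1 MHz burst
v = velocity_m_s(1.0e6, 514.5e-9, math.radians(10.0))
```

The appeal noted in the abstract follows directly: the conversion factor depends only on geometry and wavelength, so the measurement needs no flow-dependent calibration.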

  10. Debriefing after Human Patient Simulation and Nursing Students' Learning

    ERIC Educational Resources Information Center

    Benhuri, Gloria

    2014-01-01

    Human Patient Simulation (HPS) exercises with life-like computerized manikins provide clinical experiences for nursing students in a safe environment followed by debriefing that promotes learning. Quantitative research in techniques to support learning from debriefing is limited. The purpose of the quantitative quasi-experimental study using a…

  11. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  12. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  13. 40 CFR 260.11 - References.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...). (7) ASTM E 168-88, “Standard Practices for General Techniques of Infrared Quantitative Analysis,” IBR...-Visible Quantitative Analysis,” IBR approved for § 264.1063. (9) ASTM E 260-85, “Standard Practice for..., Research Triangle Park, NC. (1) “Screening Procedures for Estimating the Air Quality Impact of Stationary...

  14. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens each with a unique gold nanoparticle size distribution. Research Highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; it does not require manual adjustment of the algorithm parameters; it can quantitatively capture differences between surfaces; it yields more realistic structure boundaries compared to other methods.
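The point classification described in this record can be caricatured with triangular membership functions on normalized height and local slope, taking the class with the highest combined membership. This is only a schematic of the fuzzy-inference idea; the membership shapes and thresholds below are invented, not the authors' rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b on the support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify_point(height, slope):
    """Fuzzy classification of one AFM data point as top/bottom/uphill/
    downhill. height is normalized to [0, 1]; slope is dz/dx, normalized."""
    memberships = {
        "top":      tri(height, 0.6, 1.0, 1.4) * tri(slope, -0.2, 0.0, 0.2),
        "bottom":   tri(height, -0.4, 0.0, 0.4) * tri(slope, -0.2, 0.0, 0.2),
        "uphill":   tri(slope, 0.05, 0.5, 1.0),
        "downhill": tri(slope, -1.0, -0.5, -0.05),
    }
    return max(memberships, key=memberships.get)
```

Grouping contiguous runs of "top" points between "uphill" and "downhill" flanks is then a simple way to isolate individual nanostructures, in the spirit of the paper's pipeline.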

  15. Validation of Passive Sampling Devices for Monitoring of Munitions Constituents in Underwater Environments

    DTIC Science & Technology

    2017-06-30

    Research and Development Program [SERDP] project #ER-2542) into the canister would provide enhancement of the quantitative estimation of the TWA... Advantages and limitations compared to other sampling techniques... Department of Defense; EOD Explosive Ordnance Disposal; EPA United States Environmental Protection Agency; EQL Environmental Quantitation Limit; EST

  16. Combining Qualitative and Quantitative Data: An Example.

    ERIC Educational Resources Information Center

    Sikka, Anjoo; And Others

    Methodology from an ongoing research study to validate teaching techniques for deaf and blind students provides an example of the ways that several types of quantitative and qualitative data can be combined in analysis. Four teacher and student pairs were selected. The students were between 14 and 21 years old, had both auditory and visual…

  17. Measurement Invariance: A Foundational Principle for Quantitative Theory Building

    ERIC Educational Resources Information Center

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    This article describes why measurement invariance is a critical issue to quantitative theory building within the field of human resource development. Readers will learn what measurement invariance is and how to test for its presence using techniques that are accessible to applied researchers. Using data from a LibQUAL+[TM] study of user…

  18. Urban children and nature: a summary of research on camping and outdoor education

    Treesearch

    William R., Jr. Burch

    1977-01-01

    This paper reports the preliminary findings of an extensive bibliographic search that identified studies of urban children in camp and outdoor education programs. These studies were systematically abstracted and classified as qualitative or quantitative. Twenty-five percent of the abstracted studies were quantitative. The major findings, techniques of study, and policy...

  19. Academic Advising and First-Generation College Students: A Quantitative Study on Student Retention

    ERIC Educational Resources Information Center

    Swecker, Hadyn K.; Fifolt, Matthew; Searby, Linda

    2014-01-01

    For this quantitative study, we used a multiple logistic regression technique to investigate the relationship between the number of meetings with an academic advisor and retention of first-generation students, as represented by enrollment status and academic standing at a large, public research institution in the Southeast. Consistent with…
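The multiple logistic regression used in the study models the log-odds of retention as a linear function of predictors such as the number of advisor meetings. A minimal single-predictor sketch fitted by batch gradient descent on invented data (the data, learning rate, and epoch count are hypothetical, chosen only to show the technique; a real analysis would use multiple predictors and a statistics package):

```python
import math

def fit_logistic(xs, ys, lr=0.5, epochs=5000):
    """Fit P(y=1|x) = sigmoid(b0 + b1*x) by batch gradient descent on the
    log-loss. Returns (intercept, slope)."""
    b0 = b1 = 0.0
    n = len(xs)
    for _ in range(epochs):
        g0 = g1 = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += p - y
            g1 += (p - y) * x
        b0 -= lr * g0 / n
        b1 -= lr * g1 / n
    return b0, b1

# Invented data: advisor meetings per term vs. retained (1) / not (0)
meetings = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]
retained = [0, 0, 0, 1, 0, 1, 1, 1, 1, 1]
b0, b1 = fit_logistic(meetings, retained)
```

A positive fitted slope `b1` corresponds to the study's finding of a positive relationship between advising meetings and retention odds.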

  20. The state of RT-quantitative PCR: firsthand observations of implementation of minimum information for the publication of quantitative real-time PCR experiments (MIQE).

    PubMed

    Taylor, Sean C; Mrkusich, Eli M

    2014-01-01

    In the past decade, the techniques of quantitative PCR (qPCR) and reverse transcription (RT)-qPCR have become accessible to virtually all research labs, producing valuable data for peer-reviewed publications and supporting exciting research conclusions. However, the experimental design and validation processes applied to the associated projects are the result of historical biases adopted by individual labs that have evolved and changed since the inception of the techniques and associated technologies. This has resulted in wide variability in the quality, reproducibility and interpretability of published data as a direct result of how each lab has designed their RT-qPCR experiments. The 'minimum information for the publication of quantitative real-time PCR experiments' (MIQE) was published to provide the scientific community with a consistent workflow and key considerations to perform qPCR experiments. We use specific examples to highlight the serious negative ramifications for data quality when the MIQE guidelines are not applied and include a summary of good and poor practices for RT-qPCR. © 2013 S. Karger AG, Basel.
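A concrete example of what MIQE-compliant validation protects is relative quantification by the delta-delta-Cq method: fold change = 2^(-ddCq), where ddCq = (Cq_target - Cq_reference) in the treated sample minus the same difference in the control sample, valid only when both assays run near 100% amplification efficiency (one of the things MIQE asks authors to verify and report). The Cq values below are invented:

```python
def fold_change_ddcq(cq_target_treated, cq_ref_treated,
                     cq_target_control, cq_ref_control):
    """Relative quantification by 2^(-ddCq); assumes ~100% amplification
    efficiency for both target and reference assays (validate per MIQE)."""
    dcq_treated = cq_target_treated - cq_ref_treated
    dcq_control = cq_target_control - cq_ref_control
    return 2.0 ** -(dcq_treated - dcq_control)

# Invented Cq values: target drops 2 cycles relative to the reference gene
fold = fold_change_ddcq(22.0, 18.0, 24.0, 18.0)
```

If the reference gene itself shifts with treatment (an unvalidated normalizer), the same arithmetic silently produces a wrong fold change, which is exactly the failure mode the guidelines target.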

  1. Instrumental Techniques in Archeological Research

    DTIC Science & Technology

    1988-09-01

    and instruments borrowed from the fields of chemistry, physics, geology, metallurgy, and ceramic engineering yield quantitative data on archeological...artifacts. Early analyses relied primarily on wet chemistry techniques in which samples of artifacts were dissolved into liquid solutions, destroying...other organic and inorganic materials. Advantages and disadvantages are discussed. Each technique is presented with attention to appropriate materials

  2. Quantitative techniques for musculoskeletal MRI at 7 Tesla

    PubMed Central

    Taylor, Meredith D.; Tarbox, Grayson J.; Palmer, Antony J.; Park, Daniel J.

    2016-01-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems. PMID:28090448

  3. Thinking big

    NASA Astrophysics Data System (ADS)

    Collins, Harry

    2008-02-01

    Physicists are often quick to discount social research based on qualitative techniques such as ethnography and "deep case studies" - where a researcher draws conclusions about a community based on immersion in the field - thinking that only quantitative research backed up by statistical analysis is sound. The balance is not so clear, however.

  4. 77 FR 43228 - Agency Information Collection Activities; Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... using qualitative and possibly quantitative consumer research techniques, which may include focus groups... used during consumer research while testing nutrition education messages and products developed for the general public. The purpose for performing consumer research is to identify consumers' understanding of...

  5. Effectiveness of project ACORDE materials: applied evaluative research in a preclinical technique course.

    PubMed

    Shugars, D A; Trent, P J; Heymann, H O

    1979-08-01

    Two instructional strategies, the traditional lecture method and a standardized self-instructional (ACORDE) format, were compared for efficiency and perceived usefulness in a preclinical restorative dentistry technique course through the use of a posttest-only control group research design. Control and experimental groups were compared on (a) technique grades, (b) didactic grades, (c) amount of time spent, (d) student and faculty perceptions, and (e) observation of social dynamics. The results of this study demonstrated the effectiveness of Project ACORDE materials in teaching dental students, provided an example of applied research designed to test contemplated instructional innovations prior to use, and employed a method that highlighted qualitative as well as quantitative techniques for data gathering in applied research.

  6. Bridging the Gap between Theory and Practice in Educational Research: Methods at the Margins

    ERIC Educational Resources Information Center

    Winkle-Wagner, Rachelle, Ed.; Hunter, Cheryl A., Ed.; Ortloff, Debora Hinderliter, Ed.

    2009-01-01

    This book provides new ways of thinking about educational processes, using quantitative and qualitative methodologies. Concrete examples of research techniques are provided for those conducting research with marginalized populations or about marginalized ideas. This volume asserts theoretical models related to research methods and the study of…

  7. Quantitative evaluation of translational medicine based on scientometric analysis and information extraction.

    PubMed

    Zhang, Yin; Diao, Tianxi; Wang, Lei

    2014-12-01

    Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, a specific core field and institute, and outstanding academic status and benefit. Although not considered in this study, patent data are another important indicator that should be integrated into the relevant research in the future. © 2014 Wiley Periodicals, Inc.

  8. In-Depth Interviewing as Qualitative Investigation.

    ERIC Educational Resources Information Center

    Books, Marilyn

    A discussion of in-depth interviewing as a method for research on language teaching and learning situates the technique within the continuum of research methodology and differentiates it from quantitative research methods. The strengths and weaknesses of in-depth interviewing are examined, methods of sampling are discussed, and advice on the…

  9. Qualitative interviews in medical research.

    PubMed Central

    Britten, N.

    1995-01-01

    Much qualitative research is interview based, and this paper provides an outline of qualitative interview techniques and their application in medical settings. It explains the rationale for these techniques and shows how they can be used to research kinds of questions that are different from those dealt with by quantitative methods. Different types of qualitative interviews are described, and the way in which they differ from clinical consultations is emphasised. Practical guidance for conducting such interviews is given. Images p252-a PMID:7627048

  10. Multiple element isotope probes, NanoSIMS, and the functional genomics of microbial carbon cycling in soils in response to chronic climatic change

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hungate, Bruce; Pett-Ridge, Jennifer; Blazewicz, Steven

    In this project, we developed an innovative and ground-breaking technique, quantitative stable isotope probing, which uses density separation of nucleic acids as a quantitative measurement technique. This work is substantial because it advances SIP beyond the qualitative technique that has dominated the field for years. The first methods paper was published in Applied and Environmental Microbiology (Hungate et al. 2015), and this paper describes the mathematical model underlying the quantitative interpretation. A second methods paper (Schwartz et al. 2015) provides a conceptual overview of the method and its application to research problems. A third methods paper was just published (Koch et al. 2018), in which we develop the quantitative model combining sequencing and isotope data to estimate actual rates of microbial growth and death in natural populations. This work has met much enthusiasm in scientific presentations around the world. It has met with equally enthusiastic resistance in the peer-review process, though our record of publication to date argues that people are accepting the merits of the approach. The skepticism and resistance are also potentially signs that this technique is pushing the field forward, albeit with some of the discomfort that accompanies extrapolation. Part of this is a cultural element: the field of microbiology is not accustomed to the assumptions of ecosystem science. Research conducted in this project has pushed the philosophical perspective that major advances can occur when we advocate a sound merger between the traditions of strong inference in microbiology and those of grounded scaling in ecosystem science.
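The quantitative step can be illustrated in simplified form: isotope incorporation shifts the buoyant density of a taxon's DNA in a density gradient, and the observed shift relative to the maximum possible shift gives an excess atom fraction. The sketch below is a deliberately simplified linear interpolation, not the published qSIP model (which, per Hungate et al. 2015, also accounts for GC-dependent molecular weight); all density values are illustrative.

```python
def excess_atom_fraction(wad_labeled, wad_unlabeled, wad_fully_labeled):
    """Simplified qSIP-style estimate: the fraction of the maximum possible
    buoyant-density shift actually observed for a taxon. The wad_* values
    are weighted average densities (g/mL) read from gradient fractions."""
    shift = wad_labeled - wad_unlabeled
    max_shift = wad_fully_labeled - wad_unlabeled
    return shift / max_shift

# Illustrative densities: taxon shifted 0.005 of a possible 0.020 g/mL
eaf = excess_atom_fraction(1.705, 1.700, 1.720)
```

Combining per-taxon excess atom fractions with 16S abundance data is what turns the density separation into the growth-rate estimates described above.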

  12. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  13. The Application of Operations Research Techniques to the Evaluation of Military Management Information Systems.

    DTIC Science & Technology

    systems such as management information systems. To provide a methodology yielding quantitative results which may assist a commander and his staff in...this analysis, it is proposed that management information systems be evaluated as a whole by a technique defined as the semantic differential. Each

  14. Respondent Techniques for Reduction of Emotions Limiting School Adjustment: A Quantitative Review and Methodological Critique.

    ERIC Educational Resources Information Center

    Misra, Anjali; Schloss, Patrick J.

    1989-01-01

    The critical analysis of 23 studies using respondent techniques for the reduction of excessive emotional reactions in school children focuses on research design, dependent variables, independent variables, component analysis, and demonstrations of generalization and maintenance. Results indicate widespread methodological flaws that limit the…

  15. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using a PubMed literature search, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  16. The Effects of Performance Assessment Approach on Democratic Attitude of Students

    ERIC Educational Resources Information Center

    Yalcinkaya, Elvan

    2013-01-01

    The aim of the research is to analyze the effects of performance assessment approach on democratic attitude of students. The research model is an experimental design with pretest-posttest control groups. Both quantitative and qualitative techniques are used for gathering of data in this research. 46 students participated in this research, with 23…

  17. Mixed Methods Research in School Psychology: A Mixed Methods Investigation of Trends in the Literature

    ERIC Educational Resources Information Center

    Powell, Heather; Mihalas, Stephanie; Onwuegbuzie, Anthony J.; Suldo, Shannon; Daley, Christine E.

    2008-01-01

    This article illustrates the utility of mixed methods research (i.e., combining quantitative and qualitative techniques) to the field of school psychology. First, the use of mixed methods approaches in school psychology practice is discussed. Second, the mixed methods research process is described in terms of school psychology research. Third, the…

  18. Quantitative nanoparticle tracking: applications to nanomedicine.

    PubMed

    Huang, Feiran; Dempsey, Christopher; Chona, Daniela; Suh, Junghae

    2011-06-01

    Particle tracking is an invaluable technique to extract quantitative and qualitative information regarding the transport of nanomaterials through complex biological environments. This technique can be used to probe the dynamic behavior of nanoparticles as they interact with and navigate through intra- and extra-cellular barriers. In this article, we focus on the recent developments in the application of particle-tracking technology to nanomedicine, including the study of synthetic and virus-based materials designed for gene and drug delivery. Specifically, we cover research where mean square displacements of nanomaterial transport were explicitly determined in order to quantitatively assess the transport of nanoparticles through biological environments. Particle-tracking experiments can provide important insights that may help guide the design of more intelligent and effective diagnostic and therapeutic nanoparticles.
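    Since the abstract singles out mean square displacement (MSD) as the quantity used to assess nanoparticle transport, a minimal sketch may help. The 2D trajectory below is invented; a real analysis would average over many particles and fit MSD versus lag time to extract a diffusion coefficient or transport mode.

```python
# A minimal sketch of mean square displacement (MSD) analysis, the quantity
# highlighted in the review for nanoparticle transport. The trajectory is an
# invented 2D track; positions in micrometres, one frame per time step.

def mean_square_displacement(xs, ys, lag):
    """MSD at a given frame lag, averaged over all valid start frames."""
    disp = [(xs[i + lag] - xs[i]) ** 2 + (ys[i + lag] - ys[i]) ** 2
            for i in range(len(xs) - lag)]
    return sum(disp) / len(disp)

xs = [0.0, 0.1, 0.3, 0.2, 0.5, 0.6]   # invented x positions (um)
ys = [0.0, 0.2, 0.1, 0.4, 0.5, 0.4]   # invented y positions (um)
msd1 = mean_square_displacement(xs, ys, 1)
print(f"MSD at lag 1: {msd1:.4f} um^2")
```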

  19. Quantitative analysis of virgin coconut oil in cream cosmetics preparations using fourier transform infrared (FTIR) spectroscopy.

    PubMed

    Rohman, A; Man, Yb Che; Sismindari

    2009-10-01

    Today, virgin coconut oil (VCO) is becoming a valuable oil and an attractive topic for researchers because of its several biological activities. In the cosmetics industry, VCO is an excellent material that functions as a skin moisturizer and softener. Therefore, it is important to develop a quantitative analytical method offering a fast and reliable technique. Fourier transform infrared (FTIR) spectroscopy with the attenuated total reflectance (ATR) sampling technique can be successfully used to analyze VCO quantitatively in cream cosmetic preparations. A multivariate analysis using a partial least squares (PLS) calibration model revealed a good relationship between the actual and FTIR-predicted values of VCO, with a coefficient of determination (R2) of 0.998.
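    The calibration idea behind this kind of study can be sketched simply: a model is fit between known VCO concentrations and a spectral response, then inverted to predict unknowns. The paper uses partial least squares on full FTIR-ATR spectra; as a deliberately simplified stand-in, the sketch below fits a univariate least-squares line to a single invented absorbance band. All numbers are illustrative.

```python
# Hedged sketch of spectroscopic calibration: ordinary least squares on one
# invented absorbance band stands in for the multivariate PLS model the
# paper actually uses. Concentrations and absorbances are invented.

def fit_line(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

conc = [0.0, 5.0, 10.0, 20.0, 40.0]       # % VCO in cream (invented standards)
absb = [0.02, 0.13, 0.24, 0.45, 0.90]     # absorbance at one band (invented)
slope, intercept = fit_line(conc, absb)
predicted = (0.35 - intercept) / slope    # invert the line for an unknown sample
print(f"predicted VCO content: {predicted:.1f}%")
```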

  20. Bridging the Gap: The Case for Expanding Ethnographic Techniques in the Marketing Research Curriculum

    ERIC Educational Resources Information Center

    Freeman, Lynne; Spanjaard, Daniela

    2012-01-01

    This article challenges the content of most marketing research courses whereby students are indoctrinated into the qualitative-then-quantitative archetype commonly found in scholarly research, under the assumption that it is both sufficient and appropriate when equipping students with the necessary skills for business. By following this standard…

  1. Publication Bias in Research Synthesis: Sensitivity Analysis Using A Priori Weight Functions

    ERIC Educational Resources Information Center

    Vevea, Jack L.; Woods, Carol M.

    2005-01-01

    Publication bias, sometimes known as the "file-drawer problem" or "funnel-plot asymmetry," is common in empirical research. The authors review the implications of publication bias for quantitative research synthesis (meta-analysis) and describe existing techniques for detecting and correcting it. A new approach is proposed that is suitable for…

  2. [Progress of study on the detection technique of microRNA].

    PubMed

    Zhao, Hai-Feng; Yang, Ren-Chi

    2009-12-01

    MicroRNAs (miRNAs) are small noncoding RNA molecules that negatively regulate gene expression via degradation or translational repression of their targeted mRNAs. MiRNAs are involved in critical biologic processes, including development, cell differentiation, proliferation and the pathogenesis of disease. This review focuses on recent research on the detection techniques of miRNA, including the microarray technique, Northern blot, real-time quantitative PCR, detection techniques for miRNA function and so on.
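    Real-time quantitative PCR results of the kind reviewed here are commonly summarized with the 2^-ΔΔCt method: the target's Ct is normalized to a reference gene, then compared between sample and control. The sketch below uses invented Ct values and is a generic illustration, not a procedure taken from this review.

```python
# Hedged sketch of relative quantification by the 2^-ΔΔCt method, commonly
# used with real-time quantitative PCR. All Ct values are invented.

target_ct_sample,  ref_ct_sample  = 24.0, 18.0   # treated sample
target_ct_control, ref_ct_control = 26.5, 18.2   # control sample

delta_ct_sample  = target_ct_sample - ref_ct_sample      # normalize to reference
delta_ct_control = target_ct_control - ref_ct_control
fold_change = 2 ** -(delta_ct_sample - delta_ct_control)
print(f"fold change vs control: {fold_change:.2f}")      # >1 means up-regulated
```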

  3. Overcoming Methods Anxiety: Qualitative First, Quantitative Next, Frequent Feedback along the Way

    ERIC Educational Resources Information Center

    Bernstein, Jeffrey L.; Allen, Brooke Thomas

    2013-01-01

    Political Science research methods courses face two problems. First is what to cover, as there are too many techniques to explore in any one course. Second is dealing with student anxiety around quantitative material. We explore a novel way to approach these issues. Our students began by writing a qualitative paper. They followed with a term…

  4. "I'm Not a Natural Mathematician": Inquiry-Based Learning, Constructive Alignment and Introductory Quantitative Social Science

    ERIC Educational Resources Information Center

    Clark, Tom; Foster, Liam

    2017-01-01

    There is continuing concern about the paucity of social science graduates who have the quantitative skills required by academia and industry. Not only do students often lack the confidence to explore, and use, statistical techniques, the dominance of qualitative research in many disciplines has also often constrained programme-level integration of…

  5. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.
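    The statistical power analysis mentioned above can be sketched with the usual normal approximation for a two-sample comparison, n ≈ 2·((z_α/2 + z_β)/d)² per group. The formula and z-values below are standard; the effect size is an invented example, and the exact t-based calculation gives a slightly larger n.

```python
# A minimal sketch of a priori sample-size calculation (normal approximation
# for a two-sample comparison). z-values are hard-coded for alpha = 0.05
# (two-sided) and power = 0.80; the effect size is an invented example.
import math

def sample_size_per_group(effect_size_d, z_alpha2=1.9600, z_beta=0.8416):
    """Required n per group: 2 * ((z_alpha/2 + z_beta) / d)^2, rounded up."""
    return math.ceil(2 * ((z_alpha2 + z_beta) / effect_size_d) ** 2)

n = sample_size_per_group(0.5)   # medium effect size (Cohen's d = 0.5)
print(f"approx. n per group: {n}")
```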

  6. Using the Critical Incident Technique for Triangulation and Elaboration of Communication Management Competencies

    ERIC Educational Resources Information Center

    Brunton, Margaret Ann; Jeffrey, Lynn Maud

    2010-01-01

    This paper presents the findings from research using the critical incident technique to identify the use of key competencies for communication management practitioners. Qualitative data was generated from 202 critical incidents reported by 710 respondents. We also present a brief summary of the quantitative data, which identified two superordinate…

  7. An Investigation of Proposed Techniques for Quantifying Confidence in Assurance Arguments

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2016-01-01

    The use of safety cases in certification raises the question of assurance argument sufficiency and the issue of confidence (or uncertainty) in the argument's claims. Some researchers propose to model confidence quantitatively and to calculate confidence in argument conclusions. We know of little evidence to suggest that any proposed technique would deliver trustworthy results when implemented by system safety practitioners. Proponents do not usually assess the efficacy of their techniques through controlled experiment or historical study. Instead, they present an illustrative example where the calculation delivers a plausible result. In this paper, we review current proposals, claims made about them, and evidence advanced in favor of them. We then show that proposed techniques can deliver implausible results in some cases. We conclude that quantitative confidence techniques require further validation before they should be recommended as part of the basis for deciding whether an assurance argument justifies fielding a critical system.
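    A toy illustration of the kind of calculation the paper critiques may clarify the concern: naively treating sub-claim confidences as independent probabilities and multiplying them makes aggregate confidence collapse even when every sub-claim looks strong. Both the numbers and the aggregation rule are invented for illustration; they are not taken from the paper or from any proposed technique in particular.

```python
# Hedged toy example: a naive probabilistic aggregation of sub-claim
# confidences under a (dubious) independence assumption. Illustrative only;
# not a technique endorsed by, or taken from, the paper.

def naive_argument_confidence(subclaim_confidences):
    """Product of sub-claim confidences, assuming independence."""
    result = 1.0
    for c in subclaim_confidences:
        result *= c
    return result

conf = naive_argument_confidence([0.9] * 10)  # ten sub-claims, each "90% confident"
print(f"aggregate confidence: {conf:.3f}")    # collapses despite strong sub-claims
```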

  8. Teaching Action Research: The Role of Demographics

    ERIC Educational Resources Information Center

    Mcmurray, Adela J.

    2006-01-01

    This article summarizes a longitudinal study of employed MBA students with particular emphasis on findings involving their choice of action research model to implement personal and organizational change in their environment. A multi-method approach merging both quantitative and qualitative techniques was utilized. A questionnaire consisting of…

  9. Quantitative methods used in Australian health promotion research: a review of publications from 1992-2002.

    PubMed

    Smith, Ben J; Zehle, Katharina; Bauman, Adrian E; Chau, Josephine; Hawkshaw, Barbara; Frost, Steven; Thomas, Margaret

    2006-04-01

    This study examined the use of quantitative methods in Australian health promotion research in order to identify methodological trends and priorities for strengthening the evidence base for health promotion. Australian health promotion articles were identified by hand searching publications from 1992-2002 in six journals: Health Promotion Journal of Australia, Australian and New Zealand Journal of Public Health, Health Promotion International, Health Education Research, Health Education and Behavior and the American Journal of Health Promotion. The study designs and statistical methods used in articles presenting quantitative research were recorded. 591 (57.7%) of the 1,025 articles used quantitative methods. Cross-sectional designs were used in the majority (54.3%) of studies with pre- and post-test (14.6%) and post-test only (9.5%) the next most common designs. Bivariate statistical methods were used in 45.9% of papers, multivariate methods in 27.1% and simple numbers and proportions in 25.4%. Few studies used higher-level statistical techniques. While most studies used quantitative methods, the majority were descriptive in nature. The study designs and statistical methods used provided limited scope for demonstrating intervention effects or understanding the determinants of change.

  10. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include: research questions; data collection procedures and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Qualitative Research in Educational Communications and Technology: A Brief Introduction to Principles and Procedures

    ERIC Educational Resources Information Center

    Neuman, Delia

    2014-01-01

    Over the past 30 years, qualitative research has emerged as a widely accepted alternative to the quantitative paradigm for performing research in educational communications and technology. As the new paradigm has evolved, it has spawned a variety of theoretical perspectives and methodological techniques that have both increased its potential…

  12. Thermodynamics and Mechanics of Membrane Curvature Generation and Sensing by Proteins and Lipids

    PubMed Central

    Baumgart, Tobias; Capraro, Benjamin R.; Zhu, Chen; Das, Sovan L.

    2014-01-01

    Research investigating lipid membrane curvature generation and sensing is a rapidly developing frontier in membrane physical chemistry and biophysics. The fast recent progress is based on the discovery of a plethora of proteins involved in coupling membrane shape to cellular membrane function, the design of new quantitative experimental techniques to study aspects of membrane curvature, and the development of analytical theories and simulation techniques that allow a mechanistic interpretation of quantitative measurements. The present review first provides an overview of important classes of membrane proteins for which function is coupled to membrane curvature. We then survey several mechanisms that are assumed to underlie membrane curvature sensing and generation. Finally, we discuss relatively simple thermodynamic/mechanical models that allow quantitative interpretation of experimental observations. PMID:21219150
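    The analytical theories the abstract refers to are commonly built on the Helfrich bending energy. As a standard reference expression (not a formula taken from this abstract), the elastic energy of a membrane patch A with principal curvatures c1 and c2, spontaneous curvature c0, bending rigidity κ, and Gaussian modulus κ̄ can be written as:

```latex
E_{\mathrm{bend}} = \int_{A} \left[ \frac{\kappa}{2}\,\bigl(c_1 + c_2 - c_0\bigr)^{2} + \bar{\kappa}\, c_1 c_2 \right] \mathrm{d}A
```

    Curvature-sensing proteins effectively shift c0 or κ locally, which is how such models couple protein binding to membrane shape.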

  13. Quantitative secondary electron imaging for work function extraction at atomic level and layer identification of graphene

    PubMed Central

    Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou

    2016-01-01

    Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique which allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. The measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907

  14. The application of absolute quantitative (1)H NMR spectroscopy in drug discovery and development.

    PubMed

    Singh, Suruchi; Roy, Raja

    2016-07-01

    The identification of a drug candidate and its structural determination is the most important step in the process of drug discovery, and for this, nuclear magnetic resonance (NMR) is one of the most selective analytical techniques. The present review illustrates the various perspectives of absolute quantitative (1)H NMR spectroscopy in drug discovery and development. It deals with the fundamentals of quantitative NMR (qNMR), the physicochemical properties affecting qNMR, and the latest referencing techniques used for quantification. The precise application of qNMR during various stages of drug discovery and development, namely natural product research, drug quantitation in dosage forms, drug metabolism studies, impurity profiling and solubility measurements, is elaborated. To achieve this, the authors explore the literature of NMR in drug discovery and development between 1963 and 2015. It also takes into account several other reviews on the subject. qNMR experiments are used throughout drug discovery and development because qNMR is a non-destructive, versatile and robust technique with low intra- and inter-operator variability. However, there are several limitations: qNMR of complex biological samples is complicated by peak overlap and a low limit of quantification, which can be overcome by using hyphenated chromatographic techniques in addition to NMR.
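    The absolute quantitation the review describes typically rests on the standard internal-standard relationship, in which analyte purity follows from the ratio of proton-normalized integrals. The sketch below implements that textbook formula; the integrals, masses, and the benzoic-acid-like standard values are invented for illustration.

```python
# Sketch of the standard absolute qNMR internal-standard equation:
# P_a = (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (w_s/w_a) * P_s.
# All sample values below are invented for illustration.

def qnmr_purity(i_analyte, i_std, n_analyte, n_std,
                m_analyte, m_std, w_analyte, w_std, purity_std):
    """Analyte purity from integrals (I), proton counts (N),
    molar masses (M), weighed masses (w), and standard purity."""
    return ((i_analyte / i_std) * (n_std / n_analyte)
            * (m_analyte / m_std) * (w_std / w_analyte) * purity_std)

p = qnmr_purity(i_analyte=0.452, i_std=1.00, n_analyte=2, n_std=3,
                m_analyte=180.16, m_std=122.12,   # invented analyte; benzoic-acid-like standard
                w_analyte=10.2, w_std=10.0, purity_std=0.999)
print(f"analyte purity: {p:.3f}")
```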

  15. Berkeley Lab Wins Seven 2015 R&D 100 Awards | Berkeley Lab

    Science.gov Websites

    products from industry, academia, and government-sponsored research, ranging from chemistry to materials to problems in metrology techniques: the quantitative characterization of the imaging instrumentation Computational Research Division led the development of the technology. Sensor Integrated with Recombinant and

  16. Scientist | Center for Cancer Research

    Cancer.gov

    KEY ROLES/RESPONSIBILITIES The Scientist I will support research efforts to define the role of transcriptional regulators in myeloid cell development, and their potential role in leukemogenesis. This work will be accomplished by performing both molecular and stem cell biology techniques, cloning and construction of retroviral vectors, quantitative RT-PCR, cloning of conditional

  17. Characteristics of Successful Small and Micro Community Enterprises in Rural Thailand

    ERIC Educational Resources Information Center

    Ruengdet, Kamon; Wongsurawat, Winai

    2010-01-01

    This research aims to articulate the most salient factors that set apart successful small and micro community enterprises in the province of Phetchaburi, Thailand. The authors utilize both quantitative and qualitative research techniques. Approximately one hundred questionnaires were sent to leaders of the community enterprises. Simple statistical…

  18. The Effect of Six Thinking Hats on Student Success in Teaching Subjects Related to Sustainable Development in Geography Classes

    ERIC Educational Resources Information Center

    Kaya, Mehmet Fatih

    2013-01-01

    This study aimed to assess the effectiveness of six thinking hats technique in teaching subjects related to sustainable development in geography classes. The study was in both a quantitative and qualitative form. The quantitative part of the study was designed according to pre-test, post-test control group research model, and in the qualitative…

  19. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    NASA Astrophysics Data System (ADS)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems of existing methods for determining how construction processes and activities can be combined and technologically interlinked are considered under the modern conditions of constructing various facilities. The necessity of identifying common parameters that characterize the nature of interaction among all technologically related construction and installation processes and activities is shown. Research into the technologies of construction and installation processes for buildings and structures was conducted with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and works. The result of this research was a quantitative evaluation of the interaction of construction and installation processes and activities, expressed as the minimum technologically necessary volume of a preceding process that allows one to plan and organize the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties that are key for wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  20. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  1. Advanced hyphenated chromatographic-mass spectrometry in mycotoxin determination: current status and prospects.

    PubMed

    Li, Peiwu; Zhang, Zhaowei; Hu, Xiaofeng; Zhang, Qi

    2013-01-01

    Mass spectrometric techniques are essential for advanced research in food safety and environmental monitoring. These fields are important for securing the health of humans and animals, and for ensuring environmental security. Mycotoxins, toxic secondary metabolites of filamentous fungi, are major contaminants of agricultural products, food and feed, biological samples, and the environment as a whole. Mycotoxins can cause cancers, nephritic and hepatic diseases, various hemorrhagic syndromes, and immune and neurological disorders. Mycotoxin-contaminated food and feed can provoke trade conflicts, resulting in massive economic losses. Risk assessment of mycotoxin contamination for humans and animals generally depends on clear identification and reliable quantitation in diversified matrices. Pioneering work on mycotoxin quantitation using mass spectrometry (MS) was performed in the early 1970s. Now, unambiguous confirmation and quantitation of mycotoxins can be readily achieved with a variety of hyphenated techniques that combine chromatographic separation with MS, including liquid chromatography (LC) or gas chromatography (GC). With the advent of atmospheric pressure ionization, LC-MS has become a routine technique. Recently, the co-occurrence of multiple mycotoxins in the same sample has drawn an increasing amount of attention. Thus, modern analyses must be able to detect and quantitate multiple mycotoxins in a single run. Improvements in tandem MS techniques have been made to achieve this purpose. This review describes the advanced research that has been done regarding mycotoxin determination using hyphenated chromatographic-MS techniques, but is not a full-circle survey of all the literature published on this topic. The present work provides an overview of the various hyphenated chromatographic-MS-based strategies that have been applied to mycotoxin analysis, with a focus on recent developments. The use of chromatographic-MS to measure levels of mycotoxins, including aflatoxins, ochratoxins, patulin, trichothecenes, zearalenone, and fumonisins, is discussed in detail. Both free and masked mycotoxins are included in this review due to different methods of sample preparation. Techniques are described in terms of sample preparation, internal standards, LC/ultra performance LC (UPLC) optimization, and applications and survey. Several future hyphenated MS techniques are discussed as well, including multidimensional chromatography-MS, capillary electrophoresis-MS, and surface plasmon resonance array-MS. © 2013 Wiley Periodicals, Inc.
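    The internal-standard quantitation mentioned above reduces, in its simplest single-point form, to scaling the analyte/internal-standard peak-area ratio by the standard's concentration and a response factor. The sketch below shows that generic step with invented values; validated mycotoxin methods additionally assess recovery, matrix effects, and LOD/LOQ.

```python
# Generic sketch of single-point internal-standard quantitation used in
# hyphenated chromatographic-MS workflows:
# C_analyte = (A_analyte / A_IS) * C_IS / RF.  All values are invented.

def conc_by_internal_standard(area_analyte, area_is, conc_is, response_factor):
    """Analyte concentration from peak-area ratio, IS concentration,
    and a previously calibrated response factor (RF)."""
    return (area_analyte / area_is) * conc_is / response_factor

c = conc_by_internal_standard(area_analyte=5.2e5, area_is=4.0e5,
                              conc_is=10.0,          # ug/kg spiked IS (invented)
                              response_factor=1.3)   # from calibration (invented)
print(f"estimated concentration: {c:.2f} ug/kg")
```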

  2. Methodological triangulation in a study of social support for siblings of children with cancer.

    PubMed

    Murray, J S

    1999-10-01

    Triangulation is an approach to research that is becoming increasingly popular among nurse researchers. Five types of triangulation are used in nursing research: data, methodological, theoretical, researcher, and analytical triangulation. Methodological triangulation is an attempt to improve validity by combining various techniques in one study. In this article, an example of quantitative and qualitative triangulation is discussed to illustrate the procedures used and the results achieved. The secondary data used as an example are from a previous study that was conducted by the researcher and investigated nursing interventions used by pediatric oncology nurses to provide social support to siblings of children with cancer. Results show that methodological triangulation was beneficial in this study for three reasons. First, the careful comparison of quantitative and qualitative data added support for the social support variables under investigation. Second, the comparison showed more in-depth dimensions about pediatric oncology nurses providing social support to siblings of children with cancer. Finally, the use of methodological triangulation provided insight into revisions for the quantitative instrument.

  3. Early history of neutron scattering at Oak Ridge

    NASA Astrophysics Data System (ADS)

    Wilkinson, M. K.

    1986-03-01

    Most of the early development of neutron scattering techniques utilizing reactor neutrons occurred at the Oak Ridge National Laboratory during the years immediately following World War II. C.G. Shull, E.O. Wollan, and their associates systematically established neutron diffraction as a quantitative research tool and then applied this technique to important problems in nuclear physics, chemical crystallography, and magnetism. This article briefly summarizes the very important research at ORNL during this period, which laid the foundation for the establishment of neutron scattering programs throughout the world.

  4. Methodological Synthesis in Quantitative L2 Research: A Review of Reviews and a Case Study of Exploratory Factor Analysis

    ERIC Educational Resources Information Center

    Plonsky, Luke; Gonulal, Talip

    2015-01-01

    Research synthesis and meta-analysis provide a pathway to bring together findings in a given domain with greater systematicity, objectivity, and transparency than traditional reviews. The same techniques and corresponding benefits can be and have been applied to examine methodological practices in second language (L2) research (e.g., Plonsky,…

  5. Current and evolving echocardiographic techniques for the quantitative evaluation of cardiac mechanics: ASE/EAE consensus statement on methodology and indications endorsed by the Japanese Society of Echocardiography.

    PubMed

    Mor-Avi, Victor; Lang, Roberto M; Badano, Luigi P; Belohlavek, Marek; Cardim, Nuno Miguel; Derumeaux, Genevieve; Galderisi, Maurizio; Marwick, Thomas; Nagueh, Sherif F; Sengupta, Partho P; Sicari, Rosa; Smiseth, Otto A; Smulevitz, Beverly; Takeuchi, Masaaki; Thomas, James D; Vannan, Mani; Voigt, Jens-Uwe; Zamorano, Jose Luis

    2011-03-01

    Echocardiographic imaging is ideally suited for the evaluation of cardiac mechanics because of its intrinsically dynamic nature. Because, for decades, echocardiography has been the only imaging modality that allows dynamic imaging of the heart, it is only natural that new, increasingly automated techniques for sophisticated analysis of cardiac mechanics have been driven by researchers and manufacturers of ultrasound imaging equipment. Several such techniques have emerged over the past decades to address the issue of reader's experience and inter-measurement variability in interpretation. Some were widely embraced by echocardiographers around the world and became part of the clinical routine, whereas others remained limited to research and exploration of new clinical applications. Two such techniques have dominated the research arena of echocardiography: (1) Doppler-based tissue velocity measurements, frequently referred to as tissue Doppler or myocardial Doppler, and (2) speckle tracking on the basis of displacement measurements. Both types of measurements lend themselves to the derivation of multiple parameters of myocardial function. The goal of this document is to focus on the currently available techniques that allow quantitative assessment of myocardial function via image-based analysis of local myocardial dynamics, including Doppler tissue imaging and speckle-tracking echocardiography, as well as integrated backscatter analysis. This document describes the current and potential clinical applications of these techniques and their strengths and weaknesses, briefly surveys a selection of the relevant published literature while highlighting normal and abnormal findings in the context of different cardiovascular pathologies, and summarizes the unresolved issues, future research priorities, and recommended indications for clinical use.

  7. Mixing it but not mixed-up: mixed methods research in medical education (a critical narrative review).

    PubMed

    Maudsley, Gillian

    2011-01-01

    Some important research questions in medical education and health services research need 'mixed methods research' (particularly synthesizing quantitative and qualitative findings). The approach is not new, but it should be reported more explicitly. The broad search question here, addressed to a disjointed literature, was thus: what is mixed methods research, and how should it relate to medical education research?, with a focus on explicit acknowledgement of 'mixing'. Literature searching focused on Web of Knowledge, supplemented by other databases across disciplines. Five main messages emerged: thinking quantitative and qualitative, not quantitative versus qualitative; appreciating that mixed methods research blends different knowledge claims, enquiry strategies, and methods; using a 'horses for courses' [whatever works] approach to the question, and clarifying the mix; appreciating how medical education research competes with the 'evidence-based' movement, health services research, and the 'RCT'; and being more explicit about the role of mixed methods in medical education research, and the required expertise. Mixed methods research is valuable, yet the literature relevant to medical education is fragmented and poorly indexed. The required time, effort, expertise, and techniques deserve better recognition. More write-ups should explicitly discuss the 'mixing' (particularly of findings), rather than report separate components.

  8. Thermal Nondestructive Characterization of Corrosion in Boiler Tubes by Application of a Moving Line Heat Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor intensive and slow. This has resulted in a "spot check" approach to inspections, making thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in the inspection speed for large structures such as boiler waterwalls while still providing high-resolution thickness measurements. A theoretical basis for the technique will be presented, thus demonstrating the quantitative nature of the technique. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions will be presented.

  9. A Re-Examination of the Education Production Function Using Individual Participant Data

    ERIC Educational Resources Information Center

    Pigott, Therese D.; Williams, Ryan T.; Polanin, Joshua R.

    2011-01-01

    The focus and purpose of this research are to examine the benefits, limitations, and implications of Individual Participant Data (IPD) meta-analysis in education. Comprehensive research reviews in education have been limited to the use of aggregated data (AD) meta-analysis, techniques based on quantitatively combining information from studies on…

  10. Quantitative Analysis of the Interdisciplinarity of Applied Mathematics.

    PubMed

    Xie, Zheng; Duan, Xiaojun; Ouyang, Zhenzheng; Zhang, Pengyuan

    2015-01-01

    The increasing use of mathematical techniques in scientific research leads to the interdisciplinarity of applied mathematics. This viewpoint is validated quantitatively here by statistical and network analysis on the corpus PNAS 1999-2013. A network describing the interdisciplinary relationships between disciplines in a panoramic view is built based on the corpus. Specific network indicators show the hub role of applied mathematics in interdisciplinary research. The statistical analysis on the corpus content finds that algorithms, a primary topic of applied mathematics, positively correlates, increasingly co-occurs, and has an equilibrium relationship in the long-run with certain typical research paradigms and methodologies. The finding can be understood as an intrinsic cause of the interdisciplinarity of applied mathematics.
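
The co-occurrence and correlation analysis described in this record can be sketched in a few lines. The annual topic frequencies below are invented for illustration (they are not data from the PNAS corpus); the point is only the mechanics of testing whether two topics trend together:

```python
# Hypothetical annual frequencies of two topics in a corpus (fraction of
# papers mentioning each topic per year); invented numbers for illustration.
algorithms = [0.10, 0.11, 0.13, 0.14, 0.16, 0.17, 0.19, 0.21]
genomics   = [0.08, 0.09, 0.11, 0.12, 0.13, 0.15, 0.16, 0.18]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx ** 0.5 * vy ** 0.5)

# A correlation near 1 indicates the two topics increasingly co-occur over
# time -- one ingredient of the interdisciplinarity argument made above.
r = pearson(algorithms, genomics)
```

The study's actual analysis also involves network indicators and long-run equilibrium (cointegration-style) tests, which this sketch does not attempt to reproduce.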

  11. Thin-Film Material Science and Processing | Materials Science | NREL

    Science.gov Websites

    A prime example of this research is thin-film photovoltaics (PV). Researchers have developed a quantitative high-throughput technique that can measure many barriers in parallel.

  12. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
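
The core of the SSM pipeline described above (aligned shapes, PCA, PC weights for inter-sample comparison) can be sketched as follows. The "shapes" here are tiny synthetic landmark sets, not the paper's micro-CT tooth data, and the single variation mode is invented for illustration:

```python
import numpy as np

# Toy "shapes": each row is a flattened set of (x, y) landmark coordinates,
# assumed already aligned and registered to a baseline. Synthetic data only.
rng = np.random.default_rng(0)
mean_shape = np.array([0.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0])  # unit square
mode = np.array([0.0, 0.0, 0.5, 0.0, 0.5, 0.0, 0.0, 0.0])        # one variation mode
shapes = np.stack([mean_shape + rng.normal(0, 1) * mode + rng.normal(0, 0.01, 8)
                   for _ in range(30)])

# PCA via SVD of the centred data: the right singular vectors are the
# statistical "modes of variation" of the shape model.
centred = shapes - shapes.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)
explained = S**2 / np.sum(S**2)   # variance fraction per principal component

# PC weights (scores) place each specimen in a low-dimensional shape space,
# enabling the inter-sample comparison the paper demonstrates.
scores = centred @ Vt.T
```

With one dominant synthetic mode, the first principal component captures nearly all of the variance; real dental data would spread variance over several components.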

  13. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

    Water quality monitoring is a critical part of environmental management and protection, and being able to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample pre-preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis for liquid samples, and the article content includes LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting accuracy of analysis results. Although there have been many research works focusing on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist the readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.
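
Quantitative LIBS of the kind reviewed here typically rests on a calibration curve (emission-line intensity vs. known concentration) and a detection limit estimated by the common 3-sigma criterion. A minimal sketch, with invented intensities and concentrations:

```python
# Hypothetical calibration: peak intensity of one emission line vs. known
# concentration of a metal in spiked water standards (illustrative numbers).
conc = [0.0, 5.0, 10.0, 20.0, 40.0]            # mg/L
intensity = [12.0, 61.0, 115.0, 218.0, 428.0]  # arbitrary units

# Ordinary least-squares line through the calibration points.
n = len(conc)
mx = sum(conc) / n
my = sum(intensity) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, intensity))
         / sum((x - mx) ** 2 for x in conc))
intercept = my - slope * mx

# Limit of detection via the common 3-sigma criterion: 3 * std(blank) / slope.
blank = [11.0, 12.5, 12.0, 11.5, 13.0]
mb = sum(blank) / len(blank)
sigma_blank = (sum((b - mb) ** 2 for b in blank) / (len(blank) - 1)) ** 0.5
lod = 3 * sigma_blank / slope    # mg/L
```

Improving the detection limit, as the review notes, amounts to raising the slope (signal enhancement) or lowering the blank noise.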

  14. A simple technique to identify key recruitment issues in randomised controlled trials: Q-QAT - Quanti-Qualitative Appointment Timing.

    PubMed

    Paramasivan, Sangeetha; Strong, Sean; Wilson, Caroline; Campbell, Bruce; Blazeby, Jane M; Donovan, Jenny L

    2015-03-11

    Recruitment to pragmatic randomised controlled trials (RCTs) is acknowledged to be difficult, and few interventions have proved to be effective. Previous qualitative research has consistently revealed that recruiters provide imbalanced information about RCT treatments. However, qualitative research can be time-consuming to apply. Within a programme of research to optimise recruitment and informed consent in challenging RCTs, we developed a simple technique, Q-QAT (Quanti-Qualitative Appointment Timing), to systematically investigate and quantify the imbalance to help identify and address recruitment difficulties. The Q-QAT technique comprised: 1) quantification of time spent discussing the RCT and its treatments using transcripts of audio-recorded recruitment appointments, 2) targeted qualitative research to understand the obstacles to recruitment and 3) feedback to recruiters on opportunities for improvement. This was applied to two RCTs with different clinical contexts and recruitment processes. Comparisons were made across clinical centres, recruiters and specialties. In both RCTs, the Q-QAT technique first identified considerable variations in the time spent by recruiters discussing the RCT and its treatments. The patterns emerging from this initial quantification of recruitment appointments then enabled targeted qualitative research to understand the issues and make suggestions to improve recruitment. In RCT1, presentation of the treatments was balanced, but little time was devoted to describing the RCT. Qualitative research revealed patients would have considered participation, but lacked awareness of the RCT. In RCT2, the balance of treatment presentation varied by specialists and centres. Qualitative research revealed difficulties with equipoise and confidence among recruiters presenting the RCT. The quantitative and qualitative findings were well-received by recruiters and opportunities to improve information provision were discussed. 
A blind coding exercise across three researchers led to the development of guidelines that can be used to apply the Q-QAT technique to other difficult RCTs. The Q-QAT technique was easy to apply and rapidly identified obstacles to recruitment that could be understood through targeted qualitative research and addressed through feedback. The technique's combination of quantitative and qualitative findings enabled the presentation of a holistic picture of recruitment challenges and added credibility to the feedback process. Note: both RCTs in this manuscript asked to be anonymised, so no trial registration details are provided.

  15. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  16. Quantitative Analysis, Design, and Fabrication of Biosensing and Bioprocessing Devices in Living Cells

    DTIC Science & Technology

    2015-03-10

    AFRL-OSR-VA-TR-2015-0080: Biosensing and Bioprocessing Devices in Living Cells. Domitilla Del Vecchio, Massachusetts Institute of Technology, 77 Massachusetts… Grant FA9550-12-1-0129. …research is to develop quantitative techniques for the de novo design and fabrication of biosensing devices in living cells. Such devices will be entirely…

  17. Conceptual development and retention within the learning cycle

    NASA Astrophysics Data System (ADS)

    McWhirter, Lisa Jo

    1998-12-01

    This research was designed to achieve two goals: (1) examine concept development and retention within the learning cycle and (2) examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning group. Forty-eight sixth-grade students and one teacher at an urban middle school participated in the study. The research utilized both quantitative and qualitative analyses. Quantitative assessments included a concept mapping technique as well as teacher-generated multiple-choice tests. Preliminary quantitative analysis found that students' reading levels had an effect on students' pretest scores in both the concept mapping and the multiple-choice assessment. Therefore, a covariate design was implemented for the quantitative analyses. When quantitative analysis techniques were used to examine concept development and retention, it was discovered that the students' concept knowledge increased significantly from the conclusion of the term introduction phase to the conclusion of the expansion phase. These findings would indicate that all three phases of the learning cycle are necessary for conceptual development. However, quantitative analyses of concept maps indicated that this is not true for all students. Individual students showed evidence of concept development and integration at each phase. Therefore, concept development is individualized, and all phases of the learning cycle are not necessary for all students. As a result, individuals' assimilation, disequilibration, accommodation, and organization may not correlate with the phases of the learning cycle. Quantitative analysis also indicated a significant decrease in the retention of concepts over time. Qualitative analyses were used to examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning group. 
It was discovered that there was a correlation between teacher-student interaction, small-group interaction, and concept mediation. Students who engaged in a high level of teacher-student dialogue, through teacher-led discussions with integrated scaffolding techniques, were the same students who mediated the ideas within the small-group discussions. Those students whose teacher-student interactions consisted of dialogue with little positive teacher feedback made no contributions within the small group, regardless of their level of concept development.

  18. IOPS advisor: Research in progress on knowledge-intensive methods for irregular operations airline scheduling

    NASA Technical Reports Server (NTRS)

    Borse, John E.; Owens, Christopher C.

    1992-01-01

    Our research focuses on the problem of recovering from perturbations in large-scale schedules, specifically on the ability of a human-machine partnership to dynamically modify an airline schedule in response to unanticipated disruptions. This task is characterized by massive interdependencies and a large space of possible actions. Our approach is to apply the following: qualitative, knowledge-intensive techniques relying on a memory of stereotypical failures and appropriate recoveries; and quantitative techniques drawn from the Operations Research community's work on scheduling. Our main scientific challenge is to represent schedules, failures, and repairs so as to make both sets of techniques applicable to the same data. This paper outlines ongoing research in which we are cooperating with United Airlines to develop our understanding of the scientific issues underlying the practicalities of dynamic, real-time schedule repair.

  19. Combined qualitative and quantitative research designs.

    PubMed

    Seymour, Jane

    2012-12-01

    Mixed methods research designs have been recognized as important in addressing complexity and are recommended particularly in the development and evaluation of complex interventions. This article reports a review of studies in palliative care published between 2010 and March 2012 that combine qualitative and quantitative approaches. A synthesis of approaches to mixed methods research taken in 28 examples of published research studies of relevance to palliative and supportive care is provided, using a typology based on a classic categorization put forward in 1992. Mixed-method studies are becoming more frequently employed in palliative care research and resonate with the complexity of the palliative care endeavour. Undertaking mixed methods research requires a sophisticated understanding of the research process and recognition of some of the underlying complexities encountered when working with different traditions and perspectives on issues of: sampling, validity, reliability and rigour, different sources of data and different data collection and analysis techniques.

  20. Required, Practical, or Unnecessary? An Examination and Demonstration of Propensity Score Matching Using Longitudinal Secondary Data

    ERIC Educational Resources Information Center

    Padgett, Ryan D.; Salisbury, Mark H.; An, Brian P.; Pascarella, Ernest T.

    2010-01-01

    The sophisticated analytical techniques available to institutional researchers give them an array of procedures to estimate a causal effect using observational data. But as many quantitative researchers have discovered, access to a wider selection of statistical tools does not necessarily ensure construction of a better analytical model. Moreover,…
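
Propensity score matching, the technique this record examines, can be reduced to a small sketch once scores exist. The scores below are given directly (in practice they come from a logistic regression of treatment on covariates), and all unit IDs and values are invented:

```python
# Hypothetical propensity scores: unit id -> estimated probability of
# receiving the "treatment" (e.g., an educational program).
treated = {"t1": 0.62, "t2": 0.35, "t3": 0.80}
control = {"c1": 0.30, "c2": 0.60, "c3": 0.78, "c4": 0.90, "c5": 0.42}

# Greedy 1:1 nearest-neighbour matching without replacement, processing
# treated units from highest score down.
matches = {}
available = dict(control)
for unit, ps in sorted(treated.items(), key=lambda kv: kv[1], reverse=True):
    best = min(available, key=lambda c: abs(available[c] - ps))
    matches[unit] = best
    del available[best]
```

Outcome differences would then be computed over the matched pairs; real applications also check covariate balance and often impose a caliper on the allowed score distance.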

  1. The Relationship between Attention Levels and Class Participation of First-Year Students in Classroom Teaching Departments

    ERIC Educational Resources Information Center

    Sezer, Adem; Inel, Yusuf; Seçkin, Ahmet Çagdas; Uluçinar, Ufuk

    2017-01-01

    This study aimed to detect any relationship that may exist between classroom teacher candidates' class participation and their attention levels. The research method was a convergent parallel design, mixing quantitative and qualitative research techniques, and the study group was composed of 21 freshmen studying in the Classroom Teaching Department…

  2. Improving Symptom Control, QOL, and Quality of Care for Women with Breast Cancer: Developing a Research Program on Neurological Effects via Doctoral Education

    DTIC Science & Technology

    2006-06-01

    phenomenological study. Nursing Research, 41, 166-170. Beck, C. (1993). Teetering on the edge: A substantive theory… grounded theory: Strategies for qualitative research. Chicago: Aldine. Goldstein, D., Lu, Y., Detke, M., Lee, T., & Iyengar, S. (2005). Duloxetine vs… Sandelowski, M. (2000a). Combining qualitative and quantitative sampling, data collection, and analysis techniques in mixed-method studies. Research

  3. Osteoporosis Imaging: State of the Art and Advanced Imaging

    PubMed Central

    2012-01-01

    Osteoporosis is becoming an increasingly important public health issue, and effective treatments to prevent fragility fractures are available. Osteoporosis imaging is of critical importance in identifying individuals at risk for fractures who would require pharmacotherapy to reduce fracture risk and also in monitoring response to treatment. Dual x-ray absorptiometry is currently the state-of-the-art technique to measure bone mineral density and to diagnose osteoporosis according to the World Health Organization guidelines. Motivated by a 2000 National Institutes of Health consensus conference, substantial research efforts have focused on assessing bone quality by using advanced imaging techniques. Among these techniques aimed at better characterizing fracture risk and treatment effects, high-resolution peripheral quantitative computed tomography (CT) currently plays a central role, and a large number of recent studies have used this technique to study trabecular and cortical bone architecture. Other techniques to analyze bone quality include multidetector CT, magnetic resonance imaging, and quantitative ultrasonography. In addition to quantitative imaging techniques measuring bone density and quality, imaging needs to be used to diagnose prevalent osteoporotic fractures, such as spine fractures on chest radiographs and sagittal multidetector CT reconstructions. Radiologists need to be sensitized to the fact that the presence of fragility fractures will alter patient care, and these fractures need to be described in the report. This review article covers state-of-the-art imaging techniques to measure bone mineral density, describes novel techniques to study bone quality, and focuses on how standard imaging techniques should be used to diagnose prevalent osteoporotic fractures. © RSNA, 2012 PMID:22438439
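
The WHO diagnostic criterion mentioned above is defined through the T-score: the patient's bone mineral density expressed in standard deviations from a young-adult reference mean. A minimal sketch (the reference mean and SD below are illustrative placeholders, not a real normative database):

```python
# Hypothetical young-adult reference values for one skeletal site.
YOUNG_ADULT_MEAN = 1.00   # g/cm^2
YOUNG_ADULT_SD = 0.12

def t_score(bmd):
    """T-score: SDs of the measured BMD below/above the young-adult mean."""
    return (bmd - YOUNG_ADULT_MEAN) / YOUNG_ADULT_SD

def who_category(t):
    """WHO operational definitions based on the T-score."""
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "normal"

t = t_score(0.64)            # roughly -3.0 with these reference values
category = who_category(t)
```

Z-scores (age-matched rather than young-adult reference) follow the same arithmetic with a different reference population.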

  4. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites in tissue at therapeutic levels.

    PubMed

    Sun, Na; Walch, Axel

    2013-08-01

    Mass spectrometry imaging (MSI) is a rapidly evolving technology that yields qualitative and quantitative distribution maps of small pharmaceutical-active molecules and their metabolites in tissue sections in situ. The simplicity, high sensitivity and ability to provide comprehensive spatial distribution maps of different classes of biomolecules make MSI a valuable tool to complement histopathology for diagnostics and biomarker discovery. In this review, qualitative and quantitative MSI of drugs and metabolites in tissue at therapeutic levels are discussed and the impact of this technique in drug discovery and clinical research is highlighted.

  5. Quantitative electron density characterization of soft tissue substitute plastic materials using grating-based x-ray phase-contrast imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarapata, A.; Chabior, M.; Zanette, I.

    2014-10-15

    Many scientific research areas rely on accurate electron density characterization of various materials. For instance, in X-ray optics and radiation therapy, there is a need for a fast and reliable technique to quantitatively characterize samples for electron density. We present how a precise measurement of electron density can be performed using an X-ray phase-contrast grating interferometer in a radiographic mode of a homogenous sample in a controlled geometry. A batch of various plastic materials was characterized quantitatively and compared with calculated results. We found that the measured electron densities closely match theoretical values. The technique yields comparable results between a monochromatic and a polychromatic X-ray source. Measured electron densities can be further used to design dedicated X-ray phase contrast phantoms, and the additional information on small angle scattering should be taken into account in order to exclude unsuitable materials.
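
The conversion underlying this measurement is the standard relation between the refractive index decrement delta (obtained from the interferometer's phase signal) and electron density, rho_e = 2*pi*delta / (r_e * lambda^2), with r_e the classical electron radius. A sketch with an invented delta value:

```python
import math

# Physical constants / beam parameters.
r_e = 2.8179403262e-15                            # classical electron radius, m
energy_keV = 25.0
wavelength = 1.23984193e-6 / (energy_keV * 1e3)   # hc/E, E in eV -> lambda in m

# Hypothetical measured refractive index decrement for a plastic sample.
delta = 2.0e-7

# Electron density in electrons per m^3.
rho_e = 2 * math.pi * delta / (r_e * wavelength**2)
```

For context, water's electron density is about 3.3e29 electrons/m^3, so plastics land in the same order of magnitude; the paper's contribution is showing the interferometric route to delta is precise enough to resolve the differences between such materials.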

  6. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several advantages over traditional biochemical approaches, including reduced specimen contact, non-destructive measurement, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50-nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS band of creatinine (104 mg/dl) in artificial urine, from 1400 cm-1 to 1500 cm-1, was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant (55.9 mg/dl to 208 mg/dl) concentration range. The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human subject urine creatinine detection, and establishes the SERS platform technique for bodily fluids measurement.
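
The RMSECV figure reported above comes from leave-one-out cross-validation of a calibration model. The sketch below shows that validation loop with synthetic data and a single spectral feature fit by ordinary least squares standing in for the paper's multivariate PLS model (so the resulting error is not comparable to the reported 26.1 mg/dl):

```python
# Synthetic calibration set: known concentrations vs. one SERS band intensity.
conc = [55.9, 80.0, 104.0, 130.0, 160.0, 185.0, 208.0]   # mg/dl
peak = [0.28, 0.41, 0.52, 0.66, 0.79, 0.94, 1.05]        # arbitrary units

def fit(xs, ys):
    """Least-squares slope/intercept for predicting ys from xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return b, my - b * mx

# Leave-one-out cross-validation: hold out each sample, refit, predict it.
sq_errors = []
for i in range(len(conc)):
    xs = [p for j, p in enumerate(peak) if j != i]
    ys = [c for j, c in enumerate(conc) if j != i]
    slope, intercept = fit(xs, ys)
    pred = slope * peak[i] + intercept
    sq_errors.append((pred - conc[i]) ** 2)

rmsecv = (sum(sq_errors) / len(sq_errors)) ** 0.5
```

A full PLS model would use the whole 1400-1500 cm-1 band rather than one peak, but the cross-validation and RMSECV arithmetic are the same.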

  7. Quantitative Phase Fraction Detection in Organic Photovoltaic Materials through EELS Imaging

    DOE PAGES

    Dyck, Ondrej; Hu, Sheng; Das, Sanjib; ...

    2015-11-24

    Organic photovoltaic materials have recently seen intense interest from the research community. Improvements in device performance are occurring at an impressive rate; however, visualization of the active layer phase separation still remains a challenge. Our paper outlines the application of two electron energy-loss spectroscopic (EELS) imaging techniques that can complement and enhance current phase detection techniques. Specifically, the bulk plasmon peak position, often used to produce contrast between phases in energy filtered transmission electron microscopy (EFTEM), is quantitatively mapped across a sample cross section. One complementary spectrum image capturing the carbon and sulfur core loss edges is compared with the plasmon peak map and found to agree quite well, indicating that carbon and sulfur density differences between the two phases also allow phase discrimination. Additionally, an analytical technique for determining absolute atomic areal density is used to produce an absolute carbon and sulfur areal density map. We also show how these maps may be re-interpreted as a phase ratio map, giving quantitative information about the purity of the phases within the junction.

  8. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  10. Thermographic Imaging of Material Loss in Boiler Water-Wall Tubing by Application of Scanning Line Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Localized wall thinning due to corrosion in utility boiler water-wall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proven to be very manpower and time intensive. This has resulted in a spot check approach to inspections, documenting thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed for large structures such as boiler water-walls. A theoretical basis for the technique will be presented which explains the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data. Additionally, the results of applying this technology to actual water-wall tubing samples and in situ inspections will be presented.
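
    The dynamic calibration idea described above can be illustrated schematically: a measured surface-temperature contrast is mapped to remaining wall thickness by interpolating a calibration curve. All numbers below are invented for illustration; a real system would build the curve from reference specimens of known thickness.

```python
import numpy as np

# Hypothetical calibration curve: thinner walls produce larger
# surface-temperature contrasts behind the scanning line source.
calib_contrast = np.array([0.5, 1.0, 2.0, 4.0, 8.0])    # K, temperature contrast
calib_thickness = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # mm, remaining wall thickness

# Map one measured contrast to a thickness estimate by interpolation.
measured_contrast = 3.0
thickness = np.interp(measured_contrast, calib_contrast, calib_thickness)
print(thickness)   # 3.5 mm
```

    Applying the same mapping pixel-by-pixel to the reconstructed temperature image yields a quantitative thickness map of the tubing wall.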

  11. Analysis of 4th Grade Students' Problem Solving Skills in Terms of Several Variables

    ERIC Educational Resources Information Center

    Sungur, Gülcan; Bal, Pervin Nedim

    2016-01-01

    The aim of this study is to examine whether the problem-solving levels of primary school students differ according to some demographic variables. The research is a descriptive study using the general survey method and was carried out with quantitative research techniques. The sample of the study consisted of 587 primary school students in Grade 4. The…

  12. Quantitative impedimetric monitoring of cell migration under the stimulation of cytokine or anti-cancer drug in a microfluidic chip

    PubMed Central

    Xiao, Xia; Lei, Kin Fong; Huang, Chia-Hao

    2015-01-01

    Cell migration is a cellular response involved in various biological processes such as cancer metastasis, the primary cause of death for cancer patients. Quantitative investigation of the correlation between cell migration and extracellular stimulation is essential for developing effective therapeutic strategies for controlling invasive cancer cells. The conventional method of determining cell migration rate by comparing successive images may not be an objective approach. In this work, a microfluidic chip embedded with measurement electrodes has been developed to quantitatively monitor cell migration activity based on the impedimetric measurement technique. A no-damage wound was constructed by a microfluidic phenomenon, and cell migration activity under the stimulation of a cytokine (interleukin-6) and an anti-cancer drug (doxorubicin) was investigated. Impedance measurement was performed concurrently during the cell migration process. The impedance change was directly correlated to the cell migration activity; therefore, the migration rate could be calculated. A good match was found between impedance measurement and conventional imaging analysis, but the impedimetric technique provides an objective and quantitative measurement. Based on this technique, cell migration rates were calculated to be 8.5, 19.1, and 34.9 μm/h under stimulation with cytokine at concentrations of 0 (control), 5, and 10 ng/ml. This technique has high potential to be developed into a powerful analytical platform for cancer research. PMID:26180566
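
    As an illustration of how a migration rate in μm/h can be derived from wound-closure data, here is a hypothetical sketch: the time points and gap widths are invented, and direct width measurements stand in for the paper's impedance readout.

```python
import numpy as np

# Invented wound-closure measurements: the cell-free gap narrows as
# cells on both sides migrate inward.
hours = np.array([0.0, 4.0, 8.0, 12.0, 16.0])
wound_width_um = np.array([500.0, 430.0, 362.0, 290.0, 221.0])

# Linear fit gives the closure rate (um/h); each of the two cell
# fronts advances at half that rate.
closure_rate, _ = np.polyfit(hours, wound_width_um, 1)
migration_rate = -closure_rate / 2.0
print(migration_rate)
```

    For these invented numbers the fit gives a migration rate of roughly 8.7 μm/h, the same order as the control condition reported in the abstract.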

  13. Quantitative mass spectrometry: an overview

    NASA Astrophysics Data System (ADS)

    Urban, Pawel L.

    2016-10-01

    Mass spectrometry (MS) is a mainstream chemical analysis technique in the twenty-first century. It has contributed to numerous discoveries in chemistry, physics and biochemistry. Hundreds of research laboratories scattered all over the world use MS every day to investigate fundamental phenomena on the molecular level. MS is also widely used by industry, especially in drug discovery, quality control and food safety protocols. In some cases, mass spectrometers are indispensable and irreplaceable by any other metrological tools. The uniqueness of MS is due to the fact that it enables direct identification of molecules based on the mass-to-charge ratios as well as fragmentation patterns. Thus, for several decades now, MS has been used in qualitative chemical analysis. To address the pressing need for quantitative molecular measurements, a number of laboratories focused on technological and methodological improvements that could render MS a fully quantitative metrological platform. In this theme issue, the experts working for some of those laboratories share their knowledge and enthusiasm about quantitative MS. I hope this theme issue will benefit readers, and foster fundamental and applied research based on quantitative MS measurements. This article is part of the themed issue 'Quantitative mass spectrometry'.
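
    One common route to quantitative MS, though not detailed in this overview, is stable-isotope internal-standard quantification: a known amount of an isotopically labeled analogue is spiked into the sample, and the analyte concentration follows from the peak-area ratio. The peak areas and concentration below are hypothetical.

```python
# Hypothetical internal-standard quantification.
analyte_area = 4.2e6        # integrated peak area of the analyte
internal_std_area = 2.1e6   # peak area of the labeled internal standard
internal_std_conc = 5.0     # known spiked concentration, ng/ml

# Assuming a 1:1 response factor, concentration scales with the area ratio.
analyte_conc = (analyte_area / internal_std_area) * internal_std_conc
print(analyte_conc)   # 10.0 ng/ml
```

    In practice the response factor is determined from a calibration series rather than assumed equal to one.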

  14. Reclaiming Our "Toughest" Youth

    ERIC Educational Resources Information Center

    Gharabaghi, Kiaras

    2008-01-01

    Some so-called "evidence-based" interventions are narrow methods which are justified by some quantitative research. This limited focus ignores broader qualitative studies showing that interpersonal relationships wield more impact than technique. Even a cursory review of youth-serving organizations demonstrates that the overwhelming majority of…

  15. Objective evaluation of reconstruction methods for quantitative SPECT imaging in the absence of ground truth.

    PubMed

    Jha, Abhinav K; Song, Na; Caffo, Brian; Frey, Eric C

    2015-04-13

    Quantitative single-photon emission computed tomography (SPECT) imaging is emerging as an important tool in clinical studies and biomedical research. There is thus a need for optimization and evaluation of systems and algorithms that are being developed for quantitative SPECT imaging. An appropriate objective method to evaluate these systems is to compare their performance on the end task required in quantitative SPECT imaging, such as estimating the mean activity concentration in a volume of interest (VOI) in a patient image. This objective evaluation can be performed if the true value of the estimated parameter is known, i.e., if we have a gold standard. However, very rarely is this gold standard known in human studies. Thus, no-gold-standard techniques are required to optimize and evaluate systems and algorithms in the absence of a gold standard. In this work, we developed a no-gold-standard technique to objectively evaluate reconstruction methods used in quantitative SPECT when the parameter to be estimated is the mean activity concentration in a VOI. We studied the performance of the technique with realistic simulated image data generated from an object database consisting of five phantom anatomies with all possible combinations of five sets of organ uptakes, where each anatomy consisted of eight different organ VOIs. Results indicate that the method provided accurate ranking of the reconstruction methods. We also demonstrated the application of consistency checks to test the no-gold-standard output.

  16. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    PubMed

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and tailored surgical planning to the individual.

  17. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is made on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Obesity prevention: Comparison of techniques and potential solution

    NASA Astrophysics Data System (ADS)

    Zulkepli, Jafri; Abidin, Norhaslinda Zainal; Zaibidi, Nerda Zura

    2014-12-01

    Over the years, obesity prevention has been a broadly studied subject among both academicians and practitioners. It is one of the most serious public health issues, as it can cause numerous chronic health and psychosocial problems. Research is needed to suggest a population-based strategy for obesity prevention. In the academic environment, the importance of obesity prevention has triggered various problem-solving approaches. A good obesity prevention model should comprehend and cater for all complex and dynamic issues. Hence, the main purpose of this paper is to discuss the qualitative and quantitative approaches to obesity prevention and to provide an extensive literature review of various recent modelling techniques for obesity prevention. Based on these literatures, the quantitative and qualitative approaches are compared, and the use of the system dynamics technique to model the obese population is justified. Lastly, a potential framework solution based on system dynamics modelling is proposed.

  19. Caries Detection Methods Based on Changes in Optical Properties between Healthy and Carious Tissue

    PubMed Central

    Karlsson, Lena

    2010-01-01

    A conservative, noninvasive or minimally invasive approach to clinical management of dental caries requires diagnostic techniques capable of detecting and quantifying lesions at an early stage, when progression can be arrested or reversed. Objective evidence of initiation of the disease can be detected in the form of distinct changes in the optical properties of the affected tooth structure. Caries detection methods based on changes in a specific optical property are collectively referred to as optically based methods. This paper presents a simple overview of the feasibility of three such technologies for quantitative or semiquantitative assessment of caries lesions. Two of the techniques are well-established: quantitative light-induced fluorescence, which is used primarily in caries research, and laser-induced fluorescence, a commercially available method used in clinical dental practice. The third technique, based on near-infrared transillumination of dental enamel is in the developmental stages. PMID:20454579

  20. MO-E-12A-01: Quantitative Imaging: Techniques, Applications, and Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, E; Jeraj, R; McNitt-Gray, M

    The first symposium in the Quantitative Imaging Track focused on the introduction of quantitative imaging (QI) by illustrating the potential of QI in diagnostic and therapeutic applications in research and patient care, highlighting key challenges in implementation of such QI applications, and reviewing QI efforts of selected national and international agencies and organizations, including the FDA, NCI, NIST, and RSNA. This second QI symposium will focus more specifically on the techniques, applications, and challenges of QI. The first talk of the session will focus on modality-agnostic challenges of QI, beginning with challenges of the development and implementation of QI applications in single-center, single-vendor settings and progressing to the challenges encountered in the most general setting of multi-center, multi-vendor settings. The subsequent three talks will focus on specific QI challenges and opportunities in the modality-specific settings of CT, PET/CT, and MR. Each talk will provide information on modality-specific QI techniques, applications, and challenges, including current efforts focused on solutions to such challenges. Learning Objectives: Understand key general challenges of QI application development and implementation, regardless of modality. Understand selected QI techniques and applications in CT, PET/CT, and MR. Understand challenges, and potential solutions for such challenges, for the applications presented for each modality.

  1. Nuclear medicine and quantitative imaging research (quantitative studies in radiopharmaceutical science): Comprehensive progress report, April 1, 1986-December 31, 1988

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cooper, M.D.; Beck, R.N.

    1988-06-01

    This document describes several years of research to improve PET imaging and diagnostic techniques in man. This program addresses the problems involving the basic science and technology underlying the physical and conceptual tools of radioactive tracer methodology as they relate to the measurement of structural and functional parameters of physiologic importance in health and disease. The principal tool is quantitative radionuclide imaging. The overall objective of this program is to further the development and transfer of radiotracer methodology from basic theory to routine clinical practice in order that individual patients and society as a whole will receive the maximum net benefit from the new knowledge gained. The focus of the research is on the development of new instruments and radiopharmaceuticals, and the evaluation of these through the phase of clinical feasibility. The reports in the study were processed separately for the data bases. (TEM)

  2. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  3. Large-scale quantitative analysis of painting arts.

    PubMed

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-11

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have reached a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings and build a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images: the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety in the medieval period. Interestingly, moreover, the increase in the roughness exponent as painting techniques such as chiaroscuro and sfumato advanced is consistent with historical circumstances.
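
    A simplified stand-in for one of the three measures, the variety of colors, can be computed as the Shannon entropy of a quantized color histogram. The paper's exact definitions may differ, and the image below is random synthetic data rather than a painting.

```python
import numpy as np

# Fake RGB image standing in for a digitized painting.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(64, 64, 3))

# Quantize each channel to 8 levels, giving 8^3 = 512 color bins.
bins = image // 32
codes = bins[..., 0] * 64 + bins[..., 1] * 8 + bins[..., 2]
counts = np.bincount(codes.ravel(), minlength=512)

# Shannon entropy of the color histogram, in bits (max 9 for 512 bins).
p = counts / counts.sum()
p = p[p > 0]
color_entropy = float(-(p * np.log2(p)).sum())
print(color_entropy)
```

    A painting dominated by a few pigments would score low on this measure, while one drawing on a broad palette would score close to the maximum.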

  4. A New Quantitative 3D Imaging Method for Characterizing Spray in the Near-field of Nozzle Exits

    DTIC Science & Technology

    2015-01-13

    This research developed a quantitative measurement technique to examine the dense near-field region of sprays using X-ray computed tomography (CT). Measurements were performed on an optimized flat-panel tabletop cone-beam CT system in the Radiology Department at Stanford University. Keywords: X-ray CT, spray, hollow cone spray, near field.

  5. A quantitative evaluation of the public response to climate engineering

    NASA Astrophysics Data System (ADS)

    Wright, Malcolm J.; Teagle, Damon A. H.; Feetham, Pamela M.

    2014-02-01

    Atmospheric greenhouse gas concentrations continue to increase, with CO2 passing 400 parts per million in May 2013. To avoid severe climate change and the attendant economic and social dislocation, existing energy efficiency and emissions control initiatives may need support from some form of climate engineering. As climate engineering will be controversial, there is a pressing need to inform the public and understand their concerns before policy decisions are taken. So far, engagement has been exploratory, small-scale or technique-specific. We depart from past research to draw on the associative methods used by corporations to evaluate brands. A systematic, quantitative and comparative approach for evaluating public reaction to climate engineering is developed. Its application reveals that the overall public evaluation of climate engineering is negative. Where there are positive associations they favour carbon dioxide removal (CDR) over solar radiation management (SRM) techniques. Therefore, as SRM techniques become more widely known they are more likely to elicit negative reactions. Two climate engineering techniques, enhanced weathering and cloud brightening, have indistinct concept images and so are less likely to draw public attention than other CDR or SRM techniques.

  6. Studies Relevant to Catalytic Activation of CO & Other Small Molecules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ford, Peter C

    2005-02-22

    Detailed annual and triannual reports describing the progress accomplished during the tenure of this grant were filed with the Program Manager for Catalysis at the Office of Basic Energy Sciences. To avoid unnecessary duplication, the present report will provide a brief overview of the research areas that were sponsored by this grant and list the resulting publications and theses based on this DOE supported research. The scientific personnel participating in (and trained by) this grant's research are also listed. Research carried out under this DOE grant was largely concerned with the mechanisms of the homogeneous catalytic and photocatalytic activation of small molecules such as carbon monoxide, dihydrogen and various hydrocarbons. Much of the more recent effort has focused on the dynamics and mechanisms of reactions relevant to substrate carbonylations by homogeneous organometallic catalysts. A wide range of modern investigative techniques were employed, including quantitative fast reaction methodologies such as time-resolved optical (TRO) and time-resolved infrared (TRIR) spectroscopy and stopped-flow kinetics. Although somewhat diverse, this research falls within the scope of the long-term objective of applying quantitative techniques to elucidate the dynamics and understand the principles of mechanisms relevant to the selective and efficient catalytic conversions of fundamental feedstocks to higher value materials.

  7. Quantitative Analysis of TDLUs using Adaptive Morphological Shape Techniques

    PubMed Central

    Rosebrock, Adrian; Caban, Jesus J.; Figueroa, Jonine; Gierach, Gretchen; Linville, Laura; Hewitt, Stephen; Sherman, Mark

    2014-01-01

    Within the complex branching system of the breast, terminal duct lobular units (TDLUs) are the anatomical location where most cancer originates. With aging, TDLUs undergo physiological involution, reflected in a loss of structural components (acini) and a reduction in total number. Data suggest that women undergoing benign breast biopsies that do not show age-appropriate involution are at increased risk of developing breast cancer. To date, TDLU assessments have generally been made by qualitative visual assessment rather than by objective quantitative analysis. This paper introduces a technique to automatically estimate a set of quantitative measurements and use those variables to more objectively describe and classify TDLUs. To validate the accuracy of our system, we compared the computer-based morphological properties of 51 TDLUs in breast tissues donated for research by volunteers in the Susan G. Komen Tissue Bank with the assessments of a pathologist, demonstrating 70% agreement. Secondly, to show that our method is applicable to a wider range of datasets, we analyzed 52 TDLUs from biopsies performed for clinical indications in the National Cancer Institute’s Breast Radiology Evaluation and Study of Tissues (BREAST) Stamp Project and obtained 82% correlation with visual assessment. Lastly, we demonstrate the ability to uncover novel measures when researching the structural properties of the acini by applying machine learning and clustering techniques. Through our study we found that while the number of acini per TDLU increases exponentially with the TDLU diameter, the average elongation and roundness remain constant. PMID:25722829

  8. EyeTribe Tracker Data Accuracy Evaluation and Its Interconnection with Hypothesis Software for Cartographic Purposes.

    PubMed

    Popelka, Stanislav; Stachoň, Zdeněk; Šašinka, Čeněk; Doležalová, Jitka

    2016-01-01

    The mixed research design is a progressive methodological discourse that combines the advantages of quantitative and qualitative methods. Its possibilities of application are, however, dependent on the efficiency with which the particular research techniques are used and combined. The aim of the paper is to introduce the possible combination of Hypothesis with EyeTribe tracker. The Hypothesis is intended for quantitative data acquisition and the EyeTribe is intended for qualitative (eye-tracking) data recording. In the first part of the paper, Hypothesis software is described. The Hypothesis platform provides an environment for web-based computerized experiment design and mass data collection. Then, evaluation of the accuracy of data recorded by EyeTribe tracker was performed with the use of concurrent recording together with the SMI RED 250 eye-tracker. Both qualitative and quantitative results showed that data accuracy is sufficient for cartographic research. In the third part of the paper, a system for connecting EyeTribe tracker and Hypothesis software is presented. The interconnection was performed with the help of developed web application HypOgama. The created system uses open-source software OGAMA for recording the eye-movements of participants together with quantitative data from Hypothesis. The final part of the paper describes the integrated research system combining Hypothesis and EyeTribe.

  10. Preliminary results of real-time in-vitro electronic speckle pattern interferometry (ESPI) measurements in otolaryngology

    NASA Astrophysics Data System (ADS)

    Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.

    1995-05-01

    Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique that is capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state-of-the-art developments in the areas of lasers, fiber optics and solid state detector technology, this technique has become applicable in medical research and diagnostics. Based on initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of instruments for improved diagnostic evaluation in otolaryngological applications based on the technique of ESPI. These compact fiber optic instruments are capable of making real-time interferometric measurements of the target tissue. Ongoing development of image post-processing software is currently capable of extracting the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification will be performed in hardware in near real-time. Subsurface details of both the tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis, laryngeal tumors, and aid in the evaluation of surgical procedures.

  11. A real-time interferometer technique for compressible flow research

    NASA Technical Reports Server (NTRS)

    Bachalo, W. D.; Houser, M. J.

    1984-01-01

    Strengths and shortcomings in the application of interferometric techniques to transonic flow fields are examined and an improved method is elaborated. Such applications have demonstrated the value of interferometry in obtaining data for compressible flow research. With holographic techniques, interferometry may be applied in large scale facilities without the use of expensive optics or elaborate vibration isolation equipment. Results obtained using holographic interferometry and other methods demonstrate that reliable qualitative and quantitative data can be acquired. Nevertheless, the conventional method can be difficult to set up and apply, and it cannot produce real-time data. A new interferometry technique is investigated that promises to be easier to apply and can provide real-time information. This single-beam technique has the necessary insensitivity to vibration for large scale wind tunnel operations. Capabilities of the method and preliminary tests on some laboratory-scale flow fields are described.

  12. Development of a quantitative intracranial vascular features extraction tool on 3D MRA using semiautomated open-curve active contour vessel tracing.

    PubMed

    Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun

    2018-06-01

    To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups for regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive values (PPV) for bifurcations, and 85% sensitivity and PPV for stenosis. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery, as well as vascular group in each of the MRA images, could be extracted to comprehensively reflect characteristics, distribution, and connectivity of arteries. Length for the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D-MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions. Magn Reson Med 79:3229-3238, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
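
The sensitivity and positive predictive value figures quoted above follow from standard counts of true positives, false positives, and false negatives. A minimal sketch (the counts below are illustrative, not the study's data):

```python
def sensitivity_ppv(tp, fp, fn):
    # Sensitivity = TP / (TP + FN): fraction of true bifurcations/stenoses found.
    # PPV         = TP / (TP + FP): fraction of detections that are real.
    return tp / (tp + fn), tp / (tp + fp)

# Hypothetical counts: 8 true detections, 2 false alarms, 2 misses.
sens, ppv = sensitivity_ppv(8, 2, 2)
```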

  13. Points of Convergence in Music Education: The Use of Data Labels as a Strategy for Mixed Methods Integration

    ERIC Educational Resources Information Center

    Fitzpatrick, Kate R.

    2016-01-01

    Although the mixing of quantitative and qualitative data is an essential component of mixed methods research, the process of integrating both types of data in meaningful ways can be challenging. The purpose of this article is to describe the use of data labels in mixed methods research as a technique for the integration of qualitative and…

  14. MilQuant: a free, generic software tool for isobaric tagging-based quantitation.

    PubMed

    Zou, Xiao; Zhao, Minzhi; Shen, Hongyan; Zhao, Xuyang; Tong, Yuanpeng; Wang, Qingsong; Wei, Shicheng; Ji, Jianguo

    2012-09-18

    Isobaric tagging techniques such as iTRAQ and TMT are widely used in quantitative proteomics and especially useful for samples that demand in vitro labeling. Due to diversity in choices of MS acquisition approaches, identification algorithms, and relative abundance deduction strategies, researchers are faced with a plethora of possibilities when it comes to data analysis. However, the lack of a generic, flexible software tool often makes it cumbersome for researchers to perform the analysis entirely as desired. In this paper, we present MilQuant, an mzXML-based isobaric labeling quantitator, a pipeline of freely available programs that supports native acquisition files produced by all mass spectrometer types and collection approaches currently used in isobaric tagging based MS data collection. Moreover, aside from effective normalization and abundance ratio deduction algorithms, MilQuant exports various intermediate results along each step of the pipeline, making it easy for researchers to customize the analysis. The functionality of MilQuant was demonstrated by four distinct datasets from different laboratories. The compatibility and extendibility of MilQuant makes it a generic and flexible tool that can serve as a full solution to data analysis of isobaric tagging-based quantitation. Copyright © 2012 Elsevier B.V. All rights reserved.
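
As background to the ratio-deduction and normalization steps described above, here is a generic sketch of reporter-ion ratio calculation with channel-wise median normalization. This is a common convention in isobaric quantitation generally, not a reproduction of MilQuant's own algorithms:

```python
import numpy as np

def reporter_ratios(intensities, reference_channel=0):
    # Per-peptide reporter-ion ratios against a reference channel,
    # median-normalized per channel so that the bulk of unchanged
    # peptides centers at ratio 1. Generic sketch; MilQuant's actual
    # normalization and deduction algorithms are not reproduced here.
    x = np.asarray(intensities, float)        # rows: peptides, cols: channels
    ratios = x / x[:, [reference_channel]]    # raw ratios vs. reference channel
    ratios /= np.median(ratios, axis=0)       # channel-wise median normalization
    return ratios

# Hypothetical two-channel data with a constant 2:1 loading bias.
r = reporter_ratios([[100, 200], [50, 100], [10, 20]])
```

After normalization the systematic 2:1 bias is removed and all ratios center at 1.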

  15. Evaluation of non-intrusive flow measurement techniques for a re-entry flight experiment

    NASA Technical Reports Server (NTRS)

    Miles, R. B.; Santavicca, D. A.; Zimmermann, M.

    1983-01-01

    This study evaluates various non-intrusive techniques for the measurement of the flow field on the windward side of the Space Shuttle orbiter or a similar reentry vehicle. Included are linear (Rayleigh, Raman, Mie, Laser Doppler Velocimetry, Resonant Doppler Velocimetry) and nonlinear (Coherent Anti-Stokes Raman, Laser-Induced Fluorescence) light scattering, electron-beam fluorescence, thermal emission, and mass spectroscopy. Flow-field properties were taken from a nonequilibrium flow model by Shinn, Moss, and Simmonds at the NASA Langley Research Center. Conclusions are, when possible, based on quantitative scaling of known laboratory results to the conditions projected. Detailed discussion with researchers in the field contributed further to these conclusions and provided valuable insights regarding the experimental feasibility of each of the techniques.

  16. Boiler Tube Corrosion Characterization with a Scanning Thermal Line

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Jacobstein, Ronald; Reilly, Thomas

    2001-01-01

    Wall thinning due to corrosion in utility boiler water wall tubing is a significant operational concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. Unfortunately, ultrasonic inspection is very manpower-intensive and slow. Therefore, thickness measurements are typically taken over a relatively small percentage of the total boiler wall and statistical analysis is used to determine the overall condition of the boiler tubing. Other inspection techniques, such as electromagnetic acoustic transducer (EMAT), have recently been evaluated; however, they provide only a qualitative evaluation, identifying areas or spots where corrosion has significantly reduced the wall thickness. NASA Langley Research Center, in cooperation with ThermTech Services, has developed a thermal NDE technique designed to quantitatively measure the wall thickness and thus determine the amount of material thinning present in steel boiler tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in the inspection speed and accuracy for large structures such as boiler water walls. A theoretical basis for the technique will be presented to establish the quantitative nature of the technique. Further, a dynamic calibration system will be presented for the technique that allows the extraction of thickness information from the temperature data.
Additionally, the results of the application of this technology to actual water wall tubing samples and in-situ inspections will be presented.
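
One simple way to realize the calibration step described above is to interpolate a measured surface-temperature contrast against reference samples of known wall thickness. The calibration pairs below are entirely hypothetical; a real system would derive them from the dynamic calibration procedure the paper presents:

```python
import numpy as np

# Hypothetical calibration: peak surface-temperature contrast (K) measured
# on reference blocks of known wall thickness (mm). Thinner walls trap the
# injected heat and read hotter. Values are illustrative, not measured data.
contrast_K   = np.array([1.0, 1.6, 2.5, 4.0])
thickness_mm = np.array([8.0, 6.0, 4.0, 2.0])

def thickness_from_contrast(c):
    # np.interp requires ascending x; contrast rises as thickness falls,
    # so the arrays above are already ordered correctly.
    return np.interp(c, contrast_K, thickness_mm)

t_mid = thickness_from_contrast(1.3)   # halfway between the first two points
```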

  17. Intracellular subsurface imaging using a hybrid shear-force feedback/scanning quantitative phase microscopy technique

    NASA Astrophysics Data System (ADS)

    Edward, Kert

    Quantitative phase microscopy (QPM) allows for the imaging of translucent or transparent biological specimens without the need for exogenous contrast agents. This technique is usually applied towards the investigation of simple cells such as red blood cells, which are typically enucleated and can be considered homogeneous. However, most biological cells are nucleated and contain other interesting intracellular organelles. It has been established that the physical characteristics of certain subsurface structures, such as the shape and roughness of the nucleus, are well correlated with the onset and progression of pathological conditions such as cancer. Although the acquired quantitative phase information of biological cells contains surface information as well as coupled subsurface information, the latter has been ignored up until now. A novel scanning quantitative phase imaging system unencumbered by 2pi ambiguities is hereby presented. This system is incorporated into a shear-force feedback scheme which allows for simultaneous phase and topography determination. It will be shown how subsequent image processing of these two data sets allows for the extraction of the subsurface component in the phase data and in vivo cell refractometry studies. Both fabricated samples and biological cells ranging from rat fibroblast cells to malaria infected human erythrocytes were investigated as part of this research. The results correlate quite well with those obtained via other microscopy techniques.
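
The refractometry step can be sketched from the standard phase relation phi = (2*pi/lambda)(n - n_medium) * t: once the shear-force topography supplies the cell thickness t, the measured phase yields the mean intracellular refractive index. Parameter values below are illustrative assumptions:

```python
import math

def cell_refractive_index(phase_rad, thickness_um,
                          wavelength_um=0.633, n_medium=1.33):
    # Inverts phi = (2*pi/lambda) * (n - n_medium) * t for n, assuming a
    # uniform cell of known thickness. Wavelength (HeNe) and medium index
    # are hypothetical defaults, not values taken from the dissertation.
    return n_medium + phase_rad * wavelength_um / (2 * math.pi * thickness_um)

# Round trip: the phase produced by n = 1.38 over a 5 um cell recovers 1.38.
phi = 2 * math.pi * (1.38 - 1.33) * 5.0 / 0.633
n_est = cell_refractive_index(phi, 5.0)
```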

  18. Using Log Linear Analysis for Categorical Family Variables.

    ERIC Educational Resources Information Center

    Moen, Phyllis

    The Goodman technique of log linear analysis is ideal for family research, because it is designed for categorical (non-quantitative) variables. Variables are dichotomized (for example, married/divorced, childless/with children) or otherwise categorized (for example, level of permissiveness, life cycle stage). Contingency tables are then…

  19. Development of Naphthalene PLIF for Making Quantitative Measurements of Ablation Products Transport in Supersonic Flows

    NASA Astrophysics Data System (ADS)

    Combs, Christopher; Clemens, Noel

    2014-11-01

    Ablation is a multi-physics process involving heat and mass transfer and codes aiming to predict ablation are in need of experimental data pertaining to the turbulent transport of ablation products for validation. Low-temperature sublimating ablators such as naphthalene can be used to create a limited physics problem and simulate ablation at relatively low temperature conditions. At The University of Texas at Austin, a technique is being developed that uses planar laser-induced fluorescence (PLIF) of naphthalene to visualize the transport of ablation products in a supersonic flow. In the current work, naphthalene PLIF will be used to make quantitative measurements of the concentration of ablation products in a Mach 5 turbulent boundary layer. For this technique to be used for quantitative research in supersonic wind tunnel facilities, the fluorescence properties of naphthalene must first be investigated over a wide range of state conditions and excitation wavelengths. The resulting calibration of naphthalene fluorescence will be applied to the PLIF images of ablation from a boundary layer plug, yielding 2-D fields of naphthalene mole fraction. These images may help provide data necessary to validate computational models of ablative thermal protection systems for reentry vehicles. Work supported by NASA Space Technology Research Fellowship Program under grant NNX11AN55H.
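
Under the idealized assumption that fluorescence signal scales linearly with mole fraction at fixed temperature, pressure, and laser fluence, the calibration step reduces to ratioing against a reference image at a known seeding level. A hedged sketch; the actual calibration spans a wide range of state conditions and excitation wavelengths, as the abstract notes:

```python
def mole_fraction(signal, calib_signal, calib_chi):
    # Convert a PLIF fluorescence signal to naphthalene mole fraction by
    # ratioing against a calibration signal acquired at known mole
    # fraction calib_chi. Assumes linear signal response at fixed
    # temperature, pressure, and fluence -- an idealization of the
    # state-dependent calibration described above.
    return calib_chi * signal / calib_signal

# Hypothetical numbers: half the calibration signal -> half the mole fraction.
chi = mole_fraction(50.0, 100.0, 0.02)
```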

  20. Magnetic resonance spectroscopy.

    PubMed

    Hope, P L; Moorcraft, J

    1991-09-01

    MRS is a noninvasive technique that does not use ionizing radiation and can be used to measure relative metabolite concentrations in human tissues and organs in vivo. Phosphorus MRS can be used to study energy metabolites and intracellular pH. The first neonatal applications were described in 1983 in a study of cerebral metabolism. Since then, the value of cerebral MRS as research tool and an investigative technique has been confirmed, and its prognostic power in asphyxiated infants has been established. Techniques of spatial localization and quantitation have been developed, but studies of other organs and the use of other nuclei remain at a very preliminary stage. Considering the huge potential of MRS and the proliferation of high field magnets primarily designed for imaging, there has been a disappointing lack of progress in the development of clinical and research applications of spectroscopy. The logistic differences of studying sick infants in strong magnetic fields make MRS a time-consuming and labor-intensive investigation, which will inevitably limit its widespread routine use. Research studies are hampered by the diversity of spectroscopic and signal processing techniques, which make comparisons of data from different groups impossible. Some techniques for the assessment of cerebral hemodynamics such as doppler ultrasound and near infrared spectroscopy have the advantage of being available at the cotside, but MRS is unique in providing quantitative information about a wide range of intracellular metabolites. The altricial development of MRS as a clinical investigative tool in neonatology can be ascribed partly to practical difficulties, but these should not detract from the exciting possibilities opened up by a technique that gives a noninvasive insight into intracellular chemistry. 
The metabolic information from MRS is an invaluable addition to the information provided by other techniques and will certainly play an important role in unraveling the sequence of events between an hypoxic-ischemic insult and cell death. A better understanding of these mechanisms is a prerequisite to the development of rational therapeutic maneuvers following asphyxial insults.

  1. Research on the development of green chemistry technology assessment techniques: a material reutilization case.

    PubMed

    Hong, Seokpyo; Ahn, Kilsoo; Kim, Sungjune; Gong, Sungyong

    2015-01-01

    This study presents a methodology that enables a quantitative assessment of green chemistry technologies. The study carries out a quantitative evaluation of a particular case of material reutilization by calculating the level of "greenness" i.e., the level of compliance with the principles of green chemistry that was achieved by implementing a green chemistry technology. The results indicate that the greenness level was enhanced by 42% compared to the pre-improvement level, thus demonstrating the economic feasibility of green chemistry. The assessment technique established in this study will serve as a useful reference for setting the direction of industry-level and government-level technological R&D and for evaluating newly developed technologies, which can greatly contribute toward gaining a competitive advantage in the global market.
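
The reported 42% enhancement is a relative improvement over the pre-improvement greenness level. A trivial sketch of that arithmetic, with hypothetical scores (the paper's scoring scheme is not reproduced):

```python
def greenness_gain(before, after):
    # Relative improvement of an aggregate greenness score, as a percent
    # of the pre-improvement level. The paper reports 42% for its
    # material-reutilization case; the scores here are hypothetical.
    return 100.0 * (after - before) / before

gain = greenness_gain(50.0, 71.0)   # hypothetical before/after scores
```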

  2. Advanced imaging techniques in brain tumors

    PubMed Central

    2009-01-01

    Abstract Perfusion, permeability and magnetic resonance spectroscopy (MRS) are now widely used in the research and clinical settings. In the clinical setting, qualitative, semi-quantitative and quantitative approaches, ranging from review of color-coded maps to region-of-interest analysis and analysis of signal-intensity curves, are being applied in practice. There are several pitfalls with all of these approaches. Some of these shortcomings are reviewed, such as the relatively low sensitivity of metabolite ratios from MRS and the effect of leakage on the appearance of color-coded maps from dynamic susceptibility contrast (DSC) magnetic resonance (MR) perfusion imaging, together with the correction and normalization methods that can be applied. Combining and applying these different imaging techniques in a multi-parametric algorithmic fashion in the clinical setting can be shown to increase diagnostic specificity and confidence. PMID:19965287

  3. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paintings to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of the roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877
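
One of the three measures, the variety of colors, can be approximated by counting distinct quantized RGB colors in an image. The quantization below (8 bins per channel) is a hypothetical choice for illustration, not necessarily the paper's definition:

```python
import numpy as np

def color_variety(rgb, levels=8):
    # Number of distinct colors after quantizing each RGB channel into
    # `levels` bins -- a simple proxy for the "variety of colors" measure.
    # The bin count is an assumption; the paper's definition may differ.
    q = (np.asarray(rgb, dtype=int) * levels // 256).reshape(-1, 3)
    return len({tuple(c) for c in q})

# A tiny hypothetical 2x2 image with two distinct colors (red and blue).
img = [[[255, 0, 0], [0, 0, 255]],
       [[255, 0, 0], [0, 0, 255]]]
variety = color_variety(img)
```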

  4. NDE of ceramics and ceramic composites

    NASA Technical Reports Server (NTRS)

    Vary, Alex; Klima, Stanley J.

    1991-01-01

    Although nondestructive evaluation (NDE) techniques for ceramics are fairly well developed, they are difficult to apply in many cases for high probability detection of the minute flaws that can cause failure in monolithic ceramics. Conventional NDE techniques are available for monolithic and fiber reinforced ceramic matrix composites, but the more exact quantitative techniques that are needed are still being investigated and developed. Needs range from flaw detection below the 100 micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in ceramic composites. NDE techniques that will ultimately be applicable to production and quality control of ceramic structures are still emerging from the lab. Needs are different depending on the processing stage, fabrication method, and nature of the finished product. NDE techniques are being developed in concert with materials processing research where they can provide feedback information to processing development and quality improvement. NDE techniques also serve as research tools for materials characterization and for understanding failure processes, e.g., during thermomechanical testing.

  5. Domain Definition and Search Techniques in Meta-Analyses of L2 Research (or Why 18 Meta-Analyses of Feedback Have Different Results)

    ERIC Educational Resources Information Center

    Plonsky, Luke; Brown, Dan

    2015-01-01

    Applied linguists have turned increasingly in recent years to meta-analysis as the preferred means of synthesizing quantitative research. The first step in the meta-analytic process involves defining a domain of interest. Despite its apparent simplicity, this step involves a great deal of subjectivity on the part of the meta-analyst. This article…

  6. The Intercultural Component in Textbooks for Teaching a Service Technical Writing Course

    ERIC Educational Resources Information Center

    Matveeva, Natalia

    2007-01-01

    This research article investigates new developments in the representation of the intercultural component in textbooks for a service technical writing course. Through textual analysis, using quantitative and qualitative techniques, I report discourse analysis of 15 technical writing textbooks published during 1993-2006. The theoretical and…

  7. Community College Students' Perceptions of Effective Communication in Online Learning

    ERIC Educational Resources Information Center

    Parker, Donna Alice Hill

    2012-01-01

    This quantitative research project analyzed the application of instructional communication tools and techniques used by community college students to determine how they perceive communication in their online classes. Online students from a community college participated in this study by completing an electronic survey. Data analysis revealed that…

  8. Genetics and child psychiatry: I Advances in quantitative and molecular genetics.

    PubMed

    Rutter, M; Silberg, J; O'Connor, T; Simonoff, E

    1999-01-01

    Advances in quantitative psychiatric genetics as a whole are reviewed with respect to conceptual and methodological issues in relation to statistical model fitting, new genetic designs, twin and adoptee studies, definition of the phenotype, pervasiveness of genetic influences, pervasiveness of environmental influences, shared and nonshared environmental effects, and nature-nurture interplay. Advances in molecular genetics are discussed in relation to the shifts in research strategies to investigate multifactorial disorders (affected relative linkage designs, association strategies, and quantitative trait loci studies); new techniques and identified genetic mechanisms (expansion of trinucleotide repeats, genomic imprinting, mitochondrial DNA, fluorescent in-situ hybridisation, behavioural phenotypes, and animal models); and the successful localisation of genes.
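
The twin-study model fitting reviewed above is often introduced via Falconer's approximation, which decomposes trait variance from monozygotic and dizygotic twin correlations. A textbook sketch for orientation, not the review's own method (modern work fits full ACE models by maximum likelihood):

```python
def falconer_estimates(r_mz, r_dz):
    # Classical Falconer decomposition from twin correlations:
    #   heritability          h2 = 2 * (rMZ - rDZ)
    #   shared environment    c2 = 2 * rDZ - rMZ
    #   nonshared environment e2 = 1 - rMZ
    # A simple approximation to the statistical model fitting discussed
    # in the review; the correlations below are hypothetical.
    h2 = 2 * (r_mz - r_dz)
    c2 = 2 * r_dz - r_mz
    e2 = 1 - r_mz
    return h2, c2, e2

h2, c2, e2 = falconer_estimates(0.8, 0.5)   # hypothetical twin correlations
```

The three components sum to 1 by construction.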

  9. Transferable Calibration Standard Developed for Quantitative Raman Scattering Diagnostics in High-Pressure Flames

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Kojima, Jun

    2005-01-01

    Researchers from NASA Glenn Research Center's Combustion Branch and the Ohio Aerospace Institute (OAI) have developed a transferable calibration standard for an optical technique called spontaneous Raman scattering (SRS) in high-pressure flames. SRS is perhaps the only technique that provides spatially and temporally resolved, simultaneous multiscalar measurements in turbulent flames. Such measurements are critical for the validation of numerical models of combustion. This study has been a combined experimental and theoretical effort to develop a spectral calibration database for multiscalar diagnostics using SRS in high-pressure flames. However, in the past such measurements have used a one-of-a-kind experimental setup and a setup-dependent calibration procedure to empirically account for spectral interferences, or crosstalk, among the major species of interest. Such calibration procedures, being non-transferable, are prohibitively expensive to duplicate. A goal of this effort is to provide an SRS calibration database using transferable standards that can be implemented widely by other researchers for both atmospheric-pressure and high-pressure (less than 30 atm) SRS studies. A secondary goal of this effort is to provide quantitative multiscalar diagnostics in high pressure environments to validate computational combustion codes.

  10. Summary of Work for Joint Research Interchanges with DARWIN Integrated Product Team 1998

    NASA Technical Reports Server (NTRS)

    Hesselink, Lambertus

    1999-01-01

    The intent of Stanford University's SciVis group is to develop technologies that enabled comparative analysis and visualization techniques for simulated and experimental flow fields. These techniques would then be made available under the Joint Research Interchange for potential injection into the DARWIN Workspace Environment (DWE). In the past, we have focused on techniques that exploited feature based comparisons such as shock and vortex extractions. Our current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison. This is often a problem with many data comparison techniques. In addition, since only topology based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will briefly (1) describe current technologies in the area of comparison techniques, (2) describe the theory of our new method and finally (3) summarize a few of the results.
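
A first step toward any topology-based comparison is locating candidate critical points (zeros) of the sampled vector field. The cell-straddling test below is a coarse illustration of that idea, not the SciVis group's actual algorithm:

```python
import numpy as np

def critical_cells(u, v):
    # Grid cells whose four corner values change sign in BOTH components,
    # i.e. cells that may contain a critical point (zero) of the sampled
    # 2D vector field. A coarse first pass at topology extraction; a real
    # method would classify each zero (source, sink, saddle, center) from
    # the local Jacobian.
    def straddles(a):
        corners = np.stack([a[:-1, :-1], a[:-1, 1:], a[1:, :-1], a[1:, 1:]])
        return (corners.min(axis=0) <= 0) & (corners.max(axis=0) >= 0)
    return np.argwhere(straddles(u) & straddles(v))

# Linear field (u, v) = (x, y) sampled off-node: its one zero at the
# origin falls inside exactly one grid cell.
y, x = np.mgrid[0:5, 0:5] - 1.5
cells = critical_cells(x, y)
```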

  12. ULTRASONIC STUDIES OF THE FUNDAMENTAL MECHANISMS OF RECRYSTALLIZATION AND SINTERING OF METALS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TURNER, JOSEPH A.

    2005-11-30

    The purpose of this project was to develop a fundamental understanding of the interaction of an ultrasonic wave with complex media, with specific emphases on recrystallization and sintering of metals. A combined analytical, numerical, and experimental research program was implemented. Theoretical models of elastic wave propagation through these complex materials were developed using stochastic wave field techniques. The numerical simulations focused on finite element wave propagation solutions through complex media. The experimental efforts were focused on corroboration of the models developed and on the development of new experimental techniques. The analytical and numerical research allows the experimental results to be interpreted quantitatively.

  13. [Research progress in neuropsychopharmacology updated for the post-genomic era].

    PubMed

    Nakanishi, Toru

    2009-11-01

    Neuropsychopharmacological research in the post-genomic era has been developing rapidly through the use of novel techniques including DNA chips. We have applied these techniques to investigate the anti-tumor effect of NSAIDs, isolate novel genes specifically expressed in rheumatoid arthritis, and analyze gene expression profiles in mesenchymal stem cells. Recently, we have developed a novel system of quantitative PCR for detection of BDNF mRNA isoforms. By using this system, we identified the exon-specific mode of expression in acute and chronic pain. In addition, we have generated gene expression profiles of knockout mice lacking the beta2 subunit of acetylcholine receptors.
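
Relative quantification in qPCR systems like the one described is commonly reported via the Livak 2^-ddCt method, which compares target and reference Ct values between conditions. A generic sketch with hypothetical Ct values, not the authors' pipeline:

```python
def fold_change(ct_target_treated, ct_ref_treated,
                ct_target_control, ct_ref_control):
    # Livak 2^-ddCt relative quantification: normalize the target Ct to a
    # reference gene within each condition, then compare conditions.
    # Standard background for qPCR, not the authors' exact analysis.
    ddct = ((ct_target_treated - ct_ref_treated)
            - (ct_target_control - ct_ref_control))
    return 2.0 ** (-ddct)

# Hypothetical Ct values: the target amplifies 2 cycles earlier relative
# to the reference gene after treatment, i.e. a 4-fold increase.
fc = fold_change(24.0, 20.0, 26.0, 20.0)
```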

  14. Standardizing Quality Assessment of Fused Remotely Sensed Images

    NASA Astrophysics Data System (ADS)

    Pohl, C.; Moellmann, J.; Fries, K.

    2017-09-01

    The multitude of available operational remote sensing satellites led to the development of many image fusion techniques to provide high spatial, spectral and temporal resolution images. The comparison of different techniques is necessary to obtain an optimized image for the different applications of remote sensing. There are two approaches in assessing image quality: 1. qualitatively, by visual interpretation, and 2. quantitatively, using image quality indices. However, an objective comparison is difficult because a visual assessment is always subjective and a quantitative assessment depends on the chosen criteria and indices, so the result varies. Therefore it is necessary to standardize both processes (qualitative and quantitative assessment) in order to allow an objective image fusion quality evaluation. Various studies have been conducted at the University of Osnabrueck (UOS) to establish a standardized process to objectively compare fused image quality. First, established image fusion quality assessment protocols, i.e. Quality with No Reference (QNR) and Khan's protocol, were compared across various fusion experiments. Second, the process of visual quality assessment was structured and standardized with the aim of providing an evaluation protocol. This manuscript reports on the results of the comparison and provides recommendations for future research.
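
Among the quantitative indices such a protocol might standardize is ERGAS (relative dimensionless global error), a widely used fusion-quality measure. A sketch under the assumption of co-registered arrays with bands along the first axis; whether the protocols above use ERGAS specifically is not stated:

```python
import numpy as np

def ergas(fused, reference, ratio):
    # ERGAS = 100 * (h/l) * sqrt(mean_k((RMSE_k / mu_k)^2)), where h/l is
    # the spatial resolution ratio (e.g. 0.25 for 4:1 pansharpening),
    # RMSE_k is the per-band root-mean-square error against the reference,
    # and mu_k the reference band mean. Lower is better; 0 means a
    # perfect match.
    f = np.asarray(fused, float)
    r = np.asarray(reference, float)
    rmse = np.sqrt(((f - r) ** 2).mean(axis=(1, 2)))   # one RMSE per band
    mu = r.mean(axis=(1, 2))
    return 100.0 * ratio * float(np.sqrt(((rmse / mu) ** 2).mean()))

# Sanity check on hypothetical data: a fusion identical to the reference
# scores exactly 0.
ref = np.arange(48, dtype=float).reshape(3, 4, 4) + 1.0
score = ergas(ref, ref, 0.25)
```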

  15. Shuttle Tethered Aerothermodynamics Research Facility (STARFAC) Instrumentation Requirements

    NASA Technical Reports Server (NTRS)

    Wood, George M.; Siemers, Paul M.; Carlomagno, Giovanni M.; Hoffman, John

    1986-01-01

    The instrumentation requirements for the Shuttle Tethered Aerothermodynamic Research Facility (STARFAC) are presented. The typical physical properties of the terrestrial atmosphere are given, along with representative daytime atmospheric ion concentrations and a comparison of equilibrium and nonequilibrium gas properties at a point away from a wall. STARFAC science and engineering measurements are given, as is the TSS free-stream gas analysis. Potential nonintrusive measurement techniques for hypersonic boundary layer research are outlined, along with quantitative physical measurement methods for aerothermodynamic studies.

  16. Multi-modality imaging of tumor phenotype and response to therapy

    NASA Astrophysics Data System (ADS)

    Nyflot, Matthew J.

    2011-12-01

    Imaging and radiation oncology have historically been closely linked. However, the vast majority of techniques used in the clinic involve anatomical imaging. Biological imaging offers the potential for innovation in the areas of cancer diagnosis and staging, radiotherapy target definition, and treatment response assessment. Some relevant imaging techniques are FDG PET (for imaging cellular metabolism), FLT PET (proliferation), CuATSM PET (hypoxia), and contrast-enhanced CT (vasculature and perfusion). Here, a technique for quantitative spatial correlation of tumor phenotype is presented for FDG PET, FLT PET, and CuATSM PET images. Additionally, multimodality imaging of treatment response with FLT PET, CuATSM, and dynamic contrast-enhanced CT is presented, in a trial of patients receiving an antiangiogenic agent (Avastin) combined with cisplatin and radiotherapy. Results are also presented for translational applications in animal models, including quantitative assessment of proliferative response to cetuximab with FLT PET and quantification of vascular volume with a blood-pool contrast agent (Fenestra). These techniques have clear applications to radiobiological research and optimized treatment strategies, and may eventually be used for personalized therapy for patients.

  17. Why interdisciplinary research enriches the study of crime. Comment on "Statistical physics of crime: A review" by M.R. D'Orsogna and M. Perc

    NASA Astrophysics Data System (ADS)

    Donnay, Karsten

    2015-03-01

    The past several years have seen a rapidly growing interest in the use of advanced quantitative methodologies and formalisms adapted from the natural sciences to study a broad range of social phenomena. The research field of computational social science [1,2], for example, uses digital artifacts of human online activity to cast a new light on social dynamics. Similarly, the studies reviewed by D'Orsogna and Perc showcase a diverse set of advanced quantitative techniques to study the dynamics of crime. Methods used range from partial differential equations and self-exciting point processes to agent-based models, evolutionary game theory and network science [3].
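
As one concrete example of the formalisms mentioned, crime dynamics are often modeled with self-exciting (Hawkes) point processes, in which each event temporarily raises the rate of further events (near-repeat victimization). A minimal simulation via Ogata's thinning algorithm; the parameter values are arbitrary:

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=1):
    """Simulate event times of a Hawkes process with conditional intensity
    lambda(t) = mu + alpha * sum_i exp(-beta * (t - t_i)) by Ogata thinning.
    The intensity decays between events, so its current value is a valid
    upper bound until the next accepted event."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while True:
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)  # candidate inter-event time
        if t >= horizon:
            return events
        lam_t = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() * lam_bar <= lam_t:  # accept with prob lam_t/lam_bar
            events.append(t)
```

With alpha/beta < 1 the process is stable; fitting mu, alpha, beta to recorded incident times is the inference problem the reviewed studies address.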

  18. Four dimensional observations of clouds from geosynchronous orbit using stereo display and measurement techniques on an interactive information processing system

    NASA Technical Reports Server (NTRS)

    Hasler, A. F.; Desjardins, M.; Shenk, W. E.

    1979-01-01

    Simultaneous Geosynchronous Operational Environmental Satellite (GOES) 1 km resolution visible image pairs can provide quantitative three dimensional measurements of clouds. These data have great potential for severe storms research and as a basic parameter measurement source for other areas of meteorology (e.g. climate). These stereo cloud height measurements are not subject to the errors and ambiguities caused by unknown cloud emissivity and temperature profiles that are associated with infrared techniques. This effort describes the display and measurement of stereo data using digital processing techniques.
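
The height retrieval rests on parallax geometry. A deliberately simplified flat-earth sketch (illustrative only, not the actual GOES processing chain): each satellite displaces the apparent cloud position on the ground by h·tan(zenith angle), and with displacements in opposite directions along the scan line the measured parallax is their sum:

```python
import math

def cloud_top_height(parallax_km, zenith1_deg, zenith2_deg):
    """Cloud-top height from the measured stereo parallax of a cloud feature,
    assuming a flat earth and oppositely directed apparent displacements
    (hypothetical simplified geometry; the zenith angles are each satellite's
    local viewing angle at the cloud location)."""
    return parallax_km / (math.tan(math.radians(zenith1_deg)) +
                          math.tan(math.radians(zenith2_deg)))
```

The real reduction must account for earth curvature, satellite ephemerides and image navigation, but the inverse relation between parallax and height is the same.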

  19. Fuel film thickness measurements using refractive index matching in a stratified-charge SI engine operated on E30 and alkylate fuels

    NASA Astrophysics Data System (ADS)

    Ding, Carl-Philipp; Sjöberg, Magnus; Vuilleumier, David; Reuss, David L.; He, Xu; Böhm, Benjamin

    2018-03-01

    This study shows fuel film measurements in a spark-ignited direct injection engine using refractive index matching (RIM). The RIM technique is applied to measure the fuel impingement of a high research octane number gasoline fuel with 30 vol% ethanol content at two intake pressures and coolant temperatures. Measurements are conducted for an alkylate fuel at one operating case, as well. It is shown that the fuel volume on the piston surface increases for lower intake pressure and lower coolant temperature and that the alkylate fuel shows very little spray impingement. The fuel films can be linked to increased soot emissions. A detailed description of the calibration technique is provided and measurement uncertainties are discussed. The dependency of the RIM signal on refractive index changes is measured. The RIM technique provides quantitative film thickness measurements up to 0.9 µm in this engine. For thicker films, semi-quantitative results of film thickness can be utilized to study the distribution of impinged fuel.
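
The calibration described maps the RIM signal to a film thickness that is quantitative only up to 0.9 µm, with thicker films reported semi-quantitatively. A hypothetical linear calibration with that cutoff; the slope value below is invented for illustration and would come from the paper's calibration procedure:

```python
def rim_film_thickness(signal, slope_um=1.8, quant_limit_um=0.9):
    """Convert a normalized RIM signal to a film thickness via a linear
    calibration (hypothetical slope, in micrometers per signal unit).
    Returns (thickness_um, is_quantitative); values beyond the limit are
    flagged as semi-quantitative, mirroring the 0.9 um bound."""
    thickness = slope_um * signal
    return thickness, thickness <= quant_limit_um
```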

  20. Utility and translatability of mathematical modeling, cell culture and small and large animal models in magnetic nanoparticle hyperthermia cancer treatment research

    NASA Astrophysics Data System (ADS)

    Hoopes, P. J.; Petryk, Alicia A.; Misra, Adwiteeya; Kastner, Elliot J.; Pearce, John A.; Ryan, Thomas P.

    2015-03-01

    For more than 50 years, hyperthermia-based cancer researchers have utilized mathematical models, cell culture studies and animal models to better understand, develop and validate potential new treatments. It has been, and remains, unclear how and to what degree these research techniques depend on, complement and, ultimately, translate accurately to a successful clinical treatment. In the past, when mathematical models have not proven accurate in a clinical treatment situation, the initiating quantitative scientists (engineers, mathematicians and physicists) have tended to believe the biomedical parameters provided to them were inaccurately determined or reported. In a similar manner, experienced biomedical scientists often tend to question the value of mathematical models and cell culture results since those data typically lack the level of biologic and medical variability and complexity that are essential to accurately study and predict complex diseases and subsequent treatments. Such quantitative and biomedical interdependence, variability, diversity and promise have never been greater than they are within magnetic nanoparticle hyperthermia cancer treatment. The use of hyperthermia to treat cancer is well studied and has utilized numerous delivery techniques, including microwaves, radio frequency, focused ultrasound, induction heating, infrared radiation, warmed perfusion liquids (combined with chemotherapy), and, recently, metallic nanoparticles (NP) activated by near infrared radiation (NIR) and alternating magnetic field (AMF) based platforms. The goal of this paper is to use proven concepts and current research to address the potential pathobiology, modeling and quantification of the effects of treatment as pertaining to the similarities and differences in energy delivered by known external delivery techniques and iron oxide nanoparticles.

  1. Qualitative research and the profound grasp of the obvious.

    PubMed Central

    Hurley, R E

    1999-01-01

    OBJECTIVE: To discuss the value of promoting coexistent and complementary relationships between qualitative and quantitative research methods as illustrated by presentations made by four respected health services researchers who described their experiences in multi-method projects. DATA SOURCES: Presentations and publications related to the four research projects, which described key substantive and methodological areas that had been addressed with qualitative techniques. PRINCIPAL FINDINGS: Sponsor interest in timely, insightful, and reality-anchored evidence has provided a strong base of support for the incorporation of qualitative methods into major contemporary policy research studies. In addition, many issues may be suitable for study only with qualitative methods because of their complexity, their emergent nature, or because of the need to revisit and reexamine previously untested assumptions. CONCLUSION: Experiences from the four projects, as well as from other recent health services studies with major qualitative components, support the assertion that the interests of sponsors in the policy realm and pressure from them suppress some of the traditional tensions and antagonisms between qualitative and quantitative methods. PMID:10591276

  2. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  3. An Automated Energy Detection Algorithm Based on Morphological Filter Processing with a Modified Watershed Transform

    DTIC Science & Technology

    2018-01-01

    ARL-TR-8270 ● JAN 2018 ● US Army Research Laboratory ● An Automated Energy Detection Algorithm Based on Morphological Filter... ...collected data. These statistical techniques are under the area of descriptive statistics, which is a methodology to condense the data in quantitative ...

  4. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  5. The application analysis of the multi-angle polarization technique for ocean color remote sensing

    NASA Astrophysics Data System (ADS)

    Zhang, Yongchao; Zhu, Jun; Yin, Huan; Zhang, Keli

    2017-02-01

    The multi-angle polarization technique, which uses the intensity of polarized radiation as the observed quantity, is a new remote sensing means for earth observation. This method provides not only multi-angle light intensity data but also multi-angle information on polarized radiation, so it may solve problems that cannot be addressed with traditional remote sensing methods. The multi-angle polarization technique has accordingly become one of the hot topics in international quantitative remote sensing research. In this paper, we first introduce the principles of the multi-angle polarization technique; we then summarize and analyse the state of basic research and engineering applications in 1) the removal of sun glitter based on polarization, 2) ocean color remote sensing based on polarization, 3) oil spill detection using the polarization technique, and 4) ocean aerosol monitoring based on polarization. Finally, based on the previous work, we briefly present the problems and prospects of the multi-angle polarization technique in China's ocean color remote sensing.
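
The polarized-radiation observables referred to above are conventionally expressed through the Stokes vector (I, Q, U, V), from which the degree of polarization follows directly:

```python
import math

def degree_of_polarization(i, q, u, v=0.0):
    """Total degree of polarization from the Stokes parameters (I, Q, U, V)."""
    return math.sqrt(q * q + u * u + v * v) / i

def degree_of_linear_polarization(i, q, u):
    """Linear polarization only, the quantity most multi-angle polarimeters
    report for ocean-color work (circular polarization V is usually tiny)."""
    return math.sqrt(q * q + u * u) / i
```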

  6. Nonlinear ultrasonics for material state awareness

    NASA Astrophysics Data System (ADS)

    Jacobs, L. J.

    2014-02-01

    Predictive health monitoring of structural components will require the development of advanced sensing techniques capable of providing quantitative information on the damage state of structural materials. By focusing on nonlinear acoustic techniques, it is possible to measure absolute, strength based material parameters that can then be coupled with uncertainty models to enable accurate and quantitative life prediction. Starting at the material level, this review will present current research that involves a combination of sensing techniques and physics-based models to characterize damage in metallic materials. In metals, these nonlinear ultrasonic measurements can sense material state, before the formation of micro- and macro-cracks. Typically, cracks of a measurable size appear quite late in a component's total life, while the material's integrity in terms of toughness and strength gradually decreases due to the microplasticity (dislocations) and associated change in the material's microstructure. This review focuses on second harmonic generation techniques. Since these nonlinear acoustic techniques are acoustic wave based, component interrogation can be performed with bulk, surface and guided waves using the same underlying material physics; these nonlinear ultrasonic techniques provide results which are independent of the wave type used. Recent physics-based models consider the evolution of damage due to dislocations, slip bands, interstitials, and precipitates in the lattice structure, which can lead to localized damage.
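
In second harmonic generation measurements, the damage-sensitive observable is the acoustic nonlinearity parameter. In relative form it reduces to a ratio of measured amplitudes, with the constants for propagation distance and wavenumber absorbed; a minimal sketch:

```python
def relative_nonlinearity(a1, a2):
    """Relative acoustic nonlinearity parameter beta' = A2 / A1**2, where A1
    and A2 are the measured fundamental and second-harmonic amplitudes at a
    fixed propagation distance and frequency."""
    return a2 / (a1 * a1)

def percent_beta_change(a1_0, a2_0, a1_d, a2_d):
    """Percent change in beta' between a pristine (0) and a damaged (d) state;
    growth over the baseline indicates accumulating microplasticity."""
    b0 = relative_nonlinearity(a1_0, a2_0)
    bd = relative_nonlinearity(a1_d, a2_d)
    return 100.0 * (bd - b0) / b0
```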

  7. Attachment Style, Social Support, and Coping as Psychosocial Correlates of Happiness in Persons with Spinal Cord Injuries

    ERIC Educational Resources Information Center

    Wilson, Lisa; Catalano, Denise; Sung, Connie; Phillips, Brian; Chou, Chih-Chin; Chan, Jacob Yui Chung; Chan, Fong

    2013-01-01

    Objective: To examine the roles of attachment, social support, and coping as psychosocial correlates in predicting happiness in people with spinal cord injuries. Design: Quantitative descriptive research design using multiple regression and correlation techniques. Participants: 274 individuals with spinal cord injuries. Outcome Measures: Happiness…

  8. Evaluation Techniques for the Sandy Point Discovery Center, Great Bay National Estuarine Research Reserve.

    ERIC Educational Resources Information Center

    Heffernan, Bernadette M.

    1998-01-01

    Describes work done to provide staff of the Sandy Point Discovery Center with methods for evaluating exhibits and interpretive programming. Quantitative and qualitative evaluation measures were designed to assess the program's objective of estuary education. Pretest-posttest questionnaires and interviews are used to measure subjects' knowledge and…

  9. Relationships between World Health Organization "International Classification of Functioning, Disability and Health" Constructs and Participation in Adults with Severe Mental Illness

    ERIC Educational Resources Information Center

    Sánchez, Jennifer; Rosenthal, David A.; Chan, Fong; Brooks, Jessica; Bezyak, Jill L.

    2016-01-01

    Purpose: To examine the World Health Organization "International Classification of Functioning, Disability and Health" (ICF) constructs as correlates of community participation of people with severe mental illnesses (SMI). Methods: Quantitative descriptive research design using multiple regression and correlational techniques was used to…

  10. 76 FR 60497 - Agency Forms Undergoing Paperwork Reduction Act Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-29

    ... child maltreatment (CM) using a quantitative, preference-based approach. The U.S. Department of Health... research on the consequences of CM in adults, few studies have utilized standard HRQoL techniques and none... preferences over a series of comparisons that will be shown to survey respondents. The online survey will be...

  11. Dynamical Analyses for Developmental Science: A Primer for Intrigued Scientists

    ERIC Educational Resources Information Center

    DiDonato, M. D.; England, D.; Martin, C. L.; Amazeen, P. G.

    2013-01-01

    Dynamical systems theory is becoming more popular in social and developmental science. However, unfamiliarity with dynamical analysis techniques remains an obstacle for developmentalists who would like to quantitatively apply dynamics in their own research. The goal of this article is to address this issue by clearly and simply presenting several…

  12. Thinking Statistically in Writing: Journals and Discussion Boards in an Introductory Statistics Course

    ERIC Educational Resources Information Center

    Theoret, Julie M.; Luna, Andrea

    2009-01-01

    This action research combined qualitative and quantitative techniques to investigate two different types of writing assignments in an introductory undergraduate statistics course. The assignments were written in response to the same set of prompts but in two different ways: homework journal assignments or initial posts to a computer discussion…

  13. Application of Critical Classroom Discourse Analysis (CCDA) in Analyzing Classroom Interaction

    ERIC Educational Resources Information Center

    Sadeghi, Sima; Ketabi, Saeed; Tavakoli, Mansoor; Sadeghi, Moslem

    2012-01-01

    As an area of classroom research, Interaction Analysis developed from the need and desire to investigate the process of classroom teaching and learning in terms of action-reaction between individuals and their socio-cultural context (Biddle, 1967). However, sole reliance on quantitative techniques could be problematic, since they conceal more than…

  14. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures

    NASA Astrophysics Data System (ADS)

    Boes, Kelsey S.; Roberts, Michael S.; Vinueza, Nelson R.

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.
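
The linearity (R²) and accuracy (percent error) figures quoted come from an ordinary least-squares calibration of instrument response against spiked concentration, then reading unknowns back off the line. A generic sketch with made-up calibration points:

```python
def fit_calibration(conc, response):
    """Ordinary least-squares line response = slope*conc + intercept;
    returns (slope, intercept, r_squared)."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(response) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, response))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(conc, response))
    ss_tot = sum((y - my) ** 2 for y in response)
    return slope, intercept, 1.0 - ss_res / ss_tot

def back_calculate(resp, slope, intercept):
    """Concentration read back from the calibration line; comparing it to the
    nominal spiked level gives the percent error reported as accuracy."""
    return (resp - intercept) / slope
```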

  16. Rapid Quadrupole-Time-of-Flight Mass Spectrometry Method Quantifies Oxygen-Rich Lignin Compound in Complex Mixtures.

    PubMed

    Boes, Kelsey S; Roberts, Michael S; Vinueza, Nelson R

    2018-03-01

    Complex mixture analysis is a costly and time-consuming task facing researchers with foci as varied as food science and fuel analysis. When faced with the task of quantifying oxygen-rich bio-oil molecules in a complex diesel mixture, we asked whether complex mixtures could be qualitatively and quantitatively analyzed on a single mass spectrometer with mid-range resolving power without the use of lengthy separations. To answer this question, we developed and evaluated a quantitation method that eliminated chromatography steps and expanded the use of quadrupole-time-of-flight mass spectrometry from primarily qualitative to quantitative as well. To account for mixture complexity, the method employed an ionization dopant, targeted tandem mass spectrometry, and an internal standard. This combination of three techniques achieved reliable quantitation of oxygen-rich eugenol in diesel from 300 to 2500 ng/mL with sufficient linearity (R2 = 0.97 ± 0.01) and excellent accuracy (percent error = 0% ± 5). To understand the limitations of the method, it was compared to quantitation attained on a triple quadrupole mass spectrometer, the gold standard for quantitation. The triple quadrupole quantified eugenol from 50 to 2500 ng/mL with stronger linearity (R2 = 0.996 ± 0.003) than the quadrupole-time-of-flight and comparable accuracy (percent error = 4% ± 5). This demonstrates that a quadrupole-time-of-flight can be used for not only qualitative analysis but also targeted quantitation of oxygen-rich lignin molecules in complex mixtures without extensive sample preparation. The rapid and cost-effective method presented here offers new possibilities for bio-oil research, including: (1) allowing for bio-oil studies that demand repetitive analysis as process parameters are changed and (2) making this research accessible to more laboratories.

  17. Civil infrastructure monitoring for IVHS using optical fiber sensors

    NASA Astrophysics Data System (ADS)

    de Vries, Marten J.; Arya, Vivek; Grinder, C. R.; Murphy, Kent A.; Claus, Richard O.

    1995-01-01

    Early deployment of Intelligent Vehicle Highway Systems would necessitate the internal instrumentation of infrastructure for emergency preparedness. Existing quantitative analysis and visual analysis techniques are time consuming, cost prohibitive, and are often unreliable. Fiber optic sensors are rapidly replacing conventional instrumentation because of their small size, light weight, immunity to electromagnetic interference, and extremely high information carrying capability. In this paper research on novel optical fiber sensing techniques for health monitoring of civil infrastructure such as highways and bridges is reported. Design, fabrication, and implementation of fiber optic sensor configurations used for measurements of strain are discussed. Results from field tests conducted to demonstrate the effectiveness of fiber sensors at determining quantitative strain vector components near crack locations in bridges are presented. Emerging applications of fiber sensors for vehicle flow, vehicle speed, and weigh-in-motion measurements are also discussed.

  18. [The relevance of qualitative techniques in biomedical research].

    PubMed

    de Camargo, Kenneth Rochel

    2008-01-01

    On observing how qualitative and quantitative studies are reported in the biomedical literature, it becomes evident that, besides the virtual absence of the former, the two are presented in different ways. Authors of qualitative studies almost invariably seem to need to explain why they chose a qualitative approach, whereas this does not occur in quantitative studies. This paper takes Ludwik Fleck's comparative epistemology as a means of exploring those differences empirically, illustrating, on the basis of two studies dealing with different aspects of biomedical practices, how qualitative methods can elucidate a variety of questions in this field. The paper concludes by presenting some structural characteristics of the biomedical field which, on the one hand, could not be explored properly without qualitative methods and, on the other, help to explain the little value given to qualitative techniques in this area.

  19. In vivo studies of brain development by magnetic resonance techniques.

    PubMed

    Inder, T E; Huppi, P S

    2000-01-01

    Understanding of the morphological development of the human brain has largely come from neuropathological studies obtained postmortem. Magnetic resonance (MR) techniques have recently allowed detailed structural, metabolic, and functional information on the human brain to be obtained in vivo. These techniques have been utilized in studies from premature infants to adults and have provided invaluable data on the sequence of normal human brain development. This article focuses on conventional structural MR imaging, quantitative morphometric MR, diffusion-weighted MR, and MR spectroscopy. In order to understand the potential applications and limitations of these methods, the relevant physical and biological principles of each technique are first reviewed, followed by a review of the understanding of the sequence of normal brain development obtained with them. MRDD Research Reviews 6:59-67, 2000. Copyright 2000 Wiley-Liss, Inc.

  20. Liquid chromatography-mass spectrometry-based quantitative proteomics.

    PubMed

    Linscheid, Michael W; Ahrends, Robert; Pieper, Stefan; Kühn, Andreas

    2009-01-01

    During the last decades, molecular sciences revolutionized biomedical research and gave rise to the biotechnology industry. During the next decades, the application of the quantitative sciences--informatics, physics, chemistry, and engineering--to biomedical research brings about the next revolution that will improve human healthcare and certainly create new technologies, since there is no doubt that small changes can have great effects. It is not a question of "yes" or "no," but of "how much," to make best use of the medical options we will have. In this context, the development of accurate analytical methods must be considered a cornerstone, since the understanding of biological processes will be impossible without information about the minute changes induced in cells by interactions of cell constituents with all sorts of endogenous and exogenous influences and disturbances. The first quantitative techniques, which were developed, allowed monitoring relative changes only, but they clearly showed the significance of the information obtained. The recent advent of techniques claiming to quantify proteins and peptides not only relative to each other, but also in an absolute fashion, promised another quantum leap, since knowing the absolute amount will allow comparing even unrelated species and the definition of parameters will permit to model biological systems much more accurate than before. To bring these promises to life, several approaches are under development at this point in time and this review is focused on those developments.
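
The absolute-quantification approaches alluded to (e.g. spike-in strategies with stable-isotope-labeled peptide standards) reduce, at their core, to a ratio against a known spiked amount. A simplified sketch:

```python
def absolute_amount_fmol(area_light, area_heavy, spiked_fmol):
    """Absolute quantification with a stable-isotope-labeled internal standard:
    the native ('light') peptide amount follows from its chromatographic
    peak-area ratio to the co-eluting 'heavy' standard of known spiked amount,
    assuming the pair ionizes with equal efficiency."""
    return area_light / area_heavy * spiked_fmol
```

It is this step from relative ratios to absolute amounts that lets unrelated samples and species be compared on a common scale.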

  1. Identification and Quantitation of Flavanols and Proanthocyanidins in Foods: How Good are the Data?

    PubMed Central

    Kelm, Mark A.; Hammerstone, John F.; Schmitz, Harold H.

    2005-01-01

    Evidence suggesting that dietary polyphenols, flavanols, and proanthocyanidins in particular offer significant cardiovascular health benefits is rapidly increasing. Accordingly, reliable and accurate methods are needed to provide qualitative and quantitative food composition data necessary for high quality epidemiological and clinical research. Measurements for flavonoids and proanthocyanidins have employed a range of analytical techniques, with various colorimetric assays still being popular for estimating total polyphenolic content in foods and other biological samples despite advances made with more sophisticated analyses. More crudely, estimations of polyphenol content as well as antioxidant activity are also reported with values relating to radical scavenging activity. High-performance liquid chromatography (HPLC) is the method of choice for quantitative analysis of individual polyphenols such as flavanols and proanthocyanidins. Qualitative information regarding proanthocyanidin structure has been determined by chemical methods such as thiolysis and by HPLC-mass spectrometry (MS) techniques at present. The lack of appropriate standards is the single most important factor that limits the aforementioned analyses. However, with ever expanding research in the arena of flavanols, proanthocyanidins, and health and the importance of their future inclusion in food composition databases, the need for standards becomes more critical. At present, sufficiently well-characterized standard material is available for selective flavanols and proanthocyanidins, and construction of at least a limited food composition database is feasible. PMID:15712597

  2. Study of fault-tolerant software technology

    NASA Technical Reports Server (NTRS)

    Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.

    1984-01-01

    Presented is an overview of the current state of the art of fault-tolerant software and an analysis of quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the computer architecture and design implications on hardware, operating systems and programming languages (including Ada) of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research state. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to effectively and efficiently implement software fault-tolerance.

  3. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  4. Advanced technology development for image gathering, coding, and processing

    NASA Technical Reports Server (NTRS)

    Huck, Friedrich O.

    1990-01-01

    Three overlapping areas of research activities are presented: (1) Information theory and optimal filtering are extended to visual information acquisition and processing. The goal is to provide a comprehensive methodology for quantitatively assessing the end-to-end performance of image gathering, coding, and processing. (2) Focal-plane processing techniques and technology are developed to combine effectively image gathering with coding. The emphasis is on low-level vision processing akin to the retinal processing in human vision. (3) A breadboard adaptive image-coding system is being assembled. This system will be used to develop and evaluate a number of advanced image-coding technologies and techniques as well as research the concept of adaptive image coding.

  5. [Advantages and disadvantages of incorporating qualitative methodology in the evaluation of health services. A practical case: evaluation of a high-resolution clinic].

    PubMed

    Alvarez Del Arco, D; Rodríguez Rieiro, C; Sanchidrián De Blás, C; Alejos, B; Plá Mestre, R

    2012-01-01

    We examined the usefulness of incorporating a qualitative phase into the evaluation of quality of care in a high-resolution medical service, previously assessed with quantitative methods. Quantitative research was performed using a structured questionnaire, with interviewees selected by systematic randomized sampling (n=320). In addition, qualitative research was carried out through semi-structured interviews with patients selected by convenience criteria (n=11), observations of the care circuit, and a group interview with health professionals working in the service. A multidisciplinary research team conducted an individual analysis of the information collected in both the quantitative and qualitative phases. Subsequently, three meetings based on group brainstorming techniques were held to identify the contributions of each methodology to the research, using affinity diagrams to analyse the results obtained in both phases and to evaluate possible bias arising from the use of qualitative methods. The qualitative research allowed examination of specific aspects of the health care service that had been collected in the quantitative phase, harmonizing the results obtained in the previous phase, providing in-depth data on the reasons for patient dissatisfaction with specific aspects, such as waiting times and available infrastructure, and identifying emerging issues of the service which had not been previously assessed. Overall, the qualitative phase enriched the results of the research. It is appropriate and advisable to incorporate this methodological approach into research aimed at evaluating the quality of service in specific health care settings, since it provides, first hand, the voice of the customer. Copyright © 2011 SECA. Published by Elsevier Espana. All rights reserved.

  6. The Next Frontier: Quantitative Biochemistry in Living Cells.

    PubMed

    Honigmann, Alf; Nadler, André

    2018-01-09

    Researchers striving to convert biology into an exact science foremost rely on structural biology and biochemical reconstitution approaches to obtain quantitative data. However, cell biological research is moving at an ever-accelerating speed into areas where these approaches lose much of their edge. Intrinsically unstructured proteins and biochemical interaction networks composed of interchangeable, multivalent, and unspecific interactions pose unique challenges to quantitative biology, as do processes that occur in discrete cellular microenvironments. Here we argue that a conceptual change in our way of conducting biochemical experiments is required to take on these new challenges. We propose that reconstitution of cellular processes in vitro should be much more focused on mimicking the cellular environment in vivo, an approach that requires detailed knowledge of the material properties of cellular compartments, essentially requiring a material science of the cell. In a similar vein, we suggest that quantitative biochemical experiments in vitro should be accompanied by corresponding experiments in vivo, as many newly relevant cellular processes are highly context-dependent. In essence, this constitutes a call for chemical biologists to convert their discipline from a proof-of-principle science to an area that could rightfully be called quantitative biochemistry in living cells. In this essay, we discuss novel techniques and experimental strategies with regard to their potential to fulfill such ambitious aims.

  7. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data of signaling events with high spatio-temporal resolution. A novel technique which allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique appears as a highly quantitative and versatile technique, which can be a convenient replacement for the most conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  8. Research in health sciences library and information science: a quantitative analysis.

    PubMed Central

    Dimitroff, A

    1992-01-01

    A content analysis of research articles published between 1966 and 1990 in the Bulletin of the Medical Library Association was undertaken. Four specific questions were addressed: What subjects are of interest to health sciences librarians? Who is conducting this research? How do health sciences librarians conduct their research? Do health sciences librarians obtain funding for their research activities? Bibliometric characteristics of the research articles are described and compared to characteristics of research in library and information science as a whole in terms of subject and methodology. General findings were that most research in health sciences librarianship is conducted by librarians affiliated with academic health sciences libraries (51.8%); most deals with an applied (45.7%) or a theoretical (29.2%) topic; survey (41.0%) or observational (20.7%) research methodologies are used; descriptive quantitative analytical techniques are used (83.5%); and over 25% of research is funded. The average number of authors was 1.85, average article length was 7.25 pages, and average number of citations per article was 9.23. These findings are consistent with those reported in the general library and information science literature for the most part, although specific differences do exist in methodological and analytical areas. PMID:1422504

  9. Quantitative Analysis of Situational Awareness (QUASA): Applying Signal Detection Theory to True/False Probes and Self-Ratings

    DTIC Science & Technology

    2004-06-01

    obtained. Further refinements of the technique based on recent research in experimental psychology are also considered. INTRODUCTION The key...an established line of research in psychology in which objective and subjective metrics are combined to analyse the degree of ‘calibration’ in...(Creelman, 1991). A notable exception is the study by Kunimoto et al. (2001) in which confidence ratings were subjected to SDT analysis to evaluate the
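The calibration approach referenced in this record rests on standard signal detection theory, in which hit and false-alarm rates from true/false probes are converted into a sensitivity index d' and a response criterion c. A stdlib-only Python sketch of that computation (the counts and the log-linear correction are illustrative assumptions, not taken from the report):

```python
# Signal detection theory sketch: d' (sensitivity) and c (criterion/bias)
# from hit and false-alarm counts, as when scoring true/false SA probes.
from statistics import NormalDist

def dprime(hits, misses, false_alarms, correct_rejections):
    # Log-linear correction (add 0.5 to each cell) avoids infinite
    # z-scores when a rate would be exactly 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z = NormalDist().inv_cdf          # inverse of the standard normal CDF
    d = z(hit_rate) - z(fa_rate)      # sensitivity
    c = -0.5 * (z(hit_rate) + z(fa_rate))  # criterion (0 = unbiased)
    return d, c

d, c = dprime(hits=40, misses=10, false_alarms=12, correct_rejections=38)
print(round(d, 3), round(c, 3))
```

A d' near zero indicates chance-level discrimination of true from false probes; a nonzero c indicates a bias toward answering "true" or "false".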

  10. An attribute-driven statistics generator for use in a G.I.S. environment

    NASA Technical Reports Server (NTRS)

    Thomas, R. W.; Ritter, P. R.; Kaugars, A.

    1984-01-01

    When performing research using digital geographic information it is often useful to produce quantitative characterizations of the data, usually within some constraints. In the research environment the different combinations of required data and constraints can often become quite complex. This paper describes a technique that gives the researcher a powerful and flexible way to set up many possible combinations of data and constraints without having to perform numerous intermediate steps or create temporary data bands. This method provides an efficient way to produce descriptive statistics in such situations.
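The idea of producing descriptive statistics under arbitrary attribute constraints, without creating temporary data bands, can be sketched in a few lines. This is a minimal, hypothetical illustration (band names, predicates, and values are invented, not drawn from the described system):

```python
# Attribute-driven statistics sketch: compute descriptive statistics over
# only those cells of a data band whose co-registered attribute bands
# satisfy a set of user-supplied constraints.
from statistics import mean, stdev

def constrained_stats(data_band, attribute_bands, constraints):
    """constraints: dict mapping band name -> predicate on that band's value."""
    selected = [
        v for i, v in enumerate(data_band)
        if all(pred(attribute_bands[name][i]) for name, pred in constraints.items())
    ]
    return {"n": len(selected), "mean": mean(selected), "stdev": stdev(selected)}

# hypothetical co-registered bands
elevation = [100, 200, 150, 300, 250, 180]
slope     = [  5,  25,  10,  40,  15,   8]
landcover = [  1,   2,   1,   2,   1,   1]

stats = constrained_stats(
    elevation,
    {"slope": slope, "landcover": landcover},
    {"slope": lambda s: s < 20, "landcover": lambda c: c == 1},
)
print(stats)
```

Because the constraints are passed as predicates, new combinations of data and constraints require no intermediate processing steps, which is the flexibility the record describes.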

  11. Metabolic Mapping: Quantitative Enzyme Cytochemistry and Histochemistry to Determine the Activity of Dehydrogenases in Cells and Tissues.

    PubMed

    Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F

    2018-05-26

    Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western Blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
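The absorbance-based quantification step described above follows the Beer-Lambert relation A = -log10(I/I0). A minimal Python sketch, assuming 8-bit monochromatic pixel intensities and an operator-selected formazan-free blank region (all names and values are illustrative, not from the published protocol):

```python
# Absorbance quantification sketch for metabolic mapping images.
# I0 is estimated from a blank (formazan-free) region; per-pixel
# absorbance then follows Beer-Lambert: A = -log10(I / I0).
import math

def mean_absorbance(pixels, blank_pixels):
    i0 = sum(blank_pixels) / len(blank_pixels)           # incident intensity estimate
    absorbances = [-math.log10(max(p, 1) / i0) for p in pixels]  # clamp avoids log(0)
    return sum(absorbances) / len(absorbances)

# region of interest darker than the blank -> positive mean absorbance
roi = [120, 110, 130, 125]
blank = [240, 250, 245, 241]
print(round(mean_absorbance(roi, blank), 3))
```

Averaging absorbance (rather than raw intensity) is what makes the readout proportional to formazan amount, and hence to dehydrogenase activity.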

  12. Quantitative microscopy of the lung: a problem-based approach. Part 1: basic principles of lung stereology.

    PubMed

    Ochs, Matthias; Mühlfeld, Christian

    2013-07-01

    The growing awareness of the importance of accurate morphometry in lung research has recently motivated the publication of guidelines set forth by a combined task force of the American Thoracic Society and the European Respiratory Society (20). This official ATS/ERS Research Policy Statement provides general recommendations on which stereological methods are to be used in quantitative microscopy of the lung. However, to integrate stereology into a particular experimental study design, investigators are left with the problem of how to implement this in practice. Specifically, different animal models of human lung disease require the use of different stereological techniques and may determine the mode of lung fixation, tissue processing, preparation of sections, and other things. Therefore, the present companion articles were designed to allow a short practically oriented introduction into the concepts of design-based stereology (Part 1) and to provide recommendations for choosing the most appropriate methods to investigate a number of important disease models (Part 2). Worked examples with illustrative images will facilitate the practical performance of equivalent analyses. Study algorithms provide comprehensive surveys to ensure that no essential step gets lost during the multistage workflow. Thus, with this review, we hope to close the gap between theory and practice and enhance the use of stereological techniques in pulmonary research.
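One of the simplest design-based stereological estimators the companion articles build on is point counting, where the volume fraction of a structure equals the fraction of uniformly positioned test points hitting it (V_V = P_P). A hedged sketch with hypothetical counts (this illustrates the identity only, not the full ATS/ERS workflow):

```python
# Point-counting stereology sketch: estimate a volume fraction V_V from
# test-point hits on systematic uniform random sections, using V_V = P_P.
def volume_fraction(points_on_structure, points_on_reference):
    """Ratio estimator over all fields: total hits on the structure of
    interest divided by total hits on the reference space."""
    return sum(points_on_structure) / sum(points_on_reference)

# hypothetical counts from five fields of view
hits_septa = [12, 9, 15, 11, 13]          # points falling on alveolar septa
hits_parenchyma = [48, 50, 52, 47, 53]    # points falling on lung parenchyma
print(round(volume_fraction(hits_septa, hits_parenchyma), 3))
```

Summing counts across fields before dividing (a ratio estimator) is preferred to averaging per-field ratios, since fields contribute in proportion to the reference space they contain.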

  13. Quantitation without Calibration: Response Profile as an Indicator of Target Amount.

    PubMed

    Debnath, Mrittika; Farace, Jessica M; Johnson, Kristopher D; Nesterova, Irina V

    2018-06-21

    Quantitative assessment of biomarkers is essential in numerous contexts, from decision-making in clinical situations to food quality monitoring to interpretation of life-science research findings. However, appropriate quantitation techniques are not as widely addressed as detection methods. One of the major challenges in biomarker quantitation is the need for a calibration to correlate a measured signal with a target amount. This step complicates the methodologies and makes them less sustainable. In this work we address the issue via a new strategy: relying on the position of a response profile, rather than on an absolute signal value, for assessment of a target's amount. To enable this capability, we developed a target-probe binding mechanism based on a negative cooperativity effect. A proof-of-concept example demonstrates that the model is suitable for quantitative analysis of nucleic acids over a wide concentration range. The general principles of the platform will be applicable to a variety of biomarkers, including nucleic acids, proteins, peptides, and others.

  14. Experimental feasibility study of the application of magnetic suspension techniques to large-scale aerodynamic test facilities. [cryogenic transonic wind tunnel]

    NASA Technical Reports Server (NTRS)

    Zapata, R. N.; Humphris, R. R.; Henderson, K. C.

    1975-01-01

    Based on the premises that magnetic suspension techniques can play a useful role in large scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.

  15. Quantitative proteomics in Giardia duodenalis-Achievements and challenges.

    PubMed

    Emery, Samantha J; Lacey, Ernest; Haynes, Paul A

    2016-08-01

    Giardia duodenalis (syn. G. lamblia and G. intestinalis) is a protozoan parasite of vertebrates and a major contributor to the global burden of diarrheal diseases and gastroenteritis. The publication of multiple genome sequences in the G. duodenalis species complex has provided important insights into parasite biology, and made post-genomic technologies, including proteomics, significantly more accessible. The aims of proteomics are to identify and quantify proteins present in a cell, and assign functions to them within the context of dynamic biological systems. In Giardia, proteomics in the post-genomic era has transitioned from reliance on gel-based systems to utilisation of a diverse array of techniques based on bottom-up LC-MS/MS technologies. Together, these have generated crucial foundations for subcellular proteomes, elucidated intra- and inter-assemblage isolate variation, and identified pathways and markers in differentiation, host-parasite interactions and drug resistance. However, in Giardia, proteomics remains an emerging field, with considerable shortcomings evident from the published research. These include a bias towards assemblage A, a lack of emphasis on quantitative analytical techniques, and limited information on post-translational protein modifications. Additionally, there are multiple areas of research for which proteomic data is not available to add value to published transcriptomic data. The challenge of amalgamating data in the systems biology paradigm necessitates the further generation of large, high-quality quantitative datasets to accurately model parasite biology. This review surveys the current proteomic research available for Giardia and evaluates their technical and quantitative approaches, while contextualising their biological insights into parasite pathology, isolate variation and eukaryotic evolution. Finally, we propose areas of priority for the generation of future proteomic data to explore fundamental questions in Giardia, including the analysis of post-translational modifications, and the design of MS-based assays for validation of differentially expressed proteins in large datasets. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Exercise, Diet, and Stress Management as Mediators between Functional Disability and Health-Related Quality of Life in Multiple Sclerosis

    ERIC Educational Resources Information Center

    Sung, Connie; Chiu, Chung-Yi; Lee, Eun-Jeong; Bezyak, Jill; Chan, Fong; Muller, Veronica

    2013-01-01

    The main objective of this study was to examine the mediational and moderational effect of exercise, diet, and stress management on the relationship between functional disability and health-related quality of life. A quantitative descriptive research design using multiple regression and correlation techniques was used. Participants were 215…

  17. Values of Local Wisdom: A Potential to Develop an Assessment and Remedial

    ERIC Educational Resources Information Center

    Toharudin, Uus; Kurniawan, Iwan Setia

    2017-01-01

    Development of assessment and remedial instruction needs to be done because it is an important part of a learning process. This study aimed to describe the ability of student teachers of biology in developing assessment and remedial instruction based on local wisdom, using a quasi-experimental research method with quantitative descriptive analysis techniques. The research…

  18. The Effect of Physical Activity on Children with ADHD: A Quantitative Review of the Literature

    ERIC Educational Resources Information Center

    Cornelius, Colleen; Fedewa, Alicia L.; Ahn, Soyeon

    2017-01-01

    Research on the effects of physical activity on children with attention deficit hyperactivity disorder is promising, yet no attempt has been made to integrate current findings using meta-analytic techniques. Using a meta-regression, the present study examined the effectiveness of physical activity for children with attention deficit hyperactivity…

  19. Application of Simulation to Individualized Self-Paced Training. Final Report. TAEG Report No. 11-2.

    ERIC Educational Resources Information Center

    Lindahl, William H.; Gardner, James H.

    Computer simulation is recognized as a valuable systems analysis research tool which enables the detailed examination, evaluation, and manipulation, under stated conditions, of a system without direct action on the system. This technique provides management with quantitative data on system performance and capabilities which can be used to compare…

  20. Perceptions and Barriers to ICT Use among English Teachers in Indonesia

    ERIC Educational Resources Information Center

    Muslem, Asnawi; Yusuf, Yunisrina Qismullah; Juliana, Rena

    2018-01-01

    The purpose of this research is to investigate English teachers' perception and challenges of the implementation of ICT in ELT classrooms. This study used mixed methods, qualitative and quantitative in nature. A purposive sampling technique was used to select the study subjects, who are 26 English teachers from 16 public senior high schools in…

  1. Applying Qualitative Methods in Organizations: A Note for Industrial/Organizational Psychologists

    ERIC Educational Resources Information Center

    Ehigie, Benjamin Osayawe; Ehigie, Rebecca Ibhaguelo

    2005-01-01

    The early approach to research in industrial and organizational (I/O) psychology was oriented towards quantitative techniques as a result of influences from the social sciences. As the focus of I/O psychology expands from psychological test development to other personnel functions, there has been an inclusion of qualitative methods in I/O psychology…

  2. Relationship between Teacher ICT Competency and Teacher Acceptance and Use of School Management System (SMS)

    ERIC Educational Resources Information Center

    Wei, Leong Mei; Piaw, Chua Yan; Kannan, Sathiamoorthy; Moulod, Shafinaz A.

    2016-01-01

    This study aims at examining the relationship between teacher ICT competency and teacher acceptance and use of SMS in Negeri Sembilan secondary schools in Malaysia. This is a non-experimental quantitative study using a survey technique, through the administration of a questionnaire comprising teacher demographic variables, teacher ICT…

  3. Quali-quantitative analysis (QQA): why it could open new frontiers for holistic health practice.

    PubMed

    Bell, Erica

    2006-12-15

    Holistic health practice is often described as being about understanding the larger contexts of patients, their health services, and their communities. Yet do traditional quantitative and qualitative health research methods produce the best possible evidence for the holistic practices of doctors, nurses, and allied health professionals? This paper argues "no", and examines the potential of a cutting-edge, social science research method--Quali-Quantitative Research (QQA)--for providing better evidence for holistic practice, particularly in small-N populations, such as rural and remote communities. It does so with reference to the international literature on holistic medicine, as well as three holistic health projects conducted in Tasmania: about prevention of falls in older people, adolescent substance abuse, and interventions for children aged 0-5 exposed to domestic violence. The findings suggest that much health research fails to capture rigorously the contextual complexity of holistic health challenges: the multiple different needs of individual patients, and the interprofessional approaches needed to deliver multidisciplinary and multiservice health interventions tailored to meet those needs in particular community contexts. QQA offers a "configurational", case-based, diversity-oriented approach to analysing data that combines qualitative and quantitative techniques to overcome the limitations of both research traditions. The author concludes that QQA could open new frontiers for holistic health by helping doctors, nurses, and allied health professionals answer a fundamental question presented by complex health challenges: "Given this set of whole-of-patient needs, what elements of which interventions in what services would work best in this particular community?"

  4. Text Mining in Organizational Research

    PubMed Central

    Kobayashi, Vladimer B.; Berkers, Hannah A.; Kismihók, Gábor; Den Hartog, Deanne N.

    2017-01-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies. PMID:29881248

  5. Text Mining in Organizational Research.

    PubMed

    Kobayashi, Vladimer B; Mol, Stefan T; Berkers, Hannah A; Kismihók, Gábor; Den Hartog, Deanne N

    2018-07-01

    Despite the ubiquity of textual data, so far few researchers have applied text mining to answer organizational research questions. Text mining, which essentially entails a quantitative approach to the analysis of (usually) voluminous textual data, helps accelerate knowledge discovery by radically increasing the amount of data that can be analyzed. This article aims to acquaint organizational researchers with the fundamental logic underpinning text mining, the analytical stages involved, and contemporary techniques that may be used to achieve different types of objectives. The specific analytical techniques reviewed are (a) dimensionality reduction, (b) distance and similarity computing, (c) clustering, (d) topic modeling, and (e) classification. We describe how text mining may extend contemporary organizational research by allowing the testing of existing or new research questions with data that are likely to be rich, contextualized, and ecologically valid. After an exploration of how evidence for the validity of text mining output may be generated, we conclude the article by illustrating the text mining process in a job analysis setting using a dataset composed of job vacancies.
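Two of the reviewed stages, text representation and distance/similarity computing, can be sketched with nothing but the standard library. A real pipeline would use a library such as scikit-learn; the documents below are invented job-vacancy snippets, and the weighting is plain TF-IDF:

```python
# Stdlib-only text mining sketch: bag-of-words TF-IDF vectors plus
# cosine similarity between documents (sparse dicts as vectors).
import math
from collections import Counter

def tfidf_vectors(docs):
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter(term for toks in tokenized for term in set(toks))  # document frequency
    n = len(docs)
    return [
        {t: tf * math.log(n / df[t]) for t, tf in Counter(toks).items()}
        for toks in tokenized
    ]

def cosine(u, v):
    dot = sum(u[t] * v[t] for t in u.keys() & v.keys())
    norm = math.sqrt(sum(x * x for x in u.values())) * math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

vecs = tfidf_vectors([
    "data engineer python sql",
    "software engineer python java",
    "nurse hospital patient care",
])
print(round(cosine(vecs[0], vecs[1]), 3), round(cosine(vecs[0], vecs[2]), 3))
```

The similarity matrix such vectors induce is the input to the clustering and classification stages the article reviews.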

  6. The new statistics: why and how.

    PubMed

    Cumming, Geoff

    2014-01-01

    We need to make substantial changes to how we conduct research. First, in response to heightened concern that our published research literature is incomplete and untrustworthy, we need new requirements to ensure research integrity. These include prespecification of studies whenever possible, avoidance of selection and other inappropriate data-analytic practices, complete reporting, and encouragement of replication. Second, in response to renewed recognition of the severe flaws of null-hypothesis significance testing (NHST), we need to shift from reliance on NHST to estimation and other preferred techniques. The new statistics refers to recommended practices, including estimation based on effect sizes, confidence intervals, and meta-analysis. The techniques are not new, but adopting them widely would be new for many researchers, as well as highly beneficial. This article explains why the new statistics are important and offers guidance for their use. It describes an eight-step new-statistics strategy for research with integrity, which starts with formulation of research questions in estimation terms, has no place for NHST, and is aimed at building a cumulative quantitative discipline.
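The estimation-based reporting the article recommends, effect sizes with confidence intervals instead of p-values, can be illustrated with a short stdlib-only sketch. The data are invented, and the normal-approximation interval for Cohen's d is a simplification of the methods the article discusses:

```python
# "New statistics" sketch: report a standardized effect size (Cohen's d)
# with an approximate 95% confidence interval rather than an NHST p-value.
import math
from statistics import NormalDist, mean, stdev

def cohens_d_with_ci(group1, group2, confidence=0.95):
    n1, n2 = len(group1), len(group2)
    pooled_sd = math.sqrt(
        ((n1 - 1) * stdev(group1) ** 2 + (n2 - 1) * stdev(group2) ** 2) / (n1 + n2 - 2)
    )
    d = (mean(group1) - mean(group2)) / pooled_sd
    # approximate standard error of d, then a normal-theory interval
    se = math.sqrt((n1 + n2) / (n1 * n2) + d ** 2 / (2 * (n1 + n2)))
    z = NormalDist().inv_cdf(0.5 + confidence / 2)
    return d, (d - z * se, d + z * se)

d, (low, high) = cohens_d_with_ci([23, 25, 28, 30, 26], [20, 22, 21, 24, 19])
print(round(d, 2), round(low, 2), round(high, 2))
```

The width of the interval, not a binary significance verdict, conveys the precision of the estimate, which is the point of the recommended shift.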

  7. Advances in imaging and quantification of electrical properties at the nanoscale using Scanning Microwave Impedance Microscopy (sMIM)

    NASA Astrophysics Data System (ADS)

    Friedman, Stuart; Yang, Yongliang; Amster, Oskar

    2015-03-01

    Scanning Microwave Impedance Microscopy (sMIM) is a mode for Atomic Force Microscopy (AFM) enabling imaging of unique contrast mechanisms and measurement of local permittivity and conductivity at length scales of tens of nanometers. Recent results will be presented illustrating high-resolution electrical features such as sub-15 nm Moiré patterns in graphene, carbon nanotubes of various electrical states, and ferroelectrics. In addition to imaging, the technique is suited to a variety of metrology applications where specific physical properties are determined quantitatively. We will present research activities on quantitative measurements using multiple techniques to determine dielectric constant (permittivity) and conductivity (e.g. dopant concentration) for a range of materials. Examples include bulk dielectrics, low-k dielectric thin films, capacitance standards and doped semiconductors. Funded in part by DOE SBIR DE-SC0009586.

  8. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also produce preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045

  9. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation are explored quantitatively with two chaos-based non-linear techniques, viz. multi-fractal detrended fluctuation analysis and visibility network analysis. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation, from PhysioNet. The results show consistent differences between the quantitative parameters obtained by the two analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation, supported by quantitative parameters. The results also produce preliminary evidence that these techniques can be used as a measure of the physiological impact on subjects performing meditation.

  10. Advanced ballistic range technology

    NASA Technical Reports Server (NTRS)

    Yates, Leslie A.

    1993-01-01

    Experimental interferograms, schlieren, and shadowgraphs are used for quantitative and qualitative flow-field studies. These images are created by passing light through a flow field, and the recorded intensity patterns are functions of the phase shift and angular deflection of the light. As part of grant NCC2-583, techniques and software have been developed for obtaining phase shifts from finite-fringe interferograms and for constructing optical images from Computational Fluid Dynamics (CFD) solutions. During the period from 1 Nov. 1992 to 30 Jun. 1993, research efforts were concentrated on improving these techniques.
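
    The relation between phase shift and recorded intensity that these techniques invert can be illustrated with a minimal synthetic finite-fringe model. The bias, visibility, and carrier frequency below are invented illustration values, not parameters from the grant work:

```python
import math

# Minimal synthetic finite-fringe model: the recorded intensity is a
# carrier fringe pattern modulated by the flow-induced phase shift phi.

def fringe_intensity(x, phi, i0=1.0, v=1.0, f0=2.0):
    return i0 * (1.0 + v * math.cos(2.0 * math.pi * f0 * x + phi))

# With no phase disturbance, the pattern is the undisturbed carrier:
# a bright fringe sits where the cosine is +1.
assert abs(fringe_intensity(0.0, 0.0) - 2.0) < 1e-12
# A phase shift of pi turns that bright fringe dark.
assert abs(fringe_intensity(0.0, math.pi)) < 1e-12
```

    Recovering phi from the recorded intensities is the inverse problem the finite-fringe phase-extraction software addresses.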

  11. Multistage, multiseasonal and multiband imagery to identify and qualify non-forest vegetation resources

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.; Francis, R. E.

    1970-01-01

    A description of space and supporting aircraft photography for the interpretation and analysis of non-forest (shrubby and herbaceous) native vegetation is presented. The research includes the development of a multiple sampling technique to assign quantitative area values to specific plant community types included within an assigned space photograph map unit. Investigations of aerial film type, scale, and season of photography for identification and quantitative measurement of shrubby and herbaceous vegetation were also conducted. Some work was done to develop automated interpretation techniques with film image density measurement devices.

  12. Nondestructive techniques for characterizing mechanical properties of structural materials: An overview

    NASA Technical Reports Server (NTRS)

    Vary, A.; Klima, S. J.

    1985-01-01

    An overview of nondestructive evaluation (NDE) is presented to indicate the availability and application potentials of techniques for quantitative characterization of the mechanical properties of structural materials. The purpose is to review NDE techniques that go beyond the usual emphasis on flaw detection and characterization. Discussed are current and emerging NDE techniques that can verify and monitor intrinsic properties (e.g., tensile, shear, and yield strengths; fracture toughness, hardness, ductility; elastic moduli) and underlying microstructural and morphological factors. Most of the techniques described are, at present, neither widely applied nor widely accepted in commerce and industry because they are still emerging from the laboratory. The limitations of the techniques may be overcome by advances in applications research and instrumentation technology and perhaps by accommodations for their use in the design of structural parts.

  13. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been devoted to improving MDLC-based strategies, including "top-down" and "bottom-up", to enable highly sensitive qualitative and quantitative analysis of proteins, as well as to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations, and identification were also developed to achieve high-throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances of such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  14. Recent advances in mass spectrometry-based proteomics of gastric cancer.

    PubMed

    Kang, Changwon; Lee, Yejin; Lee, J Eugene

    2016-10-07

    The last decade has witnessed remarkable technological advances in mass spectrometry-based proteomics. The development of proteomics techniques has enabled the reliable analysis of complex proteomes, leading to the identification and quantification of thousands of proteins in gastric cancer cells, tissues, and sera. This quantitative information has been used to profile the anomalies in gastric cancer and provide insights into the pathogenic mechanism of the disease. In this review, we mainly focus on the advances in mass spectrometry and quantitative proteomics that were achieved in the last five years and how these up-and-coming technologies are employed to track biochemical changes in gastric cancer cells. We conclude by presenting a perspective on quantitative proteomics and its future applications in the clinic and translational gastric cancer research.

  15. A comparison of two IPv4/IPv6 transition mechanisms - OpenVPN and IVI

    NASA Astrophysics Data System (ADS)

    Vu, Cong Tuan; Tran, Quang Anh; Jiang, Frank

    2012-09-01

    This document presents a comparison of two IPv4/IPv6 transition mechanisms: OpenVPN and IVI. While OpenVPN is based on tunneling technology, IVI is a stateless IPv4/IPv6 translation technique developed by the China Education and Research Network (CERNET). This research focuses on the quantitative and qualitative comparison of these two mechanisms: how they are applied in practice by Internet Service Providers, as well as their advantages and drawbacks.

  16. The research of statistical properties of colorimetric features of screens with a three-component color formation principle

    NASA Astrophysics Data System (ADS)

    Zharinov, I. O.; Zharinov, O. O.

    2017-12-01

    This research is concerned with quantitative analysis of the influence of technological variation of screen color profile parameters on the chromaticity coordinates of the displayed image. Mathematical expressions are proposed that approximate the two-dimensional distribution of the chromaticity coordinates of an image displayed on a screen with a three-component color formation principle. The proposed expressions point the way toward correction techniques that improve the reproducibility of the colorimetric features of displays.

  17. The remote sensing of aquatic macrophytes Part 1: Color-infrared aerial photography as a tool for identification and mapping of littoral vegetation. Part 2: Aerial photography as a quantitative tool for the investigation of aquatic ecosystems. [Lake Wingra, Wisconsin

    NASA Technical Reports Server (NTRS)

    Gustafson, T. D.; Adams, M. S.

    1973-01-01

    Research was initiated to use aerial photography as an investigative tool in studies that are part of an intensive aquatic ecosystem research effort at Lake Wingra, Madison, Wisconsin. It is anticipated that photographic techniques would supply information about the growth and distribution of littoral macrophytes with efficiency and accuracy greater than conventional methods.

  18. Introduction to Modern Methods in Light Microscopy.

    PubMed

    Ryan, Joel; Gerhold, Abby R; Boudreau, Vincent; Smith, Lydia; Maddox, Paul S

    2017-01-01

    For centuries, light microscopy has been a key method in biological research, from the early work of Robert Hooke describing biological organisms as cells, to the latest in live-cell and single-molecule systems. Here, we introduce some of the key concepts related to the development and implementation of modern microscopy techniques. We briefly discuss the basics of optics in the microscope, super-resolution imaging, quantitative image analysis, and live-cell imaging, and provide an outlook on active research areas pertaining to light microscopy.

  19. Microscopy techniques in flavivirus research.

    PubMed

    Chong, Mun Keat; Chua, Anthony Jin Shun; Tan, Terence Tze Tong; Tan, Suat Hoon; Ng, Mah Lee

    2014-04-01

    The Flavivirus genus is composed of many medically important viruses that cause high morbidity and mortality, which include Dengue and West Nile viruses. Various molecular and biochemical techniques have been developed in the endeavour to study flaviviruses. However, microscopy techniques still have irreplaceable roles in the identification of novel virus pathogens and characterization of morphological changes in virus-infected cells. Fluorescence microscopy contributes greatly in understanding the fundamental viral protein localizations and virus-host protein interactions during infection. Electron microscopy remains the gold standard for visualizing ultra-structural features of virus particles and infected cells. New imaging techniques and combinatory applications are continuously being developed to push the limit of resolution and extract more quantitative data. Currently, correlative live cell imaging and high resolution three-dimensional imaging have already been achieved through the tandem use of optical and electron microscopy in analyzing biological specimens. Microscopy techniques are also used to measure protein binding affinities and determine the mobility pattern of proteins in cells. This chapter will consolidate on the applications of various well-established microscopy techniques in flavivirus research, and discuss how recently developed microscopy techniques can potentially help advance our understanding in these membrane viruses. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. [Research progress and application prospect of near infrared spectroscopy in soil nutrition analysis].

    PubMed

    Ding, Hai-quan; Lu, Qi-peng

    2012-01-01

    "Digital agriculture" or "precision agriculture" is an important direction of modern agricultural technique. It is the combination of modern information technology and traditional agriculture, and it has become a hot field in international agricultural research in recent years. As a nondestructive, real-time, effective, and exact analysis technique, near infrared spectroscopy, by which precision agriculture could be carried out, has vast prospects in agrology and has gradually gained recognition. The present paper reviews the basic theory of near infrared spectroscopy and its applications in the field of agrology, pointing out that the direction of NIR in agrology should be based on portable NIR spectrographs in order to acquire qualitative or quantitative information from real-time measurement in the field. In addition, NIRS could be combined with space remote sensing to macroscopically monitor the way crops are growing and the nutrition they need, to radically change the current state of our country's agriculture.

  1. Multi-scale modeling in cell biology

    PubMed Central

    Meier-Schellersheim, Martin; Fraser, Iain D. C.; Klauschen, Frederick

    2009-01-01

    Biomedical research frequently involves performing experiments and developing hypotheses that link different scales of biological systems such as, for instance, the scales of intracellular molecular interactions to the scale of cellular behavior and beyond to the behavior of cell populations. Computational modeling efforts that aim at exploring such multi-scale systems quantitatively with the help of simulations have to incorporate several different simulation techniques due to the different time and space scales involved. Here, we provide a non-technical overview of how different scales of experimental research can be combined with the appropriate computational modeling techniques. We also show that current modeling software permits building and simulating multi-scale models without having to become involved with the underlying technical details of computational modeling. PMID:20448808

  2. Heike Kamerlingh Onnes: Master of Experimental Technique and Quantitative Research

    NASA Astrophysics Data System (ADS)

    Reif-Acherman, Simón

    Heike Kamerlingh Onnes (1853-1926), born a century and a half ago, was a major protagonist in the so-called Second Golden Age of Dutch Science. He devoted his career to the emerging field of low-temperature physics. His particular concern was to test the theories of his older compatriot Johannes Diderik van der Waals (1837-1923) by creating a style of research that was characterized by meticulous planning, precise measurement, and constant improvement of techniques and instruments. He made numerous contributions to low-temperature physics, but I focus on his liquefaction of helium, for which he received the Nobel Prize in Physics for 1913, and on his discovery of superconductivity. He became known internationally as le gentleman du zéro absolu.

  3. Attitude and perception of farmers to the implementation of conservation farming in the mountainous area of South Sulawesi

    NASA Astrophysics Data System (ADS)

    Busthanul, N.; Lumoindong, Y.; Syafiuddin, M.; Heliawaty; Lanuhu, N.; Ibrahim, T.; Ambrosius, R. R.

    2018-05-01

    Farmers’ attitudes and perceptions may be the cause of ineffective implementation of conservation farming for agricultural sustainability, owing to variation in how conservation techniques are implemented. The purpose of this research is to know farmers’ attitudes and perceptions toward the application of conservation techniques, and to know the correlation between farmer attitude and perception toward that application. The research was carried out in Kanreapia Village, Tombolo Pao District, Gowa Regency, South Sulawesi Province, Indonesia. Sampling was done randomly with 30 farmers, using non-parametric statistics with a quantitative and qualitative descriptive data analysis approach based on a Likert scale. The results showed that the conservation technique rated highest (appropriate) in farmer attitude and perception is seasonal crop rotation, while the lowest, rated less appropriate, is tillage of the land along the contour and planting accordingly. There is a very strong relationship between farmer attitude and perception. The implication of these findings is that improvements in the implementation of conservation farming techniques should be made through improved perceptions.
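
    The non-parametric treatment of Likert-scale scores described above is commonly done with Spearman's rank correlation; the abstract does not name the exact statistic, so the following is only an illustrative sketch, with invented scores rather than the survey data:

```python
# Spearman's rank correlation: Pearson correlation computed on ranks,
# with tied values assigned their average rank.

def ranks(values):
    """Return 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

attitude   = [5, 4, 4, 3, 2, 1]   # invented Likert scores
perception = [5, 5, 4, 3, 2, 1]
print(round(spearman(attitude, perception), 3))   # -> 0.956
```

    A coefficient this close to 1 is what "a very strong relationship" between two ranked variables looks like numerically.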

  4. Teachers' Subject Matter Knowledge as a Teacher Qualification: A Synthesis of the Quantitative Literature on Students' Mathematics Achievement

    ERIC Educational Resources Information Center

    Ahn, Soyeon; Choi, Jinyoung

    2004-01-01

    The aim of this paper is to examine a variety of features of research that might account for mixed findings of the relationship between teachers' subject matter knowledge and student achievement based on meta-analytic technique. Of several variables that might contribute to inconsistencies and variations in study findings, three features of…

  5. Education on an Island: Oklahoma Correctional Educators' Views of Internal Teacher Traits and Successful Learning Environments on Incarcerated Adult Students in an Institutional Setting

    ERIC Educational Resources Information Center

    Ely, Jeana Dawn

    2011-01-01

    Scope and method of study. This inquiry, using survey and interview techniques, demonstrated both quantitative and qualitative research methodologies. In this study, effective teacher traits related to successful classroom structure in the correctional environment for adult students with a wide variety of issues, problems and learning difficulties…

  6. Investigating How the Biographies of Today's Scientists Affect 8th Graders' Scientist Image

    ERIC Educational Resources Information Center

    Karaçam, Sedat

    2016-01-01

    This study aimed to investigate how a poster study focusing on the biographies of today's scientists affected 8th graders' scientist images. The study utilized a mixed model which combined qualitative and quantitative research techniques. 142 8th graders from a secondary school in Ankara Province Keçiören District participated in the study.…

  7. Grid-Enabled Quantitative Analysis of Breast Cancer

    DTIC Science & Technology

    2009-10-01

    large-scale, multi-modality computerized image analysis. The central hypothesis of this research is that large-scale image analysis for breast cancer... pilot study to utilize large-scale parallel Grid computing to harness the nationwide cluster infrastructure for optimization of medical image analysis parameters. Additionally, we investigated the use of cutting-edge data analysis/mining techniques as applied to Ultrasound, FFDM, and DCE-MRI breast...

  8. System Modeling and Diagnostics for Liquefying-Fuel Hybrid Rockets

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Ou, Jeremy; Sanderfer, Dwight; Patterson-Hine, Ann

    2003-01-01

    A Hybrid Combustion Facility (HCF) was recently built at NASA Ames Research Center to study the combustion properties of a new fuel formulation that burns approximately three times faster than conventional hybrid fuels. Researchers at Ames working in the area of Integrated Vehicle Health Management recognized a good opportunity to apply IVHM techniques to a candidate technology for next generation launch systems. Five tools were selected to examine various IVHM techniques for the HCF. Three of the tools, TEAMS (Testability Engineering and Maintenance System), L2 (Livingstone2), and RODON, are model-based reasoning (or diagnostic) systems. Two other tools in this study, ICS (Interval Constraint Simulator) and IMS (Inductive Monitoring System), do not attempt to isolate the cause of the failure but may be used for fault detection. Models of varying scope and completeness were created, both qualitative and quantitative. In each of the models, the structure and behavior of the physical system are captured. In the qualitative models, the temporal aspects of the system behavior and the abstraction of sensor data are handled outside of the model and require the development of additional code. In the quantitative model, less extensive processing code is necessary. Examples of fault diagnoses are given.
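
    Of the fault-detection tools mentioned, IMS learns nominal system behavior from data rather than from a hand-built model. The following is a heavily simplified stand-in for that idea, using per-sensor min/max envelopes instead of IMS's actual clustering of sensor vectors, with invented readings:

```python
# IMS-style detection, heavily reduced: learn per-sensor envelopes from
# nominal training data, then flag any later reading outside them.

def learn_envelopes(nominal_rows):
    cols = list(zip(*nominal_rows))
    return [(min(c), max(c)) for c in cols]

def detect(envelopes, row):
    """Return the indices of sensors whose reading falls outside its envelope."""
    return [i for i, (lo, hi) in enumerate(envelopes)
            if not (lo <= row[i] <= hi)]

nominal = [[300.0, 5.1], [305.0, 5.3], [298.0, 5.0]]  # e.g. temperature, pressure
env = learn_envelopes(nominal)
print(detect(env, [302.0, 5.2]))   # in-family reading -> []
print(detect(env, [340.0, 5.2]))   # sensor 0 out of envelope -> [0]
```

    Like IMS, this detects that something is off-nominal without attempting to isolate the cause, which is the role of the model-based reasoners in the study.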

  9. [Drug advertisement in a medicine school in the Southern of Brazil].

    PubMed

    Trevisol, Daisson José; Ferreira, Maria Beatriz Cardoso; Karnopp, Zuleica Maria Patrício

    2010-11-01

    This is a quali-quantitative study on drug advertisement in a medical school in Santa Catarina state. Participants were medical students, faculty physicians, and patients of the school's ambulatory clinics, totaling 1,231 interviewees. The focal group technique was used for the qualitative research; the quantitative research used a semi-structured questionnaire. 53.6% of the faculty physicians considered that they were rarely or never influenced by drug advertising, while 53.7% claimed their colleagues are. Among the students, 43.2% believe that, after graduating, they will rarely or never be influenced, while 42.0% believe that graduates are always or frequently influenced. For 41.7%, the information given by representatives of the pharmaceutical industry is good or excellent. Also, 74.8% reported that the pharmaceutical industry will be able to contribute to their professional practice. This study identified that the distribution of free drug samples is one of the main advertising and propaganda techniques used by the pharmaceutical industry, and that there is a certain pressure from the medical preceptor upon the choice of prescription, although no direct impact of pharmaceutical industry influence on the ambulatory clinics was observed. Drug prescription is usually not rational.

  10. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  11. Analysis of the differentially expressed low molecular weight peptides in human serum via an N-terminal isotope labeling technique combining nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Leng, Jiapeng; Zhu, Dong; Wu, Duojiao; Zhu, Tongyu; Zhao, Ningwei; Guo, Yinlong

    2012-11-15

    Peptidomics analysis of human serum is challenging due to the low abundance of serum peptides and interference from the complex matrix. This study analyzed the differentially expressed (DE) low molecular weight peptides in human serum integrating a DMPITC-based N-terminal isotope labeling technique with nano-liquid chromatography and matrix-assisted laser desorption/ionization mass spectrometry (nano-LC/MALDI-MS). The workflow introduced a [d(6)]-4,6-dimethoxypyrimidine-2-isothiocyanate (DMPITC)-labeled mixture of aliquots from test samples as the internal standard. The spiked [d(0)]-DMPITC-labeled samples were separated by nano-LC then spotted on the MALDI target. Both quantitative and qualitative studies for serum peptides were achieved based on the isotope-labeled peaks. The DMPITC labeling technique combined with nano-LC/MALDI-MS not only minimized the errors in peptide quantitation, but also allowed convenient recognition of the labeled peptides due to the 6 Da mass difference. The data showed that the entire research procedure as well as the subsequent data analysis method were effective, reproducible, and sensitive for the analysis of DE serum peptides. This study successfully established a research model for DE serum peptides using DMPITC-based N-terminal isotope labeling and nano-LC/MALDI-MS. Application of the DMPITC-based N-terminal labeling technique is expected to provide a promising tool for the investigation of peptides in vivo, especially for the analysis of DE peptides under different biological conditions. Copyright © 2012 John Wiley & Sons, Ltd.
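
    The pairing step that makes the labeled peptides easy to recognize can be sketched directly: scan a peak list for pairs of m/z values 6 Da apart and take their intensity ratio. Peak values here are invented for illustration, not data from the study:

```python
# d0/d6 pairing: DMPITC labeling gives each peptide a light/heavy partner
# 6 Da apart, so labeled peptides can be picked out of a peak list by
# that mass difference and quantified by the intensity ratio.

DELTA = 6.0  # d6 minus d0 label mass, in Da

def find_pairs(peaks, tol=0.01):
    """peaks: list of (m/z, intensity); returns (d0 m/z, d0/d6 intensity ratio)."""
    pairs = []
    for mz0, i0 in peaks:
        for mz6, i6 in peaks:
            if abs((mz6 - mz0) - DELTA) <= tol:
                pairs.append((mz0, i0 / i6))
    return pairs

peaks = [(1000.50, 8.0e4), (1006.50, 4.0e4), (1200.00, 1.0e5)]
print(find_pairs(peaks))   # one labeled pair, 2:1 ratio -> [(1000.5, 2.0)]
```

    The unpaired peak at m/z 1200 is ignored, which is how the labeling strategy suppresses matrix interference in quantitation.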

  12. Visual salience metrics for image inpainting

    NASA Astrophysics Data System (ADS)

    Ardis, Paul A.; Singhal, Amit

    2009-01-01

    Quantitative metrics for successful image inpainting currently do not exist, with researchers instead relying upon qualitative human comparisons to evaluate their methodologies and techniques. In an attempt to rectify this situation, we propose two new metrics to capture the notions of noticeability and visual intent in order to evaluate inpainting results. The proposed metrics use a quantitative measure of visual salience based upon a computational model of human visual attention. We demonstrate how these two metrics repeatably correlate with qualitative opinion in a human observer study, correctly identify the optimum uses for exemplar-based inpainting (as specified in the original publication), and match qualitative opinion in published examples.
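
    As a hypothetical simplification of such a metric, noticeability can be approximated as a salience-weighted reconstruction error, so that errors count more where viewers attend. The paper's actual metrics rest on a computational model of visual attention; this weighted mean only sketches the idea, with invented pixel values:

```python
# Salience-weighted error: weight each pixel's inpainting error by its
# visual salience, so the same error is penalized more in attended regions.

def weighted_error(original, inpainted, salience):
    num = sum(s * abs(a - b) for a, b, s in zip(original, inpainted, salience))
    den = sum(salience)
    return num / den

orig = [0.2, 0.8, 0.5, 0.9]
fill = [0.2, 0.6, 0.5, 0.9]          # reconstruction error only at pixel 1
low  = weighted_error(orig, fill, [1.0, 0.1, 1.0, 1.0])  # error in a low-salience spot
high = weighted_error(orig, fill, [0.1, 1.0, 0.1, 0.1])  # same error, high salience
print(low < high)   # -> True
```

    An inpainting result is "successful" under such a metric when its errors are confined to regions the attention model predicts viewers will ignore.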

  13. The quantitative and condition-dependent Escherichia coli proteome

    PubMed Central

    Schmidt, Alexander; Kochanowski, Karl; Vedelaar, Silke; Ahrné, Erik; Volkmer, Benjamin; Callipo, Luciano; Knoops, Kèvin; Bauer, Manuel; Aebersold, Ruedi; Heinemann, Matthias

    2016-01-01

    Measuring precise concentrations of proteins can provide insights into biological processes. Here, we use efficient protein extraction and sample fractionation and state-of-the-art quantitative mass spectrometry techniques to generate a comprehensive, condition-dependent protein abundance map of Escherichia coli. We measure cellular protein concentrations for 55% of predicted E. coli genes (>2300 proteins) under 22 different experimental conditions and identify methylation and N-terminal protein acetylations previously not known to be prevalent in bacteria. We uncover system-wide proteome allocation, expression regulation, and post-translational adaptations. These data provide a valuable resource for the systems biology and broader E. coli research communities. PMID:26641532

  14. Examination of China’s performance and thematic evolution in quantum cryptography research using quantitative and computational techniques

    PubMed Central

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China’s quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001–2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China’s QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China’s performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China’s performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China’s H-index (a normalized indicator) has surpassed all other countries’ over the last several years. The second phase of analysis shows how China’s main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China’s QC research themes, which can provide clarity into how QC technologies might take shape. 
QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research. PMID:29385151
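
    Of the performance indicators named in the study, the H-index has a compact definition worth stating precisely: the largest h such that at least h of an entity's publications have at least h citations each. A sketch with made-up citation counts:

```python
# H-index: sort citation counts in descending order and advance while the
# (h+1)-th paper still has at least h+1 citations.

def h_index(citations):
    counts = sorted(citations, reverse=True)
    h = 0
    while h < len(counts) and counts[h] >= h + 1:
        h += 1
    return h

print(h_index([6, 5, 3, 1, 1]))   # -> 3 (three papers with >= 3 citations)
print(h_index([10, 10, 10]))      # -> 3
print(h_index([0, 0]))            # -> 0
```

    Because it demands both many papers and many citations per paper, the H-index is the normalized indicator on which the study reports China surpassing all other countries.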

  15. Examination of China's performance and thematic evolution in quantum cryptography research using quantitative and computational techniques.

    PubMed

    Olijnyk, Nicholas V

    2018-01-01

    This study performed two phases of analysis to shed light on the performance and thematic evolution of China's quantum cryptography (QC) research. First, large-scale research publication metadata derived from QC research published from 2001-2017 was used to examine the research performance of China relative to that of global peers using established quantitative and qualitative measures. Second, this study identified the thematic evolution of China's QC research using co-word cluster network analysis, a computational science mapping technique. The results from the first phase indicate that over the past 17 years, China's performance has evolved dramatically, placing it in a leading position. Among the most significant findings is the exponential rate at which all of China's performance indicators (i.e., Publication Frequency, citation score, H-index) are growing. China's H-index (a normalized indicator) has surpassed all other countries' over the last several years. The second phase of analysis shows how China's main research focus has shifted among several QC themes, including quantum-key-distribution, photon-optical communication, network protocols, and quantum entanglement with an emphasis on applied research. Several themes were observed across time periods (e.g., photons, quantum-key-distribution, secret-messages, quantum-optics, quantum-signatures); some themes disappeared over time (e.g., computer-networks, attack-strategies, bell-state, polarization-state), while others emerged more recently (e.g., quantum-entanglement, decoy-state, unitary-operation). Findings from the first phase of analysis provide empirical evidence that China has emerged as the global driving force in QC. Considering China is the premier driving force in global QC research, findings from the second phase of analysis provide an understanding of China's QC research themes, which can provide clarity into how QC technologies might take shape. 
QC and science and technology policy researchers can also use these findings to trace previous research directions and plan future lines of research.

  16. Molecular biology of myopia.

    PubMed

    Schaeffel, Frank; Simon, Perikles; Feldkaemper, Marita; Ohngemach, Sibylle; Williams, Robert W

    2003-09-01

    Experiments in animal models of myopia have emphasised the importance of visual input in emmetropisation, but it is also evident that the development of human myopia is influenced to some degree by genetic factors. Molecular genetic approaches can help to identify both the genes involved in the control of ocular development and the potential targets for pharmacological intervention. This review covers a variety of techniques that are being used to study the molecular biology of myopia. In the first part, we describe techniques used to analyse visually induced changes in gene expression: Northern Blot, polymerase chain reaction (PCR) and real-time PCR to obtain semi-quantitative and quantitative measures of changes in transcription level of a known gene, differential display reverse transcription PCR (DD-RT-PCR) to search for new genes that are controlled by visual input, rapid amplification of 5' cDNA ends (5'-RACE) to extend the 5' end of sequences that are regulated by visual input, in situ hybridisation to localise the expression of a given gene in a tissue, and oligonucleotide microarray assays to simultaneously test visually induced changes in thousands of transcripts in single experiments. In the second part, we describe techniques that are used to localise regions in the genome that contain genes involved in the control of eye growth and refractive errors in mice and humans. These include quantitative trait loci (QTL) mapping, exploiting experimental test crosses of mice and transmission disequilibrium tests (TDT) in humans to find chromosomal intervals that harbour genes involved in myopia development. We review several successful applications of this battery of techniques in myopia research.
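
    The real-time PCR quantitation listed above is commonly reduced to relative expression via the 2^-ddCt (Livak) calculation. The review does not commit to a specific method, so the following is the standard textbook version with invented Ct values:

```python
# Livak 2^-ddCt method: normalize the target gene's threshold cycle (Ct)
# to a reference gene within each condition, then compare conditions.

def fold_change(ct_target_treated, ct_ref_treated, ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated   # normalize to reference gene
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2.0 ** (-dd_ct)

# Target gene crosses threshold 2 cycles earlier under treatment
# (relative to the reference gene): expression is up 4-fold.
print(fold_change(24.0, 20.0, 26.0, 20.0))   # -> 4.0
```

    The exponent base of 2 assumes ideal amplification efficiency (doubling per cycle), the usual textbook simplification.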

  17. A robust rotorcraft flight control system design methodology utilizing quantitative feedback theory

    NASA Technical Reports Server (NTRS)

    Gorder, Peter James

    1993-01-01

    Rotorcraft flight control systems present design challenges which often exceed those associated with fixed-wing aircraft. First, large variations in the response characteristics of the rotorcraft result from the wide range of airspeeds of typical operation (hover to over 100 kts). Second, the assumption of vehicle rigidity often employed in the design of fixed-wing flight control systems is rarely justified in rotorcraft, where rotor degrees of freedom can have a significant impact on the system performance and stability. This research was intended to develop a methodology for the design of robust rotorcraft flight control systems. Quantitative Feedback Theory (QFT) was chosen as the basis for the investigation. Quantitative Feedback Theory is a technique which accounts for variability in the dynamic response of the controlled element in the design of robust control systems. It was developed to address a Multiple-Input Single-Output (MISO) design problem, and utilizes two degrees of freedom to satisfy the design criteria. Two techniques were examined for extending the QFT MISO technique to the design of a Multiple-Input Multiple-Output (MIMO) flight control system (FCS) for a UH-60 Black Hawk Helicopter. In the first, a set of MISO systems, mathematically equivalent to the MIMO system, was determined. QFT was applied to each member of the set simultaneously. In the second, the same set of equivalent MISO systems was analyzed sequentially, with closed-loop response information from each loop utilized in subsequent MISO designs. The results of each technique were compared, and the advantages of the second, termed Sequential Loop Closure, were clearly evident.
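
    The core QFT idea the abstract describes, designing against a template of uncertain plants, can be illustrated with a toy example: evaluate the closed-loop magnitude spread over a family of plants at one frequency and see how loop gain compresses it. The first-order plant family and controller gains below are invented for illustration, not the UH-60 models from the study:

```python
import math

# QFT-flavored sketch: closed-loop magnitude over a plant template.

def closed_loop_mag(k_gain, pole, controller_gain, w):
    s = 1j * w
    g = k_gain / (s + pole)          # uncertain first-order plant G(s) = k/(s+p)
    l = controller_gain * g          # loop transmission L = C*G
    return abs(l / (1.0 + l))        # complementary sensitivity magnitude

def template_spread_db(plants, controller_gain, w):
    """Spread (in dB) of closed-loop magnitude across the plant template."""
    mags = [closed_loop_mag(k, p, controller_gain, w) for k, p in plants]
    return 20.0 * math.log10(max(mags) / min(mags))

# Corner cases of a plant family with uncertain gain and pole location.
plants = [(1.0, 1.0), (4.0, 1.0), (1.0, 3.0), (4.0, 3.0)]
loose = template_spread_db(plants, 1.0, 1.0)    # low loop gain: wide closed-loop spread
tight = template_spread_db(plants, 50.0, 1.0)   # high loop gain compresses the spread
print(tight < loose)   # -> True
```

    Keeping that spread inside a tolerance band at every design frequency, with no more loop gain than necessary, is the trade-off QFT loop shaping makes explicit.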

  18. Comparative research on activation technique for GaAs photocathodes

    NASA Astrophysics Data System (ADS)

    Chen, Liang; Qian, Yunsheng; Chang, Benkang; Chen, Xinlong; Yang, Rui

    2012-03-01

    The properties of GaAs photocathodes depend mainly on the material design and the activation technique. Earlier research showed that high-low temperature two-step activation yields higher quantum efficiency than high-temperature single-step activation, but the differences in surface barriers produced by the two activation techniques have not been well studied; consequently, the optimal activation temperature, Cs-O ratio and activation time for the two-step technique have not been established. Surface photovoltage spectroscopy (SPS) measured before activation depends only on bulk parameters of the GaAs photocathode, such as the electron diffusion length, whereas the spectral response current (SRC) measured after activation depends on both bulk parameters and surface barriers; the surface escape probability (SEP) can therefore be fitted by comparing the SPS before activation with the SRC after activation. By solving the Schrödinger equation for tunneling through the surface barriers, the widths and heights of surface barriers I and II can be fitted from the SEP curves. The fitted results were verified and analyzed by quantitative angle-dependent X-ray photoelectron spectroscopy (ADXPS), which also probes the surface chemical composition, atomic concentration percentages and layer thicknesses of GaAs photocathodes. This comparative method of fitting surface-barrier parameters from the SPS before activation and the SRC after activation offers a practical in-system, near-real-time approach to the study of activation techniques.

  19. Automated optimization techniques for aircraft synthesis

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1976-01-01

    Application of numerical optimization techniques to automated conceptual aircraft design is examined. These methods are shown to be a general and efficient way to obtain quantitative information for evaluating alternative new vehicle projects. Fully automated design is compared with traditional point design methods and time and resource requirements for automated design are given. The NASA Ames Research Center aircraft synthesis program (ACSYNT) is described with special attention to calculation of the weight of a vehicle to fly a specified mission. The ACSYNT procedures for automatically obtaining sensitivity of the design (aircraft weight, performance and cost) to various vehicle, mission, and material technology parameters are presented. Examples are used to demonstrate the efficient application of these techniques.

  20. Man-machine analysis of translation and work tasks of Skylab films

    NASA Technical Reports Server (NTRS)

    Hosler, W. W.; Boelter, J. G.; Morrow, J. R., Jr.; Jackson, J. T.

    1979-01-01

    Real-time film analysis provides an objective approach to determining the concurrent validity of computer-graphic models. This technique was illustrated through the procedures and results obtained in an evaluation of the translation of Skylab mission astronauts. The quantitative analysis was facilitated by the use of an electronic film analyzer, a minicomputer, and dedicated support software. The uses of this technique for human factors research are: (1) validation of theoretical operator models; (2) biokinetic analysis; (3) objective data evaluation; (4) dynamic anthropometry; (5) empirical time-line analysis; and (6) consideration of human variability. Computer-assisted techniques for interface design and evaluation have the potential to improve the capability for human factors engineering.

  1. Use of Mixed Methods Research in Research on Coronary Artery Disease, Diabetes Mellitus, and Hypertension: A Scoping Review.

    PubMed

    Campbell, David J T; Tam-Tham, Helen; Dhaliwal, Kirnvir K; Manns, Braden J; Hemmelgarn, Brenda R; Sanmartin, Claudia; King-Shier, Kathryn

    2017-01-01

    Mixed methods research, the use of both qualitative and quantitative methods within 1 program of study, is becoming increasingly popular to allow investigators to explore patient experiences (qualitative) and also measure outcomes (quantitative). Coronary artery disease and its risk factors are some of the most studied conditions; however, the extent to which mixed methods studies are being conducted in these content areas is unknown. We sought to comprehensively describe the characteristics of published mixed methods studies on coronary artery disease and major risk factors (diabetes mellitus and hypertension). We conducted a scoping review of the literature indexed in PubMed, Medline, EMBASE, and CINAHL. We identified 811 abstracts for screening, of which 254 articles underwent full-text review and 97 reports of 81 studies met criteria for inclusion. The majority of studies in this area were conducted in the past 10 years by nurse researchers from the United States and United Kingdom. Diabetes mellitus was the most common content area for mixed methods investigation (compared with coronary artery disease and hypertension). Most authors described their rationale for using mixed methods as complementarity and did not describe study priority or how they reconciled differences in methodological paradigms. Some mixed methods study designs were more commonly used than others, including concurrent timing and integration at the interpretation stage. Qualitative strands were most commonly descriptive studies using interviews for data collection. Quantitative strands were most commonly cross-sectional observational studies, which relied heavily on self-report data such as surveys and scales. Although mixed methods research is becoming increasingly popular in the area of coronary artery disease and its risk factors, many of the more advanced mixed methods, qualitative, and quantitative techniques have not been commonly used in these areas. 
© 2016 American Heart Association, Inc.

  2. A novel post-weld-shift measurement and compensation technique in butterfly-type laser module packages

    NASA Astrophysics Data System (ADS)

    Hsu, Yi-Cheng, Sr.; Tsai, Y. C.; Hung, Y. S.; Cheng, W. H.

    2005-08-01

    One of the greatest challenges in packaging laser modules with the laser welding technique is achieving a reliable and accurate joining process. During welding, differences in material properties between the welded components, together with the rapid solidification of the weld region and the associated material shrinkage, often introduce a post-weld shift (PWS) between the components. For a typical single-mode fiber application, a PWS-induced fiber alignment shift of even a few micrometers can cause a 50% or greater loss in coupled power, so the PWS effect has a significant impact on laser module package yield. A detailed understanding of how PWS affects fiber alignment in laser-welded laser module packages, and the subsequent compensation of those alignment shifts, are therefore key research subjects in laser welding for optoelectronic packaging. Previously, power losses due to PWS in butterfly-type laser module packages have been corrected only qualitatively, by applying the laser hammering technique in the direction of the detected shift: an elastic deformation is applied to the welded components, and the direction and magnitude of the PWS are inferred from the corresponding power variation. Despite numerous studies on improving fabrication yields through such qualitative PWS correction, little information is available for a quantitative understanding of the PWS-induced fiber alignment shift, which would be useful in designing and fabricating high-yield, high-performance laser module packages. 
The purpose of this paper is to present a quantitative probing of the PWS-induced fiber alignment shift in laser-welded butterfly-type laser module packaging, using a novel high-magnification camera with image capture system (HMCICS). The HMCICS technique makes it possible to quantitatively measure and compensate the direction and magnitude of the PWS during packaging of laser-welded laser modules, and to probe the nonlinear behavior of the PWS, yielding real-time quantitative compensation of the PWS in butterfly-type laser module packages rather than the qualitative estimates of currently available correction techniques. Reliable, high-yield, high-performance butterfly-type laser modules for lightwave transmission systems may thus be developed and fabricated.

  3. Discrimination and Measurements of Three Flavonols with Similar Structure Using Terahertz Spectroscopy and Chemometrics

    NASA Astrophysics Data System (ADS)

    Yan, Ling; Liu, Changhong; Qu, Hao; Liu, Wei; Zhang, Yan; Yang, Jianbo; Zheng, Lei

    2018-03-01

    The terahertz (THz) technique, a recently developed spectral method, has been investigated and applied for rapid discrimination and measurement of food compositions owing to its low-energy and non-ionizing characteristics. In this study, THz spectroscopy combined with chemometrics was utilized for qualitative and quantitative analysis of myricetin, quercetin, and kaempferol at concentrations of 0.025, 0.05, and 0.1 mg/mL. The qualitative discrimination was achieved by KNN, ELM, and RF models with spectral pre-treatments; an excellent discrimination (100% CCR in the prediction set) was achieved using the RF model. Furthermore, the quantitative analyses were performed by partial least squares regression (PLSR) and least squares support vector machine (LS-SVM). Compared with the PLSR models, LS-SVM yielded better results, with lower RMSEP (0.0044, 0.0039, and 0.0048), higher Rp (0.9601, 0.9688, and 0.9359), and higher RPD (8.6272, 9.6333, and 7.9083) for myricetin, quercetin, and kaempferol, respectively. Our results demonstrate that THz spectroscopy is a powerful tool for identifying three flavonols with similar chemical structures and quantitatively determining their concentrations.
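The figures of merit quoted above (RMSEP, Rp, RPD) are standard chemometric statistics: root-mean-square error of prediction, Pearson correlation on the prediction set, and the ratio of the reference set's standard deviation to the RMSEP. A minimal sketch with toy data (the numbers are illustrative, not the paper's):

```python
import numpy as np

def prediction_metrics(y_ref, y_pred):
    """Chemometric figures of merit for a prediction set:
    RMSEP, Pearson correlation Rp, and RPD = SD(reference)/RMSEP."""
    y_ref, y_pred = np.asarray(y_ref, float), np.asarray(y_pred, float)
    rmsep = np.sqrt(np.mean((y_ref - y_pred) ** 2))
    rp = np.corrcoef(y_ref, y_pred)[0, 1]
    rpd = np.std(y_ref, ddof=1) / rmsep
    return rmsep, rp, rpd

# toy example: reference concentrations (mg/mL) vs. model predictions
ref  = [0.025, 0.05, 0.1, 0.025, 0.05, 0.1]
pred = [0.027, 0.048, 0.097, 0.024, 0.053, 0.101]
rmsep, rp, rpd = prediction_metrics(ref, pred)
print(f"RMSEP={rmsep:.4f}, Rp={rp:.4f}, RPD={rpd:.2f}")
```

An RPD well above ~3, as reported in the abstract, is conventionally read as a calibration suitable for quantitative work.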

  4. [The validation of kit of reagents for quantitative detection of DNA of human cytomegalovirus in biological material using polymerase chain reaction technique in real time operation mode].

    PubMed

    Sil'veĭstrova, O Iu; Domonova, É A; Shipulina, O Iu

    2014-04-01

    A kit of reagents intended for the detection and quantitative evaluation of human cytomegalovirus (HCMV) DNA in biological material by real-time polymerase chain reaction was validated. The kit of reagents "AmpliSens CMV-screen/monitor-FL" was compared against the first WHO international standard for human cytomegalovirus, using the in-house reference sample of HCMV DNA of the Central Research Institute of Epidemiology of Rospotrebnadzor. Five-fold dilutions of the WHO international standard and of the in-house reference sample were prepared at HCMV DNA concentrations from 10^6 to 10^2. PCR setup and analysis of the results were performed on a programmable thermocycler with real-time fluorescence detection, "Rotor-Gene Q" ("Qiagen", Germany). On the basis of three series of experiments covering all stages of the PCR study, a conversion coefficient of 0.6 was introduced for this kit of reagents to translate quantitative estimates of HCMV DNA from copies/ml to IU/ml.
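Quantification against a dilution series such as the WHO standard rests on a log-linear standard curve relating the cycle threshold (Ct) to log10 concentration; a minimal sketch with hypothetical dilution-series values (not the kit's actual calibration data):

```python
import numpy as np

# Hypothetical dilution series of a quantification standard:
# log10 concentration vs. measured cycle threshold.
log_conc = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
ct       = np.array([18.1, 21.4, 24.8, 28.2, 31.5])

# Fit Ct = slope * log10(conc) + intercept; a slope near -3.32
# corresponds to ~100% PCR efficiency, E = 10^(-1/slope) - 1.
slope, intercept = np.polyfit(log_conc, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(ct_sample):
    """Read an unknown sample's concentration off the standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"sample at Ct 25 -> {quantify(25.0):.2e}")
```

A conversion coefficient such as the 0.6 reported above would then be applied as a simple multiplier to re-express the copies/ml result in IU/ml.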

  5. Susceptibility-Weighted Imaging and Quantitative Susceptibility Mapping in the Brain

    PubMed Central

    Liu, Chunlei; Li, Wei; Tong, Karen A.; Yeom, Kristen W.; Kuzminski, Samuel

    2015-01-01

    Susceptibility-weighted imaging (SWI) is a magnetic resonance imaging (MRI) technique that enhances image contrast by using the susceptibility differences between tissues. It is created by combining both magnitude and phase in the gradient echo data. SWI is sensitive to both paramagnetic and diamagnetic substances, which generate different phase shifts in MRI data. SWI images can be displayed as a minimum intensity projection that provides high-resolution delineation of the cerebral venous architecture, a feature that is not available in other MRI techniques. As such, SWI has been widely applied to diagnose various venous abnormalities. SWI is especially sensitive to deoxygenated blood and intracranial mineral deposition and, for that reason, has been applied to image various pathologies including intracranial hemorrhage, traumatic brain injury, stroke, neoplasm, and multiple sclerosis. SWI, however, does not provide quantitative measures of magnetic susceptibility. This limitation is currently being addressed with the development of quantitative susceptibility mapping (QSM) and susceptibility tensor imaging (STI). While QSM treats susceptibility as isotropic, STI treats susceptibility as generally anisotropic, characterized by a tensor quantity. This article reviews the basic principles of SWI, its clinical and research applications, the mechanisms governing brain susceptibility properties, and its practical implementation, with a focus on brain imaging. PMID:25270052
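The magnitude-phase combination in SWI is commonly implemented as a negative phase mask raised to an integer power and multiplied into the magnitude image; a minimal sketch of this textbook formulation (the exponent and test values are illustrative, not any vendor's implementation):

```python
import numpy as np

def swi(magnitude, phase, n=4):
    """Combine gradient-echo magnitude and (high-pass filtered) phase
    into an SWI image using a negative phase mask raised to the n-th
    power: voxels with phase in [-pi, 0) are attenuated, others kept."""
    mask = np.where(phase < 0, (np.pi + phase) / np.pi, 1.0)
    return magnitude * mask ** n

mag = np.ones((2, 2))
phs = np.array([[0.0, -np.pi / 2],
                [-np.pi, 0.5]])
print(swi(mag, phs))
```

Raising the mask to a power (n around 4 in the literature) sharpens the suppression of strongly paramagnetic voxels such as veins while leaving zero-phase tissue untouched.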

  6. Cells and Stripes: A novel quantitative photo-manipulation technique

    PubMed Central

    Mistrik, Martin; Vesela, Eva; Furst, Tomas; Hanzlikova, Hana; Frydrych, Ivo; Gursky, Jan; Majera, Dusana; Bartek, Jiri

    2016-01-01

    Laser micro-irradiation is a technology widely used in the DNA damage response, checkpoint signaling, chromatin remodeling and related research fields, to assess chromatin modifications and recruitment of diverse DNA damage sensors, mediators and repair proteins to sites of DNA lesions. While this approach has aided numerous discoveries related to cell biology, maintenance of genome integrity, aging and cancer, it has so far been limited by a tedious manual definition of laser-irradiated subcellular regions, with the ensuing restriction to only a small number of cells treated and analyzed in a single experiment. Here, we present an improved and versatile alternative to the micro-irradiation approach: Quantitative analysis of photo-manipulated samples using innovative settings of standard laser-scanning microscopes. Up to 200 cells are simultaneously exposed to a laser beam in a defined pattern of collinear rays. The induced striation pattern is then automatically evaluated by a simple algorithm, which provides a quantitative assessment of various laser-induced phenotypes in live or fixed cells. Overall, this new approach represents a more robust alternative to existing techniques, and provides a versatile tool for a wide range of applications in biomedicine. PMID:26777522

  7. Spectral Analysis and Experimental Modeling of Ice Accretion Roughness

    NASA Technical Reports Server (NTRS)

    Orr, D. J.; Breuer, K. S.; Torres, B. E.; Hansman, R. J., Jr.

    1996-01-01

    A self-consistent scheme for relating wind tunnel ice accretion roughness to the resulting enhancement of heat transfer is described. First, a spectral technique of quantitative analysis of early ice roughness images is reviewed. The image processing scheme uses a spectral estimation technique (SET) which extracts physically descriptive parameters by comparing scan lines from the experimentally obtained accretion images to a prescribed test function. Analysis using this technique in both the streamwise and spanwise directions of data from the NASA Lewis Icing Research Tunnel (IRT) is presented. An experimental technique is then presented for constructing physical roughness models suitable for wind tunnel testing that match the SET parameters extracted from the IRT images. The icing castings and modeled roughness are tested for enhancement of boundary layer heat transfer using infrared techniques in a "dry" wind tunnel.
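A generic version of the spectral analysis of a roughness scan line can be sketched with a one-sided FFT power spectrum (a plain spectral estimate for illustration; the paper's SET compares scan lines against a prescribed test function rather than using a raw FFT):

```python
import numpy as np

def power_spectrum(scan_line, dx):
    """One-sided power spectrum of a roughness scan line.
    dx is the spatial sampling interval (e.g. mm per pixel)."""
    h = np.asarray(scan_line, float)
    h = h - h.mean()                       # remove the mean height
    H = np.fft.rfft(h)
    freqs = np.fft.rfftfreq(len(h), d=dx)  # spatial frequency (1/mm)
    power = np.abs(H) ** 2 / len(h)
    return freqs, power

# synthetic roughness: a dominant ripple with 0.5 mm wavelength
dx = 0.01                                  # mm per sample
x = np.arange(0, 10, dx)
line = 0.05 * np.sin(2 * np.pi * x / 0.5)
freqs, power = power_spectrum(line, dx)
print(f"dominant wavelength = {1.0 / freqs[power.argmax()]:.2f} mm")
```

The location and width of the spectral peak play the role of the "physically descriptive parameters" (dominant roughness wavelength and its spread) that such an analysis extracts from each scan line.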

  8. 3D Slicer as an Image Computing Platform for the Quantitative Imaging Network

    PubMed Central

    Fedorov, Andriy; Beichel, Reinhard; Kalpathy-Cramer, Jayashree; Finet, Julien; Fillion-Robin, Jean-Christophe; Pujol, Sonia; Bauer, Christian; Jennings, Dominique; Fennessy, Fiona; Sonka, Milan; Buatti, John; Aylward, Stephen; Miller, James V.; Pieper, Steve; Kikinis, Ron

    2012-01-01

    Quantitative analysis has tremendous but mostly unrealized potential in healthcare to support objective and accurate interpretation of the clinical imaging. In 2008, the National Cancer Institute began building the Quantitative Imaging Network (QIN) initiative with the goal of advancing quantitative imaging in the context of personalized therapy and evaluation of treatment response. Computerized analysis is an important component contributing to reproducibility and efficiency of the quantitative imaging techniques. The success of quantitative imaging is contingent on robust analysis methods and software tools to bring these methods from bench to bedside. 3D Slicer is a free open source software application for medical image computing. As a clinical research tool, 3D Slicer is similar to a radiology workstation that supports versatile visualizations but also provides advanced functionality such as automated segmentation and registration for a variety of application domains. Unlike a typical radiology workstation, 3D Slicer is free and is not tied to specific hardware. As a programming platform, 3D Slicer facilitates translation and evaluation of the new quantitative methods by allowing the biomedical researcher to focus on the implementation of the algorithm, and providing abstractions for the common tasks of data communication, visualization and user interface development. Compared to other tools that provide aspects of this functionality, 3D Slicer is fully open source and can be readily extended and redistributed. In addition, 3D Slicer is designed to facilitate the development of new functionality in the form of 3D Slicer extensions. In this paper, we present an overview of 3D Slicer as a platform for prototyping, development and evaluation of image analysis tools for clinical research applications. 
To illustrate the utility of the platform in the scope of QIN, we discuss several use cases of 3D Slicer by the existing QIN teams, and we elaborate on the future directions that can further facilitate development and validation of imaging biomarkers using 3D Slicer. PMID:22770690

  9. But Are They Learning? Getting Started in Classroom Evaluation

    PubMed Central

    Dancy, Melissa H; Beichner, Robert J

    2002-01-01

    There are increasing numbers of traditional biologists, untrained in educational research methods, who want to develop and assess new classroom innovations. In this article we argue the necessity of formal research over normal classroom feedback. We also argue that traditionally trained biologists can make significant contributions to biology pedagogy. We then offer some guidance to the biologist with no formal educational research training who wants to get started. Specifically, we suggest ways to find out what others have done, we discuss the difference between qualitative and quantitative research, and we elaborate on the process of gaining insights from student interviews. We end with an example of a project that has used many different research techniques. PMID:12459792

  10. Current perspectives of CASA applications in diverse mammalian spermatozoa.

    PubMed

    van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S

    2018-03-26

    Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.

  11. Optical Fiber Sensors for Advanced Civil Structures

    NASA Astrophysics Data System (ADS)

    de Vries, Marten Johannes Cornelius

    1995-01-01

    The objective of this dissertation is to develop, analyze, and implement optical fiber-based sensors for the nondestructive quantitative evaluation of advanced civil structures. Based on a comparative evaluation of optical fiber sensors that may be used to obtain quantitative information related to physical perturbations in the civil structure, the extrinsic Fabry-Perot interferometric (EFPI) optical fiber sensor is selected as the most attractive sensor. The operation of the EFPI sensor is explained using the Kirchhoff diffraction approach. As is shown in this dissertation, this approach better predicts the signal-to-noise ratio as a function of gap length than methods employed previously. The performance of the optical fiber sensor is demonstrated in three different implementations. In the first implementation, performed with researchers in the Civil Engineering Department at the University of Southern California in Los Angeles, optical fiber sensors were used to obtain quantitative strain information from reinforced concrete interior and exterior column-to-beam connections. The second implementation, performed in cooperation with researchers at the United States Bureau of Mines in Spokane, Washington, used optical fiber sensors to monitor the performance of roof bolts used in mines. The last implementation, performed in cooperation with researchers at the Turner-Fairbanks Federal Highway Administration Research Center in McLean, Virginia, used optical fiber sensors, attached to composite prestressing strands used for reinforcing concrete, to obtain absolute strain information. Multiplexing techniques including time, frequency and wavelength division multiplexing are briefly discussed, whereas the principles of operation of spread spectrum and optical time domain reflectometry (OTDR) are discussed in greater detail. Results demonstrating that spread spectrum and OTDR techniques can be used to multiplex optical fiber sensors are presented. 
Finally, practical considerations that have to be taken into account when implementing optical fiber sensors into a civil structure environment are discussed, and possible solutions to some of these problems are proposed.
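In the two-beam approximation, the EFPI output intensity varies cosinusoidally with the gap length, one full fringe per half-wavelength of gap change; the Kirchhoff-diffraction treatment discussed in the dissertation additionally predicts how fringe visibility degrades as the gap grows. A minimal sketch of the two-beam model (wavelength and gap values are illustrative):

```python
import numpy as np

def efpi_intensity(gap_um, wavelength_um=1.55, visibility=1.0):
    """Two-beam approximation of the EFPI transfer function:
    I/I0 = 1 + V * cos(4*pi*g/lambda), where g is the air-gap length.
    A constant V is assumed here; diffraction makes V fall with g."""
    return 1.0 + visibility * np.cos(4 * np.pi * gap_um / wavelength_um)

# one interference fringe corresponds to a gap change of lambda/2
g = np.linspace(50.0, 50.0 + 1.55 / 2, 5)   # micrometres
print(np.round(efpi_intensity(g), 3))
```

Counting fringes (or tracking the fractional phase) as the structure strains is what converts this intensity signal into the absolute strain information described above.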

  12. Visible-Light Actinometry and Intermittent Illumination as Convenient Tools to Study Ru(bpy)3Cl2 Mediated Photoredox Transformations

    PubMed Central

    Pitre, Spencer P.; McTiernan, Christopher D.; Vine, Wyatt; DiPucchio, Rebecca; Grenier, Michel; Scaiano, Juan C.

    2015-01-01

    Photoredox catalysis provides many green opportunities for radical-mediated synthetic transformations. However, the determination of the underlying mechanisms has been challenging due to lack of quantitative methods that can be easily implemented in synthetic labs, where this research tends to be centered. We report here on the development, characterization and calibration of a novel actinometer based on the photocatalyst tris(2,2′-bipyridyl)ruthenium(II) chloride (Ru(bpy)3Cl2). By using the same molecule as the photocatalyst and the actinometer, we eliminate problems associated with matching sample spectral distribution, lamp-sample spectral overlap and other problems intrinsic to doing quantitative photochemistry in a laboratory that has little expertise in this area. In order to validate our actinometer system in determining the quantum yield of a Ru(bpy)3Cl2 photosensitized reaction, we test the Ru(bpy)3Cl2 catalyzed oxidation of benzhydrol to benzophenone as a model chain reaction. We also revive the rotating sector method by updating the technique for modern LED technologies and demonstrate how intermittent illumination on the timescale of milliseconds to seconds can help probe a chain reaction, using the benzhydrol to benzophenone oxidation to validate the technique. We envision these methods to have great implications in the field of photoredox catalysis, providing researchers with valuable research tools. PMID:26578341
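The actinometry bookkeeping behind such quantum-yield measurements is simple: the actinometer's known quantum yield converts its measured conversion into a photon flux, which then calibrates the quantum yield of the reaction under study. A minimal sketch with hypothetical numbers (not the paper's data):

```python
def photon_flux(mol_converted, phi_actinometer, t_s):
    """Photon flux (einstein/s) entering the cell, inferred from the
    actinometer's conversion and its known quantum yield."""
    return mol_converted / (phi_actinometer * t_s)

def quantum_yield(mol_product, flux, t_s, frac_absorbed=1.0):
    """Quantum yield = molecules converted per photon absorbed."""
    return mol_product / (flux * t_s * frac_absorbed)

# hypothetical run: actinometer with phi = 0.1 converts 2 umol in 10 min
flux = photon_flux(mol_converted=2.0e-6, phi_actinometer=0.1, t_s=600.0)
phi = quantum_yield(mol_product=5.0e-5, flux=flux, t_s=600.0)
print(f"flux = {flux:.2e} einstein/s, quantum yield = {phi:.1f}")
```

A quantum yield above unity, as in this toy example, is the signature of a radical chain process, which is exactly what the intermittent-illumination (rotating sector) method described above is used to probe.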

  13. Real-time quantitative fluorescence imaging using a single snapshot optical properties technique for neurosurgical guidance

    NASA Astrophysics Data System (ADS)

    Valdes, Pablo A.; Angelo, Joseph; Gioux, Sylvain

    2015-03-01

    Fluorescence imaging has shown promise as an adjunct to improve the extent of resection in neurosurgery and oncologic surgery. Nevertheless, current fluorescence imaging techniques do not account for the heterogeneous attenuation effects of tissue optical properties. In this work, we present a novel imaging system that performs real time quantitative fluorescence imaging using Single Snapshot Optical Properties (SSOP) imaging. We developed the technique and performed initial phantom studies to validate the quantitative capabilities of the system for intraoperative feasibility. Overall, this work introduces a novel real-time quantitative fluorescence imaging method capable of being used intraoperatively for neurosurgical guidance.

  14. A Checklist for Successful Quantitative Live Cell Imaging in Systems Biology

    PubMed Central

    Sung, Myong-Hee

    2013-01-01

    Mathematical modeling of signaling and gene regulatory networks has provided unique insights about systems behaviors for many cell biological problems of medical importance. Quantitative single cell monitoring has a crucial role in advancing systems modeling of molecular networks. However, due to the multidisciplinary techniques that are necessary for adaptation of such systems biology approaches, dissemination to a wide research community has been relatively slow. In this essay, I focus on some technical aspects that are often under-appreciated, yet critical in harnessing live cell imaging methods to achieve single-cell-level understanding and quantitative modeling of molecular networks. The importance of these technical considerations will be elaborated with examples of successes and shortcomings. Future efforts will benefit by avoiding some pitfalls and by utilizing the lessons collectively learned from recent applications of imaging in systems biology. PMID:24709701

  15. A standardized kit for automated quantitative assessment of candidate protein biomarkers in human plasma.

    PubMed

    Percy, Andrew J; Mohammed, Yassene; Yang, Juncong; Borchers, Christoph H

    2015-12-01

    An increasingly popular mass spectrometry-based quantitative approach for health-related research in the biomedical field involves the use of stable isotope-labeled standards (SIS) and multiple/selected reaction monitoring (MRM/SRM). To improve inter-laboratory precision and enable more widespread use of this 'absolute' quantitative technique in disease-biomarker assessment studies, methods must be standardized. Results/methodology: Using this MRM-with-SIS-peptide approach, we developed an automated method (encompassing sample preparation, processing and analysis) for quantifying 76 candidate protein markers (spanning >4 orders of magnitude in concentration) in neat human plasma. The assembled biomarker assessment kit - the 'BAK-76' - contains the essential materials (SIS mixes), methods (for acquisition and analysis), and tools (Qualis-SIS software) for performing biomarker discovery or verification studies in a rapid and standardized manner.

  16. Gravitational Effects on Flow Instability and Transition in Low Density Jets

    NASA Technical Reports Server (NTRS)

    Agrawal, Ajay K.; Parthasarathy, Ramkumar

    2004-01-01

    Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive line of sight measurement technique suited for the microgravity environment. The flow structure was characterized by distributions of helium mole fraction obtained from color schlieren images taken at 60 Hz. Results show that the jet in microgravity was up to 70 percent wider than that in Earth gravity. Experiments reveal that the global flow oscillations observed in Earth gravity are absent in microgravity. The report provides quantitative details of flow evolution as the experiment undergoes change in gravity in the drop tower.

  17. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well in accurately quantifying partially labeled proteomes at large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
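The isotope pattern matching at the heart of such quantification can be illustrated with a simple similarity score between a measured isotope envelope and the theoretical one (a generic cosine score for illustration; pyQms's own quality metric is more elaborate and also weighs m/z accuracy):

```python
import numpy as np

def pattern_match_score(measured, theoretical):
    """Cosine similarity between measured and theoretical isotope
    envelope intensities; 1.0 means a perfect relative-intensity match."""
    m = np.asarray(measured, float)
    t = np.asarray(theoretical, float)
    return float(np.dot(m, t) / (np.linalg.norm(m) * np.linalg.norm(t)))

# theoretical envelope for a small analyte (relative intensities)
theo = [1.00, 0.55, 0.20, 0.05]
good = [0.98, 0.57, 0.19, 0.06]   # clean signal -> score near 1
bad  = [0.10, 0.90, 0.80, 0.40]   # co-eluting interference -> low score
print(round(pattern_match_score(good, theo), 4))
print(round(pattern_match_score(bad, theo), 4))
```

Scoring every candidate envelope this way is what lets a universal tool both quantify a signal and flag quantifications that are distorted by interference or mislabeling.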

  18. The horizontal working mobility of employees with garment technique educational background

    NASA Astrophysics Data System (ADS)

    Supraptono, Eko; Sudana, I. Made; Rini, Sri Hastuti Eko

    2018-03-01

    The purposes of this report are: 1) to determine the extent of working mobility among garment employees, and 2) to analyse the factors that cause working mobility and the new working orientations sought by garment employees. This research uses both qualitative and quantitative approaches. Informants were selected purposively. Data were collected through observation, interviews and documentation, and analysed using descriptive qualitative analysis of each aspect. The results show that labour migration was high, as observed at the Ungaran Sari Garment Company, with migration periods of between one and six months; the new jobs sought by employees were vacancies appropriate to their competence. Factors influencing working mobility include the workers' attitudes and the company's management system. The orientation in seeking a new job is to feel comfortable while working.

  19. Methods to study Drosophila immunity.

    PubMed

    Neyen, Claudine; Bretscher, Andrew J; Binggeli, Olivier; Lemaitre, Bruno

    2014-06-15

    Innate immune mechanisms are well conserved throughout evolution, and many theoretical concepts, molecular pathways and gene networks are applicable to invertebrate model organisms as much as vertebrate ones. Drosophila immunity research benefits from an easily manipulated genome, a fantastic international resource of transgenic tools and over a quarter century of accumulated techniques and approaches to study innate immunity. Here we present a short collection of ways to challenge the fruit fly immune system with various pathogens and parasites, as well as read-outs to assess its functions, including cellular and humoral immune responses. Our review covers techniques for assessing the kinetics and efficiency of immune responses quantitatively and qualitatively, such as survival analysis, bacterial persistence, antimicrobial peptide gene expression, phagocytosis and melanisation assays. Finally, we offer a toolkit of Drosophila strains available to the research community for current and future research. Copyright © 2014 Elsevier Inc. All rights reserved.
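
    Survival analysis, the first read-out listed above, is typically summarized with a Kaplan-Meier curve of fly survival after infection. A minimal estimator (assuming no censoring, with invented death times) can be sketched as:

```python
# Minimal Kaplan-Meier survival estimate, the kind of analysis used to
# compare fly survival after bacterial infection. Assumes no censoring;
# the data values are invented for illustration.

def kaplan_meier(death_times, n_at_start):
    """Return (time, survival fraction) pairs for the observed death times."""
    survival = 1.0
    at_risk = n_at_start
    curve = []
    for t in sorted(set(death_times)):
        deaths = death_times.count(t)
        survival *= (at_risk - deaths) / at_risk
        at_risk -= deaths
        curve.append((t, survival))
    return curve

# Hours post-infection at which 10 infected flies died (illustrative).
deaths = [24, 24, 36, 36, 36, 48, 48, 60, 72, 72]
for t, s in kaplan_meier(deaths, n_at_start=10):
    print(t, round(s, 2))
```

    In practice, curves for infected versus mock-infected cohorts would be compared with a log-rank test, for which dedicated statistics packages are the usual choice.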

  20. Systems Biology, Neuroimaging, Neuropsychology, Neuroconnectivity and Traumatic Brain Injury

    PubMed Central

    Bigler, Erin D.

    2016-01-01

    The patient who sustains a traumatic brain injury (TBI) typically undergoes neuroimaging studies, usually in the form of computed tomography (CT) and magnetic resonance imaging (MRI). In most cases the neuroimaging findings are clinically assessed with descriptive statements that provide qualitative information about the presence/absence of visually identifiable abnormalities; though little if any of the potential information in a scan is analyzed in any quantitative manner, except in research settings. Fortunately, major advances have been made, especially during the last decade, in regards to image quantification techniques, especially those that involve automated image analysis methods. This review argues that a systems biology approach to understanding quantitative neuroimaging findings in TBI provides an appropriate framework for better utilizing the information derived from quantitative neuroimaging and its relation with neuropsychological outcome. Different image analysis methods are reviewed in an attempt to integrate quantitative neuroimaging methods with neuropsychological outcome measures and to illustrate how different neuroimaging techniques tap different aspects of TBI-related neuropathology. Likewise, how different neuropathologies may relate to neuropsychological outcome is explored by examining how damage influences brain connectivity and neural networks. Emphasis is placed on the dynamic changes that occur following TBI and how best to capture those pathologies via different neuroimaging methods. However, traditional clinical neuropsychological techniques are not well suited for interpretation based on contemporary and advanced neuroimaging methods and network analyses. Significant improvements need to be made in the cognitive and behavioral assessment of the brain injured individual to better interface with advances in neuroimaging-based network analyses. 
Viewing both neuroimaging and neuropsychological processes within a systems biology perspective could represent a significant advancement for the field. PMID:27555810

  1. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
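
    Two of the techniques named above are simple enough to sketch directly. Univariate image differencing subtracts one band at a time between dates; change-vector analysis treats each pixel's multi-band values as a vector and examines the magnitude (amount of change) and direction (nature of change) of the between-date difference. The toy two-band scene below is invented; real TM processing adds radiometric calibration and careful threshold selection.

```python
# Sketch of univariate image differencing and change-vector analysis on a
# toy two-band, 3x3 "scene". Band values and the threshold are illustrative.
import numpy as np

# Two dates of a 2-band scene (e.g., red and near-infrared; toy values).
date1 = np.array([[[10, 40], [12, 42], [11, 41]],
                  [[10, 40], [30, 15], [11, 41]],
                  [[10, 40], [12, 42], [11, 41]]], dtype=float)
date2 = date1.copy()
date2[1, 1] = [12, 45]   # one pixel "greens up" between dates

# Univariate image differencing: per-band difference, one band at a time.
diff_band0 = date2[..., 0] - date1[..., 0]

# Change-vector analysis: magnitude of the multi-band change vector per
# pixel; the vector's direction characterizes the nature of the change.
magnitude = np.linalg.norm(date2 - date1, axis=-1)
changed = magnitude > 5.0    # threshold chosen by the analyst
print(changed.sum())         # 1 changed pixel
```

    The same difference vectors can be binned by direction to label the kind of change (e.g., vegetation gain versus loss), which is the source of the rich qualitative detail noted above.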

  2. Time-resolved quantitative-phase microscopy of laser-material interactions using a wavefront sensor.

    PubMed

    Gallais, Laurent; Monneret, Serge

    2016-07-15

    We report on a simple and efficient technique based on a wavefront sensor to obtain time-resolved amplitude and phase images of laser-material interactions. The main interest of the technique is to obtain quantitative self-calibrated phase measurements in one shot at the femtosecond time-scale, with high spatial resolution. The technique is used for direct observation and quantitative measurement of the Kerr effect in a fused silica substrate and free electron generation by photo-ionization processes in an optical coating.

  3. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state of the art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.

  4. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    PubMed

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  5. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    PubMed

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  6. Probing myocardium biomechanics using quantitative optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.

    2015-03-01

    We present a quantitative optical coherence elastographic method for noncontact assessment of myocardium elasticity. The method is based on shear wave imaging optical coherence tomography (SWI-OCT), where a focused air-puff system is used to induce localized tissue deformation through a low-pressure, short-duration air stream, and a phase-sensitive OCT system is utilized to monitor the propagation of the induced tissue displacement with nanoscale sensitivity. The 1-D scanning of M-mode OCT imaging and the application of optical phase retrieval and mapping techniques enable the reconstruction and visualization of 2-D depth-resolved shear wave propagation in tissue with ultra-high frame rate. The feasibility of this method for quantitative elasticity measurement is demonstrated on tissue-mimicking phantoms, with the estimated Young's modulus compared against uniaxial compression tests. We also performed pilot experiments on ex vivo mouse cardiac muscle tissues with normal and genetically altered cardiomyocytes. Our results indicate this noncontact quantitative optical coherence elastographic method can be a useful tool for cardiac muscle research.
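
    The quantitative step in shear-wave elastography rests on a standard relation for soft, nearly incompressible tissue: the shear modulus is mu = rho * c^2 for shear wave speed c and density rho, and Young's modulus is approximately E = 3 * mu (Poisson ratio near 0.5). A minimal sketch of that conversion, with illustrative numbers (not values from this study):

```python
# Convert a measured shear wave speed to an approximate Young's modulus
# using the common soft-tissue relation E ~= 3 * rho * c^2. The distance
# and time values below are invented for illustration.

def youngs_modulus(wave_speed_m_s, density_kg_m3=1000.0):
    """E ~= 3 * rho * c^2 for nearly incompressible soft tissue."""
    shear_modulus = density_kg_m3 * wave_speed_m_s ** 2
    return 3.0 * shear_modulus

# Suppose the tracked shear wave front travels 1.0 mm in 0.4 ms.
c = 1.0e-3 / 0.4e-3          # wave speed in m/s (2.5 m/s)
E = youngs_modulus(c)        # in Pa
print(E / 1000.0)            # Young's modulus in kPa
```

    The wave speed itself comes from the 2-D depth-resolved propagation maps: the slope of the wavefront arrival time versus lateral distance.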

  7. Evaluation of methods for determining hardware projected life

    NASA Technical Reports Server (NTRS)

    1971-01-01

    An investigation of existing methods of predicting hardware life is summarized by reviewing programs having long life requirements, current research efforts on long life problems, and technical papers reporting work on life predicting techniques. The results indicate that there are no accurate quantitative means to predict hardware life for system level hardware. The effectiveness of test programs and the cause of hardware failures is considered.

  8. FT. Sam 91 Whiskey Combat Medic Medical Simulation Training Quantitative Integration Enhancement Program

    DTIC Science & Technology

    2011-07-01

    joined the project team in the statistical and research coordination role. Dr. Collin is an employee at the University of Pittsburgh. A successful...3. Submit to Ft. Detrick Completed Milestone: Statistical analysis planning 1. Review planned data metrics and data gathering tools...approach to performance assessment for continuous quality improvement.  Analyzing data with modern statistical techniques to determine the

  9. The Impact of Interactive Environment and Metacognitive Support on Academic Achievement and Transactional Distance in Online Learning

    ERIC Educational Resources Information Center

    Yilmaz, Ramazan; Keser, Hafize

    2017-01-01

    The aim of the present study is to reveal the impact of the interactive environment and metacognitive support (MS) in online learning on academic achievement and transactional distance (TD). The study is designed as 2 × 2 factorial design, and both qualitative and quantitative research techniques are used. The study was carried out on 127…

  10. Examination of the Mathematical Problem-Solving Beliefs and Success Levels of Primary School Teacher Candidates through the Variables of Mathematical Success and Gender

    ERIC Educational Resources Information Center

    Bal, Ayten Pinar

    2015-01-01

    The aim of this study is to examine the mathematical problem-solving beliefs and problem-solving success levels of primary school teacher candidates through the variables of academic success and gender. The research was designed according to the mixed methods technique in which qualitative and quantitative methods are used together. The working…

  11. Applications of mass spectrometry for quantitative protein analysis in formalin-fixed paraffin-embedded tissues

    PubMed Central

    Steiner, Carine; Ducret, Axel; Tille, Jean-Christophe; Thomas, Marlene; McKee, Thomas A; Rubbia-Brandt, Laura A; Scherl, Alexander; Lescuyer, Pierre; Cutler, Paul

    2014-01-01

    Proteomic analysis of tissues has advanced in recent years as instruments and methodologies have evolved. The ability to retrieve peptides from formalin-fixed paraffin-embedded tissues followed by shotgun or targeted proteomic analysis is offering new opportunities in biomedical research. In particular, access to large collections of clinically annotated samples should enable the detailed analysis of pathologically relevant tissues in a manner previously considered unfeasible. In this paper, we review the current status of proteomic analysis of formalin-fixed paraffin-embedded tissues with a particular focus on targeted approaches and the potential for this technique to be used in clinical research and clinical diagnosis. We also discuss the limitations and perspectives of the technique, particularly with regard to application in clinical diagnosis and drug discovery. PMID:24339433

  12. Physics Structure Analysis of Parallel Waves Concept of Physics Teacher Candidate

    NASA Astrophysics Data System (ADS)

    Sarwi, S.; Supardi, K. I.; Linuwih, S.

    2017-04-01

    The aim of this research was to find the parallel structure of wave physics concepts and the factors that influence the formation of parallel conceptions among physics teacher candidates. The method used was qualitative research with a cross-sectional design. The subjects were five students from the third-semester basic physics course and six from the fifth-semester wave course. Data collection techniques included think-aloud protocols and written tests. Quantitative data were analysed with descriptive percentage techniques, while belief and awareness of answers were examined with explanatory analysis. Results of the research include: 1) the structure of the concept can be displayed through the illustration of a map containing the theoretical core, supplements to the theory, and phenomena that occur daily; 2) parallel conceptions of wave physics were identified for stationary waves, resonance of sound, and the propagation of transverse electromagnetic waves; 3) parallel conceptions are influenced by less-than-comprehensive textbook reading and by partial understanding in forming the structure of the theory.

  13. Mixed-Methods Research in Nutrition and Dietetics.

    PubMed

    Zoellner, Jamie; Harris, Jeffrey E

    2017-05-01

    This work focuses on mixed-methods research (MMR) and is the 11th in a series exploring the importance of research design, statistical analysis, and epidemiologic methods as applied to nutrition and dietetics research. MMR research is an investigative technique that applies both quantitative and qualitative data. The purpose of this article is to define MMR; describe its history and nature; provide reasons for its use; describe and explain the six different MMR designs; describe sample selection; and provide guidance in data collection, analysis, and inference. MMR concepts are applied and integrated with nutrition-related scenarios in real-world research contexts and summary recommendations are provided. Copyright © 2017 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.

  14. Combining the power of stories and the power of numbers: mixed methods research and mixed studies reviews.

    PubMed

    Pluye, Pierre; Hong, Quan Nha

    2014-01-01

    This article provides an overview of mixed methods research and mixed studies reviews. These two approaches are used to combine the strengths of quantitative and qualitative methods and to compensate for their respective limitations. This article is structured in three main parts. First, the epistemological background for mixed methods will be presented. Afterward, we present the main types of mixed methods research designs and techniques as well as guidance for planning, conducting, and appraising mixed methods research. In the last part, we describe the main types of mixed studies reviews and provide a tool kit and examples. Future research needs to offer guidance for assessing mixed methods research and reporting mixed studies reviews, among other challenges.

  15. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices. This technique typically completely destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks. For this, voids were filled with plastic and the rock dissolved away with HF to reveal plastic casts of internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (the greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions would need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc.
CXT can be done at three basic levels of resolution, with "normal" x-rays providing tens of microns resolution, synchrotron sources providing single to few microns, and emerging XuM techniques providing a practical 300 nm and theoretical 60 nm. The main challenges in CXT imaging have been in segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. Additionally, new techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in x-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
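
    The object-recognition step described above amounts to connected-component labeling of the segmented voxel volume: touching voxels must be grouped into one object so that a single vesicle is not counted twice and separate vesicles are not merged. A minimal pure-Python sketch on an invented toy volume (production pipelines use optimized tools such as scipy.ndimage.label):

```python
# Label 6-connected components of "void" voxels in a tiny binary 3D volume,
# so touching voxels count as one object. Toy data; real tomographic volumes
# are orders of magnitude larger and use optimized labeling libraries.
from collections import deque

def label_3d(volume):
    """Label 6-connected components of truthy voxels; return (labels, count)."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    labels = [[[0] * nx for _ in range(ny)] for _ in range(nz)]
    count = 0
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if volume[z][y][x] and labels[z][y][x] == 0:
                    count += 1                      # new object found
                    queue = deque([(z, y, x)])
                    labels[z][y][x] = count
                    while queue:                    # flood fill its voxels
                        cz, cy, cx = queue.popleft()
                        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                                           (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                            az, ay, ax = cz + dz, cy + dy, cx + dx
                            if (0 <= az < nz and 0 <= ay < ny and 0 <= ax < nx
                                    and volume[az][ay][ax]
                                    and labels[az][ay][ax] == 0):
                                labels[az][ay][ax] = count
                                queue.append((az, ay, ax))
    return labels, count

# Two separate "vesicles" in a 3x3x3 volume (toy segmentation result).
vol = [[[0] * 3 for _ in range(3)] for _ in range(3)]
vol[0][0][0] = vol[0][0][1] = 1   # vesicle 1: two touching voxels
vol[2][2][2] = 1                  # vesicle 2: isolated voxel
labels, n = label_3d(vol)
print(n)  # 2 objects
```

    Per-object voxel counts from the label array then give the particle/void volumes, and object centroids support nearest-neighbor and coordination-number statistics.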

  16. Interdisciplinary study of atmospheric processes and constituents of the mid-Atlantic coastal region.. [air pollution control studies in Virginia

    NASA Technical Reports Server (NTRS)

    Kindle, E. C.; Bandy, E. C.; Copeland, G.; Blais, R.; Levy, G.; Sonenshine, D.

    1975-01-01

    Past research projects for the year 1974-1975 are listed along with future research programs in the area of air pollution control, remote sensor analysis of smoke plumes, the biosphere component, and field experiments. A detailed budget analysis is presented. Attachments are included on the following topics: mapping forest vegetation with ERTS-1 MSS data and automatic data processing techniques, and use of LARS system for the quantitative determination of smoke plume lateral diffusion coefficients from ERTS images of Virginia.

  17. How to Improve Interest, IQ, and Motivation of Vocational Students?

    NASA Astrophysics Data System (ADS)

    Sumual, H.; Ombuh, D. M.

    2018-02-01

    The aim of this research was to study the effect of students' interest, motivation, and IQ on learning results. A survey method with a quantitative approach was used, and the data were analysed using path analysis. Data were collected by questionnaire, by IQ tests, and from documentation of learning outcomes. The results showed that interest, IQ, and motivation significantly and positively influence learning results, and that interest influences learning motivation. However, no significant influence of IQ on learning motivation was detected in this research.

  18. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007.

    PubMed

    Gore, Sally A; Nordberg, Judith M; Palmer, Lisa A; Piorun, Mary E

    2009-07-01

    This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991-2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians.

  19. Molecular imaging of melanin distribution in vivo and quantitative differential diagnosis of human pigmented lesions using label-free harmonic generation biopsy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Sun, Chi-Kuang; Wei, Ming-Liang; Su, Yu-Hsiang; Weng, Wei-Hung; Liao, Yi-Hua

    2017-02-01

    Harmonic generation microscopy is a noninvasive repetitive imaging technique that provides real-time 3D microscopic images of human skin with a sub-femtoliter resolution and high penetration down to the reticular dermis. In this talk, we show that with a strong resonance effect, the third-harmonic-generation (THG) modality provides enhanced contrast on melanin and allows not only differential diagnosis of various pigmented skin lesions but also quantitative imaging for long-term tracking. This unique capability makes THG microscopy the only label-free technique capable of identifying the active melanocytes in human skin and of imaging their different dendriticity patterns. We will review our recent efforts to image melanin distribution in vivo and quantitatively diagnose pigmented skin lesions using label-free harmonic generation biopsy. This talk will first cover the spectroscopic study on the melanin-enhanced THG effect in human cells and the calibration strategy inside human skin for quantitative imaging. We will then review our recent clinical trials, including a differential diagnosis capability study on pigmented skin tumors, as well as a quantitative virtual biopsy study on pre- and post-treatment evaluation of melasma and solar lentigo. Our study indicates the unmatched capability of harmonic generation microscopy to perform virtual biopsy for noninvasive histopathological diagnosis of various pigmented skin tumors, as well as its unsurpassed capability to noninvasively reveal the pathological origin of different hyperpigmentary diseases on the human face and to monitor the efficacy of laser depigmentation treatments. This work is sponsored by National Health Research Institutes.

  20. Dataglove measurement of joint angles in sign language handshapes

    PubMed Central

    Eccarius, Petra; Bour, Rebecca; Scheidt, Robert A.

    2012-01-01

    In sign language research, we understand little about articulatory factors involved in shaping phonemic boundaries or the amount (and articulatory nature) of acceptable phonetic variation between handshapes. To date, there exists no comprehensive analysis of handshape based on the quantitative measurement of joint angles during sign production. The purpose of our work is to develop a methodology for collecting and visualizing quantitative handshape data in an attempt to better understand how handshapes are produced at a phonetic level. In this pursuit, we seek to quantify the flexion and abduction angles of the finger joints using a commercial data glove (CyberGlove; Immersion Inc.). We present calibration procedures used to convert raw glove signals into joint angles. We then implement those procedures and evaluate their ability to accurately predict joint angle. Finally, we provide examples of how our recording techniques might inform current research questions. PMID:23997644

  1. Video methods in the quantification of children's exposures.

    PubMed

    Ferguson, Alesia C; Canales, Robert A; Beamer, Paloma; Auyeung, Willa; Key, Maya; Munninghoff, Amy; Lee, Kevin Tse-Wing; Robertson, Alexander; Leckie, James O

    2006-05-01

    In 1994, Stanford University's Exposure Research Group (ERG) conducted its first pilot study to collect micro-level activity time series (MLATS) data for young children. The pilot study involved videotaping four children of farm workers in the Salinas Valley of California and converting their videotaped activities to valuable text files of contact behavior using video-translation techniques. These MLATS are especially useful for describing intermittent dermal (i.e., second-by-second accounts of surfaces and objects contacted) and non-dietary ingestion (second-by-second accounts of objects or hands placed in the mouth) contact behavior. Second-by-second records of children's contact behavior are amenable to quantitative and statistical analysis and allow for more accurate model estimates of human exposure and dose to environmental contaminants. Activity patterns data for modeling inhalation exposure (i.e., accounts of microenvironments visited) can also be extracted from the MLATS data. Since the pilot study, ERG has collected an immense MLATS data set for 92 children using more developed and refined videotaping and video-translation methodologies. This paper describes all aspects required for the collection of MLATS, including subject recruitment techniques, videotaping and video-translation processes, and potential data analysis. This paper also describes the quality assurance steps employed for these new MLATS projects, including training, data management, and the application of interobserver and intraobserver agreement during video translation. The discussion of these issues and ERG's experiences in dealing with them can assist other groups in the conduct of research that employs these more quantitative techniques.
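
    Interobserver agreement during video translation is commonly quantified with Cohen's kappa, which corrects raw agreement between two translators for agreement expected by chance. A minimal sketch (the contact-category labels below are invented, not the study's actual coding scheme):

```python
# Cohen's kappa for interobserver agreement between two video translators
# assigning contact categories to the same video segments. Labels invented.

def cohens_kappa(rater1, rater2):
    """Chance-corrected agreement between two raters over paired labels.
    Undefined (division by zero) when expected agreement equals 1."""
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    expected = sum((rater1.count(c) / n) * (rater2.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

r1 = ["hand", "toy", "hand", "floor", "toy", "hand", "floor", "toy"]
r2 = ["hand", "toy", "hand", "floor", "hand", "hand", "floor", "toy"]
print(round(cohens_kappa(r1, r2), 3))
```

    Intraobserver agreement is computed the same way, with the two label sequences coming from the same translator coding the same video twice.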

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. 
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
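
    The final ranking step can be sketched compactly. The slope and noise values below are invented placeholders for the parameters the NGS procedure would estimate from patient data; only the ranking by noise-to-slope ratio is illustrated:

```python
# Hypothetical per-method parameters as an NGS-style procedure would estimate
# them: each measurement is modeled as slope*T + bias + noise(sd), T unknown.
methods = {
    "OSEM":     {"slope": 0.95, "noise_sd": 0.08},
    "FBP":      {"slope": 0.80, "noise_sd": 0.12},
    "OSEM+PSF": {"slope": 1.02, "noise_sd": 0.05},
}

def noise_to_slope_ratio(params):
    # Lower NSR = better precision after linear calibration of the method
    return params["noise_sd"] / params["slope"]

ranking = sorted(methods, key=lambda m: noise_to_slope_ratio(methods[m]))
print(ranking)  # best (lowest NSR) first
```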

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    PubMed

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision.
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation.

  4. The use of virtual environments for percentage view analysis.

    PubMed

    Schofield, Damian; Cox, Christopher J B

    2005-09-01

    It is recognised that Visual Impact Assessment (VIA), unlike many other aspects of Environmental Impact Assessment (EIA), relies less upon measurement than upon experience and judgement. Hence, a more structured and consistent approach to VIA is needed, one that reduces bias and subjectivity. For proposed developments, there are very few quantitative techniques for evaluating visibility, and the existing methods can be highly inaccurate and time-consuming. Percentage view change is one of the few quantitative techniques, and the use of computer technology can reduce the inaccuracy and the time spent evaluating the visibility of either existing or proposed developments. For over 10 years, research work undertaken by the authors at the University of Nottingham has employed Computer Graphics (CG) and Virtual Reality (VR) in civilian and industrial contexts for environmental planning, design visualisation, accident reconstruction, risk analysis, data visualisation and training simulators. This paper describes a method to quantitatively assess the visual impact of proposed developments on the landscape using CG techniques. The method determines accurate percentage view changes using a computer-generated model of the environment and specialist software developed at the University of Nottingham. The principles are easy to understand, so planners, authorisation agencies and members of the public can use and interpret the results. A case study demonstrates the application and capabilities of the technology.
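
    In a CG implementation, the percentage-view-change calculation reduces to counting pixels in rendered masks. The paper's own software is not public, so the function below only illustrates the principle:

```python
import numpy as np

def percentage_view_change(view_mask_before, view_mask_after):
    """Change in the percentage of the viewport occupied by a development.

    Each mask is a boolean array rendered from a chosen viewpoint, True
    where the development's pixels are visible."""
    total = view_mask_before.size
    pct_before = view_mask_before.sum() * 100.0 / total
    pct_after = view_mask_after.sum() * 100.0 / total
    return pct_after - pct_before

# Toy 100x100 viewport: the development occupies a 20x30 pixel block after
before = np.zeros((100, 100), dtype=bool)
after = before.copy()
after[40:60, 10:40] = True
print(percentage_view_change(before, after))  # 6.0
```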

  5. Quantitative analysis of packed and compacted granular systems by x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Fu, Xiaowei; Milroy, Georgina E.; Dutt, Meenakshi; Bentham, A. Craig; Hancock, Bruno C.; Elliott, James A.

    2005-04-01

    The packing and compaction of powders are common processes in the pharmaceutical, food, ceramic and powder metallurgy industries. Understanding how particles pack in a confined space and how powders behave during compaction is crucial for producing high-quality products. This paper outlines a new technique, based on modern desktop X-ray tomography and image processing, to quantitatively investigate the packing of particles during powder compaction and to provide insight into how powders densify, relating material properties and processing conditions to tablet manufacture by compaction. A variety of powder systems were considered, including glass, sugar and NaCl, with a typical particle size of 200-300 μm, and binary mixtures of NaCl-glass spheres. The results are new and have been validated by SEM observation and by numerical simulations using discrete element methods (DEM). The research demonstrates that the XMT technique has potential for further investigation of pharmaceutical processing and even for verifying other physical models of complex packing.
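
    The core quantitative step, measuring solid fraction from a binarised reconstruction, can be sketched as follows. The synthetic "tomogram" (a sphere in a cube) stands in for real XMT data, and the threshold value is arbitrary:

```python
import numpy as np

def packing_fraction(volume, threshold):
    """Solid volume fraction of a reconstructed XMT volume.

    `volume` is a 3-D array of grey values; voxels above `threshold` are
    taken as the solid (particle) phase after binarisation."""
    solid = volume > threshold
    return solid.sum() / solid.size

# Synthetic stand-in for a reconstructed tomogram: one sphere in a 64^3 cube
n = 64
z, y, x = np.mgrid[:n, :n, :n]
r2 = (x - n / 2) ** 2 + (y - n / 2) ** 2 + (z - n / 2) ** 2
tomogram = np.where(r2 < (n / 4) ** 2, 200, 50).astype(np.uint8)
print(round(packing_fraction(tomogram, 128), 3))
```

    On real data the same measurement, taken slice-band by slice-band, yields the local density profiles that show where a compact densifies.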

  6. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterizes the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
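
    The defining feature of a design-based strategy is that the randomness lives in the sampling design rather than in an assumed model of the tissue. A minimal sketch of systematic uniform random sampling of sections (the section count and period are arbitrary examples):

```python
import random

def systematic_uniform_random_sample(n_items, period):
    """Design-based sampling: every item has the same inclusion probability
    (1/period) because the only random choice is the uniform start offset."""
    start = random.randrange(period)
    return list(range(start, n_items, period))

random.seed(7)
sections = systematic_uniform_random_sample(120, 10)  # e.g. every 10th section
print(sections)
```

    Unbiasedness here follows from the design alone; no assumption about how structures are distributed through the 120 sections is required.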

  7. Q methodology in health economics.

    PubMed

    Baker, Rachel; Thompson, Carl; Mannion, Russell

    2006-01-01

    The recognition that health economists need to understand the meaning of data if they are to adequately understand research findings which challenge conventional economic theory has led to the growth of qualitative modes of enquiry in health economics. The use of qualitative methods of exploration and description alongside quantitative techniques gives rise to a number of epistemological, ontological and methodological challenges: difficulties in accounting for subjectivity in choices, the need for rigour and transparency in method, and problems of disciplinary acceptability to health economists. Q methodology is introduced as a means of overcoming some of these challenges. We argue that Q offers a means of exploring subjectivity, beliefs and values while retaining the transparency, rigour and mathematical underpinnings of quantitative techniques. The various stages of Q methodological enquiry are outlined alongside potential areas of application in health economics, before discussing the strengths and limitations of the approach. We conclude that Q methodology is a useful addition to economists' methodological armoury and one that merits further consideration and evaluation in the study of health services.
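
    The quantitative core of Q methodology, correlating people rather than variables and factoring the result, can be sketched as follows; the Q-sorts below are invented:

```python
import numpy as np

# Hypothetical Q-sorts: rows = participants, columns = statements,
# values = forced-distribution rankings (here -3 .. +3)
qsorts = np.array([
    [ 3,  2,  1,  0, -1, -2, -3,  0,  1, -1],
    [ 3,  1,  2,  0, -1, -3, -2,  1,  0, -1],
    [-3, -2, -1,  0,  1,  2,  3,  0, -1,  1],
])

# Q correlates *people* with each other (not variables), so the correlation
# matrix is computed between rows
corr = np.corrcoef(qsorts)

# Factor-extraction sketch: eigendecomposition of the person-by-person
# correlation matrix; large eigenvalues indicate shared viewpoints
eigvals, eigvecs = np.linalg.eigh(corr)
print(np.round(corr, 2))
print(np.round(eigvals[::-1], 2))  # descending
```

    In this toy data the first two participants share a viewpoint while the third holds the opposite one, which a single dominant factor with opposite-signed loadings captures.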

  8. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities.

    PubMed

    Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P

    2015-09-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.

  9. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities

    PubMed Central

    Green, Carla A.; Duan, Naihua; Gibbons, Robert D.; Hoagwood, Kimberly E.; Palinkas, Lawrence A.; Wisdom, Jennifer P.

    2015-01-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings. PMID:24722814

  10. A Quantitative Technique for Beginning Microscopists.

    ERIC Educational Resources Information Center

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)
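
    A classic stereological estimate of this kind is point counting, where the volume fraction of a tissue (e.g. palisade mesophyll in a leaf section) is estimated by the fraction of grid points hitting it (the Delesse principle, Vv = Pp). A sketch with a toy section mask:

```python
import random

def point_count_volume_fraction(section_mask, grid_spacing):
    """Overlay a systematic grid with a uniformly random offset and estimate
    volume fraction as the fraction of grid points hitting the phase."""
    rows, cols = len(section_mask), len(section_mask[0])
    ox, oy = random.randrange(grid_spacing), random.randrange(grid_spacing)
    hits = total = 0
    for y in range(oy, rows, grid_spacing):
        for x in range(ox, cols, grid_spacing):
            total += 1
            hits += section_mask[y][x]  # 1 if the point lands on the phase
    return hits / total

# Toy 100x100 section where the left half is the phase of interest
mask = [[1] * 50 + [0] * 50 for _ in range(100)]
print(point_count_volume_fraction(mask, grid_spacing=5))  # 0.5 for this mask
```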

  11. 3D light scanning macrography.

    PubMed

    Huber, D; Keller, M; Robert, D

    2001-08-01

    The technique of 3D light scanning macrography permits the non-invasive surface scanning of small specimens at magnifications up to 200x. Obviating both the problem of limited depth of field inherent to conventional close-up macrophotography and the metallic coating required by scanning electron microscopy, 3D light scanning macrography provides three-dimensional digital images of intact specimens without the loss of colour, texture and transparency information. This newly developed technique offers a versatile, portable and cost-efficient method for the non-invasive digital and photographic documentation of small objects. Computer controlled device operation and digital image acquisition facilitate fast and accurate quantitative morphometric investigations, and the technique offers a broad field of research and educational applications in biological, medical and materials sciences.

  12. Design, Validation, and Testing of a Hot-Film Anemometer for Hypersonic Flow

    NASA Astrophysics Data System (ADS)

    Sheplak, Mark

    The application of constant-temperature hot-film anemometry to hypersonic flow has been reviewed and extended in this thesis. The objective of this investigation was to develop a measurement tool capable of yielding continuous, high-bandwidth, quantitative, normal mass-flux and total-temperature measurements in moderate-enthalpy environments. This research has produced a probe design that represents a significant advancement over existing designs, offering the following improvements: (1) a five-fold increase in bandwidth; (2) true stagnation-line sensor placement; (3) a two order-of-magnitude decrease in sensor volume; and (4) over a 70% increase in maximum film temperature. These improvements were achieved through substrate design, sensor placement, the use of high-temperature materials, and state-of-the-art microphotolithographic fabrication techniques. The experimental study to characterize the probe was performed in four different hypersonic wind tunnels at NASA-Langley Research Center. The initial test consisted of traversing the hot film through a Mach 6, flat-plate, turbulent boundary layer in air. The detailed static-calibration measurements that followed were performed in two different hypersonic flows: a Mach 11 helium flow and Mach 6 air flow. The final test of this thesis consisted of traversing the probe through the Mach 6 wake of a 70° blunt body. The goal of this test was to determine the state (i.e., laminar or turbulent) of the wake. These studies indicate that substrate conduction effects result in instrumentation characteristics that prevent the hot-film anemometer from being used as a quantitative tool. The extension of this technique to providing quantitative information is dependent upon the development of lower thermal-conductivity substrate materials. However, the probe durability, absence of strain gauging, and high bandwidth represent significant improvements over the hot-wire technique for making qualitative measurements.
Potential uses for this probe are: frequency identification for resonant flows, transition studies, turbulence detection for quiet-tunnel development and reattaching turbulent shear flows, and qualitative turbulence studies of shock-wave/turbulent boundary layer interactions.
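
    Static calibration of a hot-film sensor is conventionally described by King's law, E² = A + B·(ρu)ⁿ. The sketch below fits A and B to synthetic clean data with n fixed at the classical value 0.5 and then inverts the fit; all coefficients are invented, not from this thesis:

```python
import numpy as np

# Synthetic clean calibration following King's law with invented coefficients
A_true, B_true, n = 2.0, 0.5, 0.5
mass_flux = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # rho*u samples
E = np.sqrt(A_true + B_true * mass_flux**n)           # bridge voltages

# With n fixed, E^2 = A + B*(rho*u)^n is linear in A and B
X = np.vstack([np.ones_like(mass_flux), mass_flux**n]).T
A, B = np.linalg.lstsq(X, E**2, rcond=None)[0]

def mass_flux_from_voltage(e):
    """Invert the fitted calibration to recover rho*u from a voltage."""
    return ((e**2 - A) / B) ** (1.0 / n)

print(round(A, 3), round(B, 3))
```

    In practice n is often fitted as well, and the substrate-conduction effects described above are exactly what distort this simple one-to-one calibration.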

  13. Monitoring the Transcriptional Activity of Human Endogenous Retroviral HERV-W Family Using PNA Strand Invasion into Double-Stranded DNA.

    PubMed

    Machnik, Grzegorz; Skudrzyk, Estera; Bułdak, Łukasz; Ruczyński, Jarosław; Kozłowska, Agnieszka; Mucha, Piotr; Rekowski, Piotr; Szkróbka, Witold; Basiak, Marcin; Bołdys, Aleksandra; Sławska, Helena; Okopień, Bogusław

    2018-02-01

    In this work, we devised a method for distinguishing sequences that are genetically closely related to each other. This is particularly important when a fine balance of allele abundance is a point of research interest. We developed a peptide nucleic acid (PNA) strand invasion technique for differentiating between multiple sclerosis-associated retrovirus (MSRV) and ERVWE1 sequences, which are molecularly similar and both belong to the human endogenous retrovirus HERV-W family. We found that this method may support the PCR technique in screening for minor alleles which, under certain conditions, may go undetected by standard PCR. We analysed different ERVWE1 and MSRV template mixtures ranging from 0 to 100% ERVWE1 and found a linear correlation between template composition and the signal intensity of the final reaction products. Using the PNA strand invasion assay, we were able to estimate the relative ERVWE1 expression level in human specimens such as the U-87 MG and normal human astrocyte cell lines and placental tissue. The results were concordant with those obtained by semi-quantitative or quantitative PCR.
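
    The reported linear correlation suggests a standard-curve workflow: fit signal against known template composition, then invert the fit for unknown samples. A sketch with invented calibration numbers (not the study's data):

```python
import numpy as np

# Hypothetical standard curve: % ERVWE1 template vs. measured band intensity
percent_ervwe1 = np.array([0.0, 25.0, 50.0, 75.0, 100.0])
signal = np.array([0.05, 0.27, 0.52, 0.74, 0.98])

# Linear fit of signal = slope * percent + intercept
slope, intercept = np.polyfit(percent_ervwe1, signal, 1)

def estimate_percent(sig):
    """Invert the fitted standard curve for an unknown sample's signal."""
    return (sig - intercept) / slope

print(round(estimate_percent(0.50), 1))
```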

  14. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.

    1987-01-01

    Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
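
    Equivalent point-source inversion is typically stabilised by a damping parameter in the normal equations, which is the role of the "optimum damping parameter" selection mentioned above. A generic damped least-squares sketch, with random matrices standing in for the satellite-anomaly kernels:

```python
import numpy as np

def damped_inversion(A, d, damping):
    """Damped (Tikhonov-style) least squares: m = (A^T A + lam*I)^-1 A^T d.
    The damping trades solution stability against fit to the data."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + damping * np.eye(n), A.T @ d)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 5))                 # stand-in anomaly kernels
m_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])   # stand-in source strengths
d = A @ m_true                                   # noise-free synthetic data
m_est = damped_inversion(A, d, damping=1e-8)
print(np.round(m_est, 3))
```

    With noisy, poorly conditioned kernels the damping must be large enough to suppress oscillatory solutions but small enough not to bias the recovered sources, hence the need for a selection criterion.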

  15. Recent advances in rice genome and chromosome structure research by fluorescence in situ hybridization (FISH).

    PubMed

    Ohmido, Nobuko; Fukui, Kiichi; Kinoshita, Toshiro

    2010-01-01

    Fluorescence in situ hybridization (FISH) is an effective method for the physical mapping of genes and repetitive DNA sequences on chromosomes. Physical mapping of unique nucleotide sequences on specific rice chromosome regions was performed using a combination of chromosome identification and highly sensitive FISH. Increases in the detection sensitivity of smaller DNA sequences and improvements in spatial resolution have ushered in a new phase in FISH technology. Thus, it is now possible to perform in situ hybridization on somatic chromosomes, pachytene chromosomes, and even on extended DNA fibers (EDFs). Pachytene-FISH allows the integration of genetic linkage maps and quantitative chromosome maps. Visualization methods using FISH can reveal the spatial organization of the centromere, heterochromatin/euchromatin, and the terminal structures of rice chromosomes. Furthermore, EDF-FISH and the DNA combing technique can resolve a spatial distance of 1 kb between adjacent DNA sequences, and the detection of even a 300-bp target is now feasible. The copy numbers of various repetitive sequences and the sizes of various DNA molecules were quantitatively measured using the molecular combing technique. This review describes the significance of these advances in molecular cytology in rice and discusses future applications in plant studies using visualization techniques.
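
    Copy-number estimation from extended DNA fibres rests on converting measured signal length into sequence length. A sketch assuming the B-DNA figure of roughly 0.34 µm per kb; in practice molecular combing can overstretch fibres, so the constant is a calibration parameter, and the repeat-unit size below is an arbitrary example:

```python
def copy_number_from_fiber(signal_length_um, repeat_unit_bp,
                           stretch_um_per_kb=0.34):
    """Copies of a tandem repeat from an extended-DNA-fibre signal length.

    B-DNA is ~0.34 um per kb; combed fibres may be stretched beyond this,
    so stretch_um_per_kb should be calibrated against a known standard."""
    signal_kb = signal_length_um / stretch_um_per_kb
    return signal_kb * 1000.0 / repeat_unit_bp

# A 34-um FISH signal over a hypothetical 500-bp repeat unit
print(round(copy_number_from_fiber(34.0, 500)))  # ~200 copies
```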

  16. Comparison of Grid Nudging and Spectral Nudging Techniques for Dynamical Climate Downscaling within the WRF Model

    NASA Astrophysics Data System (ADS)

    Fan, X.; Chen, L.; Ma, Z.

    2010-12-01

    Climate downscaling has been an active research and application area over the past several decades, focusing on regional climate studies. Dynamical downscaling, in addition to statistical methods, has been widely used as advanced numerical weather and regional climate models have emerged. The use of numerical models ensures that a full set of climate variables is generated in the downscaling process, dynamically consistent because they are constrained by physical laws. While generating high-resolution regional climate, the large-scale climate patterns should be retained. To serve this purpose, nudging techniques, including grid analysis nudging and spectral nudging, have been used in different models. There are studies demonstrating the benefits and advantages of each nudging technique; however, the results are sensitive to many factors, such as the nudging coefficients and the amount of information nudged toward, and the conclusions are therefore controversial. Alongside companion work developing approaches for quantitative assessment of downscaled climate, in this study the two nudging techniques are subjected to extensive experiments in the Weather Research and Forecasting (WRF) model. Using the same model provides fair comparability, and applying the quantitative assessments provides objectivity of comparison. Three types of downscaling experiments were performed for one selected month. The first type serves as a baseline, in which large-scale information is communicated through the lateral boundary conditions only; the second uses grid analysis nudging; and the third uses spectral nudging. Emphasis is given to experiments with different nudging coefficients and nudging of different variables in grid analysis nudging, while in spectral nudging we focus on testing the nudging coefficients and different wave numbers on different model levels.
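
    Grid (analysis) nudging adds a relaxation term G·(X_analysis − X_model) to the tendency of each nudged variable. A toy explicit-stepping sketch of that relaxation; the coefficient 3×10⁻⁴ s⁻¹ is a typical order of magnitude for such coefficients, not a recommendation, and the fields are synthetic:

```python
import numpy as np

def nudge_step(model_field, analysis_field, dt, g_coef, tendency):
    """One explicit step with an analysis-nudging term: the model tendency
    is relaxed toward the driving analysis with strength g_coef (1/s)."""
    return model_field + dt * (tendency + g_coef * (analysis_field - model_field))

# Toy 1-D temperature field relaxing toward the analysis over 100 minutes
model = np.full(10, 285.0)
analysis = np.full(10, 288.0)
for _ in range(100):
    model = nudge_step(model, analysis, dt=60.0, g_coef=3e-4, tendency=0.0)
print(round(float(model[0]), 2))
```

    Spectral nudging applies the same relaxation only to selected large-scale Fourier components, which is why the choice of wave numbers and model levels matters there.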

  17. [Progress of researches on mechanism of acupuncture therapy underlying improvement of acute cerebral hemorrhage].

    PubMed

    Wang, Fan; Wang, Hai-qiao; Dong, Gui-rong

    2011-04-01

    In the present paper, the authors review progress in research on the mechanisms by which acupuncture therapy improves acute cerebral hemorrhage, covering both experimental findings and research methods. The effects of acupuncture intervention mainly involve (1) lessening inflammatory reactions, (2) reducing the damage caused by free radicals and excitatory amino acids to cerebral neurons, (3) balancing the release of vascular bioactive substances to increase regional cerebral blood flow, and (4) promoting repair and regeneration of neural tissue, etc. In regard to research methods, many new biological techniques, such as molecular biological approaches, neuro-cellular chemical methods, reverse transcription-polymerase chain reaction (RT-PCR) or quantitative real-time PCR, in situ hybridization, western blotting, and electron microscopy, have been extensively applied to research on the mechanisms underlying acupuncture therapy for cerebral infarction. In addition, the authors point out that, in spite of considerable progress in experimental studies, most results reflect static, isolated and regional changes rather than dynamic, whole-body changes. For this reason, more in vivo and noninvasive research techniques are highly recommended for future research on the mechanisms underlying acupuncture therapy for acute cerebral hemorrhage.

  18. Review of progress in quantitative NDE. [Nondestructive Evaluation (NDE)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed, including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  19. Cardiovascular and pulmonary dynamics by quantitative imaging

    NASA Technical Reports Server (NTRS)

    Wood, E. H.

    1976-01-01

    The accuracy and range of studies on cardiovascular and pulmonary functions can be greatly facilitated if the motions of the underlying organ systems throughout individual cycles can be directly visualized and readily measured with minimum or preferably no effect on these motions. Achievement of this objective requires development of techniques for quantitative noninvasive or minimally invasive dynamic and stop-action imaging of the organ systems. A review of advances in dynamic quantitative imaging of moving organs reveals that the revolutionary value of cross-sectional and three-dimensional images produced by various types of radiant energy such as X-rays and gamma rays, positrons, electrons, protons, light, and ultrasound for clinical diagnostic and biomedical research applications is just beginning to be realized. The fabrication of a clinically useful cross-section reconstruction device with sensing capabilities for both anatomical structural composition and chemical composition may be possible and awaits future development.

  20. Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets

    NASA Technical Reports Server (NTRS)

    Griffin, D. W.; Yep, T. W.; Agrawal, A. K.

    2005-01-01

    Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on the near-field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive, line-of-sight measurement technique for the whole field. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet in microgravity was up to 70 percent wider than that in Earth gravity. The global jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of temporal flow evolution as the experiment undergoes the change in gravity in the drop tower.
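
    Converting a schlieren-derived mixture refractive index into composition commonly assumes refractivities mix linearly with mole fraction. A sketch using approximate visible-light refractive indices for air and helium (the values are nominal, and the Abel-inversion step that recovers the local index from deflection angles is omitted):

```python
def helium_mole_fraction(n_mix, n_air=1.000293, n_he=1.000036):
    """Helium mole fraction from a local mixture refractive index, assuming
    refractivity (n - 1) mixes linearly with mole fraction."""
    return (n_mix - n_air) / (n_he - n_air)

# A point whose index lies midway between pure air and pure helium
midpoint = (1.000293 + 1.000036) / 2
print(round(helium_mole_fraction(midpoint), 3))  # 0.5
```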

  1. Quantitative ptychographic reconstruction by applying a probe constraint

    NASA Astrophysics Data System (ADS)

    Reinhardt, J.; Schroer, C. G.

    2018-04-01

    The coherent scanning technique X-ray ptychography has become a routine tool for high-resolution imaging and nanoanalysis in various fields of research such as chemistry, biology and materials science. Often the ptychographic reconstruction results are analysed to yield absolute quantitative values for the object transmission and the illuminating probe function. In this work, we address a common ambiguity encountered in scaling the object transmission and probe intensity by applying an additional constraint in the reconstruction algorithm. A ptychographic measurement of a model sample containing nanoparticles is used as a test data set against which to benchmark the reconstruction results depending on the type of constraint used. Achieving quantitative absolute values for the reconstructed object transmission is essential for advanced investigation of samples that change over time, e.g., during in-situ experiments, or in general when different data sets are compared.
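
    The ambiguity arises because the exit-wave model O·P is unchanged under O → c·O, P → P/c. One common way to remove it is to pin the probe's integrated intensity to a separately measured incident flux; whether this matches the paper's exact constraint is not established here, so treat the sketch as representative:

```python
import numpy as np

def apply_probe_power_constraint(probe, obj, incident_flux):
    """Fix the scaling ambiguity O -> c*O, P -> P/c by rescaling the probe
    so its integrated intensity equals the measured incident flux, and
    rescaling the object inversely (the product O*P is unchanged)."""
    scale = np.sqrt(incident_flux / np.sum(np.abs(probe) ** 2))
    return probe * scale, obj / scale

rng = np.random.default_rng(1)
probe = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
obj = rng.standard_normal((8, 8)) + 1j * rng.standard_normal((8, 8))
probe2, obj2 = apply_probe_power_constraint(probe, obj, incident_flux=1e6)
print(round(float(np.sum(np.abs(probe2) ** 2))))  # 1000000
```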

  2. Analysis of defect structure in silicon. Characterization of SEMIX material. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.

    1983-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848 - 13C. Important correlation was obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects suitable for Quantimet quantitative image analyzer (QTM) analysis was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined. Mobility was measured using the van der Pauw technique, and grain boundary density was measured using the quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.
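
    The van der Pauw measurement mentioned above solves exp(−πR_A/R_s) + exp(−πR_B/R_s) = 1 for the sheet resistance R_s, from which mobility follows once the carrier density is known. A bisection sketch; the 10¹³ cm⁻² sheet density is hypothetical, not from this report:

```python
import math

def sheet_resistance(r_a, r_b, tol=1e-12):
    """Solve the van der Pauw relation for Rs by bisection.

    f(Rs) = exp(-pi*Ra/Rs) + exp(-pi*Rb/Rs) - 1 increases monotonically
    from -1 (Rs -> 0) toward +1 (Rs -> inf), so a sign change is bracketed."""
    f = lambda rs: (math.exp(-math.pi * r_a / rs)
                    + math.exp(-math.pi * r_b / rs) - 1.0)
    lo, hi = 1e-6, (r_a + r_b) * 1e3
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rs = sheet_resistance(100.0, 100.0)       # symmetric case: Rs = pi*R/ln 2
mobility = 1.0 / (1.602e-19 * 1e13 * rs)  # cm^2/(V*s), hypothetical density
print(round(rs, 2))
```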

  3. Qualitative and quantitative imaging in microgravity combustion

    NASA Technical Reports Server (NTRS)

    Weiland, Karen J.

    1995-01-01

    An overview of the imaging techniques implemented by researchers in the microgravity combustion program shows that for almost any system, imaging of the flame may be accomplished in a variety of ways. Standard and intensified video, high speed, and infrared cameras and fluorescence, laser schlieren, rainbow schlieren, soot volume fraction, and soot temperature imaging have all been used in the laboratory and many in reduced gravity to make the necessary experimental measurements.

  4. RT-qPCR Demonstrates Light-Dependent AtRBCS1A and AtRBCS3B mRNA Expressions in "Arabidopsis thaliana" Leaves

    ERIC Educational Resources Information Center

    Chang, Ming-Mei; Li, Anna; Feissner, Robert; Ahmad, Talal

    2016-01-01

    Reverse transcription quantitative polymerase chain reaction (RT-qPCR) is widely used in diagnosis and research to determine specific mRNA expressions in cells. As RT-qPCR applications increase, it is necessary to provide undergraduates hands-on experience of this modern technique. Here, we report a 3-week laboratory exercise using RT-qPCR to…
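
    Relative expression from RT-qPCR data is usually computed with the Livak 2^−ΔΔCt method. A sketch with invented Ct values (and assuming near-100% primer efficiency, which the method requires):

```python
def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    """Livak 2^-ddCt relative quantification: normalise the target gene's
    Ct to a reference gene in each condition, then compare conditions."""
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_treated - d_ct_control
    return 2 ** (-dd_ct)

# Hypothetical Ct values for a light- vs. dark-grown comparison,
# normalised to a reference gene measured in both conditions
fold = relative_expression(ct_target_treated=22.0, ct_ref_treated=18.0,
                           ct_target_control=25.0, ct_ref_control=18.0)
print(fold)  # 8.0-fold higher in the treated (light) condition
```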

  5. Landslide hazard assessment: recent trends and techniques.

    PubMed

    Pardeshi, Sudhakar D; Autade, Sumant E; Pardeshi, Suchitra S

    2013-01-01

    Landslide hazard assessment is an important step towards landslide hazard and risk management. There are several methods of Landslide Hazard Zonation (LHZ), viz. heuristic, semi-quantitative, quantitative, probabilistic and multi-criteria decision-making processes. However, no single method is accepted universally for effective assessment of landslide hazards. In recent years, several attempts have been made to apply different methods of LHZ and to compare the results in order to find the best-suited model. This paper reviews research on landslide hazard mapping published in recent years. Advanced multivariate techniques have proved effective in the spatial prediction of landslides with a high degree of accuracy. Physical process-based models also perform well in LHZ mapping, even in areas with a poor database. Multi-criteria decision-making approaches also play a significant role in determining the relative importance of landslide causative factors in the slope instability process. Remote Sensing and Geographical Information Systems (GIS) are powerful tools for assessing landslide hazards and have been used extensively in landslide research over the last decade. Aerial photographs and high-resolution satellite data are useful for detecting, mapping and monitoring landslide processes. GIS-based LHZ models help not only to map and monitor landslides but also to predict future slope failures. Advancements in geospatial technologies have opened the door to detailed and accurate assessment of landslide hazards.
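
    A semi-quantitative LHZ workflow often reduces to a weighted overlay of causative-factor ratings for each map cell. The factors, weights and ratings below are purely illustrative, not drawn from any particular study:

```python
# Illustrative factor weights (would come from expert judgement or a
# multi-criteria method such as AHP); they sum to 1
weights = {"slope": 0.35, "lithology": 0.25, "land_use": 0.20, "rainfall": 0.20}

def hazard_index(cell_ratings):
    """Weighted sum of per-factor ratings (each rated 1 = low .. 5 = high);
    the index is then binned into hazard zones for mapping."""
    return sum(weights[f] * r for f, r in cell_ratings.items())

# One map cell: steep slope, moderate lithology, light land-use pressure
cell = {"slope": 5, "lithology": 3, "land_use": 2, "rainfall": 4}
print(hazard_index(cell))  # 3.7 on the 1-5 scale
```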

  6. Quantifying environmental DNA signals for aquatic invasive species across multiple detection platforms.

    PubMed

    Nathan, Lucas M; Simmons, Megan; Wegleitner, Benjamin J; Jerde, Christopher L; Mahon, Andrew R

    2014-11-04

    The use of molecular surveillance techniques has become popular among aquatic researchers and managers due to improved sensitivity and efficiency compared to traditional sampling methods. Rapid expansion in the use of environmental DNA (eDNA), paired with the advancement of molecular technologies, has resulted in new detection platforms and techniques. In this study we present a comparison of three eDNA surveillance platforms: traditional polymerase chain reaction (PCR), quantitative PCR (qPCR), and droplet digital PCR (ddPCR), in which water samples were collected over a 24 h period from mesocosm experiments containing a gradient of invasive species densities. All platforms reliably detected the presence of DNA within the first hour, even at low target organism densities. The two quantitative platforms (qPCR and ddPCR) produced similar estimates of DNA concentration. Analysis with ddPCR was faster from sample collection through analysis and cost approximately half as much as qPCR. Although ddPCR is a new platform for eDNA surveillance of aquatic species, its results were consistent with the more commonly used qPCR, and it provided a cost-effective means of estimating DNA concentrations. Use of ddPCR by researchers and managers should be considered in future eDNA surveillance applications.
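The quantitative step in ddPCR rests on Poisson statistics: from the fraction of positive droplets one recovers the mean number of target copies per droplet, and hence a bulk concentration. A minimal sketch (the ~0.85 nL droplet volume is a typical value for commercial systems, used here as an assumption):

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Estimate target concentration (copies/uL) from ddPCR droplet counts.

    The sample is partitioned into thousands of droplets; a Poisson
    correction converts the fraction of positive droplets into the
    mean copies per droplet, then into a bulk concentration.
    """
    p = positive / total                 # fraction of positive droplets
    lam = -math.log(1.0 - p)             # mean copies per droplet (Poisson)
    return lam / droplet_volume_ul       # copies per microliter

# e.g. 1,500 positive droplets out of 15,000 -> roughly 124 copies/uL
conc = ddpcr_concentration(1500, 15000)
```

Because the readout is an absolute count rather than a threshold cycle, ddPCR needs no standard curve, which is part of what makes it attractive alongside qPCR.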

  7. Costs of genetic testing: Supporting Brazilian Public Policies for the incorporation of molecular diagnostic technologies

    PubMed Central

    Schlatter, Rosane Paixão; Matte, Ursula; Polanczyk, Carisi Anne; Koehler-Santos, Patrícia; Ashton-Prolla, Patricia

    2015-01-01

    This study identifies and describes the operating costs associated with the molecular diagnosis of diseases such as hereditary cancer. To approximate the costs associated with these tests, data informed by Standard Operating Procedures for various techniques were collected from hospital software and a survey of market prices. Costs were established for four scenarios of capacity utilization to represent the possibility of suboptimal use in research laboratories. The cost description was based on a single site. The results show that only one technique was not impacted by cost increases due to underutilized capacity. Several common techniques were considerably more expensive at 30% capacity, including polymerase chain reaction (180%), microsatellite instability analysis (181%), gene rearrangement analysis by multiplex ligation probe amplification (412%), non-labeled sequencing (173%), and quantitation of nucleic acids (169%). These findings should be relevant for the definition of public policies and suggest that investing public funds in centralized diagnostic research centers would reduce costs to the Public Health System. PMID:26500437
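The cost inflation at low utilization follows directly from fixed-cost amortization: fixed costs are spread over fewer tests, so the per-test cost climbs as utilization drops. A toy sketch with invented numbers (not the paper's cost data):

```python
def unit_cost(fixed_cost, variable_cost_per_test, capacity, utilization):
    """Per-test cost for a lab running at a given fraction of capacity.

    Fixed costs (equipment, salaries) are divided over the tests
    actually performed, so the unit cost rises as utilization falls,
    while the variable cost per test stays constant.
    """
    tests_performed = capacity * utilization
    return variable_cost_per_test + fixed_cost / tests_performed

full = unit_cost(10000, 5.0, 1000, 1.0)    # 15.0 per test at 100% capacity
low = unit_cost(10000, 5.0, 1000, 0.3)     # ~38.3 per test at 30% capacity
increase_pct = (low / full - 1) * 100      # ~156% more expensive
```

This is the mechanism behind the paper's argument for centralized diagnostic centers: pooling demand keeps utilization, and therefore the fixed-cost share per test, closer to optimal.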

  8. [Progress in porcine genes and transcriptome and discussion of related issues].

    PubMed

    Zhu, Meng-Jin; Liu, Bang; Li, Kui

    2005-01-01

    To date, research on the molecular basis of porcine development has mainly involved muscle growth and meat quality. Some functional genes, including the Hal and RN genes, and some QTLs controlling or associated with porcine growth and meat quality have been detected through candidate gene approaches and genome-wide scanning. Transcriptomes of porcine muscle and adipose tissue have also come under study. At the same time, this research has shown some shortcomings. Work in molecular quantitative genetics has overemphasized single genes while ignoring the co-expression patterns of multiple genes. Research applying transcriptome analysis has met two limitations: a narrow range of molecular experimental techniques, and an artificial division of muscle and adipose genes into two unconnected sets. Thus, exploring porcine genes through parallel genetics, based on systemic views and techniques, to reveal the interactional mechanisms of genes controlling muscle and adipose tissue will be an important issue for research on porcine genes and genomes in the near future.

  9. An easy and inexpensive method for quantitative analysis of endothelial damage by using vital dye staining and Adobe Photoshop software.

    PubMed

    Saad, Hisham A; Terry, Mark A; Shamie, Neda; Chen, Edwin S; Friend, Daniel F; Holiman, Jeffrey D; Stoeger, Christopher

    2008-08-01

    We developed a simple, practical, and inexpensive technique to analyze areas of endothelial cell loss and/or damage over the entire corneal area after vital dye staining by using a readily available, off-the-shelf, consumer software program, Adobe Photoshop. The purpose of this article is to convey a method of quantifying areas of cell loss and/or damage. Descemet-stripping automated endothelial keratoplasty corneal transplant surgery was performed by using 5 precut corneas on a human cadaver eye. Corneas were removed and stained with trypan blue and alizarin red S and subsequently photographed. Quantitative assessment of endothelial damage was performed by using Adobe Photoshop 7.0 software. The average difference for cell area damage for analyses performed by 1 observer twice was 1.41%. For analyses performed by 2 observers, the average difference was 1.71%. Three masked observers were 100% successful in matching the randomized stained corneas to their randomized processed Adobe images. Vital dye staining of corneal endothelial cells can be combined with Adobe Photoshop software to yield a quantitative assessment of areas of acute endothelial cell loss and/or damage. This described technique holds promise for a more consistent and accurate method to evaluate the surgical trauma to the endothelial cell layer in laboratory models. This method of quantitative analysis can probably be generalized to any area of research that involves areas that are differentiated by color or contrast.
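The same pixel-counting idea is easy to reproduce outside Photoshop: stained (damaged) regions differ in intensity or color, so the damaged-area fraction is simply the share of pixels past a cutoff. A minimal sketch in Python/NumPy (the intensity threshold is an illustrative assumption, not a value from the article):

```python
import numpy as np

def damaged_area_percent(image, threshold=100):
    """Percent of the imaged area whose pixels are darker than `threshold`.

    Mimics the Photoshop workflow: trypan blue / alizarin red staining
    darkens damaged endothelium, so counting pixels below an intensity
    cutoff approximates the damaged-area fraction.
    """
    gray = np.asarray(image, dtype=float)
    damaged = gray < threshold            # boolean mask of "stained" pixels
    return 100.0 * damaged.sum() / gray.size

# synthetic 10x10 "cornea": one dark quadrant (25 of 100 pixels damaged)
img = np.full((10, 10), 200.0)
img[:5, :5] = 50.0
pct = damaged_area_percent(img)   # 25.0
```

In practice the cornea would first be segmented from the background so that only pixels inside the corneal outline enter the denominator.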

  10. In vivo confocal microscopy of the cornea: New developments in image acquisition, reconstruction and analysis using the HRT-Rostock Corneal Module

    PubMed Central

    Petroll, W. Matthew; Robertson, Danielle M.

    2015-01-01

    The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608

  11. An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yongmin; Alexander, Thomas

    1986-06-01

    In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.

  12. Gene Profiling Technique to Accelerate Stem Cell Therapies for Eye Diseases

    MedlinePlus

    ... like RPE. They also use a technique called quantitative RT-PCR to measure the expression of genes ... higher in iPS cells than mature RPE. But quantitative RT-PCR only permits the simultaneous measurement of ...

  13. Quantitative analysis of the effect of environmental-scanning electron microscopy on collagenous tissues.

    PubMed

    Lee, Woowon; Toussaint, Kimani C

    2018-05-31

    Environmental-scanning electron microscopy (ESEM) is routinely applied to various biological samples due to its ability to maintain a wet environment while imaging; moreover, the technique obviates the need for sample coating. However, limited research has been carried out on electron-beam (e-beam) induced tissue damage resulting from use of the ESEM. In this paper, we use quantitative second-harmonic generation (SHG) microscopy to examine the effects of e-beam exposure from the ESEM on collagenous tissue samples prepared as either fixed, frozen, wet, or dehydrated. Quantitative SHG analysis of tissues before and after ESEM e-beam exposure in low-vacuum mode reveals evidence of cross-linking of collagen fibers; however, no structural differences are observed in fixed tissue. Meanwhile, wet-mode ESEM appears to radically alter the structure from a regular fibrous arrangement to a more random fiber orientation. We also confirm that ESEM images of collagenous tissues show higher spatial resolution than SHG microscopy, but the relative tradeoff in collagen specificity reduces its effectiveness in quantifying collagen fiber organization. Our work provides insight into both the limitations of the ESEM for tissue imaging and the potential opportunity to use it as a complementary technique when imaging fine features in the non-collagenous regions of tissue samples.

  14. Potential advantages of using synchrotron X-ray based techniques in pediatric research.

    PubMed

    Pascolo, L; Esteve, F; Rizzardi, C; James, S; Menk, R H

    2013-01-01

    Synchrotron radiation (SR), which combines extremely high intensity, high collimation, tunability, and a continuous energy spectrum, allows the development of advanced X-ray based techniques that are becoming uniquely useful tools in life science research, while providing exciting opportunities in biomedical imaging and radiotherapy. This review summarizes emerging techniques and their potential to greatly enhance the exploration of dynamic biological processes occurring across various spatial and temporal regimes, from whole-body physiology down to the location of individual chemical species within single cells. In recent years pediatric research and clinical practice have started to profit from these new opportunities, particularly by extending the diagnostic and therapeutic capabilities of these X-ray based techniques. In diagnosis, technical advances in DEI and KES imaging modalities have proved particularly valuable for children and women, since SR allows dose minimization, with significant reductions compared to conventional approaches. However, the greatest expectations are in the field of SR-based radiotherapy: a growing number of studies demonstrate that SR radiotherapy provides improved chances of recovery, especially for pediatric patients. In addition, we report on the applicability of advanced X-ray microscopy techniques that offer exceptional spatial and quantitative resolution in elemental detection. These techniques, which are useful for in vitro studies, will be particularly advantageous where investigators seek a deeper understanding of diseases in which mismetabolism of metals, either physiologically important (e.g., Cu, Zn) or outright toxic (e.g., Pb), underlies pathogenesis.

  15. A validated method for the quantitation of 1,1-difluoroethane using a gas in equilibrium method of calibration.

    PubMed

    Avella, Joseph; Lehrer, Michael; Zito, S William

    2008-10-01

    1,1-Difluoroethane (DFE), also known as Freon 152A, is a member of a class of compounds known as halogenated hydrocarbons. A number of these compounds have gained notoriety because of their ability to induce rapid onset of intoxication after inhalation exposure. Abuse of DFE has necessitated the development of methods for its detection and quantitation in postmortem and human performance specimens. Furthermore, methodologies applicable to research studies are required, as there have been limited toxicokinetic and toxicodynamic reports published on DFE. This paper describes a method for the quantitation of DFE using a gas chromatography-flame-ionization headspace technique that employs solventless standards for calibration. Two calibration curves using 0.5 mL whole blood calibrators, ranging from A: 0.225-1.350 to B: 9.0-180.0 mg/L, were developed. These were evaluated for linearity (0.9992 and 0.9995), limit of detection (0.018 mg/L), limit of quantitation (0.099 mg/L; recovery 111.9%, CV 9.92%), and upper limit of linearity (27,000.0 mg/L). Combined-curve recovery of a 98.0 mg/L DFE control prepared using an alternate technique was 102.2% with a CV of 3.09%. No matrix interference was observed in DFE-enriched blood, urine, or brain specimens, nor did analysis of variance detect any significant differences (alpha = 0.01) in the area under the curve of blood, urine, or brain specimens at three identical DFE concentrations. The method is suitable for use in forensic laboratories because validation was performed on instrumentation routinely used in forensic labs and because of the ease with which the calibration range can be adjusted. Perhaps more importantly, it is also useful for research-oriented studies because the removal of solvent from standard preparation eliminates the possibility of solvent-induced changes to the gas/liquid partitioning of DFE or chromatographic interference due to the presence of solvent in specimens.
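Linearity and limit figures like those reported above come from standard calibration-curve statistics. A hedged sketch using the common 3.3σ/slope and 10σ/slope conventions for LOD and LOQ (the calibrator responses below are invented for illustration; the paper's exact procedure may differ):

```python
import numpy as np

def calibration(concs, responses):
    """Least-squares calibration line with 3.3s/10s detection limits.

    Fits response vs. concentration, then derives LOD and LOQ from
    the residual standard deviation divided by the slope, per the
    common 3.3*sd/slope and 10*sd/slope conventions.
    """
    slope, intercept = np.polyfit(concs, responses, 1)
    pred = slope * np.array(concs) + intercept
    resid_sd = np.sqrt(np.sum((np.array(responses) - pred) ** 2)
                       / (len(concs) - 2))          # sd about the regression
    lod = 3.3 * resid_sd / slope                    # limit of detection
    loq = 10 * resid_sd / slope                     # limit of quantitation
    return slope, intercept, lod, loq

# illustrative low-curve calibrators (mg/L) and peak-area-ratio responses
x = [0.225, 0.45, 0.675, 0.9, 1.125, 1.35]
y = [0.021, 0.044, 0.068, 0.089, 0.113, 0.135]
slope, intercept, lod, loq = calibration(x, y)
```

Running the fit over both ranges (A and B) and checking recovery of an independently prepared control, as the paper does, guards against bias in the solventless standards.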

  16. Application of Deep Learning in Automated Analysis of Molecular Images in Cancer: A Survey

    PubMed Central

    Xue, Yong; Chen, Shihui; Liu, Yong

    2017-01-01

    Molecular imaging enables the visualization and quantitative analysis of alterations in biological processes at the molecular and/or cellular level, which is of great significance for the early detection of cancer. In recent years, deep learning has been widely used in medical image analysis, as it overcomes the limitations of visual assessment and traditional machine learning techniques by extracting hierarchical features with powerful representation capability. Research on cancer molecular images using deep learning techniques is also increasing rapidly. Hence, in this paper, we review the applications of deep learning in molecular imaging in terms of tumor lesion segmentation, tumor classification, and survival prediction. We also outline some future directions in which researchers may develop more powerful deep learning models for better performance in applications to cancer molecular imaging. PMID:29114182

  17. Research in cosmic and gamma ray astrophysics

    NASA Technical Reports Server (NTRS)

    Stone, Edward C.; Mewaldt, Richard A.; Prince, Thomas A.

    1992-01-01

    Discussed here is research in cosmic ray and gamma ray astrophysics at the Space Radiation Laboratory (SRL) of the California Institute of Technology. The primary activities discussed involve the development of new instrumentation and techniques for future space flight. In many cases these instrumentation developments were tested in balloon flight instruments designed to conduct new investigations in cosmic ray and gamma ray astrophysics. The results of these investigations are briefly summarized. Specific topics include a quantitative investigation of the solar modulation of cosmic ray protons and helium nuclei, a study of cosmic ray positron and electron spectra in interplanetary and interstellar space, the solar modulation of cosmic rays, an investigation of techniques for the measurement and interpretation of cosmic ray isotopic abundances, and a balloon measurement of the isotopic composition of galactic cosmic ray boron, carbon, and nitrogen.

  18. TU-CD-BRA-11: Application of Bone Suppression Technique to Inspiratory/expiratory Chest Radiography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanaka, R; Sanada, S; Sakuta, K

    Purpose: The bone suppression technique, based on advanced image processing, can suppress the conspicuity of bones on chest radiographs, creating soft tissue images normally obtained by the dual-energy subtraction technique. This study was performed to investigate the usefulness of the bone suppression technique in quantitative analysis of pulmonary function in inspiratory/expiratory chest radiography. Methods: Commercial bone suppression image processing software (ClearRead; Riverain Technologies) was applied to paired inspiratory/expiratory chest radiographs of 107 patients (normal, 33; abnormal, 74) to create corresponding bone suppression images. The abnormal subjects had been diagnosed with pulmonary diseases such as pneumothorax, pneumonia, emphysema, asthma, and lung cancer. After recognition of the lung area, the vectors of respiratory displacement were measured in all local lung areas using a cross-correlation technique. The measured displacement in each area was visualized as a displacement color map. The distribution pattern of respiratory displacement was assessed by comparison with the findings of lung scintigraphy. Results: Respiratory displacement of pulmonary markings (soft tissues) could be quantified separately from rib movements on bone suppression images. The resulting displacement map showed a left-right symmetric distribution increasing from the lung apex to the bottom region of the lung in many cases. However, patients with ventilatory impairments showed a nonuniform distribution caused by decreased displacement of pulmonary markings, which was confirmed to correspond to areas with ventilatory impairments found on the lung scintigrams. Conclusion: The bone suppression technique was useful for quantitative analysis of respiratory displacement of pulmonary markings without interruption by rib shadows. Abnormal areas could be detected as decreased displacement of pulmonary markings. Inspiratory/expiratory chest radiography combined with the bone suppression technique has potential for predicting local lung function on the basis of dynamic analysis of pulmonary markings. This work was partially supported by a Grant-in-Aid for Scientific Research (C) of the Ministry of Education, Culture, Sports, Science and Technology, Japan (Grant number: 24601007), the Nakatani Foundation, the Mitsubishi Foundation, and the Mitani Foundation for Research and Development. Yasushi Kishitani is a staff member of TOYO Corporation.
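The displacement measurement described above is a block-matching step: for each local lung region, find the offset that maximizes the normalized cross-correlation between the inspiration and expiration frames. A brute-force sketch of that step (not the ClearRead implementation):

```python
import numpy as np

def match_displacement(template, search):
    """Locate `template` inside `search` by normalized cross-correlation.

    Returns the (row, col) offset within `search` that maximizes the
    correlation coefficient, plus the coefficient itself. This is the
    core of estimating how far a local lung region moved between
    inspiration and expiration.
    """
    th, tw = template.shape
    sh, sw = search.shape
    t = template - template.mean()
    best, best_dy, best_dx = -2.0, 0, 0
    for dy in range(sh - th + 1):
        for dx in range(sw - tw + 1):
            s = search[dy:dy + th, dx:dx + tw]
            s = s - s.mean()
            denom = np.sqrt((t ** 2).sum() * (s ** 2).sum())
            r = (t * s).sum() / denom if denom > 0 else 0.0
            if r > best:
                best, best_dy, best_dx = r, dy, dx
    return best_dy, best_dx, best

# synthetic test: an "expiration" frame is the "inspiration" frame
# shifted down by 2 pixels; the match should land 2 rows lower
rng = np.random.default_rng(0)
insp = rng.random((12, 12))
template = insp[3:7, 3:7]           # local lung region at inspiration
search = np.roll(insp, 2, axis=0)   # expiration: content shifted down 2 px
dy, dx, r = match_displacement(template, search)   # dy=5, dx=3 -> shift (2, 0)
```

Repeating this over a grid of local regions yields the vector field that the study renders as displacement color maps.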

  19. Quantitative analysis of drug distribution by ambient mass spectrometry imaging method with signal extinction normalization strategy and inkjet-printing technology.

    PubMed

    Luo, Zhigang; He, Jingjing; He, Jiuming; Huang, Lan; Song, Xiaowei; Li, Xin; Abliz, Zeper

    2018-03-01

    Quantitative mass spectrometry imaging (MSI) is a robust approach that provides both quantitative and spatial information for drug candidate research. However, because of complicated signal suppression and interference, acquiring accurate quantitative information from MSI data remains a challenge, especially for whole-body tissue samples. Ambient MSI techniques using spray-based ionization appear ideal for pharmaceutical quantitative MSI analysis. However, they are more challenging, as they involve almost no sample preparation and are more susceptible to ion suppression/enhancement. Herein, based on our developed air flow-assisted desorption electrospray ionization (AFADESI)-MSI technology, an ambient quantitative MSI method was introduced by integrating inkjet-printing technology with normalization by the signal extinction coefficient (SEC) of the target compound itself. The method uses a single calibration curve to quantify multiple tissue types. Basic blue 7 and an antitumor drug candidate (S-(+)-deoxytylophorinidine, CAT) were chosen to initially validate the feasibility and reliability of the quantitative MSI method. Rat tissue sections (heart, kidney, and brain) from animals administered CAT were then analyzed. The quantitative MSI results were cross-validated against LC-MS/MS data from the same tissues. Their consistency suggests that the approach can rapidly obtain quantitative MSI data without introducing interference into the in-situ environment of the tissue sample, and has the potential to provide a high-throughput, economical, and reliable approach for drug discovery and development. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on the miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial- and high-digital-resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for the analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.

  1. Overview of Supersonic Aerodynamics Measurement Techniques in the NASA Langley Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Erickson, Gary E.

    2007-01-01

    An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper: any qualitative or quantitative experimental approach that provides information leading to improved understanding of supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.

  2. Qualitative Methods in Mental Health Services Research

    PubMed Central

    Palinkas, Lawrence A.

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This paper reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the papers included in this special series along with representative examples from the literature. Qualitative methods are used to provide a “thick description” or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods, but often differ with respect to study design, data collection and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semi-structured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research. PMID:25350675

  3. Qualitative and mixed methods in mental health services and implementation research.

    PubMed

    Palinkas, Lawrence A

    2014-01-01

    Qualitative and mixed methods play a prominent role in mental health services research. However, the standards for their use are not always evident, especially for those not trained in such methods. This article reviews the rationale and common approaches to using qualitative and mixed methods in mental health services and implementation research based on a review of the articles included in this special series along with representative examples from the literature. Qualitative methods are used to provide a "thick description" or depth of understanding to complement breadth of understanding afforded by quantitative methods, elicit the perspective of those being studied, explore issues that have not been well studied, develop conceptual theories or test hypotheses, or evaluate the process of a phenomenon or intervention. Qualitative methods adhere to many of the same principles of scientific rigor as quantitative methods but often differ with respect to study design, data collection, and data analysis strategies. For instance, participants for qualitative studies are usually sampled purposefully rather than at random and the design usually reflects an iterative process alternating between data collection and analysis. The most common techniques for data collection are individual semistructured interviews, focus groups, document reviews, and participant observation. Strategies for analysis are usually inductive, based on principles of grounded theory or phenomenology. Qualitative methods are also used in combination with quantitative methods in mixed-method designs for convergence, complementarity, expansion, development, and sampling. Rigorously applied qualitative methods offer great potential in contributing to the scientific foundation of mental health services research.

  4. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
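One widely used spectral-counting measure from this literature is the normalized spectral abundance factor (NSAF): each protein's spectral count is divided by its length, then scaled so the factors sum to one within a run, making runs comparable. A minimal sketch with invented counts and lengths:

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factors for a label-free run.

    Each protein's spectral count is length-normalized (longer
    proteins yield more peptides, hence more spectra), then the
    factors are scaled to sum to 1 so runs can be compared.
    """
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# three proteins: spectral counts and sequence lengths (illustrative)
vals = nsaf([50, 20, 10], [500, 200, 400])
# saf = [0.1, 0.1, 0.025] -> nsaf = [4/9, 4/9, 1/9]
```

As the review stresses, such point estimates still need statistical assessment across replicates before differential abundance can be claimed.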

  5. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in high-throughput and quantitative manners. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision some other microfluidic techniques that may not be employed in quantitative biology right now, but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  6. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, a conventional technique with a single exposure degrades the efficiency of tumor detection due to structure overlapping. Dual-energy techniques with energy-integrating detectors (EIDs) also cause increased radiation dose and inaccurate material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues induced by the conventional technique and EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented by using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved quantitative accuracy and reduced radiation dose compared with the dual-energy technique with an EID. The quantitative accuracy of the contrast-enhanced spectral mammography based on a PCD was slightly improved as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.
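
The core of any dual-energy decomposition, in a linearized model, is solving a 2x2 linear system relating log-attenuation at two energies to two basis-material thicknesses. The sketch below illustrates only that generic step; the attenuation coefficients are made-up numbers, not tabulated values, and the paper's actual polychromatic model is more involved.

```python
# Simplified two-material decomposition for dual-energy imaging.
# mu[i][j] is the (hypothetical) attenuation coefficient of basis
# material j at energy bin i, in 1/cm.

def decompose(p_low, p_high, mu):
    """Solve [p_low, p_high] = mu @ [t1, t2] for thicknesses via Cramer's rule."""
    (a, b), (c, d) = mu
    det = a * d - b * c
    t1 = (p_low * d - b * p_high) / det
    t2 = (a * p_high - p_low * c) / det
    return t1, t2

mu = [[0.50, 0.25],
      [0.30, 0.20]]
t_iodine, t_tissue = 0.2, 3.0                       # ground-truth thicknesses (cm)
p_low = mu[0][0] * t_iodine + mu[0][1] * t_tissue   # simulated low-energy projection
p_high = mu[1][0] * t_iodine + mu[1][1] * t_tissue  # simulated high-energy projection

est = decompose(p_low, p_high, mu)
```

A PCD supplies both projections in a single exposure by binning photons on energy, which is why it avoids the double exposure an EID-based dual-energy acquisition needs.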

  7. Trends in health sciences library and information science research: an analysis of research publications in the Bulletin of the Medical Library Association and Journal of the Medical Library Association from 1991 to 2007*

    PubMed Central

    Gore, Sally A.; Nordberg, Judith M.; Palmer, Lisa A.

    2009-01-01

    Objective: This study analyzed trends in research activity as represented in the published research in the leading peer-reviewed professional journal for health sciences librarianship. Methodology: Research articles were identified from the Bulletin of the Medical Library Association and Journal of the Medical Library Association (1991–2007). Using content analysis and bibliometric techniques, data were collected for each article on the (1) subject, (2) research method, (3) analytical technique used, (4) number of authors, (5) number of citations, (6) first author affiliation, and (7) funding source. The results were compared to a previous study, covering the period 1966 to 1990, to identify changes over time. Results: Of the 930 articles examined, 474 (51%) were identified as research articles. Survey (n = 174, 37.1%) was the most common methodology employed, quantitative descriptive statistics (n = 298, 63.5%) the most used analytical technique, and applied topics (n = 332, 70%) the most common type of subject studied. The majority of first authors were associated with an academic health sciences library (n = 264, 55.7%). Only 27.4% (n = 130) of studies identified a funding source. Conclusion: This study's findings demonstrate that progress is being made in health sciences librarianship research. There is, however, room for improvement in terms of research methodologies used, proportion of applied versus theoretical research, and elimination of barriers to conducting research for practicing librarians. PMID:19626146

  8. Multimodal quantitative phase and fluorescence imaging of cell apoptosis

    NASA Astrophysics Data System (ADS)

    Fu, Xinye; Zuo, Chao; Yan, Hao

    2017-06-01

Fluorescence microscopy, utilizing fluorescence labeling, can observe intracellular changes that transmitted- and reflected-light microscopy techniques cannot resolve. However, the parts without fluorescence labeling are not imaged, so processes occurring simultaneously in those parts cannot be revealed. Moreover, fluorescence imaging is 2D imaging in which depth information is missing, so even the information in the labeled parts is incomplete. Quantitative phase imaging, on the other hand, is capable of imaging cells in 3D in real time through phase calculation. However, its resolution is limited by optical diffraction, and it cannot observe intracellular changes below 200 nanometers. In this work, fluorescence imaging and quantitative phase imaging are combined to build a multimodal imaging system. Such a system has the capability to simultaneously observe detailed intracellular phenomena and 3D cell morphology. In this study the proposed multimodal imaging system is used to observe cell behavior during apoptosis. The aim is to highlight the limitations of fluorescence microscopy and to point out the advantages of multimodal quantitative phase and fluorescence imaging. The proposed multimodal quantitative phase imaging could be further applied in cell-related biomedical research, such as tumor studies.

  9. Sub-band denoising and spline curve fitting method for hemodynamic measurement in perfusion MRI

    NASA Astrophysics Data System (ADS)

    Lin, Hong-Dun; Huang, Hsiao-Ling; Hsu, Yuan-Yu; Chen, Chi-Chen; Chen, Ing-Yi; Wu, Liang-Chi; Liu, Ren-Shyan; Lin, Kang-Ping

    2003-05-01

In clinical research, non-invasive MR perfusion imaging is capable of investigating brain perfusion phenomena via various hemodynamic measurements, such as cerebral blood volume (CBV), cerebral blood flow (CBF), and mean transit time (MTT). These hemodynamic parameters are useful in diagnosing brain disorders such as stroke, infarction, and peri-infarct ischemia by further semi-quantitative analysis. However, the accuracy of quantitative analysis is usually affected by poor signal-to-noise-ratio image quality. In this paper, we propose a hemodynamic measurement method based upon sub-band denoising and spline curve fitting processes to improve image quality for better hemodynamic quantitative analysis results. Ten sets of perfusion MRI data and corresponding PET images were used to validate the performance. For quantitative comparison, we evaluated the gray/white matter CBF ratio. As a result, the mean gray-to-white matter CBF ratio from semi-quantitative analysis was 2.10 +/- 0.34. The ratio evaluated from perfusion MRI is comparable to that from the PET technique, with less than 1% difference on average. Furthermore, the method features excellent noise reduction and boundary preservation in image processing, and a short hemodynamic measurement time.
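
The hemodynamic parameters named above can be sketched from a tissue concentration-time curve C(t) using textbook approximations: CBV is proportional to the area under the curve, MTT can be approximated by the curve's first moment, and the central volume theorem gives CBF = CBV / MTT. The curve below is synthetic, and this omits the deconvolution and denoising steps the paper actually addresses.

```python
import math

def trapz(y, t):
    """Trapezoidal integration of samples y over time points t."""
    return sum((y[i] + y[i + 1]) * (t[i + 1] - t[i]) / 2 for i in range(len(y) - 1))

t = [i * 0.5 for i in range(40)]                         # time (s)
c = [ti ** 2 * math.exp(-ti / 2.0) for ti in t]          # synthetic bolus curve

cbv = trapz(c, t)                                         # proportional to CBV
mtt = trapz([ti * ci for ti, ci in zip(t, c)], t) / cbv   # mean transit time (s)
cbf = cbv / mtt                                           # central volume: CBF = CBV / MTT
```

For this gamma-variate-like curve the first-moment MTT comes out near 6 s, matching the analytic moment of t^2 e^(-t/2).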

  10. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  11. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
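
The K-means case from the simulations above can be sketched directly on binary code profiles (1 = code present in an interview). This toy example uses fixed starting centroids so the run is deterministic; the profiles are invented, and real analyses would use multiple restarts and the comparison methods the study describes.

```python
# Minimal K-means on binary code profiles from coded qualitative data.

def kmeans(data, centroids, iters=10):
    labels = []
    for _ in range(iters):
        # assign each profile to the nearest centroid (squared Euclidean)
        labels = [min(range(len(centroids)),
                      key=lambda k: sum((x - c) ** 2
                                        for x, c in zip(row, centroids[k])))
                  for row in data]
        # recompute each centroid as the mean profile of its cluster
        for k in range(len(centroids)):
            members = [row for row, lab in zip(data, labels) if lab == k]
            if members:
                centroids[k] = [sum(col) / len(members) for col in zip(*members)]
    return labels

profiles = [[1, 1, 0, 0], [1, 1, 1, 0], [1, 0, 0, 0],   # profiles resembling group A
            [0, 0, 1, 1], [0, 1, 1, 1], [0, 0, 0, 1]]   # profiles resembling group B
labels = kmeans(profiles, centroids=[[1, 1, 0, 0], [0, 0, 1, 1]])
```

Even with only six profiles the two code patterns separate cleanly, which is the property the simulations probe at sample sizes down to 50.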

  12. Methodological issues in microdialysis sampling for pharmacokinetic studies.

    PubMed

    de Lange, E C; de Boer, A G; Breimer, D D

    2000-12-15

Microdialysis is an in vivo technique that permits monitoring of local concentrations of drugs and metabolites at specific sites in the body. Microdialysis has several characteristics that make it an attractive tool for pharmacokinetic research. About a decade ago the microdialysis technique entered the field of pharmacokinetic research, first in the brain and later also in peripheral tissues and blood. Within this period much has been learned about the proper use of this technique. Today, it has outgrown its teething problems and its potential and limitations have become more or less well defined. As microdialysis is a delicate technique for which experimental factors appear to be critical with respect to the validity of the experimental outcomes, several factors should be considered. These include the probe; the perfusion solution; the post-surgery interval in relation to surgical trauma, tissue integrity, and repeated experiments; the analysis of microdialysate samples; and the quantification of microdialysate data. Provided that experimental conditions are optimized to give valid and quantitative results, microdialysis can provide numerous data points from a relatively small number of individual animals to determine detailed pharmacokinetic information. One example of the added value of this technique compared with other in vivo pharmacokinetic techniques is that microdialysis reflects free concentrations in tissues and plasma. This provides the opportunity to assess information on drug transport equilibration across membranes such as the blood-brain barrier, which has already provided new insights. With the progress of analytical methodology, especially with respect to low-volume/low-concentration measurements and simultaneous measurement of multiple compounds, the applications and importance of the microdialysis technique in pharmacokinetic research will continue to increase.
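
The "quantification of microdialysate data" step mentioned above typically corrects the measured dialysate concentration by the probe's relative recovery. A common calibration is retrodialysis, where recovery is estimated from the fractional loss of a calibrator perfused through the probe. All concentration values below are invented for illustration.

```python
# Converting a microdialysate concentration to a free tissue concentration
# using relative recovery calibrated by retrodialysis (calibrator loss).

def relative_recovery_by_loss(c_in, c_out):
    """Fractional loss of calibrator across the probe membrane."""
    return (c_in - c_out) / c_in

c_perfusate_in = 100.0    # calibrator entering the probe (ng/ml, hypothetical)
c_perfusate_out = 70.0    # calibrator leaving the probe (ng/ml, hypothetical)
recovery = relative_recovery_by_loss(c_perfusate_in, c_perfusate_out)

c_dialysate = 12.0                        # measured drug in dialysate (ng/ml)
c_free_tissue = c_dialysate / recovery    # estimated free tissue concentration
```

Because microdialysis samples only unbound drug, the corrected value estimates the free concentration directly, which is the property the review highlights for blood-brain barrier transport studies.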

  13. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.
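
Characterizing a measurement system against a traceable gauge separates accuracy (systematic offset of the mean from the certified value) from precision (scatter of repeated readings). The sketch below uses invented readings purely to illustrate that distinction, not data from the paper.

```python
import math

# Repeated measurements of a reference gauge (values hypothetical, in mm).
reference = 10.000
readings = [10.003, 9.998, 10.001, 10.002, 9.999, 10.003]

mean = sum(readings) / len(readings)
accuracy_error = abs(mean - reference)          # systematic offset from the gauge
precision = math.sqrt(sum((r - mean) ** 2 for r in readings)
                      / (len(readings) - 1))    # sample standard deviation
```

A system can be precise but inaccurate (tight scatter around a biased mean) or vice versa, which is why both quantities are reported against NIST-traceable standards.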

  14. Antibodies against toluene diisocyanate protein conjugates. Three methods of measurement.

    PubMed

    Patterson, R; Harris, K E; Zeiss, C R

    1983-12-01

    With the use of canine antisera against toluene diisocyanate (TDI)-dog serum albumin (DSA), techniques for measuring antibody against TDI-DSA were evaluated. The use of an ammonium sulfate precipitation assay showed suggestive evidence of antibody binding but high levels of TDI-DSA precipitation in the absence of antibody limit any usefulness of this technique. Double-antibody co-precipitation techniques will measure total antibody or Ig class antibody against 125I-TDI-DSA. These techniques are quantitative. The polystyrene tube radioimmunoassay is a highly sensitive method of detecting and quantitatively estimating IgG antibody. The enzyme linked immunosorbent assay is a rapidly adaptable method for the quantitative estimation of IgG, IgA, and IgM against TDI-homologous proteins. All these techniques were compared and results are demonstrated by using the same serum sample for analysis.

  15. Quantitative NDE of Composite Structures at NASA

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Leckey, Cara A. C.; Howell, Patricia A.; Johnston, Patrick H.; Burke, Eric R.; Zalameda, Joseph N.; Winfree, William P.; Seebo, Jeffery P.

    2015-01-01

The use of composite materials continues to increase in the aerospace community due to the potential benefits of reduced weight, increased strength, and manufacturability. Ongoing work at NASA involves the use of large-scale composite structures for spacecraft (payload shrouds, cryotanks, crew modules, etc.). NASA is also working to enable the use and certification of composites in aircraft structures through the Advanced Composites Project (ACP). The rapid, in situ characterization of a wide range of composite materials and structures has become a critical concern for the industry. In many applications it is necessary to monitor changes in these materials over long periods of time. The quantitative characterization of composite defects such as fiber waviness, reduced bond strength, delamination damage, and microcracking is of particular interest. The research approaches of NASA's Nondestructive Evaluation Sciences Branch include investigation of conventional, guided wave, and phase-sensitive ultrasonic methods, infrared thermography, and x-ray computed tomography techniques. The use of simulation tools for optimizing and developing these methods is also an active area of research. This paper will focus on current research activities related to large-area NDE for rapidly characterizing aerospace composites.

  16. Qualitative GIS and the Visualization of Narrative Activity Space Data

    PubMed Central

    Mennis, Jeremy; Mason, Michael J.; Cao, Yinghui

    2012-01-01

    Qualitative activity space data, i.e. qualitative data associated with the routine locations and activities of individuals, are recognized as increasingly useful by researchers in the social and health sciences for investigating the influence of environment on human behavior. However, there has been little research on techniques for exploring qualitative activity space data. This research illustrates the theoretical principles of combining qualitative and quantitative data and methodologies within the context of GIS, using visualization as the means of inquiry. Through the use of a prototype implementation of a visualization system for qualitative activity space data, and its application in a case study of urban youth, we show how these theoretical methodological principles are realized in applied research. The visualization system uses a variety of visual variables to simultaneously depict multiple qualitative and quantitative attributes of individuals’ activity spaces. The visualization is applied to explore the activity spaces of a sample of urban youth participating in a study on the geographic and social contexts of adolescent substance use. Examples demonstrate how the visualization may be used to explore individual activity spaces to generate hypotheses, investigate statistical outliers, and explore activity space patterns among subject subgroups. PMID:26190932

  17. Qualitative GIS and the Visualization of Narrative Activity Space Data.

    PubMed

    Mennis, Jeremy; Mason, Michael J; Cao, Yinghui

    Qualitative activity space data, i.e. qualitative data associated with the routine locations and activities of individuals, are recognized as increasingly useful by researchers in the social and health sciences for investigating the influence of environment on human behavior. However, there has been little research on techniques for exploring qualitative activity space data. This research illustrates the theoretical principles of combining qualitative and quantitative data and methodologies within the context of GIS, using visualization as the means of inquiry. Through the use of a prototype implementation of a visualization system for qualitative activity space data, and its application in a case study of urban youth, we show how these theoretical methodological principles are realized in applied research. The visualization system uses a variety of visual variables to simultaneously depict multiple qualitative and quantitative attributes of individuals' activity spaces. The visualization is applied to explore the activity spaces of a sample of urban youth participating in a study on the geographic and social contexts of adolescent substance use. Examples demonstrate how the visualization may be used to explore individual activity spaces to generate hypotheses, investigate statistical outliers, and explore activity space patterns among subject subgroups.

  18. Beryllium particle combustion

    NASA Technical Reports Server (NTRS)

    Prentice, J. L.

    1972-01-01

    A two-year study of the combustion efficiency of single beryllium droplets burning in a variety of oxidizers (primarily mixtures of oxygen/argon and oxygen/nitrogen) is summarized. An advanced laser heating technique was used to acquire systematic quantitative data on the burning of single beryllium droplets at atmospheric pressure. The research confirmed the sensitivity of beryllium droplet combustion to the chemistry of environmental species and provides experimental documentation for the nitrogen-induced droplet fragmentation of burning beryllium droplets.

  19. Quantitative real-time RT-PCR assay for research studies on enterovirus infections in the central nervous system.

    PubMed

    Volle, Romain; Nourrisson, Céline; Mirand, Audrey; Regagnon, Christel; Chambon, Martine; Henquell, Cécile; Bailly, Jean-Luc; Peigue-Lafeuille, Hélène; Archimbaud, Christine

    2012-10-01

Human enteroviruses are the most frequent cause of aseptic meningitis and are involved in other neurological infections. Qualitative detection of enterovirus genomes in cerebrospinal fluid is a prerequisite in diagnosing neurological diseases. The pathogenesis of these infections is not well understood, and research in this domain would benefit from the availability of a quantitative technique to determine viral load in clinical specimens. This study describes the development of a real-time RT-qPCR assay using a hydrolysis TaqMan probe and a competitive RNA internal control. The assay has high specificity and can be used for a large sample of distinct enterovirus strains and serotypes. The reproducible limit of detection was estimated at 1875 copies/ml of quantitative standards composed of RNA transcripts obtained from a cloned echovirus 30 genome. Technical performance was unaffected by the introduction of a competitive RNA internal control before RNA extraction. The mean enterovirus RNA concentration in an evaluation series of 15 archived cerebrospinal fluid specimens was determined at 4.78 log(10) copies/ml for the overall sample. The sensitivity and reproducibility of the real-time RT-qPCR assay, used in combination with the internal control to monitor the overall specimen process, make it a valuable tool for applied research into enterovirus infections. Copyright © 2012 Elsevier B.V. All rights reserved.
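
Quantification in an assay of this kind rests on a standard curve: Ct values of RNA-transcript standards are regressed against log10 copy number, and an unknown's Ct is read back through the fitted line. The Ct values below are made-up illustration data, not from the study.

```python
# Standard-curve quantification for RT-qPCR (illustrative numbers only).

def fit_line(xs, ys):
    """Least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

log_copies = [3.0, 4.0, 5.0, 6.0]       # standards, log10(copies/ml)
ct = [33.0, 29.7, 26.4, 23.1]           # about -3.3 Ct per tenfold dilution

slope, intercept = fit_line(log_copies, ct)
efficiency = 10 ** (-1 / slope) - 1      # amplification efficiency (1.0 = 100%)
unknown = 10 ** ((28.0 - intercept) / slope)  # copies/ml for a sample at Ct 28.0
```

A slope near -3.32 corresponds to perfect doubling per cycle; monitoring the internal control's Ct flags extraction or inhibition problems that would bias the read-back.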

  20. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS).CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  1. Molecular methods of measurement of hepatitis B virus, hepatitis C virus, and human immunodeficiency virus infection: implications for occupational health practice

    PubMed Central

    Kao, J. H.; Heptonstall, J.; Chen, D. S.

    1999-01-01

    Over the past decade, several molecular techniques for the detection of human immunodeficiency virus (HIV), hepatitis B virus (HBV), and hepatitis C virus (HCV) have been developed that have implications for occupational health practice. This review describes the techniques used for qualitative and quantitative detection of the viral genome, and briefly explains nucleic acid sequencing and analysis of phylogenetic trees. The review also discusses the current and potential uses of these techniques in investigations of transmission of bloodborne viruses by patient to worker and worker to patient, in the management of occupational exposure to blood, in research, and in the development of guidance and policy on infected healthcare workers who perform procedures prone to exposure.   PMID:10658557

  2. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Abstract Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low‐throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non‐chlorinated organic solvent and is tailored for higher‐throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast‐GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day‐to‐day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. The method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses. PMID:27534715

  3. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  4. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  5. Registration of knee joint surfaces for the in vivo study of joint injuries based on magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Cheng, Rita W. T.; Habib, Ayman F.; Frayne, Richard; Ronsky, Janet L.

    2006-03-01

    In-vivo quantitative assessments of joint conditions and health status can help to increase understanding of the pathology of osteoarthritis, a degenerative joint disease that affects a large population each year. Magnetic resonance imaging (MRI) provides a non-invasive and accurate means to assess and monitor joint properties, and has become widely used for diagnosis and biomechanics studies. Quantitative analyses and comparisons of MR datasets require accurate alignment of anatomical structures, thus image registration becomes a necessary procedure for these applications. This research focuses on developing a registration technique for MR knee joint surfaces to allow quantitative study of joint injuries and health status. It introduces a novel idea of translating techniques originally developed for geographic data in the field of photogrammetry and remote sensing to register 3D MR data. The proposed algorithm works with surfaces that are represented by randomly distributed points with no requirement of known correspondences. The algorithm performs matching locally by identifying corresponding surface elements, and solves for the transformation parameters relating the surfaces by minimizing normal distances between them. This technique was used in three applications to: 1) register temporal MR data to verify the feasibility of the algorithm to help monitor diseases, 2) quantify patellar movement with respect to the femur based on the transformation parameters, and 3) quantify changes in contact area locations between the patellar and femoral cartilage at different knee flexion angles. The results indicate accurate registration and the proposed algorithm can be applied for in-vivo study of joint injuries with MRI.
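
The matching step described above can be illustrated with a deliberately simplified, translation-only variant of iterative surface matching: pair each source point with its nearest target point, shift by the mean residual, and repeat. The authors' photogrammetric algorithm additionally recovers rotation and minimizes normal (not point-to-point) distances; the points below are synthetic.

```python
# Translation-only iterative nearest-neighbour registration (toy version).

def register_translation(source, target, iters=20):
    shift = [0.0, 0.0, 0.0]
    for _ in range(iters):
        moved = [[p[i] + shift[i] for i in range(3)] for p in source]
        # nearest-neighbour correspondences (brute force)
        pairs = [(m, min(target,
                         key=lambda q: sum((a - b) ** 2 for a, b in zip(m, q))))
                 for m in moved]
        # update the shift by the mean residual vector
        for i in range(3):
            shift[i] += sum(q[i] - m[i] for m, q in pairs) / len(pairs)
    return shift

target = [[0, 0, 0], [1, 0, 0], [0, 2, 0], [1, 2, 1], [2, 1, 0]]
true_shift = [0.3, -0.2, 0.1]
source = [[p[i] - true_shift[i] for i in range(3)] for p in target]

est = register_translation(source, target)
```

Because the surfaces are represented by unstructured points with no known correspondences, each iteration must re-establish matches before re-solving for the transformation, which is the same loop structure the full algorithm uses.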

  6. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1993-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.
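
A decision-analysis spreadsheet of the kind described reduces to a weighted-scoring matrix: each crop is scored against weighted criteria and the weighted sums are compared. The crops, criteria, weights, and scores below are invented for illustration and are not the study's data.

```python
# Minimal weighted-scoring decision matrix for crop selection (toy values).

weights = {"O2_production": 0.4, "volume": 0.3, "power": 0.2, "food_value": 0.1}

# Scores per crop on a 0-10 scale (higher is better; volume and power are
# scored inversely, so a low-volume crop gets a high "volume" score).
crops = {
    "wheat":   {"O2_production": 8, "volume": 6, "power": 5, "food_value": 7},
    "lettuce": {"O2_production": 5, "volume": 9, "power": 8, "food_value": 3},
    "potato":  {"O2_production": 7, "volume": 7, "power": 6, "food_value": 8},
}

totals = {crop: sum(weights[c] * s for c, s in scores.items())
          for crop, scores in crops.items()}
best = max(totals, key=totals.get)
```

Varying the weights directly encodes the "level of life support supplied by the plants" the text mentions: raising the O2 weight, for instance, can change which crop wins.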

  7. Techniques for optimal crop selection in a controlled ecological life support system

    NASA Technical Reports Server (NTRS)

    Mccormack, Ann; Finn, Cory; Dunsky, Betsy

    1992-01-01

    A Controlled Ecological Life Support System (CELSS) utilizes a plant's natural ability to regenerate air and water while being grown as a food source in a closed life support system. Current plant research is directed toward obtaining quantitative empirical data on the regenerative ability of each species of plant and the system volume and power requirements. Two techniques were adapted to optimize crop species selection while at the same time minimizing the system volume and power requirements. Each allows the level of life support supplied by the plants to be selected, as well as other system parameters. The first technique uses decision analysis in the form of a spreadsheet. The second method, which is used as a comparison with and validation of the first, utilizes standard design optimization techniques. Simple models of plant processes are used in the development of these methods.

  8. Tooth color measurement using Chroma Meter: techniques, advantages, and disadvantages.

    PubMed

    Li, Yiming

    2003-01-01

    Tooth whitening has become a popular and routine dental procedure, and its efficacy and safety have been well documented. However, the measurement of tooth color, particularly in the evaluation of the efficacy of a system intended to enhance tooth whiteness, remains a challenge. One of the instruments used for assessing tooth color in clinical whitening studies is the Minolta Chroma Meter CR-321 (Minolta Corporation USA, Ramsey, NJ, USA). This article describes the instrument and discusses various measuring procedures and the Chroma Meter's advantages, limitations, and disadvantages. The available information indicates that, although Minolta Chroma Meter CR-321 provides quantitative and objective measurements of tooth color, it can be tedious to use with a custom alignment device. The Chroma Meter data are inconsistent with the commonly used visual instruments such as Vitapan Classical Shade Guide (Vita Zahnfabrik, Bad Säckingen, Germany), although in many cases the general trends are similar. It is also questionable whether the small area measured adequately represents the color of the whole tooth. A more critical challenge is the lack of methods for interpreting the Chroma Meter data regarding tooth color change in studies evaluating the efficacy of whitening systems. Consequently, at present the Chroma Meter data alone do not appear to be adequate for determining tooth color change in whitening research, although the quantitative measurements may be useful as supplemental or supportive data. Research is needed to develop and improve the instrument and technique for quantitative measurement of tooth color and interpretation of the data for evaluating tooth color change. This paper will help readers to understand the advantages and limitations of the Minolta Chroma Meter used for evaluating the efficacy of tooth-whitening systems so that proper judgment can be made in the interpretation of the results of clinical studies.
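Chroma Meter readings are expressed in CIELAB coordinates (L*, a*, b*), and a common way to summarize tooth color change between two readings is the CIE76 color difference ΔE*ab. This is a generic CIELAB sketch with hypothetical readings, not the instrument's own software.

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 color difference between two (L*, a*, b*) readings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

# hypothetical before/after whitening readings for one tooth
baseline = (68.0, 2.0, 18.0)
after = (71.0, 1.0, 14.0)
change = delta_e_ab(baseline, after)
```

Interpreting such a ΔE*ab value in terms of clinically meaningful whitening is exactly the open problem the article raises.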

  9. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

This article, which forms part of the research series, addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy, and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  10. Standardized pivot shift test improves measurement accuracy.

    PubMed

    Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker

    2012-04-01

The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of quantitative pivot shift measurements obtained with different surgeons' preferred techniques to that obtained with a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and a high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during the pivot shift test was similar between the surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and the standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s²; right 2.5 ± 0.7 mm/s²) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s²; right 3.4 ± 2.3 mm/s²; both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.
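The consistency comparison above amounts to comparing dispersion across examiners; a coefficient-of-variation sketch using the reported left-knee acceleration means and SDs makes the point concrete.

```python
def coefficient_of_variation(mean, sd):
    """Relative dispersion: SD expressed as a fraction of the mean."""
    return sd / mean

# left-knee acceleration values reported in the abstract
cv_standardized = coefficient_of_variation(3.0, 1.3)
cv_preferred = coefficient_of_variation(4.3, 3.3)
```

The standardized maneuver yields a markedly lower relative spread, which is the study's central finding.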

  11. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD for a single-user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL, including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets, and plans for their future development.
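One of the simplest quantitative measures such a toolbox provides is porosity computed from a segmented (binary) tomogram. A minimal numpy sketch, using a hypothetical volume rather than real tomography data:

```python
import numpy as np

def porosity(binary_volume):
    """Fraction of pore voxels in a segmented tomogram (nonzero = pore)."""
    return float(np.count_nonzero(binary_volume)) / binary_volume.size

# hypothetical 3D segmented volume: 1 = pore, 0 = solid
vol = np.zeros((4, 4, 4), dtype=np.uint8)
vol[:2] = 1  # half the voxels are pore space
```

Real pipelines add the steps upstream of this: reconstruction, filtering, and segmentation of the raw grayscale tomogram.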

  12. The influence of social capital towards the quality of community tourism services in Lake Toba Parapat North Sumatera

    NASA Astrophysics Data System (ADS)

    Revida, Erika; Yanti Siahaan, Asima; Purba, Sukarman

    2018-03-01

The objective of the research was to analyze the influence of social capital on the quality of community tourism services in Lake Toba, Parapat, North Sumatera. The method combined quantitative and qualitative research. The sample, drawn by simple random sampling, comprised 150 heads of family from the community in the area around Lake Toba, Parapat. Data collection techniques included documentary studies, questionnaires, interviews, and observations, while data analysis used Product Moment correlation and simple linear regression. The results showed a positive and significant influence of social capital on the quality of community tourism services in Lake Toba, Parapat, North Sumatera. The research recommends enhancing social capital (trust, norms, and networks) and the quality of community tourism services (tangibles, reliability, responsiveness, assurance, and empathy) through continuous communication, information, and education from families, formal and informal institutions, community leaders, religious figures, and all communities in Lake Toba, Parapat, North Sumatera.
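The analysis pairs a Pearson product-moment correlation with simple linear regression; both can be sketched in a few lines of numpy. The data here are illustrative, not the study's survey responses.

```python
import numpy as np

def pearson_r(x, y):
    """Product-moment (Pearson) correlation coefficient."""
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def simple_linear_regression(x, y):
    """Least-squares fit y = a + b*x; returns (intercept a, slope b)."""
    b = float(np.cov(x, y, ddof=1)[0, 1] / np.var(x, ddof=1))
    a = float(y.mean() - b * x.mean())
    return a, b
```

In the study's terms, x would be a respondent's social-capital score and y the corresponding service-quality score.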

  13. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  14. Integrating service development with evaluation in telehealthcare: an ethnographic study.

    PubMed

    Finch, Tracy; May, Carl; Mair, Frances; Mort, Maggie; Gask, Linda

    2003-11-22

    To identify issues that facilitate the successful integration of evaluation and development of telehealthcare services. Ethnographic study using various qualitative research techniques to obtain data from several sources, including in-depth semistructured interviews, project steering group meetings, and public telehealthcare meetings. Seven telehealthcare evaluation projects (four randomised controlled trials and three pragmatic service evaluations) in the United Kingdom, studied over two years. Projects spanned a range of specialties-dermatology, psychiatry, respiratory medicine, cardiology, and oncology. Clinicians, managers, technical experts, and researchers involved in the projects. Key problems in successfully integrating evaluation and service development in telehealthcare are, firstly, defining existing clinical practices (and anticipating changes) in ways that permit measurement; secondly, managing additional workload and conflicting responsibilities brought about by combining clinical and research responsibilities (including managing risk); and, thirdly, understanding various perspectives on effectiveness and the limitations of evaluation results beyond the context of the research study. Combined implementation and evaluation of telehealthcare systems is complex, and is often underestimated. The distinction between quantitative outcomes and the workability of the system is important for producing evaluative knowledge that is of practical value. More pragmatic approaches to evaluation, that permit both quantitative and qualitative methods, are required to improve the quality of such research and its relevance for service provision in the NHS.

  15. Who theorizes age? The "socio-demographic variables" device and age-period-cohort analysis in the rhetoric of survey research.

    PubMed

    Rughiniș, Cosima; Humă, Bogdana

    2015-12-01

    In this paper we argue that quantitative survey-based social research essentializes age, through specific rhetorical tools. We outline the device of 'socio-demographic variables' and we discuss its argumentative functions, looking at scientific survey-based analyses of adult scientific literacy, in the Public Understanding of Science research field. 'Socio-demographics' are virtually omnipresent in survey literature: they are, as a rule, used and discussed as bundles of independent variables, requiring little, if any, theoretical and measurement attention. 'Socio-demographics' are rhetorically effective through their common-sense richness of meaning and inferential power. We identify their main argumentation functions as 'structure building', 'pacification', and 'purification'. Socio-demographics are used to uphold causal vocabularies, supporting the transmutation of the descriptive statistical jargon of 'effects' and 'explained variance' into 'explanatory factors'. Age can also be studied statistically as a main variable of interest, through the age-period-cohort (APC) disambiguation technique. While this approach has generated interesting findings, it did not mitigate the reductionism that appears when treating age as a socio-demographic variable. By working with age as a 'socio-demographic variable', quantitative researchers convert it (inadvertently) into a quasi-biological feature, symmetrical, as regards analytical treatment, with pathogens in epidemiological research. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Development of High Speed Imaging and Analysis Techniques for Compressible Dynamic Stall

    NASA Technical Reports Server (NTRS)

    Chandrasekhara, M. S.; Carr, L. W.; Wilder, M. C.; Davis, Sanford S. (Technical Monitor)

    1996-01-01

    Dynamic stall has limited the flight envelope of helicopters for many years. The problem has been studied in the laboratory as well as in flight, but most research, even in the laboratory, has been restricted to surface measurement techniques such as pressure transducers or skin friction gauges, except at low speed. From this research, it became apparent that flow visualization tests performed at Mach numbers representing actual flight conditions were needed if the complex physics associated with dynamic stall was to be properly understood. However, visualization of the flow field during compressible conditions required carefully aligned and meticulously reconstructed holographic interferometry. As part of a long-range effort focused on exposing the physics of compressible dynamic stall, a research wind tunnel was developed at NASA Ames Research Center which permits visual access to the full flow field surrounding an oscillating airfoil during compressible dynamic stall. Initially, a stroboscopic schlieren technique was used for visualization of the stall process, but the primary research tool has been point diffraction interferometry (PDI), a technique carefully optimized for use in this project. A review of the process of development of PDI will be presented in the full paper. One of the most valuable aspects of PDI is the fact that interferograms are produced in real time on a continuous basis. The use of a rapidly-pulsed laser makes this practical; a discussion of this approach will be presented in the full paper. This rapid pulsing (up to 40,000 pulses/sec) produces interferograms of the rapidly developing dynamic stall field in sufficient resolution (both in space and time) that the fluid physics of the compressible dynamic stall flowfield can be quantitatively determined, including the gradients of pressure in space and time.
This permits analysis of the influence of pitch rate, Mach number, Reynolds number, amplitude of oscillation, and other parameters on the dynamic stall process. When interferograms can be captured in real time, real-time mapping of a developing unsteady flow such as dynamic stall becomes a possibility. This has been achieved in the present case through the use of a high-speed drum camera combined with electronic circuitry, which has resulted in a series of interferograms obtained during a single cycle of dynamic stall; images obtained at the rate of 20 kHz will be presented as a part of the formal presentation. Interferometry has been available for a long time; however, most of its use has been limited to visualization. The present research has focused on the use of interferograms for quantitative mapping of the flow over oscillating airfoils. Instantaneous pressure distributions can now be obtained semi-automatically, making practical the analysis of the thousands of interferograms that are produced in this research. A review of the techniques that have been developed as part of this research effort will be presented in the final paper.
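Extracting quantitative data from an interferogram rests on the Gladstone-Dale relation: each fringe corresponds to a fixed change in optical path through the test section, so counting fringes gives density, from which pressure can then be inferred. A sketch of the fringe-to-density step, with hypothetical values for the laser wavelength, test-section span, and reference density (not the actual facility parameters):

```python
def density_from_fringe(n_fringe, wavelength, gladstone_dale, span, rho_ref):
    """Density at a point from its interferometric fringe number.

    Each fringe shift corresponds to one wavelength of optical path change
    across the span L, so: rho = rho_ref + n * lambda / (K * L),
    where K is the Gladstone-Dale constant of the gas.
    """
    return rho_ref + n_fringe * wavelength / (gladstone_dale * span)

# hypothetical setup: 532 nm laser, 0.25 m span, K for air, sea-level reference
rho = density_from_fringe(n_fringe=-3, wavelength=532e-9,
                          gladstone_dale=2.26e-4, span=0.25, rho_ref=1.225)
```

Negative fringe numbers here mark regions of reduced density, such as the suction side of the airfoil.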

  17. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  18. Misconceived Relationships between Logical Positivism and Quantitative Research: An Analysis in the Framework of Ian Hacking.

    ERIC Educational Resources Information Center

    Yu, Chong Ho

    Although quantitative research methodology is widely applied by psychological researchers, there is a common misconception that quantitative research is based on logical positivism. This paper examines the relationship between quantitative research and eight major notions of logical positivism: (1) verification; (2) pro-observation; (3)…

  19. Quantitative RNA-seq analysis of the Campylobacter jejuni transcriptome

    PubMed Central

    Chaudhuri, Roy R.; Yu, Lu; Kanji, Alpa; Perkins, Timothy T.; Gardner, Paul P.; Choudhary, Jyoti; Maskell, Duncan J.

    2011-01-01

    Campylobacter jejuni is the most common bacterial cause of foodborne disease in the developed world. Its general physiology and biochemistry, as well as the mechanisms enabling it to colonize and cause disease in various hosts, are not well understood, and new approaches are required to understand its basic biology. High-throughput sequencing technologies provide unprecedented opportunities for functional genomic research. Recent studies have shown that direct Illumina sequencing of cDNA (RNA-seq) is a useful technique for the quantitative and qualitative examination of transcriptomes. In this study we report RNA-seq analyses of the transcriptomes of C. jejuni (NCTC11168) and its rpoN mutant. This has allowed the identification of hitherto unknown transcriptional units, and further defines the regulon that is dependent on rpoN for expression. The analysis of the NCTC11168 transcriptome was supplemented by additional proteomic analysis using liquid chromatography-MS. The transcriptomic and proteomic datasets represent an important resource for the Campylobacter research community. PMID:21816880
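Quantitative comparison of RNA-seq transcriptomes requires normalizing raw read counts for both gene length and sequencing depth; RPKM is one common scheme, shown generically here (not necessarily the normalization used in this study).

```python
def rpkm(read_count, gene_length_bp, total_mapped_reads):
    """Reads Per Kilobase of transcript per Million mapped reads."""
    return read_count / ((gene_length_bp / 1e3) * (total_mapped_reads / 1e6))
```

With such normalized values, expression of a gene can be compared between wild-type and mutant libraries of different sequencing depths, e.g. to delineate an rpoN-dependent regulon.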

  20. A novel method for unsteady flow field segmentation based on stochastic similarity of direction

    NASA Astrophysics Data System (ADS)

    Omata, Noriyasu; Shirayama, Susumu

    2018-04-01

    Recent developments in fluid dynamics research have opened up the possibility for the detailed quantitative understanding of unsteady flow fields. However, the visualization techniques currently in use generally provide only qualitative insights. A method for dividing the flow field into physically relevant regions of interest can help researchers quantify unsteady fluid behaviors. Most methods at present compare the trajectories of virtual Lagrangian particles. The time-invariant features of an unsteady flow are also frequently of interest, but the Lagrangian specification only reveals time-variant features. To address these challenges, we propose a novel method for the time-invariant spatial segmentation of an unsteady flow field. This segmentation method does not require Lagrangian particle tracking but instead quantitatively compares the stochastic models of the direction of the flow at each observed point. The proposed method is validated with several clustering tests for 3D flows past a sphere. Results show that the proposed method reveals the time-invariant, physically relevant structures of an unsteady flow.
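The core idea, comparing stochastic models of flow direction at each observed point rather than Lagrangian trajectories, can be sketched by building a direction histogram over time at each point and comparing histograms between points. The Bhattacharyya coefficient below is an illustrative similarity choice, not necessarily the paper's measure, and the angles are hypothetical.

```python
import numpy as np

def direction_histogram(angles, bins=16):
    """Empirical distribution of 2D flow direction at one point over time."""
    hist, _ = np.histogram(angles, bins=bins, range=(-np.pi, np.pi))
    return hist / hist.sum()

def bhattacharyya(p, q):
    """Similarity of two direction distributions (1 = identical, 0 = disjoint)."""
    return float(np.sum(np.sqrt(p * q)))
```

Segmentation then amounts to clustering points whose direction distributions are mutually similar, yielding time-invariant regions of the unsteady flow.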

  1. Dissection and Downstream Analysis of Zebra Finch Embryos at Early Stages of Development

    PubMed Central

    Murray, Jessica R.; Stanciauskas, Monika E.; Aralere, Tejas S.; Saha, Margaret S.

    2014-01-01

    The zebra finch (Taeniopygia guttata) has become an increasingly important model organism in many areas of research including toxicology, behavior, and memory and learning. As the only songbird with a sequenced genome, the zebra finch has great potential for use in developmental studies; however, the early stages of zebra finch development have not been well studied. Lack of research in zebra finch development can be attributed to the difficulty of dissecting the small egg and embryo. The following dissection method minimizes embryonic tissue damage, which allows for investigation of morphology and gene expression at all stages of embryonic development. This permits both bright field and fluorescence quality imaging of embryos, use in molecular procedures such as in situ hybridization (ISH), cell proliferation assays, and RNA extraction for quantitative assays such as quantitative real-time PCR (qtRT-PCR). This technique allows investigators to study early stages of development that were previously difficult to access. PMID:24999108

  2. Quantitative Proton Magnetic Resonance Techniques for Measuring Fat

    PubMed Central

    Hu, Houchun Harry; Kan, Hermien E.

    2014-01-01

    Accurate, precise, and reliable techniques for quantifying body and organ fat distributions are important tools in physiology research. They are critically needed in studies of obesity and diseases involving excess fat accumulation. Proton magnetic resonance methods address this need by providing an array of relaxometry-based (T1, T2) and chemical-shift-based approaches. These techniques can generate informative visualizations of regional and whole-body fat distributions, yield measurements of fat volumes within specific body depots, and quantify fat accumulation in abdominal organs and muscles. MR methods are commonly used to investigate the role of fat in nutrition and metabolism, to measure the efficacy of short and long-term dietary and exercise interventions, to study the implications of fat in organ steatosis and muscular dystrophies, and to elucidate pathophysiological mechanisms in the context of obesity and its comorbidities. The purpose of this review is to provide a summary of mainstream MR strategies for fat quantification. The article will succinctly describe the principles that differentiate water and fat proton signals, summarize advantages and limitations of various techniques, and offer a few illustrative examples. The article will also highlight recent efforts in MR of brown adipose tissue and conclude by briefly discussing some future research directions. PMID:24123229
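Chemical-shift-based methods separate the water (W) and fat (F) signal in each voxel; the standard summary metric is then the signal fat fraction FF = F / (W + F). A minimal sketch with hypothetical signal values (real pipelines additionally correct for relaxation and other confounds before reporting a proton-density fat fraction):

```python
import numpy as np

def fat_fraction(water, fat):
    """Signal fat fraction per voxel: F / (W + F)."""
    water = np.asarray(water, dtype=float)
    fat = np.asarray(fat, dtype=float)
    return fat / (water + fat)
```

Applied voxel-wise to separated water and fat images, this yields the fat-fraction maps used to quantify organ steatosis or muscle fat infiltration.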

  3. An optimized color transformation for the analysis of digital images of hematoxylin & eosin stained slides.

    PubMed

    Zarella, Mark D; Breen, David E; Plagov, Andrei; Garcia, Fernando U

    2015-01-01

    Hematoxylin and eosin (H&E) staining is ubiquitous in pathology practice and research. As digital pathology has evolved, the reliance of quantitative methods that make use of H&E images has similarly expanded. For example, cell counting and nuclear morphometry rely on the accurate demarcation of nuclei from other structures and each other. One of the major obstacles to quantitative analysis of H&E images is the high degree of variability observed between different samples and different laboratories. In an effort to characterize this variability, as well as to provide a substrate that can potentially mitigate this factor in quantitative image analysis, we developed a technique to project H&E images into an optimized space more appropriate for many image analysis procedures. We used a decision tree-based support vector machine learning algorithm to classify 44 H&E stained whole slide images of resected breast tumors according to the histological structures that are present. This procedure takes an H&E image as an input and produces a classification map of the image that predicts the likelihood of a pixel belonging to any one of a set of user-defined structures (e.g., cytoplasm, stroma). By reducing these maps into their constituent pixels in color space, an optimal reference vector is obtained for each structure, which identifies the color attributes that maximally distinguish one structure from other elements in the image. We show that tissue structures can be identified using this semi-automated technique. By comparing structure centroids across different images, we obtained a quantitative depiction of H&E variability for each structure. This measurement can potentially be utilized in the laboratory to help calibrate daily staining or identify troublesome slides. Moreover, by aligning reference vectors derived from this technique, images can be transformed in a way that standardizes their color properties and makes them more amenable to image processing.
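The final transformation step, projecting each pixel's color onto a structure's optimal reference vector, can be sketched as a normalized dot product; the reference vector and pixel values below are hypothetical, not the study's learned vectors.

```python
import numpy as np

def project_onto_reference(pixels, reference):
    """Project RGB pixels onto a unit reference color vector.

    pixels    : (N, 3) RGB values
    reference : (3,) reference color vector for one structure
    Returns (N,) scalar responses; larger means closer to that color axis.
    """
    ref = np.asarray(reference, dtype=float)
    ref = ref / np.linalg.norm(ref)
    return np.asarray(pixels, dtype=float) @ ref
```

Aligning the reference vectors of two slides before projection is what standardizes their color properties across staining batches.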

  4. New Tool Quantitatively Maps Minority-Carrier Lifetime of Multicrystalline Silicon Bricks (Fact Sheet)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    2011-11-01

    NREL's new imaging tool could provide manufacturers with insight on their processes. Scientists at the National Renewable Energy Laboratory (NREL) have used capabilities within the Process Development and Integration Laboratory (PDIL) to generate quantitative minority-carrier lifetime maps of multicrystalline silicon (mc-Si) bricks. This feat has been accomplished by using the PDIL's photoluminescence (PL) imaging system in conjunction with transient lifetime measurements obtained using a custom NREL-designed resonance-coupled photoconductive decay (RCPCD) system. PL imaging can obtain rapid high-resolution images that provide a qualitative assessment of the material lifetime, with the lifetime proportional to the pixel intensity. In contrast, the RCPCD technique provides a fast quantitative measure of the lifetime with a lower resolution and penetrates millimeters into the mc-Si brick, providing information on bulk lifetimes and material quality. This technique contrasts with commercially available minority-carrier lifetime mapping systems that use microwave conductivity measurements. Such measurements are dominated by surface recombination and lack information on the material quality within the bulk of the brick. By combining these two complementary techniques, we obtain high-resolution lifetime maps at very fast data acquisition times, attributes necessary for a production-based diagnostic tool. These bulk lifetime measurements provide manufacturers with invaluable feedback on their silicon ingot casting processes. NREL has been applying the PL images of lifetime in mc-Si bricks in collaboration with a U.S. photovoltaic industry partner through Recovery Act Funded Project ARRA T24. NREL developed a new tool to quantitatively map minority-carrier lifetime of multicrystalline silicon bricks by using photoluminescence imaging in conjunction with resonance-coupled photoconductive decay measurements.
Researchers are not hindered by surface recombination and can look deeper into the material to map bulk lifetimes. The tool is being applied to silicon bricks in a project collaborating with a U.S. photovoltaic industry partner. Photovoltaic manufacturers can use the NREL tool to obtain valuable feedback on their silicon ingot casting processes.
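Because PL pixel intensity is proportional to lifetime while RCPCD gives an absolute lifetime at a point, a quantitative map can be obtained by scaling the PL image so that it matches the RCPCD value at the measured location. The sketch below assumes simple linear proportionality, as the fact sheet states, and uses hypothetical numbers.

```python
import numpy as np

def calibrate_pl_map(pl_image, rcpcd_lifetime_us, point):
    """Scale a qualitative PL intensity image into a quantitative lifetime map.

    Assumes lifetime is proportional to PL pixel intensity and that RCPCD
    supplies the absolute lifetime (in microseconds) at point = (row, col).
    """
    scale = rcpcd_lifetime_us / pl_image[point]
    return pl_image * scale
```

In practice several RCPCD points would be used to fit and validate the scaling rather than a single pixel.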

  5. Pilot clinical study for quantitative spectral diagnosis of non-melanoma skin cancer.

    PubMed

    Rajaram, Narasimhan; Reichenberg, Jason S; Migden, Michael R; Nguyen, Tri H; Tunnell, James W

    2010-12-01

    Several research groups have demonstrated the non-invasive diagnostic potential of diffuse optical spectroscopy (DOS) and laser-induced fluorescence (LIF) techniques for early cancer detection. By combining both modalities, one can simultaneously measure quantitative parameters related to the morphology, function and biochemical composition of tissue and use them to diagnose malignancy. The objective of this study was to use a quantitative reflectance/fluorescence spectroscopic technique to determine the optical properties of normal skin and non-melanoma skin cancers and the ability to accurately classify them. An additional goal was to determine the ability of the technique to differentiate non-melanoma skin cancers from normal skin. The study comprised 48 lesions measured from 40 patients scheduled for a biopsy of suspected non-melanoma skin cancers. White light reflectance and laser-induced fluorescence spectra (wavelength range = 350-700 nm) were collected from each suspected lesion and adjacent clinically normal skin using a custom-built, optical fiber-based clinical instrument. After measurement, the skin sites were biopsied and categorized according to histopathology. Using a quantitative model, we extracted various optical parameters from the measured spectra that could be correlated to the physiological state of tissue. Scattering from cancerous lesions was significantly lower than normal skin for every lesion group, whereas absorption parameters were significantly higher. Using numerical cut-offs for our optical parameters, our clinical instrument could classify basal cell carcinomas with a sensitivity and specificity of 94% and 89%, respectively. Similarly, the instrument classified actinic keratoses and squamous cell carcinomas with a sensitivity of 100% and specificity of 50%. 
The measured optical properties and fluorophore contributions of normal skin and non-melanoma skin cancers are significantly different from each other and correlate well with tissue pathology. A diagnostic algorithm that combines these extracted properties holds promise for the potential non-invasive diagnosis of skin cancer. Copyright © 2010 Wiley-Liss, Inc.
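The reported performance figures come from applying numerical cut-offs to the extracted optical parameters; computing sensitivity and specificity from such a threshold rule is straightforward. The values below are illustrative, not the study's measurements.

```python
def sensitivity_specificity(values, labels, cutoff):
    """Evaluate a threshold classifier: value >= cutoff predicts cancer.

    labels: True = cancer by biopsy/histopathology. Returns (sens, spec).
    """
    tp = sum(v >= cutoff and y for v, y in zip(values, labels))
    fn = sum(v < cutoff and y for v, y in zip(values, labels))
    tn = sum(v < cutoff and not y for v, y in zip(values, labels))
    fp = sum(v >= cutoff and not y for v, y in zip(values, labels))
    return tp / (tp + fn), tn / (tn + fp)
```

In the study the direction of the rule depends on the parameter: scattering is lower in lesions, absorption higher, so each parameter gets its own cutoff and sign.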

  6. CLOSED-LOOP STRIPPING ANALYSIS (CLSA) OF ...

    EPA Pesticide Factsheets

    Synthetic musk compounds have been found in surface water, fish tissues, and human breast milk. Current techniques for separating these compounds from fish tissues require tedious sample clean-up procedures. A simple method for the determination of these compounds in fish tissues has been developed. Closed-loop stripping of saponified fish tissues in a 1-L Wheaton purge-and-trap vessel is used to strip compounds with high vapor pressures, such as synthetic musks, from the matrix onto a solid sorbent (Abselut Nexus). This technique is useful for screening biological tissues that contain lipids for musk compounds. Analytes are desorbed from the sorbent trap sequentially with polar and nonpolar solvents, concentrated, and directly analyzed by high resolution gas chromatography coupled to a mass spectrometer operating in the selected ion monitoring mode. In this paper, we analyzed two homogenized samples of whole fish tissues spiked with synthetic musk compounds using closed-loop stripping analysis (CLSA) and pressurized liquid extraction (PLE). The analytes were not recovered quantitatively, but the extraction yield was sufficiently reproducible for at least semi-quantitative purposes (screening). The method was less expensive to implement and required significantly less sample preparation than the PLE technique. The research focused on in the subtasks is the development and application of state-of-the-art technologies to meet the needs of the public, Office of Water,

  7. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    PubMed

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique for generating quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in a Parkinson's disease model and a stroke brain model. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.
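Quantification against matrix-matched laboratory standards, as described above, is at its core a per-pixel linear calibration from ion counts to concentration. A minimal sketch; the standard concentrations, count values, and units below are invented for illustration:

```python
# Hypothetical calibration data from matrix-matched standards
standards_conc = [0.0, 5.0, 10.0, 20.0]            # metal concentration, ug/g
standards_counts = [100.0, 600.0, 1100.0, 2100.0]  # measured ion counts

# Ordinary least squares: counts = slope * conc + background
n = len(standards_conc)
mean_x = sum(standards_conc) / n
mean_y = sum(standards_counts) / n
num = sum((x - mean_x) * (y - mean_y)
          for x, y in zip(standards_conc, standards_counts))
den = sum((x - mean_x) ** 2 for x in standards_conc)
slope = num / den
intercept = mean_y - slope * mean_x

def counts_to_conc(image):
    """Invert the calibration pixel by pixel: a 2-D grid of ion counts
    becomes a 2-D grid of concentrations."""
    return [[(c - intercept) / slope for c in row] for row in image]

pixel_counts = [[100.0, 1100.0], [2100.0, 600.0]]
conc_map = counts_to_conc(pixel_counts)
```

With these invented numbers the calibration is exactly counts = 100 x conc + 100, so the map recovers 0, 10, 20, and 5 ug/g.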

  8. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still has several challenges which need to be overcome, the largest of these being the challenge of having high-precision active and passive measurements to produce results with acceptably small uncertainties.

  9. Reproducibility of techniques using Archimedes' principle in measuring cancellous bone volume.

    PubMed

    Zou, L; Bloebaum, R D; Bachus, K N

    1997-01-01

    Researchers have been interested in developing techniques to accurately and reproducibly measure the volume fraction of cancellous bone. Historically bone researchers have used Archimedes' principle with water to measure the volume fraction of cancellous bone. Preliminary results in our lab suggested that the calibrated water technique did not provide reproducible results. Because of this difficulty, it was decided to compare the conventional water method to a water-with-surfactant method and a helium method using a micropycnometer. The water/surfactant and the helium methods were attempts to improve the fluid penetration into the small voids present in the cancellous bone structure. In order to compare the reproducibility of the new methods with the conventional water method, 16 cancellous bone specimens were obtained from femoral condyles of human and greyhound dog femora. The volume fraction measurements on each specimen were repeated three times with all three techniques. The results showed that the helium displacement method was more than an order of magnitude more reproducible than the two water methods (p < 0.05). Statistical analysis also showed that the conventional water method produced the lowest reproducibility (p < 0.05). The data from this study indicate that the helium displacement technique is a very useful, rapid and reproducible tool for quantitatively characterizing anisotropic porous tissue structures such as cancellous bone.
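The water-based methods compared above rest on Archimedes' principle: the buoyant loss of weight on submersion, divided by the fluid density, gives the specimen's material volume, which is then divided by the bulk (envelope) volume to obtain the volume fraction. A toy calculation with made-up weights:

```python
def displaced_volume(weight_air_g, weight_submerged_g, fluid_density_g_per_cm3):
    """Archimedes' principle: buoyant mass loss / fluid density = material
    volume in cm^3."""
    return (weight_air_g - weight_submerged_g) / fluid_density_g_per_cm3

# Hypothetical cancellous bone specimen
bulk_volume = 2.0                              # cm^3, from external dimensions
bone_volume = displaced_volume(3.0, 2.4, 1.0)  # water at ~1.0 g/cm^3 -> 0.6 cm^3
volume_fraction = bone_volume / bulk_volume    # -> 0.30
```

Helium pycnometry replaces the buoyant weighing with gas-displacement volume measurement, which is why it penetrates the small voids that water misses.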

  10. Application of isotope dilution technique in vitamin A nutrition.

    PubMed

    Wasantwisut, Emorn

    2002-09-01

    The isotope dilution technique involving deuterated retinol has been developed to quantitatively estimate total body reserves of vitamin A in humans. The technique provided good estimates in comparison to hepatic vitamin A concentrations in Bangladeshi surgical patients. Kinetic studies in the United States, Bangladesh, and Guatemala indicated a mean equilibration time of 17 to 20 days irrespective of the size of hepatic reserves. Due to the controversy surrounding the efficacy of a carotene-rich diet in improving vitamin A status, the isotope dilution technique was proposed to pursue this research question further (IAEA's coordinated research program). In the Philippines, schoolchildren with low serum retinol concentrations showed significant improvement in total body vitamin A stores following intake of carotene-rich foods (orange fruits and vegetables), using a three-day deuterated-retinol-dilution procedure. When Chinese kindergarten children were fed green and yellow vegetables during the winter, their total body vitamin A stores were sustained, as compared to a steady decline of vitamin A stores in the control children. Likewise, daily consumption of purified beta-carotene or a diet rich in provitamin A carotenoids was shown to prevent a loss in total body vitamin A stores among Thai lactating women during the rice-planting season. These studies demonstrate the potential of the isotope dilution technique to evaluate the impact of provitamin A carotenoid intervention programs.
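The mass balance at the heart of isotope dilution can be illustrated with a deliberately simplified calculation: after a known tracer dose equilibrates with the body pool, the labeled fraction measured in serum reveals the size of the unlabeled pool. The published deuterated-retinol method applies further correction factors (absorption efficiency, losses during equilibration) that are omitted here, and the dose and enrichment values are invented:

```python
def total_pool_from_dilution(dose_umol, labeled_fraction):
    """Generic isotope dilution: if a tracer dose mixes fully with the body
    pool, then unlabeled pool = dose * (1 - f) / f, where f is the labeled
    fraction measured in serum after equilibration."""
    return dose_umol * (1.0 - labeled_fraction) / labeled_fraction

# Hypothetical: 10 umol deuterated retinol dose; 1% of circulating retinol
# is labeled after the ~17-20 day equilibration noted in the abstract.
pool = total_pool_from_dilution(10.0, 0.01)   # -> 990 umol total body reserve
```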

  11. Research on corrosion detection for steel reinforced concrete structures using the fiber optical white light interferometer sensing technique

    NASA Astrophysics Data System (ADS)

    Zhao, Xuefeng; Cui, Yanjun; Wei, Heming; Kong, Xianglong; Zhang, Pinglei; Sun, Changsen

    2013-06-01

    In this paper, a novel kind of steel rebar corrosion monitoring technique for steel reinforced concrete structures is proposed, designed, and tested. The technique is based on the fiber optical white light interferometer (WLI) sensing technique. Firstly, a feasibility test was carried out using an equal-strength beam for comparison of strain sensing ability between the WLI and a fiber Bragg grating (FBG). The comparison results showed that the sensitivity of the WLI is sufficient for corrosion expansion strain monitoring. Then, two WLI corrosion sensors (WLI-CSs) were designed, fabricated, and embedded into concrete specimens to monitor expansion strain caused by steel rebar corrosion. Their performance was studied in an accelerated electrochemical corrosion test. Experimental results show that expansion strain along the fiber optical coil winding area can be detected and measured accurately by the proposed sensor. The advantages of the proposed monitoring technique allow for quantitative corrosion expansion monitoring to be executed in real time for reinforced concrete structures and with low cost.

  12. Trends in fluorescence imaging and related techniques to unravel biological information.

    PubMed

    Haustein, Elke; Schwille, Petra

    2007-09-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting-edge research, as the visual monitoring of life processes still provides some of the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser-induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snapshot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics.

  13. Trends in fluorescence imaging and related techniques to unravel biological information

    PubMed Central

    Haustein, Elke; Schwille, Petra

    2007-01-01

    Optical microscopy is among the most powerful tools that the physical sciences have ever provided biology. It is indispensable for basic lab work as well as for cutting-edge research, as the visual monitoring of life processes still provides some of the most compelling evidence for a multitude of biomedical applications. Along with the rapid development of new probes and methods for the analysis of laser-induced fluorescence, optical microscopy has over the past years experienced a vast increase in both new techniques and novel combinations of established methods to study biological processes with unprecedented spatial and temporal precision. On the one hand, major technical advances have significantly improved spatial resolution. On the other hand, life scientists are moving toward three- and even four-dimensional cell biology and biophysics, involving time as a crucial coordinate to quantitatively understand living specimens. Monitoring the whole cell or tissue in real time, rather than producing snapshot-like two-dimensional projections, will enable more physiological and, thus, more clinically relevant experiments, whereas an increase in temporal resolution facilitates monitoring fast nonperiodic processes as well as the quantitative analysis of characteristic dynamics. PMID:19404444

  14. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  15. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE PAGES

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.; ...

    2016-07-05

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.
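Of the chemical methods in the round robin, acid number is the simplest to express: milligrams of KOH consumed per gram of sample. A sketch using the standard potentiometric-titration formula; the titrant volume, molarity, and sample mass below are hypothetical:

```python
def acid_number(vol_koh_ml, koh_molarity, sample_mass_g):
    """Acid number in mg KOH per g of sample: titrant volume (mL) x
    molarity (mol/L) x 56.1 (mg KOH per mmol) / sample mass (g)."""
    return vol_koh_ml * koh_molarity * 56.1 / sample_mass_g

# Hypothetical bio-oil titration: 5.0 mL of 0.1 M KOH for a 0.5 g sample
an = acid_number(5.0, 0.1, 0.5)   # -> 56.1 mg KOH/g
```

Inter-laboratory variability in this quantity comes mostly from endpoint detection and sample handling, not from the arithmetic.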

  16. Standardization of chemical analytical techniques for pyrolysis bio-oil: history, challenges, and current status of methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrell, Jack R.; Olarte, Mariefel V.; Christensen, Earl D.

    Here, we discuss the standardization of analytical techniques for pyrolysis bio-oils, including the current status of methods, and our opinions on future directions. First, the history of past standardization efforts is summarized, and both successful and unsuccessful validations of analytical techniques are highlighted. The majority of analytical standardization studies to date have tested only physical characterization techniques. In this paper, we present results from an international round robin on the validation of chemical characterization techniques for bio-oils. Techniques tested included acid number, carbonyl titrations using two different methods (one at room temperature and one at 80 °C), 31P NMR for determination of hydroxyl groups, and a quantitative gas chromatography–mass spectrometry (GC-MS) method. Both carbonyl titration and acid number methods have yielded acceptable inter-laboratory variabilities. 31P NMR produced acceptable results for aliphatic and phenolic hydroxyl groups, but not for carboxylic hydroxyl groups. As shown in previous round robins, GC-MS results were more variable. Reliable chemical characterization of bio-oils will enable upgrading research and allow for detailed comparisons of bio-oils produced at different facilities. Reliable analytics are also needed to enable an emerging bioenergy industry, as processing facilities often have different analytical needs and capabilities than research facilities. We feel that correlations in reliable characterizations of bio-oils will help strike a balance between research and industry, and will ultimately help to determine metrics for bio-oil quality. Lastly, the standardization of additional analytical methods is needed, particularly for upgraded bio-oils.

  17. Reviewing effectiveness of ankle assessment techniques for use in robot-assisted therapy.

    PubMed

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Shane

    2014-01-01

    This article provides a comprehensive review of studies that investigated ankle assessment techniques, to better understand those that can be used in the real-time monitoring of rehabilitation progress in conjunction with robot-assisted therapy. Seventy-six publications published between January 1980 and August 2013 were selected from eight databases. They were divided into two main categories (16 qualitative and 60 quantitative studies): 13 goniometer studies, 18 dynamometer studies, and 29 studies about innovative techniques. A total of 465 subjects participated in the 29 quantitative studies of innovative measurement techniques that may potentially be integrated in a real-time monitoring device; 19 of these studies included fewer than 10 participants. Results show that qualitative ankle assessment methods are not suitable for real-time monitoring in robot-assisted therapy, though they are reliable for certain patients, while the quantitative methods show great potential. The majority of quantitative techniques are reliable in measuring ankle kinematics and kinetics but are usually available only for use in the sagittal plane. Few studies determine kinematics and kinetics in all three planes (sagittal, transverse, and frontal) where motions of the ankle joint and the subtalar joint actually occur.

  18. [Quantitative testing of the V617F mutation in the JAK2 gene using a pyrosequencing technique].

    PubMed

    Dunaeva, E A; Mironov, K O; Dribnokhodova, T E; Subbotina, E E; Bashmakova; Ol'hovskiĭ, I A; Shipulin, G A

    2014-11-01

    The somatic V617F mutation in the JAK2 gene is a frequent cause of BCR/ABL-negative chronic myeloproliferative diseases. Quantitative testing of the relative percentage of the mutant allele can be used to establish the severity and prognosis of the disease and to guide prescription of agents inhibiting JAK2 activity. A pyrosequencing technique was applied to quantify the mutation. The developed technique detects and quantifies the mutant fraction from 7% upward; samples with a mutant-allele percentage between 4% and 7% fall into a "gray zone". The relationship between the expected percentage of the mutant fraction in an analyzed sample and the observed signal value is described by a linear equation with regression coefficients of -0.97 and -1.32, with a measurement uncertainty of ±0.7. The technique was validated on clinical material from 192 patients with the main forms of BCR/ABL-negative myeloproliferative disease; 64 samples were detected with mutant-fraction percentages from 13% to 91%. The developed technique permits monitoring of therapy for myeloproliferative diseases and helps optimize treatment tactics.
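The reporting logic in the abstract, a linear back-calculation from the raw pyrosequencing signal plus a 4-7% gray zone, can be sketched as follows. How the abstract's two regression coefficients map onto slope and intercept is ambiguous, so the default values below are purely illustrative:

```python
def classify_mutant_fraction(observed_signal_pct, slope=0.97, intercept=-1.32):
    """Back-calculate the expected mutant-allele percentage from the raw
    pyrosequencing signal via a linear calibration, then apply the
    reporting rule from the abstract: <4% not detected, 4-7% gray zone,
    >=7% quantifiable. Slope and intercept are illustrative placeholders."""
    expected = slope * observed_signal_pct + intercept
    if expected < 4.0:
        return expected, "not detected"
    if expected < 7.0:
        return expected, "gray zone"
    return expected, "quantifiable"

value, verdict = classify_mutant_fraction(20.0)
```

In a real assay the calibration line would be refit from control samples of known mutant-allele percentage.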

  19. Planetary Analogs in Antarctica: Icy Satellites

    NASA Technical Reports Server (NTRS)

    Malin, M. C.

    1985-01-01

    As part of a study to provide semi-quantitative techniques to date past Antarctic glaciations, sponsored by the Antarctic Research Program, field observations pertinent to other planets were also acquired. The extremely diverse surface conditions, marked by extreme cold and large amounts of ice, provide potential terrain and process analogs to the icy satellites of Jupiter and Saturn. Thin ice tectonic features and explosion craters (on sea ice) and deformation features on thicker ice (glaciers) are specifically addressed.

  20. The Intracellular Trafficking Pathway of Transferrin

    PubMed Central

    Mayle, Kristine M.; Le, Alexander M.; Kamei, Daniel T.

    2011-01-01

    Background Transferrin (Tf) is an iron-binding protein that facilitates iron-uptake in cells. Iron-loaded Tf first binds to the Tf receptor (TfR) and enters the cell through clathrin-mediated endocytosis. Inside the cell, Tf is trafficked to early endosomes, delivers iron, and then is subsequently directed to recycling endosomes to be taken back to the cell surface. Scope of Review We aim to review the various methods and techniques that researchers have employed for elucidating the Tf trafficking pathway and the cell-machinery components involved. These experimental methods can be categorized as microscopy, radioactivity, and surface plasmon resonance (SPR). Major Conclusions Qualitative experiments, such as total internal reflectance fluorescence (TIRF), electron, laser-scanning confocal, and spinning-disk confocal microscopy, have been utilized to determine the roles of key components in the Tf trafficking pathway. These techniques allow temporal resolution and are useful for imaging Tf endocytosis and recycling, which occur on the order of seconds to minutes. Additionally, radiolabeling and SPR methods, when combined with mathematical modeling, have enabled researchers to estimate quantitative kinetic parameters and equilibrium constants associated with Tf binding and trafficking. General Significance Both qualitative and quantitative data can be used to analyze the Tf trafficking pathway. The valuable information that is obtained about the Tf trafficking pathway can then be combined with mathematical models to identify design criteria to improve the ability of Tf to deliver anticancer drugs. PMID:21968002
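The kinetic parameters that radiolabeling and SPR experiments estimate typically feed first-order compartment models of the endocytosis/recycling loop described above. A minimal two-compartment sketch with invented rate constants, integrated by forward Euler:

```python
# Two-compartment sketch of Tf trafficking: surface-bound Tf is
# internalized at rate k_int and recycled back to the surface at k_rec.
# Rate constants are illustrative, not fitted values from the review.
k_int, k_rec = 0.2, 0.1       # per minute (hypothetical)
surface, inside = 100.0, 0.0  # arbitrary units of Tf
dt, t_end = 0.01, 60.0        # minutes

for _ in range(int(t_end / dt)):
    d_surface = k_rec * inside - k_int * surface
    d_inside = k_int * surface - k_rec * inside
    surface += d_surface * dt
    inside += d_inside * dt

total = surface + inside      # conserved: no degradation in this sketch
```

At steady state the split obeys surface : inside = k_rec : k_int, so with these rates roughly a third of the Tf sits on the surface; fitting such a model to time-course data is how the kinetic parameters in the reviewed studies are extracted.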

  1. Characterization of Thermal and Mechanical Impact on Aluminum Honeycomb Structures

    NASA Technical Reports Server (NTRS)

    Robinson, Christen M.

    2013-01-01

    This study supports NASA Kennedy Space Center's research in the area of intelligent thermal management systems and multifunctional thermal systems. This project addresses the evaluation of the mechanical and thermal properties of metallic cellular solid (MCS) materials: those that are lightweight, high-strength, tunable, multifunctional, and affordable. A portion of the work includes understanding the mechanical properties of honeycomb-structured cellular solids upon impact testing under ambient, water-immersed, liquid-nitrogen-cooled, and liquid-nitrogen-immersed conditions. Additionally, this study addresses techniques for characterizing the aluminum honeycomb's ability to resist multiple high-rate loadings or impacts under varying environmental conditions, using various quantitative and qualitative methods to determine commercial applicability.

  2. A relative performance analysis of atmospheric Laser Doppler Velocimeter methods.

    NASA Technical Reports Server (NTRS)

    Farmer, W. M.; Hornkohl, J. O.; Brayton, D. B.

    1971-01-01

    Evaluation of the effectiveness of atmospheric applications of a Laser Doppler Velocimeter (LDV) at a wavelength of about 0.5 micrometer in conjunction with dual scatter LDV illuminating techniques, or at a wavelength of 10.6 micrometer with local oscillator LDV illuminating techniques. Equations and examples are given to provide a quantitative basis for LDV system selection and performance criteria in atmospheric research. The comparative study shows that specific ranges and conditions exist where performance of one of the methods is superior to that of the other. It is also pointed out that great care must be exercised in choosing system parameters that optimize a particular LDV designed for atmospheric applications.
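For the dual-scatter configuration mentioned above, the measured signal frequency follows the standard fringe-model relation f_D = 2 V sin(theta/2) / lambda, where theta is the full crossing angle of the two beams; the local-oscillator (heterodyne) geometry obeys a different relation and is not sketched here. The flow speed and beam angle below are invented:

```python
import math

def doppler_frequency(velocity_mps, wavelength_m, beam_angle_deg):
    """Dual-scatter (differential) LDV: f_D = 2 V sin(theta/2) / lambda,
    with theta the full angle between the two crossing beams."""
    half_angle = math.radians(beam_angle_deg) / 2.0
    return 2.0 * velocity_mps * math.sin(half_angle) / wavelength_m

# 10 m/s flow, 10 degree beam crossing: compare the two wavelengths
# discussed in the abstract (0.5 um visible vs 10.6 um CO2 laser).
f_visible = doppler_frequency(10.0, 0.5e-6, 10.0)
f_ir = doppler_frequency(10.0, 10.6e-6, 10.0)
```

The signal frequency scales inversely with wavelength, which is one reason the detection electronics and optimal operating ranges differ so strongly between the 0.5 um and 10.6 um systems.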

  3. Quantitative label-free multimodality nonlinear optical imaging for in situ differentiation of cancerous lesions

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoyun; Li, Xiaoyan; Cheng, Jie; Liu, Zhengfan; Thrall, Michael J.; Wang, Xi; Wang, Zhiyong; Wong, Stephen T. C.

    2013-03-01

    The development of real-time, label-free imaging techniques has recently attracted research interest for in situ differentiation of cancerous lesions from normal tissues. Molecule-specific intrinsic contrast can arise from label-free imaging techniques such as Coherent Anti-Stokes Raman Scattering (CARS), Two-Photon Excited AutoFluorescence (TPEAF), and Second Harmonic Generation (SHG), which, in combination, would hold the promise of a powerful label-free tool for cancer diagnosis. Among cancer-related deaths, lung carcinoma is the leading cause for both sexes. Although early treatment can increase the survival rate dramatically, lesion detection and precise diagnosis at an early stage is unusual due to its asymptomatic nature and limitations of current diagnostic techniques that make screening difficult. We investigated the potential of using multimodality nonlinear optical microscopy that incorporates CARS, TPEAF, and SHG techniques for differentiation of lung cancer from normal tissue. Cancerous and non-cancerous lung tissue samples from patients were imaged using CARS, TPEAF, and SHG techniques for comparison. These images showed good pathology correlation with hematoxylin and eosin (H and E) stained sections from the same tissue samples. Ongoing work includes imaging at various penetration depths to show three-dimensional morphologies of tumor cell nuclei using CARS, elastin using TPEAF, and collagen using SHG and developing classification algorithms for quantitative feature extraction to enable lung cancer diagnosis. Our results indicate that via real-time morphology analyses, a multimodality nonlinear optical imaging platform potentially offers a powerful minimally-invasive way to differentiate cancer lesions from surrounding non-tumor tissues in vivo for clinical applications.

  4. Improving fieldwork by using GIS for quantitative exploration, data management and digital mapping

    NASA Astrophysics Data System (ADS)

    Marra, Wouter; Alberti, Koko; van de Grint, Liesbeth; Karssenberg, Derek

    2016-04-01

    Fieldwork is an essential part of teaching the geosciences: its essence is to study natural phenomena in their proper context. Fieldwork predominantly relies on a learning-by-experiencing style and is often light on abstract thinking skills. We introduced more of the latter into a first-year fieldwork of several weeks by using Geographical Information Systems (GIS), keeping the techniques simple because the students involved had no prior GIS experience. In our project, we introduced new tutorials prior to the fieldwork in which students explored their research area using aerial photos, satellite images, an elevation model, and a slope map in Google Earth and QGIS. The goal of these tutorials was to get acquainted with the area, plan the first steps of the fieldwork, and formulate hypotheses in the form of a preliminary map based on quantitative data. During the actual fieldwork, half of the students processed and managed their field data using GIS, used elevation data as an additional data source, and made digital geomorphological maps; the other half used classic techniques with paper maps. We evaluated the learning benefits through two questionnaires (one before and one after the fieldwork) and a group interview with the students who used GIS in the field. Students liked the use of Google Earth and GIS, and many indicated the added value of using quantitative maps. The students' hypotheses and fieldwork plans were quickly superseded by insights gained during the fieldwork itself, but making these plans and hypotheses in advance improved their ability to perform empirical research. Students were very positive about the use of GIS for their fieldwork, mainly because they experienced it as a modern and relevant technique for research and the labour market. Tech-savvy students were extra motivated and explored additional methods.
There were some minor technical difficulties with using GIS during the fieldwork, but these can be solved by focussing the preparatory tutorials on what to expect in the field. We did not observe a significant difference in the quality of the products created by the two groups, since both digital and classic maps showed a large range of aesthetic and scientific quality. To conclude, we had a positive experience with our first attempt to add GIS components to a classic fieldwork. The main benefit is that students use quantitative data, which provides a different view of the fieldwork area and triggers abstract thinking. Future plans include using the students' field data in a web-GIS app to allow easy remote supervision, and using digital maps in the field.
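The slope map that the students inspected before going into the field can be computed from an elevation grid with central differences. A dependency-free sketch; the DEM values and cell size are invented:

```python
import math

def slope_map(dem, cell_size):
    """Slope in degrees from a 2-D elevation grid (row-major list of lists)
    using central differences, falling back to one-sided differences at
    the grid edges."""
    rows, cols = len(dem), len(dem[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            i0, i1 = max(i - 1, 0), min(i + 1, rows - 1)
            j0, j1 = max(j - 1, 0), min(j + 1, cols - 1)
            dz_di = (dem[i1][j] - dem[i0][j]) / ((i1 - i0) * cell_size)
            dz_dj = (dem[i][j1] - dem[i][j0]) / ((j1 - j0) * cell_size)
            out[i][j] = math.degrees(math.atan(math.hypot(dz_di, dz_dj)))
    return out

# A plane rising 1 m per 10 m cell in one direction -> ~5.7 degree slope
dem = [[0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0],
       [0.0, 1.0, 2.0]]
slopes = slope_map(dem, cell_size=10.0)
```

In the course itself a GIS package (e.g. the slope tool in QGIS) would do this over the full elevation model; the point here is only that the quantitative layer is a simple derivative of the DEM.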

  5. Future technology insight: mass spectrometry imaging as a tool in drug research and development

    PubMed Central

    Cobice, D F; Goodwin, R J A; Andren, P E; Nilsson, A; Mackay, C L; Andrew, R

    2015-01-01

    In pharmaceutical research, understanding the biodistribution, accumulation and metabolism of drugs in tissue plays a key role during drug discovery and development. In particular, information regarding pharmacokinetics, pharmacodynamics and transport properties of compounds in tissues is crucial during early screening. Historically, the abundance and distribution of drugs have been assessed by well-established techniques such as quantitative whole-body autoradiography (WBA) or tissue homogenization with LC/MS analysis. However, WBA does not distinguish active drug from its metabolites and LC/MS, while highly sensitive, does not report spatial distribution. Mass spectrometry imaging (MSI) can discriminate drug and its metabolites and endogenous compounds, while simultaneously reporting their distribution. MSI data are influencing drug development and currently used in investigational studies in areas such as compound toxicity. In in vivo studies MSI results may soon be used to support new drug regulatory applications, although clinical trial MSI data will take longer to be validated for incorporation into submissions. We review the current and future applications of MSI, focussing on applications for drug discovery and development, with examples to highlight the impact of this promising technique in early drug screening. Recent sample preparation and analysis methods that enable effective MSI, including quantitative analysis of drugs from tissue sections will be summarized and key aspects of methodological protocols to increase the effectiveness of MSI analysis for previously undetectable targets addressed. These examples highlight how MSI has become a powerful tool in drug research and development and offers great potential in streamlining the drug discovery process. PMID:25766375

  6. So you want to do research? 3. An introduction to qualitative methods.

    PubMed

    Meadows, Keith A

    2003-10-01

    This article describes some of the key issues in the use of qualitative research methods. Starting with a description of what qualitative research is and outlining some of the distinguishing features between quantitative and qualitative research, examples of the type of setting where qualitative research can be applied are provided. Methods of collecting information through in-depth interviews and group discussions are discussed in some detail, including issues around sampling and recruitment, the use of topic guides and techniques to encourage participants to talk openly. An overview on the analysis of qualitative data discusses aspects on data reduction, display and drawing conclusions from the data. Approaches to ensuring rigour in the collection, analysis and reporting of qualitative research are discussed and the concepts of credibility, transferability, dependability and confirmability are described. Finally, guidelines for the reporting of qualitative research are outlined and the need to write for a particular audience is discussed.

  7. A combined qualitative-quantitative approach for the identification of highly co-creative technology-driven firms

    NASA Astrophysics Data System (ADS)

    Milyakov, Hristo; Tanev, Stoyan; Ruskov, Petko

    2011-03-01

    Value co-creation is an emerging business and innovation paradigm; however, there is not enough clarity on the distinctive characteristics of value co-creation as compared to more traditional value creation approaches. The present paper summarizes the results of an empirically derived research study focusing on the development of a systematic procedure for the identification of firms that are active in value co-creation. The study is based on a sample of 273 firms that were selected as being representative of the breadth of their value co-creation activities. The results include: i) the identification of the key components of value co-creation based on a research methodology using web search and Principal Component Analysis techniques, and ii) the comparison of two different classification techniques identifying the firms with the highest degree of involvement in value co-creation practices. To the best of our knowledge this is the first study using sophisticated data collection techniques to provide a classification of firms according to the degree of their involvement in value co-creation.
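The pipeline described above (keyword frequencies gathered by web search, reduced by Principal Component Analysis, then used to rank or classify firms) can be caricatured in a few lines. The two keyword features and four firms below are invented, and power iteration stands in for a full PCA:

```python
# Toy firm-by-keyword frequency matrix (rows: firms, cols: two hypothetical
# co-creation keywords counted on each firm's website).
data = [
    [8.0, 7.0],   # firm 0: frequent co-creation vocabulary
    [6.0, 5.0],   # firm 1
    [1.0, 2.0],   # firm 2
    [0.0, 1.0],   # firm 3: little co-creation vocabulary
]

# Centre each column
means = [sum(col) / len(data) for col in zip(*data)]
centred = [[x - m for x, m in zip(row, means)] for row in data]

# 2x2 sample covariance matrix
def cov(a, b):
    return sum(x * y for x, y in zip(a, b)) / (len(a) - 1)
cols = list(zip(*centred))
c = [[cov(cols[i], cols[j]) for j in range(2)] for i in range(2)]

# Power iteration for the leading eigenvector (first principal component)
v = [1.0, 1.0]
for _ in range(100):
    w = [c[0][0] * v[0] + c[0][1] * v[1],
         c[1][0] * v[0] + c[1][1] * v[1]]
    norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
    v = [w[0] / norm, w[1] / norm]

# Project each firm onto PC1 and rank by degree of co-creation involvement
scores = [row[0] * v[0] + row[1] * v[1] for row in centred]
ranking = sorted(range(len(data)), key=lambda i: scores[i], reverse=True)
```

With real data the feature matrix has many keyword columns and PCA retains several components; the projection-and-rank step is the same idea.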

  8. Updates on measurements and modeling techniques for expendable countermeasures

    NASA Astrophysics Data System (ADS)

    Gignilliat, Robert; Tepfer, Kathleen; Wilson, Rebekah F.; Taczak, Thomas M.

    2016-10-01

    The potential threat of recently-advertised anti-ship missiles has instigated research at the United States (US) Naval Research Laboratory (NRL) into the improvement of measurement techniques for visual band countermeasures. The goal of measurements is the collection of radiometric imagery for use in the building and validation of digital models of expendable countermeasures. This paper will present an overview of measurement requirements unique to the visual band and differences between visual band and infrared (IR) band measurements. A review of the metrics used to characterize signatures in the visible band will be presented and contrasted to those commonly used in IR band measurements. For example, the visual band measurements require higher fidelity characterization of the background, including improved high-transmittance measurements and better characterization of solar conditions to correlate results more closely with changes in the environment. The range of relevant engagement angles has also been expanded to include higher altitude measurements of targets and countermeasures. In addition to the discussion of measurement techniques, a top-level qualitative summary of modeling approaches will be presented. No quantitative results or data will be presented.

  9. Quantitative changes in proteins responsible for flavonoid and anthocyanin biosynthesis in strawberry fruit at different ripening stages: A targeted quantitative proteomic investigation employing multiple reaction monitoring.

    PubMed

    Song, Jun; Du, Lina; Li, Li; Kalt, Wilhelmina; Palmer, Leslie Campbell; Fillmore, Sherry; Zhang, Ying; Zhang, ZhaoQi; Li, XiHong

    2015-06-03

    To better understand the regulation of flavonoid and anthocyanin biosynthesis, a targeted quantitative proteomic investigation employing LC-MS with multiple reaction monitoring was conducted on two strawberry cultivars at three ripening stages. This quantitative proteomic workflow was improved through OFFGEL electrophoresis to fractionate peptides from total protein digests. A total of 154 peptide transitions from 47 peptides, covering 21 proteins and isoforms related to anthocyanin biosynthesis, were investigated. The normalized protein abundance, which was measured using isotopically-labeled standards, changed significantly and concurrently with increased anthocyanin content and advancing fruit maturity. The protein abundances of phenylalanine ammonia-lyase, anthocyanidin synthase, chalcone isomerase, flavanone 3-hydroxylase, dihydroflavonol 4-reductase, UDP-glucose:flavonoid-3-O-glucosyltransferase, cytochrome c and cytochrome c oxidase subunit 2 were all significantly increased in fruit of more advanced ripeness. An interaction between cultivar and maturity was also shown with respect to chalcone isomerase. The good correlation between protein abundance and anthocyanin content suggests that a metabolic control point may exist for anthocyanin biosynthesis. This research provides insights into the process of anthocyanin formation in strawberry fruit at the level of protein concentration and reveals possible candidates for the regulation of anthocyanin formation during fruit ripening. Gaining insight into the molecular mechanisms contributing to flavonoid and anthocyanin biosynthesis and regulation in strawberry fruit during ripening is challenging due to limited molecular biology tools and established hypotheses. Our targeted proteomic approach employing LC-MS/MS analysis and the MRM technique to quantify proteins related to flavonoid and anthocyanin biosynthesis and regulation during fruit ripening is novel. 
The identification of peptides and proteins supported the design and validation of quantitative SRM assays for the targeted proteins proposed to be involved in strawberry fruit ripening. Our data identified candidate proteins and their quantitative changes in relation to fruit ripening and to flavonoid and anthocyanin biosynthesis and regulation. More importantly, the quantitative proteomic data were compared with chemical analysis to reveal possible control points for this important quality trait. Although the MRM approach is not new in plant biology research, its application has been rare. This is the first systematic, multi-targeted interrogation of the possible regulation of the entire flavonoid and anthocyanin biosynthesis pathway in strawberry fruit at different ripening stages using quantitative MRM mass spectrometry. Our results demonstrate the power of targeted quantitative mass spectrometry for the analysis of proteins in biological regulation, and indicate distinct and diverse control of flavonoid and anthocyanin biosynthesis at the metabolite and protein levels. This complementary knowledge will be useful for systematically characterizing the flavonoid and anthocyanin biosynthesis pathway of any fruit or plant species. Copyright © 2015. Published by Elsevier B.V.
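The core arithmetic of MRM quantitation with isotopically-labeled standards is a light-to-heavy ratio scaled by the spiked amount. A minimal sketch follows; the peptide names, peak areas, and spike amount are invented for the example and do not come from the paper.

```python
# Hypothetical peak areas from an MRM run: each target peptide is paired
# with a spiked, isotopically-labeled ("heavy") standard of known amount.
transitions = {
    # peptide id: (light peak area, heavy peak area) -- illustrative values
    "CHI_pep1": (8.4e5, 2.1e5),
    "F3H_pep1": (3.0e5, 2.5e5),
}
heavy_spike_fmol = 50.0  # amount of heavy standard spiked per injection

def quantify(light_area, heavy_area, spike_fmol):
    """Isotope-dilution estimate: light amount = (light / heavy) * spike."""
    return (light_area / heavy_area) * spike_fmol

amounts = {pep: quantify(light, heavy, heavy_spike_fmol)
           for pep, (light, heavy) in transitions.items()}
```

In practice each peptide is monitored through several transitions whose areas are summed or averaged before taking the ratio; the single-ratio form above is the simplest case.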

  10. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
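For agreement studies without a reference standard, a Bland-Altman style analysis of bias and limits of agreement is one of the standard tools such a review covers. A minimal numpy sketch, with synthetic lesion-volume measurements standing in for two hypothetical QIB algorithms:

```python
import numpy as np

# Synthetic lesion volumes (mL) from two hypothetical algorithms applied
# to the same 40 cases; the noise levels and offset are illustrative.
rng = np.random.default_rng(1)
truth = rng.uniform(5.0, 50.0, size=40)
algo_a = truth + rng.normal(0.0, 1.0, size=40)
algo_b = truth + 0.5 + rng.normal(0.0, 1.5, size=40)  # small systematic offset

# Bland-Altman style agreement: mean difference (bias) and 95% limits.
diff = algo_b - algo_a
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

# With known truth (e.g. a phantom or digital reference image study),
# per-algorithm bias and root-mean-square error can be reported directly.
bias_a = (algo_a - truth).mean()
rmse_a = np.sqrt(((algo_a - truth) ** 2).mean())
```

The with-truth and without-truth branches here mirror the study-design distinction the review draws; real comparisons would add confidence intervals on the bias and limits.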

  11. Gravitational Effects on Near Field Flow Structure of Low Density Gas Jets

    NASA Technical Reports Server (NTRS)

    Yep, Tze-Wing; Agrawal, Ajay K.; Griffin, DeVon; Salzman, Jack (Technical Monitor)

    2001-01-01

    Experiments were conducted in Earth gravity and microgravity to acquire quantitative data on the near field flow structure of helium jets injected into air. Microgravity conditions were simulated in the 2.2-second drop tower at NASA Glenn Research Center. The jet flow was observed by quantitative rainbow schlieren deflectometry, a non-intrusive line-of-sight measurement technique for the whole field. The flow structure was characterized by distributions of angular deflection and helium mole percentage obtained from color schlieren images taken at 60 Hz. Results show that the jet flow was significantly influenced by gravity. The jet in microgravity was up to 70 percent wider than that in Earth gravity. The jet flow oscillations observed in Earth gravity were absent in microgravity, providing direct experimental evidence that the flow instability in the low density jet was buoyancy induced. The paper provides quantitative details of the temporal flow evolution as the experiment undergoes a change in gravity in the drop tower.

  12. Quantitative Metrics in Clinical Radiology Reporting: A Snapshot Perspective from a Single Mixed Academic-Community Practice

    PubMed Central

    Abramson, Richard G.; Su, Pei-Fang; Shyr, Yu

    2012-01-01

    Quantitative imaging has emerged as a leading priority on the imaging research agenda, yet clinical radiology has traditionally maintained a skeptical attitude toward numerical measurement in diagnostic interpretation. To gauge the extent to which quantitative reporting has been incorporated into routine clinical radiology practice, and to offer preliminary baseline data against which the evolution of quantitative imaging can be measured, we obtained all clinical computed tomography (CT) and magnetic resonance imaging (MRI) reports from two randomly selected weekdays in 2011 at a single mixed academic-community practice and evaluated those reports for the presence of quantitative descriptors. We found that 44% of all reports contained at least one “quantitative metric” (QM), defined as any numerical descriptor of a physical property other than quantity, but only 2% of reports contained an “advanced quantitative metric” (AQM), defined as a numerical parameter reporting on lesion function or composition, excluding simple size and distance measurements. Possible reasons for the slow translation of AQMs into routine clinical radiology reporting include perceptions that the primary clinical question may be qualitative in nature or that a qualitative answer may be sufficient; concern that quantitative approaches may obscure important qualitative information, may not be adequately validated, or may not allow sufficient expression of uncertainty; the feeling that “gestalt” interpretation may be superior to quantitative paradigms; and practical workflow limitations. We suggest that quantitative imaging techniques will evolve primarily as dedicated instruments for answering specific clinical questions requiring precise and standardized interpretation. Validation in real-world settings, ease of use, and reimbursement economics will all play a role in determining the rate of translation of AQMs into broad practice. PMID:22795791

  13. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
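The normalization step described above — dividing each glycopeptide signal by its parent protein's abundance so that glycosylation changes are separated from protein-expression changes — reduces to a simple ratio. A sketch, with peptide identifiers and numbers that are illustrative only:

```python
# Toy site-specific glycopeptide ion abundances for one sample.
# Keys are (protein, glycopeptide) pairs; values are MRM peak areas.
glycopeptide_area = {
    ("IgG1", "EEQYNSTYR_G0F"): 4.0e6,
    ("IgG1", "EEQYNSTYR_G1F"): 2.0e6,
    ("IgA1", "HYTNPSQDVTVPCPVPSTPPTPSPSTPPTPSPS_G2"): 1.0e6,
}
# Absolute protein amounts, as would come from peptide calibration curves.
protein_amount = {"IgG1": 8.0e6, "IgA1": 4.0e6}

# Normalize each glycopeptide signal to its parent protein so that
# glycosylation is reported independently of protein expression.
normalized = {key: area / protein_amount[key[0]]
              for key, area in glycopeptide_area.items()}
```

Comparing the `normalized` values across samples then reports glycosylation changes even when total immunoglobulin levels differ between sera.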

  14. Complementarity in radiochemical and infrared spectroscopic characterization of electrode adsorption

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wieckowski, A.

    1994-03-01

    Radioactive labelling and infrared spectroscopy are frequently used as direct, in situ probes into the structure of the electrochemical solid/liquid interface. These techniques are compared, in a polemical fashion, in the context of a recent publication by Parry et al. (Langmuir 1993, 9, 1878) where the research potential of the former technique was not adequately depicted. It is shown that radiotracers can clearly differentiate between the surface and solution species, both neutrals and anions. In addition to the surface specificity, the radiotracers offer a quantitative determination of adsorbate surface concentrations, a feature not yet demonstrated with surface infrared spectroscopy in electrochemistry. Therefore, these two techniques are complementary. Examples of the combined radiochemical and spectroscopic measurements of adsorption with equivalent (smooth) electrode surfaces are quoted. 11 refs., 2 figs.

  15. Soot Imaging and Measurement

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Soot, sometimes referred to as smoke, is made up primarily of the carbon particles generated by most combustion processes. For example, large quantities of soot can be seen issuing from the exhaust pipes of diesel-powered vehicles. Heated soot also is responsible for the warm orange color of candle flames, though that soot is generally consumed before it can exit the flame. Research has suggested that heavy atmospheric soot concentrations aggravate conditions such as pneumonia and asthma, causing many deaths each year. To understand the formation and oxidation of soot, NASA Lewis Research Center scientists, together with several university investigators, are investigating the properties of soot generated in reduced gravity, where the absence of buoyancy allows more time for the particles to grow. The increased time allows researchers to better study the life cycle of these particles, with the hope that increased understanding will lead to better control strategies. To quantify the amount of soot present in a flame, Lewis scientists developed a unique imaging technique that provides quantitative and qualitative soot data over a large field of view. This is a significant improvement over the single-point methods normally used. The technique is shown in the sketch: light from a laser is expanded with a microscope objective, rendered parallel, and passed through a flame, where soot particles reduce the amount of light transmitted to the camera. A filter allows only light at the wavelength of the laser to pass to the camera, preventing any extraneous signals. When images of the laser light with and without the flame are compared, a quantitative map of the soot concentration is produced. In addition to that data, a qualitative image of the soot in the flame is also generated, an example of which is displayed in the photo. This technique has the potential to be adapted to real-time process control in industrial power plants.
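Comparing laser images with and without the flame amounts to a Beer-Lambert inversion: the transmittance ratio gives an optical-depth map, which scales to soot volume fraction. A toy sketch, in which the image data, the dimensionless extinction coefficient, and the path length are all assumed values rather than instrument parameters from the article:

```python
import numpy as np

# Synthetic laser images: reference (no flame) and attenuated (flame).
rng = np.random.default_rng(2)
i_ref = np.full((64, 64), 1000.0)                  # laser-only image
soot = np.exp(-((np.indices((64, 64)) - 32.0) ** 2).sum(axis=0) / 200.0)
i_flame = i_ref * np.exp(-soot)                    # attenuated by the flame

# Beer-Lambert: optical depth from the transmittance ratio.
tau = -np.log(i_flame / i_ref)

# Soot volume fraction map (illustrative, assumed constants):
wavelength = 632.8e-9   # HeNe laser wavelength, m
k_ext = 8.6             # dimensionless extinction coefficient (assumed)
path_len = 0.01         # line-of-sight path length, m (assumed uniform)
f_v = tau * wavelength / (k_ext * path_len)
```

The division of the two images is the "with and without the flame" comparison; a real measurement would also correct for beam steering and treat the path length per pixel rather than as a constant.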

  16. Quantitative phase imaging using four interferograms with special phase shifts by dual-wavelength in-line phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoqing; Wang, Yawei; Ji, Ying; Xu, Yuanyuan; Xie, Ming; Han, Hao

    2018-05-01

    A new approach to quantitative phase imaging using four interferograms with special phase shifts in dual-wavelength in-line phase-shifting interferometry is presented. In this method, positive and negative 2π phase shifts are employed to easily separate the incoherent addition of two single-wavelength interferograms by combining the phase-shifting technique with a subtraction procedure; the quantitative phase at one of the two wavelengths can then be obtained from two intensities without the corresponding dc terms by exploiting the properties of the trigonometric functions. The quantitative phase at the other wavelength can be retrieved from two dc-term-suppressed intensities obtained by employing the two-step phase-shifting technique or a filtering technique in the frequency domain. The proposed method is illustrated with theory, and its effectiveness is demonstrated by simulation experiments on a spherical cap and a HeLa cell, respectively.
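For readers unfamiliar with phase-shifting interferometry, the textbook four-step algorithm (shifts of 0, π/2, π, 3π/2) shows how a wrapped phase map is recovered from multiple interferograms. Note this is the standard scheme, not the paper's special ±2π-shift dual-wavelength method; the synthetic phase, background, and modulation values are assumptions for the demo.

```python
import numpy as np

# Synthetic single-wavelength phase-shifting demo.
x = np.linspace(-1.0, 1.0, 256)
phi_true = 2.0 * np.pi * 0.3 * x**2      # object phase, within (-pi, pi] here
a, b = 1.0, 0.5                          # background and fringe modulation

# Four interferograms I_k = a + b*cos(phi + delta_k).
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
I1, I2, I3, I4 = (a + b * np.cos(phi_true + d) for d in shifts)

# Four-step reconstruction: I4-I2 = 2b*sin(phi), I1-I3 = 2b*cos(phi).
phi_rec = np.arctan2(I4 - I2, I1 - I3)   # wrapped phase in (-pi, pi]
```

Because the dc term `a` and modulation `b` cancel in the differences, the recovered phase is independent of both, which is the basic motivation for phase-shifting schemes generally.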

  17. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise to quantitatively assess the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of the exact accuracy and precision compared to the gold standard. (15)O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with (15)O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate during both baseline and hypercapnia with respect to (15)O H₂O PET with a comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. QUANTITATIVE MAGNETIC RESONANCE IMAGING OF ARTICULAR CARTILAGE AND ITS CLINICAL APPLICATIONS

    PubMed Central

    Li, Xiaojuan; Majumdar, Sharmila

    2013-01-01

    Cartilage is one of the most essential tissues for healthy joint function and is compromised in degenerative and traumatic joint diseases. There have been tremendous advances during the past decade using quantitative MRI techniques as a non-invasive tool for evaluating cartilage, with a focus on assessing cartilage degeneration during osteoarthritis (OA). In this review, after a brief overview of cartilage composition and degeneration, we discuss techniques that grade and quantify morphologic changes as well as the techniques that quantify changes in the extracellular matrix. The basic principles, in vivo applications, advantages and challenges for each technique are discussed. Recent studies using the OA Initiative (OAI) data are also summarized. Quantitative MRI provides non-invasive measures of cartilage degeneration at the earliest stages of joint degeneration, which is essential for efforts towards prevention and early intervention in OA. PMID:24115571

  19. PREFACE: 2nd International Conference and Young Scientist School ''Magnetic resonance imaging in biomedical research''

    NASA Astrophysics Data System (ADS)

    Naumova, A. V.; Khodanovich, M. Y.; Yarnykh, V. L.

    2016-02-01

    The Second International Conference and Young Scientist School ''Magnetic resonance imaging in biomedical research'' was held on the campus of the National Research Tomsk State University (Tomsk, Russia) on September 7-9, 2015. The conference was focused on magnetic resonance imaging (MRI) applications for biomedical research. The main goal was to bring together basic scientists, clinical researchers and developers of new MRI techniques to bridge the gap between clinical/research needs and advanced technological solutions. The conference fostered research and development in basic and clinical MR science and its application to health care. It also had an educational purpose: to promote understanding of cutting-edge MR developments. The conference provided an opportunity for researchers and clinicians to present their recent theoretical developments and practical applications, and to discuss unsolved problems. The program of the conference was divided into three main topics. The first day of the conference was devoted to educational lectures on the fundamentals of MRI physics and image acquisition/reconstruction techniques, including recent developments in quantitative MRI. The second day was focused on developments and applications of new contrast agents. Multinuclear and spectroscopic acquisitions as well as functional MRI were presented during the third day of the conference. We would like to highlight the main developments presented at the conference and introduce the prominent speakers. The keynote speaker of the conference, Dr. Vasily Yarnykh (University of Washington, Seattle, USA), presented a recently developed MRI method, macromolecular proton fraction (MPF) mapping, as a unique tool for modifying image contrast and for quantifying the myelin content in neural tissues. Professor Yury Pirogov (Lomonosov Moscow State University) described the development of new fluorocarbon compounds and their applications for biomedicine. Drs. Julia Velikina and Alexey Samsonov (University of Wisconsin-Madison, USA) demonstrated new image reconstruction methods for accelerated quantitative parameter mapping and magnetic resonance angiography. Finally, we would like to thank the scientific committee, the local organizing committee and the National Research Tomsk State University for the opportunity to share scientific ideas and new developments at the conference, and the Russian Science Foundation (project № 14-45-00040) for financial support.

  20. MO-C-BRB-06: Translating NIH / NIBIB funding to clinical reality in quantitative diagnostic imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, E.

    Diagnostic radiology and radiation oncology are arguably two of the most technologically advanced specialties in medicine. The imaging and radiation medicine technologies in clinical use today have been continuously improved through new advances made in the commercial and academic research arenas. This symposium explores the translational path from research through clinical implementation. Dr. Pettigrew will start this discussion by sharing his perspectives as director of the National Institute of Biomedical Imaging and Bioengineering (NIBIB). The NIBIB has focused on promoting research that is technological in nature and has high clinical impact. We are in the age of precision medicine, and the technological innovations and quantitative tools developed by engineers and physicists working with physicians are providing innovative tools that increase precision and improve outcomes in health care. NIBIB funded grants lead to a very high patenting rate (per grant dollar), and these patents have higher citation rates by other patents, suggesting greater clinical impact, as well. Two examples of clinical translation resulting from NIH-funded research will be presented, in radiation therapy and diagnostic imaging. Dr. Yu will describe a stereotactic radiotherapy device developed in his laboratory that is designed for treating breast cancer with the patient in the prone position. It uses 36 rotating Cobalt-60 sources positioned in an annular geometry to focus the radiation beam at the system's isocenter. The radiation dose is delivered throughout the target volume in the breast by constantly moving the patient in a planned trajectory relative to the fixed isocenter. With this technique, the focal spot dynamically paints the dose distribution throughout the target volume in three dimensions. Dr. Jackson will conclude this symposium by describing the RSNA Quantitative Imaging Biomarkers Alliance (QIBA), which is funded in part by NIBIB and is a synergistic collaboration between medical physicists, radiologists, oncologists, industry representatives, and other stakeholders. The mission of QIBA is to improve the accuracy and practicality of quantitative image-based biomarkers by increasing precision across devices, patients, and time, an essential step in incorporating quantitative imaging biomarkers into radiology practice. Validated quantitative imaging biomarkers are necessary to support precision medicine initiatives, multimodality / multiparametric applications in medicine, treatment planning and response assessment, and radiogenomics applications. Current applications in the QIBA portfolio extend to cancer diagnosis and treatment, pulmonary diseases, and neurological disorders. The overall goal of this symposium is to illustrate the bidirectional exchange between medical research and clinical practice. Revitalizing scientific excellence in clinical medical physics challenges practitioners to identify clinical limitations, which then drive research innovation; research funded by the NIH and other agencies develops technological solutions to these limitations, which are translated to the care environment to ultimately improve clinical practice in radiology and radiation oncology.

  1. Physical interpretation and development of ultrasonic nondestructive evaluation techniques applied to the quantitative characterization of textile composite materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1993-01-01

    In this Progress Report, we describe our current research activities concerning the development and implementation of advanced ultrasonic nondestructive evaluation methods applied to the characterization of stitched composite materials and bonded aluminum plate specimens. One purpose of this investigation is to identify and characterize specific features of polar backscatter interrogation which enhance the ability of ultrasound to detect flaws in a stitched composite laminate. Another focus is to explore the feasibility of implementing medical linear array imaging technology as a viable ultrasonic-based nondestructive evaluation method to inspect and characterize bonded aluminum lap joints. As an approach to implementing quantitative ultrasonic inspection methods to both of these materials, we focus on the physics that underlies the detection of flaws in such materials.

  2. Research-IQ: Development and Evaluation of an Ontology-anchored Integrative Query Tool

    PubMed Central

    Borlawsky, Tara B.; Lele, Omkar; Payne, Philip R. O.

    2011-01-01

    Investigators in the translational research and systems medicine domains require highly usable, efficient and integrative tools and methods that allow for the navigation of and reasoning over emerging large-scale data sets. Such resources must cover a spectrum of granularity from bio-molecules to population phenotypes. Given such information needs, we report upon the initial design and evaluation of an ontology-anchored integrative query tool, Research-IQ, which employs a combination of conceptual knowledge engineering and information retrieval techniques to enable the intuitive and rapid construction of queries, in terms of semi-structured textual propositions, that can subsequently be applied to integrative data sets. Our initial results, based upon both quantitative and qualitative evaluations of the efficacy and usability of Research-IQ, demonstrate its potential to increase clinical and translational research throughput. PMID:21821150

  3. Quantitative 3D investigation of Neuronal network in mouse spinal cord model

    NASA Astrophysics Data System (ADS)

    Bukreeva, I.; Campi, G.; Fratini, M.; Spanò, R.; Bucci, D.; Battaglia, G.; Giove, F.; Bravin, A.; Uccelli, A.; Venturi, C.; Mastrogiacomo, M.; Cedola, A.

    2017-01-01

    The investigation of the neuronal network in mouse spinal cord models represents the basis for the research on neurodegenerative diseases. In this framework, the quantitative analysis of the single elements in different districts is a crucial task. However, conventional 3D imaging techniques do not have enough spatial resolution and contrast to allow for a quantitative investigation of the neuronal network. Exploiting the high coherence and the high flux of synchrotron sources, X-ray Phase-Contrast multiscale-Tomography allows for the 3D investigation of the neuronal microanatomy without any aggressive sample preparation or sectioning. We investigated healthy-mouse neuronal architecture by imaging the 3D distribution of the neuronal-network with a spatial resolution of 640 nm. The high quality of the obtained images enables a quantitative study of the neuronal structure on a subject-by-subject basis. We developed and applied a spatial statistical analysis on the motor neurons to obtain quantitative information on their 3D arrangement in the healthy-mice spinal cord. Then, we compared the obtained results with a mouse model of multiple sclerosis. Our approach paves the way to the creation of a “database” for the characterization of the neuronal network main features for a comparative investigation of neurodegenerative diseases and therapies.
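A spatial statistical analysis of motor-neuron positions of the kind described can be sketched with nearest-neighbour distances and a Clark-Evans style ratio against complete spatial randomness. The 3D coordinates below are synthetic, and the cubic observation volume is an assumption of the sketch:

```python
import numpy as np
from math import gamma, pi

# Synthetic neuron centroids (microns) in a 100x100x100 um cube.
rng = np.random.default_rng(3)
pts = rng.uniform(0.0, 100.0, size=(200, 3))

# Pairwise distances; mask the diagonal to get each point's nearest neighbour.
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
np.fill_diagonal(d, np.inf)
nn = d.min(axis=1)
mean_nn = nn.mean()

# For a 3D Poisson process of intensity rho, the expected mean
# nearest-neighbour distance is Gamma(4/3) / ((4/3)*pi*rho)^(1/3).
rho = len(pts) / 100.0**3
expected = gamma(4.0 / 3.0) / ((4.0 / 3.0) * pi * rho) ** (1.0 / 3.0)

ce_ratio = mean_nn / expected   # ~1 random, <1 clustered, >1 dispersed
```

Comparing `ce_ratio` between healthy and diseased cohorts is one simple way to quantify changes in 3D neuronal arrangement; a production analysis would also correct for edge effects at the volume boundary.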

  4. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  5. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    ERIC Educational Resources Information Center

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  6. On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.

    2008-08-11

    The analytical techniques based on ion beams, or IBA techniques, give quantitative information on elemental concentrations in samples of a wide variety of natures. In this work, we focus on the PIXE technique, analyzing thick-target biological specimens (TTPIXE) using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used to perform PIXE and RBS simultaneously, in order to resolve the uncertainties produced in absolute PIXE quantification. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard was used.

  7. Understanding osteoporosis and fractures: an introduction to the use of qualitative research.

    PubMed

    Hoang-Kim, A; Schemitsch, E; Sale, J E M; Beaton, D; Warmington, K; Kulkarni, A V; Reeves, S

    2014-02-01

    Qualitative research has been recognized in recent years as a field of inquiry used to understand people's beliefs, attitudes, behaviors, culture or lifestyle. While quantitative results are challenging to apply in everyday practice, the qualitative paradigm can be useful to fill in a research context that is poorly understood or ill-defined. It can provide an in-depth study of interactions, a way to incorporate context, and a means to hear the voices of participants. Understanding experiences, motivation, and beliefs can have a profound effect on the interpretation of quantitative research and on the generation of hypotheses. In this paper, we will review different qualitative approaches that healthcare providers and researchers may find useful to implement in future study designs, specifically in the context of osteoporosis and fracture. We will provide insight into the qualitative paradigm gained from the osteoporosis literature on fractures using examples from the database Scopus. Five prominent qualitative techniques (narratives, phenomenology, grounded theory, ethnography, and case study) can be used to generate meanings of the social and clinical world. We have highlighted how these strategies are implemented in qualitative research on osteoporosis and fractures and are anchored to specific methodological practices. We focus on studies that explore patient psychosocial experiences of diagnosis and treatment, cultural boundaries, and interprofessional communication. After reviewing the research, we believe that action research, which is not frequently used, could also be used effectively by many professions to improve programs and policies affecting those dealing with osteoporosis issues.

  8. Quantitative X-ray Differential Interference Contrast Microscopy

    NASA Astrophysics Data System (ADS)

    Nakamura, Takashi

    Full-field soft x-ray microscopes are widely used in many fields of science. Advances in nanofabrication technology have enabled short-wavelength focusing elements with significantly improved spatial resolution: in the soft x-ray spectral region, features as small as 12 nm can be resolved using micro zone plates as the objective lens. In addition to conventional x-ray microscopy, in which differences in x-ray absorption provide the image contrast, phase contrast mechanisms such as differential interference contrast (DIC) and Zernike phase contrast have also been demonstrated. These phase contrast imaging mechanisms are especially attractive at x-ray wavelengths, where the phase contrast of most materials is typically 10 times stronger than the absorption contrast. With recent progress in plasma-based x-ray sources and increasing accessibility of synchrotron user facilities, x-ray microscopes are quickly becoming standard measurement equipment in the laboratory. To further the usefulness of x-ray DIC microscopy, this thesis explicitly addresses three known issues with this imaging modality by introducing new techniques and devices. First, as opposed to its visible-light counterpart, no quantitative phase imaging technique exists for x-ray DIC microscopy. To address this issue, two nanoscale x-ray quantitative phase imaging techniques, using exclusive OR (XOR) patterns and zone-plate doublets, respectively, are proposed. Unlike existing x-ray quantitative phase imaging techniques such as Talbot interferometry and ptychography, no dedicated experimental setups or stringent illumination coherence are needed for quantitative phase retrieval. Second, to the best of our knowledge, no quantitative performance characterization of DIC microscopy exists to date; therefore, the imaging system's response to the sample's spatial frequencies is not known. In order to gain an in-depth understanding of this imaging modality, the performance of x-ray DIC microscopy is quantified using the modulation transfer function. A new illumination apparatus required for the transfer function analysis under partially coherent illumination is also proposed. Such a characterization is essential for the proper selection of DIC optics for the various transparent samples under study. Finally, the optical elements used for x-ray DIC microscopy are highly absorptive, and high-brilliance x-ray sources such as synchrotrons are generally needed for adequate image contrast. To extend the use of x-ray DIC microscopy to a wider variety of applications, a high-efficiency, large-numerical-aperture optical element consisting of highly reflective Bragg reflectors is proposed. Using Bragg reflectors, which have 70%-99% reflectivity at extreme ultraviolet and soft x-ray wavelengths for all angles of glancing incidence, the first-order focusing efficiency is expected to increase by a factor of ~8 compared with that of a typical Fresnel zone plate. This thesis contributes to current nanoscale x-ray phase contrast imaging research and provides new insights for the biological, material, and magnetic sciences.
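
The modulation transfer function characterization mentioned above can be illustrated numerically. A minimal sketch, assuming a hypothetical Gaussian line spread function rather than any measured DIC data: the MTF is the magnitude of the Fourier transform of the line spread function, normalised to unity at zero spatial frequency.

```python
import numpy as np

# Hypothetical line spread function (LSF): a Gaussian blur profile
# sampled across the image plane (positions in micrometres).
x = np.linspace(-5.0, 5.0, 512)            # detector positions (um)
sigma = 0.5                                # assumed blur width (um)
lsf = np.exp(-x**2 / (2 * sigma**2))

# The MTF is the magnitude of the Fourier transform of the LSF,
# normalised to unity at zero spatial frequency.
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freqs = np.fft.rfftfreq(x.size, d=x[1] - x[0])  # cycles per um

# Resolution is often quoted as the frequency where the MTF drops to 10%.
cutoff = freqs[np.argmax(mtf < 0.1)]
```

In a real measurement the LSF would come from imaging a sharp edge or slit through the DIC optics; the normalisation and cutoff computation are unchanged.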

  9. A Framework for Assessment of Aviation Safety Technology Portfolios

    NASA Technical Reports Server (NTRS)

    Jones, Sharon M.; Reveley, Mary S.

    2014-01-01

    The programs within NASA's Aeronautics Research Mission Directorate (ARMD) conduct research and development to improve the national air transportation system so that Americans can travel as safely as possible. NASA aviation safety systems analysis personnel support various levels of ARMD management in their fulfillment of system analysis and technology prioritization as defined in the agency's program and project requirements. This paper provides a framework for the assessment of aviation safety research and technology portfolios that includes metrics such as projected impact on current and future safety, technical development risk and implementation risk. The paper also contains methods for presenting portfolio analysis and aviation safety Bayesian Belief Network (BBN) output results to management using bubble charts and quantitative decision analysis techniques.

  10. Development of a low-cost method of analysis for the qualitative and quantitative analysis of butyltins in environmental samples.

    PubMed

    Bangkedphol, Sornnarin; Keenan, Helen E; Davidson, Christine; Sakultantimetha, Arthit; Songsasen, Apisit

    2008-12-01

    Most analytical methods for butyltins are based on high-resolution techniques with complicated sample preparation. For this study, a simple analytical method was developed using High Performance Liquid Chromatography (HPLC) with UV detection. The developed method was used to determine tributyltin (TBT), dibutyltin (DBT), and monobutyltin (MBT) in sediment and water samples. The separation was performed in isocratic mode on an ultra cyanopropyl column with a mobile phase of hexane containing 5% THF and 0.03% acetic acid. The method was confirmed against standard GC/MS techniques and verified by a statistical paired t-test. Under the experimental conditions used, the limits of detection (LOD) for TBT and DBT were 0.70 and 0.50 microg/mL, respectively. The optimised extraction method for butyltins in water and sediment samples used hexane containing 0.05-0.5% tropolone, with 0.2% sodium chloride in the water at pH 1.7. Quantitative extraction of butyltin compounds from a certified reference material (BCR-646) and from naturally contaminated samples was achieved, with recoveries ranging from 95 to 108% and RSDs of 0.02-1.00%. This HPLC method and the optimum extraction conditions were used to determine the contamination levels of butyltins in environmental samples collected from the Forth and Clyde Canal, Scotland, UK. The values obtained severely exceeded the Environmental Quality Standard (EQS) values. Although high-resolution methods are utilised extensively for this type of research, the developed method is cheaper in terms of both equipment and running costs, faster in analysis time, and has detection limits comparable to the alternative methods. This is advantageous not just as a confirmatory technique but also for enabling further research in this field.
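
The paired t-test verification mentioned above can be sketched as follows; the concentrations below are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical TBT concentrations (microg/mL) measured on the same six
# samples by the HPLC-UV method and by the reference GC/MS method.
hplc = np.array([2.10, 3.45, 1.80, 4.02, 2.75, 3.10])
gcms = np.array([2.05, 3.50, 1.78, 4.10, 2.70, 3.15])

# Paired t-statistic on the per-sample differences: a small |t| means
# no evidence of a systematic bias between the two methods.
d = hplc - gcms
t_stat = d.mean() / (d.std(ddof=1) / np.sqrt(d.size))

# Two-sided critical value for df = 5 at alpha = 0.05 is about 2.571;
# |t| below it fails to reject agreement between the methods.
methods_agree = abs(t_stat) < 2.571
```

The paired design matters here: each difference compares the two methods on the same sample, so sample-to-sample concentration variation cancels out.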

  11. Computer Vision Techniques for Transcatheter Intervention

    PubMed Central

    Zhao, Feng; Roach, Matthew

    2015-01-01

    Minimally invasive transcatheter technologies have demonstrated substantial promise for the diagnosis and the treatment of cardiovascular diseases. For example, transcatheter aortic valve implantation is an alternative to aortic valve replacement for the treatment of severe aortic stenosis, and transcatheter atrial fibrillation ablation is widely used for the treatment and the cure of atrial fibrillation. In addition, catheter-based intravascular ultrasound and optical coherence tomography imaging of coronary arteries provides important information about the coronary lumen, wall, and plaque characteristics. Qualitative and quantitative analysis of these cross-sectional image data will be beneficial to the evaluation and the treatment of coronary artery diseases such as atherosclerosis. In all the phases (preoperative, intraoperative, and postoperative) of the transcatheter intervention procedure, computer vision techniques (e.g., image segmentation and motion tracking) have been widely applied in the field to accomplish tasks such as annulus measurement, valve selection, catheter placement control, and vessel centerline extraction. This provides beneficial guidance for clinicians in surgical planning, disease diagnosis, and treatment assessment. In this paper, we present a systematic review of these state-of-the-art methods. We aim to give a comprehensive overview for researchers in the area of computer vision on the subject of transcatheter intervention. Research in medical computing is by its nature multidisciplinary, and hence it is important to understand the application domain, clinical background, and imaging modality, so that methods and quantitative measurements derived from analyzing the imaging data are appropriate and meaningful. We thus provide an overview of the background information on transcatheter intervention procedures, as well as a review of the computer vision techniques and methodologies applied in this area. PMID:27170893

  12. High-Pressure Gaseous Burner (HPGB) Facility Completed for Quantitative Laser Diagnostics Calibration

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet

    2002-01-01

    A gas-fueled high-pressure combustion facility with optical access, which was developed over the last 2 years, has just been completed. The High Pressure Gaseous Burner (HPGB) rig at the NASA Glenn Research Center can operate at sustained pressures up to 60 atm with a variety of gaseous fuels and liquid jet fuel. The facility is unique as it is the only continuous-flow, hydrogen-capable, 60-atm rig in the world with optical access. It will provide researchers with new insights into flame conditions that simulate the environment inside the ultra-high-pressure-ratio combustion chambers of tomorrow's advanced aircraft engines. The facility provides optical access to the flame zone, enabling the calibration of nonintrusive optical diagnostics to measure chemical species and temperature. The data from the HPGB rig enables the validation of numerical codes that simulate gas turbine combustors, such as the National Combustor Code (NCC). The validation of such numerical codes is often best achieved with nonintrusive optical diagnostic techniques that meet these goals: information-rich (multispecies) and quantitative while providing good spatial and time resolution. Achieving these goals is a challenge for most nonintrusive optical diagnostic techniques. Raman scattering is a technique that meets these challenges. Raman scattering occurs when intense laser light interacts with molecules to radiate light at a shifted wavelength (known as the Raman shift). This shift in wavelength is unique to each chemical species and provides a "fingerprint" of the different species present. The facility will first be used to gather a comprehensive data base of laser Raman spectra at high pressures. These calibration data will then be used to quantify future laser Raman measurements of chemical species concentration and temperature in this facility and other facilities that use Raman scattering.
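
The species "fingerprint" described above comes from the fixed wavenumber shift of each molecule's Raman band relative to the laser line. A minimal sketch of the underlying arithmetic (the 532 nm laser line and the N2 band are illustrative choices, not details of the HPGB facility):

```python
# Stokes Raman scattering: the scattered light is red-shifted from the
# laser line by the molecule's characteristic Raman shift in wavenumbers.
def raman_scattered_wavelength_nm(laser_nm: float, shift_cm1: float) -> float:
    """Wavelength (nm) of the Stokes-shifted Raman line for a given laser."""
    laser_cm1 = 1e7 / laser_nm             # laser wavenumber in cm^-1
    scattered_cm1 = laser_cm1 - shift_cm1  # Stokes line is lower in energy
    return 1e7 / scattered_cm1

# Example: the N2 vibrational band (~2331 cm^-1) excited at 532 nm
# appears near 607 nm, well separated from the laser line.
n2_line = raman_scattered_wavelength_nm(532.0, 2331.0)
```

Because each species has its own shift, a single spectrum excited by one laser line resolves many species at once, which is what makes Raman scattering attractive for multispecies combustion diagnostics.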

  13. Estimating Hydrologic Fluxes, Crop Water Use, and Agricultural Land Area in China using Data Assimilation

    NASA Astrophysics Data System (ADS)

    Smith, Tiziana; McLaughlin, Dennis B.; Hoisungwan, Piyatida

    2016-04-01

    Crop production has significantly altered the terrestrial environment by changing land use and by altering the water cycle through both co-opted rainfall and surface water withdrawals. As the world's population continues to grow and individual diets become more resource-intensive, the demand for food - and the land and water necessary to produce it - will continue to increase. High-resolution quantitative data about water availability, water use, and agricultural land use are needed to develop sustainable water and agricultural planning and policies. However, existing data covering large areas with high resolution are susceptible to errors and can be physically inconsistent. China is an example of a large area where food demand is expected to increase and a lack of data clouds the resource management dialogue. Some assert that China will have insufficient land and water resources to feed itself, posing a threat to global food security if it seeks to increase food imports. Others believe resources are plentiful. Without quantitative data, it is difficult to discern whether these concerns are realistic or overly dramatized. This research presents a quantitative approach using data assimilation techniques to characterize hydrologic fluxes, crop water use (defined as crop evapotranspiration), and agricultural land use at 0.5 by 0.5 degree resolution and applies the methodology in China using data from around the year 2000. The approach uses the principles of water balance and of crop water requirements to assimilate existing data with a least-squares estimation technique, producing new estimates of water and land use variables that are physically consistent while minimizing differences from measured data. We argue that this technique for estimating water fluxes and agricultural land use can provide a useful basis for resource management modeling and policy, both in China and around the world.
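
The least-squares adjustment described above can be illustrated with a toy water balance; the fluxes and the single constraint here are invented for illustration and are far simpler than the study's formulation.

```python
import numpy as np

# Adjust noisy flux measurements [P, ET, R] (mm/yr) so they satisfy the
# water balance P - ET - R = 0 while staying as close as possible to the
# measured values (equal weighting of all observations assumed).
measured = np.array([650.0, 420.0, 250.0])   # P, ET, R; imbalance = -20
A = np.array([[1.0, -1.0, -1.0]])            # balance constraint: A @ x = 0

# Closed-form minimum-norm correction: project the measurement vector
# onto the constraint surface A @ x = 0.
correction = A.T @ np.linalg.solve(A @ A.T, A @ measured)
adjusted = measured - correction

residual = (A @ adjusted).item()             # balance now holds (~0)
```

With many grid cells and multiple balance equations, A simply gains more rows and columns; weighting by measurement uncertainty turns the projection into the generalized least-squares estimate.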

  14. Improved assay for quantitating adherence of ruminal bacteria to cellulose.

    PubMed Central

    Rasmussen, M A; White, B A; Hespell, R B

    1989-01-01

    A quantitative technique suitable for the determination of adherence of ruminal bacteria to cellulose was developed. This technique employs adherence of cells to cellulose disks and alleviates the problem of nonspecific cell entrapment within cellulose particles. By using this technique, it was demonstrated that the adherence of Ruminococcus flavefaciens FD1 to cellulose was inhibited by formaldehyde, methylcellulose, and carboxymethyl cellulose. Adherence was unaffected by acid hydrolysates of methylcellulose, glucose, and cellobiose. PMID:2782879

  15. MO-C-BRCD-03: The Role of Informatics in Medical Physics and Vice Versa.

    PubMed

    Andriole, K

    2012-06-01

    Like Medical Physics, Imaging Informatics encompasses concepts touching every aspect of the imaging chain, from image creation, acquisition, management, and archival to image processing, analysis, display, and interpretation. The two disciplines are in fact quite complementary, with similar goals: to improve the quality of care provided to patients using an evidence-based approach, to assure safety in the clinical and research environments, to facilitate efficiency in the workplace, and to accelerate knowledge discovery. Use cases describing several areas of informatics activity will be given to illustrate current limitations that would benefit from medical physicist participation, and conversely, areas in which informaticists may contribute to the solution. Topics to be discussed include radiation dose monitoring, process management and quality control, display technologies, business analytics techniques, and quantitative imaging. Quantitative imaging is increasingly becoming an essential part of biomedical research as well as being incorporated into clinical diagnostic activities. Referring clinicians are asking for more objective information to be gleaned from the imaging tests that they order so that they may make the best clinical management decisions for their patients. Medical physicists may be called upon to identify existing issues as well as to develop, validate, and implement new approaches and technologies to help move the field further toward quantitative imaging methods for the future. Biomedical imaging informatics tools and techniques, such as standards, integration, data mining, cloud computing and new systems architectures, ontologies and lexicons, data visualization and navigation tools, and business analytics applications, can be used to overcome some of the existing limitations. Learning objectives: 1. Describe what is meant by Medical Imaging Informatics and understand why the medical physicist should care. 2. Identify existing limitations in information technologies with respect to Medical Physics, and conversely, see how Informatics may assist the medical physicist in filling some of the current gaps in their activities. 3. Understand general informatics concepts and areas of investigation, including imaging and workflow standards, systems integration, computing architectures, ontologies, data mining and business analytics, data visualization and human-computer interface tools, and the importance of quantitative imaging for the future of Medical Physics and Imaging Informatics. 4. Become familiar with ongoing efforts to address current challenges facing future research into, and clinical implementation of, quantitative imaging applications. © 2012 American Association of Physicists in Medicine.

  16. Chronic Obstructive Pulmonary Disease Exacerbations in the COPDGene Study: Associated Radiologic Phenotypes

    PubMed Central

    Kazerooni, Ella A.; Lynch, David A.; Liu, Lyrica X.; Murray, Susan; Curtis, Jeffrey L.; Criner, Gerard J.; Kim, Victor; Bowler, Russell P.; Hanania, Nicola A.; Anzueto, Antonio R.; Make, Barry J.; Hokanson, John E.; Crapo, James D.; Silverman, Edwin K.; Martinez, Fernando J.; Washko, George R.

    2011-01-01

    Purpose: To test the hypothesis—given the increasing emphasis on quantitative computed tomographic (CT) phenotypes of chronic obstructive pulmonary disease (COPD)—that a relationship exists between COPD exacerbation frequency and quantitative CT measures of emphysema and airway disease. Materials and Methods: This research protocol was approved by the institutional review board of each participating institution, and all participants provided written informed consent. One thousand two subjects who were enrolled in the COPDGene Study and met the GOLD (Global Initiative for Chronic Obstructive Lung Disease) criteria for COPD with quantitative CT analysis were included. Total lung emphysema percentage was measured by using the attenuation mask technique with a −950-HU threshold. An automated program measured the mean wall thickness and mean wall area percentage in six segmental bronchi. The frequency of COPD exacerbation in the prior year was determined by using a questionnaire. Statistical analysis was performed to examine the relationship of exacerbation frequency with lung function and quantitative CT measurements. Results: In a multivariate analysis adjusted for lung function, bronchial wall thickness and total lung emphysema percentage were associated with COPD exacerbation frequency. Each 1-mm increase in bronchial wall thickness was associated with a 1.84-fold increase in annual exacerbation rate (P = .004). For patients with 35% or greater total emphysema, each 5% increase in emphysema was associated with a 1.18-fold increase in this rate (P = .047). Conclusion: Greater lung emphysema and airway wall thickness were associated with COPD exacerbations, independent of the severity of airflow obstruction. Quantitative CT can help identify subgroups of patients with COPD who experience exacerbations for targeted research and therapy development for individual phenotypes. 
© RSNA, 2011 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.11110173/-/DC1 PMID:21788524
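
The attenuation mask measurement described above reduces to a simple voxel count; a minimal sketch on synthetic Hounsfield-unit data (a real analysis would run on a segmented lung volume from CT):

```python
import numpy as np

# Synthetic lung CT values in Hounsfield units, drawn from a normal
# distribution purely for illustration.
rng = np.random.default_rng(0)
lung_hu = rng.normal(loc=-870.0, scale=60.0, size=100_000)

# Attenuation mask: percent emphysema is the fraction of lung voxels
# whose attenuation falls below the -950 HU threshold.
THRESHOLD_HU = -950.0
emphysema_pct = 100.0 * np.mean(lung_hu < THRESHOLD_HU)
```

The threshold is the only free parameter; studies commonly report sensitivity of the emphysema score to thresholds such as -910 or -950 HU.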

  17. Comparison of longitudinal excursion of a nerve-phantom model using quantitative ultrasound imaging and motion analysis system methods: A convergent validity study.

    PubMed

    Paquette, Philippe; El Khamlichi, Youssef; Lamontagne, Martin; Higgins, Johanne; Gagnon, Dany H

    2017-08-01

    Quantitative ultrasound imaging is gaining popularity in research and clinical settings to measure the neuromechanical properties of the peripheral nerves such as their capability to glide in response to body segment movement. Increasing evidence suggests that impaired median nerve longitudinal excursion is associated with carpal tunnel syndrome. To date, psychometric properties of longitudinal nerve excursion measurements using quantitative ultrasound imaging have not been extensively investigated. This study investigates the convergent validity of the longitudinal nerve excursion by comparing measures obtained using quantitative ultrasound imaging with those determined with a motion analysis system. A 38-cm long rigid nerve-phantom model was used to assess the longitudinal excursion in a laboratory environment. The nerve-phantom model, immersed in a 20-cm deep container filled with a gelatin-based solution, was moved 20 times using a linear forward and backward motion. Three light-emitting diodes were used to record nerve-phantom excursion with a motion analysis system, while a 5-cm linear transducer allowed simultaneous recording via ultrasound imaging. Both measurement techniques yielded excellent association (r = 0.99) and agreement (mean absolute difference between methods = 0.85 mm; mean relative difference between methods = 7.48%). Small discrepancies were largely found when larger excursions (i.e., >10 mm) were performed, revealing slight underestimation of the excursion by the ultrasound imaging analysis software. Quantitative ultrasound imaging is an accurate method to assess the longitudinal excursion of an in vitro nerve-phantom model and appears relevant for future research protocols investigating the neuromechanical properties of the peripheral nerves.
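
The association and agreement statistics reported above can be computed as follows; the paired excursions are invented for illustration, with the ultrasound values slightly underestimating the larger excursions as the study observed.

```python
import numpy as np

# Hypothetical paired excursion measurements (mm) from ultrasound and
# from a motion-capture system on the same trials.
ultrasound = np.array([4.8, 7.9, 9.6, 12.1, 14.5, 16.0])
motion_cap = np.array([5.0, 8.2, 10.1, 13.0, 15.6, 17.4])

# Association: Pearson correlation between the two methods.
r = np.corrcoef(ultrasound, motion_cap)[0, 1]

# Agreement: mean absolute difference (mm) and mean relative difference
# (%) between methods, taking motion capture as the reference.
mean_abs_diff = np.mean(np.abs(ultrasound - motion_cap))
mean_rel_diff = 100.0 * np.mean(
    np.abs(ultrasound - motion_cap) / motion_cap)
```

A high correlation alone does not establish agreement (a constant bias leaves r unchanged), which is why the study reports both kinds of statistic.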

  18. [Application of Raman Spectroscopy Technique to Agricultural Products Quality and Safety Determination].

    PubMed

    Liu, Yan-de; Jin, Tan-tan

    2015-09-01

    The quality and safety of agricultural products are inseparable from human health. Conventional chemical methods have many drawbacks, such as laborious sample pretreatment, complicated operating procedures, and destruction of the samples. Raman spectroscopy, a powerful tool for analysing and testing molecular structure, can examine samples quickly and without damage, for both qualitative and quantitative detection and analysis. With the continuous improvement of the technology and the gradual widening of its scope of application, Raman spectroscopy plays an important role in determining the quality and safety of agricultural products and has broad application prospects. Many research reports on Raman-based detection of agricultural product quality and safety have already been published. To aid understanding of the detection principle and the current state of development of Raman spectroscopy, and to track the latest research progress both at home and abroad, the basic principles and development of Raman spectroscopy, as well as the detection devices, are introduced briefly. The latest research progress in quality and safety determination of fruits and vegetables, livestock products, and grain by Raman spectroscopy is reviewed in depth, and the outstanding technical problems for agricultural product quality and safety determination are pointed out. In addition, the text briefly introduces Raman spectrometers, including patent applications for portable Raman spectrometers, and offers prospects for future research and applications.

  19. Research using qualitative, quantitative or mixed methods and choice based on the research.

    PubMed

    McCusker, K; Gunaydin, S

    2015-10-01

    Research is fundamental to the advancement of medicine and critical to identifying the therapies most appropriate to particular societies. This is easily observed in the changes in pharmacology, surgical technique, and medical equipment used today versus just a few years ago. Advancements in knowledge synthesis and reporting guidelines enhance the quality, scope, and applicability of results, thus improving health science and clinical practice and advancing health policy. While advancements are critical to the progression of optimal health care, the high cost associated with these endeavours cannot be ignored, so research methods themselves need to be evaluated to identify the most efficient. The primary objective of this paper is to examine a specific research methodology as applied to clinical research, especially extracorporeal circulation, and its prospects for the future. © The Author(s) 2014.

  20. Analysis of eye-tracking experiments performed on a Tobii T60

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Banks, David C

    2008-01-01

    Commercial eye-gaze trackers have the potential to be an important tool for quantifying the benefits of new visualization techniques. The expense of such trackers has made their use relatively infrequent in visualization studies. As such, it is difficult for researchers to compare multiple devices: obtaining several demonstration models is impractical in cost and time, and quantitative measures from real-world use are not readily available. In this paper, we present a sample protocol to determine the accuracy of a gaze-tracking device.
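
An accuracy protocol of this kind typically reports the offset between a known on-screen target and the reported gaze point in degrees of visual angle. A minimal sketch, with the pixel density and viewing distance as assumed parameters rather than Tobii T60 specifications:

```python
import math

def angular_error_deg(target_px, gaze_px, px_per_cm=37.8, viewing_cm=60.0):
    """Angular offset between a fixation target and the measured gaze point,
    in degrees of visual angle, for a flat screen at a known distance."""
    dx = (gaze_px[0] - target_px[0]) / px_per_cm   # horizontal offset, cm
    dy = (gaze_px[1] - target_px[1]) / px_per_cm   # vertical offset, cm
    offset_cm = math.hypot(dx, dy)
    return math.degrees(math.atan2(offset_cm, viewing_cm))

# One fixation: gaze reported 20 px right and 15 px below the target.
err = angular_error_deg((960, 540), (980, 555))
```

Averaging this error over a grid of calibration targets gives the device's accuracy figure; the spread of repeated fixations on one target gives its precision.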

  1. Clear turbulence forecasting - Towards a union of art and science

    NASA Technical Reports Server (NTRS)

    Keller, J. L.

    1985-01-01

    The development of clear air turbulence (CAT) forecasting over the last several decades is reviewed in the context of empirical and theoretical research into the nature of nonconvective turbulence in the free atmosphere, particularly at jet stream levels. Various qualitative CAT forecasting techniques are examined, and prospects for an effective quantitative index to aid aviation meteorologists in jet stream level turbulence monitoring and forecasting are examined. Finally, the use of on-board sensors for short-term warning is discussed.

  2. A Partnership Training Program in Breast Cancer Research Using Molecular Imaging Techniques

    DTIC Science & Technology

    2008-07-01

    PubMed) 2. Berlier J.E., Rothe A., Buller G., Bradford J., Gray D.R., Filanoski B.J., Telford W.G., Yue S., Liu J., Cheung C.Y., et al. Quantitative...3 3 cm3 voxel within the gray matter of the occipitoparietal lobe was established using anatomic landmarks. Pulse Sequences All experiments were...software (SAS Institute, Cary, NC, USA). RESULTS Figure 1 shows a PRESS spectrum recorded from the occipitoparietal gray matter region of a 25-year-old subject

  3. Improving the Ar I and II branching ratio calibration method: Monte Carlo simulations of effects from photon scattering/reflecting in hollow cathodes

    NASA Astrophysics Data System (ADS)

    Lawler, J. E.; Den Hartog, E. A.

    2018-03-01

    The Ar I and II branching ratio calibration method is discussed with the goal of improving the technique. This method of establishing a relative radiometric calibration is important in ongoing research to improve atomic transition probabilities for quantitative spectroscopy in astrophysics and other fields. Specific suggestions are presented along with Monte Carlo simulations of wavelength dependent effects from scattering/reflecting of photons in a hollow cathode.

  4. LANDSAT land cover analysis completed for CIRSS/San Bernardino County project

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.; Sinnott, D. (Principal Investigator)

    1982-01-01

    The LANDSAT analysis carried out as part of Ames Research Center's San Bernardino County Project, one of four projects sponsored by NASA as part of the California Integrated Remote Sensing System (CIRSS) effort for generating and utilizing digital geographic data bases, is described. Topics explored include use of data-base modeling with spectral cluster data to improve LANDSAT data classification, and quantitative evaluation of several change techniques. Both 1976 and 1979 LANDSAT data were used in the project.

  5. Consumer understanding of sugars claims on food and drink products

    PubMed Central

    Patterson, N J; Sadler, M J; Cooper, J M

    2012-01-01

    Consumer understanding of nutrition and health claims is a key aspect of current regulations in the European Union (EU). In view of this, qualitative and quantitative research techniques were used to investigate consumer awareness and understanding of product claims in the UK, focusing particularly on nutrition claims relating to sugars. Both research methods identified a good awareness of product claims. No added sugars claims were generally preferred to reduced sugars claims, and there was a general assumption that sweeteners and other ingredients would be added in place of sugars. However, there was little awareness of the level of sugar reduction and the associated calorie reduction in products when reduced sugars claims were made on pack. In focus groups, participants felt deceived if sugar reduction claims were being made without a significant reduction in calories. This was reinforced in the quantitative research which showed that respondents expected a similar and meaningful level of calorie reduction to the level of sugar reduction. The research also identified consumer confusion around the calorie content of different nutrients, including over-estimation of the calorie content of sugars. This is crucial to consumers' expectations as they clearly link sugar to calories and therefore expect a reduction in sugar content to deliver a reduction in calorie content. PMID:22973161

  6. Music and suicidality: a quantitative review and extension.

    PubMed

    Stack, Steven; Lester, David; Rosenberg, Jonathan S

    2012-12-01

    This article provides the first quantitative review of the literature on music and suicidality. Multivariate logistic regression techniques are applied to 90 findings from 21 studies. Investigations employing ecological data on suicide completions are 19.2 times more apt than other studies to report a link between music and suicide. More recent studies and studies with large samples are also more apt than their counterparts to report significant results. Further, none of the studies based on experimental research designs found a link between music and suicide ideation, prompting us to do a brief content analysis of 24 suicide songs versus 24 nonsuicide songs from the same albums. Using Linguistic Inquiry and Word Count software, we found no difference in the content of the suicide songs and controls, including the percentage of sad words, negative affect, and mentions of death, thus providing an explanation for the nonfindings from experimental research. In summary, ecologically based investigations (which capture at-risk persons missed by typical school-based samples) and more recent investigations (which have used superior or new methodologies) tend to demonstrate a linkage between music and suicidality. Experimental research is needed with a control group of songs from an alternative genre with low suicidogenic content. © 2012 The American Association of Suicidology.

  7. A novel approach to mixing qualitative and quantitative methods in HIV and STI prevention research.

    PubMed

    Penman-Aguilar, Ana; Macaluso, Maurizio; Peacock, Nadine; Snead, M Christine; Posner, Samuel F

    2014-04-01

    Mixed-method designs are increasingly used in sexually transmitted infection (STI) and HIV prevention research. The authors designed a mixed-method approach and applied it to estimate and evaluate a predictor of continued female condom use (6+ uses, among those who used it at least once) in a 6-month prospective cohort study. The analysis included 402 women who received an intervention promoting use of female and male condoms for STI prevention and completed monthly quantitative surveys; 33 also completed a semistructured qualitative interview. The authors identified a qualitative theme (couples' female condom enjoyment [CFCE]), applied discriminant analysis techniques to estimate CFCE for all participants, and added CFCE to a multivariable logistic regression model of continued female condom use. CFCE related to comfort, naturalness, pleasure, feeling protected, playfulness, ease of use, intimacy, and feeling in control of protection. CFCE was associated with continued female condom use (adjusted odds ratio: 2.8, 95% confidence interval: 1.4-5.6) and significantly improved model fit (p < .001). CFCE predicted continued female condom use. Mixed-method approaches for "scaling up" qualitative findings from small samples to larger numbers of participants can benefit HIV and STI prevention research.

  8. Research on High-Bandgap Materials and Amorphous Silicon-Based Solar Cells, Final Technical Report, 15 May 1994-15 January 1998

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiff, E. A.; Gu, Q.; Jiang, L.

    1998-12-28

    This report describes work performed by Syracuse University under this subcontract. Researchers developed a technique based on electroabsorption measurements for obtaining quantitative estimates of the built-in potential Vbi in a-Si:H-based heterostructure solar cells incorporating microcrystalline or a-SiC:H p layers. Using this new electroabsorption technique, researchers confirmed previous estimates of Vbi → 1.0 V in a-Si:H solar cells with "conventional" intrinsic layers and either microcrystalline or a-SiC:H p layers. Researchers also explored the recent claim that light-soaking of a-Si:H substantially changes the polarized electroabsorption associated with interband optical transitions (and hence, not defect transitions). Researchers confirmed measurements of improved (5') hole drift mobilities in some specially prepared a-Si:H samples. Disturbingly, solar cells made with such materials did not show improved efficiencies. Researchers significantly clarified the relationship of ambipolar diffusion-length measurements to hole drift mobilities in a-Si:H, and have shown that the photocapacitance measurements can be interpreted in terms of hole drift mobilities in amorphous silicon. They also completed a survey of thin BP:H and BPC:H films prepared by plasma deposition using phosphine, diborane, trimethylboron, and hydrogen as precursor gases.

  9. Semi-quantitative prediction of a multiple API solid dosage form with a combination of vibrational spectroscopy methods.

    PubMed

    Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T

    2016-05-30

    Quality control (QC) in the pharmaceutical industry is a key activity in ensuring medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products and also all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (APIs) [bisoprolol, hydrochlorothiazide] in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and predict their dosage (semi-quantitative). Furthermore, the combination of spectroscopic techniques was investigated; two different multiblock techniques based on PLS were applied: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results than Raman spectroscopy for both identification and quantitation. The multiblock techniques investigated showed that each spectroscopy contains information not present or captured with the other spectroscopic technique, demonstrating a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
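    The core idea of SO-PLS is sequential: model the response from the first spectral block, orthogonalise the second block against the first, then model the residual. The sketch below uses ordinary least squares as a stand-in for the PLS inner regression, with entirely synthetic "NIR" and "Raman" blocks; it illustrates the orthogonalisation logic only, not the paper's actual models.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 60
# Synthetic data: NIR block X1, Raman block X2, API dosage y. Each block sees
# a different latent source of variation, as in the study's multiblock setting.
t1 = rng.normal(size=(n, 1))                      # variation visible to NIR
t2 = rng.normal(size=(n, 1))                      # variation visible only to Raman
X1 = t1 @ rng.normal(size=(1, 12)) + 0.05 * rng.normal(size=(n, 12))
X2 = t2 @ rng.normal(size=(1, 10)) + 0.05 * rng.normal(size=(n, 10))
y = (2 * t1 + 3 * t2).ravel()

# Step 1: fit y from the first block (least squares stands in for PLS).
b1, *_ = np.linalg.lstsq(X1, y, rcond=None)
y_hat1 = X1 @ b1

# Step 2: orthogonalise the second block against the first, fit the residual.
P = X1 @ np.linalg.pinv(X1)                       # projector onto col-space of X1
X2_orth = X2 - P @ X2
b2, *_ = np.linalg.lstsq(X2_orth, y - y_hat1, rcond=None)
y_hat = y_hat1 + X2_orth @ b2

def r2(pred):
    return 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)

print(r2(y_hat1) < r2(y_hat))  # the orthogonalised block adds unique information
```

    Because the second block is orthogonalised first, any gain in fit is attributable to information the first spectroscopy did not capture, which is exactly the question the study asks of NIR and Raman.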

  10. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    PubMed

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. PubMed, Web of Science, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using predefined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). Data were pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively.
In the per-territory analysis our results show similar diagnostic accuracy comparing anatomical (AUC 0.86 (0.83-0.89)) and functional reference standards (AUC 0.88 (0.84-0.90)). Only the per-territory sensitivity did not show significant heterogeneity. None of the groups showed signs of publication bias. The clinical value of semi-quantitative and quantitative CMR perfusion analysis remains uncertain due to extensive inter-study heterogeneity and large differences in CMR perfusion acquisition protocols, reference standards, and methods of assessment of myocardial perfusion parameters. For widespread implementation, standardization of CMR perfusion techniques is essential. CRD42016040176.
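    The pooled sensitivity and specificity figures above come from per-study true/false positive and negative counts. A naive fixed pooling of such counts looks like the sketch below; note that the study counts are invented, and a meta-analysis like this one would more likely use a bivariate random-effects model rather than simple summation.

```python
# Hypothetical per-study 2x2 counts (TP, FP, FN, TN) for one perfusion parameter.
studies = [(45, 10, 8, 37), (60, 12, 15, 40), (30, 5, 6, 25)]

tp = sum(s[0] for s in studies)
fp = sum(s[1] for s in studies)
fn = sum(s[2] for s in studies)
tn = sum(s[3] for s in studies)

pooled_sensitivity = tp / (tp + fn)   # TP / (TP + FN)
pooled_specificity = tn / (tn + fp)   # TN / (TN + FP)
print(round(pooled_sensitivity, 2), round(pooled_specificity, 2))
```

    Simple pooling ignores between-study heterogeneity, which is precisely the limitation the authors flag when interpreting their pooled estimates.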

  11. Optimisation of techniques for quantification of Botrytis cinerea in grape berries and receptacles by quantitative polymerase chain reaction

    USDA-ARS?s Scientific Manuscript database

    Quantitative PCR (qPCR) can be used to detect and monitor pathogen colonization, but early attempts to apply the technology to Botrytis cinerea infection of grape berries have identified limitations to current techniques. In this study, four DNA extraction methods, two grinding methods, two grape or...
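    Absolute quantification by qPCR conventionally fits a standard curve of threshold cycle (Ct) against the log of known template amounts, then inverts it for unknowns; amplification efficiency follows from the slope. The Ct values and dilution series below are invented for illustration and are not from this manuscript.

```python
import math

# Hypothetical standard curve: Ct measured for known template amounts (copies).
# Model: Ct = m * log10(quantity) + b; efficiency = 10**(-1/m) - 1.
standards = [(10**2, 30.1), (10**3, 26.8), (10**4, 23.5), (10**5, 20.2)]

xs = [math.log10(q) for q, _ in standards]
ys = [ct for _, ct in standards]
n = len(xs)
m = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
b = (sum(ys) - m * sum(xs)) / n

efficiency = 10 ** (-1 / m) - 1        # ~1.0 means ~100% per-cycle doubling
ct_unknown = 25.15                     # Ct observed for an unknown sample
quantity = 10 ** ((ct_unknown - b) / m)
print(round(m, 2), round(efficiency, 2), round(quantity))
```

    A slope near -3.3 (as here) corresponds to near-perfect doubling each cycle; DNA extraction and grinding methods, the variables compared in this study, shift both the recovered quantity and the apparent efficiency.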

  12. A land classification protocol for pollinator ecology research: An urbanization case study.

    PubMed

    Samuelson, Ash E; Leadbeater, Ellouise

    2018-06-01

    Land-use change is one of the most important drivers of widespread declines in pollinator populations. Comprehensive quantitative methods for land classification are critical to understanding these effects, but co-option of existing human-focussed land classifications is often inappropriate for pollinator research. Here, we present a flexible GIS-based land classification protocol for pollinator research using a bottom-up approach driven by reference to pollinator ecology, with urbanization as a case study. Our multistep method involves manually generating land cover maps at multiple biologically relevant radii surrounding study sites using GIS, with a focus on identifying land cover types that have a specific relevance to pollinators. This is followed by a three-step refinement process using statistical tools: (i) definition of land-use categories, (ii) principal components analysis on the categories, and (iii) cluster analysis to generate a categorical land-use variable for use in subsequent analysis. Model selection is then used to determine the appropriate spatial scale for analysis. We demonstrate an application of our protocol using a case study of 38 sites across a gradient of urbanization in South-East England. In our case study, the land classification generated a categorical land-use variable at each of four radii based on the clustering of sites with different degrees of urbanization, open land, and flower-rich habitat. Studies of land-use effects on pollinators have historically employed a wide array of land classification techniques from descriptive and qualitative to complex and quantitative. We suggest that land-use studies in pollinator ecology should broadly adopt GIS-based multistep land classification techniques to enable robust analysis and aid comparative research. 
Our protocol offers a customizable approach that combines specific relevance to pollinator research with the potential for application to a wide range of ecological questions, including agroecological studies of pest control.
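    Steps (ii) and (iii) of the refinement (PCA on the land-use categories, then clustering to form a categorical land-use variable) can be sketched as follows. The site data are synthetic stand-ins for land-cover proportions, and k-means is used as one plausible choice of cluster analysis; the paper's exact clustering method may differ.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical land-cover proportions (impervious, open land, flower-rich) for
# 38 sites, drawn from two loose "urban" and "green" groupings.
urban = rng.dirichlet([8, 2, 1], size=19)
green = rng.dirichlet([1, 5, 4], size=19)
X = np.vstack([urban, green])

# Step (ii): PCA on the standardised land-use categories (via SVD).
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
scores = Z @ Vt[:2].T                 # keep two principal components

# Step (iii): k-means on the PC scores gives a categorical land-use variable.
centroids = scores[[0, -1]].copy()    # initialise from one site of each grouping
for _ in range(20):
    d = np.linalg.norm(scores[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    centroids = np.array([scores[labels == k].mean(axis=0) for k in range(2)])

print(labels[:19].mean(), labels[19:].mean())  # clusters should track the groupings
```

    In the protocol this is repeated at each candidate radius, and model selection then picks the spatial scale whose categorical variable best explains the pollinator response.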

  13. Ptychography: use of quantitative phase information for high-contrast label free time-lapse imaging of living cells

    NASA Astrophysics Data System (ADS)

    Suman, Rakesh; O'Toole, Peter

    2014-03-01

    Here we report a novel label free, high contrast and quantitative method for imaging live cells. The technique reconstructs an image from overlapping diffraction patterns using a ptychographical algorithm. The algorithm utilises both amplitude and phase data from the sample to report on quantitative changes related to the refractive index (RI) and thickness of the specimen. We report the ability of this technique to generate high contrast images, to visualise neurite elongation in neuronal cells, and to provide a measure of cell proliferation.
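    The link between measured phase and the product of refractive index and thickness is standard optics rather than anything specific to this paper: the phase shift through a cell is 2π·(n_cell − n_medium)·t/λ. A minimal numerical check, with illustrative refractive indices and thickness:

```python
import math

# Standard quantitative-phase relation (illustrative values, not from the paper):
# phase shift = 2 * pi * (n_cell - n_medium) * thickness / wavelength.
wavelength_nm = 633.0
n_cell, n_medium = 1.38, 1.335        # assumed cell and medium refractive indices
thickness_nm = 3000.0

phase_rad = 2 * math.pi * (n_cell - n_medium) * thickness_nm / wavelength_nm
# Inverting the relation recovers thickness from a measured phase value, which
# is why phase maps report on RI and thickness changes, as the abstract notes.
recovered_t = phase_rad * wavelength_nm / (2 * math.pi * (n_cell - n_medium))
print(round(phase_rad, 2), round(recovered_t))
```

    Because phase depends on the RI-thickness product, separating the two requires extra information (e.g., a known medium or a second wavelength).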

  14. MO-C-BRB-00: President’s Symposium: Revitalizing Scientific Excellence: Turning Research Into Clinical Reality Through Translational Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    Diagnostic radiology and radiation oncology are arguably two of the most technologically advanced specialties in medicine. The imaging and radiation medicine technologies in clinical use today have been continuously improved through new advances made in the commercial and academic research arenas. This symposium explores the translational path from research through clinical implementation. Dr. Pettigrew will start this discussion by sharing his perspectives as director of the National Institute of Biomedical Imaging and Bioengineering (NIBIB). The NIBIB has focused on promoting research that is technological in nature and has high clinical impact. We are in the age of precision medicine, and the technological innovations and quantitative tools developed by engineers and physicists working with physicians are providing innovative tools that increase precision and improve outcomes in health care. NIBIB funded grants lead to a very high patenting rate (per grant dollar), and these patents have higher citation rates by other patents, suggesting greater clinical impact, as well. Two examples of clinical translation resulting from NIH-funded research will be presented, in radiation therapy and diagnostic imaging. Dr. Yu will describe a stereotactic radiotherapy device developed in his laboratory that is designed for treating breast cancer with the patient in the prone position. It uses 36 rotating Cobalt-60 sources positioned in an annular geometry to focus the radiation beam at the system’s isocenter. The radiation dose is delivered throughout the target volume in the breast by constantly moving the patient in a planned trajectory relative to the fixed isocenter. With this technique, the focal spot dynamically paints the dose distribution throughout the target volume in three dimensions. Dr. 
Jackson will conclude this symposium by describing the RSNA Quantitative Imaging Biomarkers Alliance (QIBA), which is funded in part by NIBIB and is a synergistic collaboration between medical physicists, radiologists, oncologists, industry representatives, and other stakeholders. The mission of QIBA is to improve the accuracy and practicality of quantitative image-based biomarkers by increasing precision across devices, patients, and time, an essential step in incorporating quantitative imaging biomarkers into radiology practice. Validated quantitative imaging biomarkers are necessary to support precision medicine initiatives, multimodality / multiparametric applications in medicine, treatment planning and response assessment, and radiogenomics applications. Current applications in the QIBA portfolio extend to cancer diagnosis and treatment, pulmonary diseases, and neurological disorders. The overall goal of this symposium is to illustrate the bidirectional exchange between medical research and clinical practice. Revitalizing scientific excellence in clinical medical physics challenges practitioners to identify clinical limitations, which then drive research innovation; research funded by the NIH and other agencies develops technological solutions to these limitations, which are translated to the care environment to ultimately improve clinical practice in radiology and radiation oncology.

  15. Simultaneous off-axis multiplexed holography and regular fluorescence microscopy of biological cells.

    PubMed

    Nygate, Yoav N; Singh, Gyanendra; Barnea, Itay; Shaked, Natan T

    2018-06-01

    We present a new technique for obtaining simultaneous multimodal quantitative phase and fluorescence microscopy of biological cells, providing both quantitative phase imaging and molecular specificity using a single camera. Our system is based on an interferometric multiplexing module, externally positioned at the exit of an optical microscope. In contrast to previous approaches, the presented technique allows conventional fluorescence imaging, rather than interferometric off-axis fluorescence imaging. We demonstrate the presented technique for imaging fluorescent beads and live biological cells.

  16. Applying Knowledge of Quantitative Design and Analysis

    ERIC Educational Resources Information Center

    Baskas, Richard S.

    2011-01-01

    This study compared and contrasted two quantitative scholarly articles in relation to their research designs. Their designs were analyzed by the comparison of research references and research specific vocabulary to describe how various research methods were used. When researching and analyzing quantitative scholarly articles, it is imperative to…

  17. 76 FR 52383 - Reports, Forms, and Recordkeeping Requirements

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-22

    ... OMB: Title: 49 CFR 575--Consumer Information Regulations (sections 103 and 105) Quantitative Research... research and is now requesting to conduct follow- up quantitative research with consumers to assess current.... The results of that research phase were used to inform the quantitative phase of research which this...

  18. Composing, Analyzing and Validating Software Models

    NASA Astrophysics Data System (ADS)

    Sheldon, Frederick T.

    1998-10-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  19. Composing, Analyzing and Validating Software Models

    NASA Technical Reports Server (NTRS)

    Sheldon, Frederick T.

    1998-01-01

    This research has been conducted at the Computational Sciences Division of the Information Sciences Directorate at Ames Research Center (Automated Software Engineering Grp). The principal work this summer has been to review and refine the agenda that was carried forward from last summer. Formal specifications provide good support for designing a functionally correct system; however, they are weak at incorporating non-functional performance requirements (like reliability). Techniques which utilize stochastic Petri nets (SPNs) are good for evaluating the performance and reliability of a system, but they may be too abstract and cumbersome from the standpoint of specifying and evaluating functional behavior. Therefore, one major objective of this research is to provide an integrated approach to assist the user in specifying both functionality (qualitative: mutual exclusion and synchronization) and performance requirements (quantitative: reliability and execution deadlines). In this way, the merits of a powerful modeling technique for performability analysis (using SPNs) can be combined with a well-defined formal specification language. In doing so, we can come closer to providing a formal approach to designing a functionally correct system that meets reliability and performance goals.

  20. Integrating service development with evaluation in telehealthcare: an ethnographic study

    PubMed Central

    Finch, Tracy; May, Carl; Mair, Frances; Mort, Maggie; Gask, Linda

    2003-01-01

    Objectives To identify issues that facilitate the successful integration of evaluation and development of telehealthcare services. Design Ethnographic study using various qualitative research techniques to obtain data from several sources, including in-depth semistructured interviews, project steering group meetings, and public telehealthcare meetings. Setting Seven telehealthcare evaluation projects (four randomised controlled trials and three pragmatic service evaluations) in the United Kingdom, studied over two years. Projects spanned a range of specialties—dermatology, psychiatry, respiratory medicine, cardiology, and oncology. Participants Clinicians, managers, technical experts, and researchers involved in the projects. Results and discussion Key problems in successfully integrating evaluation and service development in telehealthcare are, firstly, defining existing clinical practices (and anticipating changes) in ways that permit measurement; secondly, managing additional workload and conflicting responsibilities brought about by combining clinical and research responsibilities (including managing risk); and, thirdly, understanding various perspectives on effectiveness and the limitations of evaluation results beyond the context of the research study. Conclusions Combined implementation and evaluation of telehealthcare systems is complex, and is often underestimated. The distinction between quantitative outcomes and the workability of the system is important for producing evaluative knowledge that is of practical value. More pragmatic approaches to evaluation, that permit both quantitative and qualitative methods, are required to improve the quality of such research and its relevance for service provision in the NHS. PMID:14630758

  1. Methodological triangulation: an approach to understanding data.

    PubMed

    Bekhet, Abir K; Zauszniewski, Jaclene A

    2012-01-01

    To describe the use of methodological triangulation in a study of how people who had moved to retirement communities were adjusting. Methodological triangulation involves using more than one kind of method to study a phenomenon. It has been found to be beneficial in providing confirmation of findings, more comprehensive data, increased validity and enhanced understanding of studied phenomena. While many researchers have used this well-established technique, there are few published examples of its use. The authors used methodological triangulation in their study of people who had moved to retirement communities in Ohio, US. A blended qualitative and quantitative approach was used. The collected qualitative data complemented and clarified the quantitative findings by helping to identify common themes. Qualitative data also helped in understanding interventions for promoting 'pulling' factors and for overcoming 'pushing' factors of participants. The authors used focused research questions to reflect the research's purpose and four evaluative criteria--'truth value', 'applicability', 'consistency' and 'neutrality'--to ensure rigour. This paper provides an example of how methodological triangulation can be used in nursing research. It identifies challenges associated with methodological triangulation, recommends strategies for overcoming them, provides a rationale for using triangulation and explains how to maintain rigour. Methodological triangulation can be used to enhance the analysis and the interpretation of findings. As data are drawn from multiple sources, it broadens the researcher's insight into the different issues underlying the phenomena being studied.

  2. Overview of Student Affairs Research Methods: Qualitative and Quantitative.

    ERIC Educational Resources Information Center

    Perl, Emily J.; Noldon, Denise F.

    2000-01-01

    Reviews the strengths and weaknesses of quantitative and qualitative research in student affairs research, noting that many student affairs professionals question the value of more traditional quantitative approaches to research, though they typically have very good people skills that they have applied to being good qualitative researchers.…

  3. Qualitative Research? Quantitative Research? What's the Problem? Resolving the Dilemma via a Postconstructivist Approach.

    ERIC Educational Resources Information Center

    Shank, Gary

    It is argued that the debate between qualitative and quantitative research for educational researchers is actually an argument between constructivism and positivism. Positivism has been the basis for most quantitative research in education. Two different things are actually meant when constructivism is discussed (constructivism and…

  4. Quantitative Research Attitudes and Research Training Perceptions among Master's-Level Students

    ERIC Educational Resources Information Center

    Steele, Janeé M.; Rawls, Glinda J.

    2015-01-01

    This study explored master's-level counseling students' (N = 804) perceptions of training in the Council for Accreditation of Counseling and Related Educational Programs (2009) Research and Program Evaluation standard, and their attitudes toward quantitative research. Training perceptions and quantitative research attitudes were low to moderate,…

  5. Detection and Quantification of Graphene-Family Nanomaterials in the Environment.

    PubMed

    Goodwin, David G; Adeleye, Adeyemi S; Sung, Lipiin; Ho, Kay T; Burgess, Robert M; Petersen, Elijah J

    2018-04-17

    An increase in production of commercial products containing graphene-family nanomaterials (GFNs) has led to concern over their release into the environment. The fate and potential ecotoxicological effects of GFNs in the environment are currently unclear, partially due to the limited analytical methods for GFN measurements. In this review, the unique properties of GFNs that are useful for their detection and quantification are discussed. The capacity of several classes of techniques to identify and/or quantify GFNs in different environmental matrices (water, soil, sediment, and organisms), after environmental transformations, and after release from a polymer matrix of a product is evaluated. Extraction and strategies to combine methods for more accurate discrimination of GFNs from environmental interferences as well as from other carbonaceous nanomaterials are recommended. Overall, a comprehensive review of the techniques available to detect and quantify GFNs are systematically presented to inform the state of the science, guide researchers in their selection of the best technique for the system under investigation, and enable further development of GFN metrology in environmental matrices. Two case studies are described to provide practical examples of choosing which techniques to utilize for detection or quantification of GFNs in specific scenarios. Because the available quantitative techniques are somewhat limited, more research is required to distinguish GFNs from other carbonaceous materials and improve the accuracy and detection limits of GFNs at more environmentally relevant concentrations.

  6. Noninvasive imaging of bone microarchitecture

    PubMed Central

    Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila

    2015-01-01

    The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043

  7. Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security

    NASA Astrophysics Data System (ADS)

    Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver

    This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic, while pilot studies are chosen from airport security. The outline proceeds in three major steps: (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is built up in the project.

  8. Non-intrusive flow measurements on a reentry vehicle

    NASA Technical Reports Server (NTRS)

    Miles, R. B.; Satavicca, D. A.; Zimmermann, G. M.

    1983-01-01

    This study evaluates the utility of various non-intrusive techniques for the measurement of the flow field on the windward side of the Space Shuttle or a similar re-entry vehicle. Included are linear (Rayleigh, Raman, Mie, Laser Doppler Velocimetry, Resonant Doppler Velocimetry) and nonlinear (Coherent Anti-Stokes Raman, Laser Induced Fluorescence) light scattering, electron beam fluorescence, thermal emission and mass spectroscopy. Flow field properties are taken from a nonequilibrium flow model by Shinn, Moss and Simmonds at NASA Langley. Conclusions are, when possible, based on quantitative scaling of known laboratory results to the conditions projected. Detailed discussion with researchers in the field contributed further to these conclusions and provided valuable insights regarding the experimental feasibility of each of the techniques.

  9. Preparation and quantification of radioactive particles for tracking hydrodynamic behavior in multiphase reactors.

    PubMed

    Yunos, Mohd Amirul Syafiq Mohd; Hussain, Siti Aslina; Yusoff, Hamdan Mohamed; Abdullah, Jaafar

    2014-09-01

    Radioactive particle tracking (RPT) has emerged as a promising and versatile technique that can provide rich information about a variety of multiphase flow systems. However, RPT is not an off-the-shelf technique, and thus, users must customize RPT for their applications. This paper presents a simple procedure for preparing radioactive tracer particles created via irradiation with neutrons from the TRIGA Mark II research reactor. The present study focuses on the performance evaluation of encapsulated gold and scandium particles for applications as individual radioactive tracer particles using qualitative and quantitative neutron activation analysis (NAA) and an X-ray microcomputed tomography (X-ray Micro-CT) scanner installed at the Malaysian Nuclear Agency. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Stereo vision techniques for telescience

    NASA Astrophysics Data System (ADS)

    Hewett, S.

    1990-02-01

    The Botanic Experiment is one of the pilot experiments in the Telescience Test Bed program at the ESTEC research and technology center of the European Space Agency. The aim of the Telescience Test Bed is to develop the techniques required by an experimenter using a ground based work station for remote control, monitoring, and modification of an experiment operating on a space platform. The purpose of the Botanic Experiment is to examine the growth of seedlings under various illumination conditions with a video camera from a number of viewpoints throughout the duration of the experiment. This paper describes the Botanic Experiment and the points addressed in developing a stereo vision software package to extract quantitative information about the seedlings from the recorded video images.

  11. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards.

    PubMed

    Halvorsen, Ken; Agris, Paul F

    2014-11-15

    Measuring interactions between biological molecules is vitally important to both basic and applied research as well as development of pharmaceuticals. Although a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using isothermal titration calorimetry, differential scanning calorimetry, and ultraviolet-visible (UV-vis) monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. Copyright © 2014 Elsevier Inc. All rights reserved.
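    One way such reference standards are used is a van't Hoff analysis of melting data: under a two-state approximation, ln K is linear in 1/T, so a fit recovers the enthalpy and entropy of duplex formation. The sketch below uses invented thermodynamic values and a unimolecular-style equilibrium constant for simplicity (real duplex formation is bimolecular and concentration-dependent); it only demonstrates the fitting arithmetic.

```python
import math

# Van't Hoff sketch (two-state approximation, hypothetical numbers):
# ln K = -dH/(R*T) + dS/R, so fitting ln K against 1/T recovers dH and dS.
R = 8.314            # J/(mol K)
dH_true = -300e3     # J/mol, illustrative duplex-formation enthalpy
dS_true = -900.0     # J/(mol K), illustrative entropy

temps = [310.0, 320.0, 330.0, 340.0]                 # K
lnK = [(-dH_true + T * dS_true) / (R * T) for T in temps]

# Ordinary least-squares line through (1/T, ln K).
xs = [1 / T for T in temps]
n = len(xs)
slope = (n * sum(x * y for x, y in zip(xs, lnK)) - sum(xs) * sum(lnK)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
intercept = (sum(lnK) - slope * sum(xs)) / n

dH_fit = -slope * R   # slope = -dH/R
dS_fit = intercept * R
print(round(dH_fit / 1000), round(dS_fit))
```

    With a well-characterized standard, the fitted values can be compared across ITC, DSC, and UV-melting instruments, which is the cross-platform validation role the authors propose.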

  12. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards

    PubMed Central

    Halvorsen, Ken; Agris, Paul F.

    2014-01-01

    Measuring interactions between biological molecules is vitally important to both basic and applied research, as well as development of pharmaceuticals. While a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel, or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using isothermal titration calorimetry, differential scanning calorimetry, and UV-Vis monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. PMID:25124363

  13. Critical Quantitative Inquiry in Context

    ERIC Educational Resources Information Center

    Stage, Frances K.; Wells, Ryan S.

    2014-01-01

    This chapter briefly traces the development of the concept of critical quantitative inquiry, provides an expanded conceptualization of the tasks of critical quantitative research, offers theoretical explanation and justification for critical research using quantitative methods, and previews the work of quantitative criticalists presented in this…

  14. Can we predict necrosis intra-operatively? Real-time optical quantitative perfusion imaging in surgery: study protocol for a prospective, observational, in vivo pilot study.

    PubMed

    Jansen, Sanne M; de Bruin, Daniel M; van Berge Henegouwen, Mark I; Strackee, Simon D; Veelo, Denise P; van Leeuwen, Ton G; Gisbertz, Suzanne S

    2017-01-01

    Compromised perfusion as a result of surgical intervention causes a reduction of oxygen and nutrients in tissue and therefore decreased tissue vitality. Quantitative imaging of tissue perfusion during reconstructive surgery, therefore, may reduce the incidence of complications. Non-invasive optical techniques allow real-time tissue imaging, with high resolution and high contrast. The objectives of this study are, first, to assess the feasibility and accuracy of optical coherence tomography (OCT), sidestream darkfield microscopy (SDF), laser speckle contrast imaging (LSCI), and fluorescence imaging (FI) for quantitative perfusion imaging and, second, to identify criteria that enable risk prediction of necrosis during gastric tube and free flap reconstruction. This prospective, multicenter, observational in vivo pilot study will assess tissue perfusion using four optical technologies: OCT, SDF, LSCI, and FI in 40 patients: 20 patients who will undergo gastric tube reconstruction after esophagectomy and 20 patients who will undergo free flap surgery. Intra-operative images of gastric perfusion will be obtained directly after reconstruction at four perfusion areas. Feasibility of perfusion imaging will be analyzed per technique. Quantitative parameters directly related to perfusion will be scored per perfusion area, and differences between biologically good versus reduced perfusion will be tested statistically. Patient outcome will be correlated to images and perfusion parameters. Differences in perfusion parameters before and after a bolus of ephedrine will be tested for significance. This study will identify quantitative perfusion-related parameters for an objective assessment of tissue perfusion during surgery. This will likely allow early risk stratification of necrosis development, which will aid in achieving a reduction of complications in gastric tube reconstruction and free flap transplantation. Clinicaltrials.gov registration number: NCT02902549. Dutch Central Committee on Research Involving Human Subjects registration number: NL52377.018.15.

  15. Earthquake prediction evaluation standards applied to the VAN Method

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

    Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.

  16. Ultrasonic Nondestructive Evaluation Techniques Applied to the Quantitative Characterization of Textile Composite Materials

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1998-01-01

    An overall goal of this research has been to enhance our understanding of the scientific principles necessary to develop advanced ultrasonic nondestructive techniques for the quantitative characterization of advanced composite structures. To this end, we have investigated a thin woven composite (5-harness biaxial weave). We have studied the effects that variations of the physical parameters of the experimental setup can have on the ultrasonic determination of the material properties for this thin composite. In particular, we have considered the variation of the nominal center frequency and the f-number of the transmitting transducer, which in turn address issues such as focusing and beam spread of ultrasonic fields. This study has employed a planar, two-dimensional, receiving pseudo-array that has permitted investigation of the diffraction patterns of ultrasonic fields. Distortion of the ultrasonic field due to the spatial anisotropy of the thin composite has prompted investigation of the phenomenon of phase cancellation at the face of a finite-aperture, piezoelectric receiver. We have performed phase-sensitive and phase-insensitive analyses to provide a measure of the amount of phase cancellation at the face of a finite-aperture, piezoelectric receiver. The pursuit of robust measurements of received energy (i.e., those not susceptible to phase cancellation at the face of a finite-aperture, piezoelectric receiver) supports the development of robust techniques to determine material properties from measured ultrasonic parameters.

  17. The Advantages and Disadvantages of Using Qualitative and Quantitative Approaches and Methods in Language "Testing and Assessment" Research: A Literature Review

    ERIC Educational Resources Information Center

    Rahman, Md Shidur

    2017-01-01

    The researchers of various disciplines often use qualitative and quantitative research methods and approaches for their studies. Some of these researchers like to be known as qualitative researchers; others like to be regarded as quantitative researchers. The researchers, thus, are sharply polarised; and they involve in a competition of pointing…

  18. Characterization of Model-Based Reasoning Strategies for Use in IVHM Architectures

    NASA Technical Reports Server (NTRS)

    Poll, Scott; Iverson, David; Patterson-Hine, Ann

    2003-01-01

    Open architectures are gaining popularity for Integrated Vehicle Health Management (IVHM) applications due to the diversity of subsystem health monitoring strategies in use and the need to integrate a variety of techniques at the system health management level. The basic concept of an open architecture suggests that whatever monitoring or reasoning strategy a subsystem wishes to deploy, the system architecture will support the needs of that subsystem and will be capable of transmitting subsystem health status across subsystem boundaries and up to the system level for system-wide fault identification and diagnosis. There is a need to understand the capabilities of various reasoning engines and how they, coupled with intelligent monitoring techniques, can support fault detection and system level fault management. Researchers in IVHM at NASA Ames Research Center are supporting the development of an IVHM system for liquefying-fuel hybrid rockets. In the initial stage of this project, a few readily available reasoning engines were studied to assess candidate technologies for application in next generation launch systems. Three tools representing the spectrum of model-based reasoning approaches, from a quantitative simulation based approach to a graph-based fault propagation technique, were applied to model the behavior of the Hybrid Combustion Facility testbed at Ames. This paper summarizes the characterization of the modeling process for each of the techniques.

  19. Imaging challenges in biomaterials and tissue engineering

    PubMed Central

    Appel, Alyssa A.; Anastasio, Mark A.; Larson, Jeffery C.; Brey, Eric M.

    2013-01-01

    Biomaterials are employed in the fields of tissue engineering and regenerative medicine (TERM) in order to enhance the regeneration or replacement of tissue function and/or structure. The unique environments resulting from the presence of biomaterials, cells, and tissues pose distinct challenges with regard to monitoring and assessing the results of these interventions. Imaging technologies for three-dimensional (3D) analysis have been identified as a strategic priority in TERM research. Traditionally, histological and immunohistochemical techniques have been used to evaluate engineered tissues. However, these methods do not allow for an accurate volume assessment, are invasive, and do not provide information on functional status. Imaging techniques are needed that enable non-destructive, longitudinal, quantitative, and three-dimensional analysis of TERM strategies. This review focuses on evaluating the application of available imaging modalities for assessment of biomaterials and tissue in TERM applications. Included is a discussion of limitations of these techniques and identification of areas for further development. PMID:23768903

  20. Identification of Aroma Compounds of Lamiaceae Species in Turkey Using the Purge and Trap Technique

    PubMed Central

    Sonmezdag, Ahmet Salih; Kelebek, Hasim; Selli, Serkan

    2017-01-01

    The present research was planned to characterize the aroma composition of important members of the Lamiaceae family such as Salvia officinalis, Lavandula angustifolia and Mentha asiatica. Aroma components of S. officinalis, L. angustifolia and M. asiatica were extracted with the purge and trap technique with dichloromethane and analyzed by gas chromatography–mass spectrometry (GC–MS). A total of 23, 33 and 33 aroma compounds were detected in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively, including acids, alcohols, aldehydes, esters, hydrocarbons and terpenes. Terpene compounds were both qualitatively and quantitatively the major chemical group among the identified aroma compounds, followed by esters. The main terpene compounds were 1,8-cineole, sabinene and linalool in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively. Among esters, linalyl acetate was the most important and the only one detected in all samples. PMID:28231089

  1. Identification of Aroma Compounds of Lamiaceae Species in Turkey Using the Purge and Trap Technique.

    PubMed

    Sonmezdag, Ahmet Salih; Kelebek, Hasim; Selli, Serkan

    2017-02-08

    The present research was planned to characterize the aroma composition of important members of the Lamiaceae family such as Salvia officinalis, Lavandula angustifolia and Mentha asiatica. Aroma components of S. officinalis, L. angustifolia and M. asiatica were extracted with the purge and trap technique with dichloromethane and analyzed by gas chromatography-mass spectrometry (GC-MS). A total of 23, 33 and 33 aroma compounds were detected in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively, including acids, alcohols, aldehydes, esters, hydrocarbons and terpenes. Terpene compounds were both qualitatively and quantitatively the major chemical group among the identified aroma compounds, followed by esters. The main terpene compounds were 1,8-cineole, sabinene and linalool in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively. Among esters, linalyl acetate was the most important and the only one detected in all samples.

  2. [Quantitative methods for the determination of water-soluble vitamins in premixes and fortified food products by micellar electrokinetic chromatography at the short end of the capillary].

    PubMed

    Bogachuk, M N; Bessonov, V V; Perederiaev, O I

    2011-01-01

    A new micellar electrokinetic chromatography technique at the short end of the capillary (Agilent 3D CE capillary electrophoresis system, DAD, quartz capillary HPCE stndrd cap 56 cm, 50 μm; 50 mM borate buffer, pH 9.3; 100 mM sodium dodecyl sulfate) was developed for the simultaneous determination of water-soluble vitamins (B1, B2, B6, B12, PP, B5, B9, C, B8) in fortified food products and premixes. The technique was applied to 6 samples of vitamin premixes and 28 samples of fortified food products. The findings are consistent with results for individual vitamins obtained by other methods. The developed technique can be used for the analysis of water-soluble vitamins in premixes and fortified food products.

  3. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    NASA Astrophysics Data System (ADS)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2-weighted, not T2*-weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm², within 20 min, and monitored over ~13 h. The tablet was observed to experience a heterogeneous flow field and, hence, a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm². The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.

  4. Perceptions of adolescents' sexual and reproductive health and rights: a cross-sectional study in Lahore District, Pakistan.

    PubMed

    Iqbal, Sarosh; Zakar, Rubeena; Zakar, Muhammad Zakria; Fischer, Florian

    2017-02-23

    Sexual and reproductive health (SRH) is a significant aspect of adolescents' growth, safeguarded by SRH rights (SRHR). Despite various global efforts to promote adolescent SRHR (ASRHR), the majority of adolescents still lack the awareness and autonomy to access SRH-related information and services. This research aimed to explore knowledge and perceptions of ASRHR and to highlight key constraints hindering adolescents from accessing and exercising SRHR in the district of Lahore, Pakistan. The research used a mixed-methods approach combining quantitative and qualitative methods. For the quantitative component, a household survey was conducted with 600 respondents, including adolescents (15-19 years) and their parents/caregivers, selected by a multistage cluster random sampling technique based on the population proportion of administrative towns in Lahore district. Quantitative data were collected using a standardized structured questionnaire and analysed with SPSS version 21. For the qualitative component, 12 in-depth interviews with teachers and doctors and four focus group discussions with adolescents were conducted and analysed thematically. The research revealed a low level of awareness of ASRHR amongst respondents and identified socio-cultural and structural constraints as the major underlying issues. Although more than half of the respondents were aware of ASRHR, agreed on their importance and favoured adolescents having access to the requisite information, they nonetheless believed that adolescents had limited ability to exercise these rights. The research found a low level of awareness of ASRHR amongst adolescents and their parents/caregivers in Lahore district, underscoring the need for a rights-based approach. 
There is an urgent need to design specific policies and educational programmes to promote healthy practices. Research is recommended to inform and advocate to the Punjab Government and communities, including partners, teachers, doctors, religious scholars and media groups, to empower adolescents through health education. This can be achieved through the inclusion of SRH topics in educational curricula, establishing a virtual knowledge centre, encouraging debate competitions, and organising orientation sessions for professionals/experts and the community.

  5. A Bibliometric Analysis on Cancer Population Science with Topic Modeling.

    PubMed

    Li, Ding-Cheng; Rastegar-Mojarad, Majid; Okamoto, Janet; Liu, Hongfang; Leichow, Scott

    2015-01-01

    Bibliometric analysis is a research method used in library and information science to evaluate research performance. It applies quantitative and statistical analyses to describe patterns observed in a set of publications and can help identify previous, current, and future research trends or foci. To better guide our institutional strategic plan in cancer population science, we conducted a bibliometric analysis of publications by investigators currently funded by either the Division of Cancer Prevention (DCP) or the Division of Cancer Control and Population Sciences (DCCPS) at the National Cancer Institute. We applied two topic modeling techniques: author topic modeling (AT) and dynamic topic modeling (DTM). Our initial results show that AT can reasonably address questions about investigators' research interests and the distribution and popularity of research topics. Complementarily, DTM can capture the evolving trend of each topic by displaying changes in the proportions of key words, which is consistent with the changes of MeSH headings.
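
    The author and dynamic topic models the abstract names are beyond a short example, but the underlying bibliometric idea — tracking how the prevalence of a term changes across publication years — can be illustrated with a plain term-frequency sketch. The records and the `term_trend` helper below are hypothetical, not drawn from the study:

```python
from collections import defaultdict

# Hypothetical publication records: (year, title)
records = [
    (2012, "smoking cessation intervention trial"),
    (2012, "colorectal cancer screening uptake"),
    (2013, "tobacco smoking cessation outcomes"),
    (2014, "hpv vaccination uptake survey"),
    (2014, "cancer screening disparities study"),
]

def term_trend(records, term):
    """Proportion of titles containing `term`, per year -- a crude
    analogue of a dynamic topic model's term-proportion trajectory."""
    totals, hits = defaultdict(int), defaultdict(int)
    for year, title in records:
        totals[year] += 1
        hits[year] += term in title.split()  # bool counts as 0/1
    return {y: hits[y] / totals[y] for y in sorted(totals)}

print(term_trend(records, "screening"))
```

    A real DTM additionally ties the years together with a state-space model over topic-word distributions; this sketch only shows the raw per-year proportions such a model smooths.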

  6. The effect of Missouri mathematics project learning model on students’ mathematical problem solving ability

    NASA Astrophysics Data System (ADS)

    Handayani, I.; Januar, R. L.; Purwanto, S. E.

    2018-01-01

    This research aims to determine the influence of the Missouri Mathematics Project learning model on the mathematical problem-solving ability of junior high school students. It is a quantitative study using a quasi-experimental design. The population comprises all grade VII junior high school students enrolled in the even semester of the 2016/2017 academic year. The sample consists of 76 students drawn from experimental and control groups using cluster sampling. The instrument consists of 7 essay questions whose validity, reliability, difficulty level and discriminating power have been tested. Before the t-test analysis, the data were checked and found to satisfy the requirements of normality and homogeneity. The results show that the Missouri Mathematics Project learning model has a medium-sized effect on the mathematical problem-solving ability of junior high school students.
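
    The two-sample t-test step described in the abstract (applied after normality and homogeneity checks pass) can be sketched in Python with the standard library. The scores below are hypothetical, not the study's data, and the assumption checks themselves are omitted:

```python
import math
from statistics import mean, variance

def pooled_t(sample_a, sample_b):
    """Student's two-sample t statistic with pooled variance -- the form
    appropriate when normality and homogeneity of variance hold."""
    na, nb = len(sample_a), len(sample_b)
    sp2 = ((na - 1) * variance(sample_a) + (nb - 1) * variance(sample_b)) / (na + nb - 2)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical problem-solving scores for experimental vs control groups
experimental = [78, 85, 80, 90, 74, 88]
control = [70, 75, 72, 68, 80, 71]
print(round(pooled_t(experimental, control), 2))
```

    The resulting statistic would be compared against a t distribution with na + nb − 2 degrees of freedom to decide significance.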

  7. Student’s rigorous mathematical thinking based on cognitive style

    NASA Astrophysics Data System (ADS)

    Fitriyani, H.; Khasanah, U.

    2017-12-01

    The purpose of this research was to determine the rigorous mathematical thinking (RMT) of mathematics education students solving math problems, in terms of reflective and impulsive cognitive styles. The research used a descriptive qualitative approach. The subjects were 4 students, one male and one female for each of the reflective and impulsive cognitive styles. Data were collected through a problem-solving test and interviews and analysed using the Miles and Huberman model: data reduction, data presentation, and conclusion drawing. The results showed that the impulsive male subject used all three levels of cognitive function required for RMT, namely qualitative thinking, quantitative thinking with precision, and relational thinking, while the other three subjects were only able to use cognitive function at the qualitative thinking level of RMT. The impulsive male subject therefore has better RMT ability than the other three research subjects.

  8. Incorporating Multiple-Choice Questions into an AACSB Assurance of Learning Process: A Course-Embedded Assessment Application to an Introductory Finance Course

    ERIC Educational Resources Information Center

    Santos, Michael R.; Hu, Aidong; Jordan, Douglas

    2014-01-01

    The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…

  9. Bound Pool Fractions Complement Diffusion Measures to Describe White Matter Micro and Macrostructure

    PubMed Central

    Stikov, Nikola; Perry, Lee M.; Mezer, Aviv; Rykhlevskaia, Elena; Wandell, Brian A.; Pauly, John M.; Dougherty, Robert F.

    2010-01-01

    Diffusion imaging and bound pool fraction (BPF) mapping are two quantitative magnetic resonance imaging techniques that measure microstructural features of the white matter of the brain. Diffusion imaging provides a quantitative measure of the diffusivity of water in tissue. BPF mapping is a quantitative magnetization transfer (qMT) technique that estimates the proportion of exchanging protons bound to macromolecules, such as those found in myelin, and is thus a more direct measure of myelin content than diffusion. In this work, we combine BPF estimates of macromolecular content with measurements of diffusivity within human white matter tracts. Within the white matter, the correlation between BPFs and diffusivity measures such as fractional anisotropy and radial diffusivity was modest, suggesting that diffusion tensor imaging and bound pool fractions are complementary techniques. We found that several major tracts have high BPF, suggesting a higher density of myelin in these tracts. We interpret these results in the context of a quantitative tissue model. PMID:20828622
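
    The "modest correlation" between BPF and diffusion metrics reported above is an ordinary Pearson correlation across white matter measurements. A self-contained Python sketch, with hypothetical per-tract values standing in for real BPF and fractional anisotropy (FA) data:

```python
import math
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient, as used to compare bound pool
    fraction (BPF) against diffusion metrics such as FA."""
    mx, my = mean(xs), mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Hypothetical per-tract values (BPF vs fractional anisotropy)
bpf = [0.10, 0.12, 0.11, 0.14, 0.13]
fa = [0.45, 0.50, 0.44, 0.55, 0.49]
print(round(pearson(bpf, fa), 2))
```

    A coefficient well below 1 across tracts is what motivates the paper's claim that the two modalities carry complementary, rather than redundant, microstructural information.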

  10. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. Modern equipment, including digital cameras, LED light sources, and computer software that make this possible are also discussed.
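
    The quantitative pipeline the abstract describes — integrate the measured refractive-index gradient to recover the index field, then convert to density via the Gladstone-Dale relation n = 1 + Kρ — can be sketched in one dimension. The gradient profile below is a hypothetical uniform example, and K is the approximate Gladstone-Dale coefficient for air in visible light:

```python
K_AIR = 2.27e-4  # approximate Gladstone-Dale coefficient for air, m^3/kg

def density_from_gradient(grad, dy, n0):
    """Integrate a 1-D refractive-index gradient profile (dn/dy sampled
    every dy metres, starting from a known index n0) by the trapezoidal
    rule, then convert to density via Gladstone-Dale: rho = (n - 1)/K."""
    n, profile = n0, [n0]
    for i in range(1, len(grad)):
        n += 0.5 * (grad[i - 1] + grad[i]) * dy  # trapezoidal step
        profile.append(n)
    return [(ni - 1.0) / K_AIR for ni in profile]

# Hypothetical uniform gradient, chosen so density rises by 1 kg/m^3 per step
rho = density_from_gradient([2.27e-4] * 5, dy=1.0, n0=1.000227)
print([round(r, 3) for r in rho])
```

    In practice the gradient comes from calibrated schlieren image intensity (e.g. via a weak-lens standard or BOS displacement field), and temperature follows from the density field through an equation of state.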

  11. Advanced Elemental and Isotopic Characterization of Atmospheric Aerosols

    NASA Astrophysics Data System (ADS)

    Shafer, M. M.; Schauer, J. J.; Park, J.

    2001-12-01

    Recent sampling and analytical developments advanced by the project team enable the detailed elemental and isotopic fingerprinting of extremely small masses of atmospheric aerosols. Historically, this type of characterization was rarely achieved due to limitations in analytical sensitivity and a lack of awareness concerning the potential for contamination. However, with the introduction of 3rd and 4th generation ICP-MS instrumentation and the application of state-of-the-art "clean techniques", quantitative analysis of over 40 elements in sub-milligram samples can be realized. When coupled with an efficient and validated solubilization method, ICP-MS approaches provide distinct advantages in comparison with traditional methods: greatly enhanced detection limits, improved accuracy, and isotope resolution capability, to name a few. Importantly, the ICP-MS approach can readily be integrated with techniques which enable phase differentiation and chemical speciation information to be acquired. For example, selective chemical leaching can provide data on the association of metals with major phase components and the oxidation state of certain metals. Critical information on metal-ligand stability can be obtained when electrochemical techniques, such as adsorptive cathodic stripping voltammetry (ACSV), are applied to these same extracts. Our research group is applying these techniques in a broad range of research projects to better understand the sources and distribution of trace metals in particulate matter in the atmosphere. Using examples from our research, including recent Pb and Sr isotope ratio work on Asian aerosols, we will illustrate the capabilities and applications of these new methods.

  12. Understanding quantitative research: part 1.

    PubMed

    Hoe, Juanita; Hoare, Zoë

    This article, which is the first in a two-part series, provides an introduction to understanding quantitative research, basic statistics and terminology used in research articles. Critical appraisal of research articles is essential to ensure that nurses remain up to date with evidence-based practice to provide consistent and high-quality nursing care. This article focuses on developing critical appraisal skills and understanding the use and implications of different quantitative approaches to research. Part two of this article will focus on explaining common statistical terms and the presentation of statistical data in quantitative research.

  13. THz Imaging of Skin Burn: Seeing the Unseen—An Overview

    PubMed Central

    Dutta, Moumita; Bhalla, Amar S.; Guo, Ruyan

    2016-01-01

    Significance: This review article puts together all the studies performed so far in realizing terahertz (THz) spectra as a probing mechanism for burn evaluation, summarizing their experimental conditions, observations, outcomes, merits, and demerits, along with a comparative discussion of other currently used technologies, to present the state of the art in a condensed manner. The key features of this noncontact investigation technique, such as its precise burn depth analysis and the approaches it follows to convert the probed data into a quantitative measure, are also discussed. Recent Advances: The current research developments in the THz regime, observed in device design technologies (like the THz time domain spectrometer, quantum cascade THz lasers, and THz single-photon detectors) and in the understanding of its unique properties (like its nonionizing nature and penetrability through dry dielectrics), have motivated the research world to realize the THz window as a potential candidate for burn detection. Critical Issues: Application of the appropriate medical measure for a burn injury depends primarily on proper estimation of burn depth. A tool that can distinguish partial- from full-thickness burns, and thereby contribute toward correct medical care, is still awaited. Future Directions: The overview of THz imaging as a burn assessment tool provided in this article will help in the further nurturing of this emerging diagnostic technique, particularly in improving its detection and accompanying image processing methods, so that the minute nuances captured by the THz beam can be correlated with the physiological–anatomical changes in skin structures caused by burn, for better sensitivity, resolution, and quantitative analysis. PMID:27602253

  14. A device for rapid and quantitative measurement of cardiac myocyte contractility

    NASA Astrophysics Data System (ADS)

    Gaitas, Angelo; Malhotra, Ricky; Li, Tao; Herron, Todd; Jalife, José

    2015-03-01

    Cardiac contractility is the hallmark of cardiac function and is a predictor of healthy or diseased cardiac muscle. Despite advancements over the last two decades, the techniques and tools available to cardiovascular scientists are limited in their utility to accurately and reliably measure the amplitude and frequency of cardiomyocyte contractions. Isometric force measurements in the past have entailed cumbersome attachment of isolated and permeabilized cardiomyocytes to a force transducer followed by measurements of sarcomere lengths under conditions of submaximal and maximal Ca²⁺ activation. These techniques have the inherent disadvantages of being labor intensive and costly. We have engineered a micro-machined cantilever sensor with an embedded deflection-sensing element that, in preliminary experiments, has been demonstrated to reliably measure cardiac cell contractions in real time. Here, we describe this new bioengineering tool with applicability in the cardiovascular research field to effectively and reliably measure cardiac cell contractility in a quantitative manner. We measured contractility both in primary neonatal rat heart cardiomyocyte monolayers, which demonstrated a beat frequency of 3 Hz, and in human embryonic stem cell-derived cardiomyocytes, with a contractile frequency of about 1 Hz. We also employed the β-adrenergic agonist isoproterenol (100 nmol l⁻¹) and observed that our cantilever demonstrated high sensitivity in detecting subtle changes in both chronotropic and inotropic responses of monolayers. This report describes the utility of our micro-device in both basic cardiovascular research and small molecule drug discovery to monitor cardiac cell contractions.

  15. THz Imaging of Skin Burn: Seeing the Unseen-An Overview.

    PubMed

    Dutta, Moumita; Bhalla, Amar S; Guo, Ruyan

    2016-08-01

    Significance: This review article brings together the studies performed so far on terahertz (THz) spectroscopy as a probing mechanism for burn evaluation, summarizing their experimental conditions, observations, outcomes, merits, and demerits, along with a comparative discussion of other currently used technologies, to present the state of the art in a condensed manner. The key features of this noncontact investigation technique, such as its precise burn depth analysis and the approaches it follows to convert the probed data into a quantitative measure, are also discussed. Recent Advances: Current research developments in the THz regime, both in device design technologies (such as THz time-domain spectrometers, quantum cascade THz lasers, and THz single-photon detectors) and in the understanding of its unique properties (such as its nonionizing nature and penetrability through dry dielectrics), have motivated the research community to consider the THz window a potential candidate for burn detection. Critical Issues: Choosing the appropriate medical treatment for a burn injury depends primarily on accurate estimation of burn depth. A tool modality that reliably distinguishes partial-thickness from full-thickness burns, and thereby contributes to correct medical care, is still awaited. Future Directions: The overview of THz imaging as a burn assessment tool provided in this article should help further nurture this emerging diagnostic technique, particularly by improving its detection and accompanying image-processing methods, so that the minute nuances captured by the THz beam can be correlated with the physiological-anatomical changes in skin structure caused by burns, for better sensitivity, resolution, and quantitative analysis. PMID:27602253
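    One standard first step in converting time-domain THz data into a quantitative measure is to ratio the Fourier spectra of the sample and reference pulses. The sketch below uses idealized synthetic pulses; the 60% reflectivity, pulse shape, and sampling step are purely illustrative assumptions, not values from the studies reviewed:

    ```python
    import numpy as np

    def reflectance_spectrum(sample_pulse, reference_pulse, dt):
        """Frequency-resolved reflectance |E_sample(f) / E_ref(f)|
        computed from time-domain THz pulses sampled at interval dt."""
        s = np.abs(np.fft.rfft(sample_pulse))
        r = np.abs(np.fft.rfft(reference_pulse))
        freqs = np.fft.rfftfreq(len(sample_pulse), d=dt)
        return freqs, s / (r + 1e-30)   # epsilon avoids 0/0 at empty bins

    # Toy data: the "burned skin" echo is modeled as an attenuated copy
    # of the reference pulse (purely illustrative 60% reflectivity).
    dt = 1e-14                                   # 10 fs sampling step
    t = np.arange(2048) * dt
    ref = np.exp(-((t - 5e-12) ** 2) / (2 * (2e-13) ** 2))  # ~0.2 ps pulse
    sample = 0.6 * ref
    freqs, R = reflectance_spectrum(sample, ref, dt)
    print(round(float(R[1]), 2))  # → 0.6
    ```

    In practice, burned and healthy tissue differ in water content and hence in THz reflectance, which is what makes this ratio a candidate quantitative burn measure.
    
    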

  16. Application of ASTAR(TM)/Precession Electron Diffraction Technique to Quantitatively Study Defects in Nanocrystalline Metallic Materials

    NASA Astrophysics Data System (ADS)

    Ghamarian, Iman

    Nanocrystalline metallic materials have the potential to exhibit outstanding performance, which leads to their use in challenging applications such as coatings and biomedical implant devices. To optimize the performance of nanocrystalline metallic materials for a desired application, it is important to have a sound understanding of the structure, processing, and properties of these materials. Various efforts have been made to correlate the microstructure and properties of nanocrystalline metallic materials, and these studies show that microstructure and defects (e.g., dislocations and grain boundaries) play a key role in the behavior of these materials. It is therefore of great importance to establish methods to quantitatively study microstructures, defects, and their interactions in nanocrystalline metallic materials. Since the mechanisms controlling the properties of nanocrystalline metallic materials operate at very small length scales, they are difficult to study, and most characterization techniques lack the spatial resolution required. For instance, by applying complex profile-fitting algorithms to X-ray diffraction patterns, it is possible to estimate the average grain size and average dislocation density within a relatively large area; however, these average values are not sufficient for developing detailed phenomenological models that correlate the microstructure and properties of nanocrystalline metallic materials. As another example, the electron backscatter diffraction technique cannot be widely used to characterize these materials because of its relatively poor spatial resolution (approximately 90 nm) and the degradation of Kikuchi diffraction patterns in severely deformed, nanograined metallic materials.
In this study, ASTAR(TM)/precession electron diffraction is introduced as a relatively new orientation microscopy technique for characterizing defects (e.g., geometrically necessary dislocations and grain boundaries) in challenging nanocrystalline metallic materials. The capability of this technique to quantitatively determine the density distributions of geometrically necessary dislocations in severely deformed metallic materials is assessed. The developed method makes it possible to determine the distributions and accumulations of dislocations with respect to the nearest grain boundaries and triple junctions. The suitability of the technique for studying the grain boundary character distributions of nanocrystalline metallic materials is also presented.
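    A common first-order estimate behind this kind of GND quantification relates local misorientation between neighboring map points to dislocation density, e.g. via the Kubin-Mortensen relation rho ≈ 2θ/(u·b). The step size and Burgers vector below are illustrative assumptions, not values from this work:

    ```python
    import numpy as np

    def gnd_density(misorientation_deg, step_nm, burgers_nm=0.25):
        """Lower-bound GND density (per m^2) from a nearest-neighbour
        misorientation, using rho ≈ 2*theta / (u*b) with theta in
        radians, step size u and Burgers vector b in metres."""
        theta = np.radians(misorientation_deg)
        u = step_nm * 1e-9
        b = burgers_nm * 1e-9
        return 2.0 * theta / (u * b)

    # One degree of local misorientation over a 2 nm orientation-map
    # step (numbers chosen only for illustration).
    rho = gnd_density(1.0, 2.0)
    print(f"{rho:.2e}")  # → 6.98e+16
    ```

    Applied pixel-by-pixel to an ASTAR(TM)/PED orientation map, this yields the kind of spatial GND-density distribution described above.
    
    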

  17. Streamlined approach to mapping the magnetic induction of skyrmionic materials.

    PubMed

    Chess, Jordan J; Montoya, Sergio A; Harvey, Tyler R; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E; McMorran, Benjamin J

    2017-06-01

    Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach. Copyright © 2017 Elsevier B.V. All rights reserved.
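    A minimal sketch of the simplified single-image TIE idea (a uniform-intensity thin film, so the in-focus image can be replaced by the mean intensity) might look like the following; the synthetic phase, defocus, and wavelength are illustrative assumptions, not the authors' code:

    ```python
    import numpy as np

    def tie_phase_single(image, defocus, wavelength, pixel_size):
        """Recover phase from ONE defocused Fresnel image, assuming a
        uniform-intensity thin film. Solves
            lap(phi) = -(2*pi/lam) * (I - I0) / (defocus * I0)
        with an FFT-based inverse Laplacian (I0 = mean intensity)."""
        i0 = image.mean()
        rhs = -(2.0 * np.pi / wavelength) * (image - i0) / (defocus * i0)
        ny, nx = image.shape
        qy = np.fft.fftfreq(ny, d=pixel_size)[:, None]
        qx = np.fft.fftfreq(nx, d=pixel_size)[None, :]
        q2 = (2 * np.pi) ** 2 * (qx ** 2 + qy ** 2)
        q2[0, 0] = np.inf                  # undefined DC term -> zero
        return np.fft.ifft2(np.fft.fft2(rhs) / (-q2)).real

    # Round-trip check on a synthetic sinusoidal phase.
    ny = nx = 64
    px = 5e-9                              # 5 nm pixels
    lam = 2.51e-12                         # ~200 kV electron wavelength
    df = 1e-6                              # 1 um defocus
    y, x = np.mgrid[0:ny, 0:nx] * px
    L = nx * px
    phi_true = np.sin(2 * np.pi * x / L) * np.sin(2 * np.pi * y / L)
    lap = -2 * (2 * np.pi / L) ** 2 * phi_true       # analytic Laplacian
    img = 1.0 - df * (lam / (2 * np.pi)) * lap       # TIE forward model
    print(np.allclose(tie_phase_single(img, df, lam, px), phi_true, atol=1e-6))  # → True
    ```

    For a magnetic thin film, the in-plane induction then follows from the gradient of the recovered phase, which is the quantity of interest for skyrmion imaging.
    
    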

  18. Multistrip western blotting to increase quantitative data output.

    PubMed

    Kiyatkin, Anatoly; Aksamitiene, Edita

    2009-01-01

    The qualitative and quantitative measurement of protein abundance and modification states is essential to understanding protein functions in diverse cellular processes. Typical western blotting, though sensitive, is prone to substantial errors and is not readily adapted to high-throughput technologies. Multistrip western blotting is a modified immunoblotting procedure based on simultaneous electrophoretic transfer of proteins from multiple strips of polyacrylamide gels to a single membrane sheet. In comparison with the conventional technique, Multistrip western blotting increases the data output per blotting cycle up to tenfold, allows concurrent monitoring of up to nine different proteins from the same sample loading, and substantially improves data accuracy by reducing immunoblotting-derived signal errors. This approach enables statistically reliable comparison of different or repeated data sets and is therefore beneficial in biomedical diagnostics, systems biology, and cell signaling research.
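    The quantitative comparison such a procedure enables typically begins by normalizing each target band to its lane's loading control, correcting for loading and transfer variation. A minimal sketch with hypothetical densitometry values:

    ```python
    import numpy as np

    def normalize_bands(target, loading_control):
        """Express each target band intensity as a ratio to its lane's
        loading-control band, the usual correction for loading and
        transfer variation in quantitative western blotting."""
        t = np.asarray(target, dtype=float)
        c = np.asarray(loading_control, dtype=float)
        return t / c

    # Hypothetical densitometry readings from three lanes of one strip.
    ratios = normalize_bands([1200, 2400, 900], [600, 600, 450])
    print(ratios.tolist())  # → [2.0, 4.0, 2.0]
    ```

    Because all strips are transferred to one membrane, such ratios are directly comparable across strips, which is what makes the statistical comparisons mentioned above reliable.
    
    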

  19. Introduction to special section of the Journal of Family Psychology, advances in mixed methods in family psychology: integrative and applied solutions for family science.

    PubMed

    Weisner, Thomas S; Fiese, Barbara H

    2011-12-01

    Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.

  20. Increasing Literacy in Quantitative Methods: The Key to the Future of Canadian Psychology

    PubMed Central

    Counsell, Alyssa; Cribbie, Robert A.; Harlow, Lisa. L.

    2016-01-01

    Quantitative methods (QM) dominate empirical research in psychology. Unfortunately, most researchers in psychology receive inadequate training in QM. This creates a challenge for researchers who require advanced statistical methods to appropriately analyze their data. Many recent concerns about research quality, replicability, and reporting practices are directly tied to the problematic use of QM. As such, improving quantitative literacy in psychology is an important step toward eliminating these concerns. The current paper includes two main sections that discuss quantitative challenges and opportunities. The first section discusses training and resources for students and presents descriptive results on the number of quantitative courses required of and available to graduate students in Canadian psychology departments. In the second section, we discuss ways of improving quantitative literacy for faculty, researchers, and clinicians, with a strong focus on the importance of collaboration. The paper concludes with practical recommendations for improving quantitative skills and literacy for students and researchers in Canada. PMID:28042199
