Sample records for techniques including quantitative

  1. Quantitative Ultrasound for Nondestructive Characterization of Engineered Tissues and Biomaterials

    PubMed Central

    Dalecki, Diane; Mercado, Karla P.; Hocking, Denise C.

    2015-01-01

    Non-invasive, non-destructive technologies for imaging and quantitatively monitoring the development of artificial tissues are critical for the advancement of tissue engineering. Current standard techniques for evaluating engineered tissues, including histology, biochemical assays and mechanical testing, are destructive approaches. Ultrasound is emerging as a valuable tool for imaging and quantitatively monitoring the properties of engineered tissues and biomaterials longitudinally during fabrication and post-implantation. Ultrasound techniques are rapid, non-invasive, non-destructive and can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, high-frequency quantitative ultrasound techniques can enable volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation. This review provides an overview of ultrasound imaging, quantitative ultrasound techniques, and elastography, with representative examples of applications of these ultrasound-based techniques to the field of tissue engineering. PMID:26581347

  2. Review of progress in quantitative NDE [Nondestructive Evaluation (NDE)]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-01-01

    This booklet is composed of abstracts from papers submitted at a meeting on quantitative NDE. A multitude of topics are discussed including analysis of composite materials, NMR uses, x-ray instruments and techniques, manufacturing uses, neural networks, eddy currents, stress measurements, magnetic materials, adhesive bonds, signal processing, NDE of mechanical structures, tomography, defect sizing, NDE of plastics and ceramics, new techniques, optical and electromagnetic techniques, and nonlinear techniques. (GHH)

  3. Quantitative optical metrology with CMOS cameras

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Kolenovic, Ervin; Ferguson, Curtis F.

    2004-08-01

    Recent advances in laser technology, optical sensing, and computer processing of data have led to the development of advanced quantitative optical metrology techniques for high-accuracy measurements of absolute shapes and deformations of objects. These techniques provide noninvasive, remote, and full field of view information about the objects of interest. The information obtained relates to changes in shape and/or size of the objects, characterizes anomalies, and provides tools to enhance fabrication processes. Factors that influence selection and applicability of an optical technique include the required sensitivity, accuracy, and precision that are necessary for a particular application. In this paper, sensitivity, accuracy, and precision characteristics in quantitative optical metrology techniques, and specifically in optoelectronic holography (OEH) based on CMOS cameras, are discussed. Sensitivity, accuracy, and precision are investigated with the aid of National Institute of Standards and Technology (NIST) traceable gauges, demonstrating the applicability of CMOS cameras in quantitative optical metrology techniques. It is shown that the advanced nature of CMOS technology can be applied to challenging engineering applications, including the study of rapidly evolving phenomena occurring in MEMS and micromechatronics.

  4. A quantitative image cytometry technique for time series or population analyses of signaling networks.

    PubMed

    Ozaki, Yu-ichi; Uda, Shinsuke; Saito, Takeshi H; Chung, Jaehoon; Kubota, Hiroyuki; Kuroda, Shinya

    2010-04-01

    Modeling of cellular functions on the basis of experimental observation is increasingly common in the field of cellular signaling. However, such modeling requires a large amount of quantitative data on signaling events with high spatio-temporal resolution. A novel technique that allows us to obtain such data is needed for systems biology of cellular signaling. We developed a fully automatable assay technique, termed quantitative image cytometry (QIC), which integrates a quantitative immunostaining technique and a high-precision image-processing algorithm for cell identification. With the aid of an automated sample preparation system, this device can quantify protein expression, phosphorylation and localization with subcellular resolution at one-minute intervals. The signaling activities quantified by the assay system showed good correlation with, as well as comparable reproducibility to, western blot analysis. Taking advantage of the high spatio-temporal resolution, we investigated the signaling dynamics of the ERK pathway in PC12 cells. The QIC technique is highly quantitative and versatile, and can conveniently replace conventional techniques including western blot, flow cytometry and live cell imaging. Thus, the QIC technique can be a powerful tool for investigating the systems biology of cellular signaling.

  5. Modern quantitative schlieren techniques

    NASA Astrophysics Data System (ADS)

    Hargather, Michael; Settles, Gary

    2010-11-01

    Schlieren optical techniques have traditionally been used to qualitatively visualize refractive flowfields in transparent media. Modern schlieren optics, however, are increasingly focused on obtaining quantitative information such as temperature and density fields in a flow -- once the sole purview of interferometry -- without the need for coherent illumination. Quantitative data are obtained from schlieren images by integrating the measured refractive index gradient to obtain the refractive index field in an image. Ultimately this is converted to a density or temperature field using the Gladstone-Dale relationship, an equation of state, and geometry assumptions for the flowfield of interest. Several quantitative schlieren methods are reviewed here, including background-oriented schlieren (BOS), schlieren using a weak lens as a "standard," and "rainbow schlieren." Results are presented for the application of these techniques to measure density and temperature fields across a supersonic turbulent boundary layer and a low-speed free-convection boundary layer in air. The modern equipment that makes this possible, including digital cameras, LED light sources, and computer software, is also discussed.
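The conversion described in this abstract, from a measured refractive-index gradient to a density field via the Gladstone-Dale relationship (n - 1 = K·ρ), can be sketched as follows. The integration scheme, the value of the Gladstone-Dale constant, and the synthetic test profile are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Gladstone-Dale constant for air (approximate; wavelength-dependent)
K = 2.26e-4  # m^3/kg

def density_from_gradient(dn_dx, dx, n0):
    """Integrate a measured refractive-index gradient along x to
    recover n(x), anchored at a known boundary value n0, then
    convert to density via Gladstone-Dale: n - 1 = K * rho."""
    # cumulative trapezoidal integration of the gradient
    increments = (dn_dx[:-1] + dn_dx[1:]) / 2.0 * dx
    n = n0 + np.concatenate(([0.0], np.cumsum(increments)))
    rho = (n - 1.0) / K
    return n, rho

# synthetic check: weak linear refractive-index profile in air
x = np.linspace(0.0, 0.1, 101)          # 10 cm measurement path
n_true = 1.000271 + 1e-5 * (x / 0.1)    # small index variation
dn_dx = np.gradient(n_true, x)
n_rec, rho = density_from_gradient(dn_dx, x[1] - x[0], n_true[0])
```

For this linear profile the recovered field matches the true one, and the density at the boundary comes out near 1.2 kg/m³, consistent with ambient air.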

  6. Analysis of Synthetic Polymers.

    ERIC Educational Resources Information Center

    Smith, Charles G.; And Others

    1989-01-01

    Reviews techniques for the characterization and analysis of synthetic polymers, copolymers, and blends. Includes techniques for structure determination, separation, and quantitation of additives and residual monomers; determination of molecular weight; and the study of thermal properties including degradation mechanisms. (MVL)

  7. Quantitative proteomics in the field of microbiology.

    PubMed

    Otto, Andreas; Becher, Dörte; Schmidt, Frank

    2014-03-01

    Quantitative proteomics has become an indispensable analytical tool for microbial research. Modern microbial proteomics covers a wide range of topics in basic and applied research from in vitro characterization of single organisms to unravel the physiological implications of stress/starvation to description of the proteome content of a cell at a given time. With the techniques available, ranging from classical gel-based procedures to modern MS-based quantitative techniques, including metabolic and chemical labeling, as well as label-free techniques, quantitative proteomics is today highly successful in sophisticated settings of high complexity such as host-pathogen interactions, mixed microbial communities, and microbial metaproteomics. In this review, we will focus on the vast range of techniques practically applied in current research with an introduction of the workflows used for quantitative comparisons, a description of the advantages/disadvantages of the various methods, reference to hallmark publications and presentation of applications in current microbial research. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Nondestructive Evaluation for Aerospace Composites

    NASA Technical Reports Server (NTRS)

    Leckey, Cara; Cramer, Elliott; Perey, Daniel

    2015-01-01

    Nondestructive evaluation (NDE) techniques are important for enabling NASA's missions in space exploration and aeronautics. The expanded and continued use of composite materials for aerospace components and vehicles leads to a need for advanced NDE techniques capable of quantitatively characterizing damage in composites. Quantitative damage detection techniques help to ensure safety, reliability and durability of space and aeronautic vehicles. This presentation will give a broad outline of NASA's range of technical work and an overview of the NDE research performed in the Nondestructive Evaluation Sciences Branch at NASA Langley Research Center. The presentation will focus on ongoing research in the development of NDE techniques for composite materials and structures, including development of automated data processing tools to turn NDE data into quantitative location and sizing results. Composites focused NDE research in the areas of ultrasonics, thermography, X-ray computed tomography, and NDE modeling will be discussed.

  9. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  10. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Descriptive and analytical techniques for NASA trend analysis applications are presented in this standard. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. This document should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend analysis is neither a precise term nor a circumscribed methodology: it generally connotes quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this document. The basic ideas needed for qualitative and quantitative assessment of trends along with relevant examples are presented.
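The model fitting this standard calls for (linear, quadratic, and exponential models of time-series data) can be sketched as follows. The log-linear route to the exponential fit is an assumed convenience, not a method prescribed by the standard.

```python
import numpy as np

def fit_trends(t, y):
    """Fit the trend models named in the standard: linear,
    quadratic, and exponential. The exponential fit uses a
    log-linear least-squares fit, valid when all y > 0."""
    linear = np.polyfit(t, y, 1)        # y ~ b*t + c
    quadratic = np.polyfit(t, y, 2)     # y ~ a*t^2 + b*t + c
    k, log_amp = np.polyfit(t, np.log(y), 1)
    exponential = (np.exp(log_amp), k)  # y ~ A * exp(k*t)
    return {"linear": linear, "quadratic": quadratic,
            "exponential": exponential}

# synthetic time series with a known exponential trend
t = np.arange(10, dtype=float)
y = 2.0 * np.exp(0.3 * t)
A, k = fit_trends(t, y)["exponential"]
```

On this synthetic series the fit recovers the generating parameters (A ≈ 2.0, k ≈ 0.3); on real trend data one would also inspect residuals before choosing among the three models.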

  11. [Progress of study on the detection technique of microRNA].

    PubMed

    Zhao, Hai-Feng; Yang, Ren-Chi

    2009-12-01

    MicroRNAs (miRNAs) are small noncoding RNA molecules that negatively regulate gene expression via degradation or translational repression of their targeted mRNAs. MiRNAs are involved in critical biologic processes, including development, cell differentiation, proliferation and the pathogenesis of disease. This review focuses on recent research on the detection techniques of miRNA, including the microarray technique, Northern blot, real-time quantitative PCR, detection techniques for miRNA function, and so on.

  12. Advanced hyphenated chromatographic-mass spectrometry in mycotoxin determination: current status and prospects.

    PubMed

    Li, Peiwu; Zhang, Zhaowei; Hu, Xiaofeng; Zhang, Qi

    2013-01-01

    Mass spectrometric techniques are essential for advanced research in food safety and environmental monitoring. These fields are important for securing the health of humans and animals, and for ensuring environmental security. Mycotoxins, toxic secondary metabolites of filamentous fungi, are major contaminants of agricultural products, food and feed, biological samples, and the environment as a whole. Mycotoxins can cause cancers, nephritic and hepatic diseases, various hemorrhagic syndromes, and immune and neurological disorders. Mycotoxin-contaminated food and feed can provoke trade conflicts, resulting in massive economic losses. Risk assessment of mycotoxin contamination for humans and animals generally depends on clear identification and reliable quantitation in diversified matrices. Pioneering work on mycotoxin quantitation using mass spectrometry (MS) was performed in the early 1970s. Now, unambiguous confirmation and quantitation of mycotoxins can be readily achieved with a variety of hyphenated techniques that combine chromatographic separation with MS, including liquid chromatography (LC) or gas chromatography (GC). With the advent of atmospheric pressure ionization, LC-MS has become a routine technique. Recently, the co-occurrence of multiple mycotoxins in the same sample has drawn an increasing amount of attention. Thus, modern analyses must be able to detect and quantitate multiple mycotoxins in a single run. Improvements in tandem MS techniques have been made to achieve this purpose. This review describes the advanced research that has been done regarding mycotoxin determination using hyphenated chromatographic-MS techniques, but is not a full-circle survey of all the literature published on this topic. The present work provides an overview of the various hyphenated chromatographic-MS-based strategies that have been applied to mycotoxin analysis, with a focus on recent developments. The use of chromatographic-MS to measure levels of mycotoxins, including aflatoxins, ochratoxins, patulin, trichothecenes, zearalenone, and fumonisins, is discussed in detail. Both free and masked mycotoxins are included in this review due to different methods of sample preparation. Techniques are described in terms of sample preparation, internal standards, LC/ultra-performance LC (UPLC) optimization, and application surveys. Several future hyphenated MS techniques are discussed as well, including multidimensional chromatography-MS, capillary electrophoresis-MS, and surface plasmon resonance array-MS. © 2013 Wiley Periodicals, Inc.

  13. Subsurface imaging and cell refractometry using quantitative phase/ shear-force feedback microscopy

    NASA Astrophysics Data System (ADS)

    Edward, Kert; Farahi, Faramarz

    2009-10-01

    Over the last few years, several novel quantitative phase imaging techniques have been developed for the study of biological cells. However, many of these techniques are encumbered by inherent limitations, including 2π phase ambiguities and diffraction-limited spatial resolution. In addition, subsurface information in the phase data is not exploited. We hereby present a novel quantitative phase imaging system without 2π ambiguities, which also allows for subsurface imaging and cell refractometry studies. This is accomplished by utilizing simultaneously obtained shear-force topography information. We will demonstrate how the quantitative phase and topography data can be used for subsurface and cell refractometry analysis and will present results for a fabricated structure and a malaria-infected red blood cell.

  14. Recommendations for Quantitative Analysis of Small Molecules by Matrix-assisted laser desorption ionization mass spectrometry

    PubMed Central

    Wang, Poguang; Giese, Roger W.

    2017-01-01

    Matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) has been used for quantitative analysis of small molecules for many years. It is usually preceded by an LC separation step when complex samples are tested. With the development several years ago of "modern MALDI" (automation, high repetition laser, high resolution peaks), the ease of use and performance of MALDI as a quantitative technique greatly increased. This review focuses on practical aspects of modern MALDI for quantitation of small molecules conducted in an ordinary way (no special reagents, devices or techniques for the spotting step of MALDI), and includes our ordinary, preferred methods. The review is organized as 18 recommendations with accompanying explanations, criticisms and exceptions. PMID:28118972

  15. Reviewing effectiveness of ankle assessment techniques for use in robot-assisted therapy.

    PubMed

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Shane

    2014-01-01

    This article provides a comprehensive review of studies that investigated ankle assessment techniques to better understand those that can be used in the real-time monitoring of rehabilitation progress for implementation in conjunction with robot-assisted therapy. Seventy-six publications published between January 1980 and August 2013 were selected from eight databases. They were divided into two main categories (16 qualitative and 60 quantitative studies): 13 goniometer studies, 18 dynamometer studies, and 29 studies about innovative techniques. A total of 465 subjects participated in the 29 quantitative studies of innovative measurement techniques that may potentially be integrated in a real-time monitoring device, of which 19 studies included less than 10 participants. Results show that qualitative ankle assessment methods are not suitable for real-time monitoring in robot-assisted therapy, though they are reliable for certain patients, while the quantitative methods show great potential. The majority of quantitative techniques are reliable in measuring ankle kinematics and kinetics but are usually available only for use in the sagittal plane. Limited studies determine kinematics and kinetics in all three planes (sagittal, transverse, and frontal) where motions of the ankle joint and the subtalar joint actually occur.

  16. NASA standard: Trend analysis techniques

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This Standard presents descriptive and analytical techniques for NASA trend analysis applications. Trend analysis is applicable in all organizational elements of NASA connected with, or supporting, developmental/operational programs. Use of this Standard is not mandatory; however, it should be consulted for any data analysis activity requiring the identification or interpretation of trends. Trend Analysis is neither a precise term nor a circumscribed methodology, but rather connotes, generally, quantitative analysis of time-series data. For NASA activities, the appropriate and applicable techniques include descriptive and graphical statistics, and the fitting or modeling of data by linear, quadratic, and exponential models. Usually, but not always, the data is time-series in nature. Concepts such as autocorrelation and techniques such as Box-Jenkins time-series analysis would only rarely apply and are not included in this Standard. The document presents the basic ideas needed for qualitative and quantitative assessment of trends, together with relevant examples. A list of references provides additional sources of information.

  17. Quantitative techniques for musculoskeletal MRI at 7 Tesla.

    PubMed

    Bangerter, Neal K; Taylor, Meredith D; Tarbox, Grayson J; Palmer, Antony J; Park, Daniel J

    2016-12-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems.

  18. Reinventing the Ames test as a quantitative lab that connects classical and molecular genetics.

    PubMed

    Goodson-Gregg, Nathan; De Stasio, Elizabeth A

    2009-01-01

    While many institutions use a version of the Ames test in the undergraduate genetics laboratory, students typically are not exposed to techniques or procedures beyond qualitative analysis of phenotypic reversion, thereby seriously limiting the scope of learning. We have extended the Ames test to include both quantitative analysis of reversion frequency and molecular analysis of revertant gene sequences. By giving students a role in designing their quantitative methods and analyses, students practice and apply quantitative skills. To help students connect classical and molecular genetic concepts and techniques, we report here procedures for characterizing the molecular lesions that confer a revertant phenotype. We suggest undertaking reversion of both missense and frameshift mutants to allow a more sophisticated molecular genetic analysis. These modifications and additions broaden the educational content of the traditional Ames test teaching laboratory, while simultaneously enhancing students' skills in experimental design, quantitative analysis, and data interpretation.

  19. Using Facebook as a LMS?

    ERIC Educational Resources Information Center

    Arabacioglu, Taner; Akar-Vural, Ruken

    2014-01-01

    The main purpose of this research was to compare the communication media according to effective teaching. For this purpose, in the research, the mixed method, including quantitative and qualitative data collecting techniques, was applied. For the quantitative part of the research, the static group comparison design was implemented as one of the…

  20. Examining the Teachers' Emotional Labor Behavior

    ERIC Educational Resources Information Center

    Tösten, Rasim; Sahin, Çigdem Çelik

    2017-01-01

    The aim of this research is to investigate the teachers' emotional labour behaviours and to determine the reasons of the differences. In the research, mixed research methods including both quantitative and qualitative techniques were used. The population of the study was comprised of 280 teachers (266 for quantitative, 14 for qualitative…

  1. Diagnostic accuracy of semi-quantitative and quantitative culture techniques for the diagnosis of catheter-related infections in newborns and molecular typing of isolated microorganisms.

    PubMed

    Riboli, Danilo Flávio Moraes; Lyra, João César; Silva, Eliane Pessoa; Valadão, Luisa Leite; Bentlin, Maria Regina; Corrente, José Eduardo; Rugolo, Ligia Maria Suppo de Souza; da Cunha, Maria de Lourdes Ribeiro de Souza

    2014-05-22

    Catheter-related bloodstream infections (CR-BSIs) have become the most common cause of healthcare-associated bloodstream infections in neonatal intensive care units (ICUs). Microbiological evidence implicating catheters as the source of bloodstream infection is necessary to establish the diagnosis of CR-BSIs. Semi-quantitative culture is used to determine the presence of microorganisms on the external catheter surface, whereas quantitative culture also isolates microorganisms present inside the catheter. The main objective of this study was to determine the sensitivity and specificity of these two techniques for the diagnosis of CR-BSIs in newborns from a neonatal ICU. In addition, PFGE was used for similarity analysis of the microorganisms isolated from catheters and blood cultures. Semi-quantitative and quantitative methods were used for the culture of catheter tips obtained from newborns. Strains isolated from catheter tips and blood cultures which exhibited the same antimicrobial susceptibility profile were included in the study as positive cases of CR-BSI. PFGE of the microorganisms isolated from catheters and blood cultures was performed for similarity analysis and detection of clones in the ICU. A total of 584 catheter tips from 399 patients seen between November 2005 and June 2012 were analyzed. Twenty-nine cases of CR-BSI were confirmed. Coagulase-negative staphylococci (CoNS) were the most frequently isolated microorganisms, including S. epidermidis as the most prevalent species (65.5%), followed by S. haemolyticus (10.3%), yeasts (10.3%), K. pneumoniae (6.9%), S. aureus (3.4%), and E. coli (3.4%). The sensitivity of the semi-quantitative and quantitative techniques was 72.7% and 59.3%, respectively, and specificity was 95.7% and 94.4%. The diagnosis of CR-BSIs based on PFGE analysis of similarity between strains isolated from catheter tips and blood cultures showed 82.6% sensitivity and 100% specificity. The semi-quantitative culture method showed higher sensitivity and specificity for the diagnosis of CR-BSIs in newborns when compared to the quantitative technique. In addition, this method is easier to perform and shows better agreement with the gold standard, and should therefore be recommended for routine clinical laboratory use. PFGE may contribute to the control of CR-BSIs by identifying clusters of microorganisms in neonatal ICUs, providing a means of determining potential cross-infection between patients.
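The sensitivity and specificity figures reported in this abstract follow the standard diagnostic-accuracy definitions, which can be sketched as follows. The counts used here are hypothetical, chosen only to illustrate the arithmetic, and are not the study's raw data.

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Standard diagnostic-accuracy definitions:
    sensitivity = TP / (TP + FN), the fraction of true cases detected;
    specificity = TN / (TN + FP), the fraction of non-cases ruled out."""
    return tp / (tp + fn), tn / (tn + fp)

# hypothetical counts for illustration only (not the study's raw data)
sens, spec = sensitivity_specificity(tp=16, fp=24, fn=6, tn=530)
# sens ≈ 0.727, spec ≈ 0.957
```

Confidence intervals (e.g., Wilson intervals on these proportions) would normally accompany such estimates, especially with only 29 confirmed cases.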

  2. Critically appraising qualitative research: a guide for clinicians more familiar with quantitative techniques.

    PubMed

    Kisely, Stephen; Kendall, Elizabeth

    2011-08-01

    Papers using qualitative methods are increasingly common in psychiatric journals. This overview is an introduction to critically appraising a qualitative paper for clinicians who are more familiar with quantitative methods. Qualitative research uses data from interviews (semi-structured or unstructured), focus groups, observations or written materials. Data analysis is inductive, allowing meaning to emerge from the data, rather than the more deductive, hypothesis-centred approach of quantitative research. This overview compares and contrasts quantitative and qualitative research methods. Quantitative concepts such as reliability, validity, statistical power, bias and generalisability have qualitative equivalents. These include triangulation, trustworthiness, saturation, reflexivity and applicability. Reflexivity also shares features of transference. Qualitative approaches include: ethnography, action-assessment, grounded theory, case studies and mixed methods. Qualitative research can complement quantitative approaches. An understanding of both is useful in critically appraising the psychiatric literature.

  3. Analysis of Gold Ores by Fire Assay

    ERIC Educational Resources Information Center

    Blyth, Kristy M.; Phillips, David N.; van Bronswijk, Wilhelm

    2004-01-01

    Students in an Applied Chemistry degree course carried out a fire-assay exercise. The analysis showed fire assay to be a worthwhile quantitative analytical technique and covered interesting theory, including acid-base and redox chemistry and concepts such as inquarting and cupelling.

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management, and analytical techniques taught to undergraduate and graduate students in U.S. college and university systems engineering and computing science degree programs (e.g., software engineering, computer science, and information technology) that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques, and process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The research study identifies and discusses in detail the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
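A Monte Carlo baseline of the kind this abstract describes can be sketched as follows. The driver names, weights, and distributions below are invented for illustration; they are not the ACSI model's actual structure or the dissertation's simulation.

```python
import numpy as np

rng = np.random.default_rng(42)

# hypothetical driver weights and (mean, std) on a 0-100 scale;
# illustrative values only, not taken from the ACSI model
WEIGHTS = {"quality": 0.5, "expectations": 0.2, "value": 0.3}
DRIVERS = {
    "quality": (82.0, 4.0),
    "expectations": (75.0, 5.0),
    "value": (78.0, 6.0),
}

def simulate_index(n_trials=10_000):
    """Monte Carlo baseline for a satisfaction index: draw each
    driver from a normal distribution and take the weighted sum
    per trial, yielding a distribution of predicted scores."""
    scores = np.zeros(n_trials)
    for name, w in WEIGHTS.items():
        mu, sigma = DRIVERS[name]
        scores += w * rng.normal(mu, sigma, n_trials)
    return scores

scores = simulate_index()
baseline = scores.mean()          # expected index score
spread = scores.std()             # variability for sensitivity analysis
```

With these inputs the simulated baseline sits near the weighted mean of the drivers (about 79.4); percentiles of `scores` would feed a dashboard's prediction bands.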

  5. MR morphology of triangular fibrocartilage complex: correlation with quantitative MR and biomechanical properties.

    PubMed

    Bae, Won C; Ruangchaijatuporn, Thumanoon; Chang, Eric Y; Biswas, Reni; Du, Jiang; Statum, Sheronda; Chung, Christine B

    2016-04-01

    To evaluate pathology of the triangular fibrocartilage complex (TFCC) using high-resolution morphologic magnetic resonance (MR) imaging, and compare with quantitative MR and biomechanical properties. Five cadaveric wrists (22-70 years) were imaged at 3 T using morphologic (proton density weighted spin echo, PD FS, and 3D spoiled gradient echo, 3D SPGR) and quantitative MR sequences to determine T2 and T1rho properties. In eight geographic regions, morphology of TFC disc and laminae were evaluated for pathology and quantitative MR values. Samples were disarticulated and biomechanical indentation testing was performed on the distal surface of the TFC disc. On morphologic PD SE images, TFC disc pathology included degeneration and tears, while that of the laminae included degeneration, degeneration with superimposed tear, mucinous transformation, and globular calcification. Punctate calcifications were highly visible on 3D SPGR images and found only in pathologic regions. Disc pathology occurred more frequently in proximal regions of the disc than distal regions. Quantitative MR values were lowest in normal samples, and generally higher in pathologic regions. Biomechanical testing demonstrated an inverse relationship, with indentation modulus being high in normal regions with low MR values. The laminae studied were mostly pathologic, and additional normal samples are needed to discern quantitative changes. These results show technical feasibility of morphologic MR, quantitative MR, and biomechanical techniques to characterize pathology of the TFCC. Quantitative MRI may be a suitable surrogate marker of soft tissue mechanical properties, and a useful adjunct to conventional morphologic MR techniques.

  6. Review of quantitative ultrasound: envelope statistics and backscatter coefficient imaging and contributions to diagnostic ultrasound

    PubMed Central

    Oelze, Michael L.; Mamou, Jonathan

    2017-01-01

    Conventional medical imaging technologies, including ultrasound, have continued to improve over the years. For example, in oncology, medical imaging is characterized by high sensitivity, i.e., the ability to detect anomalous tissue features, but the ability to classify these tissue features from images often lacks specificity. As a result, a large number of biopsies of tissues with suspicious image findings are performed each year with a vast majority of these biopsies resulting in a negative finding. To improve specificity of cancer imaging, quantitative imaging techniques can play an important role. Conventional ultrasound B-mode imaging is mainly qualitative in nature. However, quantitative ultrasound (QUS) imaging can provide specific numbers related to tissue features that can increase the specificity of image findings leading to improvements in diagnostic ultrasound. QUS imaging techniques can encompass a wide variety of techniques including spectral-based parameterization, elastography, shear wave imaging, flow estimation and envelope statistics. Currently, spectral-based parameterization and envelope statistics are not available on most conventional clinical ultrasound machines. However, in recent years QUS techniques involving spectral-based parameterization and envelope statistics have demonstrated success in many applications, providing additional diagnostic capabilities. Spectral-based techniques include the estimation of the backscatter coefficient, estimation of attenuation, and estimation of scatterer properties such as the correlation length associated with an effective scatterer diameter and the effective acoustic concentration of scatterers. Envelope statistics include the estimation of the number density of scatterers and quantification of coherent to incoherent signals produced from the tissue. Challenges for clinical application include correctly accounting for attenuation effects and transmission losses and implementation of QUS on clinical devices. Successful clinical and pre-clinical applications demonstrating the ability of QUS to improve medical diagnostics include characterization of the myocardium during the cardiac cycle, cancer detection, classification of solid tumors and lymph nodes, detection and quantification of fatty liver disease, and monitoring and assessment of therapy. PMID:26761606

  7. The Role of Hemispheral Asymmetry and Regional Activity of Quantitative EEG in Children with Stuttering

    ERIC Educational Resources Information Center

    Ozge, Aynur; Toros, Fevziye; Comelekoglu, Ulku

    2004-01-01

    We investigated the role of delayed cerebral maturation, hemisphere asymmetry and regional differences in children with stuttering and healthy controls during resting state and hyperventilation, using conventional EEG techniques and quantitative EEG (QEEG) analysis. This cross-sectional case control study included 26 children with stuttering and…

  8. Identification of downy mildew resistance gene candidates by positional cloning in maize (Zea mays subsp. mays; Poaceae)1

    PubMed Central

    Kim, Jae Yoon; Moon, Jun-Cheol; Kim, Hyo Chul; Shin, Seungho; Song, Kitae; Kim, Kyung-Hee; Lee, Byung-Moo

    2017-01-01

    Premise of the study: Positional cloning in combination with phenotyping is a general approach to identify disease-resistance gene candidates in plants; however, it requires several time-consuming steps including population or fine mapping. Therefore, in the present study, we suggest a new combined strategy to improve the identification of disease-resistance gene candidates. Methods and Results: Downy mildew (DM)–resistant maize was selected from five cultivars using a spreader row technique. Positional cloning and bioinformatics tools were used to identify the DM-resistance quantitative trait locus marker (bnlg1702) and 47 protein-coding gene annotations. Eventually, five DM-resistance gene candidates, including bZIP34, Bak1, and Ppr, were identified by quantitative reverse-transcription PCR (RT-PCR) without fine mapping of the bnlg1702 locus. Conclusions: The combined protocol with the spreader row technique, quantitative trait locus positional cloning, and quantitative RT-PCR was effective for identifying DM-resistance candidate genes. This cloning approach may be applied to other whole-genome-sequenced crops or resistance to other diseases. PMID:28224059

  9. MO-E-12A-01: Quantitative Imaging: Techniques, Applications, and Challenges

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, E; Jeraj, R; McNitt-Gray, M

    The first symposium in the Quantitative Imaging Track focused on the introduction of quantitative imaging (QI) by illustrating the potential of QI in diagnostic and therapeutic applications in research and patient care, highlighting key challenges in implementation of such QI applications, and reviewing QI efforts of selected national and international agencies and organizations, including the FDA, NCI, NIST, and RSNA. This second QI symposium will focus more specifically on the techniques, applications, and challenges of QI. The first talk of the session will focus on modality-agnostic challenges of QI, beginning with challenges of the development and implementation of QI applications in single-center, single-vendor settings and progressing to the challenges encountered in the most general setting of multi-center, multi-vendor settings. The subsequent three talks will focus on specific QI challenges and opportunities in the modality-specific settings of CT, PET/CT, and MR. Each talk will provide information on modality-specific QI techniques, applications, and challenges, including current efforts focused on solutions to such challenges. Learning Objectives: Understand key general challenges of QI application development and implementation, regardless of modality. Understand selected QI techniques and applications in CT, PET/CT, and MR. Understand challenges, and potential solutions for such challenges, for the applications presented for each modality.

  10. Surface temperature/heat transfer measurement using a quantitative phosphor thermography system

    NASA Technical Reports Server (NTRS)

    Buck, G. M.

    1991-01-01

    A relative-intensity phosphor thermography technique developed for surface heating studies in hypersonic wind tunnels is described. A direct relationship between relative emission intensity and phosphor temperature is used for quantitative surface temperature measurements in time. The technique provides global surface temperature-time histories using a 3-CCD (Charge Coupled Device) video camera and digital recording system. A current history of technique development at Langley is discussed. Latest developments include a phosphor mixture for a greater range of temperature sensitivity and use of castable ceramics for inexpensive test models. A method of calculating surface heat-transfer from thermal image data in blowdown wind tunnels is included in an appendix, with an analysis of material thermal heat-transfer properties. Results from tests in the Langley 31-Inch Mach 10 Tunnel are presented for a ceramic orbiter configuration and a four-inch diameter hemisphere model. Data include windward heating for bow-shock/wing-shock interactions on the orbiter wing surface, and a comparison with prediction for hemisphere heating distribution.
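
The appendix's surface heat-transfer calculation from temperature-time data can be illustrated with a standard one-dimensional semi-infinite conduction discretization (the Cook-Felderman scheme). This is a sketch under our own assumptions — the report's actual method, material properties, and variable names may differ:

```python
import numpy as np

def cook_felderman_heat_flux(t, T_s, rho_c_k):
    """Surface heat flux q(t) from a surface temperature-time history,
    assuming 1D semi-infinite conduction with constant properties
    (Cook-Felderman discretization). rho_c_k is the product rho*c*k."""
    q = np.zeros_like(t, dtype=float)
    coef = 2.0 * np.sqrt(rho_c_k / np.pi)
    for n in range(1, len(t)):
        s = 0.0
        for i in range(1, n + 1):
            # piecewise-linear temperature between samples i-1 and i
            s += (T_s[i] - T_s[i - 1]) / (
                np.sqrt(t[n] - t[i]) + np.sqrt(t[n] - t[i - 1]))
        q[n] = coef * s
    return q
```

For a constant flux q0 into a semi-infinite solid, the surface temperature grows as 2*q0*sqrt(t/(pi*rho*c*k)), so feeding that history back in should recover q0.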

  11. Quantitative techniques for musculoskeletal MRI at 7 Tesla

    PubMed Central

    Taylor, Meredith D.; Tarbox, Grayson J.; Palmer, Antony J.; Park, Daniel J.

    2016-01-01

    Whole-body 7 Tesla MRI scanners have been approved solely for research since they appeared on the market over 10 years ago, but may soon be approved for selected clinical neurological and musculoskeletal applications in both the EU and the United States. There has been considerable research work on musculoskeletal applications at 7 Tesla over the past decade, including techniques for ultra-high resolution morphological imaging, 3D T2 and T2* mapping, ultra-short TE applications, diffusion tensor imaging of cartilage, and several techniques for assessing proteoglycan content in cartilage. Most of this work has been done in the knee or other extremities, due to technical difficulties associated with scanning areas such as the hip and torso at 7 Tesla. In this manuscript, we first provide some technical context for 7 Tesla imaging, including challenges and potential advantages. We then review the major quantitative MRI techniques being applied to musculoskeletal applications on 7 Tesla whole-body systems. PMID:28090448

  12. Identification and quantitation of semi-crystalline microplastics using image analysis and differential scanning calorimetry.

    PubMed

    Rodríguez Chialanza, Mauricio; Sierra, Ignacio; Pérez Parada, Andrés; Fornaro, Laura

    2018-06-01

    There are several techniques used to analyze microplastics, often based on a combination of visual and spectroscopic methods. Here we introduce an alternative workflow for identification and mass quantitation through a combination of optical microscopy with image analysis (IA) and differential scanning calorimetry (DSC). We studied four synthetic polymers of environmental concern: low- and high-density polyethylene (LDPE and HDPE, respectively), polypropylene (PP), and polyethylene terephthalate (PET). Selected experiments were conducted to investigate (i) particle characterization and counting procedures based on image analysis with open-source software, (ii) chemical identification of microplastics based on DSC signal processing, (iii) the dependence of the DSC signal on particle size, and (iv) quantitation of microplastic mass based on the DSC signal. We describe the potential and limitations of these techniques for increasing the reliability of microplastic analysis. Particle size had a marked effect on the qualitative and quantitative performance of the DSC signals: both identification (based on characteristic onset temperature) and mass quantitation (based on heat flow) were affected by particle size. As a result, a sample treatment that includes sieving of suspended particles is particularly required for this analytical approach.
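
The heat-flow-based mass quantitation can be sketched as follows: integrate the melting endotherm above a baseline to get the peak enthalpy, then divide by an effective specific enthalpy of fusion. This is an illustration only, not the authors' code; the enthalpy values and the assumption of a known crystallinity are placeholders:

```python
import numpy as np

# Placeholder specific enthalpies of fusion for 100% crystalline polymer
# (J/g); real values must come from calibration with standards.
ENTHALPY_FUSION_J_PER_G = {"HDPE": 293.0, "PP": 207.0, "PET": 140.0}

def dsc_peak_mass_mg(time_s, heat_flow_mw, baseline_mw, polymer, crystallinity):
    """Integrate heat flow above a baseline (trapezoid rule) to get the
    melting-peak enthalpy in mJ (mW * s), then divide by the effective
    specific enthalpy of fusion (J/g == mJ/mg) to estimate mass in mg."""
    excess = heat_flow_mw - baseline_mw  # endotherm plotted positive here
    peak_mj = float(np.sum(0.5 * (excess[1:] + excess[:-1]) * np.diff(time_s)))
    effective = ENTHALPY_FUSION_J_PER_G[polymer] * crystallinity
    return peak_mj / effective
```

The particle-size dependence reported in the abstract would show up here as a bias in both the onset temperature and the integrated peak area.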

  13. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  14. NIR technique in the classification of cotton leaf grade

    USDA-ARS?s Scientific Manuscript database

    Near infrared (NIR) spectroscopy, a useful technique due to the speed, ease of use, and adaptability to on-line or off-line implementation, has been applied to perform the qualitative classification and quantitative prediction of cotton quality characteristics, including trash index. One term to as...

  15. Quantitative nanoparticle tracking: applications to nanomedicine.

    PubMed

    Huang, Feiran; Dempsey, Christopher; Chona, Daniela; Suh, Junghae

    2011-06-01

    Particle tracking is an invaluable technique to extract quantitative and qualitative information regarding the transport of nanomaterials through complex biological environments. This technique can be used to probe the dynamic behavior of nanoparticles as they interact with and navigate through intra- and extra-cellular barriers. In this article, we focus on the recent developments in the application of particle-tracking technology to nanomedicine, including the study of synthetic and virus-based materials designed for gene and drug delivery. Specifically, we cover research where mean square displacements of nanomaterial transport were explicitly determined in order to quantitatively assess the transport of nanoparticles through biological environments. Particle-tracking experiments can provide important insights that may help guide the design of more intelligent and effective diagnostic and therapeutic nanoparticles.
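
The mean-square-displacement analysis mentioned above can be sketched as follows (an illustrative implementation, not the authors' code; function and variable names are ours):

```python
import numpy as np

def mean_square_displacement(xy, dt):
    """Time-averaged MSD of a single 2D trajectory.
    xy: (N, 2) array of positions; dt: frame interval.
    Returns (lag_times, msd), one value per lag from 1 to N-1."""
    n = len(xy)
    lags = np.arange(1, n)
    msd = np.array([
        np.mean(np.sum((xy[lag:] - xy[:-lag]) ** 2, axis=1))
        for lag in lags
    ])
    return lags * dt, msd
```

For free diffusion in 2D the MSD grows as 4*D*t, so a diffusion coefficient can be read off the slope; sub-linear growth indicates hindered transport through the biological environment.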

  16. Advanced NDE techniques for quantitative characterization of aircraft

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S.; Winfree, William P.

    1990-01-01

    Recent advances in nondestructive evaluation (NDE) at NASA Langley Research Center and their applications that have resulted in quantitative assessment of material properties based on thermal and ultrasonic measurements are reviewed. Specific applications include ultrasonic determination of bolt tension, ultrasonic and thermal characterization of bonded layered structures, characterization of composite materials, and disbonds in aircraft skins.

  17. Quantitative analysis on electrooculography (EOG) for neurodegenerative disease

    NASA Astrophysics Data System (ADS)

    Liu, Chang-Chia; Chaovalitwongse, W. Art; Pardalos, Panos M.; Seref, Onur; Xanthopoulos, Petros; Sackellares, J. C.; Skidmore, Frank M.

    2007-11-01

    Many studies have documented abnormal horizontal and vertical eye movements in human neurodegenerative disease, as well as during altered states of consciousness (including drowsiness and intoxication) in healthy adults. Eye movement measurement may therefore play an important role in tracking the progress of neurodegenerative diseases and the state of alertness in healthy individuals. Several techniques are available for measuring eye movement: infrared detection (IR), video-oculography (VOG), the scleral eye coil, and EOG. Among these, EOG is a major source for monitoring abnormal eye movement. In this real-time quantitative analysis study, methods that capture the characteristics of eye movement were proposed to accurately categorize the state of neurodegenerative subjects. EOG recordings were taken while five subjects watched a short (>120 s) animation clip. In response to the animated clip, the participants executed a number of eye movements, including vertical smooth pursuit (SVP), horizontal smooth pursuit (HVP), and random saccades (RS). Detection of abnormalities in ocular movement may improve our diagnosis and understanding of neurodegenerative disease and altered states of consciousness. A standard real-time quantitative analysis will improve detection and provide a better understanding of the pathology of these disorders.
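
A minimal velocity-threshold detector illustrates how such EOG events can be flagged in software. This is a sketch under our own assumptions — the threshold, sampling rate, and names are hypothetical, not the authors' method:

```python
import numpy as np

def detect_saccades(eog_uv, fs_hz, vel_thresh_uv_per_s=2000.0):
    """Return (start, end) sample-index pairs where the absolute EOG
    slope exceeds a velocity threshold (a crude saccade detector)."""
    vel = np.gradient(eog_uv) * fs_hz  # approximate velocity in uV/s
    above = np.abs(vel) > vel_thresh_uv_per_s
    # boundaries of contiguous supra-threshold runs
    edges = list(np.flatnonzero(np.diff(above.astype(int))) + 1)
    if above[0]:
        edges.insert(0, 0)
    if above[-1]:
        edges.append(len(above))
    return [(edges[i], edges[i + 1]) for i in range(0, len(edges), 2)]
```

Smooth pursuit, by contrast, produces sustained sub-threshold velocities, so the same slope signal can separate the two movement classes.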

  18. Quantitative Modeling of Earth Surface Processes

    NASA Astrophysics Data System (ADS)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.
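
Hillslope evolution, one of the process domains the book covers, is classically modeled with the linear diffusion equation dz/dt = D d2z/dx2. A minimal explicit finite-difference step (our own illustration, not code from the book) looks like:

```python
import numpy as np

def hillslope_diffusion_step(z, dx, diffusivity, dt):
    """Advance elevation z one explicit time step of dz/dt = D d2z/dx2
    with fixed-elevation boundaries.
    Stable only for dt <= dx**2 / (2 * diffusivity)."""
    z_new = z.copy()
    z_new[1:-1] = z[1:-1] + diffusivity * dt / dx**2 * (
        z[2:] - 2.0 * z[1:-1] + z[:-2])
    return z_new
```

Repeated steps smooth sharp topography, which is the expected behavior of soil creep on hillslopes.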

  20. Analytical aids in land management planning

    Treesearch

    David R. Betters

    1978-01-01

    Quantitative techniques may be applied to aid in completing various phases of land management planning. Analytical procedures which have been used include a procedure for public involvement, PUBLIC; a matrix information generator, MAGE5; an allocation procedure, linear programming (LP); and an input-output economic analysis (EA). These techniques have proven useful in...

  21. Exploring NIR technique in rapid prediction of cotton trash components

    USDA-ARS?s Scientific Manuscript database

    Near infrared (NIR) spectroscopy, a useful technique due to the speed, ease of use, and adaptability to on-line or off-line implementation, has been applied to perform the qualitative classification and quantitative prediction on a number of cotton quality indices, including cotton trash from HVI, S...

  2. Raman Spectrometry.

    ERIC Educational Resources Information Center

    Gardiner, Derek J.

    1980-01-01

    Reviews mainly quantitative analytical applications in the field of Raman spectrometry. Includes references to other reviews, new and analytically untested techniques, and novel sampling and instrument designs. Cites 184 references. (CS)

  3. Quantitative X-ray mapping, scatter diagrams and the generation of correction maps to obtain more information about your material

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2014-03-01

    Quantitative X-ray mapping with silicon drift detectors and multi-EDS detector systems has become an invaluable analysis technique and one of the most useful methods of X-ray microanalysis today. The time to perform an X-ray map has decreased considerably, and the larger detector areas and higher count rates make it possible to map minor and trace elements very accurately. Live X-ray imaging can now be performed, with a significant amount of data collected in a matter of minutes. A great deal of information can be obtained from X-ray maps, including elemental relationship or scatter diagram creation, elemental ratio mapping, chemical phase mapping (CPM), and quantitative X-ray maps. In obtaining quantitative X-ray maps, we are able to generate atomic number (Z), absorption (A), fluorescence (F), theoretical backscatter coefficient (η), and quantitative total maps from each pixel in the image. This allows us to generate an image corresponding to each factor (for each element present). These images allow users to predict and verify where they are likely to have problems in their images, and are especially helpful for examining possible interface artefacts. Post-processing techniques to improve the quantitation of X-ray map data and to achieve improved characterisation are covered in this paper.
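
The per-pixel quantitation step can be sketched as a first-order ZAF multiplication (an illustration under our own naming assumptions; production matrix corrections are iterative and considerably more involved):

```python
import numpy as np

def zaf_concentration_maps(k_ratios, zaf_factors):
    """Per-pixel quantitation: multiply each element's k-ratio map
    (specimen/standard intensity) by its combined Z*A*F correction map.
    Also returns a 'total' map; pixels whose total deviates strongly
    from 1.0 flag likely problem areas such as interface artefacts."""
    conc = {el: k_ratios[el] * zaf_factors[el] for el in k_ratios}
    conc["total"] = sum(conc[el] for el in k_ratios)
    return conc
```

Inspecting the per-factor maps (Z, A, F) separately, as the abstract describes, shows which correction dominates at each pixel.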

  4. Multispectral analysis of ocean dumped materials

    NASA Technical Reports Server (NTRS)

    Johnson, R. W.

    1977-01-01

    Remotely sensed data were collected in conjunction with sea-truth measurements in three experiments in the New York Bight. Pollution features of primary interest were ocean dumped materials, such as sewage sludge and acid waste. Sewage-sludge and acid-waste plumes, including plumes from sewage sludge dumped by the 'line-dump' and 'spot-dump' methods, were located, identified, and mapped. Previously developed quantitative analysis techniques for determining quantitative distributions of materials in sewage sludge dumps were evaluated, along with multispectral analysis techniques developed to identify ocean dumped materials. Results of these experiments and the associated data analysis investigations are presented and discussed.

  5. The determination of ethanol in blood and urine by mass fragmentography

    NASA Technical Reports Server (NTRS)

    Pereira, W. E.; Summons, R. E.; Rindfleisch, T. C.; Duffield, A. M.

    1974-01-01

    A mass fragmentographic technique for the rapid, specific and sensitive determination of ethanol in blood and urine is described. A Varian gas chromatograph coupled through an all-glass membrane separator to a Finnigan quadrupole mass spectrometer and interfaced to a computer system is used for ethanol determination in blood and urine samples. A procedure for plotting calibration curves for ethanol quantitation is also described. Quantitation is achieved by plotting the peak area ratios of undeuterated-to-deuterated ethanol fragment ions against the amount of ethanol added. Representative results obtained by this technique are included.
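
The calibration-curve step described above can be sketched numerically: fit a least-squares line to peak-area ratio versus amount of ethanol added, then invert it for unknown samples. This is a hedged illustration with synthetic values, not the study's actual workflow:

```python
import numpy as np

def fit_ratio_calibration(added_conc, area_ratio):
    """Fit area_ratio = slope * added_conc + intercept by least squares
    and return the coefficients plus an inverse function that converts a
    measured undeuterated/deuterated peak-area ratio to concentration."""
    slope, intercept = np.polyfit(added_conc, area_ratio, 1)
    def quantify(ratio):
        return (ratio - intercept) / slope
    return slope, intercept, quantify
```

Ratioing against the deuterated internal standard cancels run-to-run variations in injection volume and detector response, which is why the ratio, not the raw peak area, is regressed.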

  6. Clinical application of microsampling versus conventional sampling techniques in the quantitative bioanalysis of antibiotics: a systematic review.

    PubMed

    Guerra Valero, Yarmarly C; Wallis, Steven C; Lipman, Jeffrey; Stove, Christophe; Roberts, Jason A; Parker, Suzanne L

    2018-03-01

    Conventional sampling techniques for clinical pharmacokinetic studies often require the removal of large blood volumes from patients. This can result in a physiological or emotional burden, particularly for neonates or pediatric patients. Antibiotic pharmacokinetic studies are typically performed on healthy adults or general ward patients. These may not account for alterations to a patient's pathophysiology and can lead to suboptimal treatment. Microsampling offers an important opportunity for clinical pharmacokinetic studies in vulnerable patient populations, where smaller sample volumes can be collected. This systematic review provides a description of currently available microsampling techniques and an overview of studies reporting the quantitation and validation of antibiotics using microsampling. A comparison of microsampling to conventional sampling in clinical studies is included.

  8. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons, to select the most effective approaches for brain resections. Although standard microdissection is a validated technique to investigate the different white matter (WM) pathways and to verify the results of tractography, the possibility of interactive exploration of the specimens and reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique for supporting the 3D exploration and the quantitative analysis of cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle, were exposed using the Klingler technique. The photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen. All the acquisitions were coregistered into an open-source model. We analyzed 5 steps, including the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicle, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point cloud models was highly accurate. Multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during the postdissection analysis. These results open many new promising neuroscientific and educational perspectives and also optimize the quality of neurosurgical treatments.

  9. 1H MAS NMR (magic-angle spinning nuclear magnetic resonance) techniques for the quantitative determination of hydrogen types in solid catalysts and supports.

    PubMed

    Kennedy, Gordon J; Afeworki, Mobae; Calabro, David C; Chase, Clarence E; Smiley, Randolph J

    2004-06-01

    Distinct hydrogen species are present in important inorganic solids such as zeolites, silicoaluminophosphates (SAPOs), mesoporous materials, amorphous silicas, and aluminas. These H species include hydrogens associated with acidic sites such as Al(OH)Si, non-framework aluminum sites, silanols, and surface functionalities. Direct and quantitative methodology to identify, measure, and monitor these hydrogen species are key to monitoring catalyst activity, optimizing synthesis conditions, tracking post-synthesis structural modifications, and in the preparation of novel catalytic materials. Many workers have developed several techniques to address these issues, including 1H MAS NMR (magic-angle spinning nuclear magnetic resonance). 1H MAS NMR offers many potential advantages over other techniques, but care is needed in recognizing experimental limitations and developing sample handling and NMR methodology to obtain quantitatively reliable data. A simplified approach is described that permits vacuum dehydration of multiple samples simultaneously and directly in the MAS rotor without the need for epoxy, flame sealing, or extensive glovebox use. We have found that careful optimization of important NMR conditions, such as magnetic field homogeneity and magic angle setting are necessary to acquire quantitative, high-resolution spectra that accurately measure the concentrations of the different hydrogen species present. Details of this 1H MAS NMR methodology with representative applications to zeolites, SAPOs, M41S, and silicas as a function of synthesis conditions and post-synthesis treatments (i.e., steaming, thermal dehydroxylation, and functionalization) are presented.

  10. Melt-Flow Behaviours of Thermoplastic Materials under Fire Conditions: Recent Experimental Studies and Some Theoretical Approaches

    PubMed Central

    Joseph, Paul; Tretsiakova-McNally, Svetlana

    2015-01-01

    Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, techniques such as: thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index measurements (LOI), Underwriters Laboratory 94 (UL-94) tests, cone calorimetry, etc. However, none of the above mentioned techniques are capable of quantitatively deciphering the underpinning physiochemical processes leading to the melt flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of melt-flow behaviours of thermoplastics under heat/fire conditions. PMID:28793746

  11. Pulmonary nodule characterization, including computer analysis and quantitative features.

    PubMed

    Bartholmai, Brian J; Koo, Chi Wan; Johnson, Geoffrey B; White, Darin B; Raghunath, Sushravya M; Rajagopalan, Srinivasan; Moynagh, Michael R; Lindell, Rebecca M; Hartman, Thomas E

    2015-03-01

    Pulmonary nodules are commonly detected in computed tomography (CT) chest screening of a high-risk population. The specific visual or quantitative features on CT or other modalities can be used to characterize the likelihood that a nodule is benign or malignant. Visual features on CT such as size, attenuation, location, morphology, edge characteristics, and other distinctive "signs" can be highly suggestive of a specific diagnosis and, in general, be used to determine the probability that a specific nodule is benign or malignant. Change in size, attenuation, and morphology on serial follow-up CT, or features on other modalities such as nuclear medicine studies or MRI, can also contribute to the characterization of lung nodules. Imaging analytics can objectively and reproducibly quantify nodule features on CT, nuclear medicine, and magnetic resonance imaging. Some quantitative techniques show great promise in helping to differentiate benign from malignant lesions or to stratify the risk of aggressive versus indolent neoplasm. In this article, we (1) summarize the visual characteristics, descriptors, and signs that may be helpful in management of nodules identified on screening CT, (2) discuss current quantitative and multimodality techniques that aid in the differentiation of nodules, and (3) highlight the power, pitfalls, and limitations of these various techniques.
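
    One common quantitative feature derived from change in size on serial follow-up CT is the volume doubling time. A sketch under the usual exponential-growth assumption (the volumes and interval below are invented):

```python
import math

def volume_doubling_time(v1, v2, days):
    """Volume doubling time (days) from two serial volume measurements.

    VDT = t * ln(2) / ln(V2 / V1); a long VDT (e.g. several hundred
    days) is conventionally taken as favoring a benign or indolent
    process, a short VDT as favoring malignancy.
    """
    if v1 <= 0 or v2 <= 0 or days <= 0 or v2 == v1:
        raise ValueError("volumes and interval must be positive, with change")
    return days * math.log(2) / math.log(v2 / v1)

# A nodule growing from 100 mm^3 to 200 mm^3 over 90 days doubles in 90 days
vdt = volume_doubling_time(100.0, 200.0, 90.0)
```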

  12. Melt-Flow Behaviours of Thermoplastic Materials under Fire Conditions: Recent Experimental Studies and Some Theoretical Approaches.

    PubMed

    Joseph, Paul; Tretsiakova-McNally, Svetlana

    2015-12-15

    Polymeric materials often exhibit complex combustion behaviours encompassing several stages and involving the solid phase, gas phase and interphase. A wide range of qualitative, semi-quantitative and quantitative testing techniques are currently available, both at the laboratory scale and for commercial purposes, for evaluating the decomposition and combustion behaviours of polymeric materials. They include, but are not limited to, thermo-gravimetric analysis (TGA), oxygen bomb calorimetry, limiting oxygen index (LOI) measurements, Underwriters Laboratories 94 (UL-94) tests and cone calorimetry. However, none of the above-mentioned techniques can quantitatively decipher the underpinning physicochemical processes leading to the melt-flow behaviour of thermoplastics. Melt-flow of polymeric materials can constitute a serious secondary hazard in fire scenarios, for example, if they are present as component parts of a ceiling in an enclosure. In recent years, more quantitative attempts to measure the mass loss and melt-drip behaviour of some commercially important chain- and step-growth polymers have been accomplished. The present article focuses, primarily, on the experimental and some theoretical aspects of the melt-flow behaviours of thermoplastics under heat/fire conditions.

  13. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    PubMed

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative and quantitative methods to relate brain structure to neuropsychological outcome, and these are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is deciding which metric to use, for which disorder, and when image analysis methods should be applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered, including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  14. A Review of Imaging Techniques for Plant Phenotyping

    PubMed Central

    Li, Lei; Zhang, Qin; Huang, Danfeng

    2014-01-01

    Given the rapid development of plant genomic technologies, a lack of access to plant phenotyping capabilities limits our ability to dissect the genetics of quantitative traits. Effective, high-throughput phenotyping platforms have recently been developed to solve this problem. In high-throughput phenotyping platforms, a variety of imaging methodologies are being used to collect data for quantitative studies of complex traits related to the growth, yield and adaptation to biotic or abiotic stress (disease, insects, drought and salinity). These imaging techniques include visible imaging (machine vision), imaging spectroscopy (multispectral and hyperspectral remote sensing), thermal infrared imaging, fluorescence imaging, 3D imaging and tomographic imaging (MRT, PET and CT). This paper presents a brief review on these imaging techniques and their applications in plant phenotyping. The features used to apply these imaging techniques to plant phenotyping are described and discussed in this review. PMID:25347588

  15. A Quantitative Three-Dimensional Image Analysis Tool for Maximal Acquisition of Spatial Heterogeneity Data.

    PubMed

    Allenby, Mark C; Misener, Ruth; Panoskaltsis, Nicki; Mantalaris, Athanasios

    2017-02-01

    Three-dimensional (3D) imaging techniques provide spatial insight into environmental and cellular interactions and are implemented in various fields, including tissue engineering, but have been restricted by limited quantification tools that misrepresent or underutilize the cellular phenomena captured. This study develops image postprocessing algorithms pairing complex Euclidean metrics with Monte Carlo simulations to quantitatively assess cell and microenvironment spatial distributions while utilizing, for the first time, the entire 3D image captured. Although current methods only analyze a central fraction of presented confocal microscopy images, the proposed algorithms can utilize 210% more cells to calculate 3D spatial distributions that can span a 23-fold longer distance. These algorithms seek to leverage the high sample cost of 3D tissue imaging techniques by extracting maximal quantitative data throughout the captured image.
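
    The algorithms described pair Euclidean metrics with Monte Carlo simulations. A toy version of that comparison, with invented coordinates and a brute-force nearest-neighbor search, contrasts observed cell spacing with a complete-spatial-randomness baseline:

```python
import math
import random

def nearest_neighbor_distances(points):
    """Euclidean distance from each 3D point to its nearest neighbor
    (brute-force O(n^2) search; fine for a toy example)."""
    dists = []
    for i, p in enumerate(points):
        best = min(math.dist(p, q) for j, q in enumerate(points) if i != j)
        dists.append(best)
    return dists

def random_baseline(n, box, trials=200, seed=0):
    """Monte Carlo mean nearest-neighbor distance for n uniformly random
    points in a box (wx, wy, wz): the complete-spatial-randomness
    reference against which observed clustering is judged."""
    rng = random.Random(seed)
    means = []
    for _ in range(trials):
        pts = [tuple(rng.uniform(0, w) for w in box) for _ in range(n)]
        d = nearest_neighbor_distances(pts)
        means.append(sum(d) / len(d))
    return sum(means) / len(means)

# Invented cell coordinates: two tight pairs, i.e. a clustered pattern
cells = [(0, 0, 0), (1, 0, 0), (5, 5, 5), (5, 6, 5)]
observed = sum(nearest_neighbor_distances(cells)) / len(cells)
baseline = random_baseline(len(cells), (10.0, 10.0, 10.0))
```

Observed spacing well below the random baseline indicates clustering; well above it, dispersion.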

  16. High Resolution Qualitative and Quantitative MR Evaluation of the Glenoid Labrum

    PubMed Central

    Iwasaki, Kenyu; Tafur, Monica; Chang, Eric Y.; Statum, Sheronda; Biswas, Reni; Tran, Betty; Bae, Won C.; Du, Jiang; Bydder, Graeme M.; Chung, Christine B.

    2015-01-01

    Objective To implement qualitative and quantitative MR sequences for the evaluation of labral pathology. Methods Six glenoid labra were dissected and the anterior and posterior portions were divided into normal, mildly degenerated, or severely degenerated groups using gross and MR findings. Qualitative evaluation was performed using T1-weighted, proton density-weighted (PD), spoiled gradient echo (SPGR) and ultra-short echo time (UTE) sequences. Quantitative evaluation included T2 and T1rho measurements as well as T1, T2*, and T1rho measurements acquired with UTE techniques. Results SPGR and UTE sequences best demonstrated labral fiber structure. Degenerated labra had a tendency towards decreased T1 values, increased T2/T2* values and increased T1rho values. T2* values obtained with the UTE sequence allowed for delineation between normal, mildly degenerated and severely degenerated groups (p<0.001). Conclusion Quantitative T2* measurements acquired with the UTE technique are useful for distinguishing between normal, mildly degenerated and severely degenerated labra. PMID:26359581

  17. Automated quantitative micro-mineralogical characterization for environmental applications

    USGS Publications Warehouse

    Smith, Kathleen S.; Hoal, K.O.; Walton-Day, Katherine; Stammer, J.G.; Pietersen, K.

    2013-01-01

    Characterization of ore and waste-rock material using automated quantitative micro-mineralogical techniques (e.g., QEMSCAN® and MLA) has the potential to complement traditional acid-base accounting and humidity cell techniques when predicting acid generation and metal release. These characterization techniques, which most commonly are used for metallurgical, mineral-processing, and geometallurgical applications, can be broadly applied throughout the mine-life cycle to include numerous environmental applications. Critical insights into mineral liberation, mineral associations, particle size, particle texture, and mineralogical residence phase(s) of environmentally important elements can be used to anticipate potential environmental challenges. Resources spent on initial characterization result in lower uncertainties of potential environmental impacts and possible cost savings associated with remediation and closure. Examples illustrate mineralogical and textural characterization of fluvial tailings material from the upper Arkansas River in Colorado.

  18. A method for the extraction and quantitation of phycoerythrin from algae

    NASA Technical Reports Server (NTRS)

    Stewart, D. E.

    1982-01-01

    A new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is summarized. Results of the analysis of four extracts representing three PHE types from algae, including cryptomonad and cyanophyte types, are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship of concentration and fluorescence units is provided; it may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean).
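
    The paper's own quantitation equation is not reproduced here; as a hedged illustration, a linear fluorescence-to-concentration calibration of the kind such a graph encodes (the standards below are invented) can be fit and inverted like this:

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b to calibration standards."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def concentration_from_fluorescence(f, a, b):
    """Invert the calibration line to estimate pigment concentration."""
    return (f - b) / a

# Hypothetical standards: PHE concentration (ug/L) vs fluorescence units,
# lying exactly on f = 2*c + 0.5 for a clean worked example
conc = [0.0, 1.0, 2.0, 4.0]
fluor = [0.5, 2.5, 4.5, 8.5]
a, b = fit_line(conc, fluor)
unknown = concentration_from_fluorescence(6.5, a, b)
```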

  19. Metabolic Mapping: Quantitative Enzyme Cytochemistry and Histochemistry to Determine the Activity of Dehydrogenases in Cells and Tissues.

    PubMed

    Molenaar, Remco J; Khurshed, Mohammed; Hira, Vashendriya V V; Van Noorden, Cornelis J F

    2018-05-26

    Altered cellular metabolism is a hallmark of many diseases, including cancer, cardiovascular diseases and infection. The metabolic motor units of cells are enzymes and their activity is heavily regulated at many levels, including the transcriptional, mRNA stability, translational, post-translational and functional level. This complex regulation means that conventional quantitative or imaging assays, such as quantitative mRNA experiments, Western Blots and immunohistochemistry, yield incomplete information regarding the ultimate activity of enzymes, their function and/or their subcellular localization. Quantitative enzyme cytochemistry and histochemistry (i.e., metabolic mapping) show in-depth information on in situ enzymatic activity and its kinetics, function and subcellular localization in an almost true-to-nature situation. We describe a protocol to detect the activity of dehydrogenases, which are enzymes that perform redox reactions to reduce cofactors such as NAD(P)+ and FAD. Cells and tissue sections are incubated in a medium that is specific for the enzymatic activity of one dehydrogenase. Subsequently, the dehydrogenase that is the subject of investigation performs its enzymatic activity in its subcellular site. In a chemical reaction with the reaction medium, this ultimately generates blue-colored formazan at the site of the dehydrogenase's activity. The formazan's absorbance is therefore a direct measure of the dehydrogenase's activity and can be quantified using monochromatic light microscopy and image analysis. The quantitative aspect of this protocol enables researchers to draw statistical conclusions from these assays. Besides observational studies, this technique can be used for inhibition studies of specific enzymes. In this context, studies benefit from the true-to-nature advantages of metabolic mapping, giving in situ results that may be physiologically more relevant than in vitro enzyme inhibition studies. In all, metabolic mapping is an indispensable technique to study metabolism at the cellular or tissue level. The technique is easy to adopt, provides in-depth, comprehensive and integrated metabolic information and enables rapid quantitative analysis.
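
    The absorbance read-out can be sketched directly from the Beer-Lambert relation: at a single (monochromatic) wavelength, absorbance is the negative log ratio of transmitted to incident intensity, averaged over the region of interest. The intensity values below are invented:

```python
import math

def absorbance(i_transmitted, i_incident):
    """Absorbance A = -log10(I / I0) at a single wavelength."""
    return -math.log10(i_transmitted / i_incident)

def mean_formazan_absorbance(pixel_intensities, background_intensity):
    """Mean absorbance over an image region; proportional to the amount
    of formazan deposited and hence, with suitable controls, to the
    local dehydrogenase activity."""
    values = [absorbance(i, background_intensity) for i in pixel_intensities]
    return sum(values) / len(values)

# Hypothetical region: 10% transmission at every pixel -> absorbance 1.0
region = [25.5, 25.5, 25.5, 25.5]
mean_abs = mean_formazan_absorbance(region, 255.0)
```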

  20. Detection of genetically modified organisms in foods by DNA amplification techniques.

    PubMed

    García-Cañas, Virginia; Cifuentes, Alejandro; González, Ramón

    2004-01-01

    In this article, the different DNA amplification techniques that are being used for detecting genetically modified organisms (GMOs) in foods are examined. This study intends to provide an updated overview (including work published up to June 2002) of the principal applications of such techniques, together with their main advantages and drawbacks in GMO detection in foods. Some relevant facts on sampling, DNA isolation, and DNA amplification methods are discussed. Moreover, these analytical protocols are discussed from a quantitative point of view, including the newest investigations on multiplex detection of GMOs in foods and validation of methods.
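
    As one illustration of the quantitative point of view, real-time PCR data are often reduced to a copy ratio between a transgene and a taxon-specific reference gene. This minimal sketch assumes equal amplification efficiencies and uses invented cycle numbers; it is a generic read-out, not a protocol from the review:

```python
def relative_copy_ratio(ct_target, ct_reference, efficiency=2.0):
    """Estimate the transgene-to-reference copy ratio from real-time PCR
    threshold cycles (Ct), assuming both assays amplify with the same
    per-cycle efficiency (2.0 = perfect doubling)."""
    return efficiency ** (ct_reference - ct_target)

# Hypothetical run: transgene crosses threshold 2 cycles after the
# reference gene -> roughly a quarter as many starting copies
ratio = relative_copy_ratio(ct_target=27.0, ct_reference=25.0)
```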

  1. Systems Biology in Immunology – A Computational Modeling Perspective

    PubMed Central

    Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.

    2011-01-01

    Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182

  2. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties, including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from [18F]-fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
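
    The patient-sampling step can be illustrated with a generic percentile bootstrap. The NGS precision estimator itself is not reproduced here, so an ordinary mean stands in for the figure of merit, and the per-lesion values are invented:

```python
import random
import statistics

def bootstrap_ci(values, stat=statistics.mean, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap confidence interval for a figure of merit,
    resampling patients (or lesions) with replacement so that
    patient-sampling uncertainty is folded into the reported FoM."""
    rng = random.Random(seed)
    n = len(values)
    replicates = sorted(
        stat([values[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = replicates[int(alpha / 2 * n_boot)]
    hi = replicates[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Invented per-lesion volume errors; a real NGS analysis would replace
# statistics.mean with its own precision estimator.
lesion_errors = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]
lo, hi = bootstrap_ci(lesion_errors)
```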

  3. Osteoporosis Imaging: State of the Art and Advanced Imaging

    PubMed Central

    2012-01-01

    Osteoporosis is becoming an increasingly important public health issue, and effective treatments to prevent fragility fractures are available. Osteoporosis imaging is of critical importance in identifying individuals at risk for fractures who would require pharmacotherapy to reduce fracture risk and also in monitoring response to treatment. Dual x-ray absorptiometry is currently the state-of-the-art technique to measure bone mineral density and to diagnose osteoporosis according to the World Health Organization guidelines. Motivated by a 2000 National Institutes of Health consensus conference, substantial research efforts have focused on assessing bone quality by using advanced imaging techniques. Among these techniques aimed at better characterizing fracture risk and treatment effects, high-resolution peripheral quantitative computed tomography (CT) currently plays a central role, and a large number of recent studies have used this technique to study trabecular and cortical bone architecture. Other techniques to analyze bone quality include multidetector CT, magnetic resonance imaging, and quantitative ultrasonography. In addition to quantitative imaging techniques measuring bone density and quality, imaging needs to be used to diagnose prevalent osteoporotic fractures, such as spine fractures on chest radiographs and sagittal multidetector CT reconstructions. Radiologists need to be sensitized to the fact that the presence of fragility fractures will alter patient care, and these fractures need to be described in the report. This review article covers state-of-the-art imaging techniques to measure bone mineral density, describes novel techniques to study bone quality, and focuses on how standard imaging techniques should be used to diagnose prevalent osteoporotic fractures. © RSNA, 2012 PMID:22438439
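
    The WHO densitometric classification mentioned here reduces to a T-score: the patient's bone mineral density expressed in young-adult standard-deviation units. A sketch with invented reference values (not actual population norms):

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """DXA T-score: patient BMD in young-adult SD units."""
    return (bmd - young_adult_mean) / young_adult_sd

def who_category(t):
    """WHO densitometric categories based on the T-score:
    <= -2.5 osteoporosis, between -2.5 and -1.0 osteopenia,
    >= -1.0 normal."""
    if t <= -2.5:
        return "osteoporosis"
    if t < -1.0:
        return "osteopenia"
    return "normal"

# Hypothetical patient: BMD 0.70 g/cm^2 against an invented reference
# mean of 1.00 g/cm^2 and SD of 0.10 g/cm^2 -> T-score of -3.0
category = who_category(t_score(0.70, 1.00, 0.10))
```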

  4. Critical factors determining the quantification capability of matrix-assisted laser desorption/ionization–time-of-flight mass spectrometry

    PubMed Central

    Wang, Chia-Chen; Lai, Yin-Hung; Ou, Yu-Meng; Chang, Huan-Tsung; Wang, Yi-Sheng

    2016-01-01

    Quantitative analysis with mass spectrometry (MS) is important but challenging. Matrix-assisted laser desorption/ionization (MALDI) coupled with time-of-flight (TOF) MS offers superior sensitivity, resolution and speed, but such techniques have numerous disadvantages that hinder quantitative analyses. This review summarizes essential obstacles to analyte quantification with MALDI-TOF MS, including the complex ionization mechanism of MALDI, sensitive characteristics of the applied electric fields and the mass-dependent detection efficiency of ion detectors. General quantitative ionization and desorption interpretations of ion production are described. Important instrument parameters and available methods of MALDI-TOF MS used for quantitative analysis are also reviewed. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644968
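
    TOF mass analysis rests on flight time scaling with the square root of m/z. A two-point calibration of that relation (calibrant masses and flight times below are invented round numbers) can be fit and then inverted to read masses off measured times:

```python
import math

def calibrate_tof(mz1, t1, mz2, t2):
    """Two-point TOF calibration t = a*sqrt(m/z) + b from two known
    calibrant ions."""
    s1, s2 = math.sqrt(mz1), math.sqrt(mz2)
    a = (t2 - t1) / (s2 - s1)
    return a, t1 - a * s1

def mass_from_time(t, a, b):
    """Invert the calibration to recover m/z from a measured flight time."""
    return ((t - b) / a) ** 2

# Invented calibrants: m/z 100 at 10 us and m/z 400 at 20 us
a, b = calibrate_tof(100.0, 10.0, 400.0, 20.0)
mz = mass_from_time(15.0, a, b)
```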

  5. Dosage and Distribution in Morphosyntax Intervention: Current Evidence and Future Needs

    ERIC Educational Resources Information Center

    Proctor-Williams, Kerry

    2009-01-01

    This article reviews the effectiveness of dose forms and the efficacy of dosage and distribution in morphosyntax intervention for children. Dose forms include the commonly used techniques, procedures, and intervention contexts that constitute teaching episodes; dosage includes the quantitative measures of dose, dose frequency, total intervention…

  6. Combustion method for assay of biological materials labeled with carbon-14 or tritium, or double-labeled

    NASA Technical Reports Server (NTRS)

    Huebner, L. G.; Kisieleski, W. E.

    1969-01-01

    Dry catalytic combustion at high temperatures is used for assaying biological materials labeled with carbon-14 or tritium, or double-labeled. A modified oxygen-flask technique is combined with standard vacuum-line techniques and includes the convenience of direct in-vial collection of the final combustion products, giving quantitative recovery of tritium and carbon-14.

  7. Evaluation of thresholding techniques for segmenting scaffold images in tissue engineering

    NASA Astrophysics Data System (ADS)

    Rajagopalan, Srinivasan; Yaszemski, Michael J.; Robb, Richard A.

    2004-05-01

    Tissue engineering attempts to address the ever-widening gap between the demand and supply of organ and tissue transplants using natural and biomimetic scaffolds. The regeneration of specific tissues aided by synthetic materials is dependent on the structural and morphometric properties of the scaffold. These properties can be derived non-destructively using quantitative analysis of high resolution microCT scans of scaffolds. Thresholding of the scanned images into polymeric and porous phase is central to the outcome of the subsequent structural and morphometric analysis. Visual thresholding of scaffolds produced using stochastic processes is inaccurate. Depending on the algorithmic assumptions made, automatic thresholding might also be inaccurate. Hence there is a need to analyze the performance of different techniques and propose alternate ones, if needed. This paper provides a quantitative comparison of different thresholding techniques for segmenting scaffold images. The thresholding algorithms examined include those that exploit spatial information, locally adaptive characteristics, histogram entropy information, histogram shape information, and clustering of gray-level information. The performance of different techniques was evaluated using established criteria, including misclassification error, edge mismatch, relative foreground error, and region non-uniformity. Algorithms that exploit local image characteristics seem to perform much better than those using global information.
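
    Two of the evaluation criteria named can be sketched compactly. Misclassification error follows the standard definition; the region non-uniformity below is a simplified variant (foreground variance over whole-image variance), and the pixel data are invented:

```python
def misclassification_error(ground_truth, segmented):
    """Fraction of pixels whose binary label disagrees with ground truth
    (0 = perfect segmentation, 1 = complete disagreement)."""
    wrong = sum(g != s for g, s in zip(ground_truth, segmented))
    return wrong / len(ground_truth)

def region_nonuniformity(image, mask):
    """Simplified non-uniformity: variance of the thresholded foreground
    intensities relative to the variance of the whole image. Lower
    values indicate a more homogeneous segmented phase."""
    fg = [v for v, m in zip(image, mask) if m]
    def var(xs):
        mu = sum(xs) / len(xs)
        return sum((x - mu) ** 2 for x in xs) / len(xs)
    total = var(image)
    return var(fg) / total if total > 0 else 0.0

# Invented 6-pixel "scan": ground-truth phase labels vs one thresholding
gt  = [0, 0, 1, 1, 1, 0]
seg = [0, 1, 1, 1, 0, 0]
img = [10, 10, 50, 50, 50, 10]
me = misclassification_error(gt, seg)
nu = region_nonuniformity(img, seg)
```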

  8. Meta-analysis of the technical performance of an imaging procedure: guidelines and statistical methodology.

    PubMed

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2015-02-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test-retest repeatability data for illustrative purposes.
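
    One standard way to pool a performance metric across studies, consistent with the random-effects framing such meta-analyses use, is the DerSimonian-Laird estimator. This is a generic sketch with invented study values, not the paper's specific (small-study-robust) methodology:

```python
def dersimonian_laird(estimates, variances):
    """Random-effects pooled estimate of a performance metric across
    studies, with the DerSimonian-Laird moment estimate of the
    between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
    df = len(estimates) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c) if c > 0 else 0.0
    w_star = [1.0 / (v + tau2) for v in variances]
    return sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)

# Hypothetical test-retest repeatability coefficients from three studies,
# with their (invented) within-study variances
pooled = dersimonian_laird([0.10, 0.12, 0.11], [0.0004, 0.0009, 0.0001])
```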

  9. Meta-analysis of the technical performance of an imaging procedure: Guidelines and statistical methodology

    PubMed Central

    Huang, Erich P; Wang, Xiao-Feng; Choudhury, Kingshuk Roy; McShane, Lisa M; Gönen, Mithat; Ye, Jingjing; Buckler, Andrew J; Kinahan, Paul E; Reeves, Anthony P; Jackson, Edward F; Guimaraes, Alexander R; Zahlmann, Gudrun

    2017-01-01

    Medical imaging serves many roles in patient care and the drug approval process, including assessing treatment response and guiding treatment decisions. These roles often involve a quantitative imaging biomarker, an objectively measured characteristic of the underlying anatomic structure or biochemical process derived from medical images. Before a quantitative imaging biomarker is accepted for use in such roles, the imaging procedure to acquire it must undergo evaluation of its technical performance, which entails assessment of performance metrics such as repeatability and reproducibility of the quantitative imaging biomarker. Ideally, this evaluation will involve quantitative summaries of results from multiple studies to overcome limitations due to the typically small sample sizes of technical performance studies and/or to include a broader range of clinical settings and patient populations. This paper is a review of meta-analysis procedures for such an evaluation, including identification of suitable studies, statistical methodology to evaluate and summarize the performance metrics, and complete and transparent reporting of the results. This review addresses challenges typical of meta-analyses of technical performance, particularly small study sizes, which often cause violations of assumptions underlying standard meta-analysis techniques. Alternative approaches to address these difficulties are also presented; simulation studies indicate that they outperform standard techniques when some studies are small. The meta-analysis procedures presented are also applied to actual [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) test–retest repeatability data for illustrative purposes. PMID:24872353

  10. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    The SERS method for biomolecular analysis has several potential advantages over traditional biochemical approaches, including less specimen contact, non-destructiveness to the specimen, and multiple-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS bands of creatinine (104 mg/dl) in artificial urine between 1400 cm-1 and 1500 cm-1 were analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant (55.9 mg/dl to 208 mg/dl) concentration range. The root-mean-square error of cross validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human-subject urine creatinine detection, and establishes the SERS platform technique for bodily fluid measurement.
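
    The study's PLS regression operates on full spectra; the leave-one-out RMSECV bookkeeping itself can be shown with a univariate linear calibration standing in for PLS. The intensities and concentrations below are invented, not the study's data:

```python
import math

def fit_line(x, y):
    """Ordinary least-squares fit y = a*x + b (stand-in for PLS)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def loo_rmsecv(signals, concentrations):
    """Leave-one-out root-mean-square error of cross validation: each
    sample is predicted from a calibration built on all the others."""
    errs = []
    for i in range(len(signals)):
        xs = [s for j, s in enumerate(signals) if j != i]
        ys = [c for j, c in enumerate(concentrations) if j != i]
        a, b = fit_line(xs, ys)
        errs.append((a * signals[i] + b - concentrations[i]) ** 2)
    return math.sqrt(sum(errs) / len(errs))

# Hypothetical SERS band intensities vs creatinine concentrations (mg/dl)
rmsecv = loo_rmsecv([10, 20, 30, 40, 50], [55, 105, 150, 205, 250])
```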

  11. The Xeno-glycomics database (XDB): a relational database of qualitative and quantitative pig glycome repertoire.

    PubMed

    Park, Hae-Min; Park, Ju-Hyeong; Kim, Yoon-Woo; Kim, Kyoung-Jin; Jeong, Hee-Jin; Jang, Kyoung-Soon; Kim, Byung-Gee; Kim, Yun-Gon

    2013-11-15

    In recent years, the improvement of mass spectrometry-based glycomics techniques (i.e. highly sensitive, quantitative and high-throughput analytical tools) has enabled us to obtain a large dataset of glycans. Here we present a database named the Xeno-glycomics database (XDB) that contains cell- or tissue-specific pig glycomes analyzed with mass spectrometry-based techniques, including comprehensive pig glycan information on chemical structures, mass values, types and relative quantities. It was designed as a user-friendly web-based interface that allows users to query the database according to pig tissue/cell types or glycan masses. This database will contribute to providing qualitative and quantitative information on glycomes characterized from various pig cells/organs in xenotransplantation and might eventually provide new targets in the era of α1,3-galactosyltransferase gene-knockout pigs. The database can be accessed on the web at http://bioinformatics.snu.ac.kr/xdb.

  12. Electroencephalography and quantitative electroencephalography in mild traumatic brain injury.

    PubMed

    Haneef, Zulfi; Levin, Harvey S; Frost, James D; Mizrahi, Eli M

    2013-04-15

    Mild traumatic brain injury (mTBI) causes brain injury resulting in electrophysiologic abnormalities visible in electroencephalography (EEG) recordings. Quantitative EEG (qEEG) makes use of quantitative techniques to analyze EEG characteristics such as frequency, amplitude, coherence, power, phase, and symmetry over time independently or in combination. QEEG has been evaluated for its use in making a diagnosis of mTBI and assessing prognosis, including the likelihood of progressing to the postconcussive syndrome (PCS) phase. We review the EEG and qEEG changes of mTBI described in the literature. An attempt is made to separate the findings seen during the acute, subacute, and chronic phases after mTBI. Brief mention is also made of the neurobiological correlates of qEEG using neuroimaging techniques or in histopathology. Although the literature indicates the promise of qEEG in making a diagnosis and indicating prognosis of mTBI, further study is needed to corroborate and refine these methods.
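
    Among the EEG characteristics listed, frequency-band power is the simplest to sketch. A naive DFT periodogram (not a production qEEG pipeline) applied to a synthetic alpha-dominant trace, with all signal parameters invented:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Total power in [f_lo, f_hi) Hz from a naive DFT periodogram.

    O(n^2) per band; real qEEG software would use an FFT with windowing
    and averaging (e.g. Welch's method)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if f_lo <= freq < f_hi:
            x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
            power += abs(x) ** 2
    return power

# Synthetic 2-second trace at 128 Hz: strong 10 Hz (alpha), weak 4 Hz (theta)
fs, n = 128, 256
sig = [math.sin(2 * math.pi * 10 * t / fs) +
       0.2 * math.sin(2 * math.pi * 4 * t / fs) for t in range(n)]
alpha = band_power(sig, fs, 8, 13)
theta = band_power(sig, fs, 4, 8)
```

Band-power ratios like alpha/theta are among the simple quantitative features qEEG studies report.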

  13. Electroencephalography and Quantitative Electroencephalography in Mild Traumatic Brain Injury

    PubMed Central

    Levin, Harvey S.; Frost, James D.; Mizrahi, Eli M.

    2013-01-01

    Abstract Mild traumatic brain injury (mTBI) causes brain injury resulting in electrophysiologic abnormalities visible in electroencephalography (EEG) recordings. Quantitative EEG (qEEG) makes use of quantitative techniques to analyze EEG characteristics such as frequency, amplitude, coherence, power, phase, and symmetry over time independently or in combination. QEEG has been evaluated for its use in making a diagnosis of mTBI and assessing prognosis, including the likelihood of progressing to the postconcussive syndrome (PCS) phase. We review the EEG and qEEG changes of mTBI described in the literature. An attempt is made to separate the findings seen during the acute, subacute, and chronic phases after mTBI. Brief mention is also made of the neurobiological correlates of qEEG using neuroimaging techniques or in histopathology. Although the literature indicates the promise of qEEG in making a diagnosis and indicating prognosis of mTBI, further study is needed to corroborate and refine these methods. PMID:23249295

  14. Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Waltz, Ed

    2016-05-01

    Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics over raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, and these are examined here to match analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.

  15. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
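
    The OSEM update at the core of such reconstruction software can be sketched in a few lines; the toy system matrix below stands in for a real SPECT projector (attenuation, scatter and collimator-response corrections would enter through that system model), and all names and dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def osem(A, y, n_iter=200, n_subsets=2, eps=1e-12):
    """Ordered Subset Expectation Maximization for y = A x, x >= 0."""
    m, n = A.shape
    x = np.ones(n)                                   # uniform initial activity estimate
    subsets = np.array_split(np.arange(m), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As = A[s]
            ratio = y[s] / np.maximum(As @ x, eps)   # measured / modeled projections
            # multiplicative update, normalized by the subset sensitivity image
            x *= (As.T @ ratio) / np.maximum(As.T @ np.ones(len(s)), eps)
    return x

rng = np.random.default_rng(0)
A = rng.random((40, 10))        # toy system matrix (not a real projector)
x_true = rng.random(10) + 0.5   # "true" activity distribution
y = A @ x_true                  # noiseless projection data
x_hat = osem(A, y)
```

The multiplicative form automatically preserves non-negativity of the activity estimate, which is one reason EM-type algorithms dominate emission tomography.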

  16. Radiologic-Pathologic Analysis of Contrast-enhanced and Diffusion-weighted MR Imaging in Patients with HCC after TACE: Diagnostic Accuracy of 3D Quantitative Image Analysis

    PubMed Central

    Chapiro, Julius; Wood, Laura D.; Lin, MingDe; Duran, Rafael; Cornish, Toby; Lesage, David; Charu, Vivek; Schernthaner, Rüdiger; Wang, Zhijun; Tacher, Vania; Savic, Lynn Jeanette; Kamel, Ihab R.

    2014-01-01

    Purpose To evaluate the diagnostic performance of three-dimensional (3D) quantitative enhancement-based and diffusion-weighted volumetric magnetic resonance (MR) imaging assessment of hepatocellular carcinoma (HCC) lesions in determining the extent of pathologic tumor necrosis after transarterial chemoembolization (TACE). Materials and Methods This institutional review board–approved retrospective study included 17 patients with HCC who underwent TACE before surgery. Semiautomatic 3D volumetric segmentation of target lesions was performed at the last MR examination before orthotopic liver transplantation or surgical resection. The amount of necrotic tumor tissue on contrast material–enhanced arterial phase MR images and the amount of diffusion-restricted tumor tissue on apparent diffusion coefficient (ADC) maps were expressed as a percentage of the total tumor volume. Visual assessment of the extent of tumor necrosis and tumor response according to European Association for the Study of the Liver (EASL) criteria was performed. Pathologic tumor necrosis was quantified by using slide-by-slide segmentation. Correlation analysis was performed to evaluate the predictive values of the radiologic techniques. Results At histopathologic examination, the mean percentage of tumor necrosis was 70% (range, 10%–100%). Both 3D quantitative techniques demonstrated a strong correlation with tumor necrosis at pathologic examination (R2 = 0.9657 and R2 = 0.9662 for quantitative EASL and quantitative ADC, respectively) and a strong intermethod agreement (R2 = 0.9585). Both methods showed a significantly lower discrepancy with pathologically measured necrosis (residual standard error [RSE] = 6.38 and 6.33 for quantitative EASL and quantitative ADC, respectively) when compared with non-3D techniques (RSE = 12.18 for visual assessment). Conclusion This radiologic-pathologic correlation study demonstrates the diagnostic accuracy of 3D quantitative MR imaging techniques in identifying pathologically measured tumor necrosis in HCC lesions treated with TACE. © RSNA, 2014 Online supplemental material is available for this article. PMID:25028783
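
    The agreement statistics used in radiologic-pathologic correlation studies of this kind (R-squared and residual standard error of a linear fit) can be computed generically; the percent-necrosis pairs below are hypothetical illustration values, not the study's data.

```python
import numpy as np

# hypothetical paired measurements: pathology vs. imaging percent necrosis
pathology = np.array([10, 35, 50, 70, 85, 100], dtype=float)
imaging   = np.array([12, 33, 52, 68, 88,  97], dtype=float)

slope, intercept = np.polyfit(pathology, imaging, 1)   # least-squares line
pred = slope * pathology + intercept
resid = imaging - pred
ss_res = np.sum(resid ** 2)
ss_tot = np.sum((imaging - imaging.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
rse = np.sqrt(ss_res / (len(imaging) - 2))             # residual standard error, df = n - 2
```

A lower RSE against the pathology reference is exactly the criterion by which the 3D methods outperformed visual assessment above.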

  17. Sample and data processing considerations for the NIST quantitative infrared database

    NASA Astrophysics Data System (ADS)

    Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William

    1999-02-01

    Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas-phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of the hazardous air pollutants (HAPs) identified in the EPA's Clean Air Act of 1990. To support infrared-based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.
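
    Quantitation against such reference spectra ultimately reduces to the Beer–Lambert law, c = A / (epsilon * l); the absorptivity and path length in this sketch are purely hypothetical values, the database itself supplies the calibrated coefficients.

```python
def concentration_ppm(absorbance, absorptivity, path_m):
    """Beer-Lambert: concentration = absorbance / (absorptivity * path length)."""
    return absorbance / (absorptivity * path_m)

# hypothetical open-path measurement: A = 0.25, epsilon = 0.005 ppm^-1 m^-1, l = 10 m
c = concentration_ppm(0.25, 0.005, 10.0)
```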

  18. General quantitative genetic methods for comparative biology: phylogenies, taxonomies and multi-trait models for continuous and categorical characters.

    PubMed

    Hadfield, J D; Nakagawa, S

    2010-03-01

    Although many of the statistical techniques used in comparative biology were originally developed in quantitative genetics, subsequent development of comparative techniques has progressed in relative isolation. Consequently, many of the new and planned developments in comparative analysis already have well-tested solutions in quantitative genetics. In this paper, we take three recent publications that develop phylogenetic meta-analysis, either implicitly or explicitly, and show how they can be considered as quantitative genetic models. We highlight some of the difficulties with the proposed solutions, and demonstrate that standard quantitative genetic theory and software offer solutions. We also show how results from Bayesian quantitative genetics can be used to create efficient Markov chain Monte Carlo algorithms for phylogenetic mixed models, thereby extending their generality to non-Gaussian data. Of particular utility is the development of multinomial models for analysing the evolution of discrete traits, and the development of multi-trait models in which traits can follow different distributions. Meta-analyses often include a nonrandom collection of species for which the full phylogenetic tree has only been partly resolved. Using missing data theory, we show how the presented models can be used to correct for nonrandom sampling and show how taxonomies and phylogenies can be combined to give a flexible framework with which to model dependence.

  19. Current methods and advances in bone densitometry

    NASA Technical Reports Server (NTRS)

    Guglielmi, G.; Gluer, C. C.; Majumdar, S.; Blunt, B. A.; Genant, H. K.

    1995-01-01

    Bone mass is the primary, although not the only, determinant of fracture risk. Over the past few years a number of noninvasive techniques have been developed to more sensitively quantitate bone mass. These include single and dual photon absorptiometry (SPA and DPA), single and dual X-ray absorptiometry (SXA and DXA) and quantitative computed tomography (QCT). While differing in anatomic sites measured and in their estimates of precision, accuracy, and fracture discrimination, all of these methods provide clinically useful measurements of skeletal status. It is the intent of this review to discuss the pros and cons of these techniques and to present the new applications of ultrasound (US) and magnetic resonance imaging (MRI) in the detection and management of osteoporosis.
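
    Densitometry results from these techniques are conventionally reported as T-scores, the bone mineral density expressed in standard deviations from a young-adult reference mean; the reference values in this sketch are illustrative, not a real reference database.

```python
def t_score(bmd, young_adult_mean, young_adult_sd):
    """T-score: standard deviations from the young-adult reference mean BMD."""
    return (bmd - young_adult_mean) / young_adult_sd

# hypothetical lumbar-spine DXA result (g/cm^2) against illustrative reference values
t = t_score(0.78, young_adult_mean=1.00, young_adult_sd=0.11)
```

Under the WHO convention a T-score at or below -2.5 defines osteoporosis, which is why accuracy and precision of the underlying BMD measurement matter clinically.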

  20. Nanomechanical effects of light unveil photons momentum in medium

    PubMed Central

    Verma, Gopal; Chaudhary, Komal; Singh, Kamal P.

    2017-01-01

    Precision measurement of momentum transfer between light and a fluid interface has many implications, including resolving the intriguing nature of photon momentum in a medium. For example, the existence of the Abraham pressure of light under specific experimental configurations and the predictions of the Chau-Amperian formalism of optical momentum for TE and TM polarizations remain untested. Here, we quantitatively and cleanly measure the nanomechanical dynamics of a water surface excited by the radiation pressure of a laser beam. We systematically scanned a wide range of experimental parameters, including long exposure times, angle of incidence, spot size and laser polarization, and used two independent pump-probe techniques to validate a nano-bump on the water surface under all the tested conditions, in quantitative agreement with the Minkowski momentum of light. With careful experiments, we demonstrate advantages and limitations of nanometer-resolved optical probing techniques and narrow down the actual manifestation of optical momentum in a medium. PMID:28198468
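
    The competing momenta at stake can be stated numerically with the textbook Minkowski and Abraham expressions for a single photon in a medium of index n; the wavelength and index below are illustrative choices (water at 532 nm), not values taken from the paper.

```python
h = 6.62607015e-34   # Planck constant, J s
n = 1.33             # refractive index of water (assumed)
lam = 532e-9         # vacuum wavelength, m (assumed)

p_vacuum = h / lam          # photon momentum in vacuum
p_minkowski = n * p_vacuum  # Minkowski: larger in the medium
p_abraham = p_vacuum / n    # Abraham: smaller in the medium
```

The two formulations differ by a factor of n squared, which is why a surface deformation experiment sensitive to the sign and size of the momentum transfer can discriminate between them.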

  1. Quantitative ultrasonic evaluation of concrete structures using one-sided access

    NASA Astrophysics Data System (ADS)

    Khazanovich, Lev; Hoegh, Kyle

    2016-02-01

    Nondestructive diagnostics of concrete structures is an important and challenging problem. The recent introduction of ultrasonic dry point contact transducer array systems offers opportunities for quantitative assessment of the subsurface condition of concrete structures, including detection of defects and inclusions. The methods described in this paper were developed for signal interpretation of shear wave impulse response time histories from multiple fixed-distance transducer pairs in a self-contained ultrasonic linear array. This included generalizing Kirchhoff migration-based synthetic aperture focusing technique (SAFT) reconstruction methods to handle the spatially diverse transducer pair locations, creating expanded virtual arrays with associated reconstruction methods, and creating automated reconstruction interpretation methods for reinforcement detection and stochastic flaw detection. The reconstruction and interpretation techniques developed in this study were validated using the results of laboratory and field forensic studies. Applicability of the developed methods for solving practical engineering problems was demonstrated.
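
    The delay-and-sum idea behind SAFT reconstruction can be sketched with a toy setup: spike A-scans from a point reflector are summed coherently at each candidate image pixel. The wave speed, geometry and pulse-echo arrangement here are simplifying assumptions (the paper's arrays use fixed-distance transmit-receive pairs), not the authors' algorithm.

```python
import numpy as np

c = 2600.0                      # assumed shear-wave speed in concrete, m/s
fs = 1e6                        # sampling rate, Hz
xt = np.linspace(0.0, 0.3, 8)   # transducer positions along the surface, m
target = (0.15, 0.08)           # point reflector (x, z), m

# simulate pulse-echo A-scans: a unit spike at each two-way travel time
n = 512
scans = np.zeros((len(xt), n))
for i, x in enumerate(xt):
    d = np.hypot(x - target[0], target[1])
    scans[i, int(round(2 * d / c * fs))] = 1.0

# delay-and-sum: for each pixel, add the sample at its round-trip time
xs = np.linspace(0.0, 0.3, 61)
zs = np.linspace(0.01, 0.15, 57)
image = np.zeros((len(zs), len(xs)))
for i, x in enumerate(xt):
    for iz, z in enumerate(zs):
        for ix, px in enumerate(xs):
            k = int(round(2 * np.hypot(x - px, z) / c * fs))
            if k < n:
                image[iz, ix] += scans[i, k]
```

Only at the true reflector position do all apertures add in phase, so the image peaks there; Kirchhoff-migration SAFT refines this with amplitude weighting and the actual pair geometry.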

  2. Noninvasive characterization of the fission yeast cell cycle by monitoring dry mass with digital holographic microscopy.

    PubMed

    Rappaz, Benjamin; Cano, Elena; Colomb, Tristan; Kühn, Jonas; Depeursinge, Christian; Simanis, Viesturs; Magistretti, Pierre J; Marquet, Pierre

    2009-01-01

    Digital holography microscopy (DHM) is an optical technique which provides phase images yielding quantitative information about cell structure and cellular dynamics. Furthermore, the quantitative phase images allow the derivation of other parameters, including dry mass production, density, and spatial distribution. We have applied DHM to study the dry mass production rate and the dry mass surface density in wild-type and mutant fission yeast cells. Our study demonstrates the applicability of DHM as a tool for label-free quantitative analysis of the cell cycle and opens the possibility for its use in high-throughput screening.
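
    The dry-mass relation underlying this approach converts the measured phase to optical path difference and divides by the specific refraction increment. The wavelength, pixel size, increment value and uniform synthetic phase map below are assumptions for illustration, not the authors' data.

```python
import numpy as np

wavelength_um = 0.682            # assumed laser wavelength, um
alpha_um3_per_pg = 0.2           # assumed specific refraction increment, um^3/pg
pixel_area_um2 = 0.1 * 0.1       # assumed 100 nm pixel pitch

phase = np.full((10, 10), 1.5)   # synthetic uniform phase map over a cell, radians

# optical path difference per pixel (um), then dry-mass surface density (pg/um^2)
opd = phase * wavelength_um / (2 * np.pi)
density = opd / alpha_um3_per_pg
dry_mass_pg = float(np.sum(density) * pixel_area_um2)   # total dry mass, pg
```

Because the conversion is linear in phase, tracking total dry mass over the cell cycle amounts to integrating the phase image frame by frame.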

  3. Diagnostic techniques in deflagration and detonation studies.

    PubMed

    Proud, William G; Williamson, David M; Field, John E; Walley, Stephen M

    2015-12-01

    Advances in experimental, high-speed techniques can be used to explore the processes occurring within energetic materials. This review describes techniques used to study a wide range of processes: hot-spot formation, ignition thresholds, deflagration, sensitivity and finally the detonation process. As this is a wide field, the focus will be on small-scale experiments and quantitative studies. It is important that such studies are linked to predictive models, which inform the experimental design process. The range of stimuli includes thermal ignition, drop-weight, Hopkinson bar and plate impact studies. Studies made with inert simulants are also included, as these are important in differentiating between reactive response and purely mechanical behaviour.

  4. Ultrasonic nondestructive evaluation of graphite epoxy composite laminates

    NASA Technical Reports Server (NTRS)

    Miller, James G.

    1990-01-01

    Quantitative ultrasonic techniques are summarized with applications to the measurement of frequency-dependent attenuation and backscatter and to the NDE of composite laminates. Results are listed for the ultrasonic NDE of graphite-epoxy composite laminates, including impact and fatigue damage as well as porosity. The methods reviewed include transmission measurements of attenuation, reconstructive tomography based on attenuation, estimation of attenuation from backscattered ultrasound, and backscatter approaches. Phase-sensitive and phase-insensitive detection techniques are discussed, such as phase cancellation at piezoelectric receiving transducers and acoustoelectric effects. The techniques permit NDE of the listed parameters in inhomogeneous media and provide images in both transmission and reflection modes.

  5. Influence of Cooperative Integrated Reading and Composition Technique on Foreign Students' Reading and Writing Skills in Turkish

    ERIC Educational Resources Information Center

    Varisoglu, Behice

    2016-01-01

    The purpose of this study was to reveal whether the technique of Cooperative Integrated Reading and Composition (CIRC) in Turkish Language teaching had influence on students' skills in reading and writing. In the study, the mixed method, which included quantitative and qualitative dimensions together, was used. The study group was made up of 16…

  6. Rival approaches to mathematical modelling in immunology

    NASA Astrophysics Data System (ADS)

    Andrew, Sarah M.; Baker, Christopher T. H.; Bocharov, Gennady A.

    2007-08-01

    In order to formulate quantitatively correct mathematical models of the immune system, one requires an understanding of immune processes and familiarity with a range of mathematical techniques. Selection of an appropriate model requires a number of decisions to be made, including a choice of the modelling objectives, strategies and techniques and the types of model considered as candidate models. The authors adopt a multidisciplinary perspective.

  7. Comparison of selected analytical techniques for protein sizing, quantitation and molecular weight determination.

    PubMed

    Goetz, H; Kuschel, M; Wulff, T; Sauber, C; Miller, C; Fisher, S; Woodward, C

    2004-09-30

    Protein analysis techniques are developing fast due to the growing number of proteins obtained by recombinant DNA techniques. In the present paper we compare selected techniques used for protein sizing, quantitation and molecular weight determination: sodium dodecylsulfate-polyacrylamide gel electrophoresis (SDS-PAGE), lab-on-a-chip or microfluidics technology (LoaC), size exclusion chromatography (SEC) and mass spectrometry (MS). We compare the advantages and limitations of each technique with respect to different application areas, analysis time, and protein sizing and quantitation performance.

  8. The state of RT-quantitative PCR: firsthand observations of implementation of minimum information for the publication of quantitative real-time PCR experiments (MIQE).

    PubMed

    Taylor, Sean C; Mrkusich, Eli M

    2014-01-01

    In the past decade, the techniques of quantitative PCR (qPCR) and reverse transcription (RT)-qPCR have become accessible to virtually all research labs, producing valuable data for peer-reviewed publications and supporting exciting research conclusions. However, the experimental design and validation processes applied to the associated projects are the result of historical biases adopted by individual labs that have evolved and changed since the inception of the techniques and associated technologies. This has resulted in wide variability in the quality, reproducibility and interpretability of published data as a direct result of how each lab has designed their RT-qPCR experiments. The 'minimum information for the publication of quantitative real-time PCR experiments' (MIQE) was published to provide the scientific community with a consistent workflow and key considerations to perform qPCR experiments. We use specific examples to highlight the serious negative ramifications for data quality when the MIQE guidelines are not applied and include a summary of good and poor practices for RT-qPCR. © 2013 S. Karger AG, Basel.
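
    One workflow step the MIQE guidelines constrain is relative quantification by the delta-delta-Cq method; a minimal sketch with hypothetical Cq values follows (MIQE additionally requires validated reference-gene stability and amplification efficiencies before this simple 2-to-the-power form is justified).

```python
def fold_change(cq_target_treated, cq_ref_treated, cq_target_control, cq_ref_control):
    """Livak delta-delta-Cq fold change, assuming ~100% amplification efficiency."""
    d_treated = cq_target_treated - cq_ref_treated   # normalize to reference gene
    d_control = cq_target_control - cq_ref_control
    return 2.0 ** -(d_treated - d_control)

# hypothetical quantification cycles: target and reference gene, treated vs. control
fc = fold_change(22.0, 18.0, 25.0, 18.5)
```

When efficiencies deviate from 100%, the guidelines call for efficiency-corrected models instead, which is one of the practices whose omission the authors flag in published work.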

  9. IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.

    PubMed

    Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd

    2011-02-04

    Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
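
    Reporter-ion quantification of the iTRAQ/TMT type reduces to ratios of channel intensities aggregated over a protein's peptides; the intensities below are synthetic, and IsobariQ's VSN normalization step (performed via R) is omitted from this sketch.

```python
import numpy as np

# synthetic reporter-ion intensities: rows are peptides, columns are two channels
reporters = np.array([
    [1000.0, 2100.0],
    [ 800.0, 1500.0],
    [1200.0, 2600.0],
])

# per-peptide log2 ratios, summarized to a protein-level ratio by the median
log_ratios = np.log2(reporters[:, 1] / reporters[:, 0])
protein_log_ratio = float(np.median(log_ratios))
```

Working in log space and summarizing with a robust statistic is what makes the heteroscedasticity issue, and hence variance-stabilizing normalization, relevant.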

  10. Analytical Chemistry and the Microchip.

    ERIC Educational Resources Information Center

    Lowry, Robert K.

    1986-01-01

    Analytical techniques used at various points in making microchips are described. They include: Fourier transform infrared spectrometry (silicon purity); optical emission spectroscopy (quantitative thin-film composition); X-ray photoelectron spectroscopy (chemical changes in thin films); wet chemistry, instrumental analysis (process chemicals);…

  11. Comparative study of quantitative phase imaging techniques for refractometry of optical fibers

    NASA Astrophysics Data System (ADS)

    de Dorlodot, Bertrand; Bélanger, Erik; Bérubé, Jean-Philippe; Vallée, Réal; Marquet, Pierre

    2018-02-01

    The refractive index difference profile of optical fibers is the key design parameter because it determines, among other properties, the insertion losses and propagating modes. Therefore, an accurate refractive index profiling method is of paramount importance to their development and optimization. Quantitative phase imaging (QPI) is one of the available tools to retrieve structural characteristics of optical fibers, including the refractive index difference profile. Because they are non-destructive, several different QPI methods have been developed over the last decades. Here, we present a comparative study of three available QPI techniques, namely the transport-of-intensity equation, quadriwave lateral shearing interferometry and digital holographic microscopy. To assess the accuracy and precision of these QPI techniques, quantitative phase images of the core of a well-characterized optical fiber were retrieved for each of them, and a robust image-processing procedure was applied to retrieve the refractive index difference profiles. Although the raw images from all three QPI methods suffered from different shortcomings, our robust automated image-processing pipeline successfully corrected them. After this treatment, all three QPI techniques yielded accurate, reliable and mutually consistent refractive index difference profiles, in agreement with the accuracy and precision of the refracted near-field benchmark measurement.
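
    The basic phase-to-index conversion all three QPI techniques share is delta-n = phase * lambda / (2 * pi * t) for a known traversed thickness t; the wavelength, thickness and phase values below are assumptions, and real fiber profiling additionally requires an Abel inversion for the cylindrical geometry, omitted here.

```python
import numpy as np

wavelength = 0.633e-6   # assumed illumination wavelength, m
thickness = 125e-6      # assumed optical path length through the sample, m
phase = 2.5             # measured phase shift, radians (illustrative)

# refractive index difference relative to the surrounding medium
delta_n = phase * wavelength / (2 * np.pi * thickness)
```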

  12. Qualitative and quantitative interpretation of SEM image using digital image processing.

    PubMed

    Saladra, Dawid; Kopernik, Magdalena

    2016-10-01

    The aim of this study is to improve the qualitative and quantitative analysis of scanning electron microscope micrographs through development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of the qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed computer program with existing digital image processing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.

  13. Introduction of an automated user-independent quantitative volumetric magnetic resonance imaging breast density measurement system using the Dixon sequence: comparison with mammographic breast density assessment.

    PubMed

    Wengert, Georg Johannes; Helbich, Thomas H; Vogl, Wolf-Dieter; Baltzer, Pascal; Langs, Georg; Weber, Michael; Bogner, Wolfgang; Gruber, Stephan; Trattnig, Siegfried; Pinker, Katja

    2015-02-01

    The purposes of this study were to introduce and assess an automated user-independent quantitative volumetric (AUQV) breast density (BD) measurement system on the basis of magnetic resonance imaging (MRI) using the Dixon technique, as well as to compare it with qualitative and quantitative mammographic (MG) BD measurements. Forty-three women with normal mammogram results (Breast Imaging Reporting and Data System 1) were included in this institutional review board-approved prospective study. All participants were subjected to BD assessment with MRI using a Dixon-technique sequence (repetition time/echo times, 6 milliseconds/2.45 milliseconds/2.67 milliseconds; 1-mm isotropic resolution; 3 minutes 38 seconds). To test reproducibility, a second MRI examination was performed after patient repositioning. The AUQV magnetic resonance (MR) BD measurement system automatically calculated percentage (%) BD. The qualitative BD assessment was performed using the American College of Radiology Breast Imaging Reporting and Data System BD categories. Quantitative BD was estimated semiautomatically using the thresholding technique Cumulus4. Appropriate statistical tests were used to assess the agreement between the AUQV MR measurements and to compare them with qualitative and quantitative MG BD estimations. The AUQV MR BD measurements were successfully performed in all 43 women. There was nearly perfect agreement of AUQV MR BD measurements between the 2 MR examinations for % BD (P < 0.001; intraclass correlation coefficient, 0.998) with no significant differences (P = 0.384). The AUQV MR BD measurements were significantly lower than quantitative and qualitative MG BD assessment (P < 0.001). The AUQV MR BD measurement system allows a fully automated, user-independent, robust, reproducible, as well as radiation- and compression-free volumetric quantitative BD assessment through different levels of BD.
The AUQV MR BD measurements were significantly lower than the currently used qualitative and quantitative MG-based approaches, implying that the current assessment might overestimate breast density with MG.
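
    Volumetric percent breast density from Dixon-type water/fat images amounts to the water (fibroglandular) volume over the total breast volume; the synthetic voxel data and the assumption of an already-segmented breast mask below are illustrative, not the AUQV system's implementation.

```python
import numpy as np

# synthetic water-signal fractions for voxels inside an assumed breast mask
rng = np.random.default_rng(3)
water = rng.uniform(0.0, 0.4, 50000)   # fibroglandular fraction per voxel
fat = 1.0 - water                      # fat fraction per voxel

# volumetric percent breast density: water volume over total breast volume
percent_bd = float(100.0 * water.sum() / (water.sum() + fat.sum()))
```

Because this ratio uses continuous fractions rather than a threshold, it avoids the operator-dependent cutoff that area-based methods such as Cumulus require.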

  14. Speciation of individual mineral particles of micrometer size by the combined use of attenuated total reflectance-Fourier transform-infrared imaging and quantitative energy-dispersive electron probe X-ray microanalysis techniques.

    PubMed

    Jung, Hae-Jin; Malek, Md Abdul; Ryu, JiYeon; Kim, BoWha; Song, Young-Chul; Kim, HyeKyeong; Ro, Chul-Un

    2010-07-15

    Our previous work demonstrated for the first time the potential of the combined use of two techniques, attenuated total reflectance FT-IR (ATR-FT-IR) imaging and quantitative energy-dispersive electron probe X-ray microanalysis (low-Z particle EPMA), for the characterization of individual aerosol particles. In this work, the speciation of mineral particles was performed on a single-particle level for 24 mineral samples, including kaolinite, montmorillonite, vermiculite, talc, quartz, feldspar, calcite, gypsum, and apatite, by the combined use of the ATR-FT-IR imaging and low-Z particle EPMA techniques. These two single-particle analytical techniques provide complementary information on the same individual particles: ATR-FT-IR imaging on mineral types, and low-Z particle EPMA on morphology and elemental concentrations. This work demonstrates that the combined use of the two single-particle analytical techniques can characterize externally heterogeneous mineral particle samples in detail and has great potential for the characterization of airborne mineral dust particles.

  15. Distributed Simulation as a modelling tool for the development of a simulation-based training programme for cardiovascular specialties.

    PubMed

    Kelay, Tanika; Chan, Kah Leong; Ako, Emmanuel; Yasin, Mohammad; Costopoulos, Charis; Gold, Matthew; Kneebone, Roger K; Malik, Iqbal S; Bello, Fernando

    2017-01-01

    Distributed Simulation is the concept of portable, high-fidelity immersive simulation. Here, it is used for the development of a simulation-based training programme for cardiovascular specialities. We present an evidence base for how accessible, portable and self-contained simulated environments can be effectively utilised for the modelling, development and testing of a complex training framework and assessment methodology. Iterative user feedback through mixed-methods evaluation techniques resulted in the implementation of the training programme. Four phases were involved in the development of our immersive simulation-based training programme: (1) initial conceptual stage for mapping structural criteria and parameters of the simulation training framework and scenario development (n = 16), (2) training facility design using Distributed Simulation, (3) test cases with clinicians (n = 8) and collaborative design, where evaluation and user feedback involved a mixed-methods approach featuring (a) quantitative surveys to evaluate the realism and perceived educational relevance of the simulation format and framework for training and (b) qualitative semi-structured interviews to capture detailed feedback including changes and scope for development. Refinements were made iteratively to the simulation framework based on user feedback, resulting in (4) transition towards implementation of the simulation training framework, involving consistent quantitative evaluation techniques for clinicians (n = 62). For comparative purposes, clinicians' initial quantitative mean evaluation scores for realism of the simulation training framework, realism of the training facility and relevance for training (n = 8) are presented longitudinally, alongside feedback throughout the development stages from concept to delivery, including the implementation stage (n = 62). Initially, mean evaluation scores fluctuated from low to average, rising incrementally.
This corresponded with the qualitative component, which augmented the quantitative findings; trainees' user feedback was used to perform iterative refinements to the simulation design and components (collaborative design), resulting in higher mean evaluation scores leading up to the implementation phase. Through application of innovative Distributed Simulation techniques, collaborative design, and consistent evaluation techniques from conceptual, development, and implementation stages, fully immersive simulation techniques for cardiovascular specialities are achievable and have the potential to be implemented more broadly.

  16. Characterization of Minerals of Geochronological Interest by EPMA and Atom Probe Tomography

    NASA Astrophysics Data System (ADS)

    Snoeyenbos, D.; Jercinovic, M. J.; Reinhard, D. A.; Hombourger, C.

    2012-12-01

    Isotopic and chemical dating techniques for zircon and monazite rely on several assumptions: that initial common Pb is low to nonexistent, that the analyzed domain is chronologically homogeneous, and that any relative migration of radiogenic Pb and its parent isotopes has not exceeded the analyzed domain. Yet, both zircon and monazite commonly contain significant submicron heterogeneities that may challenge these assumptions and can complicate the interpretation of chemical and isotopic data. Compositional mapping and submicron quantitative analysis by EPMA and FE-EPMA have been found to be useful techniques both for the characterization of these heterogeneities, and for quantitative geochronological determinations within the analytical limits of these techniques and the statistics of submicron sampling. Complementary to high-resolution EPMA techniques is Atom Probe Tomography (APT), wherein a specimen with dimensions of a few hundreds of nanometers is field evaporated atom by atom. The original position of each atom is identified, along with its atomic species and isotope. The result is a reconstruction allowing quantitative three-dimensional study of the specimen at the atomic scale, with low detection limits and high mass resolution. With the introduction of laser-induced thermal pulsing to achieve field evaporation, the technique is no longer limited to conductive specimens. There exists the capability to explore the compositional and isotopic structure of insulating materials at sub-nanometer resolution. Minerals of geochronological interest have been studied by an analytical method involving first compositional mapping and submicron quantitative analysis by EPMA and FE-EPMA, and subsequent use of these data to select specific sites for APT specimen extraction by FIB. 
Examples presented include (1) zircon from the Taconian of New England, USA, containing a fossil resorption front included between an unmodified igneous core and a subsequent metamorphic overgrowth, with significant redistribution of U, Th, P and Y along microfracture arrays extending into the overgrowth, and (2) Paleoproterozoic monazite in thin bands <1 μm wide along cleavage planes within much older (Neoarchean) monazite from the Boothia mainland of the Western Churchill Province, Canada.

  17. Multistage, multiseasonal and multiband imagery to identify and qualify non-forest vegetation resources

    NASA Technical Reports Server (NTRS)

    Driscoll, R. S.; Francis, R. E.

    1970-01-01

    A description of space and supporting aircraft photography for the interpretation and analyses of non-forest (shrubby and herbaceous) native vegetation is presented. The research includes the development of a multiple sampling technique to assign quantitative area values of specific plant community types included within an assigned space photograph map unit. Also, investigations of aerial film type, scale, and season of photography for identification and quantity measures of shrubby and herbaceous vegetation were conducted. Some work was done to develop automated interpretation techniques with film image density measurement devices.

  18. Challenges and perspectives in quantitative NMR.

    PubMed

    Giraudeau, Patrick

    2017-01-01

    This perspective article summarizes, from the author's point of view at the beginning of 2016, the major challenges and perspectives in the field of quantitative NMR. The key concepts in quantitative NMR are first summarized; then, the most recent evolutions in terms of resolution and sensitivity are discussed, as well as some potential future research directions in this field. A particular focus is placed on methodologies capable of boosting the resolution and sensitivity of quantitative NMR, which could open application perspectives in fields where the sample complexity and the analyte concentrations are particularly challenging. These include multi-dimensional quantitative NMR and hyperpolarization techniques such as para-hydrogen-induced polarization or dynamic nuclear polarization. Because quantitative NMR cannot be dissociated from the key concepts of analytical chemistry, i.e. trueness and precision, the methodological developments are systematically described together with their level of analytical performance. Copyright © 2016 John Wiley & Sons, Ltd.

  19. Lunar mineral feedstocks from rocks and soils: X-ray digital imaging in resource evaluation

    NASA Technical Reports Server (NTRS)

    Chambers, John G.; Patchen, Allan; Taylor, Lawrence A.; Higgins, Stefan J.; Mckay, David S.

    1994-01-01

    The rocks and soils of the Moon provide raw materials essential to the successful establishment of a lunar base. Efficient exploitation of these resources requires accurate characterization of mineral abundances, sizes/shapes, and association of 'ore' and 'gangue' phases, as well as the technology to generate high-yield/high-grade feedstocks. Only recently have x-ray mapping and digital imaging techniques been applied to lunar resource evaluation. The topics covered include inherent differences between lunar basalts and soils and quantitative comparison of rock-derived and soil-derived ilmenite concentrates. It is concluded that x-ray digital-imaging characterization of lunar raw materials provides a quantitative comparison that is unattainable by traditional petrographic techniques. These data are necessary for accurately determining mineral distributions of soil and crushed rock material. Application of these techniques will provide an important link to choosing the best raw material for mineral beneficiation.

  20. Change analysis in the United Arab Emirates: An investigation of techniques

    USGS Publications Warehouse

    Sohl, Terry L.

    1999-01-01

    Much of the landscape of the United Arab Emirates has been transformed over the past 15 years by massive afforestation, beautification, and agricultural programs. The "greening" of the United Arab Emirates has had environmental consequences, however, including degraded groundwater quality and possible damage to natural regional ecosystems. Personnel from the Ground-Water Research project, a joint effort between the National Drilling Company of the Abu Dhabi Emirate and the U.S. Geological Survey, were interested in studying landscape change in the Abu Dhabi Emirate using Landsat thematic mapper (TM) data. The EROS Data Center in Sioux Falls, South Dakota, was asked to investigate land-cover change techniques that (1) provided locational, quantitative, and qualitative information on land-cover change within the Abu Dhabi Emirate; and (2) could be easily implemented by project personnel who were relatively inexperienced in remote sensing. A number of products were created with 1987 and 1996 Landsat TM data using change-detection techniques, including univariate image differencing, an "enhanced" image differencing, vegetation index differencing, post-classification differencing, and change-vector analysis. The different techniques provided products that varied in levels of adequacy according to the specific application and the ease of implementation and interpretation. Specific quantitative values of change were most accurately and easily provided by the enhanced image-differencing technique, while the change-vector analysis excelled at providing rich qualitative detail about the nature of a change.
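    As a toy illustration of the image-differencing family of techniques discussed above, the sketch below flags "change" pixels whose date-to-date brightness difference deviates strongly from the mean difference. The pixel values, the flat band layout, and the k = 1.5 threshold are all invented for illustration and are not from the EROS study.

```python
# Univariate image differencing for change detection: threshold the
# difference image at mean +/- k standard deviations (a common convention,
# not necessarily the exact rule used in the study).
import statistics

def difference_change_map(band_t1, band_t2, k=1.5):
    """Flag pixels whose brightness change deviates from the mean change
    by more than k standard deviations (1 = change, 0 = no change)."""
    diff = [b2 - b1 for b1, b2 in zip(band_t1, band_t2)]
    mu = statistics.fmean(diff)
    sigma = statistics.pstdev(diff)
    return [1 if abs(d - mu) > k * sigma else 0 for d in diff]

# Toy "1987" and "1996" pixel rows: most are stable, two change strongly.
t1 = [52, 50, 48, 51, 49, 50, 53, 47]
t2 = [53, 49, 95, 50, 50, 12, 52, 48]
print(difference_change_map(t1, t2))  # → [0, 0, 1, 0, 0, 1, 0, 0]
```

    The "enhanced" differencing and change-vector techniques mentioned in the abstract build on this same difference image, adding normalization and multi-band direction/magnitude information.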

  1. Mapping of thermal injury in biologic tissues using quantitative pathologic techniques

    NASA Astrophysics Data System (ADS)

    Thomsen, Sharon L.

    1999-05-01

    Qualitative and quantitative pathologic techniques can be used for (1) mapping of thermal injury, (2) comparisons of lesion sizes and configurations for different instruments or heating sources, and (3) comparisons of treatment effects. Concentric zones of thermal damage form around a single volume heat source. The boundaries between some of these zones are distinct and measurable. Depending on the energy deposition, heating times and tissue type, the zones can include the following, beginning at the hotter center and progressing to the cooler periphery: (1) tissue ablation, (2) carbonization, (3) tissue water vaporization, (4) structural protein denaturation (thermal coagulation), (5) vital enzyme protein denaturation, (6) cell membrane disruption, (7) hemorrhage, hemostasis and hyperemia, (8) tissue necrosis and (9) wound organization and healing.

  2. Separation techniques: Chromatography

    PubMed Central

    Coskun, Ozlem

    2016-01-01

    Chromatography is an important biophysical technique that enables the separation, identification, and purification of the components of a mixture for qualitative and quantitative analysis. Proteins can be purified based on characteristics such as size and shape, total charge, hydrophobic groups present on the surface, and binding capacity with the stationary phase. Four separation techniques based on molecular characteristics and interaction type use mechanisms of ion exchange, surface adsorption, partition, and size exclusion. Other chromatography techniques are based on the stationary bed, including column, thin layer, and paper chromatography. Column chromatography is one of the most common methods of protein purification. PMID:28058406

  3. Measurements of morphology and refractive indexes on human downy hairs using three-dimensional quantitative phase imaging.

    PubMed

    Lee, SangYun; Kim, Kyoohyun; Lee, Yuhyun; Park, Sungjin; Shin, Heejae; Yang, Jongwon; Ko, Kwanhong; Park, HyunJoo; Park, YongKeun

    2015-01-01

    We present optical measurements of morphology and refractive indexes (RIs) of human downy arm hairs using three-dimensional (3-D) quantitative phase imaging techniques. 3-D RI tomograms and high-resolution two-dimensional synthetic aperture images of individual downy arm hairs were measured using a Mach–Zehnder laser interferometric microscope equipped with a two-axis galvanometer mirror. From the measured quantitative images, the RIs and morphological parameters of downy hairs were noninvasively quantified, including the mean RI, volume, cylinder, and effective radius of individual hairs. In addition, the effects of hydrogen peroxide on individual downy hairs were investigated.
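    Quantitative phase images encode optical path delay, which couples thickness and RI. As a hedged sketch (not the authors' tomographic reconstruction, which measures RI directly), the standard single-wavelength conversion from measured phase to physical thickness under an assumed RI contrast looks like this; all numbers are illustrative.

```python
# Phase-to-thickness conversion for quantitative phase imaging:
# h = phi * lambda / (2*pi*(n_sample - n_medium)), valid when the RI of
# sample and medium are known (an assumption; tomography avoids it).
import math

def thickness_from_phase(phase_rad, wavelength_nm, n_sample, n_medium):
    """Physical thickness (nm) from a measured phase delay (radians)."""
    return phase_rad * wavelength_nm / (2 * math.pi * (n_sample - n_medium))

# Illustrative values: 4 rad delay at 633 nm, hair-like RI in water-like medium.
h = thickness_from_phase(phase_rad=4.0, wavelength_nm=633,
                         n_sample=1.55, n_medium=1.33)
print(round(h), "nm")
```

    The inverse use of the same relation, measuring thickness independently to recover RI, is one reason decoupling techniques such as tomographic phase microscopy are valuable.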

  4. Calibration-free quantitative analysis of elemental ratios in intermetallic nanoalloys and nanocomposites using Laser Induced Breakdown Spectroscopy (LIBS).

    PubMed

    Davari, Seyyed Ali; Hu, Sheng; Mukherjee, Dibyendu

    2017-03-01

    Intermetallic nanoalloys (NAs) and nanocomposites (NCs) have increasingly gained prominence as efficient catalytic materials in electrochemical energy conversion and storage systems. But their morphology and chemical compositions play a critical role in tuning their catalytic activities and precious metal contents. While advanced microscopy techniques facilitate morphological characterizations, traditional chemical characterizations are either qualitative or extremely involved. In this study, we apply Laser Induced Breakdown Spectroscopy (LIBS) for quantitative compositional analysis of NAs and NCs synthesized with varied elemental ratios by our in-house built pulsed laser ablation technique. Specifically, elemental ratios of binary PtNi, PdCo (NAs) and PtCo (NCs) of different compositions are determined from LIBS measurements employing an internal calibration scheme using the bulk matrix species as internal standards. Morphology and qualitative elemental compositions of the aforesaid NAs and NCs are confirmed from Transmission Electron Microscopy (TEM) images and Energy Dispersive X-ray Spectroscopy (EDX) measurements. LIBS experiments are carried out in ambient conditions with the NA and NC samples drop cast on silicon wafers after centrifugation to increase their concentrations. The technique does not call for cumbersome sample preparations including acid digestions and external calibration standards commonly required in Inductively Coupled Plasma-Optical Emission Spectroscopy (ICP-OES) techniques. Yet the quantitative LIBS results are in good agreement with the results from ICP-OES measurements. Our results indicate the feasibility of using LIBS in future for rapid and in-situ quantitative chemical characterizations of wide classes of synthesized NAs and NCs. Copyright © 2016 Elsevier B.V. All rights reserved.
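    The internal calibration scheme described above can be caricatured as a simple intensity-ratio correction: a reference sample of known composition fixes a response factor that is then applied to unknowns. The line intensities and the 1:1 PtNi reference below are invented; real LIBS work must also select interference-free emission lines and average over many laser shots.

```python
# Internal-standard intensity-ratio estimate of an elemental ratio, the
# basic idea behind internal calibration in LIBS. Values are illustrative.
def elemental_ratio(i_analyte, i_standard, response_factor=1.0):
    """Estimate a concentration ratio from emission-line intensities,
    assuming a linear response corrected by a single response factor."""
    return response_factor * i_analyte / i_standard

# A known 1:1 PtNi reference gives intensities 200 (Pt line) and 250 (Ni
# line), so the response factor is 250/200 = 1.25.
f = 250 / 200
unknown = elemental_ratio(300, 250, f)  # intensities from an unknown sample
print(round(unknown, 2))  # estimated Pt:Ni ratio
```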

  5. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  6. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.
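    At its core, isobaric-label quantification compares reporter-ion intensities across channels for each identified peptide. A minimal sketch with invented four-plex intensities, omitting the isotopic-impurity correction and cross-PSM normalization a real pipeline applies:

```python
# Relative abundance of one peptide across isobaric channels: each labeled
# sample contributes one reporter-ion intensity in the MS/MS spectrum.
def relative_abundance(reporter_intensities):
    """Normalize reporter-ion intensities to fractions of the total."""
    total = sum(reporter_intensities)
    return [i / total for i in reporter_intensities]

# Hypothetical four-plex intensities for one peptide (e.g. two tumor and
# two normal channels).
channels = [1.0e5, 1.2e5, 2.4e5, 1.4e5]
print([round(r, 2) for r in relative_abundance(channels)])
# → [0.17, 0.2, 0.4, 0.23]
```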

  7. Quantitative magnetic resonance (MR) neurography for evaluation of peripheral nerves and plexus injuries

    PubMed Central

    Barousse, Rafael; Socolovsky, Mariano; Luna, Antonio

    2017-01-01

    Traumatic conditions of peripheral nerves and plexus have been classically evaluated by morphological imaging techniques and electrophysiological tests. New magnetic resonance imaging (MRI) studies based on 3D fat-suppressed techniques are providing high accuracy for peripheral nerve injury evaluation from a qualitative point of view. However, these techniques do not provide quantitative information. Diffusion weighted imaging (DWI) and diffusion tensor imaging (DTI) are functional MRI techniques that are able to evaluate and quantify the movement of water molecules within different biological structures. These techniques have been successfully applied in other anatomical areas, especially in the assessment of the central nervous system, and are now being applied to peripheral nerve and plexus evaluation, with promising results. DWI and DTI allow qualitative and quantitative peripheral nerve analysis, providing valuable pathophysiological information about the functional integrity of these structures. In the field of trauma and peripheral nerve or plexus injury, several parameters derived from DWI and DTI studies, such as the apparent diffusion coefficient (ADC) or fractional anisotropy (FA) among others, can be used as potential biomarkers of neural damage, providing information about fiber organization, axonal flow or myelin integrity. A proper knowledge of the physical basis of these techniques and their limitations is important for an optimal interpretation of the imaging findings and derived data. In this paper, a comprehensive review of the potential applications of DWI and DTI neurographic studies is performed with a focus on traumatic conditions, including main nerve entrapment syndromes in both peripheral nerves and the brachial or lumbar plexus. PMID:28932698
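    The DTI scalars named above follow standard formulas once the diffusion tensor's eigenvalues are known. A sketch with illustrative eigenvalues (in units of 10^-3 mm^2/s); mean diffusivity plays the role of the ADC here:

```python
# Mean diffusivity (MD) and fractional anisotropy (FA) from the three
# eigenvalues of a diffusion tensor, using the standard definitions:
#   MD = (l1+l2+l3)/3
#   FA = sqrt(3/2 * sum((li-MD)^2) / sum(li^2))
import math

def md_fa(l1, l2, l3):
    md = (l1 + l2 + l3) / 3
    num = (l1 - md) ** 2 + (l2 - md) ** 2 + (l3 - md) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(1.5 * num / den)
    return md, fa

# Elongated tensor, e.g. diffusion along an intact nerve fiber bundle.
md, fa = md_fa(1.7, 0.3, 0.2)
print(round(md, 3), round(fa, 3))
```

    FA near 1 indicates strongly directional diffusion (organized fibers); values falling toward 0 after injury are the kind of biomarker change the abstract refers to.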

  8. Characterizing nonconstant instrumental variance in emerging miniaturized analytical techniques.

    PubMed

    Noblitt, Scott D; Berg, Kathleen E; Cate, David M; Henry, Charles S

    2016-04-07

    Measurement variance is a crucial aspect of quantitative chemical analysis. Variance directly affects important analytical figures of merit, including detection limit, quantitation limit, and confidence intervals. Most reported analyses for emerging analytical techniques implicitly assume constant variance (homoskedasticity) by using unweighted regression calibrations. Despite the assumption of constant variance, it is known that most instruments exhibit heteroskedasticity, where variance changes with signal intensity. Ignoring nonconstant variance results in suboptimal calibrations, invalid uncertainty estimates, and incorrect detection limits. Three techniques where homoskedasticity is often assumed were covered in this work to evaluate if heteroskedasticity had a significant quantitative impact: naked-eye, distance-based detection using paper-based analytical devices (PADs), cathodic stripping voltammetry (CSV) with disposable carbon-ink electrode devices, and microchip electrophoresis (MCE) with conductivity detection. Despite these techniques representing a wide range of chemistries and precision, heteroskedastic behavior was confirmed for each. The general variance forms were analyzed, and recommendations for accounting for nonconstant variance are discussed. Monte Carlo simulations of instrument responses were performed to quantify the benefits of weighted regression, and the sensitivity to uncertainty in the variance function was tested. Results show that heteroskedasticity should be considered during development of new techniques; even moderate uncertainty (30%) in the variance function still results in weighted regression outperforming unweighted regressions. We recommend utilizing the power model of variance because it is easy to apply, requires little additional experimentation, and produces higher-precision results and more reliable uncertainty estimates than assuming homoskedasticity. Copyright © 2016 Elsevier B.V. All rights reserved.
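    The recommendation to use the power model of variance can be sketched as a weighted least-squares calibration where each point is weighted by 1/y^p. The calibration data and the exponent p below are synthetic; in practice p is estimated from replicate measurements at several concentrations.

```python
# Weighted linear calibration fit y = a + b*x with power-model weights
# w = 1/y^p (variance grows with signal). Closed-form WLS solution.
def weighted_linear_fit(x, y, p=2.0):
    w = [1.0 / (yi ** p) for yi in y]
    sw = sum(w)
    xbar = sum(wi * xi for wi, xi in zip(w, x)) / sw
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sw
    b = sum(wi * (xi - xbar) * (yi - ybar) for wi, xi, yi in zip(w, x, y)) \
        / sum(wi * (xi - xbar) ** 2 for wi, xi in zip(w, x))
    a = ybar - b * xbar
    return a, b

# Synthetic calibration standards: signal grows linearly (true slope ~2),
# noise grows with signal.
conc = [1, 2, 5, 10, 20]
signal = [2.1, 3.9, 10.3, 19.5, 41.0]
a, b = weighted_linear_fit(conc, signal)
print(round(a, 2), round(b, 2))
```

    Compared with an unweighted fit, this down-weights the noisy high-signal points, which is what improves precision near the detection limit.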

  9. The diagnostic capability of laser induced fluorescence in the characterization of excised breast tissues

    NASA Astrophysics Data System (ADS)

    Galmed, A. H.; Elshemey, Wael M.

    2017-08-01

    Differentiating between normal, benign and malignant excised breast tissues is one of the major worldwide challenges that need a quantitative, fast and reliable technique in order to avoid personal errors in diagnosis. Laser induced fluorescence (LIF) is a promising technique that has been applied for the characterization of biological tissues including breast tissue. Unfortunately, only a few studies have adopted a quantitative approach that can be directly applied for breast tissue characterization. This work provides a quantitative means for such characterization via the introduction of several LIF characterization parameters and determination of the diagnostic accuracy of each parameter in the differentiation between normal, benign and malignant excised breast tissues. Extensive analysis of 41 lyophilized breast samples using scatter diagrams, cut-off values, diagnostic indices and receiver operating characteristic (ROC) curves shows that some spectral parameters (peak height and area under the peak) are superior for characterization of normal, benign and malignant breast tissues, with high sensitivity (up to 0.91), specificity (up to 0.91) and accuracy.
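    For a single threshold, the cut-off based diagnostics described above reduce to counting true positives and true negatives. A toy sketch with invented peak-height values (not the study's 41 samples):

```python
# Sensitivity and specificity of a spectral parameter (e.g. LIF peak
# height) at a chosen cut-off: a sample is called "malignant" when its
# parameter exceeds the cut-off.
def sensitivity_specificity(values_diseased, values_normal, cutoff):
    tp = sum(v > cutoff for v in values_diseased)   # true positives
    tn = sum(v <= cutoff for v in values_normal)    # true negatives
    return tp / len(values_diseased), tn / len(values_normal)

malignant = [8.2, 9.1, 7.5, 8.8, 6.9]  # hypothetical peak heights
normal = [4.1, 5.0, 3.8, 6.2, 4.6]
sens, spec = sensitivity_specificity(malignant, normal, cutoff=7.0)
print(sens, spec)  # → 0.8 1.0
```

    Sweeping the cut-off and plotting sensitivity against (1 - specificity) yields the ROC curve used in the abstract to rank parameters.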

  10. Indicators of Family Care for Development for Use in Multicountry Surveys

    PubMed Central

    Kariger, Patricia; Engle, Patrice; Britto, Pia M. Rebello; Sywulka, Sara M.; Menon, Purnima

    2012-01-01

    Indicators of family care for development are essential for ascertaining whether families are providing their children with an environment that leads to positive developmental outcomes. This project aimed to develop indicators from a set of items, measuring family care practices and resources important for caregiving, for use in epidemiologic surveys in developing countries. A mixed method (quantitative and qualitative) design was used for item selection and evaluation. Qualitative and quantitative analyses were conducted to examine the validity of candidate items in several country samples. Qualitative methods included the use of global expert panels to identify and evaluate the performance of each candidate item as well as in-country focus groups to test the content validity of the items. The quantitative methods included analyses of item-response distributions, using bivariate techniques. The selected items measured two family care practices (support for learning/stimulating environment and limit-setting techniques) and caregiving resources (adequacy of the alternate caregiver when the mother worked). Six play-activity items, indicative of support for learning/stimulating environment, were included in the core module of UNICEF's Multiple Indicator Cluster Survey 3. The other items were included in optional modules. This project provided, for the first time, a globally relevant set of items for assessing family care practices and resources in epidemiological surveys. These items have multiple uses, including national monitoring and cross-country comparisons of the status of family care for development. The obtained information will reinforce attention to efforts to improve support for the development of children. PMID:23304914

  11. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph

    PubMed Central

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation. PMID:26909045

  12. Quantitative Assessment of Heart Rate Dynamics during Meditation: An ECG Based Study with Multi-Fractality and Visibility Graph.

    PubMed

    Bhaduri, Anirban; Ghosh, Dipak

    2016-01-01

    The cardiac dynamics during meditation is explored quantitatively with two chaos-based non-linear techniques viz. multi-fractal detrended fluctuation analysis and visibility network analysis techniques. The data used are the instantaneous heart rate (in beats/minute) of subjects performing Kundalini Yoga and Chi meditation from PhysioNet. The results show consistent differences between the quantitative parameters obtained by both the analysis techniques. This indicates an interesting phenomenon of change in the complexity of the cardiac dynamics during meditation supported with quantitative parameters. The results also provide preliminary evidence that these techniques can be used as a measure of physiological impact on subjects performing meditation.
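    The natural visibility graph used in this kind of analysis connects two samples of the series whenever the straight line between them passes above every intermediate sample; network properties of the resulting graph then quantify complexity. A minimal sketch on a synthetic heart-rate fragment:

```python
# Natural visibility graph of a time series: nodes are samples, and an
# edge (a, b) exists when no intermediate sample blocks the line of sight.
def visibility_edges(series):
    n = len(series)
    edges = set()
    for a in range(n):
        for b in range(a + 1, n):
            visible = all(
                series[c] < series[a]
                + (series[b] - series[a]) * (c - a) / (b - a)
                for c in range(a + 1, b)
            )
            if visible:
                edges.add((a, b))
    return edges

hr = [72, 75, 70, 78, 74]  # toy heart-rate samples (beats/minute)
print(sorted(visibility_edges(hr)))
# → [(0, 1), (1, 2), (1, 3), (2, 3), (3, 4)]
```

    Degree distributions of such graphs distinguish periodic, random and fractal series, which is what makes them useful as a complexity measure here.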

  13. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling". Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation not by statistical power analysis.
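    The point that a larger sample increases statistical power can be illustrated numerically with a one-sided z-test approximation. The effect size and significance level below are assumptions for illustration, not a substitute for a formal power analysis:

```python
# Approximate power of a one-sided z-test: power = Phi(d*sqrt(n) - z_alpha),
# where d is the standardized effect size and z_alpha the critical value.
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def power_one_sided(effect_size, n, z_alpha=1.6449):
    # 1.6449 is the one-sided 5% critical value of the standard normal.
    return phi(effect_size * math.sqrt(n) - z_alpha)

for n in (20, 50, 100):
    print(n, round(power_one_sided(0.4, n), 3))
```

    With an assumed effect size of 0.4, power climbs steadily as n grows, which is the behavior the abstract describes for convenience samples.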

  14. Assessment of cleaning and disinfection in Salmonella-contaminated poultry layer houses using qualitative and semi-quantitative culture techniques.

    PubMed

    Wales, Andrew; Breslin, Mark; Davies, Robert

    2006-09-10

    Salmonella infection of laying flocks in the UK is predominantly a problem of the persistent contamination of layer houses and associated wildlife vectors by Salmonella Enteritidis. Methods for its control and elimination include effective cleaning and disinfection of layer houses between flocks, and it is important to be able to measure the success of such decontamination. A method for the environmental detection and semi-quantitative enumeration of salmonellae was used and compared with a standard qualitative method, in 12 Salmonella-contaminated caged layer houses before and after cleaning and disinfection. The quantitative technique proved to have comparable sensitivity to the standard method, and additionally provided insights into the numerical Salmonella challenge that replacement flocks would encounter. Elimination of S. Enteritidis was not achieved in any of the premises examined; substantial reductions in the prevalence and numbers of salmonellae were demonstrated in some houses, whilst in others an increase in contamination was observed after cleaning and disinfection. Particular problems with feeders and wildlife vectors were highlighted. The use of a quantitative method assisted the identification of problem areas, such as those with a high initial bacterial load or those experiencing only a modest reduction in bacterial count following decontamination.

  15. Review of progress in quantitative NDE

    NASA Astrophysics Data System (ADS)

    Abstracts of 386 papers and plenary presentations are included. The plenary sessions are related to the national technology initiative. The other sessions covered the following NDE topics: corrosion, electromagnetic arrays, elastic wave scattering and backscattering/noise, civil structures, material properties, holography, shearography, UT wave propagation, eddy currents, coatings, signal processing, radiography, computed tomography, EM imaging, adhesive bonds, NMR, laser ultrasonics, composites, thermal techniques, magnetic measurements, nonlinear acoustics, interface modeling and characterization, UT transducers, new techniques, joined materials, probes and systems, fatigue cracks and fracture, imaging and sizing, NDE in engineering and process control, acoustics of cracks, and sensors. An author index is included.

  16. In vivo confocal microscopy of the cornea: New developments in image acquisition, reconstruction and analysis using the HRT-Rostock Corneal Module

    PubMed Central

    Petroll, W. Matthew; Robertson, Danielle M.

    2015-01-01

    The optical sectioning ability of confocal microscopy allows high magnification images to be obtained from different depths within a thick tissue specimen, and is thus ideally suited to the study of intact tissue in living subjects. In vivo confocal microscopy has been used in a variety of corneal research and clinical applications since its development over 25 years ago. In this article we review the latest developments in quantitative corneal imaging with the Heidelberg Retinal Tomograph with Rostock Corneal Module (HRT-RCM). We provide an overview of the unique strengths and weaknesses of the HRT-RCM. We discuss techniques for performing 3-D imaging with the HRT-RCM, including hardware and software modifications that allow full thickness confocal microscopy through focusing (CMTF) of the cornea, which can provide quantitative measurements of corneal sublayer thicknesses, stromal cell and extracellular matrix backscatter, and depth dependent changes in corneal keratocyte density. We also review current approaches for quantitative imaging of the subbasal nerve plexus, which require a combination of advanced image acquisition and analysis procedures, including wide field mapping and 3-D reconstruction of nerve structures. The development of new hardware, software, and acquisition techniques continues to expand the number of applications of the HRT-RCM for quantitative in vivo corneal imaging at the cellular level. Knowledge of these rapidly evolving strategies should benefit corneal clinicians and basic scientists alike. PMID:25998608
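    The CMTF thickness measurements mentioned above rest on locating intensity peaks in a through-focus scan (e.g. the epithelial surface and the endothelium) and taking the distance between them. A hedged sketch with synthetic depths and intensities; the real method fits the backscatter profile rather than taking raw local maxima:

```python
# Locate local maxima in a through-focus intensity profile and estimate a
# sublayer thickness as the distance between the outermost peaks.
def peak_depths(depths, intensities):
    """Depths of strict local maxima in the intensity profile."""
    return [depths[i] for i in range(1, len(depths) - 1)
            if intensities[i - 1] < intensities[i] > intensities[i + 1]]

depth_um = [0, 10, 20, 30, 40, 50, 60, 70]      # focal positions (um)
signal = [5, 40, 12, 8, 7, 6, 35, 6]            # epithelial and endothelial peaks
peaks = peak_depths(depth_um, signal)
print(peaks, "thickness:", peaks[-1] - peaks[0], "um")
# → [10, 60] thickness: 50 um
```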

  17. Portable low-coherence interferometry for quantitatively imaging fast dynamics with extended field of view

    NASA Astrophysics Data System (ADS)

    Shaked, Natan T.; Girshovitz, Pinhas; Frenklach, Irena

    2014-06-01

    We present our recent advances in the development of compact, highly portable and inexpensive wide-field interferometric modules. By a smart design of the interferometric system, including the use of low-coherence illumination sources and common-path off-axis geometry of the interferometers, spatial and temporal noise levels of the resulting quantitative thickness profile can be sub-nanometric, while the phase profile is processed in real time. In addition, due to novel experimentally implemented multiplexing methods, we can capture low-coherence off-axis interferograms with a significantly extended field of view and at faster acquisition rates. Using these techniques, we quantitatively imaged rapid dynamics of live biological cells including sperm cells and unicellular microorganisms. Then, we demonstrated dynamic profiling during lithography processes of microscopic elements, with thicknesses that may vary from several nanometers to hundreds of microns. Finally, we present new algorithms for fast reconstruction (including digital phase unwrapping) of off-axis interferograms, which allow real-time processing at more than video rate on regular single-core computers.

  18. Continuous EEG monitoring in the intensive care unit.

    PubMed

    Scheuer, Mark L

    2002-01-01

    Continuous EEG (CEEG) monitoring allows uninterrupted assessment of cerebral cortical activity with good spatial resolution and excellent temporal resolution. Thus, this procedure provides a means of constantly assessing brain function in critically ill obtunded and comatose patients. Recent advances in digital EEG acquisition, storage, quantitative analysis, and transmission have made CEEG monitoring in the intensive care unit (ICU) technically feasible and useful. This article summarizes the indications and methodology of CEEG monitoring in the ICU, and discusses the role of some quantitative EEG analysis techniques in near real-time remote observation of CEEG recordings. Clinical examples of CEEG use, including monitoring of status epilepticus, assessment of ongoing therapy for treatment of seizures in critically ill patients, and monitoring for cerebral ischemia, are presented. Areas requiring further development of CEEG monitoring techniques and indications are discussed.
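    One common quantitative EEG measure used in near real-time CEEG review is relative band power. The sketch below computes the alpha-band fraction of total power via a discrete Fourier transform on a synthetic 10 Hz signal; real qEEG adds artifact rejection, windowing, and spectral averaging.

```python
# Relative band power of a sampled signal via a direct DFT (O(n^2), fine
# for a short illustrative epoch; real systems use the FFT).
import cmath, math

def band_power(signal, fs, f_lo, f_hi):
    """Fraction of total spectral power in the [f_lo, f_hi) band."""
    n = len(signal)
    total, band = 0.0, 0.0
    for k in range(1, n // 2):
        x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        p = abs(x) ** 2
        total += p
        if f_lo <= k * fs / n < f_hi:
            band += p
    return band / total

fs = 64
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]  # 1 s of 10 Hz
alpha = band_power(sig, fs, 8, 13)  # alpha band (8-13 Hz) fraction
print(round(alpha, 3))
```

    Trending such band fractions (or ratios like alpha/delta) over hours is one way CEEG systems flag evolving ischemia or seizures for remote review.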

  19. Film/chemistry selection for the earth resources technology satellite /ERTS/ ground data handling system

    NASA Technical Reports Server (NTRS)

    Shaffer, R. M.

    1973-01-01

    A detailed description is given of the methods used to choose the duplication film and chemistry currently used in the NASA-ERTS Ground Data Handling System. The major ERTS photographic duplication goals are given as background information to justify the specifications for the desirable film/chemistry combination. Once these specifications were defined, a quantitative evaluation program was designed and implemented to determine if any recommended combinations could meet the ERTS laboratory specifications. The specifications include tone reproduction, granularity, MTF and cosmetic effects. A complete description of the techniques used to measure the test response variables is given. It is anticipated that similar quantitative techniques could be used on other programs to determine the optimum film/chemistry consistent with the engineering goals of the program.

  20. Multi-modality imaging of tumor phenotype and response to therapy

    NASA Astrophysics Data System (ADS)

    Nyflot, Matthew J.

    2011-12-01

    Imaging and radiation oncology have historically been closely linked. However, the vast majority of techniques used in the clinic involve anatomical imaging. Biological imaging offers the potential for innovation in the areas of cancer diagnosis and staging, radiotherapy target definition, and treatment response assessment. Some relevant imaging techniques are FDG PET (for imaging cellular metabolism), FLT PET (proliferation), CuATSM PET (hypoxia), and contrast-enhanced CT (vasculature and perfusion). Here, a technique for quantitative spatial correlation of tumor phenotype is presented for FDG PET, FLT PET, and CuATSM PET images. Additionally, multimodality imaging of treatment response with FLT PET, CuATSM, and dynamic contrast-enhanced CT is presented, in a trial of patients receiving an antiangiogenic agent (Avastin) combined with cisplatin and radiotherapy. Results are also presented for translational applications in animal models, including quantitative assessment of proliferative response to cetuximab with FLT PET and quantification of vascular volume with a blood-pool contrast agent (Fenestra). These techniques have clear applications to radiobiological research and optimized treatment strategies, and may eventually be used for personalized therapy for patients.

  1. A quantitative analysis of the F18 flight control system

    NASA Technical Reports Server (NTRS)

    Doyle, Stacy A.; Dugan, Joanne B.; Patterson-Hine, Ann

    1993-01-01

    This paper presents an informal quantitative analysis of the F18 flight control system (FCS). The analysis technique combines a coverage model with a fault tree model. To demonstrate the method's extensive capabilities, we replace the fault tree with a digraph model of the F18 FCS, the only model available to us. The substitution shows that while digraphs have primarily been used for qualitative analysis, they can also be used for quantitative analysis. Based on our assumptions and the particular failure rates assigned to the F18 FCS components, we show that coverage does have a significant effect on the system's reliability and thus it is important to include coverage in the reliability analysis.
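The effect of imperfect coverage on reliability can be illustrated with a model far simpler than the paper's digraph: a hypothetical duplex channel in which a covered first fault (probability c) reconfigures to the surviving unit, while an uncovered fault fails the system outright. The failure rate and coverage values below are illustrative assumptions, not the F18 FCS figures.

```python
import math

def duplex_reliability(lam, c, t):
    """Reliability of a duplex (two-unit) channel with imperfect fault
    coverage c: a covered first fault (probability c) reconfigures to the
    surviving unit; an uncovered fault is an immediate system failure.
    Closed form from the underlying two-state Markov model."""
    return 2 * c * math.exp(-lam * t) + (1 - 2 * c) * math.exp(-2 * lam * t)

lam = 1e-4   # per-hour unit failure rate (assumed)
t = 1000.0   # mission time, hours (assumed)
perfect = duplex_reliability(lam, 1.0, t)    # classic parallel redundancy
realistic = duplex_reliability(lam, 0.99, t) # 1% of faults uncovered
```

Even a 1% uncovered-fault probability measurably lowers mission reliability relative to the perfect-coverage parallel model, which is the qualitative point the paper makes.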

  2. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  3. Knowledge Management for the Analysis of Complex Experimentation.

    ERIC Educational Resources Information Center

    Maule, R.; Schacher, G.; Gallup, S.

    2002-01-01

    Describes a knowledge management system that was developed to help provide structure for dynamic and static data and to aid in the analysis of complex experimentation. Topics include quantitative and qualitative data; mining operations using artificial intelligence techniques; information architecture of the system; and transforming data into…

  4. 24 CFR 91.105 - Citizen participation plan; local governments.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... met in the case of public hearings where a significant number of non-English speaking residents can be... encourage the participation of all its citizens, including minorities and non-English speaking persons, as... jurisdiction should also explore alternative public involvement techniques and quantitative ways to measure...

  5. 77 FR 43228 - Agency Information Collection Activities; Proposed Collection; Comment Request-Generic Clearance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-24

    ... using qualitative and possibly quantitative consumer research techniques, which may include focus groups... used during consumer research while testing nutrition education messages and products developed for the general public. The purpose for performing consumer research is to identify consumers' understanding of...

  6. Principles, performance, and applications of spectral reconstitution (SR) in quantitative analysis of oils by Fourier transform infrared spectroscopy (FT-IR).

    PubMed

    García-González, Diego L; Sedman, Jacqueline; van de Voort, Frederik R

    2013-04-01

    Spectral reconstitution (SR) is a dilution technique developed to facilitate the rapid, automated, and quantitative analysis of viscous oil samples by Fourier transform infrared spectroscopy (FT-IR). This technique involves determining the dilution factor through measurement of an absorption band of a suitable spectral marker added to the diluent, and then spectrally removing the diluent from the sample and multiplying the resulting spectrum to compensate for the effect of dilution on the band intensities. The facsimile spectrum of the neat oil thus obtained can then be qualitatively or quantitatively analyzed for the parameter(s) of interest. The quantitative performance of the SR technique was examined with two transition-metal carbonyl complexes as spectral markers, chromium hexacarbonyl and methylcyclopentadienyl manganese tricarbonyl. The estimation of the volume fraction (VF) of the diluent in a model system, consisting of canola oil diluted to various extents with odorless mineral spirits, served as the basis for assessment of these markers. The relationship between the VF estimates and the true volume fraction (VF(t)) was found to be strongly dependent on the dilution ratio and also depended, to a lesser extent, on the spectral resolution. These dependences are attributable to the effect of changes in matrix polarity on the bandwidth of the ν(CO) marker bands. Excellent VF(t) estimates were obtained by making a polarity correction devised with a variance-spectrum-delineated correction equation. In the absence of such a correction, SR was shown to introduce only a minor and constant bias, provided that polarity differences among all the diluted samples analyzed were minimal. This bias can be built into the calibration of a quantitative FT-IR analytical method by subjecting appropriate calibration standards to the same SR procedure as the samples to be analyzed. 
The primary purpose of the SR technique is to simplify the preparation of diluted samples: only approximate proportions need be adhered to, rather than exact weights or volumes, with the marker accounting for minor variations. Additional applications discussed include the use of the SR technique in extraction-based, quantitative, automated FT-IR methods for the determination of moisture, acid number, and base number in lubricating oils, as well as of moisture content in edible oils.
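The core arithmetic of SR can be sketched numerically: estimate the diluent volume fraction from the marker band (assuming it obeys Beer's law, so its height scales linearly with diluent content), spectrally subtract the diluent, and rescale. The toy spectra and band positions below are illustrative, not the paper's data, and the polarity correction described above is omitted.

```python
import numpy as np

def spectral_reconstitution(s_diluted, s_diluent, marker_idx):
    """SR sketch: estimate the diluent volume fraction from the marker
    band, subtract the diluent contribution, and rescale to a facsimile
    of the neat-oil spectrum."""
    vf = s_diluted[marker_idx] / s_diluent[marker_idx]  # volume-fraction estimate
    facsimile = (s_diluted - vf * s_diluent) / (1.0 - vf)
    return facsimile, vf

# Synthetic check: build a "diluted" spectrum from known components.
wavenumbers = np.linspace(650, 4000, 500)
neat = np.exp(-((wavenumbers - 1745) / 40.0) ** 2)           # mock oil band
diluent = 0.8 * np.exp(-((wavenumbers - 1985) / 10.0) ** 2)  # mock marker band
vf_true = 0.5
diluted = vf_true * diluent + (1 - vf_true) * neat

marker_idx = int(np.argmax(diluent))
facsimile, vf_est = spectral_reconstitution(diluted, diluent, marker_idx)
```

On this synthetic example the recovered volume fraction matches the true value of 0.5 and the facsimile reproduces the neat-oil band, because the marker band does not overlap the oil band; real spectra would carry the matrix-polarity effects the paper corrects for.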

  7. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    PubMed Central

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-01-01

    The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137

  8. Anniversary Paper: History and status of CAD and quantitative image analysis: The role of Medical Physics and AAPM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giger, Maryellen L.; Chan, Heang-Ping; Boone, John

    2008-12-15

The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists' goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists--as opposed to a completely automatic computer interpretation--focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous--from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. 
CAD research by medical physicists includes many aspects--collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more--from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis.

  9. "The Math You Need" When Faculty Need It: Enhancing Quantitative Skills at a Broad Spectrum of Higher Education Institutions

    NASA Astrophysics Data System (ADS)

    Baer, E. M.; Wenner, J. M.

    2014-12-01

Implementation of "The Math You Need, When You Need It" (TMYN) modules at a wide variety of institutions suggests a broad need for faculty support in helping students develop quantitative skills necessary in introductory geoscience courses. Designed to support students in applying geoscience-relevant quantitative skills, TMYN modules are web-based, self-paced, and commonly assigned outside of class. They include topics such as calculating slope, rearranging equations, and unit conversions, and provide several applications of the mathematical technique to geoscience problems. Each instructor chooses modules that are applicable to the content in his/her individual course, and students typically work through the module immediately before the module topic is applied in lab or class. Instructors assigned TMYN modules in their courses at more than 40 diverse institutions, including four-year colleges and universities (4YCs), ranging from non-selective to highly selective, and open-door two-year colleges (2YCs). Analysis of module topics assigned, frequency of module use, and institutional characteristics reveals similarities and differences among faculty perceptions of required quantitative skills and incoming student ability at variably selective institutions. Results indicate that institutional type and selectivity are not correlated with module topic; that is, faculty apply similar quantitative skills in all introductory geoscience courses. For example, nearly every instructor assigned the unit conversions module, whereas very few required the trigonometry module. However, differences in number of assigned modules and faculty expectations are observed between 2YCs and 4YCs (no matter the selectivity). Two-year college faculty typically assign a higher number of modules per course, and faculty at 4YCs more often combine portions of multiple modules or cover multiple mathematical concepts in a single assignment. 
These observations suggest that quantitative skills required for introductory geoscience courses are similar among all higher-education institution types. However, faculty at 4YCs may expect students to acquire and apply multiple quantitative skills in the same class/lab, whereas 2YC faculty may structure assignments to introduce and apply only one quantitative technique at a time.

  10. Use of a capillary electrophoresis instrument with laser-induced fluorescence detection for DNA quantitation. Comparison of YO-PRO-1 and PicoGreen assays.

    PubMed

    Guillo, Christelle; Ferrance, Jerome P; Landers, James P

    2006-04-28

Highly selective and sensitive assays are required for detection and quantitation of the small masses of DNA typically encountered in clinical and forensic settings. High detection sensitivity is achieved using fluorescent labeling dyes and detection instruments such as spectrofluorometers, microplate readers, and cytometers. This work describes the use of a laser-induced fluorescence (LIF) detector in conjunction with a commercial capillary electrophoresis instrument for DNA quantitation. PicoGreen and YO-PRO-1, two fluorescent DNA labeling dyes, were used to assess the potential of the system for routine DNA analysis. Linearity, reproducibility, sensitivity, limits of detection and quantitation, and sample stability were examined for the two assays. The LIF detector response was found to be linear (R2 > 0.999) and reproducible (RSD < 9%) in both cases. The PicoGreen assay displayed lower limits of detection and quantitation (20 pg and 60 pg, respectively) than the YO-PRO-1 assay (60 pg and 260 pg, respectively). Although a small variation in fluorescence was observed for the DNA/dye complexes over time, quantitation was not significantly affected and the solutions were found to be relatively stable for 80 min. The advantages of the technique include a 4- to 40-fold reduction in the volume of sample required compared to traditional assays, a 2- to 20-fold reduction in the volume of reagents consumed, fast and automated analysis, and low cost (no specific instrumentation required).
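Limits of detection and quantitation like those above are commonly derived from a linear calibration curve. A minimal sketch with hypothetical calibration data (not the paper's measurements), using the widespread 3.3σ/slope and 10σ/slope conventions:

```python
import statistics

# Hypothetical calibration: DNA mass (pg) vs. fluorescence signal (a.u.)
masses = [0, 100, 200, 400, 800]
signals = [2.1, 52.3, 101.8, 203.9, 404.2]

# Ordinary least-squares fit of signal = intercept + slope * mass.
n = len(masses)
mean_x = sum(masses) / n
mean_y = sum(signals) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(masses, signals)) / \
        sum((x - mean_x) ** 2 for x in masses)
intercept = mean_y - slope * mean_x

# Residual standard deviation stands in for the blank noise sigma.
residuals = [y - (intercept + slope * x) for x, y in zip(masses, signals)]
sigma = statistics.stdev(residuals)

lod = 3.3 * sigma / slope   # limit of detection
loq = 10.0 * sigma / slope  # limit of quantitation
```

With real replicate blanks, sigma would be taken from the blank measurements rather than the fit residuals; the two conventions give the familiar result that LOQ is roughly three times LOD.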

  11. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1978-01-01

    The development of system models that can provide a basis for the formulation and evaluation of aircraft computer system effectiveness, the formulation of quantitative measures of system effectiveness, and the development of analytic and simulation techniques for evaluating the effectiveness of a proposed or existing aircraft computer are described. Specific topics covered include: system models; performability evaluation; capability and functional dependence; computation of trajectory set probabilities; and hierarchical modeling of an air transport mission.

  12. Laser-induced breakdown spectroscopy application in environmental monitoring of water quality: a review.

    PubMed

    Yu, Xiaodong; Li, Yang; Gu, Xiaofeng; Bao, Jiming; Yang, Huizhong; Sun, Li

    2014-12-01

Water quality monitoring is a critical part of environmental management and protection, and the ability to qualitatively and quantitatively determine contamination and impurity levels in water is especially important. Compared to the currently available water quality monitoring methods and techniques, laser-induced breakdown spectroscopy (LIBS) has several advantages, including no need for sample preparation, fast and easy operation, and a chemical-free process. Therefore, it is of great importance to understand the fundamentals of aqueous LIBS analysis and effectively apply this technique to environmental monitoring. This article reviews the research conducted on LIBS analysis for liquid samples, and the article content includes LIBS theory, history and applications, quantitative analysis of metallic species in liquids, LIBS signal enhancement methods and data processing, characteristics of plasma generated by laser in water, and the factors affecting accuracy of analysis results. Although there have been many research works focusing on aqueous LIBS analysis, the detection limit and stability of this technique still need to be improved to satisfy the requirements of environmental monitoring standards. In addition, determination of nonmetallic species in liquid by LIBS is equally important and needs immediate attention from the community. This comprehensive review will assist the readers to better understand the aqueous LIBS technique and help to identify current research needs for environmental monitoring of water quality.

  13. Earthquake prediction evaluation standards applied to the VAN Method

    NASA Astrophysics Data System (ADS)

    Jackson, David D.

Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, and (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2), so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.

  14. Living cell dry mass measurement using quantitative phase imaging with quadriwave lateral shearing interferometry: an accuracy and sensitivity discussion.

    PubMed

    Aknoun, Sherazade; Savatier, Julien; Bon, Pierre; Galland, Frédéric; Abdeladim, Lamiae; Wattellier, Benoit; Monneret, Serge

    2015-01-01

Single-cell dry mass measurement is used in biology to follow the cell cycle, to address effects of drugs, or to investigate cell metabolism. Quantitative phase imaging with quadriwave lateral shearing interferometry (QWLSI) allows cell dry mass to be measured. The technique is very simple to set up, as it is integrated in a camera-like instrument that simply plugs onto a standard microscope and uses a white-light illumination source. Its working principle is first explained, from image acquisition to the automated segmentation algorithm and dry mass quantification. The metrology of the whole process, including its sensitivity, repeatability, reliability, and sources of error over different kinds of samples and under different experimental conditions, is developed. We show that magnification and spatial light coherence have no influence on dry mass measurement; the effect of defocus is more critical but can be calibrated. As a consequence, QWLSI is a well-suited technique for fast, simple, and reliable cell dry mass study, especially for live cells.
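The dry mass computation underlying such measurements follows a standard relation: mass is the integral of the optical path difference (OPD) over the cell area, divided by the specific refraction increment α, typically taken as 0.18-0.21 µm³/pg. The sketch below uses a synthetic OPD map and an assumed α, not the paper's instrument data.

```python
import numpy as np

ALPHA = 0.18  # specific refraction increment, um^3/pg (typical literature value)

def dry_mass_pg(opd_um, pixel_area_um2):
    """Dry mass (pg) from an OPD map (um) sampled on pixels of the given
    area (um^2): m = (1/alpha) * integral of OPD over the cell."""
    return opd_um.sum() * pixel_area_um2 / ALPHA

# Synthetic OPD map: a cell modeled as a Gaussian phase bump.
yy, xx = np.mgrid[0:128, 0:128]
opd = 0.1 * np.exp(-((xx - 64) ** 2 + (yy - 64) ** 2) / (2 * 15.0 ** 2))  # um
mass = dry_mass_pg(opd, pixel_area_um2=0.25)
```

In practice the OPD map comes from the QWLSI phase measurement and the sum is restricted to the segmented cell region; here the whole synthetic map is integrated for simplicity.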

  15. Assessment of simulation fidelity using measurements of piloting technique in flight

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Cleveland, W. B.; Key, D. L.

    1984-01-01

    The U.S. Army and NASA joined together on a project to conduct a systematic investigation and validation of a ground based piloted simulation of the Army/Sikorsky UH-60A helicopter. Flight testing was an integral part of the validation effort. Nap-of-the-Earth (NOE) piloting tasks which were investigated included the bob-up, the hover turn, the dash/quickstop, the sidestep, the dolphin, and the slalom. Results from the simulation indicate that the pilot's NOE task performance in the simulator is noticeably and quantifiably degraded when compared with the task performance results generated in flight test. The results of the flight test and ground based simulation experiments support a unique rationale for the assessment of simulation fidelity: flight simulation fidelity should be judged quantitatively by measuring pilot's control strategy and technique as induced by the simulator. A quantitative comparison is offered between the piloting technique observed in a flight simulator and that observed in flight test for the same tasks performed by the same pilots.

  16. Developing High-Frequency Quantitative Ultrasound Techniques to Characterize Three-Dimensional Engineered Tissues

    NASA Astrophysics Data System (ADS)

    Mercado, Karla Patricia E.

    Tissue engineering holds great promise for the repair or replacement of native tissues and organs. Further advancements in the fabrication of functional engineered tissues are partly dependent on developing new and improved technologies to monitor the properties of engineered tissues volumetrically, quantitatively, noninvasively, and nondestructively over time. Currently, engineered tissues are evaluated during fabrication using histology, biochemical assays, and direct mechanical tests. However, these techniques destroy tissue samples and, therefore, lack the capability for real-time, longitudinal monitoring. The research reported in this thesis developed nondestructive, noninvasive approaches to characterize the structural, biological, and mechanical properties of 3-D engineered tissues using high-frequency quantitative ultrasound and elastography technologies. A quantitative ultrasound technique, using a system-independent parameter known as the integrated backscatter coefficient (IBC), was employed to visualize and quantify structural properties of engineered tissues. Specifically, the IBC was demonstrated to estimate cell concentration and quantitatively detect differences in the microstructure of 3-D collagen hydrogels. Additionally, the feasibility of an ultrasound elastography technique called Single Tracking Location Acoustic Radiation Force Impulse (STL-ARFI) imaging was demonstrated for estimating the shear moduli of 3-D engineered tissues. High-frequency ultrasound techniques can be easily integrated into sterile environments necessary for tissue engineering. Furthermore, these high-frequency quantitative ultrasound techniques can enable noninvasive, volumetric characterization of the structural, biological, and mechanical properties of engineered tissues during fabrication and post-implantation.
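A system-independent backscatter parameter such as the IBC is typically estimated by normalizing the tissue's echo power spectrum with a reference spectrum to cancel the system response, then averaging over the analysis bandwidth. The sketch below uses white-noise stand-ins for gated RF lines; the sampling rate, bandwidth, and normalization scheme are illustrative assumptions, not the thesis's calibration procedure.

```python
import numpy as np

def integrated_backscatter(rf_sample, rf_reference, fs, band):
    """Average the power spectra of gated RF lines, divide by a reference
    power spectrum to cancel system effects, and average the ratio over
    the analysis bandwidth (a simplified system-independent estimate)."""
    f = np.fft.rfftfreq(rf_sample.shape[-1], d=1.0 / fs)
    ps_sample = np.mean(np.abs(np.fft.rfft(rf_sample, axis=-1)) ** 2, axis=0)
    ps_ref = np.mean(np.abs(np.fft.rfft(rf_reference, axis=-1)) ** 2, axis=0)
    mask = (f >= band[0]) & (f <= band[1])
    return np.mean(ps_sample[mask] / ps_ref[mask])

rng = np.random.default_rng(0)
fs = 250e6                                   # sampling rate, Hz (assumed)
ref = rng.standard_normal((64, 512))         # 64 gated reference lines
weak = 0.5 * rng.standard_normal((64, 512))  # weaker-scattering "tissue"
ibc = integrated_backscatter(weak, ref, fs, band=(20e6, 60e6))
```

The weaker-scattering sample has one quarter the echo power of the reference, and the normalized, band-averaged estimate recovers that ratio independent of the (shared) system response.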

  17. Social Competence in the Preschool: A Multivariate View.

    ERIC Educational Resources Information Center

    Connolly, Jennifer; Doyle, Anna-Beth

    This study was designed to provide additional understanding of the construct of social competence by using multiple assessments, including both behavioral and inferential techniques. Indices of qualitative social behaviors and of quantitative interaction dimensions were collected on 66 preschoolers during free play. Scores on the Kohn and Rosman…

  18. Reflectance spectroscopy: quantitative analysis techniques for remote sensing applications.

    USGS Publications Warehouse

    Clark, R.N.; Roush, T.L.

    1984-01-01

    Several methods for the analysis of remotely sensed reflectance data are compared, including empirical methods and scattering theories, both of which are important for solving remote sensing problems. The concept of the photon mean path length and the implications for use in modeling reflectance spectra are presented.-from Authors

  19. Outcome Evaluation: Student Development Program, Special Studies Division, Cleveland State University.

    ERIC Educational Resources Information Center

    Pasch, Marvin

    Techniques and procedures used to evaluate the outcomes of the student development program, and to use the evaluation results, are presented. Specific evaluation questions are posed that address overall outcomes, not individual student outcomes, and quantitative measures are suggested to accompany the questions. The measures include statistical…

  20. Object Recognition and Random Image Structure Evolution

    ERIC Educational Resources Information Center

    Sadr, Jvid; Sinha, Pawan

    2004-01-01

    We present a technique called Random Image Structure Evolution (RISE) for use in experimental investigations of high-level visual perception. Potential applications of RISE include the quantitative measurement of perceptual hysteresis and priming, the study of the neural substrates of object perception, and the assessment and detection of subtle…

  1. Effect of skin wettedness on sweat gland response

    NASA Technical Reports Server (NTRS)

    Nadel, E. R.; Stolwijk, J. A. J.

    1973-01-01

    Investigation of the effect of skin wettedness upon sweating rate. Several techniques were used to gain a better understanding of the quantitative nature of this effect. The results include the finding that the evaporative power of the environment has a profound effect on the relationship between body temperature and sweating rate.

  2. Developing Public Education Policy through Policy-Impact Analysis.

    ERIC Educational Resources Information Center

    Hackett, E. Raymond; And Others

    A model for analyzing policy impacts is presented that will assist state-level policy makers in education. The model comprises four stages: (1) monitoring, which includes the identification of relevant trends and issues and the development of a data base; (2) forecasting, which uses quantitative and qualitative techniques developed in futures…

  3. Diabetic microangiopathy in capillaroscopic examination of juveniles with diabetes type 1.

    PubMed

    Kaminska-Winciorek, Grażyna; Deja, Grażyna; Polańska, Joanna; Jarosz-Chobot, Przemysława

    2012-01-30

The aim of this work was a quantitative and qualitative assessment of a selected part of the microcirculation in children with diabetes type 1 using the videocapillaroscopy technique. The authors examined a group of 145 children (70 boys, 75 girls) diagnosed with and treated for diabetes type 1 in the Diabetic Clinic of GCZD in Katowice for at least one year. The study included history, clinical examination (including dermatological examination), and videocapillaroscopy. Capillaroscopy, a non-invasive, painless, and easily repeatable test, was performed using videocapillaroscopy with digital storage of the obtained images. The nailfolds of all children were examined, and the obtained images were assessed quantitatively and qualitatively for changes in capillary loops according to the defined diagnostic procedure. The analysis of capillaroscopic images described selected quantitative and qualitative characteristics. The conducted analysis showed an increase in the number of capillaries and their elongation, and the presence of megacapillaries and Raynaud loops, accompanied by an intensely red background indicating possible neoangiogenesis. The increase in the number of capillaries, disturbances in the distribution of capillaries, and the presence of abnormal capillaries were correlated with longer duration of diabetes. Raynaud loops were more frequently found in cases of increased mean values of HbA1c. Higher values of HbA1c influenced the capillaroscopic images, mainly the number of vessels, including Raynaud loops. Videocapillaroscopy could be a useful tool to detect early changes of microangiopathy in children with diabetes type 1.

  4. Quantitative contrast-enhanced ultrasound imaging: a review of sources of variability

    PubMed Central

    Tang, M.-X.; Mulvana, H.; Gauthier, T.; Lim, A. K. P.; Cosgrove, D. O.; Eckersley, R. J.; Stride, E.

    2011-01-01

    Ultrasound provides a valuable tool for medical diagnosis offering real-time imaging with excellent spatial resolution and low cost. The advent of microbubble contrast agents has provided the additional ability to obtain essential quantitative information relating to tissue vascularity, tissue perfusion and even endothelial wall function. This technique has shown great promise for diagnosis and monitoring in a wide range of clinical conditions such as cardiovascular diseases and cancer, with considerable potential benefits in terms of patient care. A key challenge of this technique, however, is the existence of significant variations in the imaging results, and the lack of understanding regarding their origin. The aim of this paper is to review the potential sources of variability in the quantification of tissue perfusion based on microbubble contrast-enhanced ultrasound images. These are divided into the following three categories: (i) factors relating to the scanner setting, which include transmission power, transmission focal depth, dynamic range, signal gain and transmission frequency, (ii) factors relating to the patient, which include body physical differences, physiological interaction of body with bubbles, propagation and attenuation through tissue, and tissue motion, and (iii) factors relating to the microbubbles, which include the type of bubbles and their stability, preparation and injection and dosage. It has been shown that the factors in all three categories can significantly affect the imaging results and contribute to the variations observed. How these factors influence quantitative imaging is explained and possible methods for reducing such variations are discussed. PMID:22866229
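    As a deliberately simplified illustration of the kind of perfusion quantification the review discusses, the sketch below extracts common time-intensity curve (TIC) indices — peak enhancement, time to peak, and area under the curve — from a synthetic bolus curve. The log-normal bolus model and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def tic_metrics(t, intensity):
    """Peak enhancement (PE), time to peak (TTP) and area under the curve
    (AUC) from a bolus time-intensity curve sampled on a uniform grid."""
    i = int(np.argmax(intensity))
    pe = float(intensity[i])
    ttp = float(t[i])
    dt = t[1] - t[0]                       # uniform sampling assumed
    auc = float(np.sum(intensity) * dt)    # rectangle-rule integral
    return pe, ttp, auc

# Synthetic bolus: a log-normal wash-in/wash-out curve, a common TIC model.
t = np.linspace(0.1, 60.0, 600)            # time after injection, s
mu, sigma, scale = 2.5, 0.5, 100.0         # illustrative bolus parameters
intensity = scale / (t * sigma * np.sqrt(2.0 * np.pi)) * np.exp(
    -(np.log(t) - mu) ** 2 / (2.0 * sigma ** 2))

pe, ttp, auc = tic_metrics(t, intensity)   # TTP near exp(mu - sigma^2) ~ 9.5 s
```

    Scanner settings such as gain and dynamic range alter these indices nonlinearly, which is one reason the review stresses linearizing log-compressed data before quantification.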

  5. Real-time quantitative fluorescence imaging using a single snapshot optical properties technique for neurosurgical guidance

    NASA Astrophysics Data System (ADS)

    Valdes, Pablo A.; Angelo, Joseph; Gioux, Sylvain

    2015-03-01

    Fluorescence imaging has shown promise as an adjunct to improve the extent of resection in neurosurgery and oncologic surgery. Nevertheless, current fluorescence imaging techniques do not account for the heterogeneous attenuation effects of tissue optical properties. In this work, we present a novel imaging system that performs real-time quantitative fluorescence imaging using Single Snapshot Optical Properties (SSOP) imaging. We developed the technique and performed initial phantom studies to validate the quantitative capabilities of the system and its intraoperative feasibility. Overall, this work introduces a novel real-time quantitative fluorescence imaging method capable of being used intraoperatively for neurosurgical guidance.

  6. Evaluating ocular blood flow

    PubMed Central

    Maram, Jyotsna; Srinivas, Sowmya; Sadda, Srinivas R

    2017-01-01

    Studies have shown that vascular impairment plays an important role in the etiology and pathogenesis of various ocular diseases including glaucoma, age-related macular degeneration, diabetic retinopathy, and retinal venous occlusive disease. Thus, qualitative and quantitative assessment of ocular blood flow (BF) is a topic of interest for early disease detection, diagnosis, and management. Owing to the rapid improvement in technology, there are several invasive and noninvasive techniques available for evaluating ocular BF, with each of these techniques having their own limitations and advantages. This article reviews these important techniques, with a particular focus on Doppler Fourier domain optical coherence tomography (OCT) and OCT-angiography. PMID:28573987

  7. Atlas of computerized blood flow analysis in bone disease.

    PubMed

    Gandsman, E J; Deutsch, S D; Tyson, I B

    1983-11-01

    The role of computerized blood flow analysis in routine bone scanning is reviewed. Cases illustrating the technique include proven diagnoses of toxic synovitis, Legg-Perthes disease, arthritis, avascular necrosis of the hip, fractures, benign and malignant tumors, Paget's disease, cellulitis, osteomyelitis, and shin splints. Several examples also show the use of the technique in monitoring treatment. The use of quantitative data from the blood flow, bone uptake phase, and static images suggests specific diagnostic patterns for each of the diseases presented in this atlas. Thus, this technique enables increased accuracy in the interpretation of the radionuclide bone scan.

  8. Time-resolved quantitative-phase microscopy of laser-material interactions using a wavefront sensor.

    PubMed

    Gallais, Laurent; Monneret, Serge

    2016-07-15

    We report on a simple and efficient technique based on a wavefront sensor to obtain time-resolved amplitude and phase images of laser-material interactions. The main interest of the technique is to obtain quantitative self-calibrated phase measurements in one shot at the femtosecond time-scale, with high spatial resolution. The technique is used for direct observation and quantitative measurement of the Kerr effect in a fused silica substrate and free electron generation by photo-ionization processes in an optical coating.

  9. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. For techniques which cannot be quantitatively assessed, qualitative judgements are expressed about their effectiveness and cost, and the reasons why quantitative assessments are not possible are documented.

  10. A thioacidolysis method tailored for higher‐throughput quantitative analysis of lignin monomers

    PubMed Central

    Foster, Cliff; Happs, Renee M.; Doeppke, Crissa; Meunier, Kristoffer; Gehan, Jackson; Yue, Fengxia; Lu, Fachuang; Davis, Mark F.

    2016-01-01

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β‐O‐4 linkages. Current thioacidolysis methods are low-throughput, as they require tedious steps to concentrate the reaction products prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment, uses a non-chlorinated organic solvent, and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1–2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentration and drying, have been eliminated to aid the consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without loss of quantitation accuracy. The method has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three biomass varieties representing hardwoods, softwoods, and grasses. PMID:27534715
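    The quantitation step rests on calibration against monomer standards. The sketch below shows generic linear-calibration quantitation — fit detector response against known standard concentrations, then invert the line for an unknown sample. The concentrations, peak areas and the `quantify` helper are hypothetical illustrations, not values from the method.

```python
import numpy as np

# Hypothetical calibration data: GC peak areas measured for arylglycerol
# monomer standards at known concentrations (all values invented).
conc = np.array([0.05, 0.1, 0.2, 0.4, 0.8])            # mg/mL
area = np.array([1.1e4, 2.2e4, 4.3e4, 8.5e4, 17.1e4])  # detector response

slope, intercept = np.polyfit(conc, area, 1)  # linear calibration fit

def quantify(sample_area):
    """Invert the calibration line to estimate a sample's concentration."""
    return (sample_area - intercept) / slope

est = quantify(6.4e4)  # unknown sample, ~0.30 mg/mL under these numbers
```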

  11. The Quantitative Preparation of Future Geoscience Graduate Students

    NASA Astrophysics Data System (ADS)

    Manduca, C. A.; Hancock, G. S.

    2006-12-01

    Modern geoscience is a highly quantitative science. In February, a small group of faculty and graduate students from across the country met to discuss the quantitative preparation of geoscience majors for graduate school. The group included ten faculty supervising graduate students in quantitative areas spanning the earth, atmosphere, and ocean sciences; five current graduate students in these areas; and five faculty teaching undergraduate students in the spectrum of institutions preparing students for graduate work. Discussion focused on four key areas: Are incoming graduate students adequately prepared for the quantitative aspects of graduate geoscience programs? What are the essential quantitative skills required for success in graduate school? What are perceived as the important courses to prepare students for the quantitative aspects of graduate school? What programs/resources would be valuable in helping faculty/departments improve the quantitative preparation of students? The participants concluded that strengthening the quantitative preparation of undergraduate geoscience majors would increase their opportunities in graduate school. While specifics differed amongst disciplines, special importance was placed on developing the ability to use quantitative skills to solve geoscience problems. This requires the ability to pose problems so they can be addressed quantitatively, understand the relationship between quantitative concepts and physical representations, visualize mathematics, test the reasonableness of quantitative results, creatively move forward from existing models/techniques/approaches, and move between quantitative and verbal descriptions. A list of important quantitative competencies desirable in incoming graduate students includes mechanical skills in basic mathematics, functions, multi-variate analysis, statistics and calculus, as well as skills in logical analysis and the ability to learn independently in quantitative ways.
Calculus, calculus-based physics, chemistry, statistics, programming and linear algebra were viewed as important course preparation for a successful graduate experience. A set of recommendations for departments and for new community resources includes ideas for infusing quantitative reasoning throughout the undergraduate experience and mechanisms for learning from successful experiments in both geoscience and mathematics. A full list of participants, summaries of the meeting discussion and recommendations are available at http://serc.carleton.edu/quantskills/winter06/index.html. These documents, crafted by a small but diverse group, can serve as a starting point for broader community discussion of the quantitative preparation of future geoscience graduate students.

  12. In vivo studies of brain development by magnetic resonance techniques.

    PubMed

    Inder, T E; Huppi, P S

    2000-01-01

    Understanding of the morphological development of the human brain has largely come from neuropathological studies obtained postmortem. Magnetic resonance (MR) techniques have recently allowed the provision of detailed structural, metabolic, and functional information in vivo on the human brain. These techniques have been utilized in studies from premature infants to adults and have provided invaluable data on the sequence of normal human brain development. This article will focus on MR techniques including conventional structural MR imaging techniques, quantitative morphometric MR techniques, diffusion weighted MR techniques, and MR spectroscopy. In order to understand the potential applications and limitations of MR techniques, relevant physical and biological principles for each of the MR techniques are first reviewed. This is followed by a review of the understanding of the sequence of normal brain development utilizing these techniques. MRDD Research Reviews 6:59-67, 2000. Copyright 2000 Wiley-Liss, Inc.

  13. Quantification of Liver Iron with MRI: State of the Art and Remaining Challenges

    PubMed Central

    Hernando, Diego; Levin, Yakir S; Sirlin, Claude B; Reeder, Scott B

    2015-01-01

    Liver iron overload is the histological hallmark of hereditary hemochromatosis and transfusional hemosiderosis, and can also occur in chronic hepatopathies. Iron overload can result in liver damage, with the eventual development of cirrhosis, liver failure and hepatocellular carcinoma. Assessment of liver iron levels is necessary for detection and quantitative staging of iron overload, and monitoring of iron-reducing treatments. This article discusses the need for non-invasive assessment of liver iron, and reviews qualitative and quantitative methods with a particular emphasis on MRI. Specific MRI methods for liver iron quantification include signal intensity ratio as well as R2 and R2* relaxometry techniques. Methods that are in clinical use, as well as their limitations, are described. Remaining challenges, unsolved problems, and emerging techniques to provide improved characterization of liver iron deposition are discussed. PMID:24585403
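    As a rough illustration of the R2* relaxometry mentioned above, the sketch below fits the mono-exponential decay S(TE) = S0·exp(−R2*·TE) to synthetic multi-echo gradient-echo magnitudes. The echo times, signal level and R2* value are illustrative assumptions, and the conversion from R2* to liver iron concentration (via published calibration curves) is not shown.

```python
import numpy as np
from scipy.optimize import curve_fit

def signal(te, s0, r2star):
    """Mono-exponential model for multi-echo gradient-echo magnitudes."""
    return s0 * np.exp(-r2star * te)

# Illustrative echo times (s) and a synthetic, noisy "iron-overloaded" decay.
te = np.array([1.0, 2.0, 3.5, 5.0, 7.0, 9.0, 12.0]) * 1e-3
true_s0, true_r2s = 1000.0, 250.0          # R2* in s^-1
rng = np.random.default_rng(0)
meas = signal(te, true_s0, true_r2s) + rng.normal(0.0, 5.0, te.size)

# Fit recovers S0 and R2*; iron concentration would then follow from a
# published R2*-vs-iron calibration curve (not shown here).
(fit_s0, fit_r2s), _ = curve_fit(signal, te, meas, p0=(meas[0], 100.0))
```

    High iron loads shorten T2* so severely that early echoes dominate the fit, which is why short first echo times matter in practice.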

  14. Quantitative imaging of brain energy metabolisms and neuroenergetics using in vivo X-nuclear 2H, 17O and 31P MRS at ultra-high field.

    PubMed

    Zhu, Xiao-Hong; Lu, Ming; Chen, Wei

    2018-07-01

    Brain energy metabolism relies predominantly on glucose and oxygen utilization to generate biochemical energy in the form of adenosine triphosphate (ATP). ATP is essential for maintaining basal electrophysiological activities in a resting brain and supporting evoked neuronal activity under an activated state. Studying complex neuroenergetic processes in the brain requires sophisticated neuroimaging techniques enabling noninvasive and quantitative assessment of cerebral energy metabolisms and quantification of metabolic rates. Recent state-of-the-art in vivo X-nuclear MRS techniques, including 2H, 17O and 31P MRS, have shown promise, especially at ultra-high fields, in the quest for understanding neuroenergetics and brain function using preclinical models and in human subjects under healthy and diseased conditions. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Low angle light scattering analysis: a novel quantitative method for functional characterization of human and murine platelet receptors.

    PubMed

    Mindukshev, Igor; Gambaryan, Stepan; Kehrer, Linda; Schuetz, Claudia; Kobsar, Anna; Rukoyatkina, Natalia; Nikolaev, Viacheslav O; Krivchenko, Alexander; Watson, Steve P; Walter, Ulrich; Geiger, Joerg

    2012-07-01

    Determinations of platelet receptor functions are indispensable diagnostic indicators of cardiovascular and hemostatic diseases including hereditary and acquired receptor defects and receptor responses to drugs. However, presently available techniques for assessing platelet function have some disadvantages, such as low sensitivity and the requirement of large sample sizes and unphysiologically high agonist concentrations. Our goal was to develop and initially characterize a new technique designed to quantitatively analyze platelet receptor activation and platelet function on the basis of measuring changes in low angle light scattering. We developed a novel technique based on low angle light scattering which registers changes in light scattering at a range of different angles in platelet suspensions during activation. The method proved to be highly sensitive for simultaneous real-time detection of changes in the size and shape of platelets during activation. Unlike commonly used methods, the light scattering method could detect platelet shape change and aggregation in response to nanomolar concentrations of extracellular nucleotides. Furthermore, our results demonstrate that the advantages of the light scattering method make it a method of choice for platelet receptor monitoring and for investigation of both murine and human platelets in disease models. Our data demonstrate the suitability and superiority of this new low angle light scattering method for comprehensive analyses of platelet receptors and functions. This highly sensitive, quantitative, and online detection of essential physiological, pathophysiological and pharmacological response properties of human and mouse platelets is a significant improvement over conventional techniques.

  16. DGT Passive Sampling for Quantitative in Situ Measurements of Compounds from Household and Personal Care Products in Waters.

    PubMed

    Chen, Wei; Li, Yanying; Chen, Chang-Er; Sweetman, Andrew J; Zhang, Hao; Jones, Kevin C

    2017-11-21

    Widespread use of organic chemicals in household and personal care products (HPCPs) and their discharge into aquatic systems means reliable, robust techniques to monitor environmental concentrations are needed. The passive sampling approach of diffusive gradients in thin-films (DGT) is developed here and demonstrated to provide in situ quantitative and time-weighted average (TWA) measurement of these chemicals in waters. The novel technique is developed for HPCPs, including preservatives, antioxidants and disinfectants, by evaluating the performance of different binding agents. Ultrasonic extraction of binding gels in acetonitrile gave good and consistent recoveries for all test chemicals. Uptake by DGT with HLB (hydrophilic-lipophilic-balanced) as the binding agent was relatively independent of pH (3.5-9.5), ionic strength (0.001-0.1 M) and dissolved organic matter (0-20 mg L-1), making it suitable for applications across a wide range of environments. Experiments varying deployment time and diffusion layer thickness confirmed that the masses of chemicals accumulated by DGT are consistent with theoretical predictions. The technique was further tested and applied in the influent and effluent of a wastewater treatment plant. Results were compared with conventional grab-sampling and 24-h-composited samples from autosamplers. DGT provided TWA concentrations over deployments of up to 18 days, with minimal effects from biofouling or the diffusive boundary layer. The field application demonstrated the advantages of the DGT technique: it gives in situ analyte preconcentration in a simple matrix, with more quantitative measurement of the HPCP analytes.
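    The TWA concentration recovered by DGT follows from Fick's first law applied to diffusion through the gel layer: C = M·Δg/(D·A·t). A minimal sketch, with all numerical values (accumulated mass, gel thickness, diffusion coefficient, sampling area, deployment time) chosen purely for illustration:

```python
def dgt_twa_concentration(mass_ng, dg_cm, d_cm2_s, area_cm2, t_s):
    """Time-weighted average concentration from Fick's first law:
    C = M * dg / (D * A * t), in ng/cm^3 (equivalently ug/L)."""
    return mass_ng * dg_cm / (d_cm2_s * area_cm2 * t_s)

# Illustrative deployment: 500 ng accumulated on the binding gel over 7 days,
# 0.094 cm diffusive layer, 3.14 cm^2 exposure window, D ~ 5e-6 cm^2/s.
c_ug_per_l = dgt_twa_concentration(mass_ng=500.0, dg_cm=0.094,
                                   d_cm2_s=5.0e-6, area_cm2=3.14,
                                   t_s=7 * 24 * 3600)  # ~4.9 ug/L
```

    Because the accumulated mass integrates exposure over the whole deployment, the result is inherently a time-weighted average rather than a snapshot like a grab sample.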

  17. Ultrasound elastography: principles, techniques, and clinical applications.

    PubMed

    Dewall, Ryan J

    2013-01-01

    Ultrasound elastography is an emerging set of imaging modalities used to image tissue elasticity, often referred to as virtual palpation. These techniques have proven effective in detecting and assessing many different pathologies, because tissue mechanical changes often correlate with tissue pathological changes. This article reviews the principles of ultrasound elastography, many of the ultrasound-based techniques, and popular clinical applications. Originally, elastography was a technique that imaged tissue strain by comparing pre- and postcompression ultrasound images. However, new techniques have been developed that use different excitation methods such as external vibration or acoustic radiation force. Some techniques track transient phenomena such as shear waves to quantitatively measure tissue elasticity. Clinical use of elastography is increasing, with applications including lesion detection and classification, fibrosis staging, treatment monitoring, vascular imaging, and musculoskeletal applications.
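    For the quantitative shear-wave techniques mentioned above, tissue elasticity is commonly derived from the measured shear-wave speed via E = 3ρc², which assumes an incompressible, isotropic, purely elastic medium. A minimal sketch with illustrative shear speeds:

```python
def youngs_modulus_kpa(shear_speed_m_s, density_kg_m3=1000.0):
    """E = 3 * rho * c^2, assuming an incompressible, isotropic, purely
    elastic medium, converted from Pa to kPa."""
    return 3.0 * density_kg_m3 * shear_speed_m_s ** 2 / 1000.0

e_soft = youngs_modulus_kpa(1.2)   # ~4.3 kPa, plausible for healthy liver
e_stiff = youngs_modulus_kpa(3.0)  # 27 kPa, plausible for advanced fibrosis
```

    The quadratic dependence on speed means a modest change in measured wave speed translates into a large change in reported stiffness, which is one reason measurement variability matters clinically.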

  18. In vivo standardization of bone ultrasonometry of the clavicle.

    PubMed

    Mandarano-Filho, Luiz Garcia; Bezuti, Márcio Takey; Barbieri, Cláudio Henrique

    2016-03-01

    The assessment of fracture union includes physical examination and radiographic imaging, which depend on the examiner's experience. The development of ancillary methods may avoid prolonged treatments and the improper removal of implants. Quantitative bone ultrasonometry has been studied for this purpose and will soon be included in clinical practice. The aims of the present study were to assess the feasibility of using this technique on the clavicle and to standardize its in vivo application. Twenty adult volunteers, including 10 men and 10 women without medical conditions or a previous history of clavicle fracture, underwent axial quantitative ultrasonometric assessment using transducers in various positions (different distances between the transducers and different angulations relative to the clavicle). Similar values of wave propagation velocity were obtained in the different tested set-ups, which included distinct distances between the transducers and angular positions relative to the clavicle. There were significant differences only in the transducers positioned at 0° and at 5 or 7 cm apart. The use of bone ultrasonometry on the clavicle is feasible and the standardization of the technique proposed in this study (transducers placed at 45° and at 7 cm apart) will allow its future application in clinical trials to evaluate the healing process of diaphyseal fractures of the clavicle.
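    Axial-transmission ultrasonometry of this kind reduces, at its core, to a speed-of-sound measurement: velocity is the transducer separation divided by the first-arrival time of flight. A minimal sketch using the standardized 7 cm separation from the study and a hypothetical time of flight:

```python
def axial_velocity_m_s(separation_cm, time_of_flight_us):
    """First-arrival speed of sound along the bone: v = d / t."""
    return (separation_cm / 100.0) / (time_of_flight_us * 1.0e-6)

# The standardized set-up places the transducers 7 cm apart; a hypothetical
# first-arrival time of 17.5 us then gives ~4000 m/s, typical of cortical bone.
v = axial_velocity_m_s(7.0, 17.5)
```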

  19. Advances in magnetic resonance neuroimaging techniques in the evaluation of neonatal encephalopathy.

    PubMed

    Panigrahy, Ashok; Blüml, Stefan

    2007-02-01

    Magnetic resonance (MR) imaging has become an essential tool in the evaluation of neonatal encephalopathy. Magnetic resonance-compatible neonatal incubators allow sick neonates to be transported to the MR scanner, and neonatal head coils can improve the signal-to-noise ratio, critical for advanced MR imaging techniques. Refinements of conventional imaging techniques include the use of PROPELLER techniques for motion correction. Magnetic resonance spectroscopic imaging and diffusion tensor imaging provide quantitative assessment of both brain development and brain injury in the newborn with respect to metabolite abnormalities and hypoxic-ischemic injury. Knowledge of normal developmental changes in MR spectroscopy metabolite concentrations and diffusion tensor metrics is essential to interpret pathological cases. Perfusion MR and functional MR can provide additional physiological information. Both MR spectroscopy and diffusion tensor imaging can provide additional information in the differential diagnosis of neonatal encephalopathy, including perinatal white matter injury, hypoxic-ischemic brain injury, metabolic disease, infection, and birth injury.

  20. [Research progress and development trend of quantitative assessment techniques for urban thermal environment].

    PubMed

    Sun, Tie Gang; Xiao, Rong Bo; Cai, Yun Nan; Wang, Yao Wu; Wu, Chang Guang

    2016-08-01

    Quantitative assessment of the urban thermal environment has become a focus for urban climate and environmental science since the concept of the urban heat island was proposed. With the continual development of spatial information and computer simulation technology, substantial progress has been made on quantitative assessment techniques and methods for the urban thermal environment. These techniques have evolved from statistical analysis of the urban-scale thermal environment using historical weather-station data to dynamic simulation and forecasting of the thermal environment at various scales. This study reviewed the development of ground meteorological observation, thermal infrared remote sensing and numerical simulation. Moreover, the potential advantages and disadvantages, applicability and development trends of these techniques were summarized, aiming to provide fundamental knowledge for understanding urban thermal environment assessment and optimization.
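    At the simplest end of the assessment spectrum reviewed here, urban heat island intensity is conventionally computed from ground meteorological observations as the urban-minus-rural air temperature difference between paired stations. A minimal sketch with hypothetical station readings:

```python
import numpy as np

# Hypothetical simultaneous 2 m air temperatures (deg C) at paired stations.
t_urban = np.array([28.1, 27.5, 30.2, 32.0, 31.4])
t_rural = np.array([26.0, 25.2, 28.1, 29.9, 29.5])

# Urban heat island intensity: mean urban-minus-rural temperature difference.
uhii = float(np.mean(t_urban - t_rural))  # 2.1 deg C for these readings
```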

  1. Quantitative magnetic resonance imaging phantoms: A review and the need for a system phantom.

    PubMed

    Keenan, Kathryn E; Ainslie, Maureen; Barker, Alex J; Boss, Michael A; Cecil, Kim M; Charles, Cecil; Chenevert, Thomas L; Clarke, Larry; Evelhoch, Jeffrey L; Finn, Paul; Gembris, Daniel; Gunter, Jeffrey L; Hill, Derek L G; Jack, Clifford R; Jackson, Edward F; Liu, Guoying; Russek, Stephen E; Sharma, Samir D; Steckner, Michael; Stupic, Karl F; Trzasko, Joshua D; Yuan, Chun; Zheng, Jie

    2018-01-01

    The MRI community is using quantitative mapping techniques to complement qualitative imaging. For quantitative imaging to reach its full potential, it is necessary to analyze measurements across systems and longitudinally. Clinical use of quantitative imaging can be facilitated through adoption and use of a standard system phantom, a calibration/standard reference object, to assess the performance of an MRI machine. The International Society of Magnetic Resonance in Medicine AdHoc Committee on Standards for Quantitative Magnetic Resonance was established in February 2007 to facilitate the expansion of MRI as a mainstream modality for multi-institutional measurements, including, among other things, multicenter trials. The goal of the Standards for Quantitative Magnetic Resonance committee was to provide a framework to ensure that quantitative measures derived from MR data are comparable over time, between subjects, between sites, and between vendors. This paper, written by members of the Standards for Quantitative Magnetic Resonance committee, reviews standardization attempts and then details the need, requirements, and implementation plan for a standard system phantom for quantitative MRI. In addition, application-specific phantoms and implementation of quantitative MRI are reviewed. Magn Reson Med 79:48-61, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
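    One basic use of such a system phantom is quantifying scanner bias against certified reference values. The sketch below computes per-compartment percent bias for hypothetical measured versus certified T1 values; all numbers are invented for illustration and do not come from any actual phantom.

```python
import numpy as np

# Hypothetical certified vs measured T1 values (ms) for phantom compartments.
t1_ref = np.array([100.0, 300.0, 600.0, 1000.0, 1500.0])
t1_meas = np.array([103.0, 296.0, 612.0, 990.0, 1530.0])

# Percent bias per compartment, a basic system-phantom performance metric.
pct_bias = 100.0 * (t1_meas - t1_ref) / t1_ref
worst_bias = float(np.max(np.abs(pct_bias)))  # 3.0 % here
```

    Tracking such bias values longitudinally, and across sites and vendors, is precisely the comparability the committee's framework targets.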

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    PubMed Central

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. 
In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest. Results showed that the proposed technique provided accurate ranking of the reconstruction methods for 97.5% of the 50 noise realizations. Further, the technique was robust to the choice of evaluated reconstruction methods. The simulation study pointed to possible violations of the assumptions made in the NGS technique under clinical scenarios. However, numerical experiments indicated that the NGS technique was robust in ranking methods even when there was some degree of such violation. PMID:26982626
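    The NSR figure of merit described above is the estimated noise standard deviation divided by the magnitude of the estimated slope of the linear model relating measured to true values; lower NSR indicates better precision. A minimal sketch that ranks hypothetical methods given (slope, noise) estimates such as the NGS step would produce — the method names and numbers are invented for illustration:

```python
def noise_to_slope_ratio(sigma, slope):
    """NSR figure of merit: estimated noise standard deviation over the
    magnitude of the estimated slope; lower NSR means better precision."""
    return sigma / abs(slope)

# Hypothetical (slope, noise sigma) estimates for three reconstruction
# methods, as the no-gold-standard step would produce; names are invented.
methods = {"OSEM": (0.95, 4.0), "FBP": (1.10, 9.0), "MAP": (0.90, 3.0)}
ranking = sorted(methods,
                 key=lambda m: noise_to_slope_ratio(methods[m][1],
                                                    methods[m][0]))
# ranking orders the methods from most to least precise
```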

  4. Electrons, Photons, and Force: Quantitative Single-Molecule Measurements from Physics to Biology

    PubMed Central

    2011-01-01

    Single-molecule measurement techniques have illuminated unprecedented details of chemical behavior, including observations of the motion of a single molecule on a surface, and even the vibration of a single bond within a molecule. Such measurements are critical to our understanding of entities ranging from single atoms to the most complex protein assemblies. We provide an overview of the strikingly diverse classes of measurements that can be used to quantify single-molecule properties, including those of single macromolecules and single molecular assemblies, and discuss the quantitative insights they provide. Examples are drawn from across the single-molecule literature, ranging from ultrahigh vacuum scanning tunneling microscopy studies of adsorbate diffusion on surfaces to fluorescence studies of protein conformational changes in solution. PMID:21338175

  5. Crosscutting Airborne Remote Sensing Technologies for Oil and Gas and Earth Science Applications

    NASA Technical Reports Server (NTRS)

    Aubrey, A. D.; Frankenberg, C.; Green, R. O.; Eastwood, M. L.; Thompson, D. R.; Thorpe, A. K.

    2015-01-01

    Airborne imaging spectroscopy has evolved dramatically since the 1980s as a robust remote sensing technique used to generate 2-dimensional maps of surface properties over large spatial areas. Traditional applications for passive airborne imaging spectroscopy include interrogation of surface composition, such as mapping of vegetation diversity and surface geological composition. Two recent applications are particularly relevant to the needs of both the oil and gas and government sectors: quantification of surficial hydrocarbon thickness in aquatic environments and mapping of atmospheric greenhouse gas components. These techniques provide valuable capabilities for characterizing petroleum seepage in addition to detecting and quantifying fugitive emissions. New empirical data that provide insight into the source strength of anthropogenic methane will be reviewed, with particular emphasis on the evolving constraints enabled by new methane remote sensing techniques. Contemporary studies identify high-strength point sources as significant contributors to the national methane inventory and underscore the need for high-performance remote sensing technologies that provide quantitative leak detection. Imaging sensors that map spatial distributions of methane anomalies provide effective techniques to detect, localize, and quantify fugitive leaks. Airborne remote sensing instruments provide the unique combination of high spatial resolution (<1 m) and large coverage required to directly attribute methane emissions to individual emission sources. This capability cannot currently be achieved using spaceborne sensors. In this study, results from recent NASA remote sensing field experiments focused on point-source leak detection will be highlighted. This includes existing quantitative capabilities for oil and methane using state-of-the-art airborne remote sensing instruments.
While these capabilities are of interest to NASA for assessment of environmental impact and global climate change, industry similarly seeks to detect and localize leaks of both oil and methane across operating fields. In some cases, higher sensitivities desired for upstream and downstream applications can only be provided by new airborne remote sensing instruments tailored specifically for a given application. There exists a unique opportunity for alignment of efforts between commercial and government sectors to advance the next generation of instruments to provide more sensitive leak detection capabilities, including those for quantitative source strength determination.

  6. A new systematic and quantitative approach to characterization of surface nanostructures using fuzzy logic

    NASA Astrophysics Data System (ADS)

    Al-Mousa, Amjed A.

    Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the characteristics of devices where these films are used has drawn substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterizing surface nanostructures of thin films by isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as the Atomic Force Microscopy (AFM) data presented here. The methodology starts by compensating the AFM data for specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both the vertical and horizontal directions, classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. We also present a case study to demonstrate the effectiveness of our methodology in quantitatively identifying the particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface.
A comparison with other techniques, such as thresholding, watershed, and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared, along with the challenges of the two techniques.
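    The core of a fuzzy-logic point classifier of this kind can be sketched briefly. A minimal illustration, assuming triangular membership functions over normalized surface height; the function shapes and thresholds below are invented for illustration, not the thesis's actual rule base:

```python
# Fuzzy classification of a surface point as "background" or "structure"
# from its normalized height. Each point gets a membership degree in each
# fuzzy set, and the inference step picks the label with the larger degree.

def tri(x, a, b, c):
    """Triangular membership function peaking at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(height):
    background = tri(height, -0.5, 0.0, 0.5)  # low heights: substrate
    structure = tri(height, 0.3, 1.0, 1.5)    # high heights: nanostructure
    return "structure" if structure > background else "background"

print([classify(h) for h in (0.1, 0.45, 0.9)])
```

In a full implementation, points labeled "structure" would then be clustered into individual surface features before measuring their shapes and sizes.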

  7. A Quantitative Technique for Beginning Microscopists.

    ERIC Educational Resources Information Center

    Sundberg, Marshall D.

    1984-01-01

    Stereology is the study of three-dimensional objects through the interpretation of two-dimensional images. Stereological techniques used in introductory botany to quantitatively examine changes in leaf anatomy in response to different environments are discussed. (JN)

  8. Conceptual development and retention within the learning cycle

    NASA Astrophysics Data System (ADS)

    McWhirter, Lisa Jo

    1998-12-01

    This research was designed to achieve two goals: (1) examine concept development and retention within the learning cycle and (2) examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning groups. Forty-eight sixth-grade students and one teacher at an urban middle school participated in the study. The research utilized both quantitative and qualitative analyses. Quantitative assessments included a concept mapping technique as well as teacher-generated multiple-choice tests. Preliminary quantitative analysis found that students' reading levels had an effect on their pretest scores in both the concept mapping and the multiple-choice assessments. Therefore, a covariate design was implemented for the quantitative analyses. When quantitative analysis techniques were used to examine concept development and retention, it was discovered that the students' concept knowledge increased significantly from the conclusion of the term-introduction phase to the conclusion of the expansion phase. These findings would indicate that all three phases of the learning cycle are necessary for conceptual development. However, quantitative analyses of concept maps indicated that this is not true for all students. Individual students showed evidence of concept development and integration at each phase. Therefore, concept development is individualized, and all phases of the learning cycle are not necessary for all students. As a result, individuals' assimilation, disequilibration, accommodation, and organization may not correlate with the phases of the learning cycle. Quantitative analysis also indicated a significant decrease in the retention of concepts over time. Qualitative analyses were used to examine how students' concept development is mediated by classroom discussions and the students' small cooperative learning groups.
It was discovered that there was a correlation between teacher-student interaction, small-group interaction, and concept mediation. Students who had a high level of teacher-student dialogue, in teacher-led discussions with integrated scaffolding techniques, were the same students who mediated the ideas within the small-group discussions. Those students whose teacher-student interactions consisted of dialogue with little positive teacher feedback made no contributions within the small group, regardless of their level of concept development.

  9. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  10. [SciELO Public Health: the performance of Cadernos de Saúde Pública and Revista de Saúde Pública].

    PubMed

    Barata, Rita Barradas

    2007-12-01

    The aim of this paper was to analyze two Brazilian scientific journals included in the SciELO Library of Public Health, using a group of bibliometric indicators and scrutinizing the articles most viewed. Cadernos de Saúde Pública was accessed 3,743.59 times per month, with an average of 30.31 citations per article. The 50 articles most viewed (6.72 to 524.5 views) were mostly published in Portuguese (92%). 42% were theoretical essays, 20% surveys, and 16% descriptive studies. 42% used argumentative techniques, 34% quantitative techniques, 18% qualitative techniques, and 6% mathematical modeling. The most common themes were health and work (50%), epidemiology (22%), and environmental health (8%). Revista de Saúde Pública was accessed 1,590.97 times per month, with an average of 26.27 citations per article. The 50 articles most viewed (7.33 to 56.50 views) were all published in Portuguese: 46% were surveys, 14% database analyses, and 12% systematic reviews. Quantitative techniques were adopted in 66% of these articles, while the shares of mathematical modeling and qualitative techniques were the same as observed in Cadernos de Saúde Pública. The most common themes were health services organization (22%), nutrition (22%), health and work (18%), epidemiology (12%), and environmental health (12%).

  11. Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses

    NASA Astrophysics Data System (ADS)

    Wong, Stephen T. C.; Knowlton, Robert C.; Hoo, Kent S.; Huang, H. K.

    1995-05-01

    Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to visualize the location of physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques studies different biomedical aspects of the brain, but an effective means to quantify and correlate the disparate imaging datasets, and thereby improve clinical decision-making, is lacking. This paper describes several techniques developed in a UNIX-based neurodiagnostic workstation to aid the noninvasive presurgical evaluation of epilepsy patients. These techniques include online access to the picture archiving and communication systems (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantitation of structural and functional information contained in the registered images. For illustration, we describe the use of these techniques in a patient case of nonlesional neocortical epilepsy. We also present our future work based on preliminary studies.

  12. Analysis of defect structure in silicon. Characterization of SEMIX material. Silicon sheet growth development for the large area silicon sheet task of the low-cost solar array project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Stringfellow, G. B.; Virkar, A. V.; Dunn, J.; Guyer, T.

    1983-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. An important correlation was obtained between defect densities, cell efficiency, and diffusion length. Grain boundary substructure displayed a strong influence on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements gave statistically significant information compared to other microanalytical techniques. A surface preparation technique to obtain proper contrast of structural defects, suitable for Quantimet quantitative image analyzer (QTM) analysis, was perfected and is used routinely. The relationship between hole mobility and grain boundary density was determined: mobility was measured using the van der Pauw technique, and grain boundary density was measured using the quantitative microscopy technique. Mobility was found to decrease with increasing grain boundary density.

  13. Elicitation of quantitative data from a heterogeneous expert panel: formal process and application in animal health.

    PubMed

    Van der Fels-Klerx, Ine H J; Goossens, Louis H J; Saatkamp, Helmut W; Horst, Suzan H S

    2002-02-01

    This paper presents a protocol for a formal expert judgment process using a heterogeneous expert panel aimed at the quantification of continuous variables. The emphasis is on the process's requirements related to the nature of expertise within the panel, in particular the heterogeneity of both substantive and normative expertise. The process provides the opportunity for interaction among the experts so that they fully understand and agree upon the problem at hand, including qualitative aspects relevant to the variables of interest, prior to the actual quantification task. Individual experts' assessments of the variables of interest, cast in the form of subjective probability density functions, are elicited with a minimal demand for normative expertise. The individual experts' assessments are aggregated into a single probability density function per variable, thereby weighting the experts according to their expertise. Elicitation techniques proposed include the Delphi technique for the qualitative assessment task and the ELI method for the actual quantitative assessment task. The Classical model was used to weight the experts' assessments in order to construct a single distribution per variable; in this model, expert quality is typically based on performance on seed variables. An application of the proposed protocol in the broad and multidisciplinary field of animal health is presented. Results of this expert judgment process showed that the proposed protocol, in combination with the proposed elicitation and analysis techniques, resulted in valid data on the (continuous) variables of interest. In conclusion, the proposed protocol for a formal expert judgment process aimed at the elicitation of quantitative data from a heterogeneous expert panel provided satisfactory results. Hence, this protocol might be useful for expert judgment studies in other broad and/or multidisciplinary fields of interest.

  14. A new technique for quantitative analysis of hair loss in mice using grayscale analysis.

    PubMed

    Ponnapakkam, Tulasi; Katikaneni, Ranjitha; Gulati, Rohan; Gensure, Robert

    2015-03-09

    Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data that has been quantified in this fashion can then be analyzed using standard statistical techniques (i.e., ANOVA, T-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment which is readily available in most research laboratories, and applying an objective, quantitative assessment which is more robust than subjective evaluations. Improvements in quantification of hair growth in mice will improve study of alopecia models and facilitate evaluation of promising new therapies in preclinical studies.
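    The scoring idea behind this method reduces to averaging light absorption over an image region. A minimal sketch, assuming an 8-bit grayscale image represented as rows of pixel values; the pixel values below are invented for illustration, not data from the study:

```python
# Grayscale scoring of hair coverage: light absorption is measured in
# rough proportion to the amount of dark hair in the imaged region.

def hair_score(gray_region):
    """Mean absorption over the region: 0 (all white) to 255 (all black)."""
    absorptions = [255 - pixel for row in gray_region for pixel in row]
    return sum(absorptions) / len(absorptions)

shaved = [[240] * 10 for _ in range(10)]  # mostly light, bare skin
furred = [[40] * 10 for _ in range(10)]   # mostly dark, full coat

print(hair_score(shaved), hair_score(furred))
```

Scores computed this way for treatment and control groups can then be compared with the standard statistical tests the abstract mentions (ANOVA, t-test).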

  15. Electron molecular ion recombination: product excitation and fragmentation.

    PubMed

    Adams, Nigel G; Poterya, Viktoriya; Babcock, Lucia M

    2006-01-01

    Electron-ion dissociative recombination is an important ionization loss process in any ionized gas containing molecular ions. This includes the interstellar medium, circumstellar shells, cometary comae, planetary ionospheres, fusion plasma boundaries, combustion flames, laser plasmas, and chemical deposition and etching plasmas. In addition to controlling the ionization density, the process generates many radical species, which can contribute to a parallel neutral chemistry. Techniques used to obtain rate data and product information (flowing afterglows and storage rings) are discussed, and recent data are reviewed, ranging from diatomic and polyatomic ions to cluster ions. The data are divided into rate coefficients and cross sections, including their temperature/energy dependencies, and quantitative identification of neutral reaction products. The latter involve both ground and electronically excited states, including vibrational excitation. The data from the different techniques are compared and trends in the data are examined. The reactions are considered in terms of the basic mechanisms (direct and indirect processes, including tunneling) and recent theoretical developments are discussed. Finally, new techniques for product identification are mentioned (electrostatic storage rings, including single and double rings; Coulomb explosion) and new ways forward are suggested.

  16. Diagnostic molecular microbiology: a 2013 snapshot.

    PubMed

    Fairfax, Marilynn Ransom; Salimnia, Hossein

    2013-12-01

    Molecular testing has a large and increasing role in the diagnosis of infectious diseases. It has evolved significantly since the first probe tests were FDA approved in the early 1990s. This article highlights the uses of molecular techniques in diagnostic microbiology, including "older," as well as innovative, probe techniques, qualitative and quantitative RT-PCR, highly multiplexed PCR panels, some of which use sealed microfluidic test cartridges, MALDI TOF, and nuclear magnetic resonance. Tests are grouped together by technique and target. Tests with similar roles for similar analytes are compared with respect to benefits, drawbacks, and possible problems. Copyright © 2013 Elsevier Inc. All rights reserved.

  17. Advances in imaging and quantification of electrical properties at the nanoscale using Scanning Microwave Impedance Microscopy (sMIM)

    NASA Astrophysics Data System (ADS)

    Friedman, Stuart; Yang, Yongliang; Amster, Oskar

    2015-03-01

    Scanning Microwave Impedance Microscopy (sMIM) is a mode for Atomic Force Microscopy (AFM) enabling imaging of unique contrast mechanisms and measurement of local permittivity and conductivity at the tens-of-nanometers length scale. Recent results will be presented illustrating high-resolution electrical features such as sub-15 nm Moiré patterns in graphene, carbon nanotubes in various electrical states, and ferroelectrics. In addition to imaging, the technique is suited to a variety of metrology applications where specific physical properties are determined quantitatively. We will present research activities on quantitative measurements using multiple techniques to determine dielectric constant (permittivity) and conductivity (e.g., dopant concentration) for a range of materials. Examples include bulk dielectrics, low-k dielectric thin films, capacitance standards, and doped semiconductors. Funded in part by DOE SBIR DE-SC0009586.

  18. Scattering matrix elements of biological particles measured in a flow through system: theory and practice.

    PubMed

    Sloot, P M; Hoekstra, A G; van der Liet, H; Figdor, C G

    1989-05-15

    Light scattering techniques (including depolarization experiments) applied to biological cells provide a fast, nondestructive probe that is very sensitive to small morphological differences. Until now, quantitative measurements of these scattering phenomena have been described only for particles in suspension. In this paper we discuss the symmetry conditions applicable to the scattering matrices of monodisperse biological cells in a flow cytometer and provide evidence that quantitative measurement of the elements of these scattering matrices is possible in flow-through systems. Two fundamental extensions to the theoretical description of conventional scattering experiments are introduced: large-cone integration of scattering signals and simultaneous implementation of the localization principle to account for scattering by a sharply focused laser beam. In addition, a specific calibration technique is proposed to account for depolarization effects of the highly specialized optics used in flow-through equipment.

  19. The potential of statistical shape modelling for geometric morphometric analysis of human teeth in archaeological research

    PubMed Central

    Fernee, Christianne; Browne, Martin; Zakrzewski, Sonia

    2017-01-01

    This paper introduces statistical shape modelling (SSM) for use in osteoarchaeology research. SSM is a full field, multi-material analytical technique, and is presented as a supplementary geometric morphometric (GM) tool. Lower mandibular canines from two archaeological populations and one modern population were sampled, digitised using micro-CT, aligned, registered to a baseline and statistically modelled using principal component analysis (PCA). Sample material properties were incorporated as a binary enamel/dentin parameter. Results were assessed qualitatively and quantitatively using anatomical landmarks. Finally, the technique’s application was demonstrated for inter-sample comparison through analysis of the principal component (PC) weights. It was found that SSM could provide high detail qualitative and quantitative insight with respect to archaeological inter- and intra-sample variability. This technique has value for archaeological, biomechanical and forensic applications including identification, finite element analysis (FEA) and reconstruction from partial datasets. PMID:29216199
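    The principal-component step at the heart of SSM can be illustrated compactly. A minimal sketch, assuming each specimen is reduced to two hypothetical shape measurements (real SSM operates on thousands of registered surface coordinates); the closed-form 2x2 eigendecomposition and the measurement values are illustrative choices, not the paper's pipeline:

```python
import math

# First-principal-component weights for a set of specimens, each
# described by two shape measurements. PC weights of this kind are
# what the paper uses for inter-sample comparison.

def pc1_weights(samples):
    n = len(samples)
    mean = [sum(col) / n for col in zip(*samples)]
    centered = [[x - m for x, m in zip(s, mean)] for s in samples]
    # Entries of the 2x2 sample covariance matrix.
    a = sum(r[0] * r[0] for r in centered) / (n - 1)
    c = sum(r[1] * r[1] for r in centered) / (n - 1)
    b = sum(r[0] * r[1] for r in centered) / (n - 1)
    # Largest eigenvalue and its eigenvector, in closed form for 2x2.
    lam = ((a + c) + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    v = (b, lam - a) if b else (1.0, 0.0)
    norm = math.hypot(*v)
    v = (v[0] / norm, v[1] / norm)
    # Project each centered specimen onto the first principal component.
    return [r[0] * v[0] + r[1] * v[1] for r in centered]

teeth = [[10.0, 20.0], [12.0, 24.0], [14.0, 28.0]]  # hypothetical measurements
print(pc1_weights(teeth))
```

Specimens with similar PC weights occupy similar positions in shape space, which is the basis of the inter-sample comparison described above.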

  20. Principles of Metamorphic Petrology

    NASA Astrophysics Data System (ADS)

    Williams, Michael L.

    2009-05-01

    The field of metamorphic petrology has seen spectacular advances in the past decade, including new X-ray mapping techniques for characterizing metamorphic rocks and minerals, new internally consistent thermobarometers, new software for constructing and viewing phase diagrams, new methods to date metamorphic processes, and perhaps most significant, revised petrologic databases and the ability to calculate accurate phase diagrams and pseudosections. These tools and techniques provide new power and resolution for constraining pressure-temperature (P-T) histories and tectonic events. Two books have been fundamental for empowering petrologists and structural geologists during the past decade. Frank Spear's Metamorphic Phase Equilibria and Pressure-Temperature-Time Paths, published in 1993, builds on his seminal papers to provide a quantitative framework for P-T path analysis. Spear's book lays the foundation for modern quantitative metamorphic analysis. Cees Passchier and Rudolph Trouw's Microtectonics, published in 2005, with its superb photos and figures, provides the tools and the theory for interpreting deformation textures and inferring deformation processes.

  1. Improvement of the analog forecasting method by using local thermodynamic data. Application to autumn precipitation in Catalonia

    NASA Astrophysics Data System (ADS)

    Gibergans-Báguena, J.; Llasat, M. C.

    2007-12-01

    The objective of this paper is to present an improvement in quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account both synoptic and local data. The method is based on analog sorting: searching a historical data file for meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, complemented with thermodynamic parameters. Thermodynamic analysis acts as a highly discriminating feature in situations where the synoptic situation fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where instability and high water vapor content are essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) was used. First, the thermodynamic parameters most discriminating for daily rainfall were selected, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia: the first based on analogies in the synoptic-scale geopotential fields; the second based exclusively on a similarity search over local thermodynamic information; and the third combining the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
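    The analog-sorting step can be sketched in a few lines. A minimal illustration, assuming Euclidean distance between flattened field patterns and a forecast taken as the mean rainfall of the nearest analog days; the toy "fields" and rainfall values below are hypothetical, not the paper's data or its actual similarity measure:

```python
import math

# Analog forecasting sketch: find the historical days whose field
# patterns are closest to today's, then forecast rainfall as the mean
# observed rainfall over those analog days.

def euclid(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def analog_forecast(today, history, k=2):
    """history: list of (field_vector, observed_rainfall_mm) records."""
    ranked = sorted(history, key=lambda rec: euclid(today, rec[0]))
    analogs = ranked[:k]
    return sum(rain for _, rain in analogs) / k

history = [
    ([5480, 120], 2.0),   # dry synoptic pattern
    ([5450, 155], 30.0),  # heavy-rain pattern
    ([5455, 150], 24.0),  # similar heavy-rain pattern
]
print(analog_forecast([5452, 152], history))  # nearest two analogs are wet days
```

Adding discriminating thermodynamic parameters, as the paper does, amounts to extending each field vector with extra (suitably scaled) components before the distance computation.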

  2. Application of Laser Induced Breakdown Spectroscopy to the identification of emeralds from different synthetic processes

    NASA Astrophysics Data System (ADS)

    Agrosì, G.; Tempesta, G.; Scandale, E.; Legnaioli, S.; Lorenzetti, G.; Pagnotta, S.; Palleschi, V.; Mangone, A.; Lezzerini, M.

    2014-12-01

    Laser Induced Breakdown Spectroscopy can make a useful contribution to mineralogy, where quantitative chemical analyses (including the evaluation of light elements) play a key role in studies of the origin of emeralds. In particular, chemical analyses can determine the trace elements, known as fingerprints, that are useful for studying provenance. Because it requires no sample preparation, this technique is particularly suitable for gemstones, which obviously must be studied non-destructively. In this paper, the LIBS technique was applied to distinguish synthetic emeralds grown by the Biron hydrothermal method from those grown by the Chatham flux method. Analyses performed by collinear double-pulse LIBS give a signal enhancement useful for quantitative chemical analysis while guaranteeing minimal sample damage. This yielded a considerable improvement in the detection limits of the trace elements, whose determination is essential for establishing the origin of an emerald gemstone. The trace elements V, Cr, and Fe and their relative amounts allowed correct attribution of the manufacturer. Two different methods for quantitative analysis were used in this study: the standard Calibration-Free LIBS (CF-LIBS) method and its recent evolution, One Point Calibration LIBS (OPC-LIBS). This is the first approach to evaluating emerald origin by means of the LIBS technique.

  3. Gene Profiling Technique to Accelerate Stem Cell Therapies for Eye Diseases

    MedlinePlus

    ... like RPE. They also use a technique called quantitative RT-PCR to measure the expression of genes ... higher in iPS cells than mature RPE. But quantitative RT-PCR only permits the simultaneous measurement of ...

  4. Quantitative determination and validation of octreotide acetate using 1H-NMR spectroscopy with an internal standard method.

    PubMed

    Yu, Chen; Zhang, Qian; Xu, Peng-Yao; Bai, Yin; Shen, Wen-Bin; Di, Bin; Su, Meng-Xiang

    2018-01-01

    Quantitative nuclear magnetic resonance (qNMR) is a well-established technique in quantitative analysis. We present a validated 1H-qNMR method for the assay of octreotide acetate, a cyclic octapeptide. Deuterium oxide was used to remove the undesired exchangeable peaks (through proton exchange) in order to isolate the quantitative signals in the crowded spectrum of the peptide and ensure precise quantitative analysis. Gemcitabine hydrochloride was chosen as a suitable internal standard. Experimental conditions, including relaxation delay time, number of scans, and pulse angle, were optimized first. Method validation was then carried out in terms of selectivity, stability, linearity, precision, and robustness. The assay result was compared with that obtained by high-performance liquid chromatography, as provided by the Chinese Pharmacopoeia. The statistical F test, Student's t test, and a nonparametric test at the 95% confidence level indicated no significant difference between the two methods. qNMR is a simple and accurate quantitative tool with no need for specific corresponding reference standards. It has potential for the quantitative analysis of other peptide drugs and the standardization of the corresponding reference standards. Copyright © 2017 John Wiley & Sons, Ltd.
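    The arithmetic of an internal-standard qNMR assay follows the standard relation P_a = (I_a/I_s)(N_s/N_a)(M_a/M_s)(m_s/m_a)P_s, where I is the integrated signal area, N the number of protons behind the integrated signal, M the molar mass, m the weighed mass, and P the purity. A minimal sketch with that relation; all numeric inputs below are hypothetical (the molar masses are approximate literature values, and none of the numbers are the paper's data):

```python
# Internal-standard qNMR purity calculation. Analyte "a" is quantified
# against internal standard "s" from integrated peak areas.

def qnmr_purity(I_a, I_s, N_a, N_s, M_a, M_s, m_a, m_s, P_s):
    """Purity of the analyte, computed relative to the internal standard."""
    return (I_a / I_s) * (N_s / N_a) * (M_a / M_s) * (m_s / m_a) * P_s

purity = qnmr_purity(
    I_a=2.000, I_s=1.000,     # integrated peak areas (hypothetical)
    N_a=2, N_s=1,             # protons per quantified signal
    M_a=1019.24, M_s=299.66,  # approx. molar masses (octreotide, gemcitabine HCl)
    m_a=34.0, m_s=10.0,       # weighed masses, mg (hypothetical)
    P_s=0.995,                # certified purity of the internal standard
)
print(round(purity, 4))
```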

  5. Characterization of Colloidal Quantum Dot Ligand Exchange by X-ray Photoelectron Spectroscopy

    NASA Astrophysics Data System (ADS)

    Atewologun, Ayomide; Ge, Wangyao; Stiff-Roberts, Adrienne D.

    2013-05-01

    Colloidal quantum dots (CQDs) are chemically synthesized semiconductor nanoparticles with size-dependent wavelength tunability. Chemical synthesis of CQDs involves the attachment of long organic surface ligands to prevent aggregation; however, these ligands also impede charge transport. Therefore, it is beneficial to exchange longer surface ligands for shorter ones for optoelectronic devices. Typical characterization techniques used to analyze surface ligand exchange include Fourier-transform infrared spectroscopy, x-ray diffraction, transmission electron microscopy, and nuclear magnetic resonance spectroscopy, yet these techniques do not provide a simultaneously direct, quantitative, and sensitive method for evaluating surface ligands on CQDs. In contrast, x-ray photoelectron spectroscopy (XPS) can provide nanoscale sensitivity for quantitative analysis of CQD surface ligand exchange. A unique aspect of this work is that a fingerprint is identified for shorter surface ligands by resolving the regional XPS spectrum corresponding to different types of carbon bonds. In addition, a deposition technique known as resonant infrared matrix-assisted pulsed laser evaporation is used to improve the CQD film uniformity such that stronger XPS signals are obtained, enabling more accurate analysis of the ligand exchange process.

  6. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, much effort has been devoted to improving MDLC-based strategies, including "top-down" and "bottom-up" approaches, to enable highly sensitive qualitative and quantitative analysis of proteins and to accelerate the whole analytical procedure. Integrated platforms combining sample pretreatment, multidimensional separations and identification have also been developed to achieve high-throughput, sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarizes the recent advances in such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Quantitative Imaging Biomarkers of NAFLD

    PubMed Central

    Kinner, Sonja; Reeder, Scott B.

    2016-01-01

    Conventional imaging modalities, including ultrasonography (US), computed tomography (CT), and magnetic resonance (MR), play an important role in the diagnosis and management of patients with nonalcoholic fatty liver disease (NAFLD) by allowing noninvasive diagnosis of hepatic steatosis. However, conventional imaging modalities are limited as biomarkers of NAFLD for various reasons. Multi-parametric quantitative MRI techniques overcome many of the shortcomings of conventional imaging and allow comprehensive and objective evaluation of NAFLD. MRI can provide unconfounded biomarkers of hepatic fat, iron, and fibrosis in a single examination—a virtual biopsy has become a clinical reality. In this article, we will review the utility and limitations of conventional US, CT, and MR imaging for the diagnosis of NAFLD. Recent advances in imaging biomarkers of NAFLD are also discussed, with an emphasis on multi-parametric quantitative MRI. PMID:26848588

  8. Molecular biology of myopia.

    PubMed

    Schaeffel, Frank; Simon, Perikles; Feldkaemper, Marita; Ohngemach, Sibylle; Williams, Robert W

    2003-09-01

    Experiments in animal models of myopia have emphasised the importance of visual input in emmetropisation, but it is also evident that the development of human myopia is influenced to some degree by genetic factors. Molecular genetic approaches can help to identify both the genes involved in the control of ocular development and the potential targets for pharmacological intervention. This review covers a variety of techniques that are being used to study the molecular biology of myopia. In the first part, we describe techniques used to analyse visually induced changes in gene expression: Northern blot, polymerase chain reaction (PCR) and real-time PCR to obtain semi-quantitative and quantitative measures of changes in the transcription level of a known gene; differential display reverse transcription PCR (DD-RT-PCR) to search for new genes that are controlled by visual input; rapid amplification of 5' cDNA ends (5'-RACE) to extend the 5' end of sequences that are regulated by visual input; in situ hybridisation to localise the expression of a given gene in a tissue; and oligonucleotide microarray assays to simultaneously test visually induced changes in thousands of transcripts in single experiments. In the second part, we describe techniques that are used to localise regions in the genome that contain genes involved in the control of eye growth and refractive errors in mice and humans. These include quantitative trait loci (QTL) mapping, exploiting experimental test crosses of mice, and transmission disequilibrium tests (TDT) in humans to find chromosomal intervals that harbour genes involved in myopia development. We review several successful applications of this battery of techniques in myopia research.
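
    Relative quantification from the real-time PCR data mentioned above is commonly reported as a 2^-ΔΔCt fold change (the Livak method, which assumes roughly 100% amplification efficiency for both target and reference genes). A minimal sketch with hypothetical Ct values:

```python
def ddct_fold_change(ct_target_treated, ct_ref_treated,
                     ct_target_control, ct_ref_control):
    """2^-ΔΔCt relative expression (Livak method).

    Normalises the target gene's Ct to a reference gene in both the
    treated and control conditions, then returns the fold change of
    the treated condition relative to control. Assumes ~100% PCR
    efficiency for both amplicons.
    """
    dct_treated = ct_target_treated - ct_ref_treated
    dct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(dct_treated - dct_control)

# Hypothetical cycle thresholds, illustration only:
fold = ddct_fold_change(ct_target_treated=24.0, ct_ref_treated=20.0,
                        ct_target_control=26.0, ct_ref_control=20.0)
```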

  9. Laryngeal Reflexes: Physiology, Technique and Clinical Use

    PubMed Central

    Ludlow, Christy L.

    2015-01-01

    This review examines the current level of knowledge and techniques available for the study of laryngeal reflexes. Overall, the larynx is under constant control of several systems (including respiration, swallowing and cough) as well as sensory-motor reflex responses involving glossopharyngeal, pharyngeal, laryngeal and tracheobronchial sensory receptors. Techniques for the clinical assessment of these reflexes are emerging and need to be examined for sensitivity and specificity in identifying laryngeal sensory disorders. Quantitative assessment methods for the diagnosis of sensory reductions as well as sensory hypersensitivity may account for laryngeal disorders such as chronic cough, paradoxical vocal fold disorder and muscular tension dysphonia. The development of accurate assessment techniques could improve our understanding of the mechanisms involved in these disorders. PMID:26241237

  10. Application of Phase-Field Techniques to Hydraulically- and Deformation-Induced Fracture.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Culp, David; Miller, Nathan; Schweizer, Laura

    Phase-field techniques provide an alternative approach to fracture problems that mitigates some of the computational expense associated with tracking the crack interface and the coalescence of individual fractures. The technique is extended to apply to hydraulically driven fracture such as would occur during fracking or CO2 sequestration. Additionally, the technique is applied to a stainless steel specimen used in the Sandia Fracture Challenge. It was found that the phase-field model performs very well, at least qualitatively, in both deformation-induced and hydraulically-induced fracture, though spurious hourglassing modes were observed during coupled hydraulically-induced fracture. Future work would include performing additional quantitative benchmark tests and updating the model as needed.
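
    As a toy illustration of the idea behind phase-field fracture (not the authors' model): damage d degrades the stored elastic energy via (1 − d)², and competes against a regularised crack energy. For a homogeneous 1-D state with an AT2-type functional, minimisation has a closed form:

```python
def at2_damage(psi, gc, ell):
    """Closed-form AT2 phase-field damage for a homogeneous 1-D state.

    Minimising (1 - d)**2 * psi + gc * d**2 / (2 * ell) over d gives
    d = 2*psi / (gc/ell + 2*psi), which grows smoothly from 0 toward 1
    as the stored energy density psi increases.

    psi: elastic energy density, gc: fracture toughness,
    ell: regularisation length (all values here are hypothetical).
    """
    return 2.0 * psi / (gc / ell + 2.0 * psi)

# Damage rises monotonically with the driving energy:
damages = [at2_damage(psi, gc=2.0, ell=1.0) for psi in (0.5, 1.0, 4.0)]
```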

  11. Characterization of shape and deformation of MEMS by quantitative optoelectronic metrology techniques

    NASA Astrophysics Data System (ADS)

    Furlong, Cosme; Pryputniewicz, Ryszard J.

    2002-06-01

    Recent technological trends based on miniaturization of mechanical, electro-mechanical, and photonic devices to the microscopic scale have led to the development of microelectromechanical systems (MEMS). Effective development of MEMS components requires the synergism of advanced design, analysis, and fabrication methodologies, and also of quantitative metrology techniques for characterizing their performance, reliability, and integrity during the electronic packaging cycle. In this paper, we describe opto-electronic techniques for measuring, with sub-micrometer accuracy, shape and changes in states of deformation of MEMS structures. With the described opto-electronic techniques, it is possible to characterize MEMS components using the display and data modes. In the display mode, interferometric information related to shape and deformation is displayed at video frame rates, providing the capability for adjusting and setting experimental conditions. In the data mode, interferometric information related to shape and deformation is recorded as high-spatial and high-digital resolution images, which are further processed to provide quantitative 3D information. Furthermore, the quantitative 3D data are exported to computer-aided design (CAD) environments and utilized for analysis and optimization of MEMS devices. Capabilities of opto-electronic techniques are illustrated with representative applications demonstrating their applicability to provide indispensable quantitative information for the effective development and optimization of MEMS devices.
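
    In reflection (double-pass) interferometry of this kind, out-of-plane displacement follows from the measured phase change as w = λ·Δφ/(4π), after the wrapped phase has been unwrapped. A minimal sketch (the wavelength and phase values are hypothetical, and the unwrapping is the naive 1-D version, not the authors' processing):

```python
import math

def unwrap(phases):
    """Naive 1-D phase unwrapping: remove 2*pi jumps between
    consecutive samples by adding the nearest multiple of 2*pi."""
    out = [phases[0]]
    for p in phases[1:]:
        d = p - out[-1]
        d -= 2.0 * math.pi * round(d / (2.0 * math.pi))
        out.append(out[-1] + d)
    return out

def deformation_nm(delta_phi, wavelength_nm=532.0):
    """Out-of-plane displacement w = lambda * delta_phi / (4*pi)
    for a double-pass (reflection) interferometer, in nanometres."""
    return wavelength_nm * delta_phi / (4.0 * math.pi)

# A full fringe (delta_phi = 4*pi) corresponds to one wavelength:
w = deformation_nm(4.0 * math.pi)
```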

  12. Less label, more free: approaches in label-free quantitative mass spectrometry.

    PubMed

    Neilson, Karlie A; Ali, Naveid A; Muralidharan, Sridevi; Mirzaei, Mehdi; Mariani, Michael; Assadourian, Gariné; Lee, Albert; van Sluyter, Steven C; Haynes, Paul A

    2011-02-01

    In this review we examine techniques, software, and statistical analyses used in label-free quantitative proteomics studies for area under the curve and spectral counting approaches. Recent advances in the field are discussed in an order that reflects a logical workflow design. Examples of studies that follow this design are presented to highlight the requirement for statistical assessment and further experiments to validate results from label-free quantitation. Limitations of label-free approaches are considered, label-free approaches are compared with labelling techniques, and forward-looking applications for label-free quantitative data are presented. We conclude that label-free quantitative proteomics is a reliable, versatile, and cost-effective alternative to labelled quantitation. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
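
    One widely used spectral-counting measure in label-free studies like those reviewed here is the normalised spectral abundance factor (NSAF), which length-normalises each protein's spectral count before normalising across the whole run. A minimal sketch with made-up counts:

```python
def nsaf(spectral_counts, lengths):
    """Normalised spectral abundance factor per protein:
    (SpC/L) for each protein, divided by the sum of SpC/L over all
    proteins identified in the run, so the values sum to 1."""
    saf = [c / l for c, l in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Hypothetical counts and protein lengths (residues):
abundances = nsaf([10, 30, 5], [100, 100, 250])
```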

  13. Applications of Microfluidics in Quantitative Biology.

    PubMed

    Bai, Yang; Gao, Meng; Wen, Lingling; He, Caiyun; Chen, Yuan; Liu, Chenli; Fu, Xiongfei; Huang, Shuqiang

    2018-05-01

    Quantitative biology is dedicated to taking advantage of quantitative reasoning and advanced engineering technologies to make biology more predictable. Microfluidics, as an emerging technique, provides new approaches to precisely control fluidic conditions on small scales and collect data in a high-throughput, quantitative manner. In this review, the authors present the relevant applications of microfluidics to quantitative biology based on two major categories (channel-based microfluidics and droplet-based microfluidics), and their typical features. We also envision other microfluidic techniques that are not yet employed in quantitative biology but have great potential in the near future. © 2017 Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences. Biotechnology Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  14. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

    Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, a conventional technique with a single exposure degrades the efficiency of tumor detection due to structure overlapping. Dual-energy techniques with energy-integrating detectors (EIDs) also cause an increase of radiation dose and an inaccuracy of material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues caused by the conventional technique and by EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved quantitative accuracy and reduced radiation dose compared with the dual-energy technique with an EID. The quantitative accuracy of the contrast-enhanced spectral mammography based on a PCD improved slightly as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.
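
    The abstract does not reproduce the polychromatic dual-energy model itself, but in its simplest (monoenergetic, two-basis-material) form dual-energy decomposition reduces to a 2×2 linear solve for the basis-material thicknesses. A sketch with hypothetical attenuation coefficients, purely for illustration:

```python
def decompose(p_low, p_high, mu1, mu2):
    """Two-basis-material dual-energy decomposition.

    Solves the linear system
        p_low  = mu1[0]*t1 + mu2[0]*t2
        p_high = mu1[1]*t1 + mu2[1]*t2
    for the basis-material thicknesses (t1, t2), where mu1 and mu2
    are (mu_low, mu_high) attenuation pairs for the two materials.
    """
    a, b = mu1[0], mu2[0]
    c, d = mu1[1], mu2[1]
    det = a * d - b * c
    t1 = (p_low * d - p_high * b) / det
    t2 = (a * p_high - c * p_low) / det
    return t1, t2

# Hypothetical coefficients (cm^-1) and measured line integrals:
t1, t2 = decompose(2.1, 0.7, mu1=(0.5, 0.3), mu2=(0.8, 0.2))
```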

  15. Analysis of airborne LiDAR surveys to quantify the characteristic morphologies of northern forested wetlands

    Treesearch

    Murray C. Richardson; Carl P. J. Mitchell; Brian A. Branfireun; Randall K. Kolka

    2010-01-01

    A new technique for quantifying the geomorphic form of northern forested wetlands from airborne LiDAR surveys is introduced, demonstrating the unprecedented ability to characterize the geomorphic form of northern forested wetlands using high-resolution digital topography. Two quantitative indices are presented, including the lagg width index (LWI) which objectively...

  16. Quantitative proteomics in Giardia duodenalis-Achievements and challenges.

    PubMed

    Emery, Samantha J; Lacey, Ernest; Haynes, Paul A

    2016-08-01

    Giardia duodenalis (syn. G. lamblia and G. intestinalis) is a protozoan parasite of vertebrates and a major contributor to the global burden of diarrheal diseases and gastroenteritis. The publication of multiple genome sequences in the G. duodenalis species complex has provided important insights into parasite biology, and made post-genomic technologies, including proteomics, significantly more accessible. The aims of proteomics are to identify and quantify proteins present in a cell, and assign functions to them within the context of dynamic biological systems. In Giardia, proteomics in the post-genomic era has transitioned from reliance on gel-based systems to utilisation of a diverse array of techniques based on bottom-up LC-MS/MS technologies. Together, these have generated crucial foundations for subcellular proteomes, elucidated intra- and inter-assemblage isolate variation, and identified pathways and markers in differentiation, host-parasite interactions and drug resistance. However, in Giardia, proteomics remains an emerging field, with considerable shortcomings evident from the published research. These include a bias towards assemblage A, a lack of emphasis on quantitative analytical techniques, and limited information on post-translational protein modifications. Additionally, there are multiple areas of research for which proteomic data is not available to add value to published transcriptomic data. The challenge of amalgamating data in the systems biology paradigm necessitates the further generation of large, high-quality quantitative datasets to accurately model parasite biology. This review surveys the current proteomic research available for Giardia and evaluates their technical and quantitative approaches, while contextualising their biological insights into parasite pathology, isolate variation and eukaryotic evolution. 
Finally, we propose areas of priority for the generation of future proteomic data to explore fundamental questions in Giardia, including the analysis of post-translational modifications, and the design of MS-based assays for validation of differentially expressed proteins in large datasets. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Flexible automated approach for quantitative liquid handling of complex biological samples.

    PubMed

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  18. Antibodies against toluene diisocyanate protein conjugates. Three methods of measurement.

    PubMed

    Patterson, R; Harris, K E; Zeiss, C R

    1983-12-01

    With the use of canine antisera against toluene diisocyanate (TDI)-dog serum albumin (DSA), techniques for measuring antibody against TDI-DSA were evaluated. The use of an ammonium sulfate precipitation assay showed suggestive evidence of antibody binding but high levels of TDI-DSA precipitation in the absence of antibody limit any usefulness of this technique. Double-antibody co-precipitation techniques will measure total antibody or Ig class antibody against 125I-TDI-DSA. These techniques are quantitative. The polystyrene tube radioimmunoassay is a highly sensitive method of detecting and quantitatively estimating IgG antibody. The enzyme linked immunosorbent assay is a rapidly adaptable method for the quantitative estimation of IgG, IgA, and IgM against TDI-homologous proteins. All these techniques were compared and results are demonstrated by using the same serum sample for analysis.

  19. Ultrasound-guided injection for MR arthrography of the hip: comparison of two different techniques.

    PubMed

    Kantarci, Fatih; Ozbayrak, Mustafa; Gulsen, Fatih; Gencturk, Mert; Botanlioglu, Huseyin; Mihmanli, Ismail

    2013-01-01

    The purpose of this study was to prospectively evaluate two different ultrasound-guided injection techniques for MR arthrography of the hip. Fifty-nine consecutive patients (21 men, 38 women) referred for MR arthrography of the hip were prospectively included in the study. Three patients underwent bilateral MR arthrography. The two injection techniques were compared quantitatively and qualitatively. Quantitative analysis was performed by comparison of the injected contrast material volume into the hip joint. Qualitative analysis was performed with regard to extraarticular leakage of contrast material into the soft tissues. Extraarticular leakage of contrast material was graded as none, minimal, moderate, or severe according to the MR images. Each patient rated discomfort after the procedure using a visual analogue scale (VAS). The injected contrast material volume was lower with the femoral head puncture technique (mean 8.9 ± 3.4 ml) than with the femoral neck puncture technique (mean 11.2 ± 2.9 ml) (p < 0.05). The chi-squared test showed significantly more contrast leakage with the femoral head puncture technique (p < 0.05). Statistical analysis showed no difference between the head and neck puncture groups in terms of feeling of pain (p = 0.744) or in the body mass index (p = 0.658) of the patients. The femoral neck injection technique provides a higher intraarticular contrast volume and produces less extraarticular contrast leakage than the femoral head injection technique when US guidance is used for MR arthrography of the hip.

  20. Quantitative shear wave imaging optical coherence tomography for noncontact mechanical characterization of myocardium

    NASA Astrophysics Data System (ADS)

    Wang, Shang; Lopez, Andrew L.; Morikawa, Yuka; Tao, Ge; Li, Jiasong; Larina, Irina V.; Martin, James F.; Larin, Kirill V.

    2015-03-01

    Optical coherence elastography (OCE) is an emerging low-coherence imaging technique that provides noninvasive assessment of tissue biomechanics with high spatial resolution. Among various OCE methods, the capability of quantitative measurement of tissue elasticity is of great importance for tissue characterization and pathology detection across different samples. Here we report a quantitative OCE technique, termed quantitative shear wave imaging optical coherence tomography (Q-SWI-OCT), which enables noncontact measurement of tissue Young's modulus based on ultra-fast imaging of shear wave propagation inside the sample. A focused air-puff device is used to interrogate the tissue with a low-pressure, short-duration air stream that stimulates a localized displacement on the micron scale. The propagation of this tissue deformation in the form of a shear wave is captured by a phase-sensitive OCT system running M-mode imaging over the path of the wave propagation. The temporal characteristics of the shear wave are quantified based on the cross-correlation of the tissue deformation profiles at all the measurement locations, and linear regression is utilized to fit the data plotted in the domain of time delay versus wave propagation distance. The wave group velocity is thus calculated, which results in the quantitative measurement of the Young's modulus. As a feasibility demonstration, experiments are performed on tissue-mimicking phantoms with different agar concentrations, and the quantified elasticity values from Q-SWI-OCT agree well with uniaxial compression tests. For functional characterization of myocardium with this OCE technique, we perform pilot experiments on ex vivo mouse cardiac muscle tissues with two studies: 1) the elasticity difference of cardiac muscle under relaxed and contracted conditions, and 2) the mechanical heterogeneity of the heart introduced by muscle fiber orientation.
Our results suggest the potential of using Q-SWI-OCT as an essential tool for nondestructive biomechanical evaluation of myocardium.
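
    The velocity-fitting step described above (linear regression of propagation distance against time delay, then an elastic modulus from the group velocity) can be sketched as follows. The E ≈ 3ρc² conversion assumes near-incompressible soft tissue (Poisson ratio ≈ 0.5), and the data points are hypothetical:

```python
def shear_wave_youngs_modulus(distances_mm, delays_ms, rho=1000.0):
    """Young's modulus from shear-wave arrival data.

    Fits the least-squares slope of distance (mm) vs. time delay (ms),
    which is the group velocity c in mm/ms (numerically equal to m/s),
    then applies E ~= 3 * rho * c**2 for near-incompressible tissue
    (nu ~ 0.5, so E ~= 3*mu with mu = rho*c**2). Returns E in Pa.
    """
    n = len(delays_ms)
    mx = sum(delays_ms) / n
    my = sum(distances_mm) / n
    slope = sum((x - mx) * (y - my)
                for x, y in zip(delays_ms, distances_mm)) \
          / sum((x - mx) ** 2 for x in delays_ms)
    return 3.0 * rho * slope * slope

# Hypothetical arrivals: wave reaches 1 mm every 0.5 ms (c = 2 m/s):
E = shear_wave_youngs_modulus([0.0, 1.0, 2.0, 3.0], [0.0, 0.5, 1.0, 1.5])
```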

  1. Quantitative elasticity measurement of urinary bladder wall using laser-induced surface acoustic waves.

    PubMed

    Li, Chunhui; Guan, Guangying; Zhang, Fan; Song, Shaozhen; Wang, Ruikang K; Huang, Zhihong; Nabi, Ghulam

    2014-12-01

    The maintenance of urinary bladder elasticity is essential to its functions, including the storage and voiding phases of the micturition cycle. The bladder stiffness can be changed by various pathophysiological conditions. Quantitative measurement of bladder elasticity is an essential step toward understanding various urinary bladder disease processes and improving patient care. As a nondestructive, and noncontact method, laser-induced surface acoustic waves (SAWs) can accurately characterize the elastic properties of different layers of organs such as the urinary bladder. This initial investigation evaluates the feasibility of a noncontact, all-optical method of generating and measuring the elasticity of the urinary bladder. Quantitative elasticity measurements of ex vivo porcine urinary bladder were made using the laser-induced SAW technique. A pulsed laser was used to excite SAWs that propagated on the bladder wall surface. A dedicated phase-sensitive optical coherence tomography (PhS-OCT) system remotely recorded the SAWs, from which the elasticity properties of different layers of the bladder were estimated. During the experiments, series of measurements were performed under five precisely controlled bladder volumes using water to estimate changes in the elasticity in relation to various urinary bladder contents. The results, validated by optical coherence elastography, show that the laser-induced SAW technique combined with PhS-OCT can be a feasible method of quantitative estimation of biomechanical properties.

  2. Rationalising the 'irrational': a think aloud study of discrete choice experiment responses.

    PubMed

    Ryan, Mandy; Watson, Verity; Entwistle, Vikki

    2009-03-01

    Stated preference methods assume respondents' preferences are consistent with utility theory, but many empirical studies report evidence of preferences that violate utility theory. This evidence is often derived from quantitative tests that occur naturally within, or are added to, stated preference tasks. In this study, we use qualitative methods to explore three axioms of utility theory: completeness, monotonicity, and continuity. We take a novel approach, adopting a 'think aloud' technique to identify violations of the axioms of utility theory and to consider how well the quantitative tests incorporated within a discrete choice experiment are able to detect these. Results indicate that quantitative tests classify respondents as being 'irrational' when qualitative statements would indicate they are 'rational'. In particular, 'non-monotonic' responses can often be explained by respondents inferring additional information beyond what is presented in the task, and individuals who appear to adopt non-compensatory decision-making strategies do so because they rate particular attributes very highly (they are not attempting to simplify the task). The results also provide evidence of 'cost-based responses': respondents assumed tests with higher costs would be of higher quality. The value of including in-depth qualitative validation techniques in the development of stated preference tasks is shown.

  3. [The validation of kit of reagents for quantitative detection of DNA of human cytomegalovirus in biological material using polymerase chain reaction technique in real time operation mode].

    PubMed

    Sil'veĭstrova, O Iu; Domonova, É A; Shipulina, O Iu

    2014-04-01

    Validation of a reagent kit intended for the detection and quantitative evaluation of human cytomegalovirus (HCMV) DNA in biological material by real-time polymerase chain reaction was carried out. The reagent kit "AmpliSens CMV-screen/monitor-FL" and the enterprise standard sample of HCMV DNA (Central Research Institute of Epidemiology, Rospotrebnadzor) were compared against the first WHO international standard for human cytomegalovirus. Fivefold dilutions of the international WHO standard and of the enterprise standard sample were prepared at HCMV DNA concentrations from 10^6 to 10^2. PCR setup and analysis of the results were performed on a programmable amplifier with real-time fluorescence detection, the "Rotor-Gene Q" ("Qiagen", Germany). Over a total of three series of experiments, covering all stages of the PCR study, a conversion coefficient of 0.6 for translating the quantitative evaluation of HCMV DNA from copies/ml to IU/ml was established for this reagent kit.

  4. Susceptibility-Weighted Imaging and Quantitative Susceptibility Mapping in the Brain

    PubMed Central

    Liu, Chunlei; Li, Wei; Tong, Karen A.; Yeom, Kristen W.; Kuzminski, Samuel

    2015-01-01

    Susceptibility-weighted imaging (SWI) is a magnetic resonance imaging (MRI) technique that enhances image contrast by using the susceptibility differences between tissues. It is created by combining both magnitude and phase from the gradient echo data. SWI is sensitive to both paramagnetic and diamagnetic substances, which generate phase shifts of opposite sign in MRI data. SWI images can be displayed as a minimum intensity projection that provides high-resolution delineation of the cerebral venous architecture, a feature that is not available in other MRI techniques. As such, SWI has been widely applied to diagnose various venous abnormalities. SWI is especially sensitive to deoxygenated blood and intracranial mineral deposition and, for that reason, has been applied to image various pathologies including intracranial hemorrhage, traumatic brain injury, stroke, neoplasm, and multiple sclerosis. SWI, however, does not provide quantitative measures of magnetic susceptibility. This limitation is currently being addressed with the development of quantitative susceptibility mapping (QSM) and susceptibility tensor imaging (STI). While QSM treats susceptibility as isotropic, STI treats susceptibility as generally anisotropic, characterized by a tensor quantity. This article reviews the basic principles of SWI, its clinical and research applications, the mechanisms governing brain susceptibility properties, and its practical implementation, with a focus on brain imaging. PMID:25270052
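
    The minimum intensity projection mentioned above is simply a pixel-wise minimum taken across a stack of slices, so dark veins survive the projection. A minimal pure-Python sketch (real pipelines operate on full 3-D image arrays):

```python
def min_intensity_projection(volume):
    """Minimum intensity projection across the slice axis.

    volume: a list of 2-D slices, each a list of rows of pixel values.
    Returns a single 2-D image whose pixel (r, c) is the minimum of
    that pixel over all slices, so low-intensity structures (e.g.
    veins in SWI) are preserved in the projection.
    """
    rows, cols = len(volume[0]), len(volume[0][0])
    return [[min(sl[r][c] for sl in volume) for c in range(cols)]
            for r in range(rows)]

# Two tiny 2x2 slices for illustration:
mip = min_intensity_projection([[[1, 5], [3, 4]],
                                [[2, 0], [9, 1]]])
```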

  5. Magnetoresistive biosensors for quantitative proteomics

    NASA Astrophysics Data System (ADS)

    Zhou, Xiahan; Huang, Chih-Cheng; Hall, Drew A.

    2017-08-01

    Quantitative proteomics, as a developing method for the study of proteins and the identification of diseases, reveals more comprehensive and accurate information about an organism than traditional genomics. A variety of platforms, such as mass spectrometry, optical sensors, electrochemical sensors, magnetic sensors, etc., have been developed for detecting proteins quantitatively. The sandwich immunoassay is widely used as a labeled detection method due to its high specificity and flexibility, allowing multiple different types of labels. While optical sensors use enzyme and fluorophore labels to detect proteins with high sensitivity, they often suffer from high background signal and challenges in miniaturization. Magnetic biosensors, including nuclear magnetic resonance sensors, oscillator-based sensors, Hall-effect sensors, and magnetoresistive sensors, use the specific binding events between magnetic nanoparticles (MNPs) and target proteins to measure the analyte concentration. Compared with other biosensing techniques, magnetic sensors take advantage of the intrinsic lack of magnetic signatures in biological samples to achieve high sensitivity and high specificity, and are compatible with semiconductor-based fabrication processes, enabling low cost and small size for point-of-care (POC) applications. Although still in the development stage, magnetic biosensing is a promising technique for in-home testing and portable disease monitoring.

  6. Quantitative methods for compensation of matrix effects and self-absorption in Laser Induced Breakdown Spectroscopy signals of solids

    NASA Astrophysics Data System (ADS)

    Takahashi, Tomoko; Thornton, Blair

    2017-12-01

    This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract related information to compositions from all spectral data, are widely established methods and have been applied to various fields including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
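
    The NRMSE figure of merit recommended above is the root mean square error normalised to the reference data; one common convention, used in the sketch below, normalises by the range of the reference values (normalising by the mean is another option):

```python
def nrmse(predicted, reference):
    """Normalised root mean square error.

    RMSE between predicted and reference compositions, divided by the
    range (max - min) of the reference values. Other normalisations
    (e.g. by the reference mean) are also in use.
    """
    n = len(reference)
    rmse = (sum((p - r) ** 2
                for p, r in zip(predicted, reference)) / n) ** 0.5
    return rmse / (max(reference) - min(reference))

# Hypothetical LIBS concentration estimates vs. certified values (wt%):
err = nrmse([1.0, 2.1, 3.9], [1.0, 2.0, 4.0])
```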

  7. Standardized pivot shift test improves measurement accuracy.

    PubMed

    Hoshino, Yuichi; Araujo, Paulo; Ahlden, Mattias; Moore, Charity G; Kuroda, Ryosuke; Zaffagnini, Stefano; Karlsson, Jon; Fu, Freddie H; Musahl, Volker

    2012-04-01

    The variability of the pivot shift test techniques greatly interferes with achieving a quantitative and generally comparable measurement. The purpose of this study was to compare the variation of the quantitative pivot shift measurements with different surgeons' preferred techniques to a standardized technique. The hypothesis was that standardizing the pivot shift test would improve consistency in the quantitative evaluation when compared with surgeon-specific techniques. A whole lower body cadaveric specimen was prepared to have a low-grade pivot shift on one side and a high-grade pivot shift on the other side. Twelve expert surgeons performed the pivot shift test using (1) their preferred technique and (2) a standardized technique. Electromagnetic tracking was utilized to measure anterior tibial translation and acceleration of the reduction during the pivot shift test. The variation of the measurement was compared between the surgeons' preferred technique and the standardized technique. The anterior tibial translation during the pivot shift test was similar between the surgeons' preferred technique (left 24.0 ± 4.3 mm; right 15.5 ± 3.8 mm) and the standardized technique (left 25.1 ± 3.2 mm; right 15.6 ± 4.0 mm; n.s.). However, the variation in acceleration was significantly smaller with the standardized technique (left 3.0 ± 1.3 mm/s²; right 2.5 ± 0.7 mm/s²) compared with the surgeons' preferred technique (left 4.3 ± 3.3 mm/s²; right 3.4 ± 2.3 mm/s²; both P < 0.01). Standardizing the pivot shift test maneuver provides a more consistent quantitative evaluation and may be helpful in designing future multicenter clinical outcome trials. Diagnostic study, Level I.

  8. [The role of endotracheal aspirate culture in the diagnosis of ventilator-associated pneumonia: a meta-analysis].

    PubMed

    Wang, Fei; He, Bei

    2013-01-01

    To investigate the role of endotracheal aspirate (EA) culture in the diagnosis and antibiotic management of ventilator-associated pneumonia (VAP), we searched the CNKI, Wanfang, PubMed and EMBASE databases for literature published from January 1990 to December 2011 on VAP microbiological diagnostic techniques, including EA and bronchoalveolar lavage fluid (BALF) culture. The following key words were used: ventilator associated pneumonia, diagnosis, and adult. Meta-analysis was performed and the sensitivity and specificity of EA for VAP diagnosis were calculated. Our literature search identified 1665 potential articles, 8 of which fulfilled our selection criteria, including 561 patients with paired cultures. Using BALF quantitative culture as the reference standard, the sensitivity and specificity of EA were 72% and 71%. When considering quantitative culture of EA only, the sensitivity and specificity improved to 90% and 65%, while the positive and negative predictive values were 68% and 89%, respectively. However, the sensitivity and specificity of semi-quantitative culture of EA were only 50% and 80%, with a positive predictive value of 77% and a negative predictive value of 58%, respectively. EA culture had relatively poor sensitivity and specificity overall, although quantitative EA culture improved the sensitivity. Initiating therapy on the basis of EA quantitative culture may still result in excessive antibiotic usage. Our data suggest that EA can provide some information for clinical decisions but cannot replace BALF quantitative culture in VAP diagnosis.
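
    Predictive values like those reported above follow from sensitivity, specificity and disease prevalence via Bayes' rule. A minimal sketch (the prevalence here is illustrative, since the pooled prevalence is not restated in the abstract):

```python
def predictive_values(sens, spec, prevalence):
    """Positive and negative predictive values from test
    sensitivity, specificity and disease prevalence (Bayes' rule)."""
    tp = sens * prevalence              # true-positive fraction
    fp = (1 - spec) * (1 - prevalence)  # false-positive fraction
    fn = (1 - sens) * prevalence        # false-negative fraction
    tn = spec * (1 - prevalence)        # true-negative fraction
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return ppv, npv

# Illustrative: sensitivity 90%, specificity 65%, assumed 50% VAP prevalence
ppv, npv = predictive_values(0.90, 0.65, 0.50)
print(round(ppv, 2), round(npv, 2))
```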

  9. Sweat testing to evaluate autonomic function

    PubMed Central

    Illigens, Ben M.W.; Gibbons, Christopher H.

    2011-01-01

    Sudomotor dysfunction is one of the earliest detectable neurophysiologic abnormalities in distal small fiber neuropathy. Traditional neurophysiologic measurements of sudomotor function include thermoregulatory sweat testing (TST), quantitative sudomotor axon reflex testing (QSART), silicone impressions, the sympathetic skin response (SSR), and the recent addition of quantitative direct and indirect axon reflex testing (QDIRT). These testing techniques, when used in combination, can detect and localize pre- and postganglionic lesions, can provide early diagnosis of sudomotor dysfunction, and can monitor disease progression or recovery. In this article, we review the common tests available for assessment of sudomotor function, detail the testing methodology, discuss the limitations and provide examples of test results. PMID:18989618

  10. Quantitative three-dimensional low-speed wake surveys

    NASA Technical Reports Server (NTRS)

    Brune, G. W.

    1992-01-01

    Theoretical and practical aspects of conducting three-dimensional wake measurements in large wind tunnels are reviewed with emphasis on applications in low-speed aerodynamics. Such quantitative wake surveys not only furnish separate values for the components of drag, such as profile drag and induced drag, but also measure lift without the use of a balance. In addition to global data, details of the wake flowfield as well as spanwise distributions of lift and drag are obtained. The paper demonstrates the value of this measurement technique using data from wake measurements conducted by Boeing on a variety of low-speed configurations, including the complex high-lift system of a transport aircraft.

  11. Microstructural study of the nickel-base alloy WAZ-20 using qualitative and quantitative electron optical techniques

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1973-01-01

    The NASA nickel-base alloy WAZ-20 was analyzed by advanced metallographic techniques to qualitatively and quantitatively characterize its phases and stability. The as-cast alloy contained primary gamma-prime, a coarse gamma-gamma prime eutectic, a gamma-fine gamma prime matrix, and MC carbides. A specimen aged at 870 C for 1000 hours contained these same constituents and a few widely scattered high W particles. No detrimental phases (such as sigma or mu) were observed. Scanning electron microscope, light metallography, and replica electron microscope methods are compared. The value of quantitative electron microprobe techniques such as spot and area analysis is demonstrated.

  12. Surface and Flow Field Measurements on the FAITH Hill Model

    NASA Technical Reports Server (NTRS)

    Bell, James H.; Heineck, James T.; Zilliac, Gregory; Mehta, Rabindra D.; Long, Kurtis R.

    2012-01-01

    A series of experimental tests, using both qualitative and quantitative techniques, were conducted to characterize both surface and off-surface flow characteristics of an axisymmetric, modified-cosine-shaped, wall-mounted hill named "FAITH" (Fundamental Aero Investigates The Hill). Two separate models were employed: a 6" high, 18" base diameter machined aluminum model that was used for wind tunnel tests and a smaller scale (2" high, 6" base diameter) sintered nylon version that was used in the water channel facility. Wind tunnel and water channel tests were conducted at mean test section speeds of 165 fps (Reynolds Number based on height = 500,000) and 0.1 fps (Reynolds Number of 1000), respectively. The ratio of model height to boundary layer height was approximately 3 for both tests. Qualitative techniques that were employed to characterize the complex flow included surface oil flow visualization for the wind tunnel tests and dye injection for the water channel tests. Quantitative techniques that were employed to characterize the flow included a Cobra Probe to determine point-wise steady and unsteady 3D velocities, Particle Image Velocimetry (PIV) to determine 3D velocities and turbulence statistics along specified planes, Pressure Sensitive Paint (PSP) to determine mean surface pressures, and Fringe Imaging Skin Friction (FISF) to determine surface skin friction (magnitude and direction). This initial report summarizes the experimental set-up, techniques used, and data acquired, and describes some details of the dataset that is being constructed for use by other researchers, especially the CFD community. Subsequent reports will discuss the data and their interpretation in more detail.

  13. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  14. Protocol for Standardizing High-to-Moderate Abundance Protein Biomarker Assessments Through an MRM-with-Standard-Peptides Quantitative Approach.

    PubMed

    Percy, Andrew J; Yang, Juncong; Chambers, Andrew G; Mohammed, Yassene; Miliotis, Tasso; Borchers, Christoph H

    2016-01-01

    Quantitative mass spectrometry (MS)-based approaches are emerging as a core technology for addressing health-related queries in systems biology and in the biomedical and clinical fields. In several 'omics disciplines (proteomics included), an approach centered on selected or multiple reaction monitoring (SRM or MRM)-MS with stable isotope-labeled standards (SIS), at the protein or peptide level, has emerged as the most precise technique for quantifying and screening putative analytes in biological samples. To enable the widespread use of MRM-based protein quantitation for disease biomarker assessment studies and its ultimate acceptance for clinical analysis, the technique must be standardized to facilitate precise and accurate protein quantitation. To that end, we have developed a number of kits for assessing method/platform performance, as well as for screening proposed candidate protein biomarkers in various human biofluids. Collectively, these kits utilize a bottom-up LC-MS methodology with SIS peptides as internal standards and quantify proteins using regression analysis of standard curves. This chapter details the methodology used to quantify 192 plasma proteins of high-to-moderate abundance (covering a 6-order-of-magnitude range, from 31 mg/mL for albumin to 18 ng/mL for peroxiredoxin-2), and a 21-protein subset thereof. We also describe the application of this method to patient samples for biomarker discovery and verification studies. Additionally, we introduce our recently developed Qualis-SIS software, which is used to expedite the analysis and assessment of protein quantitation data in control and patient samples.
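
    Quantifying proteins by regression against a standard curve, as described above, can be sketched as follows (the concentrations, peak-area ratios and linear-response assumption are illustrative, not values from the kits):

```python
import numpy as np

# Standard-curve points: known concentrations (fmol/uL) of the standard
# vs. measured MRM peak-area ratios (light/heavy). Values are illustrative.
conc  = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
ratio = np.array([0.02, 0.10, 0.21, 1.01, 1.98])

slope, intercept = np.polyfit(conc, ratio, 1)   # linear standard curve

def quantify(measured_ratio):
    """Back-calculate concentration from a measured peak-area ratio."""
    return (measured_ratio - intercept) / slope

print(round(float(quantify(0.50)), 1))
```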

  15. Molecular imaging of melanin distribution in vivo and quantitative differential diagnosis of human pigmented lesions using label-free harmonic generation biopsy (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Sun, Chi-Kuang; Wei, Ming-Liang; Su, Yu-Hsiang; Weng, Wei-Hung; Liao, Yi-Hua

    2017-02-01

    Harmonic generation microscopy is a noninvasive repetitive imaging technique that provides real-time 3D microscopic images of human skin with sub-femtoliter resolution and high penetration down to the reticular dermis. In this talk, we show that with a strong resonance effect, the third-harmonic-generation (THG) modality provides enhanced contrast on melanin and allows not only differential diagnosis of various pigmented skin lesions but also quantitative imaging for long-term tracking. This unique capability makes THG microscopy the only label-free technique capable of identifying the active melanocytes in human skin and of imaging their different dendriticity patterns. We will review our recent efforts to image melanin distribution in vivo and to quantitatively diagnose pigmented skin lesions using label-free harmonic generation biopsy. This talk will first cover the spectroscopic study of the melanin-enhanced THG effect in human cells and the calibration strategy inside human skin for quantitative imaging. We will then review our recent clinical trials, including a differential-diagnosis capability study on pigmented skin tumors and a quantitative virtual-biopsy study of pre- and post-treatment evaluation of melasma and solar lentigo. Our study indicates the unmatched capability of harmonic generation microscopy to perform virtual biopsy for noninvasive histopathological diagnosis of various pigmented skin tumors, as well as its capability to noninvasively reveal the pathological origin of different hyperpigmentary diseases on the human face and to monitor the efficacy of laser depigmentation treatments. This work is sponsored by National Health Research Institutes.

  16. Phenomenological plasmon broadening and relation to the dispersion

    NASA Astrophysics Data System (ADS)

    Hobbiger, Raphael; Drachta, Jürgen T.; Kreil, Dominik; Böhm, Helga M.

    2017-02-01

    Pragmatic ways of including lifetime broadening of collective modes in the electron liquid are critically compared. Special focus lies on the impact of the damping parameter on the dispersion. This is quantitatively exemplified for the two-dimensional case, for both the charge ('sheet') plasmon and the spin-density plasmon. The predicted deviations fall within the resolution limits of advanced techniques.

  17. Projecting technology change to improve space technology planning and systems management

    NASA Astrophysics Data System (ADS)

    Walk, Steven Robert

    2011-04-01

    Projecting technology performance evolution has been improving over the years. Reliable quantitative forecasting methods have been developed that project the growth, diffusion, and performance of technology in time, including projecting technology substitutions, saturation levels, and performance improvements. These forecasts can be applied at the early stages of space technology planning to better predict available future technology performance, assure the successful selection of technology, and improve technology systems management strategy. Often what is published as a technology forecast is simply scenario planning, usually made by extrapolating current trends into the future, with perhaps some subjective insight added. Typically, the accuracy of such predictions falls rapidly with distance in time. Quantitative technology forecasting (QTF), on the other hand, includes the study of historic data to identify one of or a combination of several recognized universal technology diffusion or substitution patterns. In the same manner that quantitative models of physical phenomena provide excellent predictions of system behavior, so do QTF models provide reliable technological performance trajectories. In practice, a quantitative technology forecast is completed to ascertain with confidence when the projected performance of a technology or system of technologies will occur. Such projections provide reliable time-referenced information when considering cost and performance trade-offs in maintaining, replacing, or migrating a technology, component, or system. This paper introduces various quantitative technology forecasting techniques and illustrates their practical application in space technology and technology systems management.
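
    One of the recognized universal substitution patterns mentioned above is the Fisher-Pry logistic model, which becomes linear in time under a logit transform. A minimal fitting sketch with illustrative adoption data (not data from the paper):

```python
import numpy as np

# Fisher-Pry model: the market fraction f(t) of a substituting technology
# follows a logistic curve, so ln(f / (1 - f)) is linear in time.
years = np.array([2000, 2002, 2004, 2006, 2008])
frac  = np.array([0.05, 0.12, 0.27, 0.50, 0.73])   # illustrative adoption data

z = np.log(frac / (1 - frac))      # logit transform linearises the logistic
b, a = np.polyfit(years, z, 1)     # slope b and intercept a of the fit

def forecast(year):
    """Projected market fraction of the substituting technology."""
    return 1.0 / (1.0 + np.exp(-(a + b * year)))

print(round(float(forecast(2010)), 2))
```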

  18. Quantitative NDA measurements of advanced reprocessing product materials containing uranium, neptunium, plutonium, and americium

    NASA Astrophysics Data System (ADS)

    Goddard, Braden

    The ability of inspection agencies and facility operators to measure powders containing several actinides is increasingly necessary as new reprocessing techniques and fuel forms are being developed. These powders are difficult to measure with nondestructive assay (NDA) techniques because neutrons emitted from induced and spontaneous fission of different nuclides are very similar. A neutron multiplicity technique based on first-principle methods was developed to measure these powders by exploiting isotope-specific nuclear properties, such as the energy-dependent fission cross sections and the neutron-induced fission neutron multiplicity. This technique was tested through extensive simulations using the Monte Carlo N-Particle eXtended (MCNPX) code and by one measurement campaign using the Active Well Coincidence Counter (AWCC) and two measurement campaigns using the Epithermal Neutron Multiplicity Counter (ENMC) with various (alpha,n) sources and actinide materials. Four potential applications of this first-principle technique have been identified: (1) quantitative measurement of uranium, neptunium, plutonium, and americium materials; (2) quantitative measurement of mixed oxide (MOX) materials; (3) quantitative measurement of uranium materials; and (4) weapons verification in arms control agreements. This technique still faces several challenges, the largest being the need for high-precision active and passive measurements that produce results with acceptably small uncertainties.

  19. Analysis of atomic force microscopy data for surface characterization using fuzzy logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.

    2011-07-15

    In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures, and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which quantitatively identifies particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research Highlights: (1) A fuzzy logic analysis technique capable of characterizing AFM images of thin films. (2) The technique is applicable to different surfaces regardless of their densities. (3) It does not require manual adjustment of the algorithm parameters. (4) It can quantitatively capture differences between surfaces. (5) It yields more realistic structure boundaries compared to other methods.
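
    The classification step described above can be illustrated with a toy fuzzy-style classifier for a 1-D line scan. The membership functions below are assumptions made for illustration only; they are not the FIE rules of the paper:

```python
import numpy as np

def classify_profile(heights):
    """Label each point of a 1-D AFM line scan as top/bottom/uphill/downhill
    from fuzzy-style memberships on normalised height and local slope
    (the membership shapes here are illustrative assumptions)."""
    h = np.asarray(heights, dtype=float)
    hn = (h - h.min()) / (h.max() - h.min())         # height scaled to [0, 1]
    slope = np.gradient(h)
    sn = np.clip(slope / (np.abs(slope).max() or 1.0), -1, 1)  # slope in [-1, 1]

    labels = []
    for height, s in zip(hn, sn):
        memberships = {
            "top":      height * (1 - abs(s)),        # high and flat
            "bottom":   (1 - height) * (1 - abs(s)),  # low and flat
            "uphill":   max(s, 0.0),                  # rising
            "downhill": max(-s, 0.0),                 # falling
        }
        labels.append(max(memberships, key=memberships.get))
    return labels

print(classify_profile([0.0, 0.1, 1.0, 2.0, 2.1, 2.0, 1.0, 0.1]))
```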

  20. Use of multidimensional, multimodal imaging and PACS to support neurological diagnoses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, S.T.C.; Knowlton, R.; Hoo, K.S.

    1995-12-31

    Technological advances in brain imaging have revolutionized diagnosis in neurology and neurological surgery. Major imaging techniques include magnetic resonance imaging (MRI) to visualize structural anatomy, positron emission tomography (PET) to image metabolic function and cerebral blood flow, magnetoencephalography (MEG) to visualize the location of physiologic current sources, and magnetic resonance spectroscopy (MRS) to measure specific biochemicals. Each of these techniques studies different biomedical aspects of the brain, but an effective means to quantify and correlate the disparate imaging datasets, in order to improve clinical decision making, has been lacking. This paper describes several techniques developed in a UNIX-based neurodiagnostic workstation to aid the non-invasive presurgical evaluation of epilepsy patients. These techniques include on-line access to the picture archiving and communication systems (PACS) multimedia archive, coregistration of multimodality image datasets, and correlation and quantification of structural and functional information contained in the registered images. For illustration, the authors describe the use of these techniques in a patient case of non-lesional neocortical epilepsy. They also outline future work based on preliminary studies.

  1. EFSUMB guidelines 2011: comment on emergent indications and visions.

    PubMed

    Dietrich, C F; Cui, X W; Barreiros, A P; Hocke, M; Ignee, A

    2012-07-01

    The focus of this article is the emergent and potential indications of contrast-enhanced ultrasound (CEUS). Emergent applications of CEUS techniques include extravascular and intracavitary contrast-enhanced ultrasound, quantitative assessment of microvascular circulation for tumor response assessment, and tumor characterization using dynamic contrast-enhanced ultrasound (DCE-US). Potential indications for microbubble agents include novel molecular imaging and drug and gene delivery techniques, which have been successfully tested in animal models. "Comments and Illustrations of the European Federation of Societies for Ultrasound in Medicine and Biology (EFSUMB) Non-Liver Guidelines 2011" which focus more on established applications are published in the same supplement to Ultraschall in der Medizin (European Journal of Ultrasound). © Georg Thieme Verlag KG Stuttgart · New York.

  2. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  3. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  4. Quantitative Hyperspectral Reflectance Imaging

    PubMed Central

    Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.

    2008-01-01

    Hyperspectral imaging is a non-destructive optical analysis technique that can, for instance, be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. The technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and to study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength-tunable narrow-bandwidth light source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects on artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
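
    Extracting a mean spectral reflectance curve from a user-defined region-of-interest, as described above, reduces to averaging the calibrated hypercube over its spatial axes. A minimal sketch (the array shapes and random test cube are illustrative, not instrument data):

```python
import numpy as np

# Hypercube of calibrated reflectance: (rows, cols, 70 wavelength bands)
rng = np.random.default_rng(0)
cube = rng.uniform(0.0, 1.0, size=(100, 120, 70))

def mean_reflectance(cube, roi):
    """Mean spectral reflectance curve over a rectangular
    region-of-interest given as (row0, row1, col0, col1)."""
    r0, r1, c0, c1 = roi
    return cube[r0:r1, c0:c1, :].mean(axis=(0, 1))

curve = mean_reflectance(cube, (10, 30, 20, 50))
print(curve.shape)   # one mean reflectance value per wavelength band
```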

  5. [Quantitative testing of the V617F mutation in the JAK2 gene using a pyrosequencing technique].

    PubMed

    Dunaeva, E A; Mironov, K O; Dribnokhodova, T E; Subbotina, E E; Bashmakova; Ol'hovskiĭ, I A; Shipulin, G A

    2014-11-01

    The somatic V617F mutation in the JAK2 gene is a frequent cause of chronic myeloproliferative diseases not associated with the BCR/ABL mutation. Quantitative measurement of the relative percentage of the mutant allele can be used to establish disease severity and prognosis and to guide prescription of agents that inhibit JAK2 activity. Pyrosequencing was applied for quantitative testing of the mutation. The developed technique detects and quantifies the mutant-allele fraction from 7% upward; samples with a mutant-allele percentage between 4% and 7% fall into a "grey zone". The dependence of the expected mutant fraction in an analyzed sample on the observed signal value is described by a linear equation with regression coefficients y = -0.97 and x = -1.32, with a measurement uncertainty of ±0.7. The technique was validated on clinical material from 192 patients with the main forms of BCR/ABL-negative myeloproliferative disease; 64 samples were detected with a mutant-fraction percentage from 13% to 91%. The developed technique permits monitoring of therapy for myeloproliferative diseases and helps optimize treatment tactics.

  6. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
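
    The proposed quantitative link between filter dust and air concentrations can be sketched as a mass balance: the time-averaged concentration is the extracted contaminant mass divided by the volume of air effectively filtered. The function below is a hypothetical simplification (constant flow and capture efficiency, all recirculated air passing the filter), not the authors' full approach:

```python
def air_concentration(mass_ug, flow_m3_per_h, runtime_h, capture_efficiency):
    """Time-averaged airborne concentration (ug/m3) implied by the mass of a
    contaminant extracted from HVAC filter dust. Assumes constant flow and
    a constant particle capture efficiency (deliberate simplifications)."""
    effective_air_volume_m3 = flow_m3_per_h * runtime_h * capture_efficiency
    return mass_ug / effective_air_volume_m3

# Illustrative: 500 ug extracted, 1700 m3/h flow, 300 h runtime, 60% efficiency
print(round(air_concentration(500, 1700, 300, 0.60), 5))
```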

  7. Metals handbook. Volume 12: Fractography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1987-01-01

    ASM International has published this handbook in response to the growing interest in the science of fractography, the result of improved methods of preparing specimens, advances in photographic techniques and equipment, refinement of the scanning electron microscope, and the introduction of quantitative fractography. The book covers all aspects of fracture examination and interpretation, including electron and quantitative fractography. The text is accompanied by line drawings, graphs, and photographic illustrations of fracture surfaces and microstructural features. Articles explain and illustrate the principal modes of fracture and the effects of loading history, environment, and materials quality on fracture appearance. An atlas of fractographs constitutes the second half of the volume and contains more than 1300 fractographs, including a collection of ferrous and nonferrous alloy parts. Supplemental illustrations of failed metal-matrix composites, resin-matrix composites, polymers, and electronic materials are provided.

  8. Processing of polarimetric SAR data for soil moisture estimation over Mahantango watershed area

    NASA Technical Reports Server (NTRS)

    Rao, K. S.; Teng, W. L.; Wang, J. R.

    1992-01-01

    The microwave remote sensing technique has high potential for measuring soil moisture due to the large contrast in the dielectric constant of dry and wet soils. Recent work by Pults et al. demonstrated the use of X/C-band data for quantitative surface soil moisture extraction from an airborne Synthetic Aperture Radar (SAR) system. A similar technique was adopted using polarimetric SAR data acquired with the JPL-AIRSAR system over the Mahantango watershed area in central Pennsylvania during July 1990. The data sets reported include C-, L-, and P-bands from 10, 13, 15, and 17 July 1990.

  9. Photobiomolecular deposition of metallic particles and films

    DOEpatents

    Hu, Zhong-Cheng

    2005-02-08

    The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.

  10. Photobiomolecular metallic particles and films

    DOEpatents

    Hu, Zhong-Cheng

    2003-05-06

    The method of the invention is based on the unique electron-carrying function of a photocatalytic unit such as the photosynthesis system I (PSI) reaction center of the protein-chlorophyll complex isolated from chloroplasts. The method employs a photo-biomolecular metal deposition technique for precisely controlled nucleation and growth of metallic clusters/particles, e.g., platinum, palladium, and their alloys, etc., as well as for thin-film formation above the surface of a solid substrate. The photochemically mediated technique offers numerous advantages over traditional deposition methods including quantitative atom deposition control, high energy efficiency, and mild operating condition requirements.

  11. A Method for Comprehensive Glycosite-Mapping and Direct Quantitation of Serum Glycoproteins.

    PubMed

    Hong, Qiuting; Ruhaak, L Renee; Stroble, Carol; Parker, Evan; Huang, Jincui; Maverakis, Emanual; Lebrilla, Carlito B

    2015-12-04

    A comprehensive glycan map was constructed for the top eight abundant glycoproteins in plasma using both specific and nonspecific enzyme digestions followed by nano liquid chromatography (LC)-chip/quadrupole time-of-flight mass spectrometry (MS) analysis. Glycopeptides were identified using an in-house software tool, GPFinder. A sensitive and reproducible multiple reaction monitoring (MRM) technique on a triple quadrupole MS was developed and applied to quantify immunoglobulins G, A, M, and their site-specific glycans simultaneously and directly from human serum/plasma without protein enrichments. A total of 64 glycopeptides and 15 peptides were monitored for IgG, IgA, and IgM in a 20 min ultra high performance (UP)LC gradient. The absolute protein contents were quantified using peptide calibration curves. The glycopeptide ion abundances were normalized to the respective protein abundances to separate protein glycosylation from protein expression. This technique yields higher method reproducibility and less sample loss when compared with the quantitation method that involves protein enrichments. The absolute protein quantitation has a wide linear range (3-4 orders of magnitude) and low limit of quantitation (femtomole level). This rapid and robust quantitation technique, which provides quantitative information for both proteins and glycosylation, will further facilitate disease biomarker discoveries.
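
    Normalizing glycopeptide ion abundance to the parent protein abundance, as described above, separates glycosylation changes from protein-expression changes. A minimal sketch with illustrative numbers (not values from the study):

```python
def normalized_glycoform(glycopeptide_abundance, protein_abundance):
    """Glycopeptide ion abundance expressed per unit of parent protein,
    so a change reflects glycosylation rather than protein expression."""
    return glycopeptide_abundance / protein_abundance

# Illustrative IgG example: the glycoform signal doubles, but so does the
# protein level, so the normalised glycoform is unchanged.
before = normalized_glycoform(2.0e5, 1.0e7)
after  = normalized_glycoform(4.0e5, 2.0e7)
print(before == after)   # True
```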

  12. Quantitative phase imaging using four interferograms with special phase shifts by dual-wavelength in-line phase-shifting interferometry

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoqing; Wang, Yawei; Ji, Ying; Xu, Yuanyuan; Xie, Ming; Han, Hao

    2018-05-01

    A new approach to quantitative phase imaging using four interferograms with special phase shifts in dual-wavelength in-line phase-shifting interferometry is presented. In this method, positive and negative 2π phase shifts are employed to easily separate the incoherent addition of two single-wavelength interferograms by combining the phase-shifting technique with a subtraction procedure; the quantitative phase at one of the two wavelengths can then be obtained from two intensities without the corresponding dc terms by exploiting properties of the trigonometric functions. The quantitative phase at the other wavelength can be retrieved from two dc-term-suppressed intensities obtained by employing the two-step phase-shifting technique or a filtering technique in the frequency domain. The proposed method is developed theoretically, and its effectiveness is demonstrated by simulation experiments on a spherical cap and a HeLa cell, respectively.
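
    The special-shift scheme of the paper is not reproduced here, but the underlying idea of phase-shifting interferometry can be illustrated with the standard four-step algorithm (shifts 0, π/2, π, 3π/2), which recovers the phase from an arctangent of intensity differences:

```python
import numpy as np

# Standard four-step phase-shifting: interferograms at shifts 0, pi/2, pi,
# 3*pi/2 give  phi = atan2(I4 - I2, I1 - I3).  (Baseline textbook method,
# not the special-shift dual-wavelength scheme of the paper.)
phi_true = np.linspace(-1.2, 1.2, 5)   # test phase values (rad)
a, b = 2.0, 1.0                        # background and modulation amplitude

I = [a + b * np.cos(phi_true + d) for d in (0, np.pi / 2, np.pi, 3 * np.pi / 2)]
phi = np.arctan2(I[3] - I[1], I[0] - I[2])   # recovered phase

print(np.allclose(phi, phi_true))
```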

  13. Evaluating motion processing algorithms for use with functional near-infrared spectroscopy data from young children.

    PubMed

    Delgado Reyes, Lourdes M; Bohache, Kevin; Wijeakumar, Sobanawartiny; Spencer, John P

    2018-04-01

    Motion artifacts are often a significant component of the measured signal in functional near-infrared spectroscopy (fNIRS) experiments. A variety of methods have been proposed to address this issue, including principal components analysis (PCA), correlation-based signal improvement (CBSI), wavelet filtering, and spline interpolation. The efficacy of these techniques has been compared using simulated data; however, our understanding of how these techniques fare when dealing with task-based cognitive data is limited. Brigadoi et al. compared motion correction techniques in a sample of adult data measured during a simple cognitive task. Wavelet filtering showed the most promise as an optimal technique for motion correction. Given that fNIRS is often used with infants and young children, it is critical to evaluate the effectiveness of motion correction techniques directly with data from these age groups. This study addresses that problem by evaluating motion correction algorithms implemented in HomER2. The efficacy of each technique was compared quantitatively using objective metrics related to the physiological properties of the hemodynamic response. Results showed that targeted PCA (tPCA), spline, and CBSI retained a higher number of trials. These techniques also performed well in direct head-to-head comparisons with the other approaches using quantitative metrics. The CBSI method corrected many of the artifacts present in our data; however, this approach sometimes produced unstable hemodynamic response functions (HRFs). The targeted PCA and spline methods proved to be the most robust, performing well across all comparison metrics. When compared head to head, tPCA consistently outperformed spline. We conclude, therefore, that tPCA is an effective technique for correcting motion artifacts in fNIRS data from young children.
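    The CBSI approach evaluated above has a compact closed form (after Cui et al.); a minimal sketch with a synthetic spike artifact standing in for real motion data:

```python
import numpy as np

def cbsi(hbo, hbr):
    """Correlation-based signal improvement: assumes the true HbO and HbR
    signals are negatively correlated while motion artifacts affect both with
    the same sign. alpha scales HbR to HbO amplitude; the corrected signals
    are hbo0 = (hbo - alpha*hbr)/2 and hbr0 = -hbo0/alpha."""
    alpha = np.std(hbo) / np.std(hbr)
    hbo0 = (hbo - alpha * hbr) / 2.0
    return hbo0, -hbo0 / alpha

# synthetic example: a shared spike artifact on anticorrelated signals
t = np.linspace(0.0, 10.0, 500)
true_hbo = np.sin(t)
artifact = np.zeros_like(t)
artifact[250:260] = 5.0  # motion spike hitting both chromophores equally
hbo0, hbr0 = cbsi(true_hbo + artifact, -true_hbo + artifact)
```

    Because the artifact enters both channels with the same sign, the scaled subtraction largely cancels it while preserving the anticorrelated physiological component.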

  14. QUANTITATIVE MAGNETIC RESONANCE IMAGING OF ARTICULAR CARTILAGE AND ITS CLINICAL APPLICATIONS

    PubMed Central

    Li, Xiaojuan; Majumdar, Sharmila

    2013-01-01

    Cartilage is one of the most essential tissues for healthy joint function and is compromised in degenerative and traumatic joint diseases. There have been tremendous advances during the past decade using quantitative MRI techniques as a non-invasive tool for evaluating cartilage, with a focus on assessing cartilage degeneration during osteoarthritis (OA). In this review, after a brief overview of cartilage composition and degeneration, we discuss techniques that grade and quantify morphologic changes as well as the techniques that quantify changes in the extracellular matrix. The basic principles, in vivo applications, advantages and challenges for each technique are discussed. Recent studies using the OA Initiative (OAI) data are also summarized. Quantitative MRI provides non-invasive measures of cartilage degeneration at the earliest stages of joint degeneration, which is essential for efforts towards prevention and early intervention in OA. PMID:24115571

  15. Infrared Thermography-based Biophotonics: Integrated Diagnostic Technique for Systemic Reaction Monitoring

    NASA Astrophysics Data System (ADS)

    Vainer, Boris G.; Morozov, Vitaly V.

    A distinctive branch of biophotonics is the measurement, visualisation and quantitative analysis of infrared (IR) radiation emitted from the surfaces of living objects. Focal plane array (FPA)-based IR cameras make it possible to realize so-called interventional infrared thermal diagnostics in medicine. An integrated technique aimed at the advancement of this new approach in biomedical science and practice is described in the paper. The assembled system includes a high-performance short-wave (2.45-3.05 μm) or long-wave (8-14 μm) IR camera, two laser Doppler flowmeters (LDF) and additional equipment and complementary facilities implementing the monitoring of human cardiovascular status. All these means operate synchronously. The relationship between infrared thermography (IRT) and LDF data in humans with regard to systemic cardiovascular reactivity is ascertained for the first time. The real-time dynamics of blood supply in a narcotized patient is visualized and quantitatively represented for the first time during surgery, in order to observe how general hyperoxia influences thermoregulatory mechanisms; an abrupt increase in the temperature of the upper limb is observed using IRT. It is suggested that the IRT-based integrated technique may serve as a take-off runway towards informative new methods directly applicable to medicine and the biomedical sciences.

  16. A practical technique for quantifying the performance of acoustic emission systems on plate-like structures.

    PubMed

    Scholey, J J; Wilcox, P D; Wisnom, M R; Friswell, M I

    2009-06-01

    A model for quantifying the performance of acoustic emission (AE) systems on plate-like structures is presented. Employing a linear transfer function approach, the model is applicable to both isotropic and anisotropic materials. The model requires several inputs, including source waveforms, phase velocity and attenuation. It is recognised that these variables may not be readily available; thus, efficient measurement techniques are presented for obtaining phase velocity and attenuation in a form that can be exploited directly in the model. Inspired by previously documented methods, the application of these techniques is examined and some important implications for propagation characterisation in plates are discussed. Example measurements are made on isotropic and anisotropic plates and, where possible, comparisons with numerical solutions are made. By inputting experimentally obtained data into the model, quantitative system metrics are examined for different threshold values and sensor locations. By producing plots describing areas of hit success and source location error, the ability to measure the performance of different AE system configurations is demonstrated. This quantitative approach will help to place AE testing on a more solid foundation, underpinning its use in industrial AE applications.

  17. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.

    1987-01-01

    Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
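    The damped equivalent point source inversion mentioned above is, at its core, ridge-regularized least squares; a minimal sketch with a hypothetical random kernel rather than an orbital-geometry matrix:

```python
import numpy as np

def damped_inversion(A, d, lam):
    """Damped least squares m = (A^T A + lam*I)^(-1) A^T d. The damping
    parameter lam > 0 trades data fit against source-magnitude stability:
    larger lam gives a smaller, smoother source model."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ d)

# hypothetical kernel relating point-source magnetizations to anomaly values
rng = np.random.default_rng(0)
A = rng.normal(size=(20, 5))
m_true = np.array([1.0, -2.0, 0.5, 3.0, -1.0])
d = A @ m_true
print(np.allclose(damped_inversion(A, d, 0.0), m_true))  # True
```

    Choosing the optimum lam is the model-selection problem the abstract refers to: too small and the inversion is unstable, too large and real anomaly amplitude is suppressed.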

  18. Modeling susceptibility difference artifacts produced by metallic implants in magnetic resonance imaging with point-based thin-plate spline image registration.

    PubMed

    Pauchard, Y; Smith, M; Mintchev, M

    2004-01-01

    Magnetic resonance imaging (MRI) suffers from geometric distortions arising from various sources. One such source is the non-linearity associated with the presence of metallic implants, which can profoundly distort the obtained images. These non-linearities result in pixel shifts and intensity changes in the vicinity of the implant, often precluding any meaningful assessment of the entire image. This paper presents a method for correcting these distortions based on non-rigid image registration techniques. Two images from a modelled three-dimensional (3D) grid phantom were subjected to point-based thin-plate spline registration. The reference image (without distortions) was obtained from a grid model including a spherical implant, and the corresponding test image containing the distortions was obtained using a previously reported technique for the spatial modelling of magnetic susceptibility artifacts. After the non-recoverable area in the distorted image was identified, the calculated spline model was able to quantitatively account for the distortions, thus facilitating their compensation. Upon completion of the compensation procedure, the non-recoverable area was removed from the reference image and the latter was compared to the compensated image. A quantitative assessment of the goodness of the proposed compensation technique is presented.
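    A point-based thin-plate spline warp of the kind used in this work can be sketched in a few lines; the landmark displacements below are hypothetical, not the phantom data of the paper:

```python
import numpy as np

def tps_fit(src, dst):
    """Fit a 2D thin-plate spline mapping src landmarks onto dst landmarks.
    Kernel U(r) = r^2 log(r^2), with U(0) = 0; returns weights for tps_map."""
    n = src.shape[0]
    r2 = np.sum((src[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    K = np.where(r2 > 0, r2 * np.log(r2 + 1e-300), 0.0)
    P = np.hstack([np.ones((n, 1)), src])          # affine part [1, x, y]
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
    rhs = np.vstack([dst, np.zeros((3, 2))])
    return np.linalg.solve(L, rhs)

def tps_map(w, src, pts):
    """Apply the fitted spline to arbitrary points."""
    r2 = np.sum((pts[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    U = np.where(r2 > 0, r2 * np.log(r2 + 1e-300), 0.0)
    return U @ w[:-3] + np.hstack([np.ones((len(pts), 1)), pts]) @ w[-3:]

# hypothetical landmarks: reference grid points and their distorted positions
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [0.5, 0.5]])
dst = src + np.array([[0.0, 0.1], [-0.05, 0.0], [0.1, 0.0], [0.0, -0.1], [0.02, 0.03]])
w = tps_fit(src, dst)
print(np.allclose(tps_map(w, src, src), dst))  # True
```

    The interpolating spline maps each landmark exactly onto its counterpart, and between landmarks it produces the smoothest (minimum bending energy) warp, which is what makes it suitable for compensating smoothly varying susceptibility distortions.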

  19. Quantitative analysis of packed and compacted granular systems by x-ray microtomography

    NASA Astrophysics Data System (ADS)

    Fu, Xiaowei; Milroy, Georgina E.; Dutt, Meenakshi; Bentham, A. Craig; Hancock, Bruno C.; Elliott, James A.

    2005-04-01

    The packing and compaction of powders are common processes in the pharmaceutical, food, ceramic and powder metallurgy industries. Understanding how particles pack in a confined space and how powders behave during compaction is crucial for producing high-quality products. This paper outlines a new technique, based on modern desktop X-ray tomography and image processing, to quantitatively investigate the packing of particles during powder compaction. The technique provides insight into how powders densify during compaction and how this densification relates, through material properties and processing conditions, to tablet manufacture. A variety of powder systems were considered, including glass, sugar and NaCl, with typical particle sizes of 200-300 μm, as well as binary mixtures of NaCl and glass spheres. The results are new and have been validated by SEM observation and by numerical simulations using the discrete element method (DEM). The research demonstrates the potential of the XMT technique for further investigation of pharmaceutical processing and for verifying other physical models of complex packing.

  20. Simulation of FRET dyes allows quantitative comparison against experimental data

    NASA Astrophysics Data System (ADS)

    Reinartz, Ines; Sinner, Claude; Nettels, Daniel; Stucki-Buchli, Brigitte; Stockmar, Florian; Panek, Pawel T.; Jacob, Christoph R.; Nienhaus, Gerd Ulrich; Schuler, Benjamin; Schug, Alexander

    2018-03-01

    Fully understanding biomolecular function requires detailed insight into the systems' structural dynamics. Powerful experimental techniques such as single molecule Förster Resonance Energy Transfer (FRET) provide access to such dynamic information yet have to be carefully interpreted. Molecular simulations can complement these experiments but typically face limits in accessing slow time scales and large or unstructured systems. Here, we introduce a coarse-grained simulation technique that tackles these challenges. While requiring only a few parameters, we maintain full protein flexibility and include all heavy atoms of proteins, linkers, and dyes. We are able to sufficiently reduce computational demands to simulate large or heterogeneous structural dynamics and ensembles on slow time scales found in, e.g., protein folding. The simulations allow for calculating FRET efficiencies which quantitatively agree with experimentally determined values. By providing atomically resolved trajectories, this work supports the planning and microscopic interpretation of experiments. Overall, these results highlight how simulations and experiments can complement each other leading to new insights into biomolecular dynamics and function.
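    The FRET efficiency computed from simulated dye distances follows the standard Förster relation; a one-line sketch:

```python
def fret_efficiency(r, r0):
    """Transfer efficiency for donor-acceptor distance r and Foerster radius r0:
    E = 1 / (1 + (r/r0)**6)."""
    return 1.0 / (1.0 + (r / r0) ** 6)

# at r = r0 the transfer efficiency is exactly 50%
print(fret_efficiency(5.0, 5.0))  # -> 0.5
```

    Averaging this quantity over the simulated dye-distance distribution is what allows a trajectory to be compared quantitatively against a measured mean efficiency; the sixth-power dependence makes E a very steep function of distance near r0.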

  1. Roof Moisture Surveys: Current State Of The Technology

    NASA Astrophysics Data System (ADS)

    Tobiasson, Wayne

    1983-03-01

    Moisture is the big enemy of compact roofing systems. Non-destructive nuclear, capacitance and infrared methods can all find wet insulation in such roofs but a few core samples are needed for verification. Nuclear and capacitance surveys generate quantitative results at grid points but examine only a small portion of the roof. Quantitative results are not usually provided by infrared scanners but they can rapidly examine every square inch of the roof. Being able to find wet areas when they are small is an important advantage. Prices vary with the scope of the investigation. For a particular scope, the three techniques are often cost-competitive. The limitations of each technique are related to the people involved as well as the equipment. When the right people are involved, non-destructive surveys are a very effective method for improving the long-term performance and reducing the life-cycle costs of roofing systems. Plans for the maintenance, repair or replacement of a roof should include a roof moisture survey.

  2. Application of the EM algorithm to radiographic images.

    PubMed

    Brailean, J C; Little, D; Giger, M L; Chen, C T; Sullivan, B J

    1992-01-01

    The expectation maximization (EM) algorithm has received considerable attention in the area of positron emission tomography (PET) as a restoration and reconstruction technique. In this paper, the restoration capabilities of the EM algorithm when applied to radiographic images are investigated. This application does not involve reconstruction. The performance of the EM algorithm is quantitatively evaluated using a "perceived" signal-to-noise ratio (SNR) as the image quality metric. This perceived SNR is based on statistical decision theory and includes both the observer's visual response function and a noise component internal to the eye-brain system. For a variety of processing parameters, the relative SNR (ratio of the processed SNR to the original SNR) is calculated and used as a metric to compare quantitatively the effects of the EM algorithm with two other image enhancement techniques: global contrast enhancement (windowing) and unsharp mask filtering. The results suggest that the EM algorithm's performance is superior to unsharp mask filtering and global contrast enhancement for radiographic images which contain objects smaller than 4 mm.
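    The EM restoration update itself is compact; a minimal one-dimensional sketch with a hypothetical three-point blur (the paper applies the algorithm to two-dimensional radiographs):

```python
import numpy as np

def em_restore(y, H, n_iter=50):
    """ML-EM (Richardson-Lucy) restoration for a linear blur y = H @ x with
    Poisson noise; the multiplicative update keeps the estimate nonnegative
    and conserves total counts when it converges."""
    s = H.sum(axis=0)                                # per-pixel sensitivity
    x = np.full(H.shape[1], y.sum() / s.sum())       # flat initial estimate
    for _ in range(n_iter):
        x = x / s * (H.T @ (y / (H @ x + 1e-12)))    # small eps avoids 0/0
    return x

# small 1D example: a point source blurred by a 3-point kernel
n = 16
H = np.zeros((n, n))
for i in range(n):
    for j, wgt in ((i - 1, 0.25), (i, 0.5), (i + 1, 0.25)):
        if 0 <= j < n:
            H[i, j] = wgt
x_true = np.zeros(n)
x_true[8] = 100.0
y = H @ x_true
x_hat = em_restore(y, H)  # iterates concentrate the counts back at index 8
```

    The multiplicative form is why EM never produces negative intensities, a property windowing and unsharp masking do not share.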

  3. Automatic spatiotemporal matching of detected pleural thickenings

    NASA Astrophysics Data System (ADS)

    Chaisaowong, Kraisorn; Keller, Simon Kai; Kraus, Thomas

    2014-01-01

    Pleural thickenings can be found in the lungs of asbestos-exposed patients. Non-invasive diagnosis, including CT imaging, can detect aggressive malignant pleural mesothelioma in its early stage. In order to create a quantitative documentation of automatically detected pleural thickenings over time, the differences in volume and thickness of the detected thickenings have to be calculated. Physicians usually estimate the change of each thickening via visual comparison, which provides neither quantitative nor qualitative measures. In this work, automatic spatiotemporal matching techniques for the detected pleural thickenings at two points in time, based on semi-automatic registration, have been developed, implemented, and tested so that the same thickening can be compared fully automatically. As a result, the mapping technique using principal components analysis turns out to be more advantageous than feature-based mapping using the centroid and mean Hounsfield units of each thickening, since the sensitivity was improved from 42.19% to 98.46%, while the accuracy of feature-based mapping is only slightly higher (84.38% vs. 76.19%).

  4. A Quantitative Needs Assessment Technique for Cross-Cultural Work Adjustment Training.

    ERIC Educational Resources Information Center

    Selmer, Lyn

    2000-01-01

    A study of 67 Swedish expatriate bosses and 104 local Hong Kong middle managers tested a quantitative needs assessment technique measuring work values. Two-thirds of middle managers' work values were not correctly estimated by their bosses, especially instrumental values (pay, benefits, security, working hours and conditions), indicating a need…

  5. A Direct, Competitive Enzyme-Linked Immunosorbent Assay (ELISA) as a Quantitative Technique for Small Molecules

    ERIC Educational Resources Information Center

    Powers, Jennifer L.; Rippe, Karen Duda; Imarhia, Kelly; Swift, Aileen; Scholten, Melanie; Islam, Naina

    2012-01-01

    ELISA (enzyme-linked immunosorbent assay) is a widely used technique with applications in disease diagnosis, detection of contaminated foods, and screening for drugs of abuse or environmental contaminants. However, published protocols with a focus on quantitative detection of small molecules designed for teaching laboratories are limited. A…

  6. Digital Assays Part I: Partitioning Statistics and Digital PCR.

    PubMed

    Basu, Amar S

    2017-08-01

    A digital assay is one in which the sample is partitioned into many small containers such that each partition contains a discrete number of biological entities (0, 1, 2, 3, …). A powerful technique in the biologist's toolkit, digital assays bring a new level of precision in quantifying nucleic acids, measuring proteins and their enzymatic activity, and probing single-cell genotypes and phenotypes. Part I of this review begins with the benefits and Poisson statistics of partitioning, including sources of error. The remainder focuses on digital PCR (dPCR) for quantification of nucleic acids. We discuss five commercial instruments that partition samples into physically isolated chambers (cdPCR) or droplet emulsions (ddPCR). We compare the strengths of dPCR (absolute quantitation, precision, and ability to detect rare or mutant targets) with those of its predecessor, quantitative real-time PCR (dynamic range, larger sample volumes, and throughput). Lastly, we describe several promising applications of dPCR, including copy number variation, quantitation of circulating tumor DNA and viral load, RNA/miRNA quantitation with reverse transcription dPCR, and library preparation for next-generation sequencing. This review is intended to give a broad perspective to scientists interested in adopting digital assays into their workflows. Part II focuses on digital protein and cell assays.
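    The Poisson statistics of partitioning reduce to a one-line estimator for the mean copies per partition; a minimal sketch with hypothetical droplet counts:

```python
import math

def copies_per_partition(k_positive, n_partitions):
    """Poisson-corrected mean copies per partition from a digital PCR readout:
    with fraction p = k/n of positive partitions, lambda = -ln(1 - p)."""
    p = k_positive / n_partitions
    return -math.log(1.0 - p)

def total_copies(k_positive, n_partitions):
    """Absolute target count in the partitioned sample volume."""
    return copies_per_partition(k_positive, n_partitions) * n_partitions

# 10,000 of 20,000 droplets positive -> lambda = ln 2 copies per droplet
print(total_copies(10000, 20000))  # ≈ 13863 copies
```

    The correction matters because a positive partition may hold more than one copy; simply counting positives would undercount the target, increasingly so as p approaches 1.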

  7. On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.

    2008-08-11

    The analytical techniques based on ion beams, or IBA techniques, give quantitative information on elemental concentrations in samples of a wide variety of nature. In this work, we focus on the PIXE technique, analyzing thick-target biological specimens (TTPIXE) using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used to perform PIXE and RBS simultaneously, in order to resolve the uncertainties produced in absolute PIXE quantification. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard was used.

  8. Modeling of Convective-Stratiform Precipitation Processes: Sensitivity to Partitioning Methods

    NASA Technical Reports Server (NTRS)

    Lang, S. E.; Tao, W.-K.; Simpson, J.; Ferrier, B.; Starr, David OC. (Technical Monitor)

    2001-01-01

    Six different convective-stratiform separation techniques, including a new technique that utilizes the ratio of vertical and terminal velocities, are compared and evaluated using two-dimensional numerical simulations of a tropical [Tropical Ocean Global Atmosphere Coupled Ocean Atmosphere Response Experiment (TOGA COARE)] and midlatitude continental [Preliminary Regional Experiment for STORM-Central (PRESTORM)] squall line. Comparisons are made in terms of rainfall, cloud coverage, mass fluxes, apparent heating and moistening, mean hydrometeor profiles, CFADs (Contoured Frequency with Altitude Diagrams), microphysics, and latent heating retrieval. Overall, it was found that the different separation techniques produced results that qualitatively agreed. However, the quantitative differences were significant. Observational comparisons were unable to conclusively evaluate the performance of the techniques. Latent heating retrieval was shown to be sensitive to the use of separation technique mainly due to the stratiform region for methods that found very little stratiform rain.

  9. Comparison of extraction techniques and modeling of accelerated solvent extraction for the authentication of natural vanilla flavors.

    PubMed

    Cicchetti, Esmeralda; Chaintreau, Alain

    2009-06-01

    Accelerated solvent extraction (ASE) of vanilla beans has been optimized using ethanol as a solvent. A theoretical model is proposed to account for this multistep extraction. This allows the determination, for the first time, of the total amount of analytes initially present in the beans and thus the calculation of recoveries using ASE or any other extraction technique. As a result, ASE and Soxhlet extractions have been determined to be efficient methods, whereas recoveries are modest for maceration techniques and depend on the solvent used. Because industrial extracts are obtained by many different procedures, including maceration in various solvents, authenticating vanilla extracts using quantitative ratios between the amounts of vanilla flavor constituents appears to be unreliable. When authentication techniques based on isotopic ratios are used, ASE is a valid sample preparation technique because it does not induce isotopic fractionation.
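    The multistep extraction can be modeled by assuming each ASE cycle removes a constant fraction of the remaining analyte; this is a simplified sketch of such a model with synthetic step yields, not the authors' exact formulation:

```python
def total_analyte(step_yields):
    """Estimate the total analyte mass M initially in the beans from successive
    extraction-step yields, assuming each cycle extracts a constant fraction k
    of what remains: a_n = M * k * (1-k)**(n-1). From the first two steps,
    k = 1 - a2/a1 and M = a1/k."""
    a1, a2 = step_yields[0], step_yields[1]
    k = 1.0 - a2 / a1
    return a1 / k, k

# synthetic three-step extraction generated with M = 100 mg, k = 0.6
steps = [60.0, 24.0, 9.6]
M, k = total_analyte(steps)
print(M, k)  # M ≈ 100 mg, k ≈ 0.6
```

    Once M is known, the recovery of any technique is simply its cumulative yield divided by M, which is how extraction methods can be compared on an absolute rather than relative basis.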

  10. A Flexible Hierarchical Bayesian Modeling Technique for Risk Analysis of Major Accidents.

    PubMed

    Yu, Hongyang; Khan, Faisal; Veitch, Brian

    2017-09-01

    Safety analysis of rare events with potentially catastrophic consequences is challenged by data scarcity and uncertainty. Traditional causation-based approaches, such as fault tree and event tree analysis (used to model rare events), suffer from a number of weaknesses. These include the static structure of the event causation, the lack of event occurrence data, and the need for reliable prior information. In this study, a new hierarchical Bayesian modeling-based technique is proposed to overcome these drawbacks. The proposed technique can be used as a flexible technique for risk analysis of major accidents. It enables both forward and backward analysis in quantitative reasoning and the treatment of interdependence among the model parameters. Source-to-source variability in data sources is also taken into account through a robust probabilistic safety analysis. The applicability of the proposed technique has been demonstrated through a case study in the marine and offshore industry. © 2017 Society for Risk Analysis.
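    The full hierarchical model is beyond a short example, but the core Bayesian mechanics (updating a rare-event rate from sparse data under an informative prior) can be illustrated with a conjugate Gamma-Poisson toy model; all numbers are hypothetical:

```python
def posterior_rate(alpha, beta, events, exposure):
    """Gamma-Poisson conjugate update: prior rate ~ Gamma(alpha, beta);
    observing `events` accidents over `exposure` unit-years gives posterior
    Gamma(alpha + events, beta + exposure). Returns the posterior mean rate."""
    return (alpha + events) / (beta + exposure)

# hypothetical generic prior from industry-wide data:
# mean rate 0.1 per year with the weight of 10 years of pseudo-exposure
alpha, beta = 1.0, 10.0
sources = {"platform_A": (0, 5.0), "platform_B": (3, 5.0)}
for name, (k, t) in sources.items():
    print(name, posterior_rate(alpha, beta, k, t))
# platform_A ≈ 0.067 (shrunk toward the prior despite zero observed events)
# platform_B ≈ 0.267
```

    The shrinkage toward the shared prior is what lets sparse, source-to-source variable data still yield stable rate estimates; a full hierarchical treatment would additionally place a hyperprior on (alpha, beta) and infer them jointly.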

  11. New imaging systems in nuclear medicine. Final report, January 1, 1993--December 31, 1995

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-12-31

    The aim of this program has been to improve the performance of positron emission tomography (PET) to achieve high resolution with high sensitivity. Towards this aim, the authors have carried out the following studies: (1) explored new techniques for the detection of annihilation radiation, including new detector materials and system geometries; specific areas studied include factors related to the resolution and sensitivity of PET instrumentation (geometry, detection materials and coding) and techniques to improve image quality through depth-of-interaction information and increased sampling; (2) completed much of the final testing of PCR-II, an analog-coded cylindrical positron tomograph developed and constructed during the current funding period; (3) developed the design of a positron microtomograph with millimeter resolution for quantitative studies in small animals; a single-slice version of this device has been designed and studied by computer simulation; (4) continued and expanded the program of biological studies in animal models. Current studies have included imaging of animal models of Parkinson's disease, Huntington's disease and cancer. These studies have included new radiopharmaceuticals and techniques involving molecular biology.

  12. [Cytogenetics, cytogenomics and cancer].

    PubMed

    Bernheim, Alain

    2002-02-01

    Chromosomal studies in malignancy have demonstrated the pivotal role of somatic chromosomal rearrangements in oncogenesis and tumoral progression. Whether structural or quantitative, these abnormalities can now be studied in great detail with the various FISH techniques, including CGH on chromosomes or, in the near future, on microarrays. The multistep pattern of most solid tumors is being characterized, and their genomic abnormalities are increasingly used for diagnosis and prognosis.

  13. Combined quantitative and qualitative two-channel optical biopsy technique for discrimination of tumor borders

    NASA Astrophysics Data System (ADS)

    Bocher, Thomas; Beuthan, Juergen; Scheller, M.; Hopf, Juergen U. G.; Linnarz, Marietta; Naber, Rolf-Dieter; Minet, Olaf; Becker, Wolfgang; Mueller, Gerhard J.

    1995-12-01

    Conventional laser-induced fluorescence spectroscopy (LIFS) of endogenous chromophores like NADH (Nicotinamide Adenine Dinucleotide, reduced form) and PP IX (Protoporphyrin IX) provides information about the relative amounts of these metabolites in the observed cells. For diagnostic applications, however, the concentrations of these chromophores have to be determined quantitatively to establish tissue-independent differentiation criteria. It is well-known that the individually and locally varying optical tissue parameters are major obstacles to the determination of the true chromophore concentrations by simple fluorescence spectroscopy. To overcome these problems, a fiber-based, two-channel technique comprising a rescaled NADH channel (delivering quantitative values) and a relative PP IX channel was developed. Using the accumulated information of both channels can provide good tissue-state separation. Ex-vivo studies were performed with resected samples of squamous cells, frozen in LN2, in the histologically confirmed states normal, tumor border, inflammation and hyperplasia. Each state was represented in this series by at least 7 samples. At the identical tissue spot, both the rescaled NADH fluorescence and the relative PP IX fluorescence were determined. In the first case a nitrogen laser (337 nm, 500 ps, 200 μJ, 10 Hz) and in the latter case a diode laser (633 nm, 15 mW, cw) were used as excitation sources. In this ex-vivo study a good separation between the different tissue states was achieved. With a device constructed for clinical usage, one quantitative in-vivo NADH measurement was recently performed, showing similar separation capabilities.

  14. Improved assay for quantitating adherence of ruminal bacteria to cellulose.

    PubMed Central

    Rasmussen, M A; White, B A; Hespell, R B

    1989-01-01

    A quantitative technique suitable for the determination of adherence of ruminal bacteria to cellulose was developed. This technique employs adherence of cells to cellulose disks and alleviates the problem of nonspecific cell entrapment within cellulose particles. By using this technique, it was demonstrated that the adherence of Ruminococcus flavefaciens FD1 to cellulose was inhibited by formaldehyde, methylcellulose, and carboxymethyl cellulose. Adherence was unaffected by acid hydrolysates of methylcellulose, glucose, and cellobiose. PMID:2782879

  15. Video methods in the quantification of children's exposures.

    PubMed

    Ferguson, Alesia C; Canales, Robert A; Beamer, Paloma; Auyeung, Willa; Key, Maya; Munninghoff, Amy; Lee, Kevin Tse-Wing; Robertson, Alexander; Leckie, James O

    2006-05-01

    In 1994, Stanford University's Exposure Research Group (ERG) conducted its first pilot study to collect micro-level activity time series (MLATS) data for young children. The pilot study involved videotaping four children of farm workers in the Salinas Valley of California and converting their videotaped activities to valuable text files of contact behavior using video-translation techniques. These MLATS are especially useful for describing intermittent dermal (i.e., second-by-second account of surfaces and objects contacted) and non-dietary ingestion (second-by-second account of objects or hands placed in the mouth) contact behavior. Second-by-second records of children's contact behavior are amenable to quantitative and statistical analysis and allow for more accurate model estimates of human exposure and dose to environmental contaminants. Activity pattern data for modeling inhalation exposure (i.e., accounts of microenvironments visited) can also be extracted from the MLATS data. Since the pilot study, ERG has collected an immense MLATS data set for 92 children using more developed and refined videotaping and video-translation methodologies. This paper describes all aspects required for the collection of MLATS, including subject recruitment techniques, videotaping and video-translation processes, and potential data analysis. This paper also describes the quality assurance steps employed for these new MLATS projects, including training, data management, and the application of interobserver and intraobserver agreement during video translation. The discussion of these issues and ERG's experiences in dealing with them can assist other groups in the conduct of research that employs these more quantitative techniques.

  16. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  17. Fractography of modern engineering materials: Composites and metals, Second volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masters, J.E.; Gilbertson, L.N.

    1993-01-01

    This book contains the manuscripts of eleven papers that were presented at the Second Symposium on Fractography of Modern Engineering Materials, held in May 1992. The numerous advances in materials science in the six-year period following the First Symposium dictated this second meeting. Not only had new materials been developed in the intervening years, but understanding of older materials had also progressed. Similarly, advances in the technology and the techniques of fractography had occurred. The objective of the symposium was to extend the colloquy on fractography to include these many advances. The papers may be divided into three sections: Unique Fractographic Techniques; Metallic Materials; and Polymeric and Composite Materials. The section titles reflect the diversity of materials discussed in the meeting. The range of materials included cross-linked polyethylene, AISI 52100 steel, 2024 aluminum, and a variety of organic and metal matrix fibrous composites. The case studies presented also covered a wide range, including failure investigations of an antenna used in deep space exploration and of chemical storage tanks. Advances in the techniques of fractography were also reflected in a number of presentations; quantitative techniques and expert systems were also subjects of presentations. A short precis of each paper is included here to assist readers in identifying works of particular interest.

  18. International Seminar on Laser and Opto-Electronic Technology in Industry: State-of-the-Art Review, Xiamen, People's Republic of China, June 25-28, 1986, Proceedings

    NASA Astrophysics Data System (ADS)

    Ke, Jingtang; Pryputniewicz, Ryszard J.

    Various papers on the state of the art in laser and optoelectronic technology in industry are presented. Individual topics addressed include: wavelength compensation for holographic optical element, optoelectronic techniques for measurement and inspection, new optical measurement methods in Western Europe, applications of coherent optics at ISL, imaging techniques for gas turbine development, the Rolls-Royce experience with industrial holography, panoramic holocamera for tube and borehole inspection, optical characterization of electronic materials, optical strain measurement of rotating components, quantitative interpretation of holograms and specklegrams, laser speckle technique for hydraulic structural model test, study of holospeckle interferometry, common path shearing fringe scanning interferometer, and laser interferometry applied to nondestructive testing of tires.

  19. Surface analysis of space telescope material specimens

    NASA Technical Reports Server (NTRS)

    Fromhold, A. T.; Daneshvar, K.

    1985-01-01

    Qualitative and quantitative data on Space Telescope materials which were exposed to low Earth orbital atomic oxygen in a controlled experiment during the 41-G (STS-17) mission were obtained utilizing the experimental techniques of Rutherford backscattering (RBS), particle induced X-ray emission (PIXE), and ellipsometry (ELL). The techniques employed were chosen with a view towards appropriateness for the sample in question, after consultation with NASA scientific personnel who provided the material specimens. A group of eight samples and their controls selected by NASA scientists were measured before and after flight. Information reported herein includes specimen surface characterization by ellipsometry techniques, a determination of the thickness of the evaporated metal specimens by RBS, and a determination of trace impurity species present on and within the surface by PIXE.

  20. Evaluation of non-intrusive flow measurement techniques for a re-entry flight experiment

    NASA Technical Reports Server (NTRS)

    Miles, R. B.; Santavicca, D. A.; Zimmermann, M.

    1983-01-01

    This study evaluates various non-intrusive techniques for the measurement of the flow field on the windward side of the Space Shuttle orbiter or a similar reentry vehicle. Included are linear (Rayleigh, Raman, Mie, Laser Doppler Velocimetry, Resonant Doppler Velocimetry) and nonlinear (Coherent Anti-Stokes Raman, Laser-Induced Fluorescence) light scattering, electron-beam fluorescence, thermal emission, and mass spectroscopy. Flow-field properties were taken from a nonequilibrium flow model by Shinn, Moss, and Simmonds at the NASA Langley Research Center. Conclusions are, when possible, based on quantitative scaling of known laboratory results to the conditions projected. Detailed discussion with researchers in the field contributed further to these conclusions and provided valuable insights regarding the experimental feasibility of each of the techniques.

  1. Computation of the three-dimensional medial surface dynamics of the vocal folds.

    PubMed

    Döllinger, Michael; Berry, David A

    2006-01-01

    To increase our understanding of pathological and healthy voice production, quantitative measurement of the medial surface dynamics of the vocal folds is significant, albeit rarely performed because of the inaccessibility of the vocal folds. Using an excised hemilarynx methodology, a new calibration technique, herein referred to as the linear approximate (LA) method, was introduced to compute the three-dimensional coordinates of fleshpoints along the entire medial surface of the vocal fold. The results were compared with results from the direct linear transform. An associated error estimation was presented, demonstrating the improved accuracy of the new method. A test on real data was reported including computation of quantitative measurements of vocal fold dynamics.

  2. Guidance for using mixed methods design in nursing practice research.

    PubMed

    Chiang-Hanisko, Lenny; Newman, David; Dyess, Susan; Piyakong, Duangporn; Liehr, Patricia

    2016-08-01

    The mixed methods approach purposefully combines both quantitative and qualitative techniques, enabling a multi-faceted understanding of nursing phenomena. The purpose of this article is to introduce three mixed methods designs (parallel; sequential; conversion) and highlight interpretive processes that occur with the synthesis of qualitative and quantitative findings. Real world examples of research studies conducted by the authors will demonstrate the processes leading to the merger of data. The examples include research questions, data collection procedures, and analysis with a focus on synthesizing findings. Based on experience with mixed methods studies, the authors introduce two synthesis patterns (complementary; contrasting), considering application for practice and implications for research. Copyright © 2015 Elsevier Inc. All rights reserved.

  3. Comparative analysis of quantitative methodologies for Vibrionaceae biofilms.

    PubMed

    Chavez-Dozal, Alba A; Nourabadi, Neda; Erken, Martina; McDougald, Diane; Nishiguchi, Michele K

    2016-11-01

    Multiple symbiotic and free-living Vibrio spp. grow as a form of microbial community known as a biofilm. In the laboratory, methods to quantify Vibrio biofilm mass include crystal violet staining, direct colony-forming unit (CFU) counting, dry biofilm cell mass measurement, and observation of development of wrinkled colonies. Another approach for bacterial biofilms also involves the use of tetrazolium (XTT) assays (used widely in studies of fungi) that are an appropriate measure of metabolic activity and vitality of cells within the biofilm matrix. This study systematically tested five techniques, among which the XTT assay and wrinkled colony measurement provided the most reproducible, accurate, and efficient methods for the quantitative estimation of Vibrionaceae biofilms.

  4. A-TEEM™, a new molecular fingerprinting technique: simultaneous absorbance-transmission and fluorescence excitation-emission matrix method

    NASA Astrophysics Data System (ADS)

    Quatela, Alessia; Gilmore, Adam M.; Steege Gall, Karen E.; Sandros, Marinella; Csatorday, Karoly; Siemiarczuk, Alex; Yang, Boqian (Ben); Camenen, Loïc

    2018-04-01

    We investigate the new simultaneous absorbance-transmission and fluorescence excitation-emission matrix method for rapid and effective characterization of the varying components of a mixture. The absorbance-transmission and fluorescence excitation-emission matrix method uniquely facilitates correction of fluorescence inner-filter effects to yield quantitative fluorescence spectral information that is largely independent of component concentration. This is significant because it allows one to effectively monitor quantitative component changes using multivariate methods and to generate and evaluate spectral libraries. We present the use of this novel instrument in different fields, i.e., tracking changes in complex mixtures, including natural water and wine, as well as monitoring the stability and aggregation of hormones for biotherapeutics.

  5. Quantitative Frequency-Domain Passive Cavitation Imaging

    PubMed Central

    Haworth, Kevin J.; Bader, Kenneth B.; Rich, Kyle T.; Holland, Christy K.; Mast, T. Douglas

    2017-01-01

    Passive cavitation detection has been an instrumental technique for measuring cavitation dynamics, elucidating concomitant bioeffects, and guiding ultrasound therapies. Recently, techniques have been developed to create images of cavitation activity to provide investigators with a more complete set of information. These techniques use arrays to record and subsequently beamform received cavitation emissions, rather than processing emissions received on a single-element transducer. In this paper, the methods for performing frequency-domain delay, sum, and integrate passive imaging are outlined. The method can be applied to any passively acquired acoustic scattering or emissions, including cavitation emissions. In order to compare data across different systems, techniques for normalizing Fourier transformed data and converting the data to the acoustic energy received by the array are described. A discussion of hardware requirements and alternative imaging approaches is additionally provided. Examples are provided in MATLAB. PMID:27992331
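    The delay, sum, and integrate idea can be sketched as follows. The paper's examples are in MATLAB; this is an illustrative Python re-sketch, with an assumed linear-array geometry and a simulated monopole emitter rather than the authors' formulation: element spectra are phase-aligned to a candidate pixel, summed across the array, and the power is integrated over the analysis band.

```python
import numpy as np

c = 1500.0                       # assumed sound speed, m/s
fs = 10e6                        # sampling rate, Hz
n_elem, n_samp = 16, 1024
elem_x = (np.arange(n_elem) - (n_elem - 1) / 2) * 3e-4   # element x-positions, m
src_x, src_z = 0.0, 0.02         # simulated emitter location, m

# Simulate received emissions: a 1 MHz tone delayed by time of flight to each element.
t = np.arange(n_samp) / fs
f0 = 1e6
signals = np.zeros((n_elem, n_samp))
for i, x in enumerate(elem_x):
    delay = np.hypot(src_x - x, src_z) / c
    signals[i] = np.sin(2 * np.pi * f0 * (t - delay)) * (t >= delay)

spectra = np.fft.rfft(signals, axis=1)
freqs = np.fft.rfftfreq(n_samp, 1 / fs)
band = (freqs > 0.5e6) & (freqs < 1.5e6)   # analysis band around the emission
fb = freqs[band]

def pixel_energy(px, pz):
    """Phase-align ('delay') element spectra to the pixel, sum, integrate power."""
    dists = np.hypot(px - elem_x, pz)
    steer = np.exp(2j * np.pi * fb[None, :] * dists[:, None] / c)
    summed = (spectra[:, band] * steer).sum(axis=0)
    return np.sum(np.abs(summed) ** 2)

# Lateral scan through the source depth: the energy map peaks at the emitter.
x_grid = np.linspace(-0.005, 0.005, 21)
energies = [pixel_energy(x, src_z) for x in x_grid]
x_peak = x_grid[int(np.argmax(energies))]
```

    With this small aperture the lateral peak localizes the emitter well, while axial (depth) resolution is intrinsically poorer, a known property of passive imaging.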

  6. Semi-quantitative prediction of a multiple API solid dosage form with a combination of vibrational spectroscopy methods.

    PubMed

    Hertrampf, A; Sousa, R M; Menezes, J C; Herdling, T

    2016-05-30

    Quality control (QC) in the pharmaceutical industry is a key activity in ensuring medicines have the required quality, safety and efficacy for their intended use. QC departments at pharmaceutical companies are responsible for all release testing of final products and also of all incoming raw materials. Near-infrared spectroscopy (NIRS) and Raman spectroscopy are important techniques for fast and accurate identification and qualification of pharmaceutical samples. Tablets containing two different active pharmaceutical ingredients (API) [bisoprolol, hydrochlorothiazide] in different commercially available dosages were analysed using Raman and NIR spectroscopy. The goal was to define multivariate models based on each vibrational spectroscopy to discriminate between different dosages (identity) and predict their dosage (semi-quantitative). Furthermore, the combination of spectroscopic techniques was investigated. To this end, two different multiblock techniques based on PLS were applied: multiblock PLS (MB-PLS) and sequential-orthogonalised PLS (SO-PLS). NIRS showed better results compared to Raman spectroscopy for both identification and quantitation. The multiblock techniques investigated showed that each spectroscopy contains information not present or captured with the other spectroscopic technique, thus demonstrating that there is a potential benefit in their combined use for both identification and quantitation purposes. Copyright © 2016 Elsevier B.V. All rights reserved.
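    The simplest multiblock idea can be sketched by concatenating block-scaled NIR and Raman matrices into a super-block before a one-component PLS1 fit, as below with synthetic spectra. MB-PLS and SO-PLS as used in the paper involve additional block weighting and orthogonalisation steps not reproduced here; this is only the shared starting point.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p_nir, p_raman = 30, 50, 40
dose = rng.uniform(1.0, 10.0, n)    # simulated API dosage (response)
# synthetic spectra: dose times a fixed loading, plus noise
nir = np.outer(dose, rng.normal(size=p_nir)) + 0.1 * rng.normal(size=(n, p_nir))
raman = np.outer(dose, rng.normal(size=p_raman)) + 0.1 * rng.normal(size=(n, p_raman))

def block_scale(X):
    """Center a block and scale it to unit total variance, so neither
    instrument dominates the super-block merely by size or units."""
    Xc = X - X.mean(axis=0)
    return Xc / np.sqrt((Xc ** 2).sum())

X = np.hstack([block_scale(nir), block_scale(raman)])   # super-block
y = dose - dose.mean()

# one-component PLS1: weight vector proportional to the covariance X'y
w = X.T @ y
w /= np.linalg.norm(w)
t_scores = X @ w
b = (t_scores @ y) / (t_scores @ t_scores)   # regress y on the scores
y_hat = b * t_scores + dose.mean()
r = np.corrcoef(y_hat, dose)[0, 1]           # fit quality on training data
```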

  7. Optimisation of techniques for quantification of Botrytis cinerea in grape berries and receptacles by quantitative polymerase chain reaction

    USDA-ARS?s Scientific Manuscript database

    Quantitative PCR (qPCR) can be used to detect and monitor pathogen colonization, but early attempts to apply the technology to Botrytis cinerea infection of grape berries have identified limitations to current techniques. In this study, four DNA extraction methods, two grinding methods, two grape or...

  8. Characterization and measurement of natural gas trace constituents. Volume 1. Arsenic. Final report, June 1989-October 1993

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chao, S.S.; Attari, A.

    1995-01-01

    The discovery of arsenic compounds, as alkylarsines, in natural gas prompted this research program to develop reliable measurement techniques needed to assess the efficiency of removal processes for these environmentally sensitive substances. These techniques include sampling, speciation, quantitation and on-line instrumental methods for monitoring the total arsenic concentration. The current program has yielded many products, including calibration standards, arsenic-specific sorbents, sensitive analytical methods and instrumentation. Four laboratory analytical methods have been developed and successfully employed for arsenic determination in natural gas. These methods use GC-AED and GC-MS instruments to speciate alkylarsines, and peroxydisulfate extraction with FIAS, a special carbon sorbent with XRF, and an IGT-developed sorbent with GFAA for total arsenic measurement.

  9. Ptychography: use of quantitative phase information for high-contrast label free time-lapse imaging of living cells

    NASA Astrophysics Data System (ADS)

    Suman, Rakesh; O'Toole, Peter

    2014-03-01

    Here we report a novel label free, high contrast and quantitative method for imaging live cells. The technique reconstructs an image from overlapping diffraction patterns using a ptychographical algorithm. The algorithm utilises both amplitude and phase data from the sample to report on quantitative changes related to the refractive index (RI) and thickness of the specimen. We report the ability of this technique to generate high contrast images, to visualise neurite elongation in neuronal cells, and to provide a measure of cell proliferation.
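    The link between measured phase and specimen thickness mentioned above is the relation phi = 2*pi*(n_cell - n_medium)*t / lambda. A toy conversion follows; the wavelength and refractive indices are assumed illustrative values, not the paper's.

```python
import numpy as np

wavelength = 0.633e-6           # m, assumed HeNe-like illumination
n_cell, n_medium = 1.38, 1.33   # assumed typical cell / medium refractive indices

# toy quantitative phase map in radians (a real map comes from the reconstruction)
phase = np.array([[0.0, 1.2],
                  [2.5, 3.1]])

# invert phi = 2*pi*(n_cell - n_medium)*t / lambda for thickness t
thickness = phase * wavelength / (2 * np.pi * (n_cell - n_medium))
```

    With these numbers a 3.1 rad phase shift corresponds to roughly 6.2 micrometres of cell thickness, illustrating why a 0.05 index contrast makes whole cells readily measurable.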

  10. Guidelines for intraoperative neuromonitoring using raw (analog or digital waveforms) and quantitative electroencephalography: a position statement by the American Society of Neurophysiological Monitoring.

    PubMed

    Isley, Michael R; Edmonds, Harvey L; Stecker, Mark

    2009-12-01

    Electroencephalography (EEG) is one of the oldest and most commonly utilized modalities for intraoperative neuromonitoring. Historically, interest in the EEG patterns associated with anesthesia is as old as the discovery of the EEG itself. The evolution of its intraoperative use was also expanded to include monitoring for assessing cortical perfusion and oxygenation during a variety of vascular, cardiac, and neurosurgical procedures. Furthermore, a number of quantitative or computer-processed algorithms have also been developed to aid in its visual representation and interpretation. The primary clinical outcomes for which modern EEG technology has made significant intraoperative contributions include: (1) recognizing and/or preventing perioperative ischemic insults, and (2) monitoring of brain function for anesthetic drug administration in order to determine depth of anesthesia (and level of consciousness), including the tailoring of drug levels to achieve a predefined neural effect (e.g., burst suppression). While the accelerated development of microprocessor technologies has fostered an extraordinarily rapid growth in the use of intraoperative EEG, there is still no universal adoption of a monitoring technique(s) or of criteria for its neural end-point(s) by anesthesiologists, surgeons, neurologists, and neurophysiologists. One of the most important limitations to routine intraoperative use of EEG may be the lack of standardization of methods, alarm criteria, and recommendations related to its application. Lastly, refinements in technology and signal processing can be expected to advance the usefulness of the intraoperative EEG for both anesthetic and surgical management of patients. This paper is the position statement of the American Society of Neurophysiological Monitoring. It is the practice guidelines for the intraoperative use of raw (analog and digital) and quantitative EEG. 
The following recommendations are based on trends in the current scientific and clinical literature and meetings, guidelines published by other organizations, expert opinion, and public review by the members of the American Society of Neurophysiological Monitoring. This document may not include all possible methodologies and interpretative criteria, nor do the authors and their sponsor intentionally exclude any new alternatives. The use of the techniques reviewed in these guidelines may reduce perioperative neurological morbidity and mortality. This position paper summarizes commonly used protocols for recording and interpreting the intraoperative use of EEG. Furthermore, the American Society of Neurophysiological Monitoring recognizes this as primarily an educational service.

  11. Simultaneous off-axis multiplexed holography and regular fluorescence microscopy of biological cells.

    PubMed

    Nygate, Yoav N; Singh, Gyanendra; Barnea, Itay; Shaked, Natan T

    2018-06-01

    We present a new technique for obtaining simultaneous multimodal quantitative phase and fluorescence microscopy of biological cells, providing both quantitative phase imaging and molecular specificity using a single camera. Our system is based on an interferometric multiplexing module, externally positioned at the exit of an optical microscope. In contrast to previous approaches, the presented technique allows conventional fluorescence imaging, rather than interferometric off-axis fluorescence imaging. We demonstrate the presented technique for imaging fluorescent beads and live biological cells.

  12. A collection of flow visualization techniques used in the Aerodynamic Research Branch

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Theoretical and experimental research on unsteady aerodynamic flows is discussed. Complex flow fields that involve separations, vortex interactions, and transonic flow effects were investigated. Flow visualization techniques are used to obtain a global picture of the flow phenomena before detailed quantitative studies are undertaken. A wide variety of methods are used to visualize fluid flow, and a sampling of these methods is presented. It is emphasized that visualization is a precursor to thorough quantitative analysis and to subsequent physical understanding of these flow fields.

  13. Review of chemical separation techniques applicable to alpha spectrometric measurements

    NASA Astrophysics Data System (ADS)

    de Regge, P.; Boden, R.

    1984-06-01

    Prior to alpha-spectrometric measurements several chemical manipulations are usually required to obtain alpha-radiating sources with the desired radiochemical and chemical purity. These include sampling, dissolution or leaching of the elements of interest, conditioning of the solution, chemical separation and preparation of the alpha-emitting source. The choice of a particular method is dependent on different criteria but always involves aspects of the selectivity or the quantitative nature of the separations. The availability of suitable tracers or spikes and modern high resolution instruments resulted in the widespread application of isotopic dilution techniques to the problems associated with quantitative chemical separations. This enhanced the development of highly selective methods and reagents which led to important simplifications in the separation schemes. The chemical separation methods commonly used in connection with alpha-spectrometric measurements involve precipitation with selected scavenger elements, solvent extraction, ion exchange and electrodeposition techniques or any combination of them. Depending on the purpose of the final measurement and the type of sample available the chemical separation methods have to be adapted to the particular needs of environment monitoring, nuclear chemistry and metrology, safeguards and safety, waste management and requirements in the nuclear fuel cycle. Against the background of separation methods available in the literature the present paper highlights the current developments and trends in the chemical techniques applicable to alpha spectrometry.
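    The appeal of isotope dilution is that the chemical yield of the separation cancels in the analyte-to-tracer count ratio: both isotopes suffer identical losses, so only the ratio of their alpha peaks matters. A minimal sketch with illustrative numbers (not from the paper); the nuclide choices are an assumption.

```python
# A known activity of an isotopic tracer (e.g. 242Pu) is spiked into the sample
# before separation. After separation and counting, the analyte activity
# (e.g. 239Pu) follows from the peak count ratio, independent of yield.
tracer_activity_bq = 0.050   # added spike activity, Bq (illustrative)
counts_analyte = 1840        # net counts in the analyte alpha peak
counts_tracer = 2300         # net counts in the tracer alpha peak

analyte_activity_bq = tracer_activity_bq * counts_analyte / counts_tracer
```

    Here the sample works out to 0.040 Bq even though the absolute recovery of plutonium through the separation is never measured.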

  14. Advanced imaging of the macrostructure and microstructure of bone

    NASA Technical Reports Server (NTRS)

    Genant, H. K.; Gordon, C.; Jiang, Y.; Link, T. M.; Hans, D.; Majumdar, S.; Lang, T. F.

    2000-01-01

    Noninvasive and/or nondestructive techniques are capable of providing more macro- or microstructural information about bone than standard bone densitometry. Although the latter provides important information about osteoporotic fracture risk, numerous studies indicate that bone strength is only partially explained by bone mineral density. Quantitative assessment of macro- and microstructural features may improve our ability to estimate bone strength. The methods available for quantitatively assessing macrostructure include (besides conventional radiographs) quantitative computed tomography (QCT) and volumetric quantitative computed tomography (vQCT). Methods for assessing microstructure of trabecular bone noninvasively and/or nondestructively include high-resolution computed tomography (hrCT), micro-computed tomography (muCT), high-resolution magnetic resonance (hrMR), and micromagnetic resonance (muMR). vQCT, hrCT and hrMR are generally applicable in vivo; muCT and muMR are principally applicable in vitro. Although considerable progress has been made in the noninvasive and/or nondestructive imaging of the macro- and microstructure of bone, considerable challenges and dilemmas remain. From a technical perspective, the balance between spatial resolution versus sampling size, or between signal-to-noise versus radiation dose or acquisition time, needs further consideration, as do the trade-offs between the complexity and expense of equipment and the availability and accessibility of the methods. The relative merits of in vitro imaging and its ultrahigh resolution but invasiveness versus those of in vivo imaging and its modest resolution but noninvasiveness also deserve careful attention. From a clinical perspective, the challenges for bone imaging include balancing the relative advantages of simple bone densitometry against the more complex architectural features of bone or, similarly, the deeper research requirements against the broader clinical needs. 
The considerable potential biological differences between the peripheral appendicular skeleton and the central axial skeleton have to be addressed further. Finally, the relative merits of these sophisticated imaging techniques have to be weighed with respect to their applications as diagnostic procedures requiring high accuracy or reliability on one hand and their monitoring applications requiring high precision or reproducibility on the other. Copyright 2000 S. Karger AG, Basel.

  15. Cartilage Repair Surgery: Outcome Evaluation by Using Noninvasive Cartilage Biomarkers Based on Quantitative MRI Techniques?

    PubMed Central

    Jungmann, Pia M.; Baum, Thomas; Bauer, Jan S.; Karampinos, Dimitrios C.; Link, Thomas M.; Li, Xiaojuan; Trattnig, Siegfried; Rummeny, Ernst J.; Woertler, Klaus; Welsch, Goetz H.

    2014-01-01

    Background. New quantitative magnetic resonance imaging (MRI) techniques are increasingly applied as outcome measures after cartilage repair. Objective. To review the current literature on the use of quantitative MRI biomarkers for evaluation of cartilage repair at the knee and ankle. Methods. Using PubMed literature research, studies on biochemical, quantitative MR imaging of cartilage repair were identified and reviewed. Results. Quantitative MR biomarkers detect early degeneration of articular cartilage, mainly represented by an increasing water content, collagen disruption, and proteoglycan loss. Recently, feasibility of biochemical MR imaging of cartilage repair tissue and surrounding cartilage was demonstrated. Ultrastructural properties of the tissue after different repair procedures resulted in differences in imaging characteristics. T2 mapping, T1rho mapping, delayed gadolinium-enhanced MRI of cartilage (dGEMRIC), and diffusion weighted imaging (DWI) are applicable on most clinical 1.5 T and 3 T MR scanners. Currently, a standard of reference is difficult to define and knowledge is limited concerning correlation of clinical and MR findings. The lack of histological correlations complicates the identification of the exact tissue composition. Conclusions. A multimodal approach combining several quantitative MRI techniques in addition to morphological and clinical evaluation might be promising. Further investigations are required to demonstrate the potential for outcome evaluation after cartilage repair. PMID:24877139

  16. MR Fingerprinting for Rapid Quantitative Abdominal Imaging

    PubMed Central

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D.; Wright, Katherine L.; Seiberlich, Nicole; Griswold, Mark A.

    2016-01-01

    Purpose To develop a magnetic resonance (MR) “fingerprinting” technique for quantitative abdominal imaging. Materials and Methods This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Results Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). 
Conclusion A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties within one 19-second breath hold, with measurements comparable to those in published literature. © RSNA, 2016 PMID:26794935
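    The lesion-versus-parenchyma comparison above uses a two-tailed two-sample t test; its core statistic can be computed directly from the reported summary values. A sketch using Welch's form of the statistic, with group sizes assumed for illustration (the paper does not restate its exact per-group n or degrees of freedom here):

```python
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    """Welch's two-sample t statistic from group means, SDs, and sizes."""
    return (m1 - m2) / math.sqrt(s1 ** 2 / n1 + s2 ** 2 / n2)

# reported T1 values: metastatic adenocarcinoma 1673 +/- 331 msec,
# surrounding liver parenchyma 840 +/- 113 msec; n = 20 per group is assumed
t_stat = welch_t(1673, 331, 20, 840, 113, 20)
```

    With these assumptions the statistic lands near 10.7, comfortably consistent with the reported P < .0001.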

  17. MR Fingerprinting for Rapid Quantitative Abdominal Imaging.

    PubMed

    Chen, Yong; Jiang, Yun; Pahwa, Shivani; Ma, Dan; Lu, Lan; Twieg, Michael D; Wright, Katherine L; Seiberlich, Nicole; Griswold, Mark A; Gulani, Vikas

    2016-04-01

    To develop a magnetic resonance (MR) "fingerprinting" technique for quantitative abdominal imaging. This HIPAA-compliant study had institutional review board approval, and informed consent was obtained from all subjects. To achieve accurate quantification in the presence of marked B0 and B1 field inhomogeneities, the MR fingerprinting framework was extended by using a two-dimensional fast imaging with steady-state free precession, or FISP, acquisition and a Bloch-Siegert B1 mapping method. The accuracy of the proposed technique was validated by using agarose phantoms. Quantitative measurements were performed in eight asymptomatic subjects and in six patients with 20 focal liver lesions. A two-tailed Student t test was used to compare the T1 and T2 results in metastatic adenocarcinoma with those in surrounding liver parenchyma and healthy subjects. Phantom experiments showed good agreement with standard methods in T1 and T2 after B1 correction. In vivo studies demonstrated that quantitative T1, T2, and B1 maps can be acquired within a breath hold of approximately 19 seconds. T1 and T2 measurements were compatible with those in the literature. Representative values included the following: liver, 745 msec ± 65 (standard deviation) and 31 msec ± 6; renal medulla, 1702 msec ± 205 and 60 msec ± 21; renal cortex, 1314 msec ± 77 and 47 msec ± 10; spleen, 1232 msec ± 92 and 60 msec ± 19; skeletal muscle, 1100 msec ± 59 and 44 msec ± 9; and fat, 253 msec ± 42 and 77 msec ± 16, respectively. T1 and T2 in metastatic adenocarcinoma were 1673 msec ± 331 and 43 msec ± 13, respectively, significantly different from surrounding liver parenchyma relaxation times of 840 msec ± 113 and 28 msec ± 3 (P < .0001 and P < .01) and those in hepatic parenchyma in healthy volunteers (745 msec ± 65 and 31 msec ± 6, P < .0001 and P = .021, respectively). 
A rapid technique for quantitative abdominal imaging was developed that allows simultaneous quantification of multiple tissue properties within one 19-second breath hold, with measurements comparable to those in published literature.

  18. Overview of Supersonic Aerodynamics Measurement Techniques in the NASA Langley Unitary Plan Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Erickson, Gary E.

    2007-01-01

    An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper and is any qualitative or quantitative experimental approach that provides information leading to the improved understanding of the supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.

  19. Evaluating the dynamic response of in-flight thrust calculation techniques during throttle transients

    NASA Technical Reports Server (NTRS)

    Ray, Ronald J.

    1994-01-01

    New flight test maneuvers and analysis techniques for evaluating the dynamic response of in-flight thrust models during throttle transients have been developed and validated. The approach is based on the aircraft and engine performance relationship between thrust and drag. Two flight test maneuvers, a throttle step and a throttle frequency sweep, were developed and used in the study. Graphical analysis techniques, including a frequency domain analysis method, were also developed and evaluated. They provide quantitative and qualitative results. Four thrust calculation methods were used to demonstrate and validate the test technique. Flight test applications on two high-performance aircraft confirmed the test methods as valid and accurate. These maneuvers and analysis techniques were easy to implement and use. Flight test results indicate the analysis techniques can identify the combined effects of model error and instrumentation response limitations on the calculated thrust value. The methods developed in this report provide an accurate approach for evaluating, validating, or comparing thrust calculation methods for dynamic flight applications.
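    The frequency-domain analysis idea, exciting the system at a known frequency and reading gain off the FFT ratio of output to input, can be sketched as below. The first-order lag stands in for the real engine/thrust-model dynamics, and all values are illustrative, not flight data.

```python
import numpy as np

fs, dur = 100.0, 20.0
t = np.arange(0, dur, 1 / fs)
f_ex = 1.0                           # excitation frequency, Hz (one sweep line)
u = np.sin(2 * np.pi * f_ex * t)     # commanded input (throttle)

# simulated response, e.g. calculated thrust: first-order lag, tau = 0.2 s
tau = 0.2
y = np.zeros_like(u)
for i in range(1, len(t)):
    y[i] = y[i - 1] + (u[i] - y[i - 1]) * (1 / fs) / tau

# frequency response at the excitation bin: H(f) = Y(f) / U(f)
U, Y = np.fft.rfft(u), np.fft.rfft(y)
k = int(round(f_ex * dur))           # FFT bin index of f_ex (bin spacing 1/dur)
gain = abs(Y[k] / U[k])

# analytic first-order gain for comparison: 1 / sqrt(1 + (2*pi*f*tau)^2)
expected = 1 / np.sqrt(1 + (2 * np.pi * f_ex * tau) ** 2)
```

    Repeating this over the sweep frequencies yields the gain/phase curves from which model error and instrumentation response limits can be separated, as the report describes.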

  20. Study on THz spectra of the active ingredients in the TCM

    NASA Astrophysics Data System (ADS)

    Ma, ShiHua; Wang, WenFeng; Liu, GuiFeng; Ge, Min; Zhu, ZhiYong

    2008-03-01

    Terahertz spectroscopy has tremendous potential for evaluating the quality of drugs, including traditional Chinese medicine (TCM). In this paper, terahertz time-domain spectroscopy was used to investigate two active ingredients, andrographolide and dehydroandrographolide, isolated from Andrographis paniculata (Burm. f.) Nees. We also measured mixtures of the two active ingredients at different ratios, and quantitative analysis was applied to determine the content of each compound. Terahertz spectroscopy is a promising technique for identifying components, evaluating drug sanitation, and inspecting the quality of medicines, including TCM.
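    A minimal sketch of the mixture quantitation step, assuming the mixture spectrum is a linear combination of the pure-component spectra (Beer-Lambert-like behavior); the spectra and fractions here are illustrative values, not measured THz data:

```python
import numpy as np

# Hypothetical absorbance spectra of the two pure components over shared frequencies
andrographolide = np.array([0.2, 0.8, 0.5, 0.1])
dehydroandrographolide = np.array([0.6, 0.3, 0.9, 0.4])

# Mixture spectrum assumed to be a linear combination of the components
true_fractions = np.array([0.7, 0.3])
mixture = true_fractions[0] * andrographolide + true_fractions[1] * dehydroandrographolide

# Recover the component fractions by linear least squares (spectral unmixing)
A = np.column_stack([andrographolide, dehydroandrographolide])
fractions, *_ = np.linalg.lstsq(A, mixture, rcond=None)
print(fractions)  # → approximately [0.7, 0.3]
```

    With measured spectra the fit would be overdetermined and noisy, but the same least-squares step recovers the component contents.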

  1. A quartz nanopillar hemocytometer for high-yield separation and counting of CD4+ T lymphocytes

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Joo; Seol, Jin-Kyeong; Wu, Yu; Ji, Seungmuk; Kim, Gil-Sung; Hyung, Jung-Hwan; Lee, Seung-Yong; Lim, Hyuneui; Fan, Rong; Lee, Sang-Kwon

    2012-03-01

    We report the development of a novel quartz nanopillar (QNP) array cell separation system capable of selectively capturing and isolating a single cell population including primary CD4+ T lymphocytes from the whole pool of splenocytes. Integrated with a photolithographically patterned hemocytometer structure, the streptavidin (STR)-functionalized-QNP (STR-QNP) arrays allow for direct quantitation of captured cells using high content imaging. This technology exhibits an excellent separation yield (efficiency) of ~95.3 +/- 1.1% for the CD4+ T lymphocytes from the mouse splenocyte suspensions and good linear response for quantitating captured CD4+ T-lymphoblasts, which is comparable to flow cytometry and outperforms any non-nanostructured surface capture techniques, i.e. cell panning. This nanopillar hemocytometer represents a simple, yet efficient cell capture and counting technology and may find immediate applications for diagnosis and immune monitoring in the point-of-care setting.
Electronic supplementary information (ESI) available. See DOI: 10.1039/c2nr11338d

  2. Development of liquid chromatography-tandem mass spectrometry methods for the quantitation of Anisakis simplex proteins in fish.

    PubMed

    Fæste, Christiane Kruse; Moen, Anders; Schniedewind, Björn; Haug Anonsen, Jan; Klawitter, Jelena; Christians, Uwe

    2016-02-05

    The parasite Anisakis simplex is present in many marine fish species that are directly used as food or in processed products. The anisakid larvae infect mostly the gut and inner organs of fish but have also been shown to penetrate into the fillet. Thus, human health can be at risk, either by contracting anisakiasis through the consumption of raw or under-cooked fish, or by sensitisation to anisakid proteins in processed food. A number of different methods for the detection of A. simplex in fish and products thereof have been developed, including visual techniques and PCR for larvae tracing, and immunological assays for the determination of proteins. The recent identification of a number of anisakid proteins by mass spectrometry-based proteomics has laid the groundwork for the development of two quantitative liquid chromatography-tandem mass spectrometry methods for the detection of A. simplex in fish that are described in the present study. Both the label-free, semi-quantitative nLC-nESI-Orbitrap-MS/MS (MS1) and the heavy-peptide-based absolute quantification (AQUA) LC-TripleQ-MS/MS (MS2) use unique reporter peptides derived from anisakid hemoglobin and SXP/RAL-2 protein as analytes. Standard curves in buffer and in salmon matrix showed limits of detection at 1 μg/mL and 10 μg/mL for MS1 and 0.1 μg/mL and 2 μg/mL for MS2. Preliminary method validation included the assessment of sensitivity, repeatability, reproducibility, and applicability to incurred and naturally-contaminated samples for both assays. By further optimization and full validation in accordance with current recommendations, the LC-MS/MS methods could be standardized and used generally as confirmative techniques for the detection of A. simplex protein in fish. Copyright © 2016 Elsevier B.V. All rights reserved.
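    The standard-curve step common to both assays can be sketched as a linear calibration followed by back-calculation of an unknown; the concentrations and peak-area ratios below are illustrative, not the study's data:

```python
import numpy as np

# Hypothetical calibration standards for a reporter peptide: spiked concentration
# (ug/mL) vs. measured peak-area ratio to the heavy-labeled internal standard
conc = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
ratio = np.array([0.05, 0.24, 0.51, 1.02, 2.49])

# Fit a linear standard curve: ratio = slope * conc + intercept
slope, intercept = np.polyfit(conc, ratio, 1)

def quantify(sample_ratio):
    """Back-calculate the concentration of an unknown from its area ratio."""
    return (sample_ratio - intercept) / slope

print(round(quantify(1.25), 2))  # ≈ 2.5 ug/mL for this toy curve
```

    In practice the curve would be prepared in matrix (e.g., salmon) as well as buffer, since the two can differ, which is why the abstract reports separate limits of detection for each.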

  3. MRI technique for the snapshot imaging of quantitative velocity maps using RARE

    NASA Astrophysics Data System (ADS)

    Shiko, G.; Sederman, A. J.; Gladden, L. F.

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2-weighted, not T2*-weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98 × 49 μm², within 20 min, and monitored over ~13 h. The tablet was observed to experience a heterogeneous flow field and, hence, a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast-dissolving immediate-release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390 × 390 μm². The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques.
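    One way the odd/even-echo phase maps can be averaged robustly is in the complex domain, so that values near the ±π wrap combine correctly; this is a sketch of that idea with illustrative phase values, not the paper's reconstruction code:

```python
import numpy as np

# Phase maps (radians) from odd and even echoes; the first two pixels sit
# near the +/-pi wrap, where naive arithmetic averaging would fail.
odd_phase = np.array([3.10, -3.10, 0.50])
even_phase = np.array([3.12, -3.12, 0.54])

# Average the unit phasors and take the angle of the sum: this is the
# circular mean, immune to the 2*pi discontinuity.
avg_phase = np.angle(np.exp(1j * odd_phase) + np.exp(1j * even_phase))
print(np.round(avg_phase, 3))  # → [ 3.11 -3.11  0.52]
```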

  4. Preparing the NDE engineers of the future: Education, training, and diversity

    NASA Astrophysics Data System (ADS)

    Holland, Stephen D.

    2017-02-01

    As quantitative NDE has matured and entered the mainstream, it has created an industry need for engineers who can select, evaluate, and qualify NDE techniques to satisfy quantitative engineering requirements. NDE as a field is cross-disciplinary with major NDE techniques relying on a broad spectrum of physics disciplines including fluid mechanics, electromagnetics, mechanical waves, and high energy physics. An NDE engineer needs broad and deep understanding of the measurement physics across modalities, a general engineering background, and familiarity with shop-floor practices and tools. While there are a wide range of certification and training programs worldwide for NDE technicians, there are few programs aimed at engineers. At the same time, substantial demographic shifts are underway with many experienced NDE engineers and technicians nearing retirement, and with new generations coming from much more diverse backgrounds. There is a need for more and better education opportunities for NDE engineers. Both teaching and learning NDE engineering are inherently challenging because of the breadth and depth of knowledge required. At the same time, sustaining the field in a more diverse era will require broadening participation of previously underrepresented groups. The QNDE 2016 conference in Atlanta, GA included a session on NDE education, training, and diversity. This paper summarizes the outcomes and discussion from this session.

  5. Primary Phase Field of the Pb-Doped 2223 High-Tc Superconductor in the (Bi, Pb)-Sr-Ca-Cu-O System

    PubMed Central

    Wong-Ng, W.; Cook, L. P.; Kearsley, A.; Greenwood, W.

    1999-01-01

    Both liquidus and subsolidus phase equilibrium data are of central importance for applications of high temperature superconductors in the (Bi, Pb)-Sr-Ca-Cu-O system, including material synthesis, melt processing and single crystal growth. The subsolidus equilibria of the 110 K high-Tc Pb-doped 2223 ([Bi, Pb], Sr, Ca, Cu) phase and the location of the primary phase field (crystallization field) have been determined in this study. For the quantitative determination of liquidus data, a wicking technique was developed to capture the melt for quantitative microchemical analysis. A total of 29 five-phase volumes that include the 2223 phase as a component was obtained. The initial melt compositions of these volumes range from a mole fraction of 7.3 % to 28.0 % for Bi, 11.3 % to 27.8 % for Sr, 1.2 % to 19.4 % for Pb, 9.8 % to 30.8 % for Ca, and 17.1 % to 47.0 % for Cu. Based on these data, the crystallization field for the 2223 phase was constructed using the convex hull technique. A section of this “volume” was obtained by holding two components of the composition at the median value, allowing projection on the other three axes to show the extent of the field.
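    The convex hull construction can be illustrated in two dimensions with Andrew's monotone chain algorithm; the (Bi, Sr) mole-fraction pairs below are toy values, and the study's crystallization field is of course built in a higher-dimensional composition space:

```python
# Build the boundary of a 2D "field" from scattered composition points.
def convex_hull(points):
    """Return hull vertices in counter-clockwise order (monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Toy (Bi, Sr) mole-fraction pairs of initial melt compositions
melts = [(7.3, 11.3), (28.0, 27.8), (15.0, 20.0), (28.0, 11.3), (7.3, 27.8)]
print(convex_hull(melts))  # the four corner points; (15.0, 20.0) is interior
```

    Projecting a higher-dimensional hull onto pairs of composition axes, as the paper does by fixing two components at their median values, reduces to exactly this kind of 2D/3D boundary construction.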

  6. Incorporating Multiple-Choice Questions into an AACSB Assurance of Learning Process: A Course-Embedded Assessment Application to an Introductory Finance Course

    ERIC Educational Resources Information Center

    Santos, Michael R.; Hu, Aidong; Jordan, Douglas

    2014-01-01

    The authors offer a classification technique to make a quantitative skills rubric more operational, with the groupings of multiple-choice questions to match the student learning levels in knowledge, calculation, quantitative reasoning, and analysis. The authors applied this classification technique to the mid-term exams of an introductory finance…

  7. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
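    A minimal sketch of the kind of counting step such an algorithm performs: count connected bright regions in a thresholded nailfold image and report density per unit length. The toy image, threshold, and field width are illustrative assumptions, not the paper's method:

```python
def count_regions(binary):
    """Count 4-connected foreground regions via iterative flood fill."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1  # new region found; flood-fill to mark all of it
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Toy thresholded image: three separate capillary loops along the nailfold
image = [
    [1, 0, 0, 1, 0, 0, 1],
    [1, 0, 0, 1, 0, 0, 1],
]
capillaries = count_regions(image)
field_width_mm = 1.0                  # assumed field of view
print(capillaries / field_width_mm)   # capillaries per mm → 3.0
```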

  8. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient’s microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  9. Bound Pool Fractions Complement Diffusion Measures to Describe White Matter Micro and Macrostructure

    PubMed Central

    Stikov, Nikola; Perry, Lee M.; Mezer, Aviv; Rykhlevskaia, Elena; Wandell, Brian A.; Pauly, John M.; Dougherty, Robert F.

    2010-01-01

    Diffusion imaging and bound pool fraction (BPF) mapping are two quantitative magnetic resonance imaging techniques that measure microstructural features of the white matter of the brain. Diffusion imaging provides a quantitative measure of the diffusivity of water in tissue. BPF mapping is a quantitative magnetization transfer (qMT) technique that estimates the proportion of exchanging protons bound to macromolecules, such as those found in myelin, and is thus a more direct measure of myelin content than diffusion. In this work, we combine BPF estimates of macromolecular content with measurements of diffusivity within human white matter tracts. Within the white matter, the correlation between BPFs and diffusivity measures such as fractional anisotropy and radial diffusivity was modest, suggesting that diffusion tensor imaging and bound pool fractions are complementary techniques. We found that several major tracts have high BPF, suggesting a higher density of myelin in these tracts. We interpret these results in the context of a quantitative tissue model. PMID:20828622

  10. Statistical innovations in the medical device world sparked by the FDA.

    PubMed

    Campbell, Gregory; Yue, Lilly Q

    2016-01-01

    The world of medical devices, while highly diverse, is extremely innovative, and this facilitates the adoption of innovative statistical techniques. Statisticians in the Center for Devices and Radiological Health (CDRH) at the Food and Drug Administration (FDA) have provided leadership in implementing statistical innovations. The innovations discussed include: the incorporation of Bayesian methods in clinical trials, adaptive designs, the use and development of propensity score methodology in the design and analysis of non-randomized observational studies, the use of tipping-point analysis for missing data, techniques for diagnostic test evaluation, bridging studies for companion diagnostic tests, quantitative benefit-risk decisions, and patient preference studies.

  11. Multiple stage MS in analysis of plasma, serum, urine and in vitro samples relevant to clinical and forensic toxicology.

    PubMed

    Meyer, Golo M; Maurer, Hans H; Meyer, Markus R

    2016-01-01

    This paper reviews MS approaches applied to metabolism studies, structure elucidation and qualitative or quantitative screening of drugs (of abuse) and/or their metabolites. Applications in clinical and forensic toxicology were included using blood plasma or serum, urine, in vitro samples, liquids, solids or plant material. Techniques covered are liquid chromatography coupled to low-resolution and high-resolution multiple stage mass analyzers. Only PubMed listed studies published in English between January 2008 and January 2015 were considered. Approaches are discussed focusing on sample preparation and mass spectral settings. Comments on advantages and limitations of these techniques complete the review.

  12. [Research progress in neuropsychopharmacology updated for the post-genomic era].

    PubMed

    Nakanishi, Toru

    2009-11-01

    Neuropsychopharmacological research in the post-genomic era has been developing rapidly through the use of novel techniques, including DNA chips. We have applied these techniques to investigate the anti-tumor effect of NSAIDs, isolate novel genes specifically expressed in rheumatoid arthritis, and analyze gene expression profiles in mesenchymal stem cells. Recently, we developed a novel quantitative PCR system for detection of BDNF mRNA isoforms. Using this system, we identified the exon-specific mode of expression in acute and chronic pain. In addition, we have generated gene expression profiles of knockout (KO) mice lacking the beta2 subunit of acetylcholine receptors.

  13. Feasibility of free-breathing dynamic contrast-enhanced MRI of gastric cancer using a golden-angle radial stack-of-stars VIBE sequence: comparison with the conventional contrast-enhanced breath-hold 3D VIBE sequence.

    PubMed

    Li, Huan-Huan; Zhu, Hui; Yue, Lei; Fu, Yi; Grimm, Robert; Stemmer, Alto; Fu, Cai-Xia; Peng, Wei-Jun

    2018-05-01

    To investigate the feasibility and diagnostic value of free-breathing, radial, stack-of-stars three-dimensional (3D) gradient echo (GRE) sequence ("golden angle") on dynamic contrast-enhanced (DCE) MRI of gastric cancer. Forty-three gastric cancer patients were divided into cooperative and uncooperative groups. Respiratory fluctuation was observed using an abdominal respiratory gating sensor. Those who could hold their breath for more than 15 s were placed in the cooperative group and the remainder in the uncooperative group. The 3-T MRI scanning protocol included 3D GRE and conventional breath-hold VIBE (volume-interpolated breath-hold examination) sequences, comparing images quantitatively and qualitatively. DCE-MRI parameters from VIBE images of normal gastric wall and malignant lesions were compared. For uncooperative patients, 3D GRE scored higher qualitatively, and had higher SNRs (signal-to-noise ratios) and CNRs (contrast-to-noise ratios) than conventional VIBE quantitatively. Though 3D GRE images scored lower in qualitative parameters compared with conventional VIBE for cooperative patients, they provided images with fewer artefacts. DCE parameters differed significantly between normal gastric wall and lesions, with higher Ve (extracellular volume) and lower Kep (reflux constant) in gastric cancer. The free-breathing, golden-angle, radial stack-of-stars 3D GRE technique is feasible for DCE-MRI of gastric cancer. Dynamic enhanced images can be used for quantitative analysis of this malignancy. • Golden-angle radial stack-of-stars VIBE aids gastric cancer MRI diagnosis. • The 3D GRE technique is suitable for patients unable to suspend respiration. • Method scored higher in the qualitative evaluation for uncooperative patients. • The technique produced images with fewer artefacts than conventional VIBE sequence. • Dynamic enhanced images can be used for quantitative analysis of gastric cancer.
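    The quantitative image comparison rests on SNR and CNR computed from region-of-interest (ROI) statistics; a minimal sketch follows, with illustrative pixel values rather than the study's measurements:

```python
import numpy as np

# ROI pixel samples: lesion, normal gastric wall, and a background air region
lesion_roi = np.array([410.0, 395.0, 420.0, 405.0])
wall_roi = np.array([300.0, 310.0, 295.0, 305.0])
noise_roi = np.array([4.0, -3.0, 5.0, -6.0])

# SNR: mean tissue signal over the noise standard deviation;
# CNR: absolute difference of the two tissue means over the same noise SD
noise_sd = noise_roi.std(ddof=1)
snr = wall_roi.mean() / noise_sd
cnr = abs(lesion_roi.mean() - wall_roi.mean()) / noise_sd
print(round(float(snr), 1), round(float(cnr), 1))
```

    Note that for parallel-imaging and radial reconstructions the noise is not spatially uniform, so published studies often use difference-image or repeated-acquisition noise estimates rather than a single air ROI.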

  14. A Quantitative Comparison of Leading-edge Vortices in Incompressible and Supersonic Flows

    NASA Technical Reports Server (NTRS)

    Wang, F. Y.; Milanovic, I. M.; Zaman, K. B. M. Q.

    2002-01-01

    When requiring quantitative data on delta-wing vortices for design purposes, low-speed results have often been extrapolated to configurations intended for supersonic operation. This practice stems from a lack of data owing to difficulties that plague measurement techniques in high-speed flows. In the present paper an attempt is made to examine this practice by comparing quantitative data on the near-wake properties of such vortices in incompressible and supersonic flows. The incompressible flow data are obtained in experiments conducted in a low-speed wind tunnel. Detailed flow-field properties, including vorticity and turbulence characteristics, obtained by hot-wire and pressure probe surveys are documented. These data are compared, wherever possible, with available data from a past work for a Mach 2.49 flow for the same wing geometry and angles-of-attack. The results indicate that quantitative similarities exist in the distributions of total pressure and swirl velocity. However, the streamwise velocity of the core exhibits different trends. The axial flow characteristics of the vortices in the two regimes are examined, and a candidate theory is discussed.

  15. The detection of large deletions or duplications in genomic DNA.

    PubMed

    Armour, J A L; Barton, D E; Cockburn, D J; Taylor, G R

    2002-11-01

    While methods for the detection of point mutations and small insertions or deletions in genomic DNA are well established, the detection of larger (>100 bp) genomic duplications or deletions can be more difficult. Most mutation scanning methods use PCR as a first step, but the subsequent analyses are usually qualitative rather than quantitative. Gene dosage methods based on PCR need to be quantitative (i.e., they should report molar quantities of starting material) or semi-quantitative (i.e., they should report gene dosage relative to an internal standard). Without some sort of quantitation, heterozygous deletions and duplications may be overlooked and therefore under-ascertained. Gene dosage methods provide the additional benefit of reporting allele drop-out in the PCR. This could impact SNP surveys, where large-scale genotyping may miss null alleles. Here we review recent developments in techniques for the detection of this type of mutation and compare their relative strengths and weaknesses. We emphasize that comprehensive mutation analysis should include scanning for large insertions, deletions, and duplications. Copyright 2002 Wiley-Liss, Inc.
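    The semi-quantitative dosage idea can be sketched as a dosage quotient: the target signal is normalized to a co-amplified internal reference and then compared with a normal two-copy control. The signal values below are illustrative, not from any assay in the review:

```python
def dosage_quotient(target_signal, reference_signal,
                    control_target, control_reference):
    """Gene dosage of a test sample relative to a two-copy control."""
    return (target_signal / reference_signal) / (control_target / control_reference)

# A heterozygous deletion should give a quotient near 0.5,
# a heterozygous duplication near 1.5, and a normal sample near 1.0.
dq = dosage_quotient(target_signal=480, reference_signal=1000,
                     control_target=950, control_reference=1000)
print(round(dq, 2))  # → 0.51: consistent with loss of one copy
```

    This normalization is exactly why such methods can also flag allele drop-out: a null allele halves the target-to-reference ratio even when the PCR itself succeeds.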

  16. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying, have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  17. A thioacidolysis method tailored for higher-throughput quantitative analysis of lignin monomers

    DOE PAGES

    Harman-Ware, Anne E.; Foster, Cliff; Happs, Renee M.; ...

    2016-09-14

    Thioacidolysis is a method used to measure the relative content of lignin monomers bound by β-O-4 linkages. Current thioacidolysis methods are low-throughput as they require tedious steps for reaction product concentration prior to analysis using standard GC methods. A quantitative thioacidolysis method that is accessible with general laboratory equipment and uses a non-chlorinated organic solvent and is tailored for higher-throughput analysis is reported. The method utilizes lignin arylglycerol monomer standards for calibration, requires 1-2 mg of biomass per assay and has been quantified using fast-GC techniques including a Low Thermal Mass Modular Accelerated Column Heater (LTM MACH). Cumbersome steps, including standard purification, sample concentrating and drying, have been eliminated to help aid in consecutive day-to-day analyses needed to sustain a high sample throughput for large screening experiments without the loss of quantitation accuracy. As a result, the method reported in this manuscript has been quantitatively validated against a commonly used thioacidolysis method and across two different research sites with three common biomass varieties to represent hardwoods, softwoods, and grasses.

  18. Comparison of clinical semi-quantitative assessment of muscle fat infiltration with quantitative assessment using chemical shift-based water/fat separation in MR studies of the calf of post-menopausal women.

    PubMed

    Alizai, Hamza; Nardo, Lorenzo; Karampinos, Dimitrios C; Joseph, Gabby B; Yap, Samuel P; Baum, Thomas; Krug, Roland; Majumdar, Sharmila; Link, Thomas M

    2012-07-01

    The goal of this study was to compare the semi-quantitative Goutallier classification for fat infiltration with quantitative fat-fraction derived from a magnetic resonance imaging (MRI) chemical shift-based water/fat separation technique. Sixty-two women (age 61 ± 6 years), 27 of whom had diabetes, underwent MRI of the calf using a T1-weighted fast spin-echo sequence and a six-echo spoiled gradient-echo sequence at 3 T. Water/fat images and fat fraction maps were reconstructed using the IDEAL algorithm with T2* correction and a multi-peak model for the fat spectrum. Two radiologists scored fat infiltration on the T1-weighted images using the Goutallier classification in six muscle compartments. Spearman correlations between the Goutallier grades and the fat fraction were calculated; in addition, intra-observer and inter-observer agreement were calculated. A significant correlation between the clinical grading and the fat fraction values was found for all muscle compartments (P < 0.0001, R values ranging from 0.79 to 0.88). Goutallier grades 0-4 had a fat fraction ranging from 3.5 to 19%. Intra-observer and inter-observer agreement values of 0.83 and 0.81 were calculated for the semi-quantitative grading. Semi-quantitative grading of intramuscular fat and quantitative fat fraction were significantly correlated and both techniques had excellent reproducibility. However, the clinical grading was found to overestimate muscle fat. Fat infiltration of muscle commonly occurs in many metabolic and neuromuscular diseases. • Image-based semi-quantitative classifications for assessing fat infiltration are not well validated. • Quantitative MRI techniques provide an accurate assessment of muscle fat.
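    The reported agreement between semi-quantitative grades and fat fraction is a Spearman rank correlation; a self-contained sketch follows, with illustrative grade/fat-fraction pairs rather than the study's measurements:

```python
def rank(values):
    """Assign 1-based ranks, averaging ranks over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

grades = [0, 1, 1, 2, 3, 4]                       # Goutallier grade per compartment
fat_fraction = [3.5, 5.0, 6.1, 9.8, 14.0, 19.0]   # percent, from chemical shift MRI
print(round(spearman(grades, fat_fraction), 2))   # ≈ 0.99 for these toy pairs
```

    Rank correlation is the natural choice here because the Goutallier scale is ordinal: it asks only whether higher grades go with higher fat fractions, not whether the relationship is linear.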

  19. T1ρ MR Imaging of Human Musculoskeletal System

    PubMed Central

    Wang, Ligong; Regatte, Ravinder R.

    2014-01-01

    Magnetic resonance imaging (MRI) offers the direct visualization of human musculoskeletal (MSK) system, especially all diarthrodial tissues including cartilage, bone, menisci, ligaments, tendon, hip, synovium etc. Conventional MR imaging techniques based on T1- and T2-weighted, proton density (PD) contrast are inconclusive in quantifying early biochemically degenerative changes in MSK system in general and articular cartilage in particular. In recent years, quantitative MR parameter mapping techniques have been used to quantify the biochemical changes in articular cartilage with a special emphasis on evaluating joint injury, cartilage degeneration, and soft tissue repair. In this article, we will focus on cartilage biochemical composition, basic principles of T1ρ MR imaging, implementation of T1ρ pulse sequences, biochemical validation, and summarize the potential applications of T1ρ MR imaging technique in MSK diseases including osteoarthritis (OA), anterior cruciate ligament (ACL) injury, and knee joint repair. Finally, we will also review the potential advantages, challenges, and future prospects of T1ρ MR imaging for widespread clinical translation. PMID:24935818

  20. Visualization of gas dissolution following upward gas migration in porous media: Technique and implications for stray gas

    NASA Astrophysics Data System (ADS)

    Van De Ven, C. J. C.; Mumford, Kevin G.

    2018-05-01

    The study of gas-water mass transfer in porous media is important in many applications, including unconventional resource extraction, carbon storage, deep geological waste storage, and remediation of contaminated groundwater, all of which rely on an understanding of the fate and transport of free and dissolved gas. The novel visual technique developed in this study provided both quantitative and qualitative observations of gas-water mass transfer. Findings included the interaction between free-gas architecture and dissolved-plume migration, as well as plume geometry and longevity. The technique was applied to the injection of CO2 in source patterns expected for stray gas originating from oil and gas operations, to measure dissolved-phase concentrations of CO2 at high spatial and temporal resolution. The data set is the first of its kind to provide high-resolution quantification of gas-water dissolution and will facilitate an improved understanding of the fundamental processes of gas movement and fate in these complex systems.

  1. T₁ρ MRI of human musculoskeletal system.

    PubMed

    Wang, Ligong; Regatte, Ravinder R

    2015-03-01

    Magnetic resonance imaging (MRI) offers the direct visualization of the human musculoskeletal (MSK) system, especially all diarthrodial tissues including cartilage, bone, menisci, ligaments, tendon, hip, synovium, etc. Conventional MRI techniques based on T1 - and T2 -weighted, proton density (PD) contrast are inconclusive in quantifying early biochemically degenerative changes in MSK system in general and articular cartilage in particular. In recent years, quantitative MR parameter mapping techniques have been used to quantify the biochemical changes in articular cartilage, with a special emphasis on evaluating joint injury, cartilage degeneration, and soft tissue repair. In this article we focus on cartilage biochemical composition, basic principles of T1ρ MRI, implementation of T1ρ pulse sequences, biochemical validation, and summarize the potential applications of the T1ρ MRI technique in MSK diseases including osteoarthritis (OA), anterior cruciate ligament (ACL) injury, and knee joint repair. Finally, we also review the potential advantages, challenges, and future prospects of T1ρ MRI for widespread clinical translation. © 2014 Wiley Periodicals, Inc.

  2. Diagnosis of human fascioliasis by stool and blood techniques: update for the present global scenario.

    PubMed

    Mas-Coma, S; Bargues, M D; Valero, M A

    2014-12-01

    Before the 1990s, human fascioliasis diagnosis focused on individual patients in hospitals or health centres. Case reports were mainly from developed countries and usually concerned isolated human infection in animal endemic areas. From the mid-1990s onwards, due to the progressive description of human endemic areas and human infection reports in developing countries, but also new knowledge on clinical manifestations and pathology, new situations, hitherto neglected, entered the global scenario. Human fascioliasis has proved to be pronouncedly more heterogeneous than previously thought, including different transmission patterns and epidemiological situations. Stool and blood techniques, the main tools for diagnosis in humans, have been improved for both patient and survey diagnosis. Currently available methods for human diagnosis are reviewed, focusing on advantages and weaknesses, sample management, egg differentiation, qualitative and quantitative diagnosis, antibody and antigen detection, post-treatment monitoring and post-control surveillance. Main conclusions refer to the pronounced difficulties of diagnosing fascioliasis in humans given the different infection phases and parasite migration capacities, clinical heterogeneity, immunological complexity, different epidemiological situations and transmission patterns, the lack of a diagnostic technique covering all needs and situations, and the advisability of a combined use of different techniques, at least including a stool technique and a blood technique.

  3. Mathematics Competency for Beginning Chemistry Students Through Dimensional Analysis.

    PubMed

    Pursell, David P; Forlemu, Neville Y; Anagho, Leonard E

    2017-01-01

    Mathematics competency in nursing education and practice may be addressed by an instructional variation of the traditional dimensional analysis technique typically presented in beginning chemistry courses. The authors studied 73 beginning chemistry students using the typical dimensional analysis technique and the variation technique. Student quantitative problem-solving performance was evaluated. Students using the variation technique scored significantly better (18.3 of 20 points, p < .0001) on the final examination quantitative titration problem than those who used the typical technique (10.9 of 20 points). American Chemical Society examination scores and in-house assessment indicate that better performing beginning chemistry students were more likely to use the variation technique rather than the typical technique. The variation technique may be useful as an alternative instructional approach to enhance beginning chemistry students' mathematics competency and problem-solving ability in both education and practice. [J Nurs Educ. 2017;56(1):22-26.]. Copyright 2017, SLACK Incorporated.

  4. Functional magnetic resonance imaging in oncology: state of the art.

    PubMed

    Guimaraes, Marcos Duarte; Schuch, Alice; Hochhegger, Bruno; Gross, Jefferson Luiz; Chojniak, Rubens; Marchiori, Edson

    2014-01-01

    In the investigation of tumors with conventional magnetic resonance imaging, both quantitative characteristics, such as size, edema, necrosis, and presence of metastases, and qualitative characteristics, such as contrast enhancement degree, are taken into consideration. However, changes in cell metabolism and tissue physiology which precede morphological changes cannot be detected by the conventional technique. The development of new magnetic resonance imaging techniques has enabled the functional assessment of the structures in order to obtain information on the different physiological processes of the tumor microenvironment, such as oxygenation levels, cellularity and vascularity. The detailed morphological study in association with the new functional imaging techniques allows for an appropriate approach to cancer patients, including the phases of diagnosis, staging, response evaluation and follow-up, with a positive impact on their quality of life and survival rate.

  5. Non-interferometric phase retrieval using refractive index manipulation.

    PubMed

    Chen, Chyong-Hua; Hsu, Hsin-Feng; Chen, Hou-Ren; Hsieh, Wen-Feng

    2017-04-07

    We present a novel, inexpensive and non-interferometric technique to retrieve phase images by using a liquid crystal phase shifter without any physically moving parts. First, we derive a new equation for the intensity-phase relation with respect to the change of refractive index, which is similar to the transport of intensity equation. The equation indicates that this technique does not need to account for variations in magnification between optical images. For proof of concept, we use a liquid crystal mixture MLC 2144 to manufacture a phase shifter and capture optical images in rapid succession by electrically tuning the applied voltage of the phase shifter. Experimental results demonstrate that this technique is capable of reconstructing high-resolution phase images and of quantitatively recovering the thickness profile of a microlens array.
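
    The transport-of-intensity relation mentioned above can be illustrated with a minimal numerical sketch. This assumes the standard uniform-intensity TIE (k dI/dz = -I0 ∇²φ) rather than the paper's refractive-index variant, and recovers phase from two intensity images with a Fourier-space Poisson inversion; the function name and parameters are illustrative, not from the paper:

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel, I0=None):
    """Recover a phase map from two intensity images separated by 2*dz,
    using the uniform-intensity transport of intensity equation,
        k * dI/dz = -I0 * laplacian(phi),
    inverted as a Poisson problem in Fourier space."""
    k = 2.0 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2.0 * dz)
    if I0 is None:
        I0 = float(np.mean(I_plus + I_minus)) / 2.0
    ny, nx = dIdz.shape
    fy = np.fft.fftfreq(ny, d=pixel)
    fx = np.fft.fftfreq(nx, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    q2 = (2.0 * np.pi) ** 2 * (FX ** 2 + FY ** 2)
    q2[0, 0] = 1.0                       # placeholder for the undefined DC term
    phi_hat = np.fft.fft2(-k * dIdz / I0) / (-q2)
    phi_hat[0, 0] = 0.0                  # phase is recovered up to a constant
    return np.real(np.fft.ifft2(phi_hat))
```

    In practice the DC term must be regularized (here simply zeroed), which is why TIE methods recover phase only up to an additive constant.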

  6. Recent advances in Lorentz microscopy

    DOE PAGES

    Phatak, C.; Petford-Long, A. K.; De Graef, M.

    2016-01-05

    Lorentz transmission electron microscopy (LTEM) has evolved from a qualitative magnetic domain observation technique to a quantitative technique for the determination of the magnetization state of a sample. Here, we describe recent developments in techniques and imaging modes, including the use of spherical aberration correction to improve the spatial resolution of LTEM into the single nanometer range, and novel in situ observation modes. We also review recent advances in the modeling of the wave optical magnetic phase shift as well as in the area of phase reconstruction by means of the Transport of Intensity Equation (TIE) approach, and discuss vector field electron tomography, which has emerged as a powerful tool for the 3D reconstruction of magnetization configurations. Finally, we conclude this review with a brief overview of recent LTEM applications.

  7. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders

    PubMed Central

    Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J.

    2014-01-01

    Abstract. Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively explore a transparent specimen with a nanometric axial sensitivity. In this review, we present recent developments in quantitative phase-digital holographic microscopy (QP-DHM), an important and efficient quantitative phase method to explore cell structure and dynamics. In a second part, the most relevant QPM applications in the field of cell biology are summarized. A particular emphasis is placed on the original biological information that can be derived from the quantitative phase signal. In a third part, recent applications obtained with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry, through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high-risk developmental trajectories for psychiatric disorders, are discussed. PMID:26157976

  8. Review of quantitative phase-digital holographic microscopy: promising novel imaging technique to resolve neuronal network activity and identify cellular biomarkers of psychiatric disorders.

    PubMed

    Marquet, Pierre; Depeursinge, Christian; Magistretti, Pierre J

    2014-10-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful quantitative imaging technique well suited to noninvasively explore a transparent specimen with a nanometric axial sensitivity. In this review, we present recent developments in quantitative phase-digital holographic microscopy (QP-DHM), an important and efficient quantitative phase method to explore cell structure and dynamics. In a second part, the most relevant QPM applications in the field of cell biology are summarized. A particular emphasis is placed on the original biological information that can be derived from the quantitative phase signal. In a third part, recent applications obtained with QP-DHM in the field of cellular neuroscience, namely the possibility to optically resolve neuronal network activity and spine dynamics, are presented. Furthermore, potential applications of QPM related to psychiatry, through the identification of new and original cell biomarkers that, when combined with a range of other biomarkers, could significantly contribute to the determination of high-risk developmental trajectories for psychiatric disorders, are discussed.

  9. Applying Quantitative Genetic Methods to Primate Social Behavior

    PubMed Central

    Brent, Lauren J. N.

    2013-01-01

    Increasingly, behavioral ecologists have applied quantitative genetic methods to investigate the evolution of behaviors in wild animal populations. The promise of quantitative genetics in unmanaged populations opens the door for simultaneous analysis of inheritance, phenotypic plasticity, and patterns of selection on behavioral phenotypes all within the same study. In this article, we describe how quantitative genetic techniques provide studies of the evolution of behavior with information that is unique and valuable. We outline technical obstacles to applying quantitative genetic techniques that are of particular relevance to studies of behavior in primates, especially those living in noncaptive populations (e.g., the need for pedigree information and non-Gaussian phenotypes), and demonstrate how many of these barriers are now surmountable. We illustrate this by applying recent quantitative genetic methods to spatial proximity data, a simple and widely collected primate social behavior, from adult rhesus macaques on Cayo Santiago. Our analysis shows that proximity measures are consistent across repeated measurements on individuals (repeatable) and that kin have similar mean measurements (heritable). Quantitative genetics may hold lessons of considerable importance for studies of primate behavior, even those without a specific genetic focus. PMID:24659839
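
    As a minimal illustration of the "repeatable" criterion mentioned above, the one-way ANOVA estimator of repeatability (an intraclass correlation from repeated measurements per individual) can be sketched as follows; this is a generic textbook estimator, not the specific animal-model analysis used in the study:

```python
import numpy as np

def repeatability(values, individual):
    """One-way ANOVA estimate of repeatability R = V_among / (V_among + V_within):
    the fraction of phenotypic variance due to consistent differences
    among individuals, given repeated measurements per individual."""
    values = np.asarray(values, float)
    individual = np.asarray(individual)
    ids = np.unique(individual)
    k = len(ids)
    n = np.array([np.sum(individual == i) for i in ids], float)
    means = np.array([values[individual == i].mean() for i in ids])
    grand = values.mean()
    ss_among = np.sum(n * (means - grand) ** 2)
    ss_within = sum(np.sum((values[individual == i] - m) ** 2)
                    for i, m in zip(ids, means))
    ms_among = ss_among / (k - 1)
    ms_within = ss_within / (values.size - k)
    # Effective group size for (possibly) unbalanced designs
    n0 = (values.size - np.sum(n ** 2) / values.size) / (k - 1)
    v_among = (ms_among - ms_within) / n0
    return v_among / (v_among + ms_within)
```

    With equal between- and within-individual variances this estimator converges to R = 0.5.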

  10. Mammographic features and subsequent risk of breast cancer: a comparison of qualitative and quantitative evaluations in the Guernsey prospective studies.

    PubMed

    Torres-Mejía, Gabriela; De Stavola, Bianca; Allen, Diane S; Pérez-Gavilán, Juan J; Ferreira, Jorge M; Fentiman, Ian S; Dos Santos Silva, Isabel

    2005-05-01

    Mammographic features are known to be associated with breast cancer but the magnitude of the effect differs markedly from study to study. Methods to assess mammographic features range from subjective qualitative classifications to computer-automated quantitative measures. We used data from the UK Guernsey prospective studies to examine the relative value of these methods in predicting breast cancer risk. In all, 3,211 women aged ≥35 years who had a mammogram taken in 1986 to 1989 were followed up to the end of October 2003, with 111 developing breast cancer during this period. Mammograms were classified using the subjective qualitative Wolfe classification and several quantitative mammographic features measured using computer-based techniques. Breast cancer risk was positively associated with high-grade Wolfe classification, percent breast density and area of dense tissue, and negatively associated with area of lucent tissue, fractal dimension, and lacunarity. Inclusion of the quantitative measures in the same model identified area of dense tissue and lacunarity as the best predictors of breast cancer, with risk increasing by 59% [95% confidence interval (95% CI), 29-94%] per SD increase in total area of dense tissue but declining by 39% (95% CI, 53-22%) per SD increase in lacunarity, after adjusting for each other and for other confounders. Comparison of models that included both the qualitative Wolfe classification and these two quantitative measures to models that included either the qualitative or the two quantitative variables showed that they all made significant contributions to prediction of breast cancer risk. These findings indicate that breast cancer risk is affected not only by the amount of mammographic density but also by the degree of heterogeneity of the parenchymal pattern and, presumably, by other features captured by the Wolfe classification.
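
    The per-SD risk figures quoted above follow from exponentiating a log-relative-risk regression coefficient scaled by one standard deviation of the covariate. A small sketch (the function name is illustrative, and the coefficient used in the usage note is back-computed from the reported 59%, not taken from the paper):

```python
import math

def percent_change_per_sd(beta, sd=1.0):
    """Percent change in risk per one-SD increase of a covariate whose
    log-relative-risk coefficient (per unit of the covariate) is beta."""
    return (math.exp(beta * sd) - 1.0) * 100.0
```

    For example, a per-SD coefficient of ln(1.59) ≈ 0.464 reproduces the ~59% increase per SD; a negative coefficient yields a percent decline.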

  11. Novel CE-MS technique for detection of high explosives using perfluorooctanoic acid as a MEKC and mass spectrometric complexation reagent.

    PubMed

    Brensinger, Karen; Rollman, Christopher; Copper, Christine; Genzman, Ashton; Rine, Jacqueline; Lurie, Ira; Moini, Mehdi

    2016-01-01

    To address the need for the forensic analysis of high explosives, a novel capillary electrophoresis mass spectrometry (CE-MS) technique has been developed for high resolution, sensitivity, and mass accuracy detection of these compounds. The technique uses perfluorooctanoic acid (PFOA) as both a micellar electrokinetic chromatography (MEKC) reagent for separation of neutral explosives and as the complexation reagent for mass spectrometric detection of PFOA-explosive complexes in the negative ion mode. High explosives that formed complexes with PFOA included RDX, HMX, tetryl, and PETN. Some nitroaromatics were detected as molecular ions. Detection limits in the high parts per billion range and linear calibration responses over two orders of magnitude were obtained. For proof of concept, the technique was applied to the quantitative analysis of high explosives in sand samples. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  12. A spot pattern test chart technique for measurement of geometric aberrations caused by an intervening medium—a novel method

    NASA Astrophysics Data System (ADS)

    Ganesan, A. R.; Arulmozhivarman, P.; Jesson, M.

    2005-12-01

    Accurate surface metrology and transmission characteristics measurements have become vital to certify manufacturing excellence in the field of glass visors, windshields, menu boards and the transportation industry. We report a simple, cost-effective and novel technique for the measurement of geometric aberrations in transparent materials such as glass sheets, Perspex, etc. The technique makes use of an array of spots, which we call the spot pattern test chart technique, placed at the diffraction-limited imaging position with a large field of view. Performance features include variable angular dynamic range and angular sensitivity. Aberrations caused by transparent sheets introduced as an intervening medium in the line of sight are estimated in real time using the Zernike reconstruction method. A quantitative comparative analysis between a Shack-Hartmann wavefront sensor and the proposed new method is presented and the results are discussed.

  13. Novel method for quantitative ANA measurement using near-infrared imaging.

    PubMed

    Peterson, Lisa K; Wells, Daniel; Shaw, Laura; Velez, Maria-Gabriela; Harbeck, Ronald; Dragone, Leonard L

    2009-09-30

    Antinuclear antibodies (ANA) have been detected in patients with systemic rheumatic diseases and are used in the screening and/or diagnosis of autoimmunity in patients as well as mouse models of systemic autoimmunity. Indirect immunofluorescence (IIF) on HEp-2 cells is the gold standard for ANA screening. However, its usefulness is limited in diagnosis, prognosis and monitoring of disease activity due to the lack of standardization in performing the technique, subjectivity in interpreting the results and the fact that it is only semi-quantitative. Various immunological techniques have been developed in an attempt to improve upon the method to quantify ANA, including enzyme-linked immunosorbent assays (ELISAs), line immunoassays (LIAs), multiplexed bead immunoassays and IIF on substrates other than HEp-2 cells. Yet IIF on HEp-2 cells remains the most common screening method for ANA. In this study, we describe a simple quantitative method to detect ANA which combines IIF on HEp-2 coated slides with analysis using a near-infrared imaging (NII) system. Using NII to determine ANA titer, 86.5% (32 of 37) of the titers for human patient samples were within 2 dilutions of those determined by IIF, which is the acceptable range for proficiency testing. Combining an initial screening for nuclear staining using microscopy with titration by NII resulted in 97.3% (36 of 37) of the titers detected to be within two dilutions of those determined by IIF. The NII method for quantitative ANA measurements using serum from both patients and mice with autoimmunity provides a fast, relatively simple, objective, sensitive and reproducible assay, which could easily be standardized for comparison between laboratories.
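
    The "within two dilutions" agreement criterion used above can be made concrete: two endpoint titers agree within n serial dilution steps when their ratio is at most factor^n. A hypothetical helper (the function name and interface are illustrative, not from the study):

```python
import math

def within_n_dilutions(titer_a, titer_b, n=2, factor=2.0):
    """True if two endpoint titers (reciprocal dilutions, e.g. 160 for 1:160)
    agree within n serial dilution steps of the given dilution factor."""
    steps = abs(math.log(titer_a / titer_b, factor))
    return steps <= n + 1e-9  # tolerance for floating-point log
```

    For example, 1:160 versus 1:640 is two 2-fold steps apart and counts as agreement, while 1:40 versus 1:640 (four steps) does not.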

  14. Assessment of metabolic bone diseases by quantitative computed tomography

    NASA Technical Reports Server (NTRS)

    Richardson, M. L.; Genant, H. K.; Cann, C. E.; Ettinger, B.; Gordan, G. S.; Kolb, F. O.; Reiser, U. J.

    1985-01-01

    Advances in the radiologic sciences have permitted the development of numerous noninvasive techniques for measuring the mineral content of bone, with varying degrees of precision, accuracy, and sensitivity. The techniques of standard radiography, radiogrammetry, photodensitometry, Compton scattering, neutron activation analysis, single and dual photon absorptiometry, and quantitative computed tomography (QCT) are described and reviewed in depth. Results from previous cross-sectional and longitudinal QCT investigations are given. The authors then describe a current investigation in which they studied 269 subjects, including 173 normal women, 34 patients with hyperparathyroidism, 24 patients with steroid-induced osteoporosis, and 38 men with idiopathic osteoporosis. Spinal quantitative computed tomography, radiogrammetry, and single photon absorptiometry were performed, and a spinal fracture index was calculated on all patients. The authors found a disproportionate loss of spinal trabecular mineral compared to appendicular mineral in the men with idiopathic osteoporosis and the patients with steroid-induced osteoporosis. They observed roughly equivalent mineral loss in both the appendicular and axial regions in the hyperparathyroid patients. The appendicular cortical measurements correlated moderately well with each other but less well with spinal trabecular QCT. The spinal fracture index correlated well with QCT and less well with the appendicular measurements. Knowledge of appendicular cortical mineral status is important in its own right but is not a valid predictor of axial trabecular mineral status, which may be disproportionately decreased in certain diseases. Quantitative CT provides a reliable means of assessing the latter region of the skeleton, correlates well with the spinal fracture index (a semiquantitative measurement of end-organ failure), and offers the clinician a sensitive means of following the effects of therapy.

  15. Rate Constants and Mechanisms of Protein–Ligand Binding

    PubMed Central

    Pang, Xiaodong; Zhou, Huan-Xiang

    2017-01-01

    Whereas protein–ligand binding affinities have long-established prominence, binding rate constants and binding mechanisms have gained increasing attention in recent years. Both new computational methods and new experimental techniques have been developed to characterize the latter properties. It is now realized that binding mechanisms, like binding rate constants, can and should be quantitatively determined. In this review, we summarize studies and synthesize ideas on several topics in the hope of providing a coherent picture of and physical insight into binding kinetics. The topics include microscopic formulation of the kinetic problem and its reduction to simple rate equations; computation of binding rate constants; quantitative determination of binding mechanisms; and elucidation of physical factors that control binding rate constants and mechanisms. PMID:28375732
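
    As a minimal example of reducing binding kinetics to a simple rate equation, the pseudo-first-order solution for A + L ⇌ AL with ligand in excess can be sketched as follows (a standard textbook result, not a method taken from the review; names are illustrative):

```python
import math

def bound_fraction(t, k_on, k_off, L):
    """Pseudo-first-order solution for A + L <-> AL with ligand in excess:
    the bound fraction relaxes as f(t) = f_eq * (1 - exp(-k_obs * t)),
    with observed rate k_obs = k_on * [L] + k_off
    and equilibrium fraction f_eq = [L] / ([L] + Kd), Kd = k_off / k_on."""
    k_obs = k_on * L + k_off
    kd = k_off / k_on
    f_eq = L / (L + kd)
    return f_eq * (1.0 - math.exp(-k_obs * t))
```

    Fitting measured k_obs against [L] is the classic way such experiments separate k_on (slope) from k_off (intercept).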

  16. Design and analysis issues in quantitative proteomics studies.

    PubMed

    Karp, Natasha A; Lilley, Kathryn S

    2007-09-01

    Quantitative proteomics is the comparison of distinct proteomes, enabling the identification of protein species that exhibit changes in expression or post-translational state in response to a given stimulus. Many different quantitative techniques are being utilized and generate large datasets. Independent of the technique used, these large datasets need robust data analysis to ensure valid conclusions are drawn from such studies. Approaches to address the problems that arise with large datasets are discussed to give insight into the types of statistical analyses appropriate for the various experimental strategies that can be employed in quantitative proteomic studies. This review also highlights the importance of employing a robust experimental design and discusses various issues surrounding the design of experiments. The concepts and examples discussed here show how robust design and analysis lead to confident results that ensure quantitative proteomics delivers on its promise.

  17. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies.

    PubMed

    Shahabpoor, Erfan; Pavic, Aleksandar

    2017-09-12

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE were carried out to analyse the 'accuracy' and 'practicality' of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels with methods based on direct measurement of the ground reactions showing highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show highest practicality of the three classes of methods reviewed. 
Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the ground reactions estimation methods to include pathological gaits and natural variability of gait in real-life physical environment.
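
    The kinematics-based class of methods reviewed above ultimately rests on Newton's second law applied to the whole-body centre of mass. A minimal sketch of that estimate (the function name and interface are illustrative; segmental summation, filtering, and per-foot decomposition used in real systems are omitted):

```python
import numpy as np

def ground_reaction_from_com(mass, com_accel):
    """Total ground reaction force from whole-body centre-of-mass
    acceleration via Newton's second law: GRF = m * (a_com + g).
    com_accel: (N, 3) array of COM acceleration in m/s^2, z axis up."""
    g = np.array([0.0, 0.0, 9.81])
    return mass * (np.asarray(com_accel, float) + g)
```

    For quiet standing (zero COM acceleration) this reduces to the expected vertical reaction of body weight, m*g.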

  18. Measurement of Walking Ground Reactions in Real-Life Environments: A Systematic Review of Techniques and Technologies

    PubMed Central

    Shahabpoor, Erfan; Pavic, Aleksandar

    2017-01-01

    Monitoring natural human gait in real-life environments is essential in many applications, including quantification of disease progression, monitoring the effects of treatment, and monitoring alteration of performance biomarkers in professional sports. Nevertheless, developing reliable and practical techniques and technologies necessary for continuous real-life monitoring of gait is still an open challenge. A systematic review of English-language articles from scientific databases including Scopus, ScienceDirect, Pubmed, IEEE Xplore, EBSCO and MEDLINE were carried out to analyse the ‘accuracy’ and ‘practicality’ of the current techniques and technologies for quantitative measurement of the tri-axial walking ground reactions outside the laboratory environment, and to highlight their strengths and shortcomings. In total, 679 relevant abstracts were identified, 54 full-text papers were included in the paper and the quantitative results of 17 papers were used for meta-analysis and comparison. Three classes of methods were reviewed: (1) methods based on measured kinematic data; (2) methods based on measured plantar pressure; and (3) methods based on direct measurement of ground reactions. It was found that all three classes of methods have competitive accuracy levels with methods based on direct measurement of the ground reactions showing highest accuracy while being least practical for long-term real-life measurement. On the other hand, methods that estimate ground reactions using measured body kinematics show highest practicality of the three classes of methods reviewed. 
Among the most prominent technical and technological challenges are: (1) reducing the size and price of tri-axial load-cells; (2) improving the accuracy of orientation measurement using IMUs; (3) minimizing the number and optimizing the location of required IMUs for kinematic measurement; (4) increasing the durability of pressure insole sensors, and (5) enhancing the robustness and versatility of the ground reactions estimation methods to include pathological gaits and natural variability of gait in real-life physical environment. PMID:28895909

  19. In situ spectroradiometric quantification of ERTS data. [Prescott and Phoenix, Arizona

    NASA Technical Reports Server (NTRS)

    Yost, E. F. (Principal Investigator)

    1975-01-01

    The author has identified the following significant results. Analyses of ERTS-1 photographic data were made to quantitatively relate ground reflectance measurements to photometric characteristics of the images. Digital image processing of photographic data resulted in a nomograph to correct for atmospheric effects over arid terrain. Optimum processing techniques to derive maximum geologic information from desert areas were established. Additive color techniques providing quantitative measurements of surface water between different orbits were developed and were adopted as the standard ERTS flood-mapping techniques.

  20. Bioimaging of metals in brain tissue by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) and metallomics.

    PubMed

    Becker, J Sabine; Matusch, Andreas; Palm, Christoph; Salber, Dagmar; Morton, Kathryn A; Becker, J Susanne

    2010-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been developed and established as an emerging technique in the generation of quantitative images of metal distributions in thin tissue sections of brain samples (such as human, rat and mouse brain), with applications in research related to neurodegenerative disorders. A new analytical protocol is described which includes sample preparation by cryo-cutting of thin tissue sections and matrix-matched laboratory standards, mass spectrometric measurements, data acquisition, and quantitative analysis. Specific examples of the bioimaging of metal distributions in normal rodent brains are provided. Differences from normal were assessed in Parkinson's disease and stroke brain models. Furthermore, changes during normal aging were studied. Powerful analytical techniques are also required for the determination and characterization of metal-containing proteins within a large pool of proteins, e.g., after denaturing or non-denaturing electrophoretic separation of proteins in one-dimensional and two-dimensional gels. LA-ICP-MS can be employed to detect metalloproteins in protein bands or spots separated after gel electrophoresis. MALDI-MS can then be used to identify specific metal-containing proteins in these bands or spots. The combination of these techniques is described in the second section.
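
    Quantification against matrix-matched laboratory standards, as described above, amounts to a linear calibration followed by inversion. A generic least-squares sketch (illustrative, not the authors' specific protocol):

```python
import numpy as np

def calibrate_and_quantify(std_signals, std_concs, sample_signals):
    """Fit a linear calibration signal = a * conc + b to matrix-matched
    standards by least squares, then invert it to convert measured
    sample signals into concentrations."""
    a, b = np.polyfit(std_concs, std_signals, 1)
    return (np.asarray(sample_signals, float) - b) / a
```

    In imaging mode the same inversion is applied pixel by pixel to turn ion-intensity maps into concentration maps.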

  1. Pre-clinical characterization of tissue engineering constructs for bone and cartilage regeneration

    PubMed Central

    Trachtenberg, Jordan E.; Vo, Tiffany N.; Mikos, Antonios G.

    2014-01-01

    Pre-clinical animal models play a crucial role in the translation of biomedical technologies from the bench top to the bedside. However, there is a need for improved techniques to evaluate implanted biomaterials within the host, including consideration of the care and ethics associated with animal studies, as well as the evaluation of host tissue repair in a clinically relevant manner. This review discusses non-invasive, quantitative, and real-time techniques for evaluating host-material interactions, quality and rate of neotissue formation, and functional outcomes of implanted biomaterials for bone and cartilage tissue engineering. Specifically, a comparison will be presented for pre-clinical animal models, histological scoring systems, and non-invasive imaging modalities. Additionally, novel technologies to track delivered cells and growth factors will be discussed, including methods to directly correlate their release with tissue growth. PMID:25319726

  2. Pre-clinical characterization of tissue engineering constructs for bone and cartilage regeneration.

    PubMed

    Trachtenberg, Jordan E; Vo, Tiffany N; Mikos, Antonios G

    2015-03-01

    Pre-clinical animal models play a crucial role in the translation of biomedical technologies from the bench top to the bedside. However, there is a need for improved techniques to evaluate implanted biomaterials within the host, including consideration of the care and ethics associated with animal studies, as well as the evaluation of host tissue repair in a clinically relevant manner. This review discusses non-invasive, quantitative, and real-time techniques for evaluating host-material interactions, quality and rate of neotissue formation, and functional outcomes of implanted biomaterials for bone and cartilage tissue engineering. Specifically, a comparison will be presented for pre-clinical animal models, histological scoring systems, and non-invasive imaging modalities. Additionally, novel technologies to track delivered cells and growth factors will be discussed, including methods to directly correlate their release with tissue growth.

  3. Current and evolving echocardiographic techniques for the quantitative evaluation of cardiac mechanics: ASE/EAE consensus statement on methodology and indications endorsed by the Japanese Society of Echocardiography.

    PubMed

    Mor-Avi, Victor; Lang, Roberto M; Badano, Luigi P; Belohlavek, Marek; Cardim, Nuno Miguel; Derumeaux, Genevieve; Galderisi, Maurizio; Marwick, Thomas; Nagueh, Sherif F; Sengupta, Partho P; Sicari, Rosa; Smiseth, Otto A; Smulevitz, Beverly; Takeuchi, Masaaki; Thomas, James D; Vannan, Mani; Voigt, Jens-Uwe; Zamorano, Jose Luis

    2011-03-01

    Echocardiographic imaging is ideally suited for the evaluation of cardiac mechanics because of its intrinsically dynamic nature. Because for decades, echocardiography has been the only imaging modality that allows dynamic imaging of the heart, it is only natural that new, increasingly automated techniques for sophisticated analysis of cardiac mechanics have been driven by researchers and manufacturers of ultrasound imaging equipment. Several such techniques have emerged over the past decades to address the issue of reader's experience and inter-measurement variability in interpretation. Some were widely embraced by echocardiographers around the world and became part of the clinical routine, whereas others remained limited to research and exploration of new clinical applications. Two such techniques have dominated the research arena of echocardiography: (1) Doppler-based tissue velocity measurements, frequently referred to as tissue Doppler or myocardial Doppler, and (2) speckle tracking on the basis of displacement measurements. Both types of measurements lend themselves to the derivation of multiple parameters of myocardial function. The goal of this document is to focus on the currently available techniques that allow quantitative assessment of myocardial function via image-based analysis of local myocardial dynamics, including Doppler tissue imaging and speckle-tracking echocardiography, as well as integrated backscatter analysis. This document describes the current and potential clinical applications of these techniques and their strengths and weaknesses, briefly surveys a selection of the relevant published literature while highlighting normal and abnormal findings in the context of different cardiovascular pathologies, and summarizes the unresolved issues, future research priorities, and recommended indications for clinical use.

  4. Current and evolving echocardiographic techniques for the quantitative evaluation of cardiac mechanics: ASE/EAE consensus statement on methodology and indications endorsed by the Japanese Society of Echocardiography.

    PubMed

    Mor-Avi, Victor; Lang, Roberto M; Badano, Luigi P; Belohlavek, Marek; Cardim, Nuno Miguel; Derumeaux, Geneviève; Galderisi, Maurizio; Marwick, Thomas; Nagueh, Sherif F; Sengupta, Partho P; Sicari, Rosa; Smiseth, Otto A; Smulevitz, Beverly; Takeuchi, Masaaki; Thomas, James D; Vannan, Mani; Voigt, Jens-Uwe; Zamorano, José Luis

    2011-03-01

    Echocardiographic imaging is ideally suited for the evaluation of cardiac mechanics because of its intrinsically dynamic nature. Because for decades, echocardiography has been the only imaging modality that allows dynamic imaging of the heart, it is only natural that new, increasingly automated techniques for sophisticated analysis of cardiac mechanics have been driven by researchers and manufacturers of ultrasound imaging equipment. Several such techniques have emerged over the past decades to address the issue of reader's experience and inter-measurement variability in interpretation. Some were widely embraced by echocardiographers around the world and became part of the clinical routine, whereas others remained limited to research and exploration of new clinical applications. Two such techniques have dominated the research arena of echocardiography: (1) Doppler-based tissue velocity measurements, frequently referred to as tissue Doppler or myocardial Doppler, and (2) speckle tracking on the basis of displacement measurements. Both types of measurements lend themselves to the derivation of multiple parameters of myocardial function. The goal of this document is to focus on the currently available techniques that allow quantitative assessment of myocardial function via image-based analysis of local myocardial dynamics, including Doppler tissue imaging and speckle-tracking echocardiography, as well as integrated backscatter analysis. This document describes the current and potential clinical applications of these techniques and their strengths and weaknesses, briefly surveys a selection of the relevant published literature while highlighting normal and abnormal findings in the context of different cardiovascular pathologies, and summarizes the unresolved issues, future research priorities, and recommended indications for clinical use.

  5. Quantitative label-free multimodality nonlinear optical imaging for in situ differentiation of cancerous lesions

    NASA Astrophysics Data System (ADS)

    Xu, Xiaoyun; Li, Xiaoyan; Cheng, Jie; Liu, Zhengfan; Thrall, Michael J.; Wang, Xi; Wang, Zhiyong; Wong, Stephen T. C.

    2013-03-01

    The development of real-time, label-free imaging techniques has recently attracted research interest for in situ differentiation of cancerous lesions from normal tissues. Molecule-specific intrinsic contrast can arise from label-free imaging techniques such as Coherent Anti-Stokes Raman Scattering (CARS), Two-Photon Excited AutoFluorescence (TPEAF), and Second Harmonic Generation (SHG), which, in combination, would hold the promise of a powerful label-free tool for cancer diagnosis. Among cancer-related deaths, lung carcinoma is the leading cause for both sexes. Although early treatment can increase the survival rate dramatically, lesion detection and precise diagnosis at an early stage are uncommon due to the asymptomatic nature of the disease and limitations of current diagnostic techniques that make screening difficult. We investigated the potential of using multimodality nonlinear optical microscopy that incorporates CARS, TPEAF, and SHG techniques for differentiation of lung cancer from normal tissue. Cancerous and non-cancerous lung tissue samples from patients were imaged using CARS, TPEAF, and SHG techniques for comparison. These images showed good pathology correlation with hematoxylin and eosin (H and E) stained sections from the same tissue samples. Ongoing work includes imaging at various penetration depths to show three-dimensional morphologies of tumor cell nuclei using CARS, elastin using TPEAF, and collagen using SHG and developing classification algorithms for quantitative feature extraction to enable lung cancer diagnosis. Our results indicate that via real-time morphology analyses, a multimodality nonlinear optical imaging platform potentially offers a powerful minimally invasive way to differentiate cancer lesions from surrounding non-tumor tissues in vivo for clinical applications.

  6. Pitching Flexible Propulsors: Experimental Assessment of Performance Characteristics

    DTIC Science & Technology

    2014-05-09

    velocities pointing in this direction contribute to an overall momentum deficit in the wake , which may be quantitatively related to the drag force on...and explained the source of some of the additional vorticity in the wake of the foil that may have otherwise been ignored or treated as noise in the...is conducted through reduction of the measured force and torque data and multiple wake flow analysis techniques, including particle image

  7. Emerging non-invasive Raman methods in process control and forensic applications.

    PubMed

    Macleod, Neil A; Matousek, Pavel

    2008-10-01

    This article reviews emerging Raman techniques (Spatially Offset and Transmission Raman Spectroscopy) for non-invasive, sub-surface probing in process control and forensic applications. New capabilities offered by these methods are discussed and several application examples are given including the non-invasive detection of counterfeit drugs through blister packs and opaque plastic bottles and the rapid quantitative analysis of the bulk content of pharmaceutical tablets and capsules without sub-sampling.

  8. Optical holographic structural analysis of Kevlar rocket motor cases

    NASA Astrophysics Data System (ADS)

    Harris, W. J.

    1981-05-01

    The methodology of applying optical holography to evaluation of subscale Kevlar 49 composite pressure vessels is explored. The results and advantages of the holographic technique are discussed. The cases utilized were of similar design, but each had specific design features, the effects of which are reviewed. Burst testing results are presented in conjunction with the holographic fringe patterns obtained during progressive pressurization. Examples of quantitative data extracted by analysis of fringe fields are included.

  9. Fluoride Ion Regeneration of Cyclosarin (Gf) from Minipig Tissue and Fluids Following Whole Body GF Vapor Exposure

    DTIC Science & Technology

    2006-11-01

    Quantitation of organophosphorus nerve agent metabolites in human urine using isotope dilution gas chromatography- tandem mass spectrometry. J. Anal...Recent developments to improve nerve agent biomarker techniques include methods for measuring fluoride regenerated Sarin (GB) in blood and tissue...Our efforts extend the fluoride ion regeneration method to be able to determine cyclosarin (GF) in red blood cells, plasma, and tissue of minipig

  10. MRI technique for the snapshot imaging of quantitative velocity maps using RARE.

    PubMed

    Shiko, G; Sederman, A J; Gladden, L F

    2012-03-01

    A quantitative PGSE-RARE pulse sequence was developed and successfully applied to the in situ dissolution of two pharmaceutical formulations dissolving over a range of timescales. The new technique was chosen over other existing fast velocity imaging techniques because it is T2-weighted, not T2*-weighted, and is, therefore, robust for imaging time-varying interfaces and flow in magnetically heterogeneous systems. The complex signal was preserved intact by separating odd and even echoes to obtain two phase maps, which are then averaged in post-processing. Initially, the validity of the technique was shown when imaging laminar flow in a pipe. Subsequently, the dissolution of two drugs was followed in situ, where the technique enables the imaging and quantification of changes in the form of the tablet and the flow field surrounding it at high spatial and temporal resolution. First, the complete 3D velocity field around an eroding salicylic acid tablet was acquired at a resolution of 98×49 μm², within 20 min, and monitored over ∼13 h. The tablet was observed to experience a heterogeneous flow field and, hence, a heterogeneous shear field, which resulted in the non-symmetric erosion of the tablet. Second, the dissolution of a fast dissolving immediate release tablet was followed using one-shot 2D velocity images acquired every 5.2 s at a resolution of 390×390 μm². The quantitative nature of the technique and fast acquisition times provided invaluable information on the dissolution behaviour of this tablet, which had not been attainable previously with conventional quantitative MRI techniques. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Comparison analysis between filtered back projection and algebraic reconstruction technique on microwave imaging

    NASA Astrophysics Data System (ADS)

    Ramadhan, Rifqi; Prabowo, Rian Gilang; Aprilliyani, Ria; Basari

    2018-02-01

    Victims of acute cancer and tumors are growing in number each year, and cancer has become one of the leading causes of human death in the world. Cancer or tumor tissue cells are cells that grow abnormally and turn to take over and damage the surrounding tissues. In their early stages, cancers or tumors do not have definite symptoms and can even attack tissues deep inside the body, where they are not identifiable by visual human observation. Therefore, an early detection system that is cheap, quick, simple, and portable is essentially required to anticipate the further development of a cancer or tumor. Among all of the modalities, microwave imaging is considered a cheaper, simpler, and more portable method. There are at least two simple image reconstruction algorithms, i.e. Filtered Back Projection (FBP) and Algebraic Reconstruction Technique (ART), which have been adopted in some common modalities. In this paper, both algorithms are compared by reconstructing the image of an artificial tissue model (i.e. phantom) that has two different dielectric distributions. We addressed two performance comparisons, namely quantitative and qualitative analysis. Qualitative analysis includes the smoothness of the image and the success in distinguishing dielectric differences by observing the image with human eyesight. Quantitative analysis includes histogram, Structural Similarity Index (SSIM), Mean Squared Error (MSE), and Peak Signal-to-Noise Ratio (PSNR) calculations. As a result, the quantitative parameters of FBP might show better values than those of ART. However, ART is likely more capable of distinguishing two different dielectric values than FBP, owing to its higher contrast and wider grayscale distribution.
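    The MSE and PSNR figures used in this record's quantitative comparison have standard definitions; the sketch below applies them to a hypothetical phantom and two synthetic reconstructions (stand-ins for FBP and ART output, not the study's data).

```python
import numpy as np

def mse(ref, img):
    """Mean squared error between a reference and a reconstructed image."""
    ref = ref.astype(float)
    img = img.astype(float)
    return np.mean((ref - img) ** 2)

def psnr(ref, img, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the reference."""
    err = mse(ref, img)
    if err == 0:
        return float("inf")
    return 10.0 * np.log10(max_val ** 2 / err)

# Hypothetical phantom and two noisy reconstructions of it
rng = np.random.default_rng(0)
phantom = np.zeros((64, 64))
phantom[16:48, 16:48] = 200.0
recon_a = phantom + rng.normal(0, 5, phantom.shape)   # low-noise reconstruction
recon_b = phantom + rng.normal(0, 20, phantom.shape)  # noisier reconstruction

print(psnr(phantom, recon_a) > psnr(phantom, recon_b))  # True
```

    A perfect reconstruction gives zero MSE and infinite PSNR, which is why PSNR comparisons are usually reported alongside perceptual measures such as SSIM.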

  12. Occurrence of invertebrates at 38 stream sites in the Mississippi Embayment study unit, 1996-99

    USGS Publications Warehouse

    Caskey, Brian J.; Justus, B.G.; Zappia, Humbert

    2002-01-01

    A total of 88 invertebrate species and 178 genera representing 59 families, 8 orders, 6 classes, and 3 phyla was identified at 38 stream sites in the Mississippi Embayment Study Unit from 1996 through 1999 as part of the National Water-Quality Assessment Program. Sites were selected based on land use within the drainage basins and the availability of long-term streamflow data. Invertebrates were sampled as part of an overall sampling design to provide information related to the status and trends in water quality in the Mississippi Embayment Study Unit, which includes parts of Arkansas, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. Invertebrate sampling and processing were conducted using nationally standardized techniques developed for the National Water-Quality Assessment Program. These techniques included both a semi-quantitative method, which targeted habitats where invertebrate diversity is expected to be highest, and a qualitative multihabitat method, which samples all available habitat types possible within a sampling reach. All invertebrate samples were shipped to the USGS National Water-Quality Laboratory (NWQL) where they were processed. Of the 365 taxa identified, 156 were identified with the semi-quantitative method that involved sampling a known quantity of what was expected to be the richest habitat, woody debris. The qualitative method, which involved sampling all available habitats, identified 345 taxa. The number of organisms identified in the semi-quantitative samples ranged from 74 to 3,295, whereas the number of taxa identified ranged from 9 to 54. The number of organisms identified in the qualitative samples ranged from 42 to 29,634, whereas the number of taxa ranged from 18 to 81. Of all the organisms identified, chironomid taxa were the most frequently identified, and plecopteran taxa were among the least frequently identified.

  13. Quantitative Metrics in Clinical Radiology Reporting: A Snapshot Perspective from a Single Mixed Academic-Community Practice

    PubMed Central

    Abramson, Richard G.; Su, Pei-Fang; Shyr, Yu

    2012-01-01

    Quantitative imaging has emerged as a leading priority on the imaging research agenda, yet clinical radiology has traditionally maintained a skeptical attitude toward numerical measurement in diagnostic interpretation. To gauge the extent to which quantitative reporting has been incorporated into routine clinical radiology practice, and to offer preliminary baseline data against which the evolution of quantitative imaging can be measured, we obtained all clinical computed tomography (CT) and magnetic resonance imaging (MRI) reports from two randomly selected weekdays in 2011 at a single mixed academic-community practice and evaluated those reports for the presence of quantitative descriptors. We found that 44% of all reports contained at least one “quantitative metric” (QM), defined as any numerical descriptor of a physical property other than quantity, but only 2% of reports contained an “advanced quantitative metric” (AQM), defined as a numerical parameter reporting on lesion function or composition, excluding simple size and distance measurements. Possible reasons for the slow translation of AQMs into routine clinical radiology reporting include perceptions that the primary clinical question may be qualitative in nature or that a qualitative answer may be sufficient; concern that quantitative approaches may obscure important qualitative information, may not be adequately validated, or may not allow sufficient expression of uncertainty; the feeling that “gestalt” interpretation may be superior to quantitative paradigms; and practical workflow limitations. We suggest that quantitative imaging techniques will evolve primarily as dedicated instruments for answering specific clinical questions requiring precise and standardized interpretation. Validation in real-world settings, ease of use, and reimbursement economics will all play a role in determining the rate of translation of AQMs into broad practice. PMID:22795791
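    As a toy illustration of how such report-level tallies can be automated, the sketch below counts reports containing a number-plus-unit "quantitative metric"; the pattern and report snippets are hypothetical and much cruder than the study's actual criteria.

```python
import re

# Hypothetical report snippets; here a "quantitative metric" is any
# number followed by a common radiology unit (a deliberate simplification).
reports = [
    "Lesion measures 2.3 cm in the right lobe.",
    "No acute abnormality.",
    "ADC value of 1.1 x 10^-3 mm^2/s within the mass.",
]

qm_pattern = re.compile(r"\d+(\.\d+)?\s*(cm|mm|mL|HU)")
with_qm = sum(1 for r in reports if qm_pattern.search(r))
print(f"{with_qm}/{len(reports)} reports contain a quantitative metric")
```

    A real classifier would also have to separate simple size measurements (QMs) from functional or compositional parameters (AQMs), which is the distinction the study draws.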

  14. Recent developments and applications of saturation transfer difference nuclear magnetic resonance (STD NMR) spectroscopy.

    PubMed

    Wagstaff, Jane L; Taylor, Samantha L; Howard, Mark J

    2013-04-05

    This review aims to illustrate that STD NMR is not simply a method for drug screening and discovery, but has qualitative and quantitative applications that can answer fundamental and applied biological and biomedical questions involving molecular interactions between ligands and proteins. We begin with a basic introduction to the technique of STD NMR and report on recent advances and biological applications of STD including studies to follow the interactions of non-steroidal anti-inflammatories, minimum binding requirements for virus infection and understanding inhibition of amyloid fibre formation. We expand on this introduction by reporting recent STD NMR studies of live-cell receptor systems, new methodologies using scanning STD, magic-angle spinning STD and approaches to use STD NMR in a quantitative fashion for dissociation constants and group epitope mapping (GEM) determination. We finish by outlining new approaches that have potential to influence future applications of the technique; NMR isotope-editing, heteronuclear multidimensional STD and (19)F STD methods that are becoming more amenable due to the latest NMR equipment technologies.
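    Quantitative STD NMR extracts dissociation constants by fitting a one-site binding curve to STD amplification factors measured across a ligand titration. A minimal sketch follows, with invented titration values and a coarse grid search standing in for proper nonlinear least-squares fitting:

```python
import numpy as np

def std_af(L, std_max, kd):
    """One-site binding isotherm: STD amplification factor vs ligand concentration."""
    return std_max * L / (kd + L)

# Hypothetical titration data (ligand concentration in mM, STD-AF in a.u.)
L = np.array([0.05, 0.1, 0.25, 0.5, 1.0, 2.0])
obs = std_af(L, std_max=5.0, kd=0.4)  # synthetic "observations", true Kd = 0.4 mM

# Coarse grid search for Kd (a real analysis would use nonlinear fitting)
kd_grid = np.linspace(0.05, 2.0, 400)
sse = [np.sum((obs - std_af(L, 5.0, k)) ** 2) for k in kd_grid]
kd_fit = kd_grid[int(np.argmin(sse))]
print(abs(kd_fit - 0.4) < 0.01)  # True
```

    In practice std_max and Kd are fitted jointly, and initial slopes of STD build-up curves are preferred to single saturation times to suppress rebinding artifacts.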

  15. Magnetically launched flyer plate technique for probing electrical conductivity of compressed copper

    NASA Astrophysics Data System (ADS)

    Cochrane, K. R.; Lemke, R. W.; Riford, Z.; Carpenter, J. H.

    2016-03-01

    The electrical conductivity of materials under extremes of temperature and pressure is of crucial importance for a wide variety of phenomena, including planetary modeling, inertial confinement fusion, and pulsed power based dynamic materials experiments. There is a dearth of experimental techniques and data for highly compressed materials, even at known states such as along the principal isentrope and Hugoniot, where many pulsed power experiments occur. We present a method for developing, calibrating, and validating material conductivity models as used in magnetohydrodynamic (MHD) simulations. The difficulty in calibrating a conductivity model is in knowing where the model should be modified. Our method isolates those regions that will have an impact. It also quantitatively prioritizes which regions will have the most beneficial impact. Finally, it tracks the quantitative improvements to the conductivity model during each incremental adjustment. In this paper, we use an experiment on Sandia National Laboratories Z-machine to isentropically launch multiple flyer plates and, with the MHD code ALEGRA and the optimization code DAKOTA, calibrated the conductivity such that we matched an experimental figure of merit to +/-1%.

  16. Magnetically launched flyer plate technique for probing electrical conductivity of compressed copper

    DOE PAGES

    Cochrane, Kyle R.; Lemke, Raymond W.; Riford, Z.; ...

    2016-03-11

    The electrical conductivity of materials under extremes of temperature and pressure is of crucial importance for a wide variety of phenomena, including planetary modeling, inertial confinement fusion, and pulsed power based dynamic materials experiments. There is a dearth of experimental techniques and data for highly compressed materials, even at known states such as along the principal isentrope and Hugoniot, where many pulsed power experiments occur. We present a method for developing, calibrating, and validating material conductivity models as used in magnetohydrodynamic (MHD) simulations. The difficulty in calibrating a conductivity model is in knowing where the model should be modified. Our method isolates those regions that will have an impact. It also quantitatively prioritizes which regions will have the most beneficial impact. Finally, it tracks the quantitative improvements to the conductivity model during each incremental adjustment. In this study, we use an experiment on Sandia National Laboratories Z-machine to isentropically launch multiple flyer plates and, with the MHD code ALEGRA and the optimization code DAKOTA, calibrated the conductivity such that we matched an experimental figure of merit to +/-1%.
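    The calibration loop these two records describe (adjust a conductivity model until an MHD-simulated figure of merit matches experiment to within +/-1%) can be caricatured as a one-parameter root search. The sketch below is a hypothetical stand-in only: the `simulate` closure replaces an expensive ALEGRA run, and bisection replaces DAKOTA's optimizer.

```python
def calibrate(simulate, target, lo, hi, rel_tol=0.01, max_iter=60):
    """Bisect on one model parameter until the simulated figure of merit
    matches the experimental target within rel_tol.
    Assumes the figure of merit responds monotonically to the parameter."""
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fom = simulate(mid)
        if abs(fom - target) <= rel_tol * abs(target):
            return mid, fom
        if fom < target:
            lo = mid
        else:
            hi = mid
    raise RuntimeError("calibration did not converge")

# Hypothetical stand-in for an expensive MHD run: flyer velocity (km/s)
# as a monotonic function of a conductivity scale factor
simulate = lambda scale: 12.0 * scale ** 0.5

param, fom = calibrate(simulate, target=18.0, lo=0.5, hi=4.0)
print(abs(fom - 18.0) <= 0.01 * 18.0)  # True
```

    The real method is richer than this: it also identifies *which* regions of the conductivity table matter and tracks improvement per adjustment, which a scalar bisection cannot express.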

  17. Use of remote-sensing techniques to survey the physical habitat of large rivers

    USGS Publications Warehouse

    Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffery W.; Kennedy, Gregory W.; Smith, Stephen B.; Edsall, Thomas A.; Behrendt, Thomas E.; Cholwek, Gary; Frey, Jeffrey W.; Kennedy, Gregory W.; Smith, Stephen B.

    1997-01-01

    Remote-sensing techniques can be used to quantitatively characterize the physical habitat of large rivers in the United States, where traditional survey approaches typically used in small- and medium-sized streams and rivers would be ineffective or impossible to apply. The state-of-the-art remote-sensing technologies that we discuss here include side-scan sonar, RoxAnn, acoustic Doppler current profiler, remotely operated vehicles and camera systems, global positioning systems, and laser level survey systems. The use of these technologies will permit the collection of information needed to create computer visualizations and hard copy maps and generate quantitative databases that can be used in real-time mode in the field to characterize the physical habitat at a study location of interest and to guide the distribution of sampling effort needed to address other habitat-related study objectives. This report augments habitat sampling and characterization guidance provided by Meador et al. (1993) and is intended for use primarily by U.S. Geological Survey National Water Quality Assessment program managers and scientists who are documenting water quality in streams and rivers of the United States.

  18. Focussed ion beam thin sample microanalysis using a field emission gun electron probe microanalyser

    NASA Astrophysics Data System (ADS)

    Kubo, Y.

    2018-01-01

    Field emission gun electron probe microanalysis (FEG-EPMA) in conjunction with wavelength-dispersive X-ray spectrometry using a low acceleration voltage (V_acc) allows elemental analysis with sub-micrometre lateral spatial resolution (SR). However, this degree of SR does not necessarily meet the requirements associated with increasingly miniaturised devices. Another challenge related to performing FEG-EPMA with a low V_acc is that the accuracy of quantitative analyses is adversely affected, primarily because low-energy X-ray lines such as the L- and M-lines must be employed and due to the potential of line interference. One promising means of obtaining high SR with FEG-EPMA is to use thin samples together with high V_acc values. This mini-review covers the basic principles of thin-sample FEG-EPMA and describes an application of this technique to the analysis of optical fibres. Outstanding issues related to this technique that must be addressed are also discussed, which include the potential for electron beam damage during analysis of insulating materials and the development of methods to use thin samples for quantitative analysis.

  19. The fragmented testis method: development and its advantages of a new quantitative evaluation technique for detection of testis-ova in male fish.

    PubMed

    Lin, Bin-Le; Hagino, Satoshi; Kagoshima, Michio; Iwamatsu, Takashi

    2009-02-01

    A new quantitative evaluation technique, termed the fragmented testis method, has been developed for the detection of testis-ova in genotypic male fish using the medaka (Oryzias latipes). The routine traditional histological method for detection of testis-ova in male fish exposed to estrogens or suspected endocrine-disrupting chemicals has several disadvantages, including possible oversight of testis-ova due to limited sampling of selected tissue sections. The method we have developed here allows for the accurate determination of the developmental stages and the number and size of testis-ova in a whole testis. Each testis was removed from the fish specimen, fixed with 10% buffered formalin solution, and then divided into small fragments on a glass slide with a dissecting needle or scalpel and aciform forceps in glycerin solution containing a small amount of methylene blue or toluidine blue. If present, all developing testis-ova of various sizes in fragmented testicular tissues were clearly stained and were observable under a dissecting microscope. Testis-ova occurring in controls were ascertained, and spermatozoa were also distinguishable using this method. This proved to be a convenient and cost-effective method for quantitatively evaluating testis-ova appearance in fish, and it may help to clarify the mechanism of testis-ova formation and the biological significance of testis-ova in future studies of endocrine disruption.

  20. Application of X-ray phase contrast micro-tomography to the identification of traditional Chinese medicines

    NASA Astrophysics Data System (ADS)

    Ye, L. L.; Xue, Y. L.; Ni, L. H.; Tan, H.; Wang, Y. D.; Xiao, T. Q.

    2013-07-01

    Nondestructive and in situ investigation of characteristic microstructures is important for the identification of traditional Chinese medicines (TCMs), especially for precious specimens and samples containing oils. X-ray phase contrast micro-tomography (XPCMT) could be a practical solution for this kind of investigation. Fructus Foeniculi, a fruit-type TCM, was selected as the test sample. Experimental results show that the characteristic microstructures of Fructus Foeniculi, including vittae, vascular bundles, embryo, endosperm and the mesocarp reticulate cells around the vittae, can be clearly distinguished, and the integrated dissepiment microstructure in the vittae was observed successfully. Notably, for the first time, the virtual slice technique made it possible to investigate the liquid contents inside the TCMs: the vittae filled with volatile oil in the oil chamber were observed with this nondestructive and in situ 3-dimensional imaging technique. Furthermore, taking advantage of micro-computed tomography, we can obtain quantitative volume information for the characteristic microstructures in the liquid state. The volume of the oil chambers and of the volatile oil contained inside the vittae was quantitatively analyzed. Accordingly, the volume ratio of the volatile oil can be calculated easily and accurately. We conclude that XPCMT could be a useful tool for the nondestructive identification and quantitative analysis of TCMs.

  1. Non-interferometric quantitative phase imaging of yeast cells

    NASA Astrophysics Data System (ADS)

    Poola, Praveen K.; Pandiyan, Vimal Prabhu; John, Renu

    2015-12-01

    Real-time imaging of live cells is quite difficult without the addition of external contrast agents. Various methods for quantitative phase imaging of living cells have been proposed, such as digital holographic microscopy and diffraction phase microscopy. In this paper, we report theoretical and experimental results of quantitative phase imaging of live yeast cells with nanometric precision using the transport of intensity equation (TIE). We demonstrate nanometric depth sensitivity in imaging live yeast cells using this technique. This technique, being noninterferometric, does not need any coherent light source, and images can be captured through a regular bright-field microscope. This real-time imaging technique would deliver the depth or 3-D volume information of cells and is highly promising in real-time digital pathology applications, screening of pathogens and staging of diseases like malaria, as it does not need any preprocessing of samples.
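    Under uniform illumination, the transport of intensity equation reduces to a Poisson problem for the phase, which can be inverted with FFTs on a periodic grid. The sketch below is a generic illustration of that inversion (a self-consistency check on a synthetic phase), not the authors' implementation:

```python
import numpy as np

def inverse_laplacian(f, eps=1e-9):
    """Solve laplacian(u) = f on a periodic grid via the FFT.
    In uniform-illumination TIE, f would be -(k / I0) * dI/dz."""
    n = f.shape[0]
    dx = 2 * np.pi / n
    kx = 2 * np.pi * np.fft.fftfreq(n, d=dx)  # integer wavenumbers on [0, 2*pi)
    KX, KY = np.meshgrid(kx, kx, indexing="ij")
    k2 = KX**2 + KY**2
    F = np.fft.fft2(f)
    # laplacian in Fourier space multiplies by -k2; zero mode is left at 0
    U = np.where(k2 > eps, -F / np.maximum(k2, eps), 0.0)
    return np.real(np.fft.ifft2(U))

# Self-consistency check with a smooth, zero-mean, periodic test phase
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.cos(X) * np.cos(Y)      # "true" phase
lap = -2.0 * phi                 # analytic laplacian of cos(x)cos(y)
phi_rec = inverse_laplacian(lap)
print(np.allclose(phi, phi_rec, atol=1e-6))  # True
```

    Practical TIE solvers additionally estimate dI/dz from defocused image pairs and handle non-uniform intensity and boundary conditions, which this sketch omits.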

  2. Putting tools in the toolbox: Development of a free, open-source toolbox for quantitative image analysis of porous media.

    NASA Astrophysics Data System (ADS)

    Iltis, G.; Caswell, T. A.; Dill, E.; Wilkins, S.; Lee, W. K.

    2014-12-01

    X-ray tomographic imaging of porous media has proven to be a valuable tool for investigating and characterizing the physical structure and state of both natural and synthetic porous materials, including glass bead packs, ceramics, soil and rock. Given that most synchrotron facilities have user programs which grant academic researchers access to facilities and x-ray imaging equipment free of charge, a key limitation or hindrance for small research groups interested in conducting x-ray imaging experiments is the financial cost associated with post-experiment data analysis. While the cost of high performance computing hardware continues to decrease, expenses associated with licensing commercial software packages for quantitative image analysis continue to increase, with current prices being as high as $24,000 USD, for a single user license. As construction of the Nation's newest synchrotron accelerator nears completion, a significant effort is being made here at the National Synchrotron Light Source II (NSLS-II), Brookhaven National Laboratory (BNL), to provide an open-source, experiment-to-publication toolbox that reduces the financial and technical 'activation energy' required for performing sophisticated quantitative analysis of multidimensional porous media data sets, collected using cutting-edge x-ray imaging techniques. Implementation focuses on leveraging existing open-source projects and developing additional tools for quantitative analysis. We will present an overview of the software suite that is in development here at BNL including major design decisions, a demonstration of several test cases illustrating currently available quantitative tools for analysis and characterization of multidimensional porous media image data sets and plans for their future development.
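    As an example of the kind of quantitative measurement such a toolbox provides, the sketch below computes porosity from a segmented (binary) tomographic volume; the volume here is synthetic, not an NSLS-II data set.

```python
import numpy as np

def porosity(binary_volume):
    """Porosity = pore voxels / total voxels for a binary array (0 = solid, 1 = pore)."""
    vol = np.asarray(binary_volume, dtype=bool)
    return vol.sum() / vol.size

# Hypothetical segmented 3D volume: a 10x10x10 solid block with a 5x5x5 cubic pore
vol = np.zeros((10, 10, 10), dtype=np.uint8)
vol[2:7, 2:7, 2:7] = 1  # 125 pore voxels out of 1000
print(porosity(vol))  # 0.125
```

    Segmentation (grayscale to binary) is the hard step in practice; once a volume is binary, most pore-scale metrics reduce to similarly simple voxel counting and labeling operations.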

  3. Phospholipid Fatty Acid Analysis: Past, Present and Future

    NASA Astrophysics Data System (ADS)

    Findlay, R. H.

    2008-12-01

    With their 1980 publication, Bobbie and White initiated the use of phospholipid fatty acids for the study of microbial communities. This method, integrated with a previously published biomass assay based on the colorimetric detection of orthophosphate liberated from phospholipids, provided the first quantitative method for determining microbial community structure. The method is based on a quantitative extraction of lipids from the sample matrix, isolation of the phospholipids, conversion of the phospholipid fatty acids to their corresponding fatty acid methyl esters (known by the acronym FAME) and the separation, identification and quantification of the FAME by gas chromatography. Early laboratory and field studies focused on correlating individual fatty acids to particular groups of microorganisms. Subsequent improvements to the methodology include reduced solvent volumes for extractions, improved sensitivity in the detection of orthophosphate and the use of solid-phase extraction technology. Improvements in the field of gas chromatography also increased the accessibility of the technique, and it has been widely applied to water, sediment, soil and aerosol samples. Whole-cell fatty acid analysis, a related but distinct technique, is currently used for phenotypic characterization in bacterial species descriptions and is the basis for a commercial, rapid bacterial identification system. In the early 1990s, the application of multivariate statistical analysis, first cluster analysis and then principal component analysis, further improved the usefulness of the technique and allowed the development of a functional-group approach to the interpretation of phospholipid fatty acid profiles. Statistical techniques currently applied to the analysis of phospholipid fatty acid profiles include constrained ordinations and neural networks.
Using redundancy analysis, a form of constrained ordination, we have recently shown that both cation concentration and dissolved organic matter (DOM) quality are determinants of microbial community structure in forested headwater streams. One of the most exciting recent developments in phospholipid fatty acid analysis is the application of compound-specific stable isotope analysis. We are currently applying this technique to stream sediments to help determine which microorganisms are involved in the initial processing of DOM, and the technique promises to be a useful tool for assigning ecological function to microbial populations.
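As a sketch of the multivariate step described above, the following applies principal component analysis to synthetic FAME profiles. The mole-percent compositions are invented for illustration; real profiles would come from GC quantification.

```python
import numpy as np

# Synthetic PLFA/FAME profiles: rows = samples, columns = mole percent of
# three fatty acids. Group A is enriched in the first FAME, group B in the second.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[60.0, 20.0, 20.0], scale=2.0, size=(5, 3))
group_b = rng.normal(loc=[20.0, 60.0, 20.0], scale=2.0, size=(5, 3))
profiles = np.vstack([group_a, group_b])

# PCA via singular value decomposition of the mean-centered data matrix.
centered = profiles - profiles.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = centered @ Vt.T          # sample coordinates on the principal components
explained = S**2 / np.sum(S**2)   # fraction of variance per component

# The two synthetic communities separate along the first principal component.
pc1 = scores[:, 0]
```

In a real analysis the component loadings (rows of `Vt`) would then be inspected to see which fatty acids drive the separation, which is the basis of the functional-group interpretation mentioned above.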

  4. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis.

    PubMed

    Lee, Hyokyeong; Moody-Davis, Asher; Saha, Utsab; Suzuki, Brian M; Asarnow, Daniel; Chen, Steven; Arkin, Michelle; Caffrey, Conor R; Singh, Rahul

    2012-01-01

    Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs.
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind.
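One building block of such an analysis, sketched here with synthetic responses rather than the paper's data, is a time-series similarity measure. Dynamic time warping (DTW) is a common choice because it tolerates responses that are similar in shape but shifted in time:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic O(len(a)*len(b)) dynamic-time-warping distance."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    return cost[n, m]

t = np.linspace(0, 2 * np.pi, 50)
resp_a = np.sin(t)            # one synthetic phenotypic response
resp_b = np.sin(t - 0.5)      # the same response, delayed in time
resp_c = np.zeros_like(t)     # no response at all

# DTW tolerates the time shift, so a and b are far closer to each other than to c.
d_ab = dtw_distance(resp_a, resp_b)
d_ac = dtw_distance(resp_a, resp_c)
```

A pairwise matrix of such distances can then be fed to any standard clustering algorithm to group parasites by response, which is the spirit of the clustering step described above.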

  5. Quantification and clustering of phenotypic screening data using time-series analysis for chemotherapy of schistosomiasis

    PubMed Central

    2012-01-01

    Background Neglected tropical diseases, especially those caused by helminths, constitute some of the most common infections of the world's poorest people. Development of techniques for automated, high-throughput drug screening against these diseases, especially in whole-organism settings, constitutes one of the great challenges of modern drug discovery. Method We present a method for enabling high-throughput phenotypic drug screening against diseases caused by helminths with a focus on schistosomiasis. The proposed method allows for a quantitative analysis of the systemic impact of a drug molecule on the pathogen as exhibited by the complex continuum of its phenotypic responses. This method consists of two key parts: first, biological image analysis is employed to automatically monitor and quantify shape-, appearance-, and motion-based phenotypes of the parasites. Next, we represent these phenotypes as time-series and show how to compare, cluster, and quantitatively reason about them using techniques of time-series analysis. Results We present results on a number of algorithmic issues pertinent to the time-series representation of phenotypes. These include results on appropriate representation of phenotypic time-series, analysis of different time-series similarity measures for comparing phenotypic responses over time, and techniques for clustering such responses by similarity. Finally, we show how these algorithmic techniques can be used for quantifying the complex continuum of phenotypic responses of parasites. An important corollary is the ability of our method to recognize and rigorously group parasites based on the variability of their phenotypic response to different drugs. Conclusions The methods and results presented in this paper enable automatic and quantitative scoring of high-throughput phenotypic screens focused on helminthic diseases. Furthermore, these methods allow us to analyze and stratify parasites based on their phenotypic response to drugs.
Together, these advancements represent a significant breakthrough for the process of drug discovery against schistosomiasis in particular and can be extended to other helminthic diseases which together afflict a large part of humankind. PMID:22369037

  6. Timed function tests, motor function measure, and quantitative thigh muscle MRI in ambulant children with Duchenne muscular dystrophy: A cross-sectional analysis.

    PubMed

    Schmidt, Simone; Hafner, Patricia; Klein, Andrea; Rubino-Nacht, Daniela; Gocheva, Vanya; Schroeder, Jonas; Naduvilekoot Devasia, Arjith; Zuesli, Stephanie; Bernert, Guenther; Laugel, Vincent; Bloetzer, Clemens; Steinlin, Maja; Capone, Andrea; Gloor, Monika; Tobler, Patrick; Haas, Tanja; Bieri, Oliver; Zumbrunn, Thomas; Fischer, Dirk; Bonati, Ulrike

    2018-01-01

    The development of new therapeutic agents for the treatment of Duchenne muscular dystrophy has put a focus on defining the outcome measures most sensitive to treatment effects. This cross-sectional analysis investigates the relation between validated clinical assessments, such as the 6-minute walk test and the motor function measure, and quantitative muscle MRI of thigh muscles in ambulant Duchenne muscular dystrophy patients aged 6.5 to 10.8 years (mean 8.2, SD 1.1). Quantitative muscle MRI included the mean fat fraction using a 2-point Dixon technique, and transverse relaxation time (T2) measurements. All clinical assessments were highly significantly inter-correlated (p < 0.001). The strongest correlation with the motor function measure and its D1 subscore was shown by the 6-minute walk test. Clinical assessments showed no correlation with age. Importantly, quantitative muscle MRI values correlated significantly with all clinical assessments, with the extensors showing the strongest correlation. In contrast to the clinical assessments, quantitative muscle MRI values were highly significantly correlated with age. In conclusion, the motor function measure and timed function tests measure disease severity in a highly comparable fashion, and all tests correlated with quantitative muscle MRI values quantifying fatty muscle degeneration. Copyright © 2017 Elsevier B.V. All rights reserved.
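For reference, the 2-point Dixon fat fraction mentioned above follows from the in-phase (water + fat) and opposed-phase (water - fat) signals; a minimal sketch with synthetic voxel values:

```python
import numpy as np

def dixon_fat_fraction(in_phase, opposed_phase):
    """Fat fraction FF = F/(W+F) from 2-point Dixon in/opposed-phase signals."""
    ip = np.asarray(in_phase, dtype=float)       # water + fat
    op = np.asarray(opposed_phase, dtype=float)  # water - fat
    water = (ip + op) / 2.0
    fat = (ip - op) / 2.0
    return fat / (water + fat)

# Synthetic voxel: 100 in-phase, 60 opposed-phase -> water 80, fat 20, FF 0.2.
ff = dixon_fat_fraction([100.0], [60.0])[0]
```

Real Dixon processing also handles phase errors and water/fat swaps; this sketch covers only the magnitude arithmetic.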

  7. The application of absolute quantitative (1)H NMR spectroscopy in drug discovery and development.

    PubMed

    Singh, Suruchi; Roy, Raja

    2016-07-01

    The identification of a drug candidate and the determination of its structure are the most important steps in the process of drug discovery, and for these, nuclear magnetic resonance (NMR) is one of the most selective analytical techniques. The present review illustrates the various perspectives of absolute quantitative (1)H NMR spectroscopy in drug discovery and development. It deals with the fundamentals of quantitative NMR (qNMR), the physicochemical properties affecting qNMR, and the latest referencing techniques used for quantification. The application of qNMR during various stages of drug discovery and development, namely natural product research, drug quantitation in dosage forms, drug metabolism studies, impurity profiling and solubility measurements, is elaborated. To achieve this, the authors explore the literature on NMR in drug discovery and development between 1963 and 2015, also taking into account several other reviews on the subject. qNMR experiments are used throughout drug discovery and development because the technique is non-destructive, versatile and robust, with low intra- and inter-laboratory variability. There are, however, limitations: qNMR of complex biological samples suffers from peak overlap and limited sensitivity, which can be overcome by using hyphenated chromatographic techniques in addition to NMR.
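The core qNMR calculation is simple enough to sketch: with an internal standard, the analyte concentration follows from the ratio of integrated signal areas, normalized by the number of protons contributing to each signal. The numbers below are hypothetical:

```python
def qnmr_concentration(area_analyte, area_standard,
                       protons_analyte, protons_standard,
                       conc_standard):
    """Analyte concentration from integral ratio, proton counts and standard conc."""
    return conc_standard * (area_analyte / area_standard) \
                         * (protons_standard / protons_analyte)

# Hypothetical numbers: a 2-proton analyte signal integrating to 4.0 against a
# 9-proton internal-standard signal integrating to 9.0 at 10 mM gives 20 mM.
conc = qnmr_concentration(4.0, 9.0, 2, 9, 10.0)
```

This proportionality to proton count, independent of the molecule, is what makes (1)H NMR an "absolute" quantitation method without compound-specific calibration curves.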

  8. Computational systems biology and dose-response modeling in relation to new directions in toxicity testing.

    PubMed

    Zhang, Qiang; Bhattacharya, Sudin; Andersen, Melvin E; Conolly, Rory B

    2010-02-01

    The new paradigm envisioned for toxicity testing in the 21st century advocates shifting from the current animal-based testing process to a combination of in vitro cell-based studies, high-throughput techniques, and in silico modeling. A strategic component of the vision is the adoption of the systems biology approach to acquire, analyze, and interpret toxicity pathway data. As key toxicity pathways are identified and their wiring details elucidated using traditional and high-throughput techniques, there is a pressing need to understand their qualitative and quantitative behaviors in response to perturbation by both physiological signals and exogenous stressors. The complexity of these molecular networks makes the task of understanding cellular responses merely by human intuition challenging, if not impossible. This process can be aided by mathematical modeling and computer simulation of the networks and their dynamic behaviors. A number of theoretical frameworks were developed in the last century for understanding dynamical systems in science and engineering disciplines. These frameworks, which include metabolic control analysis, biochemical systems theory, nonlinear dynamics, and control theory, can greatly facilitate the process of organizing, analyzing, and understanding toxicity pathways. Such analysis will require a comprehensive examination of the dynamic properties of "network motifs"--the basic building blocks of molecular circuits. Network motifs like feedback and feedforward loops appear repeatedly in various molecular circuits across cell types and enable vital cellular functions like homeostasis, all-or-none response, memory, and biological rhythm. These functional motifs and associated qualitative and quantitative properties are the predominant source of nonlinearities observed in cellular dose response data. Complex response behaviors can arise from toxicity pathways built upon combinations of network motifs. 
While the field of computational cell biology has advanced rapidly with the increasing availability of new data and powerful simulation techniques, life sciences education still lacks the quantitative orientation needed to make efficient use of these new tools in implementing the new toxicity testing paradigm. A revamped undergraduate curriculum in the biological sciences, including compulsory courses in mathematics and the analysis of dynamical systems, is required to address this gap. In parallel, dissemination of computational systems biology techniques and other analytical tools among practicing toxicologists and risk assessment professionals will help accelerate implementation of the new toxicity testing vision.
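To make one of these motifs concrete, here is a toy simulation (invented parameters, simple forward-Euler integration) of a negative feedback loop, the motif underlying homeostasis: doubling the input raises the steady state by much less than a factor of two.

```python
def simulate(k_prod=10.0, K=1.0, k_deg=1.0, x0=0.0, dt=0.01, steps=5000):
    """Forward-Euler integration of dx/dt = k_prod*K/(K+x) - k_deg*x,
    where production of x is repressed by x itself (negative feedback)."""
    x = x0
    for _ in range(steps):
        dx = k_prod * K / (K + x) - k_deg * x  # repressed production minus decay
        x += dt * dx
    return x

x_base = simulate()                # steady state ~2.70 (root of x^2 + x - 10 = 0)
x_doubled = simulate(k_prod=20.0)  # steady state 4.0  (root of x^2 + x - 20 = 0)
# Doubling the production rate raises the steady state by only ~48%,
# the signature of homeostatic negative feedback.
```

Feedforward loops, bistable switches and oscillators can be explored with the same few lines by changing the right-hand side of the ODE.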

  9. 3D quantitative phase imaging of neural networks using WDT

    NASA Astrophysics Data System (ADS)

    Kim, Taewoo; Liu, S. C.; Iyer, Raj; Gillette, Martha U.; Popescu, Gabriel

    2015-03-01

    White-light diffraction tomography (WDT) is a recently developed 3D imaging technique based on a quantitative phase imaging system called spatial light interference microscopy (SLIM). The technique has achieved sub-micron resolution in all three directions with high sensitivity granted by the low coherence of a white-light source. Demonstrations of the technique on single-cell imaging have been presented previously; however, imaging of larger samples, including clusters of cells, has not been demonstrated with the technique. Neurons in an animal body form a highly complex and spatially organized 3D structure, which can be characterized by neuronal networks or circuits. Currently, the most common method of studying the 3D structure of neuronal networks is confocal fluorescence microscopy, which requires fluorescent labeling, either with transient membrane dyes or after fixation of the cells. Therefore, studies on neurons are often limited to samples that are chemically treated and/or dead. WDT presents a solution for imaging live neuronal networks with high spatial and temporal resolution, because it is a label-free and non-invasive 3D imaging method. Using this method, a mouse or rat hippocampal neuron culture and a mouse dorsal root ganglion (DRG) neuron culture have been imaged in order to see the extension of processes between the cells in 3D. Furthermore, the tomogram is compared with a confocal fluorescence image in order to investigate the 3D structure at synapses.

  10. Student Enrollment Forecasting Techniques for Higher Education.

    ERIC Educational Resources Information Center

    Ahrens, Stephen W.

    Various techniques used by state agencies, secondary schools, community colleges, and large universities to forecast enrollments are described and guidelines for constructing forecasting procedures are outlined. The forecasting techniques are divided into three categories: (1) quantitative techniques based on historical data that attempt curve…

  11. Iterative optimization method for design of quantitative magnetization transfer imaging experiments.

    PubMed

    Levesque, Ives R; Sled, John G; Pike, G Bruce

    2011-09-01

    Quantitative magnetization transfer imaging (QMTI) using spoiled gradient echo sequences with pulsed off-resonance saturation can be a time-consuming technique. A method is presented for selection of an optimum experimental design for quantitative magnetization transfer imaging based on the iterative reduction of a discrete sampling of the Z-spectrum. The applicability of the technique is demonstrated for human brain white matter imaging at 1.5 T and 3 T, and optimal designs are produced to target specific model parameters. The optimal number of measurements and the signal-to-noise ratio required for stable parameter estimation are also investigated. In vivo imaging results demonstrate that this optimal design approach substantially improves parameter map quality. The iterative method presented here provides an advantage over free form optimal design methods, in that pragmatic design constraints are readily incorporated. In particular, the presented method avoids clustering and repeated measures in the final experimental design, an attractive feature for the purpose of magnetization transfer model validation. The iterative optimal design technique is general and can be applied to any method of quantitative magnetization transfer imaging. Copyright © 2011 Wiley-Liss, Inc.
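The iterative-reduction idea generalizes beyond QMTI. As a hedged sketch (using a toy two-parameter exponential model, not the paper's magnetization transfer model), one can greedily discard the candidate measurement whose removal least reduces the determinant of the Fisher information matrix, a D-optimality criterion:

```python
import numpy as np

def jacobian(t, A=1.0, R=1.0):
    """Sensitivities of y(t) = A*exp(-R*t) with respect to A and R."""
    e = np.exp(-R * t)
    return np.column_stack([e, -A * t * e])

def reduce_design(times, n_keep, A=1.0, R=1.0):
    """Greedily drop the point whose removal best preserves det(J^T J)."""
    times = list(times)
    while len(times) > n_keep:
        best_det, best_i = -np.inf, 0
        for i in range(len(times)):
            trial = np.array(times[:i] + times[i + 1:])
            J = jacobian(trial, A, R)
            d = np.linalg.det(J.T @ J)
            if d > best_det:
                best_det, best_i = d, i
        times.pop(best_i)
    return np.array(times)

candidates = np.linspace(0.1, 5.0, 25)     # dense initial sampling
design = reduce_design(candidates, n_keep=4)
d_final = np.linalg.det(jacobian(design).T @ jacobian(design))
```

Because each step removes an existing candidate rather than placing points freely, pragmatic constraints (achievable saturation powers and offsets, no repeated measures) are automatically respected, which is the advantage the abstract highlights over free-form optimal design.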

  12. Quantitative Hydrocarbon Surface Analysis

    NASA Technical Reports Server (NTRS)

    Douglas, Vonnie M.

    2000-01-01

    The elimination of ozone depleting substances, such as carbon tetrachloride, has resulted in the use of new analytical techniques for cleanliness verification and contamination sampling. The last remaining application at Rocketdyne which required a replacement technique was the quantitative analysis of hydrocarbons by infrared spectrometry. This application, which previously utilized carbon tetrachloride, was successfully modified using the SOC-400, a compact portable FTIR manufactured by Surface Optics Corporation. This instrument can quantitatively measure and identify hydrocarbons from solvent flush of hardware as well as directly analyze the surface of metallic components without the use of ozone depleting chemicals. Several sampling accessories are utilized to perform analysis for various applications.

  13. NASA Lewis Research Center Futuring Workshop

    NASA Technical Reports Server (NTRS)

    Boroush, Mark; Stover, John; Thomas, Charles

    1987-01-01

    On October 21 and 22, 1986, the Futures Group ran a two-day Futuring Workshop on the premises of NASA Lewis Research Center. The workshop had four main goals: to acquaint participants with the general history of technology forecasting; to familiarize participants with the range of forecasting methodologies; to acquaint participants with the range of applicability, strengths, and limitations of each method; and to offer participants some hands-on experience by working through both judgmental and quantitative case studies. Among the topics addressed during this workshop were: information sources; judgmental techniques; quantitative techniques; merger of judgment with quantitative measurement; data collection methods; and dealing with uncertainty.

  14. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

    Standard PCR can no longer satisfy the needs of biotechnology development and clinical research. After numerous kinetic studies, the PE company found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable, and on this basis developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too large for the needs of biotechnology development and clinical research, so a better quantitative PCR method is needed. The mathematical model submitted here draws on results from related sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main components of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), the initial template number and the other reaction conditions, and accurately reflects the accumulation of PCR product. Using this relation, accurate quantitative PCR analysis can be performed, and the accumulated product quantity can be obtained from the initial template number. With this model, the result error depends only on the accuracy of the fluorescence intensity measurement, i.e., on the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template number lies between 100 and 1,000,000, the quantitative accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with the proposed quantitative PCR analysis system yields results roughly 80 times more accurate than the Ct method.
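For comparison, the standard Ct-style calculation that the abstract criticizes can be sketched as follows. Assuming ideal exponential amplification N = N0(1+E)^n, the threshold-crossing cycle is linear in log N0 (an idealized, continuous model with made-up numbers):

```python
import math

def cycles_to_threshold(n0, threshold, efficiency=1.0):
    """Continuous cycle number at which n0*(1+E)^n reaches the threshold."""
    return math.log(threshold / n0, 1 + efficiency)

def initial_template(ct, threshold, efficiency=1.0):
    """Invert the relation: estimate the starting copy number from Ct."""
    return threshold / (1 + efficiency) ** ct

ct_a = cycles_to_threshold(1e4, 1e10)  # 10,000 starting copies
ct_b = cycles_to_threshold(1e3, 1e10)  # 1,000 starting copies
# At perfect efficiency a 10-fold dilution shifts Ct by log2(10) ~ 3.32 cycles.
n0_est = initial_template(ct_a, 1e10)
```

The abstract's point is that real amplification deviates from this ideal exponential model, which is why a mechanistic model of product accumulation can quantify more accurately than the Ct relation.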

  15. Quantitative body fluid proteomics in medicine - A focus on minimal invasiveness.

    PubMed

    Csősz, Éva; Kalló, Gergő; Márkus, Bernadett; Deák, Eszter; Csutak, Adrienne; Tőzsér, József

    2017-02-05

    Identification of new biomarkers specific for various pathological conditions is an important field in the medical sciences. Body fluids have emerging potential in biomarker studies, especially those which are continuously available and can be collected by non-invasive means. Changes in the protein composition of body fluids such as tears, saliva and sweat may provide information on both local and systemic conditions of medical relevance. In this review, our aim is to discuss the quantitative proteomics techniques used in biomarker studies, and to present advances in quantitative proteomics of non-invasively collectable body fluids with relevance to biomarker identification. The advantages and limitations of the widely used quantitative proteomics techniques are also presented. Based on the reviewed literature, we suggest an ideal pipeline for body fluid analyses aimed at biomarker discovery: starting from identification of biomarker candidates by shotgun quantitative proteomics or protein arrays, through verification of potential biomarkers by targeted mass spectrometry, to antibody-based validation of biomarkers. The importance of body fluids as a rich source of biomarkers is discussed. Quantitative proteomics is a challenging part of proteomics applications. Body fluids collected by non-invasive means have high relevance in medicine; they are good sources of biomarkers for establishing diagnoses, following disease progression and identifying high-risk groups. The review presents the most widely used quantitative proteomics techniques in body fluid analysis and lists the potential biomarkers identified in tears, saliva, sweat, nasal mucus and urine for local and systemic diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Quantitative Imaging Biomarkers: A Review of Statistical Methods for Computer Algorithm Comparisons

    PubMed Central

    2014-01-01

    Quantitative biomarkers from medical images are becoming important tools for clinical diagnosis, staging, monitoring, treatment planning, and development of new therapies. While there is a rich history of the development of quantitative imaging biomarker (QIB) techniques, little attention has been paid to the validation and comparison of the computer algorithms that implement the QIB measurements. In this paper we provide a framework for QIB algorithm comparisons. We first review and compare various study designs, including designs with the true value (e.g. phantoms, digital reference images, and zero-change studies), designs with a reference standard (e.g. studies testing equivalence with a reference standard), and designs without a reference standard (e.g. agreement studies and studies of algorithm precision). The statistical methods for comparing QIB algorithms are then presented for various study types using both aggregate and disaggregate approaches. We propose a series of steps for establishing the performance of a QIB algorithm, identify limitations in the current statistical literature, and suggest future directions for research. PMID:24919829
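As an example of one disaggregate agreement analysis from this statistical toolbox, the following computes Bland-Altman bias and 95% limits of agreement between two hypothetical algorithms measuring the same quantity (the data are synthetic):

```python
import numpy as np

def bland_altman(x, y):
    """Bias and 95% limits of agreement between two paired measurement sets."""
    diff = np.asarray(x, float) - np.asarray(y, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Synthetic paired measurements: two hypothetical algorithms measuring the
# same 100 lesions; algorithm B reads about 0.5 units high on average.
rng = np.random.default_rng(1)
truth = rng.uniform(10.0, 50.0, 100)
alg_a = truth + rng.normal(0.0, 1.0, 100)
alg_b = truth + rng.normal(0.5, 1.0, 100)

bias, low, high = bland_altman(alg_a, alg_b)  # expect bias near -0.5
```

Whether the resulting limits of agreement are acceptable is a clinical judgment, not a statistical one, which is one reason the review distinguishes agreement studies from studies against a reference standard.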

  17. Quantitative Measurement of Local Infrared Absorption and Dielectric Function with Tip-Enhanced Near-Field Microscopy.

    PubMed

    Govyadinov, Alexander A; Amenabar, Iban; Huth, Florian; Carney, P Scott; Hillenbrand, Rainer

    2013-05-02

    Scattering-type scanning near-field optical microscopy (s-SNOM) and Fourier transform infrared nanospectroscopy (nano-FTIR) are emerging tools for nanoscale chemical material identification. Here, we push s-SNOM and nano-FTIR one important step further by enabling them to quantitatively measure local dielectric constants and infrared absorption. Our technique is based on an analytical model, which allows for a simple inversion of the near-field scattering problem. It yields the dielectric permittivity and absorption of samples with 2 orders of magnitude improved spatial resolution compared to far-field measurements and is applicable to a large class of samples including polymers and biological matter. We verify the capabilities by determining the local dielectric permittivity of a PMMA film from nano-FTIR measurements, which is in excellent agreement with far-field ellipsometric data. We further obtain local infrared absorption spectra with unprecedented accuracy in peak position and shape, which is the key to quantitative chemometrics on the nanometer scale.

  18. Numerical analysis of quantitative measurement of hydroxyl radical concentration using laser-induced fluorescence in flame

    NASA Astrophysics Data System (ADS)

    Shuang, Chen; Tie, Su; Yao-Bang, Zheng; Li, Chen; Ting-Xu, Liu; Ren-Bing, Li; Fu-Rong, Yang

    2016-06-01

    The aim of the present work is to quantitatively measure the hydroxyl radical concentration in flames by using laser-induced fluorescence (LIF). Detailed physical models of spectral absorption lineshape broadening, collisional transition and quenching at elevated pressure are built. The fine energy-level structure of the OH molecule is illustrated to clarify both laser-induced fluorescence emission and the competing non-radiative processes, which include collisional quenching, rotational energy transfer (RET), and vibrational energy transfer (VET). On this basis, numerical simulations are performed to evaluate the fluorescence yield at elevated pressure. These results are useful for understanding the real physical processes in the OH-LIF technique and for finding a way to calibrate the signal for quantitative measurement of OH concentration in a practical combustor. Project supported by the National Natural Science Foundation of China (Grant No. 11272338) and the Fund from the Science and Technology on Scramjet Key Laboratory, China (Grant No. STSKFKT2013004).
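The pressure dependence of the fluorescence yield can be sketched with a Stern-Volmer-type expression: the yield is the radiative rate divided by the sum of radiative and quenching rates, and the collisional quenching rate grows roughly linearly with pressure. The rate constants below are illustrative assumptions, not values from the paper:

```python
def fluorescence_yield(a_rad, q_quench):
    """Fraction of excited OH that fluoresces: A / (A + Q) (Stern-Volmer form)."""
    return a_rad / (a_rad + q_quench)

A_RAD = 1.4e6        # assumed spontaneous-emission rate, 1/s
Q_PER_BAR = 5.0e8    # assumed collisional quenching rate per bar, 1/s

yield_1bar = fluorescence_yield(A_RAD, Q_PER_BAR * 1.0)
yield_10bar = fluorescence_yield(A_RAD, Q_PER_BAR * 10.0)
# Raising the pressure tenfold cuts the yield by roughly a factor of ten,
# which is why quantitative OH-LIF at elevated pressure needs a quenching correction.
```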

  19. Quantitative imaging of the human upper airway: instrument design and clinical studies

    NASA Astrophysics Data System (ADS)

    Leigh, M. S.; Armstrong, J. J.; Paduch, A.; Sampson, D. D.; Walsh, J. H.; Hillman, D. R.; Eastwood, P. R.

    2006-08-01

    Imaging of the human upper airway is widely used in medicine, in both clinical practice and research. Common imaging modalities include video endoscopy, X-ray CT, and MRI. However, no current modality is both quantitative and safe to use for extended periods of time. Such a capability would be particularly valuable for sleep research, which is inherently reliant on long observation sessions. We have developed an instrument capable of quantitative imaging of the human upper airway, based on endoscopic optical coherence tomography. There are no dose limits for optical techniques, and the minimally invasive imaging probe is safe for use in overnight studies. We report on the design of the instrument and its use in preliminary clinical studies, and we present results from a range of initial experiments. The experiments show that the instrument is capable of imaging during sleep, and that it can record dynamic changes in airway size and shape. This information is useful for research into sleep disorders, and potentially for clinical diagnosis and therapies.

  20. Pansharpening on the Narrow Vnir and SWIR Spectral Bands of SENTINEL-2

    NASA Astrophysics Data System (ADS)

    Vaiopoulos, A. D.; Karantzalos, K.

    2016-06-01

    In this paper, results from the evaluation of several state-of-the-art pansharpening techniques are presented for the VNIR and SWIR bands of Sentinel-2. A pansharpening procedure is also proposed which aims to respect the closest spectral similarities between the higher- and lower-resolution bands. The evaluation included 21 different fusion algorithms and three evaluation frameworks based on both standard quantitative image similarity indexes and qualitative evaluation by remote sensing experts. The overall analysis indicated that the remote sensing experts disagreed with the outcomes and method rankings of the quantitative assessment. The employed image quality similarity indexes and the quantitative evaluation frameworks from the literature, based on both full- and reduced-resolution data, failed to adequately evaluate the spatial information that was injected into the lower-resolution images. Regarding the SWIR bands, none of the methods managed to deliver significantly better results than a standard bicubic interpolation of the original low-resolution bands.
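One widely used quantitative index in such evaluations is ERGAS (relative dimensionless global error in synthesis); a minimal implementation is shown below. The index itself is standard, but the bands and resolution ratio here are synthetic:

```python
import numpy as np

def ergas(reference, fused, ratio):
    """ERGAS = 100/ratio * sqrt(mean_b(RMSE_b^2 / mean_b^2)); lower is better.
    `ratio` is the low-to-high resolution ratio (e.g. 2 for 20 m -> 10 m)."""
    ref = np.asarray(reference, float)  # shape: (bands, rows, cols)
    fus = np.asarray(fused, float)
    terms = []
    for b in range(ref.shape[0]):
        rmse2 = np.mean((ref[b] - fus[b]) ** 2)
        terms.append(rmse2 / np.mean(ref[b]) ** 2)
    return 100.0 / ratio * np.sqrt(np.mean(terms))

rng = np.random.default_rng(0)
ref = rng.uniform(100, 200, size=(4, 32, 32))  # synthetic 4-band reference
perfect = ref.copy()                           # exact reconstruction
noisy = ref + rng.normal(0, 5, size=ref.shape) # degraded reconstruction

e_perfect = ergas(ref, perfect, ratio=2)  # 0 for an exact reconstruction
e_noisy = ergas(ref, noisy, ratio=2)
```

Note that ERGAS, like most spectral-fidelity indexes, is insensitive to exactly the property the experts judged, namely how much genuine spatial detail was injected, which is consistent with the disagreement reported above.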

  1. Olfactory dysfunction and its measurement in the clinic and workplace.

    PubMed

    Doty, Richard L

    2006-04-01

    To provide an overview of practical means for quantitatively assessing the sense of smell in both the clinic and workplace. To address basic measurement issues, including those of test sensitivity, specificity, and reliability. To describe and discuss factors that influence olfactory function, including airborne toxins commonly found in industrial settings. Selective review and discussion. A number of well-validated practical threshold and suprathreshold tests are available for assessing smell function. The reliability, sensitivity, and specificity of such techniques vary, being influenced by such factors as test length and type. Numerous subject factors, including age, sex, health, medications, and exposure to environmental toxins, particularly heavy metals, influence the ability to smell. Modern advances in technology, in conjunction with better occupational medicine practices, now make it possible to reliably monitor and limit occupational exposures to hazardous chemicals and their potential adverse influences on the sense of smell. Quantitative olfactory testing is critical to establish the presence or absence of such adverse influences, as well as to (a) detect malingering, (b) establish disability compensation, and (c) monitor function over time.

  2. Classification of normal and malignant human gastric mucosa tissue with confocal Raman microspectroscopy and wavelet analysis

    NASA Astrophysics Data System (ADS)

    Hu, Yaogai; Shen, Aiguo; Jiang, Tao; Ai, Yong; Hu, Jiming

    2008-02-01

    Thirty-two human gastric mucosa tissue samples, including 13 normal and 19 malignant, were measured by confocal Raman microspectroscopy. Low signal-to-background ratio spectra from human gastric mucosa tissues were obtained by this technique without any sample preparation. Raman spectral interferences include a broad, featureless sloping background due to fluorescence, as well as noise; they mask most Raman spectral features and lead to problems with the precision and quantitation of the original spectral information. A preprocessing algorithm based on wavelet analysis was used to reduce noise and eliminate the background/baseline of the Raman spectra. Comparing preprocessed spectra of malignant gastric mucosa tissues with those of normal counterparts revealed obvious spectral changes, including an intensity increase at ~1156 cm⁻¹ and an intensity decrease at ~1587 cm⁻¹. A quantitative criterion based upon the intensity ratio of the ~1156 and ~1587 cm⁻¹ bands was extracted for classification of the normal and malignant gastric mucosa tissue samples. This could result in a new diagnostic method, which would assist the early diagnosis of gastric cancer.
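The ratio criterion described above reduces to a few lines of code; the spectra below are synthetic Gaussian bands, not measured tissue spectra:

```python
import numpy as np

shifts = np.arange(800, 1801)  # Raman shift axis, cm^-1

def gauss(center, amp, width=8.0):
    """Synthetic Gaussian band on the shift axis."""
    return amp * np.exp(-0.5 * ((shifts - center) / width) ** 2)

def band_intensity(spectrum, center, half_width=10):
    """Peak intensity within +/- half_width of a band center."""
    mask = np.abs(shifts - center) <= half_width
    return spectrum[mask].max()

def ratio_1156_1587(spectrum):
    return band_intensity(spectrum, 1156) / band_intensity(spectrum, 1587)

# Synthetic spectra mimicking the reported trend: the ~1156 cm^-1 band is
# weaker than ~1587 cm^-1 in normal tissue and stronger in malignant tissue.
normal_like = gauss(1156, 1.0) + gauss(1587, 2.0)
malignant_like = gauss(1156, 2.0) + gauss(1587, 1.0)

r_normal = ratio_1156_1587(normal_like)
r_malignant = ratio_1156_1587(malignant_like)
```

In practice this ratio would only be computed after the wavelet-based denoising and baseline removal described in the abstract; raw spectra are dominated by the fluorescence background.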

  3. Quantitative imaging technique using the layer-stripping algorithm

    NASA Astrophysics Data System (ADS)

    Beilina, L.

    2017-07-01

    We present the layer-stripping algorithm for the solution of the hyperbolic coefficient inverse problem (CIP). Our numerical examples show quantitative reconstruction of small tumor-like inclusions in two dimensions.

  4. Response of Selected Microorganisms to Experimental Planetary Environments

    NASA Technical Reports Server (NTRS)

    Foster, T. L.; Winans, L., Jr.

    1977-01-01

    Results of studies in anaerobic phosphorus metabolism are presented. Specific topics discussed include: (1) anaerobic utilization of PH3; (2) reduction of phosphate or phosphite; (3) isolation of organisms which utilize phosphite or phosphate anaerobically as a final hydrogen acceptor; and (4) the toxicity of PH3 to the organisms. Techniques of anaerobic microbiology associated with space hardware were also studied. These include: (1) the Brewer anaerobe jar/GasPak system; (2) a new procedure to grow aerobes and anaerobes simultaneously; (3) a culture medium to differentiate oblagate from facultative anaerobes; and (4) a procedure to quantitate O2 sensitivity of anaerobes.

  5. LC–MS/MS Quantitation of Esophagus Disease Blood Serum Glycoproteins by Enrichment with Hydrazide Chemistry and Lectin Affinity Chromatography

    PubMed Central

    2015-01-01

    Changes in glycosylation have been shown to have a profound correlation with development/malignancy in many cancer types. Currently, two major enrichment techniques are widely applied in glycoproteomics, namely lectin affinity chromatography (LAC)-based and hydrazide chemistry (HC)-based enrichment. Here we report LC–MS/MS quantitative analyses of human blood serum glycoproteins and glycopeptides associated with esophageal diseases using LAC- and HC-based enrichment. Separate and complementary qualitative and quantitative analyses of protein glycosylation were performed with both enrichment techniques. Chemometric and statistical evaluations (PCA plots and ANOVA tests, respectively) were employed to determine and confirm candidate cancer-associated glycoprotein/glycopeptide biomarkers. Of 139 identified glycoproteins, 59 (a 42% overlap) were observed with both enrichment techniques, an overlap very similar to previously published studies. The quantitation and evaluation of significantly changed glycoproteins/glycopeptides are complementary between LAC and HC enrichments. LC–ESI–MS/MS analyses indicated that 7 glycoproteins enriched by LAC and 11 glycoproteins enriched by HC showed significantly different abundances between disease-free and disease cohorts. Multiple reaction monitoring quantitation found 13 glycopeptides by LAC enrichment and 10 glycosylation sites by HC enrichment to be statistically different among disease cohorts. PMID:25134008

  6. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    PubMed

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  7. Technique for quantitative RT-PCR analysis directly from single muscle fibers.

    PubMed

    Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M

    2008-07-01

    The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without first extracting the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and either underwent RNA isolation (technique 1) or were placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no-RNA-extraction method provided quantitative PCR data similar to those of the RNA extraction method. A third technique was also tested, in which one-quarter of an individual fiber's cDNA was used for PCR (not pooled); the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no-RNA-extraction technique was then tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold increase in expression after resistance exercise, consistent with previous observations. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
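
    A fold change like the one quoted above is the kind of figure produced by the standard 2^(−ΔΔCt) (Livak) relative-quantitation calculation. A minimal sketch follows; the Ct values and the choice of reference gene are made up for illustration, not taken from the study:

```python
def fold_change_ddct(ct_target_exp, ct_ref_exp, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ddCt (Livak) method.

    Ct is the qPCR cycle threshold; each earlier cycle is roughly a
    factor of two, and a lower Ct means more starting template.
    """
    d_ct_exp = ct_target_exp - ct_ref_exp    # normalize to reference gene
    d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
    dd_ct = d_ct_exp - d_ct_ctrl             # exercised vs. rested condition
    return 2.0 ** -dd_ct

# Hypothetical Ct values in which the target gene amplifies 3.8 cycles
# earlier after exercise (reference gene unchanged):
print(fold_change_ddct(22.2, 18.0, 26.0, 18.0))  # ≈ 13.9-fold
```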

  8. Study of fault-tolerant software technology

    NASA Technical Reports Server (NTRS)

    Slivinski, T.; Broglio, C.; Wild, C.; Goldberg, J.; Levitt, K.; Hitt, E.; Webb, J.

    1984-01-01

    Presented is an overview of the current state of the art of fault-tolerant software and an analysis of the quantitative techniques and models developed to assess its impact. It examines research efforts as well as experience gained from commercial application of these techniques. The paper also addresses the implications, for computer architecture and for the design of hardware, operating systems, and programming languages (including Ada), of using fault-tolerant software in real-time aerospace applications. It concludes that fault-tolerant software has progressed beyond the pure research stage. The paper also finds that, although not perfectly matched, newer architectural and language capabilities provide many of the notations and functions needed to implement software fault-tolerance effectively and efficiently.

  9. Nuclear Resonance Fluorescence for Materials Assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quiter, Brian; Ludewigt, Bernhard; Mozin, Vladimir

    This paper discusses the use of nuclear resonance fluorescence (NRF) techniques for the isotopic and quantitative assay of radioactive material. Potential applications include age-dating of an unknown radioactive source, pre- and post-detonation nuclear forensics, and safeguards for nuclear fuel cycles. Examples of age-dating a strong radioactive source and assaying a spent fuel pin are discussed. The modeling work has been performed with the Monte Carlo radiation transport computer code MCNPX, and the capability to simulate NRF has been added to the code. Also discussed are the limitations of MCNPX's photon transport physics in accurately describing photon scattering processes that contribute significantly to the background and impact the applicability of the NRF assay technique.

  10. Characterization of Thermal and Mechanical Impact on Aluminum Honeycomb Structures

    NASA Technical Reports Server (NTRS)

    Robinson, Christen M.

    2013-01-01

    This study supports NASA Kennedy Space Center's research in the areas of intelligent thermal management systems and multifunctional thermal systems. The project addresses the evaluation of the mechanical and thermal properties of metallic cellular solid (MCS) materials: materials that are lightweight, high-strength, tunable, multifunctional, and affordable. A portion of the work involves understanding the mechanical properties of honeycomb-structured cellular solids upon impact testing under ambient, water-immersed, liquid-nitrogen-cooled, and liquid-nitrogen-immersed conditions. Additionally, the study addresses techniques for characterizing the aluminum honeycomb's ability to resist multiple high-rate loadings or impacts in varying environmental conditions, using various methods for the quantitative and qualitative assessment of commercial applicability.

  11. Frequency response of electrochemical cells

    NASA Technical Reports Server (NTRS)

    Thomas, Daniel L.

    1990-01-01

    The main objective was to examine the feasibility of using frequency response techniques (1) as a tool in destructive physical analysis of batteries, particularly for estimating electrode structural parameters such as specific area, porosity, and tortuosity, and (2) as a non-destructive testing technique for obtaining information such as state of charge and acceptability for space flight. The phenomena that contribute to the frequency response of an electrode include: (1) double-layer capacitance; (2) Faradaic reaction resistance; (3) mass transfer (Warburg) impedance; and (4) ohmic solution resistance. Nickel-cadmium cells were investigated in solutions of KOH. A significant amount of data was acquired. Quantitative data analysis, using the developed software, is planned for the future.

  12. Silicon Heterojunction System Field Performance

    DOE PAGES

    Jordan, Dirk C.; Deline, Chris; Johnston, Steve; ...

    2017-11-17

    A silicon heterostructure photovoltaic system fielded for 10 years has been investigated in detail. The system has shown degradation, but at a rate similar to that of an average Si system, and still within the module warranty level. The power decline is dominated by a nonlinear Voc loss rather than the more typical changes in Isc or fill factor. Modules have been evaluated using multiple techniques, including dark and light I-V measurements, Suns-Voc, thermal imaging, and quantitative electroluminescence. All techniques indicate that recombination and series resistance in the cells have increased, along with a factor-of-two decrease in minority carrier lifetime. Performance changes are fairly uniform across the module, indicating that changes occur primarily within the cells.

  13. Exploring Techniques for Vision Based Human Activity Recognition: Methods, Systems, and Evaluation

    PubMed Central

    Xu, Xin; Tang, Jinshan; Zhang, Xiaolong; Liu, Xiaoming; Zhang, Hong; Qiu, Yimin

    2013-01-01

    With the wide applications of vision based intelligent systems, image and video analysis technologies have attracted the attention of researchers in the computer vision field. In image and video analysis, human activity recognition is an important research direction. By interpreting and understanding human activities, we can recognize and predict the occurrence of crimes and help the police or other agencies react immediately. In the past, a large number of papers have been published on human activity recognition in video and image sequences. In this paper, we provide a comprehensive survey of the recent development of the techniques, including methods, systems, and quantitative evaluation of the performance of human activity recognition. PMID:23353144

  14. Robust, Decoupled, Flight Control Design with Rate Saturating Actuators

    NASA Technical Reports Server (NTRS)

    Snell, S. A.; Hess, R. A.

    1997-01-01

    Techniques for the design of control systems for manually controlled, high-performance aircraft must provide the following: (1) multi-input, multi-output (MIMO) solutions; (2) acceptable handling qualities, including no tendencies for pilot-induced oscillations; (3) a tractable approach to compensator design; (4) performance and stability robustness in the presence of significant plant uncertainty; and (5) performance and stability robustness in the presence of actuator saturation (particularly rate saturation). A design technique built upon Quantitative Feedback Theory is offered as a candidate methodology which can provide flight control systems meeting these requirements, and do so over a considerable part of the flight envelope. An example utilizing a simplified model of a supermaneuverable fighter aircraft demonstrates the proposed design methodology.

  15. Integrating multiparametric prostate MRI into clinical practice

    PubMed Central

    2011-01-01

    Abstract Multifunctional magnetic resonance imaging (MRI) techniques are increasingly being used to address bottlenecks in prostate cancer patient management. These techniques yield qualitative, semi-quantitative and fully quantitative biomarkers that reflect on the underlying biological status of a tumour. If these techniques are to have a role in patient management, then standard methods of data acquisition, analysis and reporting have to be developed. Effective communication by the use of scoring systems, structured reporting and a graphical interface that matches prostate anatomy are key elements. Practical guidelines for integrating multiparametric MRI into clinical practice are presented. PMID:22187067

  16. Fluorescence-based Western blotting for quantitation of protein biomarkers in clinical samples.

    PubMed

    Zellner, Maria; Babeluk, Rita; Diestinger, Michael; Pirchegger, Petra; Skeledzic, Senada; Oehler, Rudolf

    2008-09-01

    Since most high-throughput techniques used in biomarker discovery are very time- and cost-intensive, highly specific and quantitative alternative analytical methods are needed for routine analysis. Conventional Western blotting allows detection of specific proteins down to the level of single isotypes, but its quantitative accuracy is rather limited. We report a novel and improved quantitative Western blotting method. The use of fluorescently labelled secondary antibodies strongly extends the dynamic range of the quantitation and improves the correlation with the protein amount (r=0.997). By additionally staining all proteins fluorescently immediately after their transfer to the blot membrane, it is possible to visualise the antibody binding and the total protein profile simultaneously. This allows for an accurate correction for protein load. Applying this normalisation, it could be demonstrated that fluorescence-based Western blotting is able to reproduce a quantitative analysis of two specific proteins in blood platelet samples from 44 subjects with different diseases, as initially conducted by 2D-DIGE. These results show that the proposed fluorescence-based Western blotting is an adequate technique for biomarker quantitation and suggest further possible applications.
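
    The load correction described in this abstract amounts to scaling each band's fluorescence by its lane's total-protein stain. A minimal sketch with invented signal values and lane names:

```python
def normalize_band(band_signal, lane_total, reference_total):
    """Correct a band's fluorescence for unequal protein load by scaling
    with the ratio of a reference lane's total-protein stain to this
    lane's total-protein stain."""
    return band_signal * (reference_total / lane_total)

# Invented numbers: both lanes give the same raw band signal, but
# patient_B's lane carries twice the total protein.
lanes = {"patient_A": (1500.0, 3.0e5), "patient_B": (1500.0, 6.0e5)}
reference_total = 3.0e5
corrected = {name: normalize_band(band, total, reference_total)
             for name, (band, total) in lanes.items()}
print(corrected)  # patient_B's signal halves once its double load is corrected
```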

  17. Thermal Nondestructive Characterization of Corrosion in Boiler Tubes by Application of a Moving Line Heat Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Wall thinning in utility boiler waterwall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proved to be very labor-intensive and slow, resulting in a "spot check" approach to inspections in which thickness measurements cover only a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves moving a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper presents a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source, coupled with this analysis technique, represents a significant improvement in inspection speed for large structures such as boiler waterwalls, while still providing high-resolution thickness measurements. A theoretical basis for the technique is presented, demonstrating its quantitative nature. Further, results of laboratory experiments on flat panel specimens with fabricated material loss regions are presented.

  18. Focused Group Interviews as an Innovative Quanti-Qualitative Methodology (QQM): Integrating Quantitative Elements into a Qualitative Methodology

    ERIC Educational Resources Information Center

    Grim, Brian J.; Harmon, Alison H.; Gromis, Judy C.

    2006-01-01

    There is a sharp divide between quantitative and qualitative methodologies in the social sciences. We investigate an innovative way to bridge this gap that incorporates quantitative techniques into a qualitative method, the "quanti-qualitative method" (QQM). Specifically, our research utilized small survey questionnaires and experiment-like…

  19. Functional magnetic resonance imaging in oncology: state of the art*

    PubMed Central

    Guimaraes, Marcos Duarte; Schuch, Alice; Hochhegger, Bruno; Gross, Jefferson Luiz; Chojniak, Rubens; Marchiori, Edson

    2014-01-01

    In the investigation of tumors with conventional magnetic resonance imaging, both quantitative characteristics, such as size, edema, necrosis, and presence of metastases, and qualitative characteristics, such as contrast enhancement degree, are taken into consideration. However, changes in cell metabolism and tissue physiology which precede morphological changes cannot be detected by the conventional technique. The development of new magnetic resonance imaging techniques has enabled the functional assessment of the structures in order to obtain information on the different physiological processes of the tumor microenvironment, such as oxygenation levels, cellularity and vascularity. The detailed morphological study in association with the new functional imaging techniques allows for an appropriate approach to cancer patients, including the phases of diagnosis, staging, response evaluation and follow-up, with a positive impact on their quality of life and survival rate. PMID:25741058

  20. Non-interferometric phase retrieval using refractive index manipulation

    PubMed Central

    Chen, Chyong-Hua; Hsu, Hsin-Feng; Chen, Hou-Ren; Hsieh, Wen-Feng

    2017-01-01

    We present a novel, inexpensive, non-interferometric technique that retrieves phase images using a liquid crystal phase shifter with no physically moving parts. First, we derive a new equation for the intensity-phase relation with respect to the change of refractive index, similar to the transport-of-intensity equation. The equation shows that the technique does not need to account for variations in magnification between optical images. As a proof of concept, we use a liquid crystal mixture, MLC 2144, to fabricate a phase shifter and capture optical images in rapid succession by electrically tuning the applied voltage of the phase shifter. Experimental results demonstrate that this technique is capable of reconstructing high-resolution phase images and of quantitatively recovering the thickness profile of a microlens array. PMID:28387382

  1. Terahertz pulsed imaging as an advanced characterisation tool for film coatings--a review.

    PubMed

    Haaser, Miriam; Gordon, Keith C; Strachan, Clare J; Rades, Thomas

    2013-12-05

    Solid dosage forms are the pharmaceutical drug delivery systems of choice for oral drug delivery. These solid dosage forms are often coated to modify the physico-chemical properties of the active pharmaceutical ingredients (APIs), in particular to alter release kinetics. Since the product performance of coated dosage forms is a function of their critical coating attributes, including coating thickness, uniformity, and density, more advanced quality control techniques than weight gain are required. A recently introduced non-destructive method to quantitatively characterise coating quality is terahertz pulsed imaging (TPI). The ability of terahertz radiation to penetrate many pharmaceutical materials enables structural features of coated solid dosage forms to be probed at depth, which is not readily achievable with other established imaging techniques, e.g. near-infrared (NIR) and Raman spectroscopy. In this review TPI is introduced and various applications of the technique in pharmaceutical coating analysis are discussed. These include evaluation of coating thickness, uniformity, surface morphology, density, defects and buried structures as well as correlation between TPI measurements and drug release performance, coating process monitoring and scale up. Furthermore, challenges and limitations of the technique are discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
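
    Coating thickness in TPI is commonly recovered from the time delay between the terahertz reflections at the coating surface and at the coating/core interface. A minimal sketch of that time-of-flight relation; the refractive index and delay values below are illustrative assumptions, not figures from this review:

```python
C0 = 299_792_458.0  # speed of light in vacuum, m/s

def coating_thickness(echo_delay_s, refractive_index):
    """Thickness from the delay between the surface echo and the
    coating/core interface echo: d = c * dt / (2 n). The factor of two
    appears because the pulse crosses the coating twice."""
    return C0 * echo_delay_s / (2.0 * refractive_index)

# Illustrative values: 1.0 ps echo separation, coating index n = 1.5
print(coating_thickness(1.0e-12, 1.5) * 1e6, "micrometres")  # ≈ 100 um
```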

  2. Carotid lesion characterization by synthetic-aperture-imaging techniques with multioffset ultrasonic probes

    NASA Astrophysics Data System (ADS)

    Capineri, Lorenzo; Castellini, Guido; Masotti, Leonardo F.; Rocchi, Santina

    1992-06-01

    This paper explores the application of a high-resolution imaging technique to vascular ultrasound diagnosis, with emphasis on investigation of the carotid vessel. With present diagnostic systems it is difficult to measure quantitatively the extent of lesions and to characterize the tissue; quantitative images require sufficient spatial resolution and dynamic range to reveal fine, high-risk pathologies. A broadband synthetic aperture technique with multi-offset probes is developed to improve lesion characterization through the evaluation of local scattering parameters. The technique assumes weak scatterers embedded in a constant-velocity medium, a large aperture, and isotropic sources and receivers. Its features are: axial and lateral spatial resolution of the order of the wavelength, high dynamic range, quantitative measurement of the size and scattering intensity of inhomogeneities, and the capability to investigate inclined layers. Performance under realistic conditions is evaluated with a software simulator in which different experimental situations can be reproduced. Images of simulated anatomic test objects are presented. The images are obtained by an inversion process applied to the synthesized ultrasonic signals, collected on the linear aperture by a limited number of finite-size transducers.
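
    A minimal monostatic delay-and-sum reconstruction conveys the synthetic-aperture idea, although the paper's multi-offset, multi-receiver inversion is more general; the array geometry and sampling values below are invented for illustration:

```python
import numpy as np

def delay_and_sum(ascans, elem_x, fs, c, xs, zs):
    """Monostatic synthetic-aperture (delay-and-sum) reconstruction.

    ascans : (n_elements, n_samples) pulse-echo traces, one per element
    elem_x : x positions of the transducer elements (m)
    fs     : sampling rate (Hz);  c : sound speed in the medium (m/s)
    xs, zs : lateral and depth coordinates of the image grid (m)
    """
    image = np.zeros((len(zs), len(xs)))
    n_samples = ascans.shape[1]
    for trace, ex in zip(ascans, elem_x):
        for iz, z in enumerate(zs):
            for ix, x in enumerate(xs):
                t = 2.0 * np.hypot(x - ex, z) / c  # round-trip travel time
                s = int(round(t * fs))
                if s < n_samples:
                    image[iz, ix] += trace[s]      # coherent summation
    return image
```

    Echoes from a true scatterer add coherently at its pixel, since the round-trip delay matches for every element position, so the summed image peaks there while clutter averages down.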

  3. Cerebral blood flow and autoregulation: current measurement techniques and prospects for noninvasive optical methods

    PubMed Central

    Fantini, Sergio; Sassaroli, Angelo; Tgavalekos, Kristen T.; Kornbluth, Joshua

    2016-01-01

    Abstract. Cerebral blood flow (CBF) and cerebral autoregulation (CA) are critically important to maintain proper brain perfusion and supply the brain with the necessary oxygen and energy substrates. Adequate brain perfusion is required to support normal brain function, to achieve successful aging, and to navigate acute and chronic medical conditions. We review the general principles of CBF measurements and the current techniques to measure CBF based on direct intravascular measurements, nuclear medicine, X-ray imaging, magnetic resonance imaging, ultrasound techniques, thermal diffusion, and optical methods. We also review techniques for arterial blood pressure measurements as well as theoretical and experimental methods for the assessment of CA, including recent approaches based on optical techniques. The assessment of cerebral perfusion in the clinical practice is also presented. The comprehensive description of principles, methods, and clinical requirements of CBF and CA measurements highlights the potentially important role that noninvasive optical methods can play in the assessment of neurovascular health. In fact, optical techniques have the ability to provide a noninvasive, quantitative, and continuous monitor of CBF and autoregulation. PMID:27403447

  4. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    PubMed Central

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of protein to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. A range of biological specimens, including mammalian tissues and model cell culture systems, is also discussed. PMID:25664860

  5. Designing dual-plate meteoroid shields: A new analysis

    NASA Technical Reports Server (NTRS)

    Swift, H. F.; Bamford, R.; Chen, R.

    1982-01-01

    Physics governing ultrahigh velocity impacts onto dual-plate meteor armor is discussed. Meteoroid shield design methodologies are considered: failure mechanisms, qualitative features of effective meteoroid shield designs, evaluating/processing meteoroid threat models, and quantitative techniques for optimizing effective meteoroid shield designs. Related investigations are included: use of Kevlar cloth/epoxy panels in meteoroid shields for the Halley's Comet intercept vehicle, mirror exposure dynamics, and evaluation of ion fields produced around the Halley Intercept Mission vehicle by meteoroid impacts.

  6. Integrating prediction, provenance, and optimization into high energy workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schram, M.; Bansal, V.; Friese, R. D.

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  7. Radar Remote Sensing

    NASA Technical Reports Server (NTRS)

    Rosen, Paul A.

    2012-01-01

    This lecture was just a taste of radar remote sensing techniques and applications. Other important areas include stereo radargrammetry, PolInSAR for volumetric structure mapping, and applications such as agricultural monitoring, soil moisture estimation, and ice mapping. The broad range of sensor types, frequencies of observation, and availability of sensors have enabled radar sensors to make significant contributions across the earth and planetary remote sensing sciences. The range of applications, both qualitative and quantitative, continues to expand with each new generation of sensors.

  8. Method of detecting and counting bacteria

    NASA Technical Reports Server (NTRS)

    Picciolo, G. L.; Chappelle, E. W. (Inventor)

    1976-01-01

    An improved method is provided for determining bacterial levels, especially in samples of aqueous physiological fluids. The method depends on the quantitative determination of bacterial adenosine triphosphate (ATP) in the presence of nonbacterial ATP. The bacterial ATP is released by cell rupture and is measured by an enzymatic bioluminescent assay. A concentration technique is included to make the method more sensitive. It is particularly useful where the fluid to be measured contains an unknown or low bacteria count.
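
    The assay logic (subtract nonbacterial ATP, then convert the remaining bioluminescence to a cell count through a calibration) can be sketched as follows; the calibration constants are invented for illustration and are not from the patent record:

```python
def bacteria_count(light_total, light_nonbacterial, light_per_mol_atp,
                   atp_mol_per_cell, concentration_factor=1.0):
    """Estimate a bacterial count from a bioluminescent ATP assay.

    Bacterial ATP is the luminescence remaining after subtracting the
    nonbacterial ATP signal; dividing by the per-cell ATP content
    converts moles of ATP into a cell count. The concentration factor
    undoes any pre-assay sample concentration step.
    """
    bacterial_atp_mol = (light_total - light_nonbacterial) / light_per_mol_atp
    cells = bacterial_atp_mol / atp_mol_per_cell
    return cells / concentration_factor

# Hypothetical calibration: 1e18 light units per mol ATP, 1e-18 mol ATP/cell
print(bacteria_count(5.0e4, 1.0e4, 1.0e18, 1.0e-18))  # ≈ 4.0e4 cells
```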

  9. Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3

    NASA Technical Reports Server (NTRS)

    Brooks, Howard L.

    1986-01-01

    In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in the Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles, produced by refractive index gradients, as small as 0.5 milliradian. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps of the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS-51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell, and the data gathered with the quantitative schlieren analysis technique are consistent with a diffusion-limited growth process.
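
    The deviation angles measured by the schlieren system map to refractive-index (and hence solute-concentration) gradients through the standard small-angle deflection relation. A minimal sketch; the cell path length and the dn/dc value are assumed for illustration, not taken from the abstract:

```python
def index_gradient(deviation_angle_rad, path_length_m):
    """Schlieren deflection relation: epsilon = L * dn/dy, so
    dn/dy = epsilon / L (small angles, uniform gradient along the path)."""
    return deviation_angle_rad / path_length_m

def concentration_gradient(dn_dy, dn_dc):
    """Convert an index gradient to a solute concentration gradient."""
    return dn_dy / dn_dc

# 0.5 mrad deviation across an assumed 10 mm cell -> dn/dy = 0.05 per metre;
# dn/dc below is an illustrative placeholder for a TGS solution.
grad_n = index_gradient(0.5e-3, 0.010)
grad_c = concentration_gradient(grad_n, 1.5e-4)
print(grad_n, grad_c)
```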

  10. Tracking Drug-induced Changes in Receptor Post-internalization Trafficking by Colocalizational Analysis.

    PubMed

    Ong, Edmund; Cahill, Catherine

    2015-07-03

    The intracellular trafficking of receptors is a collection of complex and highly controlled processes. Receptor trafficking modulates signaling and overall cell responsiveness to ligands and is, itself, influenced by intra- and extracellular conditions, including ligand-induced signaling. Optimized for use with monolayer-plated cultured cells, but extendable to free-floating tissue slices, this protocol uses immunolabelling and colocalizational analysis to track changes in intracellular receptor trafficking following both chronic/prolonged and acute interventions, including exogenous drug treatment. After drug treatment, cells are double-immunolabelled for the receptor and for markers for the intracellular compartments of interest. Sequential confocal microscopy is then used to capture two-channel photomicrographs of individual cells, which are subjected to computerized colocalizational analysis to yield quantitative colocalization scores. These scores are normalized to permit pooling of independent replicates prior to statistical analysis. Representative photomicrographs may also be processed to generate illustrative figures. Here, we describe a powerful and flexible technique for quantitatively assessing induced receptor trafficking.
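
    The computerized colocalization scoring described in the protocol can take several forms; Pearson's correlation coefficient between the two channels is one widely used score, sketched here as an illustration (the protocol's exact scoring and normalization procedure is not specified in this abstract):

```python
import numpy as np

def pearson_colocalization(channel_a, channel_b):
    """Pixel-wise Pearson correlation between two fluorescence channels.

    Values near +1 indicate strong colocalization of the two labels
    (e.g. receptor vs. compartment marker); values near 0, little overlap.
    """
    a = channel_a.ravel().astype(float)
    b = channel_b.ravel().astype(float)
    a -= a.mean()
    b -= b.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))
```

    Per-cell scores computed this way can then be normalized to a control condition before pooling independent replicates, as the protocol describes.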

  11. Ultrasonic NDE Simulation for Composite Manufacturing Defects

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.

    2016-01-01

    The increased use of composites in aerospace components is expected to continue into the future. The large-scale use of composites in aerospace necessitates the development of composite-appropriate nondestructive evaluation (NDE) methods to quantitatively characterize defects in as-manufactured parts and damage incurred during or after manufacturing. Ultrasonic techniques are among the most common approaches for defect/damage detection in composite materials. One key technical challenge area in NASA's Advanced Composites Project is to develop optimized rapid inspection methods for composite materials. Common manufacturing defects in carbon fiber reinforced polymer (CFRP) composites include fiber waviness (in-plane and out-of-plane), porosity, and disbonds, among others. This paper is an overview of ongoing work to develop ultrasonic wavefield-based methods for characterizing manufacturing waviness defects. The paper describes the development and implementation of a custom ultrasound simulation tool used to model ultrasonic wave interaction with in-plane fiber waviness (also known as marcelling). Wavefield data processing methods are applied to the simulation data to explore possible routes for quantitative defect characterization.

  12. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide quantitative sample amplitude and phase distributions free of aberration. However, it requires field-of-view (FoV) scanning, which often relies on mechanical translation; this not only slows down measurement but also introduces mechanical errors that decrease both the resolution and accuracy of the retrieved information. To achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning, controlled by on/off state coding of the micromirrors. Measurements on biological samples as well as a USAF resolution target demonstrate the high resolution of quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed the DMD-based PIE technique provides a potential solution for medical observation and measurement.

  13. Quantitative Visualization of Salt Concentration Distributions in Lithium-Ion Battery Electrolytes during Battery Operation Using X-ray Phase Imaging.

    PubMed

    Takamatsu, Daiko; Yoneyama, Akio; Asari, Yusuke; Hirano, Tatsumi

    2018-02-07

    A fundamental understanding of concentrations of salts in lithium-ion battery electrolytes during battery operation is important for optimal operation and design of lithium-ion batteries. However, there are few techniques that can be used to quantitatively characterize salt concentration distributions in the electrolytes during battery operation. In this paper, we demonstrate that in operando X-ray phase imaging can quantitatively visualize the salt concentration distributions that arise in electrolytes during battery operation. From quantitative evaluation of the concentration distributions at steady states, we obtained the salt diffusivities in electrolytes with different initial salt concentrations. Because of no restriction on samples and high temporal and spatial resolutions, X-ray phase imaging will be a versatile technique for evaluating electrolytes, both aqueous and nonaqueous, of many electrochemical systems.

  14. NEXRAD quantitative precipitation estimates, data acquisition, and processing for the DuPage County, Illinois, streamflow-simulation modeling system

    USGS Publications Warehouse

    Ortel, Terry W.; Spies, Ryan R.

    2015-11-19

    Next-Generation Radar (NEXRAD) has become an integral component in the estimation of precipitation (Kitzmiller and others, 2013). The high spatial and temporal resolution of NEXRAD has revolutionized the ability to estimate precipitation across vast regions, which is especially beneficial in areas without a dense rain-gage network. With the improved precipitation estimates, hydrologic models can produce reliable streamflow forecasts for areas across the United States. NEXRAD data from the National Weather Service (NWS) has been an invaluable tool used by the U.S. Geological Survey (USGS) for numerous projects and studies; NEXRAD data processing techniques similar to those discussed in this Fact Sheet have been developed within the USGS, including the NWS Quantitative Precipitation Estimates archive developed by Blodgett (2013).

  15. Quantitative analysis of PEG-functionalized colloidal gold nanoparticles using charged aerosol detection.

    PubMed

    Smith, Mackensie C; Crist, Rachael M; Clogston, Jeffrey D; McNeil, Scott E

    2015-05-01

    Surface characteristics of a nanoparticle, such as functionalization with polyethylene glycol (PEG), are critical to understand and achieve optimal biocompatibility. Routine physicochemical characterization such as UV-vis spectroscopy (for gold nanoparticles), dynamic light scattering, and zeta potential are commonly used to assess the presence of PEG. However, these techniques are merely qualitative and are not sensitive enough to distinguish differences in PEG quantity, density, or presentation. As an alternative, two methods are described here which allow for quantitative measurement of PEG on PEGylated gold nanoparticles. The first, a displacement method, utilizes dithiothreitol to displace PEG from the gold surface. The dithiothreitol-coated gold nanoparticles are separated from the mixture via centrifugation, and the excess dithiothreitol and dissociated PEG are separated through reversed-phase high-performance liquid chromatography (RP-HPLC). The second, a dissolution method, utilizes potassium cyanide to dissolve the gold nanoparticles and liberate PEG. Excess CN(-), Au(CN)2 (-), and free PEG are separated using RP-HPLC. In both techniques, the free PEG can be quantified against a standard curve using charged aerosol detection. The displacement and dissolution methods are validated here using 2-, 5-, 10-, and 20-kDa PEGylated 30-nm colloidal gold nanoparticles. The value of these techniques extends beyond quantitating the total PEG fraction: they can also be adapted to quantitate the free, unbound PEG and the bound PEG fractions separately. This is an important distinction, as differences in the bound and unbound PEG fractions can affect biocompatibility and would go undetected by techniques that quantitate only the total PEG fraction.
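    The final quantitation step, reading unknowns off a calibration curve, can be sketched as follows. The calibration values below are hypothetical, and a simple linear fit is assumed for illustration even though charged aerosol detector response can be non-linear over wide ranges:

```python
import numpy as np

# Hypothetical calibration: CAD peak areas measured for PEG standards.
std_amount = np.array([1.0, 2.0, 5.0, 10.0])     # µg PEG on column
std_area = np.array([12.1, 24.3, 59.8, 121.5])   # peak area, arbitrary units

# First-order standard curve (response assumed ~linear over this range).
slope, intercept = np.polyfit(std_amount, std_area, 1)

def quantify(area):
    """Convert an unknown sample's CAD peak area to a PEG amount (µg)."""
    return (area - intercept) / slope
```

Running the same calibration against the free-PEG and bound-PEG fractions separately is what allows the distinction the abstract emphasizes.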

  16. Molecular and Cellular Quantitative Microscopy: theoretical investigations, technological developments and applications to neurobiology

    NASA Astrophysics Data System (ADS)

    Esposito, Alessandro

    2006-05-01

    This PhD project aims at the development and evaluation of microscopy techniques for the quantitative detection of molecular interactions and cellular features. The primarily investigated techniques are Förster Resonance Energy Transfer imaging and Fluorescence Lifetime Imaging Microscopy. These techniques have the capability to quantitatively probe the biochemical environment of fluorophores. An automated microscope capable of unsupervised operation has been developed that enables the investigation of molecular and cellular properties at high throughput levels and the analysis of cellular heterogeneity. State-of-the-art Förster Resonance Energy Transfer imaging, Fluorescence Lifetime Imaging Microscopy, Confocal Laser Scanning Microscopy and the newly developed tools have been combined with cellular and molecular biology techniques for the investigation of protein-protein interactions, oligomerization and post-translational modifications of α-Synuclein and Tau, two proteins involved in Parkinson’s and Alzheimer’s disease, respectively. The high inter-disciplinarity of this project required merging the expertise of both the Molecular Biophysics Group at the Debye Institute - Utrecht University and the Cell Biophysics Group at the European Neuroscience Institute - Göttingen University. This project was also conducted with the support and collaboration of the Center for the Molecular Physiology of the Brain (Göttingen), particularly with the groups associated with the Molecular Quantitative Microscopy and Parkinson’s Disease and Aggregopathies areas. This work demonstrates that molecular and cellular quantitative microscopy can be used in combination with high-throughput screening as a powerful tool for the investigation of the molecular mechanisms of complex biological phenomena like those occurring in neurodegenerative diseases.

  17. Assessment of and standardization for quantitative nondestructive test

    NASA Technical Reports Server (NTRS)

    Neuschaefer, R. W.; Beal, J. B.

    1972-01-01

    Present capabilities and limitations of nondestructive testing (NDT) as applied to aerospace structures during design, development, production, and operational phases are assessed. The assessment will help determine what useful structural quantitative and qualitative data may be provided from raw materials to vehicle refurbishment. It considers metal alloy systems and bonded composites presently applied in active NASA programs or strong contenders for future use. Quantitative and qualitative data have been summarized from recent literature and in-house information, and are presented along with a description of those structures or standards where the information was obtained. Examples, in tabular form, of NDT technique capabilities and limitations have been provided. NDT techniques discussed and assessed were radiography, ultrasonics, penetrants, thermal, acoustic, and electromagnetic. Quantitative data are sparse; therefore, obtaining statistically reliable flaw-detection data must be strongly emphasized. The new requirements for reusable space vehicles have resulted in highly efficient design concepts operating in severe environments. This increases the need for quantitative NDT evaluation of selected structural components, the end item structure, and during refurbishment operations.

  18. In vivo quantification of spatially-varying mechanical properties in developing tissues

    PubMed Central

    Serwane, Friedhelm; Mongera, Alessandro; Rowghanian, Payam; Kealhofer, David A.; Lucio, Adam A.; Hockenbery, Zachary M.; Campàs, Otger

    2017-01-01

    It is generally believed that the mechanical properties of the cellular microenvironment and their spatiotemporal variations play a central role in sculpting embryonic tissues, maintaining organ architecture and controlling cell behavior, including cell differentiation. However, no direct in vivo and in situ measurement of mechanical properties within developing 3D tissues and organs has been performed yet. Here we introduce a technique that employs biocompatible ferrofluid microdroplets as local mechanical actuators and allows quantitative spatiotemporal measurements of mechanical properties in vivo. Using this technique, we show that vertebrate body elongation entails spatially-varying tissue mechanics along the anteroposterior axis. Specifically, we find that the zebrafish tailbud is viscoelastic (elastic below a few seconds and fluid after just one minute) and displays decreasing stiffness and increasing fluidity towards its posterior elongating region. This method opens new avenues to study mechanobiology in vivo, both in embryogenesis and in disease processes, including cancer. PMID:27918540

  19. Preclinical Imaging for the Study of Mouse Models of Thyroid Cancer

    PubMed Central

    Greco, Adelaide; Orlandella, Francesca Maria; Iervolino, Paola Lucia Chiara; Klain, Michele; Salvatore, Giuliana

    2017-01-01

    Thyroid cancer, the most common tumor among endocrine malignancies, comprises a wide range of neoplasms with different clinical aggressiveness. One of the most important challenges in research is to identify mouse models that most closely resemble human pathology; other goals include finding markers of disease that are common to humans and mice, and identifying the most appropriate and least invasive therapeutic strategies for specific tumor types. Preclinical thyroid imaging includes a wide range of techniques that allow morphological and functional characterization of thyroid disease as well as targeting; in most cases, such imaging also allows quantitative analysis of the molecular pattern of the thyroid cancer. The aim of this review paper is to provide an overview of all of the imaging techniques used to date for both diagnostic and theranostic purposes in mouse models of thyroid cancer. PMID:29258188

  20. Principles and applications of polymerase chain reaction in medical diagnostic fields: a review

    PubMed Central

    Valones, Marcela Agne Alves; Guimarães, Rafael Lima; Brandão, Lucas André Cavalcanti; de Souza, Paulo Roberto Eleutério; de Albuquerque Tavares Carvalho, Alessandra; Crovela, Sergio

    2009-01-01

    Recent developments in molecular methods have revolutionized the detection and characterization of microorganisms in a broad range of medical diagnostic fields, including virology, mycology, parasitology, microbiology and dentistry. Among these methods, Polymerase Chain Reaction (PCR) has generated great benefits and allowed scientific advancements. PCR is an excellent technique for the rapid detection of pathogens, including those difficult to culture. Along with conventional PCR techniques, Real-Time PCR has emerged as a technological innovation and is playing an ever-increasing role in clinical diagnostics and research laboratories. Due to its capacity to generate both qualitative and quantitative results, Real-Time PCR is considered a fast and accurate platform. The aim of the present literature review is to explore the clinical usefulness and potential of both conventional PCR and Real-Time PCR assays in diverse medical fields, addressing its main uses and advances. PMID:24031310

  1. A simplified flight-test method for determining aircraft takeoff performance that includes effects of pilot technique

    NASA Technical Reports Server (NTRS)

    Larson, T. J.; Schweikhard, W. G.

    1974-01-01

    A method for evaluating aircraft takeoff performance from brake release to air-phase height that requires fewer tests than conventionally required is evaluated with data for the XB-70 airplane. The method defines the effects of pilot technique on takeoff performance quantitatively, including the decrease in acceleration from drag due to lift. For a given takeoff weight and throttle setting, a single takeoff provides enough data to establish a standardizing relationship for the distance from brake release to any point where velocity is appropriate to rotation. The lower rotation rates penalized takeoff performance in terms of ground roll distance; the lowest observed rotation rate required a ground roll distance that was 19 percent longer than the highest. Rotations at the minimum rate also resulted in lift-off velocities that were approximately 5 knots lower than the highest rotation rate at any given lift-off distance.

  2. microRNA biosensors: Opportunities and challenges among conventional and commercially available techniques.

    PubMed

    Kilic, Tugba; Erdem, Arzum; Ozsoz, Mehmet; Carrara, Sandro

    2018-01-15

    As the most extensively studied non-coding, evolutionarily conserved, post-transcriptional gene regulators of the genome, microRNAs (miRNAs) have attracted great attention across disciplines due to their important roles in biological processes and their link with cancer. Owing to their diagnostic value, many conventional methods have been used to detect miRNAs, including northern blotting, quantitative real-time PCR (qRT-PCR) and microarray technology, alongside novel techniques based on nanotechnology approaches and molecular biology tools, including miRNA biosensors. The aim of this review is to explain the importance of miRNAs in the biomedical field, with an emphasis on early cancer diagnosis, by overviewing both research-based and commercially available miRNA detection methods of the last decade and weighing their strengths and weaknesses, with particular attention to miRNA biosensors. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Semiselective Optoelectronic Sensors for Monitoring Microbes

    NASA Technical Reports Server (NTRS)

    Tabacco, Mary Beth; Chuang, Han; Taylor,Laura; Russo, Jaime

    2003-01-01

    Sensor systems are under development for use in real-time detection and quantitation of microbes in water without need for sampling. These systems include arrays of optical sensors; miniature, portable electronic data-acquisition circuits; and optoelectronic interfaces between the sensor arrays and data-acquisition circuits. These systems are intended for original use in long-term, inline monitoring of waterborne micro-organisms in water-reclamation systems aboard future spacecraft. They could also be adapted to similar terrestrial uses with respect to municipal water supplies, stored drinking water, and swimming water; for detecting low-level biological contamination in biotechnological, semiconductor, and pharmaceutical process streams; and in verifying the safety of foods and beverages. In addition, they could be adapted to monitoring of airborne microbes and of surfaces (e.g., to detect and/or quantitate biofilms). The designs of the sensors in these systems are based partly on those of sensors developed previously for monitoring airborne biological materials. The designs exploit molecular- recognition and fluorescence-spectroscopy techniques, such that in the presence of micro-organisms of interest, fluorescence signals change and the changes can be measured. These systems are characterized as semiselective because they respond to classes of micro-organisms and can be used to discriminate among the classes. This semiselectivity is a major aspect of the design: It is important to distinguish between (1) the principle of detection and quantitation of classes of micro-organisms by use of these sensors and (2) the principle of detection and quantitation of individual microbiological species by means of prior immuno-diagnostic and/or molecular-biology techniques. Detection of classes (in contradistinction to species) is particularly valuable when the exact nature of a contaminant is unknown.

  4. MO-C-BRCD-03: The Role of Informatics in Medical Physics and Vice Versa.

    PubMed

    Andriole, K

    2012-06-01

    Like Medical Physics, Imaging Informatics encompasses concepts touching every aspect of the imaging chain from image creation, acquisition, management and archival, to image processing, analysis, display and interpretation. The two disciplines are in fact quite complementary, with similar goals to improve the quality of care provided to patients using an evidence-based approach, to assure safety in the clinical and research environments, to facilitate efficiency in the workplace, and to accelerate knowledge discovery. Use-cases describing several areas of informatics activity will be given to illustrate current limitations that would benefit from medical physicist participation, and conversely areas in which informaticists may contribute to the solution. Topics to be discussed include radiation dose monitoring, process management and quality control, display technologies, business analytics techniques, and quantitative imaging. Quantitative imaging is increasingly becoming an essential part of biomedical research as well as being incorporated into clinical diagnostic activities. Referring clinicians are asking for more objective information to be gleaned from the imaging tests that they order so that they may make the best clinical management decisions for their patients. Medical Physicists may be called upon to identify existing issues as well as develop, validate and implement new approaches and technologies to help move the field further toward quantitative imaging methods for the future. Biomedical imaging informatics tools and techniques such as standards, integration, data mining, cloud computing and new systems architectures, ontologies and lexicons, data visualization and navigation tools, and business analytics applications can be used to overcome some of the existing limitations. 1. Describe what is meant by Medical Imaging Informatics and understand why the medical physicist should care. 2. 
Identify existing limitations in information technologies with respect to Medical Physics, and conversely see how Informatics may assist the medical physicist in filling some of the current gaps in their activities. 3. Understand general informatics concepts and areas of investigation including imaging and workflow standards, systems integration, computing architectures, ontologies, data mining and business analytics, data visualization and human-computer interface tools, and the importance of quantitative imaging for the future of Medical Physics and Imaging Informatics. 4. Become familiar with on-going efforts to address current challenges facing future research into and clinical implementation of quantitative imaging applications. © 2012 American Association of Physicists in Medicine.

  5. Combination of methylated-DNA precipitation and methylation-sensitive restriction enzymes (COMPARE-MS) for the rapid, sensitive and quantitative detection of DNA methylation.

    PubMed

    Yegnasubramanian, Srinivasan; Lin, Xiaohui; Haffner, Michael C; DeMarzo, Angelo M; Nelson, William G

    2006-02-09

    Hypermethylation of CpG island (CGI) sequences is a nearly universal somatic genome alteration in cancer. Rapid and sensitive detection of DNA hypermethylation would aid in cancer diagnosis and risk stratification. We present a novel technique, called COMPARE-MS, that can rapidly and quantitatively detect CGI hypermethylation with high sensitivity and specificity in hundreds of samples simultaneously. To quantitate CGI hypermethylation, COMPARE-MS uses real-time PCR of DNA that was first digested by methylation-sensitive restriction enzymes and then precipitated by methyl-binding domain polypeptides immobilized on a magnetic solid matrix. We show that COMPARE-MS could detect five genome equivalents of methylated CGIs in a 1000- to 10,000-fold excess of unmethylated DNA. COMPARE-MS was used to rapidly quantitate hypermethylation at multiple CGIs in >155 prostate tissues, including benign and malignant prostate specimens, and prostate cell lines. This analysis showed that GSTP1, MDR1 and PTGS2 CGI hypermethylation as determined by COMPARE-MS could differentiate between malignant and benign prostate with sensitivities >95% and specificities approaching 100%. This novel technology could significantly improve our ability to detect CGI hypermethylation.
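    The real-time PCR readout at the core of COMPARE-MS converts threshold cycles into relative input quantities. The following is a generic sketch of that conversion (not the authors' exact pipeline), with the amplification efficiency as a hypothetical parameter:

```python
def quantity_from_ct(ct, ct_ref, efficiency=2.0):
    """Relative template quantity from real-time PCR threshold cycles.

    Each cycle the threshold is crossed earlier corresponds to roughly
    one doubling of input template (efficiency 2.0 = 100% efficient PCR),
    so quantity scales as efficiency ** (ct_ref - ct)."""
    return efficiency ** (ct_ref - ct)
```

In practice Ct values for methylated CGIs captured by the methyl-binding domain matrix would be read against a standard curve of fully methylated control DNA rather than a single reference Ct.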

  6. Monte Carlo evaluation of accuracy and noise properties of two scatter correction methods for /sup 201/Tl cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Narita, Y.; Iida, H.; Ebert, S.; Nakamura, T.

    1997-12-01

    Two independent scatter correction techniques, transmission dependent convolution subtraction (TDCS) and the triple-energy window (TEW) method, were evaluated in terms of quantitative accuracy and noise properties using Monte Carlo simulation (EGS4). Emission projections (primary, scatter, and scatter plus primary) were simulated for three numerical phantoms for /sup 201/Tl. Data were reconstructed with an ordered-subset EM algorithm, including attenuation correction based on noiseless transmission data. The accuracy of the TDCS and TEW scatter corrections was assessed by comparison with the simulated true primary data. The uniform cylindrical phantom simulation demonstrated better quantitative accuracy with TDCS than with TEW (-2.0% vs. 16.7%) and better S/N (6.48 vs. 5.05). A uniform ring myocardial phantom simulation demonstrated better homogeneity with TDCS than with TEW in the myocardium; i.e., anterior-to-posterior wall count ratios were 0.99 and 0.76 with TDCS and TEW, respectively. For the MCAT phantom, TDCS provided good visual and quantitative agreement with the simulated true primary image without noticeably increasing the noise after scatter correction. Overall, TDCS proved to be more accurate and less noisy than TEW, facilitating quantitative assessment of physiological functions with SPECT.
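    The TEW method evaluated above estimates the scatter in the main energy window by trapezoidal interpolation between two narrow flanking windows. A minimal sketch of the standard formulation, with window widths (keV) and per-window counts as hypothetical inputs:

```python
def tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_main):
    """TEW scatter estimate for the main window: the counts per keV in
    the two narrow sub-windows are averaged trapezoidally and scaled by
    the main-window width."""
    return (c_lower / w_lower + c_upper / w_upper) * w_main / 2.0

def tew_primary(c_main, c_lower, c_upper, w_lower, w_upper, w_main):
    """Scatter-corrected (primary) counts, clipped at zero."""
    scatter = tew_scatter_estimate(c_lower, c_upper, w_lower, w_upper, w_main)
    return max(c_main - scatter, 0.0)
```

The subtraction of a noisy pixel-wise scatter estimate is one reason TEW tends to be noisier than TDCS, as the abstract reports.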

  7. Evaluative procedures to detect, characterize, and assess the severity of diabetic neuropathy.

    PubMed

    Dyck, P J

    1991-01-01

    Minimal criteria for diabetic neuropathy need to be defined and universally applied. Standardized evaluative procedures need to be agreed upon, and normal ranges determined from healthy volunteers. Types and stages of neuropathy should be established and assessments performed on representative populations of both Type 1 and Type 2 diabetic patients. Potential minimal criteria include absent ankle reflexes and vibratory sensation, and abnormalities of nerve conduction. However, the preferred criterion is the identification of more than two statistically defined abnormalities among symptoms and deficits, nerve conduction, quantitative sensory examination or quantitative autonomic examination. Various evaluative procedures are available. Symptoms should be assessed and scores can be assigned to neurological deficits. However, assessments of nerve conduction provide the most specific, objective, sensitive, and repeatable procedures, although these may be the least meaningful. Many techniques are available for quantitative sensory examination, but they are poorly standardized and normal values are not available. For quantitative autonomic examination, tests are available for the adequacy of cardiovascular and peripheral vascular reflexes and, increasingly, for other autonomic functions. In any assessment of nerve function, the conditions should be optimized and standardized, and stimuli defined. Specific instructions should be given and normal ranges established in healthy volunteers.

  8. Quantitative analysis of sitagliptin using the (19)F-NMR method: a universal technique for fluorinated compound detection.

    PubMed

    Zhang, Fen-Fen; Jiang, Meng-Hong; Sun, Lin-Lin; Zheng, Feng; Dong, Lei; Shah, Vishva; Shen, Wen-Bin; Ding, Ya

    2015-01-07

    To expand the application scope of nuclear magnetic resonance (NMR) technology in the quantitative analysis of pharmaceutical ingredients, (19)F nuclear magnetic resonance ((19)F-NMR) spectroscopy has been employed as a simple, rapid, and reproducible approach for the detection of a fluorine-containing model drug, sitagliptin phosphate monohydrate (STG), with ciprofloxacin (Cipro) as the internal standard (IS). Influential factors impacting the accuracy and precision of the spectral data, including the relaxation delay time (d1) and the pulse angle, are systematically optimized. Method validation has been carried out in terms of precision and intermediate precision, linearity, limit of detection (LOD) and limit of quantification (LOQ), robustness, and stability. To validate the reliability and feasibility of the (19)F-NMR technology in quantitative analysis of pharmaceutical analytes, the assay result has been compared with that of (1)H-NMR. The statistical F-test and Student's t-test at the 95% confidence level indicate that there is no significant difference between these two methods. Due to the advantages of (19)F-NMR, such as higher resolution and suitability for biological samples, it can be used as a universal technology for the quantitative analysis of other fluorine-containing pharmaceuticals and analytes.
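    Internal-standard qNMR assays of this kind typically reduce to a single ratio formula relating integrated signal areas, nuclei counts, molecular weights, and weighed masses. A sketch of that general relation (the paper's exact acquisition and integration parameters are not reproduced here):

```python
def assay_purity(i_analyte, i_is, n_analyte, n_is,
                 mw_analyte, mw_is, m_analyte, m_is, purity_is):
    """Internal-standard qNMR purity of the analyte.

    i_*: integrated signal areas; n_*: number of (19)F nuclei giving rise
    to each signal; mw_*: molecular weights; m_*: weighed masses;
    purity_is: known purity of the internal standard."""
    return (i_analyte / i_is) * (n_is / n_analyte) \
         * (mw_analyte / mw_is) * (m_is / m_analyte) * purity_is
```

The same relation underlies (1)H-NMR assays, which is what makes the head-to-head comparison in the abstract meaningful.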

  9. Evaluation of macrozone dimensions by ultrasound and EBSD techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreau, Andre, E-mail: Andre.Moreau@cnrc-nrc.gc.ca; Toubal, Lotfi; Ecole de technologie superieure, 1100, rue Notre-Dame Ouest, Montreal, QC, Canada H3C 1K3

    2013-01-15

    Titanium alloys are known to have texture heterogeneities, i.e. regions much larger than the grain dimensions, where the local orientation distribution of the grains differs from one region to the next. The electron backscattering diffraction (EBSD) technique is the method of choice to characterize these macro regions, which are called macrozones. Qualitatively, the images obtained by EBSD show that these macrozones may be larger or smaller, elongated or equiaxed. However, often no well-defined boundaries are observed between the macrozones and it is very hard to obtain objective and quantitative estimates of the macrozone dimensions from these data. In the present work, we present a novel, non-destructive ultrasonic technique that provides objective and quantitative characteristic dimensions of the macrozones. The obtained dimensions are based on the spatial autocorrelation function of fluctuations in the sound velocity. Thus, a pragmatic definition of macrozone dimensions naturally arises from the ultrasonic measurement. This paper has three objectives: 1) to disclose the novel, non-destructive ultrasonic technique to measure macrozone dimensions, 2) to propose a quantitative and objective definition of macrozone dimensions adapted to and arising from the ultrasonic measurement, and which is also applicable to the orientation data obtained by EBSD, and 3) to compare the macrozone dimensions obtained using the two techniques on two samples of the near-alpha titanium alloy IMI834. In addition, it was observed that macrozones may present a semi-periodical arrangement.
    Highlights: • Discloses a novel, ultrasonic NDT technique to measure macrozone dimensions. • Proposes a quantitative and objective definition of macrozone dimensions. • Compares macrozone dimensions obtained using EBSD and ultrasonics on two Ti samples. • Observes that macrozones may have a semi-periodical arrangement.

  10. Cardiovascular magnetic resonance of myocardial edema using a short inversion time inversion recovery (STIR) black-blood technique: Diagnostic accuracy of visual and semi-quantitative assessment

    PubMed Central

    2012-01-01

    Background The short inversion time inversion recovery (STIR) black-blood technique has been used to visualize myocardial edema, and thus to differentiate acute from chronic myocardial lesions. However, some cardiovascular magnetic resonance (CMR) groups have reported variable image quality, and hence the diagnostic value of STIR in routine clinical practice has been put into question. The aim of our study was to analyze image quality and diagnostic performance of STIR using a set of pulse sequence parameters dedicated to edema detection, and to discuss possible factors that influence image quality. We hypothesized that STIR imaging is an accurate and robust way of detecting myocardial edema in non-selected patients with acute myocardial infarction. Methods Forty-six consecutive patients with acute myocardial infarction underwent CMR (day 4.5, +/- 1.6) including STIR for the assessment of myocardial edema and late gadolinium enhancement (LGE) for quantification of myocardial necrosis. Thirty of these patients underwent a follow-up CMR at approximately six months (195 +/- 39 days). Both STIR and LGE images were evaluated separately on a segmental basis for image quality as well as for presence and extent of myocardial hyper-intensity, with both visual and semi-quantitative (threshold-based) analysis. LGE was used as a reference standard for localization and extent of myocardial necrosis (acute) or scar (chronic). Results Image quality of STIR images was rated as diagnostic in 99.5% of cases. At the acute stage, the sensitivity and specificity of STIR to detect infarcted segments on visual assessment was 95% and 78% respectively, and on semi-quantitative assessment was 99% and 83%, respectively. STIR differentiated acutely from chronically infarcted segments with a sensitivity of 95% by both methods and with a specificity of 99% by visual assessment and 97% by semi-quantitative assessment. 
The extent of hyper-intense areas on acute STIR images was 85% larger than those on LGE images, with a larger myocardial salvage index in reperfused than in non-reperfused infarcts (p = 0.035). Conclusions STIR with appropriate pulse sequence settings is accurate in detecting acute myocardial infarction (MI) and distinguishing acute from chronic MI with both visual and semi-quantitative analysis. Due to its unique technical characteristics, STIR should be regarded as an edema-weighted rather than a purely T2-weighted technique. PMID:22455461
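    The myocardial salvage index reported in the results is commonly defined as the fraction of the area at risk (edema extent on STIR) that escaped infarction (extent on LGE). A minimal sketch of that arithmetic:

```python
def salvage_index(edema_extent, infarct_extent):
    """Myocardial salvage index: fraction of the area at risk (STIR
    edema extent) not showing late gadolinium enhancement (infarction).
    Extents may be in segments, grams, or % of myocardium, as long as
    both use the same unit."""
    return (edema_extent - infarct_extent) / edema_extent
```

A larger index in reperfused than non-reperfused infarcts, as the study found, indicates that more of the at-risk myocardium was rescued by reperfusion.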

  11. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with dynamic mode atomic force microscopy suitable for soft matter investigation. The article casts the system, comprising a flexure probe interacting with the sample, as an equivalent cantilever system, and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, but slower than the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided in the quantitative estimation of material properties.

  12. An accurate computational method for an order parameter with a Markov state model constructed using a manifold-learning technique

    NASA Astrophysics Data System (ADS)

    Ito, Reika; Yoshidome, Takashi

    2018-01-01

    Markov state models (MSMs) are a powerful approach for analyzing the long-time behaviors of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to the physical quantities is poor. We believe that this poor performance is caused by the failure to appropriately classify protein conformations into states when constructing MSMs. Herein, we show that the quantitative performance of an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM. The MSM construction using the K-center method, which has been previously used for classification, has a poor quantitative performance.
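The transition-matrix estimation at the core of an MSM can be sketched in a few lines. The toy example below assumes frames have already been assigned to discrete states (e.g., by a manifold-learning embedding followed by clustering, the step this abstract addresses); the trajectory is hypothetical:

```python
import numpy as np

def estimate_msm(traj, n_states, lag=1):
    """Estimate a row-stochastic MSM transition matrix by counting
    transitions at a fixed lag time in a discrete state trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(traj[:-lag], traj[lag:]):
        counts[i, j] += 1
    # Row-normalize; guard against states that are never visited.
    rows = counts.sum(axis=1, keepdims=True)
    rows[rows == 0] = 1.0
    return counts / rows

# Toy two-state trajectory, mostly self-transitions.
traj = [0, 0, 0, 1, 1, 0, 0, 1, 1, 1]
T = estimate_msm(traj, n_states=2)
```

Long-time behavior is then read off from powers or eigenvalues of `T`; the quality of the model hinges on how well the state assignment resolves the slow degrees of freedom.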

  13. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites.

    PubMed

    Lietz, Christopher B; Gemperline, Erin; Li, Lingjun

    2013-07-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Qualitative and quantitative mass spectrometry imaging of drugs and metabolites

    PubMed Central

    Lietz, Christopher B.; Gemperline, Erin; Li, Lingjun

    2013-01-01

    Mass spectrometric imaging (MSI) has rapidly increased its presence in the pharmaceutical sciences. While quantitative whole-body autoradiography and microautoradiography are the traditional techniques for molecular imaging of drug delivery and metabolism, MSI provides advantageous specificity that can distinguish the parent drug from metabolites and modified endogenous molecules. This review begins with the fundamentals of MSI sample preparation/ionization, and then moves on to both qualitative and quantitative applications with special emphasis on drug discovery and delivery. Cutting-edge investigations on sub-cellular imaging and endogenous signaling peptides are also highlighted, followed by perspectives on emerging technology and the path for MSI to become a routine analysis technique. PMID:23603211

  15. Generalized likelihood ratios for quantitative diagnostic test scores.

    PubMed

    Tandberg, D; Deely, J J; O'Malley, A J

    1997-11-01

    The reduction of quantitative diagnostic test scores to the dichotomous case is a wasteful and unnecessary simplification in the era of high-speed computing. Physicians could make better use of the information embedded in quantitative test results if modern generalized curve estimation techniques were applied to the likelihood functions of Bayes' theorem. Hand calculations could be completely avoided and computed graphical summaries provided instead. Graphs showing posttest probability of disease as a function of pretest probability with confidence intervals (POD plots) would enhance acceptance of these techniques if they were immediately available at the computer terminal when test results were retrieved. Such constructs would also provide immediate feedback to physicians when a valueless test had been ordered.
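The odds-form Bayes update behind such POD plots is straightforward to compute; a minimal sketch (illustrative, not the authors' code):

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Bayes' theorem in odds form:
    post-test odds = pre-test odds * likelihood ratio."""
    pre_odds = pretest_p / (1.0 - pretest_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A quantitative test score mapped to LR = 4 moves a 30% prior to ~63%.
p = posttest_probability(0.30, 4.0)
```

A POD plot is just this function evaluated over a grid of pretest probabilities for the likelihood ratio associated with the observed score, with confidence bands reflecting uncertainty in the estimated likelihood function.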

  16. A systematic review of the relationship factor between women and health professionals within the multivariant analysis of maternal satisfaction.

    PubMed

    Macpherson, Ignacio; Roqué-Sánchez, María V; Legget Bn, Finola O; Fuertes, Ferran; Segarra, Ignacio

    2016-10-01

Personalised support provided to women by health professionals is one of the prime factors in attaining women's satisfaction during pregnancy and childbirth. However, the multifactorial nature of 'satisfaction' makes it difficult to assess. Statistical multivariate analysis may be an effective technique to obtain in-depth quantitative evidence of the importance of this factor and its interaction with the other factors involved. This technique allows us to estimate the importance of overall satisfaction in its context and suggest actions for healthcare services. Systematic review of studies that quantitatively measure the personal relationship between women and healthcare professionals (gynecologists, obstetricians, nurses, midwives, etc.) regarding maternity care satisfaction. The literature search focused on studies carried out between 1970 and 2014 that used multivariate analyses and included the woman-caregiver relationship as a factor of their analysis. Twenty-four studies which applied various multivariate analysis tools to different periods of maternity care (antenatal, perinatal, post partum) were selected. The studies included discrete scale scores and questionnaires from women with low-risk pregnancies. The "personal relationship" factor appeared under various names: care received, personalised treatment, professional support, amongst others. The most common multivariate techniques used to assess the percentage of variance explained and the odds ratio of each factor were principal component analysis and logistic regression. The data, variables and factor analysis suggest that continuous, personalised care provided by the usual midwife and delivered within a family or a specialised setting generates the highest level of satisfaction. In addition, these factors foster the woman's psychological and physiological recovery, often surpassing clinical action (e.g. medicalization and hospital organization) and/or physiological determinants (e.g. pain, pathologies, etc.). Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Updates on measurements and modeling techniques for expendable countermeasures

    NASA Astrophysics Data System (ADS)

    Gignilliat, Robert; Tepfer, Kathleen; Wilson, Rebekah F.; Taczak, Thomas M.

    2016-10-01

    The potential threat of recently-advertised anti-ship missiles has instigated research at the United States (US) Naval Research Laboratory (NRL) into the improvement of measurement techniques for visual band countermeasures. The goal of measurements is the collection of radiometric imagery for use in the building and validation of digital models of expendable countermeasures. This paper will present an overview of measurement requirements unique to the visual band and differences between visual band and infrared (IR) band measurements. A review of the metrics used to characterize signatures in the visible band will be presented and contrasted to those commonly used in IR band measurements. For example, the visual band measurements require higher fidelity characterization of the background, including improved high-transmittance measurements and better characterization of solar conditions to correlate results more closely with changes in the environment. The range of relevant engagement angles has also been expanded to include higher altitude measurements of targets and countermeasures. In addition to the discussion of measurement techniques, a top-level qualitative summary of modeling approaches will be presented. No quantitative results or data will be presented.

  18. Modified ecometric technique (four-quadrant sequential streak) to evaluate Campylobacter enrichment broth proficiency in suppressing background microflora

    USDA-ARS?s Scientific Manuscript database

    Ecometric technique is a semi-quantitative scoring method used for quality control of culture media in microbiological laboratories. The technique involves inoculation with defined populations of specific culture onto solid media via a standardized chronological streaking technique, leading to ever-...

  19. Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection

    PubMed Central

    Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan

    2011-01-01

    Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide only direct information on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system is demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O2 content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. PMID:21497566

  20. A peptide-retrieval strategy enables significant improvement of quantitative performance without compromising confidence of identification.

    PubMed

    Tu, Chengjian; Shen, Shichen; Sheng, Quanhu; Shyr, Yu; Qu, Jun

    2017-01-30

Reliable quantification of low-abundance proteins in complex proteomes is challenging, largely owing to the limited number of spectra/peptides identified. In this study we developed a straightforward method to improve the quantitative accuracy and precision of proteins by strategically retrieving the less confident peptides that were previously filtered out using the standard target-decoy search strategy. The filtered-out MS/MS spectra matched to confidently-identified proteins were recovered, and the peptide-spectrum-match FDR was re-calculated and controlled at a confident level of FDR≤1%, while the protein FDR was maintained at ~1%. We evaluated the performance of this strategy in both spectral count- and ion current-based methods. A >60% increase in total quantified spectra/peptides was achieved both for a spike-in sample set and for a public dataset from CPTAC. Incorporating the peptide retrieval strategy significantly improved the quantitative accuracy and precision, especially for low-abundance proteins (e.g. one-hit proteins). Moreover, the capacity to confidently discover significantly-altered proteins was also enhanced substantially, as demonstrated with two spike-in datasets. In summary, improved quantitative performance was achieved by this peptide recovery strategy without compromising confidence of protein identification, and the strategy can be readily implemented in a broad range of quantitative proteomics techniques, including label-free and labeling approaches. We hypothesize that more quantifiable spectra and peptides for a protein, even including less confident peptides, could help reduce variation and improve protein quantification. Hence the peptide retrieval strategy was developed and evaluated in two spike-in sample sets with different LC-MS/MS variations using both MS1- and MS2-based quantitative approaches. The list of confidently identified proteins obtained with the standard target-decoy search strategy was fixed, and additional, less confident spectra/peptides matched to those confident proteins were retrieved; the total peptide-spectrum-match false discovery rate (PSM FDR) after retrieval was still controlled at a confident level of FDR≤1%. As expected, the penalty for occasionally incorporating incorrect peptide identifications was negligible by comparison with the improvements in quantitative performance: more quantifiable peptides, a lower missing-value rate, and better quantitative accuracy and precision were achieved for the same protein identifications by this simple strategy. The strategy is applicable, in principle, to any quantitative approach in proteomics and thereby provides more quantitative information, especially on low-abundance proteins. Published by Elsevier B.V.
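The target-decoy FDR estimate at the heart of this strategy can be sketched as follows; the scores below are hypothetical and the threshold scan is a simplified illustration, not the authors' pipeline:

```python
def psm_fdr(target_scores, decoy_scores, threshold):
    """Target-decoy estimate of the PSM false discovery rate at a
    score threshold: FDR ~= (decoys passing) / (targets passing)."""
    t = sum(s >= threshold for s in target_scores)
    d = sum(s >= threshold for s in decoy_scores)
    return d / t if t else 0.0

def most_permissive_threshold(target_scores, decoy_scores, max_fdr=0.01):
    """Scan observed target scores from high to low and keep the
    lowest threshold whose estimated FDR stays within max_fdr."""
    best = None
    for thr in sorted(target_scores, reverse=True):
        if psm_fdr(target_scores, decoy_scores, thr) <= max_fdr:
            best = thr
    return best

targets = [10, 9, 8, 7, 3, 2]   # hypothetical PSM scores for target hits
decoys = [4, 1]                 # hypothetical scores for decoy hits
thr = most_permissive_threshold(targets, decoys, max_fdr=0.25)
```

Retrieval amounts to re-running this control over an enlarged PSM set restricted to peptides that map to the already-confident protein list, so lower-scoring spectra can be admitted while the overall FDR estimate stays within bounds.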

  1. Recent Achievements in Characterizing the Histone Code and Approaches to Integrating Epigenomics and Systems Biology.

    PubMed

    Janssen, K A; Sidoli, S; Garcia, B A

    2017-01-01

    Functional epigenetic regulation occurs by dynamic modification of chromatin, including genetic material (i.e., DNA methylation), histone proteins, and other nuclear proteins. Due to the highly complex nature of the histone code, mass spectrometry (MS) has become the leading technique in identification of single and combinatorial histone modifications. MS has now overcome antibody-based strategies due to its automation, high resolution, and accurate quantitation. Moreover, multiple approaches to analysis have been developed for global quantitation of posttranslational modifications (PTMs), including large-scale characterization of modification coexistence (middle-down and top-down proteomics), which is not currently possible with any other biochemical strategy. Recently, our group and others have simplified and increased the effectiveness of analyzing histone PTMs by improving multiple MS methods and data analysis tools. This review provides an overview of the major achievements in the analysis of histone PTMs using MS with a focus on the most recent improvements. We speculate that the workflow for histone analysis at its state of the art is highly reliable in terms of identification and quantitation accuracy, and it has the potential to become a routine method for systems biology thanks to the possibility of integrating histone MS results with genomics and proteomics datasets. © 2017 Elsevier Inc. All rights reserved.

  2. Quantitative assessment of antibody internalization with novel monoclonal antibodies against Alexa fluorophores.

    PubMed

    Liao-Chan, Sindy; Daine-Matsuoka, Barbara; Heald, Nathan; Wong, Tiffany; Lin, Tracey; Cai, Allen G; Lai, Michelle; D'Alessio, Joseph A; Theunissen, Jan-Willem

    2015-01-01

    Antibodies against cell surface antigens may be internalized through their specific interactions with these proteins and in some cases may induce or perturb antigen internalization. The anti-cancer efficacy of antibody-drug conjugates is thought to rely on their uptake by cancer cells expressing the surface antigen. Numerous techniques, including microscopy and flow cytometry, have been used to identify antibodies with desired cellular uptake rates. To enable quantitative measurements of internalization of labeled antibodies, an assay based on internalized and quenched fluorescence was developed. For this approach, we generated novel anti-Alexa Fluor monoclonal antibodies (mAbs) that effectively and specifically quench cell surface-bound Alexa Fluor 488 or Alexa Fluor 594 fluorescence. Utilizing Alexa Fluor-labeled mAbs against the EphA2 receptor tyrosine kinase, we showed that the anti-Alexa Fluor reagents could be used to monitor internalization quantitatively over time. The anti-Alexa Fluor mAbs were also validated in a proof of concept dual-label internalization assay with simultaneous exposure of cells to two different mAbs. Importantly, the unique anti-Alexa Fluor mAbs described here may also enable other single- and dual-label experiments, including label detection and signal enhancement in macromolecules, trafficking of proteins and microorganisms, and cell migration and morphology.

  3. Detection of brain tumor margins using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier

    2018-02-01

In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions to intraoperative imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction (BEAE) method was employed. BEAE is a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this novel method, we are able to discriminate between cancerous and non-cancerous tissues, with logistic regression as a classifier for automatic brain tumor margin detection. Using this technique, we achieve excellent performance on an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again on an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well-suited to differentiating brain tissue and could support the guidance of tissue resection during surgery.
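The reported sensitivity and specificity follow the standard definitions; a minimal sketch with hypothetical labels (1 = cancer-infiltrated, 0 = non-cancer):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Toy example: 6 regions, one false negative and one false positive.
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0],
                                     [1, 1, 0, 0, 0, 1])
```

In the study, these metrics are computed over OCT image regions classified by the logistic-regression step, once for cross-validation and once for the blinded validation set.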

  5. Quantitative structure-activity relationship study of P2X7 receptor inhibitors using combination of principal component analysis and artificial intelligence methods.

    PubMed

    Ahmadi, Mehdi; Shahlaei, Mohsen

    2015-01-01

P2X7 antagonist activity for a set of 49 molecules of P2X7 receptor antagonists, derivatives of purine, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of a combination of principal component analysis (PCA), as a well-known data reduction method, a genetic algorithm (GA), as a variable selection technique, and an artificial neural network (ANN), as a non-linear modeling method. First, a linear regression combined with PCA (principal component regression) was applied to model the structure-activity relationships, and afterwards a combination of PCA and an ANN algorithm was employed to accurately predict the biological activity of the P2X7 antagonists. PCA preserves as much as possible of the information contained in the original data set. The seven PCs most important to the studied activity were selected as the inputs of the ANN by an efficient variable selection method, GA. The best computational neural network model was a fully-connected, feed-forward model with 7-7-1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation, and chemical applicability domain. All validations showed that the constructed quantitative structure-activity relationship model is robust and satisfactory.
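The PCA reduction that feeds the 7-7-1 network can be sketched with an SVD of the centered descriptor matrix; the descriptor values below are randomly generated placeholders, not the study's data:

```python
import numpy as np

def pca_scores(X, n_components):
    """Project a descriptor matrix X (molecules x descriptors) onto its
    leading principal components via SVD of the centered data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(0)
X = rng.normal(size=(49, 20))      # 49 molecules, 20 hypothetical descriptors
Z = pca_scores(X, n_components=7)  # seven PC scores feed the 7-7-1 ANN
```

The GA step in the study then selects which PCs enter the network; here the first seven are taken simply for illustration.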

  6. Quantitative structure–activity relationship study of P2X7 receptor inhibitors using combination of principal component analysis and artificial intelligence methods

    PubMed Central

    Ahmadi, Mehdi; Shahlaei, Mohsen

    2015-01-01

P2X7 antagonist activity for a set of 49 molecules of P2X7 receptor antagonists, derivatives of purine, was modeled with the aid of chemometric and artificial intelligence techniques. The activity of these compounds was estimated by means of a combination of principal component analysis (PCA), as a well-known data reduction method, a genetic algorithm (GA), as a variable selection technique, and an artificial neural network (ANN), as a non-linear modeling method. First, a linear regression combined with PCA (principal component regression) was applied to model the structure–activity relationships, and afterwards a combination of PCA and an ANN algorithm was employed to accurately predict the biological activity of the P2X7 antagonists. PCA preserves as much as possible of the information contained in the original data set. The seven PCs most important to the studied activity were selected as the inputs of the ANN by an efficient variable selection method, GA. The best computational neural network model was a fully-connected, feed-forward model with 7−7−1 architecture. The developed ANN model was fully evaluated by different validation techniques, including internal and external validation, and chemical applicability domain. All validations showed that the constructed quantitative structure–activity relationship model is robust and satisfactory. PMID:26600858

  7. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, due to the evaluation of EBIC equipment performance and the numerical optimization of equipment items, the constant acquisition of high contrast images has become possible, improving the reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  8. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    NASA Technical Reports Server (NTRS)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.
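Extracting unsteady-flow frequency content from high-speed imagery typically reduces to a Fourier analysis of a pixel- or region-intensity time series; a minimal sketch with a synthetic signal (illustrative, not the authors' processing chain):

```python
import numpy as np

def dominant_frequency(intensity, fps):
    """Return the peak frequency (Hz) in an intensity time series
    extracted from high-speed schlieren or shadowgraph frames."""
    x = np.asarray(intensity, dtype=float)
    x = x - x.mean()                  # drop the DC component
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    return freqs[np.argmax(spectrum)]

# Synthetic 500 Hz shock oscillation sampled at 10 kHz for 1024 frames.
t = np.arange(1024) / 10_000.0
f_peak = dominant_frequency(np.sin(2 * np.pi * 500 * t), fps=10_000)
```

The frame rate bounds the highest resolvable frequency at fps/2, which is why the high-speed camera upgrade is what makes this analysis possible in production facilities.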

  9. Self-Normalized Photoacoustic Technique for the Quantitative Analysis of Paper Pigments

    NASA Astrophysics Data System (ADS)

    Balderas-López, J. A.; Gómez y Gómez, Y. M.; Bautista-Ramírez, M. E.; Pescador-Rojas, J. A.; Martínez-Pérez, L.; Lomelí-Mejía, P. A.

    2018-03-01

    A self-normalized photoacoustic technique was applied for quantitative analysis of pigments embedded in solids. Paper samples (filter paper, Whatman No. 1), attached with the pigment: Direct Fast Turquoise Blue GL, were used for this study. This pigment is a blue dye commonly used in industry to dye paper and other fabrics. The optical absorption coefficient, at a wavelength of 660 nm, was measured for this pigment at various concentrations in the paper substrate. It was shown that Beer-Lambert model for light absorption applies well for pigments in solid substrates and optical absorption coefficients as large as 220 cm^{-1} can be measured with this photoacoustic technique.
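The Beer-Lambert relation the authors verify links transmitted intensity to the optical absorption coefficient; a short worked example (the transmittance and thickness values are illustrative, not the paper's data):

```python
import math

def absorption_coefficient(transmittance, thickness_cm):
    """Beer-Lambert: I = I0 * exp(-alpha * d)  =>  alpha = -ln(T) / d."""
    return -math.log(transmittance) / thickness_cm

# 1% transmittance through a 200 um (0.02 cm) pigmented layer
# gives alpha of roughly 230 cm^-1, of the order reported above.
alpha = absorption_coefficient(0.01, 0.02)
```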

  10. Combining Ultrasound Pulse-Echo and Transmission Computed Tomography for Quantitative Imaging the Cortical Shell of Long Bone Replicas

    NASA Astrophysics Data System (ADS)

    Shortell, Matthew P.; Althomali, Marwan A. M.; Wille, Marie-Luise; Langton, Christian M.

    2017-11-01

We demonstrate a simple technique for quantitative ultrasound imaging of the cortical shell of long bone replicas. Traditional ultrasound computed tomography instruments use the transmitted or reflected waves for separate reconstructions but suffer from strong refraction artefacts in highly heterogeneous samples such as bones in soft tissue. The technique described here simplifies the long bone to a two-component composite and uses both the transmitted and reflected waves for reconstructions, allowing the speed of sound and thickness of the cortical shell to be calculated accurately. The technique is simple to implement, computationally inexpensive, and sample positioning errors are minimal.
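One way two measurements can jointly determine the two unknowns is a simplified plane-layer model: the pulse-echo round trip inside the cortex gives dt_echo = 2d/c, and the transmission time shift relative to the water path gives dt_trans = d/c_w - d/c. This is an illustration of the principle only, not the authors' reconstruction:

```python
C_WATER = 1480.0  # m/s, assumed coupling-medium sound speed

def cortical_properties(dt_echo, dt_trans, c_water=C_WATER):
    """Solve for cortical sound speed c and thickness d from
    dt_echo = 2*d/c (pulse-echo) and dt_trans = d/c_water - d/c
    (transmission time advance relative to water)."""
    c = c_water * (1.0 + 2.0 * dt_trans / dt_echo)
    d = 0.5 * c * dt_echo
    return c, d

# Synthetic check: a 3 mm layer at 3000 m/s immersed in 1480 m/s water.
dt_echo = 2 * 0.003 / 3000.0
dt_trans = 0.003 / 1480.0 - 0.003 / 3000.0
c, d = cortical_properties(dt_echo, dt_trans)
```

Substituting dt_echo back confirms the algebra: c recovers 3000 m/s and d recovers 3 mm exactly in this idealized, refraction-free case.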

11. Diffraction enhanced x-ray imaging for quantitative phase contrast studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Agrawal, A. K.; Singh, B., E-mail: balwants@rrcat.gov.in; Kashyap, Y. S.

    2016-05-23

Conventional X-ray imaging based on absorption contrast permits limited visibility of features having small density and thickness variations. For imaging of weakly absorbing materials or materials possessing similar densities, a novel phase contrast imaging technique called diffraction enhanced imaging has been designed and developed at the imaging beamline of Indus-2, RRCAT, Indore. The technique provides improved visibility of interfaces and shows high contrast in the image for small density or thickness gradients in the bulk. This paper presents the basic principle, instrumentation and analysis methods for this technique. Initial results of quantitative phase retrieval carried out on various samples are also presented.

  12. Viewpoint on ISA TR84.0.02--simplified methods and fault tree analysis.

    PubMed

    Summers, A E

    2000-01-01

    ANSI/ISA-S84.01-1996 and IEC 61508 require the establishment of a safety integrity level for any safety instrumented system or safety related system used to mitigate risk. Each stage of design, operation, maintenance, and testing is judged against this safety integrity level. Quantitative techniques can be used to verify whether the safety integrity level is met. ISA-dTR84.0.02 is a technical report under development by ISA, which discusses how to apply quantitative analysis techniques to safety instrumented systems. This paper discusses two of those techniques: (1) Simplified equations and (2) Fault tree analysis.
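The simplified-equation approach reduces, for a single (1oo1) channel, to PFDavg ≈ λ_DU × TI / 2, which is then mapped onto a SIL band. A minimal sketch (the failure rate and test interval are hypothetical, and SIL 4 and out-of-range cases are omitted for brevity):

```python
def pfd_avg_1oo1(lambda_du, test_interval_h):
    """Simplified equation for a single (1oo1) channel:
    PFDavg ~= lambda_DU * TI / 2."""
    return lambda_du * test_interval_h / 2.0

def sil_from_pfd(pfd):
    """Map average probability of failure on demand to a SIL band
    (demand mode): SIL 3: [1e-4, 1e-3), SIL 2: [1e-3, 1e-2),
    SIL 1: [1e-2, 1e-1)."""
    if 1e-4 <= pfd < 1e-3:
        return 3
    if 1e-3 <= pfd < 1e-2:
        return 2
    if 1e-2 <= pfd < 1e-1:
        return 1
    return 0

# lambda_DU = 1e-6 /h with an annual proof test (8760 h).
pfd = pfd_avg_1oo1(1e-6, 8760)
```

Fault tree analysis reaches the same PFDavg figure by combining basic-event probabilities through the system's failure logic, which is why the two techniques are presented side by side in the technical report.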

  13. Microscopy techniques in flavivirus research.

    PubMed

    Chong, Mun Keat; Chua, Anthony Jin Shun; Tan, Terence Tze Tong; Tan, Suat Hoon; Ng, Mah Lee

    2014-04-01

The Flavivirus genus is composed of many medically important viruses that cause high morbidity and mortality, including Dengue and West Nile viruses. Various molecular and biochemical techniques have been developed in the endeavour to study flaviviruses. However, microscopy techniques still have irreplaceable roles in the identification of novel virus pathogens and characterization of morphological changes in virus-infected cells. Fluorescence microscopy contributes greatly to understanding fundamental viral protein localizations and virus-host protein interactions during infection. Electron microscopy remains the gold standard for visualizing ultra-structural features of virus particles and infected cells. New imaging techniques and combinatory applications are continuously being developed to push the limit of resolution and extract more quantitative data. Currently, correlative live cell imaging and high resolution three-dimensional imaging have already been achieved through the tandem use of optical and electron microscopy in analyzing biological specimens. Microscopy techniques are also used to measure protein binding affinities and determine the mobility pattern of proteins in cells. This chapter consolidates the applications of various well-established microscopy techniques in flavivirus research, and discusses how recently developed microscopy techniques can potentially help advance our understanding of these enveloped viruses. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpreting errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
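Combining independent relative uncertainty components mathematically usually means adding them in quadrature, as in GUM-style uncertainty budgets; the component values below are hypothetical, not the study's estimates:

```python
import math

def combined_relative_uncertainty(components):
    """Combine independent relative uncertainty components in
    quadrature: u_c = sqrt(u1^2 + u2^2 + ...)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical budget: organism type 25%, product matrix 15%, reading 10%.
u_total = combined_relative_uncertainty([0.25, 0.15, 0.10])  # ~31%
```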

  15. Leukotriene B4 catabolism: quantitation of leukotriene B4 and its omega-oxidation products by reversed-phase high-performance liquid chromatography.

    PubMed

    Shak, S

    1987-01-01

    LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC) described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis without prior derivatization. In this chapter, we also review the chromatographic theory we utilized to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator LTB4, or for any other lipoxygenase product of arachidonic acid.

  16. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

    Background Inflammatory mediators can serve as biomarkers for monitoring disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the levels of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified to use the LI-COR® detection system (fluorescence-based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the outcomes using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity, down to 17 pg/ml, and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily with the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints on sample sizes and/or budget. PMID:25022797
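
    The normalization-and-correlation workflow can be sketched as follows. This is a generic illustration under stated assumptions, not the study's actual processing pipeline: `normalize_by_exposure` assumes the reader reports an auto-exposure time per membrane, and the Pearson correlation mirrors the reported comparison against ELISA:

    ```python
    import numpy as np

    def normalize_by_exposure(intensity, exposure_s):
        # Divide raw fluorescence counts by the auto-exposure time
        # reported by the readout device, making spots comparable
        # across membranes imaged at different exposure settings.
        return np.asarray(intensity, float) / np.asarray(exposure_s, float)

    def pearson_r(x, y):
        # Pearson correlation coefficient, as used to compare the
        # normalized intensities with the ELISA gold standard.
        x, y = np.asarray(x, float), np.asarray(y, float)
        xc, yc = x - x.mean(), y - y.mean()
        return float((xc * yc).sum() / np.sqrt((xc ** 2).sum() * (yc ** 2).sum()))

    # Hypothetical raw counts and exposure times for two membranes:
    norm = normalize_by_exposure([1200.0, 800.0], [30.0, 20.0])
    ```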

  17. Cardiac imaging: working towards fully-automated machine analysis & interpretation

    PubMed Central

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-01-01

    Introduction Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles to the adoption of these tools in mainstream clinical practice. Expert commentary Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation. PMID:28277804

  18. Quantitative model analysis with diverse biological data: applications in developmental pattern formation.

    PubMed

    Pargett, Michael; Umulis, David M

    2013-07-15

    Mathematical modeling of transcription factor and signaling networks is widely used to understand whether and how a mechanism works, and to infer regulatory interactions that produce a model consistent with the observed data. Both of these approaches to modeling are informed by experimental data; however, much of the data available, or even acquirable, is not quantitative. Data that are not strictly quantitative cannot be used by classical, quantitative, model-based analyses that measure the difference between a measured observation and the model prediction for that observation. To bridge the model-to-data gap, a variety of techniques have been developed to measure model "fitness" and provide numerical values that can subsequently be used in model optimization or model inference studies. Here, we discuss a selection of traditional and novel techniques for transforming data of varied quality to enable quantitative comparison with mathematical models. This review is intended both to inform the use of these model analysis methods, focused on parameter estimation, and to help guide the choice of method for a given study based on the type of data available. Applying techniques such as normalization or optimal scaling may significantly improve the utility of current biological data in model-based studies and allow greater integration between disparate types of data. Copyright © 2013 Elsevier Inc. All rights reserved.
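
    As one concrete instance of the optimal-scaling idea mentioned above, a measurement known only up to a multiplicative constant (e.g. a fluorescence intensity profile) can be aligned with an absolute model prediction by a closed-form least-squares scale factor. This is a generic sketch of the technique, not the authors' implementation:

    ```python
    import numpy as np

    def optimal_scale(model, data):
        """Least-squares scale factor s minimizing ||s*model - data||^2,
        so relative data can be compared with an absolute model prediction."""
        model, data = np.asarray(model, float), np.asarray(data, float)
        return float(np.dot(model, data) / np.dot(model, model))

    def scaled_sse(model, data):
        # Model fitness after optimal scaling: sum of squared residuals
        # between the scaled prediction and the relative measurement.
        s = optimal_scale(model, data)
        return float(((s * np.asarray(model, float) - np.asarray(data, float)) ** 2).sum())
    ```

    A data set that is an exact multiple of the model prediction then scores a fitness residual of zero regardless of its arbitrary units.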

  19. Cross-platform comparison of nucleic acid hybridization: toward quantitative reference standards.

    PubMed

    Halvorsen, Ken; Agris, Paul F

    2014-11-15

    Measuring interactions between biological molecules is vitally important to both basic and applied research as well as development of pharmaceuticals. Although a wide and growing range of techniques is available to measure various kinetic and thermodynamic properties of interacting biomolecules, it can be difficult to compare data across techniques of different laboratories and personnel or even across different instruments using the same technique. Here we evaluate relevant biological interactions based on complementary DNA and RNA oligonucleotides that could be used as reference standards for many experimental systems. We measured thermodynamics of duplex formation using isothermal titration calorimetry, differential scanning calorimetry, and ultraviolet-visible (UV-vis) monitored denaturation/renaturation. These standards can be used to validate results, compare data from disparate techniques, act as a teaching tool for laboratory classes, or potentially to calibrate instruments. The RNA and DNA standards have many attractive features, including low cost, high purity, easily measurable concentrations, and minimal handling concerns, making them ideal for use as a reference material. Copyright © 2014 Elsevier Inc. All rights reserved.
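
    For duplex thermodynamics of the kind measured here, a standard two-state relation links the enthalpy, entropy, and strand concentration to the melting temperature. A minimal sketch follows; the formula is the textbook relation for non-self-complementary duplexes, and the parameter values are hypothetical illustrations, not the reference standards' measured values:

    ```python
    import math

    R = 1.987  # gas constant, cal/(mol*K)

    def duplex_tm_kelvin(dH_cal, dS_cal_per_K, total_strand_conc_M):
        """Two-state melting temperature of a non-self-complementary duplex:
        Tm = dH / (dS + R * ln(CT/4))."""
        return dH_cal / (dS_cal_per_K + R * math.log(total_strand_conc_M / 4.0))

    # Hypothetical duplex: dH = -60 kcal/mol, dS = -160 cal/(mol*K), CT = 1 uM
    tm = duplex_tm_kelvin(-60000.0, -160.0, 1e-6)
    print(round(tm - 273.15, 1))  # melting temperature in Celsius
    ```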

  1. Application of Microextraction Techniques Including SPME and MESI to the Thermal Degradation of Polymers: A Review.

    PubMed

    Kaykhaii, Massoud; Linford, Matthew R

    2017-03-04

    Here, we discuss the newly developed micro-scale, solventless sample preparation techniques SPME (Solid Phase Microextraction) and MESI (Membrane Extraction with a Sorbent Interface) as applied to the qualitative and quantitative analysis of thermal oxidative degradation products of polymers and their stabilizers. The coupling of these systems to analytical instruments is also described. Our comprehensive literature search revealed that there is no previously published review article on this topic. It is shown that these extraction techniques are valuable sample preparation tools for identifying complex series of degradation products in polymers. In general, the number of products identified by traditional headspace GC-MS (HS-GC-MS) is much lower than with SPME-GC-MS. MESI is particularly well suited to the detection of non-polar compounds, so the number of products it identifies does not match that of SPME. Its main advantage, however, is its capability for (semi-)continuous monitoring, although it is more expensive and not yet commercialized.

  2. Optimization of Native and Formaldehyde iPOND Techniques for Use in Suspension Cells.

    PubMed

    Wiest, Nathaniel E; Tomkinson, Alan E

    2017-01-01

    The isolation of proteins on nascent DNA (iPOND) technique developed by the Cortez laboratory allows a previously unparalleled ability to examine proteins associated with replicating and newly synthesized DNA in mammalian cells. Both the original, formaldehyde-based iPOND technique and a more recent derivative, accelerated native iPOND (aniPOND), have mostly been performed in adherent cell lines. Here, we describe modifications to both protocols for use with suspension cell lines. These include cell culture, pulse, and chase conditions that optimize sample recovery in both protocols using suspension cells and several key improvements to the published aniPOND technique that reduce sample loss, increase signal to noise, and maximize sample recovery. Additionally, we directly and quantitatively compare the iPOND and aniPOND protocols to test the strengths and limitations of both. Finally, we present a detailed protocol to perform the optimized aniPOND protocol in suspension cell lines. © 2017 Elsevier Inc. All rights reserved.

  3. Effects of dynamic diffraction conditions on magnetic parameter determination in a double perovskite Sr2FeMoO6 using electron energy-loss magnetic chiral dichroism.

    PubMed

    Wang, Z C; Zhong, X Y; Jin, L; Chen, X F; Moritomo, Y; Mayer, J

    2017-05-01

    Electron energy-loss magnetic chiral dichroism (EMCD) spectroscopy, which is similar to the well-established X-ray magnetic circular dichroism spectroscopy (XMCD), can determine the quantitative magnetic parameters of materials with high spatial resolution. One of the major obstacles in quantitative analysis using the EMCD technique is the relatively poor signal-to-noise ratio (SNR) compared to XMCD. Here, using the double perovskite Sr2FeMoO6 as an example, we predicted the optimal dynamical diffraction conditions, such as sample thickness, crystallographic orientation and detection aperture position, by theoretical simulations. Using the optimized conditions, we showed that the SNR of experimental EMCD spectra can be significantly improved and the error of quantitative magnetic parameters determined by the EMCD technique can be markedly lowered. Our results demonstrate that, with enhanced SNR, the EMCD technique can be a unique tool for understanding the structure-property relationship of magnetic materials, particularly for high-density magnetic recording and spintronic devices, by quantitatively determining magnetic structure and properties at the nanometer scale. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. X-Ray Spectroscopic Laboratory Experiments in Support of the X-Ray Astronomy Program

    NASA Technical Reports Server (NTRS)

    Kahn, Steven M.

    1997-01-01

    Our program is to perform a series of laboratory investigations designed to resolve significant atomic physics uncertainties that limit the interpretation of cosmic X-ray spectra. Specific goals include a quantitative characterization of Fe L-shell spectra; the development of new techniques to simulate Maxwellian plasmas using an Electron Beam Ion Trap (EBIT); and the measurement of dielectronic recombination rates for photoionized gas. New atomic calculations have also been carried out in parallel with the laboratory investigations.

  5. LANDSAT land cover analysis completed for CIRSS/San Bernardino County project

    NASA Technical Reports Server (NTRS)

    Likens, W.; Maw, K.; Sinnott, D. (Principal Investigator)

    1982-01-01

    The LANDSAT analysis carried out as part of Ames Research Center's San Bernardino County Project, one of four projects sponsored by NASA as part of the California Integrated Remote Sensing System (CIRSS) effort for generating and utilizing digital geographic data bases, is described. Topics explored include use of data-base modeling with spectral cluster data to improve LANDSAT data classification, and quantitative evaluation of several change techniques. Both 1976 and 1979 LANDSAT data were used in the project.

  6. Quantitative Functional Imaging Using Dynamic Positron Computed Tomography and Rapid Parameter Estimation Techniques

    NASA Astrophysics Data System (ADS)

    Koeppe, Robert Allen

    Positron computed tomography (PCT) is a diagnostic imaging technique that provides both three-dimensional imaging capability and quantitative measurements of local tissue radioactivity concentrations in vivo. This allows the development of non-invasive methods that employ the principles of tracer kinetics for determining physiological properties such as mass specific blood flow, tissue pH, and rates of substrate transport or utilization. A physiologically based, two-compartment tracer kinetic model was derived to mathematically describe the exchange of a radioindicator between blood and tissue. The model was adapted for use with dynamic sequences of data acquired with a positron tomograph. Rapid estimation techniques were implemented to produce functional images of the model parameters by analyzing each individual pixel sequence of the image data. A detailed analysis of the performance characteristics of three different parameter estimation schemes was performed. The analysis included examination of errors caused by statistical uncertainties in the measured data, errors in the timing of the data, and errors caused by violation of various assumptions of the tracer kinetic model. Two specific radioindicators were investigated. (18)F-fluoromethane, an inert freely diffusible gas, was used for local quantitative determinations of both cerebral blood flow and tissue:blood partition coefficient. A method was developed that did not require direct sampling of arterial blood for the absolute scaling of flow values. The arterial input concentration time course was obtained by assuming that the alveolar or end-tidal expired breath radioactivity concentration is proportional to the arterial blood concentration. The scale of the input function was obtained from a series of venous blood concentration measurements.
The method of absolute scaling using venous samples was validated in four studies, performed on normal volunteers, in which directly measured arterial concentrations were compared to those predicted from the expired air and venous blood samples. The glucose analog (18)F-3-deoxy-3-fluoro-D-glucose (3-FDG) was used for quantitating the membrane transport rate of glucose. The measured data indicated that the phosphorylation rate of 3-FDG was low enough to allow accurate estimation of the transport rate using a two compartment model.
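
    The two-compartment exchange described above can be sketched numerically as the standard one-tissue model dCt/dt = K1*Ca(t) - k2*Ct(t); the rate constants and arterial input curve below are arbitrary illustrations, not the study's fitted values:

    ```python
    import numpy as np

    def tissue_curve(K1, k2, Ca, dt):
        """Simulate the two-compartment (one-tissue) tracer model
        dCt/dt = K1*Ca(t) - k2*Ct(t) with simple Euler integration.
        K1 ~ flow x extraction; k2 = K1 / partition coefficient."""
        Ct = np.zeros_like(Ca)
        for i in range(1, len(Ca)):
            Ct[i] = Ct[i - 1] + dt * (K1 * Ca[i - 1] - k2 * Ct[i - 1])
        return Ct

    # Hypothetical arterial input: a bolus clearing exponentially
    t = np.arange(0, 120, 0.1)          # seconds
    Ca = np.exp(-t / 30.0)              # arbitrary concentration units
    Ct = tissue_curve(K1=0.5, k2=0.6, Ca=Ca, dt=0.1)
    ```

    Fitting K1 and k2 per pixel against a measured tissue curve is what produces the functional images the abstract describes.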

  7. Automatic quantitative analysis of in-stent restenosis using FD-OCT in vivo intra-arterial imaging.

    PubMed

    Mandelias, Kostas; Tsantis, Stavros; Spiliopoulos, Stavros; Katsakiori, Paraskevi F; Karnabatidis, Dimitris; Nikiforidis, George C; Kagadis, George C

    2013-06-01

    A new segmentation technique is implemented for automatic lumen area extraction and stent strut detection in intravascular optical coherence tomography (OCT) images for the purpose of quantitative analysis of in-stent restenosis (ISR). In addition, a user-friendly graphical user interface (GUI) is developed based on the employed algorithm toward clinical use. Four clinical datasets of frequency-domain OCT scans of the human femoral artery were analyzed. First, a segmentation method based on fuzzy C-means (FCM) clustering and the wavelet transform (WT) was applied toward inner luminal contour extraction. Subsequently, stent strut positions were detected by incorporating metrics derived from the local maxima of the wavelet transform into the FCM membership function. The inner lumen contour and the stent strut positions were extracted with high precision. Compared to manual segmentation by an expert physician, the automatic lumen contour delineation had an average overlap value of 0.917 ± 0.065 for all OCT images included in the study. The strut detection procedure achieved an overall accuracy of 93.80% and successfully identified 9.57 ± 0.5 struts for every OCT image. Processing time was confined to approximately 2.5 s per OCT frame. A new fast and robust automatic segmentation technique combining FCM and WT for lumen border extraction and strut detection in intravascular OCT images was designed and implemented. The proposed algorithm integrated in a GUI represents a step forward toward the employment of automated quantitative analysis of ISR in clinical practice.
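
    The fuzzy C-means step at the core of this pipeline can be sketched on 1-D pixel intensities. This is a generic textbook FCM implementation under stated assumptions (synthetic data, intensity-only features), not the authors' algorithm, which additionally couples FCM with wavelet-derived metrics:

    ```python
    import numpy as np

    def fuzzy_c_means(x, c=2, m=2.0, iters=50, seed=0):
        """Minimal fuzzy C-means on 1-D intensities: alternate between
        computing weighted cluster centers and updating fuzzy memberships."""
        rng = np.random.default_rng(seed)
        x = np.asarray(x, float)
        u = rng.random((c, x.size))
        u /= u.sum(axis=0)                      # memberships sum to 1 per pixel
        for _ in range(iters):
            w = u ** m
            centers = (w @ x) / w.sum(axis=1)   # fuzzily weighted centers
            d = np.abs(x[None, :] - centers[:, None]) + 1e-12
            u = 1.0 / (d ** (2.0 / (m - 1.0)))  # closer center -> higher membership
            u /= u.sum(axis=0)
        return centers, u

    # Two well-separated synthetic intensity populations (lumen vs wall):
    x = np.concatenate([np.full(50, 10.0), np.full(50, 200.0)])
    centers, u = fuzzy_c_means(x)
    ```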

  8. A quantitative link between face discrimination deficits and neuronal selectivity for faces in autism☆

    PubMed Central

    Jiang, Xiong; Bollich, Angela; Cox, Patrick; Hyder, Eric; James, Joette; Gowani, Saqib Ali; Hadjikhani, Nouchine; Blanz, Volker; Manoach, Dara S.; Barton, Jason J.S.; Gaillard, William D.; Riesenhuber, Maximilian

    2013-01-01

    Individuals with Autism Spectrum Disorder (ASD) appear to show a general face discrimination deficit across a range of tasks including social–emotional judgments as well as identification and discrimination. However, functional magnetic resonance imaging (fMRI) studies probing the neural bases of these behavioral differences have produced conflicting results: while some studies have reported reduced or no activity to faces in ASD in the Fusiform Face Area (FFA), a key region in human face processing, others have suggested more typical activation levels, possibly reflecting limitations of conventional fMRI techniques to characterize neuron-level processing. Here, we test the hypotheses that face discrimination abilities are highly heterogeneous in ASD and are mediated by FFA neurons, with differences in face discrimination abilities being quantitatively linked to variations in the estimated selectivity of face neurons in the FFA. Behavioral results revealed a wide distribution of face discrimination performance in ASD, ranging from typical performance to chance level performance. Despite this heterogeneity in perceptual abilities, individual face discrimination performance was well predicted by neural selectivity to faces in the FFA, estimated via both a novel analysis of local voxel-wise correlations, and the more commonly used fMRI rapid adaptation technique. Thus, face processing in ASD appears to rely on the FFA as in typical individuals, differing quantitatively but not qualitatively. These results for the first time mechanistically link variations in the ASD phenotype to specific differences in the typical face processing circuit, identifying promising targets for interventions. PMID:24179786

  9. Acousto-Optic Tunable Filter Spectroscopic Instrumentation for Quantitative Near-Ir Analysis of Organic Materials.

    NASA Astrophysics Data System (ADS)

    Eilert, Arnold James

    1995-01-01

    The utility of near-IR spectroscopy for routine quantitative analyses of a wide variety of compositional, chemical, or physical parameters of organic materials is well understood. It can be used for relatively fast and inexpensive non-destructive bulk material analysis before, during, and after processing. It has been demonstrated as being a particularly useful technique for numerous analytical applications in cereal (food and feed) science and industry. Further fulfillment of the potential of near-IR spectroscopic analysis, both in the process and laboratory environment, is reliant upon the development of instrumentation that is capable of meeting the challenges of increasingly difficult applications. One approach to the development of near-IR spectroscopic instrumentation that holds a great deal of promise is acousto-optic tunable filter (AOTF) technology. A combination of attributes offered by AOTF spectrometry, including speed, optical throughput, wavelength reproducibility, ruggedness (no-moving-parts operation) and flexibility, make it particularly desirable for numerous applications. A series of prototype (research model) acousto-optic tunable filter instruments were developed and tested in order to investigate the feasibility of the technology for quantitative near-IR spectrometry. Development included design, component procurement, assembly and/or configuration of the optical and electronic subsystems of which each functional spectrometer arrangement was comprised, as well as computer interfacing and acquisition/control software development. Investigation of this technology involved an evolution of several operational spectrometer systems, each of which offered improvements over its predecessor. Appropriate testing was conducted at various stages of development.
Demonstrations of the potential applicability of our AOTF spectrometer to quantitative process monitoring or laboratory analysis of numerous organic substances, including food materials, were performed. Lipid determination in foods by spectroscopic analysis of a solvent used after cold batch extraction and simulated supercritical fluid extraction monitoring were among the applications tested. The ultimate performance specifications of our instrument included full-range wavelength coverage from 1250 to 2400 nm (with random, segmented range, or continuous range wavelength access capability), real-time quantitative analysis rates in excess of 150 determinations per second, and full range (2 nm increment) scanning speeds of 200 milliseconds.

  10. Development of Techniques for Spent Fuel Assay – Differential Dieaway Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swinhoe, Martyn Thomas; Goodsell, Alison; Ianakiev, Kiril Dimitrov

    This report summarizes the work done under a DNDO R&D funded project on the development of the differential dieaway method to measure plutonium in spent fuel. There are large amounts of plutonium contained in spent fuel assemblies, and currently there is no way to make a quantitative non-destructive assay. This has led NA24, under the Next Generation Safeguards Initiative (NGSI), to establish a multi-year program to investigate, develop and implement measurement techniques for spent fuel. The techniques being experimentally tested by the existing NGSI project do not include any pulsed neutron active techniques. The present work covers the active neutron differential dieaway technique and has advanced the state of knowledge of this technique as well as produced a design for a practical active neutron interrogation instrument for spent fuel. Monte Carlo results from the NGSI effort show that much higher accuracy (1-2%) for the Pu content in spent fuel assemblies can be obtained with active neutron interrogation techniques than with passive techniques, and this would allow their use for nuclear material accountancy independently of any information from the operator. The main purpose of this work was to develop an active neutron interrogation technique for spent nuclear fuel.

  11. Quantifying Morphological Parameters of the Terminal Branching Units in a Mouse Lung by Phase Contrast Synchrotron Radiation Computed Tomography

    PubMed Central

    Hwang, Jeongeun; Kim, Miju; Kim, Seunghwan; Lee, Jinwon

    2013-01-01

    An effective technique of phase contrast synchrotron radiation computed tomography was established for the quantitative analysis of the microstructures in the respiratory zone of a mouse lung. Heitzman’s method was adopted for the whole-lung sample preparation, and Canny’s edge detector was used for locating the air-tissue boundaries. This technique revealed detailed morphology of the respiratory zone components, including terminal bronchioles and alveolar sacs, with a sufficiently high resolution of 1.74 µm isotropic voxel size. The technique enabled visual inspection of the respiratory zone components and comprehension of their relative positions in three dimensions. To check the method’s feasibility for quantitative imaging, morphological parameters such as diameter, surface area and volume were measured and analyzed for sixteen randomly selected terminal branching units, each consisting of a terminal bronchiole and a pair of succeeding alveolar sacs. Four types of asymmetry ratio, concerning alveolar sac mouth diameter, alveolar sac surface area, and alveolar sac volume, were measured. This is the first reported measurement of asymmetry ratios for terminal bronchioles and alveolar sacs, and it is noteworthy that an appreciable degree of branching asymmetry was observed among the alveolar sacs at the terminal end of the airway tree, even though the number of samples was still small. The series of efficient techniques developed and confirmed in this study, from sample preparation to quantification, is expected to contribute to a wider and more exact application of phase contrast synchrotron radiation computed tomography in a variety of studies. PMID:23704918
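
    A natural way to quantify the branching asymmetry reported above is a minor-to-major ratio for each paired quantity of a terminal branching unit. This definition is an assumption for illustration, not necessarily the exact ratio used in the paper:

    ```python
    def asymmetry_ratio(value_a, value_b):
        """Asymmetry ratio of a paired quantity (e.g. the volumes of the two
        alveolar sacs of one terminal branching unit): minor/major, so 1.0
        means perfect symmetry and smaller values mean stronger asymmetry."""
        lo, hi = sorted((value_a, value_b))
        return lo / hi

    # Hypothetical sac volumes (arbitrary units) for one branching unit:
    print(asymmetry_ratio(3.2, 4.0))  # -> 0.8
    ```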

  12. Label-free imaging of the native, living cellular nanoarchitecture using partial-wave spectroscopic microscopy

    PubMed Central

    Almassalha, Luay M.; Bauer, Greta M.; Chandler, John E.; Gladstein, Scott; Cherkezyan, Lusik; Stypula-Cyrus, Yolanda; Weinberg, Samuel; Zhang, Di; Thusgaard Ruhoff, Peder; Roy, Hemant K.; Subramanian, Hariharan; Chandel, Navdeep S.; Szleifer, Igal; Backman, Vadim

    2016-01-01

    The organization of chromatin is a regulator of molecular processes including transcription, replication, and DNA repair. The structures within chromatin that regulate these processes span from the nucleosomal (10-nm) to the chromosomal (>200-nm) levels, with little known about the dynamics of chromatin structure between these scales due to a lack of quantitative imaging techniques for live cells. Previous work using partial-wave spectroscopic (PWS) microscopy, a quantitative imaging technique with sensitivity to macromolecular organization between 20 and 200 nm, has shown that transformation of chromatin at these length scales is a fundamental event during carcinogenesis. As the dynamics of chromatin likely play a critical regulatory role in cellular function, it is critical to develop live-cell imaging techniques that can probe the real-time temporal behavior of the chromatin nanoarchitecture. Therefore, we developed a live-cell PWS technique that allows high-throughput, label-free study of the causal relationship between nanoscale organization and molecular function in real time. In this work, we use live-cell PWS to study the change in chromatin structure due to DNA damage and expand on the link between metabolic function and the structure of higher-order chromatin. In particular, we studied the temporal changes to chromatin during UV light exposure, show that live-cell DNA-binding dyes induce damage to chromatin within seconds, and demonstrate a direct link between higher-order chromatin structure and mitochondrial membrane potential. Because biological function is tightly paired with structure, live-cell PWS is a powerful tool to study the nanoscale structure–function relationship in live cells. PMID:27702891

  13. A Chromosome-Scale Assembly of the Bactrocera cucurbitae Genome Provides Insight to the Genetic Basis of white pupae

    PubMed Central

    Sim, Sheina B.; Geib, Scott M.

    2017-01-01

    Genetic sexing strains (GSS) used in sterile insect technique (SIT) programs are textbook examples of how classical Mendelian genetics can be directly implemented in the management of agricultural insect pests. Although the foundations of traditionally developed GSS are single-locus, autosomal recessive traits, their genetic basis is largely unknown. With the advent of modern genomic techniques, the genetic basis of sexing traits in GSS can now be further investigated. This study is the first of its kind to integrate traditional genetic techniques with emerging genomics to characterize a GSS, using the tephritid fruit fly pest Bactrocera cucurbitae as a model. These techniques include whole-genome sequencing, the development of a mapping population and linkage map, and quantitative trait analysis. The experiment designed to map the genetic sexing trait in B. cucurbitae, white pupae (wp), also enabled the generation of a chromosome-scale genome assembly by integrating the linkage map with the assembly. Quantitative trait loci analysis revealed SNP loci near position 42 MB on chromosome 3 to be tightly linked to wp. Gene annotation and synteny analysis show a near perfect relationship between chromosomes in B. cucurbitae and Muller elements A–E in Drosophila melanogaster. This chromosome-scale genome assembly is complete, has high contiguity, was generated using minimal input DNA, and will be used to further characterize the genetic mechanisms underlying wp. Knowledge of the genetic basis of genetic sexing traits can be used to improve SIT in this species and expand it to other economically important Diptera. PMID:28450369

  14. Thermographic Imaging of Material Loss in Boiler Water-Wall Tubing by Application of Scanning Line Source

    NASA Technical Reports Server (NTRS)

    Cramer, K. Elliott; Winfree, William P.

    2000-01-01

    Localized wall thinning due to corrosion in utility boiler water-wall tubing is a significant inspection concern for boiler operators. Historically, conventional ultrasonics has been used for inspection of these tubes. This technique has proven to be very manpower- and time-intensive. This has resulted in a spot-check approach to inspections, documenting thickness measurements over a relatively small percentage of the total boiler wall area. NASA Langley Research Center has developed a thermal NDE technique designed to image and quantitatively characterize the amount of material thinning present in steel tubing. The technique involves the movement of a thermal line source across the outer surface of the tubing, followed by an infrared imager at a fixed distance behind the line source. Quantitative images of the material loss due to corrosion are reconstructed from measurements of the induced surface temperature variations. This paper will present a discussion of the development of the thermal imaging system as well as the techniques used to reconstruct images of flaws. The application of the thermal line source coupled with the analysis technique represents a significant improvement in inspection speed for large structures such as boiler water-walls. A theoretical basis for the technique will be presented which explains its quantitative nature. Further, a dynamic calibration system will be presented that allows the extraction of thickness information from the temperature data. Additionally, the results of applying this technology to actual water-wall tubing samples and in situ inspections will be presented.

  15. Order reduction for a model of marine bacteriophage evolution

    NASA Astrophysics Data System (ADS)

    Pagliarini, Silvia; Korobeinikov, Andrei

    2017-02-01

    A typical mechanistic model of viral evolution necessarily includes several time scales that can differ by orders of magnitude. This diversity of time scales makes analysis of such models difficult, so reducing the order of a model is highly desirable. A typical approach to such slow-fast (or singularly perturbed) systems is the time-scale separation technique, for which constructing the so-called quasi-steady-state approximation is the usual first step. While this technique is commonly applied, in some cases its straightforward application can lead to unsatisfactory results. In this paper we construct the quasi-steady-state approximation for a model of the evolution of marine bacteriophages based on the Beretta-Kuang model. We show that for this particular model the quasi-steady-state approximation is able to produce only a qualitative, but not a quantitative, fit.
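    The time-scale separation idea can be illustrated on a toy slow-fast system (invented for demonstration, not the Beretta-Kuang phage model): the fast variable is slaved to the slow one, and the quasi-steady-state reduction replaces it by its equilibrium value.

```python
import numpy as np
from scipy.integrate import solve_ivp

EPS = 1e-3  # small parameter separating slow (x) and fast (y) time scales

def full(t, z):
    # singularly perturbed system: eps * dy/dt = x - y drives y toward x quickly
    x, y = z
    return [-x * y, (x - y) / EPS]

def reduced(t, z):
    # quasi-steady-state approximation: set x - y = 0, i.e. y ~= x
    x = z[0]
    return [-x * x]

sol_full = solve_ivp(full, (0, 5), [1.0, 1.0], method="Radau",
                     rtol=1e-8, atol=1e-10, dense_output=True)
sol_red = solve_ivp(reduced, (0, 5), [1.0],
                    rtol=1e-8, atol=1e-10, dense_output=True)

t = np.linspace(0.0, 5.0, 50)
err = np.max(np.abs(sol_full.sol(t)[0] - sol_red.sol(t)[0]))
```

    For this toy system the reduction error is O(EPS); the paper's point is precisely that such agreement is not guaranteed for every model.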

  16. Multimode nonlinear optical imaging of the dermis in ex vivo human skin based on the combination of multichannel mode and Lambda mode.

    PubMed

    Zhuo, Shuangmu; Chen, Jianxin; Luo, Tianshu; Zou, Dingsong

    2006-08-21

    A multimode nonlinear optical imaging technique based on the combination of multichannel mode and Lambda mode is developed to investigate human dermis. Our findings show that this technique not only improves the image contrast of the structural proteins of the extracellular matrix (ECM) but also provides an image-guided spectral analysis method to identify both cellular and ECM intrinsic components, including collagen, elastin, NAD(P)H and flavin. By the combined use of multichannel mode and Lambda mode in tandem, the in-depth two-photon-excited fluorescence (TPEF) and second-harmonic generation (SHG) imaging and the depth-dependent decay of TPEF/SHG signals offer a sensitive tool for obtaining quantitative structural and biochemical tissue information. These results suggest that the technique has the potential to provide more accurate information for determining tissue physiological and pathological states.

  17. Imaging challenges in biomaterials and tissue engineering

    PubMed Central

    Appel, Alyssa A.; Anastasio, Mark A.; Larson, Jeffery C.; Brey, Eric M.

    2013-01-01

    Biomaterials are employed in the fields of tissue engineering and regenerative medicine (TERM) in order to enhance the regeneration or replacement of tissue function and/or structure. The unique environments resulting from the presence of biomaterials, cells, and tissues result in distinct challenges in regards to monitoring and assessing the results of these interventions. Imaging technologies for three-dimensional (3D) analysis have been identified as a strategic priority in TERM research. Traditionally, histological and immunohistochemical techniques have been used to evaluate engineered tissues. However, these methods do not allow for an accurate volume assessment, are invasive, and do not provide information on functional status. Imaging techniques are needed that enable non-destructive, longitudinal, quantitative, and three-dimensional analysis of TERM strategies. This review focuses on evaluating the application of available imaging modalities for assessment of biomaterials and tissue in TERM applications. Included is a discussion of limitations of these techniques and identification of areas for further development. PMID:23768903

  18. Identification of Aroma Compounds of Lamiaceae Species in Turkey Using the Purge and Trap Technique

    PubMed Central

    Sonmezdag, Ahmet Salih; Kelebek, Hasim; Selli, Serkan

    2017-01-01

    The present research was planned to characterize the aroma composition of important members of the Lamiaceae family such as Salvia officinalis, Lavandula angustifolia and Mentha asiatica. Aroma components of S. officinalis, L. angustifolia and M. asiatica were extracted with the purge and trap technique with dichloromethane and analyzed with the gas chromatography–mass spectrometry (GC–MS) technique. A total of 23, 33 and 33 aroma compounds were detected in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively, including acids, alcohols, aldehydes, esters, hydrocarbons and terpenes. Terpene compounds were both qualitatively and quantitatively the major chemical group among the identified aroma compounds, followed by esters. The main terpene compounds were 1,8-cineole, sabinene and linalool in Salvia officinalis, Lavandula angustifolia and Mentha asiatica, respectively. Among esters, linalyl acetate was the most important ester compound and the only one detected in all samples. PMID:28231089

  20. Analysis of dense-medium light scattering with applications to corneal tissue: experiments and Monte Carlo simulations.

    PubMed

    Kim, K B; Shanyfelt, L M; Hahn, D W

    2006-01-01

    Dense-medium scattering is explored in the context of providing a quantitative measurement of turbidity, with specific application to corneal haze. A multiple-wavelength scattering technique is proposed to make use of two-color scattering response ratios, thereby providing a means for data normalization. A combination of measurements and simulations is reported to assess this technique, including light-scattering experiments for a range of polystyrene suspensions. Monte Carlo (MC) simulations were performed using a multiple-scattering algorithm based on full Mie scattering theory. The simulations were in excellent agreement with the polystyrene suspension experiments, thereby validating the MC model. The MC model was then used to simulate multiwavelength scattering in a corneal tissue model. Overall, the proposed multiwavelength scattering technique appears to be a feasible approach to quantify dense-medium scattering such as the manifestation of corneal haze, although more complex modeling of keratocyte scattering and animal studies are necessary.
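    The normalization idea behind the two-color ratio can be sketched with a toy saturating response (not the paper's Mie/Monte Carlo model): the ratio of signals at two wavelengths cancels the unknown source power and collection gain while still varying with turbidity.

```python
import numpy as np

def scattered_signal(tau, wavelength_nm, gain):
    """Toy dense-medium response: saturating in optical thickness `tau`,
    with a Rayleigh-like lambda^-4 cross-section. `gain` lumps the unknown
    source power and collection efficiency."""
    sigma = (wavelength_nm / 500.0) ** -4
    return gain * (1.0 - np.exp(-tau * sigma))

def two_color_ratio(tau, lam1, lam2, gain):
    # the gain factor cancels in the ratio, so no absolute calibration is needed
    return scattered_signal(tau, lam1, gain) / scattered_signal(tau, lam2, gain)
```

    In this toy model the ratio decreases monotonically with turbidity, so it can serve as a gain-free turbidity index; the real corneal-haze problem requires the full multiple-scattering simulation described in the abstract.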

  1. Learning Style Differences in Adult Students' Perceptions of the Effectiveness of Classroom Techniques for Teaching Quantitative Skills

    ERIC Educational Resources Information Center

    Deever, Walter Thomas

    2012-01-01

    More than half of adults in the USA have quantitative literacy ratings at or below a basic level. This lack of literacy often becomes a barrier to employability. To overcome this barrier, adults are returning to college to improve their quantitative skills and complete an undergraduate education, often through an accelerated degree program. A…

  2. Trainee and Instructor Task Quantification: Development of Quantitative Indices and a Predictive Methodology.

    ERIC Educational Resources Information Center

    Whaton, George R.; And Others

    As the first step in a program to develop quantitative techniques for prescribing the design and use of training systems, the present study attempted: to compile an initial set of quantitative indices, to determine whether these indices could be used to describe a sample of trainee tasks and differentiate among them, to develop a predictive…

  3. Quantitative MRI in refractory temporal lobe epilepsy: relationship with surgical outcomes

    PubMed Central

    Bonilha, Leonardo

    2015-01-01

    Medically intractable temporal lobe epilepsy (TLE) remains a serious health problem. Across treatment centers, up to 40% of patients with TLE will continue to experience persistent postoperative seizures at 2-year follow-up. It is unknown why such a large number of patients continue to experience seizures despite being suitable candidates for resective surgery. Preoperative quantitative MRI techniques may provide useful information on why some patients continue to experience disabling seizures, and may enable the development of prognostic markers of surgical outcome. In this article, we provide an overview of how quantitative MRI morphometric and diffusion tensor imaging (DTI) data have improved the understanding of brain structural alterations in patients with refractory TLE. We subsequently review the studies that have applied quantitative structural imaging techniques to identify the neuroanatomical factors that are most strongly related to a poor postoperative prognosis. In summary, quantitative imaging studies strongly suggest that TLE is a disorder affecting a network of neurobiological systems, characterized by multiple and inter-related limbic and extra-limbic network abnormalities. The relationship between brain alterations and postoperative outcome is less consistent, but there is emerging evidence suggesting that seizures are less likely to remit with surgery when presurgical abnormalities are observed in the connectivity supporting brain regions serving as network nodes located outside the resected temporal lobe. Future work, possibly harnessing the potential of multimodal imaging approaches, may further elucidate the etiology of persistent postoperative seizures in patients with refractory TLE. Furthermore, quantitative imaging techniques may be explored to provide individualized measures of the likelihood of postoperative seizure freedom. PMID:25853080

  4. Use of the learning conversation improves instructor confidence in life support training: An open randomised controlled cross-over trial comparing teaching feedback mechanisms.

    PubMed

    Baldwin, Lydia J L; Jones, Christopher M; Hulme, Jonathan; Owen, Andrew

    2015-11-01

    Feedback is vital for the effective delivery of skills-based education. We sought to compare the sandwich technique and learning conversation structured methods of feedback delivery in competency-based basic life support (BLS) training. Open randomised crossover study undertaken between October 2014 and March 2015 at the University of Birmingham, United Kingdom. Six hundred and forty healthcare students undertaking a European Resuscitation Council (ERC) BLS course were enrolled, each of whom was randomised to receive teaching using either the sandwich technique or the learning conversation. Fifty-eight instructors were randomised to initially teach using either the learning conversation or sandwich technique, before crossing over and teaching with the alternative technique after a pre-defined time period. Outcome measures included skill acquisition as measured by an end-of-course competency assessment, instructors' perception of teaching with each feedback technique and candidates' perception of the feedback they were provided with. Scores assigned to use of the learning conversation by instructors were significantly more favourable than for the sandwich technique across all but two assessed domains relating to instructor perception of the feedback technique, including all skills-based domains. No difference was seen in either assessment pass rates (80.9% sandwich technique vs. 77.2% learning conversation; OR 1.2, 95% CI 0.85-1.84; p=0.29) or any domain relating to candidates' perception of their teaching technique. This is the first direct comparison of two feedback techniques in clinical medical education using both quantitative and qualitative methodology. The learning conversation is preferred by instructors providing competency-based life support training and is perceived to favour skills acquisition. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. Quantitative study of Xanthosoma violaceum leaf surfaces using RIMAPS and variogram techniques.

    PubMed

    Favret, Eduardo A; Fuentes, Néstor O; Molina, Ana M

    2006-08-01

    Two new imaging techniques (rotated image with maximum averaged power spectrum (RIMAPS) and variogram) are presented for the study and description of leaf surfaces. Xanthosoma violaceum was analyzed to illustrate the characteristics of both techniques. Both techniques produce a quantitative description of leaf surface topography. RIMAPS combines digitized-image rotation with the Fourier transform, and is used to detect pattern orientation and characteristics of surface topography. The variogram relates the mathematical variance of a surface to the area of the sample window observed, giving the typical scale lengths of the surface patterns. RIMAPS detects the morphological variations of the surface topography pattern between fresh and dried (herbarium) samples of the leaf. The variogram method finds the characteristic dimensions of the leaf microstructure, i.e., cell length, papillae diameter, etc., showing that there are no significant differences between dry and fresh samples. The results obtained show the robustness of RIMAPS and variogram analyses in detecting, distinguishing, and characterizing leaf surfaces, as well as in giving scale lengths. Both techniques are tools for the biologist to study variations of the leaf surface when different patterns are present. The use of RIMAPS and variogram opens a wide spectrum of possibilities by providing a systematic, quantitative description of the leaf surface topography.
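    The variogram idea, variance as a function of the observation window, can be sketched as a windowed-variance curve (a simplified illustration on a synthetic surface, not the authors' exact estimator); the characteristic scale of the pattern appears where the curve stops growing.

```python
import numpy as np

def variogram_curve(surface, sizes):
    """Mean within-window variance of `surface` as a function of square
    window side length, tiling the image with non-overlapping windows."""
    out = []
    h, w_total = surface.shape
    for w in sizes:
        vals = [surface[i:i + w, j:j + w].var()
                for i in range(0, h - w + 1, w)
                for j in range(0, w_total - w + 1, w)]
        out.append(float(np.mean(vals)))
    return out

# synthetic "leaf surface" with a periodic microstructure of period 32 pixels
x = np.arange(128)
X, Y = np.meshgrid(x, x)
surface = np.sin(2 * np.pi * X / 32.0) + np.sin(2 * np.pi * Y / 32.0)
curve = variogram_curve(surface, [2, 8, 32])
```

    For windows much smaller than the pattern period the variance is small; it grows with window size and saturates once a window spans a full period, which is how the technique reads off scale lengths such as cell size.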

  6. A relative quantitative assessment of myocardial perfusion by first-pass technique: animal study

    NASA Astrophysics Data System (ADS)

    Chen, Jun; Zhang, Zhang; Yu, Xuefang; Zhou, Kenneth J.

    2015-03-01

    The purpose of this study is to quantitatively assess myocardial perfusion by the first-pass technique in a swine model. Numerous techniques based on the analysis of Computed Tomography (CT) Hounsfield Unit (HU) density have emerged. Although these methods have been proposed as able to assess haemodynamically significant coronary artery stenosis, their limitations are recognized, and there is still a need to develop new techniques. Experiments were performed on five closed-chest swine. Balloon catheters were placed into the coronary artery to simulate different degrees of luminal stenosis. Myocardial Blood Flow (MBF) was measured using the colored microsphere technique. Fractional Flow Reserve (FFR) was measured using a pressure wire. CT examinations were performed twice during the first-pass phase under adenosine-stress conditions. CT HU Density (HUDCT) and CT HU Density Ratio (HUDRCT) were calculated using the acquired CT images. Our study shows that HUDRCT has a good (y=0.07245+0.09963x, r2=0.898) correlation with MBF and FFR. In receiver operating characteristic (ROC) curve analyses, HUDRCT provides excellent diagnostic performance for the detection of significant ischemia during adenosine stress as defined by FFR, indicated by an Area Under the Curve (AUC) value of 0.927. HUDRCT has the potential to be developed into a useful indicator for quantitative assessment of myocardial perfusion.
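    The AUC figure quoted above comes from a standard ROC analysis. As an illustration (with invented HUDRCT values and labels, not the study's data), the AUC can be computed directly from its Mann-Whitney interpretation: the probability that a randomly chosen ischemic case scores higher than a randomly chosen non-ischemic one.

```python
import numpy as np

def roc_auc(scores, labels):
    """ROC AUC via the Mann-Whitney U statistic; ties count half."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

# hypothetical HUDRCT values with FFR-defined ischemia labels (1 = ischemic)
scores = [0.12, 0.18, 0.22, 0.35, 0.41, 0.47]
labels = [0, 0, 1, 0, 1, 1]
auc = roc_auc(scores, labels)
```

    An AUC of 1.0 means the index separates the two groups perfectly; 0.5 means it carries no diagnostic information.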

  7. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique for the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as for biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  8. Automated Quantitative Nuclear Cardiology Methods

    PubMed Central

    Motwani, Manish; Berman, Daniel S.; Germano, Guido; Slomka, Piotr J.

    2016-01-01

    Quantitative analysis of SPECT and PET has become a major part of nuclear cardiology practice. Current software tools can automatically segment the left ventricle, quantify function, establish myocardial perfusion maps and estimate global and local measures of stress/rest perfusion – all with minimal user input. State-of-the-art automated techniques have been shown to offer high diagnostic accuracy for detecting coronary artery disease, as well as predict prognostic outcomes. This chapter briefly reviews these techniques, highlights several challenges and discusses the latest developments. PMID:26590779

  9. Analysis of Ergot Alkaloids

    PubMed Central

    Crews, Colin

    2015-01-01

    The principles and application of established and newer methods for the quantitative and semi-quantitative determination of ergot alkaloids in food, feed, plant materials and animal tissues are reviewed. The techniques of sampling, extraction, clean-up, detection, quantification and validation are described. The major procedures for ergot alkaloid analysis comprise liquid chromatography with tandem mass spectrometry (LC-MS/MS) and liquid chromatography with fluorescence detection (LC-FLD). Other methods based on immunoassays are under development and variations of these and minor techniques are available for specific purposes. PMID:26046699

  10. Hyper-spectrum scanning laser optical tomography

    NASA Astrophysics Data System (ADS)

    Chen, Lingling; Li, Guiye; Li, Yingchao; Liu, Lina; Liu, Ang; Hu, Xuejuan; Ruan, Shuangchen

    2018-02-01

    We describe a quantitative fluorescence projection tomography technique which measures the three-dimensional fluorescence spectrum in biomedical samples up to several millimeters in size. This is achieved by acquiring a series of hyperspectral images, using a laser scanning scheme, at different projection angles. We demonstrate that this technique provides a quantitative measure of the fluorescence signal by comparing the spectrum and intensity profile of a fluorescent bead phantom, and also demonstrate its application to differentiating an extrinsic label from autofluorescence in a mouse embryo.

  11. Analysis of defect structure in silicon. Characterization of samples from UCP ingot 5848-13C

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Guyer, T.; Stringfellow, G. B.

    1982-01-01

    Statistically significant quantitative structural imperfection measurements were made on samples from ubiquitous crystalline process (UCP) Ingot 5848-13C. Important trends were noticed among the measured data, cell efficiency, and diffusion length. Grain boundary substructure appears to have an important effect on the conversion efficiency of solar cells made from Semix material. Quantitative microscopy measurements give statistically significant information compared with other microanalytical techniques. A surface-preparation technique to obtain proper contrast of structural defects suitable for QTM analysis was perfected.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carla J. Miller

    This report summarizes a literature review based on previous work performed at the Idaho National Laboratory studying the Three Mile Island 2 (TMI-2) nuclear reactor accident, specifically the melted fuel debris. The purpose of the literature review was to document prior published work that supports the feasibility of the analytical techniques developed to provide quantitative results on the make-up of the fuel and reactor component debris located inside and outside the containment. The quantitative analysis provides a technique for performing nuclear fuel accountancy measurements.

  13. Development of a Fourier transform infrared spectroscopy coupled to UV-Visible analysis technique for aminosides and glycopeptides quantitation in antibiotic locks.

    PubMed

    Sayet, G; Sinegre, M; Ben Reguiga, M

    2014-01-01

    The antibiotic lock technique maintains catheters' sterility in high-risk patients on long-term parenteral nutrition. In our institution, vancomycin, teicoplanin, amikacin and gentamicin locks are prepared in the pharmaceutical department. In order to ensure patient safety and to comply with regulatory requirements, antibiotic locks are submitted to qualitative and quantitative assays prior to their release. The aim of this study was to develop an alternative quantitation technique for each of these 4 antibiotics, using Fourier transform infrared (FTIR) spectroscopy coupled to UV-Visible spectroscopy, and to compare results to HPLC or immunochemistry assays. Prevalidation studies established the spectroscopic conditions used for antibiotic lock quantitation: FTIR/UV combinations were used for amikacin (1091-1115cm(-1) and 208-224nm), vancomycin (1222-1240cm(-1) and 276-280nm), and teicoplanin (1226-1230cm(-1) and 278-282nm). Gentamicin was quantified with FTIR only (1045-1169cm(-1) and 2715-2850cm(-1)) due to interference in the UV domain from parabens, preservatives present in the commercial brand used to prepare the locks. For all antibiotic locks, the method was linear (R(2)=0.996 to 0.999), accurate, repeatable (intraday RSD%: 2.9 to 7.1%; inter-day RSD%: 2.9 to 5.1%) and precise. Compared with the reference methods, the FTIR/UV method was tightly correlated (Pearson factor: 97.4 to 99.9%) and did not show significant differences in recovery determinations. We developed a new, simple, reliable analysis technique for antibiotic quantitation in locks using an original combination of FTIR and UV analysis, allowing rapid identification and quantification of the studied antibiotics. Copyright © 2013 Elsevier Masson SAS. All rights reserved.
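    The validation figures quoted above (linearity R² and repeatability RSD%) follow from standard calibration statistics. A sketch with invented calibration data (not the study's measurements):

```python
import numpy as np

# hypothetical calibration standards for one antibiotic (mg/mL vs. instrument response)
conc = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
resp = np.array([0.102, 0.199, 0.405, 0.801, 1.598])

# linearity: least-squares fit and coefficient of determination R^2
slope, intercept = np.polyfit(conc, resp, 1)
pred = slope * conc + intercept
r2 = 1.0 - np.sum((resp - pred) ** 2) / np.sum((resp - resp.mean()) ** 2)

# intraday repeatability: relative standard deviation (%) of replicate assays of one lock
replicates = np.array([7.9, 8.1, 8.0, 8.3, 7.8])
rsd_pct = 100.0 * replicates.std(ddof=1) / replicates.mean()
```

    Acceptance criteria like those in the abstract then reduce to simple thresholds on `r2` and `rsd_pct` for each antibiotic.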

  14. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.
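    Velocity vectors of the kind described can be estimated by simple block matching between consecutive bone-image frames; the sketch below (an illustrative exhaustive-search matcher, not the authors' algorithm) recovers the integer displacement of a local region, which divided by the frame interval gives a velocity vector for the map.

```python
import numpy as np

def block_displacement(prev, curr, search=3):
    """Find the integer (dy, dx) within +/- `search` pixels that minimizes the
    sum-of-squared differences between `prev` and the back-shifted `curr`."""
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            shifted = np.roll(curr, (-dy, -dx), axis=(0, 1))
            err = np.sum((prev - shifted) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# demo: recover a known shift between two synthetic frames
rng = np.random.default_rng(0)
prev = rng.random((32, 32))
curr = np.roll(prev, shift=(2, 1), axis=(0, 1))
dy, dx = block_displacement(prev, curr)
```

    Repeating this over a grid of local windows produces the kind of velocity-vector map used above to distinguish rib motion patterns.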

  15. Development of inspection techniques for quantitatively measuring surface contamination on SRM hardware

    NASA Technical Reports Server (NTRS)

    Law, R. D.

    1989-01-01

    A contaminant is any material or substance which is potentially undesirable or which may adversely affect any part, component, or assembly. Contamination control of SRM hardware surfaces is a serious concern, for both Thiokol and NASA, with particular concern for contaminants which may adversely affect bonding surfaces. The purpose of this study is to develop laboratory analytical techniques which will make it possible to certify the cleanliness of any designated surface, with special focus on particulates (dust, dirt, lint, etc.), oils (hydrocarbons, silicones, plasticizers, etc.), and greases (HD-2, fluorocarbon grease, etc.). The hardware surfaces of concern will include D6AC steel, aluminum alloys, anodized aluminum alloys, glass/phenolic, carbon/phenolic, NBR/asbestos-silica, and EPDM rubber.

  16. Nuclear Resonance Fluorescence for Materials Assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quiter, Brian J.; Ludewigt, Bernhard; Mozin, Vladimir

    This paper discusses the use of nuclear resonance fluorescence (NRF) techniques for the isotopic and quantitative assaying of radioactive material. Potential applications include age-dating of an unknown radioactive source, pre- and post-detonation nuclear forensics, and safeguards for nuclear fuel cycles. Examples of age-dating a strong radioactive source and assaying a spent fuel pin are discussed. The modeling work has been performed with the Monte Carlo radiation transport computer code MCNPX, and the capability to simulate NRF has been added to the code. Discussed are the limitations in MCNPX's photon transport physics for accurately describing photon scattering processes that are important contributions to the background and that impact the applicability of the NRF assay technique.

  17. Crop identification technology assessment for remote sensing (CITARS). Volume 10: Interpretation of results

    NASA Technical Reports Server (NTRS)

    Bizzell, R. M.; Feiveson, A. H.; Hall, F. G.; Bauer, M. E.; Davis, B. J.; Malila, W. A.; Rice, D. P.

    1975-01-01

    The CITARS was an experiment designed to quantitatively evaluate crop identification performance for corn and soybeans in various environments using a well-defined set of automatic data processing (ADP) techniques. Each technique was applied to data acquired to recognize and estimate proportions of corn and soybeans. The CITARS documentation summarizes, interprets, and discusses the crop identification performances obtained using (1) different ADP procedures; (2) a linear versus a quadratic classifier; (3) prior probability information derived from historic data; (4) local versus nonlocal recognition training statistics and the associated use of preprocessing; (5) multitemporal data; (6) classification bias and mixed pixels in proportion estimation; and (7) data with different site characteristics, including crop, soil, atmospheric effects, and stages of crop maturity.

  18. Five-Factor Model personality disorder prototypes: a review of their development, validity, and comparison to alternative approaches.

    PubMed

    Miller, Joshua D

    2012-12-01

    In this article, the development of Five-Factor Model (FFM) personality disorder (PD) prototypes for the assessment of DSM-IV PDs is reviewed, as well as subsequent procedures for scoring individuals' FFM data against these PD prototypes, including similarity scores and simple additive counts based on a quantitative prototype-matching methodology. Both techniques, which yield very strongly correlated scores, demonstrate convergent and discriminant validity and provide clinically useful information with regard to various forms of functioning. The techniques described here for use with FFM data are quite different from the prototype-matching methods used elsewhere. © 2012 The Author. Journal of Personality © 2012, Wiley Periodicals, Inc.
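    As a hedged illustration of the two scoring procedures named above (not the published algorithms; the field typically uses double-entry/intraclass correlations and expert-rated 30-facet prototypes, and all profiles below are invented), one can sketch a correlation-based similarity score and a simple additive count:

```python
import numpy as np

def prototype_similarity(profile, prototype):
    """Similarity score: correlation between an individual's facet profile
    and a PD prototype (a Pearson stand-in for the double-entry method)."""
    return float(np.corrcoef(profile, prototype)[0, 1])

def additive_count(profile, prototype, elevation=4.0):
    """Simple count technique: sum the individual's scores on the facets
    the prototype rates as elevated (at or above `elevation`)."""
    mask = np.asarray(prototype) >= elevation
    return float(np.asarray(profile)[mask].sum())

# demo with hypothetical 4-facet profiles (real FFM profiles have 30 facets)
sim = prototype_similarity([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
count = additive_count([4, 3, 2, 1], [5.0, 4.0, 2.0, 1.0])
```

    The abstract's observation that the two techniques correlate very strongly reflects the fact that both are dominated by elevations on the same prototype-defining facets.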

  19. You can run, you can hide: The epidemiology and statistical mechanics of zombies

    NASA Astrophysics Data System (ADS)

    Alemi, Alexander A.; Bierbaum, Matthew; Myers, Christopher R.; Sethna, James P.

    2015-11-01

    We use a popular fictional disease, zombies, in order to introduce techniques used in modern epidemiology modeling, and ideas and techniques used in the numerical study of critical phenomena. We consider variants of zombie models, from fully connected continuous time dynamics to a full scale exact stochastic dynamic simulation of a zombie outbreak on the continental United States. Along the way, we offer a closed form analytical expression for the fully connected differential equation, and demonstrate that the single person per site two dimensional square lattice version of zombies lies in the percolation universality class. We end with a quantitative study of the full scale US outbreak, including the average susceptibility of different geographical regions.
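    The fully connected continuous-time dynamics mentioned above can be sketched as a deterministic SZR mass-action system (susceptible, zombie, removed), with bites converting S to Z and kills converting Z to R; the rate constants below are illustrative values, not the paper's fitted parameters.

```python
import numpy as np
from scipy.integrate import solve_ivp

BETA, KAPPA = 1.0e-3, 0.8e-3   # bite and kill rates per contact (illustrative)

def szr(t, y):
    s, z, r = y
    return [-BETA * s * z,            # susceptibles bitten
            (BETA - KAPPA) * s * z,   # zombies created minus destroyed
            KAPPA * s * z]            # destroyed zombies enter the removed class

# one zombie in a population of 1000 susceptibles
sol = solve_ivp(szr, (0, 50), [1000.0, 1.0, 0.0], rtol=1e-8)
```

    Because the three rates sum to zero, total population is conserved; with BETA > KAPPA the zombies win in the deterministic limit, which is the mean-field backdrop for the paper's stochastic and lattice variants.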

  20. Noninvasive imaging of bone microarchitecture

    PubMed Central

    Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila

    2015-01-01

    The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043

  1. Non-intrusive flow measurements on a reentry vehicle

    NASA Technical Reports Server (NTRS)

    Miles, R. B.; Satavicca, D. A.; Zimmermann, G. M.

    1983-01-01

    This study evaluates the utility of various non-intrusive techniques for measuring the flow field on the windward side of the Space Shuttle or a similar re-entry vehicle. Included are linear (Rayleigh, Raman, Mie, Laser Doppler Velocimetry, Resonant Doppler Velocimetry) and nonlinear (Coherent Anti-Stokes Raman, Laser Induced Fluorescence) light scattering, electron beam fluorescence, thermal emission, and mass spectroscopy. Flow field properties are taken from a nonequilibrium flow model by Shinn, Moss, and Simmonds at NASA Langley. Conclusions are, when possible, based on quantitative scaling of known laboratory results to the projected conditions. Detailed discussions with researchers in the field contributed further to these conclusions and provided valuable insights into the experimental feasibility of each technique.

  2. Quantitative X-ray dark-field and phase tomography using single directional speckle scanning technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongchang, E-mail: hongchang.wang@diamond.ac.uk; Kashyap, Yogesh; Sawhney, Kawal

    2016-03-21

    X-ray dark-field contrast tomography can provide important supplementary information about the interior of a sample beyond conventional absorption tomography. Recently, the X-ray speckle-based technique has been proposed to provide qualitative two-dimensional dark-field imaging with a simple experimental arrangement. In this letter, we deduce a relationship between the second moment of the scattering angle distribution and the cross-correlation degradation of speckle, and establish a quantitative basis for X-ray dark-field tomography using the single directional speckle scanning technique. In addition, the phase contrast images can be simultaneously retrieved, permitting tomographic reconstruction, which yields enhanced contrast in weakly absorbing materials. Such a complementary tomography technique allows systematic investigation of complex samples containing both soft and hard materials.
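    The letter's central quantity, the degradation of cross-correlation between reference and sample speckle patterns, can be sketched numerically. The following is a minimal illustration, not the authors' formulation; the function names and the use of the Pearson coefficient as the correlation measure are assumptions:

```python
import math

def pearson_correlation(a, b):
    """Normalized cross-correlation (Pearson coefficient) of two
    equal-length speckle intensity traces."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def dark_field_signal(reference, sample):
    """Dark-field contrast as loss of speckle correlation (1 - rho):
    small-angle scattering in the sample blurs the speckle pattern,
    lowering the correlation and raising the signal."""
    return 1.0 - pearson_correlation(reference, sample)

# Toy speckle trace and a blurred copy (3-point moving average)
ref = [1, 5, 2, 8, 3, 9, 1, 7, 2, 6]
blurred = [sum(ref[max(0, i - 1):i + 2]) / len(ref[max(0, i - 1):i + 2])
           for i in range(len(ref))]

print(dark_field_signal(ref, ref))      # identical patterns: signal ~ 0
print(dark_field_signal(ref, blurred))  # blurred pattern: positive signal
```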

  3. A four-quadrant sequential streak technique to evaluate Campylobacter selective broths for suppressing background flora in broiler carcass rinses

    USDA-ARS?s Scientific Manuscript database

    The ecometric technique is a semi-quantitative scoring method used in the quality control of culture media in microbiology laboratories. This technique involves inoculation with defined populations of a specific culture onto solid media via a standardized chronological streaking technique, leading ...

  4. Quantitative CT: technique dependence of volume estimation on pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2012-03-01

    Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
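    The two performance measures compared across protocols, accuracy (bias) and precision (variance), can be summarized in a few lines. A hypothetical sketch (the function name, phantom volume, and estimate values are invented for illustration; the study's actual comparison used a generalized estimating equation analysis):

```python
import statistics

def volume_bias_and_precision(estimates_mm3, true_volume_mm3):
    """Percent bias (accuracy) and coefficient of variation (precision)
    of repeated volume estimates against a known phantom volume."""
    mean_est = statistics.mean(estimates_mm3)
    bias_pct = 100.0 * (mean_est - true_volume_mm3) / true_volume_mm3
    cv_pct = 100.0 * statistics.stdev(estimates_mm3) / mean_est
    return bias_pct, cv_pct

# Hypothetical estimates of a 500 mm^3 synthetic nodule under two
# slice thicknesses (thinner slices: less bias, less spread)
thin = [498, 505, 501, 495, 503]
thick = [520, 470, 540, 480, 530]

for label, est in [("0.625 mm", thin), ("5 mm", thick)]:
    bias, cv = volume_bias_and_precision(est, 500.0)
    print(f"{label}: bias {bias:+.1f}%, CV {cv:.1f}%")
```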

  5. Computational technique for stepwise quantitative assessment of equation correctness

    NASA Astrophysics Data System (ADS)

    Othman, Nuru'l Izzah; Bakar, Zainab Abu

    2017-04-01

    Many of the computer-aided mathematics assessment systems that are available today possess the capability to implement stepwise correctness checking of a working scheme for solving equations. The computational technique for assessing the correctness of each response in the scheme mainly involves checking the mathematical equivalence and providing qualitative feedback. This paper presents a technique, known as the Stepwise Correctness Checking and Scoring (SCCS) technique, that checks the correctness of each equation in terms of structural equivalence and provides quantitative feedback. The technique, which is based on the Multiset framework, adapts certain techniques from textual information retrieval involving tokenization, document modelling and similarity evaluation. The performance of the SCCS technique was tested using worked solutions on solving linear algebraic equations in one variable. 350 working schemes comprising 1385 responses were collected using a marking engine prototype developed based on the technique. The results show that both the automated analytical scores and the automated overall scores generated by the marking engine exhibit high percent agreement, high correlation and a high degree of agreement with manual scores, with small average absolute and mixed errors.
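    The Multiset-based comparison of equation steps can be sketched as follows. This is an illustrative reconstruction, not the authors' algorithm; the regex tokenizer and the Jaccard-style multiset similarity are assumptions:

```python
import re
from collections import Counter

def tokenize(equation):
    """Split an equation string into a multiset of tokens:
    numbers, variables, and operators."""
    return Counter(re.findall(r"\d+|[a-zA-Z]+|[=+\-*/()]", equation))

def multiset_similarity(eq_a, eq_b):
    """Jaccard-style similarity on token multisets: size of the
    multiset intersection over the size of the multiset union."""
    a, b = tokenize(eq_a), tokenize(eq_b)
    inter = sum((a & b).values())
    union = sum((a | b).values())
    return inter / union if union else 1.0

# Scoring a student's step against the expected step
expected = "2*x + 4 = 10"
print(multiset_similarity(expected, "2*x + 4 = 10"))  # structurally identical
print(multiset_similarity(expected, "2*x = 6"))       # partially matching step
```

A structurally identical step scores 1.0, while a step sharing only some tokens earns partial credit, which is the quantitative feedback idea the abstract describes.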

  6. Diffusing wave spectroscopy and its application for monitoring of skin blood microcirculation

    NASA Astrophysics Data System (ADS)

    Meglinski, Igor V.

    2003-10-01

    Diffusing Wave Spectroscopy (DWS) is a modern technique uniquely suited to non-invasive measurement of particle size and particle motion within randomly inhomogeneous, highly scattering and absorbing media, including biological tissues such as human skin. The technique is based on illuminating the medium (tissue) with coherent laser light and analyzing the loss of coherence of the scattered field that arises from the motion of the scattering particles with respect to each other. Both theoretical and experimental results have shown the potential and viability of DWS for rapid, non-invasive, quantitative monitoring and functional diagnostics of skin blood microcirculation, with resolution down to 1 μm/sec. This is likely to enable quantitative monitoring in general diagnostics, diabetes studies, and pharmacological intervention for failing surgical skin flaps or replants; blood microcirculation monitoring during sepsis; assessment of burn depth; diagnosis of atherosclerotic disease; and investigation of the mechanisms of photodynamic therapy for cancer treatment. In this report we describe recent developments of DWS to the point that skin blood micro-flow can be routinely and accurately obtained in a separate vascular bed of normal skin tissue.
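    The basic DWS observable is the temporal autocorrelation of the scattered intensity, g2(τ) = ⟨I(t)I(t+τ)⟩ / ⟨I⟩², whose decay rate reflects how fast the scatterers move. A toy sketch (the trace and function name are invented; real DWS analysis fits the decay to a photon-diffusion model of the medium):

```python
def intensity_autocorrelation(intensity, lag):
    """Normalized temporal intensity autocorrelation
    g2(lag) = <I(t) * I(t + lag)> / <I>^2, the basic DWS observable."""
    n = len(intensity) - lag
    mean_i = sum(intensity) / len(intensity)
    corr = sum(intensity[t] * intensity[t + lag] for t in range(n)) / n
    return corr / (mean_i ** 2)

# Toy fluctuating intensity trace (slow, regular oscillation, period 8)
trace = [10, 12, 14, 12, 10, 8, 6, 8] * 8

g2_short = intensity_autocorrelation(trace, 1)  # nearby samples: correlated
g2_long = intensity_autocorrelation(trace, 4)   # half a period apart: anticorrelated
print(g2_short, g2_long)
```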

  7. Quantitative Thermochronology

    NASA Astrophysics Data System (ADS)

    Braun, Jean; van der Beek, Peter; Batt, Geoffrey

    2006-05-01

    Thermochronology, the study of the thermal history of rocks, enables us to quantify the nature and timing of tectonic processes. Quantitative Thermochronology is a robust review of isotopic ages, and presents a range of numerical modeling techniques to allow the physical implications of isotopic age data to be explored. The authors provide analytical, semi-analytical, and numerical solutions to the heat transfer equation in a range of tectonic settings and under varying boundary conditions. They then illustrate their modeling approach with a large number of case studies. The benefits of different thermochronological techniques are also described. Computer programs on an accompanying website at www.cambridge.org/9780521830577 are introduced through the text and provide a means of solving the heat transport equation in the deforming Earth to predict the ages of rocks and compare them directly to geological and geochronological data. Several short tutorials, with hints and solutions, are also included. Numerous case studies help geologists to interpret age data and relate them to Earth processes; essential background material aids in understanding and using thermochronological data; the book provides a thorough treatise on numerical modeling of heat transport in the Earth's crust; and a companion website hosts relevant computer programs and colour slides of figures from the book for use in teaching.
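    One of the classic analytical solutions such texts build on is the 1D steady-state conductive geotherm with uniform crustal heat production, T(z) = T0 + (q0/k)·z − (A/2k)·z². A sketch with illustrative parameter values (not taken from the book):

```python
def steady_geotherm(z_m, t_surface_c=10.0, q_surface=0.06,
                    conductivity=3.0, heat_production=1.0e-6):
    """Steady-state 1D conductive geotherm with uniform crustal heat
    production A (W/m^3):
        T(z) = T0 + (q0 / k) * z - (A / (2 * k)) * z^2
    where q0 is surface heat flow (W/m^2), k is thermal conductivity
    (W/m/K), and z is depth (m). Parameter defaults are illustrative."""
    return (t_surface_c
            + (q_surface / conductivity) * z_m
            - heat_production / (2.0 * conductivity) * z_m ** 2)

# Temperatures at depths relevant to low-temperature thermochronometers
for depth_km in (1, 3, 5):
    z = depth_km * 1000.0
    print(f"{depth_km} km: {steady_geotherm(z):.1f} C")
```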

  8. Assessing direct analysis in real-time-mass spectrometry (DART-MS) for the rapid identification of additives in food packaging.

    PubMed

    Ackerman, L K; Noonan, G O; Begley, T H

    2009-12-01

    The ambient ionization technique direct analysis in real time (DART) was characterized and evaluated for the screening of food packaging for the presence of packaging additives using a benchtop mass spectrometer (MS). Approximate optimum conditions were determined for 13 common food-packaging additives, including plasticizers, anti-oxidants, colorants, grease-proofers, and ultraviolet light stabilizers. Method sensitivity and linearity were evaluated using solutions and characterized polymer samples. Additionally, the response of a model additive (di-ethyl-hexyl-phthalate) was examined across a range of sample positions, DART, and MS conditions (temperature, voltage and helium flow). Under optimal conditions, the protonated molecule ([M+H]+) was the major ion for most additives. Additive responses were highly sensitive to sample and DART source orientation, as well as to DART flow rates, temperatures, and MS inlet voltages. DART-MS response was neither consistently linear nor quantitative in this setting, and sensitivity varied by additive. All additives studied were rapidly identified in multiple food-packaging materials by DART-MS/MS, suggesting this technique can be used to screen food packaging rapidly. However, method sensitivity and quantitation require further study and improvement.

  9. Quantitative multi-modal NDT data analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heideklang, René; Shokouhi, Parisa

    2014-02-18

    A single NDT technique is often not adequate to provide assessments about the integrity of test objects with the required coverage or accuracy. In such situations, it is common to resort to multi-modal testing, where complementary and overlapping information from different NDT techniques is combined for a more comprehensive evaluation. Multi-modal material and defect characterization is an interesting task which involves several diverse fields of research, including signal and image processing, statistics and data mining. The fusion of different modalities may improve quantitative nondestructive evaluation by effectively exploiting the augmented set of multi-sensor information about the material. It is the redundant information in particular whose quantification is expected to lead to increased reliability and robustness of the inspection results. There are different systematic approaches to data fusion, each with its specific advantages and drawbacks. In our contribution, these will be discussed in the context of nondestructive materials testing. A practical study adopting a high-level scheme for the fusion of Eddy Current, GMR and Thermography measurements on a reference metallic specimen with built-in grooves will be presented. Results show that fusion is able to outperform the best single sensor regarding detection specificity, while retaining the same level of sensitivity.

  10. Three-dimensional surface profile intensity correction for spatially modulated imaging

    NASA Astrophysics Data System (ADS)

    Gioux, Sylvain; Mazhar, Amaan; Cuccia, David J.; Durkin, Anthony J.; Tromberg, Bruce J.; Frangioni, John V.

    2009-05-01

    We describe a noncontact profile correction technique for quantitative, wide-field optical measurement of tissue absorption (μa) and reduced scattering (μs') coefficients, based on geometric correction of the sample's Lambertian (diffuse) reflectance intensity. Because the projection of structured light onto an object is the basis for both phase-shifting profilometry and modulated imaging, we were able to develop a single instrument capable of performing both techniques. In so doing, the surface of the three-dimensional object could be acquired and used to extract the object's optical properties. The optical properties of flat polydimethylsiloxane (silicone) phantoms with homogenous tissue-like optical properties were extracted, with and without profilometry correction, after vertical translation and tilting of the phantoms at various angles. Objects having a complex shape, including a hemispheric silicone phantom and human fingers, were acquired and similarly processed, with vascular constriction of a finger being readily detectable through changes in its optical properties. Using profilometry correction, the accuracy of extracted absorption and reduced scattering coefficients improved from two- to ten-fold for surfaces having height variations as much as 3 cm and tilt angles as high as 40 deg. These data lay the foundation for employing structured light for quantitative imaging during surgery.
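    The core of the profile correction is geometric: a Lambertian surface tilted by an angle θ returns cos(θ) of the flat-surface intensity, so dividing the measurement by cos(θ) recovers the flat-equivalent value. A minimal sketch (the function name and values are hypothetical, and the published method also corrects for surface height using the measured 3-D profile):

```python
import math

def lambertian_correction(measured_intensity, tilt_deg):
    """Correct diffuse reflectance for surface tilt assuming a
    Lambertian surface: I_true = I_measured / cos(theta).
    Minimal sketch of the geometric (profile) correction idea."""
    theta = math.radians(tilt_deg)
    if math.cos(theta) <= 0:
        raise ValueError("surface facing away from detector")
    return measured_intensity / math.cos(theta)

# A flat phantom tilted 40 degrees appears dimmer by cos(40 deg);
# the correction restores the flat-surface value.
flat_value = 100.0
tilted_reading = flat_value * math.cos(math.radians(40))
print(lambertian_correction(tilted_reading, 40))  # recovers ~100
```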

  11. Estimating background-subtracted fluorescence transients in calcium imaging experiments: a quantitative approach.

    PubMed

    Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe

    2013-08-01

    Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. Rapid progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would benefit most of the calcium imaging research field. A method for estimating background-subtracted fluorescence transients that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise, leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
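    The idea of estimating the background jointly with the transient from a single trial can be sketched with a mono-exponential model F(t) = B + A·exp(−t/τ). The grid-search fit below is a toy stand-in for the paper's nonlinear regression (names and data are invented, and the paper's probabilistic noise model and confidence intervals are omitted):

```python
import math

def fit_transient(times, values):
    """Fit F(t) = B + A * exp(-t / tau) by grid search over tau, with
    closed-form linear least squares for (B, A) at each candidate tau.
    A toy stand-in for a full nonlinear regression."""
    best = None
    n = len(times)
    for tau in [0.1 * k for k in range(1, 201)]:
        x = [math.exp(-t / tau) for t in times]
        sx, sy = sum(x), sum(values)
        sxx = sum(v * v for v in x)
        sxy = sum(v * y for v, y in zip(x, values))
        denom = n * sxx - sx * sx
        if abs(denom) < 1e-12:
            continue
        a = (n * sxy - sx * sy) / denom
        b = (sy - a * sx) / n
        sse = sum((b + a * xi - yi) ** 2 for xi, yi in zip(x, values))
        if best is None or sse < best[0]:
            best = (sse, b, a, tau)
    _, b, a, tau = best
    return b, a, tau  # background, amplitude, decay constant

# Noiseless synthetic transient: background 50, amplitude 20, tau 2.0
ts = [0.2 * i for i in range(50)]
ys = [50.0 + 20.0 * math.exp(-t / 2.0) for t in ts]
background, amplitude, tau = fit_transient(ts, ys)
print(background, amplitude, tau)
```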

  12. Generating One Biometric Feature from Another: Faces from Fingerprints

    PubMed Central

    Ozkaya, Necla; Sagiroglu, Seref

    2010-01-01

    This study presents a new approach based on artificial neural networks for generating one biometric feature (faces) from another (only fingerprints). An automatic and intelligent system was designed and developed to analyze and model the relationships between fingerprints and faces. The proposed system is the first study to generate all parts of the face, including eyebrows, eyes, nose, mouth, ears and face border, from only fingerprints. It also differs from similar studies recently presented in the literature in several respects. The parameter settings of the system were achieved with the help of the Taguchi experimental design technique. The performance and accuracy of the system were evaluated with the 10-fold cross-validation technique using qualitative evaluation metrics in addition to expanded quantitative evaluation metrics. The results are presented as a combination of these objective and subjective metrics, illustrating the qualitative properties of the proposed methods as well as a quantitative evaluation of their performance. Experimental results have shown that one biometric feature can be determined from another. These results once more indicate that there is a strong relationship between fingerprints and faces. PMID:22399877

  13. Real-time PCR to determine transgene copy number and to quantitate the biolocalization of adoptively transferred cells from EGFP-transgenic mice.

    PubMed

    Joshi, Molishree; Keith Pittman, H; Haisch, Carl; Verbanac, Kathryn

    2008-09-01

    Quantitative real-time PCR (qPCR) is a sensitive technique for the detection and quantitation of specific DNA sequences. Here we describe a TaqMan qPCR assay for quantification of tissue-localized, adoptively transferred enhanced green fluorescent protein (EGFP)-transgenic cells. A standard curve constructed from serial dilutions of a plasmid containing the EGFP transgene (i) was highly reproducible, (ii) detected as few as two copies, and (iii) was included in each qPCR assay. qPCR analysis of genomic DNA was used to determine transgene copy number in several mouse strains. Fluorescent microscopy of tissue sections showed that adoptively transferred vascular endothelial cells (VEC) from EGFP-transgenic mice specifically localized to tissue with metastatic tumors in syngeneic recipients. Microscopic enumeration of VEC in liver metastases strongly correlated with qPCR analysis of identical sections (Pearson correlation 0.81). EGFP was undetectable in tissue from control mice by qPCR. In another study using intra-tumor EGFP-VEC delivery to subcutaneous tumors, manual cell counts and qPCR analysis of alternating sections also strongly correlated (Pearson correlation 0.82). Confocal microscopy of the subcutaneous tumor sections determined that visual fluorescent signals were frequently tissue artifacts. This qPCR methodology offers specific, objective, and rapid quantitation, uncomplicated by tissue autofluorescence, and should be readily transferable to other in vivo models to quantitate the biolocalization of transplanted cells.
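    The standard-curve step works because Ct is linear in log10(copy number): fit a line to the plasmid dilution series, check amplification efficiency E = 10^(−1/slope) − 1, and read unknowns off the curve. A sketch with hypothetical Ct values (the function names and numbers are invented, not the paper's data):

```python
import math

def fit_standard_curve(copies, ct_values):
    """Least-squares line Ct = slope * log10(copies) + intercept,
    as built from serial dilutions of a plasmid standard."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ct_values) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ct_values))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to quantify an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series (near-ideal slope of -3.3)
standards = [1e2, 1e3, 1e4, 1e5, 1e6]
cts = [33.6, 30.3, 27.0, 23.7, 20.4]

slope, intercept = fit_standard_curve(standards, cts)
efficiency = 10 ** (-1.0 / slope) - 1.0
print(f"slope {slope:.2f}, efficiency {efficiency:.0%}")
print(f"unknown at Ct 25.4 ~ {copies_from_ct(25.4, slope, intercept):.0f} copies")
```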

  14. An Investigation of Proposed Techniques for Quantifying Confidence in Assurance Arguments

    NASA Technical Reports Server (NTRS)

    Graydon, Patrick J.; Holloway, C. Michael

    2016-01-01

    The use of safety cases in certification raises the question of assurance argument sufficiency and the issue of confidence (or uncertainty) in the argument's claims. Some researchers propose to model confidence quantitatively and to calculate confidence in argument conclusions. We know of little evidence to suggest that any proposed technique would deliver trustworthy results when implemented by system safety practitioners. Proponents do not usually assess the efficacy of their techniques through controlled experiment or historical study. Instead, they present an illustrative example where the calculation delivers a plausible result. In this paper, we review current proposals, claims made about them, and evidence advanced in favor of them. We then show that proposed techniques can deliver implausible results in some cases. We conclude that quantitative confidence techniques require further validation before they should be recommended as part of the basis for deciding whether an assurance argument justifies fielding a critical system.

  15. Analysis of objects in binary images. M.S. Thesis - Old Dominion Univ.

    NASA Technical Reports Server (NTRS)

    Leonard, Desiree M.

    1991-01-01

    Digital image processing techniques are typically used to produce improved digital images through the application of successive enhancement techniques to a given image or to generate quantitative data about the objects within that image. In support of and to assist researchers in a wide range of disciplines, e.g., interferometry, heavy rain effects on aerodynamics, and structure recognition research, it is often desirable to count objects in an image and compute their geometric properties. Therefore, an image analysis application package, focusing on a subset of image analysis techniques used for object recognition in binary images, was developed. This report describes the techniques and algorithms utilized in the three main phases of the application: image segmentation, object recognition, and quantitative analysis. Appendices provide supplemental formulas for the algorithms employed as well as examples and results from the various image segmentation techniques and the object recognition algorithm implemented.
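    Counting objects in a binary image is typically done by connected-component labeling, i.e., flood-filling each unvisited foreground pixel's neighborhood. A minimal sketch of the idea (4-connectivity, invented data; not the thesis code):

```python
from collections import deque

def label_objects(image):
    """4-connected component labeling of a binary image (list of rows
    of 0/1) via breadth-first flood fill. Returns a list of object
    pixel counts (areas), one entry per detected object."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if image[r][c] == 1 and not seen[r][c]:
                # New object: flood-fill its 4-connected pixels
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

# Two objects: a 2x2 square and an L-shaped blob
binary = [
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 1, 1],
]
print(label_objects(binary))  # one area per detected object
```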

  16. Prediction of rain effects on earth-space communication links operating in the 10 to 35 GHz frequency range

    NASA Technical Reports Server (NTRS)

    Stutzman, Warren L.

    1989-01-01

    This paper reviews the effects of precipitation on earth-space communication links operating in the 10 to 35 GHz frequency range. Emphasis is on the quantitative prediction of rain attenuation and depolarization. Discussions center on the models developed at Virginia Tech. Comments on other models are included, as well as literature references to key works. Also included is the system-level modeling of dual-polarized communication systems, with techniques for calculating antenna and propagation medium effects. Simple models for the calculation of average annual attenuation and cross-polarization discrimination (XPD) are presented. Calculations of worst-month statistics are also presented.
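    Most rain-attenuation prediction models in this frequency range share a power-law core: specific attenuation γ = k·R^α (dB/km) scaled by an effective path length through the rain. A sketch of that shared structure (the coefficient values below are illustrative placeholders, not values from the Virginia Tech models; k and α depend on frequency and polarization):

```python
def rain_attenuation_db(rain_rate_mm_h, path_km, k=0.1, alpha=1.1,
                        reduction_factor=0.8):
    """Power-law rain attenuation common to most prediction models:
        A = k * R^alpha * L_eff  (dB),  with  L_eff = r * L
    where R is rain rate (mm/h), L is the slant-path length through
    rain (km), and r is a path-reduction factor accounting for rain-cell
    non-uniformity. Coefficient defaults here are illustrative only."""
    specific_attenuation = k * rain_rate_mm_h ** alpha  # dB/km
    return specific_attenuation * reduction_factor * path_km

# Attenuation grows steeply with rain rate on a 5 km slant path
for rate in (5, 25, 100):
    print(f"{rate} mm/h: {rain_attenuation_db(rate, 5.0):.1f} dB")
```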

  17. Review of MR Elastography Applications and Recent Developments

    PubMed Central

    Glaser, Kevin J.; Manduca, Armando; Ehman, Richard L.

    2012-01-01

    The technique of MR elastography (MRE) has emerged as a useful modality for quantitatively imaging the mechanical properties of soft tissues in vivo. Recently, MRE has been introduced as a clinical tool for evaluating chronic liver disease, but many other potential applications are being explored. These applications include measuring tissue changes associated with diseases of the liver, breast, brain, heart, and skeletal muscle including both focal lesions (e.g., hepatic, breast, and brain tumors) and diffuse diseases (e.g., fibrosis and multiple sclerosis). The purpose of this review article is to summarize some of the recent developments of MRE and to highlight some emerging applications. PMID:22987755

  18. Streamlined approach to mapping the magnetic induction of skyrmionic materials.

    PubMed

    Chess, Jordan J; Montoya, Sergio A; Harvey, Tyler R; Ophus, Colin; Couture, Simon; Lomakin, Vitaliy; Fullerton, Eric E; McMorran, Benjamin J

    2017-06-01

    Recently, Lorentz transmission electron microscopy (LTEM) has helped researchers advance the emerging field of magnetic skyrmions. These magnetic quasi-particles, composed of topologically non-trivial magnetization textures, have a large potential for application as information carriers in low-power memory and logic devices. LTEM is one of a very few techniques for direct, real-space imaging of magnetic features at the nanoscale. For Fresnel-contrast LTEM, the transport of intensity equation (TIE) is the tool of choice for quantitative reconstruction of the local magnetic induction through the sample thickness. Typically, this analysis requires collection of at least three images. Here, we show that for uniform, thin, magnetic films, which includes many skyrmionic samples, the magnetic induction can be quantitatively determined from a single defocused image using a simplified TIE approach. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Fast real-time polymerase chain reaction for quantitative detection of Lactobacillus delbrueckii bacteriophages in milk.

    PubMed

    Martín, Maria Cruz; del Rio, Beatriz; Martínez, Noelia; Magadán, Alfonso H; Alvarez, Miguel A

    2008-12-01

    One of the main microbiological problems of the dairy industry is the susceptibility of starter bacteria to virus infections. Lactobacillus delbrueckii, a component of thermophilic starter cultures used in the manufacture of several fermented dairy products, including yogurt, is also sensitive to bacteriophage attacks. To avoid the problems associated with these viruses, quick and sensitive detection methods are necessary. In the present study, a fast real-time quantitative polymerase chain reaction assay for the direct detection and quantification of L. delbrueckii phages in milk was developed. A set of primers and a TaqMan MGB probe was designed, based on the lysin gene sequence of different L. delbrueckii phages. The results show the proposed method to be a rapid (total processing time 30 min), specific and highly sensitive technique for detecting L. delbrueckii phages in milk.

  20. Defining glycoprotein cancer biomarkers by MS in conjunction with glycoprotein enrichment.

    PubMed

    Song, Ehwang; Mechref, Yehia

    2015-01-01

    Protein glycosylation is an important and common post-translational modification. More than 50% of human proteins are believed to be glycosylated to modulate the functionality of proteins. Aberrant glycosylation has been correlated to several diseases, such as inflammatory skin diseases, diabetes mellitus, cardiovascular disorders, rheumatoid arthritis, Alzheimer's and prion diseases, and cancer. Many approved cancer biomarkers are glycoproteins which are not highly abundant proteins. Therefore, effective qualitative and quantitative assessment of glycoproteins entails enrichment methods. This chapter summarizes glycoprotein enrichment methods, including lectin affinity, immunoaffinity, hydrazide chemistry, hydrophilic interaction liquid chromatography, and click chemistry. The use of these enrichment approaches in assessing the qualitative and quantitative changes of glycoproteins in different types of cancers are presented and discussed. This chapter highlights the importance of glycoprotein enrichment techniques for the identification and characterization of new reliable cancer biomarkers.
