Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is a significant challenge in the early steps of computational modeling, as it involves considerable manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. Using a rule-based approach, we further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), with the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions from an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from biomedical text into a formal representation that offers the potential to facilitate the integration of text mining into ontological workflows, a novel aspect of this study. This work is a case study in which we created a platform providing formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
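A minimal illustration of the kind of rule-based pattern that a system like this might apply to electrophysiology sentences; the example sentence, the regular expression and the unit list below are assumptions for demonstration, not the actual rules or lexicon used in the ICEPO study.

```python
import re

# Hypothetical sentence resembling ion-channel electrophysiology text.
sentence = ("The half-activation voltage of the Kv1.2 channel shifted "
            "from -27.5 mV to -35.2 mV after phosphorylation.")

# One illustrative rule: a quantity is a signed decimal number followed by a
# unit commonly reported in electrophysiology measurements.
QUANTITY = re.compile(r"(-?\d+(?:\.\d+)?)\s*(mV|pA|nA|ms|pS)")

for value, unit in QUANTITY.findall(sentence):
    print(f"value={float(value)} unit={unit}")
# value=-27.5 unit=mV
# value=-35.2 unit=mV
```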
Zhang, Yin; Diao, Tianxi; Wang, Lei
2014-12-01
Designed to advance the two-way translational process between basic research and clinical practice, translational medicine has become one of the most important areas in biomedicine. The quantitative evaluation of translational medicine is valuable for decision making in global translational medical research and funding. Using scientometric analysis and information extraction techniques, this study quantitatively analyzed the scientific articles on translational medicine. The results showed that translational medicine had significant scientific output and impact, a specific core field and institutes, and outstanding academic status and benefit. Although they were not considered in this study, patent data are another important indicator that should be integrated into related research in the future. © 2014 Wiley Periodicals, Inc.
Refractive index variance of cells and tissues measured by quantitative phase imaging.
Shan, Mingguang; Kandel, Mikhail E; Popescu, Gabriel
2017-01-23
The refractive index distribution of cells and tissues governs their interaction with light and can report on morphological modifications associated with disease. Through intensity-based measurements, refractive index information can be extracted only via scattering models that approximate light propagation. As a result, current knowledge of refractive index distributions across various tissues and cell types remains limited. Here we use quantitative phase imaging and the statistical dispersion relation (SDR) to extract information about the refractive index variance in a variety of specimens. Because the measurement is phase-resolved in three dimensions, our approach yields refractive index results without prior knowledge of the tissue thickness. With the recent progress in quantitative phase imaging systems, we anticipate that using the SDR will become routine in assessing tissue optical properties.
Extraction of quantitative surface characteristics from AIRSAR data for Death Valley, California
NASA Technical Reports Server (NTRS)
Kierein-Young, K. S.; Kruse, F. A.
1992-01-01
Polarimetric Airborne Synthetic Aperture Radar (AIRSAR) data were collected for the Geologic Remote Sensing Field Experiment (GRSFE) over Death Valley, California, USA, in September 1989. AIRSAR is a four-look, quad-polarization, three-frequency instrument. It collects measurements at C-band (5.66 cm), L-band (23.98 cm), and P-band (68.13 cm), and has a GIFOV of 10 meters and a swath width of 12 kilometers. Because the radar measures at three wavelengths, different scales of surface roughness are measured. Also, dielectric constants can be calculated from the data. The AIRSAR data were calibrated using in-scene trihedral corner reflectors to remove cross-talk and to calibrate the phase, amplitude, and co-channel gain imbalance. The calibration allows for the extraction of accurate values of rms surface roughness, dielectric constants, sigma(sub 0) backscatter, and polarization information. The radar data sets allow quantitative characterization of the small-scale surface structure of geologic units, providing information about the physical and chemical processes that control the surface morphology. Combining the quantitative information extracted from the radar data with other remotely sensed data sets allows discrimination, identification and mapping of geologic units that may be difficult to discern using conventional techniques.
Synthesising quantitative and qualitative research in evidence-based patient information.
Goldsmith, Megan R; Bankhead, Clare R; Austoker, Joan
2007-03-01
Systematic reviews have, in the past, focused on quantitative studies and clinical effectiveness, while excluding qualitative evidence. Qualitative research can inform evidence-based practice independently of other research methodologies but methods for the synthesis of such data are currently evolving. Synthesising quantitative and qualitative research in a single review is an important methodological challenge. This paper describes the review methods developed and the difficulties encountered during the process of updating a systematic review of evidence to inform guidelines for the content of patient information related to cervical screening. Systematic searches of 12 electronic databases (January 1996 to July 2004) were conducted. Studies that evaluated the content of information provided to women about cervical screening or that addressed women's information needs were assessed for inclusion. A data extraction form and quality assessment criteria were developed from published resources. A non-quantitative synthesis was conducted and a tabular evidence profile for each important outcome (eg "explain what the test involves") was prepared. The overall quality of evidence for each outcome was then assessed using an approach published by the GRADE working group, which was adapted to suit the review questions and modified to include qualitative research evidence. Quantitative and qualitative studies were considered separately for every outcome. 32 papers were included in the systematic review following data extraction and assessment of methodological quality. The review questions were best answered by evidence from a range of data sources. The inclusion of qualitative research, which was often highly relevant and specific to many components of the screening information materials, enabled the production of a set of recommendations that will directly affect policy within the NHS Cervical Screening Programme. A practical example is provided of how quantitative and qualitative data sources might successfully be brought together and considered in one review.
Quantitative Indicators for Behaviour Drift Detection from Home Automation Data.
Veronese, Fabio; Masciadri, Andrea; Comai, Sara; Matteucci, Matteo; Salice, Fabio
2017-01-01
The diffusion of Smart Homes provides an opportunity to implement elderly monitoring, extending seniors' independence and avoiding unnecessary assistance costs. Information concerning the inhabitant's behaviour is contained in home automation data and can be extracted by means of quantitative indicators. The application of this approach shows that it can reveal behaviour changes.
Quantitative proteomics in biological research.
Wilm, Matthias
2009-10-01
Proteomics has enabled the direct investigation of biological material, at first through the analysis of individual proteins, then of lysates from cell cultures, and finally of extracts from tissues and biopsies from entire organisms. Its latest manifestation - quantitative proteomics - allows deeper insight into biological systems. This article reviews the different methods used to extract quantitative information from mass spectra. It follows the technical developments aimed toward global proteomics, the attempt to characterize every expressed protein in a cell by at least one peptide. When applications of the technology are discussed, the focus is placed on yeast biology. In particular, differential quantitative proteomics, the comparison between an experiment and its control, is very discriminating for proteins involved in the process being studied. When trying to understand biological processes on a molecular level, differential quantitative proteomics tends to give a clearer picture than global transcription analyses. As a result, MS has become an even more indispensable tool for biochemically motivated biological research.
Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin
2008-11-01
Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally MRMer incorporates features that permit the quantitative analysis experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
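A hedged sketch of the core quantitative step described above: integrating the extracted-ion traces of a heavy/light peptide pair and reporting their abundance ratio. The traces, retention-time window and peak shapes are synthetic; MRMer's actual mzXML parsing and transition-pairing logic are not reproduced here.

```python
import numpy as np

# Synthetic extracted-ion traces for a light/heavy peptide pair
# (retention time in seconds, intensity in arbitrary counts).
rt = np.linspace(300.0, 330.0, 61)
light = 1e5 * np.exp(-((rt - 315.0) ** 2) / 8.0)
heavy = 4e4 * np.exp(-((rt - 315.2) ** 2) / 8.0)

def integrate(trace, t):
    """Integrated ion intensity via trapezoidal integration."""
    return float(np.sum((trace[1:] + trace[:-1]) / 2.0 * np.diff(t)))

ratio = integrate(light, rt) / integrate(heavy, rt)
print(f"light/heavy abundance ratio ~ {ratio:.2f}")
```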
Zhu, Hongchun; Cai, Lijie; Liu, Haiying; Huang, Wei
2016-01-01
Multi-scale image segmentation and the selection of optimal segmentation parameters are the key processes in the object-oriented information extraction of high-resolution remote sensing images. The accuracy of remote sensing special subject information depends on this extraction. On the basis of WorldView-2 high-resolution data and the optimal segmentation parameter method of object-oriented image segmentation and high-resolution image information extraction, the following processes were conducted in this study. Firstly, the best combination of bands and weights was determined for the information extraction of the high-resolution remote sensing image. An improved weighted mean-variance method was proposed and used to calculate the optimal segmentation scale. Thereafter, the best shape factor and compactness factor parameters were computed with the use of the control variables and the combination of the heterogeneity and homogeneity indexes. Different types of image segmentation parameters were obtained according to the surface features. The high-resolution remote sensing images were multi-scale segmented with the optimal segmentation parameters. A hierarchical network structure was established by setting the information extraction rules to achieve object-oriented information extraction. This study presents an effective and practical method that can explain expert input judgment by reproducible quantitative measurements. Furthermore, the results of this procedure may be incorporated into a classification scheme. PMID:27362762
Information extraction and transmission techniques for spaceborne synthetic aperture radar images
NASA Technical Reports Server (NTRS)
Frost, V. S.; Yurovsky, L.; Watson, E.; Townsend, K.; Gardner, S.; Boberg, D.; Watson, J.; Minden, G. J.; Shanmugan, K. S.
1984-01-01
Information extraction and transmission techniques for synthetic aperture radar (SAR) imagery were investigated. Four interrelated problems were addressed. An optimal tonal SAR image classification algorithm was developed and evaluated. A data compression technique was developed for SAR imagery which is simple and provides a 5:1 compression with acceptable image quality. An optimal textural edge detector was developed. Several SAR image enhancement algorithms have been proposed. The effectiveness of each algorithm was compared quantitatively.
Portable microwave assisted extraction: An original concept for green analytical chemistry.
Perino, Sandrine; Petitcolas, Emmanuel; de la Guardia, Miguel; Chemat, Farid
2013-11-08
This paper describes a portable microwave assisted extraction apparatus (PMAE) for the extraction of bioactive compounds, especially essential oils and aromas, directly in a crop or in a forest. The developed procedure, based on the concept of green analytical chemistry, is appropriate for obtaining direct in-field information about the level of essential oils in natural samples and for illustrating green chemistry teaching and research. The efficiency of this experiment was validated for the extraction of the essential oil of rosemary directly in a crop, and it provides quantitative information on the content of essential oil similar to that obtained by conventional methods in the laboratory. Copyright © 2013 Elsevier B.V. All rights reserved.
Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.
Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A
2016-09-01
In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
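A brief sketch of one of the two column-locating methods mentioned above, fitting a 2D Gaussian to a small image patch to obtain a sub-pixel column position; the synthetic patch, noise level and starting values are illustrative assumptions rather than the software's actual implementation.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    """Isotropic 2D Gaussian evaluated on a coordinate grid, flattened for fitting."""
    x, y = coords
    g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2.0 * sigma ** 2)) + offset
    return g.ravel()

# Synthetic 15x15 patch containing one atomic column plus noise.
y, x = np.mgrid[0:15, 0:15]
clean = gauss2d((x, y), 1000.0, 7.3, 6.8, 2.0, 50.0).reshape(15, 15)
patch = clean + np.random.default_rng(0).normal(0.0, 10.0, clean.shape)

popt, _ = curve_fit(gauss2d, (x, y), patch.ravel(),
                    p0=(patch.max(), 7.0, 7.0, 2.0, patch.min()))
print(f"fitted column position: x = {popt[1]:.2f} px, y = {popt[2]:.2f} px")
```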
Vasudevan, Srivathsan; Chen, George C K; Lin, Zhiping; Ng, Beng Koon
2015-05-10
Photothermal microscopy (PTM), a noninvasive pump-probe high-resolution microscopy, has been applied as a bioimaging tool in many biomedical studies. PTM utilizes a conventional phase contrast microscope to obtain highly resolved photothermal images. However, phase information cannot be extracted from these photothermal images, as they are not quantitative. Moreover, the problem of halos inherent in conventional phase contrast microscopy needs to be tackled. Hence, a digital holographic photothermal microscopy technique is proposed as a solution to obtain quantitative phase images. The proposed technique is demonstrated by extracting phase values of red blood cells from their photothermal images. These phase values can potentially be used to determine the temperature distribution of the photothermal images, which is an important study in live cell monitoring applications.
Gates, Allison; Shave, Kassi; Featherstone, Robin; Buckreus, Kelli; Ali, Samina; Scott, Shannon; Hartling, Lisa
2017-06-06
There exist many evidence-based interventions available to manage procedural pain in children and neonates, yet they are severely underutilized. Parents play an important role in the management of their child's pain; however, many do not possess adequate knowledge of how to effectively do so. The purpose of the planned study is to systematically review and synthesize current knowledge of the experiences and information needs of parents with regard to the management of their child's pain and distress related to medical procedures in the emergency department. We will conduct a systematic review using rigorous methods and reporting based on the PRISMA statement. We will conduct a comprehensive search of literature published between 2000 and 2016 reporting on parents' experiences and information needs with regard to helping their child manage procedural pain and distress. Ovid MEDLINE, Ovid PsycINFO, CINAHL, and PubMed will be searched. We will also search reference lists of key studies and gray literature sources. Two reviewers will screen the articles following inclusion criteria defined a priori. One reviewer will then extract the data from each article following a data extraction form developed by the study team. The second reviewer will check the data extraction for accuracy and completeness. Any disagreements with regard to study inclusion or data extraction will be resolved via discussion. Data from qualitative studies will be summarized thematically, while those from quantitative studies will be summarized narratively. The second reviewer will confirm the overarching themes resulting from the qualitative and quantitative data syntheses. The Critical Appraisal Skills Programme Qualitative Research Checklist and the Quality Assessment Tool for Quantitative Studies will be used to assess the quality of the evidence from each included study. To our knowledge, no published review exists that comprehensively reports on the experiences and information needs of parents related to the management of their child's procedural pain and distress. A systematic review of parents' experiences and information needs will help to inform strategies to empower them with the knowledge necessary to ensure their child's comfort during a painful procedure. PROSPERO CRD42016043698.
NASA Astrophysics Data System (ADS)
Chalmin, E.; Farges, F.; Brown, G. E.
2009-01-01
High-resolution manganese K-edge X-ray absorption near edge structure spectra were collected on a set of 40 Mn-bearing minerals. The pre-edge feature information (position, area) was investigated to extract as much quantitative valence and symmetry information as possible for manganese in various “test” and “unknown” minerals and glasses. The samples present a range of manganese symmetry environments (tetrahedral, square planar, octahedral, and cubic) and valences (II to VII). The extraction of the pre-edge information is based on previous multiple scattering and multiplet calculations for model compounds. Using the method described in this study, a robust estimation of the manganese valence could be obtained from the pre-edge region at the 5% accuracy level. This method, applied to 20 “test” compounds (such as hausmannite and rancieite) and to 15 “unknown” compounds (such as axinite and birnessite), provides a quantitative estimate of the average valence of manganese in complex minerals and silicate glasses.
Thoma, Brent; Camorlinga, Paola; Chan, Teresa M; Hall, Andrew Koch; Murnaghan, Aleisha; Sherbino, Jonathan
2018-01-01
Quantitative research is one of the many research methods used to help educators advance their understanding of questions in medical education. However, little research has been done on how to succeed in publishing in this area. We conducted a scoping review to identify key recommendations and reporting guidelines for quantitative educational research and scholarship. Medline, ERIC, and Google Scholar were searched for English-language articles published between 2006 and January 2016 using the search terms, "research design," "quantitative," "quantitative methods," and "medical education." A hand search was completed for additional references during the full-text review. Titles/abstracts were reviewed by two authors (BT, PC) and included if they focused on quantitative research in medical education and outlined reporting guidelines, or provided recommendations on conducting quantitative research. One hundred articles were reviewed in parallel with the first 30 used for calibration and the subsequent 70 to calculate Cohen's kappa coefficient. Two reviewers (BT, PC) conducted a full text review and extracted recommendations and reporting guidelines. A simple thematic analysis summarized the extracted recommendations. Sixty-one articles were reviewed in full, and 157 recommendations were extracted. The thematic analysis identified 86 items, 14 categories, and 3 themes. Fourteen quality evaluation tools and reporting guidelines were found. This paper provides guidance for junior researchers in the form of key quality markers and reporting guidelines. We hope that quantitative researchers in medical education will be informed by the results and that further work will be done to refine the list of recommendations.
NASA Technical Reports Server (NTRS)
1986-01-01
Digital Imaging is the computer processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography, and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost efficient.
Analysis of atomic force microscopy data for surface characterization using fuzzy logic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.
2011-07-15
In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research highlights: a fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; the fuzzy logic technique does not require manual adjustment of the algorithm parameters; the technique can quantitatively capture differences between surfaces; this technique yields more realistic structure boundaries compared to other methods.
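A toy sketch of fuzzy classification of AFM surface points into the four classes named above (top, bottom, uphill, downhill), using triangular membership functions over a normalized height and local slope; the membership shapes and thresholds are invented for illustration and do not reproduce the paper's inference engine.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return float(np.clip(min((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0))

def classify(height, slope):
    """Assign a surface point to the class with the largest fuzzy membership."""
    memberships = {
        "top":      tri(height, 0.6, 1.0, 1.4) * tri(abs(slope), -0.2, 0.0, 0.2),
        "bottom":   tri(height, -0.4, 0.0, 0.4) * tri(abs(slope), -0.2, 0.0, 0.2),
        "uphill":   tri(slope, 0.1, 0.5, 1.0),
        "downhill": tri(-slope, 0.1, 0.5, 1.0),
    }
    return max(memberships, key=memberships.get)

print(classify(height=0.95, slope=0.02))   # -> top
print(classify(height=0.50, slope=0.45))   # -> uphill
```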
Quantitation of Indoleacetic Acid Conjugates in Bean Seeds by Direct Tissue Hydrolysis 1
Bialek, Krystyna; Cohen, Jerry D.
1989-01-01
Gas chromatography-selected ion monitoring-mass spectral analysis using [13C6]indole-3-acetic acid (IAA) as an internal standard provides an effective means for quantitation of IAA liberated during direct strong basic hydrolysis of bean (Phaseolus vulgaris L.) seed powder, provided that extra precautions are undertaken to exclude oxygen from the reaction vial. Direct seed powder hydrolysis revealed that the major portion of amide IAA conjugates in bean seeds are not extractable by aqueous acetone, the solvent used commonly for IAA conjugate extraction from seeds and other plant tissues. Strong basic hydrolysis of plant tissue can be used to provide new information on IAA content. PMID:16666783
Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood
NASA Astrophysics Data System (ADS)
Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver
2016-09-01
Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.
Knittel, Diana N; Stintzing, Florian C; Kammerer, Dietmar R
2015-06-10
Sea squill (Drimia maritima L.) extracts have been used for centuries for the medical treatment of heart diseases. A procedure for the preparation of Drimia extracts applied for such purposes comprising a fermentation step is described in the German Homoeopathic Pharmacopoeia (GHP). However, little is known about the secondary metabolite profile of such extracts and the fate of these components upon processing and storage. Thus, in the present study sea squill extracts were monitored during fermentation and storage by HPLC-DAD-MS(n) and GC-MS to characterise and quantitate individual cardiac glycosides and phenolic compounds. For this purpose, a previously established HPLC method for the separation and quantitation of pharmacologically relevant cardiac glycosides (bufadienolides) was validated. Within 12 months of storage, total bufadienolide contents decreased by about 50%, which was attributed to microbial and plant enzyme activities. The metabolisation and degradation rates of individual bufadienolide glycosides significantly differed, which was attributed to differing structures of the aglycones. Further degradation of bufadienolide aglycones was also observed. Besides reactions well known from human metabolism studies, dehydration of individual compounds was monitored. Quantitatively predominating flavonoids were also metabolised throughout the fermentation process. The present study provides valuable information about the profile and stability of individual cardiac glycosides and phenolic compounds in fermented Drimia extracts prepared for medical applications, and expands the knowledge of cardiac glycoside conversion upon microbial fermentation. Copyright © 2015 Elsevier B.V. All rights reserved.
Semi-automatic building extraction in informal settlements from high-resolution satellite imagery
NASA Astrophysics Data System (ADS)
Mayunga, Selassie David
The extraction of man-made features from digital remotely sensed images is considered an important step underpinning the management of human settlements in any country. Man-made features, and buildings in particular, are required for a variety of applications such as urban planning, the creation of geographical information system (GIS) databases and urban city models. Traditional man-made feature extraction methods are very expensive in terms of equipment, are labour intensive, need well-trained personnel and cannot cope with changing environments, particularly in dense urban settlement areas. This research presents an approach for extracting buildings in dense informal settlement areas using high-resolution satellite imagery. The proposed system uses a novel strategy of extracting a building by measuring a single point at the approximate centre of the building. The fine measurement of the building outline is then effected using a modified snake model. The original snake model on which this framework is based incorporates an external constraint energy term which is tailored to preserving the convergence properties of the snake model; its application to unstructured objects would negatively affect their actual shapes. The external constraint energy term was therefore removed from the original snake model formulation, giving the model the ability to cope with the high variability of building shapes in informal settlement areas. The proposed building extraction system was tested on two areas with different situations. The first area was Tungi in Dar Es Salaam, Tanzania, where three sites were tested. This area is characterized by informal settlements, which are established illegally within the city boundaries. The second area was Oromocto in New Brunswick, Canada, where two sites were tested. The Oromocto area is mostly flat and the buildings are constructed using similar materials. Qualitative and quantitative measures were employed to evaluate the accuracy of the results as well as the performance of the system; these were based on visual inspection and on comparing the measured coordinates to the reference data, respectively. In the course of this process, a mean area coverage of 98% was achieved for the Dar Es Salaam test sites, which globally indicated that the extracted building polygons were close to the ground truth data. Furthermore, the proposed system reduced the time needed to extract a single building by 32%. Although the extracted building polygons are within the perimeter of the ground truth data, some of the extracted building polygons were visually somewhat distorted. This implies that an interactive post-editing process is necessary for cartographic representation.
NASA Astrophysics Data System (ADS)
García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel
2018-05-01
In this work, a Total Reflection X-ray fluorescence (TXRF) spectrometry based quantitative methodology is proposed for the elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from a building of the beginning of the 20th century with high historic and cultural value in Getxo (Basque Country, north of Spain). This quantification strategy can be considered faster than traditional Energy or Wavelength Dispersive X-ray fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to try to avoid the acid extraction step for the materials and their degradation products, the direct TXRF measurement of powdered solid suspensions in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to other, more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was difficult to obtain stable and repeatable suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of the TXRF results.
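As a rough illustration of how TXRF intensities are typically converted into concentrations with an internal standard, the sketch below applies the standard relative-sensitivity relation; the element, sensitivity factors and net intensities are invented numbers, and the paper's own calibration procedure may differ in detail.

```python
def txrf_concentration(n_analyte, n_standard, s_analyte, s_standard, c_standard):
    """Analyte concentration from net peak intensities (n), relative sensitivity
    factors (s) and the known internal-standard concentration (c_standard)."""
    return (n_analyte / n_standard) * (s_standard / s_analyte) * c_standard

# e.g. Fe in a liquid extract, quantified against a Ga internal standard (2.0 mg/L)
c_fe = txrf_concentration(n_analyte=15400, n_standard=9800,
                          s_analyte=1.10, s_standard=1.00, c_standard=2.0)
print(f"Fe ~ {c_fe:.2f} mg/L")
```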
Building an automated SOAP classifier for emergency department reports.
Mowery, Danielle; Wiebe, Janyce; Visweswaran, Shyam; Harkema, Henk; Chapman, Wendy W
2012-02-01
Information extraction applications that extract structured event and entity information from unstructured text can leverage knowledge of clinical report structure to improve performance. The Subjective, Objective, Assessment, Plan (SOAP) framework, used to structure progress notes to facilitate problem-specific, clinical decision making by physicians, is one example of a well-known, canonical structure in the medical domain. Although its applicability to structuring data is understood, its contribution to information extraction tasks has not yet been determined. The first step to evaluating the SOAP framework's usefulness for clinical information extraction is to apply the model to clinical narratives and develop an automated SOAP classifier that classifies sentences from clinical reports. In this quantitative study, we applied the SOAP framework to sentences from emergency department reports, and trained and evaluated SOAP classifiers built with various linguistic features. We found the SOAP framework can be applied manually to emergency department reports with high agreement (Cohen's kappa coefficients over 0.70). Using a variety of features, we found classifiers for each SOAP class can be created with moderate to outstanding performance with F(1) scores of 93.9 (subjective), 94.5 (objective), 75.7 (assessment), and 77.0 (plan). We look forward to expanding the framework and applying the SOAP classification to clinical information extraction tasks. Copyright © 2011. Published by Elsevier Inc.
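A minimal sketch of training a sentence-level SOAP classifier; the four example sentences, labels and bag-of-words features are placeholders, whereas the study itself used emergency department reports and a richer set of linguistic features.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set of sentences labelled with SOAP classes.
sentences = [
    "Patient reports sharp chest pain since this morning.",       # subjective
    "Blood pressure 142/90, heart rate 88, afebrile.",            # objective
    "Likely musculoskeletal pain; rule out cardiac etiology.",    # assessment
    "Start ibuprofen and follow up with cardiology in one week.", # plan
]
labels = ["subjective", "objective", "assessment", "plan"]

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                    LogisticRegression(max_iter=1000))
clf.fit(sentences, labels)
print(clf.predict(["Temperature 38.2 C, lungs clear to auscultation."]))
```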
An approach to the systematic analysis of urinary steroids
Menini, E.; Norymberski, J. K.
1965-01-01
1. Human urine, its extracts, extracts of urine pretreated with enzyme preparations containing β-glucuronidase and steroid sulphatase or β-glucuronidase alone, and products derived from the specific solvolysis of urinary steroid sulphates, were submitted to the following sequence of operations: reduction with borohydride; oxidation with a glycol-cleaving agent (bismuthate or periodate); separation of the products into ketones and others; oxidation of each fraction with tert.-butyl chromate, resolution of the end products by means of paper chromatography or gas–liquid chromatography or both. 2. Qualitative experiments indicated the kind of information the method and some of its modifications can provide. Quantitative experiments were restricted to the direct treatment of urine by the basic procedure outlined. It was partly shown and partly argued that the quantitative results were probably as informative about the composition of the major neutral urinary steroids (and certainly about their presumptive secretory precursors) as those obtained by a number of established analytical procedures. 3. A possible extension of the scope of the reported method was indicated. 4. A simple technique was introduced for the quantitative deposition of a solid sample on to a gas–liquid-chromatographic column. PMID:14333557
NASA Astrophysics Data System (ADS)
Hosseini-Golgoo, S. M.; Bozorgi, H.; Saberkari, A.
2015-06-01
The performances of three neural networks, consisting of a multi-layer perceptron, a radial basis function network, and a neuro-fuzzy network with a local linear model tree training algorithm, in modeling and extracting discriminative features from the response patterns of a temperature-modulated resistive gas sensor are quantitatively compared. For response pattern recording, a voltage staircase containing five steps, each with a 20 s plateau, is applied to the micro-heater of the sensor when 12 different target gases, each at 11 concentration levels, are present. In each test, the hidden layer neuron weights are taken as the discriminatory feature vector of the target gas. These vectors are then mapped to a 3D feature space using linear discriminant analysis. The discriminative information content of the feature vectors is determined by calculating Fisher’s discriminant ratio, affording a quantitative comparison among the success rates achieved by the different neural network structures. The results demonstrate a superior discrimination ratio for features extracted from the local linear neuro-fuzzy and radial basis function networks, with recognition rates of 96.27% and 90.74%, respectively.
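A small sketch of the Fisher's discriminant ratio computation used here to score how well a feature separates two classes; the feature values below stand in for hidden-layer weights obtained for two target gases and are purely illustrative.

```python
import numpy as np

def fisher_discriminant_ratio(a, b):
    """Fisher's discriminant ratio of a one-dimensional feature for two classes."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float((a.mean() - b.mean()) ** 2 / (a.var() + b.var()))

# Hypothetical feature values (e.g. one hidden-layer weight) for two gases.
gas_a = [0.81, 0.78, 0.84, 0.80, 0.79]
gas_b = [0.55, 0.60, 0.58, 0.57, 0.61]
print(f"FDR = {fisher_discriminant_ratio(gas_a, gas_b):.1f}")
```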
Investigation of BOLD fMRI Resonance Frequency Shifts and Quantitative Susceptibility Changes at 7 T
Bianciardi, Marta; van Gelderen, Peter; Duyn, Jeff H.
2013-01-01
Although blood oxygenation level dependent (BOLD) functional magnetic resonance imaging (fMRI) experiments of brain activity generally rely on the magnitude of the signal, they also provide frequency information that can be derived from the phase of the signal. However, because of confounding effects of instrumental and physiological origin, BOLD related frequency information is difficult to extract and therefore rarely used. Here, we explored the use of high field (7 T) and dedicated signal processing methods to extract frequency information and use it to quantify and interpret blood oxygenation and blood volume changes. We found that optimized preprocessing improves detection of task-evoked and spontaneous changes in phase signals and resonance frequency shifts over large areas of the cortex with sensitivity comparable to that of magnitude signals. Moreover, our results suggest the feasibility of mapping BOLD quantitative susceptibility changes in at least part of the activated area and its largest draining veins. Comparison with magnitude data suggests that the observed susceptibility changes originate from neuronal activity through induced blood volume and oxygenation changes in pial and intracortical veins. Further, from frequency shifts and susceptibility values, we estimated that, relative to baseline, the fractional oxygen saturation in large vessels increased by 0.02–0.05 during stimulation, which is consistent to previously published estimates. Together, these findings demonstrate that valuable information can be derived from fMRI imaging of BOLD frequency shifts and quantitative susceptibility changes. PMID:23897623
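A hedged worked example of converting a gradient-echo phase change into a resonance frequency shift, assuming the usual linear relation phase = 2*pi * frequency_shift * TE; the echo time and phase change below are invented and not taken from the study.

```python
import math

TE = 0.025            # echo time in seconds (assumed)
delta_phase = 0.012   # task-evoked phase change in radians (assumed)

delta_f = delta_phase / (2.0 * math.pi * TE)   # frequency shift in Hz
print(f"resonance frequency shift ~ {delta_f * 1000:.1f} mHz")
```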
Investigation of Carbon Fiber Architecture in Braided Composites Using X-Ray CT Inspection
NASA Technical Reports Server (NTRS)
Rhoads, Daniel J.; Miller, Sandi G.; Roberts, Gary D.; Rauser, Richard W.; Golovaty, Dmitry; Wilber, J. Patrick; Espanol, Malena I.
2017-01-01
During the fabrication of braided carbon fiber composite materials, process variations occur which affect the fiber architecture. Quantitative measurements of local and global fiber architecture variations are needed to determine the potential effect of process variations on mechanical properties of the cured composite. Although non-destructive inspection via X-ray CT imaging is a promising approach, difficulties in quantitative analysis of the data arise due to the similar densities of the material constituents. In an effort to gain more quantitative information about features related to fiber architecture, methods have been explored to improve the details that can be captured by X-ray CT imaging. Metal-coated fibers and thin veils are used as inserts to extract detailed information about fiber orientations and inter-ply behavior from X-ray CT images.
Comparison and evaluation of fusion methods used for GF-2 satellite image in coastal mangrove area
NASA Astrophysics Data System (ADS)
Ling, Chengxing; Ju, Hongbo; Liu, Hua; Zhang, Huaiqing; Sun, Hua
2018-04-01
The GF-2 satellite has the highest spatial resolution of any remote sensing satellite in the history of China's satellite development. In this study, three traditional fusion methods, Brovey, Gram-Schmidt and Color Normalized (CN), were compared with a newer fusion method, NNDiffuse, using qualitative assessment and quantitative fusion quality indices, including information entropy, variance, mean gradient, deviation index and spectral correlation coefficient. The analysis results show that the NNDiffuse method was optimal in both the qualitative and quantitative analyses and was more effective for subsequent remote sensing information extraction and for forest and wetland resource monitoring applications.
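A short sketch of two of the fusion quality indices named above, information entropy and mean gradient, computed on a synthetic 8-bit image; the formulas follow their common definitions, and the random image is only a stand-in for a fused GF-2 product.

```python
import numpy as np

def entropy(img, bins=256):
    """Shannon information entropy of the grey-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def mean_gradient(img):
    """Average gradient magnitude, a common sharpness index for fused images."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.sqrt((gx ** 2 + gy ** 2) / 2.0).mean())

fused = np.random.default_rng(1).integers(0, 256, size=(512, 512))
print(f"entropy = {entropy(fused):.2f} bits, mean gradient = {mean_gradient(fused):.2f}")
```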
Digital image processing for photo-reconnaissance applications
NASA Technical Reports Server (NTRS)
Billingsley, F. C.
1972-01-01
Digital image-processing techniques developed for processing pictures from NASA space vehicles are analyzed in terms of enhancement, quantitative restoration, and information extraction. Digital filtering, and the action of a high frequency filter in the real and Fourier domain are discussed along with color and brightness.
Markov Logic Networks for Adverse Drug Event Extraction from Text.
Natarajan, Sriraam; Bangera, Vishal; Khot, Tushar; Picado, Jose; Wazalwar, Anurag; Costa, Vitor Santos; Page, David; Caldwell, Michael
2017-05-01
Adverse drug events (ADEs) are a major concern and point of emphasis for the medical profession, government, and society. A diverse set of techniques from epidemiology, statistics, and computer science are being proposed and studied for ADE discovery from observational health data (e.g., EHR and claims data), social network data (e.g., Google and Twitter posts), and other information sources. Methodologies are needed for evaluating, quantitatively measuring, and comparing the ability of these various approaches to accurately discover ADEs. This work is motivated by the observation that text sources such as the Medline/Medinfo library provide a wealth of information on human health. Unfortunately, ADEs often result from unexpected interactions, and the connection between conditions and drugs is not explicit in these sources. Thus, in this work we address the question of whether we can quantitatively estimate relationships between drugs and conditions from the medical literature. This paper proposes and studies a state-of-the-art NLP-based extraction of ADEs from text.
Radiomics: Extracting more information from medical images using advanced feature analysis
Lambin, Philippe; Rios-Velazquez, Emmanuel; Leijenaar, Ralph; Carvalho, Sara; van Stiphout, Ruud G.P.M.; Granton, Patrick; Zegers, Catharina M.L.; Gillies, Robert; Boellard, Ronald; Dekker, André; Aerts, Hugo J.W.L.
2015-01-01
Solid cancers are spatially and temporally heterogeneous. This limits the use of invasive biopsy-based molecular assays but gives huge potential to medical imaging, which has the ability to capture intra-tumoural heterogeneity in a non-invasive way. During the past decades, medical imaging innovations, with new hardware, new imaging agents and standardised protocols, have allowed the field to move towards quantitative imaging. Therefore, the development of automated and reproducible analysis methodologies to extract more information from image-based features is also required. Radiomics – the high-throughput extraction of large amounts of image features from radiographic images – addresses this problem and is one of the approaches that hold great promise but need further validation in multi-centric settings and in the laboratory. PMID:22257792
Nicolotti, Luca; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Sgorbini, Barbara; Rubiolo, Patrizia; Bicchi, Carlo
2013-10-10
The study proposes an investigation strategy that simultaneously provides detailed profiling and quantitative fingerprinting of food volatiles, through a "comprehensive" analytical platform that includes sample preparation by Headspace Solid Phase Microextraction (HS-SPME), separation by two-dimensional comprehensive gas chromatography coupled with mass spectrometry detection (GC×GC-MS) and data processing using advanced fingerprinting approaches. Experiments were carried out on roasted hazelnuts and on Gianduja pastes (sugar, vegetable oil, hazelnuts, cocoa, nonfat dried milk, vanilla flavorings) and demonstrated that the information potential of each analysis can better be exploited if suitable quantitation methods are applied. Quantitation approaches through Multiple Headspace Extraction and Standard Addition were compared in terms of performance parameters (linearity, precision, accuracy, Limit of Detection and Limit of Quantitation) under headspace linearity conditions. The results on 19 key analytes, potent odorants, and technological markers, and more than 300 fingerprint components, were used for further processing to obtain information concerning the effect of the matrix on volatile release, and to produce an informative chemical blueprint for use in sensomics and flavoromics. The importance of quantitation approaches in headspace analysis of solid matrices of complex composition, and the advantages of MHE, are also critically discussed. Copyright © 2013 Elsevier B.V. All rights reserved.
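A hedged sketch of the idea behind Multiple Headspace Extraction (MHE) quantitation: peak areas from consecutive extractions of the same vial decay approximately geometrically, so the total analyte response can be extrapolated from a few extractions. The areas below are invented, and the actual validation in the study involved additional performance parameters.

```python
import numpy as np

# Peak areas from four consecutive headspace extractions of one vial (illustrative).
areas = np.array([12500.0, 8700.0, 6100.0, 4300.0])
i = np.arange(1, areas.size + 1)

# If A_i = A_1 * q**(i-1), then ln(A_i) is linear in i with slope ln(q).
slope, intercept = np.polyfit(i, np.log(areas), 1)
q = np.exp(slope)
A1 = np.exp(intercept + slope)          # back-calculated area of the first extraction
total_area = A1 / (1.0 - q)             # geometric-series sum = total analyte response
print(f"q = {q:.3f}, total response = {total_area:.0f}")
```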
A Discriminant Distance Based Composite Vector Selection Method for Odor Classification
Choi, Sang-Il; Jeong, Gu-Min
2014-01-01
We present a composite vector selection method for an effective electronic nose system that performs well even in noisy environments. Each composite vector generated from an electronic nose data sample is evaluated by computing the discriminant distance. By quantitatively measuring the amount of discriminative information in each composite vector, composite vectors containing informative variables can be distinguished, and the final composite features for odor classification are extracted using the selected composite vectors. Using only the informative composite vectors, rather than all of the generated composite vectors, also helps to extract better composite features. Experimental results with different volatile organic compound data show that the proposed system has good classification performance, even in a noisy environment, compared with other methods. PMID:24747735
An Analysis of Students' Mistakes on Routine Slope Tasks
ERIC Educational Resources Information Center
Cho, Peter; Nagle, Courtney
2017-01-01
This study extends past research on students' understanding of slope by analyzing college students' mistakes on routine tasks involving slope. We conduct quantitative and qualitative analysis of students' mistakes to extract information regarding slope conceptualizations described in prior research. Results delineate procedural proficiencies and…
Prescott, Jeffrey William
2013-02-01
The importance of medical imaging for clinical decision making has been steadily increasing over the last four decades. Recently, there has also been an emphasis on medical imaging for preclinical decision making, i.e., for use in pharmaceutical and medical device development. There is also a drive towards quantification of imaging findings by using quantitative imaging biomarkers, which can improve sensitivity, specificity, accuracy and reproducibility of imaged characteristics used for diagnostic and therapeutic decisions. An important component of the discovery, characterization, validation and application of quantitative imaging biomarkers is the extraction of information and meaning from images through image processing and subsequent analysis. However, many advanced image processing and analysis methods are not applied directly to questions of clinical interest, i.e., for diagnostic and therapeutic decision making, which is a consideration that should be closely linked to the development of such algorithms. This article is meant to address these concerns. First, quantitative imaging biomarkers are introduced by providing definitions and concepts. Then, potential applications of advanced image processing and analysis to areas of quantitative imaging biomarker research are described; specifically, research into osteoarthritis (OA), Alzheimer's disease (AD) and cancer is presented. Then, challenges in quantitative imaging biomarker research are discussed. Finally, a conceptual framework for integrating clinical and preclinical considerations into the development of quantitative imaging biomarkers and their computer-assisted methods of extraction is presented.
Lipid Informed Quantitation and Identification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kevin Crowell, PNNL
2014-07-21
LIQUID (Lipid Informed Quantitation and Identification) is a software program that has been developed to enable users to conduct both informed and high-throughput global liquid chromatography-tandem mass spectrometry (LC-MS/MS)-based lipidomics analysis. This newly designed desktop application can quickly identify and quantify lipids from LC-MS/MS datasets while providing a friendly graphical user interface for users to fully explore the data. Informed data analysis simply involves the user specifying an electrospray ionization mode, lipid common name (i.e. PE(16:0/18:2)), and associated charge carrier. A stemplot of the isotopic profile and a line plot of the extracted ion chromatogram are also provided to show the MS-level evidence of the identified lipid. In addition to plots, other information such as intensity, mass measurement error, and elution time are also provided. Typically, a global analysis for 15,000 lipid targets…
Estimation of the Scatterer Distribution of the Cirrhotic Liver using Ultrasonic Image
NASA Astrophysics Data System (ADS)
Yamaguchi, Tadashi; Hachiya, Hiroyuki
1998-05-01
In the B-mode image of the liver obtained by an ultrasonic imaging system, the speckle pattern changes with the progression of diseases such as liver cirrhosis. In this paper we present the statistical characteristics of the echo envelope of the liver and a technique to extract information on the scatterer distribution from normal and cirrhotic liver images using constant false alarm rate (CFAR) processing. We analyze the relationship between the extracted scatterer distribution and the stage of liver cirrhosis. The ratio of the area in which the amplitude of the processed signal exceeds the threshold to the entire processed image area is related quantitatively to the stage of liver cirrhosis. It is found that the proposed technique is valid for the quantitative diagnosis of liver cirrhosis.
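A simplified sketch of cell-averaging CFAR-style processing on a speckle-like envelope image, followed by the area-ratio statistic described above; the window size, threshold factor and synthetic image are assumptions for illustration, not the paper's parameters.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(2)
envelope = rng.rayleigh(scale=1.0, size=(256, 256))   # speckle-like background
envelope[100:110, 100:110] += 3.0                     # a cluster of strong scatterers

background = uniform_filter(envelope, size=21)        # local mean (reference cells)
detections = envelope > 2.0 * background              # CFAR-style threshold test

# Ratio of the above-threshold area to the whole processed image area.
print(f"fraction above threshold = {detections.mean():.4f}")
```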
Usability Evaluation of NLP-PIER: A Clinical Document Search Engine for Researchers.
Hultman, Gretchen; McEwan, Reed; Pakhomov, Serguei; Lindemann, Elizabeth; Skube, Steven; Melton, Genevieve B
2017-01-01
NLP-PIER (Natural Language Processing - Patient Information Extraction for Research) is a self-service platform with a search engine for clinical researchers to perform natural language processing (NLP) queries using clinical notes. We conducted user-centered testing of NLP-PIER's usability to inform future design decisions. Quantitative and qualitative data were analyzed. Our findings will be used to improve the usability of NLP-PIER.
An evidential reasoning extension to quantitative model-based failure diagnosis
NASA Technical Reports Server (NTRS)
Gertler, Janos J.; Anderson, Kenneth C.
1992-01-01
The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
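A compact sketch of Dempster's rule of combination, the evidence-integration step named above, for basic probability assignments represented as dictionaries over a small frame of discernment; the two-component fault example is hypothetical.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments ({frozenset: mass}) over one frame."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb            # mass assigned to the empty intersection
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical evidence from two diagnostic models about which component failed.
frame = frozenset({"pump", "valve"})
m1 = {frozenset({"pump"}): 0.6, frame: 0.4}
m2 = {frozenset({"pump"}): 0.3, frozenset({"valve"}): 0.5, frame: 0.2}
print(dempster_combine(m1, m2))
```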
Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.
Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F
2017-09-27
Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.
Walzthoeni, Thomas; Joachimiak, Lukasz A; Rosenberger, George; Röst, Hannes L; Malmström, Lars; Leitner, Alexander; Frydman, Judith; Aebersold, Ruedi
2015-12-01
Chemical cross-linking in combination with mass spectrometry generates distance restraints of amino acid pairs in close proximity on the surface of native proteins and protein complexes. In this study we used quantitative mass spectrometry and chemical cross-linking to quantify differences in cross-linked peptides obtained from complexes in spatially discrete states. We describe a generic computational pipeline for quantitative cross-linking mass spectrometry consisting of modules for quantitative data extraction and statistical assessment of the obtained results. We used the method to detect conformational changes in two model systems: firefly luciferase and the bovine TRiC complex. Our method discovers and explains the structural heterogeneity of protein complexes using only sparse structural information.
Quantitative contrast-enhanced mammography for contrast medium kinetics studies
NASA Astrophysics Data System (ADS)
Arvanitis, C. D.; Speller, R.
2009-10-01
Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information on tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of the iodine projected thickness in mg cm⁻² has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for the range of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from a slope of unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm⁻² within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information on iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
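The dual-energy extraction of iodine projected thickness can be illustrated with a two-material decomposition. The sketch below assumes idealized monoenergetic log-attenuation measurements and placeholder mass attenuation coefficients; it is not the calibration used in the study.

```python
import numpy as np

# Illustrative mass attenuation coefficients (cm^2/g) for the low- and
# high-energy beams; real values would come from tables or calibration.
mu_iodine = np.array([30.0, 6.0])   # strong contrast across the iodine K-edge
mu_tissue = np.array([0.30, 0.22])

def iodine_projected_thickness(log_att_low, log_att_high):
    """Solve the 2x2 dual-energy system for projected thicknesses (g/cm^2)
    of iodine and soft tissue from two log-attenuation measurements."""
    A = np.column_stack([mu_iodine, mu_tissue])   # rows: energies, cols: materials
    t_iodine, t_tissue = np.linalg.solve(A, np.array([log_att_low, log_att_high]))
    return t_iodine * 1000.0, t_tissue            # iodine reported in mg/cm^2

# Forward-simulate a phantom with 2 mg/cm^2 iodine over ~4 g/cm^2 of tissue
t_i_true, t_t_true = 0.002, 4.0
low = mu_iodine[0] * t_i_true + mu_tissue[0] * t_t_true
high = mu_iodine[1] * t_i_true + mu_tissue[1] * t_t_true
print(iodine_projected_thickness(low, high))   # -> (~2.0 mg/cm^2, ~4.0 g/cm^2)
```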
Extraction of actionable information from crowdsourced disaster data.
Kiatpanont, Rungsun; Tanlamai, Uthai; Chongstitvatana, Prabhas
Natural disasters cause enormous damage to countries all over the world. To deal with these common problems, different activities are required for disaster management at each phase of the crisis. There are three groups of activities: (1) make sense of the situation and determine how best to deal with it, (2) deploy the necessary resources, and (3) harmonize as many parties as possible, using the most effective communication channels. Current technological improvements and developments now enable people to act as real-time information sources. As a result, inundation with crowdsourced data poses a real challenge for a disaster manager. The problem is how to extract the valuable information from a gigantic data pool in the shortest possible time so that the information is still useful and actionable. This research proposed an actionable-data-extraction process to deal with the challenge. Twitter was selected as a test case because messages posted on Twitter are publicly available. Hashtags, an easy and very efficient technique, were also used to differentiate information. A quantitative approach to extracting useful information from the tweets was supported and verified by interviews with disaster managers from many leading organizations in Thailand to understand their missions. The classification of information extracted from the collected tweets was first performed manually, and the tweets were then used to train a machine learning algorithm to classify future tweets. One particularly useful, significant, and primary category was the request for help. The support vector machine algorithm was used to validate the results from the extraction process on 13,696 sample tweets, with over 74 percent accuracy. The results confirmed that the machine learning technique could significantly and practically assist with disaster management by dealing with crowdsourced data.
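A minimal sketch of the classification step follows, using a linear support vector machine on TF-IDF features; the tweets, labels and feature representation are placeholders, since the record above does not specify them.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC
from sklearn.metrics import accuracy_score

# Placeholder tweets; 1 = request for help, 0 = other
tweets = [
    "#flood need drinking water and boats at Ayutthaya",
    "#flood water level rising near the temple",
    "please send rescue team, family trapped on roof #flood",
    "traffic update: highway 32 closed due to flooding",
]
labels = [1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    tweets, labels, test_size=0.5, random_state=0, stratify=labels)

# TF-IDF features (unigrams and bigrams) feeding a linear SVM classifier
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```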
Haines, Seth S.; Cook, Troy; Thamke, Joanna N.; Davis, Kyle W.; Long, Andrew J.; Healy, Richard W.; Hawkins, Sarah J.; Engle, Mark A.
2014-01-01
The U.S. Geological Survey is developing approaches for the quantitative assessment of water and proppant involved with possible future production of continuous petroleum deposits. The assessment approach is an extension of existing U.S. Geological Survey petroleum-assessment methods, and it aims to provide objective information that helps decision makers understand the tradeoffs inherent in resource-development decisions. This fact sheet provides an overview of U.S. Geological Survey assessments for quantities of water and proppant required for drilling and hydraulic fracturing and for flowback water extracted with petroleum; the report also presents the form of the intended assessment output information.
Nomura, J-I; Uwano, I; Sasaki, M; Kudo, K; Yamashita, F; Ito, K; Fujiwara, S; Kobayashi, M; Ogasawara, K
2017-12-01
Preoperative hemodynamic impairment in the affected cerebral hemisphere is associated with the development of cerebral hyperperfusion following carotid endarterectomy. Cerebral oxygen extraction fraction images generated from 7T MR quantitative susceptibility mapping correlate with oxygen extraction fraction images on positron-emission tomography. The present study aimed to determine whether preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping could identify patients at risk for cerebral hyperperfusion following carotid endarterectomy. Seventy-seven patients with unilateral internal carotid artery stenosis (≥70%) underwent preoperative 3D T2*-weighted imaging using a multiple dipole-inversion algorithm with a 7T MR imager. Quantitative susceptibility mapping images were then obtained, and oxygen extraction fraction maps were generated. Quantitative brain perfusion single-photon emission CT was also performed before and immediately after carotid endarterectomy. ROIs were automatically placed in the bilateral middle cerebral artery territories in all images using a 3D stereotactic ROI template, and affected-to-contralateral ratios in the ROIs were calculated on quantitative susceptibility mapping-oxygen extraction fraction images. Ten patients (13%) showed post-carotid endarterectomy hyperperfusion (cerebral blood flow increases of ≥100% compared with preoperative values in the ROIs on brain perfusion SPECT). Multivariate analysis showed that a high quantitative susceptibility mapping-oxygen extraction fraction ratio was significantly associated with the development of post-carotid endarterectomy hyperperfusion (95% confidence interval, 33.5-249.7; P = .002). Sensitivity, specificity, and positive- and negative-predictive values of the quantitative susceptibility mapping-oxygen extraction fraction ratio for the prediction of the development of post-carotid endarterectomy hyperperfusion were 90%, 84%, 45%, and 98%, respectively. Preoperative oxygen extraction fraction imaging generated from 7T MR quantitative susceptibility mapping identifies patients at risk for cerebral hyperperfusion following carotid endarterectomy. © 2017 by American Journal of Neuroradiology.
Stable Isotope Quantitative N-Glycan Analysis by Liquid Separation Techniques and Mass Spectrometry.
Mittermayr, Stefan; Albrecht, Simone; Váradi, Csaba; Millán-Martín, Silvia; Bones, Jonathan
2017-01-01
Liquid phase separation analysis and subsequent quantitation remains a challenging task for protein-derived oligosaccharides due to their inherent structural complexity and diversity. Incomplete resolution or co-detection of multiple glycan species complicates peak area-based quantitation and associated statistical analysis when optical detection methods are used. The approach outlined herein describes the utilization of stable isotope variants of commonly used fluorescent tags that allow for mass-based glycan identification and relative quantitation following separation by liquid chromatography (LC) or capillary electrophoresis (CE). Comparability assessment of glycoprotein-derived oligosaccharides is performed by derivatization with commercially available isotope variants of 2-aminobenzoic acid or aniline and analysis by LC- and CE-mass spectrometry. Quantitative information is attained from the extracted ion chromatogram/electropherogram ratios generated from the light and heavy isotope clusters.
NASA Astrophysics Data System (ADS)
Vasudevan, Srivathsan; Chen, George Chung Kit; Andika, Marta; Agarwal, Shuchi; Chen, Peng; Olivo, Malini
2010-09-01
Red blood cells (RBCs) have been found to undergo "programmed cell death," or eryptosis, and understanding this process can provide more information about apoptosis of nucleated cells. Photothermal (PT) response, a label-free photothermal noninvasive technique, is proposed as a tool to monitor the cell death process of living human RBCs upon glucose depletion. Since the physiological status of the dying cells is highly sensitive to photothermal parameters (e.g., thermal diffusivity, absorption, etc.), we applied the linear PT response to continuously monitor the death mechanism of RBCs when depleted of glucose. The kinetics of the assay, where the cell's PT response transforms from the linear to the nonlinear regime, are reported. In addition, quantitative monitoring was performed by extracting the relevant photothermal parameters from the PT response. A twofold increase in thermal diffusivity and a reduction in cell size were found in the linear PT response during cell death. Our results reveal that photothermal parameters change earlier than phosphatidylserine externalization (used for fluorescence studies), allowing us to detect the initial stage of eryptosis in a quantitative manner. Hence, the proposed tool, in addition to detecting eryptosis earlier than fluorescence, could also reveal the physiological status of the cells through quantitative photothermal parameter extraction.
Quantitative radiomic profiling of glioblastoma represents transcriptomic expression.
Kong, Doo-Sik; Kim, Junhyung; Ryu, Gyuha; You, Hye-Jin; Sung, Joon Kyung; Han, Yong Hee; Shin, Hye-Mi; Lee, In-Hee; Kim, Sung-Tae; Park, Chul-Kee; Choi, Seung Hong; Choi, Jeong Won; Seol, Ho Jun; Lee, Jung-Il; Nam, Do-Hyun
2018-01-19
Quantitative imaging biomarkers have increasingly emerged in the field of research utilizing available imaging modalities. We aimed to identify good surrogate radiomic features that can represent genetic changes of tumors, thereby establishing noninvasive means for predicting treatment outcome. From May 2012 to June 2014, we retrospectively identified 65 patients with treatment-naïve glioblastoma with available clinical information from the Samsung Medical Center data registry. Preoperative MR imaging data were obtained for all 65 patients with primary glioblastoma. A total of 82 imaging features, including first-order statistics and volume and size features, were semi-automatically extracted from structural and physiologic images such as apparent diffusion coefficient and perfusion images. Using commercially available software, NordicICE, we performed quantitative imaging analysis and collected a dataset composed of radiophenotypic parameters. Unsupervised clustering methods revealed that the radiophenotypic dataset was composed of three clusters. Each cluster represented a distinct molecular classification of glioblastoma: classical type, proneural and neural types, and mesenchymal type. These clusters also reflected differential clinical outcomes. We found that the extracted imaging signatures do not represent copy number variation or somatic mutation. Quantitative radiomic features provide potential evidence for predicting molecular phenotype and treatment outcome. Radiomic profiles represent transcriptomic phenotypes well.
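The unsupervised-clustering step can be sketched as follows; the feature matrix is synthetic, and the use of k-means with a silhouette-based choice of cluster number is an assumption for illustration rather than the authors' exact method.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(1)
# Placeholder radiophenotypic matrix: 65 patients x 82 imaging features
features = rng.normal(size=(65, 82))

X = StandardScaler().fit_transform(features)   # z-score each feature
# Pick the number of clusters by silhouette score over a small range
scores = {k: silhouette_score(X, KMeans(n_clusters=k, n_init=10,
                                        random_state=0).fit_predict(X))
          for k in range(2, 6)}
best_k = max(scores, key=scores.get)
clusters = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
print("chosen k:", best_k, "cluster sizes:", np.bincount(clusters))
```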
Mining and Analyzing Circulation and ILL Data for Informed Collection Development
ERIC Educational Resources Information Center
Link, Forrest E.; Tosaka, Yuji; Weng, Cathy
2015-01-01
The authors investigated quantitative methods of collection use analysis employing library data that are available in ILS and ILL systems to better understand library collection use and user needs. For the purpose of the study, the authors extracted circulation and ILL records from the library's systems using data-mining techniques. By comparing…
Zhou, Yangbo; Fox, Daniel S; Maguire, Pierce; O’Connell, Robert; Masters, Robert; Rodenburg, Cornelia; Wu, Hanchun; Dapor, Maurizio; Chen, Ying; Zhang, Hongzhou
2016-01-01
Two-dimensional (2D) materials usually have a layer-dependent work function, which requires fast and accurate detection for the evaluation of their device performance. A detection technique with high throughput and high spatial resolution has not yet been explored. Using a scanning electron microscope, we have developed and implemented a quantitative analytical technique that allows effective extraction of the work function of graphene. This technique uses the secondary electron contrast and has nanometre-resolved layer information. The measurement of few-layer graphene flakes shows the variation of work function between graphene layers with a precision of less than 10 meV. It is expected that this technique will prove extremely useful for researchers in a broad range of fields due to its revolutionary throughput and accuracy. PMID:26878907
Shao, Shiying; Guo, Tiannan; Gross, Vera; Lazarev, Alexander; Koh, Ching Chiek; Gillessen, Silke; Joerger, Markus; Jochum, Wolfram; Aebersold, Ruedi
2016-06-03
The reproducible and efficient extraction of proteins from biopsy samples for quantitative analysis is a critical step in biomarker and translational research. Recently, we described a method consisting of pressure-cycling technology (PCT) and sequential windowed acquisition of all theoretical fragment ions-mass spectrometry (SWATH-MS) for the rapid quantification of thousands of proteins from biopsy-size tissue samples. As an improvement of the method, we have incorporated the PCT-MicroPestle into the PCT-SWATH workflow. The PCT-MicroPestle is a novel, miniaturized, disposable mechanical tissue homogenizer that fits directly into the microTube sample container. We optimized the pressure-cycling conditions for tissue lysis with the PCT-MicroPestle and benchmarked the performance of the system against the conventional PCT-MicroCap method using mouse liver, heart, brain, and human kidney tissues as test samples. The data indicate that the digestion of the PCT-MicroPestle-extracted proteins yielded 20-40% more MS-ready peptide mass from all tissues tested with a comparable reproducibility when compared to the conventional PCT method. Subsequent SWATH-MS analysis identified a higher number of biologically informative proteins from a given sample. In conclusion, we have developed a new device that can be seamlessly integrated into the PCT-SWATH workflow, leading to increased sample throughput and improved reproducibility at both the protein extraction and proteomic analysis levels when applied to the quantitative proteomic analysis of biopsy-level samples.
NASA Astrophysics Data System (ADS)
Martin, Gabriel; Gonzalez-Ruiz, Vicente; Plaza, Antonio; Ortiz, Juan P.; Garcia, Inmaculada
2010-07-01
Lossy hyperspectral image compression has received considerable interest in recent years due to the extremely high dimensionality of the data. However, the impact of lossy compression on spectral unmixing techniques has not been widely studied. These techniques characterize mixed pixels (resulting from insufficient spatial resolution) in terms of a suitable combination of spectrally pure substances (called endmembers) weighted by their estimated fractional abundances. This paper focuses on the impact of JPEG2000-based lossy compression of hyperspectral images on the quality of the endmembers extracted by different algorithms. The three considered algorithms are the orthogonal subspace projection (OSP), which uses only spectral information, and the automatic morphological endmember extraction (AMEE) and spatial spectral endmember extraction (SSEE), which integrate both spatial and spectral information in the search for endmembers. The impact of compression on the resulting abundance estimation based on the endmembers derived by different methods is also evaluated. Experiments are conducted using a hyperspectral data set collected by the NASA Jet Propulsion Laboratory over the Cuprite mining district in Nevada. The experimental results are quantitatively analyzed using reference information available from the U.S. Geological Survey, resulting in recommendations to specialists interested in applying endmember extraction and unmixing algorithms to compressed hyperspectral data.
Multiplexed, quantitative, and targeted metabolite profiling by LC-MS/MRM.
Wei, Ru; Li, Guodong; Seymour, Albert B
2014-01-01
Targeted metabolomics, which focuses on a subset of known metabolites representative of biologically relevant metabolic pathways, is a valuable tool to discover biomarkers and link disease phenotypes to underlying mechanisms or therapeutic modes of action. A key advantage of targeted metabolomics, compared to discovery metabolomics, is its immediate readiness for extracting biological information derived from known metabolites and quantitative measurements. However, simultaneously analyzing hundreds of endogenous metabolites presents a challenge due to their diverse chemical structures and properties. Here we report a method which combines different chromatographic separation conditions, optimal ionization polarities, and the most sensitive triple-quadrupole MS-based data acquisition mode, multiple reaction monitoring (MRM), to quantitatively profile 205 endogenous metabolites in 10 min.
End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.
Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen
2018-06-15
An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO2). According to our numerical experiments, the estimations of sO2 and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.
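A toy PyTorch sketch of a residual U-net in the spirit described (contracting/expanding paths built from residual blocks, mapping multispectral initial-pressure images to an sO2 map) is given below; the layer sizes and names are illustrative and do not reproduce the authors' ResU-net.

```python
import torch
import torch.nn as nn

class ResBlock(nn.Module):
    """Two 3x3 convolutions with a residual (skip) connection."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch), nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.BatchNorm2d(out_ch))
        self.skip = nn.Conv2d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.body(x) + self.skip(x))

class SmallResUNet(nn.Module):
    """Contracting/expanding paths with residual blocks; maps multispectral
    initial-pressure images (n_wavelengths channels) to a single sO2 map."""
    def __init__(self, n_wavelengths=6):
        super().__init__()
        self.enc1, self.enc2 = ResBlock(n_wavelengths, 32), ResBlock(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = ResBlock(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = ResBlock(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = ResBlock(64, 32)
        self.head = nn.Conv2d(32, 1, 1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))   # sO2 bounded to [0, 1]

# One forward pass on a dummy batch of 6-wavelength pressure images
net = SmallResUNet(n_wavelengths=6)
print(net(torch.randn(2, 6, 64, 64)).shape)   # -> torch.Size([2, 1, 64, 64])
```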
Suresh, Niraj; Stephens, Sean A; Adams, Lexor; Beck, Anthon N; McKinney, Adriana L; Varga, Tamas
2016-04-26
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as processes with important implications to climate change and crop management. Quantitative size information on roots in their native environment is invaluable for studying root growth and environmental processes involving plants. X-ray computed tomography (XCT) has been demonstrated to be an effective tool for in situ root scanning and analysis. We aimed to develop a costless and efficient tool that approximates the surface and volume of the root regardless of its shape from three-dimensional (3D) tomography data. The root structure of a Prairie dropseed (Sporobolus heterolepis) specimen was imaged using XCT. The root was reconstructed, and the primary root structure was extracted from the data using a combination of licensed and open-source software. An isosurface polygonal mesh was then created for ease of analysis. We have developed the standalone application imeshJ, generated in MATLAB, to calculate root volume and surface area from the mesh. The outputs of imeshJ are surface area (in mm²) and the volume (in mm³). The process, utilizing a unique combination of tools from imaging to quantitative root analysis, is described. A combination of XCT and open-source software proved to be a powerful combination to noninvasively image plant root samples, segment root data, and extract quantitative information from the 3D data. This methodology of processing 3D data should be applicable to other material/sample systems where there is connectivity between components of similar X-ray attenuation and difficulties arise with segmentation.
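The surface-area and volume computation reported by imeshJ can be illustrated on a closed triangular mesh using the divergence theorem; the sketch below is written in Python rather than MATLAB and uses a toy tetrahedral mesh as a sanity check.

```python
import numpy as np

def mesh_area_volume(vertices, faces):
    """Surface area (mm^2) and enclosed volume (mm^3) of a closed,
    consistently oriented triangular mesh.

    vertices: (N, 3) array of xyz coordinates
    faces:    (M, 3) array of vertex indices per triangle
    """
    v0, v1, v2 = (vertices[faces[:, i]] for i in range(3))
    cross = np.cross(v1 - v0, v2 - v0)
    area = 0.5 * np.linalg.norm(cross, axis=1).sum()
    # Signed volume as the sum of tetrahedra spanned by each face and the origin
    volume = np.abs(np.einsum("ij,ij->i", v0, np.cross(v1, v2)).sum()) / 6.0
    return area, volume

# Sanity check on a unit tetrahedron (volume 1/6, area ~2.366)
verts = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
tris = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
print(mesh_area_volume(verts, tris))
```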
Chen, Guijie; Yuan, Qingxia; Saeeduddin, Muhammad; Ou, Shiyi; Zeng, Xiaoxiong; Ye, Hong
2016-11-20
Tea has a long history of medicinal and dietary use. Tea polysaccharide (TPS) is regarded as one of the main bioactive constituents of tea and is beneficial for health. Over the last decades, considerable effort has been devoted to studies of TPS: its extraction, structural features and bioactivity. However, it has received much less attention than tea polyphenols. In order to provide new insight for the further development of TPS in functional foods, in the present review we summarize the recent literature, update the information and put forward future perspectives on TPS, covering its extraction, purification and quantitative determination techniques as well as its physicochemical characterization and bioactivities. Copyright © 2016 Elsevier Ltd. All rights reserved.
Valero, E; Sanz, J; Martínez-Castro, I
2001-06-01
Direct thermal desorption (DTD) has been used as a technique for extracting volatile components of cheese as a preliminary step to their gas chromatographic (GC) analysis. In this study, it is applied to different cheese varieties: Camembert, blue, Chaumes, and La Serena. Volatiles are also extracted using other techniques such as simultaneous distillation-extraction and dynamic headspace. Separation and identification of the cheese components are carried out by GC-mass spectrometry. Approximately 100 compounds are detected in the examined cheeses. The described results show that DTD is fast, simple, and easy to automate; requires only a small amount of sample (approximately 50 mg); and affords quantitative information about the main groups of compounds present in cheeses.
Karayiannis, Nicolaos B; Sami, Abdul; Frost, James D; Wise, Merrill S; Mizrahi, Eli M
2005-04-01
This paper presents an automated procedure developed to extract quantitative information from video recordings of neonatal seizures in the form of motor activity signals. This procedure relies on optical flow computation to select anatomical sites located on the infants' body parts. Motor activity signals are extracted by tracking selected anatomical sites during the seizure using adaptive block matching. A block of pixels is tracked throughout a sequence of frames by searching for the most similar block of pixels in subsequent frames; this search is facilitated by employing various update strategies to account for the changing appearance of the block. The proposed procedure is used to extract temporal motor activity signals from video recordings of neonatal seizures and other events not associated with seizures.
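A minimal sketch of the block-matching step follows: a pixel block is tracked between frames by minimizing the sum of absolute differences over a search window, with a simple running-average template update standing in for the adaptive strategies described above.

```python
import numpy as np

def track_block(prev_frame, next_frame, top_left, block=16, search=8, alpha=0.1):
    """Track a (block x block) patch from prev_frame into next_frame by
    exhaustive search over a +/- search window, minimizing the sum of
    absolute differences (SAD). Returns the new top-left corner and an
    updated template (simple running-average appearance update)."""
    y, x = top_left
    template = prev_frame[y:y + block, x:x + block].astype(float)
    best, best_pos = np.inf, (y, x)
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > next_frame.shape[0] or xx + block > next_frame.shape[1]:
                continue
            sad = np.abs(next_frame[yy:yy + block, xx:xx + block] - template).sum()
            if sad < best:
                best, best_pos = sad, (yy, xx)
    yy, xx = best_pos
    matched = next_frame[yy:yy + block, xx:xx + block].astype(float)
    updated = (1 - alpha) * template + alpha * matched   # account for appearance change
    return best_pos, updated

# Synthetic example: a bright square shifted by (3, 2) pixels between frames
f1 = np.zeros((64, 64)); f1[20:36, 20:36] = 1.0
f2 = np.zeros((64, 64)); f2[23:39, 22:38] = 1.0
print(track_block(f1, f2, (20, 20))[0])   # -> (23, 22)
```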
Quantitative analysis of facial paralysis using local binary patterns in biomedical videos.
He, Shu; Soraghan, John J; O'Reilly, Brian F; Xing, Dongshan
2009-07-01
Facial paralysis is the loss of voluntary muscle movement of one side of the face. A quantitative, objective, and reliable assessment system would be an invaluable tool for clinicians treating patients with this condition. This paper presents a novel framework for objective measurement of facial paralysis. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on local binary patterns (LBPs) in the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of novel block processing schemes. A multiresolution extension of uniform LBP is proposed to efficiently combine the micropatterns and large-scale patterns into a feature vector. The symmetry of facial movements is measured by the resistor-average distance (RAD) between LBP features extracted from the two sides of the face. A support vector machine is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) scale. The proposed method is validated by experiments with 197 subject videos, which demonstrate its accuracy and efficiency.
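The LBP-histogram and resistor-average-distance components can be sketched as follows; the face image is a placeholder, the facial-region selection and SVM grading are omitted, and the parameter choices are illustrative.

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(region, P=8, R=1):
    """Uniform LBP histogram of a grayscale image region (2D array)."""
    codes = local_binary_pattern(region, P, R, method="uniform")
    hist, _ = np.histogram(codes, bins=np.arange(P + 3), density=True)
    return hist + 1e-10   # avoid zero bins in the KL divergences

def resistor_average_distance(p, q):
    """RAD(P, Q): harmonic combination of the two KL divergences."""
    kl_pq = np.sum(p * np.log(p / q))
    kl_qp = np.sum(q * np.log(q / p))
    return 1.0 / (1.0 / kl_pq + 1.0 / kl_qp)

# Compare LBP statistics of the left and right halves of a (placeholder) face image
rng = np.random.default_rng(0)
face = (rng.random((128, 128)) * 255).astype(np.uint8)
left, right = face[:, :64], np.fliplr(face[:, 64:])   # mirror right side for symmetry
print(resistor_average_distance(lbp_histogram(left), lbp_histogram(right)))
```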
IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.
Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd
2011-02-04
Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
NASA Astrophysics Data System (ADS)
Rocha, José Celso; Passalia, Felipe José; Matos, Felipe Delestro; Takahashi, Maria Beatriz; Maserati, Marc Peter, Jr.; Alves, Mayra Fernanda; de Almeida, Tamie Guibu; Cardoso, Bruna Lopes; Basso, Andrea Cristina; Nogueira, Marcelo Fábio Gouveia
2017-12-01
There is currently no objective, real-time and non-invasive method for evaluating the quality of mammalian embryos. In this study, we processed images of in vitro produced bovine blastocysts to obtain a deeper comprehension of the embryonic morphological aspects that are related to the standard evaluation of blastocysts. Information was extracted from 482 digital images of blastocysts. The resulting imaging data were individually evaluated by three experienced embryologists who graded their quality. To avoid evaluation bias, each image was related to the modal value of the evaluations. Automated image processing produced 36 quantitative variables for each image. The images, the modal and individual quality grades, and the variables extracted could potentially be used in the development of artificial intelligence techniques (e.g., evolutionary algorithms and artificial neural networks), multivariate modelling and the study of defined structures of the whole blastocyst.
A method for the extraction and quantitation of phycoerythrin from algae
NASA Technical Reports Server (NTRS)
Stewart, D. E.
1982-01-01
A summary of a new technique for the extraction and quantitation of phycoerythrin (PHE) from algal samples is described. Results of analysis of four extracts representing three PHE types from algae including cryptomonad and cyanophyte types are presented. The method of extraction and an equation for quantitation are given. A graph showing the relationship between concentration and fluorescence units that may be used with samples fluorescing around 575-580 nm (probably dominated by cryptophytes in estuarine waters) and 560 nm (dominated by cyanophytes characteristic of the open ocean) is provided.
Fast Coding of Orientation in Primary Visual Cortex
Shriki, Oren; Kohn, Adam; Shamir, Maoz
2012-01-01
Understanding how populations of neurons encode sensory information is a major goal of systems neuroscience. Attempts to answer this question have focused on responses measured over several hundred milliseconds, a duration much longer than that frequently used by animals to make decisions about the environment. How reliably sensory information is encoded on briefer time scales, and how best to extract this information, is unknown. Although it has been proposed that neuronal response latency provides a major cue for fast decisions in the visual system, this hypothesis has not been tested systematically and in a quantitative manner. Here we use a simple ‘race to threshold’ readout mechanism to quantify the information content of spike time latency of primary visual (V1) cortical cells to stimulus orientation. We find that many V1 cells show pronounced tuning of their spike latency to stimulus orientation and that almost as much information can be extracted from spike latencies as from firing rates measured over much longer durations. To extract this information, stimulus onset must be estimated accurately. We show that the responses of cells with weak tuning of spike latency can provide a reliable onset detector. We find that spike latency information can be pooled from a large neuronal population, provided that the decision threshold is scaled linearly with the population size, yielding a processing time of the order of a few tens of milliseconds. Our results provide a novel mechanism for extracting information from neuronal populations over the very brief time scales in which behavioral judgments must sometimes be made. PMID:22719237
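A toy simulation of the 'race to threshold' readout is sketched below; the latency tuning curve, jitter, pool size and threshold scaling are invented for illustration and are not the parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(2)
orientations = np.arange(0, 180, 22.5)        # preferred orientations of 8 channels
n_per_channel = 50                            # neurons pooled per channel
threshold = int(0.2 * n_per_channel)          # decision threshold scaled with pool size

def simulate_latencies(stimulus, base=30.0, gain=40.0, jitter=8.0):
    """Latency (ms) of every neuron: shorter for channels tuned to the stimulus."""
    delta = np.deg2rad(orientations - stimulus)
    mean_latency = base + gain * (1 - np.cos(2 * delta)) / 2      # latency tuning curve
    lat = rng.normal(mean_latency[:, None], jitter, size=(len(orientations), n_per_channel))
    return np.clip(lat, 1.0, None)

def race_to_threshold(latencies):
    """Decoded orientation = channel whose threshold-th spike arrives first."""
    kth_latency = np.sort(latencies, axis=1)[:, threshold - 1]
    winner = np.argmin(kth_latency)
    return orientations[winner], kth_latency[winner]

stimulus = 45.0
decoded, decision_time = race_to_threshold(simulate_latencies(stimulus))
print(f"stimulus {stimulus} deg -> decoded {decoded} deg in {decision_time:.1f} ms")
```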
Larue, Ruben T H M; Defraene, Gilles; De Ruysscher, Dirk; Lambin, Philippe; van Elmpt, Wouter
2017-02-01
Quantitative analysis of tumour characteristics based on medical imaging is an emerging field of research. In recent years, quantitative imaging features derived from CT, positron emission tomography and MR scans were shown to be of added value in the prediction of outcome parameters in oncology, in what is called the radiomics field. However, results might be difficult to compare owing to a lack of standardized methodologies to conduct quantitative image analyses. In this review, we aim to present an overview of the current challenges, technical routines and protocols that are involved in quantitative imaging studies. The first issue that should be overcome is the dependency of several features on the scan acquisition and image reconstruction parameters. Adopting consistent methods in the subsequent target segmentation step is equally crucial. To further establish robust quantitative image analyses, standardization or at least calibration of imaging features based on different feature extraction settings is required, especially for texture- and filter-based features. Several open-source and commercial software packages to perform feature extraction are currently available, all with slightly different functionalities, which makes benchmarking quite challenging. The number of imaging features calculated is typically larger than the number of patients studied, which emphasizes the importance of proper feature selection and prediction model-building routines to prevent overfitting. Even though many of these challenges still need to be addressed before quantitative imaging can be brought into daily clinical practice, radiomics is expected to be a critical component for the integration of image-derived information to personalize treatment in the future.
Biologically active extracts with kidney affections applications
NASA Astrophysics Data System (ADS)
Pascu (Neagu), Mihaela; Pascu, Daniela-Elena; Cozea, Andreea; Bunaciu, Andrei A.; Miron, Alexandra Raluca; Nechifor, Cristina Aurelia
2015-12-01
This paper aims to select plant materials rich in bioflavonoid compounds, prepared from herbs known for their performance in the prevention and therapy of renal diseases, namely kidney stones and urinary infections (renal lithiasis, nephritis, urethritis, cystitis, etc.). It presents a comparative study of the composition of extracts from medicinal plants of the Ericaceae family: cranberry (fruit and leaves), Vaccinium vitis-idaea L., and bilberry (fruit), Vaccinium myrtillus L. The concentrated extracts obtained from the medicinal plants used in this work were analyzed from structural, morphological and compositional points of view using different techniques: chromatographic methods (HPLC), scanning electron microscopy, infrared and UV spectrophotometry, as well as kinetic modelling. Liquid chromatography identified arbutoside, a compound specific to the Ericaceae family that is present in all three extracts, as well as components specific to each species, mostly from the class of polyphenols. The identification and quantitative determination of the active ingredients in these extracts can provide information related to their therapeutic effects.
Spatiotemporal Characterization of a Fibrin Clot Using Quantitative Phase Imaging
Gannavarpu, Rajshekhar; Bhaduri, Basanta; Tangella, Krishnarao; Popescu, Gabriel
2014-01-01
Studying the dynamics of fibrin clot formation and its morphology is an important problem in biology and has significant impact for several scientific and clinical applications. We present a label-free technique based on quantitative phase imaging to address this problem. Using quantitative phase information, we characterized fibrin polymerization in real-time and present a mathematical model describing the transition from liquid to gel state. By exploiting the inherent optical sectioning capability of our instrument, we measured the three-dimensional structure of the fibrin clot. From this data, we evaluated the fractal nature of the fibrin network and extracted the fractal dimension. Our non-invasive and speckle-free approach analyzes the clotting process without the need for external contrast agents. PMID:25386701
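The fractal-dimension estimate can be illustrated with a standard box-counting procedure on a binary mask; the sketch below uses a synthetic 2D pattern in place of the segmented three-dimensional fibrin network.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal (box-counting) dimension of a binary mask.

    Counts the boxes of each size containing at least one 'on' pixel,
    then fits log(count) against log(1/size)."""
    counts = []
    for s in sizes:
        h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Synthetic scattered pattern as a stand-in for a segmented fibrin network
rng = np.random.default_rng(3)
mask = np.zeros((256, 256), dtype=bool)
points = rng.integers(0, 256, size=(2000, 2))
mask[points[:, 0], points[:, 1]] = True
print(f"estimated box-counting dimension: {box_counting_dimension(mask):.2f}")
```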
Quantitative nanoparticle tracking: applications to nanomedicine.
Huang, Feiran; Dempsey, Christopher; Chona, Daniela; Suh, Junghae
2011-06-01
Particle tracking is an invaluable technique to extract quantitative and qualitative information regarding the transport of nanomaterials through complex biological environments. This technique can be used to probe the dynamic behavior of nanoparticles as they interact with and navigate through intra- and extra-cellular barriers. In this article, we focus on the recent developments in the application of particle-tracking technology to nanomedicine, including the study of synthetic and virus-based materials designed for gene and drug delivery. Specifically, we cover research where mean square displacements of nanomaterial transport were explicitly determined in order to quantitatively assess the transport of nanoparticles through biological environments. Particle-tracking experiments can provide important insights that may help guide the design of more intelligent and effective diagnostic and therapeutic nanoparticles.
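The mean-square-displacement analysis highlighted above can be sketched in a few lines; the example below computes the time-averaged MSD of a simulated Brownian trajectory and recovers the diffusion coefficient from its slope.

```python
import numpy as np

def time_averaged_msd(trajectory, max_lag=None):
    """Time-averaged mean square displacement of one 2D/3D trajectory.

    trajectory: (T, d) array of positions; returns MSD for lags 1..max_lag."""
    T = len(trajectory)
    max_lag = max_lag or T // 4
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean(np.sum((trajectory[lag:] - trajectory[:-lag])**2, axis=1))
                    for lag in lags])
    return lags, msd

# Simulated Brownian trajectory; for free diffusion MSD ~ 2*d*D*t
rng = np.random.default_rng(4)
dt, D, d = 0.03, 0.5, 2                      # frame interval (s), diffusivity, dimensions
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(2000, d))
traj = np.cumsum(steps, axis=0)
lags, msd = time_averaged_msd(traj)
D_est = np.polyfit(lags * dt, msd, 1)[0] / (2 * d)   # slope / (2d) estimates D
print(f"estimated D = {D_est:.3f} (true 0.5)")
```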
Hęś, Marzanna; Gliszczyńska-Świgło, Anna; Gramza-Michałowska, Anna
2017-01-01
Plants are an important source of phenolic compounds. The antioxidant capacities of green tea, thyme and rosemary extracts that contain these compounds have been reported earlier. However, there is a lack of accessible information about their activity against lipid oxidation in emulsions and their ability to inhibit the interaction of lipid oxidation products with amino acids. Therefore, the influence of green tea, thyme and rosemary extracts and BHT (butylated hydroxytoluene) on quantitative changes in lysine and methionine in linoleic acid emulsions, at a pH equal to the isoelectric point of the amino acids and at a pH below it, was investigated. Total phenolic contents in the plant extracts were determined spectrophotometrically using Folin-Ciocalteu's reagent, and individual phenols by HPLC. The level of oxidation of the emulsion was determined by measuring peroxides and TBARS (thiobarbituric acid reactive substances). Methionine and lysine in the system were reacted with sodium nitroprusside and trinitrobenzenesulphonic acid, respectively, and the absorbance of the complexes was measured. The green tea extract had the highest total polyphenol content. Systems containing both antioxidants and an amino acid protected linoleic acid more efficiently than the addition of antioxidants alone. Lysine and methionine losses in samples without added antioxidants were lower at their isoelectric points than below them. Antioxidants decreased the loss of amino acids. The protective effect of the antioxidants towards methionine was higher at the isoelectric-point pH, whereas that towards lysine was higher at a pH below this point. Green tea, thyme and rosemary extracts exhibit antioxidant activity in linoleic acid emulsions. Moreover, they can be used to inhibit quantitative changes in amino acids in lipid emulsions. However, the antioxidant efficiency of these extracts appears to depend on pH conditions. Further investigations should be carried out to clarify this issue.
On the analysis of time-of-flight spin-echo modulated dark-field imaging data
NASA Astrophysics Data System (ADS)
Sales, Morten; Plomp, Jeroen; Bouwman, Wim G.; Tremsin, Anton S.; Habicht, Klaus; Strobl, Markus
2017-06-01
Spin-Echo Modulated Small Angle Neutron Scattering with spatial resolution, i.e. quantitative Spin-Echo Dark Field Imaging, is an emerging technique coupling neutron imaging with spatially resolved quantitative small angle scattering information. However, the currently achievable modulation periods are relatively large, of the order of millimeters, and are superimposed on the images of the samples. So far this has required independent reduction and analysis of the image and scattering information encoded in the measured data, involving extensive curve-fitting routines. Apart from requiring a priori decisions that potentially limit the extractable information content, this also hinders a straightforward judgment of data quality and information content. In contrast, we propose a significantly simplified routine applied directly to the measured data, which not only allows an immediate first assessment of data quality and defers decisions on potentially information-limiting reduction steps to a later and better-informed stage, but also, as the results suggest, generally yields better analyses. In addition, the method removes the requirement for a spatially resolving detector in non-spatially resolved Spin-Echo Modulated Small Angle Neutron Scattering.
Vega, Victor A; Young, Michelle; Todd, Sarah
2016-01-01
An extraction method for aflatoxin M1 from bovine milk samples is described. The samples were extracted by adding 10 mL acetonitrile to 10 g of sample. The extract was salted out with sodium chloride and magnesium sulfate to separate the water and acetonitrile. The organic layer was dried down and reconstituted in water before being subjected to an immunoaffinity column for cleanup. Once the analyte was isolated, quantitation was obtained by LC with fluorescence detection. LC/fluorescence parameters were optimized with an Agilent Poroshell 120 C18 LC column, resulting in a 4 min run time. To test the procedure's robustness, three different kinds of matrices were fortified at three different levels each. Whole milk, reduced fat milk, and skim milk samples were fortified at approximately 0.25, 0.5, and 1.0 μg/kg. Recoveries from all samples ranged from 70 to 100%. Confirmation was accomplished by injecting the samples into an ion trap mass spectrometer. The method presented here entails an extraction step followed by an immunoaffinity column clean-up that leads to a fast analysis time and consistent recoveries, with a measurement uncertainty of 10.5% and a method detection limit of less than 0.011 μg/kg.
Decoding 2D-PAGE complex maps: relevance to proteomics.
Pietrogrande, Maria Chiara; Marchetti, Nicola; Dondi, Francesco; Righetti, Pier Giorgio
2006-03-20
This review describes two mathematical approaches useful for decoding the complex signal of 2D-PAGE maps of protein mixtures. These methods are helpful for interpreting the large amount of data of each 2D-PAGE map by extracting all the analytical information hidden therein by spot overlapping. Here the basic theory and application to 2D-PAGE maps are reviewed: the means for extracting information from the experimental data and their relevance to proteomics are discussed. One method is based on the quantitative theory of the statistical model of peak overlapping (SMO) using the spot experimental data (intensity and spatial coordinates). The second method is based on the study of the 2D-autocovariance function (2D-ACVF) computed on the experimental digitised map. They are two independent methods that are able to extract equal and complementary information from the 2D-PAGE map. Both methods make it possible to obtain fundamental information on the sample complexity and the separation performance and to single out ordered patterns present in spot positions: the availability of two independent procedures to compute the same separation parameters is a powerful tool to estimate the reliability of the obtained results. The SMO procedure is a unique tool to quantitatively estimate the degree of spot overlapping present in the map, while the 2D-ACVF method is particularly powerful in simply singling out the presence of order in the spot positions from the complexity of the whole 2D map, i.e., spot trains. The procedures were validated by extensive numerical computation on computer-generated maps describing experimental 2D-PAGE gels of protein mixtures. Their applicability to real samples was tested on reference maps obtained from literature sources. The review describes the most relevant information for proteomics: sample complexity, separation performance, overlapping extent, and identification of spot trains related to post-translational modifications (PTMs).
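The 2D-ACVF computation lends itself to a compact FFT-based sketch (via the Wiener-Khinchin theorem); the map below is synthetic, with a regular spot train standing in for an ordered pattern, and the SMO analysis is not reproduced.

```python
import numpy as np

def autocovariance_2d(image):
    """2D autocovariance function of a digitised map via FFT
    (Wiener-Khinchin theorem); the zero lag equals the variance."""
    x = image - image.mean()
    power = np.abs(np.fft.fft2(x))**2
    acvf = np.fft.ifft2(power).real / x.size
    return np.fft.fftshift(acvf)              # zero lag moved to the centre

# Synthetic map: a regular train of Gaussian spots plus noise, mimicking
# an ordered spot train in a 2D-PAGE map
rng = np.random.default_rng(5)
yy, xx = np.mgrid[0:256, 0:256]
image = rng.normal(0, 0.05, (256, 256))
for cx in range(32, 256, 32):                 # spots every 32 pixels along x
    image += np.exp(-((xx - cx)**2 + (yy - 128)**2) / (2 * 3.0**2))

acvf = autocovariance_2d(image)
row = acvf[128]                               # lag profile along the spot-train axis
# The spot-train period (32 px) appears as a secondary maximum of the ACVF
print("ACVF at lag 32 vs lag 16:", row[128 + 32], row[128 + 16])
```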
NASA Astrophysics Data System (ADS)
Wu, Tao; Cheung, Tak-Hong; Yim, So-Fan; Qu, Jianan Y.
2010-03-01
A quantitative colposcopic imaging system for the diagnosis of early cervical cancer is evaluated in a clinical study. This imaging technology based on 3-D active stereo vision and motion tracking extracts diagnostic information from the kinetics of acetowhitening process measured from the cervix of human subjects in vivo. Acetowhitening kinetics measured from 137 cervical sites of 57 subjects are analyzed and classified using multivariate statistical algorithms. Cross-validation methods are used to evaluate the performance of the diagnostic algorithms. The results show that an algorithm for screening precancer produced 95% sensitivity (SE) and 96% specificity (SP) for discriminating normal and human papillomavirus (HPV)-infected tissues from cervical intraepithelial neoplasia (CIN) lesions. For a diagnostic algorithm, 91% SE and 90% SP are achieved for discriminating normal tissue, HPV infected tissue, and low-grade CIN lesions from high-grade CIN lesions. The results demonstrate that the quantitative colposcopic imaging system could provide objective screening and diagnostic information for early detection of cervical cancer.
Ravikumar, Komandur Elayavilli; Wagholikar, Kavishwar B; Li, Dingcheng; Kocher, Jean-Pierre; Liu, Hongfang
2015-06-06
Advances in next-generation sequencing technology have accelerated the pace of individualized medicine (IM), which aims to incorporate genetic/genomic information into medicine. One immediate need in interpreting sequencing data is the assembly of information about genetic variants and their corresponding associations with other entities (e.g., diseases or medications). Even with dedicated effort to capture such information in biological databases, much of this information remains 'locked' in the unstructured text of biomedical publications. There is a substantial lag between the publication and the subsequent abstraction of such information into databases. Multiple text mining systems have been developed, but most of them focus on sentence-level association extraction with performance evaluation based on gold standard text annotations specifically prepared for text mining systems. We developed and evaluated a text mining system, MutD, which extracts protein mutation-disease associations from MEDLINE abstracts by incorporating discourse-level analysis, using a benchmark data set extracted from curated database records. MutD achieves an F-measure of 64.3% for reconstructing protein mutation disease associations in curated database records. The discourse-level analysis component of MutD contributed to a gain of more than 10% in F-measure compared with sentence-level association extraction. Our error analysis indicates that 23 of the 64 precision errors are true associations that were not captured by database curators and 68 of the 113 recall errors are caused by the absence of associated disease entities in the abstract. After adjusting for the defects in the curated database, the revised F-measure of MutD in association detection reaches 81.5%. Our quantitative analysis reveals that MutD can effectively extract protein mutation disease associations when benchmarking based on curated database records. The analysis also demonstrates that incorporating discourse-level analysis significantly improved the performance of extracting protein-mutation-disease associations. Future work includes the extension of MutD to full-text articles.
Ni, Yan; Su, Mingming; Qiu, Yunping; Jia, Wei
2017-01-01
ADAP-GC is an automated computational pipeline for untargeted, GC-MS-based metabolomics studies. It takes raw mass spectrometry data as input and carries out a sequence of data processing steps including construction of extracted ion chromatograms, detection of chromatographic peak features, deconvolution of co-eluting compounds, and alignment of compounds across samples. Despite the increased accuracy from the original version to version 2.0 in terms of extracting metabolite information for identification and quantitation, ADAP-GC 2.0 requires appropriate specification of a number of parameters and has difficulty in extracting information on compounds that are in low concentration. To overcome these two limitations, ADAP-GC 3.0 was developed to improve both the robustness and sensitivity of compound detection. In this paper, we report how these goals were achieved and compare ADAP-GC 3.0 against three other software tools including ChromaTOF, AnalyzerPro, and AMDIS that are widely used in the metabolomics community. PMID:27461032
Ni, Yan; Su, Mingming; Qiu, Yunping; Jia, Wei; Du, Xiuxia
2016-09-06
ADAP-GC is an automated computational pipeline for untargeted, GC/MS-based metabolomics studies. It takes raw mass spectrometry data as input and carries out a sequence of data processing steps including construction of extracted ion chromatograms, detection of chromatographic peak features, deconvolution of coeluting compounds, and alignment of compounds across samples. Despite the increased accuracy from the original version to version 2.0 in terms of extracting metabolite information for identification and quantitation, ADAP-GC 2.0 requires appropriate specification of a number of parameters and has difficulty in extracting information on compounds that are in low concentration. To overcome these two limitations, ADAP-GC 3.0 was developed to improve both the robustness and sensitivity of compound detection. In this paper, we report how these goals were achieved and compare ADAP-GC 3.0 against three other software tools including ChromaTOF, AnalyzerPro, and AMDIS that are widely used in the metabolomics community.
Radiomics: a new application from established techniques
Parekh, Vishwa; Jacobs, Michael A.
2016-01-01
The increasing use of biomarkers in cancer has led to the concept of personalized medicine for patients. Personalized medicine makes better diagnosis and treatment options available to clinicians. Radiological imaging techniques provide an opportunity to deliver unique data on different types of tissue. However, obtaining useful information from all radiological data is challenging in the era of "big data". Recent advances in computational power and the use of genomics have generated a new area of research termed Radiomics. Radiomics is defined as the high-throughput extraction of quantitative imaging features or texture from imaging to decode tissue pathology, creating a high-dimensional data set for feature analysis. Radiomic features provide information about gray-scale patterns and inter-pixel relationships. In addition, shape and spectral properties can be extracted within the same regions of interest on radiological images. Moreover, these features can be further used to develop computational models using advanced machine learning algorithms that may serve as a tool for personalized diagnosis and treatment guidance. PMID:28042608
Quantitative Hyperspectral Reflectance Imaging
Klein, Marvin E.; Aalderink, Bernard J.; Padoan, Roberto; de Bruin, Gerrit; Steemers, Ted A.G.
2008-01-01
Hyperspectral imaging is a non-destructive optical analysis technique that can for instance be used to obtain information from cultural heritage objects unavailable with conventional colour or multi-spectral photography. This technique can be used to distinguish and recognize materials, to enhance the visibility of faint or obscured features, to detect signs of degradation and study the effect of environmental conditions on the object. We describe the basic concept, working principles, construction and performance of a laboratory instrument specifically developed for the analysis of historical documents. The instrument measures calibrated spectral reflectance images at 70 wavelengths ranging from 365 to 1100 nm (near-ultraviolet, visible and near-infrared). By using a wavelength tunable narrow-bandwidth light-source, the light energy used to illuminate the measured object is minimal, so that any light-induced degradation can be excluded. Basic analysis of the hyperspectral data includes a qualitative comparison of the spectral images and the extraction of quantitative data such as mean spectral reflectance curves and statistical information from user-defined regions-of-interest. More sophisticated mathematical feature extraction and classification techniques can be used to map areas on the document, where different types of ink had been applied or where one ink shows various degrees of degradation. The developed quantitative hyperspectral imager is currently in use by the Nationaal Archief (National Archives of The Netherlands) to study degradation effects of artificial samples and original documents, exposed in their permanent exhibition area or stored in their deposit rooms. PMID:27873831
Shak, S
1987-01-01
LTB4 and its omega-oxidation products may be rapidly, sensitively, and specifically quantitated by the methods of solid-phase extraction and reversed-phase high-performance liquid chromatography (HPLC), which are described in this chapter. Although other techniques, such as radioimmunoassay or gas chromatography-mass spectrometry, may be utilized for quantitative analysis of the lipoxygenase products of arachidonic acid, only the technique of reversed-phase HPLC can quantitate as many as 10 metabolites in a single analysis, without prior derivatization. In this chapter, we also reviewed the chromatographic theory which we utilized in order to optimize reversed-phase HPLC analysis of LTB4 and its omega-oxidation products. With this information and a gradient HPLC system, it is possible for any investigator to develop a powerful assay for the potent inflammatory mediator, LTB4, or for any other lipoxygenase product of arachidonic acid.
Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification.
Wang, Shouyi; Bowen, Stephen R; Chaovalitwongse, W Art; Sandison, George A; Grabowski, Thomas J; Kinahan, Paul E
2014-02-21
The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUV(peak)) over lesions of interest. Relative differences in SUV(peak) between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUV(peak) values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
Respiratory trace feature analysis for the prediction of respiratory-gated PET quantification
NASA Astrophysics Data System (ADS)
Wang, Shouyi; Bowen, Stephen R.; Chaovalitwongse, W. Art; Sandison, George A.; Grabowski, Thomas J.; Kinahan, Paul E.
2014-02-01
The benefits of respiratory gating in quantitative PET/CT vary tremendously between individual patients. Respiratory pattern is among many patient-specific characteristics that are thought to play an important role in gating-induced imaging improvements. However, the quantitative relationship between patient-specific characteristics of respiratory pattern and improvements in quantitative accuracy from respiratory-gated PET/CT has not been well established. If such a relationship could be estimated, then patient-specific respiratory patterns could be used to prospectively select appropriate motion compensation during image acquisition on a per-patient basis. This study was undertaken to develop a novel statistical model that predicts quantitative changes in PET/CT imaging due to respiratory gating. Free-breathing static FDG-PET images without gating and respiratory-gated FDG-PET images were collected from 22 lung and liver cancer patients on a PET/CT scanner. PET imaging quality was quantified with peak standardized uptake value (SUVpeak) over lesions of interest. Relative differences in SUVpeak between static and gated PET images were calculated to indicate quantitative imaging changes due to gating. A comprehensive multidimensional extraction of the morphological and statistical characteristics of respiratory patterns was conducted, resulting in 16 features that characterize representative patterns of a single respiratory trace. The six most informative features were subsequently extracted using a stepwise feature selection approach. The multiple-regression model was trained and tested based on a leave-one-subject-out cross-validation. The predicted quantitative improvements in PET imaging achieved an accuracy higher than 90% using a criterion with a dynamic error-tolerance range for SUVpeak values. The results of this study suggest that our prediction framework could be applied to determine which patients would likely benefit from respiratory motion compensation when clinicians quantitatively assess PET/CT for therapy target definition and response assessment.
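The model-building and validation scheme described above (stepwise selection of six features, multiple regression, leave-one-subject-out cross-validation, accuracy within an error-tolerance band) can be sketched as follows; the data, feature meanings and tolerance value are placeholders.

```python
import numpy as np
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(6)
n_patients, n_features = 22, 16      # 16 respiratory-trace features per patient
X = rng.normal(size=(n_patients, n_features))
# Synthetic target: relative SUVpeak change driven by a few of the features
y = 0.8 * X[:, 0] - 0.5 * X[:, 3] + 0.3 * X[:, 7] + rng.normal(0, 0.2, n_patients)

model = make_pipeline(
    SequentialFeatureSelector(LinearRegression(), n_features_to_select=6,
                              direction="forward", cv=5),
    LinearRegression())

# Leave-one-subject-out cross-validation of the predicted gating benefit
pred = cross_val_predict(model, X, y, cv=LeaveOneOut())
tolerance = 0.3                                    # illustrative error-tolerance band
accuracy = np.mean(np.abs(pred - y) <= tolerance)
print(f"LOSO prediction accuracy within tolerance: {accuracy:.0%}")
```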
NASA Astrophysics Data System (ADS)
Ambekar Ramachandra Rao, Raghu; Mehta, Monal R.; Toussaint, Kimani C., Jr.
2010-02-01
We demonstrate the use of Fourier transform-second-harmonic generation (FT-SHG) imaging of collagen fibers as a means of performing quantitative analysis of obtained images of selected spatial regions in porcine trachea, ear, and cornea. Two quantitative markers, preferred orientation and maximum spatial frequency, are proposed for differentiating structural information between various spatial regions of interest in the specimens. The ear shows consistent maximum spatial frequency and orientation, as also observed in its real-space image. However, there are observable changes in the orientation and minimum feature size of fibers in the trachea, indicating a more random organization. Finally, the analysis is applied to a 3D image stack of the cornea. It is shown that the standard deviation of the orientation is sensitive to the randomness in fiber orientation. Regions with variations in the maximum spatial frequency, but with relatively constant orientation, suggest that maximum spatial frequency is useful as an independent quantitative marker. We emphasize that FT-SHG is a simple, yet powerful, tool for extracting information from images that is not obvious in real space. This technique can be used as a quantitative biomarker to assess the structure of collagen fibers that may change due to damage from disease or physical injury.
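The two markers named above can both be read off the 2-D Fourier transform of an SHG image. The following sketch assumes a square image and a simple magnitude threshold; it is an illustration of the idea, not the published analysis code.

```python
# Sketch: preferred fiber orientation and maximum spatial frequency from a 2-D FFT.
import numpy as np

def ft_shg_markers(image, pixel_size_um, rel_threshold=0.1):
    F = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    mag = np.abs(F)
    n = image.shape[0]
    freqs = np.fft.fftshift(np.fft.fftfreq(n, d=pixel_size_um))  # cycles per micron
    fy, fx = np.meshgrid(freqs, freqs, indexing="ij")
    radial = np.hypot(fx, fy)

    # Preferred orientation: angle of the strongest non-DC Fourier component.
    peak = np.unravel_index(np.argmax(mag), mag.shape)
    orientation_deg = np.degrees(np.arctan2(fy[peak], fx[peak])) % 180.0

    # Maximum spatial frequency: largest radius whose magnitude exceeds the threshold.
    significant = mag >= rel_threshold * mag.max()
    max_freq = radial[significant].max()
    return orientation_deg, max_freq

img = np.random.rand(256, 256)  # placeholder for an SHG image
theta, fmax = ft_shg_markers(img, pixel_size_um=0.5)
print(f"preferred orientation ~{theta:.1f} deg, max spatial frequency ~{fmax:.2f} 1/um")
```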
A more quantitative extraction of arsenic-containing compounds from seafood matrices is essential in developing better dietary exposure estimates. More quantitative extraction often implies a more chemically aggressive set of extraction conditions. However, these conditions may...
New techniques for positron emission tomography in the study of human neurological disorders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1993-01-01
This progress report describes accomplishments of four programs. The four programs are entitled (1) Faster, simpler processing of positron-computing precursors: New physicochemical approaches, (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production, (3) Quantitative evaluation of the extraction of information from PET images, and (4) Optimization of tracer kinetic methods for radioligand studies in PET.
Radiomics and radiogenomics of prostate cancer.
Smith, Clayton P; Czarniecki, Marcin; Mehralivand, Sherif; Stoyanova, Radka; Choyke, Peter L; Harmon, Stephanie; Turkbey, Baris
2018-06-20
Radiomics and radiogenomics are attractive research topics in prostate cancer. Radiomics mainly focuses on extraction of quantitative information from medical imaging, whereas radiogenomics aims to correlate these imaging features to genomic data. The purpose of this review is to provide a brief overview summarizing recent progress in the application of radiomics-based approaches in prostate cancer and to discuss the potential role of radiogenomics in prostate cancer.
Separation of thorium from lanthanides by solvent extraction with ionizable crown ethers.
Du, H S; Wood, D J; Elshani, S; Wai, C M
1993-02-01
Thorium and the lanthanides are extracted by alpha-(sym-dibenzo-16-crown-5-oxy)acetic acid and its analogues in different pH ranges. At pH 4.5, Th is quantitatively extracted by the crown ether carboxylic acids into chloroform whereas the extraction of the lanthanides is negligible. Separation of Th from the lanthanides can be achieved by solvent extraction under this condition. The extraction does not require specific counteranions and is reversible with respect to pH. Trace amounts of Th in water can be quantitatively recovered using this extraction system for neutron activation analysis. The nature of the extracted Th complex and the mechanism of extraction are discussed.
NASA Astrophysics Data System (ADS)
Chen, Kun; Wu, Tao; Li, Yan; Wei, Haoyun
2017-12-01
Coherent anti-Stokes Raman scattering (CARS) is a powerful nonlinear spectroscopy technique that is rapidly gaining recognition for identifying different molecules. Unfortunately, molecular concentration information is generally not immediately accessible from the raw CARS signal because of the nonresonant background. In addition, mainstream biomedical applications of CARS are currently hampered by complex and bulky excitation setups. Here, we establish a dual-soliton Stokes-based CARS spectroscopy scheme, driven by a single fiber laser, that is capable of quantifying the molecular content of a sample. The dual-soliton CARS scheme takes advantage of a differential configuration to achieve efficient suppression of the nonresonant background and therefore allows extraction of quantitative composition information. In addition, our all-fiber excitation source can probe most of the fingerprint region (1100-1800 cm-1) with a spectral resolution of 15 cm-1 under the spectral focusing mechanism; considerably more information is contained throughout an entire spectrum than at any single frequency within it. Systematic studies of the scope of application and several fundamental aspects are discussed. Quantitative capability is further demonstrated experimentally through the determination of oleic acid concentration based on the linear dependence of the signal on different Raman vibration bands.
Demeke, Tigst; Ratnayaka, Indira; Phan, Anh
2009-01-01
The quality of DNA affects the accuracy and repeatability of quantitative PCR results. Different DNA extraction and purification methods were compared for quantification of Roundup Ready (RR) soybean (event 40-3-2) by real-time PCR. DNA was extracted using cetyltrimethylammonium bromide (CTAB), DNeasy Plant Mini Kit, and Wizard Magnetic DNA purification system for food. CTAB-extracted DNA was also purified using the Zymo (DNA Clean & Concentrator 25 kit), Qtip 100 (Qiagen Genomic-Tip 100/G), and QIAEX II Gel Extraction Kit. The CTAB extraction method provided the largest amount of DNA, and the Zymo purification kit resulted in the highest percentage of DNA recovery. The Abs260/280 and Abs260/230 ratios were less than the expected values for some of the DNA extraction and purification methods used, indicating the presence of substances that could inhibit PCR reactions. Real-time quantitative PCR results were affected by the DNA extraction and purification methods used. Further purification or dilution of the CTAB DNA was required for successful quantification of RR soybean. Less variability of quantitative PCR results was observed among experiments and replications for DNA extracted and/or purified by CTAB, CTAB+Zymo, CTAB+Qtip 100, and DNeasy methods. Correct and repeatable results for real-time PCR quantification of RR soybean were achieved using CTAB DNA purified with Zymo and Qtip 100 methods.
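For readers unfamiliar with the purity ratios quoted above, the helper below converts absorbance readings into a DNA concentration and the two ratios. The 50 ng/uL-per-A260-unit factor is the common double-stranded DNA convention, and the acceptance windows are typical rules of thumb, not the authors' limits.

```python
# Illustrative helper (not from the paper): spectrophotometric DNA quality checks.
def dna_quality(a260, a280, a230, dilution_factor=1.0):
    conc_ng_per_ul = a260 * 50.0 * dilution_factor   # dsDNA approximation
    r_260_280 = a260 / a280                           # ~1.8 expected for clean DNA
    r_260_230 = a260 / a230                           # ~2.0-2.2 expected
    clean = 1.7 <= r_260_280 <= 2.0 and r_260_230 >= 1.8
    return conc_ng_per_ul, r_260_280, r_260_230, clean

conc, r1, r2, ok = dna_quality(a260=0.75, a280=0.40, a230=0.45)
print(f"{conc:.0f} ng/uL, A260/280={r1:.2f}, A260/230={r2:.2f}, PCR-ready: {ok}")
```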
Multi-Intelligence Analytics for Next Generation Analysts (MIAGA)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Waltz, Ed
2016-05-01
Current analysts are inundated with large volumes of data from which extraction, exploitation, and indexing are required. A future need for next-generation analysts is an appropriate balance between machine analytics from raw data and the ability of the user to interact with information through automation. Many quantitative intelligence tools and techniques have been developed, which are examined towards matching analyst opportunities with recent technical trends such as big data, access to information, and visualization. The concepts and techniques summarized are derived from discussions with real analysts, documented trends of technical developments, and methods to engage future analysts with multi-intelligence services. For example, qualitative techniques should be matched against physical, cognitive, and contextual quantitative analytics for intelligence reporting. Future trends include enabling knowledge search, collaborative situational sharing, and agile support for empirical decision-making and analytical reasoning.
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion of a small watershed gully system in the Beiyanzikou catchment of Qixia, China, as the study area and, using object-oriented image analysis (OBIA), extracted the shoulder lines of gullies from high-spatial-resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the distance between the boundary classified by remote sensing and points measured by RTK-GPS along the gully shoulder lines. Finally, the original (pre-erosion) surface was fitted by linear regression to the elevations of the two extracted edges of the experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between field-measured points along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m² and 5074.1790 m³, and 1316.1250 m² and 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion.
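A rough sketch of the volume step described above: fit a planar "original" surface to elevations sampled along the two gully shoulder lines, then integrate its height above the present DEM inside the gully mask. The array names, planar fit, and cell size are assumptions for illustration, not the authors' exact procedure.

```python
import numpy as np

def erosion_volume(dem, gully_mask, shoulder_points, cell_size_m=0.1):
    """dem: 2-D elevation grid; gully_mask: boolean grid of gully cells;
    shoulder_points: (row, col, elevation) samples along the extracted shoulder lines."""
    rows, cols, z = shoulder_points.T
    # Linear regression z = a*row + b*col + c for the pre-erosion surface.
    A = np.column_stack([rows, cols, np.ones_like(rows)])
    coeff, *_ = np.linalg.lstsq(A, z, rcond=None)
    rr, cc = np.indices(dem.shape)
    original = coeff[0] * rr + coeff[1] * cc + coeff[2]
    depth = np.clip(original - dem, 0.0, None)        # only cells below the fitted surface
    volume = depth[gully_mask].sum() * cell_size_m**2
    area = gully_mask.sum() * cell_size_m**2
    return area, volume
```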
Masullo, Milena; Mari, Angela; Cerulli, Antonietta; Bottone, Alfredo; Kontek, Bogdan; Olas, Beata; Pizza, Cosimo; Piacente, Sonia
2016-10-01
There is only limited information available on the chemical composition of the non-edible parts of Corylus avellana, source of the Italian PGI product "Nocciola di Giffoni" (hazelnut). An initial LC-MS profile of the methanolic extract of the male flowers of C. avellana, cultivar 'Tonda di Giffoni', led to the isolation of 12 compounds, whose structures were elucidated by NMR spectroscopy. These were identified as three previously undescribed diarylheptanoids, named giffonins Q-S, along with nine known compounds. Furthermore, the quantitative determination of the main compounds occurring in the methanolic extract of C. avellana flowers was carried out by an analytical approach based on LC-ESI(QqQ)MS, using the Multiple Reaction Monitoring (MRM) experiment. In order to explore the antioxidant ability of C. avellana flowers, the methanolic extract and the isolated compounds were evaluated for their inhibitory effects on human plasma lipid peroxidation induced by H2O2 and H2O2/Fe(2+), by measuring the concentration of TBARS. Copyright © 2016 Elsevier Ltd. All rights reserved.
Chen, Kun; Wu, Tao; Wei, Haoyun; Zhou, Tian; Li, Yan
2016-01-01
Coherent anti-Stokes Raman microscopy (CARS) is a quantitative, chemically specific, and label-free optical imaging technique for studying inhomogeneous systems. However, the complicating influence of the nonresonant response on the CARS signal severely limits its sensitivity and specificity and especially limits the extent to which CARS microscopy has been used as a fully quantitative imaging technique. On the basis of spectral focusing mechanism, we establish a dual-soliton Stokes based CARS microspectroscopy and microscopy scheme capable of quantifying the spatial information of densities and chemical composition within inhomogeneous samples, using a single fiber laser. Dual-soliton Stokes scheme not only removes the nonresonant background but also allows robust acquisition of multiple characteristic vibrational frequencies. This all-fiber based laser source can cover the entire fingerprint (800-2200 cm−1) region with a spectral resolution of 15 cm−1. We demonstrate that quantitative degree determination of lipid-chain unsaturation in the fatty acids mixture can be achieved by the characterization of C = C stretching and CH2 deformation vibrations. For microscopy purposes, we show that the spatially inhomogeneous distribution of lipid droplets can be further quantitatively visualized using this quantified degree of lipid unsaturation in the acyl chain for contrast in the hyperspectral CARS images. The combination of compact excitation source and background-free capability to facilitate extraction of quantitative composition information with multiplex spectral peaks will enable wider applications of quantitative chemical imaging in studying biological and material systems. PMID:27867704
Wires in the soup: quantitative models of cell signaling
Cheong, Raymond; Levchenko, Andre
2014-01-01
Living cells are capable of extracting information from their environments and mounting appropriate responses to a variety of associated challenges. The underlying signal transduction networks enabling this can be quite complex, necessitating sophisticated computational modeling coupled with precise experimentation for their unraveling. Although we are still at the beginning of this process, some recent examples of integrative analysis of cell signaling are very encouraging. This review highlights the case of the NF-κB pathway in order to illustrate how a quantitative model of a signaling pathway can be gradually constructed through continuous experimental validation, and what lessons one might learn from such exercises. PMID:18291655
Measurement Tools for the Immersive Visualization Environment: Steps Toward the Virtual Laboratory.
Hagedorn, John G; Dunkers, Joy P; Satterfield, Steven G; Peskin, Adele P; Kelso, John T; Terrill, Judith E
2007-01-01
This paper describes a set of tools for performing measurements of objects in a virtual reality based immersive visualization environment. These tools enable the use of the immersive environment as an instrument for extracting quantitative information from data representations that hitherto had been used solely for qualitative examination. We provide, within the virtual environment, ways for the user to analyze and interact with the quantitative data generated. We describe results generated by these methods to obtain dimensional descriptors of tissue engineered medical products. We regard this toolbox as our first step in the implementation of a virtual measurement laboratory within an immersive visualization environment.
Gerbig, Stefanie; Stern, Gerold; Brunn, Hubertus E; Düring, Rolf-Alexander; Spengler, Bernhard; Schulz, Sabine
2017-03-01
Direct analysis of fruit and vegetable surfaces is an important tool for in situ detection of food contaminants such as pesticides. We tested three different ways to prepare samples for the qualitative desorption electrospray ionization mass spectrometry (DESI-MS) analysis of 32 pesticides found on nine authentic fruits collected from food control. Best recovery rates for topically applied pesticides (88%) were found by analyzing the surface of a glass slide which had been rubbed against the surface of the food. Pesticide concentration in all samples was at or below the maximum residue level allowed. In addition to the high sensitivity of the method for qualitative analysis, quantitative or, at least, semi-quantitative information is needed in food control. We developed a DESI-MS method for the simultaneous determination of linear calibration curves of multiple pesticides of the same chemical class using normalization to one internal standard (ISTD). The method was first optimized for food extracts and subsequently evaluated for the quantification of pesticides in three authentic food extracts. Next, pesticides and the ISTD were applied directly onto food surfaces, and the corresponding calibration curves were obtained. The determination of linear calibration curves was still feasible, as demonstrated for three different food surfaces. This proof-of-principle method was used to simultaneously quantify two pesticides on an authentic sample, showing that the method developed could serve as a fast and simple preselective tool for disclosure of pesticide regulation violations. Graphical Abstract: Multiple pesticide residues were detected and quantified in situ from an authentic set of food items and extracts in a proof-of-principle study.
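A hedged sketch of the quantitation step described above: each pesticide's signal is normalised to a single internal standard (ISTD) and fitted against spiked concentrations, and unknowns are then read back off the line. The numbers below are illustrative placeholders, not the paper's data.

```python
import numpy as np

spiked_conc = np.array([0.05, 0.1, 0.2, 0.5, 1.0])          # mg/kg, calibration levels
pesticide_signal = np.array([1.1e4, 2.2e4, 4.3e4, 10.8e4, 21.5e4])
istd_signal = np.array([5.0e4, 5.1e4, 4.9e4, 5.2e4, 5.0e4])

response_ratio = pesticide_signal / istd_signal
slope, intercept = np.polyfit(spiked_conc, response_ratio, 1)

def quantify(sample_signal, sample_istd_signal):
    """Convert a measured response ratio back to concentration via the calibration line."""
    return (sample_signal / sample_istd_signal - intercept) / slope

print(f"estimated residue: {quantify(6.5e4, 5.05e4):.3f} mg/kg")
```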
Breast MRI radiomics: comparison of computer- and human-extracted imaging phenotypes.
Sutton, Elizabeth J; Huang, Erich P; Drukker, Karen; Burnside, Elizabeth S; Li, Hui; Net, Jose M; Rao, Arvind; Whitman, Gary J; Zuley, Margarita; Ganott, Marie; Bonaccio, Ermelinda; Giger, Maryellen L; Morris, Elizabeth A
2017-01-01
In this study, we sought to investigate if computer-extracted magnetic resonance imaging (MRI) phenotypes of breast cancer could replicate human-extracted size and Breast Imaging-Reporting and Data System (BI-RADS) imaging phenotypes using MRI data from The Cancer Genome Atlas (TCGA) project of the National Cancer Institute. Our retrospective interpretation study involved analysis of Health Insurance Portability and Accountability Act-compliant breast MRI data from The Cancer Imaging Archive, an open-source database from the TCGA project. This study was exempt from institutional review board approval at Memorial Sloan Kettering Cancer Center and the need for informed consent was waived. Ninety-one pre-operative breast MRIs with verified invasive breast cancers were analysed. Three fellowship-trained breast radiologists evaluated the index cancer in each case according to size and the BI-RADS lexicon for shape, margin, and enhancement (human-extracted image phenotypes [HEIP]). Human inter-observer agreement was analysed by the intra-class correlation coefficient (ICC) for size and Krippendorff's α for other measurements. Quantitative MRI radiomics of computerised three-dimensional segmentations of each cancer generated computer-extracted image phenotypes (CEIP). Spearman's rank correlation coefficients were used to compare HEIP and CEIP. Inter-observer agreement for HEIP varied, with the highest agreement seen for size (ICC 0.679) and shape (ICC 0.527). The computer-extracted maximum linear size replicated the human measurement with p < 10^-12. CEIP of shape, specifically sphericity and irregularity, replicated HEIP with both p values < 0.001. CEIP did not demonstrate agreement with HEIP of tumour margin or internal enhancement. Quantitative radiomics of breast cancer may replicate human-extracted tumour size and BI-RADS imaging phenotypes, thus enabling precision medicine.
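A minimal sketch of the agreement analysis named above: Spearman rank correlation between computer-extracted and human-extracted lesion sizes. The values are synthetic placeholders, not TCGA/TCIA measurements.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
human_size_mm = rng.uniform(8, 60, size=91)                       # consensus of readers
computer_size_mm = human_size_mm + rng.normal(0, 2.0, size=91)    # automated 3-D measurement

rho, p_value = spearmanr(human_size_mm, computer_size_mm)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.2e}")
```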
Robust real-time extraction of respiratory signals from PET list-mode data.
Salomon, Andre; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas
2018-05-01
Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions' detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting ("binning") of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method "Combined Local Motion Detection" (CLMD). It uses the Time-of-Flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using 7 measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similar high correlation coefficients which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU and thus provides a robust and accurate approach enabling real-time processing capabilities using standard PC hardware. © 2018 Institute of Physics and Engineering in Medicine.
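A simplified, assumption-laden sketch in the spirit of the data-driven approach above: the axial centre of mass of back-positioned TOF events is tracked in short time frames and band-pass filtered to the respiratory frequency range. Region handling, quality weighting, and the exact filter are placeholders, not the published CLMD algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def respiratory_signal(event_times_s, event_z_mm, frame_s=0.1, band_hz=(0.1, 0.5)):
    """Estimate a respiratory trace from event timestamps and axial positions."""
    t_edges = np.arange(event_times_s.min(), event_times_s.max(), frame_s)
    frame_idx = np.digitize(event_times_s, t_edges)
    # Axial centre of mass per short time frame.
    com = np.array([event_z_mm[frame_idx == i].mean() if np.any(frame_idx == i) else np.nan
                    for i in range(1, len(t_edges) + 1)])
    com = np.nan_to_num(com, nan=np.nanmean(com))
    # Band-pass to the typical respiratory frequency range.
    fs = 1.0 / frame_s
    b, a = butter(2, [band_hz[0] / (fs / 2), band_hz[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, com - com.mean())   # sampled at 1/frame_s Hz
```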
Robust real-time extraction of respiratory signals from PET list-mode data
NASA Astrophysics Data System (ADS)
Salomon, André; Zhang, Bin; Olivier, Patrick; Goedicke, Andreas
2018-06-01
Respiratory motion, which typically cannot simply be suspended during PET image acquisition, affects lesions’ detection and quantitative accuracy inside or in close vicinity to the lungs. Some motion compensation techniques address this issue via pre-sorting (‘binning’) of the acquired PET data into a set of temporal gates, where each gate is assumed to be minimally affected by respiratory motion. Tracking respiratory motion is typically realized using dedicated hardware (e.g. using respiratory belts and digital cameras). Extracting respiratory signals directly from the acquired PET data simplifies the clinical workflow as it avoids handling additional signal measurement equipment. We introduce a new data-driven method ‘combined local motion detection’ (CLMD). It uses the time-of-flight (TOF) information provided by state-of-the-art PET scanners in order to enable real-time respiratory signal extraction without additional hardware resources. CLMD applies center-of-mass detection in overlapping regions based on simple back-positioned TOF event sets acquired in short time frames. Following a signal filtering and quality-based pre-selection step, the remaining extracted individual position information over time is then combined to generate a global respiratory signal. The method is evaluated using seven measured FDG studies from single and multiple scan positions of the thorax region, and it is compared to other software-based methods regarding quantitative accuracy and statistical noise stability. Correlation coefficients around 90% between the reference and the extracted signal have been found for those PET scans where motion affected features such as tumors or hot regions were present in the PET field-of-view. For PET scans with a quarter of typically applied radiotracer doses, the CLMD method still provides similar high correlation coefficients which indicates its robustness to noise. Each CLMD processing needed less than 0.4 s in total on a standard multi-core CPU and thus provides a robust and accurate approach enabling real-time processing capabilities using standard PC hardware.
Cífková, Eva; Holčapek, Michal; Lísa, Miroslav; Ovčačíková, Magdaléna; Lyčka, Antonín; Lynen, Frédéric; Sandra, Pat
2012-11-20
The identification and quantitation of a wide range of lipids in complex biological samples is an essential requirement for the lipidomic studies. High-performance liquid chromatography-mass spectrometry (HPLC/MS) has the highest potential to obtain detailed information on the whole lipidome, but the reliable quantitation of multiple lipid classes is still a challenging task. In this work, we describe a new method for the nontargeted quantitation of polar lipid classes separated by hydrophilic interaction liquid chromatography (HILIC) followed by positive-ion electrospray ionization mass spectrometry (ESI-MS) using a single internal lipid standard to which all class-specific response factors (RFs) are related. The developed method enables the nontargeted quantitation of lipid classes and molecules inside these classes in contrast to the conventional targeted quantitation, which is based on predefined selected reaction monitoring (SRM) transitions for selected lipids only. In the nontargeted quantitation method described here, concentrations of lipid classes are obtained by the peak integration in HILIC chromatograms multiplied by their RFs related to the single internal standard (i.e., sphingosyl PE, d17:1/12:0) used as a common reference for all polar lipid classes. The accuracy, reproducibility and robustness of the method have been checked by various means: (1) the comparison with conventional lipidomic quantitation using SRM scans on a triple quadrupole (QqQ) mass analyzer, (2) (31)P nuclear magnetic resonance (NMR) quantitation of the total lipid extract, (3) method robustness test using subsequent measurements by three different persons, (4) method transfer to different HPLC/MS systems using different chromatographic conditions, and (5) comparison with previously published results for identical samples, especially human reference plasma from the National Institute of Standards and Technology (NIST human plasma). Results on human plasma, egg yolk and porcine liver extracts are presented and discussed.
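A schematic version (assumed names and numbers) of the single-internal-standard quantitation above: each lipid-class concentration is obtained from the ratio of its HILIC peak area to the ISTD area, scaled by the spiked ISTD amount and a class-specific response factor.

```python
# Illustrative values only; the RFs and areas are not the paper's data.
ISTD_AMOUNT_NMOL = 10.0          # spiked sphingosyl PE (d17:1/12:0), assumed amount
ISTD_AREA = 2.4e6                # its integrated peak area

class_response_factor = {"PC": 1.10, "PE": 0.95, "SM": 1.30}   # assumed class RFs
class_peak_area = {"PC": 8.1e6, "PE": 3.2e6, "SM": 1.5e6}

amount_nmol = {cls: (area / ISTD_AREA) * ISTD_AMOUNT_NMOL * class_response_factor[cls]
               for cls, area in class_peak_area.items()}
print(amount_nmol)
```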
ERIC Educational Resources Information Center
Usher, Karyn M.; Simmons, Carolyn R.; Keating, Daniel W.; Rossi, Henry F., III
2015-01-01
Chemical separations are an important part of an undergraduate chemistry curriculum. Sophomore students often get experience with liquid-liquid extraction in organic chemistry classes, but liquid-liquid extraction is not as often introduced as a quantitative sample preparation method in honors general chemistry or quantitative analysis classes.…
Maldini, Mariateresa; Chessa, Mario; Petretto, Giacomo L; Montoro, Paola; Rourke, Jonathan P; Foddai, Marzia; Nicoletti, Marcello; Pintore, Giorgio
2016-09-01
Myrtus communis L. (Myrtaceae) is a self-seeded shrub, widespread in Sardinia, with anti-inflammatory, antiseptic, antimicrobial, hypoglycemic and balsamic properties. Its berries, employed for the production of sweet myrtle liqueur, are characterised by a high content of bioactive polyphenols, mainly anthocyanins. Anthocyanin composition is quite specific for vegetables/fruits and can be used as a fingerprint to determine the authenticity, geographical origin and quality of raw materials, products and extracts. To rapidly analyse and determine anthocyanins in 17 samples of Myrtus communis berries by developing a platform based on the integration of UHPLC-MS/MS quantitative data and multivariate analysis with the aim of extracting the most information possible from the data. UHPLC-ESI-MS/MS methods, working in positive ion mode, were performed for the detection and determination of target compounds in multiple reaction monitoring (MRM) mode. Optimal chromatographic conditions were achieved using an XSelect HSS T3 column and a gradient elution with 0.1% formic acid in water and 0.1% formic acid in acetonitrile. Principal component analysis (PCA) was applied to the quantitative data to correlate and discriminate 17 geographical collections of Myrtus communis. The developed quantitative method was reliable, sensitive and specific and was successfully applied to the quantification of 17 anthocyanins. Peonidin-3-O-glucoside was the most abundant compound in all the extracts investigated. The developed methodology allows the identification of quali-quantitative differences among M. communis samples and thus defines the quality and value of this raw material for marketed products. Moreover, the reported data have an immediate commercial value due to the current interest in developing antioxidant nutraceuticals from Mediterranean plants, including Sardinian Myrtus communis. Copyright © 2016 John Wiley & Sons, Ltd.
Characterization of Structural and Configurational Properties of DNA by Atomic Force Microscopy.
Meroni, Alice; Lazzaro, Federico; Muzi-Falconi, Marco; Podestà, Alessandro
2018-01-01
We describe a method to extract quantitative information on DNA structural and configurational properties from high-resolution topographic maps recorded by atomic force microscopy (AFM). DNA molecules are deposited on mica surfaces from an aqueous solution, carefully dehydrated, and imaged in air in Tapping Mode. Upon extraction of the spatial coordinates of the DNA backbones from AFM images, several parameters characterizing DNA structure and configuration can be calculated. Here, we explain how to obtain the distribution of contour lengths, end-to-end distances, and gyration radii. This modular protocol can also be used to characterize other statistical parameters from AFM topographies.
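The small helper below, a sketch rather than the authors' analysis code, computes the three quantities named above from an ordered set of backbone coordinates traced in an AFM topograph.

```python
import numpy as np

def backbone_statistics(xy_nm):
    """xy_nm: (N, 2) array of ordered backbone coordinates in nanometres."""
    segments = np.diff(xy_nm, axis=0)
    contour_length = np.linalg.norm(segments, axis=1).sum()
    end_to_end = np.linalg.norm(xy_nm[-1] - xy_nm[0])
    centroid = xy_nm.mean(axis=0)
    gyration_radius = np.sqrt(((xy_nm - centroid) ** 2).sum(axis=1).mean())
    return contour_length, end_to_end, gyration_radius

# Example: a gently curved 100-point trace.
t = np.linspace(0, np.pi / 2, 100)
trace = np.column_stack([300 * np.cos(t), 300 * np.sin(t)])
print(backbone_statistics(trace))
```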
Rapid System to Quantitatively Characterize the Airborne Microbial Community
NASA Technical Reports Server (NTRS)
Macnaughton, Sarah J.
1998-01-01
Bioaerosols have been linked to a wide range of different allergies and respiratory illnesses. Currently, microorganism culture is the most commonly used method for exposure assessment. Such culture techniques, however, generally fail to detect between 90-99% of the actual viable biomass. Consequently, an unbiased technique for detecting airborne microorganisms is essential. In this Phase II proposal, a portable air sampling device has been developed for the collection of airborne microbial biomass from indoor (and outdoor) environments. Methods were evaluated for extracting and identifying lipids that provide information on indoor air microbial biomass, and automation of these procedures was investigated. Also, techniques to automate the extraction of DNA were explored.
NASA Astrophysics Data System (ADS)
Edward, Kert
Quantitative phase microscopy (QPM) allows for the imaging of translucent or transparent biological specimens without the need for exogenous contrast agents. This technique is usually applied towards the investigation of simple cells such as red blood cells which are typically enucleated and can be considered to be homogenous. However, most biological cells are nucleated and contain other interesting intracellular organelles. It has been established that the physical characteristics of certain subsurface structures, such as the shape and roughness of the nucleus, are well correlated with the onset and progression of pathological conditions such as cancer. Although the acquired quantitative phase information of biological cells contains surface information as well as coupled subsurface information, the latter has been ignored up until now. A novel scanning quantitative phase imaging system unencumbered by 2π ambiguities is hereby presented. This system is incorporated into a shear-force feedback scheme which allows for simultaneous phase and topography determination. It will be shown how subsequent image processing of these two data sets allows for the extraction of the subsurface component in the phase data and in vivo cell refractometry studies. Both fabricated samples and biological cells ranging from rat fibroblast cells to malaria infected human erythrocytes were investigated as part of this research. The results correlate quite well with those obtained via other microscopy techniques.
Preliminary experiments on pharmacokinetic diffuse fluorescence tomography of CT-scanning mode
NASA Astrophysics Data System (ADS)
Zhang, Yanqi; Wang, Xin; Yin, Guoyan; Li, Jiao; Zhou, Zhongxing; Zhao, Huijuan; Gao, Feng; Zhang, Limin
2016-10-01
In vivo tomographic imaging of the fluorescence pharmacokinetic parameters in tissues can provide additional specific and quantitative physiological and pathological information to that of fluorescence concentration. This modality normally requires a highly-sensitive diffuse fluorescence tomography (DFT) working in dynamic way to finally extract the pharmacokinetic parameters from the measured pharmacokinetics-associated temporally-varying boundary intensity. This paper is devoted to preliminary experimental validation of our proposed direct reconstruction scheme of instantaneous sampling based pharmacokinetic-DFT: A highly-sensitive DFT system of CT-scanning mode working with four parallel photomultiplier-tube photon-counting channels is developed to generate an instantaneous sampling dataset; a direct reconstruction scheme then extracts images of the pharmacokinetic parameters using the adaptive-EKF strategy. We design a dynamic phantom that can simulate the agent metabolism in living tissue. The results of the dynamic phantom experiments verify the validity of the experimental system and reconstruction algorithms, and demonstrate that the system provides good resolution, high sensitivity, and good quantitative accuracy at different pump speeds.
X-ray phase contrast tomography by tracking near field speckle
Wang, Hongchang; Berujon, Sebastien; Herzen, Julia; Atwood, Robert; Laundy, David; Hipp, Alexander; Sawhney, Kawal
2015-01-01
X-ray imaging techniques that capture variations in the x-ray phase can yield higher contrast images with lower x-ray dose than is possible with conventional absorption radiography. However, the extraction of phase information is often more difficult than the extraction of absorption information and requires a more sophisticated experimental arrangement. We here report a method for three-dimensional (3D) X-ray phase contrast computed tomography (CT) which gives quantitative volumetric information on the real part of the refractive index. The method is based on the recently developed X-ray speckle tracking technique in which the displacement of near field speckle is tracked using a digital image correlation algorithm. In addition to differential phase contrast projection images, the method allows the dark-field images to be simultaneously extracted. After reconstruction, compared to conventional absorption CT images, the 3D phase CT images show greatly enhanced contrast. This new imaging method has advantages compared to other X-ray imaging methods in simplicity of experimental arrangement, speed of measurement and relative insensitivity to beam movements. These features make the technique an attractive candidate for material imaging such as in-vivo imaging of biological systems containing soft tissue. PMID:25735237
Tuinman, Albert A; Lewis, Linda A; Lewis, Samuel A
2003-06-01
The application of electrospray ionization mass spectrometry (ESI-MS) to trace-fiber color analysis is explored using acidic dyes commonly employed to color nylon-based fibers, as well as extracts from dyed nylon fibers. Qualitative information about constituent dyes and quantitative information about the relative amounts of those dyes present on a single fiber become readily available using this technique. Sample requirements for establishing the color identity of different samples (i.e., comparative trace-fiber analysis) are shown to be submillimeter. Absolute verification of dye mixture identity (beyond the comparison of molecular weights derived from ESI-MS) can be obtained by expanding the technique to include tandem mass spectrometry (ESI-MS/MS). For dyes of unknown origin, the ESI-MS/MS analyses may offer insights into the chemical structure of the compound, information not available from chromatographic techniques alone. This research demonstrates that ESI-MS is viable as a sensitive technique for distinguishing dye constituents extracted from a minute amount of trace-fiber evidence. A protocol is suggested to establish/refute the proposition that two fibers, one of which is available only in minute quantity, are of the same origin.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol dosed mouse brain (10- μm-thick), kidney (10- μm-thick) and liver (8-, 10-, 16- and 24- μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10- μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision ( ±15%) and accuracy ( ±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency was calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...
2016-06-22
Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol dosed mouse brain (10- μm-thick), kidney (10- μm-thick) and liver (8-, 10-, 16- and 24- μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. These values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10- μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision ( ±15%) and accuracy ( ±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency was calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor intensive and high-throughput means to acquire spatially resolved quantitative analysis of multiple samples of the same type.
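A back-of-the-envelope sketch of the comparison described above: extraction efficiency is the droplet-sampled amount divided by the amount found by bulk extraction of an adjacent punch, and is then applied as a correction factor to new droplet measurements. The numbers are illustrative only.

```python
def extraction_efficiency(droplet_amount_pg, bulk_amount_pg):
    """Fraction of analyte recovered by droplet-based sampling relative to bulk extraction."""
    return droplet_amount_pg / bulk_amount_pg

def corrected_amount(droplet_amount_pg, efficiency):
    """Apply a previously calibrated efficiency to a new droplet-sampled measurement."""
    return droplet_amount_pg / efficiency

eff = extraction_efficiency(droplet_amount_pg=45.0, bulk_amount_pg=100.0)   # ~45%
print(f"efficiency: {eff:.0%}, corrected amount: {corrected_amount(31.5, eff):.1f} pg")
```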
Generalized Feature Extraction for Wrist Pulse Analysis: From 1-D Time Series to 2-D Matrix.
Dimin Wang; Zhang, David; Guangming Lu
2017-07-01
Traditional Chinese pulse diagnosis, known as an empirical science, depends on subjective experience, and inconsistent diagnostic results may be obtained by different practitioners. A scientific way of studying the pulse is to analyze objectified wrist pulse waveforms. In recent years, many pulse acquisition platforms have been developed with advances in sensor and computer technology, and pulse diagnosis using pattern recognition theory is attracting increasing attention. Although many studies on pulse feature extraction have been published, they treat pulse signals simply as 1-D time series and ignore the information within the class. This paper presents a generalized method of pulse feature extraction, extending the feature dimension from a 1-D time series to a 2-D matrix. The conventional wrist pulse features correspond to a particular case of the generalized models. The proposed method is validated through pattern classification on actual pulse records. Both quantitative and qualitative results relative to the 1-D pulse features are given through diabetes diagnosis. The experimental results show that the generalized 2-D matrix feature is effective in extracting both periodic and nonperiodic information and is practical for wrist pulse analysis.
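A conceptual sketch of the "1-D time series to 2-D matrix" idea: single-period segments of a wrist-pulse recording are resampled to a common length and stacked row-wise, so that both within-period shape and period-to-period variation are available to a classifier. The peak-based period detection and the segment length are assumptions, not the paper's feature definition.

```python
import numpy as np
from scipy.signal import find_peaks

def pulse_matrix(signal, fs_hz, samples_per_period=128):
    peaks, _ = find_peaks(signal, distance=int(0.4 * fs_hz))   # >= 0.4 s between beats
    rows = []
    for start, stop in zip(peaks[:-1], peaks[1:]):
        period = signal[start:stop]
        # Resample each beat to a fixed length so beats can be stacked.
        resampled = np.interp(np.linspace(0, len(period) - 1, samples_per_period),
                              np.arange(len(period)), period)
        rows.append(resampled)
    return np.vstack(rows)    # shape: (n_periods, samples_per_period)
```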
ERIC Educational Resources Information Center
Valverde, Juan; This, Herve; Vignolle, Marc
2007-01-01
A simple method for the quantitative determination of photosynthetic pigments extracted from green beans using thin-layer chromatography is proposed. Various extraction methods are compared, and it is shown how a simple flatbed scanner and free software for image processing can give a quantitative determination of pigments. (Contains 5 figures.)
USDA-ARS?s Scientific Manuscript database
A high resolution GC/MS with Selected Ion Monitor (SIM) method focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts and commercial products was developed and validated. The method involved sample extraction with (1:1) meth...
A method was developed for the confirmed identification and quantitation of 17B-estradiol, estrone, 17B-ethynylestrodial and 16a-hydroxy-17B-estradiol (estriol) in ground water and swine lagoon samples. Centrifuged and filtered samples were extracted using solid phase extraction...
Sabu, Thomas K.; Shiju, Raj T.
2010-01-01
The present study provides data to decide on the most appropriate method for sampling ground-dwelling arthropods in a moist-deciduous forest in the Western Ghats in South India. The abundance of ground-dwelling arthropods was compared among large numbers of samples obtained using pitfall trapping and the Berlese and Winkler extraction methods. The highest abundance and frequency of most of the represented taxa indicated pitfall trapping as the ideal method for sampling ground-dwelling arthropods. However, because of a possible bias towards surface-active taxa, pitfall-trapping data are inappropriate for quantitative studies; Berlese extraction is the better method for quantitative measurements, whereas pitfall trapping would be appropriate for qualitative measurements. A comparison of the Berlese and Winkler extraction data shows that, in a quantitative multigroup approach, Winkler extraction was inferior to Berlese extraction because the total number of arthropods caught was the lowest and many of the taxa caught from an identical sample via Berlese extraction were not caught at all. A significantly greater frequency and higher abundance of arthropods belonging to Orthoptera, Blattaria, and Diptera occurred in pitfall-trapped samples, and of Psocoptera and Acariformes in Berlese-extracted samples, than were obtained with the other methods, indicating that the two methods are complementary and that using both eliminates the chance of under-representation of taxa in quantitative studies. PMID:20673122
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuhl, D.E.
1993-06-01
This progress report describes accomplishments of four programs. The four programs are entitled (1) Faster, simpler processing of positron-computing precursors: New physicochemical approaches, (2) Novel solid phase reagents and methods to improve radiosynthesis and isotope production, (3) Quantitative evaluation of the extraction of information from PET images, and (4) Optimization of tracer kinetic methods for radioligand studies in PET.
[Cardiac Synchronization Function Estimation Based on ASM Level Set Segmentation Method].
Zhang, Yaonan; Gao, Yuan; Tang, Liang; He, Ying; Zhang, Huie
At present, there are no accurate and quantitative methods for determining cardiac mechanical synchrony, and quantitative determination of the synchronization function of the four cardiac cavities from medical images has great clinical value. This paper uses whole-heart ultrasound image sequences and segments the left and right atria and left and right ventricles in each frame. After segmentation, the number of pixels in each cavity in each frame is recorded, giving the areas of the four cavities across the image sequence. The area-change curves of the four cavities are then extracted, and their synchronization information is obtained. Because of the low SNR of ultrasound images, the boundaries of the cardiac cavities are vague, so extraction of cardiac contours remains a challenging problem. Therefore, active shape model (ASM) information is added to the traditional level set method to constrain the curve evolution process. The experimental results show that the improved method increases segmentation accuracy. Furthermore, based on the ventricular segmentation, right and left ventricular systolic function is evaluated, mainly from the area changes, and the synchronization of the four cardiac cavities is estimated from the area and volume changes.
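An outline, with assumed label conventions, of the synchrony measure described above: pixel counts per labelled cavity per frame give four area-change curves, and the lag of the maximum of a normalised cross-correlation between two curves summarises their synchronisation.

```python
import numpy as np

def cavity_area_curves(label_stack, labels=(1, 2, 3, 4), pixel_area_mm2=0.04):
    """label_stack: (n_frames, H, W) integer masks; labels: assumed LA, RA, LV, RV values."""
    return {lab: np.array([(frame == lab).sum() * pixel_area_mm2 for frame in label_stack])
            for lab in labels}

def synchrony_lag(curve_a, curve_b):
    a = (curve_a - curve_a.mean()) / curve_a.std()
    b = (curve_b - curve_b.mean()) / curve_b.std()
    xcorr = np.correlate(a, b, mode="full")
    return np.argmax(xcorr) - (len(a) - 1)   # lag in frames; 0 means synchronous
```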
Aldeek, Fadi; Hsieh, Kevin C; Ugochukwu, Obiadada N; Gerard, Ghislain; Hammack, Walter
2018-05-23
We developed and validated a method for the extraction, identification, and quantitation of four nitrofuran metabolites, 3-amino-2-oxazolidinone (AOZ), 3-amino-5-morpholinomethyl-2-oxazolidinone (AMOZ), semicarbazide (SC), and 1-aminohydantoin (AHD), as well as chloramphenicol and florfenicol in a variety of seafood commodities. Samples were extracted by liquid-liquid extraction techniques, analyzed by ultrahigh-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS), and quantitated using commercially sourced, derivatized nitrofuran metabolites, with their isotopically labeled internal standards in-solvent. We obtained recoveries of 90-100% at various fortification levels. The limit of detection (LOD) was set at 0.25 ng/g for AMOZ and AOZ, 1 ng/g for AHD and SC, and 0.1 ng/g for the phenicols. Various extraction methods, standard stability, derivatization efficiency, and improvements to conventional quantitation techniques were also investigated. We successfully applied this method to the identification and quantitation of nitrofuran metabolites and phenicols in 102 imported seafood products. Our results revealed that four of the samples contained residues from banned veterinary drugs.
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2012-02-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to materials on metallic surfaces for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases, such as uniform absorption/desorption of small biomolecules and films, in which a continuous "slab" model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allows for quantitative kinetic measurements of laterally heterogeneous systems. The degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.
Quantitative pathology in virtual microscopy: history, applications, perspectives.
Kayser, Gian; Kayser, Klaus
2013-07-01
With the emerging success of commercially available personal computers and the rapid progress in the development of information technologies, morphometric analyses of static histological images have been introduced to improve our understanding of the biology of diseases such as cancer. The first applications were quantifications of immunohistochemical expression patterns. In addition to object counting and feature extraction, laws of thermodynamics have been applied in morphometric calculations termed syntactic structure analysis. Here, one has to consider that the information of an image can be calculated for separate hierarchical layers such as single pixels, clusters of pixels, segmented small objects, clusters of small objects, and objects of higher order composed of several small objects. Using syntactic structure analysis in histological images, functional states can be extracted and the efficiency of labor in tissues can be quantified. Image standardization procedures, such as shading correction and color normalization, can overcome artifacts blurring clear thresholds. Morphometric techniques are not only useful for learning more about biological features of growth patterns; they can also be helpful in routine diagnostic pathology. In such cases, entropy calculations are applied in analogy to theoretical considerations concerning information content. Thus, regions with high information content can automatically be highlighted. Analysis of these "regions of high diagnostic value", in the context of clinical information, site of involvement, and patient data (e.g. age, sex), can support histopathological differential diagnoses. It can be expected that quantitative virtual microscopy will open new possibilities for automated histological support. Automated integrated quantification of histological slides also serves for quality assurance. The development and theoretical background of morphometric analyses in histopathology are reviewed, as well as their application and potential future implementation in virtual microscopy. Copyright © 2012 Elsevier GmbH. All rights reserved.
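A toy illustration of the entropy idea above: Shannon entropy of grey-level histograms in non-overlapping patches, so that information-rich regions can be highlighted for review. The patch size and bin count are arbitrary choices, not values from the review.

```python
import numpy as np

def patch_entropy_map(image, patch=64, bins=32):
    n_rows, n_cols = image.shape[0] // patch, image.shape[1] // patch
    ent = np.zeros((n_rows, n_cols))
    for i in range(n_rows):
        for j in range(n_cols):
            block = image[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            counts, _ = np.histogram(block, bins=bins)
            p = counts / counts.sum()
            p = p[p > 0]
            ent[i, j] = -(p * np.log2(p)).sum()
    return ent   # high values flag information-rich regions
```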
Adebayo, Esther F; Ataguba, John E; Uthman, Olalekan A; Okwundu, Charles I; Lamont, Kim T; Wiysonge, Charles S
2014-01-01
Introduction Many people residing in low-income and middle-income countries (LMICs) are regularly exposed to catastrophic healthcare expenditure. It is therefore pertinent that LMICs should finance their health systems in ways that ensure that their citizens can use needed healthcare services and are protected from potential impoverishment arising from having to pay for services. Ways of financing health systems include government funding, health insurance schemes and out-of-pocket payment. A health insurance scheme refers to pooling of prepaid funds in a way that allows for risks to be shared. The health insurance scheme particularly suitable for the rural poor and the informal sector in LMICs is community-based health insurance (CBHI), that is, insurance schemes operated by organisations other than governments or private for-profit companies. We plan to search for and summarise currently available evidence on factors associated with the uptake of CBHI, as we are not aware of previous systematic reviews that have looked at this important topic. Methods This is a protocol for a systematic review of the literature. We will include both quantitative and qualitative studies in this review. Eligible quantitative studies include intervention and observational studies. Qualitative studies to be included are focus group discussions, direct observations, interviews, case studies and ethnography. We will search EMBASE, PubMed, Scopus, ERIC, PsycInfo, Africa-Wide Information, Academic Search Premier, Business Source Premier, WHOLIS, CINAHL and the Cochrane Library for eligible studies available by 31 October 2013, regardless of publication status or language of publication. We will also check reference lists of included studies and proceedings of relevant conferences and contact researchers for eligible studies. Two authors will independently screen the search output, select studies and extract data, resolving discrepancies by consensus and discussion. Qualitative data will be extracted using standardised data extraction tools adapted from the Critical Appraisal Skills Program (CASP) qualitative appraisal checklist and put together in a thematic analysis where applicable. We will statistically pool data from quantitative studies in a meta-analysis; but if included quantitative studies differ significantly in study settings, design and/or outcome measures, we will present the findings in a narrative synthesis. This protocol has been registered with PROSPERO (ID=CRD42013006364). Dissemination Recommendations will be made to health policy makers, managers and researchers in LMICs to help inform them on ways to strengthen and increase the uptake of CBHI. PMID:24531450
Adebayo, Esther F; Ataguba, John E; Uthman, Olalekan A; Okwundu, Charles I; Lamont, Kim T; Wiysonge, Charles S
2014-02-14
Many people residing in low-income and middle-income countries (LMICs) are regularly exposed to catastrophic healthcare expenditure. It is therefore pertinent that LMICs should finance their health systems in ways that ensure that their citizens can use needed healthcare services and are protected from potential impoverishment arising from having to pay for services. Ways of financing health systems include government funding, health insurance schemes and out-of-pocket payment. A health insurance scheme refers to pooling of prepaid funds in a way that allows for risks to be shared. The health insurance scheme particularly suitable for the rural poor and the informal sector in LMICs is community-based health insurance (CBHI), that is, insurance schemes operated by organisations other than governments or private for-profit companies. We plan to search for and summarise currently available evidence on factors associated with the uptake of CBHI, as we are not aware of previous systematic reviews that have looked at this important topic. This is a protocol for a systematic review of the literature. We will include both quantitative and qualitative studies in this review. Eligible quantitative studies include intervention and observational studies. Qualitative studies to be included are focus group discussions, direct observations, interviews, case studies and ethnography. We will search EMBASE, PubMed, Scopus, ERIC, PsycInfo, Africa-Wide Information, Academic Search Premier, Business Source Premier, WHOLIS, CINAHL and the Cochrane Library for eligible studies available by 31 October 2013, regardless of publication status or language of publication. We will also check reference lists of included studies and proceedings of relevant conferences and contact researchers for eligible studies. Two authors will independently screen the search output, select studies and extract data, resolving discrepancies by consensus and discussion. Qualitative data will be extracted using standardised data extraction tools adapted from the Critical Appraisal Skills Program (CASP) qualitative appraisal checklist and put together in a thematic analysis where applicable. We will statistically pool data from quantitative studies in a meta-analysis; but if included quantitative studies differ significantly in study settings, design and/or outcome measures, we will present the findings in a narrative synthesis. This protocol has been registered with PROSPERO (ID=CRD42013006364). Recommendations will be made to health policy makers, managers and researchers in LMICs to help inform them on ways to strengthen and increase the uptake of CBHI.
Analysing magnetism using scanning SQUID microscopy.
Reith, P; Renshaw Wang, X; Hilgenkamp, H
2017-12-01
Scanning superconducting quantum interference device microscopy (SSM) is a scanning probe technique that images local magnetic flux, which allows for mapping of magnetic fields with high field and spatial accuracy. Many studies involving SSM have been published in the last few decades, using SSM to make qualitative statements about magnetism. However, quantitative analysis using SSM has received less attention. In this work, we discuss several aspects of interpreting SSM images and methods to improve quantitative analysis. First, we analyse the spatial resolution and how it depends on several factors. Second, we discuss the analysis of SSM scans and the information obtained from the SSM data. Using simulations, we show how signals evolve as a function of changing scan height, SQUID loop size, magnetization strength, and orientation. We also investigated 2-dimensional autocorrelation analysis to extract information about the size, shape, and symmetry of magnetic features. Finally, we provide an outlook on possible future applications and improvements.
Quantitative non-invasive intracellular imaging of Plasmodium falciparum infected human erythrocytes
NASA Astrophysics Data System (ADS)
Edward, Kert; Farahi, Faramarz
2014-05-01
Malaria is a virulent pathological condition which results in over a million annual deaths. The parasitic agent Plasmodium falciparum has been extensively studied in connection with this epidemic but much remains unknown about its development inside the red blood cell host. Optical and fluorescence imaging are two of the most common procedures for investigating infected erythrocytes, but both require the introduction of exogenous contrast agents. In this letter, we present a procedure for the non-invasive in situ imaging of malaria-infected red blood cells. The procedure is based on the utilization of simultaneously acquired quantitative phase and independent topography data to extract intracellular information. Our method allows for the identification of the developmental stages of the parasite and facilitates in situ analysis of the morphological changes associated with the progression of this disease. This information may assist in the development of efficacious treatment therapies for this condition.
Bae, Jong-Myon
2016-01-01
A common method for conducting a quantitative systematic review (QSR) for observational studies related to nutritional epidemiology is the "highest versus lowest intake" method (HLM), in which only the information concerning the effect size (ES) of the highest category of a food item is collected on the basis of its lowest category. However, in the interval collapsing method (ICM), a method suggested to enable a maximum utilization of all available information, the ES information is collected by collapsing all categories into a single category. This study aimed to compare the ES and summary effect size (SES) between the HLM and ICM. A QSR for evaluating the citrus fruit intake and risk of pancreatic cancer and calculating the SES by using the HLM was selected. The ES and SES were estimated by performing a meta-analysis using the fixed-effect model. The directionality and statistical significance of the ES and SES were used as criteria for determining the concordance between the HLM and ICM outcomes. No significant differences were observed in the directionality of SES extracted by using the HLM or ICM. The application of the ICM, which uses a broader information base, yielded more-consistent ES and SES, and narrower confidence intervals than the HLM. The ICM is advantageous over the HLM owing to its higher statistical accuracy in extracting information for QSR on nutritional epidemiology. The application of the ICM should hence be recommended for future studies.
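The fixed-effect model mentioned above pools study-level effect sizes by inverse-variance weighting. The following Python sketch shows that calculation on hypothetical log-scale effect sizes; it is a generic illustration, not the authors' analysis or data.

```python
import numpy as np

def fixed_effect_summary(log_es, se):
    """Inverse-variance weighted summary effect size (fixed-effect model).

    log_es : per-study effect sizes on the log scale (e.g., log relative risks)
    se     : corresponding standard errors
    Returns the summary effect and its 95% confidence interval on the
    exponentiated scale.
    """
    log_es, se = np.asarray(log_es, float), np.asarray(se, float)
    w = 1.0 / se**2                              # inverse-variance weights
    pooled = np.sum(w * log_es) / np.sum(w)
    pooled_se = np.sqrt(1.0 / np.sum(w))
    lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
    return np.exp(pooled), (np.exp(lo), np.exp(hi))

# Hypothetical effect sizes from three studies
ses, ci = fixed_effect_summary([-0.22, -0.11, -0.35], [0.10, 0.15, 0.20])
print(f"SES = {ses:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```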
Challenges in Extracting Information From Large Hydrogeophysical-monitoring Datasets
NASA Astrophysics Data System (ADS)
Day-Lewis, F. D.; Slater, L. D.; Johnson, T.
2012-12-01
Over the last decade, new automated geophysical data-acquisition systems have enabled collection of increasingly large and information-rich geophysical datasets. Concurrent advances in field instrumentation, web services, and high-performance computing have made real-time processing, inversion, and visualization of large three-dimensional tomographic datasets practical. Geophysical-monitoring datasets have provided high-resolution insights into diverse hydrologic processes including groundwater/surface-water exchange, infiltration, solute transport, and bioremediation. Despite the high information content of such datasets, extraction of quantitative or diagnostic hydrologic information is challenging. Visual inspection and interpretation for specific hydrologic processes is difficult for datasets that are large, complex, and (or) affected by forcings (e.g., seasonal variations) unrelated to the target hydrologic process. New strategies are needed to identify salient features in spatially distributed time-series data and to relate temporal changes in geophysical properties to hydrologic processes of interest while effectively filtering unrelated changes. Here, we review recent work using time-series and digital-signal-processing approaches in hydrogeophysics. Examples include applications of cross-correlation, spectral, and time-frequency (e.g., wavelet and Stockwell transforms) approaches to (1) identify salient features in large geophysical time series; (2) examine correlation or coherence between geophysical and hydrologic signals, even in the presence of non-stationarity; and (3) condense large datasets while preserving information of interest. Examples demonstrate analysis of large time-lapse electrical tomography and fiber-optic temperature datasets to extract information about groundwater/surface-water exchange and contaminant transport.
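As a small illustration of the cross-correlation approach described above, the following Python sketch (using scipy) estimates the lag at which a geophysical time series best correlates with a hydrologic forcing; the variable names and synthetic series are assumptions for demonstration only.

```python
import numpy as np
from scipy import signal

def lagged_cross_correlation(geophys, hydro, dt=1.0):
    """Normalized cross-correlation between a geophysical time series (e.g.,
    bulk resistivity) and a hydrologic forcing (e.g., river stage).

    Returns the lag (in units of dt) at which the correlation peaks, a simple
    way to identify the delay of the subsurface response.
    """
    g = (geophys - geophys.mean()) / geophys.std()
    h = (hydro - hydro.mean()) / hydro.std()
    cc = signal.correlate(g, h, mode="full") / len(g)
    lags = signal.correlation_lags(len(g), len(h), mode="full") * dt
    return lags[np.argmax(cc)], cc.max()

# Synthetic example: the resistivity response lags the stage by 12 samples
rng = np.random.default_rng(1)
stage = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * rng.normal(size=1000)
resist = np.roll(stage, 12) + 0.1 * rng.normal(size=1000)
print(lagged_cross_correlation(resist, stage))   # peak lag near 12
```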
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Baoqiang; Berti, Romain; Abran, Maxime
2014-05-15
Ultrasound imaging, having the advantages of low cost and non-invasiveness over MRI and X-ray CT, was reported by several studies as an adequate complement to fluorescence molecular tomography with the perspective of improving localization and quantification of fluorescent molecular targets in vivo. Based on the previous work, an improved dual-modality Fluorescence-Ultrasound imaging system was developed and then validated in an imaging study with a preclinical tumor model. Ultrasound imaging and a profilometer were used to obtain the anatomical prior information and 3D surface, separately, to precisely extract the tissue boundary on both sides of the sample in order to achieve improved fluorescence reconstruction. Furthermore, a pattern-based fluorescence reconstruction on the detection side was incorporated to enable dimensional reduction of the dataset while keeping the useful information for reconstruction. Due to its putative role in the current imaging geometry and the chosen reconstruction technique, we developed an attenuation-compensated Born-normalization method to reduce the attenuation effects and cancel out experimental factors when collecting quantitative fluorescence datasets over a large area. Results of both simulation and phantom studies demonstrated that fluorescent targets could be recovered accurately and quantitatively using this reconstruction mechanism. Finally, the in vivo experiment confirmed that the imaging system associated with the proposed image reconstruction approach was able to extract both functional and anatomical information, thereby improving quantification and localization of molecular targets.
NASA Astrophysics Data System (ADS)
O'Keeffe, Jimmy; Buytaert, Wouter; Mijic, Ana; Brozovic, Nicholas
2015-04-01
To build an accurate, robust understanding of the environment, it is important to collect information not only on its physical characteristics but also on the drivers which influence it. As environmental change, from increasing CO2 levels to decreasing water levels, is often heavily influenced by human activity, gathering information on anthropogenic as well as environmental variables is extremely important. This can mean collecting qualitative as well as quantitative information. In reality, studies are often bound by financial and time constraints, limiting the depth and detail of the research. It is up to the researcher to determine the methodology best suited to answering the research questions. Here we present a methodology of collecting qualitative and quantitative information in tandem for hydrological studies through the use of semi-structured interviews. This is applied to a case study in two districts of Uttar Pradesh, North India, one of the most intensely irrigated areas of the world. Here, decreasing water levels, exacerbated by unchecked water abstraction, an expanding population and government subsidies, have put the long-term resilience of the farming population in doubt. Through random selection of study locations, combined with convenience sampling of the participants therein, we show how the data collected can provide valuable insight into the drivers which have led to the current water scenario. We also show how reliable quantitative information can, using the same methodology, be effectively and efficiently extracted for modelling purposes, which along with developing an understanding of the characteristics of the environment is vital in coming up with realistic and sustainable solutions for water resource management in the future.
Hydrophobic ionic liquids for quantitative bacterial cell lysis with subsequent DNA quantification.
Fuchs-Telka, Sabine; Fister, Susanne; Mester, Patrick-Julian; Wagner, Martin; Rossmanith, Peter
2017-02-01
DNA is one of the most frequently analyzed molecules in the life sciences. In this article we describe a simple and fast protocol for quantitative DNA isolation from bacteria based on hydrophobic ionic liquid supported cell lysis at elevated temperatures (120-150 °C) for subsequent PCR-based analysis. From a set of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide was identified as the most suitable for quantitative cell lysis and DNA extraction because of limited quantitative PCR inhibition by the aqueous eluate as well as no detectable DNA uptake. The newly developed method was able to efficiently lyse Gram-negative bacterial cells, whereas Gram-positive cells were protected by their thick cell wall. The performance of the final protocol resulted in quantitative DNA extraction efficiencies for Gram-negative bacteria similar to those obtained with a commercial kit, whereas the number of handling steps, and especially the time required, was dramatically reduced. Graphical Abstract After careful evaluation of five hydrophobic ionic liquids, 1-butyl-1-methylpyrrolidinium bis(trifluoromethylsulfonyl)imide ([BMPyr⁺][Ntf₂⁻]) was identified as the most suitable ionic liquid for quantitative cell lysis and DNA extraction. When used for Gram-negative bacteria, the protocol presented is simple and very fast and achieves DNA extraction efficiencies similar to those obtained with a commercial kit. ddH₂O double-distilled water, qPCR quantitative PCR.
Hao, Yong; Sun, Xu-Dong; Yang, Qiang
2012-12-01
A variable selection strategy combined with local linear embedding (LLE) was introduced for the analysis of complex samples by near infrared spectroscopy (NIRS). Three methods, Monte Carlo uninformative variable elimination (MCUVE), the successive projections algorithm (SPA), and MCUVE combined with SPA, were used for eliminating redundant spectral variables. Partial least squares regression (PLSR) and LLE-PLSR were used for modeling complex samples. The results showed that MCUVE can both extract effective informative variables and improve the precision of models. Compared with PLSR models, LLE-PLSR models can achieve more accurate analysis results. MCUVE combined with LLE-PLSR is an effective modeling method for NIRS quantitative analysis.
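For readers unfamiliar with PLSR, the sketch below shows a generic partial least squares regression on synthetic NIR-like spectra using scikit-learn; it illustrates only the baseline PLSR step, not the MCUVE, SPA, or LLE extensions of the paper, and all names and data are invented for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic NIR-like data: 200 samples x 500 wavelengths, one analyte property
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 500))
true_coef = np.zeros(500)
true_coef[100:120] = 0.5                    # only a few informative variables
y = X @ true_coef + 0.1 * rng.normal(size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)         # latent variables; tune by CV in practice
pls.fit(X_tr, y_tr)
print("R^2 on test set:", pls.score(X_te, y_te))
```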
3D electron tomography of pretreated biomass informs atomic modeling of cellulose microfibrils.
Ciesielski, Peter N; Matthews, James F; Tucker, Melvin P; Beckham, Gregg T; Crowley, Michael F; Himmel, Michael E; Donohoe, Bryon S
2013-09-24
Fundamental insights into the macromolecular architecture of plant cell walls will elucidate new structure-property relationships and facilitate optimization of catalytic processes that produce fuels and chemicals from biomass. Here we introduce computational methodology to extract nanoscale geometry of cellulose microfibrils within thermochemically treated biomass directly from electron tomographic data sets. We quantitatively compare the cell wall nanostructure in corn stover following two leading pretreatment strategies: dilute acid with iron sulfate co-catalyst and ammonia fiber expansion (AFEX). Computational analysis of the tomographic data is used to extract mathematical descriptions for longitudinal axes of cellulose microfibrils from which we calculate their nanoscale curvature. These nanostructural measurements are used to inform the construction of atomistic models that exhibit features of cellulose within real, process-relevant biomass. By computational evaluation of these atomic models, we propose relationships between the crystal structure of cellulose Iβ and the nanoscale geometry of cellulose microfibrils.
NASA Astrophysics Data System (ADS)
Kwak, Sangmin; Song, Seok Goo; Kim, Geunyoung; Cho, Chang Soo; Shin, Jin Soo
2017-10-01
Using recordings of a mine collapse event (Mw 4.2) in South Korea in January 2015, we demonstrated that the phase and amplitude information of impulse response functions (IRFs) can be effectively retrieved using seismic interferometry. This event is equivalent to a single downward force at shallow depth. Using quantitative metrics, we compared three different seismic interferometry techniques—deconvolution, coherency, and cross correlation—to extract the IRFs between two distant stations with ambient seismic noise data. The azimuthal dependency of the source distribution of the ambient noise was also evaluated. We found that deconvolution is the best method for extracting IRFs from ambient seismic noise within the period band of 2-10 s. The coherency method is also effective if appropriate spectral normalization or whitening schemes are applied during the data processing.
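A minimal sketch of deconvolution interferometry, assuming two station records `u_a` and `u_b` are available as NumPy arrays; the water-level regularization and synthetic test are illustrative choices, not the processing used in the paper.

```python
import numpy as np

def deconvolve_irf(u_a, u_b, water_level=0.01):
    """Estimate the impulse response between stations A and B by regularized
    spectral division (deconvolution interferometry):
        D(f) = U_B(f) * conj(U_A(f)) / max(|U_A(f)|^2, eps)
    A water-level term stabilizes the division at spectral notches.
    """
    ua = np.fft.rfft(u_a)
    ub = np.fft.rfft(u_b)
    power = np.abs(ua) ** 2
    eps = water_level * power.max()
    d = ub * np.conj(ua) / np.maximum(power, eps)
    return np.fft.irfft(d, n=len(u_a))

# Synthetic check: station B records a delayed, scaled copy of station A
rng = np.random.default_rng(2)
u_a = rng.normal(size=4096)
u_b = 0.5 * np.roll(u_a, 50)
irf = deconvolve_irf(u_a, u_b)
print(np.argmax(irf))                        # peak near sample 50
```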
Gómez-Ríos, Germán Augusto; Gionfriddo, Emanuela; Poole, Justen; Pawliszyn, Janusz
2017-07-05
The direct interface of microextraction technologies to mass spectrometry (MS) has unquestionably revolutionized the speed and efficacy at which complex matrices are analyzed. Solid Phase Micro Extraction-Transmission Mode (SPME-TM) is a technology conceived as an effective synergy between sample preparation and ambient ionization. Succinctly, the device consists of a mesh coated with polymeric particles that extracts analytes of interest present in a given sample matrix. This coated mesh acts as a transmission-mode substrate for Direct Analysis in Real Time (DART), allowing for rapid and efficient thermal desorption/ionization of analytes previously concentrated on the coating, and dramatically lowering the limits of detection attained by sole DART analysis. In this study, we present SPME-TM as a novel tool for the ultrafast enrichment of pesticides present in food and environmental matrices and their quantitative determination by MS via DART ionization. Limits of quantitation in the subnanogram per milliliter range can be attained, while total analysis time does not exceed 2 min per sample. In addition to target information obtained via tandem MS, retrospective studies of the same sample via high-resolution mass spectrometry (HRMS) were accomplished by thermally desorbing a different segment of the microextraction device.
Monakhova, Yulia B; Mushtakova, Svetlana P
2017-05-01
A fast and reliable spectroscopic method for multicomponent quantitative analysis of targeted compounds with overlapping signals in complex mixtures has been established. The innovative analytical approach is based on the preliminary chemometric extraction of qualitative and quantitative information from UV-vis and IR spectral profiles of a calibration system using independent component analysis (ICA). Using this quantitative model and ICA resolution results of spectral profiling of "unknown" model mixtures, the absolute analyte concentrations in multicomponent mixtures and authentic samples were then calculated without reference solutions. Good recoveries generally between 95% and 105% were obtained. The method can be applied to any spectroscopic data that obey the Beer-Lambert-Bouguer law. The proposed method was tested on analysis of vitamins and caffeine in energy drinks and aromatic hydrocarbons in motor fuel with 10% error. The results demonstrated that the proposed method is a promising tool for rapid simultaneous multicomponent analysis in the case of spectral overlap and the absence/inaccessibility of reference materials.
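To illustrate the ICA resolution step, the sketch below applies scikit-learn's FastICA to synthetic overlapping UV-vis spectra; arranging wavelengths as rows lets the independent components approximate pure-component spectra and the mixing matrix approximate relative concentrations. The calibration to absolute concentrations described in the abstract is not reproduced here, and all data are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic UV-vis data: two overlapping Gaussian bands, 30 mixture spectra
wl = np.linspace(200, 400, 400)
s1 = np.exp(-((wl - 260) / 15) ** 2)             # pure-component spectrum 1
s2 = np.exp(-((wl - 290) / 20) ** 2)             # pure-component spectrum 2
rng = np.random.default_rng(3)
conc = rng.uniform(0.1, 1.0, size=(30, 2))       # Beer-Lambert weights
spectra = conc @ np.vstack([s1, s2]) + 0.005 * rng.normal(size=(30, 400))

# Rows = wavelengths, columns = mixtures: ICA then resolves pure spectra as the
# independent components and relative concentrations as the mixing matrix.
ica = FastICA(n_components=2, random_state=0)
pure_est = ica.fit_transform(spectra.T)          # (n_wavelengths, 2)
rel_conc = ica.mixing_                           # (n_mixtures, 2), up to scale/sign
print(pure_est.shape, rel_conc.shape)
```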
Data from quantitative label free proteomics analysis of rat spleen.
Dudekula, Khadar; Le Bihan, Thierry
2016-09-01
The dataset presented in this work has been obtained using a label-free quantitative proteomic analysis of rat spleen. A robust method for extraction of proteins from rat spleen tissue and LC-MS-MS analysis was developed using a urea and SDS-based buffer. Different fractionation methods were compared. A total of 3484 different proteins were identified from the pool of all experiments run in this study (a total of 2460 proteins with at least two peptides). A total of 1822 proteins were identified from nine non-fractionated pulse gels, while 2288 and 2864 proteins were identified by SDS-PAGE fractionation into three and five fractions, respectively. The proteomics data are deposited in the ProteomeXchange Consortium via PRIDE (PXD003520); Progenesis and MaxQuant outputs are presented in the supporting information. The lists of proteins generated under the different fractionation regimes allow the nature of the identified proteins and the variability in the quantitative analysis associated with each sampling strategy to be assessed, and allow a proper number of replicates to be defined for future quantitative analyses.
Review: Magnetic resonance imaging techniques in ophthalmology
Fagan, Andrew J.
2012-01-01
Imaging the eye with magnetic resonance imaging (MRI) has proved difficult due to the eye’s propensity to move involuntarily over typical imaging timescales, obscuring the fine structure in the eye due to the resulting motion artifacts. However, advances in MRI technology help to mitigate such drawbacks, enabling the acquisition of high spatiotemporal resolution images with a variety of contrast mechanisms. This review aims to classify the MRI techniques used to date in clinical and preclinical ophthalmologic studies, describing the qualitative and quantitative information that may be extracted and how this may inform on ocular pathophysiology. PMID:23112569
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trease, Lynn L.; Trease, Harold E.; Fowler, John
2007-03-15
One of the critical steps toward performing computational biology simulations, using mesh based integration methods, is in using topologically faithful geometry derived from experimental digital image data as the basis for generating the computational meshes. Digital image data representations contain both the topology of the geometric features and experimental field data distributions. The geometric features that need to be captured from the digital image data are three-dimensional, therefore the process and tools we have developed work with volumetric image data represented as data-cubes. This allows us to take advantage of 2D curvature information during the segmentation and feature extraction process. The process is basically: 1) segmenting to isolate and enhance the contrast of the features that we wish to extract and reconstruct, 2) extracting the geometry of the features in an isosurfacing technique, and 3) building the computational mesh using the extracted feature geometry. “Quantitative” image reconstruction and feature extraction is done for the purpose of generating computational meshes, not just for producing graphics "screen" quality images. For example, the surface geometry that we extract must represent a closed water-tight surface.
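A minimal sketch of the isosurfacing step (step 2 above) using scikit-image's marching cubes on a synthetic data-cube; the actual segmentation and meshing tools used in the work are not specified in the abstract, so this is only a generic illustration.

```python
import numpy as np
from skimage import measure

# Synthetic segmented data-cube: a spherical "feature" inside a 64^3 volume
z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
volume = np.sqrt(x**2 + y**2 + z**2)

# Extract a closed, watertight triangle surface at the chosen iso-level
verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
print(f"{len(verts)} vertices, {len(faces)} triangles")
# verts/faces can then be handed to a mesh generator for simulation.
```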
Johansson, M; Lenngren, S
1988-11-18
Extraction of the hydrophobic tertiary amine bromhexine from plasma using cyclohexane-heptafluorobutanol (99.5:0.5, v/v) was studied at different pH values. The extraction yield from buffer solutions was quantitative at pH greater than 4.1, but from plasma the extraction yield decreased with increasing pH. Furthermore, at pH 8.4 the extraction yield varied greatly (56-99%) in different human plasma. The addition of lipoproteins to phosphate buffer, at pH 8.1, decreased the extraction yields considerably. Quantitative extraction from plasma was obtained by using a very long extraction time at pH 8.4 or by decreasing the pH to 5.4. The chromatographic system consisted of a reversed-phase column (Nucleosil C18, 5 microns) with an acidic mobile phase (methanol-phosphate buffer, pH 2) containing an aliphatic tertiary amine. UV detection at 308 or 254 nm was used. The limit of quantitation was 5 ng/ml using 3.00 ml of plasma and detection at 254 nm. The intra-assay precision for bromhexine was better than 3.6% at 5 ng/ml.
Wang, Tao; He, Fuhong; Zhang, Anding; Gu, Lijuan; Wen, Yangmao; Jiang, Weiguo; Shao, Hongbo
2014-01-01
This paper took a subregion in a small watershed gully system at Beiyanzikou catchment of Qixia, China, as the study area and, using object-orientated image analysis (OBIA), extracted the shoulder lines of gullies from high spatial resolution digital orthophoto map (DOM) aerial photographs. Next, it proposed an accuracy assessment method based on the adjacent distance between the boundary classified by remote sensing and points measured by RTK-GPS along the shoulder line of gullies. Finally, the original surface was fitted using linear regression in accordance with the elevation of two extracted edges of experimental gullies, named Gully 1 and Gully 2, and the erosion volume was calculated. The results indicate that OBIA can effectively extract gully information; the average distance between points field-measured along the gully edges and the classified boundary is 0.3166 m, with a variance of 0.2116 m. The erosion areas and volumes of the two gullies are 2141.6250 m², 5074.1790 m³ and 1316.1250 m², 1591.5784 m³, respectively. The results of the study provide a new method for the quantitative study of small gully erosion. PMID:24616626
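A rough sketch of the volume step, under the assumption that the "original" surface is a plane fitted by least squares to elevations along the extracted shoulder lines and that the erosion volume is the integrated depth below that plane; the function and toy DEM below are illustrative, not the paper's implementation.

```python
import numpy as np

def erosion_volume(dem, gully_mask, edge_mask, cell_area=1.0):
    """Estimate gully erosion volume from a DEM.

    A plane z = a*x + b*y + c is fitted by least squares to the elevations of
    cells on the gully shoulder line (edge_mask); the volume is the summed
    depth between that fitted "original" surface and the DEM inside the gully
    footprint (gully_mask), multiplied by the cell area.
    """
    yy, xx = np.nonzero(edge_mask)
    A = np.column_stack([xx, yy, np.ones(len(xx))])
    coef, *_ = np.linalg.lstsq(A, dem[yy, xx], rcond=None)

    gy, gx = np.nonzero(gully_mask)
    original = coef[0] * gx + coef[1] * gy + coef[2]
    depth = np.clip(original - dem[gy, gx], 0.0, None)
    return depth.sum() * cell_area

# Toy example: a 100x100 DEM (1 m cells) with a 20x20 cell, 2 m deep depression
dem = np.full((100, 100), 50.0)
dem[40:60, 40:60] -= 2.0
gully = np.zeros_like(dem, dtype=bool)
gully[40:60, 40:60] = True
edge = np.zeros_like(dem, dtype=bool)
edge[39, 39:61] = True
edge[60, 39:61] = True
print(erosion_volume(dem, gully, edge, cell_area=1.0))   # ~800 m^3
```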
Raffo, Antonio; Carcea, Marina; Castagna, Claudia; Magrì, Andrea
2015-08-07
An improved method based on headspace solid phase microextraction combined with gas chromatography-mass spectrometry (HS-SPME/GC-MS) was proposed for the semi-quantitative determination of wheat bread volatile compounds isolated from both whole slice and crust samples. A DVB/CAR/PDMS fibre was used to extract volatiles from the headspace of a powdered bread sample dispersed in a sodium chloride (20%) aqueous solution and kept for 60 min at 50 °C under controlled stirring. Thirty-nine out of all the extracted volatiles were fully identified, whereas for 95 other volatiles a tentative identification was proposed, to give as complete a profile as possible of wheat bread volatile compounds. The use of an array of ten structurally and physicochemically similar internal standards allowed method precision to be markedly improved with respect to previous HS-SPME/GC-MS methods for bread volatiles. Good linearity of the method was verified for a selection of volatiles from several chemical groups by calibration with matrix-matched extraction solutions. This simple, rapid, precise and sensitive method could represent a valuable tool to obtain semi-quantitative information when investigating the influence of technological factors on volatiles formation in wheat bread and other bakery products. Copyright © 2015 Elsevier B.V. All rights reserved.
Manolov, Rumen; Losada, José L; Chacón-Moscoso, Salvador; Sanduvete-Chaves, Susana
2016-01-01
Two-phase single-case designs, including baseline evaluation followed by an intervention, represent the most clinically straightforward option for combining professional practice and research. However, unless they are part of a multiple-baseline schedule, such designs do not allow demonstrating a causal relation between the intervention and the behavior. Although the statistical options reviewed here cannot help overcoming this methodological limitation, we aim to make practitioners and applied researchers aware of the available appropriate options for extracting maximum information from the data. In the current paper, we suggest that the evaluation of behavioral change should include visual and quantitative analyses, complementing the substantive criteria regarding the practical importance of the behavioral change. Specifically, we emphasize the need to use structured criteria for visual analysis, such as the ones summarized in the What Works Clearinghouse Standards, especially if such criteria are complemented by visual aids, as illustrated here. For quantitative analysis, we focus on the non-overlap of all pairs and the slope and level change procedure, as they offer straightforward information and have shown reasonable performance. An illustration is provided of the use of these three pieces of information: visual, quantitative, and substantive. To make the use of visual and quantitative analysis feasible, open source software is referred to and demonstrated. In order to provide practitioners and applied researchers with a more complete guide, several analytical alternatives are commented on pointing out the situations (aims, data patterns) for which these are potentially useful.
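The non-overlap of all pairs (NAP) index mentioned above has a standard definition that is easy to compute directly; the following Python sketch implements that textbook definition on hypothetical A-B data and is not the software referred to in the abstract.

```python
def nap(baseline, intervention):
    """Non-overlap of All Pairs (NAP) for a two-phase (A-B) single-case design.

    Each baseline observation is compared with each intervention observation;
    pairs where the intervention value is higher count as 1, ties as 0.5.
    NAP = (wins + 0.5 * ties) / (n_A * n_B), assuming improvement = increase.
    """
    wins = ties = 0
    for a in baseline:
        for b in intervention:
            if b > a:
                wins += 1
            elif b == a:
                ties += 1
    return (wins + 0.5 * ties) / (len(baseline) * len(intervention))

# Hypothetical A-B data
print(nap([2, 3, 3, 4], [5, 6, 4, 7, 6]))     # 0.975
```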
Zone analysis in biology articles as a basis for information extraction.
Mizuta, Yoko; Korhonen, Anna; Mullen, Tony; Collier, Nigel
2006-06-01
In the field of biomedicine, an overwhelming amount of experimental data has become available as a result of the high throughput of research in this domain. The amount of results reported has now grown beyond the limits of what can be managed by manual means. This makes it increasingly difficult for the researchers in this area to keep up with the latest developments. Information extraction (IE) in the biological domain aims to provide an effective automatic means to dynamically manage the information contained in archived journal articles and abstract collections and thus help researchers in their work. However, while considerable advances have been made in certain areas of IE, pinpointing and organizing factual information (such as experimental results) remains a challenge. In this paper we propose tackling this task by incorporating into IE information about rhetorical zones, i.e. classification of spans of text in terms of argumentation and intellectual attribution. As the first step towards this goal, we introduce a scheme for annotating biological texts for rhetorical zones and provide a qualitative and quantitative analysis of the data annotated according to this scheme. We also discuss our preliminary research on automatic zone analysis, and its incorporation into our IE framework.
Real-time Social Internet Data to Guide Forecasting Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Valle, Sara Y.
Our goal is to improve decision support by monitoring and forecasting events using social media and mathematical models, and by quantifying model uncertainty. Our approach is real-time, data-driven forecasts with quantified uncertainty: not just for weather anymore. Information flow from human observations of events through an Internet system and classification algorithms is used to produce forecasts with quantified uncertainty. In summary, we want to develop new tools to extract useful information from Internet data streams, develop new approaches to assimilate real-time information into predictive models, validate approaches by forecasting events, and ultimately develop an event forecasting system using mathematical approaches and heterogeneous data streams.
Giger, Maryellen L.; Chan, Heang-Ping; Boone, John
2008-01-01
The roles of physicists in medical imaging have expanded over the years, from the study of imaging systems (sources and detectors) and dose to the assessment of image quality and perception, the development of image processing techniques, and the development of image analysis methods to assist in detection and diagnosis. The latter is a natural extension of medical physicists’ goals in developing imaging techniques to help physicians acquire diagnostic information and improve clinical decisions. Studies indicate that radiologists do not detect all abnormalities on images that are visible on retrospective review, and they do not always correctly characterize abnormalities that are found. Since the 1950s, the potential use of computers had been considered for analysis of radiographic abnormalities. In the mid-1980s, however, medical physicists and radiologists began major research efforts for computer-aided detection or computer-aided diagnosis (CAD), that is, using the computer output as an aid to radiologists—as opposed to a completely automatic computer interpretation—focusing initially on methods for the detection of lesions on chest radiographs and mammograms. Since then, extensive investigations of computerized image analysis for detection or diagnosis of abnormalities in a variety of 2D and 3D medical images have been conducted. The growth of CAD over the past 20 years has been tremendous—from the early days of time-consuming film digitization and CPU-intensive computations on a limited number of cases to its current status in which developed CAD approaches are evaluated rigorously on large clinically relevant databases. CAD research by medical physicists includes many aspects—collecting relevant normal and pathological cases; developing computer algorithms appropriate for the medical interpretation task including those for segmentation, feature extraction, and classifier design; developing methodology for assessing CAD performance; validating the algorithms using appropriate cases to measure performance and robustness; conducting observer studies with which to evaluate radiologists in the diagnostic task without and with the use of the computer aid; and ultimately assessing performance with a clinical trial. Medical physicists also have an important role in quantitative imaging, by validating the quantitative integrity of scanners and developing imaging techniques, and image analysis tools that extract quantitative data in a more accurate and automated fashion. As imaging systems become more complex and the need for better quantitative information from images grows, the future includes the combined research efforts from physicists working in CAD with those working on quantitative imaging systems to readily yield information on morphology, function, molecular structure, and more—from animal imaging research to clinical patient care. A historical review of CAD and a discussion of challenges for the future are presented here, along with the extension to quantitative image analysis. PMID:19175137
Relating interesting quantitative time series patterns with text events and text features
NASA Astrophysics Data System (ADS)
Wanner, Franz; Schreck, Tobias; Jentner, Wolfgang; Sharalieva, Lyubka; Keim, Daniel A.
2013-12-01
In many application areas, the key to successful data analysis is the integrated analysis of heterogeneous data. One example is the financial domain, where time-dependent and highly frequent quantitative data (e.g., trading volume and price information) and textual data (e.g., economic and political news reports) need to be considered jointly. Data analysis tools need to support an integrated analysis, which allows studying the relationships between textual news documents and quantitative properties of the stock market price series. In this paper, we describe a workflow and tool that allows a flexible formation of hypotheses about text features and their combinations, which reflect quantitative phenomena observed in stock data. To support such an analysis, we combine the analysis steps of frequent quantitative and text-oriented data using an existing a-priori method. First, based on heuristics we extract interesting intervals and patterns in large time series data. The visual analysis supports the analyst in exploring parameter combinations and their results. The identified time series patterns are then input for the second analysis step, in which all identified intervals of interest are analyzed for frequent patterns co-occurring with financial news. An a-priori method supports the discovery of such sequential temporal patterns. Then, various text features like the degree of sentence nesting, noun phrase complexity, the vocabulary richness, etc. are extracted from the news to obtain meta patterns. Meta patterns are defined by a specific combination of text features which significantly differ from the text features of the remaining news data. Our approach combines a portfolio of visualization and analysis techniques, including time-, cluster- and sequence visualization and analysis functionality. We provide two case studies, showing the effectiveness of our combined quantitative and textual analysis work flow. The workflow can also be generalized to other application domains such as data analysis of smart grids, cyber physical systems or the security of critical infrastructure, where the data consists of a combination of quantitative and textual time series data.
Quantitative determination of superoxide in plant leaves using a modified NBT staining method.
Bournonville, Carlos F Grellet; Díaz-Ricci, Juan Carlos
2011-01-01
In plants, the level of reactive oxygen species (ROS) is tightly regulated because their accumulation produces irreversible damage leading to cell death. However, ROS accumulation plays a key role in plant signaling under biotic or abiotic stress. Although various methods were reported to evaluate ROS accumulation, they are restricted to model plants or provide only qualitative information. The objective was to develop a simple method to quantify superoxide radicals produced in plant tissues, based on the selective extraction of the formazan produced after nitroblue tetrazolium (NBT) reduction in histochemical staining. Plant leaves were stained with a standard NBT method and the formazan precipitated in tissues was selectively extracted using chloroform. The organic phase was dried and the formazan residue was dissolved in dimethylsulfoxide-potassium hydroxide and quantified by spectrophotometry. The method was tested in strawberry plant leaves under different stressing conditions. Formazan extracted from leaves subjected to stress conditions showed similar absorption spectra to those obtained from standard solutions using pure formazan. Calibration curves showed a linear relationship between absorbance and formazan amounts, within the range 0.5-8 µg. Outcomes suggested that formazan was retained in the solid residue of leaf tissues. This protocol allowed us to quantify superoxide radicals produced under different stress conditions. Chloroform allowed a selective formazan extraction and removal of potential endogenous, exogenous or procedural artefacts that may interfere with the quantitative determination. This protocol can be used to quantify the superoxide produced in plant tissues using any traditional qualitative NBT histochemical staining method. Copyright © 2011 John Wiley & Sons, Ltd.
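The calibration-curve step (linear between absorbance and formazan mass over 0.5-8 µg) can be summarized by a simple least-squares fit and its inversion, as in the sketch below; the standards and absorbance values are invented for illustration, not data from the paper.

```python
import numpy as np

# Illustrative calibration data: formazan standards (µg) vs absorbance
mass_ug = np.array([0.5, 1, 2, 4, 6, 8])
absorbance = np.array([0.06, 0.12, 0.23, 0.46, 0.70, 0.93])

slope, intercept = np.polyfit(mass_ug, absorbance, 1)   # linear fit A = m*x + b

def formazan_mass(a_sample):
    """Invert the calibration line to get formazan mass (µg) for a sample."""
    return (a_sample - intercept) / slope

print(round(formazan_mass(0.35), 2), "µg formazan")
```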
McDonald, Jeffrey G.; Cummins, Carolyn L.; Barkley, Robert M.; Thompson, Bonne M.; Lincoln, Holly A.
2009-01-01
Reported here is the mass spectral identification of sorbitol-based nuclear clarifying agents (NCAs) and the quantitative description of their extractability from common laboratory and household plasticware made of polypropylene. NCAs are frequently added to polypropylene to improve optical clarity, increase performance properties, and aid in the manufacturing process of this plastic. NCA addition makes polypropylene plasticware more aesthetically pleasing to the user and makes the product competitive with other plastic formulations. We show here that several NCAs are readily extracted with either ethanol or water from plastic labware during typical laboratory procedures. Observed levels ranged from a nanogram to micrograms of NCA. NCAs were also detected in extracts from plastic food storage containers; levels ranged from 1 to 10 μg in two of the three brands tested. The electron ionization mass spectra for three sorbitol-based nuclear clarifying agents (1,3:2,4-bis-O-(benzylidene)sorbitol, 1,3:2,4-bis-O-(p-methylbenzylidene)sorbitol, 1,3:2,4-bis-O-(3,4-dimethylbenzylidene)sorbitol) are presented for the native and trimethylsilyl-derivatized compounds together with the collision-induced dissociation mass spectra; gas and liquid chromatographic data are also reported. These NCAs now join other well-known plasticizers such as phthalate esters and bisphenol A as common laboratory contaminants. While the potential toxicity of NCAs in mammalian systems is unknown, the current data provide scientists and consumers the opportunity to make more informed decisions regarding the use of polypropylene plastics. PMID:18533681
Nguyen, Ha T.; Pearce, Joshua M.; Harrap, Rob; Barber, Gerald
2012-01-01
A methodology is provided for the application of Light Detection and Ranging (LiDAR) to automated solar photovoltaic (PV) deployment analysis on the regional scale. Challenges in urban information extraction and management for solar PV deployment assessment are determined and quantitative solutions are offered. This paper provides the following contributions: (i) a methodology that is consistent with recommendations from existing literature advocating the integration of cross-disciplinary competences in remote sensing (RS), GIS, computer vision and urban environmental studies; (ii) a robust methodology that can work with low-resolution, non-comprehensive data and reconstruct vegetation and buildings separately, but concurrently; (iii) recommendations for future generations of software. A case study is presented as an example of the methodology. Experiences from the case study, such as the trade-off between time consumption and data quality, are discussed to highlight the need for connectivity between demographic information, electrical engineering schemes and GIS, as well as a typical fraction of solar-useful roofs extracted by the method. Finally, conclusions are developed to provide a final methodology to extract the most useful information from the lowest resolution and least comprehensive data to provide solar electric assessments over large areas, which can be adapted anywhere in the world. PMID:22666044
Quantitative Schlieren analysis applied to holograms of crystals grown on Spacelab 3
NASA Technical Reports Server (NTRS)
Brooks, Howard L.
1986-01-01
In order to extract additional information about crystals grown in the microgravity environment of Spacelab, a quantitative schlieren analysis technique was developed for use in a Holography Ground System of the Fluid Experiment System. Utilizing the Unidex position controller, it was possible to measure deviation angles produced by refractive index gradients of 0.5 milliradians. Additionally, refractive index gradient maps for any recorded time during the crystal growth were drawn and used to create solute concentration maps for the environment around the crystal. The technique was applied to flight holograms of Cell 204 of the Fluid Experiment System that were recorded during the Spacelab 3 mission on STS 51B. A triglycine sulfate crystal was grown under isothermal conditions in the cell and the data gathered with the quantitative schlieren analysis technique is consistent with a diffusion limited growth process.
[Lithology feature extraction of CASI hyperspectral data based on fractal signal algorithm].
Tang, Chao; Chen, Jian-Ping; Cui, Jing; Wen, Bo-Tao
2014-05-01
Hyperspectral data are characterized by the combination of image and spectral information and by large data volumes, so dimension reduction is the main research direction. Band selection and feature extraction are the primary methods used for this objective. In the present article, the authors tested methods for lithology feature extraction from hyperspectral data. Based on the self-similarity of hyperspectral data, the authors explored the application of a fractal algorithm to lithology feature extraction from CASI hyperspectral data. The "carpet method" was corrected and then applied to calculate the fractal value of every pixel in the hyperspectral data. The results show that fractal information highlights the exposed bedrock lithology better than the original hyperspectral data. The fractal signal and characteristic scale are influenced by the spectral curve shape, the initial scale selection and the iteration step. At present, research on the fractal signal of spectral curves is rare, implying the necessity of further quantitative analysis and investigation of its physical implications.
O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A
2010-03-01
Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which the information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of ¹⁸F-fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves are used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetics analyses resulting from these three AIFs are compared with those resulting from radially sampled AIFs. The proposed penalty-based AIF extraction method is found to achieve significant improvements over the fixed template and the segmentation methods. As well as achieving acceptable kinetic parameter accuracy, the quality of fit of the region of interest (ROI) time-course data based on the extracted AIF matches results based on arterially sampled AIFs. In comparison, significant deviation in the estimation of FDG flux and degradation in ROI data fit are found with the template and segmentation methods. The proposed AIF extraction method is recommended for practical use.
Zhang, Yu-Tian; Xiao, Mei-Feng; Deng, Kai-Wen; Yang, Yan-Tao; Zhou, Yi-Qun; Zhou, Jin; He, Fu-Yuan; Liu, Wen-Long
2018-06-01
Researching and formulating an efficient extraction system for Chinese herbal medicine poses a great challenge for quality management, which is why the transitivity of Q-markers in quantitative analysis of traditional Chinese medicine (TCM) was recently proposed by Prof. Liu. In order to improve the quality of extraction from raw medicinal materials for clinical preparations, a series of integrated mathematical models for the transitivity of Q-markers in quantitative analysis of TCM was established. Buyanghuanwu decoction (BYHWD) is a common TCM prescription used to prevent and treat ischemic heart and brain diseases. In this paper, we selected BYHWD as the experimental extraction subject to study the quantitative transitivity of TCM. Based on Fick's law and the Noyes-Whitney equation, novel kinetic models were established for the extraction of active components. The kinetic equations of the extraction models were fitted, and the inherent parameters of the raw material pieces and the Q-marker quantitative transfer coefficients were calculated; these were taken as indexes to evaluate the transitivity of Q-markers in quantitative analysis of the BYHWD extraction process. HPLC was applied to screen and analyze the potential Q-markers in the extraction process. Fick's law and the Noyes-Whitney equation were adopted to mathematically model the extraction process. Kinetic parameters were fitted and calculated with the Statistical Package for the Social Sciences (SPSS) 20.0 software. The transfer efficiency was described and evaluated along the potential Q-marker transfer trajectory via the transitivity availability AUC, the extraction ratio P, and the decomposition ratio D. The Q-markers were identified using AUC, P, and D. Astragaloside IV, laetrile, paeoniflorin, and ferulic acid were studied as potential Q-markers from BYHWD. The relevant technological parameters were represented by the mathematical models, which adequately illustrate the inherent properties of the raw material preparation and the effect on Q-marker transitivity in equilibrium processing. AUC, P, and D for the potential Q-markers AST-IV, laetrile, paeoniflorin, and FA were 289.9 mAu s, 46.24%, 22.35%; 1730 mAu s, 84.48%, 1.963%; 5600 mAu s, 70.22%, 0.4752%; and 7810 mAu s, 24.29%, 4.235%, respectively. The results showed that the suitable Q-markers in our study were laetrile and paeoniflorin, which exhibited acceptable traceability and transitivity in the TCM extraction process. These novel mathematical models might therefore be developed into a new standard to control TCM quality from raw medicinal materials to product manufacturing. Copyright © 2018 Elsevier GmbH. All rights reserved.
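As a minimal illustration of Noyes-Whitney-type extraction kinetics, the sketch below fits the first-order solution C(t) = Cs(1 - e^(-kt)) of dC/dt = k(Cs - C) to an invented extraction time course with scipy; the paper's full integrated models and the AUC, P, and D indices are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def noyes_whitney(t, c_s, k):
    """C(t) = Cs * (1 - exp(-k t)): solution of dC/dt = k (Cs - C), C(0) = 0."""
    return c_s * (1.0 - np.exp(-k * t))

# Illustrative extraction time course for one marker (min, mg/mL)
t = np.array([0, 5, 10, 20, 30, 45, 60, 90], float)
c = np.array([0.0, 0.52, 0.91, 1.42, 1.68, 1.86, 1.93, 1.98])

(c_s, k), _ = curve_fit(noyes_whitney, t, c, p0=[2.0, 0.05])
print(f"Cs = {c_s:.2f} mg/mL, k = {k:.3f} 1/min")
```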
A simple method for the quantitative determination of elemental sulfur on oxidized sulfide minerals is described. Extraction of elemental sulfur in perchloroethylene and subsequent analysis with high-performance liquid chromatography were used to ascertain the total elemental ...
Soil samples from the GenCorp Lawrence Brownfields site were analyzed with a commercial semi-quantitative enzyme-linked immunosorbent assay (ELISA) using a methanol shake extraction. Many of the soil samples were extremely oily, with total petroleum hydrocarbon levels up to 240...
Wang, Yaping; Nie, Jingxin; Yap, Pew-Thian; Li, Gang; Shi, Feng; Geng, Xiujuan; Guo, Lei; Shen, Dinggang
2014-01-01
Accurate and robust brain extraction is a critical step in most neuroimaging analysis pipelines. In particular, for the large-scale multi-site neuroimaging studies involving a significant number of subjects with diverse age and diagnostic groups, accurate and robust extraction of the brain automatically and consistently is highly desirable. In this paper, we introduce population-specific probability maps to guide the brain extraction of diverse subject groups, including both healthy and diseased adult human populations, both developing and aging human populations, as well as non-human primates. Specifically, the proposed method combines an atlas-based approach, for coarse skull-stripping, with a deformable-surface-based approach that is guided by local intensity information and population-specific prior information learned from a set of real brain images for more localized refinement. Comprehensive quantitative evaluations were performed on the diverse large-scale populations of ADNI dataset with over 800 subjects (55∼90 years of age, multi-site, various diagnosis groups), OASIS dataset with over 400 subjects (18∼96 years of age, wide age range, various diagnosis groups), and NIH pediatrics dataset with 150 subjects (5∼18 years of age, multi-site, wide age range as a complementary age group to the adult dataset). The results demonstrate that our method consistently yields the best overall results across almost the entire human life span, with only a single set of parameters. To demonstrate its capability to work on non-human primates, the proposed method is further evaluated using a rhesus macaque dataset with 20 subjects. Quantitative comparisons with popularly used state-of-the-art methods, including BET, Two-pass BET, BET-B, BSE, HWA, ROBEX and AFNI, demonstrate that the proposed method performs favorably with superior performance on all testing datasets, indicating its robustness and effectiveness. PMID:24489639
Zuliani, Tea; Lespes, Gaetane; Milacic, Radmila; Scancar, Janez
2010-03-15
The toxicity and bioaccumulation of organotin compounds (OTCs) led to the development of sensitive and selective analytical methods for their determination. In the past, much attention was devoted to the study of OTCs in biological samples, water and sediments, coming mostly from the marine environment. Little information about OTC pollution of terrestrial ecosystems is available. In order to optimise the extraction method for simultaneous determination of butyl-, phenyl- and octyltin compounds in sewage sludge, five different extractants (tetramethylammonium hydroxide, HCl in methanol, glacial acetic acid, a mixture of acetic acid and methanol (3:1), and a mixture of acetic acid, methanol and water (1:1:1)), the presence or absence of a complexing agent (tropolone), and the use of different modes of extraction (mechanical stirring, microwave and ultrasonic assisted extraction) were tested. Extracted OTCs were derivatised with sodium tetraethylborate and determined by gas chromatography coupled with a mass spectrometer. Quantitative extraction of butyl-, phenyl- and octyltin compounds was obtained by the use of glacial acetic acid as extractant and mechanical stirring for 16 h or sonication for 30 min. The limits of detection and quantification for the OTCs investigated in sewage sludge were in the ng Sn g⁻¹ range. Copyright (c) 2009 Elsevier B.V. All rights reserved.
Quantitative study of FORC diagrams in thermally corrected Stoner-Wohlfarth nanoparticles systems
NASA Astrophysics Data System (ADS)
De Biasi, E.; Curiale, J.; Zysler, R. D.
2016-12-01
The use of FORC diagrams is becoming increasingly popular among researchers devoted to magnetism and magnetic materials. However, a thorough interpretation of this kind of diagram, in order to obtain quantitative information, requires an appropriate model of the studied system. For that reason, most FORC studies are used for qualitative analysis. In magnetic systems, thermal fluctuations "blur" the signatures of the anisotropy, volume, and particle-interaction distributions; thermal effects in nanoparticle systems therefore conspire against a proper interpretation and analysis of these diagrams. Motivated by this fact, we have quantitatively studied the degree of accuracy of the information extracted from FORC diagrams for the special case of single-domain, thermally corrected Stoner-Wohlfarth (easy axes along the external field orientation) nanoparticle systems. In this work, the starting point is an analytical model that describes the behavior of a magnetic nanoparticle system as a function of field, anisotropy, temperature and measurement time. In order to study the quantitative degree of accuracy of our model, we built FORC diagrams for different archetypical cases of magnetic nanoparticles. Our results show that, from the quantitative information obtained from the diagrams and under the hypotheses of the proposed model, it is possible to recover the features of the original system with accuracy above 95%. This accuracy improves at low temperatures, and it is also possible to access the anisotropy distribution directly from the FORC coercive-field profile. Indeed, our simulations predict that the volume distribution plays a secondary role, with its mean value and deviation being the only important parameters. Therefore it is possible to obtain an accurate result for the inversion and interaction fields regardless of the features of the volume distribution.
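For reference, the quantity mapped in a FORC diagram is conventionally defined as the mixed second derivative of the magnetization with respect to the reversal and applied fields (a standard textbook definition, not specific to this study):

```latex
\rho(H_r, H) = -\frac{1}{2}\,\frac{\partial^{2} M(H_r, H)}{\partial H_r\,\partial H},
\qquad
H_c = \frac{H - H_r}{2}, \quad H_u = \frac{H + H_r}{2},
```

where H_r is the reversal field, H the applied field, and (H_c, H_u) the coercive- and interaction-field coordinates in which the diagrams are usually plotted.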
Single-Cell Quantitative PCR: Advances and Potential in Cancer Diagnostics.
Ok, Chi Young; Singh, Rajesh R; Salim, Alaa A
2016-01-01
Tissues are heterogeneous in their components. If the cells of interest are a minor population of the collected tissue, it is difficult to obtain genetic or genomic information on that cell population with conventional genomic DNA extraction from the whole tissue. Single-cell DNA analysis is important in studying the genetics of cell clonality, genetic anticipation, and single-cell DNA polymorphisms. Single-cell PCR using the Single Cell Ampligrid/GeXP platform is described in this chapter.
NASA Space Engineering Research Center for Utilization of Local Planetary Resources
NASA Technical Reports Server (NTRS)
Ramohalli, Kumar; Lewis, John S.
1989-01-01
Progress toward the goal of exploiting extraterrestrial resources for space missions is documented. Some areas of research included are as follows: Propellant and propulsion optimization; Automation of propellant processing with quantitative simulation; Ore reduction through chlorination and free radical production; Characterization of lunar ilmenite and its simulants; Carbothermal reduction of ilmenite with special reference to microgravity chemical reactor design; Gaseous carbonyl extraction and purification of ferrous metals; Overall energy management; and Information management for space processing.
NASA Astrophysics Data System (ADS)
Edmiston, John Kearney
This work explores the field of continuum plasticity from two fronts. On the theory side, we establish a complete specification of a phenomenological theory of plasticity for single crystals. The model serves as an alternative to the popular crystal plasticity formulation. Such a model has been previously proposed in the literature; the new contribution made here is the constitutive framework and resulting simulations. We calibrate the model to available data and use a simple numerical method to explore resulting predictions in plane strain boundary value problems. Results show promise for further investigation of the plasticity model. Conveniently, this theory comes with a corresponding experimental tool in X-ray diffraction. Recent advances in hardware technology at synchrotron sources have led to an increased use of the technique for studies of plasticity in the bulk of materials. The method has been successful in qualitative observations of material behavior, but its use in quantitative studies seeking to extract material properties is open for investigation. Therefore in the second component of the thesis several contributions are made to synchrotron X-ray diffraction experiments, in terms of method development as well as the quantitative reporting of constitutive parameters. In the area of method development, analytical tools are developed to determine the available precision of this type of experiment—a crucial aspect to determine if the method is to be used for quantitative studies. We also extract kinematic information relating to intragranular inhomogeneity which is not accessible with traditional methods of data analysis. In the area of constitutive parameter identification, we use the method to extract parameters corresponding to the proposed formulation of plasticity for a titanium alloy (HCP) which is continuously sampled by X-ray diffraction during uniaxial extension. These results and the lessons learned from the efforts constitute early reporting of the quantitative profitability of undertaking such a line of experimentation for the study of plastic deformation processes.
Algorithm of pulmonary emphysema extraction using thoracic 3D CT images
NASA Astrophysics Data System (ADS)
Saita, Shinsuke; Kubo, Mitsuru; Kawata, Yoshiki; Niki, Noboru; Nakano, Yasutaka; Ohmatsu, Hironobu; Tominaga, Keigo; Eguchi, Kenji; Moriyama, Noriyuki
2007-03-01
Recently, due to aging and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desired. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to thoracic 3-D CT images and then to follow-up 3-D CT images, we demonstrate its potential effectiveness in assisting radiologists and physicians to quantitatively evaluate the distribution of emphysematous lesions and their evolution over time.
Algorithm of pulmonary emphysema extraction using low dose thoracic 3D CT images
NASA Astrophysics Data System (ADS)
Saita, S.; Kubo, M.; Kawata, Y.; Niki, N.; Nakano, Y.; Omatsu, H.; Tominaga, K.; Eguchi, K.; Moriyama, N.
2006-03-01
Recently, due to aging and smoking, the number of emphysema patients has been increasing. Alveoli destroyed by emphysema cannot be restored, so early detection of emphysema is desired. We describe a quantitative algorithm for extracting emphysematous lesions and quantitatively evaluating their distribution patterns using low dose thoracic 3-D CT images. The algorithm identifies lung anatomies and extracts low attenuation areas (LAA) as emphysematous lesion candidates. Applying the algorithm to 100 thoracic 3-D CT images and then to follow-up 3-D CT images, we demonstrate its potential effectiveness in assisting radiologists and physicians to quantitatively evaluate the distribution of emphysematous lesions and their evolution over time.
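A minimal sketch of the low-attenuation-area step described in the two abstracts above: voxels inside a lung mask that fall below a CT threshold are flagged as candidate emphysematous lesions and summarized as an LAA percentage. The -950 HU cutoff is a commonly used convention assumed here, not a value taken from these papers, and the lung mask is assumed to be given.

```python
import numpy as np

def laa_percentage(ct_hu, lung_mask, threshold_hu=-950.0):
    """ct_hu: 3-D array of Hounsfield units; lung_mask: boolean array of the same shape."""
    lung_voxels = ct_hu[lung_mask]
    laa_voxels = lung_voxels < threshold_hu
    return 100.0 * laa_voxels.sum() / lung_voxels.size

# synthetic example: a noisy "lung" volume containing one low-attenuation pocket
rng = np.random.default_rng(0)
ct = rng.normal(-850.0, 40.0, size=(64, 64, 64))
ct[20:30, 20:30, 20:30] = -980.0            # emphysema-like region
mask = np.ones(ct.shape, dtype=bool)
print(f"LAA%: {laa_percentage(ct, mask):.1f}")
```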
Visualizing dispersive features in 2D image via minimum gradient method
He, Yu; Wang, Yan; Shen, Zhi-Xun
2017-07-24
Here, we developed a minimum-gradient-based method to track ridge features in a 2D image plot, which is a typical data representation in many momentum-resolved spectroscopy experiments. Through both analytic formulation and numerical simulation, we compare this new method with existing DC (distribution curve) based and higher order derivative based analyses. We find that the new method has good noise resilience and enhanced contrast, especially for weak-intensity features, and meanwhile preserves the quantitative local maxima information from the raw image. An algorithm is proposed to extract 1D ridge dispersion from the 2D image plot, whose quantitative application to angle-resolved photoemission spectroscopy measurements on high temperature superconductors is demonstrated.
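A minimal sketch of a minimum-gradient ridge tracker in the spirit described above: for each momentum column of a 2-D intensity map, the ridge is placed where the gradient magnitude is smallest in a window around the intensity maximum. The smoothing scale and window size are illustrative assumptions, not the published parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_positions(image, energies, smooth_sigma=1.0, window=5):
    """image: 2-D array (energy rows x momentum columns); returns one ridge energy per column."""
    img = gaussian_filter(image, smooth_sigma)
    g_energy, g_momentum = np.gradient(img)          # derivatives along the two axes
    grad_mag = np.hypot(g_energy, g_momentum)
    ridge = np.empty(img.shape[1])
    for k in range(img.shape[1]):                    # loop over momentum columns
        peak = np.argmax(img[:, k])                  # start near the intensity maximum
        lo, hi = max(0, peak - window), min(img.shape[0], peak + window + 1)
        ridge[k] = energies[lo + np.argmin(grad_mag[lo:hi, k])]
    return ridge
```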
Triebl, Alexander; Trötzmüller, Martin; Eberl, Anita; Hanel, Pia; Hartler, Jürgen; Köfeler, Harald C
2014-06-20
A method for a highly selective and sensitive identification and quantitation of lysophosphatidic acid (LPA) and phosphatidic acid (PA) molecular species was developed using hydrophilic interaction liquid chromatography (HILIC) followed by negative-ion electrospray ionization high resolution mass spectrometry. Different extraction methods for the polar LPA and PA species were compared, and a modified Bligh & Dyer extraction by addition of 0.1 M hydrochloric acid resulted in a ≈1.2-fold increase of recovery for the 7 PA and a more than 15-fold increase for the 6 LPA molecular species of a commercially available natural mix compared to conventional Bligh & Dyer extraction. This modified Bligh & Dyer extraction did not show any artifacts resulting from hydrolysis of naturally abundant phospholipids. The developed HILIC method is able to separate all PA and LPA species from major polar membrane lipid classes which might have suppressive effects on the minor abundant lipid classes of interest. The elemental compositions of intact lipid species are provided by the high mass resolution of 100,000 and high mass accuracy below 3 ppm of the Orbitrap instrument. Additionally, tandem mass spectra were generated in a parallel data dependent acquisition mode in the linear ion trap to provide structural information at the molecular level. Limits of quantitation were identified at 45 fmol on column and the dynamic range reaches 20 pmol on column, covering the range of natural abundance well. By applying the developed method to mouse brain it can be shown that phosphatidic acid contains less unsaturated fatty acids, with PA 34:1 and PA 36:1 as the major species. In contrast, for LPA species a high content of polyunsaturated fatty acids (LPA 20:4 and LPA 22:6) was quantified. Copyright © 2014 Elsevier B.V. All rights reserved.
Kadoum, A M
1968-07-01
A simple, aqueous acetonitrile partition cleanup method for analyses of some common organophosphorus insecticide residues is described. The procedure described is for the cleanup and quantitative recovery of parathion, methyl parathion, diazinon, malathion and thimet from different extracts. Those insecticides in the purified extracts of ground water, grain, soil, plant and animal tissues can be detected quantitatively by gas chromatography with an electron-capture detector at 0.01 ppm. Cleanup is satisfactory for paper and thin-layer chromatography for further identification of individual insecticides in the extracts.
QUANTITATIVE RADIO-CHEMICAL ANALYSIS-SOLVENT EXTRACTION OF MOLYBDENUM-99
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wish, L.
1961-09-12
A method was developed for the rapid quantitative separation of Mo-99 from fission product mixtures. It is based on the extraction of Mo into a solution of alpha-benzoin oxime in chloroform. The main contaminants are Zr, Nb, and I. The first two are eliminated by complexing with fluoride and the third by volatilization or solvent extraction. About 5% of the Tc-99 daughter is extracted with its parent, and it is necessary to wait 48 hrs for equilibrium. Analyses of fission product mixtures by this method and by a standard radiochemical gravimetric procedure showed agreement within 1 to 2%. (auth)
NASA Astrophysics Data System (ADS)
Li, Xuan; Liu, Zhiping; Jiang, Xiaoli; Lodewijks, Gabrol
2018-01-01
Eddy current pulsed thermography (ECPT) is well established for non-destructive testing of electrically conductive materials, featuring the advantages of contactless operation, intuitive detection and efficient heating. The concept of divergence characterization of the damage rate of carbon fibre-reinforced plastic (CFRP)-steel structures can be extended to ECPT thermal pattern characterization. It was found in this study that the use of ECPT technology on CFRP-steel structures generated a sizeable amount of valuable information for comprehensive material diagnostics. The relationship between divergence and transient thermal patterns can be identified and analysed by deploying mathematical models to analyse the information about fibre texture-like orientations, gaps and undulations in these multi-layered materials. The developed algorithm enabled the removal of information about fibre texture and the extraction of damage features. The model of the CFRP-glue-steel structures with damage was established using COMSOL Multiphysics® software, and quantitative non-destructive damage evaluation from the ECPT image areas was derived. The results of this proposed method illustrate that damaged areas are highly affected by available information about fibre texture. This proposed work can be applied for detection of impact-induced damage and quantitative evaluation of CFRP structures.
NASA Astrophysics Data System (ADS)
Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo
2018-04-01
For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of vasculature in the iris are very important. Recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, the development of advanced vascular quantification algorithms is still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm by adopting a three-dimensional (3-D) Hessian matrix and applied it to processing iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo 3-D vascular structures of a rat iris with a label-free imaging method and also accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in the 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
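A minimal sketch of 3-D Hessian-based vessel enhancement in the spirit described above: eigenvalues of a Gaussian-scale Hessian highlight tubular structures in a volume. The simple two-eigenvalue response and the single scale are generic assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np
from scipy import ndimage

def hessian_eigenvalues(volume, sigma=1.5):
    smoothed = ndimage.gaussian_filter(volume.astype(float), sigma)
    grads = np.gradient(smoothed)
    H = np.empty(volume.shape + (3, 3))
    for i in range(3):
        gi = np.gradient(grads[i])               # second derivatives
        for j in range(3):
            H[..., i, j] = gi[j]
    return np.linalg.eigvalsh(H)                 # per-voxel eigenvalues, ascending

def tubular_response(volume, sigma=1.5):
    lam = hessian_eigenvalues(volume, sigma)
    lam1, lam2 = lam[..., 0], lam[..., 1]        # two most negative eigenvalues
    # bright tubes on a dark background: two strongly negative eigenvalues, one near zero
    response = np.where((lam1 < 0) & (lam2 < 0), np.abs(lam1 * lam2), 0.0)
    return response / (response.max() + 1e-12)
```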
Huan, Tao; Li, Liang
2015-07-21
Generating precise and accurate quantitative information on metabolomic changes in comparative samples is important for metabolomics research, where technical variations in the metabolomic data should be minimized in order to reveal biological changes. We report a method and software program, IsoMS-Quant, for extracting quantitative information from a metabolomic data set generated by chemical isotope labeling (CIL) liquid chromatography mass spectrometry (LC-MS). Unlike previous work that relied on the mass spectral peak ratio of the highest-intensity peak pair to measure the relative quantity difference of a differentially labeled metabolite, this new program reconstructs the chromatographic peaks of the light- and heavy-labeled metabolite pair and then calculates the ratio of their peak areas to represent the relative concentration difference in two comparative samples. Using chromatographic peaks to perform relative quantification is shown to be more precise and accurate. IsoMS-Quant is integrated with IsoMS for picking peak pairs and Zero-fill for retrieving missing peak pairs in the initial peak pairs table generated by IsoMS to form a complete tool for processing CIL LC-MS data. This program can be freely downloaded from the www.MyCompoundID.org web site for noncommercial use.
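A minimal sketch of the peak-area approach described above: integrate the extracted ion chromatograms (EICs) of a light/heavy labeled pair and report the area ratio as the relative concentration. The synthetic Gaussian peaks and fixed integration window are illustrative assumptions, not IsoMS-Quant's actual peak reconstruction.

```python
import numpy as np

def peak_area(retention_time, intensity, t_start, t_end):
    sel = (retention_time >= t_start) & (retention_time <= t_end)
    return np.trapz(intensity[sel], retention_time[sel])

def relative_quantity(rt, light_eic, heavy_eic, t_start, t_end):
    return peak_area(rt, light_eic, t_start, t_end) / peak_area(rt, heavy_eic, t_start, t_end)

# synthetic EICs for a differentially labeled metabolite pair
rt = np.linspace(0.0, 10.0, 1000)
light = 2.0 * np.exp(-((rt - 5.0) / 0.2) ** 2)
heavy = 1.0 * np.exp(-((rt - 5.0) / 0.2) ** 2)
print(f"light/heavy ratio: {relative_quantity(rt, light, heavy, 4.0, 6.0):.2f}")  # ~2.0
```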
DOE Office of Scientific and Technical Information (OSTI.GOV)
McKinney, Adriana L.; Varga, Tamas
Branching structures such as lungs, blood vessels and plant roots play a critical role in life. Growth, structure, and function of these branching structures have an immense effect on our lives. Therefore, quantitative size information on such structures in their native environment is invaluable for studying their growth and the effect of the environment on them. X-ray computed tomography (XCT) has been an effective tool for in situ imaging and analysis of branching structures. We developed a costless tool that approximates the surface and volume of branching structures. Our methodology of noninvasive imaging, segmentation and extraction of quantitative information is demonstrated through the analysis of a plant root in its soil medium from 3D tomography data. XCT data collected on a grass specimen was used to visualize its root structure. A suite of open-source software was employed to segment the root from the soil and determine its isosurface, which was used to calculate its volume and surface area. This methodology of processing 3D data is applicable to other branching structures even when the structure of interest is of similar x-ray attenuation to its environment and difficulties arise with sample segmentation.
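A minimal sketch of how an isosurface-based volume and surface estimate could be obtained from an already-segmented (binary) root mask, using scikit-image marching cubes. The voxel spacing and the cylindrical toy "root" are illustrative assumptions; the paper's own open-source toolchain is not reproduced here.

```python
import numpy as np
from skimage import measure

def root_surface_and_volume(binary_mask, voxel_size=(0.05, 0.05, 0.05)):  # mm per voxel
    verts, faces, _, _ = measure.marching_cubes(binary_mask.astype(float),
                                                level=0.5, spacing=voxel_size)
    surface_area = measure.mesh_surface_area(verts, faces)       # mm^2
    volume = binary_mask.sum() * np.prod(voxel_size)              # mm^3
    return surface_area, volume

# toy example: a cylindrical "root" 40 voxels long with a 5-voxel radius
z, y, x = np.mgrid[0:60, 0:32, 0:32]
mask = ((y - 16) ** 2 + (x - 16) ** 2 <= 25) & (z >= 10) & (z < 50)
area_mm2, vol_mm3 = root_surface_and_volume(mask)
print(f"surface {area_mm2:.1f} mm^2, volume {vol_mm3:.2f} mm^3")
```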
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are used widely in clinical trials for tracking and evaluation of medical interventions. Previously, we have presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features such as stroke lesion characteristics from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
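A minimal sketch of how such a regression of a clinical outcome on imaging biomarkers might look in Python. The column names are hypothetical, and statsmodels' MixedLM fits a linear mixed model, used here only as a stand-in for the GLMM module described in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_outcome_model(df: pd.DataFrame):
    """df columns (assumed): outcome, lesion_volume, vb_ratio, age, subject_group."""
    model = smf.mixedlm("outcome ~ lesion_volume + vb_ratio + age",
                        data=df, groups=df["subject_group"])
    result = model.fit()
    return result.summary()
```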
Qi, Xiubin; Crooke, Emma; Ross, Andrew; Bastow, Trevor P; Stalvies, Charlotte
2011-09-21
This paper presents a system and method developed to identify a source oil's characteristic properties by testing the oil's dissolved components in water. Through close examination of the oil dissolution process in water, we hypothesise that when oil is in contact with water, the resulting oil-water extract, a complex hydrocarbon mixture, carries the signature property information of the parent oil. If the dominating differences in composition between such extracts of different oils can be identified, this information could guide the selection of various sensors capable of capturing such chemical variations. When used as an array, such a sensor system can be used to determine parent oil information from the oil-water extract. To test this hypothesis, water extracts of 22 oils were prepared and selected dominant hydrocarbons analyzed with Gas Chromatography-Mass Spectrometry (GC-MS); the subsequent Principal Component Analysis (PCA) indicates that the major difference between the extract solutions is the relative concentration between the volatile mono-aromatics and fluorescent polyaromatics. An integrated sensor array system composed of 3 volatile hydrocarbon sensors and 2 polyaromatic hydrocarbon sensors was built accordingly to capture the major and subtle differences of these extracts. It was tested by exposure to a total of 110 water extract solutions diluted from the 22 extracts. The sensor response data collected from the testing were processed with two multivariate analysis tools to reveal information retained in the response patterns of the arrayed sensors: by conducting PCA, we were able to demonstrate the ability to qualitatively identify and distinguish different oil samples from their sensor array response patterns. When a supervised method, Linear Discriminant Analysis (LDA), was applied, even quantitative classification could be achieved: the multivariate model generated from the LDA achieved 89.7% successful classification of the oil sample type. By grouping the samples based on the level of viscosity and density we were able to reveal the correlation between the oil extracts' sensor array responses and their original oils' feature properties. The equipment and method developed in this study have promising potential to be readily applied in field studies and marine surveys for oil exploration or oil spill monitoring.
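A minimal sketch of the pattern-recognition step described above: PCA for unsupervised inspection of sensor-array responses, followed by LDA for supervised classification of the parent oil type. The array shapes, random data and labels are illustrative assumptions, not the study's measurements.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(110, 5))          # 110 extract solutions x 5 sensor responses
y = rng.integers(0, 3, size=110)       # hypothetical oil-type labels

scores = PCA(n_components=2).fit_transform(X)        # qualitative separation in 2-D
lda = LinearDiscriminantAnalysis()
accuracy = cross_val_score(lda, X, y, cv=5).mean()   # quantitative classification rate
print(f"cross-validated LDA accuracy: {accuracy:.2f}")
```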
VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations
Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Mortimer, Peter S.
2017-01-01
BACKGROUND. Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders in order to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. METHODS. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy–based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy lymphedematous and lymphangiomatous skin. RESULTS. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). CONCLUSION. VIPAR is a volume-based tissue reconstruction data extraction and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. FUNDING. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003. PMID:28814672
VIPAR, a quantitative approach to 3D histopathology applied to lymphatic malformations.
Hägerling, René; Drees, Dominik; Scherzinger, Aaron; Dierkes, Cathrin; Martin-Almedina, Silvia; Butz, Stefan; Gordon, Kristiana; Schäfers, Michael; Hinrichs, Klaus; Ostergaard, Pia; Vestweber, Dietmar; Goerge, Tobias; Mansour, Sahar; Jiang, Xiaoyi; Mortimer, Peter S; Kiefer, Friedemann
2017-08-17
Lack of investigatory and diagnostic tools has been a major contributing factor to the failure to mechanistically understand lymphedema and other lymphatic disorders in order to develop effective drug and surgical therapies. One difficulty has been understanding the true changes in lymph vessel pathology from standard 2D tissue sections. VIPAR (volume information-based histopathological analysis by 3D reconstruction and data extraction), a light-sheet microscopy-based approach for the analysis of tissue biopsies, is based on digital reconstruction and visualization of microscopic image stacks. VIPAR allows semiautomated segmentation of the vasculature and subsequent nonbiased extraction of characteristic vessel shape and connectivity parameters. We applied VIPAR to analyze biopsies from healthy lymphedematous and lymphangiomatous skin. Digital 3D reconstruction provided a directly visually interpretable, comprehensive representation of the lymphatic and blood vessels in the analyzed tissue volumes. The most conspicuous features were disrupted lymphatic vessels in lymphedematous skin and a hyperplasia (4.36-fold lymphatic vessel volume increase) in the lymphangiomatous skin. Both abnormalities were detected by the connectivity analysis based on extracted vessel shape and structure data. The quantitative evaluation of extracted data revealed a significant reduction of lymphatic segment length (51.3% and 54.2%) and straightness (89.2% and 83.7%) for lymphedematous and lymphangiomatous skin, respectively. Blood vessel length was significantly increased in the lymphangiomatous sample (239.3%). VIPAR is a volume-based tissue reconstruction data extraction and analysis approach that successfully distinguished healthy from lymphedematous and lymphangiomatous skin. Its application is not limited to the vascular systems or skin. Max Planck Society, DFG (SFB 656), and Cells-in-Motion Cluster of Excellence EXC 1003.
Medical knowledge discovery and management.
Prior, Fred
2009-05-01
Although the volume of medical information is growing rapidly, the ability to rapidly convert this data into "actionable insights" and new medical knowledge is lagging far behind. The first step in the knowledge discovery process is data management and integration, which logically can be accomplished through the application of data warehouse technologies. A key insight that arises from efforts in biosurveillance and the global scope of military medicine is that information must be integrated over both time (longitudinal health records) and space (spatial localization of health-related events). Once data are compiled and integrated it is essential to encode the semantics and relationships among data elements through the use of ontologies and semantic web technologies to convert data into knowledge. Medical images form a special class of health-related information. Traditionally knowledge has been extracted from images by human observation and encoded via controlled terminologies. This approach is rapidly being replaced by quantitative analyses that more reliably support knowledge extraction. The goals of knowledge discovery are the improvement of both the timeliness and accuracy of medical decision making and the identification of new procedures and therapies.
NASA Astrophysics Data System (ADS)
Mooser, Matthias; Burri, Christian; Stoller, Markus; Luggen, David; Peyer, Michael; Arnold, Patrik; Meier, Christoph; Považay, Boris
2017-07-01
Ocular optical coherence tomography at the wavelength ranges of 850 and 1060 nm has been integrated with a confocal scanning laser ophthalmoscope eye-tracker as a clinical commercial-class system. Collinear optics enables an exact overlap of the different channels to produce precisely overlapping depth-scans for evaluating the similarities and differences between the wavelengths to extract additional physiologic information. A reliable segmentation algorithm utilizing graph cuts has been implemented and applied to automatically extract retinal and choroidal shape in cross-sections and volumes. The device has been tested in normal subjects and in pathologies, including a cross-sectional and longitudinal study of myopia progression and control with a duplicate instrument in Asian children.
Basalo, Carlos; Mohn, Tobias; Hamburger, Matthias
2006-10-01
The extraction methods in selected monographs of the European and the Swiss Pharmacopoeia were compared to pressurized liquid extraction (PLE) with respect to the yield of constituents to be dosed in the quantitative assay for the respective herbal drugs. The study included five drugs, Belladonnae folium, Colae semen, Boldo folium, Tanaceti herba and Agni casti fructus. They were selected to cover different classes of compounds to be analyzed and different extraction methods to be used according to the monographs. Extraction protocols for PLE were optimized by varying the solvents and number of extraction cycles. In PLE, yields > 97 % of extractable analytes were typically achieved with two extraction cycles. For alkaloid-containing drugs, the addition of ammonia prior to extraction significantly increased the yield and reduced the number of extraction cycles required for exhaustive extraction. PLE was in all cases superior to the extraction protocol of the pharmacopoeia monographs (taken as 100 %), with differences ranging from 108 % in case of parthenolide in Tanaceti herba to 343 % in case of alkaloids in Boldo folium.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhandari, Deepak; Kertesz, Vilmos; Van Berkel, Gary J
RATIONALE: Ascorbic acid (AA) and folic acid (FA) are water-soluble vitamins and are usually fortified in food and dietary supplements. For the safety of human health, proper intake of these vitamins is recommended. Improvement in the analysis time required for the quantitative determination of these vitamins in food and nutritional formulations is desired. METHODS: A simple and fast (~5 min) in-tube sample preparation was performed, independently for FA and AA, by mixing extraction solvent with a powdered sample aliquot followed by agitation, centrifugation, and filtration to recover an extract for analysis. Quantitative detection was achieved by flow-injection (1 µL injection volume) electrospray ionization tandem mass spectrometry (ESI-MS/MS) in negative ion mode using the method of standard addition. RESULTS: The method of standard addition was employed for the quantitative estimation of each vitamin in a sample extract. At least 2 spiked and 1 non-spiked sample extract were injected in triplicate for each quantitative analysis. Given an injection-to-injection interval of approximately 2 min, about 18 min was required to complete the quantitative estimation of each vitamin. The concentration values obtained for the respective vitamins in the standard reference material (SRM) 3280 using this approach were within the statistical range of the certified values provided in the NIST Certificate of Analysis. The estimated limits of detection of FA and AA were 13 and 5.9 ng/g, respectively. CONCLUSIONS: Flow-injection ESI-MS/MS was successfully applied for the rapid quantitation of FA and AA in SRM 3280 multivitamin/multielement tablets.
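A minimal sketch of the standard-addition calculation referred to above: the instrument response is regressed against the spiked amount, and the x-intercept of the fit gives the native analyte concentration in the unspiked extract. All numbers are illustrative, not values from the study.

```python
import numpy as np

added = np.array([0.0, 50.0, 100.0])        # ng/g spiked into the extract (0 = non-spiked)
response = np.array([120.0, 245.0, 372.0])  # measured ion intensities (arbitrary units)

slope, intercept = np.polyfit(added, response, 1)
native_concentration = intercept / slope    # ng/g present in the original sample
print(f"estimated concentration: {native_concentration:.1f} ng/g")
```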
Observation of the immune response of cells and tissue through multimodal label-free microscopy
NASA Astrophysics Data System (ADS)
Pavillon, Nicolas; Smith, Nicholas I.
2017-02-01
We present applications of a label-free approach to assess the immune response based on the combination of interferometric microscopy and Raman spectroscopy, which makes it possible to simultaneously acquire morphological and molecular information of live cells. We employ this approach to derive statistical models for predicting the activation state of macrophage cells based both on morphological parameters extracted from the high-throughput full-field quantitative phase imaging, and on the molecular content information acquired through Raman spectroscopy. We also employ a system for 3D imaging based on coherence gating, enabling specific targeting of the Raman channel to structures of interest within tissue.
Singh, Rashmi; Sharma, Shatruhan; Sharma, Veena
2015-07-01
To compare and elucidate the antioxidant efficacy of ethanolic and hydroethanolic extracts of Indigofera tinctoria Linn. (Fabaceae family). Various in-vitro antioxidant assays and free radical-scavenging assays were done. Quantitative measurements of various phytoconstituents, reductive abilities and chelating potential were carried out along with standard compounds. Half inhibitory concentration (IC50) values for ethanol and hydroethanol extracts were analyzed and compared with respective standards. Hydroethanolic extracts showed considerably more potent antioxidant activity in comparison to ethanol extracts. Hydroethanolic extracts had lower IC50 values than ethanol extracts in the case of DPPH, metal chelation and hydroxyl radical-scavenging capacity (829, 659 and 26.7 μg/mL) but had slightly higher values than ethanol in case of SO2- and NO2-scavenging activity (P<0.001 vs standard). Quantitative measurements also showed that the abundance of phenolic and flavonoid bioactive phytoconstituents were significantly (P<0.001) greater in hydroethanol extracts (212.920 and 149.770 mg GAE and rutin/g of plant extract respectively) than in ethanol extracts (211.691 and 132.603 mg GAE and rutin/g of plant extract respectively). Karl Pearson's correlation analysis (r2) between various antioxidant parameters and bioactive components also associated the antioxidant potential of I. tinctoria with various phytoconstituents, especially phenolics, flavonoids, saponins and tannins. This study may be helpful to draw the attention of researchers towards the hydroethanol extracts of I. tinctoria, which has a high yield, and great prospects in herbal industries to produce inexpensive and powerful herbal products.
Sun, Shihao; Wang, Hui; Xie, Jianping; Su, Yue
2016-01-01
Jujube extract is commonly used as a food additive and flavoring. The sensory properties of the extract, especially sweetness, are a critical factor determining the product quality and therefore affecting consumer acceptability. Small molecular carbohydrates make a major contribution to the sweetness of jujube extract, and their types and contents in the extract have a direct influence on the quality of the product. Thus, an appropriate qualitative and quantitative method for determination of the carbohydrates is vitally important for quality control of the product. High performance liquid chromatography-evaporative light scattering detection (HPLC-ELSD), liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS), and gas chromatography-mass spectrometry (GC-MS) methods have been developed and applied to determining small molecular carbohydrates in jujube extract. Eight sugars and alditols were identified from the extract, including rhamnose, xylitol, arabitol, fructose, glucose, inositol, sucrose, and maltose. Comparisons were carried out to investigate the performance of the methods. Although the methods were found to perform satisfactorily, only three sugars (fructose, glucose and inositol) could be detected by all these methods. Meanwhile, similar quantitative results for the three sugars can be obtained by the methods. Eight sugars and alditols in the jujube extract were determined by HPLC-ELSD, LC-ESI-MS/MS and GC-MS, respectively. The HPLC-ELSD method and the LC-ESI-MS/MS method, with good precision and accuracy, were suitable for quantitative analysis of carbohydrates in jujube extract; although the performance of the GC-MS method for quantitative analysis was inferior to the other methods, it has a wider scope in qualitative analysis. A multi-analysis technique should be adopted in order to obtain complete information about the carbohydrate constituents in jujube extract, and the methods should be employed according to the purpose of analysis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, C.; et al.
The single-phase liquid argon time projection chamber (LArTPC) provides a large amount of detailed information in the form of fine-grained drifted ionization charge from particle traces. To fully utilize this information, the deposited charge must be accurately extracted from the raw digitized waveforms via a robust signal processing chain. Enabled by the ultra-low noise levels associated with cryogenic electronics in the MicroBooNE detector, the precise extraction of ionization charge from the induction wire planes in a single-phase LArTPC is qualitatively demonstrated on MicroBooNE data with event display images, and quantitatively demonstrated via waveform-level and track-level metrics. Improved performance of induction plane calorimetry is demonstrated through the agreement of extracted ionization charge measurements across different wire planes for various event topologies. In addition to the comprehensive waveform-level comparison of data and simulation, a calibration of the cryogenic electronics response is presented and solutions to various MicroBooNE-specific TPC issues are discussed. This work presents an important improvement in LArTPC signal processing, the foundation of reconstruction and therefore physics analyses in MicroBooNE.
Region of interest extraction based on multiscale visual saliency analysis for remote sensing images
NASA Astrophysics Data System (ADS)
Zhang, Yinggang; Zhang, Libao; Yu, Xianchuan
2015-01-01
Region of interest (ROI) extraction is an important component of remote sensing image processing. However, traditional ROI extraction methods are usually prior knowledge-based and depend on classification, segmentation, and a global searching solution, which are time-consuming and computationally complex. We propose a more efficient ROI extraction model for remote sensing images based on multiscale visual saliency analysis (MVS), implemented in the CIE L*a*b* color space, which is similar to the visual perception of the human eye. We first extract the intensity, orientation, and color features of the image using different methods: the visual attention mechanism is used to extract the intensity feature using a difference-of-Gaussian template; the integer wavelet transform is used to extract the orientation feature; and color information content analysis is used to obtain the color feature. Then, a new feature-competition method is proposed that addresses the different contributions of each feature map to calculate the weight of each feature image for combining them into the final saliency map. Qualitative and quantitative experimental results of the MVS model as compared with those of other models show that it is more effective and provides more accurate ROI extraction results with fewer holes inside the ROI.
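A minimal sketch of the feature-combination idea described above: per-feature saliency maps are normalized and fused with weights derived from each map's information content. The difference-of-Gaussian intensity map and the entropy-based weighting below are generic stand-ins for the paper's specific feature extractors and feature-competition scheme.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dog_saliency(gray, sigma_small=2.0, sigma_large=8.0):
    dog = np.abs(gaussian_filter(gray, sigma_small) - gaussian_filter(gray, sigma_large))
    return dog / (dog.max() + 1e-12)

def map_information(saliency_map, bins=64):
    hist, _ = np.histogram(saliency_map, bins=bins, range=(0.0, 1.0))
    p = hist / (hist.sum() + 1e-12)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()            # Shannon entropy of the map's values

def fuse(feature_maps):
    weights = np.array([map_information(m) for m in feature_maps])
    weights = weights / weights.sum()
    return sum(w * m for w, m in zip(weights, feature_maps))
```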
Ahberg, Christian D.; Manz, Andreas; Neuzil, Pavel
2015-01-01
Since its invention in 1985, the polymerase chain reaction (PCR) has become a well-established method for amplification and detection of segments of double-stranded DNA. Incorporation of fluorogenic probes or DNA-intercalating dyes (such as SYBR Green) into the PCR mixture allowed real-time reaction monitoring and extraction of quantitative information (qPCR). Probes with different excitation spectra enable multiplex qPCR of several DNA segments using multi-channel optical detection systems. Here we show multiplex qPCR using an economical EvaGreen-based system with single optical channel detection. Previously reported non-quantitative multiplex real-time PCR techniques based on intercalating dyes were conducted once the PCR was completed, by performing melting curve analysis (MCA). The technique presented in this paper is both qualitative and quantitative, as it provides information about the presence of multiple DNA strands as well as the number of starting copies in the tested sample. Besides providing an important internal control, multiplex qPCR also allows detection of the concentrations of more than one DNA strand within the same sample. Detection of the avian influenza virus H7N9 by PCR is a well-established method. Multiplex qPCR greatly enhances its specificity as it is capable of distinguishing both haemagglutinin (HA) and neuraminidase (NA) genes as well as their ratio. PMID:26088868
Sullards, M. Cameron; Liu, Ying; Chen, Yanfeng; Merrill, Alfred H.
2011-01-01
Sphingolipids are a highly diverse category of molecules that serve not only as components of biological structures but also as regulators of numerous cell functions. Because so many of the structural features of sphingolipids give rise to their biological activity, there is a need for comprehensive or “sphingolipidomic” methods for identification and quantitation of as many individual subspecies as possible. This review defines sphingolipids as a class, briefly discusses classical methods for their analysis, and focuses primarily on liquid chromatography tandem mass spectrometry (LC-MS/MS) and tissue imaging mass spectrometry (TIMS). Recently, a set of evolving and expanding methods have been developed and rigorously validated for the extraction, identification, separation, and quantitation of sphingolipids by LC-MS/MS. Quantitation of these biomolecules is made possible via the use of an internal standard cocktail. The compounds that can be readily analyzed are free long-chain (sphingoid) bases, sphingoid base 1-phosphates, and more complex species such as ceramides, ceramide 1-phosphates, sphingomyelins, mono- and di-hexosylceramides sulfatides, and novel compounds such as the 1-deoxy- and 1-(deoxymethyl)-sphingoid bases and their N-acyl-derivatives. These methods can be altered slightly to separate and quantitate isomeric species such as glucosyl/galactosylceramide. Because these techniques require the extraction of sphingolipids from their native environment, any information regarding their localization in histological slices is lost. Therefore, this review also describes methods for TIMS. This technique has been shown to be a powerful tool to determine the localization of individual molecular species of sphingolipids directly from tissue slices. PMID:21749933
High speed digital holographic interferometry for hypersonic flow visualization
NASA Astrophysics Data System (ADS)
Hegde, G. M.; Jagdeesh, G.; Reddy, K. P. J.
2013-06-01
Optical imaging techniques have played a major role in understanding the dynamics of a variety of fluid flows, particularly in the study of hypersonic flows. Schlieren and shadowgraph techniques have been the flow diagnostic tools for the investigation of compressible flows for more than a century. However, these techniques provide only qualitative information about the flow field. Other optical techniques such as holographic interferometry and laser-induced fluorescence (LIF) have been used extensively for extracting quantitative information about high-speed flows. In this paper we present the application of the digital holographic interferometry (DHI) technique, integrated with a short-duration hypersonic shock tunnel facility having a 1 ms test time, for quantitative flow visualization. The dynamics of the flow fields at hypersonic/supersonic speeds around different test models are visualized with DHI using a high-speed digital camera (0.2 million fps). These visualization results are compared with schlieren visualization and CFD simulation results. Fringe analysis is carried out to estimate the density of the flow field.
A lighting metric for quantitative evaluation of accent lighting systems
NASA Astrophysics Data System (ADS)
Acholo, Cyril O.; Connor, Kenneth A.; Radke, Richard J.
2014-09-01
Accent lighting is critical for artwork and sculpture lighting in museums, and subject lighting for stage, film and television. The research problem of designing effective lighting in such settings has been revived recently with the rise of light-emitting-diode-based solid-state lighting. In this work, we propose an easy-to-apply quantitative measure of the scene's visual quality as perceived by human viewers. We consider a well-accent-lit scene as one which maximizes the information about the scene (in an information-theoretic sense) available to the user. We propose a metric based on the entropy of the distribution of colors, which are extracted from an image of the scene from the viewer's perspective. We demonstrate that optimizing the metric as a function of illumination configuration (i.e., position, orientation, and spectral composition) results in natural, pleasing accent lighting. We use a photorealistic simulation tool to validate the functionality of our proposed approach, showing its successful application to two- and three-dimensional scenes.
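A minimal sketch of a color-entropy metric in the spirit proposed above: the colors of a rendered scene image are quantized and the Shannon entropy of the resulting color distribution is computed. The quantization level is an illustrative assumption, not the paper's parameter choice.

```python
import numpy as np

def color_entropy(rgb_image, levels=16):
    """rgb_image: H x W x 3 array with values in [0, 255]; returns entropy in bits."""
    quantized = (rgb_image.astype(int) * levels) // 256
    codes = (quantized[..., 0] * levels * levels
             + quantized[..., 1] * levels
             + quantized[..., 2])
    counts = np.bincount(codes.ravel(), minlength=levels ** 3)
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()     # higher entropy = richer color information
```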
NASA Astrophysics Data System (ADS)
Liu, X.; Zhang, J. X.; Zhao, Z.; Ma, A. D.
2015-06-01
Synthetic aperture radar (SAR) is being applied more and more widely in remote sensing because of its all-day and all-weather operation, and feature extraction from high-resolution SAR images has become a topic of great interest. In particular, with the continuous improvement of airborne SAR image resolution, image texture information becomes more abundant, which is of great significance for classification and extraction. In this paper, a novel method for built-up area extraction using both statistical and structural features is proposed according to the texture characteristics of built-up areas. First, statistical and structural texture features are extracted with the classical gray-level co-occurrence matrix and the variogram function, respectively, and directional information is considered in this process. Next, feature weights are calculated according to the Bhattacharyya distance. Then, all features are fused using these weights. Finally, the fused image is classified with the K-means method and the built-up areas are extracted after a post-classification process. The proposed method has been tested on domestic airborne P-band polarimetric SAR images; at the same time, two comparison experiments using only the statistical texture method and only the structural texture method were carried out. In addition to qualitative analysis, quantitative analysis based on manually selected built-up areas was performed: in the relatively simple test area the detection rate is more than 90%, and in the relatively complex test area the detection rate is also higher than that of the other two methods. The results show that this method can effectively and accurately extract built-up areas from high-resolution airborne SAR imagery in the study area.
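A minimal sketch of the statistical-texture step described above: gray-level co-occurrence matrix (GLCM) features computed in several directions on a SAR amplitude patch with scikit-image (graycomatrix; spelled greycomatrix in older releases). The quantization, offsets and chosen properties are generic assumptions, not the paper's exact parameters, and the variogram and Bhattacharyya weighting steps are not shown.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch, levels=32):
    """patch: 2-D SAR amplitude patch (float, non-zero max), rescaled to `levels` gray levels."""
    q = np.clip((patch / patch.max() * (levels - 1)).astype(np.uint8), 0, levels - 1)
    glcm = graycomatrix(q, distances=[1],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=levels, symmetric=True, normed=True)
    # average each property over the four directions
    return {prop: graycoprops(glcm, prop).mean()
            for prop in ("contrast", "homogeneity", "energy", "correlation")}
```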
Martín-Campos, Trinidad; Mylonas, Roman; Masselot, Alexandre; Waridel, Patrice; Petricevic, Tanja; Xenarios, Ioannis; Quadroni, Manfredo
2017-08-04
Mass spectrometry (MS) has become the tool of choice for the large scale identification and quantitation of proteins and their post-translational modifications (PTMs). This development has been enabled by powerful software packages for the automated analysis of MS data. While data on PTMs of thousands of proteins can nowadays be readily obtained, fully deciphering the complexity and combinatorics of modification patterns even on a single protein often remains challenging. Moreover, functional investigation of PTMs on a protein of interest requires validation of the localization and the accurate quantitation of its changes across several conditions, tasks that often still require human evaluation. Software tools for large scale analyses are highly efficient but are rarely conceived for interactive, in-depth exploration of data on individual proteins. We here describe MsViz, a web-based and interactive software tool that supports manual validation of PTMs and their relative quantitation in small- and medium-size experiments. The tool displays sequence coverage information, peptide-spectrum matches, tandem MS spectra and extracted ion chromatograms through a single, highly intuitive interface. We found that MsViz greatly facilitates manual data inspection to validate PTM location and quantitate modified species across multiple samples.
Sun, Wanxin; Chang, Shi; Tai, Dean C S; Tan, Nancy; Xiao, Guangfa; Tang, Huihuan; Yu, Hanry
2008-01-01
Liver fibrosis is associated with an abnormal increase in extracellular matrix in chronic liver diseases. Quantitative characterization of fibrillar collagen in intact tissue is essential for both fibrosis studies and clinical applications. Commonly used methods, histological staining followed by either semiquantitative or computerized image analysis, have limited sensitivity and accuracy and suffer from operator-dependent variation. The fibrillar collagen in sinusoids of normal livers could be observed through second-harmonic generation (SHG) microscopy. The two-photon excited fluorescence (TPEF) images, recorded simultaneously with SHG, clearly revealed the hepatocyte morphology. We have systematically optimized the parameters for the quantitative SHG/TPEF imaging of liver tissue and developed fully automated image analysis algorithms to extract information on collagen changes and cell necrosis. Subtle changes in the distribution and amount of collagen and in cell morphology are quantitatively characterized in SHG/TPEF images. Compared with traditional staining, such as Masson's trichrome and Sirius red, SHG/TPEF is a sensitive quantitative tool for automated collagen characterization in liver tissue. Our system allows for enhanced detection and quantification of sinusoidal collagen fibers in fibrosis research and clinical diagnostics.
All-Solid-State Batteries with Thick Electrode Configurations.
Kato, Yuki; Shiotani, Shinya; Morita, Keisuke; Suzuki, Kota; Hirayama, Masaaki; Kanno, Ryoji
2018-02-01
We report the preparation of thick-electrode all-solid-state lithium-ion cells in which a large geometric capacity of 15.7 mAh cm(-2) was achieved at room temperature using a 600 μm-thick cathode layer. The effect of ionic conductivity on the discharge performance was then examined using two different materials for the solid electrolyte. Furthermore, important morphological information regarding the tortuosity factor was electrochemically extracted from the capacity-current data. The effect of tortuosity on cell performance was also quantitatively discussed.
Cleavage Entropy as Quantitative Measure of Protease Specificity
Fuchs, Julian E.; von Grafenstein, Susanne; Huber, Roland G.; Margreiter, Michael A.; Spitzer, Gudrun M.; Wallnoefer, Hannes G.; Liedl, Klaus R.
2013-01-01
A purely information theory-guided approach to quantitatively characterize protease specificity is established. We calculate an entropy value for each protease subpocket based on sequences of cleaved substrates extracted from the MEROPS database. We compare our results with known subpocket specificity profiles for individual proteases and protease groups (e.g. serine proteases, metallo proteases) and reflect them quantitatively. Summation of subpocket-wise cleavage entropy contributions yields a measure for overall protease substrate specificity. This total cleavage entropy allows ranking of different proteases with respect to their specificity, separating unspecific digestive enzymes showing high total cleavage entropy from specific proteases involved in signaling cascades. The development of a quantitative cleavage entropy score allows an unbiased comparison of subpocket-wise and overall protease specificity. Thus, it enables assessment of relative importance of physicochemical and structural descriptors in protease recognition. We present an exemplary application of cleavage entropy in tracing substrate specificity in protease evolution. This highlights the wide range of substrate promiscuity within homologue proteases and hence the heavy impact of a limited number of mutations on individual substrate specificity. PMID:23637583
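A minimal sketch of the subpocket-entropy idea described above: for each aligned subpocket position, amino-acid frequencies over cleaved-substrate sequences give a Shannon entropy, and the sum over positions yields a total cleavage entropy. The normalization details and the toy substrate list are assumptions for illustration and may differ from the published definition.

```python
import numpy as np
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def subpocket_entropy(residues):
    counts = Counter(residues)
    p = np.array([counts.get(aa, 0) for aa in AMINO_ACIDS], dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())    # 0 = fully specific, log2(20) = unspecific

def total_cleavage_entropy(substrates):
    """substrates: equal-length sequences spanning the aligned subpocket positions."""
    return sum(subpocket_entropy(position) for position in zip(*substrates))

print(total_cleavage_entropy(["DEVD", "DEVD", "DEID"]))   # low value: a specific protease
```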
Quantitative magnetic resonance micro-imaging methods for pharmaceutical research.
Mantle, M D
2011-09-30
The use of magnetic resonance imaging (MRI) as a tool in pharmaceutical research is now well established and the current literature covers a multitude of different pharmaceutically relevant research areas. This review focuses on the use of quantitative magnetic resonance micro-imaging techniques and how they have been exploited to extract information that is of direct relevance to the pharmaceutical industry. The article is divided into two main areas. The first half outlines the theoretical aspects of magnetic resonance and deals with basic magnetic resonance theory, the effects of nuclear spin-lattice (T(1)), spin-spin (T(2)) relaxation and molecular diffusion upon image quantitation, and discusses the applications of rapid magnetic resonance imaging techniques. In addition to the theory, the review aims to provide some practical guidelines for the pharmaceutical researcher with an interest in MRI as to which MRI pulse sequences/protocols should be used and when. The second half of the article reviews the recent advances and developments that have appeared in the literature concerning the use of quantitative micro-imaging methods to pharmaceutically relevant research. Copyright © 2010 Elsevier B.V. All rights reserved.
Espresso coffee foam delays cooling of the liquid phase.
Arii, Yasuhiro; Nishizawa, Kaho
2017-04-01
Espresso coffee foam, called crema, is known to be a marker of the quality of espresso coffee extraction. However, the role of foam in coffee temperature has not been quantitatively clarified. In this study, we used an automatic machine for espresso coffee extraction. We evaluated whether the foam prepared using the machine was suitable for foam analysis. After extraction, the percentage and consistency of the foam were measured using various techniques, and changes in the foam volume were tracked over time. Our extraction method, therefore, allowed consistent preparation of high-quality foam. We also quantitatively determined that the foam phase slowed cooling of the liquid phase after extraction. High-quality foam plays an important role in delaying the cooling of espresso coffee.
McDonald, Gene D; Storrie-Lombardi, Michael C
2006-02-01
The relative abundance of the protein amino acids has been previously investigated as a potential marker for biogenicity in meteoritic samples. However, these investigations were executed without a quantitative metric to evaluate distribution variations, and they did not account for the possibility of interdisciplinary systematic error arising from inter-laboratory differences in extraction and detection techniques. Principal component analysis (PCA), hierarchical cluster analysis (HCA), and stochastic probabilistic artificial neural networks (ANNs) were used to compare the distributions for nine protein amino acids previously reported for the Murchison carbonaceous chondrite, Mars meteorites (ALH84001, Nakhla, and EETA79001), prebiotic synthesis experiments, and terrestrial biota and sediments. These techniques allowed us (1) to identify a shift in terrestrial amino acid distributions secondary to diagenesis; (2) to detect differences in terrestrial distributions that may be systematic differences between extraction and analysis techniques in biological and geological laboratories; and (3) to determine that distributions in meteoritic samples appear more similar to prebiotic chemistry samples than they do to the terrestrial unaltered or diagenetic samples. Both diagenesis and putative interdisciplinary differences in analysis complicate interpretation of meteoritic amino acid distributions. We propose that the analysis of future samples from such diverse sources as meteoritic influx, sample return missions, and in situ exploration of Mars would be less ambiguous with adoption of standardized assay techniques, systematic inclusion of assay standards, and the use of a quantitative, probabilistic metric. We present here one such metric determined by sequential feature extraction and normalization (PCA), information-driven automated exploration of classification possibilities (HCA), and prediction of classification accuracy (ANNs).
NASA Astrophysics Data System (ADS)
Fan, Li; Lin, Changhu; Duan, Wenjuan; Wang, Xiao; Liu, Jianhua; Liu, Feng
2015-01-01
An ultrahigh pressure extraction (UPE)-high performance liquid chromatography (HPLC)/diode array detector (DAD) method was established to evaluate the quality of Lonicera japonica Thunb. Ten active components, including neochlorogenic acid, chlorogenic acid, 4-dicaffeoylquinic acid, caffeic acid, rutin, luteoloside, isochlorogenic acid B, isochlorogenic acid A, isochlorogenic acid C, and quercetin, were qualitatively evaluated and quantitatively determined. Scanning electron microscope images elucidated the bud surface microstructure and extraction mechanism. The optimal extraction conditions of the UPE were 60% methanol solution, 400 MPa of extraction pressure, 3 min of extraction time, and 1:30 (g/mL) solid:liquid ratio. Under the optimized conditions, the total extraction yield of 10 active components was 57.62 mg/g. All the components showed good linearity (r2 ≥ 0.9994) and recoveries. This method was successfully applied to quantify 10 components in 22 batches of L. japonica samples from different areas. Compared with heat reflux extraction and ultrasonic-assisted extraction, UPE can be considered as an alternative extraction technique for fast extraction of active ingredient from L. japonica.
Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina
2006-01-01
Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; therefore it was chosen as the primary criterion by which to evaluate the quality and performance of different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is shown that in the area of food and feed testing a matrix with certain specificities is impossible to define, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
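A minimal sketch of the standard-curve quantification principle discussed above: quantification cycle (Cq) values are regressed against log10 copy number, the PCR efficiency is derived from the slope, and unknown samples are read off the curve. All numbers are illustrative only, not data from the study.

```python
import numpy as np

copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])            # standard dilution series
cq = np.array([33.1, 29.8, 26.4, 23.1, 19.7])            # measured quantification cycles

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                   # ~1.0 corresponds to 100% efficiency

unknown_cq = 25.0
unknown_copies = 10 ** ((unknown_cq - intercept) / slope)
print(f"efficiency: {efficiency:.2f}, estimated copies: {unknown_copies:.0f}")
```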
Time dependent calibration of a sediment extraction scheme.
Roychoudhury, Alakendra N
2006-04-01
Sediment extraction methods to quantify metal concentration in aquatic sediments usually present limitations in accuracy and reproducibility because metal concentration in the supernatant is controlled to a large extent by the physico-chemical properties of the sediment that result in a complex interplay between the solid and the solution phase. It is suggested here that standardization of sediment extraction methods using pure mineral phases or reference material is futile and instead the extraction processes should be calibrated using site-specific sediments before their application. For calibration, time dependent release of metals should be observed for each leachate to ascertain the appropriate time for a given extraction step. Although such an approach is tedious and time consuming, using iron extraction as an example, it is shown here that apart from quantitative data such an approach provides additional information on factors that play an intricate role in metal dynamics in the environment. Single step ascorbate, HCl, oxalate and dithionite extractions were used for targeting specific iron phases from saltmarsh sediments and their response was observed over time in order to calibrate the extraction times for each extractant later to be used in a sequential extraction. For surficial sediments, an extraction time of 24 h, 1 h, 2 h and 3 h was ascertained for ascorbate, HCl, oxalate and dithionite extractions, respectively. Fluctuations in iron concentration in the supernatant over time were ubiquitous. The adsorption-desorption behavior is possibly controlled by the sediment organic matter, formation or consumption of active exchange sites during extraction and the crystallinity of iron mineral phase present in the sediments.
Guidelines for reporting quantitative mass spectrometry based experiments in proteomics.
Martínez-Bartolomé, Salvador; Deutsch, Eric W; Binz, Pierre-Alain; Jones, Andrew R; Eisenacher, Martin; Mayer, Gerhard; Campos, Alex; Canals, Francesc; Bech-Serra, Joan-Josep; Carrascal, Montserrat; Gay, Marina; Paradela, Alberto; Navajas, Rosana; Marcilla, Miguel; Hernáez, María Luisa; Gutiérrez-Blázquez, María Dolores; Velarde, Luis Felipe Clemente; Aloria, Kerman; Beaskoetxea, Jabier; Medina-Aunon, J Alberto; Albar, Juan P
2013-12-16
Mass spectrometry is already a well-established protein identification tool and recent methodological and technological developments have also made possible the extraction of quantitative data of protein abundance in large-scale studies. Several strategies for absolute and relative quantitative proteomics and the statistical assessment of quantifications are possible, each having specific measurements and therefore, different data analysis workflows. The guidelines for Mass Spectrometry Quantification allow the description of a wide range of quantitative approaches, including labeled and label-free techniques and also targeted approaches such as Selected Reaction Monitoring (SRM). The HUPO Proteomics Standards Initiative (HUPO-PSI) has invested considerable efforts to improve the standardization of proteomics data handling, representation and sharing through the development of data standards, reporting guidelines, controlled vocabularies and tooling. In this manuscript, we describe a key output from the HUPO-PSI-namely the MIAPE Quant guidelines, which have developed in parallel with the corresponding data exchange format mzQuantML [1]. The MIAPE Quant guidelines describe the HUPO-PSI proposal concerning the minimum information to be reported when a quantitative data set, derived from mass spectrometry (MS), is submitted to a database or as supplementary information to a journal. The guidelines have been developed with input from a broad spectrum of stakeholders in the proteomics field to represent a true consensus view of the most important data types and metadata, required for a quantitative experiment to be analyzed critically or a data analysis pipeline to be reproduced. It is anticipated that they will influence or be directly adopted as part of journal guidelines for publication and by public proteomics databases and thus may have an impact on proteomics laboratories across the world. This article is part of a Special Issue entitled: Standardization and Quality Control. Copyright © 2013 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Hok, S; Alcaraz, A
Tetramethylenedisulfotetramine, commonly known as tetramine, is a highly neurotoxic rodenticide (human oral LD50 = 0.1 mg/kg) used in hundreds of deliberate food poisoning events in China. Here we describe a method for quantitation of tetramine spiked into beverages, including milk, juice, tea, cola, and water, and cleaned up by C8 solid phase extraction and liquid-liquid extraction. Quantitation by high performance liquid chromatography tandem mass spectrometry (LC/MS/MS) was based upon fragmentation of m/z 347 to m/z 268. The method was validated by gas chromatography mass spectrometry (GC/MS) operated in SIM mode for ions m/z 212, 240, and 360. The limit of quantitation was 0.10 µg/mL by LC/MS/MS versus 0.15 µg/mL for GC/MS. Fortifications of the beverages at 2.5 µg/mL and 0.25 µg/mL were recovered ranging from 73-128% by liquid-liquid extraction for GC/MS analysis, and 13-96% by SPE and 10-101% by liquid-liquid extraction for LC/MS/MS analysis.
Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand
2016-03-15
A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode is showing that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations as virtually all ionized compounds are detected with a high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) that is used to construct the extracted-ion-chromatograms. We propose MA-parameters, graphs, and equations to calculate rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
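To make the mass-extraction-window concept concrete, the sketch below builds an extracted-ion chromatogram from centroided HR-full-scan data using a ppm-wide window. The data layout, the 10 ppm width, and the assumption that the window is a total width (so peaks within ±width/2 are kept) are illustrative, not the parameters recommended by the study:

    def extract_xic(scans, target_mz, mew_ppm=10.0):
        # scans: list of (retention_time, [(mz, intensity), ...]) tuples.
        # The mass-extraction-window is expressed as a total width in ppm,
        # so centroids within +/- mew_ppm/2 of the target m/z are summed.
        half_window = target_mz * mew_ppm / 2.0 / 1e6
        xic = []
        for rt, peaks in scans:
            intensity = sum(i for mz, i in peaks
                            if abs(mz - target_mz) <= half_window)
            xic.append((rt, intensity))
        return xic

    # Illustrative two-scan example
    scans = [(0.50, [(300.1520, 1.2e5), (300.1600, 3.0e3)]),
             (0.52, [(300.1523, 2.4e5)])]
    print(extract_xic(scans, target_mz=300.1521, mew_ppm=10.0))

Too narrow a window risks false negatives when mass accuracy drifts, while too wide a window risks false positives from isobaric interferences, which is the trade-off the proposed MA parameters, graphs, and equations are meant to manage.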
Xu, Leilei; Wang, Fang; Xu, Ying; Wang, Yi; Zhang, Cuiping; Qin, Xue; Yu, Hongxiu; Yang, Pengyuan
2015-12-07
As a key post-translational modification mechanism, protein acetylation plays critical roles in regulating and/or coordinating cell metabolism. Acetylation is a prevalent modification process in enzymes. Protein acetylation modification occurs in sub-stoichiometric amounts; therefore extracting biologically meaningful information from these acetylation sites requires an adaptable, sensitive, specific, and robust method for their quantification. In this work, we combine immunoassays and multiple reaction monitoring-mass spectrometry (MRM-MS) technology to develop an absolute quantification for acetylation modification. With this hybrid method, we quantified the acetylation level of metabolic enzymes, which could demonstrate the regulatory mechanisms of the studied enzymes. The development of this quantitative workflow is a pivotal step for advancing our knowledge and understanding of the regulatory effects of protein acetylation in physiology and pathophysiology.
Determination of psilocybin in Psilocybe semilanceata by capillary zone electrophoresis.
Pedersen-Bjergaard, S; Sannes, E; Rasmussen, K E; Tønnesen, F
1997-07-04
A capillary zone electrophoretic (CZE) method was developed for the rapid determination of psilocybin in Psilocybe semilanceata. Following a simple two step extraction with 3.0+2.0 ml methanol, the hallucinogenic compound was effectively separated from matrix components by CZE utilizing a 10 mM borate-phosphate running buffer adjusted to pH 11.5. The identity of psilocybin was confirmed by migration time information and by UV spectra, while quantitation was accomplished utilizing barbital as internal standard. The calibration curve for psilocybin was linear within 0.01-1 mg/ml, while intra-day and inter-day variations of quantitative data were 0.5 and 2.5% R.S.D., respectively. In addition to psilocybin, the method was also suitable for the determination of the structurally related compound baeocystin.
How to integrate quantitative information into imaging reports for oncologic patients.
Martí-Bonmatí, L; Ruiz-Martínez, E; Ten, A; Alberich-Bayarri, A
2018-05-01
Nowadays, the images and information generated in imaging tests, as well as the reports that are issued, are digital and represent a reliable source of data. Reports can be classified according to their content and to the type of information they include into three main types: organized (free text in natural language), predefined (with templates and guidelines elaborated with previously determined natural language like that used in BI-RADS and PI-RADS), or structured (with drop-down menus displaying questions with various possible answers that have been agreed on with the rest of the multidisciplinary team, which use standardized lexicons and are structured in the form of a database with data that can be traced and exploited with statistical tools and data mining). The structured report, compatible with Management of Radiology Report Templates (MRRT), makes it possible to incorporate quantitative information related with the digital analysis of the data from the acquired images to accurately and precisely describe the properties and behavior of tissues by means of radiomics (characteristics and parameters). In conclusion, structured digital information (images, text, measurements, radiomic features, and imaging biomarkers) should be integrated into computerized reports so that they can be indexed in large repositories. Radiologic databanks are fundamental for exploiting health information, phenotyping lesions and diseases, and extracting conclusions in personalized medicine. Copyright © 2018 SERAM. Publicado por Elsevier España, S.L.U. All rights reserved.
NASA Astrophysics Data System (ADS)
Bredfeldt, Jeremy S.; Liu, Yuming; Pehlke, Carolyn A.; Conklin, Matthew W.; Szulczewski, Joseph M.; Inman, David R.; Keely, Patricia J.; Nowak, Robert D.; Mackie, Thomas R.; Eliceiri, Kevin W.
2014-01-01
Second-harmonic generation (SHG) imaging can help reveal interactions between collagen fibers and cancer cells. Quantitative analysis of SHG images of collagen fibers is challenged by the heterogeneity of collagen structures and low signal-to-noise ratio often found while imaging collagen in tissue. The role of collagen in breast cancer progression can be assessed post acquisition via enhanced computation. To facilitate this, we have implemented and evaluated four algorithms for extracting fiber information, such as number, length, and curvature, from a variety of SHG images of collagen in breast tissue. The image-processing algorithms included a Gaussian filter, SPIRAL-TV filter, Tubeness filter, and curvelet-denoising filter. Fibers are then extracted using an automated tracking algorithm called fiber extraction (FIRE). We evaluated the algorithm performance by comparing length, angle and position of the automatically extracted fibers with those of manually extracted fibers in twenty-five SHG images of breast cancer. We found that the curvelet-denoising filter followed by FIRE, a process we call CT-FIRE, outperforms the other algorithms under investigation. CT-FIRE was then successfully applied to track collagen fiber shape changes over time in an in vivo mouse model for breast cancer.
Cilia, M.; Fish, T.; Yang, X.; Mclaughlin, M.; Thannhauser, T. W.
2009-01-01
Protein extraction methods can vary widely in reproducibility and in representation of the total proteome, yet there are limited data comparing protein isolation methods. The methodical comparison of protein isolation methods is the first critical step for proteomic studies. To address this, we compared three methods for isolation, purification, and solubilization of insect proteins. The aphid Schizaphis graminum, an agricultural pest, was the source of insect tissue. Proteins were extracted using TCA in acetone (TCA-acetone), phenol, or multi-detergents in a chaotrope solution. Extracted proteins were solubilized in a multiple chaotrope solution and examined using 1-D and 2-D electrophoresis and compared directly using 2-D Difference Gel Electrophoresis (2-D DIGE). Mass spectrometry was used to identify proteins from each extraction type. We were unable to ascribe the differences in the proteins extracted to particular physical characteristics, cell location, or biological function. The TCA-acetone extraction yielded the greatest amount of protein from aphid tissues. Each extraction method isolated a unique subset of the aphid proteome. The TCA-acetone method was explored further for its quantitative reliability using 2-D DIGE. Principal component analysis showed that little of the variation in the data was a result of technical issues, thus demonstrating that the TCA-acetone extraction is a reliable method for preparing aphid proteins for a quantitative proteomics experiment. These data suggest that although the TCA-acetone method is a suitable method for quantitative aphid proteomics, a combination of extraction approaches is recommended for increasing proteome coverage when using gel-based separation techniques. PMID:19721822
Research on the use of data fusion technology to evaluate the state of electromechanical equipment
NASA Astrophysics Data System (ADS)
Lin, Lin
2018-04-01
Aiming at the problems of different testing information modes and the coexistence of quantitative and qualitative information in the state evaluation of electromechanical equipment, the paper proposes the use of data fusion technology to evaluate the state of electromechanical equipment. The paper introduces the state evaluation process of mechanical and electrical equipment in detail, uses D-S evidence theory to fuse the decision-making layers of mechanical and electrical equipment state evaluation, and carries out simulation tests. The simulation results show that it is feasible and effective to apply data fusion technology to the state evaluation of mechatronic equipment. After the decision-making information provided by different evaluation methods is repeatedly fused and the useful information repeatedly extracted, the fuzziness of the judgment can be reduced and the credibility of the state evaluation can be improved.
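A minimal sketch of the decision-level fusion step, using Dempster's rule of combination from D-S evidence theory; the equipment states, evidence sources, and mass values below are hypothetical and only illustrate the mechanics, not the paper's actual evaluation model:

    from itertools import product

    def dempster_combine(m1, m2):
        # m1, m2: dicts mapping frozenset hypotheses to masses summing to 1.
        # Returns the combined assignment, normalized by (1 - conflict).
        combined = {}
        conflict = 0.0
        for (a, ma), (b, mb) in product(m1.items(), m2.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
        if conflict >= 1.0:
            raise ValueError("Total conflict: evidence cannot be combined")
        return {h: v / (1.0 - conflict) for h, v in combined.items()}

    # Hypothetical equipment states: N = normal, F = fault
    normal, fault, either = frozenset("N"), frozenset("F"), frozenset("NF")
    m_vibration = {normal: 0.6, fault: 0.3, either: 0.1}
    m_temperature = {normal: 0.7, fault: 0.2, either: 0.1}
    print(dempster_combine(m_vibration, m_temperature))

Each evaluation method contributes one basic probability assignment; combining them shifts mass away from the ambiguous hypothesis, which is the reduction of judgment fuzziness described in the abstract.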
Ţarălungă, Dragoş-Daniel; Ungureanu, Georgeta-Mihaela; Gussi, Ilinca; Strungaru, Rodica; Wolf, Werner
2014-01-01
Interference of power line (PLI) (fundamental frequency and its harmonics) is usually present in biopotential measurements. Despite all countermeasures, the PLI still corrupts physiological signals, for example, electromyograms (EMG), electroencephalograms (EEG), and electrocardiograms (ECG). When analyzing the fetal ECG (fECG) recorded on the maternal abdomen, the PLI represents a particular strong noise component, being sometimes 10 times greater than the fECG signal, and thus impairing the extraction of any useful information regarding the fetal health state. Many signal processing methods for cancelling the PLI from biopotentials are available in the literature. In this review study, six different principles are analyzed and discussed, and their performance is evaluated on simulated data (three different scenarios), based on five quantitative performance indices.
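One of the classical principles for PLI cancellation is fixed notch filtering of the fundamental and its harmonics. The sketch below shows the idea with SciPy; the sampling rate, notch quality factor, and synthetic signal are illustrative, and this is only one of the several approaches the review compares:

    import numpy as np
    from scipy.signal import iirnotch, filtfilt

    def remove_pli(signal, fs, pli_freq=50.0, quality=30.0, n_harmonics=2):
        # Suppress the power-line fundamental and its harmonics with IIR notches.
        cleaned = np.asarray(signal, dtype=float)
        for k in range(1, n_harmonics + 1):
            b, a = iirnotch(w0=k * pli_freq, Q=quality, fs=fs)
            cleaned = filtfilt(b, a, cleaned)
        return cleaned

    # Illustrative: 1 kHz sampling, slow sinusoid standing in for the biopotential,
    # corrupted by a much stronger 50 Hz component
    fs = 1000.0
    t = np.arange(0, 2.0, 1.0 / fs)
    biopotential = 0.1 * np.sin(2 * np.pi * 1.2 * t)
    noisy = biopotential + 1.0 * np.sin(2 * np.pi * 50.0 * t)
    clean = remove_pli(noisy, fs)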
Preliminary Estimation of Deoxynivalenol Excretion through a 24 h Pilot Study
Rodríguez-Carrasco, Yelko; Mañes, Jordi; Berrada, Houda; Font, Guillermina
2015-01-01
A duplicate diet study was designed to explore the occurrence of 15 Fusarium mycotoxins in the 24 h-diet consumed by one volunteer as well as the levels of mycotoxins in his 24 h-collected urine. The employed methodology involved solvent extraction at high ionic strength followed by dispersive solid phase extraction and gas chromatography determination coupled to mass spectrometry in tandem. Satisfactory results in method performance were achieved. The method’s accuracy was in a range of 68%–108%, with intra-day relative standard deviation and inter-day relative standard deviation lower than 12% and 15%, respectively. The limits of quantitation ranged from 0.1 to 8 µg/Kg. The matrix effect was evaluated and matrix-matched calibrations were used for quantitation. Only deoxynivalenol (DON) was quantified in both food and urine samples. A total DON daily intake amounted to 49.2 ± 5.6 µg whereas DON daily excretion of 35.2 ± 4.3 µg was determined. DON daily intake represented 68.3% of the established DON provisional maximum tolerable daily intake (PMTDI). Valuable preliminary information was obtained as regards DON excretion and needs to be confirmed in large-scale monitoring studies. PMID:25723325
Stout, Peter R; Gehlhausen, Jay M; Horn, Carl K; Klette, Kevin L
2002-10-01
A novel extraction and derivatization procedure for the cocaine metabolite benzoylecgonine (BZE) was developed and evaluated for use in a high-volume forensic urine analysis laboratory. Extractions utilized a Speedisk 48 positive pressure extraction manifold and polymer-based cation-exchange extraction columns. Samples were derivatized by the addition of pentafluoropropionic anhydride and pentafluoropropanol. All analyses were performed in selected ion monitoring mode; ions included m/z 421, 300, 272, 429, and 303 with m/z 421 to 429 ratio used for quantitation. The average extraction efficiency was 80%. Seventy-five common over-the-counter products, including prescription drugs, drug metabolites, and other drugs of abuse, demonstrated no significant interference with respect to chromatography or quantitation. The limit of detection and limit of quantitation were calculated at 12.5 ng/mL, and the assay was linear from 12.5 to 20,000 ng/mL with an r2 of 0.99932. A series of 20 precision samples (100 ng/mL) produced an average response of 97.8 ng/mL and a percent coefficient of variation of 4.1%. A set of 79 archived human urine samples that had previously been found to contain BZE were analyzed by 3 separate laboratories. The results did not differ significantly from prior quantitation or between laboratories. The Speedisk has proven viable for a high-volume production facility reducing overall cost of analysis by decreasing analysis time and minimizing waste production while meeting strict forensic requirements.
Ekeberg, Dag; Flaete, Per-Otto; Eikenes, Morten; Fongen, Monica; Naess-Andresen, Carl Fredrik
2006-03-24
A method for quantitative determination of extractives from heartwood of Scots pine (Pinus sylvestris L.) using gas chromatography (GC) with flame ionization detection (FID) was developed. The limit of detection (LOD) was 0.03 mg/g wood and the linear range (r = 0.9994) was up to 10 mg/g with accuracy within +/- 10% and precision of 18% relative standard deviation. The identification of the extractives was performed using gas chromatography combined with mass spectrometry (GC-MS). The yields of extraction by Soxhlet were tested for solid wood, small particles and fine powder. Small particles were chosen for further analysis. This treatment gave good yields of the most important extractives: pinosylvin, pinosylvin monomethyl ether, resin acids and free fatty acids. The method is used to demonstrate the variation of these extractives across stems and differences in north-south direction.
The Large Area Crop Inventory Experiment /LACIE/ - A summary of three years' experience
NASA Technical Reports Server (NTRS)
Erb, R. B.; Moore, B. H.
1979-01-01
The aims, history and schedule of the Large Area Crop Inventory Experiment (LACIE) conducted by NASA, USDA and NOAA from 1974-1977 are described. The LACIE experiment, designed to research, develop, apply and evaluate a technology to monitor wheat production in important regions throughout the world (U.S., Canada, USSR, Brazil), utilized quantitative multispectral data collected by Landsat in concert with current weather data and historical information. The experiment successfully exploited computer data and mathematical models to extract timely crop information. Follow-on activities for the early 1980s are planned, focusing especially on the early warning of changes affecting the production and quality of renewable resources and on commodity production forecasting.
Spectral imaging of histological and cytological specimens
NASA Astrophysics Data System (ADS)
Rothmann, Chana; Malik, Zvi
1999-05-01
Evaluation of cell morphology by bright field microscopy is the pillar of histopathological diagnosis. The need for quantitative and objective parameters for diagnosis has given rise to the development of morphometric methods. The development of spectral imaging for biological and medical applications introduced both fields to large amounts of information extracted from a single image. Spectroscopic analysis is based on the ability of a stained histological specimen to absorb, reflect, or emit photons in ways characteristic to its interactions with specific dyes. Spectral information obtained from a histological specimen is stored in a cube whose appellation signifies the two spatial dimensions of a flat sample (x and y) and the third dimension, the spectrum, representing the light intensity for every wavelength. The spectral information stored in the cube can be further processed by morphometric analysis and quantitative procedures. One such procedure is spectral-similarity mapping (SSM), which enables the demarcation of areas occupied by the same type of material. SSM constructs new images of the specimen, revealing areas with similar stain-macromolecule characteristics and enhancing subcellular features. Spectral imaging combined with SSM reveals nuclear organization through the differentiation stages as well as in apoptotic and necrotic conditions and specifically identifies the nucleoli domains.
Astronomy, Visual Literacy, and Liberal Arts Education
NASA Astrophysics Data System (ADS)
Crider, Anthony
2016-01-01
With the exponentially growing amount of visual content that twenty-first century students will face throughout their lives, teaching them to respond to it with visual and information literacy skills should be a clear priority for liberal arts education. While visual literacy is more commonly covered within humanities curricula, I will argue that because astronomy is inherently a visual science, it is a fertile academic discipline for the teaching and learning of visual literacy. Astronomers, like many scientists, rely on three basic types of visuals to convey information: images, qualitative diagrams, and quantitative plots. In this talk, I will highlight classroom methods that can be used to teach students to "read" and "write" these three separate visuals. Examples of "reading" exercises include questioning the authorship and veracity of images, confronting the distorted scales of many diagrams published in astronomy textbooks, and extracting quantitative information from published plots. Examples of "writing" exercises include capturing astronomical images with smartphones, re-sketching textbook diagrams on whiteboards, and plotting data with Google Motion Charts or iPython notebooks. Students can be further pushed to synthesize these skills with end-of-semester slide presentations that incorporate relevant images, diagrams, and plots rather than relying solely on bulleted lists.
Quantitative proton imaging from multiple physics processes: a proof of concept
NASA Astrophysics Data System (ADS)
Bopp, C.; Rescigno, R.; Rousseau, M.; Brasse, D.
2015-07-01
Proton imaging is developed in order to improve the accuracy of charged particle therapy treatment planning. It makes it possible to directly map the relative stopping powers of the materials using the information on the energy loss of the protons. In order to reach a satisfactory spatial resolution in the reconstructed images, the position and direction of each particle is recorded upstream and downstream from the patient. As a consequence of individual proton detection, information on the transmission rate and scattering of the protons is available. Image reconstruction processes are proposed to make use of this information. A proton tomographic acquisition of an anthropomorphic head phantom was simulated. The transmission rate of the particles was used to reconstruct a map of the macroscopic cross section for nuclear interactions of the materials. A two-step iterative reconstruction process was implemented to reconstruct a map of the inverse scattering length of the materials using the scattering of the protons. Results indicate that, while the reconstruction processes should be optimized, it is possible to extract quantitative information from the transmission rate and scattering of the protons. This suggests that proton imaging could provide additional knowledge on the materials that may be of use to further improve treatment planning.
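The transmission-rate reconstruction described above can be illustrated under a simple attenuation-law assumption: if n_out = n_in * exp(-integral of Sigma dl), then -ln(n_out/n_in) gives the line integral of the macroscopic nuclear-interaction cross section Sigma along each proton path, which a standard tomographic solver can then invert. A minimal sketch with hypothetical counts (this assumption and the numbers are illustrative, not the paper's reconstruction process):

    import numpy as np

    def cross_section_projection(n_in, n_out):
        # Convert counted protons into the line integral of the macroscopic
        # nuclear cross section, assuming exponential attenuation along the path.
        transmission = np.asarray(n_out, dtype=float) / np.asarray(n_in, dtype=float)
        return -np.log(transmission)

    # Hypothetical counts for three proton paths through the object
    protons_sent = np.array([10000, 10000, 10000])
    protons_transmitted = np.array([9600, 9200, 9900])
    print(cross_section_projection(protons_sent, protons_transmitted))
    # These projection values would then feed, e.g., a filtered back-projection
    # or an iterative solver to obtain the Sigma map.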
Electrochemical Probing through a Redox Capacitor To Acquire Chemical Information on Biothiols.
Liu, Zhengchun; Liu, Yi; Kim, Eunkyoung; Bentley, William E; Payne, Gregory F
2016-07-19
The acquisition of chemical information is a critical need for medical diagnostics, food/environmental monitoring, and national security. Here, we report an electrochemical information processing approach that integrates (i) complex electrical inputs/outputs, (ii) mediators to transduce the electrical I/O into redox signals that can actively probe the chemical environment, and (iii) a redox capacitor that manipulates signals for information extraction. We demonstrate the capabilities of this chemical information processing strategy using biothiols because of the emerging importance of these molecules in medicine and because their distinct chemical properties allow evaluation of hypothesis-driven information probing. We show that input sequences can be tailored to probe for chemical information both qualitatively (step inputs probe for thiol-specific signatures) and quantitatively. Specifically, we observed picomolar limits of detection and linear responses to concentrations over 5 orders of magnitude (1 pM-0.1 μM). This approach allows the capabilities of signal processing to be extended for rapid, robust, and on-site analysis of chemical information.
NASA Astrophysics Data System (ADS)
Koma, Zsófia; Székely, Balázs; Dorninger, Peter; Rasztovits, Sascha; Roncat, Andreas; Zámolyi, András; Krawczyk, Dominik; Pfeifer, Norbert
2014-05-01
Aerial imagery derivatives collected by Unmanned Aerial Vehicle (UAV) technology can be used as input for the generation of high resolution digital terrain model (DTM) data, along with the Terrestrial Laser Scanning (TLS) method. Both types of datasets are suitable for detailed geological and geomorphometric analysis, because the data provide micro-topographical and structural geological information. Our study focuses on comparing the geological information that can be extracted from the resulting high resolution DTMs, and attempts to determine which technology is more effective for geological and geomorphological analysis. The measurements were taken at the Doren landslide (Vorarlberg, Austria), a complex rotational landslide situated in the Alpine molasse foreland. Several formations (Kojen Formation, Würmian glacial moraine sediments, Weissach Formation) were tectonized there in the course of the alpine orogeny (Oberhauser et al, 2007). The typical fault direction is WSW-ENE. The UAV measurements, carried out simultaneously with the TLS campaign, focused on the landslide scarp. The original image resolution was 4 mm/pixel; image matching was implemented at pyramid level 2 and the achieved resolution of the DTM was 0.05 meter. The TLS dataset includes 18 scan positions and more than 300 million points for the whole landslide area, and the achieved DTM has 0.2 meter resolution. The steps of the geological and geomorphological analysis were: (1) visual interpretation based on field work and geological maps, and (2) quantitative DTM analysis. In the quantitative analysis, input data provided by the different kinds of DTMs were used for further parameter calculations (e.g. slope, aspect, sigmaZ). In the next step, an automatic classification method was used for the detection of faults and the classification of different parts of the landslide. The conclusion was that for visual geological interpretation the UAV datasets are better, because the high resolution texture information allows for the extraction of digital geomorphological indicators. For quantitative analysis both datasets are informative, but the TLS DTM has the advantage of accessing additional information on faults beneath the vegetation cover. These studies were carried out partly in the framework of the Hybrid 3D project financed by the Austrian Research Promotion Agency (FFG) and by Von-Oben and 4D-IT; the contribution of ZsK was partly funded by the Campus Hungary Internship TÁMOP-424B1; BSz contributed partly as an Alexander von Humboldt Research Fellow.
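For the quantitative DTM analysis step, terrain parameters such as slope and aspect can be derived from a gridded DTM by finite differences. The sketch below is a generic illustration only; the grid values, cell size, and the aspect convention are assumptions, not the workflow used in the study:

    import numpy as np

    def slope_aspect(dtm, cell_size):
        # Slope in degrees and downslope azimuth (one common convention:
        # measured clockwise from the +y/row axis) from a gridded DTM.
        dz_dy, dz_dx = np.gradient(dtm, cell_size)   # axis 0 = y (rows), axis 1 = x (cols)
        slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
        aspect = (np.degrees(np.arctan2(-dz_dx, -dz_dy)) + 360.0) % 360.0
        return slope, aspect

    # Illustrative 3 x 3 DTM patch at 0.2 m resolution (metres above datum)
    dtm = np.array([[100.0, 100.2, 100.4],
                    [100.1, 100.3, 100.5],
                    [100.2, 100.4, 100.6]])
    slope, aspect = slope_aspect(dtm, cell_size=0.2)
    print(slope.round(1))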
Hossain, Mohammad Amzad; AL-Raqmi, Khulood Ahmed Salim; AL-Mijizy, Zawan Hamood; Weli, Afaf Mohammed; Al-Riyami, Qasim
2013-09-01
To prepare various crude extracts using solvents of different polarity and to quantitatively evaluate their total phenol and flavonoid contents, together with phytochemical screening, of Thymus vulgaris collected from Al Jabal Al Akhdar, Nizwa, Sultanate of Oman. The leaf sample was extracted with methanol and evaporated. It was then defatted with water and extracted with organic solvents of increasing polarity. The prepared hexane, chloroform, ethyl acetate, butanol and methanol crude extracts were used for the evaluation of total phenol and flavonoid contents and for the phytochemical screening study. Established conventional methods were used for the quantitative determination of total phenol and flavonoid contents and for phytochemical screening. Phytochemical screening of the various crude extracts gave positive results for flavonoids, saponins and steroid compounds. The total phenol content was highest in the butanol and lowest in the methanol crude extract, whereas the total flavonoid content was highest in the methanol and lowest in the hexane crude extract. The crude extracts from locally grown Thymus vulgaris showed a high concentration of flavonoids and could be used as antibiotics for various curable and incurable diseases.
Anderson, M A; Wachs, T; Henion, J D
1997-02-01
A method based on ionspray liquid chromatography/tandem mass spectrometry (LC/MS/MS) was developed for the determination of reserpine in equine plasma. A comparison was made of the isolation of reserpine from plasma by liquid-liquid extraction and by solid-phase extraction. A structural analog, rescinnamine, was used as the internal standard. The reconstituted extracts were analyzed by ionspray LC/MS/MS in the selected reaction monitoring (SRM) mode. The calibration graph for reserpine extracted from equine plasma obtained using liquid-liquid extraction was linear from 10 to 5000 pg ml-1 and that using solid-phase extraction from 100 to 5000 pg ml-1. The lower level of quantitation (LLQ) using liquid-liquid and solid-phase extraction was 50 and 200 pg ml-1, respectively. The lower level of detection for reserpine by LC/MS/MS was 10 pg ml-1. The intra-assay accuracy did not exceed 13% for liquid-liquid and 12% for solid-phase extraction. The recoveries for the LLQ were 68% for liquid-liquid and 58% for solid-phase extraction.
Ozcan, Adnan; Ozcan, Asiye Safa
2004-10-08
This study compares conventional Soxhlet extraction and analytical-scale supercritical fluid extraction (SFE) with respect to their yields in extracting hydrocarbons from the arid-land plant Euphorbia macroclada. The plant material was first sequentially extracted with supercritical carbon dioxide modified with 10% methanol (v/v) under the optimum conditions, that is, a pressure of 400 atm and a temperature of 50 degrees C, and then sonicated in methylene chloride for an additional 4 h. In a second procedure, E. macroclada was extracted using a Soxhlet apparatus at 30 degrees C for 8 h in methylene chloride. The validated SFE yield was then compared with the Soxhlet extraction yield of E. macroclada using Student's t-test at the 95% confidence level. All extracts were fractionated on silica gel in a glass column to obtain better hydrocarbon yields. The highest hydrocarbon yield from E. macroclada was achieved with SFE (5.8%) compared with Soxhlet extraction (1.1%). Gas chromatography (GC) analysis was performed to quantitatively determine the hydrocarbons from the plant material. The greatest quantitative hydrocarbon recovery determined by GC was obtained for the supercritical carbon dioxide extract (0.6 mg g(-1)).
Accurate airway centerline extraction based on topological thinning using graph-theoretic analysis.
Bian, Zijian; Tan, Wenjun; Yang, Jinzhu; Liu, Jiren; Zhao, Dazhe
2014-01-01
The quantitative analysis of the airway tree is of critical importance in the CT-based diagnosis and treatment of common pulmonary diseases. Extraction of the airway centerline is a precursor to identifying the airway's hierarchical structure, measuring geometrical parameters, and guiding visualized detection. Traditional methods suffer from extra branches and circles due to incomplete segmentation results, which lead to false analyses in applications. This paper proposes an automatic and robust centerline extraction method for the airway tree. First, the centerline is located with a topological thinning method; border voxels are iteratively deleted in a symmetric fashion to preserve topological and geometrical properties. Second, the structural information is generated using graph-theoretic analysis. Then inaccurate circles are removed with a distance weighting strategy, and extra branches are pruned according to clinical anatomic knowledge. After these phases, a centerline free of false appendices is obtained. Experimental results show that the proposed method identifies more than 96% of branches, keeps consistency across different cases, and achieves a superior circle-free structure and centrality.
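The branch-pruning idea can be illustrated with a small graph-based sketch: represent the thinned centerline as a graph whose edges carry lengths, and iteratively remove terminal spurs shorter than an anatomically motivated threshold. The graph, node names, and threshold below are hypothetical, and this is only a simplified stand-in for the paper's clinical-anatomy rules and distance-weighted circle removal:

    import networkx as nx

    def prune_short_terminal_branches(tree, min_length):
        # tree: undirected graph whose edges carry a 'length' attribute (mm).
        # Iterates because removing one spur can expose another.
        g = tree.copy()
        changed = True
        while changed:
            changed = False
            for leaf in [n for n in g.nodes if g.degree(n) == 1]:
                (neighbor,) = g.neighbors(leaf)
                # Only prune short spurs hanging off a branching point.
                if g.edges[leaf, neighbor]["length"] < min_length and g.degree(neighbor) > 2:
                    g.remove_node(leaf)
                    changed = True
        return g

    # Hypothetical mini airway graph: trachea -> two main bronchi, plus a spur
    g = nx.Graph()
    g.add_edge("trachea", "carina", length=40.0)
    g.add_edge("carina", "left_main", length=30.0)
    g.add_edge("carina", "right_main", length=25.0)
    g.add_edge("carina", "spur", length=1.5)
    pruned = prune_short_terminal_branches(g, min_length=3.0)
    print(sorted(pruned.nodes))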
Oshima, Shinji; Enjuji, Takako; Negishi, Akio; Akimoto, Hayato; Ohara, Kousuke; Okita, Mitsuyoshi; Numajiri, Sachihiko; Inoue, Naoko; Ohshima, Shigeru; Terao, Akira; Kobayashi, Daisuke
2017-09-01
In order to avoid adverse drug reactions (ADRs), pharmacists are reconstructing ADR-related information based on various types of data gathered from patients, and then providing this information to patients. Among the data provided to patients is the time-to-onset of ADRs after starting the medication (i.e., ADR onset timing information). However, a quantitative evaluation of the effect of onset timing information offered by pharmacists on the probability of ADRs occurring in patients receiving this information has not been reported to date. In this study, we extracted 40 ADR-drug combinations from the data in the Japanese Adverse Drug Event Report database. By applying Bayes' theorem to these combinations, we quantitatively evaluated the usefulness of onset timing information as an ADR detection predictor. As a result, when information on days after taking medication was added, 54 ADR-drug combinations showed a likelihood ratio (LR) in excess of 2. In particular, when considering the ADR-drug combination of anaphylactic shock with levofloxacin or loxoprofen, the number of days elapsed between start of medication and the onset of the ADR was 0, which corresponded to increased likelihood ratios (LRs) of 138.7301 or 58.4516, respectively. When information from 1-7 d after starting medication was added to the combination of liver disorder and acetaminophen, the LR was 11.1775. The results of this study indicate the clinical usefulness of offering information on ADR onset timing.
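The Bayesian update behind the likelihood ratios reported above can be written in odds form: posterior odds = prior odds × LR. A minimal sketch, using a hypothetical 1% prior and the abstract's LR of 11.1775 for liver disorder with acetaminophen and onset within 1-7 days of starting the medication:

    def posterior_probability(prior_prob, likelihood_ratio):
        # Bayes' theorem in odds form: posterior odds = prior odds * LR.
        prior_odds = prior_prob / (1.0 - prior_prob)
        post_odds = prior_odds * likelihood_ratio
        return post_odds / (1.0 + post_odds)

    # Hypothetical 1% prior that a given symptom is this particular ADR,
    # updated with the reported LR of 11.1775.
    print(round(posterior_probability(0.01, 11.1775), 3))

Under these assumptions the posterior probability rises from 1% to roughly 10%, illustrating how onset-timing information can sharpen ADR detection.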
DOE Office of Scientific and Technical Information (OSTI.GOV)
Benker, Dennis; Delmau, Laetitia Helene; Dryman, Joshua Cory
This report presents the studies carried out to demonstrate the possibility of quantitatively extracting trivalent actinides and lanthanides from highly acidic solutions using a neutral ligand-based solvent extraction system. These studies stemmed from the perceived advantage of such systems over cation-exchange-based solvent extraction systems that require an extensive feed adjustment to make a low-acid feed. The targeted feed solutions are highly acidic aqueous phases obtained after the dissolution of curium targets during a californium (Cf) campaign. Results obtained with actual Cf campaign solutions, but highly diluted to be manageable in a glove box, are presented, followed by results of tests run in the hot cells with Cf campaign rework solutions. It was demonstrated that a solvent extraction system based on the tetraoctyl diglycolamide molecule is capable of quantitatively extracting trivalent actinides from highly acidic solutions. This system was validated using actual feeds from a Cf campaign.
Tess, D A; Cole, R O; Toler, S M
1995-12-15
A simple and highly sensitive reversed-phase fluorimetric HPLC method for the quantitation of droloxifene from rat, monkey, and human plasma as well as human serum is described. This assay employs solid-phase extraction and has a dynamic range of 25 to 10,000 pg/ml. Sample extraction (efficiencies > 86%) was accomplished using a benzenesulfonic acid (SCX) column with water and methanol rinses. Droloxifene and internal standard were eluted with 1 ml of 3.5% (v/v) ammonium hydroxide (30%) in methanol. Samples were quantitated using post-column UV-photochemical cyclization coupled with fluorimetric detection with excitation and emission wavelengths of 260 nm and 375 nm, respectively. Relative ease of sample extraction and short run times allow for the analysis of approximately 100 samples per day.
NASA Astrophysics Data System (ADS)
Soilán, M.; Riveiro, B.; Sánchez-Rodríguez, A.; González-deSantos, L. M.
2018-05-01
During the last few years, there has been a huge methodological development regarding the automatic processing of 3D point cloud data acquired by both terrestrial and aerial mobile mapping systems, motivated by the improvement of surveying technologies and hardware performance. This paper presents a methodology that first extracts geometric and semantic information regarding the road markings within the surveyed area from Mobile Laser Scanning (MLS) data, and then employs it to isolate street areas where pedestrian crossings are found and where, therefore, pedestrians are more likely to cross the road. Different safety-related features can then be extracted in order to offer information about the safety adequacy of the pedestrian crossing, which can be displayed in a Geographical Information System (GIS) layer. These features are defined in four different processing modules: accessibility analysis, traffic light classification, traffic sign classification, and visibility analysis. The validation of the proposed methodology has been carried out in two different cities in the northwest of Spain, obtaining both quantitative and qualitative results for pedestrian crossing classification and for each processing module of the safety assessment of pedestrian crossing environments.
Rapid quantitation of neuraminidase inhibitor drug resistance in influenza virus quasispecies.
Lackenby, Angie; Democratis, Jane; Siqueira, Marilda M; Zambon, Maria C
2008-01-01
Emerging resistance of influenza viruses to neuraminidase inhibitors is a concern, both in surveillance of global circulating strains and in treatment of individual patients. Current methodologies to detect resistance rely on the use of cultured virus, thus taking time to complete or lacking the sensitivity to detect mutations in viral quasispecies. Methodology for rapid detection of clinically meaningful resistance is needed to assist individual patient management and to track the transmission of resistant viruses in the community. We have developed a pyrosequencing methodology to detect and quantitate influenza neuraminidase inhibitor resistance mutations in cultured virus and directly in clinical material. Our assays target polymorphisms associated with drug resistance in the neuraminidase genes of human influenza A H1N1 as well as human and avian H5N1 viruses. Quantitation can be achieved using viral RNA extracted directly from respiratory or tissue samples, thus eliminating the need for virus culture and allowing the assay of highly pathogenic viruses such as H5N1 without high containment laboratory facilities. Antiviral-resistant quasispecies are detected and quantitated accurately when present in the total virus population at levels as low as 10%. Pyrosequencing is a real-time assay; therefore, results can be obtained within a clinically relevant timeframe and provide information capable of informing individual patient or outbreak management. Pyrosequencing is ideally suited for early identification of emerging antiviral resistance in human and avian influenza infection and is a useful tool for laboratory surveillance and pandemic preparedness.
Hildon, Zoe; Allwood, Dominique; Black, Nick
2012-02-01
Displays comparing the performance of healthcare providers are largely based on common sense. To review the literature on the impact of compositional format and content of quantitative data displays on people's comprehension, choice and preference. Ovid databases, expert recommendations and snowballing techniques. Evaluations of the impact of different formats (bar charts, tables and pictographs) and content (ordering, explanatory visual cues, etc.) of quantitative data displays meeting defined quality criteria. Data extraction: type of decision; decision-making domains; audiences; formats; content; methodology; findings. Most of the 30 studies used quantitative (n = 26) methods with patients or public groups (n = 28) rather than with professionals (n = 2). Bar charts were the most frequent format, followed by pictographs and tables. As regards format, tables and pictographs appeared better understood than bar charts despite the latter being preferred. Although accessible to less numerate and older populations, pictographs tended to lead to more risk avoidance. Tables appeared accessible to all. Aspects of content enhancing the impact of data displays included giving visual explanatory cues and contextual information while still attempting simplicity ('less is more'); ordering data; consistency. Icons rather than numbers were more user-friendly but could lead to over-estimation of risk. Uncertainty was not widely understood, nor well represented. Though heterogeneous and limited in scope, there is sufficient research evidence to inform the presentation of quantitative data that compares the performance of healthcare providers. The impact of new formats, such as funnel plots, needs to be evaluated.
Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook
2014-09-15
Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote monitoring health-services. However, during the WCE process, the large amount of captured video data demands a significant deal of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing task, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.
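For the redundancy-elimination step, a minimal sketch of the Jeffrey divergence between color histograms of consecutive frames; one common formulation, using the mean distribution m = (p + q)/2, is shown here, and the histogram values and the redundancy threshold are illustrative rather than the framework's actual parameters:

    import numpy as np

    def jeffrey_divergence(p, q, eps=1e-12):
        # d(P, Q) = sum_i [p_i log(p_i/m_i) + q_i log(q_i/m_i)] with m = (p + q)/2;
        # a small eps avoids log(0) for empty histogram bins.
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p, q = p / p.sum(), q / q.sum()
        m = (p + q) / 2.0
        return float(np.sum(p * np.log(p / m) + q * np.log(q / m)))

    # Illustrative 4-bin color histograms of two consecutive frames
    frame_a = [0.40, 0.30, 0.20, 0.10]
    frame_b = [0.38, 0.32, 0.19, 0.11]
    redundant = jeffrey_divergence(frame_a, frame_b) < 0.01   # illustrative threshold
    print(redundant)

A frame whose divergence from its predecessor falls below the threshold would be treated as redundant and dropped before the non-informative-frame classifier is applied.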
Simultaneous extraction and quantitation of several bioactive amines in cheese and chocolate.
Baker, G B; Wong, J T; Coutts, R T; Pasutto, F M
1987-04-17
A method is described for simultaneous extraction and quantitation of the amines 2-phenylethylamine, tele-methylhistamine, histamine, tryptamine, m- and p-tyramine, 3-methoxytyramine, 5-hydroxytryptamine, cadaverine, putrescine, spermidine and spermine. This method is based on extractive derivatization of the amines with a perfluoroacylating agent, pentafluorobenzoyl chloride, under basic aqueous conditions. Analysis was done on a gas chromatograph equipped with an electron-capture detector and a capillary column system. The procedure is relatively rapid and provides derivatives with good chromatographic properties. Its application to analysis of the above amines in cheese and chocolate products is described.
Fan, Lihua; Shuai, Jiangbing; Zeng, Ruoxue; Mo, Hongfei; Wang, Suhua; Zhang, Xiaofeng; He, Yongqiang
2017-12-01
A genome fragment enrichment (GFE) method was applied to identify host-specific bacterial genetic markers that differ among different fecal metagenomes. To enrich for swine-specific DNA fragments, a swine fecal DNA composite (n = 34) was challenged against a DNA composite consisting of cow, human, goat, sheep, chicken, duck and goose fecal DNA extracts (n = 83). Bioinformatic analyses of 384 non-redundant swine-enriched metagenomic sequences indicated a preponderance of Bacteroidales-like regions predicted to encode metabolism-associated functions, cellular processes, and information storage and processing. After being challenged against fecal DNA extracted from different animal sources, four sequences from the clone libraries, targeting two Bacteroidales-like (genes 1-38 and 3-53), one Clostridia-like (gene 2-109) and one Bacilli-like sequence (gene 2-95), showed high specificity to swine feces based on PCR analysis. Host-specificity and host-sensitivity analysis confirmed that oligonucleotide primers and probes capable of annealing to select Bacteroidales-like sequences (1-38 and 3-53) exhibited high specificity (>90%) in quantitative PCR assays with 71 fecal DNAs from non-target animal sources. The two assays also demonstrated broad distributions of the corresponding genetic markers (>94% positive) among 72 swine fecal samples. After evaluation with environmental water samples from different areas, swine-targeted assays based on the two Bacteroidales-like GFE sequences appear to be suitable quantitative tracing tools for swine fecal pollution. Copyright © 2017 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kobayashi, F.; Ozawa, N.; Hanai, J.
Twenty-one water-soluble acid dyes, including eleven azo, five triphenylmethane, four xanthene, and one naphthol derivatives, used at practical concentrations for food coloration, were quantitatively extracted from water and various carbonated beverages into a 0.1 M quinine-chloroform solution in the presence of 0.5 M boric acid by brief shaking. Quantitative extraction of these dyes was also accomplished by the 0.1 M quinine-chloroform solution made conveniently from chloroform, quinine hydrochloride, and sodium hydroxide added successively to water or beverages containing boric acid. Quinine acted as a countercation on the dyes having sulfonic and/or carboxylic acid group(s) to form chloroform-soluble ion-pair complexes. The diacidic base alkaloid interacted with each acid group of mono-, di-, tri-, and tetrasulfonic acid dyes approximately in the ratio 0.8-0.9 to 1. The dyes in the chloroform solution were quantitatively concentrated into a small volume of sodium hydroxide solution, also by brief shaking. The convenient quinine-chloroform method was applicable to the quantitative extraction from carbonated beverages of a mixture of 12 dyes that are all currently used for food coloration. A high-pressure liquid chromatographic method is also presented for the systematic separation and determination of these 12 dyes following their concentration into the aqueous alkaline solution. The chromatogram was monitored by double-wavelength absorptiometry in the visible and ultraviolet regions.
QTLTableMiner++: semantic mining of QTL tables in scientific articles.
Singh, Gurnoor; Kuzniar, Arnold; van Mulligen, Erik M; Gavai, Anand; Bachem, Christian W; Visser, Richard G F; Finkers, Richard
2018-05-25
A quantitative trait locus (QTL) is a genomic region that correlates with a phenotype. Most of the experimental information about QTL mapping studies is described in tables of scientific publications. Traditional text mining techniques aim to extract information from unstructured text rather than from tables. We present QTLTableMiner ++ (QTM), a table mining tool that extracts and semantically annotates QTL information buried in (heterogeneous) tables of plant science literature. QTM is a command line tool written in the Java programming language. This tool takes scientific articles from the Europe PMC repository as input, extracts QTL tables using keyword matching and ontology-based concept identification. The tables are further normalized using rules derived from table properties such as captions, column headers and table footers. Furthermore, table columns are classified into three categories namely column descriptors, properties and values based on column headers and data types of cell entries. Abbreviations found in the tables are expanded using the Schwartz and Hearst algorithm. Finally, the content of QTL tables is semantically enriched with domain-specific ontologies (e.g. Crop Ontology, Plant Ontology and Trait Ontology) using the Apache Solr search platform and the results are stored in a relational database and a text file. The performance of the QTM tool was assessed by precision and recall based on the information retrieved from two manually annotated corpora of open access articles, i.e. QTL mapping studies in tomato (Solanum lycopersicum) and in potato (S. tuberosum). In summary, QTM detected QTL statements in tomato with 74.53% precision and 92.56% recall and in potato with 82.82% precision and 98.94% recall. QTM is a unique tool that aids in providing QTL information in machine-readable and semantically interoperable formats.
Integrated work-flow for quantitative metabolome profiling of plants, Peucedani Radix as a case.
Song, Yuelin; Song, Qingqing; Liu, Yao; Li, Jun; Wan, Jian-Bo; Wang, Yitao; Jiang, Yong; Tu, Pengfei
2017-02-08
Universal acquisition of reliable information regarding the qualitative and quantitative properties of complicated matrices is the premise for the success of a metabolomics study. Liquid chromatography-mass spectrometry (LC-MS) now serves as a workhorse for metabolomics; however, LC-MS-based non-targeted metabolomics suffers from some shortcomings, even though some cutting-edge techniques have been introduced. To tackle, to some extent, the drawbacks of conventional approaches, such as redundant information, detector saturation, low sensitivity, and an inconstant number of signals among different runs, a novel and flexible workflow consisting of three progressive steps is proposed here to profile the quantitative metabolome of plants in depth. The roots of Peucedanum praeruptorum Dunn (Peucedani Radix, PR), which are rich in various coumarin isomers, were employed as a case study to verify the applicability. First, offline two-dimensional LC-MS was utilized for in-depth detection of metabolites in a pooled PR extract, termed the universal metabolome standard (UMS). Second, mass fragmentation rules, notably those concerning angular-type pyranocoumarins that are the primary chemical homologues in PR, and available databases were integrated for signal assignment and structural annotation. Third, the optimum collision energy (OCE) as well as the ion transition for multiple reaction monitoring (MRM) measurement were optimized online with a reference-compound-free strategy for each annotated component, and large-scale relative quantification of all annotated components was accomplished by plotting calibration curves from serial dilutions of the UMS. It is worthwhile to highlight that the potential of OCE for isomer discrimination was described and that the linearity ranges of the primary ingredients were extended by suppressing their responses. The integrated workflow is expected to qualify as a promising pipeline for clarifying the quantitative metabolome of plants because it not only holistically provides qualitative information but also straightforwardly generates an accurate quantitative dataset. Copyright © 2016 Elsevier B.V. All rights reserved.
Mouly, P P; Gaydou, E M; Corsetti, J
1999-03-01
The carotenoid pigment profiles of authentic pure orange juices from Spain and Florida and an industrial paprika (Capsicum annuum) extract used for food coloring were obtained using reversed-phase liquid chromatography with a C18 packed column and an acetone/methanol/water eluent system. The procedure involving the carotenoid extraction is described. Both retention times and spectral properties using photodiode array detection for characterization of the major carotenoids at 430 and 519 nm are given. The influence of external addition of tangerine juice and/or paprika extract on orange juice color is described using the U.S. Department of Agriculture scale and adulterated orange juice. The procedure for quantitation of externally added paprika extract to orange juice is investigated, and the limit of quantitation, coefficient of variation, and recoveries are determined.
Yankson, Kweku K.; Steck, Todd R.
2009-01-01
We present a simple strategy for isolating and accurately enumerating target DNA from high-clay-content soils: desorption with buffers, an optional magnetic capture hybridization step, and quantitation via real-time PCR. With the developed technique, μg quantities of DNA were extracted from mg samples of pure kaolinite and a field clay soil. PMID:19633108
Sharma, Dharmendar Kumar; Irfanullah, Mir; Basu, Santanu Kumar; Madhu, Sheri; De, Suman; Jadhav, Sameer; Ravikanth, Mangalampalli; Chowdhury, Arindam
2017-01-18
While fluorescence microscopy has become an essential tool among chemists and biologists for the detection of various analytes within cellular environments, the non-uniform spatial distribution of sensors within cells often restricts extraction of reliable information on the relative abundance of analytes in different subcellular regions. As an alternative to existing sensing methodologies such as ratiometric or FRET imaging, where the relative proportion of analyte with respect to the sensor can be obtained within cells, we propose a methodology using spectrally-resolved fluorescence microscopy, via which both the relative abundance of the sensor and its relative proportion with respect to the analyte can be simultaneously extracted for local subcellular regions. This method is exemplified using a BODIPY sensor, capable of detecting mercury ions within cellular environments, characterized by a spectral blue-shift and concurrent enhancement of emission intensity. Spectral emission envelopes collected from sub-microscopic regions allowed us to compare the shift in transition energies as well as the integrated emission intensities within various intracellular regions. Construction of a 2D scatter plot using spectral shifts and emission intensities, which depend on the relative amount of analyte with respect to sensor and on the approximate local amounts of the probe, respectively, enabled qualitative extraction of the relative abundance of analyte in various local regions within a single cell as well as among different cells. Although the comparisons remain semi-quantitative, this approach involving analysis of multiple spectral parameters opens up an alternative way to extract the spatial distribution of analyte in heterogeneous systems. The proposed method would be especially relevant for fluorescent probes that undergo a relatively nominal shift in transition energies compared to their emission bandwidths, which often restricts their usage for quantitative ratiometric imaging in cellular media due to strong cross-talk between energetically separated detection channels.
Chang, Yan-Li; Shen, Meng; Ren, Xue-Yang; He, Ting; Wang, Le; Fan, Shu-Sheng; Wang, Xiu-Huan; Li, Xiao; Wang, Xiao-Ping; Chen, Xiao-Yi; Sui, Hong; She, Gai-Mei
2018-04-19
Thymus quinquecostatus Celak is a species of thyme found in China that has long been used as a condiment and herbal medicine. To establish the quality evaluation of T. quinquecostatus, response surface methodology (RSM) based on its 2,2-diphenyl-1-picrylhydrazyl (DPPH) radical scavenging activity was introduced to optimize the extraction conditions, and the main indicator components were identified with a UPLC-LTQ-Orbitrap MSn method. The optimum ethanol concentration, solid-liquid ratio, and extraction time were 42.32%, 1:17.51, and 1.8 h, respectively. Thirty-five components, comprising 12 phenolic acids and 23 flavonoids, were unambiguously or tentatively identified in both positive and negative ion modes for the comprehensive analysis of the optimized antioxidative fraction. A simple, reliable, and sensitive HPLC method was developed for the multi-component quantitative analysis of T. quinquecostatus using six characteristic and principal phenolic acids and flavonoids as reference compounds. Furthermore, chemometric methods (principal component analysis (PCA) and hierarchical clustering analysis (HCA)) indicated that the growing area and harvest time of this herb are closely related to its quality. This study provides full-scale qualitative and quantitative information for the quality evaluation of T. quinquecostatus, offers a valuable reference for the further study and development of this herb, and lays the foundation for further study of its pharmacological efficacy.
Radiation reaction studies in an all-optical set-up: experimental limitations
NASA Astrophysics Data System (ADS)
Samarin, G. M.; Zepf, M.; Sarri, G.
2018-06-01
The recent development of ultra-high intensity laser facilities is finally opening up the possibility of studying high-field quantum electrodynamics in the laboratory. Arguably, one of the central phenomena in this area is that of quantum radiation reaction experienced by an ultra-relativistic electron beam as it propagates through the tight focus of a laser beam. In this paper, we discuss the major experimental challenges that are to be faced in order to extract meaningful and quantitative information from this class of experiments using existing and near-term laser facilities.
Use of Laboratory Data to Model Interstellar Chemistry
NASA Technical Reports Server (NTRS)
Vidali, Gianfranco; Roser, J. E.; Manico, G.; Pirronello, V.
2006-01-01
Our laboratory research program concerns the formation of molecules on dust grain analogues under conditions mimicking interstellar medium environments. Using surface science techniques, over the last ten years we have investigated the formation of molecular hydrogen and other molecules on different types of dust grain analogues. We analyzed the results to extract quantitative information on the processes of molecule formation on, and ejection from, dust grain analogues. The usefulness of these data lies in the fact that they have been employed by theoreticians in models of the chemical evolution of ISM environments.
Bellon, L; Maloney, L; Zinnen, S P; Sandberg, J A; Johnson, K E
2000-08-01
Versatile bioanalytical assays to detect chemically stabilized hammerhead ribozyme and putative ribozyme metabolites from plasma are described. The extraction protocols presented are based on serial solid-phase extractions performed on a 96-well plate format and are compatible with either IEX-HPLC or CGE back-end analysis. A validation of both assays confirmed that both the HPLC and the CGE methods possess the required linearity, accuracy, and precision to accurately measure concentrations of hammerhead ribozyme extracted from plasma. These methods should be of general use to detect and quantitate ribozymes from other biological fluids such as serum and urine. Copyright 2000 Academic Press.
Automated quantitative assessment of proteins' biological function in protein knowledge bases.
Mayr, Gabriele; Lepperdinger, Günter; Lackner, Peter
2008-01-01
Primary protein sequence data are archived in databases together with information regarding corresponding biological functions. In this respect, UniProt/Swiss-Prot is currently the most comprehensive collection, and it is routinely cross-examined when trying to unravel the biological role of hypothetical proteins. Bioscientists frequently extract single entries and evaluate them further on a subjective basis. In the absence of a standardized procedure for scoring the existing knowledge regarding individual proteins, we report here a computer-assisted method, which we applied to score the present knowledge about any given Swiss-Prot entry. Applying this quantitative score allows the comparison of proteins with respect to their sequence while highlighting the comprehensiveness of the functional data. pfs analysis may also be applied for quality control of individual entries or for database management in order to rank entry listings.
A Charge Coupled Device Imaging System For Ophthalmology
NASA Astrophysics Data System (ADS)
Rowe, R. Wanda; Packer, Samuel; Rosen, James; Bizais, Yves
1984-06-01
A digital camera system has been constructed for obtaining reflectance images of the fundus of the eye with monochromatic light. Images at wavelengths in the visible and near infrared regions of the spectrum are recorded by a charge-coupled device array and transferred to a computer. A variety of image processing operations are performed to restore the pictures, correct for distortions in the image formation process, and extract new and diagnostically useful information. The steps involved in calibrating the system to permit quantitative measurement of fundus reflectance are discussed. Three clinically important applications of such a quantitative system are addressed: the characterization of changes in the optic nerve arising from glaucoma, the diagnosis of choroidal melanoma through spectral signatures, and the early detection and improved management of diabetic retinopathy by measurement of retinal tissue oxygen saturation.
Kim, Eun-mi; Lee, Ju-seon; Choi, Sang-kil; Lim, Mi-ae; Chung, Hee-sun
2008-01-30
Ketamine (KT) has been widely abused in recent years for its hallucinogenic effects and has also been misused as a "date-rape" drug. An analytical method using positive ion chemical ionization gas chromatography-mass spectrometry (PCI-GC-MS) with an automatic solid-phase extraction (SPE) apparatus was studied for the determination of KT and its major metabolite, norketamine (NK), in urine. Six ketamine-suspected urine samples were provided by the police. For the study of KT metabolism, KT was administered to SD rats i.p. at single doses of 5, 10 and 20 mg/kg, respectively, and urine samples were collected 24, 48 and 72 h after administration. For the detection of KT and NK, urine samples were extracted on an automatic SPE apparatus (RapidTrace, Zymark) with a mixed-mode cartridge, Drug-Clean (200 mg, Alltech). KT and NK were identified by PCI-GC-MS. m/z 238 (M+1) and 220 for KT, m/z 224 (M+1) and 207 for NK, and m/z 307 (M+1) for cocaine-d3 as internal standard were extracted from the full-scan mass spectrum, and the underlined ions were used for quantitation. Extracted calibration curves were linear from 50 to 1000 ng/mL for KT and NK, with correlation coefficients exceeding 0.99. The limit of detection (LOD) was 25 ng/mL for KT and NK. The limit of quantitation (LOQ) was 50 ng/mL for KT and NK. The recoveries of KT and NK at three different concentrations (86, 430 and 860 ng/mL) were 53.1 to 79.7% and 45.7 to 83.0%, respectively. The intra- and inter-day run precisions (CV) for KT and NK were less than 15.0%, and the accuracies (bias) for KT and NK were also less than 15% at the three concentration levels (86, 430 and 860 ng/mL). The method was applied to the six KT-suspected urine specimens and to the urine of the KT-administered rats, and the concentrations of KT and NK were determined. Dehydronorketamine (DHNK) was also confirmed in these urine samples; however, its concentration was not calculated. SPE is simple and needs less organic solvent than liquid-liquid extraction (LLE), and PCI-GC-MS can offer both qualitative and quantitative information for urinalysis of KT in forensic analysis.
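The internal-standard quantitation described above amounts to fitting the analyte/IS response ratio against calibrator concentrations and inverting the line for an unknown. A minimal sketch follows; the peak-area ratios and the unknown value are illustrative numbers, not data from the study.

# Minimal sketch of internal-standard calibration: fit (analyte area / IS area)
# against known concentrations, then invert the line for an unknown sample.
import numpy as np

cal_conc = np.array([50, 100, 250, 500, 1000])         # ng/mL calibrators
cal_ratio = np.array([0.11, 0.22, 0.54, 1.05, 2.08])    # analyte/IS peak-area ratios (illustrative)

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)    # linear calibration
r = np.corrcoef(cal_conc, cal_ratio)[0, 1]
assert r > 0.99, "calibration should be linear (r > 0.99 per the method)"

unknown_ratio = 0.47                                     # measured in a case sample (illustrative)
conc = (unknown_ratio - intercept) / slope
print(f"estimated concentration: {conc:.0f} ng/mL")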
Ferreira-Gonzalez, A; Yanovich, S; Langley, M R; Weymouth, L A; Wilkinson, D S; Garrett, C T
2000-01-01
Accurate and rapid diagnosis of CMV disease in immunocompromised individuals remains a challenge. Quantitative polymerase chain reaction (QPCR) methods for detection of CMV in peripheral blood mononuclear cells (PBMC) have improved the positive and negative predictive value of PCR for diagnosis of CMV disease. However, detection of CMV in plasma has demonstrated a lower negative predictive value as compared with PBMC. To enhance the sensitivity of the QPCR assay for plasma specimens, plasma samples were centrifuged before nucleic-acid extraction and the extracted DNA was resolubilized in a reduced volume. Optimization of the nucleic-acid extraction focused on decreasing or eliminating the presence of inhibitors in the pelleted plasma. Quantitation was achieved by co-amplifying an internal quantitative standard (IS) with the same primer sequences as CMV. PCR products were detected by hybridization in a 96-well microtiter plate coated with a CMV- or IS-specific probe. The precision of the QPCR assay for samples prepared from untreated and from pelleted plasma was then assessed. The coefficient of variation for both types of samples was almost identical, and the magnitude of the coefficients of variation was reduced by a factor of ten when the data were log transformed. Linearity of the QPCR assay extended over a 3.3-log range for both types of samples, but the range of linearity for pelleted plasma was 20 to 40,000 viral copies/ml (vc/ml) in contrast to 300 to 400,000 vc/ml for plasma. Thus, centrifugation of plasma before nucleic-acid extraction and resuspension of extracted CMV DNA in a reduced volume enhanced the analytical sensitivity approximately tenfold over the dynamic range of the assay. Copyright 2000 Wiley-Liss, Inc.
Quantitative IR microscopy and spectromics open the way to 3D digital pathology.
Bobroff, Vladimir; Chen, Hsiang-Hsin; Delugin, Maylis; Javerzat, Sophie; Petibois, Cyril
2017-04-01
Currently, only mass spectrometry (MS) microscopy provides a quantitative analysis of the chemical contents of tissue samples in 3D. Here, the reconstruction of a 3D quantitative chemical image of a biological tissue by FTIR spectro-microscopy is reported. An automated curve-fitting method was developed to extract all intense absorption bands constituting the IR spectra. This innovation benefits from three critical features: (1) the correction of raw IR spectra to make them quantitatively comparable; (2) automated and iterative data treatment allowing the IR-absorption spectrum to be transformed into an IR-band spectrum; (3) the reconstruction of a 3D IR-band matrix (x, y, z for voxel position and a 4th dimension holding all IR-band parameters). Spectromics, a new method for exploiting spectral data for tissue metadata reconstruction, is proposed to further translate the related chemical information in 3D into biochemical and anatomical tissue parameters. An example is given with the distribution of oxidative stress and the reconstruction of blood vessels in tissues. The requirements on IR microscopy instrumentation for proposing 3D digital histology as a routine clinical technology are briefly discussed. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Principles Underlying the Use of Multiple Informants’ Reports
De Los Reyes, Andres; Thomas, Sarah A.; Goodman, Kimberly L.; Kundey, Shannon M.A.
2014-01-01
Researchers use multiple informants’ reports to assess and examine behavior. However, informants’ reports commonly disagree. Informants’ reports often disagree in their perceived levels of a behavior (“low” vs. “elevated” mood), and examining multiple reports in a single study often results in inconsistent findings. Although researchers often espouse taking a multi-informant assessment approach, they frequently address informant discrepancies using techniques that treat discrepancies as measurement error. Yet, recent work indicates that researchers in a variety of fields often may be unable to justify treating informant discrepancies as measurement error. In this paper, the authors advance a framework (Operations Triad Model) outlining general principles for using and interpreting informants’ reports. Using the framework, researchers can test whether or not they can extract meaningful information about behavior from discrepancies among multiple informants’ reports. The authors provide supportive evidence for this framework and discuss its implications for hypothesis testing, study design, and quantitative review. PMID:23140332
Pungency Quantitation of Hot Pepper Sauces Using HPLC
NASA Astrophysics Data System (ADS)
Betts, Thomas A.
1999-02-01
A class of compounds known as capsaicinoids are responsible for the "heat" of hot peppers. To determine the pungency of a particular pepper or pepper product, one may quantify the capsaicinoids and relate those concentrations to the perceived heat. The format of the laboratory described here allows students to collectively develop an HPLC method for the quantitation of the two predominant capsaicinoids (capsaicin and dihydrocapsaicin) in hot-pepper products. Each small group of students investigated one of the following aspects of the method: detector wavelength, mobile-phase composition, extraction of capsaicinoids, calibration, and quantitation. The format of the lab forced students to communicate and cooperate to develop this method. The resulting HPLC method involves extraction with acetonitrile followed by solid-phase extraction clean-up, an isocratic 80:20 methanol-water mobile phase, a 4.6 mm by 25 cm C-18 column, and UV absorbance detection at 284 nm. The method developed by the students was then applied to the quantitation of capsaicinoids in a variety of hot pepper sauces. Editor's Note on Hazards in our April 2000 issue addresses the above.
Chen, Weiqi; Wang, Lifei; Van Berkel, Gary J; Kertesz, Vilmos; Gan, Jinping
2016-03-25
Herein, quantitation aspects of a fully automated autosampler/HPLC-MS/MS system applied for unattended droplet-based surface sampling of repaglinide-dosed thin tissue sections, with subsequent HPLC separation and mass spectrometric analysis of the parent drug and various drug metabolites, were studied. Major organs (brain, lung, liver, kidney and muscle) from whole-body thin tissue sections and corresponding organ homogenates prepared from repaglinide-dosed mice were sampled by surface sampling and by bulk extraction, respectively, and analyzed by HPLC-MS/MS. A semi-quantitative agreement between data obtained by surface sampling and data obtained by organ homogenate extraction was observed. Drug concentrations obtained by the two methods followed the same patterns for post-dose time points (0.25, 0.5, 1 and 2 h). Drug amounts determined in the specific tissues were typically higher when analyzing extracts from the organ homogenates. In addition, relative comparison of the levels of individual metabolites between the two analytical methods also revealed good semi-quantitative agreement. Copyright © 2015 Elsevier B.V. All rights reserved.
Schmitz-Afonso, I.; Loyo-Rosales, J.E.; de la Paz Aviles, M.; Rattner, B.A.; Rice, C.P.
2003-01-01
A quantitative method for the simultaneous determination of octylphenol, nonylphenol and the corresponding ethoxylates (1 to 5) in biota is presented. Extraction methods were developed for egg and fish matrices based on accelerated solvent extraction followed by a solid-phase extraction cleanup, using octadecylsilica or aminopropyl cartridges. Identification and quantitation were accomplished by liquid chromatography-electrospray tandem mass spectrometry (LC-MS-MS) and compared to the traditional liquid chromatography with fluorescence spectroscopy detection. LC-MS-MS provides the high sensitivity and specificity required for these complex matrices and an accurate quantitation with the use of 13C-labeled internal standards. Quantitation limits by LC-MS-MS ranged from 4 to 12 ng/g in eggs, and from 6 to 22 ng/g in fish samples. These methods were successfully applied to osprey eggs from the Chesapeake Bay and fish from the Great Lakes area. Total levels found in osprey egg samples were up to 18 ng/g wet mass and as high as 8.2 µg/g wet mass in the fish samples.
Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana
2014-09-01
Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant, allowing identification and quantification of the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for the qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the composition of C. officinalis extracts from different parts of the plant. HPTLC was found to be effective for qualitative analysis; however, HPLC was more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting, as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.
Knold, Lone; Reitov, Marianne; Mortensen, Anna Birthe; Hansen-Møller, Jens
2002-01-01
A rapid and quantitative method for the extraction, derivatization, and liquid chromatography with fluorescence detection of ivermectin (IVM) and doramectin (DOM) residues in porcine liver was developed and validated. IVM and DOM were extracted from the liver samples with acetonitrile, the supernatant was evaporated to dryness at 37 degrees C under nitrogen, and the residue was reconstituted in 1-methylimidazole solution. After 2 min at room temperature, IVM and DOM were converted to a fluorescent derivative and then separated on a Hypersil ODS column. The derivatives of IVM and DOM were detected and quantitated with high specificity by fluorescence (excitation: 365 nm, emission: 475 nm). Abamectin was used as an internal standard. The mean extraction efficiencies from fortified samples (15 ng/g) were 75% for IVM and 70% for DOM. The limit of detection was 0.8 ng/g for both IVM and DOM.
Absolute quantitation of intracellular metabolite concentrations by an isotope ratio-based approach
Bennett, Bryson D; Yuan, Jie; Kimball, Elizabeth H; Rabinowitz, Joshua D
2009-01-01
This protocol provides a method for quantitating the intracellular concentrations of endogenous metabolites in cultured cells. The cells are grown in stable isotope-labeled media to near-complete isotopic enrichment and then extracted in organic solvent containing unlabeled internal standards in known concentrations. The ratio of endogenous metabolite to internal standard in the extract is determined using mass spectrometry (MS). The product of this ratio and the unlabeled standard amount equals the amount of endogenous metabolite present in the cells. The cellular concentration of the metabolite can then be calculated on the basis of intracellular volume of the extracted cells. The protocol is exemplified using Escherichia coli and primary human fibroblasts fed uniformly with 13C-labeled carbon sources, with detection of 13C-assimilation by liquid chromatography–tandem MS. It enables absolute quantitation of several dozen metabolites over ~1 week of work. PMID:18714298
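The arithmetic of this isotope-ratio approach is simple enough to show directly: the endogenous (labeled) amount equals the measured labeled/unlabeled ratio times the spiked standard amount, and the concentration follows from the intracellular volume. A minimal sketch is given below; all numerical values are illustrative assumptions, not figures from the protocol.

# Sketch of the quantitation arithmetic described above. Numbers are illustrative.
ratio_labeled_to_standard = 1.8      # MS-measured ratio for one metabolite
standard_amount_nmol = 2.0           # unlabeled internal standard spiked into the extract
cells_extracted = 5.0e8              # number of cells in the extract (assumed)
volume_per_cell_L = 3.0e-15          # assumed intracellular volume per cell (L)

endogenous_nmol = ratio_labeled_to_standard * standard_amount_nmol
intracellular_volume_L = cells_extracted * volume_per_cell_L
conc_mM = endogenous_nmol * 1e-9 / intracellular_volume_L * 1e3   # mol/L -> mM

print(f"endogenous amount: {endogenous_nmol:.2f} nmol")
print(f"intracellular concentration: {conc_mM:.2f} mM")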
Harlé, Alexandre; Lion, Maëva; Husson, Marie; Dubois, Cindy; Merlin, Jean-Louis
2013-01-01
According to the French legislation on medical biology (January 16th, 2010), all biological laboratories must be accredited according to ISO 15189 for at least 50% of their activities before the end of 2016. The extraction of DNA from a sample of interest, whether solid or liquid, is one of the critical steps in molecular biology and specifically in somatic or constitutional genetics. The extracted DNA must meet a number of quality criteria and also be at a sufficient concentration to allow molecular biology assays such as the detection of somatic mutations. This paper describes the validation of DNA extraction and purification using column-based chromatographic extraction and quantitative determination by spectrophotometric assay, according to ISO 15189 and the accreditation technical guide in Human Health SH-GTA-04.
NASA Astrophysics Data System (ADS)
Jawak, Shridhar D.; Jadhav, Ajay; Luis, Alvarinho J.
2016-05-01
Supraglacial debris was mapped in the Schirmacher Oasis, east Antarctica, by using WorldView-2 (WV-2) high-resolution optical remote sensing data consisting of 8-band calibrated, Gram-Schmidt (GS)-sharpened and atmospherically corrected WV-2 imagery. This study is a preliminary attempt to develop an object-oriented rule set to extract supraglacial debris for the Antarctic region using 8-spectral-band imagery. Supraglacial debris was manually digitized from the satellite imagery to generate the ground reference data. Several trials were performed using a few existing traditional pixel-based classification techniques and color-texture-based object-oriented classification methods to extract supraglacial debris over a small domain of the study area. Multi-level segmentation and attributes such as scale, shape, size, and compactness, along with spectral information from the data, were used for developing the rule set. A quantitative error analysis was carried out against the manually digitized reference data to test the practicability of our approach relative to the traditional pixel-based methods. Our results indicate that the OBIA-based approach for extracting supraglacial debris (overall accuracy: 93%) performed better than all the traditional pixel-based methods (overall accuracy: 80-85%). The present attempt provides a comprehensively improved method for semiautomatic feature extraction in the supraglacial environment and a new direction in cryospheric research.
Hexaacetato calix(6)arene as the novel extractant for palladium.
Mathew, V J; Khopkar, S M
1997-10-01
A novel method is proposed for the solvent extraction of palladium. A supramolecular compound, hexaacetato calix(6)arene, at low concentration in toluene quantitatively extracts microgram concentrations of palladium at pH 7.5. Palladium can be stripped from the organic phase with 2 M nitric acid and determined spectrophotometrically as its stannous chloride complex at 635 nm. The probable composition of the extracted species is Pd(HR)(2)Cl. An extractant concentration as low as 1 × 10⁻³ M is adequate for quantitative extraction. Toluene was the best diluent. With nitric or perchloric acid (1.5-3 M) the stripping was complete. Palladium was separated in large ratios from alkali and alkaline earth metals (1:50). The main group elements were tolerated in higher ratios (1:25), but ions such as zinc, cadmium, iron, nickel, platinum, thorium, vanadium and molybdenum were tolerated only at low concentrations (1:1). The ions showing strong interference were copper and chromium. The relative standard deviation is ±1.1%.
Biological and analytical characterization of two extracts from Valeriana officinalis.
Circosta, Clara; De Pasquale, Rita; Samperi, Stefania; Pino, Annalisa; Occhiuto, Francesco
2007-06-13
The anticoronaryspastic and antibronchospastic activities of ethanolic and aqueous extracts of Valeriana officinalis L. roots were investigated in anaesthetized guinea-pigs and the results were correlated with the qualitative/quantitative chemical composition of the extracts in order to account for some of the common uses of this plant. The protective effects of orally administered ethanolic and aqueous extracts (50, 100 and 200 mg/kg) were evaluated against pitressin-induced coronary spasm and pressor response in guinea-pigs and were compared with those of nifedipine. Furthermore, the protective effects against histamine-induced and Oleaceae antigen challenge-induced bronchospasm were evaluated. Finally, the two valerian extracts were analytically characterized by qualitative and quantitative chromatographic analysis. The results showed that the two valeriana extracts possessed significant anticoronaryspastic, antihypertensive and antibronchospastic properties. These were similar to those exhibited by nifedipine and are due to the structural features of the active principles they contain. This study justifies the traditional use of this plant in the treatment of some respiratory and cardiovascular disorders.
El-Rami, Fadi; Nelson, Kristina; Xu, Ping
2017-01-01
Streptococcus sanguinis is a commensal and early colonizer of the oral cavity as well as an opportunistic pathogen in infectious endocarditis. Extracting the soluble proteome of this bacterium provides deep insights into the dynamic physiological changes under different growth and stress conditions, thus defining "proteomic signatures" as targets for therapeutic intervention. In this protocol, we describe an experimentally verified approach to extract the maximal amount of cytoplasmic protein from the Streptococcus sanguinis SK36 strain. A combination of procedures was adopted that broke the thick cell wall barrier and minimized denaturation of the intracellular proteome, using optimized buffers and a sonication step. The extracted proteome was quantitated using the Pierce BCA Protein Quantitation assay, and protein bands were macroscopically assessed by Coomassie Blue staining. Finally, high-resolution detection of the extracted proteins was conducted on a Synapt G2Si mass spectrometer, followed by label-free relative quantification via Progenesis QI. In conclusion, this pipeline for proteomic extraction and analysis of soluble proteins provides a fundamental tool for deciphering the biological complexity of Streptococcus sanguinis. PMID:29152022
Use of keyword hierarchies to interpret gene expression patterns.
Masys, D R; Welsh, J B; Lynn Fink, J; Gribskov, M; Klacansky, I; Corbeil, J
2001-04-01
High-density microarray technology permits the quantitative and simultaneous monitoring of thousands of genes. The interpretation challenge is to extract relevant information from this large amount of data. A growing variety of statistical analysis approaches are available to identify clusters of genes that share common expression characteristics, but provide no information regarding the biological similarities of genes within clusters. The published literature provides a potential source of information to assist in interpretation of clustering results. We describe a data mining method that uses indexing terms ('keywords') from the published literature linked to specific genes to present a view of the conceptual similarity of genes within a cluster or group of interest. The method takes advantage of the hierarchical nature of Medical Subject Headings used to index citations in the MEDLINE database, and the registry numbers applied to enzymes.
Semantic characteristics of NLP-extracted concepts in clinical notes vs. biomedical literature.
Wu, Stephen; Liu, Hongfang
2011-01-01
Natural language processing (NLP) has become crucial in unlocking information stored in free text, from both clinical notes and biomedical literature. Clinical notes convey clinical information related to individual patient health care, while biomedical literature communicates scientific findings. This work focuses on semantic characterization of texts at an enterprise scale, comparing and contrasting the two domains and their NLP approaches. We analyzed the empirical distributional characteristics of NLP-discovered named entities in Mayo Clinic clinical notes from 2001-2010, and in the 2011 MetaMapped Medline Baseline. We give qualitative and quantitative measures of domain similarity and point to the feasibility of transferring resources and techniques. An important by-product for this study is the development of a weighted ontology for each domain, which gives distributional semantic information that may be used to improve NLP applications.
Quantitative Analysis of the Cervical Texture by Ultrasound and Correlation with Gestational Age.
Baños, Núria; Perez-Moreno, Alvaro; Migliorelli, Federico; Triginer, Laura; Cobo, Teresa; Bonet-Carne, Elisenda; Gratacos, Eduard; Palacio, Montse
2017-01-01
Quantitative texture analysis has been proposed to extract robust features from the ultrasound image to detect subtle changes in the textures of the images. The aim of this study was to evaluate the feasibility of quantitative cervical texture analysis to assess cervical tissue changes throughout pregnancy. This was a cross-sectional study including singleton pregnancies between 20.0 and 41.6 weeks of gestation from women who delivered at term. Cervical length was measured, and a selected region of interest in the cervix was delineated. A model to predict gestational age based on features extracted from cervical images was developed following three steps: data splitting, feature transformation, and regression model computation. Seven hundred images, 30 per gestational week, were included for analysis. There was a strong correlation between the gestational age at which the images were obtained and the estimated gestational age by quantitative analysis of the cervical texture (R = 0.88). This study provides evidence that quantitative analysis of cervical texture can extract features from cervical ultrasound images which correlate with gestational age. Further research is needed to evaluate its applicability as a biomarker of the risk of spontaneous preterm birth, as well as its role in cervical assessment in other clinical situations in which cervical evaluation might be relevant. © 2016 S. Karger AG, Basel.
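The three-step modelling described above (data splitting, feature transformation, regression of gestational age on texture features) can be sketched with scikit-learn. This is a hedged illustration only: the texture features, the ridge regressor, and the synthetic data are assumptions, not the study's actual feature set or model.

# Sketch of a split / transform / regress pipeline; X and y are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(700, 20))               # 700 images x 20 texture features (placeholder)
y = rng.uniform(20.0, 41.6, size=700)        # gestational age in weeks (placeholder)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), Ridge(alpha=1.0))
model.fit(X_train, y_train)

# With real cervical texture features one would hope for a correlation near the
# reported R = 0.88; with random placeholders it will be near zero.
r = np.corrcoef(model.predict(X_test), y_test)[0, 1]
print(f"correlation between predicted and true gestational age: R = {r:.2f}")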
Model-driven meta-analyses for informing health care: a diabetes meta-analysis as an exemplar.
Brown, Sharon A; Becker, Betsy Jane; García, Alexandra A; Brown, Adama; Ramírez, Gilbert
2015-04-01
A relatively novel type of meta-analysis, a model-driven meta-analysis, involves the quantitative synthesis of descriptive, correlational data and is useful for identifying key predictors of health outcomes and informing clinical guidelines. Few such meta-analyses have been conducted and thus, large bodies of research remain unsynthesized and uninterpreted for application in health care. We describe the unique challenges of conducting a model-driven meta-analysis, focusing primarily on issues related to locating a sample of published and unpublished primary studies, extracting and verifying descriptive and correlational data, and conducting analyses. A current meta-analysis of the research on predictors of key health outcomes in diabetes is used to illustrate our main points. © The Author(s) 2014.
Plant leaf chlorophyll content retrieval based on a field imaging spectroscopy system.
Liu, Bo; Yue, Yue-Min; Li, Ru; Shen, Wen-Jing; Wang, Ke-Lin
2014-10-23
A field imaging spectrometer system (FISS; 380-870 nm and 344 bands) was designed for agriculture applications. In this study, FISS was used to gather spectral information from soybean leaves. The chlorophyll content was retrieved using a multiple linear regression (MLR), partial least squares (PLS) regression and support vector machine (SVM) regression. Our objective was to verify the performance of FISS in a quantitative spectral analysis through the estimation of chlorophyll content and to determine a proper quantitative spectral analysis method for processing FISS data. The results revealed that the derivative reflectance was a more sensitive indicator of chlorophyll content and could extract content information more efficiently than the spectral reflectance, which is more significant for FISS data compared to ASD (analytical spectral devices) data, reducing the corresponding RMSE (root mean squared error) by 3.3%-35.6%. Compared with the spectral features, the regression methods had smaller effects on the retrieval accuracy. A multivariate linear model could be the ideal model to retrieve chlorophyll information with a small number of significant wavelengths used. The smallest RMSE of the chlorophyll content retrieved using FISS data was 0.201 mg/g, a relative reduction of more than 30% compared with the RMSE based on a non-imaging ASD spectrometer, which represents a high estimation accuracy compared with the mean chlorophyll content of the sampled leaves (4.05 mg/g). Our study indicates that FISS could obtain both spectral and spatial detailed information of high quality. Its image-spectrum-in-one merit promotes the good performance of FISS in quantitative spectral analyses, and it can potentially be widely used in the agricultural sector.
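The retrieval step described above (derivative reflectance fed to a regression model such as PLS) can be sketched briefly. The spectra, chlorophyll values, and the choice of eight PLS components below are placeholders, not FISS data or the study's tuned settings.

# Hedged sketch: first-derivative reflectance + PLS regression for chlorophyll.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
wavelengths = np.linspace(380, 870, 344)              # FISS spectral range, 344 bands
reflectance = rng.random((120, wavelengths.size))     # 120 leaf spectra (placeholder)
chlorophyll = rng.uniform(2.0, 6.0, size=120)         # mg/g (placeholder)

# Derivative reflectance: spectral first derivative along the wavelength axis
derivative = np.gradient(reflectance, wavelengths, axis=1)

X_tr, X_te, y_tr, y_te = train_test_split(derivative, chlorophyll, test_size=0.3, random_state=1)
pls = PLSRegression(n_components=8).fit(X_tr, y_tr)
rmse = np.sqrt(np.mean((pls.predict(X_te).ravel() - y_te) ** 2))
print(f"RMSE on held-out spectra: {rmse:.3f} mg/g")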
Camp, Charles H.; Lee, Young Jong; Cicerone, Marcus T.
2017-01-01
Coherent anti-Stokes Raman scattering (CARS) microspectroscopy has demonstrated significant potential for biological and materials imaging. To date, however, the primary mechanism of disseminating CARS spectroscopic information is through pseudocolor imagery, which explicitly neglects a vast majority of the hyperspectral data. Furthermore, current paradigms in CARS spectral processing do not lend themselves to quantitative sample-to-sample comparability. The primary limitation stems from the need to accurately measure the so-called nonresonant background (NRB) that is used to extract the chemically-sensitive Raman information from the raw spectra. Measurement of the NRB on a pixel-by-pixel basis is a nontrivial task; thus, reference NRB from glass or water are typically utilized, resulting in error between the actual and estimated amplitude and phase. In this manuscript, we present a new methodology for extracting the Raman spectral features that significantly suppresses these errors through phase detrending and scaling. Classic methods of error-correction, such as baseline detrending, are demonstrated to be inaccurate and to simply mask the underlying errors. The theoretical justification is presented by re-developing the theory of phase retrieval via the Kramers-Kronig relation, and we demonstrate that these results are also applicable to maximum entropy method-based phase retrieval. This new error-correction approach is experimentally applied to glycerol spectra and tissue images, demonstrating marked consistency between spectra obtained using different NRB estimates, and between spectra obtained on different instruments. Additionally, in order to facilitate implementation of these approaches, we have made many of the tools described herein available free for download. PMID:28819335
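A rough sketch of Kramers-Kronig phase retrieval with a simple polynomial phase detrend, in the spirit of the error-correction approach described above, is given below. It is an illustrative sketch under stated assumptions, not the authors' released tools: sign and windowing conventions vary between implementations, and the reference NRB spectrum here is assumed (e.g. measured from glass).

# Sketch: KK phase retrieval for CARS with a polynomial phase detrend.
import numpy as np
from scipy.signal import hilbert

def retrieve_raman(i_cars, i_nrb, detrend_order=3):
    ratio = i_cars / i_nrb
    # KK relation: the phase is the Hilbert transform of half the log amplitude
    phase = np.imag(hilbert(0.5 * np.log(ratio)))
    # Error correction: remove a slowly varying phase baseline (detrending)
    x = np.arange(phase.size)
    baseline = np.polyval(np.polyfit(x, phase, detrend_order), x)
    corrected_phase = phase - baseline
    # Raman-like (imaginary) component of the retrieved complex spectrum
    return np.sqrt(ratio) * np.sin(corrected_phase)

# Example with a synthetic Lorentzian line on a constant nonresonant background
w = np.linspace(-50, 50, 1024)
chi_r = 1.0 / (10.0 - w - 5j)          # resonant susceptibility (arbitrary units)
i_cars = np.abs(1.0 + chi_r) ** 2       # |chi_nr + chi_r|^2 with chi_nr = 1
i_nrb = np.ones_like(w)                 # assumed reference NRB
raman = retrieve_raman(i_cars, i_nrb)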
Nadeau, Kyle P; Rice, Tyler B; Durkin, Anthony J; Tromberg, Bruce J
2015-11-01
We present a method for spatial frequency domain data acquisition utilizing a multifrequency synthesis and extraction (MSE) method and binary square wave projection patterns. By illuminating a sample with square wave patterns, multiple spatial frequency components are simultaneously attenuated and can be extracted to determine optical property and depth information. Additionally, binary patterns are projected faster than sinusoids typically used in spatial frequency domain imaging (SFDI), allowing for short (millisecond or less) camera exposure times, and data acquisition speeds an order of magnitude or more greater than conventional SFDI. In cases where sensitivity to superficial layers or scattering is important, the fundamental component from higher frequency square wave patterns can be used. When probing deeper layers, the fundamental and harmonic components from lower frequency square wave patterns can be used. We compared optical property and depth penetration results extracted using square waves to those obtained using sinusoidal patterns on an in vivo human forearm and absorbing tube phantom, respectively. Absorption and reduced scattering coefficient values agree with conventional SFDI to within 1% using both high frequency (fundamental) and low frequency (fundamental and harmonic) spatial frequencies. Depth penetration reflectance values also agree to within 1% of conventional SFDI.
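One simple way to see how several spatial-frequency components can be pulled from phase-stepped square-wave images is a per-pixel DFT over the phase steps, with the fundamental in bin 1 and the first odd harmonic in bin 3. This is a simplified illustration only; the actual MSE extraction used in the paper may differ in detail.

# Simplified per-pixel DFT demodulation of phase-stepped square-wave images.
# frames: (n_phases, H, W), pattern stepped through equally spaced spatial phases.
import numpy as np

def extract_harmonic_amplitudes(frames, harmonics=(1, 3)):
    n = frames.shape[0]
    assert n > 2 * max(harmonics), "need enough phase steps to avoid aliasing"
    spectrum = np.fft.fft(frames, axis=0) / n
    # Factor 2 recovers the single-sided AC amplitude at each harmonic
    return {k: 2.0 * np.abs(spectrum[k]) for k in harmonics}

# Synthetic example: 8 phase steps of a 0/1 square-wave pattern on a flat sample
phases = np.linspace(0, 2 * np.pi, 8, endpoint=False)
pattern = 0.5 + 0.5 * np.sign(np.sin(phases))[:, None, None]
frames = 0.2 + 0.6 * pattern * np.ones((8, 4, 4))       # fake reflectance images
ac = extract_harmonic_amplitudes(frames)
print({k: float(v.mean()) for k, v in ac.items()})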
Wei, Shih-Chun; Fan, Shen; Lien, Chia-Wen; Unnikrishnan, Binesh; Wang, Yi-Sheng; Chu, Han-Wei; Huang, Chih-Ching; Hsu, Pang-Hung; Chang, Huan-Tsung
2018-03-20
A graphene oxide (GO) nanosheet-modified N⁺-nylon membrane (GOM) has been prepared and used as an extraction and spray-ionization substrate for robust mass spectrometric detection of malachite green (MG), a highly toxic disinfectant, in liquid samples and fish meat. The GOM is prepared by self-deposition of a GO thin film onto an N⁺-nylon membrane, which has been used for efficient extraction of MG from aquaculture water samples or homogenized fish meat samples. Having a dissociation constant of 2.17 × 10⁻⁹ M⁻¹, the GOM allows extraction of approximately 98% of 100 nM MG. Coupling of the GOM-spray with an ion-trap mass spectrometer allows quantitation of MG in aquaculture freshwater and seawater samples down to nanomolar levels. Furthermore, the system possesses high selectivity and sensitivity for the quantitation of MG and its metabolite (leucomalachite green) in fish meat samples. With the easy extraction and efficient spray-ionization properties of the GOM, this membrane spray-mass spectrometry technique is relatively simple and fast in comparison to traditional LC-MS/MS methods for the quantitation of MG and its metabolite in aquaculture products. Copyright © 2017 Elsevier B.V. All rights reserved.
Quantitative analysis of perfumes in talcum powder by using headspace sorptive extraction.
Ng, Khim Hui; Heng, Audrey; Osborne, Murray
2012-03-01
Quantitative analysis of perfume dosage in talcum powder has been a challenge due to interference of the matrix and has so far not been widely reported. In this study, headspace sorptive extraction (HSSE) was validated as a solventless sample preparation method for the extraction and enrichment of perfume raw materials from talcum powder. Sample enrichment is performed on a thick film of poly(dimethylsiloxane) (PDMS) coated onto a magnetic stir bar incorporated in a glass jacket. Sampling is done by placing the PDMS stir bar in the headspace vial by using a holder. The stir bar is then thermally desorbed online with capillary gas chromatography-mass spectrometry. The HSSE method is based on the same principles as headspace solid-phase microextraction (HS-SPME). Nevertheless, a relatively larger amount of extracting phase is coated on the stir bar as compared to SPME. Sample amount and extraction time were optimized in this study. The method has shown good repeatability (with relative standard deviation no higher than 12.5%) and excellent linearity with correlation coefficients above 0.99 for all analytes. The method was also successfully applied in the quantitative analysis of talcum powder spiked with perfume at different dosages. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High Performance Liquid Chromatography of Vitamin A: A Quantitative Determination.
ERIC Educational Resources Information Center
Bohman, Ove; And Others
1982-01-01
Experimental procedures are provided for the quantitative determination of Vitamin A (retinol) in food products by analytical liquid chromatography. Standard addition and calibration curve extraction methods are outlined. (SK)
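As a small worked example of the standard-addition approach mentioned above: the signal is measured after spiking known amounts of retinol standard, a line is fitted, and the unknown concentration is the magnitude of the x-axis intercept (intercept/slope). All numbers below are illustrative, not values from the experiment.

# Worked standard-addition calculation; numbers are illustrative only.
import numpy as np

added = np.array([0.0, 0.5, 1.0, 1.5, 2.0])        # µg/mL retinol added
signal = np.array([0.42, 0.63, 0.84, 1.05, 1.26])   # measured peak areas

slope, intercept = np.polyfit(added, signal, 1)
unknown = intercept / slope
print(f"retinol in the unspiked sample: {unknown:.2f} µg/mL")   # ~1.0 here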
Fracture mechanics by three-dimensional crack-tip synchrotron X-ray microscopy
Withers, P. J.
2015-01-01
To better understand the relationship between the nucleation and growth of defects and the local stresses and phase changes that cause them, we need both imaging and stress mapping. Here, we explore how this can be achieved by bringing together synchrotron X-ray diffraction and tomographic imaging. Conventionally, these are undertaken on separate synchrotron beamlines; however, instruments capable of both imaging and diffraction are beginning to emerge, such as ID15 at the European Synchrotron Radiation Facility and JEEP at the Diamond Light Source. This review explores the concept of three-dimensional crack-tip X-ray microscopy, bringing them together to probe the crack-tip behaviour under realistic environmental and loading conditions and to extract quantitative fracture mechanics information about the local crack-tip environment. X-ray diffraction provides information about the crack-tip stress field, phase transformations, plastic zone and crack-face tractions and forces. Time-lapse CT, besides providing information about the three-dimensional nature of the crack and its local growth rate, can also provide information as to the activation of extrinsic toughening mechanisms such as crack deflection, crack-tip zone shielding, crack bridging and crack closure. It is shown how crack-tip microscopy allows a quantitative measure of the crack-tip driving force via the stress intensity factor or the crack-tip opening displacement. Finally, further opportunities for synchrotron X-ray microscopy are explored. PMID:25624521
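For the quantitative step mentioned above, a commonly used small-scale-yielding relation links a measured crack-tip opening displacement to the mode-I stress intensity factor; the constraint factor m (roughly 1 for plane stress, roughly 2 for plane strain) and the effective modulus convention below are textbook assumptions, not values taken from the review.

% Standard small-scale-yielding CTOD / stress-intensity relation
\[
  \delta_t \;=\; \frac{K_I^{2}}{m\,\sigma_y\,E'},
  \qquad
  E' = \begin{cases}
    E & \text{plane stress}\\[2pt]
    \dfrac{E}{1-\nu^{2}} & \text{plane strain}
  \end{cases}
  \quad\Longrightarrow\quad
  K_I = \sqrt{m\,\sigma_y\,E'\,\delta_t}.
\]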
Lerner, Eitan; Ploetz, Evelyn; Hohlbein, Johannes; Cordes, Thorben; Weiss, Shimon
2016-07-07
Single-molecule, protein-induced fluorescence enhancement (PIFE) serves as a molecular ruler at molecular distances inaccessible to other spectroscopic rulers such as Förster-type resonance energy transfer (FRET) or photoinduced electron transfer. In order to provide two simultaneous measurements of two distances on different molecular length scales for the analysis of macromolecular complexes, we and others recently combined measurements of PIFE and FRET (PIFE-FRET) on the single molecule level. PIFE relies on steric hindrance of the fluorophore Cy3, which is covalently attached to a biomolecule of interest, to rotate out of an excited-state trans isomer to the cis isomer through a 90° intermediate. In this work, we provide a theoretical framework that accounts for relevant photophysical and kinetic parameters of PIFE-FRET, show how this framework allows the extraction of the fold-decrease in isomerization mobility from experimental data, and show how these results provide information on changes in the accessible volume of Cy3. The utility of this model is then demonstrated for experimental results on PIFE-FRET measurement of different protein-DNA interactions. The proposed model and extracted parameters could serve as a benchmark to allow quantitative comparison of PIFE effects in different biological systems.
NASA Astrophysics Data System (ADS)
Gibergans-Báguena, J.; Llasat, M. C.
2007-12-01
The objective of this paper is to present an improvement in the quantitative forecasting of daily rainfall in Catalonia (NE Spain) using an analogues technique that takes into account synoptic and local data. The method is based on an analogues sorting technique: meteorological situations similar to the current one, in terms of the 700 and 1000 hPa geopotential fields at 00 UTC, are retrieved from a historical data file and complemented with some thermodynamic parameters. Thermodynamic analysis acts as a highly discriminating feature in situations where the synoptic pattern fails to explain either the atmospheric phenomena or the rainfall distribution. This is the case in heavy rainfall situations, where instability and high water vapor content are essential. To include these vertical thermodynamic features, information provided by the Palma de Mallorca radiosounding (Spain) was used. First, the most discriminating thermodynamic parameters for daily rainfall were selected, and the analogues technique was then applied to them. Finally, three analog forecasting methods were applied to quantitative daily rainfall forecasting in Catalonia: the first is based on analogies of geopotential fields at the synoptic scale; the second is based exclusively on similarity of local thermodynamic information; and the third combines the other two. The results show that this last method provides a substantial improvement in quantitative rainfall estimation.
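The analogues step can be illustrated with a minimal sketch: past days are ranked by a combined distance over standardized geopotential fields and thermodynamic indices, and the rainfall of the k closest analogues is averaged. This is a generic nearest-neighbour illustration, not the authors' implementation; the distance metric, the weighting, the number of analogues and the synthetic data are all assumptions.

```python
# Minimal sketch of an analogues forecast, assuming gridded geopotential fields
# and a few thermodynamic indices are available as NumPy arrays; the distance
# metric and number of analogues (k) are illustrative choices, not those of the paper.
import numpy as np

def analog_rainfall_forecast(current_fields, current_thermo,
                             hist_fields, hist_thermo, hist_rain, k=10, w=0.5):
    """Return a rainfall estimate as the mean over the k most similar past days.

    current_fields : (n_grid,) standardized 700/1000 hPa geopotential values
    current_thermo : (n_idx,)  standardized thermodynamic indices
    hist_fields    : (n_days, n_grid), hist_thermo : (n_days, n_idx)
    hist_rain      : (n_days,) observed daily rainfall
    """
    d_syn = np.linalg.norm(hist_fields - current_fields, axis=1)
    d_thermo = np.linalg.norm(hist_thermo - current_thermo, axis=1)
    # Combined similarity: weighted sum of synoptic and thermodynamic distances.
    dist = w * d_syn / d_syn.std() + (1 - w) * d_thermo / d_thermo.std()
    analogs = np.argsort(dist)[:k]
    return hist_rain[analogs].mean()

# Toy usage with random data standing in for the historical archive.
rng = np.random.default_rng(0)
hist_fields = rng.standard_normal((1000, 50))
hist_thermo = rng.standard_normal((1000, 4))
hist_rain = rng.gamma(2.0, 5.0, 1000)
print(analog_rainfall_forecast(hist_fields[0], hist_thermo[0],
                               hist_fields, hist_thermo, hist_rain))
```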
Liu, Xin; Yetik, Imam Samil
2011-06-01
Multiparametric magnetic resonance imaging (MRI) has been shown to have higher localization accuracy than transrectal ultrasound (TRUS) for prostate cancer. Therefore, automated cancer segmentation using multiparametric MRI is receiving growing interest, since MRI can provide both morphological and functional images of the tissue of interest. However, all automated methods to date are applicable to a single zone of the prostate, and the peripheral zone (PZ) of the prostate needs to be extracted manually, which is a tedious and time-consuming task. In this paper, our goal is to remove the need for PZ extraction by incorporating the spatial and geometric information of prostate tumors with multiparametric MRI derived from T2-weighted MRI, diffusion-weighted imaging (DWI) and dynamic contrast-enhanced MRI (DCE-MRI). To remove the need for PZ extraction, the authors propose a new method that incorporates the spatial information of the cancer. This is done by introducing a new feature called the location map. This feature is constructed by applying a nonlinear transformation to the spatial position coordinates of each pixel, so that the location map implicitly represents the geometric position of each pixel with respect to the prostate region. This new feature is then combined with the multiparametric MR images to perform tumor localization. The proposed algorithm was applied to multiparametric prostate MRI data obtained from 20 patients with biopsy-confirmed prostate cancer. The proposed method, which does not require PZ masks, was found to have a prostate cancer detection specificity of 0.84, a sensitivity of 0.80 and a Dice coefficient of 0.42. The authors found that fusing the spatial information allows the tumor outline to be obtained without PZ extraction with considerable success (performance better than or similar to methods that require manual PZ extraction). Our experimental results quantitatively demonstrate the effectiveness of the proposed method, showing that it has slightly better or similar localization performance compared with methods that require PZ masks.
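The abstract does not specify the nonlinear transformation used to build the location map, so the following is a purely hypothetical illustration: pixel coordinates are mapped to an exponentially decaying distance from the centroid of the prostate mask and stacked with the multiparametric intensities as an extra per-pixel feature.

```python
# Hypothetical sketch of a "location map" feature: the exact nonlinear transform
# is not specified in the abstract, so a radial, exponentially squashed distance
# from the prostate centroid is used purely for illustration.
import numpy as np

def location_map(prostate_mask, scale=20.0):
    """Map each pixel to a value in (0, 1] that decays with distance from the
    centroid of the prostate mask, encoding geometric position implicitly."""
    ys, xs = np.nonzero(prostate_mask)
    cy, cx = ys.mean(), xs.mean()
    yy, xx = np.mgrid[0:prostate_mask.shape[0], 0:prostate_mask.shape[1]]
    r = np.hypot(yy - cy, xx - cx)
    return np.exp(-r / scale)

# The map is then stacked with T2-, DWI- and DCE-derived images as an extra channel.
mask = np.zeros((64, 64), dtype=bool); mask[20:45, 18:50] = True
loc = location_map(mask)
t2 = np.random.rand(64, 64); adc = np.random.rand(64, 64)     # stand-in MR maps
features = np.stack([t2, adc, loc], axis=-1).reshape(-1, 3)   # per-pixel feature vectors
```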
NASA Astrophysics Data System (ADS)
Xu, Jing; Wang, Yu-Tian; Liu, Xiao-Fei
2015-04-01
Edible blend oil is a mixture of vegetable oils. A well-formulated blend oil can meet the daily human requirement for the two essential fatty acids and thus provide balanced nutrition. Each vegetable oil has a different composition, so the vegetable oil contents in an edible blend oil determine its nutritional components. A high-precision quantitative method to determine the vegetable oil contents in blend oil is therefore necessary to ensure balanced nutrition. The three-dimensional fluorescence technique offers high selectivity, high sensitivity, and high efficiency. Efficient extraction and full use of the information in three-dimensional fluorescence spectra will improve the accuracy of the measurement. A novel quantitative analysis method based on quasi-Monte-Carlo integration is proposed to improve the measurement sensitivity and reduce random error. The partial least squares method is used to solve the nonlinear equations and avoid the effect of multicollinearity. The recovery rates of blend oils mixed from peanut oil, soybean oil, and sunflower oil were calculated to verify the accuracy of the method; they are improved compared with the linear method commonly used for component concentration measurement.
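A rough sketch of the general idea, quasi-Monte-Carlo integration of excitation-emission matrices followed by partial least squares regression, is given below. The sub-band layout, Sobol point count and synthetic spectra are illustrative assumptions and do not reproduce the study's procedure.

```python
# Rough sketch, assuming each sample is an excitation-emission matrix (EEM) on a
# regular grid; sub-region boundaries, Sobol sample size and the synthetic data
# are illustrative, not the parameters of the study.
import numpy as np
from scipy.stats import qmc
from scipy.interpolate import RegularGridInterpolator
from sklearn.cross_decomposition import PLSRegression

ex = np.linspace(250, 450, 41)   # excitation wavelengths (nm)
em = np.linspace(280, 600, 65)   # emission wavelengths (nm)

def qmc_features(eem, n_points=256, n_sub=4):
    """Quasi-Monte-Carlo integrals of the EEM over a few excitation sub-bands."""
    interp = RegularGridInterpolator((ex, em), eem)
    sobol = qmc.Sobol(d=2, scramble=True, seed=0)
    u = sobol.random(n_points)                      # points in the unit square
    feats = []
    edges = np.linspace(ex[0], ex[-1], n_sub + 1)   # split the excitation axis
    for lo, hi in zip(edges[:-1], edges[1:]):
        pts = np.column_stack([lo + u[:, 0] * (hi - lo),
                               em[0] + u[:, 1] * (em[-1] - em[0])])
        vol = (hi - lo) * (em[-1] - em[0])
        feats.append(interp(pts).mean() * vol)      # QMC estimate of the integral
    return np.array(feats)

# Toy calibration set: EEMs built as random mixtures of three "oil" spectra.
rng = np.random.default_rng(1)
pure = rng.random((3, ex.size, em.size))
conc = rng.dirichlet(np.ones(3), size=30)           # mixture fractions
X = np.array([qmc_features(np.tensordot(c, pure, axes=1)) for c in conc])
pls = PLSRegression(n_components=3).fit(X, conc)
print(pls.predict(X[:2]))
```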
NASA Astrophysics Data System (ADS)
Raegen, Adam; Reiter, Kyle; Clarke, Anthony; Lipkowski, Jacek; Dutcher, John
2013-03-01
The Surface Plasmon Resonance (SPR) phenomenon is routinely exploited to qualitatively probe changes to the optical properties of nanoscale coatings on thin metallic surfaces, for use in probes and sensors. Unfortunately, extracting truly quantitative information is usually limited to a select few cases - uniform absorption/desorption of small biomolecules and films, in which a continuous ``slab'' model is a good approximation. We present advancements in the SPR technique that expand the number of cases for which the technique can provide meaningful results. Use of a custom, angle-scanning SPR imaging system, together with a refined data analysis method, allow for quantitative kinetic measurements of laterally heterogeneous systems. We first demonstrate the directionally heterogeneous nature of the SPR phenomenon using a directionally ordered sample, then show how this allows for the calculation of the average coverage of a heterogeneous sample. Finally, the degradation of cellulose microfibrils and bundles of microfibrils due to the action of cellulolytic enzymes will be presented as an excellent example of the capabilities of the SPR imaging system.
Lu, Shao Hua; Li, Bao Qiong; Zhai, Hong Lin; Zhang, Xin; Zhang, Zhuo Yong
2018-04-25
Terahertz time-domain spectroscopy (THz-TDS) has been applied to many fields; however, it still encounters drawbacks in multicomponent mixture analysis due to serious spectral overlapping. Here, an effective approach to quantitative analysis was proposed and applied to the determination of a ternary amino acid mixture in a foxtail millet substrate. Utilizing three parameters derived from the THz-TDS, images were constructed and Tchebichef image moments were used to extract the information of the target components. The quantitative models were then obtained by stepwise regression. The correlation coefficients of leave-one-out cross-validation (R2(loo-cv)) were more than 0.9595. For the external test set, the predictive correlation coefficients (R2(p)) were more than 0.8026 and the root mean square errors of prediction (RMSEP) were less than 1.2601. Compared with the traditional methods (PLS and N-PLS), our approach is more accurate, robust, and reliable, and can be an excellent approach for quantifying multiple components with THz-TDS spectroscopy. Copyright © 2017 Elsevier Ltd. All rights reserved.
Three-dimensional cardiac architecture determined by two-photon microtomy
NASA Astrophysics Data System (ADS)
Huang, Hayden; MacGillivray, Catherine; Kwon, Hyuk-Sang; Lammerding, Jan; Robbins, Jeffrey; Lee, Richard T.; So, Peter
2009-07-01
Cardiac architecture is inherently three-dimensional, yet most characterizations rely on two-dimensional histological slices or dissociated cells, which remove the native geometry of the heart. We previously developed a method for labeling intact heart sections without dissociation and imaging large volumes while preserving their three-dimensional structure. We further refine this method to permit quantitative analysis of imaged sections. After data acquisition, these sections are assembled using image-processing tools, and qualitative and quantitative information is extracted. By examining the reconstructed cardiac blocks, one can observe end-to-end adjacent cardiac myocytes (cardiac strands) changing cross-sectional geometries, merging and separating from other strands. Quantitatively, representative cross-sectional areas typically used for determining hypertrophy omit the three-dimensional component; we show that taking orientation into account can significantly alter the analysis. Using fast-Fourier transform analysis, we analyze the gross organization of cardiac strands in three dimensions. By characterizing cardiac structure in three dimensions, we are able to determine that the α crystallin mutation leads to hypertrophy with cross-sectional area increases, but not necessarily via changes in fiber orientation distribution.
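The fast-Fourier-transform orientation analysis mentioned above can be sketched in two dimensions: the power spectrum of an image is binned by angle, and the dominant strand direction is taken perpendicular to the spectral peak. The angular binning and the synthetic stripe image below are illustrative only, not the authors' three-dimensional procedure.

```python
# Minimal sketch of orientation analysis via the 2-D FFT power spectrum: energy
# is binned by angle and the dominant fibre direction is taken perpendicular to
# the spectral peak. Binning and the synthetic stripe image are illustrative.
import numpy as np

def dominant_orientation(img, n_bins=180):
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    ang = np.arctan2(yy - h // 2, xx - w // 2) % np.pi     # spectral angle in [0, pi)
    hist = np.histogram(ang.ravel(), bins=n_bins, range=(0, np.pi),
                        weights=power.ravel())[0]
    peak = (np.argmax(hist) + 0.5) * np.pi / n_bins
    return (peak + np.pi / 2) % np.pi                      # fibres run normal to the spectral peak

# Synthetic test: stripes running at roughly 30 degrees should be recovered.
h = w = 128
yy, xx = np.mgrid[0:h, 0:w]
stripe_dir = np.deg2rad(30)                  # desired stripe (fibre) direction
normal = stripe_dir + np.pi / 2              # direction of intensity variation
img = np.sin(2 * np.pi * (xx * np.cos(normal) + yy * np.sin(normal)) / 8.0)
print(np.rad2deg(dominant_orientation(img)))  # approximately 30
```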
Kroll, Torsten; Schmidt, David; Schwanitz, Georg; Ahmad, Mubashir; Hamann, Jana; Schlosser, Corinne; Lin, Yu-Chieh; Böhm, Konrad J; Tuckermann, Jan; Ploubidou, Aspasia
2016-07-01
High-content analysis (HCA) converts raw light microscopy images to quantitative data through the automated extraction, multiparametric analysis, and classification of the relevant information content. Combined with automated high-throughput image acquisition, HCA applied to the screening of chemicals or RNAi reagents is termed high-content screening (HCS). Its power in quantifying cell phenotypes makes HCA applicable also to routine microscopy. However, developing effective HCA and bioinformatic analysis pipelines for acquisition of biologically meaningful data in HCS is challenging. Here, the step-by-step development of an HCA assay protocol and an HCS bioinformatics analysis pipeline is described. The protocol's power is demonstrated by application to focal adhesion (FA) detection, quantitative analysis of multiple FA features, and functional annotation of signaling pathways regulating FA size, using primary data from a published RNAi screen. The assay and the underlying strategy are aimed at researchers performing microscopy-based quantitative analysis of subcellular features, on a small scale or in large HCS experiments. © 2016 by John Wiley & Sons, Inc.
Krishnamurthy, Krish
2013-12-01
The intrinsic quantitative nature of NMR is increasingly exploited in areas ranging from complex mixture analysis (as in metabolomics and reaction monitoring) to quality assurance/control. Complex NMR spectra are more common than not, and therefore, extraction of quantitative information generally involves significant prior knowledge and/or operator interaction to characterize resonances of interest. Moreover, in most NMR-based metabolomic experiments, the signals from metabolites are normally present as a mixture of overlapping resonances, making quantification difficult. Time-domain Bayesian approaches have been reported to be better than conventional frequency-domain analysis at identifying subtle changes in signal amplitude. We discuss an approach that exploits Bayesian analysis to achieve a complete reduction to amplitude frequency table (CRAFT) in an automated and time-efficient fashion - thus converting the time-domain FID to a frequency-amplitude table. CRAFT uses a two-step approach to FID analysis. First, the FID is digitally filtered and downsampled to several sub FIDs, and secondly, these sub FIDs are then modeled as sums of decaying sinusoids using the Bayesian approach. CRAFT tables can be used for further data mining of quantitative information using fingerprint chemical shifts of compounds of interest and/or statistical analysis of modulation of chemical quantity in a biological study (metabolomics) or process study (reaction monitoring) or quality assurance/control. The basic principles behind this approach as well as results to evaluate the effectiveness of this approach in mixture analysis are presented. Copyright © 2013 John Wiley & Sons, Ltd.
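As a hedged illustration of the signal model behind CRAFT, the snippet below fits a single decaying sinusoid to a synthetic sub-FID by nonlinear least squares. CRAFT itself applies Bayesian estimation to complex-valued sub-FIDs, so this is a stand-in for the model (amplitude, frequency, decay rate, phase per component), not for the inference; all values are made up.

```python
# Illustrative stand-in for the CRAFT modelling step: one sub-FID is fitted as a
# single decaying sinusoid by nonlinear least squares (CRAFT itself uses a
# Bayesian estimator; the signal model, not the inference, is the point here).
import numpy as np
from scipy.optimize import curve_fit

def decaying_sinusoid(t, a, f, r2, phi):
    # a: amplitude, f: frequency (Hz), r2: decay rate (1/s), phi: phase (rad)
    return a * np.exp(-r2 * t) * np.cos(2 * np.pi * f * t + phi)

t = np.arange(0, 1.0, 1 / 2000.0)                 # 1 s acquisition at 2 kHz
true = dict(a=3.0, f=120.0, r2=4.0, phi=0.3)
fid = decaying_sinusoid(t, **true) + 0.05 * np.random.default_rng(2).standard_normal(t.size)

p0 = [1.0, 118.0, 1.0, 0.0]                        # rough starting values
popt, _ = curve_fit(decaying_sinusoid, t, fid, p0=p0)
print(dict(zip(["amplitude", "freq_Hz", "decay", "phase"], popt)))
# The fitted (frequency, amplitude) pairs from all sub-FIDs would populate the
# frequency-amplitude table used for downstream quantitation.
```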
Numerous extraction methods have been developed and used in the quantitation of both photopigments and mycosporine amino acids (MAAs) found in Symbiodinium sp. and zooxanthellate metazoans. We have developed a simple, mild extraction procedure using methanol, which when coupl...
Mousavi, Fatemeh; Pawliszyn, Janusz
2013-11-25
1-Vinyl-3-octadecylimidazolium bromide ionic liquid [C18VIm]Br was prepared and used for the modification of mercaptopropyl-functionalized silica (Si-MPS) through surface radical chain-transfer addition. The synthesized octadecylimidazolium-modified silica (SiImC18) was characterized by thermogravimetric analysis (TGA), infrared spectroscopy (IR), (13)C NMR and (29)Si NMR spectroscopy and used as an extraction phase for the automated 96-blade solid phase microextraction (SPME) system with thin-film geometry using polyacrylonitrile (PAN) glue. The newly proposed extraction phase was applied to the extraction of amino acids from grape pulp, and an LC-MS/MS method was developed for the separation of the model compounds. Extraction efficiency, reusability, linearity, limit of detection, limit of quantitation and matrix effect were evaluated. The whole sample preparation process for the proposed method requires 270 min for 96 samples simultaneously (60 min preconditioning, 90 min extraction, 60 min desorption and 60 min for the carryover step) using the 96-blade SPME system. Inter-blade and intra-blade reproducibility were in the respective ranges of 5-13 and 3-10% relative standard deviation (RSD) for all model compounds. Limits of detection and quantitation of the proposed SPME-LC-MS/MS system for the analysis of the analytes were found to range from 0.1 to 1.0 and 0.5 to 3.0 μg L(-1), respectively. Standard addition calibration was applied for the quantitative analysis of amino acids from grape juice, and the results were validated with the solvent extraction (SE) technique. Copyright © 2013 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cabrera Fernandez, Delia; Salinas, Harry M.; Somfai, Gabor; Puliafito, Carmen A.
2006-03-01
Optical coherence tomography (OCT) is a rapidly emerging medical imaging technology. In ophthalmology, OCT is a powerful tool because it enables visualization of the cross-sectional structure of the retina and anterior eye with higher resolution than any other non-invasive imaging modality. Furthermore, OCT image information can be quantitatively analyzed, enabling objective assessment of features such as macular edema and diabetic retinopathy. We present specific improvements in the quantitative analysis of OCT images, obtained by combining the diffusion equation with the free Schrödinger equation. In this formulation, important features of the image can be extracted by extending the analysis from the real axis to the complex domain. Experimental results indicate that our proposed approach performs well in speckle noise removal and in the enhancement and segmentation of the various cellular layers of the retina imaged with the OCT system.
NASA Astrophysics Data System (ADS)
Di, Jianglei; Song, Yu; Xi, Teli; Zhang, Jiwei; Li, Ying; Ma, Chaojie; Wang, Kaiqiang; Zhao, Jianlin
2017-11-01
Biological cells are usually transparent, with a small refractive index gradient. Digital holographic interferometry can be used in the measurement of biological cells. We propose a dual-wavelength common-path digital holographic microscopy for the quantitative phase imaging of biological cells. In the proposed configuration, a parallel glass plate is inserted in the light path to create the lateral shearing, and two lasers with different wavelengths are used as the light source to form the dual-wavelength composite digital hologram. The information of biological cells for the different wavelengths is separated and extracted in the Fourier domain of the hologram, and then combined into a shorter wavelength in the measurement process. This method could improve the system's temporal stability and reduce speckle noise simultaneously. Mouse osteoblastic cells and peony pollens are measured to show the feasibility of this method.
Wu, Shu-lian; Li, Hui; Zhang, Xiao-man; Chen, Wei R; Wang, Yun-Xia
2014-01-01
Quantitative characterization of the photo-thermal response of skin collagen and its regeneration process is an important but difficult task. In this study, the morphological and spectral characteristics of collagen during the photo-thermal response and the subsequent light-induced remodeling process were obtained in vivo by second-harmonic generation microscopy. Texture features, namely the collagen orientation index and the fractal dimension, were extracted by image processing. The aim of this study is to detect the information hidden in skin texture during the photo-thermal response and regeneration. Quantitative relations between injured collagen and the texture features were established for further analysis of the injury characteristics. Our results show that it is feasible to determine the main impacts of phototherapy on the skin and that texture features are important for understanding the process of collagen remodeling after photo-thermal injury.
An Overview of Advanced SILAC-Labeling Strategies for Quantitative Proteomics.
Terzi, F; Cambridge, S
2017-01-01
Comparative, quantitative mass spectrometry of proteins provides great insight into protein abundance and function, but some molecular characteristics related to protein dynamics are not so easily obtained. Because the metabolic incorporation of stable amino acid isotopes allows the extraction of distinct temporal and spatial aspects of protein dynamics, the SILAC methodology is uniquely suited to be adapted for advanced labeling strategies. New SILAC strategies have emerged that allow deeper foraging into the complexity of cellular proteomes. Here, we review a few advanced SILAC-labeling strategies that have been published in recent years. Among them, different subsaturating-labeling as well as dual-labeling schemes are most prominent for a range of analyses including those of neuronal proteomes, secretion, or cell-cell-induced stimulations. These recent developments suggest that much more information can be gained from proteomic analyses if the labeling strategies are specifically tailored toward the experimental design. © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Pi, Shiqiang; Liu, Wenzhong; Jiang, Tao
2018-03-01
The magnetic transparency of biological tissue allows the magnetic nanoparticle (MNP) to be a promising functional sensor and contrast agent. The complex susceptibility of MNPs, strongly influenced by particle concentration, excitation magnetic field and their surrounding microenvironment, provides significant implications for biomedical applications. Therefore, magnetic susceptibility imaging of high spatial resolution will give more detailed information during the process of MNP-aided diagnosis and therapy. In this study, we present a novel spatial magnetic susceptibility extraction method for MNPs under a gradient magnetic field, a low-frequency drive magnetic field, and a weak strength high-frequency magnetic field. Based on this novel method, a magnetic particle susceptibility imaging (MPSI) of millimeter-level spatial resolution (<3 mm) was achieved using our homemade imaging system. Corroborated by the experimental results, the MPSI shows real-time (1 s per frame acquisition) and quantitative abilities, and isotropic high resolution.
Garcia, Ernest V; Taylor, Andrew; Folks, Russell; Manatunga, Daya; Halkar, Raghuveer; Savir-Baruch, Bital; Dubovsky, Eva
2012-09-01
Decision support systems for imaging analysis and interpretation are rapidly being developed and will have an increasing impact on the practice of medicine. RENEX is a renal expert system to assist physicians in evaluating suspected obstruction in patients undergoing mercaptoacetyltriglycine (MAG3) renography. RENEX uses quantitative parameters extracted from the dynamic renal scan data using QuantEM™II and heuristic rules in the form of a knowledge base gleaned from experts to determine if a kidney is obstructed; however, RENEX does not have access to and could not consider the clinical information available to diagnosticians interpreting these studies. We designed and implemented a methodology to incorporate clinical information into RENEX, implemented motion detection and evaluated this new comprehensive system (iRENEX) in a pilot group of 51 renal patients. To reach a conclusion as to whether a kidney is obstructed, 56 new clinical rules were added to the previously reported 60 rules used to interpret quantitative MAG3 parameters. All the clinical rules were implemented after iRENEX reached a conclusion on obstruction based on the quantitative MAG3 parameters, and the evidence of obstruction was then modified by the new clinical rules. iRENEX consisted of a library to translate parameter values to certainty factors, a knowledge base with 116 heuristic interpretation rules, a forward chaining inference engine to determine obstruction and a justification engine. A clinical database was developed containing patient histories and imaging report data obtained from the hospital information system associated with the pertinent MAG3 studies. The system was fine-tuned and tested using a pilot group of 51 patients (21 men, mean age 58.2 ± 17.1 years, 100 kidneys) deemed by an expert panel to have 61 unobstructed and 39 obstructed kidneys. iRENEX, using only quantitative MAG3 data, agreed with the expert panel in 87% (34/39) of obstructed and 90% (55/61) of unobstructed kidneys. iRENEX, using both quantitative and clinical data, agreed with the expert panel in 95% (37/39) of obstructed and 92% (56/61) of unobstructed kidneys. The clinical information significantly (p < 0.001) increased iRENEX certainty in detecting obstruction over using the quantitative data alone. Our renal expert system for detecting renal obstruction has been substantially expanded to incorporate the clinical information available to physicians as well as advanced quality control features and was shown to interpret renal studies in a pilot group at a standardized expert level. These encouraging results warrant a prospective study in a large population of patients with and without renal obstruction to establish the diagnostic performance of iRENEX.
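The way heuristic rules with certainty factors can be combined into an overall evidence of obstruction can be illustrated with a toy, MYCIN-style combination scheme. The rules, parameter names and thresholds below are invented for illustration and are not part of the iRENEX knowledge base.

```python
# Toy illustration of combining rule-based evidence with certainty factors
# (MYCIN-style parallel combination). The rules and thresholds are invented for
# illustration; they are not the actual iRENEX knowledge base.
def combine_cf(cf1, cf2):
    """Combine two certainty factors for the same hypothesis."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

# Each rule maps observed parameters to a certainty factor for "kidney obstructed".
rules = [
    lambda p: 0.6 if p["t_half_min"] > 20 else -0.4,          # prolonged washout half-time
    lambda p: 0.4 if p["cortical_retention"] > 0.7 else 0.0,  # quantitative scan parameter
    lambda p: -0.5 if p["recent_diuretic"] else 0.0,          # clinical-information rule
]

patient = {"t_half_min": 26.0, "cortical_retention": 0.8, "recent_diuretic": False}
cf = 0.0
for rule in rules:
    cf = combine_cf(cf, rule(patient))
print(f"certainty of obstruction: {cf:+.2f}")
```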
Our goal is to construct a publicly available computational radiomics system for the objective and automated extraction of quantitative imaging features that we believe will yield biomarkers of greater prognostic value compared with routinely extracted descriptors of tumor size. We will create a generalized, open, portable, and extensible radiomics platform that is widely applicable across cancer types and imaging modalities and describe how we will use lung and head and neck cancers as models to validate our developments.
Quantifying hydrogen-deuterium exchange of meteoritic dicarboxylic acids during aqueous extraction
NASA Astrophysics Data System (ADS)
Fuller, M.; Huang, Y.
2003-03-01
Hydrogen isotope ratios of organic compounds in carbonaceous chondrites provide critical information about their origins and evolutionary history. However, because many of these compounds are obtained by aqueous extraction, the degree of hydrogen-deuterium (H/D) exchange that occurs during the process needs to be quantitatively evaluated. This study uses compound-specific hydrogen isotopic analysis to quantify the H/D exchange during aqueous extraction. Three common meteoritic dicarboxylic acids (succinic, glutaric, and 2-methyl glutaric acids) were refluxed under conditions simulating the extraction process. Changes in δD values of the dicarboxylic acids were measured following the reflux experiments. A pseudo-first-order rate law was used to model the H/D exchange rates, which were then used to calculate the isotope exchange resulting from aqueous extraction. The degree of H/D exchange varies as a result of differences in molecular structure, the alkalinity of the extraction solution and the presence/absence of meteorite powder. However, our model indicates that succinic, glutaric, and 2-methyl glutaric acids with a δD of 1800‰ would experience isotopic changes of 38‰, 10‰, and 6‰, respectively, during the extraction process. Therefore, the overall change in δD values of the dicarboxylic acids during the aqueous extraction process is negligible. We also demonstrate that H/D exchange occurs on the chiral α-carbon in 2-methyl glutaric acid. The results suggest that the racemic mixture of 2-methyl glutaric acid in the Tagish Lake meteorite could result from post-synthesis aqueous alteration. The approach employed in this study can also be used to quantify H/D exchange for other important meteoritic compounds such as amino acids.
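The pseudo-first-order rate law implies that δD relaxes exponentially toward the isotopic composition of the extraction water; a minimal sketch of how the correction would be computed, with a placeholder rate constant and reflux time rather than the measured values, is:

```python
# Pseudo-first-order H/D exchange toward the isotopic composition of the
# extraction water: d(delta)/dt = -k (delta - delta_eq), so
# delta(t) = delta_eq + (delta0 - delta_eq) * exp(-k t).
# The rate constant and reflux time below are placeholders, not measured values.
import numpy as np

def delta_after_exchange(delta0, delta_eq, k_per_hr, t_hr):
    return delta_eq + (delta0 - delta_eq) * np.exp(-k_per_hr * t_hr)

delta0 = 1800.0      # initial deltaD of the dicarboxylic acid (per mil)
delta_eq = -60.0     # deltaD of exchangeable H in the extraction water (per mil)
k = 0.001            # hypothetical exchange rate constant (1/hr)
t = 20.0             # hypothetical reflux/extraction duration (hr)
shift = delta0 - delta_after_exchange(delta0, delta_eq, k, t)
print(f"deltaD shift during extraction: {shift:.0f} per mil")   # small for slow exchange
```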
Oldham, Athenia L; Drilling, Heather S; Stamps, Blake W; Stevenson, Bradley S; Duncan, Kathleen E
2012-11-20
The analysis of microbial assemblages in industrial, marine, and medical systems can inform decisions regarding quality control or mitigation. Modern molecular approaches to detect, characterize, and quantify microorganisms provide rapid and thorough measures unbiased by the need for cultivation. The requirement of timely extraction of high quality nucleic acids for molecular analysis is faced with specific challenges when used to study the influence of microorganisms on oil production. Production facilities are often ill equipped for nucleic acid extraction techniques, making the preservation and transportation of samples off-site a priority. As a potential solution, the possibility of extracting nucleic acids on-site using automated platforms was tested. The performance of two such platforms, the Fujifilm QuickGene-Mini80™ and Promega Maxwell®16 was compared to a widely used manual extraction kit, MOBIO PowerBiofilm™ DNA Isolation Kit, in terms of ease of operation, DNA quality, and microbial community composition. Three pipeline biofilm samples were chosen for these comparisons; two contained crude oil and corrosion products and the third transported seawater. Overall, the two more automated extraction platforms produced higher DNA yields than the manual approach. DNA quality was evaluated for amplification by quantitative PCR (qPCR) and end-point PCR to generate 454 pyrosequencing libraries for 16S rRNA microbial community analysis. Microbial community structure, as assessed by DGGE analysis and pyrosequencing, was comparable among the three extraction methods. Therefore, the use of automated extraction platforms should enhance the feasibility of rapidly evaluating microbial biofouling at remote locations or those with limited resources.
Aims: To determine the performance of a rapid, real-time polymerase chain reaction (PCR) method for the detection and quantitative analysis of Helicobacter pylori at low concentrations in drinking water.
Methods and Results: A rapid DNA extraction and quantitative PCR (QPCR)...
Wang, Mei; Zhao, Jianping; Avula, Bharathi; Wang, Yan-Hong; Avonto, Cristina; Chittiboyina, Amar G; Wylie, Philip L; Parcher, Jon F; Khan, Ikhlas A
2014-12-17
A high-resolution gas chromatography/mass spectrometry (GC/MS) method with selected ion monitoring, focusing on the characterization and quantitative analysis of ginkgolic acids (GAs) in Ginkgo biloba L. plant materials, extracts, and commercial products, was developed and validated. The method involved sample extraction with (1:1) methanol and 10% formic acid, liquid-liquid extraction with n-hexane, and derivatization with trimethylsulfonium hydroxide (TMSH). Separation of two saturated (C13:0 and C15:0) and six unsaturated ginkgolic acid methyl esters with different positional double bonds (C15:1 Δ8 and Δ10, C17:1 Δ8, Δ10, and Δ12, and C17:2) was achieved on a very polar (88% cyanopropyl) aryl-polysiloxane HP-88 capillary GC column. The double bond positions in the GAs were determined by ozonolysis. The developed GC/MS method was validated according to ICH guidelines, and the quantitation results were verified by comparison with a standard high-performance liquid chromatography method. Nineteen authenticated and commercial G. biloba plant samples and 21 dietary supplements purported to contain G. biloba leaf extracts were analyzed. Finally, the presence of the Ginkgo biloba marker compounds, terpene trilactones and flavonol glycosides, in the dietary supplements was determined by UHPLC/MS and used to confirm the presence of G. biloba leaf extracts in all of the botanical dietary supplements.
Ro, Chul-Un; Kim, HyeKyeong; Van Grieken, René
2004-03-01
An electron probe X-ray microanalysis (EPMA) technique, using an energy-dispersive X-ray detector with an ultrathin window, designated low-Z particle EPMA, has been developed. The low-Z particle EPMA allows the quantitative determination of concentrations of low-Z elements, such as C, N, and O, as well as of the chemical elements that can be analyzed by conventional energy-dispersive EPMA, in individual particles. Since a data set is usually composed of data for several thousand particles in order to make environmentally meaningful observations of real atmospheric aerosol samples, the development of a method that fully extracts the chemical information contained in the low-Z particle EPMA data is important. An expert system that can rapidly and reliably perform chemical speciation from the low-Z particle EPMA data is presented. This expert system tries to mimic the logic used by experts and is implemented using the macro programming available in MS Excel software. Its feasibility is confirmed by applying the expert system to data for various types of standard particles and a real atmospheric aerosol sample. By applying the expert system, the time necessary for chemical speciation is greatly shortened, and detailed information on particle data can be saved and extracted later if more information is needed for further analysis.
Diagnostic analysis of liver B ultrasonic texture features based on LM neural network
NASA Astrophysics Data System (ADS)
Chi, Qingyun; Hua, Hu; Liu, Menglin; Jiang, Xiuying
2017-03-01
In this study, B-ultrasound images from 124 patients with benign or malignant liver lesions were randomly selected as the study objects. The B-ultrasound images of the liver were first enhanced and de-noised. Gray-level co-occurrence matrices reflecting the information at each angle were constructed, 22 texture features were extracted and reduced by principal component analysis, and the result was combined with an LM neural network for diagnosis and classification. Experimental results show that this is a rapid and effective diagnostic method for liver imaging, providing a quantitative basis for the clinical diagnosis of liver diseases.
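A hedged sketch of such a texture pipeline is shown below, using scikit-image grey-level co-occurrence features, principal component analysis and a small neural-network classifier. Synthetic images replace the B-ultrasound data, and scikit-learn's MLP is assumed as a stand-in for the LM (presumably Levenberg-Marquardt-trained) network of the study.

```python
# Sketch of the texture pipeline: grey-level co-occurrence features at several
# angles, reduced by PCA and classified with a small neural network. Synthetic
# images stand in for the liver B-ultrasound data, and sklearn's MLP replaces
# the LM-trained network described in the study.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

def glcm_features(img8):
    glcm = graycomatrix(img8, distances=[1, 2],
                        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                        levels=256, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

rng = np.random.default_rng(3)
imgs = (rng.random((60, 64, 64)) * 255).astype(np.uint8)
labels = rng.integers(0, 2, 60)                     # 0 = benign, 1 = malignant (toy labels)
X = np.array([glcm_features(im) for im in imgs])

X_red = PCA(n_components=10).fit_transform(X)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X_red, labels)
print(clf.score(X_red, labels))
```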
Wang, Peng; Liu, Donghui; Gu, Xu; Jiang, Shuren; Zhou, Zhiqiang
2008-01-01
Methods for the enantiomeric quantitative determination of 3 chiral pesticides, paclobutrazol, myclobutanil, and uniconazole, and their residues in soil and water are reported. An effective chiral high-performance liquid chromatographic (HPLC)-UV method using an amylose-tris(3,5-dimethylphenylcarbamate; AD) column was developed for resolving the enantiomers and quantitative determination. The enantiomers were identified by a circular dichroism detector. Validation involved complete resolution of each of the 2 enantiomers, plus determination of linearity, precision, and limit of detection (LOD). The pesticide enantiomers were isolated by solvent extraction from soil and C18 solid-phase extraction from water. The 2 enantiomers of the 3 pesticides could be completely separated on the AD column using n-hexane isopropanol mobile phase. The linearity and precision results indicated that the method was reliable for the quantitative analysis of the enantiomers. LODs were 0.025, 0.05, and 0.05 mg/kg for each enantiomer of paclobutrazol, myclobutanil, and uniconazole, respectively. Recovery and precision data showed that the pretreatment procedures were satisfactory for enantiomer extraction and cleanup. This method can be used for optical purity determination of technical material and analysis of environmental residues.
Positron emission tomography (PET) advances in neurological applications
NASA Astrophysics Data System (ADS)
Sossi, V.
2003-09-01
Positron Emission Tomography (PET) is a functional imaging modality used in brain research to map in vivo neurotransmitter and receptor activity and to investigate glucose utilization or blood flow patterns both in healthy and disease states. Such research is made possible by the wealth of radiotracers available for PET, by the fact that metabolic and kinetic parameters of particular processes can be extracted from PET data and by the continuous development of imaging techniques. In recent years great advancements have been made in the areas of PET instrumentation, data quantification and image reconstruction that allow for more detailed and accurate biological information to be extracted from PET data. It is now possible to quantitatively compare data obtained either with different tracers or with the same tracer under different scanning conditions. These sophisticated imaging approaches enable detailed investigation of disease mechanisms and system response to disease and/or therapy.
Ladoux, Benoit; Quivy, Jean-Pierre; Doyle, Patrick; Roure, Olivia du; Almouzni, Geneviève; Viovy, Jean-Louis
2000-01-01
Fluorescence videomicroscopy and scanning force microscopy were used to follow, in real time, chromatin assembly on individual DNA molecules immersed in cell-free systems competent for physiological chromatin assembly. Within a few seconds, molecules are already compacted into a form exhibiting strong similarities to native chromatin fibers. In these extracts, the compaction rate is more than 100 times faster than expected from standard biochemical assays. Our data provide definite information on the forces involved (a few piconewtons) and on the reaction path. DNA compaction as a function of time revealed unique features of the assembly reaction in these extracts. They imply a sequential process with at least three steps, involving DNA wrapping as the final event. An absolute and quantitative measure of the kinetic parameters of the early steps in chromatin assembly under physiological conditions could thus be obtained. PMID:11114182
Control volume based hydrocephalus research; analysis of human data
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer
2010-11-01
Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. The clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
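For reference, the standard integral (control-volume) conservation statements underlying this approach are written below; the paper's working equations are specializations of these to the cranial compartment, so this is background rather than a reproduction of the authors' formulation.

```latex
% Standard integral control-volume conservation statements (Reynolds transport form).
\begin{aligned}
&\text{Mass:}\quad
\frac{d}{dt}\int_{CV}\rho \,dV \;+\; \oint_{CS}\rho\,(\mathbf{u}\cdot\hat{\mathbf{n}})\,dA \;=\; 0,\\[4pt]
&\text{Momentum:}\quad
\frac{d}{dt}\int_{CV}\rho\,\mathbf{u}\,dV \;+\; \oint_{CS}\rho\,\mathbf{u}\,(\mathbf{u}\cdot\hat{\mathbf{n}})\,dA
\;=\; -\oint_{CS} p\,\hat{\mathbf{n}}\,dA \;+\; \mathbf{F}_{\text{other}}.
\end{aligned}
```

Because the velocity on the control surface is measured directly by MR, the pressure term is the remaining unknown in the momentum balance, which is what makes non-invasive pressure extraction possible.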
Assessing healthcare professionals' experiences of integrated care: do surveys tell the full story?
Stephenson, Matthew D; Campbell, Jared M; Lisy, Karolina; Aromataris, Edoardo C
2017-09-01
Integrated care is the combination of different healthcare services with the goal of providing comprehensive, seamless, effective and efficient patient care. Assessing the experiences of healthcare professionals (HCPs) is an important aspect when evaluating integrated care strategies. The aim of this rapid review was to investigate whether quantitative surveys used to assess HCPs' experiences with integrated care capture all the aspects highlighted as being important in qualitative research, with a view to informing future survey development. The review considered all types of health professionals in primary care, and hospital and specialist services, with a specific focus on the provision of integrated care aimed at improving the patient journey. PubMed, CINAHL and grey literature sources were searched for relevant surveys/program evaluations and qualitative research studies. Full text articles deemed to be of relevance to the review were appraised for methodological quality using abridged critical appraisal instruments from the Joanna Briggs Institute. Data were extracted from included studies using standardized data extraction templates. Findings from included studies were grouped into domains based on similarity of meaning. Similarities and differences in the domains covered in quantitative surveys and those identified as being important in qualitative research were explored. A total of 37 studies (19 quantitative surveys, 14 qualitative studies and four mixed-method studies) were included in the review. A range of healthcare professions participated in the included studies, the majority being primary care providers. Common domains identified from quantitative surveys and qualitative studies included Communication, Agreement on Clear Roles and Responsibilities, Facilities, Information Systems, and Coordination of Care and Access. Qualitative research highlighted domains identified by HCPs as being relevant to their experiences with integrated care that have not routinely been surveyed, including Workload, Clear Leadership/Decision-Making, Management, Flexibility of Integrated Care Model, Engagement, Usefulness of Integrated Care and Collaboration, and Positive Impact/Clinical Benefits/Practice Level Benefits. There were several domains identified from qualitative research that are not routinely included in quantitative surveys to assess health professionals' experiences of integrated care. In addition, the qualitative findings suggest that the experiences of HCPs are often impacted by deeper aspects than those measured by existing surveys. Incorporation of targeted items within these domains in the design of surveys should enhance the capture of data that are relevant to the experiences of HCPs with integrated care, which may assist in more comprehensive evaluation and subsequent improvement of integrated care programs.
Wang, Jiaming; Gambetta, Joanna M; Jeffery, David W
2016-05-18
Two rosé wines, representing a tropical and a fruity/floral style, were chosen from a previous study for further exploration by aroma extract dilution analysis (AEDA) and quantitative analysis. Volatiles were extracted using either liquid-liquid extraction (LLE) followed by solvent-assisted flavor evaporation (SAFE) or a recently developed dynamic headspace (HS) sampling method utilizing solid-phase extraction (SPE) cartridges. AEDA was conducted using gas chromatography-mass spectrometry/olfactometry (GC-MS/O) and a total of 51 aroma compounds with a flavor dilution (FD) factor ≥3 were detected. Quantitative analysis of 92 volatiles was undertaken in both wines for calculation of odor activity values. The fruity and floral wine style was mostly driven by 2-phenylethanol, β-damascenone, and a range of esters, whereas 3-SHA and several volatile acids were seen as essential for the tropical style. When extraction methods were compared, HS-SPE was as efficient as SAFE for extracting most esters and higher alcohols, which were associated with fruity and floral characters, but it was difficult to capture volatiles with greater polarity or higher boiling point that may still be important to perceived wine aroma.
Deciphering the proteomic profile of rice (Oryza sativa) bran: a pilot study.
Ferrari, Fabio; Fumagalli, Marco; Profumo, Antonella; Viglio, Simona; Sala, Alberto; Dolcini, Lorenzo; Temporini, Caterina; Nicolis, Stefania; Merli, Daniele; Corana, Federica; Casado, Begona; Iadarola, Paolo
2009-12-01
Exact knowledge of the qualitative and quantitative protein components of rice bran is essential for a better understanding of the functional properties of this resource. The aim of the present investigation was to extract the largest number of rice bran proteins and to obtain their qualitative characterization. For this purpose, three different extraction protocols were applied either to full-fat or to defatted rice bran. Likewise, to identify the highest number of proteins, MS data collected from 1-DE, 2-DE and gel-free procedures were combined. These approaches allowed the unambiguous identification of 43 proteins, which were classified as signalling/regulation proteins (30%), proteins with enzymatic activity (30%), storage proteins (30%), transfer (5%) and structural (5%) proteins. The fact that all extraction and identification procedures were performed in triplicate with excellent reproducibility provides a rationale for considering the platform of proteins shown in this study as the potential proteome profile of rice bran. It also represents a source of information for better evaluating the qualities of rice bran as a food resource.
Sinkiewicz, Daniel; Friesen, Lendra; Ghoraani, Behnaz
2017-02-01
Cortical auditory evoked potentials (CAEP) are used to evaluate the auditory pathways of cochlear implant (CI) patients, but the CI device produces an electrical artifact, which obscures the relevant information in the neural response. Currently there are multiple methods that attempt to recover the neural response from the contaminated CAEP, but there is no gold standard that can quantitatively confirm the effectiveness of these methods. To address this crucial shortcoming, we develop a wavelet-based method to quantify the amount of artifact energy in the neural response. In addition, a novel technique for extracting the neural response from single-channel CAEPs is proposed. The new method uses matching pursuit (MP) based feature extraction to represent the contaminated CAEP in a feature space, and support vector machines (SVM) to classify the components as normal hearing (NH) or artifact. The NH components are combined to recover the neural response without artifact energy, as verified using the evaluation tool. Although it needs some further evaluation, this approach is a promising method of electrical artifact removal from CAEPs. Copyright © 2016 IPEM. Published by Elsevier Ltd. All rights reserved.
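The decompose-classify-rebuild idea can be sketched as follows, with scikit-learn's orthogonal matching pursuit standing in for the matching-pursuit step and an SVM separating "neural" from "artifact" atoms. The dictionary, atom features and labelling rule are invented for illustration and do not reflect the authors' trained classifier or data.

```python
# Hedged sketch of the "decompose, classify, rebuild" idea: each CAEP epoch is
# sparsely decomposed on a Gabor-like dictionary (OrthogonalMatchingPursuit used
# here as a simple stand-in for matching pursuit), every selected atom is
# classified as neural vs. artifact by an SVM on invented atom features, and the
# response is rebuilt from the atoms labelled neural. All data are synthetic.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit
from sklearn.svm import SVC

n = 256
t = np.arange(n)
# Dictionary of Gabor-like atoms: several centres, widths and frequencies.
atoms, meta = [], []
for c in range(16, n, 16):
    for width in (8, 24):
        for freq in (0.01, 0.05, 0.15):
            g = np.exp(-0.5 * ((t - c) / width) ** 2) * np.cos(2 * np.pi * freq * (t - c))
            atoms.append(g / np.linalg.norm(g))
            meta.append([width, freq])
D = np.array(atoms).T                      # shape (n_samples, n_atoms)
meta = np.array(meta)

# Toy SVM: pretend long, slow atoms are "neural" and short, fast ones "artifact".
labels = (meta[:, 0] > 10) & (meta[:, 1] < 0.1)     # True = neural (invented rule)
svm = SVC(kernel="rbf").fit(meta, labels)

# Decompose one synthetic epoch and keep only the atoms the SVM calls neural.
rng = np.random.default_rng(4)
epoch = D @ rng.standard_normal(D.shape[1]) * 0.1 + np.exp(-0.5 * ((t - 120) / 24) ** 2)
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=10).fit(D, epoch)
coef = omp.coef_
keep = svm.predict(meta) & (coef != 0)
neural_estimate = D[:, keep] @ coef[keep]
print(f"{keep.sum()} of {np.count_nonzero(coef)} selected atoms kept as neural")
```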
Wang, Ning; Wu, Xiaolin; Ku, Lixia; Chen, Yanhui; Wang, Wei
2016-01-01
Leaf morphology is closely related to the growth and development of maize (Zea mays L.) plants and final kernel production. As an important part of the maize leaf, the midrib holds leaf blades in the aerial position for maximum sunlight capture. Leaf midribs of adult plants contain substantial sclerenchyma cells with heavily thickened and lignified secondary walls and have a high amount of phenolics, making protein extraction and proteome analysis difficult in leaf midrib tissue. In the present study, three protein-extraction methods that are commonly used in plant proteomics, i.e., phenol extraction, TCA/acetone extraction, and TCA/acetone/phenol extraction, were qualitatively and quantitatively evaluated based on 2DE maps and MS/MS analysis using the midribs of the 10th newly expanded leaves of maize plants. Microscopy revealed the existence of substantial amounts of sclerenchyma underneath maize midrib epidermises (particularly abaxial epidermises). The spot-number order obtained via 2DE mapping was as follows: phenol extraction (655) > TCA/acetone extraction (589) > TCA/acetone/phenol extraction (545). MS/MS analysis identified a total of 17 spots that exhibited 2-fold changes in abundance among the three methods (using phenol extraction as a control). Sixteen of the proteins identified were hydrophilic, with GRAVY values ranging from -0.026 to -0.487. For all three methods, we were able to obtain high-quality protein samples and good 2DE maps for the maize leaf midrib. However, phenol extraction produced a better 2DE map with greater resolution between spots, and TCA/acetone extraction produced higher protein yields. Thus, this paper includes a discussion regarding the possible reasons for differential protein extraction among the three methods. This study provides useful information that can be used to select suitable protein extraction methods for the proteome analysis of recalcitrant plant tissues that are rich in sclerenchyma cells.
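For context, the GRAVY values quoted above are the Kyte-Doolittle grand average of hydropathy, i.e. the mean hydropathy over all residues; a quick check with Biopython (using a placeholder sequence, not one of the identified midrib proteins) looks like this:

```python
# GRAVY (grand average of hydropathy) as cited in the abstract: the mean
# Kyte-Doolittle hydropathy over all residues. The sequence below is a
# placeholder, not one of the identified maize proteins.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"      # hypothetical example sequence
gravy = ProteinAnalysis(seq).gravy()
print(f"GRAVY = {gravy:.3f}")                    # negative values indicate hydrophilic proteins
```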
Kosulin, K; Dworzak, S; Lawitschka, A; Matthes-Leodolter, S; Lion, T
2016-12-01
Adenoviruses almost invariably proliferate in the gastrointestinal tract prior to dissemination, and critical threshold concentrations in stool correlate with the risk of viremia. Monitoring of adenovirus loads in stool may therefore be important for timely initiation of treatment in order to prevent invasive infection. Comparison of a manual DNA extraction kit in combination with a validated in-house PCR assay with automated extraction on the NucliSENS-EasyMAG device coupled with the Adenovirus R-gene kit (bioMérieux) for quantitative adenovirus analysis in stool samples. Stool specimens spiked with adenovirus concentrations in a range from 10^2-10^11 copies/g and 32 adenovirus-positive clinical stool specimens from pediatric stem cell transplant recipients were tested along with appropriate negative controls. Quantitative analysis of viral load in adenovirus-positive stool specimens revealed a median difference of 0.5 logs (range 0.1-2.2) between the detection systems tested and a difference of 0.3 logs (range 0.0-1.7) when the comparison was restricted to the PCR assays only. Spiking experiments showed a detection limit of 10^2-10^3 adenovirus copies/g stool, revealing a somewhat higher sensitivity offered by the automated extraction. The dynamic range of accurate quantitative analysis by both systems investigated was between 10^3 and 10^8 virus copies/g. The differences in quantitative analysis of adenovirus copy numbers between the systems tested were primarily attributable to the DNA extraction method used, while the qPCR assays revealed a high level of concordance. Both systems showed adequate performance for detection and monitoring of adenoviral load in stool specimens. Copyright © 2016 Elsevier B.V. All rights reserved.
A new method to evaluate image quality of CBCT images quantitatively without observers
Shimizu, Mayumi; Okamura, Kazutoshi; Yoshida, Shoko; Weerawanich, Warangkana; Tokumori, Kenji; Jasa, Gainer R; Yoshiura, Kazunori
2017-01-01
Objectives: To develop an observer-free method for quantitatively evaluating the image quality of CBCT images by applying just-noticeable difference (JND). Methods: We used two test objects: (1) a Teflon (polytetrafluoroethylene) plate phantom attached to a dry human mandible; and (2) a block phantom consisting of a Teflon step phantom and an aluminium step phantom. These phantoms had holes with different depths. They were immersed in water and scanned with a CB MercuRay (Hitachi Medical Corporation, Tokyo, Japan) at tube voltages of 120 kV, 100 kV, 80 kV and 60 kV. Superimposed images of the phantoms with holes were used for evaluation. The number of detectable holes was used as an index of image quality. In detecting holes quantitatively, the threshold grey value (ΔG), which differentiated holes from the background, was calculated using a specific threshold (the JND), and we extracted the holes with grey values above ΔG. The indices obtained by this quantitative method (the extracted hole values) were compared with the observer evaluations (the observed hole values). In addition, the contrast-to-noise ratio (CNR) of the shallowest detectable holes and the deepest undetectable holes were measured to evaluate the contribution of CNR to detectability. Results: The results of this evaluation method corresponded almost exactly with the evaluations made by observers. The extracted hole values reflected the influence of different tube voltages. All extracted holes had an area with a CNR of ≥1.5. Conclusions: This quantitative method of evaluating CBCT image quality may be more useful and less time-consuming than evaluation by observation. PMID:28045343
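The core of the observer-free evaluation, a JND-derived threshold grey value and a per-hole contrast-to-noise ratio, can be sketched as follows; the JND multiplier and the synthetic phantom image are placeholders rather than the study's values.

```python
# Sketch of the observer-free evaluation: a hole counts as detectable when its
# mean grey value differs from the background by more than a JND-derived
# threshold; CNR is reported alongside. The JND multiplier and the synthetic
# image are placeholders, not the values used in the study.
import numpy as np

def evaluate_holes(img, hole_masks, bg_mask, jnd=1.5):
    bg_mean, bg_std = img[bg_mask].mean(), img[bg_mask].std()
    delta_g = jnd * bg_std                      # threshold grey value (delta-G)
    results = []
    for m in hole_masks:
        diff = abs(img[m].mean() - bg_mean)
        cnr = diff / bg_std
        results.append((diff > delta_g, cnr))
    detected = sum(d for d, _ in results)
    return detected, results

# Toy image: uniform background with two "holes" of different depth (contrast).
rng = np.random.default_rng(5)
img = 100 + 5 * rng.standard_normal((64, 64))
img[10:20, 10:20] += 20          # deep hole, easily detectable
img[40:50, 40:50] += 3           # shallow hole, near threshold
masks = [np.zeros_like(img, bool) for _ in range(2)]
masks[0][10:20, 10:20] = True
masks[1][40:50, 40:50] = True
bg = ~(masks[0] | masks[1])
print(evaluate_holes(img, masks, bg))
```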
Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip
2012-02-01
The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
New method for determination of ten pesticides in human blood.
García-Repetto, R; Giménez, M P; Repetto, M
2001-01-01
An analytical method was developed for precise identification and quantitation of 10 pesticides in human blood. The pesticides studied, which have appeared frequently in actual cases, were endosulfan, lindane, parathion, ethyl-azinphos, diazinon, malathion, alachlor, tetradifon, fenthion and dicofol (o-p' and p-p' isomers). The current method replaces an earlier method which involved liquid-liquid extraction with a mixture of n-hexane-benzene (1 + 1). The extraction is performed by solid-phase extraction, with C18 cartridges and 2 internal standards, perthane and triphenylphosphate. Eluates were analyzed by gas chromatography (GC) with nitrogen-phosphorus and electrochemical detectors. Results were confirmed by GC-mass spectrometry in the electron impact mode. Blood blank samples spiked with 2 standard mixtures and an internal standard were used for quantitation. Mean recoveries ranged from 71.83 to 97.10%. Detection and quantitation limits are reported for each pesticide. Examples are provided to show the application of the present method to actual samples.
Subcritical (hot) water with ethanol as modifier was used to extract nonylphenol polyethoxy carboxylates (NPECs) with 1-4 ethoxy groups from sludge samples. Quantitative recovery of native NPECs from sludge was accomplished by extracting 0.25 g samples for 20 min w...
ERIC Educational Resources Information Center
Lavilla, Isela; Costas, Marta; Pena-Pereira, Francisco; Gil, Sandra; Bendicho, Carlos
2011-01-01
Ultrasound-assisted extraction (UAE) is introduced to upper-level analytical chemistry students as a simple strategy focused on sample preparation for trace-metal determination in biological tissues. Nickel extraction in seafood samples and quantification by electrothermal atomic absorption spectrometry (ETAAS) are carried out by a team of four…
A Teaching Laboratory for Comprehensive Lipid Characterization from Food Samples
ERIC Educational Resources Information Center
Bendinskas, Kestutis; Weber, Benjamin; Nsouli, Tamara; Nguyen, Hoangvy V.; Joyce, Carolyn; Niri, Vadoud; Jaskolla, Thorsten W.
2014-01-01
Traditional and state-of-the-art techniques were combined to probe for various lipid classes from egg yolk and avocado qualitatively and quantitatively. A total lipid extract was isolated using liquid-liquid extraction. An aliquot of the total lipid extract was subjected to transesterification to form volatile fatty acid methyl esters suitable for…
NASA Astrophysics Data System (ADS)
Harney, Robert C.
1997-03-01
A novel methodology offering the potential for resolving two of the significant problems of implementing multisensor target recognition systems, i.e., the rational selection of a specific sensor suite and optimal allocation of requirements among sensors, is presented. Based on a sequence of conjectures (and their supporting arguments) concerning the relationship of extractable information content to recognition performance of a sensor system, a set of heuristics (essentially a reformulation of Johnson's criteria applicable to all sensor and data types) is developed. An approach to quantifying the information content of sensor data is described. Coupling this approach with the widely accepted Johnson's criteria for target recognition capabilities results in a quantitative method for comparing the target recognition ability of diverse sensors (imagers, nonimagers, active, passive, electromagnetic, acoustic, etc.). Extension to describing the performance of multiple sensors is straightforward. The application of the technique to sensor selection and requirements allocation is discussed.
Ligocka, D; Lison, D; Haufroid, V
2002-10-05
The aim of this work was to validate a sensitive method for the quantitative analysis of 5-hydroxy-N-methylpyrrolidone (5-HNMP) in urine. This compound has been recommended as a marker for the biological monitoring of N-methylpyrrolidone (NMP) exposure. Different solvents and alternative methods of extraction, including liquid-liquid extraction (LLE) on Chem Elut and solid-phase extraction (SPE) on Oasis HLB columns, were tested. The most efficient extraction of 5-HNMP from urine was LLE with Chem Elut columns and dichloromethane as the solvent (a consistent recovery of 22%). The urinary extracts were derivatized with bis(trimethylsilyl)trifluoroacetamide and analysed by gas chromatography-mass spectrometry (GC-MS) with tetradeuterated 5-HNMP as an internal standard. The detection limit of this method is 0.017 mg/l urine with an intra-assay precision of 1.6-2.6%. The proposed method of extraction is simple and reproducible. Four different m/z signal ratios of TMS-5-HNMP and tetralabelled TMS-5-HNMP have been validated and could be used interchangeably in case of unexpected impurities from the urine matrix. Copyright 2002 Elsevier Science B.V.
Piatak, N.M.; Seal, R.R.; Sanzolone, R.F.; Lamothe, P.J.; Brown, Z.A.; Adams, M.
2007-01-01
We report results from sequential extraction experiments and the quantitative mineralogy for samples of stream sediments and mine wastes collected from metal mines. Samples were from the Elizabeth, Ely Copper, and Pike Hill Copper mines in Vermont, the Callahan Mine in Maine, and the Martha Mine in New Zealand. The extraction technique targeted the following operationally defined fractions and solid-phase forms: (1) soluble, adsorbed, and exchangeable fractions; (2) carbonates; (3) organic material; (4) amorphous iron- and aluminum-hydroxides and crystalline manganese-oxides; (5) crystalline iron-oxides; (6) sulfides and selenides; and (7) residual material. For most elements, the sum of an element from all extraction steps correlated well with the original unleached concentration. Also, the quantitative mineralogy of the original material compared to that of the residues from two extraction steps gave insight into the effectiveness of reagents at dissolving targeted phases. The data are presented here with minimal interpretation or discussion and further analyses and interpretation will be presented elsewhere.
Chen, Huiping; Li, Xuewen; Xu, Yongli; Lo, Kakei; Zheng, Huizhen; Hu, Haiyan; Wang, Jun; Lin, Yongcheng
2018-05-15
The polar extract of Dendrobium species or F. fimbriata (a substitute for Dendrobium), which lies between the fat-soluble extract and the polysaccharide fraction, has barely been researched. This report presents qualitative and quantitative studies of polar extracts from D. nobile, D. officinale, D. loddigesii, and F. fimbriata. Eight water-soluble metabolites, including a new diglucoside, flifimdioside A (1), and a rare imidazolium-type alkaloid, anosmine (4), were identified using chromatographic and spectroscopic techniques. Their contents in the four herbs were high, approximately 0.9-3.7 mg/g, based on quantitative nuclear magnetic resonance (qNMR) spectroscopy. Biological activity evaluation showed that the polar extract of F. fimbriata or its pure components had good antioxidant and neuroprotective activity; compounds 1-4 and shihunine (8) showed weak α-glucosidase inhibitory activity; 4 and 8 had weak anti-inflammatory activity. Under the trial conditions, no sample showed cytotoxic activity.
Kline, Margaret C; Duewer, David L; Travis, John C; Smith, Melody V; Redman, Janette W; Vallone, Peter M; Decker, Amy E; Butler, John M
2009-06-01
Modern highly multiplexed short tandem repeat (STR) assays used by the forensic human-identity community require tight control of the initial amount of sample DNA amplified in the polymerase chain reaction (PCR) process. This, in turn, requires the ability to reproducibly measure the concentration of human DNA, [DNA], in a sample extract. Quantitative PCR (qPCR) techniques can determine the number of intact stretches of DNA of specified nucleotide sequence in an extremely small sample; however, these assays must be calibrated with DNA extracts of well-characterized and stable composition. By 2004, studies coordinated by or reported to the National Institute of Standards and Technology (NIST) indicated that a well-characterized, stable human DNA quantitation certified reference material (CRM) could help the forensic community reduce within- and among-laboratory quantitation variability. To ensure that the stability of such a quantitation standard can be monitored and that, if and when required, equivalent replacement materials can be prepared, a measurement of some stable quantity directly related to [DNA] is required. Using a long-established conventional relationship linking optical density (properly designated as decadic attenuance) at 260 nm with [DNA] in aqueous solution, NIST Standard Reference Material (SRM) 2372 Human DNA Quantitation Standard was issued in October 2007. This SRM consists of three quite different DNA extracts: a single-source male, a multiple-source female, and a mixture of male and female sources. All three SRM components have very similar optical densities, and thus very similar conventional [DNA]. The materials perform very similarly in several widely used gender-neutral assays, demonstrating that the combination of appropriate preparation methods and metrologically sound spectrophotometric measurements enables the preparation and certification of quantitation [DNA] standards that are both maintainable and of practical utility.
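The SRM certification rests on the long-established conventional relationship between decadic attenuance at 260 nm and DNA concentration. A minimal sketch of that conventional conversion follows; the factor of roughly 50 ng/µL per absorbance unit for double-stranded DNA is the textbook convention, not the SRM-specific certified value.

```python
def dsdna_conc_ng_per_ul(a260, dilution_factor=1.0, factor=50.0):
    """Conventional estimate of double-stranded DNA concentration from the
    optical density (decadic attenuance) at 260 nm: A260 of 1.0 ~ 50 ng/uL."""
    return a260 * factor * dilution_factor

# Example: an extract diluted 10-fold reading A260 = 0.12
print(dsdna_conc_ng_per_ul(0.12, dilution_factor=10))  # ~60 ng/uL in the undiluted extract
```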
Technique for quantitative RT-PCR analysis directly from single muscle fibers.
Wacker, Michael J; Tehel, Michelle M; Gallagher, Philip M
2008-07-01
The use of single-cell quantitative RT-PCR has greatly aided the study of gene expression in fields such as muscle physiology. For this study, we hypothesized that single muscle fibers from a biopsy can be placed directly into the reverse transcription buffer and that gene expression data can be obtained without having to first extract the RNA. To test this hypothesis, biopsies were taken from the vastus lateralis of five male subjects. Single muscle fibers were isolated and underwent RNA isolation (technique 1) or placed directly into reverse transcription buffer (technique 2). After cDNA conversion, individual fiber cDNA was pooled and quantitative PCR was performed using primer-probes for beta(2)-microglobulin, glyceraldehyde-3-phosphate dehydrogenase, insulin-like growth factor I receptor, and glucose transporter subtype 4. The no RNA extraction method provided similar quantitative PCR data as that of the RNA extraction method. A third technique was also tested in which we used one-quarter of an individual fiber's cDNA for PCR (not pooled) and the average coefficient of variation between fibers was <8% (cycle threshold value) for all genes studied. The no RNA extraction technique was tested on isolated muscle fibers using a gene known to increase after exercise (pyruvate dehydrogenase kinase 4). We observed a 13.9-fold change in expression after resistance exercise, which is consistent with what has been previously observed. These results demonstrate a successful method for gene expression analysis directly from single muscle fibers.
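The abstract reports a 13.9-fold change but does not spell out the calculation; fold changes from qPCR Ct values are commonly computed with the comparative 2^-ΔΔCt method, sketched here with hypothetical Ct values. Both the method choice and the numbers are illustrative assumptions, not the authors' data.

```python
def fold_change_ddct(ct_target_post, ct_ref_post, ct_target_pre, ct_ref_pre):
    """2^-ddCt relative quantification: target gene normalized to a reference
    gene (e.g., beta-2-microglobulin), post-exercise relative to pre-exercise."""
    d_ct_post = ct_target_post - ct_ref_post
    d_ct_pre = ct_target_pre - ct_ref_pre
    return 2.0 ** -(d_ct_post - d_ct_pre)

# Hypothetical Ct values illustrating a ~14-fold induction
print(round(fold_change_ddct(24.2, 18.0, 28.0, 18.0), 1))
```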
High-throughput quantitative analysis by desorption electrospray ionization mass spectrometry.
Manicke, Nicholas E; Kistler, Thomas; Ifa, Demian R; Cooks, R Graham; Ouyang, Zheng
2009-02-01
A newly developed high-throughput desorption electrospray ionization (DESI) source was characterized in terms of its performance in quantitative analysis. A 96-sample array, containing pharmaceuticals in various matrices, was analyzed in a single run with a total analysis time of 3 min. These solution-phase samples were examined from a hydrophobic PTFE ink printed on glass. The quantitative accuracy, precision, and limit of detection (LOD) were characterized. Chemical background-free samples of propranolol (PRN) with PRN-d(7) as internal standard (IS) and carbamazepine (CBZ) with CBZ-d(10) as IS were examined. So were two other sample sets consisting of PRN/PRN-d(7) at varying concentration in a biological milieu of 10% urine or porcine brain total lipid extract, total lipid concentration 250 ng/microL. The background-free samples, examined in a total analysis time of 1.5 s/sample, showed good quantitative accuracy and precision, with a relative error (RE) and relative standard deviation (RSD) generally less than 3% and 5%, respectively. The samples in urine and the lipid extract required a longer analysis time (2.5 s/sample) and showed RSD values of around 10% for the samples in urine and 4% for the lipid extract samples and RE values of less than 3% for both sets. The LOD for PRN and CBZ when analyzed without chemical background was 10 and 30 fmol, respectively. The LOD of PRN increased to 400 fmol analyzed in 10% urine, and 200 fmol when analyzed in the brain lipid extract.
Comparison of salivary collection and processing methods for quantitative HHV-8 detection.
Speicher, D J; Johnson, N W
2014-10-01
Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies per μl of DNA. The quantitative and long-term storage capability of this system makes it ideal for study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
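A hedged sketch of how viral load is typically read from such serial-dilution calibration curves (Ct regressed against log10 copies); the Ct values below are hypothetical, not data from the study.

```python
import numpy as np

# Hypothetical standard curve: Ct values measured for 10-fold dilutions of a
# cloned construct (copies per microlitre of extracted DNA).
log10_copies = np.log10([1e5, 1e4, 1e3, 1e2, 1e1])
ct = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

slope, intercept = np.polyfit(log10_copies, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0        # ~1.0 means 100% amplification efficiency

def copies_from_ct(sample_ct):
    """Interpolate an unknown sample's viral load from the standard curve."""
    return 10 ** ((sample_ct - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.0%}, load={copies_from_ct(26.0):.0f} copies/uL")
```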
Detection of propofol concentrations in blood by Raman spectroscopy
NASA Astrophysics Data System (ADS)
Wróbel, M. S.; Gnyba, M.; UrniaŻ, R.; Myllylä, T. S.; Jedrzejewska-Szczerska, M.
2015-07-01
In this paper we present a proof of concept of a Raman spectroscopy-based approach for measuring the content of propofol, a common anesthesia drug, in whole human blood and plasma, intended for use during clinical procedures. The method uses Raman spectroscopy as a chemically sensitive technique for qualitative detection of the presence of the drug and quantitative determination of its concentration. Samples from different patients spiked with various concentrations of propofol IV solution were measured, closely approximating a real in vivo situation. Subsequent analysis of the spectra was carried out to extract qualitative and quantitative information. We conclude that the changes in the spectra of blood with propofol overlap with the most prominent lines of the propofol solution, especially in the spectral regions 1450 cm-1, 1250-1260 cm-1, 1050 cm-1, 875-910 cm-1, and 640 cm-1. We then introduced a quantitative analysis program based on a correlation-matrix closest fit with leave-one-out (LOO) cross-validation, achieving 36.67% and 60% model precision when considering full spectra or specified bands, respectively. These results demonstrate the feasibility of using Raman spectroscopy for quantitative detection of propofol concentrations in whole human blood.
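The correlation-matrix closest-fit classifier with LOO cross-validation is only named in the abstract; the following is a rough sketch of one plausible reading (assign each held-out spectrum to the class whose mean reference spectrum correlates with it best), not the authors' actual program.

```python
import numpy as np

def loo_closest_correlation(spectra, labels):
    """Leave-one-out evaluation of a closest-fit classifier: each held-out
    spectrum is assigned to the class whose mean spectrum (computed without it)
    has the highest Pearson correlation with it. Returns the accuracy."""
    spectra = np.asarray(spectra, dtype=float)
    labels = np.asarray(labels)
    correct = 0
    for i in range(len(spectra)):
        mask = np.arange(len(spectra)) != i
        classes = np.unique(labels[mask])
        refs = [spectra[mask & (labels == c)].mean(axis=0) for c in classes]
        corrs = [np.corrcoef(spectra[i], r)[0, 1] for r in refs]
        if classes[int(np.argmax(corrs))] == labels[i]:
            correct += 1
    return correct / len(spectra)
```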
Beveridge, Thomas H J; Girard, Benoit; Kopp, Thomas; Drover, John C G
2005-03-09
Grape seed has well-known potential for oil production, and grape seed oil is currently produced as a specialty byproduct of wine manufacture. Seed oils from eight varieties of grapes crushed for wine production in British Columbia were extracted by supercritical carbon dioxide (SCE) and petroleum ether (PE). Oil yields by SCE ranged from 5.85 +/- 0.33 to 13.6 +/- 0.46% (w/w), whereas PE yields ranged from 6.64 +/- 0.16 to 11.17 +/- 0.05% (+/- is standard deviation). The oils contained alpha-, beta-, and gamma-tocopherols and alpha- and gamma-tocotrienols, with gamma-tocotrienol being the most important quantitatively. In both SCE- and PE-extracted oils, phytosterols were a prominent feature of the unsaponifiable fraction, with beta-sitosterol quantitatively the most important with both extractants. Total phytosterol extraction was higher with SCE than with PE in seven of the eight variety extractions. Fatty acid composition of oils from all varieties tested, and from both extraction methods, indicated linoleic acid as the major component, ranging from 67.56 to 73.23% of the fatty acids present, in agreement with literature reports.
Folks, Russell D; Savir-Baruch, Bital; Garcia, Ernest V; Verdes, Liudmila; Taylor, Andrew T
2012-12-01
Our objective was to design and implement a clinical history database capable of linking to our database of quantitative results from (99m)Tc-mercaptoacetyltriglycine (MAG3) renal scans and export a data summary for physicians or our software decision support system. For database development, we used a commercial program. Additional software was developed in Interactive Data Language. MAG3 studies were processed using an in-house enhancement of a commercial program. The relational database has 3 parts: a list of all renal scans (the RENAL database), a set of patients with quantitative processing results (the Q2 database), and a subset of patients from Q2 containing clinical data manually transcribed from the hospital information system (the CLINICAL database). To test interobserver variability, a second physician transcriber reviewed 50 randomly selected patients in the hospital information system and tabulated 2 clinical data items: hydronephrosis and presence of a current stent. The CLINICAL database was developed in stages and contains 342 fields comprising demographic information, clinical history, and findings from up to 11 radiologic procedures. A scripted algorithm is used to reliably match records present in both Q2 and CLINICAL. An Interactive Data Language program then combines data from the 2 databases into an XML (extensible markup language) file for use by the decision support system. A text file is constructed and saved for review by physicians. RENAL contains 2,222 records, Q2 contains 456 records, and CLINICAL contains 152 records. The interobserver variability testing found a 95% match between the 2 observers for presence or absence of ureteral stent (κ = 0.52), a 75% match for hydronephrosis based on narrative summaries of hospitalizations and clinical visits (κ = 0.41), and a 92% match for hydronephrosis based on the imaging report (κ = 0.84). We have developed a relational database system to integrate the quantitative results of MAG3 image processing with clinical records obtained from the hospital information system. We also have developed a methodology for formatting clinical history for review by physicians and export to a decision support system. We identified several pitfalls, including the fact that important textual information extracted from the hospital information system by knowledgeable transcribers can show substantial interobserver variation, particularly when record retrieval is based on the narrative clinical records.
Empirical advances with text mining of electronic health records.
Delespierre, T; Denormandie, P; Bar-Hen, A; Josseran, L
2017-08-22
Korian is a private group specializing in medical accommodations for elderly and dependent people. A professional data warehouse (DWH) established in 2010 hosts all of the residents' data. Inside this information system (IS), clinical narratives (CNs) were used only by medical staff as a residents' care linking tool. The objective of this study was to show that, through qualitative and quantitative textual analysis of a relatively small, well-defined sample of physiotherapy CNs, it was possible to build a physiotherapy corpus and, through this process, generate a new body of knowledge by adding relevant information to describe the residents' care and lives. Meaningful words were extracted through Structured Query Language (SQL) with the LIKE function and wildcards to perform pattern matching, followed by text mining and a word cloud using R packages. Another step involved principal components and multiple correspondence analyses, plus clustering on the same residents' sample as well as on other health data using a health model measuring the residents' care level needs. By combining these techniques, physiotherapy treatments could be characterized by a list of constructed keywords, and the residents' health characteristics were built. Feeding defects or health outlier groups could be detected, physiotherapy residents' data and their health data were matched, and differences in health situations showed qualitative and quantitative differences in physiotherapy narratives. This two-stage textual experiment showed that text mining and data mining techniques provide convenient tools to improve residents' health and quality of care by adding new, simple, useable data to the electronic health record (EHR). When used with a normalized physiotherapy problem list, text mining through information extraction (IE), named entity recognition (NER) and data mining (DM) can provide a real advantage in describing health care, adding new medical material and helping to integrate the EHR system into the health staff work environment.
Quantitative Analysis of Cotton Canopy Size in Field Conditions Using a Consumer-Grade RGB-D Camera.
Jiang, Yu; Li, Changying; Paterson, Andrew H; Sun, Shangpeng; Xu, Rui; Robertson, Jon
2017-01-01
Plant canopy structure can strongly affect crop functions such as yield and stress tolerance, and canopy size is an important aspect of canopy structure. Manual assessment of canopy size is laborious and imprecise, and cannot measure multi-dimensional traits such as projected leaf area and canopy volume. Field-based high throughput phenotyping systems with imaging capabilities can rapidly acquire data about plants in field conditions, making it possible to quantify and monitor plant canopy development. The goal of this study was to develop a 3D imaging approach to quantitatively analyze cotton canopy development in field conditions. A cotton field was planted with 128 plots, including four genotypes of 32 plots each. The field was scanned by GPhenoVision (a customized field-based high throughput phenotyping system) to acquire color and depth images with GPS information in 2016 covering two growth stages: canopy development, and flowering and boll development. A data processing pipeline was developed, consisting of three steps: plot point cloud reconstruction, plant canopy segmentation, and trait extraction. Plot point clouds were reconstructed using color and depth images with GPS information. In colorized point clouds, vegetation was segmented from the background using an excess-green (ExG) color filter, and cotton canopies were further separated from weeds based on height, size, and position information. Static morphological traits were extracted on each day, including univariate traits (maximum and mean canopy height and width, projected canopy area, and concave and convex volumes) and a multivariate trait (cumulative height profile). Growth rates were calculated for univariate static traits, quantifying canopy growth and development. Linear regressions were performed between the traits and fiber yield to identify the best traits and measurement time for yield prediction. The results showed that fiber yield was correlated with static traits after the canopy development stage (R2 = 0.35-0.71) and growth rates in early canopy development stages (R2 = 0.29-0.52). Multi-dimensional traits (e.g., projected canopy area and volume) outperformed one-dimensional traits, and the multivariate trait (cumulative height profile) outperformed univariate traits. The proposed approach would be useful for identification of quantitative trait loci (QTLs) controlling canopy size in genetics/genomics studies or for fiber yield prediction in breeding programs and production environments.
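The excess-green (ExG) filter named in the pipeline has a standard form; a minimal numpy sketch follows, with the threshold value assumed for illustration rather than taken from the study.

```python
import numpy as np

def excess_green_mask(rgb, threshold=0.1):
    """Excess-green vegetation index ExG = 2g - r - b on chromaticity-normalized
    channels; pixels above the threshold are treated as vegetation."""
    rgb = rgb.astype(float)
    total = rgb.sum(axis=2) + 1e-9             # avoid division by zero
    r, g, b = (rgb[..., i] / total for i in range(3))
    exg = 2 * g - r - b
    return exg > threshold

# Usage sketch: mask = excess_green_mask(color_image); keep only masked points
# of the colorized plot point cloud before height/size/position filtering.
```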
2014-01-01
Background: Left pulmonary artery sling (LPAS) is a rare but severe congenital anomaly in which stenoses form in the trachea and/or main bronchi. Multi-detector computed tomography (MDCT) provides useful anatomical images but does not offer functional information. The objective of the present study is to quantitatively analyze the airflow in the trachea and main bronchi of LPAS subjects through computational fluid dynamics (CFD) simulation. Methods: Five subjects (four LPAS patients, one normal control) aged 6-19 months are analyzed. The geometric model of the trachea and the two main bronchi is extracted from the MDCT images. The inlet velocity is determined based on the body weight and the inlet area. Both the geometric model and personalized inflow conditions are imported into the CFD software ANSYS. The pressure drop, mass flow ratio through the two bronchi, wall pressure, flow velocity and wall shear stress (WSS) are obtained and compared to the normal control. Results: Due to the tracheal and/or bronchial stenosis, the pressure drop for the LPAS patients ranges from 78.9 to 914.5 Pa, much higher than for the normal control (0.7 Pa). The mass flow ratio through the two bronchi does not correlate with the sectional area ratio if the anomalous left pulmonary artery compresses the trachea or bronchi. It is suggested that the C-shaped trachea plays an important role in facilitating airflow into the left bronchus through inertial force. For LPAS subjects, the distributions of velocities, wall pressure and WSS are less regular than for the normal control. At the stenotic site, high velocity, low wall pressure and high WSS are observed. Conclusions: Using geometric models extracted from CT images and patient-specific inlet boundary conditions, CFD simulation can provide vital quantitative flow information for LPAS. Due to the stenosis, high pressure drops and inconsistent distributions of velocities, wall pressure and WSS are observed. The C-shaped trachea may facilitate a larger flow of air into the left bronchus under inertial force and decrease the ventilation of the right lung. Quantitative and personalized information may help in understanding the mechanism of LPAS and the correlations between stenosis and dyspnea, and facilitate the structural and functional assessment of LPAS. PMID:24957947
Sinha, Arun Kumar; Verma, Subash Chandra; Sharma, Upendra Kumar
2007-01-01
A simple and fast method was developed using RP-HPLC for separation and quantitative determination of vanillin and related phenolic compounds in ethanolic extract of pods of Vanilla planifolia. Ten phenolic compounds, namely 4-hydroxybenzyl alcohol, vanillyl alcohol, 3,4-dihydroxybenzaldehyde, 4-hydroxybenzoic acid, vanillic acid, 4-hydroxybenzaldehyde, vanillin, p-coumaric acid, ferulic acid, and piperonal were quantitatively determined using ACN, methanol, and 0.2% acetic acid in water as a mobile phase with a gradient elution mode. The method showed good linearity, high precision, and good recovery of compounds of interest. The present method would be useful for analytical research and for routine analysis of vanilla extracts for their quality control.
NASA Astrophysics Data System (ADS)
Federici, Antoine; Aknoun, Sherazade; Savatier, Julien; Wattellier, Benoit F.
2017-02-01
Quadriwave lateral shearing interferometry (QWLSI) is a well-established quantitative phase imaging (QPI) technique based on the analysis of the interference patterns of four diffraction orders generated by an optical grating set in front of an array detector [1]. As a QPI modality, it is a non-invasive imaging technique that allows measurement of the optical path difference (OPD) of semi-transparent samples. We present a system enabling QWLSI with high-performance sCMOS cameras [2] and apply it to high-speed, low-noise, and multimodal imaging. This modified QWLSI system contains a versatile optomechanical device that images the optical grating near the detector plane; the device can be coupled with any kind of camera by varying its magnification. In this paper, we study the use of an Andor Zyla5.5 sCMOS camera with our modified QWLSI system. We present high-speed live cell imaging, up to a 200 Hz frame rate, to follow fast intracellular motions while measuring quantitative phase information. The structural and density information extracted from the OPD signal is complementary to the specific and localized fluorescence signal [2]. In addition, QPI detects cells even when the fluorophore is not expressed, which is very useful for following protein expression over time. Because the 10 µm spatial pixel resolution of our modified QWLSI, combined with the high sensitivity of the Zyla5.5, enables high-quality fluorescence imaging, we have carried out multimodal imaging revealing fine cell structures, such as actin filaments, merged with the morphological information of the phase. References: [1] P. Bon, G. Maucort, B. Wattellier, and S. Monneret, "Quadriwave lateral shearing interferometry for quantitative phase microscopy of living cells," Opt. Express, vol. 17, pp. 13080-13094, 2009. [2] P. Bon, S. Lécart, E. Fort, and S. Lévêque-Fort, "Fast label-free cytoskeletal network imaging in living mammalian cells," Biophysical Journal, 106(8), pp. 1588-1595, 2014.
Tailings dam-break flow - Analysis of sediment transport
NASA Astrophysics Data System (ADS)
Aleixo, Rui; Altinakar, Mustafa
2015-04-01
A common solution to store mining debris is to build tailings dams near the mining site. These dams are usually built with local materials such as mining debris and are more vulnerable than concrete dams (Rico et al. 2008). The tailings and the pond water generally contain heavy metals and various toxic chemicals used in ore extraction. Thus, the release of tailings due to a dam-break can have severe ecological consequences for the environment. A tailings dam-break has many similarities with a common dam-break flow: it is highly transient and can be severely destructive. However, a significant difference is that the released sediment-water mixture behaves as a non-Newtonian flow. Existing numerical models used to simulate dam-break flows do not correctly represent the non-Newtonian behavior of tailings under a dam-break flow and may lead to unrealistic and incorrect results. The need for experiments to extract both qualitative and quantitative information regarding these flows is therefore real and pressing. The present paper explores an existing experimental database presented in Aleixo et al. (2014a,b) to further characterize sediment transport under severe transient flow conditions and to extract quantitative information regarding sediment flow rate, sediment velocity, and sediment-sediment interactions, among others. Different features of the flow are also described and analyzed in detail. The analysis is made by means of imaging techniques such as Particle Image Velocimetry and Particle Tracking Velocimetry, which allow extracting not only the velocity field but also the Lagrangian description of the sediments. An analysis of the results is presented and the limitations of the presented experimental approach are discussed. References: Rico, M., Benito, G., Salgueiro, A.R., Díez-Herrero, A. and Pereira, H.G. (2008) Reported tailings dam failures: A review of the European incidents in the worldwide context, Journal of Hazardous Materials, 152, 846-852. Aleixo, R., Ozeren, Y., Altinakar, M. and Wren, D. (2014a) Velocity measurements using particle tracking in tailings dam failure experiments, Proceedings of the 3rd IAHR-Europe Conference, Porto, Portugal. Aleixo, R., Ozeren, Y., Altinakar, M. (2014b) Tailings dam-break analysis by means of a combined PIV-PTV tool, Proceedings of the River Flow Conference, Lausanne, Switzerland.
Hirunpanich, Vilasinee; Utaipat, Anocha; Morales, Noppawan Phumala; Bunyapraphatsara, Nuntavan; Sato, Hitoshi; Herunsalee, Angkana; Suthisisang, Chuthamanee
2005-03-01
The present study quantitatively investigated the antioxidant effects of the aqueous extracts from dried calyx of Hibiscus sabdariffa LINN. (roselle) in vitro using rat low-density lipoprotein (LDL). Formations of the conjugated dienes and thiobarbituric acid reactive substances (TBARs) were monitored as markers of the early and later stages of the oxidation of LDL, respectively. Thus, we demonstrated that the dried calyx extracts of roselle exhibits strong antioxidant activity in Cu(2+)-mediated oxidation of LDL (p<0.05) in vitro. The inhibitory effect of the extracts on LDL oxidation was dose-dependent at concentrations ranging from 0.1 to 5 mg/ml. Moreover, 5 mg/ml of roselle inhibited TBARs-formation with greater potency than 100 microM of vitamin E. In conclusion, this study provides a quantitative insight into the potent antioxidant effect of roselle in vitro.
Sklerov, J H; Kalasinsky, K S; Ehorn, C A
1999-10-01
A confirmatory method for the detection and quantitation of lysergic acid diethylamide (LSD) is presented. The method employs gas chromatography-tandem mass spectrometry (GC-MS-MS) using an internal ionization ion trap detector for sensitive MS-MS-in-time measurements of LSD extracted from urine. Following a single-step solid-phase extraction of 5 mL of urine, underivatized LSD can be measured with limits of quantitation and detection of 80 and 20 pg/mL, respectively. Temperature-programmed on-column injections of urine extracts were linear over the concentration range 20-2000 pg/mL (r2 = 0.999). Intraday and interday coefficients of variation were < 6% and < 13%, respectively. This procedure has been applied to quality-control specimens and LSD-positive samples in this laboratory. Comparisons with alternate GC-MS methods and extraction procedures are discussed.
A highly efficient, cell-free translation/translocation system prepared from Xenopus eggs.
Matthews, G; Colman, A
1991-01-01
We describe the use of a Xenopus laevis egg extract for the in vitro translation and post-translational modification of membrane and secretory proteins. This extract is capable of the translation and segregation into membranes of microgram per millilitre levels of protein from added mRNAs. Signal sequences of segregated proteins are efficiently cleaved and appropriate N-linked glycosylation patterns are produced. The extract also supports the quantitative assembly of murine immunoglobulin heavy and light chains into tetramers, and two events which take place beyond the endoplasmic reticulum, mannose-6-phosphorylation of murine cathepsin D and O-linked glycosylation of coronavirus E1 protein, also occur, but at reduced efficiency. The stability of the membranes allows protease protection studies and quantitative centrifugal fractionation of segregated and unsegregated proteins to be performed. Conditions for the use of stored extract have also been determined. PMID:1754376
2017-01-01
Chemical standardization, along with morphological and DNA analysis ensures the authenticity and advances the integrity evaluation of botanical preparations. Achievement of a more comprehensive, metabolomic standardization requires simultaneous quantitation of multiple marker compounds. Employing quantitative 1H NMR (qHNMR), this study determined the total isoflavone content (TIfCo; 34.5–36.5% w/w) via multimarker standardization and assessed the stability of a 10-year-old isoflavone-enriched red clover extract (RCE). Eleven markers (nine isoflavones, two flavonols) were targeted simultaneously, and outcomes were compared with LC-based standardization. Two advanced quantitative measures in qHNMR were applied to derive quantities from complex and/or overlapping resonances: a quantum mechanical (QM) method (QM-qHNMR) that employs 1H iterative full spin analysis, and a non-QM method that uses linear peak fitting algorithms (PF-qHNMR). A 10 min UHPLC-UV method provided auxiliary orthogonal quantitation. This is the first systematic evaluation of QM and non-QM deconvolution as qHNMR quantitation measures. It demonstrates that QM-qHNMR can account successfully for the complexity of 1H NMR spectra of individual analytes and how QM-qHNMR can be built for mixtures such as botanical extracts. The contents of the main bioactive markers were in good agreement with earlier HPLC-UV results, demonstrating the chemical stability of the RCE. QM-qHNMR advances chemical standardization by its inherent QM accuracy and the use of universal calibrants, avoiding the impractical need for identical reference materials. PMID:28067513
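A minimal sketch of the internal-calibrant relation underlying qHNMR quantitation follows; the calibrant, proton counts, masses, and integrals below are hypothetical illustrations, not values from the study.

```python
def qhnmr_content_percent(i_analyte, n_analyte, m_analyte,
                          i_cal, n_cal, m_cal,
                          mass_cal_mg, mass_sample_mg, purity_cal=1.0):
    """Internal-calibrant qHNMR: analyte content (% w/w) from integral ratios.
    i = integral, n = number of protons giving that signal, m = molar mass."""
    analyte_mg = (i_analyte / n_analyte) / (i_cal / n_cal) \
                 * (m_analyte / m_cal) * mass_cal_mg * purity_cal
    return 100.0 * analyte_mg / mass_sample_mg

# Hypothetical example for one isoflavone marker against a dimethyl sulfone calibrant
print(round(qhnmr_content_percent(0.083, 1, 284.3, 6.0, 6, 94.13, 2.0, 10.0), 1))  # ~5.0 % w/w
```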
Bieri, Stefan; Ilias, Yara; Bicchi, Carlo; Veuthey, Jean-Luc; Christen, Philippe
2006-04-21
An effective combination of focused microwave-assisted extraction (FMAE) with solid-phase microextraction (SPME) prior to gas chromatography (GC) is described for the selective extraction and quantitative analysis of cocaine from coca leaves (Erythroxylum coca). This approach required switching from an organic extraction solvent to an aqueous medium more compatible with SPME liquid sampling. SPME was performed in the direct immersion mode with a universal 100 microm polydimethylsiloxane (PDMS) coated fibre. Parameters influencing this extraction step, such as solution pH, sampling time and temperature are discussed. Furthermore, the overall extraction process takes into account the stability of cocaine in alkaline aqueous solutions at different temperatures. Cocaine degradation rate was determined by capillary electrophoresis using the short end injection procedure. In the selected extraction conditions, less than 5% of cocaine was degraded after 60 min. From a qualitative point of view, a significant gain in selectivity was obtained with the incorporation of SPME in the extraction procedure. As a consequence of SPME clean-up, shorter columns could be used and analysis time was reduced to 6 min compared to 35 min with conventional GC. Quantitative results led to a cocaine content of 0.70 +/- 0.04% in dry leaves (RSD <5%) which agreed with previous investigations.
NASA Astrophysics Data System (ADS)
Yan, X. L.; Coetsee, E.; Wang, J. Y.; Swart, H. C.; Terblans, J. J.
2017-07-01
Polycrystalline Ni/Cu multilayer thin films consisting of 8 alternating layers of Ni and Cu were deposited on a SiO2 substrate by electron beam evaporation in a high vacuum. Concentration-depth profiles of the as-deposited multilayered Ni/Cu thin films were determined with Auger electron spectroscopy (AES) in combination with Ar+ ion sputtering, under various bombardment conditions, with the samples stationary and, in some cases, rotating. The Mixing-Roughness-Information depth (MRI) model used to fit the concentration-depth profiles accounts for the interface broadening of the experimental depth profiling, which incorporates the effects of atomic mixing, surface roughness and the information depth of the Auger electrons. The roughness values extracted from the MRI model fit of the depth profiling data agree well with those measured by atomic force microscopy (AFM). The ion-sputtering-induced surface roughness during depth profiling was accordingly evaluated quantitatively from the fitted MRI parameters under rotating and stationary sample conditions. The depth resolutions of the AES depth profiles were derived directly from the fitted MRI model parameters.
NASA Astrophysics Data System (ADS)
Lee, Minsuk; Won, Youngjae; Park, Byungjun; Lee, Seungrag
2017-02-01
Not only the static but also the dynamic characteristics of the red blood cell (RBC) contain useful information for blood diagnosis. Quantitative phase imaging (QPI) can capture sample images with subnanometer-scale depth resolution and millisecond-scale temporal resolution. Various studies have used QPI for RBC diagnosis, and recent work has sought to decrease the processing time of RBC information extraction from QPI using parallel computing algorithms; however, previous studies focused on static parameters such as cell morphology or on simple dynamic parameters such as the root mean square (RMS) of the membrane fluctuations. Previously, we presented a practical blood test method using time-series correlation analysis of RBC membrane flickering with QPI; however, its long computation time limited clinical application. In this study, we present an accelerated time-series correlation analysis of RBC membrane flickering using a parallel computing algorithm. The method yielded fractal scaling exponents for the surrounding medium and normal RBCs consistent with our previous research.
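The abstract does not describe the correlation analysis in detail; as a loose illustration only, a fractal scaling (Hurst-like) exponent can be estimated from a membrane height time series by fitting a power law to its RMS increments, as sketched below. This is an assumed stand-in, not the authors' algorithm.

```python
import numpy as np

def scaling_exponent(signal, max_lag=64):
    """Rough fractal scaling (Hurst-like) exponent of a membrane height time
    series: slope of log F(tau) vs log tau, where F is the RMS increment over
    lag tau (a first-order structure function)."""
    signal = np.asarray(signal, dtype=float)
    lags = np.arange(1, max_lag)
    f = np.array([np.sqrt(np.mean((signal[lag:] - signal[:-lag]) ** 2)) for lag in lags])
    slope, _ = np.polyfit(np.log(lags), np.log(f), 1)
    return slope

# Applying this per pixel of a (time, y, x) height-map stack, ideally in a
# vectorized or parallel pass, is the kind of speed-up the abstract targets.
```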
Xu, Shuoyu; Kang, Chiang Huen; Gou, Xiaoli; Peng, Qiwen; Yan, Jie; Zhuo, Shuangmu; Cheng, Chee Leong; He, Yuting; Kang, Yuzhan; Xia, Wuzheng; So, Peter T C; Welsch, Roy; Rajapakse, Jagath C; Yu, Hanry
2016-04-01
Liver surface is covered by a collagenous layer called the Glisson's capsule. The structure of the Glisson's capsule is barely seen in biopsy samples for histology assessment, thus the changes of the collagen network from the Glisson's capsule during liver disease progression are not well studied. In this report, we investigated whether non-linear optical imaging of the Glisson's capsule at the liver surface would yield sufficient information to allow quantitative staging of liver fibrosis. In contrast to conventional tissue sections, in which tissues are cut perpendicular to the liver surface and interior information from the liver biopsy samples is used, we established a capsule index based on significant parameters extracted from second harmonic generation (SHG) microscopy images of capsule collagen from the anterior surface of rat livers. A thioacetamide (TAA)-induced liver fibrosis animal model was used in this study. The capsule index is capable of differentiating different fibrosis stages, with area under the receiver operating characteristic curve (AUC) up to 0.91, making it possible to quantitatively stage liver fibrosis via liver surface imaging, potentially with endomicroscopy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Loyo-Rosales, Jorge E.; Torrents, Alba; Rosales-Rivera, Georgina C.; Rice, Clifford C.
2006-01-01
The application of several chemical concepts to the extraction of a water pollutant, OPC (octylphenoxyacetic acid), is presented. As an introduction to the laboratory experiment, a discussion on endocrine disrupters is conducted to familiarize the student with the background of the experiment and to explain the need for the extraction and quantitation of the OPC which…
NASA Astrophysics Data System (ADS)
Fripp, Jurgen; Crozier, Stuart; Warfield, Simon K.; Ourselin, Sébastien
2007-03-01
The accurate segmentation of the articular cartilages from magnetic resonance (MR) images of the knee is important for clinical studies and drug trials into conditions like osteoarthritis. Currently, segmentations are obtained using time-consuming manual or semi-automatic algorithms which have high inter- and intra-observer variabilities. This paper presents an important step towards obtaining automatic and accurate segmentations of the cartilages, namely an approach to automatically segment the bones and extract the bone-cartilage interfaces (BCI) in the knee. The segmentation is performed using three-dimensional active shape models, which are initialized using an affine registration to an atlas. The BCI are then extracted using image information and prior knowledge about the likelihood of each point belonging to the interface. The accuracy and robustness of the approach was experimentally validated using an MR database of fat suppressed spoiled gradient recall images. The (femur, tibia, patella) bone segmentation had a median Dice similarity coefficient of (0.96, 0.96, 0.89) and an average point-to-surface error of 0.16 mm on the BCI. The extracted BCI had a median surface overlap of 0.94 with the real interface, demonstrating its usefulness for subsequent cartilage segmentation or quantitative analysis.
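The reported Dice similarity coefficient has a standard definition on binary masks; a minimal sketch:

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary segmentation masks:
    2|A intersect B| / (|A| + |B|); 1.0 means perfect overlap."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```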
USDA-ARS's Scientific Manuscript database
A wide range of analytical techniques are available for the detection, quantitation, and evaluation of vitamin K in foods. The methods vary from simple to complex depending on extraction, separation, identification and detection of the analyte. Among the extraction methods applied for vitamin K anal...
Temporal dynamics of 2D motion integration for ocular following in macaque monkeys.
Barthélemy, Fréderic V; Fleuriet, Jérome; Masson, Guillaume S
2010-03-01
Several recent studies have shown that extracting pattern motion direction is a dynamical process in which edge motion is first extracted and pattern-related information is encoded with a small time lag by MT neurons. Similar dynamics were found for human reflexive and voluntary tracking. Here, we bring an essential, but still missing, piece of information by documenting macaque ocular following responses to gratings, unikinetic plaids, and barber-poles. We found that ocular tracking was always initiated first in the grating motion direction with ultra-short latencies (approximately 55 ms). A second component was driven only 10-15 ms later, rotating tracking toward the pattern motion direction. At the end of the open-loop period, tracking direction was aligned with the pattern motion direction (plaids) or the average of the line-ending motion directions (barber-poles). We characterized the dependency of each component on contrast. Both the timing and direction of ocular following were quantitatively very consistent with the dynamics of neuronal responses reported by others. Overall, we found a remarkable consistency between neuronal dynamics and monkey behavior, advocating for a direct link between the neuronal solution of the aperture problem and primate perception and action.
Booth, Andrew; Noyes, Jane; Flemming, Kate; Gerhardus, Ansgar; Wahlster, Philip; van der Wilt, Gert Jan; Mozygemba, Kati; Refolo, Pietro; Sacchini, Dario; Tummers, Marcia; Rehfuess, Eva
2018-07-01
To compare and contrast different methods of qualitative evidence synthesis (QES) against criteria identified from the literature and to map their attributes to inform selection of the most appropriate QES method to answer research questions addressed by qualitative research. Electronic databases, citation searching, and a study register were used to identify studies reporting QES methods. Attributes compiled from 26 methodological papers (2001-2014) were used as a framework for data extraction. Data were extracted into summary tables by one reviewer and then considered within the author team. We identified seven considerations determining choice of methods from the methodological literature, encapsulated within the mnemonic Review question-Epistemology-Time/Timescale-Resources-Expertise-Audience and purpose-Type of data. We mapped 15 different published QES methods against these seven criteria. The final framework focuses on stand-alone QES methods but may also hold potential when integrating quantitative and qualitative data. These findings offer a contemporary perspective as a conceptual basis for future empirical investigation of the advantages and disadvantages of different methods of QES. It is hoped that this will inform appropriate selection of QES approaches. Copyright © 2018 Elsevier Inc. All rights reserved.
The 2D analytic signal for envelope detection and feature extraction on ultrasound images.
Wachinger, Christian; Klein, Tassilo; Navab, Nassir
2012-08-01
The fundamental property of the analytic signal is the split of identity, meaning the separation of qualitative and quantitative information in form of the local phase and the local amplitude, respectively. Especially the structural representation, independent of brightness and contrast, of the local phase is interesting for numerous image processing tasks. Recently, the extension of the analytic signal from 1D to 2D, covering also intrinsic 2D structures, was proposed. We show the advantages of this improved concept on ultrasound RF and B-mode images. Precisely, we use the 2D analytic signal for the envelope detection of RF data. This leads to advantages for the extraction of the information-bearing signal from the modulated carrier wave. We illustrate this, first, by visual assessment of the images, and second, by performing goodness-of-fit tests to a Nakagami distribution, indicating a clear improvement of statistical properties. The evaluation is performed for multiple window sizes and parameter estimation techniques. Finally, we show that the 2D analytic signal allows for an improved estimation of local features on B-mode images. Copyright © 2012 Elsevier B.V. All rights reserved.
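For comparison with the 2D analytic signal used in the paper, the conventional per-scanline envelope is the magnitude of the 1D analytic signal obtained with a Hilbert transform; a minimal scipy sketch of that 1D baseline (not the paper's 2D method) follows.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_1d(rf_scanline):
    """Conventional per-scanline envelope detection: the local amplitude is the
    magnitude of the 1D analytic signal; the local phase is its angle."""
    analytic = hilbert(rf_scanline)
    return np.abs(analytic), np.unwrap(np.angle(analytic))

# B-mode style display is typically log-compressed: 20*log10(envelope / envelope.max())
```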
Multi-object segmentation framework using deformable models for medical imaging analysis.
Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel
2016-08-01
Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed framework has a wide range of applications especially in the presence of adjacent structures of interest or under intra-structure inhomogeneities giving excellent quantitative results.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rios Velazquez, E; Parmar, C; Narayan, V
Purpose: To compare the complementary value of quantitative radiomic features to that of radiologist-annotated semantic features in predicting EGFR mutations in lung adenocarcinomas. Methods: Pre-operative CT images of 258 lung adenocarcinoma patients were available. Tumors were segmented using the single-click ensemble segmentation algorithm. A set of radiomic features was extracted using 3D-Slicer. Test-retest reproducibility and unsupervised dimensionality reduction were applied to select a subset of reproducible and independent radiomic features. Twenty semantic annotations were scored by an expert radiologist, describing the tumor, surrounding tissue and associated findings. Minimum-redundancy-maximum-relevance (MRMR) was used to identify the most informative radiomic and semantic features in 172 patients (training set, temporal split). Radiomic, semantic and combined radiomic-semantic logistic regression models to predict EGFR mutations were evaluated in an independent validation dataset of 86 patients using the area under the receiver operating curve (AUC). Results: EGFR mutations were found in 77/172 (45%) and 39/86 (45%) of the training and validation sets, respectively. Univariate AUCs showed a similar range for both feature types: radiomics median AUC = 0.57 (range: 0.50-0.62); semantic median AUC = 0.53 (range: 0.50-0.64, Wilcoxon p = 0.55). After MRMR feature selection, the best-performing radiomic, semantic, and radiomic-semantic logistic regression models for EGFR mutations showed a validation AUC of 0.56 (p = 0.29), 0.63 (p = 0.063) and 0.67 (p = 0.004), respectively. Conclusion: Quantitative volumetric and textural radiomic features complement the qualitative and semi-quantitative radiologist annotations. The prognostic value of informative qualitative semantic features such as cavitation and lobulation is increased with the addition of quantitative textural features from the tumor region.
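A hedged sketch of the model-building step described above (fit a logistic regression on training features, report AUC on the held-out validation set); the pipeline details below are assumptions for illustration, not the authors' exact implementation.

```python
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def validate_auc(x_train, y_train, x_valid, y_valid):
    """Fit a logistic regression on the training split (e.g., combined
    radiomic + semantic features after MRMR selection) and report AUC on the
    held-out validation split."""
    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(x_train, y_train)
    scores = model.predict_proba(x_valid)[:, 1]
    return roc_auc_score(y_valid, scores)
```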
ERIC Educational Resources Information Center
Parker, Patrick D.; Beers, Brandon; Vergne, Matthew J.
2017-01-01
Laboratory experiments were developed to introduce students to the quantitation of drugs of abuse by high performance liquid chromatography-tandem mass spectrometry (LC-MS/MS). Undergraduate students were introduced to internal standard quantitation and the LC-MS/MS method optimization for cocaine. Cocaine extracted from paper currency was…
Yaripour, Saeid; Mohammadi, Ali; Esfanjani, Isa; Walker, Roderick B; Nojavan, Saeed
2018-01-01
In this study, for the first time, an electro-driven microextraction method named electromembrane extraction combined with a simple high performance liquid chromatography and ultraviolet detection was developed and validated for the quantitation of zolpidem in biological samples. Parameters influencing electromembrane extraction were evaluated and optimized. The membrane consisted of 2-ethylhexanol immobilized in the pores of a hollow fiber. As a driving force, a 150 V electric field was applied to facilitate the analyte migration from the sample matrix to an acceptor solution through a supported liquid membrane. The pHs of the donor and acceptor solutions were optimized to 6.0 and 2.0, respectively. The enrichment factor obtained was >75 within 15 minutes. The effect of carbon nanotubes (as solid nano-sorbents) on the membrane performance and EME efficiency was evaluated. The method was linear over the range of 10-1000 ng/mL for zolpidem (R2 > 0.9991) with repeatability (%RSD) between 0.3% and 7.3% (n = 3). The limits of detection and quantitation were 3 and 10 ng/mL, respectively. The sensitivity of HPLC-UV for the determination of zolpidem was enhanced by electromembrane extraction. Finally, the method was employed for the quantitation of zolpidem in biological samples with relative recoveries in the range of 60-79%.
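Assuming the usual definitions of the two figures of merit reported (enrichment factor and relative recovery), a minimal sketch:

```python
def enrichment_factor(c_acceptor_final, c_donor_initial):
    """EF: analyte concentration in the acceptor phase after extraction
    divided by its initial concentration in the donor (sample) solution."""
    return c_acceptor_final / c_donor_initial

def relative_recovery_percent(c_found, c_added):
    """Relative recovery: concentration found in the spiked matrix sample
    (read from the calibration curve) relative to the concentration added."""
    return 100.0 * c_found / c_added
```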
Modelling dental implant extraction by pullout and torque procedures.
Rittel, D; Dorogoy, A; Shemtov-Yona, K
2017-07-01
Dental implant extraction, achieved by applying either torque or pullout force, is used to estimate the bone-implant interfacial strength. A detailed description of the mechanical and physical aspects of the extraction process is still missing from the literature. This paper presents 3D nonlinear dynamic finite element simulations of a commercial implant extraction process from the mandible bone. Emphasis is put on the typical load-displacement and torque-angle relationships for various types of cortical and trabecular bone strengths. The simulations also study the influence of the osseointegration level on those relationships. This is done by simulating implant extraction right after insertion, when interfacial frictional contact exists between the implant and bone, and long after insertion, assuming that the implant is fully bonded to the bone. The model does not include a separate representation of the interfacial layer, for which available data is limited. The obtained relationships show that the higher the strength of the trabecular bone, the higher the peak extraction force, while for application of torque it is the cortical bone that might dictate the peak torque value. Information on the relative strength contrast of the cortical and trabecular components, as well as the progressive nature of the damage evolution, can be revealed from the obtained relations. It is shown that full osseointegration might multiply the peak and average load values by a factor of 3-12, although the calculated work of extraction varies only by a factor of 1.5. From a quantitative point of view, it is suggested that, as an alternative to reporting peak load or torque values, an average value derived from the extraction work be used to better characterize the bone-implant interfacial strength. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nolan, Richard C; Richmond, Peter; Prescott, Susan L; Mallon, Dominic F; Gong, Grace; Franzmann, Annkathrin M; Naidoo, Rama; Loh, Richard K S
2007-05-01
Peanut allergy is transient in some children, but it is not clear whether quantitating peanut-specific IgE by skin prick test (SPT) adds information to fluorescent-enzyme immunoassay (FEIA) in discriminating between allergic and tolerant children. We investigated whether SPT with a commercial extract or fresh foods adds predictive information for peanut challenge in children with a low FEIA (<10 kUA/L) who were previously sensitized, or allergic, to peanuts. Children from a hospital-based allergy service who were previously sensitized or allergic to peanuts were invited to undergo a peanut challenge unless they had a serum peanut-specific IgE >10 kUA/L, a previous severe reaction, or a recent reaction to peanuts (within two years). SPT with a commercial extract and with raw and roasted saline-soaked peanuts was performed immediately prior to an open in-hospital challenge with increasing quantities of peanut until a total of 26.7 g was consumed. A positive challenge consisted of an objective IgE-mediated reaction occurring during the observation period. 54 children (median age 6.3 years) were admitted for a challenge. Nineteen challenges were positive, 27 negative, five were indeterminate, and three did not proceed after SPT. Commercial and fresh food extracts provided similar diagnostic information. A wheal diameter of ≥7 mm with the commercial extract predicted an allergic outcome with specificity 97%, positive predictive value 93% and sensitivity 83%. There was a tendency for the SPT wheal to increase since initial diagnosis in children who remained allergic to peanuts, while it decreased in those with a negative challenge. The outcome of a peanut challenge in peanut-sensitized or previously allergic children with a low FEIA can be predicted by SPT. In this cohort, not challenging children with an SPT wheal of ≥7 mm would have avoided 15 of 18 positive challenges and denied a challenge to one out of 27 tolerant children.
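A minimal sketch of the cutoff diagnostics reported (sensitivity, specificity, positive predictive value); the 2x2 counts below are a hypothetical reconstruction roughly consistent with the abstract, not the study's exact table.

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and positive predictive value for a test
    cutoff (here, SPT wheal diameter >= 7 mm) against challenge outcome."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Hypothetical reconstruction: 15 true positives, 1 false positive,
# 26 true negatives, 3 false negatives.
print([round(x, 2) for x in diagnostic_metrics(15, 1, 26, 3)])
```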
A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images
Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.
1986-01-01
The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.
What are the social consequences of stroke for working-aged adults? A systematic review.
Daniel, Katie; Wolfe, Charles D A; Busch, Markus A; McKevitt, Christopher
2009-06-01
Approximately one fourth of strokes occur in people aged <65 years. Current UK policy calls for services that meet the specific needs of working-aged adults with stroke. We aimed to identify the social consequences of stroke in working-aged adults, which might subsequently inform the development and evaluation of services for this group. We reviewed quantitative and qualitative studies identifying social consequences for working-aged adults with stroke using multiple search strategies (electronic databases, bibliographic references, hand searches). Social consequences were defined as those pertaining to the World Health Organization International Classification of Functioning, Disability and Health domain "participation." Two authors reviewed articles using a standardized matrix for data extraction. Seventy-eight studies were included: 66 were quantitative observational studies, 2 were quantitative interventional studies, 9 were qualitative studies, and one used mixed methods. Seventy studies reported data on return to work after stroke, with proportions ranging from 0% to 100%. Other categories of social consequences included negative impact on family relationships (5% to 54%), deterioration in sexual life (5% to 76%), economic difficulties (24% to 33%), and deterioration in leisure activities (15% to 79%). Methodological variations account for the wide range of rates of return to work after stroke. There is limited evidence of the negative impact of stroke on other aspects of social participation. Robust estimates of the prevalence of such outcomes are required to inform the development of appropriate interventions. We propose strategies by which methodology and reporting in this field might be improved.
ERIC Educational Resources Information Center
Mei-Ratliff, Yuan
2012-01-01
Trace levels of oxytetracycline spiked into commercial milk samples are extracted, cleaned up, and preconcentrated using a C18 solid-phase extraction column. The extract is then analyzed by a high-performance liquid chromatography (HPLC) instrument equipped with a UV detector and a C18 column (150 mm x 4.6 mm x 3.5 µm).…
Cook, Linda; Ng, Ka-Wing; Bagabag, Arthur; Corey, Lawrence; Jerome, Keith R.
2004-01-01
Hepatitis C virus (HCV) infection is an increasing health problem worldwide. Quantitative assays for HCV viral load are valuable in predicting response to therapy and for following treatment efficacy. Unfortunately, most quantitative tests for HCV RNA are limited by poor sensitivity. We have developed a convenient, highly sensitive real-time reverse transcription-PCR assay for HCV RNA. The assay amplifies a portion of the 5′ untranslated region of HCV, which is then quantitated using the TaqMan 7700 detection system. Extraction of viral RNA for our assay is fully automated with the MagNA Pure LC extraction system (Roche). Our assay has a 100% detection rate for samples containing 50 IU of HCV RNA/ml and is linear up to viral loads of at least 10(9) IU/ml. The assay detects genotypes 1a, 2a, and 3a with equal efficiency. Quantitative results by our assay correlate well with HCV viral load as determined by the Bayer VERSANT HCV RNA 3.0 bDNA assay. In clinical use, our assay is highly reproducible, with high and low control specimens showing a coefficient of variation for the logarithmic result of 2.8 and 7.0%, respectively. The combination of reproducibility, extreme sensitivity, and ease of performance makes this assay an attractive option for routine HCV viral load testing. PMID:15365000
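The reproducibility figures above are coefficients of variation of the log-transformed viral load. A minimal sketch of that calculation is shown below, using invented replicate values rather than the study's control data.

```python
import numpy as np

def log_cv(viral_loads_iu_per_ml):
    """Percent coefficient of variation of the log10 viral load across replicates."""
    logs = np.log10(np.asarray(viral_loads_iu_per_ml, dtype=float))
    return logs.std(ddof=1) / logs.mean() * 100.0

high_controls = [2.1e6, 1.8e6, 2.4e6, 1.9e6]   # assumed example replicates
low_controls = [180.0, 240.0, 130.0, 210.0]
print(f"high control CV = {log_cv(high_controls):.1f}%")
print(f"low control CV = {log_cv(low_controls):.1f}%")
```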
Gunnar, Teemu; Mykkänen, Sirpa; Ariniemi, Kari; Lillsunde, Pirjo
2004-07-05
A comprehensively validated procedure is presented for simultaneous semiquantitative/quantitative screening of 51 drugs of abuse or drugs potentially hazardous for traffic safety in serum, plasma or whole blood. Benzodiazepines (12), cannabinoids (3), opioids (8), cocaine, antidepressants (13), antipsychotics (5) and antiepileptics (2), as well as zolpidem, zaleplon, zopiclone, meprobamate, carisoprodol, tizanidine and orphenadrine, and the internal standard flurazepam, were isolated by high-yield liquid-liquid extraction (LLE). The dried extracts were derivatized by two-step silylation and analyzed by the combination of two different gas chromatographic (GC) separations with both electron capture detection (ECD) and mass spectrometry (MS) operating in selected ion-monitoring (SIM) mode. Quantitative or semiquantitative results were obtained for each substance based on four-point calibration. In the validation tests, accuracy, reproducibility, linearity, limit of detection (LOD) and limit of quantitation (LOQ), selectivity, as well as extraction efficiency and stability of standard stock solutions were tested, and derivatization was optimized in detail. Intra- and inter-day precisions were within 2.5-21.8% and 6.0-22.5%, and the squares of the correlation coefficients of linearity ranged from 0.9896 to 0.9999. The limit of quantitation (LOQ) varied from 2 to 2000 ng/ml owing to the wide range of relevant concentrations of the analyzed substances in blood. The method is feasible for highly sensitive, reliable and possibly routinely performed clinical and forensic toxicological analyses.
Hjelmeland, Anna K; Wylie, Philip L; Ebeler, Susan E
2016-02-01
Methoxypyrazines are volatile compounds found in plants, microbes, and insects that have potent vegetal and earthy aromas. With sensory detection thresholds in the low ng L(-1) range, modest concentrations of these compounds can profoundly impact the aroma quality of foods and beverages, and high levels can lead to consumer rejection. The wine industry routinely analyzes the most prevalent methoxypyrazine, 2-isobutyl-3-methoxypyrazine (IBMP), to aid in harvest decisions, since concentrations decrease during berry ripening. In addition to IBMP, three other methoxypyrazines, IPMP (2-isopropyl-3-methoxypyrazine), SBMP (2-sec-butyl-3-methoxypyrazine), and EMP (2-ethyl-3-methoxypyrazine), have been identified in grapes and/or wine and can impact aroma quality. Despite their routine analysis in the wine industry (mostly IBMP), accurate methoxypyrazine quantitation is hindered by two major challenges: sensitivity and resolution. With extremely low sensory detection thresholds (~8-15 ng L(-1) in wine for IBMP), highly sensitive analytical methods to quantify methoxypyrazines at trace levels are necessary. Here we were able to achieve resolution of IBMP as well as IPMP, EMP, and SBMP from co-eluting compounds using one-dimensional chromatography coupled to positive chemical ionization tandem mass spectrometry. Three extraction techniques, HS-SPME (headspace solid-phase microextraction), SBSE (stir bar sorptive extraction), and HSSE (headspace sorptive extraction), were validated and compared. A 30 min extraction time was used for the HS-SPME and SBSE extraction techniques, while 120 min was necessary to achieve sufficient sensitivity for HSSE extractions. All extraction methods have limits of quantitation (LOQ) at or below 1 ng L(-1) for all four methoxypyrazines analyzed, i.e., LOQs at or below reported sensory detection limits in wine. The method is high throughput, with resolution of all compounds possible with a relatively rapid 27 min GC oven program. Copyright © 2015 Elsevier B.V. All rights reserved.
Bravo, Dayana; Clari, María Ángeles; Costa, Elisa; Muñoz-Cobo, Beatriz; Solano, Carlos; José Remigia, María; Navarro, David
2011-08-01
Limited data are available on the performance of different automated extraction platforms and commercially available quantitative real-time PCR (QRT-PCR) methods for the quantitation of cytomegalovirus (CMV) DNA in plasma. We compared the performance characteristics of the Abbott mSample preparation system DNA kit on the m24 SP instrument (Abbott), the High Pure viral nucleic acid kit on the COBAS AmpliPrep system (Roche), and the EZ1 Virus 2.0 kit on the BioRobot EZ1 extraction platform (Qiagen) coupled with the Abbott CMV PCR kit, the LightCycler CMV Quant kit (Roche), and the Q-CMV complete kit (Nanogen), for both plasma specimens from allogeneic stem cell transplant (Allo-SCT) recipients (n = 42) and the OptiQuant CMV DNA panel (AcroMetrix). The EZ1 system displayed the highest extraction efficiency over a wide range of CMV plasma DNA loads, followed by the m24 and the AmpliPrep methods. The Nanogen PCR assay yielded higher mean CMV plasma DNA values than the Abbott and the Roche PCR assays, regardless of the platform used for DNA extraction. Overall, the effects of the extraction method and the QRT-PCR used on CMV plasma DNA load measurements were less pronounced for specimens with high CMV DNA content (>10,000 copies/ml). The performance characteristics of the extraction methods and QRT-PCR assays evaluated herein for clinical samples were extensible to cell-based standards from AcroMetrix. In conclusion, different automated systems are not equally efficient for CMV DNA extraction from plasma specimens, and the plasma CMV DNA loads measured by commercially available QRT-PCRs can differ significantly. The above findings should be taken into consideration for the establishment of cutoff values for the initiation or cessation of preemptive antiviral therapies and for the interpretation of data from clinical studies in the Allo-SCT setting.
The Role of Mother in Informing Girls About Puberty: A Meta-Analysis Study
Sooki, Zahra; Shariati, Mohammad; Chaman, Reza; Khosravi, Ahmad; Effatpanah, Mohammad; Keramat, Afsaneh
2016-01-01
Context: Family, especially the mother, has the most important role in the education, transmission of information, and health behaviors of girls in order for them to have a healthy transition through the critical stage of puberty, but there are different views in this regard. Objectives: Considering the various findings about the source of information about puberty, a meta-analysis study was conducted to investigate the extent of the mother's role in informing girls about puberty. Data Sources: This meta-analysis was based on English articles published from 2000 to February 2015 in the Scopus, PubMed, and ScienceDirect databases and on Persian articles in the SID, Magiran, and IranMedex databases, retrieved with predetermined key words and their MeSH equivalents. Study Selection: Quantitative cross-sectional articles were extracted by two independent researchers, and 46 articles were ultimately selected based on the inclusion criteria. The STROBE checklist was used for evaluation of the studies. Data Extraction: The percentage of mothers named as the current and preferred source of information about the process of puberty, menarche, and menstruation from the perspective of adolescent girls was extracted from the articles. The results of the studies were analyzed using meta-analysis (random-effects model), and heterogeneity was assessed using the I2 index. Between-study variance was analyzed using tau squared (Tau2), with Review Manager 5 software. Results: The results showed that, from the perspective of teenage girls in Iran and other countries, in 56% of cases the mother was the current source of information about the process of puberty, menarche, and menstruation. The preferred source of information about the process of puberty, menarche, and menstruation was the mother in all studies at 60% (Iran 57%, other countries 66%). Conclusions: According to the findings of this study, it is essential that health professionals and officials of the ministry of health train mothers about the time, trends, and factors affecting the start of puberty using a multi-dimensional approach that involves religious organizations, community groups, and peer groups. PMID:27331056
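The pooling described above was performed with a random-effects model in Review Manager 5. The sketch below shows a generic DerSimonian-Laird random-effects pooling of study proportions together with I2 and Tau2; the input proportions and sample sizes are invented, and this is not the review's actual analysis.

```python
import numpy as np

def random_effects_pool(p, n):
    """DerSimonian-Laird pooling of proportions; returns pooled estimate, tau^2, I^2."""
    p, n = np.asarray(p, float), np.asarray(n, float)
    v = p * (1 - p) / n                      # within-study variance of each proportion
    w = 1 / v
    fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - fixed) ** 2)         # Cochran's Q
    k = len(p)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)       # between-study variance
    w_star = 1 / (v + tau2)
    pooled = np.sum(w_star * p) / np.sum(w_star)
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return pooled, tau2, i2

pooled, tau2, i2 = random_effects_pool(p=[0.57, 0.66, 0.48, 0.61], n=[310, 220, 150, 410])
print(f"pooled proportion={pooled:.2f}, tau^2={tau2:.4f}, I^2={i2:.1f}%")
```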
Extracting quantitative measures from EAP: a small clinical study using BFOR.
Hosseinbor, A Pasha; Chung, Moo K; Wu, Yu-Chien; Fleming, John O; Field, Aaron S; Alexander, Andrew L
2012-01-01
The ensemble average propagator (EAP) describes the 3D average diffusion process of water molecules, capturing both its radial and angular contents, and hence providing rich information about complex tissue microstructure properties. Bessel Fourier orientation reconstruction (BFOR) is one of several analytical, non-Cartesian EAP reconstruction schemes employing multiple shell acquisitions that have recently been proposed. Such modeling bases have not yet been fully exploited in the extraction of rotationally invariant q-space indices that describe the degree of diffusion anisotropy/restrictivity. Such quantitative measures include the zero-displacement probability (P(o)), mean squared displacement (MSD), q-space inverse variance (QIV), and generalized fractional anisotropy (GFA), and all are simply scalar features of the EAP. In this study, a general relationship between MSD and q-space diffusion signal is derived and an EAP-based definition of GFA is introduced. A significant part of the paper is dedicated to utilizing BFOR in a clinical dataset, comprised of 5 multiple sclerosis (MS) patients and 4 healthy controls, to estimate P(o), MSD, QIV, and GFA of corpus callosum, and specifically, to see if such indices can detect changes between normal appearing white matter (NAWM) and healthy white matter (WM). Although the sample size is small, this study is a proof of concept that can be extended to larger sample sizes in the future.
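Two of the scalar q-space indices mentioned above have simple closed forms once the ODF or EAP has been reconstructed. The sketch below illustrates GFA (Tuch's std/rms definition) and an r2-weighted MSD estimate on synthetic samples; the BFOR reconstruction itself is not shown and all values are placeholders.

```python
import numpy as np

def gfa(odf_samples):
    """Generalized fractional anisotropy: std/rms of ODF samples (Tuch's definition)."""
    psi = np.asarray(odf_samples, float)
    n = psi.size
    return np.sqrt(n * np.sum((psi - psi.mean()) ** 2) / ((n - 1) * np.sum(psi ** 2)))

def mean_squared_displacement(eap_values, radii, volume_elements):
    """MSD as the (normalized) r^2-weighted integral of the propagator."""
    p, r, dv = map(np.asarray, (eap_values, radii, volume_elements))
    return np.sum(p * r ** 2 * dv) / np.sum(p * dv)

odf = np.array([0.8, 1.4, 0.7, 1.3, 0.9, 1.2])        # assumed ODF samples on the sphere
print(f"GFA = {gfa(odf):.3f}")

radii = np.linspace(0.0, 30e-6, 50)                    # displacement radii (m), assumed grid
eap = np.exp(-radii ** 2 / (2 * (8e-6) ** 2))          # toy isotropic propagator samples
dv = 4 * np.pi * radii ** 2 * np.gradient(radii)       # spherical volume elements
print(f"MSD = {mean_squared_displacement(eap, radii, dv):.2e} m^2")
```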
Detection of brain tumor margins using optical coherence tomography
NASA Astrophysics Data System (ADS)
Juarez-Chambi, Ronald M.; Kut, Carmen; Rico-Jimenez, Jesus; Campos-Delgado, Daniel U.; Quinones-Hinojosa, Alfredo; Li, Xingde; Jo, Javier
2018-02-01
In brain cancer surgery, it is critical to achieve extensive resection without compromising adjacent healthy, non-cancerous regions. Various technological advances have made major contributions to imaging, including intraoperative magnetic resonance imaging (MRI) and computed tomography (CT). However, these technologies have pros and cons in providing quantitative, real-time and three-dimensional (3D) continuous guidance in brain cancer detection. Optical Coherence Tomography (OCT) is a non-invasive, label-free, cost-effective technique capable of imaging tissue in three dimensions and in real time. The purpose of this study is to reliably and efficiently discriminate between non-cancer and cancer-infiltrated brain regions using OCT images. To this end, a mathematical model for quantitative evaluation known as the Blind End-Member and Abundances Extraction method (BEAE) was applied. BEAE is a constrained optimization technique which extracts spatial information from volumetric OCT images. Using this method, we are able to discriminate between cancerous and non-cancerous tissues, using logistic regression as a classifier for automatic brain tumor margin detection. With this technique, we achieve excellent performance in an extensive cross-validation of the training dataset (sensitivity 92.91% and specificity 98.15%) and again in an independent, blinded validation dataset (sensitivity 92.91% and specificity 86.36%). In summary, BEAE is well suited to differentiate brain tissue and could support surgical guidance during tissue resection.
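The classification step is a standard logistic regression evaluated by cross-validation. The sketch below illustrates that step only, using random placeholder features in place of BEAE abundance estimates; it assumes scikit-learn and is not the authors' implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                        # placeholder per-region features
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

clf = LogisticRegression()
y_hat = cross_val_predict(clf, X, y, cv=10)          # 10-fold cross-validated predictions

tp = np.sum((y_hat == 1) & (y == 1)); fn = np.sum((y_hat == 0) & (y == 1))
tn = np.sum((y_hat == 0) & (y == 0)); fp = np.sum((y_hat == 1) & (y == 0))
print(f"sensitivity={tp/(tp+fn):.2f}, specificity={tn/(tn+fp):.2f}")
```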
Fan, Audrey P; Govindarajan, Sindhuja T; Kinkel, R Philip; Madigan, Nancy K; Nielsen, A Scott; Benner, Thomas; Tinelli, Emanuele; Rosen, Bruce R; Adalsteinsson, Elfar; Mainero, Caterina
2015-01-01
Quantitative oxygen extraction fraction (OEF) in cortical veins was studied in patients with multiple sclerosis (MS) and healthy subjects via magnetic resonance imaging (MRI) phase images at 7 Tesla (7 T). Flow-compensated, three-dimensional gradient-echo scans were acquired for absolute OEF quantification in 23 patients with MS and 14 age-matched controls. In patients, we collected T2*-weighted images for characterization of white matter, deep gray matter, and cortical lesions, and also assessed cognitive function. Variability of OEF across readers and scan sessions was evaluated in a subset of volunteers. OEF was averaged from 2 to 3 pial veins in the sensorimotor, parietal, and prefrontal cortical regions for each subject (total of ~10 vessels). We observed good reproducibility of mean OEF, with intraobserver coefficient of variation (COV)=2.1%, interobserver COV=5.2%, and scan-rescan COV=5.9%. Patients exhibited a 3.4% reduction in cortical OEF relative to controls (P=0.0025), which was not different across brain regions. Although oxygenation did not relate with measures of structural tissue damage, mean OEF correlated with a global measure of information processing speed. These findings suggest that cortical OEF from 7-T MRI phase is a reproducible metabolic biomarker that may be sensitive to different pathologic processes than structural MRI in patients with MS.
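A rough sketch of how OEF can be derived from the phase of a pial vein, assuming the commonly used infinite-cylinder model and literature constants; the echo time, vessel tilt, phase value, and hematocrit below are all assumptions for illustration, not the study's processing pipeline.

```python
import numpy as np

GAMMA = 2.675e8                   # gyromagnetic ratio, rad s^-1 T^-1
B0 = 7.0                          # field strength, Tesla
TE = 0.003                        # assumed echo time, s
DCHI_DO = 4 * np.pi * 0.27e-6     # assumed SI susceptibility of fully deoxygenated blood (per unit Hct)
HCT = 0.42                        # assumed hematocrit

def oef_from_phase(delta_phi, theta_deg):
    """OEF from the intravascular-vs-tissue phase shift of a long cylindrical vein."""
    geom = (np.cos(np.deg2rad(theta_deg)) ** 2 - 1.0 / 3.0) / 2.0
    delta_chi = delta_phi / (GAMMA * B0 * TE * geom)
    return delta_chi / (DCHI_DO * HCT)

# assumed measured phase shift of 0.9 rad for a vein tilted 10 degrees from B0
print(f"OEF ~ {oef_from_phase(delta_phi=0.9, theta_deg=10.0):.2f}")
```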
Turner, Cameron R.; Miller, Derryl J.; Coyne, Kathryn J.; Corush, Joel
2014-01-01
Indirect, non-invasive detection of rare aquatic macrofauna using aqueous environmental DNA (eDNA) is a relatively new approach to population and biodiversity monitoring. As such, the sensitivity of monitoring results to different methods of eDNA capture, extraction, and detection is being investigated in many ecosystems and species. One of the first and largest conservation programs with eDNA-based monitoring as a central instrument focuses on Asian bigheaded carp (Hypophthalmichthys spp.), an invasive fish spreading toward the Laurentian Great Lakes. However, the standard eDNA methods of this program have not advanced since their development in 2010. We developed new, quantitative, and more cost-effective methods and tested them against the standard protocols. In laboratory testing, our new quantitative PCR (qPCR) assay for bigheaded carp eDNA was one to two orders of magnitude more sensitive than the existing endpoint PCR assays. When applied to eDNA samples from an experimental pond containing bigheaded carp, the qPCR assay produced a detection probability of 94.8% compared to 4.2% for the endpoint PCR assays. Also, the eDNA capture and extraction method we adapted from aquatic microbiology yielded five times more bigheaded carp eDNA from the experimental pond than the standard method, at a per sample cost over forty times lower. Our new, more sensitive assay provides a quantitative tool for eDNA-based monitoring of bigheaded carp, and the higher-yielding eDNA capture and extraction method we describe can be used for eDNA-based monitoring of any aquatic species. PMID:25474207
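Two routine calculations sit behind results like these: converting quantification cycles (Cq) to copy numbers through a standard curve, and estimating an empirical detection probability across replicate samples. The sketch below illustrates both with invented numbers.

```python
import numpy as np

def fit_standard_curve(log10_copies, cq):
    """Fit Cq = slope*log10(copies) + intercept; also report amplification efficiency."""
    slope, intercept = np.polyfit(log10_copies, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    return 10 ** ((cq - intercept) / slope)

slope, b, eff = fit_standard_curve([1, 2, 3, 4, 5], [36.1, 32.8, 29.4, 26.1, 22.7])
print(f"slope={slope:.2f}, efficiency={eff:.1%}, copies at Cq=30: {copies_from_cq(30, slope, b):.0f}")

detections = [True, True, False, True, True, True]    # assumed replicate outcomes
print(f"detection probability = {np.mean(detections):.1%}")
```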
NASA Astrophysics Data System (ADS)
Xie, Jiayu; Wang, Gongwen; Sha, Yazhou; Liu, Jiajun; Wen, Botao; Nie, Ming; Zhang, Shuai
2017-04-01
Integrating multi-source geoscience information (such as geology, geophysics, geochemistry, and remote sensing) using GIS mapping is one of the key topics and frontiers in quantitative geosciences for mineral exploration. GIS prospective mapping and three-dimensional (3D) modeling can be used not only to extract exploration criteria and delineate metallogenetic targets but also to provide important information for the quantitative assessment of mineral resources. This paper uses the Shangnan district of Shaanxi province (China) as a case study area. GIS mapping and potential granite-hydrothermal uranium targeting were conducted in the study area combining weights of evidence (WofE) and concentration-area (C-A) fractal methods with multi-source geoscience information. 3D deposit-scale modeling using GOCAD software was performed to validate the shapes and features of the potential targets at the subsurface. The research results show that: (1) the known deposits have potential zones at depth, and the 3D geological models can delineate surface or subsurface ore-forming features, which can be used to analyze the uncertainty of the shape and feature of prospectivity mapping at the subsurface; (2) single geochemistry anomalies or remote sensing anomalies at the surface require combining the depth exploration criteria of geophysics to identify potential targets; and (3) the single or sparse exploration criteria zone with few mineralization spots at the surface has high uncertainty in terms of the exploration target.
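The weights-of-evidence step combines, for each binary evidence layer, the conditional probabilities of the evidence given the presence or absence of known deposits. A minimal sketch with hypothetical cell counts is given below; the C-A fractal analysis and 3D GOCAD modeling steps are not shown.

```python
import numpy as np

def weights_of_evidence(n_dep_on, n_dep_off, n_cells_on, n_cells_off):
    """W+, W- and contrast C for one binary evidence layer (hypothetical counts)."""
    p_b_given_d = n_dep_on / (n_dep_on + n_dep_off)                        # P(B|D)
    p_b_given_nd = (n_cells_on - n_dep_on) / (
        n_cells_on + n_cells_off - n_dep_on - n_dep_off)                   # P(B|~D)
    w_plus = np.log(p_b_given_d / p_b_given_nd)
    w_minus = np.log((1 - p_b_given_d) / (1 - p_b_given_nd))
    return w_plus, w_minus, w_plus - w_minus

wp, wm, c = weights_of_evidence(n_dep_on=18, n_dep_off=4, n_cells_on=5000, n_cells_off=45000)
print(f"W+={wp:.2f}, W-={wm:.2f}, contrast={c:.2f}")
```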
Contributions to the phytochemical study of Bidens tripartitae herba from Romania. I. Tannins.
Zagnat, M; Cheptea, Corina; Spac, A F
2013-01-01
To analyze qualitatively and quantitatively the tannins in the native plant, collected during the whole vegetation period from different areas of the country, and in its different organs (flower, stem, leaf). For quantitative analysis, the plant product was extracted by repeated maceration (3 days) with 80% methanol. Proanthocyanidins in the extract were quantified by spectrophotometric methods. Condensed tannins were present, while hydrolyzable tannins were absent. Chromatographic analysis showed that the tannin spectrum is similar in all plant organs and in plants collected at different times throughout the vegetation period; the differences are only quantitative. The maximum amount of tannins was found during the flowering stage (10.32%). In terms of tannin content, flowering is the best time to collect; however, collection throughout the whole vegetation period is acceptable.
Removal of BCG artefact from concurrent fMRI-EEG recordings based on EMD and PCA.
Javed, Ehtasham; Faye, Ibrahima; Malik, Aamir Saeed; Abdullah, Jafri Malin
2017-11-01
Simultaneous electroencephalography (EEG) and functional magnetic resonance image (fMRI) acquisitions provide better insight into brain dynamics. Some artefacts due to simultaneous acquisition pose a threat to the quality of the data. One such problematic artefact is the ballistocardiogram (BCG) artefact. We developed a hybrid algorithm that combines features of empirical mode decomposition (EMD) with principal component analysis (PCA) to reduce the BCG artefact. The algorithm does not require extra electrocardiogram (ECG) or electrooculogram (EOG) recordings to extract the BCG artefact. The method was tested with both simulated and real EEG data of 11 participants. From the simulated data, the similarity index between the extracted BCG and the simulated BCG showed the effectiveness of the proposed method in BCG removal. On the other hand, real data were recorded with two conditions, i.e. resting state (eyes closed dataset) and task influenced (event-related potentials (ERPs) dataset). Using qualitative (visual inspection) and quantitative (similarity index, improved normalized power spectrum (INPS) ratio, power spectrum, sample entropy (SE)) evaluation parameters, the assessment results showed that the proposed method can efficiently reduce the BCG artefact while preserving the neuronal signals. Compared with conventional methods, namely, average artefact subtraction (AAS), optimal basis set (OBS) and combined independent component analysis and principal component analysis (ICA-PCA), the statistical analyses of the results showed that the proposed method has better performance, and the differences were significant for all quantitative parameters except for the power and sample entropy. The proposed method does not require any reference signal, prior information or assumption to extract the BCG artefact. It will be very useful in circumstances where the reference signal is not available. Copyright © 2017 Elsevier B.V. All rights reserved.
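A highly simplified sketch of an EMD-plus-PCA style clean-up is shown below: each channel is decomposed into intrinsic mode functions, a principal component over the slower IMFs stands in for the BCG-like component, and that component is regressed out. It assumes the PyEMD and scikit-learn packages, uses synthetic data, and is not the authors' exact algorithm.

```python
import numpy as np
from PyEMD import EMD                      # assumes the PyEMD (EMD-signal) package is installed
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
fs = 250
t = np.arange(0, 10, 1 / fs)
bcg = 30 * np.sin(2 * np.pi * 1.2 * t)                  # ~1.2 Hz cardiac-like artefact (assumed)
eeg = rng.normal(scale=5, size=(8, t.size)) + bcg       # 8 contaminated channels (synthetic)

cleaned = np.empty_like(eeg)
for ch in range(eeg.shape[0]):
    imfs = EMD().emd(eeg[ch])                           # intrinsic mode functions, fast to slow
    slow = imfs[len(imfs) // 2:]                        # keep the slower half (an assumption)
    artefact = PCA(n_components=1).fit_transform(slow.T).ravel()
    beta = np.dot(eeg[ch], artefact) / np.dot(artefact, artefact)  # least-squares scaling
    cleaned[ch] = eeg[ch] - beta * artefact

print("variance before:", round(float(np.var(eeg)), 1), "after:", round(float(np.var(cleaned)), 1))
```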
Gupta, Shikha; Shanker, Karuna; Srivastava, Santosh K
2012-07-01
A new validated high-performance thin-layer chromatographic (HPTLC) method has been developed for the simultaneous quantitation of four antipsychotic indole alkaloids (IAs), reserpiline (RP, 1), α-yohimbine (YH, 2), isoreserpiline (IRP, 3) and 10-methoxy tetrahydroalstonine (MTHA, 4), as markers in the leaves of Rauwolfia tetraphylla. The extraction efficiency of the targeted IAs from the leaf matrix with organic and ecofriendly (green) solvents using percolation, ultrasonication and microwave techniques was studied. Non-ionic surfactants, viz. Triton X-100, Triton X-114 and Genapol X-80, were used for extraction, and no back-extraction or liquid chromatographic steps were used to remove the targeted IAs from the surfactant-rich extractant phase. The optimized cloud point extraction was found to be a potentially useful methodology for the preconcentration of the targeted IAs. The separation was achieved on silica gel 60F(254) HPTLC plates using hexane-ethylacetate-methanol (5:4:1, v/v/v) as mobile phase. The quantitation of IAs (1-4) was carried out using the densitometric reflection/absorption mode at 520 nm after post-chromatographic derivatization using Dragendorff's reagent. The method was validated for peak purity, precision, accuracy, robustness, limit of detection (LOD) and quantitation (LOQ). Method specificity was confirmed using retention factor (R(f)) and visible spectral (post-chromatographic scan) correlation of marker compounds in the sample and standard tracks. Copyright © 2012 Elsevier B.V. All rights reserved.
Nováková, Lucie; Vildová, Anna; Mateus, Joana Patricia; Gonçalves, Tiago; Solich, Petr
2010-09-15
A UHPLC-MS/MS method using a BEH C18 analytical column was developed for the separation and quantitation of 12 phenolic compounds of Chamomile (Matricaria recutita L.). The separation was accomplished using gradient elution with a mobile phase consisting of methanol and 0.1% formic acid. ESI in both positive and negative ion mode was optimized with the aim of reaching high sensitivity and selectivity for quantitation using SRM experiments. ESI in negative ion mode was found to be more convenient for quantitative analysis of all phenolics except chlorogenic acid and kaempferol, which demonstrated better linearity, accuracy and precision in ESI positive ion mode. The results of method validation confirmed that the developed UHPLC-MS/MS method was convenient and reliable for the determination of phenolic compounds in Chamomile extracts, with linearity >0.9982, accuracy within 76.7-126.7% and precision within 2.2-12.7% at three spiked concentration levels. Method sensitivity expressed as LOQ was typically 5-20 nmol/l. Extracts of Chamomile flowers and Chamomile tea were subjected to UHPLC-MS/MS analysis. The most abundant phenolic compounds in both Chamomile flower and Chamomile tea extracts were chlorogenic acid, umbelliferone, apigenin and apigenin-7-glucoside. In Chamomile tea extracts there was a greater abundance of flavonoid glycosides such as rutin or quercitrin, while the aglycone apigenin and its glycoside were present in lower amounts. Copyright (c) 2010 Elsevier B.V. All rights reserved.
ERIC Educational Resources Information Center
Rossi, Henry F., III; Rizzo, Jacqueline; Zimmerman, Devon C.; Usher, Karyn M.
2012-01-01
A chemical separation experiment can be an interesting addition to an introductory analytical chemistry laboratory course. We have developed an experiment to extract FD&C Red Dye #40 from beverages containing cranberry juice. After extraction, the dye is quantified using colorimetry. The experiment gives students hands-on experience in using solid…
Tóth, Anita; Végh, Krisztina; Alberti, Ágnes; Béni, Szabolcs; Kéry, Ágnes
2016-10-01
UPLC-DAD method was developed and validated for the quantitative determination of free flavonol aglycones (kaempferol, quercetin and myricetin) after acidic hydrolysis in six Lysimachia species. Quantitative analyses showed that the amounts of various flavonol aglycones were significantly different in Lysimachia vulgaris, Lysimachia nummularia, Lysimachia punctata, Lysimachia christinae, Lysimachia ciliata and Lysimachia clethroides. The L. clethroides sample was found to be the richest in kaempferol (25.77 ± 1.29 μg/mg extract) and quercetin (97.67 ± 4.61 μg/mg extract), while the L. nummularia sample contained the highest amount of myricetin (20.79 ± 1.00 μg/mg extract). The antioxidant capacity of hydrolysed extracts was evaluated using in vitro DPPH(•) (2,2-diphenyl-1-picrylhydrazyl) and ABTS(•+) [2,2'-azino-bis-(3-ethylbenzothiazoline-6-sulphonic acid)] decolourisation tests. The observed radical scavenging capacities of the extracts showed a relationship with the measured flavonol aglycone content and composition. The acidic treatment resulted in an increased free radical scavenging activity compared to the untreated methanol extract.
A computational image analysis glossary for biologists.
Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M
2012-09-01
Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies.
Analysis of photographic X-ray images. [S-054 telescope on Skylab
NASA Technical Reports Server (NTRS)
Krieger, A. S.
1977-01-01
Some techniques used to extract quantitative data from the information contained in photographic images produced by grazing incidence soft X-ray optical systems are described. The discussion is focussed on the analysis of the data returned by the S-054 X-Ray Spectrographic Telescope Experiment on Skylab. The parameters of the instrument and the procedures used for its calibration are described. The technique used to convert photographic density to focal plane X-ray irradiance is outlined. The deconvolution of the telescope point response function from the image data is discussed. Methods of estimating the temperature, pressure, and number density of coronal plasmas are outlined.
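Two of the steps outlined above, density-to-irradiance conversion and deconvolution of the point response function, can be sketched generically as follows; the characteristic-curve parameters, the Wiener-style filter, and the blur kernel are illustrative assumptions and do not reproduce the S-054 calibration.

```python
import numpy as np

def density_to_irradiance(density, gamma=1.8, log_e0=-2.0):
    """Invert D = gamma*(log10 E - log10 E0) over an assumed linear part of the H-D curve."""
    return 10 ** (density / gamma + log_e0)

def wiener_deconvolve(image, psf, snr=100.0):
    """Frequency-domain deconvolution of an assumed point response function."""
    h = np.fft.fft2(np.fft.ifftshift(psf), s=image.shape)
    g = np.fft.fft2(image)
    filt = np.conj(h) / (np.abs(h) ** 2 + 1.0 / snr)
    return np.real(np.fft.ifft2(g * filt))

img = np.random.default_rng(4).random((64, 64))        # toy density image
psf = np.zeros((64, 64)); psf[31:34, 31:34] = 1 / 9.0   # assumed 3x3 blur kernel
restored = wiener_deconvolve(density_to_irradiance(img), psf)
```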
NASA Astrophysics Data System (ADS)
Xu, Jing; Liu, Xiaofei; Wang, Yutian
2016-08-01
Parallel factor analysis is a widely used method for extracting qualitative and quantitative information about the analyte of interest from fluorescence excitation-emission matrices containing unknown components. Large-amplitude scattering influences the results of parallel factor analysis, and many methods of eliminating scattering have been proposed, each with its own advantages and disadvantages. Here, the combination of symmetrical subtraction and interpolated values is discussed, where "combination" refers both to combining results and to combining methods. Nine methods were used for comparison. The results show that the combination of results gives a better concentration prediction for all the components.
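One common way to handle scattering before PARAFAC is to replace EEM samples on the Rayleigh band with values interpolated from neighbouring emission wavelengths. The sketch below illustrates that interpolation step on a synthetic EEM; the band half-width is an assumed parameter, and the symmetrical-subtraction variant and the PARAFAC decomposition itself are not shown.

```python
import numpy as np

ex = np.arange(250, 451, 10)                  # excitation wavelengths (nm)
em = np.arange(260, 601, 5)                   # emission wavelengths (nm)
eem = (np.exp(-((em[None, :] - 420) / 40) ** 2)
       * np.exp(-((ex[:, None] - 330) / 30) ** 2))       # toy fluorophore signal
eem += 5 * (np.abs(em[None, :] - ex[:, None]) < 12)      # fake first-order Rayleigh ridge

half_width = 15.0                              # assumed half-width of the scatter band (nm)
for i, x in enumerate(ex):
    mask = np.abs(em - x) < half_width         # samples lying on the scatter band
    if mask.any() and (~mask).sum() > 1:
        eem[i, mask] = np.interp(em[mask], em[~mask], eem[i, ~mask])
```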
NASA Astrophysics Data System (ADS)
Zhuo, Shuangmu; Chen, Jianxin; Xie, Shusen; Hong, Zhibin; Jiang, Xingshan
2009-03-01
Intrinsic two-photon excited fluorescence (TPEF) and second-harmonic generation (SHG) signals are shown to differentiate between normal and neoplastic human esophageal stroma. It was found that TPEF and SHG signals from normal and neoplastic stroma exhibit different organization features, providing quantitative information about the biomorphology and biochemistry of tissue. By comparing normal with neoplastic stroma, there were significant differences in collagen-related changes, elastin-related changes, and alteration in proportions of matrix molecules, giving insight into the stromal changes associated with cancer progression and providing substantial potential to be applied in vivo to the clinical diagnosis of epithelial precancers and cancers.
Qualitative and quantitative processing of side-scan sonar data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dwan, F.S.; Anderson, A.L.; Hilde, T.W.C.
1990-06-01
Modern side-scan sonar systems allow vast areas of seafloor to be rapidly imaged and quantitatively mapped in detail. The application of remote sensing image processing techniques can be used to correct for various distortions inherent in raw sonography. Corrections are possible for water column, slant-range, aspect ratio, speckle and striping noise, multiple returns, power drop-off, and for georeferencing. The final products reveal seafloor features and patterns that are geometrically correct, georeferenced, and have improved signal/noise ratio. These products can be merged with other georeferenced data bases for further database management and information extraction. In order to compare data collected by different systems from a common area and to ground-truth measurements and geoacoustic models, quantitative correction must be made for calibrated sonar system and bathymetry effects. Such data inversion must account for system source level, beam pattern, time-varying gain, processing gain, transmission loss, absorption, insonified area, and grazing angle effects. Seafloor classification can then be performed on the calculated back-scattering strength using Lambert's Law and regression analysis. Examples are given using both approaches: image analysis and inversion of data based on the sonar equation.
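After inversion to backscattering strength, classification via Lambert's Law reduces to a regression of BS against the log of the squared sine of the grazing angle. A minimal sketch with synthetic data is shown below; the Lambert coefficient and angle range are invented.

```python
import numpy as np

rng = np.random.default_rng(2)
grazing_deg = np.linspace(15, 75, 40)
bs_true = -27.0 + 10 * np.log10(np.sin(np.deg2rad(grazing_deg)) ** 2)   # Lambert's rule
bs_meas = bs_true + rng.normal(scale=1.0, size=grazing_deg.size)        # noisy "inverted" BS

x = 10 * np.log10(np.sin(np.deg2rad(grazing_deg)) ** 2)
slope, bs_n = np.polyfit(x, bs_meas, 1)        # slope near 1 if Lambert's law holds
print(f"Lambert coefficient BS_N = {bs_n:.1f} dB, slope = {slope:.2f}")
```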
2011-01-01
Background: The term 'inequities' refers to avoidable differences rooted in injustice. This review examined whether or not, and how, quantitative studies identifying inequalities in risk factors and health service utilization for asthma explicitly addressed underlying inequities. Asthma was chosen because recent decades have seen strong increases in asthma prevalence in many international settings, and inequalities in risk factors and related outcomes. Methods: A review was conducted of studies that identified social inequalities in asthma-related outcomes or health service use in adult populations. Data were extracted on use of equity terms (objective evidence), and discussion of equity issues without using the exact terms (subjective evidence). Results: Of the 219 unique articles retrieved, 21 were eligible for inclusion. None used the terms equity/inequity. While all but one article traced at least partial pathways to inequity, only 52% proposed any intervention and 55% of these interventions focused exclusively on the more proximal, clinical level. Conclusions: Without more in-depth and systematic examination of inequities underlying asthma prevalence, quantitative studies may fail to provide the evidence required to inform equity-oriented interventions to address underlying circumstances restricting opportunities for health. PMID:21749720
Vera-Candioti, Luciana; Culzoni, María J; Olivieri, Alejandro C; Goicoechea, Héctor C
2008-11-01
Drug monitoring in serum samples was performed using second-order data generated by CE-DAD, processed with a suitable chemometric strategy. Carbamazepine could be accurately quantitated in the presence of its main metabolite (carbamazepine epoxide), other therapeutic drugs (lamotrigine, phenobarbital, phenytoin, phenylephrine, ibuprofen, acetaminophen, theophylline, caffeine, acetyl salicylic acid), and additional serum endogenous components. The analytical strategy consisted of the following steps: (i) serum sample clean-up to remove matrix interferences, (ii) data pre-processing, in order to reduce the background and to correct for electrophoretic time shifts, and (iii) resolution of fully overlapped CE peaks (corresponding to carbamazepine, its metabolite, lamotrigine and unexpected serum components) by the well-known multivariate curve resolution-alternating least squares algorithm, which extracts quantitative information that can be uniquely ascribed to the analyte of interest. The analyte concentration in serum samples ranged from 2.00 to 8.00 mg/L. Mean recoveries were 102.6% (s=7.7) for binary samples and 94.8% (s=13.5) for spiked serum samples, while a CV of 4.0% was computed for five replicates, indicative of the acceptable accuracy and precision of the proposed method.
Quantitative Evaluation of Performance during Robot-assisted Treatment.
Peri, E; Biffi, E; Maghini, C; Servodio Iammarrone, F; Gagliardi, C; Germiniasi, C; Pedrocchi, A; Turconi, A C; Reni, G
2016-01-01
This article is part of the Focus Theme of Methods of Information in Medicine on "Methodologies, Models and Algorithms for Patients Rehabilitation". The great potential of robots in extracting quantitative and meaningful data is not always exploited in clinical practice. The aim of the present work is to describe a simple parameter to assess the performance of subjects during upper limb robotic training, exploiting data automatically recorded by the robot, with no additional effort for patients and clinicians. Fourteen children affected by cerebral palsy (CP) underwent training with Armeo®Spring. Each session was evaluated with P, a simple parameter that depends on the overall performance recorded, and median and interquartile values were computed to perform a group analysis. Median (interquartile) values of P significantly increased from 0.27 (0.21) at T0 to 0.55 (0.27) at T1. This improvement was functionally validated by a significant increase of the Melbourne Assessment of Unilateral Upper Limb Function. The parameter described here was able to show variations in performance over time and enabled a quantitative evaluation of motion abilities in a way that is reliable with respect to a well-known clinical scale.
Quantitative phase and amplitude imaging using Differential-Interference Contrast (DIC) microscopy
NASA Astrophysics Data System (ADS)
Preza, Chrysanthe; O'Sullivan, Joseph A.
2009-02-01
We present an extension of the development of an alternating minimization (AM) method for the computation of a specimen's complex transmittance function (magnitude and phase) from DIC images. The ability to extract both quantitative phase and amplitude information from two rotationally-diverse DIC images (i.e., acquired by rotating the sample) extends previous efforts in computational DIC microscopy that have focused on quantitative phase imaging only. Simulation results show that the inverse problem at hand is sensitive to noise as well as to the choice of the AM algorithm parameters. The AM framework allows constraints and penalties on the magnitude and phase estimates to be incorporated in a principled manner. Towards this end, Green and De Pierro's "log-cosh" regularization penalty is applied to the magnitude of differences of neighboring values of the complex-valued function of the specimen during the AM iterations. The penalty is shown to be convex in the complex space. A procedure to approximate the penalty within the iterations is presented. In addition, a methodology to pre-compute AM parameters that are optimal with respect to the convergence rate of the AM algorithm is also presented. Both extensions of the AM method are investigated with simulations.
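The log-cosh penalty mentioned above is applied to the magnitude of differences between neighbouring values of the complex-valued specimen function. A minimal sketch of evaluating such a penalty is given below; the smoothing parameter beta and the toy complex field are assumptions, and the AM iterations themselves are not shown.

```python
import numpy as np

def log_cosh_penalty(f, beta=10.0):
    """Sum of log-cosh of neighbour-difference magnitudes of a complex 2D array."""
    dx = np.abs(np.diff(f, axis=0))            # differences between row neighbours
    dy = np.abs(np.diff(f, axis=1))            # differences between column neighbours
    return (np.log(np.cosh(beta * dx)).sum() + np.log(np.cosh(beta * dy)).sum()) / beta

# toy complex transmittance: unit magnitude with a smooth phase ramp
f = np.exp(1j * np.linspace(0, np.pi, 64))[None, :] * np.ones((64, 1))
print(f"penalty = {log_cosh_penalty(f):.3f}")
```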
Quantitative polarized Raman spectroscopy in highly turbid bone tissue
NASA Astrophysics Data System (ADS)
Raghavan, Mekhala; Sahar, Nadder D.; Wilson, Robert H.; Mycek, Mary-Ann; Pleshko, Nancy; Kohn, David H.; Morris, Michael D.
2010-05-01
Polarized Raman spectroscopy allows measurement of molecular orientation and composition and is widely used in the study of polymer systems. Here, we extend the technique to the extraction of quantitative orientation information from bone tissue, which is optically thick and highly turbid. We discuss multiple scattering effects in tissue and show that repeated measurements using a series of objectives of differing numerical apertures can be employed to assess the contributions of sample turbidity and depth of field on polarized Raman measurements. A high numerical aperture objective minimizes the systematic errors introduced by multiple scattering. We test and validate the use of polarized Raman spectroscopy using wild-type and genetically modified (oim/oim model of osteogenesis imperfecta) murine bones. Mineral orientation distribution functions show that mineral crystallites are not as well aligned (p<0.05) in oim/oim bones (28+/-3 deg) compared to wild-type bones (22+/-3 deg), in agreement with small-angle X-ray scattering results. In wild-type mice, backbone carbonyl orientation is 76+/-2 deg and in oim/oim mice, it is 72+/-4 deg (p>0.05). We provide evidence that simultaneous quantitative measurements of mineral and collagen orientations on intact bone specimens are possible using polarized Raman spectroscopy.
Tip-Enhanced Raman Voltammetry: Coverage Dependence and Quantitative Modeling.
Mattei, Michael; Kang, Gyeongwon; Goubert, Guillaume; Chulhai, Dhabih V; Schatz, George C; Jensen, Lasse; Van Duyne, Richard P
2017-01-11
Electrochemical atomic force microscopy tip-enhanced Raman spectroscopy (EC-AFM-TERS) was employed for the first time to observe nanoscale spatial variations in the formal potential, E0', of a surface-bound redox couple. TERS cyclic voltammograms (TERS CVs) of single Nile Blue (NB) molecules were acquired at different locations spaced 5-10 nm apart on an indium tin oxide (ITO) electrode. Analysis of TERS CVs at different coverages was used to verify the observation of single-molecule electrochemistry. The resulting TERS CVs were fit to the Laviron model for surface-bound electroactive species to quantitatively extract the formal potential E0' at each spatial location. Histograms of single-molecule E0' at each coverage indicate that the electrochemical behavior of the cationic oxidized species is less sensitive to local environment than the neutral reduced species. This information is not accessible using purely electrochemical methods or ensemble spectroelectrochemical measurements. We anticipate that quantitative modeling and measurement of site-specific electrochemistry with EC-AFM-TERS will have a profound impact on our understanding of the role of nanoscale electrode heterogeneity in applications such as electrocatalysis, biological electron transfer, and energy production and storage.
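Fitting CVs to the Laviron model in the reversible, surface-confined limit amounts to fitting a symmetric peak whose position gives E0'. The sketch below fits that peak shape to a synthetic one-electron trace; the noise level, potential axis, and amplitude are invented and this is not the measured single-molecule data.

```python
import numpy as np
from scipy.optimize import curve_fit

F, R, T = 96485.0, 8.314, 298.0            # Faraday constant, gas constant, temperature

def surface_peak(e, e0, amplitude):
    """Reversible surface-confined (Laviron-limit) peak shape for a one-electron couple."""
    xi = F * (e - e0) / (R * T)
    return amplitude * np.exp(xi) / (1.0 + np.exp(xi)) ** 2

e = np.linspace(-0.6, 0.0, 200)                              # potential axis (V)
rng = np.random.default_rng(3)
signal = surface_peak(e, e0=-0.35, amplitude=1.0) + rng.normal(scale=0.02, size=e.size)

popt, _ = curve_fit(surface_peak, e, signal, p0=(-0.3, 1.0))
print(f"fitted E0' = {popt[0] * 1000:.0f} mV")
```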
de Kanel, J; Vickery, W E; Waldner, B; Monahan, R M; Diamond, F X
1998-05-01
A forensic procedure for the quantitative confirmation of lysergic acid diethylamide (LSD) and the qualitative confirmation of its metabolite, N-demethyl-LSD, in blood, serum, plasma, and urine samples is presented. The Zymark RapidTrace was used to perform fully automated solid-phase extractions of all specimen types. After extract evaporation, confirmations were performed using liquid chromatography (LC) followed by positive electrospray ionization (ESI+) mass spectrometry/mass spectrometry (MS/MS) without derivatization. Quantitation of LSD was accomplished using LSD-d3 as an internal standard. The limit of quantitation (LOQ) for LSD was 0.05 ng/mL. The limit of detection (LOD) for both LSD and N-demethyl-LSD was 0.025 ng/mL. The recovery of LSD was greater than 95% at levels of 0.1 ng/mL and 2.0 ng/mL. For LSD at 1.0 ng/mL, the within-run and between-run (different day) relative standard deviation (RSD) was 2.2% and 4.4%, respectively.
Libong, Danielle; Bouchonnet, Stéphane; Ricordel, Ivan
2003-01-01
A gas chromatography-ion trap tandem mass spectrometry (GC-ion trap MS-MS) method for detection and quantitation of LSD in whole blood is presented. The sample preparation process, including a solid-phase extraction step with Bond Elut cartridges, was performed with 2 mL of whole blood. Eight microliters of the purified extract was injected with a cold on-column injection method. Positive chemical ionization was performed using acetonitrile as reagent gas; LSD was detected in the MS-MS mode. The chromatograms obtained from blood extracts showed the great selectivity of the method. GC-MS quantitation was performed using lysergic acid methylpropylamide as the internal standard. The response of the MS was linear for concentrations ranging from 0.02 ng/mL (detection threshold) to 10.0 ng/mL. Several parameters such as the choice of the capillary column, the choice of the internal standard and that of the ionization mode (positive CI vs. EI) were rationalized. Decomposition pathways under both ionization modes were studied. Within-day and between-day stability were evaluated.
Noor, Ayesha; Gunasekaran, S.; Vijayalakshmi, M. A.
2017-01-01
Background: Diabetes mellitus is a metabolic disorder characterized by chronic hyperglycemia. Plant extracts and their products are used as an alternative system of medicine for the treatment of diabetes. Aloe vera has traditionally been used to treat several diseases and exhibits antioxidant, anti-inflammatory, and wound-healing effects. Streptozotocin (STZ)-induced diabetic Wistar rats were used in this study to understand the potential protective effect of A. vera extract on the pancreatic islets. Objective: The aim of the present study was to evaluate the effect of A. vera extract on insulin secretion and pancreatic β-cell function by morphometric analysis of pancreatic islets in STZ-induced diabetic Wistar rats. Materials and Methods: After acclimatization, male Wistar rats, maintained as per the Committee for the Purpose of Control and Supervision of Experiments on Animals guidelines, were randomly divided into four groups of six rats each. Fasting plasma glucose and insulin levels were assessed. The effect of A. vera extract on the pancreatic islets of STZ-induced diabetic rats was evaluated by morphometric analysis. Results: Oral administration of A. vera extract (300 mg/kg) daily for 3 weeks restored blood glucose to normal levels, with a concomitant increase in insulin levels, in STZ-induced diabetic rats. Morphometric analysis of pancreatic sections revealed quantitative and qualitative gains in the number, diameter, volume, and area of the pancreatic islets of diabetic rats treated with A. vera extract when compared to untreated diabetic rats. Conclusion: A. vera extract exerts antidiabetic effects by improving insulin secretion and pancreatic β-cell function through restoration of pancreatic islet mass in STZ-induced diabetic Wistar rats. PMID:29333050
Åhrman, Emma; Hallgren, Oskar; Malmström, Lars; Hedström, Ulf; Malmström, Anders; Bjermer, Leif; Zhou, Xiao-Hong; Westergren-Thorsson, Gunilla; Malmström, Johan
2018-03-01
Remodeling of the extracellular matrix (ECM) is a common feature in lung diseases such as chronic obstructive pulmonary disease (COPD) and idiopathic pulmonary fibrosis (IPF). Here, we applied a sequential tissue extraction strategy to describe disease-specific remodeling of human lung tissue in disease, using end-stages of COPD and IPF. Our strategy was based on quantitative comparison of the disease proteomes, with specific focus on the matrisome, using data-independent acquisition and targeted data analysis (SWATH-MS). Our work provides an in-depth proteomic characterization of human lung tissue during impaired tissue remodeling. In addition, we show important quantitative and qualitative effects of the solubility of matrisome proteins. COPD was characterized by a disease-specific increase in ECM regulators, metalloproteinase inhibitor 3 (TIMP3) and matrix metalloproteinase 28 (MMP-28), whereas for IPF, impairment in cell adhesion proteins, such as collagen VI and laminins, was most prominent. For both diseases, we identified increased levels of proteins involved in the regulation of endopeptidase activity, with several proteins belonging to the serpin family. The established human lung quantitative proteome inventory and the construction of a tissue-specific protein assay library provides a resource for future quantitative proteomic analyses of human lung tissues. We present a sequential tissue extraction strategy to determine changes in extractability of matrisome proteins in end-stage COPD and IPF compared to healthy control tissue. Extensive quantitative analysis of the proteome changes of the disease states revealed altered solubility of matrisome proteins involved in ECM regulators and cell-ECM communication. The results highlight disease-specific remodeling mechanisms associated with COPD and IPF. Copyright © 2018 Elsevier B.V. All rights reserved.
Sevenster, M; Buurman, J; Liu, P; Peters, J F; Chang, P J
2015-01-01
Accumulating quantitative outcome parameters may contribute to constructing a healthcare organization in which outcomes of clinical procedures are reproducible and predictable. In imaging studies, measurements are the principal category of quantitative parameters. The purpose of this work is to develop and evaluate two natural language processing engines that extract finding and organ measurements from narrative radiology reports and to categorize extracted measurements by their "temporality". The measurement extraction engine is developed as a set of regular expressions. The engine was evaluated against a manually created ground truth. Automated categorization of measurement temporality is defined as a machine learning problem. A ground truth was manually developed based on a corpus of radiology reports. A maximum entropy model was created using features that characterize the measurement itself and its narrative context. The model was evaluated in a ten-fold cross-validation protocol. The measurement extraction engine has precision 0.994 and recall 0.991. Accuracy of the measurement classification engine is 0.960. The work contributes to machine understanding of radiology reports and may find application in software applications that process medical data.
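The measurement extraction engine is described as a set of regular expressions. The toy pattern below illustrates the idea for simple "number unit" and "A x B unit" forms only; the production engine described above is considerably more elaborate, and the example report text is invented.

```python
import re

MEASUREMENT = re.compile(
    r"(?P<value>\d+(?:\.\d+)?(?:\s*x\s*\d+(?:\.\d+)?)*)\s*(?P<unit>mm|cm|ml|cc)\b",
    re.IGNORECASE,
)

text = ("The lesion measures 2.3 x 1.8 cm, previously 2.0 x 1.5 cm. "
        "A 4 mm nodule is unchanged.")
for m in MEASUREMENT.finditer(text):
    print(m.group("value"), m.group("unit"))
```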
Rule-guided human classification of Volunteered Geographic Information
NASA Astrophysics Data System (ADS)
Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian
2017-05-01
During the last decade, web technologies and location sensing devices have evolved generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI acted as a platform of spatial data collection, in particular, when a group of public participants are involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data inherits problematic data classification. In VGI projects, the challenges of data classification are due to the following: (i) data is likely prone to subjective classification, (ii) remote contributions and flexible contribution mechanisms in most projects, and (iii) the uncertainty of spatial data and non-strict definitions of geographic features. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may be not recognizable particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits data mining techniques of Association Classification (AC) to extract descriptive (qualitative) rules of specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass-related features like forest, garden, park, and meadow. The findings of this study indicate the feasibility of the proposed approach.
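Associative classification rates candidate rules by their support and confidence over the contributed features. The toy example below, with invented feature records and a hypothetical has_paths -> park rule, shows how those two quantities are computed; it does not reproduce the paper's topological feature set.

```python
records = [
    {"touches_water": True, "has_paths": True, "label": "park"},
    {"touches_water": False, "has_paths": True, "label": "park"},
    {"touches_water": False, "has_paths": False, "label": "meadow"},
    {"touches_water": True, "has_paths": True, "label": "garden"},
    {"touches_water": False, "has_paths": True, "label": "park"},
]

def antecedent(record):
    return record["has_paths"]          # candidate rule: has_paths -> park (hypothetical)

matches = [r for r in records if antecedent(r)]
hits = [r for r in matches if r["label"] == "park"]
support = len(hits) / len(records)      # fraction of all records covered by the full rule
confidence = len(hits) / len(matches)   # fraction of antecedent matches with the right class
print(f"support={support:.2f}, confidence={confidence:.2f}")
```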
Quantification of synthetic cannabinoids in herbal smoking blends using NMR.
Dunne, Simon J; Rosengren-Holmberg, Jenny P
2017-05-01
Herbal smoking blends containing synthetic cannabinoids have become popular alternatives to marijuana. These products were previously sold in pre-packaged foil bags, but nowadays seizures usually contain synthetic cannabinoid powders together with unprepared plant materials. A question often raised by the Swedish police is how much smoking blend can be prepared from a certain amount of banned substance, in order to establish the severity of the crime. To address this question, information about the synthetic cannabinoid content in both the powder and the prepared herbal blends is necessary. In this work, an extraction procedure compatible with direct NMR quantification of synthetic cannabinoids in herbal smoking blends was developed. Extraction media, time and efficiency were tested for different carrier materials containing representative synthetic cannabinoids. The developed protocol uses a 30 min extraction step in d4-methanol in the presence of an internal standard, allowing direct quantitation of the extract by NMR. The accuracy of the developed method was tested using in-house prepared herbal smoking blends. The results showed deviations of less than 0.2% from the actual content, proving that the method is sufficiently accurate for these quantifications. Using this method, ten synthetic cannabinoids present in sixty-three different herbal blends seized by the Swedish police between October 2012 and April 2015 were quantified. The results showed a variation in cannabinoid content from 1.5% (w/w) for mixtures containing MDMB-CHMICA to over 5% (w/w) for mixtures containing 5F-AKB-48. This is important information for forensic experts when making theoretical calculations of production quantities in legal cases regarding "home-made" herbal smoking blends. Copyright © 2016 John Wiley & Sons, Ltd.
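For illustration, the standard internal-standard qNMR relation that such a protocol typically relies on can be written as a one-line calculation; the numerical values below are invented for the example and are not results from the study.

    def qnmr_mass(I_a, I_std, N_a, N_std, M_a, M_std, m_std):
        """Analyte mass from the integral ratio to an internal standard (standard qNMR relation):
        m_a = (I_a/I_std) * (N_std/N_a) * (M_a/M_std) * m_std."""
        return (I_a / I_std) * (N_std / N_a) * (M_a / M_std) * m_std

    # Illustrative numbers only: a cannabinoid signal integrating to 1.8 versus an
    # internal-standard signal of 1.0, 1 H vs 3 H per signal, molar masses of
    # 383.5 vs 194.2 g/mol, and 5.0 mg of standard added to the d4-methanol extract.
    print(round(qnmr_mass(1.8, 1.0, 1, 3, 383.5, 194.2, 5.0), 2), "mg")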
Shen, Weijian; Xu, Jinzhong; Yang, Wenquan; Shen, Chongyu; Zhao, Zengyun; Ding, Tao; Wu, Bin
2007-09-01
An analytical method based on solid-phase extraction and gas chromatography-mass spectrometry with two different ionization techniques was established for the simultaneous determination of 12 acetanilide herbicide residues in tea leaves. Herbicides were extracted from tea-leaf samples with ethyl acetate. The extract was cleaned up on an active carbon SPE column connected to a Florisil SPE column. Analytical screening was performed by gas chromatography (GC)-mass spectrometry (MS) in the selected ion monitoring (SIM) mode with either electron impact ionization (EI) or negative chemical ionization (NCI). The method was reliable and stable: the recoveries of all herbicides were in the range of 50% to 110% at three spiked levels (10 microg/kg, 20 microg/kg and 40 microg/kg), and the relative standard deviations (RSDs) were no more than 10.9%. The two ionization techniques are complementary: more ion fragmentation information can be obtained from the EI mode, while more molecular ion information is obtained from the NCI mode. By comparison of the two techniques, the selectivity of NCI-SIM was much better than that of EI-SIM. The sensitivities of both techniques were high; the limit of quantitation (LOQ) for each herbicide was no more than 2.0 microg/kg, and the limit of detection (LOD) with the NCI-SIM technique was much lower than that of EI-SIM when analyzing herbicides with several halogen atoms in the molecule.
NASA Astrophysics Data System (ADS)
Wang, Zhao; Yang, Shan; Wang, Shuguang; Shen, Yan
2017-10-01
The assessment of dynamic urban structure has long been hampered by a lack of timely and accurate spatial information, which has hindered measurements of structural continuity at the macroscale. Defense Meteorological Satellite Program Operational Linescan System (DMSP/OLS) nighttime light (NTL) data provide an ideal source for urban information detection, with a long time span, short time interval, and wide coverage. In this study, we extracted the physical boundaries of urban clusters from corrected NTL images and quantitatively analyzed the structure of the urban cluster system based on rank-size distribution, spatial metrics, and the Mann-Kendall trend test. Two levels of urban cluster systems in the Yangtze River Delta region (YRDR) were examined. We found that (1) in the entire YRDR, the urban cluster system showed a periodic process, with a significant trend towards even distribution before 2007 but an unequal growth pattern after 2007, and (2) at the metropolitan level, vast disparities exist among the four metropolitan areas in the fluctuations of the Pareto exponent, the speed of cluster expansion, and the dominance of the core cluster. The results suggest that the urban cluster information extracted from NTL data effectively reflects the evolving nature of regional urbanization, which in turn can aid in the planning of cities and help achieve more sustainable regional development.
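As a sketch of the rank-size analysis mentioned above (under the usual assumption of a power-law rank-size relation), the Pareto exponent can be estimated by ordinary least squares on the log-log rank-size plot; the cluster areas below are invented for the example.

    import numpy as np

    def pareto_exponent(cluster_sizes):
        """Estimate the rank-size (Pareto) exponent by OLS on log(rank) vs log(size)."""
        sizes = np.sort(np.asarray(cluster_sizes, dtype=float))[::-1]
        ranks = np.arange(1, sizes.size + 1)
        slope, _ = np.polyfit(np.log(sizes), np.log(ranks), 1)
        # Larger exponents correspond to a more even size distribution across clusters.
        return -slope

    # Illustrative urban-cluster areas (km^2), not values from the study.
    print(round(pareto_exponent([850, 420, 300, 150, 90, 60, 45, 30, 20, 12]), 2))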
Analysis of different image-based biofeedback models for improving cycling performances
NASA Astrophysics Data System (ADS)
Bibbo, D.; Conforto, S.; Bernabucci, I.; Carli, M.; Schmid, M.; D'Alessio, T.
2012-03-01
Sport practice can benefit from the quantitative assessment of task execution, which is strictly connected to the implementation of optimized training procedures. To this aim, it is interesting to explore the effectiveness of biofeedback training techniques. This implies a complete information-extraction chain comprising instrumented devices, processing algorithms and graphical user interfaces (GUIs) to extract valuable information (i.e. kinematics, dynamics, and electrophysiology) to be presented in real time to the athlete. In cycling, performance indexes displayed in a simple and perceivable way can help the cyclist optimize the pedaling. To this purpose, in this study four different GUIs have been designed and used in order to understand if and how graphical biofeedback can influence cycling performance. In particular, information related to the mechanical efficiency of pedaling is represented in each of the designed interfaces and then displayed to the user. This index is calculated in real time from the force signals exerted on the pedals during cycling. Instrumented bicycle pedals, previously designed and implemented in our laboratory, have been used to measure those force components. A group of subjects underwent an experimental protocol and pedaled both with graphical biofeedback (the interfaces were used in a randomized order) and without it. Preliminary results show how the effective perception of the biofeedback influences the motor performance.
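One common formulation of a pedaling-efficiency index, the share of the applied pedal force that acts tangentially to the crank, is sketched below; this is an assumption for illustration and not necessarily the exact index computed by the cited system.

    import numpy as np

    def index_of_effectiveness(tangential_force, radial_force):
        """Fraction of the total applied pedal force that is tangential (propulsive)
        over one crank revolution; one common efficiency index, assumed here."""
        tangential = np.asarray(tangential_force, dtype=float)
        radial = np.asarray(radial_force, dtype=float)
        total = np.sqrt(tangential**2 + radial**2)
        return np.sum(np.abs(tangential)) / np.sum(total)

    # Simulated force components over one revolution (N), purely illustrative.
    theta = np.linspace(0, 2 * np.pi, 360)
    f_tan = 120 * np.clip(np.sin(theta), 0, None)   # propulsive mainly on the downstroke
    f_rad = 60 * np.ones_like(theta)                # constant radial (wasted) component
    print(round(index_of_effectiveness(f_tan, f_rad), 2))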
Edirs, Salamet; Turak, Ablajan; Numonov, Sodik; Xin, Xuelei; Aisa, Haji Akber
2017-01-01
Using extraction yield, total polyphenolic content, antidiabetic activities (PTP-1B and α-glycosidase inhibition), and antioxidant activity (ABTS and DPPH) as indicator markers, the extraction conditions of the prescription Kursi Wufarikun Ziyabit (KWZ) were optimized by response surface methodology (RSM). Independent variables were ethanol concentration, extraction temperature, solid-to-solvent ratio, and extraction time. The RSM analysis showed that the four variables investigated have a significant effect (p < 0.05) on Y1, Y2, Y3, Y4, and Y5, with R^2 values of 0.9120, 0.9793, 0.9076, 0.9125, and 0.9709, respectively. Optimal conditions for the highest extraction yield of 39.28%, PTP-1B inhibition rate of 86.21%, α-glycosidase inhibition rate of 96.56%, and ABTS inhibition rate of 77.38% were derived at an ethanol concentration of 50.11%, extraction temperature of 72.06°C, solid-to-solvent ratio of 1:22.73 g/mL, and extraction time of 2.93 h. On the basis of the total polyphenol content of 48.44% under these optimal conditions, the quantitative analysis of the effective part of KWZ was characterized via a UPLC method; 12 main components were identified by standard compounds, all of them showed good regression within the test ranges, and their total content was 11.18%.
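A minimal sketch of the second-order (RSM) model fitting step, assuming coded factor levels and a least-squares fit of intercept, linear, interaction, and quadratic terms; the design and response values are invented and use only two factors for brevity.

    import numpy as np
    from itertools import combinations

    def fit_quadratic_response_surface(X, y):
        """Least-squares fit of a full second-order (RSM) model:
        intercept, linear, two-factor interaction and squared terms."""
        X = np.asarray(X, dtype=float)
        n, k = X.shape
        cols = [np.ones(n)] + [X[:, i] for i in range(k)]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
        cols += [X[:, i] ** 2 for i in range(k)]
        D = np.column_stack(cols)
        beta, *_ = np.linalg.lstsq(D, np.asarray(y, float), rcond=None)
        return beta

    # Tiny illustrative central-composite design in two coded factors, not the study's data.
    X = [[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0], [1.4, 0], [-1.4, 0], [0, 1.4], [0, -1.4]]
    y = [20, 28, 25, 35, 39, 38, 30, 22, 31, 24]
    print(np.round(fit_quadratic_response_surface(X, y), 2))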
Terpenes as green solvents for extraction of oil from microalgae.
Dejoye Tanzi, Celine; Abert Vian, Maryline; Ginies, Christian; Elmaataoui, Mohamed; Chemat, Farid
2012-07-09
Herein is described a green and original alternative procedure for the extraction of oil from microalgae. Extractions were carried out using terpenes obtained from renewable feedstocks as alternative solvents instead of hazardous petroleum solvents such as n-hexane. The method comprises two steps: Soxhlet extraction, followed by elimination of the solvent from the medium by Clevenger distillation. Oils extracted from microalgae were compared qualitatively and quantitatively. No significant difference was observed between the extracts, allowing us to conclude that the proposed method is green, clean and efficient.
Turetschek, Reinhard; Lyon, David; Desalegn, Getinet; Kaul, Hans-Peter; Wienkoop, Stefanie
2016-01-01
The proteomic study of non-model organisms, such as many crop plants, is challenging due to the lack of comprehensive genome information. Changing environmental conditions require the study and selection of adapted cultivars. Mutations inherent to cultivars hamper protein identification and thus considerably complicate qualitative and quantitative comparison in large-scale systems biology approaches. With this workflow, cultivar-specific mutations are detected from high-throughput comparative MS analyses by extracting sequence polymorphisms with de novo sequencing. Stringent criteria are suggested to filter for confident mutations. Subsequently, these polymorphisms complement the initially used database, which is then ready to use with any preferred database search algorithm. In our example, we thereby identified 26 specific mutations in two cultivars of Pisum sativum and achieved an increased number (17%) of peptide spectrum matches.
PVA/NaCl/MgO nanocomposites-microstructural analysis by whole pattern fitting method
NASA Astrophysics Data System (ADS)
Prashanth, K. S.; Mahesh, S. S.; Prakash, M. B. Nanda; Somashekar, R.; Nagabhushana, B. M.
2018-04-01
Nanofillers in a macromolecular matrix produce noteworthy changes in the structure and reactivity of polymer nanocomposites. Novel functional materials usually contain defects and are largely disordered, and the intriguing properties of these materials are often attributed to these defects. X-ray line profiles from powder diffraction provide quantitative information about the size distribution and shape of the diffracting domains, allowing conventional X-ray diffraction (XRD) techniques to enumerate microstructural information. In this study, MgO nanoparticles were prepared by a solution combustion method and PVA/NaCl/MgO nanocomposite films were synthesized by the solvent cast method. Microstructural parameters, viz. crystal defects such as stacking faults and twin faults, compositional inhomogeneity, crystallite size
Enriching semantic knowledge bases for opinion mining in big data applications.
Weichselbraun, A; Gindl, S; Scharl, A
2014-10-01
This paper presents a novel method for contextualizing and enriching large semantic knowledge bases for opinion mining with a focus on Web intelligence platforms and other high-throughput big data applications. The method is not only applicable to traditional sentiment lexicons, but also to more comprehensive, multi-dimensional affective resources such as SenticNet. It comprises the following steps: (i) identify ambiguous sentiment terms, (ii) provide context information extracted from a domain-specific training corpus, and (iii) ground this contextual information to structured background knowledge sources such as ConceptNet and WordNet. A quantitative evaluation shows a significant improvement when using an enriched version of SenticNet for polarity classification. Crowdsourced gold standard data in conjunction with a qualitative evaluation sheds light on the strengths and weaknesses of the concept grounding, and on the quality of the enrichment process.
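A toy sketch of the contextualization step (ii) described above: the polarity of an ambiguous sentiment term is estimated from a labelled domain corpus. The corpus, scores, and term are invented; the actual method additionally grounds such evidence to resources like ConceptNet and WordNet.

    # Tiny labelled domain corpus; documents are (tokens, polarity in [-1, 1]).
    # Data and scores are illustrative, not drawn from SenticNet or the cited corpora.
    corpus = [
        (["cold", "beer", "great"], 1.0),
        (["cold", "service", "rude"], -1.0),
        (["cold", "room", "uncomfortable"], -1.0),
    ]

    def contextualized_polarity(term, corpus):
        """Average document polarity over occurrences of an ambiguous sentiment term,
        a simple stand-in for corpus-based contextualization."""
        scores = [pol for tokens, pol in corpus if term in tokens]
        return sum(scores) / len(scores) if scores else 0.0

    print(round(contextualized_polarity("cold", corpus), 2))  # mildly negative in this domain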
Towards effective interactive three-dimensional colour postprocessing
NASA Technical Reports Server (NTRS)
Bailey, B. C.; Hajjar, J. F.; Abel, J. F.
1986-01-01
Recommendations for the development of effective three-dimensional, graphical color postprocessing are made. First, the evaluation of large, complex numerical models demands that a postprocessor be highly interactive. A menu of available functions should be provided and these operations should be performed quickly so that a sense of continuity and spontaneity exists during the post-processing session. Second, an agenda for three-dimensional color postprocessing is proposed. A postprocessor must be versatile with respect to application and basic algorithms must be designed so that they are flexible. A complete selection of tools is necessary to allow arbitrary specification of views, extraction of qualitative information, and access to detailed quantitative and problem information. Finally, full use of advanced display hardware is necessary if interactivity is to be maximized and effective postprocessing of today's numerical simulations is to be achieved.
Theoretical Foundations of Remote Sensing for Glacier Assessment and Mapping
NASA Technical Reports Server (NTRS)
Bishop, Michael P.; Bush, Andrew B. G.; Furfaro, Roberto; Gillespie, Alan R.; Hall, Dorothy K.; Haritashya, Umesh K.; Shroder, John F., Jr.
2014-01-01
The international scientific community is actively engaged in assessing ice sheet and alpine glacier fluctuations at a variety of scales. The availability of stereoscopic, multitemporal, and multispectral satellite imagery from the optical wavelength regions of the electromagnetic spectrum has greatly increased our ability to assess glaciological conditions and map the cryosphere. There are, however, important issues and limitations associated with accurate satellite information extraction and mapping, as well as new opportunities for assessment and mapping, that are all rooted in understanding the fundamentals of the radiation transfer cascade. We address the primary radiation transfer components, relate them to glacier dynamics and mapping, and summarize the analytical approaches that permit transformation of spectral variation into thematic and quantitative parameters. We also discuss the integration of satellite-derived information into numerical modeling approaches to facilitate understanding of glacier dynamics and causal mechanisms.
Quantitative and Qualitative Analysis of Biomarkers in Fusarium verticillioides
USDA-ARS?s Scientific Manuscript database
In this study, a combination HPLC-DART-TOF-MS system was utilized to identify and quantitatively analyze carbohydrates in wild type and mutant strains of Fusarium verticillioides. Carbohydrate fractions were isolated from F. verticillioides cellular extracts by HPLC using a cation-exchange size-excl...
Furlong, E.T.; Vaught, D.G.; Merten, L.M.; Foreman, W.T.; Gates, Paul M.
1996-01-01
A method for the determination of 79 semivolatile organic compounds (SOCs) and 4 surrogate compounds in soils and bottom sediment is described. The SOCs are extracted from bottom sediment by solvent extraction, followed by partial isolation using high-performance gel permeation chromatography (GPC). The SOCs then are qualitatively identified and quantitative concentrations determined by capillary-column gas chromatography/mass spectrometry (GC/MS). This method also is designed for an optional simultaneous isolation of polychlorinated biphenyls (PCBs) and organochlorine (OC) insecticides, including toxaphene. When OCs and PCBs are determined, an additional alumina-over-silica column chromatography step follows GPC cleanup, and quantitation is by dual capillary-column gas chromatography with electron-capture detection (GC/ECD). Bottom-sediment samples are centrifuged to remove excess water and extracted overnight with dichloromethane. The extract is concentrated, centrifuged, and then filtered through a 0.2-micrometer polytetrafluoroethylene syringe filter. Two aliquots of the sample extract then are quantitatively injected onto two polystyrene-divinylbenzene GPC columns connected in series. The SOCs are eluted with dichloromethane, a fraction containing the SOCs is collected, and some coextracted interferences, including elemental sulfur, are separated and discarded. The SOC-containing GPC fraction then is analyzed by GC/MS. When desired, a second aliquot from GPC is further processed for OCs and PCBs by combined alumina-over-silica column chromatography. The two fractions produced in this cleanup then are analyzed by GC/ECD. This report fully describes and is limited to the determination of SOCs by GC/MS.
Schenker, Yael; Fernandez, Alicia; Sudore, Rebecca; Schillinger, Dean
2011-01-01
Patient understanding in clinical informed consent is often poor. Little is known about the effectiveness of interventions to improve comprehension or the extent to which such interventions address different elements of understanding in informed consent. Purpose. To systematically review communication interventions to improve patient comprehension in informed consent for medical and surgical procedures. Data Sources. A systematic literature search of English-language articles in MEDLINE (1949-2008) and EMBASE (1974-2008) was performed. In addition, a published bibliography of empirical research on informed consent and the reference lists of all eligible studies were reviewed. Study Selection. Randomized controlled trials and controlled trials with nonrandom allocation were included if they compared comprehension in informed consent for a medical or surgical procedure. Only studies that used a quantitative, objective measure of understanding were included. All studies addressed informed consent for a needed or recommended procedure in actual patients. Data Extraction. Reviewers independently extracted data using a standardized form. All results were compared, and disagreements were resolved by consensus. Data Synthesis. Forty-four studies were eligible. Intervention categories included written information, audiovisual/multimedia, extended discussions, and test/feedback techniques. The majority of studies assessed patient understanding of procedural risks; other elements included benefits, alternatives, and general knowledge about the procedure. Only 6 of 44 studies assessed all 4 elements of understanding. Interventions were generally effective in improving patient comprehension, especially regarding risks and general knowledge. Limitations. Many studies failed to include adequate description of the study population, and outcome measures varied widely. Conclusions. A wide range of communication interventions improve comprehension in clinical informed consent. Decisions to enhance informed consent should consider the importance of different elements of understanding, beyond procedural risks, as well as feasibility and acceptability of the intervention to clinicians and patients. Conceptual clarity regarding the key elements of informed consent knowledge will help to focus improvements and standardize evaluations.
Schenker, Yael; Fernandez, Alicia; Sudore, Rebecca; Schillinger, Dean
2017-01-01
Background Patient understanding in clinical informed consent is often poor. Little is known about the effectiveness of interventions to improve comprehension or the extent to which such interventions address different elements of understanding in informed consent. Purpose To systematically review communication interventions to improve patient comprehension in informed consent for medical and surgical procedures. Data Sources A systematic literature search of English-language articles in MEDLINE (1949–2008) and EMBASE (1974–2008) was performed. In addition, a published bibliography of empirical research on informed consent and the reference lists of all eligible studies were reviewed. Study Selection Randomized controlled trials and controlled trials with non-random allocation were included if they compared comprehension in informed consent for a medical or surgical procedure. Only studies that used a quantitative, objective measure of understanding were included. All studies addressed informed consent for a needed or recommended procedure in actual patients. Data Extraction Reviewers independently extracted data using a standardized form. All results were compared, and disagreements were resolved by consensus. Data Synthesis Forty-four studies were eligible. Intervention categories included written information, audiovisual/multimedia, extended discussions, and test/feedback techniques. The majority of studies assessed patient understanding of procedural risks; other elements included benefits, alternatives, and general knowledge about the procedure. Only 6 of 44 studies assessed all 4 elements of understanding. Interventions were generally effective in improving patient comprehension, especially regarding risks and general knowledge. Limitations Many studies failed to include adequate description of the study population, and outcome measures varied widely. Conclusions A wide range of communication interventions improve comprehension in clinical informed consent. Decisions to enhance informed consent should consider the importance of different elements of understanding, beyond procedural risks, as well as feasibility and acceptability of the intervention to clinicians and patients. Conceptual clarity regarding the key elements of informed consent knowledge will help to focus improvements and standardize evaluations. PMID:20357225
Moran, Mika; Van Cauwenberg, Jelle; Hercky-Linnewiel, Rachel; Cerin, Ester; Deforche, Benedicte; Plaut, Pnina
2014-07-17
While physical activity (PA) provides many physical, social, and mental health benefits for older adults, they are the least physically active age group. Ecological models highlight the importance of the physical environment in promoting PA. However, results of previous quantitative research revealed inconsistencies in environmental correlates of older adults' PA that may be explained by methodological issues. Qualitative studies can inform and complement quantitative research on environment-PA relationships by providing insight into how and why the environment influences participants' PA behaviors. The current study aimed to provide a systematic review of qualitative studies exploring the potential impact of the physical environment on older adults' PA behaviors. A systematic search was conducted in databases of various disciplines, including: health, architecture and urban planning, transportation, and interdisciplinary databases. From 3,047 articles identified in the initial search, 31 articles published from 1996 to 2012 met all inclusion criteria. An inductive content analysis was performed on the extracted findings to identify emerging environmental elements related to older adults' PA. The identified environmental elements were then grouped by study methodology [indoor interviews (individual or focus groups) vs spatial methods (photo-voice, observations, walk-along interviews)]. This review provides detailed information about environmental factors that potentially influence older adults' PA behaviors. These factors were categorized into five themes: pedestrian infrastructure, safety, access to amenities, aesthetics, and environmental conditions. Environmental factors especially relevant to older adults (i.e., access to facilities, green open spaces and rest areas) tended to emerge more frequently in studies that combined interviews with spatial qualitative methods. Findings showed that qualitative research can provide in-depth information on environmental elements that influence older adults' PA. Future qualitative studies on the physical environment and older adults' PA would benefit from combining interviews with more spatially-oriented methods. Multidisciplinary mixed-methods studies are recommended to establish quantitative relationships complemented with in-depth qualitative information.
Ceol, M; Forino, M; Gambaro, G; Sauer, U; Schleicher, E D; D'Angelo, A; Anglani, F
2001-01-01
Gene expression can be examined with different techniques, including the ribonuclease protection assay (RPA), in situ hybridisation (ISH), and quantitative reverse transcription-polymerase chain reaction (RT/PCR). These methods differ considerably in their sensitivity and precision in detecting and quantifying low-abundance mRNA. Although there is evidence that RT/PCR can be performed in a quantitative manner, the quantitative capacity of this method is generally underestimated. To demonstrate that the comparative kinetic RT/PCR strategy, which uses a housekeeping gene as an internal standard, is a quantitative method for detecting significant differences in mRNA levels between different samples, the inhibitory effect of heparin on phorbol 12-myristate 13-acetate (PMA)-induced TGF-beta1 mRNA expression was evaluated by RT/PCR and by RPA, the standard method of mRNA quantification, and the results were compared. The reproducibility of RT/PCR amplification was calculated by comparing the quantities of G3PDH and TGF-beta1 PCR products, generated during the exponential phases, estimated from two different RT/PCR runs (G3PDH, r = 0.968, P = 0.0000; TGF-beta1, r = 0.966, P = 0.0000). The quantitative capacity of comparative kinetic RT/PCR was demonstrated by comparing the results obtained from RPA and RT/PCR using linear regression analysis. Starting from the same RNA extraction, but using only 1% of the RNA for RT/PCR compared to RPA, a significant correlation was observed (r = 0.984, P = 0.0004). Moreover, morphometric analysis of the ISH signal was applied for the semi-quantitative evaluation of the expression and localisation of TGF-beta1 mRNA in the entire cell population. Our results demonstrate the close similarity of the RT/PCR and RPA methods in providing quantitative information on mRNA expression and indicate the possibility of adopting comparative kinetic RT/PCR as a reliable quantitative method of mRNA analysis. Copyright 2001 Wiley-Liss, Inc.
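As an illustrative sketch of comparative kinetic quantification (not the authors' exact procedure), product amounts sampled during the exponential phase can be fitted log-linearly and the extrapolated initial signals of the target and housekeeping genes compared; all numbers below are invented.

    import numpy as np

    def exponential_phase_fit(cycles, product):
        """Log-linear fit of PCR product in the exponential phase; returns per-cycle
        amplification efficiency and the extrapolated initial signal."""
        cycles = np.asarray(cycles, float)
        logp = np.log(np.asarray(product, float))
        slope, intercept = np.polyfit(cycles, logp, 1)
        return np.exp(slope), np.exp(intercept)

    # Illustrative densitometry values sampled during the exponential phase (arbitrary units).
    eff_t, n0_target = exponential_phase_fit([18, 20, 22, 24], [1.0, 3.6, 13.0, 47.0])
    eff_h, n0_house = exponential_phase_fit([14, 16, 18, 20], [2.0, 7.4, 27.0, 100.0])
    print(round(n0_target / n0_house, 4))  # target expression normalized to the housekeeping gene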
Extracting microtubule networks from superresolution single-molecule localization microscopy data
Zhang, Zhen; Nishimura, Yukako; Kanchanawong, Pakorn
2017-01-01
Microtubule filaments form ubiquitous networks that specify spatial organization in cells. However, quantitative analysis of microtubule networks is hampered by their complex architecture, limiting insights into the interplay between their organization and cellular functions. Although superresolution microscopy has greatly facilitated high-resolution imaging of microtubule filaments, extraction of complete filament networks from such data sets is challenging. Here we describe a computational tool for automated retrieval of microtubule filaments from single-molecule-localization–based superresolution microscopy images. We present a user-friendly, graphically interfaced implementation and a quantitative analysis of microtubule network architecture phenotypes in fibroblasts. PMID:27852898
NASA Technical Reports Server (NTRS)
Mcguirk, James P.
1990-01-01
Satellite data analysis tools are developed and implemented for the diagnosis of atmospheric circulation systems over the tropical Pacific Ocean. The tools include statistical multi-variate procedures, a multi-spectral radiative transfer model, and the global spectral forecast model at NMC. Data include in-situ observations; satellite observations from VAS (moisture, infrared and visible), NOAA polar orbiters (including TIROS Operational Vertical Sounder (TOVS) multi-channel sounding data and OLR grids) and the scanning multichannel microwave radiometer (SMMR); and European Centre for Medium-Range Weather Forecasts (ECMWF) analyses. A primary goal is a better understanding of the relation between synoptic structures of the area, particularly tropical plumes, and the general circulation, especially the Hadley circulation. A second goal is the definition of the quantitative structure and behavior of all Pacific tropical synoptic systems. Finally, strategies are examined for extracting new and additional information from existing satellite observations. Although moisture structure is emphasized, thermal patterns are also analyzed. Both horizontal and vertical structures are studied, and objective quantitative results are emphasized.
da Silva Nunes, Wilian; de Oliveira, Caroline Silva; Alcantara, Glaucia Braz
2016-04-01
This study reports the chemical composition of five types of industrial frozen fruit pulps (acerola, cashew, grape, passion fruit and pineapple fruit pulps) and compares them with homemade pulps at two different stages of ripening. The fruit pulps were characterized by analyzing their metabolic profiles and determining their ethanol content using quantitative Nuclear Magnetic Resonance (qNMR). In addition, principal component analysis (PCA) was applied to extract more information from the NMR data. We detected ethanol in all industrial and homemade pulps; and acetic acid in cashew, grape and passion fruit industrial and homemade pulps. The ethanol content in some industrial pulps is above the level recommended by regulatory agencies and is near the levels of some post-ripened homemade pulps. This study demonstrates that qNMR can be used to rapidly detect ethanol content in frozen fruit pulps and food derivatives. Copyright © 2015 John Wiley & Sons, Ltd.
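A minimal sketch of the PCA step applied to bucketed NMR spectra, assuming a samples-by-buckets matrix; the toy spectra below are invented and only illustrate the computation.

    import numpy as np

    def pca_scores(X, n_components=2):
        """Principal component scores via SVD of the mean-centred data matrix
        (rows = samples, columns = NMR buckets)."""
        X = np.asarray(X, float)
        Xc = X - X.mean(axis=0)
        U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
        return Xc @ Vt[:n_components].T

    # Toy bucketed spectra for four pulps (arbitrary values, not the study's data).
    spectra = [[1.0, 0.2, 5.1, 0.9],
               [1.1, 0.3, 5.0, 1.0],
               [3.9, 1.8, 2.2, 0.5],
               [4.1, 1.7, 2.0, 0.6]]
    print(np.round(pca_scores(spectra), 2))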
Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S
2014-02-11
Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems, detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells, starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001.
Giraud, Nicolas; Blackledge, Martin; Goldman, Maurice; Böckmann, Anja; Lesage, Anne; Penin, François; Emsley, Lyndon
2005-12-28
A detailed analysis of nitrogen-15 longitudinal relaxation times in microcrystalline proteins is presented. A theoretical model to quantitatively interpret relaxation times is developed in terms of motional amplitude and characteristic time scale. Different averaging schemes are examined in order to propose an analysis of relaxation curves that takes into account the specificity of MAS experiments. In particular, it is shown that magic angle spinning averages the relaxation rate experienced by a single spin over one rotor period, resulting in individual relaxation curves that are dependent on the orientation of their corresponding carousel with respect to the rotor axis. Powder averaging thus leads to a nonexponential behavior in the observed decay curves. We extract dynamic information from experimental decay curves, using a diffusion in a cone model. We apply this study to the analysis of spin-lattice relaxation rates of the microcrystalline protein Crh at two different fields and determine differential dynamic parameters for several residues in the protein.
Phase correlation imaging of unlabeled cell dynamics
NASA Astrophysics Data System (ADS)
Ma, Lihong; Rajshekhar, Gannavarpu; Wang, Ru; Bhaduri, Basanta; Sridharan, Shamira; Mir, Mustafa; Chakraborty, Arindam; Iyer, Rajashekar; Prasanth, Supriya; Millet, Larry; Gillette, Martha U.; Popescu, Gabriel
2016-09-01
We present phase correlation imaging (PCI) as a novel approach to study cell dynamics in a spatially resolved manner. PCI relies on quantitative phase imaging time-lapse data and, as such, functions in label-free mode, without the limitations associated with exogenous markers. The correlation time map output by PCI reports on the dynamics of intracellular mass transport. Specifically, we show that PCI can quantitatively extract the diffusion coefficient map associated with live cells, as well as with standard Brownian particles. Due to its high sensitivity to mass transport, PCI can be applied to studying the integrity of actin polymerization dynamics. Our results indicate that cytochalasin D (cyto-D) treatment, which blocks actin polymerization, has a dominant effect at large spatial scales, in the region surrounding the cell. We found that PCI can distinguish between senescent and quiescent cells, which is currently extremely difficult without specific markers. We anticipate that PCI will be used alongside established, fluorescence-based techniques to enable valuable new studies of cell function.
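A simplified reduction of the correlation-time mapping described above is sketched below: the per-pixel temporal autocorrelation of a quantitative-phase stack is computed and the 1/e decay lag is taken as the correlation time. The optional conversion to a diffusion coefficient assumes a rough diffusive scaling D ~ L^2/(4*tau) with a user-supplied probing length L, which is an assumption for illustration rather than the published PCI model.

    import numpy as np

    def correlation_time_map(phase_stack, dt, probe_length_um=None):
        """Per-pixel 1/e correlation time from a (t, y, x) quantitative-phase stack.
        If a probing length L is given, also return D ~ L^2 / (4*tau), a rough
        diffusive scaling assumed here for illustration only."""
        stack = np.asarray(phase_stack, float)
        stack = stack - stack.mean(axis=0)                  # remove the static background
        n_t = stack.shape[0]
        # Normalized temporal autocorrelation via FFT along the time axis (zero-padded).
        f = np.fft.rfft(stack, n=2 * n_t, axis=0)
        acf = np.fft.irfft(f * np.conj(f), axis=0)[:n_t]
        acf /= acf[0]
        tau_idx = np.argmax(acf < np.exp(-1.0), axis=0)     # first lag below 1/e
        tau = tau_idx * dt
        if probe_length_um is None:
            return tau
        return tau, probe_length_um**2 / (4.0 * np.maximum(tau, dt))

    # Synthetic stack just to show the call signature (64 frames of 8 x 8 pixels).
    demo = np.random.default_rng(0).normal(size=(64, 8, 8))
    print(correlation_time_map(demo, dt=0.5).shape)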
NASA Astrophysics Data System (ADS)
Gao, Li; Zhang, Yihui; Malyarchuk, Viktor; Jia, Lin; Jang, Kyung-In; Chad Webb, R.; Fu, Haoran; Shi, Yan; Zhou, Guoyan; Shi, Luke; Shah, Deesha; Huang, Xian; Xu, Baoxing; Yu, Cunjiang; Huang, Yonggang; Rogers, John A.
2014-09-01
Characterization of temperature and thermal transport properties of the skin can yield important information of relevance to both clinical medicine and basic research in skin physiology. Here we introduce an ultrathin, compliant skin-like, or ‘epidermal’, photonic device that combines colorimetric temperature indicators with wireless stretchable electronics for thermal measurements when softly laminated on the skin surface. The sensors exploit thermochromic liquid crystals patterned into large-scale, pixelated arrays on thin elastomeric substrates; the electronics provide means for controlled, local heating by radio frequency signals. Algorithms for extracting patterns of colour recorded from these devices with a digital camera and computational tools for relating the results to underlying thermal processes near the skin surface lend quantitative value to the resulting data. Application examples include non-invasive spatial mapping of skin temperature with milli-Kelvin precision (±50 mK) and sub-millimetre spatial resolution. Demonstrations in reactive hyperaemia assessments of blood flow and hydration analysis establish relevance to cardiovascular health and skin care, respectively.
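A toy sketch of the colour-to-temperature step described above: recorded hue values are mapped to temperature through an assumed monotonic calibration curve obtained from reference measurements; the calibration points and hues are invented.

    import numpy as np

    def hue_to_temperature(hue_deg, calib_hue, calib_temp_c):
        """Map measured hue (degrees) to temperature by linear interpolation of a
        calibration curve; assumes hue increases monotonically with temperature
        over the liquid-crystal working range."""
        return np.interp(hue_deg, calib_hue, calib_temp_c)

    # Invented calibration for a thermochromic liquid-crystal pixel array.
    calib_hue = [20, 60, 110, 160, 210]          # degrees
    calib_temp = [32.0, 33.0, 34.0, 35.0, 36.0]  # deg C
    measured_hue = np.array([[55, 120], [150, 200]])
    print(np.round(hue_to_temperature(measured_hue, calib_hue, calib_temp), 2))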
A Modeling Approach for Burn Scar Assessment Using Natural Features and Elastic Property
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsap, L V; Zhang, Y; Goldgof, D B
2004-04-02
A modeling approach is presented for quantitative burn scar assessment. Emphases are given to: (1) constructing a finite element model from natural image features with an adaptive mesh, and (2) quantifying the Young's modulus of scars using the finite element model and the regularization method. A set of natural point features is extracted from the images of burn patients. A Delaunay triangle mesh is then generated that adapts to the point features. A 3D finite element model is built on top of the mesh with the aid of range images providing the depth information. The Young's modulus of scars is quantified with a simplified regularization functional, assuming that the knowledge of scar's geometry is available. The consistency between the Relative Elasticity Index and the physician's rating based on the Vancouver Scale (a relative scale used to rate burn scars) indicates that the proposed modeling approach has high potentials for image-based quantitative burn scar assessment.
Gao, Li; Zhang, Yihui; Malyarchuk, Viktor; Jia, Lin; Jang, Kyung-In; Webb, R Chad; Fu, Haoran; Shi, Yan; Zhou, Guoyan; Shi, Luke; Shah, Deesha; Huang, Xian; Xu, Baoxing; Yu, Cunjiang; Huang, Yonggang; Rogers, John A
2014-09-19
Characterization of temperature and thermal transport properties of the skin can yield important information of relevance to both clinical medicine and basic research in skin physiology. Here we introduce an ultrathin, compliant skin-like, or 'epidermal', photonic device that combines colorimetric temperature indicators with wireless stretchable electronics for thermal measurements when softly laminated on the skin surface. The sensors exploit thermochromic liquid crystals patterned into large-scale, pixelated arrays on thin elastomeric substrates; the electronics provide means for controlled, local heating by radio frequency signals. Algorithms for extracting patterns of colour recorded from these devices with a digital camera and computational tools for relating the results to underlying thermal processes near the skin surface lend quantitative value to the resulting data. Application examples include non-invasive spatial mapping of skin temperature with milli-Kelvin precision (±50 mK) and sub-millimetre spatial resolution. Demonstrations in reactive hyperaemia assessments of blood flow and hydration analysis establish relevance to cardiovascular health and skin care, respectively.
Wells, Darren M.; French, Andrew P.; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J.; Pridmore, Tony P.
2012-01-01
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana. PMID:22527394
Sankar, Martial; Nieminen, Kaisa; Ragni, Laura; Xenarios, Ioannis; Hardtke, Christian S
2014-01-01
Among various advantages, their small size makes model organisms preferred subjects of investigation. Yet, even in model systems, detailed analysis of numerous developmental processes at the cellular level is severely hampered by their scale. For instance, secondary growth of Arabidopsis hypocotyls creates a radial pattern of highly specialized tissues that comprises several thousand cells, starting from a few dozen. This dynamic process is difficult to follow because of its scale and because it can only be investigated invasively, precluding comprehensive understanding of the cell proliferation, differentiation, and patterning events involved. To overcome this limitation, we established an automated quantitative histology approach. We acquired hypocotyl cross-sections from tiled high-resolution images and extracted their information content using custom high-throughput image processing and segmentation. Coupled with automated cell type recognition through machine learning, we could establish a cellular resolution atlas that reveals vascular morphodynamics during secondary growth, for example equidistant phloem pole formation. DOI: http://dx.doi.org/10.7554/eLife.01567.001 PMID:24520159
Wells, Darren M; French, Andrew P; Naeem, Asad; Ishaq, Omer; Traini, Richard; Hijazi, Hussein; Bennett, Malcolm J; Pridmore, Tony P
2012-06-05
Roots are highly responsive to environmental signals encountered in the rhizosphere, such as nutrients, mechanical resistance and gravity. As a result, root growth and development is very plastic. If this complex and vital process is to be understood, methods and tools are required to capture the dynamics of root responses. Tools are needed which are high-throughput, supporting large-scale experimental work, and provide accurate, high-resolution, quantitative data. We describe and demonstrate the efficacy of the high-throughput and high-resolution root imaging systems recently developed within the Centre for Plant Integrative Biology (CPIB). This toolset includes (i) robotic imaging hardware to generate time-lapse datasets from standard cameras under infrared illumination and (ii) automated image analysis methods and software to extract quantitative information about root growth and development both from these images and via high-resolution light microscopy. These methods are demonstrated using data gathered during an experimental study of the gravitropic response of Arabidopsis thaliana.
Quantitative Understanding of SHAPE Mechanism from RNA Structure and Dynamics Analysis.
Hurst, Travis; Xu, Xiaojun; Zhao, Peinan; Chen, Shi-Jie
2018-05-10
The selective 2'-hydroxyl acylation analyzed by primer extension (SHAPE) method probes RNA local structural and dynamic information at single nucleotide resolution. To gain quantitative insights into the relationship between nucleotide flexibility, RNA 3D structure, and SHAPE reactivity, we develop a 3D Structure-SHAPE Relationship model (3DSSR) to rebuild SHAPE profiles from 3D structures. The model starts from RNA structures and combines nucleotide interaction strength and conformational propensity, ligand (SHAPE reagent) accessibility, and base-pairing pattern through a composite function to quantify the correlation between SHAPE reactivity and nucleotide conformational stability. The 3DSSR model shows the relationship between SHAPE reactivity and RNA structure and energetics. Comparisons between the 3DSSR-predicted SHAPE profile and the experimental SHAPE data show correlation, suggesting that the extracted analytical function may have captured the key factors that determine the SHAPE reactivity profile. Furthermore, the theory offers an effective method to sieve RNA 3D models and exclude models that are incompatible with experimental SHAPE data.
Cho, GyeYoon; Han, KyuChul; Yoon, JinYoung
2015-01-01
Objectives: Scolopendra subspinipes mutilans (S. subspinipes mutilans) is known as a traditional medicine and includes various amino acids, peptides and proteins. The amino acids in the pharmacopuncture extracted from S. subspinipes mutilans by using derivatization methods were analyzed quantitatively and qualitatively by using high performance liquid chromatography (HPLC) over a 12 month period to confirm its stability. Methods: Amino acids of pharmacopuncture extracted from S. subspinipes mutilans were derived by using O-phthaldialdehyde (OPA) & 9-fluorenyl methoxy carbonyl chloride (FMOC) reagent and were analyzed using HPLC. The amino acids were detected by using a diode array detector (DAD) and a fluorescence detector (FLD) to compare a mixed amino acid standard (STD) to the pharmacopuncture from centipedes. The stability tests on the pharmacopuncture from centipedes were done using HPLC for three conditions: a room temperature test chamber, an acceleration test chamber, and a cold test chamber. Results: The pharmacopuncture from centipedes was prepared by using the method of the Korean Pharmacopuncture Institute (KPI) and through quantitative analyses was shown to contain 9 amino acids of the 16 amino acids in the mixed amino acid STD. The amounts of the amino acids in the pharmacopuncture from centipedes were 34.37 ppm of aspartate, 123.72 ppm of arginine, 170.63 ppm of alanine, 59.55 ppm of leucine and 57 ppm of lysine. The relative standard deviation (RSD %) results for the pharmacopuncture from centipedes had a maximum value of 14.95% and minimum value of 1.795% on the room temperature test chamber, the acceleration test chamber and the cold test chamber stability tests. Conclusion: Stability tests on and quantitative and qualitative analyses of the amino acids in the pharmacopuncture extracted from centipedes by using derivatization methods were performed by using HPLC. Through research, we hope to determine the relationship between time and the concentrations of the amino acids in the pharmacopuncture extracted from centipedes. PMID:25830058
Cho, GyeYoon; Han, KyuChul; Yoon, JinYoung
2015-03-01
Scolopendra subspinipes mutilans (S. subspinipes mutilans) is known as a traditional medicine and includes various amino acids, peptides and proteins. The amino acids in the pharmacopuncture extracted from S. subspinipes mutilans by using derivatization methods were analyzed quantitatively and qualitatively by using high performance liquid chromatography (HPLC) over a 12 month period to confirm its stability. Amino acids of pharmacopuncture extracted from S. subspinipes mutilans were derived by using O-phthaldialdehyde (OPA) & 9-fluorenyl methoxy carbonyl chloride (FMOC) reagent and were analyzed using HPLC. The amino acids were detected by using a diode array detector (DAD) and a fluorescence detector (FLD) to compare a mixed amino acid standard (STD) to the pharmacopuncture from centipedes. The stability tests on the pharmacopuncture from centipedes were done using HPLC for three conditions: a room temperature test chamber, an acceleration test chamber, and a cold test chamber. The pharmacopuncture from centipedes was prepared by using the method of the Korean Pharmacopuncture Institute (KPI) and through quantitative analyses was shown to contain 9 amino acids of the 16 amino acids in the mixed amino acid STD. The amounts of the amino acids in the pharmacopuncture from centipedes were 34.37 ppm of aspartate, 123.72 ppm of arginine, 170.63 ppm of alanine, 59.55 ppm of leucine and 57 ppm of lysine. The relative standard deviation (RSD %) results for the pharmacopuncture from centipedes had a maximum value of 14.95% and minimum value of 1.795% on the room temperature test chamber, the acceleration test chamber and the cold test chamber stability tests. Stability tests on and quantitative and qualitative analyses of the amino acids in the pharmacopuncture extracted from centipedes by using derivatization methods were performed by using HPLC. Through research, we hope to determine the relationship between time and the concentrations of the amino acids in the pharmacopuncture extracted from centipedes.
NASA Astrophysics Data System (ADS)
Shi, Wenzhong; Deng, Susu; Xu, Wenbing
2018-02-01
For automatic landslide detection, landslide morphological features should be quantitatively expressed and extracted. High-resolution Digital Elevation Models (DEMs) derived from airborne Light Detection and Ranging (LiDAR) data allow fine-scale morphological features to be extracted, but noise in DEMs influences morphological feature extraction, and the multi-scale nature of landslide features should be considered. This paper proposes a method to extract landslide morphological features characterized by homogeneous spatial patterns. Both profile and tangential curvature are utilized to quantify land surface morphology, and a local Gi* statistic is calculated for each cell to identify significant patterns of clustering of similar morphometric values. The method was tested on both synthetic surfaces simulating natural terrain and airborne LiDAR data acquired over an area dominated by shallow debris slides and flows. The test results of the synthetic data indicate that the concave and convex morphologies of the simulated terrain features at different scales and distinctness could be recognized using the proposed method, even when random noise was added to the synthetic data. In the test area, cells with large local Gi* values were extracted at a specified significance level from the profile and the tangential curvature image generated from the LiDAR-derived 1-m DEM. The morphologies of landslide main scarps, source areas and trails were clearly indicated, and the morphological features were represented by clusters of extracted cells. A comparison with the morphological feature extraction method based on curvature thresholds proved the proposed method's robustness to DEM noise. When verified against a landslide inventory, the morphological features of almost all recent (< 5 years) landslides and approximately 35% of historical (> 10 years) landslides were extracted. This finding indicates that the proposed method can facilitate landslide detection, although the cell clusters extracted from curvature images should be filtered using a filtering strategy based on supplementary information provided by expert knowledge or other data sources.
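A small sketch of the local Gi* computation on a curvature raster, assuming binary weights over a square moving window (the study's actual neighbourhood definition may differ); the raster is synthetic.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def local_gi_star(values, window=3):
        """Getis-Ord Gi* for each raster cell using a square moving window with
        binary weights (the cell itself included)."""
        x = np.asarray(values, float)
        n = x.size
        mean, std = x.mean(), x.std(ddof=0)
        w = window * window                                  # number of neighbours incl. the cell
        local_sum = uniform_filter(x, size=window, mode="nearest") * w
        num = local_sum - mean * w
        den = std * np.sqrt((n * w - w * w) / (n - 1))
        return num / den

    # Toy curvature raster with a small convex cluster embedded in noise.
    rng = np.random.default_rng(1)
    curv = rng.normal(0, 0.01, size=(50, 50))
    curv[20:24, 20:24] += 0.2
    gi = local_gi_star(curv)
    print((gi > 2.58).sum(), "cells flagged at roughly the 99% confidence level")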
Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection
Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan
2011-01-01
Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide direct information only on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity, known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system are demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O2 content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. PMID:21497566
Multiwavelength UV/visible spectroscopy for the quantitative investigation of platelet quality
NASA Astrophysics Data System (ADS)
Mattley, Yvette D.; Leparc, German F.; Potter, Robert L.; Garcia-Rubio, Luis H.
1998-04-01
The quality of platelets transfused is vital to the effectiveness of the transfusion. Freshly prepared, discoid platelets are the most effective treatment for preventing spontaneous hemorrhage or for stopping an abnormal bleeding event. Current methodology for the routine testing of platelet quality involves random pH testing of platelet-rich plasma and visual inspection of platelet-rich plasma for a swirling pattern indicative of the discoid shape of the cells. The drawback to these methods is that they do not provide a quantitative and objective assay for platelet functionality that can be used on each platelet unit prior to transfusion. As part of a larger project aimed at characterizing whole blood and blood components with multiwavelength UV/vis spectroscopy, isolated platelets and platelets in platelet-rich plasma have been investigated. Models based on Mie theory have been developed which allow for the extraction of quantitative information on platelet size, number and quality from multiwavelength UV/vis spectra. These models have been used to quantify changes in platelet-rich plasma during storage. The overall goal of this work is to develop a simple, rapid quantitative assay for platelet quality that can be used prior to platelet transfusion to ensure the effectiveness of the treatment. As a result of this work, the optical properties of isolated platelets, platelet-rich plasma and leukodepleted platelet-rich plasma have been determined.
LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.
Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei
2012-12-01
Database-search-based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make data processing much faster. The random effect of MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW) and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem) as input. LFQuant can handle large-scale label-free data with fractionation, such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on a replication data set and the UPS1 standard data set. The results show that LFQuant performs better than both in terms of precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
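For illustration, the core of an extracted-ion-chromatogram reconstruction for an identified peptide can be sketched as below; the scan data layout and tolerance are assumptions and do not reflect LFQuant's internal implementation.

    import numpy as np

    def extract_xic(scans, target_mz, ppm_tol=10.0):
        """Build an extracted ion chromatogram for one peptide m/z from centroided MS1 scans.
        `scans` is a list of (retention_time, mz_array, intensity_array); this layout is an
        assumption for illustration, not LFQuant's internal format."""
        tol = target_mz * ppm_tol * 1e-6
        rts, intensities = [], []
        for rt, mz, inten in scans:
            mz = np.asarray(mz, float)
            inten = np.asarray(inten, float)
            mask = np.abs(mz - target_mz) <= tol
            rts.append(rt)
            intensities.append(inten[mask].sum())
        return np.asarray(rts), np.asarray(intensities)

    # Toy scans around an identified peptide at m/z 523.774.
    scans = [(30.0 + i, [523.70, 523.774, 524.10], [5.0, 100.0 + 20 * i, 8.0]) for i in range(5)]
    rt, xic = extract_xic(scans, 523.774)
    print(xic)  # [100. 120. 140. 160. 180.]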
Pfeil, R M; Venkat, J A; Plimmer, J R; Sham, S; Davis, K; Nair, P P
1994-02-01
The genotoxicity of groundwater was evaluated, using a novel application of the SOS microplate assay (SOSMA). Organic residues were extracted from groundwater samples from Maryland, Pennsylvania, and Delaware by using C-18 bonded silica solid phase extraction tubes. Total organic carbon content (TOC) of water samples was also determined. The genotoxicity of the extracts was determined by the SOSMA. Relative activity (RA) as determined by the SOSMA is a quantitative measure of genotoxicity based on a comparison to the activity of the mutagen, 4-nitroquinoline oxide. Low levels of RA (about 2x background) were detected in waters from sites within these states. There was considerable temporal and spatial variation in the observed RA, but no definite patterns were observed in the variation. Between sampling sites there was a positive correlation between RA and TOC; however, this relationship appeared to be reversed occasionally within a sampling site. The extraction and bioassay methods provide an easy and relatively inexpensive means of determining water quality.
Crystallography of ordered colloids using optical microscopy. 2. Divergent-beam technique.
Rogers, Richard B; Lagerlöf, K Peter D
2008-04-10
A technique has been developed to extract quantitative crystallographic data from randomly oriented colloidal crystals using a divergent-beam approach. This technique was tested on a series of diverse experimental images of colloidal crystals formed from monodisperse suspensions of sterically stabilized poly(methyl methacrylate) spheres suspended in organic index-matching solvents. Complete sets of reciprocal lattice basis vectors were extracted in all but one case. When data extraction was successful, results appeared to be accurate to about 1% for lattice parameters and to within approximately 2 degrees for orientation. This approach is easier to implement than a previously developed parallel-beam approach, with the drawback that the divergent-beam approach is not as robust in certain situations with random hexagonal close-packed crystals. The two techniques are therefore complementary to each other, and between them it should be possible to extract quantitative crystallographic data with a conventional optical microscope from any closely index-matched colloidal crystal whose lattice parameters are compatible with visible wavelengths.
Wang, Zhenyu; Li, Shiming; Ferguson, Stephen; Goodnow, Robert; Ho, Chi-Tang
2008-01-01
Polymethoxyflavones (PMFs), which exist exclusively in the citrus genus, have biological activities including anti-inflammatory, anticarcinogenic, and antiatherogenic properties. A validated RPLC method was developed for quantitative analysis of six major PMFs, namely nobiletin, tangeretin, sinensetin, 5,6,7,4'-tetramethoxyflavone, 3,5,6,7,3',4'-hexamethoxyflavone, and 3,5,6,7,8,3',4'-heptamethoxyflavone. The polar embedded LC stationary phase was able to fully resolve the six analogues. The developed method was fully validated in terms of linearity, accuracy, precision, sensitivity, and system suitability. The LOD of the method was calculated as 0.15 microg/mL and the recovery rate was between 97.0 and 105.1%. This analytical method was successfully applied to quantify the individual PMFs in four commercially available citrus peel extracts (CPEs). Each extract shows significant difference in the PMF composition and concentration. This method may provide a simple, rapid, and reliable tool to help reveal the correlation between the bioactivity of the PMF extracts and the individual PMF content.
NASA Astrophysics Data System (ADS)
Ohar, Orest P.; Lizotte, Todd E.
2009-08-01
Over the years law enforcement has become increasingly complex, driving a need for a better level of organization of knowledge within policing. The use of COMPSTAT or other Geospatial Information Systems (GIS) for crime mapping and analysis has provided opportunities for careful analysis of crime trends. By identifying hotspots within communities, data collected and entered into these systems can be analyzed to determine how, when and where law enforcement assets can be deployed efficiently. This paper will introduce, in detail, a powerful new law enforcement and forensic investigative technology called Intentional Firearm Microstamping (IFM). Once embedded and deployed into firearms, IFM will provide data for identifying and tracking the sources of illegally trafficked firearms within the borders of the United States and across the border with Mexico. Intentional Firearm Microstamping is a micro code technology that leverages a laser based micromachining process to form optimally located, microscopic "intentional structures and marks" on components within a firearm. Thus, when the firearm is fired, these IFM structures transfer an identifying tracking code onto the expended cartridge that is ejected from the firearm. Intentional Firearm Microstamped structures are laser micromachined alpha numeric and encoded geometric tracking numbers, linked to the serial number of the firearm. IFM codes can be extracted quickly and used without the need to recover the firearm. Furthermore, through the process of extraction, IFM codes can be quantitatively verified to a higher level of certainty as compared to traditional forensic matching techniques. IFM provides critical intelligence capable of identifying straw purchasers, trafficking routes and networks across state borders and can be used on firearms illegally exported across international borders. This paper will outline IFM applications for supporting intelligence led policing initiatives and IFM implementation strategies, describe how IFM overcomes the firearm's stochastic properties, explain the code extraction technologies that can be used by forensic investigators, and discuss applications where the extracted data will benefit geospatial information systems for forensic intelligence.
Pérez-Sánchez, Almudena; Borrás-Linares, Isabel; Barrajón-Catalán, Enrique; Arráez-Román, David; González-Álvarez, Isabel; Ibáñez, Elena; Segura-Carretero, Antonio; Bermejo, Marival; Micol, Vicente
2017-01-01
Rosemary (Rosmarinus officinalis) is grown throughout the world and is widely used as a medicinal herb and to season and preserve food. Rosemary polyphenols and terpenoids have attracted great interest due to their potential health benefits. However, complete information regarding their absorption and bioavailability in the Caco-2 cell model is scarce. The permeation properties of the bioactive compounds (flavonoids, diterpenes, triterpenes and phenylpropanoids) of a rosemary extract (RE), obtained by supercritical fluid extraction, were studied in the Caco-2 cell monolayer model, either in free form or in liposomes. Compounds were identified and quantitated by liquid chromatography coupled to quadrupole time-of-flight mass spectrometry with electrospray ionization (HPLC-ESI-QTOF-MS), and the apparent permeability values (Papp) were determined, for the first time in the extract, for 24 compounds in both directions across the cell monolayer. For some compounds, such as triterpenoids and some flavonoids, the Papp values were reported for the first time in Caco-2 cells. Our results indicate that most compounds are scarcely absorbed, and passive diffusion is suggested to be the primary mechanism of absorption. The use of liposomes to vehiculize the extract resulted in reduced permeability for most compounds. Finally, the biopharmaceutical classification (BCS) of all the compounds was achieved according to their permeability and solubility data for bioequivalence purposes. The BCS study revealed that most of the RE compounds could be classified as classes III and IV (low permeability); therefore, RE itself should also be classified into this category.
Methods were developed for the extraction from soil, identification, confirmation and quantitation by LC/MS/MS of trace levels of perfluorinated octanoic acid (PFOA), perfluorinated nonanoic acid (PFNA) and perfluorinated decanoic acid (PFDA). Whereas PFOA, PFNA and PFDA all can...
Taamalli, Amani; Abaza, Leila; Arráez Román, David; Segura Carretero, Antonio; Fernández Gutiérrez, Alberto; Zarrouk, Mokhtar; Nabil, Ben Youssef
2013-01-01
Plant phenolics are secondary metabolites that constitute one of the most widely occurring groups of phytochemicals and play several important functions in plants. In olive (Olea europaea L.), there is not enough information about the occurrence of these compounds in buds and flowers. To conduct a comprehensive characterisation of buds and open flowers from the olive cultivar 'Chemlali'. The polar fraction of buds and open flowers was extracted using solid-liquid extraction with a hydro-alcoholic solvent. The extracts were then analysed using high performance liquid chromatography (HPLC) coupled to electrospray ionisation time-of-flight mass spectrometry (ESI/TOF/MS) and electrospray ionisation ion-trap tandem mass spectrometry (ESI/IT/MS²) operating in negative ion mode. Phenolic compounds from different classes, including secoiridoids, flavonoids, simple phenols, cinnamic acid derivatives and lignans, were tentatively identified in both extracts. Qualitatively, no significant difference was observed between flower bud and open flower extracts. However, quantitatively, the secoiridoids represented a higher percentage of the total phenols in open flowers (41.7%) than in flower buds (30.5%), in contrast to flavonoids, which decreased from 38.1% to 26.7%. Cinnamic acid derivatives and simple phenols did not show any change. Lignans presented the lowest percentage in both extracts, with an increase during the development of the flower bud to the open flower. The HPLC-TOF/IT/MS allowed the characterisation, for the first time, of the phenolic profile of extracts of 'Chemlali' olive buds and open flowers, proving to be a very useful technique for the characterisation and structure elucidation of phenolic compounds. Copyright © 2013 John Wiley & Sons, Ltd.
Eichmiller, Jessica J; Miller, Loren M; Sorensen, Peter W
2016-01-01
Few studies have examined capture and extraction methods for environmental DNA (eDNA) to identify techniques optimal for detection and quantification. In this study, precipitation, centrifugation and filtration eDNA capture methods and six commercially available DNA extraction kits were evaluated for their ability to detect and quantify common carp (Cyprinus carpio) mitochondrial DNA using quantitative PCR in a series of laboratory experiments. Filtration methods yielded the most carp eDNA, and a glass fibre (GF) filter performed better than a similar pore size polycarbonate (PC) filter. Smaller pore sized filters had higher regression slopes of biomass to eDNA, indicating that they were potentially more sensitive to changes in biomass. Comparison of DNA extraction kits showed that the MP Biomedicals FastDNA SPIN Kit yielded the most carp eDNA and was the most sensitive for detection purposes, despite minor inhibition. The MoBio PowerSoil DNA Isolation Kit had the lowest coefficient of variation in extraction efficiency between lake and well water and had no detectable inhibition, making it most suitable for comparisons across aquatic environments. Of the methods tested, we recommend using a 1.5 μm GF filter, followed by extraction with the MP Biomedicals FastDNA SPIN Kit for detection. For quantification of eDNA, filtration through a 0.2-0.6 μm pore size PC filter, followed by extraction with MoBio PowerSoil DNA Isolation Kit was optimal. These results are broadly applicable for laboratory studies on carps and potentially other cyprinids. The recommendations can also be used to inform choice of methodology for field studies. © 2015 John Wiley & Sons Ltd.
Wang, Zhifei; Xie, Yanming; Wang, Yongyan
2011-10-01
Computerized extraction of information from Chinese medicine literature is more convenient than hand searching: it can simplify the search process and improve accuracy. Among the many computerized auto-extraction methods now in use, regular expressions are particularly well suited to extracting useful information for research. This article focuses on the application of regular expressions to extracting information from Chinese medicine literature. Two practical examples are reported, in which regular expressions were used to extract "case number" (non-terminology) and "efficacy rate" (subgroups for related information identification), illustrating how information can be extracted from Chinese medicine literature by this method.
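For illustration, the Python sketch below shows the kind of regular expressions described; the patterns and the example sentence are hypothetical and are not the expressions used in the article.

```python
import re

text = "共纳入病例120例，治疗组总有效率为93.3%。"  # hypothetical sentence

# Case number: digits immediately followed by the measure word "例"
case_number = re.search(r"(\d+)\s*例", text)

# Efficacy rate: "有效率" followed by a percentage
efficacy_rate = re.search(r"有效率[为是]?\s*([\d.]+)\s*%", text)

if case_number:
    print("cases:", case_number.group(1))                 # -> 120
if efficacy_rate:
    print("efficacy rate:", efficacy_rate.group(1), "%")  # -> 93.3 %
```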
Gul, Rahman; Jan, Syed Umer; Faridullah, Syed; Sherani, Samiullah; Jahan, Nusrat
2017-01-01
The aim of this study was to evaluate the antioxidant activity, screen the phytogenic chemical compounds, and assess the alkaloids present in E. intermedia to support its use in Pakistani folk medicines for the treatment of asthma and bronchitis. Antioxidant activity was analyzed using the 2,2-diphenyl-1-picryl-hydrazyl-hydrate assay. Standard methods were used for the identification of cardiac glycosides, phenolic compounds, flavonoids, anthraquinones, and alkaloids. High performance liquid chromatography (HPLC) was used for the quantitation of ephedrine alkaloids in E. intermedia. The quantitative separation was performed on a Shimadzu 10AVP (Shampack) column of 3.0 mm internal diameter (id) and 50 mm length, at a flow rate of 1 ml/min with detection at a wavelength of 210 nm. The methanolic extract showed antioxidant activity and powerful oxygen free-radical scavenging, and the IC50 for the E. intermedia plant was close to that of the reference standard, ascorbic acid. The HPLC method was used for the quantitation of ephedrine (E) and pseudoephedrine (PE) in 45 samples of one species collected from the central habitat in three districts (Ziarat, Shairani, and Kalat) of Balochistan. Results showed that the average alkaloid content in E. intermedia was as follows: PE (0.209%, 0.238%, and 0.22%) and E (0.0538%, 0.0666%, and 0.0514%).
Cesa, Stefania; Carradori, Simone; Bellagamba, Giuseppe; Locatelli, Marcello; Casadei, Maria Antonietta; Masci, Alessandra; Paolicelli, Patrizia
2017-10-01
Colour is the first organoleptic property that consumers appreciate in a foodstuff. In blueberry (Vaccinium spp.) fruits, the anthocyanins are the principal pigments determining the colour as well as many of the beneficial effects attributed to this functional food. Commercial blueberry-derived products represent important sources of these healthy molecules all year round. In this study, blueberries were processed into purees, comparing two homogenization methods, and further heated following different thermal treatments. All the supernatants of the homogenates were monitored for pH. Then, the hydroalcoholic extracts of the same samples were characterized by CIELAB and HPLC-DAD analyses. These analytical techniques provide complementary information on the fruit pigment content as a whole and on the quali-quantitative profile of the individual bioactive colorants. These data could be very useful for identifying the best manufacturing procedure for preparing blueberry-derived products that are well accepted by consumers while keeping their healthy properties unaltered. Copyright © 2017. Published by Elsevier Ltd.
Objective grading of facial paralysis using Local Binary Patterns in video processing.
He, Shu; Soraghan, John J; O'Reilly, Brian F
2008-01-01
This paper presents a novel framework for objective measurement of facial paralysis in biomedical videos. The motion information in the horizontal and vertical directions and the appearance features on the apex frames are extracted based on the Local Binary Patterns (LBP) on the temporal-spatial domain in each facial region. These features are temporally and spatially enhanced by the application of block schemes. A multi-resolution extension of uniform LBP is proposed to efficiently combine the micro-patterns and large-scale patterns into a feature vector, which increases the algorithmic robustness and reduces noise effects while still retaining computational simplicity. The symmetry of facial movements is measured by the Resistor-Average Distance (RAD) between LBP features extracted from the two sides of the face. Support Vector Machine (SVM) is applied to provide quantitative evaluation of facial paralysis based on the House-Brackmann (H-B) Scale. The proposed method is validated by experiments with 197 subject videos, which demonstrates its accuracy and efficiency.
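For readers unfamiliar with LBP, the sketch below shows the basic 8-neighbour operator on which such features are built; the multi-resolution, uniform, and temporal-spatial extensions described in the paper are not reproduced here, and the array names are illustrative.

```python
import numpy as np

def lbp_histogram(gray, bins=256):
    """Basic 3x3 Local Binary Pattern: threshold the 8 neighbours of each
    pixel against the centre pixel and read the resulting bits as a code 0-255."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]                        # centre pixels
    # offsets of the 8 neighbours, in a fixed clockwise order
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offs):
        nb = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
        code += (nb >= c).astype(np.int32) << bit
    hist, _ = np.histogram(code, bins=bins, range=(0, bins))
    return hist / hist.sum()                 # normalised LBP histogram
```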
Fractal-like Distributions over the Rational Numbers in High-throughput Biological and Clinical Data
NASA Astrophysics Data System (ADS)
Trifonov, Vladimir; Pasqualucci, Laura; Dalla-Favera, Riccardo; Rabadan, Raul
2011-12-01
Recent developments in extracting and processing biological and clinical data are allowing quantitative approaches to studying living systems. High-throughput sequencing (HTS), expression profiles, proteomics, and electronic health records (EHR) are some examples of such technologies. Extracting meaningful information from those technologies requires careful analysis of the large volumes of data they produce. In this note, we present a set of fractal-like distributions that commonly appear in the analysis of such data. The first set of examples are drawn from a HTS experiment. Here, the distributions appear as part of the evaluation of the error rate of the sequencing and the identification of tumorigenic genomic alterations. The other examples are obtained from risk factor evaluation and analysis of relative disease prevalence and co-morbidity as these appear in EHR. The distributions are also relevant to identification of subclonal populations in tumors and the study of quasi-species and intrahost diversity of viral populations.
NASA Astrophysics Data System (ADS)
Varga, T.; McKinney, A. L.; Bingham, E.; Handakumbura, P. P.; Jansson, C.
2017-12-01
Plant roots play a critical role in plant-soil-microbe interactions that occur in the rhizosphere, as well as in processes with important implications to farming and thus human food supply. X-ray computed tomography (XCT) has been proven to be an effective tool for non-invasive root imaging and analysis. Selected Brachypodium distachyon phenotypes were grown in both natural and artificial soil mixes. The specimens were imaged by XCT, and the root architectures were extracted from the data using three different software-based methods; RooTrak, ImageJ-based WEKA segmentation, and the segmentation feature in VG Studio MAX. The 3D root image was successfully segmented at 30 µm resolution by all three methods. In this presentation, ease of segmentation and the accuracy of the extracted quantitative information (root volume and surface area) will be compared between soil types and segmentation methods. The best route to easy and accurate segmentation and root analysis will be highlighted.
Building Damage Extraction Triggered by Earthquake Using UAV Imagery
NASA Astrophysics Data System (ADS)
Li, S.; Tang, H.
2018-04-01
When extracting building damage information from post-earthquake satellite images, we can usually only determine whether a building has collapsed. Even when the satellite images have sub-meter resolution, the identification of slightly damaged buildings remains a challenge. As complementary data to satellite images, UAV images have unique advantages, such as greater flexibility and higher resolution. In this paper, according to the spectral features of the UAV images and the morphological features of the reconstructed point clouds, building damage was classified into four levels: basically intact buildings, slightly damaged buildings, partially collapsed buildings and totally collapsed buildings, and rules for these damage grades are given. In particular, slightly damaged buildings are identified using detected roof holes. To verify the approach, we conducted experiments in the cases of the Wenchuan and Ya'an earthquakes. By analyzing the post-earthquake UAV images of the two earthquakes, building damage was classified into the four levels, and quantitative statistics of the damaged buildings are given in the experiments.
A method for measuring total thiaminase activity in fish tissues
Zajicek, James L.; Tillitt, Donald E.; Honeyfield, Dale C.; Brown, Scott B.; Fitzsimons, John D.
2005-01-01
An accurate, quantitative, and rapid method for the measurement of thiaminase activity in fish samples is required to provide sufficient information to characterize the role of dietary thiaminase in the onset of thiamine deficiency in Great Lakes salmonines. A radiometric method that uses 14C-thiamine was optimized for substrate and co-substrate (nicotinic acid) concentrations, incubation time, and sample dilution. Total thiaminase activity was successfully determined in extracts of selected Great Lakes fishes and invertebrates. Samples included whole-body and selected tissues of forage fishes. Positive control material prepared from frozen alewives Alosa pseudoharengus collected in Lake Michigan enhanced the development and application of the method. The method allowed improved discrimination of thiaminolytic activity among forage fish species and their tissues. The temperature dependence of the thiaminase activity observed in crude extracts of Lake Michigan alewives followed a Q10 = 2 relationship for the 1-37 °C temperature range, which is consistent with the bacterial-derived thiaminase I protein. © Copyright by the American Fisheries Society 2005.
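The Q10 = 2 relationship quoted above corresponds to the standard temperature-coefficient formula; a minimal sketch follows, with illustrative values rather than data from the study.

```python
def q10_rate(rate_ref, t_ref, t, q10=2.0):
    """Scale a reaction rate from a reference temperature using the Q10 rule:
    rate(T) = rate(T_ref) * Q10 ** ((T - T_ref) / 10)."""
    return rate_ref * q10 ** ((t - t_ref) / 10.0)

# e.g. thiaminase activity measured at 25 °C, predicted at 37 °C (about 2.3x)
predicted = q10_rate(rate_ref=1.0, t_ref=25.0, t=37.0)
```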
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2013 CFR
2013-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2011 CFR
2011-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2014 CFR
2014-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2012 CFR
2012-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
21 CFR 101.56 - Nutrient content claims for “light” or “lite.”
Code of Federal Regulations, 2010 CFR
2010-04-01
...) Quantitative information comparing the level of calories and fat content in the product per labeled serving... information panel, the quantitative information may be located elsewhere on the information panel in... sauce); and (B) Quantitative information comparing the level of sodium per labeled serving size with...
Integrated Computational System for Aerodynamic Steering and Visualization
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus
1999-01-01
In February of 1994, an effort from the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center with McDonnell Douglas Aerospace Company and Stanford University was initiated to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate the CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison; this is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will (1) briefly review the technologies developed during 1995-1997, (2) describe current technologies in the area of comparison techniques, (3) describe the theory of our new method researched during the grant year, (4) summarize a few of the results, and finally (5) discuss work within the last 6 months that is a direct extension of the grant.
Quantification of terpene trilactones in Ginkgo biloba with a 1H NMR method.
Liang, Tingfu; Miyakawa, Takuya; Yang, Jinwei; Ishikawa, Tsutomu; Tanokura, Masaru
2018-06-01
Ginkgo biloba L. has been used as a herbal medicine in the traditional treatment of insufficient blood flow, memory deficits, and cerebral insufficiency. The terpene trilactone components, the bioactive agents of Ginkgo biloba L., have also been reported to exhibit useful functionality such as anti-inflammatory and neuroprotective effects. Therefore, in the present research, we attempted to quantitatively analyze the terpene trilactone components in Ginkgo biloba leaf extract with quantitative 1H NMR (qNMR) and obtained results almost identical to data reported using HPLC. Application of the qNMR method to the analysis of the terpene trilactone contents in commercial Ginkgo extract products, such as soft gel capsules and tablets, produced the same levels as those noted on the package labels. Thus, qNMR is an alternative method for quantification of the terpene trilactone components in commercial Ginkgo extract products.
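qNMR quantitation rests on the proportionality between integrated signal area and the number of contributing protons. The sketch below shows the usual internal-standard calculation; the symbols and numerical values are illustrative assumptions, not values from the paper.

```python
def qnmr_amount(area_analyte, area_std, n_h_analyte, n_h_std,
                mw_analyte, mw_std, mass_std):
    """Mass of analyte in the NMR tube from integrals relative to an
    internal standard: m_a = m_s * (I_a/I_s) * (N_s/N_a) * (M_a/M_s)."""
    return (mass_std * (area_analyte / area_std)
            * (n_h_std / n_h_analyte) * (mw_analyte / mw_std))

# e.g. a ginkgolide-like analyte quantified against a hypothetical standard
mass_mg = qnmr_amount(area_analyte=1.8, area_std=3.0, n_h_analyte=1,
                      n_h_std=9, mw_analyte=424.4, mw_std=194.2, mass_std=5.0)
```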
Chen, Li; Mossa-Basha, Mahmud; Balu, Niranjan; Canton, Gador; Sun, Jie; Pimentel, Kristi; Hatsukami, Thomas S; Hwang, Jenq-Neng; Yuan, Chun
2018-06-01
To develop a quantitative intracranial artery measurement technique to extract comprehensive artery features from time-of-flight MR angiography (MRA). By semiautomatically tracing arteries based on an open-curve active contour model in a graphical user interface, 12 basic morphometric features and 16 basic intensity features for each artery were identified. Arteries were then classified as one of 24 types using prediction from a probability model. Based on the anatomical structures, features were integrated within 34 vascular groups for regional features of vascular trees. Eight 3D MRA acquisitions with intracranial atherosclerosis were assessed to validate this technique. Arterial tracings were validated by an experienced neuroradiologist who checked agreement at bifurcation and stenosis locations. This technique achieved 94% sensitivity and 85% positive predictive values (PPV) for bifurcations, and 85% sensitivity and PPV for stenosis. Up to 1,456 features, such as length, volume, and averaged signal intensity for each artery, as well as vascular group in each of the MRA images, could be extracted to comprehensively reflect characteristics, distribution, and connectivity of arteries. Length for the M1 segment of the middle cerebral artery extracted by this technique was compared with reviewer-measured results, and the intraclass correlation coefficient was 0.97. A semiautomated quantitative method to trace, label, and measure intracranial arteries from 3D-MRA was developed and validated. This technique can be used to facilitate quantitative intracranial vascular research, such as studying cerebrovascular adaptation to aging and disease conditions. Magn Reson Med 79:3229-3238, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
Villagrasa, M; Guillamón, M; Navarro, A; Eljarrat, E; Barceló, D
2008-02-01
A new analytical method for the quantitative determination of benzoxazolinones and their degradation products in agricultural soils, based on the use of pressurized liquid extraction (PLE) followed by solid-phase extraction (SPE) and instrumental determination using liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS-MS), is described. Using this method, the characterization, separation and quantitative detection of a mixture of two benzoxazolinones, benzoxazolin-2-one (BOA) and 6-methoxybenzoxazolin-2-one (MBOA), and their degradation products, 2-aminophenol (APH), N-(2-hydroxyphenyl)malonamic acid (HMPMA), 2-amino-3-H-phenoxazin-3-one (APO), 9-methoxy-2-amino-3-H-phenoxazin-3-one (AMPO), 2-acetylamino-3-H-phenoxazin-3-one (AAPO) and 2-acetylamino-9-methoxy-2-amino-3-H-phenoxazin-3-one (AAMPO), was achieved. The complete LC-ESI-MS-MS precursor-product ion fragmentation pathways for the degradation products of benzoxazolinones are described for the first time. Quantitative analysis was done in the multiple reaction monitoring mode using two specific precursor-product ion transitions for each compound. The optimized method was quality assessed by measuring parameters such as recovery, linearity, sensitivity, repeatability and reproducibility. Recoveries of the analytes ranged from 53 to 123%. The developed method offered improved sensitivity compared with our previous LC-MS method, with detection limits down to 2.4-21 ng/g of dry weight. This achievement allowed us to identify and quantify, for the first time, degradation products of benzoxazolinones in real agricultural soil samples. Analytes were found in the range of 20.6-149 ng/g dry weight.
de Oliveira, Alberto; Silva, Claudinei A; Silva, Adalberto M; Tavares, Marina F M; Kato, Massuo J
2010-01-01
A large number of natural and synthetic compounds having butenolides as a core unit have been described, and many of them display a wide range of biological activities. Butenolides from P. malacophyllum have presented potential antifungal activities, but no specific, fast, and precise method has been developed for their determination. To develop a methodology based on micellar electrokinetic chromatography to determine butenolides in Piper species. The extracts were analysed in uncoated fused-silica capillaries, and the micellar system consisted of 20 mmol/L SDS, 20% (v/v) acetonitrile (ACN) and 10 mmol/L STB aqueous buffer at pH 9.2. The method was validated for precision, linearity, limit of detection (LOD) and limit of quantitation (LOQ), and the standard deviations were determined from the standard errors estimated by the regression line. A micellar electrokinetic chromatography (MEKC) method for determination of butenolides in extracts gave full resolution for 1 and 2. The analytical curve in the range 10.0-50.0 µg/mL (r(2) = 0.999) provided LOD and LOQ for 1 and 2 of 2.1/6.3 and 1.1/3.5 µg/mL, respectively. The RSDs were 0.12% for migration times and 1.0% for peak area ratios, with a recovery of 100.0 ± 1.4%. A novel high-performance MEKC method developed for the analysis of butenolides 1 and 2 in leaf extracts of P. malacophyllum allowed their quantitative determination within an analysis time of less than 5 min, and the results indicated CE to be a feasible analytical technique for the quantitative determination of butenolides in Piper extracts. Copyright © 2010 John Wiley & Sons, Ltd.
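LOD and LOQ figures of this kind are commonly derived from the calibration regression. The sketch below shows the common ICH-style calculation (3.3·σ/slope and 10·σ/slope) assuming a linear fit of response versus concentration; the calibration values are illustrative, not the paper's data.

```python
import numpy as np

def lod_loq_from_calibration(conc, response):
    """LOD/LOQ from the residual standard error and slope of a linear
    calibration curve (ICH convention: 3.3*sigma/S and 10*sigma/S)."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))  # residual SD
    return 3.3 * sigma / slope, 10 * sigma / slope

conc = np.array([10.0, 20.0, 30.0, 40.0, 50.0])       # µg/mL (illustrative)
resp = np.array([0.101, 0.205, 0.298, 0.406, 0.499])  # peak-area ratios
lod, loq = lod_loq_from_calibration(conc, resp)
```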
Li, Austin C; Li, Yinghe; Guirguis, Micheal S; Caldwell, Robert G; Shou, Wilson Z
2007-01-04
A new analytical method is described here for the quantitation of the anti-inflammatory drug cyclosporin A (CyA) in monkey and rat plasma. The method used tetrahydrofuran (THF)-water mobile phases to elute the analyte and the internal standard, cyclosporin C (CyC). The gradient mobile phase program eluted CyA as a sharp peak and therefore improved resolution between the analyte and possible interfering materials compared with previously reported analytical approaches, where CyA was eluted as a broad peak due to the rapid conversion between different conformers. The sharp peak resulting from this method facilitated quantitative calculation, as multiple smoothing and large bunching factors were not necessary. The chromatography in the new method was performed at 30 degrees C instead of the 65-70 degrees C reported previously. Other advantages of the method included simple and fast sample extraction by protein precipitation, direct injection of the extraction supernatant onto the column for analysis, and elimination of the evaporation and reconstitution steps needed in previously reported solid phase extraction or liquid-liquid extraction procedures. This method is amenable to high-throughput analysis with a total chromatographic run time of 3 min. The approach has been verified as sensitive, linear (0.977-4000 ng/mL), accurate and precise for the quantitation of CyA in monkey and rat plasma. However, compared with conventional mobile phases, the only drawback of this approach was the reduced detection response from the mass spectrometer, possibly caused by poor desolvation in the ionization source. This is the first report to demonstrate the advantages of using THF-water mobile phases to elute CyA in liquid chromatography.
NASA Technical Reports Server (NTRS)
Song, Q.; Putcha, L.; Harm, D. L. (Principal Investigator)
2001-01-01
A chromatographic method for the quantitation of promethazine (PMZ) and its three metabolites in urine employing on-line solid-phase extraction and column-switching has been developed. The column-switching system described here uses an extraction column for the purification of PMZ and its metabolites from a urine matrix. The extraneous matrix interference was removed by flushing the extraction column with a gradient elution. The analytes of interest were then eluted onto an analytical column for further chromatographic separation using a mobile phase of greater solvent strength. This method is specific and sensitive with a range of 3.75-1400 ng/ml for PMZ and 2.5-1400 ng/ml for the metabolites promethazine sulfoxide, monodesmethyl promethazine sulfoxide and monodesmethyl promethazine. The lower limits of quantitation (LLOQ) were 3.75 ng/ml with less than 6.2% C.V. for PMZ and 2.50 ng/ml with less than 11.5% C.V. for metabolites based on a signal-to-noise ratio of 10:1 or greater. The accuracy and precision were within +/- 11.8% in bias and not greater than 5.5% C.V. in intra- and inter-assay precision for PMZ and metabolites. Method robustness was investigated using a Plackett-Burman experimental design. The applicability of the analytical method for pharmacokinetic studies in humans is illustrated.
Donaldson, K.A.; Griffin, Dale W.; Paul, J.H.
2002-01-01
A method was developed for the quantitative detection of pathogenic human enteroviruses from surface waters in the Florida Keys using TaqMan® one-step reverse transcription (RT)-PCR with the Model 7700 ABI Prism® Sequence Detection System. Viruses were directly extracted from unconcentrated grab samples of seawater, from seawater concentrated by vortex flow filtration using a 100 kD filter, and from sponge tissue. Total RNA was extracted from the samples, purified and concentrated using spin-column chromatography. A 192-196 base pair portion of the 5' untranslated region was amplified from these extracts. Enterovirus concentrations were estimated using real-time RT-PCR technology. Nine of 15 sample sites, or 60%, were positive for the presence of pathogenic human enteroviruses. Considering only near-shore sites, 69% were positive, with viral concentrations ranging from 9.3 viruses/ml to 83 viruses/g of sponge tissue (uncorrected for extraction efficiency). Certain amplicons were selected for cloning and sequencing for identification. Three strains of waterborne enteroviruses were identified as Coxsackievirus A9, Coxsackievirus A16, and Poliovirus Sabin type 1. The time and cost efficiency of this one-step real-time RT-PCR methodology make it an ideal technique to detect, quantitate and identify pathogenic enteroviruses in recreational waters. Copyright © 2002 Elsevier Science Ltd.
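Real-time RT-PCR quantitation of this kind typically maps a sample's threshold cycle (Ct) onto a standard curve of Ct versus log copy number. The sketch below assumes that standard-curve approach; the dilution series and Ct values are hypothetical, not the study's calibration.

```python
import numpy as np

def quantify_from_ct(ct_sample, ct_standards, copies_standards):
    """Estimate target copies from a Ct value using a linear standard curve
    of Ct versus log10(copies)."""
    logs = np.log10(copies_standards)
    slope, intercept = np.polyfit(logs, ct_standards, 1)  # Ct = slope*log10(N) + b
    return 10 ** ((ct_sample - intercept) / slope)

# hypothetical 10-fold dilution series of an enterovirus RNA standard
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
cts = np.array([33.1, 29.6, 26.2, 22.8, 19.3])
estimated_copies = quantify_from_ct(ct_sample=30.5, ct_standards=cts,
                                    copies_standards=copies)
```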
Five different DNA extraction methods were evaluated for their effectiveness in recovering PCR templates from the conidia of a series of fungal species often encountered in indoor air. The test organisms were Aspergillus versicolor, Penicillium chrysogenum, Stachybotrys chartaru...
Schwartz, Peter H; Perkins, Susan M; Schmidt, Karen K; Muriello, Paul F; Althouse, Sandra; Rawl, Susan M
2017-08-01
Guidelines recommend that patient decision aids should provide quantitative information about probabilities of potential outcomes, but the impact of this information is unknown. Behavioral economics suggests that patients confused by quantitative information could benefit from a "nudge" towards one option. We conducted a pilot randomized trial to estimate the effect sizes of presenting quantitative information and a nudge. Primary care patients (n = 213) eligible for colorectal cancer screening viewed basic screening information and were randomized to view (a) quantitative information (quantitative module), (b) a nudge towards stool testing with the fecal immunochemical test (FIT) (nudge module), (c) neither a nor b, or (d) both a and b. Outcome measures were perceived colorectal cancer risk, screening intent, preferred test, and decision conflict, measured before and after viewing the decision aid, and screening behavior at 6 months. Patients viewing the quantitative module were more likely to be screened than those who did not ( P = 0.012). Patients viewing the nudge module had a greater increase in perceived colorectal cancer risk than those who did not ( P = 0.041). Those viewing the quantitative module had a smaller increase in perceived risk than those who did not ( P = 0.046), and the effect was moderated by numeracy. Among patients with high numeracy who did not view the nudge module, those who viewed the quantitative module had a greater increase in intent to undergo FIT ( P = 0.028) than did those who did not. The limitations of this study were the limited sample size and single healthcare system. Adding quantitative information to a decision aid increased uptake of colorectal cancer screening, while adding a nudge to undergo FIT did not increase uptake. Further research on quantitative information in decision aids is warranted.
Quantitative polymerase chain reaction (QPCR) can be used as a rapid method for detecting fecal indicator bacteria. Because false negative results can be caused by PCR inhibitors that co-extract with the DNA samples, an internal amplification control (IAC) should be run with eac...
The ease and rapidity of quantitative DNA sequence detection by real-time PCR instruments promises to make their use increasingly common for the microbial analysis many different types of environmental samples. To fully exploit the capabilities of these instruments, correspondin...
Opportunistic fungal pathogens are a concern because of the increasing number of immunocompromised patients. The goal of this research was to test a simple extraction method and rapid quantitative PCR (QPCR) measurement of the occurrence of potential pathogens, Aspergillus fumiga...
2012-01-01
Background Carpobrotus edulis (Mesembryanthemaceae), also known as igcukuma in the Xhosa language, is a medicinal plant used by traditional healers to treat common infections in HIV/AIDS patients. Based on this information, we investigated the plant's phytoconstituents, as well as the inhibitory effects of aqueous and three different organic solvent extracts, in order to justify its therapeutic usage. Methods Antioxidant activity of the extracts was investigated spectrophotometrically against 1,1-diphenyl-2-picrylhydrazyl (DPPH), 2,2'-azino-bis(3-ethylbenzthiazoline-6-sulfonic acid) (ABTS) diammonium salt, hydrogen peroxide (H2O2), nitric oxide (NO), and ferric reducing power. Total phenols, flavonoids, flavonols, proanthocyanidins, tannins, alkaloids and saponins were also determined using standard methods. Results Quantitative phytochemical analysis of the four solvent extracts revealed a high percentage of phenolics (55.7 ± 0.404%) in the acetone extract, with appreciable amounts of proanthocyanidins (86.9 ± 0.005%) and alkaloids (4.5 ± 0.057%) in the aqueous extract, while tannin (48.9 ± 0.28%) and saponin (4.5 ± 0.262%) were major constituents of the ethanol extract. Flavonoids (0.12 ± 0.05%) and flavonols (0.12 ± 0.05%) were found at higher levels in the hexane extract in comparison with the other extracts. The leaf extracts demonstrated strong hydrogen peroxide scavenging activity, with the exception of the water and ethanol extracts. IC50 values of the aqueous and ethanolic extracts against DPPH, ABTS, and NO were 0.018 and 0.016; 0.020 and 0.022; 0.05 and 0.023 mg/ml, respectively. The reducing power of the extracts was found to be concentration dependent. Conclusion The inhibitory effect of the extracts on free radicals may justify the traditional use of this plant in the management of common diseases in HIV/AIDS patients in the Eastern Cape Province. Overall, both aqueous and ethanol were found to be the best solvents for antioxidant activity in C. edulis leaves. PMID:23140206
Omoruyi, Beauty E; Bradley, Graeme; Afolayan, Anthony J
2012-11-09
Carpobrotus edulis (Mesembryanthemaceae), also known as igcukuma in the Xhosa language, is a medicinal plant used by traditional healers to treat common infections in HIV/AIDS patients. Based on this information, we investigated the plant's phytoconstituents, as well as the inhibitory effects of aqueous and three different organic solvent extracts, in order to justify its therapeutic usage. Antioxidant activity of the extracts was investigated spectrophotometrically against 1,1-diphenyl-2-picrylhydrazyl (DPPH), 2,2'-azino-bis(3-ethylbenzthiazoline-6-sulfonic acid) (ABTS) diammonium salt, hydrogen peroxide (H2O2), nitric oxide (NO), and ferric reducing power. Total phenols, flavonoids, flavonols, proanthocyanidins, tannins, alkaloids and saponins were also determined using standard methods. Quantitative phytochemical analysis of the four solvent extracts revealed a high percentage of phenolics (55.7 ± 0.404%) in the acetone extract, with appreciable amounts of proanthocyanidins (86.9 ± 0.005%) and alkaloids (4.5 ± 0.057%) in the aqueous extract, while tannin (48.9 ± 0.28%) and saponin (4.5 ± 0.262%) were major constituents of the ethanol extract. Flavonoids (0.12 ± 0.05%) and flavonols (0.12 ± 0.05%) were found at higher levels in the hexane extract in comparison with the other extracts. The leaf extracts demonstrated strong hydrogen peroxide scavenging activity, with the exception of the water and ethanol extracts. IC50 values of the aqueous and ethanolic extracts against DPPH, ABTS, and NO were 0.018 and 0.016; 0.020 and 0.022; 0.05 and 0.023 mg/ml, respectively. The reducing power of the extracts was found to be concentration dependent. The inhibitory effect of the extracts on free radicals may justify the traditional use of this plant in the management of common diseases in HIV/AIDS patients in the Eastern Cape Province. Overall, both aqueous and ethanol were found to be the best solvents for antioxidant activity in C. edulis leaves.
NASA Technical Reports Server (NTRS)
Marshall, Jochen; Milos, Frank; Fredrich, Joanne; Rasky, Daniel J. (Technical Monitor)
1997-01-01
Laser Scanning Confocal Microscopy (LSCM) has been used to obtain digital images of the complicated 3-D (three-dimensional) microstructures of rigid, fibrous thermal protection system (TPS) materials. These orthotropic materials are comprised of refractory ceramic fibers with diameters in the range of 1 to 10 microns and have open porosities of 0.8 or more. Algorithms are being constructed to extract quantitative microstructural information from the digital data so that it may be applied to specific heat and mass transport modeling efforts; such information includes, for example, the solid and pore volume fractions, the internal surface area per volume, fiber diameter distributions, and fiber orientation distributions. This type of information is difficult to obtain in general, yet it is directly relevant to many computational efforts which seek to model macroscopic thermophysical phenomena in terms of microscopic mechanisms or interactions. Two such computational efforts for fibrous TPS materials are: i) the calculation of radiative transport properties; ii) the modeling of gas permeabilities.
Challenges in Managing Information Extraction
ERIC Educational Resources Information Center
Shen, Warren H.
2009-01-01
This dissertation studies information extraction (IE), the problem of extracting structured information from unstructured data. Example IE tasks include extracting person names from news articles, product information from e-commerce Web pages, street addresses from emails, and names of emerging music bands from blogs. IE is an increasingly…
Stochastic time series analysis of fetal heart-rate variability
NASA Astrophysics Data System (ADS)
Shariati, M. A.; Dripps, J. H.
1990-06-01
Fetal Heart Rate (FHR) is one of the important features of fetal biophysical activity, and its long-term monitoring is used for the antepartum (period of pregnancy before labour) assessment of fetal well-being. As yet, however, no successful method has been proposed to quantitatively represent the variety of random non-white patterns seen in FHR. The objective of this paper is to address this issue. In this study the Box-Jenkins method of model identification and diagnostic checking was used on phonocardiographically derived (averaged) FHR time series. Models remained exclusively autoregressive (AR). Kalman filtering in conjunction with a maximum likelihood estimation technique forms the parametric estimator. Diagnostics performed on the residuals indicated that a second-order model may be adequate in capturing the type of variability observed in 1 to 2 min data windows of FHR. The scheme may be viewed as a means of data reduction of a highly redundant information source. This allows much more efficient transmission of FHR information from remote locations to places with the facilities and expertise for closer analysis. The extracted parameters are intended to reflect numerically the important FHR features that are normally picked up visually by experts for their assessments. As a result, long-term FHR recorded during the antepartum period could be screened quantitatively for the detection of patterns considered normal or abnormal.
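As a rough illustration of fitting a second-order autoregressive model to an FHR-like window, the sketch below uses the Yule-Walker equations rather than the paper's Kalman-filter maximum-likelihood estimator; the simulated series is purely illustrative.

```python
import numpy as np

def yule_walker_ar2(x):
    """Estimate AR(2) coefficients from the sample autocorrelations r1, r2:
        phi1 = r1*(1 - r2)/(1 - r1**2),  phi2 = (r2 - r1**2)/(1 - r1**2)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    def acf(lag):
        return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)
    r1, r2 = acf(1), acf(2)
    phi1 = r1 * (1 - r2) / (1 - r1 ** 2)
    phi2 = (r2 - r1 ** 2) / (1 - r1 ** 2)
    return phi1, phi2

# illustrative surrogate for a short averaged FHR window (beats per minute)
rng = np.random.default_rng(0)
series = 140 + 0.1 * np.cumsum(rng.normal(0, 0.5, 240))
phi1, phi2 = yule_walker_ar2(series)
```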
McGarty, Arlene M; Melville, Craig A
2018-02-01
There is a need to increase our understanding of the factors that affect physical activity participation in children with intellectual disabilities (ID) and to develop effective methods to overcome barriers and increase activity levels. This study aimed to systematically review parental perceptions of facilitators and barriers to physical activity for children with ID. A systematic search of Embase, Medline, ERIC, Web of Science, and PsycINFO was conducted (up to and including August, 2017) to identify relevant papers. A meta-ethnography approach was used to synthesise qualitative and quantitative results through the generation of third-order themes and a theoretical model. Ten studies were included, whose quality ranged from weak to strong. Seventy-one second-order themes and 12 quantitative results were extracted. Five third-order themes were developed: family, child factors, inclusive programmes and facilities, social motivation, and the child's experiences of physical activity. It is theorised that these factors can be facilitators or barriers to physical activity, depending on the information and education of relevant others, e.g. parents and coaches. Parents have an important role in supporting activity in children with ID. Increasing the information and education given to relevant others could be an important method of turning barriers into facilitators. Copyright © 2017 Elsevier Ltd. All rights reserved.
Wu, Pei-Wen; Mason, Katelyn E; Durbin-Johnson, Blythe P; Salemi, Michelle; Phinney, Brett S; Rocke, David M; Parker, Glendon J; Rice, Robert H
2017-07-01
Forensic association of hair shaft evidence with individuals is currently assessed by comparing mitochondrial DNA haplotypes of reference and casework samples, primarily for exclusionary purposes. Present work tests and validates more recent proteomic approaches to extract quantitative transcriptional and genetic information from hair samples of monozygotic twin pairs, which would be predicted to partition away from unrelated individuals if the datasets contain identifying information. Protein expression profiles and polymorphic, genetically variant hair peptides were generated from ten pairs of monozygotic twins. Profiling using the protein tryptic digests revealed that samples from identical twins had typically an order of magnitude fewer protein expression differences than unrelated individuals. The data did not indicate that the degree of difference within twin pairs increased with age. In parallel, data from the digests were used to detect genetically variant peptides that result from common nonsynonymous single nucleotide polymorphisms in genes expressed in the hair follicle. Compilation of the variants permitted sorting of the samples by hierarchical clustering, permitting accurate matching of twin pairs. The results demonstrate that genetic differences are detectable by proteomic methods and provide a framework for developing quantitative statistical estimates of personal identification that increase the value of hair shaft evidence. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
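A sketch of the kind of hierarchical clustering that can sort samples by shared genetically variant peptides follows; the binary presence/absence matrix and sample labels are invented for illustration and do not represent the study's data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# rows = hair samples, columns = genetically variant peptides (1 = detected)
# two hypothetical twin pairs: (A1, A2) and (B1, B2)
profiles = np.array([
    [1, 0, 1, 1, 0, 0],   # A1
    [1, 0, 1, 1, 0, 1],   # A2
    [0, 1, 0, 0, 1, 0],   # B1
    [0, 1, 0, 1, 1, 0],   # B2
])
labels = ["A1", "A2", "B1", "B2"]

dist = pdist(profiles, metric="jaccard")    # dissimilarity between samples
tree = linkage(dist, method="average")      # UPGMA hierarchical clustering
groups = fcluster(tree, t=2, criterion="maxclust")
print(dict(zip(labels, groups)))            # twins should share a cluster
```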
Dai, Huiqing; Chen, Chengyu; Yang, Bin
2010-09-01
To investigate the AAPH scavenging activities of 22 flavonoids and phenolic acids and 9 extracts of Chinese materia medica. The antioxidant activities of the samples were evaluated by an oxygen radical absorbance capacity (ORAC) method; at the same time, the total contents of flavonoids and phenolic acids in the 9 herb extracts were analyzed by the Folin-Ciocalteu method, and the active components were qualitatively and quantitatively analyzed by an HPLC method. It was found that the tea extract showed the strongest AAPH scavenging activity, with an ORAC value of 4786.40 micromol x g(-1), whereas safflower demonstrated the weakest activity, with an ORAC value of 784.04 micromol x g(-1). As for compounds, quercetin had the strongest AAPH scavenging activity, with an ORAC value of 12.90, while (-)-EGC had the weakest activity, with an ORAC value of 2.47. A quantitative relationship was obtained to describe the AAPH scavenging activity of the herb extracts: Y = 1844.8 lnX - 3577.5, r = 0.8675, where Y stands for the ORAC value and X stands for the concentration of total phenolic acids. Flavonoids and phenolic acids are the AAPH scavenging active ingredients in the Chinese herb extracts. Combining the ORAC method and the HPLC method is a good way to study the antioxidant activity of a Chinese herb extract and its chemical composition.
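A small sketch applying the reported relationship Y = 1844.8·ln(X) − 3577.5 to predict the ORAC value of an extract from its total phenolic acid concentration; the input value and its units are illustrative, since the abstract does not restate them.

```python
import math

def predicted_orac(total_phenolics):
    """ORAC value predicted from the total phenolic acid concentration
    using the regression reported in the study: Y = 1844.8*ln(X) - 3577.5."""
    return 1844.8 * math.log(total_phenolics) - 3577.5

# e.g. an extract with an illustrative total phenolic acid concentration
orac_value = predicted_orac(total_phenolics=90.0)   # roughly 4724
```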
NASA Astrophysics Data System (ADS)
Hans, Kerstin M.-C.; Gianella, Michele; Sigrist, Markus W.
2012-03-01
On-site drug tests have gained importance, e.g., for protecting society from impaired drivers. Since today's drug tests are mostly only qualitative (positive/negative), there is a great need for a reliable, portable and preferably quantitative drug test. In the project IrSens we aim to bridge this gap with the development of an optical sensor platform based on infrared spectroscopy, focusing on cocaine detection in saliva. We combine a one-step extraction method, a sample drying technique and infrared attenuated total reflection (ATR) spectroscopy. As a first step we have developed an extraction technique that allows us to extract cocaine from saliva into an almost infrared-transparent solvent and to record ATR spectra with a commercially available Fourier transform infrared spectrometer. To the best of our knowledge this is the first time that such a simple and easy-to-use one-step extraction method has been used to transfer cocaine from saliva into an organic solvent and detect it quantitatively. With this new method we are able to reach a current limit of detection around 10 μg/ml. This extraction method could also be applied to waste water monitoring and to controlling the caffeine content in beverages.
Extraction and quantitative analysis of iodine in solid and solution matrixes.
Brown, Christopher F; Geiszler, Keith N; Vickerman, Tanya S
2005-11-01
129I is a contaminant of interest in the vadose zone and groundwater at numerous federal and privately owned facilities. Several techniques have been utilized to extract iodine from solid matrixes; however, all of them rely on two fundamental approaches: liquid extraction or chemical/heat-facilitated volatilization. While these methods are typically chosen for their ease of implementation, they do not totally dissolve the solid. We defined a method that produces complete solid dissolution and conducted laboratory tests to assess its efficacy to extract iodine from solid matrixes. Testing consisted of potassium nitrate/potassium hydroxide fusion of the sample, followed by sample dissolution in a mixture of sulfuric acid and sodium bisulfite. The fusion extraction method resulted in complete sample dissolution of all solid matrixes tested. Quantitative analysis of 127I and 129I via inductively coupled plasma mass spectrometry showed better than +/-10% accuracy for certified reference standards, with the linear operating range extending more than 3 orders of magnitude (0.005-5 microg/L). Extraction and analysis of four replicates of standard reference material containing 5 microg/g 127I resulted in an average recovery of 98% with a relative deviation of 6%. This simple and cost-effective technique can be applied to solid samples of varying matrixes with little or no adaptation.
Fromm, Matthias; Bayha, Sandra; Carle, Reinhold; Kammerer, Dietmar R
2012-02-08
The phenolic constituents of seeds of 12 different apple cultivars were fractionated by sequential extraction with aqueous acetone (30:70, v/v) and ethyl acetate after hexane extraction of the lipids. Low molecular weight phenolic compounds were individually quantitated by RP-HPLC-DAD. The contents of extractable and nonextractable procyanidins were determined by applying RP-HPLC following thiolysis and n-butanol/HCl hydrolysis, respectively. As expected, the results revealed marked differences of the ethyl acetate extracts, aqueous acetone extracts, and insoluble residues with regard to contents and mean degrees of polymerization of procyanidins. Total phenolic contents in the defatted apple seed residues ranged between 18.4 and 99.8 mg/g. Phloridzin was the most abundant phenolic compound, representing 79-92% of monomeric polyphenols. Yields of phenolic compounds significantly differed among the cultivars under study, with seeds of cider apples generally being richer in phloridzin and catechins than seeds of dessert apple cultivars. This is the first study presenting comprehensive data on the contents of phenolic compounds in apple seeds comprising extractable and nonextractable procyanidins. Furthermore, the present work points out a strategy for the sustainable and complete exploitation of apple seeds as valuable agro-industrial byproducts, in particular as a rich source of phloridzin and antioxidant flavanols.
Liguori, Lucia; Bjørsvik, Hans-René
2012-12-01
The development of a multivariate study for a quantitative analysis of six different polybrominated diphenyl ethers (PBDEs) in tissue of Atlantic Salmo salar L. is reported. An extraction, isolation, and purification process based on an accelerated solvent extraction system was designed, investigated, and optimized by means of statistical experimental design and multivariate data analysis and regression. An accompanying gas chromatography-mass spectrometry analytical method was developed for the identification and quantification of the analytes, BDE 28, BDE 47, BDE 99, BDE 100, BDE 153, and BDE 154. These PBDEs have been used in commercial blends employed as flame retardants for a variety of materials, including electronic devices, synthetic polymers and textiles. The present study revealed that an extracting solvent mixture composed of hexane and CH₂Cl₂ (10:90) provided excellent recoveries of all six PBDEs studied herein. A somewhat lower polarity in the extracting solvent, hexane and CH₂Cl₂ (40:60), decreased the analyte %-recoveries, which still remained acceptable and satisfactory. The study demonstrates the necessity of a detailed investigation of the extraction and purification process in order to achieve quantitative isolation of the analytes from the specific matrix. Copyright © 2012 Elsevier B.V. All rights reserved.
Building Extraction from Remote Sensing Data Using Fully Convolutional Networks
NASA Astrophysics Data System (ADS)
Bittner, K.; Cui, S.; Reinartz, P.
2017-05-01
Building detection and footprint extraction are in high demand for many remote sensing applications. Though most previous works have shown promising results, the automatic extraction of building footprints still remains a nontrivial topic, especially in complex urban areas. Recently developed extensions of the CNN framework made it possible to perform dense pixel-wise classification of input images. Based on these abilities we propose a methodology which automatically generates a full-resolution binary building mask out of a Digital Surface Model (DSM) using a Fully Convolutional Network (FCN) architecture. The advantage of using the depth information is that it provides geometrical silhouettes, allows a better separation of buildings from the background, and is invariant to illumination and color variations. The proposed framework has two main steps. Firstly, the FCN is trained on a large set of patches consisting of normalized DSM (nDSM) as inputs and the available ground truth building mask as target outputs. Secondly, the predictions generated by the FCN are viewed as unary terms for a fully connected Conditional Random Field (FCRF), which enables us to create the final binary building mask. A series of experiments demonstrate that our methodology is able to extract accurate building footprints which are close to the buildings' original shapes to a high degree. The quantitative and qualitative analysis shows significant improvements of the results in contrast to the multi-layer fully connected network from our previous work.
Automated extraction of pleural effusion in three-dimensional thoracic CT images
NASA Astrophysics Data System (ADS)
Kido, Shoji; Tsunomori, Akinori
2009-02-01
For the diagnosis of pulmonary diseases it is important to quantitatively measure the volume of accumulating pleural effusion in three-dimensional thoracic CT images. However, correct automated extraction of pleural effusion is difficult. A conventional extraction algorithm using a gray-level threshold cannot correctly separate pleural effusion from the thoracic wall or mediastinum, because the density of pleural effusion in CT images is similar to that of these structures. We have therefore developed an automated extraction method for pleural effusion based on extracting the lung area together with the effusion. Our method uses a lung template obtained from a normal lung to segment lungs with pleural effusions. The registration process consists of two steps. The first step is a global matching between normal and abnormal lungs based on organs such as the bronchi, bones (ribs, sternum and vertebrae) and the upper surface of the liver, which are extracted using a region-growing algorithm. The second step is a local matching between the normal and abnormal lungs deformed with the parameters obtained from the global matching. Finally, we segment the lung with pleural effusion using the template deformed by the two sets of parameters obtained from the global and local matching. We compared our method with a conventional gray-level threshold method and with two published methods. The extraction rates of pleural effusions obtained with our method were much higher than those of the other methods. Automated extraction of pleural effusion by extracting the lung area together with the effusion is promising for the diagnosis of pulmonary diseases, as it provides a quantitative volume of the accumulating effusion.
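The gray-level region growing mentioned for extracting the bronchi, bones and liver surface can be sketched as follows in Python; the seed position, intensity window and 6-connectivity used here are illustrative assumptions, not the authors' parameters.

```python
# Illustrative seeded region growing within an intensity window on a 3D volume.
from collections import deque
import numpy as np

def region_grow(volume, seed, low, high):
    """Grow a 6-connected region from `seed` while voxel values stay in [low, high]."""
    mask = np.zeros(volume.shape, dtype=bool)
    if not (low <= volume[seed] <= high):
        return mask
    queue = deque([seed])
    mask[seed] = True
    offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            nz, ny, nx = z + dz, y + dy, x + dx
            if (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                    and 0 <= nx < volume.shape[2] and not mask[nz, ny, nx]
                    and low <= volume[nz, ny, nx] <= high):
                mask[nz, ny, nx] = True
                queue.append((nz, ny, nx))
    return mask

# Example: grow a bright, bone-like structure from a seed voxel in a synthetic volume.
vol = np.random.default_rng(1).normal(40, 10, (30, 64, 64))
vol[10:20, 20:40, 20:40] = 300                       # synthetic high-density region
bone_mask = region_grow(vol, seed=(15, 30, 30), low=200, high=400)
print(bone_mask.sum(), "voxels grown")
```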
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-05
... Request; Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... clearance. Experimental Study: Presentation of Quantitative Effectiveness Information to Consumers in Direct... research has proposed that providing quantitative information about product efficacy enables consumers to...
Blicharski, Tomasz; Oniszczuk, Anna; Olech, Marta; Oniszczuk, Tomasz; Wójtowicz, Agnieszka; Krawczyk, Wojciech; Nowak, Renata
2017-05-11
Introduction. Functional food plays an important role in the prevention, management and treatment of chronic diseases. One of the most interesting techniques of functional food production is extrusion-cooking. Functional foods may include items such as puffed cereals, breads and beverages that are fortified with vitamins, nutraceuticals and herbs. Due to their pharmacological activity, chamomile flowers are among the most popular components added to functional food. The aim of this work was the quantitative analysis of polyphenolic antioxidants, as well as a comparison of various methods for the extraction of phenolic compounds from corn puffed cereals, puffed cereals with an addition of chamomile (3, 5, 10 and 20%), and from Chamomillae anthodium. Materials and Methods. Two modern extraction methods - ultrasound-assisted extraction (UAE) at 40 °C and 60 °C, and accelerated solvent extraction (ASE) at 100 °C and 120 °C - were used for the isolation of polyphenols from the functional food. Analysis of flavonoids and phenolic acids was carried out using reversed-phase high-performance liquid chromatography with electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS). Results and Conclusions. For most of the analyzed compounds, the highest yields were obtained by ultrasound-assisted extraction. The higher temperature during the ultrasonication process (60 °C) increased the efficiency of extraction without degradation of the polyphenols. UAE reaches extraction equilibrium easily and therefore permits shorter extraction times, reducing the energy input. Furthermore, UAE meets the requirements of 'Green Chemistry'.
ERIC Educational Resources Information Center
Purcell, Sean C.; Pande, Prithvi; Lin, Yingxin; Rivera, Ernesto J.; Paw U, Latisha; Smallwood, Luisa M.; Kerstiens, Geri A.; Armstrong, Laura B.; Robak, MaryAnn T.; Baranger, Anne M.; Douskey, Michelle C.
2016-01-01
In this undergraduate analytical chemistry experiment, students quantitatively assess the antibacterial activity of essential oils found in thyme leaves ("Thymus vulgaris") in an authentic, research-like environment. This multi-week experiment aims to instill green chemistry principles as intrinsic to chemical problem solving. Students…
This paper evaluates the chemical stability of four arsenosugars using tetramethylammonium hydroxide (TMAOH) as an extraction solvent. This solvent was chosen because of the near-quantitative removal of these arsenicals from difficult-to-extract seafood (oysters and shellfish). ...
Cioancă, Oana; Hăncianu, Monica; Spac, A; Miron, Anca; Stănescu, Ursula
2009-01-01
Continuing a series of studies intended to evaluate the pharmaceutical quality of 10 commercial samples of chamomile, we investigated the chemical composition of hydroalcoholic extracts obtained in our laboratory from this raw material. The qualitative and semiquantitative analysis of the extracts was performed by HPLC. All extractive solutions have a high content of ferulic acid, whereas the caffeic acid level is the lowest. Regarding the flavonoids, there are many quantitative differences between the samples: one extract lacks rutoside and two have low apigenin-7-glucoside contents.
QSAR DataBank - an approach for the digital organization and archiving of QSAR model information
2014-01-01
Background: Research efforts in the field of descriptive and predictive Quantitative Structure-Activity Relationships and Quantitative Structure-Property Relationships produce around one thousand scientific publications annually. All the materials and results are mainly communicated through printed media. Printed media in their present form have obvious limitations when it comes to effectively representing mathematical models, including complex and non-linear ones, and large bodies of associated numerical chemical data. They do not support secondary information extraction or reuse, while in silico studies pose additional requirements for the accessibility, transparency and reproducibility of the research. This gap can and should be bridged by introducing domain-specific digital data exchange standards and tools. The current publication presents a formal specification of the quantitative structure-activity relationship data organization and archival format called the QSAR DataBank (QsarDB for shorter, or QDB for shortest). Results: The article describes the QsarDB data schema, which formalizes QSAR concepts (objects and the relationships between them), and the QsarDB data format, which formalizes their presentation for computer systems. The utility and benefits of QsarDB have been thoroughly tested by solving everyday QSAR and predictive modeling problems, with examples in the field of predictive toxicology, and it can be applied to a wide variety of other endpoints. The work is accompanied by an open source reference implementation and tools. Conclusions: The proposed open data, open source, and open standards design is open to public and proprietary extensions on many levels. Selected use cases exemplify the benefits of the proposed QsarDB data format. General ideas for future development are discussed. PMID:24910716
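Purely as an illustration of the idea of archiving model data in a self-contained digital container, the Python sketch below packages a toy compound table and model metadata into a zip file; the directory layout and file names are hypothetical and do not follow the actual QsarDB schema defined in the publication and reference tools.

```python
# Hypothetical (non-QsarDB) archive layout for QSAR model ingredients, for illustration only.
import io
import json
import zipfile

compounds = [
    {"id": "C1", "smiles": "CCO",      "logP_desc": -0.31, "observed": 0.82, "predicted": 0.79},
    {"id": "C2", "smiles": "c1ccccc1", "logP_desc": 2.13,  "observed": 1.95, "predicted": 2.01},
]

with zipfile.ZipFile("qsar_archive_demo.zip", "w") as zf:
    zf.writestr("model/metadata.json",
                json.dumps({"endpoint": "example endpoint", "algorithm": "MLR"}, indent=2))
    table = io.StringIO()
    table.write("id,smiles,logP_desc,observed,predicted\n")
    for c in compounds:
        table.write("{id},{smiles},{logP_desc},{observed},{predicted}\n".format(**c))
    zf.writestr("data/compounds.csv", table.getvalue())

print("wrote qsar_archive_demo.zip")
```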
Abortion and mental health: quantitative synthesis and analysis of research published 1995-2009.
Coleman, Priscilla K
2011-09-01
Given the methodological limitations of recently published qualitative reviews of abortion and mental health, a quantitative synthesis was deemed necessary to represent more accurately the published literature and to provide clarity to clinicians. To measure the association between abortion and indicators of adverse mental health, with subgroup effects calculated based on comparison groups (no abortion, unintended pregnancy delivered, pregnancy delivered) and particular outcomes. A secondary objective was to calculate population-attributable risk (PAR) statistics for each outcome. After the application of methodologically based selection criteria and extraction rules to minimise bias, the sample comprised 22 studies, 36 measures of effect and 877 181 participants (163 831 experienced an abortion). Random effects pooled odds ratios were computed using adjusted odds ratios from the original studies and PAR statistics were derived from the pooled odds ratios. Women who had undergone an abortion experienced an 81% increased risk of mental health problems, and nearly 10% of the incidence of mental health problems was shown to be attributable to abortion. The strongest subgroup estimates of increased risk occurred when abortion was compared with term pregnancy and when the outcomes pertained to substance use and suicidal behaviour. This review offers the largest quantitative estimate of mental health risks associated with abortion available in the world literature. Calling into question the conclusions from traditional reviews, the results revealed a moderate to highly increased risk of mental health problems after abortion. Consistent with the tenets of evidence-based medicine, this information should inform the delivery of abortion services.
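The pooling and attributable-risk arithmetic referred to above can be sketched as follows; the DerSimonian-Laird estimator and the example effect sizes are illustrative assumptions (the exposed fraction of roughly 0.19 is taken from the participant counts in the abstract), not a reproduction of the review's calculations.

```python
# Sketch of random-effects pooling of adjusted odds ratios and a PAR calculation.
import numpy as np

def dersimonian_laird(or_values, ci_low, ci_high):
    """Pool odds ratios (with 95% CIs) on the log scale using DerSimonian-Laird."""
    y = np.log(or_values)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)   # SE recovered from the 95% CI width
    v = se ** 2
    w = 1.0 / v
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)
    k = len(y)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))
    w_star = 1.0 / (v + tau2)
    return np.exp(np.sum(w_star * y) / np.sum(w_star))

def attributable_risk(pooled_or, exposure_prevalence):
    """PAR using the odds ratio as an approximation of relative risk."""
    return (exposure_prevalence * (pooled_or - 1)
            / (1 + exposure_prevalence * (pooled_or - 1)))

# Hypothetical effect sizes from three studies (not the review's data).
or_est = np.array([1.6, 2.1, 1.4])
lo = np.array([1.2, 1.5, 1.0])
hi = np.array([2.1, 2.9, 2.0])
pooled = dersimonian_laird(or_est, lo, hi)
print(f"pooled OR ~ {pooled:.2f}, PAR ~ {attributable_risk(pooled, 0.19):.2%}")
```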
Pedersen, E B L; Angmo, D; Dam, H F; Thydén, K T S; Andersen, T R; Skjønsfjell, E T B; Krebs, F C; Holler, M; Diaz, A; Guizar-Sicairos, M; Breiby, D W; Andreasen, J W
2015-08-28
Organic solar cells have great potential for upscaling due to roll-to-roll processing and a low energy payback time, making them an attractive sustainable energy source for the future. Active layers coated with water-dispersible Landfester particles enable greater control of the layer formation and easier access to the printing industry, which has reduced its use of organic solvents since the 1980s. Through ptychographic X-ray computed tomography (PXCT), we quantitatively image a roll-to-roll coated photovoltaic tandem stack consisting of one bulk heterojunction active layer and one Landfester particle active layer. We extract the layered morphology with structural and density information, including the porosity present in the various layers and in the silver electrode, at high resolution in 3D. The Landfester particle layer is found to have an undesired morphology with negatively correlated top and bottom interfaces, a wide thickness distribution, and only partial surface coverage causing electrical short circuits through the layer. By top-coating a polymer material onto the Landfester nanoparticles we eliminate the structural defects of the layer, such as porosity and roughness, and achieve the increased performance, with a voltage larger than 1 V, expected for a tandem cell. This study highlights that quantitative imaging of weakly scattering stacked layers of organic materials has become feasible by PXCT, and that this information cannot be obtained by other methods. In the present study, the technique specifically reveals the need to improve the coatability and layer formation of Landfester nanoparticles, thus allowing improved solar cells to be produced.
A Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information
NASA Astrophysics Data System (ADS)
Lian, Shizhong; Chen, Jiangping; Luo, Minghai
2016-06-01
Water information cannot be accurately extracted from TM images in which true information is lost because of blocking clouds and missing data stripes. Because water is continuously distributed under natural conditions, this paper proposes a new water body extraction method based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different disturbances from clouds and missing data stripes are simulated, and water information is extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and a higher Boundary Recall are obtained with this method than with the conventional methods.
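The global histogram matching step used as one of the comparison methods can be sketched in Python as below; the synthetic bands and the final gray-level threshold for water are placeholders, not the paper's probability-based statistic.

```python
# Map the gray-level distribution of a damaged band onto a reference band, then threshold.
import numpy as np

def histogram_match(source, reference):
    """Return `source` with its gray-level distribution matched to `reference`."""
    s_values, s_idx, s_counts = np.unique(source.ravel(),
                                          return_inverse=True, return_counts=True)
    r_values, r_counts = np.unique(reference.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts).astype(np.float64) / source.size
    r_cdf = np.cumsum(r_counts).astype(np.float64) / reference.size
    matched_values = np.interp(s_cdf, r_cdf, r_values)   # map source CDF onto reference levels
    return matched_values[s_idx].reshape(source.shape)

rng = np.random.default_rng(2)
reference = rng.normal(120, 25, (100, 100))     # band without missing information
damaged = rng.normal(90, 15, (100, 100))        # band affected by clouds/stripes
matched = histogram_match(damaged, reference)
water_mask = matched < 80                       # crude gray-level threshold standing in for water
print("water fraction:", water_mask.mean())
```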
Content of polyphenol compound in mangrove and macroalga extracts
NASA Astrophysics Data System (ADS)
Takarina, N. D.; Patria, M. P.
2017-07-01
Polyphenols, or phenolic compounds, contain one or more hydroxyl groups attached to an aromatic ring [1]. These compounds show antibacterial, antiseptic, and antioxidant activities. Natural resources such as mangroves and macroalgae are known to contain them. The purpose of this research was to investigate the polyphenol content of a mangrove and a macroalga. The materials used were mangrove (Avicennia sp.) leaves and the whole macroalga (Caulerpa racemosa). Samples were dried for 5 days and then macerated to obtain extracts. Maceration was carried out in methanol for 48 hours (first step) and a further 24 hours (second step). Polyphenol content was determined by phytochemical screening of both extracts, and a quantitative test was carried out to determine catechin and tannin as polyphenol compounds. The results showed that catechin was present in both extracts, whereas tannin was found only in the mangrove extract. According to the quantitative test, the mangrove had higher contents of catechin and tannin (12.37-13.44%) than the macroalga (2.57-4.58%). Both materials can therefore serve as sources of polyphenol compounds, with the higher content found in the mangrove, and can be exploited in further studies and in applications such as medicinal products.
NASA Astrophysics Data System (ADS)
Ansari, S.; Talebpour, Z.; Molaabasi, F.; Bijanzadeh, H. R.; Khazaeli, S.
2016-09-01
The analysis of pesticides in water samples is of primary concern for quality control laboratories due to the toxicity of these compounds and their associated public health risk. A novel analytical method based on stir bar sorptive extraction (SBSE), followed by 31P quantitative nuclear magnetic resonance (31P QNMR), has been developed for simultaneously monitoring and determining four organophosphorus pesticides (OPPs) in aqueous media. The effects of factors on the extraction efficiency of OPPs were investigated using a Draper-Lin small composite design. An optimal sample volume of 4.2 mL, extraction time of 96 min, extraction temperature of 42°C, and desorption time of 11 min were obtained. The results showed reasonable linearity ranges for all pesticides with correlation coefficients greater than 0.9920. The limit of quantification (LOQ) ranged from 0.1 to 2.60 mg/L, and the recoveries of spiked river water samples were from 82 to 94% with relative standard deviation (RSD) values less than 4%. The results show that this method is simple, selective, rapid, and can be applied to other sample matrices.
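A minimal sketch of the calibration and limit-of-quantification arithmetic implied by the reported linearity and LOQ figures is given below; the calibration points, blank standard deviation and the LOQ = 10·σ/slope convention are assumptions for illustration, not the study's data.

```python
# Linear calibration plus a conventional LOQ estimate (placeholder numbers).
import numpy as np

conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0])          # spiked concentration (mg/L)
signal = np.array([0.9, 2.1, 5.2, 10.3, 20.8])       # integrated 31P NMR peak area (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)        # least-squares calibration line
r = np.corrcoef(conc, signal)[0, 1]                   # linearity check
sigma_blank = 0.15                                    # SD of blank responses (assumed)
loq = 10 * sigma_blank / slope                        # LOQ = 10*sigma/slope convention

print(f"slope = {slope:.3f}, r = {r:.4f}, LOQ ~ {loq:.2f} mg/L")
```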
Jesse, Stephen; Kalinin, Sergei V; Nikiforov, Maxim P
2013-07-09
An approach for the thermomechanical characterization of phase transitions in polymeric materials (poly(ethylene terephthalate)) by band excitation acoustic force microscopy is developed. This methodology allows the independent measurement of the resonance frequency, Q factor, and oscillation amplitude of the tip-surface contact as a function of tip temperature, from which the thermal evolution of the tip-surface spring constant and mechanical dissipation can be extracted. A heating protocol maintained a constant tip-surface contact area and constant contact force, thereby allowing reproducible measurements and quantitative extraction of material properties, including the temperature dependence of indentation-based elastic and loss moduli.
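One common way to obtain the resonance frequency, Q factor and amplitude from a band-excitation response is to fit a damped simple-harmonic-oscillator amplitude model; the sketch below does this on synthetic data and is an assumption about the fitting step, not the authors' analysis code.

```python
# Fit a driven damped harmonic oscillator amplitude model to a synthetic band-excitation response.
import numpy as np
from scipy.optimize import curve_fit

def sho_amplitude(f, a_drive, f0, q):
    """Amplitude response of a driven damped harmonic oscillator."""
    return a_drive * f0**2 / np.sqrt((f0**2 - f**2)**2 + (f * f0 / q)**2)

f = np.linspace(250e3, 350e3, 400)                       # excitation band (Hz)
true = sho_amplitude(f, 1.2e-9, 300e3, 150)              # synthetic "measured" response
noisy = true + np.random.default_rng(3).normal(0, 2e-11, f.size)

popt, _ = curve_fit(sho_amplitude, f, noisy, p0=[1e-9, 290e3, 100])
a_fit, f0_fit, q_fit = popt
print(f"f0 = {f0_fit / 1e3:.1f} kHz, Q = {q_fit:.0f}, drive amplitude = {a_fit:.2e}")
```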
Insight into the CH3NH3PbI3/C interface in hole-conductor-free mesoscopic perovskite solar cells
NASA Astrophysics Data System (ADS)
Li, Jiangwei; Niu, Guangda; Li, Wenzhe; Cao, Kun; Wang, Mingkui; Wang, Liduo
2016-07-01
Perovskite solar cells (PSCs) with a hole-conductor-free mesoscopic architecture have shown superb stability and great potential for practical application. The printable carbon counter electrodes take full responsibility for extracting holes from the active CH3NH3PbI3 absorbers. However, an in-depth study of the CH3NH3PbI3/C interface properties, such as the structural formation process and the effect of interfacial conditions on hole extraction, is still lacking. Herein, we present, for the first time, an insight into the spatial-confinement-induced CH3NH3PbI3/C interface formation by in situ photoluminescence observations during the crystallization of CH3NH3PbI3. The derived reaction kinetics allows a quantitative description of the perovskite formation process. In addition, we found that the interfacial contact between carbon and perovskite is dominant for the hole extraction efficiency and is associated with the photovoltaic parameter of short-circuit current density (JSC). Consequently, we conducted a solvent-vapor-assisted process of PbI2 diffusion to carefully control the CH3NH3PbI3/C interface with less unreacted PbI2 barrier. The improvement of the interface conditions thereby contributes to a high hole extraction, as proved by the charge extraction resistance and PL lifetime changes, resulting in an increased JSC value. Electronic supplementary information (ESI) available: Fig. S1-S11, Tables S1, S2 and details of the Avrami model for reaction kinetics. See DOI: 10.1039/c6nr03359h
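The generic Avrami form X(t) = 1 − exp(−k·tⁿ) referred to in the ESI can be fitted as in the short sketch below; the synthetic crystallized-fraction trace and parameter values are placeholders, not the paper's data.

```python
# Fit the generic Avrami crystallization model to a synthetic transformed-fraction curve.
import numpy as np
from scipy.optimize import curve_fit

def avrami(t, k, n):
    """Transformed fraction X(t) = 1 - exp(-k * t**n)."""
    return 1.0 - np.exp(-k * t**n)

t = np.linspace(0.1, 30, 60)                                     # time (e.g. minutes)
x = avrami(t, 0.02, 1.8) + np.random.default_rng(4).normal(0, 0.01, t.size)

(k_fit, n_fit), _ = curve_fit(avrami, t, x, p0=[0.01, 1.0])
print(f"rate constant k = {k_fit:.3f}, Avrami exponent n = {n_fit:.2f}")
```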
Infrared thermography quantitative image processing
NASA Astrophysics Data System (ADS)
Skouroliakou, A.; Kalatzis, I.; Kalyvas, N.; Grivas, TB
2017-11-01
Infrared thermography is an imaging technique that provides a map of the temperature distribution over an object's surface. It is considered for a wide range of applications in medicine as well as in non-destructive testing procedures. One of its promising medical applications is in orthopaedics and diseases of the musculoskeletal system, where the temperature distribution of the body surface can contribute to the diagnosis and follow-up of certain disorders. Although a thermographic image gives a fairly good visual estimate of distribution homogeneity and of temperature pattern differences between two symmetric body parts, it is important to extract quantitative measurements characterising the temperature. Some approaches use the temperature of enantiomorphic anatomical points, or parameters extracted from a Region of Interest (ROI), and a number of indices have been developed by researchers to that end. In this study, a quantitative approach to thermographic image processing is attempted, based on extracting different indices for symmetric ROIs on thermograms of the lower back of scoliotic patients. The indices are based on first-order statistical parameters describing the temperature distribution. Analysis and comparison of these indices make it possible to evaluate the temperature distribution pattern of the back trunk expected in subjects who are healthy with regard to spinal problems.
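A minimal sketch of first-order statistical indices for two symmetric ROIs is shown below; the particular index definitions (ROI mean, standard deviation, skewness, and an absolute mean-temperature difference between mirrored ROIs) are illustrative choices, not necessarily the exact indices of the study.

```python
# First-order statistics and a simple left/right asymmetry index for thermogram ROIs.
import numpy as np

def roi_statistics(roi):
    """First-order statistics of the temperature values inside a ROI."""
    roi = roi.ravel()
    mean = roi.mean()
    std = roi.std(ddof=1)
    skew = np.mean(((roi - mean) / std) ** 3)
    return {"mean": mean, "std": std, "skewness": skew}

def asymmetry_index(left_roi, right_roi):
    """Absolute mean-temperature difference between mirrored ROIs (°C)."""
    return abs(left_roi.mean() - right_roi.mean())

rng = np.random.default_rng(5)
thermogram = rng.normal(31.0, 0.4, (240, 320))            # synthetic back-surface map (°C)
left = thermogram[100:180, 40:140]                        # mirrored ROIs about the midline
right = thermogram[100:180, 180:280]
print(roi_statistics(left), roi_statistics(right), asymmetry_index(left, right))
```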
Pascali, Jennifer P; Fais, Paolo; Vaiano, Fabio; Bertol, Elisabetta
2018-05-01
The growing worldwide market for herbal remedies could pose severe problems to consumers' health due to the possible presence of potentially harmful, undeclared synthetic substances or analogues of prescription drugs. The present work shows a simple but effective approach to unequivocally identify synthetic anorectic compounds in allegedly 'natural' herbal extracts, by exploiting liquid chromatography/quadrupole time-of-flight mass spectrometry (Q-TOF LC/MS) coupled with liquid chromatography/triple quadrupole (LC-MS/MS) confirmation and quantitation. The procedure was applied to five herbal tea extracts and pills sold as coadjuvants for weight loss. The method employed liquid-liquid sample extraction (LLE) and separation on a C18 (2.1 mm × 150 mm, 1.8 μm) column. Q-TOF acquisitions were carried out both in scan mode and in all-ion MS/MS mode, and results were obtained after searching against an ad hoc prepared library. Sibutramine, 4-hydroxyamphetamine, caffeine and theophylline were preliminarily identified in the samples. Confirmation and quantitation of the preliminarily identified compounds were obtained by LC-MS/MS after preparation of the appropriate standards. Sibutramine, caffeine and theophylline were finally confirmed and quantitated. Copyright © 2018 Elsevier B.V. All rights reserved.