Sample records for producing quantitatively accurate

  1. Quantitating Organoleptic Volatile Phenols in Smoke-Exposed Vitis vinifera Berries.

    PubMed

    Noestheden, Matthew; Thiessen, Katelyn; Dennis, Eric G; Tiet, Ben; Zandberg, Wesley F

    2017-09-27

    Accurate methods for quantitating volatile phenols (i.e., guaiacol, syringol, 4-ethylphenol, etc.) in smoke-exposed Vitis vinifera berries prior to fermentation are needed to predict the likelihood of perceptible smoke taint following vinification. Reported here is a complete, cross-validated analytical workflow to accurately quantitate free and glycosidically bound volatile phenols in smoke-exposed berries using liquid-liquid extraction, acid-mediated hydrolysis, and gas chromatography-tandem mass spectrometry. The reported workflow addresses critical gaps in existing methods for volatile phenols that impact quantitative accuracy, most notably the effect of injection port temperature and the variability in acid-mediated hydrolytic procedures currently used. Addressing these deficiencies will help the wine industry make accurate, informed decisions when producing wines from smoke-exposed berries.

  2. Comparison of salivary collection and processing methods for quantitative HHV-8 detection.

    PubMed

    Speicher, D J; Johnson, N W

    2014-10-01

    Saliva is a proven diagnostic fluid for the qualitative detection of infectious agents, but the accuracy of viral load determinations is unknown. Stabilising fluids impede nucleic acid degradation, compared with collection onto ice and then freezing, and we have shown that the DNA Genotek P-021 prototype kit (P-021) can produce high-quality DNA after 14 months of storage at room temperature. Here we evaluate the quantitative capability of 10 collection/processing methods. Unstimulated whole mouth fluid was spiked with a mixture of HHV-8 cloned constructs, 10-fold serial dilutions were produced, and samples were extracted and then examined with quantitative PCR (qPCR). Calibration curves were compared by linear regression and qPCR dynamics. All methods extracted with commercial spin columns produced linear calibration curves with large dynamic range and gave accurate viral loads. Ethanol precipitation of the P-021 does not produce a linear standard curve, and virus is lost in the cell pellet. DNA extractions from the P-021 using commercial spin columns produced linear standard curves with wide dynamic range and excellent limit of detection. When extracted with spin columns, the P-021 enables accurate viral loads down to 23 copies μl⁻¹ DNA. The quantitative and long-term storage capability of this system makes it ideal for study of salivary DNA viruses in resource-poor settings. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
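
    To make the calibration-curve step concrete, here is a minimal sketch of how a qPCR standard curve from 10-fold serial dilutions can be fit by linear regression and inverted to report a viral load. The Ct values and dilution levels below are illustrative placeholders, not data from this study.

    ```python
    import numpy as np

    # Illustrative standard curve: Ct vs. log10(copies/ul) for serial dilutions.
    copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2, 1e1])
    ct = np.array([16.1, 19.5, 22.8, 26.2, 29.6, 33.0])

    # Linear calibration: Ct = slope * log10(copies) + intercept.
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)

    # Amplification efficiency follows from the slope; an ideal assay has
    # slope ~ -3.32 (100% efficiency, i.e. perfect doubling per cycle).
    efficiency = 10 ** (-1.0 / slope) - 1.0

    def viral_load(sample_ct):
        """Invert the calibration curve to estimate copies/ul for a sample."""
        return 10 ** ((sample_ct - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"Ct 31.5 -> {viral_load(31.5):.0f} copies/ul")
    ```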

  3. High performance thin layer chromatography (HPTLC) and high performance liquid chromatography (HPLC) for the qualitative and quantitative analysis of Calendula officinalis-advantages and limitations.

    PubMed

    Loescher, Christine M; Morton, David W; Razic, Slavica; Agatonovic-Kustrin, Snezana

    2014-09-01

    Chromatography techniques such as HPTLC and HPLC are commonly used to produce a chemical fingerprint of a plant to allow identification and to quantify the main constituents within the plant. The aims of this study were to compare HPTLC and HPLC for qualitative and quantitative analysis of the major constituents of Calendula officinalis and to investigate the effect of different extraction techniques on the C. officinalis extract composition from different parts of the plant. The results found HPTLC to be effective for qualitative analysis; however, HPLC was found to be more accurate for quantitative analysis. A combination of the two methods may be useful in a quality control setting as it would allow rapid qualitative analysis of herbal material while maintaining accurate quantification of extract composition. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. METHODS TO CLASSIFY ENVIRONMENTAL SAMPLES BASED ON MOLD ANALYSES BY QPCR

    EPA Science Inventory

    Quantitative PCR (QPCR) analysis of molds in indoor environmental samples produces highly accurate speciation and enumeration data. In a number of studies, eighty of the most common or potentially problematic indoor molds were identified and quantified in dust samples from homes...

  5. A comparison of 3D poly(ε-caprolactone) tissue engineering scaffolds produced with conventional and additive manufacturing techniques by means of quantitative analysis of SR μ-CT images

    NASA Astrophysics Data System (ADS)

    Brun, F.; Intranuovo, F.; Mohammadi, S.; Domingos, M.; Favia, P.; Tromba, G.

    2013-07-01

    The technique used to produce a 3D tissue engineering (TE) scaffold is of fundamental importance in guaranteeing its proper morphological characteristics. An accurate assessment of the resulting structural properties is therefore crucial in evaluating the effectiveness of the produced scaffold. Synchrotron radiation (SR) computed microtomography (μ-CT) combined with further image analysis appears to be one of the most effective techniques for this purpose. However, a quantitative assessment of the morphological parameters directly from the reconstructed images is a nontrivial task. This study considers two different poly(ε-caprolactone) (PCL) scaffolds fabricated with a conventional technique (Solvent Casting Particulate Leaching, SCPL) and an additive manufacturing (AM) technique (BioCell Printing), respectively. The first technique produces scaffolds with random, non-regular, rounded pore geometry; the AM technique instead produces scaffolds with square-shaped interconnected pores of regular dimension. Therefore, the final morphology of the AM scaffolds can be predicted, and the resulting model can be used to validate the applied imaging and image analysis protocols. We report an SR μ-CT image analysis approach that effectively and accurately reveals the differences in the pore- and throat-size distributions, as well as the connectivity, of both AM and SCPL scaffolds.

  6. Quantitative prediction of drug side effects based on drug-related features.

    PubMed

    Niu, Yanqing; Zhang, Wen

    2017-09-01

    Unexpected side effects of drugs are a great concern in drug development, and the identification of side effects is an important task. Recently, machine learning methods have been proposed to predict the presence or absence of side effects of interest for given drugs, but it is difficult to make accurate predictions for all of them. In this paper, we transform the side effect profiles of drugs into quantitative scores by summing up their side effects with weights. The quantitative scores may measure the dangers of drugs and thus help to compare the risk of different drugs. Here, we attempt to predict the quantitative scores of drugs, namely the quantitative prediction. Specifically, we explore a variety of drug-related features and evaluate their discriminative power for the quantitative prediction. Then, we consider several feature combination strategies (direct combination, average scoring ensemble combination) to integrate three informative features: chemical substructures, targets, and treatment indications. Finally, the average scoring ensemble model, which produces the better performance, is used as the final quantitative prediction model. Since weights for side effects are empirical values, we randomly generate different weights in the simulation experiments. The experimental results show that the quantitative method is robust to different weights and produces satisfying results. Although other state-of-the-art methods cannot make the quantitative prediction directly, their prediction results can be transformed into quantitative scores. By indirect comparison, the proposed method produces much better results than benchmark methods in the quantitative prediction. In conclusion, the proposed method is promising for the quantitative prediction of side effects and may work cooperatively with existing state-of-the-art methods to reveal the dangers of drugs.
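
    The scoring scheme is easy to state in code. The sketch below, with made-up side-effect profiles, weights, and model outputs, shows the weighted-sum quantitative score and the average-scoring ensemble described in the abstract; none of the numbers come from the paper.

    ```python
    import numpy as np

    # Toy data: binary side-effect profiles for 4 drugs over 5 side effects,
    # with illustrative weights (empirical in the paper, randomized in
    # their simulation experiments).
    profiles = np.array([[1, 0, 1, 0, 0],
                         [0, 1, 1, 1, 0],
                         [1, 1, 0, 0, 1],
                         [0, 0, 0, 1, 0]])
    weights = np.array([0.9, 0.5, 0.2, 0.7, 0.4])

    # Quantitative score of each drug: weighted sum of its side effects.
    true_scores = profiles @ weights

    # Average-scoring ensemble: mean of scores predicted by separate
    # feature-based models (values below are stand-ins for model outputs
    # from substructure, target, and indication features).
    pred_substructure = np.array([1.0, 1.3, 1.7, 0.6])
    pred_target       = np.array([1.2, 1.5, 1.5, 0.8])
    pred_indication   = np.array([1.1, 1.4, 1.6, 0.7])
    ensemble = np.mean([pred_substructure, pred_target, pred_indication], axis=0)
    print(true_scores, ensemble)
    ```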

  7. Global Quantitative Modeling of Chromatin Factor Interactions

    PubMed Central

    Zhou, Jian; Troyanskaya, Olga G.

    2014-01-01

    Chromatin is the driver of gene regulation, yet understanding the molecular interactions underlying chromatin factor combinatorial patterns (or the “chromatin codes”) remains a fundamental challenge in chromatin biology. Here we developed a global modeling framework that leverages chromatin profiling data to produce a systems-level view of the macromolecular complex of chromatin. Our model utilizes maximum entropy modeling with regularization-based structure learning to statistically dissect dependencies between chromatin factors and produce an accurate probability distribution of the chromatin code. Our unsupervised quantitative model, trained on genome-wide chromatin profiles of 73 histone marks and chromatin proteins from modENCODE, enabled various data-driven inferences about chromatin profiles and interactions. We provided a highly accurate predictor of chromatin factor pairwise interactions validated by known experimental evidence, and for the first time enabled higher-order interaction prediction. Our predictions can thus help guide future experimental studies. The model can also serve as an inference engine for predicting unknown chromatin profiles — we demonstrated that with this approach we can leverage data from well-characterized cell types to help understand less-studied cell types or conditions. PMID:24675896

  8. Reducing misfocus-related motion artefacts in laser speckle contrast imaging.

    PubMed

    Ringuette, Dene; Sigal, Iliya; Gad, Raanan; Levi, Ofer

    2015-01-01

    Laser Speckle Contrast Imaging (LSCI) is a flexible, easy-to-implement technique for measuring blood flow speeds in vivo. In order to obtain reliable quantitative data from LSCI, the object must remain in the focal plane of the imaging system for the duration of the measurement session. However, since LSCI suffers from inherent frame-to-frame noise, it often requires a moving average filter to produce quantitative results. This frame-to-frame noise also makes the implementation of a rapid autofocus system challenging. In this work, we demonstrate an autofocus method and system based on a novel measure of misfocus which serves as an accurate and noise-robust feedback mechanism. This measure of misfocus is shown to enable localization of best focus with sub-depth-of-field sensitivity, yielding more accurate estimates of blood flow speeds and blood vessel diameters.
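
    For readers unfamiliar with LSCI, the quantity underlying such measurements is the local speckle contrast K = σ/μ over a small sliding window; under common assumptions, 1/K² rises monotonically with flow speed and so serves as a relative flow index. A minimal NumPy sketch, using a synthetic frame and an illustrative window size:

    ```python
    import numpy as np

    def speckle_contrast(img, w=7):
        """Spatial speckle contrast K = sigma/mean in a w x w sliding window."""
        pad = w // 2
        padded = np.pad(img.astype(float), pad, mode='reflect')
        # Stack of w x w windows, one per pixel; reduce over the window axes.
        windows = np.lib.stride_tricks.sliding_window_view(padded, (w, w))
        mean = windows.mean(axis=(-2, -1))
        std = windows.std(axis=(-2, -1))
        return std / (mean + 1e-12)

    frame = np.random.rand(64, 64)       # stand-in for a raw speckle frame
    K = speckle_contrast(frame)
    flow_index = 1.0 / (K ** 2 + 1e-12)  # relative flow-speed index
    ```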

  9. Automated tumor volumetry using computer-aided image segmentation.

    PubMed

    Gaonkar, Bilwaj; Macyszyn, Luke; Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A; Ali, Zarina S; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M; Davatzikos, Christos

    2015-05-01

    Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0-5 rating scale where 5 indicated perfect segmentation. The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
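
    The segmentation here is built on the geodesic distance transform. As a rough, self-contained stand-in for that idea (not the authors' implementation), the sketch below computes a Dijkstra-based geodesic distance from user-placed seeds on a 2D image, where stepping across a strong intensity change costs more; thresholding the resulting map yields a segmentation.

    ```python
    import heapq
    import numpy as np

    def geodesic_distance(img, seeds, beta=1.0):
        """Dijkstra-based geodesic distance from seed pixels; intensity
        differences along a path raise its cost, so the distance stays low
        inside homogeneous regions and jumps at boundaries."""
        h, w = img.shape
        dist = np.full((h, w), np.inf)
        heap = []
        for (r, c) in seeds:
            dist[r, c] = 0.0
            heapq.heappush(heap, (0.0, r, c))
        while heap:
            d, r, c = heapq.heappop(heap)
            if d > dist[r, c]:
                continue  # stale heap entry
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w:
                    step = 1.0 + beta * abs(float(img[nr, nc]) - float(img[r, c]))
                    nd = d + step
                    if nd < dist[nr, nc]:
                        dist[nr, nc] = nd
                        heapq.heappush(heap, (nd, nr, nc))
        return dist

    # Segment by thresholding the map, e.g. region = dist < tau.
    img = np.random.rand(32, 32)
    dist = geodesic_distance(img, seeds=[(16, 16)])
    ```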

  10. Automated Tumor Volumetry Using Computer-Aided Image Segmentation

    PubMed Central

    Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos

    2015-01-01

    Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633

  11. Effects of spatial coherence in diffraction phase microscopy.

    PubMed

    Edwards, Chris; Bhaduri, Basanta; Nguyen, Tan; Griffin, Benjamin G; Pham, Hoa; Kim, Taewoo; Popescu, Gabriel; Goddard, Lynford L

    2014-03-10

    Quantitative phase imaging systems using white light illumination can exhibit lower noise figures than laser-based systems. However, they can also suffer from object-dependent artifacts, such as halos, which prevent accurate reconstruction of the surface topography. In this work, we show that white light diffraction phase microscopy using a standard halogen lamp can produce accurate height maps of even the most challenging structures provided that there is proper spatial filtering at: 1) the condenser to ensure adequate spatial coherence and 2) the output Fourier plane to produce a uniform reference beam. We explain that these object-dependent artifacts are a high-pass filtering phenomenon, establish design guidelines to reduce the artifacts, and then apply these guidelines to eliminate the halo effect. Since a spatially incoherent source requires significant spatial filtering, the irradiance is lower and proportionally longer exposure times are needed. To circumvent this tradeoff, we demonstrate that a supercontinuum laser, due to its high radiance, can provide accurate measurements with reduced exposure times, allowing for fast dynamic measurements.

  12. Infrasonic waves generated by supersonic auroral arcs

    NASA Astrophysics Data System (ADS)

    Pasko, Victor P.

    2012-10-01

    A finite-difference time-domain (FDTD) model of infrasound propagation in a realistic atmosphere is used to provide quantitative interpretation of infrasonic waves produced by auroral arcs moving with supersonic speed. The Lorentz force and Joule heating are discussed in the existing literature as the primary sources producing infrasound waves in the frequency range 0.1-0.01 Hz associated with the auroral electrojet. The results are consistent with the original ideas of Swift (1973) and demonstrate that synchronization of the speed of the auroral arc with the phase speed of the acoustic wave in the electrojet volume is an important condition for generating infrasonic waves with the magnitudes and frequency content observed on the ground. The reported modeling also allows accurate quantitative reproduction of previously observed complex infrasonic waveforms, including direct shocks and reflected shock waves refracted back to the earth by the thermosphere.
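
    To illustrate the FDTD machinery in its simplest form (not the paper's multi-dimensional, realistic-atmosphere model), here is a 1D staggered-grid acoustic scheme with a soft Gaussian source; the grid, medium, and source parameters are arbitrary.

    ```python
    import numpy as np

    # Minimal 1D acoustic FDTD: pressure p and particle velocity u on a
    # staggered grid, updated leapfrog in time.
    nx, nt = 400, 800
    c, rho = 340.0, 1.2            # sound speed (m/s), air density (kg/m^3)
    dx = 50.0                      # grid spacing (m)
    dt = 0.5 * dx / c              # CFL-stable time step
    p = np.zeros(nx)
    u = np.zeros(nx + 1)
    for n in range(nt):
        # velocity from the pressure gradient, then pressure from the
        # velocity divergence
        u[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])
        p -= dt * rho * c**2 / dx * (u[1:] - u[:-1])
        # soft Gaussian source at the domain centre (infrasonic time scale)
        p[nx // 2] += np.exp(-((n * dt - 5.0) / 1.5) ** 2)
    ```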

  13. Mass Spectral Investigations on Toxins. 7. Detection and Accurate Quantitation of Picogram Quantities of Macrocyclic Trichothecenes in Brazilian Plant Samples by Direct Chemical Ionization-Mass Spectrometer/Mass Spectrometer Techniques

    DTIC Science & Technology

    1987-09-01

    Trichothecenes are naturally occurring di- and triesters of unsubstituted and substituted verrucarols. The diesters are termed roridins, satratoxins, and... Satratoxins produced M⁻ ions very efficiently regardless of the nature of the CI reagent gases. The protonated molecules of satratoxins formed under these...

  14. An object tracking method based on guided filter for night fusion image

    NASA Astrophysics Data System (ADS)

    Qian, Xiaoyan; Wang, Yuedong; Han, Lei

    2016-01-01

    Online object tracking is a challenging problem, as it entails learning an effective model to account for appearance changes caused by intrinsic and extrinsic factors. In this paper, we propose a novel online object tracker with a guided image filter for accurate and robust night fusion image tracking. First, frame differencing is applied to produce a coarse target, which helps to generate the observation models. Constrained by these models and the local source image, the guided filter generates a sufficiently accurate foreground target. Accurate boundaries of the target can then be extracted from the detection results. Finally, timely updating of the observation models helps to avoid tracking drift. Both qualitative and quantitative evaluations on challenging image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
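
    The guided image filter at the core of this kind of tracker has a compact closed form (a per-window linear model q = aI + b, after He et al.). A gray-scale sketch with synthetic inputs, assuming SciPy's uniform_filter for the box means:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def guided_filter(I, p, r=8, eps=1e-3):
        """Gray-scale guided filter: output is locally a*I + b, where I is
        the guide image, p the filtering input, r the box-window radius,
        and eps the regularizer controlling edge preservation."""
        size = 2 * r + 1
        mean_I = uniform_filter(I, size)
        mean_p = uniform_filter(p, size)
        corr_Ip = uniform_filter(I * p, size)
        corr_II = uniform_filter(I * I, size)
        var_I = corr_II - mean_I * mean_I
        cov_Ip = corr_Ip - mean_I * mean_p
        a = cov_Ip / (var_I + eps)
        b = mean_p - a * mean_I
        # average the per-window coefficients before applying them
        return uniform_filter(a, size) * I + uniform_filter(b, size)

    # e.g. refine a coarse frame-difference mask with the fused image as
    # the guide, so the foreground follows true object boundaries:
    guide = np.random.rand(120, 160)   # stand-in for a fused night image
    coarse = (np.random.rand(120, 160) > 0.5).astype(float)
    refined = guided_filter(guide, coarse)
    ```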

  15. Refining Landsat classification results using digital terrain data

    USGS Publications Warehouse

    Miller, Wayne A.; Shasby, Mark

    1982-01-01

    Scientists at the U.S. Geological Survey's Earth Resources Observation Systems (EROS) Data Center have recently completed two land-cover mapping projects in which digital terrain data were used to refine Landsat classification results. Digital terrain data were incorporated into the Landsat classification process using two different procedures that required developing decision criteria either subjectively or quantitatively. The subjective procedure was used in a vegetation mapping project in Arizona, and the quantitative procedure was used in a forest-fuels mapping project in Montana. By incorporating digital terrain data into the Landsat classification process, more spatially accurate land-cover maps were produced for both projects.

  16. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    PubMed

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana Bertoni (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed enabling the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A by using the corresponding deuterated 16,17-dihydrosteviol glycosides as suitable internal standards. This quantitation not only enables the analysis of the individual steviol glycosides in foods and beverages but also can support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.

  17. Improving the geological interpretation of magnetic and gravity satellite anomalies

    NASA Technical Reports Server (NTRS)

    Hinze, William J.; Braile, Lawrence W.; Vonfrese, Ralph R. B.

    1987-01-01

    Quantitative analysis of the geologic component of observed satellite magnetic and gravity fields requires accurate isolation of the geologic component of the observations, theoretically sound and viable inversion techniques, and integration of collateral, constraining geologic and geophysical data. A number of significant contributions were made which make quantitative analysis more accurate. These include procedures for: screening and processing orbital data for lithospheric signals based on signal repeatability and wavelength analysis; producing accurate gridded anomaly values at constant elevations from the orbital data by three-dimensional least squares collocation; increasing the stability of equivalent point source inversion and criteria for the selection of the optimum damping parameter; enhancing inversion techniques through an iterative procedure based on the superposition theorem of potential fields; and modeling efficiently regional-scale lithospheric sources of satellite magnetic anomalies. In addition, these techniques were utilized to investigate regional anomaly sources of North and South America and India and to provide constraints to continental reconstruction. Since the inception of this research study, eleven papers were presented with associated published abstracts, three theses were completed, four papers were published or accepted for publication, and an additional manuscript was submitted for publication.
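
    The "damping parameter" mentioned for equivalent point source inversion plays the role of a Tikhonov regularizer in a damped least-squares step. A generic, illustrative sketch on a synthetic ill-conditioned kernel (not the authors' code or data) showing the stability/data-fit trade-off that their selection criteria address:

    ```python
    import numpy as np

    def damped_inversion(G, d, damping):
        """Damped least squares: minimize |G m - d|^2 + damping * |m|^2."""
        n = G.shape[1]
        return np.linalg.solve(G.T @ G + damping * np.eye(n), G.T @ d)

    rng = np.random.default_rng(0)
    G = rng.normal(size=(100, 40))
    G[:, 1] = G[:, 0] + 1e-6          # near-collinear columns: ill-posed
    m_true = rng.normal(size=40)
    d = G @ m_true + 0.01 * rng.normal(size=100)

    # Larger damping trades data fit for a stabler, smaller-norm model.
    for lam in (1e-8, 1e-2, 1.0):
        m = damped_inversion(G, d, lam)
        print(lam, np.linalg.norm(G @ m - d), np.linalg.norm(m))
    ```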

  18. A preliminary ferritic-martensitic stainless steel constitution diagram

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balmforth, M.C.; Lippold, J.C.

    1998-01-01

    This paper describes preliminary research to develop a constitution diagram that will more accurately predict the microstructure of ferritic and martensitic stainless steel weld deposits. A button melting technique was used to produce a wide range of compositions using mixtures of conventional ferritic and martensitic stainless steels, including types 403, 409, 410, 430, 439 and 444. These samples were prepared metallographically, and the vol-% ferrite and martensite was determined quantitatively. In addition, the hardness and ferrite number (FN) were measured. Using these data, a preliminary constitution diagram is proposed that provides a more accurate method for predicting the microstructures of arc welds in ferritic and martensitic stainless steels.

  19. Spontaneous polyploidization in cucumber.

    PubMed

    Ramírez-Madera, Axel O; Miller, Nathan D; Spalding, Edgar P; Weng, Yiqun; Havey, Michael J

    2017-07-01

    This is the first quantitative estimation of spontaneous polyploidy in cucumber and we detected 2.2% polyploids in a greenhouse study. We provide evidence that polyploidization is consistent with endoreduplication and is an on-going process during plant growth. Cucumber occasionally produces polyploid plants, which are problematic for growers because these plants produce misshaped fruits with non-viable seeds. In this study, we undertook the first quantitative study to estimate the relative frequency of spontaneous polyploids in cucumber. Seeds of recombinant inbred lines were produced in different environments, plants were grown in the field and greenhouse, and flow cytometry was used to establish ploidies. From 1422 greenhouse-grown plants, the overall relative frequency of spontaneous polyploidy was 2.2%. Plants possessed nuclei of different ploidies in the same leaves (mosaic) and on different parts of the same plant (chimeric). Our results provide evidence of endoreduplication and polysomaty in cucumber, and that it is an on-going and dynamic process. There was a significant effect (p = 0.018) of seed production environment on the occurrence of polyploid plants. Seed and seedling traits were not accurate predictors of eventual polyploids, and we recommend that cucumber producers rogue plants based on stature and leaf serration to remove potential polyploids.

  20. Impact of thermal atomic displacements on the Curie temperature of 3d transition metals

    NASA Astrophysics Data System (ADS)

    Ruban, A. V.; Peil, O. E.

    2018-05-01

    It is demonstrated that thermally induced atomic displacements from ideal lattice positions can produce a considerable effect on magnetic exchange interactions and, consequently, on the Curie temperature of Fe. Thermal lattice distortion should, therefore, be accounted for in quantitatively accurate theoretical modeling of the magnetic phase transition. At the same time, this effect appears to be much less important for the magnetic exchange interactions and Curie temperatures of Co and Ni.

  1. Intramolecular Isotopic Studies: Chemical Enhancements and Alternatives

    NASA Astrophysics Data System (ADS)

    Hayes, J. M.

    2016-12-01

    As mass spectroscopic and NMR-based methods now appropriately flourish, chemical techniques should not be forgotten. First, the methods developed by pioneering intramolecular analysts can be reapplied to new samples. Second, they can be extended. The synthesis of intramolecular isotopic standards is particularly important and straightforward. It requires only that a chemical reaction has no secondary products. An example is provided by the addition of carbon dioxide to a Grignard reagent. The reaction proceeds with an isotope effect. The isotopic composition of the carboxyl group in the acid which is produced is thus not equal to that of the starting carbon dioxide but the unconsumed CO2 can be recovered and analyzed. A simple titration can show that all the rest of the CO2 is in the product acid. The isotopic composition of the carboxyl group can then be calculated by difference. The product is an intramolecular isotopic standard, an organic molecule in which the isotopic composition of a specific carbon position is known accurately. Both analysts and reviewers can thus gain invaluable confidence in the accuracy of instrumental results. A second example: the haloform reaction quantitatively degrades methyl ketones, producing a carboxylic acid which can be decarboxylated to determine the isotopic composition of the parent carbonyl and a haloform (CHI3, for example) that can be combusted to determine the isotopic composition of the methyl group. Ketones thus analyzed can be combined with Grignard reagents to yield carbon skeletons in which the isotopic compositions of internal and terminal -CH2- and -CH3 groups are known accurately. In general, analysts accustomed to demanding quantitative reactions should remember the power of mass balances and recognize that many organic-chemical reactions, while not quantitative, lack side products and can be driven to the total consumption of at least one reactant.
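
    The by-difference calculation in the Grignard carboxylation example is a one-line isotope mass balance. A sketch with made-up δ¹³C values and a hypothetical 20% unconsumed CO2 fraction (the fraction would come from the titration described above):

    ```python
    # Mass balance: delta_start = f_res * delta_res + (1 - f_res) * delta_cooh
    delta_start = -35.0     # d13C of the starting CO2 (permil), illustrative
    f_res = 0.20            # fraction of CO2 left unconsumed (from titration)
    delta_res = -30.0       # d13C measured on the recovered CO2, illustrative

    # Solve for the carboxyl-group composition by difference.
    delta_cooh = (delta_start - f_res * delta_res) / (1.0 - f_res)
    print(f"carboxyl d13C = {delta_cooh:.2f} permil")   # -> -36.25
    ```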

  2. Phase processing for quantitative susceptibility mapping of regions with large susceptibility and lack of signal.

    PubMed

    Fortier, Véronique; Levesque, Ives R

    2018-06-01

    Phase processing impacts the accuracy of quantitative susceptibility mapping (QSM). Techniques for phase unwrapping and background removal have been proposed and demonstrated mostly in brain. In this work, phase processing was evaluated in the context of large susceptibility variations (Δχ) and negligible signal, in particular for susceptibility estimation using the iterative phase replacement (IPR) algorithm. Continuous Laplacian, region-growing, and quality-guided unwrapping were evaluated. For background removal, Laplacian boundary value (LBV), projection onto dipole fields (PDF), sophisticated harmonic artifact reduction for phase data (SHARP), variable-kernel sophisticated harmonic artifact reduction for phase data (V-SHARP), regularization enabled sophisticated harmonic artifact reduction for phase data (RESHARP), and 3D quadratic polynomial field removal were studied. Each algorithm was quantitatively evaluated in simulation and qualitatively in vivo. Additionally, IPR-QSM maps were produced to evaluate the impact of phase processing on the susceptibility in the context of large Δχ with negligible signal. Quality-guided unwrapping was the most accurate technique, whereas continuous Laplacian performed poorly in this context. All background removal algorithms tested resulted in important phase inaccuracies, suggesting that techniques used for brain do not translate well to situations where large Δχ and no or low signal are expected. LBV produced the smallest errors, followed closely by PDF. Results suggest that quality-guided unwrapping should be preferred, with PDF or LBV for background removal, for QSM in regions with large Δχ and negligible signal. This reduces the susceptibility inaccuracy introduced by phase processing. Accurate background removal remains an open question. Magn Reson Med 79:3103-3113, 2017. © 2017 International Society for Magnetic Resonance in Medicine.

  3. Molecular quenching and relaxation in a plasmonic tunable system

    NASA Astrophysics Data System (ADS)

    Baffou, Guillaume; Girard, Christian; Dujardin, Erik; Colas Des Francs, Gérard; Martin, Olivier J. F.

    2008-03-01

    Molecular fluorescence decay is significantly modified when the emitting molecule is located near a plasmonic structure. When the lateral sizes of such structures are reduced to nanometer-scale cross sections, they can be used to accurately control and amplify the emission rate. In this Rapid Communication, we extend Green’s dyadic method to quantitatively investigate both radiative and nonradiative decay channels experienced by a single fluorescent molecule confined in an adjustable dielectric-metal nanogap. The technique produces data in excellent agreement with current experimental work.

  4. Study of correlations from Ab-Initio Simulations of Liquid Water

    NASA Astrophysics Data System (ADS)

    Soto, Adrian; Fernandez-Serra, Marivi; Lu, Deyu; Yoo, Shinjae

    An accurate understanding of the dynamics and the structure of H2O molecules in the liquid phase is of extreme importance both from a fundamental and from a practical standpoint. Despite the successes of Molecular Dynamics (MD) with Density Functional Theory (DFT), liquid water remains an extremely difficult material to simulate accurately and efficiently because of the fine balance between the covalent O-H bond, the hydrogen bond, and the attractive van der Waals forces. Small errors in any of these produce dramatic changes in the macroscopic properties of the liquid or in its structural properties. Different density functionals produce answers that differ by as much as 35% in ambient conditions, with none producing quantitative results in agreement with experiment at different mass densities. In order to understand these differences we perform an exhaustive scanning of the geometrical coordinates of MD simulations and study their statistical correlations with the simulation output quantities using advanced correlation analyses and machine learning techniques. This work was partially supported by DOE Award No. DE-FG02-09ER16052, by DOE Early Career Award No. DE-SC0003871, by BNL LDRD 16-039 project and BNL Contract No. DE-SC0012704.

  5. Study of correlations from Ab-Initio Simulations of Liquid Water

    NASA Astrophysics Data System (ADS)

    Soto, Adrian; Fernandez-Serra, Marivi; Lu, Deyu; Yoo, Shinjae

    An accurate understanding of the dynamics and the structure of H2O molecules in the liquid phase is of extreme importance both from a fundamental and from a practical standpoint. Despite the successes of Molecular Dynamics (MD) with Density Functional Theory (DFT), liquid water remains an extremely difficult material to simulate accurately and efficiently because of the fine balance between the covalent O-H bond, the hydrogen bond, and the attractive van der Waals forces. Small errors in any of these produce dramatic changes in the macroscopic properties of the liquid or in its structural properties. Different density functionals produce answers that differ by as much as 35% in ambient conditions, with none producing quantitative results in agreement with experiment at different mass densities [J. Chem. Phys. 139, 194502 (2013)]. In order to understand these differences we perform an exhaustive scanning of the geometrical coordinates of MD simulations and study their statistical correlations with the simulation output quantities using advanced correlation analyses and machine learning techniques. This work was partially supported by DOE Award No. DE-FG02-09ER16052, by DOE Early Career Award No. DE-SC0003871, by BNL LDRD 16-039 project and BNL Contract No. DE-SC0012704.

  6. CMEIAS color segmentation: an improved computing technology to process color images for quantitative microbial ecology studies at single-cell resolution.

    PubMed

    Gross, Colin A; Reddy, Chandan K; Dazzo, Frank B

    2010-02-01

    Quantitative microscopy and digital image analysis are underutilized in microbial ecology largely because of the laborious task of segmenting foreground object pixels from background, especially in complex color micrographs of environmental samples. In this paper, we describe an improved computing technology developed to alleviate this limitation. The system's uniqueness is its ability to edit digital images accurately when presented with the difficult yet commonplace challenge of removing background pixels whose three-dimensional color space overlaps the range that defines foreground objects. Image segmentation is accomplished by utilizing algorithms that address color and spatial relationships of user-selected foreground object pixels. Performance of the color segmentation algorithm evaluated on 26 complex micrographs at single pixel resolution had an overall pixel classification accuracy of 99+%. Several applications illustrate how this improved computing technology can successfully resolve numerous challenges of complex color segmentation in order to produce images from which quantitative information can be accurately extracted, thereby gaining new perspectives on the in situ ecology of microorganisms. Examples include improvements in the quantitative analysis of (1) microbial abundance and phylotype diversity of single cells classified by their discriminating color within heterogeneous communities, (2) cell viability, (3) spatial relationships and intensity of bacterial gene expression involved in cellular communication between individual cells within rhizoplane biofilms, and (4) biofilm ecophysiology based on ribotype-differentiated radioactive substrate utilization. The stand-alone executable file plus user manual and tutorial images for this color segmentation computing application are freely available at http://cme.msu.edu/cmeias/ . This improved computing technology opens new opportunities of imaging applications where discriminating colors really matter most, thereby strengthening quantitative microscopy-based approaches to advance microbial ecology in situ at individual single-cell resolution.

  7. A general method for bead-enhanced quantitation by flow cytometry

    PubMed Central

    Montes, Martin; Jaensson, Elin A.; Orozco, Aaron F.; Lewis, Dorothy E.; Corry, David B.

    2009-01-01

    Flow cytometry provides accurate relative cellular quantitation (percent abundance) of cells from diverse samples, but technical limitations of most flow cytometers preclude accurate absolute quantitation. Several quantitation standards are now commercially available which, when added to samples, permit absolute quantitation of CD4+ T cells. However, these reagents are limited by their cost, technical complexity, requirement for additional software and/or limited applicability. Moreover, few studies have validated the use of such reagents in complex biological samples, especially for quantitation of non-T cells. Here we show that addition to samples of known quantities of polystyrene fluorescence standardization beads permits accurate quantitation of CD4+ T cells from complex cell samples. This procedure, here termed single bead-enhanced cytofluorimetry (SBEC), was equally capable of enumerating eosinophils as well as subcellular fragments of apoptotic cells, moieties with very different optical and fluorescent characteristics. Relative to other proprietary products, SBEC is simple, inexpensive and requires no special software, suggesting that the method is suitable for the routine quantitation of most cells and other particles by flow cytometry. PMID:17067632
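
    The arithmetic behind bead-enhanced quantitation is simple ratio scaling: with a known number of beads spiked into the tube, absolute cell counts follow from the ratio of cell events to bead events. A sketch with illustrative event counts, not data from the paper:

    ```python
    # Single bead-enhanced cytofluorimetry (SBEC) core calculation.
    beads_added = 50_000        # beads spiked into the sample tube (known)
    bead_events = 12_400        # bead events acquired by the cytometer
    cell_events = 8_730         # CD4+ T cell events acquired
    sample_volume_ul = 100.0    # sample volume (ul)

    # The acquired fraction of beads equals the acquired fraction of cells,
    # so scale cell events by the bead recovery ratio.
    cells_per_tube = cell_events * beads_added / bead_events
    print(f"{cells_per_tube / sample_volume_ul:.0f} CD4+ cells/ul")
    ```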

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr; Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it; Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.

  9. Quantitative 3D reconstruction of airway and pulmonary vascular trees using HRCT

    NASA Astrophysics Data System (ADS)

    Wood, Susan A.; Hoford, John D.; Hoffman, Eric A.; Zerhouni, Elias A.; Mitzner, Wayne A.

    1993-07-01

    Accurate quantitative measurements of airway and vascular dimensions are essential to evaluate function in the normal and diseased lung. In this report, a novel method is described for three-dimensional extraction and analysis of pulmonary tree structures using data from High Resolution Computed Tomography (HRCT). Serially scanned two-dimensional slices of the lower left lobe of isolated dog lungs were stacked to create a volume of data. Airway and vascular trees were three-dimensionally extracted using a three-dimensional seeded region growing algorithm based on the difference in CT number between wall and lumen. To obtain quantitative data, we reduced each tree to its central axis. From the central axis, branch length is measured as the distance between two successive branch points, branch angle is measured as the angle produced by two daughter branches, and cross sectional area is measured from a plane perpendicular to the central axis point. Data derived from these methods can be used to localize and quantify structural differences both during changing physiologic conditions and in pathologic lungs.
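
    As a minimal sketch of the 3D seeded region-growing step (6-connected neighbourhood, CT-number acceptance window; the thresholds and the random volume below are synthetic stand-ins for real HRCT data):

    ```python
    from collections import deque
    import numpy as np

    def region_grow_3d(vol, seed, lo, hi):
        """Grow a 6-connected region from `seed`, accepting voxels whose
        CT number lies in [lo, hi] (e.g. air-filled lumen vs. wall)."""
        mask = np.zeros(vol.shape, dtype=bool)
        queue = deque([seed])
        mask[seed] = True
        while queue:
            z, y, x = queue.popleft()
            for dz, dy, dx in ((1, 0, 0), (-1, 0, 0), (0, 1, 0),
                               (0, -1, 0), (0, 0, 1), (0, 0, -1)):
                n = (z + dz, y + dy, x + dx)
                if (all(0 <= n[i] < vol.shape[i] for i in range(3))
                        and not mask[n] and lo <= vol[n] <= hi):
                    mask[n] = True
                    queue.append(n)
        return mask

    vol = np.random.randint(-1100, 200, size=(40, 64, 64))  # stand-in volume
    lumen = region_grow_3d(vol, seed=(20, 32, 32), lo=-1100, hi=-900)
    ```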

  10. Quantitative analysis of Al-Si alloy using calibration free laser induced breakdown spectroscopy (CF-LIBS)

    NASA Astrophysics Data System (ADS)

    Shakeel, Hira; Haq, S. U.; Aisha, Ghulam; Nadeem, Ali

    2017-06-01

    The quantitative analysis of the standard aluminum-silicon alloy has been performed using calibration free laser induced breakdown spectroscopy (CF-LIBS). The plasma was produced using the fundamental harmonic (1064 nm) of the Nd:YAG laser and the emission spectra were recorded at 3.5 μs detector gate delay. The qualitative analysis of the emission spectra confirms the presence of Mg, Al, Si, Ti, Mn, Fe, Ni, Cu, Zn, Sn, and Pb in the alloy. The background-subtracted and self-absorption-corrected emission spectra were used to estimate the plasma temperature as 10 100 ± 300 K. The plasma temperature and the self-absorption-corrected emission lines of each element were then used to determine the concentration of each species present in the alloy. The use of corrected emission intensities and an accurate evaluation of the plasma temperature yield a reliable quantitative analysis, with a maximum deviation of 2.2% from the reference sample concentrations.
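
    CF-LIBS obtains the plasma temperature from a Boltzmann plot: ln(Iλ/gA) versus the upper-level energy E is linear with slope -1/(kB·T). A sketch with placeholder line data (energies, intensities, wavelengths, and gA values are invented, chosen to land near 10⁴ K):

    ```python
    import numpy as np

    kB = 8.617e-5                                   # Boltzmann constant (eV/K)
    E_upper = np.array([3.14, 4.02, 4.83, 5.61])    # upper-level energies (eV)
    I = np.array([1190.0, 450.0, 168.0, 64.0])      # corrected line intensities
    lam = np.array([396.2, 394.4, 404.1, 408.4])    # wavelengths (nm)
    gA = np.array([4.9e7, 5.1e7, 5.0e7, 4.8e7])     # degeneracy * A (s^-1)

    # Boltzmann plot: y = ln(I*lambda/(g*A)) is linear in E with slope -1/(kB*T).
    y = np.log(I * lam / gA)
    slope, _ = np.polyfit(E_upper, y, 1)
    T = -1.0 / (kB * slope)
    print(f"T = {T:.0f} K")   # ~1.0e4 K with these placeholder values
    ```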

  11. Quantitative nuclear magnetic resonance imaging: characterisation of experimental cerebral oedema.

    PubMed Central

    Barnes, D; McDonald, W I; Johnson, G; Tofts, P S; Landon, D N

    1987-01-01

    Magnetic resonance imaging (MRI) has been used quantitatively to define the characteristics of two different models of experimental cerebral oedema in cats: vasogenic oedema produced by cortical freezing and cytotoxic oedema induced by triethyl tin. The MRI results have been correlated with the ultrastructural changes. The images accurately delineated the anatomical extent of the oedema in the two lesions, but did not otherwise discriminate between them. The patterns of measured increase in T1' and T2' were, however, characteristic for each type of oedema, and reflected the protein content. The magnetisation decay characteristics of both normal and oedematous white matter were monoexponential for T1 but biexponential for T2 decay. The relative sizes of the two component exponentials of the latter corresponded with the physical sizes of the major tissue water compartments. Quantitative MRI data can provide reliable information about the physico-chemical environment of tissue water in normal and oedematous cerebral tissue, and are useful for distinguishing between acute and chronic lesions in multiple sclerosis. PMID:3572428
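
    The biexponential T2 behaviour reported here is what a two-compartment fit extracts: two T2 values plus relative amplitudes that track the sizes of the water pools. A sketch using SciPy's curve_fit on synthetic decay data (amplitudes, T2 values, and echo times are illustrative):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def biexp(te, a1, t2_1, a2, t2_2):
        """Two-compartment T2 decay model."""
        return a1 * np.exp(-te / t2_1) + a2 * np.exp(-te / t2_2)

    te = np.linspace(10, 320, 32)                    # echo times (ms)
    signal = biexp(te, 0.3, 20.0, 0.7, 90.0)         # synthetic "tissue"
    signal += 0.005 * np.random.default_rng(1).normal(size=te.size)

    params, _ = curve_fit(biexp, te, signal, p0=(0.5, 15.0, 0.5, 80.0),
                          maxfev=5000)
    a1, t2_1, a2, t2_2 = params
    # compartment fractions ~ relative amplitudes of the two exponentials
    print(f"fast pool {a1/(a1+a2):.0%} (T2={t2_1:.0f} ms), "
          f"slow pool {a2/(a1+a2):.0%} (T2={t2_2:.0f} ms)")
    ```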

  12. Geochemical variations of rare earth elements in Marcellus shale flowback waters and multiple-source cores in the Appalachian Basin

    NASA Astrophysics Data System (ADS)

    Noack, C.; Jain, J.; Hakala, A.; Schroeder, K.; Dzombak, D. A.; Karamalidis, A.

    2013-12-01

    Rare earth elements (REE) - encompassing the naturally occurring lanthanides, yttrium, and scandium - are potential tracers for subsurface groundwater-brine flows and geochemical processes. Application of these elements as naturally occurring tracers during shale gas development is reliant on accurate quantitation of trace metals in hypersaline brines. We have modified and validated a liquid-liquid technique for extraction and pre-concentration of REE from saline produced waters from shale gas extraction wells with quantitative analysis by ICP-MS. This method was used to analyze time-series samples of Marcellus shale flowback and produced waters. Additionally, the total REE content of core samples of various strata throughout the Appalachian Basin were determined using HF/HNO3 digestion and ICP-MS analysis. A primary goal of the study is to elucidate systematic geochemical variations as a function of location or shale characteristics. Statistical testing will be performed to study temporal variability of inter-element relationships and explore associations between REE abundance and major solution chemistry. The results of these analyses and discussion of their significance will be presented.

  13. Comparison of Real-Time PCR, Reverse Transcriptase Real-Time PCR, Loop-Mediated Isothermal Amplification, and the FDA Conventional Microbiological Method for the Detection of Salmonella spp. in Produce

    PubMed Central

    Zhang, Guodong; Brown, Eric W.; González-Escalona, Narjol

    2011-01-01

    Contamination of foods, especially produce, with Salmonella spp. is a major concern for public health. Several methods are available for the detection of Salmonella in produce, but their relative efficiency for detecting Salmonella in commonly consumed vegetables, often associated with outbreaks of food poisoning, needs to be confirmed. In this study, the effectiveness of three molecular methods for detection of Salmonella in six produce matrices was evaluated and compared to the FDA microbiological detection method. Samples of cilantro (coriander leaves), lettuce, parsley, spinach, tomato, and jalapeno pepper were inoculated with Salmonella serovars at two different levels (10⁵ and <10¹ CFU/25 g of produce). The inoculated produce was assayed by the FDA Salmonella culture method (Bacteriological Analytical Manual) and by three molecular methods: quantitative real-time PCR (qPCR), quantitative reverse transcriptase real-time PCR (RT-qPCR), and loop-mediated isothermal amplification (LAMP). Comparable results were obtained by these four methods, which all detected as little as 2 CFU of Salmonella cells/25 g of produce. All control samples (not inoculated) were negative by the four methods. RT-qPCR detects only live Salmonella cells, obviating the danger of false-positive results from nonviable cells. False negatives (inhibition of either qPCR or RT-qPCR) were avoided by the use of either a DNA or an RNA amplification internal control (IAC). Compared to the conventional culture method, the qPCR, RT-qPCR, and LAMP assays allowed faster and equally accurate detection of Salmonella spp. in six high-risk produce commodities. PMID:21803916

  14. Recent Progress in the Remote Detection of Vapours and Gaseous Pollutants.

    ERIC Educational Resources Information Center

    Moffat, A. J.; And Others

    Work has been continuing on the correlation spectrometry techniques described at previous remote sensing symposiums. Advances in the techniques are described which enable accurate quantitative measurements of diffused atmospheric gases to be made using controlled light sources, accurate quantitative measurements of gas clouds relative to…

  15. Impact of TRMM and SSM/I Rainfall Assimilation on Global Analysis and QPF

    NASA Technical Reports Server (NTRS)

    Hou, Arthur; Zhang, Sara; Reale, Oreste

    2002-01-01

    Evaluation of QPF skills requires quantitatively accurate precipitation analyses. We show that assimilation of surface rain rates derived from the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager and Special Sensor Microwave/Imager (SSM/I) improves quantitative precipitation estimates (QPE) and many aspects of global analyses. Short-range forecasts initialized with analyses with satellite rainfall data generally yield significantly higher QPF threat scores and better storm track predictions. These results were obtained using a variational procedure that minimizes the difference between the observed and model rain rates by correcting the moist physics tendency of the forecast model over a 6h assimilation window. In two case studies of Hurricanes Bonnie and Floyd, synoptic analysis shows that this procedure produces initial conditions with better-defined tropical storm features and stronger precipitation intensity associated with the storm.

  16. A Workstation for Interactive Display and Quantitative Analysis of 3-D and 4-D Biomedical Images

    PubMed Central

    Robb, R.A.; Heffeman, P.B.; Camp, J.J.; Hanson, D.P.

    1986-01-01

    The capability to extract objective and quantitatively accurate information from 3-D radiographic biomedical images has not kept pace with the capabilities to produce the images themselves. This is rather an ironic paradox, since on the one hand the new 3-D and 4-D imaging capabilities promise significant potential for providing greater specificity and sensitivity (i.e., precise objective discrimination and accurate quantitative measurement of body tissue characteristics and function) in clinical diagnostic and basic investigative imaging procedures than ever possible before, but on the other hand, the momentous advances in computer and associated electronic imaging technology which have made these 3-D imaging capabilities possible have not been concomitantly developed for full exploitation of these capabilities. Therefore, we have developed a powerful new microcomputer-based system which permits detailed investigations and evaluation of 3-D and 4-D (dynamic 3-D) biomedical images. The system comprises a special workstation to which all the information in a large 3-D image data base is accessible for rapid display, manipulation, and measurement. The system provides important capabilities for simultaneously representing and analyzing both structural and functional data and their relationships in various organs of the body. This paper provides a detailed description of this system, as well as some of the rationale, background, theoretical concepts, and practical considerations related to system implementation.

  17. Improved Correction of Misclassification Bias With Bootstrap Imputation.

    PubMed

    van Walraven, Carl

    2018-07-01

    Diagnostic codes used in administrative database research can create bias due to misclassification. Quantitative bias analysis (QBA) can correct for this bias, requires only code sensitivity and specificity, but may return invalid results. Bootstrap imputation (BI) can also address misclassification bias but traditionally requires multivariate models to accurately estimate disease probability. This study compared misclassification bias correction using QBA and BI. Serum creatinine measures were used to determine severe renal failure status in 100,000 hospitalized patients. Prevalence of severe renal failure in 86 patient strata and its association with 43 covariates was determined and compared with results in which renal failure status was determined using diagnostic codes (sensitivity 71.3%, specificity 96.2%). Differences in results (misclassification bias) were then corrected with QBA or BI (using progressively more complex methods to estimate disease probability). In total, 7.4% of patients had severe renal failure. Imputing disease status with diagnostic codes exaggerated prevalence estimates [median relative change (range), 16.6% (0.8%-74.5%)] and its association with covariates [median (range) exponentiated absolute parameter estimate difference, 1.16 (1.01-2.04)]. QBA produced invalid results 9.3% of the time and increased bias in estimates of both disease prevalence and covariate associations. BI decreased misclassification bias with increasingly accurate disease probability estimates. QBA can produce invalid results and increase misclassification bias. BI avoids invalid results and can importantly decrease misclassification bias when accurate disease probability estimates are used.
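
    The QBA correction evaluated here reduces, in its simplest matrix-method form, to one formula, and its failure mode is directly visible: when the observed prevalence falls below 1 - specificity, the corrected estimate goes negative. A sketch using the abstract's sensitivity and specificity with illustrative observed prevalences:

    ```python
    # Matrix-method quantitative bias analysis (QBA) for misclassification:
    # corrected prevalence from the code-based (observed) prevalence and the
    # diagnostic code's sensitivity/specificity (values from the abstract).
    sens, spec = 0.713, 0.962

    def qba_corrected_prevalence(p_obs):
        return (p_obs + spec - 1.0) / (sens + spec - 1.0)

    for p_obs in (0.12, 0.06, 0.03):   # illustrative observed prevalences
        print(p_obs, round(qba_corrected_prevalence(p_obs), 4))
    # When p_obs < 1 - spec (here < 0.038) the formula returns a negative,
    # invalid prevalence -- the kind of failure the study reports for QBA.
    ```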

  18. The effects of AVIRIS atmospheric calibration methodology on identification and quantitative mapping of surface mineralogy, Drum Mountains, Utah

    NASA Technical Reports Server (NTRS)

    Kruse, Fred A.; Dwyer, John L.

    1993-01-01

    The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) measures reflected light in 224 contiguous spectral bands in the 0.4 to 2.45 micron region of the electromagnetic spectrum. Numerous studies have used these data for mineralogic identification and mapping based on the presence of diagnostic spectral features. Quantitative mapping requires conversion of the AVIRIS data to physical units (usually reflectance) so that analysis results can be compared and validated with field and laboratory measurements. This study evaluated two different AVIRIS calibration techniques to ground reflectance, an empirically based method and an atmospheric-model-based method, to determine their effects on quantitative scientific analyses. Expert system analysis and linear spectral unmixing were applied to both calibrated data sets to determine the effect of the calibration on the mineral identification and quantitative mapping results. Comparison of the image-map results and image reflectance spectra indicates that the model-calibrated data can be used with automated mapping techniques to produce accurate maps showing the spatial distribution and abundance of surface mineralogy. This has positive implications for future operational mapping using AVIRIS or similar imaging spectrometer data sets without requiring a priori knowledge.

  19. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high-energy photons, however, render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26%, and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
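
    For reference, the TEW scatter correction compared here estimates photopeak scatter from two narrow flanking energy windows via a trapezoid rule. A sketch with illustrative counts and window widths around the I-131 364 keV photopeak (all numbers are placeholders, not phantom data):

    ```python
    def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
        """Triple energy window estimate: scatter inside the photopeak is
        the trapezoid spanned by the count densities (counts per keV) in
        the two narrow flanking windows."""
        return (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0

    # Illustrative projection-pixel counts and window widths (keV).
    c_peak, c_low, c_up = 900.0, 120.0, 80.0
    scatter = tew_scatter(c_low, c_up, w_lower=6.0, w_upper=6.0, w_peak=72.0)
    primary = max(c_peak - scatter, 0.0)   # scatter-corrected photopeak counts
    ```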

  20. Quantitation of influenza virus using field flow fractionation and multi-angle light scattering for quantifying influenza A particles

    PubMed Central

    Bousse, Tatiana; Shore, David A.; Goldsmith, Cynthia S.; Hossain, M. Jaber; Jang, Yunho; Davis, Charles T.; Donis, Ruben O.; Stevens, James

    2017-01-01

    Summary: Recent advances in instrumentation and data analysis in field flow fractionation and multi-angle light scattering (FFF-MALS) have enabled greater use of this technique to characterize and quantitate viruses. In this study, the FFF-MALS technique was applied to the characterization and quantitation of type A influenza virus particles to assess its usefulness for vaccine preparation. The use of FFF-MALS for quantitation and measurement of control particles provided data accurate to within 5% of known values, reproducible with a coefficient of variation of 1.9%. The method's sensitivity and limit of detection were established by analyzing different volumes of purified virus, which produced a linear regression with an R² of 0.99. FFF-MALS was further applied to detect and quantitate influenza virus in the supernatant of infected MDCK cells and allantoic fluids of infected eggs. FFF fractograms of the virus present in these different fluids revealed similar distributions of monomeric and oligomeric virions. However, the monomer fraction of cell-grown virus showed greater size variety. Notably, β-propiolactone (BPL) inactivation of influenza viruses did not influence any of the FFF-MALS measurements. Quantitation analysis by FFF-MALS was compared to infectivity assays and real-time RT-PCR (qRT-PCR), and the limitations of each assay are discussed. PMID:23916678

  1. ASSAY OF POLY-β-HYDROXYBUTYRIC ACID

    PubMed Central

    Law, John H.; Slepecky, Ralph A.

    1961-01-01

    Law, John H. (Harvard University, Cambridge, Mass.) and Ralph A. Slepecky. Assay of poly-β-hydroxybutyric acid. J. Bacteriol. 82:33–36. 1961—A convenient spectrophotometric assay of bacterial poly-β-hydroxybutyric acid has been devised. Quantitative conversion of poly-β-hydroxybutyric acid to crotonic acid by heating in concentrated sulfuric acid and determination of the ultraviolet absorption of the product permits an accurate determination of this material in quantities down to 5 μg. This method has been used to follow the production of poly-β-hydroxybutyric acid by Bacillus megaterium strain KM. PMID:13759651

  2. A test of the adhesion approximation for gravitational clustering

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.; Shandarin, Sergei; Weinberg, David H.

    1993-01-01

    We quantitatively compare a particle implementation of the adhesion approximation to fully non-linear, numerical 'N-body' simulations. Our primary tool, cross-correlation of N-body simulations with the adhesion approximation, indicates good agreement, better than that found by the same test performed with the Zel'dovich approximation (hereafter ZA). However, the cross-correlation is not as good as that of the truncated Zel'dovich approximation (TZA), obtained by applying the Zel'dovich approximation after smoothing the initial density field with a Gaussian filter. We confirm that the adhesion approximation produces an excessively filamentary distribution. Relative to the N-body results, we also find that: (a) the power spectrum obtained from the adhesion approximation is more accurate than that from ZA or TZA, (b) the error in the phase angle of Fourier components is worse than that from TZA, and (c) the mass distribution function is more accurate than that from ZA or TZA. It appears that adhesion performs well statistically, but that TZA is more accurate dynamically, in the sense of moving mass to the right place.

  3. A test of the adhesion approximation for gravitational clustering

    NASA Technical Reports Server (NTRS)

    Melott, Adrian L.; Shandarin, Sergei F.; Weinberg, David H.

    1994-01-01

    We quantitatively compare a particle implementation of the adhesion approximation to fully nonlinear, numerical 'N-body' simulations. Our primary tool, cross-correlation of N-body simulations with the adhesion approximation, indicates good agreement, better than that found by the same test performed with the Zel'dovich approximation (hereafter ZA). However, the cross-correlation is not as good as that of the truncated Zel'dovich approximation (TZA), obtained by applying the Zel'dovich approximation after smoothing the initial density field with a Gaussian filter. We confirm that the adhesion approximation produces an excessively filamentary distribution. Relative to the N-body results, we also find that: (a) the power spectrum obtained from the adhesion approximation is more accurate than that from ZA or TZA, (b) the error in the phase angle of Fourier components is worse than that from TZA, and (c) the mass distribution function is more accurate than that from ZA or TZA. It appears that adhesion performs well statistically, but that TZA is more accurate dynamically, in the sense of moving mass to the right place.

  4. A quantitative reconstruction software suite for SPECT imaging

    NASA Astrophysics Data System (ADS)

    Namías, Mauro; Jeraj, Robert

    2017-11-01

    Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction from hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
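
    For orientation, here is a minimal dense-matrix sketch of the MLEM update at the core of OSEM; it omits the attenuation, scatter and collimator terms that the suite folds into the system model, and all names are illustrative.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50, eps=1e-12):
        """Plain MLEM reconstruction. A: (n_bins, n_voxels) system matrix;
        y: measured projection counts. OSEM accelerates this by applying the
        same multiplicative update over ordered subsets of projections."""
        x = np.ones(A.shape[1])              # uniform initial activity estimate
        sens = A.T @ np.ones(A.shape[0])     # sensitivity image (backprojected 1s)
        for _ in range(n_iter):
            ratio = y / np.maximum(A @ x, eps)   # measured / modelled projections
            x *= (A.T @ ratio) / np.maximum(sens, eps)
        return x
    ```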

  5. Dataglove measurement of joint angles in sign language handshapes

    PubMed Central

    Eccarius, Petra; Bour, Rebecca; Scheidt, Robert A.

    2012-01-01

    In sign language research, we understand little about articulatory factors involved in shaping phonemic boundaries or the amount (and articulatory nature) of acceptable phonetic variation between handshapes. To date, there exists no comprehensive analysis of handshape based on the quantitative measurement of joint angles during sign production. The purpose of our work is to develop a methodology for collecting and visualizing quantitative handshape data in an attempt to better understand how handshapes are produced at a phonetic level. In this pursuit, we seek to quantify the flexion and abduction angles of the finger joints using a commercial data glove (CyberGlove; Immersion Inc.). We present calibration procedures used to convert raw glove signals into joint angles. We then implement those procedures and evaluate their ability to accurately predict joint angle. Finally, we provide examples of how our recording techniques might inform current research questions. PMID:23997644

  6. Phenotyping for drought tolerance of crops in the genomics era

    PubMed Central

    Tuberosa, Roberto

    2012-01-01

    Improving crop yield under water-limited conditions is the most daunting challenge faced by breeders. To this end, accurate, relevant phenotyping plays an increasingly pivotal role in the selection of drought-resilient genotypes and, more generally, in a meaningful dissection of the quantitative genetic landscape that underlies the adaptive response of crops to drought. A major and universally recognized obstacle to a more effective translation of the results produced by drought-related studies into improved cultivars is the difficulty of phenotyping properly in a high-throughput fashion in order to identify the quantitative trait loci that govern yield and related traits across different water regimes. This review provides basic principles and a broad set of references useful for the management of phenotyping practices for the study and genetic dissection of drought tolerance and, ultimately, for the release of drought-tolerant cultivars. PMID:23049510

  7. A quantitative method to measure biofilm removal efficiency from complex biomaterial surfaces using SEM and image analysis

    NASA Astrophysics Data System (ADS)

    Vyas, N.; Sammons, R. L.; Addison, O.; Dehghani, H.; Walmsley, A. D.

    2016-09-01

    Biofilm accumulation on biomaterial surfaces is a major health concern, and significant research efforts are directed towards producing biofilm-resistant surfaces and developing biofilm removal techniques. Evaluating biofilm growth and disruption on surfaces requires methods that give quantitative information on biofilm area, as current methods are indirect and inaccurate. We demonstrate the use of machine learning algorithms to segment biofilm from scanning electron microscopy images. A case study showing disruption of biofilm from rough dental implant surfaces using cavitation bubbles from an ultrasonic scaler is used to validate the imaging and analysis protocol developed. Streptococcus mutans biofilm was disrupted from sandblasted, acid-etched (SLA) Ti discs and polished Ti discs. Significant biofilm removal occurred due to cavitation from ultrasonic scaling (p < 0.001). The mean sensitivity and specificity values for segmentation were 0.80 ± 0.18 and 0.62 ± 0.20, respectively, for the SLA surface images, and 0.74 ± 0.13 and 0.86 ± 0.09, respectively, for polished surfaces. Cavitation has potential to be used as a novel way to clean dental implants. This imaging and analysis method will be of value to other researchers and manufacturers wishing to study biofilm growth and removal.
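
    A minimal sketch of the kind of machine-learning pixel segmentation described above; the feature set and label scheme here are hypothetical, since the record does not specify the authors' exact classifier or features.

    ```python
    import numpy as np
    from skimage import filters
    from sklearn.ensemble import RandomForestClassifier

    def pixel_features(img):
        """Simple per-pixel features for a greyscale SEM image: raw intensity,
        Gaussian-smoothed intensity, and Sobel gradient magnitude."""
        feats = np.stack([img,
                          filters.gaussian(img, sigma=2),
                          filters.sobel(img)], axis=-1)
        return feats.reshape(-1, feats.shape[-1])

    def train_biofilm_segmenter(img, labels):
        """labels: per-pixel training mask (0 = unlabeled, 1 = biofilm, 2 = substrate)."""
        X, y = pixel_features(img), labels.ravel()
        return RandomForestClassifier(n_estimators=100).fit(X[y > 0], y[y > 0])

    # Usage: predict a mask for a new image; biofilm area fraction follows directly.
    # mask = clf.predict(pixel_features(new_img)).reshape(new_img.shape)
    # area_fraction = (mask == 1).mean()
    ```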

  8. Computation of mass-density images from x-ray refraction-angle images.

    PubMed

    Wernick, Miles N; Yang, Yongyi; Mondal, Indrasis; Chapman, Dean; Hasnah, Moumen; Parham, Christopher; Pisano, Etta; Zhong, Zhong

    2006-04-07

    In this paper, we investigate the possibility of computing quantitatively accurate images of mass density variations in soft tissue. This is a challenging task, because density variations in soft tissue, such as the breast, can be very subtle. Beginning from an image of refraction angle created by either diffraction-enhanced imaging (DEI) or multiple-image radiography (MIR), we estimate the mass-density image using a constrained least squares (CLS) method. The CLS algorithm yields accurate density estimates while effectively suppressing noise. Our method improves on an analytical method proposed by Hasnah et al (2005 Med. Phys. 32 549-52), which can produce significant artefacts when even a modest level of noise is present. We present a quantitative evaluation study to determine the accuracy with which mass density can be determined in the presence of noise. Based on computer simulations, we find that the mass-density estimation error can be as low as a few per cent for typical density variations found in the breast. Example images computed from less-noisy real data are also shown to illustrate the feasibility of the technique. We anticipate that density imaging may have application in assessment of water content of cartilage resulting from osteoarthritis, in evaluation of bone density, and in mammographic interpretation.
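
    A generic sketch of a regularized least-squares estimate of the type described, in Tikhonov form with a roughness penalty; the operator choices below are illustrative, not the authors' exact formulation.

    ```python
    import numpy as np

    def cls_solve(A, b, D, lam):
        """Minimize ||A x - b||^2 + lam * ||D x||^2 by stacking the penalty
        into an augmented least-squares system. D is typically a finite-
        difference (roughness) operator that suppresses noise amplification."""
        A_aug = np.vstack([A, np.sqrt(lam) * D])
        b_aug = np.concatenate([b, np.zeros(D.shape[0])])
        x, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
        return x
    ```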

  9. Speech recognition training for enhancing written language generation by a traumatic brain injury survivor.

    PubMed

    Manasse, N J; Hux, K; Rankin-Erickson, J L

    2000-11-01

    Impairments in motor functioning, language processing, and cognitive status may impact the written language performance of traumatic brain injury (TBI) survivors. One strategy to minimize the impact of these impairments is to use a speech recognition system. The purpose of this study was to explore the effect of mild dysarthria and mild cognitive-communication deficits secondary to TBI on a 19-year-old survivor's mastery and use of such a system-specifically, Dragon Naturally Speaking. Data included the % of the participant's words accurately perceived by the system over time, the participant's accuracy over time in using commands for navigation and error correction, and quantitative and qualitative changes in the participant's written texts generated with and without the use of the speech recognition system. Results showed that Dragon NaturallySpeaking was approximately 80% accurate in perceiving words spoken by the participant, and the participant quickly and easily mastered all navigation and error correction commands presented. Quantitatively, the participant produced a greater amount of text using traditional word processing and a standard keyboard than using the speech recognition system. Minimal qualitative differences appeared between writing samples. Discussion of factors that may have contributed to the obtained results and that may affect the generalization of the findings to other TBI survivors is provided.

  10. A Comparison of Multivariate and Pre-Processing Methods for Quantitative Laser-Induced Breakdown Spectroscopy of Geologic Samples

    NASA Technical Reports Server (NTRS)

    Anderson, R. B.; Morris, R. V.; Clegg, S. M.; Bell, J. F., III; Humphries, S. D.; Wiens, R. C.

    2011-01-01

    The ChemCam instrument selected for the Curiosity rover is capable of remote laser-induced breakdown spectroscopy (LIBS).[1] We used a remote LIBS instrument similar to ChemCam to analyze 197 geologic slab samples and 32 pressed-powder geostandards. The slab samples are well-characterized and have been used to validate the calibration of previous instruments on Mars missions, including CRISM [2], OMEGA [3], the MER Pancam [4], Mini-TES [5], and Moessbauer [6] instruments and the Phoenix SSI [7]. The resulting dataset was used to compare multivariate methods for quantitative LIBS and to determine the effect of grain size on calculations. Three multivariate methods - partial least squares (PLS), multilayer perceptron artificial neural networks (MLP ANNs) and cascade correlation (CC) ANNs - were used to generate models and extract the quantitative composition of unknown samples. PLS can be used to predict one element (PLS1) or multiple elements (PLS2) at a time, as can the neural network methods. Although MLP and CC ANNs were successful in some cases, PLS generally produced the most accurate and precise results.
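
    A minimal sketch of the PLS1/PLS2 distinction mentioned above, using scikit-learn; the synthetic arrays stand in for the LIBS spectra and reference compositions, and the component count is illustrative.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    X = rng.random((50, 2048))       # 50 training spectra, 2048 spectral channels
    Y = rng.random((50, 9))          # 9 major-element abundances per sample

    pls2 = PLSRegression(n_components=10).fit(X, Y)          # PLS2: all elements at once
    pls1 = PLSRegression(n_components=10).fit(X, Y[:, [0]])  # PLS1: one element at a time

    unknown = rng.random((1, 2048))
    print(pls2.predict(unknown))     # predicted composition of an unknown target
    ```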

  11. Multicenter Evaluation of a Commercial Cytomegalovirus Quantitative Standard: Effects of Commutability on Interlaboratory Concordance

    PubMed Central

    Shahbazian, M. D.; Valsamakis, A.; Boonyaratanakornkit, J.; Cook, L.; Pang, X. L.; Preiksaitis, J. K.; Schönbrunner, E. R.; Caliendo, A. M.

    2013-01-01

    Commutability of quantitative reference materials has proven important for reliable and accurate results in clinical chemistry. As international reference standards and commercially produced calibration material have become available to address the variability of viral load assays, the degree to which such materials are commutable and the effect of commutability on assay concordance have been questioned. To investigate this, 60 archived clinical plasma samples, which previously tested positive for cytomegalovirus (CMV), were retested by five different laboratories, each using a different quantitative CMV PCR assay. Results from each laboratory were calibrated both with lab-specific quantitative CMV standards (“lab standards”) and with common, commercially available standards (“CMV panel”). Pairwise analyses among laboratories were performed using mean results from each clinical sample, calibrated first with lab standards and then with the CMV panel. Commutability of the CMV panel was determined based on difference plots for each laboratory pair showing plotted values of standards that were within the 95% prediction intervals for the clinical specimens. Commutability was demonstrated for 6 of 10 laboratory pairs using the CMV panel. In half of these pairs, use of the CMV panel improved quantitative agreement compared to use of lab standards. Two of four laboratory pairs for which the CMV panel was noncommutable showed reduced quantitative agreement when that panel was used as a common calibrator. Commutability of calibration material varies across different quantitative PCR methods. Use of a common, commutable quantitative standard can improve agreement across different assays; use of a noncommutable calibrator can reduce agreement among laboratories. PMID:24025907

  12. Quantitative thickness prediction of tectonically deformed coal using Extreme Learning Machine and Principal Component Analysis: a case study

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Li, Yan; Chen, Tongjun; Yan, Qiuyan; Ma, Li

    2017-04-01

    The thickness of tectonically deformed coal (TDC) is positively correlated with gas outbursts. To predict the TDC thickness of coal beds, we propose a new quantitative prediction method using an extreme learning machine (ELM) algorithm, a principal component analysis (PCA) algorithm, and seismic attributes. First, we build an ELM prediction model using the PCA attributes of a synthetic seismic section. The results suggest that the ELM model can produce a reliable and accurate prediction of the TDC thickness for synthetic data, with a sigmoid activation function and 20 hidden nodes performing best. Then, we analyze the applicability of the ELM model to thickness prediction of the TDC with real application data. Through cross validation of near-well traces, the results suggest that the ELM model can produce a reliable and accurate prediction of the TDC. After that, we use 250 near-well traces from 10 wells to build an ELM prediction model and use it to forecast the TDC thickness of the No. 15 coal in the study area, using the PCA attributes as the inputs. Comparison of the predicted results shows that the trained ELM model with two selected PCA attributes yields better prediction results than the other combinations of attributes. Notably, the trained ELM model for real seismic data requires a different number of hidden nodes (10) than the trained ELM model for synthetic seismic data. In summary, it is feasible to use an ELM model to predict TDC thickness using the calculated PCA attributes as inputs. However, the input attributes, the activation function and the number of hidden nodes in the ELM model should be selected and tested carefully for each individual application.
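
    Because an ELM trains in closed form, a complete sketch fits in a few lines. This is a generic ELM with a sigmoid activation and 20 hidden nodes, matching the configuration preferred for the synthetic data; in this application the inputs would be the PCA seismic attributes, and all variable names are illustrative.

    ```python
    import numpy as np

    def elm_fit(X, y, n_hidden=20, seed=0):
        """Extreme Learning Machine: random input-to-hidden weights, sigmoid
        hidden layer, output weights solved by least squares (pseudoinverse)."""
        rng = np.random.default_rng(seed)
        W = rng.normal(size=(X.shape[1], n_hidden))
        b = rng.normal(size=n_hidden)
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))   # hidden-layer activations
        beta = np.linalg.pinv(H) @ y             # closed-form output weights
        return W, b, beta

    def elm_predict(X, W, b, beta):
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta
    ```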

  13. Quantitative aspects of inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Bulska, Ewa; Wagner, Barbara

    2016-10-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue 'Quantitative mass spectrometry'.

  14. The linearized multistage model and the future of quantitative risk assessment.

    PubMed

    Crump, K S

    1996-10-01

    The linearized multistage (LMS) model has for over 15 years been the default dose-response model used by the U.S. Environmental Protection Agency (USEPA) and other federal and state regulatory agencies in the United States for calculating quantitative estimates of low-dose carcinogenic risks from animal data. The LMS model is in essence a flexible statistical model that can describe both linear and non-linear dose-response patterns, and that produces an upper confidence bound on the linear low-dose slope of the dose-response curve. Unlike its namesake, the Armitage-Doll multistage model, the parameters of the LMS do not correspond to actual physiological phenomena. Thus the LMS is 'biological' only to the extent that the true biological dose response is linear at low dose and that low-dose slope is reflected in the experimental data. If the true dose response is non-linear the LMS upper bound may overestimate the true risk by many orders of magnitude. However, competing low-dose extrapolation models, including those derived from 'biologically-based models' that are capable of incorporating additional biological information, have not shown evidence to date of being able to produce quantitative estimates of low-dose risks that are any more accurate than those obtained from the LMS model. Further, even if these attempts were successful, the extent to which more accurate estimates of low-dose risks in a test animal species would translate into improved estimates of human risk is questionable. Thus, it does not appear possible at present to develop a quantitative approach that would be generally applicable and that would offer significant improvements upon the crude bounding estimates of the type provided by the LMS model. Draft USEPA guidelines for cancer risk assessment incorporate an approach similar to the LMS for carcinogens having a linear mode of action. However, under these guidelines quantitative estimates of low-dose risks would not be developed for carcinogens having a non-linear mode of action; instead dose-response modelling would be used in the experimental range to calculate an LED10* (a statistical lower bound on the dose corresponding to a 10% increase in risk), and safety factors would be applied to the LED10* to determine acceptable exposure levels for humans. This approach is very similar to the one presently used by USEPA for non-carcinogens. Rather than using one approach for carcinogens believed to have a linear mode of action and a different approach for all other health effects, it is suggested herein that it would be more appropriate to use an approach conceptually similar to the 'LED10*-safety factor' approach for all health effects, and not to routinely develop quantitative risk estimates from animal data.
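
    For readers unfamiliar with the model, the standard statement of the multistage dose-response form that the LMS procedure linearizes is (a textbook form, not reproduced from this abstract):

    ```latex
    P(d) = 1 - \exp\!\left[-\left(q_0 + q_1 d + q_2 d^2 + \cdots + q_k d^k\right)\right],
    \qquad q_i \ge 0 ,
    ```

    so the extra risk over background, [P(d) - P(0)]/[1 - P(0)], behaves like q_1 d at low dose; the LMS procedure replaces q_1 with its 95% upper confidence limit q_1* to obtain the bounding low-dose slope referred to above.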

  15. Lipidomics of oxidized polyunsaturated fatty acids

    PubMed Central

    Massey, Karen A.; Nicolaou, Anna

    2013-01-01

    Lipid mediators are produced from the oxidation of polyunsaturated fatty acids through enzymatic and free radical-mediated reactions. When subject to oxygenation via cyclooxygenases, lipoxygenases, and cytochrome P450 monooxygenases, polyunsaturated fatty acids give rise to an array of metabolites including eicosanoids, docosanoids, and octadecanoids. These potent bioactive lipids are involved in many biochemical and signaling pathways, with inflammation being of particular importance. Moreover, because they are produced by more than one pathway and substrate, and are present in a variety of biological milieus, their analysis is not always possible with conventional assays. Liquid chromatography coupled to electrospray mass spectrometry offers a versatile and sensitive approach for the analysis of bioactive lipids, allowing specific and accurate quantitation of multiple species present in the same sample. Here we explain the principles of this approach to mediator lipidomics and present detailed protocols for the assay of enzymatically produced oxygenated metabolites of polyunsaturated fatty acids that can be tailored to answer biological questions or facilitate assessment of nutritional and pharmacological interventions. PMID:22940496

  16. Quantitative Modeling of Cerenkov Light Production Efficiency from Medical Radionuclides

    PubMed Central

    Beattie, Bradley J.; Thorek, Daniel L. J.; Schmidtlein, Charles R.; Pentlow, Keith S.; Humm, John L.; Hielscher, Andreas H.

    2012-01-01

    There has been recent and growing interest in applying Cerenkov radiation (CR) for biological applications. Knowledge of the production efficiency and other characteristics of the CR produced by various radionuclides would help in assessing the feasibility of proposed applications and guide the choice of radionuclides. To generate this information we developed models of CR production efficiency based on the Frank-Tamm equation and models of CR distribution based on Monte Carlo simulations of photon and β particle transport. All models were validated against direct measurements using multiple radionuclides and then applied to a number of radionuclides commonly used in biomedical applications. We show that two radionuclides, Ac-225 and In-111, which have been reported to produce CR in water, do not in fact produce CR directly. We also propose a simple means of using this information to calibrate high-sensitivity luminescence imaging systems and show evidence suggesting that this calibration may be more accurate than methods in routine current use. PMID:22363636
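
    For context, the Frank-Tamm relation that underlies such production-efficiency models gives the CR photon yield per unit path length and wavelength (standard form; the paper's models additionally account for the β-emission spectrum and particle transport):

    ```latex
    \frac{d^2 N}{dx\,d\lambda} = \frac{2\pi \alpha z^2}{\lambda^2}
    \left( 1 - \frac{1}{\beta^2 n^2(\lambda)} \right),
    \qquad \beta\, n(\lambda) > 1 ,
    ```

    where α is the fine-structure constant, z the particle charge, β = v/c, and n(λ) the refractive index of the medium. No photons are emitted below the threshold βn = 1, which is why emitters whose charged particles never exceed that threshold cannot produce CR directly.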

  17. The impact of smart metal artefact reduction algorithm for use in radiotherapy treatment planning.

    PubMed

    Guilfoile, Connor; Rampant, Peter; House, Michael

    2017-06-01

    The presence of metal artefacts in computed tomography (CT) creates issues in radiation oncology. The loss of anatomical information and incorrect Hounsfield unit (HU) values produce inaccuracies in dose calculations, leading to suboptimal patient treatment. Metal artefact reduction (MAR) algorithms were developed to combat these problems. This study provides a qualitative and quantitative analysis of the "Smart MAR" software (General Electric Healthcare, Chicago, IL, USA), determining its usefulness in a clinical setting. A detailed analysis was conducted using both patient and phantom data, noting any improvements in HU values and dosimetry with the GE-MAR enabled. This study indicates qualitative improvements in the severity of the streak artefacts produced by metals, allowing for easier patient contouring. Furthermore, the GE-MAR managed to recover previously lost anatomical information. Additionally, phantom data showed an improvement in HU values with GE-MAR correction, producing more accurate point dose calculations in the treatment planning system. Overall, the GE-MAR is a useful tool and is suitable for clinical environments.

  18. A gene-targeted approach to investigate the intestinal butyrate-producing bacterial community

    PubMed Central

    2013-01-01

    Background Butyrate, which is produced by the human microbiome, is essential for a well-functioning colon. Bacteria that produce butyrate are phylogenetically diverse, which hinders their accurate detection based on conventional phylogenetic markers. As a result, reliable information on this important bacterial group is often lacking in microbiome research. Results In this study we describe a gene-targeted approach for 454 pyrotag sequencing and quantitative polymerase chain reaction for the final genes in the two primary bacterial butyrate synthesis pathways, butyryl-CoA:acetate CoA-transferase (but) and butyrate kinase (buk). We monitored the establishment and early succession of butyrate-producing communities in four patients with ulcerative colitis who underwent a colectomy with ileal pouch anal anastomosis and compared it with three control samples from healthy colons. All patients established an abundant butyrate-producing community (approximately 5% to 26% of the total community) in the pouch within the 2-month study, but patterns were distinctive among individuals. Only one patient harbored a community profile similar to the healthy controls, in which there was a predominance of but genes that are similar to reference genes from Acidaminococcus sp., Eubacterium sp., Faecalibacterium prausnitzii and Roseburia sp., and an almost complete absence of buk genes. Two patients were greatly enriched in buk genes similar to those of Clostridium butyricum and C. perfringens, whereas a fourth patient displayed abundant communities containing both genes. Most butyrate producers identified in previous studies were detected and the general patterns of taxa found were supported by 16S rRNA gene pyrotag analysis, but the gene-targeted approach provided more detail about the potential butyrate-producing members of the community. Conclusions The presented approach provides quantitative and genotypic insights into butyrate-producing communities and facilitates a more specific functional characterization of the intestinal microbiome. Furthermore, our analysis refines but and buk reference annotations found in central databases. PMID:24451334

  19. ELECTRON MICROSCOPIC EXAMINATION OF SUBCELLULAR FRACTIONS

    PubMed Central

    Baudhuin, Pierre; Evrard, Philippe; Berthet, Jacques

    1967-01-01

    A method is described for preparing, by filtration on Millipore filters, very thin (about 10 µm) pellicles of packed particles. These pellicles can be embedded in Epon for electron microscopic examination. They are also suitable for cytochemical assays. The method was used with various particulate fractions from rat liver. Its main advantages over the usual centrifugal packing techniques are that it produces heterogeneity solely in the direction perpendicular to the surface of the pellicle and that sections covering the whole depth of the pellicle can be photographed in a single field. It thus answers the essential criterion of random sampling and can be used for accurate quantitative evaluations. PMID:10976209

  20. High-throughput, label-free, single-cell, microalgal lipid screening by machine-learning-equipped optofluidic time-stretch quantitative phase microscopy.

    PubMed

    Guo, Baoshan; Lei, Cheng; Kobayashi, Hirofumi; Ito, Takuro; Yalikun, Yaxiaer; Jiang, Yiyue; Tanaka, Yo; Ozeki, Yasuyuki; Goda, Keisuke

    2017-05-01

    The development of reliable, sustainable, and economical sources of alternative fuels to petroleum is required to tackle the global energy crisis. One such alternative is microalgal biofuel, which is expected to play a key role in reducing the detrimental effects of global warming, as microalgae absorb atmospheric CO2 via photosynthesis. Unfortunately, conventional analytical methods only provide population-averaged lipid amounts and fail to characterize a diverse population of microalgal cells with single-cell resolution in a non-invasive and interference-free manner. Here, high-throughput, label-free, single-cell screening of lipid-producing microalgal cells with optofluidic time-stretch quantitative phase microscopy was demonstrated. In particular, Euglena gracilis, an attractive microalgal species that produces wax esters (suitable for biodiesel and aviation fuel after refinement) within lipid droplets, was investigated. The optofluidic time-stretch quantitative phase microscope is based on an integration of a hydrodynamic-focusing microfluidic chip, an optical time-stretch quantitative phase microscope, and a digital image processor equipped with machine learning. As a result, it provides both the opacity and phase maps of every single cell at a high throughput of 10,000 cells/s, enabling accurate cell classification without the need for fluorescent staining. Specifically, the dataset was used to characterize heterogeneous populations of E. gracilis cells under two different culture conditions (nitrogen-sufficient and nitrogen-deficient) and achieve cell classification with an error rate of only 2.15%. The method holds promise as an effective analytical tool for microalgae-based biofuel production. © 2017 International Society for Advancement of Cytometry.

  1. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells.

    PubMed

    Park, Han Sang; Rinehart, Matthew T; Walzer, Katelyn A; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that relies heavily on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at the trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors differ to a highly statistically significant degree between infected and uninfected cells, no single descriptor enables separation of the populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity, with early trophozoites most often mistaken for late trophozoite or schizont stage, and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis.

  2. Automated Detection of P. falciparum Using Machine Learning Algorithms with Quantitative Phase Images of Unstained Cells

    PubMed Central

    Park, Han Sang; Rinehart, Matthew T.; Walzer, Katelyn A.; Chi, Jen-Tsan Ashley; Wax, Adam

    2016-01-01

    Malaria detection through microscopic examination of stained blood smears is a diagnostic challenge that relies heavily on the expertise of trained microscopists. This paper presents an automated analysis method for detection and staging of red blood cells infected by the malaria parasite Plasmodium falciparum at the trophozoite or schizont stage. Unlike previous efforts in this area, this study uses quantitative phase images of unstained cells. Erythrocytes are automatically segmented using thresholds of optical phase and refocused to enable quantitative comparison of phase images. Refocused images are analyzed to extract 23 morphological descriptors based on the phase information. While all individual descriptors differ to a highly statistically significant degree between infected and uninfected cells, no single descriptor enables separation of the populations at a level satisfactory for clinical utility. To improve the diagnostic capacity, we applied various machine learning techniques, including linear discriminant classification (LDC), logistic regression (LR), and k-nearest neighbor classification (NNC), to formulate algorithms that combine all of the calculated physical parameters to distinguish cells more effectively. Results show that LDC provides the highest accuracy of up to 99.7% in detecting schizont stage infected cells compared to uninfected RBCs. NNC showed slightly better accuracy (99.5%) than either LDC (99.0%) or LR (99.1%) for discriminating late trophozoites from uninfected RBCs. However, for early trophozoites, LDC produced the best accuracy of 98%. Discrimination of infection stage was less accurate, producing high specificity (99.8%) but only 45.0%-66.8% sensitivity, with early trophozoites most often mistaken for late trophozoite or schizont stage, and late trophozoite and schizont stage most often confused for each other. Overall, this methodology points to a significant clinical potential of using quantitative phase imaging to detect and stage malaria infection without staining or expert analysis. PMID:27636719
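
    A minimal sketch of the discriminant-analysis step described in the two records above; the arrays are hypothetical stand-ins, with the 23 phase-derived descriptors per cell forming the feature matrix.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 23))       # 500 cells x 23 morphological descriptors
    y = rng.integers(0, 2, size=500)     # 0 = uninfected, 1 = schizont-infected

    ldc = LinearDiscriminantAnalysis()
    acc = cross_val_score(ldc, X, y, cv=5).mean()   # cross-validated accuracy
    print(f"LDC accuracy: {acc:.3f}")
    ```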

  3. Quantitative aspects of inductively coupled plasma mass spectrometry

    PubMed Central

    Wagner, Barbara

    2016-01-01

    Accurate determination of elements in various kinds of samples is essential for many areas, including environmental science, medicine, as well as industry. Inductively coupled plasma mass spectrometry (ICP-MS) is a powerful tool enabling multi-elemental analysis of numerous matrices with high sensitivity and good precision. Various calibration approaches can be used to perform accurate quantitative measurements by ICP-MS. They include the use of pure standards, matrix-matched standards, or relevant certified reference materials, assuring traceability of the reported results. This review critically evaluates the advantages and limitations of different calibration approaches, which are used in quantitative analyses by ICP-MS. Examples of such analyses are provided. This article is part of the themed issue ‘Quantitative mass spectrometry’. PMID:27644971

  4. Determination of mechanical stiffness of bone by pQCT measurements: correlation with non-destructive mechanical four-point bending test data.

    PubMed

    Martin, Daniel E; Severns, Anne E; Kabo, J M J Michael

    2004-08-01

    Mechanical tests of bone provide valuable information about material and structural properties important for understanding bone pathology in both clinical and research settings, but no previous studies have produced applicable non-invasive, quantitative estimates of bending stiffness. The goal of this study was to evaluate the effectiveness of using peripheral quantitative computed tomography (pQCT) data to accurately compute the bending stiffness of bone. Normal rabbit humeri (N=8) were scanned at their mid-diaphyses using pQCT. The average bone mineral densities and the cross-sectional moments of inertia were computed from the pQCT cross-sections. Bending stiffness was determined as a function of the elastic modulus of compact bone (based on the local bone mineral density), cross-sectional moment of inertia, and simulated quasistatic strain rate. The actual bending stiffness of the bones was determined using four-point bending tests. Comparison of the bending stiffness estimated from the pQCT data and the mechanical bending stiffness revealed excellent correlation (R2=0.96). The bending stiffness from the pQCT data was on average 103% of that obtained from the four-point bending tests. The results indicate that pQCT data can be used to accurately determine the bending stiffness of normal bone. Possible applications include temporal quantification of fracture healing and risk management of osteoporosis or other bone pathologies.
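
    A sketch of how a bending stiffness of this kind can be assembled from a pQCT cross-section, under an assumed density-to-modulus relation; the power law and threshold below are placeholders, not the calibration used in the study.

    ```python
    import numpy as np

    def bending_stiffness(density, pixel_mm, threshold=0.5, k=8000.0, p=2.0):
        """Estimate EI (N*mm^2) about the horizontal centroidal axis.
        density: 2D bone mineral density map (g/cm^3); pixel_mm: pixel size.
        E = k * rho**p is an assumed placeholder modulus-density law."""
        mask = density > threshold                # compact-bone pixels
        area = pixel_mm ** 2                      # area of one pixel
        y = np.nonzero(mask)[0] * pixel_mm        # y-coordinate of each bone pixel
        y_bar = y.mean()                          # centroid of the section
        E = k * density[mask] ** p                # per-pixel modulus (MPa = N/mm^2)
        return np.sum(E * (y - y_bar) ** 2 * area)  # discrete integral of E*y^2 dA
    ```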

  5. Accurate Virus Quantitation Using a Scanning Transmission Electron Microscopy (STEM) Detector in a Scanning Electron Microscope

    DTIC Science & Technology

    2017-06-29

    Blancett, Candace D.; …; Norris, L.; Rossi, Cynthia A.; Glass, Pamela J.; Sun, Mei G.
    Pathology Division and Biostatistics Division, United States Army Medical Research Institute of Infectious Diseases (USAMRIID), 1425 Porter Street, Fort Detrick, Maryland, 21702

  6. Production of highly-enriched 134Ba for a reference material for isotope dilution mass spectrometry measurements

    DOE PAGES

    Horkley, J. J.; Carney, K. P.; Gantz, E. M.; ...

    2015-03-17

    Isotope dilution mass spectrometry (IDMS) is an analytical technique capable of providing accurate and precise quantitation of trace isotope abundance and assay, with measurement uncertainties below 1%. To achieve these low uncertainties, the IDMS method ideally utilizes chemically pure "spike" solutions that consist of a single highly enriched isotope that is well characterized with respect to the abundance of companion isotopes and its concentration in solution. To address a current demand for accurate 137Cs/137Ba ratio measurements for "age" determination of radioactive 137Cs sources, Idaho National Laboratory (INL) is producing enriched 134Ba isotopes that are to be used as IDMS spikes to accurately determine 137Ba accumulation from the decay of 137Cs. The final objective of this work is to provide a homogeneous set of reference materials that the National Institute of Standards and Technology can certify as standard reference materials for IDMS. The process developed at INL for the separation and isolation of Ba isotopes, the chemical purification of the isotopes in solution, and the encapsulation of the materials is described.

  7. A synthetic genetic edge detection program.

    PubMed

    Tabor, Jeffrey J; Salis, Howard M; Simpson, Zachary Booth; Chevalier, Aaron A; Levskaya, Anselm; Marcotte, Edward M; Voigt, Christopher A; Ellington, Andrew D

    2009-06-26

    Edge detection is a signal processing algorithm common in artificial intelligence and image recognition programs. We have constructed a genetically encoded edge detection algorithm that programs an isogenic community of E. coli to sense an image of light, communicate to identify the light-dark edges, and visually present the result of the computation. The algorithm is implemented using multiple genetic circuits. An engineered light sensor enables cells to distinguish between light and dark regions. In the dark, cells produce a diffusible chemical signal that diffuses into light regions. Genetic logic gates are used so that only cells that sense light and the diffusible signal produce a positive output. A mathematical model constructed from first principles and parameterized with experimental measurements of the component circuits predicts the performance of the complete program. Quantitatively accurate models will facilitate the engineering of more complex biological behaviors and inform bottom-up studies of natural genetic regulatory networks.

  8. A Synthetic Genetic Edge Detection Program

    PubMed Central

    Tabor, Jeffrey J.; Salis, Howard; Simpson, Zachary B.; Chevalier, Aaron A.; Levskaya, Anselm; Marcotte, Edward M.; Voigt, Christopher A.; Ellington, Andrew D.

    2009-01-01

    Edge detection is a signal processing algorithm common in artificial intelligence and image recognition programs. We have constructed a genetically encoded edge detection algorithm that programs an isogenic community of E. coli to sense an image of light, communicate to identify the light-dark edges, and visually present the result of the computation. The algorithm is implemented using multiple genetic circuits. An engineered light sensor enables cells to distinguish between light and dark regions. In the dark, cells produce a diffusible chemical signal that diffuses into light regions. Genetic logic gates are used so that only cells that sense light and the diffusible signal produce a positive output. A mathematical model constructed from first principles and parameterized with experimental measurements of the component circuits predicts the performance of the complete program. Quantitatively accurate models will facilitate the engineering of more complex biological behaviors and inform bottom-up studies of natural genetic regulatory networks. PMID:19563759
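
    The community-level computation in the two records above maps onto a conventional image-processing analog: a pixel is an "edge" if it is lit AND receives the diffusible signal emitted by dark neighbors. A sketch of that logic, with morphological dilation standing in for diffusion of the chemical signal (an illustrative analogy, not the authors' mathematical model):

    ```python
    import numpy as np
    from scipy.ndimage import binary_dilation

    def genetic_edge_detect(light_mask, signal_range=3):
        """light_mask: boolean image of the projected light pattern.
        Dark cells emit a signal that spreads ~signal_range pixels (dilation
        stands in for diffusion); only lit cells that also sense the signal
        switch on, marking the light-dark edge."""
        dark = ~light_mask
        signal = binary_dilation(dark, iterations=signal_range)  # diffused signal
        return light_mask & signal                               # AND-gate output
    ```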

  9. Volumetric visualization of multiple-return LIDAR data: Using voxels

    USGS Publications Warehouse

    Stoker, Jason M.

    2009-01-01

    Elevation data are an important component in the visualization and analysis of geographic information. The creation and display of 3D models representing bare earth, vegetation, and surface structures have become a major focus of light detection and ranging (lidar) remote sensing research in the past few years. Lidar is an active sensor that records the distance, or range, of a laser usually fired from an airplane, helicopter, or satellite. By converting the millions of 3D lidar returns from a system into bare ground, vegetation, or structural elevation information, extremely accurate, high-resolution elevation models can be derived and produced to visualize and quantify scenes in three dimensions. These data can be used to produce high-resolution bare-earth digital elevation models; quantitative estimates of vegetative features such as canopy height, canopy closure, and biomass; and models of urban areas such as building footprints and 3D city models.

  10. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
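
    A minimal sketch of the threshold-and-count core of such digital enumeration, using scikit-image; the size bounds that mimic the paper's adaptive parasite-size filtering are illustrative.

    ```python
    import numpy as np
    from skimage import filters, measure

    def count_parasites(img, min_area=20, max_area=400):
        """Count dark-staining, parasite-sized particles in a greyscale
        thick-film image: Otsu thresholding followed by connected-component
        labeling, with area limits to reject debris and white cells."""
        binary = img < filters.threshold_otsu(img)   # parasites stain dark
        labels = measure.label(binary)
        regions = measure.regionprops(labels)
        return sum(min_area <= r.area <= max_area for r in regions)
    ```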

  11. Accurate, Sensitive, and Precise Multiplexed Proteomics Using the Complement Reporter Ion Cluster

    DOE PAGES

    Sonnett, Matthew; Yeung, Eyan; Wuhr, Martin

    2018-03-09

    Quantitative analysis of proteomes across multiple time points, organelles, and perturbations is essential for understanding both fundamental biology and disease states. The development of isobaric tags (e.g. TMT) has enabled the simultaneous measurement of peptide abundances across several different conditions. These multiplexed approaches are promising in principle because of advantages in throughput and measurement quality. However, in practice existing multiplexing approaches suffer from key limitations. In its simple implementation (TMT-MS2), measurements are distorted by chemical noise, leading to poor measurement accuracy. The current state-of-the-art (TMT-MS3) addresses this, but requires specialized quadrupole-iontrap-Orbitrap instrumentation. The complement reporter ion approach (TMTc) produces high-accuracy measurements and is compatible with many more instruments, like quadrupole-Orbitraps. However, the required deconvolution of the TMTc cluster leads to poor measurement precision. Here, we introduce TMTc+, which adds modeling of the MS2-isolation step into the deconvolution algorithm. The resulting measurements are comparable in precision to TMT-MS3/MS2. The improved duty cycle and lower filtering requirements make TMTc+ more sensitive than TMT-MS3 and comparable with TMT-MS2. At the same time, unlike TMT-MS2, TMTc+ is exquisitely able to distinguish signal from chemical noise, even outperforming TMT-MS3. Lastly, we compare TMTc+ to quantitative label-free proteomics of total HeLa lysate and find that TMTc+ quantifies 7.8k versus 3.9k proteins in a 5-plex sample. At the same time the median coefficient of variation improves from 13% to 4%. TMTc+ thus advances quantitative proteomics by enabling accurate, sensitive, and precise multiplexed experiments on more commonly used instruments.

  12. Accurate, Sensitive, and Precise Multiplexed Proteomics Using the Complement Reporter Ion Cluster

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sonnett, Matthew; Yeung, Eyan; Wuhr, Martin

    Quantitative analysis of proteomes across multiple time points, organelles, and perturbations is essential for understanding both fundamental biology and disease states. The development of isobaric tags (e.g. TMT) has enabled the simultaneous measurement of peptide abundances across several different conditions. These multiplexed approaches are promising in principle because of advantages in throughput and measurement quality. However, in practice existing multiplexing approaches suffer from key limitations. In its simple implementation (TMT-MS2), measurements are distorted by chemical noise, leading to poor measurement accuracy. The current state-of-the-art (TMT-MS3) addresses this, but requires specialized quadrupole-iontrap-Orbitrap instrumentation. The complement reporter ion approach (TMTc) produces high-accuracy measurements and is compatible with many more instruments, like quadrupole-Orbitraps. However, the required deconvolution of the TMTc cluster leads to poor measurement precision. Here, we introduce TMTc+, which adds modeling of the MS2-isolation step into the deconvolution algorithm. The resulting measurements are comparable in precision to TMT-MS3/MS2. The improved duty cycle and lower filtering requirements make TMTc+ more sensitive than TMT-MS3 and comparable with TMT-MS2. At the same time, unlike TMT-MS2, TMTc+ is exquisitely able to distinguish signal from chemical noise, even outperforming TMT-MS3. Lastly, we compare TMTc+ to quantitative label-free proteomics of total HeLa lysate and find that TMTc+ quantifies 7.8k versus 3.9k proteins in a 5-plex sample. At the same time the median coefficient of variation improves from 13% to 4%. TMTc+ thus advances quantitative proteomics by enabling accurate, sensitive, and precise multiplexed experiments on more commonly used instruments.

  13. Fluorescence correlation spectroscopy analysis for accurate determination of proportion of doubly labeled DNA in fluorescent DNA pool for quantitative biochemical assays.

    PubMed

    Hou, Sen; Sun, Lili; Wieczorek, Stefan A; Kalwarczyk, Tomasz; Kaminski, Tomasz S; Holyst, Robert

    2014-01-15

    Fluorescent double-stranded DNA (dsDNA) molecules labeled at both ends are commonly produced by annealing complementary single-stranded DNA (ssDNA) molecules that are labeled with fluorescent dyes at the same (3' or 5') end. Because the labeling efficiency of ssDNA is below 100%, the resulting dsDNA molecules carry two dyes, one dye, or none. Existing methods cannot measure the percentage of the doubly-labeled dsDNA component in the fluorescent DNA sample; it is difficult even to distinguish the doubly-labeled component from the singly-labeled one. Accurate measurement of the percentage of the doubly-labeled dsDNA component is a critical prerequisite for quantitative biochemical measurements and has puzzled scientists for decades. We established a fluorescence correlation spectroscopy (FCS) system to measure the percentage of doubly-labeled dsDNA (PDL) in the total fluorescent dsDNA pool. The method is based on comparative analysis of the given sample and a reference dsDNA sample prepared by adding a known amount of unlabeled ssDNA to the original ssDNA solution. From the FCS autocorrelation functions, we obtain the number of fluorescent dsDNA molecules in the focal volume of the confocal microscope and hence PDL. We also calculate the labeling efficiency of the ssDNA. The method requires a minimal amount of material: samples with DNA concentrations in the nanomolar range and volumes of tens of microliters. We verified the method by using the restriction enzyme Hind III to cleave the fluorescent dsDNA; the kinetics of this reaction depend strongly on PDL, a critical parameter for quantitative biochemical measurements. Copyright © 2013 Elsevier B.V. All rights reserved.
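
    Under the simple assumption that the two strands are labeled independently with efficiency p, the expected doubly-labeled fraction of the fluorescent pool has a closed form, which offers one way to sanity-check an FCS-derived PDL (a hedged sketch of that bookkeeping, not the paper's estimator):

    ```python
    def pdl_from_efficiency(p):
        """Expected fraction of fluorescent dsDNA carrying two dyes when each
        strand is labeled independently with probability p:
        doubly = p^2, singly = 2p(1-p), so PDL = p^2/(p^2 + 2p(1-p)) = p/(2-p)."""
        return p / (2.0 - p)

    print(pdl_from_efficiency(0.8))  # 80% labeling efficiency -> PDL ~ 0.667
    ```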

  14. Investigation of practical initial attenuation image estimates in TOF-MLAA reconstruction for PET/MR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Ju-Chieh, E-mail: chengjuchieh@gmail.com; Y

    Purpose: Time-of-flight joint attenuation and activity positron emission tomography reconstruction requires additional calibration (scale factors) or constraints during or post-reconstruction to produce a quantitative μ-map. In this work, the impact of various initializations of the joint reconstruction was investigated, and the initial average μ-value (IAM) method was introduced such that the forward-projection of the initial μ-map is already very close to that of the reference μ-map, thus reducing/minimizing the offset (scale factor) during the early iterations of the joint reconstruction. Consequently, the accuracy and efficiency of unconstrained joint reconstruction such as time-of-flight maximum likelihood estimation of attenuation and activity (TOF-MLAA) can be improved by the proposed IAM method. Methods: 2D simulations of brain and chest were used to evaluate TOF-MLAA with various initial estimates, which include the object filled with water uniformly (conventional initial estimate), bone uniformly, the average μ-value uniformly (IAM magnitude initialization method), and the perfect spatial μ-distribution but with a wrong magnitude (initialization in terms of distribution). A 3D GATE simulation was also performed for the chest phantom under a typical clinical scanning condition, and the simulated data were reconstructed with a fully corrected list-mode TOF-MLAA algorithm with various initial estimates. The accuracy of the average μ-values within the brain, chest, and abdomen regions obtained from the MR-derived μ-maps was also evaluated using computed tomography μ-maps as the gold standard. Results: The estimated μ-map with the initialization in terms of magnitude (i.e., average μ-value) was observed to reach the reference more quickly and naturally as compared to all other cases. Both 2D and 3D GATE simulations produced similar results, and it was observed that the proposed IAM approach can produce quantitative μ-maps and emission images when corrections for physical effects such as scatter and randoms are included. The average μ-value obtained from the MR-derived μ-map was accurate to within 5% with corrections for bone, fat, and uniform lungs. Conclusions: The proposed IAM-TOF-MLAA can produce a quantitative μ-map without any calibration, provided that there are sufficient counts in the measured data. For low-count data, noise reduction and additional regularization/rescaling techniques need to be applied and investigated. The average μ-value within the object is prior information that can be extracted from MR and patient databases, and it is feasible to obtain an accurate average μ-value using an MR-derived μ-map with corrections, as demonstrated in this work.

  15. An efficient and accurate 3D displacements tracking strategy for digital volume correlation

    NASA Astrophysics Data System (ADS)

    Pan, Bing; Wang, Bo; Wu, Dafang; Lubineau, Gilles

    2014-07-01

    Owing to its inherent computational complexity, practical implementation of digital volume correlation (DVC) for internal displacement and strain mapping faces important challenges in improving its computational efficiency. In this work, an efficient and accurate 3D displacement tracking strategy is proposed for fast DVC calculation. The efficiency advantage is achieved by using three improvements. First, to eliminate the need to update the Hessian matrix in each iteration, an efficient 3D inverse compositional Gauss-Newton (3D IC-GN) algorithm is introduced to replace existing forward additive algorithms for accurate sub-voxel displacement registration. Second, to ensure that the 3D IC-GN algorithm converges accurately and rapidly and to avoid time-consuming integer-voxel displacement searching, a generalized reliability-guided displacement tracking strategy is designed to transfer an accurate and complete initial guess of deformation to each calculation point from its computed neighbors. Third, to avoid repeated computation of sub-voxel intensity interpolation coefficients, an interpolation coefficient lookup table is established for tricubic interpolation. The computational complexities of the proposed fast DVC algorithm and existing typical DVC algorithms are first analyzed quantitatively in terms of the necessary arithmetic operations. Then, numerical tests are performed to verify the performance of the fast DVC algorithm in terms of measurement accuracy and computational efficiency. The experimental results indicate that, compared with the existing DVC algorithm, the presented fast DVC algorithm produces similar precision and slightly higher accuracy at a substantially reduced computational cost.

  16. Validation Process for LEWICE Coupled by Use of a Navier-stokes Solver

    NASA Technical Reports Server (NTRS)

    Wright, William B.

    2016-01-01

    A research project is underway at NASA Glenn to produce computer software that can accurately predict ice growth for many meteorological conditions for any aircraft surface. This report will present results from the latest LEWICE release, version 3.5. This program differs from previous releases in its ability to model mixed phase and ice crystal conditions such as those encountered inside an engine. It also has expanded capability to use structured grids and a new capability to use results from unstructured grid flow solvers. An extensive comparison of the results in a quantifiable manner against the database of ice shapes that have been generated in the NASA Glenn Icing Research Tunnel (IRT) has also been performed. This paper will show the differences in ice shape between LEWICE 3.5 and experimental data. In addition, comparisons will be made between the lift and drag calculated on the ice shapes from experiment and those produced by LEWICE. This report will also provide a description of both programs. Quantitative geometric comparisons are shown for horn height, horn angle, icing limit, area and leading edge thickness. Quantitative comparisons of calculated lift and drag will also be shown. The results show that the predicted results are within the accuracy limits of the experimental data for the majority of cases.

  17. A quantitative evaluation of spurious results in the infrared spectroscopic measurement of CO2 isotope ratios

    NASA Astrophysics Data System (ADS)

    Mansfield, C. D.; Rutt, H. N.

    2002-02-01

    The possible generation of spurious results, arising from the application of infrared spectroscopic techniques to the measurement of carbon isotope ratios in breath, due to coincident absorption bands has been re-examined. An earlier investigation, which approached the problem qualitatively, fulfilled its aspirations in providing an unambiguous assurance that 13C16O2/12C16O2 ratios can be confidently measured for isotopic breath tests using instruments based on infrared absorption. Although this conclusion still stands, subsequent quantitative investigation has revealed an important exception that necessitates a strict adherence to sample collection protocol. The results show that concentrations and decay rates of the coincident breath trace compounds acetonitrile and carbon monoxide, found in the breath sample of a heavy smoker, can produce spurious results. Hence, findings from this investigation justify the concern that breath trace compounds present a risk to the accurate measurement of carbon isotope ratios in breath when using broadband, non-dispersive, ground state absorption infrared spectroscopy. It provides recommendations on the length of smoking abstention required to avoid generation of spurious results and also reaffirms, through quantitative argument, the validity of using infrared absorption spectroscopy to measure CO2 isotope ratios in breath.

  18. Mountain Heavy Rainfall Measurement Experiments in a Subtropical Monsoon Environment

    NASA Astrophysics Data System (ADS)

    Jong-Dao Jou, Ben; Chi-June Jung, Ultimate; Lai, Hsiao-Wei; Feng, Lei

    2014-05-01

Quantitative rainfall measurement experiments have been conducted in the Taiwan area for the past 5 years (since 2008), especially over the complex terrain region. In this paper, results from these experiments are analyzed and discussed, especially those associated with heavy rain events in the summer monsoon season. Observations from an S-band polarimetric radar (SPOL of NCAR) and an X-band vertically-pointing radar are analyzed to reveal the high-resolution temporal and spatial variation of precipitation structure. May and June, the Meiyu season in the area, are months with subtropical frontal rainfall events. Mesoscale convective systems, i.e., pre-frontal squall lines and frontal convective rainbands, are very active and frequently produce heavy rain events over mountain areas. Accurate quantitative precipitation measurements are needed to meet the requirements of landslide and flood early warning. Using ground-based disdrometers and the vertically-pointing radar, we have been working to improve the quantitative precipitation estimation in the mountain region obtained from the coastal operational radar. In this paper, the methodology applied is presented and the potential of its application is discussed. *corresponding author: Ben Jong-Dao Jou, jouben43@gmail.com

  19. Improved Visualization of Gastrointestinal Slow Wave Propagation Using a Novel Wavefront-Orientation Interpolation Technique.

    PubMed

    Mayne, Terence P; Paskaranandavadivel, Niranchan; Erickson, Jonathan C; OGrady, Gregory; Cheng, Leo K; Angeli, Timothy R

    2018-02-01

High-resolution mapping of gastrointestinal (GI) slow waves is a valuable technique for research and clinical applications. Interpretation of high-resolution GI mapping data relies on animations of slow wave propagation, but current methods remain rudimentary, pixelated electrode activation animations. This study aimed to develop improved methods of visualizing high-resolution slow wave recordings that increase ease of interpretation. The novel method of "wavefront-orientation" interpolation was created to account for the planar movement of the slow wave wavefront, negate any need for distance calculations, remain robust in atypical wavefronts (i.e., dysrhythmias), and produce an appropriate interpolation boundary. The wavefront-orientation method determines the orthogonal wavefront direction and calculates interpolated values as the mean slow wave activation-time (AT) of the pair of linearly adjacent electrodes along that direction. Stairstep upsampling increased smoothness and clarity. Animation accuracy of 17 human high-resolution slow wave recordings (64-256 electrodes) was verified by visual comparison to the prior method, showing a clear improvement in wave smoothness that enabled more accurate interpretation of propagation, as confirmed by an assessment of clinical applicability performed by eight GI clinicians. Quantitatively, the new method produced accurate interpolation values compared to experimental data (mean difference 0.02 ± 0.05 s) and was accurate when applied solely to dysrhythmic data (0.02 ± 0.06 s), both within the error in manual AT marking (mean 0.2 s). Mean interpolation processing time was 6.0 s per wave. These novel methods provide a validated visualization platform that will improve analysis of high-resolution GI mapping in research and clinical translation.
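
    As a rough sketch of the interpolation rule described above, the following Python fragment (assuming numpy) estimates the propagation direction from the local activation-time (AT) gradient and averages the adjacent electrode pair along that direction; the gradient-based direction estimate and the four-axis snap are illustrative simplifications of the published method.

      import numpy as np

      def wavefront_orientation_interp(at, i, j):
          """Fill the AT at grid point (i, j) as the mean of the linearly
          adjacent electrode pair along the local propagation direction."""
          gy, gx = np.gradient(at)
          angle = np.arctan2(gy[i, j], gx[i, j]) % np.pi  # direction, sign-free
          axes = {0.0: (0, 1), np.pi / 2: (1, 0),
                  np.pi / 4: (1, 1), 3 * np.pi / 4: (-1, 1)}

          def axis_dist(a):  # angular distance with wrap-around
              d = abs(a - angle)
              return min(d, np.pi - d)

          di, dj = axes[min(axes, key=axis_dist)]
          return 0.5 * (at[i - di, j - dj] + at[i + di, j + dj])

      at = np.add.outer(np.arange(5.0), np.arange(5.0))  # planar wave of ATs
      print(wavefront_orientation_interp(at, 2, 2))      # 4.0, equals at[2, 2]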

  20. An inverse approach to determining spatially varying arterial compliance using ultrasound imaging

    NASA Astrophysics Data System (ADS)

    Mcgarry, Matthew; Li, Ronny; Apostolakis, Iason; Nauleau, Pierre; Konofagou, Elisa E.

    2016-08-01

    The mechanical properties of arteries are implicated in a wide variety of cardiovascular diseases, many of which are expected to involve a strong spatial variation in properties that can be depicted by diagnostic imaging. A pulse wave inverse problem (PWIP) is presented, which can produce spatially resolved estimates of vessel compliance from ultrasound measurements of the vessel wall displacements. The 1D equations governing pulse wave propagation in a flexible tube are parameterized by the spatially varying properties, discrete cosine transform components of the inlet pressure boundary conditions, viscous loss constant and a resistance outlet boundary condition. Gradient descent optimization is used to fit displacements from the model to the measured data by updating the model parameters. Inversion of simulated data showed that the PWIP can accurately recover the correct compliance distribution and inlet pressure under realistic conditions, even under high simulated measurement noise conditions. Silicone phantoms with known compliance contrast were imaged with a clinical ultrasound system. The PWIP produced spatially and quantitatively accurate maps of the phantom compliance compared to independent static property estimates, and the known locations of stiff inclusions (which were as small as 7 mm). The PWIP is necessary for these phantom experiments as the spatiotemporal resolution, measurement noise and compliance contrast does not allow accurate tracking of the pulse wave velocity using traditional approaches (e.g. 50% upstroke markers). Results from simulations indicate reflections generated from material interfaces may negatively affect wave velocity estimates, whereas these reflections are accounted for in the PWIP and do not cause problems.

  1. Registration of T2-weighted and diffusion-weighted MR images of the prostate: comparison between manual and landmark-based methods

    NASA Astrophysics Data System (ADS)

    Peng, Yahui; Jiang, Yulei; Soylu, Fatma N.; Tomek, Mark; Sensakovic, William; Oto, Aytekin

    2012-02-01

Quantitative analysis of multi-parametric magnetic resonance (MR) images of the prostate, including T2-weighted (T2w) and diffusion-weighted (DW) images, requires accurate image registration. We compared two registration methods between T2w and DW images. We collected pre-operative MR images of 124 prostate cancer patients (68 patients scanned with a GE scanner and 56 with Philips scanners). A landmark-based rigid registration was performed using six prostate landmarks identified by a radiologist in both T2w and DW images. Independently, a researcher manually registered the same images. A radiologist visually evaluated the registration results by using a 5-point ordinal scale of 1 (worst) to 5 (best). The Wilcoxon signed-rank test was used to determine whether the radiologist's ratings of the results of the two registration methods were significantly different. Results demonstrated that both methods were accurate: the average ratings were 4.2, 3.3, and 3.8 for GE, Philips, and all images, respectively, for the landmark-based method; and 4.6, 3.7, and 4.2, respectively, for the manual method. The manual registration results were more accurate than the landmark-based registration results (p < 0.0001 for GE, Philips, and all images). Therefore, the manual method produces more accurate registration between T2w and DW images than the landmark-based method.
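
    For concreteness, the statistical comparison described above is a paired Wilcoxon signed-rank test on the per-case ratings. A minimal Python sketch with scipy and invented ratings (not the study's data):

      from scipy.stats import wilcoxon

      # hypothetical paired 5-point ratings of the same cases
      landmark = [4, 3, 5, 4, 3, 4, 2, 4, 3, 4]
      manual   = [5, 4, 5, 4, 4, 5, 3, 4, 4, 5]

      stat, p = wilcoxon(landmark, manual)   # paired, non-parametric
      print(f"W={stat}, p={p:.4f}")          # small p: ratings differ significantly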

  2. Sensitive and accurate identification of protein–DNA binding events in ChIP-chip assays using higher order derivative analysis

    PubMed Central

    Barrett, Christian L.; Cho, Byung-Kwan

    2011-01-01

    Immuno-precipitation of protein–DNA complexes followed by microarray hybridization is a powerful and cost-effective technology for discovering protein–DNA binding events at the genome scale. It is still an unresolved challenge to comprehensively, accurately and sensitively extract binding event information from the produced data. We have developed a novel strategy composed of an information-preserving signal-smoothing procedure, higher order derivative analysis and application of the principle of maximum entropy to address this challenge. Importantly, our method does not require any input parameters to be specified by the user. Using genome-scale binding data of two Escherichia coli global transcription regulators for which a relatively large number of experimentally supported sites are known, we show that ∼90% of known sites were resolved to within four probes, or ∼88 bp. Over half of the sites were resolved to within two probes, or ∼38 bp. Furthermore, we demonstrate that our strategy delivers significant quantitative and qualitative performance gains over available methods. Such accurate and sensitive binding site resolution has important consequences for accurately reconstructing transcriptional regulatory networks, for motif discovery, for furthering our understanding of local and non-local factors in protein–DNA interactions and for extending the usefulness horizon of the ChIP-chip platform. PMID:21051353
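
    A hedged sketch of the general derivative idea above: smooth the probe-intensity track and flag probes where the second derivative has a strong local minimum (sharpest peak curvature). Savitzky-Golay filtering stands in here for the authors' information-preserving smoothing, and their maximum-entropy step is omitted.

      import numpy as np
      from scipy.signal import argrelextrema, savgol_filter

      def candidate_sites(signal, window=11, poly=3, frac=0.5):
          """Return probe indices whose smoothed second derivative is a local
          minimum and at least `frac` of the strongest curvature."""
          d2 = savgol_filter(signal, window, poly, deriv=2)
          idx = argrelextrema(d2, np.less)[0]
          return idx[d2[idx] < frac * d2.min()]

      x = np.linspace(-5, 5, 201)
      track = np.exp(-x ** 2) + 0.001 * np.random.randn(201)  # one synthetic peak
      print(candidate_sites(track))  # index near 100, the peak center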

  3. A NOVEL TECHNIQUE FOR QUANTITATIVE ESTIMATION OF UPTAKE OF DIESEL EXHAUST PARTICLES BY LUNG CELLS

    EPA Science Inventory

    While airborne particulates like diesel exhaust particulates (DEP) exert significant toxicological effects on lungs, quantitative estimation of accumulation of DEP inside lung cells has not been reported due to a lack of an accurate and quantitative technique for this purpose. I...

  4. Synthesis and characterisation of PuPO4 - a potential analytical standard for EPMA actinide quantification

    NASA Astrophysics Data System (ADS)

    Wright, K. E.; Popa, K.; Pöml, P.

    2018-01-01

Transmutation nuclear fuels contain weight percentage quantities of actinide elements, including Pu, Am and Np. Because of the complex spectra presented by actinide elements using electron probe microanalysis (EPMA), it is necessary to have relatively pure actinide element standards to facilitate overlap correction and accurate quantitation. Synthesis of actinide oxide standards is complicated by their multiple oxidation states, which can result in inhomogeneous standards or standards that are not stable at atmospheric conditions. Synthesis of PuPO4 results in a specimen that exhibits stable oxidation-reduction chemistry and is sufficiently homogeneous to serve as an EPMA standard. This approach shows promise as a method for producing viable actinide standards for microanalysis.

  5. Use of a scanning optical profilometer for toolmark characterization

    NASA Astrophysics Data System (ADS)

    Chumbley, L. S.; Eisenmann, D. J.; Morris, M.; Zhang, S.; Craft, J.; Fisher, C.; Saxton, A.

    2009-05-01

    An optical profilometer has been used to obtain 3-dimensional data for use in two research projects concerning toolmark quantification and identification. In the first study quantitative comparisons between toolmarks made using data from the optical system proved superior to similar data obtained using a stylus profilometer. In the second study the ability of the instrument to obtain accurate data from two surfaces intersecting at a high angle (approximately 90 degrees) is demonstrated by obtaining measurements from the tip of a flat screwdriver. The data obtained was used to produce a computer generated "virtual tool," which was then employed to create "virtual tool marks." How these experiments were conducted and the results obtained will be presented and discussed.

  6. [A new method of processing quantitative PCR data].

    PubMed

    Ke, Bing-Shen; Li, Guang-Yun; Chen, Shi-Min; Huang, Xiang-Yan; Chen, Ying-Jian; Xu, Jun

    2003-05-01

Standard PCR can no longer satisfy the needs of biotechnique development and clinical research. After extensive kinetic studies, the PE company found a linear relation between the initial template number and the cycle number at which the accumulating fluorescent product becomes detectable; on this basis they developed the quantitative PCR technique used in the PE7700 and PE5700. However, the error of this technique is too great to satisfy the needs of biotechnique development and clinical research, and a better quantitative PCR technique is needed. The mathematical model submitted here draws on the achievements of the relevant sciences and is based on the PCR principle and a careful analysis of the molecular relationships among the main members of the PCR reaction system. The model describes the functional relation between product quantity (or fluorescence intensity), initial template number, and the other reaction conditions, and accurately reflects the accumulation kinetics of the PCR product. Accurate quantitative PCR analysis can be performed using this functional relation: the accumulated PCR product quantity can be obtained from the initial template number. When this model is used for quantitative PCR analysis, the result error is related only to the accuracy of the fluorescence intensity measurement, i.e., to the instrument used. For example, when the fluorescence intensity is accurate to 6 digits and the template size is between 100 and 1,000,000, the quantitative accuracy exceeds 99%. Under the same conditions and on the same instrument, different analysis methods give distinctly different errors; processing the data with this quantitative PCR analysis system yields results about 80 times more accurate than the CT method.
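
    For orientation, the conventional threshold-cycle (CT) quantitation that the authors compare against rests on the exponential model N_c = N_0(1 + E)^c, so the threshold cycle is linear in log N_0. A minimal Python sketch of that textbook relation (the authors' improved model is not reproduced here):

      def template_from_ct(ct, ct_ref, n0_ref, efficiency=0.95):
          """CT-method estimate: N_c = N0 * (1 + E)**c implies
          N0 = N0_ref * (1 + E)**(ct_ref - ct)."""
          return n0_ref * (1.0 + efficiency) ** (ct_ref - ct)

      # a sample crossing threshold 3.3 cycles before a 1e4-copy standard
      print(template_from_ct(ct=21.7, ct_ref=25.0, n0_ref=1e4))  # ~9.1e4 copies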

  7. Three-dimensional segmentation of luminal and adventitial borders in serial intravascular ultrasound images

    NASA Technical Reports Server (NTRS)

    Shekhar, R.; Cothren, R. M.; Vince, D. G.; Chandra, S.; Thomas, J. D.; Cornhill, J. F.

    1999-01-01

    Intravascular ultrasound (IVUS) provides exact anatomy of arteries, allowing accurate quantitative analysis. Automated segmentation of IVUS images is a prerequisite for routine quantitative analyses. We present a new three-dimensional (3D) segmentation technique, called active surface segmentation, which detects luminal and adventitial borders in IVUS pullback examinations of coronary arteries. The technique was validated against expert tracings by computing correlation coefficients (range 0.83-0.97) and William's index values (range 0.37-0.66). The technique was statistically accurate, robust to image artifacts, and capable of segmenting a large number of images rapidly. Active surface segmentation enabled geometrically accurate 3D reconstruction and visualization of coronary arteries and volumetric measurements.

  8. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions free of aberration. However, it requires field-of-view (FoV) scanning, often relying on mechanical translation, which not only slows down the measurement but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. To achieve highly accurate quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE, with the large-FoV scanning controlled by on/off state coding of the DMD. Measurements were implemented using biological samples as well as a USAF resolution target, demonstrating high-resolution quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed the DMD-based PIE technique provides a potential solution for medical observation and measurements.

  9. Calibration and data collection protocols for reliable lattice parameter values in electron pair distribution function studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun

    2015-01-30

    Different protocols for calibrating electron pair distribution function (ePDF) measurements are explored and described for quantitative studies on nanomaterials. It is found that the most accurate approach to determine the camera length is to use a standard calibration sample of Au nanoparticles from the National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.

  10. Calibration and data collection protocols for reliable lattice parameter values in electron pair distribution function (ePDF) studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abeykoon, A. M. Milinda; Hu, Hefei; Wu, Lijun

    2015-02-01

    We explore and describe different protocols for calibrating electron pair distribution function (ePDF) measurements for quantitative studies on nano-materials. We find the most accurate approach to determine the camera-length is to use a standard calibration sample of Au nanoparticles from National Institute of Standards and Technology. Different protocols for data collection are also explored, as are possible operational errors, to find the best approaches for accurate data collection for quantitative ePDF studies.

  11. Accurate quantitation standards of glutathione via traceable sulfur measurement by inductively coupled plasma optical emission spectrometry and ion chromatography

    PubMed Central

    Rastogi, L.; Dash, K.; Arunachalam, J.

    2013-01-01

The quantitative analysis of glutathione (GSH) is important in fields such as medicine, biology, and biotechnology. Accurate quantitative measurements of this analyte have been hampered by the lack of well characterized reference standards. The proposed procedure is intended to provide an accurate and definitive method for the quantitation of GSH for reference measurements. Measurement of the stoichiometric sulfur content in purified GSH offers an approach for its quantitation, and calibration against an appropriately characterized reference material (CRM) for sulfur provides a methodology for the certification of GSH quantity that is traceable to the SI (International System of Units). The inductively coupled plasma optical emission spectrometry (ICP-OES) approach negates the need for any sample digestion. The sulfur content of the purified GSH is quantitatively converted into sulfate ions by microwave-assisted UV digestion in the presence of hydrogen peroxide prior to ion chromatography (IC) measurements. The measurement of sulfur by ICP-OES and IC (as sulfate) using the "high performance" methodology could be useful for characterizing primary calibration standards and certified reference materials with low uncertainties. The relative expanded uncertainties (% U) expressed at the 95% confidence interval for ICP-OES analyses varied from 0.1% to 0.3%, while in the case of IC they were between 0.2% and 1.2%. The described methods are more suitable for characterizing primary calibration standards and certifying reference materials of GSH than for routine measurements. PMID:29403814
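
    The traceability argument above rests on simple stoichiometry: glutathione (C10H17N3O6S) contains exactly one sulfur atom, so the moles of GSH equal the moles of sulfur measured (as sulfate) after digestion. A worked sketch in Python with standard atomic weights:

      M_GSH = 307.32  # g/mol, glutathione C10H17N3O6S
      M_SO4 = 96.06   # g/mol, sulfate

      def gsh_mass_from_sulfate(m_sulfate_mg):
          """1:1 S:GSH stoichiometry -> n(GSH) = n(sulfate) after UV digestion."""
          n_mmol = m_sulfate_mg / M_SO4  # mmol sulfate == mmol GSH
          return n_mmol * M_GSH          # mg GSH

      # 0.3125 mg of recovered sulfate corresponds to ~1.000 mg GSH
      print(round(gsh_mass_from_sulfate(0.3125), 3))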

  12. Quantitative evaluation of dual-flip-angle T1 mapping on DCE-MRI kinetic parameter estimation in head and neck

    PubMed Central

    Chow, Steven Kwok Keung; Yeung, David Ka Wai; Ahuja, Anil T; King, Ann D

    2012-01-01

Purpose To quantitatively evaluate the kinetic parameter estimation for head and neck (HN) dynamic contrast-enhanced (DCE) MRI with dual-flip-angle (DFA) T1 mapping. Materials and methods Clinical DCE-MRI datasets of 23 patients with HN tumors were included in this study. T1 maps were generated based on the multiple-flip-angle (MFA) method and different DFA combinations. Tofts model parameter maps of kep, Ktrans and vp based on MFA and DFAs were calculated and compared. Fitted parameters from MFA and DFAs were quantitatively evaluated in primary tumor, salivary gland, and muscle. Results T1 mapping deviations from DFAs produced considerable deviations in kinetic parameter estimation in head and neck tissues. In particular, the DFA of [2°, 7°] overestimated, while [7°, 12°] and [7°, 15°] underestimated, Ktrans and vp significantly (P<0.01). [2°, 15°] achieved the smallest, but still statistically significant, overestimation for Ktrans and vp in primary tumors: 32.1% and 16.2%, respectively. kep fitting results by DFAs were relatively close to the MFA reference compared with Ktrans and vp. Conclusions T1 deviations induced by DFA could result in significant errors in kinetic parameter estimation, particularly of Ktrans and vp, through Tofts model fitting. The MFA method should be more reliable and robust for accurate quantitative pharmacokinetic analysis in head and neck. PMID:23289084
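
    For context, DFA T1 mapping uses the spoiled gradient-echo signal S(α) = M0 sin α (1 − E1)/(1 − E1 cos α) with E1 = exp(−TR/T1); plotting S/sin α against S/tan α for two flip angles gives a line of slope E1. A minimal Python sketch on noiseless synthetic values (not patient data):

      import numpy as np

      def t1_dual_flip_angle(s1, s2, a1, a2, tr):
          """DESPOT1-style T1: the points (S/tan a, S/sin a) lie on a line
          whose slope is E1 = exp(-TR/T1)."""
          x1, x2 = s1 / np.tan(a1), s2 / np.tan(a2)
          y1, y2 = s1 / np.sin(a1), s2 / np.sin(a2)
          e1 = (y2 - y1) / (x2 - x1)
          return -tr / np.log(e1)

      tr, t1, m0 = 5.0, 1000.0, 1.0  # ms, ms, arbitrary units
      e1 = np.exp(-tr / t1)
      spgr = lambda a: m0 * np.sin(a) * (1 - e1) / (1 - e1 * np.cos(a))
      a1, a2 = np.deg2rad(2), np.deg2rad(15)
      print(t1_dual_flip_angle(spgr(a1), spgr(a2), a1, a2, tr))  # ~1000.0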

  13. The widespread distribution of a Group I alkenone-producing haptophyte: Implications for quantitative temperature reconstructions

    NASA Astrophysics Data System (ADS)

    Richter, N.; Longo, W. M.; Amaral-Zettler, L. A.; Huang, Y.

    2016-12-01

Isochrysidales haptophytes uniquely produce unsaturated long-chain ketones called alkenones that are commonly applied to marine paleoclimate records. Recent efforts are extending alkenones as temperature proxies to continental environments; however, these systems are more complex due to the greater diversity of haptophyte species they host. Saline lakes, for instance, often contain multiple alkenone-producing species, making it difficult to obtain quantitative paleotemperature estimates. Recent findings point to the ubiquity of a distinct alkenone-producing Group I haptophyte that dominates the alkenones in freshwater, alkaline lakes. The purpose of this study was to confirm the presence of the Group I haptophyte in a suite of global freshwater, alkaline lakes that contain its alkenone signature: dominant C37:4 alkenones and tri-unsaturated ketone isomers. We have identified this signature in numerous lakes from North America, Europe, Asia, and the North Atlantic Islands. We have surveyed surface lake sediments for Group I haptophyte phylotypes using next-generation DNA amplicon sequencing targeting the hypervariable regions in the large- and small-subunit ribosomal RNA genes. In addition, we used five lakes with distinct limnic and catchment characteristics from the North Slope of Alaska as model systems to monitor the lake conditions that induce Group I haptophyte blooms. We collected multiple water column and surface sediment samples for alkenone and DNA analyses to track changes in haptophytes during the spring season. Lake ice-cover records and water column profiles were used to monitor changes in lake stratification and isothermal mixing. These data will strengthen the springtime temperature calibration reported in a previous study by our group, thereby validating an accurate method for continental temperature reconstructions.

  14. Correction for isotopic interferences between analyte and internal standard in quantitative mass spectrometry by a nonlinear calibration function.

    PubMed

    Rule, Geoffrey S; Clark, Zlatuse D; Yue, Bingfang; Rockwood, Alan L

    2013-04-16

    Stable isotope-labeled internal standards are of great utility in providing accurate quantitation in mass spectrometry (MS). An implicit assumption has been that there is no "cross talk" between signals of the internal standard and the target analyte. In some cases, however, naturally occurring isotopes of the analyte do contribute to the signal of the internal standard. This phenomenon becomes more pronounced for isotopically rich compounds, such as those containing sulfur, chlorine, or bromine, higher molecular weight compounds, and those at high analyte/internal standard concentration ratio. This can create nonlinear calibration behavior that may bias quantitative results. Here, we propose the use of a nonlinear but more accurate fitting of data for these situations that incorporates one or two constants determined experimentally for each analyte/internal standard combination and an adjustable calibration parameter. This fitting provides more accurate quantitation in MS-based assays where contributions from analyte to stable labeled internal standard signal exist. It can also correct for the reverse situation where an analyte is present in the internal standard as an impurity. The practical utility of this approach is described, and by using experimental data, the approach is compared to alternative fits.
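
    A hedged illustration of the cross-talk effect above: if a fraction k of the analyte signal spills into the internal-standard channel, the measured area ratio behaves like R(x) = x/(c + kx), which flattens at high analyte/IS ratios. The functional form here is an illustrative model, not the paper's exact calibration function; fitting and inverting it in Python:

      import numpy as np
      from scipy.optimize import curve_fit

      def response(x, c, k):
          """x: analyte amount; c: IS response term; k: analyte->IS cross talk."""
          return x / (c + k * x)

      x = np.array([1, 2, 5, 10, 20, 50, 100.0])
      r = response(x, c=10.0, k=0.05)                # synthetic calibration data
      (c_fit, k_fit), _ = curve_fit(response, x, r, p0=(1.0, 0.0))

      r_meas = 0.8                                   # measured ratio of an unknown
      x_est = r_meas * c_fit / (1 - r_meas * k_fit)  # invert R = x/(c + kx)
      print(c_fit, k_fit, x_est)                     # ~10.0, ~0.05, ~8.33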

  15. High Resolution Gamma Ray Analysis of Medical Isotopes

    NASA Astrophysics Data System (ADS)

    Chillery, Thomas

    2015-10-01

    Compton-suppressed high-purity Germanium detectors at the University of Massachusetts Lowell have been used to study medical radioisotopes produced at Brookhaven Linac Isotope Producer (BLIP), in particular isotopes such as Pt-191 used for cancer therapy in patients. The ability to precisely analyze the concentrations of such radio-isotopes is essential for both production facilities such as Brookhaven and consumer hospitals across the U.S. Without accurate knowledge of the quantities and strengths of these isotopes, it is possible for doctors to administer incorrect dosages to patients, thus leading to undesired results. Samples have been produced at Brookhaven and shipped to UML, and the advanced electronics and data acquisition capabilities at UML have been used to extract peak areas in the gamma decay spectra. Levels of Pt isotopes in diluted samples have been quantified, and reaction cross-sections deduced from the irradiation parameters. These provide both cross checks with published work, as well as a rigorous quantitative framework with high quality state-of-the-art detection apparatus in use in the experimental nuclear physics community.

  16. Accurate virus quantitation using a Scanning Transmission Electron Microscopy (STEM) detector in a scanning electron microscope.

    PubMed

    Blancett, Candace D; Fetterer, David P; Koistinen, Keith A; Morazzani, Elaine M; Monninger, Mitchell K; Piper, Ashley E; Kuehl, Kathleen A; Kearney, Brian J; Norris, Sarah L; Rossi, Cynthia A; Glass, Pamela J; Sun, Mei G

    2017-10-01

    A method for accurate quantitation of virus particles has long been sought, but a perfect method still eludes the scientific community. Electron Microscopy (EM) quantitation is a valuable technique because it provides direct morphology information and counts of all viral particles, whether or not they are infectious. In the past, EM negative stain quantitation methods have been cited as inaccurate, non-reproducible, and with detection limits that were too high to be useful. To improve accuracy and reproducibility, we have developed a method termed Scanning Transmission Electron Microscopy - Virus Quantitation (STEM-VQ), which simplifies sample preparation and uses a high throughput STEM detector in a Scanning Electron Microscope (SEM) coupled with commercially available software. In this paper, we demonstrate STEM-VQ with an alphavirus stock preparation to present the method's accuracy and reproducibility, including a comparison of STEM-VQ to viral plaque assay and the ViroCyt Virus Counter. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Rigour in quantitative research.

    PubMed

    Claydon, Leica Sarah

    2015-07-22

    This article which forms part of the research series addresses scientific rigour in quantitative research. It explores the basis and use of quantitative research and the nature of scientific rigour. It examines how the reader may determine whether quantitative research results are accurate, the questions that should be asked to determine accuracy and the checklists that may be used in this process. Quantitative research has advantages in nursing, since it can provide numerical data to help answer questions encountered in everyday practice.

  18. Optimization of metabolite basis sets prior to quantitation in magnetic resonance spectroscopy: an approach based on quantum mechanics

    NASA Astrophysics Data System (ADS)

    Lazariev, A.; Allouche, A.-R.; Aubert-Frécon, M.; Fauvelle, F.; Piotto, M.; Elbayed, K.; Namer, I.-J.; van Ormondt, D.; Graveron-Demilly, D.

    2011-11-01

    High-resolution magic angle spinning (HRMAS) nuclear magnetic resonance (NMR) is playing an increasingly important role for diagnosis. This technique enables setting up metabolite profiles of ex vivo pathological and healthy tissue. The need to monitor diseases and pharmaceutical follow-up requires an automatic quantitation of HRMAS 1H signals. However, for several metabolites, the values of chemical shifts of proton groups may slightly differ according to the micro-environment in the tissue or cells, in particular to its pH. This hampers the accurate estimation of the metabolite concentrations mainly when using quantitation algorithms based on a metabolite basis set: the metabolite fingerprints are not correct anymore. In this work, we propose an accurate method coupling quantum mechanical simulations and quantitation algorithms to handle basis-set changes. The proposed algorithm automatically corrects mismatches between the signals of the simulated basis set and the signal under analysis by maximizing the normalized cross-correlation between the mentioned signals. Optimized chemical shift values of the metabolites are obtained. This method, QM-QUEST, provides more robust fitting while limiting user involvement and respects the correct fingerprints of metabolites. Its efficiency is demonstrated by accurately quantitating 33 signals from tissue samples of human brains with oligodendroglioma, obtained at 11.7 tesla. The corresponding chemical shift changes of several metabolites within the series are also analyzed.
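
    The correction step described above can be sketched as a search for the chemical-shift offset that maximizes the normalized cross-correlation between a simulated metabolite signal and the measured one. A minimal Python sketch, with synthetic Gaussian peaks standing in for the quantum-mechanically simulated basis signals:

      import numpy as np

      def best_shift(measured, simulated, max_shift=50):
          """Integer-point shift of `simulated` maximizing the normalized
          cross-correlation with `measured`."""
          def ncc(a, b):
              a = (a - a.mean()) / a.std()
              b = (b - b.mean()) / b.std()
              return np.mean(a * b)
          return max(range(-max_shift, max_shift + 1),
                     key=lambda s: ncc(measured, np.roll(simulated, s)))

      x = np.arange(1024.0)
      peak = lambda c: np.exp(-0.01 * (x - c) ** 2)
      measured = peak(519)       # metabolite line displaced by pH effects
      simulated = peak(512)      # basis-set fingerprint
      print(best_shift(measured, simulated))  # 7 points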

  19. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe.

    PubMed

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-09

Accurate quantitation of intracellular pH (pHi) is of great importance in revealing the cellular activities and early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Till now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding the accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as pHi response and self-ratiometric reference signal, respectively. This direct quantitative sensing approach has circumvented the traditional software-based subsequent processing of images which may lead to relatively large uncertainty of the results. Due to efficient FRET and fluorescence background free, a highly-sensitive and accurate sensing has been achieved, featured by a 3.56 per unit change in pHi value over 3.0-7.0 with deviation less than 0.43. This approach shall facilitate research in pHi-related areas and the development of intracellular drug delivery systems.
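
    With the reported sensitivity of 3.56 ratio units per pH unit over pH 3.0-7.0, the readout reduces to a linear calibration of the 475 nm / 645 nm band ratio. A minimal Python sketch; the calibration anchor (R0 at pH 3.0) is an illustrative assumption, not a value from the paper:

      SENSITIVITY = 3.56   # ratio change per pH unit (reported)
      R0, PH0 = 1.0, 3.0   # assumed calibration anchor (illustrative)

      def ph_from_ratio(i475, i645):
          """Self-ratiometric readout: the 645 nm reference band cancels
          probe-concentration and excitation fluctuations."""
          return PH0 + (i475 / i645 - R0) / SENSITIVITY

      print(ph_from_ratio(i475=8.12, i645=1.0))  # pH ~5.0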

  20. Accurate Quantitative Sensing of Intracellular pH based on Self-ratiometric Upconversion Luminescent Nanoprobe

    NASA Astrophysics Data System (ADS)

    Li, Cuixia; Zuo, Jing; Zhang, Li; Chang, Yulei; Zhang, Youlin; Tu, Langping; Liu, Xiaomin; Xue, Bin; Li, Qiqing; Zhao, Huiying; Zhang, Hong; Kong, Xianggui

    2016-12-01

Accurate quantitation of intracellular pH (pHi) is of great importance in revealing the cellular activities and early warning of diseases. A series of fluorescence-based nano-bioprobes composed of different nanoparticles or/and dye pairs have already been developed for pHi sensing. Till now, biological auto-fluorescence background upon UV-Vis excitation and severe photo-bleaching of dyes are the two main factors impeding the accurate quantitative detection of pHi. Herein, we have developed a self-ratiometric luminescence nanoprobe based on Förster resonance energy transfer (FRET) for probing pHi, in which pH-sensitive fluorescein isothiocyanate (FITC) and upconversion nanoparticles (UCNPs) served as energy acceptor and donor, respectively. Under 980 nm excitation, upconversion emission bands at 475 nm and 645 nm of NaYF4:Yb3+, Tm3+ UCNPs were used as pHi response and self-ratiometric reference signal, respectively. This direct quantitative sensing approach has circumvented the traditional software-based subsequent processing of images which may lead to relatively large uncertainty of the results. Due to efficient FRET and fluorescence background free, a highly-sensitive and accurate sensing has been achieved, featured by a 3.56 per unit change in pHi value over 3.0-7.0 with deviation less than 0.43. This approach shall facilitate research in pHi-related areas and the development of intracellular drug delivery systems.

Analytical method for the accurate determination of trichothecenes in grains using LC-MS/MS: a comparison between MRM transition and MS3 quantitation.

    PubMed

    Lim, Chee Wei; Tai, Siew Hoon; Lee, Lin Min; Chan, Sheot Harn

    2012-07-01

The current food crisis demands unambiguous determination of mycotoxin contamination in staple foods to achieve safer food for consumption. This paper describes the first accurate LC-MS/MS method developed to analyze trichothecenes in grains by applying multiple reaction monitoring (MRM) transition and MS(3) quantitation strategies in tandem. The trichothecenes are nivalenol, deoxynivalenol, deoxynivalenol-3-glucoside, fusarenon X, 3-acetyl-deoxynivalenol, 15-acetyldeoxynivalenol, diacetoxyscirpenol, and HT-2 and T-2 toxins. Acetic acid and ammonium acetate were used to convert the analytes into their respective acetate adducts and ammonium adducts under negative and positive MS polarity conditions, respectively. The mycotoxins were separated by reversed-phase LC in a 13.5-min run, ionized using electrospray ionization, and detected by tandem mass spectrometry. Analyte-specific mass-to-charge (m/z) ratios were used to perform quantitation under MRM transition and MS(3) (linear ion trap) modes. Three experiments were performed for each quantitation mode and matrix in batches over 6 days for recovery studies. The matrix effect was investigated at concentration levels of 20, 40, 80, 120, 160, and 200 μg kg(-1) (n = 3) in 5 g corn flour and rice flour. Extraction with acetonitrile provided a good overall recovery range of 90-108% (n = 3) at three levels of spiking concentration of 40, 80, and 120 μg kg(-1). A quantitation limit of 2-6 μg kg(-1) was achieved by applying an MRM transition quantitation strategy. Under MS(3) mode, a quantitation limit of 4-10 μg kg(-1) was achieved. Relative standard deviations of 2-10% and 2-11% were reported for MRM transition and MS(3) quantitation, respectively. The successful utilization of MS(3) enabled accurate analyte fragmentation pattern matching and its quantitation, leading to the development of analytical methods in fields that demand both analyte specificity and fragmentation fingerprint-matching capabilities that are unavailable under MRM transition.

  2. Experimental Influences in the Accurate Measurement of Cartilage Thickness in MRI.

    PubMed

    Wang, Nian; Badar, Farid; Xia, Yang

    2018-01-01

Objective To study the experimental influences on the measurement of cartilage thickness by magnetic resonance imaging (MRI). Design The complete thicknesses of healthy and trypsin-degraded cartilage were measured at high resolution by MRI under different conditions, using two intensity-based imaging sequences (ultra-short echo [UTE] and multislice-multiecho [MSME]) and three quantitative relaxation imaging sequences (T1, T2, and T1ρ). Other variables included different orientations in the magnet, two soaking solutions (saline and phosphate buffered saline [PBS]), and external loading. Results With cartilage soaked in saline, the UTE and T1 methods yielded complete and consistent measurements of cartilage thickness, while the thickness measurements by the T2, T1ρ, and MSME methods were orientation dependent. The effect of external loading on cartilage thickness was also sequence and orientation dependent. All variations in cartilage thickness in MRI could be eliminated with the use of a 100 mM PBS or by imaging with the UTE sequence. Conclusions The appearance of articular cartilage and the measurement accuracy of cartilage thickness in MRI can be influenced by a number of experimental factors in ex vivo MRI, from the use of various pulse sequences and soaking solutions to the health of the tissue. T2-based imaging sequences, both proton-intensity and quantitative relaxation sequences, similarly produced the largest variations. With adequate resolution, accurate measurement of the whole cartilage tissue in clinical MRI could be utilized to detect differences between healthy and osteoarthritic cartilage after compression.

  3. Accurate single-shot quantitative phase imaging of biological specimens with telecentric digital holographic microscopy.

    PubMed

    Doblas, Ana; Sánchez-Ortiga, Emilio; Martínez-Corral, Manuel; Saavedra, Genaro; Garcia-Sucerquia, Jorge

    2014-04-01

    The advantages of using a telecentric imaging system in digital holographic microscopy (DHM) to study biological specimens are highlighted. To this end, the performances of nontelecentric DHM and telecentric DHM are evaluated from the quantitative phase imaging (QPI) point of view. The evaluated stability of the microscope allows single-shot QPI in DHM by using telecentric imaging systems. Quantitative phase maps of a section of the head of the drosophila melanogaster fly and of red blood cells are obtained via single-shot DHM with no numerical postprocessing. With these maps we show that the use of telecentric DHM provides larger field of view for a given magnification and permits more accurate QPI measurements with less number of computational operations.

  4. Stroke onset time estimation from multispectral quantitative magnetic resonance imaging in a rat model of focal permanent cerebral ischemia.

    PubMed

    McGarry, Bryony L; Rogers, Harriet J; Knight, Michael J; Jokivarsi, Kimmo T; Sierra, Alejandra; Gröhn, Olli Hj; Kauppinen, Risto A

    2016-08-01

    Quantitative T2 relaxation magnetic resonance imaging allows estimation of stroke onset time. We aimed to examine the accuracy of quantitative T1 and quantitative T2 relaxation times alone and in combination to provide estimates of stroke onset time in a rat model of permanent focal cerebral ischemia and map the spatial distribution of elevated quantitative T1 and quantitative T2 to assess tissue status. Permanent middle cerebral artery occlusion was induced in Wistar rats. Animals were scanned at 9.4T for quantitative T1, quantitative T2, and Trace of Diffusion Tensor (Dav) up to 4 h post-middle cerebral artery occlusion. Time courses of differentials of quantitative T1 and quantitative T2 in ischemic and non-ischemic contralateral brain tissue (ΔT1, ΔT2) and volumes of tissue with elevated T1 and T2 relaxation times (f1, f2) were determined. TTC staining was used to highlight permanent ischemic damage. ΔT1, ΔT2, f1, f2, and the volume of tissue with both elevated quantitative T1 and quantitative T2 (V(Overlap)) increased with time post-middle cerebral artery occlusion allowing stroke onset time to be estimated. V(Overlap) provided the most accurate estimate with an uncertainty of ±25 min. At all times-points regions with elevated relaxation times were smaller than areas with Dav defined ischemia. Stroke onset time can be determined by quantitative T1 and quantitative T2 relaxation times and tissue volumes. Combining quantitative T1 and quantitative T2 provides the most accurate estimate and potentially identifies irreversibly damaged brain tissue. © 2016 World Stroke Organization.

  5. Advanced Mass Spectrometric Methods for the Rapid and Quantitative Characterization of Proteomes

    DOE PAGES

    Smith, Richard D.

    2002-01-01

Progress is reviewed toward the development of a global strategy that aims to extend the sensitivity, dynamic range, comprehensiveness and throughput of proteomic measurements based upon the use of high performance separations and mass spectrometry. The approach uses high accuracy mass measurements from Fourier transform ion cyclotron resonance mass spectrometry (FTICR) to validate peptide 'accurate mass tags' (AMTs) produced by global protein enzymatic digestions for a specific organism, tissue or cell type from 'potential mass tags' tentatively identified using conventional tandem mass spectrometry (MS/MS). This provides the basis for subsequent measurements without the need for MS/MS. High resolution capillary liquid chromatography separations combined with high-sensitivity, high-resolution, accurate FTICR measurements are shown to be capable of characterizing peptide mixtures of more than 10^5 components. The strategy has been initially demonstrated using the microorganisms Saccharomyces cerevisiae and Deinococcus radiodurans. Advantages of the approach include the high confidence of protein identification, its broad proteome coverage, high sensitivity, and the capability for stable-isotope labeling methods for precise relative protein abundance measurements. Abbreviations: LC, liquid chromatography; FTICR, Fourier transform ion cyclotron resonance; AMT, accurate mass tag; PMT, potential mass tag; MMA, mass measurement accuracy; MS, mass spectrometry; MS/MS, tandem mass spectrometry; ppm, parts per million.

  6. Understanding the Radioactive Ingrowth and Decay of Naturally Occurring Radioactive Materials in the Environment: An Analysis of Produced Fluids from the Marcellus Shale.

    PubMed

    Nelson, Andrew W; Eitrheim, Eric S; Knight, Andrew W; May, Dustin; Mehrhoff, Marinea A; Shannon, Robert; Litman, Robert; Burnett, William C; Forbes, Tori Z; Schultz, Michael K

    2015-07-01

    The economic value of unconventional natural gas resources has stimulated rapid globalization of horizontal drilling and hydraulic fracturing. However, natural radioactivity found in the large volumes of "produced fluids" generated by these technologies is emerging as an international environmental health concern. Current assessments of the radioactivity concentration in liquid wastes focus on a single element-radium. However, the use of radium alone to predict radioactivity concentrations can greatly underestimate total levels. We investigated the contribution to radioactivity concentrations from naturally occurring radioactive materials (NORM), including uranium, thorium, actinium, radium, lead, bismuth, and polonium isotopes, to the total radioactivity of hydraulic fracturing wastes. For this study we used established methods and developed new methods designed to quantitate NORM of public health concern that may be enriched in complex brines from hydraulic fracturing wastes. Specifically, we examined the use of high-purity germanium gamma spectrometry and isotope dilution alpha spectrometry to quantitate NORM. We observed that radium decay products were initially absent from produced fluids due to differences in solubility. However, in systems closed to the release of gaseous radon, our model predicted that decay products will begin to ingrow immediately and (under these closed-system conditions) can contribute to an increase in the total radioactivity for more than 100 years. Accurate predictions of radioactivity concentrations are critical for estimating doses to potentially exposed individuals and the surrounding environment. These predictions must include an understanding of the geochemistry, decay properties, and ingrowth kinetics of radium and its decay product radionuclides.
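
    The closed-system ingrowth modeled above follows the Bateman equations; for a single parent-daughter pair with the daughter initially absent, the daughter activity grows toward equilibrium with the parent. A generic Python sketch (the nuclide pair is illustrative, not the paper's full decay-chain model):

      import numpy as np

      LN2 = np.log(2)

      def daughter_activity(a_parent0, t_half_parent, t_half_daughter, t):
          """Bateman solution for parent -> daughter, daughter initially absent:
          A_d(t) = A_p0 * ld/(ld - lp) * (exp(-lp*t) - exp(-ld*t))."""
          lp, ld = LN2 / t_half_parent, LN2 / t_half_daughter
          return a_parent0 * ld / (ld - lp) * (np.exp(-lp * t) - np.exp(-ld * t))

      # e.g. Ra-226 (t1/2 ~ 1600 y) feeding Rn-222 (t1/2 ~ 3.8 d), closed system
      t_days = np.array([1.0, 4.0, 10.0, 30.0])
      print(daughter_activity(1.0, 1600 * 365.25, 3.8, t_days))
      # ~0.17, 0.52, 0.84, 1.00 of the parent activity: total activity grows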

  7. Detection of blur artifacts in histopathological whole-slide images of endomyocardial biopsies.

    PubMed

    Hang Wu; Phan, John H; Bhatia, Ajay K; Cundiff, Caitlin A; Shehata, Bahig M; Wang, May D

    2015-01-01

    Histopathological whole-slide images (WSIs) have emerged as an objective and quantitative means for image-based disease diagnosis. However, WSIs may contain acquisition artifacts that affect downstream image feature extraction and quantitative disease diagnosis. We develop a method for detecting blur artifacts in WSIs using distributions of local blur metrics. As features, these distributions enable accurate classification of WSI regions as sharp or blurry. We evaluate our method using over 1000 portions of an endomyocardial biopsy (EMB) WSI. Results indicate that local blur metrics accurately detect blurry image regions.
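
    One common local blur metric that could serve as a feature here is the per-tile variance of the Laplacian (low values indicate blur). A hedged Python sketch of building the distribution of local metrics, standing in for whichever specific metrics the authors used:

      import numpy as np
      from scipy.ndimage import gaussian_filter, laplace

      def tile_blur_metrics(image, tile=64):
          """Variance of the Laplacian per tile; the resulting distribution
          is the feature vector for sharp/blurry classification."""
          h, w = image.shape
          return np.array([laplace(image[i:i + tile, j:j + tile]).var()
                           for i in range(0, h - tile + 1, tile)
                           for j in range(0, w - tile + 1, tile)])

      rng = np.random.default_rng(0)
      sharp = rng.random((256, 256))
      blurry = gaussian_filter(sharp, sigma=3)
      print(tile_blur_metrics(sharp).mean(), tile_blur_metrics(blurry).mean())
      # sharp tiles score orders of magnitude higher than blurry ones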

  8. Restriction Site Tiling Analysis: accurate discovery and quantitative genotyping of genome-wide polymorphisms using nucleotide arrays

    PubMed Central

    2010-01-01

    High-throughput genotype data can be used to identify genes important for local adaptation in wild populations, phenotypes in lab stocks, or disease-related traits in human medicine. Here we advance microarray-based genotyping for population genomics with Restriction Site Tiling Analysis. The approach simultaneously discovers polymorphisms and provides quantitative genotype data at 10,000s of loci. It is highly accurate and free from ascertainment bias. We apply the approach to uncover genomic differentiation in the purple sea urchin. PMID:20403197

  9. Application of ion exchange and extraction chromatography to the separation of actinium from proton-irradiated thorium metal for analytical purposes.

    PubMed

    Radchenko, V; Engle, J W; Wilson, J J; Maassen, J R; Nortier, F M; Taylor, W A; Birnbaum, E R; Hudston, L A; John, K D; Fassbender, M E

    2015-02-06

Actinium-225 (t1/2 = 9.92 d) is an α-emitting radionuclide with nuclear properties well-suited for use in targeted alpha therapy (TAT), a powerful treatment method for malignant tumors. Actinium-225 can also be utilized as a generator for (213)Bi (t1/2 = 45.6 min), which is another valuable candidate for TAT. Actinium-225 can be produced via proton irradiation of thorium metal; however, long-lived (227)Ac (t1/2 = 21.8 a, 99% β(-), 1% α) is co-produced during this process and will impact the quality of the final product. Thus, accurate assays are needed to determine the (225)Ac/(227)Ac ratio, which is dependent on beam energy, irradiation time and target design. Accurate actinium assays, in turn, require efficient separation of actinium isotopes from both the Th matrix and highly radioactive activation by-products, especially radiolanthanides formed from proton-induced fission. In this study, we introduce a novel, selective chromatographic technique for the recovery and purification of actinium isotopes from irradiated Th matrices. A two-step sequence of cation exchange and extraction chromatography was implemented. Radiolanthanides were quantitatively removed from Ac, and no non-Ac radionuclidic impurities were detected in the final Ac fraction. An (225)Ac spike added prior to separation was recovered at ≥ 98%, and Ac decontamination from Th was found to be ≥ 10(6). The purified actinium fraction allowed for highly accurate (227)Ac determination at analytical scales, i.e., at (227)Ac activities of 1-100 kBq (27 nCi to 2.7 μCi). Copyright © 2014 Elsevier B.V. All rights reserved.

  10. HIV Viral RNA Extraction in Wax Immiscible Filtration Assisted by Surface Tension (IFAST) Devices

    PubMed Central

    Berry, Scott M.; LaVanway, Alex J.; Pezzi, Hannah M.; Guckenberger, David J.; Anderson, Meghan A.; Loeb, Jennifer M.; Beebe, David J.

    2015-01-01

    The monitoring of viral load is critical for proper management of antiretroviral therapy for HIV-positive patients. Unfortunately, in the developing world, significant economic and geographical barriers exist, limiting access to this test. The complexity of current viral load assays makes them expensive and their access limited to advanced facilities. We attempted to address these limitations by replacing conventional RNA extraction, one of the essential processes in viral load quantitation, with a simplified technique known as immiscible filtration assisted by surface tension (IFAST). Furthermore, these devices were produced via the embossing of wax, enabling local populations to produce and dispose of their own devices with minimal training or infrastructure, potentially reducing the total assay cost. In addition, IFAST can be used to reduce cold chain dependence during transportation. Viral RNA extracted from raw samples stored at 37°C for 1 week exhibited nearly complete degradation. However, IFAST-purified RNA could be stored at 37°C for 1 week without significant loss. These data suggest that RNA isolated at the point of care (eg, in a rural clinic) via IFAST could be shipped to a central laboratory for quantitative RT-PCR without a cold chain. Using this technology, we have demonstrated accurate and repeatable measurements of viral load on samples with as low as 50 copies per milliliter of sample. PMID:24613822

  11. Research study demonstrates computer simulation can predict warpage and assist in its elimination

    NASA Astrophysics Data System (ADS)

    Glozer, G.; Post, S.; Ishii, K.

    1994-10-01

    Programs for predicting warpage in injection molded parts are relatively new. Commercial software for simulating the flow and cooling stages of injection molding have steadily gained acceptance; however, warpage software is not yet as readily accepted. This study focused on gaining an understanding of the predictive capabilities of the warpage software. The following aspects of this study were unique. (1) Quantitative results were found using a statistically designed set of experiments. (2) Comparisons between experimental and simulation results were made with parts produced in a well-instrumented and controlled injection molding machine. (3) The experimental parts were accurately measured on a coordinate measuring machine with a non-contact laser probe. (4) The effect of part geometry on warpage was investigated.

  12. Cloud computing approaches for prediction of ligand binding poses and pathways.

    PubMed

    Lawrenz, Morgan; Shukla, Diwakar; Pande, Vijay S

    2015-01-22

    We describe an innovative protocol for ab initio prediction of ligand crystallographic binding poses and highly effective analysis of large datasets generated for protein-ligand dynamics. We include a procedure for setup and performance of distributed molecular dynamics simulations on cloud computing architectures, a model for efficient analysis of simulation data, and a metric for evaluation of model convergence. We give accurate binding pose predictions for five ligands ranging in affinity from 7 nM to > 200 μM for the immunophilin protein FKBP12, for expedited results in cases where experimental structures are difficult to produce. Our approach goes beyond single, low energy ligand poses to give quantitative kinetic information that can inform protein engineering and ligand design.

  13. Accurate sparse-projection image reconstruction via nonlocal TV regularization.

    PubMed

    Zhang, Yi; Zhang, Weihua; Zhou, Jiliu

    2014-01-01

Sparse-projection image reconstruction is a useful approach to lowering the radiation dose; however, the incompleteness of projection data causes degeneration of imaging quality. As a typical compressive sensing method, total variation has received great attention for this problem. Owing to its theoretical limitations, total variation produces blocky effects in smooth regions and blurs edges. To overcome this problem, in this paper we introduce nonlocal total variation into sparse-projection image reconstruction and formulate the minimization problem with a new nonlocal total variation norm. The qualitative and quantitative analyses of numerical as well as clinical results demonstrate the validity of the proposed method. Compared with other existing methods, our method more efficiently suppresses artifacts caused by low-rank reconstruction and better preserves structure information.
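
    For reference, the nonlocal total variation norm commonly used in such formulations (written here in its standard weighted-graph form, which may differ in notation from the paper) is

      \[
        \mathrm{NLTV}(u) \;=\; \sum_{x}\Big(\sum_{y} w(x,y)\,\bigl(u(y)-u(x)\bigr)^{2}\Big)^{1/2},
        \qquad
        \min_{u}\ \mathrm{NLTV}(u)\ \ \text{s.t.}\ \ \|Pu-f\|_{2}^{2}\le\sigma^{2},
      \]

    where w(x, y) are patch-similarity weights, P is the projection operator, and f is the measured sparse projection data; the weights let the regularizer average over similar patches anywhere in the image rather than only over adjacent pixels, which is what suppresses the blocky artifacts of local TV.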

  14. Considerations for potency equivalent calculations in the Ah receptor-based CALUX bioassay: Normalization of superinduction results for improved sample potency estimation

    PubMed Central

    Baston, David S.; Denison, Michael S.

    2011-01-01

    The chemically activated luciferase expression (CALUX) system is a mechanistically based recombinant luciferase reporter gene cell bioassay used in combination with chemical extraction and clean-up methods for the detection and relative quantitation of 2,3,7,8-tetrachlorodibenzo-p-dioxin and related dioxin-like halogenated aromatic hydrocarbons in a wide variety of sample matrices. While sample extracts containing complex mixtures of chemicals can produce a variety of distinct concentration-dependent luciferase induction responses in CALUX cells, these effects are produced through a common mechanism of action (i.e. the Ah receptor (AhR)) allowing normalization of results and sample potency determination. Here we describe the diversity in CALUX response to PCDD/Fs from sediment and soil extracts and not only report the occurrence of superinduction of the CALUX bioassay, but we describe a mechanistically based approach for normalization of superinduction data that results in a more accurate estimation of the relative potency of such sample extracts. PMID:21238730

  15. Development of a computationally-designed polymeric adsorbent specific for mycotoxin patulin.

    PubMed

    Piletska, Elena V; Pink, Demi; Karim, Kal; Piletsky, Sergey A

    2017-12-04

    Patulin is a toxic compound found predominantly in apples affected by mould rot. Since apples and apple-containing products are a popular food for the elderly, children, and babies, monitoring of the toxin is crucial. This paper describes the development of a computationally-designed polymeric adsorbent for the solid-phase extraction of patulin, which provides an effective clean-up of food samples and allows the detection and accurate quantification of patulin levels in apple juice using conventional chromatography methods. The developed bespoke polymer demonstrates quantitative binding of the patulin present in undiluted apple juice. The polymer is inexpensive and easy to mass-produce. The key contributing factor to the function of the adsorbent is the combination of acidic and basic functional monomers, which produces a zwitterionic complex in solution that forms stronger binding complexes with the patulin molecule. The protocols described in this paper provide a blueprint for the development of polymeric adsorbents for other toxins or different food matrices.

  16. Computer-assisted revision total knee replacement.

    PubMed

    Sikorski, J M

    2004-05-01

    A technique for performing allograft-augmented revision total knee replacement (TKR) using computer assistance is described, based on the results in 14 patients. Bone deficits were made up with impaction grafting. Femoral grafting was made possible by the construction of a retaining wall, or dam, which allowed pressurisation and retention of the graft. Tibial grafting used a mixture of corticocancellous and morsellised allograft. The position of the implants was monitored by the computer system and adjusted while the cement was setting. The outcome was determined using a six-parameter quantitative technique (the Perth CT protocol), which measured the alignment of the prosthesis and provided an objective score. The final outcomes were not perfect, with errors in femoral rotation and a mismatch between the femoral and tibial components. In spite of these shortcomings, the alignments were comparable in accuracy with those after primary TKR. Computer assistance shows considerable promise in producing accurate alignment in revision TKR with bone deficits.

  17. Understanding Longitudinal Wood Fiber Ultra-structure for Producing Cellulose Nanofibrils Using Disk Milling with Diluted Acid Prehydrolysis

    NASA Astrophysics Data System (ADS)

    Qin, Yanlin; Qiu, Xueqing; Zhu, J. Y.

    2016-10-01

    Here we used dilute oxalic acid to pretreat kraft bleached Eucalyptus pulp (BEP) fibers to facilitate mechanical fibrillation in producing cellulose nanofibrils by disk milling, with substantial mechanical energy savings. We successfully applied a reaction-kinetics-based combined hydrolysis factor (CHFX) as a severity factor to quantitatively control xylan dissolution and BEP fibril depolymerization. More importantly, we were able to accurately predict the degree of polymerization (DP) of disk-milled fibrils using CHFX and milling time or milling energy consumption. The experimentally determined ratio of fibril DP to number-mean fibril height (diameter d), DP/d, a measure of aspect ratio, was independent of the processing conditions. We therefore hypothesize that cellulose has a longitudinal hierarchical structure, as in the lateral direction. Acid hydrolysis and milling did not substantially cut the “natural” chain length of cellulose fibrils. This longitudinal hierarchical model of cellulose supports the use of weak acid hydrolysis in the production of cellulose nanofibrils with substantially reduced energy input and without negatively affecting fibril mechanical strength.

  18. Using GPS To Teach More Than Accurate Positions.

    ERIC Educational Resources Information Center

    Johnson, Marie C.; Guth, Peter L.

    2002-01-01

    Undergraduate science majors need practice in critical thinking, quantitative analysis, and judging whether their calculated answers are physically reasonable. Develops exercises using handheld Global Positioning System (GPS) receivers. Reinforces students' abilities to think quantitatively, make realistic "back of the envelope"…

  19. A Smoluchowski model of crystallization dynamics of small colloidal clusters

    NASA Astrophysics Data System (ADS)

    Beltran-Villegas, Daniel J.; Sehgal, Ray M.; Maroudas, Dimitrios; Ford, David M.; Bevan, Michael A.

    2011-10-01

    We investigate the dynamics of colloidal crystallization in a 32-particle system at a fixed value of interparticle depletion attraction that produces coexisting fluid and solid phases. Free energy landscapes (FELs) and diffusivity landscapes (DLs) are obtained as coefficients of 1D Smoluchowski equations using as order parameters either the radius of gyration or the average crystallinity. FELs and DLs are estimated by fitting the Smoluchowski equations to Brownian dynamics (BD) simulations using either linear fits to locally initiated trajectories or global fits to unbiased trajectories using Bayesian inference. The resulting FELs are compared to Monte Carlo Umbrella Sampling results. The accuracy of the FELs and DLs for modeling colloidal crystallization dynamics is evaluated by comparing mean first-passage times from BD simulations with analytical predictions using the FEL and DL models. While the 1D models accurately capture dynamics near the free energy minimum fluid and crystal configurations, predictions near the transition region are not quantitatively accurate. A preliminary investigation of ensemble averaged 2D order parameter trajectories suggests that 2D models are required to capture crystallization dynamics in the transition region.
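
    For readers who want the arithmetic behind the mean first-passage-time comparison, the sketch below evaluates the standard double-integral MFPT expression for a 1D Smoluchowski equation from tabulated free energy and diffusivity landscapes, assuming a reflecting boundary at the left edge and an absorbing boundary at the right. The double-well landscape and constant diffusivity are illustrative placeholders, not the study's fitted coefficients.

        # Minimal sketch: analytical mean first-passage time from a 1D free
        # energy landscape F(x) and diffusivity landscape D(x), of the kind
        # used to test Smoluchowski models against Brownian dynamics.
        # Energies are in units of kT.
        import numpy as np

        def mfpt(x, F, D):
            """MFPT from x[0] (reflecting) to x[-1] (absorbing) by double
            integration of the 1D Smoluchowski equation."""
            boltz = np.exp(-F)                   # equilibrium weight e^{-F/kT}
            inner = np.concatenate(([0.0],
                     np.cumsum(0.5 * (boltz[1:] + boltz[:-1]) * np.diff(x))))
            integrand = np.exp(F) / D * inner
            return np.trapz(integrand, x)

        # Hypothetical double-well landscape with constant diffusivity:
        x = np.linspace(-2, 2, 400)
        F = 4 * (x**2 - 1) ** 2                  # ~4 kT barrier at x = 0
        D = np.full_like(x, 1.0)
        print(f"MFPT over the barrier: {mfpt(x, F, D):.1f}")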

  20. Trofile HIV co-receptor usage assay.

    PubMed

    Low, Andrew J; McGovern, Rachel A; Harrigan, P Richard

    2009-03-01

    The introduction of CCR5 antagonists increases the options available for constructing therapeutic drug regimens for HIV-positive patients. However, as these drugs do not inhibit HIV variants that use the CXCR4 co-receptor, a pretreatment test is required to accurately determine HIV co-receptor usage (tropism) before initiating CCR5 antagonist-based therapy. Here we discuss the Monogram Trofile assay as a diagnostic tool for determining HIV tropism, critically reviewing the reported literature and available data. Monogram Trofile has become, largely by default, the de facto standard HIV tropism assay. However, there is significant room for improvement in the speed, cost and availability of the test. Furthermore, the test is not quantitative, requires high-input HIV RNA viral loads, and produces results that are less biologically stable than expected. These technical considerations may limit the use of CCR5 antagonists in therapy. Nevertheless, this test is likely to remain the most widely used tropism diagnostic in the short term. We expect that a more practical and possibly more accurate method for measuring HIV tropism can be developed.

  1. Highly predictive and interpretable models for PAMPA permeability.

    PubMed

    Sun, Hongmao; Nguyen, Kimloan; Kerns, Edward; Yan, Zhengyin; Yu, Kyeong Ri; Shah, Pranav; Jadhav, Ajit; Xu, Xin

    2017-02-01

    Cell membrane permeability is an important determinant of the oral absorption and bioavailability of a drug molecule. An in silico model predicting drug permeability is described, built on a large permeability dataset of 7488 compound entries, or 5435 structurally unique molecules, measured by the same laboratory using the parallel artificial membrane permeability assay (PAMPA). On the basis of customized molecular descriptors, the support vector regression (SVR) model trained on 4071 compounds with quantitative data was able to predict the remaining 1364 compounds with qualitative data with an area under the receiver operating characteristic curve (AUC-ROC) of 0.90. The support vector classification (SVC) model trained on half of the whole dataset, comprising both quantitative and qualitative data, produced accurate predictions for the remaining data with an AUC-ROC of 0.88. The results suggest that the developed SVR model is highly predictive and provides medicinal chemists with a useful in silico tool to facilitate the design and synthesis of novel compounds with optimal drug-like properties, and thus accelerate lead optimization in drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
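
    A minimal sketch of this modelling workflow, using synthetic descriptors rather than the paper's customized ones: train a support vector regressor on compounds with quantitative values, then score a held-out set with qualitative (binary) labels via AUC-ROC. The dataset sizes mirror the abstract; everything else is invented.

        # Minimal sketch of the SVR workflow on synthetic data: regress on
        # quantitative permeability values, evaluate on binary labels.
        import numpy as np
        from sklearn.svm import SVR
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(42)
        X_train = rng.normal(size=(4071, 20))       # stand-in descriptors
        y_train = X_train[:, :3].sum(axis=1) + rng.normal(scale=0.3, size=4071)

        X_test = rng.normal(size=(1364, 20))
        y_test_binary = (X_test[:, :3].sum(axis=1) > 0).astype(int)

        model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X_train, y_train)
        scores = model.predict(X_test)              # continuous predictions
        print(f"AUC-ROC: {roc_auc_score(y_test_binary, scores):.2f}")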

  2. Topex/Poseidon: A United States/France mission. Oceanography from space: The oceans and climate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The TOPEX/POSEIDON space mission, sponsored by NASA and France's space agency, the Centre National d'Etudes Spatiales (CNES), will give new observations of the Earth from space to gain a quantitative understanding of the role of ocean currents in climate change. Rising atmospheric concentrations of carbon dioxide and other 'greenhouse gases' produced as a result of human activities could generate a global warming, followed by an associated rise in sea level. The satellite will use radar altimetry to measure sea-surface height and will be tracked by three independent systems to yield accurate topographic maps over the dimensions of entire ocean basins. The satellite data, together with the Tropical Ocean and Global Atmosphere (TOGA) program and the World Ocean Circulation Experiment (WOCE) measurements, will be analyzed by an international scientific team. By merging the satellite observations with TOGA and WOCE findings, the scientists will establish the extensive data base needed for the quantitative description and computer modeling of ocean circulation. The ocean models will eventually be coupled with atmospheric models to lay the foundation for predictions of global climate change.

  3. Adduct simplification in the analysis of cyanobacterial toxins by matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Howard, Karen L; Boyer, Gregory L

    2007-01-01

    A novel method for simplifying adduct patterns to improve the detection and identification of peptide toxins using matrix-assisted laser desorption/ionization (MALDI) time-of-flight (TOF) mass spectrometry is presented. Addition of 200 μM zinc sulfate heptahydrate (ZnSO4·7H2O) to samples prior to spotting on the target enhances detection of the protonated molecule while suppressing competing adducts. This produces a highly simplified spectrum with the potential to enhance quantitative analysis, particularly for complex samples. The resulting improvement in total signal strength and reduction in the coefficient of variation (from 31.1% to 5.2% for microcystin-LR) further enhance the potential for sensitive and accurate quantitation. Other potential additives tested, including 18-crown-6 ether, alkali metal salts (lithium chloride, sodium chloride, potassium chloride), and other transition metal salts (silver chloride, silver nitrate, copper(II) nitrate, copper(II) sulfate, zinc acetate), were unable to achieve comparable results. Application of this technique to the analysis of several microcystins, potent peptide hepatotoxins from cyanobacteria, is illustrated. Copyright © 2007 John Wiley & Sons, Ltd.

  4. Tool independence for the Web Accessibility Quantitative Metric.

    PubMed

    Vigo, Markel; Brajnik, Giorgio; Arrue, Myriam; Abascal, Julio

    2009-07-01

    The Web Accessibility Quantitative Metric (WAQM) aims at accurately measuring the accessibility of web pages. One of the main features of WAQM is that it is evaluation-tool independent for ranking and accessibility-monitoring scenarios. This article proposes a method to attain evaluation-tool independence for all foreseeable scenarios. After demonstrating that homepages have a more similar error profile than any other web page in a given web site, 15 homepages were measured with 10,000 different values of WAQM parameters using EvalAccess and LIFT, two automatic evaluation tools for accessibility. A similar procedure was followed with random pages and with several test files, obtaining several tuples that minimise the difference between both tools. One thousand four hundred forty-nine web pages from 15 web sites were measured with these tuples, and the values that minimised the difference between the tools were selected. Once the WAQM was tuned, the accessibility of 15 web sites was measured with two metrics for web sites, concluding that even if similar values can be produced, obtaining identical scores is not feasible, since evaluation tools behave differently.

  5. Quantifying Golgi structure using EM: combining volume-SEM and stereology for higher throughput.

    PubMed

    Ferguson, Sophie; Steyer, Anna M; Mayhew, Terry M; Schwab, Yannick; Lucocq, John Milton

    2017-06-01

    Investigating organelles such as the Golgi complex depends increasingly on high-throughput quantitative morphological analyses from multiple experimental or genetic conditions. Light microscopy (LM) has been an effective tool for screening but fails to reveal fine details of Golgi structures such as vesicles, tubules and cisternae. Electron microscopy (EM) has sufficient resolution but traditional transmission EM (TEM) methods are slow and inefficient. Newer volume scanning EM (volume-SEM) methods now have the potential to speed up 3D analysis by automated sectioning and imaging. However, they produce large arrays of sections and/or images, which require labour-intensive 3D reconstruction, restricting quantitation to limited cell numbers. Here, we show that the information storage, digital waste and workload involved in using volume-SEM can be reduced substantially using sampling-based stereology. Using the Golgi as an example, we describe how Golgi populations can be sensed quantitatively using single random slices and how accurate quantitative structural data on Golgi organelles of individual cells can be obtained using only 5-10 sections/images taken from a volume-SEM series (thereby sensing population parameters and cell-to-cell variability). The approach will be useful in techniques such as correlative LM and EM (CLEM), where small samples of cells are treated and where there may be variable responses. For Golgi study, we outline a series of stereological estimators that are suited to these analyses and suggest workflows, which have the potential to enhance the speed and relevance of data acquisition in volume-SEM.
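
    As one concrete example of the sampling-based stereology invoked above (the specific estimators used by the authors are not given in this abstract), the sketch below applies the classical Cavalieri/point-counting estimator, V = t · a(p) · ΣP, to a handful of systematically sampled sections; all counts and calibration values are hypothetical.

        # Minimal sketch of a classical stereological estimator (Cavalieri /
        # point counting): organelle volume from a small systematic sample of
        # sections, where t is the section spacing, a(p) the area associated
        # with each test point, and P_i the points hitting the structure.
        def cavalieri_volume(points_per_section, spacing_um, area_per_point_um2):
            """Volume estimate (um^3) from systematic uniform random sections."""
            return spacing_um * area_per_point_um2 * sum(points_per_section)

        # Hypothetical counts from 7 sections of a volume-SEM series:
        counts = [12, 18, 25, 31, 22, 15, 9]
        v = cavalieri_volume(counts, spacing_um=0.5, area_per_point_um2=0.04)
        print(f"estimated Golgi volume: {v:.2f} um^3")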

  6. Supramolecular assembly affording a ratiometric two-photon fluorescent nanoprobe for quantitative detection and bioimaging.

    PubMed

    Wang, Peng; Zhang, Cheng; Liu, Hong-Wen; Xiong, Mengyi; Yin, Sheng-Yan; Yang, Yue; Hu, Xiao-Xiao; Yin, Xia; Zhang, Xiao-Bing; Tan, Weihong

    2017-12-01

    Fluorescence-based quantitative analyses for vital biomolecules are in great demand in biomedical science owing to their unique detection advantages: rapid, sensitive, non-damaging and specific identification. However, available fluorescence strategies for quantitative detection are usually hard to design and achieve. Inspired by supramolecular chemistry, a two-photon-excited fluorescent supramolecular nanoplatform (TPSNP) was designed for quantitative analysis with three parts: host molecules (β-CD polymers), a guest fluorophore of sensing probes (Np-Ad) and a guest internal reference (NpRh-Ad). In this strategy, the TPSNP possesses the merits of (i) improved water-solubility and biocompatibility; (ii) increased tissue penetration depth for bioimaging by two-photon excitation; (iii) quantitative and tunable assembly of functional guest molecules to obtain optimized detection conditions; (iv) a common approach to avoid the limitation of complicated design by adjustment of sensing probes; and (v) accurate quantitative analysis by virtue of reference molecules. As a proof of concept, we utilized the NHS-Ad-based two-photon fluorescent probe TPSNP-1 to realize accurate quantitative analysis of hydrogen sulfide (H2S), with high sensitivity and good selectivity in live cells, deep tissues and ex vivo-dissected organs, suggesting that the TPSNP is an ideal quantitative indicator for clinical samples. Moreover, the TPSNP will pave the way for designing and preparing advanced supramolecular sensors for biosensing and biomedicine.

  7. Quantitative single-photon emission computed tomography/computed tomography for technetium pertechnetate thyroid uptake measurement

    PubMed Central

    Lee, Hyunjong; Kim, Ji Hyun; Kang, Yeon-koo; Moon, Jae Hoon; So, Young; Lee, Won Woo

    2016-01-01

    Objectives: Technetium pertechnetate (99mTcO4) is a radioactive tracer used to assess thyroid function with a thyroid uptake system (TUS). However, the TUS often fails to deliver accurate measurements of the percent thyroid uptake (%thyroid uptake) of 99mTcO4. Here, we investigated the usefulness of quantitative single-photon emission computed tomography/computed tomography (SPECT/CT) after injection of 99mTcO4 in detecting thyroid function abnormalities. Materials and methods: We retrospectively reviewed data from 50 patients (male:female = 15:35; age, 46.2 ± 16.3 years; 17 Graves disease, 13 thyroiditis, and 20 euthyroid). All patients underwent 99mTcO4 quantitative SPECT/CT (185 MBq = 5 mCi), which yielded %thyroid uptake and standardized uptake value (SUV). Twenty-one (10 Graves disease and 11 thyroiditis) of the 50 patients also underwent conventional %thyroid uptake measurements using a TUS. Results: Quantitative SPECT/CT parameters (%thyroid uptake, SUVmean, and SUVmax) were the highest in Graves disease, second highest in euthyroid, and lowest in thyroiditis (P < 0.0001, Kruskal–Wallis test). TUS significantly overestimated the %thyroid uptake compared with SPECT/CT (P < 0.0001, paired t test) because other 99mTcO4 sources in addition to the thyroid, such as salivary glands and saliva, contributed to the %thyroid uptake result by TUS, whereas the %thyroid uptake, SUVmean and SUVmax from SPECT/CT were associated with the functional status of the thyroid. Conclusions: Quantitative SPECT/CT is more accurate than conventional TUS for measuring 99mTcO4 %thyroid uptake. Quantitative measurements using SPECT/CT may facilitate more accurate assessment of thyroid tracer uptake. PMID:27399139
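
    The uptake arithmetic is standard and can be sketched directly; the voxel values, injected activity, and body weight below are hypothetical, and the study's exact calibration and VOI definition are not reproduced here.

        # Minimal sketch: body-weight-normalised SUV and %thyroid uptake from
        # quantitative SPECT/CT, assuming a calibrated activity-concentration
        # image (Bq/mL) and a segmented thyroid volume of interest (VOI).
        import numpy as np

        def suv_mean_max(voi_bq_per_ml, injected_bq, weight_g):
            """SUV = tissue concentration / (injected activity / body weight),
            approximating 1 g of tissue as 1 mL."""
            norm = injected_bq / weight_g     # Bq per gram if spread evenly
            return voi_bq_per_ml.mean() / norm, voi_bq_per_ml.max() / norm

        def percent_thyroid_uptake(voi_bq_per_ml, voxel_ml, injected_bq):
            """Fraction of injected 99mTcO4 activity found in the thyroid VOI."""
            return 100.0 * voi_bq_per_ml.sum() * voxel_ml / injected_bq

        voi = np.full(5000, 40000.0)          # hypothetical uniform VOI, Bq/mL
        print(suv_mean_max(voi, injected_bq=185e6, weight_g=70000.0))
        print(percent_thyroid_uptake(voi, voxel_ml=0.004, injected_bq=185e6))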

  8. Protein Folding Free Energy Landscape along the Committor - the Optimal Folding Coordinate.

    PubMed

    Krivov, Sergei V

    2018-06-06

    Recent advances in simulation and experiment have led to dramatic increases in the quantity and complexity of produced data, which makes the development of automated analysis tools very important. A powerful approach to analyzing the dynamics contained in such data sets is to describe/approximate them by diffusion on a free energy landscape - free energy as a function of reaction coordinates (RCs). For the description to be quantitatively accurate, RCs should be chosen in an optimal way. Recent theoretical results show that such an optimal RC exists; however, determining it for practical systems is a very difficult unsolved problem. Here we describe a solution to this problem: an adaptive nonparametric approach to accurately determine the optimal RC (the committor) for an equilibrium trajectory of a realistic system. In contrast to alternative approaches, which require a functional form with many parameters to approximate an RC and thus extensive expertise with the system, the suggested approach is nonparametric and can approximate any RC with high accuracy without system-specific information. To avoid overfitting for a realistically sampled system, the approach performs RC optimization in an adaptive manner by focusing optimization on less-optimized spatiotemporal regions of the RC. The power of the approach is illustrated on a long equilibrium atomistic folding simulation of the HP35 protein. We have determined the optimal folding RC - the committor - which was confirmed by passing a stringent committor validation test. It allowed us to determine the first quantitatively accurate protein folding free energy landscape. We have confirmed the recent theoretical results that diffusion on such a free energy profile can be used to compute exactly the equilibrium flux, the mean first passage times, and the mean transition path times between any two points on the profile. We have shown that the mean squared displacement along the optimal RC grows linearly with time, as for simple diffusion. The free energy profile allowed us to obtain a direct rigorous estimate of the pre-exponential factor for the folding dynamics.
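
    As a small illustration of the free-energy-profile construction discussed above (not the authors' adaptive RC optimization), the sketch below builds F(r) = -kT ln P(r) from a histogram of an equilibrium trajectory projected onto a reaction coordinate r; the two-state trajectory is synthetic.

        # Minimal sketch: a free energy profile along a reaction coordinate r
        # from an equilibrium trajectory, F(r) = -kT ln P(r); its quantitative
        # accuracy depends on how close r is to the committor.
        import numpy as np

        def free_energy_profile(r, bins=60, kT=1.0):
            """Histogram an equilibrium trajectory of RC values into F(r)."""
            hist, edges = np.histogram(r, bins=bins, density=True)
            centers = 0.5 * (edges[:-1] + edges[1:])
            mask = hist > 0
            F = -kT * np.log(hist[mask])
            return centers[mask], F - F.min()   # shift minimum to zero

        # Hypothetical two-state trajectory: samples from a double-well density
        rng = np.random.default_rng(3)
        r = np.concatenate([rng.normal(-1, 0.3, 80000), rng.normal(1, 0.3, 20000)])
        centers, F = free_energy_profile(r)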

  9. Quantitative characterization of metastatic disease in the spine. Part I. Semiautomated segmentation using atlas-based deformable registration and the level set method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hardisty, M.; Gordon, L.; Agarwal, P.

    2007-08-15

    Quantitative assessment of metastatic disease in bone is often considered immeasurable and, as such, patients with skeletal metastases are often excluded from clinical trials. In order to effectively quantify the impact of metastatic tumor involvement in the spine, accurate segmentation of the vertebra is required. Manual segmentation can be accurate but involves extensive and time-consuming user interaction. Potential solutions to automating segmentation of metastatically involved vertebrae are demons deformable image registration and level set methods. The purpose of this study was to develop a semiautomated method to accurately segment tumor-bearing vertebrae using the aforementioned techniques. By maintaining the morphology of an atlas, the demons-level set composite algorithm was able to accurately differentiate between trans-cortical tumors and surrounding soft tissue of identical intensity. The algorithm successfully segmented both the vertebral body and trabecular centrum of tumor-involved and healthy vertebrae. This work validates our approach as equivalent in accuracy to an experienced user.

  10. Practicable methods for histological section thickness measurement in quantitative stereological analyses.

    PubMed

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger; Blutke, Andreas

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1-3 μm thick plastic sections (glycolmethacrylate/ methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability of SR to efficiently provide accurate section thickness measurements as a prerequisite for reliable estimates of dependent quantitative stereological parameters.
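
    The oblique-sectioning correction reduces to the basic geometry stated above: a section cut at an angle appears thicker by a factor of 1/cos(angle), and the calibration foil of known thickness reveals that angle. A minimal sketch, with hypothetical measurement values:

        # Minimal sketch of the geometric correction: an obliquely
        # (non-orthogonally) re-embedded section appears thicker by
        # 1/cos(angle); a calibration foil of known thickness reveals the angle.
        import math

        def corrected_thickness(measured_section_um, measured_foil_um,
                                true_foil_um):
            """Correct an ORE thickness measurement for oblique sectioning."""
            cos_angle = true_foil_um / measured_foil_um  # <= 1 for oblique cuts
            if not 0.0 < cos_angle <= 1.0:
                raise ValueError("measured foil must be >= its true thickness")
            angle_deg = math.degrees(math.acos(cos_angle))
            return measured_section_um * cos_angle, angle_deg

        # Hypothetical example: a 25 um foil measured as 26.5 um in the block
        thickness, angle = corrected_thickness(1.60, 26.5, 25.0)
        print(f"corrected section thickness {thickness:.2f} um "
              f"(section plane off by {angle:.1f} deg)")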

  11. A novel anthropomorphic flow phantom for the quantitative evaluation of prostate DCE-MRI acquisition techniques

    NASA Astrophysics Data System (ADS)

    Knight, Silvin P.; Browne, Jacinta E.; Meaney, James F.; Smith, David S.; Fagan, Andrew J.

    2016-10-01

    A novel anthropomorphic flow phantom device has been developed, which can be used for quantitatively assessing the ability of magnetic resonance imaging (MRI) scanners to accurately measure signal/concentration time-intensity curves (CTCs) associated with dynamic contrast-enhanced (DCE) MRI. Modelling of the complex pharmacokinetics of contrast agents as they perfuse through the tumour capillary network has shown great promise for cancer diagnosis and therapy monitoring. However, clinical adoption has been hindered by methodological problems, resulting in a lack of consensus regarding the most appropriate acquisition and modelling methodology to use and a consequent wide discrepancy in published data. A heretofore overlooked source of such discrepancy may arise from measurement errors of tumour CTCs deriving from the imaging pulse sequence itself, while the effects on the fidelity of CTC measurement of using rapidly-accelerated sequences such as parallel imaging and compressed sensing remain unknown. The present work aimed to investigate these features by developing a test device in which ‘ground truth’ CTCs were generated and presented to the MRI scanner for measurement, thereby allowing for an assessment of the DCE-MRI protocol to accurately measure this curve shape. The device comprised a four-pump flow system wherein CTCs derived from prior patient prostate data were produced in measurement chambers placed within the imaged volume. The ground truth was determined as the mean of repeat measurements using an MRI-independent, custom-built optical imaging system. In DCE-MRI experiments, significant discrepancies between the ground truth and measured CTCs were found for both tumorous and healthy tissue-mimicking curve shapes. Pharmacokinetic modelling revealed errors in measured Ktrans, ve and kep values of up to 42%, 31%, and 50% respectively, following a simple variation of the parallel imaging factor and number of signal averages in the acquisition protocol. The device allows for the quantitative assessment and standardisation of DCE-MRI protocols (both existing and emerging).
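
    A common way to carry out the pharmacokinetic modelling mentioned above is the standard Tofts model; whether the authors used exactly this variant is not stated in the abstract. The sketch below fits Ktrans and kep to a synthetic concentration-time curve given an assumed arterial input function, with ve recovered as Ktrans/kep; all curves and rates are invented.

        # Minimal sketch (standard Tofts model): fit Ktrans and kep to a
        # concentration-time curve Ct(t) given an arterial input function
        # Cp(t):  Ct(t) = Ktrans * integral Cp(tau) exp(-kep (t - tau)) dtau.
        import numpy as np
        from scipy.optimize import curve_fit

        t = np.linspace(0, 300, 150)                  # seconds
        cp = 5.0 * (t / 60.0) * np.exp(-t / 60.0)     # hypothetical AIF (mM)

        def tofts(t, ktrans, kep):
            """Discrete convolution of the AIF with an exponential kernel."""
            dt = t[1] - t[0]
            kernel = np.exp(-kep * t)
            return ktrans * np.convolve(cp, kernel)[: len(t)] * dt

        ct_true = tofts(t, 0.25 / 60.0, 0.60 / 60.0)  # per-second rates
        ct_meas = ct_true + np.random.default_rng(7).normal(0, 0.002, len(t))
        (ktrans, kep), _ = curve_fit(tofts, t, ct_meas,
                                     p0=[0.1 / 60, 0.3 / 60],
                                     bounds=(0, np.inf))
        print(f"Ktrans={ktrans*60:.3f}/min, kep={kep*60:.3f}/min, "
              f"ve={ktrans/kep:.3f}")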

  12. Practicable methods for histological section thickness measurement in quantitative stereological analyses

    PubMed Central

    Matenaers, Cyrill; Popper, Bastian; Rieger, Alexandra; Wanke, Rüdiger

    2018-01-01

    The accuracy of quantitative stereological analysis tools such as the (physical) disector method substantially depends on the precise determination of the thickness of the analyzed histological sections. One conventional method for measurement of histological section thickness is to re-embed the section of interest vertically to its original section plane. The section thickness is then measured in a subsequently prepared histological section of this orthogonally re-embedded sample. However, the orthogonal re-embedding (ORE) technique is quite work- and time-intensive and may produce inaccurate section thickness measurement values due to unintentional slightly oblique (non-orthogonal) positioning of the re-embedded sample-section. Here, an improved ORE method is presented, allowing for determination of the factual section plane angle of the re-embedded section, and correction of measured section thickness values for oblique (non-orthogonal) sectioning. For this, the analyzed section is mounted flat on a foil of known thickness (calibration foil) and both the section and the calibration foil are then vertically (re-)embedded. The section angle of the re-embedded section is then calculated from the deviation of the measured section thickness of the calibration foil and its factual thickness, using basic geometry. To find a practicable, fast, and accurate alternative to ORE, the suitability of spectral reflectance (SR) measurement for determination of plastic section thicknesses was evaluated. Using a commercially available optical reflectometer (F20, Filmetrics®, USA), the thicknesses of 0.5 μm thick semi-thin Epon (glycid ether)-sections and of 1–3 μm thick plastic sections (glycolmethacrylate/ methylmethacrylate, GMA/MMA), as regularly used in physical disector analyses, could precisely be measured within few seconds. Compared to the measured section thicknesses determined by ORE, SR measures displayed less than 1% deviation. Our results prove the applicability of SR to efficiently provide accurate section thickness measurements as a prerequisite for reliable estimates of dependent quantitative stereological parameters. PMID:29444158

  13. Tandem mass spectrometry measurement of the collision products of carbamate anions derived from CO2 capture sorbents: paving the way for accurate quantitation.

    PubMed

    Jackson, Phil; Fisher, Keith J; Attalla, Moetaz Ibrahim

    2011-08-01

    The reaction between CO2 and aqueous amines to produce a charged carbamate product plays a crucial role in post-combustion capture chemistry when primary and secondary amines are used. In this paper, we report the low energy negative-ion CID results for several anionic carbamates derived from primary and secondary amines commonly used as post-combustion capture solvents. The study was performed using the modern equivalent of a triple quadrupole instrument equipped with a T-wave collision cell. Deuterium labeling of 2-aminoethanol (1,1,2,2-d4-2-aminoethanol) and computations at the M06-2X/6-311++G(d,p) level were used to confirm the identity of the fragmentation products for 2-hydroxyethylcarbamate (derived from 2-aminoethanol), in particular the ions CN− and NCO− and facile neutral losses of CO2 and water; there is precedent for the latter in condensed phase isocyanate chemistry. The fragmentations of 2-hydroxyethylcarbamate were generalized for carbamate anions derived from other capture amines, including ethylenediamine, diethanolamine, and piperazine. We also report unequivocal evidence for the existence of carbamate anions derived from sterically hindered amines (Tris(2-hydroxymethyl)aminomethane and 2-methyl-2-aminopropanol). For the suite of carbamates investigated, diagnostic losses include the decarboxylation product (−CO2, 44 mass units), loss of 46 mass units and the fragments NCO− (m/z 42) and CN− (m/z 26). We also report low energy CID results for the dicarbamate dianion (−O2CNHC2H4NHCO2−) commonly encountered in CO2 capture solutions utilizing ethylenediamine. Finally, we demonstrate a promising ion chromatography-MS based procedure for the separation and quantitation of aqueous anionic carbamates, which is based on the reported CID findings. The availability of accurate quantitation methods for ionic CO2 capture products could lead to dynamic operational tuning of CO2 capture-plants and, thus, cost-savings via real-time manipulation of solvent regeneration energies.

  14. A unique charge-coupled device/xenon arc lamp based imaging system for the accurate detection and quantitation of multicolour fluorescence.

    PubMed

    Spibey, C A; Jackson, P; Herick, K

    2001-03-01

    In recent years the use of fluorescent dyes in biological applications has dramatically increased. The continual improvement in the capabilities of these fluorescent dyes demands increasingly sensitive detection systems that provide accurate quantitation over a wide linear dynamic range. In the field of proteomics, the detection, quantitation and identification of very low abundance proteins are of extreme importance in understanding cellular processes. Therefore, the instrumentation used to acquire an image of such samples, for spot picking and identification by mass spectrometry, must be sensitive enough not only to maximise the sensitivity and dynamic range of the staining dyes but, as importantly, to adapt to the ever-changing portfolio of fluorescent dyes as they become available. Just as the available fluorescent probes are improving and evolving, so are users' application requirements. Therefore, the instrumentation chosen must be flexible enough to address and adapt to those changing needs. As a result, a highly competitive market for the supply and production of such dyes and the instrumentation for their detection and quantitation has emerged. The instrumentation currently available is based on either laser/photomultiplier tube (PMT) scanning or lamp/charge-coupled device (CCD) based mechanisms. This review briefly discusses the advantages and disadvantages of both system types for fluorescence imaging, gives a technical overview of CCD technology, and describes in detail a unique xenon arc lamp/CCD-based instrument from PerkinElmer Life Sciences. The Wallac-1442 ARTHUR is unique in its ability to scan large areas at high resolution and to give accurate, selectable excitation over the whole of the UV/visible range. It operates by filtering both the excitation and emission wavelengths, providing optimal and accurate measurement and quantitation of virtually any available dye, and allows excellent spectral resolution between different fluorophores. This flexibility and excitation accuracy are key to multicolour applications and future adaptation of the instrument to address changing application requirements and newly emerging dyes.

  15. Quantitative Analysis of Radar Returns from Insects

    NASA Technical Reports Server (NTRS)

    Riley, J. R.

    1979-01-01

    When the number of flying insects is low enough to permit their resolution as individual radar targets, quantitative estimates of their aerial density can be developed. Accurate measurements of heading distribution, using a rotating polarization radar to enhance the wingbeat-frequency method of identification, are presented.

  16. A correlative imaging based methodology for accurate quantitative assessment of bone formation in additive manufactured implants.

    PubMed

    Geng, Hua; Todd, Naomi M; Devlin-Mullin, Aine; Poologasundarampillai, Gowsihan; Kim, Taek Bo; Madi, Kamel; Cartmell, Sarah; Mitchell, Christopher A; Jones, Julian R; Lee, Peter D

    2016-06-01

    A correlative imaging methodology was developed to accurately quantify bone formation in the complex lattice structure of additive manufactured implants. Micro computed tomography (μCT) and histomorphometry were combined, integrating the best features from both while demonstrating the limitations of each imaging modality. This semi-automatic methodology registered each modality using a coarse-graining technique to speed the registration of 2D histology sections to high-resolution 3D μCT datasets. Once registered, qualitative and quantitative histomorphometric bone descriptors were directly correlated to 3D quantitative bone descriptors, such as bone ingrowth and bone contact. The correlative imaging allowed the significant volumetric shrinkage of histology sections to be quantified for the first time (~15%). The technique also demonstrated the importance of the location of the histological section, showing that an offset of up to 30% can be introduced. The results were used to quantitatively demonstrate the effectiveness of 3D printed titanium lattice implants.

  17. Changes in body composition of neonatal piglets during growth

    USDA-ARS?s Scientific Manuscript database

    During studies of neonatal piglet growth it is important to be able to accurately assess changes in body composition. Previous studies have demonstrated that quantitative magnetic resonance (QMR) provides precise and accurate measurements of total body fat mass, lean mass and total body water in non...

  18. The Use of Multidimensional Image-Based Analysis to Accurately Monitor Cell Growth in 3D Bioreactor Culture

    PubMed Central

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to have an accuracy >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13 day bioreactor culture period and how changes to manufacture processes such as initial cell seeding density can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes facilitating the transition towards bioreactor based manufacture for clinical grade cells. PMID:22028809
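
    As a loose illustration of the monitoring idea (synthetic features, not the authors' image-analysis pipeline), the sketch below projects per-day culture descriptors onto principal components so that growth appears as a trajectory in PC space.

        # Minimal sketch: monitor a 13-day bioreactor culture by projecting
        # per-day image-derived features (count, confluency, morphology)
        # onto principal components; all feature values are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(5)
        days = np.arange(13)
        features = np.column_stack([
            20 * days + rng.normal(0, 5, 13),                   # cells/carrier
            np.clip(8 * days + rng.normal(0, 3, 13), 0, 100),   # confluency %
            1.0 + 0.05 * days + rng.normal(0, 0.05, 13),        # elongation
        ])
        standardized = (features - features.mean(0)) / features.std(0)
        scores = PCA(n_components=2).fit_transform(standardized)
        for d, (pc1, pc2) in zip(days, scores):
            print(f"day {d:2d}: PC1={pc1:+.2f} PC2={pc2:+.2f}")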

  19. The use of multidimensional image-based analysis to accurately monitor cell growth in 3D bioreactor culture.

    PubMed

    Baradez, Marc-Olivier; Marshall, Damian

    2011-01-01

    The transition from traditional culture methods towards bioreactor based bioprocessing to produce cells in commercially viable quantities for cell therapy applications requires the development of robust methods to ensure the quality of the cells produced. Standard methods for measuring cell quality parameters such as viability provide only limited information making process monitoring and optimisation difficult. Here we describe a 3D image-based approach to develop cell distribution maps which can be used to simultaneously measure the number, confluency and morphology of cells attached to microcarriers in a stirred tank bioreactor. The accuracy of the cell distribution measurements is validated using in silico modelling of synthetic image datasets and is shown to have an accuracy >90%. Using the cell distribution mapping process and principal component analysis we show how cell growth can be quantitatively monitored over a 13 day bioreactor culture period and how changes to manufacture processes such as initial cell seeding density can significantly influence cell morphology and the rate at which cells are produced. Taken together, these results demonstrate how image-based analysis can be incorporated in cell quality control processes facilitating the transition towards bioreactor based manufacture for clinical grade cells.

  20. Kinetics of Fast Atoms in the Terrestrial Atmosphere

    NASA Technical Reports Server (NTRS)

    Kharchenko, Vasili A.; Dalgarno, A.; Mellott, Mary (Technical Monitor)

    2002-01-01

    This report summarizes our investigations performed under NASA Grant NAG5-8058. The three-year research supported by the Geospace Sciences SR&T program (Ionospheric, Thermospheric, and Mesospheric Physics) has been designed to investigate fluxes of energetic oxygen and nitrogen atoms in the terrestrial thermosphere. Fast atoms are produced due to absorption of the solar radiation and due to coupling between the ionosphere and the neutral thermospheric gas. We have investigated the impact of hot oxygen and nitrogen atoms on the thermal balance, chemistry and radiation properties of the terrestrial thermosphere. Our calculations have been focused on the accurate quantitative description of the thermalization of O and N energetic atoms in collisions with atom and molecules of the ambient neutral gas. Upward fluxes of oxygen and nitrogen atoms, the rate of atmospheric heating by hot oxygen atoms, and the energy input into translational and rotational-vibrational degrees of atmospheric molecules have been evaluated. Altitude profiles of hot oxygen and nitrogen atoms have been analyzed and compared with available observational data. Energetic oxygen atoms in the terrestrial atmosphere have been investigated for decades, but insufficient information on the kinetics of fast atmospheric atoms has been a main obstacle for the interpretation of observational data and modeling of the hot geocorona. The recent development of accurate computational methods of the collisional kinetics is seen as an important step in the quantitative description of hot atoms in the thermosphere. Modeling of relaxation processes in the terrestrial atmosphere has incorporated data of recent observations, and theoretical predictions have been tested by new laboratory measurements.

  1. The analytical validation of the Oncotype DX Recurrence Score assay

    PubMed Central

    Baehner, Frederick L

    2016-01-01

    In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score® result (scale: 0–100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time. PMID:27729940

  2. The analytical validation of the Oncotype DX Recurrence Score assay.

    PubMed

    Baehner, Frederick L

    2016-01-01

    In vitro diagnostic multivariate index assays are highly complex molecular assays that can provide clinically actionable information regarding the underlying tumour biology and facilitate personalised treatment. These assays are only useful in clinical practice if all of the following are established: analytical validation (i.e., how accurately/reliably the assay measures the molecular characteristics), clinical validation (i.e., how consistently/accurately the test detects/predicts the outcomes of interest), and clinical utility (i.e., how likely the test is to significantly improve patient outcomes). In considering the use of these assays, clinicians often focus primarily on the clinical validity/utility; however, the analytical validity of an assay (e.g., its accuracy, reproducibility, and standardisation) should also be evaluated and carefully considered. This review focuses on the rigorous analytical validation and performance of the Oncotype DX ® Breast Cancer Assay, which is performed at the Central Clinical Reference Laboratory of Genomic Health, Inc. The assay process includes tumour tissue enrichment (if needed), RNA extraction, gene expression quantitation (using a gene panel consisting of 16 cancer genes plus 5 reference genes and quantitative real-time RT-PCR), and an automated computer algorithm to produce a Recurrence Score ® result (scale: 0-100). This review presents evidence showing that the Recurrence Score result reported for each patient falls within a tight clinically relevant confidence interval. Specifically, the review discusses how the development of the assay was designed to optimise assay performance, presents data supporting its analytical validity, and describes the quality control and assurance programmes that ensure optimal test performance over time.

  3. Rapid and quantitative detection of the microbial spoilage of meat by fourier transform infrared spectroscopy and machine learning.

    PubMed

    Ellis, David I; Broadhurst, David; Kell, Douglas B; Rowland, Jem J; Goodacre, Royston

    2002-06-01

    Fourier transform infrared (FT-IR) spectroscopy is a rapid, noninvasive technique with considerable potential for application in the food and related industries. We show here that this technique can be used directly on the surface of food to produce biochemically interpretable "fingerprints." Spoilage in meat is the result of decomposition and the formation of metabolites caused by the growth and enzymatic activity of microorganisms. FT-IR was exploited to measure biochemical changes within the meat substrate, enhancing and accelerating the detection of microbial spoilage. Chicken breasts were purchased from a national retailer, comminuted for 10 s, and left to spoil at room temperature for 24 h. Every hour, FT-IR measurements were taken directly from the meat surface using attenuated total reflectance, and the total viable counts were obtained by classical plating methods. Quantitative interpretation of FT-IR spectra was possible using partial least-squares regression and allowed accurate estimates of bacterial loads to be calculated directly from the meat surface in 60 s. Genetic programming was used to derive rules showing that at levels of 10⁷ bacteria g⁻¹ the main biochemical indicator of spoilage was the onset of proteolysis. Thus, using FT-IR we were able to acquire a metabolic snapshot and quantify, noninvasively, the microbial loads of food samples accurately and rapidly in 60 s, directly from the sample surface. We believe this approach will aid in the Hazard Analysis Critical Control Point process for the assessment of the microbiological safety of food at the production, processing, manufacturing, packaging, and storage levels.
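
    A minimal sketch of the chemometric step, with synthetic spectra standing in for the study's FT-IR data: partial least-squares regression maps each spectrum to a log10 total viable count, so bacterial load can be estimated from a fresh surface measurement.

        # Minimal sketch: PLS regression from (synthetic) FT-IR spectra to
        # log10 total viable counts, mirroring the calibration step described
        # above; wavenumber grid, band location and loads are invented.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(11)
        n_wavenumbers = 600
        spectra = rng.normal(size=(40, n_wavenumbers))   # 40 hourly samples
        log_tvc = 4.0 + 0.04 * np.arange(40) + rng.normal(0, 0.1, 40)
        # embed a spoilage-related band whose intensity tracks bacterial load
        spectra[:, 200:220] += log_tvc[:, None] * 0.5

        pls = PLSRegression(n_components=5).fit(spectra[:30], log_tvc[:30])
        predicted = pls.predict(spectra[30:]).ravel()
        print(np.round(predicted, 2), np.round(log_tvc[30:], 2))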

  4. Quantitative Graphics in Newspapers.

    ERIC Educational Resources Information Center

    Tankard, James W., Jr.

    The use of quantitative graphics in newspapers requires achieving a balance between being accurate and getting the attention of the reader. The statistical representations in newspapers are drawn by graphic designers whose key technique is fusion--the striking combination of two visual images. This technique often results in visual puns,…

  5. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  6. A comparison of fit of CNC-milled titanium and zirconia frameworks to implants.

    PubMed

    Abduo, Jaafar; Lyons, Karl; Waddell, Neil; Bennani, Vincent; Swain, Michael

    2012-05-01

    Computer numeric controlled (CNC) milling has been proven to be a predictable method for fabricating accurately fitting titanium implant frameworks. However, no data are available regarding the fit of CNC-milled implant zirconia frameworks. The aim was to compare the precision of fit of implant frameworks milled from titanium and zirconia and to relate it to peri-implant strain development after framework fixation. A partially edentulous epoxy resin model received two Branemark implants in the areas of the lower left second premolar and second molar. From this model, 10 identical frameworks were fabricated by means of CNC milling: half were made from titanium and the other half from zirconia. Strain gauges were mounted close to the implants to qualitatively and quantitatively assess strain development as a result of framework fitting. In addition, the fit of the framework-implant interface was measured using an optical microscope, when only one screw was tightened (passive fit) and when all screws were tightened (vertical fit). The data were statistically analyzed using the Mann-Whitney test. All frameworks produced measurable amounts of peri-implant strain. The zirconia frameworks produced significantly less strain than titanium. Combining the qualitative and quantitative information indicates that the implants were under vertical rather than horizontal displacement. The vertical fit was similar for zirconia (3.7 µm) and titanium (3.6 µm) frameworks; however, the zirconia frameworks exhibited a significantly finer passive fit (5.5 µm) than the titanium frameworks (13.6 µm). CNC milling produced zirconia and titanium frameworks with high accuracy. The difference between the two materials in terms of fit is expected to be of minimal clinical significance. The strain developed around the implants was related more to framework fit than to framework material. © 2011 Wiley Periodicals, Inc.

  7. Quantitation of spatially-localized proteins in tissue samples using MALDI-MRM imaging.

    PubMed

    Clemis, Elizabeth J; Smith, Derek S; Camenzind, Alexander G; Danell, Ryan M; Parker, Carol E; Borchers, Christoph H

    2012-04-17

    MALDI imaging allows the creation of a "molecular image" of a tissue slice. This image is reconstructed from the ion abundances in spectra obtained while rastering the laser over the tissue. These images can then be correlated with tissue histology to detect potential biomarkers of, for example, aberrant cell types. MALDI, however, is known to have problems with ion suppression, making it difficult to correlate measured ion abundance with concentration. It would be advantageous to have a method which could provide more accurate protein concentration measurements, particularly for screening applications or for precise comparisons between samples. In this paper, we report the development of a novel MALDI imaging method for the localization and accurate quantitation of proteins in tissues. This method involves optimization of in situ tryptic digestion, followed by reproducible and uniform deposition of an isotopically labeled standard peptide from a target protein onto the tissue, using an aerosol-generating device. Data are acquired by MALDI multiple reaction monitoring (MRM) mass spectrometry (MS), and accurate peptide quantitation is determined from the ratio of MRM transitions for the endogenous unlabeled proteolytic peptides to the corresponding transitions from the applied isotopically labeled standard peptides. In a parallel experiment, the quantity of the labeled peptide applied to the tissue was determined using a standard curve generated from MALDI time-of-flight (TOF) MS data. This external calibration curve was then used to determine the quantity of endogenous peptide in a given area. All standard curves generated by this method had coefficients of determination greater than 0.97. These proof-of-concept experiments using MALDI MRM-based imaging show the feasibility of precise and accurate quantitation of tissue protein concentrations over 2 orders of magnitude, while maintaining the spatial localization information for the proteins.
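
    The quantitation arithmetic described above can be sketched compactly; the calibration slope, intercept, and peak areas below are hypothetical numbers, not values from the paper.

        # Minimal sketch of the quantitation arithmetic: the endogenous
        # peptide amount in an imaged region follows from the MRM ratio of
        # light (endogenous) to heavy (labelled standard) transitions, once
        # the applied standard amount is known from an external calibration.
        def applied_standard_fmol(signal, slope, intercept):
            """Amount of sprayed heavy standard from a MALDI-TOF
            calibration line: signal = slope * amount + intercept."""
            return (signal - intercept) / slope

        def endogenous_fmol(light_area, heavy_area, heavy_fmol):
            """Light/heavy MRM transition-area ratio times the known
            amount of applied standard."""
            return (light_area / heavy_area) * heavy_fmol

        heavy = applied_standard_fmol(signal=5200.0, slope=10.4, intercept=60.0)
        print(endogenous_fmol(light_area=38000.0, heavy_area=21000.0,
                              heavy_fmol=heavy))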

  8. Quantitative protein localization signatures reveal an association between spatial and functional divergences of proteins.

    PubMed

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-03-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg.

  9. Quantitative Protein Localization Signatures Reveal an Association between Spatial and Functional Divergences of Proteins

    PubMed Central

    Loo, Lit-Hsin; Laksameethanasan, Danai; Tung, Yi-Ling

    2014-01-01

    Protein subcellular localization is a major determinant of protein function. However, this important protein feature is often described in terms of discrete and qualitative categories of subcellular compartments, and therefore it has limited applications in quantitative protein function analyses. Here, we present Protein Localization Analysis and Search Tools (PLAST), an automated analysis framework for constructing and comparing quantitative signatures of protein subcellular localization patterns based on microscopy images. PLAST produces human-interpretable protein localization maps that quantitatively describe the similarities in the localization patterns of proteins and major subcellular compartments, without requiring manual assignment or supervised learning of these compartments. Using the budding yeast Saccharomyces cerevisiae as a model system, we show that PLAST is more accurate than existing, qualitative protein localization annotations in identifying known co-localized proteins. Furthermore, we demonstrate that PLAST can reveal protein localization-function relationships that are not obvious from these annotations. First, we identified proteins that have similar localization patterns and participate in closely-related biological processes, but do not necessarily form stable complexes with each other or localize at the same organelles. Second, we found an association between spatial and functional divergences of proteins during evolution. Surprisingly, as proteins with common ancestors evolve, they tend to develop more diverged subcellular localization patterns, but still occupy similar numbers of compartments. This suggests that divergence of protein localization might be more frequently due to the development of more specific localization patterns over ancestral compartments than the occupation of new compartments. PLAST enables systematic and quantitative analyses of protein localization-function relationships, and will be useful to elucidate protein functions and how these functions were acquired in cells from different organisms or species. A public web interface of PLAST is available at http://plast.bii.a-star.edu.sg. PMID:24603469

  10. The subtle business of model reduction for stochastic chemical kinetics

    NASA Astrophysics Data System (ADS)

    Gillespie, Dan T.; Cao, Yang; Sanft, Kevin R.; Petzold, Linda R.

    2009-02-01

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.
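
    To make the reduction concrete, below is a minimal direct-method stochastic simulation algorithm (SSA) sketch in Python for the model reaction set S1⇌S2→S3; the rate constants, populations, and time horizon are illustrative and not taken from the paper.

        import math
        import random

        def ssa_s1_s2_s3(x1, x2, c1, c2, c3, t_end):
            """Direct-method SSA for S1 <=> S2 -> S3 (illustrative parameters)."""
            t, x3 = 0.0, 0
            while t < t_end:
                a = [c1 * x1, c2 * x2, c3 * x2]   # propensities of the three channels
                a0 = sum(a)
                if a0 == 0.0:
                    break
                t += -math.log(1.0 - random.random()) / a0   # exponential waiting time
                r = random.random() * a0
                if r < a[0]:             # S1 -> S2
                    x1, x2 = x1 - 1, x2 + 1
                elif r < a[0] + a[1]:    # S2 -> S1
                    x1, x2 = x1 + 1, x2 - 1
                else:                    # S2 -> S3
                    x2, x3 = x2 - 1, x3 + 1
            return x3

    When c1, c2 >> c3, the slow-scale reduction discussed above replaces all three channels with a single S3-producing channel whose propensity uses the fast-equilibrium mean of x2, roughly c3*(x1+x2)*c1/(c1+c2), which is what makes the one-reaction model so much cheaper to simulate.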

  11. A study of the utilization of ERTS-1 data from the Wabash River Basin. [crop identification, water resources, urban land use, soil mapping, and atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The most significant results were obtained in the water resources research, urban land use mapping, and soil association mapping projects. ERTS-1 data was used to classify water bodies to determine acreages, and high agreement was obtained with USGS figures. Quantitative evaluation of urban land use classifications from ERTS-1 data was achieved, with an overall test accuracy of 90.3%. ERTS-1 data classifications of soil test sites were compared with soil association maps scaled to match the computer-produced map, and good agreement was observed. In some cases the ERTS-1 results proved to be more accurate than the soil association map.

  12. The subtle business of model reduction for stochastic chemical kinetics.

    PubMed

    Gillespie, Dan T; Cao, Yang; Sanft, Kevin R; Petzold, Linda R

    2009-02-14

    This paper addresses the problem of simplifying chemical reaction networks by adroitly reducing the number of reaction channels and chemical species. The analysis adopts a discrete-stochastic point of view and focuses on the model reaction set S1⇌S2→S3, whose simplicity allows all the mathematics to be done exactly. The advantages and disadvantages of replacing this reaction set with a single S3-producing reaction are analyzed quantitatively using novel criteria for measuring simulation accuracy and simulation efficiency. It is shown that in all cases in which such a model reduction can be accomplished accurately and with a significant gain in simulation efficiency, a procedure called the slow-scale stochastic simulation algorithm provides a robust and theoretically transparent way of implementing the reduction.

  13. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    PubMed Central

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high-priority fragment ions, these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard) indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation to those of the library intensity method, but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80, against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
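
    The core idea (score each fragment ion by multivariate outlyingness, then quantify with the least outlying ones) can be sketched as follows. This is a generic Mahalanobis-distance stand-in for the actual NOFI implementation, and the four feature columns named in the comment are hypothetical.

        import numpy as np

        def rank_fragment_ions(features):
            """Rank fragment ions from least to most outlying.

            features: (n_fragments, 4) array of per-fragment attributes, e.g.
            retention-time shift, peak-width ratio, intensity residual versus the
            library, and a peak-shape score (stand-ins for the four NOFI dimensions).
            """
            d = features - features.mean(axis=0)
            inv_cov = np.linalg.pinv(np.cov(features, rowvar=False))
            md2 = np.einsum('ij,jk,ik->i', d, inv_cov, d)   # squared Mahalanobis distance
            return np.argsort(md2)                          # non-outliers first

    A peptide would then be quantified from the summed areas of its Top3-5 fragments in the returned order, mirroring the Top3-5 NOFI summation described above.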

  14. Investigating the Validity of Two Widely Used Quantitative Text Tools

    ERIC Educational Resources Information Center

    Cunningham, James W.; Hiebert, Elfrieda H.; Mesmer, Heidi Anne

    2018-01-01

    In recent years, readability formulas have gained new prominence as a basis for selecting texts for learning and assessment. Variables that quantitative tools count (e.g., word frequency, sentence length) provide valid measures of text complexity insofar as they accurately predict representative and high-quality criteria. The longstanding…

  15. NEW TARGET AND CONTROL ASSAYS FOR QUANTITATIVE POLYMERASE CHAIN REACTION (QPCR) ANALYSIS OF ENTEROCOCCI IN WATER

    EPA Science Inventory

    Enterococci are frequently monitored in water samples as indicators of fecal pollution. Attention is now shifting from culture-based methods for enumerating these organisms to more rapid molecular methods such as QPCR. Accurate quantitative analyses by this method require highly...

  16. A new LC-MS based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous system

    PubMed Central

    Wang, Shunhai; Bobst, Cedric E.; Kaltashov, Igor A.

    2018-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein which is viewed as a promising drug carrier to target the central nervous system due to its ability to penetrate the blood-brain barrier (BBB). Among the many challenges during the development of Tf-based therapeutics, sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult due to the presence of abundant endogenous Tf. Herein, we describe the development of a new LC-MS based method for sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous hTf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed O18-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation. PMID:26307718
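
    Once the labeled internal standard is spiked at a known level, the quantitation step in such workflows reduces to ratio arithmetic; a minimal single-point sketch, assuming a response factor of 1 (which protein-level O18 labeling approximates):

        def is_quantify(area_analyte, area_internal_std, conc_internal_std,
                        response_factor=1.0):
            """Single-point internal-standard quantitation (illustrative).

            The labeled internal standard co-purifies and co-elutes with the
            analyte, so the measured area ratio maps onto a concentration ratio."""
            return conc_internal_std * (area_analyte / area_internal_std) / response_factor

        print(is_quantify(4200.0, 10000.0, 5.0))   # -> 2.1 (same units as the spike)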

  17. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE PAGES

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    2017-01-18

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work will demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.
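
    For context, the low-loading measurements referenced above rest on peak-current relations such as the Randles-Sevcik equation for a reversible couple under planar diffusion, and it is precisely this simple regime that uncompensated resistance and cylindrical diffusion break down at high weight loadings. A sketch of the baseline relation, with all parameters user-supplied:

        import math

        F, R = 96485.0, 8.314   # Faraday constant (C/mol), gas constant (J/(mol K))

        def randles_sevcik_concentration(i_peak, n, area_cm2, d_cm2_s, scan_v_s, temp_k):
            """Concentration (mol/cm^3) from the CV peak current, assuming a
            reversible couple and semi-infinite planar diffusion:
            i_p = 0.4463 * n * F * A * C * sqrt(n * F * v * D / (R * T))."""
            return i_peak / (0.4463 * n * F * area_cm2 *
                             math.sqrt(n * F * scan_v_s * d_cm2_s / (R * temp_k)))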

  18. Communication—Quantitative Voltammetric Analysis of High Concentration Actinides in Molten Salts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, Nathaniel C.; Willit, James L.; Williamson, Mark A.

    Previous electroanalytical studies have shown that cyclic voltammetry can provide accurate quantitative measurements of actinide concentrations at low weight loadings in molten salts. However, above 2 wt%, the techniques were found to underpredict the concentrations of the reactant species. Here this work will demonstrate that much of the discrepancy is caused by uncompensated resistance and cylindrical diffusion. An improved electroanalytical approach has therefore been developed using the results of digital simulations to take these effects into account. This approach allows for accurate electroanalytical predictions across the full range of weight loadings expected to be encountered in operational nuclear fuel processing equipment.

  19. Searching for an Accurate Marker-Based Prediction of an Individual Quantitative Trait in Molecular Plant Breeding

    PubMed Central

    Fu, Yong-Bi; Yang, Mo-Hua; Zeng, Fangqin; Biligetu, Bill

    2017-01-01

    Molecular plant breeding with the aid of molecular markers has played an important role in modern plant breeding over the last two decades. Many marker-based predictions for quantitative traits have been made to enhance parental selection, but the trait prediction accuracy remains generally low, even with the aid of dense, genome-wide SNP markers. To search for more accurate trait-specific prediction with informative SNP markers, we conducted a literature review on the prediction issues in molecular plant breeding and on the applicability of an RNA-Seq technique for developing function-associated specific trait (FAST) SNP markers. To understand whether and how FAST SNP markers could enhance trait prediction, we also performed a theoretical reasoning on the effectiveness of these markers in a trait-specific prediction, and verified the reasoning through computer simulation. In the end, the search yielded an alternative to regular genomic selection with FAST SNP markers that could be explored to achieve more accurate trait-specific prediction. Continuous search for better alternatives is encouraged to enhance marker-based predictions for an individual quantitative trait in molecular plant breeding. PMID:28729875
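
    A hedged sketch of the "regular genomic selection" baseline the review compares against: ridge regression on genome-wide SNP codes. The FAST-marker alternative proposed above would simply restrict the marker columns to the function-associated subset.

        import numpy as np

        def ridge_predict(X_train, y_train, X_new, lam=1.0):
            """rrBLUP-style marker-based trait prediction (illustrative).

            X: individuals x markers, coded e.g. {-1, 0, 1}; lam is the ridge
            penalty. Using only FAST SNP columns gives the trait-specific variant."""
            p = X_train.shape[1]
            beta = np.linalg.solve(X_train.T @ X_train + lam * np.eye(p),
                                   X_train.T @ (y_train - y_train.mean()))
            return y_train.mean() + X_new @ beta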

  20. Monitoring the injured brain: registered, patient specific atlas models to improve accuracy of recovered brain saturation values

    NASA Astrophysics Data System (ADS)

    Clancy, Michael; Belli, Antonio; Davies, David; Lucas, Samuel J. E.; Su, Zhangjie; Dehghani, Hamid

    2015-07-01

    The subject of superficial contamination and signal origins remains a widely debated topic in the field of Near Infrared Spectroscopy (NIRS), yet the concept of using the technology to monitor an injured brain, in a clinical setting, poses additional challenges concerning the quantitative accuracy of recovered parameters. Using high-density diffuse optical tomography probes, quantitatively accurate parameters from different layers (skin, bone and brain) can be recovered from subject-specific reconstruction models. This study assesses the use of registered atlas models for situations where subject-specific models are not available. Data simulated from subject-specific models were reconstructed using the eight registered atlas models, implementing a regional (layered) parameter recovery in NIRFAST. A 3-region recovery based on the atlas model yielded recovered brain saturation values which were accurate to within 4.6% (percentage error) of the simulated values, validating the technique. The recovered saturations in the superficial regions were not quantitatively accurate. These findings highlight differences in superficial (skin and bone) layer thickness between the subject and atlas models. This layer-thickness mismatch was propagated through the reconstruction process, decreasing the parameter accuracy.

  1. Robust and fast characterization of OCT-based optical attenuation using a novel frequency-domain algorithm for brain cancer detection

    NASA Astrophysics Data System (ADS)

    Yuan, Wu; Kut, Carmen; Liang, Wenxuan; Li, Xingde

    2017-03-01

    Cancer is known to alter the local optical properties of tissues. The detection of OCT-based optical attenuation provides a quantitative method to efficiently differentiate cancer from non-cancer tissues. In particular, the intraoperative use of quantitative OCT is able to provide a direct visual guidance in real time for accurate identification of cancer tissues, especially these without any obvious structural layers, such as brain cancer. However, current methods are suboptimal in providing high-speed and accurate OCT attenuation mapping for intraoperative brain cancer detection. In this paper, we report a novel frequency-domain (FD) algorithm to enable robust and fast characterization of optical attenuation as derived from OCT intensity images. The performance of this FD algorithm was compared with traditional fitting methods by analyzing datasets containing images from freshly resected human brain cancer and from a silica phantom acquired by a 1310 nm swept-source OCT (SS-OCT) system. With graphics processing unit (GPU)-based CUDA C/C++ implementation, this new attenuation mapping algorithm can offer robust and accurate quantitative interpretation of OCT images in real time during brain surgery.
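
    For orientation, one common closed-form depth-resolved attenuation estimator, computed per A-line from the linear-intensity OCT signal, is sketched below; the paper's frequency-domain algorithm is a different and faster approach that is not reproduced here.

        import numpy as np

        def depth_resolved_mu(a_line, dz):
            """Per-pixel attenuation mu[i] ~ I[i] / (2 * dz * sum(I[i+1:])),
            i.e. each pixel's signal over twice the light remaining below it.
            a_line: linear-intensity A-scan; dz: pixel spacing in depth."""
            below = np.cumsum(a_line[::-1])[::-1] - a_line   # exclusive suffix sums
            with np.errstate(divide='ignore', invalid='ignore'):
                return a_line / (2.0 * dz * below)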

  2. Understanding the Radioactive Ingrowth and Decay of Naturally Occurring Radioactive Materials in the Environment: An Analysis of Produced Fluids from the Marcellus Shale

    PubMed Central

    Nelson, Andrew W.; Eitrheim, Eric S.; Knight, Andrew W.; May, Dustin; Mehrhoff, Marinea A.; Shannon, Robert; Litman, Robert; Burnett, William C.; Forbes, Tori Z.; Schultz, Michael K.

    2015-01-01

    Background: The economic value of unconventional natural gas resources has stimulated rapid globalization of horizontal drilling and hydraulic fracturing. However, natural radioactivity found in the large volumes of “produced fluids” generated by these technologies is emerging as an international environmental health concern. Current assessments of the radioactivity concentration in liquid wastes focus on a single element—radium. However, the use of radium alone to predict radioactivity concentrations can greatly underestimate total levels. Objective: We investigated the contribution of naturally occurring radioactive materials (NORM), including uranium, thorium, actinium, radium, lead, bismuth, and polonium isotopes, to the total radioactivity of hydraulic fracturing wastes. Methods: For this study we used established methods and developed new methods designed to quantitate NORM of public health concern that may be enriched in complex brines from hydraulic fracturing wastes. Specifically, we examined the use of high-purity germanium gamma spectrometry and isotope dilution alpha spectrometry to quantitate NORM. Results: We observed that radium decay products were initially absent from produced fluids due to differences in solubility. However, in systems closed to the release of gaseous radon, our model predicted that decay products will begin to ingrow immediately and (under these closed-system conditions) can contribute to an increase in the total radioactivity for more than 100 years. Conclusions: Accurate predictions of radioactivity concentrations are critical for estimating doses to potentially exposed individuals and the surrounding environment. These predictions must include an understanding of the geochemistry, decay properties, and ingrowth kinetics of radium and its decay product radionuclides. Citation: Nelson AW, Eitrheim ES, Knight AW, May D, Mehrhoff MA, Shannon R, Litman R, Burnett WC, Forbes TZ, Schultz MK. 2015. Understanding the radioactive ingrowth and decay of naturally occurring radioactive materials in the environment: an analysis of produced fluids from the Marcellus Shale. Environ Health Perspect 123:689–696; http://dx.doi.org/10.1289/ehp.1408855 PMID:25831257

  3. Complex and dynamic landscape of RNA polyadenylation revealed by PAS-Seq

    PubMed Central

    Shepard, Peter J.; Choi, Eun-A; Lu, Jente; Flanagan, Lisa A.; Hertel, Klemens J.; Shi, Yongsheng

    2011-01-01

    Alternative polyadenylation (APA) of mRNAs has emerged as an important mechanism for post-transcriptional gene regulation in higher eukaryotes. Although microarrays have recently been used to characterize APA globally, they have a number of serious limitations that prevent comprehensive and highly quantitative analysis. To better characterize APA and its regulation, we have developed a deep sequencing-based method called Poly(A) Site Sequencing (PAS-Seq) for quantitatively profiling RNA polyadenylation at the transcriptome level. PAS-Seq not only accurately and comprehensively identifies poly(A) junctions in mRNAs and noncoding RNAs, but also provides quantitative information on the relative abundance of polyadenylated RNAs. PAS-Seq analyses of human and mouse transcriptomes showed that 40%–50% of all expressed genes produce alternatively polyadenylated mRNAs. Furthermore, our study detected evolutionarily conserved polyadenylation of histone mRNAs and revealed novel features of mitochondrial RNA polyadenylation. Finally, PAS-Seq analyses of mouse embryonic stem (ES) cells, neural stem/progenitor (NSP) cells, and neurons not only identified more poly(A) sites than what was found in the entire mouse EST database, but also detected significant changes in the global APA profile that lead to lengthening of 3′ untranslated regions (UTR) in many mRNAs during stem cell differentiation. Together, our PAS-Seq analyses revealed a complex landscape of RNA polyadenylation in mammalian cells and the dynamic regulation of APA during stem cell differentiation. PMID:21343387

  4. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.

  5. Highly Accurate Quantitative Analysis Of Enantiomeric Mixtures from Spatially Frequency Encoded 1H NMR Spectra.

    PubMed

    Plainchont, Bertrand; Pitoux, Daisy; Cyrille, Mathieu; Giraud, Nicolas

    2018-02-06

    We propose an original concept to accurately measure enantiomeric excesses on proton NMR spectra, which combines high-resolution techniques based on a spatial encoding of the sample with the use of optically active weakly orienting solvents. We show that it is possible to simulate accurately dipolar edited spectra of enantiomers dissolved in a chiral liquid crystalline phase, and to use these simulations to calibrate integrations that can be measured on experimental data, in order to perform a quantitative chiral analysis. This approach is demonstrated on a chemical intermediate for which optical purity is an essential criterion. We find that there is a very good correlation between the experimental and calculated integration ratios extracted from G-SERF spectra, which paves the way to a general method of determination of enantiomeric excesses based on the observation of 1H nuclei.
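
    The reported quantity ultimately reduces to peak-integration arithmetic once the dipolar-edited signals of the two enantiomers are resolved and calibrated; a trivial sketch:

        def enantiomeric_excess(integral_major, integral_minor):
            """ee (%) from calibrated integrations of the two enantiomers'
            signals, e.g. extracted from a G-SERF spectrum."""
            return 100.0 * (integral_major - integral_minor) / (integral_major + integral_minor)

        print(enantiomeric_excess(98.0, 2.0))   # -> 96.0 (% ee)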

  6. Accurate quantitative CF-LIBS analysis of both major and minor elements in alloys via iterative correction of plasma temperature and spectral intensity

    NASA Astrophysics Data System (ADS)

    Shuxia, ZHAO; Lei, ZHANG; Jiajia, HOU; Yang, ZHAO; Wangbao, YIN; Weiguang, MA; Lei, DONG; Liantuan, XIAO; Suotang, JIA

    2018-03-01

    The chemical composition of alloys directly determines their mechanical behaviors and application fields. Accurate and rapid analysis of both major and minor elements in alloys plays a key role in metallurgy quality control and material classification processes. A quantitative calibration-free laser-induced breakdown spectroscopy (CF-LIBS) analysis method, which carries out combined correction of plasma temperature and spectral intensity by using a second-order iterative algorithm and two boundary standard samples, is proposed to realize accurate composition measurements. Experimental results show that, compared to conventional CF-LIBS analysis, the relative errors for the major elements Cu and Zn and the minor element Pb in the copper-lead alloys have been reduced from 12%, 26% and 32% to 1.8%, 2.7% and 13.4%, respectively. The measurement accuracy for all elements has been improved substantially.
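
    The plasma-temperature step that the iterative correction revolves around is the classic Boltzmann plot; a minimal sketch, assuming a single ionization stage and optically thin lines:

        import numpy as np

        K_B_EV = 8.617e-5   # Boltzmann constant, eV/K

        def boltzmann_temperature(intensity, wavelength, g_upper, a_ki, e_upper_ev):
            """Fit ln(I * lambda / (g * A)) against the upper-level energy E_k;
            the slope equals -1/(k_B * T). CF-LIBS schemes iterate this with
            corrected intensities until the temperature converges."""
            y = np.log(intensity * wavelength / (g_upper * a_ki))
            slope, _ = np.polyfit(e_upper_ev, y, 1)
            return -1.0 / (K_B_EV * slope)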

  7. [Doppler echocardiography of tricuspid insufficiency. Methods of quantification].

    PubMed

    Loubeyre, C; Tribouilloy, C; Adam, M C; Mirode, A; Trojette, F; Lesbre, J P

    1994-01-01

    Evaluation of tricuspid incompetence has benefitted considerably from the development of Doppler ultrasound. In addition to direct analysis of the valves, which provides information about the mechanism involved, this method can provide an accurate evaluation of severity, mainly through use of the Doppler mode. Alongside newer criteria still under evaluation (mainly the convergence zone of the regurgitant jet), several indices are recognised as good quantitative parameters: extension of the regurgitant jet into the right atrium, anterograde tricuspid flow, the laminar nature of the regurgitant flow, and analysis of the flow in the supra-hepatic veins. The evaluation remains only semi-quantitative, since calculation of the regurgitation fraction from pulsed Doppler does not appear to be reliable; an accurate semi-quantitative evaluation is nevertheless possible through careful and consistent use of all available criteria. The authors discuss the value of the various evaluation criteria mentioned in the literature and try to define a practical approach.

  8. Finding the bottom and using it

    PubMed Central

    Sandoval, Ruben M.; Wang, Exing; Molitoris, Bruce A.

    2014-01-01

    Maximizing 2-photon parameters used in acquiring images for quantitative intravital microscopy, especially when high sensitivity is required, remains an open area of investigation. Here we present data on correctly setting the black level of the photomultiplier tube amplifier by adjusting the offset to allow for accurate quantitation of low-intensity processes. When the black level is set too high, some low-intensity pixel values become zero and a nonlinear degradation in sensitivity occurs, rendering otherwise quantifiable low-intensity values virtually undetectable. Initial studies using a series of increasing offsets for a sequence of concentrations of fluorescent albumin in vitro revealed a loss of sensitivity for higher offsets at lower albumin concentrations. A similar decrease in sensitivity, and therefore the ability to correctly determine the glomerular permeability coefficient of albumin, occurred in vivo at higher offsets. Finding the offset that yields accurate and linear data is essential for quantitative analysis when high sensitivity is required. PMID:25313346

  9. Quantitative fluorescence tomography using a trimodality system: in vivo validation

    PubMed Central

    Lin, Yuting; Barber, William C.; Iwanczyk, Jan S.; Roeck, Werner W.; Nalcioglu, Orhan; Gulsen, Gultekin

    2010-01-01

    A fully integrated trimodality fluorescence, diffuse optical, and x-ray computed tomography (FT∕DOT∕XCT) system for small animal imaging is reported in this work. The main purpose of this system is to obtain quantitatively accurate fluorescence concentration images using a multimodality approach. XCT offers anatomical information, while DOT provides the necessary background optical property map to improve FT image accuracy. The quantitative accuracy of this trimodality system is demonstrated in vivo. In particular, we show that a 2-mm-diam fluorescence inclusion located 8 mm deep in a nude mouse can only be localized when functional a priori information from DOT is available. However, the error in the recovered fluorophore concentration is nearly 87%. On the other hand, the fluorophore concentration can be accurately recovered within 2% error when both DOT functional and XCT structural a priori information are utilized together to guide and constrain the FT reconstruction algorithm. PMID:20799770

  10. QACD: A method for the quantitative assessment of compositional distribution in geologic materials

    NASA Astrophysics Data System (ADS)

    Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.

    2017-12-01

    In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines the microanalytical techniques that allow for the analysis of major and minor elements at high spatial resolutions (e.g., electron microbeam analysis) with 2D mapping of samples in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large area X-ray element maps obtained by energy-dispersive X-ray spectrometry (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used to not only accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but, critically, enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.

  11. Conditional Toxicity Value (CTV) Predictor: An In Silico Approach for Generating Quantitative Risk Estimates for Chemicals.

    PubMed

    Wignall, Jessica A; Muratov, Eugene; Sedykh, Alexander; Guyton, Kathryn Z; Tropsha, Alexander; Rusyn, Ivan; Chiu, Weihsueh A

    2018-05-01

    Human health assessments synthesize human, animal, and mechanistic data to produce toxicity values that are key inputs to risk-based decision making. Traditional assessments are data-, time-, and resource-intensive, and they cannot be developed for most environmental chemicals owing to a lack of appropriate data. As recommended by the National Research Council, we propose a solution for predicting toxicity values for data-poor chemicals through development of quantitative structure-activity relationship (QSAR) models. We used a comprehensive database of chemicals with existing regulatory toxicity values from U.S. federal and state agencies to develop quantitative QSAR models. We compared QSAR-based model predictions to those based on high-throughput screening (HTS) assays. QSAR models for noncancer threshold-based values and cancer slope factors had cross-validation-based Q² of 0.25-0.45, mean model errors of 0.70-1.11 log10 units, and applicability domains covering >80% of environmental chemicals. Toxicity values predicted from QSAR models developed in this study were more accurate and precise than those based on HTS assays or mean-based predictions. A publicly accessible web interface to make predictions for any chemical of interest is available at http://toxvalue.org. An in silico tool that can predict toxicity values with an uncertainty of an order of magnitude or less can be used to quickly and quantitatively assess risks of environmental chemicals when traditional toxicity data or human health assessments are unavailable. This tool can fill a critical gap in the risk assessment and management of data-poor chemicals. https://doi.org/10.1289/EHP2998.

  12. Quantitative evaluation of haze formation of koji and progression of internal haze by drying of koji during koji making.

    PubMed

    Ito, Kazunari; Gomi, Katsuya; Kariyama, Masahiro; Miyake, Tsuyoshi

    2017-07-01

    The construction of an experimental system that can mimic koji making in the manufacturing setting of a sake brewery is initially required for the quantitative evaluation of mycelia grown on/in koji pellets (haze formation). Koji making with rice was investigated with a solid-state fermentation (SSF) system using a non-airflow box (NAB), which produced uniform conditions in the culture substrate with high reproducibility and allowed for the control of favorable conditions in the substrate during culture. The SSF system using NAB accurately reproduced koji making in a manufacturing setting. To evaluate haze formation during koji making, surfaces and cross sections of koji pellets obtained from koji making tests were observed using a digital microscope. Image analysis was used to distinguish between haze and non-haze sections of koji pellets, enabling the evaluation of haze formation in a batch by measuring the haze rate of a specific number of koji pellets. This method allowed us to obtain continuous and quantitative data on the time course of haze formation. Moreover, drying koji during the late stage of koji making was revealed to cause further penetration of mycelia into koji pellets (internal haze). The koji making test with the SSF system using NAB and quantitative evaluation of haze formation in a batch by image analysis is a useful method for understanding the relations between haze formation and koji making conditions. Copyright © 2017 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  13. Radial Ultrashort TE Imaging Removes the Need for Breath-Holding in Hepatic Iron Overload Quantification by R2* MRI.

    PubMed

    Tipirneni-Sajja, Aaryani; Krafft, Axel J; McCarville, M Beth; Loeffler, Ralf B; Song, Ruitian; Hankins, Jane S; Hillenbrand, Claudia M

    2017-07-01

    The objective of this study is to evaluate radial free-breathing (FB) multiecho ultrashort TE (UTE) imaging as an alternative to Cartesian FB multiecho gradient-recalled echo (GRE) imaging for quantitative assessment of hepatic iron content (HIC) in sedated patients and subjects unable to perform breath-hold (BH) maneuvers. FB multiecho GRE imaging and FB multiecho UTE imaging were conducted for 46 test group patients with iron overload who could not complete BH maneuvers (38 patients were sedated, and eight were not sedated) and 16 control patients who could complete BH maneuvers. Control patients also underwent standard BH multiecho GRE imaging. Quantitative R2* maps were calculated, and mean liver R2* values and coefficients of variation (CVs) for different acquisitions and patient groups were compared using statistical analysis. FB multiecho GRE images displayed motion artifacts and significantly lower R2* values, compared with standard BH multiecho GRE images and FB multiecho UTE images in the control cohort and FB multiecho UTE images in the test cohort. In contrast, FB multiecho UTE images produced artifact-free R2* maps, and mean R2* values were not significantly different from those measured by BH multiecho GRE imaging. Motion artifacts on FB multiecho GRE images resulted in an R2* CV that was approximately twofold higher than the R2* CV from BH multiecho GRE imaging and FB multiecho UTE imaging. The R2* CV was relatively constant over the range of R2* values for FB multiecho UTE, but it increased with increases in R2* for FB multiecho GRE imaging, reflecting that motion artifacts had a stronger impact on R2* estimation with increasing iron burden. FB multiecho UTE imaging was less motion sensitive because of radial sampling, produced excellent image quality, and yielded accurate R2* estimates within the same acquisition time used for multiaveraged FB multiecho GRE imaging. Thus, FB multiecho UTE imaging is a viable alternative for accurate HIC assessment in sedated children and patients who cannot complete BH maneuvers.
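
    The R2* maps referenced above come from voxel-wise mono-exponential fits over the echo train; a minimal sketch (the constant noise-floor offset often added at high iron burden is omitted):

        import numpy as np
        from scipy.optimize import curve_fit

        def fit_r2star(te_s, signal):
            """Fit S(TE) = S0 * exp(-R2* * TE) for one voxel; TE in seconds
            yields R2* in s^-1."""
            model = lambda te, s0, r2s: s0 * np.exp(-r2s * te)
            (_, r2s), _ = curve_fit(model, te_s, signal,
                                    p0=(float(signal[0]), 50.0), maxfev=5000)
            return r2s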

  14. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  15. Quantitative Large-Scale Three-Dimensional Imaging of Human Kidney Biopsies: A Bridge to Precision Medicine in Kidney Disease.

    PubMed

    Winfree, Seth; Dagher, Pierre C; Dunn, Kenneth W; Eadon, Michael T; Ferkowicz, Michael; Barwinska, Daria; Kelly, Katherine J; Sutton, Timothy A; El-Achkar, Tarek M

    2018-06-05

    Kidney biopsy remains the gold standard for uncovering the pathogenesis of acute and chronic kidney diseases. However, the ability to perform high resolution, quantitative, molecular and cellular interrogation of this precious tissue is still at a developing stage compared to other fields such as oncology. Here, we discuss recent advances in performing large-scale, three-dimensional (3D), multi-fluorescence imaging of kidney biopsies and quantitative analysis referred to as 3D tissue cytometry. This approach allows the accurate measurement of specific cell types and their spatial distribution in a thick section spanning the entire length of the biopsy. By uncovering specific disease signatures, including rare occurrences, and linking them to the biology in situ, this approach will enhance our understanding of disease pathogenesis. Furthermore, by providing accurate quantitation of cellular events, 3D cytometry may improve the accuracy of prognosticating the clinical course and response to therapy. Therefore, large-scale 3D imaging and cytometry of kidney biopsy is poised to become a bridge towards personalized medicine for patients with kidney disease. © 2018 S. Karger AG, Basel.

  16. Improved sample preparation of glyphosate and methylphosphonic acid by EPA method 6800A and time-of-flight mass spectrometry using novel solid-phase extraction.

    PubMed

    Wagner, Rebecca; Wetzel, Stephanie J; Kern, John; Kingston, H M Skip

    2012-02-01

    The employment of chemical weapons by rogue states and/or terrorist organizations is an ongoing concern in the United States. The quantitative analysis of nerve agents must be rapid and reliable for use in the private and public sectors. Current methods describe a tedious and time-consuming derivatization for gas chromatography-mass spectrometry and liquid chromatography in tandem with mass spectrometry. Two solid-phase extraction (SPE) techniques for the analysis of glyphosate and methylphosphonic acid are described with the utilization of isotopically enriched analytes for quantitation via atmospheric pressure chemical ionization-quadrupole time-of-flight mass spectrometry (APCI-Q-TOF-MS) that does not require derivatization. Solid-phase extraction-isotope dilution mass spectrometry (SPE-IDMS) involves pre-equilibration of a naturally occurring sample with an isotopically enriched standard. The second extraction method, i-Spike, involves loading an isotopically enriched standard onto the SPE column before the naturally occurring sample. The sample and the spike are then co-eluted from the column enabling precise and accurate quantitation via IDMS. The SPE methods in conjunction with IDMS eliminate concerns of incomplete elution, matrix and sorbent effects, and MS drift. For accurate quantitation with IDMS, the isotopic contribution of all atoms in the target molecule must be statistically taken into account. This paper describes two newly developed sample preparation techniques for the analysis of nerve agent surrogates in drinking water as well as statistical probability analysis for proper molecular IDMS. The methods described in this paper demonstrate accurate molecular IDMS using APCI-Q-TOF-MS with limits of quantitation as low as 0.400 mg/kg for glyphosate and 0.031 mg/kg for methylphosphonic acid. Copyright © 2012 John Wiley & Sons, Ltd.
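
    The quantitation core of both SPE-IDMS and i-Spike is the isotope-dilution equation; a sketch of the classic form is below. As the abstract notes, molecular IDMS additionally requires accounting statistically for the isotopic contribution of every atom in the target molecule, which this elemental-style sketch ignores.

        def idms_concentration(c_spike, m_spike, m_sample,
                               a_spike, b_spike, a_sample, b_sample, r_measured):
            """Classic IDMS: r_measured is the measured ratio of isotope A to
            isotope B in the sample/spike blend; a_* and b_* are the A and B
            abundances in spike and sample. Returns the sample concentration."""
            return (c_spike * (m_spike / m_sample)
                    * (a_spike - r_measured * b_spike)
                    / (r_measured * b_sample - a_sample))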

  17. Quantitative analysis of rib movement based on dynamic chest bone images: preliminary results

    NASA Astrophysics Data System (ADS)

    Tanaka, R.; Sanada, S.; Oda, M.; Mitsutaka, M.; Suzuki, K.; Sakuta, K.; Kawashima, H.

    2014-03-01

    Rib movement during respiration is one of the diagnostic criteria in pulmonary impairments. In general, the rib movement is assessed in fluoroscopy. However, the shadows of lung vessels and bronchi overlapping ribs prevent accurate quantitative analysis of rib movement. Recently, an image-processing technique for separating bones from soft tissue in static chest radiographs, called "bone suppression technique", has been developed. Our purpose in this study was to evaluate the usefulness of dynamic bone images created by the bone suppression technique in quantitative analysis of rib movement. Dynamic chest radiographs of 10 patients were obtained using a dynamic flat-panel detector (FPD). Bone suppression technique based on a massive-training artificial neural network (MTANN) was applied to the dynamic chest images to create bone images. Velocity vectors were measured in local areas on the dynamic bone images, which formed a map. The velocity maps obtained with bone and original images for scoliosis and normal cases were compared to assess the advantages of bone images. With dynamic bone images, we were able to quantify and distinguish movements of ribs from those of other lung structures accurately. Limited rib movements of scoliosis patients appeared as reduced rib velocity vectors. Vector maps in all normal cases exhibited left-right symmetric distributions, whereas those in abnormal cases showed nonuniform distributions. In conclusion, dynamic bone images were useful for accurate quantitative analysis of rib movements: Limited rib movements were indicated as a reduction of rib movement and left-right asymmetric distribution on vector maps. Thus, dynamic bone images can be a new diagnostic tool for quantitative analysis of rib movements without additional radiation dose.

  18. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network

    PubMed Central

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with application of quantitative soft computing prediction models into financial area as reliable and accurate prediction models can be very helpful in management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. Authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate genetic algorithm as an optimizing technique for adapting parameters of ANN which is then compared with standard backpropagation and backpropagation combined with K-means clustering algorithm. Finally, the authors find out that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making the bad decision in decision-making process. PMID:26977450
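
    A hedged sketch of the hybrid idea: a standard Gaussian RBF network produces the raw one-step-ahead forecast, and a moving average of the network's recent errors is added back as a correction. The genetic-algorithm parameter search is omitted, and all shapes and names are illustrative.

        import numpy as np

        def rbf_output(x, centers, widths, weights):
            """Gaussian RBF network: kernel activations plus a linear readout."""
            d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
            phi = np.exp(-d2 / (2.0 * widths ** 2))
            return phi @ weights

        def ma_corrected(y_hat, recent_errors, window=5):
            """Add the moving average of recent forecast errors to the raw forecast."""
            return y_hat + float(np.mean(recent_errors[-window:]))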

  19. Intelligent Soft Computing on Forex: Exchange Rates Forecasting with Hybrid Radial Basis Neural Network.

    PubMed

    Falat, Lukas; Marcek, Dusan; Durisova, Maria

    2016-01-01

    This paper deals with application of quantitative soft computing prediction models into financial area as reliable and accurate prediction models can be very helpful in management decision-making process. The authors suggest a new hybrid neural network which is a combination of the standard RBF neural network, a genetic algorithm, and a moving average. The moving average is supposed to enhance the outputs of the network using the error part of the original neural network. Authors test the suggested model on high-frequency time series data of USD/CAD and examine the ability to forecast exchange rate values for the horizon of one day. To determine the forecasting efficiency, they perform a comparative statistical out-of-sample analysis of the tested model with autoregressive models and the standard neural network. They also incorporate genetic algorithm as an optimizing technique for adapting parameters of ANN which is then compared with standard backpropagation and backpropagation combined with K-means clustering algorithm. Finally, the authors find out that their suggested hybrid neural network is able to produce more accurate forecasts than the standard models and can be helpful in eliminating the risk of making the bad decision in decision-making process.

  20. Simulating observations with HARMONI: the integral field spectrograph for the European Extremely Large Telescope

    NASA Astrophysics Data System (ADS)

    Zieleniewski, Simon; Thatte, Niranjan; Kendrew, Sarah; Houghton, Ryan; Tecza, Matthias; Clarke, Fraser; Fusco, Thierry; Swinbank, Mark

    2014-07-01

    With the next generation of extremely large telescopes commencing construction, there is an urgent need for detailed quantitative predictions of the scientific observations that these new telescopes will enable. Most of these new telescopes will have adaptive optics fully integrated with the telescope itself, allowing unprecedented spatial resolution combined with enormous sensitivity. However, the adaptive optics point spread function will be strongly wavelength dependent, requiring detailed simulations that accurately model these variations. We have developed a simulation pipeline for the HARMONI integral field spectrograph, a first light instrument for the European Extremely Large Telescope. The simulator takes high-resolution input data-cubes of astrophysical objects and processes them with accurate atmospheric, telescope and instrumental effects, to produce mock observed cubes for chosen observing parameters. The output cubes represent the result of a perfect data reduction process, enabling a detailed analysis and comparison between input and output, showcasing HARMONI's capabilities. The simulations utilise a detailed knowledge of the telescope's wavelength dependent adaptive optics point spread function. We discuss the simulation pipeline and present an early example of the pipeline functionality for simulating observations of high redshift galaxies.

  1. Infrared thermography with non-uniform heat flux boundary conditions on the rotor endwall of an axial turbine

    NASA Astrophysics Data System (ADS)

    Lazzi Gazzini, S.; Schädler, R.; Kalfas, A. I.; Abhari, R. S.

    2017-02-01

    It is technically challenging to measure heat fluxes on the rotating components of gas turbines, yet accurate knowledge of local heat loads under engine-representative conditions is crucial for ensuring the reliability of the designs. In this work, quantitative image processing tools were developed to perform fast and accurate infrared thermography measurements on 3D-shaped film-heaters directly deposited on the turbine endwalls. The newly developed image processing method and instrumentation were used to measure the heat load on the rotor endwalls of an axial turbine. A step-transient heat flux calibration technique is applied to measure the heat flux generated locally by the film heater, thus eliminating the need for a rigorously iso-energetic boundary condition. On-board electronics installed on the rotor record the temperature readings of RTDs installed in the substrate below the heaters in order to evaluate the conductive losses in the solid. Full maps of heat transfer coefficient and adiabatic wall temperature are produced for two different operating conditions, demonstrating the sensitivity of the technique to local flow features and variations in heat transfer due to Reynolds number effect.
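
    The two maps mentioned at the end follow from Newton's law of cooling, q'' = h * (T_w - T_aw): running the film heaters at several known flux levels and regressing flux against wall temperature pixel by pixel yields h as the slope and the adiabatic wall temperature as the x-intercept. A per-pixel sketch:

        import numpy as np

        def htc_and_adiabatic_temp(q_flux, t_wall):
            """Fit q'' = h * (T_w - T_aw) over several heater settings for one
            pixel: slope -> heat transfer coefficient h, x-intercept -> T_aw."""
            h, intercept = np.polyfit(t_wall, q_flux, 1)
            return h, -intercept / h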

  2. A new method for determination of cocamidopropyl betaine synthesized from coconut oil through spectral shift of Eriochrome Black T

    NASA Astrophysics Data System (ADS)

    Gholami, Ali; Golestaneh, Mahshid; Andalib, Zeinab

    2018-03-01

    Cocamidopropyl betaine (CAPB) is a zwitterionic surfactant that is synthesized using coconut oil and usually supplied in the form of an aqueous solution of 25-37% w/w. In this study, a novel method based on UV-visible spectroscopy is developed for the accurate determination of CAPB synthesized from coconut oil. Eriochrome Black T (EBT), as a specific color indicator, was added to CAPB, and a red shift and color change were observed. This shift increases the wavelength selectivity of the method. The change in color intensity depends on the concentration of CAPB, so the concentration of a CAPB solution can be determined by measuring its absorbance. After optimizing all the effective parameters, CAPB was determined in commercial real samples. Using the proposed approach, the limit of quantification (LOQ) and relative standard deviation (RSD) were about 4.30 × 10⁻⁵ M and 4.8%, respectively. None of the unreacted materials or by-products produced in the synthesis of CAPB showed any interference in the determination of CAPB. This shows that the proposed method is specific and accurate, and can potentially be used for quantitative determination of CAPB in commercial samples with satisfactory results.
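
    Quantitation in such colorimetric methods is a straight Beer-Lambert calibration; the sketch below also derives an LOQ from the common 10*sigma/slope convention, which may differ from the procedure actually used in the study.

        import numpy as np

        def calibrate_with_loq(conc, absorbance):
            """Linear calibration A = m*C + b, plus LOQ = 10 * sigma / m with
            sigma the residual standard deviation of the fit."""
            conc, absorbance = np.asarray(conc), np.asarray(absorbance)
            m, b = np.polyfit(conc, absorbance, 1)
            sigma = (absorbance - (m * conc + b)).std(ddof=2)
            return m, b, 10.0 * sigma / m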

  3. Development of a stability-indicating CE assay for the determination of amlodipine enantiomers in commercial tablets.

    PubMed

    Fakhari, Ali Reza; Nojavan, Saeed; Haghgoo, Soheila; Mohammadi, Ali

    2008-11-01

    A simple, accurate, precise and sensitive method using CD for the separation and stability-indicating assay of amlodipine enantiomers in commercial tablets has been established. Several types of CD were evaluated, and the best results were obtained using a fused-silica capillary with phosphate running buffer (100 mM, pH 3.0) containing 5 mM hydroxypropyl-alpha-CD. The method showed adequate separation of amlodipine enantiomers from its degradation products. The drug was subjected to oxidation, hydrolysis, photolysis and heat to apply stress conditions. The range of quantitation for both enantiomers was 5-150 microg/mL. Intra- and inter-day RSD (n=6) was <4%. The limit of quantification that produced the requisite precision and accuracy was found to be 5 microg/mL for both enantiomers. The LOD for both enantiomers was found to be 0.5 microg/mL. Degradation products produced as a result of stress studies did not interfere with the detection of the enantiomers, and the assay can thus be considered stability-indicating.

  4. Huggy Pajama: A Remote Interactive Touch and Hugging System

    NASA Astrophysics Data System (ADS)

    Cheok, Adrian David

    Huggy Pajama is a novel wearable system aimed at promoting physical interaction in remote communication between parent and child. This system enables parents and children to hug one another through a hugging interface device and a wearable, hug-reproducing pajama connected through the Internet. The hug input device is a small, mobile doll with an embedded pressure-sensing circuit that is able to accurately sense the varying levels of pressure produced by natural human touch. This device sends hug signals to a haptic jacket that simulates the feeling of being hugged to the wearer. It features air pocket actuators that reproduce hug sensations, heating elements to produce warmth that accompanies hugs, and a color-changing pattern and accessory to indicate distance of separation and communicate expressions. In this chapter, we present the system design of Huggy Pajama. We also show results from quantitative and qualitative user studies which demonstrate the effectiveness of the system in simulating an actual human touch. Results also indicate an increased sense of presence between parents and children when used as an added component to instant messaging and video chat communication.

  5. Improving vertebra segmentation through joint vertebra-rib atlases

    NASA Astrophysics Data System (ADS)

    Wang, Yinong; Yao, Jianhua; Roth, Holger R.; Burns, Joseph E.; Summers, Ronald M.

    2016-03-01

    Accurate spine segmentation allows for improved identification and quantitative characterization of abnormalities of the vertebra, such as vertebral fractures. However, in existing automated vertebra segmentation methods on computed tomography (CT) images, leakage into nearby bones such as ribs occurs due to the close proximity of these visibly intense structures in a 3D CT volume. To reduce this error, we propose the use of joint vertebra-rib atlases to improve the segmentation of vertebrae via multi-atlas joint label fusion. Segmentation was performed and evaluated on CTs containing 106 thoracic and lumbar vertebrae from 10 pathological and traumatic spine patients on an individual vertebra level basis. Vertebra atlases produced errors where the segmentation leaked into the ribs. The use of joint vertebra-rib atlases produced a statistically significant increase in the Dice coefficient from 92.5 +/- 3.1% to 93.8 +/- 2.1% for the left and right transverse processes and a decrease in the mean and max surface distance from 0.75 +/- 0.60mm and 8.63 +/- 4.44mm to 0.30 +/- 0.27mm and 3.65 +/- 2.87mm, respectively.
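
    For reference, the overlap metric quoted above is the Dice coefficient, 2|A∩B|/(|A|+|B|):

        import numpy as np

        def dice(mask_a, mask_b):
            """Dice overlap between two binary segmentation masks."""
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())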

  6. Proteomics-based compositional analysis of complex cellulase-hemicellulase mixtures.

    PubMed

    Chundawat, Shishir P S; Lipton, Mary S; Purvine, Samuel O; Uppugundla, Nirmal; Gao, Dahai; Balan, Venkatesh; Dale, Bruce E

    2011-10-07

    Efficient deconstruction of cellulosic biomass to fermentable sugars for fuel and chemical production is accomplished by a complex mixture of cellulases, hemicellulases, and accessory enzymes (e.g., >50 extracellular proteins). Cellulolytic enzyme mixtures, produced industrially mostly using fungi like Trichoderma reesei, are poorly characterized in terms of their protein composition and its correlation to hydrolytic activity on cellulosic biomass. The secretomes of commercial glycosyl hydrolase-producing microbes were explored using a proteomics approach with high-throughput quantification by liquid chromatography-tandem mass spectrometry (LC-MS/MS). Here, we show that a proteomics-based spectral counting approach is a reasonably accurate and rapid analytical technique that can be used to determine the protein composition of complex glycosyl hydrolase mixtures, and that this composition correlates with the specific activity of individual enzymes present within the mixture. For example, a strong linear correlation was seen between Avicelase activity and total cellobiohydrolase content. Reliable, quantitative and cheaper analytical methods that provide insight into the cellulosic biomass-degrading fungal and bacterial secretomes would lead to further improvements toward commercialization of plant biomass-derived fuels and chemicals.
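
    A minimal sketch of turning spectral counts into composition: the normalized spectral abundance factor (NSAF), one common length-normalized variant (the paper's exact normalization may differ).

        def nsaf(spectral_counts, protein_lengths):
            """Relative protein composition from spectral counts: each protein's
            count/length, normalized so the mixture sums to 1."""
            saf = [c / l for c, l in zip(spectral_counts, protein_lengths)]
            total = sum(saf)
            return [s / total for s in saf]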

  7. a Comparative Analysis of Spatiotemporal Data Fusion Models for Landsat and Modis Data

    NASA Astrophysics Data System (ADS)

    Hazaymeh, K.; Almagbile, A.

    2018-04-01

    In this study, three documented spatiotemporal data fusion models were applied to Landsat-7 and MODIS surface reflectance and NDVI. The algorithms included the spatial and temporal adaptive reflectance fusion model (STARFM), the sparse-representation-based spatiotemporal reflectance fusion model (SPSTFM), and the spatiotemporal image-fusion model (STI-FM). The objectives of this study were to (i) compare the performance of these three fusion models using one Landsat-MODIS spectral reflectance image pair and time-series datasets from the Coleambally irrigation area in Australia, and (ii) quantitatively evaluate the accuracy of the synthetic images generated from each fusion model using statistical measurements. Results showed that the three fusion models predicted the synthetic Landsat-7 image with adequate agreement. The STI-FM produced more accurate reconstructions of both the Landsat-7 spectral bands and NDVI. Furthermore, it produced surface reflectance images having the highest correlation with the actual Landsat-7 images. This study indicated that STI-FM would be more suitable for spatiotemporal data fusion applications such as vegetation monitoring, drought monitoring, and evapotranspiration.

  8. Considerations for potency equivalent calculations in the Ah receptor-based CALUX bioassay: normalization of superinduction results for improved sample potency estimation.

    PubMed

    Baston, David S; Denison, Michael S

    2011-02-15

    The chemically activated luciferase expression (CALUX) system is a mechanistically based recombinant luciferase reporter gene cell bioassay used in combination with chemical extraction and clean-up methods for the detection and relative quantitation of 2,3,7,8-tetrachlorodibenzo-p-dioxin and related dioxin-like halogenated aromatic hydrocarbons in a wide variety of sample matrices. While sample extracts containing complex mixtures of chemicals can produce a variety of distinct concentration-dependent luciferase induction responses in CALUX cells, these effects are produced through a common mechanism of action (i.e. the Ah receptor (AhR)) allowing normalization of results and sample potency determination. Here we describe the diversity in CALUX response to PCDD/Fs from sediment and soil extracts and not only report the occurrence of superinduction of the CALUX bioassay, but we describe a mechanistically based approach for normalization of superinduction data that results in a more accurate estimation of the relative potency of such sample extracts. Copyright © 2010 Elsevier B.V. All rights reserved.

  9. Comparison of quantitative myocardial perfusion imaging CT to fluorescent microsphere-based flow from high-resolution cryo-images

    NASA Astrophysics Data System (ADS)

    Eck, Brendan L.; Fahmi, Rachid; Levi, Jacob; Fares, Anas; Wu, Hao; Li, Yuemeng; Vembar, Mani; Dhanantwari, Amar; Bezerra, Hiram G.; Wilson, David L.

    2016-03-01

    Myocardial perfusion imaging using CT (MPI-CT) has the potential to provide quantitative measures of myocardial blood flow (MBF) which can aid the diagnosis of coronary artery disease. We evaluated the quantitative accuracy of MPI-CT in a porcine model of balloon-induced LAD coronary artery ischemia guided by fractional flow reserve (FFR). We quantified MBF at baseline (FFR=1.0) and under moderate ischemia (FFR=0.7) using MPI-CT and compared to fluorescent microsphere-based MBF from high-resolution cryo-images. Dynamic, contrast-enhanced CT images were obtained using a spectral detector CT (Philips Healthcare). Projection-based mono-energetic images were reconstructed and processed to obtain MBF. Three MBF quantification approaches were evaluated: singular value decomposition (SVD) with fixed Tikhonov regularization (ThSVD), SVD with regularization determined by the L-Curve criterion (LSVD), and Johnson-Wilson parameter estimation (JW). The three approaches over-estimated MBF compared to cryo-images. JW produced the most accurate MBF, with average error 33.3 +/- 19.2 mL/min/100g, whereas LSVD and ThSVD had greater over-estimation, 59.5 +/- 28.3 mL/min/100g and 78.3 +/- 25.6 mL/min/100g, respectively. Relative blood flow as assessed by a flow ratio of LAD-to-remote myocardium was strongly correlated between JW and cryo-imaging, with R²=0.97, compared to R²=0.88 and 0.78 for LSVD and ThSVD, respectively. We assessed tissue impulse response functions (IRFs) from each approach for sources of error. While JW was constrained to physiologic solutions, both LSVD and ThSVD produced IRFs with non-physiologic properties due to noise. The L-curve provided noise-adaptive regularization but did not eliminate non-physiologic IRF properties or optimize for MBF accuracy. These findings suggest that model-based MPI-CT approaches may be more appropriate for quantitative MBF estimation and that cryo-imaging can support the development of MPI-CT by providing spatial distributions of MBF.
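
    A hedged sketch of the Tikhonov-regularized SVD deconvolution underlying the ThSVD and LSVD variants: the tissue curve is deconvolved by the AIF convolution matrix, and flow is read off the peak of the recovered impulse response (absolute scaling to mL/min/100g omitted). LSVD differs only in choosing the regularization weight by the L-curve, while JW instead fits a parametric tissue model.

        import numpy as np

        def svd_mbf(conv_matrix, c_tissue, lam=0.2, dt=1.0):
            """Deconvolve the tissue curve by the AIF convolution matrix using
            Tikhonov-filtered SVD; the impulse-response peak approximates flow."""
            u, s, vt = np.linalg.svd(conv_matrix)
            filt = s / (s ** 2 + (lam * s[0]) ** 2)   # Tikhonov filter factors
            irf = vt.T @ (filt * (u.T @ c_tissue))
            return irf.max() / dt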

  10. Quantitative prediction of phase transformations in silicon during nanoindentation

    NASA Astrophysics Data System (ADS)

    Zhang, Liangchi; Basak, Animesh

    2013-08-01

    This paper establishes the first quantitative relationship between the phases transformed in silicon and the shape characteristics of nanoindentation curves. Based on an integrated analysis using TEM and the unit cell properties of the phases, the volumes of the phases that emerge during nanoindentation are formulated as a function of the pop-out size and the depth of the nanoindentation impression. This simple formula enables fast, accurate and quantitative prediction of the phases in a nanoindentation cycle, which had not previously been possible.

  11. Quantitative Method to Investigate the Balance between Metabolism and Proteome Biomass: Starting from Glycine.

    PubMed

    Gu, Haiwei; Carroll, Patrick A; Du, Jianhai; Zhu, Jiangjiang; Neto, Fausto Carnevale; Eisenman, Robert N; Raftery, Daniel

    2016-12-12

    The balance between metabolism and biomass is very important in biological systems; however, to date there has been no quantitative method to characterize this balance. In this methodological study, we propose using the distribution of amino acids in different domains to investigate this balance. It is well known that endogenous or exogenous amino acids in a biological system are either metabolized, remain as free amino acids (FAAs), or are incorporated into proteome amino acids (PAAs). Using glycine (Gly) as an example, we demonstrate a novel method to accurately determine the amounts of amino acids in various domains using serum, urine, and cell samples. As expected, serum and urine had very different distributions of FAA- and PAA-Gly. Using Tet21N human neuroblastoma cells, we also found that Myc(oncogene)-induced metabolic reprogramming included a higher rate of metabolizing Gly, which provides additional evidence that the metabolism of proliferating cells is adapted to facilitate the production of new cells. It is therefore anticipated that our method will be very valuable for further studies of the metabolism and biomass balance that will lead to a better understanding of human cancers. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Mycotoxin Analysis of Human Urine by LC-MS/MS: A Comparative Extraction Study

    PubMed Central

    Escrivá, Laura; Font, Guillermina

    2017-01-01

    The lower mycotoxin levels detected in urine make the development of sensitive and accurate analytical methods essential. Three extraction methods, namely salting-out liquid–liquid extraction (SALLE), miniQuEChERS (quick, easy, cheap, effective, rugged, and safe), and dispersive liquid–liquid microextraction (DLLME), were evaluated and compared based on analytical parameters for the quantitative LC-MS/MS measurement of 11 mycotoxins (AFB1, AFB2, AFG1, AFG2, OTA, ZEA, BEA, EN A, EN B, EN A1 and EN B1) in human urine. DLLME was selected as the most appropriate methodology, as it produced better validation results for recovery (79–113%), reproducibility (RSDs < 12%), and repeatability (RSDs < 15%) than miniQuEChERS (71–109%, RSDs < 14% and < 24%, respectively) and SALLE (70–108%, RSDs < 14% and < 24%, respectively). Moreover, the lowest limits of detection (LODs) and quantitation (LOQs) were achieved with DLLME (LODs: 0.005–2 μg L−1, LOQs: 0.1–4 μg L−1). The DLLME methodology was used for the analysis of 10 real urine samples from healthy volunteers, showing the presence of ENs B, B1 and A1 at low concentrations. PMID:29048356
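
    For readers who want to reproduce these figures of merit, the sketch below computes recovery, precision, and calibration-based LOD/LOQ for one analyte. It assumes the common ICH-style 3.3·σ/slope and 10·σ/slope limit estimates, which may differ from the exact procedure used in the paper.

```python
import numpy as np

def validation_metrics(spiked_conc, measured, calib_conc, calib_resp):
    """Recovery (%), precision (%RSD), and calibration-based LOD/LOQ
    for one analyte from replicate measurements of a spiked sample
    plus a calibration line."""
    measured = np.asarray(measured, float)
    calib_conc = np.asarray(calib_conc, float)
    calib_resp = np.asarray(calib_resp, float)
    recovery = 100.0 * measured.mean() / spiked_conc
    rsd = 100.0 * measured.std(ddof=1) / measured.mean()
    slope, intercept = np.polyfit(calib_conc, calib_resp, 1)
    resid = calib_resp - (slope * calib_conc + intercept)
    sigma = resid.std(ddof=2)   # n - 2 degrees of freedom for a fitted line
    lod = 3.3 * sigma / slope   # ICH-style detection limit
    loq = 10.0 * sigma / slope  # ICH-style quantitation limit
    return recovery, rsd, lod, loq
```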

  13. Cost Models for MMC Manufacturing Processes

    NASA Technical Reports Server (NTRS)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

    Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and to process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes, and the resulting predicted quality-cost curves are presented and discussed.

  14. Accuracy and precision of pseudo-continuous arterial spin labeling perfusion during baseline and hypercapnia: a head-to-head comparison with ¹⁵O H₂O positron emission tomography.

    PubMed

    Heijtel, D F R; Mutsaerts, H J M M; Bakker, E; Schober, P; Stevens, M F; Petersen, E T; van Berckel, B N M; Majoie, C B L M; Booij, J; van Osch, M J P; Vanbavel, E; Boellaard, R; Lammertsma, A A; Nederveen, A J

    2014-05-15

    Measurements of the cerebral blood flow (CBF) and cerebrovascular reactivity (CVR) provide useful information about cerebrovascular condition and regional metabolism. Pseudo-continuous arterial spin labeling (pCASL) is a promising non-invasive MRI technique to quantitatively measure the CBF, whereas additional hypercapnic pCASL measurements are currently showing great promise for quantitatively assessing the CVR. However, the introduction of pCASL at a larger scale awaits further evaluation of its exact accuracy and precision relative to the gold standard. ¹⁵O H₂O positron emission tomography (PET) is currently regarded as the most accurate and precise method to quantitatively measure both CBF and CVR, though it is one of the more invasive methods as well. In this study we therefore assessed the accuracy and precision of quantitative pCASL-based CBF and CVR measurements by performing a head-to-head comparison with ¹⁵O H₂O PET, based on quantitative CBF measurements during baseline and hypercapnia. We demonstrate that pCASL CBF imaging is accurate with respect to ¹⁵O H₂O PET during both baseline and hypercapnia, with comparable precision. These results pave the way for quantitative usage of pCASL MRI in both clinical and research settings. Copyright © 2014 Elsevier Inc. All rights reserved.
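
    The CVR endpoint compared here is, in essence, the fractional CBF change normalized by the CO₂ challenge. Below is a minimal voxel-wise version, assuming CVR is defined as percent CBF change per mmHg of PaCO₂ (definitions vary between studies).

```python
import numpy as np

def cvr_map(cbf_base, cbf_hyper, delta_paco2):
    """Voxel-wise CVR as percent CBF change per mmHg rise in PaCO2.

    cbf_base, cbf_hyper : arrays of quantitative CBF (mL/100g/min)
    delta_paco2         : measured arterial/end-tidal CO2 change (mmHg)
    """
    cbf_base = np.asarray(cbf_base, float)
    cbf_hyper = np.asarray(cbf_hyper, float)
    return 100.0 * (cbf_hyper - cbf_base) / (cbf_base * delta_paco2)
```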

  15. Determination of exposure multiples of human metabolites for MIST assessment in preclinical safety species without using reference standards or radiolabeled compounds.

    PubMed

    Ma, Shuguang; Li, Zhiling; Lee, Keun-Joong; Chowdhury, Swapan K

    2010-12-20

    A simple, reliable, and accurate method was developed for quantitative assessment of metabolite coverage in preclinical safety species by mixing equal volumes of human plasma with blank plasma of animal species and vice versa, followed by analysis using high-resolution full-scan accurate mass spectrometry. This approach provided results comparable (within ±15%) to those obtained from regulated bioanalysis and did not require synthetic standards or radiolabeled compounds. In addition, both qualitative and quantitative data were obtained on all metabolites from a single LC-MS analysis and, therefore, the coverage of any metabolite of interest can be obtained.
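
    The arithmetic behind the mixed-matrix approach is deliberately simple: because the cross-mixed samples share an identical matrix, the instrument response ratio is itself the exposure multiple. A hedged sketch follows; the metabolite names and numbers are hypothetical.

```python
def exposure_multiples(peak_areas_animal, peak_areas_human):
    """Metabolite coverage from a mixed-matrix LC-MS run: with
    equal-volume cross-mixing of human and animal plasma, matrix
    effects cancel, so the raw peak-area ratio is the animal-to-human
    exposure multiple for each metabolite."""
    return {met: peak_areas_animal[met] / peak_areas_human[met]
            for met in peak_areas_animal}

# A multiple >= 1 suggests the safety species was exposed to at least the
# human level of that metabolite (hypothetical peak areas):
# exposure_multiples({"M1": 4.2e5}, {"M1": 3.9e5})  ->  {"M1": 1.08}
```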

  16. sxtA-Based Quantitative Molecular Assay To Identify Saxitoxin-Producing Harmful Algal Blooms in Marine Waters

    PubMed Central

    Murray, Shauna A.; Wiese, Maria; Stüken, Anke; Brett, Steve; Kellmann, Ralf; Hallegraeff, Gustaaf; Neilan, Brett A.

    2011-01-01

    The recent identification of genes involved in the production of the potent neurotoxin and keystone metabolite saxitoxin (STX) in marine eukaryotic phytoplankton has allowed us for the first time to develop molecular genetic methods to investigate the chemical ecology of harmful algal blooms in situ. We present a novel method for detecting and quantifying the potential for STX production in marine environmental samples. Our assay detects a domain of the gene sxtA that encodes a unique enzyme putatively involved in the sxt pathway in marine dinoflagellates, sxtA4. A product of the correct size was recovered from nine strains of four species of STX-producing Alexandrium and Gymnodinium catenatum and was not detected in the non-STX-producing Alexandrium species, other dinoflagellate cultures, or an environmental sample that did not contain known STX-producing species. However, sxtA4 was also detected in the non-STX-producing strain of Alexandrium tamarense, Tasmanian ribotype. We investigated the copy number of sxtA4 in three strains of Alexandrium catenella and found it to be relatively constant among strains. Using our novel method, we detected and quantified sxtA4 in three environmental blooms of Alexandrium catenella that led to STX uptake in oysters. We conclude that this method shows promise as an accurate, fast, and cost-effective means of quantifying the potential for STX production in marine samples and will be useful for biological oceanographic research and harmful algal bloom monitoring. PMID:21841034

  17. Micro-computed tomography of false starts produced on bone by different hand-saws.

    PubMed

    Pelletti, Guido; Viel, Guido; Fais, Paolo; Viero, Alessia; Visentin, Sindi; Miotto, Diego; Montisci, Massimo; Cecchetto, Giovanni; Giraudo, Chiara

    2017-05-01

    The analysis of macro- and microscopic characteristics of saw marks on bones can provide useful information about the class of the tool utilized to produce the injury. The aim of the present study was to test micro-computed tomography (micro-CT) for the analysis of false starts experimentally produced on 32 human bone sections using 4 different hand-saws in order to verify the potential utility of micro-CT for distinguishing false starts produced by different saws and to correlate the morphology of the tool with that of the bone mark. Each sample was analysed through stereomicroscopy and micro-CT. Stereomicroscopic analysis allowed the identification of the false starts and the detection of the number of tool marks left by each saw. Micro-CT scans, through the integration of 3D renders and multiplanar reconstructions (MPR), allowed the identification of the shape of each false start correlating it to the injuring tool. Our results suggest that micro-CT could be a useful technique for assessing false starts produced by different classes of saws, providing accurate morphological profiles of the bone marks with all the advantages of high resolution 3D imaging (e.g., high accuracy, non-destructive analysis, preservation and documentation of evidence). However, further studies are necessary to integrate qualitative data with quantitative metrical analysis in order to further characterize the false start and the related injuring tool. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. On the accuracy and reproducibility of a novel probabilistic atlas-based generation for calculation of head attenuation maps on integrated PET/MR scanners.

    PubMed

    Chen, Kevin T; Izquierdo-Garcia, David; Poynton, Clare B; Chonde, Daniel B; Catana, Ciprian

    2017-03-01

    To propose an MR-based method for generating continuous-valued head attenuation maps and to assess its accuracy and reproducibility. Demonstrating that novel MR-based photon attenuation correction methods are both accurate and reproducible is essential prior to using them routinely in research and clinical studies on integrated PET/MR scanners. Continuous-valued linear attenuation coefficient maps ("μ-maps") were generated by combining atlases that provided the prior probability of voxel positions belonging to a certain tissue class (air, soft tissue, or bone) with an MR intensity-based likelihood classifier to produce posterior probability maps of tissue classes. These probabilities were used as weights to generate the μ-maps. The accuracy of this probabilistic atlas-based continuous-valued μ-map ("PAC-map") generation method was assessed by calculating the voxel-wise absolute relative change (RC) between the MR-based and scaled CT-based attenuation-corrected PET images. To assess reproducibility, we performed pair-wise comparisons of the RC values obtained from the PET images reconstructed using the μ-maps generated from the data acquired at three time points. The proposed method produced continuous-valued μ-maps that qualitatively reflected the variable anatomy in patients with brain tumors and agreed well with the scaled CT-based μ-maps. The absolute RC comparing the resulting PET volumes was 1.76 ± 2.33%, quantitatively demonstrating that the method is accurate. Additionally, we showed that the method is highly reproducible, the mean RC value for the PET images reconstructed using the μ-maps obtained at the three visits being 0.65 ± 0.95%. Accurate and highly reproducible continuous-valued head μ-maps can be generated from MR data using a probabilistic atlas-based approach.
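
    The weighting scheme is straightforward to express in code. Below is a minimal sketch of a posterior-weighted μ-map, assuming three tissue classes and nominal 511 keV attenuation coefficients; the class set, coefficient values, and array layout are illustrative, not the authors' exact pipeline.

```python
import numpy as np

# Nominal linear attenuation coefficients at 511 keV, in 1/cm (illustrative)
MU_511 = {"air": 0.0, "soft_tissue": 0.096, "bone": 0.151}

def continuous_mu_map(prior, likelihood):
    """Posterior-probability-weighted continuous-valued mu-map.

    prior, likelihood : dicts mapping each tissue class to a 3-D array
    (atlas prior and MR-intensity likelihood); all arrays share one shape.
    """
    posterior = {c: prior[c] * likelihood[c] for c in MU_511}
    norm = sum(posterior.values())
    norm = np.where(norm > 0, norm, 1.0)   # guard voxels with no support
    # Weighted sum over classes gives a continuous value per voxel
    return sum(MU_511[c] * posterior[c] / norm for c in MU_511)
```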

  19. Validation of the Sonomat Against PSG and Quantitative Measurement of Partial Upper Airway Obstruction in Children With Sleep-Disordered Breathing.

    PubMed

    Norman, Mark B; Pithers, Sonia M; Teng, Arthur Y; Waters, Karen A; Sullivan, Colin E

    2017-03-01

    To validate the Sonomat against polysomnography (PSG) metrics in children and to objectively measure snoring and stertor to produce a quantitative indicator of partial upper airway obstruction that accurately reflects the pathology of pediatric sleep-disordered breathing (SDB). Simultaneous PSG and Sonomat recordings were performed in 76 children (46 male, age 5.8 ± 2.8 years, BMI 18.5 ± 3.8 kg/m²). Sleep time, individual respiratory events and the apnea/hypopnea index (AHI) were compared. Obstructed breathing sounds were measured from the unobtrusive non-contact experimental device. There was no significant difference in total sleep time (TST), respiratory events or AHI values, the latter being over-estimated by 0.3 events/hr by the Sonomat. Poor signal quality was minimal and gender, BMI, and body position did not adversely influence event detection. Obstructive and central events were classified correctly. The number of runs and duration of snoring (13 399 events, 20% TST) and stertor (5748 events, 24% TST) were an order of magnitude greater than those of respiratory events (1367 events, 1% TST). Many children defined as normal by PSG had just as many or more runs of snoring and stertor as those with mild, moderate and severe obstructive sleep apnea (OSA). The Sonomat accurately diagnoses SDB in children using current metrics. In addition, it permits quantification of partial airway obstruction that can be used to better describe pediatric SDB. Its non-contact design makes it ideal for use in children. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  20. An improved FSL-FIRST pipeline for subcortical gray matter segmentation to study abnormal brain anatomy using quantitative susceptibility mapping (QSM).

    PubMed

    Feng, Xiang; Deistung, Andreas; Dwyer, Michael G; Hagemeier, Jesper; Polak, Paul; Lebenberg, Jessica; Frouin, Frédérique; Zivadinov, Robert; Reichenbach, Jürgen R; Schweser, Ferdinand

    2017-06-01

    Accurate and robust segmentation of subcortical gray matter (SGM) nuclei is required in many neuroimaging applications. FMRIB's Integrated Registration and Segmentation Tool (FIRST) is one of the most popular software tools for automated subcortical segmentation based on T1-weighted (T1w) images. In this work, we demonstrate that FIRST tends to produce inaccurate SGM segmentation results in the case of abnormal brain anatomy, such as present in atrophied brains, due to a poor spatial match of the subcortical structures with the training data in the MNI space as well as due to insufficient contrast of SGM structures on T1w images. Consequently, such deviations from the average brain anatomy may introduce analysis bias in clinical studies, which may not always be obvious and potentially remain unidentified. To improve the segmentation of subcortical nuclei, we propose to use FIRST in combination with a special Hybrid image Contrast (HC) and Non-Linear (nl) registration module (HC-nlFIRST), where the hybrid image contrast is derived from T1w images and magnetic susceptibility maps to create subcortical contrast that is similar to that in the Montreal Neurological Institute (MNI) template. In our approach, a nonlinear registration replaces FIRST's default linear registration, yielding a more accurate alignment of the input data to the MNI template. We evaluated our method on 82 subjects with particularly abnormal brain anatomy, selected from a database of >2000 clinical cases. Qualitative and quantitative analyses revealed that HC-nlFIRST provides improved segmentation compared to the default FIRST method. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Population-scale three-dimensional reconstruction and quantitative profiling of microglia arbors

    PubMed Central

    Rey-Villamizar, Nicolas; Merouane, Amine; Lu, Yanbin; Mukherjee, Amit; Trett, Kristen; Chong, Peter; Harris, Carolyn; Shain, William; Roysam, Badrinath

    2015-01-01

    Motivation: The arbor morphologies of brain microglia are important indicators of cell activation. This article fills the need for accurate, robust, adaptive and scalable methods for reconstructing 3-D microglial arbors and quantitatively mapping microglia activation states over extended brain tissue regions. Results: Thick rat brain sections (100–300 µm) were multiplex immunolabeled for IBA1 and Hoechst, and imaged by step-and-image confocal microscopy with automated 3-D image mosaicing, producing seamless images of extended brain regions (e.g. 5903 × 9874 × 229 voxels). An over-complete dictionary-based model was learned for the image-specific local structure of microglial processes. The microglial arbors were reconstructed seamlessly using an automated and scalable algorithm that exploits microglia-specific constraints. This method detected 80.1 and 92.8% more centered arbor points, and 53.5 and 55.5% fewer spurious points than existing vesselness and LoG-based methods, respectively, and the traces were 13.1 and 15.5% more accurate based on the DIADEM metric. The arbor morphologies were quantified using Scorcioni’s L-measure. Coifman’s harmonic co-clustering revealed four morphologically distinct classes that concord with known microglia activation patterns. This enabled us to map spatial distributions of microglial activation and cell abundances. Availability and implementation: Experimental protocols, sample datasets, scalable open-source multi-threaded software implementation (C++, MATLAB) in the electronic supplement, and website (www.farsight-toolkit.org). http://www.farsight-toolkit.org/wiki/Population-scale_Three-dimensional_Reconstruction_and_Quanti-tative_Profiling_of_Microglia_Arbors Contact: broysam@central.uh.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25701570

  2. Vision 20/20: Magnetic resonance imaging-guided attenuation correction in PET/MRI: Challenges, solutions, and opportunities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch

    Attenuation correction is an essential component of the long chain of data correction techniques required to achieve the full potential of quantitative positron emission tomography (PET) imaging. The development of combined PET/magnetic resonance imaging (MRI) systems mandated the widespread interest in developing novel strategies for deriving accurate attenuation maps with the aim to improve the quantitative accuracy of these emerging hybrid imaging systems. The attenuation map in PET/MRI should ideally be derived from anatomical MR images; however, MRI intensities reflect proton density and relaxation time properties of biological tissues rather than their electron density and photon attenuation properties. Therefore, in contrast to PET/computed tomography, there is a lack of standardized global mapping between the intensities of MRI signal and linear attenuation coefficients at 511 keV. Moreover, in standard MRI sequences, bones and lung tissues do not produce measurable signals owing to their low proton density and short transverse relaxation times. MR images are also inevitably subject to artifacts that degrade their quality, thus compromising their applicability for the task of attenuation correction in PET/MRI. MRI-guided attenuation correction strategies can be classified in three broad categories: (i) segmentation-based approaches, (ii) atlas-registration and machine learning methods, and (iii) emission/transmission-based approaches. This paper summarizes past and current state-of-the-art developments and latest advances in PET/MRI attenuation correction. The advantages and drawbacks of each approach for addressing the challenges of MR-based attenuation correction are comprehensively described. The opportunities brought by both MRI and PET imaging modalities for deriving accurate attenuation maps and improving PET quantification will be elaborated. Future prospects and potential clinical applications of these techniques and their integration in commercial systems will also be discussed.

  3. Vision 20/20: Magnetic resonance imaging-guided attenuation correction in PET/MRI: Challenges, solutions, and opportunities.

    PubMed

    Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib

    2016-03-01

    Attenuation correction is an essential component of the long chain of data correction techniques required to achieve the full potential of quantitative positron emission tomography (PET) imaging. The development of combined PET/magnetic resonance imaging (MRI) systems mandated the widespread interest in developing novel strategies for deriving accurate attenuation maps with the aim to improve the quantitative accuracy of these emerging hybrid imaging systems. The attenuation map in PET/MRI should ideally be derived from anatomical MR images; however, MRI intensities reflect proton density and relaxation time properties of biological tissues rather than their electron density and photon attenuation properties. Therefore, in contrast to PET/computed tomography, there is a lack of standardized global mapping between the intensities of MRI signal and linear attenuation coefficients at 511 keV. Moreover, in standard MRI sequences, bones and lung tissues do not produce measurable signals owing to their low proton density and short transverse relaxation times. MR images are also inevitably subject to artifacts that degrade their quality, thus compromising their applicability for the task of attenuation correction in PET/MRI. MRI-guided attenuation correction strategies can be classified in three broad categories: (i) segmentation-based approaches, (ii) atlas-registration and machine learning methods, and (iii) emission/transmission-based approaches. This paper summarizes past and current state-of-the-art developments and latest advances in PET/MRI attenuation correction. The advantages and drawbacks of each approach for addressing the challenges of MR-based attenuation correction are comprehensively described. The opportunities brought by both MRI and PET imaging modalities for deriving accurate attenuation maps and improving PET quantification will be elaborated. Future prospects and potential clinical applications of these techniques and their integration in commercial systems will also be discussed.

  4. Using an Educational Electronic Documentation System to Help Nursing Students Accurately Identify Nursing Diagnoses

    ERIC Educational Resources Information Center

    Pobocik, Tamara J.

    2013-01-01

    The use of technology and electronic medical records in healthcare has exponentially increased. This quantitative research project used a pretest/posttest design, and reviewed how an educational electronic documentation system helped nursing students to identify the accurate "related to" statement of the nursing diagnosis for the patient in the case…

  5. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    PubMed Central

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2015-01-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices. PMID:25466541

  6. Advancing the speed, sensitivity and accuracy of biomolecular detection using multi-length-scale engineering

    NASA Astrophysics Data System (ADS)

    Kelley, Shana O.; Mirkin, Chad A.; Walt, David R.; Ismagilov, Rustem F.; Toner, Mehmet; Sargent, Edward H.

    2014-12-01

    Rapid progress in identifying disease biomarkers has increased the importance of creating high-performance detection technologies. Over the last decade, the design of many detection platforms has focused on either the nano or micro length scale. Here, we review recent strategies that combine nano- and microscale materials and devices to produce large improvements in detection sensitivity, speed and accuracy, allowing previously undetectable biomarkers to be identified in clinical samples. Microsensors that incorporate nanoscale features can now rapidly detect disease-related nucleic acids expressed in patient samples. New microdevices that separate large clinical samples into nanocompartments allow precise quantitation of analytes, and microfluidic systems that utilize nanoscale binding events can detect rare cancer cells in the bloodstream more accurately than before. These advances will lead to faster and more reliable clinical diagnostic devices.

  7. Tumour cell dispersion by the ultrasonic aspirator during brain tumour resection.

    PubMed

    Preston, J K; Masciopinto, J; Salamat, M S; Badie, B

    1999-10-01

    Ultrasonic aspirators are commonly used to resect brain tumours because they allow safe, rapid and accurate removal of diseased tissue. Since ultrasonic aspirators generate a spray of aerosolized irrigating fluid around the instrument tip, we questioned whether this spray might contain viable tumour cells that could contribute to intraoperative spread of tumour fragments. To test this hypothesis, we collected the spray produced during the resection of nine brain tumours with an ultrasonic aspirator and semi-quantitatively analysed it for the presence of tumour cells. The aerosolized irrigation fluid was found to contain intact tumour cells or clumps of tumour cells in all nine instances, and there was a trend of increasing tumour cell dispersion with increasing ultrasonic aspiration times. Further examination is required to determine if this intraoperative dispersion of apparently viable tumour fragments contributes to local neoplasm recurrence.

  8. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery. Furthermore, we assessed the ability of the EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures the false altered-protein discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
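
    The core of the EN idea, picking a ratio cutoff from a replicate-vs-replicate null and using it to estimate false discoveries, can be sketched in a few lines. This is an illustrative reading of the method, not the authors' code, and the 5% null-retention level is an assumed choice.

```python
import numpy as np

def en_cutoff_and_fadr(null_log2, case_log2, keep_null_frac=0.05):
    """Ratio cutoff and false altered-protein discovery rate from an
    experimental null.

    null_log2 : protein log2 ratios from a replicate-vs-replicate
                comparison run with the same samples/procedures/batch
    case_log2 : protein log2 ratios from the case-vs-control comparison
    """
    null_abs = np.abs(np.asarray(null_log2, float))
    case_abs = np.abs(np.asarray(case_log2, float))
    # Cutoff keeping keep_null_frac of null proteins above threshold
    cutoff = np.quantile(null_abs, 1.0 - keep_null_frac)
    n_called = (case_abs >= cutoff).sum()
    # Expected false calls: null exceedance rate scaled to the case set
    expected_false = (null_abs >= cutoff).mean() * case_abs.size
    return cutoff, expected_false / max(n_called, 1)
```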

  9. A new liquid chromatography-mass spectrometry-based method to quantitate exogenous recombinant transferrin in cerebrospinal fluid: a potential approach for pharmacokinetic studies of transferrin-based therapeutics in the central nervous systems.

    PubMed

    Wang, Shunhai; Bobst, Cedric E; Kaltashov, Igor A

    2015-01-01

    Transferrin (Tf) is an 80 kDa iron-binding protein that is viewed as a promising drug carrier to target the central nervous system as a result of its ability to penetrate the blood-brain barrier. Among the many challenges during the development of Tf-based therapeutics, the sensitive and accurate quantitation of the administered Tf in cerebrospinal fluid (CSF) remains particularly difficult because of the presence of abundant endogenous Tf. Herein, we describe the development of a new liquid chromatography-mass spectrometry-based method for the sensitive and accurate quantitation of exogenous recombinant human Tf in rat CSF. By taking advantage of a His-tag present in recombinant Tf and applying Ni affinity purification, the exogenous human serum Tf can be greatly enriched from rat CSF, despite the presence of the abundant endogenous protein. Additionally, we applied a newly developed ¹⁸O-labeling technique that can generate internal standards at the protein level, which greatly improved the accuracy and robustness of quantitation. The developed method was investigated for linearity, accuracy, precision, and lower limit of quantitation, all of which met the commonly accepted criteria for bioanalytical method validation.

  10. Performance Evaluation and Quantitative Accuracy of Multipinhole NanoSPECT/CT Scanner for Theranostic Lu-177 Imaging

    NASA Astrophysics Data System (ADS)

    Gupta, Arun; Kim, Kyeong Yun; Hwang, Donghwi; Lee, Min Sun; Lee, Dong Soo; Lee, Jae Sung

    2018-06-01

    SPECT plays an important role in peptide receptor targeted radionuclide therapy using theranostic radionuclides such as Lu-177 for the treatment of various cancers. However, SPECT studies must be quantitatively accurate because the reliable assessment of tumor uptake and tumor-to-normal tissue ratios can only be performed using quantitatively accurate images. Hence, it is important to evaluate the performance parameters and quantitative accuracy of preclinical SPECT systems for therapeutic radioisotopes before conducting pre- and post-therapy SPECT imaging or dosimetry studies. In this study, we evaluated the system performance and quantitative accuracy of the NanoSPECT/CT scanner for Lu-177 imaging using point source and uniform phantom studies. We measured the recovery coefficient, uniformity, spatial resolution, system sensitivity and calibration factor for the mouse whole-body standard aperture. We also performed the experiments using Tc-99m to compare the results with those of Lu-177. We found a recovery coefficient of more than 70% for Lu-177 at the optimum noise level when nine iterations were used. The spatial resolution of Lu-177 with and without adding a uniform background was comparable to that of Tc-99m in the axial, radial and tangential directions. The system sensitivity measured for Lu-177 was approximately one-third of that of Tc-99m.

  11. One step screening of retroviral producer clones by real time quantitative PCR.

    PubMed

    Towers, G J; Stockholm, D; Labrousse-Najburg, V; Carlier, F; Danos, O; Pagès, J C

    1999-01-01

    Recombinant retroviruses are obtained from either stably or transiently transfected retrovirus producer cells. In the case of stably producing lines, a large number of clones must be screened in order to select the one with the highest titre. The multi-step selection of high titre producing clones is time consuming and expensive. We have taken advantage of retroviral endogenous reverse transcription to develop a quantitative PCR assay on crude supernatant from producing clones. We used Taqman PCR technology, which, by using fluorescence measurement at each cycle of amplification, allows PCR product quantification. Fluorescence results from specific degradation of a probe oligonucleotide by the Taq polymerase 5'-3' exonuclease activity. Primers and probe sequences were chosen to anneal to the viral strong-stop species, which is the first DNA molecule synthesised during reverse transcription. The protocol consists of a single real time PCR, using as template filtered viral supernatant without any other pre-treatment. We show that the primers and probe described allow quantitation of serially diluted plasmid down to as few as 15 plasmid molecules. We then tested 200 GFP-expressing retroviral-producing clones either by FACS analysis of infected cells or by using the quantitative PCR. We confirm that the Taqman protocol allows the detection of virus in supernatant and selection of high titre clones. Furthermore, we can determine infectious titre by quantitative PCR on genomic DNA from infected cells, using an additional set of primers and probe for albumin to normalise for the genomic copy number. We demonstrate that real time quantitative PCR can be used as a powerful and reliable single step, high throughput screen for high titre retroviral producer clones.
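
    The quantitation rests on an ordinary qPCR standard curve. A minimal sketch follows, assuming Cq values from a serially diluted plasmid standard; the function names and efficiency convention are generic, not specific to this assay.

```python
import numpy as np

def fit_standard_curve(copies, cq):
    """Fit Cq = m*log10(copies) + b from a serially diluted standard,
    and report the amplification efficiency implied by the slope
    (efficiency = 1.0 means perfect doubling per cycle)."""
    m, b = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10.0 ** (-1.0 / m) - 1.0
    return m, b, efficiency

def copies_from_cq(cq, m, b):
    """Invert the standard curve for an unknown sample, e.g. crude
    viral supernatant from a candidate producer clone."""
    return 10.0 ** ((np.asarray(cq, float) - b) / m)

# Example: m, b, eff = fit_standard_curve([1e1, 1e2, 1e3, 1e4, 1e5],
#                                         [33.1, 29.8, 26.4, 23.0, 19.7])
# clone_titre_proxy = copies_from_cq(27.5, m, b)
```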

  12. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  13. ACVP-02: Plasma SIV/SHIV RNA Viral Load Measurements through the AIDS and Cancer Virus Program Quantitative Molecular Diagnostics Core | Frederick National Laboratory for Cancer Research

    Cancer.gov

    The SIV plasma viral load assay performed by the Quantitative Molecular Diagnostics Core (QMDC) utilizes reagents specifically designed to detect and accurately quantify the full range of SIV/SHIV viral variants and clones in common usage in the rese

  14. QUANTITATION OF MENSTRUAL BLOOD LOSS: A RADIOACTIVE METHOD UTILIZING A COUNTING DOME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tauxe, W.N.

    A description has been given of a simple, accurate technique for the quantitation of menstrual blood loss, involving the determination of a three-dimensional isosensitivity curve and the fashioning of a lucite dome with cover to fit these specifications. Ten normal subjects lost no more than 50 ml each per menstrual period. (auth)

  15. Systematic Standardized and Individualized Assessment of Masticatory Cycles Using Electromagnetic 3D Articulography and Computer Scripts

    PubMed Central

    Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Dias, Fernando José

    2017-01-01

    Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle, providing accurate quantitative data, such as the number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (−0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments. PMID:29075647

  16. Systematic Standardized and Individualized Assessment of Masticatory Cycles Using Electromagnetic 3D Articulography and Computer Scripts.

    PubMed

    Fuentes, Ramón; Arias, Alain; Lezcano, María Florencia; Saravia, Diego; Kuramochi, Gisaku; Dias, Fernando José

    2017-01-01

    Masticatory movements have been studied for decades in odontology; a better understanding of them could improve dental treatments. The aim of this study was to describe an innovative, accurate, and systematic method of analyzing masticatory cycles, generating comparable quantitative data. The masticatory cycles of 5 volunteers (Class I, 19 ± 1.7 years) without articular or dental occlusion problems were evaluated using 3D electromagnetic articulography supported by MATLAB software. The method allows the trajectory morphology of the set of chewing cycles to be analyzed from different views and angles. It was also possible to individualize the trajectory of each cycle, providing accurate quantitative data, such as the number of cycles, cycle areas in frontal view, and the ratio between each cycle area and the frontal mandibular border movement area. There was a moderate negative correlation (-0.61) between the area and the number of cycles: the greater the cycle area, the smaller the number of repetitions. Finally, it was possible to evaluate the area of the cycles through time, which did not reveal a standardized behavior. The proposed method provided reproducible, intelligible, and accurate quantitative and graphical data, suggesting that it is promising and may be applied in different clinical situations and treatments.
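
    The per-cycle area computation reduces to a polygon-area problem on the digitised frontal-plane trajectory. A minimal sketch using the shoelace formula is shown below, in Python rather than the MATLAB scripts used in the study; the variable names and the correlation step are illustrative.

```python
import numpy as np

def cycle_area(x, y):
    """Area enclosed by one masticatory cycle in the frontal plane,
    computed with the shoelace formula on the digitised trajectory."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 0.5 * abs(np.dot(x, np.roll(y, -1)) - np.dot(y, np.roll(x, -1)))

# Per-cycle ratio against the frontal border-movement area, and the
# area-vs-repetition correlation the study reports (both illustrative):
# ratios = [cycle_area(cx, cy) / border_area for (cx, cy) in cycles]
# r = np.corrcoef(mean_area_per_subject, n_cycles_per_subject)[0, 1]
```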

  17. Cerebellar ataxia: abnormal control of interaction torques across multiple joints.

    PubMed

    Bastian, A J; Martin, T A; Keating, J G; Thach, W T

    1996-07-01

    1. We studied seven subjects with cerebellar lesions and seven control subjects as they made reaching movements in the sagittal plane to a target directly in front of them. Reaches were made under three different conditions: 1) "slow-accurate," 2) "fast-accurate," and 3) "fast as possible." All subjects were videotaped moving in a sagittal plane with markers on the index finger, wrist, elbow, and shoulder. Marker positions were digitized and then used to calculate joint angles. For each of the shoulder, elbow and wrist joints, inverse dynamics equations based on a three-segment limb model were used to estimate the net torque (sum of components) and each of the component torques. The component torques consisted of the torque due to gravity, the dynamic interaction torques induced passively by the movement of the adjacent joint, and the torque produced by the muscles and passive tissue elements (sometimes called "residual" torque). 2. A kinematic analysis of the movement trajectory and the change in joint angles showed that the reaches of subjects with cerebellar lesions were abnormal compared with reaches of control subjects. In both the slow-accurate and fast-accurate conditions, the cerebellar subjects made abnormally curved wrist paths; the curvature was greater in the slow-accurate condition. During the slow-accurate condition, cerebellar subjects showed target undershoot and tended to move one joint at a time (decomposition). During the fast-accurate reaches, the cerebellar subjects showed target overshoot. Additionally, in the fast-accurate condition, cerebellar subjects moved the joints at abnormal rates relative to one another, but the movements were less decomposed. Only three subjects were tested in the fast as possible condition; this condition was analyzed only to determine maximal reaching speeds of subjects with cerebellar lesions. Cerebellar subjects moved more slowly than controls in all three conditions. 3. A kinetic analysis of torques generated at each joint during the slow-accurate reaches and the fast-accurate reaches revealed that subjects with cerebellar lesions produced very different torque profiles compared with control subjects. In the slow-accurate condition, the cerebellar subjects produced abnormal elbow muscle torques that prevented the normal elbow extension early in the reach. In the fast-accurate condition, the cerebellar subjects produced inappropriate levels of shoulder muscle torque and also produced elbow muscle torques that did not vary appropriately with the dynamic interaction torques that occurred at the elbow. Lack of appropriate muscle torque resulted in excessive contributions of the dynamic interaction torque during the fast-accurate reaches. 4. The inability to produce muscle torques that predict, accommodate, and compensate for the dynamic interaction torques appears to be an important cause of the classic kinematic deficits shown by cerebellar subjects during attempted reaching. These kinematic deficits include incoordination of the shoulder and the elbow joints, a curved trajectory, and overshoot. In the fast-accurate condition, cerebellar subjects often made inappropriate muscle torques relative to the dynamic interaction torques. Because of this, interaction torques often determined the pattern of incoordination of the elbow and shoulder that produced the curved trajectory and target overshoot.
In the slow-accurate condition, we reason that the cerebellar subjects may use a decomposition strategy so as to simplify the movement and not have to control both joints simultaneously. From these results, we suggest that a major role of the cerebellum is in generating muscle torques at a joint that will predict the interaction torques being generated by other moving joints and compensate for them as they occur.
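
    The torque partitioning described above can be made concrete with a planar two-link (shoulder-elbow) model. The study's model also included the wrist, so the sketch below is a simplified illustration under standard rigid-body assumptions; the gravity/interaction/muscle partition follows the usual textbook convention and is not necessarily the authors' exact definition.

```python
import numpy as np

def joint_torque_components(q, qd, qdd, p):
    """Partition joint torques for a planar two-link arm into
    self-inertial, interaction (coupling), gravity, and muscle terms.

    q, qd, qdd : length-2 joint angles, velocities, accelerations
                 (q[0] shoulder from horizontal; q[1] elbow, relative)
    p          : dict with masses m1, m2; upper-arm length l1;
                 centre-of-mass distances lc1, lc2; inertias I1, I2;
                 gravitational acceleration g
    """
    m1, m2, l1, lc1, lc2, I1, I2, g = (p[k] for k in
        ("m1", "m2", "l1", "lc1", "lc2", "I1", "I2", "g"))
    c2, s2 = np.cos(q[1]), np.sin(q[1])
    # Inertia matrix entries of the standard two-link equations of motion
    M11 = I1 + I2 + m1*lc1**2 + m2*(l1**2 + lc2**2 + 2*l1*lc2*c2)
    M12 = I2 + m2*(lc2**2 + l1*lc2*c2)
    M22 = I2 + m2*lc2**2
    h = m2 * l1 * lc2 * s2
    gravity = np.array([
        (m1*lc1 + m2*l1)*g*np.cos(q[0]) + m2*lc2*g*np.cos(q[0] + q[1]),
        m2*lc2*g*np.cos(q[0] + q[1])])
    self_inertial = np.array([M11*qdd[0], M22*qdd[1]])
    # Torque each joint "feels" from the other joint's motion
    interaction = np.array([
        M12*qdd[1] - h*qd[1]*(2*qd[0] + qd[1]),
        M12*qdd[0] + h*qd[0]**2])
    muscle = self_inertial + interaction + gravity
    return self_inertial, interaction, gravity, muscle
```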

  18. Parabolic quantitative structure-activity relationships and photodynamic therapy: application of a three-compartment model with clearance to the in vivo quantitative structure-activity relationships of a congeneric series of pyropheophorbide derivatives used as photosensitizers for photodynamic therapy.

    PubMed

    Potter, W R; Henderson, B W; Bellnier, D A; Pandey, R K; Vaughan, L A; Weishaupt, K R; Dougherty, T J

    1999-11-01

    An open three-compartment pharmacokinetic model was applied to the in vivo quantitative structure-activity relationship (QSAR) data of a homologous series of pyropheophorbide photosensitizers for photodynamic therapy (PDT). The physical model was a lipid compartment sandwiched between two identical aqueous compartments. The first compartment was assumed to clear irreversibly at a rate K0. The measured octanol-water partition coefficients, P(i) (where i is the number of carbons in the alkyl chain), and the clearance rate K0 determined the clearance kinetics of the drugs. Solving the coupled differential equations of the three-compartment model produced clearance kinetics for each of the sensitizers in each of the compartments. The third compartment was found to contain the target of PDT. This series of compounds is quite lipophilic; these drugs are therefore found mainly in the second compartment. The drug level in the third compartment represents a small fraction of the tissue level and is thus not accessible to direct measurement by extraction. The second compartment of the model accurately predicted the clearance from the serum of mice of the hexyl ether of pyropheophorbide a, one member of this series of compounds. The diffusion and clearance rate constants were those found by fitting the pharmacokinetics of the third compartment to the QSAR data. This result validated the magnitude and mechanistic significance of the rate constants used to model the QSAR data. The theory of PDT response to dose was then applied to the kinetic behavior of the target-compartment drug concentration. This produced a pharmacokinetics-based function connecting PDT response to dose as a function of time postinjection. This mechanistic dose-response function was fitted to published, single time point QSAR data for the pheophorbides. As a result, the PDT target threshold dose, together with the predicted QSAR as a function of time postinjection, was found.
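
    The compartmental structure described above maps directly onto a small linear ODE system. The sketch below, with purely illustrative rate constants, shows how the target-compartment kinetics could be simulated; in the paper the rates are fixed by the measured P(i) values and the shared clearance rate K0 rather than chosen freely.

```python
import numpy as np
from scipy.integrate import solve_ivp

def three_compartment(t, c, k12, k21, k23, k32, k0):
    """Aqueous compartment c1 (clears irreversibly at k0) exchanges with
    the lipid compartment c2, which exchanges with the aqueous target
    compartment c3. All transfers are first order."""
    c1, c2, c3 = c
    return [k21*c2 - (k12 + k0)*c1,
            k12*c1 + k32*c3 - (k21 + k23)*c2,
            k23*c2 - k32*c3]

# Simulate 72 h of kinetics after a unit bolus into c1
# (rate constants here are illustrative, in 1/h)
sol = solve_ivp(three_compartment, (0.0, 72.0), [1.0, 0.0, 0.0],
                args=(0.5, 0.05, 0.5, 0.05, 0.1), dense_output=True)
t = np.linspace(0.0, 72.0, 200)
target_level = sol.sol(t)[2]   # drug level in the PDT target compartment
```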

  19. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data

    PubMed Central

    Chen, Yi-Hau

    2017-01-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation has also led to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimation if the sample size is very limited, which is usually the case for the data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed by the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA. PMID:28622336

  20. A knowledge-based T2-statistic to perform pathway analysis for quantitative proteomic data.

    PubMed

    Lai, En-Yu; Chen, Yi-Hau; Wu, Kun-Pin

    2017-06-01

    Approaches to identify significant pathways from high-throughput quantitative data have been developed in recent years. Still, the analysis of proteomic data remains difficult because of limited sample size. This limitation has also led to the common practice of using a competitive null, which fundamentally treats genes or proteins as independent units. The independence assumption ignores the associations among biomolecules with similar functions or cellular localization, as well as the interactions among them manifested as changes in expression ratios. Consequently, these methods often underestimate the associations among biomolecules and cause false positives in practice. Some studies incorporate the sample covariance matrix into the calculation to address this issue. However, sample covariance may not be a precise estimation if the sample size is very limited, which is usually the case for the data produced by mass spectrometry. In this study, we introduce a multivariate test under a self-contained null to perform pathway analysis for quantitative proteomic data. The covariance matrix used in the test statistic is constructed by the confidence scores retrieved from the STRING database or the HitPredict database. We also design an integrating procedure to retain pathways of sufficient evidence as a pathway group. The performance of the proposed T2-statistic is demonstrated using five published experimental datasets: the T-cell activation, the cAMP/PKA signaling, the myoblast differentiation, and the effect of dasatinib on the BCR-ABL pathway are proteomic datasets produced by mass spectrometry; and the protective effect of myocilin via the MAPK signaling pathway is a gene expression dataset of limited sample size. Compared with other popular statistics, the proposed T2-statistic yields more accurate descriptions in agreement with the discussion of the original publication. We implemented the T2-statistic into an R package T2GA, which is available at https://github.com/roqe/T2GA.
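
    The test statistic itself is a Hotelling-style T² in which the sample covariance is replaced by a knowledge-based one. A minimal sketch follows, assuming the interaction confidence scores have already been mapped into [0, 1] and treating them as a correlation proxy; the paper's exact construction, and its T2GA implementation in R, may differ.

```python
import numpy as np

def knowledge_t2(log_ratios, conf):
    """Self-contained pathway test statistic T2 = n * xbar' S^-1 xbar,
    with the protein-protein covariance S built from interaction
    confidence scores (e.g. STRING) instead of the noisy sample
    covariance.

    log_ratios : (n_samples, p) protein log expression ratios
    conf       : (p, p) symmetric confidence scores in [0, 1]
    """
    n, p = log_ratios.shape
    xbar = log_ratios.mean(axis=0)
    sd = log_ratios.std(axis=0, ddof=1)
    R = np.asarray(conf, float).copy()
    np.fill_diagonal(R, 1.0)          # unit diagonal for the proxy correlation
    S = R * np.outer(sd, sd)          # scale to a covariance matrix
    return n * xbar @ np.linalg.solve(S, xbar)

# With few samples, significance is best assessed by permuting sample
# labels and recomputing the statistic rather than by an F approximation.
```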

  1. Assessment of the ion-trap mass spectrometer for routine qualitative and quantitative analysis of drugs of abuse extracted from urine.

    PubMed

    Vorce, S P; Sklerov, J H; Kalasinsky, K S

    2000-10-01

    The ion-trap mass spectrometer (MS) has been available as a detector for gas chromatography (GC) for nearly two decades. However, it still occupies a minor role in forensic toxicology drug-testing laboratories. Quadrupole MS instruments make up the majority of GC detectors used in drug confirmation. This work addresses the use of these two MS detectors, comparing the ion ratio precision and quantitative accuracy for the analysis of different classes of abused drugs extracted from urine. Urine specimens were prepared at five concentrations each for amphetamine (AMP), methamphetamine (METH), benzoylecgonine (BZE), delta9-carboxy-tetrahydrocannabinol (delta9-THCCOOH), phencyclidine (PCP), morphine (MOR), codeine (COD), and 6-acetylmorphine (6-AM). Concentration ranges for AMP, METH, BZE, delta9-THCCOOH, PCP, MOR, COD, and 6-AM were 50-2500, 50-5000, 15-800, 1.5-65, 1-250, 500-32000, 250-21000, and 1.5-118 ng/mL, respectively. Sample extracts were injected into a GC-quadrupole MS operating in selected ion monitoring (SIM) mode and a GC-ion-trap MS operating in either selected ion storage (SIS) or full scan (FS) mode. Precision was assessed by the evaluation of five ion ratios for n = 15 injections at each concentration using a single-point calibration. Precision measurements for SIM ion ratios provided coefficients of variation (CV) between 2.6 and 9.8% for all drugs. By comparison, the SIS and FS data yielded CV ranges of 4.0-12.8% and 4.0-11.2%, respectively. The total ion ratio failure rates were 0.2% (SIM), 0.7% (SIS), and 1.2% (FS) for the eight drugs analyzed. Overall, the SIS mode produced stable, comparable mean ratios over the concentration ranges examined, but had greater variance within batch runs. Examination of postmortem and quality-control samples produced forensically accurate quantitation by SIS when compared to SIM. Furthermore, sensitivity of FS was equivalent to SIM for all compounds examined except for 6-AM.
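
    The precision figures above come down to simple statistics on qualifier/quantifier ion ratios. A hedged sketch follows, assuming a ±20% acceptance window around the target ratio; acceptance criteria are laboratory-specific.

```python
import numpy as np

def ion_ratio_stats(qualifier, quantifier, target_ratio, window=0.20):
    """Per-injection qualifier/quantifier ion ratios, their coefficient
    of variation, and the percentage falling outside the acceptance
    window (here ±20% of the target, a common forensic criterion)."""
    r = np.asarray(qualifier, float) / np.asarray(quantifier, float)
    cv = 100.0 * r.std(ddof=1) / r.mean()
    fail = 100.0 * np.mean(np.abs(r - target_ratio) > window * target_ratio)
    return cv, fail
```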

  2. Quantitative Phase Imaging in a Volume Holographic Microscope

    NASA Astrophysics Data System (ADS)

    Waller, Laura; Luo, Yuan; Barbastathis, George

    2010-04-01

    We demonstrate a method for quantitative phase imaging in a Volume Holographic Microscope (VHM) from a single exposure, describe the properties of the system and show experimental results. The VHM system uses a multiplexed volume hologram (VH) to laterally separate images from different focal planes. This 3D intensity information is then used to solve the transport-of-intensity equation (TIE) and recover phase quantitatively. We discuss the modifications to the technique that were made in order to give accurate results.
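
    For a pure phase object with nearly uniform in-focus intensity, the TIE reduces to a Poisson equation that can be inverted with FFTs. The sketch below shows that standard Fourier-space solver, not the authors' implementation; the uniform-intensity assumption and the zero-frequency regularizer are simplifications.

```python
import numpy as np

def tie_phase(i_minus, i_plus, dz, wavelength, pixel, i0=None, eps=1e-9):
    """Phase recovery from two defocused intensity images (here, the two
    laterally separated VHM focal planes), via the transport-of-intensity
    equation under a uniform in-focus intensity i0.

    dz         : defocus distance between each image and focus (same units
                 as wavelength and pixel)
    eps        : regularizes the inverse Laplacian at zero frequency
    """
    dIdz = (i_plus - i_minus) / (2.0 * dz)      # axial intensity derivative
    k = 2.0 * np.pi / wavelength
    if i0 is None:
        i0 = 0.5 * (i_plus + i_minus).mean()
    fy = np.fft.fftfreq(dIdz.shape[0], d=pixel)
    fx = np.fft.fftfreq(dIdz.shape[1], d=pixel)
    lap = 4.0 * np.pi**2 * (fx[None, :]**2 + fy[:, None]**2)
    # Solve laplacian(phi) = -(k / i0) * dI/dz in Fourier space
    phi_hat = (k / i0) * np.fft.fft2(dIdz) / (lap + eps)
    return np.real(np.fft.ifft2(phi_hat))
```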

  3. eulerAPE: Drawing Area-Proportional 3-Venn Diagrams Using Ellipses

    PubMed Central

    Micallef, Luana; Rodgers, Peter

    2014-01-01

    Venn diagrams with three curves are used extensively in various medical and scientific disciplines to visualize relationships between data sets and facilitate data analysis. The area of the regions formed by the overlapping curves is often directly proportional to the cardinality of the depicted set relation or any other related quantitative data. Drawing these diagrams manually is difficult and current automatic drawing methods do not always produce appropriate diagrams. Most methods depict the data sets as circles, as they perceptually pop out as complete distinct objects due to their smoothness and regularity. However, circles cannot draw accurate diagrams for most 3-set data and so the generated diagrams often have misleading region areas. Other methods use polygons to draw accurate diagrams. However, polygons are non-smooth and non-symmetric, so the curves are not easily distinguishable and the diagrams are difficult to comprehend. Ellipses are more flexible than circles and are similarly smooth, but none of the current automatic drawing methods use ellipses. We present eulerAPE as the first method and software that uses ellipses for automatically drawing accurate area-proportional Venn diagrams for 3-set data. We describe the drawing method adopted by eulerAPE and we discuss our evaluation of the effectiveness of eulerAPE and ellipses for drawing random 3-set data. We compare eulerAPE and various other methods that are currently available and we discuss differences between their generated diagrams in terms of accuracy and ease of understanding for real world data. PMID:25032825

  4. eulerAPE: drawing area-proportional 3-Venn diagrams using ellipses.

    PubMed

    Micallef, Luana; Rodgers, Peter

    2014-01-01

    Venn diagrams with three curves are used extensively in various medical and scientific disciplines to visualize relationships between data sets and facilitate data analysis. The area of the regions formed by the overlapping curves is often directly proportional to the cardinality of the depicted set relation or any other related quantitative data. Drawing these diagrams manually is difficult and current automatic drawing methods do not always produce appropriate diagrams. Most methods depict the data sets as circles, as they perceptually pop out as complete distinct objects due to their smoothness and regularity. However, circles cannot draw accurate diagrams for most 3-set data and so the generated diagrams often have misleading region areas. Other methods use polygons to draw accurate diagrams. However, polygons are non-smooth and non-symmetric, so the curves are not easily distinguishable and the diagrams are difficult to comprehend. Ellipses are more flexible than circles and are similarly smooth, but none of the current automatic drawing methods use ellipses. We present eulerAPE as the first method and software that uses ellipses for automatically drawing accurate area-proportional Venn diagrams for 3-set data. We describe the drawing method adopted by eulerAPE and we discuss our evaluation of the effectiveness of eulerAPE and ellipses for drawing random 3-set data. We compare eulerAPE and various other methods that are currently available and we discuss differences between their generated diagrams in terms of accuracy and ease of understanding for real world data.
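
    Checking how closely a candidate diagram matches the required region areas is itself a useful exercise. A simple Monte Carlo estimate of the seven region areas of three (possibly rotated) ellipses is sketched below; this illustrates the area computation only, not eulerAPE's optimization loop.

```python
import numpy as np

def in_ellipse(pts, cx, cy, a, b, theta):
    """Boolean mask: which sample points fall inside a rotated ellipse."""
    dx, dy = pts[:, 0] - cx, pts[:, 1] - cy
    ct, st = np.cos(theta), np.sin(theta)
    u, v = dx*ct + dy*st, -dx*st + dy*ct   # rotate into ellipse axes
    return (u / a)**2 + (v / b)**2 <= 1.0

def venn_region_areas(ellipses, lo=-10.0, hi=10.0, n=200_000, seed=0):
    """Monte Carlo areas of the 7 regions of a 3-ellipse diagram, keyed
    by membership pattern '100'..'111'. Each ellipse is a tuple
    (cx, cy, a, b, theta) lying inside the sampling box [lo, hi]^2."""
    rng = np.random.default_rng(seed)
    pts = rng.uniform(lo, hi, size=(n, 2))
    masks = [in_ellipse(pts, *e) for e in ellipses]
    box_area = (hi - lo)**2
    areas = {}
    for code in range(1, 8):
        bits = [(code >> (2 - i)) & 1 for i in range(3)]
        sel = np.ones(n, dtype=bool)
        for mask, bit in zip(masks, bits):
            sel &= mask if bit else ~mask
        areas["".join(map(str, bits))] = box_area * sel.mean()
    return areas
```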

  5. Multi-modal molecular diffuse optical tomography system for small animal imaging

    PubMed Central

    Guggenheim, James A.; Basevi, Hector R. A.; Frampton, Jon; Styles, Iain B.; Dehghani, Hamid

    2013-01-01

    A multi-modal optical imaging system for quantitative 3D bioluminescence and functional diffuse imaging is presented, which has no moving parts and uses mirrors to provide multi-view tomographic data for image reconstruction. It is demonstrated that through the use of trans-illuminated spectral near infrared measurements and spectrally constrained tomographic reconstruction, recovered concentrations of absorbing agents can be used as prior knowledge for bioluminescence imaging within the visible spectrum. Additionally, the first use of a recently developed multi-view optical surface capture technique is shown and its application to model-based image reconstruction and free-space light modelling is demonstrated. The benefits of model-based tomographic image recovery as compared to 2D planar imaging are highlighted in a number of scenarios where the internal luminescence source is not visible or is confounding in 2D images. The results presented show that the luminescence tomographic imaging method produces 3D reconstructions of individual light sources within a mouse-sized solid phantom that are accurately localised to within 1.5mm for a range of target locations and depths indicating sensitivity and accurate imaging throughout the phantom volume. Additionally the total reconstructed luminescence source intensity is consistent to within 15% which is a dramatic improvement upon standard bioluminescence imaging. Finally, results from a heterogeneous phantom with an absorbing anomaly are presented demonstrating the use and benefits of a multi-view, spectrally constrained coupled imaging system that provides accurate 3D luminescence images. PMID:24954977

  6. A Machine Learned Classifier That Uses Gene Expression Data to Accurately Predict Estrogen Receptor Status

    PubMed Central

    Bastani, Meysam; Vos, Larissa; Asgarian, Nasimeh; Deschenes, Jean; Graham, Kathryn; Mackey, John; Greiner, Russell

    2013-01-01

    Background Selecting the appropriate treatment for breast cancer requires accurately determining the estrogen receptor (ER) status of the tumor. However, the standard for determining this status, immunohistochemical analysis of formalin-fixed paraffin embedded samples, suffers from numerous technical and reproducibility issues. Assessment of ER-status based on RNA expression can provide more objective, quantitative and reproducible test results. Methods To learn a parsimonious RNA-based classifier of hormone receptor status, we applied a machine learning tool to a training dataset of gene expression microarray data obtained from 176 frozen breast tumors, whose ER-status was determined by applying ASCO-CAP guidelines to standardized immunohistochemical testing of formalin fixed tumor. Results This produced a three-gene classifier that can predict the ER-status of a novel tumor, with a cross-validation accuracy of 93.17±2.44%. When applied to an independent validation set and to four other public databases, some on different platforms, this classifier obtained over 90% accuracy in each. In addition, we found that this prediction rule separated the patients' recurrence-free survival curves with a hazard ratio lower than the one based on the IHC analysis of ER-status. Conclusions Our efficient and parsimonious classifier lends itself to high throughput, highly accurate and low-cost RNA-based assessments of ER-status, suitable for routine high-throughput clinical use. This analytic method provides a proof-of-principle that may be applicable to developing effective RNA-based tests for other biomarkers and conditions. PMID:24312637
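
    The abstract does not name the learning algorithm, so the sketch below only illustrates the overall protocol, a cross-validated parsimonious classifier over three gene-expression features, using logistic regression as a stand-in learner and synthetic data in place of the microarray measurements:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        # X: expression of three selected genes (n_samples x 3); y: ER status
        # from standardized IHC. Both are synthetic stand-ins here.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(176, 3))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=176) > 0).astype(int)

        clf = LogisticRegression()
        scores = cross_val_score(clf, X, y, cv=10)   # 10-fold cross-validation
        print(f"CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")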

  7. QSPR models for half-wave reduction potential of steroids: a comparative study between feature selection and feature extraction from subsets of or entire set of descriptors.

    PubMed

    Hemmateenejad, Bahram; Yazdani, Mahdieh

    2009-02-16

    Steroids are widely distributed in nature and are found in plants, animals, and fungi in abundance. A data set consisting of a diverse set of steroids has been used to develop quantitative structure-electrochemistry relationship (QSER) models for their half-wave reduction potential. Modeling was established by means of multiple linear regression (MLR) and principal component regression (PCR) analyses. In MLR analysis, the QSPR models were constructed by first grouping descriptors and then stepwise selection of variables from each group (MLR1), or by stepwise selection of predictor variables from the pool of all calculated descriptors (MLR2). A similar procedure was used in PCR analysis, so that the principal components (or features) were extracted from different groups of descriptors (PCR1) or from the entire set of descriptors (PCR2). The resulting models were evaluated using cross-validation, chance correlation, prediction of the reduction potential of test samples, and assessment of the applicability domain. Both MLR approaches produced accurate results; however, the QSPR model found by MLR1 was statistically more significant. The PCR1 approach produced a model as accurate as the MLR approaches, whereas less accurate results were obtained with the PCR2 approach. Overall, the correlation coefficients of cross-validation and prediction of the QSPR models resulting from the MLR1, MLR2, and PCR1 approaches were higher than 90%, which shows the high ability of the models to predict the reduction potential of the studied steroids.

  8. Accurate pointing of tungsten welding electrodes

    NASA Technical Reports Server (NTRS)

    Ziegelmeier, P.

    1971-01-01

    Thoriated-tungsten is pointed accurately and quickly by using sodium nitrite. Point produced is smooth and no effort is necessary to hold the tungsten rod concentric. The chemically produced point can be used several times longer than ground points. This method reduces time and cost of preparing tungsten electrodes.

  9. Selection of Suitable Internal Control Genes for Accurate Normalization of Real-Time Quantitative PCR Data of Buffalo (Bubalus bubalis) Blastocysts Produced by SCNT and IVF.

    PubMed

    Sood, Tanushri Jerath; Lagah, Swati Viviyan; Sharma, Ankita; Singla, Suresh Kumar; Mukesh, Manishi; Chauhan, Manmohan Singh; Manik, Radheysham; Palta, Prabhat

    2017-10-01

    We evaluated the suitability of 10 candidate internal control genes (ICGs), belonging to different functional classes, namely ACTB, EEF1A1, GAPDH, HPRT1, HMBS, RPS15, RPS18, RPS23, SDHA, and UBC for normalizing the real-time quantitative polymerase chain reaction (qPCR) data of blastocyst-stage buffalo embryos produced by hand-made cloning and in vitro fertilization (IVF). Total RNA was isolated from three pools, each of cloned and IVF blastocysts (n = 50/pool) for cDNA synthesis. Two different statistical algorithms geNorm and NormFinder were used for evaluating the stability of these genes. Based on gene stability measure (M value) and pairwise variation (V value), calculated by geNorm analysis, the most stable ICGs were RPS15, HPRT1, and ACTB for cloned blastocysts, HMBS, UBC, and HPRT1 for IVF blastocysts and RPS15, GAPDH, and HPRT1 for both the embryo types analyzed together. RPS18 was the least stable gene for both cloned and IVF blastocysts. Following NormFinder analysis, the order of stability was RPS15 = HPRT1>GAPDH for cloned blastocysts, HMBS = UBC>RPS23 for IVF blastocysts, and HPRT1>GAPDH>RPS15 for cloned and IVF blastocysts together. These results suggest that despite overlapping of the three most stable ICGs between cloned and IVF blastocysts, the panel of ICGs selected for normalization of qPCR data of cloned and IVF blastocyst-stage embryos should be different.
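
    The geNorm stability measure M referred to here is well documented: for each candidate gene, M is the average, across all other candidates, of the standard deviation of the pairwise log2 expression ratio over samples; the least stable gene (highest M) is excluded iteratively. A minimal sketch (expr is a hypothetical samples-by-genes matrix of relative expression values):

        import numpy as np

        def genorm_m(expr):
            """geNorm gene-stability measure M for an (n_samples, n_genes)
            matrix of relative expression; lower M means more stable."""
            log_expr = np.log2(expr)
            n_genes = log_expr.shape[1]
            m = np.zeros(n_genes)
            for j in range(n_genes):
                sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
                       for k in range(n_genes) if k != j]
                m[j] = np.mean(sds)
            return m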

  10. Non-Contact Thrust Stand Calibration Method for Repetitively-Pulsed Electric Thrusters

    NASA Technical Reports Server (NTRS)

    Wong, Andrea R.; Toftul, Alexandra; Polzin, Kurt A.; Pearson, J. Boise

    2011-01-01

    A thrust stand calibration technique for use in testing repetitively-pulsed electric thrusters for in-space propulsion has been developed and tested using a modified hanging pendulum thrust stand. In the implementation of this technique, current pulses are applied to a solenoidal coil to produce a pulsed magnetic field that acts against the magnetic field produced by a permanent magnet mounted to the thrust stand pendulum arm. The force on the magnet is applied in this non-contact manner, with the entire pulsed force transferred to the pendulum arm through a piezoelectric force transducer to provide a time-accurate force measurement. Modeling of the pendulum arm dynamics reveals that after an initial transient in thrust stand motion the quasi-steady average deflection of the thrust stand arm away from the unforced or zero position can be related to the average applied force through a simple linear Hooke's law relationship. Modeling demonstrates that this technique is universally applicable except when the pulsing period is increased to the point where it approaches the period of natural thrust stand motion. Calibration data were obtained using a modified hanging pendulum thrust stand previously used for steady-state thrust measurements. Data were obtained for varying impulse bit at constant pulse frequency and for varying pulse frequency. The two data sets exhibit excellent quantitative agreement with each other, as the constant relating average deflection and average thrust match within the errors on the linear regression curve fit of the data. Quantitatively, the error on the calibration coefficient is roughly 1% of the coefficient value.
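
    The calibration itself reduces to a linear Hooke's-law fit of average applied force against quasi-steady average deflection, with the regression error giving the uncertainty on the calibration coefficient. A minimal sketch with hypothetical data (units and values illustrative only):

        import numpy as np

        # hypothetical calibration points: average force vs average deflection
        force = np.array([50.0, 100.0, 150.0, 200.0, 250.0])       # uN
        deflection = np.array([10.2, 19.8, 30.5, 40.1, 50.3])      # um

        # F_avg = k_eff * x_avg + b; fit and report the slope uncertainty
        (k_eff, b), cov = np.polyfit(deflection, force, 1, cov=True)
        k_err = np.sqrt(cov[0, 0])
        print(f"calibration coefficient: {k_eff:.3f} +/- {k_err:.3f} uN/um")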

  11. Tomosynthesis can facilitate accurate measurement of joint space width under the condition of the oblique incidence of X-rays in patients with rheumatoid arthritis.

    PubMed

    Ono, Yohei; Kashihara, Rina; Yasojima, Nobutoshi; Kasahara, Hideki; Shimizu, Yuka; Tamura, Kenichi; Tsutsumi, Kaori; Sutherland, Kenneth; Koike, Takao; Kamishima, Tamotsu

    2016-06-01

    Accurate evaluation of joint space width (JSW) is important in the assessment of rheumatoid arthritis (RA). In clinical radiography of bilateral hands, the oblique incidence of X-rays is unavoidable, which may cause perceptional or measurement error of JSW. The objective of this study was to examine whether tomosynthesis, a recently developed modality, can facilitate a more accurate evaluation of JSW than radiography under the condition of oblique incidence of X-rays. We investigated quantitative errors derived from the oblique incidence of X-rays by imaging phantoms simulating various finger joint spaces using radiographs and tomosynthesis images. We then compared the qualitative results of the modified total Sharp score of a total of 320 joints from 20 patients with RA between these modalities. A quantitative error was prominent when the location of the phantom was shifted along the JSW direction. Modified total Sharp scores of tomosynthesis images were significantly higher than those of radiography; that is, JSW was judged narrower in tomosynthesis than in radiography when finger joints were located where oblique incidence of X-rays in the JSW direction is expected. Tomosynthesis can facilitate accurate evaluation of JSW in finger joints of patients with RA, even with oblique incidence of X-rays. Accurate evaluation of JSW is necessary for the management of patients with RA. Through phantom and clinical studies, we demonstrate that tomosynthesis may achieve more accurate evaluation of JSW.

  12. Quantitative Phase Microscopy for Accurate Characterization of Microlens Arrays

    NASA Astrophysics Data System (ADS)

    Grilli, Simonetta; Miccio, Lisa; Merola, Francesco; Finizio, Andrea; Paturzo, Melania; Coppola, Sara; Vespini, Veronica; Ferraro, Pietro

    Microlens arrays are of fundamental importance in a wide variety of applications in optics and photonics. This chapter deals with an accurate digital holography-based characterization of both liquid and polymeric microlenses fabricated by an innovative pyro-electrowetting process. The actuation of liquid and polymeric films is obtained through the use of pyroelectric charges generated into polar dielectric lithium niobate crystals.

  13. Automated selected reaction monitoring software for accurate label-free protein quantification.

    PubMed

    Teleman, Johan; Karlsson, Christofer; Waldemarson, Sofia; Hansson, Karin; James, Peter; Malmström, Johan; Levander, Fredrik

    2012-07-06

    Selected reaction monitoring (SRM) is a mass spectrometry method with documented ability to quantify proteins accurately and reproducibly using labeled reference peptides. However, the use of labeled reference peptides becomes impractical if large numbers of peptides are targeted and when high flexibility is desired when selecting peptides. We have developed a label-free quantitative SRM workflow that relies on a new automated algorithm, Anubis, for accurate peak detection. Anubis efficiently removes interfering signals from contaminating peptides to estimate the true signal of the targeted peptides. We evaluated the algorithm on a published multisite data set and achieved results in line with manual data analysis. In complex peptide mixtures from whole proteome digests of Streptococcus pyogenes we achieved a technical variability across the entire proteome abundance range of 6.5-19.2%, which was considerably below the total variation across biological samples. Our results show that the label-free SRM workflow with automated data analysis is feasible for large-scale biological studies, opening up new possibilities for quantitative proteomics and systems biology.

  14. Magnetic Resonance Imaging of Intracranial Hypotension: Diagnostic Value of Combined Qualitative Signs and Quantitative Metrics.

    PubMed

    Aslan, Kerim; Gunbey, Hediye Pinar; Tomak, Leman; Ozmen, Zafer; Incesu, Lutfi

    The aim of this study was to investigate whether combining quantitative metrics (mamillopontine distance [MPD], pontomesencephalic angle, and mesencephalon anterior-posterior/medial-lateral diameter ratios) with qualitative signs (dural enhancement, subdural collections/hematoma, venous engorgement, pituitary gland enlargement, and tonsillar herniation) provides a more accurate diagnosis of intracranial hypotension (IH). The quantitative metrics and qualitative signs of 34 patients and 34 control subjects were assessed by 2 independent observers. Receiver operating characteristic (ROC) curves were used to evaluate the diagnostic performance of quantitative metrics and qualitative signs, and optimum cutoff values of the quantitative metrics for the diagnosis of IH were found with ROC analysis. Combined ROC curves were computed for combinations of quantitative metrics and qualitative signs; diagnostic accuracy, sensitivity, specificity, and positive and negative predictive values were determined, and the best model combination was identified. Whereas MPD and pontomesencephalic angle were significantly lower in patients with IH when compared with the control group (P < 0.001), the mesencephalon anterior-posterior/medial-lateral diameter ratio was significantly higher (P < 0.001). For qualitative signs, the highest individual distinctive power was dural enhancement, with an area under the ROC curve (AUC) of 0.838. For quantitative metrics, the highest individual distinctive power was MPD, with an AUC of 0.947. The best accuracy in the diagnosis of IH was obtained by the combination of dural enhancement, venous engorgement, and MPD, with an AUC of 1.00. This study showed that the combined use of dural enhancement, venous engorgement, and MPD had a diagnostic accuracy of 100% for the diagnosis of IH. Therefore, a more accurate IH diagnosis can be provided by combining quantitative metrics with qualitative signs.
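
    Combining several predictors into one ROC curve is typically done by fitting a model that maps them to a single score; the abstract does not state which combiner was used, so the sketch below uses logistic regression on synthetic data shaped like the study (34 patients, 34 controls) purely to illustrate the workflow:

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        # hypothetical data: MPD in mm, two binary qualitative signs
        rng = np.random.default_rng(1)
        y = np.r_[np.ones(34), np.zeros(34)].astype(int)       # 1 = IH patient
        mpd = np.r_[rng.normal(4.5, 1.0, 34), rng.normal(7.0, 1.0, 34)]
        dural = np.r_[rng.binomial(1, 0.8, 34), rng.binomial(1, 0.1, 34)]
        venous = np.r_[rng.binomial(1, 0.7, 34), rng.binomial(1, 0.1, 34)]
        X = np.c_[mpd, dural, venous]

        # map the three predictors to one score, then measure the combined AUC
        score = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
        print(f"combined AUC: {roc_auc_score(y, score):.3f}")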

  15. Design and assessment of a novel SPECT system for desktop open-gantry imaging of small animals: A simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zeraatkar, Navid; Farahani, Mohammad Hossein; Rahmim, Arman

    Purpose: Given increasing efforts in biomedical research utilizing molecular imaging methods, development of dedicated high-performance small-animal SPECT systems has been growing rapidly in the last decade. In the present work, we propose and assess an alternative concept for SPECT imaging enabling desktop open-gantry imaging of small animals. Methods: The system, PERSPECT, consists of an imaging desk, with a set of tilted detector and pinhole collimator placed beneath it. The object to be imaged is simply placed on the desk. Monte Carlo (MC) and analytical simulations were utilized to accurately model and evaluate the proposed concept and design. Furthermore, a dedicated image reconstruction algorithm, finite-aperture-based circular projections (FABCP), was developed and validated for the system, enabling more accurate modeling of the system and higher quality reconstructed images. Image quality was quantified as a function of different tilt angles in the acquisition and number of iterations in the reconstruction algorithm. Furthermore, more complex phantoms including Derenzo, Defrise, and mouse whole body were simulated and studied. Results: The sensitivity of the PERSPECT was 207 cps/MBq. It was quantitatively demonstrated that for a tilt angle of 30°, comparable image qualities were obtained in terms of normalized squared error, contrast, uniformity, noise, and spatial resolution measurements, the latter at ∼0.6 mm. Furthermore, quantitative analyses demonstrated that 3 iterations of FABCP image reconstruction (16 subsets/iteration) led to optimally reconstructed images. Conclusions: The PERSPECT, using a novel imaging protocol, can achieve comparable image quality performance in comparison with a conventional pinhole SPECT with the same configuration. The dedicated FABCP algorithm, which was developed for reconstruction of data from the PERSPECT system, can produce high quality images for small-animal imaging via accurate modeling of the system as incorporated in the forward- and back-projection steps. Meanwhile, the developed MC model and the analytical simulator of the system can be applied for further studies on development and evaluation of the system.

  16. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis

    PubMed Central

    Te, Shu Harn; Chen, Enid Yingru

    2015-01-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques—qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. PMID:26025892
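
    The linearity and quantitative-correlation checks on the standards reduce to regressing Cq against log10 copy number: the slope gives amplification efficiency (perfect doubling per cycle corresponds to a slope of about -3.32), and R^2 summarizes linearity. A minimal sketch with hypothetical dilution-series values:

        import numpy as np

        # hypothetical 10-fold dilution series: copies per reaction vs Cq
        copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
        cq = np.array([15.1, 18.5, 21.9, 25.3, 28.8])

        slope, intercept = np.polyfit(np.log10(copies), cq, 1)
        efficiency = 10 ** (-1 / slope) - 1
        r = np.corrcoef(np.log10(copies), cq)[0, 1]
        print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, R^2={r**2:.4f}")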

  17. Evaluation of laser diode thermal desorption-tandem mass spectrometry (LDTD-MS-MS) in forensic toxicology.

    PubMed

    Bynum, Nichole D; Moore, Katherine N; Grabenauer, Megan

    2014-10-01

    Many forensic laboratories experience backlogs due to increased drug-related cases. Laser diode thermal desorption (LDTD) has demonstrated its applicability in other scientific areas by providing data comparable with instrumentation such as liquid chromatography-tandem mass spectrometry, in less time. LDTD-MS-MS was used to validate 48 compounds in drug-free human urine and blood for screening or quantitative analysis. Carryover, interference, limit of detection, limit of quantitation, matrix effect, linearity, precision and accuracy, and stability were evaluated. Quantitative analysis indicated that LDTD-MS-MS produced precise and accurate results, with the average overall within-run precision in urine and blood represented by a %CV <14.0 and <7.0, respectively. The accuracy for all drugs ranged from 88.9 to 104.5% in urine and 91.9 to 107.1% in blood. Overall, LDTD has the potential for use in forensic toxicology, but before it can be successfully implemented there are some challenges that must be addressed. Although the advantages of the LDTD system include minimal maintenance and rapid analysis (∼10 s per sample), which make it ideal for high-throughput forensic laboratories, a major disadvantage is its inability or difficulty analyzing isomers and isobars, due to the lack of chromatography, without the use of high-resolution MS; therefore, it would be best implemented as a screening technique. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  18. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labuda, Aleksander; Proksch, Roger

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity—a longstanding goal in the electromechanical community.

  19. Shot noise-limited Cramér-Rao bound and algorithmic sensitivity for wavelength shifting interferometry

    NASA Astrophysics Data System (ADS)

    Chen, Shichao; Zhu, Yizheng

    2017-02-01

    Sensitivity is a critical index measuring the temporal fluctuation of the retrieved optical pathlength in a quantitative phase imaging system. However, an accurate and comprehensive analysis for sensitivity evaluation is still lacking in the current literature. In particular, previous theoretical studies of fundamental sensitivity based on Gaussian noise models are not applicable to modern cameras and detectors, which are dominated by shot noise. In this paper, we derive two shot noise-limited theoretical sensitivities, the Cramér-Rao bound and the algorithmic sensitivity, for wavelength shifting interferometry, which is a major category of on-axis interferometry techniques in quantitative phase imaging. Based on the derivations, we show that the shot noise-limited model permits accurate estimation of theoretical sensitivities directly from measured data. These results can provide important insights into fundamental constraints on system performance and can be used to guide system design and optimization. The same concepts can be generalized to other quantitative phase imaging techniques as well.
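
    For a Poisson (shot-noise) detection model, the Cramér-Rao bound follows from the Fisher information of the fringe signal. The form below is the commonly quoted approximation for N equally spaced phase shifts, stated here under assumed notation (n_0 mean photons per frame, V fringe visibility); it is a textbook result, not necessarily the exact expression derived in this paper:

        F(\phi) = \sum_{k=1}^{N} \frac{1}{\mu_k}\left(\frac{\partial \mu_k}{\partial \phi}\right)^{2},
        \qquad \mu_k = n_0\left[1 + V\cos(\phi + \delta_k)\right],

        \sigma_\phi \ge F(\phi)^{-1/2} \approx \sqrt{\frac{2}{N\, n_0\, V^{2}}}.

    The bound shrinks as the square root of the total detected photons, which is the sense in which the sensitivity is shot-noise limited.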

  20. Separation and quantitation of polyethylene glycols 400 and 3350 from human urine by high-performance liquid chromatography.

    PubMed

    Ryan, C M; Yarmush, M L; Tompkins, R G

    1992-04-01

    Polyethylene glycol 3350 (PEG 3350) is useful as an orally administered probe to measure in vivo intestinal permeability to macromolecules. Previous methods to detect polyethylene glycol (PEG) excreted in the urine have been hampered by inherent inaccuracies associated with liquid-liquid extraction and turbidimetric analysis. For accurate quantitation by previous methods, radioactive labels were required. This paper describes a method to separate and quantitate PEG 3350 and PEG 400 in human urine that is independent of radioactive labels and is accurate in clinical practice. The method uses sized regenerated cellulose membranes and mixed ion-exchange resin for sample preparation and high-performance liquid chromatography with refractive index detection for analysis. The 24-h excretion for normal individuals after an oral dose of 40 g of PEG 3350 and 5 g of PEG 400 was 0.12 +/- 0.04% of the original dose of PEG 3350 and 26.3 +/- 5.1% of the original dose of PEG 400.

  1. A novel LCMSMS method for quantitative measurement of short-chain fatty acids in human stool derivatized with 12C- and 13C-labelled aniline.

    PubMed

    Chan, James Chun Yip; Kioh, Dorinda Yan Qin; Yap, Gaik Chin; Lee, Bee Wah; Chan, Eric Chun Yong

    2017-05-10

    A novel liquid chromatography tandem mass spectrometry (LCMSMS) method for the quantitative measurement of gut microbial-derived short-chain fatty acids (SCFAs) in human infant stool has been developed and validated. Baseline chromatographic resolution was achieved for 12 SCFAs (acetic, butyric, caproic, 2,2-dimethylbutyric, 2-ethylbutyric, isobutyric, isovaleric, 2-methylbutyric, 4-methylvaleric, propionic, pivalic and valeric acids) within an analysis time of 15 min. A novel sequential derivatization of endogenous and spiked SCFAs in stool via 12C- and 13C-aniline, respectively, facilitated the accurate quantitation of 12C-aniline-derivatized endogenous SCFAs based on calibration of exogenously 13C-derivatized SCFAs. Optimized quenching of derivatization agents prior to LCMSMS analysis further reduced to negligible levels the confounding chromatographic peak due to in-line derivatization of unquenched aniline with residual acetic acid present within the LCMS system. The effect of residual acetic acid, a common LCMS modifier, on the analysis of SCFAs had not been addressed in previous SCFA assays. For the first time, a total of 9 SCFAs (acetic, butyric, caproic, isobutyric, isovaleric, 2-methylbutyric, 4-methylvaleric, propionic and valeric acids) were detected and quantitated in 107 healthy infant stool samples. The abundance and diversity of SCFAs in infant stool vary temporally from 3 weeks onwards and stabilize towards the end of 12 months. This in turn reflects the maturation of the infant SCFA-producing gut microbiota community. In summary, this novel method is applicable to future studies that investigate the biological roles of SCFAs in paediatric health and diseases. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Quantitative contrast-enhanced mammography for contrast medium kinetics studies

    NASA Astrophysics Data System (ADS)

    Arvanitis, C. D.; Speller, R.

    2009-10-01

    Quantitative contrast-enhanced mammography, based on a dual-energy approach, aims to extract quantitative and temporal information of the tumour enhancement after administration of iodinated vascular contrast media. Simulations using analytical expressions and optimization of critical parameters essential for the development of quantitative contrast-enhanced mammography are presented. The procedure has been experimentally evaluated using a tissue-equivalent phantom and an amorphous silicon active matrix flat panel imager. The x-ray beams were produced by a tungsten target tube and spectrally shaped using readily available materials. Measurement of iodine projected thickness in mg cm-2 has been performed. The effect of beam hardening does not introduce nonlinearities in the measurement of iodine projected thickness for values of thicknesses found in clinical investigations. However, scattered radiation introduces significant deviations from slope equal to unity when compared with the actual iodine projected thickness. Scatter correction before the analysis of the dual-energy images provides accurate iodine projected thickness measurements. At 10% of the exposure used in clinical mammography, signal-to-noise ratios in excess of 5 were achieved for iodine projected thicknesses less than 3 mg cm-2 within a 4 cm thick phantom. For the extraction of temporal information, a limited number of low-dose images were used with the phantom incorporating a flow of iodinated contrast medium. The results suggest that spatial and temporal information of iodinated contrast media can be used to indirectly measure the tumour microvessel density and determine its uptake and washout from breast tumours. The proposed method can significantly improve tumour detection in dense breasts. Its application to perform in situ x-ray biopsy and assessment of the oncolytic effect of anticancer agents is foreseeable.
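
    The dual-energy decomposition amounts to solving a 2x2 linear system: each beam's log-attenuation is modelled as a sum of tissue and iodine contributions weighted by energy-dependent attenuation coefficients. A minimal sketch with hypothetical coefficients chosen to bracket the iodine K-edge (all numbers illustrative, not the paper's calibration):

        import numpy as np

        # hypothetical mass-attenuation coefficients (cm^2/g): rows are the
        # low- and high-energy beams, columns are [tissue, iodine]
        mu = np.array([[0.30, 25.0],
                       [0.25,  9.0]])

        # measured log-attenuations -ln(I/I0) for the two beams
        m = np.array([1.275, 1.027])

        t_tissue, t_iodine = np.linalg.solve(mu, m)  # projected thicknesses, g/cm^2
        print(f"iodine projected thickness: {t_iodine * 1000:.2f} mg/cm^2")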

  3. A quantitative test of population genetics using spatiogenetic patterns in bacterial colonies.

    PubMed

    Korolev, Kirill S; Xavier, João B; Nelson, David R; Foster, Kevin R

    2011-10-01

    It is widely accepted that population-genetics theory is the cornerstone of evolutionary analyses. Empirical tests of the theory, however, are challenging because of the complex relationships between space, dispersal, and evolution. Critically, we lack quantitative validation of the spatial models of population genetics. Here we combine analytics, on- and off-lattice simulations, and experiments with bacteria to perform quantitative tests of the theory. We study two bacterial species, the gut microbe Escherichia coli and the opportunistic pathogen Pseudomonas aeruginosa, and show that spatiogenetic patterns in colony biofilms of both species are accurately described by an extension of the one-dimensional stepping-stone model. We use one empirical measure, genetic diversity at the colony periphery, to parameterize our models and show that we can then accurately predict another key variable: the degree of short-range cell migration along an edge. Moreover, the model allows us to estimate other key parameters, including effective population size (density) at the expansion frontier. While our experimental system is a simplification of a natural microbial community, we argue that it constitutes proof of principle that the spatial models of population genetics can quantitatively capture organismal evolution.
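
    The one-dimensional stepping-stone model the authors extend is straightforward to simulate: demes along the colony edge exchange migrants with their neighbours and undergo Wright-Fisher drift, and local genetic diversity (heterozygosity) decays as monoclonal sectors form. A minimal sketch, with all parameter values illustrative rather than fitted to the bacterial data:

        import numpy as np

        def stepping_stone(n_demes=200, deme_size=30, m=0.1, gens=500, seed=0):
            """1D stepping-stone model for one of two neutral alleles:
            nearest-neighbour migration then binomial (Wright-Fisher)
            drift; returns mean heterozygosity per generation."""
            rng = np.random.default_rng(seed)
            p = np.full(n_demes, 0.5)              # start fully mixed
            het = []
            for _ in range(gens):
                # migration with periodic boundary, for simplicity
                p = (1 - m) * p + (m / 2) * (np.roll(p, 1) + np.roll(p, -1))
                # drift: resample each deme binomially
                p = rng.binomial(deme_size, p) / deme_size
                het.append(np.mean(2 * p * (1 - p)))
            return np.array(het)

        h = stepping_stone()
        print(f"heterozygosity after 500 generations: {h[-1]:.3f}")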

  4. Quantitative real-time reverse transcription polymerase chain reaction: normalization to rRNA or single housekeeping genes is inappropriate for human tissue biopsies.

    PubMed

    Tricarico, Carmela; Pinzani, Pamela; Bianchi, Simonetta; Paglierani, Milena; Distante, Vito; Pazzagli, Mario; Bustin, Stephen A; Orlando, Claudio

    2002-10-15

    Careful normalization is essential when using quantitative reverse transcription polymerase chain reaction assays to compare mRNA levels between biopsies from different individuals or cells undergoing different treatment. Generally, this involves the use of internal controls, such as mRNA specified by a housekeeping gene, ribosomal RNA (rRNA), or accurately quantitated total RNA. The aim of this study was to compare these methods and determine which one can provide the most accurate and biologically relevant quantitative results. Our results show significant variation in the expression levels of 10 commonly used housekeeping genes and 18S rRNA, both between individuals and between biopsies taken from the same patient. Furthermore, in 23 breast cancer samples, mRNA and protein levels of a regulated gene, vascular endothelial growth factor (VEGF), correlated only when normalized to total RNA, as did microvessel density. Finally, mRNA levels of VEGF and the most popular housekeeping gene, glyceraldehyde-3-phosphate dehydrogenase (GAPDH), were significantly correlated in the colon. Our results suggest that the use of internal standards comprising single housekeeping genes or rRNA is inappropriate for studies involving tissue biopsies.

  5. Improved dose-volume histogram estimates for radiopharmaceutical therapy by optimizing quantitative SPECT reconstruction parameters

    NASA Astrophysics Data System (ADS)

    Cheng, Lishui; Hobbs, Robert F.; Segars, Paul W.; Sgouros, George; Frey, Eric C.

    2013-06-01

    In radiopharmaceutical therapy, an understanding of the dose distribution in normal and target tissues is important for optimizing treatment. Three-dimensional (3D) dosimetry takes into account patient anatomy and the nonuniform uptake of radiopharmaceuticals in tissues. Dose-volume histograms (DVHs) provide a useful summary representation of the 3D dose distribution and have been widely used for external beam treatment planning. Reliable 3D dosimetry requires an accurate 3D radioactivity distribution as the input. However, activity distribution estimates from SPECT are corrupted by noise and partial volume effects (PVEs). In this work, we systematically investigated OS-EM based quantitative SPECT (QSPECT) image reconstruction in terms of its effect on DVHs estimates. A modified 3D NURBS-based Cardiac-Torso (NCAT) phantom that incorporated a non-uniform kidney model and clinically realistic organ activities and biokinetics was used. Projections were generated using a Monte Carlo (MC) simulation; noise effects were studied using 50 noise realizations with clinical count levels. Activity images were reconstructed using QSPECT with compensation for attenuation, scatter and collimator-detector response (CDR). Dose rate distributions were estimated by convolution of the activity image with a voxel S kernel. Cumulative DVHs were calculated from the phantom and QSPECT images and compared both qualitatively and quantitatively. We found that noise, PVEs, and ringing artifacts due to CDR compensation all degraded histogram estimates. Low-pass filtering and early termination of the iterative process were needed to reduce the effects of noise and ringing artifacts on DVHs, but resulted in increased degradations due to PVEs. Large objects with few features, such as the liver, had more accurate histogram estimates and required fewer iterations and more smoothing for optimal results. Smaller objects with fine details, such as the kidneys, required more iterations and less smoothing at early time points post-radiopharmaceutical administration but more smoothing and fewer iterations at later time points when the total organ activity was lower. The results of this study demonstrate the importance of using optimal reconstruction and regularization parameters. Optimal results were obtained with different parameters at each time point, but using a single set of parameters for all time points produced near-optimal dose-volume histograms.
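
    The cumulative DVH itself is simple to compute once a dose array and an organ mask exist: for each dose level, record the fraction of organ voxels receiving at least that dose. A minimal sketch on a toy dose distribution (not the paper's QSPECT/Monte Carlo pipeline):

        import numpy as np

        def cumulative_dvh(dose, mask, bins=100):
            """Cumulative dose-volume histogram: fraction of the organ
            volume (mask == True) receiving at least each dose level."""
            d = dose[mask]
            levels = np.linspace(0.0, d.max(), bins)
            volume_frac = np.array([(d >= lv).mean() for lv in levels])
            return levels, volume_frac

        # toy 3D dose array and a spherical "organ" mask
        z, y, x = np.ogrid[-20:20, -20:20, -20:20]
        dose = np.exp(-(x**2 + y**2 + z**2) / 400.0)   # arbitrary dose pattern
        mask = x**2 + y**2 + z**2 < 15**2
        levels, vf = cumulative_dvh(dose, mask)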

  6. Sexing chick mRNA: A protocol based on quantitative real-time polymerase chain reaction.

    PubMed

    Wan, Z; Lu, Y; Rui, L; Yu, X; Li, Z

    2017-03-01

    The accurate identification of sex in birds is important for research on avian sex determination and differentiation. Polymerase chain reaction (PCR)-based methods have been widely applied for the molecular sexing of birds. However, these methods have used genomic DNA. Here, we present the first sexing protocol for chick mRNA based on real-time quantitative PCR. We demonstrate that this method can accurately determine sex using mRNA from chick gonads and other tissues, such as heart, liver, spleen, lung, and muscle. The strategy of this protocol also may be suitable for other species in which sex is determined by the inheritance of sex chromosomes (ZZ male and ZW female). © 2016 Poultry Science Association Inc.

  7. Using multiple PCR and CE with chemiluminescence detection for simultaneous qualitative and quantitative analysis of genetically modified organism.

    PubMed

    Guo, Longhua; Qiu, Bin; Chi, Yuwu; Chen, Guonan

    2008-09-01

    In this paper, an ultrasensitive CE-CL detection system coupled with a novel double-on-column coaxial flow detection interface was developed for the detection of PCR products. A reliable procedure based on this system has been demonstrated for qualitative and quantitative analysis of genetically modified organisms; the detection of Roundup Ready Soy (RRS) samples is presented as an example. The promoter, terminator, functional gene, and two reference genes of RRS were amplified simultaneously with multiplex PCR. After that, the multiplex PCR products were labeled with acridinium ester at the 5'-terminal through an amino modification and then analyzed by the proposed CE-CL system. Reproducibility of analysis times and peak heights for the CE-CL analysis was determined to be better than 0.91 and 3.07% (RSD, n=15), respectively, over three consecutive days. It was shown that this method could accurately and qualitatively detect RRS standards and the simulated samples. The evaluation in terms of quantitative analysis of RRS provided by this new method was confirmed by comparing our assay results with those of standard real-time quantitative PCR (RT-QPCR) using SYBR Green I dye. The results showed good coherence between the two methods. This approach demonstrates the possibility of accurate qualitative and quantitative detection of GM plants in a single run.

  8. Serial Scanning and Registration of High Resolution Quantitative Computed Tomography Volume Scans for the Determination of Local Bone Density Changes

    NASA Technical Reports Server (NTRS)

    Whalen, Robert T.; Napel, Sandy; Yan, Chye H.

    1996-01-01

    Progress in development of the methods required to study bone remodeling as a function of time is reported. The following topics are presented: 'A New Methodology for Registration Accuracy Evaluation', 'Registration of Serial Skeletal Images for Accurately Measuring Changes in Bone Density', and 'Precise and Accurate Gold Standard for Multimodality and Serial Registration Method Evaluations.'

  9. Intraoperative perception and estimates on extent of resection during awake glioma surgery: overcoming the learning curve.

    PubMed

    Lau, Darryl; Hervey-Jumper, Shawn L; Han, Seunggu J; Berger, Mitchel S

    2018-05-01

    OBJECTIVE There is ample evidence that extent of resection (EOR) is associated with improved outcomes for glioma surgery. However, it is often difficult to accurately estimate EOR intraoperatively, and surgeon accuracy has yet to be reviewed. In this study, the authors quantitatively assessed the accuracy of intraoperative perception of EOR during awake craniotomy for tumor resection. METHODS A single-surgeon experience of performing awake craniotomies for tumor resection over a 17-year period was examined. Retrospective review of operative reports for quantitative estimation of EOR was recorded. Definitive EOR was based on postoperative MRI. Analysis of accuracy of EOR estimation was examined both as a general outcome (gross-total resection [GTR] or subtotal resection [STR]), and quantitatively (5% within EOR on postoperative MRI). Patient demographics, tumor characteristics, and surgeon experience were examined. The effects of accuracy on motor and language outcomes were assessed. RESULTS A total of 451 patients were included in the study. Overall accuracy of intraoperative perception of whether GTR or STR was achieved was 79.6%, and overall accuracy of quantitative perception of resection (within 5% of postoperative MRI) was 81.4%. There was a significant difference (p = 0.049) in accuracy for gross perception over the 17-year period, with improvement over the later years: 1997-2000 (72.6%), 2001-2004 (78.5%), 2005-2008 (80.7%), and 2009-2013 (84.4%). Similarly, there was a significant improvement (p = 0.015) in accuracy of quantitative perception of EOR over the 17-year period: 1997-2000 (72.2%), 2001-2004 (69.8%), 2005-2008 (84.8%), and 2009-2013 (93.4%). This improvement in accuracy is demonstrated by the significantly higher odds of correctly estimating quantitative EOR in the later years of the series on multivariate logistic regression. Insular tumors were associated with the highest accuracy of gross perception (89.3%; p = 0.034), but lowest accuracy of quantitative perception (61.1% correct; p < 0.001) compared with tumors in other locations. Even after adjusting for surgeon experience, this particular trend for insular tumors remained true. The absence of 1p19q co-deletion was associated with higher quantitative perception accuracy (96.9% vs 81.5%; p = 0.051). Tumor grade, recurrence, diagnosis, and isocitrate dehydrogenase-1 (IDH-1) status were not associated with accurate perception of EOR. Overall, new neurological deficits occurred in 8.4% of cases, and 42.1% of those new neurological deficits persisted after the 3-month follow-up. Correct quantitative perception was associated with lower postoperative motor deficits (2.4%) compared with incorrect perceptions (8.0%; p = 0.029). There were no detectable differences in language outcomes based on perception of EOR. CONCLUSIONS The findings from this study suggest that there is a learning curve associated with the ability to accurately assess intraoperative EOR during glioma surgery, and it may take more than a decade to be truly proficient. Understanding the factors associated with this ability to accurately assess EOR will provide safer surgeries while maximizing tumor resection.

  10. Persistence of uranium emission in laser-produced plasmas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LaHaye, N. L.; Harilal, S. S., E-mail: hari@purdue.edu; Diwakar, P. K.

    2014-04-28

    Detection of uranium and other nuclear materials is of the utmost importance for nuclear safeguards and security. Optical emission spectroscopy of laser-ablated U plasmas has been presented as a stand-off, portable analytical method that can yield accurate qualitative and quantitative elemental analysis of a variety of samples. In this study, optimal laser ablation and ambient conditions are explored, as well as the spatio-temporal evolution of the plasma for spectral analysis of excited U species in a glass matrix. Various Ar pressures were explored to investigate the role that plasma collisional effects and confinement have on spectral line emission enhancement and persistence. The plasma-ambient gas interaction was also investigated using spatially resolved spectra and optical time-of-flight measurements. The results indicate that ambient conditions play a very important role in spectral emission intensity as well as the persistence of excited neutral U emission lines, influencing the appropriate spectral acquisition conditions.

  11. volBrain: An Online MRI Brain Volumetry System

    PubMed Central

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  12. Fabrication de couches minces à mémoire de forme et effets de l'irradiation ionique [Fabrication of shape-memory thin films and effects of ion irradiation]

    NASA Astrophysics Data System (ADS)

    Goldberg, Florent

    1998-09-01

    Nickel and titanium, when combined in the right stoichiometric proportion (1:1), can form alloys showing the shape memory effect. Within the scope of this thesis, thin films of such alloys have been successfully produced by sputtering. Precise control of composition is crucial in order to obtain the shape memory effect. A combination of analytical tools which can accurately determine the behavior of such materials is also required (calorimetric analysis, crystallography, composition analysis, etc.). Rutherford backscattering spectrometry has been used for quantitative composition analysis. Thereafter, irradiation of the films with light ions (He+) of a few MeV was shown to allow lowering of the characteristic premartensitic transformation temperatures while preserving the shape memory effect. Those results open the door to a new field of research, particularly for ion irradiation and its potential use as a tool to modify the thermomechanical behavior of shape memory thin film actuators.

  13. Temperature dependence of the cross section for the fragmentation of thymine via dissociative electron attachment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kopyra, Janina; Abdoul-Carime, Hassan, E-mail: hcarime@ipnl.in2p3.fr

    Providing experimental values for absolute Dissociative Electron Attachment (DEA) cross sections for nucleobases at realistic biological conditions is a considerable challenge. In this work, we provide the temperature dependence of the cross section, σ, of the dehydrogenated thymine anion (T − H)⁻ produced via DEA. Within the 393-443 K temperature range, it is observed that σ varies by one order of magnitude. By extrapolating to a temperature of 313 K, the relative DEA cross section for the production of the dehydrogenated thymine anion at an incident energy of 1 eV decreases by 2 orders of magnitude and the absolute value reaches approximately 6 × 10⁻¹⁹ cm². These quantitative measurements provide a benchmark for theoretical prediction and also a contribution to a more accurate description of the effects of ionizing radiation on a molecular medium.

  14. Comparison of Aircraft Icing Growth Assessment Software

    NASA Technical Reports Server (NTRS)

    Wright, William; Potapczuk, Mark G.; Levinson, Laurie H.

    2011-01-01

    A research project is underway to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. The results have been compared extensively and in a quantifiable manner against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT), including additional data taken to extend the database into the Super-cooled Large Drop (SLD) regime. The project shows the differences in ice shape between LEWICE 3.2.2, GlennICE, and experimental data. The project addresses the validation of the software against a recent set of ice-shape data in the SLD regime. This validation effort mirrors a similar effort undertaken for previous validations of LEWICE. Those reports quantified the ice accretion prediction capabilities of the LEWICE software. Several ice geometry features were proposed for comparing ice shapes in a quantitative manner. The resulting analysis showed that LEWICE compared well to the available experimental data.

  15. Spelling Acquisition in English and Italian: A Cross-Linguistic Study.

    PubMed

    Marinelli, Chiara V; Romani, Cristina; Burani, Cristina; Zoccolotti, Pierluigi

    2015-01-01

    We examined spelling acquisition up to late primary school in children learning a consistent orthography (Italian) and an inconsistent orthography (English). The effects of frequency, lexicality, length, and regularity in modulating spelling performance of the two groups were examined. English and Italian children were matched for both chronological age and number of years of schooling. Two hundred and seven Italian children and 79 English children took part in the study. We found greater accuracy in spelling in Italian than in English children: Italian children were very accurate after only 2 years of schooling, while in English children spelling performance was still poor after 5 years of schooling. Cross-linguistic differences in spelling accuracy proved to be more persistent than the corresponding ones in reading accuracy. Orthographic consistency produced not only quantitative, but also qualitative differences, with larger frequency and regularity effects in English than in Italian children.

  16. Spelling Acquisition in English and Italian: A Cross-Linguistic Study

    PubMed Central

    Marinelli, Chiara V.; Romani, Cristina; Burani, Cristina; Zoccolotti, Pierluigi

    2015-01-01

    We examined spelling acquisition up to late primary school in children learning a consistent orthography (Italian) and an inconsistent orthography (English). The effects of frequency, lexicality, length, and regularity in modulating spelling performance of the two groups were examined. English and Italian children were matched for both chronological age and number of years of schooling. Two hundred and seven Italian children and 79 English children took part in the study. We found greater accuracy in spelling in Italian than in English children: Italian children were very accurate after only 2 years of schooling, while in English children spelling performance was still poor after 5 years of schooling. Cross-linguistic differences in spelling accuracy proved to be more persistent than the corresponding ones in reading accuracy. Orthographic consistency produced not only quantitative, but also qualitative differences, with larger frequency and regularity effects in English than in Italian children. PMID:26696918

  17. Estimating Photosynthetically Available Radiation (PAR) at the Earth's surface from satellite observations

    NASA Technical Reports Server (NTRS)

    Frouin, Robert

    1993-01-01

    Current satellite algorithms to estimate photosynthetically available radiation (PAR) at the earth's surface are reviewed. PAR is deduced either from an insolation estimate or obtained directly from top-of-atmosphere solar radiances. The characteristics of both approaches are contrasted and typical results are presented. The inaccuracies reported, about 10 percent and 6 percent on daily and monthly time scales, respectively, are useful to model oceanic and terrestrial primary productivity. At those time scales, cloud-driven variability in the ratio of PAR to insolation is reduced, making it possible to deduce PAR directly from insolation climatologies (satellite or other) that are currently available or being produced. Improvements, however, are needed in conditions of broken cloudiness and over ice/snow. If not addressed properly, calibration/validation issues may prevent quantitative use of the PAR estimates in studies of climatic change. The prospects are good for an accurate, long-term climatology of PAR over the globe.

  18. Single-band upconversion nanoprobes for multiplexed simultaneous in situ molecular mapping of cancer biomarkers.

    PubMed

    Zhou, Lei; Wang, Rui; Yao, Chi; Li, Xiaomin; Wang, Chengli; Zhang, Xiaoyan; Xu, Congjian; Zeng, Aijun; Zhao, Dongyuan; Zhang, Fan

    2015-04-24

    The identification of potential diagnostic markers and target molecules among the plethora of tumour oncoproteins for cancer diagnosis requires facile technology that is capable of quantitatively analysing multiple biomarkers in tumour cells and tissues. Diagnostic and prognostic classifications of human tumours are currently based on the western blotting and single-colour immunohistochemical methods that are not suitable for multiplexed detection. Herein, we report a general and novel method to prepare single-band upconversion nanoparticles with different colours. The expression levels of three biomarkers in breast cancer cells were determined using single-band upconversion nanoparticles, western blotting and immunohistochemical technologies with excellent correlation. Significantly, the application of antibody-conjugated single-band upconversion nanoparticle molecular profiling technology can achieve the multiplexed simultaneous in situ biodetection of biomarkers in breast cancer cells and tissue specimens and produce more accurate results for the simultaneous quantification of proteins present at low levels compared with classical immunohistochemical technology.

  19. volBrain: An Online MRI Brain Volumetry System.

    PubMed

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently experienced many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  20. Developments and Control of Biocompatible Conducting Polymer for Intracorporeal Continuum Robots.

    PubMed

    Chikhaoui, Mohamed Taha; Benouhiba, Amine; Rougeot, Patrick; Rabenorosoa, Kanty; Ouisse, Morvan; Andreff, Nicolas

    2018-04-30

    Dexterity is highly required of robots when it comes to integration for medical applications. Major efforts have been made to increase dexterity at the distal parts of medical robots. This paper reports on developments toward integrating biocompatible conducting polymer (CP) micro-actuators into the inherently dexterous concentric tube robot paradigm. In the form of tri-layer thin structures, CP micro-actuators produce high strains while requiring less than 1 V for actuation. Fabrication, characterization, and first integrations of such micro-actuators are presented. The integration is validated in a preliminary telescopic soft robot prototype with qualitative and quantitative performance assessment of accurate position control for trajectory tracking scenarios. Further, CP micro-actuators are integrated into a laser steering system in a closed-loop control scheme with displacements up to 5 mm. Our first developments aim toward intracorporeal medical robotics, with miniaturized actuators to be embedded into continuum robots.

  1. The recovery of weak impulsive signals based on stochastic resonance and moving least squares fitting.

    PubMed

    Jiang, Kuosheng; Xu, Guanghua; Liang, Lin; Tao, Tangfei; Gu, Fengshou

    2014-07-29

    In this paper, a stochastic resonance (SR)-based method for recovering weak impulsive signals is developed for quantitative diagnosis of faults in rotating machinery. It was shown in theory that weak impulsive signals follow the mechanism of SR, but the SR produces a nonlinear distortion of the shape of the impulsive signal. To eliminate the distortion, a moving least squares fitting method is introduced to reconstruct the signal from the output of the SR process. This proposed method is verified by comparing its detection results with those of a morphological filter, based on both simulated and experimental signals. The experimental results show that the background noise is suppressed effectively and the key features of impulsive signals are reconstructed with a good degree of accuracy, which leads to an accurate diagnosis of faults in roller bearings in a run-to-failure test.
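
    Moving least squares fitting, the step used here to undo the SR distortion, fits a small weighted polynomial around every sample point and keeps the local fit value at that point. A minimal generic sketch (Gaussian weights and a quadratic local model are assumptions; the paper's exact weight function and polynomial degree may differ):

        import numpy as np

        def mls_smooth(t, y, h, degree=2):
            """Moving least squares: at each sample, fit a local polynomial
            of the given degree with Gaussian weights of bandwidth h and
            evaluate the fit at that sample. t, y are float arrays."""
            y_hat = np.empty_like(y, dtype=float)
            for i, ti in enumerate(t):
                sw = np.sqrt(np.exp(-((t - ti) / h) ** 2))   # sqrt of weights
                # local basis [1, (t-ti), (t-ti)^2, ...] centred at ti
                A = np.vander(t - ti, degree + 1, increasing=True)
                coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
                y_hat[i] = coef[0]          # fitted value at t = ti
            return y_hat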

  2. The relationship between fuel lubricity and diesel injection system wear

    NASA Astrophysics Data System (ADS)

    Lacy, Paul I.

    1992-01-01

    Use of low-lubricity fuel may have contributed to increased failure rates associated with critical fuel injection equipment during the 1991 Operation Desert Storm. However, accurate quantitative analysis of failed components from the field is almost impossible due to the unique service history of each pump. This report details the results of pump stand tests with fuels of equal viscosity, but widely different lubricity. Baseline tests were also performed using reference no. 2 diesel fuel. Use of poor lubricity fuel under these controlled conditions was found to greatly reduce both pump durability and engine performance. However, both improved metallurgy and fuel lubricity additives significantly reduced wear. Good correlation was obtained between standard bench tests and lightly loaded pump components. However, high contact loads on isolated components produced a more severe wear mechanism that is not well reflected by the Ball-on-Cylinder Lubricity Evaluator.

  3. Development of an accurate portable recording peak-flow meter for the diagnosis of asthma.

    PubMed

    Hitchings, D J; Dickinson, S A; Miller, M R; Fairfax, A J

    1993-05-01

    This article describes the systematic design of an electronic recording peak expiratory flow (PEF) meter to provide accurate data for the diagnosis of occupational asthma. Traditional diagnosis of asthma relies on accurate data from PEF tests performed by the patients in their own homes and places of work. Unfortunately, there are high error rates in data produced and recorded by the patient; most of these are transcription errors, and some patients falsify their records. The PEF measurement itself is not effort-independent: the data produced depend on the way in which the patient performs the test. Patients are therefore taught to perform the test giving maximal effort to the expiration being measured; if the measurement is performed incorrectly, errors will occur. Accurate data can be produced with an electronically recording PEF instrument, freeing the patient from the task of recording the test data. Such an instrument should also be capable of determining whether the PEF measurement has been correctly performed. A requirement specification for a recording PEF meter was produced, and a commercially available electronic PEF meter was modified to provide the functions required for accurate serial recording of the measurements produced by the patients. This is now being used in three hospitals in the West Midlands for investigations into the diagnosis of occupational asthma. Investigating current methods of measuring PEF and other pulmonary quantities gave a greater understanding of the limitations of both the measurement methods and the quantities being measured.(ABSTRACT TRUNCATED AT 250 WORDS)

  4. A review of quantitative structure-property relationships for the fate of ionizable organic chemicals in water matrices and identification of knowledge gaps.

    PubMed

    Nolte, Tom M; Ragas, Ad M J

    2017-03-22

    Many organic chemicals are ionizable by nature. After use and release into the environment, various fate processes determine their concentrations, and hence exposure to aquatic organisms. In the absence of suitable data, such fate processes can be estimated using Quantitative Structure-Property Relationships (QSPRs). In this review we compiled available QSPRs from the open literature and assessed their applicability towards ionizable organic chemicals. Using quantitative and qualitative criteria we selected the 'best' QSPRs for sorption, (a)biotic degradation, and bioconcentration. The results indicate that many suitable QSPRs exist, but some critical knowledge gaps remain. Specifically, future focus should be directed towards the development of QSPR models for biodegradation in wastewater and sediment systems, direct photolysis and reaction with singlet oxygen, as well as additional reactive intermediates. Adequate QSPRs for bioconcentration in fish exist, but more accurate assessments can be achieved using pharmacologically based toxicokinetic (PBTK) models. No adequate QSPRs exist for bioconcentration in non-fish species. Due to the high variability of chemical and biological species as well as environmental conditions in QSPR datasets, accurate predictions for specific systems and inter-dataset conversions are problematic, for which standardization is needed. For all QSPR endpoints, additional data requirements involve supplementing the current chemical space covered and accurately characterizing the test systems used.

  5. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng

    Quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple FISH sub-probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific binding of sub-probes and tissue autofluorescence, limiting its accuracy. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on blinking frequencies of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses blinking frequency patterns, emitted from a transcript bound to multiple sub-probes, which are distinct from blinking patterns emitted from partial or nonspecifically bound sub-probes and autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct blinking frequency patterns, laying the foundation for reliable single-cell transcriptomics.

  6. CPTAC Accelerates Precision Proteomics Biomedical Research | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    The accurate quantitation of proteins or peptides using Mass Spectrometry (MS) is gaining prominence in the biomedical research community as an alternative method for analyte measurement. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) investigators have been at the forefront in the promotion of reproducible MS techniques, through the development and application of standardized proteomic methods for protein quantitation on biologically relevant samples.

  7. Factors That Contribute to Assay Variation in Quantitative Analysis of Sex Steroid Hormones Using Liquid and Gas Chromatography-Mass Spectrometry

    ERIC Educational Resources Information Center

    Xu, Xia; Veenstra, Timothy D.

    2012-01-01

    The list of physiological events in which sex steroids play a role continues to increase. To decipher the roles that sex steroids play in any condition requires high quality cohorts of samples and assays that provide highly accurate quantitative measures. Liquid and gas chromatography coupled with mass spectrometry (LC-MS and GC-MS) have…

  8. Improvement of medical content in the curriculum of biomedical engineering based on assessment of students outcomes.

    PubMed

    Abdulhay, Enas; Khnouf, Ruba; Haddad, Shireen; Al-Bashir, Areen

    2017-08-04

    Improvement of the medical content in Biomedical Engineering curricula based on a qualitative assessment process, or on comparison with another high-standard program, has been approached by a number of studies; quantitative assessment tools, however, have not been emphasized. Quantitative assessment tools can be more accurate and robust in challenging multidisciplinary fields like Biomedical Engineering, which mixes elements of biomedicine with aspects of technology. The major limitations of previous research are a heavy dependence on surveys or purely qualitative approaches, as well as the absence of a strong focus on medical outcomes without implicit confusion with the technical ones. The proposed work presents the development and evaluation of an accurate and robust quantitative approach to the improvement of the medical content in the challenging multidisciplinary BME curriculum. The work presents quantitative assessment tools and the subsequent improvement of curriculum medical content, applied, as an illustrative example, to the ABET (Accreditation Board for Engineering and Technology, USA) accredited biomedical engineering (BME) department at Jordan University of Science and Technology. The quantitative results of the assessment of curriculum/courses, capstone, exit exam, and course assessment by student (CAS), as well as of surveys completed by alumni, seniors, employers and training supervisors, were first mapped to the expected students' outcomes related to the medical field (SOsM). The collected data were then analyzed and discussed to find weak points in the curriculum by tracking shortcomings in the degree of achievement of every outcome. Finally, actions were taken to fill in the gaps of the curriculum; these actions were also mapped to the students' medical outcomes (SOsM). Weighted averages of the obtained quantitative values, mapped to SOsM, accurately indicated the achievement levels of all outcomes as well as the necessary improvements to be performed in the curriculum. Mapping the improvements to SOsM also helps in the assessment of the following cycle. The suggested assessment tools can be generalized and extended to any other BME department, and robust improvement of the medical content in a BME curriculum can subsequently be achieved.

  9. Survival and Reproductive Strategies in Two-Spotted Spider Mites: Demographic Analysis of Arrhenotokous Parthenogenesis of Tetranychus urticae (Acari: Tetranychidae).

    PubMed

    Tuan, Shu-Jen; Lin, Yung-Hsiang; Yang, Chung-Ming; Atlihan, Remzi; Saska, Pavel; Chi, Hsin

    2016-04-01

    Tetranychus urticae Koch is a cosmopolitan pest whose rapid developmental rate enables it to produce colonies of thousands of individuals within a short time period. When a solitary virgin female colonizes a new host plant, she is capable of producing male offspring through arrhenotokous parthenogenesis; once her sons mature, oedipal mating occurs and the female will produce bisexual offspring. To analyze the effect of arrhenotokous reproduction on population growth, we devised and compared separate life tables for arrhenotokous and bisexual populations of T. urticae using the age-stage, two-sex life table theory. For the cohort with bisexual reproduction, the intrinsic rate of increase (r), finite rate (λ), net reproductive rate (R0), and mean generation time (T) were 0.2736 d⁻¹, 1.3146 d⁻¹, 44.66 offspring, and 13.89 d, respectively. Because only male eggs were produced during the first 8 d of the oviposition period, after which the cohort would begin bisexual reproduction, it would be theoretically wrong to calculate the population parameters using the survival rate and fecundity of a purely arrhenotokous cohort. We demonstrated that the effect of arrhenotokous reproduction could be accurately described and evaluated using the age-stage, two-sex life table. We also used population projection based on life table data, quantitatively showing the effect that arrhenotokous reproduction has on the growth potential and management of T. urticae.
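
    As background to the population parameters quoted above, the intrinsic rate of increase r is the root of the Euler-Lotka equation, sum over ages x of exp(-r(x+1))·l(x)·m(x) = 1, in the age-indexed-from-zero convention used in age-stage, two-sex life tables, and λ = exp(r) follows from it. A minimal bisection solver, with purely hypothetical survival (l) and fecundity (m) schedules rather than the study's data, might look like this:

```python
import numpy as np

def intrinsic_rate(lx, mx, lo=0.0, hi=1.0, tol=1e-10):
    """Solve sum_x exp(-r*(x+1)) * l_x * m_x = 1 for r by bisection
    (age x indexed from 0, as in two-sex life table convention)."""
    x = np.arange(len(lx))
    f = lambda r: np.sum(np.exp(-r * (x + 1)) * lx * mx) - 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:      # f decreases monotonically with r
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical daily survival (l_x) and female fecundity (m_x) schedules
lx = np.array([1.0, 0.98, 0.95, 0.90, 0.85, 0.80, 0.70, 0.60])
mx = np.array([0.0, 0.0, 0.0, 2.0, 6.0, 8.0, 6.0, 3.0])
r = intrinsic_rate(lx, mx)
print(f"r = {r:.4f} per day, lambda = {np.exp(r):.4f} per day")
```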

  10. Accurate Region-of-Interest Recovery Improves the Measurement of the Cell Migration Rate in the In Vitro Wound Healing Assay.

    PubMed

    Bedoya, Cesar; Cardona, Andrés; Galeano, July; Cortés-Mancera, Fabián; Sandoz, Patrick; Zarzycki, Artur

    2017-12-01

    The wound healing assay is widely used for the quantitative analysis of highly regulated cellular events. In this assay, a wound is deliberately produced on a confluent cell monolayer, and the rate of wound reduction (WR) is then characterized by processing images of the same regions of interest (ROIs) recorded at different time intervals. In this method, sharp-image ROI recovery is indispensable to compensate for displacements of the cell cultures due either to the exploration of multiple sites of the same culture or to transfers from the microscope stage to a cell incubator. ROI recovery is usually done manually and, although a low-magnification microscope objective (10x) is generally used, repositioning imperfections constitute a major source of errors detrimental to the WR measurement accuracy. We address this ROI recovery issue by using pseudoperiodic patterns fixed onto the cell culture dishes, allowing the easy localization of ROIs and the accurate quantification of positioning errors. The method is applied to a tumor-derived cell line, and the WR rates are measured by means of two different image-processing software packages. Sharp ROI recovery based on the proposed method is found to significantly improve the accuracy of the WR measurement and of the positioning under the microscope.

  11. Spectrally resolved opacities and Rosseland and Planck mean opacities of lowly ionized gold plasmas: a detailed level-accounting investigation.

    PubMed

    Zeng, Jiaolong; Yuan, Jianmin

    2007-08-01

    Calculation details of the radiative opacity of lowly ionized gold plasmas, obtained using our fully relativistic detailed level-accounting approach, are presented to show the importance of accurate atomic data for a quantitative reproduction of the experimental observations. Although a huge number of transition lines are involved in the radiative absorption of high-Z plasmas, such that statistical models are often believed to give a reasonable description of their opacities, we first show in detail that an adequate treatment of physical effects, in particular configuration interaction (including the core-valence electron correlation), is essential to produce atomic data for bound-bound and bound-free processes in gold plasmas that are accurate enough to correctly explain the relative intensity of two strong absorption peaks experimentally observed near photon energies of 70 and 80 eV. A detailed study is also carried out for gold plasmas with an average ionization degree of 10, for both spectrally resolved opacities and Rosseland and Planck means. For comparison, results obtained using an average atom model are also given, showing that even at relatively higher matter densities, correlation effects are important for predicting the correct positions of the absorption peaks of transition arrays.

  12. How Haptic Size Sensations Improve Distance Perception

    PubMed Central

    Battaglia, Peter W.; Kersten, Daniel; Schrater, Paul R.

    2011-01-01

    Determining distances to objects is one of the most ubiquitous perceptual tasks in everyday life. Nevertheless, it is challenging because the information from a single image confounds object size and distance. Though our brains frequently judge distances accurately, the underlying computations employed by the brain are not well understood. Our work illuminates these computations by formulating a family of probabilistic models that encompass a variety of distinct hypotheses about distance and size perception. We compare these models' predictions to a set of human distance judgments in an interception experiment and use Bayesian analysis tools to quantitatively select the best hypothesis on the basis of its explanatory power and robustness over experimental data. The central question is whether, and how, human distance perception incorporates size cues to improve accuracy. Our conclusions are: 1) humans incorporate haptic object size sensations for distance perception, 2) the incorporation of haptic sensations is suboptimal given their reliability, 3) humans use environmentally accurate size and distance priors, 4) distance judgments are produced by perceptual “posterior sampling”. In addition, we compared our model's estimated sensory and motor noise parameters with previously reported measurements in the perceptual literature and found good correspondence between them. Taken together, these results represent a major step forward in establishing the computational underpinnings of human distance perception and the role of size information. PMID:21738457

  13. Qualification of a Multi-Channel Infrared Laser Absorption Spectrometer for Monitoring CO, HCl, HCN, HF, and CO2 Aboard Manned Spacecraft

    NASA Technical Reports Server (NTRS)

    Briggs, Ryan M.; Frez, Clifford; Forouhar, Siamak; May, Randy D.; Meyer, Marit E.; Kulis, Michael J.; Berger, Gordon M.

    2015-01-01

    Monitoring of specific combustion products can provide early-warning detection of accidental fires aboard manned spacecraft and also identify the source and severity of combustion events. Furthermore, quantitative in situ measurements are important for gauging levels of exposure to hazardous gases, particularly on long-duration missions where analysis of returned samples becomes impractical. Absorption spectroscopy using tunable laser sources in the 2 to 5 micrometer wavelength range enables accurate, unambiguous detection of CO, HCl, HCN, HF, and CO2, which are produced in varying amounts through the heating of electrical components and packaging materials commonly used aboard spacecraft. Here, we report on calibration and testing of a five-channel laser absorption spectrometer designed to accurately monitor ambient gas-phase concentrations of these five compounds, with low-level detection limits based on the Spacecraft Maximum Allowable Concentrations. The instrument employs a two-pass absorption cell with a total optical pathlength of 50 cm and a dedicated infrared semiconductor laser source for each target gas. We present results from testing the five-channel sensor in the presence of trace concentrations of the target compounds that were introduced using both gas sources and oxidative pyrolysis (non-flaming combustion) of solid material mixtures.

  14. Accurate Construction of Photoactivated Localization Microscopy (PALM) Images for Quantitative Measurements

    PubMed Central

    Coltharp, Carla; Kessler, Rene P.; Xiao, Jie

    2012-01-01

    Localization-based superresolution microscopy techniques such as Photoactivated Localization Microscopy (PALM) and Stochastic Optical Reconstruction Microscopy (STORM) have allowed investigations of cellular structures with unprecedented optical resolutions. One major obstacle to interpreting superresolution images, however, is the overcounting of molecule numbers caused by fluorophore photoblinking. Using both experimental and simulated images, we determined the effects of photoblinking on the accurate reconstruction of superresolution images and on quantitative measurements of structural dimension and molecule density made from those images. We found that structural dimension and relative density measurements can be made reliably from images that contain photoblinking-related overcounting, but accurate absolute density measurements, and consequently faithful representations of molecule counts and positions in cellular structures, require the application of a clustering algorithm to group localizations that originate from the same molecule. We analyzed how applying a simple algorithm with different clustering thresholds (tThresh and dThresh) affects the accuracy of reconstructed images, and developed an easy method to select optimal thresholds. We also identified an empirical criterion to evaluate whether an imaging condition is appropriate for accurate superresolution image reconstruction with the clustering algorithm. Both the threshold selection method and imaging condition criterion are easy to implement within existing PALM clustering algorithms and experimental conditions. The main advantage of our method is that it generates a superresolution image and molecule position list that faithfully represents molecule counts and positions within a cellular structure, rather than only summarizing structural properties into ensemble parameters. This feature makes it particularly useful for cellular structures of heterogeneous densities and irregular geometries, and allows a variety of quantitative measurements tailored to specific needs of different biological systems. PMID:23251611
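
    The paper's clustering algorithm is not reproduced here, but a minimal sketch of the general idea, namely grouping localizations that fall within a distance threshold (dThresh) and a dark-time threshold (tThresh) of an existing cluster, could look like the following; the greedy strategy and all numbers are illustrative assumptions.

```python
import numpy as np

def cluster_localizations(pos, frame, d_thresh, t_thresh):
    """Greedy grouping: a localization joins an existing cluster if it
    lies within d_thresh of the cluster centre and within t_thresh
    frames of the cluster's last appearance; otherwise it seeds a new
    cluster (one cluster ~ one molecule)."""
    clusters = []  # each: {'centre', 'last_frame', 'members'}
    for i in np.argsort(frame):
        placed = False
        for c in clusters:
            if (frame[i] - c["last_frame"] <= t_thresh and
                    np.linalg.norm(pos[i] - c["centre"]) <= d_thresh):
                c["members"].append(i)
                c["centre"] = pos[c["members"]].mean(axis=0)
                c["last_frame"] = frame[i]
                placed = True
                break
        if not placed:
            clusters.append({"centre": pos[i].copy(),
                             "last_frame": frame[i], "members": [i]})
    return clusters

# Hypothetical localizations: (x, y) in nm and the frame of appearance
pos = np.array([[0.0, 0.0], [5.0, 3.0], [200.0, 180.0], [4.0, -2.0]])
frame = np.array([1, 2, 3, 9])
print(len(cluster_localizations(pos, frame, d_thresh=30.0, t_thresh=5)))
```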

  15. Quantitative analysis of low-density SNP data for parentage assignment and estimation of family contributions to pooled samples.

    PubMed

    Henshall, John M; Dierens, Leanne; Sellars, Melony J

    2014-09-02

    While much attention has focused on the development of high-density single nucleotide polymorphism (SNP) assays, the costs of developing and running low-density assays have fallen dramatically. This makes it feasible to develop and apply SNP assays for agricultural species beyond the major livestock species. Although low-cost low-density assays may not have the accuracy of the high-density assays widely used in human and livestock species, we show that when combined with statistical analysis approaches that use quantitative instead of discrete genotypes, their utility may be improved. The data used in this study are from a 63-SNP marker Sequenom® iPLEX Platinum panel for the Black Tiger shrimp, for which high-density SNP assays are not currently available. For quantitative genotypes that could be estimated, in 5% of cases the most likely genotype for an individual at a SNP had a probability of less than 0.99. Matrix formulations of maximum likelihood equations for parentage assignment were developed for the quantitative genotypes and also for discrete genotypes perturbed by an assumed error term. Assignment rates that were based on maximum likelihood with quantitative genotypes were similar to those based on maximum likelihood with perturbed genotypes but, for more than 50% of cases, the two methods resulted in individuals being assigned to different families. Treating genotypes as quantitative values allows the same analysis framework to be used for pooled samples of DNA from multiple individuals. Resulting correlations between allele frequency estimates from pooled DNA and individual samples were consistently greater than 0.90, and as high as 0.97 for some pools. Estimates of family contributions to the pools based on quantitative genotypes in pooled DNA had a correlation of 0.85 with estimates of contributions from DNA-derived pedigree. Even with low numbers of SNPs of variable quality, parentage testing and family assignment from pooled samples are sufficiently accurate to provide useful information for a breeding program. Treating genotypes as quantitative values is an alternative to perturbing genotypes using an assumed error distribution, but can produce very different results. An understanding of the distribution of the error is required for SNP genotyping platforms.
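
    To illustrate what treating genotypes as quantitative values can mean in practice (the authors' exact likelihood formulation is not reproduced here), the sketch below converts per-individual genotype probabilities into expected allele dosages and derives a pooled allele-frequency estimate; all probabilities are hypothetical.

```python
import numpy as np

# Quantitative genotypes: for each individual and SNP, the posterior
# probabilities of the three genotypes (AA, AB, BB). Hypothetical values.
geno_probs = np.array([
    [[0.98, 0.02, 0.00],   # individual 1, SNP 1
     [0.10, 0.85, 0.05]],  # individual 1, SNP 2
    [[0.00, 0.30, 0.70],
     [0.01, 0.04, 0.95]],
])

# Expected dosage of the B allele: 0*P(AA) + 1*P(AB) + 2*P(BB)
dosage = geno_probs @ np.array([0.0, 1.0, 2.0])   # shape (n_ind, n_snp)

# Pooled-sample B-allele frequency estimate: mean dosage / 2
pool_freq = dosage.mean(axis=0) / 2.0
print(dosage)
print(pool_freq)
```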

  16. Quantitation of hepatitis B virus DNA in plasma using a sensitive cost-effective "in-house" real-time PCR assay.

    PubMed

    Daniel, Hubert Darius J; Fletcher, John G; Chandy, George M; Abraham, Priya

    2009-01-01

    Sensitive nucleic acid testing for the detection and accurate quantitation of hepatitis B virus (HBV) is necessary to reduce transmission through blood and blood products and for monitoring patients on antiviral therapy. The aim of this study is to standardize an "in-house" real-time HBV polymerase chain reaction (PCR) for accurate quantitation and screening of HBV. The "in-house" real-time assay was compared with a commercial assay using 30 chronically infected individuals and 70 blood donors who were negative for hepatitis B surface antigen, hepatitis C virus (HCV) antibody and human immunodeficiency virus (HIV) antibody. Further, 30 HBV-genotyped samples were tested to evaluate the "in-house" assay's capacity to detect the genotypes prevalent among individuals attending this tertiary care hospital. The lower limit of detection of this "in-house" HBV real-time PCR was assessed against the WHO international standard and found to be 50 IU/mL. The inter-assay and intra-assay coefficients of variation (CV) of this "in-house" assay ranged from 1.4% to 9.4% and 0.0% to 2.3%, respectively. Viral loads estimated with this "in-house" HBV real-time assay correlated well with the commercial artus HBV RG PCR assay (r = 0.95, P < 0.0001). This assay can be used for the detection and accurate quantitation of HBV viral loads in plasma samples, can be employed for the screening of blood donations, and can potentially be adapted to a multiplex format for the simultaneous detection of HBV, HIV and HCV to reduce the cost of testing in blood banks.
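
    As a reminder of how quantitation works in such assays, threshold cycle (Ct) values are linear in the log of the input concentration, so unknowns are read off a standard curve fitted to dilutions of a quantified standard, and the amplification efficiency follows from the slope. A minimal sketch, with hypothetical dilution data rather than the study's calibrators:

```python
import numpy as np

# Hypothetical standard curve: standard dilutions (IU/mL) and mean Ct
std_conc = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
std_ct   = np.array([35.1, 31.7, 28.3, 24.9, 21.5])

# Ct = slope * log10(conc) + intercept
slope, intercept = np.polyfit(np.log10(std_conc), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 means 100% efficiency

def viral_load(ct):
    """Invert the standard curve to get IU/mL from a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"Ct 30.0 -> {viral_load(30.0):.0f} IU/mL")
```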

  17. Photogrammetry of the Human Brain: A Novel Method for Three-Dimensional Quantitative Exploration of the Structural Connectivity in Neurosurgery and Neurosciences.

    PubMed

    De Benedictis, Alessandro; Nocerino, Erica; Menna, Fabio; Remondino, Fabio; Barbareschi, Mattia; Rozzanigo, Umberto; Corsini, Francesco; Olivetti, Emanuele; Marras, Carlo Efisio; Chioffi, Franco; Avesani, Paolo; Sarubbo, Silvio

    2018-04-13

    Anatomic awareness of the structural connectivity of the brain is mandatory for neurosurgeons in selecting the most effective approaches for brain resections. Although standard microdissection is a validated technique for investigating the different white matter (WM) pathways and for verifying the results of tractography, the possibility of interactive exploration of the specimens with reliable acquisition of quantitative information has not been described. Photogrammetry is a well-established technique allowing accurate metrology on highly defined three-dimensional (3D) models. The aim of this work is to propose the application of the photogrammetric technique to support 3D exploration and quantitative analysis of cerebral WM connectivity. The main perisylvian pathways, including the superior longitudinal fascicle and the arcuate fascicle, were exposed using the Klingler technique, and photogrammetric acquisition followed each dissection step. The point clouds were registered to a reference magnetic resonance image of the specimen, and all the acquisitions were coregistered into an open-source model. We analyzed 5 steps, covering the cortical surface, the short intergyral fibers, the indirect posterior and anterior superior longitudinal fascicles, and the arcuate fascicle. The coregistration between the magnetic resonance imaging mesh and the point-cloud models was highly accurate, and multiple measures of distances between specific cortical landmarks and WM tracts were collected on the photogrammetric model. Photogrammetry allows an accurate 3D reproduction of WM anatomy and the acquisition of unlimited quantitative data directly on the real specimen during postdissection analysis. These results open many new promising neuroscientific and educational perspectives and can also optimize the quality of neurosurgical treatments. Copyright © 2018 Elsevier Inc. All rights reserved.

  18. Quantitative Hydrocarbon Energies from the PMO Method.

    ERIC Educational Resources Information Center

    Cooper, Charles F.

    1979-01-01

    Details a procedure for accurately calculating the quantum mechanical energies of hydrocarbons using the perturbational molecular orbital (PMO) method, which does not require the use of a computer. (BT)

  19. Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms

    NASA Astrophysics Data System (ADS)

    Mirkovic, Djordje; Stepanian, Phillip M.; Kelly, Jeffrey F.; Chilson, Phillip B.

    2016-10-01

    The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification.

  20. Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms

    PubMed Central

    Mirkovic, Djordje; Stepanian, Phillip M.; Kelly, Jeffrey F.; Chilson, Phillip B.

    2016-01-01

    The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification. PMID:27762292

  1. Thermal heterogeneity within aqueous materials quantified by 1H NMR spectroscopy: Multiparametric validation in silico and in vitro

    NASA Astrophysics Data System (ADS)

    Lutz, Norbert W.; Bernard, Monique

    2018-02-01

    We recently suggested a new paradigm for statistical analysis of thermal heterogeneity in (semi-)aqueous materials by 1H NMR spectroscopy, using water as a temperature probe. Here, we present a comprehensive in silico and in vitro validation that demonstrates the ability of this new technique to provide accurate quantitative parameters characterizing the statistical distribution of temperature values in a volume of (semi-)aqueous matter. First, line shape parameters of numerically simulated water 1H NMR spectra are systematically varied to study a range of mathematically well-defined temperature distributions. Then, corresponding models based on measured 1H NMR spectra of agarose gel are analyzed. In addition, dedicated samples based on hydrogels or biological tissue are designed to produce temperature gradients changing over time, and dynamic NMR spectroscopy is employed to analyze the resulting temperature profiles at sub-second temporal resolution. Accuracy and consistency of the previously introduced statistical descriptors of temperature heterogeneity are determined: weighted median and mean temperature, standard deviation, temperature range, temperature mode(s), kurtosis, skewness, entropy, and relative areas under temperature curves. Potential and limitations of this method for quantitative analysis of thermal heterogeneity in (semi-)aqueous materials are discussed in view of prospective applications in materials science as well as biology and medicine.
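
    Once a line shape has been mapped to a temperature distribution, the statistical descriptors listed above are weighted moments of that distribution. A minimal sketch of their computation, using an invented bimodal temperature profile in place of weights derived from real spectra:

```python
import numpy as np

def temperature_descriptors(temps, weights):
    """Weighted descriptors of a temperature distribution; in practice
    the weights would come from the water 1H line shape mapped to
    temperature."""
    w = weights / weights.sum()
    mean = np.sum(w * temps)
    sd = np.sqrt(np.sum(w * (temps - mean) ** 2))
    skew = np.sum(w * ((temps - mean) / sd) ** 3)
    kurt = np.sum(w * ((temps - mean) / sd) ** 4) - 3.0  # excess kurtosis
    # Weighted median: where the cumulative weight crosses 0.5
    order = np.argsort(temps)
    cum = np.cumsum(w[order])
    median = temps[order][np.searchsorted(cum, 0.5)]
    entropy = -np.sum(w[w > 0] * np.log(w[w > 0]))       # Shannon entropy
    return dict(mean=mean, median=median, sd=sd, skew=skew,
                kurtosis=kurt, entropy=entropy,
                range=temps.max() - temps.min())

# Invented bimodal temperature profile (degrees C)
t = np.linspace(30, 45, 301)
p = np.exp(-((t - 35) / 1.0) ** 2) + 0.5 * np.exp(-((t - 41) / 1.5) ** 2)
print(temperature_descriptors(t, p))
```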

  2. Performance Equivalence and Validation of the Soleris Automated System for Quantitative Microbial Content Testing Using Pure Suspension Cultures.

    PubMed

    Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl

    2016-09-01

    Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris® automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance-equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated with CFU/sample (R² = 0.94-0.97), with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to those of the traditional plate count method. The system was significantly more precise than plate counting (RSD 1.2-2.9% for DT versus 7.8-40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust to variations in the operating detection thresholds (dTs; ±2 units). The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded the criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.

  3. Use of a liquid-crystal, heater-element composite for quantitative, high-resolution heat transfer coefficients on a turbine airfoil, including turbulence and surface roughness effects

    NASA Astrophysics Data System (ADS)

    Hippensteele, Steven A.; Russell, Louis M.; Torres, Felix J.

    1987-05-01

    Local heat transfer coefficients were measured along the midchord of a three-times-size turbine vane airfoil in a static cascade operated at room temperature over a range of Reynolds numbers. The test surface consisted of a composite of commercially available materials: a Mylar sheet with a layer of cholesteric liquid crystals, which change color with temperature, and a heater made of a polyester sheet coated with vapor-deposited gold, which produces uniform heat flux. After the initial selection and calibration of the composite sheet, accurate, quantitative, and continuous heat transfer coefficients were mapped over the airfoil surface. Tests were conducted at two free-stream turbulence intensities: 0.6 percent, which is typical of wind tunnels; and 10 percent, which is typical of real engine conditions. In addition to a smooth airfoil, the effects of local leading-edge sand roughness were also examined for a value greater than the critical roughness. The local heat transfer coefficients are presented for both free-stream turbulence intensities for inlet Reynolds numbers from 1.20 × 10⁵ to 5.55 × 10⁵. Comparisons are also made with analytical values of heat transfer coefficients obtained from the STAN5 boundary layer code.

  4. Use of a liquid-crystal and heater-element composite for quantitative, high-resolution heat-transfer coefficients on a turbine airfoil including turbulence and surface-roughness effects

    NASA Astrophysics Data System (ADS)

    Hippensteele, S. A.; Russell, L. M.; Torres, F. J.

    Local heat transfer coefficients were measured along the midchord of a three-times-size turbine vane airfoil in a static cascade operated at room temperature over a range of Reynolds numbers. The test surface consisted of a composite of commercially available materials: a Mylar sheet with a layer of cholesteric liquid crystals, which change color with temperature, and a heater made of a polyester sheet coated with vapor-deposited gold, which produces uniform heat flux. After the initial selection and calibration of the composite sheet, accurate, quantitative, and continuous heat transfer coefficients were mapped over the airfoil surface. Tests were conducted at two free-stream turbulence intensities: 0.6 percent, which is typical of wind tunnels; and 10 percent, which is typical of real engine conditions. In addition to a smooth airfoil, the effects of local leading-edge sand roughness were also examined for a value greater than the critical roughness. The local heat transfer coefficients are presented for both free-stream turbulence intensities for inlet Reynolds numbers from 1.20 × 10⁵ to 5.55 × 10⁵. Comparisons are also made with analytical values of heat transfer coefficients obtained from the STAN5 boundary layer code.

  5. The Effect of Radiation on Selected Photographic Film

    NASA Technical Reports Server (NTRS)

    Slater, Richard; Kinard, John; Firsov, Ivan

    2000-01-01

    We conducted this film test to evaluate several manufacturers' photographic films for their ability to acquire imagery on the International Space Station. We selected 25 motion picture, photographic slide, and negative films from three different film manufacturers. We based this selection on the fact that their films ranked highest in other similar film tests, and on their general acceptance by the international community. This test differed from previous tests in that the entire evaluation process leading up to the final selection was based on information derived after the original flight film was scanned to a digital file; previously conducted tests were evaluated entirely from 8 x 10 prints that were produced from the film either directly or through the internegative process. This new evaluation procedure provided accurate quantitative data on granularity and contrast from the digital data. The test did not try to define which film was best visually, a judgment too often based on personal preference; however, the test results did group the films into good, marginal, and unacceptable categories. We developed, and included in this report, a template containing quantitative, graphical, and visual information for each film. These templates should be sufficient for comparing the different films tested and subsequently selecting a film or films to be used for experiments and general documentation on the International Space Station.

  6. Quantitative high dynamic range beam profiling for fluorescence microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mitchell, T. J., E-mail: t.j.mitchell@dur.ac.uk; Saunter, C. D.; O’Nions, W.

    2014-10-15

    Modern developmental biology relies on optically sectioning fluorescence microscope techniques to produce non-destructive in vivo images of developing specimens at high resolution in three dimensions. As optimal performance of these techniques is reliant on the three-dimensional (3D) intensity profile of the illumination employed, the ability to directly record and analyze these profiles is of great use to the fluorescence microscopist or instrument builder. Though excitation beam profiles can be measured indirectly using a sample of fluorescent beads and recording the emission along the microscope detection path, we demonstrate an alternative approach where a miniature camera sensor is used directly within the illumination beam. Measurements taken using our approach are solely concerned with the illumination optics, as the detection optics are not involved. We present a miniature beam profiling device and high dynamic range flux reconstruction algorithm that together are capable of accurately reproducing quantitative 3D flux maps over a large focal volume. Performance of this beam profiling system is verified within an optical test bench and demonstrated for fluorescence microscopy by profiling the low-NA illumination beam of a single plane illumination microscope. The generality and success of this approach showcases a widely flexible beam amplitude diagnostic tool for use within the life sciences.
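
    The authors' reconstruction algorithm is not given here, but a common way to build a high dynamic range map from a small sensor is to combine frames taken at several exposure times, discarding saturated pixels and normalizing by the total exposure collected per pixel. A minimal sketch under those assumptions:

```python
import numpy as np

def hdr_flux(frames, exposures, saturation=0.95 * 4095):
    """Merge frames taken at different exposure times into one flux map.
    Summed counts divided by summed exposure time gives an
    exposure-weighted flux estimate per pixel; clipped pixels are
    ignored. A 12-bit sensor is assumed for the saturation level."""
    num = np.zeros(frames[0].shape)
    den = np.zeros(frames[0].shape)
    for img, t in zip(frames, exposures):
        valid = img < saturation           # ignore saturated pixels
        num[valid] += img[valid]           # sum of counts
        den[valid] += t                    # sum of exposure times
    return np.where(den > 0, num / np.maximum(den, 1e-12), np.nan)

# Hypothetical frames of the same beam at three exposure times (ms)
rng = np.random.default_rng(0)
true_flux = rng.uniform(1, 4000, size=(64, 64))   # counts per ms
exposures = [0.1, 1.0, 10.0]
frames = [np.clip(true_flux * t + rng.normal(0, 5, true_flux.shape),
                  0, 4095) for t in exposures]
flux_map = hdr_flux(frames, exposures)
```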

  7. Use of a liquid-crystal, heater-element composite for quantitative, high-resolution heat transfer coefficients on a turbine airfoil, including turbulence and surface roughness effects

    NASA Technical Reports Server (NTRS)

    Hippensteele, Steven A.; Russell, Louis M.; Torres, Felix J.

    1987-01-01

    Local heat transfer coefficients were measured along the midchord of a three-times-size turbine vane airfoil in a static cascade operated at roon temperature over a range of Reynolds numbers. The test surface consisted of a composite of commercially available materials: a Mylar sheet with a layer of cholestric liquid crystals, which change color with temperature, and a heater made of a polyester sheet coated with vapor-deposited gold, which produces uniform heat flux. After the initial selection and calibration of the composite sheet, accurate, quantitative, and continuous heat transfer coefficients were mapped over the airfoil surface. Tests were conducted at two free-stream turbulence intensities: 0.6 percent, which is typical of wind tunnels; and 10 percent, which is typical of real engine conditions. In addition to a smooth airfoil, the effects of local leading-edge sand roughness were also examined for a value greater than the critical roughness. The local heat transfer coefficients are presented for both free-stream turbulence intensities for inlet Reynolds numbers from 1.20 to 5.55 x 10 to the 5th power. Comparisons are also made with analytical values of heat transfer coefficients obtained from the STAN5 boundary layer code.

  8. Electromagnetic Model Reliably Predicts Radar Scattering Characteristics of Airborne Organisms.

    PubMed

    Mirkovic, Djordje; Stepanian, Phillip M; Kelly, Jeffrey F; Chilson, Phillip B

    2016-10-20

    The radar scattering characteristics of aerial animals are typically obtained from controlled laboratory measurements of a freshly harvested specimen. These measurements are tedious to perform, difficult to replicate, and typically yield only a small subset of the full azimuthal, elevational, and polarimetric radio scattering data. As an alternative, biological applications of radar often assume that the radar cross sections of flying animals are isotropic, since sophisticated computer models are required to estimate the 3D scattering properties of objects having complex shapes. Using the method of moments implemented in the WIPL-D software package, we show for the first time that such electromagnetic modeling techniques (typically applied to man-made objects) can accurately predict organismal radio scattering characteristics from an anatomical model: here the Brazilian free-tailed bat (Tadarida brasiliensis). The simulated scattering properties of the bat agree with controlled measurements and radar observations made during a field study of bats in flight. This numerical technique can produce the full angular set of quantitative polarimetric scattering characteristics, while eliminating many practical difficulties associated with physical measurements. Such a modeling framework can be applied for bird, bat, and insect species, and will help drive a shift in radar biology from a largely qualitative and phenomenological science toward quantitative estimation of animal densities and taxonomic identification.

  9. Quantitative real-time PCR technique for the identification of E. coli residual DNA in streptokinase recombinant product.

    PubMed

    Fazelahi, Mansoureh; Kia, Vahid; Kaghazian, Hooman; Paryan, Mahdi

    2017-11-26

    Recombinant streptokinase is a biopharmaceutical that is usually produced in E. coli. Residual DNA may remain in the product as a contaminant and risk factor, so it is necessary to control the production procedure to exclude any possible contamination. The aim of the present study was to develop a highly specific and sensitive quantitative real-time PCR-based method to determine the amount of E. coli DNA in recombinant streptokinase. A specific primer pair and probe were designed to detect all strains of E. coli. To determine the specificity, in addition to using NCBI BLASTn, 28 samples including human, bacterial, and viral genomes were tested. The results confirmed that the assay detects no genomic DNA other than that of E. coli, and the specificity was determined to be 100%. To determine the sensitivity and limit of detection of the assay, a 10-fold serial dilution (10¹ to 10⁷ copies/µL) was tested in triplicate. The sensitivity of the test was determined to be 10¹ copies/µL, or 35 fg/µL. Inter-assay and intra-assay variation were determined to be 0.86% and 1.69%, respectively. Based on these results, the assay can be used as an accurate method to evaluate residual E. coli DNA contamination in recombinant streptokinase.

  10. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for the detection, identification and quantification of GMOs. It has been critically assessed, and requirements for its performance have been set. Nevertheless, some challenges should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, the characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have also been made to use next-generation sequencing for quantitative purposes, although accurate quantification of GMO content with this technology remains a challenge for the future, especially for mixed samples. New approaches are needed also for the quantification of stacked events, and for the potential quantification of organisms produced by new plant breeding techniques.
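
    Digital PCR, mentioned above as one of the newer quantification approaches, infers the copy concentration from the fraction of positive partitions with a Poisson correction: λ = −ln(1 − p) copies per partition. A minimal sketch; the droplet volume is a typical value assumed for illustration, not tied to any particular platform:

```python
import math

def dpcr_concentration(n_positive, n_total, partition_vol_ul=0.00085):
    """Poisson-corrected digital PCR quantification: the fraction of
    positive partitions p underestimates the copy count because one
    partition may hold several copies; lambda = -ln(1 - p) corrects
    for this."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)          # mean copies per partition
    return lam / partition_vol_ul     # copies per microlitre

# Hypothetical run: 4,500 of 20,000 droplets positive, ~0.85 nL droplets
print(f"{dpcr_concentration(4500, 20000):.0f} copies/uL")
```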

  11. Detection of exogenous gene doping of IGF-I by a real-time quantitative PCR assay.

    PubMed

    Zhang, Jin-Ju; Xu, Jing-Feng; Shen, Yong-Wei; Ma, Shi-Jiao; Zhang, Ting-Ting; Meng, Qing-Lin; Lan, Wen-Jun; Zhang, Chun; Liu, Xiao-Mei

    2017-07-01

    Gene doping can be easily concealed since its product is similar to the endogenous protein, making effective detection very challenging. In this study, we selected the exogenous insulin-like growth factor I (IGF-I) gene for gene doping detection. First, the synthetic IGF-I gene was subcloned into a recombinant adeno-associated virus (rAAV) plasmid to produce recombinant rAAV2/IGF-I-GFP vectors. Second, in an animal model, rAAV2/IGF-I-GFP vectors were injected into the thigh muscle tissue of mice, and muscle and blood specimens were then sampled at different time points for total DNA isolation. Finally, real-time quantitative PCR was employed to detect the exogenous gene doping of IGF-I. Given the characteristics of the endogenous IGF-I gene sequence, a TaqMan probe was designed at the junction of exons 2 and 3 of the IGF-I gene to distinguish the exogenous from the endogenous IGF-I gene. In addition, an internal reference control plasmid and its probe were used in the PCR to rule out false-positive results through comparison of their threshold cycle (Ct) values. Thus, an accurate exogenous IGF-I gene detection approach was developed in this study. © 2016 International Union of Biochemistry and Molecular Biology, Inc.

  12. Contrast limiting factors of optical fiber bundles for flexible endoscopy

    NASA Astrophysics Data System (ADS)

    Ortega-Quijano, N.; Arce-Diego, J. L.; Fanjul-Vélez, F.

    2008-11-01

    Medical endoscopy is a basic tool in the development of minimally invasive procedures for a wide range of medical applications, involving diagnosis, treatment and surgery, as well as biopsy sampling. Its minimally invasive nature means no surgery, or only small incisions, and hence minimal hospitalization time. The medical relevance of endoscopes relies on the fact that they are one of the most effective means of diagnosing cancer at an early stage, with the subsequent improvement in the patient's quality of life. Flexible endoscopy by means of coherent optical fiber bundles offers both flexibility and a high active area. However, the parallel arrangement of the fibers within the bundle produces interference phenomena between them, which results in optical crosstalk. As a consequence, there is a power exchange between contiguous fibers, producing a worsening in the contrast of the image. In this work, this quality-limiting factor is studied in depth. We quantitatively analyze crosstalk, performing several studies that show the limitations imposed on the endoscopic system. Finally, we propose some solutions, using an analytical method to accurately determine the appropriate optical fibers for each particular design. The method is also applied to endoscopic OCT.
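
    A simple way to see how power exchange between neighbouring cores degrades contrast is the two-core coupled-mode model, in which the fraction of power transferred between identical, phase-matched cores after a propagation length z is sin²(κz). The coupling coefficients below are invented for illustration; real values depend strongly on core spacing, wavelength and index contrast, and real bundles mitigate crosstalk by making neighbouring cores dissimilar.

```python
import numpy as np

def crosstalk_fraction(kappa, length):
    """Two-core coupled-mode model: fraction of power transferred from
    an excited core to its neighbour after a distance z, for identical,
    phase-matched cores, is sin^2(kappa * z)."""
    return np.sin(kappa * length) ** 2

# Invented coupling coefficients (1/m) for a few core spacings;
# coupling falls roughly exponentially with core-to-core separation
for spacing_um, kappa in [(3.0, 40.0), (4.0, 12.0), (5.0, 3.5)]:
    xt = crosstalk_fraction(kappa, 0.5)   # 0.5 m long bundle
    print(f"spacing {spacing_um} um: transferred fraction {xt:.1%}")
```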

  13. Accuracy of commercially available c-reactive protein rapid tests in the context of undifferentiated fevers in rural Laos.

    PubMed

    Phommasone, Koukeo; Althaus, Thomas; Souvanthong, Phonesavanh; Phakhounthong, Khansoudaphone; Soyvienvong, Laxoy; Malapheth, Phatthaphone; Mayxay, Mayfong; Pavlicek, Rebecca L; Paris, Daniel H; Dance, David; Newton, Paul; Lubell, Yoel

    2016-02-04

    C-Reactive Protein (CRP) has been shown to be an accurate biomarker for discriminating bacterial from viral infections in febrile patients in Southeast Asia. Here we investigate the accuracy of existing rapid qualitative and semi-quantitative tests as compared with a quantitative reference test, to assess their potential for use in remote tropical settings. Blood samples were obtained from consecutive patients recruited to a prospective fever study at three sites in rural Laos. At each site, one of three rapid qualitative or semi-quantitative tests was performed, as well as the quantitative NycoCard Reader II as a reference test. We estimated the sensitivity and specificity of the three tests against a threshold of 10 mg/L, and kappa values for the agreement of the two semi-quantitative tests with the results of the reference test. All three tests showed high sensitivity, specificity and kappa values as compared with the NycoCard Reader II: with a threshold of 10 mg/L, the sensitivity of the tests ranged from 87% to 98% and the specificity from 91% to 98%, and the weighted kappa values for the semi-quantitative tests were 0.7 and 0.8. The use of CRP rapid tests could offer an inexpensive and effective approach to improving the targeting of antibiotics in remote settings where health facilities are basic and laboratories are absent. This study demonstrates that accurate CRP rapid tests are commercially available; evaluation of their clinical impact and cost-effectiveness at the point of care is warranted.
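
    For reference, the agreement statistics used in such comparisons are straightforward to compute. The sketch below derives sensitivity and specificity from 2x2 confusion counts and a linearly weighted Cohen's kappa from an ordinal agreement table; all counts are hypothetical, not the study's data.

```python
import numpy as np

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def weighted_kappa(table, weights="linear"):
    """Weighted Cohen's kappa for an ordinal k x k agreement table
    (rows: rapid-test category, columns: reference-test category)."""
    table = np.asarray(table, dtype=float)
    k = table.shape[0]
    i, j = np.indices((k, k))
    w = np.abs(i - j) / (k - 1)          # linear disagreement weights
    if weights == "quadratic":
        w = w ** 2
    p = table / table.sum()
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))
    return 1.0 - (w * p).sum() / (w * expected).sum()

# Hypothetical 3-category agreement (low / medium / high CRP)
table = [[80, 8, 2],
         [6, 40, 5],
         [1, 4, 30]]
print(f"weighted kappa = {weighted_kappa(table):.2f}")

# Sensitivity/specificity at the 10 mg/L threshold (hypothetical counts)
sens, spec = sensitivity_specificity(tp=95, fn=5, tn=180, fp=10)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")
```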

  14. Utility of high-resolution accurate MS to eliminate interferences in the bioanalysis of ribavirin and its phosphate metabolites.

    PubMed

    Wei, Cong; Grace, James E; Zvyaga, Tatyana A; Drexler, Dieter M

    2012-08-01

    The polar nucleoside drug ribavirin (RBV) combined with IFN-α is a front-line treatment for chronic hepatitis C virus infection. RBV acts as a prodrug and exerts its broad antiviral activity primarily through its active phosphorylated metabolite ribavirin 5′-triphosphate (RTP), and possibly also through ribavirin 5′-monophosphate (RMP). To study RBV transport, diffusion, metabolic clearance and its impact on drug-metabolizing enzymes, an LC-MS method is needed to simultaneously quantify RBV and its phosphorylated metabolites (RTP, ribavirin 5′-diphosphate and RMP). In a recombinant human UGT1A1 assay, the assay buffer components uridine and its phosphorylated derivatives are isobaric with RBV and its phosphorylated metabolites, leading to significant interference when analyzed by LC-MS in the nominal mass resolution mode. Presented here is an LC-MS method employing LC coupled with full-scan high-resolution accurate MS analysis for the simultaneous quantitative determination of RBV, RMP, ribavirin 5′-diphosphate and RTP, differentiating RBV and its phosphorylated metabolites from uridine and its phosphorylated derivatives by accurate mass and thus avoiding interference. The developed LC-high-resolution accurate MS method allows for quantitation of RBV and its phosphorylated metabolites, eliminating the interferences from uridine and its phosphorylated derivatives in recombinant human UGT1A1 assays.

  15. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near-Sun Conditions With a Simple One-Dimensional "Upwind" Scheme

    NASA Astrophysics Data System (ADS)

    Owens, Mathew J.; Riley, Pete

    2017-11-01

    Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).
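
    To make the "upwind" idea concrete: treating the solar wind speed along the Sun-Earth line as obeying the inviscid Burgers equation dv/dt + v dv/dr = 0, a first-order upwind scheme advects the inner-boundary time series out to 1 AU. The sketch below is a simplified stand-in for the published scheme (no residual acceleration or stream-interaction terms), with an invented boundary series; the grid is chosen coarse enough to satisfy the CFL condition max(v)·dt < dr.

```python
import numpy as np

def propagate_upwind(v_inner, r_in=30.0, r_out=215.0, n_r=50, dt=3600.0):
    """First-order upwind solution of dv/dt + v dv/dr = 0: advect the
    inner-boundary speed series (km/s) from r_in to r_out (in solar
    radii). Stability requires max(v) * dt < dr."""
    r_sun = 6.96e5                        # solar radius in km
    dr = (r_out - r_in) / n_r * r_sun     # radial step in km
    v = np.full(n_r + 1, v_inner[0])      # initial state of the grid
    v_1au = np.empty(len(v_inner))
    for t, v_bound in enumerate(v_inner):
        v[0] = v_bound                    # impose the inner boundary
        # upwind differencing: information flows outward with speed v
        v[1:] -= dt * v[1:] * (v[1:] - v[:-1]) / dr
        v_1au[t] = v[-1]                  # speed at the outer boundary
    return v_1au

# Invented boundary series: 350 km/s wind with one fast stream
t = np.arange(24 * 10)                    # ten days of hourly values
v_in = 350.0 + 300.0 * np.exp(-((t - 100) / 12.0) ** 2)
print(f"peak speed at 1 AU: {propagate_upwind(v_in).max():.0f} km/s")
```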

  16. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near-Sun Conditions With a Simple One-Dimensional "Upwind" Scheme.

    PubMed

    Owens, Mathew J; Riley, Pete

    2017-11-01

    Long lead-time space-weather forecasting requires accurate prediction of the near-Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near-Sun solar wind and magnetic field conditions provide the inner boundary condition to three-dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics-based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near-Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near-Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near-Sun solar wind speed at a range of latitudes about the sub-Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun-Earth line. Propagating these conditions to Earth by a three-dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one-dimensional "upwind" scheme is used. The variance in the resulting near-Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996-2016, the upwind ensemble is found to provide a more "actionable" forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large).

  17. Probabilistic Solar Wind Forecasting Using Large Ensembles of Near‐Sun Conditions With a Simple One‐Dimensional “Upwind” Scheme

    PubMed Central

    Riley, Pete

    2017-01-01

    Long lead‐time space‐weather forecasting requires accurate prediction of the near‐Earth solar wind. The current state of the art uses a coronal model to extrapolate the observed photospheric magnetic field to the upper corona, where it is related to solar wind speed through empirical relations. These near‐Sun solar wind and magnetic field conditions provide the inner boundary condition to three‐dimensional numerical magnetohydrodynamic (MHD) models of the heliosphere out to 1 AU. This physics‐based approach can capture dynamic processes within the solar wind, which affect the resulting conditions in near‐Earth space. However, this deterministic approach lacks a quantification of forecast uncertainty. Here we describe a complementary method to exploit the near‐Sun solar wind information produced by coronal models and provide a quantitative estimate of forecast uncertainty. By sampling the near‐Sun solar wind speed at a range of latitudes about the sub‐Earth point, we produce a large ensemble (N = 576) of time series at the base of the Sun‐Earth line. Propagating these conditions to Earth by a three‐dimensional MHD model would be computationally prohibitive; thus, a computationally efficient one‐dimensional “upwind” scheme is used. The variance in the resulting near‐Earth solar wind speed ensemble is shown to provide an accurate measure of the forecast uncertainty. Applying this technique over 1996–2016, the upwind ensemble is found to provide a more “actionable” forecast than a single deterministic forecast; potential economic value is increased for all operational scenarios, but particularly when false alarms are important (i.e., where the cost of taking mitigating action is relatively large). PMID:29398982

  18. Non-invasive methods for the determination of body and carcass composition in livestock: dual-energy X-ray absorptiometry, computed tomography, magnetic resonance imaging and ultrasound: invited review.

    PubMed

    Scholz, A M; Bünger, L; Kongsro, J; Baulain, U; Mitchell, A D

    2015-07-01

    The ability to accurately measure body or carcass composition is important for performance testing, grading and finally selection or payment of meat-producing animals. Advances especially in non-invasive techniques are mainly based on the development of electronic and computer-driven methods in order to provide objective phenotypic data. The preference for a specific technique depends on the target animal species or carcass, combined with technical and practical aspects such as accuracy, reliability, cost, portability, speed, ease of use, safety and, for in vivo measurements, the need for fixation or sedation. The techniques rely on specific device-driven signals, which interact with tissues in the body or carcass at the atomic or molecular level, resulting in secondary or attenuated signals detected by the instruments and analyzed quantitatively. The signal produced by the instrument may originate from mechanical energy such as sound waves (ultrasound - US), 'photon' radiation (X-ray computed tomography - CT, dual-energy X-ray absorptiometry - DXA) or radio frequency waves (magnetic resonance imaging - MRI). The signals detected by the corresponding instruments are processed to measure, for example, tissue depths, areas, volumes or distributions of fat, muscle (water, protein) and partly bone or bone mineral. Among the above techniques, CT is the most accurate one, followed by MRI and DXA, whereas US can be used for all sizes of farm animal species even under field conditions. CT, MRI and US can provide volume data, whereas only DXA delivers immediate whole-body composition results without (2D) image manipulation. A combination of simple US and more expensive CT, MRI or DXA might be applied for farm animal selection programs in a stepwise approach.

  19. High-definition fiber tractography of the human brain: neuroanatomical validation and neurosurgical applications.

    PubMed

    Fernandez-Miranda, Juan C; Pathak, Sudhir; Engh, Johnathan; Jarbo, Kevin; Verstynen, Timothy; Yeh, Fang-Cheng; Wang, Yibao; Mintz, Arlan; Boada, Fernando; Schneider, Walter; Friedlander, Robert

    2012-08-01

    High-definition fiber tracking (HDFT) is a novel combination of processing, reconstruction, and tractography methods that can track white matter fibers from cortex, through complex fiber crossings, to cortical and subcortical targets with subvoxel resolution. Our objectives were to perform neuroanatomical validation of HDFT and to investigate its neurosurgical applications. Six neurologically healthy adults and 36 patients with brain lesions were studied. Diffusion spectrum imaging data were reconstructed with a Generalized Q-Ball Imaging approach. Fiber dissection studies were performed in 20 human brains, and selected dissection results were compared with tractography. HDFT provides accurate replication of known neuroanatomical features such as the gyral and sulcal folding patterns, the characteristic shape of the claustrum, the segmentation of the thalamic nuclei, the decussation of the superior cerebellar peduncle, the multiple fiber crossing at the centrum semiovale, the complex angulation of the optic radiations, the terminal arborization of the arcuate tract, and the cortical segmentation of the dorsal Broca area. From a clinical perspective, we show that HDFT provides accurate structural connectivity studies in patients with intracerebral lesions, allowing qualitative and quantitative white matter damage assessment, aiding in understanding lesional patterns of white matter structural injury, and facilitating innovative neurosurgical applications. High-grade gliomas produce significant disruption of fibers, and low-grade gliomas cause fiber displacement. Cavernomas cause both displacement and disruption of fibers. Our HDFT approach provides an accurate reconstruction of white matter fiber tracts with unprecedented detail in both the normal and pathological human brain. Further studies to validate the clinical findings are needed.

  20. Landslides, floods and sinkholes in a karst environment: the 1-6 September 2014 Gargano event, southern Italy

    NASA Astrophysics Data System (ADS)

    Martinotti, Maria Elena; Pisano, Luca; Marchesini, Ivan; Rossi, Mauro; Peruccacci, Silvia; Brunetti, Maria Teresa; Melillo, Massimo; Amoruso, Giuseppe; Loiacono, Pierluigi; Vennari, Carmela; Vessia, Giovanna; Trabace, Maria; Parise, Mario; Guzzetti, Fausto

    2017-03-01

    In karst environments, heavy rainfall is known to cause multiple geohydrological hazards, including inundations, flash floods, landslides and sinkholes. We studied a period of intense rainfall from 1 to 6 September 2014 in the Gargano Promontory, a karst area in Puglia, southern Italy. In the period, a sequence of torrential rainfall events caused severe damage and claimed two fatalities. The amount and accuracy of the geographical and temporal information varied for the different hazards. The temporal information was most accurate for the inundation caused by a major river, less accurate for flash floods caused by minor torrents and even less accurate for landslides. For sinkholes, only generic information on the period of occurrence of the failures was available. Our analysis revealed that in the promontory, rainfall-driven hazards occurred in response to extreme meteorological conditions and that the karst landscape responded to the torrential rainfall with a threshold behaviour. We exploited the rainfall and the landslide information to design the new ensemble-non-exceedance probability (E-NEP) algorithm for the quantitative evaluation of the possible occurrence of rainfall-induced landslides and of related geohydrological hazards. The ensemble of the metrics produced by the E-NEP algorithm provided better diagnostics than the single metrics often used for landslide forecasting, including rainfall duration, cumulated rainfall and rainfall intensity. We expect that the E-NEP algorithm will be useful for landslide early warning in karst areas and in other similar environments. We acknowledge that further tests are needed to evaluate the algorithm in different meteorological, geological and physiographical settings.
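
    The record does not give the internals of the E-NEP algorithm. One plausible reading, offered purely as an illustration, is that each rainfall metric is converted to an empirical non-exceedance probability against a historical sample and the per-metric values are then combined into an ensemble score; the sketch below follows that reading, and all names and the gamma-distributed histories are invented.

    ```python
    import numpy as np

    def nep(value, historical):
        """Empirical non-exceedance probability of one rainfall metric."""
        hist = np.sort(np.asarray(historical, dtype=float))
        return np.searchsorted(hist, value, side="right") / hist.size

    def ensemble_nep(event, history):
        """Average the per-metric NEPs into a single ensemble score."""
        return float(np.mean([nep(event[k], history[k]) for k in event]))

    rng = np.random.default_rng(0)
    event = {"duration_h": 36.0, "cumulated_mm": 220.0, "intensity_mm_h": 40.0}
    history = {"duration_h": rng.gamma(2.0, 12.0, 500),
               "cumulated_mm": rng.gamma(2.0, 40.0, 500),
               "intensity_mm_h": rng.gamma(2.0, 8.0, 500)}
    score = ensemble_nep(event, history)  # values near 1 flag exceptional events
    ```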

  1. Digital Imaging

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Digital Imaging is the computer-processed numerical representation of physical images. Enhancement of images results in easier interpretation. Quantitative digital image analysis by Perceptive Scientific Instruments locates objects within an image and measures them to extract quantitative information. Applications include CAT scanners, radiography and microscopy in medicine, as well as various industrial and manufacturing uses. The PSICOM 327 performs all digital image analysis functions. It is based on Jet Propulsion Laboratory technology and is accurate and cost efficient.

  2. Strategy for Extracting DNA from Clay Soil and Detecting a Specific Target Sequence via Selective Enrichment and Real-Time (Quantitative) PCR Amplification ▿

    PubMed Central

    Yankson, Kweku K.; Steck, Todd R.

    2009-01-01

    We present a simple strategy for isolating and accurately enumerating target DNA from high-clay-content soils: desorption with buffers, an optional magnetic capture hybridization step, and quantitation via real-time PCR. With the developed technique, μg quantities of DNA were extracted from mg samples of pure kaolinite and a field clay soil. PMID:19633108

  3. Quantitative analysis of naphthenic acids in water by liquid chromatography-accurate mass time-of-flight mass spectrometry.

    PubMed

    Hindle, Ralph; Noestheden, Matthew; Peru, Kerry; Headley, John

    2013-04-19

    This study details the development of a routine method for quantitative analysis of oil sands naphthenic acids, which are a complex class of compounds found naturally and as contaminants in oil sands process waters from Alberta's Athabasca region. Expanding beyond classical naphthenic acids (CnH2n-zO2), those compounds conforming to the formula CnH2n-zOx (where 2 ≤ x ≤ 4) were examined in commercial naphthenic acid and environmental water samples. HPLC facilitated a five-fold reduction in ion suppression when compared to the more commonly used flow injection analysis. A comparison of 39 model naphthenic acids revealed significant variability in response factors, demonstrating the necessity of using naphthenic acid mixtures for quantitation, rather than model compounds. It was also demonstrated that naphthenic acid heterogeneity (commercial and environmental) necessitates establishing a single NA mix as the standard against which all quantitation is performed. The authors present the first ISO17025 accredited method for the analysis of naphthenic acids in water using HPLC high resolution accurate mass time-of-flight mass spectrometry. The method detection limit was 1 mg/L total oxy-naphthenic acids (Sigma technical mix). Copyright © 2013 Elsevier B.V. All rights reserved.

  4. Quantitative inference of population response properties across eccentricity from motion-induced maps in macaque V1

    PubMed Central

    Chen, Ming; Wu, Si; Lu, Haidong D.; Roe, Anna W.

    2013-01-01

    Interpreting population responses in the primary visual cortex (V1) remains a challenge, especially with the advent of techniques measuring activations of large cortical areas simultaneously with high precision. For successful interpretation, a quantitatively precise model prediction is of great importance. In this study, we investigate how accurately a spatiotemporal filter (STF) model predicts average response profiles to coherently drifting random dot motion obtained by optical imaging of intrinsic signals in V1 of anesthetized macaques. We establish that orientation difference maps, obtained by subtracting orthogonal axes of motion, invert with increasing drift speeds, consistent with the motion streak effect. Consistent with perception, the speed at which the map inverts (the critical speed) depends on cortical eccentricity and systematically increases from foveal to parafoveal. We report that the critical speeds and response maps to drifting motion are reproduced remarkably well by the STF model. Our study thus suggests that the STF model is quantitatively accurate enough to be used as a first model of choice for interpreting responses obtained with intrinsic imaging methods in V1. We show further that this good quantitative correspondence opens the possibility of inferring population receptive field properties, otherwise not easily accessible, from responses to complex stimuli, such as drifting random dot motions. PMID:23197457

  5. Quantitative Live-Cell Confocal Imaging of 3D Spheroids in a High-Throughput Format.

    PubMed

    Leary, Elizabeth; Rhee, Claire; Wilks, Benjamin T; Morgan, Jeffrey R

    2018-06-01

    Accurately predicting the human response to new compounds is critical to a wide variety of industries. Standard screening pipelines (including both in vitro and in vivo models) often lack predictive power. Three-dimensional (3D) culture systems of human cells, a more physiologically relevant platform, could provide a high-throughput, automated means to test the efficacy and/or toxicity of novel substances. However, the challenge of obtaining high-magnification confocal z stacks of 3D spheroids and understanding their respective quantitative limitations must be overcome first. To address this challenge, we developed a method to form spheroids of reproducible size at precise spatial locations across a 96-well plate. Spheroids of variable radii were labeled with four different fluorescent dyes and imaged with a high-throughput confocal microscope. 3D renderings of the spheroid had a complex bowl-like appearance. We systematically analyzed these confocal z stacks to determine the depth of imaging and the effect of spheroid size and dyes on quantitation. Furthermore, we have shown that the loss of fluorescence with imaging depth can be addressed through the use of ratio imaging. Overall, understanding both the limitations of confocal imaging and the tools to correct for these limits is critical for developing accurate quantitative assays using 3D spheroids.
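
    Ratio imaging of the kind mentioned divides a target channel by a reference channel so that depth-dependent attenuation, which affects both channels similarly, cancels out. A minimal sketch, assuming two registered confocal z stacks held as NumPy arrays (the function name and epsilon guard are illustrative):

    ```python
    import numpy as np

    def ratio_image(target_stack, reference_stack, eps=1e-6):
        """Divide a target channel by a uniformly loaded reference channel;
        depth-dependent signal loss common to both channels cancels."""
        t = np.asarray(target_stack, dtype=float)
        r = np.asarray(reference_stack, dtype=float)
        return t / (r + eps)   # eps guards against empty voxels
    ```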

  6. Propagating Qualitative Values Through Quantitative Equations

    NASA Technical Reports Server (NTRS)

    Kulkarni, Deepak

    1992-01-01

    In most practical problems where traditional numeric simulation is not adequate, one needs to reason about a system with both qualitative and quantitative equations. In this paper, we address the problem of propagating qualitative values represented as interval values through quantitative equations. Previous research has produced exponential-time algorithms for approximate solution of the problem. These may not meet the stringent requirements of many real-time applications. This paper advances the state of the art by producing a linear-time algorithm that can propagate a qualitative value through a class of complex quantitative equations exactly, and through arbitrary algebraic expressions approximately. The algorithm was found applicable to the Space Shuttle Reaction Control System model.
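
    A qualitative value represented as an interval can be pushed through a quantitative equation with ordinary interval arithmetic, where each elementary operation costs constant time, so one pass over an expression is linear in its size. This is a minimal Python sketch of that idea, not the paper's algorithm (which also covers approximate propagation through arbitrary algebraic expressions):

    ```python
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Interval:
        lo: float
        hi: float

        def __add__(self, o):
            return Interval(self.lo + o.lo, self.hi + o.hi)

        def __sub__(self, o):
            return Interval(self.lo - o.hi, self.hi - o.lo)

        def __mul__(self, o):
            p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
            return Interval(min(p), max(p))

    # "pressure is roughly 190-210 kPa" pushed through flow = k*p - offset;
    # each operation is O(1), so a whole expression costs linear time.
    p = Interval(190.0, 210.0)
    k = Interval(0.9, 1.1)
    offset = Interval(5.0, 5.0)
    flow = k * p - offset      # Interval(lo=166.0, hi=226.0)
    ```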

  7. Quantitative measurement of vitamin K2 (menaquinones) in various fermented dairy products using a reliable high-performance liquid chromatography method.

    PubMed

    Manoury, E; Jourdon, K; Boyaval, P; Fourcassié, P

    2013-03-01

    We evaluated menaquinone contents in a large set of 62 fermented dairy product samples by using a new liquid chromatography method for accurate quantification of lipo-soluble vitamin K(2), including the distribution of individual menaquinones. The method used a simple and rapid purification step to remove matrix components in various fermented dairy products 3 times faster than a reference preparation step. Moreover, the chromatography elution time was significantly shortened and resolution and efficiency were optimized. We observed wide diversity of vitamin K(2) contents in the set of fermented dairy products, from undetectable to 1,100 ng/g of product, and a remarkable diversity of menaquinone forms among products. These observations relate to the main microorganism species currently used in the different fermented product technologies. The major form in this large set of fermented dairy products was menaquinone (MK)-9, and contents of MK-9 and MK-8 forms were correlated, that of MK-9 being around 4 times that of MK-8, suggesting that microorganisms able to produce MK-9 also produce MK-8. This was not the case for the other menaquinones, which were produced independently of each other. Finally, no obvious link was established between MK-9 content and fat content or pH of the fermented dairy products. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. Sooting turbulent jet flame: characterization and quantitative soot measurements

    NASA Astrophysics Data System (ADS)

    Köhler, M.; Geigle, K. P.; Meier, W.; Crosland, B. M.; Thomson, K. A.; Smallwood, G. J.

    2011-08-01

    Computational fluid dynamics (CFD) modelers require high-quality experimental data sets for validation of their numerical tools. Preferred features for numerical simulations of a sooting, turbulent test case flame are simplicity (no pilot flame), well-defined boundary conditions, and sufficient soot production. This paper proposes a non-premixed C2H4/air turbulent jet flame to fill this role and presents an extensive database for soot model validation. The sooting turbulent jet flame has a total visible flame length of approximately 400 mm and a fuel-jet Reynolds number of 10,000. The flame has a measured lift-off height of 26 mm which acts as a sensitive marker for CFD model validation, while this novel compiled experimental database of soot properties, temperature and velocity maps is useful for the validation of kinetic soot models and numerical flame simulations. Because the relatively simple burner design produces a flame with sufficient soot concentration while meeting modelers' needs with respect to boundary conditions and flame specifications, and because a sooting "standard flame" is presently lacking, this flame is suggested as a new reference turbulent sooting flame. The flame characterization presented here involved a variety of optical diagnostics including quantitative 2D laser-induced incandescence (2D-LII), shifted-vibrational coherent anti-Stokes Raman spectroscopy (SV-CARS), and particle image velocimetry (PIV). Producing an accurate and comprehensive characterization of a transient sooting flame was challenging and required optimization of these diagnostics. In this respect, we present the first simultaneous, instantaneous PIV and LII measurements in a heavily sooting flame environment. Simultaneous soot and flow field measurements can provide new insights into the interaction between a turbulent vortex and flame chemistry, especially since soot structures in turbulent flames are known to be small and often treated in a statistical manner.

  9. Image-based quantification of fiber alignment within electrospun tissue engineering scaffolds is related to mechanical anisotropy.

    PubMed

    Fee, Timothy; Downs, Crawford; Eberhardt, Alan; Zhou, Yong; Berry, Joel

    2016-07-01

    It is well documented that electrospun tissue engineering scaffolds can be fabricated with variable degrees of fiber alignment to produce scaffolds with anisotropic mechanical properties. Several attempts have been made to quantify the degree of fiber alignment within an electrospun scaffold using image-based methods. However, these methods are limited by the inability to produce a quantitative measure of alignment that can be used to make comparisons across publications. Therefore, we have developed a new approach to quantifying the alignment present within a scaffold from scanning electron microscopic (SEM) images. The alignment is determined by using the Sobel approximation of the image gradient to determine the distribution of gradient angles within an image. These data were fit to a von Mises distribution to find the dispersion parameter κ, which was used as a quantitative measure of fiber alignment. We fabricated four groups of electrospun polycaprolactone (PCL) + gelatin scaffolds with alignments ranging from κ = 1.9 (aligned) to κ = 0.25 (random) and tested our alignment quantification method on these scaffolds. It was found that our alignment quantification method could distinguish between scaffolds of different alignments more accurately than two other published methods. Additionally, the alignment parameter κ was found to be a good predictor of the mechanical anisotropy of our electrospun scaffolds. The ability to quantify fiber alignment within a scaffold and make direct comparisons of scaffold fiber alignment across publications can reduce ambiguity between published results where cells are cultured on "highly aligned" fibrous scaffolds. This could have important implications for characterizing mechanics and cellular behavior on aligned tissue engineering scaffolds. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 1680-1686, 2016.
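
    A minimal sketch of the alignment measure as described: Sobel image gradients yield a distribution of gradient angles, to which a von Mises distribution is fit to obtain the dispersion parameter κ. The magnitude threshold and the angle-doubling step (treating fiber orientation as axial data with period π) are assumptions of this sketch, not details given in the record.

    ```python
    import numpy as np
    from scipy import ndimage, stats

    def alignment_kappa(image, mag_quantile=0.75):
        """Sobel gradients -> gradient-angle distribution -> von Mises fit;
        the dispersion parameter kappa quantifies fiber alignment."""
        img = np.asarray(image, dtype=float)
        gx = ndimage.sobel(img, axis=1)
        gy = ndimage.sobel(img, axis=0)
        mag = np.hypot(gx, gy)
        keep = mag > np.quantile(mag, mag_quantile)  # drop weak gradients
        theta = np.arctan2(gy[keep], gx[keep])
        doubled = np.angle(np.exp(2j * theta))       # axial data: period pi
        kappa, _, _ = stats.vonmises.fit(doubled, fscale=1)
        return kappa
    ```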

  10. Metabolic Profiling and Identification of Shikonins in Root Periderm of Two Invasive Echium spp. Weeds in Australia.

    PubMed

    Skoneczny, Dominik; Weston, Paul A; Zhu, Xiaocheng; Gurr, Geoff M; Callaway, Ragan M; Barrow, Russel A; Weston, Leslie A

    2017-02-21

    Metabolic profiling can be successfully implemented to analyse a living system's response to environmental conditions by providing critical information on an organism's physiological state at a particular point in time and allowing for both quantitative and qualitative assessment of a specific subset(s) of key metabolites. Shikonins are highly reactive chemicals that affect various cell signalling pathways and possess antifungal, antibacterial and allelopathic activity. Based on previous bioassay results, bioactive shikonins are likely to play important roles in the regulation of rhizosphere interactions with neighbouring plants, microbes and herbivores. An effective platform allowing for rapid identification and accurate profiling of numerous structurally similar, difficult-to-separate bioactive isohexenylnaphthazarins (shikonins) was developed using UHPLC Q-TOF MS. Root periderm tissues of the invasive Australian weeds Echium plantagineum and its congener E. vulgare were extracted overnight in ethanol for shikonin profiling. Shikonin production was evaluated at seedling, rosette and flowering stages. Five populations of each species were compared for qualitative and quantitative differences in shikonin formation. Each species showed little populational variation in qualitative shikonin production; however, shikonin content was considerably lower in one population of E. plantagineum from western New South Wales. Seedlings of all populations produced the bioactive metabolite acetylshikonin, and production was upregulated over time. Mature plants of both species produced significantly higher total levels of shikonins, with isovalerylshikonin > dimethylacrylshikonin > shikonin > acetylshikonin in mature E. plantagineum. Although qualitative metabolic profiles in both Echium spp. were nearly identical, shikonin abundance in mature plant periderm was approximately 2.5 times higher in perennial E. vulgare extracts in comparison to those of the annual E. plantagineum. These findings contribute to our understanding of the biosynthesis of shikonins in roots of two related invasive plants and their expression in relation to plant phenological stage.

  11. A simplified and accurate detection of the genetically modified wheat MON71800 with one calibrator plasmid.

    PubMed

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2015-06-01

    With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 using a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study supplies a powerful, simple and accurate detection strategy for unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Digital PCR Quantitation of Muscle Mitochondrial DNA: Age, Fiber Type, and Mutation-Induced Changes.

    PubMed

    Herbst, Allen; Widjaja, Kevin; Nguy, Beatrice; Lushaj, Entela B; Moore, Timothy M; Hevener, Andrea L; McKenzie, Debbie; Aiken, Judd M; Wanagat, Jonathan

    2017-10-01

    Definitive quantitation of mitochondrial DNA (mtDNA) and mtDNA deletion mutation abundances would help clarify the role of mtDNA instability in aging. To more accurately quantify mtDNA, we applied the emerging technique of digital polymerase chain reaction to individual muscle fibers and muscle homogenates from aged rodents. Individual fiber mtDNA content correlated with fiber type and decreased with age. We adapted a digital polymerase chain reaction deletion assay that was accurate in mixing experiments to a mutation frequency of 0.03% and quantitated an age-induced increase in deletion frequency from rat muscle homogenates. Importantly, the deletion frequency measured in muscle homogenates strongly correlated with electron transport chain-deficient fiber abundance determined by histochemical analyses. These data clarify the temporal accumulation of mtDNA deletions that lead to electron chain-deficient fibers, a process culminating in muscle fiber loss. © The Author 2017. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
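
    Digital PCR quantitation rests on Poisson statistics: if a fraction p of partitions is positive, the mean number of target copies per partition is lambda = -ln(1 - p). Below is a sketch of a deletion-frequency calculation built on that relation; the droplet counts are invented for illustration, and the authors' assay details differ.

    ```python
    import math

    def copies_per_partition(n_positive, n_total):
        """Poisson correction used in digital PCR: mean copies per
        partition inferred from the fraction of positive partitions."""
        return -math.log(1.0 - n_positive / n_total)

    def deletion_frequency(pos_deletion, pos_total_mtdna, n_partitions):
        """Deletion frequency as the ratio of deletion-bearing to total
        mtDNA copies, each estimated from its own digital PCR assay."""
        lam_del = copies_per_partition(pos_deletion, n_partitions)
        lam_tot = copies_per_partition(pos_total_mtdna, n_partitions)
        return lam_del / lam_tot

    # e.g. 12 deletion-positive and 14,000 mtDNA-positive droplets of 20,000
    f = deletion_frequency(12, 14000, 20000)   # ~0.05% deletion frequency
    ```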

  13. Quantitative characterization of surface topography using spectral analysis

    NASA Astrophysics Data System (ADS)

    Jacobs, Tevis D. B.; Junge, Till; Pastewka, Lars

    2017-03-01

    Roughness determines many functional properties of surfaces, such as adhesion, friction, and (thermal and electrical) contact conductance. Recent analytical models and simulations enable quantitative prediction of these properties from knowledge of the power spectral density (PSD) of the surface topography. The utility of the PSD is that it contains statistical information that is unbiased by the particular scan size and pixel resolution chosen by the researcher. In this article, we first review the mathematical definition of the PSD, including the one- and two-dimensional cases, and common variations of each. We then discuss strategies for reconstructing an accurate PSD of a surface using topography measurements at different size scales. Finally, we discuss detecting and mitigating artifacts at the smallest scales, and computing upper/lower bounds on functional properties obtained from models. We accompany our discussion with virtual measurements on computer-generated surfaces. This discussion summarizes how to analyze topography measurements to reconstruct a reliable PSD. Analytical models demonstrate the potential for tuning functional properties by rationally tailoring surface topography—however, this potential can only be achieved through the accurate, quantitative reconstruction of the PSDs of real-world surfaces.
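
    As a concrete illustration of the machinery reviewed here, the one-dimensional PSD of a line scan can be computed with an FFT. Normalization conventions differ between papers; the sketch below follows one common choice (continuous-transform convention, normalized by scan length) and is not tied to any particular instrument.

    ```python
    import numpy as np

    def psd_1d(heights, dx):
        """One-sided 1-D PSD of a line-scan topography.

        heights : surface heights [m], uniformly sampled every dx [m].
        Returns wavevectors q [1/m] and the PSD C(q) [m^3]."""
        h = np.asarray(heights, dtype=float)
        h = h - h.mean()                     # remove the mean height
        n = h.size
        H = np.fft.rfft(h) * dx              # approximate continuous FT
        q = 2.0 * np.pi * np.fft.rfftfreq(n, d=dx)
        C = np.abs(H) ** 2 / (n * dx)        # normalize by scan length
        return q, C

    # synthetic rough line scan sampled at 1 nm
    rng = np.random.default_rng(1)
    h = np.cumsum(rng.normal(size=4096)) * 1e-10
    q, C = psd_1d(h, dx=1e-9)
    ```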

  14. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral count, protein sequence length, shared peptides, and ion intensity. It adopts spectral count for quantitative analysis and builds a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with better dynamic range.
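
    freeQuant's own algorithm combines spectral counts, protein length, shared-peptide handling, and ion intensity; the record does not give its formulas. The core idea of length-scaled spectral counting is captured by the classic NSAF measure, sketched below as an illustration rather than as freeQuant's implementation.

    ```python
    def nsaf(spectral_counts, lengths):
        """Normalized spectral abundance factor: spectral count divided by
        protein length, renormalized so abundances sum to 1 in a run."""
        saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
        total = sum(saf.values())
        return {p: v / total for p, v in saf.items()}

    counts = {"P1": 120, "P2": 45, "P3": 9}      # invented spectral counts
    lengths = {"P1": 600, "P2": 150, "P3": 300}  # residues per protein
    abundances = nsaf(counts, lengths)
    ```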

  15. Analysis of ribosomal RNA stability in dead cells of wine yeast by quantitative PCR.

    PubMed

    Sunyer-Figueres, Merce; Wang, Chunxiao; Mas, Albert

    2018-04-02

    During wine production, some yeasts enter a Viable But Not Culturable (VBNC) state, which may influence the quality and stability of the final wine through remnant metabolic activity or by resuscitation. Culture-independent techniques are used for obtaining an accurate estimation of the number of live cells, and quantitative PCR could be the most accurate technique. As a marker of cell viability, rRNA was evaluated by analyzing its stability in dead cells. The species-specific stability of rRNA was tested in Saccharomyces cerevisiae, as well as in three species of non-Saccharomyces yeast (Hanseniaspora uvarum, Torulaspora delbrueckii and Starmerella bacillaris). High temperature and antimicrobial dimethyl dicarbonate (DMDC) treatments were efficient in lysing the yeast cells. rRNA gene and rRNA (as cDNA) were analyzed over 48 h after cell lysis by quantitative PCR. The results confirmed the stability of rRNA for 48 h after the cell lysis treatments. To sum up, rRNA may not be a good marker of cell viability in the wine yeasts that were tested. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. Analysis of artifacts suggests DGGE should not be used for quantitative diversity analysis.

    PubMed

    Neilson, Julia W; Jordan, Fiona L; Maier, Raina M

    2013-03-01

    PCR-denaturing gradient gel electrophoresis (PCR-DGGE) is widely used in microbial ecology for the analysis of comparative community structure. However, artifacts generated during PCR-DGGE of mixed template communities impede the application of this technique to quantitative analysis of community diversity. The objective of the current study was to employ an artificial bacterial community to document and analyze artifacts associated with multiband signatures and preferential template amplification and to highlight their impacts on the use of this technique for quantitative diversity analysis. Six bacterial species (three Betaproteobacteria, two Alphaproteobacteria, and one Firmicutes) were amplified individually and in combinations with primers targeting the V7/V8 region of the 16S rRNA gene. Two of the six isolates produced multiband profiles, demonstrating that band number does not correlate directly with α-diversity. Analysis of the multiple bands from one of these isolates confirmed that both bands had identical sequences, which led to the hypothesis that the multiband pattern resulted from two distinct structural conformations of the same amplicon. In addition, consistent preferential amplification was demonstrated following pairwise amplifications of the six isolates. DGGE and real-time PCR analysis identified primer mismatch and PCR inhibition due to 16S rDNA secondary structure as the most probable causes of preferential amplification patterns. Reproducible DGGE community profiles generated in this study confirm that PCR-DGGE provides an excellent high-throughput tool for comparative community structure analysis, but that method-specific artifacts preclude its use for accurate comparative diversity analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  17. Comparison of Quantitative PCR and Droplet Digital PCR Multiplex Assays for Two Genera of Bloom-Forming Cyanobacteria, Cylindrospermopsis and Microcystis.

    PubMed

    Te, Shu Harn; Chen, Enid Yingru; Gin, Karina Yew-Hoong

    2015-08-01

    The increasing occurrence of harmful cyanobacterial blooms, often linked to deteriorated water quality and adverse public health effects, has become a worldwide concern in recent decades. The use of molecular techniques such as real-time quantitative PCR (qPCR) has become increasingly popular in the detection and monitoring of harmful cyanobacterial species. Multiplex qPCR assays that quantify several toxigenic cyanobacterial species have been established previously; however, there is no molecular assay that detects several bloom-forming species simultaneously. Microcystis and Cylindrospermopsis are the two most commonly found genera and are known to be able to produce microcystin and cylindrospermopsin hepatotoxins. In this study, we designed primers and probes which enable quantification of these genera based on the RNA polymerase C1 gene for Cylindrospermopsis species and the c-phycocyanin beta subunit-like gene for Microcystis species. Duplex assays were developed for two molecular techniques-qPCR and droplet digital PCR (ddPCR). After optimization, both qPCR and ddPCR assays have high linearity and quantitative correlations for standards. Comparisons of the two techniques showed that qPCR has higher sensitivity, a wider linear dynamic range, and shorter analysis time and that it was more cost-effective, making it a suitable method for initial screening. However, the ddPCR approach has lower variability and was able to handle the PCR inhibition and competitive effects found in duplex assays, thus providing more precise and accurate analysis for bloom samples. Copyright © 2015, American Society for Microbiology. All Rights Reserved.
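
    On the qPCR side of such comparisons, absolute quantification typically runs through a standard curve: Cq is linear in log10 of the input copy number, and the slope gives the amplification efficiency. A minimal sketch with an invented dilution series (not data from this study):

    ```python
    import numpy as np

    def standard_curve(log10_copies, cq):
        """Fit Cq = m*log10(copies) + b; efficiency = 10**(-1/m) - 1."""
        m, b = np.polyfit(log10_copies, cq, 1)
        efficiency = 10.0 ** (-1.0 / m) - 1.0   # 1.0 means 100% efficient
        return m, b, efficiency

    def quantify(cq_sample, m, b):
        """Copy number of an unknown from its Cq via the standard curve."""
        return 10.0 ** ((cq_sample - b) / m)

    # hypothetical 10-fold dilution series
    logs = np.array([6.0, 5.0, 4.0, 3.0, 2.0])
    cqs = np.array([17.1, 20.5, 23.9, 27.2, 30.6])
    m, b, eff = standard_curve(logs, cqs)       # eff close to 0.98 here
    copies = quantify(25.0, m, b)
    ```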

  18. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329

  19. Towards in vivo focal cortical dysplasia phenotyping using quantitative MRI.

    PubMed

    Adler, Sophie; Lorio, Sara; Jacques, Thomas S; Benova, Barbora; Gunny, Roxana; Cross, J Helen; Baldeweg, Torsten; Carmichael, David W

    2017-01-01

    Focal cortical dysplasias (FCDs) are a range of malformations of cortical development, each with specific histopathological features. Conventional radiological assessment of standard structural MRI is useful for the localization of lesions but is unable to accurately predict the histopathological features. Quantitative MRI offers the possibility to probe tissue biophysical properties in vivo and may bridge the gap between radiological assessment and ex-vivo histology. This review will cover histological, genetic and radiological features of FCD following the ILAE classification and will explain how quantitative voxel- and surface-based techniques can characterise these features. We will provide an overview of the quantitative MRI measures available, their link with biophysical properties and finally the potential application of quantitative MRI to the problem of FCD subtyping. Future research linking quantitative MRI to FCD histological properties should improve clinical protocols, allow better characterisation of lesions in vivo and enable surgical planning tailored to the individual.

  20. Differential plating medium for quantitative detection of histamine-producing bacteria.

    PubMed Central

    Niven, C F; Jeffrey, M B; Corlett, D A

    1981-01-01

    A histidine-containing agar medium has been devised for quantitative detection of histamine-producing bacteria that are alleged to be associated with scombroid fish poisoning outbreaks. The responsible bacteria produce a marked pH change in the agar, with attendant color change of pH indicator adjacent to the colonies, thus facilitating their recognition. Proteus morganii and Klebsiella pneumoniae were the two most common histidine-decarboxylating species isolated from scombroid fish and mahi mahi. PMID:7013698

  1. A Critical Appraisal of Techniques, Software Packages, and Standards for Quantitative Proteomic Analysis

    PubMed Central

    Lawless, Craig; Hubbard, Simon J.; Fan, Jun; Bessant, Conrad; Hermjakob, Henning; Jones, Andrew R.

    2012-01-01

    New methods for performing quantitative proteome analyses based on differential labeling protocols or label-free techniques are reported in the literature on an almost monthly basis. In parallel, a correspondingly vast number of software tools for the analysis of quantitative proteomics data has also been described in the literature and produced by private companies. In this article we focus on the review of some of the most popular techniques in the field and present a critical appraisal of several software packages available to process and analyze the data produced. We also describe the importance of community standards to support the wide range of software, which may assist researchers in the analysis of data using different platforms and protocols. It is intended that this review will serve bench scientists both as a useful reference and a guide to the selection and use of different pipelines to perform quantitative proteomics data analysis. We have produced a web-based tool (http://www.proteosuite.org/?q=other_resources) to help researchers find appropriate software for their local instrumentation, available file formats, and quantitative methodology. PMID:22804616

  2. Electron Probe Microanalysis | Materials Science | NREL

    Science.gov Websites

    surveys of the area of interest before performing a more accurate quantitative analysis with WDS. WDS - Four spectrometers with ten diffracting crystals. The use of a single-channel analyzer allows much

  3. Toward Accurate and Quantitative Comparative Metagenomics

    PubMed Central

    Nayfach, Stephen; Pollard, Katherine S.

    2016-01-01

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341

  4. Development of an UPLC-MS/MS micromethod for quantitation of cinitapride in plasma and its application in a pharmacokinetic interaction trial.

    PubMed

    Marcelín-Jiménez, Gabriel; Contreras, Leticia; Esquivel, Javier; Ávila, Óscar; Batista, Dany; Ángeles, Alionka P; García-González, Alberto

    2017-03-01

    Cinitapride (CIN) is a benzamide-derived molecule used for the treatment of gastroesophageal reflux and dyspepsia. Its pharmacokinetics are controversial due to the use of supratherapeutic doses and the lack of sensitive methodology. Therefore, a sensitive and accurate micromethod was developed for its quantitation in human plasma. CIN was extracted from 300 µl of heparinized plasma by liquid-liquid extraction using cisapride as internal standard, and analyzed with an ultra performance liquid chromatograph employing positive multiple-reaction monitoring-MS. The method proved to be rapid, accurate and stable within a range between 50 and 2000 pg/ml and was successfully validated and applied in a pharmacokinetic interaction trial, where it was demonstrated that oral co-administration of simethicone does not modify the bioavailability of CIN.

  5. Toward Accurate and Quantitative Comparative Metagenomics.

    PubMed

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE PAGES

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng; ...

    2017-10-04

    Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as low as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.
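
    The on-time fraction at the heart of fliFISH is the fraction of exposures in which a candidate spot is above an intensity threshold; spots are accepted as true RNA copies only when that fraction falls in the window expected for specifically bound probes. A sketch under those assumptions; the threshold and acceptance window below are placeholders, not published values.

    ```python
    import numpy as np

    def on_time_fraction(trace, threshold):
        """Fraction of exposures in which a blinking spot is 'on'."""
        trace = np.asarray(trace, dtype=float)
        return float((trace > threshold).mean())

    def is_true_rna(trace, threshold, lo=0.2, hi=0.6):
        """Accept a spot only if its on-time fraction falls inside the
        window expected for specific binding (bounds are placeholders)."""
        f = on_time_fraction(trace, threshold)
        return lo <= f <= hi
    ```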

  7. Fluctuation localization imaging-based fluorescence in situ hybridization (fliFISH) for accurate detection and counting of RNA copies in single cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cui, Yi; Hu, Dehong; Markillie, Lye Meng

    Here, quantitative gene expression analysis in intact single cells can be achieved using single molecule-based fluorescence in situ hybridization (smFISH). This approach relies on fluorescence intensity to distinguish between true signals, emitted from an RNA copy hybridized with multiple oligonucleotide probes, and background noise. Thus, the precision in smFISH is often compromised by partial or nonspecific probe binding and tissue autofluorescence, especially when only a small number of probes can be fitted to the target transcript. Here we provide an accurate approach for setting quantitative thresholds between true and false signals, which relies on on-off duty cycles of photoswitchable dyes. This fluctuation localization imaging-based FISH (fliFISH) uses on-time fractions (measured over a series of exposures) collected from transcripts bound to as low as 8 probes, which are distinct from on-time fractions collected from nonspecifically bound probes or autofluorescence. Using multicolor fliFISH, we identified radial gene expression patterns in mouse pancreatic islets for insulin, the transcription factor NKX2-2, and their ratio (Nkx2-2/Ins2). These radial patterns, showing higher values in β cells at the islet core and lower values in peripheral cells, were lost in diabetic mouse islets. In summary, fliFISH provides an accurate, quantitative approach for detecting and counting true RNA copies and rejecting false signals by their distinct on-time fractions, laying the foundation for reliable single-cell transcriptomics.

  8. SU-D-18C-05: Variable Bolus Arterial Spin Labeling MRI for Accurate Cerebral Blood Flow and Arterial Transit Time Mapping

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, M; Jung, Y

    2014-06-01

    Purpose: Arterial spin labeling (ASL) is an MRI perfusion imaging method from which quantitative cerebral blood flow (CBF) maps can be calculated. Acquisition with variable post-labeling delays (PLD) and variable TRs allows for arterial transit time (ATT) mapping and leads to more accurate CBF quantification with a scan time saving of 48%. In addition, T1 and M0 maps can be obtained without a separate scan. To accurately estimate ATT and the T1 of brain tissue from the ASL data, variable labeling durations were introduced, termed variable-bolus ASL. Methods: All images were collected on a healthy subject with a 3T Siemens Skyra scanner. Variable-bolus pseudo-continuous ASL (PCASL) images were collected with 7 TI times ranging 100-4300 ms in increments of 700 ms, with TR ranging 1000-5200 ms. All boluses were 1600 ms when the TI allowed; otherwise the bolus duration was 100 ms shorter than the TI. All TI times were interleaved to reduce sensitivity to motion. Voxel-wise T1 and M0 maps were estimated using a linear least squares fitting routine from the average signal from each TI time. Then pairwise subtraction of each label/control pair and averaging for each TI time was performed. CBF and ATT maps were created using the standard model by Buxton et al. with a nonlinear fitting routine using the T1 tissue map. Results: CBF maps insensitive to ATT were produced along with ATT maps. Both maps show patterns and averages consistent with the literature. The T1 map also shows typical T1 contrast. Conclusion: It has been demonstrated that variable-bolus ASL produces CBF maps free from the errors due to ATT and tissue T1 variations and provides M0, T1, and ATT maps which have potential utility. This is accomplished with a single scan in a feasible scan time (under 6 minutes) with low sensitivity to motion.

  9. Whole Blood Activation Results in Enhanced Detection of T Cell and Monocyte Cytokine Production by Flow Cytometry

    NASA Technical Reports Server (NTRS)

    Sams, Clarence F.; Crucian, Brian E.

    2001-01-01

    An excellent monitor of the immune balance of peripheral circulating cells is to determine their cytokine production patterns in response to stimuli. Using flow cytometry, a positive identification of cytokine producing cells in a mixed culture may be achieved. Recently, the ability to assess cytokine production following a whole-blood activation culture has been described. We compared whole blood culture to standard PBMC culture and determined the individual cytokine secretion patterns for both T cells and monocytes via flow cytometry. For T cell cytokine assessment following PMA + ionomycin activation: (1) significantly greater percentages of T cells producing IFNgamma and IL-2 were observed following whole-blood culture; (2) altered T cell cytokine production kinetics were observed by varying whole blood culture times. In addition, a four-color cytometric analysis was used to allow accurate phenotyping and quantitation of cytokine producing lymphocyte populations. Using this technique we found IFNgamma production to be significantly elevated in the CD3+/CD8+ T cell population as compared to the CD3+/CD8- population following five hours of whole blood activation. Conversely, IL-2 and IL-10 production were significantly elevated in the CD3+/CD8- T cell population as compared to the CD3+/CD8+ population. Monocyte cytokine production was assessed in both culture systems following LPS activation for 24 hours. A three-color flow cytometric assay was used to assess two cytokines in conjunction with CD14. The cytokine pairs used for analysis were IL-1a/IL-12 and IL-10/TNFa. Nearly all monocytes were stimulated to produce IL-1a, IL-12 and TNFa equally well in both culture systems. Monocyte production of IL-10 was significantly elevated following whole blood culture as compared to PBMC culture. IL-12 producing monocytes appeared to be a distinct subpopulation of the IL-1a producing set, whereas IL-10 and TNFa producing monocytes were largely mutually exclusive. IL-10 and TNFa producing monocytes may represent functionally distinct monocyte subsets. Whole blood culture eliminates the need to purify cell populations prior to culture and may have significant utility for the routine monitoring of the cytokine balances of the peripheral blood T cell and monocyte populations. In addition, there are distinct advantages to performing whole-blood (WB) activation as compared to PBMC activation. These advantages would include retaining the various cell-cell interactions as well as any soluble factors present in serum that influence cell activation. It is likely that the altered cytokine production observed following whole blood culture more accurately represents the in-vivo immune balance.

  10. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gritsenko, Marina A.; Xu, Zhe; Liu, Tao

    Comprehensive, quantitative information on abundances of proteins and their post-translational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labelling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples, and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  11. Large-Scale and Deep Quantitative Proteome Profiling Using Isobaric Labeling Coupled with Two-Dimensional LC-MS/MS.

    PubMed

    Gritsenko, Marina A; Xu, Zhe; Liu, Tao; Smith, Richard D

    2016-01-01

    Comprehensive, quantitative information on abundances of proteins and their posttranslational modifications (PTMs) can potentially provide novel biological insights into disease pathogenesis and therapeutic intervention. Herein, we introduce a quantitative strategy utilizing isobaric stable isotope-labeling techniques combined with two-dimensional liquid chromatography-tandem mass spectrometry (2D-LC-MS/MS) for large-scale, deep quantitative proteome profiling of biological samples or clinical specimens such as tumor tissues. The workflow includes isobaric labeling of tryptic peptides for multiplexed and accurate quantitative analysis, basic reversed-phase LC fractionation and concatenation for reduced sample complexity, and nano-LC coupled to high resolution and high mass accuracy MS analysis for high confidence identification and quantification of proteins. This proteomic analysis strategy has been successfully applied for in-depth quantitative proteomic analysis of tumor samples and can also be used for integrated proteome and PTM characterization, as well as comprehensive quantitative proteomic analysis across samples from large clinical cohorts.

  12. Quantitation of circulating tumor cells in blood samples from ovarian and prostate cancer patients using tumor-specific fluorescent ligands.

    PubMed

    He, Wei; Kularatne, Sumith A; Kalli, Kimberly R; Prendergast, Franklyn G; Amato, Robert J; Klee, George G; Hartmann, Lynn C; Low, Philip S

    2008-10-15

    Quantitation of circulating tumor cells (CTCs) can provide information on the stage of a malignancy, onset of disease progression and response to therapy. In an effort to more accurately quantitate CTCs, we have synthesized fluorescent conjugates of 2 high-affinity tumor-specific ligands (folate-AlexaFluor 488 and DUPA-FITC) that bind tumor cells >20-fold more efficiently than fluorescent antibodies. Here we determine whether these tumor-specific dyes can be exploited for quantitation of CTCs in peripheral blood samples from cancer patients. A CTC-enriched fraction was isolated from the peripheral blood of ovarian and prostate cancer patients by an optimized density gradient centrifugation protocol and labeled with the aforementioned fluorescent ligands. CTCs were then quantitated by flow cytometry. CTCs were detected in 18 of 20 ovarian cancer patients (mean 222 CTCs/ml; median 15 CTCs/ml; maximum 3,118 CTCs/ml), whereas CTC numbers in 16 gender-matched normal volunteers were negligible (mean 0.4 CTCs/ml; median 0.3 CTCs/ml; maximum 1.5 CTCs/ml; p < 0.001, chi-square). CTCs were also detected in 10 of 13 prostate cancer patients (mean 26 CTCs/ml, median 14 CTCs/ml, maximum 94 CTCs/ml) but not in 18 gender-matched healthy donors (mean 0.8 CTCs/ml, median 1, maximum 3 CTCs/ml; p < 0.0026, chi-square). Tumor-specific fluorescent antibodies were much less efficient in quantitating CTCs because of their lower CTC labeling efficiency. Use of tumor-specific fluorescent ligands to label CTCs in peripheral blood can provide a simple, accurate and sensitive method for determining the number of cancer cells circulating in the bloodstream.

  13. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pair) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis programs (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their ability to assign quantitative phylogenetic information to the data, i.e., the frequency of appearance of microorganisms of the same taxon in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some programs assign higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of their complexity, it is usually difficult to judge the resistance of a metagenomic program to this genome length bias. Therefore, we have made a simple benchmark for evaluating the "taxon counting" of a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a program fails on this simple task, it will surely fail on most real metagenomes. We applied the three programs to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We found that AMPHORA2/AmphoraNet gave the most accurate results and the other two programs underperformed: although they assigned each short read to its respective taxon quite reliably, they produced the typical genome length bias. The benchmark dataset is available at http://pitgroup.org/static/3RandomGenome-100kavg150bps.fna.
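
    The construction of such a benchmark lends itself to a few lines of code. The sketch below follows the recipe in the abstract (equal genome copies, random ~150 bp reads, shuffling); the toy sequences, read-count rule, and parameters are illustrative assumptions rather than the study's exact procedure.

```python
# A minimal sketch, in the spirit of the benchmark described above: equal
# copies of three genomes of different lengths, sheared into ~150 bp reads
# and shuffled. The toy sequences are stand-ins for real FASTA genomes.
import random

def simulate_reads(genome, n_reads, mean_len=150, jitter=30):
    """Draw n_reads substrings of roughly mean_len bp at random positions."""
    reads = []
    for _ in range(n_reads):
        length = max(50, mean_len + random.randint(-jitter, jitter))
        start = random.randint(0, max(0, len(genome) - length))
        reads.append(genome[start:start + length])
    return reads

genomes = {"taxonA": "ACGT" * 50000,    # 200 kb
           "taxonB": "GGCA" * 25000,    # 100 kb
           "taxonC": "TTAC" * 12500}    #  50 kb

copies = 10
benchmark = []
for name, seq in genomes.items():
    # Shearing the same number of genome copies yields read counts
    # proportional to genome length -- the very source of the bias.
    benchmark.extend(simulate_reads(seq, copies * len(seq) // 150))
random.shuffle(benchmark)
```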

  14. Multi-scale Modeling of Plasticity in Tantalum.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Hojun; Battaile, Corbett Chandler.; Carroll, Jay

    In this report, we present a multi-scale computational model of plastic deformation in tantalum, together with validating experiments. At the atomistic/dislocation level, dislocation kink-pair theory is used to formulate temperature- and strain rate-dependent constitutive equations. The kink-pair theory is calibrated to available data from single-crystal experiments to produce accurate and convenient constitutive laws. The model is then implemented into a BCC crystal plasticity finite element method (CP-FEM) model to predict temperature- and strain rate-dependent yield stresses of single and polycrystalline tantalum, which are compared with existing experimental data from the literature. Furthermore, classical continuum constitutive models describing temperature- and strain rate-dependent flow behavior are fit to the yield stresses obtained from the CP-FEM polycrystal predictions. The model is then used to conduct hydrodynamic simulations of the Taylor cylinder impact test and compared with experiments. In order to validate the proposed tantalum CP-FEM model against experiments, we introduce a method for quantitative comparison of CP-FEM models with various experimental techniques. To mitigate the effects of unknown subsurface microstructure, tantalum tensile specimens with a pseudo-two-dimensional grain structure and grain sizes on the order of millimeters are used. A technique combining electron backscatter diffraction (EBSD) and high-resolution digital image correlation (HR-DIC) is used to measure the texture and sub-grain strain fields upon uniaxial tensile loading at various applied strains. Deformed specimens are also analyzed with optical profilometry measurements to obtain out-of-plane strain fields. These high-resolution measurements are directly compared with large-scale CP-FEM predictions. This computational method directly links fundamental dislocation physics to plastic deformation at the grain scale and to engineering-scale applications. Furthermore, direct and quantitative comparisons between experimental measurements and simulations show that the proposed model accurately captures plasticity in the deformation of polycrystalline tantalum.
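
    The report's constitutive equations are not reproduced in the abstract. For orientation, a thermally activated, kink-pair-type flow rule of the generic Kocks form commonly used in BCC crystal plasticity is sketched below; the exponents and activation parameters are placeholders, not the report's calibrated values.

```latex
\dot{\gamma}^{\alpha} = \dot{\gamma}_{0}
\exp\!\left[-\frac{\Delta H_{0}}{k_{B}T}
\left(1-\left(\frac{\tau^{\alpha}_{\mathrm{eff}}}{\tau_{0}}\right)^{p}\right)^{q}\right],
\qquad 0 < p \le 1, \quad 1 \le q \le 2
```

    Here \(\dot{\gamma}^{\alpha}\) is the slip rate on system \(\alpha\), \(\Delta H_{0}\) the kink-pair activation enthalpy, \(\tau^{\alpha}_{\mathrm{eff}}\) the effective resolved shear stress, and \(\tau_{0}\) the stress at which the glide barrier vanishes; temperature and strain-rate sensitivity enter through \(k_{B}T\) and \(\dot{\gamma}_{0}\).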

  15. Learning accurate and concise naïve Bayes classifiers from attribute value taxonomies and data

    PubMed Central

    Kang, D.-K.; Silvescu, A.; Honavar, V.

    2009-01-01

    In many application domains, there is a need for learning algorithms that can effectively exploit attribute value taxonomies (AVT)—hierarchical groupings of attribute values—to learn compact, comprehensible and accurate classifiers from data—including data that are partially specified. This paper describes AVT-NBL, a natural generalization of the naïve Bayes learner (NBL), for learning classifiers from AVT and data. Our experimental results show that AVT-NBL is able to generate classifiers that are substantially more compact and more accurate than those produced by NBL on a broad range of data sets with different percentages of partially specified values. We also show that AVT-NBL is more efficient in its use of training data: AVT-NBL produces classifiers that outperform those produced by NBL using substantially fewer training examples. PMID:20351793

  16. Accurate quantitation of D+ fetomaternal hemorrhage by flow cytometry using a novel reagent to eliminate granulocytes from analysis.

    PubMed

    Kumpel, Belinda; Hazell, Matthew; Guest, Alan; Dixey, Jonathan; Mushens, Rosey; Bishop, Debbie; Wreford-Bush, Tim; Lee, Edmond

    2014-05-01

    Quantitation of fetomaternal hemorrhage (FMH) is performed to determine the dose of prophylactic anti-D (RhIG) required to prevent D immunization of D- women. Flow cytometry (FC) is the most accurate method. However, maternal white blood cells (WBCs) can give high background by binding anti-D nonspecifically, compromising accuracy. Maternal blood samples (69) were sent for FC quantitation of FMH after positive Kleihauer-Betke test (KBT) analysis and RhIG administration. Reagents used were BRAD-3-fluorescein isothiocyanate (FITC; anti-D), AEVZ5.3-FITC (anti-varicella zoster [anti-VZ], negative control), anti-fetal hemoglobin (HbF)-FITC, and the blended two-color reagents BRAD-3-FITC/anti-CD45-phycoerythrin (PE; anti-D/L) and BRAD-3-FITC/anti-CD66b-PE (anti-D/G). PE-positive WBCs were eliminated from analysis by gating. Full blood counts were performed on maternal samples and female donors. Elevated numbers of neutrophils were present in 80% of patients. Red blood cell (RBC) indices varied widely in maternal blood. D+ FMH values obtained with anti-D/L, anti-D/G, and anti-HbF-FITC were very similar (r = 0.99, p < 0.001). Correlation between KBT and anti-HbF-FITC FMH results was low (r = 0.716). Inaccurate FMH quantitation using the current method (anti-D minus anti-VZ) occurred in 71% of samples having less than 15 mL of D+ FMH (RBCs), and insufficient RhIG was calculated for 9%. Using the two-color reagents and anti-HbF-FITC, approximately 30% of patients had elevated F cells, 26% had no fetal cells, 6% had D- FMH, 26% had 4 to 15 mL of D+ FMH, and 12% had more than 15 mL of D+ FMH (RBCs), requiring more than 300 μg of RhIG. Without accurate quantitation of D+ FMH by FC, some women would receive inappropriate or inadequate anti-D prophylaxis. The latter may be at risk of immunization leading to hemolytic disease of the newborn. © 2013 American Association of Blood Banks.

  17. Fast history matching of time-lapse seismic and production data for high resolution models

    NASA Astrophysics Data System (ADS)

    Jimenez Arismendi, Eduardo Antonio

    Integrated reservoir modeling has become an important part of day-to-day decision analysis in oil and gas management practices. A very attractive and promising technology is the use of time-lapse or 4D seismic as an essential component in subsurface modeling. Today, 4D seismic is enabling oil companies to optimize production and increase recovery through monitoring fluid movements throughout the reservoir. 4D seismic advances are also being driven by an increased need by the petroleum engineering community to become more quantitative and accurate in its ability to monitor reservoir processes. Qualitative interpretations of time-lapse anomalies are being replaced by quantitative inversions of 4D seismic data to produce accurate maps of fluid saturations, pore pressure, and temperature, among other properties. Within all the steps involved in this subsurface modeling process, the most demanding one is integrating the geologic model with dynamic field data, including 4D seismic when available. The validation of the geologic model with observed dynamic data is accomplished through a "history matching" (HM) process typically carried out with well-based measurements. Due to the low resolution of production data, the validation process is severely limited in its areal coverage of the reservoir, compromising the quality of the model and any subsequent predictive exercise. This research aims to provide a novel history matching approach that can use information from high-resolution seismic data to supplement the areally sparse production data. The proposed approach utilizes streamline-derived sensitivities as a means of relating the forward model performance with the prior geologic model. The essential ideas underlying this approach are similar to those used for high-frequency approximations in seismic wave propagation. In both cases, this leads to solutions that are defined along "streamlines" (fluid flow) or "rays" (seismic wave propagation). Synthetic and field data examples are used extensively to demonstrate the value and contribution of this work. Our results show that the problem of non-uniqueness in this complex history matching problem is greatly reduced when constraints in the form of saturation maps from spatially closely sampled seismic data are included. Furthermore, our methodology can be used to quickly identify discrepancies between static and dynamic modeling. Reducing this gap will ensure robust and reliable models, leading to accurate predictions and ultimately optimum hydrocarbon extraction.

  18. Identification of internal control genes for quantitative expression analysis by real-time PCR in bovine peripheral lymphocytes.

    PubMed

    Spalenza, Veronica; Girolami, Flavia; Bevilacqua, Claudia; Riondato, Fulvio; Rasero, Roberto; Nebbia, Carlo; Sacchi, Paola; Martin, Patrice

    2011-09-01

    Gene expression studies in blood cells, particularly lymphocytes, are useful for monitoring potential exposure to toxicants or environmental pollutants in humans and livestock species. Quantitative PCR is the method of choice for obtaining accurate quantification of mRNA transcripts, although variations in the amount of starting material, enzymatic efficiency, and the presence of inhibitors can lead to evaluation errors. As a result, normalization of data is of crucial importance. The most common approach is the use of endogenous reference genes as an internal control, whose expression should ideally not vary among individuals and under different experimental conditions. The accurate selection of reference genes is therefore an important step in interpreting quantitative PCR studies. Since no systematic investigation in bovine lymphocytes has been performed, the aim of the present study was to assess the expression stability of seven candidate reference genes in circulating lymphocytes collected from 15 dairy cows. Following characterization by flow cytometric analysis of the cell populations obtained from blood through a density gradient procedure, three popular software programs were used to evaluate the gene expression data. The results showed that two genes are sufficient for normalization of quantitative PCR studies in cattle lymphocytes and that YWHAZ, S24 and PPIA are the most stable genes. Copyright © 2010 Elsevier Ltd. All rights reserved.
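
    As a concrete illustration of the normalization step, the sketch below scales a target gene's abundance by the geometric mean of two stable reference genes. The Ct values, the assumed ideal amplification efficiency of 2.0, and the pairing of reference genes are illustrative, not data from the study.

```python
# Sketch: reference-gene normalization with the geometric mean of two
# stable reference genes (e.g., YWHAZ and PPIA per the study's ranking).
# Ct values and the efficiency of 2.0 are illustrative assumptions.
from statistics import geometric_mean

def relative_expression(ct_target, ct_refs, efficiency=2.0):
    """Target abundance divided by the geometric mean of reference genes."""
    q_target = efficiency ** (-ct_target)
    q_refs = geometric_mean([efficiency ** (-ct) for ct in ct_refs])
    return q_target / q_refs

print(relative_expression(24.1, [18.3, 19.0]))
```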

  19. Quantitative analysis of periodontal pathogens by ELISA and real-time polymerase chain reaction.

    PubMed

    Hamlet, Stephen M

    2010-01-01

    The development of analytical methods enabling the accurate identification and enumeration of bacterial species colonizing the oral cavity has led to the identification of a small number of bacterial pathogens that are major factors in the etiology of periodontal disease. Further, these methods also underpin more recent epidemiological analyses of the impact of periodontal disease on general health. Given the complex milieu of over 700 species of microorganisms known to exist within the complex biofilms found in the oral cavity, the identification and enumeration of oral periodontopathogens has not been an easy task. In recent years, however, some of the intrinsic limitations of the more traditional microbiological analyses previously used have been overcome with the advent of immunological and molecular analytical methods. Of the plethora of methodologies reported in the literature, the enzyme-linked immunosorbent assay (ELISA), which combines the specificity of antibodies with the sensitivity of simple enzyme assays, and the polymerase chain reaction (PCR) have been widely utilized in both laboratory and clinical applications. Although conventional PCR does not allow quantitation of the target organism, real-time PCR (rtPCR) has the ability to detect amplicons as they accumulate in "real time", allowing subsequent quantitation. These methods enable the accurate quantitation of as few as 10^2 (using rtPCR) to 10^4 (using ELISA) periodontopathogens in dental plaque samples.
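
    The rtPCR quantitation above rests on a standard-curve calculation: Ct is fit as a linear function of log10 copy number for a dilution series, and the fit is inverted for unknowns. The sketch below illustrates this; all values are invented.

```python
# Sketch: absolute qPCR quantitation from a standard curve. The dilution
# series and Ct values are invented for illustration.
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])   # known standards
std_ct = np.array([31.2, 27.9, 24.5, 21.1, 17.8])  # measured Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

def copies_from_ct(ct):
    """Invert the standard curve: Ct = slope*log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

print(f"{copies_from_ct(29.0):.0f} copies")   # an unknown with Ct = 29
```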

  20. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    USGS Publications Warehouse

    Anderson, Ryan; Clegg, Samuel M.; Frydenvang, Jens; Wiens, Roger C.; McLennan, Scott M.; Morris, Richard V.; Ehlmann, Bethany L.; Dyar, M. Darby

    2017-01-01

    Accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test-set root mean squared error of prediction (RMSEP) for almost all cases. The sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but it is applicable to any multivariate regression method and may yield similar improvements.
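
    A minimal sketch of the sub-model idea follows: PLS regressions trained on restricted composition ranges, a full-range model to pick the regime, and a simple average in the overlap. The thresholds, component counts, and blending rule are assumptions, not the actual ChemCam calibration.

```python
# Sketch: "sub-model" PLS regression with blending. X is a matrix of
# LIBS spectra, y the element concentration (wt. %); both are assumed
# to be NumPy arrays. Ranges and component counts are illustrative.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def train_submodels(X, y, low_max=30.0, high_min=20.0, n_comp=5):
    full = PLSRegression(n_components=n_comp).fit(X, y)
    low = PLSRegression(n_components=n_comp).fit(X[y <= low_max], y[y <= low_max])
    high = PLSRegression(n_components=n_comp).fit(X[y >= high_min], y[y >= high_min])
    return full, low, high

def blended_predict(models, X, low_max=30.0, high_min=20.0):
    full, low, high = models
    ref = full.predict(X).ravel()          # full-range model picks the regime
    lo_pred = low.predict(X).ravel()
    hi_pred = high.predict(X).ravel()
    out = np.where(ref <= low_max, lo_pred, hi_pred)
    overlap = (ref >= high_min) & (ref <= low_max)
    out[overlap] = 0.5 * (lo_pred[overlap] + hi_pred[overlap])  # soft blend
    return out
```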

  1. Accurate collision-induced line-coupling parameters for the fundamental band of CO in He - Close coupling and coupled states scattering calculations

    NASA Technical Reports Server (NTRS)

    Green, Sheldon; Boissoles, J.; Boulet, C.

    1988-01-01

    The first accurate theoretical values for off-diagonal (i.e., line-coupling) pressure-broadening cross sections are presented. Calculations were done for CO perturbed by He at thermal collision energies using an accurate ab initio potential energy surface. Converged close coupling, i.e., numerically exact values, were obtained for coupling to the R(0) and R(2) lines. These were used to test the coupled states (CS) and infinite order sudden (IOS) approximate scattering methods. CS was found to be of quantitative accuracy (a few percent) and has been used to obtain coupling values for lines to R(10). IOS values are less accurate, but, owing to their simplicity, may nonetheless prove useful as has been recently demonstrated.

  2. An accurate method of extracting fat droplets in liver images for quantitative evaluation

    NASA Astrophysics Data System (ADS)

    Ishikawa, Masahiro; Kobayashi, Naoki; Komagata, Hideki; Shinoda, Kazuma; Yamaguchi, Masahiro; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2015-03-01

    Steatosis in liver pathological tissue images is a promising indicator of nonalcoholic fatty liver disease (NAFLD) and the possible risk of hepatocellular carcinoma (HCC). The resulting values are also important for ensuring the automatic and accurate classification of HCC images, because the existence of many fat droplets is likely to create errors in quantifying the morphological features used in the process. In this study we propose a method that can automatically detect and exclude regions with many fat droplets by using feature values of color, shape and the arrangement of cell nuclei. We implement the method and confirm that it can accurately detect fat droplets and quantify the fat droplet ratio of actual images. This investigation also clarifies the effective characteristics that contribute to accurate detection.

  3. PASTA: Ultra-Large Multiple Sequence Alignment for Nucleotide and Amino-Acid Sequences.

    PubMed

    Mirarab, Siavash; Nguyen, Nam; Guo, Sheng; Wang, Li-San; Kim, Junhyong; Warnow, Tandy

    2015-05-01

    We introduce PASTA, a new multiple sequence alignment algorithm. PASTA uses a new technique to produce an alignment given a guide tree that enables it to be both highly scalable and very accurate. We present a study on biological and simulated data with up to 200,000 sequences, showing that PASTA produces highly accurate alignments, improving on the accuracy and scalability of the leading alignment methods (including SATé). We also show that trees estimated on PASTA alignments are highly accurate--slightly better than SATé trees, but with substantial improvements relative to other methods. Finally, PASTA is faster than SATé, highly parallelizable, and requires relatively little memory.

  4. Identification of Novel Tumor-Associated Cell Surface Sialoglycoproteins in Human Glioblastoma Tumors Using Quantitative Proteomics

    PubMed Central

    Autelitano, François; Loyaux, Denis; Roudières, Sébastien; Déon, Catherine; Guette, Frédérique; Fabre, Philippe; Ping, Qinggong; Wang, Su; Auvergne, Romane; Badarinarayana, Vasudeo; Smith, Michael; Guillemot, Jean-Claude; Goldman, Steven A.; Natesan, Sridaran; Ferrara, Pascual; August, Paul

    2014-01-01

    Glioblastoma multiforme (GBM) remains a clinical indication with significant unmet medical need. Innovative new therapies to eliminate residual tumor cells and prevent tumor recurrence are critically needed for this deadly disease. A major challenge of GBM research has been the identification of novel molecular therapeutic targets and accurate diagnostic/prognostic biomarkers. Many of the current clinical therapeutic targets of immunotoxins and ligand-directed toxins for high-grade glioma (HGG) cells are surface sialylated glycoproteins. Therefore, methods that systematically and quantitatively analyze cell surface sialoglycoproteins in human clinical tumor samples would be useful for the identification of potential diagnostic markers and therapeutic targets for malignant gliomas. In this study, we used the bioorthogonal chemical reporter strategy (BOCR) in combination with label-free quantitative mass spectrometry (LFQ-MS) to characterize and accurately quantify the individual cell surface sialoproteome in human GBM tissues, in fetal and adult human astrocytes, and in human neural progenitor cells (NPCs). We identified and quantified a total of 843 proteins, including 801 glycoproteins. Among the 843 proteins, 606 (72%) are known cell surface or secreted glycoproteins, including 156 CD-antigens, all major classes of cell surface receptor proteins, transporters, and adhesion proteins. Our findings identified several known as well as new cell surface antigens whose expression is predominantly restricted to human GBM tumors, as confirmed by microarray transcription profiling, quantitative RT-PCR and immunohistochemical staining. This report presents the comprehensive identification of new biomarkers and therapeutic targets for the treatment of malignant gliomas using quantitative sialoglycoproteomics with clinically relevant, patient-derived primary glioma cells. PMID:25360666

  5. Identification of novel tumor-associated cell surface sialoglycoproteins in human glioblastoma tumors using quantitative proteomics.

    PubMed

    Autelitano, François; Loyaux, Denis; Roudières, Sébastien; Déon, Catherine; Guette, Frédérique; Fabre, Philippe; Ping, Qinggong; Wang, Su; Auvergne, Romane; Badarinarayana, Vasudeo; Smith, Michael; Guillemot, Jean-Claude; Goldman, Steven A; Natesan, Sridaran; Ferrara, Pascual; August, Paul

    2014-01-01

    Glioblastoma multiforme (GBM) remains a clinical indication with significant unmet medical need. Innovative new therapies to eliminate residual tumor cells and prevent tumor recurrence are critically needed for this deadly disease. A major challenge of GBM research has been the identification of novel molecular therapeutic targets and accurate diagnostic/prognostic biomarkers. Many of the current clinical therapeutic targets of immunotoxins and ligand-directed toxins for high-grade glioma (HGG) cells are surface sialylated glycoproteins. Therefore, methods that systematically and quantitatively analyze cell surface sialoglycoproteins in human clinical tumor samples would be useful for the identification of potential diagnostic markers and therapeutic targets for malignant gliomas. In this study, we used the bioorthogonal chemical reporter strategy (BOCR) in combination with label-free quantitative mass spectrometry (LFQ-MS) to characterize and accurately quantify the individual cell surface sialoproteome in human GBM tissues, in fetal and adult human astrocytes, and in human neural progenitor cells (NPCs). We identified and quantified a total of 843 proteins, including 801 glycoproteins. Among the 843 proteins, 606 (72%) are known cell surface or secreted glycoproteins, including 156 CD-antigens, all major classes of cell surface receptor proteins, transporters, and adhesion proteins. Our findings identified several known as well as new cell surface antigens whose expression is predominantly restricted to human GBM tumors, as confirmed by microarray transcription profiling, quantitative RT-PCR and immunohistochemical staining. This report presents the comprehensive identification of new biomarkers and therapeutic targets for the treatment of malignant gliomas using quantitative sialoglycoproteomics with clinically relevant, patient-derived primary glioma cells.

  6. In silico method for modelling metabolism and gene product expression at genome scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lerman, Joshua A.; Hyduke, Daniel R.; Latif, Haythem

    2012-07-03

    Transcription and translation use raw materials and energy generated metabolically to create the macromolecular machinery responsible for all cellular functions, including metabolism. A biochemically accurate model of molecular biology and metabolism will facilitate comprehensive and quantitative computations of an organism's molecular constitution as a function of genetic and environmental parameters. Here we formulate a model of metabolism and macromolecular expression. Prototyping it using the simple microorganism Thermotoga maritima, we show our model accurately simulates variations in cellular composition and gene expression. Moreover, through in silico comparative transcriptomics, the model allows the discovery of new regulons and improves the genome and transcription unit annotations. Our method presents a framework for investigating molecular biology and cellular physiology in silico and may allow quantitative interpretation of multi-omics data sets in the context of an integrated biochemical description of an organism.

  7. Experimental study on the impact of temperature on the dissipation process of supersaturated total dissolved gas.

    PubMed

    Shen, Xia; Liu, Shengyun; Li, Ran; Ou, Yangming

    2014-09-01

    Water temperature not only affects the solubility of gas in water but can also be an important factor in the dissipation process of supersaturated total dissolved gas (TDG). The quantitative relationship between the dissipation process and temperature has not been previously described. This relationship affects the accurate evaluation of the dissipation process and the subsequent biological effects. This article experimentally investigates the impact of temperature on supersaturated TDG dissipation in static and turbulent conditions. The results show that the supersaturated TDG dissipation coefficient increases with the temperature and turbulence intensity. The quantitative relationship was verified by straight flume experiments. This study enhances our understanding of the dissipation of supersaturated TDG. Furthermore, it provides a scientific foundation for the accurate prediction of the dissipation process of supersaturated TDG in the downstream area and the negative impacts of high dam projects on aquatic ecosystems. Copyright © 2014. Published by Elsevier B.V.
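
    The article's fitted relationship is not reproduced in the abstract. A first-order kinetic model with a temperature- and turbulence-dependent coefficient is a standard way such dissipation is described; the form below is an assumed sketch consistent with the reported trend, not an equation taken from the paper.

```latex
\frac{dG}{dt} = -\,k(T,\varepsilon)\,\bigl(G - G_{\mathrm{eq}}\bigr)
```

    Here \(G\) is the TDG saturation level, \(G_{\mathrm{eq}}\) its equilibrium value, and the dissipation coefficient \(k\) increases with water temperature \(T\) and turbulence intensity \(\varepsilon\).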

  8. Lunar mineral feedstocks from rocks and soils: X-ray digital imaging in resource evaluation

    NASA Technical Reports Server (NTRS)

    Chambers, John G.; Patchen, Allan; Taylor, Lawrence A.; Higgins, Stefan J.; Mckay, David S.

    1994-01-01

    The rocks and soils of the Moon provide raw materials essential to the successful establishment of a lunar base. Efficient exploitation of these resources requires accurate characterization of mineral abundances, sizes/shapes, and association of 'ore' and 'gangue' phases, as well as the technology to generate high-yield/high-grade feedstocks. Only recently have x-ray mapping and digital imaging techniques been applied to lunar resource evaluation. The topics covered include inherent differences between lunar basalts and soils and quantitative comparison of rock-derived and soil-derived ilmenite concentrates. It is concluded that x-ray digital-imaging characterization of lunar raw materials provides a quantitative comparison that is unattainable by traditional petrographic techniques. These data are necessary for accurately determining mineral distributions of soil and crushed rock material. Application of these techniques will provide an important link to choosing the best raw material for mineral beneficiation.

  9. Fuzzy classifier based support vector regression framework for Poisson ratio determination

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2013-09-01

    Poisson ratio is considered as one of the most important rock mechanical properties of hydrocarbon reservoirs. Determination of this parameter through laboratory measurement is time, cost, and labor intensive. Furthermore, laboratory measurements do not provide continuous data along the reservoir intervals. Hence, a fast, accurate, and inexpensive way of determining Poisson ratio which produces continuous data over the whole reservoir interval is desirable. For this purpose, support vector regression (SVR) method based on statistical learning theory (SLT) was employed as a supervised learning algorithm to estimate Poisson ratio from conventional well log data. SVR is capable of accurately extracting the implicit knowledge contained in conventional well logs and converting the gained knowledge into Poisson ratio data. Structural risk minimization (SRM) principle which is embedded in the SVR structure in addition to empirical risk minimization (EMR) principle provides a robust model for finding quantitative formulation between conventional well log data and Poisson ratio. Although satisfying results were obtained from an individual SVR model, it had flaws of overestimation in low Poisson ratios and underestimation in high Poisson ratios. These errors were eliminated through implementation of fuzzy classifier based SVR (FCBSVR). The FCBSVR significantly improved accuracy of the final prediction. This strategy was successfully applied to data from carbonate reservoir rocks of an Iranian Oil Field. Results indicated that SVR predicted Poisson ratio values are in good agreement with measured values.
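
    A minimal sketch of the supervised-learning setup, assuming an RBF-kernel SVR on standardized well-log inputs. The log channels, hyperparameters, and synthetic data are placeholders, and the fuzzy-classifier refinement described above is not shown.

```python
# Sketch: SVR regression from conventional well logs to Poisson's ratio.
# The four input channels (e.g., sonic, density, neutron porosity, gamma
# ray) and all data here are illustrative stand-ins.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X_train = rng.random((200, 4))                # stand-in well-log features
y_train = 0.20 + 0.15 * X_train[:, 0]         # stand-in measured Poisson ratio

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X_train, y_train)
print(model.predict(X_train[:5]))             # continuous estimates along depth
```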

  10. Microlensing for extrasolar planets : improving the photometry

    NASA Astrophysics Data System (ADS)

    Bajek, David J.

    2013-08-01

    Gravitational Microlensing, as a technique for detecting Extrasolar Planets, is recognised for its potential in discovering small-mass planets similar to Earth, at a distance of a few Astronomical Units from their host stars. However, analysing the data from microlensing events (which statistically rarely reveal planets) is complex and requires continued and intensive use of various networks of telescopes working together in order to observe the phenomenon. As such the techniques are constantly being developed and refined; this project outlines some steps of the careful analysis required to model an event and ensure the best quality data is used in the fitting. A quantitative investigation into increasing the quality of the original photometric data available from any microlensing event demonstrates that 'lucky imaging' can lead to a marked improvement in the signal to noise ratio of images over standard imaging techniques, which could result in more accurate models and thus the calculation of more accurate planetary parameters. In addition, a simulation illustrating the effects of atmospheric turbulence on exposures was created, and expanded upon to give an approximation of the lucky imaging technique. This further demonstrated the advantages of lucky images which are shown to potentially approach the quality of those expected from diffraction limited photometry. The simulation may be further developed for potential future use as a 'theoretical lucky imager' in our research group, capable of producing and analysing synthetic exposures through customisable conditions.
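
    The core of lucky imaging is compact enough to sketch: score each short exposure, keep only the sharpest few percent, and align-and-stack them. The brightest-pixel quality metric and shift-to-peak registration below are common simple choices, not necessarily those used in this project.

```python
# Sketch: frame selection and shift-and-add stacking for lucky imaging.
# `frames` is assumed to be an (n_frames, ny, nx) stack of short exposures.
import numpy as np

def lucky_stack(frames, keep_fraction=0.05):
    frames = np.asarray(frames, dtype=float)
    quality = frames.max(axis=(1, 2))              # brightest pixel as a proxy
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = frames[np.argsort(quality)[-n_keep:]]   # sharpest few percent
    stacked = np.zeros_like(best[0])
    cy, cx = best[0].shape[0] // 2, best[0].shape[1] // 2
    for f in best:                                 # recentre on the peak, co-add
        py, px = np.unravel_index(np.argmax(f), f.shape)
        stacked += np.roll(np.roll(f, cy - py, axis=0), cx - px, axis=1)
    return stacked / n_keep
```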

  11. Optical eigenmodes for illumination & imaging

    NASA Astrophysics Data System (ADS)

    Kosmeier, Sebastian

  12. The traditional food of migrants: Meat, water, and other challenges for dietary advice. An ethnography in Guanajuato, Mexico.

    PubMed

    Smith-Morris, Carolyn

    2016-10-01

    The term "traditional diet" is used variously in public health and nutrition literature to refer to a substantial variety of foodways. Yet it is difficult to draw generalities about dietary tradition for specific ethnic groups. Given the strong association between migration and dietary change, it is particularly important that dietary advice for migrants be both accurate and specific. In this article, I examine the cultural construct of "traditional foods" through mixed method research on diet and foodways among rural farmers in Guanajuato, MX and migrants from this community to other Mexican and U.S. destinations. Findings reveal first, that quantitatively salient terms may contain important variation, and second, that some "traditional" dietary items -like "refresco," "carne," and "agua" - may be used in nutritionally contradictory ways between clinicians and Mexican immigrant patients. Specifically, the term "traditional food" in nutritional advice for Mexican migrants may be intended to promote consumption of fresh produce or less meat; but it may also invoke other foods (e.g., meats or corn), inspire more regular consumption of formerly rare foods (e.g., meats, flavored waters), or set up financially impossible goals (e.g., leaner meats than can be afforded). Salience studies with ethnographic follow up in target populations can promote the most useful and accurate terms for dietary advice. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Quantitative contrast-enhanced optical coherence tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winetraub, Yonatan; SoRelle, Elliott D.

    2016-01-11

    We have developed a model to accurately quantify the signals produced by exogenous scattering agents used for contrast-enhanced Optical Coherence Tomography (OCT). This model predicts distinct concentration-dependent signal trends that arise from the underlying physics of OCT detection. Accordingly, we show that real scattering particles can be described as simplified ideal scatterers with modified scattering intensity and concentration. The relation between OCT signal and particle concentration is approximately linear at concentrations lower than 0.8 particle per imaging voxel. However, at higher concentrations, interference effects cause signal to increase with a square root dependence on the number of particles within a voxel. Finally, high particle concentrations cause enough light attenuation to saturate the detected signal. Predictions were validated by comparison with measured OCT signals from gold nanorods (GNRs) prepared in water at concentrations ranging over five orders of magnitude (50 fM to 5 nM). In addition, we validated that our model accurately predicts the signal responses of GNRs in highly heterogeneous scattering environments including whole blood and living animals. By enabling particle quantification, this work provides a valuable tool for current and future contrast-enhanced in vivo OCT studies. More generally, the model described herein may inform the interpretation of detected signals in modalities that rely on coherence-based detection or are susceptible to interference effects.
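
    The trends in the abstract can be condensed into a small piecewise model: linear below roughly 0.8 particles per imaging voxel, square-root growth above it, and saturation from attenuation. The functional form and saturation constant below are illustrative assumptions matching the qualitative description, not the authors' actual model.

```python
# Sketch: concentration-to-signal curve with linear, square-root, and
# saturation regimes. The crossover of 0.8 particles/voxel is from the
# text; the soft-saturation form and constant are assumptions.
import numpy as np

def oct_signal(n_per_voxel, crossover=0.8, sat=50.0):
    n = np.asarray(n_per_voxel, dtype=float)
    sqrt_regime = np.sqrt(crossover * n)     # matches the linear branch at n = crossover
    s = np.where(n <= crossover, n, sqrt_regime)
    return sat * (1.0 - np.exp(-s / sat))    # attenuation-driven saturation

print(oct_signal([0.1, 0.8, 10.0, 1000.0]))
```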

  14. Liquid chromatographic determination of oxytetracycline in edible fish fillets from six species of fish

    USGS Publications Warehouse

    Meinertz, J.R.; Stehly, G.R.; Gingerich, W.H.

    1998-01-01

    The approved use of oxytetracycline (OTC) in U.S. aquaculture is limited to specific diseases in salmonids and channel catfish. OTC may also be effective in controlling diseases in other fish species important to public aquaculture, but before approved use of OTC can be augmented, an analytical method for determining OTC in fillet tissue from multiple species of fish will be required to support residue depletion studies. The objective of this study was to develop and validate a liquid chromatographic (LC) method that is accurate, precise, and sensitive for OTC in edible fillets from multiple species of fish. Homogenized fillet tissues from walleye, Atlantic salmon, striped bass, white sturgeon, rainbow trout, and channel catfish were fortified with OTC at nominal concentrations of 10, 20, 100, 1000, and 5000 ng/g. In tissues fortified with OTC at 100, 1000, and 5000 ng/g, mean recoveries ranged from 83 to 90%, and relative standard deviations (RSDs) ranged from 0.9 to 5.8%. In all other tissues, mean recoveries ranged from 59 to 98%, and RSDs ranged from 3.3 to 20%. Method quantitation limits ranged from 6 to 22 ng/g for the 6 species. The LC parameters produced easily integrated OTC peaks without coelution of endogenous compounds. The method is accurate, precise, and sensitive for OTC in fillet tissue from 6 species of fish from 5 phylogenetically diverse groups.

  15. Comparative study of transient hydraulic tomography with varying parameterizations and zonations: Laboratory sandbox investigation

    NASA Astrophysics Data System (ADS)

    Luo, Ning; Zhao, Zhanfeng; Illman, Walter A.; Berg, Steven J.

    2017-11-01

    Transient hydraulic tomography (THT) is a robust method of aquifer characterization for estimating the spatial distributions (or tomograms) of both hydraulic conductivity (K) and specific storage (Ss). However, the highly parameterized nature of the geostatistical inversion approach renders it computationally intensive for large-scale investigations. In addition, geostatistics-based THT may produce overly smooth tomograms when the head data used to constrain the inversion are limited. Therefore, alternative model conceptualizations for THT need to be examined. To investigate this, we simultaneously calibrated different groundwater models with varying parameterizations and zonations using two cases of different pumping and monitoring data densities from a laboratory sandbox. Specifically, one effective parameter model, four geology-based zonation models with varying accuracy and resolution, and five geostatistical models with different prior information were calibrated. Model performance is quantitatively assessed by examining the calibration and validation results. Our study reveals that highly parameterized geostatistical models perform the best among the models compared, while the zonation model with excellent knowledge of stratigraphy also yields comparable results. When few pumping tests with sparse monitoring intervals are available, the incorporation of accurate or simplified geological information into geostatistical models reveals more details in heterogeneity and yields more robust validation results. However, results deteriorate when inaccurate geological information is incorporated. Finally, our study reveals that transient inversions are necessary to obtain reliable K and Ss estimates for making accurate predictions of transient drawdown events.

  16. Preferential access to genetic information from endogenous hominin ancient DNA and accurate quantitative SNP-typing via SPEX

    PubMed Central

    Brotherton, Paul; Sanchez, Juan J.; Cooper, Alan; Endicott, Phillip

    2010-01-01

    The analysis of targeted genetic loci from ancient, forensic and clinical samples is usually built upon polymerase chain reaction (PCR)-generated sequence data. However, many studies have shown that PCR amplification from poor-quality DNA templates can create sequence artefacts at significant levels. With hominin (human and other hominid) samples, the pervasive presence of highly PCR-amplifiable human DNA contaminants in the vast majority of samples can lead to the creation of recombinant hybrids and other non-authentic artefacts. The resulting PCR-generated sequences can then be difficult, if not impossible, to authenticate. In contrast, single primer extension (SPEX)-based approaches can genotype single nucleotide polymorphisms from ancient fragments of DNA as accurately as modern DNA. A single SPEX-type assay can amplify just one of the duplex DNA strands at target loci and generate a multi-fold depth-of-coverage, with non-authentic recombinant hybrids reduced to undetectable levels. Crucially, SPEX-type approaches can preferentially access genetic information from damaged and degraded endogenous ancient DNA templates over modern human DNA contaminants. The development of SPEX-type assays offers the potential for highly accurate, quantitative genotyping from ancient hominin samples. PMID:19864251

  17. A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades

    PubMed Central

    Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd

    2017-01-01

    The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter. PMID:28813566

  18. A nonlinear generalization of the Savitzky-Golay filter and the quantitative analysis of saccades.

    PubMed

    Dai, Weiwei; Selesnick, Ivan; Rizzo, John-Ross; Rucker, Janet; Hudson, Todd

    2017-08-01

    The Savitzky-Golay (SG) filter is widely used to smooth and differentiate time series, especially biomedical data. However, time series that exhibit abrupt departures from their typical trends, such as sharp waves or steps, which are of physiological interest, tend to be oversmoothed by the SG filter. Hence, the SG filter tends to systematically underestimate physiological parameters in certain situations. This article proposes a generalization of the SG filter to more accurately track abrupt deviations in time series, leading to more accurate parameter estimates (e.g., peak velocity of saccadic eye movements). The proposed filtering methodology models a time series as the sum of two component time series: a low-frequency time series for which the conventional SG filter is well suited, and a second time series that exhibits instantaneous deviations (e.g., sharp waves, steps, or more generally, discontinuities in a higher order derivative). The generalized SG filter is then applied to the quantitative analysis of saccadic eye movements. It is demonstrated that (a) the conventional SG filter underestimates the peak velocity of saccades, especially those of small amplitude, and (b) the generalized SG filter estimates peak saccadic velocity more accurately than the conventional filter.
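
    Claim (a) is easy to reproduce on synthetic data: differentiate a small, fast sigmoidal "saccade" with a conventional SG filter and compare peak velocities. The parameters below are illustrative; the generalized filter itself (the added sparse-deviation component) is not reimplemented here.

```python
# Sketch: conventional Savitzky-Golay differentiation underestimates the
# peak velocity of a fast synthetic saccade. All parameters are illustrative.
import numpy as np
from scipy.signal import savgol_filter

fs = 1000.0                                            # sampling rate, Hz
t = np.arange(0.0, 0.3, 1.0 / fs)
position = 5.0 / (1.0 + np.exp(-(t - 0.15) * 150.0))   # 5-deg sigmoid saccade

true_velocity = np.gradient(position, 1.0 / fs)
sg_velocity = savgol_filter(position, window_length=21, polyorder=3,
                            deriv=1, delta=1.0 / fs)

print(true_velocity.max(), sg_velocity.max())  # SG peak velocity is lower
```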

  19. An optimized color transformation for the analysis of digital images of hematoxylin & eosin stained slides.

    PubMed

    Zarella, Mark D; Breen, David E; Plagov, Andrei; Garcia, Fernando U

    2015-01-01

    Hematoxylin and eosin (H&E) staining is ubiquitous in pathology practice and research. As digital pathology has evolved, the reliance on quantitative methods that make use of H&E images has similarly expanded. For example, cell counting and nuclear morphometry rely on the accurate demarcation of nuclei from other structures and each other. One of the major obstacles to quantitative analysis of H&E images is the high degree of variability observed between different samples and different laboratories. In an effort to characterize this variability, as well as to provide a substrate that can potentially mitigate this factor in quantitative image analysis, we developed a technique to project H&E images into an optimized space more appropriate for many image analysis procedures. We used a decision tree-based support vector machine learning algorithm to classify 44 H&E stained whole slide images of resected breast tumors according to the histological structures that are present. This procedure takes an H&E image as an input and produces a classification map of the image that predicts the likelihood of a pixel belonging to any one of a set of user-defined structures (e.g., cytoplasm, stroma). By reducing these maps into their constituent pixels in color space, an optimal reference vector is obtained for each structure, which identifies the color attributes that maximally distinguish one structure from other elements in the image. We show that tissue structures can be identified using this semi-automated technique. By comparing structure centroids across different images, we obtained a quantitative depiction of H&E variability for each structure. This measurement can potentially be utilized in the laboratory to help calibrate daily staining or identify troublesome slides. Moreover, by aligning reference vectors derived from this technique, images can be transformed in a way that standardizes their color properties and makes them more amenable to image processing.
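
    A minimal sketch of the reference-vector step: collapse each structure's classified pixels to a mean color vector, then score every pixel by similarity to that vector. The cosine projection below is one simple realization under stated assumptions, not the authors' exact transformation.

```python
# Sketch: per-structure reference vectors from a labelled H&E image and a
# cosine-similarity projection. `rgb_image` is (H, W, 3); `label_map` is an
# (H, W) integer map from a classifier; both are assumed NumPy arrays.
import numpy as np

def reference_vectors(rgb_image, label_map, labels):
    """Mean RGB vector for each structure label (e.g., cytoplasm, stroma)."""
    return {lab: rgb_image[label_map == lab].mean(axis=0) for lab in labels}

def project_onto(rgb_image, ref_vec):
    """Cosine similarity of every pixel to one structure's reference vector."""
    pixels = rgb_image.reshape(-1, 3).astype(float)
    v = np.asarray(ref_vec, dtype=float)
    v /= np.linalg.norm(v)
    sims = (pixels @ v) / (np.linalg.norm(pixels, axis=1) + 1e-9)
    return sims.reshape(rgb_image.shape[:2])
```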

  20. Analytical validation of quantitative immunohistochemical assays of tumor infiltrating lymphocyte biomarkers.

    PubMed

    Singh, U; Cui, Y; Dimaano, N; Mehta, S; Pruitt, S K; Yearley, J; Laterza, O F; Juco, J W; Dogdas, B

    2018-06-04

    Tumor infiltrating lymphocytes (TIL), especially T-cells, have both prognostic and therapeutic applications. The presence of CD8+ effector T-cells and the ratio of CD8+ cells to FOXP3+ regulatory T-cells have been used as biomarkers of disease prognosis to predict response to various immunotherapies. Blocking the interaction between inhibitory receptors on T-cells and their ligands with therapeutic antibodies including atezolizumab, nivolumab, pembrolizumab and tremelimumab increases the immune response against cancer cells and has shown significant improvement in clinical benefits and survival in several different tumor types. The improved clinical outcome is presumed to be associated with a higher tumor infiltration; therefore, it is thought that more accurate methods for measuring the amount of TIL could assist prognosis and predict treatment response. We have developed and validated quantitative immunohistochemistry (IHC) assays for CD3, CD8 and FOXP3 for immunophenotyping T-lymphocytes in tumor tissue. Various types of formalin-fixed, paraffin-embedded (FFPE) tumor tissues were immunolabeled with anti-CD3, anti-CD8 and anti-FOXP3 antibodies using an IHC autostainer. The tumor area of stained tissues, including the invasive margin of the tumor, was scored by a pathologist (visual scoring) and by computer-based quantitative image analysis. Two image analysis scores were obtained for the staining of each biomarker: the percentage of positive cells in the tumor area and positive cells/mm^2 of tumor area. Comparison of visual vs. image analysis scoring methods using regression analysis showed high correlation and indicated that quantitative image analysis can be used to score the number of positive cells in IHC stained slides. To demonstrate that the IHC assays produce consistent results in normal daily testing, we evaluated the specificity, sensitivity and reproducibility of the IHC assays using both visual and image analysis scoring methods. We found that CD3, CD8 and FOXP3 IHC assays met the fit-for-purpose analytical acceptance validation criteria and that they can be used to support clinical studies.

  1. Filtering Raw Terrestrial Laser Scanning Data for Efficient and Accurate Use in Geomorphologic Modeling

    NASA Astrophysics Data System (ADS)

    Gleason, M. J.; Pitlick, J.; Buttenfield, B. P.

    2011-12-01

    Terrestrial laser scanning (TLS) represents a new and particularly effective remote sensing technique for investigating geomorphologic processes. Unfortunately, TLS data are commonly characterized by extremely large volume, heterogeneous point distribution, and erroneous measurements, raising challenges for applied researchers. To facilitate efficient and accurate use of TLS in geomorphology, and to improve accessibility for TLS processing in commercial software environments, we are developing a filtering method for raw TLS data to: eliminate data redundancy; produce a more uniformly spaced dataset; remove erroneous measurements; and maintain the ability of the TLS dataset to accurately model terrain. Our method conducts local aggregation of raw TLS data using a 3-D search algorithm based on the geometrical expression of expected random errors in the data. This approach accounts for the estimated accuracy and precision limitations of the instruments and procedures used in data collection, thereby allowing for identification and removal of potential erroneous measurements prior to data aggregation. Initial tests of the proposed technique on a sample TLS point cloud required a modest processing time of approximately 100 minutes to reduce dataset volume over 90 percent (from 12,380,074 to 1,145,705 points). Preliminary analysis of the filtered point cloud revealed substantial improvement in homogeneity of point distribution and minimal degradation of derived terrain models. We will test the method on two independent TLS datasets collected in consecutive years along a non-vegetated reach of the North Fork Toutle River in Washington. We will evaluate the tool using various quantitative, qualitative, and statistical methods. The crux of this evaluation will include a bootstrapping analysis to test the ability of the filtered datasets to model the terrain at roughly the same accuracy as the raw datasets.
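
    One simple realization of error-aware local aggregation is a 3-D grid average whose cell size reflects the expected measurement error; the sketch below is such a voxel filter. The cell size and the plain averaging rule are assumptions, not the authors' exact 3-D search algorithm.

```python
# Sketch: voxel-grid thinning of a TLS point cloud. `points` is an (N, 3)
# array of x, y, z coordinates; `cell` approximates the expected error scale.
import numpy as np

def voxel_thin(points, cell=0.05):
    """Replace all points in each (cell x cell x cell) voxel by their mean."""
    keys = np.floor(points / cell).astype(np.int64)
    _, inverse, counts = np.unique(keys, axis=0, return_inverse=True,
                                   return_counts=True)
    sums = np.zeros((counts.size, 3))
    np.add.at(sums, inverse, points)        # accumulate points per voxel
    return sums / counts[:, None]

cloud = np.random.rand(100000, 3) * 10.0    # stand-in for raw TLS data
print(voxel_thin(cloud, cell=0.1).shape)    # far fewer, more uniform points
```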

  2. Accurate SERS detection of malachite green in aquatic products on basis of graphene wrapped flexible sensor.

    PubMed

    Ouyang, Lei; Yao, Ling; Zhou, Taohong; Zhu, Lihua

    2018-10-16

    Malachite green (MG) is a banned pesticide for aquaculture products. As a required inspection item, its fast and accurate determination before products reach the market is very important. Surface enhanced Raman scattering (SERS) is a promising tool for MG sensing, but it requires overcoming several problems, such as fairly poor sensitivity and reproducibility, and especially laser-induced chemical conversion and photo-bleaching during SERS observation. Using a graphene-wrapped Ag array based flexible membrane sensor, a modified SERS strategy was proposed for the sensitive and accurate detection of MG. The graphene layer functioned as an inert protector, impeding chemical conversion of the bioproduct leucomalachite green (LMG) to MG during SERS detection, and as a heat transmitter, preventing laser-induced photo-bleaching, which enables the separate detection of MG and LMG in fish extracts. The combination of the Ag array and the graphene cover also produced plentiful, densely and uniformly distributed hot spots, leading to an analytical enhancement factor of up to 3.9 × 10^8 and excellent reproducibility (relative standard deviation as low as 5.8% over 70 runs). The proposed method was easily used for MG detection with a limit of detection (LOD) as low as 2.7 × 10^-11 mol L^-1. The flexibility of the sensor makes it suitable for fast in-field detection of MG residues on the scales of a living fish through surface extraction and paste transfer. The developed strategy was successfully applied to the analysis of real samples, showing good prospects for both fast inspection and quantitative detection of MG. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Exploring a new quantitative image marker to assess benefit of chemotherapy to ovarian cancer patients

    NASA Astrophysics Data System (ADS)

    Mirniaharikandehei, Seyedehnafiseh; Patil, Omkar; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin

    2017-03-01

    Accurately assessing the potential benefit of chemotherapy to cancer patients is an important prerequisite to developing precision medicine in cancer treatment. A previous study showed that total psoas area (TPA) measured on preoperative cross-sectional CT images might be a good image marker for predicting the long-term outcome of pancreatic cancer patients after surgery. However, accurate and automated segmentation of TPA from CT images is difficult due to the fuzzy boundary or connection of TPA to other muscle areas. In this study, we developed a new interactive computer-aided detection (ICAD) scheme aiming to segment TPA from abdominal CT images more accurately and to assess the feasibility of using this new quantitative image marker to predict the benefit to ovarian cancer patients of receiving Bevacizumab-based chemotherapy. The ICAD scheme identifies a CT image slice of interest located at the level of L3 (vertebral spine), segments the cross-sections of the right and left TPA using a set of adaptively adjusted boundary conditions, and then quantitatively measures TPA. In addition, recent studies suggest that muscle radiation attenuation, which reflects fat deposition in the tissue, might be a good image feature for predicting the survival rate of cancer patients. The scheme and TPA measurement task were applied to a large national clinical trial database involving 1,247 ovarian cancer patients. By comparing with manual segmentation results, we found that the ICAD scheme yields higher accuracy and consistency for this task. The new ICAD scheme provides clinical researchers with a useful tool to more efficiently and accurately extract TPA as well as muscle radiation attenuation as new image markers, and allows them to investigate their discriminatory power for predicting progression-free survival and/or overall survival of cancer patients before and after chemotherapy.

  4. Three-dimensional Hessian matrix-based quantitative vascular imaging of rat iris with optical-resolution photoacoustic microscopy in vivo

    NASA Astrophysics Data System (ADS)

    Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo

    2018-04-01

    For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging the blood vasculature of the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo the 3-D vascular structures of a rat iris with a label-free imaging method and accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
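
    Hessian-based vessel enhancement of this kind is available off the shelf. The sketch below uses scikit-image's multiscale Frangi vesselness filter as a stand-in for the authors' own 3-D Hessian pipeline; the scales, the threshold, and the density estimate are assumptions.

```python
# Sketch: 3-D Hessian (Frangi) vesselness on a photoacoustic volume and a
# crude vascular-density estimate. The random volume is a stand-in.
import numpy as np
from skimage.filters import frangi

volume = np.random.rand(64, 64, 64)                    # stand-in image stack
vesselness = frangi(volume, sigmas=range(1, 5), black_ridges=False)
mask = vesselness > 0.05                               # assumed threshold
print("vascular density:", mask.mean())                # fraction of vessel voxels
```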

  5. Quantitative analysis of binary polymorphs mixtures of fusidic acid by diffuse reflectance FTIR spectroscopy, diffuse reflectance FT-NIR spectroscopy, Raman spectroscopy and multivariate calibration.

    PubMed

    Guo, Canyong; Luo, Xuefang; Zhou, Xiaohua; Shi, Beijia; Wang, Juanjuan; Zhao, Jinqi; Zhang, Xiaoxia

    2017-06-05

    Vibrational spectroscopic techniques such as infrared, near-infrared, and Raman spectroscopy have become popular for detecting and quantifying polymorphism in pharmaceutics because they are fast and non-destructive. This study assessed the ability of the three vibrational spectroscopic techniques, combined with multivariate analysis, to quantify a low-content undesired polymorph within a binary polymorphic mixture. Partial least squares (PLS) regression and support vector machine (SVM) regression were employed to build quantitative models. Fusidic acid, a steroidal antibiotic, was used as the model compound. It was found that PLS regression performed slightly better than SVM regression for all three spectroscopic techniques. Root mean square errors of prediction (RMSEP) ranged from 0.48% to 1.17% for diffuse reflectance FTIR spectroscopy, from 1.60% to 1.93% for diffuse reflectance FT-NIR spectroscopy, and from 1.62% to 2.31% for Raman spectroscopy. The results indicate that diffuse reflectance FTIR spectroscopy offers significant advantages in providing accurate measurement of polymorphic content in the fusidic acid binary mixtures, while Raman spectroscopy is the least accurate technique for quantitative analysis of these polymorphs. Copyright © 2017 Elsevier B.V. All rights reserved.
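
    As an illustration of the calibration step, the sketch below builds a PLS model relating spectra to polymorph content with scikit-learn. The spectra are synthetic stand-ins, and the component count is a hypothetical choice that would normally be tuned by cross-validation:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import mean_squared_error

        rng = np.random.default_rng(0)
        # Synthetic stand-in for 60 diffuse-reflectance spectra (500 channels) whose
        # shape varies linearly with the weight % of the minor polymorph.
        y = rng.uniform(0, 10, 60)                     # % polymorph II in the mixture
        pure_a, pure_b = rng.random(500), rng.random(500)
        X = np.outer(1 - y / 100, pure_a) + np.outer(y / 100, pure_b)
        X += rng.normal(0, 0.002, X.shape)             # instrument noise

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
        pls = PLSRegression(n_components=4).fit(X_tr, y_tr)
        rmsep = np.sqrt(mean_squared_error(y_te, pls.predict(X_te)))
        print(f"RMSEP: {rmsep:.3f} %")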

  6. Quantitative Metabolome Analysis Based on Chromatographic Peak Reconstruction in Chemical Isotope Labeling Liquid Chromatography Mass Spectrometry.

    PubMed

    Huan, Tao; Li, Liang

    2015-07-21

    Generating precise and accurate quantitative information on metabolomic changes in comparative samples is important for metabolomics research, where technical variations in the metabolomic data should be minimized in order to reveal biological changes. We report a method and software program, IsoMS-Quant, for extracting quantitative information from a metabolomic data set generated by chemical isotope labeling (CIL) liquid chromatography mass spectrometry (LC-MS). Unlike previous work, which relied on the mass spectral peak ratio of the highest-intensity peak pair to measure the relative quantity difference of a differentially labeled metabolite, this new program reconstructs the chromatographic peaks of the light- and heavy-labeled metabolite pair and then calculates the ratio of their peak areas to represent the relative concentration difference between two comparative samples. Using chromatographic peaks to perform relative quantification is shown to be more precise and accurate. IsoMS-Quant is integrated with IsoMS for picking peak pairs and with Zero-fill for retrieving missing peak pairs in the initial peak-pair table generated by IsoMS, forming a complete tool for processing CIL LC-MS data. This program can be freely downloaded from the www.MyCompoundID.org web site for noncommercial use.
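
    The core quantification idea, integrating the reconstructed light- and heavy-labeled chromatographic peaks and taking the ratio of their areas, can be sketched as follows. This is an illustration with synthetic Gaussian peaks, not IsoMS-Quant's actual implementation:

        import numpy as np

        def peak_area_ratio(rt, light_intensity, heavy_intensity):
            """Relative quantification from reconstructed chromatographic peaks.

            rt : retention times (min); the intensities are the extracted-ion traces
            of the light- and heavy-labeled forms of one metabolite (hypothetical).
            """
            light_area = np.trapz(light_intensity, rt)
            heavy_area = np.trapz(heavy_intensity, rt)
            return light_area / heavy_area

        rt = np.linspace(5.0, 5.5, 60)
        light = 1e5 * np.exp(-((rt - 5.25) / 0.05) ** 2)
        heavy = 2e5 * np.exp(-((rt - 5.25) / 0.05) ** 2)
        print(f"light/heavy = {peak_area_ratio(rt, light, heavy):.2f}")  # ~0.50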

  7. UNiquant, a program for quantitative proteomics analysis using stable isotope labeling.

    PubMed

    Huang, Xin; Tolmachev, Aleksey V; Shen, Yulei; Liu, Miao; Huang, Lin; Zhang, Zhixin; Anderson, Gordon A; Smith, Richard D; Chan, Wing C; Hinrichs, Steven H; Fu, Kai; Ding, Shi-Jian

    2011-03-04

    Stable isotope labeling (SIL) methods coupled with nanoscale liquid chromatography and high resolution tandem mass spectrometry are increasingly useful for elucidation of the proteome-wide differences between multiple biological samples. Development of more effective programs for the sensitive identification of peptide pairs and accurate measurement of the relative peptide/protein abundance are essential for quantitative proteomic analysis. We developed and evaluated the performance of a new program, termed UNiquant, for analyzing quantitative proteomics data using stable isotope labeling. UNiquant was compared with two other programs, MaxQuant and Mascot Distiller, using SILAC-labeled complex proteome mixtures having either known or unknown heavy/light ratios. For the SILAC-labeled Jeko-1 cell proteome digests with known heavy/light ratios (H/L = 1:1, 1:5, and 1:10), UNiquant quantified a similar number of peptide pairs as MaxQuant for the H/L = 1:1 and 1:5 mixtures. In addition, UNiquant quantified significantly more peptides than MaxQuant and Mascot Distiller in the H/L = 1:10 mixtures. UNiquant accurately measured relative peptide/protein abundance without the need for postmeasurement normalization of peptide ratios, which is required by the other programs.

  9. The Evolution of 3D Microimaging Techniques in Geosciences

    NASA Astrophysics Data System (ADS)

    Sahagian, D.; Proussevitch, A.

    2009-05-01

    In the analysis of geomaterials, it is essential to be able to analyze internal structures on a quantitative basis. Techniques have evolved from rough qualitative methods to highly accurate quantitative methods coupled with 3-D numerical analysis. The earliest, primitive method for "seeing" what was inside a rock was multiple sectioning to produce a series of image slices, which typically destroyed the sample being analyzed. Another destructive method was developed to give more detailed quantitative information by forming plastic casts of internal voids in sedimentary and volcanic rocks: the voids were filled with plastic and the rock was dissolved away with HF to reveal casts of the internal vesicles. Later, new approaches to stereology were developed to extract 3D information from 2D cross-sectional images. This has long been possible for spheres because the probability distribution for cutting a sphere along any small circle is known analytically (the greatest probability is near the equator). However, large numbers of objects are required for statistical validity, and geomaterials are seldom spherical, so crystals, vesicles, and other inclusions need a more sophisticated approach. Consequently, probability distributions were developed using numerical techniques for rectangular solids and various ellipsoids so that stereological techniques could be applied to these shapes. The "holy grail" has always been to obtain 3D quantitative images non-destructively. A key method is Computed X-ray Tomography (CXT), in which attenuation of X-rays is recorded as a function of angular position in a cylindrical sample, providing a 2D "slice" of the interior. When a series of these "slices" is stacked (in increments equivalent to the resolution of the X-ray, to make cubic voxels), a 3D image results with quantitative information regarding internal structure, particle/void volumes, nearest neighbors, coordination numbers, preferred orientations, etc. CXT can be done at three basic levels of resolution, with "normal" X-rays providing tens of microns resolution, synchrotron sources providing single to a few microns, and emerging XuM techniques providing a practical 300 nm and a theoretical 60 nm. The main challenges in CXT imaging have been segmentation, which delineates material boundaries, and object recognition (registration), in which the individual objects within a material are identified. The former is critical in quantifying object volume, while the latter is essential for preventing the false appearance of individual objects as a continuous structure. New techniques are now being developed to enhance resolution and provide more detailed analysis without the complex infrastructure needed for CXT. One such method is Laser Scanning Confocal Microscopy, in which a laser is reflected from individual interior surfaces of a fluorescing material, providing a series of sharp images of internal slices with quantitative information available, just as in X-ray tomography, after "z-stacking" of planes of pixels. Another novel approach is the use of Stereo Scanning Electron Microscopy to create digital elevation models of 3D surficial features such as partial bubble margins on the surfaces of fine volcanic ash particles. As other novel techniques emerge, new opportunities will be presented to the geological research community to obtain ever more detailed and accurate information regarding the interior structure of geomaterials.
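
    A minimal sketch of the object-recognition step described above, assuming segmentation is already done and using a random array as a stand-in for real CXT data: connected-component labeling separates individual objects (preventing touching voxels from masquerading as one continuous structure) so that their volumes can be measured.

        import numpy as np
        from scipy import ndimage

        # Hypothetical segmented CXT volume: True = vesicle (void), voxel edge 2 um
        voxels = np.random.rand(100, 100, 100) > 0.97
        voxel_volume_um3 = 2.0 ** 3

        # 26-connected labeling assigns each vesicle its own integer label
        labels, n = ndimage.label(voxels, structure=np.ones((3, 3, 3)))
        volumes = ndimage.sum(voxels, labels, index=range(1, n + 1)) * voxel_volume_um3
        print(f"{n} vesicles, median volume {np.median(volumes):.1f} um^3")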

  10. Observing Clonal Dynamics across Spatiotemporal Axes: A Prelude to Quantitative Fitness Models for Cancer.

    PubMed

    McPherson, Andrew W; Chan, Fong Chun; Shah, Sohrab P

    2018-02-01

    The ability to accurately model evolutionary dynamics in cancer would allow for prediction of progression and response to therapy. As a prelude to quantitative understanding of evolutionary dynamics, researchers must gather observations of in vivo tumor evolution. High-throughput genome sequencing now provides the means to profile the mutational content of evolving tumor clones from patient biopsies. Together with the development of models of tumor evolution, reconstructing evolutionary histories of individual tumors generates hypotheses about the dynamics of evolution that produced the observed clones. In this review, we provide a brief overview of the concepts involved in predicting evolutionary histories, and provide a workflow based on bulk and targeted-genome sequencing. We then describe the application of this workflow to time series data obtained for transformed and progressed follicular lymphomas (FL), and contrast the observed evolutionary dynamics between these two subtypes. We next describe results from a spatial sampling study of high-grade serous (HGS) ovarian cancer, propose mechanisms of disease spread based on the observed clonal mixtures, and provide examples of diversification through subclonal acquisition of driver mutations and convergent evolution. Finally, we state implications of the techniques discussed in this review as a necessary but insufficient step on the path to predictive modelling of disease dynamics. Copyright © 2018 Cold Spring Harbor Laboratory Press; all rights reserved.

  11. Quantification of the level of crowdedness for pedestrian movements

    NASA Astrophysics Data System (ADS)

    Duives, Dorine C.; Daamen, Winnie; Hoogendoorn, Serge P.

    2015-06-01

    Within the realm of pedestrian research, numerous measures have been proposed to estimate the level of crowdedness experienced by pedestrians. However, within the field of pedestrian traffic flow modelling there does not seem to be consensus on which of these measures performs best. This paper shows that the shape of, and scatter within, the resulting fundamental diagrams differ considerably depending on the measure of crowdedness used. The main aim of the paper is to establish the advantages and disadvantages of the currently existing measures of crowdedness in order to evaluate which measures provide both accurate and consistent results. The assessment is based not only on the theoretical differences, but also on the qualitative and quantitative differences between the fundamental diagrams computed using the crowdedness measures on one and the same data set. The qualitative and quantitative functioning of the classical Grid-based measure is compared with the X-T measure, an Exponentially Weighted Distance measure, and a Voronoi-Diagram measure. The consistency of relating these crowdedness measures to the two macroscopic flow variables (velocity and flow), their computational efficiency, and the amount of scatter present within the resulting fundamental diagrams are reviewed. It is found that the Voronoi-Diagram and X-T measures are the most efficient and consistent measures of crowdedness.
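
    For concreteness, a minimal sketch of the Voronoi-Diagram idea follows: each pedestrian's local density is the inverse of the area of its Voronoi cell. The positions are hypothetical, and boundary pedestrians with unbounded cells, which a full implementation would clip to the walkable area, are simply skipped here.

        import numpy as np
        from scipy.spatial import Voronoi

        def voronoi_densities(positions):
            """Per-pedestrian density (1 / Voronoi-cell area), interior pedestrians only.

            positions : (N, 2) array of pedestrian coordinates in metres.
            Unbounded boundary cells are returned as NaN in this sketch.
            """
            vor = Voronoi(positions)
            densities = np.full(len(positions), np.nan)
            for i, region_idx in enumerate(vor.point_region):
                region = vor.regions[region_idx]
                if -1 in region or len(region) == 0:
                    continue                          # unbounded cell at the boundary
                poly = vor.vertices[region]
                x, y = poly[:, 0], poly[:, 1]
                # Shoelace formula for the polygon area
                area = 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))
                densities[i] = 1.0 / area             # pedestrians per square metre
            return densities

        print(voronoi_densities(np.random.rand(50, 2) * 10))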

  12. Metabolite profiling of soy sauce using gas chromatography with time-of-flight mass spectrometry and analysis of correlation with quantitative descriptive analysis.

    PubMed

    Yamamoto, Shinya; Bamba, Takeshi; Sano, Atsushi; Kodama, Yukako; Imamura, Miho; Obata, Akio; Fukusaki, Eiichiro

    2012-08-01

    Soy sauces, produced from different ingredients and brewing processes, vary in composition and quality. It is therefore extremely important to understand the relationship between the components and the sensory attributes of soy sauces. The current study sought to perform metabolite profiling in order to devise a method for assessing the attributes of soy sauces. Quantitative descriptive analysis (QDA) data for 24 soy sauce samples were obtained from well-selected sensory panelists. Metabolite profiles, primarily of low-molecular-weight hydrophilic components, were acquired by gas chromatography with time-of-flight mass spectrometry (GC/TOFMS). QDA data for soy sauces were accurately predicted by projection to latent structures (PLS), with metabolite profiles serving as explanatory variables and the QDA data set serving as the response variable. Moreover, analysis of the correlation between the metabolite profile and QDA data matrices identified compounds that were highly correlated with the QDA data; in particular, sugars were indicated to be important components of soy sauce taste. This new approach, which combines metabolite profiling with QDA, is applicable to the analysis of sensory attributes of food arising from complex interactions between its components, and is effective for finding compounds that contribute strongly to those attributes. Copyright © 2012 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  13. PET guidance for liver radiofrequency ablation: an evaluation

    NASA Astrophysics Data System (ADS)

    Lei, Peng; Dandekar, Omkar; Mahmoud, Faaiza; Widlus, David; Malloy, Patrick; Shekhar, Raj

    2007-03-01

    Radiofrequency ablation (RFA) is emerging as the primary mode of treatment of unresectable malignant liver tumors. With current intraoperative imaging modalities, quick, precise, and complete localization of lesions remains a challenge for liver RFA. Fusion of intraoperative CT and preoperative PET images, which relies on PET and CT registration, can produce a new image with complementary metabolic and anatomic data and thus greatly improve targeting accuracy. Unlike neurological images, abdominal images aligned by a combined PET/CT scanner are prone to errors as a result of large nonrigid misalignment. Our normalized mutual information-based 3D nonrigid registration technique has proven powerful for whole-body PET and CT registration, and we demonstrate here that it is capable of acceptable abdominal PET and CT registration as well. In five clinical cases, both qualitative and quantitative validation showed that the registration is robust and accurate. Quantitative accuracy was evaluated by comparing the algorithm's results against those of clinical experts; the registration error is well within the allowable margin for liver RFA. Study findings show the technique's potential to enable the augmentation of intraoperative CT with preoperative PET to reduce procedure time, avoid repeated procedures, provide clinicians with complementary functional/anatomic maps, avoid missing dispersed small lesions, and improve the accuracy of tumor targeting in liver RFA.
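
    The similarity measure driving such a registration can be sketched compactly. The snippet below computes the Studholme form of normalized mutual information from a joint intensity histogram; the random arrays stand in for already-resampled PET and CT volumes, and the bin count and NMI variant are illustrative assumptions.

        import numpy as np

        def normalized_mutual_information(a, b, bins=64):
            """Studholme NMI, (H(A) + H(B)) / H(A, B), from a joint histogram.

            a, b : intensity arrays of the two images, already resampled onto the
            same grid (any shape; flattened here). Higher NMI = better alignment.
            """
            joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
            pxy = joint / joint.sum()
            px, py = pxy.sum(axis=1), pxy.sum(axis=0)

            def entropy(p):
                p = p[p > 0]
                return -np.sum(p * np.log(p))

            return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

        ct = np.random.rand(64, 64, 16)
        pet = 0.7 * ct + 0.3 * np.random.rand(64, 64, 16)   # hypothetical stand-ins
        print(f"NMI: {normalized_mutual_information(ct, pet):.3f}")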

  14. UGT2B17 and SULT1A1 gene copy number variation (CNV) detection by LabChip microfluidic technology.

    PubMed

    Gaedigk, Andrea; Gaedigk, Roger; Leeder, J Steven

    2010-05-01

    Gene copy number variations (CNVs) are increasingly recognized to play important roles in the expression of genes and hence in their respective enzymatic activities. This has been demonstrated for a number of drug-metabolizing genes, such as UDP-glucuronosyltransferase 2B17 (UGT2B17) and sulfotransferase 1A1 (SULT1A1), which are subject to genetic heterogeneity, including CNV. Quantitative assays to assess gene copy number are therefore becoming an integral part of accurate genotype assessment and phenotype prediction. In this study, we evaluated a microfluidics-based system, the Bio-Rad Experion, to determine the power and utility of this platform to detect UGT2B17 and SULT1A1 CNV in DNA samples derived from blood and tissue. UGT2B17 is known to present with 0, 1, or 2 gene copies and SULT1A1 with up to 5. Distinct clustering (p<0.001) into copy number groups was achieved for both genes. DNA samples derived from blood exhibited less inter-run variability than DNA samples obtained from liver tissue. This variability may be caused by tissue-specific PCR inhibitors, as it could be overcome by using DNA from another tissue or DNA that had undergone whole-genome amplification. The method produced results comparable to those reported for other quantitative test platforms.

  15. Printed Flexible Plastic Microchip for Viral Load Measurement through Quantitative Detection of Viruses in Plasma and Saliva

    PubMed Central

    Shafiee, Hadi; Kanakasabapathy, Manoj Kumar; Juillard, Franceline; Keser, Mert; Sadasivam, Magesh; Yuksekkaya, Mehmet; Hanhauser, Emily; Henrich, Timothy J.; Kuritzkes, Daniel R.; Kaye, Kenneth M.; Demirci, Utkan

    2015-01-01

    We report a biosensing platform for viral load measurement through electrical sensing of viruses on a flexible plastic microchip with printed electrodes. Point-of-care (POC) viral load measurement is of paramount importance with significant impact on a broad range of applications, including infectious disease diagnostics and treatment monitoring specifically in resource-constrained settings. Here, we present a broadly applicable and inexpensive biosensing technology for accurate quantification of bioagents, including viruses in biological samples, such as plasma and artificial saliva, at clinically relevant concentrations. Our microchip fabrication is simple and mass-producible as we print microelectrodes on flexible plastic substrates using conductive inks. We evaluated the microchip technology by detecting and quantifying multiple Human Immunodeficiency Virus (HIV) subtypes (A, B, C, D, E, G, and panel), Epstein-Barr Virus (EBV), and Kaposi’s Sarcoma-associated Herpes Virus (KSHV) in a fingerprick volume (50 µL) of PBS, plasma, and artificial saliva samples for a broad range of virus concentrations between 102 copies/mL and 107 copies/mL. We have also evaluated the microchip platform with discarded, de-identified HIV-infected patient samples by comparing our microchip viral load measurement results with reverse transcriptase-quantitative polymerase chain reaction (RT-qPCR) as the gold standard method using Bland-Altman Analysis. PMID:26046668
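
    For readers unfamiliar with the agreement analysis mentioned above, a minimal Bland-Altman sketch follows, computing the bias and 95% limits of agreement between paired measurements. The data are hypothetical log10 viral loads, not the study's results.

        import numpy as np

        def bland_altman(method_a, method_b):
            """Bias and 95% limits of agreement between two measurement methods.

            Inputs are paired measurements, here imagined as log10 viral loads
            from the microchip (a) and RT-qPCR (b).
            Returns (bias, lower LoA, upper LoA).
            """
            diff = np.asarray(method_a) - np.asarray(method_b)
            bias, sd = diff.mean(), diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        qpcr = np.random.uniform(2, 7, 40)                  # hypothetical log10 VL
        chip = qpcr + np.random.normal(0.1, 0.3, 40)
        bias, lo, hi = bland_altman(chip, qpcr)
        print(f"bias {bias:+.2f} log10, LoA [{lo:+.2f}, {hi:+.2f}]")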

  16. Simultaneous quantitation of oxidized and reduced glutathione via LC-MS/MS: An insight into the redox state of hematopoietic stem cells.

    PubMed

    Carroll, Dustin; Howard, Diana; Zhu, Haining; Paumi, Christian M; Vore, Mary; Bondada, Subbarao; Liang, Ying; Wang, Chi; St Clair, Daret K

    2016-08-01

    Cellular redox balance plays a significant role in the regulation of hematopoietic stem-progenitor cell (HSC/MPP) self-renewal and differentiation. Unregulated changes in cellular redox homeostasis are associated with the onset of most hematological disorders. However, accurate measurement of the redox state in stem cells is difficult because of the scarcity of HSC/MPPs. Glutathione (GSH) constitutes the most abundant pool of cellular antioxidants; thus, GSH metabolism may play a critical role in hematological disease onset and progression. A major limitation to studying GSH metabolism in HSC/MPPs has been the inability to quantitatively measure GSH concentrations in small numbers of HSC/MPPs. Current methods used to measure GSH levels not only require large numbers of cells but also rely on chemical/structural modification or enzymatic recycling of GSH, and are therefore likely to measure only total glutathione content accurately. Here, we describe the validation of a sensitive method for the direct and simultaneous quantitation of both oxidized and reduced GSH via liquid chromatography followed by tandem mass spectrometry (LC-MS/MS) in HSC/MPPs isolated from bone marrow. The lower limit of quantitation (LLOQ) was determined to be 5.0 ng/mL for GSH and 1.0 ng/mL for GSSG, with lower limits of detection at 0.5 ng/mL for both glutathione species. Standard addition analysis utilizing mouse bone marrow shows that this method is both sensitive and accurate, with reproducible analyte recovery. The method combines a simple extraction with a high-throughput analysis platform, allowing efficient determination of GSH/GSSG concentrations in mouse HSC/MPP populations, in cell cultures under chemotherapeutic treatment, and in normal and leukemic human patient samples. The data implicate the importance of modulation of the GSH/GSSG redox couple in stem cell-related diseases. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Distinguishing ferritin from apoferritin using magnetic force microscopy

    NASA Astrophysics Data System (ADS)

    Nocera, Tanya M.; Zeng, Yuzhi; Agarwal, Gunjan

    2014-11-01

    Estimating the amount of iron-replete ferritin versus iron-deficient apoferritin is important in biomedical and nanotechnology applications. This work introduces a simple and novel approach to quantifying ferritin using magnetic force microscopy (MFM). We demonstrate how high-magnetic-moment probes enhance the magnitude of the MFM signal, enabling accurate quantitative estimation of ferritin content in ferritin/apoferritin mixtures in vitro. We envisage that MFM could be adapted to accurately determine ferritin content in protein mixtures or in small aliquots of clinical samples.

  18. Study on index system of GPS interference effect evaluation

    NASA Astrophysics Data System (ADS)

    Zhang, Kun; Zeng, Fangling; Zhao, Yuan; Zeng, Ruiqi

    2018-05-01

    Evaluating the effects of interference on satellite navigation is a key technology in navigation countermeasure research. To accurately evaluate the degree of interference and the anti-jamming ability of GPS receivers, this paper builds on existing research on navigation interference effect evaluation to construct an index system for GPS receiver effectiveness evaluation spanning four levels: signal acquisition, tracking, demodulation, and positioning/timing, and establishes a model for each index. These indexes can accurately and quantitatively describe the interference effect at every level.

  19. Genomic Quantitative Genetics to Study Evolution in the Wild.

    PubMed

    Gienapp, Phillip; Fior, Simone; Guillaume, Frédéric; Lasky, Jesse R; Sork, Victoria L; Csilléry, Katalin

    2017-12-01

    Quantitative genetic theory provides a means of estimating the evolutionary potential of natural populations. However, this approach was previously only feasible in systems where the genetic relatedness between individuals could be inferred from pedigrees or experimental crosses. The genomic revolution opened up the possibility of obtaining the realized proportion of genome shared among individuals in natural populations of virtually any species, promising (more) accurate estimates of quantitative genetic parameters. Such a 'genomic' quantitative genetics approach relies on fewer assumptions, offers greater methodological flexibility, and is thus expected to greatly enhance our understanding of evolution in natural populations, for example, in the context of adaptation to environmental change, eco-evolutionary dynamics, and biodiversity conservation. Copyright © 2017 Elsevier Ltd. All rights reserved.
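
    The "realized proportion of genome shared" is commonly estimated with a genomic relatedness matrix (GRM). A minimal sketch of one standard estimator (VanRaden 2008) on hypothetical SNP data follows; the studies this review covers may use other estimators.

        import numpy as np

        def vanraden_grm(genotypes):
            """Genomic relatedness matrix from 0/1/2 minor-allele counts.

            genotypes : (n_individuals, n_snps) array. G = Z Z' / (2 * sum p(1-p)),
            with Z the genotype matrix centred on twice the allele frequencies
            (VanRaden 2008); one common estimator among several in use.
            """
            p = genotypes.mean(axis=0) / 2.0                 # allele frequencies
            z = genotypes - 2.0 * p
            return (z @ z.T) / (2.0 * np.sum(p * (1.0 - p)))

        geno = np.random.binomial(2, 0.3, size=(100, 5000))  # hypothetical SNP data
        grm = vanraden_grm(geno)
        print(grm.shape, grm.diagonal().mean())              # diagonal ~ 1 on average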

  20. Assay Development for the Determination of Phosphorylation Stoichiometry using MRM methods with and without Phosphatase Treatment: Application to Breast Cancer Signaling Pathways

    PubMed Central

    Domanski, Dominik; Murphy, Leigh C.; Borchers, Christoph H.

    2010-01-01

    We have developed a phosphatase-based phosphopeptide quantitation (PPQ) method for determining phosphorylation stoichiometry in complex biological samples. This PPQ method is based on enzymatic dephosphorylation, combined with specific and accurate peptide identification and quantification by multiple reaction monitoring (MRM) detection with stable-isotope-labeled standard peptides. In contrast with the classical MRM methods for quantitating phosphorylation stoichiometry, the PPQ-MRM method needs only one non-phosphorylated SIS (stable isotope-coded standard) peptide and two analyses (one of the untreated and one of the phosphatase-treated sample), from which the expression level, the modification level, and the percent phosphorylation can be accurately determined. In this manuscript, we compare the PPQ-MRM method with an MRM method without phosphatase, and demonstrate the application of these methods to the detection and quantitation of phosphorylation of the classic phosphorylated breast cancer biomarkers (ERα and HER2), and of phosphorylated RAF and ERK1, which also contain phosphorylation sites with important biological implications. Using synthetic peptides spiked into a complex protein digest, we were able to use our PPQ-MRM method to accurately determine the total phosphorylation stoichiometry on specific peptides, as well as the absolute amounts of the peptide and phosphopeptide present. Analyses of samples containing ERα protein revealed that PPQ-MRM is capable of determining phosphorylation stoichiometry in proteins from cell lines, and is in good agreement with determinations obtained using the direct MRM approach in terms of phosphorylation and total protein amount. PMID:20524616
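
    The stoichiometry arithmetic behind the PPQ approach is simple: the untreated analysis quantifies only the unmodified peptide, while the phosphatase-treated analysis quantifies the total. A minimal sketch with hypothetical numbers:

        def ppq_stoichiometry(untreated_fmol, treated_fmol):
            """Percent phosphorylation from the PPQ-MRM pair of analyses.

            untreated_fmol : non-phosphopeptide quantified without phosphatase
                             (unmodified fraction only)
            treated_fmol   : same peptide after dephosphorylation (total peptide)
            """
            phospho_fmol = treated_fmol - untreated_fmol
            return 100.0 * phospho_fmol / treated_fmol

        # Hypothetical example: 120 fmol total, 90 fmol unmodified -> 25% phosphorylated
        print(f"{ppq_stoichiometry(90.0, 120.0):.1f} %")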

  1. Registration of knee joint surfaces for the in vivo study of joint injuries based on magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Cheng, Rita W. T.; Habib, Ayman F.; Frayne, Richard; Ronsky, Janet L.

    2006-03-01

    In vivo quantitative assessments of joint conditions and health status can help to increase understanding of the pathology of osteoarthritis, a degenerative joint disease that affects a large population each year. Magnetic resonance imaging (MRI) provides a non-invasive and accurate means to assess and monitor joint properties, and has become widely used for diagnosis and biomechanics studies. Quantitative analyses and comparisons of MR datasets require accurate alignment of anatomical structures; image registration is thus a necessary procedure for these applications. This research focuses on developing a registration technique for MR knee joint surfaces to allow quantitative study of joint injuries and health status. It introduces the novel idea of translating techniques originally developed for geographic data in the field of photogrammetry and remote sensing to register 3D MR data. The proposed algorithm works with surfaces represented by randomly distributed points, with no requirement of known correspondences. The algorithm performs matching locally by identifying corresponding surface elements, and solves for the transformation parameters relating the surfaces by minimizing the normal distances between them. This technique was used in three applications: 1) registering temporal MR data to verify the feasibility of the algorithm for monitoring disease, 2) quantifying patellar movement with respect to the femur based on the transformation parameters, and 3) quantifying changes in contact area locations between the patellar and femoral cartilage at different knee flexion angles. The results indicate accurate registration, and the proposed algorithm can be applied to the in vivo study of joint injuries with MRI.
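
    The paper's photogrammetric formulation is not reproduced here, but the underlying idea of solving for transformation parameters by minimizing normal distances can be illustrated with a standard linearized point-to-plane least-squares step, under the assumption that correspondences and surface normals are already available:

        import numpy as np

        def point_to_plane_step(src, dst, dst_normals):
            """One linearized least-squares step minimizing normal distances.

            src, dst    : (N, 3) matched surface points (correspondence assumed
                          already established, e.g. by nearest-neighbour search)
            dst_normals : (N, 3) unit normals at the destination points
            Returns a small rotation vector w and translation t so that
            src' = src + cross(w, src) + t reduces sum(((src' - dst) . n)^2).
            """
            a = np.hstack([np.cross(src, dst_normals), dst_normals])   # (N, 6)
            b = np.einsum('ij,ij->i', dst - src, dst_normals)          # (N,)
            x, *_ = np.linalg.lstsq(a, b, rcond=None)
            return x[:3], x[3:]

        src = np.random.rand(200, 3)
        normals = np.tile([0.0, 0.0, 1.0], (200, 1))
        dst = src + [0.0, 0.0, 0.05]              # hypothetical 0.05 offset in z
        w, t = point_to_plane_step(src, dst, normals)
        print(w.round(3), t.round(3))             # t ~ [0, 0, 0.05]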

  2. Ocean forecasting in terrain-following coordinates: Formulation and skill assessment of the Regional Ocean Modeling System

    USGS Publications Warehouse

    Haidvogel, D.B.; Arango, H.; Budgell, W.P.; Cornuelle, B.D.; Curchitser, E.; Di Lorenzo, E.; Fennel, K.; Geyer, W.R.; Hermann, A.J.; Lanerolle, L.; Levin, J.; McWilliams, J.C.; Miller, A.J.; Moore, A.M.; Powell, T.M.; Shchepetkin, A.F.; Sherwood, C.R.; Signell, R.P.; Warner, J.C.; Wilkin, J.

    2008-01-01

    Systematic improvements in algorithmic design of regional ocean circulation models have led to significant enhancement in simulation ability across a wide range of space/time scales and marine system types. As an example, we briefly review the Regional Ocean Modeling System, a member of a general class of three-dimensional, free-surface, terrain-following numerical models. Noteworthy characteristics of the ROMS computational kernel include: consistent temporal averaging of the barotropic mode to guarantee both exact conservation and constancy preservation properties for tracers; redefined barotropic pressure-gradient terms to account for local variations in the density field; vertical interpolation performed using conservative parabolic splines; and higher-order, quasi-monotone advection algorithms. Examples of quantitative skill assessment are shown for a tidally driven estuary, an ice-covered high-latitude sea, a wind- and buoyancy-forced continental shelf, and a mid-latitude ocean basin. The combination of moderate-order spatial approximations, enhanced conservation properties, and quasi-monotone advection produces both more robust and accurate, and less diffusive, solutions than those produced in earlier terrain-following ocean models. Together with advanced methods of data assimilation and novel observing system technologies, these capabilities constitute the necessary ingredients for multi-purpose regional ocean prediction systems. 

  3. Depth-time interpolation of feature trends extracted from mobile microelectrode data with kernel functions.

    PubMed

    Wong, Stephen; Hargreaves, Eric L; Baltuch, Gordon H; Jaggi, Jurg L; Danish, Shabbar F

    2012-01-01

    Microelectrode recording (MER) is necessary for precision localization of target structures such as the subthalamic nucleus during deep brain stimulation (DBS) surgery. Attempts to automate this process have produced quantitative temporal trends (feature activity vs. time) extracted from mobile MER data. Our goal was to evaluate computational methods of generating spatial profiles (feature activity vs. depth) from temporal trends that would decouple automated MER localization from the clinical procedure and enhance functional localization in DBS surgery. We evaluated two methods of interpolation (standard vs. kernel) that generated spatial profiles from temporal trends. We compared interpolated spatial profiles to true spatial profiles that were calculated with depth windows, using correlation coefficient analysis. Excellent approximation of true spatial profiles is achieved by interpolation. Kernel-interpolated spatial profiles produced superior correlation coefficient values at optimal kernel widths (r = 0.932-0.940) compared to standard interpolation (r = 0.891). The choice of kernel function and kernel width resulted in trade-offs in smoothing and resolution. Interpolation of feature activity to create spatial profiles from temporal trends is accurate and can standardize and facilitate MER functional localization of subcortical structures. The methods are computationally efficient, enhancing localization without imposing additional constraints on the MER clinical procedure during DBS surgery. Copyright © 2012 S. Karger AG, Basel.
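
    As a sketch of the kernel idea, a standard Nadaraya-Watson estimator on synthetic data (not necessarily the authors' exact kernel), the feature activity recorded along the electrode track is smoothed onto a depth grid, with the kernel width setting the smoothing/resolution trade-off:

        import numpy as np

        def kernel_spatial_profile(depths, activity, depth_grid, width_mm):
            """Gaussian-kernel (Nadaraya-Watson) estimate of feature activity vs depth.

            depths     : electrode depth at each recording sample (mm)
            activity   : feature activity at each sample (e.g., spike power)
            depth_grid : depths at which to evaluate the spatial profile
            width_mm   : kernel width; larger = smoother, lower resolution
            """
            d = depth_grid[:, None] - depths[None, :]
            w = np.exp(-0.5 * (d / width_mm) ** 2)
            return (w * activity).sum(axis=1) / w.sum(axis=1)

        depths = np.linspace(-10, 5, 300)                       # hypothetical track
        activity = np.exp(-((depths + 2) / 1.5) ** 2) + np.random.normal(0, 0.1, 300)
        grid = np.linspace(-10, 5, 100)
        profile = kernel_spatial_profile(depths, activity, grid, width_mm=0.3)
        print(grid[profile.argmax()])                           # ~ -2 mm (activity peak)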

  4. Writing with voice: an investigation of the use of a voice recognition system as a writing aid for a man with aphasia.

    PubMed

    Bruce, Carolyn; Edmundson, Anne; Coleman, Michael

    2003-01-01

    People with aphasia may experience difficulties that prevent them from demonstrating in writing what they know and can produce orally. Voice recognition systems that allow the user to speak into a microphone and see their words appear on a computer screen have the potential to assist written communication. This study investigated whether a man with fluent aphasia could learn to use Dragon NaturallySpeaking to write. A single case study of a man with acquired writing difficulties is reported. A detailed account is provided of the stages involved in teaching him to use the software. The therapy tasks carried out to develop his functional use of the system are then described. Outcomes included the percentage of words accurately recognized by the system over time, the quantitative and qualitative changes in written texts produced with and without the use of the speech-recognition system, and the functional benefits the man described. The treatment programme was successful and resulted in a marked improvement in the subject's written work. It also had effects in the functional life domain as the subject could use writing for communication purposes. The results suggest that the technology might benefit others with acquired writing difficulties.

  5. A new standardized method for quantification of humic and fulvic acids in humic ores and commercial products.

    PubMed

    Lamar, Richard T; Olk, Daniel C; Mayhew, Lawrence; Bloom, Paul R

    2014-01-01

    Increased use of humic substances in agriculture has generated intense interest among producers, consumers, and regulators in an accurate and reliable method to quantify humic acid (HA) and fulvic acid (FA) in raw ores and products. Here we present a thoroughly validated, standardized method for determining HA and FA contents in raw humate ores and in the solid and liquid products produced from them. The procedures for preparing HA and FA were adapted from the guidelines of the International Humic Substances Society and involve alkaline extraction followed by acidification to separate HA from the fulvic fraction, after which FA is separated from the fulvic fraction by adsorption on a nonionic macroporous acrylic ester resin at acid pH. The method differs from previous ones in that it determines HA and FA concentrations gravimetrically on an ash-free basis. Critical steps in the method, e.g., initial test portion mass, test portion to extract volume ratio, extraction time, and acidification of the alkaline extract, were optimized for maximum and consistent recovery of HA and FA. The method detection limits for HA and FA were 4.62 and 4.8 mg/L, respectively. The method quantitation limits for HA and FA were 14.7 and 15.3 mg/L, respectively.

  6. Fast and accurate mock catalogue generation for low-mass galaxies

    NASA Astrophysics Data System (ADS)

    Koda, Jun; Blake, Chris; Beutler, Florian; Kazin, Eyal; Marin, Felipe

    2016-06-01

    We present an accurate and fast framework for generating mock catalogues including low-mass haloes, based on an implementation of the COmoving Lagrangian Acceleration (COLA) technique. Multiple realisations of mock catalogues are crucial for analyses of large-scale structure, but conventional N-body simulations are too computationally expensive for the production of thousands of realisations. We show that COLA simulations can produce accurate mock catalogues with moderate computational resources for low- to intermediate-mass galaxies in 10^12 M⊙ haloes, both in real and redshift space. COLA simulations have accurate peculiar velocities, without systematic errors in the velocity power spectra for k ≤ 0.15 h Mpc^-1, and with only 3 per cent error for k ≤ 0.2 h Mpc^-1. We use COLA with 10 time steps and a Halo Occupation Distribution to produce 600 mock galaxy catalogues of the WiggleZ Dark Energy Survey. Our parallelized code for efficient generation of accurate halo catalogues is publicly available at github.com/junkoda/cola_halo.

  8. The precise and accurate production of millimetric water droplets using a superhydrophobic generating apparatus

    NASA Astrophysics Data System (ADS)

    Wood, Michael J.; Aristizabal, Felipe; Coady, Matthew; Nielson, Kent; Ragogna, Paul J.; Kietzig, Anne-Marie

    2018-02-01

    The production of millimetric liquid droplets is important in a wide range of applications, both in the laboratory and industrially. As such, much effort has been put forth to devise methods to generate these droplets on command in a manner which results in high diameter accuracy and precision, well-defined trajectories followed by successive droplets, and low oscillations in droplet shape throughout their descents. None of the currently employed methods of millimetric droplet generation described in the literature adequately addresses all of these desired droplet characteristics. The reported methods invariably involve the cohesive separation of the desired volume of liquid from the bulk supply in the same step that separates the single droplet from the solid generator. We have devised a droplet generation device which separates the desired volume of liquid within a tee-apparatus in a step prior to the generation of the droplet, yielding high accuracy and precision in the diameters of the final droplets produced. Further, we have engineered a generating tip with extreme antiwetting properties, which reduces the adhesion forces between the liquid droplet and the solid tip. This has yielded the ability to produce droplets of low mass without necessitating different-diameter generating tips or the addition of surfactants to the liquid, well-defined droplet trajectories, and low oscillations in droplet volume. The trajectories and oscillations of the droplets produced have been assessed and presented quantitatively in a manner that has been lacking in the current literature.

  9. Feasibility of Using an Electrolysis Cell for Quantification of the Electrolytic Products of Water from Gravimetric Measurement.

    PubMed

    Melaku, Samuel; Gebeyehu, Zewdu; Dabke, Rajeev B

    2018-01-01

    A gravimetric method for the quantitative assessment of the products of electrolysis of water is presented. In this approach, the electrolysis cell was directly powered by 9 V batteries. Prior to electrolysis, a known amount of potassium hydrogen phthalate (KHP) was added to the cathode compartment, and an excess amount of KHCO3 was added to the anode compartment electrolyte. During electrolysis, the cathode and anode compartments produced OH-(aq) and H+(aq) ions, respectively. Electrolytically produced OH-(aq) neutralized the KHP, and the completion of this neutralization was detected by a visual indicator color change. Electrolytically produced H+(aq) reacted with HCO3-(aq), liberating CO2(g) from the anode compartment. Concurrent liberation of H2(g) and O2(g) at the cathode and anode, respectively, resulted in a decrease in the mass of the cell. The mass of the electrolysis cell was monitored; the liberation of CO2(g) had a particularly pronounced effect on the decrease in mass. The experimentally determined decrease in mass (53.7 g/Faraday) agreed with that predicted from Faraday's laws of electrolysis (53.0 g/Faraday). The efficacy of the cell was tested by quantifying the acid content in household vinegar samples. Accurate results were obtained for vinegar analysis, with a precision better than 5% in most cases. The cell offers the advantages of the coulometric method and additionally simplifies the circuitry by eliminating the use of a constant current power source or a coulometer.
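
    The predicted 53.0 g/Faraday can be checked directly from the electrode reactions: per mole of electrons, the cell loses half a mole of H2 at the cathode, a quarter mole of O2 at the anode, and one mole of CO2 released by the mole of H+ generated. A quick arithmetic check:

        # Mass lost per Faraday (one mole of electrons), checking the 53.0 g figure
        M_H2, M_O2, M_CO2 = 2.016, 32.00, 44.01   # molar masses, g/mol

        h2 = M_H2 / 2     # cathode: 2 H2O + 2 e- -> H2 + 2 OH-
        o2 = M_O2 / 4     # anode:   2 H2O -> O2 + 4 H+ + 4 e-
        co2 = M_CO2       # each H+ releases one CO2: H+ + HCO3- -> H2O + CO2

        print(f"{h2 + o2 + co2:.1f} g per Faraday")   # 53.0, matching Faraday's laws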

  10. GAS CHROMATOGRAPHIC TECHNIQUES FOR THE MEASUREMENT OF ISOPRENE IN AIR

    EPA Science Inventory

    The chapter discusses gas chromatographic techniques for measuring isoprene in air. Such measurement basically consists of three parts: (1) collection of sufficient sample volume for representative and accurate quantitation, (2) separation (if necessary) of isoprene from interfer...

  11. The contribution of synchrotron X-ray computed microtomography to understanding volcanic processes.

    PubMed

    Polacci, Margherita; Mancini, Lucia; Baker, Don R

    2010-03-01

    A series of computed microtomography experiments is reported, performed using a third-generation synchrotron radiation source on volcanic rocks from various active, hazardous volcanoes in Italy and other volcanic areas of the world. The applied technique allowed the internal structure of the investigated material to be accurately imaged at the micrometre scale, producing three-dimensional views of the samples as well as three-dimensional quantitative measurements of textural features. We focused in particular on the geometry of the vesicle (gas-filled void) network in volcanic products of both basaltic and trachytic compositions, as vesicle textures are directly linked to the dynamics of volcano degassing. This investigation provided novel insights into the modes of gas exsolution, transport, and loss in magmas that were not recognized in previous studies using solely conventional two-dimensional imaging techniques. The results of this study are important for understanding the behaviour of volcanoes and can be combined with other geoscience disciplines to forecast their future activity.

  12. Operator evolution for ab initio electric dipole transitions of 4He

    DOE PAGES

    Schuster, Micah D.; Quaglioni, Sofia; Johnson, Calvin W.; ...

    2015-07-24

    A goal of nuclear theory is to make quantitative predictions of low-energy nuclear observables starting from accurate microscopic internucleon forces. A major element of such an effort is applying unitary transformations to soften the nuclear Hamiltonian and hence accelerate the convergence of ab initio calculations as a function of the model space size. The consistent simultaneous transformation of external operators, however, has been overlooked in applications of the theory, particularly for nonscalar transitions. We study the evolution of the electric dipole operator in the framework of the similarity renormalization group method and apply the renormalized matrix elements to the calculation of the 4He total photoabsorption cross section and electric dipole polarizability. All observables are calculated within the ab initio no-core shell model. We find that, although seemingly small, the effects of evolved operators on the photoabsorption cross section are comparable in magnitude to the correction produced by including the chiral three-nucleon force and cannot be neglected.

  13. A workflow to process 3D+time microscopy images of developing organisms and reconstruct their cell lineage

    PubMed Central

    Faure, Emmanuel; Savy, Thierry; Rizzi, Barbara; Melani, Camilo; Stašová, Olga; Fabrèges, Dimitri; Špir, Róbert; Hammons, Mark; Čúnderlík, Róbert; Recher, Gaëlle; Lombardot, Benoît; Duloquin, Louise; Colin, Ingrid; Kollár, Jozef; Desnoulez, Sophie; Affaticati, Pierre; Maury, Benoît; Boyreau, Adeline; Nief, Jean-Yves; Calvat, Pascal; Vernier, Philippe; Frain, Monique; Lutfalla, Georges; Kergosien, Yannick; Suret, Pierre; Remešíková, Mariana; Doursat, René; Sarti, Alessandro; Mikula, Karol; Peyriéras, Nadine; Bourgine, Paul

    2016-01-01

    The quantitative and systematic analysis of embryonic cell dynamics from in vivo 3D+time image data sets is a major challenge at the forefront of developmental biology. Despite recent breakthroughs in the microscopy imaging of living systems, producing an accurate cell lineage tree for any developing organism remains a difficult task. We present here the BioEmergences workflow integrating all reconstruction steps from image acquisition and processing to the interactive visualization of reconstructed data. Original mathematical methods and algorithms underlie image filtering, nucleus centre detection, nucleus and membrane segmentation, and cell tracking. They are demonstrated on zebrafish, ascidian and sea urchin embryos with stained nuclei and membranes. Subsequent validation and annotations are carried out using Mov-IT, a custom-made graphical interface. Compared with eight other software tools, our workflow achieved the best lineage score. Delivered in standalone or web service mode, BioEmergences and Mov-IT offer a unique set of tools for in silico experimental embryology. PMID:26912388

  14. Awareness and Knowledge of the U.S. Public Health Service Syphilis Study at Tuskegee: Implications for Biomedical Research

    PubMed Central

    McCallum, Jan M.; Arekere, Dhananjaya M.; Green, B. Lee; Katz, Ralph V.; Rivers, Brian M.

    2007-01-01

    The purpose of this review was to collect and interpret the findings of all published qualitative or quantitative research that assessed African Americans' 1) general awareness and/or specific knowledge of the U.S. Public Health Service (USPHS) Syphilis Study at Tuskegee, and 2) attitudes towards and/or willingness to participate in biomedical research. An exhaustive review of the literature produced eight articles that fit the aforementioned selection criteria. All articles that assessed both awareness and knowledge found that familiarity with the USPHS Syphilis Study at Tuskegee did not necessarily ensure accurate knowledge of it. Four studies also found that awareness of the USPHS Syphilis Study at Tuskegee did not relate to willingness to participate in biomedical research. In addition to awareness and knowledge of the USPHS Syphilis Study at Tuskegee, published studies suggest that a broad array of structural and sociocultural factors influences minorities' willingness to participate in biomedical studies. PMID:17242526

  15. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  16. Piezo-generated charge mapping revealed through direct piezoelectric force microscopy.

    PubMed

    Gomez, A; Gich, M; Carretero-Genevrier, A; Puig, T; Obradors, X

    2017-10-24

    While piezoelectric and ferroelectric materials play a key role in many everyday applications, there are still a number of open questions related to their physics. To enhance our understanding of piezoelectrics and ferroelectrics, nanoscale characterization is essential. Here, we develop an atomic force microscopy based mode that obtains a direct quantitative analysis of the piezoelectric coefficient d33. We report nanoscale images of piezogenerated charge in a thick single crystal of periodically poled lithium niobate (PPLN), a bismuth ferrite (BiFeO3) thin film, and lead zirconate titanate (PZT), obtained by applying a force and recording the current produced by these materials. The quantification of the d33 coefficients for PPLN (14 ± 3 pC per N) and BiFeO3 (43 ± 6 pC per N) is in agreement with the values reported in the literature. Even stronger evidence of the reliability of the method is provided by an equally accurate measurement of the significantly larger d33 of PZT.

  17. Rapid cell-free forward engineering of novel genetic ring oscillators

    PubMed Central

    Niederholtmeyer, Henrike; Sun, Zachary Z; Hori, Yutaka; Yeung, Enoch; Verpoorte, Amanda; Murray, Richard M; Maerkl, Sebastian J

    2015-01-01

    While complex dynamic biological networks control gene expression in all living organisms, the forward engineering of comparable synthetic networks remains challenging. The current paradigm of characterizing synthetic networks in cells results in lengthy design-build-test cycles, minimal data collection, and poor quantitative characterization. Cell-free systems are appealing alternative environments, but it remains questionable whether biological networks behave similarly in cell-free systems and in cells. We characterized in a cell-free system the 'repressilator', a three-node synthetic oscillator. We then engineered novel three-, four-, and five-gene ring architectures, from characterization of circuit components to rapid analysis of complete networks. When implemented in cells, our novel three-node networks produced population-wide oscillations, and 95% of five-node oscillator cells oscillated for up to 72 hr. Oscillation periods in cells matched the cell-free system results for all networks tested. An alternate forward engineering paradigm using cell-free systems can thus accurately capture cellular behavior. DOI: http://dx.doi.org/10.7554/eLife.09771.001 PMID:26430766
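
    For orientation, a minimal textbook-style ring-oscillator model is sketched below (protein-only ODEs in dimensionless form; this is the generic architecture, not the parametrization of the cell-free experiments). In this symmetric form a Hill coefficient above 2 is needed for a three-node ring to oscillate:

        import numpy as np
        from scipy.integrate import solve_ivp

        def ring_repressilator(t, x, beta=10.0, n=3.0):
            """Protein-only model of a ring oscillator: each node represses the next.

            dx_i/dt = beta / (1 + x_{i-1}^n) - x_i  (dimensionless textbook form).
            """
            return beta / (1.0 + np.roll(x, 1) ** n) - x

        x0 = [1.0, 1.5, 2.0]                      # three-node ring (repressilator)
        sol = solve_ivp(ring_repressilator, (0, 100), x0, dense_output=True)
        t = np.linspace(50, 100, 1000)            # look after transients die out
        swing = np.ptp(sol.sol(t)[0])
        print(f"peak-to-peak swing of node 0: {swing:.2f}")  # large = sustained oscillation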

  18. Local heat-transfer measurements on a large, scale-model turbine blade airfoil using a composite of a heater element and liquid crystals

    NASA Technical Reports Server (NTRS)

    Hippensteele, S. A.; Russell, L. M.; Torres, F. J.

    1985-01-01

    Local heat transfer coefficients were experimentally mapped along the midchord of a five-times-scale turbine blade airfoil in a static cascade operated at room temperature over a range of Reynolds numbers. The test surface consisted of a composite of commercially available materials: a mylar sheet with a layer of cholesteric liquid crystals, which change color with temperature, and a heater sheet made of carbon-impregnated paper, which produces uniform heat flux. After the initial selection and calibration of the composite sheet, accurate, quantitative, and continuous heat transfer coefficients were mapped over the airfoil surface. The local heat transfer coefficients are presented for Reynolds numbers from 2.8 x 10^5 to 7.6 x 10^5. Comparisons are made with analytical values of heat transfer coefficients obtained from the STAN5 boundary layer code. A leading edge separation bubble was also revealed by thermal and flow visualization.

  19. An Efficient Method for the Isolation of Highly Purified RNA from Seeds for Use in Quantitative Transcriptome Analysis.

    PubMed

    Kanai, Masatake; Mano, Shoji; Nishimura, Mikio

    2017-01-11

    Plant seeds accumulate large amounts of storage reserves comprising biodegradable organic matter. Humans rely on seed storage reserves for food and as industrial materials. Gene expression profiles are powerful tools for investigating metabolic regulation in plant cells. Therefore, detailed, accurate gene expression profiles during seed development are required for crop breeding. Acquiring highly purified RNA is essential for producing these profiles. Efficient methods are needed to isolate highly purified RNA from seeds. Here, we describe a method for isolating RNA from seeds containing large amounts of oils, proteins, and polyphenols, which have inhibitory effects on high-purity RNA isolation. Our method enables highly purified RNA to be obtained from seeds without the use of phenol, chloroform, or additional processes for RNA purification. This method is applicable to Arabidopsis, rapeseed, and soybean seeds. Our method will be useful for monitoring the expression patterns of low level transcripts in developing and mature seeds.

  20. Phase-resolved acoustic radiation force optical coherence elastography

    NASA Astrophysics Data System (ADS)

    Qi, Wenjuan; Chen, Ruimin; Chou, Lidek; Liu, Gangjun; Zhang, Jun; Zhou, Qifa; Chen, Zhongping

    2012-11-01

    Many diseases involve changes in the biomechanical properties of tissue, and there is a close correlation between tissue elasticity and pathology. We report on the development of a phase-resolved acoustic radiation force optical coherence elastography (ARF-OCE) method to evaluate the elastic properties of tissue. This method utilizes chirped acoustic radiation force to produce excitation along the sample's axial direction, and it uses phase-resolved optical coherence tomography (OCT) to measure the vibration of the sample. Under 500-Hz square-wave-modulated ARF excitation, phase change maps of tissue-mimicking phantoms are generated by the ARF-OCE method, and the resulting Young's modulus ratios are correlated with a standard compression test. The results verify that this technique can measure sample elastic properties accurately, quantitatively, and efficiently. Furthermore, a three-dimensional ARF-OCE image of a human atherosclerotic coronary artery is obtained. The result indicates that our dynamic phase-resolved ARF-OCE method can delineate tissues with different mechanical properties.
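
    In phase-resolved OCT detection, the measured phase shift maps linearly to axial displacement via dz = lambda0 * dphi / (4 * pi * n). A quick sketch, with typical (assumed) values for the wavelength and tissue refractive index rather than the reported system's specifications:

        import numpy as np

        def phase_to_displacement_nm(dphi_rad, center_wavelength_nm=1310.0, n_tissue=1.38):
            """Axial displacement from a phase-resolved OCT phase shift.

            dz = lambda0 * dphi / (4 * pi * n); wavelength and refractive index
            here are typical assumed values, not those of the reported system.
            """
            return center_wavelength_nm * dphi_rad / (4.0 * np.pi * n_tissue)

        print(f"{phase_to_displacement_nm(0.5):.1f} nm")   # ~37.8 nm for a 0.5 rad shift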

  1. Smartphone-based portable wireless optical system for the detection of target analytes.

    PubMed

    Gautam, Shreedhar; Batule, Bhagwan S; Kim, Hyo Yong; Park, Ki Soo; Park, Hyun Gyu

    2017-02-01

    Rapid and accurate on-site wireless measurement of hazardous molecules or biomarkers is one of the biggest challenges in nanobiotechnology. A novel smartphone-based Portable and Wireless Optical System (PAWS) for rapid, quantitative, and on-site analysis of target analytes is described. As a proof-of-concept, we employed gold nanoparticles (GNP) and an enzyme, horseradish peroxidase (HRP), to generate colorimetric signals in response to two model target molecules, melamine and hydrogen peroxide, respectively. The colorimetric signal produced by the presence of the target molecules is converted to an electrical signal by the inbuilt electronic circuit of the device. The converted electrical signal is then measured wirelessly via a multimeter and relayed to the smartphone, which processes the data and displays the results, including the analyte concentration and its significance. This handheld device has great potential as a programmable and miniaturized platform to achieve rapid and on-site detection of various analytes in a point-of-care testing (POCT) manner. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Thyroid function and cold acclimation in the hamster, Mesocricetus auratus

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tomasi, T.E.; Horwitz, B.A.

    1987-02-01

    Basal metabolic rate (BMR), thyroxine utilization rate (T4U), and triiodothyronine utilization rate (T3U) were measured in cold-acclimated (CA) and room temperature-acclimated (RA) male golden hamsters, Mesocricetus auratus. Hormone utilization rates were calculated via the plasma disappearance technique, using ¹²⁵I-labeled hormones and measuring serum hormone levels via radioimmunoassay. BMR showed a significant 28% increase with cold acclimation. The same cold exposure also produced a 32% increase in T4U and a 204% increase in T3U. The much greater increase in T3U implies that previous assessments may have underestimated the relationship between cold acclimation and thyroid function, and that cold exposure induces both quantitative and qualitative changes in thyroid function. It is concluded that in the cold-acclimated state, T3U more accurately reflects thyroid function than does T4U. A mechanism for the cold-induced change in BMR is proposed.

  3. Quantification and characterisation of fatty acid methyl esters in microalgae: Comparison of pretreatment and purification methods.

    PubMed

    Lage, Sandra; Gentili, Francesco G

    2018-06-01

    A systematic qualitative and quantitative analysis of fatty acid methyl esters (FAMEs) is crucial for microalgae species selection for biodiesel production. The aim of this study is to identify the best method to assess microalgae FAMEs composition and content. A single-step method was tested with and without purification steps, that is, separation of lipid classes by thin-layer chromatography (TLC) or solid-phase extraction (SPE). The efficiency of a direct transesterification method was also evaluated. Additionally, the yield of the FAMEs and the profiles of the microalgae samples with different pretreatments (boiled in isopropanol, freezing, oven-dried and freeze-dried) were compared. The application of a purification step after lipid extraction proved to be essential for an accurate FAMEs characterisation. The purification methods, which included TLC and SPE, provided superior results compared to not purifying the samples. Freeze-dried microalgae produced the lowest FAMEs yield. However, FAMEs profiles were generally equivalent among the pretreatments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Physically-based in silico light sheet microscopy for visualizing fluorescent brain models

    PubMed Central

    2015-01-01

    Background: We present a physically-based computational model of the light sheet fluorescence microscope (LSFM). Based on Monte Carlo ray tracing and geometric optics, our method simulates the operational aspects and image formation process of the LSFM. This simulated, in silico LSFM creates synthetic images of digital fluorescent specimens that can resemble those generated by a real LSFM, as opposed to established visualization methods that produce merely visually plausible images. We also propose an accurate fluorescence rendering model which takes into account the intrinsic characteristics of fluorescent dyes to simulate light interaction with fluorescent biological specimens. Results: We demonstrate first results of our visualization pipeline applied to a simplified brain tissue model reconstructed from the somatosensory cortex of a young rat. The modeling aspects of the LSFM units are qualitatively analysed, and the results of the fluorescence model were quantitatively validated against the fluorescence brightness equation and characteristic emission spectra of different fluorescent dyes. AMS subject classification: Modelling and simulation. PMID:26329404

  5. Awareness and knowledge of the U.S. Public Health Service syphilis study at Tuskegee: implications for biomedical research.

    PubMed

    McCallum, Jan M; Arekere, Dhananjaya M; Green, B Lee; Katz, Ralph V; Rivers, Brian M

    2006-11-01

    The purpose of this review was to collect and interpret the findings of all published qualitative or quantitative research that assessed African Americans' 1) general awareness and/or specific knowledge of the U.S. Public Health Service (USPHS) Syphilis Study at Tuskegee, and 2) attitudes towards and/or willingness to participate in biomedical research. An exhaustive review of the literature produced eight articles that fit the aforementioned selection criteria. All articles that assessed both awareness and knowledge found that familiarity with the USPHS Syphilis Study at Tuskegee did not necessarily ensure accurate knowledge of it. Four studies also found that awareness of the USPHS Syphilis Study at Tuskegee did not relate to willingness to participate in biomedical research. In addition to awareness and knowledge of the USPHS Syphilis Study at Tuskegee, published studies suggest that a broad array of structural and sociocultural factors influence minorities' willingness to participate in biomedical studies.

  6. MR Guided PET Image Reconstruction

    PubMed Central

    Bai, Bing; Li, Quanzheng; Leahy, Richard M.

    2013-01-01

    The resolution of PET images is limited by the physics of positron-electron annihilation and instrumentation for photon coincidence detection. Model based methods that incorporate accurate physical and statistical models have produced significant improvements in reconstructed image quality when compared to filtered backprojection reconstruction methods. However, it has often been suggested that by incorporating anatomical information, the resolution and noise properties of PET images could be improved, leading to better quantitation or lesion detection. With the recent development of combined MR-PET scanners, it is possible to collect intrinsically co-registered MR images. It is therefore now possible to routinely make use of anatomical information in PET reconstruction, provided appropriate methods are available. In this paper we review research efforts over the past 20 years to develop these methods. We discuss approaches based on the use of both Markov random field priors and joint information or entropy measures. The general framework for these methods is described and their performance and longer term potential and limitations discussed. PMID:23178087

  7. The quantitative control and matching of an optical false color composite imaging system

    NASA Astrophysics Data System (ADS)

    Zhou, Chengxian; Dai, Zixin; Pan, Xizhe; Li, Yinxi

    1993-10-01

    Design of an imaging system for optical false color composite (OFCC) capable of high-precision density-exposure time control and color balance is presented. The system provides high-quality FCC image data that can be analyzed using a quantitative calculation method. The quality requirement for each part of the image generation system is defined, and the distribution of satellite remote sensing image information is analyzed. The proposed technology makes it possible to present remote sensing image data more effectively and accurately.

  8. T2* Mapping Provides Information That Is Statistically Comparable to an Arthroscopic Evaluation of Acetabular Cartilage.

    PubMed

    Morgan, Patrick; Nissi, Mikko J; Hughes, John; Mortazavi, Shabnam; Ellerman, Jutta

    2017-07-01

    Objectives: The purpose of this study was to validate T2* mapping as an objective, noninvasive method for the prediction of acetabular cartilage damage. Methods: This is the second step in the validation of T2*. In a previous study, we established a quantitative predictive model for identifying and grading acetabular cartilage damage. In this study, the model was applied to a second cohort of 27 consecutive hips to validate the model. A clinical 3.0-T imaging protocol with T2* mapping was used. Acetabular regions of interest (ROI) were identified on magnetic resonance and graded using the previously established model. Each ROI was then graded in a blinded fashion by arthroscopy. Accurate surgical location of ROIs was facilitated with a 2-dimensional map projection of the acetabulum. A total of 459 ROIs were studied. Results: When T2* mapping and arthroscopic assessment were compared, 82% of ROIs were within 1 Beck group (of a total 6 possible) and 32% of ROIs were classified identically. Disease prediction based on receiver operating characteristic curve analysis demonstrated a sensitivity of 0.713 and a specificity of 0.804. Model stability evaluation required no significant changes to the predictive model produced in the initial study. Conclusions: These results validate that T2* mapping provides statistically comparable information regarding acetabular cartilage when compared to arthroscopy. In contrast to arthroscopy, T2* mapping is quantitative, noninvasive, and can be used in follow-up. Unlike research quantitative magnetic resonance protocols, T2* takes little time and does not require a contrast agent. This may facilitate its use in the clinical sphere.

  9. Development and Application of an MSALL-Based Approach for the Quantitative Analysis of Linear Polyethylene Glycols in Rat Plasma by Liquid Chromatography Triple-Quadrupole/Time-of-Flight Mass Spectrometry.

    PubMed

    Zhou, Xiaotong; Meng, Xiangjun; Cheng, Longmei; Su, Chong; Sun, Yantong; Sun, Lingxia; Tang, Zhaohui; Fawcett, John Paul; Yang, Yan; Gu, Jingkai

    2017-05-16

    Polyethylene glycols (PEGs) are synthetic polymers composed of repeating ethylene oxide subunits. They display excellent biocompatibility and are widely used as pharmaceutical excipients. To fully understand the biological fate of PEGs requires accurate and sensitive analytical methods for their quantitation. Application of conventional liquid chromatography-tandem mass spectrometry (LC-MS/MS) is difficult because PEGs have polydisperse molecular weights (MWs) and tend to produce multicharged ions in-source, resulting in innumerable precursor ions. As a result, multiple reaction monitoring (MRM) fails to scan all ion pairs, so information on the fate of unselected ions is missed. This article addresses this problem by application of liquid chromatography-triple-quadrupole/time-of-flight mass spectrometry (LC-Q-TOF MS) based on the MSALL technique. This technique performs information-independent acquisition by allowing all PEG precursor ions to enter the collision cell (Q2). In-quadrupole collision-induced dissociation (CID) in Q2 then effectively generates several fragments from all PEGs due to the high collision energy (CE). A particular PEG product ion (m/z 133.08592) was found to be common to all linear PEGs and allowed their total quantitation in rat plasma with high sensitivity, excellent linearity, and reproducibility. Assay validation showed the method was linear for all linear PEGs over the concentration range 0.05-5.0 μg/mL. The assay was successfully applied to a pharmacokinetic study in rat involving intravenous administration of linear PEG 600, PEG 4000, and PEG 20000. It is anticipated the method will have wide-ranging applications and stimulate the development of assays for other pharmaceutical polymers in the future.

  10. Event specific qualitative and quantitative polymerase chain reaction detection of genetically modified MON863 maize based on the 5'-transgene integration sequence.

    PubMed

    Yang, Litao; Xu, Songci; Pan, Aihu; Yin, Changsong; Zhang, Kewei; Wang, Zhenying; Zhou, Zhigang; Zhang, Dabing

    2005-11-30

    To enforce the genetically modified organism (GMO) labeling policies issued in many countries and regions, polymerase chain reaction (PCR)-based detection methods were developed, including screening, gene-specific, construct-specific, and event-specific assays, which have become a mainstay of GMO detection. Event-specific PCR is the primary trend in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The genetically modified maize MON863 contains a Cry3Bb1 coding sequence that produces a protein with enhanced insecticidal activity against the coleopteran pest, corn rootworm. In this study, the 5'-integration junction sequence between the host plant DNA and the integrated gene construct of the genetically modified maize MON863 was revealed by means of thermal asymmetric interlaced PCR, specific PCR primers and a TaqMan probe were designed based upon the revealed 5'-integration junction sequence, and conventional qualitative PCR and quantitative TaqMan real-time PCR detection methods employing these primers and probes were successfully developed. In the conventional qualitative PCR assay, the limit of detection (LOD) was 0.1% for MON863 in 100 ng of maize genomic DNA per reaction. In the quantitative TaqMan real-time PCR assay, the LOD and the limit of quantification were eight and 80 haploid genome copies, respectively. In addition, three mixed maize samples with known MON863 contents were analyzed using the established real-time PCR systems, and the results indicated that the established event-specific real-time PCR detection systems were reliable, sensitive, and accurate.
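
    The TaqMan quantitation behind results like these rests on a standard curve relating Ct to log copy number. A minimal sketch of that conversion, with hypothetical Ct values rather than the study's data:

```python
import numpy as np

# Hypothetical standard curve: Ct values for 10-fold serial dilutions of a
# plasmid carrying the integration-junction target (not the study's data).
log10_copies = np.array([1, 2, 3, 4, 5, 6])          # log10(copies/reaction)
ct = np.array([33.1, 29.8, 26.4, 23.0, 19.7, 16.3])  # measured Ct values

slope, intercept = np.polyfit(log10_copies, ct, 1)   # Ct = slope*log10(N) + b
efficiency = 10 ** (-1.0 / slope) - 1.0              # ideal slope ~ -3.32 -> E ~ 100%

ct_unknown = 24.5                                    # Ct of an unknown sample
copies = 10 ** ((ct_unknown - intercept) / slope)
print(f"slope {slope:.2f}, efficiency {efficiency:.0%}, ~{copies:.0f} copies/reaction")
```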

  11. Blending Model Output with satellite-based and in-situ observations to produce high-resolution estimates of population exposure to wildfire smoke

    NASA Astrophysics Data System (ADS)

    Lassman, William

    In the western US, emissions from wildfires and prescribed fire have been associated with degradation of regional air quality. Whereas atmospheric aerosol particles with aerodynamic diameters less than 2.5 μm (PM2.5) have known impacts on human health, there is uncertainty in how particle composition, concentrations, and exposure duration impact the associated health response. Due to changes in climate and land management, wildfires have increased in frequency and severity, and this trend is expected to continue. Consequently, wildfires are expected to become an increasingly important source of PM2.5 in the western US. While the composition and source of the aerosol are thought to be important factors in the resulting human health effects, this is currently not well understood; therefore, there is a need to develop a quantitative understanding of wildfire-smoke-specific health effects. A necessary step in this process is to determine who was exposed to wildfire smoke, the concentration of the smoke during exposure, and the duration of the exposure. Three different tools are commonly used to assess exposure to wildfire smoke: in-situ measurements, satellite-based observations, and chemical-transport model (CTM) simulations, and each of these exposure-estimation tools has associated strengths and weaknesses. In this thesis, we investigate the utility of blending these tools together to produce highly accurate estimates of smoke exposure during the 2012 fire season in Washington for use in an epidemiological case study. For blending, we use a ridge regression model, as well as a geographically weighted ridge regression model. We evaluate the performance of the three individual exposure-estimate techniques and the two blended techniques using Leave-One-Out Cross-Validation. Due to the number of in-situ monitors present during this time period, we find that predictions based on in-situ monitors were more accurate for this particular fire season than the CTM simulations and satellite-based observations, so blending provided only marginal improvements over the in-situ observations. However, we show that in hypothetical cases with fewer surface monitors, the two blending techniques can produce substantial improvement over any of the individual tools.
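
    A minimal sketch of the ridge-regression blending with leave-one-out cross-validation described above, using scikit-learn on synthetic stand-in data; only the simple (not geographically weighted) ridge variant is shown:

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut

# Synthetic stand-ins: three PM2.5 estimates (in situ interpolation,
# satellite retrieval, CTM simulation) against collocated monitor values.
rng = np.random.default_rng(0)
truth = rng.uniform(5, 150, size=60)                  # "true" smoke PM2.5, ug/m^3
X = np.column_stack([truth + rng.normal(0, 5, 60),    # in-situ-based estimate
                     truth + rng.normal(0, 20, 60),   # satellite-based estimate
                     truth + rng.normal(0, 30, 60)])  # CTM-based estimate
y = truth

# Leave-one-out cross-validation of the ridge blend
errors = []
for train, test in LeaveOneOut().split(X):
    blend = Ridge(alpha=1.0).fit(X[train], y[train])
    errors.append(y[test][0] - blend.predict(X[test])[0])
print(f"LOOCV RMSE of blended estimate: {np.sqrt(np.mean(np.square(errors))):.1f} ug/m^3")
```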

  12. An Inexpensive Electrodeposition Device and Its Use in a Quantitative Analysis Laboratory Exercise

    ERIC Educational Resources Information Center

    Parker, Richard H.

    2011-01-01

    An experimental procedure, using an apparatus that is easy to construct, was developed to incorporate a quantitative electrogravimetric determination of the solution nickel content into an undergraduate or advanced high school quantitative analysis laboratory. This procedure produces results comparable to the procedure used for the gravimetric…

  13. Precocious quantitative cognition in monkeys.

    PubMed

    Ferrigno, Stephen; Hughes, Kelly D; Cantlon, Jessica F

    2016-02-01

    Basic quantitative abilities are thought to have an innate basis in humans partly because the ability to discriminate quantities emerges early in child development. If humans and nonhuman primates share this developmentally primitive foundation of quantitative reasoning, then this ability should be present early in development across species and should emerge earlier in monkeys than in humans because monkeys mature faster than humans. We report that monkeys spontaneously make accurate quantity choices by 1 year of age in a task that human children begin to perform only at 2.5 to 3 years of age. Additionally, we report that the quantitative sensitivity of infant monkeys is equal to that of the adult animals in their group and that rates of learning do not differ between infant and adult animals. This novel evidence of precocious quantitative reasoning in infant monkeys suggests that human quantitative reasoning shares its early developing foundation with other primates. The data further suggest that early developing components of primate quantitative reasoning are constrained by maturational factors related to genetic development as opposed to learning experience alone.

  14. Global, long-term surface reflectance records from Landsat

    USDA-ARS?s Scientific Manuscript database

    Global, long-term monitoring of changes in Earth’s land surface requires quantitative comparisons of satellite images acquired under widely varying atmospheric conditions. Although physically based estimates of surface reflectance (SR) ultimately provide the most accurate representation of Earth’s s...

  15. Advanced Technologies for Structural and Functional Optical Coherence Tomography

    DTIC Science & Technology

    2015-01-07

    OCT speckle noise can significantly affect polarimetry measurement and must be reduced for birefringence assessment. This technique enables more accurate polarimetry measurement and quantitative assessment of tissue birefringence.

  16. A method for the measurement of physiologic evaporative water loss.

    DOT National Transportation Integrated Search

    1963-10-01

    The precise measurement of evaporative water loss is essential to an accurate evaluation of this avenue of heat loss in acute and chronic exposures to heat. In psychological studies, the quantitative measurement of palmar sweating plays an equally im...

  17. Revisiting soil carbon and nitrogen sampling: quantitative pits versus rotary cores

    USDA-ARS?s Scientific Manuscript database

    Increasing atmospheric carbon dioxide and its feedbacks with global climate have sparked renewed interest in quantifying ecosystem carbon (C) budgets, including quantifying belowground pools. Belowground nutrient budgets require accurate estimates of soil mass, coarse fragment content, and nutrient ...

  18. Remote In-Situ Quantitative Mineralogical Analysis Using XRD/XRF

    NASA Technical Reports Server (NTRS)

    Blake, D. F.; Bish, D.; Vaniman, D.; Chipera, S.; Sarrazin, P.; Collins, S. A.; Elliott, S. T.

    2001-01-01

    X-Ray Diffraction (XRD) is the most direct and accurate method for determining mineralogy. The CHEMIN XRD/XRF instrument has shown promising results on a variety of mineral and rock samples. Additional information is contained in the original extended abstract.

  19. Accurate phase measurements for thick spherical objects using optical quadrature microscopy

    NASA Astrophysics Data System (ADS)

    Warger, William C., II; DiMarzio, Charles A.

    2009-02-01

    In vitro fertilization (IVF) procedures have resulted in the birth of over three million babies since 1978. Yet the live birth rate in the United States was only 34% in 2005, with 32% of the successful pregnancies resulting in multiple births. These multiple pregnancies were directly attributed to the transfer of multiple embryos to increase the probability that a single, healthy embryo was included. Current viability markers used for IVF, such as the cell number, symmetry, size, and fragmentation, are analyzed qualitatively with differential interference contrast (DIC) microscopy. However, this method is not ideal for quantitative measures beyond the 8-cell stage of development because the cells overlap and obstruct the view within and below the cluster of cells. We have developed the phase-subtraction cell-counting method that uses the combination of DIC and optical quadrature microscopy (OQM) to count the number of cells accurately in live mouse embryos beyond the 8-cell stage. We have also created a preliminary analysis to measure the cell symmetry, size, and fragmentation quantitatively by analyzing the relative dry mass from the OQM image in conjunction with the phase-subtraction count. In this paper, we will discuss the characterization of OQM with respect to measuring the phase accurately for spherical samples that are much larger than the depth of field. Once fully characterized and verified with human embryos, this methodology could provide a more accurate means of scoring embryo viability.

  20. Computational Methods for Configurational Entropy Using Internal and Cartesian Coordinates.

    PubMed

    Hikiri, Simon; Yoshidome, Takashi; Ikeguchi, Mitsunori

    2016-12-13

    The configurational entropy of solute molecules is a crucially important quantity for studying various biophysical processes. Consequently, it is necessary to establish an efficient quantitative computational method to calculate configurational entropy as accurately as possible. In the present paper, we investigate the quantitative performance of the quasi-harmonic and related computational methods, including widely used methods implemented in popular molecular dynamics (MD) software packages, compared with the Clausius method, which is capable of accurately computing the change of the configurational entropy upon temperature change. Notably, we focused on the choice of the coordinate system (i.e., internal or Cartesian coordinates). The Boltzmann-quasi-harmonic (BQH) method using internal coordinates outperformed all six methods examined here. The introduction of improper torsions in the BQH method improves its performance, and anharmonicity of proper torsions in proteins is identified to be the origin of the superior performance of the BQH method. In contrast, widely used methods implemented in MD packages show rather poor performance. In addition, the enhanced sampling of replica-exchange MD simulations was found to be efficient for the convergent behavior of entropy calculations. The BQH method was also reasonably accurate for the folding/unfolding transitions of a small protein, Chignolin. However, the independent term without the correlation term in the BQH method was most accurate for the folding entropy among the methods considered in this study, because the QH approximation of the correlation term in the BQH method was no longer valid for the divergent unfolded structures.
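
    For orientation, the sketch below implements a generic Cartesian quasi-harmonic entropy estimate (the Andricioaei-Karplus form) from a mass-weighted covariance matrix. It is a baseline of the kind the study benchmarks, not the BQH method itself, and the demo data are synthetic:

```python
import numpy as np

kB = 1.380649e-23       # J/K
hbar = 1.054571817e-34  # J*s
T = 300.0               # K

def quasiharmonic_entropy(coords, masses):
    """Quantum-mechanical quasi-harmonic entropy (Andricioaei-Karplus form)
    from Cartesian MD snapshots; rotation/translation are assumed removed.

    coords: (n_frames, n_atoms, 3) positions in meters
    masses: (n_atoms,) masses in kg
    """
    n_frames, n_atoms, _ = coords.shape
    x = coords.reshape(n_frames, 3 * n_atoms)
    w = np.sqrt(np.repeat(masses, 3))              # per-coordinate mass weighting
    xw = (x - x.mean(axis=0)) * w
    cov = np.cov(xw, rowvar=False)                 # mass-weighted covariance
    lam = np.linalg.eigvalsh(cov)
    lam = lam[lam > lam.max() * 1e-8]              # drop rank-deficient modes
    a = hbar * np.sqrt(kB * T / lam) / (kB * T)    # hbar*omega_i / (kB*T)
    return kB * np.sum(a / np.expm1(a) - np.log1p(-np.exp(-a)))  # J/K

# Demo on synthetic data: 1000 frames of 5 carbon-like atoms, ~0.3 A jitter
rng = np.random.default_rng(0)
coords = 3e-11 * rng.normal(size=(1000, 5, 3))
masses = np.full(5, 12 * 1.66054e-27)
print(quasiharmonic_entropy(coords, masses) / kB, "kB")
```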

  1. Selection of reliable reference genes for quantitative real-time PCR gene expression analysis in Jute (Corchorus capsularis) under stress treatments

    PubMed Central

    Niu, Xiaoping; Qi, Jianmin; Zhang, Gaoyang; Xu, Jiantang; Tao, Aifen; Fang, Pingping; Su, Jianguang

    2015-01-01

    To accurately measure gene expression using quantitative reverse transcription PCR (qRT-PCR), reliable reference gene(s) are required for data normalization. Corchorus capsularis, an annual herbaceous fiber crop with predominant biodegradability and renewability, has not been investigated for the stability of reference genes with qRT-PCR. In this study, 11 candidate reference genes were selected and their expression levels were assessed using qRT-PCR. To account for the influence of experimental approach and tissue type, 22 different jute samples were selected from abiotic and biotic stress conditions as well as three different tissue types. The stability of the candidate reference genes was evaluated using the geNorm, NormFinder, and BestKeeper programs, and the comprehensive rankings of gene stability were generated by aggregate analysis. For the biotic stress and NaCl stress subsets, ACT7 and RAN were suitable as stable reference genes for gene expression normalization. For the PEG stress subset, UBC and DnaJ were sufficient for accurate normalization. For the tissues subset, four reference genes, TUBβ, UBI, EF1α, and RAN, were sufficient for accurate normalization. The selected genes were further validated by comparing expression profiles of WRKY15 in various samples, and two stable reference genes were recommended for accurate normalization of qRT-PCR data. Our results provide researchers with appropriate reference genes for qRT-PCR in C. capsularis, and will facilitate gene expression studies under these conditions. PMID:26528312
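
    The geNorm stability measure M used above has a compact definition: for each candidate gene, M is the mean standard deviation of its pairwise log-ratios with every other candidate, and lower M means more stable expression. A minimal sketch under that definition, with synthetic expression values:

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M per candidate reference gene.

    expr: (n_samples, n_genes) relative expression quantities on a linear
          scale (e.g., 2**(-dCt)); lower M means more stable expression.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        sds = [np.std(log_expr[:, j] - log_expr[:, k], ddof=1)
               for k in range(n_genes) if k != j]
        m[j] = np.mean(sds)  # mean SD of pairwise log-ratios
    return m

# Synthetic demo: 22 samples x 11 candidate genes
expr = 2.0 ** -np.random.default_rng(2).normal(size=(22, 11))
print(genorm_m(expr).round(2))
```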

  2. Spectral matching techniques (SMTs) and automated cropland classification algorithms (ACCAs) for mapping croplands of Australia using MODIS 250-m time-series (2000–2015) data

    USGS Publications Warehouse

    Teluguntla, Pardhasaradhi G.; Thenkabail, Prasad S.; Xiong, Jun N.; Gumma, Murali Krishna; Congalton, Russell G.; Oliphant, Adam; Poehnelt, Justin; Yadav, Kamini; Rao, Mahesh N.; Massey, Richard

    2017-01-01

    Mapping croplands, including fallow areas, is an important measure for determining the quantity of food that is produced, where it is produced, and when it is produced (e.g., seasonality). Furthermore, croplands are known as water guzzlers, consuming anywhere between 70% and 90% of all human water use globally. Given these facts and the increase in global population to nearly 10 billion by the year 2050, the need for routine, rapid, and automated cropland mapping year-after-year and/or season-after-season is of great importance. The overarching goal of this study was to generate standard and routine cropland products, year-after-year, over very large areas through the use of two novel methods: (a) quantitative spectral matching techniques (QSMTs) applied at continental level and (b) a rule-based Automated Cropland Classification Algorithm (ACCA) with the ability to hind-cast, now-cast, and future-cast. Australia was chosen for the study given its extensive croplands, rich history of agriculture, and yet nonexistent routine yearly generated cropland products using multi-temporal remote sensing. This research produced three distinct cropland products using Moderate Resolution Imaging Spectroradiometer (MODIS) 250-m normalized difference vegetation index 16-day composite time-series data for 16 years: 2000 through 2015. The products consisted of: (1) cropland extent/areas versus cropland fallow areas, (2) irrigated versus rainfed croplands, and (3) cropping intensities: single, double, and continuous cropping. An accurate reference cropland product (RCP) for the year 2014 (RCP2014) produced using QSMT was used as a knowledge base to train and develop the ACCA algorithm, which was then applied to the MODIS time-series data for the years 2000–2015. A comparison between the ACCA-derived cropland products (ACPs) for the year 2014 (ACP2014) and RCP2014 provided an overall agreement of 89.4% (kappa = 0.814) with six classes: (a) producer's accuracies varying between 72% and 90% and (b) user's accuracies varying between 79% and 90%. ACPs for the individual years 2000–2013 and 2015 (ACP2000–ACP2013, ACP2015) showed very strong similarities with several other studies. The extent and vigor of the Australian croplands versus cropland fallows were accurately captured by the ACCA algorithm for the years 2000–2015, highlighting the value of the study in food security analysis. The ACCA algorithm and the cropland products are released through http://croplands.org/app/map and http://geography.wr.usgs.gov/science/croplands/algorithms/australia_250m.html
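
    The reported overall agreement and kappa follow directly from the class confusion matrix. A minimal sketch of both statistics; the example matrix is a hypothetical two-class case, not the study's six-class matrix:

```python
import numpy as np

def agreement_and_kappa(confusion):
    """Overall agreement and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = mapped classes)."""
    c = np.asarray(confusion, dtype=float)
    n = c.sum()
    po = np.trace(c) / n                         # observed (overall) agreement
    pe = (c.sum(axis=0) @ c.sum(axis=1)) / n**2  # chance agreement
    return po, (po - pe) / (1.0 - pe)

# Hypothetical two-class example (cropland vs. non-cropland pixels)
po, kappa = agreement_and_kappa([[50, 3], [4, 43]])
print(f"overall agreement {po:.1%}, kappa {kappa:.3f}")
```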

  3. Quantitative and Sensitive Detection of Chloramphenicol by Surface-Enhanced Raman Scattering

    PubMed Central

    Ding, Yufeng; Yin, Hongjun; Meng, Qingyun; Zhao, Yongmei; Liu, Luo; Wu, Zhenglong; Xu, Haijun

    2017-01-01

    We used surface-enhanced Raman scattering (SERS) for the quantitative and sensitive detection of chloramphenicol (CAP). Using 30 nm colloidal Au nanoparticles (NPs), a low detection limit for CAP of 10⁻⁸ M was obtained. The characteristic Raman peak of CAP centered at 1344 cm⁻¹ was used for the rapid quantitative detection of CAP in three different types of CAP eye drops, and the accuracy of the measurement result was verified by high-performance liquid chromatography (HPLC). The experimental results reveal that the SERS technique based on colloidal Au NPs is accurate and sensitive, and can be used for the rapid detection of various antibiotics. PMID:29261161

  4. An accurate computational method for an order parameter with a Markov state model constructed using a manifold-learning technique

    NASA Astrophysics Data System (ADS)

    Ito, Reika; Yoshidome, Takashi

    2018-01-01

    Markov state models (MSMs) are a powerful approach for analyzing the long-time behaviors of protein motion using molecular dynamics simulation data. However, their quantitative performance with respect to physical quantities is poor. We believe that this poor performance is caused by the failure to appropriately classify protein conformations into states when constructing MSMs. Herein, we show that the quantitative performance of an order parameter is improved when a manifold-learning technique is employed for the classification in the MSM. In contrast, an MSM constructed using the K-center method, which has previously been used for this classification, shows poor quantitative performance.
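
    Whatever classification is used, the downstream MSM machinery is compact: count transitions at a lag time, row-normalize, and read the slow dynamics off the eigenvalues. A minimal sketch on a synthetic two-macrostate trajectory; the symmetrization step is a crude stand-in for proper reversible estimation:

```python
import numpy as np

def implied_timescales(dtraj, n_states, lag, dt=1.0):
    """Implied timescales t_i = -lag*dt / ln(lambda_i) of an MSM estimated
    by counting transitions at the given lag in a discrete trajectory."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(dtraj[:-lag], dtraj[lag:]):
        counts[i, j] += 1.0
    counts = 0.5 * (counts + counts.T)   # crude detailed-balance symmetrization
    trans = counts / counts.sum(axis=1, keepdims=True)
    lam = np.sort(np.linalg.eigvals(trans).real)[::-1]
    lam = np.clip(lam, 1e-12, None)      # guard against tiny/negative eigenvalues
    return -lag * dt / np.log(lam[1:])   # skip the stationary eigenvalue (=1)

# Synthetic trajectory: fast mixing inside two macrostates {0,1} and {2,3},
# rare (p=0.01) hops between them -> slowest timescale ~ 50 steps
rng = np.random.default_rng(3)
dtraj, s = [], 0
for _ in range(20000):
    s = (s + 2) % 4 if rng.random() < 0.01 else 2 * (s // 2) + rng.choice(2)
    dtraj.append(s)
print(implied_timescales(np.array(dtraj), 4, lag=10))
```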

  5. End-to-end deep neural network for optical inversion in quantitative photoacoustic imaging.

    PubMed

    Cai, Chuangjian; Deng, Kexin; Ma, Cheng; Luo, Jianwen

    2018-06-15

    An end-to-end deep neural network, ResU-net, is developed for quantitative photoacoustic imaging. A residual learning framework is used to facilitate optimization and to gain better accuracy from considerably increased network depth. The contracting and expanding paths enable ResU-net to extract comprehensive context information from multispectral initial pressure images and, subsequently, to infer a quantitative image of chromophore concentration or oxygen saturation (sO₂). According to our numerical experiments, the estimations of sO₂ and indocyanine green concentration are accurate and robust against variations in both optical property and object geometry. An extremely short reconstruction time of 22 ms is achieved.
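
    The residual learning framework mentioned above amounts to stacking blocks that fit a correction F(x) and output F(x) + x. A minimal PyTorch sketch of one such block; the channel count and layer sizes are illustrative assumptions, not the paper's architecture:

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """One residual convolution block of the kind stacked along a ResU-net's
    contracting/expanding paths (sizes here are illustrative assumptions)."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.act(self.body(x) + x)  # residual learning: F(x) + x

# E.g., a stack of multispectral initial-pressure images at 16 wavelengths
x = torch.randn(1, 16, 128, 128)
print(ResidualBlock(16)(x).shape)  # torch.Size([1, 16, 128, 128])
```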

  6. Facile and quantitative electrochemical detection of yeast cell apoptosis

    NASA Astrophysics Data System (ADS)

    Yue, Qiulin; Xiong, Shiquan; Cai, Dongqing; Wu, Zhengyan; Zhang, Xin

    2014-03-01

    An electrochemical method based on square wave anodic stripping voltammetry (SWASV) was developed to detect the apoptosis of yeast cells conveniently and quantitatively through the high affinity between Cu2+ and phosphatidylserine (PS) translocated from the inner to the outer plasma membrane of the apoptotic cells. The combination of negatively charged PS and Cu2+ could decrease the electrochemical response of Cu2+ on the electrode. The results showed that the apoptotic rates of cells could be detected quantitatively through the variations of peak currents of Cu2+ by SWASV, and agreed well with those obtained through traditional flow cytometry detection. This work thus may provide a novel, simple, immediate and accurate detection method for cell apoptosis.

  7. Quantitative capillary electrophoresis and its application in analysis of alkaloids in tea, coffee, coca cola, and theophylline tablets.

    PubMed

    Li, Mengjia; Zhou, Junyi; Gu, Xue; Wang, Yan; Huang, Xiaojing; Yan, Chao

    2009-01-01

    A quantitative CE (qCE) system with high precision has been developed, in which a 4-port nano-valve was isolated from the electric field and served as sample injector. The accurate amount of sample was introduced into the CE system with high reproducibility. Based on this system, consecutive injections and separations were performed without voltage interruption. Reproducibilities in terms of RSD lower than 0.8% for retention time and 1.7% for peak area were achieved. The effectiveness of the system was demonstrated by the quantitative analysis of caffeine, theobromine, and theophylline in real samples, such as tea leaf, roasted coffee, coca cola, and theophylline tablets.

  8. Type- and Subtype-Specific Influenza Forecast.

    PubMed

    Kandula, Sasikiran; Yang, Wan; Shaman, Jeffrey

    2017-03-01

    Prediction of the growth and decline of infectious disease incidence has advanced considerably in recent years. As these forecasts improve, their public health utility should increase, particularly as interventions are developed that make explicit use of forecast information. It is the task of the research community to increase the content and improve the accuracy of these infectious disease predictions. Presently, operational real-time forecasts of total influenza incidence are produced at the municipal and state level in the United States. These forecasts are generated using ensemble simulations depicting local influenza transmission dynamics, which have been optimized prior to forecast with observations of influenza incidence and data assimilation methods. Here, we explore whether forecasts targeted to predict influenza by type and subtype during 2003-2015 in the United States were more or less accurate than forecasts targeted to predict total influenza incidence. We found that forecasts separated by type/subtype generally produced more accurate predictions and, when summed, produced more accurate predictions of total influenza incidence. These findings indicate that monitoring influenza by type and subtype not only provides more detailed observational content but supports more accurate forecasting. More accurate forecasting can help officials better respond to and plan for current and future influenza activity. © The Author 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  9. Inverse methods for 3D quantitative optical coherence elasticity imaging (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Dong, Li; Wijesinghe, Philip; Hugenberg, Nicholas; Sampson, David D.; Munro, Peter R. T.; Kennedy, Brendan F.; Oberai, Assad A.

    2017-02-01

    In elastography, quantitative elastograms are desirable as they are system and operator independent. Such quantification also facilitates more accurate diagnosis, longitudinal studies and studies performed across multiple sites. In optical elastography (compression, surface-wave or shear-wave), quantitative elastograms are typically obtained by assuming some form of homogeneity. This simplifies data processing at the expense of smearing sharp transitions in elastic properties, and/or introducing artifacts in these regions. Recently, we proposed an inverse problem-based approach to compression OCE that does not assume homogeneity, and overcomes the drawbacks described above. In this approach, the difference between the measured and predicted displacement field is minimized by seeking the optimal distribution of elastic parameters. The predicted displacements and recovered elastic parameters together satisfy the constraint of the equations of equilibrium. This approach, which has been applied in two spatial dimensions assuming plane strain, has yielded accurate material property distributions. Here, we describe the extension of the inverse problem approach to three dimensions. In addition to the advantage of visualizing elastic properties in three dimensions, this extension eliminates the plane strain assumption and is therefore closer to the true physical state. It does, however, incur greater computational costs. We address this challenge through a modified adjoint problem, spatially adaptive grid resolution, and three-dimensional decomposition techniques. Through these techniques the inverse problem is solved on a typical desktop machine within a wall clock time of 20 hours. We present the details of the method and quantitative elasticity images of phantoms and tissue samples.

  10. Improved accuracy in quantitative laser-induced breakdown spectroscopy using sub-models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Ryan B.; Clegg, Samuel M.; Frydenvang, Jens

    We report that accurate quantitative analysis of diverse geologic materials is one of the primary challenges faced by the Laser-Induced Breakdown Spectroscopy (LIBS)-based ChemCam instrument on the Mars Science Laboratory (MSL) rover. The SuperCam instrument on the Mars 2020 rover, as well as other LIBS instruments developed for geochemical analysis on Earth or other planets, will face the same challenge. Consequently, part of the ChemCam science team has focused on the development of improved multivariate analysis calibration methods. Developing a single regression model capable of accurately determining the composition of very different target materials is difficult because the response of an element's emission lines in LIBS spectra can vary with the concentration of other elements. We demonstrate a conceptually simple "sub-model" method for improving the accuracy of quantitative LIBS analysis of diverse target materials. The method is based on training several regression models on sets of targets with limited composition ranges and then "blending" these "sub-models" into a single final result. Tests of the sub-model method show improvement in test set root mean squared error of prediction (RMSEP) for almost all cases. Lastly, the sub-model method, using partial least squares regression (PLS), is being used as part of the current ChemCam quantitative calibration, but the sub-model method is applicable to any multivariate regression method and may yield similar improvements.
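
    A minimal sketch of the sub-model idea using scikit-learn's PLS regression on synthetic spectra: range-restricted models are trained on low- and high-concentration subsets and blended according to a full-range model's estimate. The split point and blending ramp are illustrative assumptions, not ChemCam's calibration:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Synthetic stand-ins: rows are LIBS spectra, y is an element concentration
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 500))
y = np.abs(X[:, :5].sum(axis=1)) * 10            # stand-in composition, wt.%

full = PLSRegression(n_components=5).fit(X, y)   # full composition range
low = PLSRegression(n_components=5).fit(X[y < 20], y[y < 20])
high = PLSRegression(n_components=5).fit(X[y >= 20], y[y >= 20])

def blended_predict(X_new, ramp=(15.0, 25.0)):
    """Weight the range-restricted sub-models by the full model's estimate,
    ramping linearly across the overlap region (ramp bounds are illustrative)."""
    y_full = full.predict(X_new).ravel()
    w = np.clip((y_full - ramp[0]) / (ramp[1] - ramp[0]), 0.0, 1.0)
    return (1 - w) * low.predict(X_new).ravel() + w * high.predict(X_new).ravel()

print(blended_predict(X[:3]).round(1))
```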

  12. Quantitative analysis of benzodiazepines in vitreous humor by high-performance liquid chromatography

    PubMed Central

    Bazmi, Elham; Behnoush, Behnam; Akhgari, Maryam; Bahmanabadi, Leila

    2016-01-01

    Objective: Benzodiazepines are frequently screened drugs in emergency toxicology, drugs-of-abuse testing, and forensic cases. Because variations in benzodiazepine concentrations in biological samples during bleeding, postmortem changes, and redistribution can bias forensic medicine examinations, selecting a suitable sample and a validated, accurate method is essential for the quantitative analysis of this major drug category. The aim of this study was to develop a valid method for the determination of four benzodiazepines (flurazepam, lorazepam, alprazolam, and diazepam) in vitreous humor using liquid–liquid extraction and high-performance liquid chromatography. Methods: Sample preparation was carried out using liquid–liquid extraction with n-hexane: ethyl acetate and subsequent detection by high-performance liquid chromatography method coupled to diode array detector. This method was applied to quantify benzodiazepines in 21 authentic vitreous humor samples. Linear curve for each drug was obtained within the range of 30–3000 ng/mL with coefficient of correlation higher than 0.99. Results: The limit of detection and quantitation were 30 and 100 ng/mL, respectively, for the four drugs. The method showed an appropriate intra- and inter-day precision (coefficient of variation < 10%). Benzodiazepines recoveries were estimated to be over 80%. The method showed high selectivity; no additional peak due to interfering substances in samples was observed. Conclusion: The present method was selective, sensitive, accurate, and precise for the quantitative analysis of benzodiazepines in vitreous humor samples in a forensic toxicology laboratory. PMID:27635251

  13. Real-time three-dimensional soft tissue reconstruction for laparoscopic surgery.

    PubMed

    Kowalczuk, Jędrzej; Meyer, Avishai; Carlson, Jay; Psota, Eric T; Buettner, Shelby; Pérez, Lance C; Farritor, Shane M; Oleynikov, Dmitry

    2012-12-01

    Accurate real-time 3D models of the operating field have the potential to enable augmented reality for endoscopic surgery. A new system is proposed to create real-time 3D models of the operating field that uses a custom miniaturized stereoscopic video camera attached to a laparoscope and an image-based reconstruction algorithm implemented on a graphics processing unit (GPU). The proposed system was evaluated in a porcine model that approximates the viewing conditions of in vivo surgery. To assess the quality of the models, a synthetic view of the operating field was produced by overlaying a color image on the reconstructed 3D model, and an image rendered from the 3D model was compared with a 2D image captured from the same view. Experiments conducted with an object of known geometry demonstrate that the system produces 3D models accurate to within 1.5 mm. The ability to produce accurate real-time 3D models of the operating field is a significant advancement toward augmented reality in minimally invasive surgery. An imaging system with this capability will potentially transform surgery by helping novice and expert surgeons alike to delineate variance in internal anatomy accurately.

  14. A High Resolution/Accurate Mass (HRAM) Data-Dependent MS3 Neutral Loss Screening, Classification, and Relative Quantitation Methodology for Carbonyl Compounds in Saliva

    NASA Astrophysics Data System (ADS)

    Dator, Romel; Carrà, Andrea; Maertens, Laura; Guidolin, Valeria; Villalta, Peter W.; Balbo, Silvia

    2017-04-01

    Reactive carbonyl compounds (RCCs) are ubiquitous in the environment and are generated endogenously as a result of various physiological and pathological processes. These compounds can react with biological molecules, inducing deleterious processes believed to be at the basis of their toxic effects. Several of these compounds are implicated in neurotoxic processes, aging disorders, and cancer. Therefore, a method characterizing exposures to these chemicals will provide insights into how they may influence overall health and contribute to disease pathogenesis. Here, we have developed a high resolution accurate mass (HRAM) screening strategy allowing simultaneous identification and relative quantitation of DNPH-derivatized carbonyls in human biological fluids. The screening strategy involves the diagnostic neutral loss of hydroxyl radical triggering MS3 fragmentation, which is only observed in positive ionization mode of DNPH-derivatized carbonyls. Unique fragmentation pathways were used to develop a classification scheme for characterizing known and unanticipated/unknown carbonyl compounds present in saliva. Furthermore, a relative quantitation strategy was implemented to assess variations in the levels of carbonyl compounds before and after exposure using deuterated d3-DNPH. This relative quantitation method was tested on human samples before and after exposure to specific amounts of alcohol. Nano-electrospray ionization (nano-ESI) in positive mode afforded excellent sensitivity, with on-column detection limits at high-attomole levels. To the best of our knowledge, this is the first report of a method using HRAM neutral loss screening of carbonyl compounds. In addition, the method allows simultaneous characterization and relative quantitation of DNPH-derivatized compounds using nano-ESI in positive mode.
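
    The relative quantitation reduces to a ratio of extracted-ion peak areas for the light and heavy derivatives, which are paired in the HRAM data by their fixed mass shift. A minimal sketch with hypothetical peak areas:

```python
# Hypothetical extracted-ion peak areas for one carbonyl, derivatized with
# light DNPH (post-exposure sample) and d3-DNPH (baseline sample).
area_d0 = 4.2e6                         # light-channel peak area (assumed)
area_d3 = 1.4e6                         # heavy-channel peak area (assumed)

mass_shift = 3 * (2.014102 - 1.007825)  # d3 vs. d0 spacing, ~3.0188 Da,
                                        # used to pair channels in HRAM data
print(f"mass shift {mass_shift:.4f} Da; "
      f"post/pre ratio {area_d0 / area_d3:.1f}x")
```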

  15. Accurate ECG diagnosis of atrial tachyarrhythmias using quantitative analysis: a prospective diagnostic and cost-effectiveness study.

    PubMed

    Krummen, David E; Patel, Mitul; Nguyen, Hong; Ho, Gordon; Kazi, Dhruv S; Clopton, Paul; Holland, Marian C; Greenberg, Scott L; Feld, Gregory K; Faddis, Mitchell N; Narayan, Sanjiv M

    2010-11-01

    Optimal atrial tachyarrhythmia management is facilitated by accurate electrocardiogram interpretation, yet typical atrial flutter (AFl) may present without sawtooth F-waves or RR regularity, and atrial fibrillation (AF) may be difficult to separate from atypical AFl or rapid focal atrial tachycardia (AT). We analyzed whether improved diagnostic accuracy using a validated analysis tool significantly impacts costs and patient care. We performed a prospective, blinded, multicenter study using a novel quantitative computerized algorithm to identify atrial tachyarrhythmia mechanism from the surface ECG in patients referred for electrophysiology study (EPS). In 122 consecutive patients (age 60 ± 12 years) referred for EPS, 91 sustained atrial tachyarrhythmias were studied. ECGs were also interpreted by 9 physicians from 3 specialties for comparison and to allow healthcare system modeling. Diagnostic accuracy was compared to the diagnosis at EPS. A Markov model was used to estimate the impact of improved arrhythmia diagnosis. We found 13% of typical AFl ECGs had neither sawtooth flutter waves nor RR regularity, and were misdiagnosed by the majority of clinicians (0/6 correctly diagnosed by consensus visual interpretation) but correctly by quantitative analysis in 83% (5/6, P = 0.03). AF diagnosis was also improved through use of the algorithm (92%) versus visual interpretation (primary care: 76%, P < 0.01). Economically, we found that these improvements in diagnostic accuracy resulted in an average cost-savings of $1,303 and 0.007 quality-adjusted life-years per patient. Typical AFl and AF are frequently misdiagnosed using visual criteria. Quantitative analysis improves diagnostic accuracy and results in improved healthcare costs and patient outcomes. © 2010 Wiley Periodicals, Inc.

  16. Discordances with HIV-1 RNA quantitative determinations by three commercial assays in Pointe Noire, Republic of Congo.

    PubMed

    Bruzzone, Bianca; Bisio, Francesca; Caligiuri, Patrizia; Mboungou, Franc A Mayinda; Nigro, Nicola; Sticchi, Laura; Ventura, Agostina; Saladini, Francesco; Zazzi, Maurizio; Icardi, Giancarlo; Viscoli, Claudio

    2014-07-01

    Accurate HIV-1 RNA quantitation is required to support the scale-up of antiretroviral therapy in African countries. Extreme HIV-1 genetic variability in Africa may affect the ability of commercially available assays to detect and quantify HIV-1 RNA accurately. The aim of this study was to compare three real-time PCR assays for quantitation of plasma HIV-1 RNA levels in patients from the Republic of Congo, an area with highly diversified HIV-1 subtypes and recombinants. The Abbott RealTime HIV-1, BioMérieux HIV-1 EasyQ test 1.2 and Cobas AmpliPrep/Cobas TaqMan HIV-1 1.0 were compared for quantitation of HIV-1 RNA in 37 HIV-1 seropositive pregnant women enrolled in the Kento-Mwana project for prevention of mother-to-child transmission in Pointe-Noire, Republic of Congo. The sample panel included a variety of HIV-1 subtypes with as many as 21 (56.8%) putative unique recombinant forms. Qualitative detection of HIV-1 RNA was concordant by all three assays in 33/37 (89.2%) samples. Of the remaining 4 (10.8%) samples, all were positive by Roche, three by Abbott and none by BioMérieux. Differences exceeding 1 log₁₀ in positive samples were found in 4/31 (12.9%), 10/31 (32.3%) and 5/31 (16.1%) cases between Abbott and BioMérieux, Roche and BioMérieux, and Abbott and Roche, respectively. In this sample panel representative of highly polymorphic HIV-1 in Congo, the agreement among the three assays was moderate in terms of HIV-1 RNA detectability and rather inconsistent in terms of quantitation. Copyright © 2014. Published by Elsevier B.V.

  17. Quantitative phase-digital holographic microscopy: a new imaging modality to identify original cellular biomarkers of diseases

    NASA Astrophysics Data System (ADS)

    Marquet, P.; Rothenfusser, K.; Rappaz, B.; Depeursinge, C.; Jourdain, P.; Magistretti, P. J.

    2016-03-01

    Quantitative phase microscopy (QPM) has recently emerged as a powerful label-free technique in the field of living cell imaging, allowing cell structure and dynamics to be measured non-invasively with nanometric axial sensitivity. Since the phase retardation of a light wave transmitted through the observed cells, namely the quantitative phase signal (QPS), is sensitive to both cellular thickness and the intracellular refractive index related to the cellular content, its accurate analysis allows various cell parameters to be derived and specific cell processes to be monitored, which is very likely to identify new cell biomarkers. Specifically, quantitative phase-digital holographic microscopy (QP-DHM), thanks to its numerical flexibility facilitating parallelization and automation, represents an appealing imaging modality both to identify original cellular biomarkers of disease and to explore the underlying pathophysiological processes.

  18. An autoanalyzer test for the quantitation of platelet-associated IgG

    NASA Technical Reports Server (NTRS)

    Levitan, Nathan; Teno, Richard A.; Szymanski, Irma O.

    1986-01-01

    A new quantitative antiglobulin consumption (QAC) test for the measurement of platelet-associated IgG is described. In this test washed platelets are incubated with anti-IgG at a final dilution of 1:2 million. The unneutralized fraction of anti-IgG remaining in solution is then measured with an Autoanalyzer and soluble IgG is used for calibration. The dose-response curves depicting the percent neutralization of anti-IgG by platelets and by soluble IgG were compared in detail and found to be nearly identical, indicating that platelet-associated IgG can be accurately quantitated by this method. The mean IgG values were 2287 molecules/platelet for normal adults and 38,112 molecules/platelet for ITP patients. The Autoanalyzer QAC test is a sensitive and reproducible assay for the quantitation of platelet-associated IgG.

  19. Innovative Technologies for Maskless Lithography and Non-Conventional Patterning

    DTIC Science & Technology

    2008-08-01

    wave sources are used and quantitative data is produced on the local field intensities and scattered plane and plasmon wave amplitudes and phases...transistors”, Transducers 2007, Lyon, France, 3EH5.P, 2007. 9. D. Huang and V. Subramanian “Iodine-doped pentacene schottky diodes for high-frequency RFID...wave sources are used and quantitative data is produced on the local field intensities and scattered plane and plasmon wave amplitudes and phases

  20. Statistically Self-Consistent and Accurate Errors for SuperDARN Data

    NASA Astrophysics Data System (ADS)

    Reimer, A. S.; Hussey, G. C.; McWilliams, K. A.

    2018-01-01

    The Super Dual Auroral Radar Network (SuperDARN)-fitted data products (e.g., spectral width and velocity) are produced using weighted least squares fitting. We present a new First-Principles Fitting Methodology (FPFM) that utilizes the first-principles approach of Reimer et al. (2016) to estimate the variance of the real and imaginary components of the mean autocorrelation function (ACF) lags. SuperDARN ACFs fitted by the FPFM do not use ad hoc or empirical criteria. Currently, the weighting used to fit the ACF lags is derived from ad hoc estimates of the ACF lag variance. Additionally, an overcautious lag filtering criterion is used that sometimes discards data containing useful information. In low signal-to-noise ratio (SNR) and/or low signal-to-clutter regimes, the ad hoc variance and empirical criterion lead to underestimated errors for the fitted parameters because the relative contributions of signal, noise, and clutter to the ACF variance are not taken into consideration. The FPFM variance expressions include contributions of signal, noise, and clutter. The clutter is estimated using the maximal power-based self-clutter estimator derived by Reimer and Hussey (2015). The FPFM was successfully implemented and tested using synthetic ACFs generated with the radar data simulator of Ribeiro, Ponomarenko, et al. (2013). The fitted parameters and the fitted-parameter errors produced by the FPFM are compared with the current SuperDARN fitting software, FITACF. Using self-consistent statistical analysis, the FPFM produces reliable, trustworthy quantitative measures of the errors of the fitted parameters. For an SNR in excess of 3 dB and velocity error below 100 m/s, the FPFM produces 52% more data points than FITACF.
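
    The core fitting step, weighted least squares on the mean-ACF lag estimates with per-lag variances, can be sketched as below for a lambda-type (exponential) decay. The lag spacing, wavelength, and width convention are assumptions for illustration, not SuperDARN operating values:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical mean-ACF magnitudes at successive lags, with per-lag standard
# deviations of the kind a first-principles variance estimate would supply.
lag_time = np.arange(1, 8) * 2.4e-3               # s (assumed lag spacing)
acf_mag = np.array([0.92, 0.80, 0.66, 0.57, 0.47, 0.41, 0.33])
sigma = np.array([0.02, 0.03, 0.03, 0.05, 0.06, 0.08, 0.10])

decay = lambda t, p0, tau: p0 * np.exp(-t / tau)  # lambda-type ACF decay
popt, pcov = curve_fit(decay, lag_time, acf_mag, p0=(1.0, 5e-3),
                       sigma=sigma, absolute_sigma=True)
p_fit, tau_d = popt

wavelength = 28.0                         # m, assumed radar wavelength
width = wavelength / (2 * np.pi * tau_d)  # m/s; exact factor is convention-dependent
print(f"decorrelation time {tau_d*1e3:.1f} ms, spectral width ~{width:.0f} m/s")
```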

  1. Ejection fraction in myocardial perfusion imaging assessed with a dynamic phantom: comparison between IQ-SPECT and LEHR.

    PubMed

    Hippeläinen, Eero; Mäkelä, Teemu; Kaasalainen, Touko; Kaleva, Erna

    2017-12-01

    Developments in single photon emission tomography instrumentation and reconstruction methods present a potential for decreasing acquisition times. One such recent option for myocardial perfusion imaging (MPI) is IQ-SPECT. This study was motivated by the inconsistency in the reported ejection fraction (EF) and left ventricular (LV) volume results between IQ-SPECT and more conventional low-energy high-resolution (LEHR) collimation protocols. IQ-SPECT and LEHR quantitative results were compared while the equivalent number of iterations (EI) was varied. The end-diastolic (EDV) and end-systolic (ESV) volumes and the derived EF values were investigated. A dynamic heart phantom was used to produce repeatable ESVs, EDVs, and EFs. Phantom performance was verified by comparing the set EF values to those measured from a gated multi-slice X-ray computed tomography (CT) scan (EF_True). The phantom with an EF setting of 45, 55, 65, and 70% was imaged with both IQ-SPECT and LEHR protocols. The data were reconstructed with different EI, and two commonly used clinical myocardium delineation software packages were used to evaluate the LV volumes. The CT verification showed that the phantom EF settings were repeatable and accurate, with EF_True being within one percentage point of the manufacturer's nominal value. Depending on EI, both MPI protocols can be made to produce correct EF estimates, but the IQ-SPECT protocol produced on average 41% and 42% smaller EDV and ESV when compared to the phantom's volumes, while the LEHR protocol underestimated volumes by 24% and 21%, respectively. The volume results were largely similar between the delineation methods used. The reconstruction parameters can greatly affect the volume estimates obtained from perfusion studies. IQ-SPECT produces systematically smaller LV volumes than the conventional LEHR MPI protocol. The volume estimates are also software dependent.
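
    Because EF is a volume ratio, EF = (EDV - ESV) / EDV, a near-uniform relative underestimation of both volumes largely cancels, which is why both protocols can be tuned to return correct EFs despite biased volumes. A small sketch with hypothetical volumes:

```python
def ejection_fraction(edv, esv):
    """LV ejection fraction (%) from end-diastolic/end-systolic volumes."""
    return 100.0 * (edv - esv) / edv

edv, esv = 120.0, 54.0                            # hypothetical true volumes, mL
print(ejection_fraction(edv, esv))                # 55.0
# IQ-SPECT-like bias: EDV 41% low, ESV 42% low -> EF barely changes
print(ejection_fraction(0.59 * edv, 0.58 * esv))  # ~55.8
```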

  2. A peptidomic approach for monitoring and characterising peptide cyanotoxins produced in Italian lakes by matrix-assisted laser desorption/ionisation and quadrupole time-of-flight mass spectrometry.

    PubMed

    Ferranti, Pasquale; Nasi, Antonella; Bruno, Milena; Basile, Adriana; Serpe, Luigi; Gallo, Pasquale

    2011-05-15

    In recent years, the occurrence of cyanobacterial blooms in eutrophic freshwaters has been described all over the world, including most European countries. Blooms of cyanobacteria may produce mixtures of toxic secondary metabolites, called cyanotoxins. Among these, the most studied are microcystins, a group of cyclic heptapeptides, because of their potent hepatotoxicity and activity as tumour promoters. Other peptide cyanotoxins have been described whose structure and toxicity have not been thoroughly studied. Herein we present a peptidomic approach aimed at characterising and quantifying the peptide cyanotoxins produced in two Italian lakes, Averno and Albano. The procedure was based on matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI-TOF-MS) analysis for rapid detection and profiling of the peptide mixture complexity, combined with liquid chromatography/electrospray ionisation quadrupole time-of-flight tandem mass spectrometry (LC/ESI-Q-TOF-MS/MS), which provided unambiguous structural identification of the main compounds, as well as accurate quantitative analysis of microcystins. In the case of Lake Averno, a novel variant of microcystin-RR and two novel anabaenopeptin variants (Anabaenopeptin B(1) and Anabaenopeptin F(1)), presenting homoarginine in place of the commonly found arginine, were detected and characterised. In Lake Albano, the peculiar peptide patterns in different years were compared, as an example of the potential of the peptidomic approach for fast screening analysis, prior to fine structural analysis and determination of cyanotoxins, which included six novel aeruginosin variants. This approach allows wide-range monitoring of cyanobacterial blooms and the collection of data for evaluating possible health risks to consumers, through the panel of compounds produced across different years. Copyright © 2011 John Wiley & Sons, Ltd.

  3. Personal Exposure Monitoring Wearing Protocol Compliance: An Initial Assessment of Quantitative Measurements

    EPA Science Inventory

    Personal exposure sampling provides the most accurate and representative assessment of exposure to a pollutant, but only if measures are implemented to minimize exposure misclassification and reduce confounders that may cause misinterpretation of the collected data. Poor complian...

  4. EVALUATION OF METHODS FOR SAMPLING, RECOVERY, AND ENUMERATION OF BACTERIA APPLIED TO THE PHYLLOPLANE

    EPA Science Inventory

    Determining the fate and survival of genetically engineered microorganisms released into the environment requires the development and application of accurate and practical methods of detection and enumeration. Several experiments were performed to examine quantitative recovery met...

  5. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  6. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  7. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  8. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  9. 40 CFR 35.102 - Definitions of terms.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... that is related to an environmental or programmatic goal or objective. Outcomes must be quantitative... will be produced or provided over a period of time or by a specified date. Outputs may be quantitative...

  10. Streamlined system for purifying and quantifying a diverse library of compounds and the effect of compound concentration measurements on the accurate interpretation of biological assay results.

    PubMed

    Popa-Burke, Ioana G; Issakova, Olga; Arroway, James D; Bernasconi, Paul; Chen, Min; Coudurier, Louis; Galasinski, Scott; Jadhav, Ajit P; Janzen, William P; Lagasca, Dennis; Liu, Darren; Lewis, Roderic S; Mohney, Robert P; Sepetov, Nikolai; Sparkman, Darren A; Hodge, C Nicholas

    2004-12-15

    As part of an overall systems approach to generating highly accurate screening data across large numbers of compounds and biological targets, we have developed and implemented streamlined methods for purifying and quantitating compounds at various stages of the screening process, coupled with automated "traditional" storage methods (DMSO, -20 degrees C). Specifically, all of the compounds in our druglike library are purified by LC/MS/UV and are then controlled for identity and concentration in their respective DMSO stock solutions by chemiluminescent nitrogen detection (CLND)/evaporative light scattering detection (ELSD) and MS/UV. In addition, the compound-buffer solutions used in the various biological assays are quantitated by LC/UV/CLND to determine the concentration of compound actually present during screening. Our results show that LC/UV/CLND/ELSD/MS is a widely applicable method that can be used to purify, quantitate, and identify most small organic molecules from compound libraries. The LC/UV/CLND technique is a simple and sensitive method that can be easily and cost-effectively employed to rapidly determine the concentrations of even small amounts of any N-containing compound in aqueous solution. We present data to establish error limits for concentration determination that are well within the overall variability of the screening process. This study demonstrates that there is a significant difference between the predicted amount of soluble compound from stock DMSO solutions following dilution into assay buffer and the actual amount present in assay buffer solutions, even at the low concentrations employed for the assays. We also demonstrate that knowledge of the concentrations of compounds to which the biological target is exposed is critical for accurate potency determinations. Accurate potency values are in turn particularly important for drug discovery, for understanding structure-activity relationships, and for building useful empirical models of protein-ligand interactions. Our new understanding of relative solubility demonstrates that most, if not all, decisions that are made in early discovery are based upon missing or inaccurate information. Finally, we demonstrate that careful control of compound handling and concentration, coupled with accurate assay methods, allows the use of both positive and negative data in analyzing screening data sets for structure-activity relationships that determine potency and selectivity.

  11. Two schemes for quantitative photoacoustic tomography based on Monte Carlo simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yubin; Yuan, Zhen, E-mail: zhenyuan@umac.mo

    Purpose: The aim of this study was to develop novel methods for photoacoustically determining the optical absorption coefficient of biological tissues using Monte Carlo (MC) simulation. Methods: In this study, the authors propose two quantitative photoacoustic tomography (PAT) methods for mapping the optical absorption coefficient. The reconstruction methods combine conventional PAT with MC simulation in a novel way to determine the optical absorption coefficient of biological tissues or organs. Specifically, the authors' two schemes were theoretically and experimentally examined using simulations, tissue-mimicking phantoms, ex vivo, and in vivo tests. In particular, the authors explored these methods using several objects with different absorption contrasts embedded in turbid media and by using high-absorption media when the diffusion approximation was not effective at describing the photon transport. Results: The simulations and experimental tests showed that the reconstructions were quantitatively accurate in terms of the locations, sizes, and optical properties of the targets. The positions of the recovered targets were assessed from the property profiles, where the authors discovered that the off-center error was less than 0.1 mm for the circular target. Meanwhile, the sizes and quantitative optical properties of the targets were quantified by estimating the full width at half maximum of the optical absorption property. Interestingly, for the reconstructed sizes, the errors ranged from 0 for relatively small targets to 26% for relatively large targets, whereas for the recovered optical properties, the errors ranged from 0% to 12.5% for different cases. Conclusions: The authors found that their methods can quantitatively reconstruct absorbing objects of different sizes and optical contrasts even when the diffusion approximation is unable to accurately describe the photon propagation in biological tissues. In particular, their methods are able to resolve the intrinsic difficulties that occur when quantitative PAT is conducted by combining conventional PAT with the diffusion approximation or with radiation transport modeling.
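
    The abstract quantifies target size from the full width at half maximum (FWHM) of the recovered absorption profile. A minimal sketch of that step, for an assumed 1-D Gaussian absorption profile (shape and values are illustrative, not the authors' reconstruction):

    ```python
    import numpy as np

    def fwhm(x, profile):
        """FWHM of a single-peaked 1-D profile, interpolating the two
        half-maximum crossings linearly between samples."""
        half = profile.max() / 2.0
        above = np.where(profile >= half)[0]
        left, right = above[0], above[-1]
        def cross(i, j):  # linear interpolation between samples i and j
            return x[i] + (half - profile[i]) * (x[j] - x[i]) / (profile[j] - profile[i])
        x_left = cross(left - 1, left) if left > 0 else x[0]
        x_right = cross(right + 1, right) if right < len(x) - 1 else x[-1]
        return x_right - x_left

    # Assumed recovered profile: Gaussian target (sigma = 1 mm) on a background.
    x = np.linspace(-10, 10, 401)                  # mm
    mu_a = 0.05 + 0.25 * np.exp(-x**2 / 2.0)       # background + target absorption
    print(f"estimated size ≈ {fwhm(x, mu_a - 0.05):.2f} mm")  # ≈ 2.35 mm
    ```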

  12. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schmidt, J. M., E-mail: jschmidt@physics.usyd.edu.au; Cairns, Iver H.

    Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent, type II radio burst for the extended frequency range ≈ 4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely by factors ≈ 10⁶ and ≈ 10³, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 and 20%, respectively, for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock, and in the sheath. Very strong support is provided by these multiple agreements for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  13. Demonstration of a viable quantitative theory for interplanetary type II radio bursts

    NASA Astrophysics Data System (ADS)

    Schmidt, J. M.; Cairns, Iver H.

    2016-03-01

    Between 29 November and 1 December 2013 the two widely separated spacecraft STEREO A and B observed a long-lasting, intermittent, type II radio burst for the extended frequency range ≈ 4 MHz to 30 kHz, including an intensification when the shock wave of the associated coronal mass ejection (CME) reached STEREO A. We demonstrate for the first time our ability to quantitatively and accurately simulate the fundamental (F) and harmonic (H) emission of type II bursts from the higher corona (near 11 solar radii) to 1 AU. Our modeling requires the combination of data-driven three-dimensional magnetohydrodynamic simulations for the CME and plasma background, carried out with the BATS-R-US code, with an analytic quantitative kinetic model for both F and H radio emission, including the electron reflection at the shock, growth of Langmuir waves and radio waves, and the radiation's propagation to an arbitrary observer. The intensities and frequencies of the observed radio emissions vary hugely by factors ≈ 10⁶ and ≈ 10³, respectively; the theoretical predictions are impressively accurate, being typically in error by less than a factor of 10 and 20%, respectively, for both STEREO A and B. We also obtain accurate predictions for the timing and characteristics of the shock and local radio onsets at STEREO A, the lack of such onsets at STEREO B, and the z-component of the magnetic field at STEREO A ahead of the shock, and in the sheath. Very strong support is provided by these multiple agreements for the theory, the efficacy of the BATS-R-US code, and the vision of using type IIs and associated data-theory iterations to predict whether a CME will impact Earth's magnetosphere and drive space weather events.

  14. A comparison of quantitative methods for clinical imaging with hyperpolarized (13)C-pyruvate.

    PubMed

    Daniels, Charlie J; McLean, Mary A; Schulte, Rolf F; Robb, Fraser J; Gill, Andrew B; McGlashan, Nicholas; Graves, Martin J; Schwaiger, Markus; Lomas, David J; Brindle, Kevin M; Gallagher, Ferdia A

    2016-04-01

    Dissolution dynamic nuclear polarization (DNP) enables the metabolism of hyperpolarized (13)C-labelled molecules, such as the conversion of [1-(13)C]pyruvate to [1-(13)C]lactate, to be dynamically and non-invasively imaged in tissue. Imaging of this exchange reaction in animal models has been shown to detect early treatment response and correlate with tumour grade. The first human DNP study has recently been completed, and, for widespread clinical translation, simple and reliable methods are necessary to accurately probe the reaction in patients. However, there is currently no consensus on the most appropriate method to quantify this exchange reaction. In this study, an in vitro system was used to compare several kinetic models, as well as simple model-free methods. Experiments were performed using a clinical hyperpolarizer, a human 3 T MR system, and spectroscopic imaging sequences. The quantitative methods were compared in vivo by using subcutaneous breast tumours in rats to examine the effect of pyruvate inflow. The two-way kinetic model was the most accurate method for characterizing the exchange reaction in vitro, and the incorporation of a Heaviside step inflow profile was best able to describe the in vivo data. The lactate time-to-peak and the lactate-to-pyruvate area under the curve ratio were simple model-free approaches that accurately represented the full reaction, with the time-to-peak method performing indistinguishably from the best kinetic model. Finally, extracting data from a single pixel was a robust and reliable surrogate of the whole region of interest. This work has identified appropriate quantitative methods for future work in the analysis of human hyperpolarized (13)C data. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.
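
    The two model-free measures named here, lactate time-to-peak and the lactate-to-pyruvate area-under-the-curve (AUC) ratio, reduce to simple operations on the dynamic signal curves. A sketch with synthetic curves (the kinetics and all parameter values are assumptions, not the study's data):

    ```python
    import numpy as np

    dt = 0.5                                            # s, assumed sampling interval
    t = np.arange(0, 60, dt)
    # Assumed dynamic curves: pyruvate inflow then decay; lactate builds up later.
    pyruvate = (1 - np.exp(-t / 3.0)) * np.exp(-t / 15.0)
    lactate = 0.3 * (np.exp(-t / 25.0) - np.exp(-t / 8.0))

    time_to_peak = t[np.argmax(lactate)]                # model-free metric 1
    auc_ratio = lactate.sum() / pyruvate.sum()          # model-free metric 2
    print(f"lactate time-to-peak = {time_to_peak:.1f} s, AUC ratio = {auc_ratio:.2f}")
    ```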

  15. Physiological and biochemical basis of clinical liver function tests: a review.

    PubMed

    Hoekstra, Lisette T; de Graaf, Wilmar; Nibourg, Geert A A; Heger, Michal; Bennink, Roelof J; Stieger, Bruno; van Gulik, Thomas M

    2013-01-01

    To review the literature on the most clinically relevant and novel liver function tests used for the assessment of hepatic function before liver surgery. Postoperative liver failure is the major cause of mortality and morbidity after partial liver resection and develops as a result of insufficient remnant liver function. Therefore, accurate preoperative assessment of the future remnant liver function is mandatory in the selection of candidates for safe partial liver resection. A MEDLINE search was performed using the key words "liver function tests," "functional studies in the liver," "compromised liver," "physiological basis," and "mechanistic background," with and without Boolean operators. Passive liver function tests, including biochemical parameters and clinical grading systems, are not accurate enough in predicting outcome after liver surgery. Dynamic quantitative liver function tests, such as the indocyanine green test and galactose elimination capacity, are more accurate as they measure the elimination process of a substance that is cleared and/or metabolized almost exclusively by the liver. However, these tests only measure global liver function. Nuclear imaging techniques ((99m)Tc-galactosyl serum albumin scintigraphy and (99m)Tc-mebrofenin hepatobiliary scintigraphy) can measure both total and future remnant liver function and potentially identify patients at risk for postresectional liver failure. Because of the complexity of liver function, no single test represents overall liver function. In addition to computed tomography volumetry, quantitative liver function tests should be used to determine whether a safe resection can be performed. Presently, (99m)Tc-mebrofenin hepatobiliary scintigraphy seems to be the most valuable quantitative liver function test, as it can measure multiple aspects of liver function in the future remnant liver specifically.

  16. New microfluidic-based sampling procedure for overcoming the hematocrit problem associated with dried blood spot analysis.

    PubMed

    Leuthold, Luc Alexis; Heudi, Olivier; Déglon, Julien; Raccuglia, Marc; Augsburger, Marc; Picard, Franck; Kretz, Olivier; Thomas, Aurélien

    2015-02-17

    Hematocrit (Hct) is one of the most critical issues associated with the bioanalytical methods used for dried blood spot (DBS) sample analysis. Because Hct determines the viscosity of blood, it may affect the spreading of blood onto the filter paper. Hence, accurate quantitative data can only be obtained if the size of the paper filter extracted contains a fixed blood volume. We describe for the first time a microfluidic-based sampling procedure to enable accurate blood volume collection on commercially available DBS cards. The system allows the collection of a controlled volume of blood (e.g., 5 or 10 μL) within several seconds. Reproducibility of the sampling volume was examined in vivo on capillary blood by quantifying caffeine and paraxanthine on 5 different extracted DBS spots at two different time points and in vitro with a test compound, Mavoglurant, on 10 different spots at two Hct levels. Entire spots were extracted. In addition, the accuracy and precision (n = 3) data for the Mavoglurant quantitation in blood with Hct levels between 26% and 62% were evaluated. The interspot precision data were below 9.0%, which was equivalent to that of a manually spotted volume with a pipet. No Hct effect was observed in the quantitative results obtained for Hct levels from 26% to 62%. These data indicate that our microfluidic-based sampling procedure is accurate and precise and that the analysis of Mavoglurant is not affected by the Hct values. This provides a simple procedure for DBS sampling with a fixed volume of capillary blood, which could eliminate the recurrent Hct issue linked to DBS sample analysis.

  17. Quantitative measurement and analysis for detection and treatment planning of developmental dysplasia of the hip

    NASA Astrophysics Data System (ADS)

    Liu, Xin; Lu, Hongbing; Chen, Hanyong; Zhao, Li; Shi, Zhengxing; Liang, Zhengrong

    2009-02-01

    Developmental dysplasia of the hip is a congenital hip joint malformation affecting the proximal femurs and acetabulum, which are subluxatable, dislocatable, or dislocated. Conventionally, physicians have made diagnoses and planned treatments based only on findings from two-dimensional (2D) images, manually calculating clinical parameters. However, the anatomical complexity of the disease and the limitations of current standard procedures make accurate diagnosis quite difficult. In this study, we developed a system that provides quantitative measurement of 3D clinical indexes based on computed tomography (CT) images. To extract bone structure from surrounding tissues more accurately, the system first segments the bone using a knowledge-based fuzzy clustering method, which is formulated by modifying the objective function of the standard fuzzy c-means algorithm with an additive adaptation penalty. The second part of the system automatically calculates the clinical indexes, which are extended from 2D to 3D for accurate description of the spatial relationship between the femurs and acetabulum. To evaluate the system performance, an experimental study based on 22 patients with unilaterally or bilaterally affected hips was performed. The 3D acetabulum index (AI) results automatically provided by the system were validated by comparison with 2D results measured manually by surgeons. The correlation between the two results was found to be 0.622 (p<0.01).
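
    The segmentation step builds on fuzzy c-means (FCM). The sketch below implements only the standard FCM membership and centre updates; the paper's knowledge-based additive adaptation penalty is not reproduced, and the toy intensity data are assumptions.

    ```python
    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
        """Standard fuzzy c-means on X of shape (n_samples, n_features)."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1
        for _ in range(n_iter):
            W = U ** m
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2.0 / (m - 1.0)))       # inverse-distance memberships
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Toy 1-D intensity data: two clusters (e.g., soft tissue vs. bone values).
    X = np.concatenate([np.random.default_rng(1).normal(40, 10, 100),
                        np.random.default_rng(2).normal(400, 50, 100)])[:, None]
    centers, U = fuzzy_c_means(X)
    print(np.sort(centers.ravel()))                  # ≈ [40, 400]
    ```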

  18. SATe-II: very fast and accurate simultaneous estimation of multiple sequence alignments and phylogenetic trees.

    PubMed

    Liu, Kevin; Warnow, Tandy J; Holder, Mark T; Nelesen, Serita M; Yu, Jiaye; Stamatakis, Alexandros P; Linder, C Randal

    2012-01-01

    Highly accurate estimation of phylogenetic trees for large data sets is difficult, in part because multiple sequence alignments must be accurate for phylogeny estimation methods to be accurate. Coestimation of alignments and trees has been attempted, but currently only SATé estimates reasonably accurate trees and alignments for large data sets in practical time frames (Liu K., Raghavan S., Nelesen S., Linder C.R., Warnow T. 2009b. Rapid and accurate large-scale coestimation of sequence alignments and phylogenetic trees. Science. 324:1561-1564). Here, we present a modification to the original SATé algorithm that improves upon SATé (which we now call SATé-I) in terms of speed and of phylogenetic and alignment accuracy. SATé-II uses a different divide-and-conquer strategy than SATé-I and so produces smaller, more closely related subsets than SATé-I; as a result, SATé-II produces more accurate alignments and trees, can analyze larger data sets, and runs more efficiently than SATé-I. Generally, SATé is a metamethod that takes an existing multiple sequence alignment method as an input parameter and boosts the quality of that alignment method. SATé-II-boosted alignment methods are significantly more accurate than their unboosted versions, and trees based upon these improved alignments are more accurate than trees based upon the original alignments. Because SATé-I used maximum likelihood (ML) methods that treat gaps as missing data to estimate trees and because we found a correlation between the quality of tree/alignment pairs and ML scores, we explored the degree to which SATé's performance depends on using ML with gaps treated as missing data to determine the best tree/alignment pair. We present two lines of evidence that using ML with gaps treated as missing data to optimize the alignment and tree produces very poor results. First, we show that the optimization problem where a set of unaligned DNA sequences is given and the output is the tree and alignment of those sequences that maximize likelihood under the Jukes-Cantor model is uninformative in the worst possible sense. For all inputs, all trees optimize the likelihood score. Second, we show that a greedy heuristic that uses GTR+Gamma ML to optimize the alignment and the tree can produce very poor alignments and trees. Therefore, the excellent performance of SATé-II and SATé-I is not because ML is used as an optimization criterion for choosing the best tree/alignment pair but rather due to the particular divide-and-conquer realignment techniques employed.

  19. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author. adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
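
    For reference, the Rényi divergence of order α between discrete distributions p and q is D_α(p‖q) = (1/(α−1)) ln Σᵢ pᵢ^α qᵢ^(1−α). A minimal computation is sketched below; this is not the authors' cluster-radius pipeline, and the histograms are assumed (q must be nonzero wherever p is).

    ```python
    import numpy as np

    def renyi_divergence(p, q, alpha=2.0):
        """Rényi divergence D_alpha(p || q) between discrete distributions."""
        p = np.asarray(p, float); q = np.asarray(q, float)
        p, q = p / p.sum(), q / q.sum()              # normalise to probabilities
        mask = p > 0                                 # terms with p_i = 0 vanish
        return np.log(np.sum(p[mask] ** alpha * q[mask] ** (1 - alpha))) / (alpha - 1)

    # Histogram of localisations vs. a uniform background model (assumed data).
    localisations = [40, 35, 5, 5, 5, 10]
    background = [1, 1, 1, 1, 1, 1]
    print(renyi_divergence(localisations, background, alpha=2.0))
    ```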

  20. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
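
    As a rough stand-in for the isotope-pattern-matching idea (pyQms's actual matching score is more elaborate, and the envelopes below are assumed), one can compare a measured isotope envelope against a theoretical one with a cosine similarity:

    ```python
    import numpy as np

    def pattern_match_score(measured, theoretical):
        """Cosine similarity between measured and theoretical isotope-envelope
        intensities; a simplified stand-in for pyQms's matching score."""
        m = np.asarray(measured, float); t = np.asarray(theoretical, float)
        return float(m @ t / (np.linalg.norm(m) * np.linalg.norm(t)))

    # Assumed isotope envelope of a small peptide (M, M+1, M+2, M+3 intensities).
    theoretical = [1.00, 0.55, 0.18, 0.04]
    measured = [0.98, 0.57, 0.16, 0.05]
    print(pattern_match_score(measured, theoretical))  # close to 1.0 => good match
    ```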

  1. Ultrasonic geometrical characterization of periodically corrugated surfaces.

    PubMed

    Liu, Jingfei; Declercq, Nico F

    2013-04-01

    Accurate characterization of the characteristic dimensions of a periodically corrugated surface using an ultrasonic imaging technique is investigated both theoretically and experimentally. The possibility of accurately characterizing the characteristic dimensions is discussed. The condition for accurate characterization and the quantitative relationship between the accuracy and its determining parameters are given. Strategies to avoid diffraction effects instigated by the periodic nature of a corrugated surface are also discussed. Major causes of erroneous measurements are theoretically discussed and experimentally illustrated. A comparison is made between the presented results and optical measurements, revealing acceptable agreement. This work demonstrates the capability of the proposed ultrasonic technique to accurately characterize the lateral and vertical characteristic dimensions of corrugated surfaces. Both the general principles developed theoretically and the proposed practical techniques may serve as useful guidelines to peers. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Opportunities to Intercalibrate Radiometric Sensors From International Space Station

    NASA Technical Reports Server (NTRS)

    Roithmayr, C. M.; Lukashin, C.; Speth, P. W.; Thome, K. J.; Young, D. F.; Wielicki, B. A.

    2012-01-01

    Highly accurate measurements of Earth's thermal infrared and reflected solar radiation are required for detecting and predicting long-term climate change. We consider the concept of using the International Space Station to test instruments and techniques that would eventually be used on a dedicated mission such as the Climate Absolute Radiance and Refractivity Observatory. In particular, a quantitative investigation is performed to determine whether it is possible to use measurements obtained with a highly accurate reflected solar radiation spectrometer to calibrate similar, less accurate instruments in other low Earth orbits. Estimates of numbers of samples useful for intercalibration are made with the aid of year-long simulations of orbital motion. We conclude that the International Space Station orbit is ideally suited for the purpose of intercalibration.

  3. Forensic 3D Scene Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LITTLE,CHARLES Q.; PETERS,RALPH R.; RIGDON,J. BRIAN

    Traditionally, law enforcement agencies have relied on basic measurement and imaging tools, such as tape measures and cameras, in recording a crime scene. A disadvantage of these methods is that they are slow and cumbersome. The development of a portable system that can rapidly record a crime scene with current camera imaging and 3D geometric surface maps, and contribute quantitative measurements such as accurate relative positioning of crime scene objects, would be an asset to law enforcement agents in collecting and recording significant forensic data. The purpose of this project is to develop a feasible prototype of a fast, accurate, 3D measurement and imaging system that would support law enforcement agents in quickly documenting and accurately recording a crime scene.

  4. Quantitative Surface Chirality Detection with Sum Frequency Generation Vibrational Spectroscopy: Twin Polarization Angle Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Feng; Xu, Yanyan; Guo, Yuan

    2009-12-27

    Here we report a novel twin polarization angle (TPA) approach to quantitative chirality detection with surface sum-frequency generation vibrational spectroscopy (SFG-VS). Generally, the achiral contribution dominates the surface SFG-VS signal, and the pure chiral signal is usually two or three orders of magnitude smaller. Therefore, it has been difficult to quantitatively detect and analyze the chiral contributions to the surface SFG-VS signal. In the TPA method, by varying together the polarization angles of the incoming visible light and the sum-frequency signal at fixed s or p polarization of the incoming infrared beam, the polarization-dependent SFG signal can give not only a direct signature of the chiral contribution in the total SFG-VS signal, but also an accurate measurement of the chiral and achiral components of the surface SFG signal. A general description of the TPA method is presented, and an experimental test of the TPA approach is presented for SFG-VS from the S- and R-limonene chiral liquid surfaces. The most accurate degree-of-chiral-excess values thus obtained for the 2878 cm⁻¹ spectral peak of the S- and R-limonene liquid surfaces are (23.7±0.4)% and (25.4±1.3)%, respectively.

  5. Direct Standard-Free Quantitation of Tamiflu® and Other Pharmaceutical Tablets using Clustering Agents with Electrospray Ionization Mass Spectrometry

    PubMed Central

    Flick, Tawnya G.; Leib, Ryan D.; Williams, Evan R.

    2010-01-01

    Accurate and rapid quantitation is advantageous for identifying counterfeit and substandard pharmaceutical drugs. A standard-free electrospray ionization mass spectrometry method is used to directly determine the dosage of the prescription and over-the-counter drugs Tamiflu®, Sudafed®, and Dramamine®. A tablet of each drug was dissolved in aqueous solution, filtered, and introduced into solutions containing a known concentration of either L-tryptophan, L-phenylalanine or prednisone as clustering agents. The active ingredient(s) incorporates statistically into large clusters of the clustering agent, where effects of differential ionization/detection are substantially reduced. From the abundances of large clusters, the dosages of the active ingredients in each of the tablets were typically determined to better than 20% accuracy, even when the ionization/detection efficiencies of the individual components differed by over 100×. Although this unorthodox method for quantitation is not as accurate as using conventional standards, it has the advantages that it is fast, it can be applied to mixtures where the identities of the analytes are unknown, and it can be used when suitable standards may not be readily available, such as for schedule I or II controlled substances or new designer drugs that have not previously been identified. PMID:20092258

  6. Comparative study of quantitative phase imaging techniques for refractometry of optical fibers

    NASA Astrophysics Data System (ADS)

    de Dorlodot, Bertrand; Bélanger, Erik; Bérubé, Jean-Philippe; Vallée, Réal; Marquet, Pierre

    2018-02-01

    The refractive index difference profile of optical fibers is the key design parameter because it determines, among other properties, the insertion losses and propagating modes. Therefore, an accurate refractive index profiling method is of paramount importance to their development and optimization. Quantitative phase imaging (QPI) is one of the available tools to retrieve structural characteristics of optical fibers, including the refractive index difference profile. Having the advantage of being non-destructive, several different QPI methods have been developed over the last decades. Here, we present a comparative study of three different available QPI techniques, namely the transport-of-intensity equation, quadriwave lateral shearing interferometry and digital holographic microscopy. To assess the accuracy and precision of those QPI techniques, quantitative phase images of the core of a well-characterized optical fiber have been retrieved for each of them and a robust image processing procedure has been applied in order to retrieve their refractive index difference profiles. As a result, even if the raw images for all the three QPI methods were suffering from different shortcomings, our robust automated image-processing pipeline successfully corrected these. After this treatment, all three QPI techniques yielded accurate, reliable and mutually consistent refractive index difference profiles in agreement with the accuracy and precision of the refracted near-field benchmark measurement.
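
    All three QPI techniques ultimately convert a measured phase map into a refractive-index difference through the same relation, Δn = φλ/(2πt), where φ is the measured phase, λ the wavelength, and t the geometric path length through the sample. A one-line sketch with assumed fiber values:

    ```python
    import numpy as np

    def index_difference(phase_rad, wavelength_m, thickness_m):
        """Refractive-index difference from a quantitative phase measurement,
        assuming a known geometric path length (fiber diameter here)."""
        return phase_rad * wavelength_m / (2.0 * np.pi * thickness_m)

    # Assumed values: 2 rad of phase at 550 nm across a 125 µm fiber section.
    print(index_difference(2.0, 550e-9, 125e-6))     # ≈ 1.4e-3
    ```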

  7. Method for accurate quantitation of background tissue optical properties in the presence of emission from a strong fluorescence marker

    NASA Astrophysics Data System (ADS)

    Bravo, Jaime; Davis, Scott C.; Roberts, David W.; Paulsen, Keith D.; Kanick, Stephen C.

    2015-03-01

    Quantification of targeted fluorescence markers during neurosurgery has the potential to improve and standardize surgical distinction between normal and cancerous tissues. However, quantitative analysis of marker fluorescence is complicated by tissue background absorption and scattering properties. Correction algorithms that transform raw fluorescence intensity into quantitative units, independent of absorption and scattering, require a paired measurement of localized white light reflectance to provide estimates of the optical properties. This study focuses on the unique problem of developing a spectral analysis algorithm to extract tissue absorption and scattering properties from white light spectra that contain contributions from both elastically scattered photons and fluorescence emission from a strong fluorophore (i.e. fluorescein). A fiber-optic reflectance device was used to perform measurements in a small set of optical phantoms, constructed with Intralipid (1% lipid), whole blood (1% volume fraction) and fluorescein (0.16-10 μg/mL). Results show that the novel spectral analysis algorithm yields accurate estimates of tissue parameters independent of fluorescein concentration, with relative errors of blood volume fraction, blood oxygenation fraction (BOF), and the reduced scattering coefficient (at 521 nm) of <7%, <1%, and <22%, respectively. These data represent a first step towards quantification of fluorescein in tissue in vivo.

  8. Modeling noisy resonant system response

    NASA Astrophysics Data System (ADS)

    Weber, Patrick Thomas; Walrath, David Edwin

    2017-02-01

    In this paper, a theory-based model replicating empirical acoustic resonant signals is presented and studied to understand the sources of noise present in acoustic signals. Statistical properties of empirical signals are quantified, and a noise amplitude parameter, which models frequency-based and amplitude-based noise, is created, defined, and presented. This theory-driven model isolates each phenomenon and allows the parameters to be studied independently. Using seven independent degrees of freedom, the model accurately reproduces qualitative and quantitative properties measured from laboratory data. Results are presented and demonstrate success in replicating the qualitative and quantitative properties of experimental data.
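
    A common way to realise such a model is a damped sinusoid carrying separate amplitude-based and frequency-based noise terms. The sketch below follows that spirit only; the parameter values and the specific noise forms are assumptions, not the paper's seven-degree-of-freedom model.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 0.05, 5000)                 # s
    dt = t[1] - t[0]

    # Assumed resonance parameters: 2 kHz mode with a 10 ms decay constant.
    f0, tau, a0 = 2000.0, 0.01, 1.0
    amp_noise = 0.02 * rng.standard_normal(t.size)   # additive amplitude noise
    freq_jitter = 5.0 * rng.standard_normal(t.size)  # Hz, frequency-based noise

    # Integrate the jittered instantaneous frequency to get the phase.
    phase = 2 * np.pi * np.cumsum(f0 + freq_jitter) * dt
    signal = a0 * np.exp(-t / tau) * np.sin(phase) + amp_noise
    print(signal[:5])
    ```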

  9. 10 CFR Appendix A to Part 70 - Reportable Safety Events

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...

  10. 10 CFR Appendix A to Part 70 - Reportable Safety Events

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...

  11. 10 CFR Appendix A to Part 70 - Reportable Safety Events

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...

  12. 10 CFR Appendix A to Part 70 - Reportable Safety Events

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...

  13. 10 CFR Appendix A to Part 70 - Reportable Safety Events

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... quantitative standards established to satisfy the requirements in § 70.61(b)(4). (4) An event or condition such... material or hazardous chemicals produced from licensed materials that exceeds the quantitative standards...

  14. Rapid Protein Global Fold Determination Using Ultrasparse Sampling, High-Dynamic Range Artifact Suppression, and Time-Shared NOESY

    PubMed Central

    Coggins, Brian E.; Werner-Allen, Jonathan W.; Yan, Anthony; Zhou, Pei

    2012-01-01

    In structural studies of large proteins by NMR, global fold determination plays an increasingly important role in providing a first look at a target’s topology and reducing assignment ambiguity in NOESY spectra of fully-protonated samples. In this work, we demonstrate the use of ultrasparse sampling, a new data processing algorithm, and a 4-D time-shared NOESY experiment (1) to collect all NOEs in 2H/13C/15N-labeled protein samples with selectively-protonated amide and ILV methyl groups at high resolution in only four days, and (2) to calculate global folds from this data using fully automated resonance assignment. The new algorithm, SCRUB, incorporates the CLEAN method for iterative artifact removal, but applies an additional level of iteration, permitting real signals to be distinguished from noise and allowing nearly all artifacts generated by real signals to be eliminated. In simulations with 1.2% of the data required by Nyquist sampling, SCRUB achieves a dynamic range over 10000:1 (250× better artifact suppression than CLEAN) and completely quantitative reproduction of signal intensities, volumes, and lineshapes. Applied to 4-D time-shared NOESY data, SCRUB processing dramatically reduces aliasing noise from strong diagonal signals, enabling the identification of weak NOE crosspeaks with intensities 100× less than diagonal signals. Nearly all of the expected peaks for interproton distances under 5 Å were observed. The practical benefit of this method is demonstrated with structure calculations for 23 kDa and 29 kDa test proteins using the automated assignment protocol of CYANA, in which unassigned 4-D time-shared NOESY peak lists produce accurate and well-converged global fold ensembles, whereas 3-D peak lists either fail to converge or produce significantly less accurate folds. The approach presented here succeeds with an order of magnitude less sampling than required by alternative methods for processing sparse 4-D data. PMID:22946863

  15. Methodology for Formulating Diesel Surrogate Fuels with Accurate Compositional, Ignition-Quality, and Volatility Characteristics

    DOE PAGES

    Mueller, Charles J.; Cannella, William J.; Bruno, Thomas J.; ...

    2012-05-22

    In this study, a novel approach was developed to formulate surrogate fuels having characteristics that are representative of diesel fuels produced from real-world refinery streams. Because diesel fuels typically consist of hundreds of compounds, it is difficult to conclusively determine the effects of fuel composition on combustion properties. Surrogate fuels, being simpler representations of these practical fuels, are of interest because they can provide a better understanding of fundamental fuel-composition and property effects on combustion and emissions-formation processes in internal-combustion engines. In addition, the application of surrogate fuels in numerical simulations with accurate vaporization, mixing, and combustion models could revolutionize future engine designs by enabling computational optimization for evolving real fuels. Dependable computational design would not only improve engine function, it would do so at significant cost savings relative to current optimization strategies that rely on physical testing of hardware prototypes. The approach in this study utilized the state-of-the-art techniques of 13C and 1H nuclear magnetic resonance spectroscopy and the advanced distillation curve to characterize fuel composition and volatility, respectively. The ignition quality was quantified by the derived cetane number. Two well-characterized, ultra-low-sulfur #2 diesel reference fuels produced from refinery streams were used as target fuels: a 2007 emissions certification fuel and a Coordinating Research Council (CRC) Fuels for Advanced Combustion Engines (FACE) diesel fuel. A surrogate was created for each target fuel by blending eight pure compounds. The known carbon bond types within the pure compounds, as well as models for the ignition qualities and volatilities of their mixtures, were used in a multiproperty regression algorithm to determine optimal surrogate formulations. The predicted and measured surrogate-fuel properties were quantitatively compared to the measured target-fuel properties, and good agreement was found.
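
    The multiproperty regression step can be sketched as a constrained linear least-squares problem over the palette mole fractions, if linear property blending is assumed (a simplification; the property matrix, targets, and weights below are hypothetical, and the paper's actual mixture models are more sophisticated).

    ```python
    import numpy as np
    from scipy.optimize import lsq_linear

    # Hypothetical property matrix: rows = target properties (e.g., derived
    # cetane number, aromatic carbon fraction, a distillation temperature),
    # columns = palette compounds. Linear blending is assumed.
    P = np.array([[74.0, 42.0, 31.0, 15.0],
                  [0.00, 0.00, 0.35, 0.55],
                  [216., 247., 254., 283.]])
    target = np.array([45.0, 0.20, 250.0])

    # Scale each property row to comparable magnitude, then solve for mole
    # fractions in [0, 1]; the unit-sum constraint enters as a weighted row.
    w = 1.0 / np.abs(target)
    A = np.vstack([P * w[:, None], np.full((1, 4), 10.0)])
    b = np.concatenate([target * w, [10.0]])
    res = lsq_linear(A, b, bounds=(0.0, 1.0))
    x = res.x / res.x.sum()                          # renormalise to unit sum
    print(np.round(x, 3), "-> blend properties:", np.round(P @ x, 2))
    ```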

  16. Spatio-temporal pattern clustering for skill assessment of the Korea Operational Oceanographic System

    NASA Astrophysics Data System (ADS)

    Kim, J.; Park, K.

    2016-12-01

    In order to evaluate the performance of operational forecast models in the Korea Operational Oceanographic System (KOOS), which has been developed by the Korea Institute of Ocean Science and Technology (KIOST), a skill assessment (SA) tool has been developed that provides multiple skill metrics, including not only correlation and error scores comparing predictions with observations, but also pattern clustering across numerical models, satellite data, and in situ observations. The KOOS produces 72 hour forecasts of atmospheric and hydrodynamic variables (wind, pressure, current, tide, wave, temperature, and salinity) every 12 hours by operating numerical models such as WRF, ROMS, MOM5, MOHID, WW-III, and SWAN, and the SA is applied to evaluate these forecasts. Quantitative assessment of an operational ocean forecast model is very important for providing accurate forecast information, not only to the general public but also in support of ocean-related problems. In this work, we propose a pattern-clustering method that uses machine learning and GIS-based spatial analytics to evaluate the spatial distributions of numerical model output against spatially resolved observations such as satellite and HF radar data. For the clustering, we use 10 to 15 year reanalysis datasets computed by the KOOS, ECMWF, and HYCOM to build best-matching clusters, classified by physical meaning and time variation, and then compare them with the forecast data. Moreover, for evaluating currents, we develop a method for extracting the dominant flow and apply it to the hydrodynamic models and to HF radar sea-surface current data. The pattern-clustering method allows more accurate and effective assessment of forecast performance because it compares not only specific observation stations but also the spatio-temporal distribution over the whole model domain. We believe the proposed method will be very useful for examining and evaluating large volumes of numerical modelling output as well as satellite data.

  17. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity

    PubMed Central

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A.; Bradford, William D.; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S.; Li, Rong

    2015-01-01

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. PMID:25823586

  18. Single-Cell Based Quantitative Assay of Chromosome Transmission Fidelity.

    PubMed

    Zhu, Jin; Heinecke, Dominic; Mulla, Wahid A; Bradford, William D; Rubinstein, Boris; Box, Andrew; Haug, Jeffrey S; Li, Rong

    2015-03-30

    Errors in mitosis are a primary cause of chromosome instability (CIN), generating aneuploid progeny cells. Whereas a variety of factors can influence CIN, under most conditions mitotic errors are rare events that have been difficult to measure accurately. Here we report a green fluorescent protein-based quantitative chromosome transmission fidelity (qCTF) assay in budding yeast that allows sensitive and quantitative detection of CIN and can be easily adapted to high-throughput analysis. Using the qCTF assay, we performed genome-wide quantitative profiling of genes that affect CIN in a dosage-dependent manner and identified genes that elevate CIN when either increased (icCIN) or decreased in copy number (dcCIN). Unexpectedly, qCTF screening also revealed genes whose change in copy number quantitatively suppress CIN, suggesting that the basal error rate of the wild-type genome is not minimized, but rather, may have evolved toward an optimal level that balances both stability and low-level karyotype variation for evolutionary adaptation. Copyright © 2015 Zhu et al.

  19. Comparison of culture-based, vital stain and PMA-qPCR methods for the quantitative detection of viable hookworm ova.

    PubMed

    Gyawali, P; Sidhu, J P S; Ahmed, W; Jagals, P; Toze, S

    2017-06-01

    Accurate quantitative measurement of viable hookworm ova from environmental samples is the key to controlling hookworm re-infections in the endemic regions. In this study, the accuracy of three quantitative detection methods [culture-based, vital stain and propidium monoazide-quantitative polymerase chain reaction (PMA-qPCR)] was evaluated by enumerating 1,000 ± 50 Ancylostoma caninum ova in the laboratory. The culture-based method was able to quantify an average of 397 ± 59 viable hookworm ova. Similarly, vital stain and PMA-qPCR methods quantified 644 ± 87 and 587 ± 91 viable ova, respectively. The numbers of viable ova estimated by the culture-based method were significantly (P < 0.05) lower than vital stain and PMA-qPCR methods. Therefore, both PMA-qPCR and vital stain methods appear to be suitable for the quantitative detection of viable hookworm ova. However, PMA-qPCR would be preferable over the vital stain method in scenarios where ova speciation is needed.

  20. Optimization of homonuclear 2D NMR for fast quantitative analysis: application to tropine-nortropine mixtures.

    PubMed

    Giraudeau, Patrick; Guignard, Nadia; Hillion, Emilie; Baguet, Evelyne; Akoka, Serge

    2007-03-12

    Quantitative analysis by (1)H NMR is often hampered by heavily overlapping signals that may occur for complex mixtures, especially those containing similar compounds. Bidimensional homonuclear NMR spectroscopy can overcome this difficulty. A thorough review of acquisition and post-processing parameters was carried out to obtain accurate and precise quantitative 2D J-resolved and DQF-COSY spectra in a much reduced time, thus limiting the effect of spectrometer instabilities over time. The number of t(1) increments was reduced as much as possible, and the standard deviation was improved by optimization of the spectral width, number of transients, phase cycling and apodization function. Localized polynomial baseline corrections were applied to the relevant chemical shift areas. Our method was applied to tropine-nortropine mixtures. Quantitative J-resolved spectra were obtained in less than 3 min and quantitative DQF-COSY spectra in 12 min, with an accuracy of 3% for J-spectroscopy and 2% for DQF-COSY, and a standard deviation smaller than 1%.

  1. Influence of Pre-Analytical Factors on Thymus- and Activation-Regulated Chemokine Quantitation in Plasma

    PubMed Central

    Zhao, Xuemei; Delgado, Liliana; Weiner, Russell; Laterza, Omar F.

    2015-01-01

    Thymus- and activation-regulated chemokine (TARC) in serum/plasma associates with the disease activity of atopic dermatitis (AD), and is a promising tool for assessing the response to the treatment of the disease. TARC also exists within platelets, with elevated levels detectable in AD patients. We examined the effects of pre-analytical factors on the quantitation of TARC in human EDTA plasma. TARC levels in platelet-free plasma were significantly lower than those in platelet-containing plasma. After freeze-thaw, TARC levels increased in platelet-containing plasma, but remained unchanged in platelet-free plasma, suggesting TARC was released from the platelets during the freeze-thaw process. In contrast, TARC levels were stable in serum independent of freeze-thaw. These findings underscore the importance of pre-analytical factors to TARC quantitation. Plasma TARC levels should be measured in platelet-free plasma for accurate quantitation. Pre-analytical factors influence the quantitation, interpretation, and implementation of circulating TARC as a biomarker for the development of AD therapeutics. PMID:28936246

  2. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can be used to quantitatively characterize the impact of clay structure on macro-mechanical behaviour. According to system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established, three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect the influence of clay structure on deformation behaviour and strength behaviour, as well as the relative magnitude of the structural influence on each. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.

  3. Multidimensional quantitative analysis of mRNA expression within intact vertebrate embryos.

    PubMed

    Trivedi, Vikas; Choi, Harry M T; Fraser, Scott E; Pierce, Niles A

    2018-01-08

    For decades, in situ hybridization methods have been essential tools for studies of vertebrate development and disease, as they enable qualitative analyses of mRNA expression in an anatomical context. Quantitative mRNA analyses typically sacrifice the anatomy, relying on embryo microdissection, dissociation, cell sorting and/or homogenization. Here, we eliminate the trade-off between quantitation and anatomical context, using quantitative in situ hybridization chain reaction (qHCR) to perform accurate and precise relative quantitation of mRNA expression with subcellular resolution within whole-mount vertebrate embryos. Gene expression can be queried in two directions: read-out from anatomical space to expression space reveals co-expression relationships in selected regions of the specimen; conversely, read-in from multidimensional expression space to anatomical space reveals those anatomical locations in which selected gene co-expression relationships occur. As we demonstrate by examining gene circuits underlying somitogenesis, quantitative read-out and read-in analyses provide the strengths of flow cytometry expression analyses, but by preserving subcellular anatomical context, they enable bi-directional queries that open a new era for in situ hybridization. © 2018. Published by The Company of Biologists Ltd.

  4. Quantitative imaging methods in osteoporosis.

    PubMed

    Oei, Ling; Koromani, Fjorda; Rivadeneira, Fernando; Zillikens, M Carola; Oei, Edwin H G

    2016-12-01

    Osteoporosis is characterized by a decreased bone mass and quality resulting in an increased fracture risk. Quantitative imaging methods are critical in the diagnosis and follow-up of treatment effects in osteoporosis. Prior radiographic vertebral fractures and bone mineral density (BMD) as a quantitative parameter derived from dual-energy X-ray absorptiometry (DXA) are among the strongest known predictors of future osteoporotic fractures. Therefore, current clinical decision making relies heavily on accurate assessment of these imaging features. Further, novel quantitative techniques are being developed to appraise additional characteristics of osteoporosis including three-dimensional bone architecture with quantitative computed tomography (QCT). Dedicated high-resolution (HR) CT equipment is available to enhance image quality. At the other end of the spectrum, by utilizing post-processing techniques such as the trabecular bone score (TBS) information on three-dimensional architecture can be derived from DXA images. Further developments in magnetic resonance imaging (MRI) seem promising to not only capture bone micro-architecture but also characterize processes at the molecular level. This review provides an overview of various quantitative imaging techniques based on different radiological modalities utilized in clinical osteoporosis care and research.

  5. Precise Estimation of Allele Frequencies of Single-Nucleotide Polymorphisms by a Quantitative SSCP Analysis of Pooled DNA

    PubMed Central

    Sasaki, Tomonari; Tahira, Tomoko; Suzuki, Akari; Higasa, Koichiro; Kukita, Yoji; Baba, Shingo; Hayashi, Kenshi

    2001-01-01

    We show that single-nucleotide polymorphisms (SNPs) of moderate to high heterozygosity (minor allele frequencies >10%) can be efficiently detected, and their allele frequencies accurately estimated, by pooling the DNA samples and applying a capillary-based SSCP analysis. In this method, alleles are separated into peaks, and their frequencies can be reliably and accurately quantified from their peak heights (SD <1.8%). We found that as many as 40% of publicly available SNPs that were analyzed by this method have widely differing allele frequency distributions among groups of different ethnicity (parents of Centre d'Etude du Polymorphisme Humain families vs. Japanese individuals). These results demonstrate the effectiveness of the present pooling method in the reevaluation of candidate SNPs that have been collected by examination of limited numbers of individuals. The method should also serve as a robust quantitative technique for studies in which a precise estimate of SNP allele frequencies is essential, for example in linkage disequilibrium analysis. PMID:11083945
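
    The peak-height quantification step can be illustrated with a small sketch. Assuming peak height is proportional to allele amount after correction against a known heterozygote (all names and numbers below are hypothetical, not the paper's data):

```python
def allele_frequency(h_a, h_b, het_ratio=1.0):
    # Frequency of allele A from pooled-sample peak heights, assuming peak
    # height is proportional to allele amount. het_ratio is the h_a/h_b
    # measured on a known heterozygote (true ratio 1:1), used to correct
    # for allele-specific mobility/signal differences.
    corrected_a = h_a / het_ratio
    return corrected_a / (corrected_a + h_b)

# Pool: allele A peak 320 units, allele B peak 480 units; a heterozygous
# control gave h_a/h_b = 0.8, so allele A runs slightly "short" in signal.
print(f"f(A) = {allele_frequency(320.0, 480.0, het_ratio=0.8):.3f}")  # 0.455
```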

  6. The attentional drift-diffusion model extends to simple purchasing decisions.

    PubMed

    Krajbich, Ian; Lu, Dingchao; Camerer, Colin; Rangel, Antonio

    2012-01-01

    How do we make simple purchasing decisions (e.g., whether or not to buy a product at a given price)? Previous work has shown that the attentional drift-diffusion model (aDDM) can provide accurate quantitative descriptions of the psychometric data for binary and trinary value-based choices, and of how the choice process is guided by visual attention. Here we extend the aDDM to the case of purchasing decisions, and test it using an eye-tracking experiment. We find that the model also provides a reasonably accurate quantitative description of the relationship between choice, reaction time, and visual fixations using parameters that are very similar to those that best fit the previous data. The only critical difference is that the choice biases induced by the fixations are about half as big in purchasing decisions as in binary choices. This suggests that a similar computational process is used to make binary choices, trinary choices, and simple purchasing decisions.
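
    A minimal simulation of an attention-modulated diffusion process in the spirit of the aDDM is sketched below; the parameter values, alternating-fixation scheme and buy/no-buy framing are illustrative assumptions, not the fitted values from this study:

```python
import numpy as np

def simulate_addm(v_item, v_price, d=0.0002, theta=0.5, sigma=0.02,
                  barrier=1.0, fix_ms=300, rng=None):
    # One simulated trial: evidence V drifts toward "buy" (+) or "don't
    # buy" (-); the unattended attribute is discounted by theta. Fixations
    # alternate every fix_ms 1-ms steps. Returns (choice, reaction time).
    rng = rng or np.random.default_rng()
    V, t = 0.0, 0
    look_item = bool(rng.integers(2))      # random first fixation
    while abs(V) < barrier:
        for _ in range(fix_ms):
            if look_item:
                drift = d * (v_item - theta * v_price)
            else:
                drift = d * (theta * v_item - v_price)
            V += drift + rng.normal(0.0, sigma)
            t += 1
            if abs(V) >= barrier:
                break
        look_item = not look_item
    return (1 if V > 0 else 0), t

trials = [simulate_addm(3.0, 2.0, rng=np.random.default_rng(s)) for s in range(200)]
buy_rate = sum(c for c, _ in trials) / len(trials)
mean_rt = sum(t for _, t in trials) / len(trials)
print(f"P(buy) = {buy_rate:.2f}, mean RT = {mean_rt:.0f} ms")
```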

  7. Photo ion spectrometer

    DOEpatents

    Gruen, Dieter M.; Young, Charles E.; Pellin, Michael J.

    1989-01-01

    A method and apparatus for extracting, for quantitative analysis, ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing of a charged particle beam, such as an ion beam, on a sample along a path length perpendicular to the sample, and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of the signal-to-noise ratio is achieved by laser excitation of ions to selected autoionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick-film configuration disposed on an insulator substrate to generate the predetermined electric field boundary conditions that achieve the electric field potential required for analysis. The spectrometer is also applicable in the fields of SIMS, ISS and electron spectroscopy.

  8. Photo ion spectrometer

    DOEpatents

    Gruen, D.M.; Young, C.E.; Pellin, M.J.

    1989-08-08

    A method and apparatus are described for extracting, for quantitative analysis, ions of selected atomic components of a sample. A lens system is configured to provide a slowly diminishing field region for a volume containing the selected atomic components, enabling accurate energy analysis of ions generated in the slowly diminishing field region. The lens system also enables focusing of a charged particle beam, such as an ion beam, on a sample along a path length perpendicular to the sample, and extraction of the charged particles along a path length also perpendicular to the sample. Improvement of the signal-to-noise ratio is achieved by laser excitation of ions to selected auto-ionization states before carrying out quantitative analysis. Accurate energy analysis of energetic charged particles is assured by using a preselected resistive thick-film configuration disposed on an insulator substrate to generate the predetermined electric field boundary conditions that achieve the electric field potential required for analysis. The spectrometer is also applicable in the fields of SIMS, ISS and electron spectroscopy. 8 figs.

  9. Fast-tracking determination of homozygous transgenic lines and transgene stacking using a reliable quantitative real-time PCR assay.

    PubMed

    Wang, Xianghong; Jiang, Daiming; Yang, Daichang

    2015-01-01

    The selection of homozygous lines is a crucial step in the characterization of newly generated transgenic plants, and it is particularly time- and labor-consuming when transgene stacking is required. Here, we report a fast and accurate method for selecting homozygous lines, based on quantitative real-time PCR with the rice gene RBE4 as a reference, for use when stacking multiple transgenes in rice. This method allowed the stacking of up to three transgenes to be determined within four generations. Selection accuracy reached 100% for a single locus and 92.3% for two loci. The method confers distinct advantages over current transgenic research methodologies, as it is more accurate, rapid, and reliable. Therefore, this protocol can be used to efficiently select homozygous plants and to expedite the time- and labor-consuming processes normally required for stacking multiple transgenes. The protocol was standardized for the determination of multiple gene stacking in molecular breeding via marker-assisted selection.
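
    The underlying zygosity call can be sketched with standard 2^-ddCt relative quantification against a single-copy calibrator; the Ct values and decision threshold below are hypothetical, and the paper's exact protocol may differ:

```python
def relative_copy_number(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    # 2^-ddCt relative quantification: transgene dose of a test plant
    # relative to a known hemizygous (single-copy) calibrator, with the
    # endogenous reference gene (RBE4 in the paper) normalizing template
    # amount. Assumes roughly 100% PCR efficiency for both assays.
    ddct = (ct_target - ct_ref) - (ct_target_cal - ct_ref_cal)
    return 2.0 ** (-ddct)

# The test plant's transgene amplifies one cycle earlier (relative to the
# reference) than the hemizygous calibrator: dose ~2x, so likely homozygous.
ratio = relative_copy_number(22.1, 20.0, 23.1, 20.0)
print(f"relative dose = {ratio:.2f} ->", "homozygous" if ratio > 1.5 else "hemizygous")
```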

  10. Morphological characterization of coral reefs by combining lidar and MBES data: A case study from Yuanzhi Island, South China Sea

    NASA Astrophysics Data System (ADS)

    Zhang, Kai; Yang, Fanlin; Zhang, Hande; Su, Dianpeng; Li, QianQian

    2017-06-01

    The correlation between seafloor morphological features and biological complexity has been identified in numerous recent studies. This research focused on the potential for accurate characterization of coral reefs based on high-resolution bathymetry from multiple sources. A standard deviation (STD)-based method for quantitatively characterizing terrain complexity was developed that includes robust estimation to correct for irregular bathymetry and a calibration for the depth-dependent variability of measurement noise. Airborne lidar and shipborne sonar bathymetry measurements from Yuanzhi Island, South China Sea, were merged to generate seamless high-resolution coverage of coral bathymetry from the shoreline to deep water. The new algorithm was applied to the Yuanzhi Island surveys to generate maps of quantitative terrain complexity, which were then compared to in situ video observations of coral abundance. The terrain complexity parameter is significantly correlated with seafloor coral abundance, demonstrating the potential for accurately and efficiently mapping coral abundance through seafloor surveys, including combinations of surveys using different sensors.
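
    The core of an STD-based terrain-complexity map is a moving-window standard deviation of depth. A minimal sketch (without the paper's robust estimation and depth-dependent noise calibration, and on an invented toy grid) might look like this:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(bathy, window=5):
    # Moving-window standard deviation of depth via
    # Var[x] = E[x^2] - (E[x])^2, computed with two mean filters.
    mean = uniform_filter(bathy, size=window)
    mean_sq = uniform_filter(bathy * bathy, size=window)
    var = np.clip(mean_sq - mean * mean, 0.0, None)  # clip tiny negatives
    return np.sqrt(var)

rng = np.random.default_rng(1)
bathy = -20.0 + rng.normal(0.0, 0.5, (200, 200))     # toy merged grid, metres
complexity = local_std(bathy, window=5)
print(f"median terrain complexity: {np.median(complexity):.2f} m")
```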

  11. Identification of a chicken (Gallus gallus) endogenous reference gene (Actb) and its application in meat adulteration.

    PubMed

    Xiang, Wenjin; Shang, Ying; Wang, Qin; Xu, Yuancong; Zhu, Pengyu; Huang, Kunlun; Xu, Wentao

    2017-11-01

    The genes commonly used to determine meat species are mainly mitochondrial, but the copy numbers of such genes are high, meaning they cannot be accurately quantified. In this paper, for the first time, the chromosomal gene Actb was selected as an endogenous reference gene for chicken. It was assayed in four different chicken varieties and 16 other species using both qualitative and quantitative PCR. No amplification of the Actb gene was found in species other than chicken, and no allelic variations were detected in chicken. Southern blot and digital PCR confirmed that the Actb gene is present as a single copy in the chicken genome. The quantitative detection limit was 10 pg of DNA, which is equivalent to eight copies. All experiments indicated that the Actb gene is a useful endogenous reference gene for chicken, and it provides a convenient and accurate approach for the detection of chicken in feed and food. Copyright © 2017 Elsevier Ltd. All rights reserved.
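
    The quoted equivalence of 10 pg of DNA to roughly eight copies can be checked with the standard mass-to-copies conversion; the genome size and base-pair mass below are commonly used approximations, not values from the paper:

```python
AVOGADRO = 6.022e23
BP_MASS_G_PER_MOL = 650.0       # average molar mass of one DNA base pair
CHICKEN_GENOME_BP = 1.07e9      # approximate haploid chicken genome size

def genome_copies(mass_pg, genome_bp=CHICKEN_GENOME_BP):
    # copies = sample mass / mass of one haploid genome
    genome_mass_g = genome_bp * BP_MASS_G_PER_MOL / AVOGADRO
    return (mass_pg * 1e-12) / genome_mass_g

# ~10 pg of chicken genomic DNA corresponds to roughly 8-9 haploid genome
# copies, consistent with the detection limit quoted above.
print(f"{genome_copies(10.0):.1f} copies")
```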

  12. Probabilistic short-term volcanic hazard in phases of unrest: A case study for tephra fallout

    NASA Astrophysics Data System (ADS)

    Selva, Jacopo; Costa, Antonio; Sandri, Laura; Macedonio, Giovanni; Marzocchi, Warner

    2014-12-01

    During volcanic crises, volcanologists usually estimate the impact of possible imminent eruptions through deterministic modeling of the effects of one or a few preestablished scenarios. Although such an approach may provide important information to decision makers, the sole use of deterministic scenarios does not allow scientists to properly take all uncertainties into consideration, and it cannot be used to assess risk quantitatively, because risk assessment unavoidably requires a probabilistic approach. We present a model based on the concept of the Bayesian event tree (hereinafter named BET_VH_ST, standing for Bayesian event tree for short-term volcanic hazard) for short-term, near-real-time probabilistic volcanic hazard analysis, formulated for any potentially hazardous phenomenon accompanying an eruption. The specific goal of BET_VH_ST is to produce a quantitative assessment of the probability of exceedance of any potential level of intensity for a given volcanic hazard due to eruptions within restricted time windows (hours to days) in any area surrounding the volcano, accounting for all natural and epistemic uncertainties. BET_VH_ST properly assesses the conditional probability at each level of the event tree, accounting for any relevant information derived from the monitoring system, theoretical models, and the past history of the volcano, and propagating any relevant epistemic uncertainty underlying these assessments. As an application example, we apply BET_VH_ST to assess short-term volcanic hazard related to tephra loading during the Major Emergency Simulation Exercise, a major exercise at Mount Vesuvius that took place from 19 to 23 October 2006, consisting of a blind simulation of Vesuvius reactivation from the early warning phase up to the final eruption, including the evacuation of a sample of about 2000 people from the area at risk. The results show that BET_VH_ST is able to produce short-term forecasts of the impact of tephra fall during a rapidly evolving crisis, accurately accounting for and propagating all uncertainties, and enabling rational decision making under uncertainty.
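
    The event-tree arithmetic can be sketched as a product of conditional node probabilities, with epistemic uncertainty propagated by sampling each node from a distribution rather than using a point value. The nodes and Beta parameters below are invented for illustration and are not the BET_VH_ST implementation:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # epistemic-uncertainty samples

# Hypothetical event-tree nodes, each a Beta(a, b) over a conditional
# probability: P(unrest), P(magmatic | unrest), P(eruption | magmatic),
# P(tephra load exceeds threshold at a site | eruption).
nodes = [(8.0, 2.0), (6.0, 4.0), (3.0, 7.0), (2.0, 8.0)]

samples = np.ones(n)
for a, b in nodes:
    samples *= rng.beta(a, b, n)   # multiply conditionals along the branch

lo, hi = np.percentile(samples, [5, 95])
print(f"P(exceedance) = {samples.mean():.4f} "
      f"(90% credible interval {lo:.4f}-{hi:.4f})")
```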

  13. Application of a Snow Growth Model to Radar Remote Sensing

    NASA Astrophysics Data System (ADS)

    Erfani, E.; Mitchell, D. L.

    2014-12-01

    Microphysical growth processes of diffusion, aggregation and riming are incorporated analytically in a steady-state snow growth model (SGM) to solve the zeroth- and second-moment conservation equations with respect to mass. The SGM is initialized with radar reflectivity (Zw), supersaturation, temperature, and a vertical profile of the liquid water content (LWC), and it uses a gamma size distribution (SD) to predict the vertical evolution of size spectra. Aggregation appears to play an important role in the evolution of snowfall rates; the snowfall rates produced by aggregation, diffusion and riming together are considerably greater than those produced by diffusion and riming alone, demonstrating the strong interaction between aggregation and riming. The impact of ice particle shape on particle growth rates and fall speeds is represented in the SGM in terms of ice particle mass-dimension (m-D) power laws (m = αD^β). These growth rates are qualitatively consistent with empirical growth rates, with slower (faster) growth rates predicted for higher (lower) β values. In most models, β is treated as constant for a given ice particle habit, but it is well known that β is larger for smaller crystals. Our recent work quantitatively calculates β and α for cirrus clouds as a function of D, where the m-D expression is a second-order polynomial in log-log space. By adapting this method to the SGM, the ice particle growth rates and fall speeds are predicted more accurately. Moreover, the size spectra predicted by the SGM are in good agreement with those from aircraft measurements during Lagrangian spiral descents through frontal clouds, indicating successful modeling of the microphysical processes. Since the lowest Zw over complex topography is often significantly above cloud base, precipitation is often underestimated by radar quantitative precipitation estimates (QPE). Our SGM can be initialized with Zw at the lowest reliable radar echo and consequently improves QPE at ground level.
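
    A size-dependent exponent of this kind follows directly from a second-order polynomial fit in log-log space: if log10 m = a0 + a1 log10 D + a2 (log10 D)^2, the local exponent is β(D) = a1 + 2 a2 log10 D. A sketch with invented coefficients (not the fitted values from this work):

```python
import numpy as np

# Hypothetical fit coefficients: log10(m) = a0 + a1*logD + a2*logD**2,
# with D in micrometres and m in milligrams.
a0, a1, a2 = -7.5, 2.8, -0.12

def mass(d_um):
    logd = np.log10(d_um)
    return 10.0 ** (a0 + a1 * logd + a2 * logd * logd)

def beta(d_um):
    # Local power-law exponent d(log m)/d(log D); larger for smaller D,
    # as noted above.
    return a1 + 2.0 * a2 * np.log10(d_um)

for d in (50.0, 200.0, 1000.0):
    print(f"D = {d:6.0f} um   m = {mass(d):.3e} mg   beta = {beta(d):.2f}")
```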

  14. A National Crop Progress Monitoring and Decision Support System Based on NASA Earth Science Results

    NASA Astrophysics Data System (ADS)

    di, L.; Yang, Z.

    2009-12-01

    Timely and accurate information on weekly crop progress and development is essential to a dynamic agricultural industry in the U.S. and the world. By law, the National Agricultural Statistics Service (NASS) of the U.S. Department of Agriculture (USDA) is responsible for monitoring and assessing U.S. agricultural production. Currently, NASS compiles and issues weekly state and national crop progress and development reports based on reports from knowledgeable state and county agricultural officials and farmers. Such survey-based reports are subjectively estimated for an entire county, lack spatial coverage, and are labor intensive. There has been limited use of remote sensing data to assess crop conditions: NASS produces weekly 1-km resolution uncalibrated AVHRR-based NDVI static images to represent national vegetation conditions, but these carry no quantitative crop progress information. This presentation discusses early results from the development of a National Crop Progress Monitoring and Decision Support System. The system will overcome the shortcomings of the existing systems by integrating NASA satellite and model-based land surface and weather products, NASS’ wealth of internal crop progress and condition data and Cropland Data Layers (CDL), and the Farm Service Agency’s (FSA) Common Land Units (CLU). The system, using service-oriented architecture and web service technologies, will automatically produce and disseminate quantitative national crop progress maps and associated decision support data at 250-m resolution, as well as summary reports to support NASS and worldwide users in their decision-making. It will provide overall and specific crop progress for individual crops from the state level down to the CLU field level to meet different users’ needs on all known croplands. This will greatly enhance the effectiveness and accuracy of the NASS aggregated crop condition data and charts, and will provide objective, scientific evidence and guidance for the adjustment of NASS survey data. This presentation will discuss the architecture, Earth observation data, and the crop progress model used in the decision support system.

  15. A Radioactivity Based Quantitative Analysis of the Amount of Thorium Present in Ores and Metallurgical Products; ANALYSE QUANTITATIVE DU THORIUM DANS LES MINERAIS ET LES PRODUITS THORIFERES PAR UNE METHODE BASEE SUR LA RADIOACTIVITE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collee, R.; Govaerts, J.; Winand, L.

    1959-10-31

    A brief summary of the classical methods for the quantitative determination of thorium in ores and thoriferous products is given, to show that a rapid, accurate, and precise physical method based on the radioactivity of thorium would be of great utility. A method based on the characteristic spectrum of the thorium gamma radiation is presented. The preparation of the samples and the instruments needed for the measurements are discussed. The experimental results show that the reproducibility is very satisfactory and that it is possible to detect Th contents of 1% or less. (J.S.R.)

  16. Quantitative 1H NMR: Development and Potential of an Analytical Method – an Update

    PubMed Central

    Pauli, Guido F.; Gödecke, Tanja; Jaki, Birgit U.; Lankin, David C.

    2012-01-01

    Covering the literature from mid-2004 until the end of 2011, this review continues a previous literature overview on quantitative 1H NMR (qHNMR) methodology and its applications in the analysis of natural products (NPs). Among the foremost advantages of qHNMR are its accurate performance with external calibration, the lack of any requirement for identical reference materials, high precision and accuracy when properly validated, and the ability to quantitate multiple analytes simultaneously. As a result of the inclusion of over 170 new references, this updated review summarizes a wealth of detailed experiential evidence and newly developed methodology that supports qHNMR as a valuable and unbiased analytical tool for natural product research and other areas. PMID:22482996
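
    The basic qHNMR quantitation relation can be sketched as follows; the integrals, proton counts, molar masses and weighed masses below are invented for illustration:

```python
def qhnmr_purity(I_a, N_a, M_a, I_cal, N_cal, M_cal, m_cal, m_sample, P_cal=1.0):
    # Analyte purity (mass fraction) from relative 1H NMR integrals:
    # P_a = (I_a/I_cal) * (N_cal/N_a) * (M_a/M_cal) * (m_cal/m_sample) * P_cal
    # I = signal integral, N = protons behind the signal, M = molar mass,
    # m = weighed mass, P_cal = purity of the calibrant.
    return (I_a / I_cal) * (N_cal / N_a) * (M_a / M_cal) * (m_cal / m_sample) * P_cal

# Hypothetical 1-proton analyte signal against a 9-proton calibrant signal.
p = qhnmr_purity(I_a=0.44, N_a=1, M_a=354.4,
                 I_cal=3.00, N_cal=9, M_cal=122.1,
                 m_cal=2.5, m_sample=10.0, P_cal=0.999)
print(f"analyte purity = {p:.1%}")   # ~95.7%
```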

  17. Assessment of the neutron dose field around a biomedical cyclotron: FLUKA simulation and experimental measurements.

    PubMed

    Infantino, Angelo; Cicoria, Gianfranco; Lucconi, Giulia; Pancaldi, Davide; Vichi, Sara; Zagni, Federico; Mostacci, Domiziano; Marengo, Mario

    2016-12-01

    In the planning of a new cyclotron facility, accurate knowledge of the radiation field around the accelerator is fundamental for the design of shielding and the protection of workers, the general public and the environment. Monte Carlo simulations can be very useful in this process, and their use is constantly increasing. However, few data have been published so far regarding the proper validation of Monte Carlo simulations against experimental measurements, particularly in the energy range of biomedical cyclotrons. In this work, a detailed model of an existing installation of a GE PETtrace 16.5 MeV cyclotron was developed using FLUKA. An extensive measurement campaign of the neutron ambient dose equivalent H*(10) at marked positions around the cyclotron was conducted using a neutron rem-counter probe and CR39 neutron detectors. Data from a previous measurement campaign performed by our group using TLDs were also re-evaluated. The FLUKA model was then validated by comparing the results of high-statistics simulations with the experimental data. In 10 out of 12 measurement locations, the FLUKA simulations agreed within uncertainties with all three sets of experimental data; in the remaining two positions, they agreed with two of the three sets. Our work quantitatively validates our FLUKA simulation setup and confirms that the Monte Carlo technique can produce accurate results in the energy range of biomedical cyclotrons. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
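
    The "agreement within uncertainties" criterion can be sketched as a simple combined-uncertainty test; the dose values, uncertainties and coverage factor below are invented, not the paper's data:

```python
import math

def agrees(sim, u_sim, meas, u_meas, k=2.0):
    # Simulation and measurement agree "within uncertainties" when their
    # difference is below k times the combined standard uncertainty.
    return abs(sim - meas) <= k * math.hypot(u_sim, u_meas)

# Hypothetical H*(10) values in uSv/h at one position around the cyclotron.
sim, u_sim = 118.0, 9.0      # Monte Carlo estimate and its uncertainty
meas, u_meas = 105.0, 8.0    # rem-counter reading and its uncertainty
print("agreement" if agrees(sim, u_sim, meas, u_meas) else "discrepancy")
```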

  18. The MODIS Aerosol Algorithm: Critical Evaluation and Plans for Collection 6

    NASA Technical Reports Server (NTRS)

    Remer, Lorraine

    2010-01-01

    For ten years the MODIS aerosol algorithm has been applied to measured MODIS radiances to produce a continuous set of aerosol products over land and ocean. The MODIS aerosol products are widely used by the scientific and applied science communities for a variety of purposes that span operational air quality forecasting, estimates of clear-sky direct radiative effects over ocean, and aerosol-cloud interactions. The products undergo continual evaluation, including self-consistency checks and comparisons with highly accurate ground-based instruments. The result of these evaluation exercises is a quantitative understanding of the strengths and weaknesses of the retrieval: where and when the products are accurate, and the situations where and when accuracy degrades. We intend to present results of the most recent critical evaluations, including the first comparison of the over-ocean products against the shipboard aerosol optical depth measurements of the Marine Aerosol Network (MAN), the demonstration of the lack of sensitivity to the size parameter in the over-land products, and the identification of residual problems and regional issues. While the current data set is undergoing evaluation, we are preparing for the next data processing, labeled Collection 6. Collection 6 will include transparent Quality Flags, a 3 km aerosol product, and the 500 m resolution cloud mask used within the aerosol retrieval. These new products and adjustments to algorithm assumptions should provide users with more options and greater control as they adapt the product for their own purposes.

  19. Biomedical research with cyclotron produced radionuclides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laughlin, J.S.; Benua, R.S.; Tilbury, R.S.

    1979-09-01

    Progress is reported on: metabolic and tumor localization in man and animals; radiodrug development; dosimetry for internally deposited isotopes; and a radioactive material transfer system. Based on experience with 13N-glutamate in osteogenic sarcoma and Ewing's sarcoma, we conclude that (a) the 13N label enters tumor tissue rapidly at a rate similar to that at which activity leaves the blood, suggesting that the labeled glutamate itself is being transported into the tumor rather than some labeled metabolite; (b) uptake in the tumor is related to its metabolic activity, but factors such as blood flow are also important; and (c) changes in the glutamate scan accurately reflect the response of osteogenic sarcoma to pre-operative chemotherapy as measured by conventional means, and it is desirable to extend this experience to other types of tumors. 13N-glutamate (and other 13N-labeled compounds) affords several advantages over conventional tumor imaging agents, such as rapid blood clearance and localization, low radiation exposure, and the possibility of obtaining accurate, three-dimensional quantitative images via positron emission tomography. It is doubtful that these advantages will justify the routine use of 13N-glutamate to detect tumors or to monitor therapy except in clinical situations where conventional techniques are unsatisfactory. The value of 13N-glutamate is as a tool to assess the metabolic requirement of neoplastic tissue in cancer patients in vivo. (PCS)
